
Submission 3104

Submission: 3104
Competing: yes
Challenge: aido2-LF-sim-validation
User: Angel Woo 🇭🇰
Date submitted:
Last status update:
Status: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22608
Next:
User label: HKU Duckietown Project
Admin priority: 50
Blessing: n/a
User priority: 50

Job 22608

Episodes (per-episode statistics are linked from the images on the original page):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID: 22608 | step: step1-simulation | status: success | up to date: yes | duration: 0:18:16
driven_lanedir_consec_median: 0.36350587303071746
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.6316239264126045
in-drivable-lane_median: 5.500000000000035
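The medians above are taken across the five simulation episodes. As a quick sanity check, a minimal sketch in plain Python (values copied from the per-episode details further down) reproduces the reported survival_time median:

```python
import statistics

# survival_time per episode, from the "per-episodes details" blob below
survival_times = [
    12.700000000000044,  # ETHZ_autolab_technical_track-0-0
    14.950000000000076,  # ETHZ_autolab_technical_track-1-0
    14.950000000000076,  # ETHZ_autolab_technical_track-2-0
    14.950000000000076,  # ETHZ_autolab_technical_track-3-0
    11.100000000000025,  # ETHZ_autolab_technical_track-4-0
]
print(statistics.median(survival_times))  # → 14.950000000000076
```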


other stats
agent_compute-ego_max: 0.2110713259379069
agent_compute-ego_mean: 0.20859667840825588
agent_compute-ego_median: 0.20982706868970716
agent_compute-ego_min: 0.20504201960375928
deviation-center-line_max: 0.9769025183137768
deviation-center-line_mean: 0.651205911375337
deviation-center-line_min: 0.4881492430845232
deviation-heading_max: 3.270862676190257
deviation-heading_mean: 2.080258621735755
deviation-heading_median: 1.724619064883375
deviation-heading_min: 1.169675554474122
driven_any_max: 0.9433359659747644
driven_any_mean: 0.857842099821486
driven_any_median: 0.9263852997757488
driven_any_min: 0.678776927870358
driven_lanedir_consec_max: 0.803520620167895
driven_lanedir_consec_mean: 0.5074545536121795
driven_lanedir_consec_min: 0.311460118376489
driven_lanedir_max: 0.803520620167895
driven_lanedir_mean: 0.5074545536121795
driven_lanedir_median: 0.36350587303071746
driven_lanedir_min: 0.311460118376489
in-drivable-lane_max: 9.250000000000089
in-drivable-lane_mean: 5.110000000000048
in-drivable-lane_min: 1.7000000000000242
per-episodes details:
{
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 0.799233615259626, "sim_physics": 0.06157479605336827, "survival_time": 12.700000000000044, "driven_lanedir": 0.36350587303071746, "sim_render-ego": 0.06508222058063418, "in-drivable-lane": 6.650000000000059, "agent_compute-ego": 0.20504201960375928, "deviation-heading": 1.169675554474122, "set_robot_commands": 0.07801662655327264, "deviation-center-line": 0.63907009452613, "driven_lanedir_consec": 0.36350587303071746, "sim_compute_sim_state": 0.04245647858446977, "sim_compute_performance-ego": 0.06683614591913899, "sim_compute_robot_state-ego": 0.06574430240420845},
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 0.9263852997757488, "sim_physics": 0.06000910758972168, "survival_time": 14.950000000000076, "driven_lanedir": 0.311460118376489, "sim_render-ego": 0.06763114134470621, "in-drivable-lane": 9.250000000000089, "agent_compute-ego": 0.2110713259379069, "deviation-heading": 1.724619064883375, "set_robot_commands": 0.07885382175445557, "deviation-center-line": 0.4881492430845232, "driven_lanedir_consec": 0.311460118376489, "sim_compute_sim_state": 0.04354879220326741, "sim_compute_performance-ego": 0.06723598957061767, "sim_compute_robot_state-ego": 0.06640129486719767},
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 0.9433359659747644, "sim_physics": 0.0590843407313029, "survival_time": 14.950000000000076, "driven_lanedir": 0.7438720649592581, "sim_render-ego": 0.06633347749710083, "in-drivable-lane": 2.450000000000035, "agent_compute-ego": 0.21042896191279092, "deviation-heading": 3.270862676190257, "set_robot_commands": 0.07791696389516195, "deviation-center-line": 0.9769025183137768, "driven_lanedir_consec": 0.7438720649592581, "sim_compute_sim_state": 0.043353341420491534, "sim_compute_performance-ego": 0.06660800377527873, "sim_compute_robot_state-ego": 0.06493072748184205},
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 0.9414786902269324, "sim_physics": 0.05936248064041138, "survival_time": 14.950000000000076, "driven_lanedir": 0.803520620167895, "sim_render-ego": 0.06526065031687418, "in-drivable-lane": 1.7000000000000242, "agent_compute-ego": 0.20661401589711503, "deviation-heading": 2.731872804386035, "set_robot_commands": 0.07842476367950439, "deviation-center-line": 0.6316239264126045, "driven_lanedir_consec": 0.803520620167895, "sim_compute_sim_state": 0.042919962406158446, "sim_compute_performance-ego": 0.06705692450205485, "sim_compute_robot_state-ego": 0.065025475025177},
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 0.678776927870358, "sim_physics": 0.05964960815670254, "survival_time": 11.100000000000025, "driven_lanedir": 0.31491409152653826, "sim_render-ego": 0.06640669891426156, "in-drivable-lane": 5.500000000000035, "agent_compute-ego": 0.20982706868970716, "deviation-heading": 1.5042630087449855, "set_robot_commands": 0.07748333935265068, "deviation-center-line": 0.5202837745396505, "driven_lanedir_consec": 0.31491409152653826, "sim_compute_sim_state": 0.04238012459901002, "sim_compute_performance-ego": 0.0670436040775196, "sim_compute_robot_state-ego": 0.06547346845403448}
}
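Every aggregate row in the stats (min/mean/median/max) can be recomputed from this per-episode blob. A minimal sketch, assuming the blob has already been parsed into a dict and showing only the deviation-heading metric:

```python
import statistics

# deviation-heading values copied from the per-episodes details above;
# the real blob carries 14 metrics for each of the 5 episodes.
per_episode = {
    "ETHZ_autolab_technical_track-0-0": {"deviation-heading": 1.169675554474122},
    "ETHZ_autolab_technical_track-1-0": {"deviation-heading": 1.724619064883375},
    "ETHZ_autolab_technical_track-2-0": {"deviation-heading": 3.270862676190257},
    "ETHZ_autolab_technical_track-3-0": {"deviation-heading": 2.731872804386035},
    "ETHZ_autolab_technical_track-4-0": {"deviation-heading": 1.5042630087449855},
}

values = [ep["deviation-heading"] for ep in per_episode.values()]
print(min(values))                # deviation-heading_min
print(statistics.mean(values))    # deviation-heading_mean
print(statistics.median(values))  # deviation-heading_median
print(max(values))                # deviation-heading_max
```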
set_robot_commands_max: 0.07885382175445557
set_robot_commands_mean: 0.07813910304700905
set_robot_commands_median: 0.07801662655327264
set_robot_commands_min: 0.07748333935265068
sim_compute_performance-ego_max: 0.06723598957061767
sim_compute_performance-ego_mean: 0.06695613356892197
sim_compute_performance-ego_median: 0.0670436040775196
sim_compute_performance-ego_min: 0.06660800377527873
sim_compute_robot_state-ego_max: 0.06640129486719767
sim_compute_robot_state-ego_mean: 0.06551505364649193
sim_compute_robot_state-ego_median: 0.06547346845403448
sim_compute_robot_state-ego_min: 0.06493072748184205
sim_compute_sim_state_max: 0.04354879220326741
sim_compute_sim_state_mean: 0.04293173984267944
sim_compute_sim_state_median: 0.042919962406158446
sim_compute_sim_state_min: 0.04238012459901002
sim_physics_max: 0.06157479605336827
sim_physics_mean: 0.05993606663430136
sim_physics_median: 0.05964960815670254
sim_physics_min: 0.0590843407313029
sim_render-ego_max: 0.06763114134470621
sim_render-ego_mean: 0.06614283773071539
sim_render-ego_median: 0.06633347749710083
sim_render-ego_min: 0.06508222058063418
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 13.73000000000006
survival_time_min: 11.100000000000025
No reset possible
Job ID: 20805 | step: step1-simulation | status: success | up to date: no | duration: 0:22:21
No reset possible
Job ID: 20798 | step: step1-simulation | status: host-error | up to date: no | duration: 0:24:01
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 483, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 332, in convert
    data = open(log).read().strip()
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3104/step1-simulation-ip-172-31-38-104-5376-job20798/logs/challenges-runner/stdout.log'
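The host error occurs because the runner's convert helper calls open() on a stdout.log that was never written, so cleanup itself raises. A minimal sketch of a defensive variant that tolerates the missing file (hypothetical; the real duckietown_challenges_runner convert() may do more than this):

```python
import os

def convert(log: str) -> str:
    """Read and strip a log file, returning an empty string when the
    evaluator never produced it (the FileNotFoundError case above).
    Hypothetical guard, not the actual runner code."""
    if not os.path.exists(log):
        return ""  # no output was written before the job failed
    with open(log) as f:
        return f.read().strip()
```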
No reset possible