Duckietown Challenges

Submission 4656

Submission: 4656
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 27212
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 27212

Click the images to see detailed statistics for each episode:

- ETHZ_autolab_technical_track-0-0
- ETHZ_autolab_technical_track-1-0
- ETHZ_autolab_technical_track-2-0
- ETHZ_autolab_technical_track-3-0
- ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
27212 | step1-simulation | success | yes | | | 0:07:03 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 1.847425931746789
survival_time_median: 6.649999999999984
deviation-center-line_median: 0.32830305781070357
in-drivable-lane_median: 1.2499999999999956


Other stats:

agent_compute-ego_max: 0.04426214255784687
agent_compute-ego_mean: 0.035161409509271616
agent_compute-ego_median: 0.04169527053833008
agent_compute-ego_min: 0.02213298320770264
deviation-center-line_max: 1.1053030186386503
deviation-center-line_mean: 0.5356032492817583
deviation-center-line_min: 0.09732629714935574
deviation-heading_max: 5.019596552982473
deviation-heading_mean: 2.0997816369848907
deviation-heading_median: 1.2079341517232902
deviation-heading_min: 0.30133427346109626
driven_any_max: 5.46588171613268
driven_any_mean: 3.094916549757598
driven_any_median: 2.7673569485622007
driven_any_min: 1.5824416274480797
driven_lanedir_consec_max: 4.88916844778323
driven_lanedir_consec_mean: 2.4222837679561415
driven_lanedir_consec_min: 1.251331755341352
driven_lanedir_max: 4.88916844778323
driven_lanedir_mean: 2.6347323384961276
driven_lanedir_median: 2.437877386489574
driven_lanedir_min: 1.251331755341352
in-drivable-lane_max: 3.800000000000007
in-drivable-lane_mean: 1.899999999999998
in-drivable-lane_min: 1.149999999999996
per-episodes details:
{
"ETHZ_autolab_technical_track-0-0": {"driven_any": 3.479105899101239, "sim_physics": 0.06996249675750732, "survival_time": 14.950000000000076, "driven_lanedir": 2.747858171119695, "sim_render-ego": 0.01066329797108968, "in-drivable-lane": 3.800000000000007, "agent_compute-ego": 0.02213298320770264, "deviation-heading": 5.019596552982473, "set_robot_commands": 0.009016196727752683, "deviation-center-line": 0.85267966681142, "driven_lanedir_consec": 1.685615318419764, "sim_compute_sim_state": 0.0046302119890848795, "sim_compute_performance-ego": 0.006590645313262939, "sim_compute_robot_state-ego": 0.00787729024887085},
"ETHZ_autolab_technical_track-1-0": {"driven_any": 2.17979655754379, "sim_physics": 0.07670973587036133, "survival_time": 6.249999999999986, "driven_lanedir": 1.847425931746789, "sim_render-ego": 0.01110409927368164, "in-drivable-lane": 1.1999999999999955, "agent_compute-ego": 0.025150430679321288, "deviation-heading": 1.2079341517232902, "set_robot_commands": 0.00806818962097168, "deviation-center-line": 0.32830305781070357, "driven_lanedir_consec": 1.847425931746789, "sim_compute_sim_state": 0.00479918098449707, "sim_compute_performance-ego": 0.00681178092956543, "sim_compute_robot_state-ego": 0.007850614547729493},
"ETHZ_autolab_technical_track-2-0": {"driven_any": 5.46588171613268, "sim_physics": 0.0686311149597168, "survival_time": 14.950000000000076, "driven_lanedir": 4.88916844778323, "sim_render-ego": 0.01027458111445109, "in-drivable-lane": 2.099999999999994, "agent_compute-ego": 0.04169527053833008, "deviation-heading": 3.065044186473056, "set_robot_commands": 0.007576135794321696, "deviation-center-line": 1.1053030186386503, "driven_lanedir_consec": 4.88916844778323, "sim_compute_sim_state": 0.0044807966550191244, "sim_compute_performance-ego": 0.006210470199584961, "sim_compute_robot_state-ego": 0.0075700926780700685},
"ETHZ_autolab_technical_track-3-0": {"driven_any": 1.5824416274480797, "sim_physics": 0.07429140492489464, "survival_time": 3.7999999999999945, "driven_lanedir": 1.251331755341352, "sim_render-ego": 0.011130389414335551, "in-drivable-lane": 1.149999999999996, "agent_compute-ego": 0.04426214255784687, "deviation-heading": 0.30133427346109626, "set_robot_commands": 0.007758444861361855, "deviation-center-line": 0.09732629714935574, "driven_lanedir_consec": 1.251331755341352, "sim_compute_sim_state": 0.004774234796825208, "sim_compute_performance-ego": 0.006619331083799663, "sim_compute_robot_state-ego": 0.008075867828569915},
"ETHZ_autolab_technical_track-4-0": {"driven_any": 2.7673569485622007, "sim_physics": 0.06355945866807063, "survival_time": 6.649999999999984, "driven_lanedir": 2.437877386489574, "sim_render-ego": 0.01074075698852539, "in-drivable-lane": 1.2499999999999956, "agent_compute-ego": 0.04256622056315716, "deviation-heading": 0.9049990202845376, "set_robot_commands": 0.007210389115756616, "deviation-center-line": 0.2944042059986618, "driven_lanedir_consec": 2.437877386489574, "sim_compute_sim_state": 0.004383278968638943, "sim_compute_performance-ego": 0.0061150834076386645, "sim_compute_robot_state-ego": 0.007257323516042609}
}
set_robot_commands_max: 0.009016196727752683
set_robot_commands_mean: 0.007925871224032908
set_robot_commands_median: 0.007758444861361855
set_robot_commands_min: 0.007210389115756616
sim_compute_performance-ego_max: 0.00681178092956543
sim_compute_performance-ego_mean: 0.006469462186770332
sim_compute_performance-ego_median: 0.006590645313262939
sim_compute_performance-ego_min: 0.0061150834076386645
sim_compute_robot_state-ego_max: 0.008075867828569915
sim_compute_robot_state-ego_mean: 0.007726237763856586
sim_compute_robot_state-ego_median: 0.007850614547729493
sim_compute_robot_state-ego_min: 0.007257323516042609
sim_compute_sim_state_max: 0.00479918098449707
sim_compute_sim_state_mean: 0.004613540678813045
sim_compute_sim_state_median: 0.0046302119890848795
sim_compute_sim_state_min: 0.004383278968638943
sim_physics_max: 0.07670973587036133
sim_physics_mean: 0.07063084223611014
sim_physics_median: 0.06996249675750732
sim_physics_min: 0.06355945866807063
sim_render-ego_max: 0.011130389414335551
sim_render-ego_mean: 0.010782624952416673
sim_render-ego_median: 0.01074075698852539
sim_render-ego_min: 0.01027458111445109
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 9.320000000000023
survival_time_min: 3.7999999999999945
No reset possible
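The aggregate rows (min/mean/median/max) are simple summaries over the five episodes in the per-episode details. A minimal sketch of how they can be recomputed, assuming Python's standard statistics module and using the survival_time values copied from the details blob:

```python
import statistics

# survival_time per episode, copied from the per-episodes details above
survival_time = {
    "ETHZ_autolab_technical_track-0-0": 14.950000000000076,
    "ETHZ_autolab_technical_track-1-0": 6.249999999999986,
    "ETHZ_autolab_technical_track-2-0": 14.950000000000076,
    "ETHZ_autolab_technical_track-3-0": 3.7999999999999945,
    "ETHZ_autolab_technical_track-4-0": 6.649999999999984,
}

values = list(survival_time.values())
print("survival_time_min:   ", min(values))
print("survival_time_mean:  ", statistics.mean(values))
print("survival_time_median:", statistics.median(values))  # middle of 5 sorted values
print("survival_time_max:   ", max(values))
```

The median, 6.649999999999984, is the middle of the five sorted values, which matches the survival_time_median row reported for this job.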
26279 | step1-simulation | success | no | | | 0:05:24 |
No reset possible
26278 | step1-simulation | success | no | | | 0:07:03 |
No reset possible
26277 | step1-simulation | host-error | no | | | 0:00:08 |
Uncaught exception while running Docker Compose:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 894, in run_one
    run_docker(wd, project, cmd)
  File "/project/src/duckietown_challenges_runner/runner.py", line 1277, in run_docker
    subprocess.check_call(cmd0, cwd=cwd, stdout=tee_stdout, stderr=tee_stderr)
  File "/usr/lib/python3.6/subprocess.py", line 306, in check_call
    retcode = call(*popenargs, **kwargs)
  File "/usr/lib/python3.6/subprocess.py", line 287, in call
    with Popen(*popenargs, **kwargs) as p:
  File "/usr/lib/python3.6/subprocess.py", line 729, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.6/subprocess.py", line 1364, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido3-LF-sim-validation/submission4656/step1-simulation-ip-172-31-43-40-10498-job26277': '/tmp/duckietown/DT18/evaluator/executions/aido3-LF-sim-validation/submission4656/step1-simulation-ip-172-31-43-40-10498-job26277'
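The host-error is not a bug in the agent: the evaluator's execution directory under /tmp no longer existed when Docker Compose was launched, so subprocess failed before the command even ran. A minimal reproduction, assuming any existing command and a hypothetical missing path (not the runner's actual code):

```python
import subprocess

try:
    # The child process must chdir into cwd before exec; when that
    # directory does not exist, Popen raises FileNotFoundError --
    # the same failure mode as job 26277 above.
    subprocess.check_call(["true"], cwd="/tmp/nonexistent-execution-dir")
except FileNotFoundError as e:
    print("missing working directory:", e)
```

This is why the runner can only report "No reset possible" for that job: the failure happened in the evaluator's environment, outside the submission's containers.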
No reset possible