
Submission 4836

Submission: 4836
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 27014
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 27014

Episodes evaluated (image thumbnails with per-episode statistics omitted):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of this challenge.
Job ID: 27014
Step: step1-simulation
Status: success
Up to date: yes
Duration: 0:13:56
driven_lanedir_consec_median: 2.8434017692639806
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.8872231883105502
in-drivable-lane_median: 2.0000000000000133


Other stats

agent_compute-ego_max: 0.05309825658798218
agent_compute-ego_mean: 0.0500605149269104
agent_compute-ego_median: 0.05051356236139933
agent_compute-ego_min: 0.04444584608078003
deviation-center-line_max: 1.2821821373974418
deviation-center-line_mean: 0.9286248403485152
deviation-center-line_min: 0.6726447700360509
deviation-heading_max: 4.415704787907316
deviation-heading_mean: 3.3982034330461675
deviation-heading_median: 3.215089099280813
deviation-heading_min: 3.0008609040426384
driven_any_max: 4.3586487730568315
driven_any_mean: 3.600766406166984
driven_any_median: 3.49136823587164
driven_any_min: 2.857537592170254
driven_lanedir_consec_max: 4.033159850470316
driven_lanedir_consec_mean: 2.619883221135044
driven_lanedir_consec_min: 1.2826251668787398
driven_lanedir_max: 4.033159850470316
driven_lanedir_mean: 3.259367362600559
driven_lanedir_median: 3.0707562549722773
driven_lanedir_min: 2.427705159719596
in-drivable-lane_max: 4.05000000000005
in-drivable-lane_mean: 2.2200000000000144
in-drivable-lane_min: 0
per-episode details:
{
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 4.3586487730568315, "sim_physics": 0.10802626848220824, "survival_time": 14.950000000000076, "driven_lanedir": 3.921813778576625, "sim_render-ego": 0.01194645881652832, "in-drivable-lane": 3.500000000000015, "agent_compute-ego": 0.04444584608078003, "deviation-heading": 3.215089099280813, "set_robot_commands": 0.010515387852986653, "deviation-center-line": 0.8599952764006829, "driven_lanedir_consec": 3.003745422916444, "sim_compute_sim_state": 0.00517625093460083, "sim_compute_performance-ego": 0.007365161577860514, "sim_compute_robot_state-ego": 0.009205009937286375},
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 2.857537592170254, "sim_physics": 0.10280134836832684, "survival_time": 14.950000000000076, "driven_lanedir": 2.427705159719596, "sim_render-ego": 0.012248400847117109, "in-drivable-lane": 4.05000000000005, "agent_compute-ego": 0.05051356236139933, "deviation-heading": 3.1135244014961914, "set_robot_commands": 0.00931744893391927, "deviation-center-line": 0.6726447700360509, "driven_lanedir_consec": 1.9364838961457411, "sim_compute_sim_state": 0.005056411425272624, "sim_compute_performance-ego": 0.007378973166147868, "sim_compute_robot_state-ego": 0.00937665065129598},
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 3.1501513515646007, "sim_physics": 0.1177333394686381, "survival_time": 14.950000000000076, "driven_lanedir": 2.8434017692639806, "sim_render-ego": 0.01290879726409912, "in-drivable-lane": 1.5499999999999945, "agent_compute-ego": 0.052484455903371176, "deviation-heading": 3.0008609040426384, "set_robot_commands": 0.01065823475519816, "deviation-center-line": 1.2821821373974418, "driven_lanedir_consec": 2.8434017692639806, "sim_compute_sim_state": 0.005287124315897624, "sim_compute_performance-ego": 0.007687190373738607, "sim_compute_robot_state-ego": 0.009645422299702965},
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 3.49136823587164, "sim_physics": 0.10282122532526652, "survival_time": 14.950000000000076, "driven_lanedir": 3.0707562549722773, "sim_render-ego": 0.011812660694122314, "in-drivable-lane": 2.0000000000000133, "agent_compute-ego": 0.04976045370101929, "deviation-heading": 4.415704787907316, "set_robot_commands": 0.009637718200683591, "deviation-center-line": 0.8872231883105502, "driven_lanedir_consec": 1.2826251668787398, "sim_compute_sim_state": 0.004897263844807943, "sim_compute_performance-ego": 0.007189579010009765, "sim_compute_robot_state-ego": 0.009243606726328532},
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 4.1461260781715925, "sim_physics": 0.10944063266118367, "survival_time": 14.950000000000076, "driven_lanedir": 4.033159850470316, "sim_render-ego": 0.01265847365061442, "in-drivable-lane": 0, "agent_compute-ego": 0.05309825658798218, "deviation-heading": 3.2458379725038777, "set_robot_commands": 0.0098551074663798, "deviation-center-line": 0.9410788295978502, "driven_lanedir_consec": 4.033159850470316, "sim_compute_sim_state": 0.005242922306060791, "sim_compute_performance-ego": 0.007472125689188639, "sim_compute_robot_state-ego": 0.009475536346435548}
}
set_robot_commands_max: 0.01065823475519816
set_robot_commands_mean: 0.009996779441833494
set_robot_commands_median: 0.0098551074663798
set_robot_commands_min: 0.00931744893391927
sim_compute_performance-ego_max: 0.007687190373738607
sim_compute_performance-ego_mean: 0.007418605963389079
sim_compute_performance-ego_median: 0.007378973166147868
sim_compute_performance-ego_min: 0.007189579010009765
sim_compute_robot_state-ego_max: 0.009645422299702965
sim_compute_robot_state-ego_mean: 0.00938924519220988
sim_compute_robot_state-ego_median: 0.00937665065129598
sim_compute_robot_state-ego_min: 0.009205009937286375
sim_compute_sim_state_max: 0.005287124315897624
sim_compute_sim_state_mean: 0.005131994565327963
sim_compute_sim_state_median: 0.00517625093460083
sim_compute_sim_state_min: 0.004897263844807943
sim_physics_max: 0.1177333394686381
sim_physics_mean: 0.10816456286112468
sim_physics_median: 0.10802626848220824
sim_physics_min: 0.10280134836832684
sim_render-ego_max: 0.01290879726409912
sim_render-ego_mean: 0.012314958254496257
sim_render-ego_median: 0.012248400847117109
sim_render-ego_min: 0.011812660694122314
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
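
For reference, the aggregate rows above can be reproduced from the per-episode details: each _min/_median/_mean/_max value is that statistic taken over the five episodes. A minimal sketch in Python, assuming the per-episode details JSON shown above has been saved to a hypothetical file details.json:

    import json
    from statistics import mean, median

    # Load the per-episode details (file name is an assumption; the content is
    # the "per-episode details" object shown above).
    with open("details.json") as f:
        episodes = json.load(f)

    # Gather each metric across the five episodes, then aggregate.
    metrics = {}
    for episode_stats in episodes.values():
        for name, value in episode_stats.items():
            metrics.setdefault(name, []).append(value)

    for name, values in sorted(metrics.items()):
        print(f"{name}_min: {min(values)}")
        print(f"{name}_median: {median(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_max: {max(values)}")

For example, driven_lanedir_consec_median comes out as 2.8434017692639806, matching the headline row at the top of this job report.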
Job ID: 26982
Step: step1-simulation
Status: host-error
Up to date: yes
Duration: 0:11:53
Message: Uncaught exception
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 648, in get_cr
    wd, aws_config, copy_to_machine_cache=copy_to_machine_cache
  File "/project/src/duckietown_challenges_runner/runner.py", line 1309, in upload_files
    aws_config, toupload, copy_to_machine_cache=copy_to_machine_cache
  File "/project/src/duckietown_challenges_runner/runner.py", line 1565, in upload
    copy_to_cache(realfile, sha256hex)
  File "/project/src/duckietown_challenges_runner/runner_cache.py", line 40, in copy_to_cache
    msg = "Copying %s to cache %s" % (friendly_size(os.stat(fn).st_size), have)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido3-LF-sim-validation/submission4836/step1-simulation-ip-172-31-43-40-10497-job26982/challenge-evaluation-output/episodes/ETHZ_autolab_technical_track-4-0/drawing.html'
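
The host-error above is a runner-side failure rather than an agent failure: while uploading artefacts, copy_to_cache called os.stat on an episode output file (drawing.html for ETHZ_autolab_technical_track-4-0) that no longer existed, raising FileNotFoundError. Below is a minimal sketch of how such a call could be guarded; the function name, logger, and skip-on-missing behaviour are assumptions for illustration, not the actual runner code:

    import logging
    import os
    import shutil

    logger = logging.getLogger(__name__)

    def copy_to_cache_safe(fn: str, sha256hex: str, cache_dir: str) -> None:
        # Hypothetical guard: if the artefact disappeared between listing and
        # upload, log and skip it instead of letting os.stat raise
        # FileNotFoundError deep inside the upload loop.
        if not os.path.exists(fn):
            logger.warning("Artefact %s no longer exists; skipping cache copy.", fn)
            return
        size = os.stat(fn).st_size
        dest = os.path.join(cache_dir, sha256hex)
        logger.info("Copying %d bytes of %s to cache as %s", size, fn, dest)
        shutil.copyfile(fn, dest)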