
Submission 3296

Submission: 3296
Competing: yes
Challenge: aido2-LF-sim-validation
User: jiang peng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22444
Next:
User label: Third test - TH
Admin priority: 50
Blessing: n/a
User priority: 50

Job 22444

Episodes evaluated (detailed per-episode statistics are available for each episode):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID: 22444 | step: step1-simulation | status: success | up to date: yes | duration: 0:18:40
driven_lanedir_consec_median: 3.5625804732500495
survival_time_median: 14.950000000000076
deviation-center-line_median: 1.1698837272523748
in-drivable-lane_median: 1.1000000000000156


other stats
agent_compute-ego_max: 0.18611751079559327
agent_compute-ego_mean: 0.18077207835515335
agent_compute-ego_median: 0.18314343929290772
agent_compute-ego_min: 0.16847042640050253
deviation-center-line_max: 1.459319908753633
deviation-center-line_mean: 1.1987510610496053
deviation-center-line_min: 0.9391295391734318
deviation-heading_max: 4.707369714165231
deviation-heading_mean: 3.2139189593142747
deviation-heading_median: 3.178631104730079
deviation-heading_min: 2.008253469291316
driven_any_max: 5.342279310525335
driven_any_mean: 3.6604427746521737
driven_any_median: 3.636894290631455
driven_any_min: 2.455427904924257
driven_lanedir_consec_max: 5.29393649255367
driven_lanedir_consec_mean: 3.245650827220022
driven_lanedir_consec_min: 1.5258191746980736
driven_lanedir_max: 5.29393649255367
driven_lanedir_mean: 3.4301974856370947
driven_lanedir_median: 3.5625804732500495
driven_lanedir_min: 2.243778842147973
in-drivable-lane_max: 4.9499999999999975
in-drivable-lane_mean: 1.610000000000008
in-drivable-lane_min: 0
per-episodes details:
{"ETHZ_autolab_technical_track-0-0": {"driven_any": 2.4870338187439804, "sim_physics": 0.05298996051152547, "survival_time": 14.950000000000076, "driven_lanedir": 2.243778842147973, "sim_render-ego": 0.05283098618189494, "in-drivable-lane": 4.9499999999999975, "agent_compute-ego": 0.18314343929290772, "deviation-heading": 3.771969216372672, "set_robot_commands": 0.07486454010009766, "deviation-center-line": 1.2689583831348747, "driven_lanedir_consec": 1.5258191746980736, "sim_compute_sim_state": 0.03542221228281657, "sim_compute_performance-ego": 0.05726420005162557, "sim_compute_robot_state-ego": 0.05784172693888346}, "ETHZ_autolab_technical_track-1-0": {"driven_any": 5.342279310525335, "sim_physics": 0.05521598736445109, "survival_time": 14.950000000000076, "driven_lanedir": 5.29393649255367, "sim_render-ego": 0.05660218795140584, "in-drivable-lane": 0, "agent_compute-ego": 0.18545785506566365, "deviation-heading": 2.008253469291316, "set_robot_commands": 0.07586246252059936, "deviation-center-line": 1.1698837272523748, "driven_lanedir_consec": 5.29393649255367, "sim_compute_sim_state": 0.03777879476547241, "sim_compute_performance-ego": 0.06016388018925985, "sim_compute_robot_state-ego": 0.06130816698074341}, "ETHZ_autolab_technical_track-2-0": {"driven_any": 4.3805785484358415, "sim_physics": 0.05547953844070434, "survival_time": 14.950000000000076, "driven_lanedir": 3.709366417526528, "sim_render-ego": 0.05222558418909709, "in-drivable-lane": 1.1000000000000156, "agent_compute-ego": 0.16847042640050253, "deviation-heading": 2.403371292012072, "set_robot_commands": 0.0751651398340861, "deviation-center-line": 1.1564637469337116, "driven_lanedir_consec": 3.709366417526528, "sim_compute_sim_state": 0.03715720415115356, "sim_compute_performance-ego": 0.0587774395942688, "sim_compute_robot_state-ego": 0.05917754570643107}, "ETHZ_autolab_technical_track-3-0": {"driven_any": 2.455427904924257, "sim_physics": 0.05616841713587443, "survival_time": 14.950000000000076, "driven_lanedir": 2.3413252027072557, "sim_render-ego": 0.05703994909922282, "in-drivable-lane": 2.0000000000000258, "agent_compute-ego": 0.18611751079559327, "deviation-heading": 4.707369714165231, "set_robot_commands": 0.07751471439997355, "deviation-center-line": 1.459319908753633, "driven_lanedir_consec": 2.1365515780717916, "sim_compute_sim_state": 0.03882108211517334, "sim_compute_performance-ego": 0.060897933642069496, "sim_compute_robot_state-ego": 0.06025100390116374}, "ETHZ_autolab_technical_track-4-0": {"driven_any": 3.636894290631455, "sim_physics": 0.05504616260528565, "survival_time": 14.950000000000076, "driven_lanedir": 3.5625804732500495, "sim_render-ego": 0.05258286396662394, "in-drivable-lane": 0, "agent_compute-ego": 0.18067116022109983, "deviation-heading": 3.178631104730079, "set_robot_commands": 0.07363721370697021, "deviation-center-line": 0.9391295391734318, "driven_lanedir_consec": 3.5625804732500495, "sim_compute_sim_state": 0.0367575740814209, "sim_compute_performance-ego": 0.0574726398785909, "sim_compute_robot_state-ego": 0.05815633376439412}}
set_robot_commands_max: 0.07751471439997355
set_robot_commands_mean: 0.07540881411234537
set_robot_commands_median: 0.0751651398340861
set_robot_commands_min: 0.07363721370697021
sim_compute_performance-ego_max: 0.060897933642069496
sim_compute_performance-ego_mean: 0.058915218671162915
sim_compute_performance-ego_median: 0.0587774395942688
sim_compute_performance-ego_min: 0.05726420005162557
sim_compute_robot_state-ego_max: 0.06130816698074341
sim_compute_robot_state-ego_mean: 0.05934695545832316
sim_compute_robot_state-ego_median: 0.05917754570643107
sim_compute_robot_state-ego_min: 0.05784172693888346
sim_compute_sim_state_max: 0.03882108211517334
sim_compute_sim_state_mean: 0.03718737347920735
sim_compute_sim_state_median: 0.03715720415115356
sim_compute_sim_state_min: 0.03542221228281657
sim_physics_max: 0.05616841713587443
sim_physics_mean: 0.05498001321156819
sim_physics_median: 0.05521598736445109
sim_physics_min: 0.05298996051152547
sim_render-ego_max: 0.05703994909922282
sim_render-ego_mean: 0.05425631427764892
sim_render-ego_median: 0.05283098618189494
sim_render-ego_min: 0.05222558418909709
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
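
Each aggregate above (min, max, mean, median) is computed over the five episodes listed in the per-episodes details. A minimal sketch of how to reproduce, for example, the driven_lanedir_consec aggregates from that JSON, assuming it has been saved to a local file named per_episodes.json (an illustrative filename, not an artefact produced by the evaluator):

import json
import statistics

# Load the per-episode details JSON shown above (assumed to be saved
# locally as per_episodes.json; the filename is illustrative).
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Top-level keys are episode names; each value maps metric name -> value.
metric = "driven_lanedir_consec"
values = [ep[metric] for ep in episodes.values()]

print("median:", statistics.median(values))  # driven_lanedir_consec_median
print("mean:  ", statistics.mean(values))    # driven_lanedir_consec_mean
print("max:   ", max(values))                # driven_lanedir_consec_max
print("min:   ", min(values))                # driven_lanedir_consec_min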
Job ID: 21817 | step: step1-simulation | status: success | up to date: no | duration: 0:14:50
Job ID: 21816 | step: step1-simulation | status: host-error | up to date: no | duration: 0:00:48 | message: Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 488, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 347, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in convert
    with open(fn, 'w') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3296/step1-simulation-ip-172-31-40-253-32059-job21816/logs/challenges-runner/stdout.html'
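
This host-error is an infrastructure failure in the challenge runner rather than in the submission itself: convert() tries to write stdout.html into an execution log directory that is missing, so open(fn, 'w') raises FileNotFoundError. A minimal sketch of the underlying idea, using a hypothetical helper (this is not the actual runner code), is to create the parent directory before writing:

import os

def write_converted_log(fn: str, html: str) -> None:
    # Hypothetical helper: ensure the parent directory exists so that
    # writing the converted log cannot fail with FileNotFoundError (errno 2).
    parent = os.path.dirname(fn)
    if parent:
        os.makedirs(parent, exist_ok=True)
    with open(fn, "w") as f:
        f.write(html)

In this case the job was simply re-run; as the listing above shows, the subsequent jobs 21817 and 22444 completed successfully.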