Job 22415
step1-simulation: success (yes)
Started:  2019-05-16 20:02:17+00:00
Finished: 2019-05-16 20:18:49+00:00
Duration: 0:16:32
Median scores:
  driven_lanedir_consec: 0.9549096814971124
  survival_time:         14.950000000000076
  deviation-center-line: 0.9357839616453356
  in-drivable-lane:      0.849999999999997
Other statistics (min / median / mean / max):
  agent_compute-ego:           0.1688112511354334 / 0.172976446946462 / 0.17371583696440154 / 0.18074224154154456
  deviation-center-line:       0.1999349376906932 / 0.9357839616453356 / 0.9879715075158756 / 2.334655865862999
  deviation-heading:           2.174753382622343 / 3.596230592332538 / 5.1200907887469835 / 12.904965982970015
  driven_any:                  1.9152295794596823e-12 / 1.0879291838247798 / 1.2052548943363444 / 2.748207692459921
  driven_lanedir:              -5.89932152017525e-08 / 1.0850827295486316 / 0.720569905609605 / 1.387048580759752
  driven_lanedir_consec:       -5.89932152017525e-08 / 0.9549096814971124 / 0.6854081866510972 / 1.387048580759752
  in-drivable-lane:            0 / 0.849999999999997 / 3.3900000000000197 / 9.100000000000067
  set_robot_commands:          0.07612169663111369 / 0.07759058157602947 / 0.07757245872534958 / 0.07861074279336368
  sim_compute_performance-ego: 0.06447155615862678 / 0.06492121616999308 / 0.06523562920327279 / 0.06616552273432413
  sim_compute_robot_state-ego: 0.0631738915162928 / 0.06385136047999064 / 0.06400414494907154 / 0.06484139760335286
  sim_compute_sim_state:       0.04019814411799113 / 0.04043722152709961 / 0.04067967863643871 / 0.04163806298199822
  sim_physics:                 0.058198738098144534 / 0.059270910422007245 / 0.05934457015991211 / 0.0607490611076355
  sim_render-ego:              0.05917840564952177 / 0.06280940612157186 / 0.06178938961964028 / 0.06288059234619141
  survival_time:               4.249999999999993 / 14.950000000000076 / 12.81000000000006 / 14.950000000000076

simulation-passed: 1

Per-episode details:

ETHZ_autolab_technical_track-0-0:
  survival_time: 14.950000000000076, driven_any: 2.748207692459921, driven_lanedir: 1.387048580759752, driven_lanedir_consec: 1.387048580759752, in-drivable-lane: 7.0000000000000355
  deviation-heading: 2.174753382622343, deviation-center-line: 1.0108148594987827
  agent_compute-ego: 0.1760239553451538, set_robot_commands: 0.07759058157602947, sim_physics: 0.05899174769719442, sim_render-ego: 0.06287358442942302, sim_compute_sim_state: 0.04043722152709961, sim_compute_performance-ego: 0.06457882006963094, sim_compute_robot_state-ego: 0.06379093408584595

ETHZ_autolab_technical_track-1-0:
  survival_time: 14.950000000000076, driven_any: 2.112865745022347, driven_lanedir: 1.1307182762896515, driven_lanedir_consec: 0.9549096814971124, in-drivable-lane: 9.100000000000067
  deviation-heading: 3.596230592332538, deviation-center-line: 0.45866791288156766
  agent_compute-ego: 0.1700252898534139, set_robot_commands: 0.07612169663111369, sim_physics: 0.059270910422007245, sim_render-ego: 0.061204959551493326, sim_compute_sim_state: 0.04020744721094767, sim_compute_performance-ego: 0.06616552273432413, sim_compute_robot_state-ego: 0.06436314105987549

ETHZ_autolab_technical_track-2-0:
  survival_time: 4.249999999999993, driven_any: 0.07727185037275815, driven_lanedir: -5.89932152017525e-08, driven_lanedir_consec: -5.89932152017525e-08, in-drivable-lane: 0.849999999999997
  deviation-heading: 2.8226336951116706, deviation-center-line: 0.1999349376906932
  agent_compute-ego: 0.1688112511354334, set_robot_commands: 0.07861074279336368, sim_physics: 0.058198738098144534, sim_render-ego: 0.05917840564952177, sim_compute_sim_state: 0.04163806298199822, sim_compute_performance-ego: 0.06447155615862678, sim_compute_robot_state-ego: 0.0631738915162928

ETHZ_autolab_technical_track-3-0:
  survival_time: 14.950000000000076, driven_any: 1.9152295794596823e-12, driven_lanedir: 4.4320458414404124e-10, driven_lanedir_consec: 4.4320458414404124e-10, in-drivable-lane: 0
  deviation-heading: 12.904965982970015, deviation-center-line: 0.9357839616453356
  agent_compute-ego: 0.18074224154154456, set_robot_commands: 0.07841910759607951, sim_physics: 0.0607490611076355, sim_render-ego: 0.06280940612157186, sim_compute_sim_state: 0.04019814411799113, sim_compute_performance-ego: 0.06492121616999308, sim_compute_robot_state-ego: 0.06385136047999064

ETHZ_autolab_technical_track-4-0:
  survival_time: 14.950000000000076, driven_any: 1.0879291838247798, driven_lanedir: 1.0850827295486316, driven_lanedir_consec: 1.0850827295486316, in-drivable-lane: 0
  deviation-heading: 4.101870290698347, deviation-center-line: 2.334655865862999
  agent_compute-ego: 0.172976446946462, set_robot_commands: 0.07712016503016154, sim_physics: 0.05951239347457886, sim_render-ego: 0.06288059234619141, sim_compute_sim_state: 0.0409175173441569, sim_compute_performance-ego: 0.06604103088378906, sim_compute_robot_state-ego: 0.06484139760335286
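The aggregate rows (min / median / mean / max) can be recomputed directly from the per-episodes JSON. A minimal sketch, assuming the details are available as a JSON string (only two episodes and two metrics are included here for brevity; `aggregate` is an illustrative helper name, not part of the evaluator):

```python
import json
from statistics import mean, median

# Per-episode details as emitted by the evaluator (truncated to two
# episodes and two metrics for this sketch).
per_episodes = json.loads("""
{"ETHZ_autolab_technical_track-0-0":
    {"survival_time": 14.950000000000076, "driven_any": 2.748207692459921},
 "ETHZ_autolab_technical_track-2-0":
    {"survival_time": 4.249999999999993, "driven_any": 0.07727185037275815}}
""")

def aggregate(metric):
    """Collect one metric across all episodes and summarize it."""
    values = [ep[metric] for ep in per_episodes.values()]
    return {"min": min(values), "median": median(values),
            "mean": mean(values), "max": max(values)}

print(aggregate("survival_time"))
```

With all five episodes loaded, the same helper reproduces the min/median/mean/max values listed above.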
No reset possible

Job 21855
step1-simulation: host-error (no)
Started:  2019-05-10 04:51:36+00:00
Finished: 2019-05-10 05:00:30+00:00
Duration: 0:08:54
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 488, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 347, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in convert
    with open(fn, 'w') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3321/step1-simulation-ip-172-31-38-104-5376-job21855/logs/challenges-runner/stdout.html'
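The failure happens in the runner's log-conversion step: `convert` opens `stdout.html` for writing inside an executions directory that has apparently been removed or was never created, so `open(fn, 'w')` raises `FileNotFoundError`. A defensive sketch of the idea (the function name `convert` comes from the traceback; the `makedirs` guard and the placeholder body are assumptions for illustration, not the project's actual patch):

```python
import os

def convert(fn):
    """Write the converted HTML log to fn, creating the parent directory
    first so open(fn, 'w') cannot fail with FileNotFoundError when the
    executions tree has been cleaned up underneath the runner."""
    parent = os.path.dirname(fn)
    if parent:
        os.makedirs(parent, exist_ok=True)  # assumed guard, not upstream code
    with open(fn, 'w') as f:
        f.write("<html><!-- converted log body goes here --></html>")
```

An alternative design choice is to catch `FileNotFoundError` at the call site and skip log conversion, since the error occurred during cleanup rather than during evaluation itself.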
No reset possible