Duckietown Challenges

Job 40825

Job ID: 40825
submission: 10854
user: Dishank Bansal 🇨🇦
user label: exercise_ros_template
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (the challenge has been changed since this job was evaluated)
evaluator: mont02-5bd91ed070bc-1
date started:
date completed:
duration: 0:09:05
message:
driven_lanedir_consec_median: 1.7077608117618737
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.4934872808954404
in-drivable-lane_median: 1.125000000000013
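
These four figures appear to be the medians of the corresponding per-episode values over the four evaluation episodes reported under per-episodes details below. A minimal sketch of that reduction, with the episode values copied from that section (the list variables are illustrative, not part of the evaluation output):

    from statistics import median

    # Per-episode values copied from the per-episodes details section below.
    survival_time = [8.499999999999986, 14.950000000000076,
                     14.950000000000076, 14.950000000000076]
    driven_lanedir_consec = [1.2153475626507988, 2.2001740608729485,
                             0.7654696510893919, 2.4975546245067792]

    # With four episodes, the median is the mean of the two middle values.
    print(median(survival_time))          # survival_time_median
    print(median(driven_lanedir_consec))  # driven_lanedir_consec_median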


other stats
agent_compute-ego_max: 0.018354993661244712
agent_compute-ego_mean: 0.017698457790356056
agent_compute-ego_median: 0.01755475044250488
agent_compute-ego_min: 0.01732933661516975
complete-iteration_max: 0.21511857986450195
complete-iteration_mean: 0.21086857096821657
complete-iteration_median: 0.2108644679480908
complete-iteration_min: 0.2066267681121826
deviation-center-line_max: 0.6537021714729888
deviation-center-line_mean: 0.47431286869607825
deviation-center-line_min: 0.2565747415204433
deviation-heading_max: 3.2265703818838274
deviation-heading_mean: 2.6644352170123637
deviation-heading_median: 2.863214833991794
deviation-heading_min: 1.7047408181820398
driven_any_max: 2.5525615417981427
driven_any_mean: 2.268771076062672
driven_any_median: 2.5525327394969155
driven_any_min: 1.4174572834587147
driven_lanedir_consec_max: 2.4975546245067792
driven_lanedir_consec_mean: 1.6696364747799797
driven_lanedir_consec_min: 0.7654696510893919
driven_lanedir_max: 2.4975546245067792
driven_lanedir_mean: 1.7165418448103682
driven_lanedir_median: 1.7077608117618737
driven_lanedir_min: 0.9530911312109468
get_duckie_state_max: 2.8888384501139324e-06
get_duckie_state_mean: 2.8305544572718005e-06
get_duckie_state_median: 2.8626124064127607e-06
get_duckie_state_min: 2.7081545661477483e-06
get_robot_state_max: 0.016269432797151453
get_robot_state_mean: 0.01604644936673781
get_robot_state_median: 0.01608784039815267
get_robot_state_min: 0.015740683873494466
get_state_dump_max: 0.014043719628277948
get_state_dump_mean: 0.013739435836380603
get_state_dump_median: 0.013672516345977784
get_state_dump_min: 0.0135689910252889
get_ui_image_max: 0.036667823791503906
get_ui_image_mean: 0.03658772607644399
get_ui_image_median: 0.03657733639081319
get_ui_image_min: 0.03652840773264567
in-drivable-lane_max: 8.000000000000057
in-drivable-lane_mean: 2.562500000000021
in-drivable-lane_min: 0.0
per-episodes details:

ETHZ_autolab_technical_track-sc0-0-ego:
  driven_any: 1.4174572834587147
  get_ui_image: 0.036667823791503906
  step_physics: 0.08346242343678194
  survival_time: 8.499999999999986
  driven_lanedir: 1.2153475626507988
  get_state_dump: 0.014043719628277948
  sim_render-ego: 0.0062225201550652
  get_robot_state: 0.016269432797151453
  get_duckie_state: 2.7081545661477483e-06
  in-drivable-lane: 0.7500000000000053
  agent_compute-ego: 0.01732933661516975
  deviation-heading: 1.7047408181820398
  complete-iteration: 0.2069239448098575
  set_robot_commands: 0.004801579082713408
  deviation-center-line: 0.2565747415204433
  driven_lanedir_consec: 1.2153475626507988
  sim_compute_sim_state: 0.02374105313244988
  sim_compute_performance-ego: 0.0042340601191801185

ETHZ_autolab_technical_track-sc1-0-ego:
  driven_any: 2.5525206784907986
  get_ui_image: 0.036549967924753825
  step_physics: 0.08331032673517863
  survival_time: 14.950000000000076
  driven_lanedir: 2.2001740608729485
  get_state_dump: 0.013669339815775554
  sim_render-ego: 0.006290440559387207
  get_robot_state: 0.015740683873494466
  get_duckie_state: 2.8888384501139324e-06
  in-drivable-lane: 1.500000000000021
  agent_compute-ego: 0.017710875670115152
  deviation-heading: 3.2265703818838274
  complete-iteration: 0.2066267681121826
  set_robot_commands: 0.0047728308041890466
  deviation-center-line: 0.6152932192455625
  driven_lanedir_consec: 2.2001740608729485
  sim_compute_sim_state: 0.024193143049875895
  sim_compute_performance-ego: 0.004237864017486572

ETHZ_autolab_technical_track-sc2-0-ego:
  driven_any: 2.552544800503032
  get_ui_image: 0.03660470485687256
  step_physics: 0.08953081130981445
  survival_time: 14.950000000000076
  driven_lanedir: 0.9530911312109468
  get_state_dump: 0.0135689910252889
  sim_render-ego: 0.006465617020924886
  get_robot_state: 0.016219650109608966
  get_duckie_state: 2.886454264322917e-06
  in-drivable-lane: 8.000000000000057
  agent_compute-ego: 0.018354993661244712
  deviation-heading: 3.0855345332679582
  complete-iteration: 0.2148049910863241
  set_robot_commands: 0.0049984359741210935
  deviation-center-line: 0.3716813425453184
  driven_lanedir_consec: 0.7654696510893919
  sim_compute_sim_state: 0.024494966665903728
  sim_compute_performance-ego: 0.004411344528198242

ETHZ_autolab_technical_track-sc3-0-ego:
  driven_any: 2.5525615417981427
  get_ui_image: 0.03652840773264567
  step_physics: 0.0888947359720866
  survival_time: 14.950000000000076
  driven_lanedir: 2.4975546245067792
  get_state_dump: 0.013675692876180014
  sim_render-ego: 0.006146461963653565
  get_robot_state: 0.01595603068669637
  get_duckie_state: 2.838770548502604e-06
  in-drivable-lane: 0.0
  agent_compute-ego: 0.017398625214894613
  deviation-heading: 2.6408951347156298
  complete-iteration: 0.21511857986450195
  set_robot_commands: 0.004747296174367269
  deviation-center-line: 0.6537021714729888
  driven_lanedir_consec: 2.4975546245067792
  sim_compute_sim_state: 0.027507274945576983
  sim_compute_performance-ego: 0.004112908045450846
set_robot_commands_max: 0.0049984359741210935
set_robot_commands_mean: 0.0048300355088477034
set_robot_commands_median: 0.004787204943451227
set_robot_commands_min: 0.004747296174367269
sim_compute_performance-ego_max: 0.004411344528198242
sim_compute_performance-ego_mean: 0.004249044177578945
sim_compute_performance-ego_median: 0.004235962068333345
sim_compute_performance-ego_min: 0.004112908045450846
sim_compute_sim_state_max: 0.027507274945576983
sim_compute_sim_state_mean: 0.024984109448451624
sim_compute_sim_state_median: 0.02434405485788981
sim_compute_sim_state_min: 0.02374105313244988
sim_render-ego_max: 0.006465617020924886
sim_render-ego_mean: 0.006281259924757715
sim_render-ego_median: 0.0062564803572262035
sim_render-ego_min: 0.006146461963653565
simulation-passed: 1
step_physics_max: 0.08953081130981445
step_physics_mean: 0.0862995743634654
step_physics_median: 0.08617857970443427
step_physics_min: 0.08331032673517863
survival_time_max: 14.950000000000076
survival_time_mean: 13.337500000000055
survival_time_min: 8.499999999999986
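
The *_min/_mean/_median/_max rows above appear to be simple reductions of the per-episode values listed under per-episodes details. A minimal sketch, with two metrics copied from that section as an example (the dictionary and variable names are illustrative, not part of the evaluator's output format); the printed values should match the aggregate rows up to floating-point formatting:

    from statistics import mean, median

    # Per-episode values copied from the per-episodes details section.
    per_episode = {
        "deviation-center-line": [0.2565747415204433, 0.6152932192455625,
                                  0.3716813425453184, 0.6537021714729888],
        "step_physics": [0.08346242343678194, 0.08331032673517863,
                         0.08953081130981445, 0.0888947359720866],
    }

    # Reduce each metric over the four episodes.
    for metric, values in per_episode.items():
        print(f"{metric}_max: {max(values)}")
        print(f"{metric}_mean: {mean(values)}")
        print(f"{metric}_median: {median(values)}")
        print(f"{metric}_min: {min(values)}")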

Highlights



ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.