
Job 22567

Job ID: 22567
Submission: 3156
User: Angel Woo 🇭🇰
User label: After debugging
Challenge: aido2-LF-sim-validation
Step: step1-simulation
Status: success
Up to date: yes
Evaluator: ip-172-31-42-167-7194
Date started:
Date completed:
Duration: 0:15:17
Message:
driven_lanedir_consec_median: 0.17915640456498538
survival_time_median: 8.74999999999999
deviation-center-line_median: 0.2003248037254356
in-drivable-lane_median: 4.049999999999997


other stats
agent_compute-ego_max: 0.16470309893290205
agent_compute-ego_mean: 0.15881664171171347
agent_compute-ego_median: 0.1587613650730678
agent_compute-ego_min: 0.1522204101085663
deviation-center-line_max: 0.41658470774036993
deviation-center-line_mean: 0.22612745800118436
deviation-center-line_min: 0.13556638336345997
deviation-heading_max: 5.8981703053044185
deviation-heading_mean: 3.890366408412045
deviation-heading_median: 3.470304640722573
deviation-heading_min: 1.5989489549267215
driven_any_max: 0.8920135755657805
driven_any_mean: 0.6059155717399937
driven_any_median: 0.5478809630822497
driven_any_min: 0.2849249755599099
driven_lanedir_consec_max: 0.27036834295453116
driven_lanedir_consec_mean: 0.1809468053859057
driven_lanedir_consec_min: 0.07652771045324913
driven_lanedir_max: 0.27036834295453116
driven_lanedir_mean: 0.1809468053859057
driven_lanedir_median: 0.17915640456498538
driven_lanedir_min: 0.07652771045324913
in-drivable-lane_max: 7.300000000000045
in-drivable-lane_mean: 4.690000000000015
in-drivable-lane_min: 1.8499999999999983
per-episode details:

ETHZ_autolab_technical_track-0-0:
  driven_any: 0.8920135755657805
  driven_lanedir: 0.2375050995277523
  driven_lanedir_consec: 0.2375050995277523
  survival_time: 14.950000000000076
  in-drivable-lane: 7.25000000000004
  deviation-center-line: 0.2375517005031117
  deviation-heading: 5.856076269421671
  agent_compute-ego: 0.16351343790690104
  set_robot_commands: 0.0904528244336446
  sim_physics: 0.10968043088912964
  sim_render-ego: 0.06209482431411743
  sim_compute_sim_state: 0.03822298367818197
  sim_compute_performance-ego: 0.0674483569463094
  sim_compute_robot_state-ego: 0.07428908904393514

ETHZ_autolab_technical_track-1-0:
  driven_any: 0.8887544301759801
  driven_lanedir: 0.27036834295453116
  driven_lanedir_consec: 0.27036834295453116
  survival_time: 14.950000000000076
  in-drivable-lane: 7.300000000000045
  deviation-center-line: 0.41658470774036993
  deviation-heading: 5.8981703053044185
  agent_compute-ego: 0.16470309893290205
  set_robot_commands: 0.09173007647196452
  sim_physics: 0.10748707294464112
  sim_render-ego: 0.06122628529866536
  sim_compute_sim_state: 0.03791768232981363
  sim_compute_performance-ego: 0.06679577668507894
  sim_compute_robot_state-ego: 0.07227627913157145

ETHZ_autolab_technical_track-2-0:
  driven_any: 0.2849249755599099
  driven_lanedir: 0.07652771045324913
  driven_lanedir_consec: 0.07652771045324913
  survival_time: 3.999999999999994
  in-drivable-lane: 1.8499999999999983
  deviation-center-line: 0.13556638336345997
  deviation-heading: 1.5989489549267215
  agent_compute-ego: 0.1522204101085663
  set_robot_commands: 0.08731792271137237
  sim_physics: 0.09624545276165009
  sim_render-ego: 0.053854212164878845
  sim_compute_sim_state: 0.03560947477817535
  sim_compute_performance-ego: 0.06238342523574829
  sim_compute_robot_state-ego: 0.06648513078689575

ETHZ_autolab_technical_track-3-0:
  driven_any: 0.416003914316048
  driven_lanedir: 0.14117646942901052
  driven_lanedir_consec: 0.14117646942901052
  survival_time: 6.449999999999985
  in-drivable-lane: 2.999999999999994
  deviation-center-line: 0.1406096946735447
  deviation-heading: 2.6283318716848414
  agent_compute-ego: 0.1548848965371302
  set_robot_commands: 0.08727819975032362
  sim_physics: 0.10210683179456134
  sim_render-ego: 0.05539816848991453
  sim_compute_sim_state: 0.03655384492504504
  sim_compute_performance-ego: 0.06369378030762192
  sim_compute_robot_state-ego: 0.06292209883992986

ETHZ_autolab_technical_track-4-0:
  driven_any: 0.5478809630822497
  driven_lanedir: 0.17915640456498538
  driven_lanedir_consec: 0.17915640456498538
  survival_time: 8.74999999999999
  in-drivable-lane: 4.049999999999997
  deviation-center-line: 0.2003248037254356
  deviation-heading: 3.470304640722573
  agent_compute-ego: 0.1587613650730678
  set_robot_commands: 0.09011144093104773
  sim_physics: 0.10540775162833076
  sim_render-ego: 0.05879937716892787
  sim_compute_sim_state: 0.03768033708844866
  sim_compute_performance-ego: 0.06490881102425712
  sim_compute_robot_state-ego: 0.06811461857386998
set_robot_commands_max: 0.09173007647196452
set_robot_commands_mean: 0.08937809285967056
set_robot_commands_median: 0.09011144093104773
set_robot_commands_min: 0.08727819975032362
sim_compute_performance-ego_max: 0.0674483569463094
sim_compute_performance-ego_mean: 0.06504603003980312
sim_compute_performance-ego_median: 0.06490881102425712
sim_compute_performance-ego_min: 0.06238342523574829
sim_compute_robot_state-ego_max: 0.07428908904393514
sim_compute_robot_state-ego_mean: 0.06881744327524043
sim_compute_robot_state-ego_median: 0.06811461857386998
sim_compute_robot_state-ego_min: 0.06292209883992986
sim_compute_sim_state_max: 0.03822298367818197
sim_compute_sim_state_mean: 0.03719686455993293
sim_compute_sim_state_median: 0.03768033708844866
sim_compute_sim_state_min: 0.03560947477817535
sim_physics_max: 0.10968043088912964
sim_physics_mean: 0.10418550800366258
sim_physics_median: 0.10540775162833076
sim_physics_min: 0.09624545276165009
sim_render-ego_max: 0.06209482431411743
sim_render-ego_mean: 0.05827457348730081
sim_render-ego_median: 0.05879937716892787
sim_render-ego_min: 0.053854212164878845
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 9.820000000000023
survival_time_min: 3.999999999999994
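
The aggregate rows above are per-metric statistics (min, mean, median, max) taken over the five episodes listed in the per-episode details. A minimal sketch of how they can be recomputed, assuming the per-episode JSON has been saved to a hypothetical file per_episodes.json (this is not the official evaluator code):

import json
from statistics import mean, median

# Load the "per-episode details" blob shown above.
# "per_episodes.json" is a hypothetical file name used for illustration.
with open("per_episodes.json") as f:
    per_episode = json.load(f)  # {episode name: {metric: value}}

# Every metric appears in every episode, so take the names from one episode.
metrics = sorted(next(iter(per_episode.values())))

for metric in metrics:
    values = [episode[metric] for episode in per_episode.values()]
    # Over the 5 episodes this reproduces e.g. survival_time_median ~ 8.75
    # and survival_time_mean ~ 9.82, matching the aggregate rows above.
    print(metric, "min:", min(values))
    print(metric, "mean:", mean(values))
    print(metric, "median:", median(values))
    print(metric, "max:", max(values))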
