Duckietown Challenges

Submission 5336

Submission: 5336
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28354
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28354

Episodes:

ETHZ_autolab_technical_track-0-0

ETHZ_autolab_technical_track-1-0

ETHZ_autolab_technical_track-2-0

ETHZ_autolab_technical_track-3-0

ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 28354, step: step1-simulation, status: success, up to date: yes, duration: 0:06:59
Artefacts hidden.
driven_lanedir_consec_median: 3.2586648048992704
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.809628930719978
in-drivable-lane_median: 1.3000000000000185


other stats
agent_compute-ego_max: 0.01635107119878133
agent_compute-ego_mean: 0.01589619384773474
agent_compute-ego_median: 0.01587965408960978
agent_compute-ego_min: 0.015517955621083575
deviation-center-line_max: 1.036210024583842
deviation-center-line_mean: 0.8139317780580736
deviation-center-line_min: 0.6087381298062428
deviation-heading_max: 5.3622219530446795
deviation-heading_mean: 3.805079473479356
deviation-heading_median: 3.720178072453094
deviation-heading_min: 2.2535432268317375
driven_any_max: 3.969252682946914
driven_any_mean: 3.4554236086108787
driven_any_median: 3.553831177200358
driven_any_min: 2.8678041844063142
driven_lanedir_consec_max: 3.8529962388611327
driven_lanedir_consec_mean: 3.1652765567024597
driven_lanedir_consec_min: 2.3946255065295072
driven_lanedir_max: 3.8529962388611327
driven_lanedir_mean: 3.1659593388168523
driven_lanedir_median: 3.2620787154712327
driven_lanedir_min: 2.3946255065295072
in-drivable-lane_max: 1.900000000000027
in-drivable-lane_mean: 1.1900000000000168
in-drivable-lane_min: 0
per-episode details

ETHZ_autolab_technical_track-0-0:
driven_any: 2.8678041844063142
sim_physics: 0.055976564884185794
survival_time: 14.950000000000076
driven_lanedir: 2.3946255065295072
sim_render-ego: 0.006957437992095947
in-drivable-lane: 1.900000000000027
agent_compute-ego: 0.01635107119878133
deviation-heading: 4.647256868838484
set_robot_commands: 0.007265555063883464
deviation-center-line: 0.9870237468558646
driven_lanedir_consec: 2.3946255065295072
sim_compute_sim_state: 0.0036018713315327964
sim_compute_performance-ego: 0.0040997950236002605
sim_compute_robot_state-ego: 0.005123799641927084

ETHZ_autolab_technical_track-1-0:
driven_any: 3.053309970469099
sim_physics: 0.05670751651128133
survival_time: 14.950000000000076
driven_lanedir: 2.7043031474773973
sim_render-ego: 0.007065291404724121
in-drivable-lane: 1.3000000000000185
agent_compute-ego: 0.01584421157836914
deviation-heading: 5.3622219530446795
set_robot_commands: 0.007176233927408854
deviation-center-line: 0.809628930719978
driven_lanedir_consec: 2.7043031474773973
sim_compute_sim_state: 0.0036557475725809735
sim_compute_performance-ego: 0.003991055488586426
sim_compute_robot_state-ego: 0.005153706868489583

ETHZ_autolab_technical_track-2-0:
driven_any: 3.832920028031711
sim_physics: 0.05480601628621419
survival_time: 14.950000000000076
driven_lanedir: 3.615793085744991
sim_render-ego: 0.007071479956309001
in-drivable-lane: 1.200000000000017
agent_compute-ego: 0.015517955621083575
deviation-heading: 3.042197246228786
set_robot_commands: 0.007058383623758952
deviation-center-line: 1.036210024583842
driven_lanedir_consec: 3.615793085744991
sim_compute_sim_state: 0.0034744962056477867
sim_compute_performance-ego: 0.003973416487375895
sim_compute_robot_state-ego: 0.0050950837135314946

ETHZ_autolab_technical_track-3-0:
driven_any: 3.969252682946914
sim_physics: 0.05789172109753016
survival_time: 12.150000000000038
driven_lanedir: 3.8529962388611327
sim_render-ego: 0.007099366482393241
in-drivable-lane: 0
agent_compute-ego: 0.01588807675082988
deviation-heading: 2.2535432268317375
set_robot_commands: 0.007224149664733635
deviation-center-line: 0.6087381298062428
driven_lanedir_consec: 3.8529962388611327
sim_compute_sim_state: 0.0038581914862487543
sim_compute_performance-ego: 0.004017712157449604
sim_compute_robot_state-ego: 0.005194613962997625

ETHZ_autolab_technical_track-4-0:
driven_any: 3.553831177200358
sim_physics: 0.05664733409881592
survival_time: 14.950000000000076
driven_lanedir: 3.2620787154712327
sim_render-ego: 0.007075838247934977
in-drivable-lane: 1.550000000000022
agent_compute-ego: 0.01587965408960978
deviation-heading: 3.720178072453094
set_robot_commands: 0.007065423329671224
deviation-center-line: 0.6280580583244406
driven_lanedir_consec: 3.2586648048992704
sim_compute_sim_state: 0.003693025112152099
sim_compute_performance-ego: 0.003980721632639567
sim_compute_robot_state-ego: 0.005145529905954997
set_robot_commands_max: 0.007265555063883464
set_robot_commands_mean: 0.007157949121891226
set_robot_commands_median: 0.007176233927408854
set_robot_commands_min: 0.007058383623758952
sim_compute_performance-ego_max: 0.0040997950236002605
sim_compute_performance-ego_mean: 0.0040125401579303496
sim_compute_performance-ego_median: 0.003991055488586426
sim_compute_performance-ego_min: 0.003973416487375895
sim_compute_robot_state-ego_max: 0.005194613962997625
sim_compute_robot_state-ego_mean: 0.005142546818580156
sim_compute_robot_state-ego_median: 0.005145529905954997
sim_compute_robot_state-ego_min: 0.0050950837135314946
sim_compute_sim_state_max: 0.0038581914862487543
sim_compute_sim_state_mean: 0.003656666341632482
sim_compute_sim_state_median: 0.0036557475725809735
sim_compute_sim_state_min: 0.0034744962056477867
sim_physics_max: 0.05789172109753016
sim_physics_mean: 0.05640583057560549
sim_physics_median: 0.05664733409881592
sim_physics_min: 0.05480601628621419
sim_render-ego_max: 0.007099366482393241
sim_render-ego_mean: 0.0070538828166914576
sim_render-ego_median: 0.007071479956309001
sim_render-ego_min: 0.006957437992095947
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.390000000000068
survival_time_min: 12.150000000000038
Job ID: 28352, step: step1-simulation, status: success, up to date: yes, duration: 0:07:00
Artefacts hidden.