
Submission 12678

Submission: 12678
Competing: yes
Challenge: aido5-LF-sim-testing
User: Bea Baselines 🐤
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFt-sim: 53492
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50


Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 53492 | Step: LFt-sim | Status: success | Up to date: yes | Duration: 0:25:53
driven_lanedir_consec_median: 2.79824005524381
survival_time_median: 41.799999999999464
deviation-center-line_median: 1.3942895943779623
in-drivable-lane_median: 11.55000000000015


Other stats

agent_compute-ego0_max: 0.013879815183974242
agent_compute-ego0_mean: 0.013116693596527647
agent_compute-ego0_median: 0.013014338197160222
agent_compute-ego0_min: 0.012558282807815908
complete-iteration_max: 0.2235021587216189
complete-iteration_mean: 0.1983018140481721
complete-iteration_median: 0.19934017073648908
complete-iteration_min: 0.17102475599809128
deviation-center-line_max: 4.173717558478709
deviation-center-line_mean: 1.822625602170982
deviation-center-line_min: 0.3282056614492942
deviation-heading_max: 11.81039344128615
deviation-heading_mean: 5.461925204162566
deviation-heading_median: 4.283562385143984
deviation-heading_min: 1.4701826050761433
driven_any_max: 7.921138999296374
driven_any_mean: 5.2656432993421305
driven_any_median: 5.458093068855286
driven_any_min: 2.225248060361575
driven_lanedir_consec_max: 7.333766369440623
driven_lanedir_consec_mean: 3.4263526724360065
driven_lanedir_consec_min: 0.7751642098157845
driven_lanedir_max: 7.333766369440623
driven_lanedir_mean: 3.7019258636902026
driven_lanedir_median: 3.3493864377522016
driven_lanedir_min: 0.7751642098157845
get_duckie_state_max: 1.5722524117272064e-06
get_duckie_state_mean: 1.47085147951992e-06
get_duckie_state_median: 1.4685481626573575e-06
get_duckie_state_min: 1.3740571810377577e-06
get_robot_state_max: 0.004052268429386914
get_robot_state_mean: 0.003913481923787125
get_robot_state_median: 0.003940002110280363
get_robot_state_min: 0.003721655045200856
get_state_dump_max: 0.005379368667815057
get_state_dump_mean: 0.005014172693778108
get_state_dump_median: 0.00499882666296407
get_state_dump_min: 0.004679668781369231
get_ui_image_max: 0.03783880463249975
get_ui_image_mean: 0.03269684635004411
get_ui_image_median: 0.033009658087427375
get_ui_image_min: 0.026929264592821956
in-drivable-lane_max: 18.49999999999983
in-drivable-lane_mean: 10.950000000000028
in-drivable-lane_min: 2.1999999999999815
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 7.921138999296374, "get_ui_image": 0.030378248967497076, "step_physics": 0.11006401937073416, "survival_time": 59.99999999999873, "driven_lanedir": 5.244671202126248, "get_state_dump": 0.005130981227738176, "get_robot_state": 0.003973629750578131, "sim_render-ego0": 0.004111464474223039, "get_duckie_state": 1.5722524117272064e-06, "in-drivable-lane": 18.49999999999983, "deviation-heading": 6.328894531889041, "agent_compute-ego0": 0.012924839118239682, "complete-iteration": 0.18149502549342172, "set_robot_commands": 0.002445732723366311, "deviation-center-line": 2.342644112975315, "driven_lanedir_consec": 4.142378437109462, "sim_compute_sim_state": 0.010147424264315462, "sim_compute_performance-ego0": 0.002213819934168426},
 "LF-norm-zigzag-000-ego0": {"driven_any": 7.920980455188104, "get_ui_image": 0.03783880463249975, "step_physics": 0.14199264480311308, "survival_time": 59.99999999999873, "driven_lanedir": 7.333766369440623, "get_state_dump": 0.004866672098189965, "get_robot_state": 0.003906374469982595, "sim_render-ego0": 0.004050108713472416, "get_duckie_state": 1.3850511460379696e-06, "in-drivable-lane": 2.1999999999999815, "deviation-heading": 11.81039344128615, "agent_compute-ego0": 0.013103837276080764, "complete-iteration": 0.2235021587216189, "set_robot_commands": 0.00241242519127737, "deviation-center-line": 4.173717558478709, "driven_lanedir_consec": 7.333766369440623, "sim_compute_sim_state": 0.013057903882168811, "sim_compute_performance-ego0": 0.0021796329730158544},
 "LF-norm-techtrack-000-ego0": {"driven_any": 2.225248060361575, "get_ui_image": 0.03564106720735768, "step_physics": 0.1345612053087495, "survival_time": 17.90000000000012, "driven_lanedir": 1.454101673378156, "get_state_dump": 0.005379368667815057, "get_robot_state": 0.004052268429386914, "sim_render-ego0": 0.004315008360031255, "get_duckie_state": 1.5520451792767454e-06, "in-drivable-lane": 6.000000000000085, "deviation-heading": 2.238230238398927, "agent_compute-ego0": 0.013879815183974242, "complete-iteration": 0.21718531597955645, "set_robot_commands": 0.0025337638961239447, "deviation-center-line": 0.4459350757806098, "driven_lanedir_consec": 1.454101673378156, "sim_compute_sim_state": 0.014406620625997985, "sim_compute_performance-ego0": 0.002305488400472572},
 "LF-norm-small_loop-000-ego0": {"driven_any": 2.995205682522468, "get_ui_image": 0.026929264592821956, "step_physics": 0.10862033110088325, "survival_time": 23.6000000000002, "driven_lanedir": 0.7751642098157845, "get_state_dump": 0.004679668781369231, "get_robot_state": 0.003721655045200856, "sim_render-ego0": 0.003836541548843868, "get_duckie_state": 1.3740571810377577e-06, "in-drivable-lane": 17.100000000000215, "deviation-heading": 1.4701826050761433, "agent_compute-ego0": 0.012558282807815908, "complete-iteration": 0.17102475599809128, "set_robot_commands": 0.002285197219687335, "deviation-center-line": 0.3282056614492942, "driven_lanedir_consec": 0.7751642098157845, "sim_compute_sim_state": 0.006263850859305319, "sim_compute_performance-ego0": 0.0020417773950427583}}
set_robot_commands_max: 0.0025337638961239447
set_robot_commands_mean: 0.0024192797576137403
set_robot_commands_median: 0.00242907895732184
set_robot_commands_min: 0.002285197219687335
sim_compute_performance-ego0_max: 0.002305488400472572
sim_compute_performance-ego0_mean: 0.0021851796756749025
sim_compute_performance-ego0_median: 0.00219672645359214
sim_compute_performance-ego0_min: 0.0020417773950427583
sim_compute_sim_state_max: 0.014406620625997985
sim_compute_sim_state_mean: 0.010968949907946892
sim_compute_sim_state_median: 0.011602664073242137
sim_compute_sim_state_min: 0.006263850859305319
sim_render-ego0_max: 0.004315008360031255
sim_render-ego0_mean: 0.004078280774142645
sim_render-ego0_median: 0.004080786593847727
sim_render-ego0_min: 0.003836541548843868
simulation-passed: 1
step_physics_max: 0.14199264480311308
step_physics_mean: 0.12380955014587
step_physics_median: 0.12231261233974182
step_physics_min: 0.10862033110088325
survival_time_max: 59.99999999999873
survival_time_mean: 40.37499999999944
survival_time_min: 17.90000000000012
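The aggregate rows above (the *_max, *_mean, *_median, and *_min values) are consistent with simple statistics taken over the four per-episode values in the per-episodes details block. A minimal sketch of that aggregation, assuming the details JSON has been saved locally as episodes.json (a hypothetical filename, not part of this page):

```python
import json
from statistics import mean, median

# Load the per-episodes details shown above (hypothetical local copy).
with open("episodes.json") as f:
    episodes = json.load(f)

# Collect each metric's value across the four episodes.
metrics = {}
for episode_stats in episodes.values():
    for name, value in episode_stats.items():
        metrics.setdefault(name, []).append(value)

# Print the same kind of aggregate rows listed in the stats table.
for name, values in sorted(metrics.items()):
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {mean(values)}")
    print(f"{name}_median: {median(values)}")
    print(f"{name}_min: {min(values)}")
```

For example, driven_any_median: 5.458093068855286 is the median of the four per-episode driven_any values (with an even count, the median is the average of the two middle values).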
Job ID: 53491 | Step: LFt-sim | Status: success | Up to date: yes | Duration: 0:30:59