
Submission 12781

Submission: 12781
Competing: yes
Challenge: aido5-LF-sim-testing
User: Jean-Sébastien Grondin 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFt-sim: 58952
Next:
User label: exercise_state_estimation
Admin priority: 50
Blessing: n/a
User priority: 50


Evaluation jobs for this submission

Job ID: 58952
Step: LFt-sim
Status: success
Up to date: yes
Duration: 0:20:56
Artefacts hidden.
driven_lanedir_consec_median: 4.284426007986331
survival_time_median: 32.849999999999355
deviation-center-line_median: 1.4404055710032897
in-drivable-lane_median: 6.999999999999833
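
These headline scores appear to be per-metric medians over the four evaluation episodes reported in the per-episode details below. A minimal sanity check, assuming the usual even-count median convention (midpoint of the two middle values):

    # Minimal sketch: recompute survival_time_median from the four
    # per-episode survival times listed further down on this page.
    survival_times = [59.99999999999873, 5.699999999999988,
                      2.4499999999999993, 59.99999999999873]
    s = sorted(survival_times)
    survival_time_median = (s[1] + s[2]) / 2  # ~32.85, matching the value above
    print(survival_time_median)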


other stats
agent_compute-ego0_max: 0.013604373517243758
agent_compute-ego0_mean: 0.013268734766149816
agent_compute-ego0_median: 0.013309000403954524
agent_compute-ego0_min: 0.012852564739446458
complete-iteration_max: 0.3079248863717784
complete-iteration_mean: 0.26129427607773364
complete-iteration_median: 0.26496648282036
complete-iteration_min: 0.2073192522984361
deviation-center-line_max: 3.458671769630838
deviation-center-line_mean: 1.6112656453633618
deviation-center-line_min: 0.10557966981602998
deviation-heading_max: 14.898717266903414
deviation-heading_mean: 7.592122324617371
deviation-heading_median: 7.466665404397021
deviation-heading_min: 0.5364412227720272
driven_any_max: 16.674837293245304
driven_any_mean: 8.757804762336672
driven_any_median: 8.988115194764765
driven_any_min: 0.38015136657185367
driven_lanedir_consec_max: 13.105185797372744
driven_lanedir_consec_mean: 5.438475905883022
driven_lanedir_consec_min: 0.07986581018668382
driven_lanedir_max: 13.105185797372744
driven_lanedir_mean: 5.901970092246845
driven_lanedir_median: 5.136036025863605
driven_lanedir_min: 0.230622519887425
get_duckie_state_max: 1.7786026000976562e-06
get_duckie_state_mean: 1.5869866246762482e-06
get_duckie_state_median: 1.5497207641601562e-06
get_duckie_state_min: 1.4699023702870243e-06
get_robot_state_max: 0.004102482795715332
get_robot_state_mean: 0.003895663920290564
get_robot_state_median: 0.003885499841466137
get_robot_state_min: 0.003709173202514648
get_state_dump_max: 0.0051449728012084965
get_state_dump_mean: 0.00491627628792119
get_state_dump_median: 0.004877598844197521
get_state_dump_min: 0.004764934662081221
get_ui_image_max: 0.03631819248199463
get_ui_image_mean: 0.0324241602533012
get_ui_image_median: 0.032946450607672116
get_ui_image_min: 0.02748554731586593
in-drivable-lane_max: 20.14999999999909
in-drivable-lane_mean: 8.787499999999689
in-drivable-lane_min: 0.9999999999999988
per-episode details:
"LF-norm-loop-000-ego0": {"driven_any": 16.674837293245304, "get_ui_image": 0.030000279884751294, "step_physics": 0.15335040743603098, "survival_time": 59.99999999999873, "driven_lanedir": 13.105185797372744, "get_state_dump": 0.004825889419060167, "get_robot_state": 0.003934041745061184, "sim_render-ego0": 0.0040715483205701584, "get_duckie_state": 1.5363208856511177e-06, "in-drivable-lane": 9.799999999999676, "deviation-heading": 14.898717266903414, "agent_compute-ego0": 0.013165065589098014, "complete-iteration": 0.2237679904743992, "set_robot_commands": 0.0023859031591486873, "deviation-center-line": 3.458671769630838, "driven_lanedir_consec": 13.105185797372744, "sim_compute_sim_state": 0.009771421092634494, "sim_compute_performance-ego0": 0.0021606079247670803}
"LF-norm-zigzag-000-ego0": {"driven_any": 1.3021175786767607, "get_ui_image": 0.03589262133059294, "step_physics": 0.22713201771611752, "survival_time": 5.699999999999988, "driven_lanedir": 0.230622519887425, "get_state_dump": 0.004929308269334876, "get_robot_state": 0.003709173202514648, "sim_render-ego0": 0.004408881975256879, "get_duckie_state": 1.4699023702870243e-06, "in-drivable-lane": 4.199999999999988, "deviation-heading": 0.8849207297810224, "agent_compute-ego0": 0.013604373517243758, "complete-iteration": 0.3079248863717784, "set_robot_commands": 0.0023185087286907696, "deviation-center-line": 0.14734103041153096, "driven_lanedir_consec": 0.07986581018668382, "sim_compute_sim_state": 0.013730631703915802, "sim_compute_performance-ego0": 0.0021024538123089334}
"LF-norm-techtrack-000-ego0": {"driven_any": 0.38015136657185367, "get_ui_image": 0.03631819248199463, "step_physics": 0.22695384979248048, "survival_time": 2.4499999999999993, "driven_lanedir": 0.27324670267340556, "get_state_dump": 0.0051449728012084965, "get_robot_state": 0.004102482795715332, "sim_render-ego0": 0.004265403747558594, "get_duckie_state": 1.7786026000976562e-06, "in-drivable-lane": 0.9999999999999988, "deviation-heading": 0.5364412227720272, "agent_compute-ego0": 0.013452935218811034, "complete-iteration": 0.3061649751663208, "set_robot_commands": 0.0024301433563232422, "deviation-center-line": 0.10557966981602998, "driven_lanedir_consec": 0.27324670267340556, "sim_compute_sim_state": 0.011073393821716309, "sim_compute_performance-ego0": 0.0023070526123046874}
"LF-norm-small_loop-000-ego0": {"driven_any": 16.67411281085277, "get_ui_image": 0.02748554731586593, "step_physics": 0.14355406217233624, "survival_time": 59.99999999999873, "driven_lanedir": 9.998825349053805, "get_state_dump": 0.004764934662081221, "get_robot_state": 0.0038369579378710896, "sim_render-ego0": 0.00400769482246545, "get_duckie_state": 1.563120642669195e-06, "in-drivable-lane": 20.14999999999909, "deviation-heading": 14.04841007901302, "agent_compute-ego0": 0.012852564739446458, "complete-iteration": 0.2073192522984361, "set_robot_commands": 0.002310555146000566, "deviation-center-line": 2.7334701115950484, "driven_lanedir_consec": 8.295605313299257, "sim_compute_sim_state": 0.00638804229272593, "sim_compute_performance-ego0": 0.002021778036811568}
set_robot_commands_max: 0.0024301433563232422
set_robot_commands_mean: 0.0023612775975408165
set_robot_commands_median: 0.0023522059439197284
set_robot_commands_min: 0.002310555146000566
sim_compute_performance-ego0_max: 0.0023070526123046874
sim_compute_performance-ego0_mean: 0.0021479730965480674
sim_compute_performance-ego0_median: 0.0021315308685380067
sim_compute_performance-ego0_min: 0.002021778036811568
sim_compute_sim_state_max: 0.013730631703915802
sim_compute_sim_state_mean: 0.010240872227748134
sim_compute_sim_state_median: 0.010422407457175402
sim_compute_sim_state_min: 0.00638804229272593
sim_render-ego0_max: 0.004408881975256879
sim_render-ego0_mean: 0.0041883822164627705
sim_render-ego0_median: 0.004168476034064376
sim_render-ego0_min: 0.00400769482246545
simulation-passed: 1
step_physics_max: 0.22713201771611752
step_physics_mean: 0.1877475842792413
step_physics_median: 0.19015212861425573
step_physics_min: 0.14355406217233624
survival_time_max: 59.99999999999873
survival_time_mean: 32.03749999999936
survival_time_min: 2.4499999999999993
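
The *_min, *_max, *_mean, and *_median rows above look like per-metric aggregates over the four episodes in the per-episode details. A minimal sketch of that aggregation, assuming the per-episode JSON has been saved locally (the file name per_episode_details.json is hypothetical):

    import json
    from statistics import mean, median

    # Hypothetical local copy of the per-episode details JSON shown above.
    with open("per_episode_details.json") as f:
        per_episode = json.load(f)

    # Collect each metric's values across the four episodes.
    metrics = {}
    for episode, values in per_episode.items():
        for name, value in values.items():
            metrics.setdefault(name, []).append(value)

    # Recompute the aggregate rows (min, max, mean, median per metric).
    aggregates = {}
    for name, values in metrics.items():
        aggregates[name + "_min"] = min(values)
        aggregates[name + "_max"] = max(values)
        aggregates[name + "_mean"] = mean(values)
        aggregates[name + "_median"] = median(values)

    # e.g. aggregates["survival_time_median"] should be ~32.85, as reported above.
    for key in sorted(aggregates):
        print(key, aggregates[key])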
Job ID: 58951
Step: LFt-sim
Status: success
Up to date: yes
Duration: 0:25:02
Artefacts hidden.