
Submission 11362

Submission: 11362
Competing: yes
Challenge: aido5-LF-sim-validation
User: Liam Paull 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54802
Next:
User label: exercise_state_estimation
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54802

Episodes evaluated in this job (detailed per-episode statistics are linked from the images on the original page):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of this challenge.
Job ID | step | status | up to date | date started | date completed | duration | message
54802 | LFv-sim | success | yes | | | 0:29:12 |
driven_lanedir_consec_median: 3.250359390194503
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.5882548385942976
in-drivable-lane_median: 8.699999999999983


other stats

agent_compute-ego0_max: 0.03502325578169389
agent_compute-ego0_mean: 0.025990991139969576
agent_compute-ego0_median: 0.028261725749699505
agent_compute-ego0_min: 0.01241725727878542
complete-iteration_max: 0.26072798977213457
complete-iteration_mean: 0.20320288932225591
complete-iteration_median: 0.18647059552576223
complete-iteration_min: 0.17914237646536466
deviation-center-line_max: 3.416307457823777
deviation-center-line_mean: 2.344189065321331
deviation-center-line_min: 0.783939126272952
deviation-heading_max: 11.336214679551407
deviation-heading_mean: 7.339858364374369
deviation-heading_median: 8.181078467513771
deviation-heading_min: 1.661061842918529
driven_any_max: 7.921208961708072
driven_any_mean: 6.304505870967548
driven_any_median: 7.917539058009882
driven_any_min: 1.4617364061423548
driven_lanedir_consec_max: 3.919699507336555
driven_lanedir_consec_mean: 2.8642439274112466
driven_lanedir_consec_min: 1.0365574219194262
driven_lanedir_max: 7.008822512369562
driven_lanedir_mean: 4.408963595825611
driven_lanedir_median: 4.795237224506729
driven_lanedir_min: 1.0365574219194262
get_duckie_state_max: 1.3733386596374766e-06
get_duckie_state_mean: 1.3270832387951877e-06
get_duckie_state_median: 1.3176526692629233e-06
get_duckie_state_min: 1.299688957017427e-06
get_robot_state_max: 0.003780615716850033
get_robot_state_mean: 0.003695803596738792
get_robot_state_median: 0.0037042992973121574
get_robot_state_min: 0.003594000075480821
get_state_dump_max: 0.004824368383961852
get_state_dump_mean: 0.004722275188461546
get_state_dump_median: 0.004746000301109398
get_state_dump_min: 0.004572731767665536
get_ui_image_max: 0.03667284634487688
get_ui_image_mean: 0.03059671390686696
get_ui_image_median: 0.03026599381785905
get_ui_image_min: 0.02518202164687285
in-drivable-lane_max: 28.39999999999897
in-drivable-lane_mean: 12.34999999999974
in-drivable-lane_min: 3.6000000000000236
per-episode details (parsed by the sketch at the end of this job's results):

{"LF-norm-loop-000-ego0": {"driven_any": 7.921208961708072, "get_ui_image": 0.028536309608313364, "step_physics": 0.10202008302166105, "survival_time": 59.99999999999873, "driven_lanedir": 7.008822512369562, "get_state_dump": 0.004713261752799587, "get_robot_state": 0.0037421070467323985, "sim_render-ego0": 0.0038244533697631734, "get_duckie_state": 1.3733386596374766e-06, "in-drivable-lane": 5.800000000000082, "deviation-heading": 6.26363328538582, "agent_compute-ego0": 0.02164586537287297, "complete-iteration": 0.17914237646536466, "set_robot_commands": 0.00225301368548213, "deviation-center-line": 3.416307457823777, "driven_lanedir_consec": 3.919699507336555, "sim_compute_sim_state": 0.010278474679100424, "sim_compute_performance-ego0": 0.002045557957505505},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.4617364061423548, "get_ui_image": 0.03667284634487688, "step_physics": 0.1614796139977195, "survival_time": 12.050000000000036, "driven_lanedir": 1.0365574219194262, "get_state_dump": 0.004778738849419208, "get_robot_state": 0.003666491547891916, "sim_render-ego0": 0.003910159276536674, "get_duckie_state": 1.3250949954198412e-06, "in-drivable-lane": 3.6000000000000236, "deviation-heading": 1.661061842918529, "agent_compute-ego0": 0.03502325578169389, "complete-iteration": 0.26072798977213457, "set_robot_commands": 0.002249909826546661, "deviation-center-line": 0.783939126272952, "driven_lanedir_consec": 1.0365574219194262, "sim_compute_sim_state": 0.010813231310568563, "sim_compute_performance-ego0": 0.002048301302697048},
 "LF-norm-techtrack-000-ego0": {"driven_any": 7.914385038068975, "get_ui_image": 0.03199567802740473, "step_physics": 0.12119623902040556, "survival_time": 59.99999999999873, "driven_lanedir": 5.914588708028821, "get_state_dump": 0.004572731767665536, "get_robot_state": 0.003780615716850033, "sim_render-ego0": 0.003873863585485606, "get_duckie_state": 1.299688957017427e-06, "in-drivable-lane": 11.599999999999884, "deviation-heading": 11.336214679551407, "agent_compute-ego0": 0.01241725727878542, "complete-iteration": 0.19301164180015545, "set_robot_commands": 0.0023000131141732477, "deviation-center-line": 2.977150305906166, "driven_lanedir_consec": 3.200613750015373, "sim_compute_sim_state": 0.010703476938379496, "sim_compute_performance-ego0": 0.002090543235569175},
 "LF-norm-small_loop-000-ego0": {"driven_any": 7.920693077950789, "get_ui_image": 0.02518202164687285, "step_physics": 0.09718195842168016, "survival_time": 59.99999999999873, "driven_lanedir": 3.6758857409846377, "get_state_dump": 0.004824368383961852, "get_robot_state": 0.003594000075480821, "sim_render-ego0": 0.003751960821096149, "get_duckie_state": 1.3102103431060054e-06, "in-drivable-lane": 28.39999999999897, "deviation-heading": 10.09852364964172, "agent_compute-ego0": 0.034877586126526035, "complete-iteration": 0.179929549251369, "set_robot_commands": 0.002258541780546444, "deviation-center-line": 2.199359371282429, "driven_lanedir_consec": 3.300105030373633, "sim_compute_sim_state": 0.006160978075864412, "sim_compute_performance-ego0": 0.002017612759020803}}
set_robot_commands_max: 0.0023000131141732477
set_robot_commands_mean: 0.002265369601687121
set_robot_commands_median: 0.002255777733014287
set_robot_commands_min: 0.002249909826546661
sim_compute_performance-ego0_max: 0.002090543235569175
sim_compute_performance-ego0_mean: 0.0020505038136981327
sim_compute_performance-ego0_median: 0.002046929630101276
sim_compute_performance-ego0_min: 0.002017612759020803
sim_compute_sim_state_max: 0.010813231310568563
sim_compute_sim_state_mean: 0.009489040250978223
sim_compute_sim_state_median: 0.01049097580873996
sim_compute_sim_state_min: 0.006160978075864412
sim_render-ego0_max: 0.003910159276536674
sim_render-ego0_mean: 0.0038401092632204007
sim_render-ego0_median: 0.00384915847762439
sim_render-ego0_min: 0.003751960821096149
simulation-passed: 1
step_physics_max: 0.1614796139977195
step_physics_mean: 0.12046947361536656
step_physics_median: 0.1116081610210333
step_physics_min: 0.09718195842168016
survival_time_max: 59.99999999999873
survival_time_mean: 48.01249999999905
survival_time_min: 12.050000000000036
No reset possible
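The aggregate *_min / *_mean / *_median / *_max rows above are per-key statistics over the four episodes listed in the per-episode details. A minimal sketch of how to recompute them with the Python standard library; the dictionary below copies the survival_time values from the per-episode details, and the variable names are illustrative:

from statistics import mean, median

# survival_time per episode, copied from the per-episode details above.
per_episode_survival_time = {
    "LF-norm-loop-000-ego0": 59.99999999999873,
    "LF-norm-zigzag-000-ego0": 12.050000000000036,
    "LF-norm-techtrack-000-ego0": 59.99999999999873,
    "LF-norm-small_loop-000-ego0": 59.99999999999873,
}

values = list(per_episode_survival_time.values())
# These reproduce the survival_time_min / _mean / _median / _max rows above;
# the same recipe applies to every other per-episode key.
print("survival_time_min:   ", min(values))
print("survival_time_mean:  ", mean(values))
print("survival_time_median:", median(values))
print("survival_time_max:   ", max(values))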
54781 | LFv-sim | success | yes | | | 0:34:39 |
No reset possible