
Submission 11025

Submission: 11025
Competing: yes
Challenge: aido5-LF-sim-validation
User: Mo Kleit 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57070
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57070

Detailed per-episode statistics were linked from images for the four evaluated episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of the challenge.
Job ID | step | status | up to date | date started | date completed | duration | message
57070 | LFv-sim | success | yes | | | 0:22:26 |
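The duration column appears to use an H:MM:SS format; a minimal sketch for converting it to seconds (the column format is an assumption inferred from the values shown, not documented by the site):

```python
from datetime import timedelta

def parse_duration(text: str) -> timedelta:
    """Parse an H:MM:SS duration string such as '0:22:26'."""
    hours, minutes, seconds = (int(part) for part in text.split(":"))
    return timedelta(hours=hours, minutes=minutes, seconds=seconds)

# Job 57070 ran for 0:22:26
print(parse_duration("0:22:26").total_seconds())  # 1346.0
```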
driven_lanedir_consec_median: 1.9753387802844635
survival_time_median: 38.64999999999942
deviation-center-line_median: 1.2319156234441822
in-drivable-lane_median: 20.774999999999665
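These headline medians can be reproduced from the per-episode values listed further down the page; a minimal sketch in Python, with the four episode values copied from the per-episode details:

```python
from statistics import median

# Per-episode values copied from the per-episode details below
# (loop, zigzag, techtrack, small_loop).
survival_times = [17.30000000000011, 9.45,
                  59.99999999999873, 59.99999999999873]
driven_lanedir_consec = [0.8881247103472769, 0.2162611632316271,
                         5.382364798943632, 3.0625528502216506]

# With four episodes, the median is the mean of the two middle values.
print(median(survival_times))         # ≈ 38.65  (survival_time_median)
print(median(driven_lanedir_consec))  # ≈ 1.9753 (driven_lanedir_consec_median)
```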


Other stats

agent_compute-ego0_max: 0.03335689485916786
agent_compute-ego0_mean: 0.01830228583566683
agent_compute-ego0_median: 0.013651538476647657
agent_compute-ego0_min: 0.012549171530204135
complete-iteration_max: 0.19771311408595035
complete-iteration_mean: 0.1756350605960873
complete-iteration_median: 0.16963121036209883
complete-iteration_min: 0.16556470757420116
deviation-center-line_max: 2.078859203778461
deviation-center-line_mean: 1.2019123670580194
deviation-center-line_min: 0.2649590175652525
deviation-heading_max: 8.072051478664207
deviation-heading_mean: 4.408473049121105
deviation-heading_median: 3.95361182358318
deviation-heading_min: 1.6546170706538528
driven_any_max: 7.678237257813193
driven_any_mean: 4.047420184892982
driven_any_median: 3.950597712810248
driven_any_min: 0.6102480561382378
driven_lanedir_consec_max: 5.382364798943632
driven_lanedir_consec_mean: 2.3873258806860465
driven_lanedir_consec_min: 0.2162611632316271
driven_lanedir_max: 5.382364798943632
driven_lanedir_mean: 2.3873258806860465
driven_lanedir_median: 1.9753387802844635
driven_lanedir_min: 0.2162611632316271
get_duckie_state_max: 1.3853374280427632e-06
get_duckie_state_mean: 1.2803014553599396e-06
get_duckie_state_median: 1.2622685555514449e-06
get_duckie_state_min: 1.2113312822941056e-06
get_robot_state_max: 0.003782306219402112
get_robot_state_mean: 0.0036260707890494945
get_robot_state_median: 0.003610206980391605
get_robot_state_min: 0.003501562976012656
get_state_dump_max: 0.004707709111665424
get_state_dump_mean: 0.004592803411553532
get_state_dump_median: 0.00459297054713215
get_state_dump_min: 0.0044775634402844
get_ui_image_max: 0.0369784016358225
get_ui_image_mean: 0.030239442228859013
get_ui_image_median: 0.02947859564729982
get_ui_image_min: 0.02502217598501391
in-drivable-lane_max: 39.199999999998525
in-drivable-lane_mean: 21.649999999999466
in-drivable-lane_min: 5.850000000000004
Per-episode details (one JSON object per episode):

LF-norm-loop-000-ego0: {"driven_any": 1.7670270063533613, "get_ui_image": 0.02716123061496174, "step_physics": 0.10437201835236563, "survival_time": 17.30000000000011, "driven_lanedir": 0.8881247103472769, "get_state_dump": 0.0044775634402844, "get_robot_state": 0.003501562976012656, "sim_render-ego0": 0.0038133938649202287, "get_duckie_state": 1.2113312822941056e-06, "in-drivable-lane": 12.60000000000012, "deviation-heading": 1.6546170706538528, "agent_compute-ego0": 0.012549171530204135, "complete-iteration": 0.1689416359068681, "set_robot_commands": 0.0021068871193042063, "deviation-center-line": 0.5084008713984072, "driven_lanedir_consec": 0.8881247103472769, "sim_compute_sim_state": 0.009016523443656284, "sim_compute_performance-ego0": 0.0018625637296297364}
LF-norm-zigzag-000-ego0: {"driven_any": 0.6102480561382378, "get_ui_image": 0.0369784016358225, "step_physics": 0.11832766909348336, "survival_time": 9.45, "driven_lanedir": 0.2162611632316271, "get_state_dump": 0.004707709111665424, "get_robot_state": 0.003782306219402112, "sim_render-ego0": 0.004134749111376311, "get_duckie_state": 1.3853374280427632e-06, "in-drivable-lane": 5.850000000000004, "deviation-heading": 1.8665402725114908, "agent_compute-ego0": 0.014113876694127132, "complete-iteration": 0.19771311408595035, "set_robot_commands": 0.0023339296642102695, "deviation-center-line": 0.2649590175652525, "driven_lanedir_consec": 0.2162611632316271, "sim_compute_sim_state": 0.011181747285943282, "sim_compute_performance-ego0": 0.002062921775014777}
LF-norm-techtrack-000-ego0: {"driven_any": 7.678237257813193, "get_ui_image": 0.0317959606796379, "step_physics": 0.0986778364888238, "survival_time": 59.99999999999873, "driven_lanedir": 5.382364798943632, "get_state_dump": 0.0045084059982871535, "get_robot_state": 0.0036438791876927104, "sim_render-ego0": 0.003910091298505924, "get_duckie_state": 1.2121430840917074e-06, "in-drivable-lane": 28.94999999999921, "deviation-heading": 8.072051478664207, "agent_compute-ego0": 0.013189200259168182, "complete-iteration": 0.17032078481732954, "set_robot_commands": 0.0021672373905864783, "deviation-center-line": 1.9554303754899571, "driven_lanedir_consec": 5.382364798943632, "sim_compute_sim_state": 0.010357544285173124, "sim_compute_performance-ego0": 0.0019880802208537564}
LF-norm-small_loop-000-ego0: {"driven_any": 6.134168419267135, "get_ui_image": 0.02502217598501391, "step_physics": 0.08503971925683065, "survival_time": 59.99999999999873, "driven_lanedir": 3.0625528502216506, "get_state_dump": 0.004677535095977148, "get_robot_state": 0.0035765347730904993, "sim_render-ego0": 0.003734145533730843, "get_duckie_state": 1.312394027011182e-06, "in-drivable-lane": 39.199999999998525, "deviation-heading": 6.040683374654869, "agent_compute-ego0": 0.03335689485916786, "complete-iteration": 0.16556470757420116, "set_robot_commands": 0.002148088070871828, "deviation-center-line": 2.078859203778461, "driven_lanedir_consec": 3.0625528502216506, "sim_compute_sim_state": 0.006053125729271018, "sim_compute_performance-ego0": 0.0018704215453923692}
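The _min/_mean/_median/_max rows on this page are simple aggregates over the four episodes. Assuming the per-episode details are available as a dict keyed by episode name (as in the JSON above), they can be recomputed as follows; this is a sketch using an excerpt of the data, not the evaluator's actual code:

```python
from statistics import mean, median

# Excerpt of the per-episode details above (one metric shown;
# the full dict has ~18 metrics per episode).
per_episode = {
    "LF-norm-loop-000-ego0":       {"deviation-center-line": 0.5084008713984072},
    "LF-norm-zigzag-000-ego0":     {"deviation-center-line": 0.2649590175652525},
    "LF-norm-techtrack-000-ego0":  {"deviation-center-line": 1.9554303754899571},
    "LF-norm-small_loop-000-ego0": {"deviation-center-line": 2.078859203778461},
}

def aggregate(metric: str) -> dict:
    """Return the min/mean/median/max summary rows for one metric."""
    values = [episode[metric] for episode in per_episode.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_mean": mean(values),
        f"{metric}_median": median(values),
        f"{metric}_max": max(values),
    }

stats = aggregate("deviation-center-line")
# stats matches the deviation-center-line_* rows reported on this page.
```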
set_robot_commands_max: 0.0023339296642102695
set_robot_commands_mean: 0.0021890355612431953
set_robot_commands_median: 0.0021576627307291533
set_robot_commands_min: 0.0021068871193042063
sim_compute_performance-ego0_max: 0.002062921775014777
sim_compute_performance-ego0_mean: 0.0019459968177226595
sim_compute_performance-ego0_median: 0.001929250883123063
sim_compute_performance-ego0_min: 0.0018625637296297364
sim_compute_sim_state_max: 0.011181747285943282
sim_compute_sim_state_mean: 0.009152235186010929
sim_compute_sim_state_median: 0.009687033864414704
sim_compute_sim_state_min: 0.006053125729271018
sim_render-ego0_max: 0.004134749111376311
sim_render-ego0_mean: 0.0038980949521333266
sim_render-ego0_median: 0.0038617425817130766
sim_render-ego0_min: 0.003734145533730843
simulation-passed: 1
step_physics_max: 0.11832766909348336
step_physics_mean: 0.10160431079787584
step_physics_median: 0.10152492742059473
step_physics_min: 0.08503971925683065
survival_time_max: 59.99999999999873
survival_time_mean: 36.68749999999939
survival_time_min: 9.45
No reset possible
57064 | LFv-sim | success | yes | | | 0:27:51 |
No reset possible