
Submission 11379

Submission: 11379
Competing: yes
Challenge: aido5-LF-sim-validation
User: Martin Cote 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54729
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54729

Episode thumbnails (images omitted here) link to detailed per-episode statistics. Episodes evaluated:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job 54729 | step: LFv-sim | status: success | up to date: yes | duration: 0:34:22
driven_lanedir_consec_median: 5.569980806905086
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.1613669928897816
in-drivable-lane_median: 3.774999999999963


other stats
agent_compute-ego0_max: 0.030627305660517787
agent_compute-ego0_mean: 0.016520843071107762
agent_compute-ego0_median: 0.011928175311600736
agent_compute-ego0_min: 0.01159971600071179
complete-iteration_max: 0.20530106542906496
complete-iteration_mean: 0.1818056639088481
complete-iteration_median: 0.18168880619871725
complete-iteration_min: 0.1585439778088928
deviation-center-line_max: 3.6432755496573943
deviation-center-line_mean: 3.1739603138174064
deviation-center-line_min: 2.729831719832668
deviation-heading_max: 14.285845212341988
deviation-heading_mean: 9.77079335711182
deviation-heading_median: 9.180646068762425
deviation-heading_min: 6.436036078580436
driven_any_max: 7.921187500366632
driven_any_mean: 7.919400990208506
driven_any_median: 7.921058825806382
driven_any_min: 7.914298808854628
driven_lanedir_consec_max: 7.438706423772585
driven_lanedir_consec_mean: 5.537472571520274
driven_lanedir_consec_min: 3.57122224849834
driven_lanedir_max: 7.438706423772585
driven_lanedir_mean: 7.073357843352568
driven_lanedir_median: 7.216970831580957
driven_lanedir_min: 6.420783286475777
get_duckie_state_max: 1.325496130442242e-06
get_duckie_state_mean: 1.2071801661253968e-06
get_duckie_state_median: 1.184450001839694e-06
get_duckie_state_min: 1.1343245303799569e-06
get_robot_state_max: 0.0037357447844163063
get_robot_state_mean: 0.00363868494811999
get_robot_state_median: 0.003635334730346832
get_robot_state_min: 0.00354832554736999
get_state_dump_max: 0.004609756723827962
get_state_dump_mean: 0.004546897447079445
get_state_dump_median: 0.004562250779729203
get_state_dump_min: 0.004453331505031411
get_ui_image_max: 0.03530319227366324
get_ui_image_mean: 0.029813113359487824
get_ui_image_median: 0.029459687891252632
get_ui_image_min: 0.02502988538178278
in-drivable-lane_max: 6.949999999999832
in-drivable-lane_mean: 4.2249999999999055
in-drivable-lane_min: 2.3999999999998636
per-episodes details:
{
  "LF-norm-loop-000-ego0": {"driven_any": 7.921187500366632, "get_ui_image": 0.026761735110953883, "step_physics": 0.09451536453336006, "survival_time": 59.99999999999873, "driven_lanedir": 7.3963595660433565, "get_state_dump": 0.004453331505031411, "get_robot_state": 0.003576868876727991, "sim_render-ego0": 0.003717772271809828, "get_duckie_state": 1.155367302557114e-06, "in-drivable-lane": 2.80000000000004, "deviation-heading": 6.436036078580436, "agent_compute-ego0": 0.01159971600071179, "complete-iteration": 0.1585439778088928, "set_robot_commands": 0.002150217758229531, "deviation-center-line": 2.729831719832668, "driven_lanedir_consec": 4.102379516691615, "sim_compute_sim_state": 0.00968866066372861, "sim_compute_performance-ego0": 0.001998894419102347},
  "LF-norm-zigzag-000-ego0": {"driven_any": 7.9209929682732145, "get_ui_image": 0.03530319227366324, "step_physics": 0.1294028538649128, "survival_time": 59.99999999999873, "driven_lanedir": 7.037582097118556, "get_state_dump": 0.004609756723827962, "get_robot_state": 0.0036938005839656737, "sim_render-ego0": 0.003783269091311542, "get_duckie_state": 1.325496130442242e-06, "in-drivable-lane": 4.749999999999886, "deviation-heading": 10.449931608793351, "agent_compute-ego0": 0.011921200327432524, "complete-iteration": 0.20530106542906496, "set_robot_commands": 0.002151104929445189, "deviation-center-line": 3.6432755496573943, "driven_lanedir_consec": 7.037582097118556, "sim_compute_sim_state": 0.012351089869808098, "sim_compute_performance-ego0": 0.0020005766497762077},
  "LF-norm-techtrack-000-ego0": {"driven_any": 7.914298808854628, "get_ui_image": 0.03215764067155138, "step_physics": 0.12663368499844796, "survival_time": 59.99999999999873, "driven_lanedir": 6.420783286475777, "get_state_dump": 0.004568632595942082, "get_robot_state": 0.0037357447844163063, "sim_render-ego0": 0.0037877220198276338, "get_duckie_state": 1.2135327011222743e-06, "in-drivable-lane": 6.949999999999832, "deviation-heading": 14.285845212341988, "agent_compute-ego0": 0.011935150295768948, "complete-iteration": 0.19757067234093303, "set_robot_commands": 0.0022611264682232987, "deviation-center-line": 3.5415713463936744, "driven_lanedir_consec": 3.57122224849834, "sim_compute_sim_state": 0.010368285032235813, "sim_compute_performance-ego0": 0.002039500418352545},
  "LF-norm-small_loop-000-ego0": {"driven_any": 7.921124683339549, "get_ui_image": 0.02502988538178278, "step_physics": 0.0884364568422875, "survival_time": 59.99999999999873, "driven_lanedir": 7.438706423772585, "get_state_dump": 0.004555868963516324, "get_robot_state": 0.00354832554736999, "sim_render-ego0": 0.003673473067525821, "get_duckie_state": 1.1343245303799569e-06, "in-drivable-lane": 2.3999999999998636, "deviation-heading": 7.911360528731495, "agent_compute-ego0": 0.030627305660517787, "complete-iteration": 0.16580694005650148, "set_robot_commands": 0.0021866197689288263, "deviation-center-line": 2.781162639385889, "driven_lanedir_consec": 7.438706423772585, "sim_compute_sim_state": 0.005780269065367789, "sim_compute_performance-ego0": 0.001888009531114818}
}
set_robot_commands_max: 0.0022611264682232987
set_robot_commands_mean: 0.002187267231206711
set_robot_commands_median: 0.0021688623491870076
set_robot_commands_min: 0.002150217758229531
sim_compute_performance-ego0_max: 0.002039500418352545
sim_compute_performance-ego0_mean: 0.0019817452545864795
sim_compute_performance-ego0_median: 0.0019997355344392774
sim_compute_performance-ego0_min: 0.001888009531114818
sim_compute_sim_state_max: 0.012351089869808098
sim_compute_sim_state_mean: 0.009547076157785078
sim_compute_sim_state_median: 0.010028472847982212
sim_compute_sim_state_min: 0.005780269065367789
sim_render-ego0_max: 0.0037877220198276338
sim_render-ego0_mean: 0.003740559112618706
sim_render-ego0_median: 0.003750520681560685
sim_render-ego0_min: 0.003673473067525821
simulation-passed: 1
step_physics_max: 0.1294028538649128
step_physics_mean: 0.10974709005975208
step_physics_median: 0.110574524765904
step_physics_min: 0.0884364568422875
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
No reset possible
Job 54723 | step: LFv-sim | status: success | up to date: yes | duration: 0:37:20
Artefacts hidden.
No reset possible
Job 54701 | step: LFv-sim | status: success | up to date: yes | duration: 0:34:40
Artefacts hidden.
No reset possible