
Submission 13496

Submission: 13496
Competing: yes
Challenge: aido5-LF-sim-validation
User: David Burke
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 62000
Next:
User label: template-random
Admin priority: 50
Blessing: n/a
User priority: 50

Job 62000

Episodes evaluated in this job:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 62000 | step: LFv-sim | status: success | up to date: yes | duration: 0:03:40
driven_lanedir_consec_median: 0.5012185663379716
survival_time_median: 4.024999999999993
deviation-center-line_median: 0.1030587216874702
in-drivable-lane_median: 2.3999999999999932
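
The four scores above are the per-episode medians over the four episodes of this job; with four samples the median is the average of the two middle values, which can be verified against the per-episode details further down. A minimal check in Python (the list below is hand-copied from that JSON, not produced by the evaluator):

import statistics

# Survival times of the four episodes (loop, zigzag, techtrack, small_loop),
# hand-copied from the per-episodes details JSON below.
survival_times = [7.099999999999983, 3.949999999999994,
                  3.4499999999999957, 4.099999999999993]

# Average of the two middle values; reproduces survival_time_median above (~4.025).
print(statistics.median(survival_times))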


other stats
agent_compute-ego0_max: 0.012639779310960036
agent_compute-ego0_mean: 0.010858457382113893
agent_compute-ego0_median: 0.01031519866030721
agent_compute-ego0_min: 0.010163652896881105
complete-iteration_max: 0.20155711472034457
complete-iteration_mean: 0.1815128423394447
complete-iteration_median: 0.18551822896008485
complete-iteration_min: 0.15345779671726456
deviation-center-line_max: 0.16018732237566077
deviation-center-line_mean: 0.10913342723579128
deviation-center-line_min: 0.07022894319256386
deviation-heading_max: 1.100637959557365
deviation-heading_mean: 0.6609883305624223
deviation-heading_median: 0.6312628998600954
deviation-heading_min: 0.2807895629721336
driven_any_max: 2.686914768836086
driven_any_mean: 1.6111856691434987
driven_any_median: 1.3378444269645138
driven_any_min: 1.0821390538088809
driven_lanedir_consec_max: 0.6591269044442776
driven_lanedir_consec_mean: 0.5072684850407944
driven_lanedir_consec_min: 0.3675099030429567
driven_lanedir_max: 0.6591269044442776
driven_lanedir_mean: 0.5072684850407944
driven_lanedir_median: 0.5012185663379716
driven_lanedir_min: 0.3675099030429567
get_duckie_state_max: 1.318804867617734e-06
get_duckie_state_mean: 1.2227937211661266e-06
get_duckie_state_median: 1.223172460283552e-06
get_duckie_state_min: 1.1260250964796687e-06
get_robot_state_max: 0.003677409845632273
get_robot_state_mean: 0.003587301551057075
get_robot_state_median: 0.003563704568629831
get_robot_state_min: 0.003544387221336365
get_state_dump_max: 0.004744564736639703
get_state_dump_mean: 0.004558093082492233
get_state_dump_median: 0.00451282582112721
get_state_dump_min: 0.004462155951074807
get_ui_image_max: 0.03639752864837646
get_ui_image_mean: 0.031160700620070866
get_ui_image_median: 0.031037275929312845
get_ui_image_min: 0.026170721973281307
in-drivable-lane_max: 5.549999999999985
in-drivable-lane_mean: 2.9624999999999915
in-drivable-lane_min: 1.499999999999995
per-episodes details (parsed in the sketch after this job entry):
{"LF-norm-loop-000-ego0": {"driven_any": 2.686914768836086, "get_ui_image": 0.028563120982030055, "step_physics": 0.1037048076416229, "survival_time": 7.099999999999983, "driven_lanedir": 0.38465493566633535, "get_state_dump": 0.004744564736639703, "get_robot_state": 0.003677409845632273, "sim_render-ego0": 0.004639910651253654, "get_duckie_state": 1.318804867617734e-06, "in-drivable-lane": 5.549999999999985, "deviation-heading": 0.9034146476168684, "agent_compute-ego0": 0.012639779310960036, "complete-iteration": 0.1730329240118707, "set_robot_commands": 0.002922668323650226, "deviation-center-line": 0.16018732237566077, "driven_lanedir_consec": 0.38465493566633535, "sim_compute_sim_state": 0.009650485498921854, "sim_compute_performance-ego0": 0.0024141231616893848},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.285517415153549, "get_ui_image": 0.03639752864837646, "step_physics": 0.12996444404125213, "survival_time": 3.949999999999994, "driven_lanedir": 0.3675099030429567, "get_state_dump": 0.004476240277290345, "get_robot_state": 0.003544387221336365, "sim_render-ego0": 0.0036248713731765743, "get_duckie_state": 1.2099742889404295e-06, "in-drivable-lane": 2.3999999999999946, "deviation-heading": 1.100637959557365, "agent_compute-ego0": 0.010163652896881105, "complete-iteration": 0.20155711472034457, "set_robot_commands": 0.0020781099796295167, "deviation-center-line": 0.11986154498234942, "driven_lanedir_consec": 0.3675099030429567, "sim_compute_sim_state": 0.009371224045753478, "sim_compute_performance-ego0": 0.0018641769886016848},
 "LF-norm-techtrack-000-ego0": {"driven_any": 1.0821390538088809, "get_ui_image": 0.03351143087659563, "step_physics": 0.13055898462023055, "survival_time": 3.4499999999999957, "driven_lanedir": 0.6591269044442776, "get_state_dump": 0.004549411364964076, "get_robot_state": 0.0035767044339861187, "sim_render-ego0": 0.003732027326311384, "get_duckie_state": 1.2363706316266742e-06, "in-drivable-lane": 1.499999999999995, "deviation-heading": 0.3591111521033225, "agent_compute-ego0": 0.010311862400599888, "complete-iteration": 0.19800353390829903, "set_robot_commands": 0.002078785215105329, "deviation-center-line": 0.07022894319256386, "driven_lanedir_consec": 0.6591269044442776, "sim_compute_sim_state": 0.007682422229221889, "sim_compute_performance-ego0": 0.0019280501774379185},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.3901714387754789, "get_ui_image": 0.026170721973281307, "step_physics": 0.09648379934839456, "survival_time": 4.099999999999993, "driven_lanedir": 0.6177821970096078, "get_state_dump": 0.004462155951074807, "get_robot_state": 0.0035507047032735436, "sim_render-ego0": 0.003610906830753189, "get_duckie_state": 1.1260250964796687e-06, "in-drivable-lane": 2.3999999999999924, "deviation-heading": 0.2807895629721336, "agent_compute-ego0": 0.01031853492001453, "complete-iteration": 0.15345779671726456, "set_robot_commands": 0.002047871968832361, "deviation-center-line": 0.08625589839259101, "driven_lanedir_consec": 0.6177821970096078, "sim_compute_sim_state": 0.004902494959084384, "sim_compute_performance-ego0": 0.001840266836694924}}
set_robot_commands_max: 0.002922668323650226
set_robot_commands_mean: 0.002281858871804358
set_robot_commands_median: 0.002078447597367422
set_robot_commands_min: 0.002047871968832361
sim_compute_performance-ego0_max: 0.0024141231616893848
sim_compute_performance-ego0_mean: 0.002011654291105978
sim_compute_performance-ego0_median: 0.0018961135830198015
sim_compute_performance-ego0_min: 0.001840266836694924
sim_compute_sim_state_max: 0.009650485498921854
sim_compute_sim_state_mean: 0.007901656683245401
sim_compute_sim_state_median: 0.008526823137487684
sim_compute_sim_state_min: 0.004902494959084384
sim_render-ego0_max: 0.004639910651253654
sim_render-ego0_mean: 0.0039019290453737006
sim_render-ego0_median: 0.0036784493497439793
sim_render-ego0_min: 0.003610906830753189
simulation-passed: 1
step_physics_max: 0.13055898462023055
step_physics_mean: 0.11517800891287504
step_physics_median: 0.11683462584143751
step_physics_min: 0.09648379934839456
survival_time_max: 7.099999999999983
survival_time_mean: 4.6499999999999915
survival_time_min: 3.4499999999999957
No reset possible
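
The per-episodes details entry is a plain JSON mapping from episode name to that episode's raw metric values; each <metric>_min/_mean/_median/_max line above is an aggregate over those four episodes. A minimal sketch that recomputes the aggregates, assuming the JSON has been saved to a local file (the filename is illustrative, not something the evaluator writes):

import json
import statistics

# Assumed local copy of the per-episodes details JSON shown above.
with open("per_episode_details.json") as f:
    per_episode = json.load(f)

# Group each metric's values across the four episodes.
metrics = {}
for episode, values in per_episode.items():
    for name, value in values.items():
        metrics.setdefault(name, []).append(value)

# Recompute the aggregates reported as <metric>_min/_mean/_median/_max above.
for name, samples in sorted(metrics.items()):
    print(f"{name}: min={min(samples):.6g} mean={statistics.mean(samples):.6g} "
          f"median={statistics.median(samples):.6g} max={max(samples):.6g}")

For example, driven_lanedir_consec comes out with median 0.5012185663379716, matching the headline score of this job. In this job's numbers, complete-iteration is also within about 0.1 ms of the sum of the individual per-phase timings (step_physics, get_ui_image, agent_compute-ego0, and so on), which is a convenient sanity check when profiling an agent.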
Job ID: 61999 | step: LFv-sim | status: success | up to date: yes | duration: 0:04:20
No reset possible