Duckietown Challenges

Submission 9239

Submission: 9239
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58485
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58485

Episodes evaluated in this job:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for earlier versions of this challenge.
Job ID | step    | status  | up to date | date started | date completed | duration | message
58485  | LFv-sim | success | yes        |              |                | 0:20:12  |
driven_lanedir_consec_median: 4.116890139031213
survival_time_median: 36.39999999999998
deviation-center-line_median: 1.885844520898889
in-drivable-lane_median: 11.349999999999886
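
Each headline value is the median of the corresponding metric over the four episodes in the per-episodes data below; for example, the episode survival times of roughly 29.3, 9.7, 43.5, and 60.0 s give the 36.4 s median reported here. A minimal check in plain Python (values copied from the per-episodes JSON further down):

    import statistics

    # Episode survival times, copied from the per-episodes JSON below (seconds).
    survival_times = [29.30000000000028, 9.700000000000005,
                      43.499999999999666, 59.99999999999873]

    # With four episodes, the median is the average of the two middle values,
    # which reproduces survival_time_median = 36.39999999999998 (up to rounding).
    print(statistics.median(survival_times))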


Other stats
agent_compute-ego0_max: 0.01261872144845816
agent_compute-ego0_mean: 0.012410724988442152
agent_compute-ego0_median: 0.01245846281687336
agent_compute-ego0_min: 0.012107252871563732
complete-iteration_max: 0.20313743566855405
complete-iteration_mean: 0.17988388846270095
complete-iteration_median: 0.17801681435513397
complete-iteration_min: 0.16036448947198187
deviation-center-line_max: 3.160312638762253
deviation-center-line_mean: 1.7979356409979377
deviation-center-line_min: 0.25974088343171975
deviation-heading_max: 11.728057928981128
deviation-heading_mean: 6.290167721813913
deviation-heading_median: 5.968541387465835
deviation-heading_min: 1.495530183342855
driven_any_max: 10.422607366469997
driven_any_mean: 6.070407145742276
driven_any_median: 6.17305486441774
driven_any_min: 1.5129114876636331
driven_lanedir_consec_max: 6.185608756717224
driven_lanedir_consec_mean: 3.7719929280841393
driven_lanedir_consec_min: 0.6685826775569077
driven_lanedir_max: 6.185608756717224
driven_lanedir_mean: 3.7719929280841393
driven_lanedir_median: 4.116890139031213
driven_lanedir_min: 0.6685826775569077
get_duckie_state_max: 2.30715825007512e-06
get_duckie_state_mean: 2.216070439849369e-06
get_duckie_state_median: 2.2259341473878615e-06
get_duckie_state_min: 2.1052552145466327e-06
get_robot_state_max: 0.0037278383206098505
get_robot_state_mean: 0.003695647489614904
get_robot_state_median: 0.003700975167691723
get_robot_state_min: 0.00365280130246632
get_state_dump_max: 0.0047594192700508315
get_state_dump_mean: 0.004688052825915386
get_state_dump_median: 0.004682982161234814
get_state_dump_min: 0.004626827711141089
get_ui_image_max: 0.03646492713536972
get_ui_image_mean: 0.03111126649791833
get_ui_image_median: 0.030546549400718
get_ui_image_min: 0.026887040054867608
in-drivable-lane_max: 22.149999999999455
in-drivable-lane_mean: 12.52499999999981
in-drivable-lane_min: 5.250000000000011
per-episodes:
details{"LF-norm-loop-000-ego0": {"driven_any": 4.9370691587765, "get_ui_image": 0.02842205509010328, "step_physics": 0.09667305930103516, "survival_time": 29.30000000000028, "driven_lanedir": 3.4800334831893256, "get_state_dump": 0.004626827711141089, "get_robot_state": 0.00365280130246632, "sim_render-ego0": 0.003731161406742856, "get_duckie_state": 2.231061763211044e-06, "in-drivable-lane": 8.650000000000123, "deviation-heading": 3.1223185627392227, "agent_compute-ego0": 0.012107252871563732, "complete-iteration": 0.16390772085173572, "set_robot_commands": 0.00219529166538427, "deviation-center-line": 1.236329741273805, "driven_lanedir_consec": 3.4800334831893256, "sim_compute_sim_state": 0.010427839711333865, "sim_compute_performance-ego0": 0.0019819911131460782}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.5129114876636331, "get_ui_image": 0.03646492713536972, "step_physics": 0.1273919362288255, "survival_time": 9.700000000000005, "driven_lanedir": 0.6685826775569077, "get_state_dump": 0.0047594192700508315, "get_robot_state": 0.0037278383206098505, "sim_render-ego0": 0.0038381124154115336, "get_duckie_state": 2.30715825007512e-06, "in-drivable-lane": 5.250000000000011, "deviation-heading": 1.495530183342855, "agent_compute-ego0": 0.01261872144845816, "complete-iteration": 0.20313743566855405, "set_robot_commands": 0.002287686176789113, "deviation-center-line": 0.25974088343171975, "driven_lanedir_consec": 0.6685826775569077, "sim_compute_sim_state": 0.009884384350898938, "sim_compute_performance-ego0": 0.00206911135942508}, "LF-norm-techtrack-000-ego0": {"driven_any": 7.409040570058979, "get_ui_image": 0.03267104371133272, "step_physics": 0.11739616306449462, "survival_time": 43.499999999999666, "driven_lanedir": 4.7537467948731, "get_state_dump": 0.004682003835859857, "get_robot_state": 0.003704930008757949, "sim_render-ego0": 0.0038180148697337384, "get_duckie_state": 2.1052552145466327e-06, "in-drivable-lane": 14.049999999999647, "deviation-heading": 8.814764212192447, "agent_compute-ego0": 0.012571868885535189, "complete-iteration": 0.19212590785853215, "set_robot_commands": 0.002183494556921905, "deviation-center-line": 2.535359300523973, "driven_lanedir_consec": 4.7537467948731, "sim_compute_sim_state": 0.013002551929548332, "sim_compute_performance-ego0": 0.0020058248127916515}, "LF-norm-small_loop-000-ego0": {"driven_any": 10.422607366469997, "get_ui_image": 0.026887040054867608, "step_physics": 0.09844392959918706, "survival_time": 59.99999999999873, "driven_lanedir": 6.185608756717224, "get_state_dump": 0.004683960486609771, "get_robot_state": 0.0036970203266254967, "sim_render-ego0": 0.0037751501545520943, "get_duckie_state": 2.2208065315646792e-06, "in-drivable-lane": 22.149999999999455, "deviation-heading": 11.728057928981128, "agent_compute-ego0": 0.012345056748211533, "complete-iteration": 0.16036448947198187, "set_robot_commands": 0.002207558915378847, "deviation-center-line": 3.160312638762253, "driven_lanedir_consec": 6.185608756717224, "sim_compute_sim_state": 0.006257557650589129, "sim_compute_performance-ego0": 0.0019761362639593143}}
set_robot_commands_max: 0.002287686176789113
set_robot_commands_mean: 0.002218507828618534
set_robot_commands_median: 0.0022014252903815586
set_robot_commands_min: 0.002183494556921905
sim_compute_performance-ego0_max: 0.00206911135942508
sim_compute_performance-ego0_mean: 0.002008265887330531
sim_compute_performance-ego0_median: 0.001993907962968865
sim_compute_performance-ego0_min: 0.0019761362639593143
sim_compute_sim_state_max: 0.013002551929548332
sim_compute_sim_state_mean: 0.009893083410592566
sim_compute_sim_state_median: 0.0101561120311164
sim_compute_sim_state_min: 0.006257557650589129
sim_render-ego0_max: 0.0038381124154115336
sim_render-ego0_mean: 0.003790609711610056
sim_render-ego0_median: 0.003796582512142917
sim_render-ego0_min: 0.003731161406742856
simulation-passed: 1
step_physics_max: 0.1273919362288255
step_physics_mean: 0.1099762720483856
step_physics_median: 0.10792004633184084
step_physics_min: 0.09667305930103516
survival_time_max: 59.99999999999873
survival_time_mean: 35.62499999999967
survival_time_min: 9.700000000000005
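
The min/mean/median/max rows above are simple aggregates of the per-episode values in the per-episodes JSON. A minimal sketch for recomputing them, assuming that blob has been saved to a file (per_episodes.json is a hypothetical name):

    import json
    import statistics

    # Load the per-episodes blob shown above (file name is an assumption).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Collect each metric across the four episodes, e.g. "survival_time",
    # "driven_lanedir_consec", "deviation-center-line", ...
    metrics = {}
    for episode_stats in episodes.values():
        for name, value in episode_stats.items():
            metrics.setdefault(name, []).append(value)

    # Reproduce the aggregate rows listed above.
    for name, values in sorted(metrics.items()):
        print(f"{name}_min:    {min(values)}")
        print(f"{name}_mean:   {statistics.mean(values)}")
        print(f"{name}_median: {statistics.median(values)}")
        print(f"{name}_max:    {max(values)}")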
58481  | LFv-sim | success | yes        |              |                | 0:11:22  |