
Submission 9261

Submission: 9261
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jerome Labonte 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58462
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58462

Episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step    | status  | up to date | date started | date completed | duration | message
58462  | LFv-sim | success | yes        |              |                | 0:35:02  |
driven_lanedir_consec_median: 9.149661577503
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.8694794856383536
in-drivable-lane_median: 0.0


other stats
agent_compute-ego0_max: 0.012845380022364988
agent_compute-ego0_mean: 0.012631304506259
agent_compute-ego0_median: 0.012620921039660704
agent_compute-ego0_min: 0.012437995923349602
complete-iteration_max: 0.20255514326738777
complete-iteration_mean: 0.17864019909468817
complete-iteration_median: 0.17686369248373524
complete-iteration_min: 0.15827826814389448
deviation-center-line_max: 4.060564372604027
deviation-center-line_mean: 3.709214259264463
deviation-center-line_min: 3.0373336931771178
deviation-heading_max: 12.240754267001009
deviation-heading_mean: 10.843718160974635
deviation-heading_median: 11.68288056611905
deviation-heading_min: 7.76835724465944
driven_any_max: 11.312336994952757
driven_any_mean: 9.864685792262607
driven_any_median: 9.425548327777324
driven_any_min: 9.295309518543023
driven_lanedir_consec_max: 11.158396329103674
driven_lanedir_consec_mean: 9.5818402603761
driven_lanedir_consec_min: 8.869641557394726
driven_lanedir_max: 11.158396329103674
driven_lanedir_mean: 9.585522247322547
driven_lanedir_median: 9.149661577503
driven_lanedir_min: 8.88436950518052
get_duckie_state_max: 1.452943863817099e-06
get_duckie_state_mean: 1.4185012131309031e-06
get_duckie_state_median: 1.430511474609375e-06
get_duckie_state_min: 1.360038039487764e-06
get_robot_state_max: 0.003758993275854411
get_robot_state_mean: 0.003726185509604678
get_robot_state_median: 0.00372067557882012
get_robot_state_min: 0.0037043976049240582
get_state_dump_max: 0.004673678511683888
get_state_dump_mean: 0.00464739281370876
get_state_dump_median: 0.0046625466866854525
get_state_dump_min: 0.004590799369780249
get_ui_image_max: 0.03574775339264755
get_ui_image_mean: 0.03077811563541053
get_ui_image_median: 0.03047847777580242
get_ui_image_min: 0.026407753597389748
in-drivable-lane_max: 2.8999999999999666
in-drivable-lane_mean: 0.7249999999999917
in-drivable-lane_min: 0.0
per-episodes details (the aggregation of these per-episode values is sketched after this list): {"LF-norm-loop-000-ego0": {"driven_any": 11.312336994952757, "get_ui_image": 0.02855054345555746, "step_physics": 0.09609502499347722, "survival_time": 59.99999999999873, "driven_lanedir": 11.158396329103674, "get_state_dump": 0.004590799369780249, "get_robot_state": 0.003725677207546568, "sim_render-ego0": 0.003823911022087815, "get_duckie_state": 1.452943863817099e-06, "in-drivable-lane": 0.0, "deviation-heading": 7.76835724465944, "agent_compute-ego0": 0.012503787341661797, "complete-iteration": 0.16290533274635485, "set_robot_commands": 0.0022866712025460556, "deviation-center-line": 3.0373336931771178, "driven_lanedir_consec": 11.158396329103674, "sim_compute_sim_state": 0.009174705842055448, "sim_compute_performance-ego0": 0.0020697970473696845}, "LF-norm-zigzag-000-ego0": {"driven_any": 9.295309518543023, "get_ui_image": 0.03574775339264755, "step_physics": 0.12459027479332156, "survival_time": 59.99999999999873, "driven_lanedir": 9.051200075733076, "get_state_dump": 0.004659470868646652, "get_robot_state": 0.0037043976049240582, "sim_render-ego0": 0.0038985805050915823, "get_duckie_state": 1.360038039487764e-06, "in-drivable-lane": 0.0, "deviation-heading": 12.240754267001009, "agent_compute-ego0": 0.012738054737659616, "complete-iteration": 0.20255514326738777, "set_robot_commands": 0.0022597493577460066, "deviation-center-line": 4.005477585331246, "driven_lanedir_consec": 9.051200075733076, "sim_compute_sim_state": 0.012821906611484652, "sim_compute_performance-ego0": 0.0020501387307884093}, "LF-norm-techtrack-000-ego0": {"driven_any": 9.503272818369046, "get_ui_image": 0.03240641209604738, "step_physics": 0.11584470988709564, "survival_time": 59.99999999999873, "driven_lanedir": 9.248123079272926, "get_state_dump": 0.004673678511683888, "get_robot_state": 0.003715673950093672, "sim_render-ego0": 0.00384932791164376, "get_duckie_state": 1.4362684594502954e-06, "in-drivable-lane": 0.0, "deviation-heading": 12.220022313648457, "agent_compute-ego0": 0.012845380022364988, "complete-iteration": 0.1908220522211156, "set_robot_commands": 0.002290508332200888, "deviation-center-line": 4.060564372604027, "driven_lanedir_consec": 9.248123079272926, "sim_compute_sim_state": 0.013019387668415866, "sim_compute_performance-ego0": 0.00208978529873736}, "LF-norm-small_loop-000-ego0": {"driven_any": 9.347823837185604, "get_ui_image": 0.026407753597389748, "step_physics": 0.09654820948020308, "survival_time": 59.99999999999873, "driven_lanedir": 8.88436950518052, "get_state_dump": 0.004665622504724253, "get_robot_state": 0.003758993275854411, "sim_render-ego0": 0.0038559875520043927, "get_duckie_state": 1.4247544897684546e-06, "in-drivable-lane": 2.8999999999999666, "deviation-heading": 11.145738818589642, "agent_compute-ego0": 0.012437995923349602, "complete-iteration": 0.15827826814389448, "set_robot_commands": 0.00226906356366846, "deviation-center-line": 3.733481385945461, "driven_lanedir_consec": 8.869641557394726, "sim_compute_sim_state": 0.006212162038468004, "sim_compute_performance-ego0": 0.00203879846323539}}
set_robot_commands_max: 0.002290508332200888
set_robot_commands_mean: 0.0022764981140403527
set_robot_commands_median: 0.002277867383107258
set_robot_commands_min: 0.0022597493577460066
sim_compute_performance-ego0_max: 0.00208978529873736
sim_compute_performance-ego0_mean: 0.0020621298850327105
sim_compute_performance-ego0_median: 0.0020599678890790467
sim_compute_performance-ego0_min: 0.00203879846323539
sim_compute_sim_state_max: 0.013019387668415866
sim_compute_sim_state_mean: 0.010307040540105991
sim_compute_sim_state_median: 0.01099830622677005
sim_compute_sim_state_min: 0.006212162038468004
sim_render-ego0_max: 0.0038985805050915823
sim_render-ego0_mean: 0.003856951747706888
sim_render-ego0_median: 0.003852657731824076
sim_render-ego0_min: 0.003823911022087815
simulation-passed: 1
step_physics_max: 0.12459027479332156
step_physics_mean: 0.10826955478852436
step_physics_median: 0.10619645968364937
step_physics_min: 0.09609502499347722
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
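
The aggregate rows above (max, mean, median, min) are simple statistics taken across the four episodes listed under "per-episodes details". Below is a minimal sketch of how those aggregates can be reproduced from that JSON, assuming it has been saved to a local file named episodes.json (the filename is only an example, not part of the submission output):

    import json
    import statistics

    # Load the "per-episodes details" JSON shown above
    # (assumed to be saved locally as episodes.json).
    with open("episodes.json") as f:
        episodes = json.load(f)

    # Collect each metric's value across the four episodes.
    per_metric = {}
    for episode_name, stats in episodes.items():
        for metric, value in stats.items():
            per_metric.setdefault(metric, []).append(value)

    # Reproduce the aggregate rows: max, mean, median, min per metric.
    # For example, the median of driven_lanedir_consec over the four
    # episodes comes out to about 9.1497, matching the headline value
    # reported for this job.
    for metric, values in sorted(per_metric.items()):
        print(f"{metric}_max: {max(values)}")
        print(f"{metric}_mean: {statistics.mean(values)}")
        print(f"{metric}_median: {statistics.median(values)}")
        print(f"{metric}_min: {min(values)}")

With four episodes, the median is the average of the two middle values, which is why the reported medians need not coincide with any single episode's value.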
No reset possible
58459  | LFv-sim | success | yes        |              |                | 0:35:56  |
No reset possible