
Submission 11652

Submission: 11652
Competing: yes
Challenge: aido5-LF-sim-validation
User: Philippe Reddy 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54139
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54139

Episodes evaluated in this job:
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 54139 | step: LFv-sim | status: success | up to date: yes | duration: 0:26:45
driven_lanedir_consec_median: 6.0268585350066735
survival_time_median: 46.84999999999947
deviation-center-line_median: 2.395840201682114
in-drivable-lane_median: 12.224999999999833
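The four headline metrics above appear to be medians over the four evaluation episodes listed in the per-episodes details further down this page. As a minimal sanity check (a sketch assuming that aggregation; values copied from the details blob), the survival-time median can be reproduced like this:

```python
# Minimal check, assuming the headline medians are taken over the four
# episodes reported in the per-episodes details below.
from statistics import median

survival_times = [
    33.70000000000022,  # LF-norm-loop-000
    4.699999999999991,  # LF-norm-zigzag-000
    59.99999999999873,  # LF-norm-techtrack-000
    59.99999999999873,  # LF-norm-small_loop-000
]

# With four values the median is the mean of the two middle ones,
# (33.7 + 60.0) / 2 ~ 46.85, consistent with survival_time_median above.
print(median(survival_times))
```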


other stats
agent_compute-ego0_max: 0.013143268453389976
agent_compute-ego0_mean: 0.01293854752032288
agent_compute-ego0_median: 0.013039588410755743
agent_compute-ego0_min: 0.01253174480639006
complete-iteration_max: 0.2944679812381142
complete-iteration_mean: 0.25560204945129555
complete-iteration_median: 0.2518980533852367
complete-iteration_min: 0.2241441097965947
deviation-center-line_max: 3.633486966984053
deviation-center-line_mean: 2.1460322141173105
deviation-center-line_min: 0.1589614861209626
deviation-heading_max: 17.171638689732752
deviation-heading_mean: 10.53477137201356
deviation-heading_median: 11.998679937492712
deviation-heading_min: 0.970086923336064
driven_any_max: 15.83891765573174
driven_any_mean: 10.327426218347831
driven_any_median: 12.249506576209516
driven_any_min: 0.9717740652405528
driven_lanedir_consec_max: 10.100815072796594
driven_lanedir_consec_mean: 5.717413390136701
driven_lanedir_consec_min: 0.715121417736863
driven_lanedir_max: 10.100815072796594
driven_lanedir_mean: 6.291260582857346
driven_lanedir_median: 7.174552920447962
driven_lanedir_min: 0.715121417736863
get_duckie_state_max: 1.4815302713030482e-06
get_duckie_state_mean: 1.4448079908340489e-06
get_duckie_state_median: 1.4434850645857286e-06
get_duckie_state_min: 1.4107315628616898e-06
get_robot_state_max: 0.004120305813321662
get_robot_state_mean: 0.003948131449250082
get_robot_state_median: 0.003951309482436707
get_robot_state_min: 0.003769601018805253
get_state_dump_max: 0.005189699596828885
get_state_dump_mean: 0.005084019792825176
get_state_dump_median: 0.005115641344596107
get_state_dump_min: 0.004915096885279606
get_ui_image_max: 0.03691940558584113
get_ui_image_mean: 0.03236721907758543
get_ui_image_median: 0.03216507640552406
get_ui_image_min: 0.028219317913452453
in-drivable-lane_max: 23.74999999999949
in-drivable-lane_mean: 12.424999999999786
in-drivable-lane_min: 1.499999999999995
per-episodes details (an aggregation sketch follows this stats list):
{"LF-norm-loop-000-ego0": {"driven_any": 8.671831859427954, "get_ui_image": 0.029719970491197373, "step_physics": 0.15321073178891784, "survival_time": 33.70000000000022, "driven_lanedir": 6.361430045610415, "get_state_dump": 0.005189699596828885, "get_robot_state": 0.003943239141393591, "sim_render-ego0": 0.004010403244583695, "get_duckie_state": 1.4107315628616898e-06, "in-drivable-lane": 7.900000000000013, "deviation-heading": 7.003648873564713, "agent_compute-ego0": 0.01301378885904948, "complete-iteration": 0.2241441097965947, "set_robot_commands": 0.0023457103305392795, "deviation-center-line": 1.730614390682045, "driven_lanedir_consec": 4.066041274727839, "sim_compute_sim_state": 0.01043363995022244, "sim_compute_performance-ego0": 0.0021853499942355685},
"LF-norm-zigzag-000-ego0": {"driven_any": 0.9717740652405528, "get_ui_image": 0.03691940558584113, "step_physics": 0.2180635878914281, "survival_time": 4.699999999999991, "driven_lanedir": 0.715121417736863, "get_state_dump": 0.004915096885279606, "get_robot_state": 0.003769601018805253, "sim_render-ego0": 0.003886441180580541, "get_duckie_state": 1.460627505653783e-06, "in-drivable-lane": 1.499999999999995, "deviation-heading": 0.970086923336064, "agent_compute-ego0": 0.01253174480639006, "complete-iteration": 0.2944679812381142, "set_robot_commands": 0.002278925243176912, "deviation-center-line": 0.1589614861209626, "driven_lanedir_consec": 0.715121417736863, "sim_compute_sim_state": 0.009951378169812654, "sim_compute_performance-ego0": 0.0020653448606792247},
"LF-norm-techtrack-000-ego0": {"driven_any": 15.827181292991078, "get_ui_image": 0.034610182319850746, "step_physics": 0.19056853823221095, "survival_time": 59.99999999999873, "driven_lanedir": 10.100815072796594, "get_state_dump": 0.005066122639486931, "get_robot_state": 0.003959379823479823, "sim_render-ego0": 0.0040926053859510585, "get_duckie_state": 1.4815302713030482e-06, "in-drivable-lane": 16.54999999999965, "deviation-heading": 16.99371100142071, "agent_compute-ego0": 0.013065387962462007, "complete-iteration": 0.2694529305886071, "set_robot_commands": 0.0023723260052098917, "deviation-center-line": 3.633486966984053, "driven_lanedir_consec": 10.100815072796594, "sim_compute_sim_state": 0.013405100094289406, "sim_compute_performance-ego0": 0.0022174285710800895},
"LF-norm-small_loop-000-ego0": {"driven_any": 15.83891765573174, "get_ui_image": 0.028219317913452453, "step_physics": 0.1679211536315359, "survival_time": 59.99999999999873, "driven_lanedir": 7.987675795285508, "get_state_dump": 0.005165160049705283, "get_robot_state": 0.004120305813321662, "sim_render-ego0": 0.004171053039144219, "get_duckie_state": 1.426342623517674e-06, "in-drivable-lane": 23.74999999999949, "deviation-heading": 17.171638689732752, "agent_compute-ego0": 0.013143268453389976, "complete-iteration": 0.23434317618186623, "set_robot_commands": 0.002481745641297047, "deviation-center-line": 3.0610660126821827, "driven_lanedir_consec": 7.987675795285508, "sim_compute_sim_state": 0.006781825018762847, "sim_compute_performance-ego0": 0.002245247909965166}}
set_robot_commands_max: 0.002481745641297047
set_robot_commands_mean: 0.0023696768050557827
set_robot_commands_median: 0.002359018167874586
set_robot_commands_min: 0.002278925243176912
sim_compute_performance-ego0_max: 0.002245247909965166
sim_compute_performance-ego0_mean: 0.002178342833990012
sim_compute_performance-ego0_median: 0.002201389282657829
sim_compute_performance-ego0_min: 0.0020653448606792247
sim_compute_sim_state_max: 0.013405100094289406
sim_compute_sim_state_mean: 0.010142985808271837
sim_compute_sim_state_median: 0.010192509060017544
sim_compute_sim_state_min: 0.006781825018762847
sim_render-ego0_max: 0.004171053039144219
sim_render-ego0_mean: 0.004040125712564879
sim_render-ego0_median: 0.004051504315267377
sim_render-ego0_min: 0.003886441180580541
simulation-passed: 1
step_physics_max: 0.2180635878914281
step_physics_mean: 0.1824410028860232
step_physics_median: 0.17924484593187345
step_physics_min: 0.15321073178891784
survival_time_max: 59.99999999999873
survival_time_mean: 39.59999999999942
survival_time_min: 4.699999999999991
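The per-metric _min/_max/_mean/_median rows above look like plain aggregates of the per-episode values in the details blob. The sketch below illustrates that reading; it is not the evaluator's actual code, and the embedded excerpt keeps only two of the metrics (a hypothetical, truncated copy of the blob) to stay short.

```python
import json
from statistics import mean, median

# Truncated excerpt of the per-episodes "details" blob above;
# only two of the ~18 metrics per episode are kept here for brevity.
details = json.loads("""
{
  "LF-norm-loop-000-ego0":       {"survival_time": 33.70000000000022, "deviation-center-line": 1.730614390682045},
  "LF-norm-zigzag-000-ego0":     {"survival_time": 4.699999999999991, "deviation-center-line": 0.1589614861209626},
  "LF-norm-techtrack-000-ego0":  {"survival_time": 59.99999999999873, "deviation-center-line": 3.633486966984053},
  "LF-norm-small_loop-000-ego0": {"survival_time": 59.99999999999873, "deviation-center-line": 3.0610660126821827}
}
""")

# Collect each metric's values across the four episodes.
by_metric = {}
for episode_stats in details.values():
    for name, value in episode_stats.items():
        by_metric.setdefault(name, []).append(value)

# Recompute the kind of aggregates listed above (min / max / mean / median).
for name, values in by_metric.items():
    print(f"{name}_min    {min(values)}")
    print(f"{name}_max    {max(values)}")
    print(f"{name}_mean   {mean(values)}")
    print(f"{name}_median {median(values)}")
```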
Job ID: 54138 | step: LFv-sim | status: success | up to date: yes | duration: 0:23:22