
Submission 6817

Submission: 6817
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58608
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58608

Per-episode images with detailed statistics are available on the challenge server for the following episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of the challenge.
Job ID | step    | status  | up to date | date started | date completed | duration | message
58608  | LFv-sim | success | yes        |              |                | 0:20:07  |
driven_lanedir_consec_median: 2.1069994852979335
survival_time_median: 32.15000000000004
deviation-center-line_median: 1.4181389612590551
in-drivable-lane_median: 7.0500000000000735
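These four headline values are medians over the four evaluation episodes listed above. As a quick sanity check, the sketch below (plain Python, standard library only; the numbers are copied from the per-episodes details further down) reproduces survival_time_median:

    import statistics

    # Per-episode survival times (loop, zigzag, techtrack, small_loop),
    # copied from the per-episodes details below.
    survival_time = [59.99999999999873, 12.35000000000004,
                     39.84999999999987, 24.450000000000212]

    # With an even number of episodes, the median is the average of the
    # two middle values, which gives ~32.15 here, matching the row above.
    print(statistics.median(survival_time))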


other stats
agent_compute-ego0_max: 0.012814802508200367
agent_compute-ego0_mean: 0.012407074184303975
agent_compute-ego0_median: 0.012394351755211183
agent_compute-ego0_min: 0.012024790718593168
complete-iteration_max: 0.234248528557439
complete-iteration_mean: 0.1892088270579365
complete-iteration_median: 0.17807515445149236
complete-iteration_min: 0.16643647077132245
deviation-center-line_max: 3.281597209017729
deviation-center-line_mean: 1.6480767139539505
deviation-center-line_min: 0.47443172427996255
deviation-heading_max: 7.303409713730261
deviation-heading_mean: 4.038930044702247
deviation-heading_median: 3.3853092005261294
deviation-heading_min: 2.081692064026468
driven_any_max: 7.921176457117004
driven_any_mean: 4.419285506913345
driven_any_median: 4.127197321963932
driven_any_min: 1.5015709266085155
driven_lanedir_consec_max: 4.182696994793213
driven_lanedir_consec_mean: 2.4341788149141457
driven_lanedir_consec_min: 1.340019294267502
driven_lanedir_max: 7.244848545278506
driven_lanedir_mean: 3.199716702535469
driven_lanedir_median: 2.1069994852979335
driven_lanedir_min: 1.340019294267502
get_duckie_state_max: 1.3677801915152087e-06
get_duckie_state_mean: 1.3338164566664544e-06
get_duckie_state_median: 1.3353019643333535e-06
get_duckie_state_min: 1.2968817064839025e-06
get_robot_state_max: 0.0037997172803294903
get_robot_state_mean: 0.003721692238070424
get_robot_state_median: 0.003713792575013812
get_robot_state_min: 0.00365946652192458
get_state_dump_max: 0.004837973750367456
get_state_dump_mean: 0.004794823126657764
get_state_dump_median: 0.0048108433718800575
get_state_dump_min: 0.0047196320125034875
get_ui_image_max: 0.03649406663833126
get_ui_image_mean: 0.03121621929351901
get_ui_image_median: 0.03048579172589951
get_ui_image_min: 0.02739922708394576
in-drivable-lane_max: 19.44999999999972
in-drivable-lane_mean: 8.787499999999973
in-drivable-lane_min: 1.6000000000000227
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 7.921176457117004, "get_ui_image": 0.028237637234766418, "step_physics": 0.09988107212774958, "survival_time": 59.99999999999873, "driven_lanedir": 7.244848545278506, "get_state_dump": 0.0047918050910511385, "get_robot_state": 0.00365946652192458, "sim_render-ego0": 0.003730654021683184, "get_duckie_state": 1.3677801915152087e-06, "in-drivable-lane": 3.549999999999997, "deviation-heading": 7.303409713730261, "agent_compute-ego0": 0.012024790718593168, "complete-iteration": 0.16679481959759845, "set_robot_commands": 0.0022148035845093485, "deviation-center-line": 3.281597209017729, "driven_lanedir_consec": 4.182696994793213, "sim_compute_sim_state": 0.010182026125410018, "sim_compute_performance-ego0": 0.0019839193104308015},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.5015709266085155, "get_ui_image": 0.03649406663833126, "step_physics": 0.15739567337497587, "survival_time": 12.35000000000004, "driven_lanedir": 1.340019294267502, "get_state_dump": 0.0048298816527089766, "get_robot_state": 0.003693536404640444, "sim_render-ego0": 0.0038541747677710743, "get_duckie_state": 1.2968817064839025e-06, "in-drivable-lane": 1.6000000000000227, "deviation-heading": 2.081692064026468, "agent_compute-ego0": 0.012814802508200367, "complete-iteration": 0.234248528557439, "set_robot_commands": 0.00224608471316676, "deviation-center-line": 1.0805742463991364, "driven_lanedir_consec": 1.340019294267502, "sim_compute_sim_state": 0.010759677617780624, "sim_compute_performance-ego0": 0.002069487687080137},
 "LF-norm-techtrack-000-ego0": {"driven_any": 5.1462172315570704, "get_ui_image": 0.0327339462170326, "step_physics": 0.1162266525110804, "survival_time": 39.84999999999987, "driven_lanedir": 2.539687090505629, "get_state_dump": 0.0047196320125034875, "get_robot_state": 0.00373404874538718, "sim_render-ego0": 0.00387824627391079, "get_duckie_state": 1.3378927283418509e-06, "in-drivable-lane": 19.44999999999972, "deviation-heading": 3.903260095382132, "agent_compute-ego0": 0.012280714541748353, "complete-iteration": 0.18935548930538623, "set_robot_commands": 0.002272178355912517, "deviation-center-line": 1.755703676118974, "driven_lanedir_consec": 2.539687090505629, "sim_compute_sim_state": 0.011365832541520734, "sim_compute_performance-ego0": 0.002051691961168944},
 "LF-norm-small_loop-000-ego0": {"driven_any": 3.108177412370794, "get_ui_image": 0.02739922708394576, "step_physics": 0.10309290399356764, "survival_time": 24.450000000000212, "driven_lanedir": 1.6743118800902386, "get_state_dump": 0.004837973750367456, "get_robot_state": 0.0037997172803294903, "sim_render-ego0": 0.003885751354451082, "get_duckie_state": 1.3327112003248566e-06, "in-drivable-lane": 10.55000000000015, "deviation-heading": 2.8673583056701273, "agent_compute-ego0": 0.012507988968674015, "complete-iteration": 0.16643647077132245, "set_robot_commands": 0.0024179259125067265, "deviation-center-line": 0.47443172427996255, "driven_lanedir_consec": 1.6743118800902386, "sim_compute_sim_state": 0.006374239921569824, "sim_compute_performance-ego0": 0.002027959239726164}}
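The aggregate rows in this table (the _min, _max, _mean and _median entries) can be recomputed from the per-episodes details above. A minimal sketch, assuming the JSON object has been saved to a local file named per_episodes.json (a hypothetical name; the platform's own aggregation code is not shown here and may differ, e.g. in floating-point rounding):

    import json
    import statistics
    from collections import defaultdict

    # Load the per-episode metrics shown above.
    with open("per_episodes.json") as f:
        per_episodes = json.load(f)

    # Group each metric's values across the four episodes.
    values = defaultdict(list)
    for episode, metrics in per_episodes.items():
        for name, value in metrics.items():
            values[name].append(value)

    # Print the same aggregates the report lists for every metric.
    for name, vs in sorted(values.items()):
        print(f"{name}_max    {max(vs)}")
        print(f"{name}_mean   {statistics.fmean(vs)}")
        print(f"{name}_median {statistics.median(vs)}")
        print(f"{name}_min    {min(vs)}")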
set_robot_commands_max: 0.0024179259125067265
set_robot_commands_mean: 0.002287748141523838
set_robot_commands_median: 0.0022591315345396384
set_robot_commands_min: 0.0022148035845093485
sim_compute_performance-ego0_max: 0.002069487687080137
sim_compute_performance-ego0_mean: 0.002033264549601512
sim_compute_performance-ego0_median: 0.002039825600447554
sim_compute_performance-ego0_min: 0.0019839193104308015
sim_compute_sim_state_max: 0.011365832541520734
sim_compute_sim_state_mean: 0.0096704440515703
sim_compute_sim_state_median: 0.01047085187159532
sim_compute_sim_state_min: 0.006374239921569824
sim_render-ego0_max: 0.003885751354451082
sim_render-ego0_mean: 0.003837206604454033
sim_render-ego0_median: 0.003866210520840932
sim_render-ego0_min: 0.003730654021683184
simulation-passed: 1
step_physics_max: 0.15739567337497587
step_physics_mean: 0.11914907550184337
step_physics_median: 0.10965977825232402
step_physics_min: 0.09988107212774958
survival_time_max: 59.99999999999873
survival_time_mean: 34.16249999999971
survival_time_min: 12.35000000000004
58604  | LFv-sim | success | yes        |              |                | 0:30:15  |