Duckietown Challenges

Submission 12677

Submission: 12677
Competing: yes
Challenge: aido5-LFI-full-sim-validation
User: Bea Baselines 🐤
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFVIv-sim: 53494
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

53494

Episodes (per-episode statistics pages omitted):
LFI-full-4way-000
LFI-full-udem1-000

Evaluation jobs for this submission

See previous jobs for earlier versions of this challenge.
Job ID | step      | status  | up to date | date started | date completed | duration | message
53494  | LFVIv-sim | success | yes        |              |                | 0:16:47  |
driven_lanedir_consec_median: 4.896903710151475
survival_time_median: 45.82499999999952
deviation-center-line_median: 2.80793549679957
in-drivable-lane_median: 2.6750000000000314


other stats
agent_compute-ego0_max: 0.01354304021841344
agent_compute-ego0_mean: 0.013407588594666836
agent_compute-ego0_median: 0.013407588594666836
agent_compute-ego0_min: 0.01327213697092023
complete-iteration_max: 0.24270744842685735
complete-iteration_mean: 0.22531717047322505
complete-iteration_median: 0.22531717047322505
complete-iteration_min: 0.20792689251959273
deviation-center-line_max: 3.858815622858591
deviation-center-line_mean: 2.80793549679957
deviation-center-line_min: 1.7570553707405494
deviation-heading_max: 7.8449083207206
deviation-heading_mean: 6.5168479568585
deviation-heading_median: 6.5168479568585
deviation-heading_min: 5.188787592996399
driven_any_max: 7.921257070461843
driven_any_mean: 5.98923729120976
driven_any_median: 5.98923729120976
driven_any_min: 4.057217511957676
driven_lanedir_consec_max: 6.666035655388082
driven_lanedir_consec_mean: 4.896903710151475
driven_lanedir_consec_min: 3.1277717649148675
driven_lanedir_max: 7.420939143157886
driven_lanedir_mean: 5.456426697400823
driven_lanedir_median: 5.456426697400823
driven_lanedir_min: 3.491914251643759
get_duckie_state_max: 1.5307624175288496e-06
get_duckie_state_mean: 1.4751841552221525e-06
get_duckie_state_median: 1.4751841552221525e-06
get_duckie_state_min: 1.419605892915455e-06
get_robot_state_max: 0.004045164059044221
get_robot_state_mean: 0.003948640503921424
get_robot_state_median: 0.003948640503921424
get_robot_state_min: 0.003852116948798628
get_state_dump_max: 0.0050791939728266
get_state_dump_mean: 0.004943022103809706
get_state_dump_median: 0.004943022103809706
get_state_dump_min: 0.004806850234792812
get_ui_image_max: 0.03996430436143364
get_ui_image_mean: 0.0398154045729608
get_ui_image_median: 0.0398154045729608
get_ui_image_min: 0.039666504784487966
in-drivable-lane_max: 3.200000000000032
in-drivable-lane_mean: 2.6750000000000314
in-drivable-lane_min: 2.1500000000000306
per-episodes details:
{
  "LFI-full-4way-000-ego0": {
    "driven_any": 4.057217511957676,
    "get_ui_image": 0.03996430436143364,
    "step_physics": 0.1606578251540849,
    "survival_time": 31.650000000000315,
    "driven_lanedir": 3.491914251643759,
    "get_state_dump": 0.004806850234792812,
    "get_robot_state": 0.003852116948798628,
    "sim_render-ego0": 0.004025978620872136,
    "get_duckie_state": 1.419605892915455e-06,
    "in-drivable-lane": 2.1500000000000306,
    "deviation-heading": 7.8449083207206,
    "agent_compute-ego0": 0.01354304021841344,
    "complete-iteration": 0.24270744842685735,
    "set_robot_commands": 0.0023338151654986555,
    "deviation-center-line": 1.7570553707405494,
    "driven_lanedir_consec": 3.1277717649148675,
    "sim_compute_sim_state": 0.01134190687245751,
    "sim_compute_performance-ego0": 0.0020934805885098333
  },
  "LFI-full-udem1-000-ego0": {
    "driven_any": 7.921257070461843,
    "get_ui_image": 0.039666504784487966,
    "step_physics": 0.12415135135063024,
    "survival_time": 59.99999999999873,
    "driven_lanedir": 7.420939143157886,
    "get_state_dump": 0.0050791939728266,
    "get_robot_state": 0.004045164059044221,
    "sim_render-ego0": 0.004022566106893141,
    "get_duckie_state": 1.5307624175288496e-06,
    "in-drivable-lane": 3.200000000000032,
    "deviation-heading": 5.188787592996399,
    "agent_compute-ego0": 0.01327213697092023,
    "complete-iteration": 0.20792689251959273,
    "set_robot_commands": 0.0023679316391258015,
    "deviation-center-line": 3.858815622858591,
    "driven_lanedir_consec": 6.666035655388082,
    "sim_compute_sim_state": 0.013061902207399188,
    "sim_compute_performance-ego0": 0.002168002672536883
  }
}
set_robot_commands_max: 0.0023679316391258015
set_robot_commands_mean: 0.0023508734023122285
set_robot_commands_median: 0.0023508734023122285
set_robot_commands_min: 0.0023338151654986555
sim_compute_performance-ego0_max: 0.002168002672536883
sim_compute_performance-ego0_mean: 0.0021307416305233586
sim_compute_performance-ego0_median: 0.0021307416305233586
sim_compute_performance-ego0_min: 0.0020934805885098333
sim_compute_sim_state_max: 0.013061902207399188
sim_compute_sim_state_mean: 0.012201904539928348
sim_compute_sim_state_median: 0.012201904539928348
sim_compute_sim_state_min: 0.01134190687245751
sim_render-ego0_max: 0.004025978620872136
sim_render-ego0_mean: 0.004024272363882638
sim_render-ego0_median: 0.004024272363882638
sim_render-ego0_min: 0.004022566106893141
simulation-passed: 1
step_physics_max: 0.1606578251540849
step_physics_mean: 0.14240458825235758
step_physics_median: 0.14240458825235758
step_physics_min: 0.12415135135063024
survival_time_max: 59.99999999999873
survival_time_mean: 45.82499999999952
survival_time_min: 31.650000000000315
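The aggregate rows above (min, max, mean, median) are simply per-metric summaries over the two episodes in the per-episodes details; with only two episodes, the median coincides with the mean, which is why the *_median and *_mean rows are identical. A minimal sketch of that relationship (an illustration for one metric, not the evaluator's actual code):

```python
import statistics

# Per-episode values for driven_lanedir_consec, copied from the
# per-episodes details above.
driven_lanedir_consec = {
    "LFI-full-4way-000-ego0": 3.1277717649148675,
    "LFI-full-udem1-000-ego0": 6.666035655388082,
}

values = list(driven_lanedir_consec.values())

# Summaries matching the driven_lanedir_consec_* rows above.
summary = {
    "min": min(values),
    "max": max(values),
    "mean": statistics.mean(values),
    "median": statistics.median(values),
}
print(summary)
```

The computed median matches the headline score driven_lanedir_consec_median: 4.896903710151475.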
53493  | LFVIv-sim | success | yes        |              |                | 0:18:30  |