
Submission 5402

Submission: 5402
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28475
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28475

Episodes evaluated (detailed per-episode statistics are available for each):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for earlier versions of this challenge.
Job ID: 28475 | step: step1-simulation | status: success | up to date: yes | duration: 0:13:35
driven_lanedir_consec_median: 2.41518916173012
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.60170863210784
in-drivable-lane_median: 1.800000000000022


other stats
agent_compute-ego_max: 0.05248964866002401
agent_compute-ego_mean: 0.05000806688327415
agent_compute-ego_median: 0.0503218296696158
agent_compute-ego_min: 0.04645003318786621
deviation-center-line_max: 0.8208161183870774
deviation-center-line_mean: 0.6104587937965025
deviation-center-line_min: 0.32746616790493477
deviation-heading_max: 3.568637577337554
deviation-heading_mean: 2.5683645780757898
deviation-heading_median: 2.512029493270197
deviation-heading_min: 1.2576340053639938
driven_any_max: 4.792551038463866
driven_any_mean: 4.0151868179835235
driven_any_median: 4.414168912706507
driven_any_min: 2.172754125096691
driven_lanedir_consec_max: 4.694578315546972
driven_lanedir_consec_mean: 3.134385630224763
driven_lanedir_consec_min: 1.8348187417221873
driven_lanedir_max: 4.694578315546972
driven_lanedir_mean: 3.7112806469165727
driven_lanedir_median: 4.2997157464056555
driven_lanedir_min: 1.8348187417221873
in-drivable-lane_max: 4.900000000000019
in-drivable-lane_mean: 2.1000000000000085
in-drivable-lane_min: 0
per-episodes details (one JSON object per episode; see the parsing sketch after this list):
ETHZ_autolab_technical_track-0-0: {"driven_any": 3.9144664950641537, "sim_physics": 0.12914867639541627, "survival_time": 14.950000000000076, "driven_lanedir": 3.2591811609264547, "sim_render-ego": 0.014036086400349935, "in-drivable-lane": 4.900000000000019, "agent_compute-ego": 0.04645003318786621, "deviation-heading": 3.568637577337554, "set_robot_commands": 0.01150264024734497, "deviation-center-line": 0.60170863210784, "driven_lanedir_consec": 2.25923266214294, "sim_compute_sim_state": 0.005713214079538981, "sim_compute_performance-ego": 0.008436440626780192, "sim_compute_robot_state-ego": 0.010217756430308023}
ETHZ_autolab_technical_track-1-0: {"driven_any": 2.172754125096691, "sim_physics": 0.10775342408348532, "survival_time": 6.799999999999984, "driven_lanedir": 1.8348187417221873, "sim_render-ego": 0.012662480859195484, "in-drivable-lane": 0.9999999999999964, "agent_compute-ego": 0.0503218296696158, "deviation-heading": 1.2576340053639938, "set_robot_commands": 0.009452339480904973, "deviation-center-line": 0.32746616790493477, "driven_lanedir_consec": 1.8348187417221873, "sim_compute_sim_state": 0.005553531296112958, "sim_compute_performance-ego": 0.007860749959945679, "sim_compute_robot_state-ego": 0.009490556576672722}
ETHZ_autolab_technical_track-2-0: {"driven_any": 4.7819935185863995, "sim_physics": 0.11572435935338338, "survival_time": 14.950000000000076, "driven_lanedir": 4.4681092699815945, "sim_render-ego": 0.013040320873260498, "in-drivable-lane": 2.8000000000000043, "agent_compute-ego": 0.05077268759409587, "deviation-heading": 2.24712992425507, "set_robot_commands": 0.010698479811350504, "deviation-center-line": 0.8208161183870774, "driven_lanedir_consec": 4.4681092699815945, "sim_compute_sim_state": 0.0055904308954874676, "sim_compute_performance-ego": 0.007826698621114096, "sim_compute_robot_state-ego": 0.009793220361073812}
ETHZ_autolab_technical_track-3-0: {"driven_any": 4.792551038463866, "sim_physics": 0.11313371817270916, "survival_time": 14.950000000000076, "driven_lanedir": 4.694578315546972, "sim_render-ego": 0.0130091921488444, "in-drivable-lane": 0, "agent_compute-ego": 0.05248964866002401, "deviation-heading": 2.512029493270197, "set_robot_commands": 0.010172727902730308, "deviation-center-line": 0.5953685823716673, "driven_lanedir_consec": 4.694578315546972, "sim_compute_sim_state": 0.0054100918769836425, "sim_compute_performance-ego": 0.00800714651743571, "sim_compute_robot_state-ego": 0.009528969923655192}
ETHZ_autolab_technical_track-4-0: {"driven_any": 4.414168912706507, "sim_physics": 0.10499334573745728, "survival_time": 14.950000000000076, "driven_lanedir": 4.2997157464056555, "sim_render-ego": 0.012521371841430665, "in-drivable-lane": 1.800000000000022, "agent_compute-ego": 0.05000613530476888, "deviation-heading": 3.256391890152134, "set_robot_commands": 0.01031584660212199, "deviation-center-line": 0.706934468210993, "driven_lanedir_consec": 2.41518916173012, "sim_compute_sim_state": 0.005520470142364502, "sim_compute_performance-ego": 0.007384899457295736, "sim_compute_robot_state-ego": 0.009520952701568604}
set_robot_commands_max: 0.01150264024734497
set_robot_commands_mean: 0.01042840680889055
set_robot_commands_median: 0.01031584660212199
set_robot_commands_min: 0.009452339480904973
sim_compute_performance-ego_max: 0.008436440626780192
sim_compute_performance-ego_mean: 0.007903187036514284
sim_compute_performance-ego_median: 0.007860749959945679
sim_compute_performance-ego_min: 0.007384899457295736
sim_compute_robot_state-ego_max: 0.010217756430308023
sim_compute_robot_state-ego_mean: 0.00971029119865567
sim_compute_robot_state-ego_median: 0.009528969923655192
sim_compute_robot_state-ego_min: 0.009490556576672722
sim_compute_sim_state_max: 0.005713214079538981
sim_compute_sim_state_mean: 0.00555754765809751
sim_compute_sim_state_median: 0.005553531296112958
sim_compute_sim_state_min: 0.0054100918769836425
sim_physics_max: 0.12914867639541627
sim_physics_mean: 0.11415070474849028
sim_physics_median: 0.11313371817270916
sim_physics_min: 0.10499334573745728
sim_render-ego_max: 0.014036086400349935
sim_render-ego_mean: 0.013053890424616197
sim_render-ego_median: 0.0130091921488444
sim_render-ego_min: 0.012521371841430665
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 13.32000000000006
survival_time_min: 6.799999999999984
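
The aggregate rows above (the _min, _mean, _median and _max entries) are plain reductions over the five per-episode values in the details blob. A minimal Python sketch of that reduction, assuming the per-episode details have been saved locally as per_episode_details.json (a hypothetical filename; the evaluator stores this data with the job artefacts):

    import json
    import statistics

    # Load the per-episode details shown above.
    # The path is hypothetical; point it at wherever the artefact was saved.
    with open("per_episode_details.json") as f:
        per_episode = json.load(f)

    def aggregate(metric):
        """Reduce one metric across all episodes to min/mean/median/max."""
        values = [episode[metric] for episode in per_episode.values()]
        return {
            "min": min(values),
            "mean": statistics.mean(values),
            "median": statistics.median(values),
            "max": max(values),
        }

    # Reproduces e.g. driven_lanedir_consec_median = 2.41518916173012
    # and survival_time_median = 14.950000000000076 from the tables above.
    for metric in ("driven_lanedir_consec", "survival_time",
                   "deviation-center-line", "in-drivable-lane"):
        print(metric, aggregate(metric))

With only five episodes the median is the third-largest value, which is why the single short episode (ETHZ_autolab_technical_track-1-0, survival time 6.8 s) lowers survival_time_mean but leaves survival_time_median at 14.95 s.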
Job ID: 28472 | step: step1-simulation | status: success | up to date: yes | duration: 0:14:12
Artefacts hidden.
Job ID: 28469 | step: step1-simulation | status: success | up to date: yes | duration: 0:12:23
Artefacts hidden.