
Submission 5222

Submission: 5222
Competing: yes
Challenge: aido3-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28166
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28166

Episodes evaluated in this job (detailed per-episode statistics are available as images on the challenge server):

ETHZ_autolab_technical_track-0-0

ETHZ_autolab_technical_track-1-0

ETHZ_autolab_technical_track-2-0

ETHZ_autolab_technical_track-3-0

ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
28166 | step1-simulation | success | yes | | | 0:08:45 |
driven_lanedir_consec_median: 2.160705171074696
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.3227554327735735
in-drivable-lane_median: 1.2500000000000178
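
Each of these headline values is the median of the corresponding per-episode metric over the five episodes of this job. As a minimal cross-check, the survival_time_median above can be reproduced from the per-episode survival_time values listed in the per-episodes details further down; this is a small standard-library Python sketch, with the numbers copied from this page:

    from statistics import median

    # survival_time per episode, copied from the per-episodes details below
    survival_times = [
        14.950000000000076,  # ETHZ_autolab_technical_track-0-0
        11.850000000000032,  # ETHZ_autolab_technical_track-1-0
        14.950000000000076,  # ETHZ_autolab_technical_track-2-0
        9.049999999999994,   # ETHZ_autolab_technical_track-3-0
        14.950000000000076,  # ETHZ_autolab_technical_track-4-0
    ]

    print(median(survival_times))  # 14.950000000000076, matching survival_time_median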


Other stats

agent_compute-ego_max: 0.030186342398325603
agent_compute-ego_mean: 0.028268165013121904
agent_compute-ego_median: 0.02820687756759708
agent_compute-ego_min: 0.02679740031560262
deviation-center-line_max: 0.658431058515604
deviation-center-line_mean: 0.4261846074581296
deviation-center-line_min: 0.295939122375614
deviation-heading_max: 3.3375820138869625
deviation-heading_mean: 2.8513058995870666
deviation-heading_median: 2.8089943944303015
deviation-heading_min: 2.4778882325614213
driven_any_max: 2.79011426750239
driven_any_mean: 2.371516839965488
driven_any_median: 2.595516374548959
driven_any_min: 1.6216373104241646
driven_lanedir_consec_max: 2.3608412564587025
driven_lanedir_consec_mean: 1.9626041027135084
driven_lanedir_consec_min: 1.1948675279001164
driven_lanedir_max: 2.3954534626738058
driven_lanedir_mean: 1.985806638326871
driven_lanedir_median: 2.160705171074696
driven_lanedir_min: 1.2426739086049876
in-drivable-lane_max: 2.5000000000000107
in-drivable-lane_mean: 1.5300000000000151
in-drivable-lane_min: 1.1000000000000083
per-episodes details:
{"ETHZ_autolab_technical_track-0-0": {"driven_any": 2.595516374548959, "sim_physics": 0.08548532009124755, "survival_time": 14.950000000000076, "driven_lanedir": 2.3241168744715113, "sim_render-ego": 0.01332394520441691, "in-drivable-lane": 1.2500000000000178, "agent_compute-ego": 0.030186342398325603, "deviation-heading": 2.871712509512664, "set_robot_commands": 0.009741764863332112, "deviation-center-line": 0.5396304675555443, "driven_lanedir_consec": 2.3241168744715113, "sim_compute_sim_state": 0.006520845890045166, "sim_compute_performance-ego": 0.008438554604848226, "sim_compute_robot_state-ego": 0.010336698691050212}, "ETHZ_autolab_technical_track-1-0": {"driven_any": 2.1883020739579533, "sim_physics": 0.08506470692308643, "survival_time": 11.850000000000032, "driven_lanedir": 1.8060837748093543, "sim_render-ego": 0.013623087718013972, "in-drivable-lane": 1.1000000000000156, "agent_compute-ego": 0.02820687756759708, "deviation-heading": 2.7603523475439826, "set_robot_commands": 0.009432384233434492, "deviation-center-line": 0.3227554327735735, "driven_lanedir_consec": 1.7724896836625168, "sim_compute_sim_state": 0.006346711629553687, "sim_compute_performance-ego": 0.008493641760781847, "sim_compute_robot_state-ego": 0.01005921786344504}, "ETHZ_autolab_technical_track-2-0": {"driven_any": 2.662014173393973, "sim_physics": 0.08151601076126098, "survival_time": 14.950000000000076, "driven_lanedir": 2.160705171074696, "sim_render-ego": 0.01382293939590454, "in-drivable-lane": 2.5000000000000107, "agent_compute-ego": 0.02890128215154012, "deviation-heading": 3.3375820138869625, "set_robot_commands": 0.009622858365376793, "deviation-center-line": 0.658431058515604, "driven_lanedir_consec": 2.160705171074696, "sim_compute_sim_state": 0.006107859611511231, "sim_compute_performance-ego": 0.008561474482218424, "sim_compute_robot_state-ego": 0.01030478318532308}, "ETHZ_autolab_technical_track-3-0": {"driven_any": 1.6216373104241646, "sim_physics": 0.08528089655038401, "survival_time": 9.049999999999994, "driven_lanedir": 1.2426739086049876, "sim_render-ego": 0.013508377812843956, "in-drivable-lane": 1.1000000000000083, "agent_compute-ego": 0.02724892263254408, "deviation-heading": 2.8089943944303015, "set_robot_commands": 0.008950918418926429, "deviation-center-line": 0.3141669560703125, "driven_lanedir_consec": 1.1948675279001164, "sim_compute_sim_state": 0.006135032980481564, "sim_compute_performance-ego": 0.00835389326949146, "sim_compute_robot_state-ego": 0.010449554380132347}, "ETHZ_autolab_technical_track-4-0": {"driven_any": 2.79011426750239, "sim_physics": 0.0854801591237386, "survival_time": 14.950000000000076, "driven_lanedir": 2.3954534626738058, "sim_render-ego": 0.013215604623158772, "in-drivable-lane": 1.7000000000000242, "agent_compute-ego": 0.02679740031560262, "deviation-heading": 2.4778882325614213, "set_robot_commands": 0.008925146261850992, "deviation-center-line": 0.295939122375614, "driven_lanedir_consec": 2.3608412564587025, "sim_compute_sim_state": 0.0062702210744222005, "sim_compute_performance-ego": 0.008312780062357585, "sim_compute_robot_state-ego": 0.009719209671020508}}
set_robot_commands_max: 0.009741764863332112
set_robot_commands_mean: 0.009334614428584164
set_robot_commands_median: 0.009432384233434492
set_robot_commands_min: 0.008925146261850992
sim_compute_performance-ego_max: 0.008561474482218424
sim_compute_performance-ego_mean: 0.008432068835939508
sim_compute_performance-ego_median: 0.008438554604848226
sim_compute_performance-ego_min: 0.008312780062357585
sim_compute_robot_state-ego_max: 0.010449554380132347
sim_compute_robot_state-ego_mean: 0.010173892758194235
sim_compute_robot_state-ego_median: 0.01030478318532308
sim_compute_robot_state-ego_min: 0.009719209671020508
sim_compute_sim_state_max: 0.006520845890045166
sim_compute_sim_state_mean: 0.006276134237202769
sim_compute_sim_state_median: 0.0062702210744222005
sim_compute_sim_state_min: 0.006107859611511231
sim_physics_max: 0.08548532009124755
sim_physics_mean: 0.08456541868994352
sim_physics_median: 0.08528089655038401
sim_physics_min: 0.08151601076126098
sim_render-ego_max: 0.01382293939590454
sim_render-ego_mean: 0.013498790950867631
sim_render-ego_median: 0.013508377812843956
sim_render-ego_min: 0.013215604623158772
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 13.150000000000052
survival_time_min: 9.049999999999994
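
All of the aggregates above (the per-metric min, max, mean, and median values) can be recomputed from the per-episodes details JSON. The following is a minimal sketch of such a cross-check, assuming the JSON blob has been saved to a local file named per_episodes.json (a hypothetical filename, not something the evaluator produces); small floating-point representation differences may appear in the mean values.

    import json
    from statistics import mean, median

    # Load the per-episodes details; the filename is hypothetical, the content
    # is exactly the JSON blob shown above.
    with open("per_episodes.json") as f:
        per_episode = json.load(f)

    # Group each metric across the five episodes.
    metrics = {}
    for episode, stats in per_episode.items():
        for name, value in stats.items():
            metrics.setdefault(name, []).append(value)

    # Recompute the aggregates reported above (e.g. survival_time_median).
    for name, values in sorted(metrics.items()):
        print(f"{name}_min: {min(values)}")
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_median: {median(values)}")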
28165 | step1-simulation | success | yes | | | 0:09:00 |