
Submission 5829

Submission: 5829
Competing: yes
Challenge: aido3-LF-sim-validation
User: Ramon Emiliani 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 29508
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 29508

Episodes (each image on the original page links to detailed statistics for that episode):

ETHZ_autolab_technical_track-0-0

ETHZ_autolab_technical_track-1-0

ETHZ_autolab_technical_track-2-0

ETHZ_autolab_technical_track-3-0

ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID: 29508 | step: step1-simulation | status: success | up to date: yes | duration: 0:05:18
driven_lanedir_consec_median: 0.6638998589776813
survival_time_median: 4.499999999999992
deviation-center-line_median: 0.19388679416876256
in-drivable-lane_median: 0.6999999999999975
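Each headline value above is the median of the corresponding metric over the five episodes listed in the per-episode details below. As a quick check, and only as a minimal sketch rather than the evaluator's own code, the survival-time median can be reproduced from the per-episode survival times:

    from statistics import median

    # Per-episode survival times (seconds), copied from the per-episodes details below.
    survival_times = [4.149999999999993, 6.249999999999986, 4.499999999999992,
                      6.249999999999986, 2.3999999999999995]
    print(median(survival_times))  # -> 4.499999999999992, matching survival_time_median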


other stats
agent_compute-ego_max: 0.046926324566205345
agent_compute-ego_mean: 0.041798179089273
agent_compute-ego_median: 0.042865306854248046
agent_compute-ego_min: 0.03118044209767537
deviation-center-line_max: 0.2906327496802606
deviation-center-line_mean: 0.2106840102933127
deviation-center-line_min: 0.1276339944586687
deviation-heading_max: 1.9510297323470305
deviation-heading_mean: 1.4374857091934532
deviation-heading_median: 1.5752727255045775
deviation-heading_min: 0.6550645906620719
driven_any_max: 1.653567878142891
driven_any_mean: 1.2204636250850986
driven_any_median: 1.1643871720605112
driven_any_min: 0.5774835174353393
driven_lanedir_consec_max: 1.2990088931317425
driven_lanedir_consec_mean: 0.7342003410234715
driven_lanedir_consec_min: 0.3754435865596526
driven_lanedir_max: 1.315682834391388
driven_lanedir_mean: 0.7388548439762774
driven_lanedir_median: 0.6704984324820648
driven_lanedir_min: 0.3754435865596526
in-drivable-lane_max: 2.699999999999993
in-drivable-lane_mean: 1.269999999999997
in-drivable-lane_min: 0
per-episodes details:
ETHZ_autolab_technical_track-0-0: {"driven_any": 1.053315106425834, "sim_physics": 0.07633838883365493, "survival_time": 4.149999999999993, "driven_lanedir": 0.6704984324820648, "sim_render-ego": 0.012910983648644871, "in-drivable-lane": 0.6999999999999975, "agent_compute-ego": 0.03118044209767537, "deviation-heading": 1.7809346386619846, "set_robot_commands": 0.009503436375813312, "deviation-center-line": 0.19388679416876256, "driven_lanedir_consec": 0.6638998589776813, "sim_compute_sim_state": 0.00527808465153338, "sim_compute_performance-ego": 0.007499852812433818, "sim_compute_robot_state-ego": 0.009314824299639968}
ETHZ_autolab_technical_track-1-0: {"driven_any": 1.653567878142891, "sim_physics": 0.06813284301757813, "survival_time": 6.249999999999986, "driven_lanedir": 1.315682834391388, "sim_render-ego": 0.011427892684936524, "in-drivable-lane": 0.5999999999999979, "agent_compute-ego": 0.042865306854248046, "deviation-heading": 1.9510297323470305, "set_robot_commands": 0.007969194412231446, "deviation-center-line": 0.2906327496802606, "driven_lanedir_consec": 1.2990088931317425, "sim_compute_sim_state": 0.005300642013549805, "sim_compute_performance-ego": 0.006809585571289063, "sim_compute_robot_state-ego": 0.008431922912597657}
ETHZ_autolab_technical_track-2-0: {"driven_any": 1.1643871720605112, "sim_physics": 0.06956959300571018, "survival_time": 4.499999999999992, "driven_lanedir": 0.3754435865596526, "sim_render-ego": 0.011905701955159505, "in-drivable-lane": 2.3499999999999948, "agent_compute-ego": 0.04271328184339735, "deviation-heading": 1.225126858791602, "set_robot_commands": 0.008394053247239854, "deviation-center-line": 0.19184352502759336, "driven_lanedir_consec": 0.3754435865596526, "sim_compute_sim_state": 0.004905422528584798, "sim_compute_performance-ego": 0.007226422097947862, "sim_compute_robot_state-ego": 0.008853369288974337}
ETHZ_autolab_technical_track-3-0: {"driven_any": 1.6535644513609185, "sim_physics": 0.07316278076171875, "survival_time": 6.249999999999986, "driven_lanedir": 0.7850047762885471, "sim_render-ego": 0.011713420867919922, "in-drivable-lane": 2.699999999999993, "agent_compute-ego": 0.04530554008483887, "deviation-heading": 1.5752727255045775, "set_robot_commands": 0.008411699295043946, "deviation-center-line": 0.24942298813127844, "driven_lanedir_consec": 0.7850047762885471, "sim_compute_sim_state": 0.00501956558227539, "sim_compute_performance-ego": 0.0071394920349121095, "sim_compute_robot_state-ego": 0.008371196746826171}
ETHZ_autolab_technical_track-4-0: {"driven_any": 0.5774835174353393, "sim_physics": 0.07630815108617146, "survival_time": 2.3999999999999995, "driven_lanedir": 0.5476445901597344, "sim_render-ego": 0.012612193822860718, "in-drivable-lane": 0, "agent_compute-ego": 0.046926324566205345, "deviation-heading": 0.6550645906620719, "set_robot_commands": 0.00910275181134542, "deviation-center-line": 0.1276339944586687, "driven_lanedir_consec": 0.5476445901597344, "sim_compute_sim_state": 0.0053622424602508545, "sim_compute_performance-ego": 0.007596507668495178, "sim_compute_robot_state-ego": 0.00899815559387207}
set_robot_commands_max: 0.009503436375813312
set_robot_commands_mean: 0.008676227028334794
set_robot_commands_median: 0.008411699295043946
set_robot_commands_min: 0.007969194412231446
sim_compute_performance-ego_max: 0.007596507668495178
sim_compute_performance-ego_mean: 0.007254372037015605
sim_compute_performance-ego_median: 0.007226422097947862
sim_compute_performance-ego_min: 0.006809585571289063
sim_compute_robot_state-ego_max: 0.009314824299639968
sim_compute_robot_state-ego_mean: 0.00879389376838204
sim_compute_robot_state-ego_median: 0.008853369288974337
sim_compute_robot_state-ego_min: 0.008371196746826171
sim_compute_sim_state_max: 0.0053622424602508545
sim_compute_sim_state_mean: 0.0051731914472388455
sim_compute_sim_state_median: 0.00527808465153338
sim_compute_sim_state_min: 0.004905422528584798
sim_physics_max: 0.07633838883365493
sim_physics_mean: 0.07270235134096668
sim_physics_median: 0.07316278076171875
sim_physics_min: 0.06813284301757813
sim_render-ego_max: 0.012910983648644871
sim_render-ego_mean: 0.012114038595904307
sim_render-ego_median: 0.011905701955159505
sim_render-ego_min: 0.011427892684936524
simulation-passed: 1
survival_time_max: 6.249999999999986
survival_time_mean: 4.709999999999991
survival_time_min: 2.3999999999999995
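The per-metric max/mean/median/min rows above are aggregates of the five per-episode values in the details listed earlier. Below is a minimal sketch of how they could be recomputed offline, assuming the per-episode details have been saved to a local JSON file mapping each episode name to its metrics dict (the filename per_episodes.json is hypothetical, and this is not the official evaluator code):

    import json
    from statistics import mean, median

    # Load the per-episode details: {episode name: {metric name: value}}.
    with open("per_episodes.json") as f:
        per_episode = json.load(f)

    # Recompute the aggregate rows (max/mean/median/min) for every metric.
    metric_names = sorted({name for metrics in per_episode.values() for name in metrics})
    for name in metric_names:
        values = [metrics[name] for metrics in per_episode.values() if name in metrics]
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_median: {median(values)}")
        print(f"{name}_min: {min(values)}")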
Job ID: 29506 | step: step1-simulation | status: success | up to date: yes | duration: 0:05:18
Artefacts hidden.