
Submission 5699

Submission: 5699
Competing: yes
Challenge: aido3-LF-sim-validation
User: Basile Dura 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 29274
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 29274

Episodes (one image per episode, each linked to detailed statistics; images omitted here):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
29274 | step1-simulation | success | yes | | | 0:05:04 |
driven_lanedir_consec_median: 0.26979966488047746
survival_time_median: 2.4499999999999993
deviation-center-line_median: 0.11275048197229606
in-drivable-lane_median: 0.3999999999999986


other stats
agent_compute-ego_max: 0.04190811818959762
agent_compute-ego_mean: 0.02509136451507101
agent_compute-ego_median: 0.021525755723317466
agent_compute-ego_min: 0.01940070589383443
deviation-center-line_max: 0.6251379912428785
deviation-center-line_mean: 0.22208271424445383
deviation-center-line_min: 0.02990790756265292
deviation-heading_max: 4.786894966308617
deviation-heading_mean: 1.8248593130126485
deviation-heading_median: 0.8116103490792242
deviation-heading_min: 0.4881466983509043
driven_any_max: 3.913005939671856
driven_any_mean: 1.7574567734998792
driven_any_median: 0.7298570082143749
driven_any_min: 0.1666753345553215
driven_lanedir_consec_max: 0.899352684134396
driven_lanedir_consec_mean: 0.3791895873074223
driven_lanedir_consec_min: 0.07800710824711565
driven_lanedir_max: 0.899352684134396
driven_lanedir_mean: 0.3914162685979933
driven_lanedir_median: 0.3309330713333323
driven_lanedir_min: 0.07800710824711565
in-drivable-lane_max: 12.25000000000007
in-drivable-lane_mean: 4.520000000000024
in-drivable-lane_min: 0.04999999999999993
per-episode details (JSON):
{
"ETHZ_autolab_technical_track-0-0": {"driven_any": 0.7298570082143749, "sim_physics": 0.09256733193689463, "survival_time": 2.4499999999999993, "driven_lanedir": 0.5511294486868568, "sim_render-ego": 0.009326759649782764, "in-drivable-lane": 0.3999999999999986, "agent_compute-ego": 0.04190811818959762, "deviation-heading": 0.8116103490792242, "set_robot_commands": 0.008689160249671158, "deviation-center-line": 0.11275048197229606, "driven_lanedir_consec": 0.5511294486868568, "sim_compute_sim_state": 0.004241490850643236, "sim_compute_performance-ego": 0.005735548175111109, "sim_compute_robot_state-ego": 0.007140977042061942},
"ETHZ_autolab_technical_track-1-0": {"driven_any": 3.761596916967217, "sim_physics": 0.07653316418329875, "survival_time": 14.950000000000076, "driven_lanedir": 0.3309330713333323, "sim_render-ego": 0.009790855248769124, "in-drivable-lane": 12.25000000000007, "agent_compute-ego": 0.021525755723317466, "deviation-heading": 2.544764122561613, "set_robot_commands": 0.008547564347585043, "deviation-center-line": 0.30368060839410566, "driven_lanedir_consec": 0.26979966488047746, "sim_compute_sim_state": 0.00418325662612915, "sim_compute_performance-ego": 0.005655322074890137, "sim_compute_robot_state-ego": 0.0069886581103007},
"ETHZ_autolab_technical_track-2-0": {"driven_any": 0.21614866809062544, "sim_physics": 0.0871632993221283, "survival_time": 1.2000000000000004, "driven_lanedir": 0.07800710824711565, "sim_render-ego": 0.010050525267918903, "in-drivable-lane": 0.3500000000000002, "agent_compute-ego": 0.01940070589383443, "deviation-heading": 0.4881466983509043, "set_robot_commands": 0.00811059276262919, "deviation-center-line": 0.03893658205033621, "driven_lanedir_consec": 0.07800710824711565, "sim_compute_sim_state": 0.004414657751719157, "sim_compute_performance-ego": 0.005687485138575236, "sim_compute_robot_state-ego": 0.00761750340461731},
"ETHZ_autolab_technical_track-3-0": {"driven_any": 0.1666753345553215, "sim_physics": 0.07772678136825562, "survival_time": 1.0000000000000002, "driven_lanedir": 0.09765903058826564, "sim_render-ego": 0.010498011112213134, "in-drivable-lane": 0.04999999999999993, "agent_compute-ego": 0.02094782590866089, "deviation-heading": 0.4928804287628832, "set_robot_commands": 0.008816981315612793, "deviation-center-line": 0.02990790756265292, "driven_lanedir_consec": 0.09765903058826564, "sim_compute_sim_state": 0.003853404521942139, "sim_compute_performance-ego": 0.005816614627838135, "sim_compute_robot_state-ego": 0.007102537155151367},
"ETHZ_autolab_technical_track-4-0": {"driven_any": 3.913005939671856, "sim_physics": 0.07840103308359782, "survival_time": 14.950000000000076, "driven_lanedir": 0.899352684134396, "sim_render-ego": 0.009337638219197591, "in-drivable-lane": 9.550000000000054, "agent_compute-ego": 0.021674416859944663, "deviation-heading": 4.786894966308617, "set_robot_commands": 0.008374532063802084, "deviation-center-line": 0.6251379912428785, "driven_lanedir_consec": 0.899352684134396, "sim_compute_sim_state": 0.004208757082621257, "sim_compute_performance-ego": 0.005534106890360514, "sim_compute_robot_state-ego": 0.006873668829600017}
}
set_robot_commands_max: 0.008816981315612793
set_robot_commands_mean: 0.008507766147860054
set_robot_commands_median: 0.008547564347585043
set_robot_commands_min: 0.00811059276262919
sim_compute_performance-ego_max: 0.005816614627838135
sim_compute_performance-ego_mean: 0.005685815381355027
sim_compute_performance-ego_median: 0.005687485138575236
sim_compute_performance-ego_min: 0.005534106890360514
sim_compute_robot_state-ego_max: 0.00761750340461731
sim_compute_robot_state-ego_mean: 0.007144668908346266
sim_compute_robot_state-ego_median: 0.007102537155151367
sim_compute_robot_state-ego_min: 0.006873668829600017
sim_compute_sim_state_max: 0.004414657751719157
sim_compute_sim_state_mean: 0.004180313366610988
sim_compute_sim_state_median: 0.004208757082621257
sim_compute_sim_state_min: 0.003853404521942139
sim_physics_max: 0.09256733193689463
sim_physics_mean: 0.08247832197883502
sim_physics_median: 0.07840103308359782
sim_physics_min: 0.07653316418329875
sim_render-ego_max: 0.010498011112213134
sim_render-ego_mean: 0.009800757899576306
sim_render-ego_median: 0.009790855248769124
sim_render-ego_min: 0.009326759649782764
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 6.91000000000003
survival_time_min: 1.0000000000000002
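The aggregate rows above (the _min, _max, _mean, and _median values) are consistent with taking per-metric statistics over the five episode records in the per-episode details. A minimal sketch of that aggregation, assuming the per-episode JSON has been saved to a hypothetical details.json file (this is not the evaluator's own code, just a way to reproduce the numbers shown):

# Recompute the aggregate statistics from the per-episode details.
# "details.json" is a hypothetical file containing the JSON block above.
import json
from statistics import mean, median

with open("details.json") as f:
    episodes = json.load(f)  # episode name -> {metric name: value}

# Gather each metric's values across the five episodes.
by_metric = {}
for episode_stats in episodes.values():
    for metric, value in episode_stats.items():
        by_metric.setdefault(metric, []).append(value)

for metric, values in sorted(by_metric.items()):
    print(f"{metric}_max    {max(values)}")
    print(f"{metric}_mean   {mean(values)}")
    print(f"{metric}_median {median(values)}")
    print(f"{metric}_min    {min(values)}")

# Example output consistent with the table above:
#   driven_lanedir_consec_median 0.26979966488047746
#   survival_time_mean           approximately 6.91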
29273 | step1-simulation | success | yes | | | 0:04:20 |