
Submission 5275

Submission: 5275
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28254
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28254

Episodes (images omitted; each image on the original page links to detailed statistics for that episode):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID   step               status    up to date   duration
28254    step1-simulation   success   yes          0:10:16
Artefacts hidden.
Scores:

driven_lanedir_consec_median: 2.978602440487695
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.7354763623783985
in-drivable-lane_median: 2.3999999999999915
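These headline values are the medians of the corresponding per-episode metrics over the five episodes detailed further down. As a quick check (a minimal sketch using Python's standard library, not part of the evaluation output), recomputing the driven_lanedir_consec median from the per-episode values reproduces the figure above:

    from statistics import median

    # driven_lanedir_consec per episode, copied from the per-episode details below
    driven_lanedir_consec = [
        2.978602440487695,  # ETHZ_autolab_technical_track-0-0
        1.884949235556671,  # ETHZ_autolab_technical_track-1-0
        5.00537934723043,   # ETHZ_autolab_technical_track-2-0
        3.833580549278401,  # ETHZ_autolab_technical_track-3-0
        2.426863806858925,  # ETHZ_autolab_technical_track-4-0
    ]

    print(median(driven_lanedir_consec))  # 2.978602440487695, matching driven_lanedir_consec_median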


Other stats:

agent_compute-ego_max: 0.05991469542185465
agent_compute-ego_mean: 0.05492295923687164
agent_compute-ego_median: 0.05509822050730387
agent_compute-ego_min: 0.04878690560658773
deviation-center-line_max: 0.8474248331272387
deviation-center-line_mean: 0.6471528987326847
deviation-center-line_min: 0.23983115788269152
deviation-heading_max: 3.8804047718227928
deviation-heading_mean: 2.9271230485178292
deviation-heading_median: 3.107661102829861
deviation-heading_min: 1.4770420721076063
driven_any_max: 5.63289268420388
driven_any_mean: 4.339705475428239
driven_any_median: 4.420256049596535
driven_any_min: 2.762396262327768
driven_lanedir_consec_max: 5.00537934723043
driven_lanedir_consec_mean: 3.225875075882425
driven_lanedir_consec_min: 1.884949235556671
driven_lanedir_max: 5.00537934723043
driven_lanedir_mean: 3.842921135635832
driven_lanedir_median: 3.8953650649161857
driven_lanedir_min: 2.427667885320367
in-drivable-lane_max: 3.0500000000000176
in-drivable-lane_mean: 2.20000000000001
in-drivable-lane_min: 1.099999999999996
set_robot_commands_max: 0.011702049573262532
set_robot_commands_mean: 0.010688282648722331
set_robot_commands_median: 0.010402203400929767
set_robot_commands_min: 0.009917600949605306
sim_compute_performance-ego_max: 0.00917691503252302
sim_compute_performance-ego_mean: 0.008736486979893276
sim_compute_performance-ego_median: 0.008667467435201009
sim_compute_performance-ego_min: 0.00836256742477417
sim_compute_robot_state-ego_max: 0.011127055486043294
sim_compute_robot_state-ego_mean: 0.010520128726959227
sim_compute_robot_state-ego_median: 0.010470155874888105
sim_compute_robot_state-ego_min: 0.010009763240814208
sim_compute_sim_state_max: 0.0069906481107076006
sim_compute_sim_state_mean: 0.0065821832702273415
sim_compute_sim_state_median: 0.00663954530443464
sim_compute_sim_state_min: 0.006157945791880289
sim_physics_max: 0.08899345636367798
sim_physics_mean: 0.08544825065703618
sim_physics_median: 0.08443775574366251
sim_physics_min: 0.08233786185582478
sim_render-ego_max: 0.015017446676890056
sim_render-ego_mean: 0.014303676718757269
sim_render-ego_median: 0.014120570023854574
sim_render-ego_min: 0.01362705945968628
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 13.36000000000006
survival_time_min: 6.999999999999983

Per-episode details:

ETHZ_autolab_technical_track-0-0:
  driven_any: 4.420256049596535
  sim_physics: 0.08443775574366251
  survival_time: 14.950000000000076
  driven_lanedir: 3.8953650649161857
  sim_render-ego: 0.01362705945968628
  in-drivable-lane: 2.3999999999999915
  agent_compute-ego: 0.04878690560658773
  deviation-heading: 3.6677420716805775
  set_robot_commands: 0.009917600949605306
  deviation-center-line: 0.7919816865374985
  driven_lanedir_consec: 2.978602440487695
  sim_compute_sim_state: 0.006475656827290853
  sim_compute_performance-ego: 0.008667467435201009
  sim_compute_robot_state-ego: 0.010009763240814208

ETHZ_autolab_technical_track-1-0:
  driven_any: 3.8720447816419465
  sim_physics: 0.08233786185582478
  survival_time: 14.950000000000076
  driven_lanedir: 3.30896753586545
  sim_render-ego: 0.014120570023854574
  in-drivable-lane: 2.450000000000016
  agent_compute-ego: 0.05509822050730387
  deviation-heading: 3.8804047718227928
  set_robot_commands: 0.010220597585042318
  deviation-center-line: 0.7354763623783985
  driven_lanedir_consec: 1.884949235556671
  sim_compute_sim_state: 0.006157945791880289
  sim_compute_performance-ego: 0.00862294594446818
  sim_compute_robot_state-ego: 0.010202045440673827

ETHZ_autolab_technical_track-2-0:
  driven_any: 5.63289268420388
  sim_physics: 0.08393104871114095
  survival_time: 14.950000000000076
  driven_lanedir: 5.00537934723043
  sim_render-ego: 0.013910705248514812
  in-drivable-lane: 3.0500000000000176
  agent_compute-ego: 0.05329246997833252
  deviation-heading: 2.50276522414831
  set_robot_commands: 0.010402203400929767
  deviation-center-line: 0.8474248331272387
  driven_lanedir_consec: 5.00537934723043
  sim_compute_sim_state: 0.006647120316823324
  sim_compute_performance-ego: 0.00836256742477417
  sim_compute_robot_state-ego: 0.010470155874888105

ETHZ_autolab_technical_track-3-0:
  driven_any: 5.010937599371067
  sim_physics: 0.08899345636367798
  survival_time: 14.950000000000076
  driven_lanedir: 4.577225844846728
  sim_render-ego: 0.015017446676890056
  in-drivable-lane: 2.0000000000000284
  agent_compute-ego: 0.05991469542185465
  deviation-heading: 3.107661102829861
  set_robot_commands: 0.011702049573262532
  deviation-center-line: 0.6210504537375959
  driven_lanedir_consec: 3.833580549278401
  sim_compute_sim_state: 0.0069906481107076006
  sim_compute_performance-ego: 0.0088525390625
  sim_compute_robot_state-ego: 0.011127055486043294

ETHZ_autolab_technical_track-4-0:
  driven_any: 2.762396262327768
  sim_physics: 0.08754113061087472
  survival_time: 6.999999999999983
  driven_lanedir: 2.427667885320367
  sim_render-ego: 0.014842602184840611
  in-drivable-lane: 1.099999999999996
  agent_compute-ego: 0.05752250467027937
  deviation-heading: 1.4770420721076063
  set_robot_commands: 0.011198961734771728
  deviation-center-line: 0.23983115788269152
  driven_lanedir_consec: 2.426863806858925
  sim_compute_sim_state: 0.00663954530443464
  sim_compute_performance-ego: 0.00917691503252302
  sim_compute_robot_state-ego: 0.010791623592376709
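Every aggregate figure under "Other stats" is simply the corresponding per-episode value reduced with min/mean/median/max across the five episodes. A minimal sketch of that reduction, assuming the per-episode details above have been saved locally as a JSON file named per_episode.json mapping episode name to its metrics (the filename is illustrative, not something produced by the challenge server):

    import json
    from statistics import mean, median

    # Load the per-episode details; the structure is {episode name: {metric name: value}}.
    with open("per_episode.json") as f:
        episodes = json.load(f)

    # Group each metric's values across the five episodes.
    by_metric = {}
    for metrics in episodes.values():
        for name, value in metrics.items():
            by_metric.setdefault(name, []).append(value)

    # Reduce with the same statistics reported under "Other stats".
    for name, values in sorted(by_metric.items()):
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_median: {median(values)}")
        print(f"{name}_min: {min(values)}")

Running this against the details above reproduces, for example, survival_time_mean = 13.36 (four episodes at about 14.95 s plus one at 7.0 s).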
Other evaluation jobs (artefacts hidden):

28253    step1-simulation   success   yes          0:07:16
28252    step1-simulation   success   yes          0:08:12