Duckietown Challenges

Job 17179

Job ID: 17179
Submission: 2274
User: Andrea Censi 🇨🇭
User label: Solution template
Challenge: aido2_LF_r2-v4
Step: step1-simulation
Status: success
Up to date: yes
Evaluator: idsc-rudolf-14710
Date started:
Date completed:
Duration: 0:07:13
Message:
Scores

driven_lanedir_consec_median: 0.6859990822522983
survival_time_median: 3.399999999999996
deviation-center-line_median: 0.1202293815485396
in-drivable-lane_median: 1.3499999999999952


Other stats

agent_compute-ego_max: 0.15802546826804556
agent_compute-ego_mean: 0.15400796075210005
agent_compute-ego_median: 0.15507365740262546
agent_compute-ego_min: 0.15029484272003174
deviation-center-line_max: 0.20784364836514316
deviation-center-line_mean: 0.14265024925288242
deviation-center-line_min: 0.1070055904497026
deviation-heading_max: 0.3886055345167397
deviation-heading_mean: 0.3621006277452553
deviation-heading_median: 0.37092376484647793
deviation-heading_min: 0.30707254041461957
driven_any_max: 1.4782765184420763
driven_any_mean: 1.131547350619698
driven_any_median: 1.0097725221691731
driven_any_min: 0.9477327875430368
driven_lanedir_consec_max: 0.9654651499017186
driven_lanedir_consec_mean: 0.7418830599145954
driven_lanedir_consec_min: 0.5879893173150568
driven_lanedir_max: 0.9654651499017186
driven_lanedir_mean: 0.7418830599145954
driven_lanedir_median: 0.6859990822522983
driven_lanedir_min: 0.5879893173150568
in-drivable-lane_max: 2.0499999999999927
in-drivable-lane_mean: 1.2399999999999956
in-drivable-lane_min: 0
Per-episode details (the aggregate rows in this section are statistics over these values; see the sketch after the stats list):

ETHZ_autolab_technical_track-0-0:
  driven_any: 1.0097725221691731
  sim_physics: 0.0035273082116070915
  survival_time: 3.399999999999996
  driven_lanedir: 0.5879893173150568
  sim_render-ego: 0.07074875340742223
  in-drivable-lane: 1.3499999999999952
  agent_compute-ego: 0.15510188130771413
  deviation-heading: 0.3886055345167397
  set_robot_commands: 0.07960655408747055
  deviation-center-line: 0.1128230942788872
  driven_lanedir_consec: 0.5879893173150568
  sim_compute_sim_state: 0.043531954288482666
  sim_compute_performance-ego: 0.06995923378888298
  sim_compute_robot_state-ego: 0.2672942701508017

ETHZ_autolab_technical_track-1-0:
  driven_any: 0.9477327875430368
  sim_physics: 0.0033820394485715834
  survival_time: 3.149999999999997
  driven_lanedir: 0.6859990822522983
  sim_render-ego: 0.07048966392638191
  in-drivable-lane: 0.7999999999999972
  agent_compute-ego: 0.1515439540620834
  deviation-heading: 0.3652178406992292
  set_robot_commands: 0.07791074117024739
  deviation-center-line: 0.1070055904497026
  driven_lanedir_consec: 0.6859990822522983
  sim_compute_sim_state: 0.04336022952246287
  sim_compute_performance-ego: 0.06992990251571413
  sim_compute_robot_state-ego: 0.07747689126029847

ETHZ_autolab_technical_track-2-0:
  driven_any: 0.9705553879075464
  sim_physics: 0.003409741475031926
  survival_time: 3.2499999999999964
  driven_lanedir: 0.9654651499017186
  sim_render-ego: 0.07316273909348708
  in-drivable-lane: 0
  agent_compute-ego: 0.15507365740262546
  deviation-heading: 0.30707254041461957
  set_robot_commands: 0.07968624921945425
  deviation-center-line: 0.20784364836514316
  driven_lanedir_consec: 0.9654651499017186
  sim_compute_sim_state: 0.04469312887925368
  sim_compute_performance-ego: 0.07166186112623948
  sim_compute_robot_state-ego: 0.08088872982905461

ETHZ_autolab_technical_track-3-0:
  driven_any: 1.251399537036658
  sim_physics: 0.003636935862099252
  survival_time: 4.099999999999993
  driven_lanedir: 0.6183498913595016
  sim_render-ego: 0.07109042493308462
  in-drivable-lane: 1.999999999999993
  agent_compute-ego: 0.15802546826804556
  deviation-heading: 0.37092376484647793
  set_robot_commands: 0.07895466467229331
  deviation-center-line: 0.1202293815485396
  driven_lanedir_consec: 0.6183498913595016
  sim_compute_sim_state: 0.043736719503635314
  sim_compute_performance-ego: 0.07222929524212349
  sim_compute_robot_state-ego: 0.08159367340367014

ETHZ_autolab_technical_track-4-0:
  driven_any: 1.4782765184420763
  sim_physics: 0.003357553482055664
  survival_time: 4.99999999999999
  driven_lanedir: 0.8516118587444015
  sim_render-ego: 0.0714106273651123
  in-drivable-lane: 2.0499999999999927
  agent_compute-ego: 0.15029484272003174
  deviation-heading: 0.3786834582492101
  set_robot_commands: 0.0775247859954834
  deviation-center-line: 0.16534953162213956
  driven_lanedir_consec: 0.8516118587444015
  sim_compute_sim_state: 0.04196627140045166
  sim_compute_performance-ego: 0.06776957035064697
  sim_compute_robot_state-ego: 0.07649435043334961
set_robot_commands_max: 0.07968624921945425
set_robot_commands_mean: 0.07873659902898979
set_robot_commands_median: 0.07895466467229331
set_robot_commands_min: 0.0775247859954834
sim_compute_performance-ego_max: 0.07222929524212349
sim_compute_performance-ego_mean: 0.07030997260472141
sim_compute_performance-ego_median: 0.06995923378888298
sim_compute_performance-ego_min: 0.06776957035064697
sim_compute_robot_state-ego_max: 0.2672942701508017
sim_compute_robot_state-ego_mean: 0.11674958301543492
sim_compute_robot_state-ego_median: 0.08088872982905461
sim_compute_robot_state-ego_min: 0.07649435043334961
sim_compute_sim_state_max: 0.04469312887925368
sim_compute_sim_state_mean: 0.04345766071885724
sim_compute_sim_state_median: 0.043531954288482666
sim_compute_sim_state_min: 0.04196627140045166
sim_physics_max: 0.003636935862099252
sim_physics_mean: 0.003462715695873104
sim_physics_median: 0.003409741475031926
sim_physics_min: 0.003357553482055664
sim_render-ego_max: 0.07316273909348708
sim_render-ego_mean: 0.07138044174509764
sim_render-ego_median: 0.07109042493308462
sim_render-ego_min: 0.07048966392638191
simulation-passed: 1
survival_time_max: 4.99999999999999
survival_time_mean: 3.779999999999995
survival_time_min: 3.149999999999997
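
The per-metric rows above (the _min, _max, _mean, and _median entries) are statistics taken over the five episodes listed under "Per-episode details". A minimal sketch of that aggregation in Python, assuming the per-episode details are available as a parsed dict; this is not the official evaluator code:

import statistics

def aggregate(per_episode_details):
    """Return {metric: {"min", "max", "mean", "median"}} over all episodes."""
    all_metrics = set()
    for episode in per_episode_details.values():
        all_metrics.update(episode)  # collect every metric name that appears
    stats = {}
    for metric in sorted(all_metrics):
        values = [ep[metric] for ep in per_episode_details.values() if metric in ep]
        stats[metric] = {
            "min": min(values),
            "max": max(values),
            "mean": statistics.mean(values),
            "median": statistics.median(values),
        }
    return stats

For example, feeding it the five driven_lanedir_consec values above gives a median of 0.6859990822522983 and a mean of 0.7418830599145954, which match the driven_lanedir_consec_median and driven_lanedir_consec_mean rows (up to floating-point rounding).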

Highlights

The highlights are hidden.

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.