
Job 42336

Job ID: 42336
submission: 11424
user: Frank (Chude) Qian 🇨🇦
user label: template-pytorch
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (the challenge has been changed since this job ran)
evaluator: reg02-41dc3689f9ce-1
date started:
date completed:
duration: 0:06:32
message:
driven_lanedir_consec_median: 0.497154433760294
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.6221443115952225
in-drivable-lane_median: 7.300000000000038


other stats
agent_compute-ego_max: 0.010530442396799724
agent_compute-ego_mean: 0.010325237115224204
agent_compute-ego_median: 0.010298550526301064
agent_compute-ego_min: 0.010173405011494954
complete-iteration_max: 0.14529285113016766
complete-iteration_mean: 0.14337913155555726
complete-iteration_median: 0.14333372871081035
complete-iteration_min: 0.1415562176704407
deviation-center-line_max: 0.6916710899261922
deviation-center-line_mean: 0.5452702630163141
deviation-center-line_min: 0.2451213389486193
deviation-heading_max: 5.93711713384486
deviation-heading_mean: 5.651901930043101
deviation-heading_median: 5.606441293432905
deviation-heading_min: 5.4576079994617315
driven_any_max: 1.775538431446036
driven_any_mean: 1.5787518665094034
driven_any_median: 1.5507524180976335
driven_any_min: 1.4379641983963103
driven_lanedir_consec_max: 0.5655927098969169
driven_lanedir_consec_mean: 0.4901871700463795
driven_lanedir_consec_min: 0.4008471027680131
driven_lanedir_max: 0.5655927098969169
driven_lanedir_mean: 0.4901871700463795
driven_lanedir_median: 0.497154433760294
driven_lanedir_min: 0.4008471027680131
get_duckie_state_max: 1.433690388997396e-06
get_duckie_state_mean: 1.3897816340128585e-06
get_duckie_state_median: 1.379251480102539e-06
get_duckie_state_min: 1.3669331868489583e-06
get_robot_state_max: 0.00990708827972412
get_robot_state_mean: 0.00975086530049642
get_robot_state_median: 0.00973790407180786
get_robot_state_min: 0.009620564778645832
get_state_dump_max: 0.008658902645111084
get_state_dump_mean: 0.008489132324854534
get_state_dump_median: 0.008466012477874756
get_state_dump_min: 0.008365601698557535
get_ui_image_max: 0.02437507629394531
get_ui_image_mean: 0.02429071327050527
get_ui_image_median: 0.024292035102844237
get_ui_image_min: 0.02420370658238729
in-drivable-lane_max: 7.650000000000044
in-drivable-lane_mean: 7.362500000000038
in-drivable-lane_min: 7.200000000000035
per-episode details (the aggregate statistics in this section are taken over these four episodes; a recomputation sketch follows the statistics):

ETHZ_autolab_technical_track-sc0-0-ego:
  driven_any: 1.4496206382836292
  get_ui_image: 0.02420370658238729
  step_physics: 0.06011564413706461
  survival_time: 14.950000000000076
  driven_lanedir: 0.4316021874854914
  get_state_dump: 0.008399946689605713
  sim_render-ego: 0.004552782376607259
  get_robot_state: 0.009620564778645832
  get_duckie_state: 1.433690388997396e-06
  in-drivable-lane: 7.650000000000044
  agent_compute-ego: 0.010173405011494954
  deviation-heading: 5.4576079994617315
  complete-iteration: 0.1415562176704407
  set_robot_commands: 0.003515135447184245
  deviation-center-line: 0.6268568884605893
  driven_lanedir_consec: 0.4316021874854914
  sim_compute_sim_state: 0.017869106928507485
  sim_compute_performance-ego: 0.003025852839152018

ETHZ_autolab_technical_track-sc1-0-ego:
  driven_any: 1.775538431446036
  get_ui_image: 0.02421919822692871
  step_physics: 0.06137986739476522
  survival_time: 14.950000000000076
  driven_lanedir: 0.5627066800350966
  get_state_dump: 0.008365601698557535
  sim_render-ego: 0.004565001328786214
  get_robot_state: 0.009696924686431884
  get_duckie_state: 1.384417215983073e-06
  in-drivable-lane: 7.350000000000035
  agent_compute-ego: 0.010373535950978598
  deviation-heading: 5.685098907142519
  complete-iteration: 0.14218524614969888
  set_robot_commands: 0.003498528798421224
  deviation-center-line: 0.6916710899261922
  driven_lanedir_consec: 0.5627066800350966
  sim_compute_sim_state: 0.017016329765319825
  sim_compute_performance-ego: 0.002989523410797119

ETHZ_autolab_technical_track-sc2-0-ego:
  driven_any: 1.4379641983963103
  get_ui_image: 0.02437507629394531
  step_physics: 0.0618480396270752
  survival_time: 14.950000000000076
  driven_lanedir: 0.4008471027680131
  get_state_dump: 0.008658902645111084
  sim_render-ego: 0.004621588389078776
  get_robot_state: 0.00990708827972412
  get_duckie_state: 1.3740857442220053e-06
  in-drivable-lane: 7.25000000000004
  agent_compute-ego: 0.010530442396799724
  deviation-heading: 5.93711713384486
  complete-iteration: 0.1444822112719218
  set_robot_commands: 0.003517900307973226
  deviation-center-line: 0.6174317347298558
  driven_lanedir_consec: 0.4008471027680131
  sim_compute_sim_state: 0.017894496122996012
  sim_compute_performance-ego: 0.003046244780222575

ETHZ_autolab_technical_track-sc3-0-ego:
  driven_any: 1.651884197911638
  get_ui_image: 0.024364871978759764
  step_physics: 0.0608271590868632
  survival_time: 14.950000000000076
  driven_lanedir: 0.5655927098969169
  get_state_dump: 0.008532078266143798
  sim_render-ego: 0.004477720260620117
  get_robot_state: 0.009778883457183838
  get_duckie_state: 1.3669331868489583e-06
  in-drivable-lane: 7.200000000000035
  agent_compute-ego: 0.010223565101623536
  deviation-heading: 5.527783679723291
  complete-iteration: 0.14529285113016766
  set_robot_commands: 0.003607897758483887
  deviation-center-line: 0.2451213389486193
  driven_lanedir_consec: 0.5655927098969169
  sim_compute_sim_state: 0.020417743523915607
  sim_compute_performance-ego: 0.0029817668596903483
set_robot_commands_max: 0.003607897758483887
set_robot_commands_mean: 0.0035348655780156456
set_robot_commands_median: 0.0035165178775787356
set_robot_commands_min: 0.003498528798421224
sim_compute_performance-ego_max: 0.003046244780222575
sim_compute_performance-ego_mean: 0.003010846972465515
sim_compute_performance-ego_median: 0.003007688124974569
sim_compute_performance-ego_min: 0.0029817668596903483
sim_compute_sim_state_max: 0.020417743523915607
sim_compute_sim_state_mean: 0.01829941908518473
sim_compute_sim_state_median: 0.01788180152575175
sim_compute_sim_state_min: 0.017016329765319825
sim_render-ego_max: 0.004621588389078776
sim_render-ego_mean: 0.0045542730887730915
sim_render-ego_median: 0.004558891852696737
sim_render-ego_min: 0.004477720260620117
simulation-passed: 1
step_physics_max: 0.0618480396270752
step_physics_mean: 0.061042677561442056
step_physics_median: 0.061103513240814214
step_physics_min: 0.06011564413706461
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
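
The aggregate entries above (the _min, _max, _mean and _median values) are consistent with plain per-episode statistics over the four episodes listed in the per-episode details; for example, the median of the four driven_lanedir_consec values reproduces the headline driven_lanedir_consec_median. A minimal Python sketch of that recomputation (illustrative only, not the evaluator's code):

import statistics

# driven_lanedir_consec for each episode, copied from the per-episode details above
driven_lanedir_consec = {
    "ETHZ_autolab_technical_track-sc0-0-ego": 0.4316021874854914,
    "ETHZ_autolab_technical_track-sc1-0-ego": 0.5627066800350966,
    "ETHZ_autolab_technical_track-sc2-0-ego": 0.4008471027680131,
    "ETHZ_autolab_technical_track-sc3-0-ego": 0.5655927098969169,
}

values = list(driven_lanedir_consec.values())
print("driven_lanedir_consec_min   ", min(values))                # 0.4008471027680131
print("driven_lanedir_consec_max   ", max(values))                # 0.5655927098969169
print("driven_lanedir_consec_mean  ", statistics.mean(values))    # ~0.4901871700463795
print("driven_lanedir_consec_median", statistics.median(values))  # ~0.497154433760294

With only four episodes, the median is the average of the two middle values, which is why the reported 0.497154433760294 does not appear verbatim among the per-episode driven_lanedir_consec values.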

Highlights


Highlight images (click an image on the challenge page to see detailed statistics) for the following episodes:

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.