Duckietown Challenges

Job 84315

Job ID: 84315
submission: 16666
user: Lin Wei-Chih
user label: base-image-ml
challenge: mooc-visservoing
step: sim
status: success
up to date: no. Note that this job is not up to date; the challenge has been changed.
evaluator: nogpu-production-b-spot-0-04
date started:
date completed:
duration: 0:14:11
message:
Scores

in-drivable-lane_median: 48.67499999999872
deviation-center-line_median: 1.1864010834961416
driven_lanedir_consec_median: 0.8813505944106288
survival_time_median: 59.99999999999873


Other stats
agent_compute-ego0_max: 0.005561559821644194
agent_compute-ego0_mean: 0.005464486138806752
agent_compute-ego0_median: 0.005464486138806752
agent_compute-ego0_min: 0.005367412455969309
complete-iteration_max: 0.12894243145862488
complete-iteration_mean: 0.12763768340229095
complete-iteration_median: 0.12763768340229095
complete-iteration_min: 0.12633293534595702
deviation-center-line_max: 1.7528693446672947
deviation-center-line_mean: 1.1864010834961416
deviation-center-line_min: 0.6199328223249885
deviation-heading_max: 7.264067941989873
deviation-heading_mean: 5.608373750385038
deviation-heading_median: 5.608373750385038
deviation-heading_min: 3.952679558780204
distance-from-start_max: 1.2102620052317978
distance-from-start_mean: 0.9842267019944402
distance-from-start_median: 0.9842267019944402
distance-from-start_min: 0.7581913987570826
driven_any_max: 6.246802386536951
driven_any_mean: 6.24419416949776
driven_any_median: 6.24419416949776
driven_any_min: 6.2415859524585695
driven_lanedir_consec_max: 1.1196222378050191
driven_lanedir_consec_mean: 0.8813505944106288
driven_lanedir_consec_min: 0.6430789510162385
driven_lanedir_max: 1.2419465366237905
driven_lanedir_mean: 0.9425127438200144
driven_lanedir_median: 0.9425127438200144
driven_lanedir_min: 0.6430789510162385
get_duckie_state_max: 1.7141918655636903e-06
get_duckie_state_mean: 1.6608901266055142e-06
get_duckie_state_median: 1.6608901266055142e-06
get_duckie_state_min: 1.607588387647338e-06
get_robot_state_max: 0.0034686492742050895
get_robot_state_mean: 0.003356363950025827
get_robot_state_median: 0.003356363950025827
get_robot_state_min: 0.0032440786258465643
get_state_dump_max: 0.00442098737457809
get_state_dump_mean: 0.004355452737641473
get_state_dump_median: 0.004355452737641473
get_state_dump_min: 0.004289918100704857
get_ui_image_max: 0.04011534910019391
get_ui_image_mean: 0.04002411121333469
get_ui_image_median: 0.04002411121333469
get_ui_image_min: 0.03993287332647547
in-drivable-lane_max: 52.19999999999884
in-drivable-lane_mean: 48.67499999999872
in-drivable-lane_min: 45.1499999999986
per-episodes details:

LF-small-loop-000-ego0:
  driven_any: 6.246802386536951
  get_ui_image: 0.04011534910019391
  step_physics: 0.06292186688622467
  survival_time: 59.99999999999873
  driven_lanedir: 0.6430789510162385
  get_state_dump: 0.004289918100704857
  get_robot_state: 0.0032440786258465643
  sim_render-ego0: 0.0032412604825085745
  get_duckie_state: 1.607588387647338e-06
  in-drivable-lane: 52.19999999999884
  deviation-heading: 3.952679558780204
  agent_compute-ego0: 0.005367412455969309
  complete-iteration: 0.12633293534595702
  set_robot_commands: 0.0018804885267119523
  distance-from-start: 0.7581913987570826
  deviation-center-line: 0.6199328223249885
  driven_lanedir_consec: 0.6430789510162385
  sim_compute_sim_state: 0.003512199276392902
  sim_compute_performance-ego0: 0.0016773985784119313

LF-small-loop-001-ego0:
  driven_any: 6.2415859524585695
  get_ui_image: 0.03993287332647547
  step_physics: 0.06470624533025153
  survival_time: 59.99999999999873
  driven_lanedir: 1.2419465366237905
  get_state_dump: 0.00442098737457809
  get_robot_state: 0.0034686492742050895
  sim_render-ego0: 0.0033683036388108176
  get_duckie_state: 1.7141918655636903e-06
  in-drivable-lane: 45.1499999999986
  deviation-heading: 7.264067941989873
  agent_compute-ego0: 0.005561559821644194
  complete-iteration: 0.12894243145862488
  set_robot_commands: 0.002029903326106012
  distance-from-start: 1.2102620052317978
  deviation-center-line: 1.7528693446672947
  driven_lanedir_consec: 1.1196222378050191
  sim_compute_sim_state: 0.0036511476788294505
  sim_compute_performance-ego0: 0.0017171011280755417
set_robot_commands_max: 0.002029903326106012
set_robot_commands_mean: 0.0019551959264089824
set_robot_commands_median: 0.0019551959264089824
set_robot_commands_min: 0.0018804885267119523
sim_compute_performance-ego0_max: 0.0017171011280755417
sim_compute_performance-ego0_mean: 0.0016972498532437366
sim_compute_performance-ego0_median: 0.0016972498532437366
sim_compute_performance-ego0_min: 0.0016773985784119313
sim_compute_sim_state_max: 0.0036511476788294505
sim_compute_sim_state_mean: 0.003581673477611177
sim_compute_sim_state_median: 0.003581673477611177
sim_compute_sim_state_min: 0.003512199276392902
sim_render-ego0_max: 0.0033683036388108176
sim_render-ego0_mean: 0.003304782060659696
sim_render-ego0_median: 0.003304782060659696
sim_render-ego0_min: 0.0032412604825085745
simulation-passed: 1
step_physics_max: 0.06470624533025153
step_physics_mean: 0.06381405610823809
step_physics_median: 0.06381405610823809
step_physics_min: 0.06292186688622467
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
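
The aggregate rows above are derived from the two episodes in the per-episodes details: with only two episodes the mean and median coincide, and the min and max are simply the two episode values. The following is a minimal sketch, not part of the original evaluation output, assuming standard Python; it recomputes the deviation-center-line aggregates from the per-episode numbers quoted above.

    import json
    import statistics

    # Abridged copy of the "per-episodes details" blob; only one metric is kept here.
    details = json.loads("""
    {"LF-small-loop-000-ego0": {"deviation-center-line": 0.6199328223249885},
     "LF-small-loop-001-ego0": {"deviation-center-line": 1.7528693446672947}}
    """)

    values = [ep["deviation-center-line"] for ep in details.values()]
    print(min(values))               # matches deviation-center-line_min
    print(statistics.mean(values))   # matches deviation-center-line_mean
    print(statistics.median(values)) # equals the mean for two episodes (the headline median score)
    print(max(values))               # matches deviation-center-line_max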

Highlights

Per-episode highlight images (not reproduced here) link to detailed statistics for each episode:

LF-small-loop-000
LF-small-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.