Duckietown Challenges

Job 88012

Job ID: 88012
submission: 15577
user: Adriano Almeida
user label: template-random
challenge: mooc-objdet
step: sim
status: success
up to date: Note that this job is not up to date; the challenge has been changed.
evaluator: nogpu-production-b-spot-0-04
date started:
date completed:
duration: 0:08:56
message:
survival_time_median: 27.250000000000252
driven_lanedir_consec_median: 2.7643259633965593


other stats
agent_compute-ego0_max: 0.011727267620610255
agent_compute-ego0_mean: 0.01137479056590619
agent_compute-ego0_median: 0.01137479056590619
agent_compute-ego0_min: 0.01102231351120212
complete-iteration_max: 0.1991108445560231
complete-iteration_mean: 0.19538213171798072
complete-iteration_median: 0.19538213171798072
complete-iteration_min: 0.19165341887993836
deviation-center-line_max: 0.6742775393186135
deviation-center-line_mean: 0.4924668146429858
deviation-center-line_median: 0.4924668146429858
deviation-center-line_min: 0.3106560899673582
deviation-heading_max: 1.446137875740923
deviation-heading_mean: 1.1690948785052022
deviation-heading_median: 1.1690948785052022
deviation-heading_min: 0.8920518812694817
distance-from-start_max: 13.040328082409216
distance-from-start_mean: 11.421530719906269
distance-from-start_median: 11.421530719906269
distance-from-start_min: 9.802733357403325
driven_any_max: 13.76149288628964
driven_any_mean: 11.861708296669304
driven_any_median: 11.861708296669304
driven_any_min: 9.961923707048967
driven_lanedir_consec_max: 3.6097693844991263
driven_lanedir_consec_mean: 2.7643259633965593
driven_lanedir_consec_min: 1.918882542293992
driven_lanedir_max: 3.6097693844991263
driven_lanedir_mean: 2.7643259633965593
driven_lanedir_median: 2.7643259633965593
driven_lanedir_min: 1.918882542293992
get_duckie_state_max: 0.02058399528719501
get_duckie_state_mean: 0.01896001224378782
get_duckie_state_median: 0.01896001224378782
get_duckie_state_min: 0.017336029200380635
get_robot_state_max: 0.0033459268623967056
get_robot_state_mean: 0.003235332621131178
get_robot_state_median: 0.003235332621131178
get_robot_state_min: 0.0031247383798656495
get_state_dump_max: 0.006837437615155654
get_state_dump_mean: 0.006755411869237702
get_state_dump_median: 0.006755411869237702
get_state_dump_min: 0.006673386123319749
get_ui_image_max: 0.0539129647813851
get_ui_image_mean: 0.053419145985461255
get_ui_image_median: 0.053419145985461255
get_ui_image_min: 0.0529253271895374
in-drivable-lane_max: 22.90000000000029
in-drivable-lane_mean: 20.55000000000023
in-drivable-lane_median: 20.55000000000023
in-drivable-lane_min: 18.200000000000173
per-episodes
details{"LF-long-loop-with-duckies-000-ego0": {"driven_any": 9.961923707048967, "get_ui_image": 0.0539129647813851, "step_physics": 0.06775198107451395, "survival_time": 22.90000000000019, "driven_lanedir": 1.918882542293992, "get_state_dump": 0.006837437615155654, "get_robot_state": 0.0033459268623967056, "sim_render-ego0": 0.0036009490360102104, "get_duckie_state": 0.02058399528719501, "in-drivable-lane": 18.200000000000173, "deviation-heading": 0.8920518812694817, "agent_compute-ego0": 0.011727267620610255, "complete-iteration": 0.1991108445560231, "set_robot_commands": 0.0020828927524209283, "distance-from-start": 9.802733357403325, "deviation-center-line": 0.3106560899673582, "driven_lanedir_consec": 1.918882542293992, "sim_compute_sim_state": 0.027247011791387155, "sim_compute_performance-ego0": 0.001930154226963816}, "LF-long-loop-with-duckies-001-ego0": {"driven_any": 13.76149288628964, "get_ui_image": 0.0529253271895374, "step_physics": 0.06642904138489734, "survival_time": 31.60000000000031, "driven_lanedir": 3.6097693844991263, "get_state_dump": 0.006673386123319749, "get_robot_state": 0.0031247383798656495, "sim_render-ego0": 0.0033117636308474187, "get_duckie_state": 0.017336029200380635, "in-drivable-lane": 22.90000000000029, "deviation-heading": 1.446137875740923, "agent_compute-ego0": 0.01102231351120212, "complete-iteration": 0.19165341887993836, "set_robot_commands": 0.0017877796438256334, "distance-from-start": 13.040328082409216, "deviation-center-line": 0.6742775393186135, "driven_lanedir_consec": 3.6097693844991263, "sim_compute_sim_state": 0.02732878913999922, "sim_compute_performance-ego0": 0.001638071985229685}}
set_robot_commands_max: 0.0020828927524209283
set_robot_commands_mean: 0.001935336198123281
set_robot_commands_median: 0.001935336198123281
set_robot_commands_min: 0.0017877796438256334
sim_compute_performance-ego0_max: 0.001930154226963816
sim_compute_performance-ego0_mean: 0.0017841131060967509
sim_compute_performance-ego0_median: 0.0017841131060967509
sim_compute_performance-ego0_min: 0.001638071985229685
sim_compute_sim_state_max: 0.02732878913999922
sim_compute_sim_state_mean: 0.02728790046569319
sim_compute_sim_state_median: 0.02728790046569319
sim_compute_sim_state_min: 0.027247011791387155
sim_render-ego0_max: 0.0036009490360102104
sim_render-ego0_mean: 0.0034563563334288148
sim_render-ego0_median: 0.0034563563334288148
sim_render-ego0_min: 0.0033117636308474187
simulation-passed: 1
step_physics_max: 0.06775198107451395
step_physics_mean: 0.06709051122970565
step_physics_median: 0.06709051122970565
step_physics_min: 0.06642904138489734
survival_time_max: 31.60000000000031
survival_time_mean: 27.250000000000252
survival_time_min: 22.90000000000019
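
Because this job ran exactly two episodes (see the per-episodes details above), every _median value coincides with the corresponding _mean. The following is a minimal sketch of how these per-metric aggregates could be recomputed from the per-episode details JSON; the filename details.json is a hypothetical assumption (the JSON copied from this page saved to a local file), not an artifact produced by the evaluator.

import json
import statistics

# Load the per-episode details shown above, assumed saved locally as details.json.
with open("details.json") as f:
    episodes = json.load(f)  # {"LF-long-loop-with-duckies-000-ego0": {...}, ...}

# Collect each metric across episodes.
values = {}
for ep_stats in episodes.values():
    for metric, value in ep_stats.items():
        values.setdefault(metric, []).append(value)

# Report min/median/mean/max, matching the aggregate rows under "other stats".
for metric in sorted(values):
    vs = values[metric]
    print(f"{metric}_min: {min(vs)}")
    print(f"{metric}_median: {statistics.median(vs)}")  # equals the mean when only 2 episodes exist
    print(f"{metric}_mean: {statistics.mean(vs)}")
    print(f"{metric}_max: {max(vs)}")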

Highlights



LF-long-loop-with-duckies-000

LF-long-loop-with-duckies-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.