
Job 80775

Job ID: 80775
submission: 16391
user: Awni Altabaa
user label: template-ros
challenge: mooc-modcon
step: sim
status: success
up to date: no (this job is not up to date; the challenge has been changed since it ran)
evaluator: nogpu-production-b-spot-0-03
date started:
date completed:
duration: 0:07:51
message:
in-drivable-lane_median: 19.17500000000012
deviation-center-line_median: 0.4305861755927771
driven_lanedir_consec_median: 0.4412730717791963
survival_time_median: 23.725000000000147
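These headline metrics are the medians of the corresponding per-episode values over the two evaluated episodes (listed under per-episodes in "other stats" below); with only two episodes, each median is simply the mean of the two values. A minimal sketch of that check in plain Python, using survival_time as the example:

```python
from statistics import median

# Per-episode survival times, copied from the per-episodes details below.
survival_times = [33.55000000000023, 13.900000000000064]

# With two episodes the median coincides with their mean.
print(median(survival_times))  # -> 23.725000000000147 (survival_time_median)
```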


other stats
agent_compute-ego0_max: 0.015991828980899993
agent_compute-ego0_mean: 0.010730651647417103
agent_compute-ego0_median: 0.010730651647417103
agent_compute-ego0_min: 0.0054694743139342165
complete-iteration_max: 0.1904316083306358
complete-iteration_mean: 0.18385835222735872
complete-iteration_median: 0.18385835222735872
complete-iteration_min: 0.17728509612408164
deviation-center-line_max: 0.6684126512142614
deviation-center-line_mean: 0.4305861755927771
deviation-center-line_min: 0.19275969997129283
deviation-heading_max: 5.331063071166181
deviation-heading_mean: 3.5579425304409424
deviation-heading_median: 3.5579425304409424
deviation-heading_min: 1.7848219897157036
distance-from-start_max: 3.764493740035152
distance-from-start_mean: 2.680853842907464
distance-from-start_median: 2.680853842907464
distance-from-start_min: 1.5972139457797756
driven_any_max: 4.54366296561745
driven_any_mean: 3.163513065412765
driven_any_median: 3.163513065412765
driven_any_min: 1.7833631652080797
driven_lanedir_consec_max: 0.6984863144308486
driven_lanedir_consec_mean: 0.4412730717791963
driven_lanedir_consec_min: 0.18405982912754393
driven_lanedir_max: 0.6984863144308486
driven_lanedir_mean: 0.4412730717791963
driven_lanedir_median: 0.4412730717791963
driven_lanedir_min: 0.18405982912754393
get_duckie_state_max: 1.081753344762893e-06
get_duckie_state_mean: 1.0548866197993311e-06
get_duckie_state_median: 1.0548866197993311e-06
get_duckie_state_min: 1.0280198948357694e-06
get_robot_state_max: 0.0031072572140710755
get_robot_state_mean: 0.0030870718039339527
get_robot_state_median: 0.0030870718039339527
get_robot_state_min: 0.00306688639379683
get_state_dump_max: 0.003869457613854181
get_state_dump_mean: 0.0038550246794957473
get_state_dump_median: 0.0038550246794957473
get_state_dump_min: 0.003840591745137314
get_ui_image_max: 0.06190437504223415
get_ui_image_mean: 0.06119553001642349
get_ui_image_median: 0.06119553001642349
get_ui_image_min: 0.060486684990612834
in-drivable-lane_max: 26.50000000000019
in-drivable-lane_mean: 19.17500000000012
in-drivable-lane_min: 11.850000000000048
per-episodes details (parsed in the sketch after this list):
  LF-full-loop-000-ego0: {"driven_any": 4.54366296561745, "get_ui_image": 0.06190437504223415, "step_physics": 0.07296742498874664, "survival_time": 33.55000000000023, "driven_lanedir": 0.6984863144308486, "get_state_dump": 0.003869457613854181, "get_robot_state": 0.00306688639379683, "sim_render-ego0": 0.0031584621894927252, "get_duckie_state": 1.081753344762893e-06, "in-drivable-lane": 26.50000000000019, "deviation-heading": 5.331063071166181, "agent_compute-ego0": 0.015991828980899993, "complete-iteration": 0.1904316083306358, "set_robot_commands": 0.0017793788796379452, "distance-from-start": 3.764493740035152, "deviation-center-line": 0.6684126512142614, "driven_lanedir_consec": 0.6984863144308486, "sim_compute_sim_state": 0.02603164847408022, "sim_compute_performance-ego0": 0.0015885049388522194}
  LF-full-loop-001-ego0: {"driven_any": 1.7833631652080797, "get_ui_image": 0.060486684990612834, "step_physics": 0.07288740185426555, "survival_time": 13.900000000000064, "driven_lanedir": 0.18405982912754393, "get_state_dump": 0.003840591745137314, "get_robot_state": 0.0031072572140710755, "sim_render-ego0": 0.0032281200518317547, "get_duckie_state": 1.0280198948357694e-06, "in-drivable-lane": 11.850000000000048, "deviation-heading": 1.7848219897157036, "agent_compute-ego0": 0.0054694743139342165, "complete-iteration": 0.17728509612408164, "set_robot_commands": 0.0017706886414558655, "distance-from-start": 1.5972139457797756, "deviation-center-line": 0.19275969997129283, "driven_lanedir_consec": 0.18405982912754393, "sim_compute_sim_state": 0.024812372781897105, "sim_compute_performance-ego0": 0.0016114549397567694}
set_robot_commands_max: 0.0017793788796379452
set_robot_commands_mean: 0.0017750337605469057
set_robot_commands_median: 0.0017750337605469057
set_robot_commands_min: 0.0017706886414558655
sim_compute_performance-ego0_max: 0.0016114549397567694
sim_compute_performance-ego0_mean: 0.0015999799393044945
sim_compute_performance-ego0_median: 0.0015999799393044945
sim_compute_performance-ego0_min: 0.0015885049388522194
sim_compute_sim_state_max: 0.02603164847408022
sim_compute_sim_state_mean: 0.025422010627988663
sim_compute_sim_state_median: 0.025422010627988663
sim_compute_sim_state_min: 0.024812372781897105
sim_render-ego0_max: 0.0032281200518317547
sim_render-ego0_mean: 0.00319329112066224
sim_render-ego0_median: 0.00319329112066224
sim_render-ego0_min: 0.0031584621894927252
simulation-passed: 1
step_physics_max: 0.07296742498874664
step_physics_mean: 0.0729274134215061
step_physics_median: 0.0729274134215061
step_physics_min: 0.07288740185426555
survival_time_max: 33.55000000000023
survival_time_mean: 23.725000000000147
survival_time_min: 13.900000000000064
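The aggregate entries above (the *_min, *_mean, *_median, *_max values) can be recomputed from the per-episodes details. A minimal sketch, assuming that JSON object has been saved locally as details.json (a hypothetical filename, not a file produced by the evaluator), reproducing the same per-metric aggregation:

```python
import json
from statistics import mean, median

# Load the per-episodes details shown above; "details.json" is an assumed
# local copy of that JSON object.
with open("details.json") as f:
    per_episode = json.load(f)

# Gather each metric's values across the episodes.
values_by_metric = {}
for episode_name, metrics in per_episode.items():
    for key, value in metrics.items():
        values_by_metric.setdefault(key, []).append(value)

# Aggregate the same way the "other stats" block does: max, mean, median, min.
for key in sorted(values_by_metric):
    vals = values_by_metric[key]
    print(f"{key}_max    {max(vals)}")
    print(f"{key}_mean   {mean(vals)}")
    print(f"{key}_median {median(vals)}")
    print(f"{key}_min    {min(vals)}")
```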

Highlights

LF-full-loop-000

LF-full-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.