Duckietown Challenges

Job 77415

Job ID: 77415
submission: 15607
user: Liam Paull 🇨🇦
user label: objdet exercise
challenge: mooc-visservoing
step: sim
status: success
up to date: No; the challenge has been changed since this job was evaluated.
evaluator: nogpu-production-b-spot-0-08
date started:
date completed:
duration: 0:14:58
message:
in-drivable-lane_median: 17.899999999999448
deviation-center-line_median: 4.763522637209633
driven_lanedir_consec_median: 5.420484394531957
survival_time_median: 59.99999999999873


other stats
agent_compute-ego0_max: 0.023486397447038152
agent_compute-ego0_mean: 0.01983993635090265
agent_compute-ego0_median: 0.01983993635090265
agent_compute-ego0_min: 0.016193475254767147
complete-iteration_max: 0.1402110767602722
complete-iteration_mean: 0.13930718100735984
complete-iteration_median: 0.13930718100735984
complete-iteration_min: 0.13840328525444748
deviation-center-line_max: 5.340632304718403
deviation-center-line_mean: 4.763522637209633
deviation-center-line_min: 4.186412969700863
deviation-heading_max: 7.653326100238271
deviation-heading_mean: 5.7782375781274355
deviation-heading_median: 5.7782375781274355
deviation-heading_min: 3.9031490560166
distance-from-start_max: 1.5323801436181732
distance-from-start_mean: 1.2847634284570906
distance-from-start_median: 1.2847634284570906
distance-from-start_min: 1.0371467132960082
driven_any_max: 7.920860694378992
driven_any_mean: 7.917701108309985
driven_any_median: 7.917701108309985
driven_any_min: 7.914541522240978
driven_lanedir_consec_max: 6.525757516077108
driven_lanedir_consec_mean: 5.420484394531957
driven_lanedir_consec_min: 4.315211272986808
driven_lanedir_max: 6.525757516077108
driven_lanedir_mean: 5.420484394531957
driven_lanedir_median: 5.420484394531957
driven_lanedir_min: 4.315211272986808
get_duckie_state_max: 1.3846541126006648e-06
get_duckie_state_mean: 1.3643061489387913e-06
get_duckie_state_median: 1.3643061489387913e-06
get_duckie_state_min: 1.3439581852769175e-06
get_robot_state_max: 0.0033692779588659636
get_robot_state_mean: 0.003356843467159732
get_robot_state_median: 0.003356843467159732
get_robot_state_min: 0.0033444089754535
get_state_dump_max: 0.004395538722347161
get_state_dump_mean: 0.0043590627641701685
get_state_dump_median: 0.0043590627641701685
get_state_dump_min: 0.004322586805993175
get_ui_image_max: 0.04003622966642483
get_ui_image_mean: 0.03993988652511203
get_ui_image_median: 0.03993988652511203
get_ui_image_min: 0.039843543383799224
in-drivable-lane_max: 26.549999999999137
in-drivable-lane_mean: 17.899999999999448
in-drivable-lane_min: 9.249999999999758
per-episodes details:
{"LF-small-loop-000-ego0": {"driven_any": 7.920860694378992, "get_ui_image": 0.04003622966642483, "step_physics": 0.05835659497981266, "survival_time": 59.99999999999873, "driven_lanedir": 4.315211272986808, "get_state_dump": 0.004322586805993175, "get_robot_state": 0.0033444089754535, "sim_render-ego0": 0.003324241066455444, "get_duckie_state": 1.3439581852769175e-06, "in-drivable-lane": 26.549999999999137, "deviation-heading": 3.9031490560166, "agent_compute-ego0": 0.023486397447038152, "complete-iteration": 0.1402110767602722, "set_robot_commands": 0.0020133394881350908, "distance-from-start": 1.0371467132960082, "deviation-center-line": 4.186412969700863, "driven_lanedir_consec": 4.315211272986808, "sim_compute_sim_state": 0.0035384149972247044, "sim_compute_performance-ego0": 0.0017021097410330666}, "LF-small-loop-001-ego0": {"driven_any": 7.914541522240978, "get_ui_image": 0.039843543383799224, "step_physics": 0.06374562372275931, "survival_time": 59.99999999999873, "driven_lanedir": 6.525757516077108, "get_state_dump": 0.004395538722347161, "get_robot_state": 0.0033692779588659636, "sim_render-ego0": 0.003352215248381069, "get_duckie_state": 1.3846541126006648e-06, "in-drivable-lane": 9.249999999999758, "deviation-heading": 7.653326100238271, "agent_compute-ego0": 0.016193475254767147, "complete-iteration": 0.13840328525444748, "set_robot_commands": 0.0020394664719936553, "distance-from-start": 1.5323801436181732, "deviation-center-line": 5.340632304718403, "driven_lanedir_consec": 6.525757516077108, "sim_compute_sim_state": 0.003626485152804385, "sim_compute_performance-ego0": 0.0017480125633703482}}
set_robot_commands_max: 0.0020394664719936553
set_robot_commands_mean: 0.002026402980064373
set_robot_commands_median: 0.002026402980064373
set_robot_commands_min: 0.0020133394881350908
sim_compute_performance-ego0_max: 0.0017480125633703482
sim_compute_performance-ego0_mean: 0.0017250611522017074
sim_compute_performance-ego0_median: 0.0017250611522017074
sim_compute_performance-ego0_min: 0.0017021097410330666
sim_compute_sim_state_max: 0.003626485152804385
sim_compute_sim_state_mean: 0.003582450075014545
sim_compute_sim_state_median: 0.003582450075014545
sim_compute_sim_state_min: 0.0035384149972247044
sim_render-ego0_max: 0.003352215248381069
sim_render-ego0_mean: 0.0033382281574182568
sim_render-ego0_median: 0.0033382281574182568
sim_render-ego0_min: 0.003324241066455444
simulation-passed: 1
step_physics_max: 0.06374562372275931
step_physics_mean: 0.061051109351285986
step_physics_median: 0.061051109351285986
step_physics_min: 0.05835659497981266
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
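
The min / mean / median / max rows above are simple aggregates of the per-episode values listed under "per-episodes details". A minimal sketch of how they could be recomputed, assuming the JSON block has been saved to a local file (the filename per_episode_details.json is a placeholder, not an artifact produced by the job):

    import json
    from statistics import mean, median

    # Placeholder filename: paste the "per-episodes details" JSON into this file.
    with open("per_episode_details.json") as f:
        per_episode = json.load(f)

    # Gather each metric's value across the episodes...
    metric_values = {}
    for episode, stats in per_episode.items():
        for name, value in stats.items():
            metric_values.setdefault(name, []).append(value)

    # ...then aggregate the same way the results table does.
    for name, values in sorted(metric_values.items()):
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_median: {median(values)}")
        print(f"{name}_min: {min(values)}")

With only two episodes the mean and median coincide, which is why every _mean and _median pair in the table above is identical.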

Highlights

Detailed per-episode statistics are linked from the episode images (not reproduced here):

LF-small-loop-000

LF-small-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.