Duckietown Challenges

Job 83481

Job ID: 83481
submission: 16607
user: Arwa Alabdulkarim
user label: base-image-ml
challenge: mooc-visservoing
step: sim
status: success
up to date: No (the challenge has been changed since this job ran).
evaluator: gpu-production-spot-0-04
date started:
date completed:
duration: 0:06:52
message:
in-drivable-lane_median: 3.1500000000000448
deviation-center-line_median: 0.4431919943602463
driven_lanedir_consec_median: 1.002518157618047
survival_time_median: 13.350000000000056


other stats
agent_compute-ego0_max: 0.005905123487595589
agent_compute-ego0_mean: 0.0058699450238631195
agent_compute-ego0_median: 0.0058699450238631195
agent_compute-ego0_min: 0.005834766560130649
complete-iteration_max: 0.11608931422233582
complete-iteration_mean: 0.11578303865260547
complete-iteration_median: 0.11578303865260547
complete-iteration_min: 0.11547676308287516
deviation-center-line_max: 0.6579687214554257
deviation-center-line_mean: 0.4431919943602463
deviation-center-line_min: 0.2284152672650669
deviation-heading_max: 1.1645310322731963
deviation-heading_mean: 1.1530589292923237
deviation-heading_median: 1.1530589292923237
deviation-heading_min: 1.1415868263114508
distance-from-start_max: 1.250877136816668
distance-from-start_mean: 1.2127296383075008
distance-from-start_median: 1.2127296383075008
distance-from-start_min: 1.1745821397983336
driven_any_max: 1.389934167155075
driven_any_mean: 1.2824933070238045
driven_any_median: 1.2824933070238045
driven_any_min: 1.175052446892534
driven_lanedir_consec_max: 1.2268793870284185
driven_lanedir_consec_mean: 1.002518157618047
driven_lanedir_consec_min: 0.7781569282076757
driven_lanedir_max: 1.2268793870284185
driven_lanedir_mean: 1.002518157618047
driven_lanedir_median: 1.002518157618047
driven_lanedir_min: 0.7781569282076757
get_duckie_state_max: 1.2757049666510688e-06
get_duckie_state_mean: 1.272834223231107e-06
get_duckie_state_median: 1.272834223231107e-06
get_duckie_state_min: 1.2699634798111451e-06
get_robot_state_max: 0.0035215087475315215
get_robot_state_mean: 0.0034891541339590556
get_robot_state_median: 0.0034891541339590556
get_robot_state_min: 0.0034567995203865897
get_state_dump_max: 0.004545908781789964
get_state_dump_mean: 0.004517055193369534
get_state_dump_median: 0.004517055193369534
get_state_dump_min: 0.004488201604949104
get_ui_image_max: 0.0286668481098281
get_ui_image_mean: 0.02856051199485324
get_ui_image_median: 0.02856051199485324
get_ui_image_min: 0.028454175879878384
in-drivable-lane_max: 4.150000000000059
in-drivable-lane_mean: 3.1500000000000448
in-drivable-lane_min: 2.1500000000000306
per-episodes details:
{
  "LF-small-loop-000-ego0": {
    "driven_any": 1.175052446892534,
    "get_ui_image": 0.028454175879878384,
    "step_physics": 0.06279815204681889,
    "survival_time": 12.35000000000004,
    "driven_lanedir": 0.7781569282076757,
    "get_state_dump": 0.004545908781789964,
    "get_robot_state": 0.0035215087475315215,
    "sim_render-ego0": 0.0034963642397234517,
    "get_duckie_state": 1.2699634798111451e-06,
    "in-drivable-lane": 4.150000000000059,
    "deviation-heading": 1.1415868263114508,
    "agent_compute-ego0": 0.005905123487595589,
    "complete-iteration": 0.11608931422233582,
    "set_robot_commands": 0.0020976172339531684,
    "distance-from-start": 1.1745821397983336,
    "deviation-center-line": 0.2284152672650669,
    "driven_lanedir_consec": 0.7781569282076757,
    "sim_compute_sim_state": 0.003354942606341454,
    "sim_compute_performance-ego0": 0.00183704879976088
  },
  "LF-small-loop-001-ego0": {
    "driven_any": 1.389934167155075,
    "get_ui_image": 0.0286668481098281,
    "step_physics": 0.061634679635365806,
    "survival_time": 14.350000000000067,
    "driven_lanedir": 1.2268793870284185,
    "get_state_dump": 0.004488201604949104,
    "get_robot_state": 0.0034567995203865897,
    "sim_render-ego0": 0.003417412439982096,
    "get_duckie_state": 1.2757049666510688e-06,
    "in-drivable-lane": 2.1500000000000306,
    "deviation-heading": 1.1645310322731963,
    "agent_compute-ego0": 0.005834766560130649,
    "complete-iteration": 0.11547676308287516,
    "set_robot_commands": 0.0021132652958234153,
    "distance-from-start": 1.250877136816668,
    "deviation-center-line": 0.6579687214554257,
    "driven_lanedir_consec": 1.2268793870284185,
    "sim_compute_sim_state": 0.003998895486195882,
    "sim_compute_performance-ego0": 0.001791050864590539
  }
}
set_robot_commands_max: 0.0021132652958234153
set_robot_commands_mean: 0.002105441264888292
set_robot_commands_median: 0.002105441264888292
set_robot_commands_min: 0.0020976172339531684
sim_compute_performance-ego0_max: 0.00183704879976088
sim_compute_performance-ego0_mean: 0.0018140498321757095
sim_compute_performance-ego0_median: 0.0018140498321757095
sim_compute_performance-ego0_min: 0.001791050864590539
sim_compute_sim_state_max: 0.003998895486195882
sim_compute_sim_state_mean: 0.003676919046268668
sim_compute_sim_state_median: 0.003676919046268668
sim_compute_sim_state_min: 0.003354942606341454
sim_render-ego0_max: 0.0034963642397234517
sim_render-ego0_mean: 0.003456888339852774
sim_render-ego0_median: 0.003456888339852774
sim_render-ego0_min: 0.003417412439982096
simulation-passed: 1
step_physics_max: 0.06279815204681889
step_physics_mean: 0.062216415841092346
step_physics_median: 0.062216415841092346
step_physics_min: 0.061634679635365806
survival_time_max: 14.350000000000067
survival_time_mean: 13.350000000000056
survival_time_min: 12.35000000000004
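
Each entry above is simply the per-metric minimum, maximum, mean, and median taken over the two episodes in the per-episodes details; with only two episodes, mean and median coincide. The following Python sketch (an illustration, not the evaluator's own aggregation code; the details_json string is truncated to two metrics for brevity) reproduces these summaries, up to floating-point rounding, from the details mapping:

    import json
    from statistics import mean, median

    # Truncated illustration: paste the full "per-episodes details" JSON here.
    details_json = """
    {"LF-small-loop-000-ego0": {"survival_time": 12.35000000000004,
                                "in-drivable-lane": 4.150000000000059},
     "LF-small-loop-001-ego0": {"survival_time": 14.350000000000067,
                                "in-drivable-lane": 2.1500000000000306}}
    """
    details = json.loads(details_json)

    # Gather each metric's values across episodes, then summarize.
    per_metric = {}
    for episode_stats in details.values():
        for name, value in episode_stats.items():
            per_metric.setdefault(name, []).append(value)

    for name, values in sorted(per_metric.items()):
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {mean(values)}")
        print(f"{name}_median: {median(values)}")
        print(f"{name}_min: {min(values)}")

    # Example check: survival_time_mean = (12.35 + 14.35) / 2 = 13.35,
    # which equals survival_time_median since there are only two episodes.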

Highlights


Click the episode images to see detailed statistics for each episode.

LF-small-loop-000

LF-small-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.