
Job 81220

Job ID: 81220
Submission: 16433
User: Lin Wei-Chih
User label: base-image-ml
Challenge: mooc-visservoing
Step: sim
Status: success
Up to date: no (the challenge has been changed since this job was evaluated).
Evaluator: nogpu-production-b-spot-0-04
Date started:
Date completed:
Duration: 0:15:47
Message:
Scores

in-drivable-lane_median: 39.47499999999896
deviation-center-line_median: 1.9278164311968873
driven_lanedir_consec_median: 0.7702955865989867
survival_time_median: 59.99999999999873


other stats
agent_compute-ego0: min 0.005055519265993549, median 0.005131495683814564, mean 0.005131495683814564, max 0.00520747210163558
complete-iteration: min 0.11910829397165013, median 0.12077621436932998, mean 0.12077621436932998, max 0.12244413476700984
deviation-center-line: min 0.6855845466154077, mean 1.9278164311968873, max 3.170048315778367
deviation-heading: min 7.793372025728973, median 15.527256602587103, mean 15.527256602587103, max 23.261141179445232
distance-from-start: min 0.2710890220691289, median 0.31232573281486553, mean 0.31232573281486553, max 0.3535624435606021
driven_any: min 6.239034373757153, median 6.24212408221152, mean 6.24212408221152, max 6.245213790665887
driven_lanedir_consec: min 0.5970292022833102, mean 0.7702955865989867, max 0.943561970914663
driven_lanedir: min 0.5970292022833102, median 1.35456019989328, mean 1.35456019989328, max 2.11209119750325
get_duckie_state: min 1.071394730567138e-06, median 1.1025618553955689e-06, mean 1.1025618553955689e-06, max 1.1337289802239997e-06
get_robot_state: min 0.0031672409432416753, median 0.00318785243784756, mean 0.00318785243784756, max 0.0032084639324534447
get_state_dump: min 0.004021824448432255, median 0.004052324954119451, mean 0.004052324954119451, max 0.004082825459806647
get_ui_image: min 0.035836937425535585, median 0.03701835746669849, mean 0.03701835746669849, max 0.03819977750786139
in-drivable-lane: min 28.79999999999925, mean 39.47499999999896, max 50.149999999998656
per-episode details

metric                         LF-small-loop-000-ego0    LF-small-loop-001-ego0
agent_compute-ego0             0.005055519265993549      0.00520747210163558
complete-iteration             0.12244413476700984       0.11910829397165013
deviation-center-line          0.6855845466154077        3.170048315778367
deviation-heading              7.793372025728973         23.261141179445232
distance-from-start            0.3535624435606021        0.2710890220691289
driven_any                     6.245213790665887         6.239034373757153
driven_lanedir                 0.5970292022833102        2.11209119750325
driven_lanedir_consec          0.5970292022833102        0.943561970914663
get_duckie_state               1.1337289802239997e-06    1.071394730567138e-06
get_robot_state                0.0031672409432416753     0.0032084639324534447
get_state_dump                 0.004082825459806647      0.004021824448432255
get_ui_image                   0.03819977750786139       0.035836937425535585
in-drivable-lane               50.149999999998656        28.79999999999925
set_robot_commands             0.0018011900308626478     0.0018384700810085428
sim_compute_performance-ego0   0.0016407843533403966     0.0017083616280535873
sim_compute_sim_state          0.0033054464961170256     0.003772315137293019
sim_render-ego0                0.0031563262955334463     0.0032195997277862523
step_physics                   0.06195860004345642       0.06021252599584371
survival_time                  59.99999999999873         59.99999999999873
set_robot_commands: min 0.0018011900308626478, median 0.0018198300559355953, mean 0.0018198300559355953, max 0.0018384700810085428
sim_compute_performance-ego0: min 0.0016407843533403966, median 0.001674572990696992, mean 0.001674572990696992, max 0.0017083616280535873
sim_compute_sim_state: min 0.0033054464961170256, median 0.0035388808167050225, mean 0.0035388808167050225, max 0.003772315137293019
sim_render-ego0: min 0.0031563262955334463, median 0.0031879630116598496, mean 0.0031879630116598496, max 0.0032195997277862523
simulation-passed: 1
step_physics: min 0.06021252599584371, median 0.06108556301965007, mean 0.06108556301965007, max 0.06195860004345642
survival_time: min 59.99999999999873, mean 59.99999999999873, max 59.99999999999873
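
For reference, the aggregate rows above can be recomputed from the per-episode details. The snippet below is an illustrative sketch only, not the evaluator's own code: the per_episode dictionary (filled in with just one metric from the table) and the aggregate helper are names made up for this example. Because this job ran two episodes, the median and the mean of every metric coincide, which is why those two figures always match.

    # Illustrative sketch (not the official Duckietown evaluator code):
    # recompute min/median/mean/max aggregates from per-episode metrics.
    from statistics import mean, median

    # Assumed layout mirroring the "per-episode details" table above;
    # only one metric per episode is copied in to keep the example short.
    per_episode = {
        "LF-small-loop-000-ego0": {"deviation-center-line": 0.6855845466154077},
        "LF-small-loop-001-ego0": {"deviation-center-line": 3.170048315778367},
    }

    def aggregate(metric):
        """Summarize one metric across all episodes."""
        values = [ep[metric] for ep in per_episode.values()]
        return {
            "min": min(values),
            "median": median(values),
            "mean": mean(values),
            "max": max(values),
        }

    print(aggregate("deviation-center-line"))
    # min 0.6855845466154077, median/mean approximately 1.9278164311968873
    # (matching deviation-center-line_mean above), max 3.170048315778367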

Highlights



LF-small-loop-000

LF-small-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.