
Job 43522

Job ID: 43522
submission: 11687
user: Dishank Bansal 🇨🇦
user label: sim-exercise-2
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (the challenge has been changed since this job was evaluated)
evaluator: afdaniele05-c7e03feddacf-1
date started:
date completed:
duration: 0:06:37
message:
Scores

driven_lanedir_consec_median: 3.2700999093695895
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.717304490997436
in-drivable-lane_median: 1.20000000000001


other stats
agent_compute-ego_max: 0.014236265023549398
agent_compute-ego_mean: 0.01369667910923392
agent_compute-ego_median: 0.01389784256617228
agent_compute-ego_min: 0.012754766281041722
complete-iteration_max: 0.1488986086845398
complete-iteration_mean: 0.14549721502651602
complete-iteration_median: 0.14811943935135663
complete-iteration_min: 0.13685137271881104
deviation-center-line_max: 0.9695713439721244
deviation-center-line_mean: 0.7189299484713387
deviation-center-line_min: 0.4715394679183586
deviation-heading_max: 3.458851204067821
deviation-heading_mean: 2.7952128634239433
deviation-heading_median: 3.021172127199966
deviation-heading_min: 1.6796559952280188
driven_any_max: 4.178628441136877
driven_any_mean: 3.710800585762964
driven_any_median: 3.890178552459198
driven_any_min: 2.8842167969965837
driven_lanedir_consec_max: 4.050339292817771
driven_lanedir_consec_mean: 3.294373919128936
driven_lanedir_consec_min: 2.5869565649587933
driven_lanedir_max: 4.050339292817771
driven_lanedir_mean: 3.294373919128936
driven_lanedir_median: 3.2700999093695895
driven_lanedir_min: 2.5869565649587933
get_duckie_state_max: 1.3891855875651045e-06
get_duckie_state_mean: 1.335019445688711e-06
get_duckie_state_median: 1.3758233711544404e-06
get_duckie_state_min: 1.1992454528808594e-06
get_robot_state_max: 0.009392295180067504
get_robot_state_mean: 0.009004989539162588
get_robot_state_median: 0.008979077736536662
get_robot_state_min: 0.008669507503509522
get_state_dump_max: 0.007814628256242828
get_state_dump_mean: 0.007395138137758115
get_state_dump_median: 0.007292983929316203
get_state_dump_min: 0.007179956436157226
get_ui_image_max: 0.025822985967000327
get_ui_image_mean: 0.02500964583650147
get_ui_image_median: 0.025115110510486663
get_ui_image_min: 0.023985376358032228
in-drivable-lane_max: 2.80000000000004
in-drivable-lane_mean: 1.300000000000015
in-drivable-lane_min: 0.0
per-episode details (an aggregation sketch follows these statistics)

ETHZ_autolab_technical_track-sc0-0-ego
  driven_any: 4.147525930842179
  get_ui_image: 0.023985376358032228
  step_physics: 0.05775856892267863
  survival_time: 14.950000000000076
  driven_lanedir: 3.806317475746328
  get_state_dump: 0.007179956436157226
  sim_render-ego: 0.003803588549296061
  get_robot_state: 0.008669507503509522
  get_duckie_state: 1.1992454528808594e-06
  in-drivable-lane: 1.200000000000017
  agent_compute-ego: 0.013580804665883382
  deviation-heading: 2.8793034299413103
  complete-iteration: 0.13685137271881104
  set_robot_commands: 0.003210129737854004
  deviation-center-line: 0.786596114453887
  driven_lanedir_consec: 3.806317475746328
  sim_compute_sim_state: 0.016090675989786783
  sim_compute_performance-ego: 0.0024915226300557453

ETHZ_autolab_technical_track-sc1-0-ego
  driven_any: 4.178628441136877
  get_ui_image: 0.025822985967000327
  step_physics: 0.06559995094935099
  survival_time: 14.950000000000076
  driven_lanedir: 4.050339292817771
  get_state_dump: 0.007262169520060221
  sim_render-ego: 0.004141484101613363
  get_robot_state: 0.008936742146809895
  get_duckie_state: 1.3891855875651045e-06
  in-drivable-lane: 0.0
  agent_compute-ego: 0.014236265023549398
  deviation-heading: 3.1630408244586214
  complete-iteration: 0.1488986086845398
  set_robot_commands: 0.003581175009409586
  deviation-center-line: 0.9695713439721244
  driven_lanedir_consec: 4.050339292817771
  sim_compute_sim_state: 0.016489181518554687
  sim_compute_performance-ego: 0.002741703192392985

ETHZ_autolab_technical_track-sc2-0-ego
  driven_any: 3.632831174076217
  get_ui_image: 0.02546408971150716
  step_physics: 0.06485973119735718
  survival_time: 14.950000000000076
  driven_lanedir: 2.7338823429928505
  get_state_dump: 0.007323798338572184
  sim_render-ego: 0.004089376131693522
  get_robot_state: 0.009021413326263428
  get_duckie_state: 1.3669331868489583e-06
  in-drivable-lane: 2.80000000000004
  agent_compute-ego: 0.01421488046646118
  deviation-heading: 3.458851204067821
  complete-iteration: 0.14803231875101724
  set_robot_commands: 0.0034351007143656413
  deviation-center-line: 0.648012867540985
  driven_lanedir_consec: 2.7338823429928505
  sim_compute_sim_state: 0.016876881122589112
  sim_compute_performance-ego: 0.0026530234018961587

ETHZ_autolab_technical_track-sc3-0-ego
  driven_any: 2.8842167969965837
  get_ui_image: 0.02476613130946617
  step_physics: 0.06437188218542411
  survival_time: 8.84999999999999
  driven_lanedir: 2.5869565649587933
  get_state_dump: 0.007814628256242828
  sim_render-ego: 0.003975624418528067
  get_robot_state: 0.009392295180067504
  get_duckie_state: 1.3847135554599224e-06
  in-drivable-lane: 1.2000000000000028
  agent_compute-ego: 0.012754766281041722
  deviation-heading: 1.6796559952280188
  complete-iteration: 0.14820655995169602
  set_robot_commands: 0.003041666106315656
  deviation-center-line: 0.4715394679183586
  driven_lanedir_consec: 2.5869565649587933
  sim_compute_sim_state: 0.01945320765177409
  sim_compute_performance-ego: 0.0025487013455838133
set_robot_commands_max: 0.003581175009409586
set_robot_commands_mean: 0.003317017891986222
set_robot_commands_median: 0.003322615226109823
set_robot_commands_min: 0.003041666106315656
sim_compute_performance-ego_max: 0.002741703192392985
sim_compute_performance-ego_mean: 0.0026087376424821754
sim_compute_performance-ego_median: 0.002600862373739986
sim_compute_performance-ego_min: 0.0024915226300557453
sim_compute_sim_state_max: 0.01945320765177409
sim_compute_sim_state_mean: 0.01722748657067617
sim_compute_sim_state_median: 0.0166830313205719
sim_compute_sim_state_min: 0.016090675989786783
sim_render-ego_max: 0.004141484101613363
sim_render-ego_mean: 0.004002518300282754
sim_render-ego_median: 0.004032500275110795
sim_render-ego_min: 0.003803588549296061
simulation-passed: 1
step_physics_max: 0.06559995094935099
step_physics_mean: 0.06314753331370272
step_physics_median: 0.06461580669139064
step_physics_min: 0.05775856892267863
survival_time_max: 14.950000000000076
survival_time_mean: 13.425000000000058
survival_time_min: 8.84999999999999
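
For reference, a minimal sketch (not part of the original report) of how the *_min / *_max / *_mean / *_median summary lines above can be reproduced from the per-episode details, assuming that blob has been parsed as JSON into a Python dict. The example uses only the survival_time values copied from the four episodes listed above.

  import json
  import statistics

  def aggregate(per_episode):
      """Collapse per-episode numeric metrics into min/max/mean/median summaries."""
      values_by_metric = {}
      for metrics in per_episode.values():
          for name, value in metrics.items():
              values_by_metric.setdefault(name, []).append(value)
      summary = {}
      for name, values in values_by_metric.items():
          summary[name + "_min"] = min(values)
          summary[name + "_max"] = max(values)
          summary[name + "_mean"] = statistics.mean(values)
          summary[name + "_median"] = statistics.median(values)
      return summary

  # Example with one metric taken from the per-episode details above.
  per_episode = {
      "ETHZ_autolab_technical_track-sc0-0-ego": {"survival_time": 14.950000000000076},
      "ETHZ_autolab_technical_track-sc1-0-ego": {"survival_time": 14.950000000000076},
      "ETHZ_autolab_technical_track-sc2-0-ego": {"survival_time": 14.950000000000076},
      "ETHZ_autolab_technical_track-sc3-0-ego": {"survival_time": 8.84999999999999},
  }
  print(json.dumps(aggregate(per_episode), indent=2))
  # survival_time_median -> 14.950000000000076 and survival_time_min -> 8.84999999999999,
  # matching the summary statistics reported above (the mean agrees up to
  # floating-point rounding).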

Highlights


Per-episode highlights; detailed statistics for each episode are listed above under per-episode details.

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.