Duckietown Challenges

Job 43633

Job ID: 43633
submission: 11699
user: Dishank Bansal 🇨🇦
user label: sim-exercise-2
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no; the challenge has been changed since this job was evaluated
evaluator: afdaniele11-e40cd4f987ce-1
date started:
date completed:
duration: 0:10:34
message:
driven_lanedir_consec_median: 3.910391447497296
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.9191628149513398
in-drivable-lane_median: 1.3250000000000188
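Each headline value above is, as far as can be checked against the per-episodes details further down, the median of the corresponding metric over the four evaluated episodes. A minimal spot-check in Python; the list name is illustrative and the values are copied from the per-episodes details below:

```python
from statistics import median

# driven_lanedir_consec per episode, copied from the per-episodes details below
driven_lanedir_consec = [
    4.576171925822377,   # ETHZ_autolab_technical_track-sc0-0-ego
    4.213520185689268,   # ETHZ_autolab_technical_track-sc1-0-ego
    3.6072627093053233,  # ETHZ_autolab_technical_track-sc2-0-ego
    2.736382772483578,   # ETHZ_autolab_technical_track-sc3-0-ego
]

# With four episodes, the median is the mean of the two middle values.
print(median(driven_lanedir_consec))
# ~3.9103914474972956, matching the reported 3.910391447497296 up to rounding
```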


other stats
agent_compute-ego_max: 0.025888420740763345
agent_compute-ego_mean: 0.02569403926531474
agent_compute-ego_median: 0.025660410324732465
agent_compute-ego_min: 0.02556691567103068
complete-iteration_max: 0.2703807711601257
complete-iteration_mean: 0.2636521146694819
complete-iteration_median: 0.26214430888493856
complete-iteration_min: 0.2599390697479248
deviation-center-line_max: 1.0954146475332085
deviation-center-line_mean: 0.91958403629652
deviation-center-line_min: 0.7445958677501916
deviation-heading_max: 3.190003184686437
deviation-heading_mean: 3.0712857104875617
deviation-heading_median: 3.0832024604055266
deviation-heading_min: 2.928734736452756
driven_any_max: 5.038052791998315
driven_any_mean: 4.633640400767297
driven_any_median: 4.812224538374598
driven_any_min: 3.872059734321677
driven_lanedir_consec_max: 4.576171925822377
driven_lanedir_consec_mean: 3.783334398325137
driven_lanedir_consec_min: 2.736382772483578
driven_lanedir_max: 4.576171925822377
driven_lanedir_mean: 4.238011417894299
driven_lanedir_median: 4.384305518224748
driven_lanedir_min: 3.6072627093053233
get_duckie_state_max: 3.0199686686197915e-06
get_duckie_state_mean: 2.925395965576172e-06
get_duckie_state_median: 2.9699007670084636e-06
get_duckie_state_min: 2.7418136596679688e-06
get_robot_state_max: 0.019018041292826336
get_robot_state_mean: 0.018922595580418904
get_robot_state_median: 0.018934685389200845
get_robot_state_min: 0.018802970250447592
get_state_dump_max: 0.01919251759847005
get_state_dump_mean: 0.019072390794754028
get_state_dump_median: 0.019049286444981897
get_state_dump_min: 0.01899847269058227
get_ui_image_max: 0.04263540426890056
get_ui_image_mean: 0.042031926512718205
get_ui_image_median: 0.041907186905543015
get_ui_image_min: 0.04167792797088623
in-drivable-lane_max: 1.7500000000000044
in-drivable-lane_mean: 1.2500000000000129
in-drivable-lane_min: 0.6000000000000085
per-episodes details:
{"ETHZ_autolab_technical_track-sc0-0-ego": {"driven_any": 4.923564628665179, "get_ui_image": 0.04205280065536499, "step_physics": 0.1010368792215983, "survival_time": 14.950000000000076, "driven_lanedir": 4.576171925822377, "get_state_dump": 0.01908239523569743, "sim_render-ego": 0.00821421225865682, "get_robot_state": 0.018802970250447592, "get_duckie_state": 2.972284952799479e-06, "in-drivable-lane": 1.1000000000000156, "agent_compute-ego": 0.02556691567103068, "deviation-heading": 2.928734736452756, "complete-iteration": 0.2613628045717875, "set_robot_commands": 0.006476887861887614, "deviation-center-line": 0.9980335399442596, "driven_lanedir_consec": 4.576171925822377, "sim_compute_sim_state": 0.03409734725952149, "sim_compute_performance-ego": 0.005853631496429443},
 "ETHZ_autolab_technical_track-sc1-0-ego": {"driven_any": 4.700884448084016, "get_ui_image": 0.04176157315572103, "step_physics": 0.09991181294123332, "survival_time": 14.950000000000076, "driven_lanedir": 4.213520185689268, "get_state_dump": 0.01901617765426636, "sim_render-ego": 0.008365570704142252, "get_robot_state": 0.019018041292826336, "get_duckie_state": 2.967516581217448e-06, "in-drivable-lane": 1.550000000000022, "agent_compute-ego": 0.025733074347178145, "deviation-heading": 3.116995461345102, "complete-iteration": 0.2599390697479248, "set_robot_commands": 0.006693347295125326, "deviation-center-line": 0.84029208995842, "driven_lanedir_consec": 4.213520185689268, "sim_compute_sim_state": 0.03332924604415893, "sim_compute_performance-ego": 0.005928256511688232},
 "ETHZ_autolab_technical_track-sc2-0-ego": {"driven_any": 3.872059734321677, "get_ui_image": 0.04167792797088623, "step_physics": 0.10330390453338624, "survival_time": 14.950000000000076, "driven_lanedir": 3.6072627093053233, "get_state_dump": 0.01899847269058227, "sim_render-ego": 0.008176869551340738, "get_robot_state": 0.018881702423095705, "get_duckie_state": 2.7418136596679688e-06, "in-drivable-lane": 0.6000000000000085, "agent_compute-ego": 0.025587746302286784, "deviation-heading": 3.190003184686437, "complete-iteration": 0.2629258131980896, "set_robot_commands": 0.006631092230478922, "deviation-center-line": 1.0954146475332085, "driven_lanedir_consec": 3.6072627093053233, "sim_compute_sim_state": 0.03361570676167806, "sim_compute_performance-ego": 0.005874257087707519},
 "ETHZ_autolab_technical_track-sc3-0-ego": {"driven_any": 5.038052791998315, "get_ui_image": 0.04263540426890056, "step_physics": 0.10555952787399292, "survival_time": 14.950000000000076, "driven_lanedir": 4.555090850760227, "get_state_dump": 0.01919251759847005, "sim_render-ego": 0.008325708707173666, "get_robot_state": 0.01898766835530599, "get_duckie_state": 3.0199686686197915e-06, "in-drivable-lane": 1.7500000000000044, "agent_compute-ego": 0.025888420740763345, "deviation-heading": 3.0494094594659513, "complete-iteration": 0.2703807711601257, "set_robot_commands": 0.006610705852508545, "deviation-center-line": 0.7445958677501916, "driven_lanedir_consec": 2.736382772483578, "sim_compute_sim_state": 0.03710706392923991, "sim_compute_performance-ego": 0.005890734990437825}}
set_robot_commands_max: 0.006693347295125326
set_robot_commands_mean: 0.006603008310000102
set_robot_commands_median: 0.006620899041493733
set_robot_commands_min: 0.006476887861887614
sim_compute_performance-ego_max: 0.005928256511688232
sim_compute_performance-ego_mean: 0.005886720021565754
sim_compute_performance-ego_median: 0.005882496039072672
sim_compute_performance-ego_min: 0.005853631496429443
sim_compute_sim_state_max: 0.03710706392923991
sim_compute_sim_state_mean: 0.0345373409986496
sim_compute_sim_state_median: 0.033856527010599775
sim_compute_sim_state_min: 0.03332924604415893
sim_render-ego_max: 0.008365570704142252
sim_render-ego_mean: 0.00827059030532837
sim_render-ego_median: 0.008269960482915242
sim_render-ego_min: 0.008176869551340738
simulation-passed: 1
step_physics_max: 0.10555952787399292
step_physics_mean: 0.10245303114255268
step_physics_median: 0.10217039187749229
step_physics_min: 0.09991181294123332
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
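The aggregates above (the *_min, *_median, *_mean, and *_max entries) appear to be computed per metric over the four episodes in the per-episodes details. A sketch of that recomputation, using only a small excerpt of the per-episode data; the variable names and the excerpt structure are illustrative, not the evaluator's actual code:

```python
from statistics import mean, median

# Excerpt of the per-episodes details above: two metrics for each of the four episodes.
episodes = {
    "ETHZ_autolab_technical_track-sc0-0-ego": {"driven_any": 4.923564628665179, "survival_time": 14.950000000000076},
    "ETHZ_autolab_technical_track-sc1-0-ego": {"driven_any": 4.700884448084016, "survival_time": 14.950000000000076},
    "ETHZ_autolab_technical_track-sc2-0-ego": {"driven_any": 3.872059734321677, "survival_time": 14.950000000000076},
    "ETHZ_autolab_technical_track-sc3-0-ego": {"driven_any": 5.038052791998315, "survival_time": 14.950000000000076},
}

# Group each metric's values across episodes, then aggregate.
per_metric = {}
for stats in episodes.values():
    for key, value in stats.items():
        per_metric.setdefault(key, []).append(value)

for key, values in sorted(per_metric.items()):
    print(key, "min", min(values), "median", median(values),
          "mean", mean(values), "max", max(values))
# The printed driven_any median and mean agree with driven_any_median and
# driven_any_mean above, up to floating-point rounding.
```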

Highlights

Detailed per-episode statistics are available for the following episodes:

ETHZ_autolab_technical_track-sc0-0
ETHZ_autolab_technical_track-sc1-0
ETHZ_autolab_technical_track-sc2-0
ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.