Duckietown Challenges

Submission 11534

Submission: 11534
Competing: yes
Challenge: aido5-LF-sim-validation
User: Daniil Lisus
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54462
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54462

Episodes:
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
54462 | LFv-sim | success | yes | | | 0:37:28 |
driven_lanedir_consec_median: 9.465583233386322
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.841597991453291
in-drivable-lane_median: 11.59999999999979
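The four values above match the medians of the corresponding per-episode results of job 54462; with four episodes, the median is the mean of the two middle values. A minimal check in Python, using the driven_lanedir_consec values from the per-episodes details listed further down (a sketch, not part of the evaluation output):

    from statistics import median

    # driven_lanedir_consec for the four episodes of job 54462,
    # copied from the "per-episodes details" section below.
    per_episode = {
        "LF-norm-loop-000-ego0": 13.008449258147277,
        "LF-norm-zigzag-000-ego0": 5.723674055950835,
        "LF-norm-techtrack-000-ego0": 11.60063129797322,
        "LF-norm-small_loop-000-ego0": 7.330535168799424,
    }

    # Even count: median = (7.330535... + 11.600631...) / 2
    # = 9.465583233386322, matching driven_lanedir_consec_median above.
    print(median(per_episode.values()))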


other stats
agent_compute-ego0_max: 0.012604058731803291
agent_compute-ego0_mean: 0.012320505158637014
agent_compute-ego0_median: 0.0123195793415244
agent_compute-ego0_min: 0.012038803219695969
complete-iteration_max: 0.24948599566584048
complete-iteration_mean: 0.20775295155470347
complete-iteration_median: 0.2002215406281267
complete-iteration_min: 0.18108272929672
deviation-center-line_max: 5.123844936099861
deviation-center-line_mean: 4.039600375707031
deviation-center-line_min: 3.351360583821681
deviation-heading_max: 15.764193696221788
deviation-heading_mean: 13.941287623012636
deviation-heading_median: 13.98094090395701
deviation-heading_min: 12.03907498791474
driven_any_max: 19.170612605652337
driven_any_mean: 16.50582047528968
driven_any_median: 16.267411418985017
driven_any_min: 14.317846457536357
driven_lanedir_consec_max: 13.008449258147277
driven_lanedir_consec_mean: 9.415822445217689
driven_lanedir_consec_min: 5.723674055950835
driven_lanedir_max: 16.231003001523135
driven_lanedir_mean: 13.007982258307193
driven_lanedir_median: 12.328694028845788
driven_lanedir_min: 11.143537974014052
get_duckie_state_max: 1.5605014303456185e-06
get_duckie_state_mean: 1.5239039669900673e-06
get_duckie_state_median: 1.5282809585456942e-06
get_duckie_state_min: 1.478552520523262e-06
get_robot_state_max: 0.003911493422089766
get_robot_state_mean: 0.003815412782325478
get_robot_state_median: 0.003806618951169067
get_robot_state_min: 0.0037369198048740103
get_state_dump_max: 0.0048718006714530615
get_state_dump_mean: 0.0047967319516921915
get_state_dump_median: 0.004783727942060967
get_state_dump_min: 0.004747671251193768
get_ui_image_max: 0.036564515777256175
get_ui_image_mean: 0.031281528297903594
get_ui_image_median: 0.03102403229023396
get_ui_image_min: 0.026513532833890255
in-drivable-lane_max: 15.69999999999929
in-drivable-lane_mean: 10.137499999999717
in-drivable-lane_min: 1.649999999999987
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 19.170612605652337, "get_ui_image": 0.02851654012236965, "step_physics": 0.1186452862821352, "survival_time": 59.99999999999873, "driven_lanedir": 13.008449258147277, "get_state_dump": 0.004792024253508531, "get_robot_state": 0.0037369198048740103, "sim_render-ego0": 0.0038392152714788862, "get_duckie_state": 1.5158736636299177e-06, "in-drivable-lane": 15.69999999999929, "deviation-heading": 13.12119982344632, "agent_compute-ego0": 0.012038803219695969, "complete-iteration": 0.1853206445533568, "set_robot_commands": 0.0022400027012249315, "deviation-center-line": 4.094121679760953, "driven_lanedir_consec": 13.008449258147277, "sim_compute_sim_state": 0.00940250913666051, "sim_compute_performance-ego0": 0.0020176643733676525},
"LF-norm-zigzag-000-ego0": {"driven_any": 14.317846457536357, "get_ui_image": 0.036564515777256175, "step_physics": 0.17118393504101298, "survival_time": 57.44999999999887, "driven_lanedir": 11.143537974014052, "get_state_dump": 0.0048718006714530615, "get_robot_state": 0.003824108994525412, "sim_render-ego0": 0.003989037223484205, "get_duckie_state": 1.5605014303456185e-06, "in-drivable-lane": 10.949999999999545, "deviation-heading": 15.764193696221788, "agent_compute-ego0": 0.01253925261290177, "complete-iteration": 0.24948599566584048, "set_robot_commands": 0.0023472253136012865, "deviation-center-line": 3.58907430314563, "driven_lanedir_consec": 5.723674055950835, "sim_compute_sim_state": 0.01191789834395699, "sim_compute_performance-ego0": 0.002153099930804709},
"LF-norm-techtrack-000-ego0": {"driven_any": 17.042812470014617, "get_ui_image": 0.033531524458098275, "step_physics": 0.13921482993800077, "survival_time": 59.99999999999873, "driven_lanedir": 16.231003001523135, "get_state_dump": 0.004747671251193768, "get_robot_state": 0.003911493422089766, "sim_render-ego0": 0.003954754780968659, "get_duckie_state": 1.478552520523262e-06, "in-drivable-lane": 1.649999999999987, "deviation-heading": 12.03907498791474, "agent_compute-ego0": 0.012604058731803291, "complete-iteration": 0.2151224367028966, "set_robot_commands": 0.002343051737293812, "deviation-center-line": 5.123844936099861, "driven_lanedir_consec": 11.60063129797322, "sim_compute_sim_state": 0.012572740138718529, "sim_compute_performance-ego0": 0.0021464709536816854},
"LF-norm-small_loop-000-ego0": {"driven_any": 15.492010367955416, "get_ui_image": 0.026513532833890255, "step_physics": 0.11939447745990991, "survival_time": 59.99999999999873, "driven_lanedir": 11.648938799544302, "get_state_dump": 0.0047754316306134045, "get_robot_state": 0.003789128907812723, "sim_render-ego0": 0.0038354912963536854, "get_duckie_state": 1.540688253461471e-06, "in-drivable-lane": 12.250000000000036, "deviation-heading": 14.8406819844677, "agent_compute-ego0": 0.01209990607014703, "complete-iteration": 0.18108272929672, "set_robot_commands": 0.002301142872819098, "deviation-center-line": 3.351360583821681, "driven_lanedir_consec": 7.330535168799424, "sim_compute_sim_state": 0.006259027468374032, "sim_compute_performance-ego0": 0.0020231704330761963}}
set_robot_commands_max: 0.0023472253136012865
set_robot_commands_mean: 0.0023078556562347824
set_robot_commands_median: 0.0023220973050564553
set_robot_commands_min: 0.0022400027012249315
sim_compute_performance-ego0_max: 0.002153099930804709
sim_compute_performance-ego0_mean: 0.002085101422732561
sim_compute_performance-ego0_median: 0.002084820693378941
sim_compute_performance-ego0_min: 0.0020176643733676525
sim_compute_sim_state_max: 0.012572740138718529
sim_compute_sim_state_mean: 0.010038043771927517
sim_compute_sim_state_median: 0.010660203740308748
sim_compute_sim_state_min: 0.006259027468374032
sim_render-ego0_max: 0.003989037223484205
sim_render-ego0_mean: 0.003904624643071359
sim_render-ego0_median: 0.0038969850262237727
sim_render-ego0_min: 0.0038354912963536854
simulation-passed: 1
step_physics_max: 0.17118393504101298
step_physics_mean: 0.1371096321802647
step_physics_median: 0.12930465369895533
step_physics_min: 0.1186452862821352
survival_time_max: 59.99999999999873
survival_time_mean: 59.36249999999876
survival_time_min: 57.44999999999887
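Each aggregate above is taken across the four episodes of this job. The following minimal sketch (not part of the evaluation output) recomputes the min/mean/median/max figures from the per-episodes details dictionary, assuming that JSON has been saved to a hypothetical local file named per_episode_details.json; the results should match the values listed above up to floating-point rounding.

    import json
    from statistics import mean, median

    # Load the per-episode details shown above.
    # "per_episode_details.json" is a hypothetical filename for this sketch.
    with open("per_episode_details.json") as f:
        episodes = json.load(f)  # {episode_name: {metric: value, ...}, ...}

    # Gather each metric's values across the four episodes.
    metrics = {}
    for episode_stats in episodes.values():
        for metric, value in episode_stats.items():
            metrics.setdefault(metric, []).append(value)

    # Recompute the aggregates reported under "other stats".
    for metric, values in sorted(metrics.items()):
        print(f"{metric}_max: {max(values)}")
        print(f"{metric}_mean: {mean(values)}")
        print(f"{metric}_median: {median(values)}")
        print(f"{metric}_min: {min(values)}")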
No reset possible
54461 | LFv-sim | success | yes | | | 0:23:21 |
No reset possible