Duckietown Challenges

Submission 13060

Submission: 13060
Competing: yes
Challenge: aido5-LF-sim-validation
User: Ayman Shams 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 60327
Next:
User label: real-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 60327

Per-episode statistics are available for the following episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
60327 | LFv-sim | success | yes | | | 0:10:29 |
driven_lanedir_consec_median: 0.4813887401447434
survival_time_median: 12.025000000000036
deviation-center-line_median: 0.33340942391202694
in-drivable-lane_median: 7.375000000000044
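
Each headline metric carries a _median suffix, and the numbers are consistent with taking the median of that metric over the four episodes listed in the per-episodes details further down. With four episodes, the median is the mean of the two middle values. A minimal check in Python, using the survival times copied from the per-episode details on this page:

from statistics import median

# Per-episode survival times for job 60327, taken from the per-episodes details.
survival_times = [
    23.900000000000205,  # LF-norm-loop-000
    10.600000000000016,  # LF-norm-zigzag-000
    9.049999999999994,   # LF-norm-techtrack-000
    13.450000000000056,  # LF-norm-small_loop-000
]

# With an even number of values, median() averages the two middle ones:
# sorted -> 9.05, 10.6, 13.45, 23.9; (10.6 + 13.45) / 2 = 12.025
print(median(survival_times))  # 12.025000000000036, the survival_time_median above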


other stats
agent_compute-ego0_max: 0.012853647326375102
agent_compute-ego0_mean: 0.012468157675968872
agent_compute-ego0_median: 0.012552329750047575
agent_compute-ego0_min: 0.011914323877405238
complete-iteration_max: 0.32124705157929184
complete-iteration_mean: 0.2743518521151472
complete-iteration_median: 0.27950336057492353
complete-iteration_min: 0.21715363573144983
deviation-center-line_max: 0.3870476888632763
deviation-center-line_mean: 0.3079057221606805
deviation-center-line_min: 0.17775635195539194
deviation-heading_max: 3.1613492775963747
deviation-heading_mean: 1.9831563519956563
deviation-heading_median: 1.955000530765019
deviation-heading_min: 0.8612750688562134
driven_any_max: 3.0219426576807313
driven_any_mean: 1.7506289090169815
driven_any_median: 1.458525488985167
driven_any_min: 1.06352200041686
driven_lanedir_consec_max: 0.6434150560631537
driven_lanedir_consec_mean: 0.46863083437962505
driven_lanedir_consec_min: 0.2683308011658596
driven_lanedir_max: 0.6434150560631537
driven_lanedir_mean: 0.46863083437962505
driven_lanedir_median: 0.4813887401447434
driven_lanedir_min: 0.2683308011658596
get_duckie_state_max: 2.1307556717484085e-06
get_duckie_state_mean: 2.023018830533502e-06
get_duckie_state_median: 2.0432158815422896e-06
get_duckie_state_min: 1.8748878873010197e-06
get_robot_state_max: 0.003597752784139677
get_robot_state_mean: 0.0035524875002215187
get_robot_state_median: 0.00354879641367936
get_robot_state_min: 0.0035146043893876768
get_state_dump_max: 0.004507047398354166
get_state_dump_mean: 0.004465613498574803
get_state_dump_median: 0.004475445993848094
get_state_dump_min: 0.004404514608248858
get_ui_image_max: 0.035823614944314734
get_ui_image_mean: 0.030636070126712704
get_ui_image_median: 0.030327150690084907
get_ui_image_min: 0.026066364182366263
in-drivable-lane_max: 20.200000000000177
in-drivable-lane_mean: 9.637500000000069
in-drivable-lane_min: 3.600000000000005
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 3.0219426576807313, "get_ui_image": 0.028342496378187844, "step_physics": 0.19339828003423448, "survival_time": 23.900000000000205, "driven_lanedir": 0.2683308011658596, "get_state_dump": 0.004507047398354166, "get_robot_state": 0.003597752784139677, "sim_render-ego0": 0.003737460596327493, "get_duckie_state": 2.000924192042341e-06, "in-drivable-lane": 20.200000000000177, "deviation-heading": 2.8652045617829556, "agent_compute-ego0": 0.012631630350005404, "complete-iteration": 0.2605370126337995, "set_robot_commands": 0.0021764012617457636, "deviation-center-line": 0.3870476888632763, "driven_lanedir_consec": 0.2683308011658596, "sim_compute_sim_state": 0.010116928056784612, "sim_compute_performance-ego0": 0.0019473362566284943},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.269322136955765, "get_ui_image": 0.035823614944314734, "step_physics": 0.24707806278282488, "survival_time": 10.600000000000016, "driven_lanedir": 0.3545609852306253, "get_state_dump": 0.004404514608248858, "get_robot_state": 0.0035146043893876768, "sim_render-ego0": 0.0037103617135347895, "get_duckie_state": 1.8748878873010197e-06, "in-drivable-lane": 6.350000000000023, "deviation-heading": 3.1613492775963747, "agent_compute-ego0": 0.012473029150089748, "complete-iteration": 0.32124705157929184, "set_robot_commands": 0.002145910486928734, "deviation-center-line": 0.35731718121760325, "driven_lanedir_consec": 0.3545609852306253, "sim_compute_sim_state": 0.010077963412647516, "sim_compute_performance-ego0": 0.0019393269444855168},
 "LF-norm-techtrack-000-ego0": {"driven_any": 1.06352200041686, "get_ui_image": 0.03231180500198197, "step_physics": 0.22913667133876256, "survival_time": 9.049999999999994, "driven_lanedir": 0.6434150560631537, "get_state_dump": 0.004465281308352292, "get_robot_state": 0.0035358876972408086, "sim_render-ego0": 0.003723122261382721, "get_duckie_state": 2.085507571042239e-06, "in-drivable-lane": 3.600000000000005, "deviation-heading": 1.0447964997470818, "agent_compute-ego0": 0.012853647326375102, "complete-iteration": 0.29846970851604754, "set_robot_commands": 0.0021515337975470572, "deviation-center-line": 0.17775635195539194, "driven_lanedir_consec": 0.6434150560631537, "sim_compute_sim_state": 0.008290594750708277, "sim_compute_performance-ego0": 0.0019178325003320043},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.6477288410145698, "get_ui_image": 0.026066364182366263, "step_physics": 0.15816378593444824, "survival_time": 13.450000000000056, "driven_lanedir": 0.6082164950588616, "get_state_dump": 0.004485610679343895, "get_robot_state": 0.003561705130117911, "sim_render-ego0": 0.0036919125804194697, "get_duckie_state": 2.1307556717484085e-06, "in-drivable-lane": 8.400000000000066, "deviation-heading": 0.8612750688562134, "agent_compute-ego0": 0.011914323877405238, "complete-iteration": 0.21715363573144983, "set_robot_commands": 0.0021713936770403827, "deviation-center-line": 0.3095016666064506, "driven_lanedir_consec": 0.6082164950588616, "sim_compute_sim_state": 0.005099051086990922, "sim_compute_performance-ego0": 0.001919327841864692}}
set_robot_commands_max: 0.0021764012617457636
set_robot_commands_mean: 0.0021613098058154844
set_robot_commands_median: 0.00216146373729372
set_robot_commands_min: 0.002145910486928734
sim_compute_performance-ego0_max: 0.0019473362566284943
sim_compute_performance-ego0_mean: 0.0019309558858276769
sim_compute_performance-ego0_median: 0.001929327393175104
sim_compute_performance-ego0_min: 0.0019178325003320043
sim_compute_sim_state_max: 0.010116928056784612
sim_compute_sim_state_mean: 0.008396134326782831
sim_compute_sim_state_median: 0.009184279081677895
sim_compute_sim_state_min: 0.005099051086990922
sim_render-ego0_max: 0.003737460596327493
sim_render-ego0_mean: 0.003715714287916118
sim_render-ego0_median: 0.0037167419874587554
sim_render-ego0_min: 0.0036919125804194697
simulation-passed: 1
step_physics_max: 0.24707806278282488
step_physics_mean: 0.20694420002256753
step_physics_median: 0.2112674756864985
step_physics_min: 0.15816378593444824
survival_time_max: 23.900000000000205
survival_time_mean: 14.25000000000007
survival_time_min: 9.049999999999994
60322 | LFv-sim | success | yes | | | 0:11:30 |
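
The aggregate figures in the "other stats" block for job 60327 can be recomputed from the per-episodes details object by collecting each metric across the four episodes and taking its min, mean, median, and max. A minimal sketch, assuming the JSON object has been saved locally as per_episodes.json (a hypothetical filename; the job artefacts themselves are only visible to the submission's author):

import json
from statistics import mean, median

# "per_episodes.json" is a hypothetical local copy of the object shown
# under "per-episodes details" for job 60327.
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Group each metric's values across the four episodes.
by_metric = {}
for stats in episodes.values():
    for metric, value in stats.items():
        by_metric.setdefault(metric, []).append(value)

# Recompute the aggregates reported under "other stats".
for metric, values in sorted(by_metric.items()):
    print(f"{metric}_max: {max(values)}")
    print(f"{metric}_mean: {mean(values)}")
    print(f"{metric}_median: {median(values)}")
    print(f"{metric}_min: {min(values)}")

Running this reproduces the reported medians (for example, survival_time_median 12.025000000000036); the means may differ in the last digits depending on summation order.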