
Submission 10741

Submission: 10741
Competing: yes
Challenge: aido5-LF-sim-validation
User: Dishank Bansal 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57933
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57933

Episodes (image links with per-episode statistics omitted): LF-norm-loop-000, LF-norm-small_loop-000, LF-norm-techtrack-000, LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job 57933: step LFv-sim, status success, up to date yes, duration 0:21:08
driven_lanedir_consec_median: 3.069844448409803
survival_time_median: 40.77499999999962
deviation-center-line_median: 1.3321586049224068
in-drivable-lane_median: 6.600000000000081


other stats
agent_compute-ego0_max: 0.01398670862591456
agent_compute-ego0_mean: 0.0131660450229005
agent_compute-ego0_median: 0.013075080858424931
agent_compute-ego0_min: 0.012527309748837572
complete-iteration_max: 0.2654725748395163
complete-iteration_mean: 0.20035318092129964
complete-iteration_median: 0.1842551656197065
complete-iteration_min: 0.1674298176062693
deviation-center-line_max: 4.261005810816155
deviation-center-line_mean: 1.7575262151127582
deviation-center-line_min: 0.10478183979006513
deviation-heading_max: 9.84094520520859
deviation-heading_mean: 4.599731587483875
deviation-heading_median: 3.709101742446872
deviation-heading_min: 1.139777659833168
driven_any_max: 8.338044608497098
driven_any_mean: 4.93143667931345
driven_any_median: 5.549897845350873
driven_any_min: 0.2879064180549526
driven_lanedir_consec_max: 8.020109102326298
driven_lanedir_consec_mean: 3.5727722948963416
driven_lanedir_consec_min: 0.13129118043946297
driven_lanedir_max: 8.020109102326298
driven_lanedir_mean: 3.5727722948963416
driven_lanedir_median: 3.069844448409803
driven_lanedir_min: 0.13129118043946297
get_duckie_state_max: 1.457002427842882e-06
get_duckie_state_mean: 1.3574985383178782e-06
get_duckie_state_median: 1.344756290692368e-06
get_duckie_state_min: 1.2834791440438954e-06
get_robot_state_max: 0.00393308032859275
get_robot_state_mean: 0.003866726665002738
get_robot_state_median: 0.0038842630237849902
get_robot_state_min: 0.0037653002838482215
get_state_dump_max: 0.0051816342368958485
get_state_dump_mean: 0.004964510454413595
get_state_dump_median: 0.00492896033604107
get_state_dump_min: 0.004818486908676396
get_ui_image_max: 0.03825705770462278
get_ui_image_mean: 0.0315065962859618
get_ui_image_median: 0.03033325888810152
get_ui_image_min: 0.027102809663021397
in-drivable-lane_max: 23.799999999999756
in-drivable-lane_mean: 9.499999999999966
in-drivable-lane_min: 0.9999999999999432
per-episodes details:
{
"LF-norm-loop-000-ego0": {"driven_any": 7.576438154149887, "get_ui_image": 0.028303341882785014, "step_physics": 0.1064655524265917, "survival_time": 55.249999999999, "driven_lanedir": 4.220698576020977, "get_state_dump": 0.004818486908676396, "get_robot_state": 0.003873500642897207, "sim_render-ego0": 0.0041002451187878795, "get_duckie_state": 1.3427751620466842e-06, "in-drivable-lane": 23.799999999999756, "deviation-heading": 4.824539780969563, "agent_compute-ego0": 0.012527309748837572, "complete-iteration": 0.1756800523071565, "set_robot_commands": 0.0022910839825814717, "deviation-center-line": 1.8251994302404455, "driven_lanedir_consec": 4.220698576020977, "sim_compute_sim_state": 0.01105171215685108, "sim_compute_performance-ego0": 0.0021643267833302724},
"LF-norm-zigzag-000-ego0": {"driven_any": 0.2879064180549526, "get_ui_image": 0.03825705770462278, "step_physics": 0.18476254599434988, "survival_time": 3.099999999999997, "driven_lanedir": 0.13129118043946297, "get_state_dump": 0.0051816342368958485, "get_robot_state": 0.003895025404672774, "sim_render-ego0": 0.004244021006992885, "get_duckie_state": 1.457002427842882e-06, "in-drivable-lane": 1.4499999999999962, "deviation-heading": 1.139777659833168, "agent_compute-ego0": 0.01398670862591456, "complete-iteration": 0.2654725748395163, "set_robot_commands": 0.0024410240233890593, "deviation-center-line": 0.10478183979006513, "driven_lanedir_consec": 0.13129118043946297, "sim_compute_sim_state": 0.01050862433418395, "sim_compute_performance-ego0": 0.002103945565602136},
"LF-norm-techtrack-000-ego0": {"driven_any": 3.523357536551859, "get_ui_image": 0.03236317589341802, "step_physics": 0.11902097778030772, "survival_time": 26.30000000000024, "driven_lanedir": 1.9189903207986283, "get_state_dump": 0.004947098654180596, "get_robot_state": 0.0037653002838482215, "sim_render-ego0": 0.004013156076536233, "get_duckie_state": 1.2834791440438954e-06, "in-drivable-lane": 11.750000000000169, "deviation-heading": 2.593663703924181, "agent_compute-ego0": 0.013617483443055705, "complete-iteration": 0.19283027893225657, "set_robot_commands": 0.0022512482058617378, "deviation-center-line": 0.8391177796043682, "driven_lanedir_consec": 1.9189903207986283, "sim_compute_sim_state": 0.010749891994121632, "sim_compute_performance-ego0": 0.0020130427105377025},
"LF-norm-small_loop-000-ego0": {"driven_any": 8.338044608497098, "get_ui_image": 0.027102809663021397, "step_physics": 0.10394293084728232, "survival_time": 59.99999999999873, "driven_lanedir": 8.020109102326298, "get_state_dump": 0.004910822017901545, "get_robot_state": 0.00393308032859275, "sim_render-ego0": 0.00401724665290013, "get_duckie_state": 1.3467374193380515e-06, "in-drivable-lane": 0.9999999999999432, "deviation-heading": 9.84094520520859, "agent_compute-ego0": 0.012532678273794156, "complete-iteration": 0.1674298176062693, "set_robot_commands": 0.002296333011242869, "deviation-center-line": 4.261005810816155, "driven_lanedir_consec": 8.020109102326298, "sim_compute_sim_state": 0.006496182290044653, "sim_compute_performance-ego0": 0.0021111030165698506}
}
set_robot_commands_max: 0.0024410240233890593
set_robot_commands_mean: 0.0023199223057687843
set_robot_commands_median: 0.0022937084969121705
set_robot_commands_min: 0.0022512482058617378
sim_compute_performance-ego0_max: 0.0021643267833302724
sim_compute_performance-ego0_mean: 0.0020981045190099903
sim_compute_performance-ego0_median: 0.002107524291085994
sim_compute_performance-ego0_min: 0.0020130427105377025
sim_compute_sim_state_max: 0.01105171215685108
sim_compute_sim_state_mean: 0.009701602693800328
sim_compute_sim_state_median: 0.010629258164152793
sim_compute_sim_state_min: 0.006496182290044653
sim_render-ego0_max: 0.004244021006992885
sim_render-ego0_mean: 0.004093667213804282
sim_render-ego0_median: 0.004058745885844005
sim_render-ego0_min: 0.004013156076536233
simulation-passed: 1
step_physics_max: 0.18476254599434988
step_physics_mean: 0.1285480017621329
step_physics_median: 0.1127432651034497
step_physics_min: 0.10394293084728232
survival_time_max: 59.99999999999873
survival_time_mean: 36.16249999999949
survival_time_min: 3.099999999999997
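
For reference, each stats row above is a per-metric aggregate (min / mean / median / max) over the four episodes in the per-episodes details. Below is a minimal sketch that reproduces these aggregates with Python's statistics module, assuming the per-episodes JSON above has been saved to a file named per_episodes.json (a hypothetical filename, not part of the evaluation output):

import json
from statistics import mean, median

# Load the per-episode details shown above (the filename is an assumption).
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Collect each metric across the four episodes, then aggregate.
metrics = {}
for episode in episodes.values():
    for key, value in episode.items():
        metrics.setdefault(key, []).append(value)

for key, values in sorted(metrics.items()):
    print(f"{key}_min: {min(values)}")
    print(f"{key}_mean: {mean(values)}")
    print(f"{key}_median: {median(values)}")
    print(f"{key}_max: {max(values)}")

With four episodes the median is the average of the two middle values; for example, survival_time_median = (26.30 + 55.25) / 2 ≈ 40.775, which matches the value reported above.
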
Job 57932: step LFv-sim, status success, up to date yes, duration 0:31:29