
Submission 11030

Submission: 11030
Competing: yes
Challenge: aido5-LF-sim-validation
User: Mo Kleit 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57059
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57059

Episodes:
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 57059
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:15:47
driven_lanedir_consec_median: 1.272282011431431
survival_time_median: 21.62500000000017
deviation-center-line_median: 0.7780625807688288
in-drivable-lane_median: 10.125000000000115


other stats
agent_compute-ego0_max: 0.013586632379396694
agent_compute-ego0_mean: 0.012838949863196572
agent_compute-ego0_median: 0.012808712853642537
agent_compute-ego0_min: 0.01215174136610452
complete-iteration_max: 0.18727133910693425
complete-iteration_mean: 0.16922111063784484
complete-iteration_median: 0.17049314855373565
complete-iteration_min: 0.14862680633697384
deviation-center-line_max: 4.090417640673199
deviation-center-line_mean: 1.4818197188408493
deviation-center-line_min: 0.28073607315254007
deviation-heading_max: 13.10281462040622
deviation-heading_mean: 4.691103974146976
deviation-heading_median: 1.9866596086654629
deviation-heading_min: 1.6882820588507574
driven_any_max: 8.617610743868251
driven_any_mean: 3.433407725211655
driven_any_median: 2.250184654371409
driven_any_min: 0.6156508482355505
driven_lanedir_consec_max: 7.774821502288197
driven_lanedir_consec_mean: 2.6383945684434265
driven_lanedir_consec_min: 0.23419274862264627
driven_lanedir_max: 7.774821502288197
driven_lanedir_mean: 2.6383945684434265
driven_lanedir_median: 1.272282011431431
driven_lanedir_min: 0.23419274862264627
get_duckie_state_max: 1.3281537600212697e-06
get_duckie_state_mean: 1.2637223959643458e-06
get_duckie_state_median: 1.2569962501085714e-06
get_duckie_state_min: 1.2127433236189714e-06
get_robot_state_max: 0.003611892929876038
get_robot_state_mean: 0.003566451809504125
get_robot_state_median: 0.0035801590292312297
get_robot_state_min: 0.0034935962496780034
get_state_dump_max: 0.004544824829900452
get_state_dump_mean: 0.004485849234751847
get_state_dump_median: 0.004491272585713479
get_state_dump_min: 0.004416026937679981
get_ui_image_max: 0.03552261946713113
get_ui_image_mean: 0.0297224630551117
get_ui_image_median: 0.028804645638104743
get_ui_image_min: 0.025757941477106177
in-drivable-lane_max: 15.300000000000216
in-drivable-lane_mean: 10.312500000000114
in-drivable-lane_min: 5.7000000000000055
per-episodes details:
{
  "LF-norm-loop-000-ego0": {"driven_any": 1.8425914314558616, "get_ui_image": 0.026994549132323197, "step_physics": 0.10829683930760972, "survival_time": 17.90000000000012, "driven_lanedir": 0.9508895746612598, "get_state_dump": 0.004494595328413344, "get_robot_state": 0.003586348716927106, "sim_render-ego0": 0.003858476628168047, "get_duckie_state": 1.29569539784721e-06, "in-drivable-lane": 12.90000000000013, "deviation-heading": 1.6882820588507574, "agent_compute-ego0": 0.012393449342350442, "complete-iteration": 0.1728689790104093, "set_robot_commands": 0.0021500740210658, "deviation-center-line": 0.516448377668857, "driven_lanedir_consec": 0.9508895746612598, "sim_compute_sim_state": 0.00901327558214618, "sim_compute_performance-ego0": 0.002002925925932223},
  "LF-norm-zigzag-000-ego0": {"driven_any": 0.6156508482355505, "get_ui_image": 0.03552261946713113, "step_physics": 0.11110260598946615, "survival_time": 9.5, "driven_lanedir": 0.23419274862264627, "get_state_dump": 0.004544824829900452, "get_robot_state": 0.003611892929876038, "sim_render-ego0": 0.003982144500572644, "get_duckie_state": 1.3281537600212697e-06, "in-drivable-lane": 5.7000000000000055, "deviation-heading": 1.8605658808771288, "agent_compute-ego0": 0.013223976364934631, "complete-iteration": 0.18727133910693425, "set_robot_commands": 0.0022359730685568603, "deviation-center-line": 0.28073607315254007, "driven_lanedir_consec": 0.23419274862264627, "sim_compute_sim_state": 0.010918157887084322, "sim_compute_performance-ego0": 0.0020492214182908624},
  "LF-norm-techtrack-000-ego0": {"driven_any": 2.657777877286956, "get_ui_image": 0.030614742143886296, "step_physics": 0.09802575655809538, "survival_time": 25.350000000000225, "driven_lanedir": 1.593674448201602, "get_state_dump": 0.004416026937679981, "get_robot_state": 0.0034935962496780034, "sim_render-ego0": 0.003826959865299735, "get_duckie_state": 1.2127433236189714e-06, "in-drivable-lane": 15.300000000000216, "deviation-heading": 2.112753336453797, "agent_compute-ego0": 0.013586632379396694, "complete-iteration": 0.168117318097062, "set_robot_commands": 0.0020836840464374213, "deviation-center-line": 1.0396767838688008, "driven_lanedir_consec": 1.593674448201602, "sim_compute_sim_state": 0.010103275926094356, "sim_compute_performance-ego0": 0.0018887318025423783},
  "LF-norm-small_loop-000-ego0": {"driven_any": 8.617610743868251, "get_ui_image": 0.025757941477106177, "step_physics": 0.08894221550419765, "survival_time": 59.99999999999873, "driven_lanedir": 7.774821502288197, "get_state_dump": 0.004487949843013614, "get_robot_state": 0.003573969341535354, "sim_render-ego0": 0.0037658540136510385, "get_duckie_state": 1.2182971023699329e-06, "in-drivable-lane": 7.350000000000104, "deviation-heading": 13.10281462040622, "agent_compute-ego0": 0.01215174136610452, "complete-iteration": 0.14862680633697384, "set_robot_commands": 0.0021263629967326627, "deviation-center-line": 4.090417640673199, "driven_lanedir_consec": 7.774821502288197, "sim_compute_sim_state": 0.005837902836160398, "sim_compute_performance-ego0": 0.001904314701801335}
}
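The aggregate *_min, *_max, *_mean, and *_median rows in this section are per-metric summaries over the four episodes in the per-episodes details above. The following is a minimal Python sketch of that aggregation, not the official Duckietown evaluator; the filename per_episodes.json is a hypothetical placeholder for wherever the JSON object above is saved.

import json
import statistics
from collections import defaultdict

# Load the per-episodes details blob (hypothetical filename: save the JSON
# object shown above, i.e. the value of "per-episodes details", to this file).
with open("per_episodes.json") as f:
    per_episode = json.load(f)

# Gather each metric's values across the episodes.
values = defaultdict(list)
for episode_name, metrics in per_episode.items():
    for metric_name, value in metrics.items():
        values[metric_name].append(value)

# Summarize the way the "other stats" listing does: min, max, mean, median.
# With four episodes, the median is the average of the two middle values,
# e.g. in-drivable-lane: (7.35 + 12.9) / 2 = 10.125, as reported above.
for metric_name in sorted(values):
    vs = values[metric_name]
    print(f"{metric_name}_min    {min(vs)}")
    print(f"{metric_name}_max    {max(vs)}")
    print(f"{metric_name}_mean   {statistics.mean(vs)}")
    print(f"{metric_name}_median {statistics.median(vs)}")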
set_robot_commands_max: 0.0022359730685568603
set_robot_commands_mean: 0.002149023533198186
set_robot_commands_median: 0.0021382185088992312
set_robot_commands_min: 0.0020836840464374213
sim_compute_performance-ego0_max: 0.0020492214182908624
sim_compute_performance-ego0_mean: 0.0019612984621416997
sim_compute_performance-ego0_median: 0.001953620313866779
sim_compute_performance-ego0_min: 0.0018887318025423783
sim_compute_sim_state_max: 0.010918157887084322
sim_compute_sim_state_mean: 0.008968153057871313
sim_compute_sim_state_median: 0.009558275754120268
sim_compute_sim_state_min: 0.005837902836160398
sim_render-ego0_max: 0.003982144500572644
sim_render-ego0_mean: 0.003858358751922866
sim_render-ego0_median: 0.0038427182467338913
sim_render-ego0_min: 0.0037658540136510385
simulation-passed: 1
step_physics_max: 0.11110260598946615
step_physics_mean: 0.10159185433984222
step_physics_median: 0.10316129793285254
step_physics_min: 0.08894221550419765
survival_time_max: 59.99999999999873
survival_time_mean: 28.18749999999977
survival_time_min: 9.5
No reset possible
Job ID: 57055
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:15:09
No reset possible