Duckietown Challenges

Submission 11655

Submission: 11655
Competing: yes
Challenge: aido5-LF-sim-validation
User: Mo Kleit 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54129
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54129

Episodes evaluated in this job (detailed per-episode numbers are listed under "per-episodes details" below):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step    | status  | up to date | date started | date completed | duration | message
54129  | LFv-sim | success | yes        |              |                | 0:25:43  |
Artefacts hidden.
driven_lanedir_consec_median: 5.53095272241352
survival_time_median: 43.47499999999967
deviation-center-line_median: 1.9233120551879983
in-drivable-lane_median: 17.249999999999716
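
These headline values appear to be medians taken over the four episodes listed above. For example, the per-episode survival_time values are roughly 6.50, 33.45, 53.50 and 60.00; with four samples the median is the average of the two middle values, (33.45 + 53.50) / 2 = 43.475, which matches survival_time_median. A short sketch that reproduces all of the aggregates from the per-episodes details appears after the "other stats" block below.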


other stats
agent_compute-ego0_max: 0.012972465908254375
agent_compute-ego0_mean: 0.012323886282599544
agent_compute-ego0_median: 0.012169879025263484
agent_compute-ego0_min: 0.011983321171616832
complete-iteration_max: 0.2854366266090451
complete-iteration_mean: 0.2484782536198393
complete-iteration_median: 0.24581053695785432
complete-iteration_min: 0.21685531395460347
deviation-center-line_max: 2.390674416113139
deviation-center-line_mean: 1.5939408236344628
deviation-center-line_min: 0.1384647680487149
deviation-heading_max: 12.402792374212655
deviation-heading_mean: 8.222465307990904
deviation-heading_median: 9.848940498880143
deviation-heading_min: 0.7891878599906662
driven_any_max: 23.55575286841991
driven_any_mean: 14.82857042080396
driven_any_median: 16.7963103894068
driven_any_min: 2.1659080359823344
driven_lanedir_consec_max: 10.695067593228313
driven_lanedir_consec_mean: 5.617777298574387
driven_lanedir_consec_min: 0.7141361562421955
driven_lanedir_max: 10.695067593228313
driven_lanedir_mean: 6.950555802789822
driven_lanedir_median: 8.196509730844388
driven_lanedir_min: 0.7141361562421955
get_duckie_state_max: 1.518221017248045e-06
get_duckie_state_mean: 1.4612168779400776e-06
get_duckie_state_median: 1.4808273147134917e-06
get_duckie_state_min: 1.3649918650852817e-06
get_robot_state_max: 0.003963519921943323
get_robot_state_mean: 0.003932560589473398
get_robot_state_median: 0.003951662056427784
get_robot_state_min: 0.0038633983230947
get_state_dump_max: 0.005179785590135415
get_state_dump_mean: 0.005096600124602351
get_state_dump_median: 0.005088780830499259
get_state_dump_min: 0.005029053247275473
get_ui_image_max: 0.03747403712673042
get_ui_image_mean: 0.03163064661245559
get_ui_image_median: 0.030896964183424455
get_ui_image_min: 0.027254620956243023
in-drivable-lane_max: 34.049999999999116
in-drivable-lane_mean: 18.199999999999633
in-drivable-lane_min: 4.249999999999985
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 20.778768363561827, "get_ui_image": 0.028586967318665748, "step_physics": 0.16089566214745785, "survival_time": 53.4999999999991, "driven_lanedir": 10.695067593228313, "get_state_dump": 0.005029053247275473, "get_robot_state": 0.0038633983230947, "sim_render-ego0": 0.003972377429823247, "get_duckie_state": 1.518221017248045e-06, "in-drivable-lane": 22.94999999999941, "deviation-heading": 12.402792374212655, "agent_compute-ego0": 0.012257327552603073, "complete-iteration": 0.2289153717033892, "set_robot_commands": 0.0023157607718787385, "deviation-center-line": 2.390674416113139, "driven_lanedir_consec": 10.695067593228313, "sim_compute_sim_state": 0.0097532067534859, "sim_compute_performance-ego0": 0.002141438723723539},
 "LF-norm-zigzag-000-ego0": {"driven_any": 2.1659080359823344, "get_ui_image": 0.03747403712673042, "step_physics": 0.2062564187377464, "survival_time": 6.499999999999985, "driven_lanedir": 0.7141361562421955, "get_state_dump": 0.005179785590135415, "get_robot_state": 0.003954894670093333, "sim_render-ego0": 0.004191718938696475, "get_duckie_state": 1.3649918650852817e-06, "in-drivable-lane": 4.249999999999985, "deviation-heading": 0.7891878599906662, "agent_compute-ego0": 0.012972465908254375, "complete-iteration": 0.2854366266090451, "set_robot_commands": 0.002470198478407532, "deviation-center-line": 0.1384647680487149, "driven_lanedir_consec": 0.7141361562421955, "sim_compute_sim_state": 0.010563950502235473, "sim_compute_performance-ego0": 0.0022761130150947863},
 "LF-norm-techtrack-000-ego0": {"driven_any": 12.81385241525177, "get_ui_image": 0.03320696104818316, "step_physics": 0.18759092893173449, "survival_time": 33.45000000000024, "driven_lanedir": 7.664774701691019, "get_state_dump": 0.005094142458332119, "get_robot_state": 0.003963519921943323, "sim_render-ego0": 0.0040244266168395085, "get_duckie_state": 1.4821095253104597e-06, "in-drivable-lane": 11.550000000000022, "deviation-heading": 8.106635733084893, "agent_compute-ego0": 0.012082430497923891, "complete-iteration": 0.2627057022123194, "set_robot_commands": 0.00237265024612199, "deviation-center-line": 1.581389776592395, "driven_lanedir_consec": 5.752589526434654, "sim_compute_sim_state": 0.01208009470754595, "sim_compute_performance-ego0": 0.0021875402820644096},
 "LF-norm-small_loop-000-ego0": {"driven_any": 23.55575286841991, "get_ui_image": 0.027254620956243023, "step_physics": 0.15333966291715065, "survival_time": 59.99999999999873, "driven_lanedir": 8.728244759997759, "get_state_dump": 0.005083419202666398, "get_robot_state": 0.003948429442762236, "sim_render-ego0": 0.004037658340428692, "get_duckie_state": 1.479545104116524e-06, "in-drivable-lane": 34.049999999999116, "deviation-heading": 11.591245264675395, "agent_compute-ego0": 0.011983321171616832, "complete-iteration": 0.21685531395460347, "set_robot_commands": 0.0023488901139893797, "deviation-center-line": 2.265234333783602, "driven_lanedir_consec": 5.309315918392388, "sim_compute_sim_state": 0.006598349316332561, "sim_compute_performance-ego0": 0.0021605205774108734}}
set_robot_commands_max: 0.002470198478407532
set_robot_commands_mean: 0.0023768749025994103
set_robot_commands_median: 0.0023607701800556847
set_robot_commands_min: 0.0023157607718787385
sim_compute_performance-ego0_max: 0.0022761130150947863
sim_compute_performance-ego0_mean: 0.002191403149573402
sim_compute_performance-ego0_median: 0.0021740304297376417
sim_compute_performance-ego0_min: 0.002141438723723539
sim_compute_sim_state_max: 0.01208009470754595
sim_compute_sim_state_mean: 0.00974890031989997
sim_compute_sim_state_median: 0.010158578627860683
sim_compute_sim_state_min: 0.006598349316332561
sim_render-ego0_max: 0.004191718938696475
sim_render-ego0_mean: 0.004056545331446981
sim_render-ego0_median: 0.0040310424786341
sim_render-ego0_min: 0.003972377429823247
simulation-passed: 1
step_physics_max: 0.2062564187377464
step_physics_mean: 0.17702066818352233
step_physics_median: 0.17424329553959617
step_physics_min: 0.15333966291715065
survival_time_max: 59.99999999999873
survival_time_mean: 38.362499999999514
survival_time_min: 6.499999999999985
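
The "other stats" rows appear to be simple per-metric aggregates (min, mean, median, max) over the four episodes reported in the per-episodes details JSON. Below is a minimal sketch of that aggregation, assuming the JSON shown above has been saved locally as per_episodes.json (a hypothetical file name, not provided by this page):

import json
from statistics import mean, median

# Assumption: the "per-episodes details" JSON above was saved as per_episodes.json
# (hypothetical file name; the page itself does not expose a download path).
with open("per_episodes.json") as f:
    per_episode = json.load(f)  # {episode_name: {metric: value, ...}, ...}

# Collect each metric's values across the four episodes.
by_metric = {}
for episode_stats in per_episode.values():
    for metric, value in episode_stats.items():
        by_metric.setdefault(metric, []).append(value)

# Reproduce the min/mean/median/max rows reported under "other stats";
# with four episodes, median() averages the two middle values.
for metric, values in sorted(by_metric.items()):
    print(f"{metric}_min: {min(values)}")
    print(f"{metric}_mean: {mean(values)}")
    print(f"{metric}_median: {median(values)}")
    print(f"{metric}_max: {max(values)}")

Run against the JSON above, this matches the reported values, e.g. survival_time_median = 43.47499999999967 and driven_lanedir_consec_median = 5.53095272241352, up to floating-point display precision.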
54126  | LFv-sim | success | yes        |              |                | 0:09:10  |
Artefacts hidden.