Duckietown Challenges

Submission 11560

Submission: 11560
Competing: yes
Challenge: aido5-LF-sim-validation
User: Yishu Malhotra 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54376
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

54376

Episodes (per-episode statistics are linked from the images on the original page):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 54376, step: LFv-sim, status: success, up to date: yes, duration: 0:38:06
driven_lanedir_consec_median: 8.4540036958145
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.787393040390469
in-drivable-lane_median: 6.074999999999765
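
These headline values match the medians of the corresponding per-episode metrics over the four episodes listed above; with four episodes, the median is the mean of the two middle values. A minimal check for driven_lanedir_consec, using the per-episode numbers copied from the "per-episodes details" blob further down:

    import statistics

    # Per-episode driven_lanedir_consec values, taken from the per-episodes details below.
    driven_lanedir_consec = {
        "LF-norm-loop-000-ego0": 6.33285975007287,
        "LF-norm-zigzag-000-ego0": 10.934222476114703,
        "LF-norm-techtrack-000-ego0": 9.962196586065398,
        "LF-norm-small_loop-000-ego0": 6.945810805563603,
    }

    # Median of four values = mean of the two middle ones.
    print(statistics.median(driven_lanedir_consec.values()))  # ~8.4540, matching the headline value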


other stats
agent_compute-ego0_max: 0.012417950499166957
agent_compute-ego0_mean: 0.012158793743207264
agent_compute-ego0_median: 0.01217778769500383
agent_compute-ego0_min: 0.01186164908365445
complete-iteration_max: 0.2463485503772415
complete-iteration_mean: 0.21262724704358155
complete-iteration_median: 0.2129349190045128
complete-iteration_min: 0.1782905997880591
deviation-center-line_max: 4.13585014843881
deviation-center-line_mean: 3.56840721003942
deviation-center-line_min: 2.5629926109379335
deviation-heading_max: 19.61169560075959
deviation-heading_mean: 17.407508087774975
deviation-heading_median: 17.99100814588045
deviation-heading_min: 14.036320458579414
driven_any_max: 11.946717839451614
driven_any_mean: 11.50890848374462
driven_any_median: 11.566497880470852
driven_any_min: 10.955920334585151
driven_lanedir_consec_max: 10.934222476114703
driven_lanedir_consec_mean: 8.543772404454144
driven_lanedir_consec_min: 6.33285975007287
driven_lanedir_max: 10.934222476114703
driven_lanedir_mean: 9.210975986985568
driven_lanedir_median: 9.78017950478906
driven_lanedir_min: 6.349322462249449
get_duckie_state_max: 1.3524398995887122e-06
get_duckie_state_mean: 1.2733222364687526e-06
get_duckie_state_median: 1.2697129325009107e-06
get_duckie_state_min: 1.2014231812844764e-06
get_robot_state_max: 0.0038105034207617104
get_robot_state_mean: 0.003748431446818147
get_robot_state_median: 0.00376946294833778
get_robot_state_min: 0.0036442964698353178
get_state_dump_max: 0.004859374412261168
get_state_dump_mean: 0.004688026502832606
get_state_dump_median: 0.004649266215982683
get_state_dump_min: 0.004594199167103891
get_ui_image_max: 0.03568118855319948
get_ui_image_mean: 0.03066707847193202
get_ui_image_median: 0.030430995726632708
get_ui_image_min: 0.026125133881263193
in-drivable-lane_max: 23.44999999999868
in-drivable-lane_mean: 9.024999999999554
in-drivable-lane_min: 0.5000000000000071
per-episodes details:
{
  "LF-norm-loop-000-ego0": {"driven_any": 10.955920334585151, "get_ui_image": 0.02913989388172012, "step_physics": 0.13955767428945037, "survival_time": 59.49999999999876, "driven_lanedir": 6.349322462249449, "get_state_dump": 0.004859374412261168, "get_robot_state": 0.0038105034207617104, "sim_render-ego0": 0.003901380936104545, "get_duckie_state": 1.3524398995887122e-06, "in-drivable-lane": 23.44999999999868, "deviation-heading": 14.036320458579414, "agent_compute-ego0": 0.012333522415481428, "complete-iteration": 0.2074646857683645, "set_robot_commands": 0.002311477132648305, "deviation-center-line": 2.5629926109379335, "driven_lanedir_consec": 6.33285975007287, "sim_compute_sim_state": 0.009388133520643617, "sim_compute_performance-ego0": 0.002075474168151292},
  "LF-norm-zigzag-000-ego0": {"driven_any": 11.946717839451614, "get_ui_image": 0.03568118855319948, "step_physics": 0.16758962237368416, "survival_time": 59.99999999999873, "driven_lanedir": 10.934222476114703, "get_state_dump": 0.004639985658644041, "get_robot_state": 0.0037747958022093, "sim_render-ego0": 0.0038755241778371335, "get_duckie_state": 1.2943290056138114e-06, "in-drivable-lane": 0.5000000000000071, "deviation-heading": 19.61169560075959, "agent_compute-ego0": 0.012417950499166957, "complete-iteration": 0.2463485503772415, "set_robot_commands": 0.0023034113233631397, "deviation-center-line": 4.13585014843881, "driven_lanedir_consec": 10.934222476114703, "sim_compute_sim_state": 0.013892208308998889, "sim_compute_performance-ego0": 0.002086236018324573},
  "LF-norm-techtrack-000-ego0": {"driven_any": 11.703579858292503, "get_ui_image": 0.031722097571545295, "step_physics": 0.1453220943527158, "survival_time": 59.99999999999873, "driven_lanedir": 9.962196586065398, "get_state_dump": 0.004658546773321325, "get_robot_state": 0.003764130094466261, "sim_render-ego0": 0.003824356295087753, "get_duckie_state": 1.2014231812844764e-06, "in-drivable-lane": 5.699999999999896, "deviation-heading": 19.59498521602345, "agent_compute-ego0": 0.012022052974526232, "complete-iteration": 0.2184051522406611, "set_robot_commands": 0.0022552305216793217, "deviation-center-line": 4.026517634370256, "driven_lanedir_consec": 9.962196586065398, "sim_compute_sim_state": 0.01271764364568121, "sim_compute_performance-ego0": 0.0020345236042159285},
  "LF-norm-small_loop-000-ego0": {"driven_any": 11.4294159026492, "get_ui_image": 0.026125133881263193, "step_physics": 0.11802689796879726, "survival_time": 59.99999999999873, "driven_lanedir": 9.598162423512726, "get_state_dump": 0.004594199167103891, "get_robot_state": 0.0036442964698353178, "sim_render-ego0": 0.0037595977592627073, "get_duckie_state": 1.24509685938801e-06, "in-drivable-lane": 6.449999999999633, "deviation-heading": 16.387031075737454, "agent_compute-ego0": 0.01186164908365445, "complete-iteration": 0.1782905997880591, "set_robot_commands": 0.002199880884251527, "deviation-center-line": 3.5482684464106815, "driven_lanedir_consec": 6.945810805563603, "sim_compute_sim_state": 0.006036233147614008, "sim_compute_performance-ego0": 0.0019606577168892665}
}
set_robot_commands_max: 0.002311477132648305
set_robot_commands_mean: 0.0022674999654855735
set_robot_commands_median: 0.002279320922521231
set_robot_commands_min: 0.002199880884251527
sim_compute_performance-ego0_max: 0.002086236018324573
sim_compute_performance-ego0_mean: 0.002039222876895265
sim_compute_performance-ego0_median: 0.00205499888618361
sim_compute_performance-ego0_min: 0.0019606577168892665
sim_compute_sim_state_max: 0.013892208308998889
sim_compute_sim_state_mean: 0.010508554655734432
sim_compute_sim_state_median: 0.011052888583162414
sim_compute_sim_state_min: 0.006036233147614008
sim_render-ego0_max: 0.003901380936104545
sim_render-ego0_mean: 0.0038402147920730345
sim_render-ego0_median: 0.003849940236462443
sim_render-ego0_min: 0.0037595977592627073
simulation-passed: 1
step_physics_max: 0.16758962237368416
step_physics_mean: 0.1426240722461619
step_physics_median: 0.1424398843210831
step_physics_min: 0.11802689796879726
survival_time_max: 59.99999999999873
survival_time_mean: 59.874999999998735
survival_time_min: 59.49999999999876
No reset possible
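
The _min/_max/_mean/_median entries under "other stats" can be recomputed (up to floating-point rounding) from the per-episodes details above. A minimal sketch, assuming that JSON blob has been saved to a file named per_episodes.json (hypothetical file name, not part of the evaluation output):

    import json
    import statistics

    # Load the per-episode metrics (the JSON object shown under "per-episodes details").
    with open("per_episodes.json") as f:
        per_episode = json.load(f)

    # Group values by metric name across the four episodes.
    by_metric = {}
    for episode, metrics in per_episode.items():
        for name, value in metrics.items():
            by_metric.setdefault(name, []).append(value)

    # Recompute the aggregates reported in "other stats".
    for name, values in sorted(by_metric.items()):
        print(f"{name}_min: {min(values)}")
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {statistics.mean(values)}")
        print(f"{name}_median: {statistics.median(values)}")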
Job ID: 54370, step: LFv-sim, status: success, up to date: yes, duration: 0:37:29
No reset possible