
Submission 11666

Submission: 11666
Competing: yes
Challenge: aido5-LF-sim-validation
User: Dishank Bansal 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54097
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54097

[Per-episode visualizations: LF-norm-loop-000, LF-norm-small_loop-000, LF-norm-techtrack-000, LF-norm-zigzag-000]

Evaluation jobs for this submission

Job ID: 54097 | step: LFv-sim | status: success | up to date: yes | duration: 0:29:44
Artefacts hidden.
driven_lanedir_consec_median: 10.092476561014358
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.6006573373870543
in-drivable-lane_median: 0.0
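These headline values appear to be medians over the four evaluation episodes reported under per-episodes below. For example, the per-episode survival times are roughly 60 s, 60 s, 60 s and 12.65 s, whose median is 60 s, matching survival_time_median; likewise the median of the four driven_lanedir_consec values (11.52, 9.54, 10.64, 0.79) is (9.54 + 10.64) / 2 ≈ 10.09, matching driven_lanedir_consec_median.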


other stats
agent_compute-ego0_max: 0.013000814047185307
agent_compute-ego0_mean: 0.01239918178953895
agent_compute-ego0_median: 0.012402863824099525
agent_compute-ego0_min: 0.011790185462771436
complete-iteration_max: 0.21423833098240835
complete-iteration_mean: 0.18927170160346216
complete-iteration_median: 0.19447718929192304
complete-iteration_min: 0.15389409684759425
deviation-center-line_max: 2.8026309993270817
deviation-center-line_mean: 2.098608353189527
deviation-center-line_min: 0.3904877386569198
deviation-heading_max: 11.67196476668347
deviation-heading_mean: 8.354374450179598
deviation-heading_median: 10.351951611656403
deviation-heading_min: 1.0416298107221167
driven_any_max: 11.78153786164248
driven_any_mean: 8.49244126589326
driven_any_median: 10.355030587290187
driven_any_min: 1.478166027350186
driven_lanedir_consec_max: 11.515669516830416
driven_lanedir_consec_mean: 8.122985249817434
driven_lanedir_consec_min: 0.791318360410608
driven_lanedir_max: 11.515669516830416
driven_lanedir_mean: 8.122985249817434
driven_lanedir_median: 10.092476561014358
driven_lanedir_min: 0.791318360410608
get_duckie_state_max: 1.4354743925756857e-06
get_duckie_state_mean: 1.3804403217006074e-06
get_duckie_state_median: 1.405796143137148e-06
get_duckie_state_min: 1.2746946079524484e-06
get_robot_state_max: 0.004045758616616585
get_robot_state_mean: 0.0037631132064476873
get_robot_state_median: 0.0037256554699658753
get_robot_state_min: 0.0035553832692424145
get_state_dump_max: 0.00491567376650541
get_state_dump_mean: 0.004716041111910623
get_state_dump_median: 0.004672448454419841
get_state_dump_min: 0.0046035937722973975
get_ui_image_max: 0.0353933140995302
get_ui_image_mean: 0.03088345899094027
get_ui_image_median: 0.031090249427649305
get_ui_image_min: 0.025960023008932277
in-drivable-lane_max: 7.250000000000056
in-drivable-lane_mean: 1.812500000000014
in-drivable-lane_min: 0.0
per-episodes:
{"LF-norm-loop-000-ego0": {"driven_any": 11.78153786164248, "get_ui_image": 0.028956507763001047, "step_physics": 0.10997396662868528, "survival_time": 59.99999999999873, "driven_lanedir": 11.515669516830416, "get_state_dump": 0.004695995562678074, "get_robot_state": 0.003759923525198016, "sim_render-ego0": 0.0038213531341679784, "get_duckie_state": 1.4330921919518565e-06, "in-drivable-lane": 0.0, "deviation-heading": 10.474772871133109, "agent_compute-ego0": 0.012548533605596207, "complete-iteration": 0.1778158393132498, "set_robot_commands": 0.002229945447224562, "deviation-center-line": 2.8026309993270817, "driven_lanedir_consec": 11.515669516830416, "sim_compute_sim_state": 0.00969342903531064, "sim_compute_performance-ego0": 0.0020466458291237203},
 "LF-norm-zigzag-000-ego0": {"driven_any": 9.823747660396537, "get_ui_image": 0.0353933140995302, "step_physics": 0.13665255340906504, "survival_time": 59.99999999999873, "driven_lanedir": 9.54396517709346, "get_state_dump": 0.0046035937722973975, "get_robot_state": 0.0036913874147337343, "sim_render-ego0": 0.003811202378793124, "get_duckie_state": 1.3785000943224395e-06, "in-drivable-lane": 0.0, "deviation-heading": 11.67196476668347, "agent_compute-ego0": 0.012257194042602844, "complete-iteration": 0.21423833098240835, "set_robot_commands": 0.0022565544296760144, "deviation-center-line": 2.708172900056742, "driven_lanedir_consec": 9.54396517709346, "sim_compute_sim_state": 0.013416084421365883, "sim_compute_performance-ego0": 0.0020700447962345627},
 "LF-norm-techtrack-000-ego0": {"driven_any": 10.886313514183842, "get_ui_image": 0.03322399109229756, "step_physics": 0.13331385317888983, "survival_time": 59.99999999999873, "driven_lanedir": 10.640987944935254, "get_state_dump": 0.00491567376650541, "get_robot_state": 0.004045758616616585, "sim_render-ego0": 0.004095111659524046, "get_duckie_state": 1.4354743925756857e-06, "in-drivable-lane": 0.0, "deviation-heading": 10.229130352179702, "agent_compute-ego0": 0.013000814047185307, "complete-iteration": 0.2111385392705963, "set_robot_commands": 0.002447753822078911, "deviation-center-line": 2.4931417747173668, "driven_lanedir_consec": 10.640987944935254, "sim_compute_sim_state": 0.013707997499159432, "sim_compute_performance-ego0": 0.0022890974738814253},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.478166027350186, "get_ui_image": 0.025960023008932277, "step_physics": 0.09503042510175327, "survival_time": 12.650000000000045, "driven_lanedir": 0.791318360410608, "get_state_dump": 0.004648901346161609, "get_robot_state": 0.0035553832692424145, "sim_render-ego0": 0.003624699247164989, "get_duckie_state": 1.2746946079524484e-06, "in-drivable-lane": 7.250000000000056, "deviation-heading": 1.0416298107221167, "agent_compute-ego0": 0.011790185462771436, "complete-iteration": 0.15389409684759425, "set_robot_commands": 0.0020700467853095586, "deviation-center-line": 0.3904877386569198, "driven_lanedir_consec": 0.791318360410608, "sim_compute_sim_state": 0.005260685297447865, "sim_compute_performance-ego0": 0.001871694730022761}}
set_robot_commands_max: 0.002447753822078911
set_robot_commands_mean: 0.0022510751210722615
set_robot_commands_median: 0.002243249938450288
set_robot_commands_min: 0.0020700467853095586
sim_compute_performance-ego0_max: 0.0022890974738814253
sim_compute_performance-ego0_mean: 0.002069370707315617
sim_compute_performance-ego0_median: 0.0020583453126791417
sim_compute_performance-ego0_min: 0.001871694730022761
sim_compute_sim_state_max: 0.013707997499159432
sim_compute_sim_state_mean: 0.010519549063320956
sim_compute_sim_state_median: 0.011554756728338262
sim_compute_sim_state_min: 0.005260685297447865
sim_render-ego0_max: 0.004095111659524046
sim_render-ego0_mean: 0.003838091604912534
sim_render-ego0_median: 0.0038162777564805512
sim_render-ego0_min: 0.003624699247164989
simulation-passed: 1
step_physics_max: 0.13665255340906504
step_physics_mean: 0.11874269957959836
step_physics_median: 0.12164390990378755
step_physics_min: 0.09503042510175327
survival_time_max: 59.99999999999873
survival_time_mean: 48.162499999999056
survival_time_min: 12.650000000000045
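The aggregate rows above (_min, _max, _mean, _median) are consistent with plain statistics over the four per-episode values. The following is a minimal sketch, not the evaluator's own code, that recomputes two of these aggregates from a hand-copied subset of the per-episodes values using only the Python standard library:

import statistics

# Subset of the per-episodes values above (two metrics per episode).
per_episodes = {
    "LF-norm-loop-000-ego0":       {"survival_time": 59.99999999999873, "driven_lanedir_consec": 11.515669516830416},
    "LF-norm-zigzag-000-ego0":     {"survival_time": 59.99999999999873, "driven_lanedir_consec": 9.54396517709346},
    "LF-norm-techtrack-000-ego0":  {"survival_time": 59.99999999999873, "driven_lanedir_consec": 10.640987944935254},
    "LF-norm-small_loop-000-ego0": {"survival_time": 12.650000000000045, "driven_lanedir_consec": 0.791318360410608},
}

def aggregate(metric):
    """Return the <metric>_min/_max/_mean/_median values over all episodes."""
    values = [ep[metric] for ep in per_episodes.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_max": max(values),
        f"{metric}_mean": statistics.mean(values),
        # with four episodes, the median is the mean of the two middle values
        f"{metric}_median": statistics.median(values),
    }

for metric in ("driven_lanedir_consec", "survival_time"):
    for name, value in aggregate(metric).items():
        print(f"{name}: {value}")

Up to floating-point rounding, the printed values match the driven_lanedir_consec and survival_time rows reported above.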
Job ID: 54090 | step: LFv-sim | status: success | up to date: yes | duration: 0:35:12
Artefacts hidden.