
Submission 10962

Submission: 10962
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57324
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57324

Episodes: LF-norm-loop-000, LF-norm-small_loop-000, LF-norm-techtrack-000, LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step    | status  | up to date | date started | date completed | duration | message
57324  | LFv-sim | success | yes        |              |                | 0:32:22  |
driven_lanedir_consec_median: 5.070013310088246
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.995272382831122
in-drivable-lane_median: 15.599999999999646


other stats
agent_compute-ego0_max: 0.012947976043281904
agent_compute-ego0_mean: 0.012285854345370288
agent_compute-ego0_median: 0.012332387411227136
agent_compute-ego0_min: 0.011530666515744966
complete-iteration_max: 0.21630857806717924
complete-iteration_mean: 0.18795446853338443
complete-iteration_median: 0.1966967976758323
complete-iteration_min: 0.14211570071469387
deviation-center-line_max: 3.3740280351187555
deviation-center-line_mean: 2.8897562195443203
deviation-center-line_min: 2.1944520773962832
deviation-heading_max: 17.343571254716494
deviation-heading_mean: 13.803908218851138
deviation-heading_median: 14.273541525300502
deviation-heading_min: 9.324978570087056
driven_any_max: 11.088169769558734
driven_any_mean: 9.600468482736629
driven_any_median: 10.18129019094474
driven_any_min: 6.951123779498293
driven_lanedir_consec_max: 5.977459400974896
driven_lanedir_consec_mean: 4.988493774831577
driven_lanedir_consec_min: 3.836489078174915
driven_lanedir_max: 9.26497461132758
driven_lanedir_mean: 6.583730590085642
driven_lanedir_median: 6.186015880834019
driven_lanedir_min: 4.697915987346947
get_duckie_state_max: 1.524409882531972e-06
get_duckie_state_mean: 1.3483939460085065e-06
get_duckie_state_median: 1.3727431094815192e-06
get_duckie_state_min: 1.1236796825390144e-06
get_robot_state_max: 0.0039252089421814625
get_robot_state_mean: 0.003761862722195603
get_robot_state_median: 0.003841433894326546
get_robot_state_min: 0.00343937415794786
get_state_dump_max: 0.005134630362060445
get_state_dump_mean: 0.004750472803655908
get_state_dump_median: 0.004790150652717888
get_state_dump_min: 0.004286959547127409
get_ui_image_max: 0.034235915772424554
get_ui_image_mean: 0.030041858540990164
get_ui_image_median: 0.03113225616086631
get_ui_image_min: 0.023667006069803473
in-drivable-lane_max: 21.04999999999954
in-drivable-lane_mean: 15.32499999999964
in-drivable-lane_min: 9.049999999999727
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 11.088169769558734, "get_ui_image": 0.028621216598498037, "step_physics": 0.11231413471212397, "survival_time": 59.99999999999873, "driven_lanedir": 9.26497461132758, "get_state_dump": 0.005134630362060445, "get_robot_state": 0.003839072934991613, "sim_render-ego0": 0.004071319629310271, "get_duckie_state": 1.524409882531972e-06, "in-drivable-lane": 9.049999999999727, "deviation-heading": 12.245483456605829, "agent_compute-ego0": 0.012507436475984063, "complete-iteration": 0.18167255343644445, "set_robot_commands": 0.0022532773156845004, "deviation-center-line": 3.2470860799824734, "driven_lanedir_consec": 5.977459400974896, "sim_compute_sim_state": 0.010709386185543622, "sim_compute_performance-ego0": 0.0021208773842461403}, "LF-norm-zigzag-000-ego0": {"driven_any": 10.041803930366136, "get_ui_image": 0.034235915772424554, "step_physics": 0.13655058410542892, "survival_time": 59.99999999999873, "driven_lanedir": 5.916334471523147, "get_state_dump": 0.004620719015548668, "get_robot_state": 0.0038437948536614794, "sim_render-ego0": 0.0039940201968177965, "get_duckie_state": 1.3653979908913796e-06, "in-drivable-lane": 21.04999999999954, "deviation-heading": 17.343571254716494, "agent_compute-ego0": 0.012157338346470209, "complete-iteration": 0.21172104191522015, "set_robot_commands": 0.00223274770922506, "deviation-center-line": 2.7434586856797702, "driven_lanedir_consec": 5.442110632829546, "sim_compute_sim_state": 0.011838081972882908, "sim_compute_performance-ego0": 0.002157693699337263}, "LF-norm-techtrack-000-ego0": {"driven_any": 10.320776451523344, "get_ui_image": 0.03364329572323459, "step_physics": 0.1400814169551014, "survival_time": 59.99999999999873, "driven_lanedir": 6.455697290144892, "get_state_dump": 0.00495958228988711, "get_robot_state": 0.0039252089421814625, "sim_render-ego0": 0.004076967032922495, "get_duckie_state": 1.380088228071659e-06, "in-drivable-lane": 19.49999999999961, "deviation-heading": 16.301599593995174, "agent_compute-ego0": 0.012947976043281904, "complete-iteration": 0.21630857806717924, "set_robot_commands": 0.002263651997917995, "deviation-center-line": 3.3740280351187555, "driven_lanedir_consec": 3.836489078174915, "sim_compute_sim_state": 0.012212094617425949, "sim_compute_performance-ego0": 0.00210714479171664}, "LF-norm-small_loop-000-ego0": {"driven_any": 6.951123779498293, "get_ui_image": 0.023667006069803473, "step_physics": 0.08610425030656636, "survival_time": 40.549999999999834, "driven_lanedir": 4.697915987346947, "get_state_dump": 0.004286959547127409, "get_robot_state": 0.00343937415794786, "sim_render-ego0": 0.003552618578737006, "get_duckie_state": 1.1236796825390144e-06, "in-drivable-lane": 11.699999999999688, "deviation-heading": 9.324978570087056, "agent_compute-ego0": 0.011530666515744966, "complete-iteration": 0.14211570071469387, "set_robot_commands": 0.0019903638092755097, "deviation-center-line": 2.1944520773962832, "driven_lanedir_consec": 4.697915987346947, "sim_compute_sim_state": 0.005694758128650083, "sim_compute_performance-ego0": 0.0017746848425841683}}
set_robot_commands_max: 0.002263651997917995
set_robot_commands_mean: 0.002185010208025766
set_robot_commands_median: 0.00224301251245478
set_robot_commands_min: 0.0019903638092755097
sim_compute_performance-ego0_max: 0.002157693699337263
sim_compute_performance-ego0_mean: 0.002040100179471053
sim_compute_performance-ego0_median: 0.0021140110879813902
sim_compute_performance-ego0_min: 0.0017746848425841683
sim_compute_sim_state_max: 0.012212094617425949
sim_compute_sim_state_mean: 0.01011358022612564
sim_compute_sim_state_median: 0.011273734079213263
sim_compute_sim_state_min: 0.005694758128650083
sim_render-ego0_max: 0.004076967032922495
sim_render-ego0_mean: 0.003923731359446892
sim_render-ego0_median: 0.004032669913064033
sim_render-ego0_min: 0.003552618578737006
simulation-passed: 1
step_physics_max: 0.1400814169551014
step_physics_mean: 0.11876259651980516
step_physics_median: 0.12443235940877644
step_physics_min: 0.08610425030656636
survival_time_max: 59.99999999999873
survival_time_mean: 55.13749999999901
survival_time_min: 40.549999999999834
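The aggregate values above (the _min, _mean, _median and _max entries) are per-metric summaries over the four episodes in the per-episodes details. A minimal sketch of how they could be recomputed from that JSON, assuming it is saved to a file named per_episode_details.json (the file name and the use of Python's statistics module are illustrative assumptions, not part of the evaluator):

import json
import statistics

# Load the per-episodes details JSON shown above (hypothetical file name).
with open("per_episode_details.json") as f:
    episodes = json.load(f)

# Collect each metric's value from every episode, then summarize.
per_metric = {}
for episode, metrics in episodes.items():
    for name, value in metrics.items():
        per_metric.setdefault(name, []).append(value)

for name, values in sorted(per_metric.items()):
    print(f"{name}_min: {min(values)}")
    print(f"{name}_mean: {statistics.mean(values)}")
    print(f"{name}_median: {statistics.median(values)}")
    print(f"{name}_max: {max(values)}")

For example, survival_time is about 60 s in three episodes and 40.55 s in LF-norm-small_loop-000, so its median comes out at about 60, matching survival_time_median above.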
57320  | LFv-sim | success | yes        |              |                | 0:25:26  |