
Submission 11661

Submission: 11661
Competing: yes
Challenge: aido5-LF-sim-validation
User: Dishank Bansal 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54120
Next:
User label: exercise_state_estimation
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54120

Episodes evaluated in this job:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of this challenge.
Job ID | step    | status  | up to date | date started | date completed | duration | message
54120  | LFv-sim | success | yes        |              |                | 0:35:58  |
driven_lanedir_consec_median: 9.448808074531168
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.613807417697111
in-drivable-lane_median: 0.29999999999998295
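
These headline values are the medians, across the four evaluation episodes, of the corresponding per-episode metrics listed under "per-episodes details" below; the survival time of just under 60 s means the agent drove for the full length of every episode. A minimal sketch of that aggregation, using values copied from the per-episode details (the snippet is illustrative, not the evaluator's own code):

import statistics

# driven_lanedir_consec per episode, copied from "per-episodes details" below.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 11.897012715005124,
    "LF-norm-zigzag-000-ego0": 10.11255462763696,
    "LF-norm-techtrack-000-ego0": 8.052298422441824,
    "LF-norm-small_loop-000-ego0": 8.785061521425375,
}

# With four episodes, the median is the mean of the two middle values:
# (8.785061521425375 + 10.11255462763696) / 2 = 9.448808074531168
print(statistics.median(driven_lanedir_consec.values()))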


other stats
agent_compute-ego0_max: 0.0123673884894429
agent_compute-ego0_mean: 0.011675166498115914
agent_compute-ego0_median: 0.011514734467499262
agent_compute-ego0_min: 0.011303808568022234
complete-iteration_max: 0.2080875503133477
complete-iteration_mean: 0.18475626251481156
complete-iteration_median: 0.18470840450131223
complete-iteration_min: 0.16152069074327405
deviation-center-line_max: 3.290584703236746
deviation-center-line_mean: 2.595229043243916
deviation-center-line_min: 1.8627166343446957
deviation-heading_max: 11.10160907425213
deviation-heading_mean: 10.165592587777576
deviation-heading_median: 10.415632411744234
deviation-heading_min: 8.729496453369707
driven_any_max: 12.159296854514997
driven_any_mean: 10.617984010691984
driven_any_median: 10.32717848997082
driven_any_min: 9.6582822083113
driven_lanedir_consec_max: 11.897012715005124
driven_lanedir_consec_mean: 9.71173182162732
driven_lanedir_consec_min: 8.052298422441824
driven_lanedir_max: 11.897012715005124
driven_lanedir_mean: 10.195188798115057
driven_lanedir_median: 10.049340478014866
driven_lanedir_min: 8.785061521425375
get_duckie_state_max: 1.4396432436673866e-06
get_duckie_state_mean: 1.31427993583838e-06
get_duckie_state_median: 1.2920460633493083e-06
get_duckie_state_min: 1.233384372987517e-06
get_robot_state_max: 0.003953529535781136
get_robot_state_mean: 0.0037114170568372487
get_robot_state_median: 0.0036508859345359074
get_robot_state_min: 0.003590366822496044
get_state_dump_max: 0.005032129033618327
get_state_dump_mean: 0.004663663988407208
get_state_dump_median: 0.004553171419084916
get_state_dump_min: 0.004516184081840674
get_ui_image_max: 0.034528254271546176
get_ui_image_mean: 0.030143518084590384
get_ui_image_median: 0.03005989743311339
get_ui_image_min: 0.02592602320058856
in-drivable-lane_max: 6.700000000000095
in-drivable-lane_mean: 1.8250000000000153
in-drivable-lane_min: 0.0
per-episodes details:
  LF-norm-loop-000-ego0: {"driven_any": 12.159296854514997, "get_ui_image": 0.02760758804937485, "step_physics": 0.10253919094031697, "survival_time": 59.99999999999873, "driven_lanedir": 11.897012715005124, "get_state_dump": 0.004543492835725376, "get_robot_state": 0.003590366822496044, "sim_render-ego0": 0.0036625199075741737, "get_duckie_state": 1.233384372987517e-06, "in-drivable-lane": 0.0, "deviation-heading": 10.142520381735055, "agent_compute-ego0": 0.011406294412160296, "complete-iteration": 0.16671867414279146, "set_robot_commands": 0.002153981238181744, "deviation-center-line": 2.919670921561889, "driven_lanedir_consec": 11.897012715005124, "sim_compute_sim_state": 0.00916824015252894, "sim_compute_performance-ego0": 0.0019543288053819083}
  LF-norm-zigzag-000-ego0: {"driven_any": 10.363323228775675, "get_ui_image": 0.034528254271546176, "step_physics": 0.13258629436794667, "survival_time": 59.99999999999873, "driven_lanedir": 10.11255462763696, "get_state_dump": 0.004516184081840674, "get_robot_state": 0.0035984043674008436, "sim_render-ego0": 0.003697292691563488, "get_duckie_state": 1.2480746101677964e-06, "in-drivable-lane": 0.0, "deviation-heading": 10.688744441753414, "agent_compute-ego0": 0.011623174522838228, "complete-iteration": 0.2080875503133477, "set_robot_commands": 0.0021570969580809937, "deviation-center-line": 2.3079439138323337, "driven_lanedir_consec": 10.11255462763696, "sim_compute_sim_state": 0.013308067504412723, "sim_compute_performance-ego0": 0.001980114539001109}
  LF-norm-techtrack-000-ego0: {"driven_any": 10.291033751165962, "get_ui_image": 0.03251220681685194, "step_physics": 0.1260720146982795, "survival_time": 59.99999999999873, "driven_lanedir": 9.98612632839277, "get_state_dump": 0.005032129033618327, "get_robot_state": 0.003953529535781136, "sim_render-ego0": 0.0039691470445542415, "get_duckie_state": 1.4396432436673866e-06, "in-drivable-lane": 0.5999999999999659, "deviation-heading": 11.10160907425213, "agent_compute-ego0": 0.0123673884894429, "complete-iteration": 0.2026981348598331, "set_robot_commands": 0.0023482983356510768, "deviation-center-line": 3.290584703236746, "driven_lanedir_consec": 8.052298422441824, "sim_compute_sim_state": 0.014147740220348605, "sim_compute_performance-ego0": 0.002187733844753904}
  LF-norm-small_loop-000-ego0: {"driven_any": 9.6582822083113, "get_ui_image": 0.02592602320058856, "step_physics": 0.10182798335593904, "survival_time": 59.99999999999873, "driven_lanedir": 8.785061521425375, "get_state_dump": 0.0045628500024444555, "get_robot_state": 0.003703367501670971, "sim_render-ego0": 0.003702232581590435, "get_duckie_state": 1.3360175165308207e-06, "in-drivable-lane": 6.700000000000095, "deviation-heading": 8.729496453369707, "agent_compute-ego0": 0.011303808568022234, "complete-iteration": 0.16152069074327405, "set_robot_commands": 0.0022377614474713456, "deviation-center-line": 1.8627166343446957, "driven_lanedir_consec": 8.785061521425375, "sim_compute_sim_state": 0.0061768510756544225, "sim_compute_performance-ego0": 0.001988767882767168}
set_robot_commands_max: 0.0023482983356510768
set_robot_commands_mean: 0.0022242844948462902
set_robot_commands_median: 0.00219742920277617
set_robot_commands_min: 0.002153981238181744
sim_compute_performance-ego0_max: 0.002187733844753904
sim_compute_performance-ego0_mean: 0.002027736267976022
sim_compute_performance-ego0_median: 0.0019844412108841387
sim_compute_performance-ego0_min: 0.0019543288053819083
sim_compute_sim_state_max: 0.014147740220348605
sim_compute_sim_state_mean: 0.010700224738236171
sim_compute_sim_state_median: 0.01123815382847083
sim_compute_sim_state_min: 0.0061768510756544225
sim_render-ego0_max: 0.0039691470445542415
sim_render-ego0_mean: 0.0037577980563205846
sim_render-ego0_median: 0.0036997626365769614
sim_render-ego0_min: 0.0036625199075741737
simulation-passed: 1
step_physics_max: 0.13258629436794667
step_physics_mean: 0.11575637084062054
step_physics_median: 0.11430560281929825
step_physics_min: 0.10182798335593904
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
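
Each *_max / *_mean / *_median / *_min row above is a per-metric summary over the four episode dictionaries shown under "per-episodes details". A sketch of how such a summary could be reproduced offline, assuming the details have been saved to a local file named per_episodes.json (the file name and script are illustrative, not part of the evaluation pipeline):

import json
import statistics

# episode name -> metric name -> value, as in "per-episodes details" above
with open("per_episodes.json") as f:
    episodes = json.load(f)

metric_names = sorted({name for metrics in episodes.values() for name in metrics})
for name in metric_names:
    values = [metrics[name] for metrics in episodes.values()]
    # These should match the rows above, up to floating-point rounding.
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {statistics.fmean(values)}")
    print(f"{name}_median: {statistics.median(values)}")
    print(f"{name}_min: {min(values)}")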
54117  | LFv-sim | success | yes        |              |                | 0:36:40  |
54113  | LFv-sim | success | yes        |              |                | 0:36:51  |