
Submission 13046

Submission: 13046
Competing: yes
Challenge: aido5-LF-sim-validation
User: Dishank Bansal 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 60299
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 60299

Episodes:
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job 60299: step LFv-sim, status success, up to date: yes, duration 0:17:52
driven_lanedir_consec_median: 2.4122782926857127
survival_time_median: 26.150000000000013
deviation-center-line_median: 1.3379885621227583
in-drivable-lane_median: 12.499999999999943


other stats
agent_compute-ego0_max: 0.011542007653573228
agent_compute-ego0_mean: 0.011199387029281674
agent_compute-ego0_median: 0.011166297176216556
agent_compute-ego0_min: 0.01092294611112036
complete-iteration_max: 0.2078236014460598
complete-iteration_mean: 0.17546786454672575
complete-iteration_median: 0.1701522003162208
complete-iteration_min: 0.1537434561084015
deviation-center-line_max: 2.194394715866391
deviation-center-line_mean: 1.3006613865514869
deviation-center-line_min: 0.3322737060940399
deviation-heading_max: 20.002521843860126
deviation-heading_mean: 8.642439679354133
deviation-heading_median: 6.340435654305151
deviation-heading_min: 1.8863655649461055
driven_any_max: 12.970883430642532
driven_any_mean: 7.968067922328122
driven_any_median: 8.039466579962342
driven_any_min: 2.822455098745269
driven_lanedir_consec_max: 5.689017858802761
driven_lanedir_consec_mean: 2.8422068940952157
driven_lanedir_consec_min: 0.8552531322066743
driven_lanedir_max: 6.352385136104832
driven_lanedir_mean: 3.8594030218058903
driven_lanedir_median: 3.8758782832070633
driven_lanedir_min: 1.333470384704603
get_duckie_state_max: 1.4939206711789394e-06
get_duckie_state_mean: 1.4160275957770675e-06
get_duckie_state_median: 1.4281377896234623e-06
get_duckie_state_min: 1.3139141326824058e-06
get_robot_state_max: 0.0036704696790136474
get_robot_state_mean: 0.0036245650332592102
get_robot_state_median: 0.0036358522105728666
get_robot_state_min: 0.0035560860328774616
get_state_dump_max: 0.004544211156440504
get_state_dump_mean: 0.004443592173819005
get_state_dump_median: 0.00442948593205026
get_state_dump_min: 0.004371185674734995
get_ui_image_max: 0.03103720734351356
get_ui_image_mean: 0.02643911981945092
get_ui_image_median: 0.025906399372800433
get_ui_image_min: 0.022906473188689266
in-drivable-lane_max: 33.19999999999924
in-drivable-lane_mean: 15.724999999999786
in-drivable-lane_min: 4.70000000000001
per-episodes details: {"LF-norm-loop-000-ego0": {"driven_any": 11.628050059679024, "get_ui_image": 0.024152318117823986, "step_physics": 0.10218345434173905, "survival_time": 38.249999999999964, "driven_lanedir": 6.352385136104832, "get_state_dump": 0.004440045232249925, "get_robot_state": 0.0036550019490812218, "sim_render-ego0": 0.003577059616307988, "get_duckie_state": 1.377598735122083e-06, "in-drivable-lane": 19.04999999999988, "deviation-heading": 9.408496821695175, "agent_compute-ego0": 0.01126914796256521, "complete-iteration": 0.1615200699465082, "set_robot_commands": 0.0019800834182659576, "deviation-center-line": 1.8240642354931424, "driven_lanedir_consec": 5.689017858802761, "sim_compute_sim_state": 0.008283489366735553, "sim_compute_performance-ego0": 0.0018845220769976823}, "LF-norm-zigzag-000-ego0": {"driven_any": 12.970883430642532, "get_ui_image": 0.03103720734351356, "step_physics": 0.1387203385389029, "survival_time": 56.9999999999989, "driven_lanedir": 4.402760809987214, "get_state_dump": 0.0044189266318505945, "get_robot_state": 0.0035560860328774616, "sim_render-ego0": 0.003606118203045296, "get_duckie_state": 1.3139141326824058e-06, "in-drivable-lane": 33.19999999999924, "deviation-heading": 20.002521843860126, "agent_compute-ego0": 0.011542007653573228, "complete-iteration": 0.2078236014460598, "set_robot_commands": 0.001969601374566816, "deviation-center-line": 2.194394715866391, "driven_lanedir_consec": 2.3135663732044645, "sim_compute_sim_state": 0.01101944030741659, "sim_compute_performance-ego0": 0.0018640820338577688}, "LF-norm-techtrack-000-ego0": {"driven_any": 4.450883100245661, "get_ui_image": 0.027660480627776884, "step_physics": 0.11676233203698558, "survival_time": 14.050000000000065, "driven_lanedir": 3.3489957564269126, "get_state_dump": 0.004371185674734995, "get_robot_state": 0.003616702472064512, "sim_render-ego0": 0.003528672752650917, "get_duckie_state": 1.4939206711789394e-06, "in-drivable-lane": 4.70000000000001, "deviation-heading": 3.2723744869151257, "agent_compute-ego0": 0.011063446389867905, "complete-iteration": 0.17878433068593344, "set_robot_commands": 0.0019798616990975453, "deviation-center-line": 0.8519128887523747, "driven_lanedir_consec": 2.5109902121669614, "sim_compute_sim_state": 0.007845728955370314, "sim_compute_performance-ego0": 0.0018620888392130537}, "LF-norm-small_loop-000-ego0": {"driven_any": 2.822455098745269, "get_ui_image": 0.022906473188689266, "step_physics": 0.0989576650388313, "survival_time": 9.850000000000003, "driven_lanedir": 1.333470384704603, "get_state_dump": 0.004544211156440504, "get_robot_state": 0.0036704696790136474, "sim_render-ego0": 0.003605077965090973, "get_duckie_state": 1.4786768441248425e-06, "in-drivable-lane": 5.950000000000007, "deviation-heading": 1.8863655649461055, "agent_compute-ego0": 0.01092294611112036, "complete-iteration": 0.1537434561084015, "set_robot_commands": 0.001947776235715307, "deviation-center-line": 0.3322737060940399, "driven_lanedir_consec": 0.8552531322066743, "sim_compute_sim_state": 0.005240324771765507, "sim_compute_performance-ego0": 0.001855143392928923}}
set_robot_commands_max: 0.0019800834182659576
set_robot_commands_mean: 0.0019693306819114063
set_robot_commands_median: 0.0019747315368321807
set_robot_commands_min: 0.001947776235715307
sim_compute_performance-ego0_max: 0.0018845220769976823
sim_compute_performance-ego0_mean: 0.0018664590857493568
sim_compute_performance-ego0_median: 0.0018630854365354111
sim_compute_performance-ego0_min: 0.001855143392928923
sim_compute_sim_state_max: 0.01101944030741659
sim_compute_sim_state_mean: 0.008097245850321991
sim_compute_sim_state_median: 0.008064609161052934
sim_compute_sim_state_min: 0.005240324771765507
sim_render-ego0_max: 0.003606118203045296
sim_render-ego0_mean: 0.003579232134273794
sim_render-ego0_median: 0.003591068790699481
sim_render-ego0_min: 0.003528672752650917
simulation-passed: 1
step_physics_max: 0.1387203385389029
step_physics_mean: 0.1141559474891147
step_physics_median: 0.1094728931893623
step_physics_min: 0.0989576650388313
survival_time_max: 56.9999999999989
survival_time_mean: 29.787499999999735
survival_time_min: 9.850000000000003
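
The aggregate values above (the _min, _mean, _median, and _max entries) are consistent with taking each metric over the four episodes in the per-episodes details; for example, the median of driven_lanedir_consec over the four episode values (5.689, 2.314, 2.511, 0.855) reproduces driven_lanedir_consec_median = 2.4122782926857127. A minimal sketch of that aggregation, assuming the per-episodes JSON above has been saved locally as details.json (a hypothetical filename, not part of the evaluation output):

import json
from statistics import mean, median

# Load the per-episode details dictionary shown above
# (keys like "LF-norm-loop-000-ego0").
with open("details.json") as f:
    episodes = json.load(f)

def aggregate(metric):
    """Recompute min/mean/median/max of one metric over all episodes."""
    values = [ep[metric] for ep in episodes.values()]
    return {
        "min": min(values),
        "mean": mean(values),
        "median": median(values),
        "max": max(values),
    }

# Example: "median" here reproduces driven_lanedir_consec_median above.
print(aggregate("driven_lanedir_consec"))
print(aggregate("survival_time"))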
Job 60297: step LFv-sim, status success, up to date: yes, duration 0:07:49