
Submission 11480

Submission: 11480
Competing: yes
Challenge: aido5-LF-sim-validation
User: Raphael Jean
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54500
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54500

Detailed statistics are available for each episode:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
54500 | LFv-sim | success | yes | | | 0:41:39 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 12.42984942351157
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.251893885253857
in-drivable-lane_median: 3.349999999999884
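The headline medians are aggregated across the four episodes; with an even episode count, the median is the mean of the two middle values. A minimal sketch for `driven_lanedir_consec`, using the values from the per-episode details further down the page:

```python
from statistics import median

# driven_lanedir_consec per episode, copied from the per-episode details
driven_lanedir_consec = [
    16.0043948438138,    # LF-norm-loop-000
    9.400751904660044,   # LF-norm-zigzag-000
    13.398254236596344,  # LF-norm-techtrack-000
    11.461444610426796,  # LF-norm-small_loop-000
]
# With four episodes, median() averages the two middle values.
print(median(driven_lanedir_consec))  # 12.42984942351157
```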


other stats
agent_compute-ego0_max: 0.0462248821639697
agent_compute-ego0_mean: 0.021041470053099923
agent_compute-ego0_median: 0.012762644805082372
agent_compute-ego0_min: 0.01241570843826525
complete-iteration_max: 0.25589574413010824
complete-iteration_mean: 0.22618002971784237
complete-iteration_median: 0.22380942786166708
complete-iteration_min: 0.20120551901792705
deviation-center-line_max: 3.6423614987800073
deviation-center-line_mean: 3.2190957451191347
deviation-center-line_min: 2.730233711188819
deviation-heading_max: 12.771791674185016
deviation-heading_mean: 11.782590201472336
deviation-heading_median: 11.847424802448892
deviation-heading_min: 10.663719526806537
driven_any_max: 16.43606524831945
driven_any_mean: 14.273423067551397
driven_any_median: 14.250470487688556
driven_any_min: 12.156686046509025
driven_lanedir_consec_max: 16.0043948438138
driven_lanedir_consec_mean: 12.566211398874245
driven_lanedir_consec_min: 9.400751904660044
driven_lanedir_max: 16.0043948438138
driven_lanedir_mean: 12.926732476784228
driven_lanedir_median: 13.150891579331535
driven_lanedir_min: 9.400751904660044
get_duckie_state_max: 1.6576146007477493e-06
get_duckie_state_mean: 1.4863346233774345e-06
get_duckie_state_median: 1.4644384767930697e-06
get_duckie_state_min: 1.3588469391758497e-06
get_robot_state_max: 0.004207095734582753
get_robot_state_mean: 0.004017370748211145
get_robot_state_median: 0.003989460069273632
get_robot_state_min: 0.003883467119714562
get_state_dump_max: 0.005285687490267916
get_state_dump_mean: 0.005112911950605504
get_state_dump_median: 0.005094372759651483
get_state_dump_min: 0.004977214792851136
get_ui_image_max: 0.03694519355481794
get_ui_image_mean: 0.03171232896593609
get_ui_image_median: 0.03201089165391374
get_ui_image_min: 0.025882339001098938
in-drivable-lane_max: 12.049999999999429
in-drivable-lane_mean: 4.687499999999799
in-drivable-lane_min: 0.0
per-episode details:
{"LF-norm-loop-000-ego0": {"driven_any": 16.43606524831945, "get_ui_image": 0.029637535049158965, "step_physics": 0.13048179064265497, "survival_time": 59.99999999999873, "driven_lanedir": 16.0043948438138, "get_state_dump": 0.0051087584721853496, "get_robot_state": 0.004092612135519493, "sim_render-ego0": 0.004182079451765049, "get_duckie_state": 1.6576146007477493e-06, "in-drivable-lane": 0.0, "deviation-heading": 11.347532782313127, "agent_compute-ego0": 0.012548197119758093, "complete-iteration": 0.20120551901792705, "set_robot_commands": 0.002444180918176605, "deviation-center-line": 3.226793608786098, "driven_lanedir_consec": 16.0043948438138, "sim_compute_sim_state": 0.010263133307082964, "sim_compute_performance-ego0": 0.002354658017249826},
"LF-norm-zigzag-000-ego0": {"driven_any": 12.156686046509025, "get_ui_image": 0.03694519355481794, "step_physics": 0.17446732053986003, "survival_time": 56.09999999999895, "driven_lanedir": 9.400751904660044, "get_state_dump": 0.004977214792851136, "get_robot_state": 0.003883467119714562, "sim_render-ego0": 0.003973143926605197, "get_duckie_state": 1.4207454419836655e-06, "in-drivable-lane": 12.049999999999429, "deviation-heading": 10.663719526806537, "agent_compute-ego0": 0.01241570843826525, "complete-iteration": 0.25589574413010824, "set_robot_commands": 0.0023050630931225826, "deviation-center-line": 2.730233711188819, "driven_lanedir_consec": 9.400751904660044, "sim_compute_sim_state": 0.014695710830144976, "sim_compute_performance-ego0": 0.0021458351495323284},
"LF-norm-techtrack-000-ego0": {"driven_any": 14.51832602912718, "get_ui_image": 0.034384248258668516, "step_physics": 0.15923716106780067, "survival_time": 59.99999999999873, "driven_lanedir": 13.398254236596344, "get_state_dump": 0.005285687490267916, "get_robot_state": 0.004207095734582753, "sim_render-ego0": 0.004256609377515604, "get_duckie_state": 1.5081315116024732e-06, "in-drivable-lane": 3.2499999999998828, "deviation-heading": 12.771791674185016, "agent_compute-ego0": 0.012977092490406655, "complete-iteration": 0.23902468657513443, "set_robot_commands": 0.0025673170669390497, "deviation-center-line": 3.6423614987800073, "driven_lanedir_consec": 13.398254236596344, "sim_compute_sim_state": 0.0136405208724226, "sim_compute_performance-ego0": 0.0023694629970934865},
"LF-norm-small_loop-000-ego0": {"driven_any": 13.982614946249932, "get_ui_image": 0.025882339001098938, "step_physics": 0.11280870695693804, "survival_time": 59.99999999999873, "driven_lanedir": 12.90352892206673, "get_state_dump": 0.005079987047117616, "get_robot_state": 0.0038863080030277705, "sim_render-ego0": 0.0038956047394789824, "get_duckie_state": 1.3588469391758497e-06, "in-drivable-lane": 3.4499999999998856, "deviation-heading": 12.34731682258466, "agent_compute-ego0": 0.0462248821639697, "complete-iteration": 0.20859416914819975, "set_robot_commands": 0.0023613333404312324, "deviation-center-line": 3.276994161721615, "driven_lanedir_consec": 11.461444610426796, "sim_compute_sim_state": 0.006279643628122804, "sim_compute_performance-ego0": 0.0020889495433518332}}
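Each aggregate row on this page (min/mean/median/max) can be recomputed from the per-episode details. A sketch using only the `survival_time` values, excerpted from the JSON above (the full blob has 18 metrics per episode):

```python
import json
from statistics import mean, median

# Excerpt of the per-episode details above (survival_time only)
raw = """{
  "LF-norm-loop-000-ego0":       {"survival_time": 59.99999999999873},
  "LF-norm-zigzag-000-ego0":     {"survival_time": 56.09999999999895},
  "LF-norm-techtrack-000-ego0":  {"survival_time": 59.99999999999873},
  "LF-norm-small_loop-000-ego0": {"survival_time": 59.99999999999873}
}"""

episodes = json.loads(raw)
values = [ep["survival_time"] for ep in episodes.values()]

stats = {
    "survival_time_min": min(values),      # 56.09999999999895
    "survival_time_mean": mean(values),    # ≈ 59.025, as reported above
    "survival_time_median": median(values),# 59.99999999999873
    "survival_time_max": max(values),      # 59.99999999999873
}
for name, value in stats.items():
    print(name, value)
```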
set_robot_commands_max: 0.0025673170669390497
set_robot_commands_mean: 0.002419473604667367
set_robot_commands_median: 0.0024027571293039184
set_robot_commands_min: 0.0023050630931225826
sim_compute_performance-ego0_max: 0.0023694629970934865
sim_compute_performance-ego0_mean: 0.0022397264268068685
sim_compute_performance-ego0_median: 0.0022502465833910772
sim_compute_performance-ego0_min: 0.0020889495433518332
sim_compute_sim_state_max: 0.014695710830144976
sim_compute_sim_state_mean: 0.011219752159443337
sim_compute_sim_state_median: 0.011951827089752782
sim_compute_sim_state_min: 0.006279643628122804
sim_render-ego0_max: 0.004256609377515604
sim_render-ego0_mean: 0.004076859373841208
sim_render-ego0_median: 0.004077611689185123
sim_render-ego0_min: 0.0038956047394789824
simulation-passed: 1
step_physics_max: 0.17446732053986003
step_physics_mean: 0.14424874480181343
step_physics_median: 0.14485947585522782
step_physics_min: 0.11280870695693804
survival_time_max: 59.99999999999873
survival_time_mean: 59.02499999999878
survival_time_min: 56.09999999999895
No reset possible
54498 | LFv-sim | success | yes | | | 0:40:37 |
No reset possible