
Submission 11740

Submission: 11740
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jean-Sébastien Grondin 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 53872
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 53872

Episodes evaluated in this job:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step    | status  | up to date | date started | date completed | duration | message
53872  | LFv-sim | success | yes        |              |                | 0:35:01  |
driven_lanedir_consec_median: 4.64668553988053
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.890990649798208
in-drivable-lane_median: 11.72499999999965
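
These headline scores are the medians of the per-episode values over the four evaluated episodes. As an illustration only (this is not the evaluator's own code; the four values are copied from the per-episodes details further down), the reported driven_lanedir_consec_median can be reproduced like this:

import statistics

# driven_lanedir_consec per episode, copied from the per-episodes details below
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 6.006594957405672,
    "LF-norm-zigzag-000-ego0": 2.0932584414804625,
    "LF-norm-techtrack-000-ego0": 3.2867761223553886,
    "LF-norm-small_loop-000-ego0": 6.068645624516983,
}

# With four episodes, the median is the mean of the two middle values.
print(statistics.median(driven_lanedir_consec.values()))  # ~4.64668553988053, as reported above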


other stats
agent_compute-ego0_max: 0.01326705732512335
agent_compute-ego0_mean: 0.012592989378772736
agent_compute-ego0_median: 0.012557950323065212
agent_compute-ego0_min: 0.011988999543837166
complete-iteration_max: 0.21956525567894236
complete-iteration_mean: 0.2029562807210383
complete-iteration_median: 0.2062561651947695
complete-iteration_min: 0.1797475368156719
deviation-center-line_max: 4.197238669329663
deviation-center-line_mean: 3.5200960984484007
deviation-center-line_min: 2.1011644248675254
deviation-heading_max: 21.391657149738997
deviation-heading_mean: 16.21027647925019
deviation-heading_median: 15.499616628243274
deviation-heading_min: 12.450215510775214
driven_any_max: 9.43381437333822
driven_any_mean: 8.211940718665915
driven_any_median: 8.862938141411938
driven_any_min: 5.688072218501561
driven_lanedir_consec_max: 6.068645624516983
driven_lanedir_consec_mean: 4.3638187864396265
driven_lanedir_consec_min: 2.0932584414804625
driven_lanedir_max: 8.437917086509863
driven_lanedir_mean: 6.497320920545013
driven_lanedir_median: 7.053463901290767
driven_lanedir_min: 3.444438793088656
get_duckie_state_max: 1.4259455900803692e-06
get_duckie_state_mean: 1.353106845862819e-06
get_duckie_state_median: 1.3500129451958167e-06
get_duckie_state_min: 1.2864559029792734e-06
get_robot_state_max: 0.004242737426249609
get_robot_state_mean: 0.003919087641307185
get_robot_state_median: 0.003862615254722423
get_robot_state_min: 0.003708382629534287
get_state_dump_max: 0.006036375155734381
get_state_dump_mean: 0.005182863291549543
get_state_dump_median: 0.004968417872000892
get_state_dump_min: 0.00475824226646201
get_ui_image_max: 0.036645983380911885
get_ui_image_mean: 0.032846681383272944
get_ui_image_median: 0.03312958120604935
get_ui_image_min: 0.0284815797400812
in-drivable-lane_max: 18.3999999999999
in-drivable-lane_mean: 11.937499999999764
in-drivable-lane_min: 5.8999999999998565
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 9.346358791383429, "get_ui_image": 0.0284815797400812, "step_physics": 0.11266798421207018, "survival_time": 59.99999999999873, "driven_lanedir": 8.169920177850589, "get_state_dump": 0.004759186213459202, "get_robot_state": 0.003708382629534287, "sim_render-ego0": 0.003753165222028213, "get_duckie_state": 1.4259455900803692e-06, "in-drivable-lane": 9.049999999999551, "deviation-heading": 14.776247330265296, "agent_compute-ego0": 0.011988999543837166, "complete-iteration": 0.1797475368156719, "set_robot_commands": 0.0022120461872872665, "deviation-center-line": 3.879653272619261, "driven_lanedir_consec": 6.006594957405672, "sim_compute_sim_state": 0.010076216912884994, "sim_compute_performance-ego0": 0.002010427446389178},
 "LF-norm-zigzag-000-ego0": {"driven_any": 5.688072218501561, "get_ui_image": 0.036645983380911885, "step_physics": 0.1382816324212863, "survival_time": 45.04999999999958, "driven_lanedir": 3.444438793088656, "get_state_dump": 0.006036375155734381, "get_robot_state": 0.003872646725098468, "sim_render-ego0": 0.004164776093679627, "get_duckie_state": 1.2864559029792734e-06, "in-drivable-lane": 18.3999999999999, "deviation-heading": 12.450215510775214, "agent_compute-ego0": 0.01272242904502378, "complete-iteration": 0.21956525567894236, "set_robot_commands": 0.002299118729229777, "deviation-center-line": 2.1011644248675254, "driven_lanedir_consec": 2.0932584414804625, "sim_compute_sim_state": 0.01320385139955914, "sim_compute_performance-ego0": 0.0022515008296247595},
 "LF-norm-techtrack-000-ego0": {"driven_any": 8.379517491440447, "get_ui_image": 0.03290658271084419, "step_physics": 0.1385516502974333, "survival_time": 59.99999999999873, "driven_lanedir": 5.937007624730946, "get_state_dump": 0.00475824226646201, "get_robot_state": 0.003852583784346378, "sim_render-ego0": 0.0039879817152698276, "get_duckie_state": 1.3219228295064985e-06, "in-drivable-lane": 14.399999999999748, "deviation-heading": 21.391657149738997, "agent_compute-ego0": 0.012393471601106642, "complete-iteration": 0.21373810418738015, "set_robot_commands": 0.002332599236506606, "deviation-center-line": 4.197238669329663, "driven_lanedir_consec": 3.2867761223553886, "sim_compute_sim_state": 0.012686721689000317, "sim_compute_performance-ego0": 0.002177265065595768},
 "LF-norm-small_loop-000-ego0": {"driven_any": 9.43381437333822, "get_ui_image": 0.033352579701254506, "step_physics": 0.1262753791952014, "survival_time": 59.99999999999873, "driven_lanedir": 8.437917086509863, "get_state_dump": 0.0051776495305425815, "get_robot_state": 0.004242737426249609, "sim_render-ego0": 0.004276562292907359, "get_duckie_state": 1.3781030608851349e-06, "in-drivable-lane": 5.8999999999998565, "deviation-heading": 16.22298592622125, "agent_compute-ego0": 0.01326705732512335, "complete-iteration": 0.1987742262021588, "set_robot_commands": 0.0026808791513943256, "deviation-center-line": 3.902328026977154, "driven_lanedir_consec": 6.068645624516983, "sim_compute_sim_state": 0.0070527455491090595, "sim_compute_performance-ego0": 0.002356634251184011}}
set_robot_commands_max: 0.0026808791513943256
set_robot_commands_mean: 0.002381160826104494
set_robot_commands_median: 0.0023158589828681914
set_robot_commands_min: 0.0022120461872872665
sim_compute_performance-ego0_max: 0.002356634251184011
sim_compute_performance-ego0_mean: 0.0021989568981984294
sim_compute_performance-ego0_median: 0.0022143829476102637
sim_compute_performance-ego0_min: 0.002010427446389178
sim_compute_sim_state_max: 0.01320385139955914
sim_compute_sim_state_mean: 0.010754883887638376
sim_compute_sim_state_median: 0.011381469300942654
sim_compute_sim_state_min: 0.0070527455491090595
sim_render-ego0_max: 0.004276562292907359
sim_render-ego0_mean: 0.004045621330971257
sim_render-ego0_median: 0.004076378904474727
sim_render-ego0_min: 0.003753165222028213
simulation-passed: 1
step_physics_max: 0.1385516502974333
step_physics_mean: 0.1289441615314978
step_physics_median: 0.13227850580824385
step_physics_min: 0.11266798421207018
survival_time_max: 59.99999999999873
survival_time_mean: 56.26249999999894
survival_time_min: 45.04999999999958
53868  | LFv-sim | success | yes        |              |                | 0:35:54  |