Duckietown Challenges

Submission 10045

Submission: 10045
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jean-Sébastien Grondin 🇨🇦
Date submitted:
Last status update:
Status: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58003
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58003


Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID: 58003
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:29:57
Headline metrics (medians across the four episodes):
driven_lanedir_consec_median: 9.151761729472934
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.2045841888192284
in-drivable-lane_median: 0.0


other stats
agent_compute-ego0_max: 0.02640069394584103
agent_compute-ego0_mean: 0.015873115422001732
agent_compute-ego0_median: 0.012571839559629388
agent_compute-ego0_min: 0.01194808862290712
complete-iteration_max: 0.20584664192605528
complete-iteration_mean: 0.1776289263798952
complete-iteration_median: 0.17912407808756453
complete-iteration_min: 0.14642090741839636
deviation-center-line_max: 4.356777868969345
deviation-center-line_mean: 3.1760072261364387
deviation-center-line_min: 1.9380826579379529
deviation-heading_max: 14.163098610754805
deviation-heading_mean: 9.839285115169478
deviation-heading_median: 8.663128823878345
deviation-heading_min: 7.867784202166417
driven_any_max: 10.21461876470506
driven_any_mean: 8.224497363941612
driven_any_median: 9.365295430703991
driven_any_min: 3.9527798296534065
driven_lanedir_consec_max: 10.0864371587689
driven_lanedir_consec_mean: 7.858384607934693
driven_lanedir_consec_min: 3.0435778140239984
driven_lanedir_max: 10.0864371587689
driven_lanedir_mean: 7.858384607934693
driven_lanedir_median: 9.151761729472934
driven_lanedir_min: 3.0435778140239984
get_duckie_state_max: 1.5290674829917838e-06
get_duckie_state_mean: 1.3592172724161928e-06
get_duckie_state_median: 1.3787978694004183e-06
get_duckie_state_min: 1.1502058678721509e-06
get_robot_state_max: 0.003946083294946734
get_robot_state_mean: 0.0035921619402450216
get_robot_state_median: 0.0035840636189037513
get_robot_state_min: 0.0032544372282258477
get_state_dump_max: 0.005114763703389733
get_state_dump_mean: 0.004635295875169704
get_state_dump_median: 0.004638636042732283
get_state_dump_min: 0.0041491477118245175
get_ui_image_max: 0.03442484774488084
get_ui_image_mean: 0.029877909722251232
get_ui_image_median: 0.030958472342415715
get_ui_image_min: 0.02316984645929265
in-drivable-lane_max: 7.1000000000000405
in-drivable-lane_mean: 1.77500000000001
in-drivable-lane_min: 0.0
per-episodes details (raw values for each episode; the sketch after this stats list shows how the aggregate rows are derived from them):
{"LF-norm-loop-000-ego0": {"driven_any": 10.21461876470506, "get_ui_image": 0.028858766269922054, "step_physics": 0.10491587815931894, "survival_time": 59.99999999999873, "driven_lanedir": 10.0864371587689, "get_state_dump": 0.004977184370296583, "get_robot_state": 0.003759512198556968, "sim_render-ego0": 0.003983326101183991, "get_duckie_state": 1.5134914630060887e-06, "in-drivable-lane": 0.0, "deviation-heading": 7.867784202166417, "agent_compute-ego0": 0.012137636554727547, "complete-iteration": 0.17253683290314814, "set_robot_commands": 0.0022237096400582524, "deviation-center-line": 2.762791503618852, "driven_lanedir_consec": 10.0864371587689, "sim_compute_sim_state": 0.009499921290503257, "sim_compute_performance-ego0": 0.0020793752805279456},
"LF-norm-zigzag-000-ego0": {"driven_any": 9.008034798706849, "get_ui_image": 0.03305817841490937, "step_physics": 0.11392348989856728, "survival_time": 59.99999999999873, "driven_lanedir": 8.697172539674972, "get_state_dump": 0.004300087715167983, "get_robot_state": 0.003408615039250535, "sim_render-ego0": 0.003594433040444202, "get_duckie_state": 1.244104275794748e-06, "in-drivable-lane": 0.0, "deviation-heading": 14.163098610754805, "agent_compute-ego0": 0.01194808862290712, "complete-iteration": 0.1857113232719809, "set_robot_commands": 0.0020263254592857395, "deviation-center-line": 4.356777868969345, "driven_lanedir_consec": 8.697172539674972, "sim_compute_sim_state": 0.01155075304315648, "sim_compute_performance-ego0": 0.0018178836987675676},
"LF-norm-techtrack-000-ego0": {"driven_any": 3.9527798296534065, "get_ui_image": 0.03442484774488084, "step_physics": 0.12876461885620397, "survival_time": 32.85000000000027, "driven_lanedir": 3.0435778140239984, "get_state_dump": 0.005114763703389733, "get_robot_state": 0.003946083294946734, "sim_render-ego0": 0.004124841066841659, "get_duckie_state": 1.5290674829917838e-06, "in-drivable-lane": 7.1000000000000405, "deviation-heading": 9.193552051543191, "agent_compute-ego0": 0.013006042564531228, "complete-iteration": 0.20584664192605528, "set_robot_commands": 0.0023139805779268676, "deviation-center-line": 1.9380826579379529, "driven_lanedir_consec": 3.0435778140239984, "sim_compute_sim_state": 0.011859004620723089, "sim_compute_performance-ego0": 0.0021870897171345164},
"LF-norm-small_loop-000-ego0": {"driven_any": 9.722556062701138, "get_ui_image": 0.02316984645929265, "step_physics": 0.07719556258977403, "survival_time": 59.99999999999873, "driven_lanedir": 9.606350919270898, "get_state_dump": 0.0041491477118245175, "get_robot_state": 0.0032544372282258477, "sim_render-ego0": 0.003307817182771173, "get_duckie_state": 1.1502058678721509e-06, "in-drivable-lane": 0.0, "deviation-heading": 8.132705596213498, "agent_compute-ego0": 0.02640069394584103, "complete-iteration": 0.14642090741839636, "set_robot_commands": 0.0018208814997359376, "deviation-center-line": 3.646376874019604, "driven_lanedir_consec": 9.606350919270898, "sim_compute_sim_state": 0.0054048495328396584, "sim_compute_performance-ego0": 0.0016406507515887435}}
set_robot_commands_max: 0.0023139805779268676
set_robot_commands_mean: 0.0020962242942516992
set_robot_commands_median: 0.002125017549671996
set_robot_commands_min: 0.0018208814997359376
sim_compute_performance-ego0_max: 0.0021870897171345164
sim_compute_performance-ego0_mean: 0.0019312498620046932
sim_compute_performance-ego0_median: 0.0019486294896477567
sim_compute_performance-ego0_min: 0.0016406507515887435
sim_compute_sim_state_max: 0.011859004620723089
sim_compute_sim_state_mean: 0.00957863212180562
sim_compute_sim_state_median: 0.010525337166829869
sim_compute_sim_state_min: 0.0054048495328396584
sim_render-ego0_max: 0.004124841066841659
sim_render-ego0_mean: 0.003752604347810255
sim_render-ego0_median: 0.003788879570814096
sim_render-ego0_min: 0.003307817182771173
simulation-passed: 1
step_physics_max: 0.12876461885620397
step_physics_mean: 0.10619988737596606
step_physics_median: 0.10941968402894311
step_physics_min: 0.07719556258977403
survival_time_max: 59.99999999999873
survival_time_mean: 53.21249999999911
survival_time_min: 32.85000000000027
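
Each aggregate row in the stats above is the minimum, mean, median, or maximum of the corresponding metric over the four episodes in the per-episodes details. A minimal sketch of how to reproduce them, assuming the per-episode JSON has been saved locally as per_episodes.json (hypothetical filename):

    import json
    from statistics import mean, median

    # Load the per-episode details shown above (assumed saved as per_episodes.json).
    with open("per_episodes.json") as f:
        episodes = json.load(f)  # {"LF-norm-loop-000-ego0": {...}, ...}

    # For every metric, aggregate across the four episodes the same way the
    # stats table does: min / mean / median / max.
    metric_names = sorted({name for ep in episodes.values() for name in ep})
    for name in metric_names:
        values = [ep[name] for ep in episodes.values() if name in ep]
        for label, value in (("min", min(values)), ("mean", mean(values)),
                             ("median", median(values)), ("max", max(values))):
            print(f"{name}_{label}: {value}")

For example, driven_lanedir_consec takes the values 10.0864, 8.6972, 3.0436 and 9.6064 across the four episodes; with an even number of episodes the median is the mean of the two middle values, (8.6972 + 9.6064) / 2 ≈ 9.1518, which matches driven_lanedir_consec_median above.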
Job ID: 58002
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:35:04