
Submission 13471

Submission: 13471
Competing: yes
Challenge: aido5-LF-sim-validation
User: Raphael Jean
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 61970
Next:
User label: mobile-segmentation-pedestrian
Admin priority: 50
Blessing: n/a
User priority: 50

Job 61970

Detailed statistics were recorded for each of the following episodes.

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID: 61970
Step: LFv-sim
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:25:27
Artefacts hidden.
driven_lanedir_consec_median: 12.02404573136815
survival_time_median: 45.32499999999951
deviation-center-line_median: 2.8699866879186295
in-drivable-lane_median: 5.924999999999994


other stats
agent_compute-ego0_max: 0.015787941625141553
agent_compute-ego0_mean: 0.01544475067294577
agent_compute-ego0_median: 0.015538254347967169
agent_compute-ego0_min: 0.014914552370707192
complete-iteration_max: 0.2543731592950367
complete-iteration_mean: 0.20946723736450248
complete-iteration_median: 0.20227357993049755
complete-iteration_min: 0.17894863030197816
deviation-center-line_max: 5.940622411994401
deviation-center-line_mean: 3.2065526331775533
deviation-center-line_min: 1.1456147448785534
deviation-heading_max: 9.26717211454492
deviation-heading_mean: 6.052357226838228
deviation-heading_median: 5.2927150600286215
deviation-heading_min: 4.35682667275075
driven_any_max: 20.839969937012576
driven_any_mean: 14.722458675737975
driven_any_median: 15.557156806317572
driven_any_min: 6.935551153304191
driven_lanedir_consec_max: 20.67722266830486
driven_lanedir_consec_mean: 12.300306519730723
driven_lanedir_consec_min: 4.475911947881732
driven_lanedir_max: 20.67722266830486
driven_lanedir_mean: 12.300808366176208
driven_lanedir_median: 12.025049424259116
driven_lanedir_min: 4.475911947881732
get_duckie_state_max: 1.162581801026186e-06
get_duckie_state_mean: 1.1241739035929443e-06
get_duckie_state_median: 1.1238031442913782e-06
get_duckie_state_min: 1.0865075247628348e-06
get_robot_state_max: 0.00365169141612184
get_robot_state_mean: 0.003565558672921969
get_robot_state_median: 0.0035486072367740488
get_robot_state_min: 0.003513328802017938
get_state_dump_max: 0.0045379742694635575
get_state_dump_mean: 0.0044347154982720864
get_state_dump_median: 0.004420721870591285
get_state_dump_min: 0.00435944398244222
get_ui_image_max: 0.035624031793503534
get_ui_image_mean: 0.03043537016197756
get_ui_image_median: 0.030244735754300513
get_ui_image_min: 0.02562797734580568
in-drivable-lane_max: 13.650000000000178
in-drivable-lane_mean: 6.375000000000041
in-drivable-lane_min: 0.0
per-episodes details:

LF-norm-loop-000-ego0:
  driven_any: 20.816194890009147
  get_ui_image: 0.02813249404583247
  step_physics: 0.1118468759855958
  survival_time: 59.99999999999873
  driven_lanedir: 20.67722266830486
  get_state_dump: 0.0045379742694635575
  get_robot_state: 0.00365169141612184
  sim_render-ego0: 0.0036711490323005566
  get_duckie_state: 1.103554438988831e-06
  in-drivable-lane: 0.0
  deviation-heading: 5.390132416987151
  agent_compute-ego0: 0.015388941784683214
  complete-iteration: 0.18073429473730843
  set_robot_commands: 0.0022186605658360463
  deviation-center-line: 4.497352045396556
  driven_lanedir_consec: 20.67722266830486
  sim_compute_sim_state: 0.00927979344630817
  sim_compute_performance-ego0: 0.00192966965414106

LF-norm-zigzag-000-ego0:
  driven_any: 6.935551153304191
  get_ui_image: 0.035624031793503534
  step_physics: 0.17687467961084274
  survival_time: 20.950000000000163
  driven_lanedir: 4.475911947881732
  get_state_dump: 0.00435944398244222
  get_robot_state: 0.003513328802017938
  sim_render-ego0: 0.003664106982094901
  get_duckie_state: 1.0865075247628348e-06
  in-drivable-lane: 6.90000000000006
  deviation-heading: 4.35682667275075
  agent_compute-ego0: 0.014914552370707192
  complete-iteration: 0.2543731592950367
  set_robot_commands: 0.002114911306472052
  deviation-center-line: 1.1456147448785534
  driven_lanedir_consec: 4.475911947881732
  sim_compute_sim_state: 0.011339898904164631
  sim_compute_performance-ego0: 0.0018970290819803873

LF-norm-techtrack-000-ego0:
  driven_any: 10.298118722625995
  get_ui_image: 0.032356977462768555
  step_physics: 0.14745816190390323
  survival_time: 30.6500000000003
  driven_lanedir: 5.2585914340863
  get_state_dump: 0.004461976138310634
  get_robot_state: 0.0035367811929907784
  sim_render-ego0: 0.003618882222750288
  get_duckie_state: 1.162581801026186e-06
  in-drivable-lane: 13.650000000000178
  deviation-heading: 5.195297703070092
  agent_compute-ego0: 0.015787941625141553
  complete-iteration: 0.22381286512368664
  set_robot_commands: 0.002086504662852334
  deviation-center-line: 1.2426213304407032
  driven_lanedir_consec: 5.256584048304368
  sim_compute_sim_state: 0.012549377031357358
  sim_compute_performance-ego0: 0.001884254648165128

LF-norm-small_loop-000-ego0:
  driven_any: 20.839969937012576
  get_ui_image: 0.02562797734580568
  step_physics: 0.11601547734326469
  survival_time: 59.99999999999873
  driven_lanedir: 18.79150741443193
  get_state_dump: 0.004379467602871935
  get_robot_state: 0.0035604332805573196
  sim_render-ego0: 0.003575530278494118
  get_duckie_state: 1.1440518495939256e-06
  in-drivable-lane: 4.949999999999927
  deviation-heading: 9.26717211454492
  agent_compute-ego0: 0.015687566911251123
  complete-iteration: 0.17894863030197816
  set_robot_commands: 0.0021072122874009817
  deviation-center-line: 5.940622411994401
  driven_lanedir_consec: 18.79150741443193
  sim_compute_sim_state: 0.006058999441942506
  sim_compute_performance-ego0: 0.0018654997203868197
set_robot_commands_max: 0.0022186605658360463
set_robot_commands_mean: 0.0021318222056403535
set_robot_commands_median: 0.002111061796936517
set_robot_commands_min: 0.002086504662852334
sim_compute_performance-ego0_max: 0.00192966965414106
sim_compute_performance-ego0_mean: 0.0018941132761683488
sim_compute_performance-ego0_median: 0.0018906418650727575
sim_compute_performance-ego0_min: 0.0018654997203868197
sim_compute_sim_state_max: 0.012549377031357358
sim_compute_sim_state_mean: 0.009807017205943168
sim_compute_sim_state_median: 0.0103098461752364
sim_compute_sim_state_min: 0.006058999441942506
sim_render-ego0_max: 0.0036711490323005566
sim_render-ego0_mean: 0.003632417128909966
sim_render-ego0_median: 0.0036414946024225943
sim_render-ego0_min: 0.003575530278494118
simulation-passed: 1
step_physics_max: 0.17687467961084274
step_physics_mean: 0.13804879871090162
step_physics_median: 0.13173681962358397
step_physics_min: 0.1118468759855958
survival_time_max: 59.99999999999873
survival_time_mean: 42.89999999999948
survival_time_min: 20.950000000000163
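
The aggregate _max / _mean / _median / _min entries above are plain statistics over the four episodes. As a cross-check, here is a minimal Python sketch (not part of the evaluation output; it only uses the standard-library statistics module) that reproduces the driven_lanedir_consec aggregates from the episode values copied out of the per-episodes details:

import statistics

# driven_lanedir_consec per episode, copied from the per-episodes details above
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 20.67722266830486,
    "LF-norm-zigzag-000-ego0": 4.475911947881732,
    "LF-norm-techtrack-000-ego0": 5.256584048304368,
    "LF-norm-small_loop-000-ego0": 18.79150741443193,
}

values = sorted(driven_lanedir_consec.values())
print(statistics.median(values))  # ~12.02404573136815  (driven_lanedir_consec_median)
print(statistics.mean(values))    # ~12.300306519730723 (driven_lanedir_consec_mean)
print(values[0], values[-1])      # min 4.475911947881732, max 20.67722266830486

The same recipe applies to the other metrics; for example, survival_time_median (~45.325) is the median of the four per-episode survival times (20.95, 30.65, 60.0, 60.0).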
Job ID: 61969
Step: LFv-sim
Status: aborted
Up to date: yes
Date started:
Date completed:
Duration: 0:30:04
Artefacts hidden.