Duckietown Challenges

Submission 9314

Submission: 9314
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jerome Labonte 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58239
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58239

Click the images to see detailed statistics about each episode.

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID 58239, step LFv-sim, status success, up to date: yes, duration 0:30:51
Artefacts hidden. If you are the author, please log in using the top-right link or use the dashboard.
driven_lanedir_consec_median: 5.597215610362803
survival_time_median: 58.54999999999881
deviation-center-line_median: 3.667435746445787
in-drivable-lane_median: 8.674999999999761


Other stats

agent_compute-ego0_max: 0.013881672496985575
agent_compute-ego0_mean: 0.012758680885240944
agent_compute-ego0_median: 0.012509422337979102
agent_compute-ego0_min: 0.012134206368019993
complete-iteration_max: 0.2048968075713349
complete-iteration_mean: 0.18143096070071324
complete-iteration_median: 0.18473730514226763
complete-iteration_min: 0.15135242494698273
deviation-center-line_max: 4.3857215721510885
deviation-center-line_mean: 3.32544651899513
deviation-center-line_min: 1.5811930109378547
deviation-heading_max: 22.983679004873395
deviation-heading_mean: 14.188053285155242
deviation-heading_median: 13.624619348090008
deviation-heading_min: 6.519295439567559
driven_any_max: 12.283007711678422
driven_any_mean: 10.60802423591451
driven_any_median: 11.215808675796364
driven_any_min: 7.717471880386884
driven_lanedir_consec_max: 6.949097787916864
driven_lanedir_consec_mean: 5.490788535566745
driven_lanedir_consec_min: 3.819625133624512
driven_lanedir_max: 11.38190251750487
driven_lanedir_mean: 9.021566038400394
driven_lanedir_median: 9.106152697023152
driven_lanedir_min: 6.492056242050406
get_duckie_state_max: 1.748289865246877e-06
get_duckie_state_mean: 1.39156186492325e-06
get_duckie_state_median: 1.2956193642850525e-06
get_duckie_state_min: 1.2267188658760184e-06
get_robot_state_max: 0.0039960608882551344
get_robot_state_mean: 0.003743812856305925
get_robot_state_median: 0.0036806557404756007
get_robot_state_min: 0.0036178790560173657
get_state_dump_max: 0.005175560331276775
get_state_dump_mean: 0.004743915327006082
get_state_dump_median: 0.004642059463545444
get_state_dump_min: 0.004515982049656665
get_ui_image_max: 0.034839424066599164
get_ui_image_mean: 0.03065442812094983
get_ui_image_median: 0.031230782833737536
get_ui_image_min: 0.0253167227497251
in-drivable-lane_max: 10.749999999999698
in-drivable-lane_mean: 7.974999999999763
in-drivable-lane_min: 3.7999999999998337
per-episodes details:
{
"LF-norm-loop-000-ego0": {"driven_any": 12.283007711678422, "get_ui_image": 0.027902038369349496, "step_physics": 0.1027148649357042, "survival_time": 59.99999999999873, "driven_lanedir": 11.38190251750487, "get_state_dump": 0.0045841224584650935, "get_robot_state": 0.003718989179295167, "sim_render-ego0": 0.003758571824860712, "get_duckie_state": 1.2635589142226857e-06, "in-drivable-lane": 3.7999999999998337, "deviation-heading": 15.55118747428988, "agent_compute-ego0": 0.012325172321087713, "complete-iteration": 0.16897876455226013, "set_robot_commands": 0.002262098406077821, "deviation-center-line": 3.8551709758486927, "driven_lanedir_consec": 6.949097787916864, "sim_compute_sim_state": 0.00961673031440881, "sim_compute_performance-ego0": 0.0020072414515715257},
"LF-norm-zigzag-000-ego0": {"driven_any": 10.211928171763825, "get_ui_image": 0.034839424066599164, "step_physics": 0.12914908398795782, "survival_time": 59.99999999999873, "driven_lanedir": 7.795405918859824, "get_state_dump": 0.004699996468625795, "get_robot_state": 0.0036178790560173657, "sim_render-ego0": 0.003746595906774567, "get_duckie_state": 1.3276798143474189e-06, "in-drivable-lane": 10.749999999999698, "deviation-heading": 22.983679004873395, "agent_compute-ego0": 0.012693672354870494, "complete-iteration": 0.2048968075713349, "set_robot_commands": 0.002207555937628067, "deviation-center-line": 4.3857215721510885, "driven_lanedir_consec": 3.819625133624512, "sim_compute_sim_state": 0.011865576935449706, "sim_compute_performance-ego0": 0.0019863547135352292},
"LF-norm-techtrack-000-ego0": {"driven_any": 7.717471880386884, "get_ui_image": 0.03455952729812557, "step_physics": 0.12171651221610404, "survival_time": 35.10000000000014, "driven_lanedir": 6.492056242050406, "get_state_dump": 0.005175560331276775, "get_robot_state": 0.0039960608882551344, "sim_render-ego0": 0.0040780439824501784, "get_duckie_state": 1.748289865246877e-06, "in-drivable-lane": 7.649999999999868, "deviation-heading": 6.519295439567559, "agent_compute-ego0": 0.013881672496985575, "complete-iteration": 0.2004958457322751, "set_robot_commands": 0.0025027449405719, "deviation-center-line": 1.5811930109378547, "driven_lanedir_consec": 5.4629385175293095, "sim_compute_sim_state": 0.012213131781153456, "sim_compute_performance-ego0": 0.00225923580261925},
"LF-norm-small_loop-000-ego0": {"driven_any": 12.219689179828904, "get_ui_image": 0.0253167227497251, "step_physics": 0.0916888125612354, "survival_time": 57.09999999999889, "driven_lanedir": 10.41689947518648, "get_state_dump": 0.004515982049656665, "get_robot_state": 0.003642322301656034, "sim_render-ego0": 0.0037637438986870874, "get_duckie_state": 1.2267188658760184e-06, "in-drivable-lane": 9.699999999999656, "deviation-heading": 11.698051221890134, "agent_compute-ego0": 0.012134206368019993, "complete-iteration": 0.15135242494698273, "set_robot_commands": 0.0022411100299131004, "deviation-center-line": 3.4797005170428825, "driven_lanedir_consec": 5.731492703196297, "sim_compute_sim_state": 0.005962932725173804, "sim_compute_performance-ego0": 0.0019997549182160946}
}
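The aggregate figures on this page are simple summaries over the four episodes. As a check, the reported driven_lanedir_consec_median can be reproduced from the per-episode details; a minimal sketch in Python, using the driven_lanedir_consec values copied from the JSON above:

```python
import statistics

# driven_lanedir_consec per episode, copied from the per-episode details above
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 6.949097787916864,
    "LF-norm-zigzag-000-ego0": 3.819625133624512,
    "LF-norm-techtrack-000-ego0": 5.4629385175293095,
    "LF-norm-small_loop-000-ego0": 5.731492703196297,
}

# With four episodes, the median is the mean of the two middle values,
# which matches the reported driven_lanedir_consec_median of 5.597215610362803.
median = statistics.median(driven_lanedir_consec.values())
print(median)
```

The same recipe (min/max/mean/median over the four episode values) reproduces the other aggregate rows.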
set_robot_commands_max: 0.0025027449405719
set_robot_commands_mean: 0.002303377328547722
set_robot_commands_median: 0.0022516042179954606
set_robot_commands_min: 0.002207555937628067
sim_compute_performance-ego0_max: 0.00225923580261925
sim_compute_performance-ego0_mean: 0.002063146721485525
sim_compute_performance-ego0_median: 0.00200349818489381
sim_compute_performance-ego0_min: 0.0019863547135352292
sim_compute_sim_state_max: 0.012213131781153456
sim_compute_sim_state_mean: 0.009914592939046445
sim_compute_sim_state_median: 0.010741153624929256
sim_compute_sim_state_min: 0.005962932725173804
sim_render-ego0_max: 0.0040780439824501784
sim_render-ego0_mean: 0.003836738903193136
sim_render-ego0_median: 0.0037611578617739
sim_render-ego0_min: 0.003746595906774567
simulation-passed: 1
step_physics_max: 0.12914908398795782
step_physics_mean: 0.11131731842525038
step_physics_median: 0.11221568857590414
step_physics_min: 0.0916888125612354
survival_time_max: 59.99999999999873
survival_time_mean: 53.04999999999912
survival_time_min: 35.10000000000014
Job ID 58217, step LFv-sim, status success, up to date: yes, duration 0:28:41
Artefacts hidden.