
Submission 11167

Submission: 11167
Competing: yes
Challenge: aido5-LF-sim-validation
User: Moustafa Elarabi
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 55519
Next:
User label: template-pytorch
Admin priority: 50
Blessing: n/a
User priority: 50

Job 55519

Episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for earlier versions of the challenge.
Job ID | step | status | up to date | date started | date completed | duration | message
55519 | LFv-sim | success | yes | | | 0:31:34 |
Artefacts hidden.
driven_lanedir_consec_median: 0.4815851961754022
survival_time_median: 42.62499999999971
deviation-center-line_median: 1.0098613597978954
in-drivable-lane_median: 27.274999999999608


other stats
agent_compute-ego0_max: 0.05029541369407408
agent_compute-ego0_mean: 0.03739151486107654
agent_compute-ego0_median: 0.0395721945309979
agent_compute-ego0_min: 0.020126256688236255
complete-iteration_max: 0.3681785410450351
complete-iteration_mean: 0.2976833374535869
complete-iteration_median: 0.28224625947702486
complete-iteration_min: 0.2580622898152627
deviation-center-line_max: 1.1508576054655049
deviation-center-line_mean: 0.9418763103362155
deviation-center-line_min: 0.5969249162835667
deviation-heading_max: 9.082964170945452
deviation-heading_mean: 5.608300750717598
deviation-heading_median: 5.080894811791719
deviation-heading_min: 3.1884492083415017
driven_any_max: 2.202699231259796
driven_any_mean: 1.5449584642569514
driven_any_median: 1.457888940709605
driven_any_min: 1.0613567443487992
driven_lanedir_consec_max: 0.6417410672665989
driven_lanedir_consec_mean: 0.4557825992970111
driven_lanedir_consec_min: 0.21821893757064115
driven_lanedir_max: 0.6417410672665989
driven_lanedir_mean: 0.4557825992970111
driven_lanedir_median: 0.4815851961754022
driven_lanedir_min: 0.21821893757064115
get_duckie_state_max: 1.345911333637853e-06
get_duckie_state_mean: 1.2596133063441274e-06
get_duckie_state_median: 1.275114349779871e-06
get_duckie_state_min: 1.1423131921789148e-06
get_robot_state_max: 0.0037792404121327625
get_robot_state_mean: 0.0035378930625026275
get_robot_state_median: 0.003522174167121133
get_robot_state_min: 0.0033279835036354797
get_state_dump_max: 0.004663060313073274
get_state_dump_mean: 0.0044595382860222535
get_state_dump_median: 0.004461237514581041
get_state_dump_min: 0.004252617801853659
get_ui_image_max: 0.03455080563022244
get_ui_image_mean: 0.028960411991288022
get_ui_image_median: 0.02799701896214926
get_ui_image_min: 0.025296804410631115
in-drivable-lane_max: 50.99999999999894
in-drivable-lane_mean: 29.537499999999586
in-drivable-lane_min: 12.60000000000018
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.202699231259796, "get_ui_image": 0.025296804410631115, "step_physics": 0.17989099254020544, "survival_time": 59.99999999999873, "driven_lanedir": 0.21821893757064115, "get_state_dump": 0.004252617801853659, "get_robot_state": 0.0033279835036354797, "sim_render-ego0": 0.003435574403710409, "get_duckie_state": 1.2317962392382977e-06, "in-drivable-lane": 50.99999999999894, "deviation-heading": 6.487915056884767, "agent_compute-ego0": 0.029037822196128268, "complete-iteration": 0.2580622898152627, "set_robot_commands": 0.0021735700739114907, "deviation-center-line": 0.9927052965714216, "driven_lanedir_consec": 0.21821893757064115, "sim_compute_sim_state": 0.00880761071109057, "sim_compute_performance-ego0": 0.0017599984072924255}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.272252525709817, "get_ui_image": 0.03455080563022244, "step_physics": 0.2564291261857556, "survival_time": 30.950000000000305, "driven_lanedir": 0.3516373155816064, "get_state_dump": 0.004568658336516349, "get_robot_state": 0.003624090840739589, "sim_render-ego0": 0.0037717911504930065, "get_duckie_state": 1.345911333637853e-06, "in-drivable-lane": 19.25000000000024, "deviation-heading": 9.082964170945452, "agent_compute-ego0": 0.05029541369407408, "complete-iteration": 0.3681785410450351, "set_robot_commands": 0.002433245797311106, "deviation-center-line": 1.0270174230243694, "driven_lanedir_consec": 0.3516373155816064, "sim_compute_sim_state": 0.01039338650241975, "sim_compute_performance-ego0": 0.002022845898905108}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.0613567443487992, "get_ui_image": 0.0294062361425283, "step_physics": 0.21492275113774992, "survival_time": 31.800000000000317, "driven_lanedir": 0.6417410672665989, "get_state_dump": 0.004353816692645733, "get_robot_state": 0.0034202574935026783, "sim_render-ego0": 0.0036012871860522117, "get_duckie_state": 1.1423131921789148e-06, "in-drivable-lane": 12.60000000000018, "deviation-heading": 3.67387456669867, "agent_compute-ego0": 0.020126256688236255, "complete-iteration": 0.2881248379053089, "set_robot_commands": 0.0021865940543134323, "deviation-center-line": 0.5969249162835667, "driven_lanedir_consec": 0.6417410672665989, "sim_compute_sim_state": 0.008200473650658336, "sim_compute_performance-ego0": 0.001825166834018294}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.643525355709393, "get_ui_image": 0.026587801781770223, "step_physics": 0.1774105731571946, "survival_time": 53.4499999999991, "driven_lanedir": 0.611533076769198, "get_state_dump": 0.004663060313073274, "get_robot_state": 0.0037792404121327625, "sim_render-ego0": 0.0038819437829133506, "get_duckie_state": 1.3184324603214443e-06, "in-drivable-lane": 35.299999999998974, "deviation-heading": 3.1884492083415017, "agent_compute-ego0": 0.05010656686586754, "complete-iteration": 0.2763676810487409, "set_robot_commands": 0.0025043066416945412, "deviation-center-line": 1.1508576054655049, "driven_lanedir_consec": 0.611533076769198, "sim_compute_sim_state": 0.005314857046180796, "sim_compute_performance-ego0": 0.0020277861122773074}}
set_robot_commands_max: 0.0025043066416945412
set_robot_commands_mean: 0.0023244291418076427
set_robot_commands_median: 0.0023099199258122694
set_robot_commands_min: 0.0021735700739114907
sim_compute_performance-ego0_max: 0.0020277861122773074
sim_compute_performance-ego0_mean: 0.001908949313123284
sim_compute_performance-ego0_median: 0.001924006366461701
sim_compute_performance-ego0_min: 0.0017599984072924255
sim_compute_sim_state_max: 0.01039338650241975
sim_compute_sim_state_mean: 0.008179081977587363
sim_compute_sim_state_median: 0.008504042180874454
sim_compute_sim_state_min: 0.005314857046180796
sim_render-ego0_max: 0.0038819437829133506
sim_render-ego0_mean: 0.0036726491307922447
sim_render-ego0_median: 0.00368653916827261
sim_render-ego0_min: 0.003435574403710409
simulation-passed: 1
step_physics_max: 0.2564291261857556
step_physics_mean: 0.2071633607552264
step_physics_median: 0.19740687183897768
step_physics_min: 0.1774105731571946
survival_time_max: 59.99999999999873
survival_time_mean: 44.04999999999961
survival_time_min: 30.950000000000305
No reset possible
55481 | LFv-sim | success | yes | | | 0:16:59 |
Artefacts hidden.
No reset possible