
Submission 12638

Submission: 12638
Competing: yes
Challenge: aido5-LF-sim-validation
User: Bea Baselines 🐤
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54065
Next:
User label: template-pytorch
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54065

Episodes evaluated:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 54065, step: LFv-sim, status: success, up to date: yes, duration: 0:40:40
Artefacts hidden.
driven_lanedir_consec_median: 2.1134117085216153
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.1743090991350376
in-drivable-lane_median: 33.74999999999912


other stats
agent_compute-ego0_max: 0.013084527356340725
agent_compute-ego0_mean: 0.012239597371773952
agent_compute-ego0_median: 0.012047353434026688
agent_compute-ego0_min: 0.011779155262701716
complete-iteration_max: 0.30194475926725595
complete-iteration_mean: 0.2542346877221958
complete-iteration_median: 0.2568461025485786
complete-iteration_min: 0.20130178652437009
deviation-center-line_max: 2.2452540190555976
deviation-center-line_mean: 1.888190043849676
deviation-center-line_min: 0.9588879580730324
deviation-heading_max: 23.080281126832592
deviation-heading_mean: 18.09234672354862
deviation-heading_median: 19.81965946121704
deviation-heading_min: 9.649786844927805
driven_any_max: 8.064697851417009
driven_any_mean: 7.870280647311227
driven_any_median: 7.84097061961888
driven_any_min: 7.734483498590139
driven_lanedir_consec_max: 2.829246171663354
driven_lanedir_consec_mean: 1.9192727803484977
driven_lanedir_consec_min: 0.6210215326874056
driven_lanedir_max: 2.829246171663354
driven_lanedir_mean: 1.9489697247571385
driven_lanedir_median: 2.1221061602043982
driven_lanedir_min: 0.7224204069564031
get_duckie_state_max: 1.8335004134737976e-06
get_duckie_state_mean: 1.677466272612992e-06
get_duckie_state_median: 1.6492768985643474e-06
get_duckie_state_min: 1.5778108798494744e-06
get_robot_state_max: 0.003560460675864494
get_robot_state_mean: 0.003498243551071637
get_robot_state_median: 0.0034958010013653377
get_robot_state_min: 0.0034409115256913794
get_state_dump_max: 0.004413450290321013
get_state_dump_mean: 0.0043289307055127905
get_state_dump_median: 0.0043198182918348475
get_state_dump_min: 0.004262635948060454
get_ui_image_max: 0.03501311805623457
get_ui_image_mean: 0.02984699554784808
get_ui_image_median: 0.02972552043809978
get_ui_image_min: 0.024923823258958193
in-drivable-lane_max: 49.59999999999882
in-drivable-lane_mean: 36.4249999999991
in-drivable-lane_min: 28.59999999999936
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 8.064697851417009, "get_ui_image": 0.0276271705325696, "step_physics": 0.1757225386804585, "survival_time": 59.99999999999873, "driven_lanedir": 2.384279618495936, "get_state_dump": 0.004369218383204629, "get_robot_state": 0.003560460675864494, "sim_render-ego0": 0.00362234052075236, "get_duckie_state": 1.6917594763559665e-06, "in-drivable-lane": 32.29999999999922, "deviation-heading": 20.240111522920376, "agent_compute-ego0": 0.01207009402044012, "complete-iteration": 0.23887992441207545, "set_robot_commands": 0.0021453640244584794, "deviation-center-line": 2.1084650585294424, "driven_lanedir_consec": 2.384279618495936, "sim_compute_sim_state": 0.007812861300427947, "sim_compute_performance-ego0": 0.0018749848492040323},
"LF-norm-zigzag-000-ego0": {"driven_any": 7.803139442227, "get_ui_image": 0.03501311805623457, "step_physics": 0.22744252917967073, "survival_time": 59.99999999999873, "driven_lanedir": 1.8599327019128604, "get_state_dump": 0.004270418200465066, "get_robot_state": 0.0034409115256913794, "sim_render-ego0": 0.0035220550359238395, "get_duckie_state": 1.5778108798494744e-06, "in-drivable-lane": 35.19999999999901, "deviation-heading": 19.399207399513703, "agent_compute-ego0": 0.012024612847613256, "complete-iteration": 0.30194475926725595, "set_robot_commands": 0.002071823704550407, "deviation-center-line": 2.2452540190555976, "driven_lanedir_consec": 1.8425437985472943, "sim_compute_sim_state": 0.01228164415573895, "sim_compute_performance-ego0": 0.0018063641706176049},
"LF-norm-techtrack-000-ego0": {"driven_any": 7.734483498590139, "get_ui_image": 0.03182387034362996, "step_physics": 0.2051513337969879, "survival_time": 59.99999999999873, "driven_lanedir": 2.829246171663354, "get_state_dump": 0.004413450290321013, "get_robot_state": 0.0035345768749862785, "sim_render-ego0": 0.0036207249916959663, "get_duckie_state": 1.8335004134737976e-06, "in-drivable-lane": 28.59999999999936, "deviation-heading": 23.080281126832592, "agent_compute-ego0": 0.013084527356340725, "complete-iteration": 0.2748122806850818, "set_robot_commands": 0.002175764676236193, "deviation-center-line": 2.2401531397406327, "driven_lanedir_consec": 2.829246171663354, "sim_compute_sim_state": 0.009035606368396006, "sim_compute_performance-ego0": 0.001896749825203647},
"LF-norm-small_loop-000-ego0": {"driven_any": 7.878801797010761, "get_ui_image": 0.024923823258958193, "step_physics": 0.14360672051860926, "survival_time": 59.99999999999873, "driven_lanedir": 0.7224204069564031, "get_state_dump": 0.004262635948060454, "get_robot_state": 0.003457025127744397, "sim_render-ego0": 0.0035089525354593423, "get_duckie_state": 1.6067943207727284e-06, "in-drivable-lane": 49.59999999999882, "deviation-heading": 9.649786844927805, "agent_compute-ego0": 0.011779155262701716, "complete-iteration": 0.20130178652437009, "set_robot_commands": 0.0020734193819349355, "deviation-center-line": 0.9588879580730324, "driven_lanedir_consec": 0.6210215326874056, "sim_compute_sim_state": 0.005811897940083805, "sim_compute_performance-ego0": 0.0018070311867922767}}
set_robot_commands_max: 0.002175764676236193
set_robot_commands_mean: 0.002116592946795004
set_robot_commands_median: 0.002109391703196707
set_robot_commands_min: 0.002071823704550407
sim_compute_performance-ego0_max: 0.001896749825203647
sim_compute_performance-ego0_mean: 0.0018462825079543904
sim_compute_performance-ego0_median: 0.0018410080179981545
sim_compute_performance-ego0_min: 0.0018063641706176049
sim_compute_sim_state_max: 0.01228164415573895
sim_compute_sim_state_mean: 0.008735502441161678
sim_compute_sim_state_median: 0.008424233834411977
sim_compute_sim_state_min: 0.005811897940083805
sim_render-ego0_max: 0.00362234052075236
sim_render-ego0_mean: 0.003568518270957877
sim_render-ego0_median: 0.0035713900138099027
sim_render-ego0_min: 0.0035089525354593423
simulation-passed: 1
step_physics_max: 0.22744252917967073
step_physics_mean: 0.1879807805439316
step_physics_median: 0.1904369362387232
step_physics_min: 0.14360672051860926
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
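
The _min/_mean/_median/_max values above are aggregates over the four episodes listed in the per-episodes details. As a rough sketch (assuming the per-episodes JSON above has been saved to a file named per_episodes.json, a hypothetical filename), these aggregates can be recomputed with Python's standard library:

    import json
    import statistics

    # Load the per-episode details (hypothetical file containing the JSON shown above).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Gather each metric's values across the four episodes.
    per_metric = {}
    for episode, metrics in episodes.items():
        for name, value in metrics.items():
            per_metric.setdefault(name, []).append(value)

    # Recompute the min/mean/median/max aggregates reported above,
    # e.g. deviation-center-line_median or survival_time_mean.
    for name, values in sorted(per_metric.items()):
        print(f"{name}_min: {min(values)}")
        print(f"{name}_mean: {statistics.mean(values)}")
        print(f"{name}_median: {statistics.median(values)}")
        print(f"{name}_max: {max(values)}")

With four episodes the median is the average of the two middle values; for example, in-drivable-lane_median is about 33.75, the mean of roughly 32.3 (LF-norm-loop) and 35.2 (LF-norm-zigzag).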
Job ID: 54062, step: LFv-sim, status: success, up to date: yes, duration: 0:40:49
Artefacts hidden.