
Submission 12563

Submission: 12563
Competing: yes
Challenge: aido5-LF-sim-validation
User: Bea Baselines 🐤
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 53426
Next:
User label: straight
Admin priority: 50
Blessing: n/a
User priority: 50

Job 53426

[Episode thumbnails omitted; each image links to detailed statistics for that episode.]

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 53426 | step: LFv-sim | status: success | up to date: yes | date started: | date completed: | duration: 0:07:17 | message:
Artefacts hidden.
driven_lanedir_consec_median: 0.4679448379078439
survival_time_median: 9.099999999999994
deviation-center-line_median: 0.28010547392805896
in-drivable-lane_median: 5.3999999999999995
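Each of these headline values appears to be the median of the corresponding metric over the four evaluated episodes (see the per-episodes details below). For example, the per-episode survival times are approximately 20.15 s, 8.8 s, 7.5 s and 9.4 s; with four samples the median is the mean of the two middle values, (8.8 + 9.4) / 2 = 9.1 s, which matches survival_time_median up to floating-point rounding.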


other stats
agent_compute-ego0_max: 0.010609603871059956
agent_compute-ego0_mean: 0.010465710281549847
agent_compute-ego0_median: 0.010442245308815438
agent_compute-ego0_min: 0.010368746637508568
complete-iteration_max: 0.21633373007262496
complete-iteration_mean: 0.18576746491319152
complete-iteration_median: 0.18228029702758877
complete-iteration_min: 0.1621755355249637
deviation-center-line_max: 0.32072493910540395
deviation-center-line_mean: 0.25559979359844043
deviation-center-line_min: 0.14146328743223982
deviation-heading_max: 2.573443739016378
deviation-heading_mean: 1.5714288535293366
deviation-heading_median: 1.510197503673426
deviation-heading_min: 0.6918766677541162
driven_any_max: 3.1619666979301324
driven_any_mean: 1.7218199472807565
driven_any_median: 1.3303261886048383
driven_any_min: 1.064660713983217
driven_lanedir_consec_max: 0.6348088856547565
driven_lanedir_consec_mean: 0.46870191550881374
driven_lanedir_consec_min: 0.30410910056481066
driven_lanedir_max: 0.6348088856547565
driven_lanedir_mean: 0.46870191550881374
driven_lanedir_median: 0.4679448379078439
driven_lanedir_min: 0.30410910056481066
get_duckie_state_max: 1.4558877095137493e-06
get_duckie_state_mean: 1.3853645562767696e-06
get_duckie_state_median: 1.4048971388759277e-06
get_duckie_state_min: 1.2757762378414737e-06
get_robot_state_max: 0.003886801855904715
get_robot_state_mean: 0.003803510800974111
get_robot_state_median: 0.0038290733911516737
get_robot_state_min: 0.00366909456568838
get_state_dump_max: 0.004997380826839063
get_state_dump_mean: 0.004786021617668961
get_state_dump_median: 0.004794735146187667
get_state_dump_min: 0.004557235351461448
get_ui_image_max: 0.0376459622787217
get_ui_image_mean: 0.031248602043665422
get_ui_image_median: 0.030362797103141707
get_ui_image_min: 0.026622851689656574
in-drivable-lane_max: 17.10000000000013
in-drivable-lane_mean: 7.750000000000028
in-drivable-lane_min: 3.099999999999989
per-episodes details (JSON):
{"LF-norm-loop-000-ego0": {"driven_any": 3.1619666979301324, "get_ui_image": 0.02960693600154159, "step_physics": 0.11021558010932242, "survival_time": 20.15000000000015, "driven_lanedir": 0.30410910056481066, "get_state_dump": 0.004735172385036355, "get_robot_state": 0.003852306026043278, "sim_render-ego0": 0.004008052372696376, "get_duckie_state": 1.4558877095137493e-06, "in-drivable-lane": 17.10000000000013, "deviation-heading": 2.2006053649220667, "agent_compute-ego0": 0.010443676816354884, "complete-iteration": 0.1777373475603538, "set_robot_commands": 0.0022772521075635852, "deviation-center-line": 0.32072493910540395, "driven_lanedir_consec": 0.30410910056481066, "sim_compute_sim_state": 0.010427498581385848, "sim_compute_performance-ego0": 0.002078334293743171}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.2799908315657689, "get_ui_image": 0.0376459622787217, "step_physics": 0.1403742294527043, "survival_time": 8.79999999999999, "driven_lanedir": 0.36310013377323225, "get_state_dump": 0.004854297907338978, "get_robot_state": 0.003805840756260069, "sim_render-ego0": 0.003936461809664797, "get_duckie_state": 1.4183884960109905e-06, "in-drivable-lane": 5.2999999999999945, "deviation-heading": 2.573443739016378, "agent_compute-ego0": 0.010609603871059956, "complete-iteration": 0.21633373007262496, "set_robot_commands": 0.002174104000888975, "deviation-center-line": 0.290227640897717, "driven_lanedir_consec": 0.36310013377323225, "sim_compute_sim_state": 0.01074156653409624, "sim_compute_performance-ego0": 0.002099070845350707}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.064660713983217, "get_ui_image": 0.031118658204741825, "step_physics": 0.1206447288690024, "survival_time": 7.499999999999981, "driven_lanedir": 0.6348088856547565, "get_state_dump": 0.004557235351461448, "get_robot_state": 0.00366909456568838, "sim_render-ego0": 0.003860269950715122, "get_duckie_state": 1.2757762378414737e-06, "in-drivable-lane": 3.099999999999989, "deviation-heading": 0.8197896424247854, "agent_compute-ego0": 0.010368746637508568, "complete-iteration": 0.18682324649482376, "set_robot_commands": 0.002319694354834146, "deviation-center-line": 0.14146328743223982, "driven_lanedir_consec": 0.6348088856547565, "sim_compute_sim_state": 0.008150448072825046, "sim_compute_performance-ego0": 0.0020482919074052215}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.3806615456439078, "get_ui_image": 0.026622851689656574, "step_physics": 0.10246950608712656, "survival_time": 9.4, "driven_lanedir": 0.5727895420424556, "get_state_dump": 0.004997380826839063, "get_robot_state": 0.003886801855904715, "sim_render-ego0": 0.003993406497612201, "get_duckie_state": 1.3914057817408648e-06, "in-drivable-lane": 5.500000000000004, "deviation-heading": 0.6918766677541162, "agent_compute-ego0": 0.01044081380127599, "complete-iteration": 0.1621755355249637, "set_robot_commands": 0.002301320827827252, "deviation-center-line": 0.26998330695840084, "driven_lanedir_consec": 0.5727895420424556, "sim_compute_sim_state": 0.005351824735207533, "sim_compute_performance-ego0": 0.002018922220462214}}
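The aggregate rows in this section can be recomputed from the per-episodes details above. Below is a minimal sketch, assuming the JSON object has been saved locally as details.json (a hypothetical file name; any way of loading the dictionary works) and using only the Python standard library:

    # Recompute min / median / mean / max of one metric over all episodes.
    # "details.json" is an assumed local copy of the per-episodes dictionary above.
    import json
    import statistics

    with open("details.json") as f:
        per_episode = json.load(f)  # episode name -> {metric name -> value}

    metric = "survival_time"
    values = [episode[metric] for episode in per_episode.values()]

    print("min:   ", min(values))                 # compare with survival_time_min
    print("median:", statistics.median(values))   # compare with survival_time_median
    print("mean:  ", statistics.mean(values))     # compare with survival_time_mean
    print("max:   ", max(values))                 # compare with survival_time_max

Changing the metric variable to, e.g., "deviation-center-line" or "driven_lanedir_consec" reproduces the corresponding aggregate rows in the same way.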
set_robot_commands_max: 0.002319694354834146
set_robot_commands_mean: 0.0022680928227784896
set_robot_commands_median: 0.0022892864676954186
set_robot_commands_min: 0.002174104000888975
sim_compute_performance-ego0_max: 0.002099070845350707
sim_compute_performance-ego0_mean: 0.0020611548167403285
sim_compute_performance-ego0_median: 0.0020633131005741963
sim_compute_performance-ego0_min: 0.002018922220462214
sim_compute_sim_state_max: 0.01074156653409624
sim_compute_sim_state_mean: 0.008667834480878666
sim_compute_sim_state_median: 0.009288973327105446
sim_compute_sim_state_min: 0.005351824735207533
sim_render-ego0_max: 0.004008052372696376
sim_render-ego0_mean: 0.0039495476576721245
sim_render-ego0_median: 0.003964934153638499
sim_render-ego0_min: 0.003860269950715122
simulation-passed: 1
step_physics_max: 0.1403742294527043
step_physics_mean: 0.11842601112953892
step_physics_median: 0.1154301544891624
step_physics_min: 0.10246950608712656
survival_time_max: 20.15000000000015
survival_time_mean: 11.46250000000003
survival_time_min: 7.499999999999981
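The remaining entries appear to be per-simulation-step wall-clock timings in seconds, aggregated over the four episodes. Consistent with that reading, each episode's complete-iteration time is roughly the sum of its individual phases; for LF-norm-small_loop-000, 0.1025 (step_physics) + 0.0266 (get_ui_image) + 0.0104 (agent_compute) + 0.0054 (sim_compute_sim_state) + 0.0050 (get_state_dump) + 0.0040 (sim_render) + 0.0039 (get_robot_state) + 0.0023 (set_robot_commands) + 0.0020 (sim_compute_performance) ≈ 0.162, matching that episode's complete-iteration value of about 0.1622.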
No reset possible
Job ID: 53415 | step: LFv-sim | status: success | up to date: yes | date started: | date completed: | duration: 0:07:18 | message:
Artefacts hidden.
No reset possible