
Submission 6847

Submission: 6847
Competing: yes
Challenge: aido5-LF-sim-validation
User: Liam Paull 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58527
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58527

Detailed per-episode statistics are available on the individual episode pages. Episodes in this job:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step    | status  | up to date | date started | date completed | duration | message
58527  | LFv-sim | success | yes        |              |                | 0:35:29  |
driven_lanedir_consec_median: 6.199799238659958
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.7318697333274047
in-drivable-lane_median: 4.149999999999846


other stats
agent_compute-ego0_max: 0.012766385058578505
agent_compute-ego0_mean: 0.012349342873054775
agent_compute-ego0_median: 0.012492470697598294
agent_compute-ego0_min: 0.011646045038444018
complete-iteration_max: 0.21488402328522976
complete-iteration_mean: 0.1850880195358016
complete-iteration_median: 0.18580335443164783
complete-iteration_min: 0.15386134599468096
deviation-center-line_max: 4.295744428109565
deviation-center-line_mean: 3.65381144917302
deviation-center-line_min: 2.855761901927705
deviation-heading_max: 12.094012130429492
deviation-heading_mean: 10.446082521816756
deviation-heading_median: 10.897658178073817
deviation-heading_min: 7.895001600689901
driven_any_max: 7.921119043226048
driven_any_mean: 7.919339129753699
driven_any_median: 7.920917289168868
driven_any_min: 7.914402897451014
driven_lanedir_consec_max: 7.163010239766431
driven_lanedir_consec_mean: 6.193762333278199
driven_lanedir_consec_min: 5.212440616026455
driven_lanedir_max: 7.163010239766431
driven_lanedir_mean: 6.977793770115135
driven_lanedir_median: 7.073748634320695
driven_lanedir_min: 6.6006675720527195
get_duckie_state_max: 1.3449507688701798e-06
get_duckie_state_mean: 1.288919425030533e-06
get_duckie_state_median: 1.291351254834025e-06
get_duckie_state_min: 1.2280244215839014e-06
get_robot_state_max: 0.003751948513059592
get_robot_state_mean: 0.003670221562191013
get_robot_state_median: 0.0037016383217931487
get_robot_state_min: 0.003525661092118161
get_state_dump_max: 0.004713501759512438
get_state_dump_mean: 0.004623681778316196
get_state_dump_median: 0.004621204984475929
get_state_dump_min: 0.004538815384800487
get_ui_image_max: 0.036022598400004796
get_ui_image_mean: 0.030649829615561987
get_ui_image_median: 0.03059550129702248
get_ui_image_min: 0.02538571746819819
in-drivable-lane_max: 6.499999999999719
in-drivable-lane_mean: 4.587499999999806
in-drivable-lane_min: 3.549999999999814
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 7.921119043226048, "get_ui_image": 0.028695074346639236, "step_physics": 0.10694603876309232, "survival_time": 59.99999999999873, "driven_lanedir": 7.096075236205976, "get_state_dump": 0.004632180576816784, "get_robot_state": 0.003751948513059592, "sim_render-ego0": 0.0038080985698969935, "get_duckie_state": 1.3276798143474189e-06, "in-drivable-lane": 4.199999999999779, "deviation-heading": 7.895001600689901, "agent_compute-ego0": 0.012439364299091272, "complete-iteration": 0.17367412803770602, "set_robot_commands": 0.00225250985005019, "deviation-center-line": 3.3524205782647907, "driven_lanedir_consec": 5.212440616026455, "sim_compute_sim_state": 0.00906638440045588, "sim_compute_performance-ego0": 0.0019935422098507592},
 "LF-norm-zigzag-000-ego0": {"driven_any": 7.920951171276801, "get_ui_image": 0.036022598400004796, "step_physics": 0.137030432762254, "survival_time": 59.99999999999873, "driven_lanedir": 7.163010239766431, "get_state_dump": 0.004713501759512438, "get_robot_state": 0.003677878947579593, "sim_render-ego0": 0.003810291584087947, "get_duckie_state": 1.3449507688701798e-06, "in-drivable-lane": 3.549999999999814, "deviation-heading": 11.268760427572834, "agent_compute-ego0": 0.012766385058578505, "complete-iteration": 0.21488402328522976, "set_robot_commands": 0.002195243136670369, "deviation-center-line": 4.111318888390018, "driven_lanedir_consec": 7.163010239766431, "sim_compute_sim_state": 0.01255007409533295, "sim_compute_performance-ego0": 0.002029421327513124},
 "LF-norm-techtrack-000-ego0": {"driven_any": 7.914402897451014, "get_ui_image": 0.03249592824740573, "step_physics": 0.12481543622743478, "survival_time": 59.99999999999873, "driven_lanedir": 6.6006675720527195, "get_state_dump": 0.004610229392135074, "get_robot_state": 0.003725397696006705, "sim_render-ego0": 0.003811116222537229, "get_duckie_state": 1.2280244215839014e-06, "in-drivable-lane": 6.499999999999719, "deviation-heading": 12.094012130429492, "agent_compute-ego0": 0.012545577096105315, "complete-iteration": 0.19793258082558968, "set_robot_commands": 0.0022522972386445133, "deviation-center-line": 2.855761901927705, "driven_lanedir_consec": 5.515701403811265, "sim_compute_sim_state": 0.011556723234953234, "sim_compute_performance-ego0": 0.0020334091313574136},
 "LF-norm-small_loop-000-ego0": {"driven_any": 7.920883407060934, "get_ui_image": 0.02538571746819819, "step_physics": 0.0951392692292759, "survival_time": 59.99999999999873, "driven_lanedir": 7.051422032435414, "get_state_dump": 0.004538815384800487, "get_robot_state": 0.003525661092118161, "sim_render-ego0": 0.0036420030061847263, "get_duckie_state": 1.2550226953206313e-06, "in-drivable-lane": 4.099999999999912, "deviation-heading": 10.526555928574796, "agent_compute-ego0": 0.011646045038444018, "complete-iteration": 0.15386134599468096, "set_robot_commands": 0.0021344855067930452, "deviation-center-line": 4.295744428109565, "driven_lanedir_consec": 6.88389707350865, "sim_compute_sim_state": 0.005900028841779393, "sim_compute_performance-ego0": 0.0018700211371708473}}
set_robot_commands_max: 0.00225250985005019
set_robot_commands_mean: 0.0022086339330395295
set_robot_commands_median: 0.002223770187657441
set_robot_commands_min: 0.0021344855067930452
sim_compute_performance-ego0_max: 0.0020334091313574136
sim_compute_performance-ego0_mean: 0.001981598451473036
sim_compute_performance-ego0_median: 0.0020114817686819417
sim_compute_performance-ego0_min: 0.0018700211371708473
sim_compute_sim_state_max: 0.01255007409533295
sim_compute_sim_state_mean: 0.009768302643130364
sim_compute_sim_state_median: 0.010311553817704556
sim_compute_sim_state_min: 0.005900028841779393
sim_render-ego0_max: 0.003811116222537229
sim_render-ego0_mean: 0.003767877345676724
sim_render-ego0_median: 0.00380919507699247
sim_render-ego0_min: 0.0036420030061847263
simulation-passed: 1
step_physics_max: 0.137030432762254
step_physics_mean: 0.11598279424551423
step_physics_median: 0.11588073749526356
step_physics_min: 0.0951392692292759
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
58524  | LFv-sim | success | yes        |              |                | 0:37:01  |
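
The *_min, *_mean, *_median and *_max entries reported for job 58527 are statistics over the four episodes listed in the per-episodes details block. As a rough illustration only, the following Python sketch recomputes them from that JSON; the file name per_episode_details.json is an assumption made for the example, not something the evaluator produces.

    # Minimal sketch (not part of the evaluator output): recompute the
    # aggregate statistics for job 58527 from its per-episodes details.
    # "per_episode_details.json" is a hypothetical file assumed to hold the
    # JSON object shown in the "per-episodes details" entry above.
    import json
    from statistics import mean, median

    with open("per_episode_details.json") as f:
        per_episode = json.load(f)  # {episode: {metric: value, ...}, ...}

    # Gather each metric's values across the four episodes.
    values_by_metric = {}
    for episode_stats in per_episode.values():
        for metric, value in episode_stats.items():
            values_by_metric.setdefault(metric, []).append(value)

    # The reported aggregates are these statistics taken over the episodes
    # (the median of four values averages the middle two).
    for metric, values in sorted(values_by_metric.items()):
        print(f"{metric}_min: {min(values)}")
        print(f"{metric}_mean: {mean(values)}")
        print(f"{metric}_median: {median(values)}")
        print(f"{metric}_max: {max(values)}")

Up to floating-point rounding, the output should match the values listed above, for example driven_any_median: 7.920917289168868.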