Duckietown Challenges

Submission 11627

Submission: 11627
Competing: yes
Challenge: aido5-LF-sim-validation
User: Philippe Reddy 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54232
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54232

Detailed per-episode statistics are available for the following episodes:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for earlier versions of the challenge.
Job ID | step    | status  | up to date | date started | date completed | duration | message
54232  | LFv-sim | success | yes        |              |                | 0:21:53  |
driven_lanedir_consec_median: 0.9486318133149862
survival_time_median: 35.50000000000012
deviation-center-line_median: 0.6023764859568255
in-drivable-lane_median: 13.675000000000075
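As the _median suffix indicates, these headline values are medians over the four evaluation episodes of this job: for example, the per-episode survival times in the per-episodes details below are roughly 37.35, 7.95, 33.65 and 60.00 seconds, whose median is 35.5. A minimal check in Python (the values are copied by hand from the per-episodes details; nothing here is produced by the evaluation pipeline itself):

    from statistics import median

    # Survival times for LF-norm-loop, LF-norm-zigzag, LF-norm-techtrack and
    # LF-norm-small_loop, rounded from the per-episodes details below.
    survival_times = [37.35, 7.95, 33.65, 60.00]

    # With four episodes, the median is the mean of the two middle values.
    print(median(survival_times))  # 35.5, matching survival_time_median above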


other stats
agent_compute-ego0_max: 0.012858593853008148
agent_compute-ego0_mean: 0.01229438033248288
agent_compute-ego0_median: 0.01231861684229012
agent_compute-ego0_min: 0.01168169379234314
complete-iteration_max: 0.25258309692586567
complete-iteration_mean: 0.23260918111341455
complete-iteration_median: 0.23104142802492167
complete-iteration_min: 0.2157707714779492
deviation-center-line_max: 2.220226081760883
deviation-center-line_mean: 0.9134054322130866
deviation-center-line_min: 0.2286426751778122
deviation-heading_max: 10.00956841184108
deviation-heading_mean: 4.6825517524505935
deviation-heading_median: 3.5321264998793813
deviation-heading_min: 1.6563855982025288
driven_any_max: 7.917533349122718
driven_any_mean: 4.494483768423433
driven_any_median: 4.571508934239677
driven_any_min: 0.9173838560916604
driven_lanedir_consec_max: 2.4217018412430344
driven_lanedir_consec_mean: 1.198906880356693
driven_lanedir_consec_min: 0.47666205355376423
driven_lanedir_max: 4.326667730131297
driven_lanedir_mean: 1.6751483525787585
driven_lanedir_median: 0.9486318133149862
driven_lanedir_min: 0.47666205355376423
get_duckie_state_max: 2.115168035986589e-06
get_duckie_state_mean: 1.933767051530775e-06
get_duckie_state_median: 1.947917910004169e-06
get_duckie_state_min: 1.724064350128174e-06
get_robot_state_max: 0.0037378685162724502
get_robot_state_mean: 0.0036552500655829224
get_robot_state_median: 0.0036948122698096334
get_robot_state_min: 0.003493507206439972
get_state_dump_max: 0.004652552107438684
get_state_dump_mean: 0.0045872671920584255
get_state_dump_median: 0.0046332908709572785
get_state_dump_min: 0.004429934918880463
get_ui_image_max: 0.03479988574981689
get_ui_image_mean: 0.031049106838918976
get_ui_image_median: 0.031296953444714025
get_ui_image_min: 0.02680263471643097
in-drivable-lane_max: 52.14999999999869
in-drivable-lane_mean: 20.437499999999687
in-drivable-lane_min: 2.2499999999999227
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 4.82057700171727, "get_ui_image": 0.02918714412393417, "step_physics": 0.14841975980901465, "survival_time": 37.350000000000016, "driven_lanedir": 4.326667730131297, "get_state_dump": 0.004652552107438684, "get_robot_state": 0.0036744504051412497, "sim_render-ego0": 0.003690688367833428, "get_duckie_state": 2.115168035986589e-06, "in-drivable-lane": 2.2499999999999227, "deviation-heading": 10.00956841184108, "agent_compute-ego0": 0.012555054164825276, "complete-iteration": 0.2157707714779492, "set_robot_commands": 0.002211232873845228, "deviation-center-line": 2.220226081760883, "driven_lanedir_consec": 2.4217018412430344, "sim_compute_sim_state": 0.009294304936964882, "sim_compute_performance-ego0": 0.0019941339518297165},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.9173838560916604, "get_ui_image": 0.03479988574981689, "step_physics": 0.16598474234342575, "survival_time": 7.94999999999998, "driven_lanedir": 0.47666205355376423, "get_state_dump": 0.004429934918880463, "get_robot_state": 0.003493507206439972, "sim_render-ego0": 0.003558121621608734, "get_duckie_state": 1.724064350128174e-06, "in-drivable-lane": 3.5499999999999874, "deviation-heading": 1.6563855982025288, "agent_compute-ego0": 0.01168169379234314, "complete-iteration": 0.2380747139453888, "set_robot_commands": 0.002150985598564148, "deviation-center-line": 0.2286426751778122, "driven_lanedir_consec": 0.47666205355376423, "sim_compute_sim_state": 0.010013227164745332, "sim_compute_performance-ego0": 0.001880398392677307},
 "LF-norm-techtrack-000-ego0": {"driven_any": 4.322440866762085, "get_ui_image": 0.033406762765493876, "step_physics": 0.17842925656089442, "survival_time": 33.650000000000226, "driven_lanedir": 1.079844966662735, "get_state_dump": 0.004638157895481551, "get_robot_state": 0.003715174134478017, "sim_render-ego0": 0.0038150124450819422, "get_duckie_state": 1.8931991622426744e-06, "in-drivable-lane": 23.800000000000164, "deviation-heading": 3.61893800455609, "agent_compute-ego0": 0.012858593853008148, "complete-iteration": 0.25258309692586567, "set_robot_commands": 0.0022325607718629015, "deviation-center-line": 0.688264629293359, "driven_lanedir_consec": 1.079844966662735, "sim_compute_sim_state": 0.01138730926400476, "sim_compute_performance-ego0": 0.0020122078830481283},
 "LF-norm-small_loop-000-ego0": {"driven_any": 7.917533349122718, "get_ui_image": 0.02680263471643097, "step_physics": 0.1624084450025344, "survival_time": 59.99999999999873, "driven_lanedir": 0.8174186599672375, "get_state_dump": 0.004628423846433006, "get_robot_state": 0.0037378685162724502, "sim_render-ego0": 0.003802913908755948, "get_duckie_state": 2.002636657765664e-06, "in-drivable-lane": 52.14999999999869, "deviation-heading": 3.4453149952026725, "agent_compute-ego0": 0.01208217951975496, "complete-iteration": 0.22400814210445455, "set_robot_commands": 0.002237967309308588, "deviation-center-line": 0.5164883426202921, "driven_lanedir_consec": 0.8174186599672375, "sim_compute_sim_state": 0.006216547272783037, "sim_compute_performance-ego0": 0.0020047232669954196}}
set_robot_commands_max: 0.002237967309308588
set_robot_commands_mean: 0.002208186638395216
set_robot_commands_median: 0.0022218968228540645
set_robot_commands_min: 0.002150985598564148
sim_compute_performance-ego0_max: 0.0020122078830481283
sim_compute_performance-ego0_mean: 0.001972865873637643
sim_compute_performance-ego0_median: 0.0019994286094125683
sim_compute_performance-ego0_min: 0.001880398392677307
sim_compute_sim_state_max: 0.01138730926400476
sim_compute_sim_state_mean: 0.009227847159624504
sim_compute_sim_state_median: 0.009653766050855106
sim_compute_sim_state_min: 0.006216547272783037
sim_render-ego0_max: 0.0038150124450819422
sim_render-ego0_mean: 0.003716684085820013
sim_render-ego0_median: 0.003746801138294688
sim_render-ego0_min: 0.003558121621608734
simulation-passed: 1
step_physics_max: 0.17842925656089442
step_physics_mean: 0.1638105509289673
step_physics_median: 0.16419659367298006
step_physics_min: 0.14841975980901465
survival_time_max: 59.99999999999873
survival_time_mean: 34.737499999999734
survival_time_min: 7.94999999999998
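The aggregate rows above (min / mean / median / max for each metric) can be cross-checked against the raw per-episode numbers. A minimal sketch in Python, assuming the JSON object from "per-episodes details" has been saved locally as per_episodes.json (the file name is an assumption, not something provided by this page):

    import json
    from statistics import mean, median

    # Assumed local copy of the "per-episodes details" JSON object above.
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Collect every metric across the four episodes, then print the same
    # min / mean / median / max aggregates as in the stats listed above.
    metrics = {}
    for episode in episodes.values():
        for key, value in episode.items():
            metrics.setdefault(key, []).append(value)

    for key, values in sorted(metrics.items()):
        print(f"{key}_min: {min(values)}")
        print(f"{key}_mean: {mean(values)}")
        print(f"{key}_median: {median(values)}")
        print(f"{key}_max: {max(values)}")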
54230  | LFv-sim | success | yes        |              |                | 0:29:50  |