
Submission 10861

Submission: 10861
Competing: yes
Challenge: aido5-LF-sim-validation
User: Daniil Lisus
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57721
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

57721

Click the images to see detailed statistics about each episode.

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of the challenge.
Job ID | step    | status  | up to date | date started | date completed | duration | message
57721  | LFv-sim | success | yes        |              |                | 0:30:01  |
Artefacts hidden.
driven_lanedir_consec_median: 5.449470962389439
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.2559747482356167
in-drivable-lane_median: 14.97499999999972
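These headline values appear to be medians over the four episodes listed in the per-episodes details below. As a check on driven_lanedir_consec_median: the four per-episode values are 0.4142392936982193, 3.2148643855897316, 7.684077539189147, and 8.499473783541035, so the median (the average of the two middle values) is (3.2148643855897316 + 7.684077539189147) / 2 ≈ 5.4494709623894, matching the value reported above.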


other stats
agent_compute-ego0_max: 0.03406216004408964
agent_compute-ego0_mean: 0.02248948565114299
agent_compute-ego0_median: 0.021871797861430568
agent_compute-ego0_min: 0.012152186837621176
complete-iteration_max: 0.23328477556163524
complete-iteration_mean: 0.1993216715203169
complete-iteration_median: 0.19024158169288224
complete-iteration_min: 0.1835187471338681
deviation-center-line_max: 4.627435254493809
deviation-center-line_mean: 2.9468997534181893
deviation-center-line_min: 0.6482142627077155
deviation-heading_max: 25.8619225109968
deviation-heading_mean: 15.249386514917475
deviation-heading_median: 15.772854021910504
deviation-heading_min: 3.5899155048520925
driven_any_max: 14.04649443241095
driven_any_mean: 10.097111103679095
driven_any_median: 12.337878458096508
driven_any_min: 1.666193066112415
driven_lanedir_consec_max: 8.499473783541035
driven_lanedir_consec_mean: 4.953163750504533
driven_lanedir_consec_min: 0.4142392936982193
driven_lanedir_max: 11.780648100972403
driven_lanedir_mean: 6.560594365071538
driven_lanedir_median: 6.886096116602257
driven_lanedir_min: 0.6895371261092342
get_duckie_state_max: 1.5454632895333423e-06
get_duckie_state_mean: 1.502625331365922e-06
get_duckie_state_median: 1.497709383873221e-06
get_duckie_state_min: 1.4696192681839028e-06
get_robot_state_max: 0.003929946345155384
get_robot_state_mean: 0.00382676115874354
get_robot_state_median: 0.003840140458646166
get_robot_state_min: 0.003696817372526442
get_state_dump_max: 0.004840738469615368
get_state_dump_mean: 0.0047244280410536806
get_state_dump_median: 0.0047409217820179456
get_state_dump_min: 0.004575130130563464
get_ui_image_max: 0.03655651050443753
get_ui_image_mean: 0.031559597490796495
get_ui_image_median: 0.031096974077804117
get_ui_image_min: 0.027487931303140225
in-drivable-lane_max: 25.04999999999927
in-drivable-lane_mean: 14.962499999999686
in-drivable-lane_min: 4.85000000000004
per-episodes details (see the aggregation sketch after this list):
{
  "LF-norm-loop-000-ego0": {"driven_any": 1.666193066112415, "get_ui_image": 0.02892138915402549, "step_physics": 0.11790930479764938, "survival_time": 11.150000000000023, "driven_lanedir": 0.6895371261092342, "get_state_dump": 0.004575130130563464, "get_robot_state": 0.003696817372526442, "sim_render-ego0": 0.003888787967818124, "get_duckie_state": 1.5454632895333423e-06, "in-drivable-lane": 4.85000000000004, "deviation-heading": 3.5899155048520925, "agent_compute-ego0": 0.012387193739414217, "complete-iteration": 0.1835187471338681, "set_robot_commands": 0.002291889062949589, "deviation-center-line": 0.6482142627077155, "driven_lanedir_consec": 0.4142392936982193, "sim_compute_sim_state": 0.007688357361725399, "sim_compute_performance-ego0": 0.002070106565952301},
  "LF-norm-zigzag-000-ego0": {"driven_any": 11.2261808466692, "get_ui_image": 0.03655651050443753, "step_physics": 0.13488531847182758, "survival_time": 59.99999999999873, "driven_lanedir": 4.488178509237835, "get_state_dump": 0.004733714731805628, "get_robot_state": 0.003902490688898879, "sim_render-ego0": 0.00411977001669802, "get_duckie_state": 1.5134914630060887e-06, "in-drivable-lane": 25.04999999999927, "deviation-heading": 25.8619225109968, "agent_compute-ego0": 0.03135640198344692, "complete-iteration": 0.23328477556163524, "set_robot_commands": 0.002358941015454752, "deviation-center-line": 3.0472860089825637, "driven_lanedir_consec": 3.2148643855897316, "sim_compute_sim_state": 0.013011456726988984, "sim_compute_performance-ego0": 0.0022549655018599206},
  "LF-norm-techtrack-000-ego0": {"driven_any": 14.04649443241095, "get_ui_image": 0.03327255900158275, "step_physics": 0.11679045882054312, "survival_time": 59.99999999999873, "driven_lanedir": 9.28401372396668, "get_state_dump": 0.004748128832230262, "get_robot_state": 0.003777790228393453, "sim_render-ego0": 0.0039176522047692395, "get_duckie_state": 1.4696192681839028e-06, "in-drivable-lane": 18.249999999999545, "deviation-heading": 19.23871950509587, "agent_compute-ego0": 0.012152186837621176, "complete-iteration": 0.1907333454224192, "set_robot_commands": 0.002230734948214643, "deviation-center-line": 3.4646634874886693, "driven_lanedir_consec": 7.684077539189147, "sim_compute_sim_state": 0.011724085533847222, "sim_compute_performance-ego0": 0.0020292148701257255},
  "LF-norm-small_loop-000-ego0": {"driven_any": 13.449576069523818, "get_ui_image": 0.027487931303140225, "step_physics": 0.10423147708152752, "survival_time": 59.99999999999873, "driven_lanedir": 11.780648100972403, "get_state_dump": 0.004840738469615368, "get_robot_state": 0.003929946345155384, "sim_render-ego0": 0.00406978191880759, "get_duckie_state": 1.481927304740353e-06, "in-drivable-lane": 11.699999999999893, "deviation-heading": 12.306988538725138, "agent_compute-ego0": 0.03406216004408964, "complete-iteration": 0.18974981796334525, "set_robot_commands": 0.0023845520543615386, "deviation-center-line": 4.627435254493809, "driven_lanedir_consec": 8.499473783541035, "sim_compute_sim_state": 0.0064790604215776, "sim_compute_performance-ego0": 0.0021656059801925926}
}
set_robot_commands_max: 0.0023845520543615386
set_robot_commands_mean: 0.0023165292702451304
set_robot_commands_median: 0.0023254150392021705
set_robot_commands_min: 0.002230734948214643
sim_compute_performance-ego0_max: 0.0022549655018599206
sim_compute_performance-ego0_mean: 0.002129973229532635
sim_compute_performance-ego0_median: 0.002117856273072447
sim_compute_performance-ego0_min: 0.0020292148701257255
sim_compute_sim_state_max: 0.013011456726988984
sim_compute_sim_state_mean: 0.009725740011034802
sim_compute_sim_state_median: 0.00970622144778631
sim_compute_sim_state_min: 0.0064790604215776
sim_render-ego0_max: 0.00411977001669802
sim_render-ego0_mean: 0.003998998027023243
sim_render-ego0_median: 0.003993717061788415
sim_render-ego0_min: 0.003888787967818124
simulation-passed: 1
step_physics_max: 0.13488531847182758
step_physics_mean: 0.1184541397928869
step_physics_median: 0.11734988180909624
step_physics_min: 0.10423147708152752
survival_time_max: 59.99999999999873
survival_time_mean: 47.78749999999905
survival_time_min: 11.150000000000023
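As a sanity check, the aggregate statistics above (min, max, mean, median) can be recomputed from the per-episodes details. The following is a minimal Python sketch, assuming the values are copied by hand from the per-episodes JSON; it shows driven_lanedir_consec only, and the same pattern applies to any other metric.

import statistics

# Per-episode driven_lanedir_consec values, copied from the
# "per-episodes details" JSON reported above.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 0.4142392936982193,
    "LF-norm-zigzag-000-ego0": 3.2148643855897316,
    "LF-norm-techtrack-000-ego0": 7.684077539189147,
    "LF-norm-small_loop-000-ego0": 8.499473783541035,
}

values = list(driven_lanedir_consec.values())
print("min:   ", min(values))                # 0.4142392936982193
print("max:   ", max(values))                # 8.499473783541035
print("mean:  ", statistics.mean(values))    # ~4.953163750504533
print("median:", statistics.median(values))  # ~5.449470962389439, the average of the two middle episodes

The four printed numbers match driven_lanedir_consec_min, _max, _mean, and _median in the statistics list above.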
No reset possible
57718  | LFv-sim | success | yes        |              |                | 0:27:39  |
Artefacts hidden.
No reset possible