
Submission 10858

Submission: 10858
Competing: yes
Challenge: aido5-LF-sim-validation
User: Daniil Lisus
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57730
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57730

Episodes evaluated (detailed per-episode statistics are linked from the episode images on the original page):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 57730
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:35:53
Artefacts hidden.
driven_lanedir_consec_median: 5.132029632130063
survival_time_median: 59.99999999999873
deviation-center-line_median: 4.753579188139365
in-drivable-lane_median: 7.649999999999826
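
The headline metrics above are aggregates over the four episodes of this job: each reported median is the median of the corresponding per-episode values listed under per-episodes further down. As a quick check, the following minimal Python sketch (not part of the evaluation output) reproduces the driven_lanedir_consec median from the per-episode values:

from statistics import median

# Per-episode driven_lanedir_consec values, copied from the per-episodes details below.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 5.887915737650246,
    "LF-norm-zigzag-000-ego0": 2.8794147172458784,
    "LF-norm-techtrack-000-ego0": 5.7753794562863,
    "LF-norm-small_loop-000-ego0": 4.488679807973826,
}

# With four episodes the median is the mean of the two middle values:
# (4.488679807973826 + 5.7753794562863) / 2 = 5.132029632130063
print(median(driven_lanedir_consec.values()))

Because there is an even number of episodes, the median is the average of the two middle episodes, which is why it does not coincide with any single episode's value.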


other stats
agent_compute-ego0_max: 0.03675323699932114
agent_compute-ego0_mean: 0.03046477038489807
agent_compute-ego0_median: 0.03532297694613594
agent_compute-ego0_min: 0.014459890647999293
complete-iteration_max: 0.23864859665164745
complete-iteration_mean: 0.20680687171476345
complete-iteration_median: 0.20074568154199385
complete-iteration_min: 0.18708752712341867
deviation-center-line_max: 5.01324023838048
deviation-center-line_mean: 4.171997518893974
deviation-center-line_min: 2.167591460916687
deviation-heading_max: 24.96358833668896
deviation-heading_mean: 17.66027608349192
deviation-heading_median: 19.066878450196477
deviation-heading_min: 7.543759096885768
driven_any_max: 8.965214409650184
driven_any_mean: 8.502539635456554
driven_any_median: 8.598377374376344
driven_any_min: 7.84818938342334
driven_lanedir_consec_max: 5.887915737650246
driven_lanedir_consec_mean: 4.757847429789063
driven_lanedir_consec_min: 2.8794147172458784
driven_lanedir_max: 7.777543026183331
driven_lanedir_mean: 6.795911241536561
driven_lanedir_median: 7.092752629893355
driven_lanedir_min: 5.220596680176202
get_duckie_state_max: 1.7646151121014063e-06
get_duckie_state_mean: 1.5888757175012811e-06
get_duckie_state_median: 1.5460432660389548e-06
get_duckie_state_min: 1.498801225825809e-06
get_robot_state_max: 0.004062754625484013
get_robot_state_mean: 0.003915290941154433
get_robot_state_median: 0.0039005645804361536
get_robot_state_min: 0.0037972799782614104
get_state_dump_max: 0.005022704650917021
get_state_dump_mean: 0.004877323453101969
get_state_dump_median: 0.00491726180099627
get_state_dump_min: 0.004652065559498314
get_ui_image_max: 0.0396599821206632
get_ui_image_mean: 0.033090732209433524
get_ui_image_median: 0.032505484903073534
get_ui_image_min: 0.02769197691092384
in-drivable-lane_max: 13.049999999999686
in-drivable-lane_mean: 8.599999999999842
in-drivable-lane_min: 6.050000000000029
per-episodes details (per-episode metrics as JSON; a parsing sketch follows these stats):
{"LF-norm-loop-000-ego0": {"driven_any": 8.869377208684767, "get_ui_image": 0.0313169962162777, "step_physics": 0.11543157019285635, "survival_time": 59.99999999999873, "driven_lanedir": 7.528314803935725, "get_state_dump": 0.005022704650917021, "get_robot_state": 0.004062754625484013, "sim_render-ego0": 0.004239451180489038, "get_duckie_state": 1.7646151121014063e-06, "in-drivable-lane": 6.599999999999827, "deviation-heading": 21.687938888992804, "agent_compute-ego0": 0.035726472201891284, "complete-iteration": 0.2113844228723861, "set_robot_commands": 0.0024950601576170656, "deviation-center-line": 4.575387277249996, "driven_lanedir_consec": 5.887915737650246, "sim_compute_sim_state": 0.010734106082900378, "sim_compute_performance-ego0": 0.0022540630448569265},
"LF-norm-zigzag-000-ego0": {"driven_any": 8.327377540067923, "get_ui_image": 0.0396599821206632, "step_physics": 0.1308975735870031, "survival_time": 59.99999999999873, "driven_lanedir": 5.220596680176202, "get_state_dump": 0.004927898823073464, "get_robot_state": 0.003970022503283498, "sim_render-ego0": 0.004235340891829339, "get_duckie_state": 1.56927466094742e-06, "in-drivable-lane": 13.049999999999686, "deviation-heading": 24.96358833668896, "agent_compute-ego0": 0.03675323699932114, "complete-iteration": 0.23864859665164745, "set_robot_commands": 0.002487198300008274, "deviation-center-line": 4.931771099028733, "driven_lanedir_consec": 2.8794147172458784, "sim_compute_sim_state": 0.013393699874687354, "sim_compute_performance-ego0": 0.0022269684905116506},
"LF-norm-techtrack-000-ego0": {"driven_any": 7.84818938342334, "get_ui_image": 0.03369397358986938, "step_physics": 0.1130404915045766, "survival_time": 41.1499999999998, "driven_lanedir": 6.657190455850985, "get_state_dump": 0.004652065559498314, "get_robot_state": 0.0037972799782614104, "sim_render-ego0": 0.004076706842311378, "get_duckie_state": 1.5228118711304897e-06, "in-drivable-lane": 8.699999999999825, "deviation-heading": 7.543759096885768, "agent_compute-ego0": 0.014459890647999293, "complete-iteration": 0.19010694021160163, "set_robot_commands": 0.0022971835529919968, "deviation-center-line": 2.167591460916687, "driven_lanedir_consec": 5.7753794562863, "sim_compute_sim_state": 0.011910316724221683, "sim_compute_performance-ego0": 0.002088078017373687},
"LF-norm-small_loop-000-ego0": {"driven_any": 8.965214409650184, "get_ui_image": 0.02769197691092384, "step_physics": 0.10113753764258138, "survival_time": 59.99999999999873, "driven_lanedir": 7.777543026183331, "get_state_dump": 0.004906624778919077, "get_robot_state": 0.0038311066575888097, "sim_render-ego0": 0.004026588254129757, "get_duckie_state": 1.498801225825809e-06, "in-drivable-lane": 6.050000000000029, "deviation-heading": 16.445818011400146, "agent_compute-ego0": 0.034919481690380595, "complete-iteration": 0.18708752712341867, "set_robot_commands": 0.002334177245903174, "deviation-center-line": 5.01324023838048, "driven_lanedir_consec": 4.488679807973826, "sim_compute_sim_state": 0.006105365403784403, "sim_compute_performance-ego0": 0.0020422822331310213}}
set_robot_commands_max: 0.0024950601576170656
set_robot_commands_mean: 0.0024034048141301275
set_robot_commands_median: 0.0024106877729557236
set_robot_commands_min: 0.0022971835529919968
sim_compute_performance-ego0_max: 0.0022540630448569265
sim_compute_performance-ego0_mean: 0.0021528479464683215
sim_compute_performance-ego0_median: 0.0021575232539426687
sim_compute_performance-ego0_min: 0.0020422822331310213
sim_compute_sim_state_max: 0.013393699874687354
sim_compute_sim_state_mean: 0.010535872021398454
sim_compute_sim_state_median: 0.01132221140356103
sim_compute_sim_state_min: 0.006105365403784403
sim_render-ego0_max: 0.004239451180489038
sim_render-ego0_mean: 0.004144521792189878
sim_render-ego0_median: 0.004156023867070358
sim_render-ego0_min: 0.004026588254129757
simulation-passed: 1
step_physics_max: 0.1308975735870031
step_physics_mean: 0.11512679323175436
step_physics_median: 0.11423603084871647
step_physics_min: 0.10113753764258138
survival_time_max: 59.99999999999873
survival_time_mean: 55.28749999999899
survival_time_min: 41.1499999999998
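
All of the per-metric min/max/mean/median rows above can be recomputed from the per-episodes JSON. Below is a minimal sketch, assuming that JSON object has been saved to a file named per_episodes.json (a hypothetical filename chosen for this example, not something the evaluator produces):

import json
from statistics import mean, median

# Load the per-episode metrics: {episode name: {metric name: value}}.
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Group each metric's values across the four episodes.
by_metric = {}
for metrics in episodes.values():
    for name, value in metrics.items():
        by_metric.setdefault(name, []).append(value)

# Print aggregates in the same style as the stats listed above.
for name, values in sorted(by_metric.items()):
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {mean(values)}")
    print(f"{name}_median: {median(values)}")
    print(f"{name}_min: {min(values)}")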
No reset possible
Job ID: 57726
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:29:55
Artefacts hidden.
No reset possible