
Submission 10009

Submission: 10009
Competing: yes
Challenge: aido5-LF-sim-validation
User: Mo Kleit 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58073
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58073

Episodes:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
58073 | LFv-sim | success | yes | | | 0:22:37 |
driven_lanedir_consec_median: 3.4758565102317975
survival_time_median: 40.34999999999944
deviation-center-line_median: 1.4693819943109867
in-drivable-lane_median: 7.699999999999948


Other stats
agent_compute-ego0_max: 0.013330224623163064
agent_compute-ego0_mean: 0.012959985361140071
agent_compute-ego0_median: 0.0129324761556646
agent_compute-ego0_min: 0.012644764510068029
complete-iteration_max: 0.19349206397178073
complete-iteration_mean: 0.18464994964007023
complete-iteration_median: 0.19121375553318076
complete-iteration_min: 0.16268022352213862
deviation-center-line_max: 3.161596131549446
deviation-center-line_mean: 1.6298951755512163
deviation-center-line_min: 0.4192205820334469
deviation-heading_max: 11.061255615016035
deviation-heading_mean: 5.858355018273468
deviation-heading_median: 5.394191738028022
deviation-heading_min: 1.5837809820217914
driven_any_max: 9.443458142433103
driven_any_mean: 4.77244536226513
driven_any_median: 4.427985324552647
driven_any_min: 0.7903526575221245
driven_lanedir_consec_max: 9.257373225701416
driven_lanedir_consec_mean: 4.153692866058962
driven_lanedir_consec_min: 0.4056852180708352
driven_lanedir_max: 9.257373225701416
driven_lanedir_mean: 4.153692866058962
driven_lanedir_median: 3.4758565102317975
driven_lanedir_min: 0.4056852180708352
get_duckie_state_max: 1.3839114796031603e-06
get_duckie_state_mean: 1.3605566372360654e-06
get_duckie_state_median: 1.3658942826880107e-06
get_duckie_state_min: 1.326526503965079e-06
get_robot_state_max: 0.0039453687119940534
get_robot_state_mean: 0.003835666001807311
get_robot_state_median: 0.00385360212440759
get_robot_state_min: 0.0036900910464200106
get_state_dump_max: 0.004925756629162486
get_state_dump_mean: 0.004835848382475649
get_state_dump_median: 0.004841761950542062
get_state_dump_min: 0.004734112999655983
get_ui_image_max: 0.03712527318434282
get_ui_image_mean: 0.0327101359206846
get_ui_image_median: 0.03287305750804337
get_ui_image_min: 0.02796915548230885
in-drivable-lane_max: 14.400000000000173
in-drivable-lane_mean: 7.450000000000017
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 1.9482776255134548, "get_ui_image": 0.03037903509944318, "step_physics": 0.12183831571096396, "survival_time": 20.70000000000016, "driven_lanedir": 0.9560583665847386, "get_state_dump": 0.00486746006701366, "get_robot_state": 0.00385005100663886, "sim_render-ego0": 0.00411036560334355, "get_duckie_state": 1.326526503965079e-06, "in-drivable-lane": 14.400000000000173, "deviation-heading": 1.7474913688432114, "agent_compute-ego0": 0.013330224623163064, "complete-iteration": 0.1930887348680611, "set_robot_commands": 0.0024280530860625116, "deviation-center-line": 0.6220257626082388, "driven_lanedir_consec": 0.9560583665847386, "sim_compute_sim_state": 0.010044197289340468, "sim_compute_performance-ego0": 0.0021507694060543935}, "LF-norm-zigzag-000-ego0": {"driven_any": 0.7903526575221245, "get_ui_image": 0.03712527318434282, "step_physics": 0.11260971481149848, "survival_time": 10.95000000000002, "driven_lanedir": 0.4056852180708352, "get_state_dump": 0.004734112999655983, "get_robot_state": 0.0036900910464200106, "sim_render-ego0": 0.003795757076957009, "get_duckie_state": 1.3839114796031603e-06, "in-drivable-lane": 5.9000000000000306, "deviation-heading": 1.5837809820217914, "agent_compute-ego0": 0.012644764510068029, "complete-iteration": 0.18933877619830045, "set_robot_commands": 0.002208756316791881, "deviation-center-line": 0.4192205820334469, "driven_lanedir_consec": 0.4056852180708352, "sim_compute_sim_state": 0.010491784052415328, "sim_compute_performance-ego0": 0.0019517725164240056}, "LF-norm-techtrack-000-ego0": {"driven_any": 6.907693023591838, "get_ui_image": 0.03536707991664356, "step_physics": 0.1132361420386042, "survival_time": 59.99999999999873, "driven_lanedir": 5.995654653878856, "get_state_dump": 0.004925756629162486, "get_robot_state": 0.0039453687119940534, "sim_render-ego0": 0.004183272735760869, "get_duckie_state": 1.3731401429188242e-06, "in-drivable-lane": 9.499999999999863, 
"deviation-heading": 9.040892107212832, "agent_compute-ego0": 0.013214319572162866, "complete-iteration": 0.19349206397178073, "set_robot_commands": 0.002410775517345368, "deviation-center-line": 3.161596131549446, "driven_lanedir_consec": 5.995654653878856, "sim_compute_sim_state": 0.013882351953917797, "sim_compute_performance-ego0": 0.002231342012340282}, "LF-norm-small_loop-000-ego0": {"driven_any": 9.443458142433103, "get_ui_image": 0.02796915548230885, "step_physics": 0.09842490097763736, "survival_time": 59.99999999999873, "driven_lanedir": 9.257373225701416, "get_state_dump": 0.004816063834070464, "get_robot_state": 0.00385715324217632, "sim_render-ego0": 0.004015180887925834, "get_duckie_state": 1.3586484224571972e-06, "in-drivable-lane": 0.0, "deviation-heading": 11.061255615016035, "agent_compute-ego0": 0.012650632739166336, "complete-iteration": 0.16268022352213862, "set_robot_commands": 0.002333967016698121, "deviation-center-line": 2.3167382260137344, "driven_lanedir_consec": 9.257373225701416, "sim_compute_sim_state": 0.006439028731194464, "sim_compute_performance-ego0": 0.0020859870783593833}}
set_robot_commands_max: 0.0024280530860625116
set_robot_commands_mean: 0.00234538798422447
set_robot_commands_median: 0.0023723712670217444
set_robot_commands_min: 0.002208756316791881
sim_compute_performance-ego0_max: 0.002231342012340282
sim_compute_performance-ego0_mean: 0.0021049677532945163
sim_compute_performance-ego0_median: 0.0021183782422068884
sim_compute_performance-ego0_min: 0.0019517725164240056
sim_compute_sim_state_max: 0.013882351953917797
sim_compute_sim_state_mean: 0.010214340506717014
sim_compute_sim_state_median: 0.010267990670877898
sim_compute_sim_state_min: 0.006439028731194464
sim_render-ego0_max: 0.004183272735760869
sim_render-ego0_mean: 0.004026144075996815
sim_render-ego0_median: 0.004062773245634692
sim_render-ego0_min: 0.003795757076957009
simulation-passed: 1
step_physics_max: 0.12183831571096396
step_physics_mean: 0.111527268384676
step_physics_median: 0.11292292842505132
step_physics_min: 0.09842490097763736
survival_time_max: 59.99999999999873
survival_time_mean: 37.912499999999405
survival_time_min: 10.95000000000002
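The per-iteration timing rows above (`complete-iteration`, `step_physics`, `agent_compute-ego0`, ...) are in seconds of wall time per simulation step. A quick back-of-the-envelope sketch turning the mean iteration time into an effective simulation rate (the "rate" label is mine, not a metric reported by the platform):

```python
# Mean wall time of one full simulation iteration, from the stats above
complete_iteration_mean = 0.18464994964007023  # seconds per iteration

# Implied throughput: roughly 5.4 iterations per second
rate_hz = 1.0 / complete_iteration_mean
print(f"{rate_hz:.2f} iterations/s")
```

Physics stepping (`step_physics`, ~0.11 s) dominates the iteration time; the agent itself (`agent_compute-ego0`, ~0.013 s) is comparatively cheap.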
58070 | LFv-sim | success | yes | | | 0:19:41 |