
Submission 11274

Submission: 11274
Competing: yes
Challenge: aido5-LF-sim-validation
User: Andrea Censi 🇨🇭
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 55421
Next:
User label: template-random
Admin priority: 50
Blessing: n/a
User priority: 50

Job 55421

Evaluated episodes (detailed per-episode statistics are available for each):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
55421 | LFv-sim | success | yes | | | 0:05:23 |
Artefacts hidden (visible to the submission author only).
driven_lanedir_consec_median: 0.5040495128145347
survival_time_median: 4.6499999999999915
deviation-center-line_median: 0.10620228737371736
in-drivable-lane_median: 2.674999999999992
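These headline scores are the medians of the corresponding per-episode values over the four evaluation episodes. For example, the per-episode survival times listed further below are 7.85, 4.55, 3.55, and 4.75, so the median is (4.55 + 4.75) / 2 = 4.65, the survival_time_median reported here.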


other stats
agent_compute-ego0_max: 0.01182859804895189
agent_compute-ego0_mean: 0.011562640686744232
agent_compute-ego0_median: 0.011566216464032605
agent_compute-ego0_min: 0.011289531769959824
complete-iteration_max: 0.1940263121024422
complete-iteration_mean: 0.17012357672347608
complete-iteration_median: 0.16682415706028247
complete-iteration_min: 0.15281968067089716
deviation-center-line_max: 0.16455763165774434
deviation-center-line_mean: 0.1092777488823554
deviation-center-line_min: 0.06014878912424249
deviation-heading_max: 1.819139548495243
deviation-heading_mean: 0.860079214882594
deviation-heading_median: 0.6517793617099584
deviation-heading_min: 0.31761858761521594
driven_any_max: 3.078535752374254
driven_any_mean: 1.8526223552312917
driven_any_median: 1.6219947547238958
driven_any_min: 1.0879641591031215
driven_lanedir_consec_max: 0.6643282900394009
driven_lanedir_consec_mean: 0.4884432809313369
driven_lanedir_consec_min: 0.28134580805687714
driven_lanedir_max: 0.6643282900394009
driven_lanedir_mean: 0.4924956463812869
driven_lanedir_median: 0.5121542437144349
driven_lanedir_min: 0.28134580805687714
get_duckie_state_max: 1.7161170641581218e-06
get_duckie_state_mean: 1.4569815837126118e-06
get_duckie_state_median: 1.4170983337316498e-06
get_duckie_state_min: 1.2776126032290251e-06
get_robot_state_max: 0.0037055437763532
get_robot_state_mean: 0.0035909953394627154
get_robot_state_median: 0.0035925001481880887
get_robot_state_min: 0.003473437285121483
get_state_dump_max: 0.004865010579427083
get_state_dump_mean: 0.004753977643953641
get_state_dump_median: 0.0047261776278553005
get_state_dump_min: 0.00469854474067688
get_ui_image_max: 0.03334955288016278
get_ui_image_mean: 0.028150487325879176
get_ui_image_median: 0.027037323464321184
get_ui_image_min: 0.02517774949471156
in-drivable-lane_max: 6.549999999999982
in-drivable-lane_mean: 3.3499999999999894
in-drivable-lane_min: 1.499999999999995
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 3.078535752374254, "get_ui_image": 0.025824101665351963, "step_physics": 0.09311577790900122, "survival_time": 7.84999999999998, "driven_lanedir": 0.28134580805687714, "get_state_dump": 0.00471629975717279, "get_robot_state": 0.003473437285121483, "sim_render-ego0": 0.0038045780568183225, "get_duckie_state": 1.4169306694706783e-06, "in-drivable-lane": 6.549999999999982, "deviation-heading": 0.891006057417327, "agent_compute-ego0": 0.011501624614377565, "complete-iteration": 0.15564645996576623, "set_robot_commands": 0.002090307730662672, "deviation-center-line": 0.13094886067496936, "driven_lanedir_consec": 0.28134580805687714, "sim_compute_sim_state": 0.009072223796120172, "sim_compute_performance-ego0": 0.0019627190843413147},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.5806581102909054, "get_ui_image": 0.03334955288016278, "step_physics": 0.1246599503185438, "survival_time": 4.549999999999992, "driven_lanedir": 0.3891269085945981, "get_state_dump": 0.00473605549853781, "get_robot_state": 0.0035660215046094813, "sim_render-ego0": 0.003625105256619661, "get_duckie_state": 1.2776126032290251e-06, "in-drivable-lane": 2.449999999999993, "deviation-heading": 1.819139548495243, "agent_compute-ego0": 0.011289531769959824, "complete-iteration": 0.1940263121024422, "set_robot_commands": 0.0020220953485240107, "deviation-center-line": 0.16455763165774434, "driven_lanedir_consec": 0.37291744679479777, "sim_compute_sim_state": 0.008786017480103867, "sim_compute_performance-ego0": 0.0019112462582795515},
 "LF-norm-techtrack-000-ego0": {"driven_any": 1.0879641591031215, "get_ui_image": 0.028250545263290405, "step_physics": 0.11456498503684998, "survival_time": 3.5499999999999954, "driven_lanedir": 0.6643282900394009, "get_state_dump": 0.00469854474067688, "get_robot_state": 0.0036189787917666966, "sim_render-ego0": 0.003721316655476888, "get_duckie_state": 1.4172659979926216e-06, "in-drivable-lane": 1.499999999999995, "deviation-heading": 0.41255266600258983, "agent_compute-ego0": 0.01182859804895189, "complete-iteration": 0.17800185415479872, "set_robot_commands": 0.002054032352235582, "deviation-center-line": 0.08145571407246535, "driven_lanedir_consec": 0.6643282900394009, "sim_compute_sim_state": 0.007234752178192139, "sim_compute_performance-ego0": 0.0019407504134707984},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.6633313991568863, "get_ui_image": 0.02517774949471156, "step_physics": 0.0943042164047559, "survival_time": 4.749999999999991, "driven_lanedir": 0.6351815788342716, "get_state_dump": 0.004865010579427083, "get_robot_state": 0.0037055437763532, "sim_render-ego0": 0.003745396931966146, "get_duckie_state": 1.7161170641581218e-06, "in-drivable-lane": 2.89999999999999, "deviation-heading": 0.31761858761521594, "agent_compute-ego0": 0.011630808313687645, "complete-iteration": 0.15281968067089716, "set_robot_commands": 0.0021455188592274985, "deviation-center-line": 0.06014878912424249, "driven_lanedir_consec": 0.6351815788342716, "sim_compute_sim_state": 0.005059865613778432, "sim_compute_performance-ego0": 0.0020975694060325623}}
set_robot_commands_max: 0.0021455188592274985
set_robot_commands_mean: 0.0020779885726624407
set_robot_commands_median: 0.002072170041449127
set_robot_commands_min: 0.0020220953485240107
sim_compute_performance-ego0_max: 0.0020975694060325623
sim_compute_performance-ego0_mean: 0.0019780712905310568
sim_compute_performance-ego0_median: 0.0019517347489060564
sim_compute_performance-ego0_min: 0.0019112462582795515
sim_compute_sim_state_max: 0.009072223796120172
sim_compute_sim_state_mean: 0.007538214767048653
sim_compute_sim_state_median: 0.008010384829148003
sim_compute_sim_state_min: 0.005059865613778432
sim_render-ego0_max: 0.0038045780568183225
sim_render-ego0_mean: 0.0037240992252202546
sim_render-ego0_median: 0.003733356793721517
sim_render-ego0_min: 0.003625105256619661
simulation-passed: 1
step_physics_max: 0.1246599503185438
step_physics_mean: 0.10666123241728773
step_physics_median: 0.10443460072080292
step_physics_min: 0.09311577790900122
survival_time_max: 7.84999999999998
survival_time_mean: 5.17499999999999
survival_time_min: 3.5499999999999954
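All of the aggregate entries above (the _min, _mean, _median, and _max values) are per-metric aggregations over the four episodes given in the per-episodes details. A minimal sketch of how they could be reproduced from that JSON, assuming it has been saved locally as per_episodes.json (a hypothetical file name; the script is illustrative and not part of the evaluation pipeline):

import json
import statistics

# Load a local copy of the "per-episodes details" dictionary shown above.
with open("per_episodes.json") as f:
    per_episodes = json.load(f)

# Gather each metric's values across the four episodes...
values_by_metric = {}
for episode_stats in per_episodes.values():
    for metric, value in episode_stats.items():
        values_by_metric.setdefault(metric, []).append(value)

# ...and aggregate them with min/mean/median/max.
for metric, values in sorted(values_by_metric.items()):
    print(f"{metric}_min: {min(values)}")
    print(f"{metric}_mean: {statistics.mean(values)}")
    print(f"{metric}_median: {statistics.median(values)}")
    print(f"{metric}_max: {max(values)}")

For instance, the four driven_lanedir_consec values are 0.2813, 0.3729, 0.6352, and 0.6643, so the median computed this way is (0.3729 + 0.6352) / 2 ≈ 0.5040, matching the driven_lanedir_consec_median reported for this job.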
No reset possible
Job ID | step | status | up to date | date started | date completed | duration | message
55395 | LFv-sim | success | yes | | | 0:03:39 |
Artefacts hidden (visible to the submission author only).
No reset possible