
Submission 12641

Submission: 12641
Competing: yes
Challenge: aido5-LFP-sim-validation
User: Bea Baselines 🐤
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFP-sim: 59960
Next:
User label: template-pytorch
Admin priority: 50
Blessing: n/a
User priority: 50

Job 59960

Episodes evaluated (per-episode statistics are available on the challenge server):

LFP-norm-loop-000
LFP-norm-small_loop-000
LFP-norm-techtrack-000
LFP-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step    | status  | up to date | date started | date completed | duration | message
59960  | LFP-sim | success | yes        |              |                | 0:45:13  |
survival_time_median: 59.99999999999873
in-drivable-lane_median: 36.89999999999928
driven_lanedir_consec_median: 0.19511449878368647
deviation-center-line_median: 2.699073041429577


other stats
agent_compute-ego0_max: 0.013527305795191529
agent_compute-ego0_mean: 0.012583472051787238
agent_compute-ego0_median: 0.012514171354181066
agent_compute-ego0_min: 0.01177823970359529
complete-iteration_max: 0.35396902547291575
complete-iteration_mean: 0.29808727410512603
complete-iteration_median: 0.3118515274705339
complete-iteration_min: 0.21467701600652056
deviation-center-line_max: 3.4173777780130545
deviation-center-line_mean: 2.531546349056083
deviation-center-line_min: 1.3106615353521225
deviation-heading_max: 23.193541990891905
deviation-heading_mean: 17.626210619906626
deviation-heading_median: 18.29045258884301
deviation-heading_min: 10.730395311048577
driven_any_max: 1.4579836598943228
driven_any_mean: 1.2789032326558978
driven_any_median: 1.2600331064378674
driven_any_min: 1.1375630578535334
driven_lanedir_consec_max: 0.36231839500353424
driven_lanedir_consec_mean: 0.1891488268043809
driven_lanedir_consec_min: 0.004047914646616357
driven_lanedir_max: 0.36231839500353424
driven_lanedir_mean: 0.2343155912630618
driven_lanedir_median: 0.2547415874361482
driven_lanedir_min: 0.06546079517641656
get_duckie_state_max: 0.02389040219595192
get_duckie_state_mean: 0.0170936973069133
get_duckie_state_median: 0.0202234553655518
get_duckie_state_min: 0.004037476300597687
get_robot_state_max: 0.003580581933433666
get_robot_state_mean: 0.0035166223777720173
get_robot_state_median: 0.00349899146280916
get_robot_state_min: 0.003487924652036084
get_state_dump_max: 0.008160625667397326
get_state_dump_mean: 0.00712292006768156
get_state_dump_median: 0.007661080380264269
get_state_dump_min: 0.005008893842800372
get_ui_image_max: 0.038533030501214
get_ui_image_mean: 0.032545416777973665
get_ui_image_median: 0.03300692854475518
get_ui_image_min: 0.025634779521170305
in-drivable-lane_max: 46.3999999999986
in-drivable-lane_mean: 37.649999999999125
in-drivable-lane_min: 30.399999999999352
per-episodes details (parsed in the sketch after the statistics below):
{"LFP-norm-loop-000-ego0": {"driven_any": 1.4579836598943228, "get_ui_image": 0.03119224275180839, "step_physics": 0.198247869644038, "survival_time": 59.99999999999873, "driven_lanedir": 0.36231839500353424, "get_state_dump": 0.008160625667397326, "get_robot_state": 0.003496702068751301, "sim_render-ego0": 0.0035976619148730834, "get_duckie_state": 0.02389040219595192, "in-drivable-lane": 30.399999999999352, "deviation-heading": 23.193541990891905, "agent_compute-ego0": 0.012024170552364098, "complete-iteration": 0.294228388407546, "set_robot_commands": 0.00213248862712012, "deviation-center-line": 3.4173777780130545, "driven_lanedir_consec": 0.36231839500353424, "sim_compute_sim_state": 0.00955453403387141, "sim_compute_performance-ego0": 0.0018545811023442176},
 "LFP-norm-zigzag-000-ego0": {"driven_any": 1.1375630578535334, "get_ui_image": 0.038533030501214, "step_physics": 0.2506195057639472, "survival_time": 59.99999999999873, "driven_lanedir": 0.18471497248134003, "get_state_dump": 0.007725901647372409, "get_robot_state": 0.003580581933433666, "sim_render-ego0": 0.003739797304710877, "get_duckie_state": 0.020484845505268943, "in-drivable-lane": 39.44999999999948, "deviation-heading": 16.2042660597576, "agent_compute-ego0": 0.013004172155998034, "complete-iteration": 0.35396902547291575, "set_robot_commands": 0.0022394869547898723, "deviation-center-line": 3.0314590971674193, "driven_lanedir_consec": 0.004047914646616357, "sim_compute_sim_state": 0.01201787141836454, "sim_compute_performance-ego0": 0.0019387541762200325},
 "LFP-norm-techtrack-000-ego0": {"driven_any": 1.2883202611150304, "get_ui_image": 0.03482161433770198, "step_physics": 0.22723321930554188, "survival_time": 59.99999999999873, "driven_lanedir": 0.3247682023909564, "get_state_dump": 0.007596259113156131, "get_robot_state": 0.003501280856867019, "sim_render-ego0": 0.0035472146478124104, "get_duckie_state": 0.01996206522583465, "in-drivable-lane": 34.349999999999085, "deviation-heading": 20.376639117928423, "agent_compute-ego0": 0.013527305795191529, "complete-iteration": 0.32947466653352175, "set_robot_commands": 0.002101677244251515, "deviation-center-line": 2.3666869856917345, "driven_lanedir_consec": 0.3247682023909564, "sim_compute_sim_state": 0.015259815195418714, "sim_compute_performance-ego0": 0.0018468177090278772},
 "LFP-norm-small_loop-000-ego0": {"driven_any": 1.2317459517607043, "get_ui_image": 0.025634779521170305, "step_physics": 0.15056841835987558, "survival_time": 59.99999999999873, "driven_lanedir": 0.06546079517641656, "get_state_dump": 0.005008893842800372, "get_robot_state": 0.003487924652036084, "sim_render-ego0": 0.0035829518260209387, "get_duckie_state": 0.004037476300597687, "in-drivable-lane": 46.3999999999986, "deviation-heading": 10.730395311048577, "agent_compute-ego0": 0.01177823970359529, "complete-iteration": 0.21467701600652056, "set_robot_commands": 0.0021026346903855755, "deviation-center-line": 1.3106615353521225, "driven_lanedir_consec": 0.06546079517641656, "sim_compute_sim_state": 0.0065561879385917215, "sim_compute_performance-ego0": 0.0018456150947462808}}
set_robot_commands_max: 0.0022394869547898723
set_robot_commands_mean: 0.0021440718791367708
set_robot_commands_median: 0.002117561658752848
set_robot_commands_min: 0.002101677244251515
sim_compute_performance-ego0_max: 0.0019387541762200325
sim_compute_performance-ego0_mean: 0.0018714420205846016
sim_compute_performance-ego0_median: 0.0018506994056860472
sim_compute_performance-ego0_min: 0.0018456150947462808
sim_compute_sim_state_max: 0.015259815195418714
sim_compute_sim_state_mean: 0.010847102146561595
sim_compute_sim_state_median: 0.010786202726117976
sim_compute_sim_state_min: 0.0065561879385917215
sim_render-ego0_max: 0.003739797304710877
sim_render-ego0_mean: 0.003616906423354328
sim_render-ego0_median: 0.003590306870447011
sim_render-ego0_min: 0.0035472146478124104
simulation-passed: 1
step_physics_max: 0.2506195057639472
step_physics_mean: 0.20666725326835064
step_physics_median: 0.21274054447478993
step_physics_min: 0.15056841835987558
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
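
The aggregate figures above are per-metric statistics over the four episodes in the per-episodes details JSON. Below is a minimal sketch of how they can be recomputed, assuming the JSON has been saved to a file named per_episode_details.json (the file name is hypothetical; the metric names come from the data above):

    import json
    from statistics import mean, median

    # Load the per-episode details JSON shown above (file name is an assumption).
    with open("per_episode_details.json") as f:
        episodes = json.load(f)

    # Collect each metric's values across the four episodes.
    per_metric = {}
    for episode_stats in episodes.values():
        for metric, value in episode_stats.items():
            per_metric.setdefault(metric, []).append(value)

    # Recompute the min/median/mean/max aggregates reported on this page.
    for metric, values in sorted(per_metric.items()):
        print(f"{metric}_min: {min(values)}")
        print(f"{metric}_median: {median(values)}")  # e.g. survival_time_median -> 59.99999999999873
        print(f"{metric}_mean: {mean(values)}")
        print(f"{metric}_max: {max(values)}")

With four episodes the median is the average of the two middle values, which is consistent with the headline metrics listed at the top of this job's results (for example, deviation-center-line_median: 2.699073041429577).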
Job 53877 (step LFP-sim): success, not up to date, duration 0:43:49.