
Submission 10807

Submission: 10807
Competing: yes
Challenge: aido5-LF-sim-validation
User: Fernanda Custodio Pereira do Carmo 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57826
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57826

Episodes evaluated (per-episode statistics are available for each):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
57826 | LFv-sim | success | yes | | | 0:11:10 |
driven_lanedir_consec_median: 1.577055664083586
survival_time_median: 20.325000000000152
deviation-center-line_median: 0.5277967972495952
in-drivable-lane_median: 8.12500000000011
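
These four headline scores are consistent with taking, for each metric, the median over the four evaluation episodes (for four values, the median is the mean of the two middle ones). A minimal sketch using Python's statistics module; the survival_time values are copied from the per-episode details further down:

    import statistics

    # survival_time per episode for job 57826 (values from the per-episode details below)
    survival_times = [
        14.650000000000071,  # LF-norm-loop-000
        4.599999999999992,   # LF-norm-zigzag-000
        26.00000000000023,   # LF-norm-techtrack-000
        26.45000000000024,   # LF-norm-small_loop-000
    ]

    # For an even number of values, statistics.median averages the two middle ones,
    # which reproduces survival_time_median ≈ 20.325 (up to floating-point rounding).
    print(statistics.median(survival_times))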


other stats
agent_compute-ego0_max: 0.014026288063295426
agent_compute-ego0_mean: 0.013331702233767648
agent_compute-ego0_median: 0.013225099703978168
agent_compute-ego0_min: 0.01285032146381882
complete-iteration_max: 0.22164431951379265
complete-iteration_mean: 0.18316031752230305
complete-iteration_median: 0.17164190276765168
complete-iteration_min: 0.16771314504011622
deviation-center-line_max: 1.3950762569134432
deviation-center-line_mean: 0.6387298424498955
deviation-center-line_min: 0.10424951838694876
deviation-heading_max: 3.8313162972856336
deviation-heading_mean: 2.0182716274290753
deviation-heading_median: 1.6811621225260571
deviation-heading_min: 0.8794459673785541
driven_any_max: 4.616697307742068
driven_any_mean: 3.065551287946462
driven_any_median: 3.499345558881347
driven_any_min: 0.6468167262810862
driven_lanedir_consec_max: 2.3034704373581767
driven_lanedir_consec_mean: 1.4098699067596148
driven_lanedir_consec_min: 0.1818978615131104
driven_lanedir_max: 2.3034704373581767
driven_lanedir_mean: 1.4098699067596148
driven_lanedir_median: 1.577055664083586
driven_lanedir_min: 0.1818978615131104
get_duckie_state_max: 2.4786535299049234e-06
get_duckie_state_mean: 2.297550487673464e-06
get_duckie_state_median: 2.4055320232459164e-06
get_duckie_state_min: 1.9004843742971e-06
get_robot_state_max: 0.0041815312403552934
get_robot_state_mean: 0.0039862575331954635
get_robot_state_median: 0.0040599875918914275
get_robot_state_min: 0.003643523708643703
get_state_dump_max: 0.005159308833460654
get_state_dump_mean: 0.004990029298031384
get_state_dump_median: 0.005056911314456427
get_state_dump_min: 0.0046869857297520265
get_ui_image_max: 0.03618130376262049
get_ui_image_mean: 0.03067309589408228
get_ui_image_median: 0.030025156060859263
get_ui_image_min: 0.02646076769199011
in-drivable-lane_max: 17.750000000000195
in-drivable-lane_mean: 9.262500000000102
in-drivable-lane_min: 3.049999999999991
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.4728100866640252, "get_ui_image": 0.02907927101161204, "step_physics": 0.09889921525708673, "survival_time": 14.650000000000071, "driven_lanedir": 2.0201042287423756, "get_state_dump": 0.005071567029369121, "get_robot_state": 0.004014543124607631, "sim_render-ego0": 0.004148093210596617, "get_duckie_state": 2.337150833233684e-06, "in-drivable-lane": 3.0500000000000433, "deviation-heading": 1.418133769525609, "agent_compute-ego0": 0.0134397858665103, "complete-iteration": 0.16968013721258465, "set_robot_commands": 0.002355513929509792, "deviation-center-line": 0.5991900707147363, "driven_lanedir_consec": 2.0201042287423756, "sim_compute_sim_state": 0.010373318276437771, "sim_compute_performance-ego0": 0.0022035780407133557},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.6468167262810862, "get_ui_image": 0.03618130376262049, "step_physics": 0.14197192397168887, "survival_time": 4.599999999999992, "driven_lanedir": 0.1818978615131104, "get_state_dump": 0.005159308833460654, "get_robot_state": 0.004105432059175225, "sim_render-ego0": 0.004290560240386635, "get_duckie_state": 2.4739132132581487e-06, "in-drivable-lane": 3.049999999999991, "deviation-heading": 0.8794459673785541, "agent_compute-ego0": 0.014026288063295426, "complete-iteration": 0.22164431951379265, "set_robot_commands": 0.002472518592752436, "deviation-center-line": 0.10424951838694876, "driven_lanedir_consec": 0.1818978615131104, "sim_compute_sim_state": 0.01102852052257907, "sim_compute_performance-ego0": 0.0023020467450541836},
 "LF-norm-techtrack-000-ego0": {"driven_any": 4.525881031098669, "get_ui_image": 0.030971041110106484, "step_physics": 0.101482797721526, "survival_time": 26.00000000000023, "driven_lanedir": 1.1340070994247962, "get_state_dump": 0.0046869857297520265, "get_robot_state": 0.003643523708643703, "sim_render-ego0": 0.003895068580495647, "get_duckie_state": 1.9004843742971e-06, "in-drivable-lane": 17.750000000000195, "deviation-heading": 3.8313162972856336, "agent_compute-ego0": 0.013010413541446034, "complete-iteration": 0.17360366832271876, "set_robot_commands": 0.002169911097198897, "deviation-center-line": 0.456403523784454, "driven_lanedir_consec": 1.1340070994247962, "sim_compute_sim_state": 0.01166676010600436, "sim_compute_performance-ego0": 0.001991440833415729},
 "LF-norm-small_loop-000-ego0": {"driven_any": 4.616697307742068, "get_ui_image": 0.02646076769199011, "step_physics": 0.10313323893637028, "survival_time": 26.45000000000024, "driven_lanedir": 2.3034704373581767, "get_state_dump": 0.005042255599543733, "get_robot_state": 0.0041815312403552934, "sim_render-ego0": 0.004280218538248314, "get_duckie_state": 2.4786535299049234e-06, "in-drivable-lane": 13.200000000000175, "deviation-heading": 1.9441904755265051, "agent_compute-ego0": 0.01285032146381882, "complete-iteration": 0.16771314504011622, "set_robot_commands": 0.0024524954130064765, "deviation-center-line": 1.3950762569134432, "driven_lanedir_consec": 2.3034704373581767, "sim_compute_sim_state": 0.006836440428247992, "sim_compute_performance-ego0": 0.002374451565292646}}
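
The _max/_mean/_median/_min entries listed under "other stats" match simple per-metric aggregates over these four episode records. A minimal sketch of that aggregation, assuming the JSON object above has been saved to a file named per_episodes.json (a hypothetical filename used only for this example):

    import json
    import statistics

    # Load the per-episode details shown above (saved here as per_episodes.json).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Group each metric's values across the four episodes.
    by_metric = {}
    for episode_stats in episodes.values():
        for metric, value in episode_stats.items():
            by_metric.setdefault(metric, []).append(value)

    # Recompute the aggregates reported in "other stats".
    for metric, values in sorted(by_metric.items()):
        print(f"{metric}_max: {max(values)}")
        print(f"{metric}_mean: {statistics.mean(values)}")
        print(f"{metric}_median: {statistics.median(values)}")
        print(f"{metric}_min: {min(values)}")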
set_robot_commands_max: 0.002472518592752436
set_robot_commands_mean: 0.0023626097581169003
set_robot_commands_median: 0.0024040046712581345
set_robot_commands_min: 0.002169911097198897
sim_compute_performance-ego0_max: 0.002374451565292646
sim_compute_performance-ego0_mean: 0.0022178792961189786
sim_compute_performance-ego0_median: 0.0022528123928837697
sim_compute_performance-ego0_min: 0.001991440833415729
sim_compute_sim_state_max: 0.01166676010600436
sim_compute_sim_state_mean: 0.009976259833317298
sim_compute_sim_state_median: 0.010700919399508424
sim_compute_sim_state_min: 0.006836440428247992
sim_render-ego0_max: 0.004290560240386635
sim_render-ego0_mean: 0.0041534851424318035
sim_render-ego0_median: 0.004214155874422466
sim_render-ego0_min: 0.003895068580495647
simulation-passed: 1
step_physics_max: 0.14197192397168887
step_physics_mean: 0.11137179397166797
step_physics_median: 0.10230801832894812
step_physics_min: 0.09889921525708673
survival_time_max: 26.45000000000024
survival_time_mean: 17.925000000000136
survival_time_min: 4.599999999999992
57821 | LFv-sim | success | yes | | | 0:10:10 |