Duckietown Challenges

Submission 6211

Submission: 6211
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 30309
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 30309

Episodes evaluated in this job (per-episode images omitted):
ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
30309 | step1-simulation | success | yes | | | 0:08:18 |
driven_lanedir_consec_median: 1.8675098949482885
survival_time_median: 12.200000000000038
deviation-center-line_median: 0.41944787060926125
in-drivable-lane_median: 0


other stats
agent_compute-ego_max: 0.02451856772104899
agent_compute-ego_mean: 0.019969484899401504
agent_compute-ego_median: 0.01802317385977887
agent_compute-ego_min: 0.016948007742563883
deviation-center-line_max: 0.8234808974875352
deviation-center-line_mean: 0.5218541192773049
deviation-center-line_min: 0.1585558747401986
deviation-heading_max: 4.544805891375227
deviation-heading_mean: 1.9947261421446656
deviation-heading_median: 1.5396767830660587
deviation-heading_min: 0.9731824271877358
driven_any_max: 2.3491534090091704
driven_any_mean: 1.7191518555013143
driven_any_median: 1.907128736949862
driven_any_min: 0.6940583136437746
driven_lanedir_consec_max: 2.3252511519785672
driven_lanedir_consec_mean: 1.660164278495452
driven_lanedir_consec_min: 0.636040236619136
driven_lanedir_max: 2.3252511519785672
driven_lanedir_mean: 1.660164278495452
driven_lanedir_median: 1.8675098949482885
driven_lanedir_min: 0.636040236619136
in-drivable-lane_max: 0.14999999999999947
in-drivable-lane_mean: 0.029999999999999895
in-drivable-lane_min: 0
per-episodes details: {
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 0.6940583136437746, "sim_physics": 0.06440949947276015, "survival_time": 4.699999999999991, "driven_lanedir": 0.636040236619136, "sim_render-ego": 0.008036958410384808, "in-drivable-lane": 0.14999999999999947, "agent_compute-ego": 0.01802317385977887, "deviation-heading": 0.9731824271877358, "set_robot_commands": 0.00781908948370751, "deviation-center-line": 0.1585558747401986, "driven_lanedir_consec": 0.636040236619136, "sim_compute_sim_state": 0.003951732148515417, "sim_compute_performance-ego": 0.004758233719683708, "sim_compute_robot_state-ego": 0.0056568916807783415},
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 1.907128736949862, "sim_physics": 0.06200533714450774, "survival_time": 12.200000000000038, "driven_lanedir": 1.8675098949482885, "sim_render-ego": 0.007809905732264285, "in-drivable-lane": 0, "agent_compute-ego": 0.01710411368823442, "deviation-heading": 1.9072597567296152, "set_robot_commands": 0.007608917892956343, "deviation-center-line": 0.39877358533472534, "driven_lanedir_consec": 1.8675098949482885, "sim_compute_sim_state": 0.003940918406502145, "sim_compute_performance-ego": 0.0044251338380282045, "sim_compute_robot_state-ego": 0.005674049502513448},
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 2.349039613334623, "sim_physics": 0.06316728989283243, "survival_time": 14.950000000000076, "driven_lanedir": 2.19181762792902, "sim_render-ego": 0.0076904193560282386, "in-drivable-lane": 0, "agent_compute-ego": 0.016948007742563883, "deviation-heading": 4.544805891375227, "set_robot_commands": 0.007326941490173339, "deviation-center-line": 0.8234808974875352, "driven_lanedir_consec": 2.19181762792902, "sim_compute_sim_state": 0.003962121804555257, "sim_compute_performance-ego": 0.0044460010528564456, "sim_compute_robot_state-ego": 0.005630248387654622},
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 1.296379204569143, "sim_physics": 0.09483725400198072, "survival_time": 8.399999999999984, "driven_lanedir": 1.2802024810022503, "sim_render-ego": 0.010852890355246407, "in-drivable-lane": 0, "agent_compute-ego": 0.023253561485381352, "deviation-heading": 1.008705852364693, "set_robot_commands": 0.009361169167927333, "deviation-center-line": 0.41944787060926125, "driven_lanedir_consec": 1.2802024810022503, "sim_compute_sim_state": 0.005412643864041283, "sim_compute_performance-ego": 0.00659450888633728, "sim_compute_robot_state-ego": 0.008274523984818231},
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 2.3491534090091704, "sim_physics": 0.09384605169296265, "survival_time": 14.950000000000076, "driven_lanedir": 2.3252511519785672, "sim_render-ego": 0.011196145216623942, "in-drivable-lane": 0, "agent_compute-ego": 0.02451856772104899, "deviation-heading": 1.5396767830660587, "set_robot_commands": 0.00992632786432902, "deviation-center-line": 0.8090123682148037, "driven_lanedir_consec": 2.3252511519785672, "sim_compute_sim_state": 0.005488681793212891, "sim_compute_performance-ego": 0.006665407021840413, "sim_compute_robot_state-ego": 0.00829802433649699}
}
set_robot_commands_max: 0.00992632786432902
set_robot_commands_mean: 0.00840848917981871
set_robot_commands_median: 0.00781908948370751
set_robot_commands_min: 0.007326941490173339
sim_compute_performance-ego_max: 0.006665407021840413
sim_compute_performance-ego_mean: 0.00537785690374921
sim_compute_performance-ego_median: 0.004758233719683708
sim_compute_performance-ego_min: 0.0044251338380282045
sim_compute_robot_state-ego_max: 0.00829802433649699
sim_compute_robot_state-ego_mean: 0.006706747578452327
sim_compute_robot_state-ego_median: 0.005674049502513448
sim_compute_robot_state-ego_min: 0.005630248387654622
sim_compute_sim_state_max: 0.005488681793212891
sim_compute_sim_state_mean: 0.004551219603365398
sim_compute_sim_state_median: 0.003962121804555257
sim_compute_sim_state_min: 0.003940918406502145
sim_physics_max: 0.09483725400198072
sim_physics_mean: 0.07565308644100874
sim_physics_median: 0.06440949947276015
sim_physics_min: 0.06200533714450774
sim_render-ego_max: 0.011196145216623942
sim_render-ego_mean: 0.009117263814109535
sim_render-ego_median: 0.008036958410384808
sim_render-ego_min: 0.0076904193560282386
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 11.040000000000036
survival_time_min: 4.699999999999991
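
The aggregate values above are consistent with simple per-metric summaries over the five episodes in per-episodes details (for example, the median of the five survival_time values is 12.200000000000038, matching survival_time_median). Below is a minimal Python sketch of recomputing them, assuming the per-episode record has been saved locally as stats.json; the file name and the aggregate helper are illustrative, not part of the evaluation artefacts.

    import json
    from statistics import mean, median

    # Load the per-episode record shown above under "per-episodes details".
    # "stats.json" is a hypothetical local copy, not an official artefact path.
    with open("stats.json") as f:
        per_episode = json.load(f)

    def aggregate(metric):
        # Collect one metric across all episodes and summarize it.
        values = [episode[metric] for episode in per_episode.values()]
        return {"min": min(values), "mean": mean(values),
                "median": median(values), "max": max(values)}

    for metric in ("survival_time", "driven_lanedir_consec", "deviation-center-line"):
        print(metric, aggregate(metric))

On the values above, this yields the reported min, median, and max for these metrics; the mean can differ in the final digits depending on floating-point summation order.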
30308 | step1-simulation | success | yes | | | 0:07:12 |