
Submission 3055

Submission: 3055
Competing: yes
Challenge: aido2-LF-sim-validation
User: Andrea Censi 🇨🇭
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22635
Next:
User label: rotation
Admin priority: 50
Blessing: n/a
User priority: 50

22635

Episodes (each image links to detailed per-episode statistics):

- ETHZ_autolab_technical_track-0-0
- ETHZ_autolab_technical_track-1-0
- ETHZ_autolab_technical_track-2-0
- ETHZ_autolab_technical_track-3-0
- ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
22635 | step1-simulation | success | yes | 0:23:40
Scores:

driven_lanedir_consec_median: -0.0005547867508171045
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.44901648578509457
in-drivable-lane_median: 7.600000000000044


Other stats:

agent_compute-ego_max: 0.1631548055013021
agent_compute-ego_mean: 0.15689803377787276
agent_compute-ego_median: 0.15747249364852903
agent_compute-ego_min: 0.14748021523157756
deviation-center-line_max: 0.5610133203215761
deviation-center-line_mean: 0.3827485453764907
deviation-center-line_min: 0.13860669343031343
deviation-heading_max: 5.876680788839771
deviation-heading_mean: 5.630778146323924
deviation-heading_median: 5.657649106857695
deviation-heading_min: 5.445457843647878
driven_any_max: 0.026478672834056367
driven_any_mean: 0.02486423959225153
driven_any_median: 0.02593019554121306
driven_any_min: 0.022165452661374756
driven_lanedir_consec_max: 0.0021599934447711355
driven_lanedir_consec_mean: -0.00046586963337973407
driven_lanedir_consec_min: -0.003361806018059577
driven_lanedir_max: 0.0021599934447711355
driven_lanedir_mean: -0.00046586963337973407
driven_lanedir_median: -0.0005547867508171045
driven_lanedir_min: -0.003361806018059577
in-drivable-lane_max: 7.700000000000042
in-drivable-lane_mean: 7.620000000000043
in-drivable-lane_min: 7.550000000000042
Per-episode details:

ETHZ_autolab_technical_track-0-0: {"driven_any": 0.026478672834056367, "sim_physics": 0.1172210160891215, "survival_time": 14.950000000000076, "driven_lanedir": 0.0021599934447711355, "sim_render-ego": 0.05829017957051595, "in-drivable-lane": 7.600000000000043, "agent_compute-ego": 0.14748021523157756, "deviation-heading": 5.657649106857695, "set_robot_commands": 0.09229744116465252, "deviation-center-line": 0.5610133203215761, "driven_lanedir_consec": 0.0021599934447711355, "sim_compute_sim_state": 0.03641358375549317, "sim_compute_performance-ego": 0.06496172825495403, "sim_compute_robot_state-ego": 0.07016754388809204}
ETHZ_autolab_technical_track-1-0: {"driven_any": 0.02593019554121306, "sim_physics": 0.12107119798660278, "survival_time": 14.950000000000076, "driven_lanedir": -0.003361806018059577, "sim_render-ego": 0.06279377778371176, "in-drivable-lane": 7.600000000000044, "agent_compute-ego": 0.15747249364852903, "deviation-heading": 5.445457843647878, "set_robot_commands": 0.09696175257364908, "deviation-center-line": 0.13860669343031343, "driven_lanedir_consec": -0.003361806018059577, "sim_compute_sim_state": 0.03884407838185628, "sim_compute_performance-ego": 0.06971666097640991, "sim_compute_robot_state-ego": 0.07682432095209757}
ETHZ_autolab_technical_track-2-0: {"driven_any": 0.023804404672438047, "sim_physics": 0.099455828666687, "survival_time": 14.950000000000076, "driven_lanedir": 0.0021181059556472093, "sim_render-ego": 0.060255119005839027, "in-drivable-lane": 7.550000000000042, "agent_compute-ego": 0.1631548055013021, "deviation-heading": 5.876680788839771, "set_robot_commands": 0.09659610907236736, "deviation-center-line": 0.4784517475259532, "driven_lanedir_consec": 0.0021181059556472093, "sim_compute_sim_state": 0.039843077659606936, "sim_compute_performance-ego": 0.06824734926223755, "sim_compute_robot_state-ego": 0.07280610879262288}
ETHZ_autolab_technical_track-3-0: {"driven_any": 0.022165452661374756, "sim_physics": 0.1208221952120463, "survival_time": 14.950000000000076, "driven_lanedir": -0.002690854798440334, "sim_render-ego": 0.06404405355453491, "in-drivable-lane": 7.700000000000042, "agent_compute-ego": 0.15691089789072674, "deviation-heading": 5.721005148732661, "set_robot_commands": 0.0985324756304423, "deviation-center-line": 0.44901648578509457, "driven_lanedir_consec": -0.002690854798440334, "sim_compute_sim_state": 0.0391504955291748, "sim_compute_performance-ego": 0.07155119975407918, "sim_compute_robot_state-ego": 0.07634682099024455}
ETHZ_autolab_technical_track-4-0: {"driven_any": 0.025942472252175396, "sim_physics": 0.1318619632720947, "survival_time": 14.950000000000076, "driven_lanedir": -0.0005547867508171045, "sim_render-ego": 0.06630721251169841, "in-drivable-lane": 7.650000000000044, "agent_compute-ego": 0.15947175661722818, "deviation-heading": 5.453097843541612, "set_robot_commands": 0.10188673496246338, "deviation-center-line": 0.28665447981951625, "driven_lanedir_consec": -0.0005547867508171045, "sim_compute_sim_state": 0.041293907165527347, "sim_compute_performance-ego": 0.07429551839828491, "sim_compute_robot_state-ego": 0.08194465319315593}
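The aggregate statistics above can be recomputed from the per-episode details as a sanity check. A minimal sketch, using only the `driven_lanedir` values copied from the five episode records (the dictionary below is an abridged transcription, not part of the original page):

```python
import statistics

# driven_lanedir per episode, copied from the per-episode details above
driven_lanedir = {
    "ETHZ_autolab_technical_track-0-0": 0.0021599934447711355,
    "ETHZ_autolab_technical_track-1-0": -0.003361806018059577,
    "ETHZ_autolab_technical_track-2-0": 0.0021181059556472093,
    "ETHZ_autolab_technical_track-3-0": -0.002690854798440334,
    "ETHZ_autolab_technical_track-4-0": -0.0005547867508171045,
}

# With an odd number of episodes (5), the median is the middle value
# after sorting, so it is exact (no averaging of two middle values).
median = statistics.median(driven_lanedir.values())
print(median)  # -0.0005547867508171045, matching driven_lanedir_median
```

The same check works for the min, max, and mean columns, although the mean may differ in the last digit or two from the reported value depending on floating-point summation order.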
set_robot_commands_max: 0.10188673496246338
set_robot_commands_mean: 0.09725490268071492
set_robot_commands_median: 0.09696175257364908
set_robot_commands_min: 0.09229744116465252
sim_compute_performance-ego_max: 0.07429551839828491
sim_compute_performance-ego_mean: 0.06975449132919312
sim_compute_performance-ego_median: 0.06971666097640991
sim_compute_performance-ego_min: 0.06496172825495403
sim_compute_robot_state-ego_max: 0.08194465319315593
sim_compute_robot_state-ego_mean: 0.07561788956324259
sim_compute_robot_state-ego_median: 0.07634682099024455
sim_compute_robot_state-ego_min: 0.07016754388809204
sim_compute_sim_state_max: 0.041293907165527347
sim_compute_sim_state_mean: 0.0391090284983317
sim_compute_sim_state_median: 0.0391504955291748
sim_compute_sim_state_min: 0.03641358375549317
sim_physics_max: 0.1318619632720947
sim_physics_mean: 0.11808644024531044
sim_physics_median: 0.1208221952120463
sim_physics_min: 0.099455828666687
sim_render-ego_max: 0.06630721251169841
sim_render-ego_mean: 0.06233806848526001
sim_render-ego_median: 0.06279377778371176
sim_render-ego_min: 0.05829017957051595
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
20854 | step1-simulation | success | no | 0:20:02
20840 | step1-simulation | host-error | no | 0:14:10
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 483, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 332, in convert
    data = open(log).read().strip()
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3055/step1-simulation-ip-172-31-25-98-2348-job20840/logs/challenges-runner/stdout.log'
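The host error above is not a failure of the submission itself: `convert` in the challenge runner tries to read `stdout.log` after the evaluation, and crashes when the evaluator died before creating that file. A hypothetical defensive version is sketched below; the function name and call site come from the traceback, but the placeholder-string fallback is an assumption, not the actual fix shipped in `duckietown_challenges_runner`:

```python
import os

def convert(log: str) -> str:
    """Read and strip a log file, tolerating a missing file.

    The runner in the traceback raised FileNotFoundError here when the
    log was never written; returning a placeholder string instead lets
    log collection finish and surface the real cause of the failure.
    """
    if not os.path.exists(log):
        return f"(log file not found: {log})"
    with open(log) as f:
        return f.read().strip()
```

A tolerant reader like this is preferable inside cleanup code (here, the `__exit__` path of `setup_logging`), because an exception raised during context-manager teardown masks whatever error originally aborted the evaluation.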
20177 | step1-simulation | success | no | 0:22:43
20103 | step1-simulation | success | no | 0:19:20