
Submission 3114

Submission: 3114
Competing: yes
Challenge: aido2-LF-sim-validation
User: Peter Almasi 🇭🇺
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22605
Next:
User label: Baseline solution using reinforcement learning
Admin priority: 50
Blessing: n/a
User priority: 50

Job 22605

Episodes (per-episode statistics images omitted):
ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job 22605: step1-simulation, success, up to date, duration 0:27:03
driven_lanedir_consec_median: 2.185353061495548
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.9435008220616988
in-drivable-lane_median: 0


Other stats

agent_compute-ego_max: 0.28572375535964967
agent_compute-ego_mean: 0.2666672357657017
agent_compute-ego_median: 0.256938803990682
agent_compute-ego_min: 0.25221720774968465
deviation-center-line_max: 0.9822461139602914
deviation-center-line_mean: 0.8606329948319832
deviation-center-line_min: 0.5558214411706502
deviation-heading_max: 3.2600620014175123
deviation-heading_mean: 2.7476538296717954
deviation-heading_median: 2.654186664499502
deviation-heading_min: 2.334890485276209
driven_any_max: 2.3249545528410955
driven_any_mean: 2.168326033245046
driven_any_median: 2.226723976325144
driven_any_min: 1.9668256472301673
driven_lanedir_consec_max: 2.27781394553347
driven_lanedir_consec_mean: 2.0950503876198985
driven_lanedir_consec_min: 1.8348894432223557
driven_lanedir_max: 2.27781394553347
driven_lanedir_mean: 2.0950503876198985
driven_lanedir_median: 2.185353061495548
driven_lanedir_min: 1.8348894432223557
in-drivable-lane_max: 1.0000000000000142
in-drivable-lane_mean: 0.20000000000000284
in-drivable-lane_min: 0
per-episode details (JSON):
{"ETHZ_autolab_technical_track-0-0": {"driven_any": 1.9668256472301673, "sim_physics": 0.12550162919711716, "survival_time": 13.65000000000006, "driven_lanedir": 1.899485956723519, "sim_render-ego": 0.06567252075279152, "in-drivable-lane": 0, "agent_compute-ego": 0.28487812555753267, "deviation-heading": 2.93529075502866, "set_robot_commands": 0.09840394376398442, "deviation-center-line": 0.5558214411706502, "driven_lanedir_consec": 1.899485956723519, "sim_compute_sim_state": 0.04072246184715858, "sim_compute_performance-ego": 0.0730905864701603, "sim_compute_robot_state-ego": 0.07988009610018887},
 "ETHZ_autolab_technical_track-1-0": {"driven_any": 2.321044713767219, "sim_physics": 0.11436676184336345, "survival_time": 14.950000000000076, "driven_lanedir": 2.2777095311245996, "sim_render-ego": 0.0660379449526469, "in-drivable-lane": 0, "agent_compute-ego": 0.28572375535964967, "deviation-heading": 2.5538392421370943, "set_robot_commands": 0.1003089435895284, "deviation-center-line": 0.9709933754294416, "driven_lanedir_consec": 2.2777095311245996, "sim_compute_sim_state": 0.04115877389907837, "sim_compute_performance-ego": 0.07202286799748739, "sim_compute_robot_state-ego": 0.08049248774846395},
 "ETHZ_autolab_technical_track-2-0": {"driven_any": 2.3249545528410955, "sim_physics": 0.10777249972025552, "survival_time": 14.950000000000076, "driven_lanedir": 2.27781394553347, "sim_render-ego": 0.057764697869618735, "in-drivable-lane": 0, "agent_compute-ego": 0.256938803990682, "deviation-heading": 2.654186664499502, "set_robot_commands": 0.09034234126408897, "deviation-center-line": 0.9435008220616988, "driven_lanedir_consec": 2.27781394553347, "sim_compute_sim_state": 0.03809483289718628, "sim_compute_performance-ego": 0.06409759998321533, "sim_compute_robot_state-ego": 0.06853599468866985},
 "ETHZ_autolab_technical_track-3-0": {"driven_any": 2.0020812760616047, "sim_physics": 0.10309399843215944, "survival_time": 14.950000000000076, "driven_lanedir": 1.8348894432223557, "sim_render-ego": 0.05772845347722371, "in-drivable-lane": 1.0000000000000142, "agent_compute-ego": 0.25221720774968465, "deviation-heading": 3.2600620014175123, "set_robot_commands": 0.08923460880915324, "deviation-center-line": 0.8506032215378343, "driven_lanedir_consec": 1.8348894432223557, "sim_compute_sim_state": 0.03678608973821004, "sim_compute_performance-ego": 0.0691389004389445, "sim_compute_robot_state-ego": 0.07210917949676514},
 "ETHZ_autolab_technical_track-4-0": {"driven_any": 2.226723976325144, "sim_physics": 0.10789947907129924, "survival_time": 14.950000000000076, "driven_lanedir": 2.185353061495548, "sim_render-ego": 0.06056863069534302, "in-drivable-lane": 0, "agent_compute-ego": 0.25357828617095945, "deviation-heading": 2.334890485276209, "set_robot_commands": 0.09022313435872396, "deviation-center-line": 0.9822461139602914, "driven_lanedir_consec": 2.185353061495548, "sim_compute_sim_state": 0.037215991020202635, "sim_compute_performance-ego": 0.0681192684173584, "sim_compute_robot_state-ego": 0.07108312129974365}}
set_robot_commands_max: 0.1003089435895284
set_robot_commands_mean: 0.0937025943570958
set_robot_commands_median: 0.09034234126408897
set_robot_commands_min: 0.08923460880915324
sim_compute_performance-ego_max: 0.0730905864701603
sim_compute_performance-ego_mean: 0.06929384466143318
sim_compute_performance-ego_median: 0.0691389004389445
sim_compute_performance-ego_min: 0.06409759998321533
sim_compute_robot_state-ego_max: 0.08049248774846395
sim_compute_robot_state-ego_mean: 0.07442017586676629
sim_compute_robot_state-ego_median: 0.07210917949676514
sim_compute_robot_state-ego_min: 0.06853599468866985
sim_compute_sim_state_max: 0.04115877389907837
sim_compute_sim_state_mean: 0.03879562988036718
sim_compute_sim_state_median: 0.03809483289718628
sim_compute_sim_state_min: 0.03678608973821004
sim_physics_max: 0.12550162919711716
sim_physics_mean: 0.11172687365283895
sim_physics_median: 0.10789947907129924
sim_physics_min: 0.10309399843215944
sim_render-ego_max: 0.0660379449526469
sim_render-ego_mean: 0.061554449549524784
sim_render-ego_median: 0.06056863069534302
sim_render-ego_min: 0.05772845347722371
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.690000000000072
survival_time_min: 13.65000000000006
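The summary rows above are simple aggregates over the five episodes. As a sanity check, the driven_lanedir_consec statistics can be recomputed from the per-episode details with Python's statistics module (a minimal sketch; the values are copied from the per-episode JSON on this page):

```python
from statistics import mean, median

# driven_lanedir_consec per episode, copied from the per-episode details
driven_lanedir_consec = {
    "ETHZ_autolab_technical_track-0-0": 1.899485956723519,
    "ETHZ_autolab_technical_track-1-0": 2.2777095311245996,
    "ETHZ_autolab_technical_track-2-0": 2.27781394553347,
    "ETHZ_autolab_technical_track-3-0": 1.8348894432223557,
    "ETHZ_autolab_technical_track-4-0": 2.185353061495548,
}

values = list(driven_lanedir_consec.values())
print("median:", median(values))  # 2.185353061495548, matching the summary row
print("min:   ", min(values))
print("max:   ", max(values))
print("mean:  ", mean(values))
```

The same recomputation works for any of the other aggregated metrics (survival_time, deviation-heading, ...), since each summary row is just the max/mean/median/min over the five episode values.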
Job 21409: step1-simulation, success, not up to date, duration 0:14:41
Job 21406: step1-simulation, host-error, not up to date, duration 0:05:40
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 483, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 332, in convert
    data = open(log).read().strip()
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3114/step1-simulation-ip-172-31-40-253-32059-job21406/logs/challenges-runner/stdout.log'
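The host-error itself is straightforward: the runner's convert() helper opens the stdout log unconditionally, so when the evaluator never produced the file, the logging cleanup raises FileNotFoundError. A more defensive variant might look like the sketch below (hypothetical; convert_log and its behavior are illustrative, not the actual runner.py code):

```python
import os

def convert_log(log_path):
    """Read and strip a log file, tolerating a missing file.

    Hypothetical defensive variant of the convert() helper seen in the
    traceback above: returns None instead of raising FileNotFoundError
    when the evaluator never created the log.
    """
    if not os.path.exists(log_path):
        return None
    with open(log_path) as f:
        return f.read().strip()
```

Since the host-error job was superseded by the successful job 22605, this failure did not affect the final result.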