
Submission 6836

Submission: 6836
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58563
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58563

Episodes:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message

58563 | LFv-sim | success | up to date: yes | duration: 0:45:43
driven_lanedir_consec_median: 0.0
survival_time_median: 59.99999999999873
deviation-center-line_median: 1.2422730096440104
in-drivable-lane_median: 0.0


other stats
agent_compute-ego0_max: 0.013863969107253862
agent_compute-ego0_mean: 0.01349989614518457
agent_compute-ego0_median: 0.013634092404780838
agent_compute-ego0_min: 0.012867430663922744
complete-iteration_max: 0.3737413805787708
complete-iteration_mean: 0.32043829612390484
complete-iteration_median: 0.32394912389791775
complete-iteration_min: 0.26011355612101306
deviation-center-line_max: 4.053503393024394
deviation-center-line_mean: 1.731069028434027
deviation-center-line_min: 0.386226701423694
deviation-heading_max: 27.859809596736422
deviation-heading_mean: 14.925250437835436
deviation-heading_median: 14.309269950178932
deviation-heading_min: 3.22265225424745
driven_any_max: 2.6645352591003757e-13
driven_any_mean: 1.9984014443252818e-13
driven_any_median: 2.6645352591003757e-13
driven_any_min: 0.0
driven_lanedir_consec_max: 0.000286102294921875
driven_lanedir_consec_mean: 7.152557373046875e-05
driven_lanedir_consec_min: 0.0
driven_lanedir_max: 0.000286102294921875
driven_lanedir_mean: 7.152557373046875e-05
driven_lanedir_median: 0.0
driven_lanedir_min: 0.0
get_duckie_state_max: 1.4656489338108542e-06
get_duckie_state_mean: 1.3910066475975423e-06
get_duckie_state_median: 1.3781030608851349e-06
get_duckie_state_min: 1.3421715348090458e-06
get_robot_state_max: 0.004020071942839992
get_robot_state_mean: 0.0038994625645017342
get_robot_state_median: 0.0038756312776068464
get_robot_state_min: 0.0038265157599532535
get_state_dump_max: 0.004982528440362706
get_state_dump_mean: 0.004885346466655239
get_state_dump_median: 0.004875497059659299
get_state_dump_min: 0.0048078633069396515
get_ui_image_max: 0.03774603598322301
get_ui_image_mean: 0.03261442739302471
get_ui_image_median: 0.03223961368389272
get_ui_image_min: 0.028232446221090376
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details:
  "LF-norm-loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.030214931843779068, "step_physics": 0.23326154970110308, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004883332415286151, "get_robot_state": 0.0039037075963842182, "sim_render-ego0": 0.004010270179856528, "get_duckie_state": 1.3947784652519384e-06, "in-drivable-lane": 0.0, "deviation-heading": 22.66279310353771, "agent_compute-ego0": 0.013554243719845788, "complete-iteration": 0.30345383770360634, "set_robot_commands": 0.002387273321540826, "deviation-center-line": 4.053503393024394, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.009005303982394025, "sim_compute_performance-ego0": 0.0021449018775374566}
  "LF-norm-zigzag-000-ego0": {"driven_any": 0.0, "get_ui_image": 0.03774603598322301, "step_physics": 0.29285367581369876, "survival_time": 59.99999999999873, "driven_lanedir": 0.000286102294921875, "get_state_dump": 0.004867661704032447, "get_robot_state": 0.0038265157599532535, "sim_render-ego0": 0.003980345571170143, "get_duckie_state": 1.3421715348090458e-06, "in-drivable-lane": 0.0, "deviation-heading": 27.859809596736422, "agent_compute-ego0": 0.013863969107253862, "complete-iteration": 0.3737413805787708, "set_robot_commands": 0.0022820793123269063, "deviation-center-line": 1.0457540566888746, "driven_lanedir_consec": 0.000286102294921875, "sim_compute_sim_state": 0.012117934365951448, "sim_compute_performance-ego0": 0.002112445585138097}
  "LF-norm-techtrack-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.03426429552400638, "step_physics": 0.26924645692283766, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004982528440362706, "get_robot_state": 0.004020071942839992, "sim_render-ego0": 0.00404726635109475, "get_duckie_state": 1.4656489338108542e-06, "in-drivable-lane": 0.0, "deviation-heading": 3.22265225424745, "agent_compute-ego0": 0.013713941089715886, "complete-iteration": 0.34444441009222915, "set_robot_commands": 0.002432315176869312, "deviation-center-line": 1.4387919625991463, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.00948678901253096, "sim_compute_performance-ego0": 0.00215455634111568}
  "LF-norm-small_loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.028232446221090376, "step_physics": 0.19569750193453747, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.0048078633069396515, "get_robot_state": 0.0038475549588294746, "sim_render-ego0": 0.003901900498694325, "get_duckie_state": 1.3614276565183312e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.955746796820156, "agent_compute-ego0": 0.012867430663922744, "complete-iteration": 0.26011355612101306, "set_robot_commands": 0.002283374633916113, "deviation-center-line": 0.386226701423694, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.0063421593220605145, "sim_compute_performance-ego0": 0.002042305062553666}
set_robot_commands_max: 0.002432315176869312
set_robot_commands_mean: 0.0023462606111632894
set_robot_commands_median: 0.0023353239777284696
set_robot_commands_min: 0.0022820793123269063
sim_compute_performance-ego0_max: 0.00215455634111568
sim_compute_performance-ego0_mean: 0.002113552216586225
sim_compute_performance-ego0_median: 0.002128673731337777
sim_compute_performance-ego0_min: 0.002042305062553666
sim_compute_sim_state_max: 0.012117934365951448
sim_compute_sim_state_mean: 0.009238046670734235
sim_compute_sim_state_median: 0.009246046497462492
sim_compute_sim_state_min: 0.0063421593220605145
sim_render-ego0_max: 0.00404726635109475
sim_render-ego0_mean: 0.003984945650203937
sim_render-ego0_median: 0.0039953078755133355
sim_render-ego0_min: 0.003901900498694325
simulation-passed: 1
step_physics_max: 0.29285367581369876
step_physics_mean: 0.24776479609304425
step_physics_median: 0.25125400331197034
step_physics_min: 0.19569750193453747
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
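
The _max/_mean/_median/_min entries above are aggregates over the four per-episode values reported under "per-episodes details". A minimal sketch of how such aggregates can be recomputed from that dictionary, trimmed here to two metrics for brevity (the aggregate helper is illustrative, not part of the Duckietown tooling):

import json
import statistics

# Per-episode metrics copied from the "per-episodes details" block above,
# trimmed to two metrics to keep the example short.
episodes = {
    "LF-norm-loop-000-ego0": {"deviation-center-line": 4.053503393024394, "survival_time": 59.99999999999873},
    "LF-norm-zigzag-000-ego0": {"deviation-center-line": 1.0457540566888746, "survival_time": 59.99999999999873},
    "LF-norm-techtrack-000-ego0": {"deviation-center-line": 1.4387919625991463, "survival_time": 59.99999999999873},
    "LF-norm-small_loop-000-ego0": {"deviation-center-line": 0.386226701423694, "survival_time": 59.99999999999873},
}

def aggregate(metric: str) -> dict:
    """Summarize one metric across all episodes, as in the stats listed above."""
    values = [ep[metric] for ep in episodes.values()]
    return {
        f"{metric}_max": max(values),
        f"{metric}_mean": statistics.mean(values),
        f"{metric}_median": statistics.median(values),
        f"{metric}_min": min(values),
    }

# deviation-center-line_mean comes out around 1.731 and _median around 1.242,
# matching the aggregate rows reported above.
print(json.dumps(aggregate("deviation-center-line"), indent=2))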
58561 | LFv-sim | success | up to date: yes | duration: 0:45:57
58560 | LFv-sim | success | up to date: yes | duration: 0:45:16
58559 | LFv-sim | success | up to date: yes | duration: 0:45:52
52597 | LFv-sim | error | up to date: no | duration: 0:11:10 | message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139878508722352
- M:video_aido:cmdline(in:/;out:/) 139878508722256
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
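
The failure above happens on the evaluator side while rendering the episode video: the procgraph static_image block loads banner1.png, Pillow cannot recognize the file, and the resulting error is escalated to InvalidEvaluator; the submission itself is not at fault. A small sketch of how this Pillow failure mode can be reproduced and guarded against (the guard is illustrative, not part of the evaluator code):

from PIL import Image, UnidentifiedImageError

def try_load_banner(path: str = "banner1.png"):
    """Return the decoded image, or None with a diagnostic for the two common failure modes."""
    try:
        with Image.open(path) as im:
            im.load()  # force full decoding so truncated files fail here, not later
            return im.copy()
    except FileNotFoundError:
        print(f"{path} does not exist")
    except UnidentifiedImageError:
        # The case in the traceback above: the file exists but is not a decodable
        # image (e.g. an empty file or a non-image saved with a .png extension).
        print(f"{path} exists but cannot be identified as an image")
    return None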
52591 | LFv-sim | host-error | up to date: no | duration: 0:13:01 | message:
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
52588 | LFv-sim | error | up to date: no | duration: 0:10:32 | message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140543076741616
- M:video_aido:cmdline(in:/;out:/) 140543076743392
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
52578 | LFv-sim | host-error | up to date: no | duration: 0:10:27 | message:
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
52502 | LFv-sim | host-error | up to date: no | duration: 0:13:15 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 59, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [cdedc948e3b07fbb178cf69f3471a1710e92fce2659166e193d44664ee96b573]
│      services: dict[3]
│                │ evaluator:
│                │ dict[7]
│                │ │ image: docker.io/andreacensi/aido5-lf-sim-validation-lfv-sim-evaluator@sha256:2bc9fe8514d570141f87b0626353cbc6aebad89d220fecc8876a008efd430515
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 60.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 888
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |port: 10123
│                │ │ │ |scenarios:
│                │ │ │ |- /scenarios
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 6836
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_10_18_16_39_07@sha256:d46a1e162a23be313d74622e5bd0705d845c591f0f11e830fb8d1ddd4f89054c
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ ports: [10123]
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-04_f1b0bf41b8f7}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_11_49_30-25453/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:4848042f4d088b99b480cb1fc276e32f956d6b9dee27d70fcbaba500d8a8768c
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 6836
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_10_18_16_39_07@sha256:d46a1e162a23be313d74622e5bd0705d845c591f0f11e830fb8d1ddd4f89054c
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-04_f1b0bf41b8f7}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_11_49_30-25453/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ solution-ego0:
│                │ dict[6]
│                │ │ image: docker.io/melisande/aido-submissions@sha256:d46a1e162a23be313d74622e5bd0705d845c591f0f11e830fb8d1ddd4f89054c
│                │ │ environment:
│                │ │ dict[13]
│                │ │ │ AIDONODE_NAME: ego0
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego0-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego0-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 6836
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_10_18_16_39_07@sha256:d46a1e162a23be313d74622e5bd0705d845c591f0f11e830fb8d1ddd4f89054c
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-04_f1b0bf41b8f7}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission6836/LFv-sim-nogpu-prod-04_f1b0bf41b8f7-job52502-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_11_49_30-25453/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: {solution-ego0: cdedc948e3b07fbb178cf69f3471a1710e92fce2659166e193d44664ee96b573}
│         names: dict[1]
│                │ cdedc948e3b07fbb178cf69f3471a1710e92fce2659166e193d44664ee96b573: nogpu-prod-04_f1b0bf41b8f7-job52502-994434_solution-ego0_1

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 777, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 991, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
    services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
    raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: |cdedc948e3b07fbb178cf69f3471a1710e92fce2659166e193d44664ee96b573
│         |
│  names: dict[1]
│         │ cdedc948e3b07fbb178cf69f3471a1710e92fce2659166e193d44664ee96b573: nogpu-prod-04_f1b0bf41b8f7-job52502-994434_solution-ego0_1
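
This host error is raised by the challenge runner while collecting container logs: only the solution-ego0 container can be matched to a compose service, so get_services_id cannot map all three services (evaluator, simulator, solution-ego0) to container IDs. A rough sketch of that kind of lookup done directly against the docker-compose CLI, assuming a compose project name and working directory (the helper and its error handling are illustrative, not the runner's actual implementation):

import subprocess
from typing import Dict, List

def get_container_ids(project: str, services: List[str], cwd: str) -> Dict[str, str]:
    """Map each compose service to its container ID using `docker-compose ps -q`."""
    ids: Dict[str, str] = {}
    for service in services:
        out = subprocess.run(
            ["docker-compose", "-p", project, "ps", "-q", service],
            cwd=cwd, capture_output=True, text=True, check=True,
        ).stdout.strip()
        if not out:
            # The situation in the log above: a service has no container left
            # (it exited and was removed), so its ID cannot be recovered.
            raise RuntimeError(f"no container found for service {service!r}")
        ids[service] = out.splitlines()[0]
    return ids

# e.g. get_container_ids("job52502", ["evaluator", "simulator", "solution-ego0"], working_dir)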
52495 | LFv-sim | error | up to date: no | duration: 0:14:26 | message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140487071016320
- M:video_aido:cmdline(in:/;out:/) 140487071015120
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
41812 | LFv-sim | success | up to date: no | duration: 0:09:09
38369 | LFv-sim | success | up to date: no | duration: 0:16:40
36454 | LFv-sim | success | up to date: no | duration: 0:09:55
35875 | LFv-sim | success | up to date: no | duration: 0:00:57
35454 | LFv-sim | error | up to date: no | duration: 0:22:03 | message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-b2dee9d94ee0-1-job35454/logs/challenges-runner/stderr.log
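
The message above means the job ran but never produced challenge-results/challenge_results.yaml in its working directory, so the runner has nothing to score; the stderr.log listed above is the place to look for the underlying failure. A small sketch of the check the message describes, assuming PyYAML and the directory layout shown in the file list (the helper name is illustrative):

import os
import yaml  # PyYAML

def load_challenge_results(wd: str) -> dict:
    """Load challenge_results.yaml from a job working directory, or explain why it is missing."""
    path = os.path.join(wd, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(path):
        # The condition reported above: the evaluator stopped before writing results.
        raise FileNotFoundError(
            f"{path} does not exist; the evaluator probably did not finish. "
            "Check logs/challenges-runner/stderr.log in the same working directory."
        )
    with open(path) as f:
        return yaml.safe_load(f)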
35139 | LFv-sim | success | up to date: no | duration: 0:22:49
35138 | LFv-sim | success | up to date: no | duration: 0:23:05
34414 | LFv-sim | success | up to date: no | duration: 0:27:11
34249 | LFv-sim | aborted | up to date: no | duration: 0:27:46 | message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg07-1f09cddcc73e-1-job34249:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg07-1f09cddcc73e-1-job34249/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
33848 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg03-c2bc3037870e-1-job33848'
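
This and the following host errors are purely environmental: the runner cannot even create the job's working directory because the host's /tmp filesystem is full (ENOSPC). A hedged sketch of a pre-flight disk-space check a runner host could perform before accepting a job (the threshold and helper are illustrative, not part of duckietown-challenges-runner):

import shutil

def ensure_free_space(path: str = "/tmp", min_free_gb: float = 5.0) -> None:
    """Fail fast if the filesystem holding the working directories is nearly full."""
    free_gb = shutil.disk_usage(path).free / 1e9
    if free_gb < min_free_gb:
        # Mirrors the failure above, but before any job directories are created.
        raise OSError(
            f"only {free_gb:.1f} GB free under {path}; "
            f"need at least {min_free_gb:.1f} GB before starting a job"
        )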
33839 | LFv-sim | host-error | up to date: no | duration: 0:00:01 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg03-c2bc3037870e-1-job33839'
33834 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg04-bf35e9d68df4-1-job33834'
33828 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg04-bf35e9d68df4-1-job33828'
33823 | LFv-sim | host-error | up to date: no | duration: 0:00:01 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg01-53440c9394b5-1-job33823'
33819 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-5ca0d35e6d82-1-job33819'
33814 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg05-5ca0d35e6d82-1-job33814'
33803 | LFv-sim | host-error | up to date: no | duration: 0:00:00 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg11-951de1eeccca-1-job33803'
33722 | LFv-sim | host-error | up to date: no | duration: 0:00:01 | message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6836/LFv-sim-reg07-c4e193407567-1-job33722'
33420 | LFv-sim | success | up to date: no | duration: 0:23:46
33419 | LFv-sim | success | up to date: no | duration: 0:24:06