
Submission 9312

Submission: 9312
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jerome Labonte 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58248
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58248

Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
58248 | LFv-sim | success | yes | 0:25:19
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 6.031820652770268
survival_time_median: 49.9999999999993
deviation-center-line_median: 2.890422675124041
in-drivable-lane_median: 3.4499999999998687


other stats
agent_compute-ego0_max: 0.012406478362732317
agent_compute-ego0_mean: 0.012276380741659668
agent_compute-ego0_median: 0.012284028484818316
agent_compute-ego0_min: 0.01213098763426972
complete-iteration_max: 0.1979052769832122
complete-iteration_mean: 0.17552726892415405
complete-iteration_median: 0.17495263801231964
complete-iteration_min: 0.15429852268876482
deviation-center-line_max: 3.758763972217636
deviation-center-line_mean: 2.5023341170384743
deviation-center-line_min: 0.4697271456881783
deviation-heading_max: 12.82412797422542
deviation-heading_mean: 8.260324347972649
deviation-heading_median: 9.078803948993649
deviation-heading_min: 2.0595615196778807
driven_any_max: 11.679181502401502
driven_any_mean: 8.223934584604503
driven_any_median: 9.015676038053552
driven_any_min: 3.185204759909409
driven_lanedir_consec_max: 7.104699727992231
driven_lanedir_consec_mean: 5.146472905121384
driven_lanedir_consec_min: 1.4175505869527705
driven_lanedir_max: 11.29941124583484
driven_lanedir_mean: 7.387417596565012
driven_lanedir_median: 8.41635427673622
driven_lanedir_min: 1.4175505869527705
get_duckie_state_max: 2.30358800324274e-06
get_duckie_state_mean: 2.202856139607719e-06
get_duckie_state_median: 2.22553352004613e-06
get_duckie_state_min: 2.056769515095876e-06
get_robot_state_max: 0.003805351098510844
get_robot_state_mean: 0.003676816666338264
get_robot_state_median: 0.003668067316877763
get_robot_state_min: 0.003565780933086689
get_state_dump_max: 0.004711381600957231
get_state_dump_mean: 0.004611894442696353
get_state_dump_median: 0.0045920555990656385
get_state_dump_min: 0.004552084971696902
get_ui_image_max: 0.03427312679779835
get_ui_image_mean: 0.030095626159444507
get_ui_image_median: 0.030282848125690973
get_ui_image_min: 0.02554368158859774
in-drivable-lane_max: 11.500000000000124
in-drivable-lane_mean: 4.774999999999968
in-drivable-lane_min: 0.70000000000001
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 11.679181502401502, "get_ui_image": 0.028943001281014093, "step_physics": 0.10325940344156968, "survival_time": 59.99999999999873, "driven_lanedir": 11.29941124583484, "get_state_dump": 0.004711381600957231, "get_robot_state": 0.003805351098510844, "sim_render-ego0": 0.003842394119694668, "get_duckie_state": 2.30358800324274e-06, "in-drivable-lane": 0.70000000000001, "deviation-heading": 11.522751178524354, "agent_compute-ego0": 0.012376527206586064, "complete-iteration": 0.17183745850333565, "set_robot_commands": 0.002266838786802522, "deviation-center-line": 3.751344232044408, "driven_lanedir_consec": 6.253869021466859, "sim_compute_sim_state": 0.010468882386829336, "sim_compute_performance-ego0": 0.002075689420612726}, "LF-norm-zigzag-000-ego0": {"driven_any": 3.185204759909409, "get_ui_image": 0.03427312679779835, "step_physics": 0.1253918115909283, "survival_time": 19.45000000000014, "driven_lanedir": 1.4175505869527705, "get_state_dump": 0.004552084971696902, "get_robot_state": 0.003565780933086689, "sim_render-ego0": 0.003664032006875063, "get_duckie_state": 2.1659410916841942e-06, "in-drivable-lane": 11.500000000000124, "deviation-heading": 2.0595615196778807, "agent_compute-ego0": 0.012191529763050568, "complete-iteration": 0.1979052769832122, "set_robot_commands": 0.0020778521513327573, "deviation-center-line": 0.4697271456881783, "driven_lanedir_consec": 1.4175505869527705, "sim_compute_sim_state": 0.010203690406603691, "sim_compute_performance-ego0": 0.0019014523579524113}, "LF-norm-techtrack-000-ego0": {"driven_any": 7.531352203733553, "get_ui_image": 0.03162269497036785, "step_physics": 0.10669178790069848, "survival_time": 39.999999999999865, "driven_lanedir": 6.810000834221187, "get_state_dump": 0.004589244816335995, "get_robot_state": 0.0036260673913467542, "sim_render-ego0": 0.0036889935254157706, "get_duckie_state": 2.056769515095876e-06, "in-drivable-lane": 4.89999999999978, "deviation-heading": 
6.634856719462939, "agent_compute-ego0": 0.012406478362732317, "complete-iteration": 0.17806781752130363, "set_robot_commands": 0.0021248685286732647, "deviation-center-line": 2.029501118203675, "driven_lanedir_consec": 5.809772284073675, "sim_compute_sim_state": 0.01130362336852875, "sim_compute_performance-ego0": 0.0019284964500741568}, "LF-norm-small_loop-000-ego0": {"driven_any": 10.499999872373552, "get_ui_image": 0.02554368158859774, "step_physics": 0.09404513798188806, "survival_time": 59.99999999999873, "driven_lanedir": 10.022707719251253, "get_state_dump": 0.0045948663817952816, "get_robot_state": 0.003710067242408771, "sim_render-ego0": 0.003754696977029335, "get_duckie_state": 2.285125948408065e-06, "in-drivable-lane": 1.9999999999999576, "deviation-heading": 12.82412797422542, "agent_compute-ego0": 0.01213098763426972, "complete-iteration": 0.15429852268876482, "set_robot_commands": 0.002183860585056276, "deviation-center-line": 3.758763972217636, "driven_lanedir_consec": 7.104699727992231, "sim_compute_sim_state": 0.006277977874336592, "sim_compute_performance-ego0": 0.001972915925749335}}
set_robot_commands_max: 0.002266838786802522
set_robot_commands_mean: 0.002163355012966205
set_robot_commands_median: 0.00215436455686477
set_robot_commands_min: 0.0020778521513327573
sim_compute_performance-ego0_max: 0.002075689420612726
sim_compute_performance-ego0_mean: 0.001969638538597157
sim_compute_performance-ego0_median: 0.0019507061879117456
sim_compute_performance-ego0_min: 0.0019014523579524113
sim_compute_sim_state_max: 0.01130362336852875
sim_compute_sim_state_mean: 0.009563543509074592
sim_compute_sim_state_median: 0.010336286396716514
sim_compute_sim_state_min: 0.006277977874336592
sim_render-ego0_max: 0.003842394119694668
sim_render-ego0_mean: 0.003737529157253709
sim_render-ego0_median: 0.003721845251222553
sim_render-ego0_min: 0.003664032006875063
simulation-passed: 1
step_physics_max: 0.1253918115909283
step_physics_mean: 0.10734703522877112
step_physics_median: 0.10497559567113408
step_physics_min: 0.09404513798188806
survival_time_max: 59.99999999999873
survival_time_mean: 44.862499999999365
survival_time_min: 19.45000000000014
58246 | LFv-sim | host-error | yes | 0:31:11
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 59, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [791a9a3ef6aceb8d2aba4ccff4f9b1973088d56feedaba8354b8b2b169111171,
│                 e4a7c844e8f516a7dddb16e05f69cf540c45ebec690951b763316664626eb17a]
│      services: dict[3]
│                │ evaluator:
│                │ dict[7]
│                │ │ image: docker.io/andreacensi/aido5-lf-sim-validation-lfv-sim-evaluator@sha256:6d0af9441525e1ed05be582f41e00fc178083c86797d28cd1a255c7025d0fd50
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 60.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 888
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |port: 10123
│                │ │ │ |scenarios:
│                │ │ │ |- /scenarios
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 9312
│                │ │ │ submitter_name: jeromelabonte
│                │ │ │ SUBMISSION_CONTAINER: docker.io/jeromelabonte/aido-submissions:2020_10_25_18_51_25@sha256:04c7e311c0eaf6eb7705d26af59a86294addff2361cf6d4c9fb9c8ea981aa3c6
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ ports: [10123]
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-07_27d69f530ab0}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_03_22_29_20-39644/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:c0096866077db3574e425d40603d8f5fc8ebbd164da7c0578df94ff4ede58d95
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 9312
│                │ │ │ submitter_name: jeromelabonte
│                │ │ │ SUBMISSION_CONTAINER: docker.io/jeromelabonte/aido-submissions:2020_10_25_18_51_25@sha256:04c7e311c0eaf6eb7705d26af59a86294addff2361cf6d4c9fb9c8ea981aa3c6
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-07_27d69f530ab0}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_03_22_29_20-39644/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ solution-ego0:
│                │ dict[6]
│                │ │ image: docker.io/jeromelabonte/aido-submissions@sha256:04c7e311c0eaf6eb7705d26af59a86294addff2361cf6d4c9fb9c8ea981aa3c6
│                │ │ environment:
│                │ │ dict[13]
│                │ │ │ AIDONODE_NAME: ego0
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego0-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego0-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 9312
│                │ │ │ submitter_name: jeromelabonte
│                │ │ │ SUBMISSION_CONTAINER: docker.io/jeromelabonte/aido-submissions:2020_10_25_18_51_25@sha256:04c7e311c0eaf6eb7705d26af59a86294addff2361cf6d4c9fb9c8ea981aa3c6
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: nogpu-prod-07_27d69f530ab0}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission9312/LFv-sim-nogpu-prod-07_27d69f530ab0-job58246-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_03_22_29_20-39644/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[2]
│                │ evaluator: 791a9a3ef6aceb8d2aba4ccff4f9b1973088d56feedaba8354b8b2b169111171
│                │ solution-ego0: e4a7c844e8f516a7dddb16e05f69cf540c45ebec690951b763316664626eb17a
│         names: dict[2]
│                │ 791a9a3ef6aceb8d2aba4ccff4f9b1973088d56feedaba8354b8b2b169111171: nogpu-prod-07_27d69f530ab0-job58246-666783_evaluator_1
│                │ e4a7c844e8f516a7dddb16e05f69cf540c45ebec690951b763316664626eb17a: nogpu-prod-07_27d69f530ab0-job58246-666783_solution-ego0_1

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 745, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 959, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
    services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
    raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: |791a9a3ef6aceb8d2aba4ccff4f9b1973088d56feedaba8354b8b2b169111171
│         |e4a7c844e8f516a7dddb16e05f69cf540c45ebec690951b763316664626eb17a
│         |
│  names: dict[2]
│         │ 791a9a3ef6aceb8d2aba4ccff4f9b1973088d56feedaba8354b8b2b169111171: nogpu-prod-07_27d69f530ab0-job58246-666783_evaluator_1
│         │ e4a7c844e8f516a7dddb16e05f69cf540c45ebec690951b763316664626eb17a: nogpu-prod-07_27d69f530ab0-job58246-666783_solution-ego0_1
52420 | LFv-sim | error | no | 0:09:53
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139866483119536
- M:video_aido:cmdline(in:/;out:/) 139866483120544
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
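Jobs 52420 and 52411 died for the same reason: PIL could not decode `banner1.png`, which usually means the file is empty, truncated, or not actually an image. The failure mode is easy to reproduce with Pillow (a sketch, not the evaluator's code; `is_readable_image` is a hypothetical helper):

```python
import os
import tempfile

from PIL import Image, UnidentifiedImageError

def is_readable_image(path: str) -> bool:
    """Return True if Pillow can identify and verify the file as an image."""
    try:
        with Image.open(path) as im:
            im.verify()  # integrity check without a full decode
        return True
    except (FileNotFoundError, UnidentifiedImageError, OSError):
        return False

# A file with a .png name but non-PNG bytes raises UnidentifiedImageError,
# exactly the error in the tracebacks above.
fd, path = tempfile.mkstemp(suffix=".png")
os.write(fd, b"definitely not a png")
os.close(fd)
assert not is_readable_image(path)
os.remove(path)
```

Validating static assets up front this way could turn the late `BadMethodCall` inside the video pipeline into an earlier, clearer failure.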
52411 | LFv-sim | error | no | 0:03:47
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139751068943744
- M:video_aido:cmdline(in:/;out:/) 139751070121216
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
52410 | LFv-sim | timeout | no | 0:13:22
Timeout because evaluator contacted us
52326 | LFv-sim | host-error | no | 0:07:07
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
41742 | LFv-sim | success | no | 0:09:32
41741 | LFv-sim | success | no | 0:09:25
41736 | LFv-sim | success | no | 0:09:51
41734 | LFv-sim | success | no | 0:10:15
38198 | LFv-sim | error | no | 0:00:35
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9312/LFv-sim-mont01-6ef51bb8a9d6-1-job38198-a-wd/challenge-results/challenge_results.yaml' does not exist.
38196 | LFv-sim | error | no | 0:00:40
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9312/LFv-sim-mont02-80325a328f54-1-job38196-a-wd/challenge-results/challenge_results.yaml' does not exist.
38195 | LFv-sim | error | no | 0:00:41
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9312/LFv-sim-mont04-e828c68b6a88-1-job38195-a-wd/challenge-results/challenge_results.yaml' does not exist.
36332 | LFv-sim | error | no | 0:00:50
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-Sandy1-sandy-1-job36332-a-wd/challenge-results/challenge_results.yaml' does not exist.
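Jobs 38198, 38196, 38195, and 36332 all end the same way: the runner looks for `challenge-results/challenge_results.yaml` in the job's working directory, and when a container exits before writing it, `read_challenge_results` raises `NoResultsFound`. A stdlib sketch that mirrors that check (hypothetical helper names, not the actual `duckietown_challenges` code):

```python
import tempfile
from pathlib import Path

class NoResultsFound(Exception):
    """Mirrors the runner's error for a missing results file."""

def locate_results(wd: str) -> Path:
    """Return the expected results file for a job working dir, or raise if absent."""
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    if not results.is_file():
        raise NoResultsFound(f"File '{results}' does not exist.")
    return results

# An empty working directory (the container crashed before writing results)
# reproduces the failure mode above.
with tempfile.TemporaryDirectory() as wd:
    try:
        locate_results(wd)
    except NoResultsFound:
        pass  # expected: no results were written
```

The missing-file case is what separates "the evaluation infrastructure died" from a scored-but-failing run, which is why these jobs report an error rather than a result.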
36330 | LFv-sim | success | no | 0:09:53
35763 | LFv-sim | success | no | 0:01:09
35761 | LFv-sim | success | no | 0:00:56
35367 | LFv-sim | error | no | 0:22:28
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9312/LFv-sim-reg02-1b92df2e7e91-1-job35367/logs/challenges-runner/stderr.log
35000 | LFv-sim | success | no | 0:24:14
34659 | LFv-sim | success | no | 0:25:24
34658 | LFv-sim | success | no | 0:24:55
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
No reset possible