
Submission 9369

Submission: 9369
Competing: yes
Challenge: aido5-LF-sim-validation
User: Yishu Malhotra 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58141
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58141


Episodes:
 - LF-norm-loop-000
 - LF-norm-small_loop-000
 - LF-norm-techtrack-000
 - LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
Job 58141 | step: LFv-sim | status: success | up to date: yes | duration: 0:04:59
driven_lanedir_consec_median: 0.47469644768734554
survival_time_median: 5.799999999999987
deviation-center-line_median: 0.1652944732306097
in-drivable-lane_median: 3.424999999999989


other stats
agent_compute-ego0_max: 0.014596813856953324
agent_compute-ego0_mean: 0.013418847389869985
agent_compute-ego0_median: 0.013179750080337827
agent_compute-ego0_min: 0.01271907554185095
complete-iteration_max: 0.20055002026853308
complete-iteration_mean: 0.18011886253051654
complete-iteration_median: 0.18268783873418495
complete-iteration_min: 0.1545497523851631
deviation-center-line_max: 0.207379547721231
deviation-center-line_mean: 0.1575691113149522
deviation-center-line_min: 0.09230795107735856
deviation-heading_max: 1.6028043324994483
deviation-heading_mean: 0.9725201313494434
deviation-heading_median: 0.9222809147569432
deviation-heading_min: 0.4427143633844387
driven_any_max: 2.742138185962727
driven_any_mean: 1.6170509091491772
driven_any_median: 1.3305380001207268
driven_any_min: 1.064989450392528
driven_lanedir_consec_max: 0.6308086958038511
driven_lanedir_consec_mean: 0.476292081256212
driven_lanedir_consec_min: 0.3249667338463058
driven_lanedir_max: 0.6308086958038511
driven_lanedir_mean: 0.476292081256212
driven_lanedir_median: 0.47469644768734554
driven_lanedir_min: 0.3249667338463058
get_duckie_state_max: 1.672211043331601e-06
get_duckie_state_mean: 1.4866710385120431e-06
get_duckie_state_median: 1.4968560743003703e-06
get_duckie_state_min: 1.2807609621158317e-06
get_robot_state_max: 0.003997400553539546
get_robot_state_mean: 0.003799567614904775
get_robot_state_median: 0.0037603725751731423
get_robot_state_min: 0.0036801247557332697
get_state_dump_max: 0.005149003231164181
get_state_dump_mean: 0.004820475857051579
get_state_dump_median: 0.00471920336302421
get_state_dump_min: 0.004694493470993717
get_ui_image_max: 0.03672630597004848
get_ui_image_mean: 0.030905236098716236
get_ui_image_median: 0.029943165771601735
get_ui_image_min: 0.027008306881612983
in-drivable-lane_max: 8.850000000000005
in-drivable-lane_mean: 4.462499999999994
in-drivable-lane_min: 2.1499999999999924
per-episodes details:
{
  "LF-norm-loop-000-ego0": {"driven_any": 2.742138185962727, "get_ui_image": 0.028065526157344152, "step_physics": 0.09710758979167414, "survival_time": 10.85000000000002, "driven_lanedir": 0.3249667338463058, "get_state_dump": 0.004715734665546942, "get_robot_state": 0.003813918577421696, "sim_render-ego0": 0.00406168797694215, "get_duckie_state": 1.672211043331601e-06, "in-drivable-lane": 8.850000000000005, "deviation-heading": 1.3647347330347204, "agent_compute-ego0": 0.013279320996835693, "complete-iteration": 0.16529566322991607, "set_robot_commands": 0.0022818025099028143, "deviation-center-line": 0.207379547721231, "driven_lanedir_consec": 0.3249667338463058, "sim_compute_sim_state": 0.009721477097327557, "sim_compute_performance-ego0": 0.002150901960670401},
  "LF-norm-zigzag-000-ego0": {"driven_any": 1.2746330422663896, "get_ui_image": 0.03672630597004848, "step_physics": 0.12449542189066388, "survival_time": 5.599999999999988, "driven_lanedir": 0.3485414367201807, "get_state_dump": 0.004694493470993717, "get_robot_state": 0.0037068265729245887, "sim_render-ego0": 0.0038974158531796615, "get_duckie_state": 1.377764001356817e-06, "in-drivable-lane": 3.3999999999999906, "deviation-heading": 1.6028043324994483, "agent_compute-ego0": 0.013080179163839963, "complete-iteration": 0.20055002026853308, "set_robot_commands": 0.002274981642191389, "deviation-center-line": 0.17269441294522414, "driven_lanedir_consec": 0.3485414367201807, "sim_compute_sim_state": 0.009608821531312655, "sim_compute_performance-ego0": 0.0019836552375185807},
  "LF-norm-techtrack-000-ego0": {"driven_any": 1.064989450392528, "get_ui_image": 0.031820805385859324, "step_physics": 0.12729428994535197, "survival_time": 4.899999999999991, "driven_lanedir": 0.6308086958038511, "get_state_dump": 0.005149003231164181, "get_robot_state": 0.003997400553539546, "sim_render-ego0": 0.004011917595911508, "get_duckie_state": 1.6159481472439236e-06, "in-drivable-lane": 2.1499999999999924, "deviation-heading": 0.4798270964791663, "agent_compute-ego0": 0.014596813856953324, "complete-iteration": 0.20008001423845387, "set_robot_commands": 0.0023890890256322997, "deviation-center-line": 0.09230795107735856, "driven_lanedir_consec": 0.6308086958038511, "sim_compute_sim_state": 0.008497700546727036, "sim_compute_performance-ego0": 0.0022202019739632653},
  "LF-norm-small_loop-000-ego0": {"driven_any": 1.3864429579750637, "get_ui_image": 0.027008306881612983, "step_physics": 0.09316998079788588, "survival_time": 5.999999999999987, "driven_lanedir": 0.6008514586545104, "get_state_dump": 0.004722672060501477, "get_robot_state": 0.0036801247557332697, "sim_render-ego0": 0.003851551654910253, "get_duckie_state": 1.2807609621158317e-06, "in-drivable-lane": 3.4499999999999877, "deviation-heading": 0.4427143633844387, "agent_compute-ego0": 0.01271907554185095, "complete-iteration": 0.1545497523851631, "set_robot_commands": 0.0022397415697082016, "deviation-center-line": 0.15789453351599525, "driven_lanedir_consec": 0.6008514586545104, "sim_compute_sim_state": 0.005127786604826115, "sim_compute_performance-ego0": 0.001951653110094307}
}
set_robot_commands_max: 0.0023890890256322997
set_robot_commands_mean: 0.0022964036868586765
set_robot_commands_median: 0.0022783920760471015
set_robot_commands_min: 0.0022397415697082016
sim_compute_performance-ego0_max: 0.0022202019739632653
sim_compute_performance-ego0_mean: 0.0020766030705616383
sim_compute_performance-ego0_median: 0.0020672785990944907
sim_compute_performance-ego0_min: 0.001951653110094307
sim_compute_sim_state_max: 0.009721477097327557
sim_compute_sim_state_mean: 0.00823894644504834
sim_compute_sim_state_median: 0.009053261039019846
sim_compute_sim_state_min: 0.005127786604826115
sim_render-ego0_max: 0.00406168797694215
sim_render-ego0_mean: 0.0039556432702358936
sim_render-ego0_median: 0.003954666724545585
sim_render-ego0_min: 0.003851551654910253
simulation-passed: 1
step_physics_max: 0.12729428994535197
step_physics_mean: 0.11051682060639396
step_physics_median: 0.11080150584116902
step_physics_min: 0.09316998079788588
survival_time_max: 10.85000000000002
survival_time_mean: 6.837499999999996
survival_time_min: 4.899999999999991
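
The aggregate rows above are per-metric statistics taken over the four episodes. A minimal sketch of how they can be recomputed from the per-episode details, assuming that JSON object has been saved to a file named per_episodes.json (a hypothetical filename):

    import json
    import statistics

    # Load the per-episode details (the JSON object shown above).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Gather each metric's values across episodes, then aggregate.
    metrics = {}
    for episode_stats in episodes.values():
        for name, value in episode_stats.items():
            metrics.setdefault(name, []).append(value)

    for name, values in sorted(metrics.items()):
        print(f"{name}_max: {max(values)}")
        print(f"{name}_mean: {statistics.mean(values)}")
        print(f"{name}_median: {statistics.median(values)}")
        print(f"{name}_min: {min(values)}")

With four episodes, the median is the average of the two middle values, which matches figures such as survival_time_median above.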
Job 58139 | step: LFv-sim | status: success | up to date: yes | duration: 0:05:24
Job 52265 | step: LFv-sim | status: error | up to date: no | duration: 0:01:38
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140269797543072
- M:video_aido:cmdline(in:/;out:/) 140269798675504
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
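
The root cause above is PIL failing to identify 'banner1.png' while the experiment manager assembles the episode video, which then surfaces as InvalidEvaluator. A minimal sketch that reproduces the same exception class, assuming a zero-byte file stands in for the unreadable banner asset:

    from PIL import Image, UnidentifiedImageError

    # Stand-in for the corrupt/unreadable banner: an empty file with the same name.
    with open("banner1.png", "wb"):
        pass

    try:
        Image.open("banner1.png")
    except UnidentifiedImageError as e:
        # Same exception class as in the evaluator traceback above.
        print(f"Reproduced: {e}")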
Job 52248 | step: LFv-sim | status: error | up to date: no | duration: 0:02:44
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140229167516496
- M:video_aido:cmdline(in:/;out:/) 140229169168000
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52228 | step: LFv-sim | status: error | up to date: no | duration: 0:03:07
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139794827130960
- M:video_aido:cmdline(in:/;out:/) 139794828793216
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 41681 | step: LFv-sim | status: success | up to date: no | duration: 0:05:48
Job 38132 | step: LFv-sim | status: error | up to date: no | duration: 0:00:46
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9369/LFv-sim-mont01-6ef51bb8a9d6-1-job38132-a-wd/challenge-results/challenge_results.yaml' does not exist.
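
The runner decides whether a job produced results by looking for challenge-results/challenge_results.yaml inside the job's working directory; when the evaluator container exits early, that file is never written and NoResultsFound is raised. A minimal sketch of that kind of check, assuming PyYAML is installed and using a hypothetical helper name:

    import os
    import yaml  # PyYAML, assumed to be installed

    def load_challenge_results(wd: str) -> dict:
        # Hypothetical helper mirroring the check the runner performs.
        path = os.path.join(wd, "challenge-results", "challenge_results.yaml")
        if not os.path.exists(path):
            raise FileNotFoundError(f"File {path!r} does not exist.")
        with open(path) as f:
            return yaml.safe_load(f)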
Job 38126 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 654, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation'
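
The host-error jobs in this series fail before any evaluation starts: os.makedirs cannot create the working directory, which suggests the evaluation root under /media/sdb is missing or unmounted on that host. A minimal sketch of a pre-flight check along those lines, with the root path taken from the traceback above (exactly where the mount boundary lies is an assumption):

    import os

    # Root taken from the traceback above; the mount boundary is an assumption.
    EVAL_ROOT = "/media/sdb/duckietown-tmp-evals"

    def make_workdir(path: str) -> str:
        # Fail early with a clear message if the storage root is absent,
        # instead of letting the recursive mkdir raise FileNotFoundError.
        if not os.path.isdir(EVAL_ROOT):
            raise RuntimeError(f"Evaluation root {EVAL_ROOT!r} is not available on this host")
        os.makedirs(path, exist_ok=True)
        return path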
Job 38121 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 654, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation'
Job 38116 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 654, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation'
Job 38114 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 654, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation'
Job 38107 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 654, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation'
Job 36259 | step: LFv-sim | status: error | up to date: no | duration: 0:00:54
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-Sandy1-sandy-1-job36259-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35702 | step: LFv-sim | status: success | up to date: no | duration: 0:00:57
Job 35321 | step: LFv-sim | status: error | up to date: no | duration: 0:11:56
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9369/LFv-sim-reg05-b2dee9d94ee0-1-job35321/logs/challenges-runner/stderr.log
Job 34960 | step: LFv-sim | status: success | up to date: no | duration: 0:12:46
Job 34821 | step: LFv-sim | status: success | up to date: no | duration: 0:15:21