
Submission 9316

Submission: 9316
Competing: yes
Challenge: aido5-LF-sim-validation
User: Yishu Malhotra 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58208
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58208

Episodes (click the images on the dashboard to see detailed statistics about each episode):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
Job 58208 | step: LFv-sim | status: success | up to date: yes | duration: 0:05:06
Scores:
driven_lanedir_consec_median: 0.3853849236817926
survival_time_median: 5.349999999999989
deviation-center-line_median: 0.11770715996561308
in-drivable-lane_median: 3.47499999999999


Other stats:
agent_compute-ego0_max: 0.013833780757716446
agent_compute-ego0_mean: 0.013460001234059283
agent_compute-ego0_median: 0.013510854971035251
agent_compute-ego0_min: 0.012984514236450195
complete-iteration_max: 0.2031560831881584
complete-iteration_mean: 0.18278777900567936
complete-iteration_median: 0.18722198411866164
complete-iteration_min: 0.15355106459723578
deviation-center-line_max: 0.17267119329177
deviation-center-line_mean: 0.12450991996341523
deviation-center-line_min: 0.08995416663066484
deviation-heading_max: 2.1062177677608003
deviation-heading_mean: 0.8254184367681409
deviation-heading_median: 0.4301745456679913
deviation-heading_min: 0.33510688797578014
driven_any_max: 1.7369358790122689
driven_any_mean: 1.3779608075352732
driven_any_median: 1.4270822659673563
driven_any_min: 0.920742819194111
driven_lanedir_consec_max: 0.5085667744416196
driven_lanedir_consec_mean: 0.3611295624605418
driven_lanedir_consec_min: 0.165181628036962
driven_lanedir_max: 0.5085667744416196
driven_lanedir_mean: 0.3624402628329257
driven_lanedir_median: 0.3880063244265606
driven_lanedir_min: 0.165181628036962
get_duckie_state_max: 1.5588907095102163e-06
get_duckie_state_mean: 1.414963146320092e-06
get_duckie_state_median: 1.4102829014086582e-06
get_duckie_state_min: 1.2803960729528356e-06
get_robot_state_max: 0.0038628333654159154
get_robot_state_mean: 0.003790519306414027
get_robot_state_median: 0.003799824752298064
get_robot_state_min: 0.003699594355644064
get_state_dump_max: 0.005063533782958984
get_state_dump_mean: 0.004876463387043251
get_state_dump_median: 0.004910025544228103
get_state_dump_min: 0.004622268676757813
get_ui_image_max: 0.03574633851964423
get_ui_image_mean: 0.030867118068382444
get_ui_image_median: 0.030845273888978
get_ui_image_min: 0.02603158597592954
in-drivable-lane_max: 4.6499999999999835
in-drivable-lane_mean: 3.487499999999989
in-drivable-lane_min: 2.3499999999999948
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 0.920742819194111, "get_ui_image": 0.02892111203609369, "step_physics": 0.1140574369675074, "survival_time": 3.8499999999999943, "driven_lanedir": 0.165181628036962, "get_state_dump": 0.005063533782958984, "get_robot_state": 0.0038628333654159154, "sim_render-ego0": 0.003962321159167168, "get_duckie_state": 1.5588907095102163e-06, "in-drivable-lane": 2.9999999999999942, "deviation-heading": 0.33510688797578014, "agent_compute-ego0": 0.01382733614016802, "complete-iteration": 0.18207663144820777, "set_robot_commands": 0.002511840600233811, "deviation-center-line": 0.08995416663066484, "driven_lanedir_consec": 0.165181628036962, "sim_compute_sim_state": 0.007762319002396021, "sim_compute_performance-ego0": 0.002020857273003994},
"LF-norm-zigzag-000-ego0": {"driven_any": 1.2545156271151603, "get_ui_image": 0.03574633851964423, "step_physics": 0.12811592284669268, "survival_time": 4.6499999999999915, "driven_lanedir": 0.33386546905298786, "get_state_dump": 0.004808299084927173, "get_robot_state": 0.003699594355644064, "sim_render-ego0": 0.0038583253292327232, "get_duckie_state": 1.4330478424721575e-06, "in-drivable-lane": 2.3499999999999948, "deviation-heading": 2.1062177677608003, "agent_compute-ego0": 0.012984514236450195, "complete-iteration": 0.2031560831881584, "set_robot_commands": 0.002232437438153206, "deviation-center-line": 0.17267119329177, "driven_lanedir_consec": 0.32862266756345204, "sim_compute_sim_state": 0.009537463492535533, "sim_compute_performance-ego0": 0.0020828069524562107},
"LF-norm-techtrack-000-ego0": {"driven_any": 1.5996489048195528, "get_ui_image": 0.032769435741862314, "step_physics": 0.1201106388060773, "survival_time": 6.0499999999999865, "driven_lanedir": 0.5085667744416196, "get_state_dump": 0.0050117520035290325, "get_robot_state": 0.0038358696171494782, "sim_render-ego0": 0.003896474838256836, "get_duckie_state": 1.3875179603451587e-06, "in-drivable-lane": 3.949999999999986, "deviation-heading": 0.4139637087804739, "agent_compute-ego0": 0.013833780757716446, "complete-iteration": 0.19236733678911552, "set_robot_commands": 0.0023600586125108063, "deviation-center-line": 0.10329078573220549, "driven_lanedir_consec": 0.5085667744416196, "sim_compute_sim_state": 0.008406506210077004, "sim_compute_performance-ego0": 0.002054861334503674},
"LF-norm-small_loop-000-ego0": {"driven_any": 1.7369358790122689, "get_ui_image": 0.02603158597592954, "step_physics": 0.09242582321166992, "survival_time": 6.699999999999984, "driven_lanedir": 0.4421471798001333, "get_state_dump": 0.004622268676757813, "get_robot_state": 0.0037637798874466505, "sim_render-ego0": 0.0037778130284062137, "get_duckie_state": 1.2803960729528356e-06, "in-drivable-lane": 4.6499999999999835, "deviation-heading": 0.4463853825555087, "agent_compute-ego0": 0.013194373801902488, "complete-iteration": 0.15355106459723578, "set_robot_commands": 0.0023394054836697047, "deviation-center-line": 0.13212353419902065, "driven_lanedir_consec": 0.4421471798001333, "sim_compute_sim_state": 0.00531123125994647, "sim_compute_performance-ego0": 0.0020014480308250143}}
set_robot_commands_max: 0.002511840600233811
set_robot_commands_mean: 0.002360935533641882
set_robot_commands_median: 0.0023497320480902555
set_robot_commands_min: 0.002232437438153206
sim_compute_performance-ego0_max: 0.0020828069524562107
sim_compute_performance-ego0_mean: 0.002039993397697223
sim_compute_performance-ego0_median: 0.002037859303753834
sim_compute_performance-ego0_min: 0.0020014480308250143
sim_compute_sim_state_max: 0.009537463492535533
sim_compute_sim_state_mean: 0.007754379991238756
sim_compute_sim_state_median: 0.008084412606236512
sim_compute_sim_state_min: 0.00531123125994647
sim_render-ego0_max: 0.003962321159167168
sim_render-ego0_mean: 0.003873733588765735
sim_render-ego0_median: 0.00387740008374478
sim_render-ego0_min: 0.0037778130284062137
simulation-passed: 1
step_physics_max: 0.12811592284669268
step_physics_mean: 0.11367745545798684
step_physics_median: 0.11708403788679234
step_physics_min: 0.09242582321166992
survival_time_max: 6.699999999999984
survival_time_mean: 5.312499999999989
survival_time_min: 3.8499999999999943
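The four score medians at the top of this job and the min/mean/median/max rows above are aggregates over the four episodes listed in "per-episodes details". Below is a minimal sketch of that aggregation, using values copied from the JSON above; the aggregate helper is illustrative, not the evaluator's actual code.

import statistics

# Two of the per-episode metrics, copied from "per-episodes details" above.
per_episode = {
    "LF-norm-loop-000-ego0":       {"survival_time": 3.8499999999999943, "driven_lanedir_consec": 0.165181628036962},
    "LF-norm-zigzag-000-ego0":     {"survival_time": 4.6499999999999915, "driven_lanedir_consec": 0.32862266756345204},
    "LF-norm-techtrack-000-ego0":  {"survival_time": 6.0499999999999865, "driven_lanedir_consec": 0.5085667744416196},
    "LF-norm-small_loop-000-ego0": {"survival_time": 6.699999999999984,  "driven_lanedir_consec": 0.4421471798001333},
}

def aggregate(metric):
    # Collapse one metric across episodes into min/mean/median/max.
    values = sorted(ep[metric] for ep in per_episode.values())
    return {
        "min": values[0],
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "max": values[-1],
    }

# Reproduces survival_time_median (5.35) and survival_time_mean (5.3125) above.
print(aggregate("survival_time"))
# Reproduces driven_lanedir_consec_median (0.38538...) above.
print(aggregate("driven_lanedir_consec"))

With four episodes the median is the average of the two middle values, which is why survival_time_median (about 5.35) does not coincide with any single episode's survival time.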
Job 52362 | step: LFv-sim | status: error | up to date: no | duration: 0:03:49
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140578906990576
- M:video_aido:cmdline(in:/;out:/) 140578906991008
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
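Job 52362 (and jobs 52357 and 52354 below, which show the identical traceback) failed while the experiment manager was assembling the episode video: procgraph's static_image block asked Pillow to load 'banner1.png' and Pillow could not identify it as an image. The failure is reported as InvalidEvaluator, i.e. an evaluator-side problem rather than a fault of the submission. A quick way to reproduce just that check, assuming Pillow is available (check_image is a made-up helper name, not part of the evaluator):

from PIL import Image, UnidentifiedImageError

def check_image(path="banner1.png"):
    # The same open that procgraph_pil.imread performs, plus a cheap integrity check.
    try:
        with Image.open(path) as im:
            fmt = im.format
            im.verify()  # validates headers without decoding all pixel data
        print(path, "is a readable", fmt, "image")
    except FileNotFoundError:
        print(path, "does not exist")
    except UnidentifiedImageError as exc:
        # Matches the PIL.UnidentifiedImageError in the traceback above:
        # the file exists but does not contain recognizable image data.
        print(path, "is not a valid image:", exc)

check_image()

A PIL.UnidentifiedImageError on an existing file usually means the asset is truncated, zero bytes, or not actually a PNG despite its name.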
Job 52357 | step: LFv-sim | status: error | up to date: no | duration: 0:01:44
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140302539982016
- M:video_aido:cmdline(in:/;out:/) 140302505389936
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52354 | step: LFv-sim | status: error | up to date: no | duration: 0:02:03
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139968797515728
- M:video_aido:cmdline(in:/;out:/) 139963990945552
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52321 | step: LFv-sim | status: host-error | up to date: no | duration: 0:01:39
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 41728 | step: LFv-sim | status: success | up to date: no | duration: 0:06:59
Job 38185 | step: LFv-sim | status: error | up to date: no | duration: 0:00:37
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9316/LFv-sim-mont05-227ea22a5fff-1-job38185-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 38180 | step: LFv-sim | status: error | up to date: no | duration: 0:00:42
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9316/LFv-sim-mont02-80325a328f54-1-job38180-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 38176 | step: LFv-sim | status: error | up to date: no | duration: 0:00:41
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9316/LFv-sim-mont05-227ea22a5fff-1-job38176-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 36321 | step: LFv-sim | status: error | up to date: no | duration: 0:00:47
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-Sandy2-sandy-1-job36321-a-wd/challenge-results/challenge_results.yaml' does not exist.
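Jobs 38185, 38180, 38176, and 36321 above failed one step earlier than the video errors: a container exited with code 1 before any scoring ran, so the runner found no challenge_results.yaml in the job's working directory and raised NoResultsFound. The expectation it enforces can be sketched as follows; read_results is an illustrative stand-in, not the runner's read_challenge_results, and PyYAML is assumed to be installed.

from pathlib import Path

import yaml  # PyYAML, assumed available

def read_results(workdir):
    # The runner looks for the evaluator's verdict at a fixed relative path
    # inside the job's working directory.
    results = Path(workdir) / "challenge-results" / "challenge_results.yaml"
    if not results.exists():
        # This is the situation reported above: the container exited with
        # code 1 before any results were written.
        raise FileNotFoundError("File '%s' does not exist." % results)
    with results.open() as f:
        return yaml.safe_load(f)

In other words, the missing YAML file is a symptom, not the cause; the container logs for the failed evaluator or solution container hold the actual error.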
Job 35760 | step: LFv-sim | status: success | up to date: no | duration: 0:00:56
Job 35759 | step: LFv-sim | status: success | up to date: no | duration: 0:00:54
Job 35364 | step: LFv-sim | status: error | up to date: no | duration: 0:18:58
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9316/LFv-sim-reg05-b2dee9d94ee0-1-job35364/logs/challenges-runner/stderr.log
Job 35362 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:18
Error while running Docker Compose:

Could not run command
│    cmd: [docker-compose, -p, reg01-94a6fab21ac9-1-job35362-850628, pull]
│ stdout: ''
│  sderr: ''
│      e: Command '['docker-compose', '-p', 'reg01-94a6fab21ac9-1-job35362-850628', 'pull']' returned non-zero exit status 1.
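Job 35362 is a host-side failure: the runner could not pull the evaluation images because docker-compose pull exited with a non-zero status on that machine (note the empty stdout and stderr captured above). On a host with Docker Compose and the job's docker-compose.yaml available, the failing step could be reproduced roughly like this; only the project name is taken from the log, the rest is an illustrative sketch.

import subprocess

# Re-run the command the runner reported as failing for job 35362.
cmd = ["docker-compose", "-p", "reg01-94a6fab21ac9-1-job35362-850628", "pull"]
proc = subprocess.run(cmd, capture_output=True, text=True)
print("exit status:", proc.returncode)
print(proc.stdout or proc.stderr)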
Job 34995 | step: LFv-sim | status: success | up to date: no | duration: 0:18:12
Job 34666 | step: LFv-sim | status: success | up to date: no | duration: 0:18:17