
Submission 6827

Submission: 6827
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58584
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58584

Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Columns: job ID, step, status, up to date, date started, date completed, duration, message.

Job 58584 | step: LFv-sim | status: success | up to date: yes | duration: 0:04:58
driven_lanedir_consec_median: 0.5236188742699501
survival_time_median: 7.1749999999999865
deviation-center-line_median: 0.07387700261480823
in-drivable-lane_median: 4.999999999999986


other stats
agent_compute-ego0_max: 0.013102665180113255
agent_compute-ego0_mean: 0.012503697260516122
agent_compute-ego0_median: 0.0125813379836269
agent_compute-ego0_min: 0.01174944789469743
complete-iteration_max: 0.2562879585638279
complete-iteration_mean: 0.19736520186275255
complete-iteration_median: 0.19631428730657088
complete-iteration_min: 0.14054427427404068
deviation-center-line_max: 0.15601696162497183
deviation-center-line_mean: 0.08910857762704727
deviation-center-line_min: 0.0526633436536008
deviation-heading_max: 1.174580837791698
deviation-heading_mean: 0.6493536189559338
deviation-heading_median: 0.5287485565984813
deviation-heading_min: 0.3653365248350745
driven_any_max: 3.078147593526138
driven_any_mean: 1.8414105678074564
driven_any_median: 1.9830500001809548
driven_any_min: 0.3213946773417773
driven_lanedir_consec_max: 0.7357576341364731
driven_lanedir_consec_mean: 0.4880588692228427
driven_lanedir_consec_min: 0.16924009421499742
driven_lanedir_max: 0.7357576341364731
driven_lanedir_mean: 0.4880588692228427
driven_lanedir_median: 0.5236188742699501
driven_lanedir_min: 0.16924009421499742
get_duckie_state_max: 1.302286356437106e-06
get_duckie_state_mean: 1.23099243414851e-06
get_duckie_state_median: 1.215496473439565e-06
get_duckie_state_min: 1.190690433277803e-06
get_robot_state_max: 0.0035337363972383386
get_robot_state_mean: 0.003487579333338271
get_robot_state_median: 0.0034800494314794505
get_robot_state_min: 0.003456482073155845
get_state_dump_max: 0.004342257275300868
get_state_dump_mean: 0.004300948995437181
get_state_dump_median: 0.004292093038705559
get_state_dump_min: 0.004277352629036739
get_ui_image_max: 0.035185872054681544
get_ui_image_mean: 0.030497351578222864
get_ui_image_median: 0.030325279419610756
get_ui_image_min: 0.026152975418988395
in-drivable-lane_max: 7.55000000000001
in-drivable-lane_mean: 4.649999999999996
in-drivable-lane_min: 1.0500000000000007
per-episodes details: {
  "LF-norm-loop-000-ego0": {"driven_any": 2.285469447568221, "get_ui_image": 0.028473102345186123, "step_physics": 0.14042084357317755, "survival_time": 8.449999999999985, "driven_lanedir": 0.41287453895922455, "get_state_dump": 0.004342257275300868, "get_robot_state": 0.0035337363972383386, "sim_render-ego0": 0.0036829261218800265, "get_duckie_state": 1.190690433277803e-06, "in-drivable-lane": 6.399999999999985, "deviation-heading": 1.174580837791698, "agent_compute-ego0": 0.012518494269427131, "complete-iteration": 0.20554146626416375, "set_robot_commands": 0.002088755719801959, "deviation-center-line": 0.15601696162497183, "driven_lanedir_consec": 0.41287453895922455, "sim_compute_sim_state": 0.008521642404444077, "sim_compute_performance-ego0": 0.00188599895028507},
  "LF-norm-zigzag-000-ego0": {"driven_any": 0.3213946773417773, "get_ui_image": 0.035185872054681544, "step_physics": 0.18442345828544804, "survival_time": 2.000000000000001, "driven_lanedir": 0.16924009421499742, "get_state_dump": 0.004296384206632289, "get_robot_state": 0.003456482073155845, "sim_render-ego0": 0.003640058563976753, "get_duckie_state": 1.2095381573932927e-06, "in-drivable-lane": 1.0500000000000007, "deviation-heading": 0.5489768946378994, "agent_compute-ego0": 0.013102665180113255, "complete-iteration": 0.2562879585638279, "set_robot_commands": 0.0020604366209448837, "deviation-center-line": 0.0526633436536008, "driven_lanedir_consec": 0.16924009421499742, "sim_compute_sim_state": 0.008235320812318384, "sim_compute_performance-ego0": 0.0018164762636510335},
  "LF-norm-techtrack-000-ego0": {"driven_any": 3.078147593526138, "get_ui_image": 0.03217745649403539, "step_physics": 0.1181848201845667, "survival_time": 10.100000000000009, "driven_lanedir": 0.7357576341364731, "get_state_dump": 0.004277352629036739, "get_robot_state": 0.0034697830970651412, "sim_render-ego0": 0.003616732329570601, "get_duckie_state": 1.2214547894858374e-06, "in-drivable-lane": 7.55000000000001, "deviation-heading": 0.5085202185590632, "agent_compute-ego0": 0.012644181697826669, "complete-iteration": 0.187087108348978, "set_robot_commands": 0.0020353242094293604, "deviation-center-line": 0.08211992040333092, "driven_lanedir_consec": 0.7357576341364731, "sim_compute_sim_state": 0.008781590485220472, "sim_compute_performance-ego0": 0.0018287409702545316},
  "LF-norm-small_loop-000-ego0": {"driven_any": 1.680630552793689, "get_ui_image": 0.026152975418988395, "step_physics": 0.08252231012873289, "survival_time": 5.899999999999987, "driven_lanedir": 0.6343632095806757, "get_state_dump": 0.004287801870778829, "get_robot_state": 0.00349031576589376, "sim_render-ego0": 0.0035644799721341173, "get_duckie_state": 1.302286356437106e-06, "in-drivable-lane": 3.599999999999987, "deviation-heading": 0.3653365248350745, "agent_compute-ego0": 0.01174944789469743, "complete-iteration": 0.14054427427404068, "set_robot_commands": 0.0020435697892132927, "deviation-center-line": 0.06563408482628554, "driven_lanedir_consec": 0.6343632095806757, "sim_compute_sim_state": 0.004834573809840098, "sim_compute_performance-ego0": 0.0018247876848493305}
}
set_robot_commands_max: 0.002088755719801959
set_robot_commands_mean: 0.0020570215848473737
set_robot_commands_median: 0.0020520032050790884
set_robot_commands_min: 0.0020353242094293604
sim_compute_performance-ego0_max: 0.00188599895028507
sim_compute_performance-ego0_mean: 0.0018390009672599916
sim_compute_performance-ego0_median: 0.001826764327551931
sim_compute_performance-ego0_min: 0.0018164762636510335
sim_compute_sim_state_max: 0.008781590485220472
sim_compute_sim_state_mean: 0.007593281877955757
sim_compute_sim_state_median: 0.008378481608381231
sim_compute_sim_state_min: 0.004834573809840098
sim_render-ego0_max: 0.0036829261218800265
sim_render-ego0_mean: 0.0036260492468903746
sim_render-ego0_median: 0.003628395446773677
sim_render-ego0_min: 0.0035644799721341173
simulation-passed: 1
step_physics_max: 0.18442345828544804
step_physics_mean: 0.1313878580429813
step_physics_median: 0.12930283187887212
step_physics_min: 0.08252231012873289
survival_time_max: 10.100000000000009
survival_time_mean: 6.612499999999995
survival_time_min: 2.000000000000001
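
The aggregate rows above (min, mean, median, max) are plain descriptive statistics over the four episodes in the per-episodes details JSON. A minimal sketch of that relationship, assuming the JSON has been saved locally as stats.json (the file name and the choice of survival_time are illustrative, not part of the challenge tooling):

```python
import json
import statistics

# Load a local copy of the "per-episodes details" JSON shown above
# (hypothetical file name; not produced by the evaluator itself).
with open("stats.json") as f:
    episodes = json.load(f)

# Recompute the aggregates for one metric and compare with the reported rows.
values = [ep["survival_time"] for ep in episodes.values()]
print("survival_time_min:   ", min(values))                # 2.000000000000001
print("survival_time_mean:  ", statistics.mean(values))    # ~6.6125
print("survival_time_median:", statistics.median(values))  # ~7.175
print("survival_time_max:   ", max(values))                # 10.100000000000009
```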
Job 52538 | step: LFv-sim | status: error | up to date: no | duration: 0:05:36
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140566038954096
- M:video_aido:cmdline(in:/;out:/) 140566040040832
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
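
The root cause of this job's failure is that PIL could not decode banner1.png while the experiment manager was assembling the episode video. A quick local sanity check along these lines (an illustrative sketch, not evaluator code) confirms whether a banner file is a readable image:

```python
from PIL import Image, UnidentifiedImageError

def is_valid_image(path: str) -> bool:
    """Return True if PIL can identify and verify the file as an image."""
    try:
        with Image.open(path) as im:
            im.verify()  # checks file integrity without decoding all pixel data
        return True
    except (FileNotFoundError, UnidentifiedImageError, OSError) as exc:
        print(f"{path}: {exc}")
        return False

# "banner1.png" is the file the evaluator failed on in the traceback above.
print(is_valid_image("banner1.png"))
```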
Job 52536 | step: LFv-sim | status: error | up to date: no | duration: 0:02:56
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139917675868112
- M:video_aido:cmdline(in:/;out:/) 139917676025312
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52520 | step: LFv-sim | status: host-error | up to date: no | duration: 0:04:34
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 41822 | step: LFv-sim | status: success | up to date: no | duration: 0:08:30
Job 38393 | step: LFv-sim | status: success | up to date: no | duration: 0:09:16
Job 36466 | step: LFv-sim | status: success | up to date: no | duration: 0:08:33
Job 36465 | step: LFv-sim | status: success | up to date: no | duration: 0:08:34
Job 35880 | step: LFv-sim | status: success | up to date: no | duration: 0:01:06
Job 35464 | step: LFv-sim | status: error | up to date: no | duration: 0:18:15
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-c054faef3177-1-job35464/logs/challenges-runner/stderr.log
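
This failure mode means the scoring step never wrote challenge-results/challenge_results.yaml into the job's working directory. A small check of that kind, sketched here with the path copied from the message above (illustrative only; it assumes PyYAML is available and the directory only exists on the evaluator host):

```python
from pathlib import Path

import yaml  # PyYAML, assumed to be installed

# Working directory reported by the runner for job 35464.
wd = Path(
    "/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
    "submission6827/LFv-sim-reg04-c054faef3177-1-job35464"
)
results = wd / "challenge-results" / "challenge_results.yaml"

if results.exists():
    print(yaml.safe_load(results.read_text()))
else:
    # When the file is missing, the evaluator log is the place to look next.
    print(f"{results} not found; check logs/challenges-runner/stderr.log")
```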
Job 35150 | step: LFv-sim | status: success | up to date: no | duration: 0:16:19
Job 34424 | step: LFv-sim | status: success | up to date: no | duration: 0:20:33
Job 34423 | step: LFv-sim | status: success | up to date: no | duration: 0:19:22
Job 34255 | step: LFv-sim | status: aborted | up to date: no | duration: 0:21:33
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg12-dacebf82dd85-1-job34255:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg12-dacebf82dd85-1-job34255/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
Job 33905 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-bf35e9d68df4-1-job33905'
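
This and the following host-error jobs all failed the same way: errno 28 (ENOSPC) while creating the job's working directory, i.e. the evaluator host ran out of disk under /tmp. A pre-flight check of this sort would catch it before a job starts (an illustrative sketch; the 10 GiB threshold is an arbitrary example, not a value used by the runner):

```python
import shutil

def enough_disk(path: str = "/tmp", min_free_gib: float = 10.0) -> bool:
    """Return True if the filesystem holding `path` has at least min_free_gib free."""
    usage = shutil.disk_usage(path)  # the path must exist
    free_gib = usage.free / 2**30
    print(f"{path}: {free_gib:.1f} GiB free")
    return free_gib >= min_free_gib

# The evaluator keeps job working directories under /tmp/duckietown/...
if not enough_disk("/tmp"):
    print("Not enough free space to start an evaluation job.")
```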
Job 33899 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg04-bf35e9d68df4-1-job33899'
Job 33893 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg05-5ca0d35e6d82-1-job33893'
Job 33888 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg03-c2bc3037870e-1-job33888'
Job 33885 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg03-c2bc3037870e-1-job33885'
Job 33878 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg03-c2bc3037870e-1-job33878'
Job 33874 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg07-c4e193407567-1-job33874'
Job 33864 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg01-53440c9394b5-1-job33864'
Job 33857 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg11-951de1eeccca-1-job33857'
Job 33853 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6827/LFv-sim-reg11-951de1eeccca-1-job33853'
Job 33402 | step: LFv-sim | status: success | up to date: no | duration: 0:05:03
Job 33401 | step: LFv-sim | status: success | up to date: no | duration: 0:05:13