
Submission 6823

Submission: 6823
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58603
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58603


Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

(columns: Job ID, step, status, up to date, date started, date completed, duration, message)

Job 58603 (LFv-sim): success, up to date: yes, duration 0:38:13
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 5.4489118375587235
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.4919943728009795
in-drivable-lane_median: 5.674999999999999


Other stats:
agent_compute-ego0_max: 0.03623349402568223
agent_compute-ego0_mean: 0.03301122072435835
agent_compute-ego0_median: 0.03468424483798724
agent_compute-ego0_min: 0.026442899195776692
complete-iteration_max: 0.24109112868995888
complete-iteration_mean: 0.22076288344560316
complete-iteration_median: 0.2187792913403539
complete-iteration_min: 0.204401822411746
deviation-center-line_max: 3.983788818890544
deviation-center-line_mean: 3.0009667599893435
deviation-center-line_min: 1.0360894754648724
deviation-heading_max: 10.816300728670148
deviation-heading_mean: 7.523406598849888
deviation-heading_median: 8.255483134125473
deviation-heading_min: 2.766359398478454
driven_any_max: 7.921189557671148
driven_any_mean: 7.918882892492416
driven_any_median: 7.919920767603441
driven_any_min: 7.914500477091632
driven_lanedir_consec_max: 7.277166190205988
driven_lanedir_consec_mean: 4.9105849800488315
driven_lanedir_consec_min: 1.4673500548718903
driven_lanedir_max: 7.515544586938774
driven_lanedir_mean: 5.828561654684992
driven_lanedir_median: 6.903225144897043
driven_lanedir_min: 1.992251742007105
get_duckie_state_max: 1.4414298941352583e-06
get_duckie_state_mean: 1.3922473770891202e-06
get_duckie_state_median: 1.4217767389886685e-06
get_duckie_state_min: 1.2840061362438853e-06
get_robot_state_max: 0.004104897541170017
get_robot_state_mean: 0.003946112355622126
get_robot_state_median: 0.003951581094187563
get_robot_state_min: 0.00377638969294336
get_state_dump_max: 0.005194311435772517
get_state_dump_mean: 0.0049649526634184555
get_state_dump_median: 0.004972009436474752
get_state_dump_min: 0.004721480344951798
get_ui_image_max: 0.0373663201518698
get_ui_image_mean: 0.0318835092821685
get_ui_image_median: 0.03178971325924355
get_ui_image_min: 0.026588290458317104
in-drivable-lane_max: 44.04999999999863
in-drivable-lane_mean: 14.274999999999665
in-drivable-lane_min: 1.7000000000000242
per-episodes details (JSON, one episode per line):
{"LF-norm-loop-000-ego0": {"driven_any": 7.921189557671148, "get_ui_image": 0.029601229517584936, "step_physics": 0.11089677437457514, "survival_time": 59.99999999999873, "driven_lanedir": 7.515544586938774, "get_state_dump": 0.005115849687892333, "get_robot_state": 0.004104897541170017, "sim_render-ego0": 0.0041446640132170335, "get_duckie_state": 1.4360699427316428e-06, "in-drivable-lane": 1.7000000000000242, "deviation-heading": 7.0230464244129625, "agent_compute-ego0": 0.03482159587564715, "complete-iteration": 0.204401822411746, "set_robot_commands": 0.002523766469201081, "deviation-center-line": 3.4390018719564224, "driven_lanedir_consec": 4.36853957552935, "sim_compute_sim_state": 0.010801152325391174, "sim_compute_performance-ego0": 0.0022834903691630877},
"LF-norm-zigzag-000-ego0": {"driven_any": 7.921009879214831, "get_ui_image": 0.0373663201518698, "step_physics": 0.13720494265560307, "survival_time": 59.99999999999873, "driven_lanedir": 7.277166190205988, "get_state_dump": 0.004828169185057171, "get_robot_state": 0.0038239529091154503, "sim_render-ego0": 0.003910837721368058, "get_duckie_state": 1.4074835352456935e-06, "in-drivable-lane": 2.7500000000000204, "deviation-heading": 10.816300728670148, "agent_compute-ego0": 0.03623349402568223, "complete-iteration": 0.24109112868995888, "set_robot_commands": 0.00237005616504882, "deviation-center-line": 3.983788818890544, "driven_lanedir_consec": 7.277166190205988, "sim_compute_sim_state": 0.013172023203053345, "sim_compute_performance-ego0": 0.0020803374910632537},
"LF-norm-techtrack-000-ego0": {"driven_any": 7.914500477091632, "get_ui_image": 0.03397819700090216, "step_physics": 0.12153674065322304, "survival_time": 59.99999999999873, "driven_lanedir": 6.529284099588098, "get_state_dump": 0.005194311435772517, "get_robot_state": 0.004079209279259675, "sim_render-ego0": 0.004179973586413584, "get_duckie_state": 1.4414298941352583e-06, "in-drivable-lane": 8.599999999999977, "deviation-heading": 9.487919843837984, "agent_compute-ego0": 0.026442899195776692, "complete-iteration": 0.21442365844879024, "set_robot_commands": 0.0024761986871444613, "deviation-center-line": 3.544986873645537, "driven_lanedir_consec": 6.529284099588098, "sim_compute_sim_state": 0.014104516778162973, "sim_compute_performance-ego0": 0.002325776812436678},
"LF-norm-small_loop-000-ego0": {"driven_any": 7.91883165599205, "get_ui_image": 0.026588290458317104, "step_physics": 0.13916487161761815, "survival_time": 59.99999999999873, "driven_lanedir": 1.992251742007105, "get_state_dump": 0.004721480344951798, "get_robot_state": 0.00377638969294336, "sim_render-ego0": 0.0038040827751953735, "get_duckie_state": 1.2840061362438853e-06, "in-drivable-lane": 44.04999999999863, "deviation-heading": 2.766359398478454, "agent_compute-ego0": 0.03454689380032732, "complete-iteration": 0.22313492423191753, "set_robot_commands": 0.0023175919681266382, "deviation-center-line": 1.0360894754648724, "driven_lanedir_consec": 1.4673500548718903, "sim_compute_sim_state": 0.006096770622450347, "sim_compute_performance-ego0": 0.0020233399663539256}}
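The summary statistics above appear to be simple reductions (min, mean, median, max) of the per-episode values across the four episodes. As a minimal, hypothetical sketch (not the evaluator's actual code), the reported in-drivable-lane_median can be reproduced from the per-episode details:

```python
from statistics import median

# Per-episode values copied from the "per-episodes details" blob above
# (only the field needed for this check).
per_episode = {
    "LF-norm-loop-000-ego0": {"in-drivable-lane": 1.7000000000000242},
    "LF-norm-zigzag-000-ego0": {"in-drivable-lane": 2.7500000000000204},
    "LF-norm-techtrack-000-ego0": {"in-drivable-lane": 8.599999999999977},
    "LF-norm-small_loop-000-ego0": {"in-drivable-lane": 44.04999999999863},
}

def aggregate(stat):
    """Collect one statistic across episodes and reduce it."""
    values = [ep[stat] for ep in per_episode.values()]
    return {"min": min(values), "max": max(values),
            "mean": sum(values) / len(values), "median": median(values)}

agg = aggregate("in-drivable-lane")
# agg["median"] matches the reported in-drivable-lane_median (5.675...)
```

With four episodes the median is the mean of the two middle values, which is why the reported medians are generally not equal to any single episode's value.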
set_robot_commands_max: 0.002523766469201081
set_robot_commands_mean: 0.00242190332238025
set_robot_commands_median: 0.0024231274260966404
set_robot_commands_min: 0.0023175919681266382
sim_compute_performance-ego0_max: 0.002325776812436678
sim_compute_performance-ego0_mean: 0.002178236159754236
sim_compute_performance-ego0_median: 0.0021819139301131707
sim_compute_performance-ego0_min: 0.0020233399663539256
sim_compute_sim_state_max: 0.014104516778162973
sim_compute_sim_state_mean: 0.01104361573226446
sim_compute_sim_state_median: 0.011986587764222258
sim_compute_sim_state_min: 0.006096770622450347
sim_render-ego0_max: 0.004179973586413584
sim_render-ego0_mean: 0.004009889524048512
sim_render-ego0_median: 0.004027750867292545
sim_render-ego0_min: 0.0038040827751953735
simulation-passed: 1
step_physics_max: 0.13916487161761815
step_physics_mean: 0.12720083232525486
step_physics_median: 0.1293708416544131
step_physics_min: 0.11089677437457514
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
No reset possible
Job 58600 (LFv-sim): success, up to date: yes, duration 0:20:48
Job 58599 (LFv-sim): success, up to date: yes, duration 0:30:01
Job 58595 (LFv-sim): success, up to date: yes, duration 0:37:31
Job 58594 (LFv-sim): success, up to date: yes, duration 0:30:25
Job 58593 (LFv-sim): success, up to date: yes, duration 0:28:22
Job 58592 (LFv-sim): success, up to date: yes, duration 0:36:13
Job 52539 (LFv-sim): error, up to date: no, duration 0:07:17
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139650190718720
- M:video_aido:cmdline(in:/;out:/) 139650190717088
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
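The root cause in this traceback is Pillow's UnidentifiedImageError: Image.open raises it when the file exists but contains no recognizable image data (for instance an empty or truncated banner1.png). A small standalone illustration of that behavior, assuming Pillow is installed (this is not the evaluator's code):

```python
import os
import tempfile

from PIL import Image, UnidentifiedImageError  # Pillow

# A file that exists but holds no valid image data, mimicking a
# corrupt or zero-byte banner1.png.
fd, path = tempfile.mkstemp(suffix=".png")
os.close(fd)

try:
    Image.open(path)
    opened = True
except UnidentifiedImageError:
    # This is the exception seen in the job log above.
    opened = False
finally:
    os.remove(path)
```

Note that a missing file would instead raise FileNotFoundError, so this failure points at a present-but-unreadable banner asset on the evaluator, not a missing one.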
Job 52537 (LFv-sim): error, up to date: no, duration 0:07:26
InvalidEvaluator: same "cannot identify image file 'banner1.png'" traceback as job 52539 above, differing only in object addresses.
Job 52527 (LFv-sim): host-error, up to date: no, duration 0:09:37
The container "evaluator" exited with code 1.
Look at the logs for the container to know more about the error.
Job 41828 (LFv-sim): success, up to date: no, duration 0:08:36
Job 38406 (LFv-sim): success, up to date: no, duration 0:15:39
Job 36470 (LFv-sim): success, up to date: no, duration 0:10:35
Job 35886 (LFv-sim): success, up to date: no, duration 0:01:12
Job 35469 (LFv-sim): error, up to date: no, duration 0:21:41
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-c054faef3177-1-job35469/logs/challenges-runner/stderr.log
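The check that produced this message can be sketched roughly as follows (hypothetical helper; the real logic lives in duckietown-challenges-runner and may differ in detail):

```python
from pathlib import Path

def find_challenge_results(working_dir: str) -> Path:
    """Sketch of the runner's post-run check: the evaluator is
    expected to write challenge-results/challenge_results.yaml
    into the job's working directory."""
    results = Path(working_dir) / "challenge-results" / "challenge_results.yaml"
    if not results.exists():
        # List everything present to help diagnose why the file is missing.
        files = [p for p in sorted(Path(working_dir).rglob("*")) if p.is_file()]
        listing = "\n".join(f" - {p}" for p in files)
        raise FileNotFoundError(
            f"The result file is not found in working dir {working_dir}:\n"
            f"File '{results}' does not exist.\n"
            f"List of all files:\n{listing}"
        )
    return results
```

In this job only the docker-compose files and runner logs are present, which is consistent with the evaluator container stopping before it could write any results.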
Job 35155 (LFv-sim): success, up to date: no, duration 0:23:42
Job 34430 (LFv-sim): success, up to date: no, duration 0:27:25
Job 34262 (LFv-sim): aborted, up to date: no, duration 0:24:31
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg11-cc3acf431491-1-job34262:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg11-cc3acf431491-1-job34262/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
Job 34259 (LFv-sim): aborted, up to date: no, duration 0:23:45
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg03-4882a976d5dc-1-job34259:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg03-4882a976d5dc-1-job34259/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
Job 33944 (LFv-sim): host-error, up to date: no, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-bf35e9d68df4-1-job33944'
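Errno 28 means the filesystem backing /tmp on the evaluator host was full before the job's working directory could even be created; it says nothing about the submission itself. A preflight check of the sort a runner host could perform (illustrative only; not part of the actual runner code):

```python
import shutil

def has_free_space(path: str, required_bytes: int) -> bool:
    """Return True if the filesystem holding `path` has at least
    `required_bytes` of free space, e.g. checked before creating
    a job working directory under /tmp."""
    return shutil.disk_usage(path).free >= required_bytes

# Example: require roughly 1 GiB free before accepting a job.
ok = has_free_space("/tmp", 1 * 1024**3)
```

Such a check would let the host decline the job cleanly instead of failing inside os.makedirs as seen here.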
Jobs 33938, 33929, 33925, 33917, 33914, 33906, 33904, and 33898 (all LFv-sim, host-error, up to date: no, duration 0:00:00 except job 33904 at 0:00:01) failed with the same "No space left on device" traceback as job 33944 above, differing only in the working directory:

- 33938: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg04-bf35e9d68df4-1-job33938
- 33929: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg01-53440c9394b5-1-job33929
- 33925: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg01-53440c9394b5-1-job33925
- 33917: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg07-c4e193407567-1-job33917
- 33914: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg11-951de1eeccca-1-job33914
- 33906: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg03-c2bc3037870e-1-job33906
- 33904: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg05-5ca0d35e6d82-1-job33904
- 33898: /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6823/LFv-sim-reg03-c2bc3037870e-1-job33898
Job 33395 (LFv-sim): success, up to date: no, duration 0:13:26
Job 33394 (LFv-sim): success, up to date: no, duration 0:13:27