Duckietown Challenges

Submission 6729

Submission: 6729
Competing: yes
Challenge: aido5-LF-sim-validation
User: Andrea Censi 🇨🇭
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58694
Next:
User label: template-random
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58694


Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration | message
58694 | LFv-sim | success | yes | 0:03:54
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 0.5040495128145347
survival_time_median: 4.6499999999999915
deviation-center-line_median: 0.10620228737371736
in-drivable-lane_median: 2.674999999999992


Other stats:
agent_compute-ego0_max: 0.011678496996561686
agent_compute-ego0_mean: 0.01107091282517773
agent_compute-ego0_median: 0.010954903060417616
agent_compute-ego0_min: 0.010695348183314003
complete-iteration_max: 0.20592417319615683
complete-iteration_mean: 0.18176633294127095
complete-iteration_median: 0.1867567584057988
complete-iteration_min: 0.14762764175732931
deviation-center-line_max: 0.16455763165774434
deviation-center-line_mean: 0.1092777488823554
deviation-center-line_min: 0.06014878912424249
deviation-heading_max: 1.819139548495243
deviation-heading_mean: 0.860079214882594
deviation-heading_median: 0.6517793617099584
deviation-heading_min: 0.31761858761521594
driven_any_max: 3.078535752374254
driven_any_mean: 1.8526223552312917
driven_any_median: 1.6219947547238958
driven_any_min: 1.0879641591031215
driven_lanedir_consec_max: 0.6643282900394009
driven_lanedir_consec_mean: 0.4884432809313369
driven_lanedir_consec_min: 0.28134580805687714
driven_lanedir_max: 0.6643282900394009
driven_lanedir_mean: 0.4924956463812869
driven_lanedir_median: 0.5121542437144349
driven_lanedir_min: 0.28134580805687714
get_duckie_state_max: 2.719178984436808e-06
get_duckie_state_mean: 2.5242453615636136e-06
get_duckie_state_median: 2.524127130923064e-06
get_duckie_state_min: 2.329548199971517e-06
get_robot_state_max: 0.004234246585680091
get_robot_state_mean: 0.003921774089165997
get_robot_state_median: 0.003935038330853405
get_robot_state_min: 0.0035827731092770896
get_state_dump_max: 0.005053483777576023
get_state_dump_mean: 0.004848143537460187
get_state_dump_median: 0.00483327656246242
get_state_dump_min: 0.004672537247339885
get_ui_image_max: 0.03459103729413903
get_ui_image_mean: 0.030096862274242863
get_ui_image_median: 0.030416191106271813
get_ui_image_min: 0.0249640295902888
in-drivable-lane_max: 6.549999999999982
in-drivable-lane_mean: 3.3499999999999894
in-drivable-lane_min: 1.499999999999995
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 3.078535752374254, "get_ui_image": 0.028269434277015396, "step_physics": 0.10209103928336614, "survival_time": 7.84999999999998, "driven_lanedir": 0.28134580805687714, "get_state_dump": 0.004928644699386403, "get_robot_state": 0.003876836994026281, "sim_render-ego0": 0.003871139091781423, "get_duckie_state": 2.719178984436808e-06, "in-drivable-lane": 6.549999999999982, "deviation-heading": 0.891006057417327, "agent_compute-ego0": 0.010842711110658284, "complete-iteration": 0.1682066525085063, "set_robot_commands": 0.0022720581368554996, "deviation-center-line": 0.13094886067496936, "driven_lanedir_consec": 0.28134580805687714, "sim_compute_sim_state": 0.009790221347084528, "sim_compute_performance-ego0": 0.002167083040068421}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.5806581102909054, "get_ui_image": 0.03459103729413903, "step_physics": 0.1328091465908548, "survival_time": 4.549999999999992, "driven_lanedir": 0.3891269085945981, "get_state_dump": 0.004737908425538436, "get_robot_state": 0.004234246585680091, "sim_render-ego0": 0.004027675027432649, "get_duckie_state": 2.485254536504331e-06, "in-drivable-lane": 2.449999999999993, "deviation-heading": 1.819139548495243, "agent_compute-ego0": 0.011067095010176949, "complete-iteration": 0.2053068643030913, "set_robot_commands": 0.0023221166237540865, "deviation-center-line": 0.16455763165774434, "driven_lanedir_consec": 0.37291744679479777, "sim_compute_sim_state": 0.009331993434740149, "sim_compute_performance-ego0": 0.0020931181700333304}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.0879641591031215, "get_ui_image": 0.032562947935528226, "step_physics": 0.13552984926435682, "survival_time": 3.5499999999999954, "driven_lanedir": 0.6643282900394009, "get_state_dump": 0.005053483777576023, "get_robot_state": 0.003993239667680528, "sim_render-ego0": 0.004119247198104858, "get_duckie_state": 2.562999725341797e-06, "in-drivable-lane": 1.499999999999995, 
"deviation-heading": 0.41255266600258983, "agent_compute-ego0": 0.011678496996561686, "complete-iteration": 0.20592417319615683, "set_robot_commands": 0.002364549371931288, "deviation-center-line": 0.08145571407246535, "driven_lanedir_consec": 0.6643282900394009, "sim_compute_sim_state": 0.008309347762001885, "sim_compute_performance-ego0": 0.002214441696802775}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.6633313991568863, "get_ui_image": 0.0249640295902888, "step_physics": 0.091043122112751, "survival_time": 4.749999999999991, "driven_lanedir": 0.6351815788342716, "get_state_dump": 0.004672537247339885, "get_robot_state": 0.0035827731092770896, "sim_render-ego0": 0.003679941097895304, "get_duckie_state": 2.329548199971517e-06, "in-drivable-lane": 2.89999999999999, "deviation-heading": 0.31761858761521594, "agent_compute-ego0": 0.010695348183314003, "complete-iteration": 0.14762764175732931, "set_robot_commands": 0.002208530902862549, "deviation-center-line": 0.06014878912424249, "driven_lanedir_consec": 0.6351815788342716, "sim_compute_sim_state": 0.004741432766119639, "sim_compute_performance-ego0": 0.0019519304235776265}}
set_robot_commands_max: 0.002364549371931288
set_robot_commands_mean: 0.0022918137588508556
set_robot_commands_median: 0.002297087380304793
set_robot_commands_min: 0.002208530902862549
sim_compute_performance-ego0_max: 0.002214441696802775
sim_compute_performance-ego0_mean: 0.0021066433326205384
sim_compute_performance-ego0_median: 0.002130100605050876
sim_compute_performance-ego0_min: 0.0019519304235776265
sim_compute_sim_state_max: 0.009790221347084528
sim_compute_sim_state_mean: 0.00804324882748655
sim_compute_sim_state_median: 0.008820670598371017
sim_compute_sim_state_min: 0.004741432766119639
sim_render-ego0_max: 0.004119247198104858
sim_render-ego0_mean: 0.003924500603803559
sim_render-ego0_median: 0.003949407059607036
sim_render-ego0_min: 0.003679941097895304
simulation-passed: 1
step_physics_max: 0.13552984926435682
step_physics_mean: 0.1153682893128322
step_physics_median: 0.11745009293711048
step_physics_min: 0.091043122112751
survival_time_max: 7.84999999999998
survival_time_mean: 5.17499999999999
survival_time_min: 3.5499999999999954
No reset possible
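Each `_min`/`_max`/`_mean`/`_median` row above is an aggregate over the four episodes listed in the per-episodes details. A minimal sketch of that aggregation (the `aggregate` helper is hypothetical, not the scorer's actual code; values are transcribed from the per-episodes JSON, `survival_time` only for brevity):

```python
import statistics

# Per-episode survival_time values, copied from the per-episodes details above.
per_episodes = {
    "LF-norm-loop-000-ego0": {"survival_time": 7.84999999999998},
    "LF-norm-zigzag-000-ego0": {"survival_time": 4.549999999999992},
    "LF-norm-techtrack-000-ego0": {"survival_time": 3.5499999999999954},
    "LF-norm-small_loop-000-ego0": {"survival_time": 4.749999999999991},
}

def aggregate(metric):
    """Reproduce the <metric>_min/_max/_mean/_median rows of the stats table."""
    values = [ep[metric] for ep in per_episodes.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_max": max(values),
        f"{metric}_mean": statistics.mean(values),
        f"{metric}_median": statistics.median(values),
    }

agg = aggregate("survival_time")
# agg["survival_time_median"] reproduces the survival_time_median row above
# (4.6499999999999915, the midpoint of the two middle episodes).
```

With four episodes the median is the mean of the two middle values, which is why the medians above do not coincide with any single episode.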
58690 | LFv-sim | success | yes | 0:04:04
58689 | LFv-sim | success | yes | 0:04:05
58687 | LFv-sim | success | yes | 0:04:13
58683 | LFv-sim | success | yes | 0:04:00
58682 | LFv-sim | success | yes | 0:04:00
52640 | LFv-sim | error | no | 0:02:39
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139765878935408
- M:video_aido:cmdline(in:/;out:/) 139765878935360
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
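The three error jobs (52640, 52631, 52619) died in the evaluator's video step because PIL could not parse 'banner1.png', which points to a truncated or zero-byte asset on the evaluator side rather than a problem with the submission. A hedged sketch of a signature pre-check that would surface such a file before the procgraph pipeline runs (this check is an illustration, not part of the evaluator):

```python
import os
import tempfile

# The 8-byte signature every valid PNG file starts with.
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(path):
    """Cheap pre-flight check: any file PIL can decode as PNG must at
    least begin with the PNG signature; a zero-byte or truncated file
    (the likely state of 'banner1.png' above) fails immediately."""
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False

# Demo: an empty file standing in for the broken banner1.png.
fd, path = tempfile.mkstemp(suffix=".png")
os.close(fd)
print(looks_like_png(path))  # False: fail fast, before the video pipeline
os.unlink(path)
```

Such a check turns a deep `BadMethodCall` inside procgraph into an immediate, readable failure.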
52631 | LFv-sim | error | no | 0:02:42
InvalidEvaluator: identical traceback to job 52640 (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
52619 | LFv-sim | error | no | 0:01:44
InvalidEvaluator: identical traceback to job 52640 (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
41873 | LFv-sim | success | no | 0:03:51
38462 | LFv-sim | success | no | 0:04:14
38461 | LFv-sim | success | no | 0:04:16
36507 | LFv-sim | success | no | 0:04:00
35930 | LFv-sim | success | no | 0:00:54
35928 | LFv-sim | success | no | 0:01:07
35584 | LFv-sim | aborted | no | 0:03:30
KeyboardInterrupt:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.43-py3.8.egg/duckietown_challenges_runner/runner.py", line 1040, in run_one
    heartbeat()
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.43-py3.8.egg/duckietown_challenges_runner/runner.py", line 521, in heartbeat
    raise KeyboardInterrupt(msg)
KeyboardInterrupt: The server told us to abort the job because: The challenge has been updated.
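These aborts are not submission failures: the runner's periodic heartbeat learned that the challenge definition had changed, and the runner cancels such jobs by raising KeyboardInterrupt from inside `heartbeat()`. A minimal sketch of that pattern (the function name comes from the traceback above; its signature and the `fetch_status` callable are assumptions, not the runner's real API):

```python
def heartbeat(fetch_status):
    """Poll the server once; abort the local job if told to.

    `fetch_status` stands in for the runner's server call and is assumed
    to return a dict like {"abort": True, "why": "..."}.
    """
    status = fetch_status()
    if status.get("abort"):
        msg = "The server told us to abort the job because: " + status["why"]
        # Re-raising as KeyboardInterrupt reuses the runner's normal
        # cancellation path, so the job is cleaned up as if Ctrl-C'd.
        raise KeyboardInterrupt(msg)

try:
    heartbeat(lambda: {"abort": True, "why": "The challenge has been updated."})
except KeyboardInterrupt as e:
    print(e)
```

The upside of this design is that a server-side abort and a manual Ctrl-C travel through exactly the same cleanup code in `run_one`.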
35583 | LFv-sim | aborted | no | 0:03:58
KeyboardInterrupt: same abort as job 35584 (the server told us to abort the job because the challenge has been updated).
35539 | LFv-sim | aborted | no | 0:08:33
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-117cf9595558-1-job35539/logs/challenges-runner/stderr.log
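The "result file is not found" aborts mean the evaluator container stopped without writing challenge-results/challenge_results.yaml into the job's working directory, so the runner has nothing to score. A sketch of that existence check (the helper name is hypothetical; the expected relative path follows the message above):

```python
import os
import tempfile
from pathlib import Path

def find_challenge_results(wd):
    """Return the results file the runner expects an evaluation to write.

    Raises FileNotFoundError with a diagnostic like the job messages
    above when the evaluator exited without producing it.
    """
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    if not results.exists():
        raise FileNotFoundError(
            f"File '{results}' does not exist. This usually means the "
            "evaluator did not finish; check its log."
        )
    return results

# Demo: an empty working dir reproduces the shape of the job 35539 message.
wd = tempfile.mkdtemp()
try:
    find_challenge_results(wd)
except FileNotFoundError as e:
    print(e)
```

Note that the listings above show only docker-compose files and runner logs in the working dirs, consistent with the evaluator never getting far enough to write results.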
35515 | LFv-sim | aborted | no | 0:07:30
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg02-e2c486fc80ef-1-job35515 (same failure mode and file list as job 35539).
35508 | LFv-sim | aborted | no | 0:07:05
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg04-c054faef3177-1-job35508 (same failure mode and file list as job 35539).
35500 | LFv-sim | aborted | no | 0:07:03
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg03-0c28c9d61367-1-job35500 (same failure mode and file list as job 35539).
35193 | LFv-sim | aborted | no | 0:08:07
34334 | LFv-sim | aborted | no | 0:07:56
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg12-d77ebe78e2f9-1-job34334 (same failure mode as job 35539).
34316 | LFv-sim | aborted | no | 0:07:28
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg10-e57b0b7c7a6d-1-job34316 (same failure mode as job 35539).
34315 | LFv-sim | aborted | no | 0:07:57
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg11-cc3acf431491-1-job34315 (same failure mode as job 35539).
34212 | LFv-sim | aborted | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg04-bf35e9d68df4-1-job34212'
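Jobs 34212 through 34166 all aborted instantly for the same infrastructure reason: the evaluation hosts' /tmp volumes were full, so the runner could not even create the job's working directory. A hedged sketch of a free-space guard a runner could apply before `os.makedirs(wd)` (the helper and threshold are assumptions, not the runner's actual behaviour):

```python
import shutil

def ensure_free_space(path, min_free_bytes):
    """Refuse to start a job on a nearly-full volume, instead of dying
    later with OSError: [Errno 28] No space left on device."""
    free = shutil.disk_usage(path).free
    if free < min_free_bytes:
        # errno 28 is ENOSPC, matching the tracebacks above.
        raise OSError(28, f"only {free} bytes free under {path}, "
                          f"need {min_free_bytes}")

# With a zero threshold this always passes; a real runner might demand
# a few GB before accepting a job.
ensure_free_space(".", min_free_bytes=0)
```

Checking up front also lets the scheduler route the job to another machine rather than burning an attempt.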
34205 | LFv-sim | aborted | no | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg01-53440c9394b5-1-job34205' (same traceback as job 34212).
34203 | LFv-sim | aborted | no | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg01-53440c9394b5-1-job34203' (same traceback as job 34212).
34181 | LFv-sim | aborted | no | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg07-c4e193407567-1-job34181' (same traceback as job 34212).
34175 | LFv-sim | aborted | no | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg05-5ca0d35e6d82-1-job34175' (same traceback as job 34212).
34173 | LFv-sim | aborted | no | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg03-c2bc3037870e-1-job34173' (same traceback as job 34212).
34166 | LFv-sim | aborted | no | 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6729/LFv-sim-reg11-951de1eeccca-1-job34166' (same traceback as job 34212).
33228 | LFv-sim | aborted | no | 0:04:55
33227 | LFv-sim | aborted | no | 0:04:57