Duckietown Challenges

Submission 6818

Submission: 6818
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58609
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58609

Episodes (per-episode statistics are available on the original page):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58609 | LFv-sim | success | yes | - | - | 0:37:10
driven_lanedir_consec_median: 5.522897095810201
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.474110524118846
in-drivable-lane_median: 7.80000000000001


other stats
agent_compute-ego0_max: 0.03486884781760439
agent_compute-ego0_mean: 0.03384862985340979
agent_compute-ego0_median: 0.034065842132187205
agent_compute-ego0_min: 0.032393987331660365
complete-iteration_max: 0.23491191248611845
complete-iteration_mean: 0.2098729964299166
complete-iteration_median: 0.21554890183187544
complete-iteration_min: 0.17348226956979718
deviation-center-line_max: 4.383506449255173
deviation-center-line_mean: 3.411632385891361
deviation-center-line_min: 2.314802046072579
deviation-heading_max: 11.29980846184461
deviation-heading_mean: 9.293824658760215
deviation-heading_median: 9.174348110968628
deviation-heading_min: 7.5267939512589965
driven_any_max: 7.921161663669287
driven_any_mean: 7.9193316698367155
driven_any_median: 7.920928738470464
driven_any_min: 7.914307538736647
driven_lanedir_consec_max: 6.872980320485295
driven_lanedir_consec_mean: 5.568724774260459
driven_lanedir_consec_min: 4.356124584936136
driven_lanedir_max: 6.944293980494202
driven_lanedir_mean: 6.3855761417293015
driven_lanedir_median: 6.5644046810044285
driven_lanedir_min: 5.469201224414145
get_duckie_state_max: 1.6419317799742076e-06
get_duckie_state_mean: 1.4279803864465563e-06
get_duckie_state_median: 1.3952747570485695e-06
get_duckie_state_min: 1.2794402517148797e-06
get_robot_state_max: 0.003958053533282506
get_robot_state_mean: 0.003821301767967028
get_robot_state_median: 0.003885154025342244
get_robot_state_min: 0.0035568454879011144
get_state_dump_max: 0.005128738385850842
get_state_dump_mean: 0.004992673835786157
get_state_dump_median: 0.005011569153359291
get_state_dump_min: 0.004818818650575204
get_ui_image_max: 0.03517493419504285
get_ui_image_mean: 0.03066919710515044
get_ui_image_median: 0.031360835953616384
get_ui_image_min: 0.024780182318326138
in-drivable-lane_max: 16.54999999999917
in-drivable-lane_mean: 9.437499999999766
in-drivable-lane_min: 5.599999999999877
per-episodes details:

{
"LF-norm-loop-000-ego0": {"driven_any": 7.921161663669287, "get_ui_image": 0.02912488150457657, "step_physics": 0.11182416785666588, "survival_time": 59.99999999999873, "driven_lanedir": 6.944293980494202, "get_state_dump": 0.005128738385850842, "get_robot_state": 0.003915323405142728, "sim_render-ego0": 0.0040407516279387335, "get_duckie_state": 1.6419317799742076e-06, "in-drivable-lane": 5.749999999999989, "deviation-heading": 7.5267939512589965, "agent_compute-ego0": 0.03486884781760439, "complete-iteration": 0.20448694280740323, "set_robot_commands": 0.002407806104267765, "deviation-center-line": 3.6028006181494447, "driven_lanedir_consec": 4.857607268824881, "sim_compute_sim_state": 0.010867794387842793, "sim_compute_performance-ego0": 0.0022013209245286317},
"LF-norm-zigzag-000-ego0": {"driven_any": 7.920978811526191, "get_ui_image": 0.03517493419504285, "step_physics": 0.13517662706621283, "survival_time": 59.99999999999873, "driven_lanedir": 6.872980320485295, "get_state_dump": 0.004818818650575204, "get_robot_state": 0.00385498464554176, "sim_render-ego0": 0.003976145552953614, "get_duckie_state": 1.2977037898309026e-06, "in-drivable-lane": 5.599999999999877, "deviation-heading": 11.29980846184461, "agent_compute-ego0": 0.03457916447959474, "complete-iteration": 0.23491191248611845, "set_robot_commands": 0.002355589457693743, "deviation-center-line": 4.383506449255173, "driven_lanedir_consec": 6.872980320485295, "sim_compute_sim_state": 0.012673166371900572, "sim_compute_performance-ego0": 0.002204604986605299},
"LF-norm-techtrack-000-ego0": {"driven_any": 7.914307538736647, "get_ui_image": 0.033596790402656194, "step_physics": 0.12896709160244932, "survival_time": 59.99999999999873, "driven_lanedir": 6.255829041523562, "get_state_dump": 0.005107119915189592, "get_robot_state": 0.003958053533282506, "sim_render-ego0": 0.004025267919434, "get_duckie_state": 1.4928457242662364e-06, "in-drivable-lane": 9.85000000000003, "deviation-heading": 10.365620126945965, "agent_compute-ego0": 0.032393987331660365, "complete-iteration": 0.22661086085634763, "set_robot_commands": 0.002460334422090071, "deviation-center-line": 3.3454204300882475, "driven_lanedir_consec": 6.18818692279552, "sim_compute_sim_state": 0.01379502802268353, "sim_compute_performance-ego0": 0.00219472004511672},
"LF-norm-small_loop-000-ego0": {"driven_any": 7.920878665414738, "get_ui_image": 0.024780182318326138, "step_physics": 0.09319396876574158, "survival_time": 59.99999999999873, "driven_lanedir": 5.469201224414145, "get_state_dump": 0.004916018391528991, "get_robot_state": 0.0035568454879011144, "sim_render-ego0": 0.0036084614228844942, "get_duckie_state": 1.2794402517148797e-06, "in-drivable-lane": 16.54999999999917, "deviation-heading": 7.983076094991292, "agent_compute-ego0": 0.033552519784779675, "complete-iteration": 0.17348226956979718, "set_robot_commands": 0.002184554202471248, "deviation-center-line": 2.314802046072579, "driven_lanedir_consec": 4.356124584936136, "sim_compute_sim_state": 0.005760778892447212, "sim_compute_performance-ego0": 0.001839791606804612}
}
set_robot_commands_max: 0.002460334422090071
set_robot_commands_mean: 0.002352071046630707
set_robot_commands_median: 0.0023816977809807544
set_robot_commands_min: 0.002184554202471248
sim_compute_performance-ego0_max: 0.002204604986605299
sim_compute_performance-ego0_mean: 0.0021101093907638156
sim_compute_performance-ego0_median: 0.002198020484822676
sim_compute_performance-ego0_min: 0.001839791606804612
sim_compute_sim_state_max: 0.01379502802268353
sim_compute_sim_state_mean: 0.010774191918718527
sim_compute_sim_state_median: 0.011770480379871684
sim_compute_sim_state_min: 0.005760778892447212
sim_render-ego0_max: 0.0040407516279387335
sim_render-ego0_mean: 0.003912656630802711
sim_render-ego0_median: 0.004000706736193807
sim_render-ego0_min: 0.0036084614228844942
simulation-passed: 1
step_physics_max: 0.13517662706621283
step_physics_mean: 0.1172904638227674
step_physics_median: 0.1203956297295576
step_physics_min: 0.09319396876574158
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
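The max/mean/median/min statistics above are aggregates over the four episodes in the per-episode details. As a sanity check, here is a short Python sketch (values copied from this page) confirming that the reported driven_lanedir_consec_median is the median of the four per-episode values:

```python
import math
from statistics import median

# Per-episode driven_lanedir_consec values, copied from the per-episodes details.
episodes = {
    "LF-norm-loop-000-ego0": 4.857607268824881,
    "LF-norm-zigzag-000-ego0": 6.872980320485295,
    "LF-norm-techtrack-000-ego0": 6.18818692279552,
    "LF-norm-small_loop-000-ego0": 4.356124584936136,
}

# With an even number of episodes, the median is the mean of the two middle values.
m = median(episodes.values())
assert math.isclose(m, 5.522897095810201, rel_tol=1e-12)
```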
58607 | LFv-sim | success | yes | - | - | 0:35:06
58606 | LFv-sim | success | yes | - | - | 0:29:01
58605 | LFv-sim | success | yes | - | - | 0:27:57
52555 | LFv-sim | error | no | - | - | 0:09:28
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139678035488240
- M:video_aido:cmdline(in:/;out:/) 139678035456592
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
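The traceback above is an example of Python exception chaining: procgraph_pil's imread catches PIL's UnidentifiedImageError and re-raises it as a ValueError with `raise ... from e`, which produces the "direct cause of the following exception" lines in the log. A minimal stand-alone sketch of the same pattern (the function name and file name here are illustrative, not from the evaluator code):

```python
def load_image(filename: str) -> bytes:
    # Stand-in for an image loader that fails on an unreadable file.
    try:
        with open(filename, "rb") as f:
            return f.read()
    except OSError as e:
        # 'raise ... from e' records the original exception as the direct
        # cause, producing the chained traceback seen in the evaluator log.
        raise ValueError(f'Could not open filename "{filename}".') from e

err = None
try:
    load_image("no-such-banner1.png")  # a file that does not exist here
except ValueError as e:
    err = e

# The chained original exception is preserved on __cause__.
assert isinstance(err.__cause__, OSError)
```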
52553 | LFv-sim | error | no | - | - | 0:09:12
InvalidEvaluator: identical traceback to job 52555 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
52549 | LFv-sim | error | no | - | - | 0:11:02
InvalidEvaluator: identical traceback to job 52555 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
52546 | LFv-sim | timeout | no | - | - | -
41833 | LFv-sim | success | no | - | - | 0:09:00
41832 | LFv-sim | success | no | - | - | 0:09:05
38412 | LFv-sim | success | no | - | - | 0:08:46
38411 | LFv-sim | success | no | - | - | 0:08:50
36477 | LFv-sim | success | no | - | - | 0:09:49
35892 | LFv-sim | success | no | - | - | 0:01:09
35557 | LFv-sim | error | no | - | - | 0:23:06
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error. Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg02-fd8739355f97-1-job35557/logs/challenges-runner/stderr.log
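The message above boils down to one check: the evaluator never wrote challenge_results.yaml into its working directory. A small pathlib sketch of that check (the helper name is ours; the relative layout challenge-results/challenge_results.yaml is taken from the message above):

```python
from pathlib import Path

def has_results(workdir: str) -> bool:
    # The evaluator writes its outcome here; absence means it did not finish.
    return (Path(workdir) / "challenge-results" / "challenge_results.yaml").exists()

# On the evaluation host, workdir would be the job's working directory
# quoted in the error message. Locally, a missing dir simply reports False.
assert has_results("/nonexistent-workdir") is False
```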
35544 | LFv-sim | aborted | no | - | - | 0:24:13
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error. Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-5753c726a5d0-1-job35544/logs/challenges-runner/stderr.log
35517 | LFv-sim | timeout | no | - | - | 1:05:18
Job 35517 timed out: it ran for 3918 seconds, exceeding the 3600.0-second limit.
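The reported duration 1:05:18 is exactly the 3918 seconds cited in the timeout message. A short sketch converting the table's H:MM:SS durations to seconds (the helper is ours, not part of the platform):

```python
def duration_to_seconds(hms: str) -> int:
    # Parse an "H:MM:SS" duration string as shown in the jobs table.
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

elapsed = duration_to_seconds("1:05:18")
assert elapsed == 3918  # matches the timeout message for job 35517
assert elapsed > 3600   # over the 3600.0-second limit, hence the timeout status
```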
35475 | LFv-sim | aborted | no | - | - | 0:20:59
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error. Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-c054faef3177-1-job35475/logs/challenges-runner/stderr.log
35161 | LFv-sim | success | no | - | - | 0:23:09
34436 | LFv-sim | success | no | - | - | 0:27:06
34265 | LFv-sim | aborted | no | - | - | 0:24:34
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg01-d4ceb20fdede-1-job34265:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg01-d4ceb20fdede-1-job34265/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error. Check the evaluator log to see what happened.
33984 | LFv-sim | host-error | no | - | - | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-bf35e9d68df4-1-job33984'
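This and the eight host-error jobs below all failed identically: the runner could not even create its working directory because the evaluation host's disk was full. A defensive sketch of the kind of pre-flight free-space check a runner could do (the helper name and threshold are illustrative, not part of duckietown-challenges-runner):

```python
import os
import shutil
import tempfile

def ensure_workdir(path: str, min_free_bytes: int = 1 << 20) -> None:
    # Check free space on the parent filesystem up front, instead of
    # failing with ENOSPC inside os.makedirs mid-evaluation.
    parent = os.path.dirname(path) or "."
    free = shutil.disk_usage(parent).free
    if free < min_free_bytes:
        raise OSError(28, f"not enough free space ({free} bytes) for {path}")
    os.makedirs(path, exist_ok=True)

base = tempfile.mkdtemp()
workdir = os.path.join(base, "LFv-sim-job-workdir")
ensure_workdir(workdir)  # creates the directory when space is available
```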
33981 | LFv-sim | host-error | no | - | - | 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg04-bf35e9d68df4-1-job33981' (same traceback as job 33984)

33967 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg01-53440c9394b5-1-job33967' (same traceback as job 33984)

33961 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg11-951de1eeccca-1-job33961' (same traceback as job 33984)

33956 | LFv-sim | host-error | no | - | - | 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg07-c4e193407567-1-job33956' (same traceback as job 33984)

33948 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg07-c4e193407567-1-job33948' (same traceback as job 33984)

33946 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg03-c2bc3037870e-1-job33946' (same traceback as job 33984)

33940 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg03-c2bc3037870e-1-job33940' (same traceback as job 33984)

33934 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg05-5ca0d35e6d82-1-job33934' (same traceback as job 33984)

33928 | LFv-sim | host-error | no | - | - | 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6818/LFv-sim-reg05-5ca0d35e6d82-1-job33928' (same traceback as job 33984)
33387 | LFv-sim | success | no | - | - | 0:15:34
33386 | LFv-sim | success | no | - | - | 0:14:17