
Submission 9240

Submission: 9240
Competing: yes
Challenge: aido5-LF-sim-validation
User: Liam Paull 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58491
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58491

Detailed statistics are available for each episode:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Columns: Job ID, step, status, up to date, date started, date completed, duration, message
58491 | LFv-sim | success | up to date: yes | duration: 0:25:38
driven_lanedir_consec_median: 5.456528248256044
survival_time_median: 58.24999999999883
deviation-center-line_median: 2.492625973725853
in-drivable-lane_median: 9.124999999999613


other stats
agent_compute-ego0_max: 0.012853144334545184
agent_compute-ego0_mean: 0.012400620901542384
agent_compute-ego0_median: 0.012353809027340451
agent_compute-ego0_min: 0.01204172121694344
complete-iteration_max: 0.20811789153052157
complete-iteration_mean: 0.18254873922287249
complete-iteration_median: 0.18414812721138857
complete-iteration_min: 0.15378081093819115
deviation-center-line_max: 3.094139115882151
deviation-center-line_mean: 2.064642485302729
deviation-center-line_min: 0.17917887787705902
deviation-heading_max: 7.774115996813168
deviation-heading_mean: 4.729938328834216
deviation-heading_median: 5.011928312307123
deviation-heading_min: 1.1217806939094448
driven_any_max: 7.921247080837139
driven_any_mean: 5.966082322899854
driven_any_median: 7.638945836617108
driven_any_min: 0.6651905375280585
driven_lanedir_consec_max: 7.411035574154021
driven_lanedir_consec_mean: 4.636592934628801
driven_lanedir_consec_min: 0.22227966784909636
driven_lanedir_max: 7.411035574154021
driven_lanedir_mean: 4.636592934628801
driven_lanedir_median: 5.456528248256044
driven_lanedir_min: 0.22227966784909636
get_duckie_state_max: 2.3248292921385497e-06
get_duckie_state_mean: 2.2591156051018852e-06
get_duckie_state_median: 2.2633790613902005e-06
get_duckie_state_min: 2.18487500548859e-06
get_robot_state_max: 0.003769166273526867
get_robot_state_mean: 0.0036556945855578857
get_robot_state_median: 0.003708577076660207
get_robot_state_min: 0.0034364579153842614
get_state_dump_max: 0.004722409918706676
get_state_dump_mean: 0.004578221224709684
get_state_dump_median: 0.004600248567071386
get_state_dump_min: 0.00438997784598929
get_ui_image_max: 0.033846878614582
get_ui_image_mean: 0.03027573095168552
get_ui_image_median: 0.031039585938934383
get_ui_image_min: 0.02517687331429131
in-drivable-lane_max: 17.299999999999017
in-drivable-lane_mean: 9.699999999999514
in-drivable-lane_min: 3.2499999999998153
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 7.921247080837139, "get_ui_image": 0.028933810155457203, "step_physics": 0.09851736589633456, "survival_time": 59.99999999999873, "driven_lanedir": 7.411035574154021, "get_state_dump": 0.004651266371181466, "get_robot_state": 0.003767865980594581, "sim_render-ego0": 0.003826956268551943, "get_duckie_state": 2.3248292921385497e-06, "in-drivable-lane": 3.2499999999998153, "deviation-heading": 5.436749801999805, "agent_compute-ego0": 0.0125090662982442, "complete-iteration": 0.1661389948029403, "set_robot_commands": 0.002252986885725112, "deviation-center-line": 3.094139115882151, "driven_lanedir_consec": 7.411035574154021, "sim_compute_sim_state": 0.009528367743702554, "sim_compute_performance-ego0": 0.002063904475609925}, "LF-norm-zigzag-000-ego0": {"driven_any": 0.6651905375280585, "get_ui_image": 0.033846878614582, "step_physics": 0.13680343158909533, "survival_time": 6.0499999999999865, "driven_lanedir": 0.22227966784909636, "get_state_dump": 0.00438997784598929, "get_robot_state": 0.0034364579153842614, "sim_render-ego0": 0.003580982567834072, "get_duckie_state": 2.202440480716893e-06, "in-drivable-lane": 3.7999999999999865, "deviation-heading": 1.1217806939094448, "agent_compute-ego0": 0.012198551756436709, "complete-iteration": 0.20811789153052157, "set_robot_commands": 0.002103680469950692, "deviation-center-line": 0.17917887787705902, "driven_lanedir_consec": 0.22227966784909636, "sim_compute_sim_state": 0.009784890002891664, "sim_compute_performance-ego0": 0.0018910873131673844}, "LF-norm-techtrack-000-ego0": {"driven_any": 7.356782773792383, "get_ui_image": 0.033145361722411566, "step_physics": 0.1259030566396806, "survival_time": 56.49999999999893, "driven_lanedir": 5.351764885392933, "get_state_dump": 0.004722409918706676, "get_robot_state": 0.003769166273526867, "sim_render-ego0": 0.003883380999510316, "get_duckie_state": 2.3243176420635086e-06, "in-drivable-lane": 14.44999999999924, "deviation-heading": 7.774115996813168, "agent_compute-ego0": 0.012853144334545184, "complete-iteration": 0.20215725961983677, "set_robot_commands": 0.0022588659450082415, "deviation-center-line": 2.5969888330159105, "driven_lanedir_consec": 5.351764885392933, "sim_compute_sim_state": 0.013468076442850047, "sim_compute_performance-ego0": 0.0020641188617304927}, "LF-norm-small_loop-000-ego0": {"driven_any": 7.921108899441833, "get_ui_image": 0.02517687331429131, "step_physics": 0.09429430385910402, "survival_time": 59.99999999999873, "driven_lanedir": 5.561291611119154, "get_state_dump": 0.004549230762961306, "get_robot_state": 0.0036492881727258337, "sim_render-ego0": 0.0037941837390197703, "get_duckie_state": 2.18487500548859e-06, "in-drivable-lane": 17.299999999999017, "deviation-heading": 4.587106822614443, "agent_compute-ego0": 0.01204172121694344, "complete-iteration": 0.15378081093819115, "set_robot_commands": 0.002234797989895302, "deviation-center-line": 2.3882631144357953, "driven_lanedir_consec": 5.561291611119154, "sim_compute_sim_state": 0.005991484699995691, "sim_compute_performance-ego0": 0.0019665449286976225}}
set_robot_commands_max: 0.0022588659450082415
set_robot_commands_mean: 0.002212582822644837
set_robot_commands_median: 0.0022438924378102073
set_robot_commands_min: 0.002103680469950692
sim_compute_performance-ego0_max: 0.0020641188617304927
sim_compute_performance-ego0_mean: 0.001996413894801356
sim_compute_performance-ego0_median: 0.002015224702153774
sim_compute_performance-ego0_min: 0.0018910873131673844
sim_compute_sim_state_max: 0.013468076442850047
sim_compute_sim_state_mean: 0.00969320472235999
sim_compute_sim_state_median: 0.009656628873297109
sim_compute_sim_state_min: 0.005991484699995691
sim_render-ego0_max: 0.003883380999510316
sim_render-ego0_mean: 0.0037713758937290257
sim_render-ego0_median: 0.0038105700037858567
sim_render-ego0_min: 0.003580982567834072
simulation-passed: 1
step_physics_max: 0.13680343158909533
step_physics_mean: 0.11387953949605364
step_physics_median: 0.11221021126800758
step_physics_min: 0.09429430385910402
survival_time_max: 59.99999999999873
survival_time_mean: 45.63749999999909
survival_time_min: 6.0499999999999865
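
The aggregate values above (min, mean, median, max) are plain reductions of the four per-episode values in the per-episodes details block. A minimal sketch of how they can be reproduced, assuming the details JSON has been saved to a file named episodes.json (the file name is illustrative, not part of the job output):

import json
import statistics

# Load the per-episode details shown above (file name is illustrative).
with open("episodes.json") as f:
    episodes = json.load(f)  # {"LF-norm-loop-000-ego0": {...}, ...}

def aggregate(metric):
    # Collect one metric across the four episodes and reduce it.
    values = [ep[metric] for ep in episodes.values()]
    return {
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
        # With four episodes, the median is the average of the two middle
        # values, e.g. survival_time: (56.5 + 60.0) / 2 = 58.25, which
        # matches the survival_time_median reported above.
        "median": statistics.median(values),
    }

for metric in ("survival_time", "driven_lanedir_consec",
               "deviation-center-line", "in-drivable-lane"):
    print(metric, aggregate(metric))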
58490 | LFv-sim | success | up to date: yes | duration: 0:22:20
58489 | LFv-sim | success | up to date: yes | duration: 0:22:02
58488 | LFv-sim | success | up to date: yes | duration: 0:24:38
58487 | LFv-sim | success | up to date: yes | duration: 0:31:47
58486 | LFv-sim | success | up to date: yes | duration: 0:19:39
58484 | LFv-sim | success | up to date: yes | duration: 0:18:45
58483 | LFv-sim | success | up to date: yes | duration: 0:27:49
58480 | LFv-sim | success | up to date: yes | duration: 0:25:17
52452 | LFv-sim | error | up to date: no | duration: 0:15:28
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140617698764160
- M:video_aido:cmdline(in:/;out:/) 140617698763008
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
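
The root cause in this chain is PIL failing to read 'banner1.png' inside procgraph's static_image block while the experiment manager renders the episode video. A standalone sketch of the same check, independent of the evaluator (the path is just the file name taken from the traceback):

from PIL import Image, UnidentifiedImageError

path = "banner1.png"  # the file the static_image block tries to load

try:
    with Image.open(path) as im:
        im.verify()  # cheap integrity check, does not decode the full image
    print(path, "is a readable image")
except FileNotFoundError:
    print(path, "does not exist")
except UnidentifiedImageError:
    # The error reported in the job log: the file exists but is empty,
    # truncated, or not in a format PIL recognizes.
    print(path, "exists but cannot be identified as an image")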
52442 | LFv-sim | error | up to date: no | duration: 0:08:30
InvalidEvaluator: PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' (same traceback as job 52452).
52433 | LFv-sim | error | up to date: no | duration: 0:09:12
InvalidEvaluator: PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' (same traceback as job 52452).
52430 | LFv-sim | error | up to date: no | duration: 0:08:37
InvalidEvaluator: PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' (same traceback as job 52452).
41777 | LFv-sim | success | up to date: no | duration: 0:09:50
38316 | LFv-sim | success | up to date: no | duration: 0:10:03
36411 | LFv-sim | error | up to date: no | duration: 0:00:43
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-Sandy1-sandy-1-job36411-a-wd/challenge-results/challenge_results.yaml' does not exist.
35840 | LFv-sim | error | up to date: no | duration: 0:00:46
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1063, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-noname-sandy-1-job35840-a-wd/challenge-results/challenge_results.yaml' does not exist.
35839 | LFv-sim | success | up to date: no | duration: 0:01:01
35421 | LFv-sim | error | up to date: no | duration: 0:24:21
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9240/LFv-sim-reg04-c054faef3177-1-job35421/logs/challenges-runner/stderr.log
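
The "no results" failures above (jobs 36411, 35840 and this one) all reduce to the same condition: challenge-results/challenge_results.yaml was never written in the job's working directory, so the runner's read_challenge_results call raises NoResultsFound. A minimal sketch of that check, using the working directory from the log above (yaml here is PyYAML, an assumption about how the results file would be parsed):

import os
import yaml  # PyYAML; assumption about how the results file would be parsed

# Working directory reported by the runner for job 35421 (from the log above).
wd = ("/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
      "submission9240/LFv-sim-reg04-c054faef3177-1-job35421")
results_file = os.path.join(wd, "challenge-results", "challenge_results.yaml")

if not os.path.exists(results_file):
    # This is the condition the runner reports as NoResultsFound: the
    # evaluator stopped before writing its results, so inspect its logs.
    print("Missing", results_file)
    print("Check", os.path.join(wd, "logs", "challenges-runner", "stderr.log"))
else:
    with open(results_file) as f:
        print(yaml.safe_load(f))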
35066 | LFv-sim | success | up to date: no | duration: 0:23:05
34520 | LFv-sim | success | up to date: no | duration: 0:21:53
34519 | LFv-sim | success | up to date: no | duration: 0:21:47