Duckietown Challenges

Submission 6820

Submission: 6820
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58602
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

58602

Click the images to see detailed statistics about each episode.

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
58602 | LFv-sim | success | yes | 0:05:01
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: -0.32027842728315703
survival_time_median: 4.324999999999992
deviation-center-line_median: 0.06690133195122017
in-drivable-lane_median: 2.874999999999992


Other stats
agent_compute-ego0_max: 0.014091593878609794
agent_compute-ego0_mean: 0.013400990811500512
agent_compute-ego0_median: 0.01345523488169481
agent_compute-ego0_min: 0.012601899604002634
complete-iteration_max: 0.3710172971089681
complete-iteration_mean: 0.31725029235225566
complete-iteration_median: 0.32361273636702004
complete-iteration_min: 0.2507583995660146
deviation-center-line_max: 0.107685034389672
deviation-center-line_mean: 0.07227400053305474
deviation-center-line_min: 0.04760830384010663
deviation-heading_max: 0.5665698275884498
deviation-heading_mean: 0.4905716715136074
deviation-heading_median: 0.47005316463113567
deviation-heading_min: 0.4556105292037088
driven_any_max: 2.239692485332469
driven_any_mean: 1.309399244293134
driven_any_median: 1.1477987295891183
driven_any_min: 0.7023070326618298
driven_lanedir_consec_max: -0.18658003306811463
driven_lanedir_consec_mean: -0.3009862030194837
driven_lanedir_consec_min: -0.3768079244435061
driven_lanedir_max: -0.18658003306811463
driven_lanedir_mean: -0.3009862030194837
driven_lanedir_median: -0.32027842728315703
driven_lanedir_min: -0.3768079244435061
get_duckie_state_max: 1.68100188050089e-06
get_duckie_state_mean: 1.5843019820562635e-06
get_duckie_state_median: 1.5867702544681611e-06
get_duckie_state_min: 1.4826655387878418e-06
get_robot_state_max: 0.004215376717703683
get_robot_state_mean: 0.004029298940872198
get_robot_state_median: 0.004043783165734529
get_robot_state_min: 0.00381425271431605
get_state_dump_max: 0.005377580249120319
get_state_dump_mean: 0.005155563436168635
get_state_dump_median: 0.005205467743209646
get_state_dump_min: 0.004833738009134929
get_ui_image_max: 0.037902353604634606
get_ui_image_mean: 0.03344155610161587
get_ui_image_median: 0.03412545572832355
get_ui_image_min: 0.027612959345181785
in-drivable-lane_max: 6.499999999999981
in-drivable-lane_mean: 3.4749999999999908
in-drivable-lane_min: 1.6499999999999964
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 0.9993011843400336, "get_ui_image": 0.03270665603347971, "step_physics": 0.2389670655697207, "survival_time": 3.899999999999994, "driven_lanedir": -0.3768079244435061, "get_state_dump": 0.005257479752166362, "get_robot_state": 0.003954238529446759, "sim_render-ego0": 0.004139854938169069, "get_duckie_state": 1.68100188050089e-06, "in-drivable-lane": 2.3999999999999932, "deviation-heading": 0.5665698275884498, "agent_compute-ego0": 0.013068090511273734, "complete-iteration": 0.31032238127310063, "set_robot_commands": 0.002333176286914681, "deviation-center-line": 0.107685034389672, "driven_lanedir_consec": -0.3768079244435061, "sim_compute_sim_state": 0.007614497896991199, "sim_compute_performance-ego0": 0.002189584925204893},
"LF-norm-zigzag-000-ego0": {"driven_any": 2.239692485332469, "get_ui_image": 0.037902353604634606, "step_physics": 0.2869125493367513, "survival_time": 7.449999999999981, "driven_lanedir": -0.18658003306811463, "get_state_dump": 0.00515345573425293, "get_robot_state": 0.004133327802022298, "sim_render-ego0": 0.004309209187825521, "get_duckie_state": 1.5083948771158855e-06, "in-drivable-lane": 6.499999999999981, "deviation-heading": 0.4694494205448173, "agent_compute-ego0": 0.013842379252115886, "complete-iteration": 0.3710172971089681, "set_robot_commands": 0.0023654492696126303, "deviation-center-line": 0.06828132357618376, "driven_lanedir_consec": -0.18658003306811463, "sim_compute_sim_state": 0.01378758430480957, "sim_compute_performance-ego0": 0.002513054211934408},
"LF-norm-techtrack-000-ego0": {"driven_any": 0.7023070326618298, "get_ui_image": 0.035544255423167394, "step_physics": 0.260136812452286, "survival_time": 3.099999999999997, "driven_lanedir": -0.3163302046675507, "get_state_dump": 0.005377580249120319, "get_robot_state": 0.004215376717703683, "sim_render-ego0": 0.004127744644407242, "get_duckie_state": 1.6651456318204363e-06, "in-drivable-lane": 1.6499999999999964, "deviation-heading": 0.470656908717454, "agent_compute-ego0": 0.014091593878609794, "complete-iteration": 0.33690309146093944, "set_robot_commands": 0.002437648319062733, "deviation-center-line": 0.06552134032625659, "driven_lanedir_consec": -0.3163302046675507, "sim_compute_sim_state": 0.008617529793391152, "sim_compute_performance-ego0": 0.0022567643059624564},
"LF-norm-small_loop-000-ego0": {"driven_any": 1.296296274838203, "get_ui_image": 0.027612959345181785, "step_physics": 0.1870161940654119, "survival_time": 4.749999999999991, "driven_lanedir": -0.32422664989876343, "get_state_dump": 0.004833738009134929, "get_robot_state": 0.00381425271431605, "sim_render-ego0": 0.003902855018774668, "get_duckie_state": 1.4826655387878418e-06, "in-drivable-lane": 3.3499999999999908, "deviation-heading": 0.4556105292037088, "agent_compute-ego0": 0.012601899604002634, "complete-iteration": 0.2507583995660146, "set_robot_commands": 0.0021766647696495056, "deviation-center-line": 0.04760830384010663, "driven_lanedir_consec": -0.32422664989876343, "sim_compute_sim_state": 0.00664822260538737, "sim_compute_performance-ego0": 0.0020650203029314675}}
set_robot_commands_max: 0.002437648319062733
set_robot_commands_mean: 0.0023282346613098873
set_robot_commands_median: 0.002349312778263655
set_robot_commands_min: 0.0021766647696495056
sim_compute_performance-ego0_max: 0.002513054211934408
sim_compute_performance-ego0_mean: 0.0022561059365083062
sim_compute_performance-ego0_median: 0.002223174615583675
sim_compute_performance-ego0_min: 0.0020650203029314675
sim_compute_sim_state_max: 0.01378758430480957
sim_compute_sim_state_mean: 0.009166958650144824
sim_compute_sim_state_median: 0.008116013845191176
sim_compute_sim_state_min: 0.00664822260538737
sim_render-ego0_max: 0.004309209187825521
sim_render-ego0_mean: 0.004119915947294125
sim_render-ego0_median: 0.004133799791288155
sim_render-ego0_min: 0.003902855018774668
simulation-passed: 1
step_physics_max: 0.2869125493367513
step_physics_mean: 0.24325815535604248
step_physics_median: 0.24955193901100337
step_physics_min: 0.1870161940654119
survival_time_max: 7.449999999999981
survival_time_mean: 4.799999999999991
survival_time_min: 3.099999999999997
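Each aggregate row above is the min/mean/median/max of one metric taken over the four episodes in the per-episodes details. A minimal sketch recomputing the survival_time aggregates from values copied out of that blob (the metric choice is illustrative; any per-episode key works the same way):

```python
import statistics

# Per-episode survival times, copied from the per-episodes details above.
survival_time = {
    "LF-norm-loop-000-ego0": 3.899999999999994,
    "LF-norm-zigzag-000-ego0": 7.449999999999981,
    "LF-norm-techtrack-000-ego0": 3.099999999999997,
    "LF-norm-small_loop-000-ego0": 4.749999999999991,
}

values = sorted(survival_time.values())
aggregates = {
    "min": values[0],
    "max": values[-1],
    "mean": statistics.mean(values),
    # With four episodes, the median is the average of the two middle values.
    "median": statistics.median(values),
}
print(aggregates)
```

Running this reproduces the survival_time_min/mean/median/max rows above (up to floating-point rounding in the last digit).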
58601 | LFv-sim | success | yes | 0:04:41
58598 | LFv-sim | success | yes | 0:04:49
58596 | LFv-sim | success | yes | 0:04:54
52534 | LFv-sim | error | no | 0:03:16
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140050693264048
- M:video_aido:cmdline(in:/;out:/) 140050693250928
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
41830 | LFv-sim | success | no | 0:03:49
41829 | LFv-sim | success | no | 0:03:45
38407 | LFv-sim | success | no | 0:07:14
36475 | LFv-sim | success | no | 0:04:29
35888 | LFv-sim | success | no | 0:01:15
35473 | LFv-sim | error | no | 0:07:20
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-0c28c9d61367-1-job35473/logs/challenges-runner/stderr.log
35159 | LFv-sim | success | no | 0:24:13
34434 | LFv-sim | success | no | 0:10:23
34264 | LFv-sim | aborted | no | 0:08:41
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg07-1f09cddcc73e-1-job34264:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg07-1f09cddcc73e-1-job34264/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.
33965 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg04-bf35e9d68df4-1-job33965'
33957 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg05-5ca0d35e6d82-1-job33957'
33949 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg05-5ca0d35e6d82-1-job33949'
33943 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg01-53440c9394b5-1-job33943'
33937 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg01-53440c9394b5-1-job33937'
33930 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg11-951de1eeccca-1-job33930'
33926 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg11-951de1eeccca-1-job33926'
33918 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg03-c2bc3037870e-1-job33918'
33913 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6820/LFv-sim-reg07-c4e193407567-1-job33913'
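The host-error jobs above all fail identically: the runner's os.makedirs call raises OSError errno 28 (ENOSPC) because the evaluator host's /tmp filesystem is full, before any evaluation starts. A hedged sketch of a pre-flight guard for this failure mode (ensure_workdir is a hypothetical helper, not part of duckietown-challenges-runner):

```python
import os
import shutil

def ensure_workdir(wd: str, min_free_bytes: int = 1 << 30) -> None:
    """Create the job working directory, failing early with a clear
    message if the filesystem has less than min_free_bytes available.

    Hypothetical helper: illustrates guarding against the ENOSPC
    crashes seen in the host-error jobs above.
    """
    parent = os.path.dirname(wd) or "."
    os.makedirs(parent, exist_ok=True)
    # shutil.disk_usage reports (total, used, free) for the filesystem
    # containing the given path.
    free = shutil.disk_usage(parent).free
    if free < min_free_bytes:
        raise OSError(f"refusing to create {wd!r}: only {free} bytes free")
    os.makedirs(wd, exist_ok=True)
```

Checking shutil.disk_usage up front would turn the mid-run crash into an immediate, self-explanatory failure that the runner could report without retrying on the same machine.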
33390 | LFv-sim | success | no | 0:06:12