
Submission 9273

Submission: 9273
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58453
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58453

Episodes evaluated (detailed statistics are available per episode):

 - LF-norm-loop-000
 - LF-norm-small_loop-000
 - LF-norm-techtrack-000
 - LF-norm-zigzag-000

Evaluation jobs for this submission

Each job below is listed with its job ID, step, status, up-to-date flag, date started, date completed, duration, and message (where shown).
Job 58453: step LFv-sim, status success, up to date yes, duration 0:36:11
driven_lanedir_consec_median: 6.214594995663928
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.092971252385448
in-drivable-lane_median: 0.0


Other stats:
agent_compute-ego0_max: 0.035346434376420426
agent_compute-ego0_mean: 0.026195725433832404
agent_compute-ego0_median: 0.028277386832892347
agent_compute-ego0_min: 0.012881693693124485
complete-iteration_max: 0.22112639321574165
complete-iteration_mean: 0.1904555620698508
complete-iteration_median: 0.1830391065961217
complete-iteration_min: 0.17461764187141818
deviation-center-line_max: 2.675162132566143
deviation-center-line_mean: 2.2114867099658304
deviation-center-line_min: 1.9848422025262824
deviation-heading_max: 5.6494627486041225
deviation-heading_mean: 4.748203202983404
deviation-heading_median: 4.904037834252742
deviation-heading_min: 3.5352743948240075
driven_any_max: 6.253638965517213
driven_any_mean: 6.252302176355952
driven_any_median: 6.25360641246172
driven_any_min: 6.248356914983156
driven_lanedir_consec_max: 6.2381360766486535
driven_lanedir_consec_mean: 6.218706567149556
driven_lanedir_consec_min: 6.207500200621712
driven_lanedir_max: 6.2381360766486535
driven_lanedir_mean: 6.218706567149556
driven_lanedir_median: 6.214594995663928
driven_lanedir_min: 6.207500200621712
get_duckie_state_max: 1.445003195071002e-06
get_duckie_state_mean: 1.3568617719893254e-06
get_duckie_state_median: 1.3448515105108535e-06
get_duckie_state_min: 1.292740871864592e-06
get_robot_state_max: 0.003770084404925522
get_robot_state_mean: 0.00372596267458799
get_robot_state_median: 0.003739793830668301
get_robot_state_min: 0.003654178632089835
get_state_dump_max: 0.004697651589145073
get_state_dump_mean: 0.004634311455274799
get_state_dump_median: 0.004624747019028485
get_state_dump_min: 0.004590100193897155
get_ui_image_max: 0.03627281522472931
get_ui_image_mean: 0.030666171611099812
get_ui_image_median: 0.030428210960439003
get_ui_image_min: 0.02553544929879194
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 6.253638965517213, "get_ui_image": 0.028625472201395788, "step_physics": 0.10411162479632502, "survival_time": 59.99999999999873, "driven_lanedir": 6.2213534691169805, "get_state_dump": 0.004697651589145073, "get_robot_state": 0.003770084404925522, "sim_render-ego0": 0.0038540275964411377, "get_duckie_state": 1.445003195071002e-06, "in-drivable-lane": 0.0, "deviation-heading": 4.253559538548736, "agent_compute-ego0": 0.02238986097108712, "complete-iteration": 0.1820700283749316, "set_robot_commands": 0.002325542761225387, "deviation-center-line": 2.675162132566143, "driven_lanedir_consec": 6.2213534691169805, "sim_compute_sim_state": 0.010101820408156471, "sim_compute_performance-ego0": 0.0021016905448716645},
 "LF-norm-zigzag-000-ego0": {"driven_any": 6.253604583339699, "get_ui_image": 0.03627281522472931, "step_physics": 0.1208999514281998, "survival_time": 59.99999999999873, "driven_lanedir": 6.207836522210877, "get_state_dump": 0.004645628496371737, "get_robot_state": 0.003743003349717114, "sim_render-ego0": 0.003853782031260164, "get_duckie_state": 1.3084236926381337e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.6494627486041225, "agent_compute-ego0": 0.035346434376420426, "complete-iteration": 0.22112639321574165, "set_robot_commands": 0.002393230609750866, "deviation-center-line": 2.094709215042879, "driven_lanedir_consec": 6.207836522210877, "sim_compute_sim_state": 0.011758323513796486, "sim_compute_performance-ego0": 0.0021192435916516307},
 "LF-norm-techtrack-000-ego0": {"driven_any": 6.248356914983156, "get_ui_image": 0.03223094971948222, "step_physics": 0.10960765067584112, "survival_time": 59.99999999999873, "driven_lanedir": 6.207500200621712, "get_state_dump": 0.004603865541685233, "get_robot_state": 0.003736584311619488, "sim_render-ego0": 0.003805332040905853, "get_duckie_state": 1.3812793283835736e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.554516129956748, "agent_compute-ego0": 0.012881693693124485, "complete-iteration": 0.18400818481731177, "set_robot_commands": 0.0022737340665081956, "deviation-center-line": 1.9848422025262824, "driven_lanedir_consec": 6.207500200621712, "sim_compute_sim_state": 0.012688667351359827, "sim_compute_performance-ego0": 0.0020900967714689257},
 "LF-norm-small_loop-000-ego0": {"driven_any": 6.253608241583741, "get_ui_image": 0.02553544929879194, "step_physics": 0.09261237612175605, "survival_time": 59.99999999999873, "driven_lanedir": 6.2381360766486535, "get_state_dump": 0.004590100193897155, "get_robot_state": 0.003654178632089835, "sim_render-ego0": 0.003752748932469199, "get_duckie_state": 1.292740871864592e-06, "in-drivable-lane": 0.0, "deviation-heading": 3.5352743948240075, "agent_compute-ego0": 0.034164912694697576, "complete-iteration": 0.17461764187141818, "set_robot_commands": 0.002248441845451565, "deviation-center-line": 2.0912332897280166, "driven_lanedir_consec": 6.2381360766486535, "sim_compute_sim_state": 0.005991245288832996, "sim_compute_performance-ego0": 0.00197945149316081}}
set_robot_commands_max: 0.002393230609750866
set_robot_commands_mean: 0.0023102373207340033
set_robot_commands_median: 0.002299638413866791
set_robot_commands_min: 0.002248441845451565
sim_compute_performance-ego0_max: 0.0021192435916516307
sim_compute_performance-ego0_mean: 0.0020726206002882573
sim_compute_performance-ego0_median: 0.002095893658170295
sim_compute_performance-ego0_min: 0.00197945149316081
sim_compute_sim_state_max: 0.012688667351359827
sim_compute_sim_state_mean: 0.010135014140536448
sim_compute_sim_state_median: 0.010930071960976478
sim_compute_sim_state_min: 0.005991245288832996
sim_render-ego0_max: 0.0038540275964411377
sim_render-ego0_mean: 0.003816472650269089
sim_render-ego0_median: 0.003829557036083009
sim_render-ego0_min: 0.003752748932469199
simulation-passed: 1
step_physics_max: 0.1208999514281998
step_physics_mean: 0.1068079007555305
step_physics_median: 0.10685963773608306
step_physics_min: 0.09261237612175605
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
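
The summary rows above (the _min, _mean, _median, and _max values) are consistent with aggregating each metric over the four episodes in the per-episodes details. The following is a minimal consistency-check sketch, not the platform's scoring code; it assumes the per-episodes JSON above has been saved locally as per_episodes.json.

    import json
    from pathlib import Path
    from statistics import mean, median

    # Load the per-episodes details (assumed saved from the page above).
    per_episodes = json.loads(Path("per_episodes.json").read_text())

    def summarize(metric):
        """Aggregate one metric over the four episodes."""
        values = [episode[metric] for episode in per_episodes.values()]
        return {
            "min": min(values),
            "mean": mean(values),
            "median": median(values),  # with four episodes: average of the two middle values
            "max": max(values),
        }

    # Both medians match the headline values reported for job 58453:
    print(summarize("driven_lanedir_consec"))   # median ~ 6.2145949956639, as above
    print(summarize("deviation-center-line"))   # median ~ 2.0929712523854, as above
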
Job 58451: step LFv-sim, status success, up to date yes, duration 0:33:20
Job 58450: step LFv-sim, status success, up to date yes, duration 0:37:26
Job 58445: step LFv-sim, status success, up to date yes, duration 0:34:09
Job 58444: step LFv-sim, status success, up to date yes, duration 0:35:13
Job 58440: step LFv-sim, status success, up to date yes, duration 0:35:44
Job 58435: step LFv-sim, status success, up to date yes, duration 0:36:51
Job 58430: step LFv-sim, status success, up to date yes, duration 0:35:41

Job 52508: step LFv-sim, status error, up to date no, duration 0:09:05
Message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140531682077760
- M:video_aido:cmdline(in:/;out:/) 140531682088512
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
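
The root cause in this traceback is that Pillow could not decode banner1.png inside the video-rendering step (procgraph's static_image block). Below is a minimal sketch, assuming only that Pillow is installed, for checking whether such a banner file is decodable; image_is_readable is a hypothetical helper, not part of the evaluator.

    from PIL import Image, UnidentifiedImageError

    def image_is_readable(path):
        """Return True if Pillow can identify and fully decode the image at `path`."""
        try:
            with Image.open(path) as im:
                im.load()  # force a full decode rather than only reading the header
            return True
        except (UnidentifiedImageError, OSError):
            return False

    print(image_is_readable("banner1.png"))  # would be False in the failing evaluator environment
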
Job 52505: step LFv-sim, status error, up to date no, duration 0:08:54
Message: InvalidEvaluator, with the same traceback as job 52508 (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').

Job 52500: step LFv-sim, status error, up to date no, duration 0:09:25
Message: InvalidEvaluator, with the same traceback as job 52508 (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').

Job 52379: step LFv-sim, status timeout, up to date no, duration 0:14:42
Message: Timeout because evaluator contacted us.

Job 41763: step LFv-sim, status success, up to date no, duration 0:10:00
Job 38263: step LFv-sim, status success, up to date no, duration 0:10:58
Job 38261: step LFv-sim, status success, up to date no, duration 0:09:11
Job 38259: step LFv-sim, status success, up to date no, duration 0:09:06

Job 36373: step LFv-sim, status error, up to date no, duration 0:00:47
Message: The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-Sandy1-sandy-1-job36373-a-wd/challenge-results/challenge_results.yaml' does not exist.
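
The runner failed here because the expected results file was never written by the evaluation. Below is a minimal sketch of the existence check implied by the traceback, using the working directory from the error message; results_file is a hypothetical helper, not part of duckietown-challenges.

    from pathlib import Path

    def results_file(working_dir):
        """Return the challenge results path if it exists, else None."""
        path = Path(working_dir) / "challenge-results" / "challenge_results.yaml"
        return path if path.is_file() else None

    wd = ("/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
          "submission9273/LFv-sim-Sandy1-sandy-1-job36373-a-wd")
    print(results_file(wd))  # None here, which is why NoResultsFound was raised
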
Job 35807: step LFv-sim, status success, up to date no, duration 0:01:07

Job 35399: step LFv-sim, status error, up to date no, duration 0:21:48
Message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9273/LFv-sim-reg03-0c28c9d61367-1-job35399/logs/challenges-runner/stderr.log

Job 35037: step LFv-sim, status success, up to date no, duration 0:23:43
Job 34582: step LFv-sim, status success, up to date no, duration 0:25:01