
Submission 6852

Submission: 6852
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58517
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58517

Episodes evaluated in this job (click the images on the submission page to see detailed statistics for each episode):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
58517 | LFv-sim | success | yes | 0:11:54
driven_lanedir_consec_median: 1.5240190696856952
survival_time_median: 19.475000000000147
deviation-center-line_median: 0.8513289698196783
in-drivable-lane_median: 7.150000000000059


Other stats:
agent_compute-ego0_max: 0.013531981952606685
agent_compute-ego0_mean: 0.012720114649187916
agent_compute-ego0_median: 0.01268763880603122
agent_compute-ego0_min: 0.011973199032082542
complete-iteration_max: 0.21073509235771337
complete-iteration_mean: 0.18926134044556173
complete-iteration_median: 0.191713508524512
complete-iteration_min: 0.16288325237550946
deviation-center-line_max: 1.872857122116128
deviation-center-line_mean: 0.9509368884469848
deviation-center-line_min: 0.22823249203245383
deviation-heading_max: 8.592933225157614
deviation-heading_mean: 4.734570651683564
deviation-heading_median: 4.859731153875548
deviation-heading_min: 0.625887073825545
driven_any_max: 7.161183210697387
driven_any_mean: 3.897188872646371
driven_any_median: 3.678140197237401
driven_any_min: 1.071291885413297
driven_lanedir_consec_max: 2.5565599558454064
driven_lanedir_consec_mean: 1.5395192168775471
driven_lanedir_consec_min: 0.5534787722933927
driven_lanedir_max: 3.5382427218958483
driven_lanedir_mean: 1.7849399083901578
driven_lanedir_median: 1.5240190696856952
driven_lanedir_min: 0.5534787722933927
get_duckie_state_max: 1.404020521375868e-06
get_duckie_state_mean: 1.321567945579272e-06
get_duckie_state_median: 1.313490601258453e-06
get_duckie_state_min: 1.255270058424315e-06
get_robot_state_max: 0.0037592875092563567
get_robot_state_mean: 0.0036881118760337074
get_robot_state_median: 0.0036793564258435457
get_robot_state_min: 0.003634447143191383
get_state_dump_max: 0.005276980043268528
get_state_dump_mean: 0.00480875822400765
get_state_dump_median: 0.0046781514745214406
get_state_dump_min: 0.00460174990371919
get_ui_image_max: 0.03589555844157732
get_ui_image_mean: 0.031061219197455636
get_ui_image_median: 0.030911358783848996
get_ui_image_min: 0.02652660078054723
in-drivable-lane_max: 20.94999999999998
in-drivable-lane_mean: 9.575000000000022
in-drivable-lane_min: 3.049999999999989
per-episode details:
{
  "LF-norm-loop-000-ego0": {"driven_any": 7.161183210697387, "get_ui_image": 0.028805210892583283, "step_physics": 0.11696816531705184, "survival_time": 35.450000000000124, "driven_lanedir": 2.3467882866669756, "get_state_dump": 0.00460174990371919, "get_robot_state": 0.003685435107056524, "sim_render-ego0": 0.003789747265023245, "get_duckie_state": 1.2921615385673415e-06, "in-drivable-lane": 20.94999999999998, "deviation-heading": 8.592933225157614, "agent_compute-ego0": 0.01265436998555358, "complete-iteration": 0.1850913675738053, "set_robot_commands": 0.0021881872499492807, "deviation-center-line": 1.3378200799834203, "driven_lanedir_consec": 2.3467882866669756, "sim_compute_sim_state": 0.010292949139232366, "sim_compute_performance-ego0": 0.002013985539825869},
  "LF-norm-zigzag-000-ego0": {"driven_any": 1.2746908710611196, "get_ui_image": 0.03589555844157732, "step_physics": 0.13547427638047407, "survival_time": 7.299999999999982, "driven_lanedir": 0.7012498527044149, "get_state_dump": 0.005276980043268528, "get_robot_state": 0.0036732777446305672, "sim_render-ego0": 0.003727434443778732, "get_duckie_state": 1.334819663949564e-06, "in-drivable-lane": 3.049999999999989, "deviation-heading": 1.7829688026092236, "agent_compute-ego0": 0.012720907626508855, "complete-iteration": 0.21073509235771337, "set_robot_commands": 0.0022104094628574086, "deviation-center-line": 0.36483785965593635, "driven_lanedir_consec": 0.7012498527044149, "sim_compute_sim_state": 0.009733766114630667, "sim_compute_performance-ego0": 0.0019354804032513885},
  "LF-norm-techtrack-000-ego0": {"driven_any": 1.071291885413297, "get_ui_image": 0.03301750667511471, "step_physics": 0.12691700080084423, "survival_time": 6.249999999999986, "driven_lanedir": 0.5534787722933927, "get_state_dump": 0.0047326333939083035, "get_robot_state": 0.003634447143191383, "sim_render-ego0": 0.0038782623079088, "get_duckie_state": 1.404020521375868e-06, "in-drivable-lane": 3.149999999999989, "deviation-heading": 0.625887073825545, "agent_compute-ego0": 0.013531981952606685, "complete-iteration": 0.19833564947521876, "set_robot_commands": 0.0022142784936087473, "deviation-center-line": 0.22823249203245383, "driven_lanedir_consec": 0.5534787722933927, "sim_compute_sim_state": 0.00829764207204183, "sim_compute_performance-ego0": 0.0020144364190480065},
  "LF-norm-small_loop-000-ego0": {"driven_any": 6.081589523413682, "get_ui_image": 0.02652660078054723, "step_physics": 0.10153586029630354, "survival_time": 31.650000000000315, "driven_lanedir": 3.5382427218958483, "get_state_dump": 0.004623669555134578, "get_robot_state": 0.0037592875092563567, "sim_render-ego0": 0.0038193558292810097, "get_duckie_state": 1.255270058424315e-06, "in-drivable-lane": 11.15000000000013, "deviation-heading": 7.936493505141872, "agent_compute-ego0": 0.011973199032082542, "complete-iteration": 0.16288325237550946, "set_robot_commands": 0.0022271615849684466, "deviation-center-line": 1.872857122116128, "driven_lanedir_consec": 2.5565599558454064, "sim_compute_sim_state": 0.006254818913312365, "sim_compute_performance-ego0": 0.0020752347982268228}
}
set_robot_commands_max: 0.0022271615849684466
set_robot_commands_mean: 0.0022100091978459707
set_robot_commands_median: 0.002212343978233078
set_robot_commands_min: 0.0021881872499492807
sim_compute_performance-ego0_max: 0.0020752347982268228
sim_compute_performance-ego0_mean: 0.002009784290088022
sim_compute_performance-ego0_median: 0.0020142109794369376
sim_compute_performance-ego0_min: 0.0019354804032513885
sim_compute_sim_state_max: 0.010292949139232366
sim_compute_sim_state_mean: 0.008644794059804306
sim_compute_sim_state_median: 0.009015704093336248
sim_compute_sim_state_min: 0.006254818913312365
sim_render-ego0_max: 0.0038782623079088
sim_render-ego0_mean: 0.003803699961497947
sim_render-ego0_median: 0.003804551547152127
sim_render-ego0_min: 0.003727434443778732
simulation-passed: 1
step_physics_max: 0.13547427638047407
step_physics_mean: 0.12022382569866844
step_physics_median: 0.12194258305894803
step_physics_min: 0.10153586029630354
survival_time_max: 35.450000000000124
survival_time_mean: 20.1625000000001
survival_time_min: 6.249999999999986
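
The aggregate statistics above are per-episode reductions (min, mean, median, max over the four episodes) of the per-episode details block. A minimal sketch of how they can be recomputed with Python's statistics module, assuming the per-episode JSON has been saved locally as per_episode.json (a hypothetical filename):

import json
import statistics

# Load a local copy of the "per-episode details" JSON shown above
# (the filename per_episode.json is hypothetical).
with open("per_episode.json") as f:
    episodes = json.load(f)

def aggregate(metric: str) -> dict:
    # Reduce one metric over all episodes, as in the summary statistics above.
    values = [ep[metric] for ep in episodes.values()]
    return {
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
    }

# Reproduces survival_time_median = 19.475... and survival_time_mean = 20.1625...
print(aggregate("survival_time"))
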
58512 | LFv-sim | success | yes | 0:07:22
58511 | LFv-sim | success | yes | 0:08:39
58509 | LFv-sim | success | yes | 0:09:27
52459 | LFv-sim | error | no | 0:02:18
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140351005520368
- M:video_aido:cmdline(in:/;out:/) 140351005522096
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
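
This job (and jobs 52457 and 52449 below) failed in the video-rendering step (make_video2) because PIL could not read banner1.png. The failing call can be reproduced outside the evaluator with a short check; this is only a diagnostic sketch, not part of the evaluator code:

from PIL import Image, UnidentifiedImageError

def check_image(path: str) -> bool:
    # PIL raises UnidentifiedImageError when the file exists but cannot be
    # decoded as an image (for example a truncated or corrupted banner1.png).
    try:
        with Image.open(path) as im:
            im.verify()  # integrity check without decoding the full image
        return True
    except FileNotFoundError:
        print(f"{path}: file not found")
    except UnidentifiedImageError as exc:
        print(f"{path}: not a readable image ({exc})")
    return False

check_image("banner1.png")
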
52457 | LFv-sim | error | no | 0:04:19
InvalidEvaluator: same failure as job 52459 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' while rendering the episode video); the full traceback is identical apart from memory addresses.
52449 | LFv-sim | error | no | 0:02:38
InvalidEvaluator: same failure as job 52459 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' while rendering the episode video); the full traceback is identical apart from memory addresses.
41788 | LFv-sim | success | no | 0:09:20
41787 | LFv-sim | success | no | 0:08:03
38335 | LFv-sim | success | no | 0:08:26
36434 | LFv-sim | success | no | 0:09:13
35858 | LFv-sim | success | no | 0:00:56
35853 | LFv-sim | success | no | 0:01:05
35437 | LFv-sim | error | no | 0:22:14
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437/logs/challenges-runner/stderr.log
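
The check the runner performs here can be reproduced with a few lines of Python; the path is the job directory quoted in the message, and yaml refers to PyYAML. This is a diagnostic sketch, not the runner's own code:

import os
import yaml  # PyYAML

# Job working directory quoted in the error message above.
job_dir = (
    "/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
    "submission6852/LFv-sim-reg05-b2dee9d94ee0-1-job35437"
)
results_file = os.path.join(job_dir, "challenge-results", "challenge_results.yaml")

if not os.path.exists(results_file):
    # Same condition the runner reports: the evaluator exited before writing
    # its results, so the next place to look is the evaluator logs.
    for log in ("stdout.log", "stderr.log"):
        print("inspect:", os.path.join(job_dir, "logs", "challenges-runner", log))
else:
    with open(results_file) as f:
        print(yaml.safe_load(f))
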
35120 | LFv-sim | success | no | 0:23:42
33464 | LFv-sim | success | no | 0:23:45
33451 | LFv-sim | success | no | 0:15:10
33450 | LFv-sim | success | no | 0:14:00