
Submission 6841

Submission: 6841
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58548
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58548


Episodes:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58548 | LFv-sim | success | yes | | | 0:19:50 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 1.9842544719330188
survival_time_median: 34.47500000000001
deviation-center-line_median: 1.7164230388422106
in-drivable-lane_median: 13.499999999999886


Other stats

agent_compute-ego0_max: 0.01328945423358053
agent_compute-ego0_mean: 0.012857950480847697
agent_compute-ego0_median: 0.012892211540617015
agent_compute-ego0_min: 0.012357924608576218
complete-iteration_max: 0.21526585169286536
complete-iteration_mean: 0.1865151289681605
complete-iteration_median: 0.18252033535791184
complete-iteration_min: 0.16575399346395298
deviation-center-line_max: 4.018009763370728
deviation-center-line_mean: 1.9230170819112888
deviation-center-line_min: 0.24121248659000616
deviation-heading_max: 8.08495010803266
deviation-heading_mean: 4.751941412809582
deviation-heading_median: 4.89522464128032
deviation-heading_min: 1.132366260645024
driven_any_max: 8.33815898951966
driven_any_mean: 4.641435774004411
driven_any_median: 4.669362304155106
driven_any_min: 0.8888594981877713
driven_lanedir_consec_max: 5.713401648025677
driven_lanedir_consec_mean: 2.555446018911113
driven_lanedir_consec_min: 0.539873483752737
driven_lanedir_max: 5.713401648025677
driven_lanedir_mean: 2.9300258031460658
driven_lanedir_median: 2.7334140404029243
driven_lanedir_min: 0.539873483752737
get_duckie_state_max: 1.4564169924123072e-06
get_duckie_state_mean: 1.366642048469318e-06
get_duckie_state_median: 1.388227531983138e-06
get_duckie_state_min: 1.2336961374986892e-06
get_robot_state_max: 0.0039714932661249714
get_robot_state_mean: 0.003799766434061086
get_robot_state_median: 0.003793677200515017
get_robot_state_min: 0.003640218069089339
get_state_dump_max: 0.0050402710670485245
get_state_dump_mean: 0.004887909666011536
get_state_dump_median: 0.004875675933053884
get_state_dump_min: 0.0047600157308898515
get_ui_image_max: 0.03556641956303744
get_ui_image_mean: 0.03163844349091911
get_ui_image_median: 0.031144651723396637
get_ui_image_min: 0.02869805095384573
in-drivable-lane_max: 17.649999999999906
in-drivable-lane_mean: 11.899999999899917
in-drivable-lane_min: 2.9499999999999895
per-episodes:

{
  "LF-norm-loop-000-ego0": {"driven_any": 8.33815898951966, "get_ui_image": 0.029097740894352565, "step_physics": 0.0977253715362676, "survival_time": 59.99999999999873, "driven_lanedir": 5.713401648025677, "get_state_dump": 0.004796177620296177, "get_robot_state": 0.0037248553483313464, "sim_render-ego0": 0.003867128111738448, "get_duckie_state": 1.407682051964346e-06, "in-drivable-lane": 17.649999999999906, "deviation-heading": 7.175113672617631, "agent_compute-ego0": 0.012597222610079777, "complete-iteration": 0.16575399346395298, "set_robot_commands": 0.0022712113160475605, "deviation-center-line": 4.018009763370728, "driven_lanedir_consec": 5.713401648025677, "sim_compute_sim_state": 0.009560841704089874, "sim_compute_performance-ego0": 0.002028511922424977},
  "LF-norm-zigzag-000-ego0": {"driven_any": 0.8888594981877713, "get_ui_image": 0.03556641956303744, "step_physics": 0.14038371559757515, "survival_time": 7.399999999999982, "driven_lanedir": 0.539873483752737, "get_state_dump": 0.0047600157308898515, "get_robot_state": 0.003640218069089339, "sim_render-ego0": 0.003815185303656047, "get_duckie_state": 1.2336961374986892e-06, "in-drivable-lane": 2.9499999999999895, "deviation-heading": 1.132366260645024, "agent_compute-ego0": 0.012357924608576218, "complete-iteration": 0.21526585169286536, "set_robot_commands": 0.002250127344323485, "deviation-center-line": 0.24121248659000616, "driven_lanedir_consec": 0.539873483752737, "sim_compute_sim_state": 0.010425439616977767, "sim_compute_performance-ego0": 0.001982423283109729},
  "LF-norm-techtrack-000-ego0": {"driven_any": 5.696555139710691, "get_ui_image": 0.033191562552440705, "step_physics": 0.1217894010725909, "survival_time": 41.84999999999976, "driven_lanedir": 4.1348781019299, "get_state_dump": 0.0049551742458115895, "get_robot_state": 0.0038624990526986863, "sim_render-ego0": 0.003969894104185992, "get_duckie_state": 1.3687730120019298e-06, "in-drivable-lane": 10.249999999999543, "deviation-heading": 8.08495010803266, "agent_compute-ego0": 0.01318720047115426, "complete-iteration": 0.19677675851489593, "set_robot_commands": 0.0023662904566398384, "deviation-center-line": 2.743463227053909, "driven_lanedir_consec": 2.6365589649900887, "sim_compute_sim_state": 0.011245406328351516, "sim_compute_performance-ego0": 0.0021176964115698182},
  "LF-norm-small_loop-000-ego0": {"driven_any": 3.642169468599522, "get_ui_image": 0.02869805095384573, "step_physics": 0.10175940037651834, "survival_time": 27.10000000000025, "driven_lanedir": 1.3319499788759488, "get_state_dump": 0.0050402710670485245, "get_robot_state": 0.0039714932661249714, "sim_render-ego0": 0.00404590401201617, "get_duckie_state": 1.4564169924123072e-06, "in-drivable-lane": 16.750000000000227, "deviation-heading": 2.61533560994301, "agent_compute-ego0": 0.01328945423358053, "complete-iteration": 0.1682639122009277, "set_robot_commands": 0.002400221745612213, "deviation-center-line": 0.6893828506305119, "driven_lanedir_consec": 1.3319499788759488, "sim_compute_sim_state": 0.006864840154489759, "sim_compute_performance-ego0": 0.0021031684418848645}
}
set_robot_commands_max: 0.002400221745612213
set_robot_commands_mean: 0.002321962715655774
set_robot_commands_median: 0.0023187508863436995
set_robot_commands_min: 0.002250127344323485
sim_compute_performance-ego0_max: 0.0021176964115698182
sim_compute_performance-ego0_mean: 0.0020579500147473472
sim_compute_performance-ego0_median: 0.0020658401821549207
sim_compute_performance-ego0_min: 0.001982423283109729
sim_compute_sim_state_max: 0.011245406328351516
sim_compute_sim_state_mean: 0.00952413195097723
sim_compute_sim_state_median: 0.00999314066053382
sim_compute_sim_state_min: 0.006864840154489759
sim_render-ego0_max: 0.00404590401201617
sim_render-ego0_mean: 0.003924527882899164
sim_render-ego0_median: 0.00391851110796222
sim_render-ego0_min: 0.003815185303656047
simulation-passed: 1
step_physics_max: 0.14038371559757515
step_physics_mean: 0.115414472145738
step_physics_median: 0.11177440072455462
step_physics_min: 0.0977253715362676
survival_time_max: 59.99999999999873
survival_time_mean: 34.08749999999968
survival_time_min: 7.399999999999982
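The aggregate statistics above are simply per-episode values reduced over the four episodes. As a sanity-check sketch (using only numbers that appear in the per-episodes section, and the standard-library `statistics` module), the reported `driven_lanedir_consec` aggregates can be recomputed:

```python
import statistics

# driven_lanedir_consec per episode, copied from the per-episodes data above.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 5.713401648025677,
    "LF-norm-zigzag-000-ego0": 0.539873483752737,
    "LF-norm-techtrack-000-ego0": 2.6365589649900887,
    "LF-norm-small_loop-000-ego0": 1.3319499788759488,
}

values = sorted(driven_lanedir_consec.values())
print("min   ", min(values))                 # 0.539873483752737
print("median", statistics.median(values))   # 1.9842544719330188
print("mean  ", statistics.mean(values))     # 2.555446018911113
print("max   ", max(values))                 # 5.713401648025677
```

With four episodes the median is the average of the two middle values, which is why `driven_lanedir_consec_median` does not coincide with any single episode's value.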
52497 | LFv-sim | error | no | | | 0:03:35 |
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140651901222480
- M:video_aido:cmdline(in:/;out:/) 140651902324112
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
52492 | LFv-sim | error | no | | | 0:04:07 |
InvalidEvaluator: PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' (same traceback as job 52497).
52491 | LFv-sim | error | no | | | 0:07:32 |
InvalidEvaluator: PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' (same traceback as job 52497).
41805 | LFv-sim | success | no | | | 0:09:42 |
41804 | LFv-sim | success | no | | | 0:09:36 |
38359 | LFv-sim | success | no | | | 0:19:29 |
38358 | LFv-sim | success | no | | | 0:09:43 |
36450 | LFv-sim | success | no | | | 0:09:57 |
35867 | LFv-sim | success | no | | | 0:01:04 |
35450 | LFv-sim | error | no | | | 0:22:36 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg03-0c28c9d61367-1-job35450/logs/challenges-runner/stderr.log
35449 | LFv-sim | error | no | | | 0:22:43 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6841/LFv-sim-reg01-94a6fab21ac9-1-job35449/logs/challenges-runner/stderr.log
35134 | LFv-sim | success | no | | | 0:23:51 |
35133 | LFv-sim | success | no | | | 0:23:37 |
33580 | LFv-sim | success | no | | | 0:25:37 |
33431 | LFv-sim | success | no | | | 0:16:30 |
33430 | LFv-sim | success | no | | | 0:16:53 |