
Submission 9372

Submission: 9372
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58130
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58130

Detailed per-episode statistics are available via the episode images on the job page. Episodes evaluated:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Columns: job ID, step, status, up to date, date started, date completed, duration, message. Artefacts for each job are hidden unless you log in as the author or use the dashboard.

Job 58130 (LFv-sim): success, up to date, duration 0:10:02
driven_lanedir_consec_median: 1.6802040444950532
survival_time_median: 16.850000000000104
deviation-center-line_median: 0.6257290246613993
in-drivable-lane_median: 5.650000000000027


Other stats:

agent_compute-ego0_max: 0.013403873765066769
agent_compute-ego0_mean: 0.01245852501634039
agent_compute-ego0_median: 0.012267455999350071
agent_compute-ego0_min: 0.01189531430159465
complete-iteration_max: 0.2336737975645601
complete-iteration_mean: 0.1794407361605256
complete-iteration_median: 0.16682638803400002
complete-iteration_min: 0.15043637100954232
deviation-center-line_max: 1.8290838142640249
deviation-center-line_mean: 0.7965754916991898
deviation-center-line_min: 0.10576010320993597
deviation-heading_max: 5.2380138566704755
deviation-heading_mean: 2.482922195151132
deviation-heading_median: 1.9512848926405315
deviation-heading_min: 0.7911051386529898
driven_any_max: 5.485168064782831
driven_any_mean: 3.053165173809763
driven_any_median: 3.0484427195506667
driven_any_min: 0.6306071913548879
driven_lanedir_consec_max: 3.367558460408598
driven_lanedir_consec_mean: 1.7380404570145909
driven_lanedir_consec_min: 0.22419527865965885
driven_lanedir_max: 3.367558460408598
driven_lanedir_mean: 1.7380404570145909
driven_lanedir_median: 1.6802040444950532
driven_lanedir_min: 0.22419527865965885
get_duckie_state_max: 1.0769018966160463e-06
get_duckie_state_mean: 1.0327917293891592e-06
get_duckie_state_median: 1.0286843114714144e-06
get_duckie_state_min: 9.968963979977613e-07
get_robot_state_max: 0.0035751588273756573
get_robot_state_mean: 0.003486064389458403
get_robot_state_median: 0.003458263609246105
get_robot_state_min: 0.003452571511965746
get_state_dump_max: 0.00437867523419975
get_state_dump_mean: 0.004297457763159149
get_state_dump_median: 0.004272133309735098
get_state_dump_min: 0.004266889198966648
get_ui_image_max: 0.03660769676894284
get_ui_image_mean: 0.03091866286880058
get_ui_image_median: 0.03062417436768368
get_ui_image_min: 0.025818605970892108
in-drivable-lane_max: 15.90000000000019
in-drivable-lane_mean: 7.500000000000059
in-drivable-lane_min: 2.799999999999992
per-episodes details (raw JSON, one object per episode; a recomputation sketch follows this statistics list):
{"LF-norm-loop-000-ego0": {"driven_any": 4.665716458241799, "get_ui_image": 0.028521316358358553, "step_physics": 0.09306058128281396, "survival_time": 25.200000000000223, "driven_lanedir": 3.367558460408598, "get_state_dump": 0.00437867523419975, "get_robot_state": 0.0035751588273756573, "sim_render-ego0": 0.003651107183777459, "get_duckie_state": 1.0466811680557705e-06, "in-drivable-lane": 7.000000000000061, "deviation-heading": 3.0705199082545613, "agent_compute-ego0": 0.01189531430159465, "complete-iteration": 0.1596949067446265, "set_robot_commands": 0.002144366915863339, "deviation-center-line": 1.8290838142640249, "driven_lanedir_consec": 3.367558460408598, "sim_compute_sim_state": 0.010509831362431592, "sim_compute_performance-ego0": 0.0018868611590697032}, "LF-norm-zigzag-000-ego0": {"driven_any": 0.6306071913548879, "get_ui_image": 0.03660769676894284, "step_physics": 0.15691751308655472, "survival_time": 4.399999999999992, "driven_lanedir": 0.22419527865965885, "get_state_dump": 0.0042720510718527805, "get_robot_state": 0.0034531689761729723, "sim_render-ego0": 0.003605920277284772, "get_duckie_state": 1.0769018966160463e-06, "in-drivable-lane": 2.799999999999992, "deviation-heading": 0.7911051386529898, "agent_compute-ego0": 0.013403873765066769, "complete-iteration": 0.2336737975645601, "set_robot_commands": 0.0034741230225295164, "deviation-center-line": 0.10576010320993597, "driven_lanedir_consec": 0.22419527865965885, "sim_compute_sim_state": 0.010038472293468005, "sim_compute_performance-ego0": 0.0018301760212758953}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.4311689808595345, "get_ui_image": 0.03272703237700881, "step_physics": 0.10478646992242824, "survival_time": 8.499999999999986, "driven_lanedir": 0.707275657578087, "get_state_dump": 0.004272215547617416, "get_robot_state": 0.003452571511965746, "sim_render-ego0": 0.0036575557195652297, "get_duckie_state": 9.968963979977613e-07, "in-drivable-lane": 4.299999999999993, "deviation-heading": 0.8320498770265018, "agent_compute-ego0": 0.012519309395237974, "complete-iteration": 0.17395786932337354, "set_robot_commands": 0.0020777551751387747, "deviation-center-line": 0.2708529286108065, "driven_lanedir_consec": 0.707275657578087, "sim_compute_sim_state": 0.008562958031369928, "sim_compute_performance-ego0": 0.0018308887704771165}, "LF-norm-small_loop-000-ego0": {"driven_any": 5.485168064782831, "get_ui_image": 0.025818605970892108, "step_physics": 0.09158071264717148, "survival_time": 32.15000000000031, "driven_lanedir": 2.6531324314120193, "get_state_dump": 0.004266889198966648, "get_robot_state": 0.0034633582423192373, "sim_render-ego0": 0.0035861145635569316, "get_duckie_state": 1.0106874548870584e-06, "in-drivable-lane": 15.90000000000019, "deviation-heading": 5.2380138566704755, "agent_compute-ego0": 0.012015602603462173, "complete-iteration": 0.15043637100954232, "set_robot_commands": 0.002030684340814626, "deviation-center-line": 0.9806051207119922, "driven_lanedir_consec": 2.6531324314120193, "sim_compute_sim_state": 0.005781932647183815, "sim_compute_performance-ego0": 0.001821253980909075}}
set_robot_commands_max: 0.0034741230225295164
set_robot_commands_mean: 0.0024317323635865643
set_robot_commands_median: 0.002111061045501057
set_robot_commands_min: 0.002030684340814626
sim_compute_performance-ego0_max: 0.0018868611590697032
sim_compute_performance-ego0_mean: 0.0018422949829329477
sim_compute_performance-ego0_median: 0.0018305323958765056
sim_compute_performance-ego0_min: 0.001821253980909075
sim_compute_sim_state_max: 0.010509831362431592
sim_compute_sim_state_mean: 0.008723298583613336
sim_compute_sim_state_median: 0.009300715162418965
sim_compute_sim_state_min: 0.005781932647183815
sim_render-ego0_max: 0.0036575557195652297
sim_render-ego0_mean: 0.003625174436046098
sim_render-ego0_median: 0.003628513730531116
sim_render-ego0_min: 0.0035861145635569316
simulation-passed: 1
step_physics_max: 0.15691751308655472
step_physics_mean: 0.1115863192347421
step_physics_median: 0.0989235256026211
step_physics_min: 0.09158071264717148
survival_time_max: 32.15000000000031
survival_time_mean: 17.562500000000128
survival_time_min: 4.399999999999992
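
The aggregate rows above (min, mean, median, max) are plain statistics over the four episodes in the per-episodes details. The sketch below recomputes the driven_lanedir_consec aggregates from those per-episode values; the numbers are copied from the JSON above, and Python's statistics module is used purely for illustration, not necessarily what the evaluator itself runs.

from statistics import mean, median

# Per-episode driven_lanedir_consec values, copied from the per-episodes details above.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 3.367558460408598,
    "LF-norm-zigzag-000-ego0": 0.22419527865965885,
    "LF-norm-techtrack-000-ego0": 0.707275657578087,
    "LF-norm-small_loop-000-ego0": 2.6531324314120193,
}

values = sorted(driven_lanedir_consec.values())

# With four episodes, median() averages the two middle values, which reproduces
# the reported driven_lanedir_consec_median.
print("min:   ", min(values))
print("mean:  ", mean(values))    # ≈ 1.7380404570145909
print("median:", median(values))  # ≈ 1.6802040444950532
print("max:   ", max(values))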
Job 52206 (LFv-sim): error, not up to date, duration 0:05:52
Message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140264051100736
- M:video_aido:cmdline(in:/;out:/) 140264051156352
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
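
Jobs 52206, 52205, and 52196 all failed in the same place: while assembling the episode video, procgraph's static_image block asked PIL to open banner1.png, PIL raised UnidentifiedImageError (typically the sign of an empty or corrupted file), the block re-raised it as a ValueError via raise ... from e, and the experiment manager surfaced it as InvalidEvaluator. Below is a defensive sketch of the kind of check that would keep a bad banner from aborting video rendering; the helper name, the placeholder behaviour, and the banner size are assumptions, not part of procgraph or duckietown_experiment_manager.

from pathlib import Path

import numpy as np
from PIL import Image, UnidentifiedImageError

def load_banner_or_placeholder(path: str, size=(1280, 128)) -> np.ndarray:
    """Hypothetical helper: return the banner as an RGB array, or a plain gray
    placeholder if the file is missing, truncated, or not a valid image (the
    situation behind the UnidentifiedImageError in the traceback above)."""
    p = Path(path)
    try:
        with Image.open(p) as im:
            im.verify()            # cheap integrity check, no full decode
        with Image.open(p) as im:  # reopen: verify() leaves the image unusable
            return np.asarray(im.convert("RGB"))
    except (FileNotFoundError, UnidentifiedImageError, OSError):
        # Fall back to a solid gray banner instead of aborting video rendering.
        return np.full((size[1], size[0], 3), 128, dtype=np.uint8)

banner = load_banner_or_placeholder("banner1.png")
print(banner.shape)  # (128, 1280, 3) if banner1.png cannot be read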
Job 52205 (LFv-sim): error, not up to date, duration 0:06:30
Message: InvalidEvaluator, with the same banner1.png traceback as job 52206.
Job 52196 (LFv-sim): error, not up to date, duration 0:05:26
Message: InvalidEvaluator, with the same banner1.png traceback as job 52206.
Job 41678 (LFv-sim): success, not up to date, duration 0:09:42
Job 38104 (LFv-sim): success, not up to date, duration 0:11:26
Job 36256 (LFv-sim): success, not up to date, duration 0:18:09
Job 36255 (LFv-sim): success, not up to date, duration 0:09:36
Job 36254 (LFv-sim): success, not up to date, duration 0:15:34
Job 35701 (LFv-sim): success, not up to date, duration 0:00:57
Job 35700 (LFv-sim): success, not up to date, duration 0:01:03
Job 35317 (LFv-sim): error, not up to date, duration 0:23:03
Message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9372/LFv-sim-reg01-94a6fab21ac9-1-job35317/logs/challenges-runner/stderr.log
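
Job 35317 failed for a different reason: the evaluation container never produced challenge-results/challenge_results.yaml in its working directory, so the runner had nothing to score. Below is a minimal sketch of that kind of post-run check, assuming a working-directory layout like the one listed above; it is an illustration, not the actual duckietown-challenges runner code.

from pathlib import Path

def require_challenge_results(wd: str) -> Path:
    """Hypothetical post-run check: the runner expects the evaluator to leave
    challenge-results/challenge_results.yaml inside the working directory."""
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    if results.exists():
        return results
    # Build a listing of whatever the evaluator did leave behind, to aid debugging.
    listing = "\n".join(
        f" - {p}" for p in sorted(Path(wd).rglob("*")) if p.is_file()
    )
    raise RuntimeError(
        f"The result file is not found in working dir {wd}:\n"
        f"File '{results}' does not exist.\n"
        "This usually means that the evaluator did not finish, and sometimes "
        "that there was an import error. Check the evaluator log to see what "
        f"happened.\nList of all files:\n{listing}"
    )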
Job 34899 (LFv-sim): success, not up to date, duration 0:24:16
Job 34897 (LFv-sim): success, not up to date, duration 0:24:51
Job 34829 (LFv-sim): success, not up to date, duration 0:24:22