
Submission 10854

Submission: 10854
Competing: yes
Challenge: aido5-LF-sim-validation
User: Dishank Bansal 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57744
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57744

Click the images to see detailed statistics about each episode:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Job 57744: step LFv-sim, status: success, up to date: yes, duration: 0:15:15
driven_lanedir_consec_median: 0.7491471932165088
survival_time_median: 13.275000000000054
deviation-center-line_median: 0.35941647700012147
in-drivable-lane_median: 6.425000000000029


other stats
agent_compute-ego0_max: 0.013406648644605544
agent_compute-ego0_mean: 0.013033803984414026
agent_compute-ego0_median: 0.013126399721612505
agent_compute-ego0_min: 0.012475767849825552
complete-iteration_max: 0.23298384862787583
complete-iteration_mean: 0.20655155826927357
complete-iteration_median: 0.2129258506492732
complete-iteration_min: 0.16737068315067202
deviation-center-line_max: 0.8658074201512707
deviation-center-line_mean: 0.4461667692025132
deviation-center-line_min: 0.2000267026585392
deviation-heading_max: 2.8930042290982727
deviation-heading_mean: 1.8201783211075704
deviation-heading_median: 1.8063736297423596
deviation-heading_min: 0.7749617958472895
driven_any_max: 9.33060731718023
driven_any_mean: 3.8188933055940537
driven_any_median: 2.1374161482204688
driven_any_min: 1.6701336087550458
driven_lanedir_consec_max: 2.1013254684912597
driven_lanedir_consec_mean: 1.0758169787789131
driven_lanedir_consec_min: 0.7036480601913753
driven_lanedir_max: 2.1013254684912597
driven_lanedir_mean: 1.0758169787789131
driven_lanedir_median: 0.7491471932165088
driven_lanedir_min: 0.7036480601913753
get_duckie_state_max: 1.4990762947431576e-06
get_duckie_state_mean: 1.4327472193641053e-06
get_duckie_state_median: 1.4511150553597635e-06
get_duckie_state_min: 1.3296824719937363e-06
get_robot_state_max: 0.004200072598828841
get_robot_state_mean: 0.004033903502048971
get_robot_state_median: 0.003995336425531572
get_robot_state_min: 0.0039448685583039
get_state_dump_max: 0.005463136753565828
get_state_dump_mean: 0.005228431517679675
get_state_dump_median: 0.005196959629243262
get_state_dump_min: 0.005056670058666347
get_ui_image_max: 0.03627616829342312
get_ui_image_mean: 0.0319349531391076
get_ui_image_median: 0.03213267039480988
get_ui_image_min: 0.027198303473387527
in-drivable-lane_max: 48.649999999999025
in-drivable-lane_mean: 15.99999999999978
in-drivable-lane_min: 2.5000000000000355
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 1.7923921092207975, "get_ui_image": 0.030116378473290264, "step_physics": 0.14083372443783126, "survival_time": 11.300000000000026, "driven_lanedir": 0.7070502201502911, "get_state_dump": 0.005056670058666347, "get_robot_state": 0.0039448685583039, "sim_render-ego0": 0.004156758606696444, "get_duckie_state": 1.3296824719937363e-06, "in-drivable-lane": 6.750000000000034, "deviation-heading": 1.1621823234491524, "agent_compute-ego0": 0.012475767849825552, "complete-iteration": 0.2102818373541475, "set_robot_commands": 0.002296152619013177, "deviation-center-line": 0.40564328623370777, "driven_lanedir_consec": 0.7070502201502911, "sim_compute_sim_state": 0.009168295083066965, "sim_compute_performance-ego0": 0.0021334387657400795},
 "LF-norm-zigzag-000-ego0": {"driven_any": 2.4824401872201403, "get_ui_image": 0.03627616829342312, "step_physics": 0.15401983728595808, "survival_time": 15.250000000000082, "driven_lanedir": 2.1013254684912597, "get_state_dump": 0.005156266923044242, "get_robot_state": 0.0039869219649071785, "sim_render-ego0": 0.004264572866601882, "get_duckie_state": 1.4990762947431576e-06, "in-drivable-lane": 2.5000000000000355, "deviation-heading": 2.8930042290982727, "agent_compute-ego0": 0.013132189613541746, "complete-iteration": 0.23298384862787583, "set_robot_commands": 0.0023214832630032807, "deviation-center-line": 0.8658074201512707, "driven_lanedir_consec": 2.1013254684912597, "sim_compute_sim_state": 0.01151478446386998, "sim_compute_performance-ego0": 0.002201151224522809},
 "LF-norm-techtrack-000-ego0": {"driven_any": 9.33060731718023, "get_ui_image": 0.034148962316329495, "step_physics": 0.1367304589527665, "survival_time": 54.49999999999904, "driven_lanedir": 0.7912441662827266, "get_state_dump": 0.005237652335442281, "get_robot_state": 0.004200072598828841, "sim_render-ego0": 0.00438314069858065, "get_duckie_state": 1.4482125790156322e-06, "in-drivable-lane": 48.649999999999025, "deviation-heading": 2.450564936035567, "agent_compute-ego0": 0.013406648644605544, "complete-iteration": 0.21556986394439892, "set_robot_commands": 0.0025042196023802926, "deviation-center-line": 0.31318966776653523, "driven_lanedir_consec": 0.7912441662827266, "sim_compute_sim_state": 0.0124888767571104, "sim_compute_performance-ego0": 0.002362830610476318},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.6701336087550458, "get_ui_image": 0.027198303473387527, "step_physics": 0.10317838583753702, "survival_time": 10.600000000000016, "driven_lanedir": 0.7036480601913753, "get_state_dump": 0.005463136753565828, "get_robot_state": 0.004003750886155966, "sim_render-ego0": 0.004271094228180361, "get_duckie_state": 1.4540175317038951e-06, "in-drivable-lane": 6.100000000000024, "deviation-heading": 0.7749617958472895, "agent_compute-ego0": 0.01312060982968326, "complete-iteration": 0.16737068315067202, "set_robot_commands": 0.0024085705269110596, "deviation-center-line": 0.2000267026585392, "driven_lanedir_consec": 0.7036480601913753, "sim_compute_sim_state": 0.005409170204484967, "sim_compute_performance-ego0": 0.002213671733515923}}
set_robot_commands_max: 0.0025042196023802926
set_robot_commands_mean: 0.0023826065028269526
set_robot_commands_median: 0.00236502689495717
set_robot_commands_min: 0.002296152619013177
sim_compute_performance-ego0_max: 0.002362830610476318
sim_compute_performance-ego0_mean: 0.0022277730835637824
sim_compute_performance-ego0_median: 0.002207411479019366
sim_compute_performance-ego0_min: 0.0021334387657400795
sim_compute_sim_state_max: 0.0124888767571104
sim_compute_sim_state_mean: 0.009645281627133078
sim_compute_sim_state_median: 0.010341539773468474
sim_compute_sim_state_min: 0.005409170204484967
sim_render-ego0_max: 0.00438314069858065
sim_render-ego0_mean: 0.004268891600014834
sim_render-ego0_median: 0.0042678335473911215
sim_render-ego0_min: 0.004156758606696444
simulation-passed: 1
step_physics_max: 0.15401983728595808
step_physics_mean: 0.1336906016285232
step_physics_median: 0.13878209169529887
step_physics_min: 0.10317838583753702
survival_time_max: 54.49999999999904
survival_time_mean: 22.912499999999795
survival_time_min: 10.600000000000016
No reset possible
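The aggregate entries above (the _min, _mean, _median, and _max values) appear to be plain per-metric summaries over the four episodes in the per-episodes details blob. The sketch below recomputes them under that assumption; the variable per_episode_json and the helper aggregate are illustrative names, not part of the evaluator.

import json
from statistics import mean, median

# per_episode_json: the raw JSON string shown under "per-episodes details" above (assumed available)
per_episode = json.loads(per_episode_json)

def aggregate(metric):
    # One value per episode, summarized the way the stats above seem to be built.
    values = [ep[metric] for ep in per_episode.values()]
    return {"min": min(values), "mean": mean(values), "median": median(values), "max": max(values)}

print(aggregate("survival_time"))
# If the assumption holds, this matches the listed survival_time stats
# (min 10.6, mean 22.9125, median 13.275, max 54.5) up to floating-point noise.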
Job 57741: step LFv-sim, status: success, up to date: yes, duration: 0:10:41
No reset possible
Job 57740: step LFv-sim, status: success, up to date: yes, duration: 0:16:57
No reset possible
Job 57739: step LFv-sim, status: success, up to date: yes, duration: 0:18:54
No reset possible
Job 51482: step LFv-sim, status: error, up to date: no, duration: 0:09:15
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140411577140368
- M:video_aido:cmdline(in:/;out:/) 140411577143008
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
No reset possible
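This failure (and the identical one in job 51464 below) traces back to Pillow being unable to read 'banner1.png' while the evaluator assembles the episode video. A quick way to tell whether such a file is missing or merely unreadable is to open it with Pillow directly; this is a hedged diagnostic sketch, not part of the evaluator code.

from PIL import Image, UnidentifiedImageError

def check_image(path="banner1.png"):
    # Report whether the file is absent, unreadable, or a valid image.
    try:
        with Image.open(path) as im:
            im.verify()  # integrity check without decoding all pixels
        print(f"{path}: looks like a valid image")
    except FileNotFoundError:
        print(f"{path}: file not found")
    except UnidentifiedImageError:
        print(f"{path}: present but not a recognizable image (corrupt or wrong format)")

check_image()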
Job 51464: step LFv-sim, status: error, up to date: no, duration: 0:06:46
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140049777095392
- M:video_aido:cmdline(in:/;out:/) 140049777499200
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
No reset possible
Job 40841: step LFv-sim, status: success, up to date: no, duration: 0:11:25
No reset possible
Job 40839: step LFv-sim, status: success, up to date: no, duration: 0:11:16
No reset possible
Job 40837: step LFv-sim, status: success, up to date: no, duration: 0:09:21
No reset possible
Job 40833: step LFv-sim, status: timeout, up to date: no, duration: 0:09:15
Timeout because evaluator contacted us
No reset possible
Job 40825: step LFv-sim, status: success, up to date: no, duration: 0:09:05
No reset possible
Job 40823: step LFv-sim, status: success, up to date: no, duration: 0:09:55
No reset possible
Job 38188: step LFv-sim, status: error, up to date: no, duration: 0:00:37
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission10854/LFv-sim-mont01-6ef51bb8a9d6-1-job38188-a-wd/challenge-results/challenge_results.yaml' does not exist.
No reset possible
Job 38187: step LFv-sim, status: error, up to date: no, duration: 0:00:33
The container "evalu [...]
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission10854/LFv-sim-mont03-cfb9f976bc49-1-job38187-a-wd/challenge-results/challenge_results.yaml' does not exist.
No reset possible