
Submission 9317

Submission: 9317
Competing: yes
Challenge: aido5-LF-sim-validation
User: Yishu Malhotra 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58207
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58207

Detailed statistics are available for each episode:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Each job below is listed with its step, status, up-to-date flag, duration, and message.

Job 58207 — step: LFv-sim, status: success, up to date: yes, duration: 0:06:50
driven_lanedir_consec_median: 0.7887294732782361
survival_time_median: 9.55
deviation-center-line_median: 0.24292755678083647
in-drivable-lane_median: 6.599999999999999


other stats
agent_compute-ego0_max: 0.014073064592149522
agent_compute-ego0_mean: 0.012775190683229272
agent_compute-ego0_median: 0.012583384940581172
agent_compute-ego0_min: 0.01186092825960522
complete-iteration_max: 0.18397002977038188
complete-iteration_mean: 0.16905693298111085
complete-iteration_median: 0.17212890233899572
complete-iteration_min: 0.14799989747607015
deviation-center-line_max: 0.2706747497263237
deviation-center-line_mean: 0.2043382524396596
deviation-center-line_min: 0.06082314647064173
deviation-heading_max: 2.035795709899228
deviation-heading_mean: 1.329246631392555
deviation-heading_median: 1.436952653529622
deviation-heading_min: 0.40728550861174817
driven_any_max: 5.190622234694534
driven_any_mean: 3.2427257367827105
driven_any_median: 2.9942276612554743
driven_any_min: 1.7918253899253584
driven_lanedir_consec_max: 0.9506983561718854
driven_lanedir_consec_mean: 0.8044308225023503
driven_lanedir_consec_min: 0.689565987281044
driven_lanedir_max: 0.9506983561718854
driven_lanedir_mean: 0.8044308225023503
driven_lanedir_median: 0.7887294732782361
driven_lanedir_min: 0.689565987281044
get_duckie_state_max: 1.3688514972555226e-06
get_duckie_state_mean: 1.3086822898557003e-06
get_duckie_state_median: 1.3309811788891988e-06
get_duckie_state_min: 1.2039153043888818e-06
get_robot_state_max: 0.003862868414984809
get_robot_state_mean: 0.003714744915812503
get_robot_state_median: 0.003694310701147872
get_robot_state_min: 0.00360748984596946
get_state_dump_max: 0.004954904223245287
get_state_dump_mean: 0.004708129536265579
get_state_dump_median: 0.004698985589940364
get_state_dump_min: 0.004479642741936298
get_ui_image_max: 0.03399508340018136
get_ui_image_mean: 0.029978165458545647
get_ui_image_median: 0.03025021000383457
get_ui_image_min: 0.02541715842633208
in-drivable-lane_max: 11.75000000000007
in-drivable-lane_mean: 7.125000000000013
in-drivable-lane_min: 3.5499999999999874
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.6672310165536564, "get_ui_image": 0.02771793425768271, "step_physics": 0.09862032566947498, "survival_time": 8.649999999999988, "driven_lanedir": 0.7299288541400641, "get_state_dump": 0.004791479001100036, "get_robot_state": 0.003752013732647074, "sim_render-ego0": 0.0036623491638008206, "get_duckie_state": 1.3688514972555226e-06, "in-drivable-lane": 6.099999999999993, "deviation-heading": 0.9818304430466956, "agent_compute-ego0": 0.012517641330587453, "complete-iteration": 0.16467994657056084, "set_robot_commands": 0.0022351824004074624, "deviation-center-line": 0.25686001778412937, "driven_lanedir_consec": 0.7299288541400641, "sim_compute_sim_state": 0.009347393594939132, "sim_compute_performance-ego0": 0.001952316569185805}, "LF-norm-zigzag-000-ego0": {"driven_any": 3.3212243059572923, "get_ui_image": 0.03399508340018136, "step_physics": 0.10703266575222924, "survival_time": 10.450000000000014, "driven_lanedir": 0.847530092416408, "get_state_dump": 0.004606492178780692, "get_robot_state": 0.00363660766964867, "sim_render-ego0": 0.003614163398742676, "get_duckie_state": 1.3374146961030505e-06, "in-drivable-lane": 7.100000000000004, "deviation-heading": 1.892074864012548, "agent_compute-ego0": 0.012649128550574894, "complete-iteration": 0.1795778581074306, "set_robot_commands": 0.002024903751554943, "deviation-center-line": 0.2706747497263237, "driven_lanedir_consec": 0.847530092416408, "sim_compute_sim_state": 0.010087587719871885, "sim_compute_performance-ego0": 0.0018504085994902112}, "LF-norm-techtrack-000-ego0": {"driven_any": 5.190622234694534, "get_ui_image": 0.032782485749986434, "step_physics": 0.10745774753510004, "survival_time": 15.700000000000088, "driven_lanedir": 0.9506983561718854, "get_state_dump": 0.004954904223245287, "get_robot_state": 0.003862868414984809, "sim_render-ego0": 0.0040165946597144715, "get_duckie_state": 1.3245476616753472e-06, "in-drivable-lane": 11.75000000000007,
"deviation-heading": 2.035795709899228, "agent_compute-ego0": 0.014073064592149522, "complete-iteration": 0.18397002977038188, "set_robot_commands": 0.002326572509039016, "deviation-center-line": 0.22899509577754357, "driven_lanedir_consec": 0.9506983561718854, "sim_compute_sim_state": 0.012253584180559432, "sim_compute_performance-ego0": 0.0021527411445738777}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.7918253899253584, "get_ui_image": 0.02541715842633208, "step_physics": 0.08976413198739044, "survival_time": 5.999999999999987, "driven_lanedir": 0.689565987281044, "get_state_dump": 0.004479642741936298, "get_robot_state": 0.00360748984596946, "sim_render-ego0": 0.003770871595902877, "get_duckie_state": 1.2039153043888818e-06, "in-drivable-lane": 3.5499999999999874, "deviation-heading": 0.40728550861174817, "agent_compute-ego0": 0.01186092825960522, "complete-iteration": 0.14799989747607015, "set_robot_commands": 0.002088032478143361, "deviation-center-line": 0.06082314647064173, "driven_lanedir_consec": 0.689565987281044, "sim_compute_sim_state": 0.005025686311327721, "sim_compute_performance-ego0": 0.0019052008951991055}}
set_robot_commands_max: 0.002326572509039016
set_robot_commands_mean: 0.0021686727847861956
set_robot_commands_median: 0.0021616074392754115
set_robot_commands_min: 0.002024903751554943
sim_compute_performance-ego0_max: 0.0021527411445738777
sim_compute_performance-ego0_mean: 0.00196516680211225
sim_compute_performance-ego0_median: 0.0019287587321924551
sim_compute_performance-ego0_min: 0.0018504085994902112
sim_compute_sim_state_max: 0.012253584180559432
sim_compute_sim_state_mean: 0.009178562951674542
sim_compute_sim_state_median: 0.009717490657405507
sim_compute_sim_state_min: 0.005025686311327721
sim_render-ego0_max: 0.0040165946597144715
sim_render-ego0_mean: 0.003765994704540211
sim_render-ego0_median: 0.003716610379851848
sim_render-ego0_min: 0.003614163398742676
simulation-passed: 1
step_physics_max: 0.10745774753510004
step_physics_mean: 0.10071871773604868
step_physics_median: 0.10282649571085212
step_physics_min: 0.08976413198739044
survival_time_max: 15.700000000000088
survival_time_mean: 10.20000000000002
survival_time_min: 5.999999999999987
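Each aggregate above is a per-metric statistic taken over the four episodes. A minimal sketch of how such aggregates can be recomputed with Python's statistics module — the episode values below are rounded stand-ins for the per-episodes JSON, and `aggregate` is an illustrative helper, not part of the Duckietown tooling:

```python
import statistics

# Stand-in per-episode data, same shape as the "per-episodes details" JSON
# (values rounded for readability).
episodes = {
    "LF-norm-loop-000-ego0": {"survival_time": 8.65, "driven_lanedir_consec": 0.73},
    "LF-norm-zigzag-000-ego0": {"survival_time": 10.45, "driven_lanedir_consec": 0.85},
    "LF-norm-techtrack-000-ego0": {"survival_time": 15.70, "driven_lanedir_consec": 0.95},
    "LF-norm-small_loop-000-ego0": {"survival_time": 6.00, "driven_lanedir_consec": 0.69},
}

def aggregate(metric):
    """Return min/max/mean/median of one metric across all episodes."""
    values = [ep[metric] for ep in episodes.values()]
    return {
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
    }

print(aggregate("survival_time"))  # median is 9.55, matching survival_time_median above
```

With four episodes the median is the midpoint of the two middle values, which is why survival_time_median (9.55) appears in no single episode.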
Job 52400 — step: LFv-sim, status: error, up to date: no, duration: 0:02:51

InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139624642242016
- M:video_aido:cmdline(in:/;out:/) 139624642243312
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52388 — step: LFv-sim, status: error, up to date: no, duration: 0:04:22

InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140481551326800
- M:video_aido:cmdline(in:/;out:/) 140481551169424
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52325 — step: LFv-sim, status: host-error, up to date: no, duration: 0:08:34

The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 41727 — step: LFv-sim, status: success, up to date: no, duration: 0:04:29
Job 41726 — step: LFv-sim, status: success, up to date: no, duration: 0:04:33
Job 38172 — step: LFv-sim, status: error, up to date: no, duration: 0:00:35

The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9317/LFv-sim-mont04-e828c68b6a88-1-job38172-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 36322 — step: LFv-sim, status: error, up to date: no, duration: 0:00:42

The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-Sandy1-sandy-1-job36322-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35756 — step: LFv-sim, status: success, up to date: no, duration: 0:01:06
Job 35361 — step: LFv-sim, status: error, up to date: no, duration: 0:08:35

The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9317/LFv-sim-reg03-0c28c9d61367-1-job35361/logs/challenges-runner/stderr.log
Job 34996 — step: LFv-sim, status: success, up to date: no, duration: 0:09:28
Job 34668 — step: LFv-sim, status: success, up to date: no, duration: 0:09:49
Job 34667 — step: LFv-sim, status: success, up to date: no, duration: 0:10:29