
Submission 834

Submission: 834
Competing: yes
Challenge: aido1_LF1_r3-v3
User: Samuel Lavoie
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💔
Jobs: step1-simulation: 12297, step2-scoring: 6259, step3-videos: 6262, step4-viz: 6421
Next:
User label: Young Duke
Admin priority: 50
Blessing: n/a
User priority: 50

Job 6421 (step4-viz): per-episode statistics images for ep000, ep001, ep002, ep003, ep004 (click an image on the page to see detailed statistics for that episode).

Job 6262 (step3-videos): per-episode videos for ep000, ep001, ep002, ep003, ep004.

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | duration | message
12297 | step1-simulation | failed | yes | 0:04:44
InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
    raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
    solution.run(cis)
  File "solution.py", line 111, in run
    solve(params, cis)
  File "solution.py", line 75, in solve
    observation, reward, done, info = env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 332, in step
    return self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/wrappers/time_limit.py", line 31, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
    obs, rew, done, misc = self.sim.step(action, with_observation=True)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
    return self._failsafe_observe(msg)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
    raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
    raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
    solution.run(cis)
  File "solution.py", line 111, in run
    solve(params, cis)
  File "solution.py", line 75, in solve
    observation, reward, done, info = env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 332, in step
    return self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python3.6/site-packages/gym/wrappers/time_limit.py", line 31, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
    obs, rew, done, misc = self.sim.step(action, with_observation=True)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
    return self._failsafe_observe(msg)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
    raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
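Note: the failure above is a connection error, not a crash in the control logic: the solution's env.step() call could never reach the simulation server at host "evaluator". A minimal sketch of a pre-flight reachability check a solution could run before entering its control loop, so the problem surfaces immediately with a clear message. The port, retry count, and delay below are illustrative assumptions, not values taken from duckietown-slimremote; only the host name comes from the traceback.

import socket
import time

def wait_for_server(host="evaluator", port=8902, retries=10, delay=2.0):
    """Return True once a TCP connection to (host, port) succeeds, else False."""
    for _ in range(retries):
        try:
            # Probe the simulation server with a short-lived TCP connection.
            with socket.create_connection((host, port), timeout=delay):
                return True
        except OSError:
            time.sleep(delay)
    return False

if not wait_for_server():
    raise RuntimeError("gym duckietown server not reachable at host 'evaluator'")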


6421 | step4-viz | success | yes | 0:04:03
driven_lanedir_median: 2.007174640650131
deviation-center-line_median: 0.7470490299516872
in-drivable-lane_median: 10.96666666666663


other stats
deviation-center-line_max: 1.2882373660159392
deviation-center-line_mean: 0.6562525500331089
deviation-center-line_min: 0
deviation-heading_max: 2.3062969492956733
deviation-heading_mean: 1.2751324529310195
deviation-heading_median: 1.2324123415856356
deviation-heading_min: 0
driven_any_max: 6.370932198627862
driven_any_mean: 6.286530397440294
driven_any_median: 6.33989131858102
driven_any_min: 6.120627628089538
driven_lanedir_max: 4.038876905070678
driven_lanedir_mean: 2.148530641897353
driven_lanedir_min: 0
in-drivable-lane_max: 16.63333333333332
in-drivable-lane_mean: 10.726666666666649
in-drivable-lane_min: 5.566666666666647
per-episodes details:
{
  "ep000": {"driven_any": 6.232326621558599, "driven_lanedir": 0, "in-drivable-lane": 16.63333333333332, "deviation-heading": 0, "deviation-center-line": 0},
  "ep001": {"driven_any": 6.120627628089538, "driven_lanedir": 2.007174640650131, "in-drivable-lane": 10.96666666666663, "deviation-heading": 1.2324123415856356, "deviation-center-line": 0.7470490299516872},
  "ep002": {"driven_any": 6.368874220344451, "driven_lanedir": 4.038876905070678, "in-drivable-lane": 5.566666666666647, "deviation-heading": 2.3062969492956733, "deviation-center-line": 1.1050416948852386},
  "ep003": {"driven_any": 6.33989131858102, "driven_lanedir": 3.5174523047355586, "in-drivable-lane": 7.000000000000002, "deviation-heading": 2.290768452385038, "deviation-center-line": 1.2882373660159392},
  "ep004": {"driven_any": 6.370932198627862, "driven_lanedir": 1.1791493590303976, "in-drivable-lane": 13.466666666666656, "deviation-heading": 0.5461845213887516, "deviation-center-line": 0.14093465931267904}
}
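Note: the aggregate rows above (min, max, mean, median) can be recomputed from the per-episode details. A minimal sketch, assuming simple unweighted aggregation over the five episodes (the official scoring code may differ); driven_lanedir is shown, the other metrics aggregate the same way.

import statistics

# Per-episode values copied from the details block above.
driven_lanedir = {
    "ep000": 0,
    "ep001": 2.007174640650131,
    "ep002": 4.038876905070678,
    "ep003": 3.5174523047355586,
    "ep004": 1.1791493590303976,
}

values = list(driven_lanedir.values())
print("min   ", min(values))                # 0
print("max   ", max(values))                # 4.038876905070678
print("mean  ", statistics.mean(values))    # ~2.1485, matches driven_lanedir_mean
print("median", statistics.median(values))  # 2.007174640650131, matches the median above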
6420 | step4-viz | aborted | yes | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 353, in go_
    os.makedirs(wd)
  File "/usr/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido1_LF1_r3-v3/submission834/step4-viz-nutonomy-P50-2210-job6420'
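Note: job 6420 aborted because the evaluation host ran out of disk space while creating the job's working directory. A minimal Python 3 sketch of the kind of free-space guard a runner could apply before os.makedirs; the threshold and helper name are illustrative and are not part of duckietown_challenges_runner.

import os
import shutil

MIN_FREE_BYTES = 2 * 1024 ** 3  # illustrative threshold: assume ~2 GiB per job

def make_job_workdir(wd, min_free=MIN_FREE_BYTES):
    """Create the job working directory, refusing early if the disk is nearly full."""
    parent = os.path.dirname(wd) or "."
    os.makedirs(parent, exist_ok=True)
    free = shutil.disk_usage(parent).free
    if free < min_free:
        raise RuntimeError("only %d bytes free under %s; refusing to start job" % (free, parent))
    os.makedirs(wd, exist_ok=True)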
6262 | step3-videos | success | yes | 0:01:14
other stats
videos: 1
6259 | step2-scoring | success | yes | 0:00:20
survival_time_median: 16.666666666666654


other stats
episodes details:
{
  "ep000": {"nsteps": 500, "reward": -2.485302177786827, "good_angle": 44.70155231345539, "survival_time": 16.666666666666654, "traveled_tiles": 11, "valid_direction": 5.533333333333351},
  "ep001": {"nsteps": 500, "reward": -1.4064829516261816, "good_angle": 16.665641929545327, "survival_time": 16.666666666666654, "traveled_tiles": 8, "valid_direction": 6.0333333333333155},
  "ep002": {"nsteps": 500, "reward": -0.5573239942416549, "good_angle": 2.385354831913217, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 6.166666666666648},
  "ep003": {"nsteps": 500, "reward": -0.78558545652451, "good_angle": 2.7125707063362507, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 6.833333333333345},
  "ep004": {"nsteps": 500, "reward": -2.124756822562427, "good_angle": 32.37004984167209, "survival_time": 16.666666666666654, "traveled_tiles": 11, "valid_direction": 3.9333333333333194}
}
good_angle_max: 44.70155231345539
good_angle_mean: 19.767033924584457
good_angle_median: 16.665641929545327
good_angle_min: 2.385354831913217
reward_max: -0.5573239942416549
reward_mean: -1.47189028054832
reward_median: -1.4064829516261816
reward_min: -2.485302177786827
survival_time_max: 16.666666666666654
survival_time_mean: 16.666666666666654
survival_time_min: 16.666666666666654
traveled_tiles_max: 13
traveled_tiles_mean: 11.2
traveled_tiles_median: 11
traveled_tiles_min: 8
valid_direction_max: 6.833333333333345
valid_direction_mean: 5.699999999999995
valid_direction_median: 6.0333333333333155
valid_direction_min: 3.9333333333333194
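Note: every episode reports nsteps 500 and survival_time 16.666... s, which is consistent with episodes that ran to the step limit at a fixed simulation step, assumed here to be 1/30 s (a common gym-duckietown setting, not stated in this report). A minimal sketch that checks this and recomputes the reward aggregates from the per-episode details above:

import statistics

# Per-episode rewards copied from the details block above.
rewards = {
    "ep000": -2.485302177786827,
    "ep001": -1.4064829516261816,
    "ep002": -0.5573239942416549,
    "ep003": -0.78558545652451,
    "ep004": -2.124756822562427,
}

nsteps = 500
assumed_dt = 1.0 / 30.0                           # assumed simulation step, seconds
print("survival_time", nsteps * assumed_dt)       # ~16.667 s, matches the values above

values = list(rewards.values())
print("reward_mean  ", statistics.mean(values))   # ~-1.4719, matches reward_mean
print("reward_median", statistics.median(values)) # -1.4064829516261816
print("reward_min   ", min(values))               # -2.485302177786827
print("reward_max   ", max(values))               # -0.5573239942416549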