
Submission 224

Submission: 224
Competing: yes
Challenge: aido1_LF1_r3-v3
User: Julian Zilly
Date submitted:
Complete:
Details: Evaluation is complete.
Sisters:
Result: 💔
Jobs: step1-simulation: 13904
Next:
User label: Baseline solution using imitation learning from logs
Admin priority: 50
Blessing: 50
User priority:

Evaluation jobs for this submission

All jobs below belong to submission 224, user Julian Zilly, user label "Baseline solution using imitation learning from logs", challenge aido1_LF1_r3-v3.

Job ID | step | status | up to date | duration | message
13904 | step1-simulation | failed | yes | 0:01:34 | InvalidSubmission: Can't connect to the gym-duckietown-server

InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
    raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
  File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 581, in wrap_solution
    solution.run(cis)
  File "solution.py", line 88, in run
    raise InvalidSubmission(str(e))
InvalidSubmission: Can't connect to the gym-duckietown-server

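All of the failed step1-simulation jobs on this page (13904, 10630, 10629, 3116, 2940, 2592, 555) stopped with the same error: at solution.py line 88 the solution apparently converts a connection failure into InvalidSubmission when it cannot reach the gym-duckietown-server container. The actual solution.py is not reproduced on this page; the following is only a rough sketch of such a guarded connection attempt. The port, retry policy, and the locally defined exception class are assumptions; the host name is taken from the error message.

# Rough sketch only: not the actual solution.py. Port and retry policy are
# assumptions; InvalidSubmission here stands in for the class provided by the
# duckietown_challenges package.
import socket
import time


class InvalidSubmission(Exception):
    """Placeholder for duckietown_challenges' InvalidSubmission."""


def wait_for_gym_server(host="gym-duckietown-server", port=8902,
                        attempts=10, delay=5.0):
    """Probe the simulation server before starting the episode loop."""
    last_error = None
    for _ in range(attempts):
        try:
            # A plain TCP connect tells us whether the server container is up
            # and reachable from the solution container.
            with socket.create_connection((host, port), timeout=delay):
                return
        except OSError as exc:
            last_error = exc
            time.sleep(delay)
    # Mirrors the failure mode in the logs: the problem is reported as an
    # InvalidSubmission rather than as an unhandled exception.
    raise InvalidSubmission("Can't connect to the gym-duckietown-server: %r" % last_error)

A call like wait_for_gym_server() at the top of run() would fail fast with the same message seen in these jobs.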
10630 | step1-simulation | failed | no | 0:02:44 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
10629 | step1-simulation | failed | no | 0:02:45 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
3116 | step1-simulation | failed | no | 0:00:23 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
2940 | step1-simulation | failed | no | 0:02:27 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
2825 | step4-viz | success | no | 0:02:08
  other stats: videos: 1
2824 | step3-videos | success | no | 0:03:24
  other stats: videos: 1
2822 | step2-scoring | success | no | 0:00:16
  survival_time_median: 16.666666666666654
  other stats:
    episodes:
      details: {"ep000": {"nsteps": 500, "reward": -1.2812926945090295, "good_angle": 60.62226718882041, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 7.133333333333309}, "ep001": {"nsteps": 500, "reward": -0.39393230858445166, "good_angle": 18.195943465006614, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 1.933333333333362}, "ep002": {"nsteps": 364, "reward": -2.7482636302229366, "good_angle": 0.7961713109202434, "survival_time": 12.1333333333333, "traveled_tiles": 3, "valid_direction": 1.5999999999999943}, "ep003": {"nsteps": 500, "reward": -1.1700545222759249, "good_angle": 58.375203150184426, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 7.066666666666666}, "ep004": {"nsteps": 500, "reward": -1.5698140780329704, "good_angle": 12.501617076206289, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 6.866666666666678}}
    good_angle_max: 60.62226718882041
    good_angle_mean: 30.098240438227595
    good_angle_median: 18.195943465006614
    good_angle_min: 0.7961713109202434
    reward_max: -0.39393230858445166
    reward_mean: -1.4326714467250627
    reward_median: -1.2812926945090295
    reward_min: -2.7482636302229366
    survival_time_max: 16.666666666666654
    survival_time_mean: 15.759999999999986
    survival_time_min: 12.1333333333333
    traveled_tiles_max: 4
    traveled_tiles_mean: 3.2
    traveled_tiles_median: 3
    traveled_tiles_min: 3
    valid_direction_max: 7.133333333333309
    valid_direction_mean: 4.920000000000002
    valid_direction_median: 6.866666666666678
    valid_direction_min: 1.5999999999999943
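The aggregate values listed for job 2822 are per-metric statistics over the five episodes in the details dictionary. The snippet below simply recomputes them from the episode values copied from this page (nsteps omitted); how the scoring container itself computes them is not shown here, so treat this as a consistency check rather than the scorer's code.

# Recompute the aggregates of job 2822 from its per-episode "details".
import statistics

details = {
    "ep000": {"reward": -1.2812926945090295, "good_angle": 60.62226718882041,
              "survival_time": 16.666666666666654, "traveled_tiles": 3,
              "valid_direction": 7.133333333333309},
    "ep001": {"reward": -0.39393230858445166, "good_angle": 18.195943465006614,
              "survival_time": 16.666666666666654, "traveled_tiles": 3,
              "valid_direction": 1.933333333333362},
    "ep002": {"reward": -2.7482636302229366, "good_angle": 0.7961713109202434,
              "survival_time": 12.1333333333333, "traveled_tiles": 3,
              "valid_direction": 1.5999999999999943},
    "ep003": {"reward": -1.1700545222759249, "good_angle": 58.375203150184426,
              "survival_time": 16.666666666666654, "traveled_tiles": 4,
              "valid_direction": 7.066666666666666},
    "ep004": {"reward": -1.5698140780329704, "good_angle": 12.501617076206289,
              "survival_time": 16.666666666666654, "traveled_tiles": 3,
              "valid_direction": 6.866666666666678},
}

for metric in ("good_angle", "reward", "survival_time", "traveled_tiles", "valid_direction"):
    values = [ep[metric] for ep in details.values()]
    print(metric, "max:", max(values), "mean:", statistics.mean(values),
          "median:", statistics.median(values), "min:", min(values))

For example, survival_time gives max 16.666..., mean 15.76, median 16.666..., min 12.133..., matching the survival_time_* rows above.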
2809 | step2-scoring | timeout | no | 0:11:01
2795 | step1-simulation | success | no | 0:02:45
  other stats: simulation-passed: 1
2794 | step1-simulation | timeout | no | 0:11:00
2592 | step1-simulation | failed | no | 0:00:23 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
555 | step1-simulation | failed | no | 0:00:47 | InvalidSubmission: Can't connect to the gym-duckietown-server (same traceback as job 13904)
509 | step1-simulation | timeout | no | 0:11:00
508 | step1-simulation | success | no | 0:04:44
  other stats: simulation-passed: 1
507 | step1-simulation | aborted | no | 0:02:44 | Uncaught exception while running Docker Compose

Uncaught exception while running Docker Compose:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 412, in run
    run_docker(wd, project, cmd)
  File "/project/src/duckietown_challenges_runner/runner.py", line 560, in run_docker
    raise DockerComposeFail(msg)
DockerComposeFail: Could not run ['docker-compose', '-p', 'job507-4555', 'up', '--abort-on-container-exit']:

   >  Command '['docker-compose', '-p', 'job507-4555', 'up', '--abort-on-container-exit']' returned non-zero exit status 1

 docker-compose stdout  | None
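Job 507 aborted on the evaluator side: the challenge runner's docker-compose invocation exited with a non-zero status before the solution ever ran. The runner.py code is not reproduced on this page; the sketch below only illustrates wrapping such a call, with the command and project name taken from the log and everything else (function name, the locally defined exception) assumed.

# Sketch of wrapping a docker-compose call the way the runner log suggests:
# run the command and turn a non-zero exit status into a typed error.
import subprocess


class DockerComposeFail(Exception):
    """Placeholder for the runner's exception of the same name."""


def run_docker_compose(project):
    cmd = ["docker-compose", "-p", project, "up", "--abort-on-container-exit"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise DockerComposeFail(
            "Could not run %s: returned non-zero exit status %d\nstdout: %s\nstderr: %s"
            % (cmd, result.returncode, result.stdout, result.stderr))


# The failing invocation in job 507 corresponds to:
# run_docker_compose("job507-4555")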
506 | step1-simulation | timeout | no | 0:11:00
505 | step1-simulation | success | no | 0:04:22
  other stats: simulation-passed: 1
504 | step1-simulation | success | no | 0:04:44
  other stats: simulation-passed: 1
503 | step1-simulation | success | no | 0:04:46
  other stats: simulation-passed: 1
502 | step1-simulation | timeout | no | 0:11:01
501 | step1-simulation | success | no | 0:04:09
  other stats: simulation-passed: 1
476 | step4-viz | aborted | no | 0:07:26 | InvalidEnvironment: Could not find dir /previous-steps/step1-simulation/challenge-evaluation-output

InvalidEnvironment:
Traceback (most recent call last):
  File "/visualizer/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 530, in wrap_scorer
    evaluator.score(cie)
  File "/visualizer/videos.py", line 12, in score
    logdir = cie.get_completed_step_evaluation_file('step1-simulation', 'episodes')
  File "/visualizer/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 398, in get_completed_step_evaluation_file
    return get_completed_step_evaluation_file(self.root, step_name, basename)
  File "/visualizer/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 427, in get_completed_step_evaluation_file
    available = get_completed_step_evaluation_files(root, step_name)
  File "/visualizer/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 421, in get_completed_step_evaluation_files
    raise InvalidEnvironment(msg)
InvalidEnvironment: Could not find dir /previous-steps/step1-simulation/challenge-evaluation-output
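Job 476 aborted because the visualization step ran without a completed step1-simulation available: the traceback shows get_completed_step_evaluation_file looking for the previous step's output under /previous-steps/step1-simulation/challenge-evaluation-output and raising InvalidEnvironment when that directory does not exist. Below is a simplified sketch of that lookup; the path layout and error text come from the log, while the function body and the locally defined exception are assumptions.

# Simplified sketch of the previous-step lookup that failed in job 476.
import os


class InvalidEnvironment(Exception):
    """Placeholder for duckietown_challenges' InvalidEnvironment."""


def get_completed_step_evaluation_file(root, step_name, basename):
    # Outputs of earlier pipeline steps are expected under
    # <root>/<step-name>/challenge-evaluation-output (per the error message).
    d = os.path.join(root, step_name, "challenge-evaluation-output")
    if not os.path.isdir(d):
        raise InvalidEnvironment("Could not find dir %s" % d)
    path = os.path.join(d, basename)
    if not os.path.exists(path):
        raise InvalidEnvironment("Could not find file %s" % path)
    return path


# The viz step in job 476 effectively asked for:
# get_completed_step_evaluation_file("/previous-steps", "step1-simulation", "episodes")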
475 | step1-simulation | aborted | no | 0:04:50
  other stats: simulation-passed: 1
397 | step3-videos | aborted | no | 0:00:50
  other stats: videos: 1
396 | step2-scoring | aborted | no | 0:00:14
  survival_time_median: 16.333333333333304
  other stats:
    episodes:
      details: {"ep000": {"nsteps": 490, "reward": -4.171412545077655, "good_angle": 14.379182228026304, "survival_time": 16.333333333333304, "traveled_tiles": 3, "valid_direction": 6.066666666666664}, "ep001": {"nsteps": 500, "reward": -1.0339719699919223, "good_angle": 52.42523924297719, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 7.133333333333309}, "ep002": {"nsteps": 500, "reward": -1.4501068297624589, "good_angle": 74.28098761715059, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 7.899999999999977}, "ep003": {"nsteps": 246, "reward": -4.999525907684148, "good_angle": 1.22724322608484, "survival_time": 8.199999999999982, "traveled_tiles": 1, "valid_direction": 2.6666666666666647}, "ep004": {"nsteps": 163, "reward": -7.094105166571637, "good_angle": 0.711536075115334, "survival_time": 5.433333333333324, "traveled_tiles": 1, "valid_direction": 2.0666666666666673}}
    good_angle_max: 74.28098761715059
    good_angle_mean: 28.604837677870854
    good_angle_median: 14.379182228026304
    good_angle_min: 0.711536075115334
    reward_max: -1.0339719699919223
    reward_mean: -3.749824483817565
    reward_median: -4.171412545077655
    reward_min: -7.094105166571637
    survival_time_max: 16.666666666666654
    survival_time_mean: 12.659999999999984
    survival_time_min: 5.433333333333324
    traveled_tiles_max: 3
    traveled_tiles_mean: 2.2
    traveled_tiles_median: 3
    traveled_tiles_min: 1
    valid_direction_max: 7.899999999999977
    valid_direction_mean: 5.166666666666656
    valid_direction_median: 6.066666666666664
    valid_direction_min: 2.0666666666666673
395 | step1-simulation | aborted | no | 0:02:39
  other stats: simulation-passed: 1