AI Driving Olympics

Submission 832

Submission: 832
Competing: yes
Challenge: aido1_LF1_r3-v3
User: heyt0ny
Date submitted:
Status: Complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 12275, step2-scoring: 12287, step3-videos: 12290, step4-viz: 12298
Next:
User label: SAIC MOSCOW MML
Admin priority: 50
Blessing: 50
User priority:

Job 12298 (step4-viz): episode visualization images. Click the images to see detailed statistics about the episode.

Job 12290 (step3-videos): episode videos.

Evaluation jobs for this submission

Job ID | submission | user | user label | challenge | step | status | up to date | evaluator | date started | date completed | duration | message
12298 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step4-viz | success | yes | - | | | 0:01:37 |

driven_lanedir_median: 1.6771841021538645
deviation-center-line_median: 0.1783283136710628
in-drivable-lane_median: 1.433333333333335


Other stats:
deviation-center-line_max: 0.515637096594444
deviation-center-line_mean: 0.2356417449451789
deviation-center-line_min: 0
deviation-heading_max: 1.4656693619568912
deviation-heading_mean: 0.7459655493715927
deviation-heading_median: 0.7745742164091756
deviation-heading_min: 0
driven_any_max: 6.3134671434570295
driven_any_mean: 3.4877442976286384
driven_any_median: 3.725514177372329
driven_any_min: 0.6568618121898631
driven_lanedir_max: 3.859421629992368
driven_lanedir_mean: 2.045330451352009
driven_lanedir_min: 0
in-drivable-lane_max: 3.999999999999992
in-drivable-lane_mean: 1.9266666666666643
in-drivable-lane_min: 0.6000000000000012
Per-episode details:

episode | driven_any | driven_lanedir | in-drivable-lane | deviation-heading | deviation-center-line
ep000 | 0.6568618121898631 | 0 | 1.166666666666667 | 0 | 0
ep001 | 6.3134671434570295 | 3.859421629992368 | 2.433333333333327 | 1.4656693619568912 | 0.515637096594444
ep002 | 4.707218426372725 | 3.254461726664029 | 1.433333333333335 | 1.1203534513583189 | 0.34943685020863585
ep003 | 2.035659928751244 | 1.6771841021538645 | 0.6000000000000012 | 0.3692307171335784 | 0.13480646425175163
ep004 | 3.725514177372329 | 1.4355847979497849 | 3.999999999999992 | 0.7745742164091756 | 0.1783283136710628
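The aggregate statistics can be checked against the per-episode values. A minimal Python sketch (episode values copied from the details above; this is a reader-side sanity check, not the challenge server's scoring code):

```python
from statistics import mean, median

# driven_lanedir per episode (ep000..ep004), copied from the per-episode details
driven_lanedir = [
    0,
    3.859421629992368,
    3.254461726664029,
    1.6771841021538645,
    1.4355847979497849,
]

print(median(driven_lanedir))  # 1.6771841021538645, the headline driven_lanedir_median
print(mean(driven_lanedir))    # ~2.04533, matching driven_lanedir_mean
```

The headline score is a median over five episodes, so a single crashed episode (ep000 drove 0 m in-lane) does not drag it down the way a mean would.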
12290 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step3-videos | success | yes | - | | | 0:01:48 |

Other stats:
videos: 1

12287 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step2-scoring | success | yes | - | | | 0:00:20 |

survival_time_median: 5.49999999999999


Other stats, per episode:

episode | nsteps | reward | good_angle | survival_time | traveled_tiles | valid_direction
ep000 | 36 | -29.487023785710335 | 0.31190937854029055 | 1.2000000000000004 | 2 | 0.6000000000000004
ep001 | 229 | -4.282930066459304 | 1.4287265284343271 | 7.633333333333316 | 12 | 2.3999999999999964
ep002 | 165 | -5.662128073592303 | 0.6266620386561079 | 5.49999999999999 | 9 | 0.8999999999999994
ep003 | 75 | -12.773614008923373 | 0.11498710295399482 | 2.500000000000001 | 4 | 0.03333333333333344
ep004 | 181 | -5.942324150456579 | 1.588377960926397 | 6.033333333333322 | 8 | 3.2666666666666626
good_angle_max: 1.588377960926397
good_angle_mean: 0.8141326019022234
good_angle_median: 0.6266620386561079
good_angle_min: 0.11498710295399482
reward_max: -4.282930066459304
reward_mean: -11.629604017028376
reward_median: -5.942324150456579
reward_min: -29.487023785710335
survival_time_max: 7.633333333333316
survival_time_mean: 4.573333333333325
survival_time_min: 1.2000000000000004
traveled_tiles_max: 12
traveled_tiles_mean: 7
traveled_tiles_median: 8
traveled_tiles_min: 2
valid_direction_max: 3.2666666666666626
valid_direction_mean: 1.4399999999999984
valid_direction_median: 0.8999999999999994
valid_direction_min: 0.03333333333333344
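As with the lane-direction stats, the scoring aggregates follow directly from the per-episode values. A short Python sketch (values copied from the scoring details above; again a reader-side check, not the official scorer):

```python
from statistics import mean, median

# survival_time and reward per episode (ep000..ep004), from the scoring details
survival_time = [1.2000000000000004, 7.633333333333316, 5.49999999999999,
                 2.500000000000001, 6.033333333333322]
reward = [-29.487023785710335, -4.282930066459304, -5.662128073592303,
          -12.773614008923373, -5.942324150456579]

print(median(survival_time))  # 5.49999999999999, the headline survival_time_median
print(mean(reward))           # ~-11.6296, matching reward_mean
```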
12275 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | success | yes | - | | | 0:06:57 |

Other stats:
simulation-passed: 1

6740 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | failed | no | - | | | 0:11:12 |
InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
    cie.wait_for_solution()
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
    raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
6272 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:10:34 |

InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml. (Same traceback as job 6740.)
6221 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:31:05 |
6201 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:35:14 |
6200 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:37:21 |
6181 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:33:44 |
6180 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | timeout | no | - | | | 0:33:29 |
6110 | 832 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | - | | | 0:10:35 |

InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml. (Same traceback as job 6740.)