Duckietown Challenges

Submission 692

Submission: 692
Competing: yes
Challenge: aido1_LF1_r3-v3
User: Mandana Samiei 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 13400, step2-scoring: 13413, step3-videos: 13415, step4-viz: 13419
Next:
User label: ROS-based Lane Following
Admin priority: 50
Blessing: n/a
User priority: 50

Job 13419 (step4-viz): per-episode visualizations (ep000, ep001, ep002, ep003, ep004); click the images to see detailed statistics about each episode.

Job 13415 (step3-videos): per-episode videos (ep000, ep001, ep002, ep003, ep004).

Evaluation jobs for this submission

Columns: Job ID, step, status, up to date, date started, date completed, duration, message
Job 13419: step4-viz, success, up to date: yes, duration 0:01:39
driven_lanedir_median: 0.34313593269183285
deviation-center-line_median: 0.09724580597368736
in-drivable-lane_median: 0


other stats:
deviation-center-line_max: 1.9773514632296616
deviation-center-line_mean: 0.45188269718263846
deviation-center-line_min: 0.01866991914782249
deviation-heading_max: 1.114656919516871
deviation-heading_mean: 0.3941699734731935
deviation-heading_median: 0.2438711580236882
deviation-heading_min: 0.1362515798019552
driven_any_max: 5.335399996580846
driven_any_mean: 1.261934751187555
driven_any_median: 0.35334413210034665
driven_any_min: 0.08559260003666666
driven_lanedir_max: 3.297042777375582
driven_lanedir_mean: 0.8392510607205754
driven_lanedir_min: 0.03942047374687485
in-drivable-lane_max: 6.1666666666666465
in-drivable-lane_mean: 1.239999999999996
in-drivable-lane_min: 0
per-episodes details:
  ep000: {"driven_any": 5.335399996580846, "driven_lanedir": 3.297042777375582, "in-drivable-lane": 6.1666666666666465, "deviation-heading": 1.114656919516871, "deviation-center-line": 1.9773514632296616}
  ep001: {"driven_any": 0.35334413210034665, "driven_lanedir": 0.34313593269183285, "in-drivable-lane": 0, "deviation-heading": 0.22290041007221928, "deviation-center-line": 0.09724580597368736}
  ep002: {"driven_any": 0.08559260003666666, "driven_lanedir": 0.03942047374687485, "in-drivable-lane": 0.033333333333333326, "deviation-heading": 0.2531697999512338, "deviation-center-line": 0.01866991914782249}
  ep003: {"driven_any": 0.36408715549480714, "driven_lanedir": 0.3507915541299744, "in-drivable-lane": 0, "deviation-heading": 0.2438711580236882, "deviation-center-line": 0.10341019594875436}
  ep004: {"driven_any": 0.17124987172510894, "driven_lanedir": 0.16586456565861263, "in-drivable-lane": 0, "deviation-heading": 0.1362515798019552, "deviation-center-line": 0.06273610161326618}
Job 13415: step3-videos, success, up to date: yes, duration 0:01:02
other stats:
videos: 1
Job 13413: step2-scoring, success, up to date: yes, duration 0:00:23
survival_time_median: 1.166666666666667


other stats:
episodes details:
  ep000: {"nsteps": 500, "reward": 0.028394302139990033, "good_angle": 1.441985062222596, "survival_time": 16.666666666666654, "traveled_tiles": 10, "valid_direction": 2.76666666666668}
  ep001: {"nsteps": 35, "reward": -28.25103762420559, "good_angle": 0.12191323184594569, "survival_time": 1.166666666666667, "traveled_tiles": 2, "valid_direction": 0.2000000000000003}
  ep002: {"nsteps": 10, "reward": -100.43815182261169, "good_angle": 0.4186790906623381, "survival_time": 0.3333333333333333, "traveled_tiles": 1, "valid_direction": 0.3}
  ep003: {"nsteps": 36, "reward": -27.48564502120846, "good_angle": 0.15747205936686254, "survival_time": 1.2000000000000004, "traveled_tiles": 1, "valid_direction": 0.20000000000000043}
  ep004: {"nsteps": 18, "reward": -55.39126540230225, "good_angle": 0.047270287446045056, "survival_time": 0.6, "traveled_tiles": 2, "valid_direction": 0.09999999999999998}
good_angle_max: 1.441985062222596
good_angle_mean: 0.43746394630875746
good_angle_median: 0.15747205936686254
good_angle_min: 0.047270287446045056
reward_max: 0.028394302139990033
reward_mean: -42.3075411136376
reward_median: -28.25103762420559
reward_min: -100.43815182261169
survival_time_max: 16.666666666666654
survival_time_mean: 3.993333333333331
survival_time_min: 0.3333333333333333
traveled_tiles_max: 10
traveled_tiles_mean: 3.2
traveled_tiles_median: 2
traveled_tiles_min: 1
valid_direction_max: 2.76666666666668
valid_direction_mean: 0.7133333333333362
valid_direction_median: 0.20000000000000043
valid_direction_min: 0.09999999999999998
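One relation worth noting: every reported survival_time equals nsteps divided by 30, so the scoring is consistent with a 30 Hz simulation step (an inference from these numbers, not something stated on this page). A short sketch:

# nsteps per episode, copied from the episodes details above
nsteps = {"ep000": 500, "ep001": 35, "ep002": 10, "ep003": 36, "ep004": 18}

ASSUMED_STEP_RATE_HZ = 30.0  # assumption inferred from the reported values

for ep in sorted(nsteps):
    print(ep, nsteps[ep] / ASSUMED_STEP_RATE_HZ)
# ep000 16.66..., ep001 1.166..., ep002 0.333..., ep003 1.2, ep004 0.6
# These agree with the reported survival_time values up to floating-point rounding.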
Job 13400: step1-simulation, success, up to date: yes, duration 0:05:05
other stats:
simulation-passed: 1
Job 7126: step4-viz, success, up to date: yes, duration 0:01:25
Job 7121: step3-videos, success, up to date: yes, duration 0:00:47
Job 7119: step2-scoring, success, up to date: yes, duration 0:00:20
Job 7105: step1-simulation, success, up to date: no, duration 0:02:41
Job 6671: step1-simulation, aborted, up to date: no, duration 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 353, in go_
    os.makedirs(wd)
  File "/usr/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido1_LF1_r3-v3/submission692/step1-simulation-nutonomy-P50-2110-job6671'
Job 6472: step1-simulation, aborted, up to date: no, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 353, in go_
    os.makedirs(wd)
  File "/usr/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido1_LF1_r3-v3/submission692/step1-simulation-nutonomy-P50-2210-job6472'
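Both aborted jobs (6671 and 6472) failed the same way: the evaluator host had no free space left under /tmp when the runner tried to create the job's working directory. A minimal sketch of the kind of preflight check that would catch this before starting; the threshold and error message are illustrative and not taken from duckietown_challenges_runner:

import os

MIN_FREE_BYTES = 2 * 1024 ** 3  # illustrative 2 GiB threshold, not the runner's

def free_bytes(path):
    """Bytes available to unprivileged users on the filesystem holding path."""
    st = os.statvfs(path)
    return st.f_bavail * st.f_frsize

# Base of the working directories named in the tracebacks above.
workdir_base = "/tmp/duckietown/DT18/evaluator/executions"
if free_bytes("/tmp") < MIN_FREE_BYTES:
    raise RuntimeError(
        "refusing to create job directories under %s: "
        "less than 2 GiB free on /tmp" % workdir_base)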
Job 5413: step4-viz, success, up to date: no, duration 0:02:18
Job 5412: step3-videos, success, up to date: no, duration 0:02:07
Job 5411: step2-scoring, success, up to date: no, duration 0:00:16
Job 5410: step2-scoring, success, up to date: no, duration 0:02:13
Job 5409: step1-simulation, success, up to date: no, duration 0:08:19