Submission 503 (user 224): Julian Zilly, "Baseline solution using imitation learning from logs", challenge aido1_LF1_r3-v3
Step: step1-simulation, status: success, evaluator: idsc-rudolf-11141
Started 2018-10-24 11:43:13+00:00, completed 2018-10-24 11:47:59+00:00, duration 0:04:46
No reset possible

Submission 492
(user 267): Andrea Censi 🇨🇭, "Random execution", challenge aido1_LF1_r3-v3
Step: step2-scoring, status: success, evaluator: idsc-rudolf-11141
Started 2018-10-24 11:26:25+00:00, completed 2018-10-24 11:28:02+00:00, duration 0:01:37
survival_time_median: 4.866666666666659

Per-episode details:
{"ep000": {"nsteps": 262, "reward": -5.360776264106727, "good_angle": 3.8727163243445295, "survival_time": 8.733333333333313, "traveled_tiles": 3, "valid_direction": 3.3666666666666556},
 "ep001": {"nsteps": 206, "reward": -5.986628623424127, "good_angle": 23.322008728745647, "survival_time": 6.866666666666652, "traveled_tiles": 2, "valid_direction": 3.966666666666653},
 "ep002": {"nsteps": 146, "reward": -9.33520656824112, "good_angle": 25.90880075988976, "survival_time": 4.866666666666659, "traveled_tiles": 2, "valid_direction": 3.9333333333333256},
 "ep003": {"nsteps": 109, "reward": -9.662303757007926, "good_angle": 1.1208104660652094, "survival_time": 3.63333333333333, "traveled_tiles": 1, "valid_direction": 2.233333333333329},
 "ep004": {"nsteps": 82, "reward": -12.854354936402382, "good_angle": 0.6882428680029298, "survival_time": 2.7333333333333334, "traveled_tiles": 1, "valid_direction": 1.8}}

Aggregate statistics:
                  max                  mean                  median                min
good_angle        25.90880075988976    10.982515829409616    3.8727163243445295    0.6882428680029298
reward            -5.360776264106727   -8.639854029836457    -9.33520656824112     -12.854354936402382
survival_time     8.733333333333313    5.366666666666658     4.866666666666659     2.7333333333333334
traveled_tiles    3                    1.8                   2                     1
valid_direction   3.966666666666653    3.0599999999999925    3.3666666666666556    1.8
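The aggregate rows are derived directly from the per-episode JSON. A minimal sketch with the standard library, using the survival_time values copied from the episode details above:

```python
import statistics

# survival_time per episode, copied from the per-episode details above
survival_times = {
    "ep000": 8.733333333333313,
    "ep001": 6.866666666666652,
    "ep002": 4.866666666666659,
    "ep003": 3.63333333333333,
    "ep004": 2.7333333333333334,
}

values = list(survival_times.values())
print("median:", statistics.median(values))  # 4.866666666666659 (the ranking metric)
print("mean:  ", statistics.mean(values))
print("max:   ", max(values))
print("min:   ", min(values))
```

The same pattern applied to the other per-episode keys (reward, good_angle, traveled_tiles, valid_direction) reproduces the rest of the aggregate table.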
No reset possible

Submission 468
(user 188): Brian Wang 🇺🇸, "ROS-based Lane Following", challenge aido1_LF1_r3-v3
Step: step1-simulation, status: failed, evaluator: idsc-rudolf-11141
Started 2018-10-24 11:12:14+00:00, completed 2018-10-24 11:13:13+00:00, duration 0:00:59
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
No reset possible
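The failure above is a versioned environment-id mismatch: the submission requested Duckietown-Lf-Lfv-Navv-Silent-v1, but the evaluator only registers the -v0 variant. The real fix is simply to request the id the error message lists as valid; the sketch below makes the mismatch explicit at startup. Note that resolve_env_id is a hypothetical helper for illustration, not part of duckietown-challenges or gym:

```python
def resolve_env_id(requested: str, registered: list[str]) -> str:
    """Return a usable env id: the exact match if registered, otherwise the
    newest registered version of the same env family.
    Hypothetical helper -- not part of duckietown-challenges or gym."""
    if requested in registered:
        return requested
    # Split "Name-vN" into its family ("Name") and version suffix ("N").
    family = requested.rpartition("-v")[0]
    versions = sorted(
        int(env_id.rpartition("-v")[2])
        for env_id in registered
        if env_id.rpartition("-v")[0] == family
        and env_id.rpartition("-v")[2].isdigit()
    )
    if not versions:
        raise KeyError(f"no registered version of {family!r}")
    return f"{family}-v{versions[-1]}"

# The exact mismatch from the traceback above:
print(resolve_env_id("Duckietown-Lf-Lfv-Navv-Silent-v1",
                     ["Duckietown-Lf-Lfv-Navv-Silent-v0"]))
# -> Duckietown-Lf-Lfv-Navv-Silent-v0
```

Falling back silently across versions can mask real incompatibilities, so in a submission it is safer to pin the id the evaluator reports as valid and let anything else fail loudly.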