Job 10656 · submission 173
Bhairav Mehta · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:22
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
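A minimal sketch of a version guard against exactly this failure: try the requested env id and fall back to the registered one. The env names come from the error above; the fallback helper itself is illustrative, not part of the challenge API.

    import gym

    def make_duckietown_env(preferred="Duckietown-Lf-Lfv-Navv-Silent-v1",
                            fallback="Duckietown-Lf-Lfv-Navv-Silent-v0"):
        try:
            return gym.make(preferred)
        except gym.error.Error:
            # The evaluator only registers v0, so fall back to it.
            return gym.make(fallback)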
No reset possible

Job 10643 · submission 217
Bhairav Mehta · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:34
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
No reset possible

Job 10629 · submission 224
Julian Zilly · Baseline solution using imitation learning from logs · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:45
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 581, in wrap_solution
solution.run(cis)
File "solution.py", line 88, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Can't connect to the gym-duckietown-server
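A minimal sketch of a pre-flight check for this "can't connect" failure mode: probe the simulation server before constructing the env, so a missing or slow evaluator container fails fast with a clear message. The host name comes from the companion errors in this log; the port number is an assumption.

    import socket
    import time

    def wait_for_server(host="evaluator", port=8902, budget_s=60):
        # Poll until a TCP connection succeeds or the budget runs out.
        deadline = time.time() + budget_s
        while time.time() < deadline:
            try:
                socket.create_connection((host, port), timeout=2).close()
                return True
            except OSError:
                time.sleep(1)  # the server container may still be starting
        return False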
No reset possible

Job 10577 · submission 330
Benjamin Ramtoula 🇨🇦 · PyTorch DDPG template · aido1_LF1_r3-v3
step1-simulation · error · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:12:45
Timeout:
Waited 653.343262911 for container to finish. Giving up.
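A minimal sketch of the evaluator-side pattern this Timeout message suggests: wait for the solution container up to a fixed budget, then give up and stop it. docker-py is assumed; the 600 s budget mirrors the waits reported throughout this log.

    import docker

    def wait_or_give_up(container_name, budget_s=600):
        client = docker.from_env()
        container = client.containers.get(container_name)
        try:
            # Raises a timeout error if the container has not exited in time.
            return container.wait(timeout=budget_s)
        except Exception:
            container.stop()
            raise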
No reset possible

Job 10540 · submission 352
Anton Mashikhin 🇷🇺 · AI DL RL MML XXXL 2k18 yoo · aido1_LF1_r3-v3
step1-simulation · error · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:10:43
Timeout:
Waited 602.075906992 for container to finish. Giving up.
No reset possible

Job 10526 · submission 364
Pravish Sainath 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:07
driven_lanedir_median 0.5595474272820549 deviation-center-line_median 0.28694183703818704 in-drivable-lane_median 3.266666666666655
other stats deviation-center-line_max 0.5604621640188293 deviation-center-line_mean 0.3088653044866211 deviation-center-line_min 0.15469694830615136 deviation-heading_max 3.6140301383805906 deviation-heading_mean 1.4974190774376648 deviation-heading_median 1.0668466076435914 deviation-heading_min 0.31545320326952786 driven_any_max 1.950092311294152 driven_any_mean 0.9747195659796196 driven_any_median 1.1829301176841596 driven_any_min 0.12776479806561683 driven_lanedir_max 0.9420818020679832 driven_lanedir_mean 0.4962804615613585 driven_lanedir_min 0.12241453975447893 in-drivable-lane_max 5.0333333333333155 in-drivable-lane_mean 2.579999999999991 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3747020105583748, "driven_lanedir": 0.1538791159023375, "in-drivable-lane": 0.5333333333333333, "deviation-heading": 1.598145774419718, "deviation-center-line": 0.1745379082770337}, "ep001": {"driven_any": 0.12776479806561683, "driven_lanedir": 0.12241453975447893, "in-drivable-lane": 0, "deviation-heading": 0.31545320326952786, "deviation-center-line": 0.15469694830615136}, "ep002": {"driven_any": 1.950092311294152, "driven_lanedir": 0.9420818020679832, "in-drivable-lane": 5.0333333333333155, "deviation-heading": 3.6140301383805906, "deviation-center-line": 0.5604621640188293}, "ep003": {"driven_any": 1.1829301176841596, "driven_lanedir": 0.5595474272820549, "in-drivable-lane": 4.066666666666652, "deviation-heading": 0.8926196634748961, "deviation-center-line": 0.28694183703818704}, "ep004": {"driven_any": 1.238108592295796, "driven_lanedir": 0.7034794227999377, "in-drivable-lane": 3.266666666666655, "deviation-heading": 1.0668466076435914, "deviation-center-line": 0.36768766479290393}}
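A minimal sketch showing how the *_min/*_max/*_mean/*_median summaries above can be recomputed from the per-episodes details JSON; only the Python stdlib is assumed.

    import json
    import statistics

    def summarize(per_episodes_json, key):
        # per_episodes_json: the {"ep000": {...}, ...} blob shown above.
        episodes = json.loads(per_episodes_json)
        values = [ep[key] for ep in episodes.values()]
        return {"min": min(values), "max": max(values),
                "mean": statistics.mean(values),
                "median": statistics.median(values)}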
No reset possible

Job 10515 · submission 360
Pravish Sainath 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:21
survival_time_median 7.599999999999983
other stats episodes details {"ep000": {"nsteps": 79, "reward": -12.958519177867949, "good_angle": 1.2841682992947514, "survival_time": 2.6333333333333337, "traveled_tiles": 1, "valid_direction": 2.1333333333333337}, "ep001": {"nsteps": 57, "reward": -18.10167970312269, "good_angle": 0.05340326991644695, "survival_time": 1.9000000000000024, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 228, "reward": -4.6792328577987, "good_angle": 1.929419517714582, "survival_time": 7.599999999999983, "traveled_tiles": 3, "valid_direction": 3.599999999999987}, "ep003": {"nsteps": 232, "reward": -5.362626247872209, "good_angle": 1.091317569886713, "survival_time": 7.733333333333316, "traveled_tiles": 3, "valid_direction": 4.533333333333317}, "ep004": {"nsteps": 236, "reward": -5.280892781806731, "good_angle": 8.315036035917688, "survival_time": 7.866666666666648, "traveled_tiles": 3, "valid_direction": 4.833333333333316}}good_angle_max 8.315036035917688 good_angle_mean 2.5346689385460364 good_angle_median 1.2841682992947514 good_angle_min 0.05340326991644695 reward_max -4.6792328577987 reward_mean -9.276590153693656 reward_median -5.362626247872209 reward_min -18.10167970312269 survival_time_max 7.866666666666648 survival_time_mean 5.546666666666657 survival_time_min 1.9000000000000024 traveled_tiles_max 3 traveled_tiles_mean 2.2 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 4.833333333333316 valid_direction_mean 3.0199999999999907 valid_direction_median 3.599999999999987 valid_direction_min 0
No reset possible

Job 10498 · submission 360
Pravish Sainath 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step1-simulation · success · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:06
No reset possible

Job 10477 · submission 392
Liam Paull 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:50
driven_lanedir_median 0.5710878225860856 deviation-center-line_median 0.247384429109263 in-drivable-lane_median 2.8333333333333233
other stats deviation-center-line_max 0.5035553322993236 deviation-center-line_mean 0.28248656738463157 deviation-center-line_min 0.14910829526168953 deviation-heading_max 3.347989546936417 deviation-heading_mean 1.461698882593587 deviation-heading_median 0.9516191374409037 deviation-heading_min 0.1658724944785747 driven_any_max 1.275524314857489 driven_any_mean 0.8380464569105286 driven_any_median 1.1030957281125682 driven_any_min 0.1913053757833091 driven_lanedir_max 0.6293731629942543 driven_lanedir_mean 0.4235395088966246 driven_lanedir_min 0.13222156212750225 in-drivable-lane_max 3.966666666666653 in-drivable-lane_mean 2.206666666666659 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3538945413067501, "driven_lanedir": 0.13222156212750225, "in-drivable-lane": 0.7333333333333333, "deviation-heading": 1.8958033720502327, "deviation-center-line": 0.1984014722048932}, "ep001": {"driven_any": 0.1913053757833091, "driven_lanedir": 0.1889002600497922, "in-drivable-lane": 0, "deviation-heading": 0.1658724944785747, "deviation-center-line": 0.14910829526168953}, "ep002": {"driven_any": 1.275524314857489, "driven_lanedir": 0.6293731629942543, "in-drivable-lane": 2.8333333333333233, "deviation-heading": 3.347989546936417, "deviation-center-line": 0.5035553322993236}, "ep003": {"driven_any": 1.266412324492526, "driven_lanedir": 0.5961147367254891, "in-drivable-lane": 3.966666666666653, "deviation-heading": 0.9516191374409037, "deviation-center-line": 0.31398330804798846}, "ep004": {"driven_any": 1.1030957281125682, "driven_lanedir": 0.5710878225860856, "in-drivable-lane": 3.4999999999999876, "deviation-heading": 0.947209862061806, "deviation-center-line": 0.247384429109263}}
No reset possible

Job 10464 · submission 392
Liam Paull 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:52
No reset possible

Job 10463 · submission 392
Liam Paull 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:14
survival_time_median 6.566666666666653
other stats episodes details {"ep000": {"nsteps": 77, "reward": -13.320760825407, "good_angle": 1.2576378238431154, "survival_time": 2.5666666666666673, "traveled_tiles": 1, "valid_direction": 2.166666666666667}, "ep001": {"nsteps": 43, "reward": -23.800569601530253, "good_angle": 0.02138122609946388, "survival_time": 1.4333333333333345, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 233, "reward": -4.918556525531852, "good_angle": 3.342051963448038, "survival_time": 7.766666666666649, "traveled_tiles": 3, "valid_direction": 5.766666666666652}, "ep003": {"nsteps": 222, "reward": -5.592750079032068, "good_angle": 1.0321873104354908, "survival_time": 7.399999999999984, "traveled_tiles": 3, "valid_direction": 4.999999999999984}, "ep004": {"nsteps": 197, "reward": -6.2597347586805725, "good_angle": 1.4186458328127616, "survival_time": 6.566666666666653, "traveled_tiles": 2, "valid_direction": 4.766666666666651}}good_angle_max 3.342051963448038 good_angle_mean 1.414380831327774 good_angle_median 1.2576378238431154 good_angle_min 0.02138122609946388 reward_max -4.918556525531852 reward_mean -10.77847435803635 reward_median -6.2597347586805725 reward_min -23.800569601530253 survival_time_max 7.766666666666649 survival_time_mean 5.146666666666658 survival_time_min 1.4333333333333345 traveled_tiles_max 3 traveled_tiles_mean 2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 5.766666666666652 valid_direction_mean 3.5399999999999907 valid_direction_median 4.766666666666651 valid_direction_min 0
No reset possible

Job 10438 · submission 392
Liam Paull 🇨🇦 · PyTorch template · aido1_LF1_r3-v3
step1-simulation · success · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:19
No reset possible

Job 10419 · submission 410
Liam Paull 🇨🇦 · Template for ROS Submission · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:03:54
driven_lanedir_median 0.4234837373444573 deviation-center-line_median 0.16483417063736874 in-drivable-lane_median 2.466666666666658
other stats deviation-center-line_max 0.28924845715410685 deviation-center-line_mean 0.19272202697959837 deviation-center-line_min 0.1253833219935629 deviation-heading_max 1.2301769728897156 deviation-heading_mean 0.9549473930107688 deviation-heading_median 1.0684684712414263 deviation-heading_min 0.477930265319624 driven_any_max 2.013832697585437 driven_any_mean 1.0185367564458176 driven_any_median 0.8697044725327119 driven_any_min 0.37472506406383016 driven_lanedir_max 0.6221646501296145 driven_lanedir_mean 0.4048640519105616 driven_lanedir_min 0.15083821255592256 in-drivable-lane_max 7.86666666666664 in-drivable-lane_mean 2.8799999999999906 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.37472506406383016, "driven_lanedir": 0.15083821255592256, "in-drivable-lane": 0.5666666666666669, "deviation-heading": 1.2301769728897156, "deviation-center-line": 0.1253833219935629}, "ep001": {"driven_any": 1.1960298271488394, "driven_lanedir": 0.4234837373444573, "in-drivable-lane": 3.4999999999999876, "deviation-heading": 1.1795678461784105, "deviation-center-line": 0.22259042055281617}, "ep002": {"driven_any": 0.6383917208982697, "driven_lanedir": 0.6221646501296145, "in-drivable-lane": 0, "deviation-heading": 0.477930265319624, "deviation-center-line": 0.16483417063736874}, "ep003": {"driven_any": 0.8697044725327119, "driven_lanedir": 0.39915843911274695, "in-drivable-lane": 2.466666666666658, "deviation-heading": 0.8185934094246683, "deviation-center-line": 0.16155376456013726}, "ep004": {"driven_any": 2.013832697585437, "driven_lanedir": 0.4286752204100668, "in-drivable-lane": 7.86666666666664, "deviation-heading": 1.0684684712414263, "deviation-center-line": 0.28924845715410685}}
No reset possible

Job 10413 · submission 401
Vadim Volodin 🇷🇺 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:04
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 89, in run
raise InvalidSubmission(str(e))
InvalidSubmission: error loading <rosparam> tag:
'param' attribute must be set for non-dictionary values
XML is <rosparam command="load" file="$(find duckietown)/config/$(arg config)/line_detector/$(arg node_name)/$(arg param_file_name).yaml"/>
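roslaunch raises this when the loaded YAML file's top level is not a dictionary, so there is no parameter name to attach the values to. A hedged one-line fix is to name the destination parameter explicitly, keeping the original file reference; the param name chosen here is illustrative:

    <rosparam command="load" param="$(arg node_name)" file="$(find duckietown)/config/$(arg config)/line_detector/$(arg node_name)/$(arg param_file_name).yaml"/>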
No reset possible

Job 10377 · submission 1268
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:11:00
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
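A minimal sketch of the wait that times out here: poll for the solution's output YAML until a fixed deadline passes. The path and the 600 s budget are taken from the error text; the helper itself is illustrative.

    import os
    import time

    def wait_for_file(path="/challenge-solution-output/output-solution.yaml",
                      budget_s=600, poll_s=2):
        deadline = time.time() + budget_s
        while time.time() < deadline:
            if os.path.exists(path):
                return True
            time.sleep(poll_s)
        raise RuntimeError("Timeout of %d while waiting for %s" % (budget_s, path))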
No reset possible

Job 10368 · submission 422
Jahanvi Kolte 🇺🇸 · Random execution · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:14
survival_time_median 8.49999999999998
other stats episodes details {"ep000": {"nsteps": 80, "reward": -12.748537062454853, "good_angle": 1.1868112025365083, "survival_time": 2.666666666666667, "traveled_tiles": 1, "valid_direction": 2.2666666666666666}, "ep001": {"nsteps": 68, "reward": -15.271285963409088, "good_angle": 0.015743744257526764, "survival_time": 2.2666666666666684, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 255, "reward": -4.408749323849585, "good_angle": 0.8258220654657238, "survival_time": 8.49999999999998, "traveled_tiles": 3, "valid_direction": 1.9666666666666608}, "ep003": {"nsteps": 455, "reward": -2.205750543997907, "good_angle": 1.0273499729886408, "survival_time": 15.166666666666623, "traveled_tiles": 4, "valid_direction": 2.0999999999999925}, "ep004": {"nsteps": 343, "reward": -3.5153058469224225, "good_angle": 18.91386569355628, "survival_time": 11.433333333333303, "traveled_tiles": 3, "valid_direction": 3.866666666666654}}good_angle_max 18.91386569355628 good_angle_mean 4.393918535760935 good_angle_median 1.0273499729886408 good_angle_min 0.015743744257526764 reward_max -2.205750543997907 reward_mean -7.629925748126771 reward_median -4.408749323849585 reward_min -15.271285963409088 survival_time_max 15.166666666666623 survival_time_mean 8.006666666666648 survival_time_min 2.2666666666666684 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 3.866666666666654 valid_direction_mean 2.0399999999999947 valid_direction_median 2.0999999999999925 valid_direction_min 0
No reset possible

Job 10345 · submission 440
Orlando Marquez 🇨🇦 · PyTorch DDPG template · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:57
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.6382543184608221, "good_angle": 26.18110894363119, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 11.699999999999978}, "ep001": {"nsteps": 500, "reward": -0.4151158498296281, "good_angle": 13.553417258114488, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 12.89999999999999}, "ep002": {"nsteps": 500, "reward": -0.2500560587957734, "good_angle": 13.658932232202693, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 13.033333333333326}, "ep003": {"nsteps": 500, "reward": -0.17730202129750977, "good_angle": 13.64760310090124, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.999999999999991}, "ep004": {"nsteps": 500, "reward": -0.29369213936530286, "good_angle": 13.728997991349493, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.96666666666666}}good_angle_max 26.18110894363119 good_angle_mean 16.15401190523982 good_angle_median 13.658932232202693 good_angle_min 13.553417258114488 reward_max -0.17730202129750977 reward_mean -0.3548840775498073 reward_median -0.29369213936530286 reward_min -0.6382543184608221 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 13.033333333333326 valid_direction_mean 12.719999999999988 valid_direction_median 12.96666666666666 valid_direction_min 11.699999999999978
No reset possible

Job 10303 · submission 464
Orlando Marquez 🇨🇦 · PyTorch DDPG template · aido1_LF1_r3-v3
step4-viz · error · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:16:25
Timeout:
Waited 601.376960039 for container to finish. Giving up.
No reset possible

Job 10286 · submission 447
Jonathan Plante 🇨🇦 · Random execution · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:23
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 64, in run
solve(gym_environment, cis) # let's try to solve the challenge, exciting ah?
File "solution.py", line 39, in solve
observation, reward, done, info = env.step(action)
File "/usr/local/lib/python2.7/dist-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/notebooks/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible

Job 10253 · submission 476
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:09:52
driven_lanedir_median 0.7799355320280401 deviation-center-line_median 0.26096492222089973 in-drivable-lane_median 0
other stats deviation-center-line_max 0.32531904507525244 deviation-center-line_mean 0.2203781292734231 deviation-center-line_min 0.114983175928838 deviation-heading_max 1.3754934906840324 deviation-heading_mean 0.7095561500373504 deviation-heading_median 0.5277183602353276 deviation-heading_min 0.10877298330675804 driven_any_max 2.113619834797092 driven_any_mean 0.9028962122718814 driven_any_median 0.7849332026871493 driven_any_min 0.2299417322410001 driven_lanedir_max 2.048265975633722 driven_lanedir_mean 0.8413852869243291 driven_lanedir_min 0.12534079152165456 in-drivable-lane_max 0.3333333333333333 in-drivable-lane_mean 0.0733333333333333 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.34541416159621446, "driven_lanedir": 0.12534079152165456, "in-drivable-lane": 0.3333333333333333, "deviation-heading": 1.3754934906840324, "deviation-center-line": 0.13033360016176446}, "ep001": {"driven_any": 0.2299417322410001, "driven_lanedir": 0.22851367509441145, "in-drivable-lane": 0, "deviation-heading": 0.10877298330675804, "deviation-center-line": 0.114983175928838}, "ep002": {"driven_any": 1.0405721300379518, "driven_lanedir": 1.0248704603438177, "in-drivable-lane": 0, "deviation-heading": 0.5277183602353276, "deviation-center-line": 0.32531904507525244}, "ep003": {"driven_any": 2.113619834797092, "driven_lanedir": 2.048265975633722, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 1.2492566360037414, "deviation-center-line": 0.27028990298036065}, "ep004": {"driven_any": 0.7849332026871493, "driven_lanedir": 0.7799355320280401, "in-drivable-lane": 0, "deviation-heading": 0.2865392799568926, "deviation-center-line": 0.26096492222089973}}
No reset possible

Job 10239 · submission 476
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:05:34
No reset possible

Job 10224 · submission 474
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:52
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 59, in run
raise InvalidSubmission(str(e))
InvalidSubmission: local variable 'a' referenced before assignment
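A minimal sketch of the classic bug behind this message, with names chosen purely for illustration: a variable bound only inside a branch and then read unconditionally.

    def choose_action(observation):
        if observation is not None:
            a = [0.1, 0.0]
        return a  # UnboundLocalError ("referenced before assignment") if the branch is skipped

    def choose_action_fixed(observation):
        a = [0.0, 0.0]  # bind a safe default first
        if observation is not None:
            a = [0.1, 0.0]
        return a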
No reset possible

Job 10220 · submission 477
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step1-simulation · aborted · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:03
Error while running [...]
Pulling evaluator ... error
stderr | ERROR: for evaluator Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible

Job 10169 · submission 503
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step3-videos · error · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:15:18
Timeout:
Waited 603.191922903 for container to finish. Giving up.
No reset possible

Job 10156 · submission 500
Anton Mashikhin 🇷🇺 · AI DL RL MML XXXL 2k18 yoo · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:26
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 94, in run
solve(params, cis)
File "solution.py", line 59, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/wrappers.py", line 92, in step
ob, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible

Job 10142 · submission 508
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:24
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 46, "reward": -22.89030233349489, "good_angle": 2.374591976676323, "survival_time": 1.5333333333333348, "traveled_tiles": 1, "valid_direction": 1.366666666666668}, "ep001": {"nsteps": 500, "reward": -2.828535573657602, "good_angle": 17.08240173076721, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 16.599999999999984}, "ep002": {"nsteps": 381, "reward": -5.436100737414178, "good_angle": 19.811794049954475, "survival_time": 12.699999999999966, "traveled_tiles": 2, "valid_direction": 12.633333333333296}, "ep003": {"nsteps": 500, "reward": -2.8611773381936363, "good_angle": 17.637656044536502, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 16.599999999999984}, "ep004": {"nsteps": 500, "reward": -2.8609809123687446, "good_angle": 17.428966619126555, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 16.599999999999984}}good_angle_max 19.811794049954475 good_angle_mean 14.867082084212214 good_angle_median 17.428966619126555 good_angle_min 2.374591976676323 reward_max -2.828535573657602 reward_mean -7.37541937902581 reward_median -2.8611773381936363 reward_min -22.89030233349489 survival_time_max 16.666666666666654 survival_time_mean 12.846666666666652 survival_time_min 1.5333333333333348 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 16.599999999999984 valid_direction_mean 12.759999999999982 valid_direction_median 16.599999999999984 valid_direction_min 1.366666666666668
No reset possible

Job 10088 · submission 530
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step3-videos · error · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:13:15
Timeout:
Waited 602.514548063 for container to finish. Giving up.
No reset possible

Job 10067 · submission 530
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:23
survival_time_median 9.033333333333312
other stats episodes details {"ep000": {"nsteps": 271, "reward": -4.996563631450118, "good_angle": 1.1005898013252264, "survival_time": 9.033333333333312, "traveled_tiles": 4, "valid_direction": 2.2999999999999936}, "ep001": {"nsteps": 284, "reward": -4.302450111462131, "good_angle": 16.31835594891589, "survival_time": 9.466666666666644, "traveled_tiles": 5, "valid_direction": 5.933333333333316}, "ep002": {"nsteps": 149, "reward": -7.720443555126169, "good_angle": 1.9595474474189016, "survival_time": 4.966666666666659, "traveled_tiles": 4, "valid_direction": 3.6333333333333258}, "ep003": {"nsteps": 183, "reward": -6.0104711054572375, "good_angle": 1.28250587874637, "survival_time": 6.099999999999988, "traveled_tiles": 3, "valid_direction": 1.4999999999999951}, "ep004": {"nsteps": 500, "reward": -0.4445230919736205, "good_angle": 11.891637649787322, "survival_time": 16.666666666666654, "traveled_tiles": 6, "valid_direction": 2.8999999999999915}}good_angle_max 16.31835594891589 good_angle_mean 6.510527345238742 good_angle_median 1.9595474474189016 good_angle_min 1.1005898013252264 reward_max -0.4445230919736205 reward_mean -4.6948902990938555 reward_median -4.996563631450118 reward_min -7.720443555126169 survival_time_max 16.666666666666654 survival_time_mean 9.246666666666652 survival_time_min 4.966666666666659 traveled_tiles_max 6 traveled_tiles_mean 4.4 traveled_tiles_median 4 traveled_tiles_min 3 valid_direction_max 5.933333333333316 valid_direction_mean 3.2533333333333245 valid_direction_median 2.8999999999999915 valid_direction_min 1.4999999999999951
No reset possible

Job 10026 · submission 555
Dzenan Lapandic · Baseline solution using imitation learning from logs · aido1_LFV_r1-v3
step3-videos · error · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:13:32
Timeout:
Waited 601.110584021 for container to finish. Giving up.
No reset possible

Job 10014 · submission 555
Dzenan Lapandic · Baseline solution using imitation learning from logs · aido1_LFV_r1-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:03:00
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 203, "reward": -7.517153282177272, "good_angle": 18.773294719833164, "survival_time": 6.766666666666652, "traveled_tiles": 1, "valid_direction": 5.699999999999986}, "ep001": {"nsteps": 500, "reward": -0.9862548477053642, "good_angle": 15.15010660837492, "survival_time": 16.666666666666654, "traveled_tiles": 5, "valid_direction": 4.6000000000000165}, "ep002": {"nsteps": 500, "reward": -0.36153481669770554, "good_angle": 17.143092288140753, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 4.099999999999985}, "ep003": {"nsteps": 497, "reward": -2.535689466121331, "good_angle": 16.606684416108454, "survival_time": 16.56666666666665, "traveled_tiles": 5, "valid_direction": 3.266666666666655}, "ep004": {"nsteps": 500, "reward": -0.38193090801686047, "good_angle": 16.47307776883877, "survival_time": 16.666666666666654, "traveled_tiles": 5, "valid_direction": 3.566666666666654}}good_angle_max 18.773294719833164 good_angle_mean 16.82925116025921 good_angle_median 16.606684416108454 good_angle_min 15.15010660837492 reward_max -0.36153481669770554 reward_mean -2.356512664143706 reward_median -0.9862548477053642 reward_min -7.517153282177272 survival_time_max 16.666666666666654 survival_time_mean 14.666666666666652 survival_time_min 6.766666666666652 traveled_tiles_max 5 traveled_tiles_mean 4 traveled_tiles_median 5 traveled_tiles_min 1 valid_direction_max 5.699999999999986 valid_direction_mean 4.24666666666666 valid_direction_median 4.099999999999985 valid_direction_min 3.266666666666655
No reset possible

Job 9953 · submission 555
Dzenan Lapandic · Baseline solution using imitation learning from logs · aido1_LFV_r1-v3
step1-simulation · success · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:13:25
No reset possible

Job 9929 · submission 577
Jonathan Plante 🇨🇦 · JP pipeline · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:05:38
driven_lanedir_median 0.556511482949603 deviation-center-line_median 0.27999622831039583 in-drivable-lane_median 1.8999999999999948
other stats deviation-center-line_max 1.0923573999517773 deviation-center-line_mean 0.3762508402977914 deviation-center-line_min 0.01894951343622136 deviation-heading_max 2.812228279935465 deviation-heading_mean 1.1750749123509934 deviation-heading_median 0.7236008447660953 deviation-heading_min 0.2547098672697907 driven_any_max 4.014281751451181 driven_any_mean 1.7571030075714682 driven_any_median 1.3053964573512806 driven_any_min 0.8694291840591771 driven_lanedir_max 1.9536100225069195 driven_lanedir_mean 0.7749555096591664 driven_lanedir_min 0.0011339543734978363 in-drivable-lane_max 7.899999999999973 in-drivable-lane_mean 3.3799999999999906 in-drivable-lane_min 0.6333333333333311 per-episodes details {"ep000": {"driven_any": 1.3991649606908276, "driven_lanedir": 0.0011339543734978363, "in-drivable-lane": 4.599999999999993, "deviation-heading": 0.2547098672697907, "deviation-center-line": 0.01894951343622136}, "ep001": {"driven_any": 1.3053964573512806, "driven_lanedir": 0.9462014920882916, "in-drivable-lane": 0.6333333333333311, "deviation-heading": 1.4360808982249993, "deviation-center-line": 0.3276055880041256}, "ep002": {"driven_any": 0.8694291840591771, "driven_lanedir": 0.4173205963775197, "in-drivable-lane": 1.8999999999999948, "deviation-heading": 0.7236008447660953, "deviation-center-line": 0.16234547178643677}, "ep003": {"driven_any": 1.1972426843048751, "driven_lanedir": 0.556511482949603, "in-drivable-lane": 1.86666666666666, "deviation-heading": 0.6487546715586159, "deviation-center-line": 0.27999622831039583}, "ep004": {"driven_any": 4.014281751451181, "driven_lanedir": 1.9536100225069195, "in-drivable-lane": 7.899999999999973, "deviation-heading": 2.812228279935465, "deviation-center-line": 1.0923573999517773}}
No reset possible

Job 9899 · submission 589
Allen Ou · Random execution · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:05:44
driven_lanedir_median 1.1345792827291852 deviation-center-line_median 0.5295830920597379 in-drivable-lane_median 0.13333333333333286
other stats deviation-center-line_max 0.7378207006812516 deviation-center-line_mean 0.4900399484331072 deviation-center-line_min 0.20311558318368145 deviation-heading_max 3.808159527676094 deviation-heading_mean 1.6781475948542628 deviation-heading_median 1.4899546091755629 deviation-heading_min 0.17127441041787572 driven_any_max 2.1153121209904326 driven_any_mean 1.1386097563326252 driven_any_median 1.176546503372836 driven_any_min 0.35906447693458193 driven_lanedir_max 2.0492581688837292 driven_lanedir_mean 1.0021894977163797 driven_lanedir_min 0.14214400169969954 in-drivable-lane_max 0.9666666666666632 in-drivable-lane_mean 0.35999999999999915 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.35906447693458193, "driven_lanedir": 0.14214400169969954, "in-drivable-lane": 0.6, "deviation-heading": 2.0571874727373767, "deviation-center-line": 0.20311558318368145}, "ep001": {"driven_any": 0.426843205820173, "driven_lanedir": 0.4261321804702509, "in-drivable-lane": 0, "deviation-heading": 0.17127441041787572, "deviation-center-line": 0.33548596571987577}, "ep002": {"driven_any": 1.176546503372836, "driven_lanedir": 1.1345792827291852, "in-drivable-lane": 0.13333333333333286, "deviation-heading": 0.8641619542644038, "deviation-center-line": 0.644194400520989}, "ep003": {"driven_any": 2.1153121209904326, "driven_lanedir": 2.0492581688837292, "in-drivable-lane": 0.09999999999999964, "deviation-heading": 1.4899546091755629, "deviation-center-line": 0.5295830920597379}, "ep004": {"driven_any": 1.6152824745451018, "driven_lanedir": 1.2588338547990334, "in-drivable-lane": 0.9666666666666632, "deviation-heading": 3.808159527676094, "deviation-center-line": 0.7378207006812516}}
No reset possible

Job 9883 · submission 589
Allen Ou · Random execution · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:05:32
No reset possible

Job 9825 · submission 611
David Abraham · Random execution · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:09:11
driven_lanedir_median 0.7872823522902446 deviation-center-line_median 0.4924466032907536 in-drivable-lane_median 0.6
other stats deviation-center-line_max 0.9340264091312448 deviation-center-line_mean 0.4707611002268899 deviation-center-line_min 0.20408684322219023 deviation-heading_max 3.4179149684912975 deviation-heading_mean 1.5413294630833605 deviation-heading_median 1.7278723332621622 deviation-heading_min 0.1718557269049959 driven_any_max 2.126250746723231 driven_any_mean 1.031475547026967 driven_any_median 0.7886940123975841 driven_any_min 0.28712661499059744 driven_lanedir_max 1.7778433931464237 driven_lanedir_mean 0.850335199014679 driven_lanedir_min 0.13984334205645788 in-drivable-lane_max 1.8333333333333268 in-drivable-lane_mean 0.6933333333333314 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.35862063448435333, "driven_lanedir": 0.13984334205645788, "in-drivable-lane": 0.6, "deviation-heading": 2.0712689055952787, "deviation-center-line": 0.20408684322219023}, "ep001": {"driven_any": 0.28712661499059744, "driven_lanedir": 0.28610137099865884, "in-drivable-lane": 0, "deviation-heading": 0.1718557269049959, "deviation-center-line": 0.2267554337778766}, "ep002": {"driven_any": 0.7886940123975841, "driven_lanedir": 0.7872823522902446, "in-drivable-lane": 0, "deviation-heading": 0.31773538116306804, "deviation-center-line": 0.4964902117123841}, "ep003": {"driven_any": 2.126250746723231, "driven_lanedir": 1.7778433931464237, "in-drivable-lane": 1.0333333333333297, "deviation-heading": 3.4179149684912975, "deviation-center-line": 0.9340264091312448}, "ep004": {"driven_any": 1.596685726539069, "driven_lanedir": 1.2606055365816102, "in-drivable-lane": 1.8333333333333268, "deviation-heading": 1.7278723332621622, "deviation-center-line": 0.4924466032907536}}
No reset possible

Job 9698 · submission 1264
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:06:40
survival_time_median 2.4666666666666677
other stats episodes details {"ep000": {"nsteps": 244, "reward": -4.374022879227462, "good_angle": 0.6697246300368211, "survival_time": 8.133333333333315, "traveled_tiles": 2, "valid_direction": 1.5999999999999943}, "ep001": {"nsteps": 70, "reward": -14.862278394613949, "good_angle": 0.02721982411650538, "survival_time": 2.333333333333335, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 46, "reward": -22.19249565705009, "good_angle": 0.5394539707593838, "survival_time": 1.5333333333333348, "traveled_tiles": 2, "valid_direction": 0.6333333333333349}, "ep003": {"nsteps": 127, "reward": -8.308228122143765, "good_angle": 0.31993870085218107, "survival_time": 4.233333333333328, "traveled_tiles": 2, "valid_direction": 0.4999999999999988}, "ep004": {"nsteps": 74, "reward": -14.009329262617472, "good_angle": 0.30200732283866505, "survival_time": 2.4666666666666677, "traveled_tiles": 1, "valid_direction": 0.7}}good_angle_max 0.6697246300368211 good_angle_mean 0.37166888972071127 good_angle_median 0.31993870085218107 good_angle_min 0.02721982411650538 reward_max -4.374022879227462 reward_mean -12.749270863130548 reward_median -14.009329262617472 reward_min -22.19249565705009 survival_time_max 8.133333333333315 survival_time_mean 3.739999999999996 survival_time_min 1.5333333333333348 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.5999999999999943 valid_direction_mean 0.6866666666666656 valid_direction_median 0.6333333333333349 valid_direction_min 0
No reset possible

Job 9696 · submission 1263
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:24
driven_lanedir_median 0.2579538234338239 deviation-center-line_median 0.23259959175930447 in-drivable-lane_median 0
other stats deviation-center-line_max 0.37115750725571234 deviation-center-line_mean 0.25213306400442015 deviation-center-line_min 0.13936827322860265 deviation-heading_max 1.8281884765517609 deviation-heading_mean 0.7894623611677554 deviation-heading_median 0.5856311002918942 deviation-heading_min 0.20912261203556157 driven_any_max 0.7643249106749713 driven_any_mean 0.37929668685291495 driven_any_median 0.2714239367291276 driven_any_min 0.17856280709133762 driven_lanedir_max 0.5207870063831364 driven_lanedir_mean 0.32107774167930886 driven_lanedir_min 0.15957286305009477 in-drivable-lane_max 1.8666666666666691 in-drivable-lane_mean 0.37333333333333385 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7643249106749713, "driven_lanedir": 0.5207870063831364, "in-drivable-lane": 1.8666666666666691, "deviation-heading": 1.8281884765517609, "deviation-center-line": 0.3268927272324219}, "ep001": {"driven_any": 0.1857222782258954, "driven_lanedir": 0.1841615106914234, "in-drivable-lane": 0, "deviation-heading": 0.20912261203556157, "deviation-center-line": 0.19064722054605937}, "ep002": {"driven_any": 0.17856280709133762, "driven_lanedir": 0.15957286305009477, "in-drivable-lane": 0, "deviation-heading": 0.5856311002918942, "deviation-center-line": 0.13936827322860265}, "ep003": {"driven_any": 0.4964495015432431, "driven_lanedir": 0.48291350483806594, "in-drivable-lane": 0, "deviation-heading": 0.7469149400415147, "deviation-center-line": 0.37115750725571234}, "ep004": {"driven_any": 0.2714239367291276, "driven_lanedir": 0.2579538234338239, "in-drivable-lane": 0, "deviation-heading": 0.5774546769180455, "deviation-center-line": 0.23259959175930447}}
No reset possible

Job 9694 · submission 1263
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:00:46
survival_time_median 2.6000000000000005
other stats episodes details {"ep000": {"nsteps": 216, "reward": -4.779693161937757, "good_angle": 0.8930417034838899, "survival_time": 7.199999999999984, "traveled_tiles": 2, "valid_direction": 1.233333333333329}, "ep001": {"nsteps": 54, "reward": -19.074678830526494, "good_angle": 0.030804110510028737, "survival_time": 1.8000000000000025, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 52, "reward": -19.67153688061696, "good_angle": 0.4558554753518166, "survival_time": 1.7333333333333354, "traveled_tiles": 2, "valid_direction": 0.6333333333333351}, "ep003": {"nsteps": 141, "reward": -7.491190272812725, "good_angle": 0.3361085678973663, "survival_time": 4.699999999999993, "traveled_tiles": 2, "valid_direction": 0.36666666666666536}, "ep004": {"nsteps": 78, "reward": -13.293082039325665, "good_angle": 0.3197428499284079, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 0.5999999999999979}}good_angle_max 0.8930417034838899 good_angle_mean 0.40711054143430186 good_angle_median 0.3361085678973663 good_angle_min 0.030804110510028737 reward_max -4.779693161937757 reward_mean -12.86203623704392 reward_median -13.293082039325665 reward_min -19.67153688061696 survival_time_max 7.199999999999984 survival_time_mean 3.6066666666666634 survival_time_min 1.7333333333333354 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.233333333333329 valid_direction_mean 0.5666666666666654 valid_direction_median 0.5999999999999979 valid_direction_min 0
No reset possible

Job 9691 · submission 1262
Bhairav Mehta · ROS-based Lane Following · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:31
No reset possible

Job 9688 · submission 1262
Bhairav Mehta · ROS-based Lane Following · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:01:14
survival_time_median 2.3000000000000016
other stats episodes details {"ep000": {"nsteps": 227, "reward": -4.550470444116852, "good_angle": 0.7578883000350387, "survival_time": 7.5666666666666496, "traveled_tiles": 2, "valid_direction": 1.0666666666666635}, "ep001": {"nsteps": 53, "reward": -19.43188279192403, "good_angle": 0.03328980550980436, "survival_time": 1.7666666666666688, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 122, "reward": -8.659334487113796, "good_angle": 0.2732274925487942, "survival_time": 4.066666666666662, "traveled_tiles": 2, "valid_direction": 0.36666666666666536}, "ep003": {"nsteps": 61, "reward": -16.777097997118215, "good_angle": 0.4679190169094889, "survival_time": 2.033333333333336, "traveled_tiles": 1, "valid_direction": 0.9666666666666692}, "ep004": {"nsteps": 69, "reward": -14.972821211469348, "good_angle": 0.37034570333692146, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0.5999999999999996}}good_angle_max 0.7578883000350387 good_angle_mean 0.3805340636680095 good_angle_median 0.37034570333692146 good_angle_min 0.03328980550980436 reward_max -4.550470444116852 reward_mean -12.87832138634845 reward_median -14.972821211469348 reward_min -19.43188279192403 survival_time_max 7.5666666666666496 survival_time_mean 3.5466666666666633 survival_time_min 1.7666666666666688 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.0666666666666635 valid_direction_mean 0.5999999999999995 valid_direction_median 0.5999999999999996 valid_direction_min 0
No reset possible

Job 9686 · submission 1258
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:09:16
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible

Job 9683 · submission 1261
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · failed · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:08:15
InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible

Job 9682 · submission 1257
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step2-scoring · aborted · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:00:56
Error while running [...]
Creating job9682-5397_scorer_1 ... error
stderr | ERROR: for job9682-5397_scorer_1 Cannot start service scorer: network job9682-5397_evaluation not found
stderr |
stderr | ERROR: for scorer Cannot start service scorer: network job9682-5397_evaluation not found
stderr | Encountered errors while bringing up the project.
stderr |
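A minimal defensive sketch for this failure mode: make sure the per-job network exists before bringing the scorer up. docker-py is assumed; the network name is copied from the log above.

    import docker

    def ensure_network(name="job9682-5397_evaluation"):
        client = docker.from_env()
        try:
            return client.networks.get(name)
        except docker.errors.NotFound:
            # Recreate the compose network that was torn down too early.
            return client.networks.create(name, driver="bridge")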
No reset possible

Job 9680 · submission 1257
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · success · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:42
No reset possible

Job 9679 · submission 1256
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step4-viz · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:39
driven_lanedir_median 0.329163691620562 deviation-center-line_median 0.35827461205411315 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3937814135164912 deviation-center-line_mean 0.291213522879301 deviation-center-line_min 0.13450398548614903 deviation-heading_max 1.7665900011038165 deviation-heading_mean 0.8255861889551109 deviation-heading_median 0.6007002310416163 deviation-heading_min 0.4156529767389977 driven_any_max 0.7928861093818448 driven_any_mean 0.3971517271419362 driven_any_median 0.33572577130684705 driven_any_min 0.16428089881466457 driven_lanedir_max 0.5553370718712161 driven_lanedir_mean 0.3382078713840022 driven_lanedir_min 0.14698925329475143 in-drivable-lane_max 1.8666666666666691 in-drivable-lane_mean 0.37333333333333385 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7928861093818448, "driven_lanedir": 0.5553370718712161, "in-drivable-lane": 1.8666666666666691, "deviation-heading": 1.7665900011038165, "deviation-center-line": 0.35827461205411315}, "ep001": {"driven_any": 0.33572577130684705, "driven_lanedir": 0.329163691620562, "in-drivable-lane": 0, "deviation-heading": 0.4156529767389977, "deviation-center-line": 0.36163658004759613}, "ep002": {"driven_any": 0.16428089881466457, "driven_lanedir": 0.14698925329475143, "in-drivable-lane": 0, "deviation-heading": 0.5629682504671758, "deviation-center-line": 0.13450398548614903}, "ep003": {"driven_any": 0.44286019832024265, "driven_lanedir": 0.4279002685926354, "in-drivable-lane": 0, "deviation-heading": 0.7820194854239486, "deviation-center-line": 0.3937814135164912}, "ep004": {"driven_any": 0.250005657886082, "driven_lanedir": 0.23164907154084613, "in-drivable-lane": 0, "deviation-heading": 0.6007002310416163, "deviation-center-line": 0.2078710232921555}}
No reset possible

Job 9676 · submission 1256
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:02:15
No reset possible

Job 9675 · submission 1256
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step2-scoring · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:00:56
survival_time_median 3.1999999999999984
other stats episodes details {"ep000": {"nsteps": 224, "reward": -4.6342155636666575, "good_angle": 0.7731504622031996, "survival_time": 7.46666666666665, "traveled_tiles": 2, "valid_direction": 1.4999999999999951}, "ep001": {"nsteps": 96, "reward": -11.020615682937205, "good_angle": 0.1791175555044248, "survival_time": 3.1999999999999984, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 48, "reward": -21.29593548985819, "good_angle": 0.40772748053506686, "survival_time": 1.6000000000000016, "traveled_tiles": 2, "valid_direction": 0.5666666666666684}, "ep003": {"nsteps": 126, "reward": -8.428223190622198, "good_angle": 0.3376643428838403, "survival_time": 4.199999999999995, "traveled_tiles": 2, "valid_direction": 0.5999999999999996}, "ep004": {"nsteps": 72, "reward": -14.353482292758097, "good_angle": 0.4188168584270678, "survival_time": 2.4000000000000012, "traveled_tiles": 1, "valid_direction": 0.5333333333333321}}good_angle_max 0.7731504622031996 good_angle_mean 0.4232953399107198 good_angle_median 0.40772748053506686 good_angle_min 0.1791175555044248 reward_max -4.6342155636666575 reward_mean -11.94649444396847 reward_median -11.020615682937205 reward_min -21.29593548985819 survival_time_max 7.46666666666665 survival_time_mean 3.773333333333329 survival_time_min 1.6000000000000016 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.4999999999999951 valid_direction_mean 0.6933333333333322 valid_direction_median 0.5666666666666684 valid_direction_min 0.2666666666666657
No reset possible

Job 9673 · submission 1256
Gunshi Gupta 🇨🇦 · ROS-based Lane Following · aido1_LF1_r3-v3
step1-simulation · success · no · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:04:54
No reset possible

Job 9669 · submission 1252
Anton Mashikhin 🇷🇺 · SAIC MOSCOW MML · aido1_LF1_r3-v3
step3-videos · success · yes · s-MacBook-puro.local-13653
6 years, 8 months · 6 years, 8 months · 0:03:30
No reset possible 9665
1252
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:16:26
No reset possible 9664
1253
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:02 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 359, in go_
try_s3(aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 969, in try_s3
s3_object.upload_fileobj(data)
File "/usr/local/lib/python2.7/dist-packages/boto3/s3/inject.py", line 621, in object_upload_fileobj
ExtraArgs=ExtraArgs, Callback=Callback, Config=Config)
File "/usr/local/lib/python2.7/dist-packages/boto3/s3/inject.py", line 539, in upload_fileobj
return future.result()
File "/usr/local/lib/python2.7/dist-packages/s3transfer/futures.py", line 73, in result
return self._coordinator.result()
File "/usr/local/lib/python2.7/dist-packages/s3transfer/futures.py", line 233, in result
raise self._exception
ClientError: An error occurred (RequestTimeTooSkewed) when calling the PutObject operation: The difference between the request time and the current time is too large.
No reset possible 9663
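Note: RequestTimeTooSkewed is an infrastructure failure, not a submission bug: S3 rejects any request whose timestamp is more than roughly 15 minutes off AWS's clock, so the evaluation machine's clock had drifted. A quick way to measure the drift (a sketch, assuming outbound HTTPS access; any S3 endpoint works):

    from datetime import datetime, timezone
    from email.utils import parsedate_to_datetime
    import requests

    # Compare the local clock with the Date header returned by S3.
    server = parsedate_to_datetime(requests.head("https://s3.amazonaws.com").headers["Date"])
    print("local clock is off by", datetime.now(timezone.utc) - server)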
1249
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 8:11:12 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 623, in _make_api_call
raise error_class(parsed_response, operation_name)
ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
No reset possible 9658
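Note: the 403 on HeadObject above is ambiguous by design: S3 answers a HEAD request with 403 both when access is genuinely denied and when the key simply does not exist but the caller lacks s3:ListBucket. A probe to tell the two cases apart (bucket and key are placeholders; assumes boto3 credentials are configured):

    import boto3
    from botocore.exceptions import ClientError

    try:
        boto3.resource("s3").Object("some-bucket", "some/key").load()  # same HEAD call as in the traceback
    except ClientError as e:
        status = e.response["ResponseMetadata"]["HTTPStatusCode"]
        # 404 -> key is missing; 403 -> denied, or missing without s3:ListBucket
        print("HeadObject failed with HTTP", status)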
1246
David Abraham Pytorch IL aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:31 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 85, in run
solve(params, cis) # let's try to solve the challenge,
File "solution.py", line 51, in solve
observation, reward, done, info = env.step(action)
File "/usr/local/lib/python2.7/dist-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/notebooks/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9656
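Note: "Giving up to connect to the gym duckietown server" (here and again in the next failed run) means the solution container started before the simulator container was reachable, and duckietown-slimremote exhausted its connection retries. The real retry loop lives in _failsafe_observe and is not shown in this log; the sketch below only illustrates the general wait-for-server pattern, with host, port, and retry policy as assumptions:

    import socket
    import time

    def wait_for_server(host="evaluator", port=5558, attempts=10, delay=2.0):
        """Poll a TCP port until it accepts a connection, then return."""
        for _ in range(attempts):
            try:
                with socket.create_connection((host, port), timeout=2.0):
                    return
            except OSError:
                time.sleep(delay)
        raise Exception("Giving up to connect to the gym duckietown server at host: %s" % host)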
1244
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 1:09:52 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 110, in run
solve(params, cis)
File "solution.py", line 78, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/wrappers.py", line 79, in step
ob, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9653
1242
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:08
driven_lanedir_median 0.2378347411753543 deviation-center-line_median 0.2359500665551364 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3250914468388402 deviation-center-line_mean 0.21342037797633387 deviation-center-line_min 0.12383235141164016 deviation-heading_max 1.808108121549126 deviation-heading_mean 0.7886711265137754 deviation-heading_median 0.6117626678783025 deviation-heading_min 0.24236497338488927 driven_any_max 0.7750378034964654 driven_any_mean 0.32786458578902156 driven_any_median 0.23930295462218812 driven_any_min 0.1499947733843908 driven_lanedir_max 0.5287097105035634 driven_lanedir_mean 0.2669678259654299 driven_lanedir_min 0.12833585003948067 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7750378034964654, "driven_lanedir": 0.5287097105035634, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.808108121549126, "deviation-center-line": 0.3250914468388402}, "ep001": {"driven_any": 0.23930295462218812, "driven_lanedir": 0.2378347411753543, "in-drivable-lane": 0, "deviation-heading": 0.24236497338488927, "deviation-center-line": 0.2526981491511795}, "ep002": {"driven_any": 0.1499947733843908, "driven_lanedir": 0.12833585003948067, "in-drivable-lane": 0, "deviation-heading": 0.6117626678783025, "deviation-center-line": 0.12383235141164016}, "ep003": {"driven_any": 0.19284024828464635, "driven_lanedir": 0.17181572015847513, "in-drivable-lane": 0, "deviation-heading": 0.7499416997872996, "deviation-center-line": 0.12952987592487306}, "ep004": {"driven_any": 0.28214714915741734, "driven_lanedir": 0.26814310795027607, "in-drivable-lane": 0, "deviation-heading": 0.5311781699692596, "deviation-center-line": 0.2359500665551364}}
No reset possible 9651
1242
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:20
No reset possible 9650
1242
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:02
survival_time_median 2.3000000000000016
other stats episodes details {"ep000": {"nsteps": 219, "reward": -4.715856245643413, "good_angle": 0.8852054606636683, "survival_time": 7.299999999999984, "traveled_tiles": 2, "valid_direction": 1.066666666666664}, "ep001": {"nsteps": 69, "reward": -15.071732969819635, "good_angle": 0.0279712245940719, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 44, "reward": -23.206919590857897, "good_angle": 0.4878454926533588, "survival_time": 1.466666666666668, "traveled_tiles": 2, "valid_direction": 0.7000000000000013}, "ep003": {"nsteps": 56, "reward": -18.231142271576186, "good_angle": 0.46932790821030024, "survival_time": 1.8666666666666691, "traveled_tiles": 1, "valid_direction": 1.0000000000000029}, "ep004": {"nsteps": 81, "reward": -12.805850816729627, "good_angle": 0.35088257841654297, "survival_time": 2.7, "traveled_tiles": 1, "valid_direction": 0.3999999999999986}}good_angle_max 0.8852054606636683 good_angle_mean 0.4442465329075884 good_angle_median 0.46932790821030024 good_angle_min 0.0279712245940719 reward_max -4.715856245643413 reward_mean -14.806300378925352 reward_median -15.071732969819635 reward_min -23.206919590857897 survival_time_max 7.299999999999984 survival_time_mean 3.1266666666666643 survival_time_min 1.466666666666668 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.066666666666664 valid_direction_mean 0.6333333333333333 valid_direction_median 0.7000000000000013 valid_direction_min 0
No reset possible 9647
1242
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:09
No reset possible 9645
1241
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:04 Timeout:
Waited 600.856384993 for container to finish. Giving up.
No reset possible 9643
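Note: the Timeout entries in this stretch all share one failure mode: the runner allows the container roughly 600 seconds and then abandons it. Schematically the wait loop looks like the sketch below (runner.py's actual code is not in this log; the names and poll interval are assumptions):

    import time

    def wait_for_container(is_finished, timeout=600.0, poll=5.0):
        # is_finished: zero-argument callable, e.g. querying docker for container state.
        start = time.time()
        while time.time() - start < timeout:
            if is_finished():
                return
            time.sleep(poll)
        raise Exception("Waited %s for container to finish. Giving up." % (time.time() - start))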
1239
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:14:29 Timeout:
Waited 616.993263006 for container to finish. Giving up.
No reset possible 9641
1238
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:14:34 Timeout:
Waited 626.475270033 for container to finish. Giving up.
No reset possible 9639
1237
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:40 Timeout:
Waited 604.070347071 for container to finish. Giving up.
No reset possible 9636
1235
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:11 Error while running [...]
Pulling evaluator ... error
stderr | ERROR: for evaluator Get https://registry-1.docker.io/v2/andreacensi/aido1_lf1_r3-v3-step1-simulation-evaluator/manifests/2018_11_08_16_21_06: Get https://auth.docker.io/token?scope=repository%3Aandreacensi%2Faido1_lf1_r3-v3-step1-simulation-evaluator%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/andreacensi/aido1_lf1_r3-v3-step1-simulation-evaluator/manifests/2018_11_08_16_21_06: Get https://auth.docker.io/token?scope=repository%3Aandreacensi%2Faido1_lf1_r3-v3-step1-simulation-evaluator%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible 9635
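Note: this aborted run and the one right after it died while pulling an image: the token request to auth.docker.io timed out, a transient registry/network failure unrelated to the submission itself. Pre-pulling the image with a few retries is a common mitigation; a sketch assuming the docker CLI is on PATH (image tag copied from the log, retry policy invented):

    import subprocess
    import time

    IMAGE = "andreacensi/aido1_lf1_r3-v3-step1-simulation-evaluator:2018_11_08_16_21_06"

    def pull_with_retries(image, attempts=3, delay=30):
        for _ in range(attempts):
            if subprocess.call(["docker", "pull", image]) == 0:
                return
            time.sleep(delay)  # registry timeouts are usually transient
        raise RuntimeError("could not pull %s after %d attempts" % (image, attempts))

    pull_with_retries(IMAGE)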
1234
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:49 Error while running [...]
Pulling solution ... error
stderr | ERROR: for solution Get https://registry-1.docker.io/v2/gunshi/aido1_lf1_r3-v3-submission/manifests/2018_11_15_10_23_07: Get https://auth.docker.io/token?scope=repository%3Agunshi%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/gunshi/aido1_lf1_r3-v3-submission/manifests/2018_11_15_10_23_07: Get https://auth.docker.io/token?scope=repository%3Agunshi%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible 9633
1233
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:15 Timeout:
Waited 613.238981962 for container to finish. Giving up.
No reset possible 9630
1230
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:13:09
driven_lanedir_median 5.233205580663916 deviation-center-line_median 0.9622528676347796 in-drivable-lane_median 1.86666666666666
other stats deviation-center-line_max 1.4887727823522934 deviation-center-line_mean 1.0165775523241307 deviation-center-line_min 0.6787543098843081 deviation-heading_max 2.9759401772443645 deviation-heading_mean 2.5017191355213586 deviation-heading_median 2.7190723206994707 deviation-heading_min 1.5376994510379216 driven_any_max 6.340440794264879 driven_any_mean 5.482326668108364 driven_any_median 6.323719113434832 driven_any_min 2.1357289801566743 driven_lanedir_max 6.012629298880372 driven_lanedir_mean 4.674452797642727 driven_lanedir_min 1.6482613967380175 in-drivable-lane_max 3.0999999999999908 in-drivable-lane_mean 1.7466666666666608 in-drivable-lane_min 0.33333333333333215 per-episodes details {"ep000": {"driven_any": 6.278494215108127, "driven_lanedir": 4.988778536550464, "in-drivable-lane": 3.0999999999999908, "deviation-heading": 2.9759401772443645, "deviation-center-line": 0.9622528676347796}, "ep001": {"driven_any": 2.1357289801566743, "driven_lanedir": 1.6482613967380175, "in-drivable-lane": 0.93333333333333, "deviation-heading": 1.5376994510379216, "deviation-center-line": 0.6787543098843081}, "ep002": {"driven_any": 6.340440794264879, "driven_lanedir": 5.489389175380863, "in-drivable-lane": 1.86666666666666, "deviation-heading": 2.4944146864875445, "deviation-center-line": 1.016623969752082}, "ep003": {"driven_any": 6.333250237577308, "driven_lanedir": 5.233205580663916, "in-drivable-lane": 2.499999999999991, "deviation-heading": 2.7190723206994707, "deviation-center-line": 0.9364838319971908}, "ep004": {"driven_any": 6.323719113434832, "driven_lanedir": 6.012629298880372, "in-drivable-lane": 0.33333333333333215, "deviation-heading": 2.781469042137491, "deviation-center-line": 1.4887727823522934}}
No reset possible 9629
1230
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:03
No reset possible 9628
1230
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:34
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.09904257215256802, "good_angle": 1.0968452424452118, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.4333333333333247}, "ep001": {"nsteps": 179, "reward": -6.0697646474923586, "good_angle": 6.483064691787094, "survival_time": 5.966666666666655, "traveled_tiles": 5, "valid_direction": 1.899999999999997}, "ep002": {"nsteps": 500, "reward": -0.17009984451299534, "good_angle": 0.9263830970823708, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.1333333333333275}, "ep003": {"nsteps": 500, "reward": -0.1293248508713441, "good_angle": 1.1081828344412037, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.8999999999999897}, "ep004": {"nsteps": 500, "reward": -0.1788246242830646, "good_angle": 11.447960051210329, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 3.1999999999999886}}good_angle_max 11.447960051210329 good_angle_mean 4.212487183393241 good_angle_median 1.1081828344412037 good_angle_min 0.9263830970823708 reward_max -0.09904257215256802 reward_mean -1.329411307862466 reward_median -0.17009984451299534 reward_min -6.0697646474923586 survival_time_max 16.666666666666654 survival_time_mean 14.526666666666651 survival_time_min 5.966666666666655 traveled_tiles_max 13 traveled_tiles_mean 10.8 traveled_tiles_median 12 traveled_tiles_min 5 valid_direction_max 3.1999999999999886 valid_direction_mean 2.5133333333333256 valid_direction_median 2.4333333333333247 valid_direction_min 1.899999999999997
No reset possible 9626
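Note: the recurring survival_time of 16.666666666666654 is the episode cap, not a measurement: episodes stop at nsteps = 500, and at the simulator's default 30 steps per second that is 500 x 1/30, about 16.67 s. Every episode reporting this value ran to the step limit without being terminated early:

    # Episode time cap, assuming gym-duckietown's default 30 fps step rate.
    print(500 / 30.0)  # ~16.6667 s; the logged value differs only in the last
                       # digits because the simulator accumulates 1/30 per step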
1230
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:20:05
No reset possible 9623
1227
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:44 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9614
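Note: here the timeout fired on the evaluator side: wait_for_solution polls for the solution's output file and gives up after 600 s, so the submission container never produced output-solution.yaml in time. The general shape is sketched below (cie_concrete.py's real logic is not reproduced here, and the poll interval is an assumption):

    import os
    import time

    def wait_for_solution(path="/challenge-solution-output/output-solution.yaml",
                          timeout=600, poll=2.0):
        deadline = time.time() + timeout
        while time.time() < deadline:
            if os.path.exists(path):
                return
            time.sleep(poll)
        raise Exception("Time out: Timeout of %s while waiting for %s." % (timeout, path))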
1224
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:10
No reset possible 9580
1220
Mandana Samiei 🇨🇦Improved ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:00
survival_time_median 2.4000000000000012
other stats episodes details {"ep000": {"nsteps": 219, "reward": -4.711433204928574, "good_angle": 0.8593331353981082, "survival_time": 7.299999999999984, "traveled_tiles": 2, "valid_direction": 0.8999999999999968}, "ep001": {"nsteps": 108, "reward": -9.843404257463083, "good_angle": 0.03543433536997532, "survival_time": 3.599999999999997, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 49, "reward": -20.87239823323123, "good_angle": 0.4476014846146789, "survival_time": 1.633333333333335, "traveled_tiles": 2, "valid_direction": 0.8000000000000015}, "ep003": {"nsteps": 57, "reward": -17.92643308273533, "good_angle": 0.48546858928437353, "survival_time": 1.9000000000000024, "traveled_tiles": 1, "valid_direction": 0.933333333333336}, "ep004": {"nsteps": 72, "reward": -14.369250705258716, "good_angle": 0.39009567649755905, "survival_time": 2.4000000000000012, "traveled_tiles": 1, "valid_direction": 0.49999999999999867}}good_angle_max 0.8593331353981082 good_angle_mean 0.443586644232939 good_angle_median 0.4476014846146789 good_angle_min 0.03543433536997532 reward_max -4.711433204928574 reward_mean -13.54458389672339 reward_median -14.369250705258716 reward_min -20.87239823323123 survival_time_max 7.299999999999984 survival_time_mean 3.3666666666666636 survival_time_min 1.633333333333335 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 0.933333333333336 valid_direction_mean 0.6266666666666667 valid_direction_median 0.8000000000000015 valid_direction_min 0
No reset possible 9561
1218
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:16:28
driven_lanedir_median 4.991915856330536 deviation-center-line_median 0.9364838319971908 in-drivable-lane_median 2.499999999999991
other stats deviation-center-line_max 1.0676711077521397 deviation-center-line_mean 0.8130895808496119 deviation-center-line_min 0.1229629401888518 deviation-heading_max 2.8510293130096054 deviation-heading_mean 2.058613300121737 deviation-heading_median 2.4944146864875445 deviation-heading_min 0.06335780452941518 driven_any_max 6.340440794264879 driven_any_mean 5.139007959420065 driven_any_median 6.333250237577308 driven_any_min 0.3930632008756537 driven_lanedir_max 5.489389175380863 driven_lanedir_mean 4.13773982673794 driven_lanedir_min 0.3919959004281719 in-drivable-lane_max 4.066666666666652 in-drivable-lane_mean 2.3199999999999923 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 6.288365540766493, "driven_lanedir": 4.991915856330536, "in-drivable-lane": 3.166666666666657, "deviation-heading": 2.8510293130096054, "deviation-center-line": 0.921706054557795}, "ep001": {"driven_any": 0.3930632008756537, "driven_lanedir": 0.3919959004281719, "in-drivable-lane": 0, "deviation-heading": 0.06335780452941518, "deviation-center-line": 0.1229629401888518}, "ep002": {"driven_any": 6.340440794264879, "driven_lanedir": 5.489389175380863, "in-drivable-lane": 1.86666666666666, "deviation-heading": 2.4944146864875445, "deviation-center-line": 1.016623969752082}, "ep003": {"driven_any": 6.333250237577308, "driven_lanedir": 5.233205580663916, "in-drivable-lane": 2.499999999999991, "deviation-heading": 2.7190723206994707, "deviation-center-line": 0.9364838319971908}, "ep004": {"driven_any": 6.339920023615989, "driven_lanedir": 4.582192620886213, "in-drivable-lane": 4.066666666666652, "deviation-heading": 2.1651923758826483, "deviation-center-line": 1.0676711077521397}}
No reset possible 9555
1218
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:13
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.09804204974533058, "good_angle": 1.1029803703513434, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.6999999999999904}, "ep001": {"nsteps": 33, "reward": -30.893865944761217, "good_angle": 0.005861928445149117, "survival_time": 1.1, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 500, "reward": -0.17009984451299534, "good_angle": 0.9263830970823708, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.1333333333333275}, "ep003": {"nsteps": 500, "reward": -0.1293248508713441, "good_angle": 1.1081828344412037, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.8999999999999897}, "ep004": {"nsteps": 500, "reward": -0.33169459652155636, "good_angle": 15.213106584682173, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 3.333333333333332}}good_angle_max 15.213106584682173 good_angle_mean 3.671302963000448 good_angle_median 1.1029803703513434 good_angle_min 0.005861928445149117 reward_max -0.09804204974533058 reward_mean -6.324605457282489 reward_median -0.17009984451299534 reward_min -30.893865944761217 survival_time_max 16.666666666666654 survival_time_mean 13.553333333333324 survival_time_min 1.1 traveled_tiles_max 13 traveled_tiles_mean 10 traveled_tiles_median 12 traveled_tiles_min 1 valid_direction_max 3.333333333333332 valid_direction_mean 2.213333333333328 valid_direction_median 2.6999999999999904 valid_direction_min 0
No reset possible 9548
1218
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:57 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 99, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9543
1216
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:15:39
driven_lanedir_median 8.612961693827268 deviation-center-line_median 1.66985101080796 in-drivable-lane_median 2.866666666666658
other stats deviation-center-line_max 1.8035219663301965 deviation-center-line_mean 1.6672943452898472 deviation-center-line_min 1.4702979645281968 deviation-heading_max 2.241648152033716 deviation-heading_mean 1.9336734107589677 deviation-heading_median 1.922786813802037 deviation-heading_min 1.631921368105764 driven_any_max 10.671132227372407 driven_any_mean 10.670566151885527 driven_any_median 10.67072031050144 driven_any_min 10.66985370366328 driven_lanedir_max 10.311802660582206 driven_lanedir_mean 9.103591402096644 driven_lanedir_min 8.108706441916016 in-drivable-lane_max 3.633333333333322 in-drivable-lane_mean 2.09333333333333 in-drivable-lane_min 0.13333333333333286 per-episodes details {"ep000": {"driven_any": 10.67107143524292, "driven_lanedir": 8.108706441916016, "in-drivable-lane": 3.633333333333322, "deviation-heading": 1.922786813802037, "deviation-center-line": 1.6270418552733978}, "ep001": {"driven_any": 10.66985370366328, "driven_lanedir": 9.999863293554435, "in-drivable-lane": 0.6999999999999975, "deviation-heading": 2.1745670886237196, "deviation-center-line": 1.8035219663301965}, "ep002": {"driven_any": 10.67072031050144, "driven_lanedir": 8.612961693827268, "in-drivable-lane": 2.866666666666658, "deviation-heading": 1.6974436312296035, "deviation-center-line": 1.7657589295094858}, "ep003": {"driven_any": 10.671132227372407, "driven_lanedir": 8.484622920603297, "in-drivable-lane": 3.13333333333334, "deviation-heading": 1.631921368105764, "deviation-center-line": 1.66985101080796}, "ep004": {"driven_any": 10.6700530826476, "driven_lanedir": 10.311802660582206, "in-drivable-lane": 0.13333333333333286, "deviation-heading": 2.241648152033716, "deviation-center-line": 1.4702979645281968}}
No reset possible 9529
1216
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:22
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.47290116201527416, "good_angle": 0.620527758689177, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.4666666666666688}, "ep001": {"nsteps": 500, "reward": -0.3814245066232979, "good_angle": 20.941140955183226, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 3.2333333333333236}, "ep002": {"nsteps": 500, "reward": -0.4611001099124551, "good_angle": 0.6792900770956242, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.699999999999997}, "ep003": {"nsteps": 500, "reward": -0.49903926293135736, "good_angle": 0.5362987411971321, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.4333333333333318}, "ep004": {"nsteps": 500, "reward": -0.264232877929011, "good_angle": 6.46899920987801, "survival_time": 16.666666666666654, "traveled_tiles": 10, "valid_direction": 1.433333333333334}}good_angle_max 20.941140955183226 good_angle_mean 5.849251348408634 good_angle_median 0.6792900770956242 good_angle_min 0.5362987411971321 reward_max -0.264232877929011 reward_mean -0.4157395838822791 reward_median -0.4611001099124551 reward_min -0.49903926293135736 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 18 traveled_tiles_mean 16.4 traveled_tiles_median 18 traveled_tiles_min 10 valid_direction_max 3.2333333333333236 valid_direction_mean 1.853333333333331 valid_direction_median 1.4666666666666688 valid_direction_min 1.4333333333333318
No reset possible 9519
1215
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:15
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.5497224612673745, "good_angle": 0.5439216145491801, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.4666666666666714}, "ep001": {"nsteps": 500, "reward": -0.3210465839970857, "good_angle": 21.22155437720925, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 3.666666666666668}, "ep002": {"nsteps": 500, "reward": -0.6283954581220169, "good_angle": 0.5428892749021961, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.266666666666663}, "ep003": {"nsteps": 500, "reward": -0.3240200620025862, "good_angle": 0.6178064113300014, "survival_time": 16.666666666666654, "traveled_tiles": 18, "valid_direction": 1.8333333333333268}, "ep004": {"nsteps": 500, "reward": -0.2921708645388717, "good_angle": 13.532035740296866, "survival_time": 16.666666666666654, "traveled_tiles": 15, "valid_direction": 2.566666666666661}}good_angle_max 21.22155437720925 good_angle_mean 7.2916414836575 good_angle_median 0.6178064113300014 good_angle_min 0.5428892749021961 reward_max -0.2921708645388717 reward_mean -0.42307108598558696 reward_median -0.3240200620025862 reward_min -0.6283954581220169 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 18 traveled_tiles_mean 17.4 traveled_tiles_median 18 traveled_tiles_min 15 valid_direction_max 3.666666666666668 valid_direction_mean 2.1599999999999984 valid_direction_median 1.8333333333333268 valid_direction_min 1.266666666666663
No reset possible 9498
1213
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:49 Timeout:
Waited 602.476073027 for container to finish. Giving up.
No reset possible 9495
1212
Iban Harlouchet 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:02
No reset possible 9474
1209
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:23
survival_time_median 12.2333333333333
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.6730641153701581, "good_angle": 1.1257769409562324, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.933333333333323}, "ep001": {"nsteps": 52, "reward": -19.990229403743378, "good_angle": 2.9851842843742027, "survival_time": 1.7333333333333354, "traveled_tiles": 2, "valid_direction": 0.3666666666666678}, "ep002": {"nsteps": 500, "reward": -0.7178257898052689, "good_angle": 1.356196950867545, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 3.066666666666656}, "ep003": {"nsteps": 367, "reward": -3.119152601437811, "good_angle": 0.4137677896881461, "survival_time": 12.2333333333333, "traveled_tiles": 9, "valid_direction": 1.1333333333333293}, "ep004": {"nsteps": 79, "reward": -13.080453655387782, "good_angle": 0.007977152289526793, "survival_time": 2.6333333333333337, "traveled_tiles": 2, "valid_direction": 0}}good_angle_max 2.9851842843742027 good_angle_mean 1.1777806236351307 good_angle_median 1.1257769409562324 good_angle_min 0.007977152289526793 reward_max -0.6730641153701581 reward_mean -7.51614511314888 reward_median -3.119152601437811 reward_min -19.990229403743378 survival_time_max 16.666666666666654 survival_time_mean 9.986666666666654 survival_time_min 1.7333333333333354 traveled_tiles_max 12 traveled_tiles_mean 7.4 traveled_tiles_median 9 traveled_tiles_min 2 valid_direction_max 3.066666666666656 valid_direction_mean 1.4999999999999951 valid_direction_median 1.1333333333333293 valid_direction_min 0
No reset possible 9449
1203
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:49
driven_lanedir_median 0.5914854221134958 deviation-center-line_median 0.1642850250933818 in-drivable-lane_median 0
other stats deviation-center-line_max 1.202360235140529 deviation-center-line_mean 0.3330298026826093 deviation-center-line_min 0.04735951717671134 deviation-heading_max 1.2397319145651913 deviation-heading_mean 0.4740879467799844 deviation-heading_median 0.2254548132096628 deviation-heading_min 0.1615613325475736 driven_any_max 5.749287235287586 driven_any_mean 1.5519938363195305 driven_any_median 0.5998638140333198 driven_any_min 0.10398542495011756 driven_lanedir_max 2.734846790088305 driven_lanedir_mean 0.8971341792826234 driven_lanedir_min 0.09320063744972051 in-drivable-lane_max 9.033333333333337 in-drivable-lane_mean 1.9266666666666672 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 1.0949170429668078, "driven_lanedir": 0.8603876718799879, "in-drivable-lane": 0.6, "deviation-heading": 0.5675258645162795, "deviation-center-line": 0.18595604881096625}, "ep001": {"driven_any": 0.10398542495011756, "driven_lanedir": 0.09320063744972051, "in-drivable-lane": 0, "deviation-heading": 0.17616580906121468, "deviation-center-line": 0.04735951717671134}, "ep002": {"driven_any": 0.2119156643598204, "driven_lanedir": 0.2057503748816072, "in-drivable-lane": 0, "deviation-heading": 0.1615613325475736, "deviation-center-line": 0.0651881871914581}, "ep003": {"driven_any": 5.749287235287586, "driven_lanedir": 2.734846790088305, "in-drivable-lane": 9.033333333333337, "deviation-heading": 1.2397319145651913, "deviation-center-line": 1.202360235140529}, "ep004": {"driven_any": 0.5998638140333198, "driven_lanedir": 0.5914854221134958, "in-drivable-lane": 0, "deviation-heading": 0.2254548132096628, "deviation-center-line": 0.1642850250933818}}
No reset possible 9440
1203
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:59
survival_time_median 1.7666666666666688
other stats episodes details {"ep000": {"nsteps": 94, "reward": -10.880140648332445, "good_angle": 0.2389941919765413, "survival_time": 3.133333333333332, "traveled_tiles": 3, "valid_direction": 0.2333333333333325}, "ep001": {"nsteps": 13, "reward": -77.54556220540634, "good_angle": 0.12722664664767305, "survival_time": 0.4333333333333333, "traveled_tiles": 1, "valid_direction": 0.2666666666666666}, "ep002": {"nsteps": 22, "reward": -45.918432533741, "good_angle": 0.04667275508301345, "survival_time": 0.7333333333333333, "traveled_tiles": 2, "valid_direction": 0.1333333333333333}, "ep003": {"nsteps": 500, "reward": -0.7811867533139885, "good_angle": 1.41219851160597, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 3.3666666666666556}, "ep004": {"nsteps": 53, "reward": -19.349999496959292, "good_angle": 0.09429089177989708, "survival_time": 1.7666666666666688, "traveled_tiles": 2, "valid_direction": 0.2333333333333341}}good_angle_max 1.41219851160597 good_angle_mean 0.38387659941861896 good_angle_median 0.12722664664767305 good_angle_min 0.04667275508301345 reward_max -0.7811867533139885 reward_mean -30.895064327550617 reward_median -19.349999496959292 reward_min -77.54556220540634 survival_time_max 16.666666666666654 survival_time_mean 4.546666666666664 survival_time_min 0.4333333333333333 traveled_tiles_max 12 traveled_tiles_mean 4 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 3.3666666666666556 valid_direction_mean 0.8466666666666643 valid_direction_median 0.2333333333333341 valid_direction_min 0.1333333333333333
No reset possible 9414
1200
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:58
driven_lanedir_median 0.8362446914895556 deviation-center-line_median 0.19237932893445284 in-drivable-lane_median 0
other stats deviation-center-line_max 1.1129167357857068 deviation-center-line_mean 0.3380787869604759 deviation-center-line_min 0.07539513465930048 deviation-heading_max 1.5918982097655303 deviation-heading_mean 0.5166769406591565 deviation-heading_median 0.2192371997002948 deviation-heading_min 0.10487125934157168 driven_any_max 5.700788194700402 driven_any_mean 1.6672793771194456 driven_any_median 0.8400096686125453 driven_any_min 0.27234852657857656 driven_lanedir_max 2.522384205927833 driven_lanedir_mean 0.9865151287859 driven_lanedir_min 0.2608997391817156 in-drivable-lane_max 9.233333333333302 in-drivable-lane_mean 1.9533333333333271 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 1.1833576228464997, "driven_lanedir": 0.9761642916727722, "in-drivable-lane": 0.5333333333333333, "deviation-heading": 0.4859369522196323, "deviation-center-line": 0.2062981096933459}, "ep001": {"driven_any": 0.3398928728592049, "driven_lanedir": 0.3368827156576235, "in-drivable-lane": 0, "deviation-heading": 0.10487125934157168, "deviation-center-line": 0.10340462572957348}, "ep002": {"driven_any": 5.700788194700402, "driven_lanedir": 2.522384205927833, "in-drivable-lane": 9.233333333333302, "deviation-heading": 1.5918982097655303, "deviation-center-line": 1.1129167357857068}, "ep003": {"driven_any": 0.8400096686125453, "driven_lanedir": 0.8362446914895556, "in-drivable-lane": 0, "deviation-heading": 0.18144108226875305, "deviation-center-line": 0.19237932893445284}, "ep004": {"driven_any": 0.27234852657857656, "driven_lanedir": 0.2608997391817156, "in-drivable-lane": 0, "deviation-heading": 0.2192371997002948, "deviation-center-line": 0.07539513465930048}}
No reset possible 9381
1197
Iban Harlouchet 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:06
No reset possible 9369
1193
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:22
survival_time_median 1.5333333333333348
other stats episodes details {"ep000": {"nsteps": 71, "reward": -14.22417149125946, "good_angle": 0.3338534572823311, "survival_time": 2.366666666666668, "traveled_tiles": 2, "valid_direction": 0.33333333333333215}, "ep001": {"nsteps": 28, "reward": -36.29263753443957, "good_angle": 0.0069164988446014765, "survival_time": 0.9333333333333332, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 42, "reward": -24.26853745324271, "good_angle": 0.13949759890818414, "survival_time": 1.400000000000001, "traveled_tiles": 2, "valid_direction": 0.2333333333333341}, "ep003": {"nsteps": 46, "reward": -22.1674320626518, "good_angle": 0.019680242706311565, "survival_time": 1.5333333333333348, "traveled_tiles": 2, "valid_direction": 0}, "ep004": {"nsteps": 61, "reward": -16.83373046116751, "good_angle": 0.010555403823120408, "survival_time": 2.033333333333336, "traveled_tiles": 2, "valid_direction": 0}}good_angle_max 0.3338534572823311 good_angle_mean 0.10210064031290976 good_angle_median 0.019680242706311565 good_angle_min 0.0069164988446014765 reward_max -14.22417149125946 reward_mean -22.75730180055221 reward_median -22.1674320626518 reward_min -36.29263753443957 survival_time_max 2.366666666666668 survival_time_mean 1.6533333333333349 survival_time_min 0.9333333333333332 traveled_tiles_max 2 traveled_tiles_mean 1.8 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 0.33333333333333215 valid_direction_mean 0.11333333333333324 valid_direction_median 0 valid_direction_min 0
No reset possible 9306
1180
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz error yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:41:37 Timeout:
Waited 1797.46514106 for container to finish. Giving up.
No reset possible 9281
1178
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:02
driven_lanedir_median 0.28060777012340976 deviation-center-line_median 0.22721196650330475 in-drivable-lane_median 0
other stats deviation-center-line_max 0.397387598231267 deviation-center-line_mean 0.25293935350159086 deviation-center-line_min 0.13210976538295569 deviation-heading_max 1.7444564974638874 deviation-heading_mean 0.7954867340184384 deviation-heading_median 0.6045070411462525 deviation-heading_min 0.2618915420632834 driven_any_max 0.5678510646493321 driven_any_mean 0.3450032324283415 driven_any_median 0.2821608403404374 driven_any_min 0.19998585583219508 driven_lanedir_max 0.42669405448571496 driven_lanedir_mean 0.2838617218300153 driven_lanedir_min 0.1775901034723475 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.5678510646493321, "driven_lanedir": 0.3130090110394963, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.7444564974638874, "deviation-center-line": 0.22721196650330475}, "ep001": {"driven_any": 0.2821608403404374, "driven_lanedir": 0.28060777012340976, "in-drivable-lane": 0, "deviation-heading": 0.2618915420632834, "deviation-center-line": 0.30221720153425513}, "ep002": {"driven_any": 0.43572704749065816, "driven_lanedir": 0.42669405448571496, "in-drivable-lane": 0, "deviation-heading": 0.5989330467566464, "deviation-center-line": 0.397387598231267}, "ep003": {"driven_any": 0.19998585583219508, "driven_lanedir": 0.1775901034723475, "in-drivable-lane": 0, "deviation-heading": 0.7676455426621223, "deviation-center-line": 0.13210976538295569}, "ep004": {"driven_any": 0.2392913538290846, "driven_lanedir": 0.22140767002910788, "in-drivable-lane": 0, "deviation-heading": 0.6045070411462525, "deviation-center-line": 0.20577023585617177}}
No reset possible 9262
1177
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:07
driven_lanedir_median 0.3140615527405992 deviation-center-line_median 0.2120344929627202 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3825504647991373 deviation-center-line_mean 0.2555782149297515 deviation-center-line_min 0.13299211806624614 deviation-heading_max 1.8124373524520645 deviation-heading_mean 0.8371617918782187 deviation-heading_median 0.6068979230123304 deviation-heading_min 0.43067669951320775 driven_any_max 0.5857327986545419 driven_any_mean 0.3564370675813312 driven_any_median 0.3214398113505129 driven_any_min 0.20355954308580224 driven_lanedir_max 0.41144029184237896 driven_lanedir_mean 0.29196390143672385 driven_lanedir_min 0.1819474087964288 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.5857327986545419, "driven_lanedir": 0.32097209404557536, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.8124373524520645, "deviation-center-line": 0.2120344929627202}, "ep001": {"driven_any": 0.3214398113505129, "driven_lanedir": 0.3140615527405992, "in-drivable-lane": 0, "deviation-heading": 0.43067669951320775, "deviation-center-line": 0.34423621628418904}, "ep002": {"driven_any": 0.4214477505013758, "driven_lanedir": 0.41144029184237896, "in-drivable-lane": 0, "deviation-heading": 0.6068979230123304, "deviation-center-line": 0.3825504647991373}, "ep003": {"driven_any": 0.20355954308580224, "driven_lanedir": 0.1819474087964288, "in-drivable-lane": 0, "deviation-heading": 0.739481192030715, "deviation-center-line": 0.13299211806624614}, "ep004": {"driven_any": 0.25000543431442346, "driven_lanedir": 0.23139815975863695, "in-drivable-lane": 0, "deviation-heading": 0.5963157923827757, "deviation-center-line": 0.2060777825364651}}
No reset possible 9220
1174
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring aborted yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:25:49 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/be6c892ed4210246dbd123f76738d5eea52fa288c9c12ab870908ceb65609c11"
No reset possible 9196
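Note: most of the traceback above is botocore's retry machinery re-raising the underlying EndpointConnectionError once its attempts ran out, i.e. the machine temporarily lost connectivity to the S3 endpoint. If this recurs, the retry budget can be raised through the client config; a sketch (the attempt count is arbitrary):

    import boto3
    from botocore.config import Config

    # Give botocore more attempts before surfacing transient connection errors.
    s3 = boto3.client("s3", config=Config(retries={"max_attempts": 10}))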
1173
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:39 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9186
1171
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:05 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
No reset possible 9177
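Note: InvalidEvaluator (unlike InvalidSubmission) blames the infrastructure rather than the submission: eval.py saw the gym process exit with status 2 and aborted the step. Schematically (a sketch; the actual command eval.py launches is not in the log):

    import subprocess

    code = subprocess.call(["python", "-m", "gym_server"])  # placeholder command
    if code != 0:
        # the real eval.py raises dc.InvalidEvaluator with this message
        raise RuntimeError("Gym exited with code %d" % code)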
1170
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:53 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
No reset possible 9162
1166
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:19
survival_time_median 2.833333333333333
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.6136273357272148, "good_angle": 0.5537070922558297, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 1.90000000000002}, "ep001": {"nsteps": 85, "reward": -12.354958914307987, "good_angle": 0.21080076316883192, "survival_time": 2.833333333333333, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 50, "reward": -20.48418119311333, "good_angle": 0.6347664704960587, "survival_time": 1.6666666666666683, "traveled_tiles": 2, "valid_direction": 0.633333333333335}, "ep003": {"nsteps": 62, "reward": -16.497841009209232, "good_angle": 0.7729083682194203, "survival_time": 2.066666666666669, "traveled_tiles": 1, "valid_direction": 0.7000000000000013}, "ep004": {"nsteps": 361, "reward": -2.9291283473774916, "good_angle": 7.2072485876985635, "survival_time": 12.0333333333333, "traveled_tiles": 3, "valid_direction": 3.366666666666659}}good_angle_max 7.2072485876985635 good_angle_mean 1.8758862563677408 good_angle_median 0.6347664704960587 good_angle_min 0.21080076316883192 reward_max -0.6136273357272148 reward_mean -10.575947359947053 reward_median -12.354958914307987 reward_min -20.48418119311333 survival_time_max 16.666666666666654 survival_time_mean 7.053333333333325 survival_time_min 1.6666666666666683 traveled_tiles_max 4 traveled_tiles_mean 2.2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 3.366666666666659 valid_direction_mean 1.3733333333333362 valid_direction_median 0.7000000000000013 valid_direction_min 0.2666666666666657
No reset possible 9154
1164
Mandana Samiei 🇨🇦My Improved ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:26
driven_lanedir_median 0.20552955460650055 deviation-center-line_median 0.147201300169161 in-drivable-lane_median 0
other stats deviation-center-line_max 0.35753048570357787 deviation-center-line_mean 0.1907781397534624 deviation-center-line_min 0.09263278175679672 deviation-heading_max 1.7183292422352954 deviation-heading_mean 0.7984177229793936 deviation-heading_median 0.5858313015455301 deviation-heading_min 0.38730041512189434 driven_any_max 0.7928873277300456 driven_any_mean 0.3064261243806684 driven_any_median 0.22498124254441865 driven_any_min 0.08927605174825964 driven_lanedir_max 0.5510805099921994 driven_lanedir_mean 0.24507277155126625 driven_lanedir_min 0.07586780200910015 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7928873277300456, "driven_lanedir": 0.5510805099921994, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.7183292422352954, "deviation-center-line": 0.35753048570357787}, "ep001": {"driven_any": 0.08927605174825964, "driven_lanedir": 0.07586780200910015, "in-drivable-lane": 0, "deviation-heading": 0.38730041512189434, "deviation-center-line": 0.09263278175679672}, "ep002": {"driven_any": 0.17141768242842462, "driven_lanedir": 0.15359552259448783, "in-drivable-lane": 0, "deviation-heading": 0.5858313015455301, "deviation-center-line": 0.1407064486409552}, "ep003": {"driven_any": 0.22498124254441865, "driven_lanedir": 0.20552955460650055, "in-drivable-lane": 0, "deviation-heading": 0.7449872273121362, "deviation-center-line": 0.147201300169161}, "ep004": {"driven_any": 0.2535683174521935, "driven_lanedir": 0.23929046855404312, "in-drivable-lane": 0, "deviation-heading": 0.5556404286821123, "deviation-center-line": 0.2158196824968214}}
No reset possible 9130
1163
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:40 Artefacts hidden.
No reset possible 9103
1158
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:15:57 Artefacts hidden.
driven_lanedir_median 1.1101918964409605 deviation-center-line_median 0.4264052630691605 in-drivable-lane_median 10.333333333333334
other stats deviation-center-line_max 0.8858535845217879 deviation-center-line_mean 0.5311440203468543 deviation-center-line_min 0.234202886401531 deviation-heading_max 1.4096575925372257 deviation-heading_mean 0.8364169828850242 deviation-heading_median 0.6608668232516459 deviation-heading_min 0.340749904207677 driven_any_max 4.640049332926225 driven_any_mean 3.9162458484764726 driven_any_median 4.637588355885155 driven_any_min 2.4031890794300694 driven_lanedir_max 1.7007899331865348 driven_lanedir_mean 1.0827797427751649 driven_lanedir_min 0.5633401627106651 in-drivable-lane_max 13.93333333333332 in-drivable-lane_mean 10.046666666666656 in-drivable-lane_min 6.499999999999978 per-episodes details {"ep000": {"driven_any": 3.261059175339024, "driven_lanedir": 1.1101918964409605, "in-drivable-lane": 7.699999999999976, "deviation-heading": 0.340749904207677, "deviation-center-line": 0.7351663906709174}, "ep001": {"driven_any": 4.637588355885155, "driven_lanedir": 1.3080523770740686, "in-drivable-lane": 11.766666666666662, "deviation-heading": 1.1881383356422144, "deviation-center-line": 0.4264052630691605}, "ep002": {"driven_any": 2.4031890794300694, "driven_lanedir": 0.5633401627106651, "in-drivable-lane": 6.499999999999978, "deviation-heading": 0.6608668232516459, "deviation-center-line": 0.234202886401531}, "ep003": {"driven_any": 4.639343298801891, "driven_lanedir": 0.7315243444635957, "in-drivable-lane": 13.93333333333332, "deviation-heading": 0.582672258786358, "deviation-center-line": 0.37409197707087416}, "ep004": {"driven_any": 4.640049332926225, "driven_lanedir": 1.7007899331865348, "in-drivable-lane": 10.333333333333334, "deviation-heading": 1.4096575925372257, "deviation-center-line": 0.8858535845217879}}
No reset possible 9084
1156
Mandana Samiei 🇨🇦My Improved ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:42 Artefacts hidden.
survival_time_median 2.333333333333335
other stats episodes details {"ep000": {"nsteps": 221, "reward": -4.669979619494654, "good_angle": 0.854506904324985, "survival_time": 7.36666666666665, "traveled_tiles": 2, "valid_direction": 1.1333333333333293}, "ep001": {"nsteps": 93, "reward": -11.33756122505793, "good_angle": 0.03361800838719996, "survival_time": 3.0999999999999988, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 47, "reward": -21.753391784556367, "good_angle": 0.4597435137438536, "survival_time": 1.5666666666666682, "traveled_tiles": 2, "valid_direction": 0.7333333333333348}, "ep003": {"nsteps": 59, "reward": -17.308526529220202, "good_angle": 0.48544786970937487, "survival_time": 1.9666666666666697, "traveled_tiles": 1, "valid_direction": 0.9000000000000027}, "ep004": {"nsteps": 70, "reward": -14.75984545179776, "good_angle": 0.36665932241402327, "survival_time": 2.333333333333335, "traveled_tiles": 1, "valid_direction": 0.4999999999999991}}good_angle_max 0.854506904324985 good_angle_mean 0.43999512371588734 good_angle_median 0.4597435137438536 good_angle_min 0.03361800838719996 reward_max -4.669979619494654 reward_mean -13.965860922025382 reward_median -14.75984545179776 reward_min -21.753391784556367 survival_time_max 7.36666666666665 survival_time_mean 3.2666666666666644 survival_time_min 1.5666666666666682 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.1333333333333293 valid_direction_mean 0.6533333333333331 valid_direction_median 0.7333333333333348 valid_direction_min 0
No reset possible 9067
1155
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:01 Artefacts hidden.
survival_time_median 5.49999999999999
other stats episodes details {"ep000": {"nsteps": 405, "reward": -3.0253246738566775, "good_angle": 3.686269832724286, "survival_time": 13.499999999999964, "traveled_tiles": 5, "valid_direction": 5.79999999999998}, "ep001": {"nsteps": 63, "reward": -16.402814594999192, "good_angle": 0.03886629792191191, "survival_time": 2.1000000000000023, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 334, "reward": -4.357125898646079, "good_angle": 4.619266888418511, "survival_time": 11.133333333333304, "traveled_tiles": 5, "valid_direction": 7.333333333333316}, "ep003": {"nsteps": 165, "reward": -7.449461330192086, "good_angle": 0.8916389022939682, "survival_time": 5.49999999999999, "traveled_tiles": 3, "valid_direction": 2.8999999999999977}, "ep004": {"nsteps": 51, "reward": -20.07449903149231, "good_angle": 0.017789674217765197, "survival_time": 1.700000000000002, "traveled_tiles": 1, "valid_direction": 0}}good_angle_max 4.619266888418511 good_angle_mean 1.8507663191152883 good_angle_median 0.8916389022939682 good_angle_min 0.017789674217765197 reward_max -3.0253246738566775 reward_mean -10.261845105837269 reward_median -7.449461330192086 reward_min -20.07449903149231 survival_time_max 13.499999999999964 survival_time_mean 6.786666666666653 survival_time_min 1.700000000000002 traveled_tiles_max 5 traveled_tiles_mean 3 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 7.333333333333316 valid_direction_mean 3.206666666666659 valid_direction_median 2.8999999999999977 valid_direction_min 0
No reset possible 9047
1152
Iban Harlouchet 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:09 Artefacts hidden.
survival_time_median 2.200000000000002
other stats episodes details {"ep000": {"nsteps": 165, "reward": -6.174903824806891, "good_angle": 1.283565821367642, "survival_time": 5.49999999999999, "traveled_tiles": 2, "valid_direction": 1.4666666666666617}, "ep001": {"nsteps": 97, "reward": -10.920560585469314, "good_angle": 0.18588145102361045, "survival_time": 3.2333333333333316, "traveled_tiles": 1, "valid_direction": 0.29999999999999893}, "ep002": {"nsteps": 46, "reward": -22.23213052360908, "good_angle": 0.3563900941668696, "survival_time": 1.5333333333333348, "traveled_tiles": 2, "valid_direction": 0.8333333333333347}, "ep003": {"nsteps": 54, "reward": -18.898264500278017, "good_angle": 0.6519035773051937, "survival_time": 1.8000000000000025, "traveled_tiles": 1, "valid_direction": 0.7000000000000022}, "ep004": {"nsteps": 66, "reward": -15.628937987215592, "good_angle": 0.3027082238358728, "survival_time": 2.200000000000002, "traveled_tiles": 1, "valid_direction": 0.6333333333333337}}good_angle_max 1.283565821367642 good_angle_mean 0.5560898335398378 good_angle_median 0.3563900941668696 good_angle_min 0.18588145102361045 reward_max -6.174903824806891 reward_mean -14.770959484275778 reward_median -15.628937987215592 reward_min -22.23213052360908 survival_time_max 5.49999999999999 survival_time_mean 2.8533333333333326 survival_time_min 1.5333333333333348 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.4666666666666617 valid_direction_mean 0.7866666666666662 valid_direction_median 0.7000000000000022 valid_direction_min 0.29999999999999893
No reset possible 9034
1151
Krishna Murthy Jatavallabhula 🇨🇦gym_duckietown + opencv aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:37 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 556, in run
solve(params, cis)
File "solution.py", line 521, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 105, in step
assert len(action) == 2
TypeError: object of type 'NoneType' has no len()
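The failure above comes from the agent handing env.step() a None action: the simulator wrapper calls len(action) before anything else, and len(None) raises this TypeError. A minimal defensive sketch; compute_action is a hypothetical stand-in for the submission's policy, not code from the log:

    import numpy as np

    def safe_action(observation, compute_action):
        """Always hand the simulator a length-2 action (wheel commands),
        even when the policy produces nothing for this frame."""
        action = compute_action(observation)
        if action is None:
            return np.zeros(2)  # stop instead of passing None
        action = np.asarray(action, dtype=float)
        assert len(action) == 2  # same check the wrapper performs
        return action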
Artefacts hidden.
No reset possible 9017
1147
Krishna Murthy Jatavallabhula 🇨🇦gym_duckietown + opencv aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:35 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 188, "reward": -5.582159952426507, "good_angle": 2.8921088224751617, "survival_time": 6.266666666666654, "traveled_tiles": 1, "valid_direction": 5.166666666666654}, "ep001": {"nsteps": 184, "reward": -6.006585545190003, "good_angle": 0.027950161519413434, "survival_time": 6.133333333333321, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 500, "reward": -0.28698273333907126, "good_angle": 0.5037599778514814, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 1.7000000000000297}, "ep003": {"nsteps": 500, "reward": -0.09759001524746418, "good_angle": 0.00037922062684382017, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 0}, "ep004": {"nsteps": 500, "reward": 0.026040908180526456, "good_angle": 0.07595671165358207, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 0}}good_angle_max 2.8921088224751617 good_angle_mean 0.7000309788252965 good_angle_median 0.07595671165358207 good_angle_min 0.00037922062684382017 reward_max 0.026040908180526456 reward_mean -2.389455467604504 reward_median -0.28698273333907126 reward_min -6.006585545190003 survival_time_max 16.666666666666654 survival_time_mean 12.479999999999986 survival_time_min 6.133333333333321 traveled_tiles_max 3 traveled_tiles_mean 2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 5.166666666666654 valid_direction_mean 1.3733333333333366 valid_direction_median 0 valid_direction_min 0
No reset possible 8991
1145
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:10 Artefacts hidden.
No reset possible 8980
1141
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:05 Artefacts hidden.
driven_lanedir_median 4.522617242685771 deviation-center-line_median 1.0722181438139846 in-drivable-lane_median 3.6666666666666536
other stats deviation-center-line_max 2.427560823019536 deviation-center-line_mean 0.989303812396787 deviation-center-line_min 0.07084844041760953 deviation-heading_max 2.9940108666606484 deviation-heading_mean 1.6738970044968795 deviation-heading_median 1.1978509745800303 deviation-heading_min 0.7098822712947942 driven_any_max 6.368037280314137 driven_any_mean 4.093409343802219 driven_any_median 6.270558193861044 driven_any_min 0.3466666666666718 driven_lanedir_max 4.968473186127305 driven_lanedir_mean 3.024454216109337 driven_lanedir_min 0.14089729353622693 in-drivable-lane_max 4.2666666666666515 in-drivable-lane_mean 2.5466666666666575 in-drivable-lane_min 0.2 per-episodes details {"ep000": {"driven_any": 0.3466666666666718, "driven_lanedir": 0.14089729353622693, "in-drivable-lane": 0.2, "deviation-heading": 0.7098822712947942, "deviation-center-line": 0.07084844041760953}, "ep001": {"driven_any": 1.1775735695514091, "driven_lanedir": 0.9286859742671404, "in-drivable-lane": 0.3999999999999986, "deviation-heading": 1.1978509745800303, "deviation-center-line": 0.2442298264141319}, "ep002": {"driven_any": 6.270558193861044, "driven_lanedir": 4.522617242685771, "in-drivable-lane": 4.199999999999985, "deviation-heading": 2.9940108666606484, "deviation-center-line": 1.0722181438139846}, "ep003": {"driven_any": 6.304211008617831, "driven_lanedir": 4.561597383930241, "in-drivable-lane": 4.2666666666666515, "deviation-heading": 2.558912564235724, "deviation-center-line": 1.131661828318672}, "ep004": {"driven_any": 6.368037280314137, "driven_lanedir": 4.968473186127305, "in-drivable-lane": 3.6666666666666536, "deviation-heading": 0.9088283457132, "deviation-center-line": 2.427560823019536}}
No reset possible 8851
1122
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring aborted yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 1:05:08 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/be6c892ed4210246dbd123f76738d5eea52fa288c9c12ab870908ceb65609c11"
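This abort is on the evaluator side: artefact upload died on a transient DNS/connectivity failure inside aws_object.load(). One mitigation, sketched on the assumption that the outage is short-lived, is to let botocore retry the call; the bucket and key are taken from the error message, while the retry count is an arbitrary choice:

    import boto3
    from botocore.config import Config

    s3 = boto3.resource("s3", config=Config(retries={"max_attempts": 10}))
    obj = s3.Object(
        "duckietown-ai-driving-olympics-1",
        "v3/frankfurt/by-value/sha256/"
        "be6c892ed4210246dbd123f76738d5eea52fa288c9c12ab870908ceb65609c11",
    )
    obj.load()  # HeadObject; still raises if the endpoint stays unreachable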
Artefacts hidden.
No reset possible 8843
1121
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:47 Artefacts hidden.
No reset possible 8779
1113
Krishna Murthy Jatavallabhula 🇨🇦gym_duckietown + opencv aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:02 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -1.3001166402101516, "good_angle": 54.33312579346873, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 14.76666666666666}, "ep001": {"nsteps": 500, "reward": -1.753247414290905, "good_angle": 13.710775419046632, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.899999999999988}, "ep002": {"nsteps": 500, "reward": -1.575717896401882, "good_angle": 13.670340188871982, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.866666666666656}, "ep003": {"nsteps": 500, "reward": -1.3772086943238973, "good_angle": 13.665101915411906, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 13.03333333333332}, "ep004": {"nsteps": 500, "reward": -1.6150296981334686, "good_angle": 13.667722647448745, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.966666666666653}}good_angle_max 54.33312579346873 good_angle_mean 21.809413192849597 good_angle_median 13.670340188871982 good_angle_min 13.665101915411906 reward_max -1.3001166402101516 reward_mean -1.524264068672061 reward_median -1.575717896401882 reward_min -1.753247414290905 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 1 traveled_tiles_mean 1 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 14.76666666666666 valid_direction_mean 13.306666666666654 valid_direction_median 12.966666666666653 valid_direction_min 12.866666666666656
No reset possible 8769
1112
Krishna Murthy Jatavallabhula 🇨🇦gym_duckietown + opencv aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:23 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -1.30106463265419, "good_angle": 54.24953817239443, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 14.833333333333323}, "ep001": {"nsteps": 500, "reward": -1.753247414290905, "good_angle": 13.665899235634638, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.899999999999988}, "ep002": {"nsteps": 500, "reward": -1.575717896401882, "good_angle": 13.670340188871982, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.866666666666656}, "ep003": {"nsteps": 500, "reward": -1.3772086943238973, "good_angle": 13.665101915411906, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 13.03333333333332}, "ep004": {"nsteps": 500, "reward": -1.6150296981334686, "good_angle": 13.667722647448745, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.966666666666653}}good_angle_max 54.24953817239443 good_angle_mean 21.78372043195234 good_angle_median 13.667722647448745 good_angle_min 13.665101915411906 reward_max -1.30106463265419 reward_mean -1.5244536671608688 reward_median -1.575717896401882 reward_min -1.753247414290905 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 1 traveled_tiles_mean 1 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 14.833333333333323 valid_direction_mean 13.319999999999988 valid_direction_median 12.966666666666653 valid_direction_min 12.866666666666656
No reset possible 8740
1108
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:18:37 Artefacts hidden.
driven_lanedir_median 4.770405108500412 deviation-center-line_median 1.2882499664978988 in-drivable-lane_median 1.533333333333328
other stats deviation-center-line_max 2.6795700709364354 deviation-center-line_mean 1.6171486262204433 deviation-center-line_min 0.7957689340509382 deviation-heading_max 3.3901748526040403 deviation-heading_mean 2.315147760415663 deviation-heading_median 2.325774299185338 deviation-heading_min 1.1332800828333809 driven_any_max 5.648673734230038 driven_any_mean 4.91142241599376 driven_any_median 5.287518881534349 driven_any_min 2.9804186489360935 driven_lanedir_max 5.620425621008038 driven_lanedir_mean 4.396179359628715 driven_lanedir_min 2.363778209447126 in-drivable-lane_max 2.266666666666673 in-drivable-lane_mean 1.2533333333333323 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 5.17202384603784, "driven_lanedir": 4.287933297769848, "in-drivable-lane": 2.266666666666673, "deviation-heading": 3.3901748526040403, "deviation-center-line": 1.2882499664978988}, "ep001": {"driven_any": 5.4684769692304815, "driven_lanedir": 4.770405108500412, "in-drivable-lane": 1.6666666666666607, "deviation-heading": 2.4959908952433354, "deviation-center-line": 2.212813773513863}, "ep002": {"driven_any": 2.9804186489360935, "driven_lanedir": 2.363778209447126, "in-drivable-lane": 1.533333333333328, "deviation-heading": 2.325774299185338, "deviation-center-line": 0.7957689340509382}, "ep003": {"driven_any": 5.287518881534349, "driven_lanedir": 4.93835456141815, "in-drivable-lane": 0.8000000000000007, "deviation-heading": 2.2305186722122214, "deviation-center-line": 1.1093403861030806}, "ep004": {"driven_any": 5.648673734230038, "driven_lanedir": 5.620425621008038, "in-drivable-lane": 0, "deviation-heading": 1.1332800828333809, "deviation-center-line": 2.6795700709364354}}
No reset possible 8645
1087
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step4-viz timeout yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:31:01 Artefacts hidden.
No reset possible 8565
1075
Claudio Ruch Big Fleet Controller aido1_amod_service_quality_r1-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:40 Artefacts hidden.
service_quality -34.43219128592712
other stats efficiency -71.83220514370853 fleet_size -1000000000
No reset possible 8553
1072
Claudio Ruch Algorithm of Webinar aido1_amod_service_quality_r1-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:49 Timeout:
Waited 603.24469614 for container to finish. Giving up.
Artefacts hidden.
No reset possible 8542
1069
Claudio Ruch Webinar Algorithm aido1_amod_fleet_size_r1-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:18:16 Timeout:
Waited 604.03264308 for container to finish. Giving up.
Artefacts hidden.
No reset possible 8516
1065
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:26 Artefacts hidden.
No reset possible 8502
1063
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:19 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
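Several failed runs in this log end with this same message: the solution container comes up before the simulation server at host evaluator is reachable, and the slimremote client exhausts its internal retries. A pre-flight wait is one workaround; this sketch assumes the server speaks TCP on a known port (the actual duckietown-slimremote port is not shown in the log):

    import socket
    import time

    def wait_for_server(host="evaluator", port=8902, timeout=60.0):
        """Poll until host:port accepts TCP connections or timeout elapses."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            try:
                socket.create_connection((host, port), timeout=2.0).close()
                return True
            except (socket.error, OSError):
                time.sleep(1.0)
        return False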
Artefacts hidden.
No reset possible 8496
1061
Mandana Samiei 🇨🇦bazinga!!! aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:09 Artefacts hidden.
No reset possible 8487
1059
Mandana Samiei 🇨🇦bazinga!!! aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:12 Artefacts hidden.
driven_lanedir_median 0.18432958645266373 deviation-center-line_median 0.1903067388339497 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3638369144072225 deviation-center-line_mean 0.20281651861136468 deviation-center-line_min 0.12969935361256285 deviation-heading_max 1.764454269158792 deviation-heading_mean 0.7934173788539537 deviation-heading_median 0.6420676885644672 deviation-heading_min 0.21000204087185195 driven_any_max 0.8143193326557515 driven_any_mean 0.31857438282486134 driven_any_median 0.1999837410490108 driven_any_min 0.15712882070308834 driven_lanedir_max 0.5781194172899131 driven_lanedir_mean 0.2592283223446602 driven_lanedir_min 0.1361674384288447 in-drivable-lane_max 1.8666666666666691 in-drivable-lane_mean 0.37333333333333385 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.8143193326557515, "driven_lanedir": 0.5781194172899131, "in-drivable-lane": 1.8666666666666691, "deviation-heading": 1.764454269158792, "deviation-center-line": 0.3638369144072225}, "ep001": {"driven_any": 0.18572532904378736, "driven_lanedir": 0.18432958645266373, "in-drivable-lane": 0, "deviation-heading": 0.21000204087185195, "deviation-center-line": 0.1903067388339497}, "ep002": {"driven_any": 0.15712882070308834, "driven_lanedir": 0.1361674384288447, "in-drivable-lane": 0, "deviation-heading": 0.6420676885644672, "deviation-center-line": 0.12969935361256285}, "ep003": {"driven_any": 0.1999837410490108, "driven_lanedir": 0.17974964386648518, "in-drivable-lane": 0, "deviation-heading": 0.7291853471865158, "deviation-center-line": 0.1320024695877375}, "ep004": {"driven_any": 0.2357146906726685, "driven_lanedir": 0.2177755256853944, "in-drivable-lane": 0, "deviation-heading": 0.6213775484881418, "deviation-center-line": 0.1982371166153506}}
No reset possible 8479
1058
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:58 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 8475
1057
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:54 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.5926259270254522, "good_angle": 0.8191843059460885, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 1.8333333333333268}, "ep001": {"nsteps": 500, "reward": -0.5868085568649695, "good_angle": 44.97307514874745, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 6.566666666666675}, "ep002": {"nsteps": 500, "reward": -0.660754750397522, "good_angle": 0.9797513618123548, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 2.666666666666661}, "ep003": {"nsteps": 500, "reward": -0.6510454961033538, "good_angle": 0.842613897542074, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 2.3333333333333273}, "ep004": {"nsteps": 500, "reward": -0.47875257387198505, "good_angle": 29.73362753251532, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 4.3333333333333215}}good_angle_max 44.97307514874745 good_angle_mean 15.469650449312656 good_angle_median 0.9797513618123548 good_angle_min 0.8191843059460885 reward_max -0.47875257387198505 reward_mean -0.5939974608526565 reward_median -0.5926259270254522 reward_min -0.660754750397522 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 9 traveled_tiles_mean 9 traveled_tiles_median 9 traveled_tiles_min 9 valid_direction_max 6.566666666666675 valid_direction_mean 3.5466666666666624 valid_direction_median 2.666666666666661 valid_direction_min 1.8333333333333268
No reset possible 8465
1056
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:31 Artefacts hidden.
survival_time_median 3.6666666666666634
other stats episodes details {"ep000": {"nsteps": 34, "reward": -29.653921192765292, "good_angle": 0.5432750363228189, "survival_time": 1.1333333333333335, "traveled_tiles": 1, "valid_direction": 0.9666666666666668}, "ep001": {"nsteps": 34, "reward": -29.970305812709473, "good_angle": 0.0050401930608778515, "survival_time": 1.1333333333333335, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 110, "reward": -9.593723950873722, "good_angle": 0.43242677581517874, "survival_time": 3.6666666666666634, "traveled_tiles": 3, "valid_direction": 0.93333333333333}, "ep003": {"nsteps": 196, "reward": -5.411052850930362, "good_angle": 0.4479678779418729, "survival_time": 6.53333333333332, "traveled_tiles": 4, "valid_direction": 0.9666666666666632}, "ep004": {"nsteps": 148, "reward": -7.345814647617108, "good_angle": 8.8217135095513, "survival_time": 4.933333333333326, "traveled_tiles": 3, "valid_direction": 1.699999999999994}}good_angle_max 8.8217135095513 good_angle_mean 2.05008467853841 good_angle_median 0.4479678779418729 good_angle_min 0.0050401930608778515 reward_max -5.411052850930362 reward_mean -16.39496369097919 reward_median -9.593723950873722 reward_min -29.970305812709473 survival_time_max 6.53333333333332 survival_time_mean 3.4799999999999955 survival_time_min 1.1333333333333335 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 1.699999999999994 valid_direction_mean 0.9133333333333308 valid_direction_median 0.9666666666666632 valid_direction_min 0
No reset possible 8449
1055
Benjamin Ramtoula 🇨🇦My ROS solution - param 3 aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:38 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
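Here the evaluator gave up after 600 s because the solution never produced /challenge-solution-output/output-solution.yaml, the file it blocks on (per the message above); a solution that hangs, or exits without reporting, fails exactly this way. A last-resort sketch that writes the file directly; the path comes from the error text, while the payload is a placeholder rather than the real output schema:

    import os
    import yaml  # PyYAML, assumed available in the solution image

    out_dir = "/challenge-solution-output"
    if not os.path.isdir(out_dir):
        os.makedirs(out_dir)
    with open(os.path.join(out_dir, "output-solution.yaml"), "w") as f:
        yaml.safe_dump({"status": "placeholder"}, f)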
Artefacts hidden.
No reset possible 8429
1052
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:24 Artefacts hidden.
survival_time_median 1.9000000000000024
other stats episodes details {"ep000": {"nsteps": 228, "reward": -4.544706287524222, "good_angle": 0.7055388557312879, "survival_time": 7.599999999999983, "traveled_tiles": 2, "valid_direction": 0.7999999999999974}, "ep001": {"nsteps": 18, "reward": -56.22312803235319, "good_angle": 0.4065845464531816, "survival_time": 0.6, "traveled_tiles": 1, "valid_direction": 0.4666666666666666}, "ep002": {"nsteps": 45, "reward": -22.69302364322874, "good_angle": 0.4408105114032268, "survival_time": 1.500000000000001, "traveled_tiles": 2, "valid_direction": 0.7333333333333347}, "ep003": {"nsteps": 57, "reward": -17.89524257875848, "good_angle": 0.5804233306775364, "survival_time": 1.9000000000000024, "traveled_tiles": 1, "valid_direction": 0.8000000000000024}, "ep004": {"nsteps": 69, "reward": -14.968996045382127, "good_angle": 0.3457390421344991, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0.8000000000000003}}good_angle_max 0.7055388557312879 good_angle_mean 0.4958192572799464 good_angle_median 0.4408105114032268 good_angle_min 0.3457390421344991 reward_max -4.544706287524222 reward_mean -23.26501931744935 reward_median -17.89524257875848 reward_min -56.22312803235319 survival_time_max 7.599999999999983 survival_time_mean 2.7799999999999976 survival_time_min 0.6 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 0.8000000000000024 valid_direction_mean 0.7200000000000003 valid_direction_median 0.7999999999999974 valid_direction_min 0.4666666666666666
No reset possible 8401
1048
Mandana Samiei 🇨🇦Solution template aido1_luck-v3
step1 success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:32:00 Artefacts hidden.
No reset possible 8354
1042
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step3-videos error yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:43:22 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 623, in _make_api_call
raise error_class(parsed_response, operation_name)
ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
Artefacts hidden.
No reset possible 8329
1037
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:31 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 89, in solve
observation, reward, done, info = env.step(action)
File "/workspace/wrappers.py", line 252, in step
return self.step_wait()
File "/workspace/wrappers.py", line 271, in step_wait
results = [env.step(a) for (a,env) in zip(self.actions, self.envs)]
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 8280
1027
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:24:31 Timeout:
Waited 643.184754848 for container to finish. Giving up.
Artefacts hidden.
No reset possible 8276
1025
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:33 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 89, in solve
observation, reward, done, info = env.step(action)
File "/workspace/wrappers.py", line 252, in step
return self.step_wait()
File "/workspace/wrappers.py", line 271, in step_wait
results = [env.step(a) for (a,env) in zip(self.actions, self.envs)]
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 8237
1016
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:01 Artefacts hidden.
survival_time_median 7.533333333333316
other stats episodes details {"ep000": {"nsteps": 76, "reward": -13.416412785493058, "good_angle": 0.9723576108709706, "survival_time": 2.533333333333334, "traveled_tiles": 1, "valid_direction": 2.333333333333334}, "ep001": {"nsteps": 108, "reward": -9.848042460503402, "good_angle": 2.041641545621004, "survival_time": 3.599999999999997, "traveled_tiles": 2, "valid_direction": 0.5666666666666647}, "ep002": {"nsteps": 226, "reward": -4.574839447271614, "good_angle": 1.5760425170941503, "survival_time": 7.533333333333316, "traveled_tiles": 5, "valid_direction": 2.5999999999999908}, "ep003": {"nsteps": 500, "reward": -0.13448444758018013, "good_angle": 0.6536031283623902, "survival_time": 16.666666666666654, "traveled_tiles": 6, "valid_direction": 2.06666666666666}, "ep004": {"nsteps": 282, "reward": -4.769903202343019, "good_angle": 16.060156931528677, "survival_time": 9.399999999999975, "traveled_tiles": 3, "valid_direction": 4.166666666666653}}good_angle_max 16.060156931528677 good_angle_mean 4.260760346695439 good_angle_median 1.5760425170941503 good_angle_min 0.6536031283623902 reward_max -0.13448444758018013 reward_mean -6.548736468638255 reward_median -4.769903202343019 reward_min -13.416412785493058 survival_time_max 16.666666666666654 survival_time_mean 7.946666666666656 survival_time_min 2.533333333333334 traveled_tiles_max 6 traveled_tiles_mean 3.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 4.166666666666653 valid_direction_mean 2.3466666666666605 valid_direction_median 2.333333333333334 valid_direction_min 0.5666666666666647
No reset possible 8227
1015
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:30 Artefacts hidden.
driven_lanedir_median 0.08394985266846833 deviation-center-line_median 0.0778449551053503 in-drivable-lane_median 0
other stats deviation-center-line_max 0.16898661366894904 deviation-center-line_mean 0.10439867905289547 deviation-center-line_min 0.07248195277312836 deviation-heading_max 1.7253708370840477 deviation-heading_mean 0.7875510991650424 deviation-heading_median 0.5037614413554353 deviation-heading_min 0.4630156498742613 driven_any_max 0.7513848549189196 driven_any_mean 0.2360831876768089 driven_any_median 0.10707211462789996 driven_any_min 0.0994549271256484 driven_lanedir_max 0.18517716028379183 driven_lanedir_mean 0.10393327662038757 driven_lanedir_min 0.07534926778880213 in-drivable-lane_max 3.1999999999999984 in-drivable-lane_mean 0.6399999999999997 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7513848549189196, "driven_lanedir": 0.18517716028379183, "in-drivable-lane": 3.1999999999999984, "deviation-heading": 1.7253708370840477, "deviation-center-line": 0.16898661366894904}, "ep001": {"driven_any": 0.10707211462789996, "driven_lanedir": 0.09269693187246995, "in-drivable-lane": 0, "deviation-heading": 0.5037614413554353, "deviation-center-line": 0.1279065617281198}, "ep002": {"driven_any": 0.0994549271256484, "driven_lanedir": 0.07534926778880213, "in-drivable-lane": 0, "deviation-heading": 0.4951494830918272, "deviation-center-line": 0.07248195277312836}, "ep003": {"driven_any": 0.11762045880819127, "driven_lanedir": 0.08249317048840554, "in-drivable-lane": 0, "deviation-heading": 0.7504580844196407, "deviation-center-line": 0.07477331198892995}, "ep004": {"driven_any": 0.1048835829033852, "driven_lanedir": 0.08394985266846833, "in-drivable-lane": 0, "deviation-heading": 0.4630156498742613, "deviation-center-line": 0.0778449551053503}}
No reset possible 8214
1013
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:51 Artefacts hidden.
driven_lanedir_median 1.045501921968241 deviation-center-line_median 0.2091624126838803 in-drivable-lane_median 0.6666666666666643
other stats deviation-center-line_max 0.4092460733590128 deviation-center-line_mean 0.22209535103391645 deviation-center-line_min 0.09069371930535795 deviation-heading_max 1.9898991959179324 deviation-heading_mean 0.94011531750907 deviation-heading_median 0.9544381615223942 deviation-heading_min 0.21977299030009587 driven_any_max 4.767455751110348 driven_any_mean 2.001900079658949 driven_any_median 1.441799742533946 driven_any_min 0.4699652285178057 driven_lanedir_max 3.5709948744457645 driven_lanedir_mean 1.5296434476594245 driven_lanedir_min 0.3889697115797952 in-drivable-lane_max 1.233333333333329 in-drivable-lane_mean 0.6866666666666648 in-drivable-lane_min 0.16666666666666718 per-episodes details {"ep000": {"driven_any": 2.345456280523682, "driven_lanedir": 1.78342391275278, "in-drivable-lane": 1.0666666666666644, "deviation-heading": 1.272508723248177, "deviation-center-line": 0.3002014448323916}, "ep001": {"driven_any": 0.4699652285178057, "driven_lanedir": 0.3889697115797952, "in-drivable-lane": 0.16666666666666718, "deviation-heading": 0.26395751655675054, "deviation-center-line": 0.1011731049889396}, "ep002": {"driven_any": 1.441799742533946, "driven_lanedir": 1.045501921968241, "in-drivable-lane": 0.6666666666666643, "deviation-heading": 0.9544381615223942, "deviation-center-line": 0.2091624126838803}, "ep003": {"driven_any": 4.767455751110348, "driven_lanedir": 3.5709948744457645, "in-drivable-lane": 1.233333333333329, "deviation-heading": 1.9898991959179324, "deviation-center-line": 0.4092460733590128}, "ep004": {"driven_any": 0.984823395608963, "driven_lanedir": 0.85932681755054, "in-drivable-lane": 0.29999999999999893, "deviation-heading": 0.21977299030009587, "deviation-center-line": 0.09069371930535795}}
No reset possible 8203
1013
David Abraham Pytorch IL aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:25 Artefacts hidden.
survival_time_median 3.8666666666666623
other stats episodes details {"ep000": {"nsteps": 190, "reward": -5.205830141776977, "good_angle": 1.396838191834734, "survival_time": 6.333333333333321, "traveled_tiles": 4, "valid_direction": 0.7333333333333308}, "ep001": {"nsteps": 46, "reward": -22.44177474923756, "good_angle": 0.6096318451873292, "survival_time": 1.5333333333333348, "traveled_tiles": 1, "valid_direction": 0.33333333333333437}, "ep002": {"nsteps": 116, "reward": -8.61695077078987, "good_angle": 1.3546548994691416, "survival_time": 3.8666666666666623, "traveled_tiles": 3, "valid_direction": 0.6666666666666643}, "ep003": {"nsteps": 339, "reward": -2.86965735018072, "good_angle": 1.3466075416605623, "survival_time": 11.29999999999997, "traveled_tiles": 7, "valid_direction": 0.7999999999999983}, "ep004": {"nsteps": 87, "reward": -11.927641337501932, "good_angle": 0.7192143058080895, "survival_time": 2.8999999999999995, "traveled_tiles": 2, "valid_direction": 0.466666666666665}}good_angle_max 1.396838191834734 good_angle_mean 1.0853893567919712 good_angle_median 1.3466075416605623 good_angle_min 0.6096318451873292 reward_max -2.86965735018072 reward_mean -10.212370869897413 reward_median -8.61695077078987 reward_min -22.44177474923756 survival_time_max 11.29999999999997 survival_time_mean 5.186666666666658 survival_time_min 1.5333333333333348 traveled_tiles_max 7 traveled_tiles_mean 3.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 0.7999999999999983 valid_direction_mean 0.5999999999999985 valid_direction_median 0.6666666666666643 valid_direction_min 0.33333333333333437
No reset possible 8194
1011
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:26 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 67, in solve
from model import A2CPG
File "/workspace/model.py", line 5, in <module>
import torch
ModuleNotFoundError: No module named 'torch'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 67, in solve
from model import A2CPG
File "/workspace/model.py", line 5, in <module>
import torch
ModuleNotFoundError: No module named 'torch'
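The root cause here is the container image, not the code: solution.py reaches model.py, which imports torch, but torch was never installed in the image; the fix is to add it to the image's Python dependencies and rebuild before submitting. An illustrative fail-fast guard (not from the submission) that reports missing modules at startup instead of mid-evaluation:

    import importlib.util
    import sys

    REQUIRED = ("torch",)  # extend with everything model.py imports

    for mod in REQUIRED:
        if importlib.util.find_spec(mod) is None:
            sys.exit("missing dependency %r: add it to the solution image" % mod)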
Artefacts hidden.
No reset possible 8189
1010
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:32 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 124, in run
solve(params, cis)
File "solution.py", line 45, in solve
env = DummyVecEnv([make_env(params)])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'NoneType' object is not callable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 124, in run
solve(params, cis)
File "solution.py", line 45, in solve
env = DummyVecEnv([make_env(params)])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'NoneType' object is not callable
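The traceback pins down the contract: DummyVecEnv (from the submission's own wrappers.py) invokes every list element as fn(), so it expects zero-argument callables, and 'NoneType' object is not callable means make_env(params) returned None instead of a thunk. A self-contained sketch of the contract, with a stand-in wrapper and a placeholder environment:

    class DummyVecEnv(object):
        """Stand-in mirroring /workspace/wrappers.py line 261."""
        def __init__(self, env_fns):
            self.envs = [fn() for fn in env_fns]  # each element must be callable

    def make_env(params):
        def _thunk():
            return object()  # placeholder: build and return the wrapped env here
        return _thunk  # forgetting this return makes make_env yield None

    vec = DummyVecEnv([make_env({})])  # correct: a list of zero-arg callables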
Artefacts hidden.
No reset possible 8177
1008
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:11 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
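Same contract, violated from the other direction: this time an already-constructed environment (a ScaleObservations wrapper) was passed where a callable was expected. Wrapping the instance in a lambda satisfies the fn() call; reusing the DummyVecEnv stand-in from the previous sketch:

    env = object()  # stands in for the ScaleObservations-wrapped environment
    vec = DummyVecEnv([lambda: env])  # correct
    # vec = DummyVecEnv([env])        # TypeError: ... object is not callable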
Artefacts hidden.
No reset possible 8171
1004
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:14 Artefacts hidden.
driven_lanedir_median 0.13312858444658682 deviation-center-line_median 0.08744010848352336 in-drivable-lane_median 8.366666666666639
other stats deviation-center-line_max 0.2951251593343002 deviation-center-line_mean 0.12729673166499883 deviation-center-line_min 0.047860468716545944 deviation-heading_max 1.2058790583016687 deviation-heading_mean 0.7178061254891885 deviation-heading_median 0.7905924332634966 deviation-heading_min 0.1745193388357215 driven_any_max 4.454184189442348 driven_any_mean 2.5257450968093726 driven_any_median 2.3537329967677016 driven_any_min 1.0662869877338133 driven_lanedir_max 0.4531051551321528 driven_lanedir_mean 0.193667880939661 driven_lanedir_min 0.07969926819667117 in-drivable-lane_max 15.999999999999988 in-drivable-lane_mean 9.086666666666648 in-drivable-lane_min 3.766666666666661 per-episodes details {"ep000": {"driven_any": 1.995026722965619, "driven_lanedir": 0.07969926819667117, "in-drivable-lane": 7.866666666666648, "deviation-heading": 0.1745193388357215, "deviation-center-line": 0.047860468716545944}, "ep001": {"driven_any": 2.3537329967677016, "driven_lanedir": 0.13312858444658682, "in-drivable-lane": 9.433333333333303, "deviation-heading": 1.2058790583016687, "deviation-center-line": 0.1298299749331926}, "ep002": {"driven_any": 4.454184189442348, "driven_lanedir": 0.18029873261228624, "in-drivable-lane": 15.999999999999988, "deviation-heading": 0.47847911565363, "deviation-center-line": 0.07622794685743216}, "ep003": {"driven_any": 1.0662869877338133, "driven_lanedir": 0.12210766431060804, "in-drivable-lane": 3.766666666666661, "deviation-heading": 0.7905924332634966, "deviation-center-line": 0.08744010848352336}, "ep004": {"driven_any": 2.7594945871373833, "driven_lanedir": 0.4531051551321528, "in-drivable-lane": 8.366666666666639, "deviation-heading": 0.9395606813914256, "deviation-center-line": 0.2951251593343002}}
No reset possible 8154
1003
Ruixiang Zhang 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:46 The result file is not found. This usually means that the evaluator did not finish
and sometimes that there was an import error.
Check the evaluator log to see what happened.
Artefacts hidden.
No reset possible 8134
1000
Ruixiang Zhang 🇨🇦stay simple aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:07 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -1.2171028180718422, "good_angle": 48.97482659061072, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 13.999999999999991}, "ep001": {"nsteps": 500, "reward": -1.6295727347135545, "good_angle": 12.537820502688945, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.366666666666664}, "ep002": {"nsteps": 500, "reward": -1.448073302745819, "good_angle": 12.631829880438866, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.566666666666665}, "ep003": {"nsteps": 500, "reward": -1.2542260830253362, "good_angle": 12.6059242905706, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.499999999999998}, "ep004": {"nsteps": 500, "reward": -1.4929706085324288, "good_angle": 12.700758660733726, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.666666666666664}}good_angle_max 48.97482659061072 good_angle_mean 19.890231985008572 good_angle_median 12.631829880438866 good_angle_min 12.537820502688945 reward_max -1.2171028180718422 reward_mean -1.408389109417796 reward_median -1.448073302745819 reward_min -1.6295727347135545 survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654 traveled_tiles_max 1 traveled_tiles_mean 1 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 13.999999999999991 valid_direction_mean 12.82 valid_direction_median 12.566666666666665 valid_direction_min 12.366666666666664
No reset possible 8131
999
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:13 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 86, in run
solve(params, cis) # let's try to solve the challenge,
File "solution.py", line 29, in solve
from model import TfInference
ImportError: cannot import name TfInference
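For context: in Python 2, "ImportError: cannot import name TfInference" means model.py was importable but defines no attribute with that name (typically a renamed, misspelled, or missing class). A hypothetical stub showing the shape solution.py appears to expect; the real template interface is not visible in this log:

    # model.py -- must define the exact name that solution.py imports.
    class TfInference(object):  # hypothetical stub, not the template's actual API
        def __init__(self, checkpoint_path):
            self.checkpoint_path = checkpoint_path

        def predict(self, observation):
            # Run the TensorFlow graph on one observation (omitted in this stub).
            raise NotImplementedError

    # solution.py can then resolve:
    #     from model import TfInference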
Artefacts hidden.
No reset possible 8128
997
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:25 Artefacts hidden.
driven_lanedir_median 3.475756387262958 deviation-center-line_median 0.49560335850959286 in-drivable-lane_median 0.4999999999999982
other stats deviation-center-line_max 0.6065490419207544 deviation-center-line_mean 0.4244847081266988 deviation-center-line_min 0.14133001541463242 deviation-heading_max 2.0975060124105807 deviation-heading_mean 1.3631473642597385 deviation-heading_median 1.3679922168602072 deviation-heading_min 0.489578171429585 driven_any_max 4.919781794687056 driven_any_mean 3.5985282374288516 driven_any_median 3.7340983940482646 driven_any_min 1.7385862129427283 driven_lanedir_max 4.139898140690089 driven_lanedir_mean 3.043896426628955 driven_lanedir_min 1.3724302605954322 in-drivable-lane_max 1.0666666666666638 in-drivable-lane_mean 0.6666666666666645 in-drivable-lane_min 0.2666666666666657 per-episodes details {"ep000": {"driven_any": 4.454081912764259, "driven_lanedir": 3.5609636276616254, "in-drivable-lane": 1.066666666666663, "deviation-heading": 2.0975060124105807, "deviation-center-line": 0.49560335850959286}, "ep001": {"driven_any": 3.1460928727019506, "driven_lanedir": 2.6704337169346704, "in-drivable-lane": 0.4999999999999982, "deviation-heading": 1.187324127325582, "deviation-center-line": 0.36788207121595706}, "ep002": {"driven_any": 3.7340983940482646, "driven_lanedir": 3.475756387262958, "in-drivable-lane": 0.2666666666666657, "deviation-heading": 1.3679922168602072, "deviation-center-line": 0.5110590535725572}, "ep003": {"driven_any": 4.919781794687056, "driven_lanedir": 4.139898140690089, "in-drivable-lane": 1.0666666666666638, "deviation-heading": 1.6733362932727374, "deviation-center-line": 0.6065490419207544}, "ep004": {"driven_any": 1.7385862129427283, "driven_lanedir": 1.3724302605954322, "in-drivable-lane": 0.4333333333333318, "deviation-heading": 0.489578171429585, "deviation-center-line": 0.14133001541463242}}
No reset possible 8117
995
Manfred Diaz DAgger aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:41 Artefacts hidden.
driven_lanedir_median 4.5823839291727095 deviation-center-line_median 0.6555775253964764 in-drivable-lane_median 1.9333333333333569
other stats deviation-center-line_max 0.7478136342323259 deviation-center-line_mean 0.5131483896272906 deviation-center-line_min 0.01634856632848061 deviation-heading_max 2.88546402769535 deviation-heading_mean 2.00166574551457 deviation-heading_median 2.456345551636541 deviation-heading_min 0.11742828899154267 driven_any_max 5.511832628995549 driven_any_mean 5.419852401357481 driven_any_median 5.471390396157749 driven_any_min 5.152325443868452 driven_lanedir_max 5.071319658152142 driven_lanedir_mean 3.744807265656315 driven_lanedir_min -0.004230523974507516 in-drivable-lane_max 16.43333333333332 in-drivable-lane_mean 4.513333333333337 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 5.509384097521629, "driven_lanedir": 4.407508082933724, "in-drivable-lane": 2.466666666666681, "deviation-heading": 2.774756167413308, "deviation-center-line": 0.7368158955986679}, "ep001": {"driven_any": 5.511832628995549, "driven_lanedir": -0.004230523974507516, "in-drivable-lane": 16.43333333333332, "deviation-heading": 0.11742828899154267, "deviation-center-line": 0.01634856632848061}, "ep002": {"driven_any": 5.471390396157749, "driven_lanedir": 4.667055181997508, "in-drivable-lane": 1.733333333333328, "deviation-heading": 2.88546402769535, "deviation-center-line": 0.7478136342323259}, "ep003": {"driven_any": 5.454329440244028, "driven_lanedir": 4.5823839291727095, "in-drivable-lane": 1.9333333333333569, "deviation-heading": 2.456345551636541, "deviation-center-line": 0.6555775253964764}, "ep004": {"driven_any": 5.152325443868452, "driven_lanedir": 5.071319658152142, "in-drivable-lane": 0, "deviation-heading": 1.7743346918361105, "deviation-center-line": 0.4091863265805024}}
No reset possible 8095
994
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:07 Timeout:
Waited 603.670032024 for container to finish. Giving up.
Artefacts hidden.
No reset possible 8073
989
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:13:40 Artefacts hidden.
No reset possible 8003
977
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:32 Artefacts hidden.
driven_lanedir_median 0.5617032380115112 deviation-center-line_median 0.14308533804851947 in-drivable-lane_median 0.16666666666666607
other stats deviation-center-line_max 0.769540697826587 deviation-center-line_mean 0.24717872572231145 deviation-center-line_min 0.08229304132893547 deviation-heading_max 1.0474203473264154 deviation-heading_mean 0.5589297051313892 deviation-heading_median 0.5746229109495427 deviation-heading_min 0.0638008653416417 driven_any_max 2.221621798459929 driven_any_mean 0.9336463312269558 driven_any_median 0.7983162016184838 driven_any_min 0.3399979234285063 driven_lanedir_max 2.096862852800724 driven_lanedir_mean 0.8285242118983527 driven_lanedir_min 0.33872968824241 in-drivable-lane_max 0.4666666666666681 in-drivable-lane_mean 0.16666666666666682 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7983162016184838, "driven_lanedir": 0.6497025332567524, "in-drivable-lane": 0.2, "deviation-heading": 0.5746229109495427, "deviation-center-line": 0.14308533804851947}, "ep001": {"driven_any": 0.3399979234285063, "driven_lanedir": 0.33872968824241, "in-drivable-lane": 0, "deviation-heading": 0.0638008653416417, "deviation-center-line": 0.08229304132893547}, "ep002": {"driven_any": 0.4983265844759663, "driven_lanedir": 0.4956227471803656, "in-drivable-lane": 0, "deviation-heading": 0.0971725872124701, "deviation-center-line": 0.08844675530989567}, "ep003": {"driven_any": 2.221621798459929, "driven_lanedir": 2.096862852800724, "in-drivable-lane": 0.16666666666666607, "deviation-heading": 1.0474203473264154, "deviation-center-line": 0.769540697826587}, "ep004": {"driven_any": 0.8099691481518935, "driven_lanedir": 0.5617032380115112, "in-drivable-lane": 0.4666666666666681, "deviation-heading": 1.011631814826876, "deviation-center-line": 0.15252779609761974}}
No reset possible 8000
977
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:42 Artefacts hidden.
survival_time_median 1.8000000000000025
other stats episodes details {"ep000": {"nsteps": 54, "reward": -18.739795033500908, "good_angle": 0.17111099844596908, "survival_time": 1.8000000000000025, "traveled_tiles": 2, "valid_direction": 0.6666666666666665}, "ep001": {"nsteps": 23, "reward": -44.04389274120331, "good_angle": 0.007825117906309196, "survival_time": 0.7666666666666666, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 31, "reward": -32.69075833405218, "good_angle": 0.013064824326917994, "survival_time": 1.0333333333333332, "traveled_tiles": 2, "valid_direction": 0}, "ep003": {"nsteps": 169, "reward": -6.273822177544556, "good_angle": 0.4380316878030926, "survival_time": 5.633333333333323, "traveled_tiles": 4, "valid_direction": 1.2999999999999965}, "ep004": {"nsteps": 70, "reward": -14.708262841722794, "good_angle": 1.2340384580676431, "survival_time": 2.333333333333335, "traveled_tiles": 2, "valid_direction": 1.7000000000000015}}good_angle_max 1.2340384580676431 good_angle_mean 0.3728142173099864 good_angle_median 0.17111099844596908 good_angle_min 0.007825117906309196 reward_max -6.273822177544556 reward_mean -23.29130622560475 reward_median -18.739795033500908 reward_min -44.04389274120331 survival_time_max 5.633333333333323 survival_time_mean 2.3133333333333317 survival_time_min 0.7666666666666666 traveled_tiles_max 4 traveled_tiles_mean 2.2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.7000000000000015 valid_direction_mean 0.7333333333333328 valid_direction_median 0.6666666666666665 valid_direction_min 0
No reset possible 7983
976
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:31 Artefacts hidden.
No reset possible 7979
975
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:41 Artefacts hidden.
survival_time_median 6.566666666666653
other stats episodes details {"ep000": {"nsteps": 179, "reward": -6.143844588318912, "good_angle": 0.20726566058568413, "survival_time": 5.966666666666655, "traveled_tiles": 4, "valid_direction": 0.6666666666666665}, "ep001": {"nsteps": 311, "reward": -3.7829405356683674, "good_angle": 7.451692345729561, "survival_time": 10.36666666666664, "traveled_tiles": 5, "valid_direction": 3.5999999999999917}, "ep002": {"nsteps": 58, "reward": -18.3334619030517, "good_angle": 1.1586740658805823, "survival_time": 1.933333333333336, "traveled_tiles": 2, "valid_direction": 1.8666666666666691}, "ep003": {"nsteps": 197, "reward": -5.360624333171098, "good_angle": 0.3756824015196448, "survival_time": 6.566666666666653, "traveled_tiles": 5, "valid_direction": 1.099999999999996}, "ep004": {"nsteps": 203, "reward": -5.274989076942811, "good_angle": 10.522511174400554, "survival_time": 6.766666666666652, "traveled_tiles": 5, "valid_direction": 2.0666666666666593}}good_angle_max 10.522511174400554 good_angle_mean 3.9431651296232055 good_angle_median 1.1586740658805823 good_angle_min 0.20726566058568413 reward_max -3.7829405356683674 reward_mean -7.779172087430576 reward_median -5.360624333171098 reward_min -18.3334619030517 survival_time_max 10.36666666666664 survival_time_mean 6.319999999999988 survival_time_min 1.933333333333336 traveled_tiles_max 5 traveled_tiles_mean 4.2 traveled_tiles_median 5 traveled_tiles_min 2 valid_direction_max 3.5999999999999917 valid_direction_mean 1.8599999999999963 valid_direction_median 1.8666666666666691 valid_direction_min 0.6666666666666665
No reset possible 7972
974
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:36 Artefacts hidden.
driven_lanedir_median 0.6265215747606048 deviation-center-line_median 0.607494048957489 in-drivable-lane_median 0
other stats deviation-center-line_max 0.6495016046287412 deviation-center-line_mean 0.5309295980722656 deviation-center-line_min 0.33273595264346006 deviation-heading_max 1.546889597996067 deviation-heading_mean 0.6466674829044743 deviation-heading_median 0.4233646963244864 deviation-heading_min 0.2887009286973376 driven_any_max 1.1250794556315609 driven_any_mean 0.6536182871501773 driven_any_median 0.6286179118526486 driven_any_min 0.31430732071544254 driven_lanedir_max 0.9001668163287937 driven_lanedir_mean 0.6067702335617516 driven_lanedir_min 0.31267344503052197 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 1.1250794556315609, "driven_lanedir": 0.9001668163287937, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.546889597996067, "deviation-center-line": 0.6495016046287412}, "ep001": {"driven_any": 0.31430732071544254, "driven_lanedir": 0.31267344503052197, "in-drivable-lane": 0, "deviation-heading": 0.2887009286973376, "deviation-center-line": 0.33273595264346006}, "ep002": {"driven_any": 0.6286179118526486, "driven_lanedir": 0.6265215747606048, "in-drivable-lane": 0, "deviation-heading": 0.4233646963244864, "deviation-center-line": 0.607494048957489}, "ep003": {"driven_any": 0.7250534689028811, "driven_lanedir": 0.7214747889992279, "in-drivable-lane": 0, "deviation-heading": 0.5927178567152452, "deviation-center-line": 0.6202123184120287}, "ep004": {"driven_any": 0.4750332786483535, "driven_lanedir": 0.4730145426896099, "in-drivable-lane": 0, "deviation-heading": 0.38166433478923545, "deviation-center-line": 0.4447040657196087}}
No reset possible 7957
973
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:54 Artefacts hidden.
driven_lanedir_median 0.6505766153109622 deviation-center-line_median 0.16014963851655162 in-drivable-lane_median 0
other stats deviation-center-line_max 0.7603053620165483 deviation-center-line_mean 0.2564637341640809 deviation-center-line_min 0.09210699726460604 deviation-heading_max 0.9642614812629324 deviation-heading_mean 0.3996068763062016 deviation-heading_median 0.1951626949882956 deviation-heading_min 0.08068462255521815 driven_any_max 2.9665349579780176 driven_any_mean 1.03460971155806 driven_any_median 0.6566328495709304 driven_any_min 0.3366328495709288 driven_lanedir_max 2.923355747095626 driven_lanedir_mean 0.9885477295561832 driven_lanedir_min 0.3321162244136633 in-drivable-lane_max 0.3666666666666667 in-drivable-lane_mean 0.07333333333333333 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.8532550200589977, "driven_lanedir": 0.6787448401236269, "in-drivable-lane": 0.3666666666666667, "deviation-heading": 0.6046620566532065, "deviation-center-line": 0.1704259240979422}, "ep001": {"driven_any": 0.3599928806114263, "driven_lanedir": 0.3579452208370377, "in-drivable-lane": 0, "deviation-heading": 0.08068462255521815, "deviation-center-line": 0.09933074892475642}, "ep002": {"driven_any": 0.3366328495709288, "driven_lanedir": 0.3321162244136633, "in-drivable-lane": 0, "deviation-heading": 0.15326352607135502, "deviation-center-line": 0.09210699726460604}, "ep003": {"driven_any": 0.6566328495709304, "driven_lanedir": 0.6505766153109622, "in-drivable-lane": 0, "deviation-heading": 0.1951626949882956, "deviation-center-line": 0.16014963851655162}, "ep004": {"driven_any": 2.9665349579780176, "driven_lanedir": 2.923355747095626, "in-drivable-lane": 0, "deviation-heading": 0.9642614812629324, "deviation-center-line": 0.7603053620165483}}
No reset possible 7928
970
Martin Weiss 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:17:26 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 105, in run
solve(params, cis)
File "solution.py", line 51, in solve
import model
File "/workspace/model.py", line 107
SyntaxError: Non-ASCII character '\xce' in file /workspace/model.py on line 107, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
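A note on this failure: per PEP 263, a Python 2 source file containing any non-ASCII byte (here '\xce', the first byte of a UTF-8-encoded Greek letter) must declare its encoding on the first or second line. A minimal sketch of the fix:

    # -*- coding: utf-8 -*-
    # With the declaration above, Python 2 accepts non-ASCII source text;
    # without it, parsing stops with the SyntaxError shown in the log.
    ALPHA = u"α"  # a literal Greek letter is now legal in this file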
Artefacts hidden.
No reset possible 7925
968
Martin Weiss 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:26 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 105, in run
solve(params, cis)
File "solution.py", line 51, in solve
import model
File "/workspace/model.py", line 107
SyntaxError: Non-ASCII character '\xce' in file /workspace/model.py on line 107, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
Artefacts hidden.
No reset possible 7902
964
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz error yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:28:37 Timeout:
Waited 600.31933403 for container to finish. Giving up.
Artefacts hidden.
No reset possible 7882
960
Benjamin Ramtoula 🇨🇦My ROS solution - param 2 aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:13:37 Artefacts hidden.
driven_lanedir_median -0.05236453754182824 deviation-center-line_median 0.2099877184635163 in-drivable-lane_median 3.0999999999999917
other stats deviation-center-line_max 0.2921825358035191 deviation-center-line_mean 0.187967706799728 deviation-center-line_min 0.0989868940483092 deviation-heading_max 2.949954185574554 deviation-heading_mean 1.7476743354047506 deviation-heading_median 1.535667945515126 deviation-heading_min 0.6320226257699753 driven_any_max 0.41832743908947273 driven_any_mean 0.29244418607459205 driven_any_median 0.3210725109192495 driven_any_min 0.08072947721934053 driven_lanedir_max -0.0107304741593639 driven_lanedir_mean -0.04891939048690741 driven_lanedir_min -0.10155567397387522 in-drivable-lane_max 15.733333333333324 in-drivable-lane_mean 5.079999999999994 in-drivable-lane_min 0.7 per-episodes details {"ep000": {"driven_any": 0.4004767904332172, "driven_lanedir": -0.011198358120041885, "in-drivable-lane": 15.733333333333324, "deviation-heading": 0.6320226257699753, "deviation-center-line": 0.2099877184635163}, "ep001": {"driven_any": 0.08072947721934053, "driven_lanedir": -0.0107304741593639, "in-drivable-lane": 0.7, "deviation-heading": 0.6721443925459542, "deviation-center-line": 0.0989868940483092}, "ep002": {"driven_any": 0.41832743908947273, "driven_lanedir": -0.10155567397387522, "in-drivable-lane": 3.3666666666666587, "deviation-heading": 2.9485825276181443, "deviation-center-line": 0.23840697507328568}, "ep003": {"driven_any": 0.2416147127116804, "driven_lanedir": -0.05236453754182824, "in-drivable-lane": 2.499999999999995, "deviation-heading": 1.535667945515126, "deviation-center-line": 0.10027441061000993}, "ep004": {"driven_any": 0.3210725109192495, "driven_lanedir": -0.06874790863942781, "in-drivable-lane": 3.0999999999999917, "deviation-heading": 2.949954185574554, "deviation-center-line": 0.2921825358035191}}
No reset possible 7867
958
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:22 Timeout:
Waited 603.801175117 for container to finish. Giving up.
Artefacts hidden.
No reset possible 7835
955
Pravish Sainath 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:19:25 Artefacts hidden.
No reset possible 7818
953
Claudio Ruch Funky Controller 1 aido1_amod_fleet_size_r1-v3
step2-scoring aborted yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:48 Error while running [...]
Creating job7818-4625_aido-scorer_1 ... error
stderr | ERROR: for job7818-4625_aido-scorer_1 Cannot start service aido-scorer: network job7818-4625_evaluation not found
stderr |
stderr | ERROR: for aido-scorer Cannot start service aido-scorer: network job7818-4625_evaluation not found
stderr | Encountered errors while bringing up the project.
stderr |
Artefacts hidden.
No reset possible 7815
953
Claudio Ruch Funky Controller 1 aido1_amod_fleet_size_r1-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:30 Artefacts hidden.
No reset possible 7796
947
Claudio Ruch Java template aido1_amod_service_quality_r1-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:46 InvalidSubmission:
Traceback (most recent call last):
File "/amod/target/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/aidamod/target/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/aidamod/target/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "/project/solution.py", line 34, in run
subprocess.check_call(cmd, cwd=cwd, stdout=sys.stdout, stderr=sys.stderr)
File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['java', '-cp', 'classes/', 'aidamod.demo.AidoGuest', 'aido-host']' returned non-zero exit status 1
Artefacts hidden.
No reset possible 7784
945
Claudio Ruch Python template aido1_amod_fleet_size_r1-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:19:42 Artefacts hidden.
No reset possible 7755
938
Hristo Vrigazov 🇧🇬ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:49 Artefacts hidden.
driven_lanedir_median 0.22441378044490048 deviation-center-line_median 0.20461859560324655 in-drivable-lane_median 0
other stats deviation-center-line_max 0.35117867998151586 deviation-center-line_mean 0.2321390040111884 deviation-center-line_min 0.11717741145249154 deviation-heading_max 1.7689633985112407 deviation-heading_mean 0.8436685155746894 deviation-heading_median 0.6163705402862939 deviation-heading_min 0.47882103104580054 driven_any_max 0.792884836955395 driven_any_mean 0.3471504714895969 driven_any_median 0.24285936883078943 driven_any_min 0.1821423849156732 driven_lanedir_max 0.5524478189545186 driven_lanedir_mean 0.2841118190815721 driven_lanedir_min 0.1638919573009654 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.792884836955395, "driven_lanedir": 0.5524478189545186, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.7689633985112407, "deviation-center-line": 0.3441421247225109}, "ep001": {"driven_any": 0.3214341180106259, "driven_lanedir": 0.31362340026870106, "in-drivable-lane": 0, "deviation-heading": 0.47882103104580054, "deviation-center-line": 0.35117867998151586}, "ep002": {"driven_any": 0.1821423849156732, "driven_lanedir": 0.1638919573009654, "in-drivable-lane": 0, "deviation-heading": 0.5481629995268564, "deviation-center-line": 0.14357820829617726}, "ep003": {"driven_any": 0.19643164873550087, "driven_lanedir": 0.16618213843877516, "in-drivable-lane": 0, "deviation-heading": 0.8060246085032547, "deviation-center-line": 0.11717741145249154}, "ep004": {"driven_any": 0.24285936883078943, "driven_lanedir": 0.22441378044490048, "in-drivable-lane": 0, "deviation-heading": 0.6163705402862939, "deviation-center-line": 0.20461859560324655}}
No reset possible 7714
929
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:45 Artefacts hidden.
driven_lanedir_median 0.15452735404053652 deviation-center-line_median 0.2678767736384 in-drivable-lane_median 0.4000000000000008
other stats deviation-center-line_max 0.31082345672843653 deviation-center-line_mean 0.23635870772679252 deviation-center-line_min 0.11648018064560534 deviation-heading_max 2.5372186668491503 deviation-heading_mean 1.6780024160940694 deviation-heading_median 1.729976996227318 deviation-heading_min 0.8733144887328745 driven_any_max 1.9561262793008625 driven_any_mean 1.1234495227446186 driven_any_median 0.9265852125600964 driven_any_min 0.26993116944546164 driven_lanedir_max 0.3939344103988932 driven_lanedir_mean 0.15505554608366856 driven_lanedir_min -0.006124570623144088 in-drivable-lane_max 4.19999999999999 in-drivable-lane_mean 1.5199999999999982 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 1.9561262793008625, "driven_lanedir": 0.3939344103988932, "in-drivable-lane": 2.999999999999999, "deviation-heading": 2.0063800547895867, "deviation-center-line": 0.31082345672843653}, "ep001": {"driven_any": 0.26993116944546164, "driven_lanedir": -0.006124570623144088, "in-drivable-lane": 0, "deviation-heading": 0.8733144887328745, "deviation-center-line": 0.11648018064560534}, "ep002": {"driven_any": 1.8539052586165647, "driven_lanedir": 0.15452735404053652, "in-drivable-lane": 4.19999999999999, "deviation-heading": 2.5372186668491503, "deviation-center-line": 0.2678767736384}, "ep003": {"driven_any": 0.9265852125600964, "driven_lanedir": 0.20110015817026072, "in-drivable-lane": 0, "deviation-heading": 1.243121873871418, "deviation-center-line": 0.2956628607939545}, "ep004": {"driven_any": 0.6106996938001087, "driven_lanedir": 0.031840378431796434, "in-drivable-lane": 0.4000000000000008, "deviation-heading": 1.729976996227318, "deviation-center-line": 0.19095026682756633}}
No reset possible 7696
928
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:43 Artefacts hidden.
No reset possible 7694
927
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:42 The result file is not found. This usually means that the evaluator did not finish
and sometimes that there was an import error.
Check the evaluator log to see what happened.
Artefacts hidden.
No reset possible 7683
926
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:44 Artefacts hidden.
No reset possible 7677
926
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:09 Artefacts hidden.
No reset possible 7673
925
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:47 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 92, in run
solve(params, cis) # let's try to solve the challenge,
File "solution.py", line 50, in solve
if np.abs(action) > 0.8:
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
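For reference: np.abs(action) > 0.8 is an element-wise comparison that yields a boolean array, and Python's if cannot decide the truth of a multi-element array, which is exactly what the ValueError says. A minimal sketch of the usual fix (the clipping step is an assumption about intent, not the submission's actual logic):

    import numpy as np

    action = np.array([0.9, -0.2])  # e.g. a (velocity, steering) pair

    # Reduce the element-wise comparison to a single boolean explicitly:
    if np.any(np.abs(action) > 0.8):         # or np.all(...), depending on intent
        action = np.clip(action, -0.8, 0.8)  # illustrative handling only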
Artefacts hidden.
No reset possible 7669
924
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:05 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 92, in run
solve(params, cis) # let's try to solve the challenge,
File "solution.py", line 27, in solve
from model import TensorflowModel
File "/workspace/model.py", line 4, in <module>
from _layers import one_residual
ImportError: No module named _layers
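For context: "No module named _layers" means the helper module that model.py imports was never shipped into the container's import path. A hedged pre-flight check one could add (module and path names come from the traceback; the check itself is an assumption, written in Python 2 to match these logs):

    import imp

    # Fail fast with a clear message if a helper module is missing from the image.
    try:
        imp.find_module("_layers")
    except ImportError:
        raise SystemExit("_layers.py must be shipped alongside model.py in /workspace")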
Artefacts hidden.
No reset possible 7612
915
Ruixiang Zhang 🇨🇦stay simple aido1_LF1_r3-v3
step3-videos aborted yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 1:37:23 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/76656221fa8202d9d61f4910f7fc13344b2309186fd4b28b6dd5844e2cec2514"
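An aside on this evaluator-side failure: botocore raises EndpointConnectionError when the S3 endpoint is unreachable, and here it propagates out of upload_files and aborts the whole job. A minimal retry sketch around the aws_object.load() call from the traceback (the wrapper and its parameters are assumptions, not the runner's actual code):

    import time

    import botocore.exceptions

    def load_with_retries(aws_object, attempts=3, delay=2.0):
        # Retry transient endpoint failures instead of crashing the runner.
        for attempt in range(attempts):
            try:
                return aws_object.load()
            except botocore.exceptions.EndpointConnectionError:
                if attempt == attempts - 1:
                    raise  # give up after the final attempt
                time.sleep(delay * (attempt + 1))  # simple linear backoff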
Artefacts hidden.
No reset possible 7610
915
Ruixiang Zhang 🇨🇦stay simple aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:04 Artefacts hidden.
No reset possible 7605
914
Manfred Diaz Tensorflow template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:12 Artefacts hidden.
driven_lanedir_median 4.992468761476724 deviation-center-line_median 0.6972622730664987 in-drivable-lane_median 0.16666666666666607
other stats deviation-center-line_max 0.9008921652485434 deviation-center-line_mean 0.6745505335631848 deviation-center-line_min 0.32062786569173557 deviation-heading_max 2.7495979913439865 deviation-heading_mean 2.1170669624782383 deviation-heading_median 2.460817383756992 deviation-heading_min 1.21993034050232 driven_any_max 5.496788900325122 driven_any_mean 4.551286930889951 driven_any_median 5.1587489504322654 driven_any_min 1.51567855004027 driven_lanedir_max 5.1766735171540805 driven_lanedir_mean 4.323558621259602 driven_lanedir_min 1.3714743878852935 in-drivable-lane_max 0.8666666666666654 in-drivable-lane_mean 0.27999999999999936 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 5.496788900325122, "driven_lanedir": 4.992468761476724, "in-drivable-lane": 0.8666666666666654, "deviation-heading": 2.7495979913439865, "deviation-center-line": 0.8781617758575586}, "ep001": {"driven_any": 5.1587489504322654, "driven_lanedir": 4.9820201302259335, "in-drivable-lane": 0.16666666666666607, "deviation-heading": 2.567406093405443, "deviation-center-line": 0.6972622730664987}, "ep002": {"driven_any": 1.51567855004027, "driven_lanedir": 1.3714743878852935, "in-drivable-lane": 0.13333333333333286, "deviation-heading": 1.21993034050232, "deviation-center-line": 0.32062786569173557}, "ep003": {"driven_any": 5.439032905533301, "driven_lanedir": 5.1766735171540805, "in-drivable-lane": 0.2333333333333325, "deviation-heading": 2.460817383756992, "deviation-center-line": 0.9008921652485434}, "ep004": {"driven_any": 5.146185348118799, "driven_lanedir": 5.095156309555976, "in-drivable-lane": 0, "deviation-heading": 1.5875830033824527, "deviation-center-line": 0.5758085879515876}}
No reset possible 7496
895
Martin Weiss 🇨🇦PyTorch template aido1_LFV_r1-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:35 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 106, in run
solve(params, cis)
File "solution.py", line 72, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
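For reference, this is the duckietown-slimremote client's failsafe loop giving up: the agent container cannot reach the simulation server on the host named "evaluator". A quick reachability probe one could run before stepping the environment (Python 3 sketch; the port number is an assumption, as it does not appear in this log):

    import socket

    def server_reachable(host="evaluator", port=8902, timeout=2.0):
        # True if a TCP connection to the simulation server can be opened.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False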
Artefacts hidden.
No reset possible 7427
881
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:32 Artefacts hidden.
survival_time_median 3.9666666666666623
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.1105157022489002, "good_angle": 1.0447757462834268, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.466666666666658}, "ep001": {"nsteps": 30, "reward": -33.928936811288196, "good_angle": 0.006434561455823128, "survival_time": 1, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 119, "reward": -8.641524390074887, "good_angle": 0.7378925848713735, "survival_time": 3.9666666666666623, "traveled_tiles": 5, "valid_direction": 1.6666666666666623}, "ep003": {"nsteps": 333, "reward": -3.1504083392769657, "good_angle": 0.970298812755563, "survival_time": 11.099999999999971, "traveled_tiles": 9, "valid_direction": 1.7666666666666604}, "ep004": {"nsteps": 82, "reward": -12.565208740532398, "good_angle": 0.131403194538547, "survival_time": 2.7333333333333334, "traveled_tiles": 2, "valid_direction": 0.29999999999999893}}good_angle_max 1.0447757462834268 good_angle_mean 0.5781609799809466 good_angle_median 0.7378925848713735 good_angle_min 0.006434561455823128 reward_max -0.1105157022489002 reward_mean -11.679318796684267 reward_median -8.641524390074887 reward_min -33.928936811288196 survival_time_max 16.666666666666654 survival_time_mean 7.093333333333324 survival_time_min 1 traveled_tiles_max 12 traveled_tiles_mean 5.8 traveled_tiles_median 5 traveled_tiles_min 1 valid_direction_max 2.466666666666658 valid_direction_mean 1.239999999999996 valid_direction_median 1.6666666666666623 valid_direction_min 0
No reset possible 7423
879
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:26 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
Artefacts hidden.
No reset possible 7422
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:13 Artefacts hidden.
driven_lanedir_median 0.42583954081123343 deviation-center-line_median 0.17320688210284996 in-drivable-lane_median 0.8999999999999968
other stats deviation-center-line_max 0.2228829251353732 deviation-center-line_mean 0.16434571981296003 deviation-center-line_min 0.09093832662212982 deviation-heading_max 1.4075803299755223 deviation-heading_mean 0.9176843383697352 deviation-heading_median 0.8078204297304372 deviation-heading_min 0.5613042426861848 driven_any_max 0.9219548432356148 driven_any_mean 0.7299005046004915 driven_any_median 0.793164018584243 driven_any_min 0.4054401413209636 driven_lanedir_max 0.5721425621400456 driven_lanedir_mean 0.39567261648125274 driven_lanedir_min 0.19465232407579425 in-drivable-lane_max 1.699999999999994 in-drivable-lane_mean 1.0266666666666644 in-drivable-lane_min 0.7333333333333356 per-episodes details {"ep000": {"driven_any": 0.4054401413209636, "driven_lanedir": 0.19465232407579425, "in-drivable-lane": 0.7999999999999999, "deviation-heading": 1.4075803299755223, "deviation-center-line": 0.17320688210284996}, "ep001": {"driven_any": 0.9219548432356148, "driven_lanedir": 0.5721425621400456, "in-drivable-lane": 0.8999999999999968, "deviation-heading": 1.0430727836679292, "deviation-center-line": 0.2228829251353732}, "ep002": {"driven_any": 0.8734300505158462, "driven_lanedir": 0.42583954081123343, "in-drivable-lane": 1.699999999999994, "deviation-heading": 0.7686439057886024, "deviation-center-line": 0.14653797754600437}, "ep003": {"driven_any": 0.793164018584243, "driven_lanedir": 0.4434648915398909, "in-drivable-lane": 0.9999999999999964, "deviation-heading": 0.8078204297304372, "deviation-center-line": 0.18816248765844276}, "ep004": {"driven_any": 0.6555134693457899, "driven_lanedir": 0.3422637638392996, "in-drivable-lane": 0.7333333333333356, "deviation-heading": 0.5613042426861848, "deviation-center-line": 0.09093832662212982}}
No reset possible 7419
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:21 Artefacts hidden.
survival_time_median 3.2333333333333316
other stats episodes details {"ep000": {"nsteps": 71, "reward": -14.38939671668316, "good_angle": 0.7496343144478863, "survival_time": 2.366666666666668, "traveled_tiles": 1, "valid_direction": 1.8333333333333344}, "ep001": {"nsteps": 116, "reward": -8.956627277789861, "good_angle": 5.476824528401408, "survival_time": 3.8666666666666623, "traveled_tiles": 2, "valid_direction": 2.6999999999999957}, "ep002": {"nsteps": 110, "reward": -10.047834566732954, "good_angle": 1.5761100266806696, "survival_time": 3.6666666666666634, "traveled_tiles": 4, "valid_direction": 2.766666666666664}, "ep003": {"nsteps": 97, "reward": -10.94542623118056, "good_angle": 1.261777326348147, "survival_time": 3.2333333333333316, "traveled_tiles": 2, "valid_direction": 2.466666666666665}, "ep004": {"nsteps": 58, "reward": -18.156620403141552, "good_angle": 1.5198559032862808, "survival_time": 1.933333333333336, "traveled_tiles": 1, "valid_direction": 1.5000000000000029}}good_angle_max 5.476824528401408 good_angle_mean 2.1168404198328785 good_angle_median 1.5198559032862808 good_angle_min 0.7496343144478863 reward_max -8.956627277789861 reward_mean -12.499181039105618 reward_median -10.94542623118056 reward_min -18.156620403141552 survival_time_max 3.8666666666666623 survival_time_mean 3.0133333333333323 survival_time_min 1.933333333333336 traveled_tiles_max 4 traveled_tiles_mean 2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 2.766666666666664 valid_direction_mean 2.2533333333333325 valid_direction_median 2.466666666666665 valid_direction_min 1.5000000000000029
No reset possible 7412
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:11 Artefacts hidden.
No reset possible 7402
875
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:58 Artefacts hidden.
survival_time_median 2.3000000000000016
other stats episodes details {"ep000": {"nsteps": 227, "reward": -4.5744455482416875, "good_angle": 0.7937623130473065, "survival_time": 7.5666666666666496, "traveled_tiles": 2, "valid_direction": 1.2333333333333298}, "ep001": {"nsteps": 99, "reward": -10.711396231193737, "good_angle": 0.19533482846247435, "survival_time": 3.299999999999998, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 54, "reward": -18.98695876918457, "good_angle": 0.34666212011073927, "survival_time": 1.8000000000000025, "traveled_tiles": 2, "valid_direction": 0.6666666666666685}, "ep003": {"nsteps": 53, "reward": -19.22507929984691, "good_angle": 0.5550129967406443, "survival_time": 1.7666666666666688, "traveled_tiles": 1, "valid_direction": 0.9000000000000022}, "ep004": {"nsteps": 69, "reward": -14.967612463063087, "good_angle": 0.3386407174472176, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0.7}}good_angle_max 0.7937623130473065 good_angle_mean 0.4458825951616764 good_angle_median 0.34666212011073927 good_angle_min 0.19533482846247435 reward_max -4.5744455482416875 reward_mean -13.693098462306 reward_median -14.967612463063087 reward_min -19.22507929984691 survival_time_max 7.5666666666666496 survival_time_mean 3.346666666666664 survival_time_min 1.7666666666666688 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.2333333333333298 valid_direction_mean 0.7533333333333333 valid_direction_median 0.7 valid_direction_min 0.2666666666666657
No reset possible 7395
872
Patrick Pfreundschuh 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:04:13 Artefacts hidden.
driven_lanedir_median 1.1375737746774517 deviation-center-line_median 0.5233382188689624 in-drivable-lane_median 0.5333333333333314
other stats deviation-center-line_max 0.7196196419963142 deviation-center-line_mean 0.4850483715737888 deviation-center-line_min 0.2161094845563237 deviation-heading_max 4.133717663596858 deviation-heading_mean 2.05915676392721 deviation-heading_median 1.5952607622438522 deviation-heading_min 1.0170399968556776 driven_any_max 2.115708161909215 driven_any_mean 1.264357759309159 driven_any_median 1.176032842574574 driven_any_min 0.3707164126748642 driven_lanedir_max 2.042416367957612 driven_lanedir_mean 1.058483924035592 driven_lanedir_min 0.1608909824334681 in-drivable-lane_max 2.0666666666666593 in-drivable-lane_mean 0.6666666666666647 in-drivable-lane_min 0.033333333333333215 per-episodes details {"ep000": {"driven_any": 0.3707164126748642, "driven_lanedir": 0.1608909824334681, "in-drivable-lane": 0.6, "deviation-heading": 2.054758003125388, "deviation-center-line": 0.2161094845563237}, "ep001": {"driven_any": 1.0627573630453715, "driven_lanedir": 0.6853250703247407, "in-drivable-lane": 0.5333333333333314, "deviation-heading": 4.133717663596858, "deviation-center-line": 0.7196196419963142}, "ep002": {"driven_any": 1.176032842574574, "driven_lanedir": 1.1375737746774517, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 1.0170399968556776, "deviation-center-line": 0.5233382188689624}, "ep003": {"driven_any": 2.115708161909215, "driven_lanedir": 2.042416367957612, "in-drivable-lane": 0.09999999999999964, "deviation-heading": 1.5952607622438522, "deviation-center-line": 0.5403088586859086}, "ep004": {"driven_any": 1.5965740163417703, "driven_lanedir": 1.2662134247846877, "in-drivable-lane": 2.0666666666666593, "deviation-heading": 1.4950073938142772, "deviation-center-line": 0.4258656537614352}}
No reset possible 7390
872
Patrick Pfreundschuh 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:35 Artefacts hidden.
survival_time_median 8.466666666666647
other stats episodes details {"ep000": {"nsteps": 82, "reward": -12.451754005661869, "good_angle": 1.2095133155489088, "survival_time": 2.7333333333333334, "traveled_tiles": 1, "valid_direction": 2.2666666666666666}, "ep001": {"nsteps": 231, "reward": -5.429866900691739, "good_angle": 16.17046107357505, "survival_time": 7.699999999999982, "traveled_tiles": 2, "valid_direction": 3.966666666666653}, "ep002": {"nsteps": 254, "reward": -4.458102977005985, "good_angle": 0.9789399157082482, "survival_time": 8.466666666666647, "traveled_tiles": 3, "valid_direction": 2.2333333333333263}, "ep003": {"nsteps": 456, "reward": -2.1862141733201272, "good_angle": 1.3191079488592092, "survival_time": 15.199999999999957, "traveled_tiles": 4, "valid_direction": 2.5999999999999908}, "ep004": {"nsteps": 343, "reward": -3.453919063369423, "good_angle": 21.221826811575664, "survival_time": 11.433333333333303, "traveled_tiles": 3, "valid_direction": 3.93333333333332}}good_angle_max 21.221826811575664 good_angle_mean 8.179969813053415 good_angle_median 1.3191079488592092 good_angle_min 0.9789399157082482 reward_max -2.1862141733201272 reward_mean -5.595971424009829 reward_median -4.458102977005985 reward_min -12.451754005661869 survival_time_max 15.199999999999957 survival_time_mean 9.106666666666644 survival_time_min 2.7333333333333334 traveled_tiles_max 4 traveled_tiles_mean 2.6 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 3.966666666666653 valid_direction_mean 2.9999999999999916 valid_direction_median 2.5999999999999908 valid_direction_min 2.2333333333333263
No reset possible 7253
843
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:19 Artefacts hidden.
No reset possible 7246
843
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:50 Artefacts hidden.
No reset possible 7241
645
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:58 Artefacts hidden.
driven_lanedir_median 0.9667184269351564 deviation-center-line_median 1.2944816163430288 in-drivable-lane_median 0.4666666666666668
other stats deviation-center-line_max 1.332540552664937 deviation-center-line_mean 0.9557224490437596 deviation-center-line_min 0.3160574297309683 deviation-heading_max 12.275421338609751 deviation-heading_mean 8.530644290489814 deviation-heading_median 12.103540642741873 deviation-heading_min 1.351018203217863 driven_any_max 1.546688395233118 driven_any_mean 1.169644276395163 driven_any_median 1.5466883952331036 driven_any_min 0.5869961981908871 driven_lanedir_max 1.0061254000059208 driven_lanedir_mean 0.694241123541594 driven_lanedir_min 0.11832612271112986 in-drivable-lane_max 4.366666666666661 in-drivable-lane_mean 1.2666666666666655 in-drivable-lane_min 0.3999999999999986 per-episodes details {"ep000": {"driven_any": 0.5869961981908871, "driven_lanedir": 0.11832612271112986, "in-drivable-lane": 4.366666666666661, "deviation-heading": 1.351018203217863, "deviation-center-line": 0.3160574297309683}, "ep001": {"driven_any": 1.5466883952331036, "driven_lanedir": 1.0061254000059208, "in-drivable-lane": 0.3999999999999986, "deviation-heading": 12.103540642741873, "deviation-center-line": 1.31446479721917}, "ep002": {"driven_any": 1.546688395233118, "driven_lanedir": 0.9909278972008222, "in-drivable-lane": 0.4666666666666668, "deviation-heading": 12.26450711077056, "deviation-center-line": 1.332540552664937}, "ep003": {"driven_any": 0.6211599980856, "driven_lanedir": 0.38910777085494086, "in-drivable-lane": 0.4333333333333318, "deviation-heading": 4.6587341571090235, "deviation-center-line": 0.5210678492606929}, "ep004": {"driven_any": 1.5466883952331063, "driven_lanedir": 0.9667184269351564, "in-drivable-lane": 0.6666666666666696, "deviation-heading": 12.275421338609751, "deviation-center-line": 1.2944816163430288}}
No reset possible 7220
650
Aleksandar Petrov 🇨🇭Tuned lane controller - ETHZ baseline extension aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:28 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
Artefacts hidden.
No reset possible 7214
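This failure mode is the evaluator giving up on the solution container: wait_for_solution polls for the output YAML and raises InvalidSubmission after 600 seconds. A minimal sketch of such a polling loop (illustrative only, not the actual duckietown-challenges implementation):

import os
import time

def wait_for_output(path, timeout=600.0, poll_interval=2.0):
    # Poll until the solution writes its output file; give up after `timeout` seconds.
    deadline = time.monotonic() + timeout
    while not os.path.exists(path):
        if time.monotonic() >= deadline:
            raise TimeoutError(f"Timeout of {timeout} while waiting for {path}.")
        time.sleep(poll_interval)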
660
Iban Harlouchet 🇨🇦PyTorch template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:26 Artefacts hidden.
driven_lanedir_median 0.5458223400993552
deviation-center-line_median 0.26212997711576197
in-drivable-lane_median 2.0666666666666593
other stats:
deviation-center-line_max 0.3962071638699865
deviation-center-line_mean 0.2743979993149417
deviation-center-line_min 0.20004254124575735
deviation-heading_max 2.6022778962636233
deviation-heading_mean 1.412498611520158
deviation-heading_median 1.0136524880057494
deviation-heading_min 0.295096840433396
driven_any_max 1.333563920308659
driven_any_mean 0.8264650790111988
driven_any_median 0.9783774405801512
driven_any_min 0.20167580920654743
driven_lanedir_max 0.7269885579091051
driven_lanedir_mean 0.4444218498745909
driven_lanedir_min 0.12413007600733383
in-drivable-lane_max 4.86666666666665
in-drivable-lane_mean 2.06666666666666
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.3432699440183892, "driven_lanedir": 0.12413007600733383, "in-drivable-lane": 0.5666666666666667, "deviation-heading": 2.261737677546076, "deviation-center-line": 0.21732968207978215}
"ep001": {"driven_any": 0.20167580920654743, "driven_lanedir": 0.19687441374017167, "in-drivable-lane": 0, "deviation-heading": 0.295096840433396, "deviation-center-line": 0.20004254124575735}
"ep002": {"driven_any": 1.2754382809422478, "driven_lanedir": 0.7269885579091051, "in-drivable-lane": 2.8333333333333233, "deviation-heading": 2.6022778962636233, "deviation-center-line": 0.3962071638699865}
"ep003": {"driven_any": 1.333563920308659, "driven_lanedir": 0.6282938616169886, "in-drivable-lane": 4.86666666666665, "deviation-heading": 0.889728155351946, "deviation-center-line": 0.29628063226342055}
"ep004": {"driven_any": 0.9783774405801512, "driven_lanedir": 0.5458223400993552, "in-drivable-lane": 2.0666666666666593, "deviation-heading": 1.0136524880057494, "deviation-center-line": 0.26212997711576197}
No reset possible 7212
660
Iban Harlouchet 🇨🇦PyTorch template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:55 Artefacts hidden.
No reset possible 7209
660
Iban Harlouchet 🇨🇦PyTorch template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:26 Artefacts hidden.
survival_time_median 5.166666666666658
other stats:
episodes details:
"ep000": {"nsteps": 81, "reward": -12.645839158871016, "good_angle": 1.4584104424146562, "survival_time": 2.7, "traveled_tiles": 1, "valid_direction": 2.2333333333333334}
"ep001": {"nsteps": 56, "reward": -18.423857440905913, "good_angle": 0.06159650232063907, "survival_time": 1.8666666666666691, "traveled_tiles": 1, "valid_direction": 0}
"ep002": {"nsteps": 233, "reward": -4.7900857814775515, "good_angle": 2.824561802904649, "survival_time": 7.766666666666649, "traveled_tiles": 3, "valid_direction": 4.56666666666665}
"ep003": {"nsteps": 249, "reward": -5.198497911922663, "good_angle": 0.9720068717669708, "survival_time": 8.299999999999981, "traveled_tiles": 3, "valid_direction": 4.499999999999985}
"ep004": {"nsteps": 155, "reward": -7.320670593202475, "good_angle": 1.4980131252594056, "survival_time": 5.166666666666658, "traveled_tiles": 2, "valid_direction": 3.766666666666657}
good_angle_max 2.824561802904649
good_angle_mean 1.3629177489332642
good_angle_median 1.4584104424146562
good_angle_min 0.06159650232063907
reward_max -4.7900857814775515
reward_mean -9.675790177275925
reward_median -7.320670593202475
reward_min -18.423857440905913
survival_time_max 8.299999999999981
survival_time_mean 5.159999999999991
survival_time_min 1.8666666666666691
traveled_tiles_max 3
traveled_tiles_mean 2
traveled_tiles_median 2
traveled_tiles_min 1
valid_direction_max 4.56666666666665
valid_direction_mean 3.013333333333325
valid_direction_median 3.766666666666657
valid_direction_min 0
No reset possible 7203
660
Iban Harlouchet 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:07 Artefacts hidden.
No reset possible 7201
661
Gunshi Gupta 🇨🇦Template for ROS Submission aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:15 Artefacts hidden.
No reset possible 7198
661
Gunshi Gupta 🇨🇦Template for ROS Submission aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:03 Artefacts hidden.
survival_time_median 4.133333333333328
other stats:
episodes details:
"ep000": {"nsteps": 56, "reward": -18.144491671102255, "good_angle": 0.8478028957709983, "survival_time": 1.8666666666666691, "traveled_tiles": 1, "valid_direction": 1.766666666666669}
"ep001": {"nsteps": 183, "reward": -6.9695053821623, "good_angle": 15.787864771931968, "survival_time": 6.099999999999988, "traveled_tiles": 2, "valid_direction": 3.966666666666656}
"ep002": {"nsteps": 179, "reward": -5.842615197683872, "good_angle": 1.258035159397662, "survival_time": 5.966666666666655, "traveled_tiles": 3, "valid_direction": 2.0999999999999925}
"ep003": {"nsteps": 81, "reward": -12.648846599790785, "good_angle": 0.1900601385642617, "survival_time": 2.7, "traveled_tiles": 2, "valid_direction": 0.5999999999999979}
"ep004": {"nsteps": 124, "reward": -8.213626082996921, "good_angle": 0.2467587096654036, "survival_time": 4.133333333333328, "traveled_tiles": 2, "valid_direction": 0.6333333333333311}
good_angle_max 15.787864771931968
good_angle_mean 3.6661043350660583
good_angle_median 0.8478028957709983
good_angle_min 0.1900601385642617
reward_max -5.842615197683872
reward_mean -10.363816986747228
reward_median -8.213626082996921
reward_min -18.144491671102255
survival_time_max 6.099999999999988
survival_time_mean 4.153333333333328
survival_time_min 1.8666666666666691
traveled_tiles_max 3
traveled_tiles_mean 2
traveled_tiles_median 2
traveled_tiles_min 1
valid_direction_max 3.966666666666656
valid_direction_mean 1.8133333333333297
valid_direction_median 1.766666666666669
valid_direction_min 0.5999999999999979
No reset possible 7162
676
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:11:13 Timeout:
Waited 636.583799124 for container to finish. Giving up.
Artefacts hidden.
No reset possible 7127
687
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:08:12 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 88, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 7083
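The "Giving up to connect to the gym duckietown server" failures are raised on the solution side after the client has exhausted its connection attempts to the simulation server, which runs in a separate container reachable under the hostname evaluator. A rough sketch of that retry-then-give-up pattern (host, port, and retry counts here are illustrative; the real logic lives in duckietown-slimremote):

import socket
import time

def connect_with_retries(host="evaluator", port=5558, retries=10, delay=2.0):
    # Keep trying; raise with the same message seen in the tracebacks above.
    for _ in range(retries):
        try:
            return socket.create_connection((host, port), timeout=delay)
        except OSError:
            time.sleep(delay)
    raise Exception(f"Giving up to connect to the gym duckietown server at host: {host}")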
699
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:37 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
Artefacts hidden.
No reset possible 7039
712
Liam Paull 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:57 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 7026
730
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:51 Artefacts hidden.
driven_lanedir_median 1.0633319132004069
deviation-center-line_median 0.2855186340727478
in-drivable-lane_median 0
other stats:
deviation-center-line_max 1.072270399440044
deviation-center-line_mean 0.5705771172745165
deviation-center-line_min 0.19320369254632783
deviation-heading_max 4.723509209567449
deviation-heading_mean 1.6943301945754703
deviation-heading_median 0.6240544086983971
deviation-heading_min 0.1579035777258213
driven_any_max 3.248686029486499
driven_any_mean 1.71477581550808
driven_any_median 1.2845242994993418
driven_any_min 0.5342015305139659
driven_lanedir_max 3.188089028771449
driven_lanedir_mean 1.6107337246262765
driven_lanedir_min 0.5321586091738142
in-drivable-lane_max 1.500000000000001
in-drivable-lane_mean 0.4199999999999998
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 1.2845242994993418, "driven_lanedir": 1.0633319132004069, "in-drivable-lane": 1.500000000000001, "deviation-heading": 0.6240544086983971, "deviation-center-line": 0.2855186340727478}
"ep001": {"driven_any": 2.6834240414680526, "driven_lanedir": 2.449972694200103, "in-drivable-lane": 0.5999999999999979, "deviation-heading": 4.723509209567449, "deviation-center-line": 1.072270399440044}
"ep002": {"driven_any": 0.5342015305139659, "driven_lanedir": 0.5321586091738142, "in-drivable-lane": 0, "deviation-heading": 0.1579035777258213, "deviation-center-line": 0.19320369254632783}
"ep003": {"driven_any": 0.8230431765725404, "driven_lanedir": 0.8201163777856089, "in-drivable-lane": 0, "deviation-heading": 0.2226947602850225, "deviation-center-line": 0.24563951255486385}
"ep004": {"driven_any": 3.248686029486499, "driven_lanedir": 3.188089028771449, "in-drivable-lane": 0, "deviation-heading": 2.743489016600664, "deviation-center-line": 1.0562533477585991}
No reset possible 7007
730
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:05:00 Artefacts hidden.
No reset possible 6996
744
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:46 Artefacts hidden.
driven_lanedir_median 1.1402606240643702
deviation-center-line_median 0.3840458582910434
in-drivable-lane_median 0.13333333333333286
other stats:
deviation-center-line_max 0.711674997838784
deviation-center-line_mean 0.419066712514385
deviation-center-line_min 0.24823341281566189
deviation-heading_max 3.263198909203356
deviation-heading_mean 1.4167711630237871
deviation-heading_median 0.7362030012613442
deviation-heading_min 0.3965692323311814
driven_any_max 3.2455339443023155
driven_any_mean 1.4933535155961066
driven_any_median 1.3444261849537495
driven_any_min 0.4077135505688551
driven_lanedir_max 3.067881703176673
driven_lanedir_mean 1.3784724316862178
driven_lanedir_min 0.40203270567049
in-drivable-lane_max 1.400000000000001
in-drivable-lane_mean 0.3533333333333333
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 1.3543386321302768, "driven_lanedir": 1.1402606240643702, "in-drivable-lane": 1.400000000000001, "deviation-heading": 0.5748015265121692, "deviation-center-line": 0.2755385675349739}
"ep001": {"driven_any": 0.4077135505688551, "driven_lanedir": 0.40203270567049, "in-drivable-lane": 0, "deviation-heading": 0.3965692323311814, "deviation-center-line": 0.24823341281566189}
"ep002": {"driven_any": 1.3444261849537495, "driven_lanedir": 1.1811415874569735, "in-drivable-lane": 0.13333333333333286, "deviation-heading": 3.263198909203356, "deviation-center-line": 0.47584072609146166}
"ep003": {"driven_any": 3.2455339443023155, "driven_lanedir": 3.067881703176673, "in-drivable-lane": 0.2333333333333325, "deviation-heading": 2.1130831458108843, "deviation-center-line": 0.711674997838784}
"ep004": {"driven_any": 1.114755266025336, "driven_lanedir": 1.101045538062582, "in-drivable-lane": 0, "deviation-heading": 0.7362030012613442, "deviation-center-line": 0.3840458582910434}
No reset possible 6993
744
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:56 Artefacts hidden.
No reset possible 6991
744
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:39 Artefacts hidden.
survival_time_median 4.933333333333326
other stats:
episodes details:
"ep000": {"nsteps": 148, "reward": -7.024807937325268, "good_angle": 0.1543820009948916, "survival_time": 4.933333333333326, "traveled_tiles": 3, "valid_direction": 0}
"ep001": {"nsteps": 64, "reward": -16.25278271175921, "good_angle": 0.13330238492318994, "survival_time": 2.1333333333333355, "traveled_tiles": 1, "valid_direction": 0.40000000000000013}
"ep002": {"nsteps": 204, "reward": -5.018573467704175, "good_angle": 1.4528412334392908, "survival_time": 6.799999999999986, "traveled_tiles": 3, "valid_direction": 2.299999999999992}
"ep003": {"nsteps": 316, "reward": -3.370438709410049, "good_angle": 0.36910289958072445, "survival_time": 10.533333333333308, "traveled_tiles": 6, "valid_direction": 1.099999999999996}
"ep004": {"nsteps": 137, "reward": -7.825199343209719, "good_angle": 4.887162011872098, "survival_time": 4.56666666666666, "traveled_tiles": 3, "valid_direction": 1.3333333333333286}
good_angle_max 4.887162011872098
good_angle_mean 1.399358106162039
good_angle_median 0.36910289958072445
good_angle_min 0.13330238492318994
reward_max -3.370438709410049
reward_mean -7.898360433881685
reward_median -7.024807937325268
reward_min -16.25278271175921
survival_time_max 10.533333333333308
survival_time_mean 5.793333333333322
survival_time_min 2.1333333333333355
traveled_tiles_max 6
traveled_tiles_mean 3.2
traveled_tiles_median 3
traveled_tiles_min 1
valid_direction_max 2.299999999999992
valid_direction_mean 1.0266666666666633
valid_direction_median 1.099999999999996
valid_direction_min 0
No reset possible 6973
744
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:17 Artefacts hidden.
No reset possible 6961
841
Yun Chen 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:10 Artefacts hidden.
driven_lanedir_median 0.21091553965592436
deviation-center-line_median 0.19669233480794623
in-drivable-lane_median 0
other stats:
deviation-center-line_max 0.36244638871481594
deviation-center-line_mean 0.20810231863082232
deviation-center-line_min 0.13226281225297515
deviation-heading_max 1.8575080225276024
deviation-heading_mean 0.8483164489614163
deviation-heading_median 0.5932738092779462
deviation-heading_min 0.4731192022743975
driven_any_max 0.5678799021333145
driven_any_mean 0.30071541569328825
driven_any_median 0.2285771929156074
driven_any_min 0.16070108520087817
driven_lanedir_max 0.33526404461859105
driven_lanedir_mean 0.2343886390623312
driven_lanedir_min 0.14550252369974823
in-drivable-lane_max 1.9000000000000024
in-drivable-lane_mean 0.3800000000000005
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.5678799021333145, "driven_lanedir": 0.2978324027450896, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.8575080225276024, "deviation-center-line": 0.20998379718522137}
"ep001": {"driven_any": 0.3428629400829315, "driven_lanedir": 0.33526404461859105, "in-drivable-lane": 0, "deviation-heading": 0.4731192022743975, "deviation-center-line": 0.36244638871481594}
"ep002": {"driven_any": 0.16070108520087817, "driven_lanedir": 0.14550252369974823, "in-drivable-lane": 0, "deviation-heading": 0.5681843092589429, "deviation-center-line": 0.13912626019315266}
"ep003": {"driven_any": 0.20355595813370983, "driven_lanedir": 0.18242868459230263, "in-drivable-lane": 0, "deviation-heading": 0.7494969014681925, "deviation-center-line": 0.13226281225297515}
"ep004": {"driven_any": 0.2285771929156074, "driven_lanedir": 0.21091553965592436, "in-drivable-lane": 0, "deviation-heading": 0.5932738092779462, "deviation-center-line": 0.19669233480794623}
No reset possible 6950
755
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:07 Artefacts hidden.
No reset possible 6945
755
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:43 Artefacts hidden.
survival_time_median 16.666666666666654
other stats:
episodes details:
"ep000": {"nsteps": 500, "reward": -1.0671425678431987, "good_angle": 51.4322522382949, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 13.299999999999962}
"ep001": {"nsteps": 500, "reward": -1.619858711987734, "good_angle": 12.867934710029685, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.366666666666635}
"ep002": {"nsteps": 500, "reward": -0.04492271298385458, "good_angle": 5.143053257755371, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 11.833333333333329}
"ep003": {"nsteps": 500, "reward": -1.4435005245953798, "good_angle": 12.551574469217556, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 11.933333333333325}
"ep004": {"nsteps": 500, "reward": -1.4547944277971985, "good_angle": 6.959041056376123, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 9.433333333333303}
good_angle_max 51.4322522382949
good_angle_mean 17.790771146334727
good_angle_median 12.551574469217556
good_angle_min 5.143053257755371
reward_max -0.04492271298385458
reward_mean -1.1260437890414732
reward_median -1.4435005245953798
reward_min -1.619858711987734
survival_time_max 16.666666666666654
survival_time_mean 16.666666666666654
survival_time_min 16.666666666666654
traveled_tiles_max 2
traveled_tiles_mean 1.6
traveled_tiles_median 2
traveled_tiles_min 1
valid_direction_max 13.299999999999962
valid_direction_mean 11.773333333333312
valid_direction_median 11.933333333333325
valid_direction_min 9.433333333333303
No reset possible 6924
755
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:58 Artefacts hidden.
No reset possible 6916
770
Simon Schaefer 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:44 Artefacts hidden.
driven_lanedir_median 1.1339670031613929
deviation-center-line_median 0.3453842981256443
in-drivable-lane_median 0.033333333333333215
other stats:
deviation-center-line_max 0.4625904444299094
deviation-center-line_mean 0.30597882595341497
deviation-center-line_min 0.14843553015223906
deviation-heading_max 2.5187871046739683
deviation-heading_mean 1.1001213396346814
deviation-heading_median 0.7191512154722794
deviation-heading_min 0.124099113245172
driven_any_max 2.1133333333332667
driven_any_mean 1.1199999999999908
driven_any_median 1.1733333333333549
driven_any_min 0.36000000000000254
driven_lanedir_max 2.0720400686018614
driven_lanedir_mean 0.9952598020995628
driven_lanedir_min 0.1479416971540677
in-drivable-lane_max 0.5333333333333314
in-drivable-lane_mean 0.1933333333333329
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.3600000000000074, "driven_lanedir": 0.1479416971540677, "in-drivable-lane": 0.4, "deviation-heading": 1.453568460270295, "deviation-center-line": 0.14843553015223906}
"ep001": {"driven_any": 0.36000000000000254, "driven_lanedir": 0.35917351816203724, "in-drivable-lane": 0, "deviation-heading": 0.124099113245172, "deviation-center-line": 0.2016114458048761}
"ep002": {"driven_any": 1.1733333333333549, "driven_lanedir": 1.1339670031613929, "in-drivable-lane": 0, "deviation-heading": 0.7191512154722794, "deviation-center-line": 0.3453842981256443}
"ep003": {"driven_any": 2.1133333333332667, "driven_lanedir": 2.0720400686018614, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 0.685000804511693, "deviation-center-line": 0.4625904444299094}
"ep004": {"driven_any": 1.5933333333333222, "driven_lanedir": 1.263176723418455, "in-drivable-lane": 0.5333333333333314, "deviation-heading": 2.5187871046739683, "deviation-center-line": 0.3718724112544059}
No reset possible 6908
770
Simon Schaefer 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:12 Artefacts hidden.
No reset possible 6906
770
Simon Schaefer 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:33 Artefacts hidden.
survival_time_median 5.933333333333322
other stats:
episodes details:
"ep000": {"nsteps": 56, "reward": -18.11559725355723, "good_angle": 0.8921105662302433, "survival_time": 1.8666666666666691, "traveled_tiles": 1, "valid_direction": 1.566666666666669}
"ep001": {"nsteps": 56, "reward": -18.42357532573598, "good_angle": 0.008400321768129769, "survival_time": 1.8666666666666691, "traveled_tiles": 1, "valid_direction": 0}
"ep002": {"nsteps": 178, "reward": -6.111355765063442, "good_angle": 0.6612090390214628, "survival_time": 5.933333333333322, "traveled_tiles": 3, "valid_direction": 1.4666666666666617}
"ep003": {"nsteps": 319, "reward": -3.4423892636517746, "good_angle": 0.7088851214962292, "survival_time": 10.633333333333306, "traveled_tiles": 4, "valid_direction": 1.533333333333328}
"ep004": {"nsteps": 241, "reward": -4.737908348469141, "good_angle": 14.163927690504837, "survival_time": 8.033333333333315, "traveled_tiles": 3, "valid_direction": 2.7333333333333245}
good_angle_max 14.163927690504837
good_angle_mean 3.2869065478041803
good_angle_median 0.7088851214962292
good_angle_min 0.008400321768129769
reward_max -3.4423892636517746
reward_mean -10.166165191295514
reward_median -6.111355765063442
reward_min -18.42357532573598
survival_time_max 10.633333333333306
survival_time_mean 5.666666666666657
survival_time_min 1.8666666666666691
traveled_tiles_max 4
traveled_tiles_mean 2.4
traveled_tiles_median 3
traveled_tiles_min 1
valid_direction_max 2.7333333333333245
valid_direction_mean 1.4599999999999966
valid_direction_median 1.533333333333328
valid_direction_min 0
No reset possible 6893
770
Simon Schaefer 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:36 Artefacts hidden.
No reset possible 6870
777
Yannick Berdou AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:03:40 Artefacts hidden.
driven_lanedir_median 1.1244940250770317
deviation-center-line_median 0.1702985030340966
in-drivable-lane_median 0.033333333333333215
other stats:
deviation-center-line_max 0.2291035934160819
deviation-center-line_mean 0.15104091265992872
deviation-center-line_min 0.07084844041760953
deviation-heading_max 1.2760765640740437
deviation-heading_mean 0.5466497354728592
deviation-heading_median 0.35171107009370006
deviation-heading_min 0.06092138286581164
driven_any_max 2.1066666666666567
driven_any_mean 1.1093333333333295
driven_any_median 1.1599999999999824
driven_any_min 0.3466666666666718
driven_lanedir_max 2.066698089009994
driven_lanedir_mean 0.9871748479623343
driven_lanedir_min 0.14089729353622693
in-drivable-lane_max 0.2666666666666657
in-drivable-lane_mean 0.09999999999999978
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.3466666666666718, "driven_lanedir": 0.14089729353622693, "in-drivable-lane": 0.2, "deviation-heading": 0.7098822712947942, "deviation-center-line": 0.07084844041760953}
"ep001": {"driven_any": 0.34666666666667245, "driven_lanedir": 0.3458718774424743, "in-drivable-lane": 0, "deviation-heading": 0.06092138286581164, "deviation-center-line": 0.09897289157693905}
"ep002": {"driven_any": 1.1599999999999824, "driven_lanedir": 1.1244940250770317, "in-drivable-lane": 0, "deviation-heading": 0.35171107009370006, "deviation-center-line": 0.1702985030340966}
"ep003": {"driven_any": 2.1066666666666567, "driven_lanedir": 2.066698089009994, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 0.33465738903594666, "deviation-center-line": 0.2291035934160819}
"ep004": {"driven_any": 1.5866666666666656, "driven_lanedir": 1.2579129547459444, "in-drivable-lane": 0.2666666666666657, "deviation-heading": 1.2760765640740437, "deviation-center-line": 0.1859811348549165}
No reset possible 6812
787
Cliff Li AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:12:51 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
Artefacts hidden.
No reset possible 6796
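Note the status is "error" rather than "failed": InvalidEvaluator charges the problem to the infrastructure, not the submission. The message suggests the evaluator launched the gym simulator as a child process and translated its nonzero exit status into this error; a minimal sketch of that kind of check (names are illustrative; the actual logic is in the challenge's eval.py):

import subprocess

def run_gym_process(cmd):
    # If the simulator process dies with a nonzero code, blame the evaluator,
    # mirroring "InvalidEvaluator: Gym exited with code 2" above.
    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise RuntimeError(f"Gym exited with code {result.returncode}")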
799
Julian Zilly Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:06 Artefacts hidden.
driven_lanedir_median 1.1360756262058462
deviation-center-line_median 0.4394837181007228
in-drivable-lane_median 0.06666666666666643
other stats:
deviation-center-line_max 0.6333827791987502
deviation-center-line_mean 0.4212631472000551
deviation-center-line_min 0.24366373322158896
deviation-heading_max 3.8004671062440982
deviation-heading_mean 1.5700097495124463
deviation-heading_median 1.1751997611595733
deviation-heading_min 0.19159266173442707
driven_any_max 1.9333799185588456
driven_any_mean 1.077723423927906
driven_any_median 1.1802158534322102
driven_any_min 0.3144654076456124
driven_lanedir_max 1.932166076243618
driven_lanedir_mean 0.9565430019281578
driven_lanedir_min 0.14257465471603614
in-drivable-lane_max 1.1999999999999955
in-drivable-lane_mean 0.39999999999999913
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.3610528751992641, "driven_lanedir": 0.14257465471603614, "in-drivable-lane": 0.7333333333333333, "deviation-heading": 2.4651655876232828, "deviation-center-line": 0.24366373322158896}
"ep001": {"driven_any": 0.3144654076456124, "driven_lanedir": 0.3134573187621861, "in-drivable-lane": 0, "deviation-heading": 0.2176236308008516, "deviation-center-line": 0.2987658071273343}
"ep002": {"driven_any": 1.1802158534322102, "driven_lanedir": 1.1360756262058462, "in-drivable-lane": 0.06666666666666643, "deviation-heading": 1.1751997611595733, "deviation-center-line": 0.6333827791987502}
"ep003": {"driven_any": 1.9333799185588456, "driven_lanedir": 1.932166076243618, "in-drivable-lane": 0, "deviation-heading": 0.19159266173442707, "deviation-center-line": 0.4394837181007228}
"ep004": {"driven_any": 1.5995030648035986, "driven_lanedir": 1.2584413337131024, "in-drivable-lane": 1.1999999999999955, "deviation-heading": 3.8004671062440982, "deviation-center-line": 0.49101969835187953}
No reset possible 6793
817
Gianmarco Bernasconi ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:55 Artefacts hidden.
driven_lanedir_median 0.19137340289137897
deviation-center-line_median 0.20093803683902609
in-drivable-lane_median 0
other stats:
deviation-center-line_max 0.20756742817629417
deviation-center-line_mean 0.17833796981665873
deviation-center-line_min 0.12931703951125476
deviation-heading_max 1.8439918325814553
deviation-heading_mean 0.8124041119682881
deviation-heading_median 0.6167446789204305
deviation-heading_min 0.21632967691536764
driven_any_max 0.5571686061914455
driven_any_mean 0.27571592616047186
driven_any_median 0.19286856792722237
driven_any_min 0.17498533311134012
driven_lanedir_max 0.28729950819802474
driven_lanedir_mean 0.20973809159635853
driven_lanedir_min 0.15776187464552138
in-drivable-lane_max 1.9000000000000024
in-drivable-lane_mean 0.3800000000000005
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.5571686061914455, "driven_lanedir": 0.28729950819802474, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.8439918325814553, "deviation-center-line": 0.20751882977364483}
"ep001": {"driven_any": 0.19286856792722237, "driven_lanedir": 0.19137340289137897, "in-drivable-lane": 0, "deviation-heading": 0.21632967691536764, "deviation-center-line": 0.20093803683902609}
"ep002": {"driven_any": 0.17498533311134012, "driven_lanedir": 0.15776187464552138, "in-drivable-lane": 0, "deviation-heading": 0.6091650214648153, "deviation-center-line": 0.14634851478307387}
"ep003": {"driven_any": 0.19284471001217895, "driven_lanedir": 0.16933619803164834, "in-drivable-lane": 0, "deviation-heading": 0.7757893499593713, "deviation-center-line": 0.12931703951125476}
"ep004": {"driven_any": 0.26071241356017244, "driven_lanedir": 0.2429194742152192, "in-drivable-lane": 0, "deviation-heading": 0.6167446789204305, "deviation-center-line": 0.20756742817629417}
No reset possible 6780
817
Gianmarco Bernasconi ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:51 Artefacts hidden.
No reset possible 6765
815
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:02:46 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 118, in run
solve(params, cis)
File "solution.py", line 83, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/wrappers.py", line 79, in step
ob, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 6753
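The depth of this traceback is characteristic of stacked gym wrappers: each wrapper's step() forwards to self.env.step(action), so an exception raised by the network client in duckietown-slimremote unwinds through time_limit.py, the workspace wrappers, and gym.core before reaching solution.py. A minimal sketch of that delegation pattern (standard gym.Wrapper API with the classic 4-tuple step; the wrapper itself is illustrative, not the one in /workspace/wrappers.py):

import gym

class PassThrough(gym.Wrapper):
    # Every wrapper frame in the traceback above forwards step() like this.
    def step(self, action):
        observation, reward, done, info = self.env.step(action)
        return observation, reward, done, info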
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:10:34 Artefacts hidden.
driven_lanedir_median 0.12873581159579173
deviation-center-line_median 0.13344892518792947
in-drivable-lane_median 0
other stats:
deviation-center-line_max 0.14783216050839043
deviation-center-line_mean 0.12838987464935875
deviation-center-line_min 0.08249076531582478
deviation-heading_max 0.8619982658552878
deviation-heading_mean 0.6293159810887797
deviation-heading_median 0.5649393993353382
deviation-heading_min 0.3636845553155839
driven_any_max 0.36134326848971365
driven_any_mean 0.18631679869490897
driven_any_median 0.1612008435759256
driven_any_min 0.09516228492822434
driven_lanedir_max 0.15243964041425828
driven_lanedir_mean 0.12503317837525957
driven_lanedir_min 0.08425077324726926
in-drivable-lane_max 1.4333333333333345
in-drivable-lane_mean 0.2866666666666669
in-drivable-lane_min 0
per-episodes details:
"ep000": {"driven_any": 0.36134326848971365, "driven_lanedir": 0.12873581159579173, "in-drivable-lane": 1.4333333333333345, "deviation-heading": 0.8619982658552878, "deviation-center-line": 0.08249076531582478}
"ep001": {"driven_any": 0.09516228492822434, "driven_lanedir": 0.08425077324726926, "in-drivable-lane": 0, "deviation-heading": 0.3636845553155839, "deviation-center-line": 0.13344892518792947}
"ep002": {"driven_any": 0.1612008435759256, "driven_lanedir": 0.14250782157328667, "in-drivable-lane": 0, "deviation-heading": 0.5649393993353382, "deviation-center-line": 0.14769878002189304}
"ep003": {"driven_any": 0.1804885953476817, "driven_lanedir": 0.15243964041425828, "in-drivable-lane": 0, "deviation-heading": 0.8527167691740561, "deviation-center-line": 0.13047874221275613}
"ep004": {"driven_any": 0.13338900113299948, "driven_lanedir": 0.11723184504569197, "in-drivable-lane": 0, "deviation-heading": 0.5032409157636322, "deviation-center-line": 0.14783216050839043}
No reset possible 6751
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:45 Artefacts hidden.
No reset possible 6750
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:00:30 Artefacts hidden.
survival_time_median 2.1333333333333355
other stats:
episodes details:
"ep000": {"nsteps": 68, "reward": -15.091665333082132, "good_angle": 0.6981166724766602, "survival_time": 2.2666666666666684, "traveled_tiles": 1, "valid_direction": 1.6000000000000016}
"ep001": {"nsteps": 41, "reward": -24.910202139034503, "good_angle": 0.12060718551653264, "survival_time": 1.3666666666666676, "traveled_tiles": 1, "valid_direction": 0.4000000000000003}
"ep002": {"nsteps": 64, "reward": -15.972955069504678, "good_angle": 0.2113207229158426, "survival_time": 2.1333333333333355, "traveled_tiles": 2, "valid_direction": 0.9333333333333356}
"ep003": {"nsteps": 71, "reward": -14.35928999485684, "good_angle": 0.39706338058496066, "survival_time": 2.366666666666668, "traveled_tiles": 1, "valid_direction": 1.466666666666668}
"ep004": {"nsteps": 58, "reward": -17.632628266153663, "good_angle": 0.18148187305231292, "survival_time": 1.933333333333336, "traveled_tiles": 1, "valid_direction": 0.7000000000000022}
good_angle_max 0.6981166724766602
good_angle_mean 0.3217179669092618
good_angle_median 0.2113207229158426
good_angle_min 0.12060718551653264
reward_max -14.35928999485684
reward_mean -17.593348160526364
reward_median -15.972955069504678
reward_min -24.910202139034503
survival_time_max 2.366666666666668
survival_time_mean 2.013333333333335
survival_time_min 1.3666666666666676
traveled_tiles_max 2
traveled_tiles_mean 1.2
traveled_tiles_median 1
traveled_tiles_min 1
valid_direction_max 1.6000000000000016
valid_direction_mean 1.0200000000000016
valid_direction_median 0.9333333333333356
valid_direction_min 0.4000000000000003
No reset possible 6731
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:06:47 Artefacts hidden.
No reset possible 6631
729
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:07:11 Artefacts hidden.
No reset possible 6228
828
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation timeout no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:31:01 Artefacts hidden.
No reset possible 6127
834
Samuel Lavoie Young Duke aido1_LF1_r3-v3
step4-viz success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:01:52 Artefacts hidden.
No reset possible 6124
834
Samuel Lavoie Young Duke aido1_LF1_r3-v3
step2-scoring success no s-MacBook-puro.local-13653
6 years, 8 months 6 years, 8 months 0:09:44 Artefacts hidden.
survival_time_median 11.733333333333302
other stats:
episodes details:
"ep000": {"nsteps": 168, "reward": -7.934439358700599, "good_angle": 19.888117786020445, "survival_time": 5.59999999999999, "traveled_tiles": 6, "valid_direction": 3.099999999999997}
"ep001": {"nsteps": 242, "reward": -5.63131987395858, "good_angle": 18.727965412663018, "survival_time": 8.066666666666649, "traveled_tiles": 7, "valid_direction": 3.4666666666666552}
"ep002": {"nsteps": 500, "reward": -0.5593890390801244, "good_angle": 2.5609619856563275, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 5.9666666666666455}
"ep003": {"nsteps": 352, "reward": -5.00258318618448, "good_angle": 36.28035911926448, "survival_time": 11.733333333333302, "traveled_tiles": 10, "valid_direction": 4.966666666666652}
"ep004": {"nsteps": 500, "reward": -0.8166006759773009, "good_angle": 2.5717428190104616, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 5.5333333333333155}
good_angle_max 36.28035911926448
good_angle_mean 16.005829424522947
good_angle_median 18.727965412663018
good_angle_min 2.5609619856563275
reward_max -0.5593890390801244
reward_mean -3.988866426780217
reward_median -5.00258318618448
reward_min -7.934439358700599
survival_time_max 16.666666666666654
survival_time_mean 11.74666666666665
survival_time_min 5.59999999999999
traveled_tiles_max 13
traveled_tiles_mean 9.8
traveled_tiles_median 10
traveled_tiles_min 6
valid_direction_max 5.9666666666666455
valid_direction_mean 4.606666666666653
valid_direction_median 4.966666666666652
valid_direction_min 3.099999999999997
No reset possible