AI Driving Olympics

Evaluator 291

evaluator: 291
owner: Andrea Censi
machine: nutonomy-P50
process: nutonomy-P50-2570
version: DC:3.1.54; DCR:3.2.14; DTS:3.0.30
first heard:
last heard:
status: inactive
# evaluating:
# success: 16
# timeout:
# failed: 1
# error: 2
# aborted: 3
# host-error:
arm:
x86_64:
Mac:
gpu available:
Number of processors:
Processor frequency (MHz):
Free % of processors:
RAM total (MB):
RAM free (MB):
Disk (MB):
Disk available (MB):
Docker Hub:
P1:
P2:
PI Camera:
# Duckiebots:
Map 3x3 available:
gpu cores:
AIDO 2 Map LF public:
AIDO 2 Map LF private:
AIDO 2 Map LFV public:
AIDO 2 Map LFV private:
AIDO 2 Map LFVI public:
AIDO 2 Map LFVI private:
IPFS mountpoint /ipfs available:
IPNS mountpoint /ipns available:
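The status counters above (# success 16, # failed 1, # error 2, # aborted 3) are plain tallies over the 22 jobs this evaluator has processed, listed in the jobs table further down the page. A minimal cross-check sketch, with the status column transcribed by hand from that table:

```python
from collections import Counter

# Status column of the 22 evaluator jobs, transcribed in page order
# (newest job first: 5949, 5938, 5931, ... 5748).
statuses = [
    "aborted", "success", "aborted", "aborted", "success", "failed",
    "success", "success", "success", "success", "success", "error",
    "success", "error", "success", "success", "success", "success",
    "success", "success", "success", "success",
]

tally = Counter(statuses)
print(tally["success"], tally["failed"], tally["error"], tally["aborted"])
# → 16 1 2 3, matching the evaluator's counters
```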

Evaluator jobs

Job ID | submission | user | user label | challenge | step | status | up to date | evaluator | date started | date completed | duration | message
5949 | 799 | Julian Zilly | Baseline solution using imitation learning from logs | aido1_LF1_r3-v3 | step4-viz | aborted | no | 291 | | | 0:01:16 | (hidden)
other stats
videos: 1
5938 | 786 | martatintore | AMOD18-AIDO not that random execution | aido1_LF1_r3-v3 | step3-videos | success | no | 291 | | | 0:05:07 | (hidden)
other stats
videos: 1
5931 | 800 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | aborted | no | 291 | | | 0:02:35
message:
InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
    raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
    raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
    solution.run(cis)
  File "solution.py", line 118, in run
    solve(params, cis)
  File "solution.py", line 83, in solve
    observation, reward, done, info = env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
    return self.env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/wrappers.py", line 89, in step
    ob, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
    obs, rew, done, misc = self.sim.step(action, with_observation=True)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
    return self._failsafe_observe(msg)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
    raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator


(hidden)
5924 | 799 | Julian Zilly | Baseline solution using imitation learning from logs | aido1_LF1_r3-v3 | step2-scoring | aborted | no | 291 | | | 0:02:48 | (hidden)
survival_time_median: 10.833333333333306

other stats
episodes:
details: {"ep000": {"nsteps": 204, "reward": -6.7331884848136525, "good_angle": 29.572980253786163, "survival_time": 6.799999999999986, "traveled_tiles": 2, "valid_direction": 4.699999999999983}, "ep001": {"nsteps": 452, "reward": -3.059079861185983, "good_angle": 27.6059563105319, "survival_time": 15.066666666666624, "traveled_tiles": 3, "valid_direction": 4.699999999999983}, "ep002": {"nsteps": 336, "reward": -3.0095930438303964, "good_angle": 0.184076416761546, "survival_time": 11.19999999999997, "traveled_tiles": 3, "valid_direction": 0}, "ep003": {"nsteps": 310, "reward": -5.454106606206587, "good_angle": 31.408071657883266, "survival_time": 10.333333333333307, "traveled_tiles": 3, "valid_direction": 4.699999999999984}, "ep004": {"nsteps": 325, "reward": -4.930836339840361, "good_angle": 5.946495714391657, "survival_time": 10.833333333333306, "traveled_tiles": 3, "valid_direction": 3.566666666666655}}
good_angle_max: 31.408071657883266
good_angle_mean: 18.943516070670903
good_angle_median: 27.6059563105319
good_angle_min: 0.184076416761546
reward_max: -3.0095930438303964
reward_mean: -4.637360867175396
reward_median: -4.930836339840361
reward_min: -6.7331884848136525
survival_time_max: 15.066666666666624
survival_time_mean: 10.84666666666664
survival_time_min: 6.799999999999986
traveled_tiles_max: 3
traveled_tiles_mean: 2.8
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 4.699999999999984
valid_direction_mean: 3.5333333333333217
valid_direction_median: 4.699999999999983
valid_direction_min: 0
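Each step2-scoring block reports per-metric aggregates (max, mean, median, min) computed over the five episodes in its details record. As a sketch, the survival_time summary rows of the job above can be recomputed from the episode values copied out of that details blob, using the standard-library statistics module:

```python
import statistics

# survival_time of each episode, copied from the "details" record of job 5924
survival_times = [
    6.799999999999986,   # ep000
    15.066666666666624,  # ep001
    11.19999999999997,   # ep002
    10.333333333333307,  # ep003
    10.833333333333306,  # ep004
]

# The summary rows are plain aggregates over the five episodes.
print("survival_time_median:", statistics.median(survival_times))  # 10.833333333333306
print("survival_time_mean:  ", statistics.mean(survival_times))    # ≈ 10.84666666666664
print("survival_time_max:   ", max(survival_times))                # 15.066666666666624
print("survival_time_min:   ", min(survival_times))                # 6.799999999999986
```

The same recipe reproduces the good_angle, reward, traveled_tiles, and valid_direction rows of every scoring block on this page.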
5917 | 798 | CpPI | Random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:02:19 | (hidden)
survival_time_median: 8.999999999999979

other stats
episodes:
details: {"ep000": {"nsteps": 171, "reward": -7.729742120232499, "good_angle": 25.07383215378148, "survival_time": 5.6999999999999895, "traveled_tiles": 2, "valid_direction": 3.9333333333333207}, "ep001": {"nsteps": 382, "reward": -4.457744235767744, "good_angle": 24.218758506416364, "survival_time": 12.7333333333333, "traveled_tiles": 3, "valid_direction": 4.066666666666652}, "ep002": {"nsteps": 338, "reward": -3.039140198254471, "good_angle": 0.1268074882377727, "survival_time": 11.266666666666636, "traveled_tiles": 4, "valid_direction": 0}, "ep003": {"nsteps": 258, "reward": -6.04335063233856, "good_angle": 27.571145449058, "survival_time": 8.59999999999998, "traveled_tiles": 3, "valid_direction": 3.93333333333332}, "ep004": {"nsteps": 270, "reward": -5.542087158836701, "good_angle": 4.78818581978914, "survival_time": 8.999999999999979, "traveled_tiles": 3, "valid_direction": 2.9999999999999902}}
good_angle_max: 27.571145449058
good_angle_mean: 16.355745883456553
good_angle_median: 24.218758506416364
good_angle_min: 0.1268074882377727
reward_max: -3.039140198254471
reward_mean: -5.362412869085995
reward_median: -5.542087158836701
reward_min: -7.729742120232499
survival_time_max: 12.7333333333333
survival_time_mean: 9.459999999999976
survival_time_min: 5.6999999999999895
traveled_tiles_max: 4
traveled_tiles_mean: 3
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 4.066666666666652
valid_direction_mean: 2.9866666666666566
valid_direction_median: 3.93333333333332
valid_direction_min: 0
5912 | 796 | heyt0ny | SAIC MOSCOW MML | aido1_LF1_r3-v3 | step1-simulation | failed | no | 291 | | | 0:02:42
message:
InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
    raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
    raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
    solution.run(cis)
  File "solution.py", line 118, in run
    solve(params, cis)
  File "solution.py", line 83, in solve
    observation, reward, done, info = env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
    return self.env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/wrappers.py", line 86, in step
    ob, reward, done, info = self.env.step(action)
  File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
    observation, reward, done, info = self.env.step(action)
  File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
    obs, rew, done, misc = self.sim.step(action, with_observation=True)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
    return self._failsafe_observe(msg)
  File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
    raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator


(hidden)
5907 | 794 | Manish | AMOD18-AIDO | aido1_LF1_r3-v3 | step4-viz | success | no | 291 | | | 0:02:06 | (hidden)
other stats
videos: 1
5903 | 794 | Manish | AMOD18-AIDO | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:02:34 | (hidden)
survival_time_median: 9.033333333333312

other stats
episodes:
details: {"ep000": {"nsteps": 171, "reward": -7.749294848818528, "good_angle": 24.378853664865417, "survival_time": 5.6999999999999895, "traveled_tiles": 2, "valid_direction": 3.9333333333333207}, "ep001": {"nsteps": 375, "reward": -3.5734044618407883, "good_angle": 22.88311803426891, "survival_time": 12.499999999999966, "traveled_tiles": 3, "valid_direction": 3.899999999999986}, "ep002": {"nsteps": 404, "reward": -2.509986069532508, "good_angle": 0.11435066787012724, "survival_time": 13.46666666666663, "traveled_tiles": 4, "valid_direction": 0}, "ep003": {"nsteps": 259, "reward": -6.018412907151181, "good_angle": 26.243785621492304, "survival_time": 8.633333333333313, "traveled_tiles": 3, "valid_direction": 3.93333333333332}, "ep004": {"nsteps": 271, "reward": -5.466318552420093, "good_angle": 4.2315628353182655, "survival_time": 9.033333333333312, "traveled_tiles": 3, "valid_direction": 2.933333333333324}}
good_angle_max: 26.243785621492304
good_angle_mean: 15.570334164763006
good_angle_median: 22.88311803426891
good_angle_min: 0.11435066787012724
reward_max: -2.509986069532508
reward_mean: -5.063483367952619
reward_median: -5.466318552420093
reward_min: -7.749294848818528
survival_time_max: 13.46666666666663
survival_time_mean: 9.866666666666642
survival_time_min: 5.6999999999999895
traveled_tiles_max: 4
traveled_tiles_mean: 3
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 3.9333333333333207
valid_direction_mean: 2.9399999999999897
valid_direction_median: 3.899999999999986
valid_direction_min: 0
5893 | 792 | Edward | Random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:00:17 | (hidden)
survival_time_median: 6.333333333333321

other stats
episodes:
details: {"ep000": {"nsteps": 119, "reward": -10.29928544088572, "good_angle": 17.226891564339322, "survival_time": 3.9666666666666623, "traveled_tiles": 2, "valid_direction": 2.766666666666662}, "ep001": {"nsteps": 264, "reward": -4.779882687623754, "good_angle": 16.51740361855984, "survival_time": 8.79999999999998, "traveled_tiles": 3, "valid_direction": 2.7666666666666577}, "ep002": {"nsteps": 279, "reward": -3.644003686591512, "good_angle": 0.06896665489742422, "survival_time": 9.299999999999978, "traveled_tiles": 4, "valid_direction": 0}, "ep003": {"nsteps": 181, "reward": -7.788495979256393, "good_angle": 18.313907760322827, "survival_time": 6.033333333333322, "traveled_tiles": 3, "valid_direction": 2.766666666666657}, "ep004": {"nsteps": 190, "reward": -7.142260233303042, "good_angle": 3.5933936079923985, "survival_time": 6.333333333333321, "traveled_tiles": 3, "valid_direction": 2.1333333333333258}}
good_angle_max: 18.313907760322827
good_angle_mean: 11.144112641222362
good_angle_median: 16.51740361855984
good_angle_min: 0.06896665489742422
reward_max: -3.644003686591512
reward_mean: -6.730785605532084
reward_median: -7.142260233303042
reward_min: -10.29928544088572
survival_time_max: 9.299999999999978
survival_time_mean: 6.886666666666652
survival_time_min: 3.9666666666666623
traveled_tiles_max: 4
traveled_tiles_mean: 3
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 2.766666666666662
valid_direction_mean: 2.08666666666666
valid_direction_median: 2.766666666666657
valid_direction_min: 0
5868 | 788 | zxcvfd13502 | Random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:02:11 | (hidden)
survival_time_median: 9.699999999999976

other stats
episodes:
details: {"ep000": {"nsteps": 170, "reward": -7.8836926705697, "good_angle": 23.391290056292988, "survival_time": 5.666666666666656, "traveled_tiles": 2, "valid_direction": 3.933333333333321}, "ep001": {"nsteps": 377, "reward": -3.2887647123003765, "good_angle": 20.766891595661512, "survival_time": 12.566666666666633, "traveled_tiles": 3, "valid_direction": 3.899999999999986}, "ep002": {"nsteps": 475, "reward": -1.99879671661992, "good_angle": 0.405360251127977, "survival_time": 15.833333333333288, "traveled_tiles": 5, "valid_direction": 1.099999999999996}, "ep003": {"nsteps": 259, "reward": -6.163807896573571, "good_angle": 25.38986957485389, "survival_time": 8.633333333333313, "traveled_tiles": 3, "valid_direction": 3.899999999999987}, "ep004": {"nsteps": 291, "reward": -5.436106777042663, "good_angle": 6.5743249917656055, "survival_time": 9.699999999999976, "traveled_tiles": 5, "valid_direction": 3.733333333333321}}
good_angle_max: 25.38986957485389
good_angle_mean: 15.305547293940396
good_angle_median: 20.766891595661512
good_angle_min: 0.405360251127977
reward_max: -1.99879671661992
reward_mean: -4.954233754621246
reward_median: -5.436106777042663
reward_min: -7.8836926705697
survival_time_max: 15.833333333333288
survival_time_mean: 10.479999999999972
survival_time_min: 5.666666666666656
traveled_tiles_max: 5
traveled_tiles_mean: 3.6
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 3.933333333333321
valid_direction_mean: 3.313333333333322
valid_direction_median: 3.899999999999986
valid_direction_min: 1.099999999999996
5862 | 784 | haliang | Random execution | aido1_LF1_r3-v3 | step3-videos | success | no | 291 | | | 0:01:11 | (hidden)
other stats
videos: 1
5856 | 787 | lic | AMOD18-AIDO not that random execution | aido1_LF1_r3-v3 | step1-simulation | error | no | 291 | | | 0:00:56
message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
    evaluator.score(cie)
  File "eval.py", line 96, in score
    raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
(hidden)
5843 | 784 | haliang | Random execution | aido1_LF1_r3-v3 | step1-simulation | success | no | 291 | | | 0:03:30 | (hidden)
other stats
simulation-passed: 1
5837 | 783 | lic | AMOD18-AIDO not that random execution | aido1_LF1_r3-v3 | step1-simulation | error | no | 291 | | | 0:00:53
message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
    evaluator.score(cie)
  File "eval.py", line 96, in score
    raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
(hidden)
5822 | 779 | aleksandarpetr | Random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:01:57 | (hidden)
survival_time_median: 9.099999999999978

other stats
episodes:
details: {"ep000": {"nsteps": 171, "reward": -7.781195403191081, "good_angle": 23.466307210913083, "survival_time": 5.6999999999999895, "traveled_tiles": 2, "valid_direction": 3.899999999999987}, "ep001": {"nsteps": 379, "reward": -3.5312907700095137, "good_angle": 23.49675131172871, "survival_time": 12.6333333333333, "traveled_tiles": 3, "valid_direction": 3.966666666666653}, "ep002": {"nsteps": 327, "reward": -3.0822081143600406, "good_angle": 0.13463207120400894, "survival_time": 10.899999999999972, "traveled_tiles": 4, "valid_direction": 0}, "ep003": {"nsteps": 258, "reward": -5.806202528550643, "good_angle": 25.491406356667778, "survival_time": 8.59999999999998, "traveled_tiles": 3, "valid_direction": 3.93333333333332}, "ep004": {"nsteps": 273, "reward": -5.558138499804005, "good_angle": 5.494277352927701, "survival_time": 9.099999999999978, "traveled_tiles": 3, "valid_direction": 2.9999999999999902}}
good_angle_max: 25.491406356667778
good_angle_mean: 15.616674860688256
good_angle_median: 23.466307210913083
good_angle_min: 0.13463207120400894
reward_max: -3.0822081143600406
reward_mean: -5.151807063183057
reward_median: -5.558138499804005
reward_min: -7.781195403191081
survival_time_max: 12.6333333333333
survival_time_mean: 9.386666666666644
survival_time_min: 5.6999999999999895
traveled_tiles_max: 4
traveled_tiles_mean: 3
traveled_tiles_median: 3
traveled_tiles_min: 2
valid_direction_max: 3.966666666666653
valid_direction_mean: 2.95999999999999
valid_direction_median: 3.899999999999987
valid_direction_min: 0
5809 | 775 | frtim | AMOD18-AIDO not tht random execution | aido1_LF1_r3-v3 | step4-viz | success | no | 291 | | | 0:02:33 | (hidden)
other stats
videos: 1
5798 | 775 | frtim | AMOD18-AIDO not tht random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | 291 | | | 0:02:27 | (hidden)
survival_time_median: 16.666666666666654

other stats
episodes:
details: {"ep000": {"nsteps": 500, "reward": -1.2629533237218855, "good_angle": 13.558177147634575, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.93333333333332}, "ep001": {"nsteps": 500, "reward": -1.1507397584803405, "good_angle": 30.37251736114832, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 13.33333333333332}, "ep002": {"nsteps": 500, "reward": -1.5188683025373613, "good_angle": 56.82600488254528, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 14.866666666666648}, "ep003": {"nsteps": 500, "reward": -1.2894317615032196, "good_angle": 13.520643297797143, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 12.866666666666651}, "ep004": {"nsteps": 500, "reward": -1.3244163292050362, "good_angle": 13.522749450401015, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.899999999999986}}
good_angle_max: 56.82600488254528
good_angle_mean: 25.560018427905263
good_angle_median: 13.558177147634575
good_angle_min: 13.520643297797143
reward_max: -1.1507397584803405
reward_mean: -1.3092818950895686
reward_median: -1.2894317615032196
reward_min: -1.5188683025373613
survival_time_max: 16.666666666666654
survival_time_mean: 16.666666666666654
survival_time_min: 16.666666666666654
traveled_tiles_max: 2
traveled_tiles_mean: 1.4
traveled_tiles_median: 1
traveled_tiles_min: 1
valid_direction_max: 14.866666666666648
valid_direction_mean: 13.379999999999988
valid_direction_median: 12.93333333333332
valid_direction_min: 12.866666666666651
5795 | 773 | AmaurX | AMOD18-AIDO not that random execution | aido1_LF1_r3-v3 | step3-videos | success | no | 291 | | | 0:00:47 | (hidden)
other stats
videos: 1
5782 | 772 | tomaszf | AMOD18-AIDO not that random execution | aido1_LF1_r3-v3 | step1-simulation | success | no | 291 | | | 0:04:11 | (hidden)
other stats
simulation-passed: 1
5769 | 768 | Dzenan Lapandic | Random execution | aido1_LF1_r3-v3 | step3-videos | success | no | 291 | | | 0:02:41 | (hidden)
other stats
videos: 1
5755 | 765 | Andrea Censi | Random execution | aido1_LF1_r3-v3 | step3-videos | success | no | 291 | | | 0:00:44 | (hidden)
other stats
videos: 1
5748 | 766 | Andrea Censi | Random execution | aido1_LFV_r1-v3 | step2-scoring | success | no | 291 | | | 0:01:55 | (hidden)
survival_time_median: 5.666666666666656

other stats
episodes:
details: {"ep000": {"nsteps": 170, "reward": -7.793226854240193, "good_angle": 24.514739373338095, "survival_time": 5.666666666666656, "traveled_tiles": 2, "valid_direction": 3.933333333333321}, "ep001": {"nsteps": 180, "reward": -6.071764395137628, "good_angle": 0.009215519679207373, "survival_time": 5.9999999999999885, "traveled_tiles": 2, "valid_direction": 0}, "ep002": {"nsteps": 112, "reward": -10.107387881293628, "good_angle": 4.629534678140666, "survival_time": 3.73333333333333, "traveled_tiles": 1, "valid_direction": 2.9666666666666632}, "ep003": {"nsteps": 41, "reward": -26.19490526071409, "good_angle": 0.04891935475235413, "survival_time": 1.3666666666666676, "traveled_tiles": 1, "valid_direction": 0}, "ep004": {"nsteps": 500, "reward": -1.094580393433571, "good_angle": 0.09763253251803804, "survival_time": 16.666666666666654, "traveled_tiles": 5, "valid_direction": 0}}
good_angle_max: 24.514739373338095
good_angle_mean: 5.860008291685672
good_angle_median: 0.09763253251803804
good_angle_min: 0.009215519679207373
reward_max: -1.094580393433571
reward_mean: -10.25237295696382
reward_median: -7.793226854240193
reward_min: -26.19490526071409
survival_time_max: 16.666666666666654
survival_time_mean: 6.6866666666666585
survival_time_min: 1.3666666666666676
traveled_tiles_max: 5
traveled_tiles_mean: 2.2
traveled_tiles_median: 2
traveled_tiles_min: 1
valid_direction_max: 3.933333333333321
valid_direction_mean: 1.3799999999999968
valid_direction_median: 0
valid_direction_min: 0