
Evaluator 103

ID: 103
Evaluator ID: idsc-rudolf-11141
Owner: Andrea Censi πŸ‡¨πŸ‡­
Machine ID: idsc-rudolf
Process ID: idsc-rudolf-11141
Version: DC 3.1.42; DCR 3.2.4; DTS 3.0.29
First heard:
Last heard:
Status: inactive
# evaluating:
# success: 4,482
# timeout:
# failed: 1,468
# error:
# aborted:
# host-error:
Capabilities and resources:
arm
x86_64
Mac
GPU available
Number of processors
Processor frequency (MHz)
Free % of processors
RAM total (MB)
RAM free (MB)
Disk (MB)
Disk available (MB)
Docker Hub
P1
P2
Cloud simulations
Pi Camera
# Duckiebots
Map 3x3 available
Number of duckies
GPU cores
AIDO 2 Map LF public
AIDO 2 Map LF private
AIDO 2 Map LFV public
AIDO 2 Map LFV private
AIDO 2 Map LFVI public
AIDO 2 Map LFVI private
AIDO 3 Map LF public
AIDO 3 Map LF private
AIDO 3 Map LFV public
AIDO 3 Map LFV private
AIDO 3 Map LFVI public
AIDO 3 Map LFVI private
AIDO 5 Map large loop
ETU track (for 2021, the map is ETH_small_inter)
IPFS mountpoint /ipfs available
IPNS mountpoint /ipns available

Evaluator jobs

Job ID | submission | user | user label | challenge | step | status | up to date | evaluator | date started | date completed | duration | message
503 | 224 | Julian Zilly | Baseline solution using imitation learning from logs | aido1_LF1_r3-v3 | step1-simulation | success | no | idsc-rudolf-11141 | | | 0:04:46
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
other stats:
simulation-passed: 1
492 | 267 | Andrea Censi πŸ‡¨πŸ‡­ | Random execution | aido1_LF1_r3-v3 | step2-scoring | success | no | idsc-rudolf-11141 | | | 0:01:37
survival_time_median: 4.866666666666659
other stats:
episodes details:
episode | nsteps | reward | good_angle | survival_time | traveled_tiles | valid_direction
ep000 | 262 | -5.360776264106727 | 3.8727163243445295 | 8.733333333333313 | 3 | 3.3666666666666556
ep001 | 206 | -5.986628623424127 | 23.322008728745647 | 6.866666666666652 | 2 | 3.966666666666653
ep002 | 146 | -9.33520656824112 | 25.90880075988976 | 4.866666666666659 | 2 | 3.9333333333333256
ep003 | 109 | -9.662303757007926 | 1.1208104660652094 | 3.63333333333333 | 1 | 2.233333333333329
ep004 | 82 | -12.854354936402382 | 0.6882428680029298 | 2.7333333333333334 | 1 | 1.8
good_angle_max: 25.90880075988976
good_angle_mean: 10.982515829409616
good_angle_median: 3.8727163243445295
good_angle_min: 0.6882428680029298
reward_max: -5.360776264106727
reward_mean: -8.639854029836457
reward_median: -9.33520656824112
reward_min: -12.854354936402382
survival_time_max: 8.733333333333313
survival_time_mean: 5.366666666666658
survival_time_min: 2.7333333333333334
traveled_tiles_max: 3
traveled_tiles_mean: 1.8
traveled_tiles_median: 2
traveled_tiles_min: 1
valid_direction_max: 3.966666666666653
valid_direction_mean: 3.0599999999999925
valid_direction_median: 3.3666666666666556
valid_direction_min: 1.8
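The aggregate statistics reported for job 492 are plain per-episode reductions of the details dictionary. A minimal sketch of how they can be recomputed (the `aggregate` helper is illustrative, not the actual scoring code; the episode data is copied from the log above):

```python
import json
from statistics import mean, median

# Per-episode details exactly as reported by the step2-scoring job.
details = json.loads("""{
 "ep000": {"nsteps": 262, "reward": -5.360776264106727, "good_angle": 3.8727163243445295, "survival_time": 8.733333333333313, "traveled_tiles": 3, "valid_direction": 3.3666666666666556},
 "ep001": {"nsteps": 206, "reward": -5.986628623424127, "good_angle": 23.322008728745647, "survival_time": 6.866666666666652, "traveled_tiles": 2, "valid_direction": 3.966666666666653},
 "ep002": {"nsteps": 146, "reward": -9.33520656824112, "good_angle": 25.90880075988976, "survival_time": 4.866666666666659, "traveled_tiles": 2, "valid_direction": 3.9333333333333256},
 "ep003": {"nsteps": 109, "reward": -9.662303757007926, "good_angle": 1.1208104660652094, "survival_time": 3.63333333333333, "traveled_tiles": 1, "valid_direction": 2.233333333333329},
 "ep004": {"nsteps": 82, "reward": -12.854354936402382, "good_angle": 0.6882428680029298, "survival_time": 2.7333333333333334, "traveled_tiles": 1, "valid_direction": 1.8}
}""")

def aggregate(metric: str) -> dict:
    """Reduce one metric across all episodes to min/max/mean/median."""
    values = [ep[metric] for ep in details.values()]
    return {
        "min": min(values),
        "max": max(values),
        "mean": mean(values),
        "median": median(values),
    }

stats = aggregate("survival_time")
print(stats["median"])  # 4.866666666666659, matching survival_time_median above
```

With five episodes the median is simply the middle value, which is why survival_time_median equals episode ep002's survival time.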
489 | 16 | Andrea Censi πŸ‡¨πŸ‡­ | Solution template | aido1_test_multistep-v3 | step2 | success | yes | idsc-rudolf-11141 | | | 0:00:29
passed-step2: 1
482 | 23 | Manfred Diaz | Solution template | aido1_test_multistep-v3 | step1 | success | yes | idsc-rudolf-11141 | | | 0:00:44
passed-step1: 1
468 | 188 | Brian Wang πŸ‡ΊπŸ‡Έ | ROS-based Lane Following | aido1_LF1_r3-v3 | step1-simulation | failed | no | idsc-rudolf-11141 | | | 0:00:59
InvalidSubmission:
Traceback (most recent call last):
  File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
    raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
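The failure in job 468 is an environment version mismatch: the submission requests the `-v1` environment while the evaluator only registers `-v0`. A sketch of the kind of lookup that produces such a message (the env ids come from the log above; the registry dict and `resolve_env` helper are illustrative, not the actual duckietown-challenges code):

```python
# Sketch of the version check behind the InvalidSubmission message above.
# REGISTERED_ENVS mirrors what this evaluator had installed; resolve_env
# is a hypothetical helper, not the real evaluator API.

class InvalidSubmission(Exception):
    pass

REGISTERED_ENVS = ["Duckietown-Lf-Lfv-Navv-Silent-v0"]

def resolve_env(requested: str) -> str:
    """Return the requested env id, or raise listing the valid alternatives."""
    if requested in REGISTERED_ENVS:
        return requested
    # Suggest envs that share the base name but differ in the -vN suffix.
    base = requested.rsplit("-v", 1)[0]
    valid = [e for e in REGISTERED_ENVS if e.rsplit("-v", 1)[0] == base]
    raise InvalidSubmission(
        f"Env {requested} not found (valid versions include {valid})"
    )

try:
    resolve_env("Duckietown-Lf-Lfv-Navv-Silent-v1")  # what the submission asked for
except InvalidSubmission as e:
    print(e)  # Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
```

Pinning the submission to the environment version the evaluator actually registers (here, `-v0`) avoids this class of failure.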