
Submission 3130

Submission: 3130
Competing: yes
Challenge: aido2-LF-sim-validation
User: Alexander Karavaev
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22583
Next:
User label: test
Admin priority: 50
Blessing: n/a
User priority: 50

Job 22583

Episodes evaluated in this job (per-episode images with detailed statistics omitted):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
22583 | step1-simulation | success | yes | | | 0:15:15 |
Artefacts hidden.
driven_lanedir_consec_median: 0.9221394331500324
survival_time_median: 6.149999999999986
deviation-center-line_median: 0.4226862984983189
in-drivable-lane_median: 0.1999999999999993


other stats
agent_compute-ego_max: 0.11781038210063638
agent_compute-ego_mean: 0.11142841017281048
agent_compute-ego_median: 0.11065754294395448
agent_compute-ego_min: 0.10752142735613072
deviation-center-line_max: 0.6951430902366791
deviation-center-line_mean: 0.4388432062203975
deviation-center-line_min: 0.19467892430488812
deviation-heading_max: 2.3674462508475824
deviation-heading_mean: 1.5459877064107794
deviation-heading_median: 1.3045578283954786
deviation-heading_min: 0.9292287122039666
driven_any_max: 1.2883041900232195
driven_any_mean: 0.894629152250439
driven_any_median: 0.934779392424075
driven_any_min: 0.5168419197383907
driven_lanedir_consec_max: 1.1194072319537969
driven_lanedir_consec_mean: 0.7623114540758053
driven_lanedir_consec_min: 0.1982626620490111
driven_lanedir_max: 1.1194072319537969
driven_lanedir_mean: 0.7732286949614735
driven_lanedir_median: 0.9221394331500324
driven_lanedir_min: 0.25284886647735216
in-drivable-lane_max: 1.1999999999999966
in-drivable-lane_mean: 0.36999999999999994
in-drivable-lane_min: 0
per-episode details:

ETHZ_autolab_technical_track-0-0:
  driven_any: 0.7743219983875073
  sim_physics: 0.14190215101608863
  survival_time: 5.1999999999999895
  driven_lanedir: 0.6306078328295646
  sim_render-ego: 0.07910280732008126
  in-drivable-lane: 0.1999999999999993
  agent_compute-ego: 0.11065754294395448
  deviation-heading: 2.057643717504747
  set_robot_commands: 0.11015061461008512
  deviation-center-line: 0.3138611195856906
  driven_lanedir_consec: 0.6306078328295646
  sim_compute_sim_state: 0.0441336058653318
  sim_compute_performance-ego: 0.07905893371655391
  sim_compute_robot_state-ego: 0.08329962767087497

ETHZ_autolab_technical_track-1-0:
  driven_any: 0.5168419197383907
  sim_physics: 0.15258930098842566
  survival_time: 3.5499999999999954
  driven_lanedir: 0.25284886647735216
  sim_render-ego: 0.07310131234182439
  in-drivable-lane: 1.1999999999999966
  agent_compute-ego: 0.11221699983301298
  deviation-heading: 1.3045578283954786
  set_robot_commands: 0.11413730701930086
  deviation-center-line: 0.19467892430488812
  driven_lanedir_consec: 0.1982626620490111
  sim_compute_sim_state: 0.04355610592264525
  sim_compute_performance-ego: 0.08157729430937431
  sim_compute_robot_state-ego: 0.09279083198224994

ETHZ_autolab_technical_track-2-0:
  driven_any: 0.934779392424075
  sim_physics: 0.14471169409713125
  survival_time: 6.149999999999986
  driven_lanedir: 0.9221394331500324
  sim_render-ego: 0.07163887295296521
  in-drivable-lane: 0
  agent_compute-ego: 0.10752142735613072
  deviation-heading: 0.9292287122039666
  set_robot_commands: 0.1144604663538739
  deviation-center-line: 0.6951430902366791
  driven_lanedir_consec: 0.9221394331500324
  sim_compute_sim_state: 0.04413312237437179
  sim_compute_performance-ego: 0.08185324242444543
  sim_compute_robot_state-ego: 0.08958624630439573

ETHZ_autolab_technical_track-3-0:
  driven_any: 0.9588982606790024
  sim_physics: 0.14207042588127983
  survival_time: 6.299999999999986
  driven_lanedir: 0.9411401103966216
  sim_render-ego: 0.07139614271739173
  in-drivable-lane: 0
  agent_compute-ego: 0.1089356986303178
  deviation-heading: 1.071062023102122
  set_robot_commands: 0.11016838891165597
  deviation-center-line: 0.567846598476411
  driven_lanedir_consec: 0.9411401103966216
  sim_compute_sim_state: 0.04470344006069123
  sim_compute_performance-ego: 0.08112561324286083
  sim_compute_robot_state-ego: 0.08675991164313422

ETHZ_autolab_technical_track-4-0:
  driven_any: 1.2883041900232195
  sim_physics: 0.14253605197289743
  survival_time: 8.349999999999984
  driven_lanedir: 1.1194072319537969
  sim_render-ego: 0.07212043236829564
  in-drivable-lane: 0.4500000000000038
  agent_compute-ego: 0.11781038210063638
  deviation-heading: 2.3674462508475824
  set_robot_commands: 0.11471227828614013
  deviation-center-line: 0.4226862984983189
  driven_lanedir_consec: 1.1194072319537969
  sim_compute_sim_state: 0.04473019788365165
  sim_compute_performance-ego: 0.08378471157507981
  sim_compute_robot_state-ego: 0.09208942030718228
set_robot_commands_max: 0.11471227828614013
set_robot_commands_mean: 0.1127258110362112
set_robot_commands_median: 0.11413730701930086
set_robot_commands_min: 0.11015061461008512
sim_compute_performance-ego_max: 0.08378471157507981
sim_compute_performance-ego_mean: 0.08147995905366287
sim_compute_performance-ego_median: 0.08157729430937431
sim_compute_performance-ego_min: 0.07905893371655391
sim_compute_robot_state-ego_max: 0.09279083198224994
sim_compute_robot_state-ego_mean: 0.08890520758156742
sim_compute_robot_state-ego_median: 0.08958624630439573
sim_compute_robot_state-ego_min: 0.08329962767087497
sim_compute_sim_state_max: 0.04473019788365165
sim_compute_sim_state_mean: 0.04425129442133834
sim_compute_sim_state_median: 0.0441336058653318
sim_compute_sim_state_min: 0.04355610592264525
sim_physics_max: 0.15258930098842566
sim_physics_mean: 0.14476192479116456
sim_physics_median: 0.14253605197289743
sim_physics_min: 0.14190215101608863
sim_render-ego_max: 0.07910280732008126
sim_render-ego_mean: 0.07347191354011165
sim_render-ego_median: 0.07212043236829564
sim_render-ego_min: 0.07139614271739173
simulation-passed: 1
survival_time_max: 8.349999999999984
survival_time_mean: 5.909999999999988
survival_time_min: 3.5499999999999954
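
The aggregate rows above are plain per-metric statistics over the five episodes. As a sanity check, here is a minimal Python sketch that reproduces the survival_time aggregates from the per-episode details listed earlier; it uses only the standard library and assumes nothing about the Duckietown scoring code beyond the numbers shown on this page.

    import statistics

    # Per-episode survival_time values, copied from the per-episode details above.
    survival_time = [
        5.1999999999999895,  # ETHZ_autolab_technical_track-0-0
        3.5499999999999954,  # ETHZ_autolab_technical_track-1-0
        6.149999999999986,   # ETHZ_autolab_technical_track-2-0
        6.299999999999986,   # ETHZ_autolab_technical_track-3-0
        8.349999999999984,   # ETHZ_autolab_technical_track-4-0
    ]

    print("survival_time_min:   ", min(survival_time))                # 3.5499999999999954
    print("survival_time_median:", statistics.median(survival_time))  # 6.149999999999986
    print("survival_time_mean:  ", statistics.mean(survival_time))    # ~5.9099999999999886, matching the reported mean up to float rounding
    print("survival_time_max:   ", max(survival_time))                # 8.349999999999984

The other aggregate rows follow the same min/mean/median/max pattern over the five episodes.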
21429 | step1-simulation | success | no | | | 0:30:08 |
Artefacts hidden.
21428 | step1-simulation | host-error | no | | | 0:08:20 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 483, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 332, in convert
    data = open(log).read().strip()
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3130/step1-simulation-ip-172-31-42-167-7227-job21428/logs/challenges-runner/stdout.log'
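
The host-error for job 21428 comes from the evaluation runner itself rather than from the submission: when the setup_logging context exits, its log-conversion step reads a stdout.log that was never written and raises FileNotFoundError. A minimal sketch of a more defensive read, based only on what is visible in the traceback (the function name and empty-string fallback below are illustrative assumptions, not the actual runner.py code):

    from pathlib import Path

    def read_captured_log(log_path: str) -> str:
        """Return the captured log contents, or '' if the file was never created.

        Illustrative only: the real duckietown_challenges_runner convert()
        helper may handle a missing log differently.
        """
        path = Path(log_path)
        if not path.exists():
            # The evaluator can die before any output is captured; a missing
            # log file should not raise and obscure the underlying failure.
            return ""
        return path.read_text().strip()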
Artefacts hidden.