
Submission 4737

Submission: 4737
Competing: yes
Challenge: aido3-LF-sim-validation
User: Nikolaus Howe
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 27145
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 27145

Episodes (the original page shows a clickable image with detailed statistics for each):

- ETHZ_autolab_technical_track-0-0
- ETHZ_autolab_technical_track-1-0
- ETHZ_autolab_technical_track-2-0
- ETHZ_autolab_technical_track-3-0
- ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 27145
Step: step1-simulation
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:12:15
Message:
driven_lanedir_consec_median: 1.7859860082915429
survival_time_median: 11.900000000000034
deviation-center-line_median: 0.4054971791279752
in-drivable-lane_median: 0.14999999999999947


Other stats

agent_compute-ego_max: 0.0560741662979126
agent_compute-ego_mean: 0.04859915433855338
agent_compute-ego_median: 0.0520161209987993
agent_compute-ego_min: 0.02953320614835049
deviation-center-line_max: 1.0435080083945425
deviation-center-line_mean: 0.5236800338915093
deviation-center-line_min: 0.15524570367014923
deviation-heading_max: 3.0634377207231083
deviation-heading_mean: 1.7835526916089202
deviation-heading_median: 1.3892097926413982
deviation-heading_min: 1.1107892633243697
driven_any_max: 2.349149090530479
driven_any_mean: 1.693452943026837
driven_any_median: 1.8589231521295957
driven_any_min: 0.6940596831542876
driven_lanedir_consec_max: 2.332713323533033
driven_lanedir_consec_mean: 1.612161109064505
driven_lanedir_consec_min: 0.6279773655469074
driven_lanedir_max: 2.332713323533033
driven_lanedir_mean: 1.612161109064505
driven_lanedir_median: 1.7859860082915429
driven_lanedir_min: 0.6279773655469074
in-drivable-lane_max: 0.5000000000000071
in-drivable-lane_mean: 0.21000000000000135
in-drivable-lane_min: 0
Per-episode details (JSON, one object per episode):

"ETHZ_autolab_technical_track-0-0": {"driven_any": 0.6940596831542876, "sim_physics": 0.12062779639629607, "survival_time": 4.699999999999991, "driven_lanedir": 0.6279773655469074, "sim_render-ego": 0.013527649514218594, "in-drivable-lane": 0.14999999999999947, "agent_compute-ego": 0.02953320614835049, "deviation-heading": 1.1107892633243697, "set_robot_commands": 0.012482988073470746, "deviation-center-line": 0.15524570367014923, "driven_lanedir_consec": 0.6279773655469074, "sim_compute_sim_state": 0.006225707683157414, "sim_compute_performance-ego": 0.008230244859736016, "sim_compute_robot_state-ego": 0.01009810985402858}

"ETHZ_autolab_technical_track-1-0": {"driven_any": 1.8589231521295957, "sim_physics": 0.12900555534523075, "survival_time": 11.900000000000034, "driven_lanedir": 1.7859860082915429, "sim_render-ego": 0.014066752265481389, "in-drivable-lane": 0.10000000000000142, "agent_compute-ego": 0.0520161209987993, "deviation-heading": 2.2284118017237367, "set_robot_commands": 0.011623417629915126, "deviation-center-line": 0.4054971791279752, "driven_lanedir_consec": 1.7859860082915429, "sim_compute_sim_state": 0.005615558944830373, "sim_compute_performance-ego": 0.00831326616912329, "sim_compute_robot_state-ego": 0.010815402038958892}

"ETHZ_autolab_technical_track-2-0": {"driven_any": 2.349101560452834, "sim_physics": 0.11938103675842283, "survival_time": 14.950000000000076, "driven_lanedir": 2.1692824839651053, "sim_render-ego": 0.01347795327504476, "in-drivable-lane": 0.5000000000000071, "agent_compute-ego": 0.053672333558400474, "deviation-heading": 3.0634377207231083, "set_robot_commands": 0.010804093678792318, "deviation-center-line": 1.0435080083945425, "driven_lanedir_consec": 2.1692824839651053, "sim_compute_sim_state": 0.0053222004572550455, "sim_compute_performance-ego": 0.008217520713806152, "sim_compute_robot_state-ego": 0.01011972188949585}

"ETHZ_autolab_technical_track-3-0": {"driven_any": 1.21603122886699, "sim_physics": 0.09587442573112775, "survival_time": 7.89999999999998, "driven_lanedir": 1.144846363985936, "sim_render-ego": 0.011223286013059976, "in-drivable-lane": 0.29999999999999893, "agent_compute-ego": 0.051699944689304014, "deviation-heading": 1.1259148796319878, "set_robot_commands": 0.008843541145324707, "deviation-center-line": 0.347828110497652, "driven_lanedir_consec": 1.144846363985936, "sim_compute_sim_state": 0.004635410972788364, "sim_compute_performance-ego": 0.006876696514177926, "sim_compute_robot_state-ego": 0.008323507972910434}

"ETHZ_autolab_technical_track-4-0": {"driven_any": 2.349149090530479, "sim_physics": 0.12848081668217978, "survival_time": 14.950000000000076, "driven_lanedir": 2.332713323533033, "sim_render-ego": 0.013599771658579509, "in-drivable-lane": 0, "agent_compute-ego": 0.0560741662979126, "deviation-heading": 1.3892097926413982, "set_robot_commands": 0.010960843563079834, "deviation-center-line": 0.6663211677672276, "driven_lanedir_consec": 2.332713323533033, "sim_compute_sim_state": 0.005592251618703206, "sim_compute_performance-ego": 0.008177241484324138, "sim_compute_robot_state-ego": 0.010318324565887452}
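The aggregate statistics reported on this page (the `_median`, `_mean`, `_min`, `_max` fields) can be reproduced directly from the per-episode values. A minimal sketch using Python's `statistics` module, with the five `survival_time` values copied from the per-episode JSON above:

```python
import statistics

# survival_time per episode, copied from the per-episode details above
survival_times = {
    "ETHZ_autolab_technical_track-0-0": 4.699999999999991,
    "ETHZ_autolab_technical_track-1-0": 11.900000000000034,
    "ETHZ_autolab_technical_track-2-0": 14.950000000000076,
    "ETHZ_autolab_technical_track-3-0": 7.89999999999998,
    "ETHZ_autolab_technical_track-4-0": 14.950000000000076,
}

values = list(survival_times.values())
# median is the middle of the 5 sorted values
print("median:", statistics.median(values))  # 11.900000000000034
print("mean:  ", statistics.mean(values))    # ~10.88, matching survival_time_mean
print("min:   ", min(values))                # 4.699999999999991
print("max:   ", max(values))                # 14.950000000000076
```

The same computation over any other metric key (e.g. `deviation-center-line`) reproduces the corresponding aggregate rows.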
set_robot_commands_max: 0.012482988073470746
set_robot_commands_mean: 0.010942976818116543
set_robot_commands_median: 0.010960843563079834
set_robot_commands_min: 0.008843541145324707
sim_compute_performance-ego_max: 0.00831326616912329
sim_compute_performance-ego_mean: 0.007962993948233504
sim_compute_performance-ego_median: 0.008217520713806152
sim_compute_performance-ego_min: 0.006876696514177926
sim_compute_robot_state-ego_max: 0.010815402038958892
sim_compute_robot_state-ego_mean: 0.00993501326425624
sim_compute_robot_state-ego_median: 0.01011972188949585
sim_compute_robot_state-ego_min: 0.008323507972910434
sim_compute_sim_state_max: 0.006225707683157414
sim_compute_sim_state_mean: 0.00547822593534688
sim_compute_sim_state_median: 0.005592251618703206
sim_compute_sim_state_min: 0.004635410972788364
sim_physics_max: 0.12900555534523075
sim_physics_mean: 0.11867392618265143
sim_physics_median: 0.12062779639629607
sim_physics_min: 0.09587442573112775
sim_render-ego_max: 0.014066752265481389
sim_render-ego_mean: 0.013179082545276846
sim_render-ego_median: 0.013527649514218594
sim_render-ego_min: 0.011223286013059976
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 10.880000000000033
survival_time_min: 4.699999999999991
Job ID: 27112
Step: step1-simulation
Status: host-error
Up to date: yes
Duration: 0:12:11
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 648, in get_cr
    wd, aws_config, copy_to_machine_cache=copy_to_machine_cache
  File "/project/src/duckietown_challenges_runner/runner.py", line 1309, in upload_files
    aws_config, toupload, copy_to_machine_cache=copy_to_machine_cache
  File "/project/src/duckietown_challenges_runner/runner.py", line 1565, in upload
    copy_to_cache(realfile, sha256hex)
  File "/project/src/duckietown_challenges_runner/runner_cache.py", line 40, in copy_to_cache
    msg = "Copying %s to cache %s" % (friendly_size(os.stat(fn).st_size), have)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido3-LF-sim-validation/submission4737/step1-simulation-ip-172-31-35-218-10490-job27112/challenge-evaluation-output/episodes/ETHZ_autolab_technical_track-1-0/drawing.html'
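The host-error above comes from `os.stat()` being called on an artefact (`drawing.html`) that no longer existed by the time the runner tried to copy it into its cache. This is not the actual `runner_cache.py` code, but a hedged sketch of how such a `copy_to_cache` step could tolerate files that vanish between being listed and being copied (the function name and `cache_dir` parameter here are illustrative assumptions):

```python
import os
import shutil

def copy_to_cache(fn: str, sha256hex: str, cache_dir: str = "/tmp/cache") -> bool:
    """Copy an artefact into a content-addressed local cache.

    Skips (rather than crashes on) files that disappeared after being
    enumerated -- the failure mode seen in job 27112.
    Returns True if the file ended up in the cache.
    """
    if not os.path.exists(fn):
        # The artefact vanished (e.g. a cleanup raced with the upload);
        # log and continue instead of failing the whole evaluation job.
        print("skipping missing artefact: %s" % fn)
        return False
    os.makedirs(cache_dir, exist_ok=True)
    dest = os.path.join(cache_dir, sha256hex)
    if not os.path.exists(dest):
        shutil.copyfile(fn, dest)
    return True
```

A stricter alternative is to catch `FileNotFoundError` around the copy itself, since the file can still disappear between the existence check and `shutil.copyfile`.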