Duckietown Challenges

Submission 13010

Submission: 13010
Competing: yes
Challenge: aido5-LFI-sim-validation
User: Márton Tim 🇭🇺
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFVIv-sim: 60260
Next:
User label: 3626
Admin priority: 50
Blessing: n/a
User priority: 50

Job 60260

Episodes (images with detailed per-episode statistics omitted):
- LFI-norm-4way-000
- LFI-norm-udem1-000

Artefact directory: challenge-solution-output

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
60260 | LFVIv-sim | success | yes | | | 0:02:50 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 0.7072560191923074
survival_time_median: 2.799999999999998
deviation-center-line_median: 0.24972215695454592
in-drivable-lane_median: 0.10000000000000007


other stats:
agent_compute-ego0_max: 0.04403135180473328
agent_compute-ego0_mean: 0.04369307783516971
agent_compute-ego0_median: 0.04369307783516971
agent_compute-ego0_min: 0.04335480386560613
complete-iteration_max: 0.34505195915699005
complete-iteration_mean: 0.32580081915313547
complete-iteration_median: 0.32580081915313547
complete-iteration_min: 0.3065496791492809
deviation-center-line_max: 0.3467504384294299
deviation-center-line_mean: 0.24972215695454592
deviation-center-line_min: 0.15269387547966196
deviation-heading_max: 2.288944131743964
deviation-heading_mean: 1.4302746349169937
deviation-heading_median: 1.4302746349169937
deviation-heading_min: 0.571605138090023
driven_any_max: 1.10615441743032
driven_any_mean: 0.8753046369706095
driven_any_median: 0.8753046369706095
driven_any_min: 0.6444548565108991
driven_lanedir_consec_max: 0.7946722156244491
driven_lanedir_consec_mean: 0.7072560191923074
driven_lanedir_consec_min: 0.6198398227601656
driven_lanedir_max: 0.7970244916427148
driven_lanedir_mean: 0.7084321572014403
driven_lanedir_median: 0.7084321572014403
driven_lanedir_min: 0.6198398227601656
get_duckie_state_max: 1.3582634203361743e-06
get_duckie_state_mean: 1.3248486952348189e-06
get_duckie_state_median: 1.3248486952348189e-06
get_duckie_state_min: 1.2914339701334636e-06
get_robot_state_max: 0.0037613420775442414
get_robot_state_mean: 0.003726070577448065
get_robot_state_median: 0.003726070577448065
get_robot_state_min: 0.003690799077351888
get_state_dump_max: 0.0048391322294871015
get_state_dump_mean: 0.00481905946225831
get_state_dump_median: 0.00481905946225831
get_state_dump_min: 0.004798986695029519
get_ui_image_max: 0.038885846734046936
get_ui_image_mean: 0.038527711097038154
get_ui_image_median: 0.038527711097038154
get_ui_image_min: 0.03816957546002937
in-drivable-lane_max: 0.20000000000000015
in-drivable-lane_mean: 0.10000000000000007
in-drivable-lane_min: 0.0
per-episodes details:
{
  "LFI-norm-4way-000-ego0": {
    "driven_any": 1.10615441743032,
    "get_ui_image": 0.03816957546002937,
    "step_physics": 0.19814210588281805,
    "survival_time": 3.2499999999999964,
    "driven_lanedir": 0.7970244916427148,
    "get_state_dump": 0.004798986695029519,
    "get_robot_state": 0.0037613420775442414,
    "sim_render-ego0": 0.00387349273219253,
    "get_duckie_state": 1.3582634203361743e-06,
    "in-drivable-lane": 0.20000000000000015,
    "deviation-heading": 2.288944131743964,
    "agent_compute-ego0": 0.04335480386560613,
    "complete-iteration": 0.3065496791492809,
    "set_robot_commands": 0.002294865521517667,
    "deviation-center-line": 0.3467504384294299,
    "driven_lanedir_consec": 0.7946722156244491,
    "sim_compute_sim_state": 0.010019179546471798,
    "sim_compute_performance-ego0": 0.0020441033623435283
  },
  "LFI-norm-udem1-000-ego0": {
    "driven_any": 0.6444548565108991,
    "get_ui_image": 0.038885846734046936,
    "step_physics": 0.23610695203145343,
    "survival_time": 2.3499999999999996,
    "driven_lanedir": 0.6198398227601656,
    "get_state_dump": 0.0048391322294871015,
    "get_robot_state": 0.003690799077351888,
    "sim_render-ego0": 0.0038235485553741455,
    "get_duckie_state": 1.2914339701334636e-06,
    "in-drivable-lane": 0.0,
    "deviation-heading": 0.571605138090023,
    "agent_compute-ego0": 0.04403135180473328,
    "complete-iteration": 0.34505195915699005,
    "set_robot_commands": 0.0023403515418370566,
    "deviation-center-line": 0.15269387547966196,
    "driven_lanedir_consec": 0.6198398227601656,
    "sim_compute_sim_state": 0.009262263774871826,
    "sim_compute_performance-ego0": 0.0019872734944025674
  }
}
set_robot_commands_max: 0.0023403515418370566
set_robot_commands_mean: 0.0023176085316773615
set_robot_commands_median: 0.0023176085316773615
set_robot_commands_min: 0.002294865521517667
sim_compute_performance-ego0_max: 0.0020441033623435283
sim_compute_performance-ego0_mean: 0.002015688428373048
sim_compute_performance-ego0_median: 0.002015688428373048
sim_compute_performance-ego0_min: 0.0019872734944025674
sim_compute_sim_state_max: 0.010019179546471798
sim_compute_sim_state_mean: 0.009640721660671812
sim_compute_sim_state_median: 0.009640721660671812
sim_compute_sim_state_min: 0.009262263774871826
sim_render-ego0_max: 0.00387349273219253
sim_render-ego0_mean: 0.003848520643783338
sim_render-ego0_median: 0.003848520643783338
sim_render-ego0_min: 0.0038235485553741455
simulation-passed: 1
step_physics_max: 0.23610695203145343
step_physics_mean: 0.21712452895713577
step_physics_median: 0.21712452895713577
step_physics_min: 0.19814210588281805
survival_time_max: 3.2499999999999964
survival_time_mean: 2.799999999999998
survival_time_min: 2.3499999999999996
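Each aggregate row above (min, max, mean, median) is an order statistic over the per-episode values listed in the per-episodes details; with only two episodes, the median coincides with the mean. A minimal sketch of that aggregation, using the survival_time values from the two episodes (the helper name `aggregate` is ours, not the runner's API):

```python
import statistics

# Per-episode survival_time values from the per-episodes details above
survival_times = [3.2499999999999964, 2.3499999999999996]

def aggregate(values):
    """Return the min/max/mean/median rows shown on the job page."""
    return {
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
    }

stats = aggregate(survival_times)
print(stats["mean"])    # close to survival_time_mean above (about 2.8)
print(stats["median"])  # with two episodes, median is also the midpoint
```

The same computation over the eighteen per-episode keys reproduces every `_min`/`_max`/`_mean`/`_median` row in the stats lists.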
60258 | LFVIv-sim | host-error | yes | | | 0:02:16 | Uncaught exception
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 59, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9,
│                 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751,
│                 bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70,
│                 c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7,
│                 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79]
│      services: dict[7]
│                │ npc0:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_NAME: npc0
│                │ │ │ AIDONODE_DATA_IN: /fifos/npc0-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc0-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ npc1:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_NAME: npc1
│                │ │ │ AIDONODE_DATA_IN: /fifos/npc1-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc1-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ npc2:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_NAME: npc2
│                │ │ │ AIDONODE_DATA_IN: /fifos/npc2-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc2-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ npc3:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_NAME: npc3
│                │ │ │ AIDONODE_DATA_IN: /fifos/npc3-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc3-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ evaluator:
│                │ dict[7]
│                │ │ image: docker.io/andreacensi/aido5-lfi-sim-validation-lfviv-sim-evaluator@sha256:8089b2f843cd0a02b70b085c33ca0766989b1321b3c3a8e64fbd698ec01dbd79
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 60.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 20200922
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |port: 10123
│                │ │ │ |scenarios:
│                │ │ │ |- /scenarios
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ ports: [10123]
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:c0096866077db3574e425d40603d8f5fc8ebbd164da7c0578df94ff4ede58d95
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ solution-ego0:
│                │ dict[6]
│                │ │ image: docker.io/marcsita/aido-submissions@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ environment:
│                │ │ dict[13]
│                │ │ │ AIDONODE_NAME: ego0
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego0-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego0-out
│                │ │ │ challenge_name: aido5-LFI-sim-validation
│                │ │ │ challenge_step_name: LFVIv-sim
│                │ │ │ submission_id: 13010
│                │ │ │ submitter_name: timur-BMEConti
│                │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[5]
│                │ npc0: bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9
│                │ npc1: 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751
│                │ npc2: bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70
│                │ npc3: c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7
│                │ solution-ego0: 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79
│         names: dict[5]
│                │ bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9: gpu-prod-01_71d9d0a909df-job60258-432240_npc0_1
│                │ 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751: gpu-prod-01_71d9d0a909df-job60258-432240_npc1_1
│                │ bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70: gpu-prod-01_71d9d0a909df-job60258-432240_npc2_1
│                │ c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7: gpu-prod-01_71d9d0a909df-job60258-432240_npc3_1
│                │ 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79: gpu-prod-01_71d9d0a909df-job60258-432240_solution-ego0_1

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 745, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 959, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
    services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
    raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: |bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9
│         |5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751
│         |bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70
│         |c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7
│         |8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79
│         |
│  names: dict[5]
│         │ bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9: gpu-prod-01_71d9d0a909df-job60258-432240_npc0_1
│         │ 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751: gpu-prod-01_71d9d0a909df-job60258-432240_npc1_1
│         │ bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70: gpu-prod-01_71d9d0a909df-job60258-432240_npc2_1
│         │ c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7: gpu-prod-01_71d9d0a909df-job60258-432240_npc3_1
│         │ 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79: gpu-prod-01_71d9d0a909df-job60258-432240_solution-ego0_1
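The dump above shows the cause of the host-error: the compose config declares seven services (npc0 through npc3, evaluator, simulator, solution-ego0), but only five container IDs came back, so the runner could not map every service to a container while collecting logs and raised ZValueError, which surfaced as DockerComposeFail. A hypothetical sketch of such a mapping, relying on docker-compose's container-name convention `<project>_<service>_<index>`; the function and its behavior are our reconstruction, not the runner's actual code (IDs shortened for readability):

```python
def match_services(names, services):
    """Map each compose service name to its container ID.

    names:    {container_id: container_name}, as in the error dump above
    services: iterable of service names from the compose config

    Raises ValueError when a service has no running container,
    analogous to the ZValueError raised by get_services_id.
    """
    result = {}
    for service in services:
        for cid, cname in names.items():
            # docker-compose names containers "<project>_<service>_<index>",
            # so strip the trailing "_<index>" and match on the service suffix
            prefix, _, _ = cname.rpartition("_")
            if prefix.endswith("_" + service):
                result[service] = cid
                break
        else:
            raise ValueError(f"no container found for service {service!r}")
    return result

names = {
    "bbea4f25": "gpu-prod-01_71d9d0a909df-job60258-432240_npc0_1",
    "8719ae61": "gpu-prod-01_71d9d0a909df-job60258-432240_solution-ego0_1",
}
print(match_services(names, ["npc0", "solution-ego0"]))
# {'npc0': 'bbea4f25', 'solution-ego0': '8719ae61'}
```

In the dump, `res` contains entries for only five of the seven services; the `evaluator` and `simulator` containers were already gone, so this kind of lookup had nothing to match and the runner aborted while writing logs.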
60257 | LFVIv-sim | success | yes | | | 0:02:35 |