Job 60260 | LFVIv-sim | success | yes | 0:02:50

Scores:
driven_lanedir_consec_median | 0.7072560191923074
survival_time_median | 2.799999999999998
deviation-center-line_median | 0.24972215695454592
in-drivable-lane_median | 0.10000000000000007

Other stats (aggregated over the two episodes):
metric | min | median | mean | max
agent_compute-ego0 | 0.04335480386560613 | 0.04369307783516971 | 0.04369307783516971 | 0.04403135180473328
complete-iteration | 0.3065496791492809 | 0.32580081915313547 | 0.32580081915313547 | 0.34505195915699005
deviation-center-line | 0.15269387547966196 | 0.24972215695454592 | 0.24972215695454592 | 0.3467504384294299
deviation-heading | 0.571605138090023 | 1.4302746349169937 | 1.4302746349169937 | 2.288944131743964
driven_any | 0.6444548565108991 | 0.8753046369706095 | 0.8753046369706095 | 1.10615441743032
driven_lanedir | 0.6198398227601656 | 0.7084321572014403 | 0.7084321572014403 | 0.7970244916427148
driven_lanedir_consec | 0.6198398227601656 | 0.7072560191923074 | 0.7072560191923074 | 0.7946722156244491
get_duckie_state | 1.2914339701334636e-06 | 1.3248486952348189e-06 | 1.3248486952348189e-06 | 1.3582634203361743e-06
get_robot_state | 0.003690799077351888 | 0.003726070577448065 | 0.003726070577448065 | 0.0037613420775442414
get_state_dump | 0.004798986695029519 | 0.00481905946225831 | 0.00481905946225831 | 0.0048391322294871015
get_ui_image | 0.03816957546002937 | 0.038527711097038154 | 0.038527711097038154 | 0.038885846734046936
in-drivable-lane | 0.0 | 0.10000000000000007 | 0.10000000000000007 | 0.20000000000000015
set_robot_commands | 0.002294865521517667 | 0.0023176085316773615 | 0.0023176085316773615 | 0.0023403515418370566
sim_compute_performance-ego0 | 0.0019872734944025674 | 0.002015688428373048 | 0.002015688428373048 | 0.0020441033623435283
sim_compute_sim_state | 0.009262263774871826 | 0.009640721660671812 | 0.009640721660671812 | 0.010019179546471798
sim_render-ego0 | 0.0038235485553741455 | 0.003848520643783338 | 0.003848520643783338 | 0.00387349273219253
step_physics | 0.19814210588281805 | 0.21712452895713577 | 0.21712452895713577 | 0.23610695203145343
survival_time | 2.3499999999999996 | 2.799999999999998 | 2.799999999999998 | 3.2499999999999964
simulation-passed | 1

Per-episode details:
metric | LFI-norm-4way-000-ego0 | LFI-norm-udem1-000-ego0
driven_any | 1.10615441743032 | 0.6444548565108991
get_ui_image | 0.03816957546002937 | 0.038885846734046936
step_physics | 0.19814210588281805 | 0.23610695203145343
survival_time | 3.2499999999999964 | 2.3499999999999996
driven_lanedir | 0.7970244916427148 | 0.6198398227601656
get_state_dump | 0.004798986695029519 | 0.0048391322294871015
get_robot_state | 0.0037613420775442414 | 0.003690799077351888
sim_render-ego0 | 0.00387349273219253 | 0.0038235485553741455
get_duckie_state | 1.3582634203361743e-06 | 1.2914339701334636e-06
in-drivable-lane | 0.20000000000000015 | 0.0
deviation-heading | 2.288944131743964 | 0.571605138090023
agent_compute-ego0 | 0.04335480386560613 | 0.04403135180473328
complete-iteration | 0.3065496791492809 | 0.34505195915699005
set_robot_commands | 0.002294865521517667 | 0.0023403515418370566
deviation-center-line | 0.3467504384294299 | 0.15269387547966196
driven_lanedir_consec | 0.7946722156244491 | 0.6198398227601656
sim_compute_sim_state | 0.010019179546471798 | 0.009262263774871826
sim_compute_performance-ego0 | 0.0020441033623435283 | 0.0019872734944025674

No reset possible
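
Each "Other stats" row above is a simple aggregate of the two per-episode values; with only two episodes, the median equals the mean. A minimal sketch of that aggregation, assuming the per-episodes blob has been parsed into a Python dict (only two metrics are copied here to keep the example short):

    import statistics

    # Values copied from the per-episode details above.
    per_episodes = {
        "LFI-norm-4way-000-ego0": {
            "survival_time": 3.2499999999999964,
            "driven_lanedir_consec": 0.7946722156244491,
        },
        "LFI-norm-udem1-000-ego0": {
            "survival_time": 2.3499999999999996,
            "driven_lanedir_consec": 0.6198398227601656,
        },
    }

    def aggregate(per_episodes):
        """Summarize each metric across episodes as min/median/mean/max."""
        metrics = sorted({m for ep in per_episodes.values() for m in ep})
        summary = {}
        for m in metrics:
            values = [ep[m] for ep in per_episodes.values() if m in ep]
            summary[m] = {
                "min": min(values),
                "median": statistics.median(values),
                "mean": statistics.mean(values),
                "max": max(values),
            }
        return summary

    print(aggregate(per_episodes))
    # survival_time: min 2.3499999999999996, median == mean == 2.799999999999998,
    # max 3.2499999999999964, matching the summary rows in the table above.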
Job 60258 | LFVIv-sim | host-error | yes | 0:02:16

Uncaught exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 59, in get_services_id
raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError:
│ container_ids: [bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9,
│ 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751,
│ bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70,
│ c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7,
│ 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79]
│ services: dict[7]
│ │ npc0:
│ │ dict[6]
│ │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│ │ │ environment:
│ │ │ dict[12]
│ │ │ │ AIDONODE_NAME: npc0
│ │ │ │ AIDONODE_DATA_IN: /fifos/npc0-in
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc0-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ npc1:
│ │ dict[6]
│ │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│ │ │ environment:
│ │ │ dict[12]
│ │ │ │ AIDONODE_NAME: npc1
│ │ │ │ AIDONODE_DATA_IN: /fifos/npc1-in
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc1-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ npc2:
│ │ dict[6]
│ │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│ │ │ environment:
│ │ │ dict[12]
│ │ │ │ AIDONODE_NAME: npc2
│ │ │ │ AIDONODE_DATA_IN: /fifos/npc2-in
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc2-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ npc3:
│ │ dict[6]
│ │ │ image: docker.io/duckietown/challenge-aido_lf-minimal-agent-full@sha256:2914fa743f9d20457cff19df787b230fc8489a3bdd17d9f3b7bbcd3cb4541f15
│ │ │ environment:
│ │ │ dict[12]
│ │ │ │ AIDONODE_NAME: npc3
│ │ │ │ AIDONODE_DATA_IN: /fifos/npc3-in
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/npc3-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ evaluator:
│ │ dict[7]
│ │ │ image: docker.io/andreacensi/aido5-lfi-sim-validation-lfviv-sim-evaluator@sha256:8089b2f843cd0a02b70b085c33ca0766989b1321b3c3a8e64fbd698ec01dbd79
│ │ │ environment:
│ │ │ dict[10]
│ │ │ │ experiment_manager_parameters:
│ │ │ │ |episodes_per_scenario: 1
│ │ │ │ |episode_length_s: 60.0
│ │ │ │ |min_episode_length_s: 0.0
│ │ │ │ |seed: 20200922
│ │ │ │ |physics_dt: 0.05
│ │ │ │ |max_failures: 2
│ │ │ │ |fifo_dir: /fifos
│ │ │ │ |sim_in: /fifos/simulator-in
│ │ │ │ |sim_out: /fifos/simulator-out
│ │ │ │ |sm_in: /fifos/scenario_maker-in
│ │ │ │ |sm_out: /fifos/scenario_maker-out
│ │ │ │ |timeout_initialization: 120
│ │ │ │ |timeout_regular: 120
│ │ │ │ |port: 10123
│ │ │ │ |scenarios:
│ │ │ │ |- /scenarios
│ │ │ │ |
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ ports: [10123]
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ simulator:
│ │ dict[6]
│ │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:c0096866077db3574e425d40603d8f5fc8ebbd164da7c0578df94ff4ede58d95
│ │ │ environment:
│ │ │ dict[12]
│ │ │ │ AIDONODE_CONFIG:
│ │ │ │ |env_constructor: Simulator
│ │ │ │ |env_parameters:
│ │ │ │ | max_steps: 500001 # we don't want the gym to reset itself
│ │ │ │ | domain_rand: 0
│ │ │ │ | camera_width: 640
│ │ │ │ | camera_height: 480
│ │ │ │ | distortion: true
│ │ │ │ | num_tris_distractors: 0
│ │ │ │ | color_ground: [0, 0.3, 0] # green
│ │ │ │ | enable_leds: true
│ │ │ │ |
│ │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ │ solution-ego0:
│ │ dict[6]
│ │ │ image: docker.io/marcsita/aido-submissions@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ environment:
│ │ │ dict[13]
│ │ │ │ AIDONODE_NAME: ego0
│ │ │ │ AIDONODE_DATA_IN: /fifos/ego0-in
│ │ │ │ AIDO_REQUIRE_GPU: 1
│ │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego0-out
│ │ │ │ challenge_name: aido5-LFI-sim-validation
│ │ │ │ challenge_step_name: LFVIv-sim
│ │ │ │ submission_id: 13010
│ │ │ │ submitter_name: timur-BMEConti
│ │ │ │ SUBMISSION_CONTAINER: docker.io/marcsita/aido-submissions:2020_12_08_17_26_53@sha256:9f1550e1a06e04910bc59b9879a4e10ce9de673725046a0896d35b0a883731b0
│ │ │ │ username: ubuntu
│ │ │ │ uid: 0
│ │ │ │ USER: ubuntu
│ │ │ │ HOME: /fake-home/ubuntu
│ │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_71d9d0a909df}
│ │ │ user: 0:0
│ │ │ volumes:
│ │ │ [
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-wd:/challenges:rw,
│ │ │ /tmp/duckietown/aido5-LFI-sim-validation/submission13010/LFVIv-sim-gpu-prod-01_71d9d0a909df-job60258-a-fifos:/fifos:rw,
│ │ │ /tmp/duckietown/dt-challenges-runner/20_12_08_19_58_00-99462/fake-ubuntu-home:/fake-home/ubuntu:rw]
│ │ │ networks: {evaluation: {aliases: [evaluation]} }
│ res: dict[5]
│ │ npc0: bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9
│ │ npc1: 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751
│ │ npc2: bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70
│ │ npc3: c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7
│ │ solution-ego0: 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79
│ names: dict[5]
│ │ bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9: gpu-prod-01_71d9d0a909df-job60258-432240_npc0_1
│ │ 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751: gpu-prod-01_71d9d0a909df-job60258-432240_npc1_1
│ │ bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70: gpu-prod-01_71d9d0a909df-job60258-432240_npc2_1
│ │ c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7: gpu-prod-01_71d9d0a909df-job60258-432240_npc3_1
│ │ 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79: gpu-prod-01_71d9d0a909df-job60258-432240_solution-ego0_1
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 745, in get_cr
cr = run_single(
File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 959, in run_single
write_logs(wd, project, services=config["services"])
File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: |bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9
│ |5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751
│ |bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70
│ |c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7
│ |8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79
│ |
│ names: dict[5]
│ │ bbea4f25456ab4484007e9f9b4ea0b349db251c6b67653f4dc2da70e53aef4e9: gpu-prod-01_71d9d0a909df-job60258-432240_npc0_1
│ │ 5947eb7c4fe891e520747e1f80811685ab6b5d968ff4a0700f1ede927f281751: gpu-prod-01_71d9d0a909df-job60258-432240_npc1_1
│ │ bc7e886d62c8ce1fdd2569018cf94e1228243cc961bb7a5f6606ba2b36592b70: gpu-prod-01_71d9d0a909df-job60258-432240_npc2_1
│ │ c6bc0fde0c3a31e44d1896096fd358886065e8a0840e1cb8da2204f4fc6784d7: gpu-prod-01_71d9d0a909df-job60258-432240_npc3_1
│ │ 8719ae615b38dfa98df55bcb0b77206aa5fccc527bc6243b5c0ef4a461cd4a79: gpu-prod-01_71d9d0a909df-job60258-432240_solution-ego0_1
No reset possible
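
Context for the failure: the compose project defines seven services (npc0-npc3, evaluator, simulator, solution-ego0) but only five container IDs were resolved; evaluator and simulator do not appear in res. get_services_id therefore raised ZValueError, write_logs could not map services to containers, and the job ended as host-error rather than a scoring failure. The sketch below illustrates the kind of service-to-container lookup involved, using the Docker SDK for Python and the labels docker-compose places on its containers; it is an illustration only, not the duckietown-challenges-runner implementation, and the project name in the example call is inferred from the container names above.

    # Hypothetical sketch: resolve docker-compose service names to container IDs
    # via the com.docker.compose.* labels. NOT the duckietown-challenges-runner code.
    from typing import Dict, List

    import docker  # Docker SDK for Python

    def map_services_to_containers(project: str, services: List[str]) -> Dict[str, str]:
        client = docker.from_env()
        mapping: Dict[str, str] = {}
        for service in services:
            # docker-compose labels every container with its project and service name.
            containers = client.containers.list(
                all=True,
                filters={
                    "label": [
                        f"com.docker.compose.project={project}",
                        f"com.docker.compose.service={service}",
                    ]
                },
            )
            if containers:
                mapping[service] = containers[0].id
        missing = [s for s in services if s not in mapping]
        if missing:
            # In the log above this check would trip with
            # missing == ["evaluator", "simulator"]: 7 services, 5 containers.
            raise ValueError(f"no container found for services: {missing}")
        return mapping

    # Example call with the services from the dump; the project name is
    # inferred from the container names and may not be exact:
    # map_services_to_containers(
    #     "gpu-prod-01_71d9d0a909df-job60258-432240",
    #     ["npc0", "npc1", "npc2", "npc3", "evaluator", "simulator", "solution-ego0"],
    # )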