Submission 10007

Submission: 10007
Competing: yes
Challenge: aido5-LF-sim-validation
User: Fernanda Custodio Pereira do Carmo 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58078
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

58078

Episodes (per-episode statistics images omitted):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
58078 | LFv-sim | success | yes | | | 0:08:03 |
Artefacts hidden.
driven_lanedir_consec_median: 0.9583491653967502
survival_time_median: 12.825000000000047
deviation-center-line_median: 0.2547305599370908
in-drivable-lane_median: 5.950000000000053
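
These headline values are consistent with being the medians of the corresponding per-episode numbers listed under "per-episodes details" below. A minimal sketch (not part of the challenge tooling) that reproduces survival_time_median from those per-episode values; up to floating-point rounding it matches the reported 12.825:

import statistics

# survival_time for each of the four episodes, copied from the
# per-episodes details further down.
survival_times = [
    13.80000000000006,   # LF-norm-loop-000
    3.999999999999994,   # LF-norm-zigzag-000
    19.100000000000136,  # LF-norm-techtrack-000
    11.850000000000032,  # LF-norm-small_loop-000
]

# With an even number of episodes, statistics.median averages the two
# middle values: (11.85 + 13.8) / 2 ≈ 12.825.
print(statistics.median(survival_times))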


other stats
agent_compute-ego0_max: 0.013760181120884271
agent_compute-ego0_mean: 0.013395286801816016
agent_compute-ego0_median: 0.013391170925909822
agent_compute-ego0_min: 0.013038624234560156
complete-iteration_max: 0.21904865900675455
complete-iteration_mean: 0.18771556826503116
complete-iteration_median: 0.1861686754354331
complete-iteration_min: 0.15947626318250382
deviation-center-line_max: 0.4059488212230585
deviation-center-line_mean: 0.2515773791682832
deviation-center-line_min: 0.09089957557589272
deviation-heading_max: 1.776266984775825
deviation-heading_mean: 1.1274414930720094
deviation-heading_median: 0.9931028972796732
deviation-heading_min: 0.7472931929528661
driven_any_max: 4.027322879942491
driven_any_mean: 2.489975950128337
driven_any_median: 2.6353373100148243
driven_any_min: 0.6619063005412074
driven_lanedir_consec_max: 1.9746614650590584
driven_lanedir_consec_mean: 1.022951777979726
driven_lanedir_consec_min: 0.2004473160663447
driven_lanedir_max: 1.9746614650590584
driven_lanedir_mean: 1.022951777979726
driven_lanedir_median: 0.9583491653967502
driven_lanedir_min: 0.2004473160663447
get_duckie_state_max: 1.342208297164352e-06
get_duckie_state_mean: 1.2887666804021392e-06
get_duckie_state_median: 1.2743194499822456e-06
get_duckie_state_min: 1.2642195244797135e-06
get_robot_state_max: 0.0040181199813324854
get_robot_state_mean: 0.0038217659781835983
get_robot_state_median: 0.0038302275945928874
get_robot_state_min: 0.0036084887422161336
get_state_dump_max: 0.004900071595119656
get_state_dump_mean: 0.004753543013465223
get_state_dump_median: 0.004750163568170743
get_state_dump_min: 0.004613773322399752
get_ui_image_max: 0.03618711601068944
get_ui_image_mean: 0.03248600975879877
get_ui_image_median: 0.03313638676620907
get_ui_image_min: 0.027484149492087485
in-drivable-lane_max: 13.550000000000118
in-drivable-lane_mean: 7.012500000000054
in-drivable-lane_min: 2.5999999999999934
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.8533819152229634, "get_ui_image": 0.03178111999043489, "step_physics": 0.10131800906322493, "survival_time": 13.80000000000006, "driven_lanedir": 1.9746614650590584, "get_state_dump": 0.004900071595119656, "get_robot_state": 0.003926508693488496, "sim_render-ego0": 0.004092902483062193, "get_duckie_state": 1.2824681691744698e-06, "in-drivable-lane": 4.500000000000064, "deviation-heading": 1.14073761278928, "agent_compute-ego0": 0.013240853801961407, "complete-iteration": 0.1744195838267192, "set_robot_commands": 0.0024053430729394356, "deviation-center-line": 0.4059488212230585, "driven_lanedir_consec": 1.9746614650590584, "sim_compute_sim_state": 0.010497464169664072, "sim_compute_performance-ego0": 0.002162056279096362},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.6619063005412074, "get_ui_image": 0.03618711601068944, "step_physics": 0.14244090774912893, "survival_time": 3.999999999999994, "driven_lanedir": 0.2004473160663447, "get_state_dump": 0.004613773322399752, "get_robot_state": 0.0036084887422161336, "sim_render-ego0": 0.003988595656406732, "get_duckie_state": 1.342208297164352e-06, "in-drivable-lane": 2.5999999999999934, "deviation-heading": 0.7472931929528661, "agent_compute-ego0": 0.013760181120884271, "complete-iteration": 0.21904865900675455, "set_robot_commands": 0.0022278803366201894, "deviation-center-line": 0.09089957557589272, "driven_lanedir_consec": 0.2004473160663447, "sim_compute_sim_state": 0.010130287688455464, "sim_compute_performance-ego0": 0.001999198654551565},
 "LF-norm-techtrack-000-ego0": {"driven_any": 4.027322879942491, "get_ui_image": 0.03449165354198326, "step_physics": 0.11999416911570895, "survival_time": 19.100000000000136, "driven_lanedir": 1.0385283777616323, "get_state_dump": 0.004802264992935539, "get_robot_state": 0.0040181199813324854, "sim_render-ego0": 0.004163443264077289, "get_duckie_state": 1.2661707307900212e-06, "in-drivable-lane": 13.550000000000118, "deviation-heading": 1.776266984775825, "agent_compute-ego0": 0.013541488049858231, "complete-iteration": 0.19791776704414707, "set_robot_commands": 0.002385422081611175, "deviation-center-line": 0.2981451925677786, "driven_lanedir_consec": 1.0385283777616323, "sim_compute_sim_state": 0.012225043369024292, "sim_compute_performance-ego0": 0.00220112887748539},
 "LF-norm-small_loop-000-ego0": {"driven_any": 2.4172927048066857, "get_ui_image": 0.027484149492087485, "step_physics": 0.0969581233353174, "survival_time": 11.850000000000032, "driven_lanedir": 0.8781699530318678, "get_state_dump": 0.004698062143405946, "get_robot_state": 0.003733946495697278, "sim_render-ego0": 0.003862025357094131, "get_duckie_state": 1.2642195244797135e-06, "in-drivable-lane": 7.400000000000041, "deviation-heading": 0.8454681817700663, "agent_compute-ego0": 0.013038624234560156, "complete-iteration": 0.15947626318250382, "set_robot_commands": 0.002156973886890572, "deviation-center-line": 0.21131592730640297, "driven_lanedir_consec": 0.8781699530318678, "sim_compute_sim_state": 0.00554183150539879, "sim_compute_performance-ego0": 0.0019138380259024995}}
set_robot_commands_max: 0.0024053430729394356
set_robot_commands_mean: 0.0022939048445153433
set_robot_commands_median: 0.002306651209115682
set_robot_commands_min: 0.002156973886890572
sim_compute_performance-ego0_max: 0.00220112887748539
sim_compute_performance-ego0_mean: 0.0020690554592589544
sim_compute_performance-ego0_median: 0.002080627466823964
sim_compute_performance-ego0_min: 0.0019138380259024995
sim_compute_sim_state_max: 0.012225043369024292
sim_compute_sim_state_mean: 0.009598656683135654
sim_compute_sim_state_median: 0.010313875929059768
sim_compute_sim_state_min: 0.00554183150539879
sim_render-ego0_max: 0.004163443264077289
sim_render-ego0_mean: 0.0040267416901600865
sim_render-ego0_median: 0.004040749069734463
sim_render-ego0_min: 0.003862025357094131
simulation-passed: 1
step_physics_max: 0.14244090774912893
step_physics_mean: 0.11517780231584504
step_physics_median: 0.11065608908946696
step_physics_min: 0.0969581233353174
survival_time_max: 19.100000000000136
survival_time_mean: 12.187500000000057
survival_time_min: 3.999999999999994
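
The aggregate statistics above can be recomputed from the per-episodes details blob. A minimal sketch, assuming that blob has been saved to a file named per_episode_details.json (a hypothetical filename, not produced by the evaluator); results should match the listed values up to floating-point rounding:

import json
import statistics

# Load the per-episode metrics; keys are episode names such as
# "LF-norm-loop-000-ego0", values are dicts of metric -> number.
with open("per_episode_details.json") as f:
    episodes = json.load(f)

# Every metric name that appears in at least one episode.
metric_names = sorted({name for ep in episodes.values() for name in ep})

for name in metric_names:
    values = [ep[name] for ep in episodes.values()]
    # These lines correspond to the <metric>_max/_mean/_median/_min
    # entries listed under "other stats" above.
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {statistics.mean(values)}")
    print(f"{name}_median: {statistics.median(values)}")
    print(f"{name}_min: {min(values)}")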
No reset possible
58063 | LFv-sim | host-error | yes | | | 0:07:03 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 803, in get_cr
    uploaded = upload_files(wd, aws_config, copy_to_machine_cache=copy_to_machine_cache)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/uploading.py", line 45, in upload_files
    uploaded = upload(aws_config, toupload, copy_to_machine_cache=copy_to_machine_cache)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/uploading.py", line 265, in upload
    shutil.copy(realfile0, realfile)
  File "/usr/lib/python3.8/shutil.py", line 415, in copy
    copyfile(src, dst, follow_symlinks=follow_symlinks)
  File "/usr/lib/python3.8/shutil.py", line 272, in copyfile
    _fastcopy_sendfile(fsrc, fdst)
  File "/usr/lib/python3.8/shutil.py", line 163, in _fastcopy_sendfile
    raise err from None
  File "/usr/lib/python3.8/shutil.py", line 149, in _fastcopy_sendfile
    sent = os.sendfile(outfd, infd, offset, blocksize)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/aido5-LF-sim-validation/submission10007/LFv-sim-nogpu-prod-07_10a9ab190fec-job58063-a-wd/challenge-evaluation-output/episodes/LF-norm-techtrack-000/log.gs2.cbor' -> '/tmp/tmprtzlb8evlog.gs2.cbor'
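
This host error is the evaluation runner running out of disk space while copying an episode log into /tmp for upload; shutil.copy then failed with errno 28 (ENOSPC). As an illustration only (this is not the runner's code, and the function name and paths are hypothetical), a guard that checks free space on the destination filesystem before attempting such a copy:

import os
import shutil

def copy_with_space_check(src: str, dst: str) -> None:
    # Compare the size of the source file against the free space on the
    # filesystem that will hold the destination, and bail out early instead
    # of letting shutil.copy fail mid-transfer with errno 28 (ENOSPC).
    needed = os.path.getsize(src)
    free = shutil.disk_usage(os.path.dirname(dst) or ".").free
    if free < needed:
        raise OSError(28, f"need {needed} bytes for {src!r}, only {free} free")
    shutil.copy(src, dst)

# Hypothetical usage mirroring the copy in the traceback above:
# copy_with_space_check("episodes/LF-norm-techtrack-000/log.gs2.cbor", "/tmp/log.gs2.cbor")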
Artefacts hidden.
No reset possible