
Submission 10784

Submission: 10784
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57882
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57882

Episodes evaluated in this job (per-episode statistics appear under "per-episodes details" below):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
57882 | LFv-sim | success | yes | | | 0:15:42 |
Artefacts hidden.
driven_lanedir_consec_median: 2.6959790711489267
survival_time_median: 19.575000000000145
deviation-center-line_median: 0.908797111367774
in-drivable-lane_median: 4.975000000000037
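These headline values are consistent with taking the median of the per-episode numbers listed in the per-episodes details further down; with four episodes, the median is the average of the two middle values. A minimal sketch in Python, using the survival times copied from this page:

from statistics import median

# Per-episode survival times from the per-episodes details below (seconds).
survival_times = [37.25000000000002, 29.85000000000029, 7.79999999999998, 9.299999999999995]

# With four episodes the median averages the two middle values:
# sorted -> [7.8, 9.3, 29.85, 37.25], so (9.3 + 29.85) / 2 ≈ 19.575
print(median(survival_times))  # ~19.575, the survival_time_median reported above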


other stats
agent_compute-ego0_max: 0.013580323784214676
agent_compute-ego0_mean: 0.0130431065877683
agent_compute-ego0_median: 0.013034669800479098
agent_compute-ego0_min: 0.012522762965900328
complete-iteration_max: 0.24792072844744523
complete-iteration_mean: 0.2057082546077018
complete-iteration_median: 0.19797049490355223
complete-iteration_min: 0.17897130017625745
deviation-center-line_max: 2.1286879696316365
deviation-center-line_mean: 1.0180384958595516
deviation-center-line_min: 0.12587179107102192
deviation-heading_max: 9.231661776586725
deviation-heading_mean: 5.101650421949353
deviation-heading_median: 5.2650610689908035
deviation-heading_min: 0.6448177732290822
driven_any_max: 7.416483668855816
driven_any_mean: 4.121079586350292
driven_any_median: 3.858746531987403
driven_any_min: 1.350341612570544
driven_lanedir_consec_max: 5.303701535421841
driven_lanedir_consec_mean: 2.8398597130255245
driven_lanedir_consec_min: 0.6637791743824045
driven_lanedir_max: 5.303701535421841
driven_lanedir_mean: 2.8450061934823068
driven_lanedir_median: 2.7062720320624902
driven_lanedir_min: 0.6637791743824045
get_duckie_state_max: 1.6491883879254578e-06
get_duckie_state_mean: 1.5147609606420568e-06
get_duckie_state_median: 1.5337762311909806e-06
get_duckie_state_min: 1.3423029922608077e-06
get_robot_state_max: 0.004009840594735115
get_robot_state_mean: 0.0038066493374032022
get_robot_state_median: 0.003852321194277987
get_robot_state_min: 0.003512114366321717
get_state_dump_max: 0.0050584404331863305
get_state_dump_mean: 0.004776017479670094
get_state_dump_median: 0.004797295237739684
get_state_dump_min: 0.004451039010014674
get_ui_image_max: 0.036327090151732584
get_ui_image_mean: 0.03138552596197424
get_ui_image_median: 0.03110904157468273
get_ui_image_min: 0.02699693054679891
in-drivable-lane_max: 9.499999999999762
in-drivable-lane_mean: 5.887499999999956
in-drivable-lane_min: 4.099999999999985
per-episodes details (a parsing sketch follows this listing): {"LF-norm-loop-000-ego0": {"driven_any": 7.416483668855816, "get_ui_image": 0.02699693054679891, "step_physics": 0.11418696926362391, "survival_time": 37.25000000000002, "driven_lanedir": 5.303701535421841, "get_state_dump": 0.004451039010014674, "get_robot_state": 0.003512114366321717, "sim_render-ego0": 0.0037277083614236863, "get_duckie_state": 1.3423029922608077e-06, "in-drivable-lane": 9.499999999999762, "deviation-heading": 9.231661776586725, "agent_compute-ego0": 0.012522762965900328, "complete-iteration": 0.17897130017625745, "set_robot_commands": 0.002137977380215642, "deviation-center-line": 1.6031963125027653, "driven_lanedir_consec": 5.303701535421841, "sim_compute_sim_state": 0.009419757303539614, "sim_compute_performance-ego0": 0.00193334201066168}, "LF-norm-zigzag-000-ego0": {"driven_any": 6.002151977569645, "get_ui_image": 0.036327090151732584, "step_physics": 0.1699936238419651, "survival_time": 29.85000000000029, "driven_lanedir": 4.713280140139978, "get_state_dump": 0.004756044783321113, "get_robot_state": 0.003834875531021169, "sim_render-ego0": 0.003953063368398609, "get_duckie_state": 1.5273939406991404e-06, "in-drivable-lane": 4.850000000000069, "deviation-heading": 8.91030084981301, "agent_compute-ego0": 0.013176585918286174, "complete-iteration": 0.24792072844744523, "set_robot_commands": 0.002217942256991281, "deviation-center-line": 2.1286879696316365, "driven_lanedir_consec": 4.709195499253962, "sim_compute_sim_state": 0.011450436202977414, "sim_compute_performance-ego0": 0.002118747369900196}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.350341612570544, "get_ui_image": 0.03349168133583798, "step_physics": 0.1419233592452517, "survival_time": 7.79999999999998, "driven_lanedir": 0.6637791743824045, "get_state_dump": 0.0050584404331863305, "get_robot_state": 0.004009840594735115, "sim_render-ego0": 0.004223144737778196, "get_duckie_state": 1.6491883879254578e-06, "in-drivable-lane": 4.099999999999985, "deviation-heading": 0.6448177732290822, "agent_compute-ego0": 0.013580323784214676, "complete-iteration": 0.21604206607599927, "set_robot_commands": 0.0023057870804124576, "deviation-center-line": 0.12587179107102192, "driven_lanedir_consec": 0.6637791743824045, "sim_compute_sim_state": 0.00909691099907942, "sim_compute_performance-ego0": 0.002252451173818795}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.7153410864051608, "get_ui_image": 0.02872640181352748, "step_physics": 0.1154589257775781, "survival_time": 9.299999999999995, "driven_lanedir": 0.6992639239850025, "get_state_dump": 0.004838545692158256, "get_robot_state": 0.0038697668575348064, "sim_render-ego0": 0.004033254429618305, "get_duckie_state": 1.5401585216828209e-06, "in-drivable-lane": 5.100000000000004, "deviation-heading": 1.6198212881685958, "agent_compute-ego0": 0.01289275368267202, "complete-iteration": 0.17989892373110522, "set_robot_commands": 0.002365123779378473, "deviation-center-line": 0.21439791023278296, "driven_lanedir_consec": 0.6827626430438911, "sim_compute_sim_state": 0.0055283696893702215, "sim_compute_performance-ego0": 0.002100950893871287}}
set_robot_commands_max: 0.002365123779378473
set_robot_commands_mean: 0.0022567076242494635
set_robot_commands_median: 0.0022618646687018696
set_robot_commands_min: 0.002137977380215642
sim_compute_performance-ego0_max: 0.002252451173818795
sim_compute_performance-ego0_mean: 0.00210137286206299
sim_compute_performance-ego0_median: 0.0021098491318857415
sim_compute_performance-ego0_min: 0.00193334201066168
sim_compute_sim_state_max: 0.011450436202977414
sim_compute_sim_state_mean: 0.008873868548741667
sim_compute_sim_state_median: 0.009258334151309515
sim_compute_sim_state_min: 0.0055283696893702215
sim_render-ego0_max: 0.004223144737778196
sim_render-ego0_mean: 0.0039842927243047
sim_render-ego0_median: 0.003993158899008457
sim_render-ego0_min: 0.0037277083614236863
simulation-passed: 1
step_physics_max: 0.1699936238419651
step_physics_mean: 0.1353907195321047
step_physics_median: 0.1286911425114149
step_physics_min: 0.11418696926362391
survival_time_max: 37.25000000000002
survival_time_mean: 21.05000000000007
survival_time_min: 7.79999999999998
57877 | LFv-sim | success | yes | | | 0:10:50 |
Artefacts hidden.