
Submission 5539

Submission: 5539
Competing: yes
Challenge: aido3-LF-sim-validation
User: Moustafa Elarabi
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28905
Next:
User label: challenge-aido_LF-template-pytorch
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28905

Detailed per-episode statistics are available for each of the five evaluated episodes:

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 28905
Step: step1-simulation
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:05:54
Message:
Scores

driven_lanedir_consec_median: 1.318070427436712
survival_time_median: 9.6
deviation-center-line_median: 0.4819230030931238
in-drivable-lane_median: 1.699999999999994
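
These headline scores appear to be medians taken over the five evaluated episodes; the per-episode values are listed under "Per-episode details" below. A minimal Python sketch verifying two of them, assuming plain per-episode medians (this is not the evaluator's own code):

from statistics import median

# Per-episode values copied from the per-episode details of job 28905.
survival_time = [7.399999999999982, 9.900000000000006, 6.5999999999999845, 9.6, 14.950000000000076]
driven_lanedir_consec = [0.6845611874117301, 1.8223019729740737, 0.9571671797882436, 1.318070427436712, 2.512983000838762]

print(median(survival_time))          # 9.6, matches survival_time_median
print(median(driven_lanedir_consec))  # 1.318070427436712, matches driven_lanedir_consec_median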


Other stats

agent_compute-ego_max: 0.02093431111928579
agent_compute-ego_mean: 0.01944092402933661
agent_compute-ego_median: 0.0191118327776591
agent_compute-ego_min: 0.018995108703772228
deviation-center-line_max: 0.6265022816733171
deviation-center-line_mean: 0.46004271259828994
deviation-center-line_min: 0.2867268275114114
deviation-heading_max: 2.0242437060604708
deviation-heading_mean: 1.311702754109721
deviation-heading_median: 1.0439911555373131
deviation-heading_min: 0.8254563693484318
driven_any_max: 2.8600415936172827
driven_any_mean: 1.8122074017128424
driven_any_median: 1.78384910123139
driven_any_min: 1.0685871956752109
driven_lanedir_consec_max: 2.512983000838762
driven_lanedir_consec_mean: 1.459016753689904
driven_lanedir_consec_min: 0.6845611874117301
driven_lanedir_max: 2.512983000838762
driven_lanedir_mean: 1.459016753689904
driven_lanedir_median: 1.318070427436712
driven_lanedir_min: 0.6845611874117301
in-drivable-lane_max: 2.450000000000019
in-drivable-lane_mean: 1.710000000000011
in-drivable-lane_min: 1.099999999999996
Per-episode details

ETHZ_autolab_technical_track-0-0:
    driven_any: 1.0685871956752109
    sim_physics: 0.058464799378369306
    survival_time: 7.399999999999982
    driven_lanedir: 0.6845611874117301
    sim_render-ego: 0.0074719609440983955
    in-drivable-lane: 1.699999999999994
    agent_compute-ego: 0.02093431111928579
    deviation-heading: 0.9962398321850736
    set_robot_commands: 0.005252245310190562
    deviation-center-line: 0.2867268275114114
    driven_lanedir_consec: 0.6845611874117301
    sim_compute_sim_state: 0.003050805749119939
    sim_compute_performance-ego: 0.004382400899320035
    sim_compute_robot_state-ego: 0.005206022713635419

ETHZ_autolab_technical_track-1-0:
    driven_any: 2.186092462942246
    sim_physics: 0.056092705389465945
    survival_time: 9.900000000000006
    driven_lanedir: 1.8223019729740737
    sim_render-ego: 0.007202625274658203
    in-drivable-lane: 1.40000000000002
    agent_compute-ego: 0.01913925011952718
    deviation-heading: 1.0439911555373131
    set_robot_commands: 0.005159398522039857
    deviation-center-line: 0.6265022816733171
    driven_lanedir_consec: 1.8223019729740737
    sim_compute_sim_state: 0.0029705618367050633
    sim_compute_performance-ego: 0.0040780968136257595
    sim_compute_robot_state-ego: 0.0051418133456297595

ETHZ_autolab_technical_track-2-0:
    driven_any: 1.162466655098083
    sim_physics: 0.05569361195419774
    survival_time: 6.5999999999999845
    driven_lanedir: 0.9571671797882436
    sim_render-ego: 0.007308838945446592
    in-drivable-lane: 1.099999999999996
    agent_compute-ego: 0.019024117426438766
    deviation-heading: 1.6685827074173154
    set_robot_commands: 0.00514132326299494
    deviation-center-line: 0.5714222556891025
    driven_lanedir_consec: 0.9571671797882436
    sim_compute_sim_state: 0.003042271642973929
    sim_compute_performance-ego: 0.004161845554005016
    sim_compute_robot_state-ego: 0.005199390830415668

ETHZ_autolab_technical_track-3-0:
    driven_any: 1.78384910123139
    sim_physics: 0.05694925909241041
    survival_time: 9.6
    driven_lanedir: 1.318070427436712
    sim_render-ego: 0.007215812802314758
    in-drivable-lane: 2.450000000000019
    agent_compute-ego: 0.018995108703772228
    deviation-heading: 0.8254563693484318
    set_robot_commands: 0.005131768683592479
    deviation-center-line: 0.3336391950244951
    driven_lanedir_consec: 1.318070427436712
    sim_compute_sim_state: 0.002978246659040451
    sim_compute_performance-ego: 0.0040942442913850146
    sim_compute_robot_state-ego: 0.005130507051944733

ETHZ_autolab_technical_track-4-0:
    driven_any: 2.8600415936172827
    sim_physics: 0.05535748640696208
    survival_time: 14.950000000000076
    driven_lanedir: 2.512983000838762
    sim_render-ego: 0.007222949663798014
    in-drivable-lane: 1.900000000000027
    agent_compute-ego: 0.0191118327776591
    deviation-heading: 2.0242437060604708
    set_robot_commands: 0.005043076674143473
    deviation-center-line: 0.4819230030931238
    driven_lanedir_consec: 2.512983000838762
    sim_compute_sim_state: 0.002968886693318685
    sim_compute_performance-ego: 0.004114123980204264
    sim_compute_robot_state-ego: 0.005134984652201335
set_robot_commands_max: 0.005252245310190562
set_robot_commands_mean: 0.005145562490592262
set_robot_commands_median: 0.00514132326299494
set_robot_commands_min: 0.005043076674143473
sim_compute_performance-ego_max: 0.004382400899320035
sim_compute_performance-ego_mean: 0.004166142307708018
sim_compute_performance-ego_median: 0.004114123980204264
sim_compute_performance-ego_min: 0.0040780968136257595
sim_compute_robot_state-ego_max: 0.005206022713635419
sim_compute_robot_state-ego_mean: 0.0051625437187653835
sim_compute_robot_state-ego_median: 0.0051418133456297595
sim_compute_robot_state-ego_min: 0.005130507051944733
sim_compute_sim_state_max: 0.003050805749119939
sim_compute_sim_state_mean: 0.003002154516231614
sim_compute_sim_state_median: 0.002978246659040451
sim_compute_sim_state_min: 0.002968886693318685
sim_physics_max: 0.058464799378369306
sim_physics_mean: 0.05651157244428109
sim_physics_median: 0.056092705389465945
sim_physics_min: 0.05535748640696208
sim_render-ego_max: 0.0074719609440983955
sim_render-ego_mean: 0.007284437526063193
sim_render-ego_median: 0.007222949663798014
sim_render-ego_min: 0.007202625274658203
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 9.690000000000008
survival_time_min: 6.5999999999999845
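
Each aggregate row above (the *_max / *_mean / *_median / *_min entries) can be recomputed from the per-episode details. A minimal sketch under that assumption; the per_episode dict copies two metrics from the details above and is illustrative, not part of the challenge tooling:

from statistics import mean, median

# Two metrics copied from the per-episode details of job 28905; the full
# payload has the same structure for every metric.
per_episode = {
    "ETHZ_autolab_technical_track-0-0": {"survival_time": 7.399999999999982, "deviation-center-line": 0.2867268275114114},
    "ETHZ_autolab_technical_track-1-0": {"survival_time": 9.900000000000006, "deviation-center-line": 0.6265022816733171},
    "ETHZ_autolab_technical_track-2-0": {"survival_time": 6.5999999999999845, "deviation-center-line": 0.5714222556891025},
    "ETHZ_autolab_technical_track-3-0": {"survival_time": 9.6, "deviation-center-line": 0.3336391950244951},
    "ETHZ_autolab_technical_track-4-0": {"survival_time": 14.950000000000076, "deviation-center-line": 0.4819230030931238},
}

for metric in ("survival_time", "deviation-center-line"):
    values = [episode[metric] for episode in per_episode.values()]
    print(f"{metric}_max: {max(values)}")        # e.g. survival_time_max: 14.950000000000076
    print(f"{metric}_mean: {mean(values)}")      # e.g. survival_time_mean: ~9.69
    print(f"{metric}_median: {median(values)}")  # e.g. deviation-center-line_median: 0.4819230030931238
    print(f"{metric}_min: {min(values)}")        # e.g. survival_time_min: 6.5999999999999845

The printed values match the corresponding aggregate rows reported for job 28905.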
Job ID: 28904
Step: step1-simulation
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:06:24
Message: