
Submission 4888

Submission: 4888
Competing: yes
Challenge: aido3-LF-sim-validation
User: Rey Wiyatno 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 26910
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 26910

Episodes:
ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
26910 | step1-simulation | success | yes | | | 0:13:26 |
driven_lanedir_consec_median: 2.513339029186876
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.7812956086028069
in-drivable-lane_median: 0.7999999999999972


Other stats
agent_compute-ego_max: 0.06007460037867228
agent_compute-ego_mean: 0.03724594116210937
agent_compute-ego_median: 0.03205663522084554
agent_compute-ego_min: 0.03011705001195272
deviation-center-line_max: 0.9290697172694876
deviation-center-line_mean: 0.7933072096270919
deviation-center-line_min: 0.6377733677986812
deviation-heading_max: 4.347401610309438
deviation-heading_mean: 3.5543958853518887
deviation-heading_median: 3.253717411385601
deviation-heading_min: 2.740393962254751
driven_any_max: 4.28164128590303
driven_any_mean: 3.5167852363606555
driven_any_median: 3.458946209491299
driven_any_min: 2.8727290470717297
driven_lanedir_consec_max: 4.189256518582445
driven_lanedir_consec_mean: 2.674539745796827
driven_lanedir_consec_min: 1.708402763901201
driven_lanedir_max: 4.189256518582445
driven_lanedir_mean: 3.272362807594868
driven_lanedir_median: 3.2608540537029524
driven_lanedir_min: 2.513339029186876
in-drivable-lane_max: 2.5500000000000265
in-drivable-lane_mean: 0.9800000000000044
in-drivable-lane_min: 0
per-episode details:
{
"ETHZ_autolab_technical_track-0-0": {"driven_any": 2.8727290470717297, "sim_physics": 0.12029876311620076, "survival_time": 14.950000000000076, "driven_lanedir": 2.513339029186876, "sim_render-ego": 0.015173566341400147, "in-drivable-lane": 2.5500000000000265, "agent_compute-ego": 0.03336680889129639, "deviation-heading": 3.0931429550781973, "set_robot_commands": 0.010930140813191732, "deviation-center-line": 0.6377733677986812, "driven_lanedir_consec": 2.513339029186876, "sim_compute_sim_state": 0.006631155014038086, "sim_compute_performance-ego": 0.009430710474650066, "sim_compute_robot_state-ego": 0.010885738531748452},
"ETHZ_autolab_technical_track-1-0": {"driven_any": 3.458946209491299, "sim_physics": 0.11159509261449178, "survival_time": 14.950000000000076, "driven_lanedir": 3.2608540537029524, "sim_render-ego": 0.015611076354980468, "in-drivable-lane": 0.20000000000000284, "agent_compute-ego": 0.03011705001195272, "deviation-heading": 4.347401610309438, "set_robot_commands": 0.01016294240951538, "deviation-center-line": 0.902864997387826, "driven_lanedir_consec": 1.8543924054007024, "sim_compute_sim_state": 0.006595018704732259, "sim_compute_performance-ego": 0.009263989130655924, "sim_compute_robot_state-ego": 0.010777510801951091},
"ETHZ_autolab_technical_track-2-0": {"driven_any": 3.4190947800453144, "sim_physics": 0.11481253226598104, "survival_time": 14.950000000000076, "driven_lanedir": 3.1073080119129126, "sim_render-ego": 0.015128412246704102, "in-drivable-lane": 1.3499999999999952, "agent_compute-ego": 0.030614611307779947, "deviation-heading": 3.253717411385601, "set_robot_commands": 0.010315110683441162, "deviation-center-line": 0.9290697172694876, "driven_lanedir_consec": 3.1073080119129126, "sim_compute_sim_state": 0.006591357390085856, "sim_compute_performance-ego": 0.009509894053141276, "sim_compute_robot_state-ego": 0.01107755422592163},
"ETHZ_autolab_technical_track-3-0": {"driven_any": 3.5515148592919044, "sim_physics": 0.11925475358963011, "survival_time": 14.950000000000076, "driven_lanedir": 3.2910564245891516, "sim_render-ego": 0.015796648661295574, "in-drivable-lane": 0.7999999999999972, "agent_compute-ego": 0.03205663522084554, "deviation-heading": 4.337323487731456, "set_robot_commands": 0.010733027458190918, "deviation-center-line": 0.7812956086028069, "driven_lanedir_consec": 1.708402763901201, "sim_compute_sim_state": 0.006738104820251465, "sim_compute_performance-ego": 0.009503495693206788, "sim_compute_robot_state-ego": 0.011195691426595052},
"ETHZ_autolab_technical_track-4-0": {"driven_any": 4.28164128590303, "sim_physics": 0.1179134448369344, "survival_time": 14.950000000000076, "driven_lanedir": 4.189256518582445, "sim_render-ego": 0.01582737366358439, "in-drivable-lane": 0, "agent_compute-ego": 0.06007460037867228, "deviation-heading": 2.740393962254751, "set_robot_commands": 0.010338190396626793, "deviation-center-line": 0.7155323570766576, "driven_lanedir_consec": 4.189256518582445, "sim_compute_sim_state": 0.006510356267293294, "sim_compute_performance-ego": 0.009275213082631429, "sim_compute_robot_state-ego": 0.011027441024780274}
}
set_robot_commands_max: 0.010930140813191732
set_robot_commands_mean: 0.010495882352193196
set_robot_commands_median: 0.010338190396626793
set_robot_commands_min: 0.01016294240951538
sim_compute_performance-ego_max: 0.009509894053141276
sim_compute_performance-ego_mean: 0.009396660486857096
sim_compute_performance-ego_median: 0.009430710474650066
sim_compute_performance-ego_min: 0.009263989130655924
sim_compute_robot_state-ego_max: 0.011195691426595052
sim_compute_robot_state-ego_mean: 0.0109927872021993
sim_compute_robot_state-ego_median: 0.011027441024780274
sim_compute_robot_state-ego_min: 0.010777510801951091
sim_compute_sim_state_max: 0.006738104820251465
sim_compute_sim_state_mean: 0.006613198439280192
sim_compute_sim_state_median: 0.006595018704732259
sim_compute_sim_state_min: 0.006510356267293294
sim_physics_max: 0.12029876311620076
sim_physics_mean: 0.11677491728464762
sim_physics_median: 0.1179134448369344
sim_physics_min: 0.11159509261449178
sim_render-ego_max: 0.01582737366358439
sim_render-ego_mean: 0.015507415453592934
sim_render-ego_median: 0.015611076354980468
sim_render-ego_min: 0.015128412246704102
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
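The rows with _min, _median, _mean, and _max suffixes are per-metric aggregates over the five episodes in the per-episode details above. A minimal sketch of how such aggregates could be recomputed follows, assuming the per-episode JSON has been saved to a file named per_episode_details.json (a hypothetical filename, not one of the submission's artefacts):

import json
from statistics import mean, median

# Load the per-episode details (the JSON object shown above); the filename
# is an assumption, not part of the evaluation output.
with open("per_episode_details.json") as f:
    episodes = json.load(f)

# Collect each metric's values across the episodes.
values_by_metric = {}
for episode_name, metrics in episodes.items():
    for metric_name, value in metrics.items():
        values_by_metric.setdefault(metric_name, []).append(value)

# Recompute the aggregate statistics reported on this page.
for metric_name, values in sorted(values_by_metric.items()):
    print(f"{metric_name}_min: {min(values)}")
    print(f"{metric_name}_median: {median(values)}")
    print(f"{metric_name}_mean: {mean(values)}")
    print(f"{metric_name}_max: {max(values)}")

For example, the five driven_lanedir_consec values in the per-episode details have median 2.513339029186876, matching the driven_lanedir_consec_median row; the mean row works out the same way, up to floating-point rounding.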
26909 | step1-simulation | success | yes | | | 0:14:26 |