
Submission 6625

Submission: 6625
Competing: yes
Challenge: aido3-off-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 32948
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 32948

Click the images to see detailed statistics about each episode.

Episodes:
ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for previous versions of challenges.

Job ID | Step             | Status  | Up to date | Date started | Date completed | Duration | Message
32948  | step1-simulation | success | yes        |              |                | 0:07:45  |

driven_lanedir_consec_median: 2.224246417339951
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.7469834910220896
in-drivable-lane_median: 0
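
These headline values are the medians of the corresponding per-episode metrics over the five episodes listed above; for example, the five survival times in the per-episodes details below are 4.6, 11.7, 14.95, 14.95 and 14.95 s, whose median is the 14.95 s reported here. In the AI-DO lane-following evaluation these metrics measure, roughly: driven_lanedir_consec, the consecutive distance driven in the correct lane direction; survival_time, how long the agent drove before the episode ended; deviation-center-line, the accumulated lateral deviation from the lane center line; and in-drivable-lane, the time spent outside the drivable lane (zero in all five episodes here).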


Other stats

agent_compute-ego_max: 0.06487900654474894
agent_compute-ego_mean: 0.058227216974149974
agent_compute-ego_median: 0.06310380697250366
agent_compute-ego_min: 0.03699840929197228
deviation-center-line_max: 1.1505722613134484
deviation-center-line_mean: 0.7308195951725678
deviation-center-line_min: 0.2136061277314395
deviation-heading_max: 3.969105465543017
deviation-heading_mean: 1.8776767185653649
deviation-heading_median: 1.318714180961699
deviation-heading_min: 0.8589881107873744
driven_any_max: 2.3491425657201495
driven_any_mean: 1.9104257492499652
driven_any_median: 2.3490669905464303
driven_any_min: 0.6779877273904572
driven_lanedir_consec_max: 2.3353637711145008
driven_lanedir_consec_mean: 1.8659784035387736
driven_lanedir_consec_min: 0.6540766610263272
driven_lanedir_max: 2.3353637711145008
driven_lanedir_mean: 1.8659784035387736
driven_lanedir_median: 2.224246417339951
driven_lanedir_min: 0.6540766610263272
in-drivable-lane_max: 0
in-drivable-lane_mean: 0
in-drivable-lane_min: 0
per-episodes details:
{"ETHZ_autolab_technical_track-0-0": {"driven_any": 0.6779877273904572, "sim_physics": 0.06444310623666515, "survival_time": 4.599999999999992, "driven_lanedir": 0.6540766610263272, "sim_render-ego": 0.012674917345461638, "in-drivable-lane": 0, "agent_compute-ego": 0.03699840929197228, "deviation-heading": 0.8589881107873744, "set_robot_commands": 0.008245284142701523, "deviation-center-line": 0.2136061277314395, "driven_lanedir_consec": 0.6540766610263272, "sim_compute_sim_state": 0.005780401437178902, "sim_compute_performance-ego": 0.007602546526038129, "sim_compute_robot_state-ego": 0.009697592776754627},
 "ETHZ_autolab_technical_track-1-0": {"driven_any": 1.8267947353876832, "sim_physics": 0.06269454956054688, "survival_time": 11.700000000000031, "driven_lanedir": 1.8020560123240608, "sim_render-ego": 0.012080822235498672, "in-drivable-lane": 0, "agent_compute-ego": 0.0636453873071915, "deviation-heading": 1.297129806953829, "set_robot_commands": 0.008292160482488127, "deviation-center-line": 0.7469834910220896, "driven_lanedir_consec": 1.8020560123240608, "sim_compute_sim_state": 0.005916631119882959, "sim_compute_performance-ego": 0.007060774371155307, "sim_compute_robot_state-ego": 0.00937095462766468},
 "ETHZ_autolab_technical_track-2-0": {"driven_any": 2.3490669905464303, "sim_physics": 0.06328815778096517, "survival_time": 14.950000000000076, "driven_lanedir": 2.224246417339951, "sim_render-ego": 0.012074695428212484, "in-drivable-lane": 0, "agent_compute-ego": 0.0625094747543335, "deviation-heading": 3.969105465543017, "set_robot_commands": 0.008210238615671793, "deviation-center-line": 1.1505722613134484, "driven_lanedir_consec": 2.224246417339951, "sim_compute_sim_state": 0.006067891915639241, "sim_compute_performance-ego": 0.007112514972686767, "sim_compute_robot_state-ego": 0.009205777645111084},
 "ETHZ_autolab_technical_track-3-0": {"driven_any": 2.3491367272051056, "sim_physics": 0.06322927474975586, "survival_time": 14.950000000000076, "driven_lanedir": 2.3353637711145008, "sim_render-ego": 0.012003229459126793, "in-drivable-lane": 0, "agent_compute-ego": 0.06310380697250366, "deviation-heading": 1.318714180961699, "set_robot_commands": 0.008290499846140543, "deviation-center-line": 0.934880512430046, "driven_lanedir_consec": 2.3353637711145008, "sim_compute_sim_state": 0.006036368211110433, "sim_compute_performance-ego": 0.007179370721181234, "sim_compute_robot_state-ego": 0.009286406040191653},
 "ETHZ_autolab_technical_track-4-0": {"driven_any": 2.3491425657201495, "sim_physics": 0.06329809506734212, "survival_time": 14.950000000000076, "driven_lanedir": 2.314149155889026, "sim_render-ego": 0.012394964694976808, "in-drivable-lane": 0, "agent_compute-ego": 0.06487900654474894, "deviation-heading": 1.944446028580904, "set_robot_commands": 0.008315752347310384, "deviation-center-line": 0.6080555833658158, "driven_lanedir_consec": 2.314149155889026, "sim_compute_sim_state": 0.006094113985697428, "sim_compute_performance-ego": 0.007201385498046875, "sim_compute_robot_state-ego": 0.009350976943969726}}
set_robot_commands_max: 0.008315752347310384
set_robot_commands_mean: 0.008270787086862474
set_robot_commands_median: 0.008290499846140543
set_robot_commands_min: 0.008210238615671793
sim_compute_performance-ego_max: 0.007602546526038129
sim_compute_performance-ego_mean: 0.007231318417821662
sim_compute_performance-ego_median: 0.007179370721181234
sim_compute_performance-ego_min: 0.007060774371155307
sim_compute_robot_state-ego_max: 0.009697592776754627
sim_compute_robot_state-ego_mean: 0.009382341606738351
sim_compute_robot_state-ego_median: 0.009350976943969726
sim_compute_robot_state-ego_min: 0.009205777645111084
sim_compute_sim_state_max: 0.006094113985697428
sim_compute_sim_state_mean: 0.005979081333901793
sim_compute_sim_state_median: 0.006036368211110433
sim_compute_sim_state_min: 0.005780401437178902
sim_physics_max: 0.06444310623666515
sim_physics_mean: 0.06339063667905503
sim_physics_median: 0.06328815778096517
sim_physics_min: 0.06269454956054688
sim_render-ego_max: 0.012674917345461638
sim_render-ego_mean: 0.01224572583265528
sim_render-ego_median: 0.012080822235498672
sim_render-ego_min: 0.012003229459126793
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 12.23000000000005
survival_time_min: 4.599999999999992
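
The _min/_mean/_median/_max rows above are per-metric aggregates over the five episodes. A minimal sketch of how they can be recomputed from the per-episodes details block, assuming it has been saved to a file named per_episodes.json (a hypothetical filename), could look like this:

    import json
    import statistics

    # Load the per-episodes details shown above (assumed to have been saved
    # verbatim as per_episodes.json; the filename is hypothetical).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Gather every metric's values across the five episodes.
    values_per_metric = {}
    for episode_name, metrics in episodes.items():
        for metric, value in metrics.items():
            values_per_metric.setdefault(metric, []).append(value)

    # Recompute the min / mean / median / max aggregates reported above.
    for metric, values in sorted(values_per_metric.items()):
        print(f"{metric}_min:    {min(values)}")
        print(f"{metric}_mean:   {statistics.mean(values)}")
        print(f"{metric}_median: {statistics.median(values)}")
        print(f"{metric}_max:    {max(values)}")

Up to floating-point rounding and formatting, the printed values should match the aggregates listed above.
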
Job 32947: step1-simulation, success, up to date: yes, duration: 0:07:42.
Artefacts hidden.