Duckietown Challenges

Submission 5689

Submission: 5689
Competing: yes
Challenge: aido3-LF-sim-validation
User: c reddy 🇮🇳
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 29258
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 29258

Episodes evaluated (the original page linked detailed per-episode statistics from images):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

See previous jobs for earlier versions of this challenge.
Job ID | step             | status  | up to date | date started | date completed | duration | message
29258  | step1-simulation | success | yes        |              |                | 0:03:45  |
driven_lanedir_consec_median: 1.1340177321439595
survival_time_median: 4.899999999999991
deviation-center-line_median: 0.26159104171307546
in-drivable-lane_median: 0.5499999999999989


Other stats:

agent_compute-ego_max: 0.016739193190876234
agent_compute-ego_mean: 0.015828048085320606
agent_compute-ego_median: 0.015809304492418155
agent_compute-ego_min: 0.015245812279837472
deviation-center-line_max: 0.5582805719007031
deviation-center-line_mean: 0.31021392392258845
deviation-center-line_min: 0.18919692921063355
deviation-heading_max: 2.6365124597811884
deviation-heading_mean: 1.5288263048598676
deviation-heading_median: 1.491294588143928
deviation-heading_min: 0.9903252867405172
driven_any_max: 2.9016693406667775
driven_any_mean: 1.738455252306225
driven_any_median: 1.5876911487792216
driven_any_min: 1.0856778413651
driven_lanedir_consec_max: 2.48913092446663
driven_lanedir_consec_mean: 1.4387585400083511
driven_lanedir_consec_min: 0.8120044492001306
driven_lanedir_max: 2.48913092446663
driven_lanedir_mean: 1.4387585400083511
driven_lanedir_median: 1.1340177321439595
driven_lanedir_min: 0.8120044492001306
in-drivable-lane_max: 0.9999999999999964
in-drivable-lane_mean: 0.5599999999999982
in-drivable-lane_min: 0
per-episodes details:
{
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 1.9218277088475817, "sim_physics": 0.059972261771177635, "survival_time": 5.849999999999987, "driven_lanedir": 1.7455260470656524, "sim_render-ego": 0.007050410295144105, "in-drivable-lane": 0.3999999999999986, "agent_compute-ego": 0.016739193190876234, "deviation-heading": 1.5074785904904977, "set_robot_commands": 0.007418811830699953, "deviation-center-line": 0.2301953747235941, "driven_lanedir_consec": 1.7455260470656524, "sim_compute_sim_state": 0.0036714749458508617, "sim_compute_performance-ego": 0.004174426070645324, "sim_compute_robot_state-ego": 0.005206949690468291},
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 1.5876911487792216, "sim_physics": 0.057411955327403785, "survival_time": 4.899999999999991, "driven_lanedir": 1.1340177321439595, "sim_render-ego": 0.007156554533510792, "in-drivable-lane": 0.9999999999999964, "agent_compute-ego": 0.015245812279837472, "deviation-heading": 1.491294588143928, "set_robot_commands": 0.006626574360594458, "deviation-center-line": 0.3118057020649359, "driven_lanedir_consec": 1.1340177321439595, "sim_compute_sim_state": 0.0032735454792879068, "sim_compute_performance-ego": 0.003966204974116112, "sim_compute_robot_state-ego": 0.005078101644710619},
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 2.9016693406667775, "sim_physics": 0.05362192281456881, "survival_time": 8.599999999999987, "driven_lanedir": 2.48913092446663, "sim_render-ego": 0.008046420507652814, "in-drivable-lane": 0.5499999999999989, "agent_compute-ego": 0.015809304492418155, "deviation-heading": 2.6365124597811884, "set_robot_commands": 0.007102163725121077, "deviation-center-line": 0.5582805719007031, "driven_lanedir_consec": 2.48913092446663, "sim_compute_sim_state": 0.0035696376201718354, "sim_compute_performance-ego": 0.004014562728793122, "sim_compute_robot_state-ego": 0.005207399989283362},
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 1.1954102218724445, "sim_physics": 0.05850981458832946, "survival_time": 3.949999999999994, "driven_lanedir": 0.8120044492001306, "sim_render-ego": 0.007195345963103862, "in-drivable-lane": 0.849999999999997, "agent_compute-ego": 0.015816489352455623, "deviation-heading": 1.0185205991432067, "set_robot_commands": 0.006904061836532399, "deviation-center-line": 0.18919692921063355, "driven_lanedir_consec": 0.8120044492001306, "sim_compute_sim_state": 0.003582537928714028, "sim_compute_performance-ego": 0.004014075556887856, "sim_compute_robot_state-ego": 0.005206340475927425},
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 1.0856778413651, "sim_physics": 0.05705745292432381, "survival_time": 3.2999999999999963, "driven_lanedir": 1.013113547165385, "sim_render-ego": 0.00716963681307706, "in-drivable-lane": 0, "agent_compute-ego": 0.01552944111101555, "deviation-heading": 0.9903252867405172, "set_robot_commands": 0.0068858392310865, "deviation-center-line": 0.26159104171307546, "driven_lanedir_consec": 1.013113547165385, "sim_compute_sim_state": 0.0034249840360699277, "sim_compute_performance-ego": 0.0039933847658561936, "sim_compute_robot_state-ego": 0.005073879704330907}
}
set_robot_commands_max: 0.007418811830699953
set_robot_commands_mean: 0.006987490196806877
set_robot_commands_median: 0.006904061836532399
set_robot_commands_min: 0.006626574360594458
sim_compute_performance-ego_max: 0.004174426070645324
sim_compute_performance-ego_mean: 0.004032530819259722
sim_compute_performance-ego_median: 0.004014075556887856
sim_compute_performance-ego_min: 0.003966204974116112
sim_compute_robot_state-ego_max: 0.005207399989283362
sim_compute_robot_state-ego_mean: 0.005154534300944121
sim_compute_robot_state-ego_median: 0.005206340475927425
sim_compute_robot_state-ego_min: 0.005073879704330907
sim_compute_sim_state_max: 0.0036714749458508617
sim_compute_sim_state_mean: 0.0035044360020189116
sim_compute_sim_state_median: 0.0035696376201718354
sim_compute_sim_state_min: 0.0032735454792879068
sim_physics_max: 0.059972261771177635
sim_physics_mean: 0.0573146814851607
sim_physics_median: 0.057411955327403785
sim_physics_min: 0.05362192281456881
sim_render-ego_max: 0.008046420507652814
sim_render-ego_mean: 0.007323673622497727
sim_render-ego_median: 0.00716963681307706
sim_render-ego_min: 0.007050410295144105
simulation-passed: 1
survival_time_max: 8.599999999999987
survival_time_mean: 5.319999999999991
survival_time_min: 3.2999999999999963
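The min/mean/median/max rows above are plain per-metric aggregates over the five episodes. As a sanity check, here is a minimal sketch (using Python's standard `statistics` module) recomputing the survival_time summary from the episode values copied verbatim out of the per-episodes details:

```python
from statistics import mean, median

# survival_time for the five ETHZ_autolab_technical_track episodes,
# copied from the per-episodes details reported above.
survival_times = [
    5.849999999999987,   # episode 0-0
    4.899999999999991,   # episode 1-0
    8.599999999999987,   # episode 2-0
    3.949999999999994,   # episode 3-0
    3.2999999999999963,  # episode 4-0
]

summary = {
    "survival_time_max": max(survival_times),
    "survival_time_mean": mean(survival_times),
    "survival_time_median": median(survival_times),
    "survival_time_min": min(survival_times),
}
# These reproduce the survival_time_* rows above
# (up to floating-point rounding in the mean).
print(summary)
```

The same recomputation applies to every other `_max`/`_mean`/`_median`/`_min` row in this section.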
29257  | step1-simulation | success | yes        |              |                | 0:03:36  |