
Submission 5373

Submission: 5373
Competing: yes
Challenge: aido3-LF-sim-validation
User: Charles Cossette 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 28424
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 28424

Episodes (detailed per-episode statistics are available via the episode images on the results page):

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID: 28424 | Step: step1-simulation | Status: success | Up to date: yes | Duration: 0:08:01
Artefacts hidden.
driven_lanedir_consec_median: 2.7224690421658573
survival_time_median: 10.450000000000014
deviation-center-line_median: 0.4727142840031846
in-drivable-lane_median: 2.700000000000001


Other stats

agent_compute-ego_max: 0.033112884362538654
agent_compute-ego_mean: 0.030134943548028537
agent_compute-ego_median: 0.028616124186022527
agent_compute-ego_min: 0.028417716855588165
deviation-center-line_max: 0.7993113967829087
deviation-center-line_mean: 0.5536523484300887
deviation-center-line_min: 0.3795216635032576
deviation-heading_max: 3.351725725830146
deviation-heading_mean: 1.8864598691367824
deviation-heading_median: 1.6109629980917108
deviation-heading_min: 0.907765717028337
driven_any_max: 5.139714682837282
driven_any_mean: 3.808350119646154
driven_any_median: 3.6860340864503898
driven_any_min: 3.0389181025731196
driven_lanedir_consec_max: 2.936395627650224
driven_lanedir_consec_mean: 2.762041826295669
driven_lanedir_consec_min: 2.522735375190785
driven_lanedir_max: 2.936395627650224
driven_lanedir_mean: 2.762041826295669
driven_lanedir_median: 2.7224690421658573
driven_lanedir_min: 2.522735375190785
in-drivable-lane_max: 5.450000000000017
in-drivable-lane_mean: 2.7799999999999994
in-drivable-lane_min: 1.099999999999996
per-episode details:
ETHZ_autolab_technical_track-0-0: {"driven_any": 3.8959059996305494, "sim_physics": 0.07945833365122477, "survival_time": 14.950000000000076, "driven_lanedir": 2.936395627650224, "sim_render-ego": 0.013953158855438233, "in-drivable-lane": 2.700000000000001, "agent_compute-ego": 0.033112884362538654, "deviation-heading": 3.351725725830146, "set_robot_commands": 0.011210331122080483, "deviation-center-line": 0.7993113967829087, "driven_lanedir_consec": 2.936395627650224, "sim_compute_sim_state": 0.005800000826517741, "sim_compute_performance-ego": 0.009126702149709063, "sim_compute_robot_state-ego": 0.010386736392974851}
ETHZ_autolab_technical_track-1-0: {"driven_any": 5.139714682837282, "sim_physics": 0.08042189359664917, "survival_time": 14.950000000000076, "driven_lanedir": 2.935050391038422, "sim_render-ego": 0.014964900016784667, "in-drivable-lane": 5.450000000000017, "agent_compute-ego": 0.03198400100072225, "deviation-heading": 2.015227391440832, "set_robot_commands": 0.011247084935506186, "deviation-center-line": 0.7370277696257234, "driven_lanedir_consec": 2.935050391038422, "sim_compute_sim_state": 0.006070239543914795, "sim_compute_performance-ego": 0.009058112303415937, "sim_compute_robot_state-ego": 0.010925958156585694}
ETHZ_autolab_technical_track-2-0: {"driven_any": 3.6860340864503898, "sim_physics": 0.07277149446843344, "survival_time": 10.450000000000014, "driven_lanedir": 2.7224690421658573, "sim_render-ego": 0.01348139794819663, "in-drivable-lane": 3.2999999999999887, "agent_compute-ego": 0.02854399133527108, "deviation-heading": 1.5466175132928872, "set_robot_commands": 0.009900623531432812, "deviation-center-line": 0.3795216635032576, "driven_lanedir_consec": 2.7224690421658573, "sim_compute_sim_state": 0.005800824416311164, "sim_compute_performance-ego": 0.008209067668641013, "sim_compute_robot_state-ego": 0.009647444674843237}
ETHZ_autolab_technical_track-3-0: {"driven_any": 3.2811777267394304, "sim_physics": 0.07417642681495003, "survival_time": 9.199999999999996, "driven_lanedir": 2.6935586954330573, "sim_render-ego": 0.013579709374386332, "in-drivable-lane": 1.3499999999999952, "agent_compute-ego": 0.028417716855588165, "deviation-heading": 1.6109629980917108, "set_robot_commands": 0.010131076626155687, "deviation-center-line": 0.4727142840031846, "driven_lanedir_consec": 2.6935586954330573, "sim_compute_sim_state": 0.005651380704796832, "sim_compute_performance-ego": 0.008053455663763958, "sim_compute_robot_state-ego": 0.009767748739408413}
ETHZ_autolab_technical_track-4-0: {"driven_any": 3.0389181025731196, "sim_physics": 0.0676227174956223, "survival_time": 7.249999999999982, "driven_lanedir": 2.522735375190785, "sim_render-ego": 0.013826348863798996, "in-drivable-lane": 1.099999999999996, "agent_compute-ego": 0.028616124186022527, "deviation-heading": 0.907765717028337, "set_robot_commands": 0.00974612729302768, "deviation-center-line": 0.3796866282353696, "driven_lanedir_consec": 2.522735375190785, "sim_compute_sim_state": 0.005142914015671303, "sim_compute_performance-ego": 0.007845373811392949, "sim_compute_robot_state-ego": 0.00963985015606058}
set_robot_commands_max: 0.011247084935506186
set_robot_commands_mean: 0.010447048701640567
set_robot_commands_median: 0.010131076626155687
set_robot_commands_min: 0.00974612729302768
sim_compute_performance-ego_max: 0.009126702149709063
sim_compute_performance-ego_mean: 0.008458542319384584
sim_compute_performance-ego_median: 0.008209067668641013
sim_compute_performance-ego_min: 0.007845373811392949
sim_compute_robot_state-ego_max: 0.010925958156585694
sim_compute_robot_state-ego_mean: 0.010073547623974556
sim_compute_robot_state-ego_median: 0.009767748739408413
sim_compute_robot_state-ego_min: 0.00963985015606058
sim_compute_sim_state_max: 0.006070239543914795
sim_compute_sim_state_mean: 0.005693071901442367
sim_compute_sim_state_median: 0.005800000826517741
sim_compute_sim_state_min: 0.005142914015671303
sim_physics_max: 0.08042189359664917
sim_physics_mean: 0.07489017320537594
sim_physics_median: 0.07417642681495003
sim_physics_min: 0.0676227174956223
sim_render-ego_max: 0.014964900016784667
sim_render-ego_mean: 0.01396110301172097
sim_render-ego_median: 0.013826348863798996
sim_render-ego_min: 0.01348139794819663
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 11.360000000000028
survival_time_min: 7.249999999999982
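
The aggregate entries above (the *_max, *_mean, *_median and *_min values) are per-metric summaries over the five episodes listed under per-episode details. The sketch below is a rough way to recompute them, not the official evaluator; it assumes the per-episode dictionary has been saved to a hypothetical file episodes.json.

    import json
    import statistics

    # Load the per-episode dictionary shown under "per-episode details".
    # The file name "episodes.json" is an assumption for this sketch.
    with open("episodes.json") as f:
        episodes = json.load(f)  # {episode_name: {metric_name: value, ...}}

    # Collect each metric's values across the five episodes.
    per_metric = {}
    for episode_stats in episodes.values():
        for name, value in episode_stats.items():
            per_metric.setdefault(name, []).append(value)

    # Aggregate per metric; e.g. survival_time gives median 10.45 and mean 11.36,
    # matching survival_time_median and survival_time_mean above.
    for name, values in sorted(per_metric.items()):
        print(f"{name}_max {max(values)}")
        print(f"{name}_mean {statistics.mean(values)}")
        print(f"{name}_median {statistics.median(values)}")
        print(f"{name}_min {min(values)}")
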
Job ID: 28421 | Step: step1-simulation | Status: success | Up to date: yes | Duration: 0:05:28
Artefacts hidden.