
Submission 10034

Submission: 10034
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58040
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58040

Episodes (per-episode statistics images omitted):
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID: 58040
Step: LFv-sim
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:16:13
driven_lanedir_consec_median: 2.2890078462803554
survival_time_median: 23.175000000000196
deviation-center-line_median: 0.7694935019795095
in-drivable-lane_median: 9.650000000000135


other stats
agent_compute-ego0_max: 0.01384641896540196
agent_compute-ego0_mean: 0.01319590732708322
agent_compute-ego0_median: 0.013280121413131445
agent_compute-ego0_min: 0.012376967516668036
complete-iteration_max: 0.22814370319247249
complete-iteration_mean: 0.19751572846262344
complete-iteration_median: 0.1981231684411473
complete-iteration_min: 0.16567287377572676
deviation-center-line_max: 1.414404199168249
deviation-center-line_mean: 0.7650326066978363
deviation-center-line_min: 0.10673922366407726
deviation-heading_max: 6.031872421667577
deviation-heading_mean: 2.934508869090093
deviation-heading_median: 2.2991521781925175
deviation-heading_min: 1.1078586983077598
driven_any_max: 8.303247622330368
driven_any_mean: 4.062671241931412
driven_any_median: 3.8184050045471234
driven_any_min: 0.3106273363010348
driven_lanedir_consec_max: 3.10238942225407
driven_lanedir_consec_mean: 1.959539037042335
driven_lanedir_consec_min: 0.1577510333545593
driven_lanedir_max: 3.10238942225407
driven_lanedir_mean: 1.959539037042335
driven_lanedir_median: 2.2890078462803554
driven_lanedir_min: 0.1577510333545593
get_duckie_state_max: 1.6987323760986328e-06
get_duckie_state_mean: 1.568750312747285e-06
get_duckie_state_median: 1.550414638039288e-06
get_duckie_state_min: 1.4754395988119308e-06
get_robot_state_max: 0.0041669295660814445
get_robot_state_mean: 0.00391834626973642
get_robot_state_median: 0.003888410762957157
get_robot_state_min: 0.0037296339869499207
get_state_dump_max: 0.005209911408735879
get_state_dump_mean: 0.004927216850256044
get_state_dump_median: 0.004905485686637363
get_state_dump_min: 0.004687984619013574
get_ui_image_max: 0.03625333488886081
get_ui_image_mean: 0.03228600072898549
get_ui_image_median: 0.03292006777911649
get_ui_image_min: 0.027050532468848185
in-drivable-lane_max: 38.54999999999949
in-drivable-lane_mean: 14.79999999999994
in-drivable-lane_min: 1.349999999999996
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 4.483845189533792, "get_ui_image": 0.029842420710906696, "step_physics": 0.10613050433875476, "survival_time": 26.50000000000024, "driven_lanedir": 2.013048747448605, "get_state_dump": 0.004971107745125694, "get_robot_state": 0.003976267385392988, "sim_render-ego0": 0.004122911873510328, "get_duckie_state": 1.620887138524747e-06, "in-drivable-lane": 15.850000000000223, "deviation-heading": 1.5889881402121444, "agent_compute-ego0": 0.012836366498986656, "complete-iteration": 0.17766369174878252, "set_robot_commands": 0.0023431984493736944, "deviation-center-line": 0.4940421761596698, "driven_lanedir_consec": 2.013048747448605, "sim_compute_sim_state": 0.011116662267911233, "sim_compute_performance-ego0": 0.002228077297605813},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.3106273363010348, "get_ui_image": 0.03599771484732628, "step_physics": 0.15183694660663605, "survival_time": 3.149999999999997, "driven_lanedir": 0.1577510333545593, "get_state_dump": 0.004839863628149033, "get_robot_state": 0.0037296339869499207, "sim_render-ego0": 0.004019558429718018, "get_duckie_state": 1.6987323760986328e-06, "in-drivable-lane": 1.349999999999996, "deviation-heading": 1.1078586983077598, "agent_compute-ego0": 0.01372387632727623, "complete-iteration": 0.22814370319247249, "set_robot_commands": 0.002283763140439987, "deviation-center-line": 0.10673922366407726, "driven_lanedir_consec": 0.1577510333545593, "sim_compute_sim_state": 0.009621977806091309, "sim_compute_performance-ego0": 0.0020012184977531433},
 "LF-norm-techtrack-000-ego0": {"driven_any": 3.152964819560455, "get_ui_image": 0.03625333488886081, "step_physics": 0.13807881597298474, "survival_time": 19.850000000000147, "driven_lanedir": 2.5649669451121055, "get_state_dump": 0.005209911408735879, "get_robot_state": 0.0041669295660814445, "sim_render-ego0": 0.0043825186676715484, "get_duckie_state": 1.4754395988119308e-06, "in-drivable-lane": 3.450000000000049, "deviation-heading": 3.0093162161728904, "agent_compute-ego0": 0.01384641896540196, "complete-iteration": 0.2185826451335121, "set_robot_commands": 0.002484038247534977, "deviation-center-line": 1.0449448277993492, "driven_lanedir_consec": 2.5649669451121055, "sim_compute_sim_state": 0.011686657541361289, "sim_compute_performance-ego0": 0.0023731879852524956},
 "LF-norm-small_loop-000-ego0": {"driven_any": 8.303247622330368, "get_ui_image": 0.027050532468848185, "step_physics": 0.10306978086746305, "survival_time": 59.99999999999873, "driven_lanedir": 3.10238942225407, "get_state_dump": 0.004687984619013574, "get_robot_state": 0.0038005541405213266, "sim_render-ego0": 0.003932368050606225, "get_duckie_state": 1.4799421375538288e-06, "in-drivable-lane": 38.54999999999949, "deviation-heading": 6.031872421667577, "agent_compute-ego0": 0.012376967516668036, "complete-iteration": 0.16567287377572676, "set_robot_commands": 0.002281398598498647, "deviation-center-line": 1.414404199168249, "driven_lanedir_consec": 3.10238942225407, "sim_compute_sim_state": 0.006365931103568986, "sim_compute_performance-ego0": 0.0020172961248545525}}
set_robot_commands_max: 0.002484038247534977
set_robot_commands_mean: 0.0023480996089618265
set_robot_commands_median: 0.002313480794906841
set_robot_commands_min: 0.002281398598498647
sim_compute_performance-ego0_max: 0.0023731879852524956
sim_compute_performance-ego0_mean: 0.002154944976366501
sim_compute_performance-ego0_median: 0.0021226867112301828
sim_compute_performance-ego0_min: 0.0020012184977531433
sim_compute_sim_state_max: 0.011686657541361289
sim_compute_sim_state_mean: 0.009697807179733203
sim_compute_sim_state_median: 0.010369320037001271
sim_compute_sim_state_min: 0.006365931103568986
sim_render-ego0_max: 0.0043825186676715484
sim_render-ego0_mean: 0.00411433925537653
sim_render-ego0_median: 0.004071235151614173
sim_render-ego0_min: 0.003932368050606225
simulation-passed: 1
step_physics_max: 0.15183694660663605
step_physics_mean: 0.12477901194645966
step_physics_median: 0.12210466015586977
step_physics_min: 0.10306978086746305
survival_time_max: 59.99999999999873
survival_time_mean: 27.37499999999978
survival_time_min: 3.149999999999997
No reset possible
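The *_min, *_mean, *_median and *_max entries above are per-metric aggregates over the four episodes in the per-episodes details JSON. A minimal sketch of how they can be recomputed with the Python standard library; the details_json literal below is an abbreviated, illustrative copy of that JSON, not part of the evaluation code:

import json
import statistics

# Abbreviated copy of the per-episodes details shown above (one metric per
# episode, for illustration only).
details_json = """
{"LF-norm-loop-000-ego0": {"survival_time": 26.50000000000024},
 "LF-norm-zigzag-000-ego0": {"survival_time": 3.149999999999997},
 "LF-norm-techtrack-000-ego0": {"survival_time": 19.850000000000147},
 "LF-norm-small_loop-000-ego0": {"survival_time": 59.99999999999873}}
"""

episodes = json.loads(details_json)

def aggregate(metric):
    # Collect one metric across all episodes and summarize it the same way
    # the *_min / *_mean / *_median / *_max entries above are laid out.
    values = [ep[metric] for ep in episodes.values()]
    return {
        metric + "_min": min(values),
        metric + "_mean": statistics.mean(values),
        metric + "_median": statistics.median(values),
        metric + "_max": max(values),
    }

# Reproduces survival_time_median = 23.175 and survival_time_mean = 27.375.
print(aggregate("survival_time"))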
Job ID: 58039
Step: LFv-sim
Status: success
Up to date: yes
Date started:
Date completed:
Duration: 0:10:46
No reset possible