T2 = T1 x (D2/D1)^1.06
where:
D1 = the distance you've already run
T1 = the time it took you
D2 = the distance you're about to run
T2 = the predicted time
This means that, as a rule of thumb, your running performance decays by about 6% each time the race distance doubles; that 6% is where the 1.06 exponent comes from.
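For the curious, here's the formula as a minimal Python sketch (this is the prediction formula commonly attributed to Peter Riegel; the function name and the time arithmetic are my own, not from any running library):

```python
def riegel_predict(t1_seconds, d1, d2, exponent=1.06):
    """Predict the time for distance d2 from a known time over d1.

    T2 = T1 * (D2/D1)^exponent, where the default exponent of 1.06
    is the standard ~6%-per-doubling decay. The two distances just
    need to share a unit (km, miles, whatever).
    """
    return t1_seconds * (d2 / d1) ** exponent


# Example: predict a marathon from a 1:21:24 half marathon.
t1 = 1 * 3600 + 21 * 60 + 24  # 1:21:24 as seconds
t2 = riegel_predict(t1, 21.0975, 42.195)
# Prints roughly 2:49:41 with the standard 1.06 exponent.
print(f"Predicted marathon: {int(t2 // 3600)}:{int(t2 % 3600 // 60):02d}:{int(t2 % 60):02d}")
```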
I was intrigued by this on two counts:
1 - Is the distance decay factor different for everyone?
2 - Does the decay factor apply across all distances?
I don't have much good data to work with because relatively few of my races are of the flat road variety. After thinking a bit, I dug out 2 comparative sets of times:
From 2006... I ran the Bath half marathon in 1:21:24 and the London marathon in 2:45:48.
More recently... My most recent Southampton parkrun time is 18:22 and my Eastleigh 10k time was 37:35.
I then found an excellent web resource on the Good Run Guide website. It allows you to play with different decay factors. I bunged the results above in and got the following:
Half-marathon to marathon: decay of 5.2%
5k to 10k: decay of 3.8%
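As a cross-check, the decay exponent can be solved directly from any two race results. Here's a minimal sketch of that calculation (derived from the formula above; the Good Run Guide tool may define its decay percentage differently, so its numbers won't necessarily match):

```python
import math

def personal_exponent(t1_seconds, d1, t2_seconds, d2):
    """Solve T2 = T1 * (D2/D1)^b for b, the personal decay exponent.

    Under the rule-of-thumb reading above, b = 1.06 corresponds to a
    6% decay per doubling, so (b - 1) * 100 gives the decay percentage.
    """
    return math.log(t2_seconds / t1_seconds) / math.log(d2 / d1)


# Hypothetical example: a 20:00 5k and a 42:00 10k.
b = personal_exponent(20 * 60, 5, 42 * 60, 10)
print(f"exponent = {b:.3f}, decay = {(b - 1) * 100:.1f}% per doubling")
```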
Interesting on both counts. I'm not remotely surprised that the marathon performance has a higher dip, because the last 6 miles of a marathon are debilitating like nothing else and only the superhuman can keep going at the pace of the previous 20 miles.
I'm also intrigued by the age factor... do runners generally get a lower decay factor as they get older, reflecting a lower overall speed but a relative improvement over longer distances?