basingwerk wrote:
I’m sure you already know this, but you asked me to explain, so I will. Going slower gives you more time before things turn critical when something does, or nearly does, happen. The extra time comes from the fact that you have less speed to shed before you can come to an emergency stop, and shedding speed takes time. Furthermore, if that extra time does not enable you to avoid crashing, at least you crash at a lower speed, which is in itself a good thing, since the energy involved in the impact rises with the square of your speed. Of course, the same applies to a dog running out, a child running out, brakes failing near a junction, a large pothole coming into view, etc. etc.
The error is that I only count reactions to real-world things that do happen, not the zillions of unconnected hypothetical scenarios that never play out because of changed timelines in a parallel world of optional events.
Having read my previous reply, I decided it needed a bit more detail to put things in perspective, and to illustrate the timescales on which things turn critical, so here goes:
If you're driving along and something unexpected happens ahead of you (say, for the purposes of this example, a tractor pulling out from a junction), the amount of time you have to react is the time it would take, at your current speed, from the first moment you notice the tractor pulling out to the moment you reach the junction. If you don't react at all, this is the amount of time you have before you hit the tractor. Just to clarify a point: if you do brake, it will take longer to reach the junction, by virtue of your now lower average speed, but that extra time only accrues once you start braking - none of it is available to you before you react.
Now, this amount of time you have available to react depends on a myriad of factors - the precise moment the tractor pulls out, what time you set off, how far you've travelled, etc. etc. - most of which are random and completely beyond your control, and it can vary from picoseconds to hours.
But, as we're only dealing with critical situations, let's look at times up to 3 seconds.
First, a few assumptions:
1) Your braking deceleration is 0.9g, or approx. 20mph/s
2) Your reaction time is 1 second.
So now we can work out some figures.
If you are 3 seconds away from the junction at your current speed, you will be 3 - 1 = 2 seconds from the junction when you start braking. Suppose your speed is such that you stop just short of the junction. With constant deceleration your average speed while braking is half your initial speed, so covering that remaining distance takes twice as long: 4 seconds of braking, at the end of which you have just stopped.
4 seconds at 20mph/s = 80mph. So, if you're 3 seconds from the junction, your speed must be under 80mph for you to stop before the junction.
Working down the scale:
2.5 seconds from the junction → under 60mph
2 seconds → under 40mph
1.5 seconds → under 20mph
1.25 seconds → under 10mph
1 second → 0mph (you have no chance to brake at all)
If we now assume that your reaction time is 0.75 seconds, you can either add 10mph to all those speeds, or subtract 0.25 seconds from the times.
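With the sketch above, that quicker reaction time is just a parameter change (again only my illustration):

```python
for T in (3.0, 2.5, 2.0, 1.5, 1.25, 1.0):
    print(f"{T:4} s -> under {max_stopping_speed(T, t_r=0.75):.0f} mph")
```

which prints 90, 70, 50, 30, 20 and 10mph - each entry 2a × 0.25 = 10mph higher, exactly as described.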
Note both how short the time is in which things become critical, and how large a change in threshold speed follows from a change of a fraction of a second in the time available - a time which depends on largely random factors and is almost completely beyond your control.
If you haven't got enough time to brake to a stop, your impact speed depends on just how long you've had your foot on the brake, and, similarly, fractions of a second make a large difference to impact speed.
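To put a number on that, one more sketch in the same vein (again my own illustration, using the constant-deceleration relation v_impact² = v² − 2ad): if the distance remaining when you start braking is less than your full braking distance v²/2a, you reach the junction still moving.

```python
import math

# Speed at which you hit the junction if you can't stop in time.
# v = initial speed (mph), T = time to junction at speed v (s),
# t_r = reaction time (s), a = deceleration (mph/s).
def impact_speed(v, T, t_r=1.0, a=20.0):
    d = v * (T - t_r)            # distance left when braking starts (mph*s)
    if d <= 0:
        return v                 # no time to brake at all: full-speed impact
    v_sq = v * v - 2.0 * a * d   # v^2 - 2*a*d under constant deceleration
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

print(impact_speed(80, 3.0))     # 0.0  - stops just short, as worked out above
print(impact_speed(80, 2.5))     # 40.0 - half a second less, and you hit at 40mph
```

Half a second's difference in the time available turns a clean stop into a 40mph impact.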
Are you getting the picture yet?