Observer wrote:
But empirical tests (including those in the programme) show that these devices do not produce error messages or erroneous readings all the time, so, despite the above, the internal software algorithms must be reasonably effective at trapping and neutralising the error potential described.
The crux of the issue, for me, is how often the device fails to catch an error in real conditions (as opposed to controlled tests) and displays a materially incorrect speed. If, as shown, it can fail to catch some errors, how many more are being missed? Is 99% reliability good enough? 99.9%? 99.99%?
It is very good to see somebody with their thinking cap on! And this is an excellent point. However, maybe the answer is simple, and not what you might think intuitively.
While we've got some brains here, and the thickies are staying quiet, it's time to go up a gear...
So far the assumption has been that these devices produce a spurious speed reading once in a while. Is this a good assumption?
First of all, have a think about what the speed displayed actually means. It is not an instantaneous speed. In fact it's not even a precise speed - there are no decimal points, for a start.
The device uses a whole pile of analogue electronics to measure the round-trip time of a beam of light. The target is moving, so there is no way the same point on the target will be hit by each pulse of the laser - even without slip. For each round trip of the laser a distance is calculated. From a series of such distances an **AVERAGE** speed is calculated, and displayed without the decimal point.
The target could be accelerating or decelerating, and the angle between the beam and the target's direction of travel will be constantly changing. So the displayed speed is only a "nominal" value. It would be impossible to say at what point during the 0.3s sample period that actual speed occurred.
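To make that concrete, here is a minimal sketch of that kind of calculation. The pulse count, the straight-line fit of range against time, and the truncation of the decimals are all my assumptions for illustration, not anything from the manufacturer:

```python
import numpy as np

C = 299_792_458.0      # speed of light, m/s
MPS_TO_MPH = 2.236936  # metres/second to miles/hour

def displayed_speed(round_trip_times_s, pulse_times_s):
    """Fit range against time; return the whole-number mph a unit might show."""
    ranges_m = np.asarray(round_trip_times_s) * C / 2.0    # one-way range per pulse
    slope_mps = np.polyfit(pulse_times_s, ranges_m, 1)[0]  # best-fit average speed
    speed_mph = -slope_mps * MPS_TO_MPH                    # shrinking range = approach
    return int(speed_mph)                                  # int() drops the decimals

rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.3, 40)                   # assume 40 pulses over the 0.3s window
true_range_m = 300.0 - (32.4 / MPS_TO_MPH) * t  # target approaching at a "true" 32.4mph
rtt = 2.0 * (true_range_m + rng.normal(0, 0.05, t.size)) / C  # a little noise per pulse
print(displayed_speed(rtt, t))                  # prints 32 - an average, never 32.4
```

Whatever the real firmware does, the shape of the problem is the same: a noisy average over a short window, with the decimals thrown away.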
So it is clear that the accuracy isn't 100% even before we start worrying about jerks/slip etc.
I reckon that in actual fact the device NEVER reads the speed accurately but always reads an approximation. However, the DEGREE of accuracy varies, depending on a number of factors, including jerking and slipping etc.
This means that often the device is "near enough" and gross, immediately obvious errors are RARE. Note that "RARE" does not equal "NEVER HAPPENS".
One of the factors will obviously be the accuracy of the targeting, and clearly misaligned devices are a disaster waiting to happen. But exactly how accurately can you aim a device? Is there such a thing as a perfectly aligned device? I don't think there is, because there is enough parallax in the red dot aiming mechanism on the LTI 20.20 that if your eye position shifts relative to the red dot, the aiming point at distance varies by a lot more.
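As a rough illustration of why that matters - the sight length, eye shift and range here are made-up numbers, not LTI specifications:

```python
def aim_offset_m(eye_shift_m, sight_len_m, range_m):
    # Small-angle approximation: shifting the eye by eye_shift over a sight of
    # length sight_len tilts the line of aim by roughly eye_shift / sight_len
    # radians, and the aim point moves linearly with range.
    return range_m * (eye_shift_m / sight_len_m)

# A 2mm eye shift over a 150mm sight, aiming at a target 400m away:
print(f"{aim_offset_m(0.002, 0.150, 400):.1f} m")   # ~5.3 m off the aiming point
```

A few millimetres at the eye becomes metres at the target, which is comfortably enough to read the speed of the wrong vehicle.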
There is some more evidence that the device never measures an exact speed. I have seen static objects get speeds of 1 or 2mph. But the key one is that I have seen readings of -0mph as well as 0mph. For this to happen, the device must be detecting some sort of difference. Imagine the decimal point put back in: I would be prepared to bet it would NEVER measure exactly 0.000mph.
This is why, for static targets, the LTI 20.20 sometimes displays 0mph and sometimes displays -0mph.
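Here is one plausible mechanism - purely my guess at the display logic, not anything confirmed by the manufacturer. If the tiny residual speed keeps its sign and the decimals are truncated, the sign of the noise decides which of the two you see:

```python
def display(speed_mph: float) -> str:
    # Guessed display logic: keep the sign, truncate the decimals.
    sign = "-" if speed_mph < 0 else ""
    return f"{sign}{int(abs(speed_mph))}mph"

for measured in (0.37, -0.21, 0.04, -0.48):   # hypothetical near-zero readings
    print(f"{measured:+.2f} -> {display(measured)}")
# +0.37 -> 0mph
# -0.21 -> -0mph
# +0.04 -> 0mph
# -0.48 -> -0mph
```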
Here's how the "expert" from TeleTraffic explains an instance of this, picked up by a Judge:
Quote:
Q: ...does it read 0mph, or minus 0mph?
A: It actually reads minus 0mph.
Q: I see.
A: This is the way it will calculate a zero speed, but it may put a minus or not put a minus on it. Minus zero is actually the same as zero, if you think about it.
Q: Is there any explanation as to why, if it is hitting the same object, it comes back in one reading as simple zero, if I can use that phrase, and in another reading as minus zero?
A: There is no simple explanation. Call it a quirk, if you will, but it's a quirk of the software that will show one way or the other.
Perhaps the software rolls a dice, or tosses a coin.
