Scan speed versus sensitivity
Posted: Mon Mar 22, 2021 10:32 pm
The current scan speed samples each channel / frequency for 30 milliseconds (DMR simplex is sampled for 60 milliseconds, because a DMR simplex signal is only transmitted for 30 milliseconds out of every 60 milliseconds).
However, because of the way the Rx chip works, the sensitivity of FM signal detection is only around 60% of its maximum if the channel / frequency is only held for 30 milliseconds.
Every time the frequency is changed, the FM detector output value from the Rx is cleared, and the value then slowly increases with a normal RC charge curve, only reaching 100% after about 160 milliseconds.
This results in the scanning not always stopping on a channel / frequency even if the signal is relatively strong.
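The charge-curve behaviour described above can be sketched numerically. This is a hypothetical model, not firmware code: it assumes a simple RC charge curve, and the time constant is not a documented chip value but is inferred here from the two figures in the post (about 60% at 30 ms, near 100% at 160 ms).

```python
import math

# Assumed model: sensitivity(t) = 1 - exp(-t / tau), an RC charge curve.
# tau is chosen so that sensitivity(30 ms) == 0.60, matching the post.
TAU_MS = -30.0 / math.log(1.0 - 0.60)  # about 32.7 ms

def sensitivity(hold_ms):
    """Fraction of maximum FM detection sensitivity after holding a
    channel/frequency for hold_ms milliseconds since the last retune."""
    return 1.0 - math.exp(-hold_ms / TAU_MS)

for t in (30, 60, 90, 160):
    print(f"{t:3d} ms -> {sensitivity(t):.0%}")
```

With this assumed time constant, the curve reaches about 84% at 60 ms and about 99% at 160 ms, which is consistent with the statement that sensitivity is essentially at maximum after roughly 160 milliseconds.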
The effective sensitivity is also controlled by the squelch, because the scan only stops when the squelch opens.
Even allowing for adjustments to the squelch to improve the scan sensitivity, always using the 30 millisecond step time is not going to be ideal, as some signals fade or fluctuate in ways that prevent them from being detected within 30 milliseconds.
So I'm currently investigating adding a setting to allow the scan speed to be controlled, with my current test range of 30 to 480 milliseconds in 30 millisecond steps.
In theory the scan step time does not need to be longer than about 160 milliseconds for the Rx to reach maximum sensitivity, but longer settings may still be useful for fluctuating signals.
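As a rough sketch of how such a setting could map to a per-channel hold time, here is a hypothetical helper. The function name, the 1..16 index range, and the DMR-simplex minimum-doubling rule are my assumptions for illustration, not actual firmware code.

```python
# Assumed constants for the proposed setting: 30..480 ms in 30 ms steps.
SCAN_STEP_MS = 30
SCAN_MIN_MS = 30
SCAN_MAX_MS = 480

def scan_hold_ms(setting_index, dmr_simplex=False):
    """Hold time per channel/frequency for a setting index of 1..16.

    DMR simplex is held for at least 60 ms, since the signal is only
    present for 30 ms out of every 60 ms."""
    hold = min(max(setting_index * SCAN_STEP_MS, SCAN_MIN_MS), SCAN_MAX_MS)
    if dmr_simplex:
        hold = max(hold, 2 * SCAN_STEP_MS)
    return hold
```

For example, setting index 1 gives the current 30 ms behaviour (60 ms for DMR simplex), and index 16 gives the 480 ms upper end of the test range.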
Which brings me to a question...
Would it be useful to allow very slow scanning, e.g. more than 480 milliseconds per channel / frequency?
Are there any occasions when holding a channel for as long as 5 or 10 seconds would be beneficial?