Eco Levels

VK3KYY
Posts: 3331
Joined: Sat Nov 16, 2019 3:25 am
Location: Melbourne, Australia
Contact:

Eco Levels

Post by VK3KYY » Sun Jan 17, 2021 12:00 am

I've been doing some research into the current consumption of the official firmware and how it achieves better battery life than the OpenGD77 firmware.

To make the measurement I had to buy a "battery eliminator" for the rather inflated price of $20 (AUD), and then break open the battery housing to remove the switch mode voltage regulator.

I power the transceiver via a linear "bench" power supply which is variable voltage and can supply up to 1A.
I also have a 10A switch mode bench supply, however to reduce supply noise I used the linear supply, even though it can't supply enough current for full power Tx operation, as Rx consumption is the main difference between the firmwares.

Current measurement is via my old Fluke 23 multimeter, which is at least 30 years old, but still works well.
Additionally I compared readings with 2 other multimeters: a cheap Chinese DT-830B and a better quality meter I bought from a local supplier, which is branded as a Digitech QM1323.

The first thing I noticed was that all 3 meters showed different values for the current, with differences of around 5mA, and with the Fluke 23 being considerably different from the other 2 meters.

So, I connected the meters directly across the current regulated output from the power supply, and found that all 3 meters gave identical results when just supplied with a clean DC current from the PSU.

Therefore the discrepancy between the readings was being caused by the non-pure DC nature of the current being demanded by the radio.

To partially overcome this problem, I made a capacitor pack from 4 x 2200uF capacitors in parallel, to make 8800uF.
This reduced the discrepancy between the meters.

Loading the official Radioddity firmware, disabling power saving and disabling the backlight, the current consumption is
68mA when not receiving a signal

The OpenGD77 firmware seems to consume 84mA when receiving under the same conditions, which was a bit of a surprise, as I presumed the OpenGD77 firmware would draw the same current as the official Radioddity firmware when it's not in Power Saving mode.

To try to establish what is causing the difference in current requirements, I made various serious modifications to the OpenGD77 firmware, and although I have not totally confirmed what's taking the extra 16mA, it looks like the LCD display and the Flash memory chip are the reason.

The data sheet for the Flash memory chip shows that it takes 5mA if not put into suspend mode, and the OpenGD77 firmware does not send the suspend command to the Flash chip, or send any commands to the LCD to shut it down.

The OpenGD77 firmware updates the display constantly to update the signal meter, but one optimisation would be to not update the display when the radio enters various stages of power saving.

The Flash chip could also potentially be put into suspend mode, and this would hopefully reduce the current to approximately the same level as the Radioddity firmware, and increase battery endurance by around 20%.


However the main difference between the official Radioddity firmware and the OpenGD77 firmware is when the official firmware enters its Power Saving mode.

In my tests, the Radioddity firmware enters this mode after 10 seconds of inactivity.

Looking at my current meter, it was obvious that the Radioddity firmware was switching things on and off to save power, at a pulse rate of about 2 times per second.

So to do a better analysis of this, I connected a 1 ohm resistor in series with the supply to the radio, and attached my Rigol 2072 (modified to be a 2372) to measure the voltage across the resistor.

I've attached a capture from the screen, and it's clear that the Radioddity firmware is only receiving for 120ms every 540ms.
But even the peak current while the radio is receiving is lower than the 68mA it consumes before it enters this power saving mode.
Official_firmware_power_saving2.png

One possible reason that the current is lower is that the C6000 DMR chip may be in deep sleep mode for the entire time.

I did some other tests and approximately 10mA can be saved by powering down the C6000; however the C6000 controls the master reference oscillator, and if it's powered down, I found the reference oscillator output is also disabled.

I tried transmitting on FM with the C6000 in deep sleep and observed that the radio transmitted 17.5kHz lower than with the C6000 enabled, when the Tx frequency was set to 439.000MHz.

This is approximately 40 parts per million, and the difference in frequency would be smaller on 2m than it is on 70cm.

I don't know if the Radioddity firmware is powering down the C6000, but if it is, the Rx frequency during reception may not be 100% correct, or the Radioddity firmware may simply be compensating by increasing the frequency control on the RF chip by 40 ppm.

(The 40ppm is specific to my radio, but the value can be read from the calibration table, so could be made appropriate to individual radios)

The other mystery is what accounts for the difference in current between the Rx and sleep phases.

The sleep phase seems to consume around 25mA and the active phase takes around 50mA.

Possibly this is the current taken by the AT1846S between its Deep Sleep and normal Rx operation, but I have not had time to determine the currents that the AT1846S takes in Rx and Deep Sleep.

Also, the CPU in the GD77 has various power modes, each of which consumes a different amount of current.

The normal "Run" mode and the High Speed Run mode, both seem to consume the same amount of current, which is around 25 to 30mA.
There is also a Very Low Power Run mode, which consumes just 1.5mA. In this mode the processor runs at a fraction of its normal speed.

I did some tests and surprisingly, I was able to use the GD77 on FM when the CPU was set permanently into Very Low Power Run mode.
However, it took ages to boot up.

I also did some tests to attempt to switch between Run modes, however even after several hours of trying different things, the GD77 rebooted every time I tried to change modes.

It's possible that the Radioddity firmware is using Very Low Power Run mode and leaving the AT1846S receiving, but I think it's more likely that the CPU is always in High Speed Run mode, and that the current savings are mostly caused by turning off the Rx for 75% of the time.

In theory, it should be possible to greatly improve upon the power saving employed by Radioddity, as the CPU could be put into Very Low Power Run mode, and the AT1846S could also be put into Deep Sleep.

And, there is no need to sample for 120ms like the official firmware does.

On FM, and also when receiving from a DMR repeater (which transmits a continuous carrier on both timeslots), it's only necessary to sample for around 25ms to get the RSSI and determine whether there is a signal, and hence the need to wake the radio into full operating mode.

Even for DMR simplex, the sample time only needs to be 60ms to catch either TS1 or TS2.


So, I think in the long term it should be possible to get very good battery endurance.

However.

Getting this far has taken me about a week of effort (probably around 20 hours of work), so it's likely to take another 100 hours or more before I have something which gives better battery endurance.

m1dyp
Posts: 452
Joined: Sat Nov 16, 2019 8:03 am
Location: Hertfordshire, U.K.
Contact:

Re: FYI. Battery power saving

Post by m1dyp » Sun Jan 17, 2021 12:26 am

wow, you have been busy, take a break please, you deserve it. stay safe.
73 de Ken :D

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Sun Jan 17, 2021 1:10 am

m1dyp wrote:
Sun Jan 17, 2021 12:26 am
wow, you have been busy, take a break please, you deserve it. stay safe.
I just thought I'd give a bit of background about what goes on behind the scenes in developing the firmware.

Nothing ever seems to be easy ;-)

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Sun Jan 17, 2021 2:15 am

I just did some more testing, and simply turning off the Rx section in the AT1846S chip, seems to save 25mA, without the need to put the chip completely into its "Deep Sleep" mode.

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Sun Jan 17, 2021 10:50 pm

After struggling for 2 days to understand why I could not put the AT1846S into deep sleep mode and then wake it up again, I did some individual tests to look at the RSSI and "Noise" sample values just after the receiver section in the AT1846S is re-enabled, and I have some unexpected results.

FYI.
The Noise value is used on FM to open the squelch, and when scanning both FM and DMR it's used as the indicator that a signal is present.
I don't know the technical details why, but the RSSI value is not used as an indication that a signal is present. From what I have been told by experts in this subject, the Noise value is a much more reliable indication of an FM signal (or DMR signal, as DMR uses FM modulation).


Anyway...
I had expected that the RSSI and Noise values were an instantaneous measurement of the received signal, but they are not.

I plotted the RSSI and Noise values every 1 millisecond for 175 milliseconds after the receiver section is enabled, and the plot looks like this
no_signal_RSSI_and_noise.png
There is some noise on this plot, as the radio is attached to my external antenna, but the radio is tuned to a frequency with no activity apart from my local ambient QRM.

It's clear that the Noise value is not an instantaneous value, but a long term average, and that the Noise value takes over 150 milliseconds before it stabilises.

The RSSI reacts much faster to changes than the Noise value, but it is not valid until about 25 milliseconds after the receiver is enabled, as it seems to have a glitch in its value immediately after the receiver is enabled.

The next plot is for the same values, except when I was transmitting on FM so that there was a strong signal (S9+)
RSSI_and_Noise_FM_S9.png
And the results are basically the same, except the RSSI value seems to be valid sooner than 25 milliseconds.

This slow response time of the Noise value explains why the official Radioddity firmware seems to be enabling the receiver for approximately 120 milliseconds, as they are probably waiting for the Noise value to stabilise.


It also means that the official Radioddity firmware is probably not using any of the low power modes of the CPU, and is just turning the receiver on and off to save current.
I'm not 100% sure that the official firmware is not using the low power CPU modes (perhaps it does if the radio is left on an empty frequency for a long time), but in my research so far, the current taken by the official firmware appears to be too high to indicate that they are using the low power CPU modes.


These results don't just affect the possibility of power saving; they also throw a spanner in the works for the scan feature in the OpenGD77 firmware, because at the moment the Rx has to be disabled when the frequency is changed, and re-enabled afterwards.

Currently the scan dwell time on each channel / frequency is only 30 milliseconds, but it's clear from this data that 30ms after the frequency has been changed, the Noise value will only have reached around 50% of its final value.

This probably explains why the scan sometimes does not stop on an active channel.

There are many ways in which the scan sensitivity could be improved. For example, using a high RSSI value to hold the scan on a specific frequency longer, to allow the Noise value to stabilise (as suggested to me by Colin G4EML).
However, this is another big body of work, which I don't have time to investigate at the moment.

But I am going to continue to investigate power saving (battery endurance), and it's likely that whatever I learn can be applied to the scanning at a later date. Or if anyone else, e.g. Daniel or Colin, wants to investigate how to improve scanning, then whatever they learn could potentially be used in the power saving feature.

I'm now hopeful that some basic power saving feature should not be too difficult to create.
It's not initially going to be quite as good as in the official firmware, but hopefully in the long term it will be better.

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Mon Jan 18, 2021 7:02 am

Another small update.

After looking at the current saving from putting the Flash chip into Power-Down mode, I double checked the data sheet: the Flash chip automatically goes into its standby mode, and the additional saving from putting the chip into full Power-Down mode is only a few micro amps, which would not extend the battery endurance at all.

This just leaves the display as the possible cause of the difference in current consumption between the OpenGD77 firmware and the official Radioddity firmware.

G4EML
Posts: 206
Joined: Sat Nov 16, 2019 10:01 am

Re: FYI. Battery power saving

Post by G4EML » Mon Jan 18, 2021 11:25 am

Roger,

Don’t forget the display on OpenGD77 is permanently selected with chip select held low all the time. I don’t know if that increases the power consumption but it might. It is pretty easy to modify the code to only set chip select low when sending data to the display.

Colin.

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Mon Jan 18, 2021 8:12 pm

G4EML wrote:
Mon Jan 18, 2021 11:25 am
Roger,

Don’t forget the display on OpenGD77 is permanently selected with chip select held low all the time. I don’t know if that increases the power consumption but it might. It is pretty easy to modify the code to only set chip select low when sending data to the display.

Colin.
I tried toggling the chip select line but it didn't seem to make any difference.

But I agree we should only toggle it on when needed.

I will make a change to the code to do that today.

The data sheet also describes a way to put the display into standby by sending two command bytes, but that didn't seem to work.

But it's possible that the CS line should be toggled after each command.

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Mon Jan 18, 2021 9:18 pm

Colin

I've made the change to the code to only enable the LCD chip select when transferring to the display, and this change will be part of the next beta release.

However, I'm not seeing any reduction in current taken by the radio.

Still 84mA when the radio is idle with the backlight turned off.

Strangely, when the radio needs to display the message "Settings update" when the settings version has changed, the radio seems to be consuming over 100mA.

I get the settings update message quite often, because I do a soft reset on the CPU and it seems to cause some problem with the EEPROM not being accessible. So I don't know if the high current use is just something associated with using the JLink hardware debugger, or something in the radio that happens under these conditions.

I'll check whether perhaps the audio amp and the LED are being left enabled.

VK3KYY

Re: FYI. Battery power saving

Post by VK3KYY » Mon Jan 18, 2021 9:51 pm

Colin,

I added a function to the code for the LCD to enter "Sleep" mode, but it makes the display go completely blank, and the current being consumed by the radio didn't change.

So if there is a low power mode, this is not the one we should be using for normal operation.

In the data sheet, there is a "Command" called "Power Save"
PowerSave.png
But the data sheet doesn't make any sense.

Because all the binary bits have a # in their value, whereas normally the other commands, e.g. Power Control, have values for the binary bits.
PowerContol.png
So I can't see how to enter Power Save, or how to exit from that mode.



We don't have access to the W/R line. I think the display is permanently set to Write.

We only have access to the Chip Select, Reset and the Command / Data line

We normally leave the display in Command mode, so I'll try changing the code so that we normally leave it in Data mode, in case that helps.
