Eco Levels
Posted: Sun Jan 17, 2021 12:00 am
I've been doing some research into the current consumption of the official firmware and how it achieves better battery life than the OpenGD77 firmware.
To make the measurement I had to buy a "battery eliminator" for the rather inflated price of $20 (AUD), and then break open the battery housing to remove the switch mode voltage regulator.
I power the transceiver via a linear "bench" power supply which is variable voltage and can supply up to 1A.
I also have a 10A switch mode bench supply, however to reduce supply noise I used the linear supply, even though it can't supply enough current for full power Tx operation, as Rx consumption is the main difference between the firmwares.
Current measurement is via my old Fluke 23 multimeter, which is at least 30 years old, but still works well.
Additionally I compared readings with 2 other multimeters, a cheap Chinese DT-830B and a better quality meter I bought from a local supplier, which is branded as a Digitech QM1323.
The first thing I noticed was that all 3 meters showed different values for the current; the difference was around 5mA, with the Fluke 23 reading considerably different from the other 2 meters.
So, I connected the meters directly across the current regulated output from the power supply, and found that all 3 meters gave identical results when just supplied with a clean DC current from the PSU.
Therefore the discrepancy between the readings was being caused by the non-pure DC nature of the current being demanded by the radio.
To partially overcome this problem, I made a capacitor pack from 4 x 2200uF capacitors in parallel, to make 8800uF.
This reduced the discrepancy between the meters.
Loading the official Radioddity firmware, with power saving and the backlight both disabled, the current consumption is:
68mA when not receiving a signal.
The OpenGD77 firmware seems to consume 84mA when receiving under the same conditions, which was a bit of a surprise, as I presumed the OpenGD77 firmware would draw the same current as the official Radioddity firmware when it's not in Power Saving mode.
To try to establish what is causing the difference in current requirements, I made various serious modifications to the OpenGD77 firmware, and although I have not totally confirmed what's taking the extra 16mA, it looks like the LCD display and the Flash memory chip are the reason.
The data sheet for the Flash memory chip shows that it takes 5mA if not put into suspend mode, and the OpenGD77 firmware does not send the suspend command to the Flash chip, nor does it send any commands to the LCD to shut it down.
The OpenGD77 firmware updates the display constantly to update the signal meter, but one optimisation would be to not update the display when the radio enters various stages of power saving.
The Flash chip could also potentially be put into suspend mode, and this would hopefully reduce the current to approximately the same level as the Radioddity firmware, increasing battery endurance by around 20%.
However the main difference between the official Radioddity firmware and the OpenGD77 firmware is when the official firmware enters its Power Saving mode.
In my tests, the Radioddity firmware enters this mode after 10 seconds of inactivity.
Looking at my current meter, it was obvious that the Radioddity firmware was switching things on and off to save power, at a pulse rate of about 2 times per second.
So, to do a better analysis of this, I connected a 1 ohm resistor in series with the supply to the radio, and attached my Rigol 2072 (modified to be a 2372) to measure the voltage across the resistor.
I've attached a capture from the screen, and it's clear that the Radioddity firmware is only receiving for 120ms every 540ms.
But even the peak current while the radio is receiving is lower than the 68mA that it consumes before it enters this power saving mode.
One possible reason that the current is lower is that the C6000 DMR chip may be in deep sleep mode for the entire time.
I did some other tests and found that approximately 10mA can be saved by powering down the C6000; however, the C6000 controls the master reference oscillator, and if it's powered down, the reference oscillator output is also disabled.
I tried transmitting on FM with the C6000 in deep sleep and observed that the radio transmitted 17.5kHz lower than with the C6000 enabled, when the Tx frequency was set to 439.000MHz.
This is approximately 40 parts per million, and the difference in frequency would be less on 2m than it is on 70cm.
I don't know if the Radioddity firmware is powering down the C6000, but if it is, the Rx frequency during reception may not be 100% correct, or the Radioddity firmware may simply be compensating by increasing the frequency control on the RF chip by 40ppm.
(The 40ppm is specific to my radio, but the value can be read from the calibration table, so could be made appropriate to individual radios)
The other mystery is what accounts for the difference in current between the Rx and sleep phases.
The sleep phase seems to consume around 25mA and the active phase takes around 50mA.
Possibly this is the difference in current the AT1846S takes between its Deep Sleep and normal Rx operation, but I have not had time to determine the currents that the AT1846S takes in Rx and Deep Sleep.
Also, the CPU in the GD77 has various power modes, each of which consumes a different amount of current.
The normal "Run" mode and the High Speed Run mode both seem to consume the same amount of current, around 25 to 30mA.
There is also a Very Low Power Run mode, which consumes just 1.5mA. In this mode the processor runs at a fraction of its normal speed.
I did some tests and surprisingly, I was able to use the GD77 on FM when the CPU was set permanently into Very Low Power Run mode.
However, it took ages to boot up.
I also did some tests to attempt to switch between Run modes, however even after several hours of trying different things, the GD77 rebooted every time I tried to change modes.
It's possible that the Radioddity firmware is using Very Low Power Run mode and leaving the AT1846S receiving, but I think it's more likely that the CPU is always in High Speed Run mode, and that the current savings are mostly caused by turning off the Rx for 75% of the time.
In theory, it should be possible to greatly improve upon the power saving employed by Radioddity, as the CPU could be put into Very Low Power Run mode, and the AT1846S could also be put into Deep Sleep.
And, there is no need to sample for 120ms like the official firmware does.
On FM, and also when receiving from a DMR repeater (which transmits a constant carrier covering both timeslots), it's only necessary to sample for around 25ms to get the RSSI and determine whether there is a signal, and hence whether to wake the radio into full operating mode.
Even for DMR simplex, the sample time only needs to be 60ms to catch either TS1 or TS2.
So, I think in the long term it should be possible to get very good battery endurance.
However.
Getting this far has taken me about a week of effort (probably around 20 hours of work), so it's likely to take another 100 hours or more before I have something which gives better battery endurance.