RTC Drift?

Has anyone seen the RTC time drift significantly while the device is powered up and running? We read the RTC time to show on the user display. Over the last 3 days, the RTC has lost more than 10 seconds compared against my phone's time.
We are running firmware 2.2.2.1000 on our production devices.

Last week we received complaints from customers regarding the issue.

What device? Did you calibrate the clock on each unit?

I don’t believe that any GHI board has a temperature-controlled crystal. Further, I suspect each module/board drifts differently, and the ambient temperature has a further effect.

From my experience, 10 seconds over three days is much better than I would expect without calibration.
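For scale: 10 seconds over 3 days (259,200 seconds) works out to roughly 10 / 259,200 ≈ 39 ppm. A typical uncalibrated 32.768 kHz watch crystal is specified around ±20 ppm at room temperature, and its temperature curve can add tens of ppm on top of that, so this drift is well within the normal range.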

When I need accurate times, I add a GPS module.

If that amount of drift is normal, does anyone have an example of how to calibrate the RTC?

Can it be calibrated against the system clock to get sufficient results? (I would be happy with +/- 2 seconds a month.)

I do not have any code I can release.

Does the device have internet access? If so, you can periodically update the clocks via NTP or another online time source.
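If it does, a minimal SNTP query sketch using plain .NET-style UDP sockets could look like the following. Treat it as a sketch: whether these exact socket and DNS APIs exist on your firmware, and the server name pool.ntp.org, are assumptions to verify.

    using System;
    using System.Net;
    using System.Net.Sockets;

    static class SntpClient
    {
        // Query an NTP server once and return the reported UTC time.
        public static DateTime GetNetworkTime(string server = "pool.ntp.org")
        {
            // Standard SNTP request: 48 bytes, first byte 0x1B
            // (LI = 0, Version = 3, Mode = 3: client).
            var request = new byte[48];
            request[0] = 0x1B;

            var address = Dns.GetHostEntry(server).AddressList[0];
            var endpoint = new IPEndPoint(address, 123);

            using (var socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp))
            {
                socket.ReceiveTimeout = 3000;
                socket.SendTo(request, endpoint);

                var response = new byte[48];
                EndPoint remote = endpoint;
                socket.ReceiveFrom(response, ref remote);

                // Transmit timestamp: big-endian seconds since 1900-01-01, at offset 40.
                // (The fractional-second field at offset 44 is ignored here.)
                ulong seconds = ((ulong)response[40] << 24) | ((ulong)response[41] << 16)
                              | ((ulong)response[42] << 8) | response[43];

                return new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddSeconds(seconds);
            }
        }
    }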

If not, you need to compare the timing on the device to a known source. The source could be GPS or an Apple Watch.

You initiate the calibration by setting the RTC and system time from the known source. You then wait a specific period of time and compare the RTC and system time to the known source again. From this comparison you obtain the error factor, such as 30 seconds in 24 hours. You can then periodically correct the device clocks using that error factor.
In essence, you need to compare the time on the device to a known source.
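As a sketch of the bookkeeping (class and method names here are made up for illustration, not any real API): capture the device time and the reference time at the start and end of an observation window, derive the drift per elapsed second, and then apply that as a correction between syncs.

    using System;

    class RtcDriftModel
    {
        // Seconds the device clock gains (+) or loses (-) per true elapsed second.
        public double DriftPerSecond { get; private set; }

        // Capture device and reference time at the start and end of the
        // observation window, then derive the drift rate.
        public void Calibrate(DateTime deviceStart, DateTime referenceStart,
                              DateTime deviceEnd, DateTime referenceEnd)
        {
            var elapsedRef = (referenceEnd - referenceStart).TotalSeconds;
            var elapsedDev = (deviceEnd - deviceStart).TotalSeconds;
            DriftPerSecond = (elapsedDev - elapsedRef) / elapsedRef;
        }

        // Given the last time the clock was set and its current reading,
        // return a corrected estimate of the true time.
        public DateTime Correct(DateTime lastSync, DateTime deviceNow)
        {
            var elapsed = (deviceNow - lastSync).TotalSeconds;
            return deviceNow.AddSeconds(-elapsed * DriftPerSecond);
        }
    }

With the 30-seconds-in-24-hours example, DriftPerSecond comes out to 30 / 86,400 ≈ 0.00035, i.e. about 350 ppm.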

Can it be calibrated against the system clock to get sufficient results?
You might find that the error factor of the system clock is different from the RTC. Or not…
I would think you would want to keep both clocks updated.
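In TinyCLR, updating both clocks together would look roughly like the sketch below. This is from memory of the TinyCLR 2 RtcController / SystemTime APIs, so verify the exact names and signatures against the current docs.

    using System;
    using GHIElectronics.TinyCLR.Devices.Rtc;
    using GHIElectronics.TinyCLR.Native;

    static class ClockSync
    {
        // Push a known-good UTC time into both the battery-backed RTC
        // and the system clock that DateTime.UtcNow reads from.
        public static void SetBothClocks(DateTime utcNow)
        {
            var rtc = RtcController.GetDefault();
            rtc.SetTime(RtcDateTime.FromDateTime(utcNow));

            SystemTime.SetTime(utcNow);
        }
    }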

Bottom line… experiment and characterize…

Actually, I can be replaced by entering “algorithm for determining the error factor of microprocessor rtc” into ChatGPT.

To compare the RTC against an external source I need the RTC_OUT signal. Depending on the register configuration it is either a 1 Hz or 512 Hz pulse output. The COE bit in the RTC_CR register enables the output; when COE is set, the RTC_OUT signal should be routed to pin PC13, which is also the tamper (TAMP) pin. I tried using the Marshal class to write to the register, but it appears to allow only reads; my writes never take effect.

        // Requires System.Runtime.InteropServices (Marshal) and System.Diagnostics (Debug).
        // 0x58004000 is the RTC base address on the STM32H7; RTC_CR is at offset 0x08.
        var RTC_CR_Addr = (IntPtr)0x58004000 + 0x08;
        var RTC_CR_Value = Marshal.ReadInt32(RTC_CR_Addr);

        Debug.WriteLine("RTC_CR: " + RTC_CR_Value.ToString("X"));

        // COE is bit 23 of RTC_CR (0x00800000).
        var RTC_CR_NewValue = 0x00800000;
        Marshal.WriteInt32(RTC_CR_Addr, RTC_CR_NewValue);

        // Read back to check whether the write took effect.
        RTC_CR_Value = Marshal.ReadInt32(RTC_CR_Addr);
        Debug.WriteLine("RTC_CR New: " + RTC_CR_Value.ToString("X"));

The output of the above code is:
RTC_CR: 00
RTC_CR New: 00

Am I missing something that would cause it not to write to the register, or is TinyCLR blocking writes to that register?
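One thing worth checking, since it would produce exactly this behavior: on the STM32H7 the RTC registers are write-protected in hardware, and writes to RTC_CR are silently ignored until the key sequence 0xCA then 0x53 is written to RTC_WPR (offset 0x24). Whether TinyCLR additionally blocks Marshal writes to that region I can't say, but assuming the writes do reach the hardware (and that the firmware has already set the backup-domain DBP bit in PWR_CR1), the unlock would look something like this:

        // Offsets per the STM32H7 reference manual: RTC_WPR at 0x24, RTC_CR at 0x08.
        var RTC_WPR_Addr = (IntPtr)0x58004000 + 0x24;
        var RTC_CR_Addr = (IntPtr)0x58004000 + 0x08;

        // Unlock the RTC write protection with the documented key sequence.
        Marshal.WriteInt32(RTC_WPR_Addr, 0xCA);
        Marshal.WriteInt32(RTC_WPR_Addr, 0x53);

        // Set COE (bit 23) without clobbering the other RTC_CR bits.
        var cr = Marshal.ReadInt32(RTC_CR_Addr);
        Marshal.WriteInt32(RTC_CR_Addr, cr | 0x00800000);

        // Re-lock by writing any non-key value.
        Marshal.WriteInt32(RTC_WPR_Addr, 0xFF);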