Current state of clocks

There is a fair amount of information on the forum about how much the various clocks drift, but I’m a little confused about the current state of affairs. I have a Raptor (G400) based system that resets the .NET Micro Framework clock from GPS time about every 4 hours, and the jump is as much as 6 minutes in those 4 hours (see below). It is pretty clear from threads like https://www.ghielectronics.com/community/forum/topic?id=15587&page=4 that this is kind of to be expected for the .NET Micro Framework clock, but the RTC is supposed to be much better.

Here are my questions:
It seems from my reading that if I “enable” the RTC, the .Net Micro clock will show improved performance similar to the RTC. Is that correct? How do I “enable” the RTC?

Thanks

// year..second are parsed from the GPS time message (a u-blox PUBX,04 sentence)
pubx04DateTime = new DateTime(year, month, day, hour, minute, second);
Utility.SetLocalTime(pubx04DateTime);

@ Gene - Documentation for setting and reading the RTC is here [url]https://www.ghielectronics.com/downloads/man/Library_Documentation_v4.3/html/T_GHI_Processor_RealTimeClock.htm[/url]

I do not believe that setting the RTC will have an effect on the accuracy of the NETMF clock. Usually, a battery is used to keep the RTC running when power is removed. When the device is powered up, the RTC is read and the system clock is set. NETMF does not check the RTC automatically.

It would be interesting to test the accuracy of the RTC versus system clock, using GPS as the benchmark. Accuracy may be better. If this turns out to be true you can periodically update the system clock from the RTC in your application.

If the GPS peripheral is always available, I would use it for updating the system clock.

I have found the internal clock on the FEZ Raptor (and thus the G400) to be exceptionally bad (compared to the Spider II anyway), but the real-time clock to be quite accurate. The way I handle it is to have a thread that syncs the internal clock with the real-time clock every 10 seconds, which I found to be the optimum interval before seconds started to be lost when syncing. This is not great if you have a stopwatch for counting a precise period of time, so I have had to resort to calculating using times only from the RTC.


// To set the RTC from the current system time:
RealTimeClock.SetDateTime(DateTime.UtcNow);

// To set the system (local) time from the RTC:
var rtcNow = GHI.Processor.RealTimeClock.GetDateTime();
Microsoft.SPOT.Hardware.Utility.SetLocalTime(rtcNow);
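A minimal sketch of the 10-second sync thread described above might look like the following. This assumes the GHI.Processor.RealTimeClock API from the 4.3 SDK and is untested on hardware; the class and method names here are illustrative, not a library API.

```csharp
using System;
using System.Threading;
using GHI.Processor;
using Microsoft.SPOT.Hardware;

public static class RtcSync
{
    // Periodically copy the (accurate) RTC value into the (drifting)
    // NETMF system clock. Call Start() once at application startup.
    public static void Start()
    {
        new Thread(() =>
        {
            while (true)
            {
                Utility.SetLocalTime(RealTimeClock.GetDateTime());
                Thread.Sleep(10 * 1000); // re-sync every 10 seconds
            }
        }).Start();
    }
}
```

As noted above, a hard re-sync like this causes small jumps in system time, so it is unsuitable for stopwatch-style measurements; read the RTC directly for those.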

One solution is to use a small Peltier TEC unit to slightly cool the crystal and change the clock speed slightly. Calibrate it over a few hours, find the specific temperature that matches the GPS time over that span, and voilà.

Not a very good solution, mind you, but a solution nonetheless

:whistle:

I just ran a test where I checked GPS time against system time and sure enough, system time drifted like a madman. In an attempt to “turn on” the real-time clock as John mentions in the quote below (from the thread I mention in my first post), I set the time on the RTC (using RealTimeClock.SetDateTime, per the documentation John linked above). Sure enough, the RTC stays pretty close to GPS, but the system time still drifts like mad. Maybe @ John can explain how I turn on the RTC so the .NET Micro system time is running off the good crystal.

Here's what John said in the old thread I mentioned in my first post

[quote="John"]
We are still working on the issue, but here is what we have so far:

There are two issues actually: time keeping in DateTime.Now and our RealTimeClock.GetDateTime are both off. As it stands now, RTC is powered by the external crystal and DateTime is powered by the internal crystal until RTC is turned on, at which point it is also powered by the external crystal.

As RobvanSchelven pointed out, the internal crystal has a wide range of variance. We have confirmed this in testing, many different G400's lose varying amounts of time, sometimes drastically. The solution to that problem is to power DateTime from the external crystal at all times. Until that change is made, enabling RTC will set DateTime to use the external crystal.

Once both clocks are on the external crystal, and testing over 10 G400s, DateTime gains a constant 7 seconds every 5 minutes (which we are still working on fixing) and RTC changes are almost negligible: a loss of around 4 seconds per day.

To work around the 7 second gain for DateTime, you can periodically update it by calling Microsoft.SPOT.Hardware.Utility.SetLocalTime with either the RTC or some network time. To work around the RTC loss, you can either add ~4.89 seconds once per day or sync it with network time.
[/quote]
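Both workarounds John describes can be sketched in a few lines. This is a hedged, untested sketch against the 4.3 APIs; the 4.89-second figure is taken from John's post above, and the class name is illustrative.

```csharp
using System;
using GHI.Processor;
using Microsoft.SPOT.Hardware;

public static class ClockWorkarounds
{
    // Workaround for the ~7 s / 5 min DateTime gain: periodically overwrite
    // the fast-running system clock with the RTC value (or a network time).
    public static void SyncSystemClockFromRtc()
    {
        Utility.SetLocalTime(RealTimeClock.GetDateTime());
    }

    // Workaround for the RTC's ~4 s/day loss: once per day, nudge the RTC
    // forward by the ~4.89 s John measured.
    public static void CorrectRtcDailyLoss()
    {
        var corrected = RealTimeClock.GetDateTime().AddMilliseconds(4890);
        RealTimeClock.SetDateTime(corrected);
    }
}
```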

@ Gene - There is no automatic method to drive System.DateTime with the hardware RTC. You’ll have to manually call Microsoft.SPOT.Hardware.Utility.SetLocalTime to update what System.DateTime.Now returns.

1 Like

@ John - OK got it. However, a better solution seems to be to just quit using DateTime.Now and use RealTimeClock.GetDateTime(). Would you agree?

If you had a ‘Clock’ class that had a member that was a delegate that returned a DateTime, then you could change it on the fly to return whatever truth you wanted. Even a static instance to freeze time. I do this in several frameworks.

2 Likes

@ eddie_garmon - Thanks for the suggestion, I wish I’d been smart enough to do that to begin with. But just to make sure, please don’t confuse me with a real programmer. I’m actually an Ocean Engineer that has been dragged kicking and screaming into doing my own programming in self defense. :slight_smile:

2 Likes

@ Gene - You could use RealTimeClock.GetDateTime() in your code, but that doesn’t mean that other code doesn’t use DateTime.Now. So if you let DateTime.Now drift, that code may perform incorrectly.

For example, in our current 4.3.8.1 libraries, we use DateTime.Now for event timestamps and to calculate timeouts, so if you calculate the timeout based on RealTimeClock.GetDateTime() the timeouts will be incorrect.

@ John - Bummer.

@ John - Resetting the system clock on a regular basis is kind of problematic since jumps in time can lead to erratic behavior. I can reset the system clock at selected points in my app where a jump won’t cause a problem but these selected points are hours apart and the drift is significant during that time period.

Some of your posts in the old thread I referred to suggest that GHI was coming up with a way to make the System clock run off the external 32 kHz xtal. Apparently that didn’t happen? Any chance it might happen in the near future?

Thanks for all the help

@ Gene - There has been no change with regard to what runs off which crystals in the current SDK. We will investigate again to see if anything can be improved.

Thanks all for the help.

@ John - Thanks for the update. It seems like a pretty fundamental requirement to have system time running off a clock of the right frequency, so I’m hoping you can do something with regard to getting the system clock on a better crystal. Even if the crystal is a garden-variety part, it will have acceptable drift and jitter for most applications. But if it is the wrong frequency, that is just bad.

@ cyberh0me - Syncing the system clock every few seconds or even every few minutes is a bigger problem for me than syncing the clock every few hours. For example, for things like low-bandwidth control loops (less than 1 update per second, say), it isn’t a slow, constant drift or even a small amount of jitter that hurts. It is a jump in time from one calculation to the next. If I’m calculating the derivative of a signal every 3 seconds and it is actually 3.05 seconds +/- 0.01 seconds, that isn’t a big problem. If it is 3.05 seconds one calculation and 3.5 seconds the next calculation due to a resync, that does create a problem. The smaller problem I’m trying to address is that I time stamp everything that happens in my app to the system clock. When I resync the system clock to GPS, I get a big jump in my time stamps and time actually “runs backwards”. That isn’t a big issue, and using the RTC for my time stamps is a reasonable solution.

What I really need is a wall clock that gives me 36 hours in a day. It is amazing how long it takes to get all the details right, even in a modestly simple application.

1 Like

The drift is pretty constant per my tests. If I understand the facts from the older thread I mentioned, the problem with the system clock is that it is running off an oscillator that has a wide frequency spec. For example, it might be that 32.768 kHz is the right value, but the spec allows the actual frequency to be quite a bit different from that. Whatever it is, it does appear to be a constant offset from the correct value, plus or minus jitter and temperature effects.

This is for the one board I’ve been testing. The previous thread suggests that different boards have different drift rates which is consistent with my theory above.

1 Like

For my time stamping purposes, I can use the RealTimeClock.GetDateTime method, which is pretty good at keeping up with GPS time for hours and even days at a time. For the .NET Micro functions that use system time under the hood, like DateTime.Now, timers and so on, I can live with the system clock running at the “wrong” frequency, which looks like drift when compared to GPS (or wall clock) time but is really just a clock running at a frequency slightly different from the GPS frequency standard.
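A constant frequency offset like this can be estimated from two (GPS time, system time) pairs taken some hours apart. A small sketch (names are illustrative, untested on hardware):

```csharp
using System;

public static class DriftEstimator
{
    // Estimate the system clock's frequency error, in parts per million,
    // from two samples of (GPS time, system time).
    public static double PartsPerMillion(
        DateTime gpsStart, DateTime sysStart,
        DateTime gpsEnd, DateTime sysEnd)
    {
        double gpsTicks = (gpsEnd - gpsStart).Ticks; // true elapsed time
        double sysTicks = (sysEnd - sysStart).Ticks; // what the system clock measured
        return (sysTicks - gpsTicks) / gpsTicks * 1e6;
    }
}
```

For reference, the 7 seconds gained per 5 minutes reported earlier in the thread works out to roughly 23,000 ppm, which is far beyond any crystal's tolerance (typically tens of ppm) and is consistent with the clock effectively running at the wrong frequency rather than an out-of-spec crystal.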

@ Gene - You may be able to do something like the below code. Calculate how much DateTime.UtcNow drifts for a given RTC period, then scale every call to UtcNow by that factor to compensate. The longer you let the Calibrate call run, the more accurate the scaled time should be. If all of your devices drift at the same rate, you can skip the Calibrate call and just hardcode a precalculated factor. I haven’t had a chance to test the below code on an actual device yet.


public static class ScaledClock {
    private static double factor = 1.0;
    private static DateTime baseline = DateTime.UtcNow;

    public static void Calibrate(TimeSpan rtcDuration) {
        var endRtc = RealTimeClock.GetDateTime().Add(rtcDuration);
        var start = DateTime.UtcNow;

        while (RealTimeClock.GetDateTime() < endRtc)
            ;

        var end = DateTime.UtcNow;

        // Cast to double: integer division of Ticks would truncate the ratio to 0 or 1.
        ScaledClock.factor = (double)rtcDuration.Ticks / (end - start).Ticks;
        ScaledClock.baseline = end;
    }

    public static DateTime UtcNow {
        get {
            // Scale only the ticks elapsed since calibration, not the absolute
            // tick count, applying the RTC/system ratio as the correction.
            var elapsed = (DateTime.UtcNow - ScaledClock.baseline).Ticks;

            return ScaledClock.baseline.AddTicks((long)(elapsed * ScaledClock.factor));
        }
    }
}

Rewriting John's example with the Clock front end would look like this…


public delegate DateTime GetClock();

public static class Clock {
    static Clock() {
        //Default clock implementation, make this whatever you want it to be
        Time = () => DateTime.Now;
    }

    public static GetClock Time { get; set; }
}

internal static class ScaledClockProvider
{
    private static double _factor = 1.0;
    private static DateTime _baseline = DateTime.UtcNow;

    public static void Calibrate(TimeSpan rtcDuration) {
        var endRtc = RealTimeClock.GetDateTime().Add(rtcDuration);
        var start = DateTime.UtcNow;
        while (RealTimeClock.GetDateTime() < endRtc) {
            ;
        }
        var end = DateTime.UtcNow;
        // Cast to double: integer division of Ticks would truncate the ratio to 0 or 1.
        _factor = (double)rtcDuration.Ticks / (end - start).Ticks;
        _baseline = end;
    }

    public static DateTime UtcNow
    {
        get {
            // Scale only the ticks elapsed since calibration.
            var elapsed = (DateTime.UtcNow - _baseline).Ticks;
            return _baseline.AddTicks((long)(elapsed * _factor));
        }
    }
}

public class Program
{
    public static void Main() {
        DateTime result;

        // Read the clock using the default DateTime.Now
        result = Clock.Time();

        //Override using real time clock
        Clock.Time = () => RealTimeClock.GetDateTime();
        result = Clock.Time();

        //Override using a scaled offset
        ScaledClockProvider.Calibrate(new TimeSpan(0, 0, 5));
        Clock.Time = () => ScaledClockProvider.UtcNow;

    }
}

1 Like