Interrupt time accuracy

Hello,

I don’t know why, but I wanted to check how accurate the “time” parameter in the interrupt handler could be…

So, I’ve used the Phidget SBC as a source for the Domino. I’ve programmed the Phidget SBC to toggle an output every 100 ms (Java code):

while (true)
{
    ik.setOutputState(1, true);    // drive digital output 1 high
    Thread.sleep(100);
    ik.setOutputState(1, false);   // drive it low again
    Thread.sleep(100);
}

As you can see, nothing really hard here :wink: This code runs standalone inside the SBC.

Then, the SBC output is connected to Di11:

// Di11, glitch filter off, pull-down, interrupt on both rising and falling edges
static InterruptPort ReadCodeuse = new InterruptPort((Cpu.Pin)FEZ_Pin.Interrupt.Di11, false,
    Port.ResistorMode.PullDown, Port.InterruptMode.InterruptEdgeBoth);

The handler then fills an array with the millisecond values:

static void ReadCodeuse_OnInterrupt(uint data1, uint data2, DateTime time)
{
    // "time" is the timestamp the runtime attaches to the edge
    Tab[i] = time.Millisecond;   // keep only the millisecond part (0..999)
    i++;
}

Now, if I check the difference between successive interrupts, I find values from 104 to 106 ms, with some spikes at 100 or 110 ms.
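For what it’s worth, I get the deltas from Tab afterwards with something like this (just a rough sketch; the wrap handling assumes Millisecond simply rolls over at 1000):

static void PrintDeltas()
{
    // Dump the time between successive edges; Tab holds DateTime.Millisecond values (0..999)
    for (int k = 1; k < i; k++)
    {
        int delta = Tab[k] - Tab[k - 1];
        if (delta < 0) delta += 1000;            // handle the 999 -> 0 rollover
        Microsoft.SPOT.Debug.Print(delta.ToString());
    }
}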

Not that I need to be that precise, but we’re talking about milliseconds, which is a long time for an embedded system, isn’t it?

Both programs (on the SBC and on the Domino) are short enough not to add any overhead, so I wonder why I don’t get something around 100 :think:

Any idea ?

ps: that’s not a bug or a criticism, only a question :hand:

I would think that when the interrupt occurs, the processor has to finish up the current instruction (even if it’s a NOP cycle). As far as I know, the interrupt isn’t a non-maskable interrupt.
But then again, I might be totally off here… :slight_smile:

hummm…

Interrupt routines that I’ve written seem (to me at least) to be accurate down to the sub-microsecond level. That being said, I don’t see anything in your code that would cause the problem either.

Have you tried looking at the time.Ticks value instead of the time.Millisecond value? And if so, was the difference also ~100 milliseconds?

I put some interrupt timer code in this thread last night. It uses the Ticks value and seems to be pretty accurate.

[url]http://www.tinyclr.com/forum/6/714/[/url]
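Something along these lines is what I mean (just a sketch, not the exact code from that thread; one tick is 100 ns, so you get much finer resolution than Millisecond):

static long lastTicks;

static void ReadCodeuse_OnInterrupt(uint data1, uint data2, DateTime time)
{
    long now = time.Ticks;                                // 1 tick = 100 ns
    if (lastTicks != 0)
    {
        double deltaMs = (now - lastTicks) / 10000.0;     // 10,000 ticks per millisecond
        Microsoft.SPOT.Debug.Print(deltaMs.ToString());
    }
    lastTicks = now;
}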

[quote]Not that I need to be that precise, but we’re talking about milliseconds, which is a long time for an embedded system, isn’t it?
[/quote]

Not really; this depends on your application and on the operating system. NETMF is not made for precise timing by any means (it is not real time). But for many applications, the timing error is very acceptable.

@ pastasauce : Ticks shows the same thing, except that I have more digits :wink:

I’ve made another test : I replaced the Domino with a Phidget 1018 and used the same logic for it.

void ik_InputChange(object sender, InputChangeEventArgs e)
{
    // Here the timestamp is taken on the PC side, when the event reaches the handler
    Tab[i] = DateTime.Now.Millisecond;
    i++;
}

Results are not as good since I get 94/109/110 with a spike at 125 :frowning:

Although it seems (is) worse, the 1018 relies on USB and the .NET Framework, so I suppose that all this may generate some overhead, hence the larger timings.

This also means that the Domino performs well, here :slight_smile:

I will try another test : the Domino will act as the source and the Phidget SBC as the dest.

Too bad I don’t have a second Domino, so I could exclude “OS” overheads (the SBC uses Java or C), because I’m not sure that the SBC is really sending at 100 ms intervals…

I understand that NETMF is not good for precise timing, but I thought that millisecond accuracy should be achievable on a MHz-class processor, even with .NET stuff running.
As I said, that’s no big deal, but I like to know/understand some things :wink:

Last test, with the Domino as source and the SBC as destination: 84/94/112/125 ms (min/avg low/avg high/max) :frowning:

My conclusion is: there’s some overhead somewhere in the SBC (Java code) or the Phidget 1018 (.NET Framework + USB), and the Domino is just as accurate as it can be given that it runs NETMF.

Not really useful test(s), I admit, but not that useless either :wink: I now know that I can rely on 10ms accuracy.

I don’t have a scope or a function generator to check all this, so I will stop here.

Why not get a cheap 555 timer and measure that? No processor overheads on them :wink:

You’re right. :stuck_out_tongue: I may try that, too. With all my boards, just to compare.

I was only curious to see how accurate the Domino interrupts could be, since the interrupt handler exposes the time. Given my results, I could have been a little disappointed because I was expecting something like a millisecond or even less. But there are too many things that may influence my measurements: NETMF, Java, the SBC, USB, and so on…

Also, I admit that the nanosecond parameters for the servos tickled me :wink:

You only need the Domino for testing. Generate your output pulse stream with a PWM and then connect that output pin to your interrupt input.

Yes and no :wink: I would prefer an external source to avoid biases introduced by NETMF as much as possible.

But I will do it, because it’s not a bad idea. I also think I will get better results.

Since the PWM is generated in hardware, it is very accurate and NETMF does not influence it in one way or the other.
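Something like this should give you a hardware-generated 100 ms high / 100 ms low stream to loop back into Di11 (just a sketch; the pin is only an example, and I’m quoting the GHI API from memory, where SetPulse takes the period and high time in nanoseconds, which is also where the servo nanosecond parameters come from):

using GHIElectronics.NETMF.Hardware;
using GHIElectronics.NETMF.FEZ;

// Hardware PWM as the pulse source: period 200 ms, high time 100 ms, both in nanoseconds
PWM pulseSource = new PWM((PWM.Pin)FEZ_Pin.PWM.Di5);          // example PWM-capable pin
pulseSource.SetPulse(200 * 1000 * 1000, 100 * 1000 * 1000);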