Hello,
I don’t know why, but I wanted to check how accurate the “time” parameter in the interrupt handler could be…
So I’ve used the Phidget SBC as a signal source for the Domino. I’ve programmed the SBC to toggle an output every 100 ms (Java code):
// ik is an already-opened InterfaceKitPhidget; this loop toggles
// digital output 1 every 100 ms, so an edge occurs roughly every 100 ms
while (true)
{
    ik.setOutputState(1, true);
    Thread.sleep(100);
    ik.setOutputState(1, false);
    Thread.sleep(100);
}
As you can see, nothing really hard here. This code runs standalone on the SBC.
Then, the SBC output is connected to DI11:
static InterruptPort ReadCodeuse = new InterruptPort((Cpu.Pin)FEZ_Pin.Interrupt.Di11, false, Port.ResistorMode.PullDown, Port.InterruptMode.InterruptEdgeBoth);
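For completeness, the handler below is subscribed the usual NETMF way (this line wasn’t in my original snippet):

// Subscribe the interrupt handler to the port's OnInterrupt event
ReadCodeuse.OnInterrupt += new NativeEventHandler(ReadCodeuse_OnInterrupt);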
The handler then fills an array with the millisecond component of each timestamp:
static void ReadCodeuse_OnInterrupt(uint data1, uint data2, DateTime time)
{
    // Tab is a static int[] and i a static counter; only the
    // millisecond component of the interrupt timestamp is stored
    Tab[i] = time.Millisecond;
    i++;
}
Now, if I check the difference between successive interrupts, I find values from 104 to 106 ms, with some “spikes” at 100 or 110 ms.
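For reference, here is how I compute those differences (a minimal sketch, assuming Tab is an int[] and i is the number of samples collected; since only the millisecond component is stored, the subtraction has to wrap around at 1000 when a second boundary is crossed):

// Print the interval between each pair of successive interrupts
for (int j = 1; j < i; j++)
{
    int delta = Tab[j] - Tab[j - 1];
    if (delta < 0) delta += 1000; // timestamp crossed a second boundary
    Debug.Print(delta.ToString());
}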
Not that I need to be that precise, but we’re talking about milliseconds, which is a long time for an embedded system, isn’t it?
Both programs (on the SBC and on the Domino) are short enough that they shouldn’t add much overhead, so I wonder why I don’t get something around 100 ms (unless Thread.sleep(100) only guarantees a minimum of 100 ms?) :think:
Any idea?
PS: that’s not a bug report or a criticism, just a question.