Spider - can I expect the timer to be accurate?

Hello!

I have a very simple timer, but it doesn't seem to run on time.

I initialize some code as follows:
m_Timer = new GT.Timer(1); // set to tick every 1 ms
m_Timer.Tick += new GT.Timer.TickEventHandler(timer_Tick);
m_ExposeTime = 10000; // 10,000 ticks x 1 ms = 10 seconds

(m_CountTicks and m_ExposeTime are ushort fields.)
Then I start the timer, but instead of hitting my breakpoint after 10 seconds, it takes about 30 seconds!
I also removed the breakpoint and just printed a debug message instead, but it still took about 30 seconds. I don't believe I'm doing anything wrong, but maybe someone can see something that isn't right?

Should I expect this timer to be accurate???

private void timer_Tick(GT.Timer timer)
{
    m_CountTicks += 1;
    if (m_CountTicks >= m_ExposeTime)
    {
        setIOs(false); // <--- I have a breakpoint here
        return;
    }
}

Show a complete and simple sample, please.

Well, this code is actually in a class with a lot of other functions, and my project has a lot going on.
The timer is actually triggered from a USB serial message.

But, your suggestion gives me an idea -

I will create a very simple project with just a timer and see if I have the same issue or not.
Then I’ll report back -

thanks…

You are generating and queuing one thousand interrupts a second. You might just be overloading the processor.

With an InterruptPort, at the MF level, the EMX can handle around 1500 interrupts per second, doing minimal processing, before the interrupt queue begins to grow and exhausts memory.

I suspect timer interrupts require more processing resources than InterruptPorts, which will result in a lower maximum rate.

Gadgeteer adds additional overhead to the timer interrupts, which will lower the rate further.

It would be interesting to monitor the available memory with Debug.GC() and see whether free memory keeps shrinking, which would confirm the growing queue.
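
Something along these lines would do it (just a sketch; Debug.GC(false) reports the free managed memory in bytes without forcing a collection, and the one-second logging timer and the memTimer name are only for illustration):

// Log free memory once a second while the fast timer is running.
GT.Timer memTimer = new GT.Timer(1000); // one reading per second
memTimer.Tick += delegate(GT.Timer t)
{
    Debug.Print("Free memory: " + Debug.GC(false).ToString() + " bytes");
};
memTimer.Start();

If that number keeps falling while the 1 ms timer runs, the event queue is growing faster than it can be serviced.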

Ok,
Mike's post gave me an idea - so it seems I can't expect the timer to work on a 1 ms time frame. I ran the simple code below three ways: with the timer set to 1 ms, 10 ms, and 100 ms.
It works OK at 10 ms and 100 ms, but at 1 ms it takes about 30 seconds.

public partial class Program
{
    ushort m_countTicks, m_exposeTime;
    private GT.Timer m_Timer;
    // This method is run when the mainboard is powered up or reset.   
    void ProgramStarted()
    {

        m_exposeTime = 1000;        // number of ticks to count before stopping
        m_Timer = new GT.Timer(10); // tick interval in ms (tested with 1, 10 and 100)
        m_Timer.Tick += new GT.Timer.TickEventHandler(timer_Tick);

        Debug.Print("Program Started");
        m_Timer.Start();
    }

    private void timer_Tick(GT.Timer timer)
    {
        if (m_countTicks == 0)
        {
            Debug.Print("timer starts");
        }
        m_countTicks += 1;
        if (m_countTicks >= m_exposeTime)
        {
            Debug.Print("timer stops");
            m_Timer.Stop();

            return;
        }
    }
 }

@dave001 - And if you let it run continuously at 1 ms, it will eventually run out of memory.

These devices are easy to work with, and great things can be implemented with minimal effort. But the code is interpreted, and speed is not one of MF's strengths.
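
If you want to put a number on the drift, one option is to record the start time and print the actual elapsed time when the run finishes. A rough sketch, reusing the fields from the sample above and modifying its tick handler (StartMeasuredRun is just an illustrative helper, and DateTime.Now is coarse, but it's plenty to show 10 seconds versus 30 seconds):

DateTime m_startTime;

void StartMeasuredRun()
{
    m_countTicks = 0;
    m_startTime = DateTime.Now;              // remember when the run began
    m_Timer.Start();
}

private void timer_Tick(GT.Timer timer)
{
    m_countTicks += 1;
    if (m_countTicks >= m_exposeTime)
    {
        m_Timer.Stop();
        TimeSpan elapsed = DateTime.Now - m_startTime;
        long elapsedMs = elapsed.Ticks / 10000;   // 10,000 ticks = 1 ms
        Debug.Print(m_exposeTime.ToString() + " ticks took " + elapsedMs.ToString() + " ms");
    }
}

Run it at each interval and compare the printed time to the expected ticks-times-interval value; the gap at 1 ms should be obvious.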