SignalGenerator generating incorrect timing?

Hello,

I am trying to use the SignalGenerator class to generate IR codes for a remote control. I checked other posts and I believe I am using it correctly, but after capturing the output with an oscilloscope, the values are off.

This is the code I have (sample code):


    using System.Threading;
    using Microsoft.SPOT;        // Debug, ExtendedTimer
    using GHI.IO;                // SignalGenerator

    public class Program
    {
        static SignalGenerator sg;

        // Pulse widths in microseconds (alternating output states).
        static uint[] command = { 1000, 500, 1000, 500, 1000, 500 };

        public static void Main()
        {
            sg = new SignalGenerator(GHI.Pins.FEZCerb.PC2, false);

            // Resend the command every 3 seconds.
            var t = new ExtendedTimer((s) =>
            {
                Debug.Print("Sending Command...");
                // Blocking send with interrupts disabled and a 38 kHz carrier.
                sg.SetBlocking(true, command, 0, command.Length, true, 0, 38000);
                Debug.Print("Command Sent.");
            }, null, 0, 3000);

            Thread.Sleep(Timeout.Infinite);
        }
    }


I expected that code to generate a simple signal of 1000 µs off, 500 µs on, repeated three times, using a 38 kHz carrier, but the oscilloscope shows the "bumps" that are supposed to be 1000 µs as 791 µs, and the 500 µs ones as 382 µs.
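Just to quantify how far off the capture is, here is the quick arithmetic (plain Python, used only for the math; the numbers are the ones from my scope capture above):

```python
# How far off are the scope measurements from the requested pulse widths?
expected = [1000, 500]   # requested pulse widths, microseconds
measured = [791, 382]    # widths the oscilloscope actually showed

for want, got in zip(expected, measured):
    short_pct = (want - got) / want * 100
    print("requested %4d us, measured %3d us -> %.1f%% short" % (want, got, short_pct))
# -> 20.9% short and 23.6% short
```

So both pulses come out roughly 20-24% shorter than requested, not a fixed offset.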

I am sure I am not understanding how the SignalGenerator is supposed to work. Could someone explain how to use it, or why I am not getting the values I expected?

I am a programmer (many years) but with minimal electronics expertise, so be gentle! :slight_smile:

I read the Adafruit documentation on IR receivers/transmitters and was able to generate the correct waveforms using a Photon (Arduino-like) with the same values, so I know the IR sensor, the IR transmitter, and the oscilloscope are all working!

Thanks.

What GHI board are you using with this code?

That would help, right? :slight_smile:

It is a FEZ Cerberus running .NET Micro Framework 4.3.

Thanks.

@ dulfe - Which SDK are you using?

.NET Micro Framework 4.3
GHI SDK 2014 R5

Thanks!

@ dulfe - Can you test it on the latest pre-release?

According to the GHI SDK documentation for the SignalGenerator class, the carrier frequency is generated in software and is inaccurate. Is 20% drift unreasonable?

@ Mike - Thanks for your reply, but I do not remember saying it was "unreasonable"... though now that you mention it, I think :think: it is. I was hoping for around ±10%, and perhaps even less when disabling interrupts (as I did in the sample code).

The example shows 1000 microseconds, but if I change that to 10000 and the error is 20%, that is 2 full milliseconds off, which is a lot even for a software-based timer. :snooty:

That is why I asked whether I was using the class incorrectly, or whether these are the expected results for it.

If you have used it before and it has always behaved like this, then fine: that is how it works, I do not have to spend more time on this route, and I can look for alternatives.

Thank you again.

We are still looking into it.

@ John - I just installed the new SDK and updated my Cerberus board; it is now running firmware 4.3.7.10.

The results are almost the same (a 30 to 50 microsecond difference). As Mike said, I might be expecting too much of this class (assuming I am using it correctly).

Thank you for looking into it!

@ dulfe - It might be worthwhile to run a test without the carrier and see what happens to its accuracy.

@ Mike - I ran more tests, this time without the carrier and with all pulse values set to 1000 µs, so it is easier to "see". The first test used "SetBlocking" with interrupts disabled; the pulses captured by the oscilloscope were 750~752 µs (image 1).


static uint[] command = { 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000 };
...
sg = new SignalGenerator(GHI.Pins.FEZCerb.PC2, false);
sg.SetBlocking(true, command);
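Quick arithmetic on that carrier-less blocking test (requested 1000 µs, scope showing 750-752 µs; plain Python just for the math):

```python
# Carrier-less SetBlocking test: requested 1000 us pulses,
# oscilloscope shows 750-752 us.
requested = 1000
for got in (750, 752):
    short_pct = (requested - got) / requested * 100
    print("measured %d us -> %.1f%% short" % (got, short_pct))
# -> 25.0% short and 24.8% short
```

So even without the carrier, the blocking method is about 25% short.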

Then I tried the non-blocking method "Set" with the same pulses. There is no way to disable interrupts with that method, so I used it as-is, and the pulses were exactly 1 ms. (image 2)


sg.Set(true, command, false);

I was REALLY expecting to see "correct" values from the blocking method and slightly off values from the non-blocking version.

So... either I am reading the oscilloscope incorrectly (which would not surprise me), I am expecting the wrong values from SignalGenerator's blocking method, or it is doing the wrong thing.

Any guidance would be appreciated.

Thanks.

@ dulfe -

We were able to reproduce the issue. This will be fixed in the next 2016 SDK.

@ Dat - Sorry for breaking your SDK! :slight_smile:

I guess I will have to play with RLP and see if I can make it work.

Thanks.

It seems the issue also affects RLP. I tried using DELAY, and it gives me 33% less than requested. For example:



It sleeps for 750 µs. I tried multiple values and there is always a 33% difference.

You might want to look into that too.

Thanks!