G120 & SignalGenerator


I just tested SignalGenerator with the G120 module and I noticed that the durations in microseconds in the buffer do not match what is actually generated.

For example, if I put the value 4'000'000 in the buffer, the G120 generates a signal for only 3 seconds instead of 4 seconds.

Of course, multiplying the values by a factor of 4/3 works around the problem, but that is not an ideal solution.
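For illustration, here is a minimal sketch of that 4/3 workaround; the ScaleBuffer helper is my own name, not part of the GHI API, and the 4/3 factor is just the empirical value observed above:

```csharp
using System;

class Program
{
    // Hypothetical workaround: scale each microsecond value by 4/3 before
    // handing the buffer to SignalGenerator, compensating for the observed
    // 0.75 timing factor. Integer math avoids floating-point rounding.
    static uint[] ScaleBuffer(uint[] buffer)
    {
        uint[] scaled = new uint[buffer.Length];
        for (int i = 0; i < buffer.Length; i++)
            scaled[i] = (uint)((ulong)buffer[i] * 4 / 3);
        return scaled;
    }

    static void Main()
    {
        // A value meant to run for 3 s must be submitted as 4,000,000 µs.
        uint[] buffer = { 3000000, 3000000 };
        uint[] scaled = ScaleBuffer(buffer);
        Console.WriteLine(scaled[0]); // prints 4000000
    }
}
```

The scaled buffer would then be passed to SetBlocking() in place of the original one.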

Is this a known issue?

Can you please show your code?

using System;
using GHI.Premium.Hardware;
using Microsoft.SPOT;

namespace Signal
{
    public class Program
    {
        public static void Main()
        {
            // Four entries of 1,000,000 µs each: 1 s high, 1 s low, 1 s high, 1 s low.
            UInt32[] buffer = new UInt32[] { 1000000, 1000000, 1000000, 1000000 };

            SignalGenerator generator = new SignalGenerator(G120.Pin.P3_24, false, buffer.Length);

            long ticks = DateTime.UtcNow.Ticks;

            generator.SetBlocking(true, buffer, 0, buffer.Length, 0, true);

            // Expected: 4 s total; observed: only 3 s.
            ticks = DateTime.UtcNow.Ticks - ticks;

            Debug.Print("Time : " + (ticks / TimeSpan.TicksPerSecond) + " s");
        }
    }
}
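To quantify the mismatch from a tick measurement like the one above, the ratio of observed to expected duration can be computed in plain C# (the helper name is mine; the numbers mirror the report, a 4,000,000 µs buffer total measured at 3 s):

```csharp
using System;

class RatioCheck
{
    // Returns observed/expected timing ratio for a measured tick count
    // against the buffer's requested total duration in microseconds.
    static double TimingFactor(long measuredTicks, long expectedMicroseconds)
    {
        double measuredSeconds = (double)measuredTicks / TimeSpan.TicksPerSecond;
        double expectedSeconds = expectedMicroseconds / 1e6;
        return measuredSeconds / expectedSeconds;
    }

    static void Main()
    {
        long measured = 3 * TimeSpan.TicksPerSecond; // 3 s actually observed
        Console.WriteLine(TimingFactor(measured, 4000000)); // prints 0.75
    }
}
```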

Just to make sure: when you change one of the values in the buffer to 4000000, it only holds for 3 seconds. Does it matter whether the state is high or low?

All buffer values (high or low state) are scaled by a factor of 0.75 in reality.

With this code the G120 module generates: 750 ms high, 750 ms low, 750 ms high and 750 ms low.

I will check this code on my Cobra II.

You are right, it is a bug. I see exactly the same issue.

We will check it.

Yes, that is a bug in the SetBlocking() function, and it has been fixed. If this function is not critical to your project right now, try the Set() function instead; it still works fine.


I noticed that the frequency of the SetBlocking() function is also incorrect, but it is probably the same bug.
With a carrierFrequency_hz of 300, the G120 generates a 400 Hz signal.
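That observation is consistent with the 0.75 timing factor: if every period is shortened to 0.75x of its requested length, the output frequency rises by the inverse factor 4/3, which turns 300 Hz into exactly 400 Hz. A quick sanity check of that arithmetic (plain C#, not hardware code):

```csharp
using System;

class FrequencyFactor
{
    static void Main()
    {
        double requestedHz = 300.0;
        double timingFactor = 0.75; // observed duration scaling from the thread

        // Frequency is inversely proportional to period, so a 0.75x period
        // means a 1/0.75 = 4/3 frequency increase.
        double observedHz = requestedHz / timingFactor;

        Console.WriteLine(observedHz + " Hz"); // prints 400 Hz
    }
}
```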