My first post, apologies in advance for being a complete noob in this area.
I’m looking into timestamping signals from a sensor using GPS. The accuracy has to be around 250 ns. The reason for GPS is that there will be a network of sensors which all have to be on the same timeline. I know C# and .NET very well, but I have understood from reading other posts here that for real-time applications one must use RLP. The FEZ Panda II looks like the ideal choice in many ways, but is it theoretically achievable to obtain the kind of timing accuracy that I need?
Actually, your bigger issue is going to be GPS latency, not timestamping. In my (admittedly minimal) exploration of GPS, you are limited to 10 or fewer fixes per second, all delivered as NMEA sentences (ASCII strings). All that string parsing is “costly” before you get usable numbers, so you’ll not really achieve what you want.
How were you thinking you’d keep this accuracy? What were your expectations before you asked the question? And what sensors do you have, and what sampling regime were you expecting to use?
Lastly, welcome to the forums!
Latency can be accounted for… most GPS chips have an output pin which goes high at the moment the chip calculates a new position fix. You can use an RLP interrupt handler on that pin to take any associated measurements. Later, when the GPS sentence has been parsed, you can associate it with the RLP measurement.
Your other option is to measure relative to CPU ticks. That should be fairly accurate, even down toward the ns scale. Note that you probably can’t read the tick count from managed code within that short a time frame.
Thanks for the welcome!
I was thinking of using a GPS with 1PPS capability, such as the GlobalSat EM-406A (other suggestions are welcome). The pulse would trigger an internal timer that counts up until the next pulse. I have seen this done with Atmel microcontrollers, resulting in a timing accuracy of around 400 ns. From what I have read in the USBizi user manual, the same kind of accuracy should be achievable with the FEZ Panda II using the RTC + crystal.
As for the sensors, they will detect lightning. I’m not yet sure about anything else; I still have a lot of research to do. But as I see it, the main issue is getting the inter-sensor time synchronization to work.
I wouldn’t count on software, even RLP with interrupts, to measure anything at that accuracy.
I did a similar project 10 years ago to timestamp events on multiple sites with 1 µs accuracy.
All was done in hardware in an FPGA.
The FPGA was receiving the serial link from the GPS as well as the PPS.
The NMEA sentence was limited to the minimum to get time and date.
The serial stream was deserialized and stored in a first buffer.
The PPS rising edge drove the reset of a 20-bit counter, which was clocked by a 1 MHz reference coming from a highly accurate and stable TCXO (half the price of the board itself).
On an event’s rising edge, the values of the counter and the serial buffer were copied into shadow registers, and an interrupt was generated to the CPU (a PC running XP; I had to play with XP drivers at the time).
In the driver interrupt code, the shadow registers were read from the FPGA into the CPU and the system was re-armed for a new capture.
Using both the time/date from the NMEA sentence and the counter value, it was possible to timestamp events with 1 µs accuracy.
You might extend this principle to 250 ns with a 4 MHz TCXO, but I must admit I am unsure whether it holds if you are timestamping simultaneously on multiple sites, since there could be drift between the receivers.
In digital terrestrial television, a mode called SFN (Single Frequency Network) is often used, where multiple transmitters run on the same carrier frequency and transmit the exact same (MPEG) data stream. The transmitters’ carrier is generated by a PLL locked to a 10 MHz clock from a GPS receiver. The PPS is used to synchronize the data stream transmission.
So I guess that multiple GPS receivers dispatched over a reasonable area should be in sync with an accuracy better than 1 µs.