I help develop hardware and software for battery-powered instruments and moorings that remain in the ocean (surfaced and submerged) for several years, collecting data for scientific research.
Although I’m not a “noob” at developing with ARM processors and embedded C/C++, I’m a complete novice with C# and the Gadgeteer environment. I’m investigating GHI’s products as a way to minimize development time and effort for oceanographic instruments, and so far our results have been very encouraging.
Several of my organization’s other instruments and platforms use Java running on top of embedded Linux, and Java’s garbage collection has been very problematic for devices that are deployed for long periods of time: either storage was completely exhausted, or the JVM spent a long time reclaiming storage at unpredictable intervals. All of the garbage-collection problems have since been reduced to nuisance level, but they were severe enough to give us pause about any software environment that relies on GC.
Has anyone used NETMF and Gadgeteer controllers to build long-running apps? Any comments about the impact of NETMF’s garbage collection on long-running apps would be welcome.
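To be concrete about what we’d aim for: a steady state that allocates nothing, so the GC has little or nothing to reclaim. This is a generic C# pattern rather than anything NETMF-specific, and all the names here are my own invention:

// Hypothetical logging path that preallocates its buffer once, so the
// per-sample code performs no heap allocation at all.
public class SampleLogger
{
    // Allocated once at startup and reused for every record.
    private readonly byte[] _buffer = new byte[512];
    private int _length;

    public void Encode(int sensorValue, long timestampTicks)
    {
        _length = 0;
        _buffer[_length++] = (byte)(sensorValue >> 8);
        _buffer[_length++] = (byte)sensorValue;
        // ... pack timestampTicks the same way, then hand _buffer and
        // _length to the file/telemetry layer, rather than building
        // strings or new arrays per sample ...
    }
}

What I can’t judge yet is how NETMF’s collector behaves when an app like this inevitably does allocate now and then over months of uptime.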
The NETMF CLR seems to be an interpreted environment driven by “byte codes”. The NETMF documentation mentions “Just In Time” compilation to machine code. Do GHI’s products do any JIT translation? If not, are there any rough rules of thumb for the overhead of the byte-code interpreter? Our apps don’t require much performance, but I like to know what things “cost” to some degree.
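Failing a rule of thumb, I’d probably just measure it with something like the crude micro-benchmark below. The only timing source I’m assuming is DateTime ticks (100 ns units), and the loop body is arbitrary:

using System;
using Microsoft.SPOT;

public static class InterpreterCost
{
    // Time N iterations of a trivial operation and divide; the result
    // includes loop overhead, which is fine for an order-of-magnitude
    // feel for what the interpreter costs.
    public static void Measure()
    {
        const int N = 100000;
        int acc = 0;
        long start = DateTime.Now.Ticks;
        for (int i = 0; i < N; i++)
        {
            acc += i;    // representative "work"
        }
        long elapsed = DateTime.Now.Ticks - start;
        // Ticks are 100 ns, so elapsed * 100 / N is ns per iteration.
        Debug.Print("ns/iteration: " + (elapsed * 100 / N).ToString() +
                    " (acc=" + acc.ToString() + ")");
    }
}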
Our apps log data to a local file system, use TCP/IP jiggery-pokery to telemeter data to shore (often through satellite modems), and run several PID loops for control of velocity and position. The control loops are rather slow, operating at 10 Hz, but some instruments would benefit from certain loops running at 100 Hz or more. More or less, everything is soft “real time”.
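For reference, each loop is essentially the textbook discrete PID update below; the names and structure are mine, not our production code:

// Minimal discrete PID step, called once per control period dt (seconds).
public class Pid
{
    public double Kp, Ki, Kd;
    private double _integral, _lastError;

    public double Step(double setpoint, double measured, double dt)
    {
        double error = setpoint - measured;
        _integral += error * dt;
        double derivative = (error - _lastError) / dt;
        _lastError = error;
        return Kp * error + Ki * _integral + Kd * derivative;
    }
}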
The PID controllers must execute at a fixed rate with reasonably small timing jitter (< 10 ms is tolerable; < 1 ms would be great), since any jitter just introduces additional phase lag into the control system. There must be a dozen different ways to execute code at a fixed rate in C#, whether using a thread with “sleep” or something driven from a timer. Any suggestions about an approach that will minimize timing jitter?
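The approach I’d try first is a dedicated thread that sleeps to an absolute next-deadline rather than for a fixed interval, so per-cycle errors don’t accumulate the way a bare Thread.Sleep(100) would. Whether this actually beats a System.Threading.Timer for jitter on a G120 is exactly what I’m asking:

using System;
using System.Threading;

public static class FixedRate
{
    public static void Start()
    {
        Thread control = new Thread(ControlLoop);
        control.Priority = ThreadPriority.Highest;   // above logging/telemetry threads
        control.Start();
    }

    private static void ControlLoop()
    {
        const long period = TimeSpan.TicksPerSecond / 10;   // 10 Hz
        long next = DateTime.Now.Ticks + period;
        while (true)
        {
            // pid.Step(setpoint, measurement, 0.1) would go here
            // (see the Pid sketch above -- hypothetical hook).
            long remainingMs = (next - DateTime.Now.Ticks) / TimeSpan.TicksPerMillisecond;
            if (remainingMs > 0)
                Thread.Sleep((int)remainingMs);
            next += period;                              // absolute deadline, no drift
        }
    }
}

One thing I’m unsure of: DateTime.Now will jump if the clock is ever adjusted (e.g., by time sync), so if I’m reading the docs right a monotonic source such as Microsoft.SPOT.Hardware.Utility.GetMachineTime() may be the better reference. Corrections welcome.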
Finally, is there any way to place a G120 or similar GHI controller into a truly “deep sleep” mode? Browsing the GHI literature revealed that some controllers can be placed into a “deep sleep” mode that still requires several milliamps of current. To meet our low-power requirements, “deep sleep” must consume less than 50 µA.
Is the relatively high current consumption driven by NETMF, or is it just hardware dependent?
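If I’m reading the NETMF docs right, the entry point would be Microsoft.SPOT.Hardware.PowerState.Sleep, something like the sketch below. Whether the G120 port actually implements SleepLevel.DeepSleep, which wake events are usable, and what current it achieves are exactly what I can’t tell from the literature (HardwareEvent.OEMReserved1 is just a placeholder wake event here):

using Microsoft.SPOT.Hardware;

public static class LowPower
{
    public static void Hibernate()
    {
        // Request the deepest sleep the port supports; which SleepLevels
        // are honored, which wake events work, and the resulting current
        // draw are all port/hardware dependent.
        PowerState.Sleep(SleepLevel.DeepSleep, HardwareEvent.OEMReserved1);
    }
}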
Thanks for any insights that members of the forum can provide,