Porting GHI AnalogIn v4.1 to AnalogInput v4.2

Can I get someone to help me understand this little area of the framework? I’m not entirely sure how the min/max values of GHI AnalogIn translate to Offset and Scale in MS.SPOT.

Old v4.1 code using GHI Hardware lib:

_variableResistor = new AnalogIn(AnalogIn.Pin.Ain0);
_variableResistor.SetLinearScale(0,3300);

New code using v4.2 Microsoft.SPOT.Hardware:

_variableResistor = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
_variableResistor.Offset = 0;
_variableResistor.Scale = 3300;

Is this good/bad? What do I do? Thanks!

Not an expert in this area, but I believe you will need to add some math on your side for 4.2. The “Offset” value in MS.SPOT should correspond to the minValue in GHI SetLinearScale. But there is no direct equivalency between the GHI maxValue and Scale without taking the ADC precision in 4.2 into account. See:

https://www.ghielectronics.com/downloads/man/Library_Documentation_v4.1/html/1a262f71-ceb9-22b6-4f94-ea1d7f894114.htm

https://www.ghielectronics.com/community/forum/topic?id=7832

@ dapug - with a 10-bit ADC you get raw values from 0 to 1023 for the voltage range of 0V to 3V3.

Offset and Scale (in 4.2) change the formula to

value = Scale * Raw + Offset.

In your case (Scale = 3300) this will translate to values between 0 and 3,375,900.

So if you want the same values as in your 4.1 example, you need a Scale value of about 3.222.
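Putting the thread’s conclusion together, a minimal sketch of the port (untested; it assumes, per the discussion above, that the 4.2 reading is the raw 10-bit count with value = Scale * Raw + Offset applied):

```csharp
using Microsoft.SPOT.Hardware;

// Sketch only, based on the formula discussed above: value = Scale * Raw + Offset.
// Assumes a 10-bit ADC (raw counts 0..1023) and a 3V3 reference.
_variableResistor = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
_variableResistor.Offset = 0;            // 4.1 minValue was 0, so no offset needed
_variableResistor.Scale = 3300.0 / 1024; // ~3.222 mV per raw count
double millivolts = _variableResistor.Read();
```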

@ Architect - Thanks, I also learned something now. So the offset is basically just an indication of how many millivolts per step. In other words, 3V3 = 1024 steps. Therefore each step is 3V3 / 1024 = 0.00322265625 V, or about 3.222 mV.
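That arithmetic, spelled out as plain code (no hardware assumed, just the numbers from the post above):

```csharp
const double ReferenceMillivolts = 3300.0;  // 3V3 supply
const int Steps = 1 << 10;                  // 10-bit ADC: 2^10 = 1024 steps
double millivoltsPerStep = ReferenceMillivolts / Steps; // 3.22265625 mV per step
```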

@ kiwisaner,

no, the offset is not that.

The fact that it’s a 10-bit ADC is what generates 1024 “steps”: 1024 = 2^10. A 10-bit ADC on a 5 V micro would still generate 1024 steps, but each step would be (max - min)/steps volts, or (5 - 0)/1024.
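As a generic sketch of that formula (a hypothetical helper, not part of either library; it follows the thread’s convention of dividing by the full 2^N step count rather than 2^N - 1):

```csharp
// Volts per step for an N-bit ADC over a given input range.
static double VoltsPerStep(double maxVolts, double minVolts, int bits)
{
    return (maxVolts - minVolts) / (1 << bits); // e.g. (5 - 0) / 1024 for 10 bits at 5 V
}
```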

Offset just does what it says, offsets your read value by some quantity.

Stupid me, I meant to say Scale, not Offset. :slight_smile: