Can someone help me understand this little area of the framework? I'm not entirely sure how the min/max values of GHI's AnalogIn translate to Offset and Scale in MS.SPOT.
Old v4.1 code using the GHI Hardware lib:
_variableResistor = new AnalogIn(AnalogIn.Pin.Ain0);
_variableResistor.SetLinearScale(0,3300);
New code using v4.2 Microsoft.SPOT.Hardware:
_variableResistor = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
_variableResistor.Offset = 0;
_variableResistor.Scale = 3300;
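For what it's worth, here is a sketch of the mapping as I understand it. The assumption (not confirmed anywhere in this thread) is that the 4.2 AnalogInput.Read() returns a value normalized to 0.0..1.0 and then reports normalized * Scale + Offset; under that assumption, GHI's SetLinearScale(min, max) corresponds to Offset = min and Scale = max - min. The Python below is just to illustrate the arithmetic; the names ghi_to_spot and spot_read are made up for this example:

```python
def ghi_to_spot(min_value, max_value):
    """Map GHI SetLinearScale(min, max) onto MS.SPOT Offset/Scale,
    assuming Read() reports normalized * Scale + Offset with
    normalized in 0.0..1.0."""
    offset = min_value
    scale = max_value - min_value
    return offset, scale

def spot_read(normalized, offset, scale):
    """Simulate the value MS.SPOT's Read() would report."""
    return normalized * scale + offset

offset, scale = ghi_to_spot(0, 3300)   # the example above: 0..3300 mV
print(spot_read(0.5, offset, scale))   # half-scale input -> 1650.0
```

Note that for the 0..3300 case Offset = 0 and Scale = 3300, which matches the code above exactly; the min and max only diverge from Offset and Scale when min is nonzero.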
Not an expert in this area, but I believe you will need to add some math on your side for 4.2. The Offset value in MS.SPOT should correspond to the minvalue in GHI's SetLinearScale, but there is no direct equivalence between the GHI maxvalue and Scale without taking the precision in 4.2 into account.
@ Architect - Thanks, I also learned something now. So the scale is basically just an indication of how many millivolts per step. In other words, 3V3 = 1024 steps, so each step is 3.3 V / 1024 = 0.00322265625 V, or about 3.223 mV.
The fact that it’s a 10-bit ADC is what gives you 1024 steps: 1024 = 2^10. A 10-bit ADC on a 5 V micro would still give 1024 steps, but each step is (max - min) / steps volts per step, or (5 - 0) / 1024.
Offset just does what it says, offsets your read value by some quantity.
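To make the offset concrete: assuming again that the reported value is normalized * Scale + Offset, a hypothetical sensor that should read 500..4500 (say, in mV) would use Offset = 500 and Scale = 4000. Offset shifts the bottom of the range; Scale sets its width. The sensor range here is invented purely for illustration:

```python
def scaled_reading(normalized, offset, scale):
    """Assumed MS.SPOT transform: reported value = normalized * Scale + Offset."""
    return normalized * scale + offset

# Hypothetical sensor that should report 500..4500:
print(scaled_reading(0.0, 500, 4000))   # bottom of range: 500.0
print(scaled_reading(1.0, 500, 4000))   # top of range: 4500.0
```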