AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
AnalogInput PT1002 = new AnalogInput(Cpu.AnalogChannel.ANALOG_1);
AnalogInput PT1003 = new AnalogInput(Cpu.AnalogChannel.ANALOG_2);
I am using the GHI Electronics G120HDR mainboard, version 2.0.
Your raw output should be a number between 0 and 4095 (12-bit), corresponding to 0 to 3.3 V.
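If I remember right, Read() returns the reading as a ratio between 0.0 and 1.0 with the default scale and offset, while ReadRaw() returns the raw A/D counts, so something like this (untested sketch) should show both:

using Microsoft.SPOT;
using Microsoft.SPOT.Hardware;

public class Program
{
    public static void Main()
    {
        // Untested sketch: read one channel and print both the raw counts and the voltage.
        AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);

        int raw = PT1001.ReadRaw();    // raw A/D counts, 0..4095 on a 12-bit ADC
        double ratio = PT1001.Read();  // 0.0 .. 1.0 with the default scale/offset
        double volts = ratio * 3.3;    // convert the ratio to volts

        Debug.Print("PT1001 raw=" + raw.ToString() + " volts=" + volts.ToString());
    }
}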
You can also initialize it to give a scaled output by using the four-parameter constructor:
public AnalogInput(
Cpu.AnalogChannel channel,
double scale,
double offset,
int precisionInBits
)
Parameters
channel
Type: Microsoft.SPOT.Hardware.Cpu.AnalogChannel
The channel for the analog input.
scale
Type: System.Double
A multiplicative factor to apply to the raw sensor reading before the value is returned.
offset
Type: System.Double
A constant factor to apply to the raw sensor reading before the value is returned.
precisionInBits
Type: System.Int32
The desired bit precision for the A/D conversion. A value of -1 indicates maximum available precision.
AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0, scale, offset, precision);
// I think, but haven't tested, that:
// scale = 4096 / 3.3
// offset = 0
// precision = -1 or 12 (-1 is max available precision)
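Something like this (again untested) would let you check those guesses on the board by printing the raw and scaled readings side by side:

using Microsoft.SPOT;
using Microsoft.SPOT.Hardware;

public class Program
{
    public static void Main()
    {
        // Untested sketch: the scale/offset/precision values are the guesses from above;
        // compare Read() against ReadRaw() to confirm what scale you actually need.
        double scale = 4096 / 3.3;   // guess from above; adjust after testing
        double offset = 0;
        int precision = -1;          // -1 = maximum available precision

        AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0, scale, offset, precision);

        int raw = PT1001.ReadRaw();      // raw A/D counts
        double scaled = PT1001.Read();   // reading with scale and offset applied
        Debug.Print("PT1001 raw=" + raw.ToString() + " scaled=" + scaled.ToString());
    }
}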