Analog inputs

I’d like your help: I would like the G120HDR to read a voltage on its analog inputs:

Debug.Print("PT100 AD0 = " + PT1001.Read());

PT1001 is an AnalogInput variable.

But the result I get does not match what the voltmeter shows.

What is the problem in your opinion?

Can you show the code for declaring your AnalogInput?

@ hagster - Yes, of course!

AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
AnalogInput PT1002 = new AnalogInput(Cpu.AnalogChannel.ANALOG_1);
AnalogInput PT1003 = new AnalogInput(Cpu.AnalogChannel.ANALOG_2);

I am using the GHI Electronics G120HDR mainboard, version 2.0.

Your output should be a number between 0 and 4095, corresponding to 0 to 3.3 V.

You can initialize it to give a scaled output

public AnalogInput(
	Cpu.AnalogChannel channel,
	double scale,
	double offset,
	int precisionInBits
)

channel (Type: Microsoft.SPOT.Hardware.Cpu.AnalogChannel)
The channel for the analog input.

scale (Type: System.Double)
A multiplicative factor to apply to the raw sensor reading before the value is returned.

offset (Type: System.Double)
A constant to add to the raw sensor reading before the value is returned.

precisionInBits (Type: System.Int32)
The desired bit precision for the A/D conversion. A value of -1 indicates maximum available precision.

Okay, thank you for the quick response, but I do not think that is the problem, because instead of 0.23 V I get 0.070574.

Sorry, ReadRaw() is what gives you the 4096 scale (i.e. 2^ADCbits).

With Read() you have 1.0 = full scale,

so 0.070574 * 3.3 ≈ 0.233 V.
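A minimal, untested sketch of that conversion (this assumes NETMF's Microsoft.SPOT.Hardware API and a 3.3 V ADC reference; it needs the actual board to run, and the PT1001 name is just the declaration from above):

```csharp
using Microsoft.SPOT;
using Microsoft.SPOT.Hardware;

// Read() returns a normalized value in [0.0, 1.0] by default,
// so multiply by the ADC reference voltage to get volts.
AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0);
double normalized = PT1001.Read();   // e.g. 0.070574
double volts = normalized * 3.3;     // e.g. roughly 0.233 V
Debug.Print("PT100 AD0 = " + volts + " V");
```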

@ hagster - No need to apologise, it works now! Thank you for your help!!

@ hagster - One last thing: I did the multiplication in my code, but is it possible to tell it that I want a full scale of 3.3?

Details were in the snippet I posted.

AnalogInput PT1001 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0, scale, offset, precision);

// I think, but haven't tested, that:
// scale = 3.3  (Read() is normalized to 0..1, so scale by the reference voltage)
// offset = 0
// precision = -1 or 12 (-1 is maximum available precision)

Have a play until you get the right values.
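For what it's worth, here is an untested sketch of the scaled constructor put together (assuming, as above, that Read() applies scale and offset to the normalized 0..1 reading; the pt100 name is just illustrative):

```csharp
using Microsoft.SPOT;
using Microsoft.SPOT.Hardware;

// With scale = 3.3 and offset = 0, Read() should return volts directly,
// so no extra multiplication is needed in your own code.
AnalogInput pt100 = new AnalogInput(Cpu.AnalogChannel.ANALOG_0, 3.3, 0.0, -1);
double volts = pt100.Read();   // expected range: 0.0 .. 3.3
Debug.Print("PT100 AD0 = " + volts + " V");
```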

@ hagster - Thank you !