RobV
November 3, 2010, 5:14pm
1
I am confused!
The getting started manual says:
The input range is 0 to 3.3 volts, but the pins are 5 V tolerant. By default, Read() gives 0…1023. Using SetLinearScale(0, 3300) gives the real voltage (in millivolts).
However, using A5 (pin 19 on the physical board), my values are off. If I set the scale to (0, 5000) I get correct results! I checked with a voltmeter.
Is the manual wrong, or am I doing something terribly wrong here?
RobV
November 3, 2010, 6:02pm
2
I don’t know what happened, but after a reboot it is suddenly OK.
The pins are 5 volt tolerant.
But for analog inputs, to get reliable readings, the voltage should not exceed 3.3 volts.
The scale range is mapped linearly onto the 0–3.3 V input range.
For example, Scale(0, 5000):
0 means 0 volts on the input
5000 means 3.3 volts on the input
Another example, Scale(3000, 6000):
3000 means 0 volts on the input
6000 means 3.3 volts on the input
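The mapping described above is just a linear interpolation from the raw 10-bit count (0…1023) onto the chosen scale range. A minimal sketch in Python (the `linear_scale` helper is hypothetical, for illustration only; the real SetLinearScale is implemented in the board's firmware):

```python
def linear_scale(raw, low, high, max_count=1023):
    """Map a raw ADC count (0..max_count) linearly onto [low, high]."""
    return low + raw * (high - low) / max_count

# Scale(0, 3300): full scale (3.3 V on the input) reads 3300, i.e. millivolts
print(linear_scale(1023, 0, 3300))     # -> 3300.0
# Scale(3000, 6000): 0 V reads 3000, 3.3 V reads 6000
print(linear_scale(0, 3000, 6000))     # -> 3000.0
print(linear_scale(1023, 3000, 6000))  # -> 6000.0
```

Setting the scale to (0, 5000) on a 3.3 V input, as in the first post, would normally make a 3.3 V signal read 5000, which is why the original readings looked wrong rather than right.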
RobV
November 4, 2010, 2:49pm
4
Understood. But I was getting unexplainable readings. I haven’t changed a thing; I will let you know when it happens again.
I would like to add that if the voltage on one analog input exceeds 3.3 V, the readings on all the other analog inputs will be affected and will not be reliable.