Hello!
I hope this is not a dumb question, but I am curious about the difference in
the analog output software interface (Set() vs. WriteVoltage()) between the 4.2 SDK and the 4.3 SDK.
Here is what I am doing:
ao = GT.SocketInterfaces.AnalogOutputFactory.Create(GT.Socket.GetSocket(9, true, null, null), GT.Socket.Pin.Five, null);
ao.WriteVoltage(1.0); // on the old SDK this gave 1 V (i.e. Set(1.0)); on the new SDK I get full scale!
ao.WriteVoltage(0.1); // this gives me 1 V on the new SDK.
ao.WriteProportion(0.5); // you would expect half scale, but no, it is full scale.
ao.WriteProportion(0.05); // this gives half scale (50%).
So it seems that, by default, I have to divide my values by 10 to get what I would expect. I don’t see any parameter to adjust this, and I don’t understand why it would default to this behavior.
It’s not a problem, since I can compensate in my code, but I’m curious about the reasoning behind it.
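For now I’m compensating with a thin wrapper. This is just a sketch based on my own observation that everything reads 10x too high on the 4.3 SDK; the divide-by-10 factor is empirical, not anything documented, and the ScaledAnalogOutput class is my own helper, not part of the SDK:

```csharp
using GT = Gadgeteer;

// My workaround helper (not part of the SDK). Wraps the 4.3 AnalogOutput
// and divides every value by 10, based purely on the behavior I observed.
public class ScaledAnalogOutput
{
    private readonly GT.SocketInterfaces.AnalogOutput _ao;
    private const double ScaleFactor = 10.0; // observed factor, not documented

    public ScaledAnalogOutput(GT.SocketInterfaces.AnalogOutput ao)
    {
        _ao = ao;
    }

    // Write the voltage I actually want, e.g. WriteVoltage(1.0) for 1 V.
    public void WriteVoltage(double volts)
    {
        _ao.WriteVoltage(volts / ScaleFactor);
    }

    // Write the proportion of full scale I actually want, e.g. 0.5 for 50%.
    public void WriteProportion(double proportion)
    {
        _ao.WriteProportion(proportion / ScaleFactor);
    }
}
```

This works for me, but if the scale factor is a bug that gets fixed in a later SDK, the wrapper would then be off by 10 in the other direction, which is why I’d rather understand the intended behavior.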