I was wondering if anyone with a Cerberus could run this code and tell me how long it takes…
I suspect it will run out of memory, though… :'(
var myInt = new int[100000];               // 100,000 ints ≈ 400 KB — a big ask for these boards
var ain = new AnalogIn(AnalogIn.Pin.Ain0);
var startTime = DateTime.Now.Ticks;
for (var i = 0; i < myInt.Length; i++)
    myInt[i] = ain.Read();
var elapsedTime = (DateTime.Now.Ticks - startTime) / (double)TimeSpan.TicksPerSecond;
Debug.Print(elapsedTime.ToString("F2") + " s");
Well, I tried it on a Panda and sure enough I got an out-of-memory exception…
but changing this:
var myInt = new int[100000];
to
byte[] myInt = new byte[100000];
and this:
for (var i = 0; i < myInt.Length; i++)
myInt[i] = ain.Read();
to
for (var i = 0; i < myInt.Length; i++)
    myInt[i] = (byte)ain.Read();
gives a whole lot more room to read and store samples — a byte array only needs a quarter of the memory of an int array of the same length (1 byte per element instead of 4).
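Putting the two changes together, the whole benchmark might look something like this (just a sketch — I'm assuming the GHIElectronics.NETMF.Hardware namespace for AnalogIn here, and note that Read() returns an int, so the cast to byte silently drops any bits above 255 if your ADC resolution is more than 8 bits):

```csharp
using System;
using Microsoft.SPOT;
using GHIElectronics.NETMF.Hardware;

public class Program
{
    public static void Main()
    {
        // 100,000 bytes ≈ 100 KB — a quarter of what 100,000 ints (≈ 400 KB) would need
        var samples = new byte[100000];
        var ain = new AnalogIn(AnalogIn.Pin.Ain0);

        var startTime = DateTime.Now.Ticks;
        for (var i = 0; i < samples.Length; i++)
        {
            // Read() returns an int; the cast keeps only the low 8 bits
            samples[i] = (byte)ain.Read();
        }
        var elapsedTime = (DateTime.Now.Ticks - startTime) / (double)TimeSpan.TicksPerSecond;

        Debug.Print(elapsedTime.ToString("F2") + " s");
    }
}
```

If you need the full ADC range, you could store the raw ints in a ushort[] instead — twice the memory of byte[] but still half of int[], and no bits lost.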