Hi,
I have been trying to port an Arduino-based solution (http://controlyourcamera.blogspot.co.uk/2011/02/arduino-controlled-video-recording-over.html) for sending commands to a video camera over the Sony LANC protocol (The SONY LANC protocol) to Gadgeteer and .NET MF, but am having some issues and could do with some advice.
The protocol is a “two-way serial open collector 9600 baud protocol with inverted logic”. The LANC line is pulled high to about +5 V and is pulled low to send commands or status information. My electronics knowledge is a bit rusty, but I am just about getting my head round it!
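As a quick sanity check on the numbers (this is just arithmetic, nothing LANC-specific): at 9600 baud each bit occupies 1/9600 of a second, which is where the 104 µs figure below comes from.

public static class LancTiming
{
    // 1,000,000 µs per second divided by 9600 bits per second
    // gives ~104 µs per bit (integer division truncates 104.17).
    public const int BaudRate = 9600;
    public const int BitTimeMicroseconds = 1000000 / BaudRate; // 104
}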
I have built the circuit from the blog and connected it to an Extender module on my FEZ Spider (see attached image). As a first stage I am trying to read the byte stream from the camera using a DigitalInput. I am not using a UART socket yet, as I have been unable to get the SerialPort to register any inbound DataReceived events - this might be because the circuit needs modification, since the Arduino example works via a digital input and output.
Using the DigitalInput I can register a data stream. To do this, I am reading the bits from the digital input, each of which is 104 µs in duration (see the LANC protocol doc link above). The data coming out does not make much sense, and my suspicion is that the looping and delay I am using to read the bits are far too slow (a simplified version of my read loop is below).
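This is roughly the shape of my read loop - the half-bit offset and the bit polarity are my best guess at what the Arduino sketch is doing, so please treat it as a sketch rather than known-good code:

// 'lancInput' is the Gadgeteer DigitalInput set up on the Extender socket,
// and TimingService is the helper class further down this post.
private byte ReadLancByte(Gadgeteer.Interfaces.DigitalInput lancInput)
{
    // Wait for a start bit: the idle line sits high and is pulled low.
    while (lancInput.Read()) { }

    // Skip the start bit and aim for the middle of the first data bit.
    TimingService.DelayMicroseconds(104 + 52);

    byte value = 0;
    for (int bit = 0; bit < 8; bit++)
    {
        // LANC uses inverted logic and sends LSB first, so a low level is a 1.
        // (Depending on how the transistor stage is wired the polarity may
        // need flipping here.)
        if (!lancInput.Read())
        {
            value |= (byte)(1 << bit);
        }
        TimingService.DelayMicroseconds(104);
    }
    return value;
}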
My questions are…
[b]1) Is it possible to create a delay small enough (104 µs) using Gadgeteer and .NET MF? (see code extracts below)
- If it is not possible, I imagine I should use a SerialPort on a UART socket to read and write the serial data to the camera, and pretty much ignore how it has been done in the blog's Arduino sketch (see the SerialPort sketch below). If this is the case, the UART TX and RX are two different pins on the Gadgeteer socket, but I only have 3 wires from the camera: +5 V, signal, and GND. Should I use a circuit similar to the one in the blog to split the single signal wire into separate TX and RX connections on the UART socket?[/b]
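In case it helps, this is roughly how I have been setting up the SerialPort for the UART approach (the "COM1" port name is a placeholder for whichever U socket the circuit ends up on):

using System.IO.Ports;
using Microsoft.SPOT;

public class LancSerialTest
{
    // "COM1" is a placeholder - the actual name depends on which U socket
    // the circuit is wired to on the FEZ Spider.
    private SerialPort _port;

    public void Start()
    {
        // LANC is 9600 baud; assuming 8 data bits, no parity, 1 stop bit.
        _port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
        _port.DataReceived += OnDataReceived;
        _port.Open();
    }

    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        var buffer = new byte[_port.BytesToRead];
        var read = _port.Read(buffer, 0, buffer.Length);

        // Just dump the raw bytes for now - with the inverted open-collector
        // line the values may need inverting and/or the circuit adjusting.
        for (int i = 0; i < read; i++)
        {
            Debug.Print("LANC byte: " + buffer[i].ToString());
        }
    }
}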
Going back to the DigitalInput approach, I created a couple of methods to let me delay for µs intervals:
public class TimingService
{
    public static long GetCurrentTimeInTicks()
    {
        return Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
    }

    public static void DelayMicroseconds(int microseconds)
    {
        // GetMachineTime() ticks are 100 ns, so 10 ticks per microsecond.
        var now = GetCurrentTimeInTicks();

        // Busy-wait until the requested number of microseconds has elapsed.
        while (GetCurrentTimeInTicks() < 10 * microseconds + now)
        {
        }
    }
}
However, I then wrote a ‘test’ for this which requests a 1 µs delay (I am entering 1 µs here to see what the shortest delay I can achieve is - I know there is no way I will actually get 1 µs!).
private void TestTimer()
{
    var microsecondsDelayValue = 1;

    // Measure how long a nominal 1 µs delay actually takes.
    long ticksBefore = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
    TimingService.DelayMicroseconds(microsecondsDelayValue);
    long ticksAfter = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;

    var elapsedTicks = ticksAfter - ticksBefore;
    Debug.Print(elapsedTicks.ToString());
}
I get a result of 11259 ticks, which equates to 1126 µs (10 ticks equal 1 µs) and which, even ignoring the overhead of the surrounding calls, is roughly a factor of 10 greater than my 104 µs target.
The following is another test that just reads GetMachineTime().Ticks twice in a row and subtracts the two values.
long first = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
long second = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
var elapsedTicks = second - first;
Debug.Print(elapsedTicks.ToString());
It gave me a result of 2394 ticks (239 µs) - more than two bit periods just to read the clock twice - which makes me think I am not going to get anywhere near the accuracy needed for 104 µs!
Any advice/guidance would be greatly appreciated.
I can provide more code samples and further info, but am trying to keep the post as short as possible for the moment.
Thanks,
Andrew