Duration of an RLP->Writepin(...) call

I am trying to communicate with a device (a video camera) by generating a serial signal in RLP at 9600 baud, which means each bit must last 104us. I need to know how long an RLP->Writepin(…) call takes so that I can subtract it from the bit duration, giving me the delay I need to insert between pin writes.
My board is a Fez Spider I, and as my C/C++ is not great, I was wondering if anyone had a C/C++ code snippet that I might be able to use to time an RLP call in microseconds?
I found the following code online and adapted it to work out how long an RLP->Delay(…) takes, but it always returns zero. I am guessing I am missing something very obvious, but I'm not sure what!

#include <time.h>   // for clock() and CLOCKS_PER_SEC

double GetExecutionTime(void **args)
{
    unsigned int delay = *(unsigned int *)args[0];   // "uint" is not standard C
    clock_t start, end;
    double cpu_time_used;

    start = clock();
    RLP->Delay(delay);   // the call being timed goes between the two clock() reads
    end = clock();
    cpu_time_used = ((double)(end - start)) / CLOCKS_PER_SEC;

    return cpu_time_used;
}
@ pandrewster - you are dividing by clock ticks per second. Try clock ticks per microsecond?

I would also verify that you are getting a proper value for the delay time.

The bits for the UART are handled in hardware. What are you trying to do?

@ Gus - I am trying to send commands to a video camera using the Sony LANC protocol. I am using this link Control Your Camera: Arduino controlled video recording using the LANC port as a guide. It explains the signal structure very well, but in a nutshell, my main objective with the RLP code is to write the first two bytes of a serial 8-byte command whose timing is generated by the camera. The signal is sent over a single wire, using a two-way open-collector 9600 baud serial protocol with inverted logic. The camera generates the timing of the command: it sends a >5ms pause before the command, then a 104us start bit, after which I write my first byte. The camera-generated signal follows this with a stop bit. I then wait for a second start bit and write the second byte of my command. The remaining 6 bytes of the command sent by the camera are status messages, which I can ignore for the moment.

Due to the precise synchronisation needed, I can't use the standard UART functionality (please correct me if I'm wrong). At 9600 baud the length of each bit is 104us. The code in the link above is for an Arduino, and it states 'Writing to the digital port takes about 8 microseconds so only 96 microseconds are left till the end of each bit'. Currently I am successfully synchronising with the long pause and the start/stop bits, but the commands I am writing to the pin don't look right (I am using the 'RLP Serial Debugging' code from Codeshare to read it on COM2).
I want to verify the execution time of an RLP->Writepin(), to see whether writing to a digital pin on my FEZ Spider via RLP takes more or less than 8us, and adjust my RLP->Delay(…) after each write accordingly.

Any guidance gratefully received!

9600 baud is a pretty normal UART rate, so I’d be surprised if you actually needed to go to RLP to achieve that. As Gus asks, what are you trying to achieve, or why didn’t UART work for what you wanted in the first place?

Edit: 30 seconds between your post and mine :slight_smile:

I stumbled upon this post https://www.ghielectronics.com/community/forum/topic?id=3495&page=2 relating to ticks in RLP. It showed that Cpu.SystemClock in .NET MF gives the ticks per second for my board, and that if I include LPC24xx.h in my RLP code I can read the Timer0 T0TC register to get the current tick count. Subtracting the ticks read before the Writepin call from the ticks read after gives the tick count taken to execute the statement; dividing by ticks per microsecond (18 for the EMX) gives the microsecond count. Finally got there!

As a quick update: on my EMX Spider (18MHz timer clock), RLP->Delay takes 2us to execute and RLP->Writepin takes 3us, so from these figures I was able to calculate the delay needed between pin writes!