FEZ Spider and DMX

Time between the bytes is not that important. All units reset their address counter on the longer sync signal and increase the counter by one for each received byte. If the address counter matches the DIP-switch address, the received byte is latched as the unit's first channel value.
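That counter logic is simple enough to sketch. This is only an illustration with hypothetical handler names (`OnBreak`, `OnByteReceived`) — real slave units do this in hardware or an interrupt routine:

```csharp
// Sketch of the slave logic described above (hypothetical names;
// real units do this in hardware or an interrupt routine).
int addressCounter;
int dipSwitchAddress = 10;   // whatever is set on the DIP switches
byte firstChannelValue;

void OnBreak()                       // the longer sync signal
{
    addressCounter = 0;              // every unit resets here
}

void OnByteReceived(byte value)
{
    if (addressCounter == dipSwitchAddress)
        firstChannelValue = value;   // latch for the unit's first channel
    addressCounter++;                // +1 per received byte
}
```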

If you want smooth updates, you might want to send the sequence from RLP.

As I mentioned before… I don't know where to connect the DMX receiver… :-[

Edit: and how to send raw data there.

DMX is a very simple protocol, based on RS485 indeed. We have done a lot with it, but not with .NET Micro yet :’( I used Atmel controllers a lot for it, and just keep checking in a loop whether there is a frame error (if you are creating a DMX receiver), because that indicates we are going to get a new DMX stream of data. Then it is just counting the received bytes until you reach the address you want (keeping in mind that you first get the start code).

But does anybody know how we could do this in .NET? The standard serial driver doesn’t show frame errors, of course. So should this be done in RLP then?

Regards,

Per

Well, I’ve got some Panda II boards here for testing now, to see what they can be used for (apart from the EMX and ChipworkX we already have). So maybe a DMX project would be nice. But the receiving part… Has nobody actually got something working with .NET and DMX? The start-of-frame part is still bugging me. Like I said, on our Atmel projects I just check for a frame error, but how do I do that in .NET?

Regards,

Per

What about the SerialPort.ErrorReceived event? You should get the SerialError.Frame error.
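Untested sketch of how that might look, assuming the NETMF serial driver on the Spider actually raises the event and reports `SerialError.Frame` (that's exactly the open question in this thread):

```csharp
using System.IO.Ports;

SerialPort port = new SerialPort("COM1", 250000, Parity.None, 8, StopBits.Two);
port.ErrorReceived += new SerialErrorReceivedEventHandler(OnSerialError);
port.Open();

void OnSerialError(object sender, SerialErrorReceivedEventArgs e)
{
    if (e.EventType == SerialError.Frame)
    {
        // The break at the start of a DMX packet should surface as a
        // framing error: reset your receive counter here, so the next
        // byte read is treated as the start code.
    }
}
```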

Check out the following thread.

I modified the project for the Panda II, but haven’t been able to get it to work successfully. I am not sure the break and mark-after-break are being set properly. My thought is that the break is not >88 µs, since the Panda has a higher clock speed.


    private static void SendDMX()
    {
        // Data for a whole universe including the start code
        // (which must be zero for DMX dimmer data).
        byte[] data = new byte[513];
        OutputPort led = new OutputPort((Cpu.Pin)69, true);

        // Init test bytes.
        for (int i = 0; i < data.Length; i++)
        {
            data[i] = (byte)i;
        }

        // Ensure the start code is 0.
        data[0] = 0;

        OutputPort pBreak = new OutputPort((Cpu.Pin)61, false);
        SerialPort sp = new SerialPort("COM1", 250000, Parity.None, 8, StopBits.Two);
        sp.Open();

        for (int x = 0; x < 1000; x++)
        {
            led.Write(true);

            // Create the break (2x Write(false) to get a break length of >88 µs).
            pBreak.Write(false);
            pBreak.Write(false);
            pBreak.Write(true);

            // Rely on .NET laziness for the 8 µs mark-after-break.
            sp.Write(data, 0, data.Length);
            led.Write(false);
            Thread.Sleep(40);
        }

        // Turn all lights off.
        for (int i = 0; i < data.Length; i++)
        {
            data[i] = 0;
        }

        // Create the break (2x Write(false) to get a break length of >88 µs).
        pBreak.Write(false);
        pBreak.Write(false);
        pBreak.Write(true);

        // Rely on .NET laziness for the 8 µs mark-after-break.
        sp.Write(data, 0, data.Length);
    }
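If the worry is that two back-to-back `Write(false)` calls don't stretch the line low for long enough on a faster chip, one alternative is to make the break length deterministic with a sleep. A sketch, using the same `pBreak` pin as above — DMX only requires the break to be *longer* than 88 µs, so a full millisecond is safe, and `Thread.Sleep(1)` is about the finest granularity NETMF gives us anyway:

```csharp
// Hold the line low for a known time instead of relying on call overhead.
pBreak.Write(false);
Thread.Sleep(1);       // >= 1 ms low, comfortably above the 88 µs minimum
pBreak.Write(true);    // mark-after-break before the UART starts sending
```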

Does anyone have an example project that sends DMX?

Can you use OutputCompare on a different output and tie serial output to outputcompare output with 2 diodes and a pull-up resistor?

I wonder if I could just use OutputCompare to output directly to the MAX485 chip.

From the SDK, it looks like I could start with the following.



// The following will do a software UART, 8N1, 2400 baudrate.
const int MAX_TIMINGS_BUFFER_SIZE = 10;
OutputCompare oc = new OutputCompare(pin, true, MAX_TIMINGS_BUFFER_SIZE);
uint[] buffer = new uint[MAX_TIMINGS_BUFFER_SIZE];
const int BAUD_RATE = 2400;
const int BIT_TIME_US = 1 * 1000 * 1000 / BAUD_RATE;
int BYTE_TIME_MS = (int)System.Math.Ceiling((double)BIT_TIME_US * MAX_TIMINGS_BUFFER_SIZE / 1000);

void SendByte(byte b)
{
    bool currentPinState;
    int currentBufferIndex = 0;
    uint currentStateTiming;

    // start bit
    currentPinState = false;
    currentStateTiming = BIT_TIME_US;

    // data bits
    for (int i = 0; i < 8; i++) 
    {
        bool neededState = (b & (1 << i)) != 0;

        if (neededState != currentPinState)
        {
            buffer[currentBufferIndex] = currentStateTiming;
            currentStateTiming = BIT_TIME_US;
            currentPinState = neededState;
            currentBufferIndex++;
        }
        else
        {
            currentStateTiming += BIT_TIME_US;
        }
    }

    // stop bit
    if (currentPinState != true)
    {
        buffer[currentBufferIndex] = currentStateTiming;
        currentBufferIndex++;
    }

    // Use a non-blocking option ///////////////
    oc.Set(false, buffer, 0, currentBufferIndex, false);
    // wait till data is sent
    Thread.Sleep(BYTE_TIME_MS);
    ////////////////////////////

    // Note that if the blocking option is used, you can get more accurate timings and faster baudrates. Also, you can disable processor interrupts so data doesn't get corrupted when interrupts occur.
    // oc.SetBlocking(false, buffer, 0, currentBufferIndex, BIT_TIME_US, true);
}


DMX Protocol

A DMX byte contains 11 bits, and at 250 kbit/s each bit takes 4 µs, so a byte is 44 µs long.
[ulist]The byte starts with a Start Bit, which is LOW (0).
After the Start Bit follow 8 data bits, which carry a value between 0 and 255.
After these 8 bits follow a Stop Bit and a Release Bit, both HIGH (1).
[/ulist]

Every byte stores one value (intensity) for one channel. Because DMX controls 512 channels, we have to send 512 bytes in a stream (packet), preceded by the start code.
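For a sense of scale, the numbers above work out like this (plain arithmetic, not hardware code):

```csharp
// At 250 kbit/s one bit is 4 µs; a DMX byte is 11 bits.
const int BitTimeUs    = 4;                          // 1,000,000 / 250,000
const int ByteTimeUs   = 11 * BitTimeUs;             // 44 µs per byte
const int PacketBytes  = 513;                        // start code + 512 channels
const int PacketTimeUs = PacketBytes * ByteTimeUs;   // 22,572 µs
// Plus the break (>88 µs) and mark-after-break, so a full universe can
// refresh at roughly 44 packets per second at best.
```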

I think you will waste too much time configuring the OC to send a single byte. DMX needs to go fast: you want to send the 512-byte buffer multiple times per second…

Has anyone got pointers on programming a DMX receiver? How do you know the start of the 512-byte data packet?

Thanks,
Steve

The start is a long break (the serial line held in the “space” state). You should get a framing error event at each start of the 512-byte transmission, as I explained in a previous post.

I’m resurrecting this topic :slight_smile:

Help me out! I’m stuck on sending the frame-beginning break (the >88 µs low pulse).

There’s a solution using another output pin and AND’ing it with the UART output…

But I don’t want to waste my resources on this kind of trick.
I believe there’s a line of code that fixes this, but I can’t find it yet.

@ fuatsengul - See my thread for cerberus http://www.ghielectronics.com/community/forum/topic?id=11416

I think the same could easily apply to the Spider/EMX. The trick is on-the-fly baudrate switching: slow the baudrate enough that a null byte can fake a break, with the stop bits providing the mark-after-break. I am not sure, but I think you can’t tweak the baudrate of an open serial port on all NETMF platforms, so you’ll just have to find the right registers on the EMX/Spider to do that.
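Sketched out, that trick looks roughly like this. The method and names here are mine, not from the Cerberus thread, and it assumes the port tolerates a baudrate change while open — as noted above, on some NETMF platforms you may need to close/reopen the port or poke the UART registers instead:

```csharp
// Fake the break by sending 0x00 at a much lower baudrate: at 96000 baud
// the start bit plus 8 zero data bits hold the line low for ~94 µs,
// longer than the required 88 µs, and the two stop bits then give the
// mark-after-break. Then switch back to 250000 for the real data.
static readonly byte[] BreakByte = new byte[] { 0x00 };

void SendPacket(SerialPort sp, byte[] universe)   // universe[0] = start code
{
    sp.BaudRate = 96000;          // may need register pokes on EMX/Spider
    sp.Write(BreakByte, 0, 1);
    sp.Flush();                   // make sure the break byte is fully out
    sp.BaudRate = 250000;         // back to DMX speed
    sp.Write(universe, 0, universe.Length);
}
```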

Disregard the note about the two-stop-bits bug; you shouldn’t be concerned on the Spider.
(Or should you?)