Pclk and fractional divide register

(Panda-II)

In many posts it is claimed that the hardware UART can do oddball baud rates very accurately, including 1000000 for the AX12 servo, but my Panda-II is pumping bits out at roughly 0.4 of the configured speed (using the AX12 example posted on this site). Even stranger, when I double the baud rate, the rate shown on the oscilloscope only goes up by a factor of about 1.5. It’s just weird.

I want to try setting the baud rate manually, through the registers. The datasheet has all the formulas, but I cannot figure out what the value of PCLK is. Is it 72 MHz?

I never did find a published way to determine Pclk, but I sort of backed into it by running the equation backwards from known results/configuration.
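For reference, the equation in question is the UART baud rate formula from the LPC user manual:

baud = PCLK / (16 × (256 × DLM + DLL) × (1 + DivAddVal / MulVal))

Run it backwards and you get PCLK = baud × 16 × (256 × DLM + DLL) × (1 + DivAddVal / MulVal).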

This link is quite helpful in setting up the register values: [url]http://prototalk.net/forums/showthread.php?t=11[/url]
Not so clear in all this mess is the fact that you have to set DLAB to 1, then write DLM and DLL, [italic]then clear DLAB back to 0[/italic]. Officially, the Fractional Divider Register has no effect if DLM×256 + DLL is below 3. What I found is that any divisor value below 3 had no effect at all, regardless of what FDR was set to.
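A minimal sketch of that write sequence for UART2, assuming the LPC23xx register map used by the Panda-II’s LPC chip (addresses are from the NXP user manual; double-check them for the exact part):

[code]
/* Sketch only: program the UART2 divisor + fractional divider on an LPC23xx-class part.
   Register addresses are from the LPC23xx user manual; verify against your part. */
#define U2DLL (*(volatile unsigned long *)0xE0078000)  /* Divisor Latch LSB (when DLAB = 1) */
#define U2DLM (*(volatile unsigned long *)0xE0078004)  /* Divisor Latch MSB (when DLAB = 1) */
#define U2LCR (*(volatile unsigned long *)0xE007800C)  /* Line Control Register             */
#define U2FDR (*(volatile unsigned long *)0xE0078028)  /* Fractional Divider Register       */

void uart2_set_divisor(unsigned char dlm, unsigned char dll,
                       unsigned char mulval, unsigned char divaddval)
{
    U2LCR |= 0x80;                        /* DLAB = 1: expose DLL/DLM               */
    U2DLM  = dlm;
    U2DLL  = dll;
    U2FDR  = (mulval << 4) | divaddval;   /* e.g. 0x10 = MulVal 1, DivAddVal 0       */
    U2LCR &= ~0x80;                       /* DLAB = 0: back to normal operation      */
}
[/code]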

According to my oscilloscope the fastest bit rate I can get out of my Panda-II is ~512 kHz. Interestingly, my AX12s are responding at that rate even though they’re programmed for 1 MHz, and they also work with my AVR, which is talking at 1 MHz (verified with the scope). I was not aware that the AX12 had any kind of auto baud detection, but I can’t explain it any other way.

FDR = 0x10
DLM = 0x00
DLL = 0x03 (can’t seem to go any lower)

Based on the values above, and what I see in PCLK_UART2, the UART is maxed out at about 512 k. What am I missing? Are some of the UARTs faster than others? I’m using UART2 (COM3).
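For what it’s worth, plugging those values into the formula: FDR = 0x10 means MulVal = 1 and DivAddVal = 0, so with a divisor of 3 a 72 MHz PCLK would give 72,000,000 / (16 × 3) = 1.5 Mbit/s, and the reset-default PCLK of CCLK/4 (18 MHz, assuming a 72 MHz CCLK) would give 375 kbit/s. Neither works out to 512 k, so either PCLK is something else entirely or the 512 k figure is a measurement artifact.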

When you open your serial port, use the actual (non-standard) baud rate you need, then set the register as explained on the wiki.

This will make the UART clock = system clock = 72 MHz.
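If you want to see what that wiki step amounts to at the register level, here is a minimal sketch for an LPC23xx-class part (the PCLKSEL1 address and the PCLK_UART2 bit positions are taken from the NXP user manual; verify them against the exact chip on your board):

[code]
/* Sketch only: make PCLK_UART2 = CCLK on an LPC23xx-class part.
   PCLKSEL1 address and bit positions are from the LPC23xx user manual. */
#define PCLKSEL1 (*(volatile unsigned long *)0xE01FC1AC)

void uart2_pclk_equals_cclk(void)
{
    /* Bits 17:16 = PCLK_UART2: 00 = CCLK/4 (reset), 01 = CCLK, 10 = CCLK/2, 11 = CCLK/8 */
    PCLKSEL1 = (PCLKSEL1 & ~(3UL << 16)) | (1UL << 16);
}
[/code]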

Yes, I did exactly that. I believe I found the source of the mystery.

I had been estimating my baud rate from my byte throughput, multiplying by 10 bits per byte. This is good enough for rough figures, but it overlooks one variable; a rather unexpected variable if you ask me. I hard-coded my buffer to send 0x55 0x55 0x55 0x55 so I could easily examine the bit time and the byte time separately. See the attached image (apologies for the fuzziness). There seems to be a fixed (or at least minimum) inter-character pause of just over 12.3 µs. It’s like having an extra stop bit of fixed length. When I doubled DLL, my bit time doubled, but the time between the estimated trailing edge of the stop bit and the leading edge of the next start bit did not change. At 1 Mbit/s, this pause is longer than the byte itself. This pause (and failing to account for it) is why my baud rate estimates were not linear, and seemed to cap.
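To put numbers on it: at 1 Mbit/s a 10-bit frame (start + 8 data + stop) takes 10 µs, so with a fixed ~12.3 µs gap each byte actually occupies about 22.3 µs on the wire, and a throughput-based estimate reads roughly 10/22.3 ≈ 45% of the true bit rate. At 500 kbit/s the same math gives about 62%, so doubling the configured rate from 500 k to 1 M only moves the estimate from ~310 k to ~450 k, a factor of roughly 1.45, which lines up with the “only goes up by about 1.5” observation at the top of the thread.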

Now that one mystery is solved: why is the time between bytes fixed? And why is it so long? Is this something I can control? Is it a case of the firmware failing to refill the hardware TX buffer before the previous byte is shifted out? I know how AVRs solve this, but I’m totally new to the LPC2xxx.

I am not sure how I can help further, but the LPC user manual and datasheet would be the right way to go about this.

While the chip’s UART can send each character at 1 Mbit/s, does the chip have enough power to actually feed the UART 100,000 characters per second?

Dave, that is the question of the hour.

I have a hypothesis, though I believe only GHI could confirm it. In other architectures I’ve worked with, there is an interrupt to signal end-of-transmission and a separate interrupt to signal tx-buffer-ready. If SerialPort waits for the signalled end-of-transmission before writing to the buffer, there will be a delay between characters. The correct behavior is to trigger on tx-buffer-ready: if you watch the interrupt behavior at the start of a packet, you find that the interrupt fires twice in quick succession, writing two characters quickly before settling into a more normal pace. The first character is (almost) instantly copied into the shift register, making way for the second character. The second character sits in the tx buffer until the first character’s bits are shifted out; it is then copied in, and tx-buffer-ready fires again. The time it takes to shift out the bits of the previous byte is more than enough to put one more byte into the tx buffer.
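To illustrate the distinction, here is a generic sketch of the two approaches, not GHI’s actual driver; register addresses and LSR bit positions follow the LPC23xx user manual, and a polled loop is used instead of interrupts just to keep it short:

[code]
/* Sketch only: the difference between waiting for TEMT (everything shifted out)
   and waiting for THRE (holding register free) when feeding the UART.
   Register layout per the LPC23xx user manual; verify for your part. */
#define U2THR    (*(volatile unsigned long *)0xE0078000)
#define U2LSR    (*(volatile unsigned long *)0xE0078014)
#define LSR_THRE (1UL << 5)   /* Transmit Holding Register Empty                 */
#define LSR_TEMT (1UL << 6)   /* Transmitter Empty (holding AND shift register)  */

/* Slow: the next byte is not loaded until the previous one has completely left
   the shift register, so a gap of roughly one byte time appears between bytes. */
void send_wait_temt(const unsigned char *buf, int len)
{
    for (int i = 0; i < len; i++) {
        while (!(U2LSR & LSR_TEMT)) { }
        U2THR = buf[i];
    }
}

/* Fast: the holding register is refilled while the previous byte is still being
   shifted out, so bytes go out back-to-back with no inter-character gap. */
void send_wait_thre(const unsigned char *buf, int len)
{
    for (int i = 0; i < len; i++) {
        while (!(U2LSR & LSR_THRE)) { }
        U2THR = buf[i];
    }
}
[/code]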

Again, only GHI engineers could confirm this, as the source is not available.

The source is available, because SerialPort is part of the base .NETMF. You could download the SDK and porting kit from Microsoft to check it out. I don’t have it handy to check right now; maybe tonight.

Really? I know very little about the porting aspect, but I would not expect MS to write the ARM7-specific implementation; merely the CLR wrapper for such. I would expect the implementation guts to be written (and heavily guarded) by GHI. I’m happy to be wrong, if that’s the case. I’d like to see the C/C++/ASM that is doing the grunt work of SerialPort.

There are USART drivers for several architectures in the NETMF base code; GHI is likely using one of their own and NOT one of these.

There are LPC22xx and LPC24xx USART drivers in the code tree:

LPC22xx: [url]http://netmf.codeplex.com/SourceControl/changeset/view/13774#12317[/url]
LPC24xx: [url]http://netmf.codeplex.com/SourceControl/changeset/view/13774#12367[/url]

How close to these the GHI driver is, only GHI knows.