I just successfully transferred one byte from one Cobra to another.
Now I have a question. If I want to transfer anywhere from a few bytes to a few kB, is it safe to just read all the bytes available on the receiver, or would it be a better approach to first send an integer saying how many bytes will follow, and then read exactly that many bytes?
I know that when using sockets you never know how many bytes you will receive when DataAvailable fires (all of them or just a part).
How does it work with the UART? If I send 10 kB, will SerialPort.BytesToRead show 10 kB, or is some delay possible so that only part of the data has arrived yet?
I hope someone will understand what I am trying to ask.
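Just to make the question concrete, the second approach I have in mind is roughly this (untested sketch; I am writing against the desktop-style System.IO.Ports.SerialPort API and assuming the NETMF version on the Cobra behaves similarly, and the port name "COM1" and the 4-byte little-endian length prefix are just my own choices):

```csharp
using System.IO.Ports;

class LengthPrefixedReceiver
{
    // Read exactly 'count' bytes: SerialPort.Read may return fewer bytes
    // than requested because the data arrives in chunks, so keep looping.
    static void ReadExact(SerialPort port, byte[] buffer, int count)
    {
        int offset = 0;
        while (offset < count)
        {
            int read = port.Read(buffer, offset, count - offset);
            if (read > 0)
                offset += read;
        }
    }

    static void Main()
    {
        SerialPort port = new SerialPort("COM1", 115200);
        port.Open();

        // First read a 4-byte length prefix (little-endian)...
        byte[] lengthBytes = new byte[4];
        ReadExact(port, lengthBytes, 4);
        int length = lengthBytes[0]
                   | (lengthBytes[1] << 8)
                   | (lengthBytes[2] << 16)
                   | (lengthBytes[3] << 24);

        // ...then read exactly that many payload bytes.
        byte[] payload = new byte[length];
        ReadExact(port, payload, length);
    }
}
```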
XMODEM is simple to implement and gives some small resilience to errors. If you wanted to get fancy, you could implement XMODEM-CRC, which is only a tiny bit more complex and gives much better resilience to errors.
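For reference, the CRC that XMODEM-CRC uses is CRC-16 with polynomial 0x1021 (initial value 0x0000, no reflection), and a straightforward bit-by-bit version is only a few lines. This is just a sketch, not taken from any particular library:

```csharp
using System;
using System.Text;

static class Crc16
{
    // CRC-16/XMODEM: polynomial 0x1021, initial value 0x0000,
    // no input/output reflection, no final XOR.
    public static ushort Xmodem(byte[] data)
    {
        ushort crc = 0x0000;
        foreach (byte b in data)
        {
            crc ^= (ushort)(b << 8);
            for (int bit = 0; bit < 8; bit++)
            {
                crc = (crc & 0x8000) != 0
                    ? (ushort)((crc << 1) ^ 0x1021)
                    : (ushort)(crc << 1);
            }
        }
        return crc;
    }

    static void Main()
    {
        // Standard check value: the CRC of "123456789" should print 31C3.
        byte[] test = Encoding.ASCII.GetBytes("123456789");
        Console.WriteLine(Xmodem(test).ToString("X4"));
    }
}
```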
Yes, it is possible, though probably unlikely if you’re working with short distances. In a noisy environment, it might be more likely, but I wouldn’t sweat it TOO much.
Of course, the faster you go, the more likely the chance of error.
Generally, I go as slow as the application requirements allow.
If you want to go a long distance with a serial interface, then converting the TTL levels to RS-232 is recommended.
RS-232, I believe, can go about 50 feet. It's been a while since I have actually run long RS-232 cables.
It's not "the only way to be safe", but it is a tried and proven method using commodity cables. TTL-level signals are affected by resistance and capacitance in the cables, and RS-232 removes a lot of that sensitivity.
Using pure TTL signals without transceivers is, from an EMC point of view, only usable on the board itself, and even then you still have to follow the rules…
Two meters in a harsh environment (50 Hz mains power already qualifies as harsh here) is living on the edge.
RS-232 is good for "some" meters; RS-485 is "great" for harsh environments and longer distances, and CAN is also "great".
To each message you need to add a checksum. The receiver can then recalculate the checksum and verify the integrity of the message.
If the message has been corrupted, the receiver asks the transmitter to re-transmit it. If the message is OK,
the receiver sends a message back acknowledging it, and the transmitter can then discard its copy of the message.
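As a rough sketch of what that send / acknowledge / re-transmit loop can look like (the frame layout, the ACK/NAK byte values and the simple additive checksum here are just illustrative choices, not any particular standard):

```csharp
using System;
using System.IO.Ports;

static class ReliableSend
{
    const byte ACK = 0x06;   // illustrative ACK/NAK byte values
    const byte NAK = 0x15;

    // Simple 8-bit additive checksum over the payload.
    static byte Checksum(byte[] data)
    {
        byte sum = 0;
        foreach (byte b in data) sum += b;
        return sum;
    }

    // Send one framed message and wait for an ACK; re-send on NAK or timeout.
    // Frame layout (illustrative): [2-byte length][payload][1-byte checksum].
    // The caller is expected to have set port.ReadTimeout to something finite.
    public static bool SendWithRetry(SerialPort port, byte[] payload, int maxAttempts)
    {
        byte[] frame = new byte[payload.Length + 3];
        frame[0] = (byte)(payload.Length & 0xFF);
        frame[1] = (byte)((payload.Length >> 8) & 0xFF);
        Array.Copy(payload, 0, frame, 2, payload.Length);
        frame[frame.Length - 1] = Checksum(payload);

        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            port.Write(frame, 0, frame.Length);
            try
            {
                int reply = port.ReadByte();   // blocks until a byte arrives or ReadTimeout expires
                if (reply == ACK)
                    return true;               // receiver verified the checksum, safe to discard
                // NAK (or anything unexpected): fall through and re-send
            }
            catch (TimeoutException)
            {
                // No reply at all: re-send as well.
            }
        }
        return false;   // gave up; let the application decide what to do
    }
}
```

The receiver does the mirror image: read the frame, recalculate the checksum over the payload, and write back ACK if it matches or NAK if it does not.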
It is also possible to package the message in an error-correcting protocol. Additional bits are added to the message that allow errors to be detected and corrected. The more error-correction bits you add, the more errors can be corrected.
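A tiny, concrete example of that idea is the classic Hamming(7,4) code: 4 data bits get 3 parity bits, and any single flipped bit in the 7-bit block can be located and corrected. This is only a sketch to illustrate the principle, not something you would use as-is over a serial link:

```csharp
using System;

static class Hamming74
{
    // Encode 4 data bits (low nibble of 'data') into a 7-bit codeword.
    // Codeword bit positions 1..7 hold: p1 p2 d1 p3 d2 d3 d4.
    public static int Encode(int data)
    {
        int d1 = (data >> 0) & 1;
        int d2 = (data >> 1) & 1;
        int d3 = (data >> 2) & 1;
        int d4 = (data >> 3) & 1;

        int p1 = d1 ^ d2 ^ d4;   // covers positions 1,3,5,7
        int p2 = d1 ^ d3 ^ d4;   // covers positions 2,3,6,7
        int p3 = d2 ^ d3 ^ d4;   // covers positions 4,5,6,7

        return p1 | (p2 << 1) | (d1 << 2) | (p3 << 3)
                  | (d2 << 4) | (d3 << 5) | (d4 << 6);
    }

    // Decode a 7-bit codeword, correcting a single flipped bit if present.
    public static int Decode(int codeword)
    {
        int s1 = Parity(codeword, 0, 2, 4, 6);   // positions 1,3,5,7
        int s2 = Parity(codeword, 1, 2, 5, 6);   // positions 2,3,6,7
        int s3 = Parity(codeword, 3, 4, 5, 6);   // positions 4,5,6,7

        int errorPosition = s1 | (s2 << 1) | (s3 << 2);   // 0 means no error detected
        if (errorPosition != 0)
            codeword ^= 1 << (errorPosition - 1);          // flip the bad bit back

        int d1 = (codeword >> 2) & 1;
        int d2 = (codeword >> 4) & 1;
        int d3 = (codeword >> 5) & 1;
        int d4 = (codeword >> 6) & 1;
        return d1 | (d2 << 1) | (d3 << 2) | (d4 << 3);
    }

    static int Parity(int word, params int[] bits)
    {
        int p = 0;
        foreach (int b in bits) p ^= (word >> b) & 1;
        return p;
    }

    static void Main()
    {
        int code = Encode(0xB);                 // data nibble 1011
        int corrupted = code ^ (1 << 4);        // flip one bit "in transit"
        Console.WriteLine(Decode(corrupted));   // prints 11, i.e. 1011 recovered
    }
}
```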
If 100% reliability is required, then software is necessary to ensure accurate delivery of messages.
I have seen this in several applications that I have to support at the moment, but if you need a reliable and economical solution I would suggest adopting the industry standards: go with at least an RS-232 solution and leave the verification to the lower layers of the OSI model.