
I have been working on creating a UART on an FPGA. I can successfully transmit and receive single characters typed in PuTTY. However, when I set my FPGA to constantly write a long sequence of "A", I sometimes end up with a sequence of "@" or other characters until I reset the FPGA a few times.

I believe the UART on the computer loses track of the difference between the start bit and a zero. The delay between the two "A" is ~30 us (measured with a logic analyzer) and the baud rate is 115200 8N1.

Is there a minimum delay that must be maintained between two consecutive RS232 frames?

Ben N
Lord Loh.
  • Fairly easy to get out of sync on long data transmissions if using only one stop bit. Gets worse when transmitting binary data that tends to be all zeros at times (or all ones). – Daniel R Hicks Dec 07 '12 at 01:13
  • But you say that you have to "reset the FPGA a few times" to clear the error. If the problem was at the receiving end it would only be necessary to pause transmission (between characters) for one character's time to achieve reset. This makes one wonder if your FPGA is somehow getting out of sync internally. – Daniel R Hicks Dec 07 '12 at 01:22
  • @sawdust -- The point is that the transition between stop bit and start bit is the only reliable point of reference in the data stream, and the protocol is *asynchronous*, meaning that you can't rely on the timing of previous bits. A single noise glitch and the protocol's "locked on" to the wrong reference point. – Daniel R Hicks Dec 07 '12 at 01:28
  • @DanielRHicks - OK, you're right. I do recall once or twice hot-plugging a serial connection, and the receiver synch'd to the wrong bit of the frame. It would not correct until there was a pause in the transmission. – sawdust Dec 07 '12 at 03:30
  • Okay, I had a free-running baud rate generator. Maybe I should try to reset the counter in the generator on detecting a start bit. But this is not my problem – the computer is going out of sync, not the receiving FPGA (not stress-tested yet). – Lord Loh. Dec 07 '12 at 05:56

3 Answers


Is there a minimum delay that must be maintained between two consecutive RS232 frames?

No, there is no such requirement (no min and no max) in EIA/RS232C.
The Start bit of the next character can immediately follow the Stop bit of a character.
Note that the line idles at the Marking state, which is the same level as the Stop bit.

It is interesting that you make no mention of the Stop bit in the character frame.

I believe the UART on the computer loses track of the difference between the start bit and a zero. The delay between the two "A" is ~30 us (measured with a logic analyzer)

You are using the wrong tool for this task! You should be using a 'scope. You cannot analyse a timing problem by viewing a sampled and sanitized rendition of the analog signal.
The difference between the Start bit and a zero is timing. The character frames are transmitted at an asynchronous rate. But the bits of the frame have to be clocked at the specified clock rate.
For 115200 baud rate, that would be 8.68usec for 1 bit time. For 8 data bits plus a Start bit and a Stop bit, the frame time is 86.8usec.
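That arithmetic is easy to sanity-check (a quick Python sketch, not part of any UART implementation):

```python
# Bit and frame timing for 115200 baud, 8N1
baud = 115200
bit_time_us = 1e6 / baud            # one bit period in microseconds
frame_bits = 1 + 8 + 1              # start + 8 data + stop
frame_time_us = frame_bits * bit_time_us

print(f"bit time:   {bit_time_us:.2f} us")    # 8.68 us
print(f"frame time: {frame_time_us:.1f} us")  # 86.8 us
```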
Your question implies that you have not bothered to look at the EIA/RS232C spec for minimum rise/fall times and when the signal is typically sampled. Interesting method for implementing HW.

Perhaps you should also use a frequency counter to measure the baud rate generator at each end. A mismatch of a few percent can usually be tolerated. A mismatch could produce the symptoms you see.
How come framing errors are not reported by the receiver? Instead of just looking at output, maybe you need to review the stats of the serial port, i.e. /proc/tty/driver/...
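As a rough illustration of why only a few percent of mismatch is tolerable (a back-of-the-envelope Python sketch; the mid-bit sampling model is an assumption, not something taken from the spec):

```python
# A receiver resynchronises on each start-bit edge, then samples bit N
# roughly (N + 0.5) bit times later. The sample of the last bit of the
# frame (the stop bit, index 9) must still land inside that bit, so the
# accumulated clock drift over 9.5 bit times must stay under 0.5 bits.
last_sample = 9.5                      # stop-bit sampling point, in bit times
max_drift = 0.5                        # half a bit period
tolerance = max_drift / last_sample    # fractional baud-rate mismatch
print(f"max mismatch: {tolerance * 100:.1f}%")  # ~5.3% in this idealised model
```

In practice the budget is shared between both ends (and eaten into by rise times and jitter), which is why real designs aim for well under that figure.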

sawdust
  • Using an FPGA, I have little control over the rise time or the fall time. Also, I am not attempting to implement the EIA/RS232C specification. What I can reasonably do on an FPGA is a serdes. Example code provided by the vendor works just fine – so I rule out analog trouble. Although the example code does not dump single characters as fast as the hardware possibly could. – Lord Loh. Dec 07 '12 at 06:02

As well as speed and number of data bits, I think the two ends must agree on the number of start bits, stop bits and parity bits.

See Asynchronous Serial Communication

(image: idealised RS-232 signal showing one character frame)

The above shows how characters are separated but has rather idealised rise and fall times, I believe a scope would show something more like what follows (note inverted mark/space axis compared with prior diagram).

(image: a more realistic, oscilloscope-style RS-232 waveform)

Perhaps you should set the speed lower, maybe your FPGA isn't emitting a well-formed signal at higher speeds.

Also, RS232 is async; I believe that means the receiver is expected to synchronise its timing based on the start and stop bits.

  • A is binary 01000001
  • @ is binary 01000000

The difference is a matter of accurate timing. With inaccurate timing a receiver can count six instead of five whilst the +3...15V is asserted.
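The one-bit difference is easy to see by framing both characters as they appear on the wire (an illustrative Python sketch; 8N1, LSB transmitted first):

```python
# 8N1 framing: start bit (0), 8 data bits LSB first, stop bit (1)
def frame(byte):
    data = [(byte >> i) & 1 for i in range(8)]  # LSB goes out first
    return [0] + data + [1]

print("A:", frame(ord("A")))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print("@:", frame(ord("@")))  # [0, 0, 0, 0, 0, 0, 0, 1, 0, 1]
# The frames differ only in the first data bit, so a one-bit timing
# slip across a run of low bits turns one character into the other.
```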

See Signal Timing and Signal Characteristics

RedGrittyBrick
  • This makes it look easy to get out of sync if I have one start and one stop bit. In practice, my stop bit is a little longer while the serializer loads the next byte. I used a logic analyzer. I do not have an oscilloscope :-( But the FPGA and the board are designed to operate up to 400 MHz (http://www.digilentinc.com/Products/Detail.cfm?NavPath=2,400,836&Prod=ATLYS) – Lord Loh. Dec 06 '12 at 17:29
  • I'd try running at 9600 N81. If OK double speed & retry. rinse repeat. You do the byte loading during the "idle" state? At low speeds a cheap and nasty scope may be useful. – RedGrittyBrick Dec 06 '12 at 17:32
  • This is my first revision of the UART. I have not yet implemented a FIFO or anything fancy. Just a Serializer and Deserializer. – Lord Loh. Dec 06 '12 at 17:57
  • It just occurred to me that I get the same problem whether I use the IO pins to connect to a USB TTL UART module (http://www.ebay.com/itm/180953299346?ssPageName=STRK:MEWNX:IT&_trksid=p3984.m1439.l2649) or the onboard USB UART. Can I eliminate signal jitters? – Lord Loh. Dec 06 '12 at 18:22
  • Actually, the two ends really only need to agree on data rate and number of data + parity bits. There's only ever one start bit, and modern UARTs should be able to handle one stop bit. More than one stop bit just looks like "dead air" between characters. – Daniel R Hicks Dec 07 '12 at 01:15
  • @DanielRHicks - *"the two ends really only need to agree on ..."* - Not quite true. Since most modern UARTs apply the *same* attributes to both Rx and Tx, using arbitrary stop-bit lengths at each end would result in unreliable reception for one direction. Transmitting 2 stop bits can be received okay by a UART expecting only 1 stop bit. But that UART will then send only 1 stop bit, and the other end will be expecting 2 stop bits. If there isn't enough idle time between chars, the 2-stop-bit end will have reception problems. So proper setup should have identical settings at both ends. – sawdust Dec 08 '12 at 20:32
  • @sawdust - Except that the only "UART" I've ever known that actually needed the extra stop bit time (for other than a noise margin) was the ASR-33 (electro-mechanical Teletype). The original UARTs back in 1972 only needed one stop bit, though they could be programmed to send two -- the two functions ARE separate. – Daniel R Hicks Dec 09 '12 at 01:30
  • @DanielRHicks - I'm fully aware of TTYs needing 2 stop bits; I've used TTYs and punched paper tape back in the 70s. But advising that the number of Stop bits can be set to an arbitrary value is misleading. Most UARTs, serial port drivers and APIs (e.g. POSIX) only accept one specification for number-of-Stop-bits for both Tx and Rx. IF 2 Stop bits are set, then the Rx might actually expect 2 bit times before expecting a Start bit. Why offer shortcut advice that may not work? – sawdust Dec 12 '12 at 00:51
  • @sawdust -- I'm pretty sure you're mistaken. There's only one setting for stop bits because only the transmitter needs the setting. The receiver doesn't even check stop bits, since once it's counted out the data bits it's done until the next start bit is detected. – Daniel R Hicks Dec 13 '12 at 01:15
  • @DanielRHicks - *"The receiver doesn't even check stop bits..."* -- If that was true, then the receiver has no means to determine a framing error. But UARTs do check for and detect framing errors, so the receiver must check for a stop bit (or at least the [minimum period of time the line must be idle (mark state) at the end of each word](http://www.lammertbies.nl/comm/info/RS-232_specs.html)). – sawdust Mar 14 '13 at 09:19
  • @sawdust - Neither UART I worked with checked for 2 vs 1 (or 1.5) stop bits. Once they saw the stop (probably for about 3/4 bit time) they called it good. – Daniel R Hicks Mar 14 '13 at 10:55

I suspect that UARTs are still pretty much similar to the original ones. They used a clock at 16× the data rate to "sample" the data, vs the earlier analog scheme that used an edge-triggered oscillator. Using the sampling approach, the UART could fairly accurately position its sample time in the middle of the pulses, and could even take multiple samples to be a little more noise-tolerant.
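A minimal software model of that 16× sampling scheme (illustrative Python only; the real logic lives in the UART's silicon or, here, the FPGA fabric):

```python
# 16x-oversampled receiver: on the start-bit falling edge, wait 8 ticks
# (half a bit) to land mid-bit, then take one sample every 16 ticks.
def receive(samples):
    """samples: line level captured at 16x the baud rate, idle-high."""
    edge = next(i for i in range(1, len(samples))
                if samples[i - 1] == 1 and samples[i] == 0)
    mid = edge + 8                                          # middle of the start bit
    bits = [samples[mid + 16 * (n + 1)] for n in range(8)]  # 8 data bits, LSB first
    return sum(b << n for n, b in enumerate(bits))

# 'A' (0x41) framed 8N1, each bit held for 16 samples, idle line around it
frame = [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]  # start, data LSB-first, stop
line = [1] * 16 + [b for bit in frame for b in [bit] * 16] + [1] * 16
print(chr(receive(line)))  # A
```

Because the receiver resynchronises on every falling edge of a start bit, a free-running divider (as in the question's first revision) only has to stay accurate for one frame, not indefinitely.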

Your description is unclear in that you talk in a recent comment about "detecting a start bit", but you had implied earlier that you're TRANSMITTING and hence would have nothing to "detect".

Daniel R Hicks
  • That is true. I am having trouble with the transmission. But that does not mean my receiver is perfect. I am sure there are problems that I have not detected. I did not know of the 16x receiver clock. I had thought of something similar, but did not implement it as it would take a lot more hardware. I might try 4x. – Lord Loh. Dec 08 '12 at 08:43
  • Without the 16x scheme you've got the problem of building a triggerable oscillator that can be triggered by the start bit and then mark out time reasonably accurately. This is an "analog" circuits problem, and, as we all know, analog solutions are much fussier and less precise than digital solutions. – Daniel R Hicks Dec 09 '12 at 01:37