I got an RS232 signal capture device, and it is working great.
I need some help making sense of the data. Basically, we bought it because we are dealing with a late-80s machine controller that uses serial communication. We had little luck despite knowing the port parameters.
From the data I dumped, the machine controller is using the break signal as part of its protocol. I am having trouble duplicating it using VB and MSComm. I know how to toggle the break signal on and off, but I am not sure what I am supposed to be doing with it. Am I supposed to leave it on for each byte of data I send, or send a byte of data and then toggle it?
Also, I am confused about how I am supposed to receive any data from the controller. Do I toggle a flag when the break is turned on and then, when it is turned off, read the input?
Michael Burr's description of the way break works is accurate. Often, "break" signals are sent for significantly longer than one character time.
These days, "break" is infrequently used in serial comms, but its most common use is as a 'cheap' way of providing packet synchronization. "Break" may be sent before a packet starts, to alert the receiver that a new packet is on the way (and allow it to reset buffers, etc.), or at the end of a packet, to signal that no more data is expected. It's a kind of 'meta-character' in that it lets you keep the full range of 7- or 8-bit values for packet contents and not worry about how the start or end of a packet is delineated.
To send a break, typically you call SetCommBreak, wait an appropriate period (say, around 2 milliseconds at 9600 baud), then call ClearCommBreak. During this time you can't be sending anything else, of course.
So, assuming that the protocol requires 'break' at the start of the packet, I'd do something like this (sketched as Win32-flavoured C, error handling omitted):-

void SendPacket(HANDLE port, const char *packet, DWORD length)
{
    DWORD written = 0;

    SetCommBreak(port);    /* hold the line in the space (break) state */
    Sleep(2);              /* 2 milliseconds - assuming 9600 baud; pro-rata for others */
    ClearCommBreak(port);  /* release the break; the line returns to idle/mark */

    WriteFile(port, packet, length, &written, NULL);  /* then send the packet bytes as normal data */
}
A receiver is more difficult, because you have to make a load of assumptions about the incoming packet format and the API calls used to receive breaks. I'll sketch it in C again and assume the existence of an imaginary function; WaitCommEvent is probably the key to handling incoming breaks (one possible implementation is sketched after the receiver below).
bool ReadCharOrBreak(char *ch); // returns TRUE on a break, FALSE if ch contains a received character
We'll also assume fixed-length 100-byte packets with "break" sent before each packet.
void ReadAndProcessPackets()
{
    char buff[100];
    int count = 0;

    while (true)
    {
        char ch;
        if (ReadCharOrBreak(&ch))
            count = 0;                     /* break received: start of packet - reset count */
        else
        {
            if (count < 100)
            {
                buff[count++] = ch;
                if (count == 100)
                    ProcessPacket(buff);   /* a full packet has arrived */
            }
            else
                Error("too many bytes rx'd without break");
        }
    }
}
WARNING - totally untested, but should give you the idea...
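Since ReadCharOrBreak is the imaginary part, here's a rough sketch of one way it might be implemented on top of WaitCommEvent. The g_port handle, the event handling, and the one-byte-per-event read are all assumptions for illustration (in practice you'd want to drain the receive buffer rather than read a single byte per event); it assumes a non-overlapped (synchronous) port handle and that SetCommMask(g_port, EV_BREAK | EV_RXCHAR) was called after opening the port. Untested, like the rest.

#include <windows.h>
#include <stdbool.h>

static HANDLE g_port;   /* opened elsewhere with CreateFile, non-overlapped */

bool ReadCharOrBreak(char *ch)
{
    for (;;)
    {
        DWORD events = 0;
        if (!WaitCommEvent(g_port, &events, NULL))
            continue;                  /* real code would check GetLastError() here */

        if (events & EV_BREAK)
            return true;               /* break condition seen - no character */

        if (events & EV_RXCHAR)
        {
            DWORD got = 0;
            if (ReadFile(g_port, ch, 1, &got, NULL) && got == 1)
                return false;          /* ordinary data character */
        }
    }
}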
For an example of a protocol using Break, check out the DMX-512 stage lighting protocol.
The start of a packet is signified by a Break followed by a "mark" (a logical one) known as the "Mark After Break" (MAB). The break signals the end of one packet and the start of the next, and causes the receivers to start reception. After the break, up to 513 slots are sent.
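As a rough illustration only (none of this code comes from the DMX spec), a DMX-style sender built on the same SetCommBreak/ClearCommBreak calls might look like the sketch below. The SendDmxPacket name and the levels parameter are made up for the example, and Sleep is far too coarse for the spec's microsecond-scale break and mark-after-break timings (on the order of 88 µs and 8 µs minimums), so real senders use hardware that can time the break precisely - this only shows the packet structure.

#include <windows.h>
#include <string.h>

/* Assumes 'port' is already open at 250 kbaud, 8 data bits, no parity, 2 stop bits. */
void SendDmxPacket(HANDLE port, const unsigned char levels[512])
{
    unsigned char slots[513];
    DWORD written = 0;

    slots[0] = 0x00;                  /* slot 0: the start code (0 = ordinary dimmer data) */
    memcpy(&slots[1], levels, 512);   /* slots 1..512: one level per channel */

    SetCommBreak(port);               /* the break ends the previous packet and starts this one */
    Sleep(1);                         /* must exceed the minimum break time; Sleep is overkill */
    ClearCommBreak(port);             /* line returns to mark: the "Mark After Break" */

    WriteFile(port, slots, sizeof slots, &written, NULL);
}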
A break signal is an invalid character. When the RS-232 line is idle, the voltage is in the 'mark' (or '1') state (which is -12 volts if I remember right). When a character is sent, the protocol toggles the line to the 'space' (or '0') state for one bit time (the start bit), then toggles the signal as appropriate for the data (the data bits) and any parity bit. It then holds the line in the idle/mark (or '1') state for a number of bit times defined by the stop-bit setting, which is typically configurable (usually 1 stop bit in my experience).
Since there is always some period of time where the line will be in a mark state between data characters, the start of a character can always be recognised. This also means that the longest period of time that the line can be in a space state is:
1 start bit + however many data bits + a parity bit (if any)
A break signal is defined as holding the line in the space state for longer than that period of time - no valid data byte can do that, so the break 'character' isn't really a character. It's a special signal.
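To put rough numbers on that, the arithmetic can be written as a small helper; the function name and the 9600-baud 8-N-1 example are just for illustration, not anything from the answer above.

/* Longest run of 'space' a valid character can produce, in seconds. */
double MaxSpaceSeconds(int baud, int dataBits, int parityBits)
{
    int bits = 1 + dataBits + parityBits;   /* start bit + data bits + parity, all of which could be 0 */
    return (double)bits / baud;             /* e.g. 9600 baud, 8-N-1: 9 / 9600 ~= 0.94 ms */
}
/* A break has to hold the space state longer than this, which is why the
   2 ms figure used earlier comfortably qualifies at 9600 baud. */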
When you need to issue a break signal depends entirely on the protocol being used.
'Break' was intended for when the line synchronization got totally mixed up.
"Am I supposed to leave it on for each byte of data I send, or send a byte of data and then toggle it?"
Try sending a nice long 'break' signal (500 ms?) then wait a bit (50 ms?) then send your data.