Serial Port communication

Bernhard Voigt

Hey there,

I've got serious problems with timing on a serial connection using the
Java Communications API. I'm trying to control a device over an RS-232
port. The problem is that the timing of the device's feedback varies
strongly, from milliseconds to several seconds. I'd therefore like to
adjust the receive timeout with enableReceiveTimeout(), passing an
argument of 3000 milliseconds or so whenever I know the response will
take a while; conversely, when I expect fast feedback, I'd like to set
the timeout threshold to a few milliseconds.
As far as I understand, this timeout is the only thing that can stop
the receiving on the serial port unless another threshold is set.

The problem is that setting enableReceiveTimeout just before the long
waiting period has no effect, although the value is set correctly and
the driver supports this feature. Setting the timeout earlier, e.g.
right after opening the connection, does have a big influence; but as I
said, a later change isn't honored.
Additionally, timeouts greater than 3000 milliseconds result in no
communication at all.
I wondered whether it is necessary to re-fetch the input stream by
calling serialPort.getInputStream() so that the new timeout value is
used, but that didn't help.

My readout code sits inside the serialEvent() method; I took this from
the SerialDemo sample delivered with the API package. I tried to build
my own timing control, but that didn't work at all.
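
One way such a hand-rolled timing control is sometimes done is an
application-level timeout: run the blocking read on a worker thread and
bound the wait with Future.get. This is only a sketch of that idea, not
the Communications API's own mechanism; a plain InputStream stands in
for serialPort.getInputStream(), and the -2 timeout marker is an
illustrative convention of mine:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimedRead {
    // Read one byte, waiting at most timeoutMs milliseconds.
    // Returns the byte, -1 at end of stream, or -2 on timeout (our convention).
    static int readWithTimeout(InputStream in, long timeoutMs) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<Integer> pending = pool.submit(in::read);
            try {
                return pending.get(timeoutMs, TimeUnit.MILLISECONDS);
            } catch (TimeoutException e) {
                pending.cancel(true);
                return -2;
            }
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream(new byte[] { 42 });
        System.out.println(readWithTimeout(in, 3000)); // prints 42
        System.out.println(readWithTimeout(in, 3000)); // prints -1 (end of stream)
    }
}
```

One caveat: cancelling the Future interrupts the worker thread, but a
read blocked inside a native serial driver may not respond to the
interrupt, so the worker can linger until the driver's own timeout
fires. That may be why a purely application-side timing control "didn't
work at all" here.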

So the question is how to deal with these timing problems. And by the
way, can someone explain how the serialEvent() method works? That is,
when is the event triggered, and how do I know when the data stream is
over?
At the moment I do it like the SerialDemo sample:

// in the serialEvent() method
int newData = 0;
while (newData != -1) {
    newData = inputStream.read();  // inputStream from serialPort.getInputStream()
}

Of course there is additional mapping from byte to int, but that doesn't
matter here. I'd like to know who sets the -1 when the device on the
other side doesn't send it. Is this done inside the API once the timeout
threshold expires?
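
For what it's worth, on a plain java.io.InputStream the contract is that
read() returns -1 only at end of stream, and a serial line never reaches
end of stream by itself; so when -1 shows up there, it presumably comes
from the driver's timeout/threshold machinery rather than from the
device. A pattern some serialEvent handlers use instead is to drain only
what available() reports, sidestepping the -1 question. A sketch of that
pattern, with a ByteArrayInputStream standing in for
serialPort.getInputStream():

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class DrainAvailable {
    // Drain only the bytes currently buffered, instead of looping until read() == -1.
    static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while (in.available() > 0) {
            int b = in.read();
            if (b == -1) {
                break;  // defensive: genuine end of stream
            }
            out.write(b);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("OK\r\n".getBytes(StandardCharsets.US_ASCII));
        System.out.println(drain(in).length); // prints 4
    }
}
```

Note that this only answers "what has arrived so far", not "is the
message complete" - for that, the device protocol itself needs a
terminator (e.g. CR/LF or a known length) that the reader watches for.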

So hopefully someone can give me some hints.

Yours truly,
Bernhard


__________________________________________

Bernhard Voigt
DESY - FLC
Notkestr. 85
22607 Hamburg

Phone: ++49 40 8998 3290
Mail: (e-mail address removed)
__________________________________________
 

Dale King

Bernhard Voigt said:
[...]


I don't think there is any guarantee that setting the receive timeout while
a read is in progress will change the timeout for that read. One thing you
might want to try is disabling the receive timeout before setting the new
value, but no guarantees.

Another thing I learned is that it helps to get receive timeouts in a
timely manner if you call notifyOnDataAvailable, even if you do not use
an event listener.
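
Both suggestions amount to one small call sequence. In this sketch the
three javax.comm methods used (disableReceiveTimeout,
enableReceiveTimeout, notifyOnDataAvailable) are mirrored by a
hypothetical PortTimeouts interface so the code compiles without the
comm API on the classpath; on a real SerialPort you would call the
same-named methods directly, and enableReceiveTimeout can additionally
throw UnsupportedCommOperationException, which the stub omits:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class TimeoutSequence {
    // Minimal stand-in for the javax.comm SerialPort methods used here.
    public interface PortTimeouts {
        void disableReceiveTimeout();
        void enableReceiveTimeout(int ms);
        void notifyOnDataAvailable(boolean enable);
    }

    // Drop the old timeout before arming the new one, and keep
    // data-available notification on even without a listener.
    static void retime(PortTimeouts port, int newTimeoutMs) {
        port.disableReceiveTimeout();
        port.enableReceiveTimeout(newTimeoutMs);
        port.notifyOnDataAvailable(true);
    }

    public static void main(String[] args) {
        List<String> calls = new ArrayList<>();
        retime(new PortTimeouts() {
            public void disableReceiveTimeout() { calls.add("disable"); }
            public void enableReceiveTimeout(int ms) { calls.add("enable:" + ms); }
            public void notifyOnDataAvailable(boolean e) { calls.add("notify:" + e); }
        }, 3000);
        System.out.println(calls); // prints [disable, enable:3000, notify:true]
    }
}
```

Whether this actually changes the timeout of a read that is already
blocked is, as noted above, not guaranteed; calling the sequence before
issuing the next read is the safer bet.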
 
