
When does "finish"?

Posted: Sun Jun 29, 2008 3:36 am
by TimH
In the documentation for Port.Read, it says:

Read has two versions. The first takes a BufferSize and a TimeOut as arguments. Setting BufferSize to 0 will return all data in the serial buffer. But if the serial buffer is empty, it will wait until something has arrived. TimeOut determines how long to wait (in milliseconds). Set TimeOut to 0 to disable the time out.

In the third sentence, where it says it "will return all data in serial buffer"...WHEN exactly does it return it, and how does it know that it has "all" the data? Typically I'm working with a scale that sends about 15 bytes of data about once each minute, at 9600 baud--which I assume is virtually instantaneous. So when does it decide that it's time to "return all data in serial buffer"? Does it see that data is in the buffer, and then, if no more arrives during the next 'x' milliseconds, conclude that "all data" is there...and then deliver the data and move on? If not, how DOES it work? TIA

RE: When does "finish"?

Posted: Sun Jun 29, 2008 7:54 am
by TimH
And, note that I'm specifically referring to the syntax where I'm using a BufferSize of 0--so that it will "return all data in serial buffer".

RE: When does "finish"?

Posted: Sun Jun 29, 2008 2:20 pm
by johan
It will return as soon as data comes in. It can return 1 byte, or many bytes. This will depend on the serial port hardware and driver.

Your application needs to handle this. You need to implement a "parser" which accumulates the data as it comes in, and acts on it once a complete 15-byte message has arrived.
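As a rough illustration of the idea (not Franson-specific code, and the fixed 15-byte message length is just taken from the original question), such a parser might look like this in Python: each call to Read hands over whatever chunk arrived, and the parser only emits data once a full message has accumulated.

```python
class MessageParser:
    """Accumulate arbitrary-sized chunks from the serial port and
    emit only complete fixed-length messages."""

    def __init__(self, message_size=15):
        # Assumed message size; adjust to match your device's protocol.
        self.message_size = message_size
        self.buffer = bytearray()

    def feed(self, chunk):
        """Add newly read bytes; return a list of complete messages
        (possibly empty if not enough data has arrived yet)."""
        self.buffer.extend(chunk)
        messages = []
        while len(self.buffer) >= self.message_size:
            messages.append(bytes(self.buffer[:self.message_size]))
            del self.buffer[:self.message_size]
        return messages


# Example: the port might deliver the 15 bytes in two pieces.
parser = MessageParser(message_size=15)
print(parser.feed(b"SCALE:+001"))      # 10 bytes -> no message yet: []
print(parser.feed(b".25kg"))           # 5 more bytes -> one complete message
```

The key point is that the parser never assumes Read delivers whole messages; it simply buffers bytes until enough have arrived, which works regardless of how the driver happens to split the data.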


Franson Support