Accelerated C++ - Chapter 1, confusion in understanding "flushing the buffer"


arnuld

This is from the mentioned section. I did not understand some things here:

"To avoid the overhead of writing in response to each output request, the library
uses the buffer to accumulate the characters to be written, and flushes the buffer,
by writing its contents to the output device, only when necessary"

Does it mean that "flushing the buffer" and "writing to the output device" are the
SAME thing, just two different names for the same operation?

"When our program writes its prompt to cout, that output goes into the buffer
associated with the standard output stream. Next, we attempt to read from
cin. This read flushes the cout buffer, so we are assured that our user will
see the prompt"

First, the author says that our program writes its /prompt/ to cout, and
then he says that /cin/ flushes the buffer and that is why the user sees
the prompt.

What does he mean by "our program writes the prompt", and how does /cin/
help the user see the prompt?

I think I can see the prompt every time, whether I use /cout/, /cin/ or
whatever.

"Our next statement, which generates the output, explicitly instructs the
library to flush the buffer. That statement is only slightly more complicated
than the one that wrote the prompt. Here we write the string literal "Hello, "
followed by the value of the string variable name, and finally by std::endl.
Writing the value of std::endl ends the line of output, and then flushes the
buffer, which forces the system to write to the output stream immediately."

So both /cin/ and /cout/ write the prompt to the output device? I am utterly
confused.
 

Alf P. Steinbach

* arnuld:
This is from the mentioned section. I did not understand some things here:



Does it mean that "flushing the buffer" and "writing to the output device" are the
SAME thing, just two different names for the same operation?

Almost. Flushing the buffer involves (1) writing the characters in the
buffer to the output devices, so that they appear visually on your
screen or paper, and (2) emptying the buffer so that these characters
aren't written more times and to make room for later output.

First, the author says that our program writes its /prompt/ to cout, and
then he says that /cin/ flushes the buffer and that is why the user sees
the prompt.

Yes.


What does he mean by "our program writes the prompt", and how does /cin/
help the user see the prompt?

Your source code has some statement like

std::cout << "Gimme a number!";

That's "our program writes the prompt".

The effect is just to place the text "Gimme a number!" in the output
buffer. If there had been a newline at the end, or a << std::endl, then
the buffer would have been flushed at this point. But there isn't, so
the effect is just to accumulate text in the output buffer.

When subsequently the program executes

std::cin >> number;

std::cin first tells std::cout to flush its buffer. std::cout flushes
its buffer, and the prompt appears on your screen. std::cin inputs the
number.


I think I can see the prompt every time, whether I use /cout/, /cin/ or
whatever.

Nope.



So both /cin/ and /cout/ write the prompt to the output device? I am utterly
confused.

It's std::cout that does the work. But std::cout does it in a lazy way,
accumulating text in its buffer, sort of like std::cout sits on the
toilet accumulating excrement there. Then std::cin knocks on the door
and says, hey, /I/ need to use the toilet now, flush it and get out!
 

arnuld

* Alf P. Steinbach:

Almost. Flushing the buffer involves (1) writing the characters in the
buffer to the output devices, so that they appear visually on your
screen or paper, and (2) emptying the buffer so that these characters
aren't written more times and to make room for later output.

OK, thanks for the *clear* explanation.

Your source code has some statement like

std::cout << "Gimme a number!";

That's "our program writes the prompt".

Now I got it. By "prompt" you mean "Gimme a number!".

I was thinking "prompt" meant "blinking cursor".

The effect is just to place the text "Gimme a number!" in the output
buffer. If there had been a newline at the end, or a << std::endl, then
the buffer would have been flushed at this point. But there isn't, so
the effect is just to accumulate text in the output buffer.

When subsequently the program executes

std::cin >> number;

std::cin first tells std::cout to flush its buffer. std::cout flushes
its buffer, and the prompt appears on your screen. std::cin inputs the
number.

The author should have said the "text" appears on the screen; using
"prompt" as a synonym for "text" confused me.

OK, OK, you are right. As I said, I can always see the "blinking
cursor" on my screen.


It's std::cout that does the work. But std::cout does it in a lazy way,
accumulating text in its buffer, sort of like std::cout sits on the
toilet accumulating excrement there. Then std::cin knocks on the door
and says, hey, /I/ need to use the toilet now, flush it and get out!

dirty boy

;-)

just for fun


BTW, a buffer is an internal data structure which holds characters, so
is it an array?

Second, is the buffer concept used only for "efficiency" reasons?
 

Alf P. Steinbach

* arnuld:
BTW, a buffer is an internal data structure which holds characters, so
is it an array?

In practice yes.

Second, is the buffer concept used only for "efficiency" reasons?

Yes, but...

The "but": if you are std::cout, when you write characters to a screen
that's part of the same computer you could just draw each one on the
screen as it's received for output, and then forget about it, no buffer.
But let's say the screen is an old-fashioned terminal connected to the
computer via a serial line. Displaying a character on the screen then
involves transmitting it to the terminal via the serial line, and this
is a complicated copying and conversion process, involving at least a
one-character buffer on each end of the line (in practice those buffers
are larger) -- in that situation the buffers are logically required by
the task to be accomplished, and not just optimization.

Generally, when we say "buffer" we mean a buffer that can hold more than
one character (or byte, or item), but for some problems it's important
to realize that even "un-buffered" input or output can involve
1-character buffers that cannot be avoided, because the i/o mechanism in
question isn't direct but has e.g. a transmission line.
 

arnuld

The "but": if you are std::cout, when you write characters to a screen
that's part of the same computer you could just draw each one on the
screen as it's received for output, and then forget about it, no buffer.
But let's say the screen is an old-fashioned terminal connected to the
computer via a serial line. Displaying a character on the screen then
involves transmitting it to the terminal via the serial line,

Are you talking about "source code on my computer" and output on "some
other computer" connected to mine via the internet?

And that "other" person is trying to compile "my programme" on his computer?

Is this "buffer" concept inherited from C?

and this
is a complicated copying and conversion process, involving at least a
one-character buffer on each end of the line (in practice those buffers
are larger) -- in that situation the buffers are logically required by
the task to be accomplished, and not just optimization.

Generally, when we say "buffer" we mean a buffer that can hold more than
one character (or byte, or item), but for some problems it's important
to realize that even "un-buffered" input or output can involve
1-character buffers that cannot be avoided, because the i/o mechanism in
question isn't direct but has e.g. a transmission line.

I really did not understand anything here, but it seems interesting.

:)
 

Erik Wikström

Are you talking about "source code on my computer" and output on "some
other computer" connected to mine via the internet?

And that "other" person is trying to compile "my programme" on his computer?

Think of it like this: your application is running on one computer, and
the person interacting with it is sitting at another computer connected
via the internet (perhaps over Telnet or SSH). Now imagine that you are
going to output 100 characters on the screen.

If you send each character on its own, you would have to send 100
network packets, and each packet needs some data to tell it where to go
and so on. I think this data can take up to 40 bytes. So sending each
character alone would take 100*(40+1) bytes (assuming one byte per
character).

If you instead put all the output in a buffer and wait to send until all
the characters are in the buffer, you only need to send 40+100 bytes.

On modern computers it does not matter whether you output the characters
one by one or all at once, as long as the output goes to the local
monitor, since the hardware is so fast. But as you saw in the example
above, if a network is involved it can be important.

Is this "buffer" concept inherited from C?

Kind of, yes. It is originally from UNIX, but C was developed on UNIX (or
rather, its standard library was part of UNIX). On UNIX machines of old,
I/O was done using terminals connected over a serial line, which were
slow, so buffering provided a noticeable speedup. But you can notice the
difference even on modern hardware if you are writing large amounts of data
to files; this is almost never done without buffering, for performance reasons.
 
