Newbie-question: scanf alternatives?


Dan Pop

In said:
Ben said:
CBFalconer said:
if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

The desired effect is the effect guaranteed by the standard, in the
absence of any nonsensical ungetc call. As the standard *clearly* says,
your ungetc call achieves exactly zilch.

EOF is not a value you read from a stream, therefore it makes
no sense to push it back into the stream. That's why the operation
is guaranteed to fail and the input stream is unchanged.
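The conforming use of ungetc is to push back a character that was actually read, never EOF itself. A minimal sketch of that pattern (fpeek is a hypothetical helper name, not a standard function):

```c
#include <stdio.h>

/* Peek at the next character of a stream without consuming it.
   Returns the character, or EOF at end of input. Only a character
   actually read (never EOF) is pushed back, so the ungetc call is
   guaranteed to succeed for one character of pushback. */
int fpeek(FILE *stream)
{
    int ch = fgetc(stream);
    if (ch != EOF)
        ungetc(ch, stream);  /* legal: ch came from the stream */
    return ch;
}
```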

Dan
 

Ben Pfaff

CBFalconer said:
Ben said:
CBFalconer said:
if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

It doesn't have any effect, so why do it at all?
 

CBFalconer

Ben said:
CBFalconer said:
Ben said:
if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

It doesn't have any effect, so why do it at all?

Because there are known to be systems where an EOF is effectively
cancelled once signalled. If it CAN be signalled without
previously terminating the line, other software can get confused.
Since it costs me nothing to assume that such a faulty (to me)
implementation might also allow preservation via the ungetc call,
it costs nothing to take a shot at it.

Not having such a system, I cannot test it. It may well be
universally useless. But in that case it should also be
universally harmless. At least as I see it. And that is why I
drew attention to it in the first place.

Have any of you other suggestions for dealing with that situation?
 

Giorgos Keramidas

CBFalconer said:
Because there are known to be systems where an EOF is effectively
cancelled once signalled.

I'm sure those other systems can fail in many other non-standard ways.
This could probably be wrapped in a conditional #ifdef of some sort
which would ungetc() only if ``the hosting system has breakage #98746:
EOF reset once reached'' :)
 

Dan Pop

In said:
Ben said:
CBFalconer said:
Ben Pfaff wrote:

if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

It doesn't have any effect, so why do it at all?

Because there are known to be systems where an EOF is effectively
cancelled once signalled.

They are also known to be non-conforming to the C standard specification,
which requires sticky eof.
If it CAN be signalled without
previously terminating the line, other software can get confused.
Since it costs me nothing to assume that such a faulty (to me)
implementation might also allow preservation via the ungetc call,
it costs nothing to take a shot at it.

This assumption is foolish, even if it costs you nothing.
Have any of you other suggestions for dealing with that situation?

Yup, redesign the interface of your function, so that it can signal the
EOF condition to its caller. A well designed fgets-like function returns
a structure containing a pointer to the line and an exit status flag.

This way, you can *reliably* take care of those non-conforming systems
you're talking about.
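The interface Dan describes, a line pointer paired with an explicit status, might be sketched as follows. The names read_line, line_result, and the status constants are hypothetical, chosen only to illustrate the design:

```c
#include <stdio.h>
#include <string.h>

enum line_status { LINE_OK, LINE_TRUNCATED, LINE_EOF };

struct line_result {
    char *line;              /* caller's buffer, or NULL at end of file */
    enum line_status status;
};

/* fgets-like reader that reports EOF and truncation explicitly,
   instead of leaving the caller to probe the stream afterwards. */
struct line_result read_line(char *buf, size_t size, FILE *stream)
{
    struct line_result r = { NULL, LINE_EOF };
    size_t len;

    if (fgets(buf, (int)size, stream) == NULL)
        return r;                        /* EOF or read error: no line */
    r.line = buf;
    len = strlen(buf);
    if (len > 0 && buf[len - 1] == '\n') {
        buf[len - 1] = '\0';             /* complete line: strip '\n' */
        r.status = LINE_OK;
    } else {
        r.status = LINE_TRUNCATED;       /* line longer than the buffer */
    }
    return r;
}
```

Because the caller receives the status directly, no ungetc trickery on the underlying stream is needed to remember that EOF occurred.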

Dan
 

CBFalconer

Dan said:
CBFalconer said:
Ben said:
Ben Pfaff wrote:

if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

It doesn't have any effect, so why do it at all?

Because there are known to be systems where an EOF is effectively
cancelled once signalled.

They are also known to be non-conforming to the C standard
specification, which requires sticky eof.

Chapter and verse please.
This assumption is foolish, even if it costs you nothing.


Yup, redesign the interface of your function, so that it can signal
the EOF condition to its caller. A well designed fgets-like
function returns a structure containing a pointer to the line and
an exit status flag.

This way, you can *reliably* take care of those non-conforming
systems you're talking about.

Now recall how this started. You claimed that a truncating form
of fgets would be more useful, and I pointed out that that could
be built from fgets, but not the converse. The code in question
appeared in my implementation of that. Thus the functional
interface is fixed (which does signal EOF to its caller).

The problem to be solved is the smooth interface with other
functions, which is an area in which C is deficient. It is not
necessary to the actual function, which could leave things no
better than the original. However I generally aim to repair such
awkward behavior when possible. It pays off in reduced future
errors.
 

Dan Pop

In said:
Dan said:
CBFalconer said:
Ben Pfaff wrote:
Ben Pfaff wrote:

if (EOF == ch) ungetc(ch, f); /* questionable coding */

Questionable indeed:

7.19.7.11 The ungetc function
Synopsis
1 #include <stdio.h>
int ungetc(int c, FILE *stream);
Description
[...]
4 If the value of c equals that of the macro EOF, the
operation fails and the input stream is unchanged.

See, you agree with me. Questionable in that it is very likely
not to have the desired effect, but it will not crash the
program. The desired effect is to preserve the EOF condition for
any other i/o call.

It doesn't have any effect, so why do it at all?

Because there are known to be systems where an EOF is effectively
cancelled once signalled.

They are also known to be non-conforming to the C standard
specification, which requires sticky eof.

Chapter and verse please.

Only if you admit being unable to find it yourself.
Now recall how this started. You claimed that a truncating form
of fgets would be more useful, and I pointed out that that could
be built from fgets, but not the converse.

Without explaining why would anyone build anything out of something as
badly designed as fgets(). As I have pointed out, NOT using fgets()
at all leads to cleaner, simpler code.
The code in question
appeared in my implementation of that. Thus the functional
interface is fixed (which does signal EOF to its caller).

What is a "functional interface"?
The problem to be solved is the smooth interface with other
functions, which is an area in which C is deficient.

Your coding appears to be deficient in this area. Big difference!
It is not
necessary to the actual function, which could leave things no
better than the original. However I generally aim to repair such
awkward behavior when possible. It pays off in reduced future
errors.

Your attempt at repairing things is laughable, in this particular case.

Dan
 

Malcolm

Felipe Magno de Almeida said:
which temptations to bad coding?
This

fgets(name, sizeof(name), stdin);

if(*strchr(name, '\n'))
*strchr(name, '\n') = 0;

printf("Your name is %s\n", name);
 

Malcolm

Felipe Magno de Almeida said:
well, just discard the line that is greater than the buffer, and warn
the user; that is not hard to do at all...
It takes several lines of code, just to use one library function designed to
read a line of text. From experience, most programmers don't bother.

If you are going to write several lines of code, why not just read the data
into a dynamic buffer? In which case fgetc() is easier for use than fgets().
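The fgetc-based dynamic-buffer approach might look like this sketch (read_line_dyn is a hypothetical name; the growth policy and error handling are illustrative choices, not a fixed recipe):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read one line of arbitrary length into a malloc'd buffer, growing
   it as needed. Returns NULL at end of file (with nothing read) or on
   allocation failure; the caller frees the result. The '\n' itself is
   not stored. */
char *read_line_dyn(FILE *stream)
{
    size_t cap = 16, len = 0;
    char *buf = malloc(cap);
    int ch;

    if (buf == NULL)
        return NULL;
    while ((ch = fgetc(stream)) != EOF && ch != '\n') {
        if (len + 2 > cap) {             /* room for ch and the '\0' */
            char *tmp = realloc(buf, cap *= 2);
            if (tmp == NULL) {
                free(buf);
                return NULL;
            }
            buf = tmp;
        }
        buf[len++] = (char)ch;
    }
    if (ch == EOF && len == 0) {         /* nothing read at all */
        free(buf);
        return NULL;
    }
    buf[len] = '\0';
    return buf;
}
```

No fixed-size buffer means no truncation case to handle, which is the point being made above.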
 

Keith Thompson

Malcolm said:
The problem comes when the user enters more than 1023 characters. For some
applications, this is more theoretical than real, but for code that you
release to a third party it is essential to think about it, since some
malicious person could deliberately crash your program, even on some systems
hack into the system (because the overflow overwrites the function return
address, allowing arbitrary code to be run, if you know what you are doing).

fgets() will fix this problem, but adds a new one. What if over 1023
characters are entered, and the partly-read input is processed as whole? The
results are quite likely to be much worse than the undefined behaviour that
results from using gets(), since undefined behaviour is usually correct
behaviour (terminate the offending program with an error message), whilst no
operating system can guard against coded incorrect behaviour, such as
chopping off one of the hundred names of the Indian god brumin-brah and
getting you torn to pieces by his devotees for blasphemy.

It's extremely dangerous to assume that undefined behavior usually
results in termination of the offending program with an error message.
In the case of writing outside the bounds of a buffer, it very likely
results in corrupting some other variable (the memory adjacent to a
variable is likely to be another variable). The program is likely to
continue running with arbitrarily bad data, causing arbitrarily bad
results.

You did acknowledge this, but it should be emphasized.
 

Christopher Benson-Manica

Malcolm said:
fgets(name, sizeof(name), stdin);
if(*strchr(name, '\n'))

ITYM

if( strchr(name, '\n' ) )

, unless you were going for *really* bad coding.
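The corrected idiom, with the NULL check the original snippet omitted (strchr returns NULL when no '\n' is present, e.g. when fgets truncated the line), can be wrapped up as a small helper. chomp is a hypothetical name:

```c
#include <stdio.h>
#include <string.h>

/* Strip the trailing newline left by fgets, checking strchr's result
   before dereferencing it. Safe whether or not a '\n' is present. */
void chomp(char *s)
{
    char *p = strchr(s, '\n');
    if (p != NULL)
        *p = '\0';
}
```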
 

Keith Thompson

CBFalconer said:
Chapter and verse please.

I think that C99 7.19.1p2:

The types declared are size_t (described in 7.17);

FILE

which is an object type capable of recording all the information
needed to control a stream, including its file position indicator,
a pointer to its associated buffer (if any), an _error indicator_
that records whether a read/write error has occurred, and an
_end-of-file indicator_ that records whether the end of the file
has been reached;
[...]

can be taken to imply that eof must be sticky. If "the end of the
file has been reached", it doesn't later become the case that the end
of the file has not been reached (though some actions, such as a call
to freopen(), explicitly clear the end-of-file indicator).
 

CBFalconer

Keith said:
CBFalconer said:
Chapter and verse please.

I think that C99 7.19.1p2:

The types declared are size_t (described in 7.17);

FILE

which is an object type capable of recording all the information
needed to control a stream, including its file position indicator,
a pointer to its associated buffer (if any), an _error indicator_
that records whether a read/write error has occurred, and an
_end-of-file indicator_ that records whether the end of the file
has been reached;
[...]

can be taken to imply that eof must be sticky. If "the end of the
file has been reached", it doesn't later become the case that the
end of the file has not been reached (though some actions, such as
a call to freopen(), explicitly clear the end-of-file indicator).

I am not worrying about disk files, which should meet that
easily. I am thinking of such devices as stdin from the keyboard,
being terminated by ^D or ^Z as the case may be. On this system
(W98/DJGPP) that doesn't signal EOF except immediately after a \n,
so the problem doesn't arise. However once EOF has been signalled
it is possible to continue.

It just isn't sticky, and that is probably a good thing. I
remember a system where, if you signalled an eof from the
terminal, that terminal was out of action until the system
operator reset the line from the console, or the system was
rebooted.

Running this little test on my system shows that EOF is sticky for
stdin. The second while is always exited immediately. Yet the
device is not in an EOF condition.

#include <stdio.h>

int main(void)
{
    int ch;

    while (EOF != (ch = getchar())) putchar(ch);
    puts("EOF encountered, doing it again");
    while (EOF != (ch = getchar())) putchar(ch);
    return 0;
} /* main */
 

Dan Pop

In said:
So you can't find any such clause either?

I NEVER make any technical statement about the C standard if I am not
prepared to support it with a chapter and verse. It may happen that
my interpretation of the chapter and verse is incorrect, but this is
another issue.

Dan
 

Dan Pop

In said:
CBFalconer said:
Chapter and verse please.

I think that C99 7.19.1p2:

The types declared are size_t (described in 7.17);

FILE

which is an object type capable of recording all the information
needed to control a stream, including its file position indicator,
a pointer to its associated buffer (if any), an _error indicator_
that records whether a read/write error has occurred, and an
_end-of-file indicator_ that records whether the end of the file
has been reached;
[...]

can be taken to imply that eof must be sticky. If "the end of the
file has been reached", it doesn't later become the case that the end
of the file has not been reached (though some actions, such as a call
to freopen(), explicitly clear the end-of-file indicator).

There is more precise wording in the description of the fgetc function
(all other input functions work as if they use fgetc):

2 If the end-of-file indicator for the input stream pointed to by
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
stream is not set and a next character is present, the fgetc
^^^^^^^^^^^^^^^^^
function obtains that character as an unsigned char converted
to an int and advances the associated file position indicator
for the stream (if defined).

Returns

3 If the end-of-file indicator for the stream is set, or if the
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
stream is at end-of-file, the end-of-file indicator for the
^^^^^^^^^^^^^^^^^^^^^^^^
stream is set and the fgetc function returns EOF. Otherwise, the
fgetc function returns the next character from the input stream
pointed to by stream. If a read error occurs, the error indicator
for the stream is set and the fgetc function returns EOF.

No mention that fgetc() could reset, under *any* condition, the
end-of-file indicator of the corresponding stream. So, once a fgetc()
call has set the end-of-file indicator by reading from a stream that was
at end-of-file, further fgetc() calls won't even attempt to read from that
stream, because that stream has the end-of-file indicator already set.

The *only* ways of resetting the end-of-file indicator for a stream are
freopen(), ungetc(), fseek(), fsetpos(), rewind() and clearerr().
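The clearerr() route can be demonstrated with a short sketch, using a temporary file as a stand-in for the interactive streams discussed above:

```c
#include <stdio.h>

/* Drain a stream until its end-of-file indicator is set, then clear
   the indicator with clearerr(). Returns the final feof() result,
   which is 0 after the indicator has been reset, so a later read
   would be attempted against the stream again. */
int eof_reset_demo(void)
{
    FILE *f = tmpfile();
    int ch;

    if (f == NULL)
        return -1;
    fputc('x', f);
    rewind(f);
    while ((ch = fgetc(f)) != EOF)
        ;                        /* drain: sets the EOF indicator */
    if (!feof(f))
        return -1;               /* indicator should now be set */
    clearerr(f);                 /* reset the end-of-file indicator */
    return feof(f);              /* 0: the indicator is clear again */
}
```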

Dan
 
