Even *I* could and did write a 'fake finger' from the brief
description.
Then what point are you making when you say it's not trivial? I would
say the number of people who know how to do this may number in the
hundreds of thousands.
[...] and programmers are left with the take-home
lesson: use gets() and the Russian mob will take
over your machine and the rest of the world.
Uh ... the Russian mob *does* use hackers ...
Uh ... yes. But watch the conclusion you're about to reach
and see if blind dogma is subverting rational thought.
You mean the conclusion about taking over the world? That's a
conclusion *you* created. Straw men are so much more convenient to
argue against than real arguments, aren't they?
No. fingerd's special nature was that stdin was directed
to a socket connected to an (untrusted) external machine.
Obviously gets()'er's in that environment deserve to have
their n*ts c** *ff. No one has implied anything different.
No ... users of gets() in *ANY* environment deserve to have their
programming privileges diminished. The environment only dictates the
most likely consequences of using gets() (the simplest undefined
behavior to try to deterministically invoke); it has no relevant
bearing on the validity of using gets() in and of itself.
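To make the point concrete: gets() receives no buffer size, so no implementation of it can be safe; any input line longer than the buffer writes past the end. A bounded reader must take the size explicitly. A minimal sketch (names illustrative, not taken from either poster's code):

```c
#include <stdio.h>
#include <string.h>

/* A bounded line reader: unlike gets(), it is told the buffer size,
 * so an over-long line is truncated instead of overflowing. */
static char *read_line(char *buf, size_t size, FILE *fp)
{
    if (fgets(buf, (int)size, fp) == NULL)
        return NULL;                      /* EOF or read error */
    buf[strcspn(buf, "\n")] = '\0';       /* strip newline if present */
    return buf;
}
```

The extra size parameter is the entire difference: gets() has no way to receive it, which is why no environment makes it safe.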
You managed to write several paragraphs on this subtopic
without even hinting that you understand that the gets()'s
we're "defending" are simply not subject to Russian mob
exploitations!
First of all, that's not true, and very explicitly so. Deliberately
deleting my comment about your mysterious "13" values in your sample
code doesn't change the fact that I made it. The argument against
gets() does not rely on the behavior of the Russian Mob, and you
really need to learn what a straw man argument is before some
religious apologist accuses you of patent infringement.
I've time to read very few Usenet groups these days.
Are many as totally devoid of humor as c.l.c?
No, just devoid of contextual and rational thinkers. My comment is
clearly directed at what you think about warning messages about
gets(), not at attempts at rhetorical self-deprecating humor.
Did anyone seriously think I was worried about the warning message?
You must be very new to the internet or something: make your sarcasm
very obvious, demarcate it, or expect it to be misinterpreted.
[...] (I *do* use a simple 'grep -v' to remove
one irrelevant gcc warning, but haven't bothered for
the friendly "dangerous" message.)
That's about the saddest thing I have read in a long time. You
intentionally ignore messages through a crude post-process, rather
than commenting your code explicitly with a warning-suppressing pragma
to turn the warning off ... oh wait, is this another one of your
examples of demarcated self-deprecating sarcastic humor?
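For reference, GCC's diagnostic pragmas can silence a compile-time warning locally and self-documentingly, which is the alternative to post-processing build output with grep -v. A hedged sketch (note that glibc's "gets is dangerous" message is actually emitted by the *linker*, which pragmas cannot reach, so this applies only to compiler diagnostics; the function name is illustrative):

```c
#include <string.h>

/* Suppress a compiler diagnostic only for the region that needs it,
 * leaving a visible record in the source of what is being ignored. */
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wdeprecated-declarations"
static char *call_deprecated(char *dst, const char *src)
{
    /* strcpy() is just a stand-in here for a call the compiler
     * would otherwise flag as deprecated. */
    return strcpy(dst, src);
}
#pragma GCC diagnostic pop
```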
Another poster implied that a reason gets() is "dangerous"
is that it will disappear when the pedants take over
libc!
<sarcasm>Of course, because everyone who has a problem with gets() is
a 'pedant'.</sarcasm>
[...] Does anyone think any of us would have trouble
writing our own gets() when it goes missing from libc?
You are responding to the wrong person and to the wrong post.
Remember I want the compiler to detect an attempted use of gets() and
remove the developer's source code, and I even refer to this fact *IN
THE POST YOU ARE RESPONDING TO*. (To do it right, of course, you
would want to catch it in the pre-processor *and* the compiler *and*
the linker.)
Either way, if someone is writing the source for a semantic gets()
clone, any serious development house would file a bug against it
immediately.
(This would also be a trivial way to get rid of the
"dangerous" message.) In fact, at the risk of encouraging
the mob to Repeat_The_Obvious One_More_Time I'll put a
gets implementation in the public domain right now:
/* Not tested */
char *gets(char *s)
{
    s = fgets(s, 1000000, stdin);
    /* core-dump immediately if a cosmic ray has
     * damaged the terminating LineFeed:
     */
    if (s) 0[index(s, '\n')] = 0;
    return s;
}
Doug Gwyn already tried to get something essentially the same as this
standardized, but was shot down by his own cohorts on the ANSI C
committee.
Again, you've got the wrong guy. I think fgets() is an amateur's hack
job and will tend to avoid it as well.
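For context, a bounds-checked gets() replacement in the spirit of what C11 later standardized as the optional gets_s() (Annex K) can be sketched as follows. This is not a claim about what Gwyn actually proposed; my_gets_s is a stand-in name, and the FILE * parameter is added for testability (the real gets_s reads only from stdin). Unlike fgets(), an over-long line is discarded and reported as a failure rather than split across calls:

```c
#include <stdio.h>

/* Bounds-checked line reader: stores at most n-1 characters plus a
 * terminator; an over-long line is drained and reported as failure. */
static char *my_gets_s(char *s, size_t n, FILE *fp)
{
    size_t i = 0;
    int c;

    if (s == NULL || n == 0)
        return NULL;
    while ((c = getc(fp)) != EOF && c != '\n') {
        if (i + 1 < n) {
            s[i++] = (char)c;
        } else {
            s[0] = '\0';                 /* line too long: */
            while ((c = getc(fp)) != EOF && c != '\n')
                ;                        /* drain the rest, then fail */
            return NULL;
        }
    }
    s[i] = '\0';
    if (c == EOF && i == 0)
        return NULL;                     /* nothing read at all */
    return s;
}
```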
"Narrow" is unclear. I'll assume you meant something
worse, but are being polite!
No, I meant that I cannot use it because it doesn't generalize in any
useful way. You know ... it's narrow.
I won't build your straw men for you. You get to do that all by
yourself.
[...] In any event *most* of
the example programs at my website are just that:
*examples* rather than library functions intended for
immediate reuse.
I think there may be some useful ideas at my site.
For example, real-time programmers might benefit from a
glance at
http://james.fabpedigree.com/lesson9.htm#p94
but I imagine if you condescend to comment further
it will be to nitpick or to gloat when you find some
fragment still written in old-fashioned K&R C.
Recently I added functions to operate on compressed
sparse bit strings:
http://james.fabpedigree.com/sofa.c
which *is* intended as a small but useful (for some)
downloadable package. Paul (and anyone else) is welcome
to examine it and offer comments.
The code is unreadable, and I have no idea of, or motivation to figure
out, what it does anyway. You don't even document which compression
method you are using. I am *currently* working on a project in which
I work very closely with state-of-the-art compression. If I don't get
minimal documentation (actually I got a tutorial instead) on the
*algorithm*, I refuse to touch it. I would seriously laugh at anyone
who tried to use someone else's undocumented, unknown compression
method.
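To illustrate the documentation point only (this is *not* the method used at the site, which is exactly the complaint): even a trivial scheme for sparse bit strings becomes usable the moment it is stated. For example, run-length encoding, where the bit string is stored as alternating run lengths starting with the 0-run; a sketch with illustrative names:

```c
#include <stddef.h>

/* Encode a bit string as alternating run lengths, 0-run first.
 * E.g. bits 0001100 -> runs {3, 2, 2}.  A string starting with a
 * 1-bit simply gets a leading 0-run of length zero.
 * Returns the number of runs written, or 0 if maxruns is too small. */
static size_t bits_to_runs(const unsigned char *bits, size_t nbits,
                           size_t *runs, size_t maxruns)
{
    size_t n = 0, len = 0;
    int cur = 0;                       /* runs alternate, starting at 0 */

    for (size_t i = 0; i < nbits; i++) {
        if (bits[i] == cur) {
            len++;
        } else {
            if (n >= maxruns) return 0;
            runs[n++] = len;           /* close the current run */
            cur = !cur;
            len = 1;
        }
    }
    if (n >= maxruns) return 0;
    runs[n++] = len;                   /* final run */
    return n;
}
```

The point is not the scheme itself but that three sentences of specification make the data recoverable by anyone.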
I guess I've not uploaded my cuckoo hash implementation
yet, so you're referring to the hash table I introduced some
years ago under a Subject like "Reusable Source code".
Indeed I am. I found the code to be totally unusable, let alone
*reusable*.
To say that program "can only ever support one [hash table]"
is to ignore that its source code can be modified!
To support multiple instances, and different types simultaneously, you
need to rewrite it from the ground up (or do something insane like
keep a duplicated source file for each instance). There wasn't
anything worth saving, as I recall.
It's My Own(tm) code for My Own(tm) projects; I use
Just-In-Time Programming(tm) for My Own(tm) projects,
and have found this code easy to modify. As it turns
out, not supporting the feature you mention was the correct
Just-In-Time Programming(tm) decision, as that routine has
been superseded by my cuckoo hash implementation which
*does* support multiple users.
I still don't get why you needed to start with a less functional one.
The cost of making it take its data context as a parameter is
basically nothing.
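A sketch of what taking the data context as a parameter costs in practice: every operation receives an explicit table pointer instead of touching file-scope state, so any number of independent instances coexist with no extra machinery. All names are illustrative; this is the shape of the argument, not anyone's actual library:

```c
#include <stdlib.h>
#include <string.h>

/* A hash table instance carried as an explicit context parameter.
 * Keys are stored as pointers; the caller keeps them alive. */
struct htab {
    size_t nslots;
    const char **keys;                 /* NULL marks an empty slot */
    int *vals;
};

static struct htab *htab_new(size_t nslots)
{
    struct htab *t = malloc(sizeof *t);
    if (!t) return NULL;
    t->nslots = nslots;
    t->keys = calloc(nslots, sizeof *t->keys);
    t->vals = calloc(nslots, sizeof *t->vals);
    if (!t->keys || !t->vals) {
        free(t->keys); free(t->vals); free(t);
        return NULL;
    }
    return t;
}

static size_t htab_hash(const char *key, size_t nslots)
{
    size_t h = 5381;                   /* djb2-style string hash */
    while (*key)
        h = h * 33 + (unsigned char)*key++;
    return h % nslots;
}

/* Linear probing; returns 1 on success, 0 if the table is full. */
static int htab_put(struct htab *t, const char *key, int val)
{
    size_t i = htab_hash(key, t->nslots);
    for (size_t p = 0; p < t->nslots; p++) {
        size_t j = (i + p) % t->nslots;
        if (!t->keys[j] || strcmp(t->keys[j], key) == 0) {
            t->keys[j] = key;
            t->vals[j] = val;
            return 1;
        }
    }
    return 0;
}

/* Returns a pointer to the stored value, or NULL if absent. */
static int *htab_get(struct htab *t, const char *key)
{
    size_t i = htab_hash(key, t->nslots);
    for (size_t p = 0; p < t->nslots; p++) {
        size_t j = (i + p) % t->nslots;
        if (!t->keys[j]) return NULL;
        if (strcmp(t->keys[j], key) == 0) return &t->vals[j];
    }
    return NULL;
}
```

Nothing here is harder than the single-instance version; the context pointer replaces the globals one-for-one.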
[...] But the whole point was completely lost, as Chuck reacted
defensively re his own hashlib (without ever explaining how
the user needing two bytes maximum could cope with hashlib's
eight-byte minimum), and Paul enlightened us with code in
which Paul (or his dog?) proudly allows the programmer to
replace the confusing, error-prone predicate
(! p->curr)
with an invocation of the function
int utilGenLinkedListIteratorEndReached(struct genLinkedListIterator *iter)
{
    if (iter == NULL) return -__LINE__;
    return NULL == iter->curr;
}
Is this the kind of code you think I need to improve
my pedigree generator, Paul?
Quote-mining out of context, missing the point, erecting more straw
men. I posted more, and will not dignify this repeat performance of
the straw man argument with a defense. Anybody who wishes can look up
the posts and see that I was clearly making a special kind of point.
Besides, can you explain what's actually wrong with that code? I
highly doubt it, given your lame follow-up. (Which I will snip.)
Hunh? The OS performed to customer spec, which was
to do an MS-DOS lookalike on an MC68000 chip.
Like I said, undemanding customers. Did you know that the sort.exe
and fc.exe programs in MS-DOS are, were, and have always been
chronically and totally broken? You'd think that MS would have gotten
around to fixing those at some point. They never did, and as far as I
know, users never raised a stink about it. I wonder, are your
customers any more demanding?
My customer (who got exclusive rights -- I never claimed
to be a businessman) ended up selling the source to
A Very Big Company You've All Heard Of(tm) and recovering
his payment to me. (The Company's representative spent
several days on-site at my customer's examining the source
"eyes only" and concluded with a comment: "Originally I
was almost sure we'd develop our own in-house instead,
but after examining the source ... Sold!")
You found someone who actually likes your bizarre style of coding? ...
Oh wait, I know which really big company you are talking about now; I
think we all do.
With your "honestly", one is tempted to guess you might
not even be joking! If so: Wow! Words fail me.
Well, you are actually an unashamed user of gets(). My "eccentric"
approach would actually save the industry millions (by now it would
probably not be billions, as programmers have escaped to Java and
C++/STL; the damage has been done). I can defend my position
completely without introducing false arguments.
I'm afraid I'll have to (cheerfully) retract that concession
in your case, Paul.
Ok, so attack my position. Try.
As I compose this reply, I'm reading, in another window, about
present-day Internet security problems, despite the Billion$
of $$$ spent on $oftware by Micro$oft and others. Frankly,
I doubt that the problem is programmers using simple
functions to do simple things, but is more likely to be
the overly complex methods some of you seem to advocate.
A straw man and unsubstantiated speculation in a single sentence.
It's a two-for-one! Not bad.
I am unaware of any security citation against any of the source code
I've produced. On the contrary, people have been desperately seeking
things like my Better String Library precisely because they just
don't want to deal with security problems. A huge bulk of C-based
security flaws involve some incorrect processing of C strings. With
the standard mainline usage of Bstrlib (which provides a complete
substitute for C strings) it is almost inconceivable to produce a
security flaw analogous to any I've seen with standard C string
manipulation.
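A sketch of the general idea behind a length-carrying string type (deliberately *not* Bstrlib's actual API, just the principle): keep the length and capacity alongside the data, so that concatenation grows the buffer in a checked way instead of overrunning it the way strcat() can. All names here are illustrative:

```c
#include <stdlib.h>
#include <string.h>

/* A string that knows its own length and capacity. */
struct lstr {
    size_t len;    /* bytes in use, excluding terminator */
    size_t cap;    /* bytes allocated */
    char *data;
};

static struct lstr *lstr_from(const char *s)
{
    struct lstr *b = malloc(sizeof *b);
    if (!b) return NULL;
    b->len = strlen(s);
    b->cap = b->len + 1;
    b->data = malloc(b->cap);
    if (!b->data) { free(b); return NULL; }
    memcpy(b->data, s, b->cap);
    return b;
}

/* Append s, growing the buffer as needed; returns 0 on success. */
static int lstr_cat(struct lstr *b, const char *s)
{
    size_t n = strlen(s);
    if (b->len + n + 1 > b->cap) {
        size_t cap = (b->len + n + 1) * 2;   /* grow, never overflow */
        char *p = realloc(b->data, cap);
        if (!p) return -1;
        b->data = p;
        b->cap = cap;
    }
    memcpy(b->data + b->len, s, n + 1);      /* includes terminator */
    b->len += n;
    return 0;
}
```

Because the length travels with the data, there is no terminator to lose and no unchecked copy, which is the class of flaw being described.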
Call it a fetish if you wish, but I *do* like to program in C.
My brain isn't completely dormant here, raising my kids and
smelling the roses: I read up on a variety of topics, e.g.
historical linguistics. Learning a new programming language
is *not* a priority and wasn't even before I retired, unless
you count LaTeX as a "programming" language.
You can become a Python expert in < 3 days. You can become a Lua
expert in < 3 hours.
LOL, chuckle! Call me simplistic but I process one line
at a time. The PC I use has relatively little memory,
but it's enough to spare one (1) 2000-byte buffer
for one (1) 200-byte string.
Moreover, your comments show another confusion: one uses
the "over-sized buffer" whether one gets()'s or fgets()'s.
Trust me, I'm not the one confused. Of course you'll end up using an
over-sized buffer no matter what you do, unless you do some insane
linked-list-then-exact-sized-buffer-copy thing. I was pointing out
that string processing for a fairly low-taxing job like building a
family tree and making something like a trie out of it is *NOT*
something you use C for unless you are some sort of masochist.
I did not realize that you actually don't *KNOW* other programming
languages.
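For a sense of scale, a minimal character trie of the sort mentioned runs to a few dozen lines of C, every one of them raw pointer bookkeeping that higher-level languages hide. A hedged sketch over lowercase names (all names illustrative):

```c
#include <stdlib.h>

/* A trie over the lowercase alphabet: one child pointer per letter. */
struct trie {
    struct trie *child[26];
    int is_word;
};

static struct trie *trie_new(void)
{
    return calloc(1, sizeof(struct trie));
}

/* Returns 1 on success, 0 on allocation failure or non-lowercase input. */
static int trie_insert(struct trie *t, const char *word)
{
    for (; *word; word++) {
        int i = *word - 'a';
        if (i < 0 || i > 25) return 0;
        if (!t->child[i] && !(t->child[i] = trie_new())) return 0;
        t = t->child[i];
    }
    t->is_word = 1;
    return 1;
}

static int trie_contains(const struct trie *t, const char *word)
{
    for (; *word; word++) {
        int i = *word - 'a';
        if (i < 0 || i > 25 || !t->child[i]) return 0;
        t = t->child[i];
    }
    return t->is_word;
}
```

In a language with built-in dictionaries the same structure is a few lines with no allocation or bounds checks to get wrong, which is the trade-off under dispute.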
In either case, if the string needs to be preserved,
one strlen()'s, malloc()'s, and strcpy()'s (*). Yes, one can
hire Chuck F. to implement this trivial functionality, but
so what? (* - or do the pedants use strncpy() instead of
strcpy() even when they've just strlen()'ed, on the off-chance
a gamma-ray zapped the null terminator?)
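The strlen()/malloc()/strcpy() sequence described is essentially POSIX strdup(); a minimal portable sketch (the name dup_string is illustrative, to avoid shadowing the POSIX function):

```c
#include <stdlib.h>
#include <string.h>

/* Duplicate a string on the heap; returns NULL on allocation failure. */
static char *dup_string(const char *s)
{
    size_t n = strlen(s) + 1;      /* the length is already known... */
    char *p = malloc(n);
    if (p != NULL)
        memcpy(p, s, n);           /* ...so a plain copy suffices */
    return p;
}
```

Since strlen() just measured the string, the terminator's position is known and strncpy() buys nothing here.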
True, the fgets() *can* handle lines too long to fit in
the already "over-sized" buffer, but if you're doing that
handling in more than 2% of your fgets() applications you must
be coding like a masochist! The whole point of gets()/fgets()
is the simplicity of reading exactly one line: if you lose
that, you might as well just use fread().
Yeah whatever, you still don't know who you are talking to.
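For reference, the "lines too long to fit" case being quoted amounts to checking whether fgets() delivered a newline; a hedged sketch that detects and drains over-long lines rather than silently truncating them (names illustrative):

```c
#include <stdio.h>
#include <string.h>

/* Returns 1 for a complete line in buf, -1 if the line was too long
 * (the remainder is discarded), 0 on EOF or error. */
static int read_full_line(char *buf, size_t size, FILE *fp)
{
    if (fgets(buf, (int)size, fp) == NULL)
        return 0;
    if (strchr(buf, '\n') == NULL && !feof(fp)) {
        int c;                             /* buffer filled before the */
        while ((c = getc(fp)) != EOF && c != '\n')
            ;                              /* newline: drain the rest */
        return -1;
    }
    buf[strcspn(buf, "\n")] = '\0';        /* strip newline if present */
    return 1;
}
```

Whether this extra handling is worth carrying in every caller is exactly the 2% argument in the quoted text.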
If you've been reading my website, you're misquoting!
I revere T&R (not K&R) as if they were human, but excellent
and creative designers.
What I mean by that is that you use C even when it is insane to do so
(you have no performance constraints, and you want to do processing
with primitives precisely where C is weak, i.e., strings).
I'm agnostic about whether the taxpayers should have been
required to fund the expensive test; the point is that the
flaw would have surfaced readily (with the $50 test) in a
less pedantic testing environment. You seem to have missed
this point; indeed you seem to be arguing that *no* test is
too expensive for the taxpayers! ... Where do you fit on the
political spectrum, Paul?
Taxpayers pay NASA, not individual engineers. People at NASA have to
decide how to allocate their budget, and I am *NOT* an advocate of
runaway costs to solve a complex problem. I would rather they spent a
quarter of the money on a rushed-job version of Hubble 1, watched it
fail, then learned some lessons and made a Hubble 2, and maybe a
Hubble 3, until it worked. See the Mars lander programs as an
example. Many of them have failed, yet overall it's been an
incredible success. How can repeated failure lead to such a high
degree of success? Because there are *MANY* probes, many missions,
and they are *SUPER CHEAP*. It makes the space shuttle program look
like a big joke by comparison. If I were in charge of NASA I would
ask for 10 times the budget for these cheaper probe missions, under
the proviso that there be 10 times as many missions (and consequently
10 times as many failures).
If you read what I was saying with a bit of contemplation, you would
realize that by claiming you would need to test absolutely everything
to be sure, I am implicitly arguing *against* doing that, because
*OBVIOUSLY* that's too expensive. Giving up sureness and accepting a
failure rate can ultimately end up being cheaper.
So long as you see the world in false dichotomies and learn entirely
the wrong lessons from them, it will prevent you from seeing the real
answers even when they appear to you right before your very eyes.
Regarding my claim, which you contested, that one
SHOULD SPEND ONE'S THOROUGHNESS WISELY,
consider again the table handler you deprecated as
single-instance. Making it multi-instance *would* have
complicated it and did indeed prove unnecessary.
So you claim. Obviously I don't speak from a position of ignorance
here. Why do you think I examined your code and CBFalconer's code so
closely? Neither was worth the effort to try to fix, so I just rolled
my own.