Why doesn't strrstr() exist?

  • Thread starter Christopher Benson-Manica

websnarf

Old Wolf said:
Unicode is a character set, not an encoding.

Right. It turns out that UTF-16 is the encoding (I don't know whether
it's LE or BE, but I suspect it's not an exposed thing from the
representation point of view -- i.e., it just matches whatever works
best for your platform.)
AFAIK the language doesn't specify how to deal with Unicode
characters whose value is greater than 65,535

Not so. It specifies UTF-16, which can represent the whole range.
Does it handle UTF-8, big-endian UCS-2, little-endian UCS-2,
b-e UTF16, l-e UTF16, and UCS-4 ? All of those occur in the
real world (unfortunately!)

I am not *that* familiar with Java. But I wouldn't be surprised if
Java didn't come with utilities to support all of those. UCS-2 is
just a subset of UTF-16, and UCS-4 is trivial. The only real question
is UTF-8 support, which I don't know about.
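
(For illustration -- not from the thread: Java's char is a UTF-16
code unit, so code points above U+FFFF occupy a surrogate pair. A
minimal C sketch of combining such a pair, assuming hi and lo really
are a valid high/low surrogate pair:)

    #include <stdint.h>

    /* Combine a valid UTF-16 surrogate pair into a Unicode code point.
       hi must lie in 0xD800-0xDBFF and lo in 0xDC00-0xDFFF. */
    uint32_t utf16_pair_to_codepoint(uint16_t hi, uint16_t lo)
    {
        return 0x10000u
             + (((uint32_t)(hi - 0xD800u)) << 10)
             + (uint32_t)(lo - 0xDC00u);
    }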
 

kuyper

Coroutines are not very widely *deployed*. So popularity is how you
judge the power and utility of a programming mechanism?

Powerful and useful programming mechanisms tend to become popular. If
they've been around for a reasonable length of time without becoming
popular, if they're not yet widely deployed, then it's reasonable to
guess that this might be because they're not as powerful or as useful
as you think they are, or that they might have disadvantages that are
more than sufficient to compensate for those advantages. The
implication of his comment is that he's not merely guessing; he knows
reasons that he considers sufficient to justify their lack of
popularity.

....
You understand that those are all mostly UNIX right?

No, Windows conforms to POSIX (though from what I've heard, it doesn't
conform very well; I'm happy to say I know nothing about the details),
but that doesn't make it UNIX.
Tell me, when is the last time the C language committee considered a
change in the language that made it truly more powerful that wasn't
already implemented in many compilers as extensions? Can you give me
at least a plausibility argument that I wouldn't be wasting my time by
doing such a thing?

You've answered your own question: the single best way to get something
standardized is to convince someone to implement it as an extension,
develop experience actually using it, show people that it's useful, and
have it become so popular that the committee has no choice but to
standardize it. Why should the committee waste its very limited amount
of time standardizing something that isn't a sufficiently good idea to
have already become a widely popular extension? If something good is so
new that it hasn't become popular yet, it's too early to standardize
it. Standardizing something freezes it; the need to maintain backwards
compatibility means that it's almost impossible to make substantive
changes to something once it's been standardized. Nothing should be
standardized until people have gained enough experience with it that it
seems unlikely that it will ever again need significant modification.

And, before you bring it up: yes, the committee has occasionally
ignored this advice; however, the results of doing so have often been
regrettable.
 

kuyper

....
"False dichotomy". Look it up. I never mentioned high or low level
language, and don't consider it relevant to the discussion. Its a
false dichotomoy because you immediately dismiss the possibility of a
safe low-level language.

No, it's not an immediate dismissal. It's also not a dichotomy:
low-level languages are inherently unsafe, but high-level languages are
not inherently safe. If it's low-level, by definition it gives you
unprotected access to dangerous features of the machine
you're writing for. If it protected your access to those features, that
protection (regardless of what form it takes) would make it a
high-level language.

....
C gives you access to a sequence of opcodes in ways that other
languages do not? What exactly are you saying here? I don't
understand.

Yes, you can access things more directly in C than in other higher
level languages. That's what makes them higher-level languages. One of
the most dangerous features of C is that it has pointers, which is a
concept only one layer of abstraction removed from the concept of
machine addresses. Most of the "safer" high level languages provide
little or no access to machine addresses; that's part of what makes
them safer.
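
(As a small illustration of that single layer of abstraction -- my
example, not from the thread -- a C pointer round-trips through a raw
integer address and back:)

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int x = 42;
        uintptr_t addr = (uintptr_t)&x;  /* pointer to raw integer address */

        printf("x lives at %#lx\n", (unsigned long)addr);
        *(int *)addr = 43;               /* and back: write through it */
        printf("x = %d\n", x);           /* prints 43 */
        return 0;
    }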
I am dodging the false dichotomy. Yes. You are suggesting that making
C safer is equivalent to removing buffer overflows from assembly. The
two have nothing to do with each other.

You can't remove buffer overflows from C without moving it at least a
little bit farther away from assembly, for precisely the same reason
why you can't remove buffer overflows from assembly without making it
less of an assembly language.
As I recall this was just a point about low level languages adopting
safer interfaces. Though in this case, the performance improvements
probably drive their interest in it.
[...] If you want to argue that too many people
write code in C when their skill level is more appropriate to a
language with more seatbelts, I won't disagree. The trick is
deciding who gets to make the rules.

But I'm not arguing that either. I am saying C is to a large degree
just capriciously and unnecessarily unsafe (and slow, and powerless,
and unportable etc., etc).

Slow? Yes, I keep forgetting how much better performance one
achieves when using Ruby or Python. Yeah, right.

I never put those languages up as alternatives for speed. The false
dichotomy yet again.

A more useful response would have been to identify these
safer-and-speedier-than-C languages that you're referring to.
Right. Because you write every piece of C code that's ever been
written right?

His comment says nothing to suggest that he's ported any specific
number of programs to those platforms. It could be a single program, it
could be a million. Why are you interpreting his claim as suggesting
that he's ported many different programs to those platforms?
Ok ... that's interesting, but this is ridiculous. As I said above,
you do not write every piece of software in the world. And we are well
aware of about 10,000 programmers living in the Pacific Northwest who
we know do *NOT* share your attitude.

Well, that's their fault, and their liability. That doesn't make the
attitude wrong.
And your defence of the situation is that you assume every gainfully
employed programmer should be willing to quit the moment they see that
their process of programming is not likely to yield the highest
possible quality in software engineering.

No, they should be willing to quit if deliberately ordered to ship
seriously defective products. There's a huge middle ground between
"seriously defective" and "highest possible quality". In that huge
middle ground, they should argue and strive for better quality, but not
necessarily threaten to quit over it.
That's nice for you. That's not going to be a choice for lots of other
people.

That's a choice every employed person has. If they choose not to take
it, that's their fault - literally, in the sense that they can and
should be held personally liable for the deaths caused by their
defective choice.

....
Wild guess.

Why are you making wild guesses? Why are you making guesses that have
no discernible connection to the topic at hand?
....
That was *my* point. Remember you are claiming that you want to pin
responsibility and liability for code to people so that you can dish
out punishment to them. I see a direct line of responsibility from
weakness in the C library back to him (or maybe it was Thompson or
Kernighan). And remember you want to punish people.

Yes, people should be held responsible for things they're actually
responsible for. Ritchie isn't responsible for misuse of the things
he's created.
 

Randy Howard

(e-mail address removed) wrote:
C gives you access to a sequence of opcodes in ways that other
languages do not? What exactly are you saying here? I don't
understand.

asm( character-string-literal ); springs to mind. I do not
believe all languages have such abilities. Having that kind of
capability alone, never mind pointers and all of the subtle and
not so subtle tricks you can do with them in C, makes it capable
of low-level work, like OS internals. There are lots of
landmines there, as you are probably already aware.
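
(As it happens, asm is only a common extension -- ISO C lists it in
Annex J.5.10, not the core language -- so here is a GCC-flavored
sketch rather than standard C:)

    int main(void)
    {
        /* GCC-style inline assembly: emit a single no-op instruction.
           A compiler extension, not ISO C. */
        __asm__("nop");
        return 0;
    }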
Yes of course! When people learn a new language they learn what it
*CAN* do before they learn what it should not do. It means anyone that
learns C first learns to use gets() before they learn not to use
gets().

Strange, it has been years since I have picked up a book on C
that uses gets(), even in the first few chapters. I have seen a
few that mention it, snidely, and warn against it though.

The man page for gets() on this system has the following to say:

    SECURITY CONSIDERATIONS
       The gets() function cannot be used securely. Because of its
       lack of bounds checking, and the inability for the calling
       program to reliably determine the length of the next incoming
       line, the use of this function enables malicious users to
       arbitrarily change a running program's functionality through a
       buffer overflow attack. It is strongly suggested that the
       fgets() function be used in all cases.

[end of man page]

I don't know about you, but I suspect the phrase "cannot be used
securely" might slow quite a few people down. It would be even
better if they showed an example of proper use of fgets(), but I
think all man pages for programming interfaces would be improved
by doing that.
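
(For instance -- my sketch, not from any actual man page -- a
proper-use example might look like this:)

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char line[256];                       /* bounded buffer */

        if (fgets(line, sizeof line, stdin) != NULL) {
            line[strcspn(line, "\n")] = '\0'; /* strip newline fgets keeps */
            printf("read: %s\n", line);
        }
        return 0;
    }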
You are suggesting that making C safer is equivalent to removing
buffer overflows from assembly. The two have nothing to do with each other.

Not equivalent, but difficult. Both languages are very powerful
in terms of what they will 'allow' the programmer to attempt.
There is little or no hand-holding. If you step off the edge,
you get your head chopped off. It's not like you can make some
simple little tweak and take that property away, without
removing a lot of the capabilities overall. Yes, taking gets()
completely out of libc (and its equivalents) would be a good
start, but it wouldn't put a dent in the ability of programmers
to make many more mistakes, also of a serious nature, with the
language.

Just as I can appreciate the differences between a squirt gun
and a Robar SR-90, I can appreciate the differences between
Python and C, or any other 'safer' language and assembler.
Terse, HLA, Rosasm, LuxAsm -- this is all for *one* assembly language.

I was referring to your use of 'many people', which is, unless I
am mistaken, the only use of 'many' above.
Oh by the way there is a new version! It incorporates a new secure
non data-leaking input function!

You mean it wasn't secure from day one? tsk, tsk. That C stuff
sure is tricky. :)
Soon to reach 5000 downloads and
80000 webpage hits! Come join the string library revolution and visit:
http://bstring.sf.net/ to see all the tasty goodness!
LOL.


You mean it closes the most obvious and well trodden thousand doors out
of a million doors.

Both work out to .001. Hmmm.
Assembly is not a real application development language no matter how
you slice it.

I hope the HLA people don't hear you saying that. They might
get riotous.
So I would be loath to make any point about whether or
not you should expect applications to become safer because they are
writing them in assembly language using Bstrlib-like philosophies. But
maybe those guys would beg to differ -- who knows.
Yes.

As I recall this was just a point about low level languages adopting
safer interfaces. Though in this case, the performance improvements
probably drive their interest in it.

Exactly. C has performance benefits that drive interest in it
as well. If there was a language that would generate faster
code (without resorting to hand-tuned assembly), people would be
using it for OS internals.

I don't think it should have been used for some things, like
taking what should be a simple shell script and making a binary
out of it (for copyright/copy protection purposes) as is done
so often. Many of the tiny binaries from a C compiler on a lot
of systems could be replaced with simple scripts with little or
no loss of performance. But, somebody wanted to hide their
work, or charge for it, and doesn't like scripting languages for
that reason. People even sell tools to mangle interpreted
languages to help with this. That is not the fault of the C
standards body (as you originally implied, and lest we forget
what led me down this path with you), but the use of C for
things that it really isn't best suited for. For many simple
problems, and indeed some complicated ones, C is not the best
answer, yet it is the one chosen anyway.
I never put those languages up as alternatives for speed. The false
dichotomy yet again.

Then enlighten us. I am familiar with Fortran for a narrow
class of problems of course, and I am also familiar with its
declining use even in those areas.
No introspection capabilities. I cannot write truly general
autogenerated code from the preprocessor, so I don't get even the most
basic "fake introspection" that should otherwise be so trivial to do.
No coroutines (Lua and Python have them) -- which truly closes doors
for certain kinds of programming (think parsers, simple incremental
chess program legal move generators, and so on). Multiple heaps with
a freeall(), so that you can write "garbage-collection style" programs
without incurring the cost of garbage collection -- again there are
real applications where this kind of thing is *really* useful.

Then by all means use alternatives for those problem types. As
I said a way up there, C is not the best answer for everything,
it just seems to be the default choice for many people, unless
an obvious advantage is gained by using something else.
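
(To make the freeall() idea above concrete -- a minimal
bump-allocator sketch, with all names invented for illustration:)

    #include <stdlib.h>

    typedef struct {
        char  *base;
        size_t used, cap;
    } Arena;

    static int arena_init(Arena *a, size_t cap)
    {
        a->base = malloc(cap);
        a->used = 0;
        a->cap  = cap;
        return a->base != NULL;
    }

    static void *arena_alloc(Arena *a, size_t n)
    {
        void *p;

        n = (n + 15) & ~(size_t)15;   /* keep allocations aligned */
        if (a->cap - a->used < n)
            return NULL;              /* (a real library might grow here) */
        p = a->base + a->used;
        a->used += n;
        return p;
    }

    /* The whole point: one call releases every allocation at once. */
    static void arena_freeall(Arena *a)
    {
        a->used = 0;
    }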
[...] It seems to be the only language other than
assembler which has been used successfully for operating system
development.

The power I am talking about is power to program. Not the power to
access the OS.

So we agree on this much then?
Right. Because you write every piece of C code that's ever been
written right?

Thankfully, no. The point, which I am sure you realize, is that
C can be, and often is, used for portable programs. Can it be used
(in non-standard form most of the time, btw) for writing
inherently unportable programs? Of course. For example, I
could absolutely insist upon the existence of certain entries in
/proc for my program to run. That might be useful for a certain
utility that only makes sense on a platform that includes those
entries, but it would make very little sense to look for them in
a general purpose program, yet there are people that do that
sort of silly thing every day. I do not blame Ritchie or the C
standards bodies for that problem.
Ok, first of all runtime error handling is not the only path.

Quite. I wasn't trying to enumerate every possible reason that
C would continue to be used despite its 'danger'.
Well, just as a demonstration candidate, we could take the C standard,
add in Bstrlib, remove the C string functions listed in the bsafe.c
module, remove gets and you are done (actually you could just remove
the C string functions listed as redundant in the documentation).

What you propose is in some ways very similar to the MISRA-C
effort, in that you are attempting to make the language simpler
by carving out a subset of it. It's different in that you also
add some new functionality. I don't wish to argue any more
about whether MISRA was good or bad, but I think the comparison
is somewhat appropriate. You could write a tome, entitled
something like "HSIEH-2005, A method of providing more secure
applications in a restricted variant of C" and perhaps it would
enjoy success, particularly amongst people starting fresh
without a lot of legacy code to worry about. Expecting the
entire C community to come on board would be about as naive as
expecting everyone to adopt MISRA. It's just not going to
happen, regardless of any real or perceived benefits.
Uhh ... actually no. People like my Bstrlib because it's *safe* and
*powerful*. They tend not to notice or realize they are getting a
major performance boost for free as well (they *would* notice if it was
slower, of course). But my optimization and low level web pages
actually do have quite a bit of traffic -- a lot more than my pages
critical of Apple or Microsoft, for example.

So you are already enjoying some success then in getting your
message across.
Its not hard to beat compiler performance, even based fundamentally on
weakness in the standard (I have a web page practically dedicated to
doing just that; it also gets a lot of traffic). But by itself, that's
insufficient to gain enough interest in building a language for
everyday use that people would be interested in.
Indeed.

[...] "D" is already taken, what will you call it?

How about "C"?

Well, all you need to do is get elected ISO Dictator, and all
your problems will be solved. :)
Ok, well then we have an honest point of disagreement. I firmly
believe that the current scourge of bugs that lead to CERT advisories
will not ever be solved unless people abandon the current C and C++
languages.

Probably a bit strongly worded, but I agree to a point. About
90% of those using C and C++ today should probably be using
alternative languages. About 20% of them should probably be
working at McDonald's, but that's an argument for a different
day, and certainly a different newsgroup.
I think there is great consensus on this. The reason why I
blame the ANSI C committee is because, although they are active, they
are completely blind to this problem, and haven't given one iota of
consideration to it.

I suspect they have considered it a great deal, and yet not
provided any overt action that you or I would appreciate. They
are much concerned (we might easily argue 'too much') with the
notion of not breaking old code. Where I might diverge with
that position is on failing to recognize that a lot of 'old
code' is 'broken old code' and not worth protecting.
Even though they clearly are in the *best*
position to do something about it.

I actually disagree on this one, but they do have a lot of power
in the area, or did, until C99 flopped. I think the gcc/libc
crowd could put out an x++ that simply eradicates gets(). That
should yield some immediate improvements. In fact, having a
compiler flag to simply squawk loudly every time it encounters
it would be of benefit. Since a lot of people are now using gcc
even on Windows systems (since MS isn't active in updating the C
side of their C/C++ product), it might do a lot of good, far
sooner, by decades, than a change in the standard.
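
(In fact gcc already has a mechanism along these lines; a sketch,
GCC-specific and not ISO C:)

    #include <stdio.h>

    /* After this pragma, any mention of gets is a hard compile error. */
    #pragma GCC poison gets

    int main(void)
    {
        char buf[80];
        /* gets(buf);   <- would now fail: attempt to use poisoned "gets" */
        if (fgets(buf, sizeof buf, stdin) != NULL)
            printf("%s", buf);
        return 0;
    }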
And it's them and only them -- the
only alternative is to abandon C (and C++) which is a very painful and
expensive solution; but you can see that people are doing exactly that.
Not a lot of Java in those CERT advisories.

That's good. The more people move to alternate languages, the
more people will have to realize that security bugs can appear
in almost any language. Tons of poorly written C code currently
represents the low-hanging fruit for the bad guys.
And so are you saying it didn't cost you anything when you first
learned it?

Since I did not have a million lines of C to worry about
maintaining at the time, indeed very little, it was not very
expensive. I'll admit it wasn't zero-cost, in that it took me
whatever time it was for the point to soak in, and to learn
better alternatives. I could have recouped some of the 'cost'
be selling some old Schildt books to unsuspecting programmers,
but felt that would have been uncivilized.
And that it won't cost the next generation of programmers,
or anyone else who learns C for the first time?

Provided that they learn it early on, and /not/ after they ship
version 1.0 of their 'next killer app', it won't be that bad.
Given that it shouldn't be taught at all to new programmers
today (and I am in favor of pelting anyone recommending it today
with garbage), I suspect it will be eradicated for all practical
purposes soon.
C99 is not being adopted because there is no *demand* from the users or
development houses for it. If the standard had been less dramatic,
and solved more real world problems, like safety, for example, I am
sure that this would not be the case.

Do I think C99 was for many people of no tangible value, or
enough improvement to justify changing compilers, related tools
and programmer behavior? Unfortunately, yes. It was a lot of
change, but little meat on the bones.

However, there was also the problem that C89/90 did for many
people exactly what they expected from the language, and for a
significant sub-group of the population, "whatever gcc adds as
an extension" had become more important than what ISO had to say
on the matter. The stalling out of gcc moving toward C99
adoption (due to conflicts between the two) is ample support for
that claim.
You also ignore the fact that
the C++ folks typically pick up the changes in the C standard for their
own. So the effect of the standard actually *is* eventually
propagated.

Here I disagree. C and C++ are not closely related anymore. It
takes far longer to enumerate all the differences that affect
both than it does to point out the similarities. Further, I
care not about C++, finding that there is almost nothing about
C++ that can not be done a better way with a different language.
C is still better than any reasonable alternative for a set of
programming tasks that matter to me, one in which C++ doesn't
even enter the picture. That is my personal opinion of course,
others may differ and they are welcome to it.
The fact that it would take a long time for a gets() removal in the
standard to be propagated to compilers, I do not find to be a credible
argument.

Why not? If the compiler doesn't bitch about it, where are all
of those newbie programmers you are concerned about going to
learn it? Surely not from books, because books /already/ warn
about gets(), and that doesn't seem to be working. If they
don't read, and it's not in the compiler, where is this benefit
going to appear?
Also note that C89 had very fast adoption. It took a long time for
near perfect and pervasive adoption, but you had most vendors more than
90% of the way there within a very few years.

Because it was very similar to existing practice, and a smaller
language standard overall. Far less work. Frankly, I have had
/one/ occasion where something from C99 would have made life
easier for me, on a single project. It turned out I didn't get
to use it anyway, because I did not have access to C99 compilers
on all of the platforms I needed to support, so I did it
differently. I don't anticipate 'wishing' for a C99 compiler
much, if at all, in the future either. The problem domains that
C became dominant for are well-served by C89/90, as is, just
stay away from the potholes. I certainly do not need a C05
compiler just to avoid gets(), I've been doing it with C89
compilers for many years.
A compiler error telling the user that it's wrong (for new platform
compilers) is the best and simplest way to do this.

And we both say that, several times, we seem to differ only in
the requirements to make that change.
Uh ... but you see that it's still better than nothing right?

So is buying lottery tickets for the worst programmer you know.
:)
You think programming will suddenly stop in 15 years?

Yeah, that's exactly what I was thinking. How did you guess?
Do you think there will be fewer programmers *after* this 15
year mark than there have been before it?

Nope, but I think it will be 15 years too late; and even if it does
come, and the gets() removal is part of it (which assumes facts
not in evidence), there will STILL be a lot of people using
C89/90 instead. I would much rather see it show up in compilers
with the next minor update, rather than waiting for C05, which
will still have the barrier of implementing the ugly bits of
C99, which the gcc crowd seems quite loath to do.
Or, like me, do you think C will just become COBOL in 15 years?

Yeah, as soon as a suitable replacement for system programming
shows up. Hold your breath, it's right around the corner.
The GNU linker already does this. But it's perceived as
a warning. People do not always listen to warnings.

So make it email spam to the universe pronouncing "Someone at
foobar.com is using gets()!! Avoid their products!!!" instead.
:)

Perhaps having the C runtime library spit out a warning on every
execution at startup "DANGER: THIS PROGRAM CONTAINS INSECURE
CODE!!!" along with a string of '\a' characters would be better.

I do not see a magic wand that will remove it for all time, the
genie is out of the bottle. Some nebulous future C standard is
probably the weakest of the bunch. I am not saying it shouldn't
happen, but it will not be sufficient to avoid the problem.
Use of old compilers is not the problem. The piles of CERT advisories
and news stories about exploits are generally directed at systems that
are constantly being updated with well supported compilers.

Which of those systems with CERT advisories against them have
recently updated C99 compilers? It's only been 6 years right?
How long will it be before they have a compiler you are happy
with, providing guaranteed expulsion of code with gets()?

Use of old compilers is definitely part of the problem, along of
course with badly trained programmers.
But nobody would believe your claim. My claim could be audited, and a
company would actually worry about being sued for making a false claim
of the sort I am advocating unless it were true.

If they can prove that no gets() or friends in the list are in
their product, then what is the worry? Put in a different way,
if they claimed "We're C 2010 compliant", just because they have
access to gcc2010, and yet it allows some command line argument
-std=c89, all bets are off anyway. Either way, they either use
gets() or they do not. As such, both claims are pretty similar.

Frankly, I don't care about marketing BS, so let's move on...
Did I misspeak and ask for deprecation? Or are you misrepresenting my
position as usual?

No, you just failed to notice the end of one sentence,
pertaining to your position, and the start of another one, with
the words "I would much rather see...".
I'm pretty sure I explicitly said "non-redefinable
in the preprocessor and always leads to an error" to specifically
prevent people from working around its removal.

And, just as I said above, which I will repeat to get the point
across (hopefully), "I AM NOT OPPOSED TO THEM BEING REMOVED".

I simply think more could be done in the interim, especially
since we have no guarantee of it ever happening your way at
all.
... And you think there will be lots of CERT advisories on such
products? Perhaps you could point me to a few examples of such
advisories which are new, but which use old compilers such as Borland
C.

If your tactic is to only attack the people working on widely
deployed software likely to be involved in CERTs, I think the
"gets(), just say no" mantra is being driven into their head
practically every day. It's legacy code (and cleaning it) that
represents the bulk of the problem today. Scanning the complete
source tree for every product currently on the market would be
your best bet.
We can't do anything about legacy compilers -- and we don't *NEED TO*.
That's not the point. The "software crisis" is directed at development
that usually uses fairly well maintained compilers.

Well, if it is a 'crisis', then 15 years is definitely too long
to wait for a solution.
Ok ... that's interesting, but this is ridiculous. As I said above,
you do not write every piece of software in the world.

A fact of which I am painfully aware each time I sit down at a
keyboard to a machine running Windows. :) (For the record,
no, I do not think I would replicate Windows functionality and
improve on it, single-handedly.)
And we are well
aware of about 10,000 programmers living in the Pacific Northwest who
we know do *NOT* share your attitude.

Correct. Perhaps if they weren't so anxious to grab 20 year old
open source software and glue it into their own products, there
would be less to worry about from them as well.
And your defence of the situation is that you assume every gainfully
employed programmer should be willing to quit the moment they see that
their process of programming is not likely to yield the highest
possible quality in software engineering.

That is /not/ what I said. The specific example in question had
to do with something that would be very dangerous if shipped.
The type of danger people would send lawyers hunting for you
over. There is a lot of room between that and "not likely to
yield the highest possible quality..."
Wild guess.
Very.


That isn't what I am saying. People's ability to quit or work at will
is often not related to things like programming philosophy or idealism
about their job. And software is and always will be created by
developers who have considerations other than the process of creating
perfect software.

Then we are in a lot of trouble. The ISO C body isn't going to
solve that problem. You better start tilting at some more powerful
windmills, and do it quickly.
I implore you -- read the CERT advisories. Buffer Overflows are #1 by
a LARGE margin.

Yes. And when they are all gone, something else will be number
#1. As I already said, a lot of people have figured out how to
find and expose the low-hanging fruit, it's like shooting fish
in a barrel right now. It won't always be that way. I long for
the day when some hole in .NET becomes numero uno, for a
different reason than buffer overflows. It's just a matter of
time. :)
If you remove buffer overflows, it doesn't mean that other kinds of
bugs will suddenly increase in absolute occurrence. Unless you've got
your head in the sand, you've got to know that *SPECIFICALLY* buffer
overflows are *BY THEMSELVES* the biggest and most solvable, and
therefore most important safety problem in programming.

Yep, they're definitely the big problem today. Do you really
think they'll still be the big problem by the time your C2010
compiler shows up in the field? It's possible of course, but I
hope not.
I'm not sure how this is an argument that Buffer Overflows aren't the
worst safety problem in programming by a large margin.

None of those problems actually have anything to do with programmer
abilities, or language capabilities. They have to do with corporate
direction, mismanagement, and incompetent program architecture. That's
a completely separate issue.

Yes, I diverged in the woods, for no good reason; it was just too
bothersome for me to leave out, since it seemed related at the
time. Forgive me.
Uh ... no *you* are. My point was that he *COULDN'T* be.

OK. If that's your point, then how do you justify claiming that
the ISO C folks are culpable in buffer overflow bugs?
Sometimes you are *NOT AWARE* of your liability, and you don't *KNOW*
the situations where your software might be used.

So if your point is that the ISO committee knew about gets() and
allowed it to live on in C99, and that for example, 7.19.7.7
should contain some bold wording to that effect, I agree.
Better yet of course, marking it deprecated and forcing
compilers to emit an ERROR, not a warning for its use when
invoked in C99 conforming mode.

Even so, the knowledge was readily available elsewhere that
gets() was inherently unsafe at the time, and frankly, I have
met two or three programmers other than myself in the last 15
years that owned their own copy or copies of the applicable C
standards. Putting it in the document alone wouldn't have
helped, but it is somewhat surprising that it didn't even rate a
warning in the text.
That was *my* point. Remember you are claiming that you want to pin
responsibility and liability for code to people so that you can dish
out punishment to them. I see a direct line of responsibility from
weakness in the C library back to him (or maybe it was Thompson or
Kernighan). And remember you want to punish people.

In the country I live, ex post facto laws are unconstitutional,
and I do not want to make people retroactively responsible
(although our elected representatives sometimes do).

Especially since they couldn't have possibly been claiming
'certification' of any kind that far in the past, seeing it
doesn't even exist today.
Well no, but you can argue that they are responsible for the bugs they
introduce into their compilers. I've certainly stepped on a few of
them myself, for example. So if a bug in my software came down to a
bug in their compiler, do you punish me for not being aware of the bug,
or them for putting the bug in there in the first place?

It would be difficult, if not impossible, to answer that
generically about a hypothetical instance. That's why we have
lawyers. :-(
Steve Gibson famously railed on Microsoft for enabling "raw sockets" in
Windows XP.

Yes, I saw something about it on his website only yesterday,
ironically.
This allows for easy DDOS attacks, once the machines have
been zombified. Microsoft marketing, just like you, of course
dismissed any possibility that they should accept any blame whatsoever.

Don't put that one on me, their software exposes an interface in
a running operating system. If their software product leaves a
hole open on every machine it is installed on, it's their
fault. I see nothing in the ISO C standard about raw sockets,
or indeed any sockets at all, for well over 500 pages.

Can raw sockets be used for some interesting things? Yes. The sad
reality is that almost /everything/ on a computer that is
inherently powerful can be misused. Unfortunately, there are
currently more people trying to break them than to use them
effectively.
Oh I see. So you just want to punish IBM, Microsoft, Unisys, JASC
Software, Adobe, Apple, ... etc. NOBODY caught the bug for about *10
years* dude.

Exactly. They all played follow-the-leader. I'm sure they'll
use the same defense if sued.
Everyone was using that sample code including *myself*.

tsk, tsk.
Speaking of lack of logic ... it's the *REVERSE* that I am talking
about. It's because I *don't* have a 0 first-compile error rate that I
feel that my hidden error rate can't possibly be 0.

I'll say it a different way, perhaps this will get through.
REGARDLESS of what your first-compile error rate is, you should
feel that your hidden error rate is non-zero. You /might/ convince
yourself otherwise at some point in the future, but using
first-compile errors as a metric in this way is the path to
hell.
You miss my argument. First-compile error rates are not a big deal --
the compiler catches them, you fix them. But they are indicative of
natural blind spots.

True. Unfortunately, if you had none at all, there are still
'unnatural blind spots' in code that will bite you in the
backside. This is why developers (outside of small shops)
rarely are solely responsible for testing their own code. They
get false impressions about what code is likely to be sound or
unsound based upon things like how many typos they made typing
it in. Not good.
Testing, structured walk-throughs/inspections, are just imperfect
processes for trying to find hidden bugs. Sure they reduce them, but
you can't believe that they would get all of them -- they don't!

No kidding. I'm often amazed at how you give off the impression
that you think you are the sole possessor of what others
recognize as common knowledge.

I have never claimed that a program was bug free. I have
claimed that they have no known bugs, which is a different
matter completely.
When did I suggest or imply this?

Apparently not. Good.
[...] Not all development work is for use inside a VM or
other sandbox.

Again putting words in my mouth.

Stating a fact, actually.
When did I suggest that I was doing such a thing? Can you find the
relevant quote?

You didn't. I suggested it. Since it is more likely to happen
before 2020, it might be of interest to you in solving
the 'software crisis'.
 

Douglas A. Gwyn

Chris said:
It's NOT down to the ANSI committee..... it is down to WG14, an ISO
committee of which ANSI is but one part. ...

It's already evident that "websnarf" doesn't understand
standardization.
 

Douglas A. Gwyn

In any event, compare this to Java, where Unicode is actually the
standard encoding for string data. It's not really possible to have
"unicode parsing problems" in Java, since all this stuff has been
specified in the core of the language. Compare this to ANSI C, which
uses wchar_t, which literally doesn't *specify* anything useful. So
technically the only reason one is writing Unicode parsers in C is
because the standard doesn't give you one.

C is *meant* for implementing systems at that level, and doesn't
presuppose such things as the specific native character encoding.
People who have to deal with a variety of encodings would have to
do similar things in Java too.
 

Randy Howard

(e-mail address removed) wrote:
This was a point to demonstrate that programmers are not perfect, no
matter what you do. So this idea that you should just blame
programmers is just pointless.

It's better than blaming someone that doesn't even have access
to the source code in most cases. If I saw a car wreck, I
wouldn't immediately go in search of the CEO of the company that
built the car. I'd first try to find out about the driver. It
might turn out that the car was fundamentally flawed, and
impossible to drive correctly.

You could easily make the argument that gets() is impossible, or
nearly so, to use properly. That doesn't preclude you from
getting input in other ways.
Well, Ritchie, AFAIK, did not push for the standardization, or
recommend that everyone actually use C as a real application
development language. So I blame him for the very narrow problem of
making a language with lots of silly unnecessary problems, but not for
the fact that everyone decided to use it.

I have yet to encounter a language without a lot of silly
problems. Some folks argue that lisp is such a language, but I
don't take them seriously. :)
The actual ANSI C committee
is different -- they knew exactly what role C was taking. They have
the ability to fix the warts in the language.

No, they have the ability to recommend fixes to a language, but
as we have already seen, the developer community is more than
willing to ignore them when push comes to shove. A community
that strong-willed should be strong enough to address the gets()
problem itself, and not rely on 'permission' from the standard
group.
First of all, the standard doesn't *have* coroutines while other
languages do.

It doesn't have threads either. If you want to argue about
pthreads, take it up with the POSIX guys. I suspect Butenhof
and a few others would just love to argue with you about it
over in c.p.t.
Coroutines are not very widely *deployed*.
Correct.

So popularity is how you
judge the power and utility of a programming mechanism?

No, but it has a lot to do with whether or not programmers see
it as a suitable alternative at design time.
Why don't you
try to add something substantive here rather than leading with ignorance?

Why don't you stop playing games? It's readily apparent that
programming isn't new to either of us, so let's stop fencing
with ad homs and move on.
Can you give a serious pro-con argument for full threads versus
coroutines? Because I can.

Yes. There is a POSIX standard for pthreads. This allows me to
develop cross-platform, portable software that utilizes them
across a diverse group of OS platforms and CPU architectures for
tools that, for a number of reasons, must be written in C, or
assembler, which isn't all that practical since different CPU
architectures are involved, and the amount of work involved
would be staggering. There is no coroutine solution available
AFAIK to solve the problem across all of the platforms, and I do
not have the luxury of discussing vaporware.

If there was coroutine support in a suitable language, and if it
was readily available on a wide variety of OS and hardware, I
would consider it, especially if it offered parity or
improvement over threads in terms of performance, flexibility
and stability. I have not spent much time on them, apart from
the theoretical, because they do not form a practical solution
to the problems I have to address today.

Ironically, I can say similar things about the Windows threading
model, which does exist, but only solves a fraction of the
problem, and in a much more tortured way.
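
(For reference, the portable pthreads usage being described boils down
to something like this minimal sketch; the worker function and its
argument are invented for illustration:)

    #include <pthread.h>
    #include <stdio.h>

    static void *worker(void *arg)
    {
        printf("hello from %s\n", (const char *)arg);
        return NULL;
    }

    int main(void)
    {
        pthread_t tid;

        if (pthread_create(&tid, NULL, worker, "worker-1") != 0)
            return 1;
        pthread_join(tid, NULL);   /* wait for the thread to finish */
        return 0;
    }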
You understand that those are all mostly UNIX right? Even the windows
thing is really an implementation or emulation of pthreads on top of
Windows multithreading.

Wow. I didn't know that. I thought that the reason I
downloaded pthreads-win32 was simply because Microsoft forgot to
put it on the CD and had mistakenly left it on somebody else's
website. *sigh*
Show me pthreads in an RTOS.

Read Bill Weinberg's (MontaVista) paper for an alternate view.
http://www.mvista.com/dswp/wp_rtos_to_linux.pdf
Right. And have they fixed the generic problem of race conditions?

No, you have to code to avoid them, just as you do to avoid
logic flaws.
Race conditions are just the multitasking equivalent of
buffer-overflows.
Hardly.

Except, as you know, they are *much* harder to debug,

And to exploit. Buffer overflows are comparatively easy to
exploit. The best way to 'debug' races is to avoid them in the
first place.
and you cannot use tools, compiler warnings or other simple
mechanisms to help you avoid them.

You can use the knowledge of how to avoid them to good effect
however. I have not had to chase one down for quite a while,
and will be mildly surprised if one pops up down the road.
This is the real benefit of coroutines over full threading.
You can't have race conditions using coroutines.

Unfortunately, you can't have coroutines using a number of
popular languages either.

And you can have race conditions, even without threads. Two
separate processes doing I/O to the same file can race, with or
without coroutines in either or both. Race conditions are not
something threads hold a patent on, not by a long shot.
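
(For readers who haven't seen coroutines approximated in C: a minimal
switch-based sketch in the style Simon Tatham popularized -- my
illustration, not anything from the thread. Each call resumes where
the previous one left off, with no threads involved:)

    #include <stdio.h>

    static int next_even(void)
    {
        static int state = 0;
        static int n;

        switch (state) {
        case 0:
            for (n = 0; ; n += 2) {
                state = 1;
                return n;      /* "yield" n to the caller */
        case 1:;               /* the next call resumes here */
            }
        }
        return -1;             /* not reached */
    }

    int main(void)
    {
        int i;
        for (i = 0; i < 5; i++)
            printf("%d\n", next_even());   /* prints 0 2 4 6 8 */
        return 0;
    }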
I would if I thought there was an audience for it. These things take
effort, and a brief perusal of comp.std.c leads me to believe that the
ANSI committee is extremely capricious.

Then skip them, and sell it instead. People use things (such as
pthreads) that aren't part of standard C every day, with varying
degrees of success. If you had a package, and it did not
subscribe to the 'all the world's an intel' philosophy, I would
entertain the notion. Not all of us have the luxury of
supporting a single, or even a few platforms. The choices are
limited, or you roll your own.
Think about it. You want me to propose something actually useful,
powerful and which would improve the language to a committee that
continues to rubber stamp gets().

Yeah. And if nobody tries, it is a self-fulfilling prophecy
that it won't change.
Is your real point that I am supposed to do this to waste my time and
energy, obviously get rejected because the ANSI C committee has no
interest in improving the language, and this will be proof that I am
wrong?

No. I think that if you are correct in believing it has value
and the world needs it, then you propose it for standardization,
and while you wait, you build it anyway, because if it's that
good, then people will want to buy it if they can't get it
another way. Even if it is accepted, you can still sell it;
look at the folks selling C99 solutions and libraries today,
despite the lack of general availability.

You can sit around being mad at the world over your pet peeves,
or you can do something about it. Of course, the former is far
easier.
Tell me, when is the last time the C language committee considered a
change in the language that made it truly more powerful that wasn't
already implemented in many compilers as extensions?

I can't think of an example offhand, but I am restricted by not
being completely fluent on all the various compiler extensions,
as I try to avoid them whenever possible. Even so, you have a
point about their past history.
Can you give me at least a plausibility argument that I wouldn't
be wasting my time by doing such a thing?

If you don't think it's worth fooling with, then why are we
arguing about it at all? Either it is a 'software crisis', or
it is, to use a coarse expression, 'just a fart in the wind'.
If it is the latter, let's stop wasting time on it.
 

Richard Bos

Randy Howard said:
(e-mail address removed) wrote:


asm( character-string-literal ); springs to mind. I do not
believe all languages have such abilities.

Neither does C.

Richard
 

Randy Howard

(e-mail address removed) wrote:
No, it's not an immediate dismissal. It's also not a dichotomy:
low-level languages are inherently unsafe, but high-level languages are
not inherently safe. If it's low-level, by definition it gives you
unprotected access to dangerous features of the machine
you're writing for. If it protected your access to those features, that
protection (regardless of what form it takes) would make it a
high-level language.

I'm glad that is obvious to someone else. I was feeling lonely.
:)
You can't remove buffer overflows from C without moving it at least a
little bit farther away from assembly, for precisely the same reason
why you can't remove buffer overflows from assembly without making it
less of an assembly language.

Yes. We could just as easily have had "C without Pointers"
instead of "C with classes". Guess how many people would have
gone for the first of the two? We can argue about the benefits
or lack thereof with the second some other time. :)
A more useful response would have been to identify these
safer-and-speedier-than-C languages that you're referring to.

Exactly. It would have been far more difficult to do though,
and he already has 'false dichotomy' in his paste buffer.
His comment says nothing to suggest that he's ported any specific
number of programs to those platforms. It could be a single program, it
could be a million. Why are you interpreting his claim as suggesting
that he's ported many different programs to those platforms?

Intentional misreading, I suspect.
No, they should be willing to quit if deliberately ordered to ship
seriously defective products. There's a huge middle ground between
"seriously defective" and "highest possible quality". In that huge
middle ground, they should argue and strive for better quality, but not
necessarily threaten to quit over it.

Eerily, we had almost exactly the same response to this bit.
Once again, there is hope. :)
That's a choice every employed person has. If they choose not to take
it, that's their fault - literally, in the sense that they can and
should be held personally liable for the deaths caused by their
defective choice.
Agreed.


Yes, people should be held responsible for things they're actually
responsible for. Ritchie isn't responsible for misuse of the things
he's created.

He's probably in favor of suing Stihl if someone uses one of
their chainsaws to decapitate their spouse too. After all, it's
the same false premise at work.
 

Chris Hills

Oh I see. So, which socialist totally unionized company do you work as
a programmer for? I'd like to apply!

You don't need to work in a unionised company to do that. I have seen
Professional Software Engineers put disclaimers on work they have had to
modify: that whilst they are happy with their own work they were very
unhappy with the rest of it. It even got locked into the comments in the
CVS database to make sure it was on record.

Of course the corporate manslaughter Bill (UK) helps now.
[...] If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up.

Hmmm ... so you live in India?

How the hell do you come to that conclusion? Seriously, I would like to
see your reasoning here.
I'm trying to guess where it is in this
day and age that you can just quit your job solely because you don't
like the pressures coming from management.

Any civilised country. Life is too short to mess about. It is the way
Professionals should behave.
That's a nice bubble you live in. Or is it just in your mind?

No. People do that. Seen it done. Apparently you don't mind what you do
as long as you are paid. Where do you draw the line?

Try and force a brain surgeon to operate on your head with a
chainsaw. Good luck.
[...] If you could be fined or perhaps even jailed for
gross negligence in software development the way doctors can be
today, I suspect the problem would be all but nonexistent.

Ok, that's just vindictive nonsense.

Why? We expect architects, doctors, lawyers, pretty much all
other real 'professions' to meet and typically exceed a higher
standard, and those that do not are punished, fined, or stripped
of their license to practice in the field. Why should
programmers get a pass? Is it because you do not feel it is a
professional position?

Because it's not as structured, and that's simply not practical.
Doctors have training, internships, etc. Lawyers have to pass a bar
exam, etc. There's no such analogue for computer programmers.

Yes there is. Most certainly. There is the PE in many countries and
Chartered Engineer in others. It requires a Degree, training and
experience.

The discussion in another thread is should it be made mandatory world
wide.
Agreed.


You don't think medical practitioners use the latest and safest
technology available to practice their medicine?

Not always.
Go measure your own first-compile error rate

That is a meaningless statistic.
For a nuclear reactor, I would also include the requirement that they
use a safer programming language like Ada.

As the studies have shown, language choice has minimal impact on
errors.
Personally I would be
shocked to know that *ANY* nuclear reactor control mechanism was
written in C. Maybe a low level I/O driver library, that was
thoroughly vetted (because you probably can't do that in Ada), but
that's it.

Which destroys your argument! Use Ada because it is safe but the
interface between Ada and the hardware is C.... So effectively C
controls the reactor.
I guess all the professionals in other fields where they are
held up to scrutiny must be irresponsible daredevils too.

No -- they have great assistance and controlled environments that allow
them to perform under such conditions.
Yes.

Something akin to using a
better programming language.
No.
[...] For
example, there are operations that have very low success rates,
yet there are doctors that specialize in them anyway, despite
the low odds.

Well, your analogy only makes some sense if you are talking about
surgeons in developing countries who simply don't have access to the
necessary anesthetic, support staff or even the proper education to do
the operation correctly. In those cases, there is little choice, so
you make do with what you have. But obviously it's a situation you just
want to move away from -- the way you solve it is you give them
access to the safer, and better ways to practice medicine.

That is a blinkered view. Some operations are the only choice between
life and death. A relative of mine had that... this op has a 40% success
rate. Without it your life expectancy is about 2 weeks.

So you do the dangerous op.

So you want some people to stay away from C because the language is too
dangerous.

It is in the hands of the inexperienced.
While I want the language be fixed so that most people
don't trigger the landmines in the language so easily. If you think
about it, my solution actually *costs* less.

I have seen more errors in "safe" languages because people thought that
if it compiled OK it must be good code... there is a LOT more to it than
getting it to compile. Only an idiot would think that.
 

Randy Howard

Richard Bos wrote:
Neither does C.

Ref: J.5.10. Yes, I know where it appears in the
document, but it's close enough given we are discussing the real
world. Since one can also link assembly code modules with C
into an executable without it, it seems a moot point anyway.
 

Alan Balmer

(e-mail address removed) wrote:


It's better than blaming someone that doesn't even have access
to the source code in most cases. If I saw a car wreck, I
wouldn't immediately go in search of the CEO of the company that
built the car. I'd first try to find out about the driver. It
might turn out that the car was fundamentally flawed, and

Can't you guys find a more suitable venue for your argument?

How about comp.programming?
 

Douglas A. Gwyn

Randy said:
.. A community
that strong-willed should be strong enough to address the gets()
problem itself, and not rely on 'permission' from the standard
group.

Actually all you need to do is not use gets (except
perhaps in certain carefully controlled situations).
There are other, standard, mechanisms that can be used
safely enough. If the actual problem is perceived to
be that naive novice programmers might use gets
without appreciating the opportunity for buffer
overrun, consider that the same programmers will make
comparable errors throughout their code. A genuine
fix for the actual problem requires something quite
different from removing gets from the system library.
 

Keith Thompson

"False dichotomy". Look it up. I never mentioned high or low level
language, and don't consider it relevant to the discussion. Its a
false dichotomoy because you immediately dismiss the possibility of a
safe low-level language.

"websnarf", let me offer a suggestion. If you want to point out a
false dichotomy, use the words "false dichotomy". I had no clue what
you meant by your remark about terrorists, and I'd be surprised if
anyone else did either. (I didn't mention it earlier because,
frankly, I didn't care what you meant.)

If you want to communicate, you need to write more clearly. If you're
more interested in showing off how obscure you can be, please do so
somewhere else.
 

Randy Howard

Alan Balmer wrote:
Can't you guys find a more suitable venue for your argument?

I'm sorry. Is a discussion about what might or might not happen
in future C standards drowning out discussions of homework
problems and topicality?
 

Randy Howard

Douglas A. Gwyn wrote:
Actually all you need to do is not use gets (except
perhaps in certain carefully controlled situations).
There are other, standard, mechanisms that can be used
safely enough.

Indeed, if I am not mistaken I made that very point several
times already. Apparently it lacks for an iron-clad guarantee.
If the actual problem is perceived to be that naive novice
programmers might use gets without appreciating the opportunity
for buffer overrun, consider that the same programmers will make
comparable errors throughout their code.

Yes. Also, it is incredibly unlikely that a naive novice
programmer will be producing software that will be widely
deployed and wind up in a CERT advisory, but I suppose it is not
impossible.

I am somewhat curious about why even as late as C99, or even
later in TC1, there is still no official wording in the standard
concerning gets() being of any concern at all. It seems that it
couldn't have offended many people to simply say "hey, this
is in the standard already, but it's really not a good idea to
use it for new development, and in fact, it is highly
recommended that any existing usage be expunged." That seems to
be the strongest argument in favor of Hsieh's position that I
have seen so far. It is very hard to think of a justification
for it appearing unadorned with a warning in the text.
A genuine fix for the actual problem requires something quite
different from removing gets from the system library.

What would you propose?
 

Alan Balmer

Alan Balmer wrote


I'm sorry. Is a discussion about what might or might not happen
in future C standards drowning out discussions of homework
problems and topicality?

Reread your last 223-line post. That's not what it was about.

Unless you are concentrating mostly on the "might not happen" aspect,
in which case everything becomes topical. I can think of hundreds of
things which might not appear in future C standards.
 

Douglas A. Gwyn

Randy said:
Yes. Also, it is incredibly unlikely that a naive novice
programmer will be producing software that will be widely
deployed and wind up in a CERT advisory, but I suppose it is not
impossible.

Judging by some of the reported bugs, one wonders.

Just a few weeks ago, there was an IAVA for a bug in
Kerberos v5 that was essentially of the form
    if (!try_something) {
        error_flag = CODE;
        free(buffer);
    }
    free(buffer);
How that could have passed even a casual code review
is a mystery.
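
(The repair is as small as the bug. Keeping the reported shape, the
buffer should be freed on exactly one path:)

    if (!try_something) {
        error_flag = CODE;
    }
    free(buffer);      /* freed exactly once, on every path */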
I am somewhat curious about why even as late as C99, or even
later in TC1, there is still no official wording in the standard
concerning gets() being of any concern at all. It seems that it
couldn't have offended many people to simply say "hey, this
is in the standard already, but it's really not a good idea to
use it for new development, and in fact, it is highly
recommended that any existing usage be expunged." That seems to
be the strongest argument in favor of Hsieh's position that I
have seen so far. It is very hard to think of a justification
for it appearing unadorned with a warning in the text.

The simple answer is that the C standard is a specification
document, not a programming tutorial. Such a warning
properly belongs in the Rationale Document, not in the spec.
 

Old Wolf

Second of all, remember, I *BEAT* the performance of C's strings
across the board on multiple platforms with a combination of run
time and API design in Bstrlib. This is a false idea that error
checking always costs performance. Performance is about design,
not what you do about safety.

You keep going on about how "C is slow" and "it would be easy
to make it faster and safer". Now you claim that you have a
library that does make C "faster and safer".

In other messages, you've explained that by "safer", you mean
being less prone to buffer overflows and undefined behaviour.

The only way a C-like language can avoid buffer overflows
is to include a runtime bounds check.

Please explain how -adding- a runtime bounds check to some
code makes it faster than the exact same code but without
the check.
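
(For what it's worth, the usual answer is that the check rides along
with a length the string already carries, while strcat() must rescan
for the terminator on every call. A sketch -- the struct and names
are invented here, not Bstrlib's actual API:)

    #include <string.h>

    typedef struct {
        char  *data;
        size_t len, cap;
    } Str;

    /* Bounds-checked append: O(n) in the bytes appended, because the
       current length is already known. A strcat() loop is O(n^2)
       overall, since each call rescans the whole destination. */
    static int str_append(Str *s, const char *src, size_t n)
    {
        if (s->cap - s->len <= n)    /* the bounds check */
            return -1;               /* (a real library would grow here) */
        memcpy(s->data + s->len, src, n);
        s->len += n;
        s->data[s->len] = '\0';
        return 0;
    }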
 

Chris Hills

Randy said:
No, they have the ability to recommend fixes to a language,

Actually "they" (the WG14 committee) can change or "fix" the language
Unfortunately IMOH they are not doing so.
but
as we have already seen, the developer community is more than
willing to ignore them when push comes to shove. A community
that strong-willed should be strong enough to address the gets()
problem itself, and not rely on 'permission' from the standard
group.

This is why things like MISRA-C exist and are widely used.
I can't think of an example offhand, but I am restricted by not
being completely fluent on all the various compiler extensions,
as I try to avoid them whenever possible. Even so, you have a
point about their past history.

They are about to do it for the IBM maths extensions, and also some DSP
maths functions.
 
