Why doesn't strrstr() exist?


Alan Balmer

Default User said:
I think it's dumb to [set followups].
I find it rude and obnoxious.

For that I humbly apologize; consider the lesson learned.

What lesson? That Brian doesn't like the Followup-To header? I
wouldn't recommend tailoring your posting habits solely to his
preferences. Setting Followup-To on crossposted messages is
recommended by a number of netiquette guides and Son-of-1036. Some
people dislike it; other people - some of whom felt sufficiently
animated by the subject to formalize their thoughts in usage guides -
do not.

My inclination, frankly, is to follow the recommendations of the
group which can be bothered to promulgate guidelines, over the
complaints of those who can't be bothered to do more than complain.
Sometimes there are good reasons (a clear majority of opinion or
well-established practice in a given group, for example) for
observing other conventions, but I don't see any of those here. What
I see is one poster (well, two, since I've seen Alan chime in as well)
complaining about a widely-recommended practice.

It's really very simple. If one doesn't want discussion in a
newsgroup, don't post to it.

That's my guideline, hereby promulgated. Complain if you like.
 

Michael Wojcik

It's really very simple. If one doesn't want discussion in a
newsgroup, don't post to it.

That's my guideline, hereby promulgated.

Posting it hardly constitutes promulgation. When you include it in a
serious, substantial discussion of netiquette, made available as a
separate document, and preferably submitted to a standards mechanism
(say, as an Internet-Draft), then I'll consider it promulgated.
Promulgation must mean something other than "writing in public", or
why have the term at all?

There's a difference between, on the one hand, taking the time to
consider the nature of discourse in a medium, developing from that
theories of how best to use that medium, formulating those theories
as claims about best practices, constructing arguments in favor of
those practices, setting the lot down in a durable public document,
and submitting it for review; and on the other tossing out some
statement of preference in a note dashed off in a few seconds in some
conversation on a newsgroup where the question isn't even topical.

For me, that's a significant difference. For others, no doubt, it
is not; but it suffices for me to justify, to myself, disregarding
complaints about, say, cross-posting and followup-to when those
features are used in a manner that accords with most promulgated
guidelines.
Complain if you like.

I don't, particularly, since I don't really care what guidelines
people toss out in Usenet postings. What I do care about are the
ones that are arrived at by serious consideration and presented
with substantial justification.

Of course, that doesn't mean that there should be no discussion
of the question - quite the opposite, since it informs those who
might go on to produce the latter sort of guideline.

Tangentially, I might note that the reason I originally replied to
Christopher's post was that I feared he might believe that Brian's
opinion represented a consensus. It does not. (Should Christopher
choose to shape his behavior to it anyway, that's his business.)
 

Douglas A. Gwyn

Walter said:
It seems to me that you are implying that the maximum
object size that a C implementation may support is only
half of the memory addressable in that address mode --

No, I was saying that *if* a C implementation doesn't
support some integer type with more bits than are needed
to represent an address, *and if* the compiler supports
objects larger than half the available address space,
*then* the definition of ptrdiff_t becomes
problematic. Note all the conditions.
The machines I use most often -happen- to have that property
anyhow, because the high-bit on a pointer is reserved for
indicating kernel memory space, but I wonder about the extent
to which this is true on other machines?

Now that 64-bit integer support is required for C
conformance, there should be a suitable ptrdiff_t type
available except on systems that support processes with
data sizes greater than 2^63 bytes. I don't know of
many systems like that.
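
A small sketch of the type in play (standard C99; PTRDIFF_MAX from
<stdint.h> bounds what a pointer subtraction can represent):

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        char big[1000];
        char *first = &big[0];
        char *last = &big[sizeof big];   /* one past the end */
        ptrdiff_t d = last - first;      /* well-defined: 1000 */

        /* An object larger than PTRDIFF_MAX bytes would make some
           pointer subtractions unrepresentable -- the concern above. */
        printf("distance = %td, PTRDIFF_MAX = %jd\n",
               d, (intmax_t)PTRDIFF_MAX);
        return 0;
    }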
 

Douglas A. Gwyn

Antoine said:
The straightforward idea (using strstr() in a loop and returning the last
not-NULL answer, as strrchr() usually does) won't be a good one?

Well, it won't be optimal, since it searches the entire string
even when a match could have been found immediately if the
scan progressed from the end of the string. Finding the end
of the string initially has relatively high overhead, alas,
due to the representation of C strings. It isn't immediately
obvious just what the trade-off is between starting at the end
and scanning backward vs. the algorithm you suggested. Probably,
unless strrstr() is a bottleneck in the app, what you suggested
will be good enough.
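
A minimal sketch of that loop-based approach (my_strrstr() is a
hypothetical name; no such function exists in the standard library):

    #include <stddef.h>
    #include <string.h>

    /* Find the LAST occurrence of needle in haystack by repeatedly
       calling strstr() and remembering the most recent hit. */
    char *my_strrstr(const char *haystack, const char *needle)
    {
        const char *found = NULL;
        const char *p = haystack;

        if (*needle == '\0')    /* empty needle: match at the end */
            return (char *)(haystack + strlen(haystack));

        while ((p = strstr(p, needle)) != NULL) {
            found = p;
            p++;                /* resume just past the last hit */
        }
        return (char *)found;
    }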
At least it would benefit from the optimized form of strstr()

Yes, that is useful.

What I was actually concerned about was that people might
implement the naive "brute-force" method of attempting matches
at each incremental (decremental?) position, which is okay for
occasional use but certainly not nearly the fastest method.
(Several people have reported here that shipped strstr()s
regularly outperform hand-crafted algorithms like Boyer-Moore.)
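
For contrast, the naive scan from the end looks roughly like this (a
sketch; worst case O(strlen(haystack) * strlen(needle)), tolerable for
occasional use but no match for a tuned strstr()):

    #include <string.h>

    char *brute_strrstr(const char *haystack, const char *needle)
    {
        size_t hlen = strlen(haystack);
        size_t nlen = strlen(needle);
        size_t i;

        if (nlen > hlen)
            return NULL;
        for (i = hlen - nlen + 1; i-- > 0; )   /* scan backward */
            if (memcmp(haystack + i, needle, nlen) == 0)
                return (char *)(haystack + i);
        return NULL;
    }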

I compared various algorithms in the book to which I referred.
Not that I see any use for strrstr(), except perhaps to do the same as
strrchr() when c happens to be a multibyte character in a stateless
encoding.

Even then it's problematic, because the search would not respect
alignment with boundaries between character encodings.
 

Douglas A. Gwyn

Keith said:
I don't think anyone has posted the real reason: it's arbitrary. The
C standard library isn't a coherently designed entity. It's a
collection of functionality from historical implementations,
consisting largely of whatever seemed like a good idea at the time,
filtered through the standards committee. ...

That is far from arbitrary. The evolution of C library
functions was substantially influenced by the demands of
practical programming, and many of the interfaces went
through several iterations in the early years of C, as
deficiencies in earlier versions were identified. The C
standards committee quite reasonably chose to standardize
existing interfaces rather than try to design totally new
ones. Many of the standard interfaces are not at all
what we would come up with in a new design.
 

Default User

Michael Wojcik wrote:

Tangentially, I might note that the reason I originally replied to
Christopher's post was that I feared he might believe that Brian's
opinion represented a consensus. It does not. (Should Christopher
choose to shape his behavior to it anyway, that's his business.)

You have no idea whether it represents a consensus or not. A
"consensus" is not necessarily complete unanimity.



Brian
 

websnarf

Keith said:
I don't think anyone has posted the real reason: it's arbitrary. The
C standard library isn't a coherently designed entity. It's a
collection of functionality from historical implementations,
consisting largely of whatever seemed like a good idea at the time,
filtered through the standards committee. Just look at the continuing
existence of gets(), or the design of <time.h>.

It's remarkable (and a tribute to the original authors and to the
committee) that the whole thing works as well as it does.

When you look at the world through rose colored glasses ...

Remember that almost every virus, buffer overflow exploit, core
dump/GPF/etc is basically due to some undefined situation in the ANSI C
standard. I consider the ANSI C standard committee basically coauthors
of every one of these problems.
 

Randy Howard

websnarf wrote:
When you look at the world through rose colored glasses ...

Well, at least some seem to have their eyes fully open.
Remember that almost every virus, buffer overflow exploit, core
dump/GPF/etc is basically due to some undefined situation in the ANSI C
standard.

Not really. Those that defined early C, and later standard C
are not responsible for bad programming. If a programmer has
access to the standard (which they do), and they decide to do
something which 'invokes undefined behavior', then it is their
fault. The standard says do not do that, and they did it
anyway.
I consider the ANSI C standard committee basically coauthors
of every one of these problems.

I couldn't disagree more. If programmers themselves were held
responsible for their mistakes, instead of trying to blame it on
loopholes or missing words in a huge document, we would be much
better off. If you could be fined or perhaps even jailed for
gross negligence in software development the way doctors can be
today, I suspect the problem would be all but nonexistent.
 

websnarf

Randy said:
Well, at least some seem to have their eyes fully open.


Not really. Those that defined early C, and later standard C
are not responsible for bad programming.

Bad programming + good programming language does not allow for buffer
overflow exploits. You still need a bad programming language to
facilitate the manifestation of these worst case scenarios.
[...] If a programmer has access to the standard (which they
do), and they decide to do something which 'invokes undefined
behavior', then it is their fault. The standard says do not
do that, and they did it anyway.

Ok, this is what I was talking about when I mentioned rose colored
glasses. If programmers are perfect, then what you are saying is fine,
because you can expect perfection. But real people are not. And I
think expectations of perfection in programming is really nonsensical.

Remember NASA put a priority inversion (a truly nasty bug to deal with)
in the Mars Pathfinder. The Ariane rocket blew up because of an
overflow triggering an interrupt handler that was faulty. You think
the programmers for these projects were not trying their best to do a
good job? Perfect programmers/programming is a pipe dream. There is a
reason we paint lines on the roads, wear seatbelts, put guardrails on
stairs and bridges.

The problem of programmer safety can be attacked quite successfully at
the level of the programming language itself. There isn't actually a
downside to removing gets() and deprecating strtok and strnc??. (Hint:
Legacy code uses legacy compilers.)
I couldn't disagree more. If programmers themselves were held
responsible for their mistakes, instead of trying to blame it on
loopholes or missing words in a huge document, we would be much
better off.

And what if it's not the programmer's fault? What if the programmer is
being worked to death? What if he's in a dispute with someone else
about how something should be done and lost the argument and was forced
to do things badly?
[...] If you could be fined or perhaps even jailed for
gross negligence in software development the way doctors can be
today, I suspect the problem would be all but nonexistent.

Ok, that's just vindictive nonsense. Programmers are generally not
aware of the liability of their mistakes. And mistakes are not
completely removable -- and there's a real question as to whether the
rate can even be reduced.

But if you were to truly enforce such an idea, I believe both C and C++
as programming languages would instantly disappear. Nobody in their
right mind, other than the most irresponsible daredevils would program
in these languages if they were held liable for their mistakes.
 

Randy Howard

websnarf wrote:
Bad programming + good programming language does not allow for buffer
overflow exploits.

For suitably high-level languages that might be true (and
provable). Let us not forget that C is *not* a high-level
language. It's not an accident that it is called high-level
assembler.

I'd love for you to explain to us, by way of example, how you
could guarantee that assembly programmers can not be allowed to
code in a way that allows buffer overflows.
You still need a bad programming language to
facilitate the manifestation of these worst case scenarios.

If you wish to argue that low-level languages are 'bad', I will
have to disagree. If you want to argue that too many people
write code in C when their skill level is more appropriate to a
language with more seatbelts, I won't disagree. The trick is
deciding who gets to make the rules.
[...] If a programmer has access to the standard (which they
do), and they decide to do something which 'invokes undefined
behavior', then it is their fault. The standard says do not
do that, and they did it anyway.

Ok, this is what I was talking about when I mentioned rose colored
glasses. If programmers are perfect, then what you are saying is fine,
because you can expect perfection. But real people are not. And I
think expectations of perfection in programming is really nonsensical.

/Exactly/ Expecting zero buffer overruns is nonsensical.
Remember NASA put a priority inversion (a truly nasty bug to deal with)
in the Mars Pathfinder. The Ariane rocket blew up because of an
overflow triggering an interrupt handler that was faulty. You think
the programmers for these projects were not trying their best to do a
good job?

No, I do not. I expect things to go wrong, because humans are
not infallible. Especially in something as inherently difficult
as space travel. It's not like you can test it (for real)
before you try it for all the marbles. You can't just hire an
army of monkeys to sit in a lab beating on the keyboard all day
like an application company.

Anyway, a language so restrictive as to guarantee that nothing
can go wrong will probably never be used for any real-world
project.
Perfect programmers/programming is a pipe dream.

So is the idea of a 'perfect language'.
There is a
reason we paint lines on the roads, wear seatbelts, put guardrails on
stairs and bridges.

Yes. And we require licenses for dangerous activities
elsewhere, but anyone can pick up a compiler and start playing
around.
The problem of programmer safety can be attacked quite successfully at
the level of the programming language itself.

It's quite easy to simply make the use of gets() and friends
illegal for your code development. Most of us have already done
so, without a standard body telling us to do it.
There isn't actually a downside to removing gets() and deprecating
strtok and strnc??. (Hint: Legacy code uses legacy compilers.)

Hint: Legacy code doesn't have to stay on the original platform.
Even so, anyone dusting off an old program that doesn't go
sifting through looking for the usual suspects is a fool.

I don't have a problem with taking gets() out of modern
compilers, but as you already pointed out, this doesn't
guarantee anything. People can still fire up an old compiler
and use it. I don't see a realistic way for the C standard to
enforce such things.
And what if it's not the programmer's fault?

It is the fault of the development team, comprised of whoever
that involves for a given project. If the programmer feels like
his boss screwed him over, let him refuse to continue, swear out
an affidavit and have it notarized the bad software was
knowingly shipped, and that you refuse to endorse it.
What if the programmer is being worked to death?

That would be interesting, because although I have worked way
more than my fair share of 120 hour weeks, I never died, and
never heard of anyone dying. I have heard of a few losing it
and checking themselves into psycho wards, but still. If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up. ESPECIALLY
in this case, the C standard folks are not to blame.
What if he's in a dispute with someone else
about how something should be done and lost the argument and
was forced to do things badly?

Try and force me to write something in a way that I know is
wrong. Go ahead, it'll be a short argument, because I will
resign first.

Try and force a brain surgeon to operate on your head with a
chainsaw. Good luck.
[...] If you could be fined or perhaps even jailed for
gross negligence in software development the way doctors can be
today, I suspect the problem would be all but nonexistent.

Ok, that's just vindictive nonsense.

Why? We expect architects, doctors, lawyers, pretty much all
other real 'professions' to meet and typically exceed a higher
standard, and those that do not are punished, fined, or stripped
of their license to practice in the field. Why should
programmers get a pass? Is it because you do not feel it is a
professional position?

We don't let just anyone who wants to prescribe medicine do so; why
should we let anyone who wants to put software up for download which
could compromise system security?
Programmers are generally not aware of the liability of
their mistakes.

Then those you refer to must be generally incompetent. Those
that are good certainly are aware, especially when the software
is of a critical nature.
And mistakes are not completely removable --

Correct. It's also not possible to completely remove medical
malpractice, but it gets punished anyway. It's called a
deterrent.
and there's a real question as to whether the rate can even be reduced.

As long as there is no risk of failure, it almost certainly will
not be reduced by magic or wishing.
But if you were to truly enforce such an idea, I believe both C and C++
as programming languages would instantly disappear.

I highly doubt that. Low-level language programmers would be
the cream of the crop, not 'the lowest bidder' as is the case
today. You would not be hired to work based upon price, but on
skill. Much as I would go look for the most expensive attorney
I could find if I was on trial, I would look for the most highly
skilled programmers I could find to work on a nuclear reactor.

Taking bids and outsourcing to some sweatshop in a jungle
somewhere would not be on the list of options.
Nobody in their right mind, other than the most irresponsible
daredevils would program in these languages if they were held
liable for their mistakes.

I guess all the professionals in other fields where they are
held up to scrutiny must be irresponsible daredevils too. For
example, there are operations that have very low success rates,
yet there are doctors that specialize in them anyway, despite
the low odds.

If you don't want to take the risk, then go write in visual
whatever#.net and leave it to those that are.
 

Chris McDonald

I'd love for you to explain to us, by way of example, how you
could guarantee that assembly programmers can not be allowed to
code in a way that allows buffer overflows.

......

/Exactly/ Expecting zero buffer overruns is nonsensical.

......

Anyway, a language so restrictive as to guarantee that nothing
can go wrong will probably never be used for any real-world
project.


I struggle to parse your first sentence, but what if assembly language
programmers were "required" to program in an assembly language whose
program structure could be strongly verified at runtime (aka JVM bytecodes)?

Or would that be against the spirit of an assembly language, and the
discussion?

</getting-way-OT>
 

websnarf

Randy said:
For suitably high-level languages that might be true (and
provable). Let us not forget that C is *not* a high-level
language. It's not an accident that it is called high-level
assembler.

Right. If you're not with us, you are with the terrorists.

Why does being a low-level language mean you have to present a programming
interface surrounded by landmines? Exposing a sufficiently low-level
interface may require that you expose some dangerous semantics, but why
expose them up front, right in the most natural paths of usage?
I'd love for you to explain to us, by way of example, how you
could guarantee that assembly programmers can not be allowed to
code in a way that allows buffer overflows.

Ok, the halting problem means basically nobody guarantees anything
about computer programming.

But it's interesting that you bring up the question of assembly
language. If you peruse the x86 assembly USENET newsgroups, you will
see that many people are very interested in expanding the power and
syntax for assembly language (examples include HLA, RosAsm, and
others). A recent post talked about writing a good string library for
assembly, and there was a strong endorsement for the length prefixed
style of strings, including one direct reference to Bstrlib as a design
worth following (not posted by me!).

So, while assembly clearly isn't an inherently safe language, it seems
quite possible that some assembly efforts will have a much safer (and
much faster) string interface than C does.
If you wish to argue that low-level languages are 'bad', I will
have to disagree.

So why put those words in my mouth?
[...] If you want to argue that too many people
write code in C when their skill level is more appropriate to a
language with more seatbelts, I won't disagree. The trick is
deciding who gets to make the rules.

But I'm not arguing that either. I am saying C is to a large degree
just capriciously and unnecessarily unsafe (and slow, and powerless,
and unportable, etc., etc.).
[...] If a programmer has access to the standard (which they
do), and they decide to do something which 'invokes undefined
behavior', then it is their fault. The standard says do not
do that, and they did it anyway.

Ok, this is what I was talking about when I mentioned rose colored
glasses. If programmers are perfect, then what you are saying is fine,
because you can expect perfection. But real people are not. And I
think expectations of perfection in programming is really nonsensical.

/Exactly/ Expecting zero buffer overruns is nonsensical.

Well, not exactly. If you're not using C or C++, then buffer overflows
usually at worst lead to a runtime exception; in C or C++, exploits are
typically designed to gain shell access in the context of the erroneous
program. It's like honey for bees -- people attack C/C++ programs
because they have this weakness. In other, safer programming languages,
even if you had a buffer overflow, a control-flow zombification of the
program is typically not going to be possible.
No, I do not. I expect things to go wrong, because humans are
not infallible. Especially in something as inherently difficult
as space travel.

Space travel itself was not the issue, and it wasn't any more
complicated than any kind of industrial device manager (as you might
find in an automated assembly line). The real problem is that priority
inversions are *nasty*. Each component can be unit tested and
validated to work properly in isolation -- the problem only appears when
you put them together and they encounter a specific scenario. It's just
a very sophisticated deadlock.
[...] It's not like you can test it (for real)
before you try it for all the marbles. You can't just hire an
army of monkeys to sit in a lab beating on the keyboard all day
like an application company.

Hmm ... I don't think that's quite it. The problem is that the
scenario, which I don't recall all the details of, was something that
was simply unaccounted for in their testing. This is a problem in
testing in general. Line by line coverage, unit testing, and other
forms of typical testing really only find the most obvious bugs.

They were able to save the Pathfinder, because VxWorks allows you to
reboot into a shell or debug mode, and they were able to patch the code
remotely. The point of this being that in the end they were lucky to
have very sophisticated 3rd party support that is well beyond anything
that the C standard delivers.
Anyway, a language so restrictive as to guarantee that nothing
can go wrong will probably never be used for any real-world
project.

How about a simpler language that is more powerful, demonstrably faster,
more portable (dictionary definition), obviously safer and still just
as low level? Just take the C standard, deprecate the garbage, replace
a few things, genericize some of the APIs, well define some of the
scenarios which are currently described as undefined, make some of the
ambiguous syntaxes that lead to undefined behavior illegal, and you're
immediately there. If these steps seem too radical, just draw a line
from where you are and where you need to go, and pick an acceptable
point in between.

Your problem is that you assume making C safer (or faster, or more
portable, or whatever) will take something useful away from C that it
currently has. Think about that for a minute. How is it possible that
your mind can be in that state?
So is the idea of a 'perfect language'.

But I was not advocating that. You want punishment -- so you
implicitly are *demanding* programmer perfection.
Yes. And we require licenses for dangerous activities
elsewhere, but anyone can pick up a compiler and start playing
around.


It's quite easy to simply make the use of gets() and friends
illegal for your code development. Most of us have already done
so, without a standard body telling us to do it.

So, estimate the time taken to absorb this information per programmer,
multiply it by the average wage of that programmer, multiply that by
the number of programmers that follow it, and there you get the cost
of doing it correctly. Add to that the cost of downtime for those that
get it wrong. (These are costs per year, of course -- since it's an
ongoing problem, the total cost would really be infinite.)

The standards body just needs to remove it, and those costs go away.
Vendors and legacy defenders and pure idiot programmers might get their
panties in a bunch, but no matter how you slice it, the cost of doing
this is clearly finite.
Hint: Legacy code doesn't have to stay on the original platform.

Hint: moving code *ALWAYS* incurs costs. As I said above, it's a
*finite* cost. You don't think people who move code around with calls
to gets() in it should remove them?
Even so, anyone dusting off an old program that doesn't go
sifting through looking for the usual suspects is a fool.

And an old million line program? I think this process should be
automated. In fact, I think it should be automated in your compiler.
In fact I think your compiler should just reject these nonsensical
functions out of hand and issue errors complaining about them. Hey! I
have an idea! Why not remove them from the standard?
I don't have a problem with taking gets() out of modern
compilers, but as you already pointed out, this doesn't
guarantee anything. People can still fire up an old compiler
and use it. I don't see a realistic way for the C standard to
enforce such things.

Interesting -- because I do. You make gets a reserved word, not
redefinable by the preprocessor, and have it always lead to a syntax
error. This forces legacy code owners to either remove it, or stay
away from new compilers.

This has value because developers can claim to be "C 2010 compliant"
or whatever, and that tells you the code doesn't have gets()
or any other wart that you decided to get rid of. This would in turn
put pressure on the legacy code owners to remove the offending calls,
in an effort that's certainly no worse than the Y2K issue (without the
looming deadline hanging over their heads).
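
For what it's worth, GCC already offers a mechanism in this spirit:
its "poison" pragma turns any later use of a named identifier into a
hard compile-time error. A sketch (GCC-specific; other compilers would
need their own arrangements):

    #include <stdio.h>
    #pragma GCC poison gets

    int main(void)
    {
        char buf[80];
        /* gets(buf);  -- would no longer compile */
        if (fgets(buf, sizeof buf, stdin) != NULL)
            printf("read: %s", buf);
        return 0;
    }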
It is the fault of the development team, comprised of whoever
that involves for a given project. If the programmer feels like
his boss screwed him over, let him refuse to continue, swear out
an affidavit and have it notarized the bad software was
knowingly shipped, and that you refuse to endorse it.

Oh I see. So, which socialist totally unionized company do you work as
a programmer for? I'd like to apply!
That would be interesting, because although I have worked way
more than my fair share of 120 hour weeks, I never died, and
never heard of anyone dying. I have heard of a few losing it
and checking themselves into psycho wards, but still.

Well ... they usually put in buffer overflows, backdoors, or otherwise
sloppy code before they check into these places.
[...] If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up.

Hmmm ... so you live in India? I'm trying to guess where it is in this
day and age that you can just quit your job solely because you don't
like the pressures coming from management.
[...] ESPECIALLY in this case, the C standard folks are not to blame.

But if the same issue happens and you are using a safer language, the
same kinds of issues don't come up. Your code might be wrong, but it
won't allow buffer overflow exploits.
Try and force me to write something in a way that I know is
wrong. Go ahead, it'll be a short argument, because I will
resign first.

That's a nice bubble you live in. Or is it just in your mind?
Try and force a brain surgeon to operate on your head with a
chainsaw. Good luck.
[...] If you could be fined or perhaps even jailed for
gross negligence in software development the way doctors can be
today, I suspect the problem would be all but nonexistent.

Ok, that's just vindictive nonsense.

Why? We expect architects, doctors, lawyers, pretty much all
other real 'professions' to meet and typically exceed a higher
standard, and those that do not are punished, fined, or stripped
of their license to practice in the field. Why should
programmers get a pass? Is it because you do not feel it is a
professional position?

Because it's not as structured, and that's simply not practical.
Doctors have training, internships, etc. Lawyers have to pass a bar
exam, etc. There's no such analogue for computer programmers. The
most successful programmers are always the ones able to think
outside the box, while the bar for average programmers is pretty low --
but both can make a contribution, and neither can guarantee perfect
code.
We don't let just anyone who wants to prescribe medicine do so; why
should we let anyone who wants to put software up for download which
could compromise system security?


Then those you refer to must be generally incompetent.

Dennis Ritchie had no idea that NASA would put a priority inversion in
their pathfinder code. Linus Torvalds had no idea that the NSA would
take his code and use it for a security based platform. My point is
that programmers don't know what the liability of their code is,
because they are not always in control of when or where or for what it
might be used.

The recent JPEG parsing buffer overflow exploit, for example, came from
failed sample code from the JPEG website itself. You think we should
hunt down Tom Lane and lynch him?
[...] Those that are good certainly are aware, especially when
the software is of a critical nature.
And mistakes are not completely removable --

Correct. It's also not possible to completely remove medical
malpractice, but it gets punished anyway. It's called a
deterrent.

You don't think medical practitioners use the latest and safest
technology available to practice their medicine?
As long as there is no risk of failure, it almost certainly will
not be reduced by magic or wishing.

This is utter nonsense. The reason for the success of languages like
Java and Python is not their speed, you know.
I highly doubt that. Low-level language programmers would be
the cream of the crop, not 'the lowest bidder' as is the case
today.

You still don't get it. You, I, or anyone you know will produce errors
if pushed. There's no such thing as a 0 error rate for programming.
Just measuring first-time compile error rates myself, I score roughly
one syntax error per 300 lines of code. I take this as an indicator
of the likely number of hidden bugs I just don't know about in my
code. Unless my first-compile error rate were 0, I just can't have any
confidence that my hidden bug rate is 0. I know that
since using my own Bstrlib library and other similar mechanisms my
rate is probably far lower now than it's ever been. But it's still not 0.

Go measure your own first-compile error rate and tell me you are
confident in your own ability to avoid hidden bugs. If you still think
you can achieve a 0 or near 0 hidden bug rate, go look up "priority
inversion". No syntax checker and no run time debugger can tell you
about this sort of error. Your only chance of avoiding these sorts of
errors is having a very thoroughly vetted high level design.
[...] You would not be hired to work based upon price, but on
skill. Much as I would go look for the most expensive attorney
I could find if I was on trial, I would look for the most highly
skilled programmers I could find to work on a nuclear reactor.

Taking bids and outsourcing to some sweatshop in a jungle
somewhere would not be on the list of options.

For a nuclear reactor, I would also include the requirement that they
use a safer programming language like Ada. Personally I would be
shocked to know that *ANY* nuclear reactor control mechanism was
written in C. Maybe a low level I/O driver library, that was
thoroughly vetted (because you probably can't do that in Ada), but
that's it.
I guess all the professionals in other fields where they are
held up to scrutiny must be irresponsible daredevils too.

No -- they have great assistance and controlled environments that allow
them to perform under such conditions. Something akin to using a
better programming language.
[...] For
example, there are operations that have very low success rates,
yet there are doctors that specialize in them anyway, despite
the low odds.

Well, your analogy only makes some sense if you are talking about
surgeons in developing countries who simply don't have access to the
necessary anesthetic, support staff or even the proper education to do
the operation correctly. In those cases, there is little choice, so
you make do with what you have. But obviously it's a situation you just
want to move away from -- the way you solve it is to give them
access to safer and better ways to practice medicine.
If you don't want to take the risk, then go write in visual
whatever#.net and leave it to those that are.

So you want some people to stay away from C because the language is too
dangerous, while I want the language to be fixed so that most people
don't trigger the landmines in the language so easily. If you think
about it, my solution actually *costs* less.
 

Magnus Wibeck

> The point of this being that in the end they were lucky to
> have very sophisticated 3rd party support that is well beyond anything
> that the C standard delivers.

You surely cannot be comparing "3rd party support" from a commercial
company to a language standard? They have totally different purposes.
That's like comparing a specification of a car to a taxi company,
and complaining that if you sit on the specification it doesn't get you
anywhere, but if you call the taxi company they get you where you tell them to.
> For a nuclear reactor, I would also include the requirement that they
> use a safer programming language like Ada.

The Ariane software module that caused the problem was written in Ada.
http://sunnyday.mit.edu/accidents/Ariane5accidentreport.html
Had it been written in C, the actual cause (integer overflow) probably would not
have caused an exception. I'm not saying that it would have been better in
C, but you *cannot* blame the C standard for what happened there.

Also, this "priority inversion" you speak of - doesn't that imply processes
or threads? C does not have that AFAIK. So you cannot blame the C standard
for allowing priority inversion bugs to occur. It neither allows nor disallows
them, because C has no notion of priorities.

/Magnus
 

Richard Kettlewell

Bad programming + good programming language does not allow for
buffer overflow exploits. You still need a bad programming language
to facilitate the manifestation of these worst case scenarios.

Exploits that rely on C undefined behaviour are not the only kind of
problem in reality. Programs not written in C sometimes have serious
security problems too.

For example lots of software has had various kinds of quoting and
validation bugs - SQL injection, cross-site scripting, inadequate
shell quoting - for many years, and this is a consequence purely of
the program, and cannot be pinned on the language it is written in.

You won't spot these bugs with tools such as Valgrind or Purify,
either.
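
For instance, here is a sketch of an inadequate-shell-quoting bug in C
(show_file() and the command are illustrative). Nothing here is a
memory error, so memory-safety tooling has nothing to object to:

    #include <stdio.h>
    #include <stdlib.h>

    /* The filename is pasted into a shell command unquoted, so a
       crafted name such as "x; rm -rf $HOME" injects an extra
       command -- a bug reproducible in any language with this idiom. */
    void show_file(const char *filename)
    {
        char cmd[512];
        snprintf(cmd, sizeof cmd, "cat %s", filename);
        system(cmd);            /* the injection happens here */
    }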
 

websnarf

Magnus said:
You surely cannot be comparing "3rd party support" from a commercial
company to a language standard?

Originally I was making a point about the mistake rate of programmers.
But more generally, the C language probably has more "problem support
tools" than any language in existence, and this will probably continue
to be true for the future regardless of language mindshare.
[...] They have totally different purposes.
That's like comparing a specification of a car to a taxi company,
and complaining that if you sit on the specification it doesn't get you
anywhere, but if you call the taxi company they get you where you tell them
to.

Hmmm ... I'm not sure it's the same thing. For example, let's say C
added a function: numallocs(), that counted the number of memory
allocations that are outstanding (or the maximum number that could be
legally freed or whatever). Similarly, suppose the Boehm garbage collector
were adopted as part of the C standard (not that I'm advocating that).
If the C library were to basically abandon its string functions and use
something like Bstrlib, for example, then David Wagner's (and many
other) buffer overflow security analysis tools would be obsolete.
The Ariane software module that caused the problem was written in Ada.
http://sunnyday.mit.edu/accidents/Ariane5accidentreport.html
Had it been written in C, the actual cause (integer overflow) probably would
not have caused an exception. I'm not saying that it would have been better
in C, but you *cannot* blame the C standard for what happened there.

You are right, I cannot blame C for bugs that happen in other
languages. This is the most famous one from Ada. If you would like a
short list of infamous bugs for C just go through the CERT advisories
-- they are basically almost entirely C related.

See, the thing is, with Ada bugs, you can clearly blame the programmer
for most kinds of failures. With C you can go either way. But nearly
every software design house that writes lots of software in C just gets
bit by bugs from all sorts of edges of the language.
Also, this "priority inversion" you speak of - doesn't that imply processes
or threads? C does not have that AFAIK. So you cannot blame the C standard
for allowing priority inversion bugs to occur. It neither allows nor
disallows them, because C has no notion of priorities.

The programmer used priority based threading because that's what he had
available to him. Suppose, however, that C had implemented co-routines
(they require only barely more support than setjmp()/longjmp()). It
turns out that using coroutines alone, you can solve a lot of
multitasking problems. Maybe the Pathfinder code would have had more
coroutines and fewer threads, and may have avoided the problem
altogether (I am not privy to their source, so I really don't know).
This isn't just some weird snake-oil-style solution -- by their very
nature, coroutines do not have priorities, do not in and of themselves
make race conditions possible, and generally consume fewer resources
than threads.

Coroutines are one of those "perfect compromises", because you can
easily specify a portable interface that is very likely to be widely
supportable, they are actually tremendously faster than threading in
many cases, and all without adding *any* undefined or
implementation-defined behavior (other than a potential
inability to allocate new stacks). Full-blown multithreading, such as
in POSIX, is notoriously platform-specific, and it should not surprise
anyone that only a few non-UNIX platforms support full-blown POSIX
threads. This fact has been noticed and acted upon by those languages
where serious development is happening (Lua, Perl, Python). I don't
know if the C standards committee would be open to this -- I highly
doubt it.
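
For illustration, the control-flow idea can even be sketched in
portable C today, in the style popularized by Simon Tatham's
"Coroutines in C" essay (a stackless sketch only, not a proposed
standard interface):

    #include <stdio.h>

    /* The switch jumps back to where the previous call left off,
       so the for loop "yields" one value per call. */
    static int next_value(void)
    {
        static int state = 0;
        static int i;

        switch (state) {
        case 0:
            for (i = 0; i < 3; i++) {
                state = 1;
                return i;       /* yield i to the caller */
        case 1:;                /* the next call resumes here */
            }
        }
        state = 0;
        return -1;              /* sequence exhausted */
    }

    int main(void)
    {
        int v;
        while ((v = next_value()) != -1)
            printf("yielded %d\n", v);   /* prints 0, 1, 2 */
        return 0;
    }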
 

Chris Hills

why not?
You are right, I cannot blame C for bugs that happen in other
languages. This is the most famous one from Ada. If you would like a
short list of infamous bugs for C just go through the CERT advisories
-- they are basically almost entirely C related.


Possibly because C is more widely and less rigorously used? I would
expect that most Ada projects are high integrity and developed as such.
C is often not used (and certainly not taught) in a high-integrity
environment.
See, the thing is, with Ada bugs, you can clearly blame the programmer
for most kinds of failures.

AFAIK the Ariane problem was one of project management.
With C you can go either way. But nearly
every software design house that writes lots of software in C just gets
bit by bugs from all sorts of edges of the language.

So use a subset? Many industries do.
 

kuyper

So why put those words in my mouth?

He didn't - he's just pointing out that the characteristics you deplore
in C are inherent in C being a low-level language. Therefore, any
criticism of C for possessing those characteristics implies a criticism
of all low-level languages. You didn't actually make such a criticism,
but it was implied by the criticism you did make.

....
Your problem is that you assume making C safer (or faster, or more
portable, or whatever) will take something useful away from C that it
currently has. Think about that for a minute. How is it possible that
your mind can be in that state?

Possibly, possession of a certain minimal state of awareness of
reality? No one wants C to be unsafe, slow, or unportable. As a general
rule, the cost-free ways of making it safer, faster, and more portable
have already been fully exploited. Therefore, the remaining ways are
disproportionately likely to carry a significant cost.

This is simple economics: cost-free or negative-cost ways of improving
anything are usually implemented quickly. With any reasonably mature
system, the ways of improving the system that haven't been implemented
yet are disproportionately likely to carry a significant cost.

....
But I was not advocating that. You want punishment -- so you
implicitly are *demanding* programmer perfection.

By that logic, requiring punishment for theft implicitly demands human
perfection?

....
get it wrong. (These are costs per year, of course -- since it's an
ongoing problem, the total cost would really be infinite.)

You're failing to take into consideration the cost of capital. Costs
that take place in the future are less expensive in present-day dollars
than costs that take place in the present. The net present value of a
steady annual cost is finite, so long as the cost of capital is
positive.
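
(For concreteness, that's the standard perpetuity formula: a constant
annual cost $C$ discounted at a cost of capital $r > 0$ sums to

    $\mathrm{NPV} = \sum_{t=1}^{\infty} C/(1+r)^t = C/r$,

which is finite.)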

....
The standards body just needs to remove it, and those costs go away.
Vendors and legacy defenders and pure idiot programmers might get their
panties in a bunch, but no matter how you slice it, the cost of doing
this is clearly finite.

You're assuming that those programmers are idiots, instead of being
intelligent people who are actually aware of what the ongoing (i.e. by
your way of calculating things, infinite) costs of such a change will
be.
Interesting -- because I do. You make gets a reserved word, not
redefinable by the preprocessor, and have it always lead to a syntax
error. This forces legacy code owners to either remove it, or stay
away from new compilers.

How in the world does changing new compilers have any effect on people
who "fire up an old compiler and use it"?

....
Oh I see. So, which socialist totally unionized company do you work as
a programmer for? I'd like to apply!

What does socialism and unionism have to do with workers accepting full
responsibility for the quality of their product?
Well ... they usually put in buffer overflows, backdoors, or otherwise
sloppy code before they check into these places.

Backdoors are, by definition, installed deliberately. I suppose you
might have intended to imply that overworked programmers would install
backdoors as a way of getting revenge for being overworked, but if so,
you didn't express that idea properly.
[...] If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up.

Hmmm ... so you live in India? I'm trying to guess where it is in this
day and age that you can just quit your job solely because you don't
like the pressures coming from management.

I'm curious - what part of the world do you live in where you are
prohibited from quitting your job? I don't understand your reference to
India - are you suggesting that it is the only place in the world where
workers aren't slaves?

....
That's a nice bubble you live in. Or is it just in your mind?

I live in that same bubble. I'm free to quit my job for any reasons I
want to, at any time I want to. I would stop being paid, I'd have to
start searching for a new job at a better employer, and I'd have to pay
full price if I decided to use the COBRA option to continue receiving
the insurance benefits that my employer currently subsidizes, but those
are just consequences of my decision, not things that would prevent me
from making it. If I decide to obey orders to produce defective code, I
have to accept the consequences of being responsible for bad code. If I
prefer the consequences of having to look for a new job at a better
employer, that's precisely what I'll do. Wouldn't you?

....
Dennis Ritchie had no idea that NASA would put a priority inversion in
their pathfinder code. Linus Torvalds had no idea that the NSA would
take his code and use it for a security based platform. My point is
that programmers don't know what the liability of their code is,
because they are not always in control of when or where or for what it
might be used.

When you take someone else's code and use it in a context that it
wasn't designed for, the responsibility for adapting it to be suitable
for use in the new context is yours, not the original author's.
[...] For
example, there are operations that have very low success rates,
yet there are doctors that specialize in them anyway, despite
the low odds.

Well, your analogy only makes some sense if you are talking about
surgeons in developing countries who simply don't have access to the
necessary anesthetic, support staff or even the proper education to do
the operation correctly.

Which would you prefer: a life expectancy of three months, or a 30%
chance of increasing your life expectancy to 20 years, inextricably
linked with a 70% chance of dying in the operating room tomorrow? There
are real-life situations where the best doctors in the world, with the
best equipment in the world, can't offer you a choice that's any more
attractive than that one.
... In those cases, there is little choice, so
you make do with what you have. But obviously it's a situation you just
want to move away from -- the way you solve it is to give them
access to safer and better ways to practice medicine.

I suspect that no matter how advanced our medicine gets, there will
always be conditions that it's just barely able to deal with. The
longer we live, the harder it is to keep us living; that's pretty much
unavoidable.
 

Hallvard B Furuseth

Paul said:
Remember that almost every virus, buffer overflow exploit, core
dump/GPF/etc is basically due to some undefined situation in the
ANSI C standard. I consider the ANSI C standard committee
basically coauthors of every one of these problems.

So it's partly their fault? What should they have done -
refrained from standardizing the already existing C language?
That would not have helped: K&R C was already widely used, and
people were cooperating anyway to get some sort of portability out
of it.

Or should they have removed every undefined situation from the
language? Bye bye free() and realloc() - require a garbage
collector instead. To catch all bad pointer usage, insert
type/range information in both pointers and data. Those two
changes alone in the standard would change the C runtime
implementation so much that it's practically another language.

An _implementation_ which catches such things can be nice when you
already have a C program which you want to run safely. But if the
language standard itself made such requirements, a number of the
reasons that exist to choose C for a project would not be there.

If one is going to use another language than C, it's better to use
a language which takes advantage of not being C, instead of a
language which pretends to be C but isn't.
 

Chris Torek

[off-topic drift, but I cannot resist...]

You feel that I my choice of moniker reflects something about my level
of expertise? Note that "Default User" is NOT the default name in
XanaNews, my current newsreader.

I always figured it meant that you are known for requiring a
"default:" label in every switch(). :)
 

Chris Torek

(Again, quite off-topic, but ...)

[Ariane rocket example]

You are right, I cannot blame C for bugs that happen in other
languages. This is the most famous one from Ada. ...
See, the thing is, with Ada bugs, you can clearly blame the programmer
for most kinds of failures.

I am reminded of a line from a novel and movie:

"*We* fix the blame. *They* fix the problem. Their way's better."

[Pathfinder example]
The programmer used priority based threading because that's what he had
available to him.

Actually, the Pathfinder used vxWorks, a system with which I am
now somewhat familiar. (Not that I know much about versions
predating 6.0, but this particular item has been this way "forever",
or long enough anyway.)

The vxWorks system offers "mutex semaphores" as one of its several
flavors of data-protection between threads. The mutex creation
call, semMCreate(), takes several flag parameters. One of these
flags controls "task" (thread, process, whatever moniker you prefer)
priority behavior when the task blocks on the mutex.

The programmer *chose* this behavior, because vxWorks does offer
priority inheritance. (Admittedly, vxWorks priority inheritance
has a flaw, but that is a different problem.)

Thus, your premise -- that the programmer used priority based
scheduling (without inheritance), which led to the priority inversion
problem, "because that's what he had available" -- is incorrect: he
could have chosen to make all the threads the same priority, and/or
used priority inheritance, all with simple parameters to the various
calls (taskSpawn(), semMCreate(), and so on).
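
A sketch of that choice, from memory of the vxWorks semLib API (verify
against semLib.h on a real target; dataLock is a hypothetical name):

    #include <semLib.h>

    SEM_ID dataLock;

    void initDataLock(void)
    {
        /* SEM_INVERSION_SAFE requests priority inheritance and must
           be combined with SEM_Q_PRIORITY. */
        dataLock = semMCreate(SEM_Q_PRIORITY | SEM_INVERSION_SAFE);
    }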
Coroutines are one of those "perfect compromises" ...

Coroutines are hardly perfect. However, if you like them, I suggest
you investigate the Icon programming language, for instance.
 
