Why C is really a bad programming language


Rafael Anschau

To lapse into prose: people with narcissistic personality disorders,
like you, are instinctively Platonists. They form the Idea of the
"good" programmer and the "good" manager but reality is always rubbing
their nose in failure.

I disagreed with much of what you have said until now. If it serves
you as advice: it seems like many people around here aren't ready
for humility. I definitely don't expect that anymore. They will be
arrogant and will put their self-image above anything else. That's
just a fact.
 

Keith Thompson

Richard Heathfield said:
Ben Bacarisse said:


Then would someone please implement "99 bottles" therein?
http://99-bottles-of-beer.net/ (what else?) is my reference site for
obscure languages, and there is no F entry. This threw me.

My own special-purpose language "99" is there, joining Ada, B, C,
D, e, and F among the rare languages whose names are hexadecimal
palindromes.
 

Richard Bos

jellybean stonerfish said:
It doesn't really matter. He needs no encouragement.

That is highly debatable. If certain hypocrites didn't do their very
arsefucking best to keep him here, he'd be off pestering somewhere else,
and leave this group alone.

Richard
 

luserXtrog

Well, you start with two rather juvenile categories, but good for you.
Yes, I shall be patronizing. For let us not speak falsely now.

Don't you even start on ScientiFICtion.

8{>
 

spinoza1111

I disagreed with much of what you have said until now. If it serves
you as advice: it seems like many people around here aren't ready
for humility. I definitely don't expect that anymore. They will be
arrogant and will put their self-image above anything else. That's
just a fact.

It hasn't always been this way. Early programmers were collegial and
learned to review code without getting into personalities. FT Baker of
IBM, for example, enforced this as a rule and the result has been that
the New York Times has used his system for almost thirty years to keep
electronic copies of stories.

But in my experience, it takes only one "bad apple", and I think
that's Richard Heathfield. Like some creep in an office, he forces
people to respond in kind, entrapping them by questioning their global
competence based on one error or some lines of code they suggest here.

This childish behavior ruined the "safety culture" of NASA and got two
Space Shuttle crews killed, because engineers at NASA under Mike
Griffin are afraid to speak up in meetings lest the "Richard
Heathfield" engineer in the meeting respond to their complaints (about
heat shield tiling falling off or alloy performance) not with
counterarguments but by reference to "standards" and, when that fails,
by trashing the engineer's competence.

"Standards" at NASA in 2003 included "standards" that positively
stated that certain problems, such as heat shield tiles falling off on
liftoff, were not to be addressed because they'd been classified as
"known, but not known to affect safety". This strongly reflects the
way in which Heathfield uses the C standard. He uses it like a sleazy
lawyer, or shipping company operatives here in Hong Kong, because
lawyers make reference to law in order to evade duty and shippers make
sure that all "documents" are filed while sailing polluting and unsafe
ships in regions where they kill wildlife.

Fortunately, NASA is not hiring C programmers right now.
 

spinoza1111

Many people here have concluded that the test was flawed, that one of the
questions' answers depended on the absence of a sequence point, and above
all that C++ != C.

And after that, we found out that you did not even know the concept of
sequence point. (no, thank you, I would *not* buy a compiler from you)

That concept post-dated my use of C. I believe it was developed to
make existing C compilers from powerful manufacturers "compliant". It
didn't exist in original C, and original C is what is commonly
understood to constitute C, not the standard.

The authors of certain runtimes had not been competent to evaluate
expressions using post-increment (etc.) properly so the standard was
adjusted to make those buggy compilers standard.
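Whatever one thinks of their origin, sequence points have concrete
consequences. A minimal sketch (standard C; the function name is my own,
invented for illustration) of the kind of expression the rule governs,
and its well-defined rewrite:

```c
/* Both  i = i++ + 1;  and  a[i] = i++;  are undefined in C: i is
 * modified and read again with no intervening sequence point, so a
 * conforming compiler may produce anything at all.  The well-defined
 * rewrite simply orders the two accesses explicitly. */
int store_then_advance(int a[], int i)
{
    a[i] = i;   /* read i, store the value */
    i++;        /* then modify i           */
    return i;   /* the new index           */
}
```

The point of the rule is exactly that the commented-out forms have no
single correct answer, while the rewrite has only one.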

The C standardization effort was a criminal venture to see to it that
companies could continue to make money, and evidence for that is the
amount of bad feeling and savagery that emerged from it. The winners,
judging from their document on Schildt, aren't competent programmers
but have masqueraded as such ever since.
 

spinoza1111

spinoza1111 said:



Oh, we taught you all right. If you learned nothing, it was not for
the want of teaching.

No, you did not.

My libel suit is in the works, Richard.
 

Rafael Anschau

This childish behavior ruined the "safety culture" of NASA and got two
Space Shuttle crews killed, because engineers at NASA under Mike
Griffin are afraid to speak up in meetings

<snip>

Is there any evidence of the things you talk about?

You seem to be performing the conspiracy theorist's fallacy: "Of course
there's no evidence! That proves how good they are at hiding the
evidence!"


Rafael
 

spinoza1111

Rafael Anschau said:

Cf Diane Vaughan, THE CHALLENGER LAUNCH DECISION, Univ of Chicago
Press 1999

Cf 747, by Joe Sutter, former chief engineer of the Boeing 747. He was
asked by President Reagan to serve on the panel that reviewed
Challenger. He concurs with Diane Vaughan that NASA's safety culture
was broken. My observation: it was broken at the same time for the
same reason at Bell-Northern Research. The result? Northern Telecom,
its parent, was milked by its executives and today, former BNR
employees are dying in motels without pensions or medical
insurance...because it abandoned software reliability and the
customers abandoned it.

Cf the official report on the Columbia disaster of 2003. The broken
safety culture had persisted.

Sutter in particular sums it up. On the 747 project, meeting FAA
guidelines was a baseline, not a target, at Boeing, and that is why the
747 is the safest airplane in the skies, with forty years of flying.
He said that at NASA, "safety" consisted in satisfying a baseline.

Likewise, C's "safety" only satisfies a baseline. At any time a C
program may be altered by the original programmer to create unsafe
code WHILE STILL BEING C. This is NOT the case with C Sharp or Java.
There's a bright line to cross in the latter case, in C Sharp's case
the line between managed and unmanaged code. In C, the boundary is as
hard to find as the boundary between Canada and the US in North
Dakota.
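The claim about the missing boundary can be made concrete. In the sketch
below (standard C; the function names and buffer size are invented for
the example), the unbounded and bounded copies are both conforming
programs that compile without a diagnostic; nothing in the language marks
the line between them:

```c
#include <stdio.h>
#include <string.h>

/* If dst holds 8 bytes, strcpy(dst, src) is valid C for ANY src and
 * overflows silently the day src grows to 8 characters or more.
 * Both versions remain conforming C throughout. */
void copy_unbounded(char *dst, const char *src)
{
    strcpy(dst, src);                 /* fine until it isn't        */
}

void copy_bounded(char *dst, size_t cap, const char *src)
{
    snprintf(dst, cap, "%s", src);    /* truncates, never overflows */
}
```

A C# or Java program has to announce the crossing (unsafe blocks, JNI);
here the only difference is one function call.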
 

spinoza1111

Now, this is a more effective troll post.  A definite point, an
attention-grabbing headline, and just enough details to make it look
legitimate.  Well done.

Gee, it looks...real...MOM!

Perhaps it IS real?

How could I pretend to write this shit? Isn't the pretense the same as
the writing? If I can quote Shakespeare apropos of programming by
remembering lines that fit, because I read him, is that being a
pretentious little twit, a real smart guy, or both?

What's Hecuba to him, or he to Hecuba,
That he should weepe for her? What would he doe,
Had he the Motiue and the Cue for passion
That I haue?

What's he to Hecuba, and who is lying to themself and others here? And
why is it so necessary for people here to deny others' realities?

Seemes Madam? Nay, it is: I know not Seemes:
'Tis not alone my Inky Cloake (good Mother)
Nor Customary suites of solemne Blacke,
Nor windy suspiration of forc'd breath,
No, nor the fruitfull Riuer in the Eye,
Nor the deiected hauiour of the Visage,
Together with all Formes, Moods, shewes of Griefe,
That can denote me truly. These indeed Seeme,
For they are actions that a man might play:
But I haue that Within, which passeth show;
These, but the Trappings, and the Suites of woe.

People crawl in here with genuine questions, needs and feelings even
as Dostoevsky's beast does in The Tale of the Grand Inquisitor. But
when those questions, needs and feelings overflow the rule of cool,
where "cool" is male character armor, they are called trolls, which
means that they lie. Working with John "A Beautiful Mind" Nash showed
me that there's always more than meets the eye. Hamlet's Mom thought
he was faking grief but Hamlet knew not seems. Even Musatov has a
story.

What I cannot fake is writing decent C code that Heathfield likes, but
I don't wish to do this. We have a saying here in Hong Kong, which
roughly translated means: "it's not easy to get stupid".
 

Guest

| In 1986 I got a job doing software for real estate appraisal. I felt
| guilty because I spent the first couple of weeks rolling string
| handlers. I now believe this is unethical behavior and malpractice,
| today, if it wasn't then.

Many years ago I took over a project that was failing. It was a web site
that searched for certain data in a database, along with referenced files.
The project was originally being done in Perl. Five months of development
resulted in a horribly slow experience and a terrible user interface.
I (a network engineer) was in a meeting where the project came up, and the
accounting people who had requested the project were commenting on the
user interface being ugly. I said I could make a better one, and they
said they were interested. I produced a prototype page in hand-coded HTML
by the end of the day and they loved it. At that point the conversation
(no longer in the meeting) moved over to developing the web application. I
had already had a bad experience with Perl and said that if I were to do it,
it would only be in C. They said to give it a try.

So at this point I recognized that I needed to roll my own string handlers
as well as a container, which I implemented as an AVL tree. I also found
out that the entire volume of data being searched was actually small and
infrequently updated, so I changed the lookup logic to skip accessing the
database directly, and just access a memory mapped table which a cron job
downloaded from the database three times a day. Within 3 weeks I had a
page that was working perfectly and was giving instant responses (on the
company LAN) with no perceptible delay, compared to 2-5 MINUTE responses
typical of the previous implementation (apparently due to accessing the
database too many times, and driving up swap thrashing due to so much
unshared data because each instance was getting the data itself).

So yeah, C sucks. It made me finish the project in 3 weeks (plus 3 more
weeks of rigorous testing). It made me give instant responses since all
the data (about 140MB) could be memory mapped read-only and shared between
processes on the 256MB machine. And I had to roll my own implementations
of string handlers and AVL binary tree.
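The memory-mapping trick described above can be sketched in a few lines
of POSIX C (the function name and error handling are my own; this
assumes a POSIX system with mmap, which the post's Unix setting
suggests but does not state):

```c
#include <fcntl.h>
#include <stddef.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Map a data file read-only.  PROT_READ + MAP_SHARED lets every
 * server process share one physical copy of the pages, so a 140MB
 * table costs 140MB once, not per process.  Returns NULL on error. */
static const char *map_table(const char *path, size_t *len)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;

    struct stat st;
    if (fstat(fd, &st) < 0) {
        close(fd);
        return NULL;
    }

    const char *p = mmap(NULL, (size_t)st.st_size, PROT_READ,
                         MAP_SHARED, fd, 0);
    close(fd);                     /* the mapping survives the close */
    if (p == MAP_FAILED)
        return NULL;

    *len = (size_t)st.st_size;
    return p;
}
```

A cron job can rewrite the file and the next map picks up the new data,
which matches the thrice-daily refresh scheme described above.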

No doubt a GOOD programmer experienced with another language could do it
in the language they are familiar with, do it right, and make it work
well. A GOOD Perl programmer could do it. The problem is languages like
Perl also attract BAD programmers while C tends to be a barrier to entry
that keeps most of those programmers out.


| Even if you wrote the handlers long ago, you're still forcing the
| customer to use proprietary approaches that may fail. Richard says his
| string handlers work with EBCDIC, but hasn't told me how he would test
| this assertion, since the only hardware that supports EBCDIC is IBM
| big iron.

Do a code review and see if you find any ASCII dependencies. OTOH, I do
not really care (anymore ... I used to back when I did 360/370 assembler)
about EBCDIC.


| I wrote a big library for old Visual Basic to replicate functionality
| I'd learned in Rexx including finding blank delimited words. I now
| think this was, in Shakespeare's words, a waste of spirit in an
| expense of shame.
|
| Don't use C. It doesn't make you clever and studly any more than wine
| makes Mummy clever.

Use what works for you. If you're not good at rolling your own, either get
tools already implemented and tested, or use a programming environment that
has them. If Java works for you, use Java. If Perl works for you, use Perl.
If PHP works for you, use PHP. If Pike works for you, use Pike. If Python
works for you, use Python.

C works for me ... not for everything ... but for many things. BTW, I have
re-implemented that AVL code (the original only mapped strings to strings),
and put together many other tools. It's called code reuse.
 

Antoninus Twink

If you accept what he says at face value, he is accusing me,
specifically, by name, of the manslaughter of fourteen people.

Said with all the righteous indignation of someone accused of robbing a
bank in Washington when at the time he was actually robbing a bank in
Baltimore.
 

lawrence.jones

Richard Heathfield said:
C does not mandate the use of 32-bit integers for time_t.
This is truly an implementation issue, and the fix is so easy that
implementors have no excuse for not fixing the problem right now.

Binary compatibility with existing code seems like a pretty good excuse
to me.
 

user923005

(e-mail address removed) said:



No. I suppose I'll have to concede that it's a reason - but it's
inexcusable, and therefore not an excuse.

If it's a reason, then it's a lame reason, because that reason assumes
all implementors use 32 bit integers for time_t and that they are
coded the same.

Here is what the standard says:
7.23 Date and time <time.h>
7.23.1 Components of time
1 The header <time.h> defines two macros, and declares several types
and functions for manipulating time. Many functions deal with a
calendar time that represents the current date (according to the
Gregorian calendar) and time. Some functions deal with local time,
which is the calendar time expressed for some specific time zone, and
with Daylight Saving Time, which is a temporary change in the
algorithm for determining local time. The local time zone and Daylight
Saving Time are implementation-defined.
2 The macros defined are NULL (described in 7.17); and
CLOCKS_PER_SEC
which expands to an expression with type clock_t (described below)
that is the number per second of the value returned by the clock
function.
3 The types declared are size_t (described in 7.17);
clock_t
and
time_t
which are arithmetic types capable of representing times; and
struct tm
which holds the components of a calendar time, called the broken-down
time.
4 The range and precision of times representable in clock_t and time_t
are implementation-defined. The tm structure shall contain at least
the following members, in any order. The semantics of the members and
their normal ranges are expressed in the comments.274)
int tm_sec; // seconds after the minute — [0, 60]
int tm_min; // minutes after the hour — [0, 59]
int tm_hour; // hours since midnight — [0, 23]
int tm_mday; // day of the month — [1, 31]
int tm_mon; // months since January — [0, 11]
int tm_year; // years since 1900
int tm_wday; // days since Sunday — [0, 6]
int tm_yday; // days since January 1 — [0, 365]
int tm_isdst; // Daylight Saving Time flag
The value of tm_isdst is positive if Daylight Saving Time is in
effect, zero if Daylight Saving Time is not in effect, and negative if
the information is not available.

Footnote 274) The range [0, 60] for tm_sec allows for a positive leap
second.

All that the standard says is that time_t is an arithmetic type
capable of representing times. Hence, we cannot make any assumptions
about its size or nature other than what the compiler tells us with
sizeof(time_t).
It would also be a mistake to assume that, even between iterations of a
compiler, clock_t, size_t, and time_t are the same size (just as
assuming that pointers are the same size with different compiler
versions is an error).
While it would be very nice for the compiler vendor to provide a
helper function for conversion of old types to new types, if our code
exhibits problems because we assume that the sizes of things never
changed that shows a distinct lack of foresight in our case.

Would we feel sorry for someone whose code broke because they assumed
all pointers fit into 32 bits? (Well, maybe, if we once assumed that
integers were always between -32768 and +32767.)
;-)
Anyway, assumptions about the size and content of a time_t, a size_t,
a clock_t, or the width of a function or data pointer are bad
assumptions. And if they prevent the upgrade of the compiler's time
types to something that won't cause devastating errors in the near
future, then the assumptions are whatever is worse than bad. Evil
{calamitous?}, perhaps?
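A minimal sketch of handling a time_t without assuming its width
(standard C; note that interpreting the value as "seconds since 1970"
is a POSIX convention the C standard itself does not guarantee):

```c
#include <inttypes.h>
#include <stdio.h>
#include <time.h>

/* Print a time_t without assuming it is a 32-bit integer: go through
 * intmax_t for an integral time_t, or through difftime(), which is
 * defined even if time_t turns out to be a floating type. */
void print_time_portably(time_t t)
{
    printf("raw value: %jd\n", (intmax_t)t);
    printf("seconds since t=0: %.0f\n", difftime(t, (time_t)0));
}
```

Code written this way keeps working when an implementation widens
time_t from 32 to 64 bits, which is exactly the upgrade under
discussion.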
 

Kaz Kylheku

Precisely which binary interfaces did you have in mind?

One is the binary interface between a program and a dynamically linked
C library, like say the time function.

But dynamic linking technology has now advanced into the 1970's,
at least on some operating systems. It's possible for a shared
library to expose a compatible time function for old binary clients,
one with a 32 bit time_t, and a 64 bit time function for new clients.

Of course, programs that aren't going to be recompiled will stay broken.
That's a given. But fixes to the interface don't have to be held back
for the sake of these programs.
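The dual-entry-point scheme can be sketched like this (the type widths,
function names, and version-node names below are all invented for
illustration; glibc does this sort of thing with GNU symbol
versioning):

```c
#include <stdint.h>
#include <time.h>

typedef int32_t old_time_t;           /* hypothetical legacy width */
typedef int64_t new_time_t;           /* hypothetical new width    */

/* New-ABI entry point: 64-bit time. */
new_time_t my_time64(new_time_t *t)
{
    new_time_t now = (new_time_t)time(NULL);
    if (t)
        *t = now;
    return now;
}

/* Compatibility shim kept for old binary clients. */
old_time_t my_time32(old_time_t *t)
{
    new_time_t wide = my_time64(NULL);
    if (t)
        *t = (old_time_t)wide;        /* truncates after 2038 */
    return (old_time_t)wide;
}

/* With the GNU toolchain, both would be exported under the single
 * name "time", each bound to a version node (node names invented):
 *   __asm__(".symver my_time32, time@LIBC_1.0");
 *   __asm__(".symver my_time64, time@@LIBC_2.0");
 */
```

Old executables resolve against the versioned 32-bit symbol; anything
recompiled picks up the default 64-bit one, which is the point Kaz is
making.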
 

James Kuyper

Binary compatibility with existing code seems like a pretty good excuse
to me.

Sooner or later the binaries that it is incompatible with will have to
be recompiled with a newer library if possible, or discarded if
necessary. The sooner this happens, the better for everyone (except, of
course, whoever it is that has to do the recompile or replace the
discarded binary). As long as a backwards compatibility option is
provided (it could even be the default, at least for the next decade or
so), this shouldn't be a serious problem.
 
