Why C is really a bad programming language


Rafael Anschau

Spinoza, how would you build an operating system or a device driver in
Java? Show me how and I will consider leaving C.

An architecture I plan to build is done in C# (GUI interfaces mainly)
while calling lower-level stuff from a C++ DLL. It's about using the
right tool for the right job without getting attached to the tool.

There are things that need to be done where the only alternative I
know is assembly. What do you recommend in those cases?
 

Keith Thompson

gwowen said:
[snip]

<OT>
I seriously doubt that Han and spinoza1111 are the same person.
They're both obnoxious trolls, but they have different styles,
and spinoza1111 (Edward Nilges) has been around for a long time,
mostly in other newsgroups.
</OT>
 

Kaz Kylheku

Richard Heathfield stumbled on the reason. Because he's reasonably
competent in terms of a low standard, he realized that he had to write
his own string handlers, and he did so. He may have done an acceptable
job. He used Boyer-Moore for searching.
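For reference, the search technique named there is easy to sketch. This
is not Heathfield's code, just a minimal Boyer-Moore-Horspool variant
(the usual simplification of Boyer-Moore) in portable C:

#include <stddef.h>
#include <string.h>
#include <limits.h>

/* Boyer-Moore-Horspool: return a pointer to the first occurrence of
   needle in haystack, or NULL. On a mismatch the pattern is shifted by
   the "bad character" distance instead of one position at a time. */
const char *bmh_search(const char *haystack, const char *needle)
{
    size_t hlen = strlen(haystack);
    size_t nlen = strlen(needle);
    size_t skip[UCHAR_MAX + 1];
    size_t i;

    if (nlen == 0)
        return haystack;
    if (hlen < nlen)
        return NULL;

    for (i = 0; i <= UCHAR_MAX; i++)
        skip[i] = nlen;                 /* default: jump the whole pattern */
    for (i = 0; i < nlen - 1; i++)
        skip[(unsigned char)needle[i]] = nlen - 1 - i;

    for (i = 0; i + nlen <= hlen; ) {
        if (memcmp(haystack + i, needle, nlen) == 0)
            return haystack + i;
        i += skip[(unsigned char)haystack[i + nlen - 1]];
    }
    return NULL;
}

The skip table is what buys the speed: on a mismatch the pattern can
jump by up to its own length, instead of sliding one byte at a time.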

But soft: wait a second.

Why on EARTH would anyone EVER use a language for applications or even
hard core programming that does not support strings?

That's a question for some other newsgroup, like comp.programming.

People reading comp.lang.c have already settled the above question,
at least with respect to some particular project they are on. For whatever
reasons, they are working with C. They might be eagerly working with C, or
reluctantly working with C. At the end of a day's worth of dreaming about using
something else, the work is still done in, guess what? C.

Believe it or not, there are still systems that are only targeted by a C
toolchain. So it's either that, or assembly language. (With a poor assembler
that is designed for compiler output).

Some large systems are already written in C. What language should they be
maintained in? I have to develop a thousand line patch for a million line C
program. Gee, let's write the patch in Eiffel.

Here is an excellent reason for programming in C: you're in school
taking a required programming course, some or all of which is based on
C. So, by golly, guess in what language that homework program you're
struggling with has to be written? Perhaps I may offer some assistance
with the solution to your homework problem: obviously, you should be
working with Java instead! Don't use C.

I second that. Don't use C.

And, furthermore, if you succeed in not using C, then don't use the comp.lang.c
newsgroup to talk about how you don't use C. See, that would be trolling.

Comp.lang.c is for discussions about C. Not all of us here like C. Yes,
hopefully everyone here knows about some other programming languages and what
they can express. Liking C is not a requirement for using comp.lang.c, and
neither is it a suitable discussion topic.
 

spinoza1111

What is a string? A sequence of bytes? Or a sequence of "characters"
(however you want to define that term; a Unicode codepoint isn't
necessarily a character either).

Are strings mutable? If a is a string, does a=b "clone" the underlying
data or just copy the reference? What is the time complexity for inserting
and deleting characters? Is it the same at the beginning, end, and middle
of the string? What is the time complexity for retrieving the nth
character? (It won't be O(1) if the strings are e.g. bytes in UTF-8 but
you want the nth Unicode character.)
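Concretely, that last point is why there is no constant-time s[n] once
the representation is UTF-8; a minimal sketch (assuming well-formed
input) has to walk every byte up to the answer:

#include <stddef.h>

/* Return a pointer to the start of the nth code point (0-based) in a
   NUL-terminated, well-formed UTF-8 string, or NULL if the string has
   fewer code points than that. Continuation bytes (10xxxxxx) must be
   skipped, so the cost is O(n) rather than O(1). */
const char *utf8_nth(const char *s, size_t n)
{
    while (*s) {
        if (((unsigned char)*s & 0xC0) != 0x80) {  /* a lead byte */
            if (n == 0)
                return s;
            n--;
        }
        s++;
    }
    return NULL;
}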

If you concatenate strings to create a new string without destroying the
existing strings, where does the memory come from? stack? malloc()? Who is
responsible for freeing the memory?
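In C the conventional answer is: the new string comes from malloc(),
and the caller owns it and must free() it. A minimal sketch of that
convention (the function name is invented for illustration):

#include <stdlib.h>
#include <string.h>

/* Concatenate a and b into freshly allocated storage, leaving both
   originals untouched. By convention the caller frees the result;
   nothing in the language enforces that convention. */
char *str_concat(const char *a, const char *b)
{
    size_t la = strlen(a);
    size_t lb = strlen(b);
    char *r = malloc(la + lb + 1);

    if (r != NULL) {
        memcpy(r, a, la);
        memcpy(r + la, b, lb + 1);   /* the +1 copies b's terminating NUL */
    }
    return r;                        /* NULL if allocation failed */
}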

Sure, a high-level language which treats strings as primitive objects and
handles memory (de)allocation automatically will be a lot simpler to code
in. Chances are it will also be significantly less efficient.

Good points. But I do not agree that centralized memory allocation
will be less efficient. More socialist, perhaps, but letting
individual programs do memory allocation is asking for trouble.

Whereas an object-oriented approach that starts with a globalized
string in the form of the most general notion of "an ordered sequence
that is comparable based on the comparability of its members" can be
inherited to handle different types of strings.
 

spinoza1111

spinoza1111 said:
You are confusing the language with a handful of implementations
thereof. C does not mandate the use of 32-bit integers for time_t.
This is truly an implementation issue, and the fix is so easy that
implementors have no excuse for not fixing the problem right now.
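It is at least easy to see where a given implementation stands; a
minimal sketch (assuming a hosted C environment, nothing else) that
reports what time_t actually is:

#include <stdio.h>
#include <time.h>
#include <limits.h>

/* Report how wide time_t is on this implementation. If it is a signed
   32-bit integer counting seconds from 1970, it overflows in January
   2038; a 64-bit time_t pushes the problem out essentially forever. */
int main(void)
{
    printf("time_t is %zu bits, %s, %s\n",
           sizeof(time_t) * CHAR_BIT,
           (time_t)1 / 2 == 0 ? "an integer type" : "a floating type",
           (time_t)-1 < (time_t)1 ? "signed" : "unsigned");
    return 0;
}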

Frowning in the general direction of the "implementors" won't fix the
problem.

Y2K was a problem primarily with Cobol. It was fixed because those
despised Cobol programmers were grownups, and showed up for work on
Dec 31 1999. Whereas I don't think you young whippersnappers will.

Dickie boy, have you fixed the dependencies in the code for which you
are responsible? Note that because Cobol is so clunky, merely changing
the width of a date generally had no effect in other areas of Cobol
programs. This was even true for IBM assembler. One retired IBMer, Bob
Bemer, discovered a simple automatic way of monitoring running object
code for Y2K.

Whereas the so-called "power" of C means that even if you change a
symbol you must still convert vast amounts of C code to find subtle
dependencies that could not have been coded in Cobol.

Y2K was a non-event because older programmers worked hard, and didn't
spend their time destroying better men on usenet. But because it was
a non-event, managers and programmers will reason that Thirty-Eight
will be, magically, a non-event. Nothing will work, everything will
fail, and the Thousand Years of Darkness will commence.
 

Tim Prince

Richard said:
spinoza1111 said:

<snip>


If you mean that the projected disaster scenarios didn't happen, I
agree.


Right. And we got very well paid for it, too.

But some of us, even those of us who were not primarily programmers,
lost our jobs immediately after, in part for supporting the
recommendations to fix those problems. Talk about non-events.
 

James Kuyper

pete said:
But you did remind me of a question I had.

Why do they have to keep calling it "C"?

They don't, but D is already taken. So is C++. What would you recommend?
 

Beej Jorgensen

Tim Prince said:
But some of us, even those of us who were not primarily programmers,
lost our jobs immediately after, in part for supporting the
recommendations to fix those problems. Talk about non-events.

I remember the hearings on the matter. "We spent all this money to make
sure nothing went wrong, and then nothing went wrong! How can you
possibly justify the expense?"

Very comforting.

-Beej
 

spinoza1111

spinoza1111 said:

Frowning in the general direction of the "implementors" won't fix
the [Y2.038K] problem.

Neither will blaming the language.


Y2K was a problem primarily with Cobol.

Rubbish. It /started/ with COBOL (or rather, in the era in which COBOL
was prevalent), but thousands of millions of lines of non-COBOL code
also had to be checked.

It was fixed because those
despised Cobol programmers were grownups, and showed up for work on
Dec 31 1999. Whereas I don't think you young whippersnappers will.

No, it was fixed because we identified the problem early enough, and
set about fixing it early enough. The Y2K project on which I worked
was done and dusted by late 1998.

I find it hard to credit programmers who are always part, so they say,
of success. If they are, they have never learned from failure. But it
is unlikely that they are, and likely that they narrate failure as
success.

I have, some time ago. The question is when you will.

What are we talking about here? Y2K? All the code I've ever written
from scratch has been Y2K-compliant, right from the start, long
before "the Y2K problem" began to command column-inches. So I don't
need to fix /any/ dependencies. (The only times I've written
non-Y2K-conforming code was, oddly enough, on that same Y2K project,
because the client, a UK bank, had decided to go with a "sliding
century window" fix... a choice which I recommended strongly against,
but they didn't fancy the risk of touching every single date record
on their system.) If you're talking about 2038, my code very
deliberately avoids dependency on time_t wherever possible, and I
certainly never store time_t values on disk. When implementations
finally catch up with reality, all that will be required is
re-compilation.
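That approach is cheap to sketch: keep time_t strictly in memory and
put a rendered calendar date in the file, so the stored record does not
care how wide time_t happens to be this decade. (The helper names below
are invented for illustration.)

#include <stdio.h>
#include <time.h>

/* Write a timestamp to disk as "YYYY-MM-DD HH:MM:SS" rather than as a
   raw time_t, so the stored record does not depend on the width or
   epoch of time_t. Returns 0 on success, -1 on failure. */
int store_timestamp(FILE *fp, time_t t)
{
    char buf[32];
    struct tm *tm = gmtime(&t);

    if (tm == NULL ||
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", tm) == 0)
        return -1;
    return fprintf(fp, "%s\n", buf) < 0 ? -1 : 0;
}

/* Read it back into a broken-down time; conversion to time_t (if it is
   needed at all) happens in memory, with whatever time_t is today. */
int load_timestamp(FILE *fp, struct tm *out)
{
    int n = fscanf(fp, "%d-%d-%d %d:%d:%d",
                   &out->tm_year, &out->tm_mon, &out->tm_mday,
                   &out->tm_hour, &out->tm_min, &out->tm_sec);
    if (n != 6)
        return -1;
    out->tm_year -= 1900;   /* struct tm counts years from 1900 */
    out->tm_mon  -= 1;      /* and months from 0 */
    out->tm_isdst = 0;
    return 0;
}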

I don't believe you, since again, C's making-it-possible to create
dependencies means that due diligence is required even if the
dependencies don't exist. I realize that this is a complex thought,
since it is neither managerese nor programmerese.

Managers don't want to be seen as spending money on potential events
that turn out to be non-events: as one poster seems to indicate below,
some Y2K programmers were punished and not rewarded because owing to
their efforts, nothing happened.

Nobody wants to take care of society's needs per se, because that's
"socialism" and as the Mad Woman, Baroness Thatcher, said, "there is
no such thing as society". It's perfectly all right, in the lower
middle class mind of the manager and the programmer, to risk a credit
crisis, or destroy a generation in the North of England by closing
mines.

You can bleat when they come for you
That you don't know nothing, you did what you were told:
You followed the Standard and worked to rule,
But I tell you, yob, it's getting old.

I remember that. I talked it over with some Y2K colleagues at the
time, and we came up with at least three or four things his technique
didn't cover, just in object code, let alone on DASD.

Of course! As always, it is your favorite Hobby
To sit on your arse taking tea in the Lobby,
And do all you can to shame and unfame the Name
Of good workmen, here Navia, Schildt, and Bob Bemer:
Three better Men than we shall see here.
Navia, bold French compiler writer of Note,
Who writeth English well whilst you write by Rote:
Schildt who took your fat girlfriend in the back of Hall
While you perforce stood with a stupid grin, holding his Ball:
And now Bemer, who managed Algol thanklessly at IBM
And was stabbed in the back by the Fortran team.
Like a little Girl, like unto a maiden Aunt
(Who's never felt the thrust and manly Pant)
You spit your frustrate sour grapes, a nasty Wine
On all that is decent, that is Famous, and that is Fine.


You have to search through all the COBOL, too, though, to catch the
things that Bob Bemer /didn't/ think of.


If you mean that the projected disaster scenarios didn't happen, I
agree.


Right. And we got very well paid for it, too.

I'm quite certain you got Paid:
Men clamor for Gold when they can't get Laid.

Is that your way of saying you weren't involved in Y2K?

No.


Programmers won't reason like that. Good managers won't, either.

To lapse into prose: people with narcissistic personality disorders,
like you, are instinctively Platonists. They form the Idea of the
"good" programmer and the "good" manager but reality is always rubbing
their nose in failure.
 

spinoza1111

But some of us, even those of us who were not primarily programmers,
lost our jobs immediately after, in part for supporting the
recommendations to fix those problems.  Talk about non-events.

This sounds like a hidden scandal. The programmers and their managers
worked hard to prevent what could have been a social disaster, but
because it didn't happen, higher-ups reasoned that the programmers and
their frontline managers had "wasted resources".

This is fucked up. We throw trillions at bankers today but can't get
health insurance, whereas all indications are that Y2K was a genuine
problem (and 2038 will be far worse). But people, whether coal miners
in Britain, teachers in Britain or the USA, or programmers world wide,
who spend some fraction of their time on society's need to protect
itself against threats (such as no coal), working hard in excess of a
paycheck, are condemned. They're "altruists".
 

spinoza1111

spinoza1111 said:
[The Y2K problem] was fixed because those
despised Cobol programmers were grownups, and showed up for work
on Dec 31 1999. Whereas I don't think you young whippersnappers
will.
No, it was fixed because we identified the problem early enough,
and set about fixing it early enough. The Y2K project on which I
worked was done and dusted by late 1998.

I find it hard to credit programmers who are always part, so they
say, of success. If they are, they have never learned from failure.

I find it hard to treat seriously people who interpret a single report
as "always".
But it is unlikely that they are, and likely that they narrate
failure as success.

Oh, so /that's/ why you always crow about screw-ups, is it?

Yes. I learn from mistakes. I learned more about C++ than you learned
by taking the Sparknotes test and getting a grade in the 70s. You got
a higher grade (but not one that would permit you to set up as an
authority on the C family at all) and learned NOTHING.

The evidence - name-calling - suggests otherwise. Mere assertion on
your part is not convincing. If you wish people to believe that
you're grown up, you're going to have to start /acting/ grown-up.

Hmm, perhaps a grownup in a world of subhumans might seem to them
childish and untutor'd in their subhuman ways.

And that bothers me how, exactly?

...so that you're always here, like a psycho.
 

James Kuyper

Richard said:
pete said:


Then make sure it's case-sensitive, since there is already a language
called e.

I don't think we want to move on to F; the name has unfortunate
connotations. :)
 

Nick Keighley

Most programmers, apart from the truly competent (who are few and far
between), hate language. They don't read much apart from technical
stuff and the sports pages. They fantasize a world of silence without
texts, and having no string handling suits them fine.

I read a fair amount of science fiction, historical fiction and
a fair variety of other stuff. I have a layman's interest in
linguistics.

I am known "for always having my head in a book"

<snip>
 

Nick Keighley

I remember the hearings on the matter.  "We spent all this money to make
sure nothing went wrong, and then nothing went wrong!  How can you
possibly justify the expense?"

Very comforting.

I hear Britain and America spent a ton of money and had no problem.
Whilst some European countries (Italy I think) spent bugger all
and also had no problems...
 

spinoza1111

I read a fair amount of science fiction, historical fiction and
a fair variety of other stuff. I have a layman's interest in
linguistics.

Well, you start with two rather juvenile categories, but good for you.
Yes, I shall be patronizing. For let us not speak falsely now.
 

spinoza1111

spinoza1111 said:
Only because we taught it to you in the process of explaining how
flawed the test was.

You taught me nothing, thug. You don't know computing science. And
who's "we" but a small scattered collection of bullies, criminal
computer consultants, and deviants?

On the contrary, I learned that you enjoy doing flawed tests, I
learned that at least one Spark Notes test was not properly validated
by subject experts, and I got further confirmation that no matter how
wrong you are, you still can't see it.

You failed to get an adequate grade and this bothered you. The failure
always blames the tool, the system, or the test. I'd hoped your
relative failure would change you and make you ready to dialog. I'd
hoped, vainly, that you might show some sensitivity to the fact that
people dread internet technical discussions because it endangers their
jobs to have some lunatic like you start calling them names and
questioning their competence. I'd hoped to find the spirit of early
conferences in the 1970s. Not the stealing of software but the
collegiality implicit in NEVER questioning a person's bonafides,
simply focusing on what is right and wrong with their code.

But as it is, I am going to name and shame every stupid thing you say
here from now until Kingdom Come, such as your misuse of the word
"polymorphism". I'm going to be your worst nightmare until you get the
**** out of this newsgroup, because you're an enabler, a thug, a
fraud, and an incompetent.

Or a hypocrite in a world of not-quite-as-bad hypocrites might seem to
be hypocritical. You yourself, if you think back a bit, reacted
indignantly when people started calling you "Eddie boy". If you don't
want people to mock your name, it is most inadvisable to mock those
of other people.

They started out of the blue. Again, people lazily look at a few posts
without noticing their time-order, and find someone like Navia, who's
done something you cannot do, responding like a MAN to your lies and
abuse. They conclude that he's the thug and you bastards sit back and
smirk.

I have remembered just enough C to show you're an incompetent and a
fraud and I shall every day relearn more until you are out of here.
Unlike you I leave this newsgroup for months at a time because I'm not
criminally insane, but when I returned last week, I found you abusing
Navia, who has contributed more to the computing community than you
ever shall.
 

Nobody

Good points. But I do not agree that centralized memory allocation
will be less efficient. More socialist, perhaps, but letting
individual programs do memory allocation is asking for trouble.

Whereas an object-oriented approach that starts with a globalized
string in the form of the most general notion of "an ordered sequence
that is comparable based on the comparability of its members" can be
inherited to handle different types of strings.

That's asking for even more trouble than letting programs handle memory
allocation.

OO languages ensure that derived classes conform to the syntax of
the base class. They do nothing to ensure that derived classes conform to
the semantics.

If your base string class is immutable, and you add a mutable subclass,
passing an instance of that subclass to something which expects an
immutable string will often fail, as the recipient will assume that it
doesn't need to clone it.

If your base class is mutable, passing an immutable subclass to a function
which expects to be able to modify the string will fail. Plus you don't
get any performance advantage, as the recipient is going to clone the
string anyhow to allow for the case where the string is mutable.

Mutable and immutable strings are inherently different types; neither is
logically a subclass of the other.
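The compiler-checks-syntax-not-semantics point does not even need an OO
language to demonstrate. A compressed sketch in C, using a struct of
function pointers as a makeshift "string interface" (all the names are
invented), shows a consumer trusting an immutability promise that
nothing actually enforces:

#include <stdio.h>
#include <string.h>

/* A makeshift "string interface": the documented contract is that the
   contents never change, so a holder may keep the pointer it gets from
   get() instead of cloning. The compiler only checks this shape. */
struct str_iface {
    const char *(*get)(void *self);
};

/* An implementation that also lets its owner rewrite the buffer. It
   satisfies the interface syntactically while breaking the contract. */
struct mutable_str {
    struct str_iface iface;
    char buf[32];
};

static const char *mutable_get(void *self)
{
    return ((struct mutable_str *)self)->buf;
}

/* A consumer that relies on the immutability promise: it stores the
   pointer rather than making its own copy. */
struct label {
    const char *text;
};

int main(void)
{
    struct mutable_str s = { { mutable_get }, "hello" };
    struct label l;

    l.text = s.iface.get(&s);           /* no clone: trusting the contract */
    printf("label says: %s\n", l.text); /* "hello" */

    strcpy(s.buf, "goodbye");           /* legal mutation behind the scenes */
    printf("label says: %s\n", l.text); /* silently became "goodbye" */
    return 0;
}

The label prints "hello" and then, after a perfectly legal strcpy(),
"goodbye", even though it never touched the string itself. An
inheritance hierarchy gives you exactly the same semantic guarantee,
which is to say none.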
 

Moi

You failed to get an adequate grade and this bothered you. The failure
always blames the tool, the system, or the test. I'd hoped your relative
failure would change you and make you ready to dialog. I'd hoped,
vainly, that you might show some sensitivity to the fact that people
dread internet technical discussions because it endangers their jobs to

Many people here have concluded that the test was flawed, that one of
the questions' answers depended on the absence of a sequence point, and
above all that C++ != C.

And after that, we found out that you did not even know the concept of
sequence point. (no, thank you, I would *not* buy a compiler from you)
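For anyone who missed that sub-thread, the kind of construct at issue
looks like this; the behaviour is undefined precisely because nothing
sequences the two accesses to i:

#include <stdio.h>

int main(void)
{
    int i = 0;
    int a[2] = { 0, 0 };

    /* Undefined behaviour: i is both read to index the array and
       modified by i++ with no intervening sequence point, so a
       conforming compiler may do anything here, and different ones
       do different things. No single "right answer" exists. */
    a[i] = i++;

    printf("%d %d %d\n", a[0], a[1], i);
    return 0;
}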

So what is your point? Are you just jealous that other people can see
the errors in the test that you don't see?

Besides: IMHO you are a troll or a lunatic or possibly both.

HTH,
AvJ
 
