Why C Is Not My Favourite Programming Language

  • Thread starter evolnet.regular

Eric Schmidt

G said:
If you are the real Eric Schmidt, please get your developers to fix up
Google Groups' treatment of leading blank spaces in Usenet posts.

I am not affiliated with Google in any way. Though my name *is* Eric
Schmidt. I am not the person that a Web search for my name brings up.
 

Bart C

Richard Tobin said:
Any language can be implemented in any Turing-complete language.

You have missed what I was saying. See my other post for a longer
rant, but the point is that although it *could* have been written in
any language, it was *in fact* written in C. Which suggests that even
the author of Python thinks that C is good for something.

-- Richard

Somebody with a C implementation decides to write a totally different
language with it.

Maybe that tells you something about what they thought of C and its
suitability for certain types of programming.

Bart.
 

Richard Tobin

Bart C said:
Somebody with a C implementation decides to write a totally different
language with it.

Maybe that tells you something about what they thought of C and its
suitability for certain types of programming.

Exactly. It tells me that they thought it was suitable for
implementing other languages.

-- Richard
 

Neil Kurzman

I can't believe you wrote so much about why you do not like C. Big deal!
If it does not suit your needs, do not use it. Who said it is good for
everything? There is no one perfect language for everything. There can
be more than one Highlander. If C does not suit the project, then use a
language that does.

Try this project for Windows.
Send a file over the serial port.
The program must fit on a floppy.
Nothing can be installed on the hard drive.
It must work with Windows 95 and up.

And remember, not all code runs as a user app on a PC. The code for the
CPU in the keyboard you type on is written in ASM, because it suited the
project. There is a chance it could be done in C, but Python? No.
 

Stan Milam

Walter said:
Introduced relative to -what- ? Do I get to compare it against
the other languages that were available at the time of its
development in 1969-1973, such as:

- numerous assemblers
- Fortran 66, Fortran G, Fortran H
- The original Pascal (1970)
- PL/1 (1969)
- Algol 68
- APL (1969)
- LISP, LISP 1.5, MacLISP

Let's not forget COBOL! Wasn't BASIC around by then too?

--
Regards,
Stan Milam
=============================================================
Charter Member of The Society for Mediocre Guitar Playing on
Expensive Instruments, Ltd.
=============================================================
 

Walter Roberson

:Let's not forget COBOL! Wasn't BASIC around by then too?

BASIC was from 1963, and COBOL from 1959, with COBOL 68 being current
in the 1969-1973 timeframe.

Looks like FORTH wasn't around until 1968 either. But I guess we should
still get out the BFG for C because it didn't have the string
manipulation functions of SNOBOL 4.
 

Walter Roberson

:Try this project for Windows.
:Send a file over the serial port.
:The program must fit on a floppy.

Fit on a floppy? We used to do that in a very small number of Kb using
xmodem.
 

CBFalconer

Neil said:
I can't believe you wrote so much about why you do not like C.

He didn't. Several have pointed out that he plagiarized the whole
thing, and have given a reference to the original. Just another
troll.
 

Mabden

Kenny McCormack said:
"It is a poor workman that blames his tools". Yes, as true now as ever.

Having said that, the only sense in which the OP's obvious trolling is at
all worthwhile is by taking it from a manager's perspective (that is, not
from a programmer's perspective).

While it is certainly true that it is a poor workman who blames his tools,
it *is* legitimate to criticize the tools chosen by and used by OTHER
PEOPLE, especially if you are paying the salaries of those OTHER PEOPLE.

So, I invite you all to continue this discussion from that perspective.

Ok, "**** off!"
 

Richard Bos

Big K said:
Computer languages are not written in other computer languages. A
computer language is a set of rules laid out by standards.

_Some_ computer languages, such as C, are defined in a formal standard.
Others are defined by a reference implementation - often the only one.
It has been said that it would be easier to translate the Perl
interpreter (which, TTBOMK, is also written in C) to another language
than to write a formal standard for that junk-ball of accretions.

Richard
 

Keith Thompson

CBFalconer said:
He didn't. Several have pointed out that he plagiarized the whole
thing, and have given a reference to the original. Just another
troll.

The "original" article did raise some valid points, along with some
misconceptions that deserve to be corrected.

If anyone wants to discuss any of these things, I suggest starting a
new thread, preferably without referring to this thread or giving
credit to the troll (though it might be good to refer to the Kuro5hin
article that started this).
 

Kenny McCormack

When you make this statement, you are ignoring the human factor.
For a programmer of middling ability, it is quite likely that their Perl
programs will run as fast or faster than their C programs. (You do the
math...)

Michael Mair said:
Yep. I remember that for some early perl 5 version, perl beat grep...
don't know whether this still holds.
Apart from that, when dealing with problems perl and Python have been
created for, you usually save enough development time for very many
runs of the program.

You do realize, of course, that I am saying that most people write such
bad C programs that, even _at runtime_, they are better off having
written in a scripting language.
 

Michael Mair

Kenny said:
Yep. I remember that for some early perl 5 version, perl beat grep...
don't know whether this still holds.
Apart from that, when dealing with problems perl and Python have been
created for, you usually save enough development time for very many
runs of the program.

You do realize, of course, that I am saying that most people write such
bad C programs that, even _at runtime_, they are better off having
written in a scripting language.

I do, and choose not to comment on it, as I have become rather careful
with general statements about "most people"...
I just wanted to solidify your claim by pointing toward an area
where I _know_ that most people lose against a script language --
moreover, where a well-known tool was not up to the challenge.
(Another one: if it comes to beating Matlab on its own turf, the
whole thing becomes even more tricky...)
Another problem is that many people tackle the tasks script languages
are good at in order to _learn_ C, so I would not read too much
into simple tasks badly done in C.


Cheers
Michael
 

Wayne Rasmussen

Mark said:
It's not, it's just that you're incompetent at it. Sorry, but if you
think your statement to be true, then that is the only remaining
conclusion.


It doesn't, any more than any other language does.

I think he is looking for a DWIM (Do What I Mean) function like the one
in some old LISP implementations. Obviously anything he writes is
A-perfect; hence any issue must lie somewhere else: language, compilers,
etc...
 

Neil Kurzman

Walter said:
:Try this project for Windows.
:Send a file over the serial port.
:The program must fit on a floppy.

Fit on a floppy? We used to do that in a very small number of Kb using
xmodem.

It was a custom bootloader; the loader, hex file, and instructions
needed to fit on a floppy.
I made a C console app (64K).

You can hammer nails with a sledgehammer. Choosing the right tool for
the job usually makes it easier.
 

Kelsey Bjarnason

[snips]

evolnet.regular said:
True, but my point is:

(1) C introduces entirely new classes of bugs
and
(2) C makes bugs more likely, regardless of the programmer's skill
and
(3) C makes it much harder to root out and fix bugs

Frankly, I think that's a load of hogwash; C makes coding no more or less
likely to produce bugs than any other language - if each is done by a
competent coder.

For example, in C, a competent coder knows you don't use gets, you use the
scanf family very carefully, or you use fgets with gay abandon, knowing
full well that short of a serious library bug, you simply aren't going to
have to cope with a buffer overflow on input using fgets.
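
A minimal sketch of that habit, with an illustrative buffer size; the
point is that fgets never writes past the bound you hand it:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char line[128];   /* fgets writes at most sizeof line bytes here */

        if (fgets(line, sizeof line, stdin) != NULL) {
            line[strcspn(line, "\n")] = '\0';  /* drop newline, if present */
            printf("read: %s\n", line);
        }
        return 0;
    }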

Similarly, when handling strings, a competent coder knows you have to add
1 to the length, to hold the terminal \0; so either he does so by habit,
or he creates functions to do so automatically - dupstr, for example,
could duplicate a string and automatically handle the addition of the
extra byte's space, and the coder would never again have to worry about
adding the one or not.
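
One way such a dupstr might look; the name is from the paragraph above,
and the error handling is just one reasonable choice:

    #include <stdlib.h>
    #include <string.h>

    /* Duplicate a string. The +1 for the terminating '\0' lives
       here, and only here, so callers can never forget it. */
    char *dupstr(const char *s)
    {
        size_t len = strlen(s) + 1;
        char *copy = malloc(len);

        if (copy != NULL)
            memcpy(copy, s, len);
        return copy;   /* caller checks for NULL and eventually frees */
    }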

Where I find you run into the most problems is with code that simply isn't
properly thought out, combined with coders who aren't quite as good as
they think they are.

The perhaps classic example of this is the ever-repeating idea that on
freeing a pointer, it should subsequently be "NULLed out"; that is, the
sequence of code should read free(p); p = NULL;. The idea is that you can
then tell, easily, if you're re-using a freed pointer.

This seems like a good idea until you realise there could have been an
intervening q = p; and that q will _not_ be NULL, but will also not be
valid; this cutesy "NULLing out" notion falls apart, but proper code
design would have rendered it irrelevant in the first place - if the code
is well designed, you don't need to worry whether p was freed or not, you
_know_ whether it was freed or not.
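
A short illustration of how the alias defeats the idiom (variable names
arbitrary; strictly speaking, even inspecting q after the free is
indeterminate, which only strengthens the point):

    #include <stdlib.h>

    int main(void)
    {
        char *p = malloc(16);
        char *q = p;    /* the intervening alias */

        free(p);
        p = NULL;       /* the "NULLing out" idiom */

        if (p == NULL)
            ;           /* caught: p is visibly dead */

        if (q != NULL)
            ;           /* NOT caught: q dangles, but compares non-NULL */

        return 0;
    }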

There are some other issues, such as using the wrong types, assuming
pointers and ints are the same size, or can be trivially converted, that
sort of thing, which may be specific to C, but competent coders generally
aren't going to make that mistake, any more than a competent electrician
is going to get himself zapped by poking a knife into a possibly live
socket; these are things he learns to avoid as a matter of habit.
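
For anyone who genuinely needs to round-trip a pointer through an
integer, the habit that avoids the trap is C99's uintptr_t -- optional
in the standard, but present on the usual platforms:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int x = 42;
        void *p = &x;

        /* (int)p truncates wherever pointers are wider than int;
           uintptr_t, where it exists, holds any pointer to void
           and converts back unchanged. */
        uintptr_t bits = (uintptr_t)p;
        void *back = (void *)bits;

        printf("round-trip intact: %s\n", back == p ? "yes" : "no");
        return 0;
    }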

This leaves other issues - algorithmic failures, for example - which are
not applicable to C alone, but to all languages. Forgetting to calculate
a checksum, or misdesigning an encryption function, or failing to check an
error code, these can happen in any language, not just C.

Sure, C has its pitfalls, but so do all languages. If you're any good at
the language, you know what the pitfalls are and they generally don't
affect you, because you simply avoid the situations where they'd arise, as
a matter of habit.
 

Servé La

Kelsey Bjarnason said:
Frankly, I think that's a load of hogwash; C makes coding no more or less
likely to produce bugs than any other language - if each is done by a
competent coder.

For example, in C, a competent coder knows you don't use gets, you use the
scanf family very carefully, or you use fgets with gay abandon, knowing
full well that short of a serious library bug, you simply aren't going to
have to cope with a buffer overflow on input using fgets.

And how do you become a competent coder? By making mistakes and doing
things because you don't know better. Look at this group, for instance:
so many beginners use gets. They will continue to use gets until 1)
somebody convinces them why not to, or 2) they cause a buffer overrun in
a real application, causing their life to become miserable for a while.

One of the great things about C is that you can become a true master in it,
precisely because it has many pitfalls.
 

CBFalconer

Kelsey said:
On Sun, 06 Feb 2005 09:50:44 -0800, evolnet.regular wrote:

[snips]
True, but my point is:

(1) C introduces entirely new classes of bugs
and
(2) C makes bugs more likely, regardless of the programmer's skill
and
(3) C makes it much harder to root out and fix bugs
.... snip ...

Where I find you run into the most problems is with code that
simply isn't properly thought out, combined with coders who aren't
quite as good as they think they are.
.... snip ...

This leaves other issues - algorithmic failures, for example -
which are not applicable to C alone, but to all languages.
Forgetting to calculate a checksum, or misdesigning an encryption
function, or failing to check an error code, these can happen in
any language, not just C.

Sure, C has its pitfalls, but so do all languages. If you're any
good at the language, you know what the pitfalls are and they
generally don't affect you, because you simply avoid the
situations where they'd arise, as a matter of habit.

Most of these things can be designed around by any competent
programmer. The two areas that are often impossibly awkward to
check are overflows and wild pointers. The wild pointer problem is
built into the language. The overflow problem is not, and maybe
implementors will start to trap any integer overflows.
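
Until they do, the usual defence is to test before the operation, since
signed overflow itself is undefined behaviour; a sketch for addition:

    #include <limits.h>
    #include <stdio.h>

    /* Store a + b in *sum and return 1 if the result fits in an
       int; return 0 instead of letting the overflow happen. */
    int checked_add(int a, int b, int *sum)
    {
        if ((b > 0 && a > INT_MAX - b) ||
            (b < 0 && a < INT_MIN - b))
            return 0;
        *sum = a + b;
        return 1;
    }

    int main(void)
    {
        int s;
        if (checked_add(INT_MAX, 1, &s))
            printf("sum: %d\n", s);
        else
            printf("overflow avoided\n");
        return 0;
    }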
 
