C / C++: is it the end of an era?

shaanxxx

I started programming with C. Lots of projects are being done in C++, so we
have to move to C++.
I read around 3-4 books on C++ (including the FAQs and Stroustrup). What I
found in most of the books is that they criticize the C language and C
programmers, and they silently pass over the good features of C. They have
sentences like, "you have to unlearn C to learn C++".

Is it the end of the C era?
Stroustrup and the writers of the FAQs are gurus of these technologies.
These people are commenting on the C language.
Should we believe them? Somehow it is demoralizing for C programmers.

I just wanted to know what other people think about this.
 
jacob navia

shaanxxx wrote:
I started programming with C. Lots of projects are being done in C++, so we
have to move to C++.
I read around 3-4 books on C++ (including the FAQs and Stroustrup). What I
found in most of the books is that they criticize the C language and C
programmers, and they silently pass over the good features of C. They have
sentences like, "you have to unlearn C to learn C++".

Is it the end of the C era?
Stroustrup and the writers of the FAQs are gurus of these technologies.
These people are commenting on the C language.
Should we believe them? Somehow it is demoralizing for C programmers.

I just wanted to know what other people think about this.

C++ set out to eliminate C's pitfalls but did not know when to stop
on the road to complexity.

The problem with C is the lack of higher-level constructs
that would allow simpler and less bug-prone programming.

C++ started out as a preprocessor that compiled to C. It allowed
people to define classes and do object-oriented programming, and,
in general, it was a way of eliminating the problems of C by
adding a new paradigm (object-oriented programming) and
by introducing stricter type checking.

This led to multiple inheritance (one of the first mistakes), and
then to templates, namespaces, and an incredible array of features
that make C++ a language so complex that it has been almost impossible
for compilers to keep up with it.

As it stands now, there are maybe one or two implementations (mainly EDG,
and perhaps Comeau) that implement ALL of the language. All the other
compilers (gcc and Microsoft included) implement some large subset of
the language, but not all of it.

Few people understand the whole language, since the
learning curve is steep and unforgiving.

A language that set out to "fix C", or to be a "better C",
obviously needs a justification, and it is not
very difficult to find problems with C's approach to many things,
since the bugs in the language aren't that hard to find.

C++ has added complexity without really solving many of
the problems of C. You still have to allocate and deallocate manually
the memory you want to use, without an automatic garbage
collector, for instance.

In my opinion, some of the features of C++ are interesting and
worthwhile, but too many of them make for a language that is just too
big.

jacob
 
dcorbit

shaanxxx said:
I started programming with C. Lots of projects are being done in C++, so we
have to move to C++.
I read around 3-4 books on C++ (including the FAQs and Stroustrup). What I
found in most of the books is that they criticize the C language and C
programmers, and they silently pass over the good features of C. They have
sentences like, "you have to unlearn C to learn C++".

Is it the end of the C era?
Stroustrup and the writers of the FAQs are gurus of these technologies.
These people are commenting on the C language.
Should we believe them? Somehow it is demoralizing for C programmers.

I just wanted to know what other people think about this.

Has COBOL gone away?
No. This kludgy 1950s language still clings on because it has a huge
installed base and actually solves some problems better than other
languages do (e.g. look at some recent threads here involving currency
transactions, which COBOL handles with aplomb).

Languages do not go obsolete once they get popular. The problem domain
for C and C++ is not identical. For some projects C++ will be better
and for some projects C will be better.

Also important are the resources available for projects. If you have 50
good C programmers and 3 good C++ programmers, then most of your
projects will be C projects, because that is what you can maintain.

People frequently prophesy the demise of various popular
computer languages. They are always wrong.
 
Ben C

I started programming with C. Lots of projects are being done in C++, so we
have to move to C++.
I read around 3-4 books on C++ (including the FAQs and Stroustrup). What I
found in most of the books is that they criticize the C language and C
programmers, and they silently pass over the good features of C. They have
sentences like, "you have to unlearn C to learn C++".

Is it the end of the C era?

C is still very widely used, including for new projects.

Most people with any sense realize both that C++ does not necessarily
mean OOP, and that OOP is not a "silver bullet" anyway.

Out of people who know and understand quite well both languages, many
prefer C++ and many prefer C.

Most programmers have horror stories about bad code in both languages
with which to while away the long winter evenings.

Stroustrup and the writers of the FAQs are gurus of these technologies.
These people are commenting on the C language.
Should we believe them?

They're not exactly impartial gurus though.
 
Richard Heathfield

jacob navia said:

it is not
very difficult to find problems with C's approach to many things,
since the bugs in the language aren't that hard to find.

Do you have at least two examples that will stand up to close scrutiny?
 
jacob navia

Richard Heathfield wrote:
Do you have at least two examples that will stand up to close scrutiny?

Well, I hope we can start a constructive discussion, instead of
flame wars.

The most glaring bugs in C are:

1) Zero-terminated strings. These are the source of countless problems:
each access to a string implies an unbounded search for
the terminating zero, and because size information is not stored
explicitly in the string but must be reconstructed, buffer
overflows when copying such strings are almost inevitable.

Bounded strings can be written in C like this:

typedef struct tagString {
    size_t length;
    char *data;
    unsigned flags;
} String;

Those are resizable strings. Non-resizable strings can be described
like this:

typedef struct tagFixedString {
    size_t length;
    int flags;
    char data[];    /* C99 flexible array member */
} FixedString;

I give these definitions to rebut people who say that
using other types of strings is impossible in C.
In the lcc-win32 compiler system, such strings are supported
in a special library.

2) Confusion between pointers and arrays. Arrays in C are completely
screwed up. There is endless confusion between pointers and
arrays, especially because size information is lost across
function calls.

3) From (1) and (2) we get, as a consequence, the inherent
impossibility of bounds checking when accessing arrays and
strings. This leads to endless bugs.


The fix is proposed in lcc-win32: with a few improvements to the language
we can get rid of zero-terminated strings and of arrays as pointers.

Another big problem is the error-prone malloc/free combination. We have
discussed this here several times. The solution is to use an automatic
software component (a garbage collector) that manages the release of
allocated memory. Lcc-win32 ships one in its standard distribution.
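A minimal sketch of the kind of mistake at issue (a hypothetical function, written here only for illustration): an early return on one path silently leaks the buffer, which is exactly the class of error a collector absorbs:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Prints a line unless it is a '#' comment. Returns 0 on success.
   BUG (deliberate, for illustration): the comment path leaks 'buf'. */
int print_line(const char *input)
{
    char *buf = malloc(strlen(input) + 1);
    if (buf == NULL)
        return -1;
    strcpy(buf, input);
    if (buf[0] == '#')
        return 0;        /* early return: 'buf' is never freed */
    printf("%s\n", buf);
    free(buf);
    return 0;
}
```

Every exit path must account for every allocation; as functions grow more paths, that bookkeeping is the scalability problem being argued about here.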

Note that the objective here is not just to say what is wrong but to
propose solutions. That is why I mention lcc-win32, which is free anyway,
so I have no financial gain from it.

jacob
 
Malcolm

Richard Heathfield said:
Do you have at least two examples that will stand up to close scrutiny?
It depends what you mean by bugs. C has a lot of weaknesses that are
inherent in a portable assembler: no special error-handling mechanism, no
garbage collection, no array bounds guarding. None of this could be changed
without altering the design premises of the language, except maybe
error-handling, and even then no one has found a good way that doesn't
involve hidden control paths.

There are a few other niggly things which could have been better. A few,
like the use of "char" for "byte", are simple details. Others, like the
syntax for multi-dimensional arrays, are a bit more deep-rooted. Others,
like the typedef problem, are debatable. Most typedefs are a bad idea, but
there are a few cases where they actually help code maintainability.
Personally I would remove the keyword from the language if I were redesigning
C from scratch, but it is too late now, and that kind of knowledge only
comes from long experience.

Then there are some omissions: no distinction between functions which
perform I/O and those which don't, no distinction between functions dependent
on platform-specific libraries and portable ones. I wouldn't describe these
exactly as "bugs".
 
João Jerónimo

Malcolm said:
like the typedef problem, are debatable.

What's the typedef problem?
I don't like their syntax, but I think it's just a matter of taste...
Most typedefs are a bad idea, but
there are a few cases where they actually help code maintainability.
Personally I would remove the keyword from the language if I was redesigning
C from scratch, but it is too late now, and that kind of knowledge only
comes from long experience.

I don't see the problem with a keyword that makes an alias for a type
(usually a complex one)... If there were no typedefs, how would you
"export"[1] a type in a header?



[1] - It's not really exporting, because exporting involves some kind of
linkage, doesn't it?

JJ
 
mg

shaanxxx wrote:
I started programming with C. Lots of projects are being done in C++, so we

I draw an interesting analogy for the difference between C and C++. It's
like the difference between the European alphabet and the Japanese
"alphabet". C++ is great because when we want to write a word we write
only one sign, so programs are shorter and easier to write. Of course we
can say that C is great because ...

C and C++ are different philosophies.
 
Richard Heathfield

jacob navia said:
Richard Heathfield wrote:

Well, I hope we can start a constructive discussion, instead of
flame wars.

When you start talking about "bugs in the language", you're not trying to
start a constructive discussion - you're trying to start a flame war.
Nevertheless, if you can find some serious bugs in the language, that's
important enough to merit serious and indeed constructive discussion. But
if you're just being silly, well, that's just silliness. Let's find out
which it is, shall we?
The most glaring bugs in C are:

1) Zero terminated strings.

That's not a bug. It's a design decision. You might not agree that it's a
good decision, but it's a conscious, deliberate decision nonetheless.

No bugs so far.

2) Confusion between pointers and arrays.

What confusion? I don't get them confused, and neither does anyone who has
taken the trouble to learn the language.
Arrays in C are completely
screwed up. There is endless confusion between pointers and
arrays, especially because size information is lost across
function calls.

No, array information is never destroyed. It isn't always *conveyed*, but it
is never destroyed except when the array is destroyed.

No bugs so far.
3) From (1) and (2) we get, as a consequence, the inherent
impossibility of bounds checking when accessing arrays and
strings. This leads to endless bugs.

Since both your premises are false, what hope is there for your conclusion?
But the lack of bounds checking in C does not in fact lead to endless bugs.
What leads to endless bugs is "not knowing what one is doing", and that is
true for any programming language and indeed any engineering discipline.
The fix is proposed in lcc-win32: a few improvements to the language and
we can get rid of zero terminated strings and arrays as pointers.

Getting rid of zero-terminated strings is not a fix or an improvement.
Removing choice is not the best way to persuade a C programmer that you're
on his side. And arrays are not pointers, so you can't get rid of "arrays
as pointers".
Another big problem is the error-prone malloc/free combination.

Why is that error-prone?
We have
discussed this here several times.

Yes. When you want memory, you ask for it, and if possible you'll get it.
And when you're done, you give it back. What could be easier? How is this
"error-prone"? You have never satisfactorily answered this.
The solution is to use an automatic
software component (garbage collector) that manages the release of the
allocated memory.

Memory management is too important to be left in the hands of the system.
Note that the objective here is not just to say what is wrong but to
propose solutions.

Let me know when you find something wrong. Nothing you have mentioned so far
constitutes a "bug" in C.
 
Ian Collins

jacob said:
C++ set out to eliminate C's pitfalls but did not know when to stop
on the road to complexity.

The problem with C is the lack of higher-level constructs
that would allow simpler and less bug-prone programming.

C++ started out as a preprocessor that compiled to C. It allowed
people to define classes and do object-oriented programming, and,
in general, it was a way of eliminating the problems of C by
adding a new paradigm (object-oriented programming) and
by introducing stricter type checking.

OK so far.

This led to multiple inheritance (one of the first mistakes), and
then to templates, namespaces, and an incredible array of features
that make C++ a language so complex that it has been almost impossible
for compilers to keep up with it.

Debatable.

As it stands now, there are maybe one or two implementations (mainly EDG,
and perhaps Comeau) that implement ALL of the language. All the other
compilers (gcc and Microsoft included) implement some large subset of
the language, but not all of it.

I know of at least one other, which makes for several more "complete"
C++ implementations than there are C99 ones.

C++ has added complexity without really solving many of
the problems of C. You still have to allocate and deallocate manually
the memory you want to use, without an automatic garbage
collector, for instance.

The RAII (Resource Acquisition Is Initialisation) idiom solves that
problem. This topic has been done to death both here and down the hall.

In my opinion, some of the features of C++ are interesting and
worthwhile, but too many of them make for a language that is just too
big.

You don't have to use them all, just the features that make for a better C.
 
newsman654

shaanxxx said:
I started programming with C. Lots of projects are being done in C++, so we
have to move to C++.
I read around 3-4 books on C++ (including the FAQs and Stroustrup). What I
found in most of the books is that they criticize the C language and C
programmers, and they silently pass over the good features of C. They have
sentences like, "you have to unlearn C to learn C++".

Is it the end of the C era?
Stroustrup and the writers of the FAQs are gurus of these technologies.
These people are commenting on the C language.
Should we believe them? Somehow it is demoralizing for C programmers.

I just wanted to know what other people think about this.

Just as poets find new and inventive ways to make antiquated languages sing
with beauty, so can a programmer by sticking with a language as elegant as C.

A good friend of mine had a bash script which took, on average, over 20
minutes to run.

As a programming exercise in C (to freshen my skills), I remade the program,
and it completed, on average, in about 2 seconds.

From 20 minutes to 2 seconds... C's efficiency, combined with the power left
to the programmer, will always make it a favorite for years to come. It's
just simply fun.
 
João Jerónimo

Richard said:
...which has nothing to do with the typedef. The typedef doesn't create a
compound type. It merely creates a synonym for an existing type.

Yes, but it makes the use of compound types much simpler, exactly
because of the alias...

JJ
 
João Jerónimo

Malcolm said:
Bool breaks libraries.

That, in a nutshell, is the typedef problem. The ability to define aliases
for basic types makes code unreadable, and forces all code to use a
particular convention, making it difficult to integrate functions from two
sources.

That would be solved if C had some more precise types...
Instead of relying on "at least x bits" types, C should have defined
types with exactly 8 bits, 16 bits, and 32 bits...
This was difficult at the time C was designed, because of the
diversity of architectures back then (I think there were
bit-addressable machines, right?)...
But now "byte" is a synonym for "octet", right? What's the problem with that?

JJ
 
jacob navia

Richard Heathfield wrote:
When you start talking about "bugs in the language", you're not trying to
start a constructive discussion - you're trying to start a flame war.
Nevertheless, if you can find some serious bugs in the language, that's
important enough to merit serious and indeed constructive discussion. But
if you're just being silly, well, that's just silliness. Let's find out
which it is, shall we?




That's not a bug. It's a design decision. You might not agree that it's a
good decision, but it's a conscious, deliberate decision nonetheless.

No bugs so far.

Customer: HEY! Your damned program erased all my data files!
Programmer: Of course. You forgot to read the documentation, page 2643,
paragraph 76: if the customer doesn't check the dialog button
"Do not erase all my data files" in menu item 8, submenu
26, the program will erase them.

IT IS NOT A BUG! IT IS A FEATURE!

Any bug can be converted into a "design decision", since a design that
is at the root of COUNTLESS buffer overruns, virus attacks, etc., is
obviously correct.
What confusion? I don't get them confused, and neither does anyone who has
taken the trouble to learn the language.

Of course (see above). This is not a bug, it is a "conscious design
decision". Nevertheless, it is not immediately obvious to anyone outside
the C pros why

#include <stdio.h>

int array[2765];

void fn(int array[2765])
{
    printf("sizeof array is: %u\n", (unsigned)sizeof(array));
}

int main(void)
{
    fn(array);
    return 0;
}

This prints (on a 32-bit implementation):
sizeof array is: 4

Ahhh. OF COURSE. Arrays "decay". This is a C-only concept,
and it surely needs a lot of convoluted explanations,
as the countless C-FAQ entries prove.
No, array information is never destroyed. It isn't always *conveyed*, but it
is never destroyed except when the array is destroyed.

Yes, of course, since when I pass the array in the program above
it is just not passed as an array, even though C has pass-by-value
semantics...

Then pedantic people will say that all is well, since if arrays
aren't passed as arrays but as pointers by definition, sizeof
still works OK.

But everyone else understands that in the above example the function "fn"
never receives the array at all.
No bugs so far.

Of course. Only "conscious design decisions", like trigraphs...
Since both your premises are false, what hope is there for your conclusion?
But the lack of bounds checking in C does not in fact lead to endless bugs.
What leads to endless bugs is "not knowing what one is doing", and that is
true for any programming language and indeed any engineering discipline.

Of course.

C programmers never have bugs, since, if someone has a bug, he does not
"know what he is doing", and hence he is not a C programmer. Obviously
only Mr Heathfield qualifies as a C programmer, then (maybe along with
Mr Dan Pop, who also told me that he never had a bug...)
Getting rid of zero-terminated strings is not a fix or an improvement.
Removing choice is not the best way to persuade a C programmer that you're
on his side.

Who told you that C strings aren't supported? They are supported, OF
COURSE.

What I do is PRECISELY to give programmers the choice. They can now
choose between C strings and the String library. In YOUR world there is
NO OTHER CHOICE but C strings!!!

And arrays are not pointers, so you can't get rid of "arrays
as pointers".

Yeah, of course I can. Look at lcc-win32.
Why is that error-prone?

Because humans are not machines: the human circuit (i.e. the brain)
is not a computer, but a vastly more complicated circuit than any
computer in existence.

Such a circuit is able to build circuits (something computers
aren't able to do) and to program computers (also
something computers aren't able to do), but it is ERROR-PRONE; i.e.,
due to the way the brain's circuitry works, it is not able to reproduce
a lot of mechanical acts without sometimes FAILING.

If I tell you to add thousands of digits thousands of times,
you WILL make a mistake, even if you are an expert.

If I ask you to keep track of thousands and thousands of memory areas
and never make a mistake when releasing the allocated memory, you
WILL make a mistake, even if you claim here that it will never
happen to you.

The problem with malloc/free is that it is not scalable. You can get away
with it in small systems and in single-threaded applications.

In a multi-threaded complex application, where there are thousands or
millions of allocated pieces of memory, it is another, completely
different story...
Yes. When you want memory, you ask for it, and if possible you'll get it.
And when you're done, you give it back.

How do you know when you are done?
That is precisely the question. You have to know exactly when each piece
of memory is needed and when it is not. Since it is SO EASY to make an
alias in C, how do you know that in all that complex code there isn't an
alias for this piece of memory???

What could be easier?

Easier would be:
"When you want memory, you ask for it, and if possible you'll get it.
The system will detect when you are done with it and
release it."

That IS easier...


How is this
"error-prone"? You have never satisfactorily answered this.

I can't answer it for you, since you claim never to make a mistake...
For all other people, however, the reasoning is obvious.
Memory management is too important to be left in the hands of the system.

Nobody takes memory management away from you. Just the finding of unused
memory is taken from you. It is still the programmer who allocates
memory. This is like saying that an automatic car doesn't let the driver
drive the car...

Nonsense
Let me know when you find something wrong.

Nothing is wrong, Heathfield. For programmers like you, who never
make mistakes, nothing is wrong. I am speaking for the other ones,
like me, who DO make mistakes.

Nothing you have mentioned so far
constitutes a "bug" in C.

There is no blinder man than the one who doesn't want to see.
 
jacob navia

Malcolm wrote:
It depends what you mean by bugs. C has a lot of weaknesses that are
inherent in a portable assembler: no special error-handling mechanism, no
garbage collection, no array bounds guarding. None of this could be changed
without altering the design premises of the language, except maybe
error-handling, and even then no one has found a good way that doesn't
involve hidden control paths.

With just one change to the language (operator overloading)
I have been able to develop an experimental compiler
that eliminates some of those problems.

Operator overloading allows lcc-win32 to use containers
(accessed with the array notation [ ]) and, with that, to have
arrays and strings that are bounds-checked.

Lcc-win32 offers a garbage collector in its standard distribution.
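Standard C has no operator overloading, but the idea behind such a container can be sketched with plain functions (the names here are hypothetical, not lcc-win32's actual API); what [ ] overloading buys is only nicer syntax around the same check:

```c
#include <assert.h>
#include <stdlib.h>

/* A bounds-checked int array: an illustrative sketch, not lcc-win32's API. */
typedef struct {
    size_t length;
    int *data;
} CheckedArray;

CheckedArray checked_new(size_t n)
{
    CheckedArray a = { n, calloc(n, sizeof(int)) };
    assert(a.data != NULL);
    return a;
}

/* With operator overloading, these checks would hide behind a[i]. */
int checked_get(const CheckedArray *a, size_t i)
{
    assert(i < a->length);   /* the bounds check C's built-in [] omits */
    return a->data[i];
}

void checked_set(CheckedArray *a, size_t i, int v)
{
    assert(i < a->length);
    a->data[i] = v;
}
```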
 
jacob navia

Richard Heathfield wrote:
When you start talking about "bugs in the language", you're not trying to
start a constructive discussion - you're trying to start a flame war.

This is precisely what I do not accept.

Why be blind to the flaws of the language?

"C is MY language. Take it or leave it!!!"

?????

Why isn't it possible to discuss the flaws of the language
without flame wars?

I remember the early days of Unix, when EVERY
manual page had a BUGS section. Has this attitude been
lost in this "language nationalism"???

jacob
 
Richard Heathfield

jacob navia said:
Richard Heathfield wrote:

Any bug can be converted to a "design decision", since a design that
is at the root of COUNTLESS buffer overruns, virus attacks, etc, is
obviously correct.

And any design decision can be called a bug. For example, the whole of
lcc-win32 can be called a bug. So what? The fact that you can abuse
null-terminated strings doesn't mean that null-terminated strings are a
language blemish. You can abuse anything if you try hard enough.
What confusion? I don't get them confused, and neither does anyone who
has taken the trouble to learn the language.

Of course (see above). This is not a bug, it is a "conscious design
decision". Nevertheless, it is not immediately obvious to anyone outside
the C pros, why [...] prints:
sizeof array is: 4

Nor is it immediately obvious to a newbie guitarist that guitar music is
written an octave high. I once knew a newbie guitarist who had broken
dozens of strings because of this, and yet he remained convinced that he
had to pitch his strings in such a way that he could play the music as
written. It took a long time to convince him otherwise. The ignorance of
the newbie is not the best yardstick for whether language features are a
good idea or not.
Ahhh. OF COURSE. Arrays "decay". This is a C only concept.

Not true. It's also true, for example, in C++.
And this needs surely a lot of convoluted explanations
as the countless C-FAQ prove.

No, it's all perfectly straightforward, and is explained very clearly in
K&R2. Anyone with the nous to read that is not going to struggle for long.
Yes of course, since when I pass the array in the program above
it is just not passed as an array, even if C has pass by value
semantics...

The expression's value is passed. Before that value is passed, it has to be
calculated. In your example, the value of the expression consisting solely
of the name of the poorly-named 'array' array is the address of the first
element of the array, and that element's address is passed by value, its
type being pointer-to-int. That your poorly-prototyped function doesn't
make it clear that it is receiving a pointer-to-int because it uses ancient
syntax, does not change the fact that what is being passed and received is
a pointer.
Then pedantic people will say that all is well since if the arrays
aren't passed as arrays but as pointers by definition, the sizeof
still works ok.

The array isn't passed at all! What is passed is an expression's value.
But everyone else understands that in the above example function "fn"

The example function was poorly written, and does not serve as a good
example.
Of course. Only "conscious design decisions", like trigraphs...
Right.


Of course.

C programmers never have bugs, since, if someone has a bug, it is not
"knowing what he is doing", hence he is not a C programmer.

More usefully, if one's program has a bug, it means that the programmer does
not understand his program. (This may be because he doesn't understand the
rules of the language, or it may not.) The way to get rid of the bug, then,
is not to change the program or the language, but to increase the
programmer's understanding of his program. Once he understands it fully, he
will see why it does not do what he intends.
Obviously
only Mr Heathfield qualifies as a C programmer then (maybe with the
company of Mr Dan Pop, that also told me that he never had a bug...)

See above. Anyway, I doubt very much whether Mr Pop would have told you
that. He's far too sensible.
Who told you that C strings aren't supported? They are supported OF
COURSE.

So you're getting rid of them *and* supporting them? What colour is the sky
on your planet?
What I do is PRECISELY to give programmers the choice. They can now
choose between C strings and the String library. In YOUR world there is
NO OTHER CHOICE but C strings!!!

Sure there is. If you don't like C strings, there are lots of string
libraries out there, all with varying designs, performance characteristics,
etc. Lots of choice. Me? I use one I wrote myself. Why? Because I know I
can move it around the place. What I can't do is write a program that uses
lcc-win32-only features and *guarantee* that I can move that program to
another computer (say, for example, the S2/NX) and still have it work,
straight out of the box, on that machine's native conforming C compiler.

Yeah, of course I can.

No, you can't. This is very, very simple.

Arrays are not pointers.
Therefore there is no "arrays are pointers" feature.
Therefore you can't have got rid of such a feature.

The absence of such a feature does not mean you have got rid of it. It can
mean that the feature never existed in the first place, and that's what it
means here.
Look at lcc-win32.

Your utter inability to understand simple logic does not make a good
advertisement for your product.

Because humans are not machines, and the human circuit (i.e. the brain)
is not a computer, but a vastly more complicated circuit than any
computer in existence.

This does not explain why malloc/free is error-prone.
Such a circuit is able to build circuits (something computers
aren't able to do) and is able to program computers (also
something computers aren't able to do) but it is ERROR PRONE, i.e.
due to the way the brain's circuitry works, it is not able to reproduce
a lot of mechanical acts without sometimes FAILING.

This, again, does not explain why malloc/free is error-prone.
If I tell you to add thousands of digits thousands of times
you WILL make a mistake, even if you are an expert.

So what? If I tell you to write int main(void) by hand thousands of times
you WILL make a mistake, even if you are an expert. That does not mean int
main(void) is error-prone.
If I ask you to keep track of thousands and thousands of memory areas
and never make a mistake when releasing the allocated memory you
WILL make a mistake even if you claim here that it will never
happen to you.

I would claim no such thing. I would, however, claim that tracking such
things down is not nearly as difficult as you imagine.

The problem with malloc/free is that it is not scalable. You can get away
with it in small systems and in single-threaded applications.

In a multi-threaded complex application, where there are thousands or
millions of allocated pieces of memory it is another, completely
different story...

You have not demonstrated your case. You have, however, provided some
evidence for eschewing multi-threading, which is non-standard in any case
and, as you say, enormously increases the complexity of an application for
no real benefit.
How do you know when you are done?

When you've given back the last piece that you received.
That is precisely the question. You have to know exactly when each piece
of memory is needed, and when not. Since it is SO EASY to make an alias
in C, how do you know that in all that complex code there isn't an
alias for this piece of memory???

Because I understand the program, either because I wrote it or because I've
taken the well-remunerated time to study it long and hard.
Easier would be:
"When you want memory, you ask for it, and if possible you'll get it.
The system will detect when you are done with it and
release it."

That IS easier...

Slower, too. And unpredictable. And not portable.
I can't answer it for you, since you claim never to make a mistake...

I don't recall saying any such thing, ever. Please cite the message ID to
which you refer.
For all other people however, the reasoning is obvious.

I make mistakes just like anyone else, and that includes sometimes
forgetting to release memory that I allocated. Unlike you, however, I don't
see this as being a huge problem, because it's easy to detect and easy to
fix.
Nobody takes memory management from you. Just the finding of unused
memory is taken from you.

That's a contradiction in terms. Memory management includes deciding when to
release a memory resource.

It is still the programmer who allocates
memory. This is like saying that an automatic car doesn't let the driver
drive the car...

No, it's like saying an automatic doesn't allow the driver to manage the
gear-changing process as effectively, which is why many racing drivers
prefer manual gearboxes even though automatic transmission in racing cars
is far superior to that in road cars.

Nothing is wrong, Heathfield.

Fine, so what's all the fuss?

For programmers like you, who never make mistakes, nothing is wrong.

Oh, but I do make mistakes. So what?

I am speaking for the others like me who DO make mistakes.

The kind of mistakes you're talking about are trivial and easy to correct.
The kind of mistakes you *make* (in Usenet articles) are also trivial and
easy to correct, but unfortunately this involves knowledge and experience
of the C language and a mastery of elementary logic which your articles do
not demonstrate that you possess.

There is no man blinder than the one who does not want to see.

Okay, do you want to see? Here we go:

Null-terminated strings were a design decision. If you don't like it, by all
means use something else, but be sure to keep the source code around
because otherwise it'll be hell to port.

Arrays are not pointers. Pointers are not arrays. Never have been, never
will be. That's a design decision too. If you don't like it, the most
sensible thing you can do is find a different language, because this is
basic stuff.

The presence or absence of bounds-checking is not a language issue, but an
implementation decision. If you want to put bounds-checking into a C
implementation, that's your choice, provided that the implementation
correctly translates correct programs. But don't expect to be able to force
your choice onto other implementors.

This is very, very simple. So the question is: do you *want* to understand?
If so, we'll help.
 
R

Richard Heathfield

jacob navia said:
Richard Heathfield wrote:

This is precisely what I do not accept.

Why be blind to the flaws of the language?

C isn't perfect (for one thing, it's grown too big). But I specifically mean
that when ***YOU***, Jacob Navia, start talking about bugs in the language,
we know it really means that you're going to bang on about your
implementation's useless non-portable mods that *you* think of as
improvements to what *you* think of as bugs in C.
"C, is MY language. Take it or leave it!!!"

No, C is not your language to change as you please without consequences. C
is defined by ISO. If you want to change the language, talk to ISO.
?????

Why isn't it possible to discuss the flaws of the language
without flame wars?

It is, but you don't seem to understand what the real flaws are.
I remember the starting days of Unix, when EVERY
manual page had a BUGS section. Has this attitude been
lost in this "language nationalism"???

No, not at all. But what you think are bugs are not in fact bugs. Why not
track down the real bugs instead?
 
C

CBFalconer

jacob said:
Richard Heathfield wrote:

Customer: HEY! Your damned program erased all my data files!
Programmer: Of course. You forgot to read the documentation, page
2643, paragraph 76: If the customer doesn't check the
dialog button "Do not erase all my data files" in
menu item 8, submenu 26, the program will erase them.

IT IS NOT A BUG! IT IS A FEATURE!

Any bug can be converted to a "design decision", since a design that
is at the root of COUNTLESS buffer overruns, virus attacks, etc., is
obviously correct.

You are welcome to use Ada and/or Pascal. This is virtually
guaranteed to avoid the above scenario. Not completely
guaranteed. Simply renaming ISO 10206 to be C0x would do the whole
job. However, it would break existing code.
.... big snip ...

The problem of malloc/free is that it is not scalable. You can get
away with it in small systems, and in single-threaded applications.

In a multi-threaded complex application, where there are thousands
or millions of allocated pieces of memory it is another, completely
different story...

Funny thing, my hashlib subsystem, written in purely standard C,
routinely controls millions of allocations without any known memory
leaks. It may have something to do with being systematic.
 
