Exceptions in C/C++


jacob navia

Ed said:
I'm not against C evolving, I just recognize the fact that even if the
standard evolves, it won't be widely implemented enough to matter.


You're right that some implementors have implemented C99, but I'm not
sure that's enough to keep C from losing more and more developers.

The problems that C has are fixable, and the
interest that simple languages arouse is proof that
a simple language is better for many applications than an overly
complex one like C++.

My compiler system has been downloaded more than 500 000
times from our main distribution site. Not counted there
are all the downloads from other sites. Last weekend
we went past 2 000 downloads in a single weekend.

C is interesting for many people. Many users of lcc-win
like it because it is so simple, small and efficient,
like the language it implements.
Sorry, Jacob, I didn't mean it to sound offensive, I just think you
have plenty of good ideas and lots of energy that might be better
spent elsewhere.

Ok, do not worry. I think my energies are well spent here because
what I want is to convince people. If you only try to convince people
who already agree with you, nothing is gained.


:)
 

CJ

My compiler system has been downloaded more than 500 000
times from our main distribution site. Not counted there
are all the downloads from other sites. Last weekend
we went past 2 000 downloads in a single weekend.

"1,000,000 heroin addicts can't be wrong!"
(seen on a tee shirt)
 

James Kuyper

jacob said:
You confirm my sentence above.

Pointing out that changes to the standard don't seem to be effective in
creating changes in real-world compilers is a very different thing from
saying that changes are bad.
 

jacob navia

James said:
Pointing out that changes to the standard don't seem to be effective in
creating changes in real-world compilers is a very different thing from
saying that changes are bad.

Well, let's hope that was Ed's intent. Personally, I do not
think that trying to evolve C is pointless.
 

jacob navia

James said:
Pointing out that changes to the standard don't seem to be effective in
creating changes in real-world compilers is a very different thing from
saying that changes are bad.

Since C99 left the standard library virtually
untouched (and this is one of the main problems with C),
not many people felt the need to dedicate any effort to a
language that is perceived as obsolete and not worth the effort.

Since C++ is the better C, there is no need to improve C, and C99 was
ignored.

The main errors of C99 were:

1) The obsolete C library (not even gets() was taken out)

2) The bad string data type was maintained without any
real alternative even being considered...

3) The error-prone malloc/free system was left untouched.
Not even the most obvious enhancement (a library call
returning the size of an allocated block) was
introduced: size_t malloc_sizeof(void *);

4) The attitude of "C is completely hopeless anyway, so
let's leave it like this" is still there.

The improvements were very small, and in some areas, completely
arbitrary.

o Complex numbers. These could be provided much more easily
through operator overloading. As they stand, they add
another three types of numbers to the already quite rich
number zoo.

o Arbitrary changes not justified by anything. "main" now
returns zero by default. Great, so you can't know whether the
programmer forgot to give main a return value or
really wanted to return zero. And this means that compilers must
special-case "main", check whether a return value is
present, and add one at the end if not...
o Good ideas were not specified to the end:

    int fn(int n)
    {
        int array[n];
    }

OK, that is a good idea, but it would be much better if the
programmer could write

    if (array == NULL) {
        /* error handling in case the allocation FAILED! */
    }
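To make the complaint concrete, here is a sketch contrasting a C99 VLA, whose allocation failure cannot be detected, with a malloc-based version, where it can (function names here are illustrative, not from any standard):

```c
#include <stdlib.h>

/* With a C99 VLA there is no portable way to detect allocation
   failure; if the array cannot be allocated, behavior is undefined. */
int sum_vla(int n)
{
    int array[n];              /* no way to test whether this FAILED */
    int i, s = 0;
    for (i = 0; i < n; i++)
        array[i] = i;
    for (i = 0; i < n; i++)
        s += array[i];
    return s;
}

/* With malloc, the failure is detectable: */
int sum_malloc(int n)
{
    int *array = malloc(n * sizeof *array);
    int i, s = 0;
    if (array == NULL)
        return -1;             /* error handling: the allocation failed */
    for (i = 0; i < n; i++)
        array[i] = i;
    for (i = 0; i < n; i++)
        s += array[i];
    free(array);
    return s;
}
```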


I do not believe C99 is a quantum step forward, but it is a step forward
in some points, and anyway, it is the definition of the language as it
stands today.

Problem is, nobody is addressing all the above problems.
 

Richard Heathfield

jacob navia said:
Since C99 left the standard library virtually
untouched (and this is one of the main problems with C)

<shrug> It needs to lose gets(), ato*(), and perhaps one or two others, but
it's more or less there. qsort, bsearch, and strtok all need updating (or
rather, new versions need to be written, complete with new names).
not many people felt the need to dedicate any effort to a
language that is perceived as obsolete and not worth any effort.

Perhaps the implementors simply felt that they'd already implemented C and
didn't see any point in doing it again. Whatever the implementors' reasons
for ignoring C99, the fact is that most of them /have/ ignored it - and
there seems little point in pressing for further changes to C when the
last lot of changes hasn't been implemented yet.
Since C++ is the better C, there is no need to improve C, and C99 was
ignored.

It is certainly true that C99 was ignored. I don't accept that "C++ is the
better C", and neither do I accept that there is no need to improve C.
The main errors of C99 were:

1) The obsolete C library (not even gets() was taken out)

Yes, a lot of functions should have been removed, but weren't. But this
isn't a reason not to implement C99. If people don't want those functions,
they can simply avoid using them.
2) The bad string data type was maintained without any
real alternative even being considered...

Wrong - C doesn't have a string type, so it can hardly have a bad string
type.
3) The error prone malloc/free system was left untouched.

Or to put it another way - the simple, powerful malloc/free system was
correctly left untouched.
4) The attitude of "C is completely hopeless anyway, so
let's leave it like this" is still there.

I don't know of any competent programmer who has that attitude towards C.
The improvements were very small, and in some areas, completely
arbitrary.

It is certainly the case that a doubling of the page count did not approach
anything like double the utility or power of the language and library.
o Complex numbers. This can be done as a result of operator
overloading much more easily. As they are now, they add
another 3 types of numbers to the already quite rich
number zoo.

Better still, just leave them out completely. People who need complex
numbers will have no difficulty implementing them using structs and
doubles.
o Arbitrary changes not justified by anything. "main" now
returns zero by default. Great, so you can't know whether the
programmer forgot to give main a return value or
really wanted to return zero. And this means that compilers must
special-case "main", check whether a return value is
present, and add one at the end if not...

That is indeed a ridiculous change.
o Good ideas were not specified to the end:

    int fn(int n)
    {
        int array[n];
    }

OK, that is a good idea, but it would be much better if the
programmer could write

    if (array == NULL) {
        /* error handling in case the allocation FAILED! */
    }

It is because you can't do something of that kind that VLAs are actually a
*bad* idea rather than a good one.


I do not believe C99 is a quantum step forward,

I disagree. (A quantum is the smallest possible step.)
but it is a step forward in some points,

Compound literals are quite nice. That's about it, really. What a shame
that it remains mostly unimplemented.

and anyway, it is the definition of the language as it
stands today.

A pretty pointless definition, in practice - until implementors pull their
fingers out and get implementing it.
Problem is, nobody is addressing all the above problems.

There is no universal agreement that they *are* problems.
 

James Kuyper

jacob navia wrote:
....
Since C99 left the standard library virtually
untouched (and this is one of the main problems with C)

Entirely new headers include <iso646.h>, <complex.h>, <tgmath.h>,
<inttypes.h> and <stdint.h>, <stdbool.h>. Changes were made to <float.h>
and <stdarg.h>. Substantial additions were made to the format strings
used by the printf() family. The vscanf() and snprintf() families of
functions were added to <stdio.h>. The single biggest and most important
library change was the addition of a huge number of functions to the
<math.h> library.

If what you're complaining about is that they didn't make the changes
you think should have been made, then say so. But please don't try to
suggest that they made almost no changes.
 

Ben Pfaff

Richard Heathfield said:
<shrug> It needs to lose gets(), ato*(), and perhaps one or two others, but
it's more or less there. qsort, bsearch, and strtok all need updating (or
rather, new versions need to be written, complete with new names).

The biggest problem with ato*() is the undefined behavior upon
out-of-range values. Thus, I think that ato*() could be saved
simply by requiring them to be implemented as the obvious
wrappers around strto*(). Many implementations do so anyhow.
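Such wrappers could look like this (a sketch; my_atoi and my_atof are illustrative names, not proposed standard ones):

```c
#include <stdlib.h>

/* Ben's suggestion, sketched: define ato*() as the obvious wrappers
   around strto*(). Unlike atoi(), strtol() has defined behavior on
   out-of-range input: it sets errno to ERANGE and clamps the result
   to LONG_MIN or LONG_MAX. */
static int my_atoi(const char *nptr)
{
    return (int)strtol(nptr, NULL, 10);
}

static double my_atof(const char *nptr)
{
    return strtod(nptr, NULL);
}
```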
 

jacob navia

James said:
jacob navia wrote:
...

Entirely new headers include <iso646.h>,

This is iso646.h:

#define and &&
#define and_eq &=
#define bitand &
#define bitor |
#define compl ~
#define not !
#define not_eq !=
#define or ||
#define or_eq |=
#define xor ^
#define xor_eq ^=

And you tell me that this is a BIG STEP FORWARD ???

<complex.h>,

Complex numbers make the language heavier without solving the
problem of how to add new numeric types to the language.

<tgmath.h>,

Generic functions are a GREAT idea ("tg" stands for "type generic"),
but they weren't incorporated into the language, *only* into some
functions!
<inttypes.h>

Well, there is nothing spectacular there. Useful yes.

and <stdint.h>, <stdbool.h>.

bool is a good idea.

Changes were made to <float.h>
and <stdarg.h>.

Nothing substantial really.
Substantial additions were made to the format strings
used by the printf() family.

No. Only long long and long double support, if I recall correctly.
The vscanf() and snprintf() families of
functions were added to <stdio.h>. The single biggest and most important
library change was the addition of a huge number of functions to the
<math.h> library.


Yes, but erf(), lgamma() and the others aren't really essential to the
language itself.
If what you're complaining about is that they didn't make the changes
you think should have been made, then say so. But please don't try to
suggest that they made almost no changes.

If you reread my message I never said they made no changes!

I have been trying to implement libraries and language modifications
for a long time, and it is hard, really. Last weekend I
implemented

    int fn(void)
    {
        int array[] = { 1, 2, 3, SomeFunction(56), 88, 66 };
    }

And it took me a LOT of time to do that without breaking the small
compiler I distribute.
 

Richard Tobin

The main errors of C99 were:

1) The obsolete C library (not even gets() was taken out)

2) The bad string data type was maintained without any
real alternative even being considered...

3) The error-prone malloc/free system was left untouched.
Not even the most obvious enhancement (a library call
returning the size of an allocated block) was
introduced: size_t malloc_sizeof(void *);

I don't think those are the problems at all. Removing gets() would
not have any real world effect. If you don't like the strings, then C
is probably not your language. Likewise malloc() and free(), though I
have nothing against optional garbage collection. The ability to find
the size of an allocated block is only of the tiniest use.

I think the problem is that the changes weren't of much interest. Complex
numbers are needed only by a tiny corner of the C community, and could have
been provided by a completely separate standard. Similarly the floating
point stuff. Variable-length arrays are useful, but not so useful as to
justify C90-incompatibility in themselves, for most programmers. There's
just no compelling reason for most users to change.

-- Richard
 

jacob navia

Richard said:
I don't think those are the problems at all. Removing gets() would
not have any real world effect. If you don't like the strings, then C
is probably not your language. Likewise malloc() and free(), though I
have nothing against optional garbage collection. The ability to find
the size of an allocated block is only of the tiniest use.

That would allow BOUNDS CHECKING of an object access!

That is not "tiny" (at least in my opinion :)

The problem NOW is that, given a pointer to a malloc'ed object, there is
no way to know how big it is, so there is no way to verify whether
accessing a given offset would be out of bounds or not!
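Pending such a library function, the idea can be sketched as a wrapper that records the size itself (xmalloc, xmalloc_sizeof and xfree are hypothetical names, not part of any standard):

```c
#include <stdlib.h>

/* The size is stored in a header just in front of the block
   handed to the caller. */
union header {
    size_t size;
    long double align;   /* force worst-case alignment for the user data */
};

static void *xmalloc(size_t n)
{
    union header *p = malloc(sizeof *p + n);
    if (p == NULL)
        return NULL;
    p->size = n;
    return p + 1;        /* caller's pointer starts after the header */
}

static size_t xmalloc_sizeof(void *q)
{
    return ((union header *)q - 1)->size;
}

static void xfree(void *q)
{
    if (q != NULL)
        free((union header *)q - 1);
}
```

Note the limitation Richard Tobin raises below: this only works for pointers that came from xmalloc itself, not for pointers into the middle of a block.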
 

James Kuyper

Richard said:
jacob navia said: ....

Better still, just leave them out completely. People who need complex
numbers will have no difficulty implementing them using structs and
doubles.

Implementing even ordinary arithmetic for complex numbers has pitfalls.
Naive implementations of complex multiplication or division are prone to
unnecessary overflows and loss of precision. The transcendental
functions are much worse. Only a very small percentage of programmers
ever have a need to call a transcendental function with a complex
argument, but that small percentage is still a large number of
programmers. It is a much larger number than the number of programmers
who have the expertise needed to implement such functions reliably,
efficiently, and accurately. I don't have that expertise; I know just
enough about the pitfalls to be glad that I can rely upon someone else,
hopefully substantially better qualified than I am, to have implemented
those functions correctly.
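The division pitfall alone makes the point: the textbook formula computes c*c + d*d in the denominator, which overflows long before the true quotient does, so careful implementations use a scaled form such as Smith's. A sketch with plain doubles (function names are illustrative):

```c
#include <math.h>

/* Naive (a+bi)/(c+di): overflows in the denominator for large c, d. */
static void cdiv_naive(double a, double b, double c, double d,
                       double *re, double *im)
{
    double denom = c * c + d * d;   /* inf for c or d near 1e200 */
    *re = (a * c + b * d) / denom;
    *im = (b * c - a * d) / denom;
}

/* Smith's scaled form: divide through by the larger of |c|, |d|
   first, so no intermediate exceeds the magnitude of the inputs. */
static void cdiv_smith(double a, double b, double c, double d,
                       double *re, double *im)
{
    if (fabs(c) >= fabs(d)) {
        double r = d / c, t = 1.0 / (c + d * r);
        *re = (a + b * r) * t;
        *im = (b - a * r) * t;
    } else {
        double r = c / d, t = 1.0 / (c * r + d);
        *re = (a * r + b) * t;
        *im = (b * r - a) * t;
    }
}
```

For example, (1+0i)/(1e200+1e200i) is about 5e-201 - 5e-201i; the naive version's denominator overflows to infinity and yields zero, while Smith's form returns a finite, correct result.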

The language-level support of complex numbers opened up C for the first
time as a reasonable alternative to languages such as Fortran for
certain kinds of numerical work. Implementation as structs would have
required horribly convoluted code, unless C++ features such as
references and operator overloads were added as well. Whether the
group of people who benefited was large enough to justify adding complex
math to the language is a different question, but there was little point
in doing it at any level other than direct support in the language itself.
 

CJ

3) The error-prone malloc/free system was left untouched.
Not even the most obvious enhancement (a library call
returning the size of an allocated block) was
introduced: size_t malloc_sizeof(void *);

With due respect, this is an idiotic suggestion. The programmer knows
what parameter he passed to malloc, so he already knows the allocated
size without needing a special library function to tell him.
 

CBFalconer

Ed said:
C99 is proof that trying to evolve C is probably pointless.

Implementors are already ignoring C99. Even if your suggested changes
became part of a future C standard, chances are implementors would
just ignore that standard, just as they ignore the C99 standard.

Your energies would probably be better spent working on your own
language or perhaps on the D programming language or something.

No, they simply are not willing to put in the effort. I suspect
that once the gcc library problems are solved, the presence of that
one will galvanize a few more. Of course, attempts to go off in
unapproved, peculiar directions with incompatible extensions do not
help.
 

CBFalconer

jacob said:
.... snip ...

Since C++ is the better C, there is no need to improve C, and C99
was ignored.

The main errors of C99 were:

1) The obsolete C library (not even gets() was taken out)

Much of the library has to be system-specific. It is one thing to
build a library for x86 use. It is quite another to build
libraries for all the systems (or even the majority of them) that
gcc already supports. Almost all the purely compiler work for the
gcc system has been done.

Are you not capable of ignoring gets() when writing software? You
could use ggets, which is not standard, but will compile on any
standard system. It duplicates the ease of use of gets,
without the problems. See:

<http://cbfalconer.home.att.net/download/>
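For illustration only (this is a sketch, not CBFalconer's actual ggets), a minimal fgets-based replacement for gets() looks like:

```c
#include <stdio.h>
#include <string.h>

/* Reads at most size-1 characters into buf and strips the trailing
   newline, so unlike gets() it cannot overflow the buffer. Returns
   buf, or NULL on end-of-file or error. */
static char *read_line(char *buf, size_t size, FILE *stream)
{
    if (size == 0 || fgets(buf, (int)size, stream) == NULL)
        return NULL;
    buf[strcspn(buf, "\n")] = '\0';   /* drop the newline, if present */
    return buf;
}
```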
 

CBFalconer

James said:
jacob navia wrote:
...

Entirely new headers include <iso646.h>, <complex.h>, <tgmath.h>,
<inttypes.h> and <stdint.h>, <stdbool.h>. Changes were made to
<float.h> and <stdarg.h>. Substantial additions were made to the
format strings used by the printf() family. The vscanf() and
snprintf() families of functions were added to <stdio.h>. The
single biggest and most important library change was the addition
of a huge number of functions to the <math.h> library.

iso646.h has been available since C95.
 

Richard Tobin

That would allow to BOUNDS CHECK an object access!

In certain very limited circumstances. You would have to know that
the base pointer you were checking had been returned by malloc(). You
couldn't use it if the value might not be malloc()ed, or if it might
point somewhere inside a malloc()ed block, e.g. an array element.

If you're thinking of doing this check in a library, you're unlikely
to be able to rely on that.

-- Richard
 

James Kuyper

jacob said:
....
And you tell me that this is a BIG STEP FORWARD ???

No. I deliberately listed the changes in order of increasing magnitude.
I'm saying that this change, *in combination with the other items on
my list*, is far more change than can reasonably be described by the
phrase "virtually untouched".

I didn't specify whether these were forward changes or backward changes
(or side-ways changes, for that matter). I happen to like most of the
changes, apparently much better than you do. But that's not the point of
what I was saying. Whether good or bad, the changes were too extensive
to be described the way you described them.

May I presume that your own implementation fully supports all of the
changes that you considered to be so negligible as to describe the
standard library as "virtually untouched"? If not, why not? If the
changes were indeed as minor as you claim they were, it would surely be
no great problem to fully implement them.

....
No. Only long long and long double support, if I recall correctly.

It also added the "%a", "%A" and "%lf" format specifiers.
Yes, but erf() lgamma and others aren't really essential to the
language itself.

We are talking about the library, not the language itself, are we not?

For many people doing serious numerical work, the new functions are indeed
essential. Few people need all of them, but many people need at least
one of them.
If you reread my message I never said they made no changes!

And I didn't claim that you did. I described you as claiming they made
"almost no changes", not "no changes". Your actual wording was
"virtually untouched". In this context, "virtually untouched" and
"almost no change" are not significantly different from each other.
 

James Kuyper

CJ said:
With due respect, this is an idiotic suggestion. The programmer knows
what parameter he passed to malloc, so he already knows the allocated
size without needing a special library function to tell him.

A conforming implementation is free to allocate a block larger than
requested; most real implementations do so. Some implementations round
upward to the next multiple of a fixed block size. Some implementations
round upward to the next power of 2.
 
