Boost process and C

A

Andrew Poelstra

Andrew said:
Quite frankly, a lot of us are happy with the C of 1980. In terms of
forward-looking features, C++ seems the place to go, since C itself will
(should) never use the object-oriented paradigm.

Try posting some ideas in comp.std.c; see for yourself what reaction
you'll get.
Oops! I meant 1989... or whenever C89 was finalized.
 
I

Ian Collins

Andrew said:
Oops! I meant 1989... or whenever C89 was finalized.
I wasn't surprised by your original post; I've known C programmers who
scoffed at those new-fangled prototypes :)
 
J

jacob navia

I am impressed by the depth of your argumentation.

You must be a brilliant person really

:)
 
W

websnarf

One of the very best things that the original ANSI-ification of C did
was to write a standard that described a language that was in use, as
opposed to legislating a language from on high. That process should be
a model for all standardization attempts, everywhere - too many other
standardization processes attempt to go and invent something completely
new, which leads to standards that don't parallel reality.

What? This is a very narrow and unbalanced view of the standard.

ANSI pulled together the largest common subset of all the incompatible
C's out there, then made a few tough choices, and in fact with C89
discarded the original K&R prototypes and other nonsense to make it a
practically usable language that compiler implementers could reasonably
support.

The *PROBLEM* is that they did absolutely nothing outside of those
confines with C99 -- so why has C99 been such an utter failure? Didn't
they follow this supposed perfect recipe for standards?

The primary reason for the success of any standard is that it delivers
*REAL VALUE* that wasn't there before. In 1989, some degree of
unification (so that programmer skills became transferable) was
extremely valuable (read: 100s of billions of dollars kind of
valuable). In 1999 we already *had* a unified standard, so what did
they add that had any real value? The rate of adoption, its actual
usage, and the culture that followed it tells us exactly: nothing.

Standards, where they just pushed forward with "ideas from on high"
include C++ and Java, which as far as I can tell are quite successful
and have plenty of buy in.
Now, where did those features and ideas that came into common use after
K&R's first book was published come from? It seems that they just
popped up in compilers, everybody started using them, and so they made
sense to standardize.

These days, what groups are testing/working on new features or
extensions to the language? Comp.std.c is a great place for discussing
the current C language standards, but the C tradition seems to have new
features coming from places other than high committees.

I don't know what features would be great to have in the C of 2010 -

I do, but I don't seem to have the right cred for anyone to care.
maybe a more powerful strings toolkit,

*Sigh* ...
[...] maybe a collection of data structures,

Ask the libclc people how their project turned out. Actually the
SGLIB at least took a reasonable crack at it. Even the Boost people
started by writing C extensions.
[...] maybe even quadragraphs. Who knows? But what is important is
that someone is considering these things and asking these questions.

And as Jacob Navia says, if they bring it here, they get shot down.
comp.std.c is not much better.

I have at various times posted about possible extensions I would like
to see -- the amazing negative reactions I get are just indescribably
disappointing. People don't care about the practice of programming
anymore, and they don't care about the capabilities of hardware they
paid good money for. And the idea of actually *taking something out*
of the language; that's just sacred ground that couldn't possibly be up
for discussion.
The Boost group seems to be doing that for C++; who is doing it here?

*Sigh* ...
 
K

Keith Thompson

ANSI pulled together the largest common subset of all the incompatible
C's out there, then made a few tough choices, and in fact with C89
discarded the original K&R prototypes and other nonsense to make it a
practically usable language that compiler implementers could reasonably
support.
[...]

What "original K&R prototypes" are you talking about?

K&R1 didn't have anything called "prototypes". It did have an older
style of function declarations:

int main(argc, argv)
int argc;
char **argv;
{
...
}

but that's still supported in both C90 and C99.
 
I

Ian Collins

jacob said:
Ian Collins wrote:


This is not possible. For some operators, C++ decides that they
need to be defined only within a class, and there you are. You are
forced to define classes, constructors, destructors, copy constructors,
and all the stuff.
That's because they only make sense in the context of an object; you
have to (for example) add something to something else. There are no
restrictions on function overloading. You could use a simple struct.
Besides, there are things that the C++ operator overloading
implementation gets wrong:

1) There is no overloading possible for higher dimensional arrays

array[2][3] is just impossible using overloaded operators.
Indeed.

2) There is no way to distinguish between assignment and reading when
accessing an array. This is especially important when you want to
implement read-only data areas.

There is a very simple technique known as proxy objects to solve this
problem, but that's a bit too OT for this forum.
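For the curious, here is a minimal sketch of that proxy-object technique: `operator[]` returns a small proxy whose `operator=` handles writes and whose conversion operator handles reads, so the two paths can be told apart. All names here are illustrative, and the fixed-size `int` array is just an assumption for the example.

```cpp
#include <cstddef>
#include <stdexcept>

class ReadMostlyArray {
    int data_[16] = {};
    bool read_only_ = false;

    // The proxy: created by operator[], it routes writes through
    // operator= and reads through the conversion to int.
    class Proxy {
        ReadMostlyArray& arr_;
        std::size_t idx_;
    public:
        Proxy(ReadMostlyArray& a, std::size_t i) : arr_(a), idx_(i) {}
        // Write path: a[i] = v
        Proxy& operator=(int v) {
            if (arr_.read_only_)
                throw std::logic_error("write to read-only array");
            arr_.data_[idx_] = v;
            return *this;
        }
        // Read path: int x = a[i]
        operator int() const { return arr_.data_[idx_]; }
    };

public:
    void freeze() { read_only_ = true; }   // make the data read-only
    Proxy operator[](std::size_t i) { return Proxy(*this, i); }
};
```

So `a[3] = 42;` goes through `Proxy::operator=` (and can be rejected once the array is frozen), while `int x = a[3];` goes through the conversion operator.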
 
C

Chris Hills

Michael Mair said:
I know ;-)

In any event "compliance" is anything you want as there is no official
certification or compliance testing for MISRA-C. Just like most C
compilers claim to be "ANSI C" or "ISO C".

However there will be an example suite for MISRA-C2 by the end of the
year. It will not be exhaustive so I expect it to grow over the next
couple of years.

Hopefully as MISRA-C3 is developed the example suite will be developed
with it and launched at the same time.
 
J

jacob navia

CBFalconer wrote:
Back in the last century I downloaded, and actually used, Navia's
system. I wondered at the time why it never showed up in the
recommended list of C systems for the x86, since it seemed fairly
usable after disabling some of the outre monstrosities (such as GC,
overloading, etc.) and it was fairly handy to be able to flip into
a debugger instantaneously.

Then I observed various things. For one, it depended on the
brain-dead Microsoft C dll and didn't create proper independent
modules.

This is no longer the case. It cost me years of development to do
that. Writing a C library from scratch is not something you do so
easily, unless your name is Chuck Falconer of course. Normal guys like me
take years.

This is hard to understand, especially for genius types of guys like Chuck.
There was never any indication of revision levels, except
that suddenly the debugger no longer worked at all, it just crashed
immediately (it apparently suddenly used Pentium only instructions,
without bothering to check what it was running on).

Mr Falconer insisted that I keep lcc-win32 compatible with his 486
system. I refused without a maintenance contract, and he did not forgive
me that to this day.

Such is life :)
The total lack
of regression checks had led to this, and the various brand new
bugs reported on the lcc newsgroup confirmed the lack of testing.
At the time I reported it, and there was no effort to repair it.

Not even that. I told him that I would not even look into it without a
fair payment. Why should I work for him for free? He was the only one
using a 486 that I have ever heard of.
It is probably all very well for someone willing to use Beta (at
best) software, but it is certainly not recommended. I think it
has done much harm to the reputation of the reputedly accurate LCC
compiler (which is not limited to Windoze use).

My conclusions: Lcc-win32 allows you to test Beta software, with
the advantage of easily creating totally non-portable off-beat
source which is useless elsewhere. Some sort of exercise in
masochism. I have kept these conclusions more or less to myself
for some time, but Navia's insistence on posting silly off-topic
material here and insulting those who object has aroused my ire.

Wow, I am impressed. I will commit suicide shortly.
His recent citing of Trollsdale as an authority goes beyond any
pale.

Maybe you should cite the context too?

I did not cite him as an authority but as a representative of a view
that is always repeated here:

C++ is the future, C is the past. If you want any improvement to C just
use C++.
 
B

Bill Pursell

jacob navia wrote:
If there is no clear place where the evolution of C can be
discussed, then it won't be, and C will not evolve.
Still, C has great potential for growth with some minor additions like
operator overloading, something that is accepted by more conservative
languages like Fortran, for instance.

Operator overloading is, IMHO, a really, really bad idea. I've only
been coding C for just under a year, and 11 months ago I was really
bent out of shape that I couldn't write:
struct foo A,B,C;
A = B + C;
but I'm really glad now that I can't, and I would hate to see operator
overloading be expanded in C. Operator overloading in C is the root
cause of a very large number of bugs already. How many bugs are a
result of "3+4" being different from "3.0 + 4.0"? Those bugs would
have been avoided had the programmer been required to type
"int_add(3,4)" or "float_add(3.0,4.0)". Now, I'm not arguing that the '+'
symbol be dropped for arithmetic on basic numeric types, but expanding
the language to allow '+' as an infix operator on user-defined structs
is just asking for trouble. The only gain is (arguably) cleaner code,
but quite frankly "A = foo_add(B,C)" is more informative than "A =
B+C" and less prone to error.
 
I

Ian Collins

jacob said:
well, that's my point. In C++ I can't avoid classes, constructors,
destructors and the whole complexity, even if I want to use a simple
thing like

typedef struct __int128 { int32_t part1, part2, part3, part4; } INT128;


INT128 operator+(const INT128 &a, const INT128 &b)
{
....
}

INT128 operator+(const INT128 &a, long long b)
{
....
}
struct int128_t { int32_t part1, part2, part3, part4; };

Other than that correction and the operator bodies, you've written all
you have to write.
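A minimal self-contained sketch of that point, assuming the four 32-bit fields are stored as an array of limbs so they can be looped over (the names and layout here are illustrative, not anyone's actual implementation): a plain struct plus one free function is all C++ requires, with no classes, constructors, or destructors.

```cpp
#include <cstdint>

// Plain aggregate: no constructor, destructor, or member functions.
// part[0] is the least significant 32-bit limb.
struct Int128 { std::uint32_t part[4]; };

// A free (non-member) operator+ is a legal overload because at least
// one parameter is of class type. Carries propagate between limbs.
Int128 operator+(const Int128& a, const Int128& b) {
    Int128 r;
    std::uint64_t carry = 0;
    for (int i = 0; i < 4; ++i) {
        std::uint64_t s = (std::uint64_t)a.part[i] + b.part[i] + carry;
        r.part[i] = (std::uint32_t)s;
        carry = s >> 32;
    }
    return r;
}
```

With that, `Int128 c = a + b;` works on values built by plain aggregate initialization, e.g. `Int128 a{{1, 0, 0, 0}};`.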
No, it is not OT because that confirms what I said:

You can't just take a small piece. YOU GET THE WHOLE PIG AT EACH SERVING!
No, how would you differentiate between read and write to an array in C?

If you want the extra fat, you pay for it, if you don't, you don't.

It's a popular misconception that C++ has to be more complex than C. It
doesn't.
 
J

jacob navia

Keith Thompson wrote:
Perhaps I should have pointed out that, although comp.std.c is the
best newsgroup to discuss changes to the C standard, there's no
guarantee that anyone will actually like your idea, and nobody has any
obligation to discuss it.




Adding a smiley to a pointless insult doesn't make it any less
insulting.

We had this discussion already. I started yet another discussion about
the need for a portable standard container library several times and the
answers were:

Mr Flash Gordon, a "regular" here, said (11 Oct 2004):

"Not only can a lot of programming be done without using hash tables,
lists, queues etc, but as I said a lot of programming *is* done without
using such things. Also the natural solution for a lot of problems does
not use such things."

Our dear Dan Pop, another "regular" said... (21 March 2004)

"It's been 12 years since I've used a binary tree the last time. In the
meantime, the most complex data structure I *needed* to use was an array
of structures. And I suspect that many programmers use linked lists
only in their homework assignments. "

And everyone accepted those things in silence. Nobody complained.

The standards committee doesn't even accept simple data structures like
strings. Lists, flexible arrays, stacks, queues, etc. must be rewritten
over and over, and are kept OUTSIDE the standard library.

Why?

Because everyone should use the STL, damn it!

I repeat that such an attitude towards data structures means that indeed
C is the past and C++ the dreaded future.

I am not inventing this, Keith, you should know. You participated in
those discussions.

And you know perfectly well that when I say such things, I am not
insulting anyone but precisely arguing AGAINST that frame of mind.

jacob
 
I

Ian Collins

jacob said:
Ian Collins wrote:
No, how would you differentiate between read and write to an array in C?

lcc-win32 proposes the [] operator for reading, and the []= for assignment
Fine, but how would you do it in standard C?
 
I

Ian Collins

Bill said:
Now, I'm not arguing that the '+'
symbol be dropped for arithmetic on basic numeric types, but expanding
the language to allow '+' as an infix operator on user-defined structs
is just asking for trouble. The only gain is (arguably) cleaner code,
but quite frankly "A = foo_add(B,C)" is more informative than "A =
B+C" and less prone to error.
I think Jacob's example applies equally well to this case.

You end up with the same mess that Java has to put up with.

Operator overloading enables you to use a user defined type in the same
way as a built in type. Much more intuitive.
 
M

Michael Mair

Chris said:
In any event "compliance" is anything you want as there is no official
certification or compliance testing for MISRA-C. Just like most C
compilers claim to be "ANSI C" or "ISO C".

There are "MISRA checkers", though, and I remember having seen
the option to activate MISRA-C checks in some lint tool, probably
PCLint.

However there will be an example suite for MISRA-C2 by the end of the
year. It will not be exhaustive so I expect it to grow over the next
couple of years.

Thank you for the information.

Hopefully as MISRA-C3 is developed the example suite will be developed
with it and launched at the same time.

This sounds promising -- I can only recommend this course of action;
the time and money are usually well invested.


Cheers
Michael
 
J

jacob navia

Keith Thompson wrote:
Are you suggesting that the C standard should be changed so that
strings are no longer terminated by '\0'?

Yes. If we have the length as a size_t we do not need to scan
memory at each access to get the length of the string, which is GREATLY
inefficient and has been pointed out as such for decades!
There are dozens of
standard library functions that use this representation, and it's
central to the semantics of string literals.

lcc-win32 proposed replacing
strcmp --> Strcmp
strcat --> Strcat

where the arguments are the same but take THOSE kinds of strings.

Similarly, the compiler when it sees:

String s = "This is a string";

would make an array of characters and prepend the length instead of
adding a terminating zero.
Conceivably you could add a new string representation *in addition to*
the existing one.
Yes.

You would then be permanently stuck with two
distinct and incompatible ways of representing strings. (Breaking
nearly all existing code is not an option.)

Yes, new code would use Strcmp, old would use strcmp.

I have tried porting code that uses strings heavily in the old
representation to the new one and it is relatively easy to do.
Of course, it's easy enough to implement this kind of thing in a
library using purely standard C; perhaps that's why there isn't much
enthusiasm for adding it to the language.

No. The problem is that you want to keep:

String s;
....

s[23] = 'b';

and not forcing people to use:

strindex(s,23)

strindexassign(s,23,'c');

or similar nonsense.
If you'll provide a pointer to the documentation, I might take a look
at it. (If I can't read the documentation without installing
lcc-win32, I'm not going to bother.)

Basically it implements all functions of the C library using the new
Strings. The syntax is almost the same:
strcmp --> Strcmp
strcat --> Strcat

etc

Using operator overloading, operators like

if (! s) {
}

have their usual meanings.
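To make the idea concrete, here is a hedged sketch of such a counted-string type; it is NOT lcc-win32's actual code, and the names (`String`, `make_string`, `Strcmp`) are illustrative. Storing the length makes length queries O(1) instead of the O(n) scan `strlen` performs, and lets strings contain embedded '\0' bytes.

```cpp
#include <cstddef>
#include <cstring>

// A counted string: the length travels with the data, and the data is
// NOT zero-terminated.
struct String {
    std::size_t len;
    char* data;
};

// Build a String from a C literal -- the step jacob says the compiler
// would do for: String s = "This is a string";
String make_string(const char* s) {
    String r;
    r.len = std::strlen(s);        // one scan at creation, never again
    r.data = new char[r.len];
    std::memcpy(r.data, s, r.len);
    return r;
}

// Strcmp analogue: compares counted strings without searching for '\0'.
int Strcmp(const String& a, const String& b) {
    std::size_t n = a.len < b.len ? a.len : b.len;
    int c = std::memcmp(a.data, b.data, n);
    if (c != 0) return c;
    return (a.len > b.len) - (a.len < b.len);  // shorter string sorts first
}
```

A real library would of course add allocation management and the rest of the Str* family; the point is only that the old `strcmp`/`strcat` signatures carry over with the counted representation underneath.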

[...]
Of course, but they will not agree with this, obviously. They are
still in C89 and the few points that C99 brought in the sense of a
better language are just denied. Each time I remind people about them,
they tell me that "not all compilers support C99", which is true but
doesn't make things advance at all.


Speaking only for myself, I routinely quote from the C99 standard.
Yes, I and others often point out that not all compilers support C99.
We do so because it happens to be true, and programmers in the real
world need to be aware of that fact.

Maybe. I would not say that it is wrong. But the insistence on using
gcc with the -pedantic etc. options that gets recommended here goes
with the attitude of negating C99 in this group. I remember the
discussion about the FAQ we had, where I was flamed because I insisted
on upgrading the FAQ to C99.
[...]
Yes, but the problem is that in the C standards committee most people
care much more about C++ than C. There is no compiler for C today. All
are C++ compilers that can do C as an afterthought. Gcc has
implemented most of the standard but still never manages to finish it.


<OT>
gcc is a suite of compilers with a common backend. The C and C++
frontends are separate. There are also frontends for a number of
other languages, including Fortran, Objective C, and Ada.

C is certainly not an "afterthought".
Microsoft doesn't care at all, and pursues its own standards through
the windows platform, where it can dictate whatever it wishes.

Apple has got objective C, and sticks to it.


Which suggests a possible approach. If you want to use a C-like
language, but you don't think standard C has some feature you require,
*use a different language*. Changing the C standard itself, even if
there's general agreement that your change is a good idea, will take
many years; that's just the nature of the standardization process.
Any such change will be imposed on *all* conforming implementations.
There are numerous other C-like languages that you can use *today*.
One of them happens to be the C-like language implemented by
lcc-win32; I'm fairly sure you have access to it.

Nobody is preventing you from implementing and using whatever
extensions you like. The only thing you're having a problem with is
your inability to impose those extensions on the entire world.

I am not imposing anything on anyone. I have explained with a lot of
arguments why I am doing this or that. Nobody is forced to use lcc-win32,
but I think that arguing and convincing people is still possible, and
is still the only way to publish your own ideas.

Nobody is behind me; there is no FSF nor Microsoft nor any other
organization trying to promote the ideas exposed here. I just have
confidence that good ideas speak for themselves and that they win in
the end.

jacob
 
B

Bill Pursell

jacob said:
Bill Pursell wrote:

Sorry, but 3+4 is identical to 3.0+4.0: SEVEN in BOTH cases; only the
representation of the number changes.

The representation is often significant, and my point is that
overloading the '+' operator obscures that detail from the programmer.

Maybe you meant 3/4 and 3.0/4.0 ???

No. I meant '+'.
Aaaahhh what an easy syntax...

What would you do with
z = sqrt((a+b)/(b-d) + 0.5*d/(a*d-b*d));

As I said, I'm not suggesting that the basic arithmetic operators be
removed from the language, and this is an excellent example of their
utility. Can you come up with a similar example that doesn't rely on
fundamental types? I've never seen an object in any language that was
prone to this type of calculation, and certainly never seen a structure
in a C program to which this would apply. Any such calculation should
be performed by a function anyway, so rather than forcing the
maintenance programmer to parse:
z = sqrt((a+b)/(b-d) + 0.5*d/(a*d-b*d));
that same programmer would have to parse:
z = determinant( A);
or the like.
Yes, you have to COMPILE THEM by hand producing this incredible GIBBERISH!

tmp1=float_add(a,b);
tmp2=float_sub(b,d);
tmp3=float_div(tmp1,tmp2);
tmp4=float_mul(0.5,d);
tmp5=float_mul(a,d);
tmp6=float_mul(b,d);
tmp7=float_sub(tmp5,tmp6);
tmp8=float_div(tmp4,tmp7);
tmp9=float_add(tmp3,tmp8);
z=sqrt(tmp9);

Which would be implemented as an appropriately named function.

And if you change the type of the data from float to long double you
have to edit all those function calls!

And since you've encapsulated the computation in a function, you now
only have to change the definition of that one function. If you had
overloaded the operators, you would have to change the definition of
the overloaded function, so you've gained nothing.
You are dreaming. LESS PRONE TO ERROR??????????

Yes. Less prone to error, because things are explicit. You choose to
code in a language because you are making a decision about what aspects
of the system you want hidden, and what aspects you want explicit. In
C (in my opinion), this is precisely the type of detail that should not
be hidden. Function overloading is really nice, it's very handy, and
most of the time I really like it. I just don't think it belongs in C.
And I'm far more likely to change my mind if you don't shout. :)
 
J

jacob navia

Richard Heathfield wrote:
jacob navia said:

Well, okay, let's just say for the sake of argument that we're going to add
standard data structure APIs to C.

First step - decide what's in and what's out. Let battle commence.

When you eventually get the blood out of the carpet, the second step is to
agree on an interface. That will take forever and a day.

Third step - get people to use your shiny new standard container library.
Except that they won't, of course, because they will all just carry on
using whatever it is that they're using right now.

Until they have to port it to a new environment. Then they will see how
easy it is to port the libc. Basically you do not port it.

*and that is my point* ...

Yes, it is a lot of work. But is doable if anyone would care about
improving the language.

jacob
 
B

Bill Pursell

Ian said:
I think Jacob's example applies equally well to this case.

You end up with the same mess that Java has to put up with.

I'm not familiar with Java.
Operator overloading enables you to use a user defined type in the same
way as a built in type. Much more intuitive.

I'm not convinced that it provides added intuition. As I pointed out
in my response to Jacob, any sufficiently complex computation should
have a function to perform it. If foo represents a mathematical object
and we want to perform some computation on it, then whether you have
operator overloading or not, you end up with the code:
compute_contour_integral (A, f);
Having the overloading might encourage someone to write the code
in-line rather than make a function call, leading to more difficult
code, not less.
 
J

jacob navia

Herbert Rosenau wrote:
I found already
out that onloy trolls, twits and mental amputated peoples who are
unable to write a simple "hello world" C program have a real need for
GC.


This sentence will go into my collection of sentences "to be framed" and
put in a shrine.

What nice prose. It perfectly describes the general attitude of the
people who are against GC ...
 
