Anyone else feel like C++ is getting too complicated?


Balog Pal

jacob navia said:
Some people here argue that C++ is not overly complicated.

Care to give some reference to any of those people?
I am reading "C++ Templates: The Complete Guide". There,
the authors say (page 15)

"The rules for this decision may become rather
complicated even without templates. ...
Every C++ programmer *must* know those rules but maybe
1% of them know the WHOLE set of rules because the human
mind is not adapted to absorbing tons of programming trivia.

Yeah, name lookup rules are complicated.
Yeah, no human being can recite them.
Yeah, we don't like it.

However, in practice this is not as big an issue as it looks, because there
are natural mitigations:
- use names where they make sense
- keep overloads semantically equivalent
- avoid using-directives on a namespace unless you really mean to drag in
everything and know all of it

Do just that, and you don't need to care how many candidates there are or
which one gets called when the set has more than one.
And those guidelines are worth following regardless of the name lookup rules...

(The dark side is unfortunately introduced by Koenig lookup (ADL), which can
drag in stuff you are not even aware exists... But that is not a 'complexity
of C++' issue; it is a design flaw.)
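
A quick sketch of the kind of surprise I mean (all names made up):

#include <iostream>

namespace lib {
    struct Widget {};
    // added later by the library author, in the same namespace as Widget:
    void log(Widget) { std::cout << "lib::log\n"; }
}

template <typename T>
void log(T) { std::cout << "::log\n"; }   // the function the caller has in mind

int main()
{
    lib::Widget w;
    log(w);   // prints "lib::log": ADL drags in the namespace-scope overload,
              // and the non-template exact match beats the ::log template
}
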
THEN, it is obviously the task of the maintenance programmer
to debug all those cases where the byzantine rules go
completely in a direction other than what the poor programmer
intended, when new code indirectly using those
overloaded functions is written, 2 years AFTER the original
programmer left.

Quite bad that we have all kinds of tools around and they still fail to
address this very situation. E.g. MSVC shows you tooltips on mouse-over -- but
they are not based on compilation data even where it exists, and you're not
told what was actually picked. Nor is it straightforward to see in a sensible
listing. The call graph of the function does not list the correct function
(nor, by the way, the invisible calls involved internally), which would help so much.
The great majority of C++ programmers don't know *exactly*
what they are doing when they do a simple thing like overloading
a function...

Now that is really BAD -- but blame the programmer, not the language.
Overloading is a programmer-made decision and should be a good one.
Is this really necessary?

What?
Overloading is a powerful and needed feature. Without it you have a bigger
mess. And if your fellows misuse it, slap them.

I know way more cases where a programmer assigned the wrong value. Is that
grounds to get rid of assignment or to call it bad?
Now, look at this "concept" concept again. (Sorry but I find
no other way of saying it)

Behind it is the need to declare a set of types so that
template arguments can be checked by the compiler. OK.

But wasn't *inheritance* supposed to define this stuff?

Huh? Of course not. Care to explain how it comes into the picture?

Let's see a very simple concept, Copyable. It is used in many std:: collections
too.
It works fine for int, for char*, all POD types, etc... How do you define it
via inheritance?
If you make a standard "super-class" that describes a
concept, wouldn't it be enough for your class to just
INHERIT from that super class to be cleanly checked
in the template argument list? Why is it necessary to
introduce YET ANOTHER huge construct?

I guess not, but show me on the above example...

Just for reference, the aim is to get:

std::vector<int> v1; // okay
std::vector<std::auto_ptr<int> > v2; // error: std::auto_ptr<int> is not Copyable
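
And a minimal sketch (all names made up) of why the marker-base approach cannot
deliver that for built-in types:

// hypothetical marker base expressing "Copyable"
struct Copyable {};

template <typename T>
class checked_vector
{
    // compiles only if T* converts to Copyable*, i.e. T derives from Copyable
    static void require_copyable(Copyable*) {}
public:
    checked_vector() { require_copyable(static_cast<T*>(0)); }
};

struct MyType : Copyable {};

checked_vector<MyType> a;  // okay
checked_vector<int> b;     // error: int cannot be made to derive from Copyable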

Just as another thought: "concept" is meant for the template library
programmers. The people doing that correctly know how to use it and why it
is good. The rest of the public can ignore it entirely, just sit back and
enjoy the ride -- getting sensible checks and compile-time error flagging
when using the library. That makes their life way less complicated than it is
now, when you get a 50-line message claiming it can't convert [20-line name]
to [a similar 20-line name] just from having std::map<std::string, std::string>
in a simple operation.
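
For illustration, the kind of innocent slip that produces such a message today
(a hypothetical example, of course):

#include <map>
#include <string>

int main()
{
    std::map<std::string, std::string> m;
    m.insert(42);  // error: no insert() overload matches -- the diagnostic spells
                   // out every candidate with the fully expanded map/pair types;
                   // a concept check could instead say the argument does not
                   // model the expected value_type
}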
 

Juha Nieminen

jacob said:
But wasn't *inheritance* supposed to define this stuff?

Object-oriented programming and inheritance hierarchies were all the
hype in the 80's and early 90's. While certainly not the silver bullet
of programming, it was nevertheless considered one of the greatest
advances in programming.

As years passed and the overall experience with OOP and especially
inheritance grew, it became more and more clear to the programming
community in general that OOP, and especially inheritance, is not, after
all, such a great tool. It's very useful, and it has great ideas in
theory, but... somehow it's just not everything it promised to be.
While there are situations where inheritance is the perfect solution,
these situations are not as abundant as once thought. OOP and
inheritance are not the silver bullet of programming, not even close.
(IMO the modular part of OOP is still extremely useful, and an inherent
part of almost any well-designed program. It just can't do everything by
itself.)

In the last decade the paradigms have shifted more towards dynamic
programming. Dynamic code/object generation (at compile time or at
runtime), dynamic creation of first-class objects (including first-class
functions), runtime type information (which allows things like
reflection), etc. Template metaprogramming can be considered a subset of
this. Also there has been a clear shift towards a more functional
approach, inspired by lambda calculus.
 

peter koch

Some people here argue that C++ is not overly complicated.

I am reading "C++ Templates: The Complete Guide". There,
the authors say (page 15)

"The rules for this decision may become rather
complicated even without templates. In this section we
discuss overloading when templates are involved. If
you are not familiar with the basic rules of overloading
without templates please look at Appendix B, where we
provide a reasonably detailed survey of the overload
resolution rules".

Great. That Appendix is 12 pages long! And it is not
complete of course!

Every C++ programmer *must* know those rules but maybe
1% of them know the WHOLE set of rules because the human
mind is not adapted to absorbing tons of programming trivia.

THEN, it is obviously the task of the maintenance programmer
to debug all those cases where the byzantine rules go
completely in a direction other than what the poor programmer
intended, when new code indirectly using those
overloaded functions is written, 2 years AFTER the original
programmer left.

The great majority of C++ programmers don't know *exactly*
what they are doing when they do a simple thing like overloading
a function...

Is this really necessary?

Now, look at this "concept" concept again. (Sorry but I find
no other way of saying it)

Behind it is the need to declare a set of types so that
template arguments can be checked by the compiler. OK.

But wasn't *inheritance* supposed to define this stuff?

No. Inheritance indicates "is-a" relationships, while what is needed
is a "has-a" relationship.
If you make a standard "super-class" that describes a
concept, wouldn't it be enough for your class to just
INHERIT from that super class to be cleanly checked
in the template argument list? Why is it necessary to
introduce YET ANOTHER huge construct?

For several reasons, one being that inheritance does not solve the
problem. But even if it did, you would require an inheritance relationship
that would quickly become unmaintainable. Just imagine all those
properties you would have to define even for simple types: are
they default-constructible, assignable, copy-constructible, can you
compare them, do they have a "less-than" relationship, and so on.
And if inheritance or something like it were used instead of concepts,
all old code would have to be thrown out.
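
A rough sketch (all names invented) of where the inheritance route leads:

// one hypothetical marker base per property...
struct DefaultConstructible {};
struct CopyConstructible {};
struct Assignable {};
struct EqualityComparable {};
struct LessThanComparable {};

// ...and every type meant for such "checked" templates must inherit from
// each property it models:
class Price : public DefaultConstructible,
              public CopyConstructible,
              public Assignable,
              public EqualityComparable,
              public LessThanComparable
{
    double value;
public:
    Price() : value(0) {}
    bool operator==(const Price& o) const { return value == o.value; }
    bool operator<(const Price& o) const { return value < o.value; }
};

// int, double, char* and every class written before these bases existed
// cannot be retrofitted at all.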

/Peter
 

Balog Pal

All in all, a lot of C++0x is win, as I said in the original post. But
the stuff that is for library writers, leave it at that. As crazy as
it sounds, what I would really have liked is if the language forked
into a normal version and an "extended" version, such that code
written in the extended C++ could almost always be organized such that
the "extended" features were limited only to CPP files.
<<

hm, we all know that dreaming up stuff is unlikely to lead anywhere. And
real systems like C++ are built from executable proposals. The language
doesn't have a separation of interface and implementation, nor even the
concept of "CPP files". What you say just cannot be done in the current
language -- only in a new one, dropping backwards compatibility. (Or if you
think otherwise, you should have written a proposal, or at least a seed for
one...)

Introducing "modules" failed to get in for this very reason -- while most
everyone agrees on the need no one (I aware of) could show a way to get it
actually. And it will not happen by magic in the future either.

The story you mentioned about 'export' is at least a good learning case; the
motivation there was along your lines, aiming at separation. And even with a
good description it failed on the practical side -- the compiler writers did
not implement it, because it would have needed a completely new ABI that did
not fit well with the existing linker model, which allows linking together
objects from C and many other languages.

A new module thing would require even more separation and rigidity. Likely.
This way normal C++ could make use of libraries written in extended C++.

This is done well by C#. Also you can use CORBA, COM, etc. to have such
things, and practically with good compiler support in MSVC (just #import the
typelib) starting 10+ years ago. Yet it doesn't seem too popular or even
used. Can't we take that as evidence that it is not so much desired?
I like the features, what I DON'T like is the fact that I'm going to
have to work with other people who are going to try and use them.

Who wants or likes to work with idiots? But my experience shows that idiots
are not tied to any set of features. Instead they just break anything around
them, and are very inventive at abusing even the least abusable tools, while
the hard-to-grasp features are actually safer from them.

And where I can't get rid of idiots, all tools that help to detect problems
are more than welcome. Like a 'concept' detecting misuse at compile time
instead of just having UB or misbehavior in the release build.
I know it's shocking, but your average C++ programmer *really* cannot
even figure out how to use templates effectively.

You mean write templates? Well, then he should not. Use templates? Like
vector? I don't think that accounts for a problem, or it is a lesser one than
if we were back to a template-less state with polymorphic collections...

And peopleware problems should be handled within teams by having at least a
couple of knowledgeable engineers and using them as reviewers and mentors. And
for programmers, use the Joel Spolsky method of selection: go for bright and
caring people, nuke all the rest. And in a few years everyone will be able
to do proper work.
And if most of my
time is going to be spent toiling over terribly written code, it's
much more pleasant to just find a different language.

Interesting, I'm yet to hear about a language or a system where workers do
not face these kinds of idiots-at-the-helm problems.

Java was (in one big part) motivated by what you say. After its decade-plus,
its 6th version, and its popularity, did it make things better? I don't see
that. In fact I see a way bigger mess in Java programs, and it looks
increasingly like a rolling snowball.
 

jacob navia

peter said:
For several reasons, one being that inheritance does not solve the
problem. But even if it did, you would require an inheritance relationship
that would quickly become unmaintainable. Just imagine all those
properties you would have to define even for simple types: are
they default-constructible,

yes

assignable,

yes

copy-constructible,

yes
can you compare them,

yes

do they have a "less-than" relationship

yes

a.s.o.

Well, the primitive types (int, char, short, long, long long,
float, double, long double) are all numbers.
If you exclude strings and arrays, they have all the above
properties. Those properties do not have to be specified,
since they are intrinsic to the language!


And if inheritance or something like it were used instead of concepts,
all old code would have to be thrown out.

Why?

If I needed some "concept" I would create an abstract class
that has all those properties and the compiler could check
that the given type conforms to all the properties of the
specified class.

The only reason that this is not done is that OO is no longer
"in", i.e. the OO "FAD" has disappeared. We have new fads
now.

It will be left to the maintenance programmer to figure out
then, why the mixture of old+new fads doesn't work.
 

peter koch

yes

assignable,

yes

copy-constructible,

yes
can you compare them,

yes

do they have a "less-than" relationship

yes

a.s.o.

Well, the primitive types (int, char, short, long, long long,
float, double, long double) are all numbers.
If you exclude strings and arrays, they have all the above
properties. Those properties do not have to be specified,
since they are intrinsic to the language!

And if inheritance or something like it were used instead of concepts, all old code would have to be thrown out.


Why?

Because e.g. int is a primitive type that does not inherit from these
concepts. Therefore, I would not be able to use my new concept-based
container with integers. What is worse, I would not be able to rewrite
the standard library so that it could take advantage of concepts.
Remember: one advantage of concepts is that they should simplify
programming for the users of the libraries. Not having concepts in
std::vector probably would not be acceptable.
If I needed some "concept" I would create an abstract class
that has all those properties and the compiler could check
that the given type conforms to all the properties of the
specified class.

The only reason that this is not done is that OO is no longer
"in", i.e. the OO "FAD" has disappeared. We have new fads
now.
OO is not the silver bullet, and it never was considered so - at least
by the C++ community.
C++ always was a multi-paradigm language and will continue to be so.
It will be left to the maintenance programmer to figure out
then, why the mixture of old+new fads doesn't work.

Old code will continue to work - what are you getting at? And just as
important: you can use existing code in new concepts-based code,
something that would not be possible with your proposal to abuse
inheritance.

/Peter
 

jacob navia

peter said:
Old code will continue to work - what are you getting at?

In the shop I work we are tied to gcc 3.2 / MSVC 6.0.

Why?

Because those compilers allow you to define a template without checking,
until template instantiation time, whether all the symbols used in
the template are defined. Newer compilers do not.

The language changed.

The problem is that the header files contain thousands of definitions
and no human mind can untangle them now. Depending on the order
of header file inclusion, some things will be defined when the
templates are defined and others not. Everything is defined by the
time the template is used, but that is not enough for language purists.
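
Roughly this kind of situation (simplified, with invented names):

template <typename T>
void use(T t) { helper(t); }   // comes from one header: helper not declared yet

void helper(int) {}            // comes from another header, included later

int main() { use(1); }

// A conforming compiler rejects the call in use(): ordinary lookup at the
// template's definition finds no helper(), and argument-dependent lookup at
// the point of instantiation finds nothing for a plain int. Old compilers that
// defer all lookup to instantiation accept it, so reordering the headers
// (declaring helper first) is what the newer compilers demand.
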

We have attempted several times to solve this but it needs at least
4-5 man-months to do that. And we do not have the resources,
we are a small shop...

And we are stuck then.

Yes, "Old code will continue to work" until somebody decides otherwise.
 

jacob navia

Juha said:
Object-oriented programming and inheritance hierarchies were all the
hype in the 80's and early 90's. While certainly not the silver bullet
of programming, it was nevertheless considered one of the greatest
advances in programming.

That was the fad of those days.

Everything was object oriented, mind you.
As years passed and the overall experience on OOP and especially
inheritance grew, it became more and more clear to the programming
community in general that OOP and especially inheritance is not, after
all, such a great tool. It's very useful, and it has great ideas in
theory, but... just somehow it's not everything that it promised to be.

This is the typical way of all fads.
While there are situations where inheritance is the perfect solution,
these situations are not as abundant as once thought. OOP and
inheritance is not the silver bullet of programming, not even close.
(IMO the modular part of OOP is still extremely useful, and an inherent
part of almost any well-designed program. It just can't do everything by
itself.)

In the last decade the paradigms have shifted more towards dynamic
programming.

New fad. This is like reading "Vogue"...
Dynamic code/object generation (at compile time or at
runtime), dynamic creation of first-class objects (including first-class
functions), runtime type information (which allows things like
reflection), etc.

Yes, that is considered sexy now.

Look, I am French, and even worse, I live in Paris. I really know
something about fads.
Template metaprogramming can be considered a subset of
this. Also there has been a clear shift towards a more functional
approach, inspired by lambda calculus.

Yeah. Like the mini-skirt, which always returns, fads tend to run in
circles. Lambda calculus, invented in the thirties, was THE fad
that led to LISP in the fifties.

We are there again, for the third time.

And Vogue tells us that the mini-skirt is IN again!

Actually, it never left us. Women like to show their legs!

So, we are served with mini-skirts 2009 style, not at all
the same as in 1969 but so deliciously similar!

And the fad goes on...

:)
 

SG

If I needed some "concept" I would create an abstract class
that has all those properties and the compiler could check
that the given type conforms to all the properties of the
specified class.

You can certainly do that. But in many cases you don't need the extra
level of indirection. If you don't need the extra level of
indirection you can use templates. If you want to use templates,
then a "type system for types" would be nice. That's where
concepts come in.

Not requiring unneeded levels of indirection is in the spirit of C++
("You don't pay for what you don't use.").

Of course, sometimes you *want* this indirection (runtime
polymorphism). Generic programming doesn't invalidate object oriented
programming. These "styles" both have their use cases.
The only reason that this is not done is that OO is no longer
"in", i.e. the OO "FAD" has disappeared. We have new fads
now.

It will be left to the maintenance programmer to figure out
then, why the mixture of old+new fads doesn't work.

They can work pretty well together. The buzzword here is "type
erasure". See boost::function, for example.


Cheers!
SG
 

Balog Pal

jacob navia said:
In the shop I work we are tied to gcc 3.2 / MSVC 6.0.

MSVC 6 is a pre-standard compiler.
Why?

Because those compilers allow you to define a template without checking,
until template instantiation time, whether all the symbols used in
the template are defined. Newer compilers do not.

Yeah, I wasn't too happy about that change at the time either, but
rearranging the code to have the declarations wasn't that much work (along
with adding a few typenames, etc.)...

Moving ahead was kind of a relief after all.
The language changed.

Yeah, and that version was from the MS era of 'we shit on the standard';
they even managed to lose their voting rights in the committee by not
attending... The tide turned a few years later, bearing fruit with version 7.1.
The problem is that the header files contain thousands of definitions
and no human mind can untangle them now.

Still it is hard to imagine having *that much* template code in a project.
After all, templates are there to cover generic stuff, and an application is
limited in that. And standalone libraries do not make calls to unknown
functions directly from the text, do they?
Depending on the order
of header file inclusion, some things will be defined when the
templates are defined and others not. Everything is defined by the
time the template is used, but that is not enough for language purists.

You only need a declaration of the function. Or make the call dependent (via
this-> or similar additions...). Why not declare the functions used in the
template right there in the template header?
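
Something like this (invented names):

// fix 1: forward-declare the non-dependent function next to the template,
// so it is visible at definition time:
void log_line(const char*);

template <typename T>
void process(const T& value)
{
    log_line("processing");   // non-dependent call: must be declared here
}

// fix 2: for names living in a dependent base, qualify with this->
// so lookup is deferred to instantiation:
template <typename Base>
struct Worker : Base
{
    void run()
    {
        this->step();   // plain step() would be rejected by a conforming compiler
    }
};
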
We have attempted several times to solve this but it needs at least
4-5 man-months to do that.

It's hard to believe it would take even 4-5 man-days. How many lines of
template code do you have, and how many errors are flagged on compile?
And we do not have the resources, we are a small shop...

Sure, that amount sounds gross. Still there may be even bigger implied costs
in being locked in -- especially into archaic stuff that will get
increasingly abandoned over these years (IMO).
And we are stuck then.

Yes, "Old code will continue to work" until somebody decides otherwise.

It referred to old standard-compliant code, which yours is not. It never
worked in a standard-conforming compiler and never will. C++98 code, on the
other hand, will happily work in a C++0x compiler.
 

Stefan Ram

Juha Nieminen said:
community in general that OOP and especially inheritance is not, after

You have to distinguish between inheritance of interfaces and inheritance of
implementations. Inheritance of interfaces is great. For implementation,
delegation might sometimes be better.
In the last decade the paradigms have shifted more towards
dynamic programming. Dynamic code/object generation (at compile
time or at runtime), dynamic creation of first-class objects
(including first-class functions), runtime type information
(which allows things like reflection), etc. Template
metaprogramming can be considered a subset of this.

This sounds strange to me. Template metaprogramming is
exactly the opposite. It is static programming - template
instantiation happens at compile time. That's why the
aficionados are fond of it: it does not consume runtime.
»Dynamic« means »at runtime«. Templates are static in
comparison to OOP, which means run-time polymorphism
(late binding, while templates are early binding).

Smalltalk/OOP: bind everything as late as possible.
Template metaprogramming: ... as early as possible.
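
A minimal illustration of the contrast (invented types):

#include <iostream>

// late binding: the call goes through a vtable at runtime
struct Shape {
    virtual double area() const = 0;
    virtual ~Shape() {}
};
struct Square : Shape {
    double s;
    explicit Square(double side) : s(side) {}
    double area() const { return s * s; }
};

double late(const Shape& sh) { return sh.area(); }   // resolved at runtime

// early binding: one instantiation per concrete type, resolved at compile time
template <typename T>
double early(const T& sh) { return sh.area(); }

int main()
{
    Square sq(3.0);
    std::cout << late(sq) << ' ' << early(sq) << '\n';  // 9 9
}
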
Also there has been a clear shift towards a more functional
approach, inspired by lambda calculus.

Object-oriented programming languages (Smalltalk
and Lisp) always included functional sublanguages.
 

Ian Collins

jacob said:
In the shop I work we are tied to gcc 3.2 / MSVC 6.0.

Why?

Because those compilers allow you to define a template without checking,
until template instantiation time, whether all the symbols used in
the template are defined. Newer compilers do not.

The language changed.

The language became standardised. The upcoming standard should not
break code conforming to the current standard.
The problem is that the header files contain thousands of definitions
and no human mind can untangle them now. Depending on the order
of header file inclusion, some things will be defined when the
templates are defined and others not. Everything is defined by the
time the template is used, but that is not enough for language purists.

We have attempted several times to solve this but it needs at least
4-5 man-months to do that. And we do not have the resources,
we are a small shop...

4-5 man-months? How many tens of millions of lines do you have? More
to the point, how many did you have in 1999 when the language became
standardised?
 

peter koch

In the shop I work we are tied to gcc 3.2 / MSVC 6.0.

Why?

Because those compilers allow you to define a template without checking,
until template instantiation time, whether all the symbols used in
the template are defined. Newer compilers do not.

The language changed.

No - this is not correct. There has been no change in that direction.
Rather, your problem is that you used compilers that did not follow the
standard (MSVC did not because, among other things, it predates the
standard).
The problem is that the header files contain thousands of definitions
and no human mind can untangle them now. Depending on the order
of header file inclusion, some things will be defined when the
templates are defined and others not. Everything is defined by the
time the template is used, but that is not enough for language purists.

The actual problem is not a question of language purism. Rather it
is a problem with your code, which is extremely fragile. For that reason
alone it would be worth trying to change the code.
We have attempted several times to solve this but it needs at least
4-5 man-months to do that. And we do not have the resources,
we are a small shop...

And we are stuck then.

Yes, "Old code will continue to work" until somebody decides otherwise.

Exactly. "Work" is not entirely the correct word.

/Peter
 

Juha Nieminen

Stefan said:
This sounds strange to me. Template metaprogramming is
exactly the opposite. It is static programming - template
instantiation happens at compile time. That's why the
aficionados are fond of it: it does not consume runtime.
»Dynamic« means »at runtime«. Templates are static in
comparison to OOP, which means run-time polymorphism
(late binding, while templates are early binding).

It could be argued that there are many levels of "dynamic".

Completely "static" programming can be considered a 1-to-1
relationship between written source code and produced machine code: What
you write is basically exactly what you get.

However, templates are a bit different. They do not produce any code
when the compiler first parses them. Moreover, there's no 1-to-1
relationship between source and compiled machine code, but a 1-to-many:
The same source can produce "dynamically" many different types of
compiled machine code, depending on how the template is instantiated.

In other words, the compiler dynamically adapts your template code to
the specified types (and scalars, in some cases).
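
A small illustration of that 1-to-many relationship (function name invented):

// one source-level definition...
template <typename T>
T twice(T x) { return x + x; }

int main()
{
    int    a = twice(21);    // ...instantiates twice<int>
    double b = twice(1.5);   // ...and twice<double>: separate machine code
    return int(a + b);
}
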
Object-oriented programming languages (Smalltalk
and Lisp) always included functional sublanguages.

I thought Lisp has always been nothing but a functional language.
Object-oriented features were later devised by "abusing" its versatility.
 

SG

  However, templates are a bit different. They do not produce any code
when the compiler first parses them. Moreover, there's no 1-to-1
relationship between source and compiled machine code, but a 1-to-many:
The same source can produce "dynamically" many different types of
compiled machine code, depending on how the template is instantiated.

In other words, the compiler dynamically adapts your template code to
the specified types (and scalars, in some cases).

I don't think this use of "dynamically" is in alignment with how it is
generally understood:

dynamically = at runtime
statically = at compile-time

Cheers!
SG
 

Stefan Ram

Juha Nieminen said:
I thought Lisp has always been nothing but a functional language.
Object-oriented features were later devised by "abusing" its versatility.

All LISPs I know have a procedural sub-language, so LISP does
not seem to be /purely/ functional.

The term »object-oriented programming« was coined by Alan Kay,
who answered a question about this term's meaning in 2003:

»OOP to me means only messaging, local retention and
protection and hiding of state-process, and extreme
late-binding of all things. It can be done in Smalltalk
and in LISP. There are possibly other systems in which
this is possible, but I'm not aware of them.«

http://www.purl.org/stefan_ram/pub/doc_kay_oop_en

The Lisp community today uses »LISP« to refer to historic
LISPs (IIRC without CLOS) and »Lisp« to refer to Common Lisp
(which has CLOS). I do not know whether Alan Kay follows this
spelling rule.

In Smalltalk, blocks can have parameters IIRC, so they are
lambda expressions in disguise. And IIRC, they are objects,
too. Blocks-as-objects can become parts of messages and be
sent to objects, which seems to be an important feature of
»object-oriented programming« in the sense of Smalltalk/Kay.

This is possible in LISP/Lisp (even without CLOS) because
LISP/Lisp has lambda expressions. An object system can be
built on top of this. But when the core language does not
support concise lambda expressions, they often cannot be
added to the language. (I have not yet looked at the lambda
implementation that is part of Boost, IIRC.)
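
For comparison, a small sketch of what the lambda expressions proposed for
C++0x look like (not the Boost library mentioned above; the final syntax is
whatever the draft ends up with):

#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    v.push_back(1);
    v.push_back(2);
    v.push_back(3);

    // an unnamed function object created in place and passed as an argument
    std::for_each(v.begin(), v.end(),
                  [](int x) { std::cout << x * x << '\n'; });
}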
 

Balog Pal

Preston said:
Of course it is constructive criticism. It's a valid complaint.
Declaring that it's not doesn't make the complaint less accurate.

Care to look up the meaning of 'constructive'?
I don't recall anyone denying that C++ is complex or stating that such a
complaint is invalid.

(Though the OP implied that C++ is getting complex in the next release, while
the common opinion is that it got way complex well before that.)
 

Tony

Ian Collins said:
The language became standardised. The upcoming standard should not break
code conforming to the current standard.

The solution is to not use the std library.
 

Lionel B

The solution is to not use the std library.

So you implement your own containers, streams, algorithms, ... (easy-peasy),
debug them to a high standard of reliability (shouldn't take much work),
and you're fine (except that nobody else understands your code).
 

Noah Roberts

Preston said:
Absolutely nobody cares what you're sick of. Absolutely nobody cares
that your epeen growth hormone is based on your amount of esoteric C++
knowledge.

*plonk*
 
