Asfand said:
I don't think I agree with that - the pimpl idiom easily separates
interface
from implementation. It allows me to place just the interface in the .hh
file, without reference to a single data member. It easily allows
copy-on-write behaviour, and allows the interface file to remain unchanged
even if things underneath change radically.
Or is that what you meant?
Pimpl solves an emergency situation. If you have Foo.h, and it includes
many .h files, and it declares a huge class Foo, and if everyone uses Foo,
then each time you change any of those .h files, everything that includes
Foo.h must recompile, even code that never touches what you changed.
The fastest and easiest fix replaces each .h file with a forward declare of
the class it defines, and replaces all of Foo's data members with a single,
bald pointer to FooImpl. Only define FooImpl in Foo.cpp, and forward every
method to it.
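A minimal sketch of that fix, in one listing with comments marking the file boundaries. Foo and FooImpl are the names used above; the members (value/setValue) are invented for illustration, and the text's "bald pointer" is taken literally as a raw pointer (a std::unique_ptr would be the modern equivalent):

```cpp
// Foo.hh -- what clients include: no data members, no heavy #includes.
class FooImpl;          // forward declaration; clients never see the definition

class Foo {
public:
    Foo();
    ~Foo();
    Foo(const Foo&) = delete;            // copying needs more care; omitted here
    Foo& operator=(const Foo&) = delete;
    int value() const;                   // each method just forwards to FooImpl
    void setValue(int v);
private:
    FooImpl* impl_;     // the single bald pointer
};

// Foo.cpp -- the only file that defines FooImpl, so changes to it
// recompile only this translation unit.
class FooImpl {
public:
    int value = 0;      // all the former data members live here
};

Foo::Foo() : impl_(new FooImpl) {}
Foo::~Foo() { delete impl_; }
int Foo::value() const { return impl_->value; }
void Foo::setValue(int v) { impl_->value = v; }
```

The forwarding methods are the "copies and pastes": one stub per public method, each one line long.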
This is a ruthlessly effective fix because you needn't redesign Foo, or any
of its users, or any of its servants, to push in the fix. You only need a
few copies and pastes.
Now, in terms of the emergency, ask why Foo.h got so burdensome. Why do so
many things use it? Do they do too much? Why does it use so many servants?
Do they do too little?
If a class is well-designed, it will achieve many of Pimpl's benefits,
possibly including a bald pointer to an implementation object in a .cc
file. C++ works because the techniques that logically decouple modules
parallel the techniques that physically decouple them.
Ideally, if Foo is very important, its clients should access it through an
abstract interface; call it FooInf (though real code should pick a better
name). Now clients cannot create a Foo on their own, because they never see
its class definition. If they want one, they must get it from a Foo factory
of some kind. This technique, "Construction Encapsulation", insulates Foo's
clients from the concrete Foo and its data members just as rigorously as a
FooImpl would have, and the entire design is now more flexible.
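The same toy members sketch what that looks like. FooInf is the name proposed above; makeFoo is an invented factory function (real code might prefer a factory class or a registry):

```cpp
#include <memory>

// FooInf -- the abstract interface clients code against.
// They see this and nothing else.
class FooInf {
public:
    virtual ~FooInf() = default;
    virtual int value() const = 0;
    virtual void setValue(int v) = 0;
};

// Foo.cpp -- the concrete Foo, invisible to clients.
class Foo : public FooInf {
public:
    int value() const override { return value_; }
    void setValue(int v) override { value_ = v; }
private:
    int value_ = 0;     // data members fully insulated behind the interface
};

// The factory: the only way clients obtain a Foo. Swapping in a
// different FooInf implementation later touches only this function.
std::unique_ptr<FooInf> makeFoo() {
    return std::make_unique<Foo>();
}
```

Clients hold a FooInf and call makeFoo(); changing Foo's data members, or replacing Foo entirely, no longer recompiles them.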