[Please do not mail me a copy of your followup]
"(e-mail address removed)" <[email protected]> spake the secret code:
>So i wanna know..is typedefs are really useful in full fledged
>project??
Like anything in your code, you should use them if they add value and
avoid them if they are not carrying their weight.
It's possible to overuse typedefs to the point where they fail to
provide value.
For instance, I personally don't find value in typedefs like this:
typedef std::string *stringPtr;
Particularly since using that typedef invites the mistake of
improperly specifying pointers to const strings. People think this
declares a pointer to a const string:
const stringPtr unchangedStringThroughPointer;
Unfortunately that declaration doesn't make the string const; it makes
the pointer const. That's why Windows headers have the typedefs LPSTR
and LPCSTR, with the former being pointer to char and the latter being
pointer to const char. (The LP business is just an anachronism from
Win16 days.) Is LPSTR really more clear than char *? Is LPCSTR
really more clear than char const *? I don't think so, but I'll use
those typedefs because they are pervasive in Windows and Windows
programmers are familiar enough with them to know what they mean. I
stop short of using things like LPDIRECT3DDEVICE9 instead of
IDirect3DDevice9 *. The former just feels like screaming to me, while
the latter is easier to read and is just as clear (in context) that
we're talking about a COM interface pointer.
I'm currently working on a code base that is replete with typedefs
like these:
typedef unsigned char uint8;
typedef int int32;
typedef unsigned int uint32;
typedef float real;
typedef double lreal;
and so-on.
Personally I don't like declaring my types as having a bit size unless
it is truly relevant to their role. Once *everything* is declared as
an int32, how do you know which ones *really* need 32-bits and which
ones are comfortable in the native size of int? If you really need to
guarantee that an int is at least 32 bits, you can do so in ways other
than polluting the entire source base with spurious int32s
everywhere.
The real/lreal thing is particularly annoying. It just obfuscates the
code in an attempt to abstract away a coupling to an implementation
dependency, yet it fails to do so in any meaningful way because it is
just an alias for an implementation dependency instead of an
abstraction like a new type (i.e. a class) would do. Because the
built-in types like int, float and double can't be extended or
customized, the typedef doesn't carry its own weight. It obscures the
fact that you *are* coupled to the implementation details of a float
or a double, while at the same time pretending to decouple you from
those details. It adds no value and reduces the clarity of the code.
The one area where typedefs are most useful is in consuming template
classes. std::string::find returns std::string::size_type; if I'm
using that type repeatedly in a function, it's easier to create a
local typedef for it:
void wipeOutSlashes(std::string &text)
{
    typedef std::string::size_type size_type;
    // declare locals of type size_type
}
That's just a simple example. It gets more useful when declaring
iterators and other types related to a template class. If you are
writing template libraries, creating nested typedefs that refer to
related types in your base class or in the template argument is very
useful for making the template expansion more readable.