typedef vs. #define

Discussion in 'C Programming' started by Mark, Mar 9, 2010.

  1. Mark

    Mark Guest

    Hello

    Are there any other pros to using #define'd new types, besides the one
    mentioned in C-FAQ 1.13? Quite often I see things like these in chip SDKs:

    #ifndef int8
    #define int8 char
    #endif
    ...
    #ifndef uint32
    #define uint32 unsigned int
    #endif

    Looks quite ugly to me; I'd prefer to 'typedef' these types, but perhaps
    I'm missing something?

    --
    Mark
    Mark, Mar 9, 2010
    #1

  2. On 9 Mar, 08:04, "Mark" <> wrote:
    >
    > are there any other pros to using #defined new types, except the one
    > mentioned in C-FAQ 1.13? Quite often I see in chips SDKs things like these:
    >
    > Looks quite ugly to me, I'd prefer 'typedef' these types, but perhaps I'm
    > missing something?
    >

    For very short programs it doesn't really matter.
    However, #defines are text-substitution commands, while typedefs alias
    types. This means that you can do things like

    #define int8 char
    unsigned int8 x;  /* expands to: unsigned char x; */


    The technical term for this is "preprocessor abuse". It can be handy.
    Another thing you can do is stringise the #define'd type macro. This
    might be useful when writing some sort of God-awful generic function,
    using strings as cheap-and-cheerful C++ templates.
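    Something like this minimal sketch (the ETYPE name and the STR/XSTR
    helpers are illustrative, not from any real SDK):

    #include <stdio.h>

    #define ETYPE double

    #define STR(x)  #x       /* stringises the token as written       */
    #define XSTR(x) STR(x)   /* extra level: stringises the expansion */

    int main(void)
    {
        /* prints "double" -- a typedef name could not be stringised */
        printf("element type: %s\n", XSTR(ETYPE));
        return 0;
    }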
    Malcolm McLean, Mar 9, 2010
    #2

  3. Fred

    Fred Guest

    On Mar 9, 1:15 am, Malcolm McLean <>
    wrote:
    > On 9 Mar, 08:04, "Mark" <> wrote:
    >
    > > are there any other pros to using #defined new types, except the one
    > > mentioned in C-FAQ 1.13? Quite often I see in chips SDKs things like these:

    >
    > > Looks quite ugly to me, I'd prefer 'typedef' these types, but perhaps I'm
    > > missing something?

    >
    > For very short programs it doesn't really matter.
    > However #defines are word processing commands, typedefs alias types.
    > This means that you can do things like
    >
    > #define int8 char
    > unsigned int8
    >
    > The technical term for this is "preprocessor abuse". It can be handy.
    > Another thing you can do is
    > stringise the #defined type macro. This might be useful if writing
    > some sort of God-awful generic function, using strings as cheap and
    > cheerful C++ templates.


    Using the #define method can easily lead to errors.
    Consider:
    #define String char*
    vs.
    typedef char *String;

    followed by:
    String a, b, c;

    If you used the #define, bad things are likely to occur.
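    Spelled out (the _m and _t suffixes are just for illustration):

    #define String_m char *          /* macro: pure text substitution */
    typedef char *String_t;          /* typedef: a genuine type alias */

    String_m a, b;  /* expands to: char *a, b;  -- b is a plain char! */
    String_t c, d;  /* c and d are both char *                        */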
    --
    Fred K
    Fred, Mar 9, 2010
    #3
  4. Seebs

    Seebs Guest

    On 2010-03-09, Mark <> wrote:
    > Looks quite ugly to me, I'd prefer 'typedef' these types, but perhaps I'm
    > missing something?


    The obvious thing would be the lack of a way to spell "iftypedef(uint32)".
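    In other words (a tiny sketch; 'uint32' is whatever name the SDK uses):

    typedef unsigned int uint32;

    /* The preprocessor knows nothing about typedefs, so this test is
       always true -- the #ifndef guard idiom only works for macros. */
    #ifndef uint32
    /* taken even though the typedef exists */
    #endif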

    -s
    --
    Copyright 2010, all wrongs reversed. Peter Seebach /
    http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
    http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
    Seebs, Mar 9, 2010
    #4
  5. ImpalerCore

    ImpalerCore Guest

    On Mar 9, 3:04 am, "Mark" <> wrote:
    > Hello
    >
    > are there any other pros to using #defined new types, except the one
    > mentioned in C-FAQ 1.13? Quite often I see in chips SDKs things like these:
    >
    > #ifndef int8
    > #define int8    char
    > #endif
    > ...
    > #ifndef uint32
    > #define uint32   unsigned int
    > #endif
    >
    > Looks quite ugly to me, I'd prefer 'typedef' these types, but perhaps I'm
    > missing something?


    The typedef is the preferred implementation, but as Seebs said, not
    having a convenient syntax for verifying that a typedef exists leads
    people either to use #define or to rely on an autoconf test that
    defines a HAVE_UINT32_T or HAVE_STDINT_H symbol. In my case, I have a
    stdint wrapper that either includes the real stdint.h if HAVE_STDINT_H
    is defined, or declares my own stdint subset (consisting mostly of the
    uintN_t types and the _C, _MIN, and _MAX macros).
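    A minimal sketch of that kind of wrapper (the file and guard names are
    hypothetical, HAVE_STDINT_H would come from a configure step, and the
    fallback widths assume a typical 32-bit-int target):

    /* my_stdint.h */
    #ifndef MY_STDINT_H
    #define MY_STDINT_H

    #ifdef HAVE_STDINT_H
    #include <stdint.h>
    #else
    /* assumed widths -- verify against the target ABI */
    typedef signed char     int8_t;
    typedef unsigned char   uint8_t;
    typedef short           int16_t;
    typedef unsigned short  uint16_t;
    typedef int             int32_t;
    typedef unsigned int    uint32_t;

    #define INT32_MIN   (-2147483647 - 1)
    #define INT32_MAX   2147483647
    #define UINT32_MAX  4294967295u
    #define UINT32_C(c) c ## u
    #endif

    #endif /* MY_STDINT_H */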

    ImpalerCore, Mar 9, 2010
    #5
  6. Mark

    Mark Guest

    Seebs wrote:
    > On 2010-03-09, Mark <> wrote:
    >> Looks quite ugly to me, I'd prefer 'typedef' these types, but
    >> perhaps I'm missing something?

    >
    > The obvious thing would be the lack of a way to spell
    > "iftypedef(uint32)".

    That was mentioned in the FAQ; I thought there could be other benefits too.

    --
    Mark
    Mark, Mar 10, 2010
    #6
  7. On 9 Mar, 08:04, "Mark" <> wrote:

    > Hello


    Hello, I've taken my pedantry medication this morning.


    > are there any other pros to using #defined new types,


    technically, you can't create new types with a preprocessor macro.


    > [...] Quite often I see in chips SDKs things like these:
    >
    > #ifndef int8
    > #define int8    char
    > #endif
    > ...
    > #ifndef uint32
    > #define uint32   unsigned int
    > #endif
    >
    > Looks quite ugly to me, I'd prefer 'typedef' these types, but perhaps I'm
    > missing something?


    you can't create a new type with a typedef either; typedef only creates
    a type alias.


    Consider:
    <code>

    #define M_int int
    typedef int T_int;

    struct S_int1 {
        int i;
    };

    struct S_int2 {
        int i;
    };

    int main(void)
    {
        int x = 0;
        int *pi = &x;
        M_int *mpi;
        T_int *tpi;
        struct S_int1 s1, *spi1 = &s1;
        struct S_int2 s2, *spi2 = &s2;

        mpi = pi;    /* 1 -- fine: M_int expands to plain int */
        tpi = pi;    /* 2 -- fine: T_int is an alias for int  */

        spi1 = spi2; /* 3 -- incompatible pointer types       */

        return 0;
    }

    </code>

    The lines marked 1 and 2 don't give an error: in the first case the
    compiler sees no difference after macro expansion, and in the second
    the two names are merely aliases for the same type. Structs, on the
    other hand, *are* distinct types, so line 3 gets a type-incompatibility
    warning (from VCC anyway).

    Is this a required diagnostic? I'd hope so.


    --
    - Yes it works in practice - but does it work in theory?
    Nick Keighley, Mar 10, 2010
    #7
  8. pete wrote:
    > Mark wrote:
    >> Hello
    >>
    >> are there any other pros to using #defined new types, except the one
    >> mentioned in C-FAQ 1.13? Quite often I see in chips SDKs things like these:
    >>
    >> #ifndef int8
    >> #define int8 char
    >> #endif
    >> ...
    >> #ifndef uint32
    >> #define uint32 unsigned int
    >> #endif
    >>
    >> Looks quite ugly to me, I'd prefer 'typedef' these types,
    >> but perhaps I'm missing something?

    >
    > I like to use both.
    > I prefer the readily changeable part of the code to be macros.


    What do you mean? Function-like macros?
    >
    > #define E_TYPE char
    >
    > typedef E_TYPE e_type;
    >


    E_TYPE seems to be popular this way.
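    Presumably the idea is that the macro is the one line you edit, and the
    typedef is what the rest of the code uses (my reading of it, anyway):

    #define E_TYPE char       /* edit this single line to retarget */
    typedef E_TYPE e_type;    /* the code proper only sees e_type  */

    e_type buffer[64];        /* becomes char buffer[64]           */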
    --
    fred
    Phred Phungus, Mar 10, 2010
    #8
