newbie: #define with Italian characters (ASCII > 128)

Discussion in 'C++' started by oddvark, Nov 10, 2006.

  1. oddvark

    oddvark Guest

    Hello,

    I'm doing a

    #define STR_YES Si

    Where Si has the Italian accent mark in it. Basically it's
    extended ASCII 236 (the 'i' with the grave accent).

    If I put the #define in the same file and try to print it out, it
    works. Basically I get ASCII 236 when my printing algorithm hits the
    special 'i'.

    However, if I put the (#define STR_YES Si) in another file called
    "Italian.h" and then include that, when I print it, it doesn't work,
    and I get some garbage character.

    Why would putting the define in the same file versus in another file
    generate different results?

    Here is a sample:

    SOLUTION 1, WORKS:
    ================
    #define STR_YES Si
    drawString(STR_YES)

    SOLUTION 2, DOESN'T WORK
    =======================
    #include "spanish.h" // spansh.h defines STR_YES_SI
    drawString(STR_YES) // GET GARBAGE!
     
    oddvark, Nov 10, 2006
    #1

  2. oddvark

    Guest

    Aren't #include and #define lines specific to a file? Correct me if
    I'm wrong here, but if you do

    #define bla

    in a header file, then it will apply only to the header file, even
    though

    #include "something.h"

    is in the source file and something.h has that #define line in
    it.

    Again, I might be talking nonsense, but it's a guess.
    *waving my hands* please carry on the debate. ;)

    oddvark wrote:
    > Hello,
    >
    > I'm doing a
    >
    > #define STR_YES Si
    >
    > Where Si has the Italian accent mark in it. Basically it's
    > extended ASCII 236 (the 'i' with the grave accent).
    >
    > If I put the #define in the same file and try to print it out, it
    > works. Basically I get ASCII 236 when my printing algorithm hits the
    > special 'i'.
    >
    > However, if I put the (#define STR_YES Si) in another file called
    > "Italian.h" and then include that, when I print it, it doesn't work,
    > and I get some garbage character.
    >
    > Why would putting the define in the same file versus in another file
    > generate different results?
    >
    > Here is a sample:
    >
    > SOLUTION 1, WORKS:
    > ================
    > #define STR_YES Si
    > drawString(STR_YES)
    >
    > SOLUTION 2, DOESN'T WORK
    > =======================
    > #include "spanish.h" // spanish.h defines STR_YES_SI
    > drawString(STR_YES) // GET GARBAGE!
     
    , Nov 10, 2006
    #2

  3. oddvark

    Larry Smith Guest

    oddvark wrote:
    > Hello,
    >
    > I'm doing a
    >
    > #define STR_YES Si
    >
    > Where Si has the Italian accent mark in it. Basically it's
    > extended ASCII 236 (the 'i' with the grave accent).
    >
    > If I put the #define in the same file and try to print it out, it
    > works. Basically I get ASCII 236 when my printing algorithm hits the
    > special 'i'.
    >
    > However, if I put the (#define STR_YES Si) in another file called
    > "Italian.h" and then include that, when I print it, it doesn't work,
    > and I get some garbage character.
    >
    > Why would putting the define in the same file versus in another file
    > generate different results?
    >
    > Here is a sample:
    >
    > SOLUTION 1, WORKS:
    > ================
    > #define STR_YES Si
    > drawString(STR_YES)
    >
    > SOLUTION 2, DOESN'T WORK
    > =======================
    > #include "spanish.h" // spanish.h defines STR_YES_SI
    > drawString(STR_YES) // GET GARBAGE!
    >


    If it is meant to be a string, enclose it in
    double quotes:

    #define STR_YES "Si"

    Then always refer to it as STR_YES.
    STR_YES_SI is something completely different from
    STR_YES.

    If the above '#define...' is in either the current file
    or in "spanish.h", it should work - but it should
    be in only one file, not both.

    Think about it for a moment: if having #defines in
    included headers were not legal, then none of the
    #defines in the system headers (stdlib.h, vector, etc.)
    would work - but they do.
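
    For example, a minimal sketch of that (using std::cout in place of the
    drawString routine from the original post, whose signature isn't shown
    here; the accent is left out to keep the example purely ASCII):

    // --- sketch only ---
    #include <iostream>

    #define STR_YES "Si"   // quoted, so the macro expands to a string literal

    int main(){
        std::cout << STR_YES << std::endl;   // prints the replacement text
        return 0;
    }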
     
    Larry Smith, Nov 10, 2006
    #3
  4. oddvark

    BobR Guest

    wrote in message
    <>...
    >Aren't #include and #define lines specific to a file? Correct me if
    >I'm wrong here, but if you do
    >
    >#define bla
    >
    >in a header file, then it will apply only to the header file, even
    >though
    >
    >#include "something.h"
    >
    >is in the source file and something.h has that #define line in
    >it.
    >
    >Again, I might be talking nonsense, but it's a guess.
    >*waving my hands* please carry on the debate. ;)
    >


    I'll give you a little homework assignment to do. Try this experiment:

    Make two files as follows:

    // --- Guts.h ---
    std::cout<<"Hello World, from Guts.h"<<std::endl;
    // --- Guts.h --- END

    // --- GutsMain.cpp ---
    #include <iostream>
    #include <ostream>

    int main(){
    #include "Guts.h"
    return 0;
    }
    // --- GutsMain.cpp --- END

    Compile "GutsMain.cpp" and run it.
    Report your findings here. ...<waiting>....

    --
    Bob R
    POVrookie
     
    BobR, Nov 10, 2006
    #4
  5. oddvark

    Guest

    oddvark wrote:

    > Hello,
    >
    > I'm doing a
    >
    > #define STR_YES Si
    >
    > Where Si has the Italian accent mark in it. Basically it's
    > extended ASCII 236 (the 'i' with the grave accent).


    There are about a thousand extensions to ASCII - which one(s) do you
    mean? In ISO C++ there is only one relevant extension, though: Unicode.
    I think you'd need character U+00EC (especially since 0x00EC is 236).

    > If I put the #define in the same file and try to print it out, it
    > works. Basically I get ASCII 236 when my printing algorithm hits the
    > special 'i'.


    Well, if your OS/compiler interprets "" strings as Latin-1 (common) or
    Unicode, that could just work. However, you should use "S\u00ec".
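
    A minimal sketch along those lines (whether the accented character
    actually displays correctly still depends on the execution character
    set and the terminal's encoding):

    #include <iostream>

    // \u00ec is the universal-character-name spelling of
    // LATIN SMALL LETTER I WITH GRAVE (U+00EC)
    #define STR_YES "S\u00ec"

    int main(){
        std::cout << STR_YES << std::endl;   // typically prints "Si" with a grave accent
        return 0;
    }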

    HTH,
    Michiel Salters
     
    , Nov 10, 2006
    #5
  6. oddvark

    Guest

    BobR wrote:
    > I'll give you a little homework assignment to do. Try this experiment:
    >
    > Make two files as follows:
    >
    > // --- Guts.h ---
    > std::cout<<"Hello World, from Guts.h"<<std::endl;
    > // --- Guts.h --- END
    >
    > // --- GutsMain.cpp ---
    > #include <iostream>
    > #include <ostream>
    >
    > int main(){
    > #include "Guts.h"
    > return 0;
    > }
    > // --- GutsMain.cpp --- END
    >
    > Compile "GutsMain.cpp" and run it.
    > Report your findings here. ...<waiting>....


    Fun fun fun! I did what you asked and got this, and I was shocked!

    -- Hello World, from Guts.h
    -- Press any key to continue

    So that means that #include simply inserts the code at a certain
    location in the source?

    If this is true, then theoretically, if oddvark puts a line in his .cpp
    and then moves it to a .h which he then includes in his .cpp, it
    should be as if nothing happened.

    Oddvark, maybe you're having problems with building your project? If
    you want the .cpp file to pick up the .h file, then you should
    probably include both in a project that you then build.

    Otherwise, the quoting suggested by Larry Smith seems relevant -
    how else would the compiler interpret the characters 'S' and 'i'
    together? As a variable? As a keyword?
     
    , Nov 10, 2006
    #6
  7. oddvark

    oddvark Guest

    Thanks for the help, everyone.

    It was #define YES "si" (with quotes); I just didn't put the quotes in
    my example.

    I went ahead and put it in a separate file and enforced ISO Latin-1 on
    it. That seems to have solved the problem.

    I'm guessing the fact that it was sometimes treated as ISO Latin-1 and
    other times as something else may have been due to me cutting and
    pasting, or some other inconsistent handling of the encoding.
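
    For anyone hitting the same thing, a small check (just a sketch, not
    part of my real code) is to dump the raw byte values of the literal -
    under ISO Latin-1 the accented 'i' shows up as the single byte 236,
    under UTF-8 as the pair 195 172:

    #include <cstdio>

    #define STR_YES "S\u00ec"   // or paste the accented literal directly

    int main(){
        for (const unsigned char* p = (const unsigned char*)STR_YES; *p; ++p)
            std::printf("%d ", *p);   // prints each byte of the literal
        std::printf("\n");
        return 0;
    }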

    Thanks.
     
    oddvark, Nov 10, 2006
    #7
  8. oddvark

    BobR Guest

    wrote in message ...
    >
    >Fun fun fun! I did what you asked and got this, and I was shocked!
    >
    >-- Hello World, from Guts.h
    >-- Press any key to continue
    >
    >So that means that #include simply inserts the code at a certain
    >location in the source?


    Exactly. Now try this:
    [ in case someone out there says, "he said #define".]

    // --- Guts.h ---
    std::cout<<"Hello World, from Guts.h"<<std::endl;
    #define BLAHBLAH std::cout<<"Hello World2, from Guts.h"<<std::endl;
    // --- be sure that is all on one line! ---
    // --- Guts.h --- END

    // --- GutsMain.cpp ---
    #include <iostream>
    #include <ostream>

    int main(){
    #include "Guts.h"
    BLAHBLAH
    // looks weird, eh? (no semicolon at the end)
    // note the very tricky naming. <G>
    return 0;
    }
    // --- GutsMain.cpp --- END

    Of course, using something like that in a real program would be insane [1].
    It's just to point out how '#include' and '#define' work.

    [1] - note that "Guts.h" depends on the std headers included in
    "GutsMain.cpp". That should be avoided.


    >
    >If this is true, then theoretically, if oddvark puts a line in his .cpp
    >and then moves it to a .h which he then includes in his .cpp, it
    >should be as if nothing happened.
    >
    >Oddvark, maybe you're having problems with building your project? If
    >you want the .cpp file to pick up the .h file, then you should
    >probably include both in a project that you then build.
    >
    >Otherwise, the quoting suggested by Larry Smith seems relevant -
    >how else would the compiler interpret the characters 'S' and 'i'
    >together? As a variable? As a keyword?
    >


    More than likely the problem was how Oddvark used the macro.

    #define STR_YES Si
    std::string TestSTR_YES( STR_YES );
    // error: `Si' undeclared (first use this function)


    #define STR_YES "Si"
    std::string TestSTR_YES( STR_YES );
    cout << "string TestSTR_YES=" << TestSTR_YES << std::endl;
    // out: string TestSTR_YES=Si

    --
    Bob R
    POVrookie
     
    BobR, Nov 11, 2006
    #8
