newbie: #define with Italian characters (ASCII > 128)

oddvark

Hello,

I'm doing a

#define STR_YES Si

where Sì has that special Italian accent mark in it. Basically it's extended ASCII 236 (the 'i' with the sloping accent).

If I put the #define in the same file and try to print it out, it works: I get character 236 when my printing algorithm hits the special 'i'.

However, if I put the (#define STR_YES Si) in another file called "Italian.h" and then include that, it doesn't work when I print it, and I get some garbage character.

Why would putting the define in the same file versus in another file
generate different results?

Here is a sample:

SOLUTION 1, WORKS:
================
#define STR_YES Si
drawString(STR_YES)

SOLUTION 2, DOESN'T WORK
=======================
#include "spanish.h" // spanish.h defines STR_YES_SI
drawString(STR_YES) // GET GARBAGE!
 
p_adib

Aren't #include and #define lines specific to a single file? Correct me if I'm wrong here, but if you do

#define bla

in the header file, then it will apply only to the header file, even though

#include "something.h"

is in the source file and something.h has that #define line in it.

Again, I might be talking nonsense, but it's worth a try.
*waving my hands* please carry on debate. ;)
 
Larry Smith

oddvark said:
[snip]
Here is a sample:

SOLUTION 1, WORKS:
================
#define STR_YES Si
drawString(STR_YES)

SOLUTION 2, DOESN'T WORK
=======================
#include "spanish.h" // spanish.h defines STR_YES_SI
drawString(STR_YES) // GET GARBAGE!

If it is meant to be a string, enclose it in
double quotes:

#define STR_YES "Si"

Then always refer to it as STR_YES.
STR_YES_SI is something completely different from
STR_YES.

If the above '#define...' is in either the current file
or in "spanish.h", it should work - but it should
be in only one file, not both.

Think about it for a moment: if #defines in included headers were not legal, then none of the defines in the system headers (stdlib.h, vector, etc.) would work - but they do work.
 
BobR

p_adib wrote in message:
Aren't #include and #define lines specific to a single file? [snip]

I'll give you a little homework assignment to do. Try this experiment:

Make two files as follows:

// --- Guts.h ---
std::cout<<"Hello World, from Guts.h"<<std::endl;
// --- Guts.h --- END

// --- GutsMain.cpp ---
#include <iostream>
#include <ostream>

int main(){
#include "Guts.h"
return 0;
}
// --- GutsMain.cpp --- END

Compile "GutsMain.cpp" and run it.
Report your findings here. ...<waiting>....
 
Michiel.Salters

oddvark wrote:
Hello,

I'm doing a

#define STR_YES Si

where Sì has that special Italian accent mark in it. Basically it's extended ASCII 236 (the 'i' with the sloping accent).

There are about a thousand extensions to ASCII; which one(s) do you mean? In ISO C++ there is only one relevant extension, though: Unicode. I think you want character U+00EC (especially since 0xEC is 236).
If I put the #define in the same file and try to print it out, it works: I get character 236 when my printing algorithm hits the special 'i'.

Well, if your OS/compiler interprets "" string literals as Latin-1 (common) or Unicode, that could just work. To be safe, though, you should write "S\u00ec".

HTH,
Michiel Salters
 
p_adib

BobR said:
I'll give you a little homework assignment to do. Try this experiment: [snip]

Fun fun fun! I did what you asked and got this, and I was shocked!

-- Hello World, from Guts.h
-- Press any key to continue

So that means #include simply pastes the file's contents at that location in the code?

If this is true, then theoretically, if oddvark put a line in his .cpp and then moved it to a .h which he then includes in his .cpp, it should be as if nothing happened.

Oddvark, maybe you're having problems building your project? If you want the .cpp file to pick up the .h file, you should probably include both in a project that you then build.

Otherwise, the " " thing given by Larry Smith seems relevant,
otherwise, how would the compiler interpret characters 's' & 'i'
together ? As a variable, as a keyword?
 
oddvark

Thanks for the help, everyone.

It was #define YES "si" (with quotes); I just didn't put the quotes in my example.

I went ahead and put it in a separate file and enforced ISO Latin-1 on it. That seems to have solved the problem.

I'm guessing the fact that it was sometimes treated as ISO Latin-1 and other times treated as something else may have had to do with my cutting and pasting, or some other inconsistent handling of the encoding.

Thanks.
 
BobR

p_adib wrote in message ...
Fun fun fun! I did what you asked and got this, and I was shocked!

-- Hello World, from Guts.h
-- Press any key to continue

So that means #include simply pastes the file's contents at that location in the code?

Exactly. Now try this:
[ in case someone out there says, "he said #define".]

// --- Guts.h ---
std::cout<<"Hello World, from Guts.h"<<std::endl;
#define BLAHBLAH std::cout<<"Hello World2, from Guts.h"<<std::endl;
// --- be sure that is all on one line! ---
// --- Guts.h --- END

// --- GutsMain.cpp ---
#include <iostream>
#include <ostream>

int main(){
#include "Guts.h"
BLAHBLAH
// looks weird, eh? (no semicolon at the end)
// note the very tricky naming. <G>
return 0;
}
// --- GutsMain.cpp --- END

Of course, using something like that in a real program would be insane [1].
It's just to point out how '#include' and '#define' work.

[1] - note that "Guts.h" depended on the std headers included in the
"GutsMain.cpp". That should be avoided.

[snip]
Otherwise, the double-quotes point made by Larry Smith seems relevant: how would the compiler interpret the characters 'S' and 'i' together? As a variable? As a keyword?

More than likely the problem was how Oddvark used the macro.

#define STR_YES Si
std::string TestSTR_YES( STR_YES );
// error: `Si' undeclared (first use this function)


#define STR_YES "Si"
std::string TestSTR_YES( STR_YES );
cout << "string TestSTR_YES=" << TestSTR_YES << std::endl;
// out: string TestSTR_YES=Si
 
