forcing 32-bit long on 64-bit machine?

CoffeeGood

Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

Thanks.
 
Skarmander

CoffeeGood said:
I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?
Ask the gcc folks at gnu.gcc; this is a compiler-specific question.

If you do decide to convert the longs rather than change the platform
(which is what you're trying to do), #include <stdint.h> and convert
them to (u)int32_t. That's an (unsigned) integer type with exactly 32
bits, if it exists, and solves the problem once and for all. Well, other
than rewriting the code to make it not use such assumptions in the first
place.
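
For instance, a minimal sketch of such a conversion (the struct and field
names are invented for illustration; the old code will have its own):

#include <stdint.h>   /* exact-width types, where the implementation has them */
#include <inttypes.h> /* PRId32, for printing an int32_t portably */
#include <stdio.h>

/* Hypothetical record from the old program: each field used to be a
   long and was assumed to be 32 bits; int32_t makes that explicit. */
struct record {
    int32_t id;
    int32_t offset;
};

int main(void)
{
    struct record r = { 42, 1024 };
    printf("id = %" PRId32 ", offset = %" PRId32 "\n", r.id, r.offset);
    return 0;
}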

S.
 
Richard Heathfield

CoffeeGood said:
Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.

Bad code.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

Good compiler. Better compiler, because it exposes bad code!
Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

Have you considered asking in a gcc newsgroup? gnu.gcc.help might be worth a
go.
 
Randy Howard

CoffeeGood wrote:
Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.

Okay. Does that assumption imply that it will break if they are
bigger? Sometimes apps like this have a minimum requirement for
data size, but if it's bigger it will work. Have you made sure
that isn't the case here?
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

You can usually tell the compiler to generate a 32-bit binary,
even when running on a 64-bit kernel, if the machine is (as I suspect)
part of the AMD/Intel 'x86-64' family. See the man page for
gcc, or even more appropriately, take this up in one of the
gcc-specific newsgroups.
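
A quick way to see what a given set of options buys you (a throwaway
sketch; on the x86-64 ports of gcc the usual flag is -m32, but check your
own gcc's man page, and note that it needs the 32-bit runtime libraries
installed):

#include <stdio.h>

/* Compile and run this once normally and once as a 32-bit binary, e.g.
 *     gcc check.c && ./a.out
 *     gcc -m32 check.c && ./a.out
 * and compare the sizes it reports. */
int main(void)
{
    printf("sizeof(int)    = %zu\n", sizeof(int));
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    return 0;
}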
Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?
 
Robert Harris

Randy said:
CoffeeGood wrote:
Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.

[snip]


Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?


I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?
If you don't, then what will you do about pointer sizes which will also
have increased by default from 32 bits to 64 bits?

Robert
 
CoffeeGood

I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?

I don't know, I merely inherited the code from others.

Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.
 
Walter Roberson

Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.

And if one is trying to program on a machine whose integer size is 36
bits, but one is trying to write a program that would also work on
a machine whose integer size is 32?
 
CoffeeGood

And if one is trying to program on a machine whose integer size is 36 bits...

typedef int s36;
typedef unsigned u36;
 
Ben Pfaff

CoffeeGood said:
Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.

Most of the time I don't care what size my variables really are.
I just need them to be some minimum size. C's types work fine
for that. When I do need a specific size, there's always
<stdint.h>.

(You could always use Java if you want fixed-size types.)
 
Skarmander

CoffeeGood said:
typedef int s36;
typedef unsigned u36;
When replying, please retain attribution, and don't cut off sentences
when they have relevant information.

Your answer is rendered bogus by omission of "but one is trying to write
a program that would also work on a machine whose integer size is 32?"

Suppose I want to write a program that works on any machine with an
integer type that is at least 32 bits. Well, I'd use standard C's
"long". No typedefs needed.

Your approach means I must have typedefs if I *don't* care about the
exact size of integers. ANSI C's approach is that you must have typedefs
if you *do* care about the exact size of integers. The latter is more
portable, even if it also makes life harder for programmers. Assuming
exact sizes makes some things easier. It also makes other things
impossible. The tradeoff is warranted.

Exact-size integer types are provided by <stdint.h>, where available.

S.
 
Randy Howard

Robert Harris wrote:
If you don't, then what will you do about pointer sizes which will also
have increased by default from 32 bits to 64 bits?

Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.
 
Ben Pfaff

Randy Howard said:
Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.

intptr_t can be helpful here (although it isn't guaranteed to
exist).
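
A minimal sketch of what that buys you (this only compiles where the
implementation provides the optional (u)intptr_t types):

#include <stdint.h> /* uintptr_t is optional, even in C99 */
#include <stdio.h>

int main(void)
{
    int x = 7;
    void *p = &x;

    /* The only guarantee: a void * converted to (u)intptr_t and back
       compares equal to the original pointer. */
    uintptr_t n = (uintptr_t)p;
    void *q = (void *)n;

    printf("round trip %s\n", q == p ? "ok" : "broken");
    return 0;
}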
 
Malcolm

Randy Howard said:
Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.
Windows allows you to attach a "user long" to a window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact I don't know of any other good way
of achieving the same thing.
 
Skarmander

Malcolm said:
Windows allows you to attach a "user long" to a window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact I don't know of any other good way
of achieving the same thing.
<OT>I do: use SetWindowLongPtr(), which supersedes SetWindowLong() for
*exactly this reason*: a long cannot portably be assumed to have the
same size as a pointer (and this will not work on 64-bit Windows).
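
Roughly like this (still off-topic, and from memory, so check the SDK
documentation; the struct is just an illustration):

#include <windows.h>

struct wnd_state { int clicks; };  /* whatever you want to hang on the window */

static void attach_state(HWND hwnd, struct wnd_state *s)
{
    /* LONG_PTR is pointer-sized on both 32- and 64-bit Windows,
       unlike LONG, which stays 32 bits. */
    SetWindowLongPtr(hwnd, GWLP_USERDATA, (LONG_PTR)s);
}

static struct wnd_state *get_state(HWND hwnd)
{
    return (struct wnd_state *)GetWindowLongPtr(hwnd, GWLP_USERDATA);
}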

In other words, you can certainly write code that makes this assumption,
but it's not a good idea to actually do so, because your code won't go
far. Unfortunately for you, this is true even if your environment forces
you to do this...

S.
 
Richard Bos

Malcolm said:
Windows allows you to set a "user long" to the window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact i don't know of any other good way of
achieving the same thing.

Sadly, the MS Windows API is replete with this sort of errant
sub-hackery. Whenever I look into the declarations of types under MS
Windows and the trouble they had to go to to make Win32S, Win'98 and
WinNT _almost_ compatible, the importance of properly portable C code is
pressed upon me again.

Richard
 
CoffeeGood

Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.

That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.
 
pete

CoffeeGood wrote:
That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.

No.
Ben is talking about portable programming.
If you need an unsigned type with at least 32 bits,
then unsigned long is the portable choice.
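
A trivial example of the idiom (the mask keeps the arithmetic to 32 bits
even where unsigned long is wider):

#include <stdio.h>

/* A 32-bit accumulator written portably: unsigned long is at least
   32 bits everywhere, and the mask throws away any extra width. */
int main(void)
{
    unsigned long acc = 0;
    int i;

    for (i = 0; i < 10; i++)
        acc = (acc * 31UL + (unsigned long)i) & 0xFFFFFFFFUL;

    printf("acc = %lu\n", acc);
    return 0;
}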
 
Randy Howard

CoffeeGood wrote:
That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.

Incorrect. For a lot of algorithms and/or program needs, this
is perfectly reasonable.
 
Skarmander

CoffeeGood said:
That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.
Err.... why? The topic of this ng is standard C. Writing standard C in
fact *requires* that you only rely on guaranteed minimum sizes.

"Heresy" would be assuming that a short is 16 bits, an int 32 bits, that
you can safely convert pointers of all types to ints and back, that a
null pointer is represented by all-bits-zero... that sort of thing.

I think you're confusing a minimum size with an exact size. That is, if
you do something like this:

typedef char int8_t;
typedef short int16_t;
typedef int int32_t;

With the intention that intN_t is a signed integer type of exactly N
bits. This renders code unportable on all platforms that don't meet the
assumptions this implies: that a char is 8 bits (and plain char is
signed), that a short is 16 bits, an int 32 bits.

If you isolate these typedefs and make it clear they are to be
customized for every platform, you're still relying on the assumption
that the types you want exist at all. Some platforms will simply not
*have* signed integer types of exactly 8, 16 or 32 bits.
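
The least such a header can do is fail loudly when its assumptions don't
hold. A rough sketch (value-range checks only, so padding bits aside; the
names are mine, to avoid colliding with the standard intN_t):

#include <limits.h>

/* Guard the hand-rolled typedefs: refuse to compile rather than
   silently hand out wrongly sized types on an unexpected platform. */
#if CHAR_BIT != 8
#error "these typedefs assume an 8-bit char"
#endif
#if SHRT_MAX != 32767
#error "these typedefs assume a 16-bit short"
#endif
#if INT_MAX != 2147483647
#error "these typedefs assume a 32-bit int"
#endif

typedef signed char my_int8;
typedef short       my_int16;
typedef int         my_int32;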

If, as is usual, you actually don't need *exact* sizes (since this is
mostly interesting for interfacing with low-level bits that don't need
much in the way of portability anyway) but are content with minimally
adequate types, portability is easier. C89 already gives you standard
integer types of at least 8, 16 and 32 bits: char, int and long.

C99 makes this even easier with <stdint.h>, which includes the intN_t
typedefs above (where the corresponding types exist), as well as uintN_t
for the unsigned counterparts, and (u)int_leastN_t for integers of at
least a certain size. C99 furthermore requires that (u)int_leastN_t
types exist for N = 8, 16, 32 or 64, and defines macros for the minimum
and maximum values of all of these types.

This makes writing portable code that much easier. You can do, for example:

#include <stdint.h>
typedef uint16_t word;

For poorly written or platform-specific code that requires exact 16-bit
quantities. This code will fail to compile on platforms that don't have
such a type, but will not require any modification for those platforms
that do. You can even do:

#include <stdint.h>

#ifndef INT_LEAST24_MIN
typedef int_least32_t int_least24_t;
#define INT_LEAST24_MIN INT_LEAST32_MIN
#define INT_LEAST24_MAX INT_LEAST32_MAX
#define INT24_C(x) INT32_C(x)
#endif

This gives you a platform-specific integer type at least 24 bits wide,
using the platform's least-32-bit type if necessary.

S.
 
Clark S. Cox III

That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.

You've actually got it backwards. Writing portable code implies *only*
relying on the guarantees offered by the standard, including the
minimum sizes for integers.

The problem is when people make unwarranted assumptions like:

"int is always at least 32-bits"
"long is always 32-bits"
"char is always 8-bits"
etc.

None of which are true.
 
