Any way to have ostream not auto-extend sign bit when printing addresses?


David T

Hi,

When I use an ostream to print an address/pointer, it will
automatically extend the sign bit.

For example, suppose a pointer has a value of 0xaf46d00c. If I print
it using the "<<" operator, I get this:
0xffffffffaf46d00c
I would like it to print as 0xaf46d00c.

The code is generally like this:
void* myptr;
.....
clog << myptr << endl;

Any ideas on how I can use something like a global setf() flag to
make it not extend the sign bit when printing all addresses?

Thanks,
David
 

James Kanze

When I use an ostream to print an address/pointer, it will
automatically extend the sign bit.
For example, suppose a pointer has a value of 0xaf46d00c. If I print
it using the "<<" operator, I get this:
0xffffffffaf46d00c
I would like it to print as 0xaf46d00c.
The code is generally like this:
void* myptr;
.....
clog << myptr << endl;
Any ideas on how I can use something like a global setf() flag to
make it not extend the sign bit when printing all addresses?

The output format of a pointer is implementation defined. None
of the formatting arguments are required to affect it (although
from a QoI point of view, I would expect at least width to be
taken into account). Sign extending a 32 bit value seems a bit
strange, but it's the implementation's decision.

If you want to control the output format, you really only have
one choice: reinterpret_cast the pointer to the correct integral
type (probably unsigned). And of course, even then, you might
have to "interpret" the output somewhat, e.g. if the
architecture doesn't have linear addressing.
 

James Kanze

David T wrote:
This is a tough one for me to test at the moment, because GCC
actually provides overloads for pointer types.

The standard requires overloads for void*, char* and I think
unsigned char* and signed char*. His problem is that he doesn't
like the format the implementation does for void*.
Here's a proposed solution that depends on pointers fitting in
std::size_t, but I'm not sure whether that's guaranteed by the
standard.

Of course it's not. I've done a lot of programming on systems
where sizeof( size_t ) == 2, but sizeof( void* ) == 4. More
generally, the standard doesn't even guarantee that sizeof(
void* ) == sizeof( int* ), and there are machines being sold
today where this is not the case.

In modern C, in the next release of C++, and in any really good
implementation of C++ today, there will be a uintptr_t, defined
in <stdint.h> (or as std::uintptr_t in <cstdint>, of course, in
the future C++).

I'm still a little curious about his platform. What platform
has 32 bit pointers, but outputs them as 64 bit values?
 

Alf P. Steinbach

* James Kanze:
If you want to control the output format [for a pointer], you really
only have one choice: reinterpret_cast the pointer to the correct
integral type (probably unsigned). And of course, even then, you
might have to "interpret" the output somewhat, e.g. if the
architecture doesn't have linear addressing.

Well, an alternative C++ way is to output the pointer value, static_cast'ed to
void const*, to a std::ostringstream, then deal with the resulting string. But
I'd prefer to use sprintf. :) Of course, I'm pretty sure your preference is for
the iostream solution, and if so there is merit in that position: by simply
banning all that C I/O stuff one removes a whole big breeding ground for bugs
-- but then, at least for my personal preference, convenience & efficiency
count for more.

Cheers,

- Alf
 
