thanks in advance for your help, tim
In a narrow technical sense, it's almost never dangerous: char can have
trap representations whilst unsigned char can't, but it's most
unlikely your code will ever need to run on such a machine.
However, casting unsigned char to char, though not the other way round,
usually means that someone doesn't know what they are doing. char
should be used for characters, i.e. human-readable text; unsigned char
for bytes, usually arbitrary bits, occasionally for small integers. If
you need a tiny signed integer, use signed char. It doesn't normally
make sense to convert an arbitrary bit pattern to a human-readable
character.
It's dangerous when it's done by someone who got other people to answer
his technical interview or homework questions.
THIS IS NOT A HOMEWORK
Vincenzo Mercuri said (replying to Ben Pfaff):
I thought about this as well, but the typecast per se is in fact
a conversion between pointer types, so I think it would be safe.
In that case, ...
If the unsigned char * value is already well defined and everything, it is
safe to convert it to char *, and even to access the memory. In C, this
is not considered to be invalid aliasing. Any object can be accessed
as an array of characters, plain, signed or unsigned.
(no need to shout)
Does the standard guarantee that?
Kaz Kylheku said: Even if there is a trap representation there, it's not an aliasing issue.
If you could not alias an object using chars, then no access at all would
be well-defined.
I don't follow your reasoning.
Where does the standard say that you can alias any object with an
array of plain or signed char?
I have a vague memory of a statement that plain and signed char cannot
have trap representations, but I can't confirm that from the standard.
On 11/16/2011 02:27 PM, Keith Thompson wrote:
...
I know of no reason why signed char (and therefore, char) cannot have
trap representations. However, every statement in 6.2.6.1p5 which says
that the behavior is undefined when a trap representation is involved,
explicitly excludes all character types, not just unsigned char. I'm not
quite sure what to make of that fact, but I'm sure that explicitly
excluding all character types was intentional; I'm not so sure whether
it was intentional to allow signed char to have trap representations.
6.2.6.1p5 refers to the trap representations for the type of the
object. In other words, if an object p of type void * holds a trap
representation, 6.2.6.1p5 makes it explicit that reading that object
as void * is not valid.
So, in your opinion, what is the significance of the exclusion of
character types from those statements? What do those statements mean,
with those exclusions, that differs from what they would mean if those
exclusions were dropped? Please accompany your explanation with specific
examples of code that would have defined behavior under the existing
rules, but not with that modification, or vice-versa.
6.5 paragraph 7. An object can be accessed with an lvalue which
is of character type.