C
Chris Mantoulidis
I have heard that the way an int is stored is different from the way a
short is (for example, eh... the same could apply for a 16-bit int and a
long)...
But how are they stored differently?
Is the following correct?
short a = 16: 0000000000010000
int b = 16 (32-bit int): 00000000000000000000000000010000
In front of the 16 bits of a, are there all 1's?
Plus, I've read about the various ways a signed data type can be stored...
Which one is most commonly used? If I saw 32 bits, how would I know whether
it was a signed int or an unsigned int? Would I be able to tell?
The different bit storage schemes seem confusing...