Shivanand Kadwadkar
---------------------------------------------------------------
#include <stdio.h>
int main(void)
{
signed char i = 128;
printf("i = %d and signed char size = %zu byte\n", i, sizeof(signed char));
return 0;
}
-----------------------------------------------------------------------
The way I thought it should work: since char is 1 byte long, the value
above is represented as 1000 0000 in binary, so when I print i I
expected to get 128, or perhaps -0/0.

The actual output of the program is i = -128 and signed char size = 1 byte.

I don't understand how -128 can be represented in 8 bits, and why the
compiler treats it as -128 rather than 128.