I have a question about using signed and unsigned int data types. In a typical
for loop like:
int i;
for (i = 0; i < n; i++) { /* some code */ }
If I used an unsigned int instead of an int, how many instructions of
difference would there be? Would it matter on embedded systems?
If the variable 'n' is an 'int' - i.e., 'signed int' - and its value is
non-negative, there is no problem here. But if 'i' is unsigned and 'n' is
negative, the comparison 'i < n' converts 'n' to unsigned, and the loop
runs far more times than expected.
If your system codes an integer value into 16 bits,
an 'unsigned int' will span from 0x0000 (0) to 0xFFFF (65535),
while a 'signed int' will span from 0x0000 (0) to 0x7FFF (32767),
with 0x8000 to 0xFFFF representing the negative numbers -32768 to -1
(in two's complement).
Therefore, the value 0xA000 will be positive (40960) for an
'unsigned int', and negative (-24576) for a 'signed int' ...
It is up to you to know what you are doing.
Now, about the number of instructions involved at runtime:
for a simple counting loop like this, there is no difference between
'signed' and 'unsigned', because for a given data type (i.e., 'int') they
are the same size and use the same add and compare instructions on most
processors. (Differences can appear for operations like division or right
shifts, where signed arithmetic may cost extra instructions.)
Here are some examples of code:
// This will work ---------------
#define MAX 1000
int i;
for( i = -1; ++i < MAX; )
{
// some code
}
// This will *NOT* work ---------------
#define MAX 1000
unsigned int i;
for( i = MAX - 1; i >= 0; i-- ) // i >= 0 is always true for an unsigned int,
{                               // so this loops forever
// some code
}
// This will work ---------------
#define MAX 30000
int i;
for( i=0; i < MAX; i++ )
{
// some code
}
// This will work ---------------
#define MAX 30000
unsigned int i;
for( i=0; i < MAX; i++ )
{
// some code
}
// This will work (on a 16-bit system, 60000 still fits in an unsigned int) ---------------
#define MAX 60000
unsigned int i;
for( i=0; i < MAX; i++ )
{
// some code
}
// This will *NOT* work (on a 16-bit system, 'i' overflows before reaching 60000) ---------------
#define MAX 60000
int i;
for( i=0; i < MAX; i++ )
{
// some code
}