V
V.Subramanian, India
This question is only for understanding purposes.
I am using Red Hat Linux kernel 2.6.9 on an Intel Pentium D dual-core
processor.
The malloc function takes a size_t argument. I am using the gcc 3.4.3
implementation, and in its stdint.h, SIZE_MAX is defined as follows:
# define SIZE_MAX (4294967295U)
SIZE_MAX, defined in stdint.h, is the largest value that can be
represented in type size_t. However, when I pass SIZE_MAX as the
argument to malloc, it fails to allocate SIZE_MAX bytes of memory. In
fact, the largest allocation malloc actually satisfies is much smaller.
Here is the complete program max_size_to_malloc.c
#include <stdlib.h>
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

int main(void)
{
    size_t count = 0;
    do
    {
        const size_t size = SIZE_MAX - count;
        char *p = malloc(size);
        if (p)
        {
            /* %zu is the C99 conversion specifier for size_t */
            fprintf(stderr, "%zu - %zu : ",
                    (size_t) SIZE_MAX, count);
            fprintf(stderr, "Allocated %zu bytes of memory\n",
                    size);
            free(p);
            p = NULL;
            return EXIT_SUCCESS;
        }
    } while (++count);
    return EXIT_FAILURE; /* no size succeeded */
}
This program compiles cleanly with gcc 3.4.3 as
gcc -std=c99 -pedantic -Wall -Wextra max_size_to_malloc.c
and, after quite some time, produces the following output:
4294967295 - 1365266451 : Allocated 2929700844 bytes of memory
Why is malloc able to allocate so much less than SIZE_MAX, given that
its parameter type is size_t?
What normally determines the maximum size of a single chunk of memory
that malloc can allocate?
Please explain.
Thanks
V.Subramanian