Array size limits

Carol Depore

How do I determine the maximum array size?

For example, int a[10000] works, but a[10000000] does not (run time
error).

Thank you.
 
Jack Klein

Carol Depore said:
How do I determine the maximum array size?

For example, int a[10000] works, but a[10000000] does not (run time
error).

You can't.

The original C standard (ANSI 1989/ISO 1990) required that a compiler
successfully translate at least one program containing at least one
example of a set of environmental limits. One of those limits was
being able to create an object of at least 32,767 bytes.

This minimum limit was raised in the 1999 update to the C standard to
be at least 65,535 bytes.

No C implementation is required to provide for objects greater than
that size, which means that they don't need to allow for an array of
ints greater than (int)(65535 / sizeof(int)).

In very practical terms, on modern computers it is not possible to
say in advance how large an array can be created. It can depend on
things like the amount of physical memory installed in the computer,
the amount of virtual memory provided by the OS, and the number of
other tasks, drivers, and programs already running and how much memory
they are using. So your program may be able to use more or less memory
today than it could yesterday or will be able to tomorrow.

Many platforms place their strictest limits on automatic objects,
that is, those defined inside a function without the 'static'
keyword. On some platforms you can create larger arrays if you make
them static or allocate them dynamically.
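
To make that last point concrete, here is a minimal sketch (not from the
original post, and the size is an arbitrary example); on many systems the
static definition succeeds where the commented-out automatic one would not:

#include <stdio.h>

#define N 1000000

static int big_static[N];   /* static storage duration: usually not on the stack */

int main(void)
{
    /*
     * A local "int big_auto[N];" here would have automatic storage
     * duration and, on many systems, would overflow the default stack.
     */
    big_static[0] = 1;
    printf("static array of %ld ints was created without trouble\n", (long)N);
    return 0;
}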
 
Ben Pfaff

Carol Depore said:
How do I determine the maximum array size?

There is no portable way. You are better off using dynamic
allocation, because then you can try different sizes at runtime.
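
As a rough sketch of what "try different sizes at runtime" can look like
(my example, not Ben's; the starting size is arbitrary, and on some systems
a successful malloc() still does not guarantee the memory is actually
usable):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t n = 100000000;   /* deliberately huge starting request */
    int *a = NULL;

    /* Halve the request until malloc() succeeds or we give up. */
    while (n > 0 && (a = malloc(n * sizeof *a)) == NULL)
        n /= 2;

    if (a == NULL) {
        fprintf(stderr, "no memory available at all\n");
        return EXIT_FAILURE;
    }
    printf("got space for %lu ints\n", (unsigned long)n);
    free(a);
    return 0;
}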
 
Carol Depore

Jack Klein said:
The original C standard (ANSI 1989/ISO 1990) required that a compiler
successfully translate at least one program containing at least one
example of a set of environmental limits. One of those limits was
being able to create an object of at least 32,767 bytes.

This minimum limit was raised in the 1999 update to the C standard to
be at least 65,535 bytes.

No C implementation is required to provide for objects greater than
that size, which means that they don't need to allow for an array of
ints greater than (int)(65535 / sizeof(int)).

So...are you saying that I am guaranteed at least int a[65535], but no
guarantees beyond that?
 
Ben Pfaff

Carol Depore said:
No C implementation is required to provide for objects greater than
that size, which means that they don't need to allow for an array of
ints greater than (int)(65535 / sizeof(int)).

So...are you saying that I am guaranteed at least int a[65535], but no
guarantees beyond that?

No. To start with, sizeof(int) may be, and usually is, greater
than 1.
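
A quick way to see what that works out to on a given implementation (my
illustration, not part of the original reply):

#include <stdio.h>

int main(void)
{
    /* 65,535 is a size in bytes; dividing by the element size shows how
       many ints that minimum guarantee actually covers here. */
    printf("sizeof(int) = %lu\n", (unsigned long)sizeof(int));
    printf("65535 bytes holds at most %lu ints\n",
           (unsigned long)(65535 / sizeof(int)));
    return 0;
}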
 
Carol Depore

Carol Depore said:
No C implementation is required to provide for objects greater than
that size, which means that they don't need to allow for an array of
ints greater than (int)(65535 / sizeof(int)).

So...are you saying that I am guaranteed at least int a[65535], but no
guarantees beyond that?

No. To start with, sizeof(int) may be, and usually is, greater
than 1.


Oh, I see...so, if sizeof(int) is 2 bytes, then I am guaranteed at
least int a[32767], but no larger sized arrays are guaranteed.

Thanks for the help!
 
Ravi Uday

Carol Depore said:
Oh, I see...so, if sizeof(int) is 2 bytes, then I am guaranteed at
least int a[32767], but no larger sized arrays are guaranteed.

No, you have to check INT_MAX in <limits.h> to know the exact maximum
value that an integer can hold on your implementation.
 
Keith Thompson

Ravi Uday said:
Oh, I see...so, if sizeof(int) is 2 bytes, then I am guaranteed at
least int a[32767], but no larger sized arrays are guaranteed.
No, you have to check INT_MAX in <limits.h> to know the exact maximum
value that an integer can hold on your implementation.

INT_MAX isn't relevant. The limit in question is the maximum size of
an object, which is at least 65535 bytes (in a hosted environment
only). If sizeof(int) is 2, you're guaranteed at least int a[32767];
if sizeof(int) is 4, you're only guaranteed at least int a[16383].

But actually the guarantee is even weaker than that. The standard
only requires that the implementation must be able to translate and
execute at least one program containing an object of 65535 bytes
(along with a number of other limits, such as 127 arguments in one
function call). It's not required to handle *your* program containing
an object of 65535 bytes.

Practically speaking, though, the easiest way to satisfy the
translation limits is generally to impose no explicit limits, but to
support whatever will fit in memory, either at compile time or at run
time. A (non-binding) footnote in C99 5.2.4.1 says, "Implementations
should avoid imposing fixed translation limits whenever possible."
 
pete

Keith said:
INT_MAX isn't relevant. The limit in question is the maximum size of
an object, which is at least 65535 bytes (in a hosted environment
only). If sizeof(int) is 2, you're guaranteed at least int a[32767];
if sizeof(int) is 4, you're only guaranteed at least int a[16383].

But actually the guarantee is even weaker than that. The standard
only requires that the implementation must be able to translate and
execute at least one program containing an object of 65535 bytes
(along with a number of other limits, such as 127 arguments in one
function call). It's not required to handle *your* program containing
an object of 65535 bytes.

I think what they meant by the "at least one" part,
is closer to saying
"A C implementation is something which can translate and execute
a C program, and anything that can't translate and execute
a C program, isn't a C implementation"

I don't think that they meant to suggest that an implementation
which doesn't self destruct after program translation and execution,
"exceeds ANSI standards".
 
Carol Depore

Everyone, thank you for your help. I feel a little like I'm asking
Einstein to explain relativity to me. I'm way out of my league.
So, thanks for your patience.

Anyway, I'm still confused about how large a simple int array can be.
I understand what Jack and Ben said about the Standard guaranteeing
at least int a[32767], but I don't understand why I can't have an
array of int a[600000], especially since I have 311MB of unused memory
on my machine, and I thought the machine would allow programs up to
4GB.

Here's my little test program, which works for 500000, but fails for
600000.

All help to relieve my confusion is appreciated. I think I'm not
understanding something very fundamental.


#include <stdio.h>

//#define NNN 600000
#define NNN 500000

int main() {
    long i;
    int a[NNN];
    for (i = 0; i < NNN; i++) {
        printf("%ld\n", i);
        a[i] = 1;
    }
    return 0;
}
 
Keith Thompson

pete said:
Keith Thompson wrote: [...]
But actually the guarantee is even weaker than that. The standard
only requires that the implementation must be able to translate and
execute at least one program containing an object of 65535 bytes
(along with a number of other limits, such as 127 arguments in one
function call). It's not required to handle *your* program containing
an object of 65535 bytes.

I think what they meant by the "at least one" part,
is closer to saying
"A C implementation is something which can translate and execute
a C program, and anything that can't translate and execute
a C program, isn't a C implementation"

I don't think that they meant to suggest that an implementation
which doesn't self destruct after program translation and execution,
"exceeds ANSI standards".

Actually, I think that is what they meant. For example, if an
implementation can handle a single carefully written program that
meets each of the translation limits, including a single object of
exactly 65535 bytes, but falls over and dies if you add a 1-byte
object declaration to that same program, that implementation is
conforming (assuming it doesn't have any other problems).

That doesn't imply that such an implementation is *useful*; that's a
QoI (Quality of Implementation) issue.

In real life, nobody bothers to implement a conforming C compiler
that's totally useless (or if anybody does, it rapidly vanishes).
 
Eric Sosman

Carol said:
Everyone, thank you for your help. I feel a little like I'm asking
Einstein to explain relativity to me. I'm way out of my league.
So, thanks for your patience.

Anyway, I'm still confused about how large a simple int array can be.
I understand what Jack and Ben said about the Standard guaranteeing
at least int a[32767], but I don't understand why I can't have an
array of int a[600000], especially since I have 311MB of unused memory
on my machine, and I thought the machine would allow programs up to
4GB.

Your machine (like many) apparently exceeds the minimum
requirements imposed by the C language Standard. It's sort
of like the Federal definition of the minimum wage: Employers
are required to pay at least thus-and-such much per hour of
labor, but many laborers nonetheless demand and receive more.
Be happy; you're rich!

Here's my little test program, which works for 500000, but fails for
600000.

Your C implementation is giving you more than the minimum,
but does in fact have a limit on how big the array can be.
You're rich, but you're not Bill Gates.

All help to relieve my confusion is appreciated. I think I'm not
understanding something very fundamental.


#include <stdio.h>

//#define NNN 600000
#define NNN 500000

int main() {
    long i;
    int a[NNN];

Here's another issue. C data objects can have various
"storage classes:" automatic, static, and dynamic. The amount
of memory available for an object can be different for the
different storage classes. Your a[] array occupies automatic
storage, which is typically (although not necessarily) subject
to the tightest space restrictions. You may see dramatically
different results if you change the above to

int main() {
    static int a[NNN];
    ...

or

int main() {
    int *a = malloc(NNN * sizeof *a);
    if (a != NULL) {
        ...

The first of these uses static storage for a[], and the
second replaces the array with a pointer to dynamic storage.
There will be limits on how much static or dynamic memory
you can devote to your data, but they're likely to be looser
than the limits on automatic storage.

You're rich, but if you want to augment your wealth it's
better to rob big banks than little ones.
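
Put together into one complete program (my elaboration of Eric's fragments;
the size and messages are just examples), the two alternatives might look
like this:

#include <stdio.h>
#include <stdlib.h>

#define NNN 600000

static int b[NNN];   /* static storage: usually not on the stack */

int main(void)
{
    long i;
    int *a = malloc(NNN * sizeof *a);   /* dynamic storage */

    if (a == NULL) {
        fprintf(stderr, "malloc of %lu bytes failed\n",
                (unsigned long)(NNN * sizeof *a));
        return EXIT_FAILURE;
    }
    for (i = 0; i < NNN; i++) {
        a[i] = 1;
        b[i] = 1;
    }
    printf("both the static and the malloc'd array worked\n");
    free(a);
    return 0;
}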
 
Keith Thompson

Carol Depore said:
Anyway, I'm still confused about how large a simple int array can be.
I understand what Jack and Ben said about the Standard guaranteeing
at least int a[32767], but I don't understand why I can't have an
array of int a[600000], especially since I have 311MB of unused memory
on my machine, and I thought the machine would allow programs up to
4GB.

Here's my little test program, which works for 500000, but fails for
600000.

All help to relieve my confusion is appreciated. I think I'm not
understanding something very fundamental.

#include <stdio.h>

//#define NNN 600000
#define NNN 500000

int main() {
    long i;
    int a[NNN];
    for (i = 0; i < NNN; i++) {
        printf("%ld\n", i);
        a[i] = 1;
    }
    return 0;
}


The standard doesn't guarantee at least int a[32767] unless
sizeof(int) happens to be 1 or 2. The specific guarantee is an object
of at least 65535 bytes. But almost all implementations exceed that
guarantee.

You say the program fails for 600000, but you don't say *how* it
fails. It probably doesn't matter much in this case, but in general
knowing *how* something fails can be critical to figuring out what the
problem is.

The maximum size allowed for int a[NNN] in your program, and what's
going to happen if you exceed it, is going to depend on any of a
number of things, most of which we can't help you with here. The
total amount of memory available on the system is only one possible
factor. Some systems impose specific limits on stack size (the array
is probably going to be allocated on "the stack", though the C
standard doesn't define such a thing), but you probably don't know
what else is allocated there. Some systems might allow you to adjust
the limits (on Unix-like systems, see the "limit" or "ulimit" command;
on other systems, I have no clue). Some systems may support larger
chunks of memory in different contexts; for example, declaring a as a
global variable might put it in the data section rather than on the
stack, or you might try allocating it via malloc().

If you want to ask about the details, you should try a newsgroup
that's specific to your system.
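
For what it's worth, on POSIX systems (this is outside standard C, so treat
it as a platform-specific sketch rather than portable advice) a program can
query its own stack limit with getrlimit(); RLIM_INFINITY means no limit is
set:

#include <stdio.h>
#include <sys/resource.h>   /* POSIX, not part of standard C */

int main(void)
{
    struct rlimit rl;

    if (getrlimit(RLIMIT_STACK, &rl) == 0)
        printf("current (soft) stack limit: %lu bytes\n",
               (unsigned long)rl.rlim_cur);
    else
        perror("getrlimit");
    return 0;
}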
 
CBFalconer

Carol said:
.... snip ...

Anyway, I'm still confused about how large a simple int array
can be. I understand what Jack and Ben said about the Standard
guaranteeing at least int a[32767], but I don't understand why
I can't have an array of int a[600000], especially since I have
311MB of unused memory on my machine, and I thought the machine
would allow programs up to 4GB.

Here's my little test program, which works for 500000, but
fails for 600000.

#include <stdio.h>

//#define NNN 600000
#define NNN 500000

int main() {
    long i;
    int a[NNN];
    for (i = 0; i < NNN; i++) {
        printf("%ld\n", i);
        a[i] = 1;
    }
    return 0;
}


This tells me that your system assigns a default stack size
(assuming it has a stack) of between 500,000 * sizeof int and
600,000 * sizeof int. The most likely values are 1 meg and 2 meg.
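
To put numbers on that (my arithmetic, assuming sizeof(int) is 4):
500,000 * 4 = 2,000,000 bytes fits within a 2 MiB stack (2,097,152 bytes),
while 600,000 * 4 = 2,400,000 bytes does not, so a default stack of about
2 MB is consistent with what Carol observed.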
 
Wei Li

I don't know what ANSI C says, but the stack size of a program at run
time is limited by the operating system, so the C compiler doesn't
give a compile-time error.

If you are on Linux, you can use the "limit" command to show the limits
and "limit stacksize 1000000" to change them. I think your program will
work OK after changing the limit.

Thanks!
Wei
 
Keith Thompson

Wei Li said:
I don't know what ANSI C says, but the stack size of a program at run
time is limited by the operating system, so the C compiler doesn't
give a compile-time error.

If you are on Linux, you can use the "limit" command to show the limits
and "limit stacksize 1000000" to change them. I think your program will
work OK after changing the limit.

Carol Depore said:
How do I determine the maximum array size?

For example, int a[10000] works, but a[10000000] does not (run time
error).

Thank you.

Wei Li: Please don't top-post.

Top-posting means writing your new material first, followed by the
quoted article to which you're responding. It makes it difficult to
follow the discussion, especially when (almost) everyone else
bottom-posts, as I'm doing here. (You should also trim anything
that's not relevant to your response, though in this case the previous
article was short enough that quoting the whole thing is probably ok.)

Thanks.
 
Mabden

Wei Li said:
I don't know what ANSI C says, but the stack size of a program at run
time is limited by the operating system, so the C compiler doesn't
give a compile-time error.

If you are on Linux, you can use the "limit" command to show the limits
and "limit stacksize 1000000" to change them. I think your program will
work OK after changing the limit.

Carol Depore said:
How do I determine the maximum array size?

For example, int a[10000] works, but a[10000000] does not (run time
error).

It's the same with recursion. I attempted a simple factorial program (below)
and it crapped out at 39! and even then I think the result was wrong (it was
a while ago, so I don't remember whether it worked right or not) but I
needed something much bigger, so I moved on. I recall believing the fault
was in the recursion, not the size of the value, but it was many years ago.

-------------------------------------------
#include <stdio.h>
#include <stdlib.h>

double fact (int n);

int main(int argc, char **argv)
{
    int n;

    if (argc < 2) {
        fprintf(stderr, "usage: fact N\n");
        return EXIT_FAILURE;
    }
    n = atoi(argv[1]);
    fact(n);
    return 0;
}

double fact(int n)
{
    static double value = 1;

    if (n > 1)
        value = n * fact(n - 1);

    printf("%3d! = %.0f\n", n, value);

    return value;
}
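
For what it's worth, the wrong-looking result at 39! is more plausibly a
precision issue than a recursion issue: 39! is roughly 2.04e46, far beyond
the range (about 2^53, or 9.0e15) in which a double can hold every integer
exactly, so the printed value could only be a rounded approximation. A
recursion depth of 39 calls, by contrast, is tiny on typical systems.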
 
Carol Depore

Wow, I think a little tiny light bulb has been lit in my head! Thanks
to all of you!

In summary,
- There are TWO areas for array allocation, a stack area and a global
area.
- When my little test program uses the stack (by default), the array
is limited to about int a[500000]. (the stack limit probably can be
increased, but I'll learn that later).
- However, if I use the global area, by "static int", then my array
can be much larger. My test program below, with a[20000000], worked
fine!!

#include <stdio.h>

//#define NNN 600000
#define NNN 20000000

int main() {
    long i;
    static int a[NNN];
    printf("Start\n");
    for (i = 0; i < NNN; i++) {
        printf("%ld\n", i);
        a[i] = 1;
    }
    return 0;
}


Thanks again, to all of you!
 
Old Wolf

Carol Depore said:
Anyway, I'm still confused about how large a simple int array can be.

Short answer: it depends, and you can't tell in advance.
I understand what Jack and Ben said about the Standard guaranteeing
at least int a[32767], but I don't understand why I can't have an
array of int a[600000], especially since I have 311MB of unused memory
on my machine, and I thought the machine would allow programs up to
4GB.

Your machine probably has restrictions on how big "automatic"
objects can be (i.e., when you write int a[NNN] inside a function).
If you want to use up all of your actual memory then you should
try dynamic allocation, which has the advantage that you can
handle errors nicely instead of just getting a runtime error:

#include <stdio.h>
#include <stdlib.h>

#define NNN 600000

int main(void) {
    long i;
    int *a = malloc(NNN * sizeof *a);
    if (a == NULL) {
        printf("not enough memory.\n");
        return EXIT_FAILURE;
    }
    for (i = 0; i < NNN; i++) {
        printf("%ld\n", i);
        a[i] = 1;
    }
    free(a); /* releases the memory for other applications */
    return 0;
}
 
Wei Li

Thanks! I will be careful next time :p

Keith Thompson said:
Wei Li: Please don't top-post.

Top-posting means writing your new material first, followed by the
quoted article to which you're responding. It makes it difficult to
follow the discussion, especially when (almost) everyone else
bottom-posts, as I'm doing here. (You should also trim anything
that's not relevant to your response, though in this case the previous
article was short enough that quoting the whole thing is probably ok.)

Thanks.
 
