Kenneth Brody said:
> Well, you can't realloc() the array, as that would invalidate all
> existing FILE* values. (Unless it returned the same address, which
> is certainly not guaranteed, or even very likely in most scenarios.)

Of course! I was not suggesting using realloc.

> You can, however, allocate chunks of fixed-sized arrays which are then
> put on a linked list. Not as efficient as a single array, but a lot
> more efficient than individual allocations.
Yes, but the complexity is still linear: validating a pointer takes a number
of comparisons against array bounds proportional to the total number of FILE
objects allocated (2 * N / chunk_size comparisons for N objects).
The trick is to allocate chunks of increasing sizes, for instance by doubling
the chunk size each time you run out of objects: the number of chunks will
be log2(N / first_chunk_size), a much smaller value for large values
of N. The chunks would be linked in a list and fclosed FILEs linked in a
free list. Alternatively, one can store pointers to the chunks in a small
array to make the test more cache efficient.
You can use that method for any type of fixed-size structure that a program
allocates, and add debug-time asserts to verify pointer validity. There are
two constraints: any two object pointers (converted to char *) must be
comparable (which the Standard does not guarantee) and you need a way to
distinguish a freed or unallocated structure from a live one (either a
dedicated flag or some other test of field values). This is not 100%
foolproof, but it can help debug allocation problems when more general
tools are not available.
There are other efficient methods for pointer validity checking, using hash
tables, trees or similar ad hoc structures, usually at the cost of more
complexity.