Hi,
I am developing a C program for reading over a million files of 1
kilobyte each and sending the contents to another program using some
middleware. I need some help designing the program to process such
a large number of files in less than 8 hours.
Your main bottleneck is likely to be file access. Accessing lots of
small files on a hard disk can end up being very slow; disks are much
more efficient at reading large chunks of sequential data. You may want to
consider how your files are organised in the first place. If, for example,
the data were written as 1K blocks in a single file (or perhaps a few large
files), the problem reduces to transferring a gigabyte of data, which can
be done in seconds or minutes at normal LAN speeds.
This isn't really a question about C but about the design of a file
management system. You need to sit down and specify your real requirements,
e.g. why there are over a million 1K files in the first place and whether
a better approach is possible. There may be things you can do to aid the
transfer process when those million files are being generated (such as
appending them to a single file, or perhaps even putting them in a database).
There is a lot to consider before worrying about C-related issues.
Lawrence