Brian
Hello,
I have a text file I'm attempting to parse. There are about 50 fixed-width
fields in each line (row). For example (shortened for brevity):
W1234Somebody East 101110001111010101
E1235Someone Else West 010111001001010101
I'm having trouble pulling these fields into structures so that I can
access each one individually. I am currently opening the file as a
sequential file. Is there a better way?
My structure looks something like:
struct data {
    char area[1];
    char empNumber[4];
    char name[16];
    char region[5];
    char options[20];
};
int index = 0;
data user[100];
I would like to read the entire file into memory. Please tell me if I'm
going about this the wrong way. So far, after reading the file in, I'm
unable to access any individual items (i.e., user[index].area). Also,
should I stick with a sequential file, or should I consider binary access
(it seems like that may make it easier to address individual elements)?
(I don't want to include too many details here, but will be happy to provide
whatever is needed).
Thank you in advance,
Brian