Hi
When one stores real numbers (doubles) in a file, can the file be
smaller if the data is written in binary rather than ASCII? I wrote a
small program to verify this, and both files come out the same size.
For integer data, though, binary files do get smaller relative to text
as the integer values grow, which is understandable. I need to store a
really large square matrix of floating-point numbers (1 million x 1
million) in a file; what is the best format to store them so that the
file size is smallest?
The program I used to test the ASCII and binary file sizes is given
below.
Thanks
suresh
#include <cstddef>
#include <fstream>
using namespace std;

int main() {
    double x[] = {1000.234, 2000.345, 20000000.567};
    const size_t n = sizeof(x) / sizeof(x[0]);

    // Text output: each value is written as decimal digits.
    ofstream outfileA("ascii.txt");
    for (size_t i = 0; i < n; i++)
        outfileA << x[i] << " ";  // was: outfileA << x — that printed the array's address
    outfileA.close();

    // Binary output: the raw sizeof(double)-byte representation of each element.
    ofstream outfileB("bin.bin", ios::binary);
    for (size_t i = 0; i < n; i++)
        outfileB.write(reinterpret_cast<char*>(&x[i]), sizeof(double));  // was: &x — that wrote x[0] every time
    outfileB.close();
}