David Tiktin said:
Right - the technique is portable, but the value isn't.
Again, the technique is portable, but the value isn't. Consider:
#include <stdio.h>

int main(void)
{
    FILE *fp = fopen("foo.txt", "w");
    if (fp != NULL)
    {
        /* note: fclose() returns 0 on success, so compare against 0 */
        if (fputs("A\n", fp) != EOF && fclose(fp) == 0)
        {
            puts("All is well.");
        }
    }
    return 0; /* if fputs failed, the stream will be closed on exit! */
}
If "All is well" is displayed (under any conforming hosted
implementation), we have created a text file, which in C terms
contains one line comprising 'A' and a newline character. So the
technique is portable.
But what have we actually created, in what we might call absolute
terms? On a PC under Windows, the file will contain three bytes,
with values 65, 13, and 10 (obviously I'm using decimal
representation here). On an old-style Mac, it would contain just
two bytes: 65, 13. On a Linux box (or, I *think*, a modern Mac),
it would contain 65, 10. On an IBM mainframe, well, don't ask(!),
but to start off with, that 'A' would be 193 rather than 65. The
point is that, whilst the output can be interpreted consistently
within the machine/OS/implementation combination, moving that file
to another (disparate) system will (or at least may) result in
that interpretation becoming invalidated unless some kind of data
massage is performed.