Trouble with binary files?

Guest

I'm trying to write a program that will read a binary
file into a buffer, do stuff with it, and then write
the result back into another file. However, I'm
running into a problem.

I haven't been able to find a way that will read in
more than the first 160 bytes (of a 910 byte file).
I've tried using each_byte and looping with getc, as
well as storing the results in a string or an array.

I've never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

-Morgan.

 
Mike Stok

I'm trying to write a program that will read a binary
file into a buffer, do stuff with it, and then write
the result back into another file. However, I'm
running into a problem.

I haven't been able to find a way that will read in
more than the first 160 bytes (of a 910 byte file).
I've tried using each_byte and looping with getc, as
well as storing the results in a string or an array.

I've never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

If you're on a Windows platform this can happen if you have a ^Z in your
file.

If this is your problem then binmode may help:

[mike@ratdog mike]$ ri binmode
This is a test 'ri'. Please report errors and omissions
on http://www.rubygarden.org/ruby?RIOnePointEight

------------------------------------------------------------- IO#binmode
ios.binmode -> ios
------------------------------------------------------------------------
Puts ios into binary mode. This is useful only in
MS-DOS/Windows environments. Once a stream is in binary mode, it cannot
be reset to nonbinary mode.
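
For example, a minimal sketch of the fix (the file name here is
hypothetical):

  # sketch only: open the file, switch to binary mode, then read it all
  File.open("data.bin") do |f|
    f.binmode         # no-op on Unix; on DOS/Windows, stops ^Z acting as EOF
    data = f.read     # reads the whole file, not just up to the first ^Z
    # ... do stuff with data ...
  end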

Hope this helps,

Mike
 
Heinz Werntges

I'm trying to write a program that will read a binary
file into a buffer, do stuff with it, and then write
the result back into another file. However, I'm
running into a problem.

I haven't been able to find a way that will read in
more than the first 160 bytes (of a 910 byte file).
I've tried using each_byte and looping with getc, as
well as storing the results in a string or an array.

I've never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

-Morgan.

Did you try IO#binmode? Unix guys (like me) typically miss that
when working in a Windows environment.

Cheers,

-- Heinz
 
Joey Gibson

I've never had this sort of problem working with text
files. Is there something else I have to do to be able
to work with binary data?

Are you on a Windows box and are you opening the file in binary mode?
I had this same problem recently with the same results. The problem
was a Ctrl-Z some way into the file that was being interpreted as
EOF. The fix was to add a 'b' to the open flags:

File.open("foo.db", "rb") do ...

and then all was right with the world.
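
A minimal round-trip sketch along those lines (the output file name is
hypothetical):

  # sketch only: read a file in binary mode, then write the result back out
  data = File.open("foo.db", "rb") { |f| f.read }   # "rb" keeps ^Z from acting as EOF
  File.open("out.db", "wb") { |f| f.write(data) }   # "wb" leaves the bytes untouched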
 
Hal Fulton

Tim said:
You misunderstand the 160-byte barrier as being related to Ruby.
It's a Win32/DOS issue.

<story-mode>

Way back in PC-/MS-DOS days, it was decided that the
non-printable ASCII-26 (^Z) character would mark the end of a
textmode file. The difference between textmode and binmode of a
DOS file is important, though it need not ever have become an
issue.

The requirement for ^Z to terminate a textfile has since been
changed. However, (for backward compatibility?) when the ^Z *is*
encountered in a textmode file, DOS (and subsequently, Windows)
still set the EOF flag and stop reading.

</story-mode>

This becomes more of an issue because the default file open mode on
DOS/Win is text mode, creating the need for a function call used
almost exclusively on DOS/Win platforms: binmode(), which explicitly
sets the file read mode to binary, preventing the OS from stopping at
the first ^Z (and from translating line endings, blah, blah...).

As someone else mentioned above, this isn't an issue on Unix or
many other systems, since EOF on these OSes isn't determined by
file contents. I'm not sure if it's an issue for Macs, as they also
historically use different line endings. This also may have
changed with OS X; anyone know?

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.

True, but let's be fair.

MSDOS stole many things from Unix, such as the notion of a
hierarchical directory structure and the use of < > | at the
shell level. (Many things were incompletely stolen, unfortunately.)

The binmode/textmode distinction came from Unix. At that time
Unix had an EOF character of control-D (which explains the ^D we
still type occasionally at the terminal).

So historically Unix's behavior with respect to ^D was the same as
DOS's with respect to ^Z. But Unix/Linux moved beyond that, and
DOS/Windows never did.

Hal
 
Steven Jenkins

Hal said:
True, but let's be fair.

MSDOS stole many things from Unix, such as the notion of a
hierarchical directory structure and the use of < > | at the
shell level. (Many things were incompletely stolen, unfortunately.)

The binmode/textmode distinction came from Unix. At that time
Unix had an EOF character of control-D (which explains the ^D we
still type occasionally at the terminal).

No. Unix never distinguished between text and binary files. Unix did
(and does) interpret ASCII EOT (ctrl-d) as an end-of-input indicator for
terminal devices, but it never used any in-band character to mark the
end of a file. The EOT never got past the terminal driver, and was never
delivered to an application.

Steve
 
Hal Fulton

Steven said:
No. Unix never distinguished between text and binary files. Unix did
(and does) interpret ASCII EOT (ctrl-d) as an end-of-input indicator for
terminal devices, but it never used any in-band character to mark the
end of a file. The EOT never got past the terminal driver, and was never
delivered to an application.

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

Hal
 
Shashank Date

Hal Fulton said:
If Unix never distinguished between text and binary files, what
was the binary mode flag for?

For CR-LF, maybe... just a wild guess.
 
Daniel Kelley

Hal> If Unix never distinguished between text and binary files,
Hal> what was the binary mode flag for?

I recall that the whole ^Z terminator came from CP/M, where file
sizes were always multiples of 128 bytes (saving 7 bits in a size
field being important at the time), so a text file needed a special
character to mark the end of the text. MSDOS carried that "tradition"
on, to ease porting of CP/M applications to DOS, and, well, saving bits
was important at that time, at least it *seemed* to be important!

d.k.
 
YANAGAWA Kazuhisa

Hal Fulton said:
If Unix never distinguished between text and binary files, what
was the binary mode flag for?

For ANSI-C compliance. From fopen(3) of FreeBSD 4.8-RELEASE:

The mode string can also include the letter ``b'' either as a third
character or as a character between the characters in any of the
two-character strings described above. This is strictly for
compatibility with ISO/IEC 9899:1990 (``ISO C89'') and has no effect;
the ``b'' is ignored.

I believe most Unix-like platforms take a similar position.
 
Jim Weirich

If Unix never distinguished between text and binary files, what
was the binary mode flag for?

Unix originally didn't have one; it has it now only for compatibility.
 
Hal Fulton

Jim said:
Unix originally didn't have one; it has it now only for compatibility.

I'll have to assume you're correct, as I can't prove my position.

But I definitely remember being led to believe that EOT was an
end-of-file marker. And I remember wondering how it worked for
binary files: did it store the length in the inode, or what?

This was System III, around 1980 (out of date even then).

I'll have to dig into the old kernel to see how it actually
worked. I only have it in hardcopy, though.

Hal
 
Hal Fulton

Steven said:
The 'b' modifier was added to ANSI C to support non-Unix execution
environments that distinguish between text and binary files. It didn't
exist in Unix until ANSI C required it; since then, it's been a no-op.

http://www.lysator.liu.se/c/rat/d9.html#4-9-2

Thanks, Steve.

Very, very frustrating to me when the facts don't fit my memories...
I still think I was misinformed at some point about EOT and such.

I'm not sure when ANSI C came along, but I think it was *after* I
learned C and Unix, and *after* the introduction of the IBM PC.

Have to go look up Tobin Maginnis and see what he says...

Hal
 
Steven Jenkins

Hal said:
Very, very frustrating to me when the facts don't fit my memories...
I still think I was misinformed at some point about EOT and such.

Your batting average is still pretty good.

I'm not sure when ANSI C came along, but I think it was *after* I
learned C and Unix, and *after* the introduction of the IBM PC.

The standard was published in 1989. K&R second edition (1988) mentions
the "b" modifier to fopen() (the standard was nearing ratification at
the time), but the first edition (1978) doesn't.

I remember all this because I was trying to write code in the mid-80s to
run on both BSD Unix and MS-DOS. Turbo C required the "b" modifier or
some library function (binmode()?), but the (pre-gcc) Unix C compiler
didn't allow them. I had to do it with preprocessor conditionals. Yuck.

I grumble (quietly) when other people carry on off-topic discussions,
and now I'm doing it. We return now to ruby-talk, already in progress.

Steve
 
Benjamin Peterson

Tim Hammerquist said:
However, (for backward compatibility?) when the ^Z *is*
encountered in a textmode file,

...or, better yet, when any text character whose encoding happens to
include a 0x1a is encountered! My, that *was* annoying.

As someone else mentioned above, this isn't an issue on Unix or
many other systems,

Or indeed on Windows, provided you avoid ruby :I

The moral of the story is:

Always call fh.binmode() before reading
any non-text file on non-Unix platforms.

You have to call it before reading *any* file, unless you just know
that only ASCII was used, for the reason above. I think ruby is the
only software I've ever used that has this issue. I suppose ruby must
check for the 0x1a *before* allowing for the encoding system.
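
A small sketch that demonstrates the raw-byte problem (the file name is
hypothetical; the text-mode behavior described applies on Windows):

  # sketch only: plant a 0x1a (^Z) byte in the middle of a file
  File.open("demo.dat", "wb") { |f| f.write("before\x1aafter") }
  File.open("demo.dat", "r")  { |f| f.read }   # text mode: stops at the ^Z on Windows
  File.open("demo.dat", "rb") { |f| f.read }   # binary mode: all 12 bytes, everywhere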
 
furlan primus

Hal said:
True, but let's be fair.

MSDOS stole many things from Unix, such as the notion of a
hierarchical directory structure and the use of < > | at the
shell level. (Many things were incompletely stolen, unfortunately.)

The binmode/textmode distinction came from Unix. At that time
Unix had an EOF character of control-D (which explains the ^D we
still type occasionally at the terminal).

So historically Unix's behavior with respect to ^D was the same as
DOS's with respect to ^Z. But Unix/Linux moved beyond that, and
DOS/Windows never did.

Hal

I thought that the Control-Z usage was borrowed from CP/M in order to
make it easier to port programs from that to MS-DOS.

http://www.finseth.com/~fin/craft/Chapter-5.html
 
