Signed and unsigned int


Flash Gordon

Great. Now please
cat * >/dev/null

Does that work?

Works fine on my Linux machine with over 2GB of images in a single
directory.
Linux crashed when copying those files.
NOT when writing them.

Thanks for your time

I have also copied the contents of a directory with over 2GB of data.

It was almost certainly a hardware problem and is DEFINITELY nothing to
do with the C language. Please find somewhere else to discuss this since
it is a long way OT here.
 

Mabden

jacob navia said:
Sorry for not replying, but I had to rebuild my machine
and mail isn't there yet.

I repeat:

The bug is not about SINGLE files bigger than 2GB.

It is when the contents of a directory exceed 2GB. Each
of the files is between 30-300K.

Maybe it is relevant to the bug that the upper and lower
directories are also quite big (0.8-1.5GB)

I repeat that the bug is related to the 2GB limit, since it appeared
when I crossed it, around Monday, when I updated
the directory from the NASA site.

Sorry, but I do not know if I can answer more; however, it
is VERY EASY TO REPRODUCE.

Just make a program that makes random files of
33-300K until you get 3GB, then do some directories and
make files.

I would be *very* interested in hearing from you!

I don't have files that small in my MP3 directory; they are all 2-7MB each.
I have 3,289 files and 638 folders. The size is 16,459,682,561 bytes.
Of course, that's just my junk pile. My main music drive has more than twice
that. Would you like me to play you a song? ;-)
BTW, I'm using Windows 2000 Pro - please, no OS wars!
 

Mabden

Keith Thompson said:
As you say, this doesn't sound like a software issue. If the disk
drive itself is functioning properly, it shouldn't even be possible
for software to cause the disk head to bang against the inside of the
case (I think).

Hi, Keith! Hard disks are harder to kill, but here's what happened to my DVD
drive...

I just got my copy of Doom3. Now, I won't run an expensive game on the
original disks - I stopped playing Warcraft3 when I couldn't copy it. And I
won't download an internet hack because I'm paranoid. So, I'm trying to get
dups of my Doom3 disks and Disk One won't copy on my CD writer. So I try
using my DVD writer with DIVX (or whatever it's called) and the drive makes
clicks and clunks. The disk copied, but is unusable. Now I find I can't play
DVD's, or copy them, or write new ones. The drive is non-functional. The
copy of the game is also non-functional.

So, never one to leave bad enough alone, I copy the broken copy using my CD
writer, and the copy (of the broken copy) works great. I have broken their
copy-protection scheme at the cost of a DVD writer.
So anyone who wants a backup of the $50 game can buy a $100 DVD writer and
copy it once (breaking the DVD) and make all the copies of Disk One you
want.

I hate copy protection. It doesn't work, and only hurts the actual customer
and doesn't affect the bootlegger at all. I see plenty of bootleg copies all
over Google, "no-cd" patches, etc. Thanks to ID for breaking my hardware for
buying their game.
OK, I'm rambling a little...

My point is that sometimes hardware can be told to over-extend an arm or seek
to a false position because the track was laid down using some copy-protection
scheme. If software tries to copy a file by overriding normal protocols via
low-level routines, the device can be put into a position it cannot recover
from.

Old Western Digital hard drives had a plastic screw device on the exterior
of the hard disk, and if they were over-extended, you could ratchet them
back a little bit and they would be fine.

Also, monitors always warn that if you put them in a mode they don't
support, they can be destroyed.

So software can kill hardware, if you try hard enough!
 

Richard Bos

jacob navia said:
I repeat that the bug is related to the 2GB limit, since it appeared
when I crossed it, around Monday, when I updated
the directory from the NASA site.

That is a premature conclusion.
Sorry, but I do not know if I can answer more; however, it
is VERY EASY TO REPRODUCE.

Not on this system, it isn't.
Just make a program that makes random files of
33-300K until you get 3GB, then do some directories and
make files.

I already have one. No problems whatsoever.

I suspect either pilot error, or a physically faulty disk.

Richard
 

Richard Bos

jacob navia said:
Note that the DISK FAILS, it is not a software issue, i.e. you hear
a "pang" of the disk head repeatedly trying to get to a bad position each
second or so. After a few "pangs" the drive is DEAD.

So, let's get this straight: your disk has an audible, _physical_
crash... it does something which no hard disk should do, no matter what
a user program tells it... and you're blaming C's ints, without even
considering the possibility that you may have broken hardware? Good
grief. You must work for Dell, or perhaps Gateway.

Richard
 

Alan Balmer

So I try
using my DVD writer with DIVX (or whatever it's called) and the drive makes
clicks and clunks. The disk copied, but is unusable. Now I find I can't play
DVD's, or copy them, or write new ones. The drive is non-functional. The
copy of the game is also non-functional.

IMO, although you may have a legitimate beef with the publisher for
copy-protecting the disk in the first place (or with yourself for
buying it), it wasn't their disk that broke your DVD writer; it was
the DIVX (or whatever it's called) program.

Complain to the publisher of DIVX, or to the manufacturer of the DVD
writer (on the perfectly reasonable grounds that their hardware should
not be designed so poorly as to be damaged by software.) Is it still
under warranty?
 

Paul Heininger

jacob navia said:
A signed int can contain up to 2Gig: 2 147 483 647, to be exact.
I had more than 18 000 files in a single directory.
Without being aware of it, I crossed the 2 147 483 648 border last week.
<snip>

Jacob,

First, I am sorry for your loss of data.

I am skeptical that crossing the 2GB boundary in a single directory
would cause the problem you described. I have to believe many people have
had more than 2GB in a single directory. I am not saying this was not the
problem. It just does not seem like you would be one of the earliest to
find it. And by reading the various responses, that seems to be a common
opinion.

So I started thinking about other boundaries you could have crossed. One
theory (and it is a guess) is that you may have gone over 65535 entries in
that folder's directory. First, you had 18,000 files. Second, when searching
for the post you mentioned, I found an indication of the size of the file
names of these files in a different post. I am not going back to count it,
but it seemed like about 30 characters. I am also assuming that both your
Windows and Linux machines are using FAT32 or a very similar directory
structure. (If Linux is using a different structure, or this is an NTFS
Windows structure, my analysis is completely wrong.) For each file, when
using long file names, we need one entry for the 8.3 name and one entry for
each 13 characters of the name. (Again, this is from a faded memory. I am
not the best expert on file system internals.) So to support 18,000 files,
one would need more than 65535 directory entries. If the file system code
was using 16-bit unsigned integers, the indices would overflow.

This may not be the issue. But something similar to this is obviously
possible. Do not limit your thinking to just the number of bytes in the
files.

Paul
 

Dan Pop

In said:
<WAY_OT>
For Windows, FAT32 vs. NTFS might be significant.
</WAY_OT>

Only WRT the maximum size of a *single* file. I have copied directories
with more than 2 GB of data between FAT32 file systems with no problems
at all.

Dan
 

Dan Pop

In said:
Great. Now please
cat * >/dev/null

Does that work?

Now, try to engage your brain and answer the following simple questions:

1. Why would the copy command attempt to compute (or count)
the number of copied bytes?

2. Assuming that it did, why would an integer overflow in *user* code
cause a piece of kernel code to malfunction?

Before foolishly jumping to conclusions, try to understand and analyse
the problem.

If you can provide us with a *reproducible* recipe for disaster (one that
systematically "works" for you), many of us would be willing to try and
analyse it (if it works for us, too).

Personally, I suspect some metadata corruption in one of your files.
The 2 GB thing is a pure coincidence, the disaster happened when the
system tried to handle the corrupted file.

Dan
 

Keith Thompson

Only WRT the maximum size of a *single* file. I have copied directories
with more than 2 GB of data between FAT32 file systems with no problems
at all.

Ok. I know very little about the internals of either FAT32 or NTFS.
My statement was speculation, and probably should have been more
clearly labeled as such.
 
