How to correctly calculate the cluster size

Discussion in 'Python' started by Sylwia, Apr 13, 2004.

  1. Sylwia

    Sylwia Guest

    Hi!

    I need Your help...

    I am trying to find a way to calculate the cluster size for a
    drive. I looked at the GetDiskFreeSpaceEx function in Win32 and it
    turned out that it didn't provide the information I was looking
    for. Using the old GetDiskFreeSpace function I could just multiply
    the Bytes per Sector by the Sectors per Cluster and get the
    cluster size. I heard that the old GetDiskFreeSpace function may
    report incorrect values for volume sizes greater than 2 gigabytes.
    The new version of that function (GetDiskFreeSpaceEx) returns only
    the total and free byte counts, without the values needed to
    derive the cluster size - and the old version does not work
    correctly with FAT32... What can I do? Is there another function
    I can use?

    Thank You in advance,

    Farraige
     
    Sylwia, Apr 13, 2004
    #1

  2. > I am trying to find a way to calculate the cluster size for a drive.
    > [...] Using the old GetDiskFreeSpace function I could just multiply
    > the Bytes per Sector by the Sectors per Cluster and get the cluster
    > size. [...] The new version of that function (GetDiskFreeSpaceEx)
    > returns only the total and free byte counts, without the values
    > needed to derive the cluster size - and the old version does not
    > work correctly with FAT32... What can I do? Is there another
    > function I can use?


    Generally, I believe sectors are consistently 512 bytes, and
    depending on the format of your drive, cluster size can vary. Both
    NTFS and FAT32 generally run with 4k clusters, and at least until
    mid-2002 there didn't exist a commercial defragmenter that could
    handle non-4k clusters in Windows (I had a drive with 16k clusters
    that I needed to repartition in order to fix that; thank god for
    PartitionMagic). I don't think you would be out of line to assume
    4k clusters on installations using FAT32 or NTFS.
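
    Incidentally, you don't have to assume: if I remember the MSDN
    docs right, the old GetDiskFreeSpace call fills in the
    sectors-per-cluster and bytes-per-sector values correctly even on
    big volumes -- it is only the free/total *cluster counts* that get
    capped so the products never exceed 2GB. A minimal sketch using
    ctypes (a separate download for current Pythons, and assuming a
    Windows box; pywin32's win32file.GetDiskFreeSpace returns the same
    four values if you prefer that):

        import ctypes

        def cluster_size(root=u"C:\\"):
            # Output parameters for GetDiskFreeSpaceW.
            sectors_per_cluster = ctypes.c_ulong(0)
            bytes_per_sector = ctypes.c_ulong(0)
            free_clusters = ctypes.c_ulong(0)
            total_clusters = ctypes.c_ulong(0)
            ok = ctypes.windll.kernel32.GetDiskFreeSpaceW(
                ctypes.c_wchar_p(root),
                ctypes.byref(sectors_per_cluster),
                ctypes.byref(bytes_per_sector),
                ctypes.byref(free_clusters),
                ctypes.byref(total_clusters))
            if not ok:
                raise ctypes.WinError()
            # Cluster size = sectors/cluster * bytes/sector.
            return sectors_per_cluster.value * bytes_per_sector.value

        print cluster_size(u"C:\\")   # e.g. 4096 on a typical volume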

    In terms of volume sizes > 2 gigs, that sounds like the old
    cluster-size and addressing limit for FAT16, but considering how
    rare FAT16 is nowadays (perhaps pre-OSR2 Win95 installations still
    use it), I wouldn't worry too much (unless you are specifically
    writing for old operating systems).

    - Josiah
     
    Josiah Carlson, Apr 19, 2004
    #2

  3. On Sun, 18 Apr 2004 23:19:46 -0700, Josiah Carlson <>
    declaimed the following in comp.lang.python:

    > in order to fix that; thank god for PartitionMagic). I don't think you
    > would be out of line to assume 4k clusters on installations using FAT32
    > or NTFS.
    >

    Heh... Checking just two of my 10 partitions (spread over two
    internal drives, and an external FireWire) I'm running 8K clusters, and
    likely much larger on the FireWire (which has four 40GB partitions!).

    W98se/FAT32

    --
    > ============================================================== <
    > | Wulfraed Dennis Lee Bieber KD6MOG <
    > | Bestiaria Support Staff <
    > ============================================================== <
    > Home Page: <http://www.dm.net/~wulfraed/> <
    > Overflow Page: <http://wlfraed.home.netcom.com/> <
     
    Dennis Lee Bieber, Apr 19, 2004
    #3
  4. Dennis Lee Bieber wrote:
    > On Sun, 18 Apr 2004 23:19:46 -0700, Josiah Carlson <>
    > declaimed the following in comp.lang.python:
    >
    >> in order to fix that; thank god for PartitionMagic). I don't think you
    >> would be out of line to assume 4k clusters on installations using FAT32
    >> or NTFS.
    >
    > Heh... Checking just two of my 10 partitions (spread over two
    > internal drives, and an external FireWire) I'm running 8K clusters, and
    > likely much larger on the FireWire (which has four 40GB partitions!).
    >
    > W98se/FAT32


    Can you defrag your disks? If so, then perhaps my not being able
    to find software to defragment disks was a 'feature' of using
    Windows 2000, NTFS, and non-4k clusters.

    I'm curious: did you set the cluster size yourself, or were those
    automatic sizes generated by 98SE?

    - Josiah
     
    Josiah Carlson, Apr 19, 2004
    #4
  5. On Mon, 19 Apr 2004 07:28:06 GMT, Dennis Lee Bieber
    <> declaimed the following in comp.lang.python:

    Talking to myself... <G>

    > Heh... Checking just two of my 10 partitions (spread over two
    > internal drives, and an external FireWire) I'm running 8K clusters, and
    > likely much larger on the FireWire (which has four 40GB partitions!).
    >

    The FireWire is running 32K clusters on each of the four
    partitions -- the smallest cluster size possible at those
    partition sizes (and, conversely, about the optimal partition
    sizes too).
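
    For reference, the cluster sizes Microsoft's own format tools pick
    for FAT32 scale with the volume size. A rough lookup table
    (figures as I remember them from the MS documentation -- a
    partitioning tool like Partition Magic can override them):

        # Default FAT32 cluster sizes by volume size, as best I
        # recall from MS documentation; tools can override these.
        FAT32_DEFAULTS = [      # (max volume size in GB, cluster bytes)
            (8, 4096),
            (16, 8192),
            (32, 16384),
            (2048, 32768),      # FAT32's practical upper bound
        ]

        def default_fat32_cluster(volume_gb):
            for limit_gb, cluster in FAT32_DEFAULTS:
                if volume_gb <= limit_gb:
                    return cluster
            raise ValueError("volume too large for FAT32")

        print default_fat32_cluster(44)   # -> 32768, the 32K above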

     
    Dennis Lee Bieber, Apr 19, 2004
    #5
  6. On Mon, 19 Apr 2004 08:24:59 -0700, Josiah Carlson <>
    declaimed the following in comp.lang.python:

    >
    > Can you defrag your disks? If so, then perhaps my not being able to
    > find a software to defragment disks was a 'feature' of using Windows
    > 2000, NTFS and non-4k clusters.
    >

    The big partitions, about monthly (actually, all partitions
    about monthly). The most used data partitions (smaller) about weekly.

    > I'm curious; did you set the cluster size yourself, or were those
    > automatic sizes generated by 98SE?
    >

    These are all FAT32 partitions. Don't know how to, if possible,
    change clusters on my XP laptop using NTFS.

    While I used Partition Magic to do the partitioning, the cluster
    sizes are the smallest possible for each partition's size. One
    reason I have so many partitions:

    30GB (raw) drive:
    C: 9GB System software
    E: 1GB swap/temp
    F: 15GB games and downloaded files
    plus two ext2fs Linux partitions

    40GB (raw) drive:
    D: 5GB 3rd-party application software
    G: 7.5GB Text data (Agent data, Word, database, etc.)
    H: 15GB Graphics (digicam images, scans, etc.)
    plus two more ext2fs Linux partitions

    The FireWire drive has three 44GB partitions (the maximum for
    W98se, I believe) and a 21GB partition. Used for MP3s and some
    miniDV captures.

    It took forever to defrag a 300GB (two partitions) FireWire
    drive under NTFS (this drive is used with my XP laptop for miniDV
    captures and editing). I actually had to double the memory of my
    laptop (now 768 MB) before the defrag could run to completion.
     
    Dennis Lee Bieber, Apr 20, 2004
    #6
  7. > These are all FAT32 partitions. Don't know how to, if possible,
    > change clusters on my XP laptop using NTFS.
    >
    > While I used Partition Magic to do the partitioning, the sizes
    > are the smallest cluster size possible in the partition size. One reason
    > I have so many partitions:


    The Windows 2000 "Disk Administrator" uses 4k cluster sizes by
    default. I believe your varied cluster sizes are the result of
    using Partition Magic to create them.


    > It took forever to defrag a 300GB (two partitions) firewire
    > drive under NTFS (this drive is used with my XP laptop for miniDV
    > captures and editing). I actually had to double the memory of my laptop
    > (now 768 MB) before the defrag could run to completion.


    There exists an algorithm for defragmenting a drive that only needs to
    read and write the entire drive twice (I wrote one for a database
    defragmenter), and there likely exists one that reads and writes even less.

    If you have a 150 gig drive (and it is filled), your computer will
    need to read and write around 600 gigs (150 gigs, each read twice
    and written twice). Even if your drive is fast, say 30 megs/second
    (probably on the high end for defragmenting), 600,000/30/3600 ~
    5.5 hours. In reality, you're probably getting closer to 5-15
    megs/second during a defragment, which would give you 11-33 hours
    to defrag each of your 150 gig partitions.
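
    Written out as a quick sanity check (the drive size and the
    throughput figures are just the assumptions above, not
    measurements):

        # Back-of-the-envelope defrag time: each of the two passes
        # reads and writes the whole drive once.
        def defrag_hours(size_gb, throughput_mb_s, passes=2):
            total_mb = size_gb * 1000.0 * passes * 2   # read + write
            return total_mb / throughput_mb_s / 3600

        for mb_s in (30, 15, 5):
            print "%2d MB/s -> %5.1f hours" % (mb_s, defrag_hours(150, mb_s))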

    It's the whole capacity vs bandwidth issue on hard drives, which is
    similar to the bandwidth vs latency issue with RAM. Ahh, technology.

    - Josiah
     
    Josiah Carlson, Apr 21, 2004
    #7
