Dynamic multidimensional array, deallocation of pointer not malloced..

Discussion in 'C++' started by welch.ryan@gmail.com, May 12, 2007.

  1. Guest

    Hi all,

    Having a problem with addressing large amounts of memory. I have a
    simple piece of code here that is meant to allocate a large piece of
    memory on a ppc64 machine. The code is:

    /*
    Test to see what happens when we try to allocate a massively huge
    piece of memory.
    */

    #include <iostream>
    #include <string>
    #include <stdexcept>
    using namespace std;

    int main(int argc, char** argv) {
        cout << "Attemping to allocate.." << endl;

        const int ROWS = 635000;
        const int COLS = 2350;

        // Allocate.
        try {
            int** test = new int*[ROWS];
            for (int i = 0; i < ROWS; i++) {
                test[i] = new int[COLS];
                for (int j = 0; j < COLS; j++) {
                    test[i][j] = 0;
                }
            }

            cout << "Allocation succeeded!" << endl;
            cout << "Press a key to deallocate and continue.." << endl;
            string blank;
            getline(cin,blank);

            // Deallocate.
            for (int k = 0; k < ROWS; k++) {
                delete[] test[k];
            }
            delete[] test;

            cout << "Deallocation completed!" << endl;
            cout << "Press a key to terminate.." << endl;
            getline(cin,blank);
        }
        catch(bad_alloc& e) {
            cout << "Allocation failed.." << endl;
        }

        return 0;
    }

    If I set ROWS and COLS to 5000 and 5000, it works just fine. However,
    if I set ROWS to 635000 and COLS to 2350, it will give me the
    following error upon deallocation:

    HugeMemory.exe(29468) malloc: *** Deallocation of a pointer not
    malloced: 0x20afd2000; This could be a double free(), or free() called
    with the middle of an allocated block; Try setting environment
    variable MallocHelp to see tools to help debug

    Note that the allocation step succeeds, and that I only receive this
    error after allowing the code to deallocate the array.

    Any ideas?

    Thanks,
    Ryan
    , May 12, 2007
    #1

  2. Ian Collins Guest

    wrote:
    > Hi all,
    >
    > Having a problem with addressing large amounts of memory. I have a
    > simple piece of code here that is meant to allocate a large piece of
    > memory on a ppc64 machine. The code is:
    >
    > [code snipped]
    >
    > If I set ROWS and COLS to 5000 and 5000, it works just fine. However,
    > if I set ROWS to 635000 and COLS to 2350, it will give me the
    > following error upon deallocation:
    >
    > HugeMemory.exe(29468) malloc: *** Deallocation of a pointer not
    > malloced: 0x20afd2000; This could be a double free(), or free() called
    > with the middle of an allocated block; Try setting environment
    > variable MallocHelp to see tools to help debug
    >
    > Note that the allocation step succeeds, and that I only receive this
    > error after allowing the code to deallocate the array.
    >

    Looks suspect; has the machine got nigh on 6GB of virtual memory
    available? Make sure your operator new behaves correctly by attempting
    to allocate way more than the machine can provide.
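
    Something along these lines (a rough, untested sketch, assuming a
    64-bit size_t) should do; a sane operator new throws bad_alloc on the
    oversized request instead of handing back a bogus pointer:

    #include <cstddef>
    #include <iostream>
    #include <new>
    using namespace std;

    int main() {
        try {
            // Deliberately impossible request, far beyond physical RAM + swap.
            size_t huge = size_t(1) << 62;
            char* p = new char[huge];
            cout << "new unexpectedly returned " << (void*)p << endl;
            delete[] p;
        }
        catch (bad_alloc&) {
            cout << "bad_alloc thrown, operator new looks sane" << endl;
        }
        return 0;
    }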

    --
    Ian Collins.
    Ian Collins, May 12, 2007
    #2

  3. On May 12, 2:02 am, wrote:
    > Hi all,
    >
    > Having a problem with addressing large amounts of memory. I have a
    > simple piece of code here that is meant to allocate a large piece of
    > memory on a ppc64 machine. The code is:
    >
    > [code snipped]
    >
    > If I set ROWS and COLS to 5000 and 5000, it works just fine. However,
    > if I set ROWS to 635000 and COLS to 2350, it will give me the
    > following error upon deallocation:


    There are a lot of allocations there. Looks to me like malloc
    internally has an overflow in its pointer arithmetic. Perhaps you are
    using a 32-bit malloc on a 64-bit setup somehow?

    Try the following and see if it works:

    const unsigned rows = 635000;
    const unsigned cols = 2350;
    int (*p)[cols] = new int[rows][cols];
    for (unsigned i = 0; i < rows; ++i)
    {
        for (unsigned j = 0; j < cols; ++j)
            p[i][j] = 0;
    }
    cin.get();
    delete[] p;
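
    (Since cols is a compile-time constant, p is a pointer to arrays of
    2350 ints, so the whole thing lives in one contiguous rows*cols block;
    p[i][j] picks row i, column j, and there is only a single new[] and a
    single delete[] to worry about.)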

    Greetings, Branimir
    Branimir Maksimovic, May 12, 2007
    #3
  4. Guest

    On May 11, 8:24 pm, Ian Collins <> wrote:
    > wrote:
    > > [original post snipped]
    >
    > Looks suspect; has the machine got nigh on 6GB of virtual memory
    > available? Make sure your operator new behaves correctly by attempting
    > to allocate way more than the machine can provide.
    >
    > --
    > Ian Collins.


    Hmm.. the machine has 8 GB of RAM, so that probably isn't the issue.
    I'll try maxing it out to see what happens.
    , May 12, 2007
    #4
  5. Guest

    On May 11, 9:15 pm, Branimir Maksimovic <> wrote:
    > [original post snipped]
    >
    > Try the following and see if it works:
    >
    > [code snipped]
    >
    > Greetings, Branimir


    I can't seem to get that code to compile, it complains about the line
    before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?

    I think you're probably right, it has something to do with pointer
    arithmetic. I'm just not sure what. The malloc() failure is happening
    on the for loop where I'm deallocating each row of the array, I've
    figured out that much. Beyond that, I'm not sure.

    I've tried the following compiler options too but they don't warn me
    of anything:

    g++ -o HugeMemory.exe -O3 -mcpu=powerpc64 -arch ppc64 -faltivec -Wall -Wconversion HugeMemory.cpp
    , May 12, 2007
    #5
  6. Ian Collins Guest

    wrote:
    > On May 11, 9:15 pm, Branimir Maksimovic <> wrote:
    >> [suggestion and code snipped]

    >
    > I can't seem to get that code to compile, it complains about the line
    > before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?
    >

    The code is fine, with the exception of an integer overflow warning from
    gcc.

    > I think you're probably right, it has something to do with pointer
    > arithmetic. I'm just not sure what. The malloc() failure is happening
    > on the for loop where I'm deallocating each row of the array, I've
    > figured out that much. Beyond that, I'm not sure.
    >

    Update your code to scan the rows for duplicate addresses. If you find
    one, something is wrong!
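
    Something like this (a rough, untested sketch) dropped into the
    allocation loop of your original program would catch any repeats:

    // needs #include <set> at the top of the file
    set<int*> seen;
    for (int i = 0; i < ROWS; i++) {
        test[i] = new int[COLS];
        if (!seen.insert(test[i]).second)   // insert() reports false if the address was already seen
            cout << "duplicate row address: " << test[i] << endl;
        for (int j = 0; j < COLS; j++) {
            test[i][j] = 0;
        }
    }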

    --
    Ian Collins.
    Ian Collins, May 12, 2007
    #6
  7. On May 12, 5:37 am, wrote:
    > On May 11, 9:15 pm, Branimir Maksimovic <> wrote:
    >
    > > [original post and suggested code snipped]
    >
    > I can't seem to get that code to compile, it complains about the line
    > before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?


    What is the error message?

    Greetings, Branimir.
    Branimir Maksimovic, May 12, 2007
    #7
  8. On 2007-05-12 02:02, wrote:
    > Hi all,
    >
    > Having a problem with addressing large amounts of memory. I have a
    > simple piece of code here that is meant to allocate a large piece of
    > memory on a ppc64 machine. The code is:
    >
    > [code snipped]
    >
    > If I set ROWS and COLS to 5000 and 5000, it works just fine. However,
    > if I set ROWS to 635000 and COLS to 2350, it will give me the
    > following error upon deallocation:
    >
    > HugeMemory.exe(29468) malloc: *** Deallocation of a pointer not
    > malloced: 0x20afd2000; This could be a double free(), or free() called
    > with the middle of an allocated block; Try setting environment
    > variable MallocHelp to see tools to help debug
    >
    > Note that the allocation step succeeds, and that I only receive this
    > error after allowing the code to deallocate the array.


    I have absolutely no idea, but you could try to make the code a bit
    simpler by allocating everything in one large block instead:

    #include <iostream>
    #include <string>
    #include <stdexcept>
    using namespace std;

    int main() {
        cout << "Attemping to allocate.." << endl;

        const int ROWS = 635000;
        const int COLS = 2350;

        // Allocate.
        try {
            int* test = new int[ROWS * COLS];
            for (int i = 0; i < ROWS * COLS; i++) {
                test[i] = 0;
            }

            cout << "Allocation succeeded!" << endl;
            cout << "Press a key to deallocate and continue.." << endl;

            string blank;
            getline(cin,blank);

            // Deallocate.
            delete[] test;

            cout << "Deallocation completed!" << endl;
            cout << "Press a key to terminate.." << endl;
            getline(cin,blank);
        }
        catch(bad_alloc& e) {
            cout << "Allocation failed.." << endl;
        }

        return 0;
    }

    Do you still get the same error (or some other)? If you do there's
    probably something wrong with your standard library.

    --
    Erik Wikström
    Erik Wikström, May 12, 2007
    #8
  9. Guest

    On May 11, 9:15 pm, Branimir Maksimovic <> wrote:
    > On May 12, 2:02 am, wrote:
    >
    > > [original post snipped]
    >
    > Try the following and see if it works:
    >
    > [code snipped]
    >
    > Greetings, Branimir


    This is what happens:

    HugeMemory2.cpp:4: error: expected unqualified-id before 'for'
    HugeMemory2.cpp:4: error: expected constructor, destructor, or type
    conversion before '<' token
    HugeMemory2.cpp:4: error: expected unqualified-id before '++' token
    HugeMemory2.cpp:9: error: expected constructor, destructor, or type
    conversion before '.' token
    HugeMemory2.cpp:10: error: expected unqualified-id before 'delete'

    I thought maybe it was because there's no 'int' after 'unsigned' but
    that didn't help. :(
    , May 12, 2007
    #9
  10. Guest

    On May 12, 6:09 am, Erik Wikström <> wrote:
    > On 2007-05-12 02:02, wrote:
    >
    > > [original post snipped]
    >
    > I have absolutely no idea, but you could try to make the code a bit
    > simpler by allocating everything in one large block instead:
    >
    > [code snipped]
    >
    > Do you still get the same error (or some other)? If you do there's
    > probably something wrong with your standard library.
    >
    > --
    > Erik Wikström


    Nope! That code succeeds. However, that code should require around 6
    GB of RAM, correct? If I use top to check the memory usage of the
    process, I see:

    3688 HugeMemory 0.0% 0:39.81 1 13 34 232K 6.51M 1.56G 1.63G

    So it's not even coming close.. or am I reading that incorrectly?
    Quite strange..
    , May 12, 2007
    #10
  11. Re: Dynamic multidimensional array, deallocation of pointer not malloced..

    On Sat, 12 May 2007 09:43:27 -0700, welch.ryan wrote:
    > On May 12, 6:09 am, Erik Wikström <> wrote:
    >> I have absolutely no idea, but you could try to make the code a bit
    >> simpler by allocating everything in one large block instead:
    >>
    >> [code snipped]

    >
    > Nope! That code succeeds. However, that code should require around 6 GB
    > of RAM, correct? If I use top to check the memory usage of the process,
    > I see:
    >
    > 3688 HugeMemory 0.0% 0:39.81 1 13 34 232K 6.51M 1.56G 1.63G
    >
    > So it's not even coming close.. or am I reading that incorrectly? Quite
    > strange..


    Could be a 32-bit overflow in top or somewhere else. The reported
    numbers are almost exactly 4G short of what one would expect: 635000
    rows x 2350 columns x 4 bytes per int is roughly 5.56 GiB, and 5.56 GiB
    minus 4 GiB is about 1.56 GiB, which is just what top shows.

    --
    Markus Schoder
    Markus Schoder, May 12, 2007
    #11
  12. Guest

    On May 12, 12:59 pm, Markus Schoder <> wrote:
    > On Sat, 12 May 2007 09:43:27 -0700, welch.ryan wrote:
    > > [earlier posts snipped]
    >
    > Could be a 32bit overflow in top or somewhere else. The reported numbers
    > are almost exactly 4G short of what one would expect.
    >
    > --
    > Markus Schoder


    Interesting.. okay, so let's suppose top is reporting it incorrectly,
    and that code actually does allocate all of that memory successfully.
    Then the question is, why is it that my original code (using a
    multidimensional approach) fails, yet allocating it as one large block
    seems to work?
    , May 12, 2007
    #12
  13. Guest

    On May 11, 11:45 pm, Ian Collins <> wrote:
    > wrote:
    > >> [Branimir's suggestion and code snipped]
    >
    > > I can't seem to get that code to compile, it complains about the line
    > > before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?

    >
    > The code is fine, with the exception of an integer overflow warning from
    > gcc.
    >
    > > I think you're probably right, it has something to do with pointer
    > > arithmetic. I'm just not sure what. The malloc() failure is happening
    > > on the for loop where I'm deallocating each row of the array, I've
    > > figured out that much. Beyond that, I'm not sure.

    >
    > Update your code to scan the rows for duplicate addresses. If you find
    > one, something is wrong!
    >
    > --
    > Ian Collins.


    Okay, I wrote something that I *think* would detect duplicate
    addresses. Don't laugh at the implementation..

    #include <iostream>
    #include <string>
    #include <stdexcept>
    #include <map>
    using namespace std;

    int main(int argc, char** argv) {
        cout << "Attempting to allocate.." << endl; // I can spell correctly now..

        const int ROWS = 635000;
        const int COLS = 2350;

        // Keep track of all used addresses.
        map<int*,int> addresses;

        // Allocate.
        try {
            int** test = new int*[ROWS];
            for (int i = 0; i < ROWS; i++) {
                test[i] = new int[COLS];
                addresses[ test[i] ] += 1;
                for (int j = 0; j < COLS; j++) {
                    test[i][j] = 0;
                }
            }

            // Check for duplicate addresses.
            cout << "Checking for duplicate addresses.." << endl;
            map<int*,int>::iterator iter = addresses.begin();
            map<int*,int>::iterator end = addresses.end();
            while (iter != end) {
                if (iter->second > 1) {
                    cout << "--> Duplicate address detected: " << iter->first << endl;
                }
                iter++;
            }

            cout << "Allocation succeeded!" << endl;
            cout << "Press a key to deallocate and continue.." << endl;
            string blank;
            getline(cin,blank);

            // Deallocate.
            for (int k = 0; k < ROWS; k++) {
                int** ptr = test + k;
                delete[] ptr;
            }
            delete[] test;

            cout << "Deallocation completed!" << endl;
            cout << "Press a key to terminate.." << endl;
            getline(cin,blank);
        }
        catch(bad_alloc& e) {
            cout << "Allocation failed.." << endl;
        }

        return 0;
    }

    That code detects no duplicate addresses.

    One additional thing I've noticed (I don't know if this helps): the
    addresses in these error messages are about 8 apart from each other,
    for example:

    HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
    malloced: 0x2009350; This could be a double free(), or free() called
    with the middle of an allocated block; Try setting environment
    variable MallocHelp to see tools to help debug
    HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
    malloced: 0x2009358; This could be a double free(), or free() called
    with the middle of an allocated block; Try setting environment
    variable MallocHelp to see tools to help debug
    HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
    malloced: 0x2009360; This could be a double free(), or free() called
    with the middle of an allocated block; Try setting environment
    variable MallocHelp to see tools to help debug
    , May 12, 2007
    #13
  14. On May 12, 6:35 pm, wrote:
    > On May 11, 9:15 pm, Branimir Maksimovic <> wrote:
    >
    > > [suggested code snipped]
    >
    > This is what happens:
    >
    > HugeMemory2.cpp:4: error: expected unqualified-id before 'for'
    > HugeMemory2.cpp:4: error: expected constructor, destructor, or type
    > conversion before '<' token
    > HugeMemory2.cpp:4: error: expected unqualified-id before '++' token
    > HugeMemory2.cpp:9: error: expected constructor, destructor, or type
    > conversion before '.' token
    > HugeMemory2.cpp:10: error: expected unqualified-id before 'delete'
    >
    > I thought maybe it was because there's no 'int' after 'unsigned' but
    > that didn't help. :(


    Strange. Are you sure you entered the code correctly? I guess that
    error is triggered by something else in your code, as I tried it with
    Comeau online, g++ 3.4.4 and the latest VC++.

    Greetings, Branimir.
    Branimir Maksimovic, May 12, 2007
    #14
  15. Ian Collins Guest

    wrote:
    >
    > Interesting.. okay, so let's suppose top is reporting it incorrectly,
    > and that code actually does truly successfully allocate all of that
    > memory. Then the question is, why is it that my original code (using a
    > multidimensional approach) fails, yet allocating it as one large block
    > seems to work?
    >

    The evidence is building a good case for a bug in your allocator; time
    to try a tool/platform-specific forum to see if it is a known problem.

    --
    Ian Collins.
    Ian Collins, May 12, 2007
    #15
  16. Guest

    On May 12, 4:49 pm, Ian Collins <> wrote:
    > wrote:
    >
    > > Interesting.. okay, so let's suppose top is reporting it incorrectly,
    > > and that code actually does truly successfully allocate all of that
    > > memory. Then the question is, why is it that my original code (using a
    > > multidimensional approach) fails, yet allocating it as one large block
    > > seems to work?

    >
    > The evidence is building a good case for a bug in your allocator; time
    > to try a tool/platform-specific forum to see if it is a known problem.
    >
    > --
    > Ian Collins.


    I'm starting to think you're right.. any suggestions for such a forum?
    , May 13, 2007
    #16
  17. Guest

    On May 12, 4:10 pm, Branimir Maksimovic <> wrote:
    > On May 12, 6:35 pm, wrote:
    >
    > > [quoted code and compile errors snipped]
    >
    > Strange. Are you sure you entered the code correctly? I guess that
    > error is triggered by something else in your code, as I tried it with
    > Comeau online, g++ 3.4.4 and the latest VC++.
    >
    > Greetings, Branimir.


    Maybe it's a bad line ending or something. I can't find my handy dandy
    perl script to fix them..
    , May 13, 2007
    #17
  18. Paul Guest

    <> wrote in message
    news:...
    > Hi all,
    >
    > Having a problem with addressing large amounts of memory. I have a
    > simple piece of code here that is meant to allocate a large piece of
    > memory on a ppc64 machine. The code is:
    >
    > [code snipped]


    How long does it take for your code to run? I'm asking this because the
    code you posted seems very inefficient.

    You are invoking "new[]" 635,000 times. An alternative is to make only two
    calls to "new[]": one new[] for the row pointers, and a second new[] to
    allocate the pool of memory for the int data. Then, in the loop, you point
    each row pointer at the right place in the int data pool.

    Not only will this more than likely bypass your problem with the allocator,
    your code should also see a significant increase in speed, both in
    allocation and in deallocation (at least in this area of code). You should
    test this, of course, but I would be very surprised if there isn't a
    significant speed increase.

    Here are the internals of your code snippet rewritten to make only two
    calls to new[] and then two calls to delete[] to deallocate the memory.

    int *pool;
    int** test = new int*[ROWS];  // allocate for row pointers
    pool = new int [ROWS * COLS]; // allocate memory pool for data
    for (int i = 0; i < ROWS; i++)
    {
        test[i] = pool;
        pool += COLS;
    }

    // Deallocate.
    delete [] test[0]; // deallocate pool
    delete [] test;    // deallocate row pointers.
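
    Elements are still addressed exactly as before, for example
    (illustrative only):

    test[100][57] = 42;    // row 100, column 57, somewhere inside the pool
    int* row = test[100];  // a whole row is just a slice of the pool

    Note that delete [] test[0] has to come before delete [] test: the pool
    variable has been advanced past the end of the block, so test[0] is the
    only pointer left that still refers to the start of the pool.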

    - Paul
    Paul, May 13, 2007
    #18
  19. Guest

    You're right, that's definitely way more efficient. In my case, though,
    I perform one massive allocation initially and then the code can run
    for hours to weeks, so the initial allocation time wasn't a real issue
    for me.

    However, the interesting thing is that I don't get that malloc() error
    anymore upon deletion. I still don't know why it was happening in the
    first place, but at least this method works.

    Thanks for the help everyone, I really appreciate it!

    Cheers,
    Ryan
    , May 14, 2007
    #19
