Visual C++ and large 2d arrays

Discussion in 'C++' started by ico.bukvic@gmail.com, May 3, 2007.

  1. Guest

    Hi all,

    I've made a 2d dynamic array as follows (this is a snippet so not all
    variables are accounted for):

    //numvoices are dynamic (1-1000), entered by user
    //MAXCHANNELS is currently defined as 24

    float **gvoiceSpat;
    float **notechannelGain;
    float **notechannelGainSpread;

    gvoiceSpat = new float *[numvoices];
    notechannelGain = new float *[numvoices];
    notechannelGainSpread = new float *[numvoices];

    for (i = 0; i < numvoices; i++)
    {
    gvoiceSpat[i] = new float[MAXCHANNELS];
    notechannelGain[i] = new float[MAXCHANNELS];
    notechannelGainSpread[i] = new float[MAXCHANNELS];
    }

    The interesting thing is that this code works flawlessly in gcc but in
    Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    crashes, sometimes reporting unknown exception. The problem is that
    this is a code for an external module for another application and uses
    additional third-party libs so it is difficult to point fingers at the
    culprit. Yet, the fact remains that this crash occurs only on Windows
    using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    (gcc) using same libs.

    Any ideas as to why would this be the case?

    For what it's worth, I also tried substituting these with vectors with
    no difference whatsoever.

    Any help is most appreciated!

    Sincerely,

    Ico
     
    , May 3, 2007
    #1

  2. Greg Herlihy Guest

    On 5/2/07 6:44 PM, ico.bukvic@gmail.com wrote:

    > I've made a 2d dynamic array as follows (this is a snippet so not all
    > variables are accounted for):
    >
    > //numvoices are dynamic (1-1000), entered by user
    > //MAXCHANNELS is currently defined as 24
    >
    > float **gvoiceSpat;
    > float **notechannelGain;
    > float **notechannelGainSpread;
    >
    > gvoiceSpat = new float *[numvoices];
    > notechannelGain = new float *[numvoices];
    > notechannelGainSpread = new float *[numvoices];
    >
    > for (i = 0; i < numvoices; i++)
    > {
    > gvoiceSpat[i] = new float[MAXCHANNELS];
    > notechannelGain[i] = new float[MAXCHANNELS];
    > notechannelGainSpread[i] = new float[MAXCHANNELS];
    > }
    >
    > The interesting thing is that this code works flawlessly in gcc but in
    > Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    > crashes, sometimes reporting unknown exception. The problem is that
    > this is a code for an external module for another application and uses
    > additional third-party libs so it is difficult to point fingers at the
    > culprit. Yet, the fact remains that this crash occurs only on Windows
    > using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    > (gcc) using same libs.
    >
    > Any ideas as to why would this be the case?


    Could it be that on Windows numvoices is a char - and the program is
    crashing when numvoices' value exceeds 128 and overflows?
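
    For illustration, a minimal sketch of what that would look like (purely
    hypothetical; the snippet doesn't show how numvoices is declared, so this
    just assumes it were a plain char on a platform where char is signed):

    #include <iostream>

    int main()
    {
        // suppose the user's count were stored in a plain (signed) char
        char numvoices = 120;
        numvoices = static_cast<char>(numvoices + 10); // 130 doesn't fit; on
                                                       // two's-complement it
                                                       // wraps to -126
        std::cout << static_cast<int>(numvoices) << '\n';

        // "new float *[numvoices]" would then convert the negative value to
        // an enormous size_t, and the allocation would fail or throw
        return 0;
    }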

    Greg
     
    Greg Herlihy, May 3, 2007
    #2

  3. Guest


    > > Any ideas as to why would this be the case?

    >
    > Could it be that on Windows numvoices is a char - and the program is
    > crashing when numvoices' value exceeds 128 and overflows?
    >
    > Greg


    No idea. How can this be tested and/or alleviated?

    Ico
     
    , May 3, 2007
    #3
  4. Guest

    Actually, upon closer inspection, the ceiling appears to shift a lot
    and optimizing code does increase it (without code optimization it
    hovers around 90, and with full optimization it can get up to 120 or
    so).

    Ico
     
    , May 3, 2007
    #4
  5. Guest

    Could it be something like the problem described at the following link
    even though I am using "new" to allocate memory?

    http://www.daniweb.com/techtalkforums/thread46297.html
     
    , May 3, 2007
    #5
  6. GeekBoy Guest

    Maybe because you have here a single dimensional array and not a 2D one as
    you claim?

    Define 2D array:

    "A 2D array is an array that has both rows and columns. You must use 2 sets
    of square brackets when declaring a 2D array and when using it."

    e.g.:
    int arr[3][3];
    arr[0][0] = 5;

    <> wrote in message
    news:...
    > Hi all,
    >
    > I've made a 2d dynamic array as follows (this is a snippet so not all
    > variables are accounted for):
    >
    > //numvoices are dynamic (1-1000), entered by user
    > //MAXCHANNELS is currently defined as 24
    >
    > float **gvoiceSpat;
    > float **notechannelGain;
    > float **notechannelGainSpread;
    >
    > gvoiceSpat = new float *[numvoices];
    > notechannelGain = new float *[numvoices];
    > notechannelGainSpread = new float *[numvoices];
    >
    > for (i = 0; i < numvoices; i++)
    > {
    > gvoiceSpat[i] = new float[MAXCHANNELS];
    > notechannelGain[i] = new float[MAXCHANNELS];
    > notechannelGainSpread[i] = new float[MAXCHANNELS];
    > }
    >
    > The interesting thing is that this code works flawlessly in gcc but in
    > Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    > crashes, sometimes reporting unknown exception. The problem is that
    > this is a code for an external module for another application and uses
    > additional third-party libs so it is difficult to point fingers at the
    > culprit. Yet, the fact remains that this crash occurs only on Windows
    > using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    > (gcc) using same libs.
    >
    > Any ideas as to why would this be the case?
    >
    > For what it's worth, I also tried substituting these with vectors with
    > no difference whatsoever.
    >
    > Any help is most appreciated!
    >
    > Sincerely,
    >
    > Ico
    >
     
    GeekBoy, May 3, 2007
    #6
  7. wrote:
    > Could it be something like the problem described at the following link
    > even though I am using "new" to allocate memory?
    >
    > http://www.daniweb.com/techtalkforums/thread46297.html


    Your issue is something else because, as you already noted, you're using
    dynamic memory allocation.
     
    Gianni Mariani, May 3, 2007
    #7
  8. wrote:
    > Hi all,
    >
    > I've made a 2d dynamic array as follows (this is a snippet so not all
    > variables are accounted for):
    >
    > //numvoices are dynamic (1-1000), entered by user
    > //MAXCHANNELS is currently defined as 24
    >
    > float **gvoiceSpat;
    > float **notechannelGain;
    > float **notechannelGainSpread;
    >
    > gvoiceSpat = new float *[numvoices];
    > notechannelGain = new float *[numvoices];
    > notechannelGainSpread = new float *[numvoices];
    >
    > for (i = 0; i < numvoices; i++)
    > {
    > gvoiceSpat[i] = new float[MAXCHANNELS];
    > notechannelGain[i] = new float[MAXCHANNELS];
    > notechannelGainSpread[i] = new float[MAXCHANNELS];
    > }
    >
    > The interesting thing is that this code works flawlessly in gcc but in
    > Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    > crashes, sometimes reporting unknown exception. The problem is that
    > this is a code for an external module for another application and uses
    > additional third-party libs so it is difficult to point fingers at the
    > culprit. Yet, the fact remains that this crash occurs only on Windows
    > using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    > (gcc) using same libs.
    >
    > Any ideas as to why would this be the case?
    >
    > For what it's worth, I also tried substituting these with vectors with
    > no difference whatsoever.
    >
    > Any help is most appreciated!


    You're probably corrupting memory somewhere. If you are running on a
    Linux box, try valgrind (remember to set the appropriate
    environment variable).

    The environment variable is GLIBCXX_FORCE_NEW (or GLIBCPP_FORCE_NEW
    depending on version of GCC).

    For gcc 3.3.2 the variable is GLIBCPP_FORCE_NEW
    For gcc 3.4.0 and above it is GLIBCXX_FORCE_NEW

    If you're not sure
    $ strings /usr/lib/libstdc++.so.6 | grep FORCE
    GLIBCXX_FORCE_NEW


    If you're running only on Windows, you have to buy a third-party tool.

    Just a note, if you litter your code with news and deletes, you're bound
    to have a problem like this show up. Use vectors or a 2D matrix class
    and avoid problems like this.
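
    For illustration, a sketch of what the vector-based equivalent could look
    like (variable names borrowed from the original post, sizes made up):

    #include <vector>

    int main()
    {
        const int MAXCHANNELS = 24;   // as in the original post
        int numvoices = 1000;         // normally entered by the user

        // one expression per array; every row is MAXCHANNELS floats wide
        std::vector<std::vector<float> > gvoiceSpat(
            numvoices, std::vector<float>(MAXCHANNELS, 0.0f));
        std::vector<std::vector<float> > notechannelGain(
            numvoices, std::vector<float>(MAXCHANNELS, 0.0f));
        std::vector<std::vector<float> > notechannelGainSpread(
            numvoices, std::vector<float>(MAXCHANNELS, 0.0f));

        gvoiceSpat[0][0] = 1.0f;      // indexed exactly like the raw arrays

        return 0;                     // no delete[] loops needed
    }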
     
    Gianni Mariani, May 3, 2007
    #8
  9. James Kanze Guest

    On May 3, 7:10 am, wrote:
    > Could it be something like the problem described at the following link
    > even though I am using "new" to allocate memory?


    > http://www.daniweb.com/techtalkforums/thread46297.html


    No, or at least not directly.

    I'm not sure what your version of VC++ does if new fails because
    of insufficient memory. The standard says you get an exception,
    and the latest versions of VC++ conform, but VC++ 6.0 still
    returned a null pointer (when it detected the situation). For
    starters, you might try 1) wrapping your code in a try block,
    and catching std::bad_alloc, and 2) testing the result of each
    new for NULL. Realistically, however: at around 100, you're
    allocating under 50 KB; I can't imagine that failing on any
    modern machine.
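
    A minimal sketch of both checks (hypothetical code, not the poster's, and
    with cleanup omitted for brevity):

    #include <cstddef>
    #include <iostream>
    #include <new>

    int main()
    {
        const int MAXCHANNELS = 24;
        int numvoices = 1000;

        try
        {
            float **gvoiceSpat = new float *[numvoices];
            for (int i = 0; i < numvoices; i++)
            {
                gvoiceSpat[i] = new float[MAXCHANNELS];
                if (gvoiceSpat[i] == NULL) // only possible on non-conforming
                    std::cerr << "new returned NULL at row " << i << '\n';
            }
        }
        catch (std::bad_alloc const &)
        {
            std::cerr << "allocation failed with std::bad_alloc\n";
        }
        return 0;
    }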

    Greg Herlihy's suggestion about char overflowing is interesting
    as well. Logically, it should mean a threshold of exactly 128,
    however. Still, if you compile using unsigned char (option /J,
    I think), you should be able to eliminate it.

    Other than that, I don't see anything wrong with it. I just ran
    it on my PC (a very small machine, with only 256 MB), and had no
    problems with numvoices at 100000; at 1000000, the program
    became very slow, as the machine started paging, and I finally
    got a bad_alloc exception. (This is with VC++ 2005.)

    All I can suggest is that you try to isolate a 15-20 line bit of
    code which can be compiled on its own, and manifests the error,
    and post that.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, May 3, 2007
    #9
  10. Lionel B Guest

    On Thu, 03 May 2007 03:02:12 -0500, GeekBoy wrote:
    > <> wrote in message
    > news:...
    >> Hi all,
    >>
    >> I've made a 2d dynamic array as follows (this is a snippet so not all
    >> variables are accounted for):
    >>
    >> //numvoices are dynamic (1-1000), entered by user
    >> //MAXCHANNELS is currently defined as 24
    >>
    >> float **gvoiceSpat;
    >> float **notechannelGain;
    >> float **notechannelGainSpread;
    >>
    >> gvoiceSpat = new float *[numvoices];
    >> notechannelGain = new float *[numvoices];
    >> notechannelGainSpread = new float *[numvoices];
    >>
    >> for (i = 0; i < numvoices; i++)
    >> {
    >> gvoiceSpat[i] = new float[MAXCHANNELS];
    >> notechannelGain[i] = new float[MAXCHANNELS];
    >> notechannelGainSpread[i] = new float[MAXCHANNELS];
    >> }

    >
    > Maybe because you have here a single dimensional array and not a 2D one
    > as you claim?
    >
    > Define 2D array:
    >
    > "A 2D array is an array that has both rows and columns. You must use 2
    > sets of square brackets when declaring a 2D array and when using it."
    >
    > e.g.:
    > int arr[3][3];
    > arr[0][0] = 5;


    *Please* don't top-post (rearranged)

    The OP described it as a *dynamic* array and presents a pretty standard
    implementation. Note the two levels of "new" in the code. "Array"
    elements can then be referenced as:

    gvoiceSpat[i][j]

    etc. just like a "proper" 2D array.

    >> The interesting thing is that this code works flawlessly in gcc but in
    >> Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    >> crashes, sometimes reporting unknown exception. The problem is that
    >> this is a code for an external module for another application and uses
    >> additional third-party libs so it is difficult to point fingers at the
    >> culprit. Yet, the fact remains that this crash occurs only on Windows
    >> using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    >> (gcc) using same libs.
    >>
    >> Any ideas as to why would this be the case?


    Try the usual; a *minimal* program which demonstrates the problem (my
    suspicion would be that the problem exists elsewhere in the code).

    --
    Lionel B
     
    Lionel B, May 3, 2007
    #10
  11. <> wrote in message
    news:...
    > Hi all,
    >
    > I've made a 2d dynamic array as follows (this is a snippet so not all
    > variables are accounted for):
    >
    > //numvoices are dynamic (1-1000), entered by user
    > //MAXCHANNELS is currently defined as 24
    >
    > float **gvoiceSpat;
    > float **notechannelGain;
    > float **notechannelGainSpread;
    >
    > gvoiceSpat = new float *[numvoices];
    > notechannelGain = new float *[numvoices];
    > notechannelGainSpread = new float *[numvoices];
    >
    > for (i = 0; i < numvoices; i++)
    > {
    > gvoiceSpat[i] = new float[MAXCHANNELS];
    > notechannelGain[i] = new float[MAXCHANNELS];
    > notechannelGainSpread[i] = new float[MAXCHANNELS];
    > }
    >
    > The interesting thing is that this code works flawlessly in gcc but in
    > Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    > crashes, sometimes reporting unknown exception. The problem is that
    > this is a code for an external module for another application and uses
    > additional third-party libs so it is difficult to point fingers at the
    > culprit. Yet, the fact remains that this crash occurs only on Windows
    > using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    > (gcc) using same libs.
    >
    > Any ideas as to why would this be the case?


    Based on the reactions earlier in this thread, my first guess would be heap
    corruption, which means your problem has to do with something completely
    different in your code rather than the allocations of these floats. Some
    ways to get your heap corrupted are: writing to the memory pointed to by an
    uninitialized pointer, freeing an allocated object more than once, freeing
    an object that has never been allocated, writing to an object after it has
    been freed, writing beyond the range of allocated buffers, using delete[]
    for objects allocated with new and similarly using delete for objects
    allocated with new[].
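
    For illustration, a hypothetical sketch of one of those errors (writing
    one element past the end of an allocated buffer); it typically tramples
    the allocator's bookkeeping and blows up somewhere else entirely:

    #include <cstddef>

    int main()
    {
        const std::size_t MAXCHANNELS = 24;
        float *row = new float[MAXCHANNELS];

        // note the <= : this writes one float past the end of the buffer
        for (std::size_t ch = 0; ch <= MAXCHANNELS; ++ch)
            row[ch] = 0.0f;

        delete [] row; // may crash here, later, or not at all,
                       // depending on the platform's allocator
        return 0;
    }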

    Memory tracking tools like BoundsChecker or Rational Purify can help you
    detect these kinds of problems. But since this is a C++ group, by simply
    using a std::vector instead of user-allocated arrays you can avoid these
    pitfalls altogether.

    - Sylvester
     
    Sylvester Hesp, May 3, 2007
    #11
  12. Guest


    > been freed, writing beyond the range of allocated buffers, using delete[]
    > for objects allocated with new and similarly using delete for objects
    > allocated with new[].


    What do you mean by this? Am I not supposed to use delete at all, even
    in the destructor if I created something with new?

    The issue is also I tried vector and it had the same effect.

    This only manifests itself on VC++ (I use 2003 version). gcc on OSX
    and Linux are absolutely rock solid with the code.

    I tried increasing heap and stack sizes with no difference whatsoever,
    yet the "magic limit" beyond which the object fails to
    initialize and/or brings the host application down remains more or
    less consistent.

    Any other ideas?

    Ico
     
    , May 3, 2007
    #12
  13. wrote:
    >> been freed, writing beyond the range of allocated buffers, using
    >> delete[] for objects allocated with new and similarly using delete
    >> for objects allocated with new[].

    >
    > What do you mean by this? Am I not supposed to use delete at all, ...


    I think the hint you've been given is that you should use 'delete'
    for pointers obtained from 'new' and 'delete[]' for pointers you
    got from 'new[]', but don't freely mix those up.
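
    Applied to the arrays from the original post, a minimal cleanup sketch
    might look like this (hypothetical; the poster's deallocation code was
    not shown):

    int main()
    {
        const int MAXCHANNELS = 24;
        int numvoices = 100;

        // allocation: new[] at both levels
        float **gvoiceSpat = new float *[numvoices];
        for (int i = 0; i < numvoices; i++)
            gvoiceSpat[i] = new float[MAXCHANNELS];

        // deallocation: delete[] at both levels, rows first,
        // then the array of row pointers
        for (int i = 0; i < numvoices; i++)
            delete [] gvoiceSpat[i];
        delete [] gvoiceSpat;

        return 0;
    }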

    V
    --
    Please remove capital 'A's when replying by e-mail
    I do not respond to top-posted replies, please don't ask
     
    Victor Bazarov, May 3, 2007
    #13
  14. Guest


    > I think the hint you've been given is that you should use 'delete'
    > for pointers obtained from 'new' and 'delete[]' for pointers you
    > got from 'new[]', but don't freely mix those up.


    Thanks for the clarification. I am quite confident that this is
    exactly what I've been doing because the aforesaid vars are the only
    ones that needed special destructors anyhow.

    Ico
     
    , May 3, 2007
    #14
  15. James Kanze Guest

    On May 3, 11:29 am, "Sylvester Hesp" <> wrote:
    > <> wrote in message
    > news:...


    > > I've made a 2d dynamic array as follows (this is a snippet so not all
    > > variables are accounted for):


    > > //numvoices are dynamic (1-1000), entered by user
    > > //MAXCHANNELS is currently defined as 24


    > > float **gvoiceSpat;
    > > float **notechannelGain;
    > > float **notechannelGainSpread;


    > > gvoiceSpat = new float *[numvoices];
    > > notechannelGain = new float *[numvoices];
    > > notechannelGainSpread = new float *[numvoices];


    > > for (i = 0; i < numvoices; i++)
    > > {
    > > gvoiceSpat[i] = new float[MAXCHANNELS];
    > > notechannelGain[i] = new float[MAXCHANNELS];
    > > notechannelGainSpread[i] = new float[MAXCHANNELS];
    > > }


    > > The interesting thing is that this code works flawlessly in gcc but in
    > > Visual C++ (2003 .NET) whenever numvoices exceeds ~120, the program
    > > crashes, sometimes reporting unknown exception. The problem is that
    > > this is a code for an external module for another application and uses
    > > additional third-party libs so it is difficult to point fingers at the
    > > culprit. Yet, the fact remains that this crash occurs only on Windows
    > > using Visual C++, while it works flawlessly on OSX (gcc) and Linux
    > > (gcc) using same libs.


    > > Any ideas as to why would this be the case?


    > Based on the reactions earlier in this thread, my first guess would be heap
    > corruption, which means your problem has to do with something completely
    > different in your code rather than the allocations of these floats. Some
    > ways to get your heap corrupted are: writing to the memory pointed to by an
    > uninitialized pointer, freeing an allocated object more than once, freeing
    > an object that has never been allocated, writing to an object after it has
    > been freed, writing beyond the range of allocated buffers, using delete[]
    > for objects allocated with new and similarly using delete for objects
    > allocated with new[].


    > Memory tracking tools like BoundsChecker or Rational Purify can help you
    > detect these kinds of problems.


    Valgrind is a good choice for Linux. It's not Purify, but it's
    easier to use, and far more affordable.

    > But since this is a C++ group, by simply
    > using a std::vector instead of user-allocated arrays you can avoid these
    > pitfalls altogether


    Not necessarily. In the end, std::vector uses the heap as well,
    and if he's corrupted his heap (which is the most likely
    situation), then std::vector will get into trouble as well.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, May 4, 2007
    #15
  16. Ian Collins Guest

    James Kanze wrote:
    > On May 3, 11:29 am, "Sylvester Hesp" <> wrote:
    >
    >> But since this is a C++ group, by simply
    >> using a std::vector instead of user-allocated arrays you can avoid these
    >> pitfalls altogether

    >
    > Not necessarily. In the end, std::vector uses the heap as well,
    > and if he's corrupted his heap (which is the most likely
    > situation), then std::vector will get into trouble as well.
    >

    But if he sticks to the standard library rather than managing his own
    allocations, he is less likely to mess up the heap.

    --
    Ian Collins.
     
    Ian Collins, May 4, 2007
    #16
  17. James Kanze Guest

    On May 3, 10:47 pm, wrote:
    > > been freed, writing beyond the range of allocated buffers, using delete[]
    > > for objects allocated with new and similarly using delete for objects
    > > allocated with new[].


    > What do you mean by this? Am I not supposed to use delete at all, even
    > in the destructor if I created something with new?


    What he means is that if you allocate an array (as you were
    doing), you should use delete[], and not delete. Otherwise you
    have undefined behavior.

    > The issue is also I tried vector and it had the same effect.


    Sounds like his first suggestion: you've corrupted the free
    space arena somehow: writing beyond the end of the allocated
    memory, writing to already freed memory, freeing the same memory
    twice, or something along those lines.

    > This only manifests itself on VC++ (I use 2003 version). gcc on OSX
    > and Linux are absolutely rock solid with the code.


    Just chance. The effects of writing beyond the end of allocated
    memory, for example, vary enormously depending on the actual
    implementation of the allocator, as does freeing already freed
    memory.

    > I tried increasing heap and stack sizes with no difference whatsoever,
    > yet the "magic limit" beyond which the object fails to
    > initialize and/or brings the host application down remains more or
    > less consistent.


    > Any other ideas?


    Everything points to a corrupted heap. If you can, try running
    the program under Purify. (Purify is pretty much priced out of
    the reach of the hobby programmer. But it saves much more than
    it costs for a company, so it's more expensive not buying it.)
    Otherwise, try valgrind on the Linux platform. It may pick up
    the error even if there are no visible symptoms otherwise.

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, May 4, 2007
    #17
  18. James Kanze Guest

    On May 4, 9:48 am, Ian Collins <> wrote:
    > James Kanze wrote:
    > > On May 3, 11:29 am, "Sylvester Hesp" <> wrote:


    > >> But since this is a C++ group, by simply
    > >> using a std::vector instead of user-allocated arrays you can avoid these
    > >> pitfalls altogether


    > > Not necessarily. In the end, std::vector uses the heap as well,
    > > and if he's corrupted his heap (which is the most likely
    > > situation), then std::vector will get into trouble as well.


    > But if he sticks to the standard library rather than managing his own
    > allocations, he is less likely to mess up the heap.


    Of course, using the standard library systematically will
    eliminate large categories of bugs, and just makes good sense
    from a software engineering point of view. But once the heap is
    corrupt, the standard library is just as fragile as anything
    else. And of course, if he's not using a debugging version of
    the standard library... things like "*v.end() = 3.14159" will
    still get him into trouble.
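
    For what it's worth, a small sketch (not from the thread) contrasting
    unchecked and checked element access:

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main()
    {
        std::vector<float> v(24, 0.0f);

        // v[24] = 3.14159f;   // past the end: undefined behaviour, and only
                               // a checked/debugging library build flags it

        try
        {
            v.at(24) = 3.14159f; // checked access throws std::out_of_range
        }                        // instead of silently corrupting memory
        catch (std::out_of_range const &e)
        {
            std::cout << "caught: " << e.what() << '\n';
        }
        return 0;
    }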

    --
    James Kanze (GABI Software) email:
    Conseils en informatique orientée objet/
    Beratung in objektorientierter Datenverarbeitung
    9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
     
    James Kanze, May 4, 2007
    #18
  19. "James Kanze" <> wrote in message
    news:...
    On May 3, 11:29 am, "Sylvester Hesp" <> wrote:
    > <> wrote in message
    > news:...
    > > But since this is a C++ group, by simply
    > > using a std::vector instead of user-allocated arrays you can avoid these
    > > pitfalls altogether

    >
    > Not necessarily. In the end, std::vector uses the heap as well,
    > and if he's corrupted his heap (which is the most likely
    > situation), then std::vector will get into trouble as well.


    Naturally, but what I meant was that by using the standard library (or any
    other isolated, well tested and proven container implementation for that
    matter) you can avoid the programming errors I described. Just replacing his
    piece of code that reveals the bug with a std::vector implementation obviously
    doesn't magically make it all work :)

    - Sylvester Hesp
     
    Sylvester Hesp, May 4, 2007
    #19
  20. 280Z28 Guest

    wrote:
    >> been freed, writing beyond the range of allocated buffers, using delete[]
    >> for objects allocated with new and similarly using delete for objects
    >> allocated with new[].

    >
    > What do you mean by this? Am I not supposed to use delete at all, even
    > in the destructor if I created something with new?
    >
    > The issue is also I tried vector and it had the same effect.
    >
    > This only manifests itself on VC++ (I use 2003 version). gcc on OSX
    > and Linux are absolutely rock solid with the code.
    >
    > I tried increasing heap and stack sizes with no difference whatsoever,
    > yet it seems as if the "magic limit" beyond the object fails to
    > initialize and/or brings the host application down remains more or
    > less consistent.
    >
    > Any other ideas?
    >
    > Ico
    >


    Application Verifier is an extremely powerful, free, and easy to use
    tool for bounds checking in Windows. See if it helps find the problem.

    http://www.microsoft.com/technet/prodtechnol/windows/appcompatibility/appverifier.mspx
     
    280Z28, May 7, 2007
    #20
