hash table...again

filox

First off, I'd like to apologize for posting in this group, even though I know this is not a standard C++ question. I've written to microsoft.public.dotnet.lang.vc, but no one seems to be answering me, and it's pretty urgent.
Anyway, I have something like this:

hash_map<double, int> mapa;
double md = 0;
for (int j = 0; j < 1000000; ++j)
{
    mapa[md] = 3;
    md++;
}

As you can see, I'm simply trying to hash a lot of data. My problem is that this uses up 70 MB of memory, which seems to me a bit too much. Any ideas what went wrong?
Again, sorry for being off-topic.
 
mlimber

filox said:
First off, I'd like to apologize for posting in this group, even though I know this is not a standard C++ question. I've written to microsoft.public.dotnet.lang.vc, but no one seems to be answering me, and it's pretty urgent.

Actually, it's not off-topic. FAQ 5.9 says that the topic here is the standard C++ language and libraries and "planned extensions and adjustments." TR1, a set of planned extensions to the standard library, includes unordered_map, which is (I believe) essentially the same as hash_map but with a different name, so as not to cause problems with code using the common extension you are using.
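
For instance, the same loop written against the TR1 name might look roughly like this (just a sketch; the header location and namespace depend on how your compiler ships TR1):

#include <tr1/unordered_map>  // assumption: GCC-style TR1 header; some compilers use <unordered_map>

int main()
{
    std::tr1::unordered_map<double, int> mapa;
    double md = 0;
    for (int j = 0; j < 1000000; ++j)
    {
        mapa[md] = 3;  // same usage as hash_map's operator[]
        md++;
    }
}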
Anyway, I have something like this:

hash_map<double, int> mapa;
double md = 0;
for (int j = 0; j < 1000000; ++j)
{
    mapa[md] = 3;
    md++;
}

As you can see, I'm simply trying to hash a lot of data. My problem is that this uses up 70 MB of memory, which seems to me a bit too much. Any ideas what went wrong?

How do you know it takes 70 MB?

Cheers! --M
 
filox

mlimber said:
filox said:
First off, I'd like to apologize for posting in this group, even though I know this is not a standard C++ question. I've written to microsoft.public.dotnet.lang.vc, but no one seems to be answering me, and it's pretty urgent.

Actually, it's not off-topic. FAQ 5.9 says that the topic here is the standard C++ language and libraries and "planned extensions and adjustments." TR1, a set of planned extensions to the standard library, includes unordered_map, which is (I believe) essentially the same as hash_map but with a different name, so as not to cause problems with code using the common extension you are using.
Anyway, I have something like this:

hash_map<double, int> mapa;
double md = 0;
for (int j = 0; j < 1000000; ++j)
{
    mapa[md] = 3;
    md++;
}

As you can see, I'm simply trying to hash a lot of data. My problem is that this uses up 70 MB of memory, which seems to me a bit too much. Any ideas what went wrong?

How do you know it takes 70 MB?

Because the Windows Task Manager says so...
 
mlimber

filox said:
Because the Windows Task Manager says so...
From what I've read, the Task Manager is not a trustworthy source of
information on this count. Ask about reliable ways to calculate memory
usage in a Windows newsgroup, and once you are confident in the
numbers, ask here again if necessary.

Cheers! --M
 
loufoque

filox said:
First off, I'd like to apologize for posting in this group, even though I know this is not a standard C++ question. I've written to microsoft.public.dotnet.lang.vc, but no one seems to be answering me, and it's pretty urgent.
Anyway, I have something like this:

hash_map<double, int> mapa;
double md = 0;
for (int j = 0; j < 1000000; ++j)
{
    mapa[md] = 3;
    md++;
}

As you can see, I'm simply trying to hash a lot of data. My problem is that this uses up 70 MB of memory, which seems to me a bit too much. Any ideas what went wrong?

You could try using the constructor to tell the container its size beforehand, which prevents the repeated reallocations and the exponential (geometric) memory reservation used to amortize them.
 
filox

loufoque said:
filox said:
First off, I'd like to apologize for posting in this group, even though I know this is not a standard C++ question. I've written to microsoft.public.dotnet.lang.vc, but no one seems to be answering me, and it's pretty urgent.
Anyway, I have something like this:

hash_map<double, int> mapa;
double md = 0;
for (int j = 0; j < 1000000; ++j)
{
    mapa[md] = 3;
    md++;
}

As you can see, I'm simply trying to hash a lot of data. My problem is that this uses up 70 MB of memory, which seems to me a bit too much. Any ideas what went wrong?

You could try using the constructor to tell the container its size beforehand, which prevents the repeated reallocations and the exponential (geometric) memory reservation used to amortize them.

That sounds like it could help. Only, I don't know how to do it...
Could you write an example of how to call the constructor with a size of, say, 100 bytes?

Thanks
 
loufoque

filox said:
That sounds like it could help. Only, I don't know how to do it...
Could you write an example of how to call the constructor with a size of, say, 100 bytes?

1) I already told you: it is in the constructor.
2) If you have doubts, you should check your reference documentation.
3) It is the same syntax as with all the other containers.
 
Pete Becker

loufoque said:
1) I already told you: it is in the constructor.
2) If you have doubts, you should check your reference documentation.
3) It is the same syntax as with all the other containers.

That's too broad. The standard sequence containers (vector, list, deque) have constructors that initialize them with a specified number of elements. The standard associative containers (set, multiset, map, multimap) do not. Indeed, you wouldn't really want to create a map<double, int> with 1,000,000 identical elements.

Associative containers allocate nodes as needed. There is no need for exponential memory reservation.
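
A quick illustration of the difference (just a sketch):

#include <vector>
#include <map>

std::vector<int> v(1000000);           // fine: sequence containers accept an element count
// std::map<double, int> m(1000000);  // ill-formed: associative containers have no such constructor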

--

-- Pete

Author of "The Standard C++ Library Extensions: a Tutorial and
Reference." For more information about this book, see
www.petebecker.com/tr1book.
 
mlimber

Pete said:
That's too broad. The standard sequence containers (vector, list, deque) have constructors that initialize them with a specified number of elements. The standard associative containers (set, multiset, map, multimap) do not. Indeed, you wouldn't really want to create a map<double, int> with 1,000,000 identical elements.

Moreover, you couldn't since a std::map's keys must be unique.
Associative containers allocate nodes as needed. There is no need for
exponential memory reservation.

Exactly. The resize member function of SGI's hash_map
(http://www.sgi.com/tech/stl/HashedAssociativeContainer.html) and the
constructors of SGI's and TR1's
(http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2005/n1836.pdf)
hash maps both accept a parameter specifying the number of buckets (which
has to do with look-up speed), not the number of allocated/reserved pairs.
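
For instance, roughly (a sketch assuming a TR1-style unordered_map; the header location varies by compiler):

#include <tr1/unordered_map>  // assumption: GCC-style header; some compilers provide <unordered_map>

int main()
{
    // The argument is a minimum bucket count, not an element reservation.
    // (SGI's hash_map exposes the same knob through its resize() member.)
    std::tr1::unordered_map<double, int> mapa(1000000);

    // No key/value pairs exist yet; only the bucket array has been sized.
    return mapa.size() == 0 ? 0 : 1;
}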

Cheers! --M
 
filox

mlimber said:
information on this count. Ask about reliable ways to calculate memory
usage in a Windows newsgroup, and once you are confident in the
numbers, ask here again if necessary.

Cheers! --M

I'm confident in the numbers...
 
Pete Becker

filox said:
I'm confident in the numbers...

The numbers are right, but they're not saying what you think they're
saying. The task manager tells you the total amount of memory currently
assigned to each task. When you do a bunch of allocations the OS may
have to give the task more memory. The amount it gives the task will be
at least as much as the task has asked for, but it may be more, too.
Unless you know the details of how the OS decides to divvy up memory,
there's not much you can conclude about actual memory use from looking
at this number. The time to worry is if you've allocated 70MB and the
task manager says you have 10MB.
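
For a sense of scale, a very rough back-of-the-envelope estimate of what the map itself might actually request (every figure below is an assumption and varies by implementation) already lands in the tens of megabytes:

#include <iostream>

int main()
{
    // Hypothetical per-node cost for hash_map<double, int> on a 32-bit build:
    const unsigned long payload   = 8 + 4;  // double key + int value
    const unsigned long padding   = 4;      // align the 8-byte double within the node
    const unsigned long next_ptr  = 4;      // singly linked bucket chain
    const unsigned long heap_info = 16;     // typical heap bookkeeping per allocation
    const unsigned long per_node  = payload + padding + next_ptr + heap_info;  // 36 bytes

    const unsigned long nodes   = 1000000;
    const unsigned long buckets = 2000000;  // bucket pointers; actual count depends on load factor

    std::cout << (per_node * nodes + buckets * 4) / (1024 * 1024) << " MB\n";  // roughly 40 MB
}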

--

-- Pete

Author of "The Standard C++ Library Extensions: a Tutorial and
Reference." For more information about this book, see
www.petebecker.com/tr1book.
 
