When to check the return value of malloc

Tim Harig

bart.c said:
sandeep said:
Obviously, for tiny allocations like 20 bytes to strcpy a filename,
there's no point putting in a check on the return value of malloc... if
there is so little memory then stack allocations will also be failing
and your program will be dead.
[SNIP]
For a proper application, especially to be run by someone else on their
own machine, then you should check allocations of any size (and have the
machinery in place to deal with failures sensibly).

Note this. As bart.c writes, not all insufficient-memory errors are causes
for termination. It may mean that the program cannot use one specific
feature due to its memory requirements; but that doesn't mean that it might
not be able to do most anything else, that it might not be able to do so in
the future, or even that you might not be able to revert to a less
optimized but less memory-hungry algorithm. If there is user data on the
line, the user is going to be seriously pissed if your program closes
without attempting to save any data that it can. In general, most programs
are expected to run as best they can in spite of limited memory.
This is a good idea. I have just made a clever macro to do this - not as
easy as it seems due to void use problems and need for a temporary.

Unless you are in a severely restricted environment, automatic variables
will likely be allocated in a different section of memory (the stack)
than dynamically allocated variables (the heap). Just because one area is
out of memory doesn't mean that the other is. Therefore, it is reasonable
to assume that you can call a function (which requires at least a pointer
on the stack) and even use temporary statically allocated variables if
necessary.

Also note that an exit like this should be used judiciously. Whether or not
to terminate should be based upon the context of the program and whether
the program can otherwise function, not on how much memory was requested.
Most functions are far better off returning an error so it can be handled
by the main part of the program, which can make better decisions about
which errors are actually terminal.
static void* __p;
#define safeMalloc(x) ((__p=malloc(x))?__p:\
(exit(printf("unspecified error")),(void*)0))

As has been stated by others, "unspecified error" is a rather poor error
message. Something like "Insufficient memory to continue with operation"
is much more informative to the user. This isn't leaking internals, as
most users *do* realize that programs require memory to operate. If you
are concerned that they don't, you could add a suggestion such as "Try
closing some windows and try again."

It is also more useful to add debugging versions using the preprocessor
which provide more information while testing and debugging:

#ifndef NDEBUG
fprintf(stderr, "Unable to allocate enough memory to copy strings at line %d.\n", __LINE__);
#else
fprintf(stderr, "Insufficient memory. Try closing some windows and try again.\n");
#endif

A function is much better suited to handle the kind of complexity that is
likely to arise from this operation. Premature optimization is a bad thing
and often turns out to be unsubstantiated, or a performance liability
because of failed assumptions. Never second-guess the optimizing abilities
of the compiler. Only optimize *after* you have confirmed that something is
actually a performance liability, with hard data to support that claim.
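Tim's two suggestions, a function for the checking logic and the preprocessor for the debug/release message split, can be combined. A sketch of that (the names `checked_malloc` and `CHECKED_MALLOC` are illustrative, not from the thread):

```c
#include <stdio.h>
#include <stdlib.h>

/* Wrapper around malloc() that reports failures rather than failing
 * silently.  Debug builds print the call site for the developer;
 * release builds print a user-oriented message.  The caller still
 * decides whether the failure is terminal. */
void *checked_malloc(size_t size, const char *file, int line)
{
    void *p = malloc(size);
    if (p == NULL && size != 0) {
#ifndef NDEBUG
        fprintf(stderr, "malloc(%lu) failed at %s:%d\n",
                (unsigned long)size, file, line);
#else
        (void)file; (void)line;
        fprintf(stderr, "Insufficient memory. "
                "Try closing some programs and try again.\n");
#endif
    }
    return p;
}

/* A thin macro only to capture the call site; the logic stays in the
 * function, so there are no multiple-evaluation issues. */
#define CHECKED_MALLOC(n) checked_malloc((n), __FILE__, __LINE__)
```

Keeping the decision to terminate out of the wrapper is what lets the main program handle non-fatal failures as described above.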
 
Seebs

Um, Seebs, maybe you should try decaf?

Hah! I've been on decaf for ages.

I guess it's just a pet peeve. There is little in this world more infuriating
than the scenario, which I'd guess everyone has been through:

You have some sort of time pressure or deadline. A piece of software fails.
And it gives you NO CLUE AT ALL what went wrong or what might be done to
address it. The error message is absent or uninformative, such that you
can't search on the message and get suggestions or ideas.

I have had programs fail in hundreds of ways, which required anything from
upgrading a kernel to switching from one kind of network to another in order
to address them. I have had programs destroy data.

But the only one that consistently, really, infuriates me, and makes me want
to whack developers upside the head with a lead pipe, is when the failure
has clearly been caught, and intercepted, and replaced with a completely
useless error message. I would rather get a segmentation fault than the
message "unspecified error"; at least I could *debug* that.

Now, to be fair, I'm not exactly at the typical "end user" level... But I
have watched people who have trouble with questions like "and what happens
when you double-click that little picture that looks like a piece of paper?"
interacting with software, and they have the same response. They can accept
that a program's error message will make no sense, but if you show them a
message which is completely legible and clearly designed to hide what went
wrong from them, they get pretty mad. It's insulting.

Sandeep's questions have led me to think that this is a person who would
love to become a good and effective programmer. For all that I think it's
absolutely ridiculous to use a macro instead of a function "for efficiency",
thinking about efficiency, while usually the wrong thing to do, is the
kind of thing that suggests someone who *wants* to become good at this stuff.

Whereupon, it's very important to address the importance of understanding
that, even if your target market is developmentally disabled people, it is
almost ALWAYS a horrific mistake to treat the user like an idiot. There is
no surer path to the uninstaller than wasting the user's time by saying "I
could tell you exactly what went wrong, but I don't think you're smart enough
to do anything about it, so I'll just lie to you."

-s
 
James Dow Allen

Obviously for efficiency! malloc may be called many times in the course
of a program.

Others have given useful correct responses but as someone who
prides himself on writing efficient code, I'd like to answer
this part.

Calling malloc() enough to matter and then worrying about
micro-optimizations in your interface to it is like
putting on your best running-shoes just to ride a slow-moving
elevator.

James Dow Allen
 
Dennis \(Icarus\)

sandeep said:
Many users will only be confused by technical error messages about memory
allocation etc. It's best not to get into unwanted details - the user
doesn't know about how my program allocates memory, it just needs to know
there was an error that needs a restart. I think in books they call it
leaking abstractions.

You have the message, which folks can then use to find more information in
your help.

If you're running a 64-bit program and try to do something requiring 6 GB
when the RAM and swap total 3 GB, you'll run out of memory.
Telling the user what happened, and how to fix it (increasing RAM or the
swap file), will let them adjust the system settings and continue.
Otherwise they may well decide it's crap and uninstall it, then tell their
friends that it's crap, and so on.
Word of mouth spreads pretty quickly in this day of Twitter, Facebook,
blogs, Usenet, ....

Dennis
 
Eric Sosman

Many users will only be confused by technical error messages about memory
allocation etc. It's best not to get into unwanted details - the user
doesn't know about how my program allocates memory, it just needs to know
there was an error that needs a restart. I think in books they call it
leaking abstractions.

*I* call it "Sandeep imagines he's smarter than everybody who
might ever use his program."
 
Dennis \(Icarus\)

Seebs said:
Please stop trying to outsmart the compiler.

The "cost" of using a function instead of a macro is likely to be so small
that you can't even measure it. If there is even a cost at all, which
there
may not be.

memset actually showed up as a bottleneck in one of the programs I had to
debug. It was being called 2,000,000,000 times or so.

Converting from memset to a for loop writing integers helped a bit, but the
real fix was to correct the logic error so that it wasn't called that many
times. The memset was part of initializing an object, which was being
constructed during processing but was only needed a fraction of the time.
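The real fix Dennis describes amounts to lazy initialization: only clear the object on the path that actually uses it. A minimal sketch of the pattern (the struct and names are illustrative):

```c
#include <string.h>

/* A scratch object whose buffer is expensive to zero out. */
struct scratch {
    int ready;              /* has buf been cleared yet? */
    unsigned char buf[4096];
};

/* Pay for the memset only the first time the buffer is really used,
 * instead of clearing it on every construction. */
unsigned char *scratch_get(struct scratch *s)
{
    if (!s->ready) {
        memset(s->buf, 0, sizeof s->buf);
        s->ready = 1;
    }
    return s->buf;
}
```

If only a fraction of constructed objects ever call `scratch_get`, the memset cost drops by the same fraction.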

Dennis
 
Malcolm McLean

So somewhere in between there must be a point where you stop ignoring the
return value, and start checking it. Where do you draw this line? It must
depend on whether you will deploy to a low memory or high memory
environment... but is there a good rule?
Where it's statistically more likely that the computer will break than
run out of memory, you've got a good case for ignoring malloc().

There are other arguments for ignoring malloc() checks. One is that a
request for 0 bytes can return either a null pointer or a non-null pointer
to zero bytes. It's much more likely that a zero-byte request will be
legitimate and be made, without it being obvious at writing time that
zero-size allocations are legitimate, than it is that a tiny request will
fail. So you are quite likely to trigger a spurious out-of-memory message,
and worse, this won't show up in testing if the platform always returns a
non-null pointer for zero bytes. The correct test,
if(request != 0 && ptr == NULL), can be messy and is often not seen in
production code.
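Wrapping that test once keeps it from being omitted at each call site. A sketch (the function name is illustrative):

```c
#include <stdio.h>
#include <stdlib.h>

/* Treat NULL as a failure only when the caller actually asked for
 * memory: malloc(0) may legitimately return NULL on some platforms. */
void *checked_alloc(size_t request)
{
    void *ptr = malloc(request);
    if (request != 0 && ptr == NULL) {
        fprintf(stderr, "out of memory (%lu bytes requested)\n",
                (unsigned long)request);
        exit(EXIT_FAILURE);
    }
    return ptr;   /* may be NULL for request == 0; that is not an error */
}
```

Whether exiting here is the right policy is exactly what the rest of the thread debates; the point of the sketch is only the zero-size test.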

The other issue is that if an allocation request for, say, a few
bytes to hold a file path fails, it's more likely that there is some
bug in the program causing the path size to be set to a garbage
number than it is that the program has actually run out of memory.
Printing an "out of memory" message is then misleading, and might be
expensive. For instance the user might try to run the program on a
bigger computer, spending a whole day's work begging the bigger
computer from the neighbouring department, recompiling and
reinstalling the program, setting it all up ...
 
sandeep

Seebs said:
In short, this is a catastrophically bad design approach. Abandon it.
Reject it. Anything you think you know which caused you to adopt this
is probably also utterly wrong, and dangerously so, and until you root
all the madness out and start afresh, your code will be dangerous,
untrustworthy, and based on a bad attitude.

This was quite a long rant!! I think I see your point but you have to
imagine your grannie when Word crashes. I think she will not like to see
a message like
"malloc failed at src\lib\unicode\mapper\table.c:762"
I think that will confuse her!

Anyway I have adopted your suggestion, also used a function instead of a
macro, and built in some extra functionality. Now it will keep some memory
back to use later on if allocations start failing for added robustness.

void* safeMalloc(size_t x)
{
    static void* emrgcy=0;
    void* x1;
#define ESIZE 0x40000000uLL
    if(!(emrgcy=emrgcy?emrgcy:malloc(ESIZE)))
#undef ESIZE
        printf("WARNING running in unstable mode, program may crash at any "
               " time....Close open programs, allow more virtual memory or "
               " install extra RAM");
    if(!(x1=malloc(x))) {
        if(emrgcy) {
            free(emrgcy);
            x1=malloc(x);
        } else {
            printf("Severe memory failure, program cannot continue at line "
#define STRGFY(x) #x
                   STRGFY(__LINE__)
#undef STRGFY
                   " stack dump follows");
            abort();
            exit(1);
        }
    }
    return x1;
}
 
Seebs

This was quite a long rant!! I think I see your point but you have to
imagine your grannie when Word crashes. I think she will not like to see
a message like
"malloc failed at src\lib\unicode\mapper\table.c:762"
I think that will confuse her!

Yes, but "unspecified error" will either:
1. Confuse her.
or
2. Infuriate her.

It's *necessarily* worse. It cannot possibly be better. It can only, at
best, be about as bad.

And again: Imagine that such users are a majority. Should you absolutely
cripple 25% of your users, infuriating them and treating them with contempt,
so that 75% of them will be completely unsure what just happened instead
of being completely unsure what just happened?
Anyway I have adopted your suggestion, also used a function instead of a
macro, and built in some extra functionality. Now it will keep some memory
back to use later on if allocations start failing for added robustness.

That technique is... well, dubious at best. It doesn't necessarily work,
and may well make things worse.

void* safeMalloc(size_t x)
{
    static void* emrgcy=0;
    void* x1;
#define ESIZE 0x40000000uLL
    if(!(emrgcy=emrgcy?emrgcy:malloc(ESIZE)))
#undef ESIZE

Let me guess. Someone told you to define symbolic names for constants,
right?

This is not how you do it.

1. If you're going to define a constant, define it and leave it defined.
2. Don't use a name starting with a capital E followed by another capital
letter, those are reserved for errno values.
3. If you're only using it once, don't feel like you have to #define it.
4. Don't use "uLL" on a constant that's unambiguously within the size
range of an ordinary signed long. You don't need any qualifier at all,
although in theory a system could exist where that value is too big for
size_t, in which case you'd be allocating 0 bytes.
5. Don't get so clever. Try:

if (!emrgcy) {
    emrgcy = malloc(0x40000000);
    if (!emrgcy) {
        fprintf(stderr, "Uh-oh, failed to allocate spare memory.\n");
    }
}
6. Don't allocate a GIGABYTE of memory like that -- all this does is
massively increase the chance of catastrophic failure, as a likely response
from a system which overcommits is to determine that your process allocated
a TON of memory, doesn't use most of it, and is probably the best candidate
for being killed out of hand. A megabyte or two, sure, I guess.
7. Actually, even then, this is just cargo cult stuff. Don't do it, it
won't help.
printf("WARNING running in unstable mode, program may crash at any "
       " time....Close open programs, allow more virtual memory or "
       " install extra RAM");

This is a really poor message, because this is not an "unstable" mode,
it's the normal state of affairs, where you don't have a spare 1GB allocation.
if(!(x1=malloc(x))) {
    if(emrgcy) {
        free(emrgcy);
        x1=malloc(x);
    } else {
        printf("Severe memory failure, program cannot continue at line "
#define STRGFY(x) #x
               STRGFY(__LINE__)
#undef STRGFY

Again, don't do this. With extremely rare exceptions, you should NEVER
be using #undef on something you just defined.

Also, you're still using plain printf for error messages, which is bad for
the same reasons it was last time.
" stack dump follows");

So's the missing newline.

So's the assumption that abort() gives a "stack dump" -- it may not.
exit(1);
}
}
return x1;
}

Finally, you've made a few other mistakes. You're freeing emrgcy, but you
don't set it to NULL, so your check for it is unlikely to be useful. You
don't check malloc() after calling it.

In short, this is full of cargo-cult superstitions. Here's a slightly
more realistic effort:

void *
failsafe_malloc(size_t size) {
    static void *failsafe = NULL;
    void *ret;

    if (!failsafe) {
        failsafe = malloc(1024 * 1024);
    }
    ret = malloc(size);
    if (!ret && failsafe) {
        free(failsafe);
        failsafe = NULL;
        ret = malloc(size);
    }
    if (!ret) {
        fprintf(stderr, "failed to allocate %lld bytes of memory.\n",
                (long long) size);
#ifndef NDEBUG
        abort();
#endif
    }
    return ret;
}

A few things to note:

1. Picked a size that's much less likely to cause an immediate catastrophic
failure.
2. Don't bother the user with warnings about the supposed "failsafe", since
it's basically a pointless superstition anyway.
3. No trying to show off using an elaborate combination of ?: and assignment
to set something up.
4. Error message is terse, simple, and doesn't clutter the user's world.
A user who knows what it means can use it, a user who doesn't at least gets
a message that Something Went Wrong.
5. abort() is conditional on NDEBUG, for consistency with assert()'s
behavior. (I don't use assert because it yields useless messages.)
6. failsafe is correctly set to NULL when freed, and future calls will
try to reallocate it (which may work if something large has been freed
in the mean time).
7. No #define, use once, #undef hackery, because that's annoying and
generally pointless.

Reading your code, I get the impression you're trying to aim for some kind
of code density, with cool tricks you've seen all thrown in together to
make the code look more impressive. Don't do that. Write the absolute
simplest code you can that clearly expresses what you're doing. You'll
have fewer bugs (if you'd written this more simply, I bet you'd have caught
that you never set emrgcy to NULL after freeing it, but might continue
to test it), and you'll have an easier time fixing things and adding features.

-s
 
Phil Carmody

Seebs said:
This question is too incoherent to answer.

Is it "How will it be easier to make effective use of?"

Answer - it won't have multiple-evaluation issues, and it both looks
and behaves like a function.
What part of "a function" do you have trouble with? You know how to write
functions, right? You know how to call them, right?

Try adding some verbs. Questions like "how do I declare a function" or "how
do I use a function" might begin to be answerable. An explanation of what
you're having trouble with, specifically, would be even better.


Wrong.

Users who are "confused" by an error message can accept that they got "an
error". MANY users, however, know enough to recognize that "out of memory"
is different from "file not found".

Stop trying to outsmart the user.

I think this error is called "out-dumbing the user".

Phil
 
sandeep

Phil said:
Is it "How will it be easier to make effective use of?"
Yes.

Answer - it won't have multiple-evaluation issues, and it both looks and
behaves like a function.

Multiple evaluation is very unlikely in this case. This answer looks
spurious to me.
 
Phil Carmody

Keith Thompson said:
Geoff said:
Many users will only be confused by technical error messages about memory
allocation etc. [...]
I think in books they call it leaking abstractions.

Stop reading those books immediately.

At least until you can understand what they're saying. I rather
doubt that books discussing "leaking abstractions" (a useful concept
and something to avoid) would recommend an "unspecified error"
message over "memory allocation failed".

Leaking *abstractions* sounds a lot better than leaking implementation
details, or leaking specifics.

E.g.:
"Error: out of memory (attempting to clone image buffer)"
may leak a couple of abstractions, but is way better than:
"Error: s_malloc(8192100) returned NULL, called from foo/bar/img.c:6742"
or
"Error: imgbuf_s binary buddy-heap has no free blocks"
IMHO.

Phil
 
Phil Carmody

sandeep said:
Obviously for efficiency! malloc may be called many times in the course
of a program.

Forget everything you've learnt.

Start again.

Do not pick up idiocy like the above next time.

Phil
 
Phil Carmody

Dennis \(Icarus\) said:
memset actually showed up as a bottleneck in one of the programs I had
to debug. It was being called 2,000,000,000 times or so.

Converting from memset to a for loop writing integers helped a bit,
but the real fix was to correct the logic error so that it wasn't
called that many times. The memset was part of initializing an object,
which was being constructed during processing but was only needed a
fraction of the time.....

An object being 'constructed'? Are you sure you were using C?

Phil
 
Tim Harig

Sorry, Seebs, much of this is directed at sandeep, not you. You have made
good points. It is just easier to reply here so as not to duplicate
some things.


Yes, but "unspecified error" is no less confusing.

The user should receive a message that is useful enough to give them a
basic idea of what is happening and how to fix it, without a lot of
technical reference or inside knowledge of the program. What they really
need to know is "Insufficient memory." That isn't too technical for
almost any user. Most people, even grannies, know that programs need
memory to run.

Somebody debugging the program would definitely prefer the "malloc failed"
message, as it gives them some clue as to where the failure happened within
the program. This could be useful for tracking down other problems, runaway
functions, etc.
And again: Imagine that such users are a majority. Should you absolutely
cripple 25% of your users, infuriating them and treating them with contempt,
so that 75% of them will be completely unsure what just happened instead
of being completely unsure what just happened?

There are ways to satisfy both. The first is to use the preprocessor to
differentiate between test code where detailed debugging information is
included and production code with errors more useful to the user. Another
is to provide a logging system appropriate to your target operating system
where you log messages with additional information while still showing user
oriented messages.
That technique is... well, dubious at best. It doesn't necessarily work,
and may well make things worse.

It is a poor choice. If an application truly cannot function then it
should handle the error gracefully and exit. That said, while one feature
of a program may not be able to perform with low memory, often other
functionality is possible. Many programs may not be able to function fully,
but may be too critical to exit over what may be a temporary memory issue.

A server process, for instance, may not be able to handle a request
that requires more memory than is available. It should not just exit.
It should send an error to the client, free any data that was
allocated for the client's request, close the connection, and wait for
another client. The next client's request may not be as memory intensive,
or another process that was hogging memory may have released it.

In the end, the appropriate action depends on the application.
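That per-request pattern can be sketched as follows (the handler and reply type are illustrative, not a real protocol):

```c
#include <stdio.h>
#include <stdlib.h>

struct reply {
    int ok;         /* 0 = request refused, 1 = served */
    void *payload;  /* owned by the caller on success */
};

/* Refuse a single request on allocation failure instead of exiting:
 * the server stays up, frees whatever it allocated for this client,
 * and waits for the next (possibly smaller) request. */
struct reply handle_request(size_t bytes_needed)
{
    struct reply r = { 0, NULL };
    void *work = malloc(bytes_needed);

    if (work == NULL) {
        fprintf(stderr, "request for %lu bytes refused: out of memory\n",
                (unsigned long)bytes_needed);
        return r;   /* caller sends an error to the client and moves on */
    }
    /* ... do the real work with `work` ... */
    r.ok = 1;
    r.payload = work;
    return r;
}
```

The key point is that the failure is reported to the caller, which knows the context, rather than decided deep inside the allocator.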
6. Don't allocate a GIGABYTE of memory like that -- all this does is
massively increase the chance of catastrophic failure, as a likely response
from a system which overcommits is to determine that your process allocated
a TON of memory, doesn't use most of it, and is probably the best candidate
for being killed out of hand. A megabyte or two, sure, I guess.

Unless you know the memory size of the target system, this is a crap shoot
at best.
5. abort() is conditional on NDEBUG, for consistency with assert()'s
behavior. (I don't use assert because it yields useless messages.)

Assert is not designed to give useful messages and it is not designed for
error handling. It is designed to crash the application if it notices
something is wrong. This helps the programmer to know about subtle bugs
that might otherwise go unseen until they produce noticeable problems. Bugs
of this nature can be difficult to detect until the software is shipped,
and can be difficult to debug as they may not cause problems or crash until
far later in the execution than where the actual bug resides. Assert helps
you catch them early, before they ship and closer to when they actually
happen.

The excellent book "Writing Solid Code" by Steve Maguire is a must read
for any programmer. It contains great examples of how to use assert()
properly and effectively.
Reading your code, I get the impression you're trying to aim for some kind
of code density, with cool tricks you've seen all thrown in together to
make the code look more impressive. Don't do that. Write the absolute
simplest code you can that clearly expresses what you're doing. You'll
have fewer bugs (if you'd written this more simply, I bet you'd have caught
that you never set emrgcy to NULL after freeing it, but might continue
to test it), and you'll have an easier time fixing things and adding features.

Note that increasing code density doesn't create a smaller or faster
binary. A concisely written ?: operator on a single line generates the
same code as the equivalent if/else statements spread across multiple
lines. The if/else version is almost always easier to read. "?:"'s are
almost always the sign of an amateur programmer trying to show off their
"l337 skilz." Seasoned programmers have learned to avoid them.
 
jacob navia

Phil Carmody a écrit :
An object being 'constructed'? Are you sure you were using C?

As you (may) know, objects are allocated, constructed (initialized) in C
all the time.

For instance for some hypothetical structure "Foo":

Foo *newFoo(size_t length, double averageUse)
{
    Foo *result;

    if ((result = calloc(1, sizeof(Foo))) == NULL)
        return NULL;
    result->averageUse = averageUse;
    result->length = length;
    result->Stats = DEFAULT_STATS_VAL;
    result->count = 0;
    return result;
}
 
Nick Keighley


I'm not a fan of putting too much "computer science" into user error
messages or expecting them to know program internals. But a short
succinct description of the cause of the failure is good. I'm a fan of
log files that provide the developer with more specific information.

Note yesterday I encountered someone who got a "not enough memory to
perform operation" error and was surprised because their disk had
plenty of space.

Where did you get this bullshit?  The above paragraph is by far the stupidest
thing I've ever seen you write.  It's not just a little wrong; it's not just a
little stupid; it's not just a little callous or unthinking.  It's one of
the most thoroughly, insidiously, wrong, stupid, and evil things you could
start thinking as a programmer.

wow. And you're on decaf?

2.  "Error that needs a restart" is nearly always bullshit.  If the program
is running out of memory because you made a mistake causing it to try to
allocate 4GB of memory on a 2GB machine, "restart" will not fix it.  Nothing
will fix it until the user finds out what's wrong and submits a bug report
allowing the developer to fix it.

the other cause for memory error is that some other program has eaten
the memory

4.  The chances are very good that many of the prospective users of any
program will, in fact, be able to program at least a little,

what universe do you live in? Are most of the people you know
programmers?
or will have basic computer literacy.

again quite strange. "basic computer literacy" can be *very* basic

5.  Trying to avoid "confusing" people is power-mad idiocy.

I disagree. Have you seen Airport displays with Windows NT register
dumps?
 Your job here
is not to imagine yourself some kind of arbiter-of-technology, preserving the
poor helpless idiots from the dangers of actual information.  Your job is
to make a program which works as well as possible, and that includes CLEAR
statements of what failed.

well you draw the line at register dumps so I think this is a matter
of where the line is drawn

6.  You can never make a message so clear that every conceivable user will
understand it.  However, a user who won't understand a simple message won't
understand an imprecise or flatly false one, either.  There does not exist
a user who will have a clear idea of what went wrong and be able to react
accordingly when confronted with "unspecified error", but who will be utterly
paralyzed like a deer in headlights when confronted with "memory allocation
failed".  As a result, even if we restrict our study to the set of users
who simply have no clue what those words mean, you STILL gain no benefit,
at all, from the bad message.  But in the real world, you hurt many of your
users by denying them the information that would allow them to address
the issue (say, by closing other applications so that more memory becomes
available).

do you have a limit to this? "Database has deadlocked" "link layer
failure" "too many hash collisions"

<snip>
 
Moi

Many users will only be confused by technical error messages about
memory allocation etc. It's best not to get into unwanted details - the
user doesn't know about how my program allocates memory, it just needs
to know there was an error that needs a restart. I think in books they
call it leaking abstractions.

Suppose your program is a filter, used in a (unix shell: sorry!) commandline like:

$ find . -name \*\.c -print | sort | uniq | yourprogram | lpr
FAILED $

What could the user do to help you solve *your* problem ?
Would he have liked it, if "FAILED" had been written to stdout ?
Will he ever attempt to use your program again ?

HTH,
AvK
 
Willem

Nick Keighley wrote:
) I'm not a fan of putting too much "computer science" into user error
) messages or expecting them to know program internals. But a short
) succinct description of the cause of the failure is good. I'm a fan of
) log files that provide the developer with more specific information.
)
) Note yesterday I encountered someone who got a "not enough memory to
) perform operation" error and were surprised because their disk had
) plenty of space.

Quite understandable, given that many OSes use disk space as virtual
memory.

)> 4.  The chances are very good that many of the prospective users of any
)> program will, in fact, be able to program at least a little,
)
) what universe do you live in? Are most of the people you know
) programmers?

No, but 99% of the users are using 1% of the programs.
In other words: most users stick with the big, well-known software,
especially the less computer-literate ones, so any given program is
therefore most likely to be used by a computer-savvy person.


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 
Nick Keighley

Let me guess.  Someone told you to define symbolic names for constants,
right?

This is not how you do it.

1.  If you're going to define a constant, define it and leave it defined.
2.  Don't use a name starting with a capital E followed by another capital
letter, those are reserved for errno values.
3.  If you're only using it once, don't feel like you have to #define it.

oops! Don't agree. A named constant makes it clear what the constant
is for - it's good use of abstraction. Then there's multiple use of
the same number (less likely with really big numbers).

sm.state = 9;
write_port (9, 9);
for (equip_num = 0; equip_num < 9; equip_num++)
    reset_equip (equip_num);

If you want to change one of those, you have to inspect every 9 in the
program.
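For comparison, the same fragment with each 9 given its own name, which makes explicit that the four constants only happen to share a value (the names and stubs are illustrative, added only so the fragment compiles stand-alone):

```c
/* Illustrative names for the different meanings of "9" above. */
#define STATE_SHUTDOWN  9
#define STATUS_PORT     9
#define PORT_RESET_CMD  9
#define NUM_EQUIP       9

/* Minimal stubs standing in for the real state machine and hardware. */
static struct { int state; } sm;
static int resets_done;
static void write_port(int port, int cmd) { (void)port; (void)cmd; }
static void reset_equip(int equip_num) { (void)equip_num; resets_done++; }

static void shutdown_all(void)
{
    int equip_num;

    sm.state = STATE_SHUTDOWN;
    write_port(STATUS_PORT, PORT_RESET_CMD);
    for (equip_num = 0; equip_num < NUM_EQUIP; equip_num++)
        reset_equip(equip_num);
}
```

Now changing NUM_EQUIP from 9 to 12 touches one line and cannot accidentally change the port number or the state code.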

<snip>
 
