Dennis \(Icarus\)
Phil Carmody said: An object being 'constructed'? Are you sure you were using C?
I know I was using C++. I doubt C would be any different in this respect.
sandeep said: Multiple evaluation is very unlikely in this case. This answer looks
spurious to me.
Suppose your program is a filter, used in a (unix shell: sorry!) commandline like:
$ find . -name \*\.c -print | sort | uniq | yourprogram | lpr
FAILED $
What could the user do to help you solve *your* problem ?
Would he have liked it, if "FAILED" had been written to stdout ?
Will he ever attempt to use your program again ?
Nick Keighley wrote:
) Note yesterday I encountered someone who got a "not enough memory to
) perform operation" error and was surprised because their disk had
) plenty of space.
Quite understandable, given that many OSes use disk space as virtual
memory.
I don't understand.
I don't know what it is you don't understand, so I'll start with the pipeline:
- Create a list of *.c files present in the directory hierarchy rooted
  at the current working directory,
- sort them lexicographically,
- eliminate duplicate lines (duplicate lines were unlikely in this case,
  but I digress),
- process the remaining list with "yourprogram",
- pass the output of "yourprogram" to the line printer (ie. queue the
  output as a print job -- further processing is possible "within" lpr,
  ie. automagically ...)
If "yourprogram" writes FAILED to stdout instead of stderr, then FAILED
will show up interleaved with (or after) the other data sent to "lpr".
Supposing "yourprogram" *either* produces data *or* it writes FAILED,
and that lpr knows to ignore an empty job, writing normal data and
FAILED to different output streams works correctly. Writing FAILED to
stdout would print a page with "FAILED (snicker snicker)" on it.
Of course, if "yourprogram" writes data to stdout before it emits FAILED
to stderr, you still end up with an incomplete page (or book).
Nick Keighley said: oops! Don't agree. A named constant makes it clear what the constant
is for - it's a good use of abstraction. Then there's multiple use of
the same number (less likely with really big numbers).
sm.state = 9;
write_port (9, 9);
for (equip_num = 0; equip_num < 9; equip_num++)
    reset_equip (equip_num);
If you want to change one of those you have to inspect every 9 in the
program.
sandeep said: Multiple evaluation is very unlikely in this case. This answer looks
spurious to me.
This was quite a long rant!! I think I see your point but you have to
imagine your grannie when Word crashes. I think she will not like to see
a message like
"malloc failed at src\lib\unicode\mapper\table.c:762"
I think that will confuse her!
Anyway I have adopted your suggestion, also used a function instead of a
macro, and built in some extra functionality: for added robustness it now
keeps some memory back to use later on if allocations start failing.
void* safeMalloc(size_t x)
{
    static void* emrgcy = 0;
    void* x1;
#define ESIZE 0x40000000uLL
    if (!(emrgcy = emrgcy ? emrgcy : malloc(ESIZE)))
#undef ESIZE
        printf("WARNING running in unstable mode, program may crash at any "
               " time....Close open programs, allow more virtual memory or "
               " install extra RAM");
    if (!(x1 = malloc(x))) {
        if (emrgcy) {
            free(emrgcy);
            x1 = malloc(x);
        } else {
            printf("Severe memory failure, program cannot continue at line "
#define STRGFY(x) #x
                   STRGFY(__LINE__)
#undef STRGFY
                   " stack dump follows");
            abort();
            exit(1);
        }
    }
    return x1;
}
Yes.
Multiple evaluation is very unlikely in this case. This answer looks
spurious to me.
Note yesterday I encountered someone who got a "not enough memory to
perform operation" error and was surprised because their disk had
plenty of space.
wow. And you're on decaff?
the other cause for memory error is that some other program has eaten
the memory
True!
what universe do you live in? Are most of the people you know
programmers?
I disagree. Have you seen Airport displays with Windows NT register
dumps?
do you have a limit to this? "Database has deadlocked" "link layer
failure" "too many hash collisions"
Seebs said: Let me guess. Someone told you to define symbolic names for constants,
right?
This is not how you do it.
1. If you're going to define a constant, define it and leave it defined.
2. Don't use a name starting with a capital E followed by another capital
letter; those are reserved for errno values.
3. If you're only using it once, don't feel like you have to #define it.
4. Don't use "uLL" on a constant that's unambiguously within the size range
of an ordinary signed long. You don't need any qualifier at all,
although in theory a system could exist where that value is too big for
size_t, in which case you'd be allocating 0 bytes.
5. Don't get so clever. Try:
if (!emrgcy) {
    emrgcy = malloc(0x40000000);
    if (!emrgcy) {
        fprintf(stderr, "Uh-oh, failed to allocate spare memory.\n");
    }
}
6. Don't allocate a GIGABYTE of memory like that -- all this does is
massively increase the chance of catastrophic failure, as a likely
response from a system which overcommits is to determine that your
process allocated a TON of memory, doesn't use most of it, and is
probably the best candidate for being killed out of hand. A megabyte or
two, sure, I guess.
7. Actually, even then, this is just cargo cult stuff.
Again, don't do this. With extremely rare exceptions, you should NEVER
be using #undef on something you just defined.
Reading your code, I get the impression you're trying to aim for some
kind of code density, with cool tricks you've seen all thrown in
together to make the code look more impressive.
Yes, of course!
This is very bad practice. Localizing the #define is good for the same
reason that local variables are better than global ones.
I would say that the #define is a self-documenting form of code, no?
I think this is wrong. With no "qualifier", the number will be
interpreted as an int and undergo default promotions. Because int may be
16 bits this could overflow.
I think this is the same logic but with longer and more complicated
code...
I chose a gigabyte because most allocations will be less than 1 GB... if
you only allow a few MB there could easily be a late allocation that the
emergency memory can't satisfy.
I don't know what cargo cult stuff is.
By the same argument, with extremely rare exceptions you should NEVER be
using block-scope variables.
I like using advanced C features, yes. It makes programming fun. I think
all good programmers will be able to understand my code.
Lew Pitcher said: It is acceptable to have this utility function /log/ the error. It is
not acceptable to have the utility function decide that the
application /cannot recover/ from this error.