Bill Reid said:
Well, OK, maybe, here's canonical specificity:
/* now re-allocate memory for the instrument strings */
if((curr_instrs=(instr_strs *)
realloc(curr_instrs,num_instrs*sizeof(instr_strs)))==NULL) {
printf("Not enough memory for instruments buffer\n");
goto CloseFiles;
}
Does that help you help me?
A little, but there are still a bunch of identifiers whose
declarations I haven't seen.
I will make one comment: Don't cast the result of malloc() or
realloc(). See section 7 of the comp.lang.c FAQ,
[...]
OK, so this should be completely legal and flawless:
/* sort the symbol list alphabetically */
qsort((void *)curr_instrs,num_symbols,128,sort_alpha_list);
then...
/* sort the no-symbol list alphabetically */
qsort((void *)curr_instrs+num_symbols,num_no_symbols,128,sort_alpha_list);
Um, no.
By "legal and flawless" I DID mean "100% guaranteed functional",
not "pleasing to thine eyes"...
The code isn't "100% guaranteed functional". You're performing
arithmetic on a void*. That's not allowed in standard C.
That's the way YOU'D do it, I do it differently, and since I'm the only
one reading it (except in this one rare instance, or occasionally I'll post
some code somewhere on the net), I can read it just fine, and of
course it compiles all the same...
Ok, but I find it more difficult to read without the whitespace.
Whenever you post code here, you can expect comments on its style.
You're under no obligation to pay attention.
In qsort(), it's basically 128 (character) bytes.
Ok, but why 128 rather than 127, or 100, or 256? That's a rhetorical
question; you don't need to answer it, but ideally your code should.
(And yes, it's a style issue.)
I've actually got "128" defined globally (and I do mean globally, for
several hundred thousand lines of code) for the purposes of reading
and writing strings of certain lengths. And those damned defines
have managed to screw me up royally several times, including a
really irritating "intermittent" problem I had when I first wrote this
particular section of code. So lately I've been using them less
and less...
Even at file scope right now I'm more comfortable with the way it
is...
Ok, it's your code, but I'm quite surprised that defining symbolic
constants would cause more problems than it would solve.
If someone else needs to maintain your code (and "someone else" could
be you a year from now), it's not going to be obvious that the 128 in
this function corresponds to the 128 (or 127) in another function, but
the 128 in that function over there is just coincidental. There's a
good discussion at [...]
The first argument to qsort is:
(void *)curr_instrs + num_symbols
You can't do pointer arithmetic on a void* value. (Some compilers may
allow it; if you're using gcc, try "-ansi -pedantic -Wall -W", or
replace "-ansi" with "-std=c99").
Then how does qsort() do it? I'm assuming now that it must just
use pointer arithmetic internally, because it doesn't seem to want or
recognize my typedef of a 128-character string:
typedef char instr_strs[128];
instr_strs *curr_instrs;
qsort behaves in a manner consistent with its specification. That's
all you really need to know. It needn't even be implemented in C, and
if it is, it's free to use compiler-specific extensions.
But if it's implemented in standard C (which is entirely possible), it
presumably would convert the void* arguments to char* before
performing arithmetic on them. (Since char* and void* have the same
representation, the conversion doesn't cost anything at run time.)
Nope, a pointer to the first of many 128-character strings, as above, so
are you saying the pointer cast should be (instr_strs *)? I have no problem
with that, as long as it works, and I must stress again at this point that
the current code:
/* sort the symbol list alphabetically */
qsort((void *)curr_instrs,num_symbols,128,sort_alpha_list);
Has worked flawlessly for months now; it's part of a particular section
of code that downloads about 3/4 meg of raw data from the net every
day at a specific time, parses out about 100,000 data items, and writes
them to a custom database in a matter of seconds.
That qsort() call isn't the one with the problem.
Incidentally, a piece of code is either correct or not. The number of
times it "works" really doesn't prove anything. If your compiler
accepts some non-portable code, it's probably going to keep working
the same way indefinitely -- but it will fail the first time you
compile it with a different compiler, or with the same compiler and
different options. Correctness is not statistical.
One pitfall of C is that there are a lot of errors that your compiler
isn't required to tell you about. Many things invoke undefined
behavior; they may appear to work, but the language doesn't guarantee
anything. Other things may be compiler-specific extensions. The
language requires a conforming implementation to issue a diagnostic
message for many of these -- but many compilers (including gcc) are
not conforming in their default mode. Typically you can use
command-line options to enable a conforming mode and provide
additional warnings.
The only reason I asked the original question was because I went
back and reviewed the code and wondered if I could shave a few
more milliseconds off the execution time...
Note that I didn't use the cast.
Yeah, I noticed that, I just use (void *) because that's what
I thought qsort() wanted, and it definitely WORKS that way
(I've used qsort() dozens of times EXACTLY that way without
problems).
Yes, it works, but it's not necessary. As a general rule, casts
should be avoided unless they're actually required. A cast is, among
other things, a promise to the compiler that you know what you're
doing, and will often inhibit warnings and error messages. In this
case, the argument will be implicitly converted to void* without the
cast (assuming you have a visible prototype for qsort() -- i.e., you
haven't forgotten the "#include <stdlib.h>".) The code is perfectly
correct either way, but the form with the cast is more "brittle". If
the cast had specified the wrong type, for example, the compiler
likely wouldn't have told you about the error.
Now to get back to this:
I think I see what you're saying, maybe...and maybe not...
If curr_instrs is a pointer to a 128-character string type, wouldn't
curr_instrs+num_symbols then point to a location offset from
curr_instrs by (num_symbols*128) bytes? And if so, what's
the point of the (char *) cast if qsort() already works by sorting
some specified number of sequences of some specified
number of character bytes?
I haven't seen the full context of your code (or if I have, I've
forgotten it). Your original code had
(void*)curr_instrs + num_symbols
which is illegal, because you can't perform pointer arithmetic on
void* (the cast applies to "curr_instrs", not to "curr_instrs +
num_symbols"). Pointer arithmetic, as you probably know, is scaled by
the size of the pointed-to type.
Are you using gcc? If so, it supports arithmetic on void* as an
extension; it acts like arithmetic on char*. (IMHO, this extension is
a bad idea.) By casting curr_instrs to void*, you cause the "+
num_symbols" to denote an offset of num_symbols *bytes*. I had
guessed that that's what you wanted, but apparently it isn't.
I think what you *really* wanted was for the addition to be scaled by
sizeof *curr_instrs (128 bytes?). If so, you probably meant to use
(void*)(curr_instrs + num_symbols)
which should work. But since the argument will be implicitly
converted to void* anyway, all you need is
curr_instrs + num_symbols
In other words, all you need to do is drop the cast. This avoids
depending on a compiler-specific extension *and* corrects a bug. It's
also a very nice demonstration of why unnecessary casts should be
avoided.