casts


Nick Keighley

double Gma (double *num, int n)
{
    int i;
    double total=0;
    for (i=0;i<n;i++)
        total += num[i];
    return total/1*i;
}
I hope this code is correct. But on that last line I've seen a double  
cast.
That isn't necessary, is it?


I hope you realize that the last line is equivalent to
    return (total/1) * i;
which is equivalent to
    return total * i;

It looks like you're probably trying to compute the mean of a set of
doubles, in which case it should be
    return total / i;

(which I would write as
    return total / n;
which is the same value, but I think the meaning is
much more immediately apparent to the reader).


it isn't the same value. On exit from the for-loop i will be one greater
than n. Which is correct is another matter, but they aren't the same.
But in none of these cases is a double cast necessary.
A cast would be appropriate here:

double avg(int *nums, int n)
{
    int i;
    int total = 0;

    for (i = 0; i < n; ++i) total += nums[i];

    return (double)total / n;
}

to force the division to be computed as a double value instead of
an int, because otherwise both operands of the division are ints.
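For what it's worth, here is a minimal sketch of the difference (the values
are made up purely for illustration):

#include <stdio.h>

int main(void)
{
    int total = 7;
    int n = 2;

    printf("%d\n", total / n);          /* int division truncates: prints 3 */
    printf("%f\n", (double)total / n);  /* cast one operand first: prints 3.500000 */
    return 0;
}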
 

Chris M. Thomasson

Bill Cunningham said:
So many people I have noticed in their code use casts. Something
Richard Heathfield said once very much sticks out in my mind. If code is
written properly there is no need (or rarely) for casts. Why do they seem
to be used so much then?

Casting can be very useful when working with, say, intrusive
data-structures:

http://isis.poly.edu/kulesh/stuff/src/klist
 

Chris M. Thomasson

Richard Heathfield said:
Chris M. Thomasson said:


If that URL was an attempt to convince me, it failed. (Sorry.) For
example, it seems clear that the author thereof has never heard of
offsetof, and his understanding of malloc seems to be twenty years
out of date.

Is this any better:

http://www.google.com/search?hl=en&q=linux+kernel+linked+list




struct node {
    struct node* next;
};


struct foo {
    char a;
    int b;
    struct node node;
    char c;
    int d;
};




How can you get a `struct foo*' from a `struct node*' without using a single
cast?
 

Chris M. Thomasson

Richard Heathfield said:
Chris M. Thomasson said:


Well, the first hit was just the same page I saw earlier, which
obviously has the same flaws as before. The second contained at least
one non-standard construct ("typeof"). At that point, I gave up. But
I see you have provided some code.


I wouldn't do that if I were you. struct node * points to a struct
node, which is most certainly not a struct foo. Therefore, it's
unwise to convert a struct node * to a struct foo *.

I take it that you're not a fan of intrusive data-structures. Anyway, what's
wrong with:
______________________________________________________________________
#include <stdio.h>
#include <stddef.h>
#include <assert.h>


#define CONTAINS(mp_ptr, mp_struct, mp_member) ((mp_struct*)( \
    ((unsigned char*)(mp_ptr)) - offsetof(mp_struct, mp_member) \
))


struct node {
    struct node* next;
};


struct foo {
    char a;
    struct node node1;
    int b;
    struct node node2;
    char c;
    struct node node3;
    int d;
};


int main(void) {
    struct foo foo;
    struct node* node1 = &foo.node1;
    struct node* node2 = &foo.node2;
    struct node* node3 = &foo.node3;
    assert(CONTAINS(node1, struct foo, node1) == &foo);
    assert(CONTAINS(node2, struct foo, node2) == &foo);
    assert(CONTAINS(node3, struct foo, node3) == &foo);
    return 0;
}
______________________________________________________________________




IMHO, this technique is perfectly fine. Now, I can pass `foo::node1/2/3' to
generic algorithms which work with `struct node*' and I do not need to
allocate these nodes separately as they are embedded within a `struct foo',
hence the term "intrusive". If you know what you're doing, this is a very
useful construct indeed, well, IMVHO at least...

;^)
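To make the "generic algorithms" point concrete, here is a minimal sketch of
my own (the push helper and the value field are purely illustrative, not from
the post above): a routine that only knows about `struct node*', with the
containing `struct foo' recovered through the same CONTAINS macro.

#include <stdio.h>
#include <stddef.h>

#define CONTAINS(mp_ptr, mp_struct, mp_member) ((mp_struct*)( \
    ((unsigned char*)(mp_ptr)) - offsetof(mp_struct, mp_member) \
))

struct node {
    struct node* next;
};

/* a generic routine: it knows nothing about what the nodes live inside */
static void push(struct node** head, struct node* n) {
    n->next = *head;
    *head = n;
}

struct foo {
    int value;
    struct node node;   /* embedded link - no separate allocation needed */
};

int main(void) {
    struct foo a = { 1, { NULL } };
    struct foo b = { 2, { NULL } };
    struct node* head = NULL;
    struct node* it;

    push(&head, &a.node);
    push(&head, &b.node);

    /* walk the generic list, recovering each containing struct foo */
    for (it = head; it != NULL; it = it->next) {
        struct foo* f = CONTAINS(it, struct foo, node);
        printf("%d\n", f->value);
    }
    return 0;
}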
 

Keith Thompson

Nick Keighley said:
By Arnuld on clc

" At my job, I am working on a piece of code (around 16 source files)
where I have to call malloc() and free() around 20 times. Every time I
have to check the return value of malloc(). So I came up with this idea
of writing my own malloc() and then using it. is this a good idea ?
[snip]
  printf("Memory allocated: %p\n", (void*)m);
  free_my_struct(&m);
  printf("Memory freed:     %p\n", (void*)m);
[snip]

what casts?

The unsnipped ones above.
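Those are casts that %p genuinely requires; a minimal sketch of why (the
variable is just illustrative):

#include <stdio.h>

int main(void)
{
    int x = 42;
    int *p = &x;

    /* printf is variadic, so the compiler performs no implicit
       conversion of the argument to void * - the cast supplies
       the type that %p requires */
    printf("Memory at: %p\n", (void *)p);
    return 0;
}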
 

spinoza1111

my irony meter imploded

That's because its timer is broken. Look, Heathfield in all cases
starts trouble while pretending, in a stupid and pompous way, to be
the voice of "objectivity" and "science"...despite the fact that he
promotes a language which is known to be unsafe, and the use of which
for applications programming is a termination offense in some
environments.

"Objectively" he proceeds to question the professional competence of
individuals in such a way that can damage their livelihoods. He's done
so repeatedly to well-known people who've accomplished far more than
he ever will, such as Jacob Navia, Herbert Schildt, and the late Bob
Bemer.

He does so when they use conventions or practices of which he
disapproves in rote and folkloric fashion. He fancies himself, as do
many common-as-dirt programmers, a magical combination of craftsman,
artist and scientist while being none of these: not honestly working
with materials, nor working with or expressing himself with care and
beauty, nor with any dedication to truth or collegiality, marks of the
scientist.

To this, people unsocialised by corporate bullshit to take it up the
ass respond like men, unfashionably, and fight back, meeting force
with force and even fraud with fraud.

And then "neutral" parties read a couple of the latest posts to find
poor widdle Dickie boy being given the good old Edward II treatment (a
hot poker up the butt) and they reason after such undue undiligence
that Antoninus Twink, Jacob Navia and above all Edward G. "Up Yours"
Nilges must be wild men, and *monstrum horrendum*.

"A little learning is a dangerous thing: Drink deep, or drink not,
from the Pierian Spring".
 

spinoza1111

Fred said:




Also, it is difficult to use qsort without using a cast.

No, it's easy (see below).

Don't make the mistake of believing Antoninus Twink. Casts are almost
never a good idea, and the places where they are a good idea are not
the places you'd think.

Here is an example of using qsort without a cast:

#include <stdlib.h> /* for qsort */
#include <string.h> /* for strcmp, used in comparison function */

struct bigramfreq_
{
  char bigram[3];
  unsigned long freq;
};

typedef struct bigramfreq_ bigramfreq; /* the things we're sorting */

/* comparison function */
int compbigramfreq(const void *vp1, const void *vp2)
{
  const bigramfreq *p1 = vp1;
  const bigramfreq *p2 = vp2;
  int diff = (p1->freq < p2->freq) - (p1->freq > p2->freq);
  if(diff == 0)
  {
    diff = strcmp(p1->bigram, p2->bigram);
  }
  return diff;
}

Since I no longer use C, regarding its use as criminal save for
recreational programming, I may have missed something here, but this
code is to me globally incompetent...so incompetent, in fact, as to
relabel competence itself and normalize deviance.

You declare a function which takes a pointer to ANYTHING, and convert
it blindly and foolishly to a pointer to bigramfreq. You then operate
on essentially unknown data and return an answer.

Not only hard to debug, and not only impossible to do in a modern
language, this code may actually seem to run with only occasional
anomalies while quietly undermining the user's data, and he none the
wiser. Practices like this caused the intelligence failures that led
to Sep 11 and the war in Iraq, and the credit crisis.

But you don't CARE. You don't CARE how many wrong answers you generate
at high speed and how many reputations you ruin. As long as you get
paid.

You filthy swine. You Creosote.

You are such a destructive little man that you actually, given my
thankfully fading memories of C's horrid details, declare the function
unnecessarily as taking void pointers when you could have declared
them as what they MUST BE.

You WANT the calling programmer to be humiliated and embarrassed by
passing you shit pointers, you creep. Then you can make fun of him.

If I was your father, I'd kick your ass.

I hope your job goes to India and stays there.

Once all the furniture is in place (a type to sort, and an ordering
function), we can define our data:

  bigramfreq bgf[26*26] = {0};

and, having populated it (not shown here), can sort it thusly:

  qsort(bgf,
        sizeof bgf / sizeof bgf[0],
        sizeof bgf[0],
        compbigramfreq);

I see no need for a cast anywhere here.

I see a need for a bucket, Creosote.
 

spinoza1111

I use qsort without casts regularly.

I use qsort without casts regularly.

Because it makes you feel like a Real Man to handle all them void
pointers, tough guy?

C programming is macho BS. I started out in machine, not assembler,
language, so I just laugh at programmers who pretend to be "real"
programmers by endangering their users' data by using C. I was a
professional, you see, so I got an assembler real fast and debugged a
nonworking compiler to be able to program in the safest language
possible.

If I were, today, unprofessional enough to use C, I'd cast up the ass
because casting redresses the broken (indeed, dead at birth) type
system of C.

Kernighan and Ritchie arguably had an excuse for weak to zero typing
in 1970, because they literally had no clue what it was they were
doing. Not only was software in its infancy, as Americans they
regarded the theory work of the Algol team as bullshit, and as
adolescents they feared being associated with "failure", that of the
Algol team to "deliver" something that would kill the Russians and
make the rich, richer. Therefore they had a somewhat genuine need to
represent novel information structures using numeric and other codes.

But as adolescents, they fell in love with the "no girls allowed"
secrecy of the resulting business. It created a Men's Hut. Excluded
from Tiger Inn and the other eating clubs of Prospect Avenue at
Princeton because as graduate students they were regarded as lower
life forms, they could create their very own Skull and Bones.

The Europeans, on the other hand, had had quite enough with
adolescents having power, having seen the result of this in Fascism
and war. In the Algol era they'd elected old men from the pre-war
generation of adults such as Konrad Adenauer, de Gaulle, and Macmillan
rather than get some pushy head-case sprat in the head office.

They were grown ups and made an adult effort to enquire what it might
actually mean to write software and not play games. Their "failure"
wasn't theirs. It was the failure of hardware designers, save at
Burroughs, to make computers that could run decent programming
languages.

And to this day, we have clowns proud of the Puritan sparseness of
their intellects, their speech, and what they use.

"The quantification of technical progress, however, their dissection
into minute operations largely independent of education and
experience, makes the expertise of these new-style managers to a large
degree illusory, a pretence concealing the privilege of being
appointed. That technical development has reached a state which makes
every function really open to all - this immanently socialist element
in progress has been travestied under late industrialism. Membership
of the elite seems attainable to everyone. One only waits to be co-
opted. Suitability consists in affinity, from the libidinal garnishing
of all goings-on, by way of the healthy technocratic outlook, to
hearty realpolitik. Such men are expert only at control."

- Theodor W. Adorno, Minima Moralia
 

Richard Tobin

You really don't know diddly-squat about C, do you?

I suspect that Spinoza has forgotten how C works. But it is one of
the real deficiencies of C that its mechanism for polymorphism -
converting pointers to void * and back again - disables normal type
checking.

In theory a compiler could know enough about qsort() to detect type
errors in the usual case where the comparison function is known at
compile time, but qsort() is just one example of the problem.
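A sketch of my own of the kind of mismatch that slips through: the comparison
function below is written for the wrong element type, yet the call compiles
cleanly, because everything is funnelled through void *. The names and data
are purely illustrative.

#include <stdlib.h>

/* written as though the elements were doubles... */
static int cmp_double(const void *a, const void *b)
{
    const double *x = a;
    const double *y = b;
    return (*x > *y) - (*x < *y);
}

int main(void)
{
    int data[4] = { 3, 1, 4, 1 };

    /* ...but the array actually holds ints.  No diagnostic is required
       here: qsort and the comparison function see only void *, so the
       type error goes undetected and the behaviour is undefined - a
       silently wrong result rather than a compile-time error. */
    qsort(data, 4, sizeof data[0], cmp_double);

    return 0;
}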

-- Richard
 

spinoza1111

Richard Tobin said:



If he ever knew, which is not a given.

For somewhat the same reason it is better to have loved and lost than
never have loved at all, but not quite, it is better to partly forget
than never learn the fundamentals of competent programming independent
of a language.
It's a deficiency that I can live with. If I want type-checked
polymorphism (which, to be fair, I sometimes do), I know where to
find it.

I don't give a flying **** if you can live with it. I'm concerned
about the chip shops, chop shops and turf accountants who waste their
money on your services, and who get subtly, or not so subtly, wrong
answers at high speed from your crap software, which is destroying
what's left of Britain's industry and trade.

I don't give a rat's ass if you know where to find it. I'm concerned
about the users who you defraud and deceive.

Furthermore, in answer to Richard, "polymorphism" isn't the name of a
simple blunder which here is not strong-typing the parameters. Perhaps
there's some silly C reason why this isn't possible, but in a modern
language in nearly all cases (the exception for me being .Net
delegates in a multithread environment) there is NO EXCUSE for weak
typing, and relying on the caller to provide the right type. In using
formal parameters declared as object I place their assignment to
strongly typed local variables in try..catch blocks because I am
competent and Heathfield is not.

This is despite the fact that Java and .Net runtimes will in most
cases catch the error. Note that in C one of the worst types of errors
occurs instead, and this is the undiscovered and unknown error.

Richard's incompetent, deviant, and fraudulent psychology is here
exposed, for weak-typing the parameter victimises and entraps his
callers, creating a danger for the user in order to give him the
opportunity to blame others for their failure to know what his
incompetent code expects.

Both Tobin and Heathfield are mistaken about the very meaning of
"polymorphism". Polymorphism doesn't mean lazily allowing void
pointers. It means the disciplined provision of routines with
parameters that may be omitted to obtain carefully defined default
values, and the ability to handle disparate but related data types.
Here is a simple example in C Sharp:

private void printValue(int intValue)
{
    printValue(intValue.ToString());
}

private void printValue(string strValue)
{
    ....
}

Polymorphism is bounded to a small set of alternatives such that
errors can ideally be detected at compile time, and at the latest at
run time, whereas calling Richard's code with the wrong parameter type
(perhaps because he's lied to you in order to get you fired) creates
errors that might never be detected.

Richard, do NOT dignify your laziness, dishonesty, and incompetence
with words you do NOT understand.
 

Nick Keighley

Nick Keighley said:
By Arnuld on clc
" At my job, I am working on a piece of code (around 16 source files)
where I have to call malloc() and free() around 20 times. Every time I
have to check the return value of malloc(). So I came up with this idea
of writing my own malloc() and then using it. is this a good idea ?
[snip]
  printf("Memory allocated: %p\n", (void*)m);
  free_my_struct(&m);
  printf("Memory freed:     %p\n", (void*)m);
[snip]

what casts?

The unsnipped ones above.

brain fart
 

Nick Keighley

I cleaned up your post a bit

Because it makes you feel like a Real Man to handle all them void
pointers, tough guy?

C programming is macho BS. I started out in machine, not assembler,
language, so I just laugh at programmers who pretend to be "real"
programmers by endangering their users' data by using C. I was a
professional, you see, so I got an assembler real fast and debugged a
nonworking compiler to be able to program in the safest language
possible.

If I were, today, unprofessional enough to use C, I'd cast up the ass
because casting redresses the broken (indeed, dead at birth) type
system of C.

heavy use of casts prevents the compiler from diagnosing certain
errors

void (char *s)
{
    char c;
    c = (char*)c;
}

Kernighan and Ritchie arguably had an excuse for weak to zero typing
in 1970, because they literally had no clue what it was they were
doing.
riight

not only was software in its infancy, as Americans they
regarded the theory work of the Algol team as bullshit,

Algol-60 is frequently quoted as an influence on the C language

<snip nonsense>
 

spinoza1111

spinoza1111said:




Fred said:
On Jul 30, 10:58 am, Antoninus Twink <[email protected]>
wrote:
On 30 Jul 2009 at 14:37, Bill Cunningham wrote:
Something Richard Heathfield said once very much sticks out in
my mind. If code is written properly there is no need (or
rarely) for casts.
As with most things Heathfield says, this needs to be taken with
a pinch of salt. As is typical for Heathfield, this statement is
pure polemic to push his "Sola ISO C" fundamentalism.
There are many places where casts are essential: for example, to
implement polymorphism in object-oriented C programming or for
type punning. There are also many places where casts are useful:
for example, casting a uint64_t to a uint32_t is a simple way to
reduce it mod 2^32.
Also, it is difficult to use qsort without using cast.
No, it's easy (see below).
Don't make the mistake of believing Antoninus Twink. Casts are
almost never a good idea, and the places where they are a good idea
are not the places you'd think.
Here is an example of using qsort without a cast:
#include <stdlib.h> /* for qsort */
#include <string.h> /* for strcmp, used in comparison function */
struct bigramfreq_
{
char bigram[3];
unsigned long freq;
};
typedef struct bigramfreq_ bigramfreq; /* the things we're sorting
*/
/* comparison function */
int compbigramfreq(const void *vp1, const void *vp2)
{
const bigramfreq *p1 = vp1;
const bigramfreq *p2 = vp2;
int diff = (p1->freq < p2->freq) - (p1->freq > p2->freq);
if(diff == 0)
{
diff = strcmp(p1->bigram, p2->bigram);
}
return diff;
}
Since I no longer use C, regarding its use as criminal save for
recreational programming, I may have missed something here, but this
code is to me globally incompetent...so incompetent, in fact, as to
relabel competence itself and normalize deviance.

You've missed something here.
You declare a function which takes a pointer to ANYTHING,
and convert it

Yes. That's how qsort works. It's impossible to demonstrate qsort
without using qsort. It's impossible to use qsort without using a
comparison function. The comparison function *must* be of type
int(const void *, const void *). Since void pointers can't be
dereferenced, it is necessary to convert them to something that can.
Therefore, to demonstrate qsort it is necessary to do such a
conversion.

This is no excuse, that you're instantiating a qsort exit. You're
deliberately using an out of date and unsafe tool, and because it
forces you to use const pointers to void, you use these, I am sure,
elsewhere, and you use them blindly or to entrap.

You call this incompetently weak typing by an incorrect name, because
it is NOT polymorphism, it's anamorphism, shapeless code and a bug
waiting to happen. In C you do not have the right to use a sorting
tool at all and you need to write a separate sort for each data type,
because the children who designed this language never even thought
about sorting, and never realized that it is not a problem that can be
solved safely, correctly and maintainably in a non-OO language,
because sorting IS polymorphic. It's a set or class of problems and as
such its solution cannot be properly or elegantly expressed in a
language without the ability to declare new types (and C's typedef and
struct are not able to declare types orthogonally to built-in types:
such types will forever be weak shadows).

You've also deceptively or foolishly made yet another false claim as
is your alternately deceptive or foolish wont. You say the code does
not use a cast. You're lying or just stupid (and this does not
logically exclude the strong possibility that you're a stupid bloody
liar, being || and not XOR). But in const bigramfreq *p1 = vp1, vp1 is
CASTED to a bigramfreq pointer!

I concede that I did not see you were using qsort. I never used this
tool because of its apparent lack of safety and use of void pointers,
and creeps like you like to snigger at small errors while you make the
big ones, starting with your idiotic advocacy of C. I am certain that
following qsort, you use const void pointers whenever you are in the
mood, and I believe that your code provides systematically wrong
answers in an undetected fashion. I am certain that the clerks and
secretaries forced to use your code are reduced to tears as they are
screamed at by managers to get results from it.
On the contrary, I know perfectly well that I'm getting pointers to
bigramfreq structures. See the qsort call.

There's nothing to prevent the sort exit from being called by new
code. You know only a narcissistic fantasy, a fetishised and
hypostatized Platonic Idea of your code uninterfered with by
maintenance programmers and fed on nothing but the expected inputs.
When that fantasy is deconstructed you proceed to destroy people.
What bug do you think needs debugging? I see none in the code I
posted.


Modern languages don't let you sort? I find that very hard to believe.

They allow the programmer to declare objects as comparable.
On the contrary, it works just fine, with no anomalies whatsoever.

Unless someone calls the qsort exit from another point in your code. I
have said before that one feature of a bad programming language is
that it makes smart people only seem stupid, and stupid bonehead thugs
like you seem smart: indeed, this is why you support C. John Nash
wrote beautiful C code but abandoned it for Mathematica at the
earliest possible moment because of this, whereas uneducated computer
thugs like you insist on using out of date languages to trip up
intelligent people and make you seem smart.
I don't care about your chimerae, no.


I prefer to generate right answers at high speed.

Then why do you use an out of date and unsafe language? Dijkstra
apparently found C beneath contempt in a *tour d'horizon* of
programming and its languages at http://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/EWD1284.html.
He mentions Fortran because Fortran, in its own crude way, to him
broke ground. He does not mention C because C ignorantly swiped ideas
from Algol and proceeded to become a monstrosity.

Even if your code is correct, it is unmaintainable, since you cannot
force the next programmer to know all your tics and quirks.
 

Antoninus Twink

heavy use of casts prevents the compiler from diagnosing certain
errors

void (char *s)

A poor example, since this requires a mandatory diagnostic message.
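Something like this might be closer to the point Nick was reaching for (a
sketch of my own, not his): without the cast the assignment is a constraint
violation and a diagnostic is required; with the cast the mismatch is
accepted silently.

int main(void)
{
    char c = 'x';
    char *s = &c;
    int *p;

    /* p = s;   without a cast: incompatible pointer types,
                a diagnostic is required */

    p = (int *)s;   /* with a cast: accepted without complaint, even
                       though s may not be correctly aligned for int
                       and dereferencing p would be undefined */
    (void)p;
    return 0;
}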
 

Bill Cunningham

Richard,

    I seriously doubt, although I could be wrong, that learning just from
kandr2 one could get as good at code as yourself or Ben Pfaff. Do you
agree? I don't know whether or not to get into kandr2 or start with C
Unleashed. As long as I have the time and my mind stays with me long enough
to learn "real" C code.

Bill
 

Nobody

No, it's easy (see below).

Don't make the mistake of believing Antoninus Twink. Casts are almost
never a good idea, and the places where they are a good idea are not
the places you'd think.

Here is an example of using qsort without a cast:
I see no need for a cast anywhere here.

Using an implicit cast to/from void* is cheating. By that token, you can
eliminate all casts with e.g.:

int *ptr_to_int(void *x) { return x; }
void *ptr_from_int(int *x) { return x; }

and so on.
 

Keith Thompson

Nobody said:
Using an implicit cast to/from void* is cheating. By that token, you can
eliminate all casts with e.g.:

int *ptr_to_int(void *x) { return x; }
void *ptr_from_int(int *x) { return x; }

and so on.

First, there was no cheating in Richard's code. Since he referred to
the converted arguments several times, it made sense to store them in
local variables. The initialization of the local variables performed
the conversion implicitly.

And no, you can't eliminate *all* casts by implicitly converting to
and from void*. Conversions between pointers and integers, or between
function pointers, can only be done with explicit casts. (You could
do some kind of type punning with unions or pointer conversions, but
that only works if the conversions don't change the representation.)
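A minimal sketch of the conversions Keith means (uintptr_t is an optional
type in C99, so treat that half as conditional, and the add_one function is
purely illustrative):

#include <stdint.h>
#include <stdio.h>

static int add_one(int x) { return x + 1; }

int main(void)
{
    int n = 5;

    /* pointer <-> integer: an explicit cast is required in both directions */
    uintptr_t addr = (uintptr_t)&n;
    int *back = (int *)addr;

    /* converting between different function pointer types also requires
       an explicit cast; the pointer must be converted back to its
       original type before being called */
    void (*generic)(void) = (void (*)(void))add_one;
    int (*original)(int) = (int (*)(int))generic;

    printf("%d %d\n", *back, original(n));
    return 0;
}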
 

spinoza1111

First, there was no cheating in Richard's code.  Since he referred to
the converted arguments several times, it made sense to store them in
local variables.  The initialization of the local variables performed
the conversion implicitly.

And no, you can't eliminate *all* casts by implicitly converting to
and from void*.  Conversions between pointers and integers, or between
function pointers, can only be done with explicit casts.  (You could
do some kind of type punning with unions or pointer conversions, but
that only works if the conversions don't change the representation.)

Wouldn't it make sense to always use a cast in C to simulate a modern
language? Since C was developed we have learned that implicit type
conversions are a toxic "convenience". Can one always use explicit
casting in C?

Richard cheats every time he posts code. It always exploits the
special case, here the use of qsort. He thinks he's being cute. But
it's tiresome.

Qsort is a very bad tool. If the programmer of the qsort call is
different from the coder of the exit logic, they have to agree on the
type. If they fail to do so, qsort will fail in the worst possible
way: by producing wrong answers with no error indication.

C is a worse language than Cobol.
 

spinoza1111

Bill Cunningham said:



"C Unleashed" is not for beginners.

Nor is it for anyone else. It might make a good doorstop, or anchor
for a very small boat.
I recommend you stick with K&R2
for the time being (if you really wish to persist with this). Until
you've mastered K&R2, you're not going to get much, if any, value
from "C Unleashed", because it works from the premise that you
already have a reasonably solid knowledge of C - and, without wishing
to discourage you in any way, quite frankly you do not have that
level of C knowledge yet.

Patronising little sod, aren't we?
 

spinoza1111

I cleaned up your post a bit






heavy use of casts prevents the compiler from diagnosing certain
errors

    void (char *s)
    {
         char c;
         c = (char*)c;
    }

Point taken. The basic problem is that NO type information is
available at run time, so it's impossible to catch type errors. This
results in the worst type of bug: the bug no-one knows about.

Speed? Screw speed. I am for one tired of macho but inadequate little
programmers who are working for insurance companies and banks and who
fantasize that they're doing rocket science. We know (and Kernighan
and Ritchie did not) how to use just in time compilation to maximize
speed where maximization of speed belongs.

I met Brian Kernighan at Princeton in 1987, and I told him that I'd
admired his early work in programming style. However, I was not at all
impressed with Ritchie's nonworking "regular expression" code in
Beautiful Code, because it showed a failure to understand twenty-plus
years of work in software, esp C Sharp and Java.
Algol-60 is frequently quoted as an influence on the C language

The fathers of Algol 60 have never acknowledged this.
 
