casts


spinoza1111

<snip off topic material>

This isn't a usenet group to discuss Algol, ADA, Fortran, COBOL, C++,
Visual Basic or politics.

Please restrict your posts to the C language.

What you don't want to hear is that actual computing praxis can lead
to homeless Iraq veterans: but this is the case. In German
universities under Hitler, scientists were likewise admonished to
"stick to topic": Godwin converges to the limit of unity here because
the same Fascist mechanisms are at work.
 

websnarf

So many people I have noticed in their code use casts. Something
Richard Heathfield said once very much sticks out in my mind. If code is
written properly there is no need (or rarely) for casts. Why do they seem to
be used so much then?

That's the same question as asking why people use "void *". It's used
whenever you are doing any sort of polymorphic behavior. Often it
makes sense to create a generic function like:

int parseLines(void *ctx, int (*cb)(void *ctx, const char *line),
               const char *input);

The idea is that you parse your input into a sequence of lines. But
you don't predetermine what it is you are going to do with each of
those lines. You can, in fact, reuse this code for any time you want
to send a sequence of lines to some processing. So the idea is that
for each line, the function:

cb (ctx, line);

is called with, say, negative results used as an abort code that is
passed out of parseLines(). Note that ctx is just passed through from
the caller straight to the cb function. Now, if you only ever use one
callback function (cb) then you might as well hard-code it and pick
the exact type for ctx (which keeps track of whatever state you
require, like say an MD5 accumulator, or whatever) and not bother with
this level of abstraction. But the point is that if you want to use
various different callbacks which have various different states, then
forcing the state to be held in a void * pointer lets you write
parseLines() only once.

Your cb function itself would take the (void*) ctx pointer and cast it
back to the pointer to the state structure that it knows how to
handle.

As you can see, unless you can rewrite parseLines() with macros
(highly undesirable) the way you can achieve re-usability or
genericity is through the use of void *, which will ultimately require
a cast to re-establish your type. Trying to absolutely avoid casts at
all costs is really not worth it. You should probably minimize the
use of them, but sometimes it is truly best to use them.
 

Dik T. Winter

> I will admit that in 1970, Algol probably would not have been a good
> choice for writing OS code, and my understanding is that this is what
> Ritchie wanted to write.

Algol-60 would never have been a good choice for that. The language was
not designed with that in mind. The language was primarily intended as
a way to communicate algorithms between *people*. That is why the
algorithms section of the CACM was filled with algorithms written in Algol-60.
 

Dik T. Winter

> Had the United Nations had the political and military clout its
> founders had intended for it, UNESCO would have created a world
> programming language and saved us all a great deal of time for nobler
> pursuits:

Do you know who hosted the conference where Backus presented IAL, aka
Algol-58, the forerunner of Algol-60, for the first time? Lack of
knowledge about the history of computing on your side, perhaps?
 

Richard Tobin

Richard Heathfield said:
No, it doesn't need to do that. The canonical example is qsort's
comparison function, where no cast is required:

int double_cmp(const void *vleft, const void *vright)
{
const double *left = vleft;
const double *right = vright;
return (*left > *right) - (*left < *right);
} /* look, ma, no casts */

For clarity (and perhaps, in some circumstances, for efficiency) it is
usually better to dereference an unneeded pointer as soon as possible,
and do it just once. Of course one could write:

const double *leftp = vleft;
const double *rightp = vright;
double left = *leftp;
double right = *rightp;

but I see no advantage to that over:

double left = *(double *)vleft;
double right = *(double *)vright;

Casts are rarely necessary, but are sometimes the natural way to do it.

Observe, by the way, that the locality of use implicit in a cast
rather than an assignment makes the use of "const" pointless.

-- Richard
 

Dik T. Winter

....
> Look at his 1999 *tour d'horizon*.

EWD 1284? "Computing Science: Achievements and Challenges"

"It is a pity that they were called programming languages, but apart from
that unfortunate name, FORTRAN and LISP have been the great contribution
of the 50s."

However:

"It is much harder to get lyrical about FORTRAN, which as a programming
language was denied the epithet 'higher-level' already in 1962 (in Rome),
but I should be grateful for I learned a few things from it. During an
oral examination I had a student develop a program which we would now
recognize as implementing pointer manipulations using a one-dimensional
array. The candidate did very well until at the very end he got
mysteriously stuck, neither of us understanding why. It turned out that
he should have written down 'a[a[k]] := ...' but that mental block
prevented him from conceiving that because FORTRAN (to which he had been
exposed extensively) did not allow index expressions as complicated as
'a[k]'. It was a revealing warning of the devious influence the tools
we use may have on our thinking habits."

Or:

"The inability to think about programs in an implementation-independent
way still afflicts large sections of the computing community, and FORTRAN
played a major role in establishing that regrettable tradition."

It does not look entirely like praise...
>
> PL/I influenced the lads too: but there is a cultural ban on crediting
> IBM, because Kernighan and Ritchie were already living the foolish
> fantasy that they were fighting the Forces of Darkness.

Oh, apparently you know better what Kernighan and Ritchie were thinking
than they did themselves. Of course they knew PL/I; it is written in the
article I referred to. But they did not think it was adequate, and
apparently that is a "foolish fantasy that they were fighting the forces
of darkness". Yeah.
> But these problems have been overcome, no thanks to C. C was a net
> waste of time. Basically, the US was damned if the Europeans were to
> go ahead with Algol since that would disrupt the Cold War hegemony of
> the US. Therefore, difficulties in the Algol development process which
> also occurred in the development of Fortran were exaggerated by US
> computer media.
Oh.

> In my view, it would have been better for programmers world-wide to
> wait for Algol while using assemblers with conditional and macro
> facilities: this technology was good to go in the 1950s. Instead,
> Fortran was adopted and although it was supposed to enable end users
> to program, it merely created one of the first hordes of coding bums.

Well, in Europe Algol was by and large adopted.
> The problem with Algol was, IMO, that whenever developers started
> actually THEORIZING (thinking), their managers at banks and in
> government would flip out because as my fat pal Adorno noticed in the
> 1930s when he worked in market research, capitalism is anti-thought:
> the "thinker" is suspected of soldiering and wasting time in
> "scientific" management. The quotations from Wittgenstein in the Algol
> report were probably considered too la-de-da, not only by American
> auto industry managers, but also, probably, by Dutch merchant seamen,
> Danish agribusiness managers, and other Practical Men.

Oh, well, Fokker did the computations on its airplanes in Algol 60. Did
they not have managers? Or were they not practical?
 
