why is casting malloc a bad thing?


Paul Hsieh

Mark A. Odell said:
You *shouldn't* use malloc in C++!

Why not? What if you wish to explicitly make sure constructors are
not invoked for all entries? I don't think "new" has an option where
you can turn this off.
True but this is not a style issue.

No, it's a language weakness issue. One of the few good things to come
out of the C99 spec is the adoption of C++'s requirement for
prototypes, and not assuming the default int ()(int). But since the
OP is highly unlikely to have practical access to a C99 compiler
within the next 5 years, the simpler solution is simply to compile
with C++ (after making your code C++ safe, of course) to catch these
and other kinds of errors.
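
A minimal sketch of the kind of error this catches (assuming a C90
compiler for the C case; the file and function names are made up for
illustration, not code from the thread):

/* demo.c - note that <stdlib.h> is deliberately missing */
#include <stddef.h>        /* size_t only; malloc is never declared */

double *make_buffer(size_t n)
{
    /* C90: malloc is implicitly declared as returning int, and the
     * cast silences the int-to-pointer diagnostic, so this compiles
     * quietly even though the returned pointer may already be mangled.
     * C++ rejects the call to an undeclared function outright, and
     * C99 at least requires a diagnostic, so the mistake is caught
     * at compile time.
     */
    return (double *)malloc(n * sizeof(double));
}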
 

donLouis

You are right. Stroustrup's opinion matters on C++, and your opinion
matters on C2, but neither of your (neither's of you?) opinion matters
on C. C++ is Stroustrup's language and C2 is yours, fine, you can keep
them. Neither of you has claim over C.

I believe it's "neither of your opinions".
 

Paul Hsieh

Papadopoulos Giannis said:
I've read nearly the whole thread, although along the way I
lost my path.

I assume the following:

d = malloc(50*sizeof(*d));

They all generate equivalent object code (when they are correct, and
as intended.)
o the most portable

It's not portable to anal C++ compilers set at their most pedantic
warning level.
o it reminds you to include stdlib

Or malloc.h on some systems. However if you use a C++ (or C99)
compiler to compile your C code, you'll get the lack of prototype as
an error whether you cast or not.
o changing type of d does not affect anything

That is about its only real advantage.
d = (double*)malloc( 50*sizeof(double) );

On AMD64 (aka x86-64) pointers have long long (or __int64)
representation. On 16bit DOS systems pointers can have a kind of
int[2] representation. In general pointers should not be assumed to
be of size int on code which is supposed to be portable.
o it gives a good hint on what d's type is

Actually the declaration of d is the only credible source of what type
d is.
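
A small sketch of that size mismatch; the values in the comments are
what a typical LP64 system prints, other platforms differ, which is
exactly the point:

/* sizes.c - why "a pointer fits in an int" is not portable */
#include <stdio.h>

int main(void)
{
    printf("sizeof(int)    = %u\n", (unsigned)sizeof(int));    /* commonly 4 */
    printf("sizeof(void *) = %u\n", (unsigned)sizeof(void *)); /* commonly 8 on LP64 */
    /* If malloc() is mis-declared as returning int, its result can be
     * truncated to int width before any cast is even applied. */
    return 0;
}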
 

Allin Cottrell

Sidney said:
With Mr. Plauger, I am amazed by the complete inability demonstrated by
many of the 'anti-cast' crowd to admit even a hint of nuance in their
thought process on this issue. Being 'pro-cast' myself...

As I see it, as a member of the 'anti-cast' crowd, the problem is
with the very notion of a 'pro-cast' position (a la Tisdale). We
see no possible rationale for this in terms of the C programming
language.

On the other hand, P.J. Plauger's basic position seems perfectly
reasonable, namely that there are some special contexts where a
body of code has to be compiled indifferently as C or as C++, and
in those special contexts casting the return from malloc is
required (and does no harm if we can assume that the coder is
astute enough to ensure that <stdlib.h> is always #included).

This, it seems to me, is not a 'pro-cast' position: it's just
saying that real-world considerations other than "good C
programming" sometimes dictate a cast. Fair enough.
 

Richard Heathfield

Paul said:
Or malloc.h on some systems.

<stdlib.h> is always correct for malloc.
However if you use a C++ (or C99)
compiler to compile your C code, you'll get the lack of prototype as
an error whether you cast or not.

On C90 compilers, however, you won't; and C90 compilers are still the most
widely-used C compilers.
That is about its only real advantage.

You forgot the other advantages to omitting the cast:

1) doesn't add pointless, unnecessary code;
2) doesn't suppress an important required diagnostic;
3) gives you a heads-up if you accidentally compile your C code with a
compiler for some other language with notionally similar syntax but
different semantics.
Actually the declaration of d is the only credible source of what type
d is.

That's certainly true.
 

Richard Bos

(e-mail address removed) (Wayne Throop) wrote:

[ Please do not remove attribution lines. ]
:::: If you write malloc calls without casts, it's not because it's
:::: necessarily good programming practice but because your grandfather did.

::: But this is the one place where you're an idiot. A complete one.

:: That's an 'Ad Hominem' argument

: So is Plauger's assertion that those of us who disagree with him do so
: out of mere tradition.

Let me get this straight.

"Your lack of cast on malloc is traditional,
not a superior programming technique"
and
"You are an idiot"

are considered equally ad hominem.

Yes. "You do not think when you don't cast malloc()" and "you do not
think when you do cast malloc()" are equally ad hominem.
After all, Mark didn't call Plauger an idiot per se, as you make him out
to have done; he quite specifically claimed that _this_ matter is the
only one where he considers him so.
On the Nth hand, something like

p = malloc(sizeof *p);

is particularly handy, because you can change the type of p
without having to dive into the code everywhere. Better (IMO)
would be something like

p = (typeof *p)malloc(sizeof *p);

but I don't think that works.

It doesn't work in C (though it might in Gnuck), it is probably wrong
anyway (shouldn't that be typeof p?), but most importantly: in what way
does it give you any more information at all than the first line? I see
absolutely no advantage to it whatsoever.

Richard
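
For completeness, a sketch of what the GNU C extension version would
look like (__typeof__ is a GCC extension, not ISO C of that era, and
as noted it buys nothing over the cast-free line):

#include <stdlib.h>

int main(void)
{
    double *p;

    p = (__typeof__(p))malloc(50 * sizeof *p);  /* GNU C only; the cast tracks p's type */
    free(p);

    p = malloc(50 * sizeof *p);                 /* portable, and says just as much */
    free(p);
    return 0;
}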
 

Richard Bos

Default User said:
How are casts from malloc() superfluous? Code should work identically
whether the casts are there or not.

Yes. Which is why they're superfluous.
However, code that uses casts is
different from that which does not in cases of failure to properly
declare malloc(). In that case they convert an int into a pointer.

Erm... no. They convert something the programmer _thinks_ is a pointer,
but has implicitly _declared_ to be an int, and which may not actually
exist or have a valid value, into a pointer. Without so much as a
warning.

Richard
 

Richard Bos

P.J. Plauger said:
Not in that sentence. It's a bald statement that happens to be untrue.
Now, had he said "there is no such thing as an invalid conversion from
malloc(sizeof(double)) to double*" I'd be quick to agree. But he didn't.

I'd agree with that if the line containing the malloc() call wasn't
still present and visible in the post in question.

Richard
 

Richard Bos

Papadopoulos Giannis said:
Ugly style or not, I got used to it and it is really difficult to adopt
a new style...

Anyone who does not like my code always has the choice of a
beautifier... And I don't think it's all that bad...

Of course. De gustibus, &c. It's just that _I_ think it's ugly, and as
everybody should know, in matters of taste I am always right ;-)

Richard
 

Richard Bos

They all generate equivalent object code

Non sequitur.
It's not portable to anal C++ compilers set at their most pedantic
warning level.

Well, duh. It's not portable to Pascal compilers, either. C is not C++.
Or malloc.h on some systems.

If they're broken, perhaps. Which would explain a lot.
On AMD64 (aka x86-64) pointers have long long (or __int64)
representation. On 16bit DOS systems pointers can have a kind of
int[2] representation. In general pointers should not be assumed to
be of size int

Or, indeed, equivalent to _any_ kind of integer.
Actually the declaration of d is the only credible source of what type
d is.

Quite.

Richard
 

Joona I Palaste

It doesn't work in C (though it might in Gnuck), it is probably wrong
anyway (shouldn't that be typeof p?), but most importantly: in what way
does it give you any more information at all than the first line? I see
absolutely no advantage to it whatsoever.

Well, it does have the advantage that if the type of p is ever changed,
then the compiler will... oh, sod that.
Well, it does have the advantage that if malloc() is incorrectly
prototyped, its return value will be... oh, sod that.
Well, it does have the advantage that it makes the code valid C++...
except C++ doesn't have a "typeof" operator.
Well, at least it does have the advantage of keeping people like
Trollsdale and PJ Plauger happy when they see that the cast is in
there. =)
 

Holger Hasselbach

Paul said:
Papadopoulos Giannis wrote:
d = (double*)malloc( 50*sizeof(double) ); [...]
o it gives a good hint on what d's type is

Actually the declaration of d is the only credible source of what type
d is.

There are three situations where the types of d and *d are relevant:

Assign a value: d[10] = 1.5;
Read a value: a[5] = d[10] * 5.3;

Low level memory management including pointer arithmetic:
d = malloc(50 * sizeof(*d));
memcpy(&d[10], &a[5], 5 * sizeof(*d));
a = d + 10;

The latter examples will work for any type of *d, assuming that *a has
the same type. In fact, the type does not matter at all. If you (P.
Giannis) like a good hint in these situations, you will love a good
hint for assign and read, since you will get unexpected results when
d[10] is an integer type. Then, Hungarian notation might be the tool
for you, making things worse.


Holger
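
A compilable sketch of this point, with hypothetical names: only the
declarations pin down the type, and the allocation and copy lines
never need to change.

#include <stdlib.h>
#include <string.h>

int main(void)
{
    double *d;            /* change this one line to, say, "float *d;" ... */
    double a[50] = {0};   /* ... and this one, so *a still matches *d      */

    d = malloc(50 * sizeof *d);
    if (d != NULL) {
        memcpy(&d[10], &a[5], 5 * sizeof *d);
        free(d);
    }
    return 0;
}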
 

Bjarne Stroustrup

Joona I Palaste said:
You are right. Stroustrup's opinion matters on C++, and your opinion
matters on C2, but neither of your (neither's of you?) opinion matters
on C. C++ is Stroustrup's language and C2 is yours, fine, you can keep
them. Neither of you has claim over C.

On the other hand, my opinions on C matter as much as those of any other
experienced and long-term member of the C community and contributor to
the C language. The fact that I designed C++ doesn't disqualify me
from having valid opinions on C. This is especially true as I present
my opinions in the context of technical arguments, rather than as glib
snippets in postings. For example, see my papers on C/C++
compatibility, which you can download from
http://www.research.att.com/~bs/papers.html

-- Bjarne Stroustrup; http://www.research.att.com/~bs
 

Joona I Palaste

On the other hand, my opinions on C matter as much as those of any other
experienced and long-term member of the C community and contributor to
the C language. The fact that I designed C++ doesn't disqualify me
from having valid opinions on C. This is especially true as I present
my opinions in the context of technical arguments, rather than as glib
snippets in postings. For example, see my papers on C/C++
compatibility, which you can download from
http://www.research.att.com/~bs/papers.html

You are right about that, of course. You are an experienced C programmer
and your opinion about C matters just as much as any other experienced C
programmer's. But the fact that you invented C++ doesn't make your
opinion about C any more special than anyone else's. It just makes your
opinion about _C++_ more special.
 

John Bode

Papadopoulos Giannis said:
I've read nearly the whole thread, although along the way I
lost my path.

I assume the following:

d = malloc(50*sizeof(*d));

You have to #include stdlib.h (or somehow have the correct prototype)
regardless.
o changing type of d does not affect anything
o it looks a bit funny

On the contrary, it looks perfectly normal to me.
d = malloc(50*sizeof(double));

Maybe, maybe not. If you change d to a smaller type, you just
allocated more memory than you needed. Internal fragmentation is
never good, but it's not always disastrous.
d = (double*)malloc( 50*sizeof(double) );

You still have to #include stdlib.h, or somehow have a correct
prototype in scope. Without a prototype in scope, malloc() is assumed
to return int, which may or may not have the same size as a void *.
This means the return value may be truncated or stored incorrectly.
The cast *does not* obviate the need for a correct prototype. All the
cast does is convert the (potentially garbage) return value into a
(potentially invalid) pointer to double.

Pointers need not be the same size or representation as unsigned ints.
Pointers to different types need not be the same size or
representation; e.g., a pointer to int may have a different size and
representation from a pointer to void.
o it gives a good hint on what d's type is

Unless someone changed the type of d.
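
To make the contrast concrete, a sketch under the assumption of a C90
compiler: with the proper #include the two forms below are equivalent;
remove it and only the uncast one draws a diagnostic.

#include <stdlib.h>

int main(void)
{
    double *d1, *d2;

    d1 = malloc(50 * sizeof *d1);               /* idiomatic, cast-free    */
    d2 = (double *)malloc(50 * sizeof(double)); /* legal, but buys nothing */

    free(d1);
    free(d2);
    return 0;
}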
 

Default User

Richard said:
Yes. Which is why they're superfluous.

No. Code does not work identically.
Erm... no. They convert something the programmer _thinks_ is a pointer,
but has implicitly _declared_ to be an int, and which may not actually
exist or have a valid value, into a pointer. Without so much as a
warning.

How does that invalidate what I said? Whether the programmer thinks so
or not, it does convert an int into a pointer. Some int, from somewhere. My
point was that casts on malloc() aren't superfluous, which would mean merely
more than what is needed; they just plain are different in this context.

Even when a proper declaration for malloc() is in scope, there are the
cases where there is a pointer mismatch:

int *p;

p = (double *)malloc(sizeof (double));


Still not superfluous in this case. The code works differently without
the cast.
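
Spelled out as a compilable sketch (the cast line is commented out
only so the file compiles; it is the line a compiler would flag):

#include <stdlib.h>

int main(void)
{
    int *p;

    /* With the cast: assigning double * to int * is a constraint
     * violation, so the type mismatch draws a diagnostic:
     *
     *     p = (double *)malloc(sizeof(double));
     *
     * Without the cast: void * converts quietly to int *, and the
     * allocation tracks whatever p is declared to point at:
     */
    p = malloc(sizeof *p);
    free(p);
    return 0;
}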


I'm personally going to stick with no cast. And I think the general case
of C++ compatibility is a non-starter for most applications. I don't
write C code to compile under C++. When I want C++, I write it.




Brian Rodenborn
 

Dan Pop

In said:
Sorry, that's not quite right. The cast wasn't required in many/most
early C compilers. Some compilers experimented with stronger type
checking for pointers (a bit of the prior art we drew on when we
decided it was safe and advisable to add stronger type checking
in Standard C). Some of us felt that better documentation of
intended type conversions was advisable. But for whatever reason,
Kernighan at least went through a period when he saw fit to write
the casts.

IIRC, the pre-ANSI C malloc returned char *. That required an
explicit cast any time its return value was not assigned to a char *.
So, it is very likely that Kernighan was doing it out of habit.

But there is another reason, mentioned in the preface:

We used Bjarne Stroustrup's C++ translator extensively for local
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
testing of our programs, and Dave Kristol provided us with an ANSI
^^^^^^^^^^^^^^^^^^^^^^^
C compiler for final testing.

Since C++ requires the cast, the authors had no choice but to use it.

Dan
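
For reference, a sketch of the pre-ANSI convention described above
(K&R-era style, shown for illustration only; it will not pass a
modern strict compiler):

char *malloc();                  /* pre-ANSI: no prototypes, malloc returned char * */

double *make_buffer(n)
unsigned n;
{
    /* As described above, using the char * result as a double *
     * called for an explicit cast. */
    return (double *)malloc(n * sizeof(double));
}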
 

Bob Doherty

I am continually amazed at people who think Stroustrup (I assume this
is whom you meant) magically gets the last word in this argument. He
clearly has an interest in promoting C++, and that very likely
includes persuading C programmers to switch. What makes you think
he's an impartial observer? For that matter, what makes you think
he's right?

Stroustrup uses English carefully, and what he stated (in "The C++
Programming Language Third Edition") is subtly different, and compared
to Mark Bruno's statement above, defensible and correct. He actually
states "However, *good* C programs tend to be C++ programs. For
example, every program in Kernighan and Ritchie, The C Programming
Language, 2nd Edition, is a C++ program."(emphasis in the original).

In an appendix he gives an exhaustive treatment of when, and why, C
programs are not C++ programs, and when the two are semantically
different.
 

Mark McIntyre

I'm beginning to understand the communication problem here.
I've never espoused either of these positions, but that seems
to be what you've read.

I'm finding it hard to read your postings in another way. You're on
record as saying you think that the casts should be there. Charitably
that puts you in group 1.
Uh yes, I *do* consider intolerance a lack of respect.

What, you don't have any coworkers or friends who have some small
thing that annoys you? Bullshit, to coin a phrase.
And yet you can be intolerant. An interesting pair of positions
to reconcile.

When you can explain why, I'll be interested to read it.
 
