When did K&R function declarations become obsolete?

Alan Mackenzie

I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]
 
88888 Dihedral

The _fastcall, _cdecl, _pascal keywords were used everywhere.

Do you really know how C functions are called at run time?
 
Kaz Kylheku

I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

These declarations are still a feature of the 1999 C language, but marked as
obsolescent in ISO 9899:1999:

6.11.7 Function definitions

1 The use of function definitions with separate parameter identifier and
declaration lists (not prototype-format parameter type and identifier
declarators) is an obsolescent feature.

(If you want to know about the current draft status of this, Google it up;
I don't know.)
Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

They likely appear in some nonzero but impossible-to-quantify amount of
legacy code.

Some open source programs still use old-style definitions, coupled with
prototype declarations.

E.g.:

int func PROTO(( char *, int ));

/* ... */

int func(a, b)
char *a;
int b;
{
/*...*/
}
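
(A macro like PROTO is typically defined along these lines, so that one
header serves both pre-ANSI and ANSI compilers; the exact name varies by
project, and zlib's OF() and BSD's __P() are the same idea:)

#ifdef __STDC__
#define PROTO(args) args   /* ANSI/ISO compiler: keep the real prototype */
#else
#define PROTO(args) ()     /* pre-ANSI compiler: fall back to an old-style
                              declaration with an empty parameter list   */
#endif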

Are these programs rare? In some sense, yes, but in fact we do not have to go
far to find an example.

For instance, oh, the GNU C Library (glibc):

http://sourceware.org/git/?p=glibc.git;a=blob;f=io/open.c

Snippet retrieved Sat Dec 10 10:40:55 PST 2011:

/* Open FILE with access OFLAG. If OFLAG includes O_CREAT,
   a third argument is the file protection. */
int
__open (file, oflag)
     const char *file;
     int oflag;
{
  int mode;

  if (file == NULL)
    {
      __set_errno (EINVAL);
      return -1;
    }

Approximately when did they become obsolete, i.e. not used at all for new
code?

This is impossible to know, without a crystal ball that peers into
what everyone on the planet is coding.
 
Ben Pfaff

88888 Dihedral said:
The _fastcall, _cdecl, _pascal keywords were used everywhere.

....in Windows and perhaps MS-DOS code and nowhere else, and are
completely orthogonal to the transition from K&R to
prototype-style function declarations.
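
For illustration, the two axes are independent (a two-line sketch using
the Microsoft-specific spellings and hypothetical function names):

int __cdecl sum_proto(int a, int b);   /* prototype declaration, explicit calling convention */
int __cdecl sum_knr();                 /* old-style declaration, same calling convention     */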
 
Jens Thoms Toerring

Alan Mackenzie said:
I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.
Presumably these declarations are still widely used in compiler validation
suites but nowhere else.
Approximately when did they become obsolete, i.e. not used at all for new
code?

When I started using C in 1989 they were still quite common in the
code I was looking at, but that dropped significantly during the
following decade, and I have hardly ever seen code using them written
in this millennium. So my very personal guess is that most people
have been using the "new" form since at least about 1995. But then I
did my programming nearly exclusively under UNIX; things might be
different on platforms where "modern" compilers weren't available
that easily. And, of course, people having to cater for the very
smallest common denominator (i.e. systems where no C89-compliant
compiler exists) still have to use them in one form or another (as
the example Kaz came up with might illustrate).

If this is for the Emacs C mode then, since the standard still
requires support for the "old way" (even though it might be declared
as obsolete), I would think it prudent not to throw it out
completely.
Regards, Jens
 
Keith Thompson

Alan Mackenzie said:
I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

I'm not aware that they've ever become completely obsolete in that
sense. One poster in comp.std.c, Jun Woong, is apparently still using
them (unnecessarily IMHO, but I can see his point); see Message-ID
<http://groups.google.com/group/comp.std.c/msg/819a7f3553366ef8>.

They've been *obsolescent* since C89 (see 6.9.4 and 6.9.5 in the
C90 standard, presumably 3.9.4 and 3.9.5 in the C99 standard),
and remain so in the latest C201X draft.
 
Ralph Spitzner

Alan said:
I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Sorry, but having read the thread so far I still don't get it.
Are you guys talking about declarations, or implementations?

After all to avoid an implicit declaration as 'int (void)'
(or (int), for that matter) one would have to declare in some.h like:

char *myfunc(char *);

then implement it as either

char *
myfunc(char *thestring)
{
....
}

or old

char *myfunc(thestring)
char *thestring;
{
....
}

What am I missing here, a proper C course? :p

-rasp
 
ralph

I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

As Mr. Thompson pointed out the closest to an "official" date would be
C89; however, K&R fell out of common use long before that. Most C
compilers I used by the early 80s offered vendor specific
implementations that supported the "ANSI style", though there wasn't
any official 'ANSI' yet.

The only C compiler I remember by name and version is Microsoft C 3.0
(XENIX) and it was what we would call fully "ANSI compatible" in 1985.
Other platform compilers I used earlier all had the ANSI-style option
available before then. And we all quickly jumped at using it whenever
it was available.

[The reason I'm so vague is because back then just about every box
came with its own "C compiler" or management sprang for some 3rd-party
"development package" - the fine-print contained a copyright of some
sort but you seldom paid much attention to it, unless there was a
difference in some obscure library. And then you were likely to code a
conditional compile for the "VAX-C", or for accounting's "XENIX-BOX",
etc.]

I've been trying to remember when I last was forced to use pure K&R
and I think it was a DEC VAX/VMS project in 1983???

-ralph
 
Keith Thompson

Keith Thompson said:
They've been *obsolescent* since C89 (see 6.9.4 and 6.9.5 in the
C90 standard, presumably 3.9.4 and 3.9.5 in the C99 standard),
and remain so in the latest C201X draft.

Sorry, I meant 3.9.4 and 3.9.5 in the ANSI C89 standard. (ISO C90
described the same language, but renumbered the sections.)
 
Keith Thompson

Ralph Spitzner said:
Sorry, but having read the thread so far I still don't get it.
Are you guys talking about declarations, or implementations?

The question was about declarations. (By "implementations", I presume
you mean function definitions.)
After all to avoid an implicit declaration as 'int (void)'
(or (int), for that matter) one would have to declare in some.h like:

char *myfunc(char *);

Or as

char *myfunc(char *thestring);

Or, if for some reason you want to use an old-style declaration, you
could declare it as:

char *myfunc();

and as long as you call it with a char* argument you're ok.
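
A minimal sketch of why that "as long as" matters (the definition body
here is made up for illustration; it would normally live in another
translation unit):

/* lib.c -- the definition */
char *myfunc(char *s) { return s; }

/* caller.c -- only the old-style declaration is visible */
char *myfunc();                    /* parameter types left unspecified */

char *use_it(void)
{
    return myfunc("hello");        /* OK: the argument matches the definition */
    /* myfunc(42) would also compile without a diagnostic, but calling
       the function with the wrong argument type is undefined behavior. */
}
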
then implement it as either

char *
myfunc(char *thestring)
{
...
}

That's the preferred way, yes.

(I suppose you can have a new-style declaration with an old-style
definition, or vice versa. I never really thought about it, since I
avoid both old-style declarations and old-style definitions.)

C99 no longer has implicit int declarations. Calling a function with no
visible declaration is a constraint violation. (That declaration may or
may not be a prototype, though).
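
A small sketch of that rule (hypothetical function f):

int g(void)
{
    return f(42);    /* C90: accepted, f implicitly declared as "extern int f()";
                        C99 and later: constraint violation, diagnostic required */
}

/* Either of these declarations, placed before g(), satisfies C99;
   only the second one is a prototype: */
int f();             /* old-style declaration */
int f(int);          /* prototype             */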
 
Ben Bacarisse

ralph said:
I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

As Mr. Thompson pointed out the closest to an "official" date would be
C89; however, K&R fell out of common use long before that. Most C
compilers I used by the early 80s offered vendor specific
implementations that supported the "ANSI style", though there wasn't
any official 'ANSI' yet.

In contrast, I remember using old-style declarations into the early
90s for maximum portability. These were the days when a lot of software
was expensive, and not everyone upgraded their compilers as soon as they
might.

It's possible that I was being over-cautious -- maybe everyone in the world
had a prototype-aware C compiler by 1990 -- but that seems unlikely.

<snip>
 
Eric Sosman

ralph said:
[...]
As Mr. Thompson pointed out the closest to an "official" date would be
C89; however, K&R fell out of common use long before that. Most C
compilers I used by the early 80s offered vendor specific
implementations that supported the "ANSI style", though there wasn't
any official 'ANSI' yet.

You encountered "ANSI style" compilers in the "early 80s?" Six,
seven, eight years before the Standard became final? That's some
pretty impressive tea-leaf reading ...

The Standard was a moving target during most of its development.
The first draft I read included the `noalias' keyword -- and that was
in 1988, only about twenty months before ANSI's formal adoption! Did
the compilers you recall support `noalias'?
In contrast, I remember using old-style declarations into the early
90s for maximum portability. These were the days when a lot of software
was expensive, and not everyone upgraded their compilers as soon as they
might.

Ben's experience matches mine, but it wasn't a reluctance to
upgrade compilers that motivated us to keep using old style: It was
the large body of existing old-style code, and the certainty that
we would introduce gratuitous bugs in the course of a translation
effort. For some strange reason, our engineering management was not
highly motivated to expend programming and Q/A effort on a project
that would have no customer-visible effect except new bugs ...

We used ANSI-capable compilers to process our K&R-style code.
On some platforms we had to implement work-arounds to get pre-ANSI
effects from an ANSI compiler, just so our pre-ANSI code would continue
to work as it had.
It's possible that I was being over-cautious -- maybe everyone in the world
had a prototype-aware C compiler by 1990 -- but that seems unlikely.

Not in 1990, certainly. By 1993 or so I think "everyone who was
anyone" offered an ANSI/ISO implementation. Still, though, the amount
of code written post-ANSI was tiny compared to what had been written
in the preceding twenty years.

Also, that existing investment in pre-ANSI code couldn't be
ANSIfied easily and mechanically. It wasn't just a matter of adding
prototypes (of which more below), but of sorting out other practices
that had varied across C dialects and that ANSI had standardized.
Replacing `<strings.h>' was fairly simple, replacing `<varargs.h>'
was harder, dealing with whether `unsigned short' promoted to `int'
or to `unsigned int' was harder still, and the litany of glitches
large and small didn't stop there. ANSIfying several million lines
of pre-ANSI code while it was under active development was not a
project for a part-time summer intern!
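
To make the `unsigned short' point concrete, here is a minimal sketch
(assuming 16-bit short and 32-bit int):

#include <stdio.h>

int main(void)
{
    unsigned short us = 0xFFFF;

    /* ANSI's value-preserving rule: us promotes to (signed) int, so this
       compares 65535 > -1, which is true and prints 1.
       Under the older unsigned-preserving rule: us promotes to unsigned
       int, -1 is converted to UINT_MAX, and the comparison is false.   */
    printf("%d\n", us > -1);
    return 0;
}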

As for writing prototype declarations to match K&R-style function
definitions, it's not always possible. The K&R definition expects an
argument that has been subjected to what ANSI calls the "default
argument promotions," and the difficulty is in naming the promoted
type. The implementation gets to make an idiosyncratic decision about
the nature of some types: `enum WhatAmI' is compatible with some kind
of integer, but each implementation gets to make its own decision --
and that decision can influence how a value of that type promotes.
A K&R function with an `enum WhatAmI' parameter may need to be
declared with different prototypes on different systems.
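
A small sketch of the simpler cases (hypothetical names; valid in C89
through C17, since C23 drops old-style definitions entirely). The
prototype that matches an old-style definition has to name the
*promoted* parameter types:

/* The prototype, as it would appear in a header: */
double scale(double factor, int count);

/* The old-style definition: its arguments arrive after the default
   argument promotions, so factor is really passed as double and
   count as int. */
double scale(factor, count)
float factor;
short count;
{
    return factor * count;
}

/* "double scale(float, short);" would declare an incompatible function
   type.  With an `enum WhatAmI' parameter, the promoted type is
   whichever integer type the implementation chose for the enum, which
   is why the matching prototype can differ from system to system. */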
 
James Kuyper

I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

For me personally, non-prototype declarations became (almost) obsolete
as soon as I learned about prototypes, which was shortly after the C
standard was first published. However, many people write code to
standards which require portability to compilers that pre-date the C
standard, even though their code no longer needs to be ported to such
compilers, and hasn't needed such porting for a decade or more.
Therefore, I doubt that they have yet become obsolete in the sense that
you define: "not used at all for new code".

Also, there's still one point where non-prototype function declarations
are useful. Try to declare a prototype for a function taking, as an
argument, a pointer to a function of the SAME type. You'll quickly
discover that the declaration recurses infinitely. Somewhere along the
way, you have to terminate it. There's a number of ways to terminate it,
but the simplest is to use a non-prototype declaration that leaves the
number and type of the parameters unspecified. Other methods, in
principle, run into type compatibility issues if the pointer is actually
used to call a function whose definition uses the same prototype.
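
A minimal sketch of that termination trick (hypothetical names; note
that this relies on the old-style meaning of an empty parameter list,
which C23 removes):

#include <stdio.h>

/* Non-prototype declaration: parameter types left unspecified.  This is
   what terminates the recursion in "a function that takes a pointer to
   a function of its own type". */
typedef int samefn();

/* countdown's type is compatible with samefn because samefn says
   nothing about the parameters; a fully prototyped spelling of this
   type would have to mention itself forever. */
int countdown(int n, samefn *self)
{
    if (n <= 0)
        return 0;
    return 1 + self(n - 1, self);   /* call back through the old-style pointer */
}

int main(void)
{
    printf("%d\n", countdown(5, countdown));   /* prints 5 */
    return 0;
}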
 
88888 Dihedral

I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

For me personally, non-prototype declarations became (almost) obsolete
as soon as I learned about prototypes, which was shortly after the C
standard was first published. However, many people write code to
standards which require portability to compilers that pre-date the C
standard, even though their code no longer needs to be ported to such
compilers, and hasn't needed such porting for a decade or more.
Therefore, I doubt that they have yet become obsolete in the sense that
you define: "not used at all for new code".

Also, there's still one point where non-prototype function declarations
are useful. Try to declare a prototype for a function taking, as an
argument, a pointer to a function of the SAME type. You'll quickly
discover that the declaration recurses infinitely. Somewhere along the
way, you have to terminate it. There's a number of ways to terminate it,
but the simplest is to use a non-prototype declaration that leaves the
number and type of the parameters unspecified. Other methods, in
principle, run into type compatibility issues if the pointer is actually
used to call a function whose definition uses the same prototype.

A portable C program on many platforms? A novice programmer would only
produce a C program so buggy that even the additions or multiplications
used to count a buffer size for malloc would be wrong, or a choked
qsort would go unnoticed.

Forget that idea!
 
ralph

ralph said:
I think we can say that K&R function declarations are truly obsolete. In
~20 years of hacking C, I've only seen very old code which uses them.

Presumably these declarations are still widely used in compiler validation
suites but nowhere else.

Approximately when did they become obsolete, i.e. not used at all for new
code?

[Context: I'm writing documentation for Emacs's C Mode.]

As Mr. Thompson pointed out the closest to an "official" date would be
C89; however, K&R fell out of common use long before that. Most C
compilers I used by the early 80s offered vendor specific
implementations that supported the "ANSI style", though there wasn't
any official 'ANSI' yet.

In contrast, I remember using old-style declarations into the early
90s for maximum portability. These were the days when a lot of software
was expensive, and not everyone upgraded their compilers as soon as they
might.

It's possible that I was being over-cautious -- maybe everyone in the world
had a prototype-aware C compiler by 1990 -- but that seems unlikely.

<snip>

I have no doubt everyone's experience would vary. While predominantly
working with UNIX, "my" boxes were mostly of the "micro" variety, but I
also quickly jumped on the PC bandwagon. As I noted the Microsoft C
compiler was "ANSI-compatible" by 1985. I can't remember a single
mainstream PC compiler that wasn't by then.

Most of the UNIX shops I worked with had purchased "development
packages" which I'm sure had superior features to the "generic" C
compilers that came with the O/S.

So yes in "my world" *everyone* had a prototype-aware C compiler by
1985.

-ralph
 
ralph

ralph said:
[...]
As Mr. Thompson pointed out the closest to an "official" date would be
C89; however, K&R fell out of common use long before that. Most C
compilers I used by the early 80s offered vendor specific
implementations that supported the "ANSI style", though there wasn't
any official 'ANSI' yet.

You encountered "ANSI style" compilers in the "early 80s?" Six,
seven, eight years before the Standard became final? That's some
pretty impressive tea-leaf reading ...

I can appreciate your disbelief especially if one might be used to
today's environment where the usual process is to develop a "standard"
and then sit back and watch various vendors move to support the new
standard.

Such was not the case with C. In computer-years an official 'standard'
was a long, long time coming. I started using C in 1977 (a year before
they published the book. <g>) There wasn't a 'Standard' till 1986.
That was almost 10 years since I started, and ~15 years after it was
invented.

Do you seriously believe the vendors sat still all that time waiting?

I tried to be very careful and use the expressions "ANSI style",
"ANSI-Compatibility" in quotes, because as a side issue I'm not sure
when we started applying the term "ANSI". If memory serves we used to
call the extensions "Standard C" as opposed to "K&R C".
The Standard was a moving target during most of its development.
The first draft I read included the `noalias' keyword -- and that was
in 1988, only about twenty months before ANSI's formal adoption! Did
the compilers you recall support `noalias'?

Wasn't so much a "moving target" as a classic battle between those who
wanted the standard to agree with the existing implementations at the
time, and those who wanted to define an "even better" language.

Thankfully, the "implementers" pretty much won out. C86 added little
that wasn't already out there.

The "-noalias" option was one of the latter ideas. Don't know if I
ever used a compiler that offered that keyword. I only know I never
used it. I vaguely remember it as an optimizing option, but believe it
applied to whole code blocks not to any one variable.

[Sidenote: I probably wouldn't use it now that I took the time to look
it up. I remember too well the lesson learned with the 'register'
keyword. I jumped on that bandwagon after reading the specs when it
first showed up, spending time 'n effort to pre-think ways to make my
code more efficient - only to discover that most compilers ignored it
or placed candidates in a register anyway. <g> I seldom bother
anymore, unless testing demonstrates it as needed in a critical
section. Again, no point in anyone flaming me, just my opinion.]
Ben's experience matches mine, but it wasn't a reluctance to
upgrade compilers that motivated us to keep using old style: It was
the large body of existing old-style code, and the certainty that
we would introduce gratuitous bugs in the course of a translation
effort. For some strange reason, our engineering management was not
highly motivated to expend programming and Q/A effort on a project
that would have no customer-visible effect except new bugs ...

We used ANSI-capable compilers to process our K&R-style code.
On some platforms we had to implement work-arounds to get pre-ANSI
effects from an ANSI compiler, just so our pre-ANSI code would continue
to work as it had.

That is interesting and once again points to the vast differences in
experience and what I might call one's "world view".

I was generally a contract programmer. I was never placed in a
situation where I had to deal with humongous amounts of legacy code.
It was either to add some functionality - in which case I just adopted
whatever 'style' was currently in use, or more common totally new
projects - in which case I used whatever I liked, and I liked
vendor-extensions.

You might add to that (at the definite risk of starting another
flame-war <g>) I have never been a big believer in "portable code".
That is, never a slave to the idea that what I wrote needed to be
portable to a vast array of various operating systems. UNLESS it was
a business requirement that it compile and run on multiple OS's. I also
'liked' platform-specific extensions. Especially as in most cases
there was a good reason they were available.

I wrote for the target at hand with the best tools available.

Like I said before - the few situations where differences were
substantial, ie, demanded to be addressed - I often fell back on
conditional compiles and even re-writing whole blocks of code. Better
an application performs to its best on all required platforms.

I always felt adhering to "K&R C" as some kind of "lowest common
denominator" was a silly limitation dreamed up by code lawyers sitting
in a committee with nothing else to do. <g>

But I don't argue with those who do. They have a point. If they are
signing the check then I'll even agree with them... till the next job.

[As a side note. (As though I haven't stirred up enough trouble. <g>)
I have never been a big fan of an, or make that THE, "enterprise"
compiler. I feel using one throughout a single project or suite of
projects makes obvious sense. But delaying the move to something
better for a round of new projects makes less sense. There is no
reason to keep using SmithC v3 just because you wrote your
applications with it 5 years ago.

Of course if SmithC v3 still works just fine, then keep it. I'm NOT
advocating moving on just because something is new, only if new
translates to "better" for your business requirements.

Keep SmithC around for the legacy stuff.]

But again, that likely depends on one's experience and comfort level.
As a contractor I became very used to using multiple compilers on
different boxes at the same time. Never expected the same code to work
exactly the same everywhere, thus never surprised when it didn't, nor
thought twice when I had to go to some extra effort to make it work.
Not in 1990, certainly. By 1993 or so I think "everyone who was
anyone" offered an ANSI/ISO implementation. Still, though, the amount
of code written post-ANSI was tiny compared to what had been written
in the preceding twenty years.

Like I said, I was no longer using or considering "K&R C" as any kind
of "standard" at least by 1985. Can't remember a single shop that made
it a requirement since then either. But that is just me. I obviously
never worked in your shop. (Or wouldn't have been likely to stay long
if I did.)
Also, that existing investment in pre-ANSI code couldn't be
ANSIfied easily and mechanically. It wasn't just a matter of adding
prototypes (of which more below), but of sorting out other practices
that had varied across C dialects and that ANSI had standardized.
Replacing `<strings.h>' was fairly simple, replacing `<varargs.h>'
was harder, dealing with whether `unsigned short' promoted to `int'
or to `unsigned int' was harder still, and the litany of glitches
large and small didn't stop there. ANSIfying several million lines
of pre-ANSI code while it was under active development was not a
project for a part-time summer intern!

Doesn't sound like it. Makes me very glad it was never an issue for
me.
As for writing prototype declarations to match K&R-style function
definitions, it's not always possible. The K&R definition expects an
argument that has been subjected to what ANSI calls the "default
argument promotions," and the difficulty is in naming the promoted
type. The implementation gets to make an idiosyncratic decision about
the nature of some types: `enum WhatAmI' is compatible with some kind
of integer, but each implementation gets to make its own decision --
and that decision can influence how a value of that type promotes.
A K&R function with an `enum WhatAmI' parameter may need to be
declared with different prototypes on different systems.

Ha. Never ran into that, because I've never considered K&R C anything
to write for or towards. The book was a good guide, but ultimately it
was only a summary of "here is how it should work"; after that, one had
to deal with what one's current compiler actually did.

-ralph
 
Keith Thompson

ralph said:
On Sun, 11 Dec 2011 09:11:33 -0500, Eric Sosman


I can appreciate your disbelief especially if one might be used to
today's environment where the usual process is to develop a "standard"
and then sit back and watch various vendors move to support the new
standard.

Such was not the case with C. In computer-years an official 'standard'
was a long, long time coming. I started using C in 1977 (a year before
they published the book. <g>) There wasn't a 'Standard' till 1986.
That was almost 10 years since I started, and ~15 years after it was
invented.

Do you seriously believe the vendors sat still all that time waiting?

The first ANSI C standard was issued in 1989, not 1986. Are you
referring to something else that happened in 1986?

I know that the C community was eager to see the new ANSI C standard,
and that parts of it were implemented before the standard itself was
released. I don't have a clear idea of what the time scale was.

[...]
The "-noalias" option was one of the latter ideas. Don't know if I
ever used a compiler that offered that keyword. I only know I never
used it. I vaguely remember it as an optimizing option, but believe it
applied to whole code blocks not to any one variable.

We're not talking about a "-noalias" option. Early drafts
of the ANSI C standard include a "noalias" keyword. (I think
C99's "restrict" keyword is a redesigned version of the concept.)
Dennis Ritchie posted his opinion of it in this newsgroup in March
1988 <http://www.lysator.liu.se/c/dmr-on-noalias.html>:

Noalias must go. This is non-negotiable.

It must not be reworded, reformulated or reinvented. The draft's
description is badly flawed, but that is not the problem.
The concept is wrong from start to finish. It negates
every brave promise X3J11 ever made about codifying existing
practices, preserving the existing body of code, and keeping
(dare I say it?) `the spirit of C.'

[...]
 
Alan Mackenzie

When I started using C in 1989 they were still quite common in the
code I was looking at, but that dropped significantly during the
following decade, and I have hardly ever seen code using them written
in this millennium. So my very personal guess is that most people
have been using the "new" form since at least about 1995. But then I
did my programming nearly exclusively under UNIX; things might be
different on platforms where "modern" compilers weren't available
that easily. And, of course, people having to cater for the very
smallest common denominator (i.e. systems where no C89-compliant
compiler exists) still have to use them in one form or another (as
the example Kaz came up with might illustrate).

I think 1995 sounds about right.
If this is for the Emacs C mode then, since the standard still
requires support for the "old way" (even though it might be declared
as obsolete), I would think it prudent not to throw it out
completely.

Don't worry, there's no chance of that. :) It's just that there is a
new option to disable K&R header parsing - this parsing can be
inordinately slow, for example when there are long sequences of variable
declarations (which might be K&R parameter specifications). It's a shame
to impose this performance penalty on everybody, just because of some
obsolete construct which "nobody" uses any more.
 
