is "typedef int int;" illegal????

D

David R Tribble

Eric said:
It seems to me "semantically equivalent" might
open an unpleasant can of worms. For example, are
typedef unsigned int mytype;
typedef size_t mytype;
"semantically equivalent" on an implementation that
uses `typedef unsigned int size_t;'? What's really
wanted is "equivalence of intent," which seems a
harder notion to pin down.

It should mean "semantically equivalent", as in "equivalent types",
to allow C to be compatible with C++.

If the suggestion were modified to require "lexical
equivalence," such questions would disappear and I don't
think the language would be any the worse without them.
Writing header files would perhaps not become quite as much
easier as with "semantic equivalence," but I think it would
be a good deal easier than it is now.

Lexical equivalence is harder for compilers to check than
semantic type equivalence, which is already present in compilers.

-drt
 
D

Douglas A. Gwyn

loufoque said:
Defining a type that already exists makes no sense.

There are numerous issues involved that led to the current
spec for typedef. One of them is that after the first
typedef of a given identifier, that identifier plays a
different role (type synonym) and it would be logical for
it to do so in the second "redundant" typedef (which
happens to result in a syntactic error).
typedef int foo;
typedef foo bar;
typedef bar foo; // would this be allowed?
typedef bar int; // but not this?
Basically this is too fundamental and established in the
language to be messing with. If you design some *new*
language you might want to do it differently.
 
S

Stephen Sprunk

Wojtek Lerch said:
BTW Think about

typedef long long long long;

;-)

That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another couple
decades? Call them "long long long"? Or if we redefine "long long" to be
128-bit ints and "long" to be 64-bit ints, will a 32-bit int be a "short
long" or a "long short"? Maybe 32-bit ints will become "short" and 16-bit
ints will be a "long char" or "short short"? Or is a "short short" already
equal to a "char"?

All we need are "int float" and "double int" and the entire C type system
will be perfect! </sarcasm>

S

--
Stephen Sprunk "Stupid people surround themselves with smart
CCIE #3723 people. Smart people surround themselves with
K5SSS smart people who disagree with them." --Aaron Sorkin

 
J

Jordan Abel

Stephen Sprunk said:
That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another couple
decades? Call them "long long long"? Or if we redefine "long long" to be
128-bit ints and "long" to be 64-bit ints, will a 32-bit int be a "short
long" or a "long short"? Maybe 32-bit ints will become "short" and 16-bit
ints will be a "long char" or "short short"? Or is a "short short" already
equal to a "char"?

All we need are "int float" and "double int" and the entire C type system
will be perfect! </sarcasm>

Don't forget short double and long float.

and are _Complex integers legal?
 
K

Keith Thompson

Stephen Sprunk said:
That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another
couple decades? Call them "long long long"? Or if we redefine "long
long" to be 128-bit ints and "long" to be 64-bit ints, will a 32-bit
int be a "short long" or a "long short"? Maybe 32-bit ints will
become "short" and 16-bit ints will be a "long char" or "short short"?
Or is a "short short" already equal to a "char"?

All we need are "int float" and "double int" and the entire C type
system will be perfect! </sarcasm>

So how would you improve it?
 
S

santosh

James Dennett wrote:
.... snip ...
The standard doesn't ever require a compiler to reject code;

Except, I suppose, if an #error directive is encountered.
it's quite legal for a C compiler to accept Fortran
code, so long as it prints out a diagnostic (maybe
"This looks like Fortran, not C... compiling it anyway...").

In which case it would no longer be a C compiler and would not come
under the restrictions of the C standard.
 
S

santosh

Keith said:
So how would you improve it?

Perhaps by adding Long or llong or Llong for 128-bit integers? Ugly, but
there's nothing else that can be done. Calling a 128-bit integer "long
long long" would be ridiculous.
 
J

Jordan Abel

James Dennett wrote:
... snip ...

Except, I suppose, if an #error directive is encountered.


In which case it would no longer be a C compiler and would not come
under the restrictions of the C standard.

If it also compiles C, it has to print a diagnostic on being given non-C
code and being told that it's C. (e.g. gcc -x c)
 
P

pemo

jacob said:
James Dennett wrote:
Strange, I get the following:

[root@gateway root]# gcc -v
Reading specs from /usr/lib/gcc-lib/i586-mandrake-linux-gnu/2.96/specs
gcc version 2.96 20000731 (Mandrake Linux 8.2 2.96-0.76mdk)
[root@gateway root]# cat tint.c
typedef int int;
[root@gateway root]# gcc -c tint.c
tint.c:1: warning: useless keyword or type name in empty declaration
tint.c:1: warning: empty declaration
[root@gateway root]# ls -l tint.o
-rw-r--r-- 1 root root 703 Mar 24 16:17 tint.o
[root@gateway root]#

Program is not rejected.


That's a nearly 6-year old compiler, and not an official GCC
release at that. Not to say that 2.96 didn't have its uses,
and maybe it still does, but it's far from the state of the
art.

-- James

Wow, this complicates things quite a bit.
If Microsoft AND gcc reject the code... I think it is better to leave it
as it is. I thought that gcc let it pass with some warnings and intended
to do the same, but it is true that I have not upgraded gcc in quite a
while.

Perhaps you could try it with a couple more modern compilers - just as a
'belt and braces' kind of thing? The Intel compilers are at least free to try
(http://www.intel.com/cd/software/products/asmo-na/eng/compilers/219690.htm),
and then there's Sun's (http://developers.sun.com/prodtech/cc/index.jsp),
which *are* free and might prove useful?
 
K

kuyper

Stephen Sprunk wrote:
....
That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another couple
decades? Call them "long long long"? Or if we redefine "long long" to be
128-bit ints and "long" to be 64-bit ints, will a 32-bit int be a "short
long" or a "long short"? Maybe 32-bit ints will become "short" and 16-bit
ints will be a "long char" or "short short"? Or is a "short short" already
equal to a "char"?

We now have size-named types. That should reduce (but not,
unfortunately, eliminate) the likelihood of similar travesties in the
future.
All we need are "int float" and "double int" and the entire C type system
will be perfect! </sarcasm>

Hey! That's a half-way plausible syntax for declaring a fixed-point
type. ;-) All it needs is some way of specifying the number of digits
after the decimal point.
 
K

kuyper

Jordan Abel wrote:
....
and are _Complex integers legal?

They aren't (6.7.2p2), but conceptually they would be meaningful,
and I suspect there are certain obscure situations where they'd be
useful.
 
B

Ben Pfaff

santosh said:
Keith said:
So how would you improve it?

Perhaps by adding Long or llong or Llong for 128 bit integers? Ugly but
there's nothing that can be done. [...]

Not a good "solution" in my opinion. I'm sure there are lots of
programs that use each of these identifiers. "long long" doesn't
reserve any previously unreserved identifiers.
 
K

Keith Thompson

Jordan Abel wrote:
...

They aren't (6.7.2p2), but conceptually they would be meaningful,
and I suspect there are certain obscure situations where they'd be
useful.

Mathematically, they're called "Gaussian integers".
 
J

jacob navia

santosh wrote:
Perhaps by adding Long or llong or Llong for 128 bit integers? Ugly but
there's nothing that can be done. Calling a 128 bit integer as long
long long would be ridiculous.
lcc-win32 supports 128 bit integers. The type is named:

int128

Planned is support for 128 bit constants with

i128 m = 85566677766545455544455543344i128;

and

printf("%i128d",m);
 
E

Eric Sosman

Ben said:
Keith said:
[...]

That "long long" even exists is a travesty.

So how would you improve it?

Perhaps by adding Long or llong or Llong for 128 bit integers? Ugly but
there's nothing that can be done. [...]


Not a good "solution" in my opinion. I'm sure there are lots of
programs that use each of these identifiers. "long long" doesn't
reserve any previously unreserved identifiers.

atoll()?
 
J

Jack Klein

Stephen Sprunk said:
That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another couple
decades? Call them "long long long"? Or if we redefine "long long" to be
128-bit ints and "long" to be 64-bit ints, will a 32-bit int be a "short
long" or a "long short"? Maybe 32-bit ints will become "short" and 16-bit
ints will be a "long char" or "short short"? Or is a "short short" already
equal to a "char"?

All we need are "int float" and "double int" and the entire C type system
will be perfect! </sarcasm>

The 256 bit integer type has already been designated "long long long
long spam and long".

'nuff said.
 
M

Michael Mair

Stephen Sprunk wrote:
...

We now have size-named types. That should reduce (but not,
unfortunately, eliminate) the likelihood of similar travesties in the
future.

Heh. We can hope.
Hey! That's a half-way plausible syntax for declaring a fixed-point
type. ;-) All it needs is some way of specifying the number of digits
after the decimal point.

There is something called Embedded C for that,
http://www.embedded-c.org

Cheers
Michael
 
J

Jordan Abel

jacob navia wrote:
santosh said:
Keith said:
[...]

That "long long" even exists is a travesty.

What are we going to do when 128-bit ints become common in another
couple decades? Call them "long long long"? Or if we redefine "long
long" to be 128-bit ints and "long" to be 64-bit ints, will a 32-bit
int be a "short long" or a "long short"? Maybe 32-bit ints will
become "short" and 16-bit ints will be a "long char" or "short short"?
Or is a "short short" already equal to a "char"?

All we need are "int float" and "double int" and the entire C type
system will be perfect! </sarcasm>

So how would you improve it?


Perhaps by adding Long or llong or Llong for 128 bit integers? Ugly but
there's nothing that can be done. Calling a 128 bit integer as long
long long would be ridiculous.
lcc-win32 supports 128 bit integers. The type is named:

int128

Planned is support for 128 bit constants with

i128 m = 85566677766545455544455543344i128;

Case-insensitive, I hope.
and

printf("%i128d",m);

How do you differentiate this from the valid standard format string
consisting of %i followed by the string "128d"? Maybe you should use
%I128d instead, like Microsoft does with I64.
 
