swap() function without tmp

Arthur J. O'Dwyer

Thanks a lot for the reply messages.

You're welcome, but please don't snip attributions. Those are
the lines that say "On Foo 2004, Bar wrote:", and they tell the
reader (us) who's talking. It's politeness.

[Someone wrote:]
It was my mistake. The 'int' should have been a 'long'. :)

:) Of course, the same objection applies to 'long' (or any other
type you care to mention).
Hmm, if there were a memxor() function that would exclusive-or one
region of memory with another, then this could be done somewhat
portably. memand() and memor() would complete the set. Maybe they
should be added to C2009.

A memxor() is not implemented in my environment (library) yet,
but it's easy for me to implement. Thanks for the valuable
information.

I'm pretty sure this corresponds to what you'll want. Untested
code, though.

#include <stdlib.h>

void mem_xor(void *p, const void *q, size_t len)
{
    unsigned char *cp = p, *cend = cp + len;
    const unsigned char *cq = q;

    while (cp != cend)
        *cp++ ^= *cq++;
}
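For completeness, here is a sketch of how the swap itself might be built on top of a mem_xor() like the one above (dswap is my own name for it, nothing standard). The usual XOR-swap caveat applies: if the two pointers refer to the same object, both copies end up zeroed.

```c
#include <stddef.h>

/* exclusive-or len bytes of q into p */
static void mem_xor(void *p, const void *q, size_t len)
{
    unsigned char *cp = p, *cend = cp + len;
    const unsigned char *cq = q;
    while (cp != cend)
        *cp++ ^= *cq++;
}

/* XOR-swap two doubles byte by byte, with no double-sized
   temporary; x and y must not point to the same object */
void dswap(double *x, double *y)
{
    mem_xor(x, y, sizeof *x);
    mem_xor(y, x, sizeof *y);
    mem_xor(x, y, sizeof *x);
}
```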


[re: better use a temp]
Yes, I know my program is slow and tricky.
But imagine programming a microcontroller like a PIC
(which has only 64/128 bytes of memory -- is that off topic here?)

Technically, yes, PIC implementations that don't allow the
construction of objects at least 65535 bytes in size cannot be
conforming hosted C implementations, and thus are off-topic
in comp.lang.c. In reality, I think most questions about C
on PICs turn out to apply equally to hosted implementations.
This one seems to me to qualify. :)
A double tmp variable needs 8 bytes of stack, and the microchip
does not have enough memory.

You mean, your implementation doesn't even allocate 8 bytes
for a stack? That's pretty tight (and not in the good way,
either)! The obvious "fix" would be to make the temporary
variable 'static', but that would actually be worse, memory-wise,
than the 'auto' solution, assuming a naive optimizer.
Are you *sure* that you can't get a better optimizing compiler,
or activate more optimizations on the one you have? I don't
know your system, but it seems to me that if your system can
perform

x ^= y;

in registers, where x and y are memory locations, then it
certainly ought to also provide an 'XCHG' instruction to exchange
two memory locations. And if such an instruction exists, your
compiler ought to provide a way to access it, either through
its own optimizations or directly through inline assembly code
(which is *definitely* off-topic here).
Don't say 'Use Assembly Language'. :(

Ookaay... but it *would* be the simplest way, wouldn't it? :)
void dswap(double *x, double *y)
{
    *(long long *)x ^= *(long long *)y;
    *(long long *)y ^= *(long long *)x;
    *(long long *)x ^= *(long long *)y;
}

In my programming environment, it worked correctly and the object
code does not use any surplus memory.

But it's just as non-portable as if you'd used inline assembly
code, and much slower and larger --- so where's the gain? I
don't get it.
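(For comparison, a portable byte-at-a-time swap needs only a single
byte of scratch space, which sidesteps the 8-byte-temporary objection
entirely. An untested sketch; the name is my own invention, not a
library function:)

```c
#include <stddef.h>

/* Swap two non-overlapping objects of len bytes using one
   unsigned char of temporary storage. */
void byteswap(void *p, void *q, size_t len)
{
    unsigned char *a = p, *b = q;
    while (len-- > 0) {
        unsigned char tmp = *a;
        *a++ = *b;
        *b++ = tmp;
    }
}
```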

-Arthur
 
Mark McIntyre

On 15 May 2004 04:33:10 -0700, in comp.lang.c ,
Hi,
Thanks a lot for the reply messages.


It was my mistake. The 'int' should have been a 'long'. :)

It doesn't matter, the question still stands. The answer by the way is "you
can't - C doesn't mandate the sizes of types, only how much data they must
be able to hold".
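To make that concrete: the standard only guarantees minimum ranges
(INT_MAX at least 32767, LONG_MAX at least 2147483647), and a program
can query the actual ranges portably through <limits.h>. A small
sketch; the function name is invented for illustration:

```c
#include <limits.h>
#include <stdio.h>

/* C guarantees minimum ranges (int at least 16 bits worth,
   long at least 32), not exact sizes; query the real ones. */
void print_integer_ranges(void)
{
    printf("int : %d to %d\n", INT_MIN, INT_MAX);
    printf("long: %ld to %ld\n", LONG_MIN, LONG_MAX);
    printf("sizeof(long) = %lu bytes\n",
           (unsigned long)sizeof(long));
}
```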
 
Dan Pop

In said:
Or there might not be enough C99 compilers available by 2009?

Then, it is a fair bet that there will NEVER be enough C99 compilers.
Any provider of C implementations who didn't upgrade to C99 after 10
years most likely is not interested in providing a C99 implementation
at all.

Dan
 
Dan Pop

In said:
And personally, I value my sanity too much to try to use bitwise
operators on anything other than unsigned integer types. That's not
to say that you *can't* use bitwise operators on signed integer types,
but it rarely makes sense to do so.

It makes sense to do so any time you (have good reasons to) expect a
positive result.

Dan
 
Joe Wright

Dan said:
Then, it is a fair bet that there will NEVER be enough C99 compilers.
Any provider of C implementations who didn't upgrade to C99 after 10
years most likely is not interested in providing a C99 implementation
at all.

Dan

I keep getting subliminal messages from here and elsewhere that C99
is a non-starter. C89 on the other hand did the right thing. It
seems nobody 'needs' C99. K&R2 is a best-seller at $40. All of us
have one and newbies are still buying. The C99 Standard doesn't draw
flies at $18. Maybe nobody wants it.

I could be wrong. I have been before. I'll spare you all the list.
 
August Derleth

I keep getting subliminal messages from here and elsewhere that C99
is a non-starter. C89 on the other hand did the right thing. It
seems nobody 'needs' C99. K&R2 is a best-seller at $40. All of us
have one and newbies are still buying. The C99 Standard doesn't draw
flies at $18. Maybe nobody wants it.

I could be wrong. I have been before. I'll spare you all the list.

I won't speak to the long-term viability of C99 as a whole, but I do
know this: GCC, the GNU Compiler Collection, implements a conforming C89
compiler. It gives all the right warnings (after a bit of command-line
prodding) and implements all the right semantics. It also implements a
non-conforming attempt at a C99 compiler, some of which is accomplished by
simply not giving warnings or errors for traditional GNU C extensions.

This state of affairs seems to trouble nobody important. Nearly all new
code in the open-source community is either (nominally) C89 or GNU C, and
none that I've found require C99 features beyond what GCC will provide in
a non-conforming mode.

GCC is the standard compiler for this market. If GCC supports it, there's
a chance it will be used. Contrariwise, if GCC ignores it, there's a
chance the community as a whole just doesn't care.

(Of course, I don't see the need for C99, so I'm biased. I think C89 gives
us function prototypes, the ability to return structs and unions, good
pointer semantics and limitations, and the ability to write idiomatic code
in a conforming way. C99 gives us compound literals, the bool type,
restricted pointers, and long long. No killer features, nothing I really
missed when programming C before.)
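(For anyone who hasn't seen them, here is a small sketch of two of
those C99 features, compound literals and <stdbool.h>. The struct and
function names are invented for illustration:)

```c
#include <stdbool.h>

struct point { int x, y; };

static bool is_origin(struct point p)
{
    return p.x == 0 && p.y == 0;
}

/* C99 compound literal: build a struct value in place at the
   call site, no named temporary needed. */
bool origin_check(int x, int y)
{
    return is_origin((struct point){ x, y });
}
```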
 
Dan Pop

In said:
I keep getting subliminal messages from here and elsewhere that C99
is a non-starter. C89 on the other hand did the right thing. It
seems nobody 'needs' C99. K&R2 is a best-seller at $40. All of us
have one and newbies are still buying. The C99 Standard doesn't draw
flies at $18. Maybe nobody wants it.
"Nobody" is too strong. The demand exists, but it's not significant
enough to justify the effort of producing conforming C99 implementations.

Dan
 
Dan Pop

In said:
I won't speak to the long-term viability of C99 as a whole, but I do
know this: GCC, the GNU Compiler Collection, implements a conforming C89
compiler. It gives all the right warnings (after a bit of command-line
prodding) and implements all the right semantics. It also implements a
non-conforming attempt at a C99 compiler, some of which is accomplished by
simply not giving warnings or errors for traditional GNU C extensions.

This state of affairs seems to trouble nobody important. Nearly all new
code in the open-source community is either (nominally) C89 or GNU C, and
none that I've found require C99 features beyond what GCC will provide in
a non-conforming mode.

GCC is the standard compiler for this market. If GCC supports it, there's
a chance it will be used. Contrariwise, if GCC ignores it, there's a
chance the community as a whole just doesn't care.

It was a major blunder to ignore GNU C when designing C99. If a few
C99 features didn't have semantics conflicting with the semantics of
the same features in GNU C, gcc would have had a conforming -std=c99
mode by now. The issue is purely political, there are no technical
difficulties.

Furthermore, if C99 had more GNU C features, the demand for C99 would have
been far greater. E.g. the addition of typeof and block expressions
would have greatly improved the capabilities of the function-like macros.

Of course, these considerations have nothing to do with the standard C99
library, which is where most of the work of upgrading from C89 to
C99 goes.

(Of course, I don't see the need for C99, so I'm biased. I think C89 gives
us function prototypes, the ability to return structs and unions, good
pointer semantics and limitations, and the ability to write idiomatic code
in a conforming way. C99 gives us compound literals, the bool type,
restricted pointers, and long long. No killer features, nothing I really
missed when programming C before.)

Integer 64-bit support seems to be important to many people, but most
C89 implementations in current use provide it, one way or another, so
people see no point in switching to C99 just for that. And those who
like the idea of <stdint.h> can have it for C89, too (free
implementations exist).

And Fortran programmers are not going to abandon their favourite
programming language and switch to C99 simply because it now supports
many traditional Fortran features (VLAs, complex arithmetic and library
functions, generic function calls).

Dan
 
lawrence.jones

Dan Pop said:
It was a major blunder to ignore GNU C when designing C99.

GNU C wasn't ignored, but it wasn't considered to be any more important
than any other existing implementation. And since none of the GCC
developers or serious users could be bothered to join the committee or
attend meetings (not even occasionally), it probably didn't receive as
much attention as implementations that had advocates attending committee
meetings, or as much as it deserved. Microsoft C was in much the same
boat for the same reason.
If a few
C99 features didn't have semantics conflicting with the semantics of
the same features in GNU C, gcc would have had a conforming -std=c99
mode by now. The issue is purely political, there are no technical
difficulties.

That is simply incorrect. Although some of the conflicts were
undoubtedly caused inadvertently due to simple ignorance (see above),
others were deliberate reactions to technical shortcomings in GCC.
Describing the issue as "political" is not accurate -- the committee has
never had any animosity toward GCC, despite the converse not being true.
(But I must hasten to add that that attitude is long gone and the
current GCC developers seem genuinely interested in producing a
conforming implementation.)

-Larry Jones

That gives me a FABULOUS idea. -- Calvin
 
CBFalconer

GNU C wasn't ignored, but it wasn't considered to be any more
important than any other existing implementation. And since none
of the GCC developers or serious users could be bothered to join
the committee or attend meetings (not even occasionally), it
probably didn't receive as much attention as implementations that
had advocates attending committee meetings, or as much as it
deserved. Microsoft C was in much the same boat for the same
reason.

I suspect that the high price of such 'joining', together with the
unpaid volunteer aspect of gcc development, had more than a little
to do with it. Microsoft simply doesn't care; any standards
impede their freedom to 'innovate, foul, and charge'.
 
Dan Pop

In said:
GNU C wasn't ignored, but it wasn't considered to be any more important
than any other existing implementation.

Which is sheer stupidity. Not all existing implementations are equally
important.
And since none of the GCC
developers or serious users could be bothered to join the committee or
attend meetings (not even occasionally), it probably didn't receive as
much attention as implementations that had advocates attending committee
meetings, or as much as it deserved.

GNU C is pretty well documented. And its extensions are nicely grouped
together in a separate chapter.
Microsoft C was in much the same boat for the same reason.

In other words, the two implementations *by far* the most important have
been given Cinderella status, for purely bureaucratic reasons. Yeah,
that's typical committee thinking and a clear explanation for why the
C programming community at large is turning a deaf ear and a blind eye
to C99.

If gcc and Microsoft C were C99-conforming today, C89 would have been
history: any implementor targeting major hosted platforms and still
willing to survive would have followed their example.
That is simply incorrect. Although some of the conflicts were
undoubtedly caused inadvertently due to simple ignorance (see above),
others were deliberate reactions to technical shortcomings in GCC.

The idea was not to introduce semantic differences between GNU C features
with a well established status and new C99 features with the same name
and/or purpose. Not to adopt all the technical shortcomings of GNU C in
the C99 standard...
Describing the issue as "political" is not accurate -- the committee has
never had any animosity toward GCC, despite the converse not being true.

You misunderstood my words. I was talking about gcc's conformance issue
as being purely political: the gcc people don't want to break their
semantics, apparently not even in -std=c99 mode.

I attribute the ignorance of GNU C features to sheer committee stupidity,
not to any political agenda.
(But I must hasten to add that that attitude is long gone and the
current GCC developers seem genuinely interested in producing a
conforming implementation.)

Well, I haven't noticed any *documented* progress since 2001 or so...

Dan
 
