No more C after Cyber World War 1

Skybuck Flying

Like Einstein, I am going to make a daring prediction:

After Cyber World War 1 has taken place there will be no more C !? ;) =D

Now go stuff that into your fokking ASS ! ;) =D

Bye,
Skybuck ;) =D

(Sorry I guess I had to blow off some full-of-hot-air-steam-LOL ;) :) =D)
 
Skybuck Flying

I shall even go a step further to rub it into your face.

I shall take it to the extreme:

The letter C will be banned from the ALPHABET ! ;) =D

Bye,
Skybuck.
 
MitchAlsup

At the very least, I hope that people will start using something like
Milo Martin's SoftBound C compiler, which catches all of those stupid
buffer overflow bugs that cyber warriors use to get into your system,
with a very small performance penalty in the 10% range.
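
Roughly, the idea (this is only a sketch of the approach, not SoftBound's
actual implementation; the struct and function names here are invented)
is that every pointer carries the base and bound of the object it may
point into, and each access is checked against them:

#include <stdio.h>
#include <stdlib.h>

/* Each pointer is paired with the bounds of the object it may point
   into; names and layout are invented for illustration only. */
struct checked_ptr {
    int *ptr;    /* the raw pointer */
    int *base;   /* start of the underlying object */
    int *bound;  /* one past its end */
};

static int checked_load(struct checked_ptr p)
{
    if (p.ptr < p.base || p.ptr >= p.bound) {
        fprintf(stderr, "bounds violation\n");
        abort();                 /* fail instead of silently overflowing */
    }
    return *p.ptr;
}

int main(void)
{
    int *a = malloc(10 * sizeof *a);
    if (a == NULL)
        return 1;
    struct checked_ptr p = { a + 10, a, a + 10 };  /* one past the end */
    return checked_load(p);      /* aborts; a plain *(a + 10) would be UB */
}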

Coders can check for buffer problems themselves at vanishingly low
cost.

Mitch
 
Torben Ægidius Mogensen

MitchAlsup said:
Coders can check for buffer problems themselves at vanishingly low
cost.

But why should they? The compiler can easily insert the checks and can
even remove most redundant checks automatically.
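
For instance (a hand-written sketch of what a checked compiler might
generate; no particular compiler's output is assumed here):

#include <stdlib.h>

/* What naive compiler-inserted checking might look like: one test per
   element access. */
void fill_checked(int *a, size_t len, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (i >= len)            /* inserted bounds check */
            abort();
        a[i] = 0;
    }
}

/* What the optimizer can reduce it to: since i < n on every iteration,
   a single n <= len test makes every per-element check redundant. */
void fill_hoisted(int *a, size_t len, size_t n)
{
    if (n > len)
        abort();
    for (size_t i = 0; i < n; i++)
        a[i] = 0;
}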

A careful coder will also be able to insert checks only where necessary,
but once the code has gone through a few modifications, the assumptions
that allowed some checks to be eliminated may no longer be true, so you
risk safety holes unless you think carefully about the consequences of
every change. If you leave it to the compiler to figure these things
out, all you have to do is recompile.

A good coder can make safe and efficient code in C, but it takes effort
and most coders are not good enough to achieve both efficiency and
safety in C -- either they make efficient and unsafe code or they make
safe but inefficient code by coding defensively, i.e., inserting a lot
of redundant tests and assertions.

So you should leave C coding to the 1% of programmers who actually know
what they are doing and are willing to put in the extra effort. All the rest
should use languages/compilers with managed memory and compiler-inserted
tests, even if a few of these are redundant.

The cost is not so high as many believe. This belief stems from the
fact that most languages that use managed memory and guaranteed checks
also do a lot of other stuff that _is_ costly and hard to optimise, such
as dynamic method invocation, reflection, dynamic type checking and so
on. If you strip these away and make a C-like language with managed
memory and guaranteed checks, it will run nearly as fast as well-written
C and faster than badly written C.

Torben
 
Noob

Torben said:
So you should leave C coding to the 1% of programmers who actually know
what they are doing and are willing to put in the extra effort. All the rest
should use languages/compilers with managed memory and compiler-inserted
tests, even if a few of these are redundant.

The cost is not so high as many believe. This belief stems from the
fact that most languages that use managed memory and guaranteed checks
also do a lot of other stuff that _is_ costly and hard to optimize, such
as dynamic method invocation, reflection, dynamic type checking and so
on. If you strip these away and make a C-like language with managed
memory and guaranteed checks, it will run nearly as fast as well-written
C and faster than badly written C.

Would Cyclone fit the bill?
http://en.wikipedia.org/wiki/Cyclone_(programming_language)
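
Cyclone's central device is the bounds-carrying "fat" pointer. In plain
C terms the idea looks something like this (a sketch only, not Cyclone
syntax):

#include <stdlib.h>

/* A plain-C sketch of the fat pointer Cyclone is built around. */
struct fat_ptr {
    char *base;   /* start of the buffer */
    size_t len;   /* number of accessible elements */
};

static char fat_get(struct fat_ptr p, size_t i)
{
    if (i >= p.len)
        abort();      /* Cyclone signals a bounds error in its own way */
    return p.base[i];
}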
 
Chris H

Skybuck Flying said:
I shall even go a step further to rub it into your face.

I shall take it to the extreme:

The letter C will be banned from the ALPHABET ! ;) =D

Bye,
Skybuck.

Said (e-mail address removed) ?
 
Michael S

Torben said:
[...]
The cost is not so high as many believe.  This belief stems from the
fact that most languages that use managed memory and guaranteed checks
also do a lot of other stuff that _is_ costly and hard to optimise, such
as dynamic method invocation, reflection, dynamic type checking and so
on.  If you strip these away and make a C-like language with managed
memory and guaranteed checks, it will run nearly as fast as well-written
C and faster than badly written C.

Ada-83 is pretty close to your C-with-checks. In theory, except for
infrequent special cases (double indirection, etc.), Ada-83 should be as
fast as C or even faster, due to the greater amount of aliasing
information available to the optimizer. In practice, I have seen only
one or two comparisons of C vs. Ada on the same (GNU) back end; in those
comparisons Ada ended up measurably slower.
 
Keith Thompson

Torben said:
But why should they? The compiler can easily insert the checks and can
even remove most redundant checks automatically.

If a coder can "check buffer problems" in the sense of writing code that
avoids them in the first place, that's much better than checking for
problems at run time. If you detect a problem at run time, you still
have to decide what to do if the check fails.

But if that's what MitchAlsup meant, the idea that it can be done "at
vanishingly low cost" is laughable.

On the other hand, if he's talking about manually inserting checks:

if (i < 0 || i >= MAX) {
    /* handle error? */
}
else {
    arr[i] = 42;
}

then again, the "vanishingly low cost" idea seems absurd.

Mitch, can you explain what you meant?

Torben said:
A careful coder will also be able to insert checks only where necessary,
but once the code has gone through a few modifications, the assumptions
that allowed some checks to be eliminated may no longer be true, so you
risk safety holes unless you think carefully about the consequences of
every change. If you leave it to the compiler to figure these things
out, all you have to do is recompile.

Furthermore, you have to maintain the checks as you maintain the code.
In the above, MAX might be the number of elements in arr today, but
tomorrow it could be something else.
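
One common way to keep such a check in sync with the array it guards is
to derive the bound from the array itself rather than repeating a
separate constant (a sketch, assuming arr is a true array in scope, not
a pointer):

#define COUNT_OF(a) (sizeof (a) / sizeof (a)[0])

int arr[100];

void set(int i)
{
    if (i >= 0 && i < (int)COUNT_OF(arr)) {
        arr[i] = 42;
    }
    else {
        /* handle error? */
    }
}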

Torben said:
[...]
The cost is not so high as many believe. This belief stems from the
fact that most languages that use managed memory and guaranteed checks
also do a lot of other stuff that _is_ costly and hard to optimise, such
as dynamic method invocation, reflection, dynamic type checking and so
on. If you strip these away and make a C-like language with managed
memory and guaranteed checks, it will run nearly as fast as well-written
C and faster than badly written C.

Also, in languages that require such checking, many checks can be
removed at compile time.
 
