Ada vs Ruby


Marc Heiler

Hi,

On http://www.gcn.com/print/27_8/46116-1.html Ada is touted briefly.

The sentences that most jumped out at me (and hurt my brain a bit)
were these:

"[...] Ada has a feature called strong typing. This means that for every
variable a programmer declares, he or she must also specify a range of
all possible inputs.[...]"

"[...] This ensures that a malicious hacker can’t enter a long string of
characters as part of a buffer overflow attack or that a wrong value
won’t later crash the program. [...]"

But clearly that is simple to do in Ruby as well (and I have never heard
of a buffer overflow outside of the C world anyway): just specify which
input range is allowed and discard the rest, warn the programmer, or
simply convert the value to the nearest allowed one - am I missing
something? Maybe there are other reasons why Ada is still so en vogue
for aviation software, but I don't really get it (other than legacy code
that has been sitting there for thousands of years already). Maybe it is
a paradigm that is only possible in Ada.
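
For illustration, a minimal Ruby sketch of the kind of range check being
described - the method name and the warn-and-clamp behaviour are just
assumptions for the example, not anything from the article:

# Accept a value only if it falls inside the allowed range;
# otherwise warn and clamp it to the nearest allowed value.
def constrain(value, range)
  return value if range.include?(value)
  warn "#{value} is outside #{range}, clamping"
  value < range.min ? range.min : range.max
end

constrain(15, 1..31)   # => 15
constrain(35, 1..31)   # => 31 (after printing a warning)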

Ruby being too slow would be something I could not quite understand,
insofar as you could write parts in C anyway, or you could use (in the
case of replacing Ada) Lua - I'd figure Lua would be quite fast. Somehow,
despite that, Ada is still in use; to me it seems like a "dead" language
(meaning no one really learns it anymore because there are better
alternatives available).

The biggest confusion I have here is simply that strong typing is touted
as a very good thing to have. I don't know whether that is the case or
not, but it seems to me that this is more "behaviour" that is imposed
onto the programmer (as in, he must do extra work to ensure his variables
are a certain way, etc.).
For example, the "strong typing" as described here appears to me more
like "force the programmer to do this and that". This may have advantages
in the long run - I don't know, maybe fewer bugs or no buffer overflow
problems - but to me it is still forcing the programmer to comply. I don't
get what is so great about having to worry about so many details. And on
blogs you do sometimes see proponents of this approach scold the people
who use another one (not only typing, but also test-driven development
and so on...)
 

Robert Dober

> Maybe there are other reasons why Ada is still so en vogue for
> aviation software, but I don't really get it (other than legacy code
> that has been sitting there for thousands of years already). Maybe it
> is a paradigm that is only possible in Ada.

I was lucky enough to write an Ada debugger in Ada for Ada83 in 1986,
and I have to tell you that it was indeed revolutionary for its safety
concepts. Agility was of course not at all a design requirement of the
DoD, which chose the final design of the language as proposed by Jean
Ichbiah.

http://en.wikipedia.org/wiki/Ada_(programming_language)

As you can read above there is some discussion about the real value of
Ada, but I have to admit that living in the Ada world and being paid to
do nothing else than use and study it was a nice time, and it put me
into a mindset of its own.

It is for sure the champion of early failure (the compiler probably
detects more potential runtime errors, especially in multitasking, than
any other), and I believe that this makes it very valuable in
mission-critical domains.

> Somehow, despite that, Ada is still in use; to me it seems like a
> "dead" language (meaning no one really learns it anymore because there
> are better alternatives available).

Dead? I would be very much surprised; it is just restricted to a domain
where it is useful.

> The biggest confusion I have here is simply that strong typing is
> touted as a very good thing to have.

Under some conditions it is.

> I don't know whether that is the case or not, but it seems to me that
> this is more "behaviour" that is imposed onto the programmer (as in,
> he must do extra work to ensure his variables are a certain way, etc.).

Oh, it is an awful lot of work, but less than in C++, I feel.

> For example, the "strong typing" as described here appears to me more
> like "force the programmer to do this and that".

Wait a second, it is still the programmer who is defining the types ;)

> This may have advantages in the long run - I don't know, maybe fewer
> bugs or no buffer overflow problems - but to me it is still forcing
> the programmer to comply. I don't get what is so great about having to
> worry about so many details. And on blogs you do sometimes see
> proponents of this approach scold the people who use another one (not
> only typing, but also test-driven development and so on...)

If I had been an Ada programmer for the last 20 years, I definitely
would not know about the other domains and the usefulness of duck typing
and agile development.
It is an old story repeating itself, like history. There were people who
programmed in assembler (or even machine code) for a living, and then
they were asked about Fortran - what do you think they said?

Robert
 

Michael Neumann

Marc said:
> But clearly that is simple to do in Ruby as well (and I have never
> heard of a buffer overflow outside of the C world anyway) [...] Maybe
> there are other reasons why Ada is still so en vogue for aviation
> software, but I don't really get it.

You're right. The problem in C is that C strings do not carry a length;
they are just pointers, and strings have to be zero-terminated. That is
a very bad thing. Imagine there is no terminating zero: then any call to
a string-related function will read through memory and will most likely
result in an exception. And determining the length of a string is O(n).
But the real security issue is that some functions that read input don't
take a maximum length. The function gets(3) is one example: it reads a
line into a buffer regardless of how long the buffer is.

But this is more a library-related problem, not so much a language
problem. There are string libraries out there for C that are safe.

Ada compilers have to pass a lot of tests before they get a certificate.
A huge problem is that you can't trust the compiler, especially not
optimizing compilers: they might produce buggy code even if your program
is correct. That's where Ada shines.

Then, the C language is not type safe. You can do all kinds of type
casts, and there are numerous constructs in C that increase the
possibilities for errors. Ada is a lot better here too. For example, you
can limit the range of an integer.

Furthermore, Ada has built-in support for tasking and synchronization
primitives. C and C++ just can't do that reliably, as there is no
language support. That's why C++0x, the next upcoming version of C++,
exists: one of its goals is to make C++ multi-thread safe.

And Ada's language specification is very detailed, whereas that of C
leaves many things open, which is not desirable - you don't want any
surprises here. This problem came up recently in the GNU Compiler
Collection (GCC), where they changed the behaviour of the generated
code, just because the C spec didn't specify it. This broke some
applications and operating systems, and possibly introduced a lot of
unknown bugs. Not something you can build reliable software on.

> Ruby being too slow would be something I could not quite understand,
> insofar as you could write parts in C anyway, or you could use (in the
> case of replacing Ada) Lua - I'd figure Lua would be quite fast.
> Somehow, despite that, Ada is still in use; to me it seems like a
> "dead" language (meaning no one really learns it anymore because there
> are better alternatives available).

You will never ever be able to use Ruby for aviation software, nor Lua,
Python, Perl, etc.

It's not about slowness. Real-time systems can be slow as long as they
meet their deadlines. Indeed, a lot of real-time systems are very slow.
They use 20-year-old technology, no caches, no speculation etc., just
because in real-time systems you always have to calculate with the
longest possible execution time, and modern processors only improve
average execution time.

Ada is not that bad at all. It's a beautiful language, maybe a bit
verbose, but very powerful. Personally, I like it more than C++.

> The biggest confusion I have here is simply that strong typing is
> touted as a very good thing to have. [...] I don't get what is so
> great about having to worry about so many details.

Well, in the case of safety-critical software, you don't want to have
runtime exceptions. This software must not have errors - at least that's
desirable ;-)

Duck-typing doesn't guarantee you anything at compile-time.
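
A tiny sketch of what that means in practice: in Ruby, a mismatch like
this only surfaces when the offending line actually runs (the class and
method names here are made up for the example):

class Autopilot
  def altitude
    10_000
  end
end

def climb(controller)
  controller.set_altitude(12_000)   # no such method on Autopilot
end

climb(Autopilot.new)   # raises NoMethodError at runtime, not before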

Regards,

Michael
 

Robert Dober

> I think it depends on what is meant by "aviation software". I
> wouldn't use Ruby for embedded avionics, for several reasons. But I
> might use it (or Lua, or...) to power a visual display of the state of
> that avionics, for example.

You know, one can bet any value on statements like "X will never
happen". When am I going to pay? I can only win.
Sorry, could not resist ;).
R.
 

britt.snodgrass

> "[...] Ada has a feature called strong typing. This means that for
> every variable a programmer declares, he or she must also specify a
> range of all possible inputs. [...]"

I am an Ada programmer. The quoted statement from the GCN article is
not correct as written - "must" should be "may". Many languages,
including C++ and Java, claim to be strongly typed. Strong typing is a
very desirable language feature. One key difference is that Ada supports
strong typing and optional range constraints on primitive (e.g. integer,
fixed-point and floating-point) types.
"[...] This ensures that a malicious hacker can't enter a long string of
characters as part of a buffer overflow attack or that a wrong value
won't later crash the program. [...]"

But clearly that is simple to do in ruby as well (and I never heard of a
buffer overflow outside of the C world anyway): Just specify which input
range would be allowed and discard the rest, warn the programmer, or
simply convert it to the nearest allowed value - am I missing on
something? Maybe there are some other reasons why Ada is still so en
vogue for aviation software but I dont really get it (other than legacy
code that was sitting there for thousand of years already). Maybe it is
a paradigm that is only possible in Ada.

Ruby being too slow would be something I could not quite understand
insofar that, after all you could write parts in C anyway, or you could
use (in the case of replacing ADA) Lua - I'd figure Lua would be quite
fast. Somehow despite that Ada is still in use, to me it seems like a
"dead" language (means noone really learns it because there are better
alternatives available)

Ada is far from dead - it's a great general-purpose language and is
currently being used on new projects. In the high-assurance domains
where it is principally used, there is currently nothing better,
certainly not C++ or Java. There is also the SPARK (www.sparkada.com)
subset of Ada and its associated set of formal-methods-based static
analysis tools. I use SPARK and, though it requires a certain mindset to
use effectively, I think it's the "real deal" for producing the highest
quality code (i.e., free of initial defects). We really don't expect to
find many bugs during debugging or formal testing, at least not many
that can't be traced back to a missing or ambiguous requirement.

> I don't get what is so great about having to worry about so many
> details.

"worry about many details" isn't great fun but its necessary for
safety and/or security critical software. If a well specified
programming language and its associated compilers/ static analysis
tools help me to manage the details all the way from the big picture
design down to bit-level ASIC interfaces, then I welcome the help.

- Britt
 

framefritti

Interesting thread... also because I use both Ruby and Ada. No, better:
I _love_ both Ruby and Ada. Yes, they could not be more different and...
no, I do not have any split-personality problem (at least, none that I
am aware of... :)

In my personal experience, they are both great languages and each one
"shines" in its field. I use Ruby for small to medium-large applications,
where "duck typing" allows you to write good and flexible software in
little time. However, I discovered that when I move to large or very
large applications, a pedantic language like Ada (which would not allow
you to write sqrt(5), because "5" is an integer and not a float... my
first Ada program...) is a better choice, since many errors are caught
at compile time and many others within the first few runs, by the checks
automatically inserted by the compiler. For example, if you write

type Month_Day is new Integer range 1..31;

MD : Month_Day := 30;

MD := MD + 3;

you will get a runtime error, because MD exits the allowed range.
In C this bug could comfortably sleep for centuries...

Moreover, if you define

type Counter is new Integer;

Ada's strong typing will prevent you from assigning a value of type
Month_Day to a variable of type Counter (the magic word is "new"), and
this makes a lot of sense, unless in your application it is meaningful
to convert a day into a counter. I discovered that when your software
grows larger, this kind of constraint, which you _ask the compiler_ to
enforce on you, can really help. [There are *lots* of discussions about
the usefulness of introducing new incompatible types. The sentence above
is just my opinion, based on some personal experience. I hope I did not
open a new can of worms...]

Maybe your initial productivity (measured in lines of code written per
unit of time) will be lower because of the loss of flexibility, but if
your software is very large you gain in debugging and maintenance time.

Of course, if you just want to extract data from a CSV file, or write a
wget-like program, Ada can be a "gun for mosquitos."
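
For comparison, here is one way the Month_Day idea could be approximated
in Ruby - a hand-rolled wrapper class that re-checks its range on every
operation. This is only a sketch of the analogy, not an established
idiom, and unlike Ada the check has to be written and maintained by hand:

class MonthDay
  RANGE = 1..31

  attr_reader :value

  def initialize(value)
    unless RANGE.include?(value)
      raise ArgumentError, "#{value} is not in #{RANGE}"
    end
    @value = value
  end

  def +(other)
    MonthDay.new(value + other)   # re-checks the range on each result
  end
end

md = MonthDay.new(30)
md + 3   # raises ArgumentError, roughly like Ada's Constraint_Error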
 

Todd Benson

> I discovered that when your software grows larger, this kind of
> constraint, which you _ask the compiler_ to enforce on you, can really
> help.

You can "type" your variables in Ruby if you have to. I don't think
that's the problem. It's the possibly reckless meta-programming in
libraries you use (I'm not talking about you, Trans, I think Facets is
great).

Being an engineer and a db guy, you would think that Ruby is the most
god awful thing I've ever seen. Well, it has its place.

For realtime, Michael is right about the "time of execution" being
_the_ important thing. I would like to see in the future, however, a
Ruby that talks to the hardware like RTLinux or QNX. I'd take up such
a project myself, except I don't know enough C or assembly. I suppose
you'd have to make certain objects allowed to have free reign over the
processor/memory. Like an Object#become_real, though that's a little
scary :)

Todd
 

Bill Kelly

framefritti said:
> For example, if you write
>
> type Month_Day is new Integer range 1..31;
>
> MD : Month_Day := 30;
>
> MD := MD + 3;
>
> you will get a runtime error, because MD exits the allowed range.
> In C this bug could comfortably sleep for centuries...

The example you've provided makes me wonder whether such language-level
range limiting could instill a false sense of security in the programmer.

Please have your Ada program send me an email on February 31st!

<grin>

It seems like range checking would work well for a Month range of 1..12,
but not so well for Month_Day... ?
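
(In Ruby, for what it's worth, that particular trap is usually caught
with a runtime validity check rather than a type - a minimal sketch
using the standard library's Date class:

require 'date'

Date.valid_date?(2008, 2, 31)   # => false
Date.valid_date?(2008, 12, 31)  # => true

Date.new(2008, 2, 31)           # raises ArgumentError: invalid date

Of course, as with the Ada range, the check only fires when the bad
value actually shows up at run time.)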


Regards,

Bill
 

Eleanor McHugh

> You will never ever be able to use Ruby for aviation software, nor
> Lua, Python, Perl, etc.

You provide the budget, I'll provide the code ;) Having designed and
implemented avionics systems, I see nothing in Ruby or any other
scripting language that would stand in the way of using it to do the
same thing. In fact, Lua began its life as a language for device
control. That's not to say that MRI is particularly suited to the task,
but the necessary changes could be made if anyone wanted to, without
having to change the language's syntax and semantics.

> It's not about slowness. Real-time systems can be slow as long as
> they meet their deadlines. Indeed, a lot of real-time systems are very
> slow. They use 20-year-old technology, no caches, no speculation etc.,
> just because in real-time systems you always have to calculate with
> the longest possible execution time, and modern processors only
> improve average execution time.

It's true that real-time execution is easier when you get the execution
windows balanced, but it's mostly about coding defensively and knowing
how to handle failure states and recover when calculations exceed their
execution budget. The latter is particularly important, as many
calculations have unpredictable run-time characteristics.

As for the reason 20-year-old technology is so popular, you don't have
to look much further than the low cost of that generation of processors
and the low computational requirements of many problems: a PIC17C42,
for example, has all the grunt you could ever want for steering a light
aircraft, and a DragonBall is more than adequate for real-time GPS
navigation. Chucking even a Pentium at these jobs would be overkill,
unless you want to run a Windows kernel.

> Well, in the case of safety-critical software, you don't want to have
> runtime exceptions. This software must not have errors - at least
> that's desirable ;-)

There's nothing wrong with runtime exceptions so long as you figure out
what the correct fail-safe behaviour of the system is and make sure it
takes it. In fact, for high-spec aviation systems where there's a
statistical risk of cosmic-ray interference flipping bits at run-time,
I'd want to see the fail-safe strategy before I even considered the rest
of the system design (although admittedly that was a consideration that
always made me laugh when I was doing my CAA certifications ;).

> Duck-typing doesn't guarantee you anything at compile-time.

True. But nothing guarantees you anything at run-time, including 100%
compliance at compile-time. That's why most CS and IS degrees have
lectures explaining the difference between Verification (what your
compiler does) and Validation (what you do before you start coding).

As a rule of thumb, even the highest-quality systems will have one bug
for every 30,000 lines of source code (that's only 1% of the bug density
of standard shrink-wrap applications), which can still amount to tens of
thousands of defects in a large system. These are not 'errors' in the
sense that a compiler understands them, but genuine misunderstandings of
the problem space in question that will lead to actively dangerous
software states.


Ellie

Eleanor McHugh
Games With Brains
http://slides.games-with-brains.net
 

Rick DeNatale

>> [...] you don't want to have runtime exceptions. This software must
>> not have errors - at least that's desirable ;-)
>
> There's nothing wrong with runtime exceptions so long as you figure out
> what the correct fail-safe behaviour of the system is and make sure it
> takes it. In fact, for high-spec aviation systems where there's a
> statistical risk of cosmic-ray interference flipping bits at run-time,
> I'd want to see the fail-safe strategy before I even considered the
> rest of the system design (although admittedly that was a consideration
> that always made me laugh when I was doing my CAA certifications ;).

This argument is giving me a flashback to a decade or two ago.

Bjarne Stroustrup used to use the same argument against Smalltalk,
saying that he wouldn't want to fly in an airplane whose autopilot could
throw a MessageNotUnderstood exception.

I would counter-argue that I'd rather fly on that plane than on the one
with the C++ autopilot, which would instead branch to a random location
because a dangling pointer caused a call through a virtual function
table that wasn't really a virtual function table anymore.

> True. But nothing guarantees you anything at run-time, including 100%
> compliance at compile-time. That's why most CS and IS degrees have
> lectures explaining the difference between Verification (what your
> compiler does) and Validation (what you do before you start coding).

Amen, Sister! And languages which rely on static typing have a tendency
to do much more random things when things go wrong. Languages like Ruby
tend to have a more vigilant runtime.
 

Arved Sandstrom

[ SNIP ]
> The biggest confusion I have here is simply that strong typing is
> touted as a very good thing to have. [...] it seems to me that this is
> more "behaviour" that is imposed onto the programmer (as in, he must
> do extra work to ensure his variables are a certain way, etc.).

It sounds like by strong typing you actually mean static explicit
typing, as in Java or C. Bear in mind that you can have static typing
without explicit declarations, for example where type inference is used,
as in Haskell or F# (or, to some extent, in C# 3.0). That removes one of
your objections... inconvenience.

AHS
 

Phillip Gawlowski


Rick DeNatale wrote:

|
| Amen, Sister! And languages which rely on static typing have a
| tendency to do much more random things when things go wrong. Languages
| like Ruby tend to have a more vigilant runtime.
|

I wouldn't fly in an aeroplane that relies on the runtime to catch
errors.

Take the Space Shuttle as an extreme example. Does the language breed
perfection in the Shuttle's source, or is it the process NASA uses?

I bet you dollars to doughnuts that it is the process, with
more-than-due diligence in writing and testing the software. That the
requirements are clear-cut and well understood is another bonus.

Languages don't matter. Compilers don't matter. Process, however, does.

Or methodology. TDD has its benefits, as does BDD. Without these, the
Agile way wouldn't work. QA is the key, not the language.

Don't just take my word for it:

http://www.nap.edu/html/statsoft/chap2.html

The above link has a case study on NASA's process for developing the
Space Shuttle's flight control software.

--
Phillip Gawlowski
Twitter: twitter.com/cynicalryan

You've got to stand up and live before you can sit down and write.
 

M. Edward (Ed) Borasky

Rick said:
> I would counter-argue that I'd rather fly on that plane than on the
> one with the C++ autopilot, which would instead branch to a random
> location because a dangling pointer caused a call through a virtual
> function table that wasn't really a virtual function table anymore.

And there is the apocryphal story that when John Glenn buckled himself
into the Mercury spacecraft, he turned to one of the aides and said,
"Just remember ... every piece of equipment here was provided by the low
bidder." :)
 

Rick DeNatale

> And there is the apocryphal story that when John Glenn buckled himself
> into the Mercury spacecraft, he turned to one of the aides and said,
> "Just remember ... every piece of equipment here was provided by the
> low bidder."


Actually, I'm pretty sure that was Wally; much more his style than Glenn's.

And Project Mercury is a particular interest of mine.

http://www.mercuryspacecraft.com/wiki runs on the same server in my
house as my blog.
 

Rick DeNatale

Phillip Gawlowski wrote:

> Languages don't matter. Compilers don't matter. Process, however, does.
>
> Or methodology. TDD has its benefits, as does BDD. Without these, the
> Agile way wouldn't work. QA is the key, not the language.

I was pondering this thread earlier today, before I pitched in, and was
going to draw an analogy with Frank Borman's comments during the Senate
committee hearing on the Apollo 1 fire. He said that the real cause of
the fire was "a lack of imagination" about the dangers of doing ground
testing with the spacecraft filled with pure O2 at sea-level atmospheric
pressure.

Relying on static typing to 'prevent' fatal errors exhibits the same
kind of lack of imagination about the range of possible failure modes.
Nothing is perfect, but I'll take disciplined testing over relying on
ceremonial static typing any day.

> Don't just take my word for it:
>
> http://www.nap.edu/html/statsoft/chap2.html
>
> The above link has a case study on NASA's process for developing the
> Space Shuttle's flight control software.

Of course, even with good process, it's still hard to get it right the
first time. Remember "the bug heard round the world," which kept
Columbia on the pad during the first attempt to launch STS-1?

http://portal.acm.org/citation.cfm?id=1005928.1005929
 

Phillip Gawlowski


Rick DeNatale wrote:

|
| I was pondering this thread earlier today, before I pitched in, and
| was going to draw an analogy with Frank Borman's comments during the
| Senate committee hearing on the Apollo 1 fire. He said that the real
| cause of the fire was "a lack of imagination" about the dangers of
| doing ground testing with the spacecraft filled with pure O2 at
| sea-level atmospheric pressure.
|
| Relying on static typing to 'prevent' fatal errors exhibits the same
| kind of lack of imagination about the range of possible failure modes.
| Nothing is perfect, but I'll take disciplined testing over relying on
| ceremonial static typing any day.

Indeed. It is about knowing the limits of a language, and of its
features, too. Not just "What can @language do?", but also "What can't
@language do?" needs to figure into it.

And a lot of math can figure into it, too. Jim Weirich told an anecdote
to that effect in his keynote at MWRC08: a bug hit once in a million
calls, and the piece of hardware using that buggy software stalled
"once, maybe twice a day". After a bit of math, it turned out the code
was called ~1.3 million times in 8 hours - resulting in a failure "once,
maybe twice a day".

Faith is good. Testing (unit tests, functional tests, integration tests,
regression tests, usability tests, acceptance tests...) is better.

As Knuth once said: "Beware of bugs in the above code; I have only
proved it correct, not tried it" (or something along those lines,
anyway).

|
| Of course even with good process, it's still hard to get it right the
| first time, remember "the bug heard round the world," which kept
| Columbia on the pad during the first attempt to launch STS-1?

No, I don't remember that. I was a wee one when the Shuttle Program
started. :)

However, without process, any process, it is impossible to get things
right at *any* time.

The difficulty is in picking the most appropriate approach to a problem.
The NASA process doesn't necessarily translate into, say, corporate or
web development, or any situation where requirements change rapidly
and/or are not well understood (in the case of the Space Shuttle, the
requirements were well understood. Or so I hope. Business processes
aren't necessarily well understood, or even expressible).

I wouldn't use Agile to build flight control software. But I wouldn't
use a statistical methodology to build a billing system, either.

Long story short: in today's world we don't just have to be multilingual
in the languages we speak, but also adaptable in the methodologies we
are able to work in.

Well, computer science seems to be maturing, and thus software
development, too.

| http://portal.acm.org/citation.cfm?id=1005928.1005929

Dang, I'll have to find an alternative to this link (lacking the means
to access this resource, unfortunately).

--
Phillip Gawlowski
Twitter: twitter.com/cynicalryan

~ I'm looking for something that can deliver a 50-pound payload of snow
~ on a small feminine target. Can you suggest something? Hello...?
~ --- Calvin
 

Matt Todd

I'd much rather be damn sure and also have exception handling for what
I don't expect.

You know, because exceptions to the rules of life can happen and they
aren't always what you expect. Because life isn't always as linear as
we'd hope.

Matt Todd
 

ThoML

> [...] for example where type inference is used, as in Haskell or F#
> (or, to some extent, in C# 3.0)


IIRC, D is capable of doing some type inference too. But D is still on
my to-be-learned list.
 
