Marc Heiler
Hi,
On http://www.gcn.com/print/27_8/46116-1.html Ada is touted briefly.
The sentences that most jumped out at me (and hurt my brain a bit)
were these:
"[...] Ada has a feature called strong typing. This means that for every
variable a programmer declares, he or she must also specify a range of
all possible inputs.[...]"
"[...] This ensures that a malicious hacker can’t enter a long string of
characters as part of a buffer overflow attack or that a wrong value
won’t later crash the program. [...]"
But clearly that is simple to do in Ruby as well (and I have never heard of a
buffer overflow outside of the C world anyway): just specify which input
range is allowed and discard the rest, warn the programmer, or
simply convert the value to the nearest allowed one - am I missing
something? Maybe there are other reasons why Ada is still so en
vogue for aviation software, but I don't really get it (other than legacy
code that has been sitting there for thousands of years already). Maybe it is
a paradigm that is only possible in Ada.
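To make the "specify an allowed range, then reject or clamp the rest" idea concrete, here is a minimal Ruby sketch (a hypothetical Temperature class, just for illustration). The caveat is that Ada declares such a range once as a subtype and the compiler and runtime then check every assignment automatically, whereas in Ruby you have to write and remember the check yourself:

```ruby
# Hypothetical example: emulate an Ada-style range constraint in Ruby.
# In Ada this would be roughly: subtype Temperature is Integer range -50 .. 150;
class Temperature
  RANGE = -50..150  # the allowed input range

  attr_reader :value

  # Policy 1: reject anything outside the range outright.
  def initialize(value)
    unless RANGE.cover?(value)
      raise ArgumentError, "#{value} is outside #{RANGE}"
    end
    @value = value
  end

  # Policy 2: convert to the nearest allowed value instead of raising.
  def self.clamped(value)
    new(value.clamp(RANGE.min, RANGE.max))
  end
end

ok = Temperature.new(20)        # accepted
near = Temperature.clamped(999) # clamped down to 150
# Temperature.new(999)          # would raise ArgumentError
```

So the runtime effect is reproducible; what Ruby cannot easily reproduce is Ada rejecting an out-of-range assignment before the program ever runs.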
Ruby being too slow would be something I could not quite understand,
since, after all, you could write the critical parts in C anyway, or you could
use (in the case of replacing Ada) Lua - I'd figure Lua would be quite
fast. Somehow, despite still being in use, Ada seems to me like a
"dead" language (meaning no one really learns it anymore because better
alternatives are available).
The biggest confusion I have here is simply that strong typing is touted
as a very good thing to have. I don't know whether that is the case or not,
but it seems to me that this is more "behaviour" that is imposed onto
the programmer (as in, he or she must do extra work to ensure the
variables are a certain way, etc.).
For example, the "strong typing" as described here appears to me to be more of a
"force the programmer to do this and that". That may have advantages in
the long run - I don't know, maybe fewer bugs or no buffer overflow
problems - but to me it is still forcing the programmer to comply. I don't
get what is so great about having to worry about so many details. And on
blogs you sometimes see proponents of one solution scold the
people who use another (not only with typing, but also with test-driven
development and so on...).