The Industry choice


Stefan Axelsson

Paul said:
I do believe that it's a horrible deficiency in Python that it has no
declarations at all, even optional ones, like "perl -w" or "use
strict". Python's scoping hacks that result from the lack of
declarations just seem to me like pure insanity.

Yes, ignoring most of the debate about static vs. dynamic typing, I've
also longed for 'use strict'. Sure Python isn't as bad as (say) Awk in
this respect; you have to at least assign a variable to make it spring
into existence, but I've been bitten by typos there as well. The same
goes for object methods (I can never remember my method names).

Pychecker helps to some extent, but I wouldn't mind a compiler that only
accepted identifiers that had been declared. I don't think that anyone
could argue that typing 'use apa' before the first actual use (or words
to that effect) would 'slow them down', or be very onerous.

Stefan,
 

Mark Carter

It might be nice if it was widely understood (in IT) that Python was
a language any competent programmer could pick up in an afternoon

I am a programmer who works for a firm of engineers, where they program
in VBA, badly. I've often mentioned Python, whereupon I'm usually
dismissed as a crank. One of them expressed concern that if they used
Python and I left, then nobody would understand what to do. I could have
countered that Python is actually quite an easy language to pick up, but
what's the point.

We might be doing a project which involves web-type stuff. I pointed out
that if they did, they wouldn't be able to use VB/VBA, and may need to
use something like Python. I didn't get a reaction from that at the
time, but no doubt they'll be telling me that I'll have to make Excel
work through the internet, or something.
 

Peter Dembinski

[...]
For me, the effect is striking. I pound out a little program,
couple hundred lines maybe, and think "hm, guess that's it" and save
it to disk. Run the compiler, it says "no, that's not it - look
at line 49, where this expression has type string but context
requires list string." OK, fix that, iterate.

I believe program-specific unit tests are more effective than compiler
typechecking :)
 

Peter Dembinski

Paul Rubin said:
That's something to think about and it's come up in discussions,
but probably complicates stuff since it's not currently available
on the target platform. Also, the people on the project have
significant Java and Python experience but haven't used Ada.
Do you think it has real advantages over Java?

As I wrote before, it is a more redundant language[1], plus (AFAIR)
it has the strongest type checking of all the programming languages
I know about.

Plus, most Ada compilers (such as GNAT) generate machine/operating
system-specific code[2], not bytecode, which could be an advantage
if performance is one of the priorities.

I may have a slightly skewed viewpoint, because Ada 95 is the language
I most recently studied in my RTS labs :>


[1] for example, one has to define the interface and implementation
parts of each module in separate files

[2] AFAIR gnat generates C code, which is then compiled with gcc.
 

Paul Rubin

Mark Carter said:
We might be doing a project which involves web-type stuff. I pointed
out that if they did, they wouldn't be able to use VB/VBA, and may
need to use something like Python.

They'll probably use vb.net.
 

Roy Smith

Stefan Axelsson said:
Yes, ignoring most of the debate about static vs. dynamic typing, I've
also longed for 'use strict'.

You can use __slots__ to get the effect you're after. Well, sort of; it
only works for instance variables, not locals. And the gurus will argue
that __slots__ wasn't intended for that, so you shouldn't do it.
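
For what it's worth, a small sketch of the __slots__ trick (class and attribute names made up): with __slots__ in place, assigning to a misspelled instance attribute raises AttributeError instead of silently creating a new one.

```python
class Point:
    # __slots__ limits instances to exactly these attribute names, so a
    # typo on assignment raises AttributeError instead of quietly
    # creating a brand-new instance variable.
    __slots__ = ("x", "y")

    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2)
try:
    p.yy = 5          # typo for p.y
except AttributeError:
    print("typo caught at the point of assignment")
```

As noted, this only covers instance attributes, not locals, and it wasn't designed as a typo detector.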

Sure Python isn't as bad as (say) Awk in this respect; you have to at
least assign a variable to make it spring into existence

I think you've hit the nail on the head. In awk (and perl, and most
shells, and IIRC, FORTRAN), using an undefined variable silently gets
you a default value (empty string or zero). This tends to propagate
errors and make them very difficult to track down.

In Python, you raise NameError or AttributeError, so you find out about
your mistake quickly, and you know exactly where it is. The only time
you can really go wrong is when you've got multiple assignment
statements with the same lhs and you make a typo in one of them. And
even then, as you say, things like Pychecker will probably catch the
mistake.
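
The failure mode described here, sketched with throwaway names: the typo raises nothing, because the misspelled left-hand side simply binds a fresh variable.

```python
total = 10
totla = total + 5    # typo for 'total': silently creates a new name
print(total)         # still 10; no exception, the bug just sits there
```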

In perl, I always use "use strict", but in Python, I just don't feel the
need. Between the exception mechanism and unit tests, the odds of a
typo going unnoticed for very long are pretty slim. I'll admit I don't
use Pychecker, but if I was doing production code, I would probably use
it as part of my QA process.
I don't think that anyone could argue that typing 'use apa' before
the first actual use (or words to that effect) would 'slow them
down', or be very onerous.

Well, I'll have to (respectfully) disagree with you on that. It's not
that there's no value in explicit declarations, it's just that (IMHO)
the cost exceeds the value, given the other tools we have in Python to
catch the mistake.
 

Steve Holden

Paul said:
I wouldn't say so. I'd say the Linux kernel, GCC, Emacs, Apache,
Mozilla, etc. are all developed with a much more serious attitude than
Python is. Of course there are lots of other FOSS programs that
someone wrote for their own use and released, that are less polished
than Python, but that are also the subject of less advocacy than Python.

Well clearly there's a spectrum. However, I have previously written that
the number of open source projects that appear to get stuck somewhere
between release 0.1 and release 0.9 is amazingly large, and does imply
some dissipation of effort.

Given that there's no overall coordination this is of course inevitable,
but some open source projects are doomed from the start to be incomplete
because the original authors have never been involved in producing
software with a reasonably large user base, and so their production
goals and quite often their original specifications (where there are
any) are unrealistic.

These projects meander towards a half-assed initial implementation and
then become moribund.

This is not to tar respectable projects like Linux, many (but not all)
of the Gnu projects, and Python with that same brush, and personally I
think the Python *core* is pretty solid and quite well-documented, but I
don't regard IDLE as part of the core myself. Since I'm not an active
developer, this may not be in line with python-dev's opinions on the matter.

regards
Steve
 

Steve Holden

Aahz said:
That's funny -- Bruce Eckel talks about how he used to love checked
exceptions but has come to regard them as the horror that they are.
I've learned to just write "throws Exception" at the declaration of
every method.

Pretty sloppy, though, no? And surely the important thing is to have a
broad handler, not a broad specification of raisable exceptions?

regards
Steve
 

Steve Holden

Mark said:
I am a programmer who works for a firm of engineers, where they program
in VBA, badly. I've often mentioned Python, whereupon I'm usually
dismissed as a crank. One of them expressed concern that if they used
Python and I left, then nobody would understand what to do. I could have
countered that Python is actually quite an easy language to pick up, but
what's the point.

We might be doing a project which involves web-type stuff. I pointed out
that if they did, they wouldn't be able to use VB/VBA, and may need to
use something like Python. I didn't get a reaction from that at the
time, but no doubt they'll be telling me that I'll have to make Excel
work through the internet, or something.

They'll probably just move to .NET, which allows them to write .aspx
pages using VB.

regards
Steve
 

Stefan Axelsson

Roy said:
In perl, I always use "use strict", but in Python, I just don't feel the
need. Between the exception mechanism and unit tests, the odds of a
typo going unnoticed for very long are pretty slim. I'll admit I don't
use Pychecker, but if I was doing production code, I would probably use
it as part of my QA process.

Well, I don't have any experience with Python in the industrial setting
(all my Python has been solo so far). I do have quite a bit of
experience with Erlang (http://www.erlang.org) though, and while I agree
that it's not quite as bad in practice as the most vocal static typing
people would have it, it's not all roses either. The problem with unit
tests is that they can be skipped (and frequently are) and you also have
to be certain you exercise all code paths, even to detect a simple typo.
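
The code-path point can be sketched with a hypothetical function whose typo hides in the error branch; a test suite that only exercises the happy path passes cleanly.

```python
def parse_port(text):
    try:
        return int(text)
    except ValueError:
        # Typo: 'loger' is undefined, but nothing notices until this
        # branch actually runs.
        loger.warning("bad port %r" % text)
        return 0

# Happy-path test: passes, the typo is never reached.
assert parse_port("8080") == 8080

# Only a test of the failure path trips over the NameError:
try:
    parse_port("oops")
except NameError:
    print("latent typo exposed only by this code path")
```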

It's not that these survive for 'very long' or (God forbid) to the final
product, but many of them survive for long enough that they cost more
than they should have. So *if* (a substantial 'if', I realise)
my Erlang experience generalises to this case, I'd say the benefits
would outweigh the cost.

Then again I'm seriously considering going back to Haskell, so I guess
I'm at least a little biased. :) :)

Stefan,
 

Aahz

I can only believe that if you think the benefit of static typing is
psychological, either something is very different between the way you
and I write programs, or you're not doing it right.

For me, the effect is striking. I pound out a little program, couple
hundred lines maybe, and think "hm, guess that's it" and save it to
disk. Run the compiler, it says "no, that's not it - look at line 49,
where this expression has type string but context requires list string."
OK, fix that, iterate. Most of this goes about as fast as I can edit,
sometimes longer, but it's always about structural flaws in my program,
that got there usually because I changed my mind about something in
midstream, or maybe I just mistyped something or forgot what I was doing.
Then, when the compiler is happy -- the program works. Not always, but
so much more often than when I write them in Python.

That's just not true for me. Take my recent Java experience (please!).
I spent much effort trying to resolve stupid type dependencies that made
no sense. Python's duck-typing just works -- if it looks like you should
be able to use an object for a particular operation, you probably can.
Python programs that I write mostly just work; instead of pounding out
two hundred lines of code straight, I keep adding stubs and filling them
in, testing operation as I go. This isn't even unit-testing -- I haven't
drunk that Kool-Aid yet.

This is easy because running a Python program is faster than invoking the
Java compiler -- and you still haven't tested the actual operation of
your Java program.
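
The stub-and-fill style described here, in a deliberately tiny sketch (all names invented): the code under development only cares that its argument quacks like a file.

```python
class StubSource:
    # A stand-in that merely quacks like a file: it has read(), and
    # that's all the code under test cares about.
    def read(self):
        return "stub data"

def summarize(source):
    # Duck typing: no interface declaration, no type hierarchy;
    # anything with a read() method works here.
    return source.read().upper()

print(summarize(StubSource()))
```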
 

Aahz

Steve Holden said:
Pretty sloppy, though, no? And surely the important thing is to have a
broad handler, not a broad specification of raisable exceptions?

Yes, it's sloppy, but I Don't Care. I'm trying to write usable code
while learning a damnably under-documented Java library -- and I'm *not*
a Java programmer in the first place, so I'm also fighting with the Java
environment. Eventually I'll add in some better code.
 

Bulba!

There is the stability issue you mention... but also probably the fear
issue. If you choose a solution from a major company -- then it fails for
some reason or they drop the product -- it's their fault -- you've got an
automatic fall guy.

True. I have a bit of interest in economics, so I've seen, for
example, this puzzle: why do the foreign branches of companies
tend to cluster in one city or country (e.g. China right now)?
According to standard economics it should not happen: what's the
point of moving into an overpriced city if elsewhere in the country
you can find just as good conditions for business?

The point is obviously the "cover your ass" attitude of managers:
if the investment fails, the manager can defend himself with "but
everybody invested in that particular place, too, so you see, at the
time it was not a bad decision, we could not predict... yadda yadda".
 

Bulba!

You are ignoring the fact that with the open source solution you do at
least have the option of hiring bright programmers to support the
framework which has now become moribund,

Theoretically. Because even though the source code is available and
free (as in beer as well as in speech), the work of programmers isn't
cheap.

This "free software" (not so much OSS) notion that "you can just
hire programmers to fix it" doesn't really play out in practice, at
least not frequently: because the company or person remains ALONE
with the technology, the costs are unacceptable.

It's a case of the "marginal cost" (the cost of making yet another
copy) becoming equal to the cost of the whole project: that is
extraordinarily expensive software. If the software gets sold or
copied by the millions, the marginal cost drops towards zero, as is
the case with Linux.

Imagine NOT being a technology company (say, Sun or IBM or Borland)
and trying to hire programmers to fix the kernel of this operating
system for you.

whereas when a company goes bust there's no guarantee the software IP
will ever be extricated from the resulting mess.

There is a good _chance_ here: money. Somebody has poured a lot
of money into this thing; because of that, it's not going to get
dropped.
So I'm not sure I'd agree with "rational" there, though "comprehensible"
might be harder to argue with.

It depends on the definition of "rational", on the definition of
your or the company's goals, and on the definitions of the
situations that form the context.

Avoidance of blame is way too large a motivator in large organizations,
and it leads to many forms of sub-optimal decision making.

This might be of interest to some people:

http://www.pkarchive.org/new/DefectiveInvestors.html
 

Bulba!

Let me add a cautionary note, though: Big Companies,
including Oracle, Software AG, IBM, Cisco, and so on, have
adopted Tcl over and over. All of them still rely on Tcl
for crucial products. All of them also have employees who
sincerely wonder, "Tcl? Isn't that dead?"
I offer this as a counter-example to the belief that Adop-
tion by a heavyweight necessarily results in widespread
acceptance.

It's a quiet adoption. It's not a fireworks show a la Java, with
truckloads of money blown on it besides (I've read a good comment in
one of the IT periodicals that in the future Java will be cited as an
example of a work of genius: not of computing genius, though, but of
marketing).

There is a rational element in this craziness: people watch and see
that $$$ have been sunk into this, so they know that whoever sank
all those $$$ has very, very strong motivation not to let the thing
wither away. No guarantee, of course, but a much better chance that
they just won't drop it.

The problem with Python is not that it's not used, but that no
company like IBM has decided to blow $1 bln on it or some such.
Using it, and even announcing that you use it, is not enough. "Put
your money where your mouth is", or so the thinking goes.
 

Cameron Laird

.
[tale of *very*
typical experience
with non-software
engineers]
.
.
use something like Python. I didn't get a reaction from that at the
time, but no doubt they'll be telling me that I'll have to make Excel
work through the internet, or something.

I do that, by the way--work with Excel through the 'Net.

I use Python, of course.
 

Roy Smith

Let me add a cautionary note, though: Big Companies,
including Oracle, Software AG, IBM, Cisco, and so on, have
adopted Tcl over and over. All of them still rely on Tcl
for crucial products. All of them also have employees who
sincerely wonder, "Tcl? Isn't that dead?"

A lot of people laugh at Tcl, but it's really a very useful tool. In my
last job, we did a major component of our product (SNMP-based network
management package) in Tcl, probably on the order of 10 kloc. It has
its limitations, but it's very easy to learn, very easy to embed and
extend, and reasonably fast. It certainly blows away shell scripting.

Around here, AOL/Moviephone has been trolling for years for Tcl people;
I guess that counts as a big company.
 

beliavsky

Roy said:
I think you've hit the nail on the head. In awk (and perl, and most
shells, and IIRC, FORTRAN), using an undefined variable silently gets
you a default value (empty string or zero). This tends to propagate
errors and make them very difficult to track down.

You may recall correctly, but Fortran compilers have improved. The
following Fortran 90 program

integer, parameter :: n = 1
real :: x,y=2.0,z(n)
print*,"dog"
print*,x
z(n+1) = 1.0
print*,z
end

has 3 errors, all detected at compile time by the Lahey/Fujitsu Fortran
95 compiler, with the proper options:

2004-I: "xundef.f", line 2: 'y' is set but never used.
2005-W: "xundef.f", line 4: 'x' is used but never set.
2153-W: "xundef.f", line 5, column 1: Subscript out of range.

At run time, the output is

dog
The variable (x) has an undefined value.
Error occurs at or near line 4 of _MAIN__

Running Python 2.4 on the Python analog,

n = 1
y = 2.0
z = range(n)
print "dog"
print x
z[n] = 1.0
print z

one error is caught:

dog
Traceback (most recent call last):
File "xundef.py", line 5, in ?
print x
NameError: name 'x' is not defined

You will see the out-of-bounds error for z only after fixing the
undefined-x error. No warning is ever given about y, which is set but
never used. In practice, 'print "dog"' could be some operation taking
hours. Can PyChecker find all the problems in a single run, without
executing 'print "dog"'? If so, it would be great if it were integrated
with the CPython interpreter.

One reason interpreted languages like Python are recommended to
beginners is to avoid the edit/compile/debug cycle. But I think it is
faster and less frustrating to have many errors caught in one shot.
 

Roy Smith

[email protected] (Aahz) said:
Yes, it's sloppy, but I Don't Care. I'm trying to write usable code
while learning a damnably under-documented Java library -- and I'm *not*
a Java programmer in the first place, so I'm also fighting with the Java
environment. Eventually I'll add in some better code.

The whole point of exceptions is that they get propagated automatically.
If I'm not going to catch it, why do I have to even know it exists? I
don't consider "throws Exception" to be sloppy, I consider it to be
programmers voting with their feet.
 

Mark Carter

Cameron said:
.
[tale of *very*
typical experience
with non-software
engineers]
.
.


Don't start me! Dammit, too late ...

I've noticed that they have an overwhelming obsession with GUIs, too.
They design wizards for everything. Damn pretty they are, too. Albeit a
bit flakey. They seem to conflate pretty interfaces with good interfaces
and good software.

I used to joke that since our software wasn't particularly magical, it
didn't need wizards. But I think I just ended up sounding bitter.

We once had a bit of software that we thought we'd like to turn into a
generic application. The focus of improvements was, predictably
enough, that we should design a GUI that could do anything a client
would be likely to want. It was my opinion, though, having seen the
very special-cased nature of the original software, that it was
almost impossible to predict exactly how a customer might want the
product tailored. I suggested that what they really needed was a library
(Python would have been good for this, Lisp might have been even better)
that could be extended as required. GUIs second, functionality first.
But hey, what would I know. Fortunately, the whole thing's been put on
the back burner.

And then there's trying to get through to them why source control makes sense, that
when more than one person works on a project, some form of coordination
is required, that copying and pasting code is evil, and that Excel
probably isn't the hammer for every nail.

Honestly, I thought (real) engineers were supposed to be clever.
 
