[posted and mailed]
[...]
But this is also my approach, I guess. Not exactly writing Universe
Critical code, I tend to proceed with a light reading of the docs, and
learn by having my naive intuitions corrected -- as and when necessary,
by finding something broken.
My late boss approached languages and other problems the same way, and I
never could understand how he dealt with what for me would be continuous
frustration. He seemed to have no problems with it though -- although
there was the "what do you mean Python doesn't have a case statement?"
meltdown that almost led to an in-house Python dialect *eek*... Now, I'm
not against a case statement in Python per se, but until one has
experienced pattern matching in SML, one can't understand how inadequate
the Pascal/C case/switch statements are -- and I _am_ against C style
switches in Python (... and we all know how much my vote counts <grin>).
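For what it's worth, the usual Python stand-in for a switch is a dict
dispatching to callables -- a minimal sketch (the handler names here are
purely illustrative, not from any real codebase):

```python
# Dispatch on type via a dict of callables -- Python's idiomatic
# substitute for a C-style switch. Names are made up for illustration.
def handle_int(value):
    return "int: %d" % value

def handle_str(value):
    return "str: %s" % value

def handle_default(value):
    return "other: %r" % value

DISPATCH = {int: handle_int, str: handle_str}

def describe(value):
    # dict.get with a fallback plays the role of C's "default:" label
    return DISPATCH.get(type(value), handle_default)(value)

print(describe(42))      # -> int: 42
print(describe("hi"))    # -> str: hi
print(describe(3.14))    # -> other: 3.14
```

Note that, unlike SML pattern matching, this gives you no destructuring
and no exhaustiveness checking -- which is rather the point about how
much more a real pattern match buys you.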
I'm more the kind of person that reads the user's manual to the VCR before
ever putting a tape in (did I just date myself?). Doing language theory in
grad school, while not always the most immediately useful, is a wonderful
way to develop passionate ideas about how things ought to be (you should
have heard the lunch discussions when Java came out).
> But there is, I feel, a valid point implicit in my original question.
> The "I have been using Python 42 years and never imported copy" is a
> very ambiguous assertion, made time and again.
You're most certainly correct, although I don't believe it was very
ambiguous when I stated it. If you look at my response to Alex, you'll see
that I really do mean that copies are virtually never needed [in the
aesthetic, not the Turing complete sense], and if you do use them you're
likely to (a) do more work, sometimes potentially _much_ more work, (b)
complexificate the algorithm, making it harder to visually verify, (c) make
it _much_ harder to _prove_ correct, and (d) add another entity that
you'll have to keep in memory when you're visualizing the system.
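A toy sketch of what I mean (the example and names are made up for
illustration): the copy-then-mutate version drags in the copy module and
an extra mutable object, while building the new value directly needs
neither.

```python
import copy

# Copy-then-mutate: extra work, and one more object to keep in your
# head while visualizing the system.
def discounted_copies(prices, rate):
    result = copy.deepcopy(prices)   # deep copy of a list of floats
    for i in range(len(result)):
        result[i] = result[i] * (1 - rate)
    return result

# Build the value you want directly; there is simply nothing to copy.
def discounted(prices, rate):
    return [p * (1 - rate) for p in prices]

prices = [100.0, 250.0]
assert discounted(prices, 0.1) == discounted_copies(prices, 0.1)
assert prices == [100.0, 250.0]  # the original is untouched either way
```

Both are "working code", but only one of them is trivially easy to
verify by eye -- which is the aesthetic point below.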
[digressing, I never thought I'd feel the need to do algorithmic proofs
until I started working for the people who give you your FICO score. The
thought that we measure personal information in (a lot of) terabytes, that
we can store over 4000 individual pieces of data about a person, and that a
bug could make someone's life severely miserable... No wonder that for such a
mission-critical subsystem as the 'data-structure' managing this
information we wouldn't use anything but a Python code generator,
generating over 380KLoc of C++ code, SQL schemas, stored procs, etc.]
Getting back to my point about copies, it is almost purely an aesthetic
issue. You can get working code while creating copies all over the place
without too much effort. If getting the code to work is the final goal,
there is no place for what I call "clean" code. For me, working code hasn't
been very exciting, probably since I saw the BSD sources long ago and was
utterly flummoxed when the pinnacle of what I considered a big, complex
"program" was easier to understand than what I was writing. I was
immediately convinced that only clean code can be good code, and only good
code can create exceptional systems. Being an aesthetic quality, crafting
clean code is more art than engineering, but by listening to the coders
around me who wrote good code, and through my own trial and error, I have
no problem categorically denouncing the creation of copies.
I can also categorically say that it's a really bad idea to walk to the
middle of a bridge, precariously climb up on the railing, and do a
swan dive towards the dry riverbed several hundred feet below... but it's
really fun if you have a bungee cord attached.
I haven't encountered any rules that didn't come with their own list of
exceptions. However, while the exceptions _are_ important, focusing on them
is perhaps misplaced. Taking an extra look when you're creating a copy has
an amazing track record in languages with gc... as does removing functions
bigger than 25x80, more than two levels of nested loops in a function, more
than two sequential loops in a function, more than three levels of
inheritance, almost all cases of multiple inheritance, classes with more
than a dozen methods, inconsistent use of whitespace, etc., etc.
Sure, it is weird to want to think of such trivialities when the code is
working, but clean code makes me happy <smile>, and hey, after 20+ years of
programming, I'm still excited when I go to work and I still program after
I come home...
I hope you don't take my verbose mode as any kind of criticism -- while I
have no problem voicing my opinion about how things should be, I never
presume to tell someone else what to do (no desire, and besides, friends
who disagree with me are much more interesting ;-) I sincerely cannot
grasp, however, how some people can "break path" in a new domain guided
mainly by their intuition. My intuition definitely doesn't work at that
level.
If you have any meta-analysis it would be very interesting to
contemplate...
[...]
> Don't we need to pick one meaning for that assertion? Or make the
> meaning clear, when asserted.
Of course not, this is USENET <wink>. I was going to put something here
about the quote from Winnie the Pooh about "when I say ___ I mean ___", but
googling for "Winnie the Poo" is not something anyone should do before
lunch... seriously.
Anyways, God Jul og Godt Nytt År allesammen (Merry Christmas and a Happy
New Year, everyone),
-- Bjørn Steinar