Object Relational Mappers are evil (a meditation)

Lie Ryan

A language is a thing. It may have syntax and semantics that bias it
towards the conventions and philosophies of its designers. But in the
end, a language by itself would have a hard time convincing a human
being to adopt bad practices.

Perhaps someone should do a study where a language is taught to kids:
one group is taught the language with its "bad words" filtered out, and
the other group is deliberately taught all of the language's "bad
words". Would one group have more behavioral problems than the other?
 
J Kenneth King

Lie Ryan said:
Perhaps someone should do a study where a language is taught to kids:
one group is taught the language with its "bad words" filtered out, and
the other group is deliberately taught all of the language's "bad
words". Would one group have more behavioral problems than the other?

I would be curious to know, but the test is likely impossible without
trespassing on ethical boundaries. ;)

I would hypothesize that you would not find an increase in behavioural
problems.

a) Without cultural context "bad words" have little meaning

b) Behavioural issues can be attributed to several factors such as
physiology, health, environment, etc.

c) This has nothing to do with programming languages. A programmer that
lacks critical thinking is a bad programmer. The language they use has
no bearing on such human faculties.
 
r0g

J Kenneth King wrote:
c) This has nothing to do with programming languages. A programmer that
lacks critical thinking is a bad programmer. The language they use has
no bearing on such human faculties.


The language may well have a bearing on the quality of the programs
generated though, which is what most people care about. A dolt writing
in Python is far less likely to write a program that bluescreens the
user's machine than a comparative dolt writing the same program in C or
assembler.

Of course, two gurus writing in different languages would produce
equally good results, but gurus are considered gurus by virtue of their
scarcity. Back in the real world, the further into dolthood you
venture, the more important the design of the language becomes to the
quality of output you can expect from your code monkeys.

Take 100 perfectly average programmers and give them the same programs
to write in a variety of languages, and you will get higher-quality
results from some languages than from others, i.e. not all languages
are equal. I think it's fair to say that the ones that give the best
results encourage good coding and the ones that give the worst results
encourage bad coding.

If you don't believe it's possible to have a language that encourages
bad coding practices, consider this one I just made up. I call it
Diethon.

It's entirely the same as Python 2.6 except that any syntax errors that
happen within class definitions cause the interpreter to send offensive
emails to everyone in your contacts list and then delete your master
boot record.

Unsurprisingly, users of this language are reluctant to try to write
object-oriented code and resort to ugly struct- and list-based
paradigms instead.

Roger.
 
Steven D'Aprano

On Mon, 21 Dec 2009 11:44:29 -0500, J Kenneth King wrote:
A programmer that
lacks critical thinking is a bad programmer. The language they use has
no bearing on such human faculties.

That's nonsense, and I can demonstrate it by reference to a single
programming language, namely Python.

For many years, Python had no ternary if operator:

result = x if condition else y

Instead the accepted, idiomatic Python way of writing this was to use
short-circuit booleans:

result = condition and x or y

However this idiom is buggy! If x is a false-value (say, 0) then result
gets set to y no matter what the value of condition.
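
A quick sketch of the failure mode (illustrative values, not
Hettinger's actual code):

condition = True
x = 0    # a perfectly legitimate value that happens to be false
y = 42

# Buggy idiom: x is falsy, so `condition and x` evaluates to 0,
# which is falsy, so the `or` falls through to y.
result = condition and x or y
print(result)    # 42 -- wrong, we wanted 0

# The old defensive workaround wraps both values in one-element lists,
# which are always true, then indexes the result:
result = (condition and [x] or [y])[0]
print(result)    # 0 -- correct

# The modern conditional expression is immune to the problem:
result = x if condition else y
print(result)    # 0 -- correct

The list-wrapping trick works only because a one-element list is always
true, regardless of the value inside it.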

This buggy idiom survived many years of Python development, missed by
virtually everyone. Even coders of the calibre of Raymond Hettinger (who
neither lacks critical thinking nor is a bad programmer) have been bitten
by this:

"The construct can be error-prone. When an error occurs it can be
invisible to the person who wrote it. I got bitten in published code
that had survived testing and code review: ..."

http://mail.python.org/pipermail/python-dev/2005-September/056510.html


This is a clear and obvious case where a language feature (in this case,
the lack of a feature) encouraged an otherwise excellent coder to make an
error. It was a very subtle error, which was not picked up by the author,
the tests, or the code reviewer(s). Had Python been different (either by
including a ternary if statement, or by forcing and/or to return bools
only) then this bug never would have occurred.

Of course awful programmers will be awful programmers in any language,
and excellent programmers will be excellent programmers in many languages.

(I say "many" rather than any deliberately. There's a reason why nobody
uses languages like Brainf*ck, Whitespace, Ook or Intercal for real work.)

But most coders are neither awful nor excellent. The language DOES make a
difference: the quality of a technician depends partly on the quality of
his tools, and programmers are no different.

If you don't believe me, imagine writing code in a language without
functions or loops, so you have to use GOTO for everything.
 
J Kenneth King

Steven D'Aprano said:
That's nonsense, and I can demonstrate it by reference to a single
programming language, namely Python.

For many years, Python had no ternary if operator:

result = x if condition else y

Instead the accepted, idiomatic Python way of writing this was to use
short-circuit booleans:

result = condition and x or y

However this idiom is buggy! If x is a false-value (say, 0) then result
gets set to y no matter what the value of condition.

This buggy idiom survived many years of Python development, missed by
virtually everyone. Even coders of the calibre of Raymond Hettinger (who
neither lacks critical thinking nor is a bad programmer) have been bitten
by this:

"The construct can be error-prone. When an error occurs it can be
invisible to the person who wrote it. I got bitten in published code
that had survived testing and code review: ..."

http://mail.python.org/pipermail/python-dev/2005-September/056510.html


This is a clear and obvious case where a language feature (in this case,
the lack of a feature) encouraged an otherwise excellent coder to make an
error. It was a very subtle error, which was not picked up by the author,
the tests, or the code reviewer(s). Had Python been different (either by
including a ternary if statement, or by forcing and/or to return bools
only) then this bug never would have occurred.

Of course awful programmers will be awful programmers in any language,
and excellent programmers will be excellent programmers in many languages.

(I say "many" rather than any deliberately. There's a reason why nobody
uses languages like Brainf*ck, Whitespace, Ook or Intercal for real work.)

But most coders are neither awful nor excellent. The language DOES make a
difference: the quality of a technician depends partly on the quality of
his tools, and programmers are no different.

If you don't believe me, imagine writing code in a language without
functions or loops, so you have to use GOTO for everything.

All very true.

But did the lack of ternary encourage Raymond to become a bad
programmer?

That, I believe, is the core of the argument. Sure, the misfeature was
overlooked by Raymond, but it took him (and perhaps the help of others)
to recognize it and fix it. That's because he's human and the language
is inert. He is smart and obviously has the cognitive capabilities to
recognize that the language has to change in order to be a better tool.

It would be a different story if he just assumed that the misfeature was
actually a feature and that it was a good thing. In such a case would
Python the language be at fault or the people who write programs with
it?

Good tools make all the difference in the world, I'm not arguing that.

Just that the tools don't use us; we use them. Programming in Python
doesn't instantly make me a better programmer. It can certainly make me
think of myself as a good programmer though... ;)
 
Steven D'Aprano

J Kenneth King said:
Steven D'Aprano said:
That's nonsense, and I can demonstrate it by reference to a single
programming language, namely Python.

For many years, Python had no ternary if operator:
[...]

But did the lack of ternary encourage Raymond to become a bad
programmer?

No, but Raymond started off in a position of being an excellent
programmer. A single buggy idiom led him to be slightly less excellent
than he otherwise would have been. How many buggy idioms would it take
to lead him to become a mediocre coder, if he were unable to change
languages?

Because Python is generally an excellent language, the harm done by one
or two misfeatures is minor. But less excellent languages encourage
coding styles, techniques and idioms that encourage the programmer to
write poor code: either complicated, baroque, unreadable code; or slow
inefficient code; or buggy code. To avoid starting a flame war, I will
avoid mentioning PHP. *cough*

Sometimes you know what you need to do to write non-buggy code, but
because covering all the corners is just Too Damn Hard in a certain
language, you simply lower your expectations. Error checking is tedious
and hard to get right in some languages, like C and Pascal, and hence
even good programmers can miss some errors.

Different languages encourage different mind-sets in the programmer: C
encourages the coder to think at the low level of pointers and addresses,
and primarily about machine efficiency; Java encourages the use of big
object hierarchies and design patterns (it's hard to write lightweight
code in Java, so everything turns into heavyweight code); Perl encourages
cleverness and code-golf (writing a program in as few lines or characters
as possible); Haskell and Lisp encourage a heavily abstract approach that
often requires an elite coder to follow; Forth encourages you to think
like Yoda.


[...]
Good tools make all the difference in the world, I'm not arguing that.

You appear to be arguing against that.

Just that the tools don't use us; we use them.

Nobody said that tools use us.

Programming in Python
doesn't instantly make me a better programmer.

No, not instantly, but I would argue that after many years of coding in
Python you will be a better programmer than after the same number of
years of coding in PHP or Basic.

It also depends on what you mean by "better programmer". Some languages
value cleverness above all else. Python is not a language for writing
amazing, awe-inspiring hacks that work where nobody but the author can
work out why. This is why there is an Obfuscated C contest and an
Obfuscated Perl contest but no Obfuscated Python contest -- it wouldn't
be anywhere near as awe-inspiring.

So one might argue that the best C and Perl coders are better than the
best Python coders, but the average Python coder is better than the
average C and Perl coder.

(I suggest this as a hypothetical, and do not wish to defend it
scientifically.)
 
Terry Reedy

This is only a bug if one expects otherwise.

The last statement is false. The hazard of using and/or was well-known
back in '97 or so when I discovered or learned it and I believe it was
mentioned in the FAQ entry on the subject. The new alternative has the
hazard that the condition and if-branch must be written and read in a
backwards order. I consider that buggy and do not use it for that reason.
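
To make the ordering difference concrete (a minimal sketch):

condition, x, y = True, 1, 2

# Statement form: the condition is read first.
if condition:
    result = x
else:
    result = y

# Conditional expression: the chosen value comes first, then the
# condition, then the alternative -- the reverse reading order.
result = x if condition else y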

Terry Jan Reedy
 
Steven D'Aprano

Terry Reedy said:
This is only a bug if one expects otherwise.

I'm not saying the behaviour of `a and x or y` is buggy, but that its
use as a replacement for a ternary conditional expression is buggy; the
*idiom* is buggy, not the behaviour of and/or.

If I say "you can make perfect hard boiled eggs by putting the egg in a
glass of water in the microwave on high for eight minutes", and the egg
explodes, that's not a bug in the microwave, that's a bug in the recipe.


The last statement is false. The hazard of using and/or was well-known
back in '97 or so when I discovered or learned it and I believe it was
mentioned in the FAQ entry on the subject.

We can argue about how well-known it really was, given that somebody
like Raymond Hettinger missed it, and whoever did the code review of
his application also missed it.

The new alternative has the
hazard that the condition and if-branch must be written and read in a
backwards order.

If you had asked me a couple of years ago, I would have agreed, but I've
now come to the conclusion that `x if condition else y` is not only
perfectly natural, but at least as natural as the conventional order of
`if condition then x else y` (at least for expressions, not for if
statements).

"Steven, what are you doing on Monday night?"
"Going to the movies if I can get away from work on time, otherwise
sitting at home answering questions on comp.lang.python."
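
Which maps directly onto the expression form (the names are made up for
illustration):

leaving_work_on_time = True    # hypothetical flag
monday_night = ("going to the movies" if leaving_work_on_time
                else "answering questions on comp.lang.python")
print(monday_night)    # going to the movies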
 
Lie Ryan

Terry Reedy said:
The last statement is false. The hazard of using and/or was well-known
back in '97 or so when I discovered or learned it and I believe it was
mentioned in the FAQ entry on the subject. The new alternative has the
hazard that the condition and if-branch must be written and read in a
backwards order. I consider that buggy and do not use it for that reason.

Oh really? I thought putting the conditional in the middle was
ingenious; whoever thought of it deserves the next Turing award!

I always felt there was something wrong with the (condition ? true :
false) or (if condition then true else false) expressions found in
other languages, and just realized it was because of their unnatural
ordering. I have to admit the reversed ordering did confound me at
first, but afterwards it felt even more natural than the traditional
"conditional-first" expression.
 
J Kenneth King

Steven D'Aprano said:
Steven D'Aprano said:
On Mon, 21 Dec 2009 11:44:29 -0500, J Kenneth King wrote:

A programmer that
lacks critical thinking is a bad programmer. The language they use
has no bearing on such human faculties.

That's nonsense, and I can demonstrate it by reference to a single
programming language, namely Python.

For many years, Python had no ternary if operator:
[...]

But did the lack of ternary encourage Raymond to become a bad
programmer?

No, but Raymond started off in a position of being an excellent
programmer. A single buggy idiom led him to be slightly less excellent
than he otherwise would have been. How many buggy idioms would it take
to lead him to become a mediocre coder, if he were unable to change
languages?

Because Python is generally an excellent language, the harm done by one
or two misfeatures is minor. But less excellent languages encourage
coding styles, techniques and idioms that encourage the programmer to
write poor code: either complicated, baroque, unreadable code; or slow
inefficient code; or buggy code. To avoid starting a flame war, I will
avoid mentioning PHP. *cough*

Sometimes you know what you need to do to write non-buggy code, but
because covering all the corners is just Too Damn Hard in a certain
language, you simply lower your expectations. Error checking is tedious
and hard to get right in some languages, like C and Pascal, and hence
even good programmers can miss some errors.

Different languages encourage different mind-sets in the programmer: C
encourages the coder to think at the low level of pointers and addresses,
and primarily about machine efficiency; Java encourages the use of big
object hierarchies and design patterns (it's hard to write lightweight
code in Java, so everything turns into heavyweight code); Perl encourages
cleverness and code-golf (writing a program in as few lines or characters
as possible); Haskell and Lisp encourage a heavily abstract approach that
often requires an elite coder to follow; Forth encourages you to think
like Yoda.

If anyone continues to follow bad idioms without questioning their
usefulness from time to time, I'd question their ability as a
programmer. Critical thinking is important. Which is why good programs
can be written in PHP, Forth, Lisp, Perl, and anything else. However,
if a programmer thinks the only language they will ever need to know is
BF, they have a serious screw loose. ;)
[...]
Good tools make all the difference in the world, I'm not arguing that.

You appear to be arguing against that.

Maybe you need to reconsider my arguments.

It takes a good programmer to recognize the values and trade-offs of the
tools they work with.

Ignorance is not an excuse to blame the language. It's too easy to say,
"Well Perl sucks because it encourages you to be a bad programmer
because it has all these features that let you shoot yourself in the
foot." In reality, lots of really great programs are written in Perl
all the time and some very smart people write them. It just so happens
that in the hands of the educated, those very features are useful in certain
cases.

Python doesn't "encourage" you to be a better programmer. It just
enforces particular idioms and conventions in its design. As long as
the ignorant programmer follows them they should be better off. Yet if
they are ignorant, no amount of encouragement will get them to think
critically about Python and find bugs in it. They will have to rely on
the community of developers to do that thinking for them.

Nobody said that tools use us.

But it is being suggested that they influence our thinking.

Pretty smart thing for a language to be able to do.

No, not instantly, but I would argue that after many years of coding in
Python you will be a better programmer than after the same number of
years of coding in PHP or Basic.

And my argument is that the human element is what will determine who is
better.

There are good programmers who can program in PHP. Some of the biggest
websites on the Internet are programmed in it. And like any language
I'm sure it has a good number of inefficiencies and bad design decisions
that the programmers using it had to work around. Yet despite it being
a poor language in your opinion, they built successful programs with
it. I wouldn't feel right calling them bad programmers.

(large portions of Facebook and Flickr, for example, are written in
PHP. They used to be written entirely in PHP before migrating the
bottlenecks out to lower-level languages as they scaled up... as is
common in most high-level languages)

It also depends on what you mean by "better programmer". Some languages
value cleverness above all else. Python is not a language for writing
amazing, awe-inspiring hacks that work where nobody but the author can
work out why. This is why there is an Obfuscated C contest and an
Obfuscated Perl contest but no Obfuscated Python contest -- it wouldn't
be anywhere near as awe-inspiring.

So one might argue that the best C and Perl coders are better than the
best Python coders, but the average Python coder is better than the
average C and Perl coder.

(I suggest this as a hypothetical, and do not wish to defend it
scientifically.)

I should hope not. ;)

Particularly because people often go out of their way to write clear,
concise, and maintainable Perl and C code every day.

In many contexts I'm sure there is reason to use Perl instead of Python
just as there are situations where C is more appropriate than either.

However, the mark of a poor programmer in my line of reasoning is one
who cannot recognize such distinctions.

One must be aware of the benefits and shortcomings of their tools. If
your tools influence the way you think, then you are being ignorant of
this principle. And I would suggest that makes you a poor programmer.
 
Ethan Furman

J Kenneth King said:
In many contexts I'm sure there is reason to use Perl instead of Python
just as there are situations where C is more appropriate than either.

However, the mark of a poor programmer in my line of reasoning is one
who cannot recognize such distinctions.

One must be aware of the benefits and shortcomings of their tools. If
your tools influence the way you think, then you are being ignorant of
this principle. And I would suggest that makes you a poor programmer.

Perhaps "influence the way you think" is not the right way to phrase
it... how about "be the tool" ;)

We have all seen the struggles that newcomers to a language go through
as they try (or don't try) to adjust their thinking to the tool at hand
-- programming Java, BASIC, FORTRAN, or xyz in Python. Even now Phlip
is raging against exceptions and the very Zen of Python.

Converting FoxPro to Python is an interesting exercise for me --
version 6 at least doesn't have many of the cool things that Python
does, and consequently thinking in Python while writing FoxPro (when I
have to) is extremely frustrating; going the other way is a bit of a
challenge also, although much more rewarding.

For a more concrete example, take sailboats and speedboats: you get
used to the speedboat, its quick handling and sharp turns... then you
take a sailboat out for a cruise -- if you don't adjust your thinking
from speed to sail, you could very well end up on the rocks.

To sum up: I agree with your "poor programmer" line of reasoning in the
second paragraph above, but not with the "tools influencing the way we
think" line of reasoning -- while I am cognizant of Python's
shortcomings, I am very much enjoying the changes in my thinking the
more I use it.

~Ethan~
 
