Joel Spolsky on languages for web programming


Alex Young

David said:
Utter pants. I mean, you used the word "bloat", which should make people
lose any debate by default.



Neither is Ruby / Rails. *No technology* is a guarantee for success, no
technology ever was, and I'll bet a gold bar against a plastic spoon no
technology ever will. Technology used is a very important decision to
make, but it never single-handedly moves you from doable to undoable or
vice versa.
That's the wrong argument to pick. Try calculating the full dynamics of
a modern metropolitan water supply network with just pen and paper.
Technological advances *do* move us from undoable to doable, and it's
specific technologies that do it.

Pure, unadulterated shite. Give me numbers. Credible statistics and real
research, not random anecdotal success stories that are too pathetic to
sell Herbalife diet pills.
I'm not going to address this - research on this level is heavily
funded, and heavily trend-driven. The answers you get depend too
heavily on what questions you ask.
Also, initial development cost isn't a very important factor. Recall
your uni software lifecycle charts about how much of a project's life is
maintenance. For a successful project, the numbers are very much true.
With a successful product comes the responsibility of supporting it and
keeping it successful, and in some cases this responsibility creates
ongoing costs that dwarf the initial development horribly.
No argument there whatsoever.
No one cares about pretty. It's also a completely irrelevant issue when
deciding on implementation language if you're at least remotely
responsible.
Actually, pretty does matter. The comfort of a problem solver directly
impacts his/her approach to a problem. That's just human nature.
Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That's pure fact.
I remain unconvinced by this - and it's mainly JIT optimisation that
keeps me on the fence. Dynamic optimisations can beat static - but not
in all cases. I believe this is what one calls an "open research" question.
 

Joseph

Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Basically what he is saying can be deconstructed this way:

* Do not risk developing in new cutting-edge technology, even if
successful proofs of concept are already out there (37signals et al.)
* Use what most people use: PHP / J2EE / .Net, not what most experts
tell you to use. Communities and support are paramount.
* Corporations and the people in those organizations favor safety; if
your job is on the line, go with the tried and true. Take no risks.

All three assumptions rely on a single assumption: FEAR.

* Fear that the technology will eventually not deliver.
* Fear the support will not be sufficient.
* Fear regarding your job safety as a corporate developer or manager
who chooses Ruby or Ruby on Rails for some mission critical project.

All assumptions are wrong.

The only way significant progress is accomplished is precisely through a
combination of FAITH and COURAGE. That will make you stand out
anywhere.

The ideal place for those characteristics is inside a startup or inside
a bold, courageous corporation! It is not about the size of the
organization, though; it is about the courage and boldness of the people
inside those companies.

People forget how the Internet, yes the OLD Internet was built. It was
done on new technology (www, http, mosaic, Perl), new development
models (open source, collaboration), new business objectives (community
first, users, and yes finally profits too.)

So, this is my take on this issue regarding Ruby and Ruby on Rails:

Do it, risk it, it's worth it.

And the biggest advantage of Joel's thinking, for you, would be that
neither he nor the corporations who think like he does (most of them) will
be your competition. So when they do have some serious issues to
tackle, like that huge framework called [insert-your-safe-choice-here]
bending over backwards to do what needs to be done fast... you will
have the last laugh.

Best Regards,

Jose L. Hurtado
Web Developer
Toronto, Canada
 

Phlip

Joseph said:
Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Joel is such a good writer that sometimes his jaw-dropping errors are
impossible to refute. (And don't encourage him; he loves it when you fight
back!)
Basically what he is saying can be deconstructed this way:

* Do not risk developing in new cutting-edge technology, even if
successful proofs of concept are already out there (37signals et al.)
* Use what most people use: PHP / J2EE / .Net, not what most experts
tell you to use. Communities and support are paramount.

The open source tools that succeed must have higher technical quality than
the Daddy Warbucks tools. The latter can afford to buy their communities and
"support" networks. Because an open source initiative cannot buy its
community and marketing, only the strong survive, and their early adopters
will form this community spontaneously. They will provide the true
word-of-mouth advertising that marketing tends to simulate.

And I am sick and tired of seeing shops dragged down by some idiotic
language choice made between the marketeers and a computer-illiterate
executive.
* Corporations and the people in those organizations favor safety; if
your job is on the line, go with the tried and true. Take no risks.

Ah, so looking like you are following best practices is more important than
doing everything you can to ensure success. Gotcha!

Yes, I have seen that up close, too!
All three assumptions rely on a single assumption: FEAR.

* Fear that the technology will eventually not deliver.
* Fear the support will not be sufficient.
* Fear regarding your job safety as a corporate developer or manager
who chooses Ruby or Ruby on Rails for some mission critical project.

Yup - that's the Fear, Uncertainty, and Doubt formula that Microsoft (among
others) uses all the time. They have tried, over and over again, to FUD
Linux. Their CEO will get up on stage and say incredibly stupid things, like
"if an open source platform fails you, there is nobody you can go to for
help!" He means there's nobody you can sue. As if you could go to MS for
help without paying through the nose...

Oh, Joel is pro-Linux, right? What's the difference??
All assumptions are wrong.

Better, fear that your boss will experience misguided fear.
 

Vidar Hokstad

Joseph said:
Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Basically what he is saying can be deconstructed this way:

* Do not risk developing in new cutting-edge technology, even if
successful proofs of concept are already out there (37signals et al.)
* Use what most people use: PHP / J2EE / .Net, not what most experts
tell you to use. Communities and support are paramount.
* Corporations and the people in those organizations favor safety; if
your job is on the line, go with the tried and true. Take no risks.

All three assumptions rely on a single assumption: FEAR.

No. They rely on sound risk management principles.
* Fear that the technology will eventually not deliver.

Replace "Fear" with "Risk" and the above is reasonable if your company
does not have people experienced in a particular technology. And fact
is today it is still far harder to find people skilled at Ruby than
many other languages. More importantly, there is too little experience
with many Ruby technologies for a company with no Ruby experience to
_know_ whether Ruby will be appropriate for a specific project.
* Fear the support will not be sufficient.

Replace "Fear" with "Risk" again. The company I work for, Edgeio, uses
PHP for our frontend code (but Ruby for our backend) because when we
started building it I had concerns about the availability of people
skilled with Ruby in general or Rails in particular.

When we started hiring, those concerns were validated: it has proved
extremely hard to find people with Ruby experience. While it's
certainly getting easier rapidly, not everyone can afford to take the
risk. In our case I decided to start phasing Ruby in for small,
self-contained components in our backend, and to take it gradually from
there as we gain enough Ruby skills through training or hiring. That has
proven to work well, and it meant that if we had run into unforeseen
problems, the effort involved in switching back to another language
would have been limited.
* Fear regarding your job safety as a corporate developer or manager
who chooses Ruby or Ruby on Rails for some mission critical project.

Which is very valid if you make a choice detrimental to the company,
regardless of which language it involves. As much as I love working with
Ruby, if someone working for me picked it for something inappropriate,
and the project tanked badly, they certainly would have to take
responsibility for the language choice. If you don't have Ruby skills,
or your team doesn't have sufficient Ruby skills, or there aren't
enough skilled Ruby developers available in your location, picking it
for a high-risk project will certainly not speak in your favor with any
risk-conscious manager.

"Fear" as you say, or "risk" is an important decision factor for any
conscientious manager. Deciding what level of risk is appropriate for a
project vs. the potential payoffs is one of the most important skill a
manager must have to make good decisions.

The key is whether you or your team has, or can easily acquire, the skills
required to minimise the risks and maximise the payoff. For many teams
that will not be the case when dealing with any specific new tech.

As for "successfull proof of concepts", they mean nothing unless a) you
have the same skills and resources as the company in question, and b)
your project is sufficiently similar. Which means most decisions about
technology tends to boil down to 1) what your team knows to a certain
degree, 2) which technologies are the most widely deployed. Ideally
you're looking for an intersection.

_Sometimes_ the payoff in trying a technology that your team is
inexperienced with, or that isn't widely deployed, is large enough to
outweigh the risks; or the risks are mitigated by your team's experience
(in the case of tech that isn't widely deployed) or by the available
pool of external experience (in the case where your team doesn't have
the skills). But that is not a decision to take lightly.

I am all for using Ruby, and I think a lot of companies that aren't
using Ruby could get great benefit from testing it. But on low impact,
low risk, simple projects first. Not because Ruby in itself is
inherently high risk, but because few companies have enough experience
with Ruby to jump right into using it on a large or high profile
project.

Vidar
 

Alvin Ryder

David said:
Utter pants. I mean, you used the word "bloat", which should make people
lose any debate by default.

I don't like bloated software; it is unnecessary.
Neither is Ruby / Rails. *No technology* is a guarantee for success, no
technology ever was, and I'll bet a gold bar against a plastic spoon no
technology ever will. Technology used is a very important decision to
make, but it never single-handedly moves you from doable to undoable or
vice versa.

There are many factors required for success, and I don't believe any one
factor guarantees it; but interestingly, it can take as little as one
element gone wrong to ruin everything.
Pure, unadulterated shite. Give me numbers. Credible statistics and real
research, not random anecdotal success stories that are too pathetic to
sell Herbalife diet pills.

The "10 to 3" ratio wasn't meant to be taken literally surely you don't
think otherwise? And can you tell me where can I get such "credible
statistics and real research" from?

Are you saying all languages yield the same level of productivity? If
they aren't equally productive then how much more productive is Java
over C++ or VB over assembler? Do you need "credible statistics and
research" to answer the question?
Also, initial development cost isn't a very important factor. Recall
your uni software lifecycle charts about how much of a project's life is
maintenance. For a successful project, the numbers are very much true.
With a successful product comes the responsibility of supporting it and
keeping it successful, and in some cases this responsibility creates
ongoing costs that dwarf the initial development horribly.

I disagree; the initial cost is vital. Most projects get approved or
not approved based on that initial cost, and if that money is drained on
developers trying to tame an unwieldy platform instead of building the
actual system, then we have a problem, don't we?
No one cares about pretty. It's also a completely irrelevant issue when
deciding on implementation language if you're at least remotely responsible.

I care about pretty.
You need to tighten up Unix-based servers too. Heck, there are even
serious and expensive firewalls for Linux around too, because not
everyone has an in-house iptables guru.

True, no platform is 100% impervious to attack, but some are less secure
than others.
Sometimes things aren't so common. Ruby and Rails DO have faults. Just
google around. I'm not going to name-call, out of respect and out of a
sense of realism - every technology has flaws, and any mudslinging would
only lead to a pointless flamewar. Sometimes they are uneducated rants
and / or whining, but some of them are valid.

Yes I know all platforms have faults and wish lists, I didn't think
otherwise.
And if you do NOT go out and learn about these flaws and what impact
they could have, and are not fully aware of them when making the
implementation technology decision on a project, so that you can weigh
the severity of their impact under the circumstances of your project,
then your decision may cause a lot of trouble.

Fair enough, I agree. I think software should be published with
specifications and limits, as they do in other industries. This is a 100
ohm resistor +/- 2%, capable of running at these temperatures, it
handles this much power... but in software it's just "blah"; you have to
discover the limits yourself (ouch).
Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That's pure fact. Of course, that's not to say
Ruby can't be fast enough - but there have been people with more experience
on the performance side of software development who have talked much better
about that.

I'm not sure how fast or slow Ruby is, but if it's as fast as Perl I'll
be happy enough. Yes, I know C is faster, but I need fast development
times too.
I advise you to go through freshman year at a management school. It's
the managers' job to "not have balls" and not take a risk when there's
apparently nothing to be gained from taking it. If you want to be a Ruby
advocate, you need to be able to persuade them, not yourself, of the
advantages of using it.

I've worked with Harvard-level managers; they seemed to think it *was
their job to have balls*, which is the opposite of what you're saying? I
prefer to work with managers that have the knowledge, intelligence, energy
and conviction to back up their decisions.

Besides that, the choice of language is usually mine; that's why I
gravitate to the more productive ones. In my experience run-time
performance is rarely an issue, but development time is.
That's because it's not. Projects never fail purely on a technology
choice - more often than not they fail as a result of bad project
planning (assigning novice programmers to large projects that will
probably go over their heads), or of mistaken business objectives, as
whoever contracted the software finds he isn't really interested in what
the tech demos had to show at all.

And the stereotype of lazy management that never gets punished is good
to make Dilbert strips from - in real life, it probably doesn't hold
true outside of a select few huge moloch companies, or, on the opposite
side of the spectrum, small short-lived hick-led shops where the boss's
kids and nephews get all sorts of crap assigned to earn a better allowance.
In a well-led company with working internal management processes, when
the shit hits the fan, everyone gets the stink.

No, I've seen it hold in real life too many times; the "pointy-haired
boss with the corner office" still brings a chuckle out of me.

David Vallner

Cheers ;-)
 

M. Edward (Ed) Borasky

David said:
You need to tighten up Unix-based servers too. Heck, there are even
serious and expensive firewalls for Linux around too, because not
everyone has an in-house iptables guru.
But everybody *should* have a *certified* Cisco engineer if they use
Cisco routers, for example. It's one of the costs of doing business.
Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That's pure fact.

I'm not sure I agree with you here. First of all, while latent typing
may prevent you from optimizing (and I'm writing in Perl, not Ruby)

$j = 0;
for ($k = 0; $k < 100000; $k++) {
    $j++;    # after the loop, $j == $k == 100000
}

to

$j = $k = 100000;

that kind of optimization is a trick used by compilers to get good
performance on trivial benchmarks, rather than something with a more
wide-ranging real-world payoff.

Second "compiled languages", like Java, C#, C++ and even C have
extensive optimized run-time libraries to do all the lower-level things
that a "true optimizing compiler", if such a thing existed, would do
automatically. Over the years, compilers have improved to the point
where they generate optimal code for things like LINPACK and the
Livermore Kernels.

In short, I don't see why a Ruby interpreter *and* run time can't
compete with a Java, C# or C++ compiler *and* run time! As long as you
have to have the same number of bits around to keep track of the
program's data structures, objects, etc., "optimization" becomes a
matter of implementing the operations on the data structures efficiently.
 

Devin Mullins

David said:
No one cares about pretty. It's also a completely irrelevant issue when
deciding on implementation language if you're at least remotely
responsible.
*Everyone* cares about pretty. http://www.paulgraham.com/taste.html

Pretty means understandable, maintainable, clean (and what the heck does
clean mean? reduced duplication?). Pretty means fewer LOC, which is
about the only objective measure of maintainability we know. (Cyclomatic
complexity being another, I suppose...) Pretty means fun, which we all
know means productive.
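
To make that concrete, here's a trivial sketch of my own (not anything from
the thread): the same filtering logic written the long way and the short
way. Same behavior, fewer lines, easier to take in at a glance:

# The long way: an explicit accumulator loop.
evens = []
for n in 1..10
  evens << n if n % 2 == 0
end

# The "pretty" way: one expression, no mutable state.
evens = (1..10).select { |n| n % 2 == 0 }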
Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That's pure fact.
Irrelevant. In many cases, the fact that Ruby has latent typing is an
*implementation detail*. Ruby has *no type declarations*, but in many
cases static type inference can be applied to get the same optimizations
of which Java and C# implementations avail themselves. (Disclaimer:
that's about as much as I know about this subject.)
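
As a purely illustrative sketch of my own (hypothetical - CRuby does nothing
of the sort): in a method like the one below, every value's type is knowable
without a single declaration, which is the kind of code a type-inferring
implementation could in principle compile with static dispatch:

def sum_of_squares(limit)
  total = 0              # inferably an Integer from this point on
  1.upto(limit) do |i|   # i is inferably an Integer
    total += i * i       # so both calls resolve to Integer arithmetic
  end
  total
end

puts sum_of_squares(100) # => 338350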

That's not to say that I expect the current CRuby maintainers to add
such optimizations. They seem not to care, and that's just fine by me.

Devin
 

Chad Perrin

And another point is that quite a few Ruby frameworks do amount to
defining a domain-specific language in Ruby - cf. Og data definition,
Puppet, rake. There's a (maybe not quite fine) line between a very
specific framework and a DSL that just gets crossed, and I don't believe
rubyists are the innocents to throw the first stone.

There's a distinct difference between a subset of an already extant
language and an entirely separate language with its own idiomatic
syntax.
 

Chad Perrin

That's the wrong argument to pick. Try calculating the full dynamics of
a modern metropolitan water supply network with just pen and paper.
Technological advances *do* move us from undoable to doable, and it's
specific technologies that do it.

. . . and in any case, I don't think anyone was saying Ruby was any kind
of guarantee of anything. The point is that Joel Spolsky's
characterization of ultraconservative technology choices as necessarily
"right" is chaff and nonsense. Despite Joel's usually intelligent and
well-reasoned commentary, he dropped the ball on this one, effectively
saying that Ruby is a guarantee of failure.

Bollocks, I say.

No argument there whatsoever.

I have a caveat to add:

While it's true that initial development is often one of the cheaper parts
of a "successful" project, the cost of initial development is still
critically important. If your initial development is too costly, you never
get to
maintenance. Additionally, if you think middle managers think ahead
enough to just ignore initial development costs (even when they can
afford to do so) in favor of long-term cost savings, you probably
haven't dealt with middle managers as much as I have. CxO-types are
even worse, because their job success metrics are more tied to quarterly
stock prices and market shares than anything more long-term (generally
speaking).

Actually, pretty does matter. The comfort of a problem solver directly
impacts his/her approach to a problem. That's just human nature.

. . . and how much more do you think it costs in the long run to
maintain code that is a nasty, overly complex, ugly mess? Pretty
matters.

I remain unconvinced by this - and it's mainly JIT optimisation that
keeps me on the fence. Dynamic optimisations can beat static - but not
in all cases. I believe this is what one calls an "open research" question.

Unfortunately, JIT implementations haven't been subjected to the same
long-term scrutiny and advancement as more traditional persistent binary
executable compiling implementations. As a result, I don't think the
state of the art is there yet -- leaving JIT implementations effectively
slower by nature until they get some more advancement over the years to
come. I really believe that gap will be closed rapidly in the near
future. Only time and experience will tell whether it can be made as
fast or faster, though I have no doubt that it can at least be made
close enough that most of us won't care.
 

Chad Perrin

To be fair, it's not just corporate politics. Statistically, it's more
likely a development house will have a strong base of Java developers or
C# developers (C#, while being very young and so far an abomination unto
Nuggan, is reasonably Java-compatible), and starting a Rails project
means you'll probably have to get people with no Ruby experience on the
team, or create a maintenance burden on the company in case the original
team falls apart and quits for other companies, or whatever.

Choosing a language despite the resources at your disposal, rather than
because of them, would probably make that a "bad decision". That in no
way invalidates the summarized point I already made:

"Regardless of how good or bad a decision a given language is for a
given task, Ruby is more likely to get you fired that Java."
 

M. Edward (Ed) Borasky

Alex said:
That's not quite the same - those DSLs build upon a known and well
understood foundation, because they use Ruby's syntax to their own ends.
I'm inferring from the very little information that's out there that
Wasabi has its own parser, and that makes it a very, very different
beast to a DSL in the sense that I've come across the term in Ruby.

To use Martin Fowler's terminology, there are external DSLs -- a
language created for the domain and implemented with a parser, etc., in
some general-purpose language. And there are *internal* DSLs, written as
extensions/subsets inside a language like Ruby.

Rails and rake are internal DSLs, and Ruby makes internal DSL creation
much easier than many other languages. I can't tell from this thread
whether Wasabi is external or internal.

I hardly think of an external DSL as anything special any more. They've
been around as long as I've been programming, which is -- well, let's
just say your toaster has more compute power than the machine I learned
on. :) Almost every major decades-old Fortran code, for example, is
really implementing an external DSL.
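
For illustration, here's a minimal rake-flavored internal DSL of my own
devising (the task/run names are invented, not rake's actual API).
Everything in it is ordinary Ruby - methods, symbols, and blocks - which is
exactly what makes it internal:

$tasks = {}

# "task" registers a named block; "run" looks it up and calls it.
def task(name, &body)
  $tasks[name] = body
end

def run(name)
  $tasks.fetch(name).call
end

task :greet do
  puts "hello from an internal DSL"
end

run :greet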
 

Chad Perrin

But everybody *should* have a *certified* Cisco engineer if they use
Cisco routers, for example. It's one of the costs of doing business.

Frankly, iptables is easier to learn effectively than most proprietary
firewalls -- and then there's stuff like IPCop, which makes things even
easier.
 

James Edward Gray II


William Grosso

Vidar said:
When we started hiring, those concerns were validated: it has proved
extremely hard to find people with Ruby experience. While it's
certainly getting easier rapidly, not everyone can afford to take the
risk. In our case I decided to start phasing Ruby in for small,
self-contained components in our backend, and to take it gradually from
there as we gain enough Ruby skills through training or hiring. That has
proven to work well, and it meant that if we had run into unforeseen
problems, the effort involved in switching back to another language
would have been limited.

This leads to an interesting question: how many Ruby programmers are
there, anyway?

I ran across http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
today and boggled at the "2.5 million" number for PHP.

Any ideas for Ruby?


Bill
 

William Grosso

William said:
This leads to an interesting question: how many Ruby programmers are
there, anyway?

I ran across
http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
today and boggled at the "2.5 million" number for PHP.

Minor correction: I ran across that, and then read the following in Mark
De Visser's profile on LinkedIn.

Zend Technologies creates PHP products, software for rapid development
and deployment of Web applications. PHP is being increasingly adopted
with an estimated 2.5 million developers currently using it and 22 million
deployed websites.
 

M. Edward (Ed) Borasky

Chad said:
Frankly, iptables is easier to learn effectively than most proprietary
firewalls -- and then there's stuff like IPCop, which makes things even
easier.
When there are certified iptables engineers, I'll trust my business to
them. Until then, I'm sticking with Cisco and certified Cisco engineers.
When you post a job opening for a sysadmin position, you're going to
get at least ten times as many applicants as you need, so you can afford
to *insist* that they be certified by Cisco, Microsoft or Red Hat as
appropriate.
 

M. Edward (Ed) Borasky

Chad said:
Unfortunately, JIT implementations haven't been subjected to the same
long-term scrutiny and advancement as more traditional persistent binary
executable compiling implementations. As a result, I don't think the
state of the art is there yet -- leaving JIT implementations effectively
slower by nature until they get some more advancement over the years to
come. I really believe that gap will be closed rapidly in the near
future. Only time and experience will tell whether it can be made as
fast or faster, though I have no doubt that it can at least be made
close enough that most of us won't care.

In the "good old days", an assembly language programmer could turn out
code that was from 2 to 10 times as fast as that turned out by a
compiler, and a compiler could turn out code that was from 2 to 10 times
as fast as an interpreter.

The gap has narrowed. It's rare that an assembly language coder can beat
a compiler by more than a factor of 2 these days, and on some
architectures it's a dead tie -- there's only one way to do something
and the compiler always finds it. Interpreters are better now too,
mostly because today's languages have such a large component that has to
be dealt with at run time anyway that the "heavy lifting" is done by
compiled code.


I'm not sure JIT is "necessary" for efficient interpretation of Ruby
anyway. But you're right... if the economics are there, the gap will get
closed, just like the compiler/assembler gap got closed.
 

M. Edward (Ed) Borasky

Alvin said:
Are you saying all languages yield the same level of productivity? If
they aren't equally productive then how much more productive is Java
over C++ or VB over assembler? Do you need "credible statistics and
research" to answer the question?

*He* may not be saying all languages yield the same level of
productivity. But I'll say something similar: the productivity of
programmers depends more on their knowledge of the application area and
their *familiarity* with the development environment than it does on the
environment and language.

There are tools that can drag down an otherwise productive team, but
they tend to get discarded fairly quickly.
 

Vidar Hokstad

Devin said:
Irrelevant. In many cases, the fact that Ruby has latent typing is an
*implementation detail*. Ruby has *no type declarations*, but in many
cases static type inference can be applied to get the same optimizations
of which Java and C# implementations avail themselves. (Disclaimer:
that's about as much as I know about this subject.)

You're absolutely right.

Look to Haskell for a good example of a _statically typed_ language
almost free of type annotations of any kind - type information is almost
exclusively inferred by the compiler (though you can add type
annotations).

While Ruby has features that make it impossible for an implementation
to use strict static typing everywhere, a lot of a typical Ruby
application could be statically typed by an implementation using type
inference fairly easily by doing some relatively simple flow analysis
combined with marking up the parse tree.

Doing it for a pure interpreter would be easy, but the advantages would
be relatively limited. Doing it for a JIT compiler would also be quite
straightforward, and has the potential for very significant
speedups.

For a full-fledged compiler it would be tricky without some
restrictions - the main problem is Ruby's introspective features and
various eval mechanisms, which mean the type inference valid at
compile time might not hold at runtime. Add a few restrictions on the
use of load/require etc. and of eval, and/or some way of adding basic
type annotations to guide the compiler at "extension points"
(classes/methods that will be affected by runtime changes), and it
would be doable without significant changes.
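
A small sketch of my own to illustrate the problem: whatever Integer return
type a compiler inferred for Calc#double below stops holding the moment the
class_eval runs.

class Calc
  def double(x)
    x * 2                # a compiler could infer: Integer in, Integer out
  end
end

c = Calc.new
p c.double(21)           # => 42

# Runtime metaprogramming invalidates the compile-time inference:
Calc.class_eval do
  def double(x)
    (x * 2).to_s         # the same method now returns a String
  end
end

p c.double(21)           # => "42"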

Vidar
 
