The economics of a slow but productive Ruby

Jacob Fugal

[NOTE: I'm trying to present the facts and be objective in this post.
I love Ruby, and would choose it any day if economics didn't matter.
But in "Real World" terms, this is what I found. And of course, if I
made any serious mistakes, be sure to let me know!]

Company QUUX is deciding on technologies for a new project. They
estimate a development budget of A and a hardware budget of B under
technology BAR:

development budget under BAR = A
hardware budget under BAR = B
total budget under BAR = A + B

They are also considering technology FOO. FOO is widely reputed to
grant productivity gains by a factor of Y, but is slower than BAR,
requiring X times the servers. FOO developers make about Z times as
much as BAR developers, on average:

X = servers required under FOO / servers required under BAR
Y = productivity FOO / productivity BAR
Z = annual FOO salary / annual BAR salary

The development budget under FOO would be reduced by the productivity
increase, but that reduction would be offset by the difference in
salary:

development budget under FOO = AZ/Y

The hardware budget under FOO would be increased by the factor X:

hardware budget under FOO = BX

The total budget under FOO, in terms of the budget under BAR, would then be:

total budget under FOO = AZ/Y + BX

Given these estimates, it would be a profitable decision to choose FOO
over BAR if and only if the total budget under FOO is less than the
total budget under BAR.

choose FOO iff AZ/Y + BX < A + B -- or, rearranging...
choose FOO iff (X - 1)B < (1 - Z/Y)A
choose FOO iff [(X - 1) + (1 - Z/Y)]B < (1 - Z/Y)(A + B)
choose FOO iff B < [(1 - Z/Y) / (X - Z/Y)](A + B)
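
For concreteness, here's a minimal Ruby sketch of that decision rule.
The function and variable names are mine, invented for illustration;
x, y and z are the ratios defined above:

  # Break-even hardware fraction: FOO is the cheaper choice as long as
  # the hardware budget stays below this fraction of the total budget.
  def hardware_fraction_threshold(x, y, z = 1.0)
    (1.0 - z / y) / (x - z / y)
  end

  def choose_foo?(dev_budget, hw_budget, x, y, z = 1.0)
    hw_budget < hardware_fraction_threshold(x, y, z) * (dev_budget + hw_budget)
  end

  puts hardware_fraction_threshold(5, 5)  # => 0.1666... (1/6, see below)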

Let's apply this estimate to the current standing between .NET and
Ruby/Rails, using the figures from Joel (X = 5, Y = 5). In this case,
Z = 1 (actually, in my comparisons, Z was slightly *less* than one).

(1 - Z/Y) / (X - Z/Y)
= (1 - 1/5) / (5 - 1/5)
= (4/5) / (24/5)
= 4 / 24
= 1 / 6

So, choosing Ruby over .NET (assuming Joel's numbers are correct) is
economically sound iff your hardware budget makes up 1/6th or less of
the total estimated .NET budget.

Now, let's assume 20 servers and a 5-year application lifespan, with a
$5K one-time cost per server, $500 annually per server for repairs, and
one sysadmin with a salary comparable to the developers ($60K). This
brings our hardware budget to $450K over the 5 years[1]. If this is to
be only 1/6 of the total budget, we need to be spending at least 5
times as much on developers: $2.25M over the 5 years, or $450K per
year. Using the same $60K figure for developer salaries, this comes to
7.5 developers. So, if you have at least one developer for roughly
every three production servers (here, 7.5 developers for 20 servers),
Ruby is probably economical. If you start getting a lot more servers
than developers, however, the hardware cost of a slow Ruby builds up
on you.
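
Reproducing that back-of-the-envelope arithmetic in Ruby (all figures
are the assumptions stated above, not real quotes):

  servers, years = 20, 5
  hw_budget = servers * 5_000 +        # one-time cost per server
              servers * 500 * years +  # annual repairs per server
              60_000 * years           # one sysadmin's salary
  # => 450_000
  min_dev_budget = 5 * hw_budget                    # => 2_250_000
  developers = min_dev_budget / (60_000.0 * years)  # => 7.5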

Jacob Fugal

[1] It's interesting to note however that 67% of that figure is still
in paid salaries, rather than the cost of the hardware itself. If
you've got a super sysadmin who can manage 100 boxes (and you better
be paying them at least 80K if they are that super), the hardware
budget will scale a lot better. There's a lot to be said for getting
your hands on a good sysadmin...
 
Carl Lerche

1) It doesn't take 5 times more boxes to run a Ruby app than a .NET
app; the single biggest factor in efficiency is the quality of the
developer. You can do many things at the code level to conserve server
CPU. I've never found it to be an issue. Honestly, if you need 5 times
more servers to run a Ruby on Rails app than a .NET app, I'll have to
laugh.

As an example, I worked for a company that developed a PHP app, and it
took 15 application servers to run when it should have taken 5. It
took that many because the code (written before I was hired) was
terrible. The same can happen with any technology.

2) Network latency is a far bigger bottleneck than CPU. All
technologies face the same problem.

3) Joel pulled that number out of his ass. I mean, I could say that
the same app coded in .NET would take 2834 servers whereas it would
run on a 3-year-old Palm using Ruby. That doesn't make it true.

4) I didn't see any factor for the software budget.

-carl

 
Jacob Fugal

> choose FOO iff B < [(1 - Z/Y) / (X - Z/Y)](A + B)
>
> Let's apply this estimate to the current standing between .NET and
> Ruby/Rails, using the figures from Joel (X = 5, Y = 5). In this case,
> Z = 1 (actually, in my comparisons, Z was slightly *less* than one).

Also note that the values I used here are pretty conservative. As many
have mentioned, Ruby will often not be the bottleneck -- X can be less
than 5. Also, depending on your programmers, Y may be more or less
than 5. Doing the calculation with X = 2 and Y = 10 yields much more
favorable results:

(1 - Z/Y) / (X - Z/Y)
= (1 - 1/10) / (2 - 1/10)
= (9/10) / (19/10)
= 9 / 19
≈ 47%

So under optimistic cases, Ruby will still be economical until
hardware eats up *half* your budget. Or, pessimistically, let's try X
= 10, Y = 2:

(1 - Z/Y) / (X - Z/Y)
= (1 - 1/2) / (10 - 1/2)
= (1/2) / (19/2)
= 1/19

Your hardware budget would need to be nearly negligible -- about 5% of
the total -- under those circumstances to make Ruby economical.

Fortunately, in my experience, X has never even approached 5, let
alone 10. And Y has always been good to me. The important thing is
that for *your* decision, you need to:

1) Evaluate what X is *for your application*
2) Evaluate what Y you will believe
3) Know how your hardware costs will scale (see the footnote in my
original email)

All these factors will affect the outcome greatly.
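
A quick Ruby sweep over those three (X, Y) pairs, using the threshold
from my original email (Z = 1 throughout; the formatting is mine):

  [[5, 5], [2, 10], [10, 2]].each do |x, y|
    t = (1.0 - 1.0 / y) / (x - 1.0 / y)
    printf("X = %2d, Y = %2d -> hardware can be up to %4.1f%% of budget\n",
           x, y, t * 100)
  end
  # X =  5, Y =  5 -> hardware can be up to 16.7% of budget  (1/6)
  # X =  2, Y = 10 -> hardware can be up to 47.4% of budget  (9/19)
  # X = 10, Y =  2 -> hardware can be up to  5.3% of budget  (1/19)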

Jacob Fugal
 
Matt Lawrence

> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

Howdy folks. As a top-notch sysadmin, I just wanted to remind y'all
that I'm out here.


-- Matt
It's not what I know that counts.
It's what I can remember in time to use.
 
Jacob Fugal

> 1) It doesn't take 5 times more boxes to run a Ruby app than a .NET
> app; the single biggest factor in efficiency is the quality of the
> developer. You can do many things at the code level to conserve
> server CPU. I've never found it to be an issue. Honestly, if you need
> 5 times more servers to run a Ruby on Rails app than a .NET app, I'll
> have to laugh.

I agree, but I was using the numbers from Joel's article. See my
follow-up email for a little more detail on what I believe it would
*really* be...

My *main* point in the original email is that there *is* a line where
throwing more servers at it isn't economical. Where that line is
depends a great deal on your individual situation.

Jacob Fugal
 
Carl Lerche

I realize that you are using the numbers from Joel's article, but
(and maybe it's just me) those numbers are just so absurd that they
don't merit any more discussion than "that's absurd", and maybe
pointing out why, using real-world situations.

Also, yes, there are some extreme cases, such as the Google search
engine. However, scaling is not linear. Hypothetically, suppose that
at a certain point a .NET web application takes 5 servers and a
similar Ruby web application takes 25 (this already sounds a bit
ridiculous, but allow me to continue). This does NOT mean that when
the .NET application requires 50 servers, the similar Ruby web app
will require 250.

As such, I don't see where this line would be -- at least not using
your method of proving that there is a line.

And lastly, if any developers out there are writing Ruby apps for
their company that require 5 times as many servers as an equivalent
.NET app... they should be fired :p

-carl
 
Chad Perrin

> Also, yes, there are some extreme cases, such as the Google search
> engine. However, scaling is not linear. Hypothetically, suppose that
> at a certain point a .NET web application takes 5 servers and a
> similar Ruby web application takes 25 (this already sounds a bit
> ridiculous, but allow me to continue). This does NOT mean that when
> the .NET application requires 50 servers, the similar Ruby web app
> will require 250.

No kidding. For one thing, while it's possible that in some
pathological edge case it might require five LAMRoR servers to equal
one WS2k3 .NET server, the system resources required just to run each
individual server are rather greater for WS2k3/IIS systems than for
Linux/Apache systems. Additionally, there are more options available
for scaling up with Linux than with Windows solutions -- better load
balancing, effective clustering, et cetera. (Microsoft promised a
clustering version of Windows last year -- the result being that once
they achieved something testworthy, nobody bothered to use it except
for academic demonstration purposes because, of course, the cost of
licensing would be far greater than any return on investment,
especially considering the artificial technical limitations imposed
by the MS business model.)

There's a sweet spot for vertically integrated Microsoft solutions. If
you stay inside that sweet spot, it's cheaper to use a .NET solution
than certain other solutions. Your project, whatever it may be, may or
may not lose to .NET inside that sweet spot -- in fact, I'll go so far
as to say that .NET is almost certainly a net (ha ha) win. The term
"scalability", however, refers to the mobility of the economics of your
solution, and in that sense one of the standard Linux-based solutions
will probably scale better.
 
Chad Perrin

> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

It also helps if you're using a system with a lower required
admin-to-server ratio. As indicated by recent studies, Linux and
Solaris both require far fewer admins for a given number of boxen
than Windows:

http://www.cioupdate.com/article.php/10493_1477911

From the article:

Linux, along with Solaris, also came out ahead of Windows in terms of
administration costs, despite the fact that it's less expensive to
hire Windows system administrators. The average Windows administrator
in the study earned $68,500 a year, while Linux sys admins took home
$71,400, and those with Solaris skills were paid $85,844. The Windows
technicians, however, only managed an average of 10 machines each,
while Linux or Solaris admins can generally handle several times that.

This, like the number of servers required for a given software project,
does not scale linearly -- but the scalability of Windows systems in
terms of administrative requirements never overtakes that of Solaris and
Linux systems (except possibly in pathological edge-cases).
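
Putting rough per-box numbers on those study figures in Ruby (the
30-boxes-per-admin figure for Linux and Solaris is my own guess at
"several times" 10, purely for illustration):

  admins = {
    # annual salary, boxes managed per admin
    "Windows" => [68_500, 10],
    "Linux"   => [71_400, 30],  # assumed multiple of the Windows figure
    "Solaris" => [85_844, 30],  # assumed multiple of the Windows figure
  }
  admins.each do |os, (salary, boxes)|
    printf("%-8s ~$%d per box per year\n", os, salary / boxes)
  end
  # Windows  ~$6850 per box per year
  # Linux    ~$2380 per box per year
  # Solaris  ~$2861 per box per year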
 
M. Edward (Ed) Borasky

Jacob said:
> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

Ah, but does SuperSysAdmin have to use a slow scripting language?

<ducking>
 
Chad Perrin

> Jacob said:
>> [1] It's interesting to note however that 67% of that figure is
>> still in paid salaries, rather than the cost of the hardware itself.
>> If you've got a super sysadmin who can manage 100 boxes (and you
>> better be paying them at least 80K if they are that super), the
>> hardware budget will scale a lot better. There's a lot to be said
>> for getting your hands on a good sysadmin...
>
> Ah, but does SuperSysAdmin have to use a slow scripting language?

Do you suggest they should use a slower scripting language, like batch
files? It's not like sysadmins write their administrative scripts in
assembly language for performance.
 
Gregory Brown

> 4) I didn't see any factor for the software budget.

Which is significant. I don't know the cost of licenses for Windows
servers, but I imagine it is costly, not to mention things like
development tools... Actually, I imagine a .NET project could greatly
exceed its hardware costs in software costs, given the right
circumstances.
 
Marc Heiler

What can be said about Ruby here can also be said about Python, even
if Python is a tiny bit faster. However, I feel that Python -- with
all its quirks -- is "more advanced" in terms of being used or
accepted in companies. The world isn't a monoculture; different
technologies can coexist for a long time.

The bottleneck for the money part still seems to be the sysadmin, no
matter which language, right? =)
 
Chad Perrin

> Which is significant. I don't know the cost of licenses for Windows
> servers, but I imagine it is costly, not to mention things like
> development tools... Actually, I imagine a .NET project could greatly
> exceed its hardware costs in software costs, given the right
> circumstances.

I don't have current figures, but you're right -- server licenses run
far beyond hardware costs with Windows, especially when you include
MS software for development, for the framework, et cetera.
 
Chad Perrin

> What can be said about Ruby here can also be said about Python, even
> if Python is a tiny bit faster. However, I feel that Python -- with
> all its quirks -- is "more advanced" in terms of being used or
> accepted in companies. The world isn't a monoculture; different
> technologies can coexist for a long time.
>
> The bottleneck for the money part still seems to be the sysadmin, no
> matter which language, right? =)

More generally, the economic bottleneck tends to be skilled personnel --
sysadmins, developers, et cetera. That's why increased productivity
(from tools like Ruby, Python, Perl, Lisp, et cetera) and decreased
administrative overhead (from platforms like Linux, Solaris, et cetera)
are so important. There is, however, a point of diminishing returns,
which is where technologies like .NET, J2EE, and so on are valuable.
 
Gregory Brown

> What can be said about Ruby here can also be said about Python, even
> if Python is a tiny bit faster. However, I feel that Python -- with
> all its quirks -- is "more advanced" in terms of being used or
> accepted in companies.

Can we avoid Python vs. Ruby in this argument... please?

I don't think arguments about this are helpful. A fine Pythonista
will do better in Python than in Ruby, and likely vice versa.
 
Jacob Fugal

> Jacob said:
>> [1] It's interesting to note however that 67% of that figure is
>> still in paid salaries, rather than the cost of the hardware itself.
>> If you've got a super sysadmin who can manage 100 boxes (and you
>> better be paying them at least 80K if they are that super), the
>> hardware budget will scale a lot better. There's a lot to be said
>> for getting your hands on a good sysadmin...
>
> Ah, but does SuperSysAdmin have to use a slow scripting language?

As others have mentioned, the scripts sysadmins use aren't really CPU
intensive, nor too dependent on the host language's speed. So it's a
moot point. :)

Jacob Fugal
 
Chad Perrin

> Can we avoid Python vs. Ruby in this argument... please?
>
> I don't think arguments about this are helpful. A fine Pythonista
> will do better in Python than in Ruby, and likely vice versa.

I don't see how Python is even relevant, except in saying "If
so-and-so works for Python, it should work for Ruby too, as they fill
very similar niches in the programming ecosystem," or something like
that. They differ, for purposes of this discussion, mainly in terms
of popularity, and that difference varies depending on the specific
sub-niche you're addressing. So . . . what exactly was the point of
bringing it up in a discussion of the economics of Rails vs. .NET in
the enterprise (or whatever)?
 
Chad Perrin

>> Jacob said:
>>> [1] It's interesting to note however that 67% of that figure is
>>> still in paid salaries, rather than the cost of the hardware
>>> itself. If you've got a super sysadmin who can manage 100 boxes
>>> (and you better be paying them at least 80K if they are that
>>> super), the hardware budget will scale a lot better. There's a lot
>>> to be said for getting your hands on a good sysadmin...
>>
>> Ah, but does SuperSysAdmin have to use a slow scripting language?
>
> As others have mentioned, the scripts sysadmins use aren't really CPU
> intensive, nor too dependent on the host language's speed. So it's a
> moot point. :)

Yeah . . . premature eja^H^H^Hoptimization, evil, et cetera.
 
Neil Wilson

It's rarely a matter of economics and statistics. Essentially, there
is very little good scientific data on the relative merits of
different development systems. Practically all the research on
productivity and sociology in development was done in the 70s and
early 80s.

Beyond that, it is really a matter of faith.
 
Jacob Fugal

> It's rarely a matter of economics and statistics. Essentially, there
> is very little good scientific data on the relative merits of
> different development systems. Practically all the research on
> productivity and sociology in development was done in the 70s and
> early 80s.

I'll actually agree with you here as well. My email actually started
out in a much different direction, which is when I wrote that
disclaimer. The disclaimer escaped editing, however, when the body of
the email changed. When I refer to "objective facts", I am referring
to the derivation of the equation in general, not to specific values.

When I started the email I never intended to delve into placing
specific values on X and Y. My primary intent was just to examine the
equation that could result in the abstract. This was to demonstrate
that a boolean choice can be made, but *only* if the party making the
choice is willing to decide on values for X and Y (and Z).

Values for X can be locked down for any given choice. The important
thing is that the X you use be related to your situation. There is no
one-size-fits-all X -- in some cases X will be significant (number
crunching); in others it might be nearly 1. That determination needs
to be made on a per-project basis.

Values for Y, on the other hand, are very subjective. As you said,
there is very little, if any, hard scientific data (in the form of
published studies) to support any value of Y. Most of what we have is
anecdotal. *But*, if decision makers have experienced some of those
anecdotes themselves, or are willing to accept another's judgment in
evaluating them, they can determine a Y value that they are willing
to believe for the sake of the decision.

My purpose was never to propose that certain values of X or Y are
correct, but rather to provide a framework equation inside which
different values of X and Y can be examined.

Jacob Fugal
 
