Recent Criticism about Ruby (Scalability, etc.)


M. Edward (Ed) Borasky

Chad said:
Adjusted for programmers:

1. Be(come) a good programmer.

2. Write good code.

3. Let your boss throw hardware at the scaling problem, because the
good code you wrote as a good programmer can handle it.

Better?

I'm with Jay on this one -- no matter *how* good the programmers and the
code are, there are limits to scalability. Remember, I'm a performance
engineer. ;)

Then again, there are much better ways to explain scalability and
capacity planning than the way some authors do it. I won't mention any
names, of course ....
 

Chad Perrin

Now, granted, the state of the art has advanced quite a bit in the past
decade, to say the least. So maybe you're just used to working with
software that's already been rewritten to handle eBay-sized needs, and as
far as you know, it's always just worked that way.

I think you're assuming I'm talking about software that is written, put
into operation, and never touched again by programmers. I'm not. I'd
like to be very clear about this:

Scalability assumes maintenance -- *lots* of maintenance. Over time,
you end up with software that may very well contain no more than about
2% original code (probably less).

The world itself is changing, and with web startups popping up all over
the place, more and more people are thinking about their millionth user
around the time they have zero users (because the software isn't even in
usable alpha-testing state yet). Things like BIND, et al., weren't
really planned ahead that way (and, for that matter, neither were things
like TCP/IP -- thus the IPv6 vapornet we've been hearing so much about).
These days, every time someone talks about writing software, they talk
about making sure it doesn't crash and burn when they "hit the big time".

When one of these things takes off on the web these days, it takes off
*fast*. It will require rewriting and tweaking of components, but in the
midst of all this it has to be able to scale easily from day one, without
rewriting the entire thing from scratch in a dark room and then
hot-swapping it into operation. That sort of thing *doesn't* work, in
part because it means you lose a lot of resources from maintaining the
production software, and in part because when you do that you suddenly
discover a lot of bugs in your bug-free software.

Part of writing scalable software is writing software that can be
upgraded piecemeal, as needed. Couple that with the ability to throw
hardware at it, without missing a step, and you've got a winner.
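
A minimal Ruby sketch of that idea (purely illustrative -- the names here
are invented, not from any system mentioned in this thread): keep each
component behind a small interface, so the piece that stops scaling can be
replaced piecemeal, or spread across more hardware, without its callers
changing.

    # Illustrative sketch only -- CounterStore and friends are invented
    # names, not from any real system discussed here.

    # A tiny interface: callers depend on this, not on any one
    # implementation.
    class CounterStore
      def increment(key) raise NotImplementedError end
      def value(key)     raise NotImplementedError end
    end

    # Day-one implementation: a single in-process hash. Fine at small
    # scale.
    class InMemoryCounterStore < CounterStore
      def initialize
        @counts = Hash.new(0)
      end
      def increment(key) @counts[key] += 1 end
      def value(key)     @counts[key] end
    end

    # A later, piecemeal replacement: same interface, but sharded across
    # several backends, so you can "throw hardware at it" by adding more.
    class ShardedCounterStore < CounterStore
      def initialize(backends)
        @backends = backends
      end
      def increment(key) shard_for(key).increment(key) end
      def value(key)     shard_for(key).value(key) end

      private

      def shard_for(key) @backends[key.hash % @backends.size] end
    end

    # Calling code never changes when the implementation is swapped out:
    store = InMemoryCounterStore.new
    store.increment("page_views")
    puts store.value("page_views")  # => 1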

If you want to just write off all that as "bad software" by definition --
hey, if it didn't scale, it's bad software! -- then you're missing the
point. Even if it *is* bad software by today's standards, it certainly
wasn't at the time. Which means software that we think is good today may,
in fact, not be. Which means: You need more than pithy sayings about
business plans to write scalable software.

You're misunderstanding (or misrepresenting) what I've said. Something
isn't bad software because it ran into a limit on scalability. It may be
bad *at scaling* -- and if scaling was the point of the software design,
*that* is bad(ly written) software. If scaling wasn't the point of the
software design, and you find that it's being used in a situation where
that scalability is needed, either your design decisions were poor (in
retrospect at least), or you're "misusing" it.

I thought I already made that point.

Again, I'm curious to hear your real-world experiences, since they differ
greatly from mine. Maybe everything's changed now. Tell me some stories
about what didn't break.

Everything "breaks". If it only breaks a little at a time, and you have
a plan in place for dealing with those little breaks -- and you're
*lucky* -- then you might scale smoothly.

If not, it wasn't scalable.
 

Chad Perrin

OK, the first time got a chuckle. The second time gets a "what AOL were
you smoking?"

Seriously. You're a techie, not in AOL's targeted "novice" market, so I
assume most of your interactions with AOL were (a) commercials involving
Batman, (b) free DVDs in your cereal box, and (c) friends and family that
asked you for help in uninstalling it. So I can understand if you got the
wrong impression of AOL, and all the shiny lights impressed you.

So now I'm stupid and easily distracted by shiny things. Thanks.

Experience is subjective, and all that, so I can never know the AOL you
used. All I know is the AOL I wrote. And it had like a dozen features
visible to the user.

What was this -- circa '87? Sorry, I wasn't familiar with AOL prior to
the early '90s in any sense. I guess I should have been more specific in
my use of the word "always". Perhaps that means I'm easily distracted by
pretty things, and unable to apply critical thought to concepts like
"organic expansion of a chatroom network is a feature".

And the point I keep trying to make, though you keep deflecting it, is that
there ARE no guarantees that if you "do everything right" with a focused
business plan and a lightweight feature set, you're golden for scaling. Go
back to the little list of AOL mail features I posted, and realize that not
a single one of them scales linearly. And realize that you might well
implement a feature like that in a different system, maybe even without
thinking about it, because it's not a big deal. (Let's select a nicer
screen name for someone if they can't find the one they want.)
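
A toy Ruby sketch of that screen-name feature (illustrative only -- this
is not AOL's code, and every name in it is invented) shows where the
non-linearity hides: each individual lookup is cheap, but the number of
probes per signup grows as the namespace fills, so the work per new user
is anything but constant.

    # Toy sketch only -- not AOL's code; all names here are invented.
    require 'set'

    class ScreenNames
      def initialize
        @taken = Set.new
      end

      # Suggest the base name if free, otherwise probe numbered variants.
      # Each probe is a cheap set lookup, but the *number* of probes per
      # signup rises as popular base names get crowded.
      def suggest(base)
        return claim(base) unless @taken.include?(base)
        n = 2
        n += 1 while @taken.include?("#{base}#{n}")
        claim("#{base}#{n}")
      end

      private

      def claim(name)
        @taken << name
        name
      end
    end

    names = ScreenNames.new
    3.times { puts names.suggest("chad") }  # => chad, chad2, chad3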

I never said there were guarantees -- but I can see how you'd make the
assumption that my statements led in that direction, what with the fact
it would fit in so well with my obvious stupidity. That, or you're
imagining I said things I didn't because that's easier to dispute.
 

Chad Perrin

I'm with Jay on this one -- no matter *how* good the programmers and the
code are, there are limits to scalability. Remember, I'm a performance
engineer. ;)

There are always limits -- but I'm talking about scalability in the sense
of "scalability within the realm of reason". Obviously, I'm not
suggesting that reddit is ready to take on the complete userbase of the
Andromeda Galactic Empire added to the already weighty traffic it gets
from one measly little planet when Digg manages to piss off most of its
user-base on a censorship lark.

There are similarly limits to Moore's Law (insofar as it has ever really
been "true"), but I don't think we've approached them yet (again, insofar
as it has ever really been applicable).

Then again, there are much better ways to explain scalability and
capacity planning than the way some authors do it. I won't mention any
names, of course ....

I'm afraid you must be getting a little too subtle for me.
 

Jos Backus

There are some languages that can *only* be used by "very smart people".
APL comes to mind, and I suspect there are those who could make the same
case for Forth, Haskell and Prolog. For "most of us", languages like
Python, Perl, PHP, Ruby and Lua are great *because* they're easy to learn
even if you're not a "very smart person".

(OT) My first full-time IT job, back when I was 24 or so, was to write
assembly-line startup personnel planning software using VS APL 4.0 on
VM/CMS (and using GDDM for graphics). Thanks for making me feel good
about that. :)
 

M. Edward (Ed) Borasky

Jay said:
Having just jumped back to PL/I for a few weeks.. that one's no fun at ALL.

I mean, it's good for what it does; it's a lot like "C with the pain taken
out and some sugar". But I don't say "Whee, I'm programming in PL/I!" And
I don't stop to marvel at how elegant some code turned out thanks to some
PL/I construct. Doesn't happen.

I was recently asked what languages were my favorite before Ruby. They had
to repeat the question. I never had the *concept* of a favorite language
before Ruby.

1. I sometimes refer to Ruby as a happy marriage of Java and Perl. PL/I,
on the other hand, was a shotgun wedding of FORTRAN and COBOL. :)

2. I have the concept of a "second-favorite language". My favorite
language is the one I get paid to write, and my second-favorite is the
one I'm learning at any given time.

So I think my second-favorite language of all time -- one that I enjoyed
more than any other but never got paid to write -- is a dead tie between
Lisp 1.5 and Forth. It seems more likely I'll get paid to write Ruby
than either of them. :)
 

John W. Kennedy

M. Edward (Ed) Borasky said:
I haven't ... but I don't think fun is language-specific. I can't think
of a single programming language I hated using, but then, I never used
RPG. :) That one I think would have sucked.

It depends. In their classic problem domain, RPG and RPG II get the job
done quickly, and with a minimum of fuss that standard languages can't
touch. However, that problem domain is considerably narrower today,
what with the gradual disappearance of batch processing and printed
reports on the one hand, and the availability of such facilities as QMF
on the other.

RPG III is something else. That one strikes me as a three-headed,
orange-blooded, radioactive mutant. BUT -- I've never worked with an
S/38, AS/400, iSeries, or System i, so what do I know?
 
