Joel Spolsky on languages for web programming


Chad Perrin

Rails and rake are internal DSLs, and Ruby makes internal DSL creation
much easier than many other languages. I can't tell from this thread
whether Wasabi is external or internal.

I'm having a hard time imagining it being internal, considering the way
Joel describes it in an essay where he addresses Wasabi directly:
http://www.joelonsoftware.com/items/2006/09/01b.html

I am, frankly, having a tough time imagining how he could have designed
a language that does what he claims as an internal DSL of VBScript. In
fact, he describes it as "100% backwards-compatible with VBScript".

On the other hand, it seems utterly incomprehensible that someone would
recreate VBScript, but with more power, from scratch -- which is what
he'd have to do, considering I doubt Microsoft gave him the source for
it.

I hardly think of an external DSL as anything special any more. They've
been around as long as I've been programming, which is -- well, let's
just say your toaster has more compute power than the machine I learned
on. :) Almost every major decades-old Fortran code, for example, is
really implementing an external DSL.

In a manner of speaking, one might say that all programming languages
are, in one way or another, DSLs of a sort.
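To illustrate the internal variety: an internal DSL in Ruby is just ordinary Ruby arranged to read like a little language, in the spirit of rake. A minimal sketch (the `TaskList` class and its names are invented for illustration):

```ruby
# A tiny internal DSL for describing tasks, in the spirit of rake.
# Everything here is plain Ruby: blocks, symbols, and a registry hash.
class TaskList
  def initialize
    @tasks = {}
  end

  # DSL entry point: `task :name do ... end`
  def task(name, &block)
    @tasks[name] = block
  end

  def run(name)
    @tasks[name].call
  end

  # Evaluate a definition block in this object's context,
  # so `task` can be called without an explicit receiver.
  def self.define(&block)
    list = new
    list.instance_eval(&block)
    list
  end
end

tasks = TaskList.define do
  task(:greet) { "hello" }
end

tasks.run(:greet)  # => "hello"
```

The trick is `instance_eval`, which runs the block with the `TaskList` instance as the implicit receiver, so `task` reads like a keyword rather than a method call.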
 

Chad Perrin

The gap has narrowed. It's rare that an assembly language coder can beat
a compiler by more than a factor of 2 these days, and on some
architectures it's a dead tie -- there's only one way to do something
and the compiler always finds it. Interpreters are better now too,
mostly because today's languages have such a large component that has to
be dealt with at run time anyway that the "heavy lifting" is done by
compiled code.
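Ruby under MRI illustrates the point: core methods like `Array#sort` are implemented in C, so even an interpreted script spends most of its cycles in compiled code when it leans on them. A rough sketch of the gap (absolute timings are machine-dependent; the naive sort is deliberately unoptimized):

```ruby
require 'benchmark'

data = Array.new(5_000) { rand(1_000_000) }

# A deliberately naive pure-Ruby insertion sort, executed
# step by step by the interpreter.
def insertion_sort(arr)
  a = arr.dup
  (1...a.size).each do |i|
    j = i
    while j > 0 && a[j - 1] > a[j]
      a[j - 1], a[j] = a[j], a[j - 1]
      j -= 1
    end
  end
  a
end

Benchmark.bm(12) do |bm|
  # Array#sort does its "heavy lifting" in compiled C code.
  bm.report("Array#sort:") { data.sort }
  bm.report("pure Ruby:")  { insertion_sort(data) }
end
```

Running this shows the C-backed method finishing orders of magnitude faster, which is exactly the "heavy lifting is done by compiled code" effect.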

This brings me to a thought I've been having a lot, lately: that the
future of compiled code will probably start looking in some ways more
and more like interpreted code. I don't see why we can't, relatively
soon, have a compiler that produces a compiled executable of a dynamic
language such as Ruby that does not require a VM or interpreter to be
run (outside of a definition of "interpreter" or "VM" so loose that it
might include your whole OS). The dynamic aspects of
the language would be handled by the executable binary itself, rather
than by an interpreter external to the program.

I'm not entirely sure how to explain what I'm thinking at this time in a
way that gets my point across. Hopefully someone who reads this will
get where I'm aiming, and may even be able to help me clarify it.

I'm not sure JIT is "necessary" for efficient interpretation of Ruby
anyway. But you're right ... if the economics is there, the gap will get
closed, just like the compiler/assembler gap got closed.

There are things that Ruby allows that simply cannot be done without a
certain amount of runtime interpretation, with the possible exception of
the evolution of persistent compiled executable binaries described
above.
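For instance, methods can be generated from data that exists only at run time, which is exactly the sort of thing an ahead-of-time compiler would have to defer to the running binary (a minimal sketch; the `Record` class is invented for illustration):

```ruby
# Define accessor methods from a list of names that is only
# known at run time -- e.g. read from a config file or database.
class Record
  def self.define_fields(names)
    names.each do |name|
      define_method(name) { @data[name] }
      define_method("#{name}=") { |value| @data[name] = value }
    end
  end

  def initialize
    @data = {}
  end
end

fields = ["title", "author"]   # imagine this list arrived at run time
Record.define_fields(fields)

r = Record.new
r.title = "Programming Ruby"
r.title  # => "Programming Ruby"
```

No amount of static analysis can know what methods `Record` has until `fields` exists, so a compiled executable would have to carry the machinery for defining them itself.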
 

Chad Perrin

When there are certified iptables engineers, I'll trust my business to
them. Until then, I'm sticking with Cisco and certified Cisco engineers.
When you post a job application for a sysadmin position, you're going to
get at least ten times as many applicants as you need, so you can afford
to *insist* that they be certified by Cisco, Microsoft or Red Hat as
appropriate.

As someone with a combination of college, trade school, on-the-job, and
purely autodidactic education, with several certifications of my own, my
experience is that all certifications really prove is A) a certain
minimum standard of test-taking competence that can be sized up within
five minutes of meeting someone anyway and B) a certain amount of money
invested in professional advancement.

. . . with the exception that some certifications require certain resume
bullet-points before one is allowed to take the certification exam in
question (CISSP comes to mind). Considering one doesn't require a
certification to determine whether someone has such resume
bullet-points, however, that seems irrelevant.
 

Chad Perrin

No. They rely on sound risk management principles.

One might say that's just euphemistic phrasing. I'm not prepared to
make such an assertion at this time (I'd like to think about this a bit
more before doing so), but it does occur to me as a possibility.

Replace "Fear" with "Risk" and the above is reasonable if your company
does not have people experienced in a particular technology. And fact
is today it is still far harder to find people skilled at Ruby than
many other languages. More importantly, there is too little experience
with many Ruby technologies for a company with no Ruby experience to
_know_ whether Ruby will be appropriate for a specific project.

This brings us to the "real" problem:

Decision makers need to know something about the technologies to be able
to make the "right" decisions. One cannot effectively expect that any
particular decision is more or less likely to be a good one unless the
decision maker actually knows the options at hand. In other words, Joel
Spolsky's advice about choosing "proven" technologies is nonsense: the
real advice should be "Choose from among technologies you know. If you
are not an expert at all the options that sound good, learn enough to be
able to make an informed decision. Failing to do so does not guarantee
that you will make the wrong decision, but it does guarantee that you
will make your decision for the wrong reasons. Period."

In other words, every time a nontechnical manager is given the
responsibility of choosing a programming language and/or framework for a
project, someone has screwed up. How can (s)he possibly evaluate the
available technologies, or even the advice (s)he receives about them
(whether from employees, friends, consultants, or Joel On Software) to
be sure it's not a load of hooey without knowing the technologies
personally?

Replace "Fear" with "Risk" again. The company I work for, Edgeio, uses
PHP for our frontend code (but Ruby for our backend) because when we
started building it I had concerns about the availability of people
skilled with Ruby in general or Rails in particular.

Sure, Java and PHP programmers are a dime a dozen -- as long as you're
willing to settle for a dime-a-dozen quality programmer. If you want
programmers that are worth their paychecks, however, you significantly
narrow the field no matter what the language you're using. Considering
the learning ability and proclivities of excellent programmers, however,
I rather suspect that you'll find as many excellent programmers who know
"exciting new languages" as "boring old languages". Considering the
direction language design has been going lately, "exciting new
languages" are generally easier to learn, too. This means that if you
choose to hire for excellence over familiarity with a given language,
you're just as likely to find yourself constrained to choose an
excellent C programmer over a poor Java programmer as you are to choose
an excellent C programmer over a poor Ruby programmer -- but if you're
working with Ruby, your excellent C programmer will probably pick up the
language faster.

I guess what I'm saying is that you're probably better off choosing
excellent programmers and the language that works best, technically
speaking, for your project. Choosing a language for which programmers
are a dime a dozen regardless of technical merit is more likely to leave
you with crappy software development, lightning-fast employee turnover,
or (more likely) both.

When we started hiring those concerns were validated: It's proved
extremely hard to find people with Ruby experience. While it's
certainly getting easier rapidly, not everyone can afford to take the
risk. In our case I decided to start phasing Ruby in for small self
contained components in our backend, and gradually take it from there
as we get enough Ruby skills through training or hiring, which has
proven to work well and meant that in the event that we'd run into
unforeseen problems, the effort involved in switching back to another
language would have been limited.

Define "experience". If by "experience" you mean "has spent ten years
developing enterprise applications in the language", darn right it would
be more difficult to find people with Ruby "experience" than many other
languages. If, on the other hand, you mean "has demonstrated aptitude,
Ruby skill, and programming wizardry likely to prove to be an
unequivocal asset to your team", you're probably looking in the wrong
places (since you're unlikely to find that in college internship
programs, where all they teach anyone is Java and .NET).

Which is very valid if you make a choice detrimental to the company,
regardless which language it involves. As much as I love working with
Ruby, if someone working for me picked it for something inappropriate,
and the project tanked badly, they certainly would have to take the
responsibility for the language choice. If you don't have Ruby skills,
or your team doesn't have sufficient Ruby skills, or there aren't
enough skilled Ruby developers available in your location, picking it
for a high-risk project will certainly not speak in your favor.

The problem is where people fear for job security based on choosing a
non-conservative technology, rather than for choosing an inappropriate
technology. Many people would never (under current conditions) choose
Ruby over Java, even if guaranteed that the project would be completed
with 110% requirements satisfaction within two months for Ruby or with a
10% chance of project failure, a 90% requirements satisfaction rate if
"successful", and an eighteen month development time for Java -- all
based on fear for job security. It's the "nobody ever got fired for
choosing IBM" syndrome. Even if choosing the conservative technology is
the Wrong Answer for the task at hand, it will be considered the Right
Answer for job security by a great many people.
 

Stephen Kellett

David Vallner said:
research, not random anecdotal success stories that are too pathetic
to sell Herbalife diet pills.

That's a great line.

Stephen
 

Chad Perrin

I'm not sure how fast or slow Ruby is but if it's as fast as Perl I'll
be happy enough. Yes I know C is faster but I need fast development
times too.

Based on what I've heard/seen/experienced, Ruby is generally somewhat
slower than Python which is, in turn, somewhat slower than Perl.
Generally. On average. For some definition of "average". One of the
nice things about Perl's relative maturity is the significant work that
has gone into performance concerns for the parser/runtime. I have no
doubt that Ruby will eventually match Perl for general-case performance,
but I also have no doubt that on the whole it has not matched it yet.

On the other hand, the difference is not so great that execution
performance is a factor I really take into account when choosing between
the two languages for a given project.
 

Stephen Kellett

Devin Mullins said:
does clean mean? reduced duplication?). Pretty means fewer LOC, which
is about the only objective measure of maintainability we know.

I take it you've never had the pleasure of reading someone else's APL
code? It's about as dense as you can get in LOC.

It sure is not easy to maintain. It's often described as a "write-only
language".

I think "pretty" is not the correct word; "elegant" would be
better.

Stephen
 

Joseph

Vidar,

Risk Management IS NOT equivalent to FEAR, in that you are right.

However, as I said earlier, no SIGNIFICANT progress can be expected
without some risk. Risk Management is about dealing with risk, not
eliminating it.

Ruby and Ruby on Rails are not the safest choice, but I believe they
are one of the very best choices for web development. There is a
slight risk in it, but not enough to stop any bold, courageous
corporation, startup or even lone developer to create great software
with it.

Joel and people who share his views equate Risk with FEAR. That is
their main mistake. Ruby is ready now... not for everything, but it is
uniquely ready for web development; that, I believe, is a fact.

As another poster mentioned, there is an evolution in the adoption of
technology. Ruby is still with the early adopters, but that does not
mean it is not mature enough for critical applications.

What will prove me right however is not my rationale here or in my
previous post, but TIME: time will prove that those who stuck with Ruby
and Ruby on Rails did so wisely, because TRUTH is tested in time.... I
believe Ruby is ready now, many people disagree, but ultimately time
and people using Ruby for critical applications will be the deciding
factor.

I love this quote from the Linux Advocacy video Red Hat produced
recently; it is incredibly relevant to the issue we are discussing,
and I recommend everyone watch it. I will quote it here:

"Despite Ignorance
Despite Ridicule
Despite Opposition
Despite it ALL
TRUTH HAPPENS"
Source: http://www.redhat.com/truthhappens/

Time will tell us indeed, but I am not waiting for the jury, I am
learning Ruby and RoR now, eager to apply it to create cool, amazing
web applications... isn't that the whole point? To push technology?
To make it fun again? To innovate?

Jose Hurtado
Web Developer
Toronto, Canada
 

M. Edward (Ed) Borasky

Stephen said:
I take it you've never had the pleasure of reading someone else's APL
code? It's about as dense as you can get in LOC.

It sure is not easy to maintain. It's often described as a "write-only
language".

I think "pretty" is not the correct word; "elegant" would be
better.

Stephen

I've never written a line of APL code, but that hasn't ever stopped me
from being able to read APL code if the need or desire to do so arose.
There was a time when it was a dominant language in econometrics and
computational finance; indeed, the "A Plus" open source descendant of
APL originated at Morgan Stanley.

APL and its original implementation APL\360 were/are works of pure
genius. I was privileged to meet one of their creators (Falkoff) when I
was a very young programmer working at IBM. APL is one of the few truly
unique programming languages and possesses an elegant simplicity found,
in my opinion anyway, in only two other programming languages --
Lisp/Scheme and Forth.
 

M. Edward (Ed) Borasky

Chad said:
Based on what I've heard/seen/experienced, Ruby is generally somewhat
slower than Python which is, in turn, somewhat slower than Perl.
Generally. On average. For some definition of "average".

Would you be interested in the correct definition of average in
benchmarking? Of course you would! :)

http://portal.acm.org/citation.cfm?id=5673
One of the
nice things about Perl's relative maturity is the significant work that
has gone into performance concerns for the parser/runtime. I have no
doubt that Ruby will eventually match Perl for general-case performance,
but I also have no doubt that on the whole it has not matched it yet.

And this was one of the motivations of the Parrot team -- a common
virtual machine for Perl, Python and Ruby. The Ruby community seems to
have put a lot more effort into YARV than the Cardinal/Parrot approach.
Has the Python community similarly gone their own way, or do they plan
to use Parrot?
 

M. Edward (Ed) Borasky

Chad said:
As someone with a combination of college, trade school, on-the-job, and
purely autodidactic education, with several certifications of my own, my
experience is that all certifications really prove is A) a certain
minimum standard of test-taking competence that can be sized up within
five minutes of meeting someone anyway and B) a certain amount of money
invested in professional advancement.

They also prove that you can learn and carry out a learning task to
completion. They also provide HR and the hiring manager with an
objective way of ruling out unqualified candidates. If I post a network
engineer position and get 100 applications, ten of whom have completed
their certification, that's 90 resumes I can throw in the trash.
 

M. Edward (Ed) Borasky

Chad said:
This brings me to a thought I've been having a lot, lately: that the
future of compiled code will probably start looking in some ways more
and more like interpreted code. I don't see why we can't, relatively
soon, have a compiler that produces a compiled executable of a dynamic
language such as Ruby that does not require a VM or interpreter to be
run (outside of a definition of "interpreter" or "VM" so loose that it
might include your whole OS). The dynamic aspects of
the language would be handled by the executable binary itself, rather
than by an interpreter external to the program.

I'm not entirely sure how to explain what I'm thinking at this time in a
way that gets my point across. Hopefully someone who reads this will
get where I'm aiming, and may even be able to help me clarify it.

Perhaps you're thinking along the lines of Lisp or Forth, where an
application is layered on top of the entire compiler/interpreter/runtime
package and then saved as an executable. As far as I can tell, there's
absolutely no reason this couldn't be done for Ruby. IIRC that's also
the way the Squeak Smalltalk environment works and the way Self worked.

Incidentally, Forth contains two interpreters and a compiler. A typical
Common Lisp contains one compiler and one interpreter. Right now, Ruby
is simple enough that what you're describing seems feasible -- a couple
more years of co-evolution with its users and it might not be. :)
 

Chad Perrin

They also prove that you can learn and carry out a learning task to
completion. They also provide HR and the hiring manager with an
objective way of ruling out unqualified candidates. If I post a network
engineer position and get 100 applications, ten of whom have completed
their certification, that's 90 resumes I can throw in the trash.

I don't think I can really put much value in that "carry out a learning
task to completion" idea, in this case. The sort of "learning" it
measures is, generally speaking, more suited to learning to give the
answers people are expecting than coming up with correct answers.
Microsoft certs, in particular, are bad about this -- filled with
marketing euphemisms and salesworthy "this solution for that problem"
questions.

That's not to say certifications are useless, but they carry little
enough worth in (accurately) judging a candidate's value that ignoring
them entirely probably wouldn't hurt your hiring strategies.

You're right about certifications providing HR and hiring managers with
an "objective" metric for candidate qualifications, but that's pretty
self-referential (they're "qualified" if they meet the qualification
requirements, including a certification, which is required so that
you'll have some way to tell if they're qualified, et cetera), and
there's not really any indication that what it objectively measures is
useful for most purposes. About the only way it measures something
useful with regard to job performance is if someone can literally just
walk into the exam cold, with no studying, and answer all the questions
correctly . . . except for the questions that are misgraded on the exam
(I've yet to see a certification test that doesn't require technically
inaccurate answers to get everything "right").

Throwing out 90% of candidates for not having a certification in the IT
industry is about like throwing out 90% of candidates because their
ties aren't the right width. I mean, sure, having ties of the "right"
width indicates an attention to detail and ability to keep up with
changing trends, which is useful for technical matters, but there's no
guarantee the people you've excluded aren't just fashion-impaired
despite attention to detail and thoroughly current knowledge of
information technologies, nor that the people with the "right" ties
aren't more focused on fashion than professional skills, or even just
really lucky in their choice of ties today.
 

Chad Perrin

Perhaps you're thinking along the lines of Lisp or Forth, where an
application is layered on top of the entire compiler/interpreter/runtime
package and then saved as an executable. As far as I can tell, there's
absolutely no reason this couldn't be done for Ruby. IIRC that's also
the way the Squeak Smalltalk environment works and the way Self worked.

No . . . that's not quite it. Maybe a really bad diagram will help.

interpreter for a dynamic language:
|--------------------------------------------------|

interpreter capabilities exercised by a program in a dynamic language:
|++++++++++++|

compiled static binary for an equivalent program from a static language:
|++++++++++++|

combination static/dynamic compiled binary from a dynamic language:
|+++++++++++----|

. . . roughly.

The binary would likely need to be somewhat larger, but considering that
even an interpreter is (generally) a compiled binary that just operates
on input, I don't see any reason to assume we cannot compile
dynamic language code into a persistent binary with accommodations made
for the parts of the program that require runtime dynamic behavior.
This strikes me as a superior approach to a JIT compiler/interpreter
approach like Perl's, a pure interpreter approach like Ruby's, or a
bytecode compilation plus runtime interpreter VM like Java's, for
performance. Add to that the potential for increased performance in some
parts of a program written in a more dynamic language, and something like
the following might actually run faster than the equivalent compiled
program I diagrammed above:

|+++++++--------|

. . . depending on how well those dynamic bits (represented by the
hyphens) optimize at runtime for a particular run of the program.
 

M. Edward (Ed) Borasky

Chad said:
No . . . that's not quite it. Maybe a really bad diagram will help.

interpreter for a dynamic language:
|--------------------------------------------------|

interpreter capabilities exercised by a program in a dynamic language:
|++++++++++++|

compiled static binary for an equivalent program from a static language:
|++++++++++++|

combination static/dynamic compiled binary from a dynamic language:
|+++++++++++----|

. . . roughly.

You can usually do something like this in Forth. As you're developing,
you save off the whole enchilada (the Forth interpreters and compiler
and assembler, along with your application code, all of which reside in
the dictionary) as an executable. When you're ready to release the
application, you take a special pass and strip out everything your
application doesn't use, getting a smaller executable that only contains
the pieces of the Forth environment needed to run the application.

I haven't spent any appreciable time inside either Common Lisp or
Scheme, or for that matter Ruby, so I don't know how this would work in
any language except Forth. Maybe what you want is as "simple" as
implementing Ruby on top of Forth. :)
The binary would likely need to be somewhat larger, but considering that
even an interpreter is (generally) a compiled binary that just operates
on input, I don't see any reason to assume we cannot compile
dynamic language code into a persistent binary with accommodations made
for the parts of the program that require runtime dynamic behavior.

No reason it can't be done. The question is only "should it be done?" :)

This strikes me as a superior approach to a JIT compiler/interpreter
approach like Perl's, a pure interpreter approach like Ruby's, or a
bytecode compilation plus runtime interpreter VM like Java's, for
performance.

Java also has JIT, of course. Curiously enough, someone once told me
that if I looked at the JVM carefully, I'd see Forth. :)

Add to that the potential for increased performance in some
parts of a program written in a more dynamic language, and something like the
following might actually run faster than the equivalent compiled program
I diagrammed above:

|+++++++--------|

. . . depending on how well those dynamic bits (represented by the
hyphens) optimize at runtime for a particular run of the program.

Well ... maybe we should leave that to the chip? :)
 

Vidar Hokstad

Joseph said:
Risk Management IS NOT equivalent to FEAR, in that you are right.

However, as I said earlier, no SIGNIFICANT progress can be expected
without some risk. Risk Management is about dealing with risk, not
eliminating it.

This we agree on.
Ruby and Ruby on Rails are not the safest choice, but I believe they
are one of the very best choices for web development. There is a
slight risk in it, but not enough to stop any bold, courageous
corporation, startup or even lone developer to create great software
with it.

And we agree on this too, to some extent. My argument is mainly that
without a certain level of knowledge about Ruby, the level of risk is
unknown, in which case it is prudent to assume the likely risk is high
until you have investigated it more closely.

For those of us who know Ruby, the best we can do to spread it is to
help people get to the stage where they know enough to accurately
assess the risk of using Ruby for their projects. Until people have
that knowledge, the risk as perceived by someone who doesn't know Ruby
will be higher than the real risk of using it.
Joel and people who share his views, equate Risk with FEAR. That is
their main mistake. Ruby is ready now... not for everything, but is
uniquely ready for web development, that is I believe a fact.

And it may very well be a fact, but again, there are still risks, and
those risks are greater for someone who doesn't know Ruby, or who
doesn't have the skills inhouse, whereas the corresponding risks for a
Java shop of doing something in Java may be very low if their staff is
skilled enough in Java.

Personally I hate Java and love using Ruby, but if I had to manage a
team of Java gurus, I'd still consider Java a safer choice than
Ruby unless the project was long enough to allow significant time for
retraining staff and possibly hiring replacements for anyone who decided
to leave.

They then have to make a tradeoff: low risk in Java (or C# or PHP or
LISP or whatever language they have the sufficient experience with to
make the risk a known, low factor) or a possibly higher risk in another
language vs. _possibly_ lower cost and shorter development time.
Developer happiness doesn't count unless it affects one of the previous
two, or increases employee retention.

However, that possible payoff depends on whether they make a
successful transition, the chances of which they won't know if they
have little to no exposure to the language. It also depends on whether
Ruby is right for _their specific project_, which they won't know if
they have little experience with the language.

These factors are all reasons why - regardless of how good Ruby is -
it would be quite irresponsible for someone to pick Ruby just because
you and I and other Rubyists say it's good, without having a reasonable
degree of knowledge themselves about how appropriate it would be for
their project.
As another poster mentioned, there is an evolution in the adoption of
technology. Ruby is still with the early adopters, but that does not
mean is not mature enough for critical applications.

For some it certainly is. But despite having reasonable experience
with Ruby, I'd hesitate to make a blanket statement about it.
Performance _will_ be an issue for some apps (as I've noted elsewhere,
it won't be for _most_ web apps, but there certainly are web apps that
are CPU intensive too, and where C/C++ extensions would be vital if you
were to go with Ruby at the current stage), and lack of certain
libraries might be an issue for some.

Feature-poor XML integration IS an issue for my company (Edgeio.com) at
the moment. It's one we expect to solve, but at the cost of additional
work which we wouldn't incur in some other languages. Ruby is still
good for us, but it's not a panacea for all types of development. It
likely never will be, but as time goes on, the space of apps for which
Ruby is a good choice will of course increase significantly, and I do believe
it can supplant many currently more widely used languages.
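For context, the XML support in question is the bundled pure-Ruby REXML parser. The basics work fine, as in this minimal sketch; the complaint is about features and performance beyond basics like this:

```ruby
require 'rexml/document'

xml = <<-XML
<listings>
  <item id="1"><title>First</title></item>
  <item id="2"><title>Second</title></item>
</listings>
XML

doc = REXML::Document.new(xml)

# Iterate over elements matching an XPath-style path.
titles = []
doc.elements.each('listings/item/title') { |el| titles << el.text }

# Read an attribute from the first matching element.
first_id = doc.elements['listings/item'].attributes['id']

titles    # => ["First", "Second"]
first_id  # => "1"
```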

We still use PHP for our web frontend, though. All our Ruby code is in
the backend for now. I did consider Rails, and maybe we'll migrate to
it at some point, but currently the potential savings are too small to
outweigh the cost/time to migrate, and our frontend is growing thinner
and thinner as we refactor our middleware and backend, so it pays to
just wait for now.
What will prove me right however is not my rationale here or in my
previous post, but TIME: time will prove that those who stuck with Ruby
and Ruby on Rails did so wisely, because TRUTH is tested in time.... I
believe Ruby is ready now, many people disagree, but ultimately time
and people using Ruby for critical applications will be the deciding
factor.

Ruby is ready now for some apps _if you have the experience_ or your
potential cost savings are large enough to justify taking the time to
retrain your staff or hire new people.

I use Ruby because it's the best of an increasing pool of bad
alternatives. I still haven't found a language I don't see tons of
flaws in, Ruby included. Ruby's flaws are just less annoying than the
rest :) I don't believe in "truths" in language choices - people need
to pick what works for them, and while looking at what's popular is
often good, there are always exceptions.
Time will tell us indeed, but I am not waiting for the jury, I am
learning Ruby and RoR now, eager to apply it to create cool, amazing
web applications... isn't that the whole point? To push technology?
To make it fun again? To innovate?

That's one viewpoint. But the point for the companies considering
language choices is what technologies will bring them the greatest
profit at the lowest risk.

As much as it's tempting for me as a geek to pick technology based on
personal preference, ultimately I have a responsibility to the
shareholders that has to take precedence.
(and since I'm one of them, and I work at a startup, there's also the
hope of an opportunity for early "retirement" :) )

Vidar
 

Chad Perrin

Chad Perrin wrote:

[ snip a bunch of bad diagramming ]
You can usually do something like this in Forth. As you're developing,
you save off the whole enchilada (the Forth interpreters and compiler
and assembler, along with your application code, all of which reside in
the dictionary) as an executable. When you're ready to release the
application, you take a special pass and strip out everything your
application doesn't use, getting a smaller executable that only contains
the pieces of the Forth environment needed to run the application.

That's at least darned close. I'd have to learn more about what exactly
it does to know how close.

I haven't spent any appreciable time inside either Common Lisp or
Scheme, or for that matter Ruby, so I don't know how this would work in
any language except Forth. Maybe what you want is as "simple" as
implementing Ruby on top of Forth. :)

Actually, now that I think about it, I wish something like that would be
what they'd do for Perl 6 instead of wedding it to a VM if they want
some kind of persistent compilation that doesn't preclude runtime
dynamism.

No reason it can't be done. The question is only "should it be done?" :)

I certainly think so, if only to provide an alternative to the "worst of
both worlds" bytecode-VM approach.

Java also has JIT, of course. Curiously enough, someone once told me
that if I looked at the JVM carefully, I'd see Forth. :)

It's a quite different approach to JIT compilation than Perl's, of
course.

Well ... maybe we should leave that to the chip? :)

That's sorta the idea.
 

Richard Conroy

Vidar,

Risk Management IS NOT equivalent to FEAR, in that you are right.

However, as I said earlier, no SIGNIFICANT progress can be expected
without some risk. Risk Management is about dealing with risk, not
eliminating it.

I would have thought that eliminating risk would be a job well done
by someone responsible for Risk Management? No?

I am seeing an awful lot of chatter here along the lines that technology
decision makers are insipid jobsworths who fall in line behind the big
tech brands because they are afraid to stick their necks out -- i.e., the
only reason they are not picking Rails is that they don't have the
stones for it.

Has anyone ever considered the fact that many of these decision makers
are very serious, ethically minded people? They take their job seriously
and feel a strong responsibility to make a correct technology decision.

I am looking very hard at Rails at the moment for an upcoming solution.
But we have some funky requirements that may mean our use of Rails ends
up reserved purely for rapid prototyping and development/test tools.
While I love how quickly you can get a best-practice solution together,
and how elegant the solutions are, I am concerned that the time you
save early on you lose down the road dealing with edge problems.

The concern is not that questions exist, but that the questions are not
being answered well. Some concerns that I have about RoR:
- lack of good success and failure case studies with lessons learned
- library (Ruby) and plugin (Rails) immaturity
- library portability
- what happens to productivity when you go outside the Rails problem domain
- how narrow that problem domain is (how easy it is to overstep)
- what happens (to productivity/performance) when your Rails apps
need to do weird stuff like bolt-on SNMP-processing Ruby code
- how forgiving the technology is -- if you make mistakes/bad assumptions,
how easy is it to recover
- deployment and bundling of Rails apps
- immaturity of tools
- international support

I am happy enough with a lot of these issues to go with a Rails solution
for something non-critical or for prototyping. But I can't in good faith
bet the project on it. I would be happy to wait a year, though, and see
what happens to my concerns, since it is all moving really rapidly --
levelling up my Rails skills in the meantime.

Don't assume decision makers are stupid or spineless. Their responsibility
is to their employer; it is not their responsibility to promote a technology.
They read mailing lists, bloggers, and case studies, and do Google searches.
They see the extended debates on multinational Rails, performance/scalability,
plugin life expectancy, and weak/unknown applicability outside of classic
web apps. Sure, the rapid development with implicit best practices
is great; that's why they are looking at it in the first place -- that's
the carrot.

I would like it to be ready for prime time now, because next year I probably
won't be in a position to put any Rails solution in place. And it sure is a
lot of fun to work with -- I can code for fun at home, but if I get my
employer to adopt it I can get paid for RoR-ing too.
 
C

Chad Perrin

I would have thought that eliminating risk would be a job well done
by someone responsible for Risk Management? No?

If you eliminate risk entirely, you end up guaranteeing failure -- for
some definition of risk. Any definition of risk that does not result in
that end is either meaningless or effectively impossible to eliminate.

I am seeing an awful lot of chatter here along the lines that technology
decision makers are insipid jobsworths who fall in line behind the big
tech brands because they are afraid to stick their neck out. ie. the
only reason they are not picking Rails is because they don't have the
stones for it.

My take is that anyone who chooses a technology based on popularity
rather than knowledge of the technology is an insipid jobsworth who
falls in line behind the big tech brands because (s)he is afraid to
stick his/her neck out. Those who choose a technology based on
knowledge of the technology, on the other hand, are smart people who
should be making a lot of money, whether the ultimate decision is to go
with J2EE, Rails, Common Lisp, VB.NET, or Wasabi. Of course, I think
VB.NET is unlikely to be a good choice outside of extremely pathological
edge cases, but that's beside the point.

Has anyone ever considered the fact that many of these decision makers
are very serious, ethically minded people? They take their job seriously
and feel a strong responsibility to make a correct technology decision.

. . . but if they end up making a decision based on the criteria Joel
Spolsky advocated in the essay that started all this discussion, they're
either malicious or incompetent.
 
A

Austin Ziegler

Best Regards,

Jose L. Hurtado
Web Developer
Toronto, Canada

So ...

we've never seen you at a TRUG meeting (we just had one yesterday).

Come on out and join us!

-austin
 
