JavaScript as AJAX response (RPC)

petermichaux

Hi,

Some servers return JavaScript as the response to an AJAX request. When
the response JavaScript is eval'ed, it calls other JavaScript functions
already in the browser to update elements, etc. This seems like a good
system because it allows so much freedom in creating the desired
behavior in the browser. The required data doesn't have to be converted
to XML or JSON on the server. The browser doesn't need templates for
interpreting this data and converting it into some change in the page.
None of that conversion code has to be written, or changed when new
behavior is required. This remote procedure call approach is the
predominant system in the Ruby on Rails world. (Unfortunately they are
calling Prototype.js functions.)
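
Just to make the idea concrete, here is a rough sketch of what I mean
(the URL, the element id and the updateList function are made-up names):

    // The server might return something like:
    //   updateList(["milk", "eggs", "bread"]);
    var req = new XMLHttpRequest();
    req.open("GET", "/cart/items.js", true);
    req.onreadystatechange = function () {
        if (req.readyState === 4 && req.status === 200) {
            // The response *is* code, so just evaluate it; it calls
            // functions that are already loaded in the page.
            eval(req.responseText);
        }
    };
    req.send(null);

    // Already defined in the page:
    function updateList(items) {
        document.getElementById("item-list").innerHTML =
            "<li>" + items.join("</li><li>") + "</li>";
    }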

However, some people apparently think this remote procedure call
approach is a bad idea. I can't see why it is so bad, because it is so
lightweight and flexible. It also helps to keep the client less
intelligent, which seems good in a world of incompatible client-side
bugs.

If I use some neutral data format like XML to accommodate different
types of clients, then I have to write a different client-side
interpreter for each type of client (browser, RSS, POP, cell phone,
etc.). Why not just write different server-side code that generates the
correct JavaScript (or other output) for the requesting client type?
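
With a neutral format, the same update would look more like the sketch
below (the field and id names are again invented); the interpretation
logic now has to live in the client:

    // The server returns only data:
    //   {"items": ["milk", "eggs", "bread"]}
    function handleResponse(text) {
        var data = eval("(" + text + ")"); // or a proper JSON parser
        // The client has to know how to turn the data into markup:
        document.getElementById("item-list").innerHTML =
            "<li>" + data.items.join("</li><li>") + "</li>";
    }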

When is the RPC approach such a bad idea?

Thank you,
Peter
 
Leo Meyer

Some servers return JavaScript as the response to an AJAX request.
When is the RPC approach such a bad idea?

One problem I see with this approach is what happens when you need to
change the API you are calling. You limit your refactoring choices if
you need to keep the new API call backwards-compatible, and you may run
into trouble when changing versions.

The other issue is that you get relatively high coupling between server
and client code across language boundaries, which is not always
desirable.

Having said that, I also like this type of "RPC" call. In my shop we use
it for one of our products, and so far we haven't had any problems.
OTOH, we have a very small codebase for this product, and we use only
one AJAX request with a choice of server code (PHP, ASP, Java). In our
setting this works quite well without too much hassle.

For large codebases with lots of different AJAX requests, though, I
would recommend using this approach with extreme caution. You may be
better off with a data-centric approach there. But then, I'd recommend
extreme caution for any kind of programming technique. There is no such
thing as fool-proof.

Regards,
Leo
 
Laurent Bugnion

Hi,

Some servers return JavaScript as the response to an AJAX request.
When is the RPC approach such a bad idea?

The main issue I have with this approach is that you limit yourself to
a JavaScript client, and probably to a web browser based JavaScript
client at that.

Web services (in the broad sense of the word) should be, IMHO,
"universally" callable. We are only starting to see RIAs (rich internet
applications) calling all kinds of web services (SOAP based, POX based
(plain old XML), etc.) to enhance the user experience. If you publish a
JavaScript-only web service, you close the door to such applications.
And if you later want to open that door, the refactoring effort will be
huge.

I prefer standard interfaces when possible, e.g. SOAP or POX. In .NET
2.0 applications (ASP.NET, WinForms, WPF), for example, integrating SOAP
based web services is ridiculously easy, which means that a lot of
programmers will do so. As a web service provider, you have "duties" to
them (just like an interface is a "contract"). IMHO the duties are:
don't add breaking changes to your interface; make the interface easy to
use; make the interface as universal as possible.
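
To make the contrast concrete, here is a rough sketch of consuming a POX
response in the browser (the element names and the showItems hook are
invented); any client that can parse XML can read the same payload,
whether or not it runs JavaScript:

    // The server returns plain old XML:
    //   <items><item>milk</item><item>eggs</item></items>
    function handlePoxResponse(req) {
        var nodes = req.responseXML.getElementsByTagName("item");
        var items = [];
        for (var i = 0; i < nodes.length; i++) {
            items.push(nodes[i].firstChild.nodeValue);
        }
        // The rendering decision stays on the client side:
        showItems(items);
    }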

Of course there are exceptions to that "rule" (of thumb ;-), but more
often than not an interface designed for one single application ends up
being reused in other apps as well. If you choose a standard to start
with, the refactoring will be easier.

HTH,
Laurent
 
Laurent Bugnion

Hi,

Chris said:
Who was talking about web services? This was a question about AJAX.

Which is why I wrote "in the broad sense of the word": web services as
in services exposed on the web. I think I made it clear in my answer
that I was also talking about POX.

Greetings,
Laurent
 
