confused said:
> After expressing my interest in expanding my new knowledge of HTML
> and CSS into the wild realm of JavaScript, I was advised that it is
> wiser to avoid it, since not all browsers are use it or are enabled
> to read it.
Many people who encounter web pages where the inappropriate or
ill-considered use of javascript has had disastrous consequences for the
page (a blank page being the most obvious and extreme consequence)
conclude that javascript is all bad and recommend that it never be used
by anyone.
Obviously that is not an attitude that you will find promoted here.
However, presenting a web site user with a blank screen, or a web page
that is unusable for any other reason, is not a practice that is
considered acceptable here either (except maybe by a few, who are not
capable of doing any better).
The question we struggle with is: can javascript be designed,
implemented and deployed to enhance a web site when it is available and
functional, without detracting from the 100% reliable HTML over HTTP
that is the basis for any web site?
> After searching for other opinions on the web, I found
> that some estimate that the frequency of browsers that can read JS
> currently is better than 90% --
> <snip>
The availability of javascript is not the only issue. In addition you
have to consider the viability of the things that you intend to do with
javascript.
Opening new browser windows is an obvious example. Because of pop-up
advertising, people run various flavours of pop-up blocker. These people
are not running pop-up blockers on javascript-disabled or
javascript-incapable browsers; there would be no point, as those
browsers could not open pop-ups anyway. So to your supposed 10% failure
of any script that attempts to open a new window as a result of
javascript being disabled, you also have to add a percentage resulting
from the accumulated effect of the various forms of pop-up blockers.
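As a sketch of how a new-window script can avoid adding that dependency
(the function name, window name and feature string here are
illustrative, not taken from any particular site): the link keeps a
real href, so an absent or blocked window.open simply degrades to
normal navigation:

```javascript
// Illustrative sketch: try to open a pop-up, but let the link's own
// href do the work whenever that fails for any reason.
function openInPopUp(href) {
    // No window.open at all: return true so the browser follows the link.
    if (typeof window == 'undefined' || !window.open) {
        return true;
    }
    var win = window.open(href, 'popUpWin', 'width=400,height=300');
    // Pop-up blockers commonly make window.open return null; falling
    // back to the href keeps the page usable in that case too.
    return (win == null);
}
```

Used as <a href="page.html" onclick="return openInPopUp(this.href);">,
so javascript-incapable browsers (and all failure cases above) just
follow the link.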
Then there is the question of how dynamic the browser is. Modern desktop
browsers are very dynamic; small embedded browsers and older browsers
are less so. So a script that wants to actively insert content into a
web page will fail on javascript-enabled and capable browsers that are
not dynamic, in addition to javascript-incapable browsers.
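What "checking how dynamic the browser is" can look like in practice is
sketched below; the document object is taken as a parameter and the
element id is illustrative:

```javascript
// Only attempt dynamic insertion when the W3C DOM methods the script
// needs all test as present; otherwise report failure so the caller
// can leave the underlying HTML alone.
function supportsDomInsertion(doc) {
    return !!(doc && doc.getElementById && doc.createElement &&
              doc.createTextNode);
}

function insertNotice(doc, text) {
    if (!supportsDomInsertion(doc)) {
        return false; // not dynamic enough: degrade cleanly
    }
    var holder = doc.getElementById('noticeHolder'); // illustrative id
    if (!holder || !holder.appendChild) {
        return false; // expected element missing: degrade cleanly
    }
    var par = doc.createElement('p');
    par.appendChild(doc.createTextNode(text));
    holder.appendChild(par);
    return true;
}
```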
Even something as trivial as using IE's ability to write to the
clipboard is not guaranteed to be available, as IE's configuration
allows that to be independently disabled. And ActiveX is even less
reliable.
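The same one-test-per-feature caution applies here; window.clipboardData
is an IE-specific object and may be absent, or disabled independently of
javascript itself, so a sketch (not a cross-browser solution) would
test it before every use:

```javascript
// Attempt an IE-only clipboard write; silently report failure rather
// than showing the user an error when the feature is absent/disabled.
function copyTextToClipboard(text) {
    if (typeof window != 'undefined' &&
        window.clipboardData &&
        window.clipboardData.setData) {
        // IE's setData returns a boolean indicating success.
        return window.clipboardData.setData('Text', text);
    }
    return false; // degrade cleanly: no error dialog for the user
}
```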
Javascript cannot be written with the attitude that javascript is simply
either available on the client or not. The availability of javascript
does not guarantee that any given script will execute successfully; that
will depend entirely on what it attempts to do, and how it attempts to
do it. So, even if you are gullible enough to believe that accurate and
representative statistics can be gathered about a global HTTP network,
any statistic for javascript availability only represents one end of a
very blurred boundary region between successful script execution and
script failure.
> that is certainly workable for me!
If you went to the business people behind a web site (assuming it is a
commercial project) and told them that an arbitrary and unnecessary
design decision was going to cost them 10% of their turnover, would you
expect that decision to be endorsed?
The starting point for a web site is HTML over HTTP (often in
combination with server-side scripting), which is 100% reliable and will
"work" on every web browser for all users (or at least as near to that
as can be achieved). Everything that detracts from that underlying 100%
reliability is the direct result of a design decision made by a human
(it is not inherent in the system).
How and why humans introduce unreliabilities varies a great deal; it
could be ignorance, or an unwillingness to recognise the significance of
a course of action, or it could be an informed decision resulting from
an analysis of conditions and an assessment that the introduced
unreliability is an acceptable trade-off for some other advantage. (Of
course you cannot make an informed assessment of the relative merits of
introducing unreliability into an otherwise reliable system until you
understand how to avoid doing so, and what that would involve.)
Javascript design is probably the point where the most can be detracted
from the underlying reliability of the system. That makes the design of
javascript probably more important than its specific implementation
(though that is not an excuse for bad coding). So can browser scripts be
designed so that they add to a web page when they work, but do not
detract from it when they fail?
When designing javascript it is important to realise that it will always
fail somewhere: certainly on whatever percentage of browsers do not
support javascript at all, and then additionally on all of those
browsers that do not support the required features (or are operated in
an environment that impacts on their use, such as alongside pop-up
blockers). How a script fails (preferably under its own control) becomes
as important as what it will do when it works, and for this the
criterion is "clean degradation". Clean degradation means that the user
is not shown error reports, the user is not badgered (about their
browser and its capabilities), and after the degradation the resulting
web page is completely usable in terms of what it presents to that
user.
The easiest way to achieve clean degradation is to start with 100%
reliable HTML and use javascript to go up from there. That allows the
degraded state to be that underlying HTML. The script is then designed
so that it enhances that underlying HTML through its manipulation
(possibly adding dynamically generated controls and the like), but
cautiously, feature detecting the browser's support for what is
required of it and withdrawing gracefully whenever the browser does not
appear to be supportive; so, cleanly degrading back to the original
underlying HTML.
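The whole pattern might be sketched like this (the element id and the
nature of the enhancement are illustrative): the page is written as
complete, working HTML, and the script only acts after every feature it
needs has tested as available:

```javascript
// Enhance the underlying HTML only when every required feature has
// been verified; on any missing feature, withdraw and leave the
// already-working HTML exactly as it was delivered.
function enhanceMenu(doc) {
    if (!doc || !doc.getElementById || !doc.createElement) {
        return false; // withdraw: the plain HTML menu remains
    }
    var menu = doc.getElementById('siteMenu'); // illustrative id
    if (!menu || !menu.appendChild || !menu.getElementsByTagName) {
        return false; // element missing or not manipulable: withdraw
    }
    // Safe to manipulate: e.g. add dynamically generated controls.
    var links = menu.getElementsByTagName('a');
    // ... attach behaviour to the verified links here ...
    return true;
}
```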
That is all fine and good as a design principle, but can it be realised?
Can a javascript manipulated UI still make sense without javascript?
There are things that cannot be implemented to cleanly degrade, but not
that many (and they may be the aspects of javascript best avoided). The
vast majority of the things that are commonly implemented in a way that
introduces a javascript dependency are completely amenable to being
implemented without that dependency.
With the modern desktop browsers capable of considerable manipulation of
HTML, anything placed within that HTML can be transformed by scripts to
be (and do) just about anything imaginable, while the fact that the most
dynamic (and common) browsers can manipulate existing content means
that there is always a path of clean degradation available to any (and
all) less capable browsers.
I have recently been playing with a script that illustrates an extreme
of the manipulation of underlying HTML with javascript. It started life
as a response to a question asked on this group, but I have considerably
elaborated it in the meantime:-
<URL:
http://www.litotes.demon.co.uk/js_info/pop_ups.html >
- Visit it with a javascript capable/enabled modern desktop browser to
see how it manipulates its contents. And then use the first link on the
page to re-load with the script disabled (or disable javascript and
re-load the page, or re-visit with a less capable browser) to see it in
its degraded state. It isn't yet perfect but it does demonstrate that
much can be done with javascript without introducing any dependence upon
javascript, and once that is achieved it stops being important that some
browsers may not support javascript.
Richard.