This assumption won't hurt anything if the page is designed to degrade
gracefully.
I have a situation where it will hurt, and it is a common one.
A page arrives with its HTML for static viewing. I want to manipulate
the DOM structure and page CSS to create a widget. I think of the
point at which the DOM and CSS are manipulated as "the point of no
return". Before I make these manipulations I need to make sure the
widget will function completely. If the DOM structure and CSS both
work, but the browser doesn't have correct event handling, then the
widget will be useless and some content will not be visible to the
user.
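In code, the check before the point of no return might look something like this (the three feature tests and enliven() are hypothetical placeholders for the real DOM, CSS, and event-handling tests a particular widget would need):

```javascript
// Sketch: feature-test everything the widget needs *before* the point
// of no return. Only if all tests pass do we rewrite the DOM and CSS.
function maybeEnliven() {
  if (hasNeededDom() && hasNeededCss() && hasNeededEvents()) {
    enliven(); // the point of no return: DOM and CSS get rewritten
  }
  // Otherwise do nothing and leave the static page fully usable.
}
```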
This isn't ideal as it relies on setAttribute, but it does work for
Firefox. Alternatively, you could define the onclick attribute in the
markup.
var el = getAnElement(),
    dom0 = true;

if (typeof el.onclick == 'undefined' &&
    isHostMethod(el, 'setAttribute')) {
  el.setAttribute('onclick', '(function() {})');
  dom0 = (typeof el.onclick == 'function');
  el.setAttribute('onclick', '');
}
There doesn't seem to be anything in the Element.setAttribute spec
that makes me think this is a particularly robust test.
http://www.w3.org/TR/2000/REC-DOM-Level-2-Core-20001113/core.html#ID-F68F082
If a design relies on DOM0 event support, the best solution is to call
its gateway function like this:
this.onload = myGateway;
I agree that this is a solid solution. What I'm actually trying to
determine is whether the above test will work. Here is why...
When a page is built with progressive enhancement in mind, it looks
like the unenhanced version while the whole page loads. When
window.onload fires, the page can be enhanced and made to look like
the enhanced version. For a user with a browser that supports the
enhanced version, this is unsightly. For example, a huge list of
nested ul elements appears on the page while it loads; when
window.onload fires, the handler determines that the nested ul
elements can be converted into a drop-down menu or tree menu. When
this conversion happens the page appears to "jump" as the CSS changes
and the page is re-rendered. This is not really acceptable, as more
users will get the enhanced version than the unenhanced one.
So what I want to do is the following...
While the page is loading, determine if the browser has enough
features to "get out of trouble". If so, add a CSS file to the page
with document.write that gives the page an acceptable appearance while
it loads. When window.onload fires, determine if the widget can fully
function and, if it can, enable it. If the widget cannot fully
function, execute the "get out of trouble" routine and revert the
styling of the HTML to something more pleasing for the unenlivened
version. This almost allows me to *tentatively* cross the point of no
return with just one foot so the page looks nicer during load.
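The plan might be sketched like this (canGetOutOfTrouble, widgetWillFunction, enliven, and getOutOfTrouble are hypothetical names for the feature tests and the styling work):

```javascript
// Tentative enhancement: style for the enhanced version during load,
// then either cross the point of no return or back out at onload.
function tentativelyEnhance(doc, win) {
  if (!canGetOutOfTrouble()) {
    return; // not enough features to back out later; stay static
  }
  // During load: document.write a stylesheet so the page already
  // looks acceptable for the enhanced version.
  doc.write('<link rel="stylesheet" href="enhanced.css">');
  win.onload = function() {
    if (widgetWillFunction()) {
      enliven();          // cross the point of no return for real
    } else {
      getOutOfTrouble();  // revert styling for the unenlivened page
    }
  };
}
```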
Makes sense?
Is this in regard to the Safari 1.x preventDefault bug on click/
dblclick?
Yes, my question also relates to this.
I have thought about that one and I don't think it requires
a workaround.
You may be right that the workaround is not necessary. I need to think
about this more.
If, for instance, a user of Safari 1.x clicks a link
that has a click event listener and preventDefault fails, the browser
will fire the event and then navigate normally. As long as the page
is designed to work without script, this should not cause an issue.
But it might cause an issue if the JavaScript does something that
shouldn't be followed by the normal navigation. I know we aren't
supposed to use <a> links to add items to a shopping cart but suppose
someone did. And then suppose they used hijax[1] to make the link add
the item to the cart with Ajax. If the Ajax is initiated and then the
link is followed, the item will be added to the cart twice.
[1]
http://domscripting.com/blog/display/41
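In code, the double-add scenario might look like this (hijaxAddToCart and addToCartViaAjax are illustrative names, not from any library):

```javascript
// A minimal hijax-style handler. Returning false should cancel the
// default navigation; in Safari 1.x it does not, so the item is added
// once by Ajax and again by the normal link navigation.
function hijaxAddToCart(link) {
  link.onclick = function() {
    addToCartViaAjax(this.href); // hypothetical Ajax request
    return false; // cancels navigation -- unless the browser ignores it
  };
}
```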
I am completely aware this is a crappy example but my gut tells me
there must be situations where having the JavaScript run followed by
the normal navigation is bad.
I suppose the JavaScript could be run after a timeout, so that it only
runs if the normal navigation didn't happen. Messy solution.
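The timeout idea might be sketched like this (the delay value is a guess, and addToCartViaAjax is the same hypothetical Ajax call as above):

```javascript
// Deferred hijax: if the browser ignored "return false" and navigated,
// the page unloads before the timer fires and the Ajax call never
// happens, avoiding the double add. Messy, as noted.
function hijaxAddToCartDeferred(link) {
  link.onclick = function() {
    var href = this.href;
    setTimeout(function() {
      addToCartViaAjax(href); // only runs if we are still on the page
    }, 100);
    return false;
  };
}
```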
I know some libraries (e.g. YUI) sniff for the older Safari versions
and fall back to DOM0 for click/dblclick, but I think that is a
mistake.
I think the sniff is also a mistake. I use DOM0 listeners for all
click events. That way I treat all browsers equally and don't need the
sniff. I think it would be fine to use DOM0 listeners for all event
handlers. There really isn't a need for addEventListener and
attachEvent because a wrapper API around DOM0 can do it all anyway. My
problem is that right now I am not testing that DOM0 will work. This
is not a large practical problem for me but it is a large problem in
trying to write a widget properly.
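As a sketch of the kind of wrapper I mean (the names are mine, not from any library), a DOM0-only API can still support multiple listeners per event:

```javascript
// A minimal DOM0 wrapper: keep a listener list per element/event and
// dispatch them all from a single DOM0 handler. Illustrative only; a
// real version needs listener removal, event normalization, etc.
function addListener(el, type, fn) {
  var key = '_listeners_' + type;
  if (!el[key]) {
    el[key] = [];
    el['on' + type] = function(e) {
      var result;
      for (var i = 0; i < el[key].length; i++) {
        // A listener returning false cancels the default action.
        if (el[key][i].call(el, e) === false) {
          result = false;
        }
      }
      return result;
    };
  }
  el[key].push(fn);
}
```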
By the way, this problem is the last detail for an article I am
writing for my blog about writing progressive enhancement widgets (a
tabbed pane) for the general web. It uses isHostMethod etc. If anyone
is interested in reviewing the article I can post it here for debate.
I've never seen such an article and I think one has long been needed.
Peter