Browser Detection Article


Garrett Smith

(resending, as the first attempted post appears to have failed)

I have copied Richard Cornford's Browser Detection article to the new
notes section (/faq/notes/).

The article explains the problem of serving content to an unknown
browser and how to easily avoid problems with bad strategies.

<http://jibbering.com/faq/faq_notes/not_browser_detect.html>

The article is in need of some attention and this is being done under
the new notes location under /faq/notes/browser-detection/.

I have added links to the document from:
<http://jibbering.com/faq/notes/>
<http://jibbering.com/faq/notes/code-guidelines>

I also changed the FAQ link for "How do I detect..." to point to the
updated document at the new location.

The following changes have been made to the article:

* Change title
From: Browser Detecting (and what to do Instead)
To: Browser Detection (and What to Do Instead)
(infinitive "to" and conjunction "and" are LC; verb "Do" and pronoun
"What" are UC)
* Author name and editor name at top of document
* remove reference to alt_dynwrite document
* Update TOC
* Markup:
- html 4 strict doctype
- remove all named anchors e.g. <a name="..."
* Text edits
- spelling:
* beare -> bear
* sting -> string
* all occurrences of clipboardDate -> clipboardData
* crateElement -> createElement
* ECMA Script -> ECMAScript
* "Client-side" -> "client side"
* "unsupporting" -> "nonsupporting"
- grammar, etc.:
- "undefined values" -> "undefined"
- Remove full stops in headings
- Capitalize each word in heading "Avoiding structural differences in
the browser DOMs"
- Capitalize the first word of the sentence "start from a basis of
valid HTML."
* Links
- From:
see <a href="http://jibbering.com/faq/#FAQ4_15">FAQ 4.15</a>
To:
<a href="/faq/#updateContent">FAQ: How do I modify the content of
the current page?</a>

The alt_dynwrite document was unlinked because it is dated; it does not
address a current problem.

The IDed Element Retrieval section provides an example of the problem
and shows a contrasting solution using a getElementWithId function,
which includes the classic depth-first Netscape 4 layer-crawler. Great.

The fundamental concept is still relevant, but the example is dated.
This lessens the impact for the reader who sees some code and identifies
it as being something that is not relevant to what he is doing. I do not
have a full rewrite proposal for that section, so it remains as-is.
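For readers who have not seen the pattern, the kind of multi-DOM
retrieval function being described looks roughly like this (a
reconstruction for illustration; not the article's exact code):

```javascript
// Reconstruction for illustration only; not the article's exact code.
// Falls back through the three historical DOM flavors, using a
// depth-first crawl of document.layers for Netscape 4.
function getElementWithId(id, doc) {
    doc = doc || document;
    var el, layers, i;
    if (doc.getElementById) {            // W3C DOM
        return doc.getElementById(id);
    } else if (doc.all) {                // IE 4 DOM
        return doc.all[id] || null;
    } else if (doc.layers) {             // Netscape 4 layer-crawler
        layers = doc.layers;
        for (i = 0; i < layers.length; i++) {
            if (layers[i].id == id) {
                return layers[i];
            }
            // Layers nest: each layer has its own document to search.
            el = getElementWithId(id, layers[i].document);
            if (el) {
                return el;
            }
        }
    }
    return null;
}
```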

TODO:
The styles are an eyesore. Literally. The code comments have green text
on a yellow text-box background, on the pink code block background. This
causes my eyes to hurt after looking at it for a while.
 

VK

(resending, as the first attempted post appears to have failed)

I have copied Richard Cornford's Browser Detection article to the new
notes section (/faq/notes/).

The article explains the problem of serving content to an unknown
browser and how to easily avoid problems with bad strategies.

<http://jibbering.com/faq/faq_notes/not_browser_detect.html>

This well-known article is written with three false assumptions:

1) The exact platform and UA version is never needed to be known for
the proper code behavior if the proper feature-detection application
is in use.
2) The feature-detection branching approach is by design more reliable
and effective than platform/UA-detection approach.
3) JavaScript code once written has to be fall-back and fall-forward
compliant so must degrade nicely on any former JavaScript
implementation and function as intended on any future ECMA-262-3-
compliant platforms

Neither of these assumptions are true and not considered seriously
anywhere outside of narrow borders of c.l.j. newsgroup. Yet these
three assumptions being taken as dogmas do form the religious ground
of many people so ensuring their only way to stay in a comfortable,
logical and moral cope with the reality in its programming domain.
This way instead of claiming "There is not your God" as I would years
ago I simply say "So let you God bless you".
 

Garrett Smith

VK said:
This well-known article is written with three false assumptions:

This looks like it might be a straw man. I don't see that exact text
you've written in the article.
1) The exact platform and UA version is never needed to be known for
the proper code behavior if the proper feature-detection application
is in use.

Assumption is the wrong word to describe that statement. A more suitable
term for the explanation of that statement would be "theory".

However, what is stated under "The Javascript Design Problems" is different:

| ... When authoring for the Internet nothing is known about the
| receiving software in advance, and even when that software is a web
| browser that will execute javascript, there is still a spectrum of
| possible DOM implementations to contend with.

That says that the author does not know the platform. It does not say
that the UA and version is never needed; it says that, at the time of
writing the code, that information is not available.

The article does not need to support that theory. It needs only to
support its own theories, which it does.
2) The feature-detection branching approach is by design more reliable
and effective than platform/UA-detection approach.

The article does not make such a blanket statement. It even shows a bad
example of feature detection, providing the following explanation:
| However, that function is not a particularly clever application of
| feature detecting because, while it avoids the function erroring in an
| attempt to execute clipboardData.setData on a browser that does not
| support it, it will do nothing on a browser that does not support it.
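For contrast, here is a sketch (mine, not the article's) of a feature
test that signals failure so the caller can offer an alternative rather
than silently doing nothing:

```javascript
// Illustration only (not code from the article): detect the IE-only
// clipboardData.setData feature and report failure when it is absent,
// so the caller can fall back (e.g. display the text for manual copying).
function copyToClipboard(text) {
    if (typeof window != 'undefined' &&
            window.clipboardData &&
            typeof window.clipboardData.setData == 'function') {
        return window.clipboardData.setData('Text', text);
    }
    return false; // feature absent: let the caller degrade visibly
}
```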
3) JavaScript code once written has to be fall-back and fall-forward
compliant so must degrade nicely on any former JavaScript
implementation and function as intended on any future ECMA-262-3-
compliant platforms
Ah, I see the article uses "fall-back" as well. I changed that fallback
just now, and I also just fixed what would be a SyntaxError in a
comment: return ''';

Comments in that document are overdone. Superfluous comments tend to be
ignored, and eventually, when the code is changed, the code can run just
fine while the perfectly misleading comment sits there.

OTOH, if the code is full of simple short methods and one or two
comments, the comments stand out a lot more.

Regarding your third statement, the article does not state that
back/forwards compatibility is /necessary/, per se, but it does explain:

| An important aspect of feature detecting is that it allows a script to
| take advantage of possible fall-back options

The article explains why earlier.
Neither of these assumptions are true and not considered seriously
s/Neither/None

anywhere outside of narrow borders of c.l.j. newsgroup. Yet these
three assumptions being taken as dogmas do form the religious ground
of many people so ensuring their only way to stay in a comfortable,
logical and moral cope with the reality in its programming domain.
This way instead of claiming "There is not your God" as I would years
ago I simply say "So let you God bless you".

That's definitely a straw man. The problems are well explained in the
article.

Of course, there are many, many ways to cause problems in code, both with
strategies and overall design. Javascript is very flexible, the browser
APIs are unknown, the paradigm of RIA for web browsers is relatively
new, and the amount of misinformation is great.

We do our best, learning from mistakes, writing it down when we have time.

It's a good article, if a bit dated. I have also linked to the browser
detection article from the code guidelines document in two places. Can
you spot them?

<http://jibbering.com/faq/notes/code-guidelines/>
 

VK

Assumption is the wrong word to describe that statement. A more suitable
term for the explanation of that statement would be "theory".

Uhmm... In what definition of this word?
http://www.merriam-webster.com/dictionary/theory
The only definition of a theory that would somehow allow to attack
anyone for not fully sharing that theory and call him a by definition
lousy programmer would be the meaning 3. from Merriam-Webster. Yet if
taken out of the field of science the meanings 1. and 2. are primarily
assumed. I like the best 2. so I agree on "theory" in whatever meaning
for others but 2. for myself. :)
However, what is stated under "The Javascript Design Problems" is different:

| ... When authoring for the Internet nothing is known about the
| receiving software in advance, and even when that software is a web
| browser that will execute javascript, there is still a spectrum of
| possible DOM implementations to contend with.

That is the key point when the theory goes wrong. When one writes the
code she knows in advance at least one "receiving software" in
advance: that is at least one browser installed where the code is
tested. I am not aware of commercial developers who would just type
the source and publish it w/o a single test.
So what would that - let it be the only one - browser be? NN4? Opera
3.x? Fx 1.0.4? IE6 ? I have a strange feeling that it will be either
current Fx or current IE. So the decision at least about a big or the
biggest part of the target audience has to be done and is always done
before a single line of code is written.
Of course a single browser for testing - it is not a commercial
developer of any kind: just a code playing amateur. For each public
project the current market share has to be learned. The requirement is
that the coverage is no lesser than 99% of the current market. 99% -
it is not where the code has to "degrade gracefully". It is where it
has to work. Otherwise the benefits will not cover all production
cost. Again: it is for an open market solution, not for a particular
intranet solution where any cost can be covered by making it to work
for 10-100 exactly these machines in exactly that prerequisite
configuration.
All lesser than 1% of the market share goes to the deepest corner of
the hell - unless the developers and the QA testers agree to do it at
their spare time free of charge.
100% coverage is not reachable even if someone will spend the rest of
his/her life for a few liner code. Everything can fail in the most
miserable way. That one:

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Oops...</title>
<meta http-equiv="Content-Type"
content="text/html; charset=iso-8859-1">
<script type="text/javascript">
var foo = 'bar';
</script>
</head>
<body>
<p onclick="window.alert(foo);">Oops...</p>
</body>
</html>

will error out on BlackBerry because until rather recently its
JavaScript implementation was not able to peek up <script> blocks in
the <head> section, only in the <body> section. BlackBerry is an often
spooky-boo here, with "think of BlackBery users!" followed by
"adjusted" script in the head block - to my silent amusement. I don't
care of BlackBery and I don't want to think of BlackBery until
BlackBery-specific solution is ordered and the advance payment is
received. I want my 99% of the *current* market coverage. Who wants
the same may share how do they get the fugures. Who wants lesser than
99% - I'd like to see them. Who wants 100% - prove Fermat's Last
Theorem wrong, the proof will hold the key. :)

<snip>
 

Garrett Smith

VK said:
Uhmm... In what definition of this word?
http://www.merriam-webster.com/dictionary/theory
The only definition of a theory that would somehow allow to attack
anyone for not fully sharing that theory and call him a by definition
lousy programmer would be the meaning 3. from Merriam-Webster. Yet if
taken out of the field of science the meanings 1. and 2. are primarily
assumed. I like the best 2. so I agree on "theory" in whatever meaning
for others but 2. for myself. :)


That is the key point when the theory goes wrong. When one writes the
code she knows in advance at least one "receiving software" in
advance: that is at least one browser installed where the code is
tested. I am not aware of commercial developers who would just type
the source and publish it w/o a single test.

The developer assumes that the client is a web browser. That is an
assumption.

[...]
All lesser than 1% of the market share goes to the deepest corner of
the hell - unless the developers and the QA testers agree to do it at
their spare time free of charge.
100% coverage is not reachable even if someone will spend the rest of
his/her life for a few liner code. Everything can fail in the most
miserable way. That one:

100% coverage is not reachable. The term "clean degradation" explains
how to handle this problem.
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Oops...</title>
<meta http-equiv="Content-Type"
content="text/html; charset=iso-8859-1">
<script type="text/javascript">
var foo = 'bar';
</script>
</head>
<body>
<p onclick="window.alert(foo);">Oops...</p>
</body>
</html>

will error out on BlackBerry because until rather recently its
JavaScript implementation was not able to peek up <script> blocks in
the <head> section, only in the <body> section. BlackBerry is an often
spooky-boo here, with "think of BlackBery users!" followed by
"adjusted" script in the head block - to my silent amusement. I don't
care of BlackBery and I don't want to think of BlackBery until
BlackBery-specific solution is ordered and the advance payment is
received. I want my 99% of the *current* market coverage. Who wants
the same may share how do they get the fugures. Who wants lesser than
99% - I'd like to see them. Who wants 100% - prove Fermat's Last
Theorem wrong, the proof will hold the key. :)
The only time I have noticed problems with executing a script in HEAD
was when it was my fault.

My unit tests have script in head and they run in Blackberry 9000.

Any browser that cannot execute script in head falls into the graceful
degradation category.
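To make that degradation concrete (my sketch, reusing VK's `foo`
example): a `typeof` guard in the handler turns a ReferenceError on a
browser that never executed the head script into a silent no-op:

```javascript
// Sketch (assumption, not code from the thread): typeof evaluates safely
// even when foo was never declared, so a browser that skipped the head
// script degrades to doing nothing instead of throwing a ReferenceError.
function showFoo() {
    if (typeof foo != 'undefined') {
        window.alert(foo);
    }
    // else: the head script did not run; degrade silently
}
```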
 

David Mark

VK said:
Uhmm... In what definition of this word?
http://www.merriam-webster.com/dictionary/theory
The only definition of a theory that would somehow allow to attack
anyone for not fully sharing that theory and call him a by definition
lousy programmer would be the meaning 3. from Merriam-Webster. Yet if
taken out of the field of science the meanings 1. and 2. are primarily
assumed. I like the best 2. so I agree on "theory" in whatever meaning
for others but 2. for myself. :)


That is the key point when the theory goes wrong. When one writes the
code she knows in advance at least one "receiving software" in
advance: that is at least one browser installed where the code is
tested.

So what? That one browser/configuration is what leads to overconfidence
in browser script developers.
I am not aware of commercial developers who would just type
the source and publish it w/o a single test.

Then you haven't browsed the Internet.
So what would that - let it be the only one - browser be? NN4? Opera
3.x? Fx 1.0.4? IE6 ?

You are just being stupid (as usual).
I have a strange feeling that it will be either
current Fx or current IE.

For the _one_ test that the typical overconfident developer might make?
Probably. And you don't see any problem with that? What is "current
IE" anyway? IE8? I assume you know that that browser can be configured
to work very similarly to IE6/7. Do you just test the "current
configuration" and how do you know what that is at development time?
So the decision at least about a big or the
biggest part of the target audience has to be done and is always done
before a single line of code is written.

Don't be silly. Professionals re-use the same cross-browser code for
years. Amateurs sit down to write a script that is "targeted" at a
handful of the "current" browsers because they have nothing but previous
outdated browser sniffing code in their archives. Get it?
Of course a single browser for testing - it is not a commercial
developer of any kind: just a code playing amateur.
What?

For each public
project the current market share has to be learned. The requirement is
that the coverage is no lesser than 99% of the current market.

Meaningless. And you don't get it anyway. Testing in more than two
browsers is done to expose problems in the code's logic, not because it
is known that all of the tested browsers are in heavy use at the time.
The more solid the logic, the more likely the script won't need to be
rewritten next year.

And 1% of Web browsers is a _huge_ number BTW. If you weren't
incompetently fumbling around with rewrites for "current" browsers,
you'd have code stocked up that covers far more than what you perceive
as "99%" of browsers.
99% -
it is not where the code has to "degrade gracefully". It is where it
has to work.

If code doesn't degrade gracefully, then it doesn't actually work at
all (except in the minds of incompetent developers). It's a surefire
sign that the concoction will fall apart in the future. Why do you
think projects like jQuery, Dojo, etc. are in a constant state of panic?
Why do you think they constantly update the list of browsers they don't
"care" about? Because they are all running Chinese fire drills every
time a new browser comes out (or a new configuration is "discovered").
And if they weren't doing this VK-esque nonsense, they'd have time to
learn how to do this stuff correctly.
Otherwise the benefits will not cover all production
cost.

LOL. I've got scripts in production that haven't changed in a _decade_.
Not for IE7, not for IE8 (any of its silly modes), not for Chrome,
Blackberry, iPhone, etc. And I can reuse these scripts without fear of
breaking old browsers or fouling up new ones.
Again: it is for an open market solution, not for a particular
intranet solution where any cost can be covered by making it to work
for 10-100 exactly these machines in exactly that prerequisite
configuration.

It would be a silly and costly strategy, even in such an environment as
those 10-100 machines won't be configured the same way forever.
All lesser than 1% of the market share goes to the deepest corner of
the hell - unless the developers and the QA testers agree to do it at
their spare time free of charge.

Or... If you have at least _one_ experienced cross-browser developer
overseeing the development. That's what I do for a living and the QA
testers love me as I save them a hell of a lot of time (my stuff tends
to work the _first_ time).
100% coverage is not reachable even if someone will spend the rest of
his/her life for a few liner code. Everything can fail in the most
miserable way. That one:

It's the same old tired nonsense. Nothing is 100% guaranteed, but
that's not an excuse to do stupid things.
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Oops...</title>
<meta http-equiv="Content-Type"
content="text/html; charset=iso-8859-1">
<script type="text/javascript">
var foo = 'bar';
</script>
</head>
<body>
<p onclick="window.alert(foo);">Oops...</p>
</body>
</html>

will error out on BlackBerry because until rather recently its
JavaScript implementation was not able to peek up <script> blocks in
the <head> section, only in the <body> section.

If that were the case, then scripts would fail (completely) in that
browser. So what? Is that somehow an excuse to use browser sniffing?
Sounds more like a good reason to use progressive enhancement (leaving
users of a completely broken browser with a usable static document).
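A minimal sketch of that strategy (element id and behavior invented for
illustration): the markup is a working static link, and script, where it
runs at all, only layers behavior on top:

```javascript
// Hypothetical progressive-enhancement sketch; 'help-link' and the
// showInlineHelp callback are invented for illustration.
function enhanceHelpLink(showInlineHelp) {
    if (typeof document == 'undefined' || !document.getElementById) {
        return false; // no usable DOM: the plain <a href> keeps working
    }
    var link = document.getElementById('help-link');
    if (!link) {
        return false; // expected markup absent: nothing to enhance
    }
    link.onclick = function () {
        showInlineHelp();
        return false; // cancel navigation only when the enhancement ran
    };
    return true;
}
```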
BlackBerry is an often
spooky-boo here, with "think of BlackBery users!" followed by
"adjusted" script in the head block - to my silent amusement.
I don't
care of BlackBery and I don't want to think of BlackBery until
BlackBery-specific solution is ordered and the advance payment is
received. I want my 99% of the *current* market coverage. Who wants
the same may share how do they get the fugures. Who wants lesser than
99% - I'd like to see them. Who wants 100% - prove Fermat's Last
Theorem wrong, the proof will hold the key. :)

Blackberry-specific solution? Blackberry users are not part of "your"
99% of the "current" market coverage? And are you really dim enough to
think that a strategy based on what you can see of today's browsers,
ignoring everything that came before, will hold up? It is self-defeating.
Isn't your whole problem that you are overwhelmed by too many variables?
Do you think there will be less or more of those in the future? Your
ill-conceived pigeonhole coverage is just not designed to deal with
anything more than what you can see at the time you write the code.

Even the jQuery proponents are getting fed up with such babbling
"pragmatism". VK is the last holdout for saving money through endless
incompetence? You really need to find a new line of work.
 

VK

The developer assumes that the client is a web browser. That is an
assumption.

The developer expects that the code will be profitable for himself if
a freelancer or he is being ordered to make the code because his boss
considers it to be profitable. "Profitable" doesn't necessary mean
money directly, it can be a brand or name promotion, visitors
attraction and many other ways to benefit from the code usage. That is
not an assumption, that is the observable fact of the nature #1. With
this fact in hands the fears of the code being somewhere someday
executed in a stay-alone JavaScript engine, in a C++ wrapper over
JavaScript interfaces, in a self-made or/and self-compiled non-
graphical browser with JS support - all these fears will disappear by
themselves. If not right away, then make a poster "Dorky experimenters
and web punks failed to use my code? I don't care!" Meditate at that
poster for a few minutes every day for several days. The positive
result is guaranteed by Dr. VK :)
100% coverage is not reachable. The term "clean degradation" explains
how to handle this problem.

The term is applied to the "scripted enhancements" on the page, so for
all those dynamic menus, "form helpers" and other last century Web 1.x
stuff.
"A surprisingly large number of people look at the benefits of the
thin client model -- easy updates (the web), a declarative UI language
(HTML), an easy-to-learn and powerful language (JScript) -- and decide
that this is the ideal environment to develop a rich client
application.
That's a bad idea on so many levels. ... JScript was designed for
simple scripts on simple web pages, not large-scale software."
Eric Lippert, November 2003
http://blogs.msdn.com/ericlippert/archive/2003/11/18/53388.aspx

Really, I think that this quote from the developer of the original
JScript engine will go one time to the set with Bill's "1Mb of RAM is
enough for any kind of application".
Having said that I want to note that Eric Lippert is a good
professional and an excellent technical writer. His "Fabulous
Adventures In Coding" blog articles are a suggested reading to
everyone though some data may be obsolete.

Any way, I don't understand what "clean degradation" would be for Web
2.0 and higher. If I have a fully featured client-side word processor
with graphics editing capabilities - then to where should I degrade
for IE6 or script disabled "users"? To a textarea with submit button?
You must be kidding! To an equivalent server-side solution duplicating
the client-side one? That is not even funny...

Yes, one of these users could be a crazy millionaire who promised to
leave all his money to the site authors if that site cope with his
IE5.5 or NN6. Well, I will not spend time and money on a daily basis
in hope of that lucky chance. One may say that I missed a huge
opportunity. Sorry, but I will just have to live with it. :)
 

David Mark

VK said:
The developer expects that the code will be profitable for himself if
a freelancer or he is being ordered to make the code because his boss
considers it to be profitable.

The typical incompetent (and/or disingenuous) freelance developer (e.g.
VK) wants to make a quick buck. So they show the fruits of their labor
to their clients on one or two "current" browsers that they've
painstakingly tuned the code to work with. Never mind that new browsers
come out seemingly every day. That just means more work for the
developer to "keep up" with those pesky browsers (designed
obsolescence). And if their nudging and twiddling (usually per the UA
string or some other thoughtless inference) breaks older browsers or
browsers with less than 1% of the world, that is spun as "saving money"
by not bothering to test thoroughly. Get it? If not, you are stupid.
If so, you are posing, stealing or both (and you best hope your clients
don't read this group).
"Profitable" doesn't necessary mean
money directly, it can be a brand or name promotion, visitors
attraction and many other ways to benefit from the code usage. That is
not an assumption, that is the observable fact of the nature #1. With
this fact in hands the fears of the code being somewhere someday
executed in a stay-alone JavaScript engine, in a C++ wrapper over
JavaScript interfaces, in a self-made or/and self-compiled non-
graphical browser with JS support - all these fears will disappear by
themselves. If not right away, then make a poster "Dorky experimenters
and web punks failed to use my code? I don't care!" Meditate at that
poster for a few minutes every day for several days. The positive
result is guaranteed by Dr. VK :)

Please stop wasting everyone's time with random gibberish.
The term is applied to the "scripted enhancements" on the page, so for
all those dynamic menus, "form helpers" and other last century Web 1.x
stuff.

Are you really that out of touch? Do you think that proponents of
progressive enhancement (and the like) are strictly talking about form
validation? And many incompetents use huge and complex blobs of browser
sniffing to do form validation. Mostly they aren't capable of writing
anything more than that (with perhaps a fade effect on top to make their
stuff look "modern"). Trying to write something more than that with
brittle scripts like jQuery is madness.
"A surprisingly large number of people look at the benefits of the
thin client model -- easy updates (the web), a declarative UI language
(HTML), an easy-to-learn and powerful language (JScript) -- and decide
that this is the ideal environment to develop a rich client
application.

Lots of people did that around 2005 and we are still feeling the
after-effects of the tsunami of futility that followed.
That's a bad idea on so many levels. ... JScript was designed for
simple scripts on simple web pages, not large-scale software."
Eric Lippert, November 2003
http://blogs.msdn.com/ericlippert/archive/2003/11/18/53388.aspx

Who is Eric Lippert, and how does this observation from 2003 help your
"argument" for browser sniffing?
Really, I think that this quote from the developer of the original
JScript engine will go one time to the set with Bill's "1Mb of RAM is
enough for any kind of application".

So why are you quoting it?
Having said that I want to note that Eric Lippert is a good
professional and an excellent technical writer. His "Fabulous
Adventures In Coding" blog articles are a suggested reading to
everyone though some data may be obsolete.

Your recommendations are the kiss of death.
Any way, I don't understand what "clean degradation" would be for Web
2.0 and higher.

I know you don't. Neither does "Jorge" and several others who post
similar drivel here. Here's a hint: there's no such thing as "Web 2.0".
If I have a fully featured client-side word processor
with graphics editing capabilities - then to where should I degrade
for IE6 or script disabled "users"?

For script disabled users?

document.write('<a href="bs.html">VK's imaginary word processor<\/a>');

And if you are really so incompetent as to write an application that
fails entirely in IE6 (which means it likely won't work in some
configurations/modes of IE7/8), wrap the above in a conditional comment
(and add another aimed at IE6 that explains what they are missing).
To a textarea with submit button?

That would depend on the context, wouldn't it? Often a textarea is
replaced with an RTF editor (usually a bad one).
You must be kidding! To an equivalent server-side solution duplicating
the client-side one? That is not even funny...

As is often the case, feeble reasoning sinks the design (hypothetical in
this case). You don't seem to have any technical ability either. How
do you manage to find work in this industry after so many years of
public ineptitude laced with gibberish?
Yes, one of these users could be a crazy millionaire who promised to
leave all his money to the site authors if that site cope with his
IE5.5 or NN6. Well, I will not spend time and money on a daily basis
in hope of that lucky chance. One may say that I missed a huge
opportunity. Sorry, but I will just have to live with it. :)

You still don't get it. You test as many browsers as you can. It's not
about making everything "perfect" in IE5.5 (or whatever). It is about
testing for gaps in your degradation logic. Repeat that over and over.
;) If I load up some complicated DOM-intensive Web app in IE5 and it
doesn't look quite right, it likely won't be an issue. But if it throws
an exception out of the gate due to some bad assumption about its
limited host environment... (hint: the issue may not be relegated to
IE5). I ran into several of these loopholes in logic when testing My
Library in browsers like Opera 5, NN4 (and yes, even a couple in IE5).
They were virtually always one-line adjustments that had _nothing_ to do
with the browsers being tested (other than those limited browsers
exposed the holes). Now, if I have an app that runs in Opera 5-10.5
(degrading in 5 and 6 of course) and you have some bullshit that appears
to work in IE8 and FF3.6 (in their default configurations), which do you
think has the better chance of working in Opera 11? Which author has
_learned_ something along the way? And which is still in the dark due
to years of half-ass coding efforts? In other words, would you put a
penny on My Library or Dojo? It's years of success building momentum
versus years of failure building and tearing down the same crap over and
over.

Today's "current" browsers will not be current next year. So what will
happen to your code in the future? If it only works for a fleeting
moment in time (as indicated by failures in older browsers), it isn't
likely to withstand the future. You seem to be hoping that you will be
called back to rewrite it. I wouldn't bet on it. :
 
D

David Mark

David said:
The typical incompetent (and/or disingenuous) freelance developer (e.g.
VK) wants to make a quick buck. So they show the fruits of their labor
to their clients on one or two "current" browsers that they've
painstakingly tuned the code to work with. Never mind that new browsers
come out seemingly every day. That just means more work for the
developer to "keep up" with those pesky browsers (designed
obsolescence). And if their nudging and twiddling (usually per the UA
string or some other thoughtless inference) breaks older browsers or
browsers with less than 1% of the world, that is spun as "saving money"
by not bothering to test thoroughly. Get it? If not, you are stupid.
If so, you are posing, stealing or both (and you best hope your clients
don't read this group).


Please stop wasting everyone's time with random gibberish.


Are you really that out of touch? Do you think that proponents of
progressive enhancement (and the like) are strictly talking about form
validation? And many incompetents use huge and complex blobs of browser
sniffing to do form validation. Mostly they aren't capable of writing
anything more than that (with perhaps a fade effect on top to make their
stuff look "modern"). Trying to write something more than that with
brittle scripts like jQuery is madness.


Lots of people did that around 2005 and we are still feeling the
after-effects of the tsunami of futility that followed.


Who is Eric Lippert, and how does this observation from 2003 help your
"argument" for browser sniffing?


So why are you quoting it?


Your recommendations are the kiss of death.


I know you don't. Neither does "Jorge" and several others who post
similar drivel here. Here's a hint: there's no such thing as "Web 2.0".


For script disabled users?

document.write('<a href="bs.html">VK's imaginary word processor<\/a>');

document.write('<a href="bs.html">VK\'s imaginary word processor<\/a>');
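The difference between the two lines above is the escaped apostrophe. A minimal sketch of why the first fails (using eval here only to make the parse error observable; the original uses document.write):

```javascript
// The unescaped apostrophe in "VK's" terminates the single-quoted string
// literal early, so the first form is a syntax error at parse time.
var parses;
try {
  eval("var s = 'VK's imaginary word processor';"); // unescaped: broken
  parses = true;
} catch (e) {
  parses = false; // SyntaxError: unexpected identifier
}
console.log(parses); // false

// Escaping the apostrophe (\') keeps the literal intact:
var fixed = 'VK\'s imaginary word processor';
console.log(fixed); // VK's imaginary word processor
```

The same applies to the `<\/a>` sequence in both lines: escaping the slash keeps an HTML parser from seeing a premature `</script>`-like closing tag inside the string.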
 

VK

The typical incompetent (and/or disingenuous) freelance developer (e.g.
VK) wants to make a quick buck.

"a quick buck"? Do you have data on the development lifecycle in my
bureau? It took around one year to get a stable SVL library (Superimposed
Vector Language, a top-layer interface over SVG and VML) working in IE6,
Firefox, Opera and Safari. Some things can be done quickly, some take
longer, some take very long. I assume we are discussing the "theory", not
my business reputation. The latter I leave to my clients.
So they show the fruits of their labor
to their clients on one or two "current" browsers that they've
painstakingly tuned the code to work with.

Always:
--------------------------
5 browsers, 3 OS:
IE 8 / Win Vista + Win 7
Fx 3.6 / Win Vista + Win 7
Sf 4 / Win Vista + Win 7 + Snow Leopard
Ch 4.1 / Win Vista + Win 7
Op 10 / Win Vista + Win 7
--------------------------
Per extra negotiations:
full functional equality for IE 6 / Win XP SP2

This is the full QA testing, not just loading it once to see if any
error shows in the error panel.

May you share your QA testing platforms? I assume your list is much
longer than mine, with your clients happily paying you extra hours for
each UA which is not in my list. Or do you QA only for IE8/Fx and the
rest comes as a free-of-charge bonus? Just curious.
Never mind that new browsers
come out seemingly every day.

Oh, you do have to mind. Moreover, you have to not only continuously
monitor browsers coming and going across the 1% line but also monitor
browsers that are about to come to market and predict their chances of
becoming a must-support, based on betas, the producer's market power,
etc. There is almost no time for golf. :)
Lots of people did that around 2005 and we are still feeling the
after-effects of the tsunami of futility that followed.

Yeah... Lots of people still cannot believe that the secure niche of
explaining prototype inheritance, griping about proper script tag
attributes and the like has almost vanished. JavaScript went very quickly
down the regular language path: from chunks of code to simple libraries,
then complex universal libraries, then, at a certain complexity level,
libraries for managing libraries (frameworks). Yet the "historical"
c.l.j. decision, that a library in JavaScript is possibly not an
absolute evil, came no earlier than the year 2008. At least until that
year the consensus remained on the opposite side.
Who is Eric Lippert

http://blogs.msdn.com/user/Profile.aspx?UserID=2989

The author of the only readable prototype explanation I know of:
http://blogs.msdn.com/ericlippert/archive/2003/11/06/53352.aspx

And presumably the terrible guy thanks to whom JScript doesn't
enumerate variables over host objects and mistreats named
FunctionExpressions :)
http://blogs.msdn.com/ericlippert/archive/2005/05/04/414684.aspx
and how does this observation from 2003 help your
"argument" for browser sniffing?

Because for rich applications I don't care about the presence of
getElementById; that is the far past. I want the newest tools in the way
they are implemented per platform, with all their power and, alas, with
all their inevitable peculiarities.
I know you don't.  Neither does "Jorge" and several others who post
similar drivel here.  Here's a hint: there's no such thing as "Web 2.0".

There are a lot of things that do not exist on the c.l.j. planet: Web
2.0, Ajax, and, among the most recent sound discoveries, the
non-existence of JavaScript itself. Luckily these products of bored
minds do not affect the world. ;-)

http://www.google.com/#hl=en&source=hp&q=Web+2.0
 

David Mark

VK said:
"a quick buck"? Do you have data on the development lifecycle in my
bureau?
What?

It took around one year to get a stable SVL library (Superimposed
Vector Language, a top-layer interface over SVG and VML) working in IE6,
Firefox, Opera and Safari.

So what?
Some things can be done quickly, some take
longer, some take very long. I assume we are discussing the "theory", not
my business reputation. The latter I leave to my clients.

Your poor long-suffering clients? What about them?

IE 8 / Win Vista + Win 7

Which mode, genius?
Fx 3.6 / Win Vista + Win 7

Works mostly the same as FF3.5, FF3, FF2...
Sf 4 / Win Vista + Win 7 + Snow Leopard
Ch 4.1 / Win Vista + Win 7

Chrome 3-4.1 is off your radar completely?! And I suppose 4.2 will be
out of the question by next month.
Op 10 / Win Vista + Win 7
--------------------------

Don't you see how stupid all of this is? Er, scratch that. Of course
you don't, else this thread would be long over.

Are you seriously saying that you would only test your code in the above
browsers? Or perhaps you would use the results of such "strenuous"
testing as evidence that you have done your job? Well, what happens
when Opera 11 comes out? If you wrote code (by observation) that only
worked in Opera 10, what chance does it have in anything but Opera 10?
Per extra negotiations:
full functional equality for IE 6 / Win XP SP2

You wish. You forgot that you are not a capable developer. And what
about IE7 and its umpteen billion possible configurations? What about
IE8 impersonating IE7? All academic?
This is the full QA testing, not just loading it once to see if any
error shows in the error panel.

QA testing something that sniffs the UA string is a complete waste of
time. You know that the logical "glue" that holds such scripts together
will dry up within a short period of time (and you'll be starting all
over with another round of observing/sniffing/testing). Why not just
learn how to do things right once?
May you share your QA testing platforms? I assume your list is much
longer than mine, with your clients happily paying you extra hours for
each UA which is not in my list.

My clients pay _much_ less as they don't need me to come in every six
months to observe/sniff/test again. There's no list. Get it?
Or you are QA only for IE8/Fx
and the rest comes as a free of charge bonus? Just curious.

QA is done by QA testers (not me). Oddly enough, they rarely have
issues with my scripts. And I don't mean on two browsers. I mean on
whatever the hell they want to try them with now or in the future.
Oh, you do have to mind. Moreover, you have to not only continuously
monitor browsers coming and going across the 1% line but also monitor
browsers that are about to come to market and predict their chances of
becoming a must-support, based on betas, the producer's market power,
etc. There is almost no time for golf. :)

Actually, I pay very little attention to new browsers. They come and
go. My scripts stay. Must be magic, huh?

Yeah... Lots of people still cannot believe that the secure niche of
explaining prototype inheritance, griping about proper script tag
attributes and the like has almost vanished.

What does that mean?
JavaScript went very quickly
down the regular language path: from chunks of code to simple libraries,
then complex universal libraries, then, at a certain complexity level,
libraries for managing libraries (frameworks).

LOL. Very few JS libraries are worth pinning your hopes on. The fact
that "frameworks" are built on shoddy JS libraries (not to mention that
GP libraries aren't suited for browser scripting) should tell you
something about them (hint: it's not progress).
Yet the "historical"
c.l.j. decision, that a library in JavaScript is possibly not an
absolute evil, came no earlier than the year 2008.

Lunacy. What's a library? The term could describe almost any script.
Bad GP scripts are evil (e.g. jQuery, Prototype, etc.). Bad scripts are
typically written (and boxed into ill-advised designs) by neophytes.
Why? Because people new to browser scripting don't know any better.

And you don't know your history either. My Library was written in 2007
to serve as a counterpoint for people (like you) that argue endlessly
(and without substance) that they "need" to use browser sniffing. How
you've managed to remain a neophyte for over a decade is beyond me.
At least until that
year the consensus remained on the opposite side.

As mentioned, you make a lousy historian (and programmer, amateur
psychologist, philosopher, etc.).

Again, your recommendations...
And presumably the terrible guy thanks to whom JScript doesn't
enumerate variables over host objects and mistreats named
FunctionExpressions :)
http://blogs.msdn.com/ericlippert/archive/2005/05/04/414684.aspx
Whatever.


Because for rich applications I don't care about the presence of
getElementById; that is the far past. I want the newest tools in the way
they are implemented per platform, with all their power and, alas, with
all their inevitable peculiarities.
Whatever.


There are a lot of things that do not exist on the c.l.j. planet: Web
2.0, Ajax, and, among the most recent sound discoveries, the
non-existence of JavaScript itself. Luckily these products of bored
minds do not affect the world. ;-)

Web 2.0? What's that? Oh, where every Web page loads ten tons of
script and then throws exceptions all over the place? I often use Opera
10.5 and have it configured to display errors. Every miserable site
(certainly all of the "majors") runs programs that are doing things that
their developers never expected them to do. Wonder how that happened.
Perhaps Opera 10 wasn't on their list when they wrote the scripts?

And JavaScript(tm) is a brand name. Do your business cards say
"JavaScript expert" or what?
 

VK

His example is a truly horrible way to build a prototype chain, so why
should anyone believe him?

It is obvious from the article that it is not a sample of practical
prototype inheritance usage. It is an explanation of what prototype and
constructor are as such. Find me another article where that simple fact
is stated as simply and unambiguously, with any other code sample:
<quote>
Reptile.prototype = Animal;
this does NOT mean "the prototype of Reptile is Animal". It cannot
mean that because (obviously!) the prototype of Reptile, a function
object, is Function.prototype. No, this means "the prototype of any
instance of Reptile is Animal".
</quote>

Semi OT:
That aside, I am not using prototype inheritance in my code, as anyone
can see. From my point of view it is less effective, much more
convoluted and much less flexible than the good ol' super-derivative
chain constructors with all methods on the same level:

function Super() {
  this.method = Super.m;
}
Super.m = function() {
  // do with [this]
};

function Derivative() {
  Super.call(this);
  this.new_method = Derivative.m;
}
Derivative.m = function() {
  // do with [this]
};

Yet JavaScript was built similarly to Perl, where inheritance can be
built in whatever way one is used to. One can go with super-derivative
chain constructors:
http://docs.sun.com/source/816-6408-10/function.htm#1194243
others who like new and original stuff can go with prototypes:
http://docs.sun.com/source/816-6408-10/function.htm#1193426
and some can combine both (not really suggested from the code
readability point of view, but well possible).
This is why - but not only because of that - JavaScript is a great
language.
 

VK

May you share your QA testing platforms? I assume your list is much
My clients pay _much_ less as they don't need me to come in every six
months to observe/sniff/test again.  There's no list.  Get it?

Yes, I got it. Accordingly, that's the last time I am answering you.
Maybe you are a great poet, or you play golf like a god, but as a
programmer you and your products are of zero interest to me. Same for
any sane customer.
QA is done by QA testers (not me).  Oddly enough, they rarely have
issues with my scripts.  And I don't mean on two browsers.  I mean on
whatever the hell they want to try them with now or in the future.

Your "My Library" is claimed to exist since 2007. Where is the version
vector? The fixed bugs list? Current bug tickets? Verifiable
testimonials? Some "ultimate programmer" you are, for crying out loud...

Not a single one found in three years. OK, mark this as #000001
then:
http://www.cinsoft.net/mylib-examples.html
In the "Alert" section, open a multi-button modal and navigate the
buttons with Tab. That alone shows right away that no one has ever used
your script in practice, because it is a frequent user action and the
bug would be discovered very soon. I may look for a partial fix in my
old TransModal code:
http://transmodal.sourceforge.net/TransModal_0_0_4_full/0_0_4_Strict.html
As it remains on SourceForge it has a bunch of bugs itself, but it was
fairly marked as a 0.0.4 alpha for testing only, not as perfect code
ready for use.
And JavaScript(tm) is a brand name.

JavaScript is the name of the language. The only name of the only
language. Indeed, ECMAScript is not yet a registered trademark in the
US or EU:
http://tess2.uspto.gov/
http://www.trademarkdirect.co.uk/corporate/
That is easy to correct; I am going to call my lawyer in CA first thing
tomorrow. I never sat on royalties, only full sales. Well, everything
happens for the first time once.
 

Dr J R Stockton

In comp.lang.javascript message <[email protected]
september.org>, Fri, 30 Apr 2010 22:44:44, Garrett Smith wrote:
TODO:
The styles are an eyesore. Literally. The code comments have green text
on a yellow text-box background, on the pink code block background. This
causes my eyes to hurt after looking at it for a while.

Remove ALL of the CSS. Then insert such CSS as is really useful. Using
border: 1px solid black for what looks like a PRE element is useful;
little else is. And use the simpler style for the FAQ as well.

But move turgid explanations of obvious detail, such as that on !!, to
something like float:right; width:30%; font-size:smaller; with
modest bordering.



In current
<http://jibbering.com/faq/faq_notes/not_browser_detect.html> :

is boolean true seems to be a ...
Boolean
Search the standard for "Boolean" (lots) and "boolean" (2, justified).
More of them further on.

As More browsers were written their ...
! ,

feature detecting because, while it avoids the function erroring in
Bad English : ^^^^^^^^
 

John G Harris

On Sun, 2 May 2010 at 08:36:30, in comp.lang.javascript, VK wrote:

JavaScript is the name of the language. The only name of the only
language. Indeed, ECMAScript is not yet a registered trademark in the
US or EU
http://tess2.uspto.gov/
<snip>

That website says that 'JavaScript' is a trade mark owned by Sun.

Don't you want your pages to work in IE or Opera ?

John
 

David Mark

VK said:
Yes, I got it. Accordingly, that's the last time I am answering you.
Good.

Maybe you are a great poet, or you play golf like a god, but as a
programmer you and your products are of zero interest to me. Same for
any sane customer.

Any sane customer? Where do you come up with that?
Your "My Library" is claimed to exist since 2007.

Yes, it was developed *here*, genius (at least the foundation), as the
"CWR project". Where were you?
Where is the version
vector? The fixed bugs list? Current bug tickets?

There is a repository linked from every page on the site.
Verifiable
testimonials?

There have been lots of them over the years. Right here. Where were you?
Some "ultimate programmer" you are, for crying out loud...
Huh?


Not a single one found in three years.

I wasn't looking for most of those three years (nor was I asking others
to use the code or even mentioning it existed). Still, there were some
keen-eyed readers who posted bug reports (and they were fixed
immediately). There have been others who pointed out non-problems
because they didn't take the time to read up on how it is supposed to
work (like you). Again, where were you?
OK, mark this as #000001
then:
http://www.cinsoft.net/mylib-examples.html
In the "Alert" section, open a multi-button modal and navigate the
buttons with Tab.

You should read my comments about that in the My Library forum. To
summarize, it is up to the calling app to deflect unwanted focus back to
the dialog. I didn't bother demonstrating that on that page, but have
promised to do so in the near future.
That alone shows right away that no one has ever used
your script in practice, because it is a frequent user action and the
bug would be discovered very soon.

The alert thing is an add-on. It was the first "widget" written for My
Library as a proof of concept (yes, back near the end of 2007). It was
(as I've admitted) a complete piece of junk as a GP widget (but it did
quite well for what I needed it for).

There's no fix as there's no problem. I know how to make the _calling
app_ deflect focus when _it_ has invoked a modal of some sort (i.e. this
has nothing to do with the "modal" widget). Thanks anyway.
As it remains on SourceForge it has a bunch of bugs itself, but it was
fairly marked as a 0.0.4 alpha for testing only, not as perfect code
ready for use.
LOL.


JavaScript is the name of the language. The only name of the only
language. Indeed, ECMAScript is not yet a registered trademark in the
US or EU:
http://tess2.uspto.gov/
http://www.trademarkdirect.co.uk/corporate/

Javascript. The camel-case version is the brand name. Do you read this
group at all or do you just post blindly?
That is easy to correct; I am going to call my lawyer in CA first thing
tomorrow. I never sat on royalties, only full sales. Well, everything
happens for the first time once.

:)
 
