which content type

shypen42

Hi,

I'd like to know how to serve web pages as "cleanly" as possible,
validating them where possible. How can I both respect the
standards *and* have pages work under the various IE versions?

Is there a "best practice" for the content type of a web page
that uses a fair amount of JavaScript (and "AJAX"-style server querying)?

Here are three examples:

1. XHTML 1.0 Transitional, content="text/html"

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<title>script.aculo.us - web 2.0 javascript</title>
<meta http-equiv="content-type" content="text/html; charset=utf-8" />


2. XHTML 1.0 Strict, content="text/html"

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="fr" lang="fr">
<head>
<title>Openweb.eu.org - L'objet XMLHttpRequest</title>
<meta http-equiv="Content-type" content="text/html;
charset=utf-8"/>



3. XHTML 1.1, content="application/xhtml+xml"

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
<meta http-equiv="Content-type" content="application/xhtml+xml;
charset=utf-8" />

Now I've read this:

http://hixie.ch/advocacy/xhtml

"Sending XHTML as text/html Considered Harmful"

And then someone replying to the following assertion:

++ there is ONLY ONE valid Content-type for XHTML content,
++ and that is application/xhtml+xml, not text/html.

- This is simply not true. RFC 2854 [ietf.org], the definition of
- text/html, explicitly permits XHTML 1.0 documents that
- follow Appendix C to be transmitted as text/html.
- Doing so causes Mozilla and Opera to parse it as HTML and
- not XHTML, but that doesn't mean it's "invalid" or non-standard
- in any way.



So what would you recommend? What is currently seen
as "best practice", given that some JavaScript is
used (in a "buzzwordy AJAX" fashion)?
 

Randy Webb

(e-mail address removed) said the following on 5/18/2006 8:59 AM:
Hi,

I'd like to know how to serve web pages as "cleanly" as possible,
validating them where possible. How can I both respect the
standards *and* have pages work under the various IE versions?

Serve it as text/html
Code it to HTML4.01 Strict
 

Richard Cornford

I'd like to know how to serve web pages as "cleanly" as possible,
validating them where possible. How can I both respect the
standards *and* have pages work under the various IE versions?

Valid HTML 4.01 documents served with content-type text/html.
Is there a "best practice" for the content type of a web page
that uses a fair amount of JavaScript (and "AJAX"-style server querying)?

The important relationship between the content-type header and scripting
is that with a text/html content type the browser should create an HTML
DOM, and with an application/xhtml+xml content type the browser should
create an XHTML DOM. (Some 'file extension' and DOCTYPE testing may
leave some browsers that receive text/html still creating an XHTML DOM,
but they really shouldn't.) Almost no non-trivial browser script
written for an HTML DOM will work correctly with an XHTML DOM.
Generally you would not want to expose the same script to both types of
DOM, and if you did you would find the script-authoring task
considerably complicated.
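Richard's point about scripts breaking across DOM flavors can be illustrated with a small sketch (the helper name is mine, not from the thread): an HTML DOM reports tagName in uppercase, while an XHTML DOM keeps the lowercase XML form, so unguarded tagName comparisons succeed in one DOM and silently fail in the other.

```javascript
// Hypothetical helper: compare tag names without assuming the case
// convention of either DOM flavor. An HTML DOM yields "DIV", an
// XHTML DOM yields "div"; lower-casing both sides tolerates either.
function isTag(element, name) {
  return element.tagName.toLowerCase() === name.toLowerCase();
}
```

With this, isTag(el, "div") gives the same answer whether the browser built an HTML DOM or an XHTML DOM for the document.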
Here are three examples:

1. XHTML 1.0 Transitional, content="text/html"
<snip>

Self-delusion! If it is sent as text/html it will almost always be
interpreted as HTML (error-filled tag-soup HTML) and an HTML DOM
created for it. Appendix C XHTML is only XHTML in the mind of its
author; to the browser it is tag-soup HTML.
2. XHTML 1.0 Strict, content="text/html"
3. XHTML 1.1, content="application/xhtml+xml"
<snip>
Useless in a commercial context because IE does not understand XHTML at
all.

++ there is ONLY ONE valid Content-type for XHTML content,
++ and that is application/xhtml+xml, not text/html.

- This is simply not true. RFC 2854 [ietf.org], the definition of
- text/html, explicitly permits XHTML 1.0 documents that
- follow Appendix C to be transmitted as text/html.

But when Appendix C XHTML 1.0 is transmitted as text/html the receiving
browser regards the document as HTML (and builds an HTML DOM for it).
So the degree to which Appendix C documents really are XHTML is
questionable. They certainly are not XHTML when it comes to scripting
their DOMs.
- Doing so causes Mozilla and Opera to parse it as HTML and
- not XHTML, but that doesn't mean it's "invalid" or non-standard
- in any way.

It still doesn't mean that doing so is sensible. If you script a
document in a way that is predicated upon its being interpreted as HTML
(an HTML DOM being exposed to be scripted) then it is almost perverse
to impose a superficial illusion that the document is XHTML.
So what would you recommend? What is currently seen
as "best practice", given that some JavaScript is
used (in a "buzzwordy AJAX" fashion)?

If you are going to script it; Valid HTML 4.01 served as text/html.
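Taken literally, that recommendation amounts to a skeleton along these lines (a sketch of mine, not from the thread; example.js is a placeholder name):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
<html lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Example</title>
<script type="text/javascript" src="example.js"></script>
</head>
<body>
<p>Sent with an HTTP Content-Type header of text/html.</p>
</body>
</html>
```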

Richard.
 

VK

Hi,

I'd like to know how to serve web pages as "cleanly" as possible,
validating them where possible. How can I both respect the
standards *and* have pages work under the various IE versions?

Irrelevant to IE or any other particular browser: a "cleanly served
page", in my mind, is a page that:

1) is built in accordance with some chosen DTD
2) has a matching Content-Type (for local files, a matching extension <>
Content-Type association will be needed).

In this respect, a "cleanly served page" is:

1) an HTML page built in accordance with the HTML Transitional, HTML
Frameset or HTML Strict DTD, with the relevant DOCTYPE, and served with
Content-Type text/html

2) an XSLT page built in accordance with XML/XSL rules and served with
Content-Type text/xml

3) an XHTML page built in accordance with the chosen DTD, with the
relevant DOCTYPE, and served with Content-Type application/xhtml+xml

Any other options are profanations of WWW and are not functional in
many aspects.
Of the three options above, option 1 is the only one fully supported
by all existing UAs.

Yet option 2 is also supported by any more-or-less modern UA. The
potential number of non-capable UAs is approximately equal to the number
of users not capable of hitting the keyboard right or distinguishing
between a mouse and a monitor. So both groups can be freely written off
as inevitable audience shrinkage.

Option 3 is currently supported by less than 10% of current UAs, so it
cannot be recommended for use on the wide Web.
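VK's remark about local files needing an extension <> Content-Type association could be expressed, assuming an Apache server (the post names no particular server), as:

```apache
AddType text/html .html .htm
AddType text/xml .xml .xsl
AddType application/xhtml+xml .xhtml
```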
 

Thomas 'PointedEars' Lahn

VK said:
Irrelevant to IE or any other particular browser: a "cleanly served
page", in my mind, is a page that:

1) is built in accordance with some chosen DTD
2) has a matching Content-Type (for local files, a matching extension <>
Content-Type association will be needed).

In this respect, a "cleanly served page" is:

1) an HTML page built in accordance with the HTML Transitional, HTML
Frameset or HTML Strict DTD, with the relevant DOCTYPE, and served with
Content-Type text/html

Full ACK.
2) an XSLT page built in accordance with XML/XSL rules and served with
Content-Type text/xml

Please learn to understand the difference between a stylesheet and a plain
markup language. XSLT (XSL Transformation) is a member of the XSL
(Extensible Stylesheet Language) language family that is used to transform
XML markup into other formats. There is no "XSLT page". (There is also
no "XML page" or an "(X)HTML page" simply because display depends on the
medium; but that is a different matter.)

3) an XHTML page built in accordance with the chosen DTD, with the
relevant DOCTYPE, and served with Content-Type application/xhtml+xml

XHTML may also be served as application/xml (which is also a
[questionable] means to have XHTML documents displayed by IE)
or text/xml, however application/xhtml+xml is preferred:

<URL:http://www.w3.org/TR/xhtml-media-types/>

(Whereas the decision of allowing XHTML 1.0 to be served as
text/html is questionable as well, as already explained.)
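One hedged way to act on this (a sketch; the function name is mine, and the thread does not prescribe content negotiation) is to inspect the request's Accept header and fall back to text/html for clients, such as old IE, that do not advertise application/xhtml+xml support:

```javascript
// Hypothetical helper: choose a Content-Type for an XHTML document
// based on the client's Accept header.
function negotiateXhtmlType(acceptHeader) {
  // Clients that explicitly accept the proper XHTML type get it...
  if (/application\/xhtml\+xml/.test(acceptHeader || "")) {
    return "application/xhtml+xml";
  }
  // ...everything else (notably old IE) gets the Appendix C fallback.
  return "text/html";
}
```

The fallback only makes sense for XHTML 1.0 documents that actually follow Appendix C; anything else should not be served as text/html at all.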
Any other options are profanations of WWW
^^^^^^^^^^^^^^^^^^^
Parse error.
and are not functional in many aspects.
ACK


Option 1 is the only one fully supported by all existing UAs.
True.

Yet option 2 is also supported by any more-or-less modern UA. The
potential number of non-capable UAs is approximately equal to the number
of users not capable of hitting the keyboard right or distinguishing
between a mouse and a monitor. So both groups can be freely written off
as inevitable audience shrinkage.

Parse error. (IOW: Nonsense. You really should stick to the facts.)
Option 3 is currently supported by less than 10% of current UAs, so it
cannot be recommended for use on the wide Web.

Your numbers are questionable. For example, even your favorite
w3schools.com states in its "Browser Statistics"[1] that as of May 2006 CE
(and we are at the end of that month), Gecko-based browsers, which have a
working XML parser built in, in total have a share of 28% of the hits
(Firefox: 25.7%, a continuous slight increase over the last months with
the exception of March; Mozilla Seamonkey[?]: 2.3%, currently decreasing).
Of course, you could also say that those numbers are questionable, and in
fact they are. Statistics, whether true ones or not, do not help with
mid-term design decisions, you see.


PointedEars
___________
[1] <URL:http://www.w3schools.com/browsers/browsers_stats.asp>
 
