HTML preprocessor


aa

David Dorward said:
Essential functionality should be provided by the server.
I guess web-hosts without server side scripts are more probable than
visitors without JS
 

aa

Toby Inkster said:
that doesn't mean that it can't be used for some visual/behavioural
niceties.

JS is a mainstream development in web design. Not for every website, of
course. Limiting it to niceties? Is there any alternative to JS which is
supported by all the browsers as well as the <body> tag?
 

aa

Chaddy2222 said:
Well, they can't, and the keywords Meta Tag has nothing to do with what
we're talking about!
If you confine keywords to the keywords Meta Tag you will never get a decent
rating
 

Andy Dingley

Client-side assembly of this pre-processed templating is a bad idea,
whether (as David says) by JS, or by any other possible method. It's a
one-off static publishing process, so do it that way. Merge the files
when you publish them and have the server serve their simple static
merged versions. This works better in terms of what matters, the
content you send out to the client. Caching will be assisted, for one
thing.
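The publish-time merge described above can be sketched as follows. This is a hypothetical illustration, not anything from the thread: the include-marker syntax is borrowed from SSI, and the function and file names are made up.

```javascript
// Sketch of a one-off publish step: expand include markers in each
// page with the shared fragment, then upload only the merged result.
// Marker syntax (<!--#include ...-->) is borrowed from SSI;
// all names here are illustrative.
function mergeIncludes(page, fragments) {
  return page.replace(/<!--#include virtual="([^"]+)" -->/g,
    (match, name) => (name in fragments ? fragments[name] : match));
}

// Example: merging a shared footer into a page before publishing.
const merged = mergeIncludes(
  '<body><!--#include virtual="footer.html" --></body>',
  { 'footer.html': '<p>Contact us</p>' }
);
// merged === '<body><p>Contact us</p></body>'
```

The client then receives ordinary static HTML: no script is needed to assemble the page, and it caches like any other static document.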

Client-side JS is of course a bad idea for anything that is essential
to the core functioning of a site or page.

Given your other recent clueless post, you're not surprising me here.
8-(
1. The author is talking about navigation, header and footer, which do not
need to be scanned by search engines.

That is simply ridiculous. Of course navigation needs to be accessible
to minimal non-JS spiders.

You claim that there is some sort of "menu bar navigation" that
benefits from JS and some "minimal links" navigation that doesn't
require it, and so is spiderable anyway. Although you might have a
point in relation to some issue of JS dropdowns for the main menu, the
OP here is talking about pre-processing include files for content
management reasons, not to enhance the display of their menu bar.

If they need CMS of one set of links, they need CMS for all of their
usages of these links. If this CMS is achieved by client-side JS, then
a non-JS client would lose it in all instances where it had been used.

2. What is the percentage of non-JS visitors?

High enough. Google doesn't use it, and you care about Google.
Are you saying that JS
should not be used at all?

No, but only to enhance a "core page" that remains functional without
it.

For a separate reason, client-side techniques (either JS or <iframe>)
are a poor idea as a substitute for SSI.
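For reference, the SSI mentioned here amounts to a one-line directive in the page source, which the server expands before the response goes out (the path is illustrative; on Apache, SSI processing must be enabled for the file):

```html
<!-- Apache Server Side Includes: the server splices the named file
     into the response, so the client only ever sees the merged HTML. -->
<!--#include virtual="/footer.html" -->
```

Because the merging happens on the server, the page works identically for JS and non-JS clients.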
 

Toby Inkster

aa said:
I guess web-hosts without server side scripts are more probable than
visitors without JS

You have full control over the server, but very little control over
the client. So if you choose a server with server-side scripting
capabilities, then your "probability" of having a server without
server-side scripting available is 0%, whereas the probability of having a
visitor without client-side scripting is still 5-15%.

Ultimately, it's your choice, as it's your website, but personally I don't
think it's a good choice to exclude about 10% of your potential visitors.
That would be like switching your web server off from midnight 1 Jan to
midday 6 Feb every year (36.5 days, 10% of the year).
 

Chaddy2222

aa said:
If you confine keywords to the keywords Meta Tag you will never get a decent
rating
There is no such thing. Oh, unless you mean ranking.
Also, I have no idea what you mean by keywords (unless you're on about
including them in the content of the pages, which, by the way, is
recommended).
Although hidden words in a page will get you banned by Google and other
SEs. They also ignore keywords in the Meta Tags.
 

aa

Server-side cannot be an alternative to client-side. You object to JS because
it is not available in all browsers. Server-side is not available to all
web designers :(

Besides, server-side has nothing to do with the browser compatibility we are
talking about.
And there are a number of situations where doing a certain job on the client's
machine without round trips to the server prevents customer frustration.
 

aa

Chaddy2222 said:
There is no such thing. Oh, unless you mean ranking.
Well done!
Also, I have no idea what you mean by keywords
I do not know which idea you have, but keywords are a standard concept, not
too complicated, and I see no point in repeating basic things here like
"including them in the content of the pages, which by the way is
recommended". Likewise, I never mentioned using hidden keywords.
I was just puzzled by your statement "Well, they (navigation, header,
footer) can't and the keywords Meta Tag has nothing to do with what we're
talking about!"
 

Jonathan N. Little

aa said:
JS is a mainstream development in web design. Not for every website, of
course. Limiting it to niceties? Is there any alternative to JS which is
supported by all the browsers as well as the <body> tag?
Maybe supported, just not enabled. JavaScript-driven navigation, although
convenient, is just not wise or practical now. Back in '97, maybe; but in
2007, absolutely not! If links to other pages on your site are inserted
via JavaScript then they will not be "visible" to bots and therefore
will not be indexed by search engines. Search engines do not have
administrator rights to do directory listings of all files on the
server; they must follow a link like the rest of the public!
 

Jonathan N. Little

aa said:
If you confine keywords to the keywords Meta Tag you will never get a decent
rating
For the keywords "Best Coffee" here is Google's #1 pick:

http://www.seattlesbest.com/
Seattle's Best Coffee

Look at the source and find me the meta tag "Keywords". Just saying so
does not make it so!
 

aa

Andy Dingley said:
whether (as David says) by JS, or by any other possible method. It's a
one-off static publishing process, so do it that way. Merge the files
when you publish them and have the server serve their simple static
merged versions. This works better in terms of what matters, the
content you send out to the client. Caching will be assisted, for one
thing.

Client-side caching works when there are files external to a given HTML page.
If you include these files at publishing time, the client-side cache does not
help when downloading pages that use the same, say, footer.
Yet I have no concerns about this, for I am not a graphics addict and my pages
are usually very small hand-coded things (the site with the Flashes discussed
recently was an exception, because I inherited the graphics and the audience
is special, not just every visitor). My concern is maintenance. If there is
no access to server-side scripts, then JS works fine: to change the menu I
need to change and upload just one JS file, no matter how many HTML pages use
this menu.
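The single shared menu file described here typically looks something like this sketch (the function and link targets are illustrative, not from the thread). As noted elsewhere in the discussion, non-JS clients and most spiders never see links inserted this way.

```javascript
// menu.js — one shared file builds the menu markup, so editing it
// once updates the menu on every page that loads it. In a browser
// the markup would be injected with document.write() or DOM calls;
// it is written as a plain function here so the output can be
// inspected. Link targets are illustrative.
function buildMenu(links) {
  const items = links
    .map(link => `<li><a href="${link.href}">${link.text}</a></li>`)
    .join('');
  return `<ul class="menu">${items}</ul>`;
}

const menuHtml = buildMenu([
  { href: 'index.html', text: 'Home' },
  { href: 'contact.html', text: 'Contact' },
]);
```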

Client-side JS is of course a bad idea for anything that is essential
to the core functioning of a site or page.
Given your other recent clueless post, you're not surprising me here. 8-(

You know what your problem is? You and the likes of you here have imagined
yourselves to be running the show. You do not give advice. You issue
instructions. And instead of a free exchange of opinions you are trying to
command. Anyone who does not fit into your model is, if you let me use your
own lexicon, "a clue-proof idiot" who should "remove both of your thumbs
from your arse and stop talking shit". You cannot impartially discuss
technicalities. You've got to make personal assaults.
That is simply ridiculous. Of course navigation needs to be accessible
to minimal non-JS spiders.

So if you do not accept the concept of different purposes of websites and
different target audiences, then I respect your opinion and will memorise it.
No point in getting heated.
 

Andy Dingley

aa said:
Besides, server-side has nothing to do with the browser compatibility we are
talking about

Of course it does. Server-side in this case is a way to avoid the issue
of client-side compatibility altogether.

And there are a number of situations where doing a certain job on the client's
machine without round trips to the server prevents customer frustration

Agreed, but totally irrelevant for this situation.
 

aa

Toby Inkster said:
You have full control over the server, but very little control over
the client. So if you choose a server with server-side scripting
capabilities, then your "probability" of having a server without
server-side scripting available is 0%, whereas the probability of having a
visitor without client-side scripting is still 5-15%.

I like the way you use "if" :)
Henry Ford said that his customers could choose a car of any colour as long
as that colour was black.
Following your pattern I can coin the following:
IF every customer selects IE, the probability of having a JS-less user is 0.
Or probably a bit higher, for there are some cranks who have bothered to
learn how to disable JS, graphics, cookies, etc.

Shall we ask the visitors of this NG how many of them have full control over
the server hosting their websites?
 

aa

Already discussed in this thread.
If a home page is correctly designed, there is no need for SEs to parse
navigation, footers and headers.
 

aa

Jonathan N. Little said:
For the keywords "Best Coffee" here is Google's #1 pick:

http://www.seattlesbest.com/
Seattle's Best Coffee

Look at the source and find me the meta tag "Keywords". Just saying so
does not make it so!
You should tell this to Chaddy2222 - he is the one who brought the meta tag
"Keywords" into this discussion, and I am still at a loss as to why he did
that.
Besides, the absence of this tag in your example says absolutely nothing,
except that the designer just missed it.
My point was that keywords should not be confined to the meta tag "Keywords",
but I still believe that it is better to have it.
 

Jonathan N. Little

aa said:
Server-side cannot be an alternative to client-side. You object to JS because
it is not available in all browsers. Server-side is not available to all
web designers :(

Agreed, server-side is far better for the designer. The designer has
complete control over the output, independent of the client.
Besides, server-side has nothing to do with the browser compatibility we are
talking about.
And there are a number of situations where doing a certain job on the client's
machine without round trips to the server prevents customer frustration.

The problem here is your mindset. You are basing your opinions here and
on other threads on conditions and practices 10+ years in the past. Time
to update both your attitude and knowledge base. Back in the 90's
server-side scripting was expensive and JavaScript was new, hence web
designers gravitated towards the latter (as with DHTML and dancing
cursors). Now however, server-side scripting is cheap and readily
available to bargain-basement hosting and free hosting; additionally
browser security has been exploited via JavaScript and is now a real issue.

If your hosting does not have server side scripting, change it.
Dime-a-dozen, and do your website's CMS properly.
 

Toby Inkster

aa said:
Server-side cannot be an alternative to client-side. You object to JS because
it is not available in all browsers. Server-side is not available to all
web designers :(

Server-side scripting *is* *available* to all web designers. They may
*choose* not to host on a server that supports it, but that is a choice
that they have made. The possibility for them to use it is always there.
Hence, if the designer chooses to use it, it will work far more
consistently than client-side, which the designer has no control over.
 

Andy Dingley

aa said:
Client-side caching works when there are files external to a given HTML page.

The question is not "Does caching work for the other documents" but
rather "Does caching still work for the merged HTML, if I use a
particular technique to merge it?"

For SSI, caching is fine.
For client-side assembly, caching is not possible.
If you include these files at publishing time, the client-side cache does not
help when downloading pages that use the same, say, footer.

The footer will already have been merged into the resulting HTML
document. Its content is cached (as part of the main document), the
file itself is not needed, not visible, and so the fact it's not cached
doesn't matter.

Yet I have no concerns about this, for I am not a graphics addict and my pages
are usually very small hand-coded things

The notion that "I can do something badly because I don't do much of
it" isn't conducive to developing good skills.

I do this. We all do this. But I don't _like_ doing it, and sometimes
I work on a big site where I can't do it any more. Then it's useful to
know beforehand how to do things right.

Learning to do things right is often hard and lengthy. Once you've
learned though, it's usually quicker and easier to do them right
anyway, and everywhere.

If there is
no access to server-side scripts, then JS works fine: to change the menu I
need to change and upload just one JS file, no matter how many HTML pages use
this menu.

This is a complete red herring. Your argument is "inclusion is good,
therefore client-side inclusion is also good".
Our argument is instead "server side inclusion is better than client
side inclusion". There is no contradiction here because there is no
overlap.

However no-one is advocating an absence of inclusion (the only case
worse than your advice). Server-side inclusion is easily available in
most cases and can still be obtained in the others, by less direct
routes.
Anyone who does not fit into your model is, if you let me use your
own lexicon, "a clue-proof idiot"

No, not anyone. Just someone, like yourself, who begins as merely
ignorant but then just becomes entrenched in their ignorance rather
than bothering to learn something. It's your choice. No-one else cares.
So if you do not accept the concept of different purposes of websites and
different target audiences, then I respect your opinion and will memorise it.

There are two fallacies in your argument here.

Firstly there are indeed "different websites". There are even two
"groups of websites", where one group cares about search engine
performance and one doesn't. However the first group is far, far bigger
than the second (kids' homepages and photos to share with the family).
Pretty much all of us here, whatever sort of page we write, care very
much about search engines.

Your fallacy though is to equate this categorisation with a
categorisation by purpose or implementation technology. It doesn't
matter if you're large or small, graphical or text, chances are that
you're in that huge group of search-engine-hungry sites. You simply
cannot say "My page is small and is made of badly-sized unreadable
text, therefore I don't care about search engines".
 
