Frames or not Frames...

Ale

Hi...
I always use frames... I find them useful for writing and updating a website relatively quickly.
However, I know about the possible problem in the crawler indexing.

I don't have the problem of a page inside a frame being linked directly, because I programmed each dynamic page to detect this and load itself back inside the frameset. I did not do this with JavaScript, because I remember reading that it could cause trouble with indexing too.

My questions are:

1) Does ASP's Response.Redirect cause trouble for web crawlers, as JavaScript can?
2) Do you suggest abandoning frames, or do you think the solution I adopted, together with a linked sitemap, can avoid problems with bots?

Thanks
Ale
 
Barbara de Zoete

I always use frames... I find them useful for writing and updating a website relatively quickly.
My questions are:
2) Do you suggest abandoning frames, or do you think the solution I adopted, together with a linked sitemap, can avoid problems with bots?

This has been discussed many times:
<http://groups.google.com/groups?q="frames+are+evil"+group:alt.html*>


--
,-- --<--@ -- PretLetters: 'woest wyf', met vele interesses: ----------.
| weblog | http://home.wanadoo.nl/b.de.zoete/_private/weblog.html |
| webontwerp | http://home.wanadoo.nl/b.de.zoete/html/webontwerp.html |
|zweefvliegen | http://home.wanadoo.nl/b.de.zoete/html/vliegen.html |
`-------------------------------------------------- --<--@ ------------'
 
dingbat

1) Does ASP's Response.Redirect cause trouble for web crawlers, as JavaScript can?

No, this is the right way. A crawler that can't follow this is having
real problems.
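
For what it's worth, Response.Redirect just sends an ordinary HTTP 302 response, which a crawler follows like any other link. Roughly what goes over the wire (the URL here is made up):

```
HTTP/1.1 302 Object moved
Location: http://www.example.com/index.asp
```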
2) Do you suggest abandoning frames

Yes.

The problem (for your case) is the old problem that frames aren't
bookmarkable. A crawler may crawl your site, but then it finds URLs
that point to the child frames, not to the parent page. It may then hand
these out to users of the search engine (or whatever). The extent of
the problem with frames depends on how your site responds to these URLs
and how easily navigable your site is.

If you code your site carefully, then it can respond to an internal
frame URL by auto-wrapping the target in the right parent frameset and
also navigating the target frame to the correct page. This is quite
easy, so long as you bother to do it! It also means that every page
needs to be scripted, rather than being static HTML (unless someone
knows some Apache cleverness that I don't). This pervasive scripting
may be a problem for some people and not others.

Stick with frames if there's a good reason to use them - i.e. the user
can see them and uses them as an independent scrolling mechanism or
whatever. If you're only using frames as a way of sharing a common menu
bar between pages, then get rid of them.
 
Ale

OK, thanks... the reason I stick with frames is to have a scrollbar while always keeping a header and a footer visible.
I know I can do this with a script... but I want to keep the website simple, because I do not have much time to manage it.

I am trying to solve the indexing problem by using the noframes tag in the entry-page frameset... there I generate the sitemap. Furthermore, I submitted the sitemap to Google, and in one of the initial frames I linked the sitemap too...

At this point I guess that, if Response.Redirect really causes no trouble, as you assured me, a crawler will find the whole website linked.

I do not have the problem of wrong referencing... In fact, I have a session variable that is flagged if and only if the main index page (the one with the frameset) is open. If not, the page redirects to index.asp, posting its own reference so it can be reopened in the right place.

As I mentioned, I wanted to try to avoid too much JavaScript.

Well, thanks for your comments, I feel safer :)
 
Ale

yes... I know... I was mainly asking whether you think the solutions I adopted can avoid the problems I read about... :)
cheers
 
David Dorward

Ale said:
OK, thanks... the reason I stick with frames is to have a scrollbar while always keeping a header and a footer visible.

In the vast majority of cases the loss of vertical space for content greatly
outweighs the benefits of access to the header and footer without
scrolling.
 
dorayme

From: David Dorward said:
In the vast majority of cases the loss of vertical space for content greatly
outweighs the benefits of access to the header and footer without scrolling.

This is correct. So the OP should make sure the header is short, and ditto the footer. If the main reason for frames happened to be a reasonable left nav column, there would be little loss of valuable space for most content...

dorayme
 
Adrienne

OK, thanks... the reason I stick with frames is to have a scrollbar while always keeping a header and a footer visible.
I know I can do this with a script... but I want to keep the website simple, because I do not have much time to manage it.

I am trying to solve the indexing problem by using the noframes tag in the entry-page frameset... there I generate the sitemap. Furthermore, I submitted the sitemap to Google, and in one of the initial frames I linked the sitemap too...

At this point I guess that, if Response.Redirect really causes no trouble, as you assured me, a crawler will find the whole website linked.

I do not have the problem of wrong referencing... In fact, I have a session variable that is flagged if and only if the main index page (the one with the frameset) is open. If not, the page redirects to index.asp, posting its own reference so it can be reopened in the right place.

As I mentioned, I wanted to try to avoid too much JavaScript.

Well, thanks for your comments, I feel safer :)

Don't feel safe yet, you still have problems:

1. Sessions are not available to those who do not have session cookies enabled or available; that means web crawlers, and people who have their security set even to medium when there is no machine-readable privacy policy.

2. Some browsers will not follow redirects if they are set up that way. That's no good if someone comes in from a search engine to one of the orphan pages.

3. I fail to see why you think that using frames addresses your issue with
managing the web site. Use an include.

<% Option Explicit %>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">

<html>
<head>
<!-- the following include file includes the stylesheet and favicon -->
<!--#include file="linkrel.asp"-->
<title></title>
</head>
<body>
<!--#include file="header.asp"-->
<!--#include file="menu.asp"-->
<div id="content">
<h1>Page Title</h1>
<p>Some content.</p>
</div>
<!--#include file="copyright.asp"-->
</body>
</html>

*** header.asp ***
<div id="header">
<img src="logo.png" height="height" width="width" alt="">
<!-- leave alt null because this is for decoration only -->
</div>

*** menu.asp ***
<div id="menu">
<ul>
<li><a href="index.asp">Home</a></li>
....
</ul>
</div>

*** copyright.asp ***

<div id="copyright">
Copyright &copy; <%=datepart("yyyy",date())%>
</div>
<% 'clean up and close any open connections %>
 
Ale

OK... I see that this is the most widespread opinion... therefore I guess it is the correct one :)
At this point I will try to migrate the website from one structure to the other.

However, I still did not get two points:
1) An ASP session variable resides in the server's memory, not in a cookie. Thus the browser, or in this case the bot, does not have to be cookie-enabled; everything happens on the server side. Therefore, when a bot visits my index.asp, the session variable is flagged (on the server side). But here a problem can arise. When a browser (even one without cookies enabled) accesses a website, it activates a session on the server side until the session times out or the browser is closed. When a crawler accesses my index.asp it activates this session, but everything works if and only if the same bot proceeds with the indexing. If the crawler, once it has reached index.asp, starts to crawl the website with multiple bots, my approach does not work.
2) ASP's Response.Redirect is interpreted server-side... thus the bot does not see it, and sees only the page the server sends to it...
Why should it not work with a bot, if this happens server-side?

OK... let me point out that my index.asp contains a frameset, but I also use the noframes tag, which bots do support; there I generate the sitemap. Thanks to this discussion I found a problem (see point 1): I am going to abandon the session variable for bots, in case multiple bots are used. Each link in the sitemap will pass the target page a variable (which bots should support) that stops the reframing of the website... Those links should be indexed, but when a browser follows one without passing through index.asp (where the session variable works for sure), the single page will redirect to the framed version...

I think it could work...

Thanks
 
Ale

thanks for all suggestions...

dorayme said:
This is correct. So the OP should make sure the header is short, and ditto the footer. If the main reason for frames happened to be a reasonable left nav column, there would be little loss of valuable space for most content...

dorayme
 
dingbat

In fact, I have a session variable that is flagged if and only if the main index page (the one with the frameset) is open.

No you don't. You have a session variable that interpolates some information it can see and hopefully tries to interpret it as something about the state of the frames. But this is so unreliable as to be worthless, so forget doing it this way.

If you wish, use client-side JS to burst other people's framing, or to wrap a bare frame in the correct wrapper if it feels exposed. But don't fool yourself into thinking you can do this usefully on the server.
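
A minimal client-side sketch of the wrap-if-bare idea described above. The frameset page name (index.asp) and the "page" query parameter are assumptions about the site's layout, not something stated in the thread; the frameset side would have to read that parameter and load the right child.

```javascript
// Build the URL of the wrapper frameset for a bare child page.
// "index.asp" and the "page" parameter are assumed names, not
// anything confirmed for the site in question.
function wrapperUrl(childPath) {
  return 'index.asp?page=' + encodeURIComponent(childPath);
}

// Run this in every child page: if the page is loaded at the top
// level (i.e. not inside any frameset), reload it wrapped in the
// parent frameset, preserving which child was requested.
function ensureFramed(win) {
  if (win.top === win.self) {
    win.location.replace(wrapperUrl(win.location.pathname));
  }
}

// In a real page you would call: ensureFramed(window);
```

Because the child's path travels in the query string, a URL handed out by a search engine still lands the visitor in the full frameset.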




Sessions are bad. You don't need them, so don't use them.

There are a _few_ times when a site needs a session - "shopping
baskets" is one obvious one. But most sites don't need them, and most
parts of most sites don't need them either - even if some of the site
does. Sessions also cause trouble and won't work for many of your
users, so if you possibly can avoid using them, don't use them.

A message "This site requires cookies / sessions" because its shopping
basket depends on them is a _real_ irritation when you're only trying
to browse the catalogue. Your job as a site developer involves many things, but it does not require you to deliberately annoy the users when you don't need to.
 
Ale

this is the point I do not understand... I will try to read more literature... you seem to be quite sure about this...
I do not understand what the link is between cookies and sessions...

cheers
Ale
 
dingbat

There are two points here - try to keep them separate.

First of all is that "frames" are basically a client-side technology
(that's where their few advantages come from - they make things very
simple for the server). If you want to have code that behaves actively
depending on the state of these frames (e.g. not permitting an
unwrapped child frame) then the client is the best place to discover
this.

(yes, the local pinheads will be along shortly to bleat about how
client-side JavaScript is bad and unreliable, but that's because
they're too stupid to have room for anything else in their sheep-like
skulls)

You can _try_ to work out on the server what state the client's framing
is in, but this is very much guesswork. You just can't do it.



Secondly, there are sessions. These are quite separate!

There are no HTTP sessions. HTTP is (by design) stateless and so
sessionless. Web developers have been cursing (or defending) this
choice for years.

So it's now the job of "middleware" to do sessions. No web developer
should ever deal with implementing sessions directly - get it
ready-made from your platform vendor. The middleware should implement
sessions by some method that's invisible to you, but works (mostly) and
is as little irritation for the user as possible. As an ASP developer
you have a reasonably robust session implementation - there's a switch
to turn it on or off, there's a convenient persistent array for you to
use and there's a reasonably stable and secure implementation to make
it all work. This implementation uses cookies, so whenever you (the developer) ask IIS/ASP to use sessions, cookies will start to fly.

What M$oft don't tell you (and this is typical of them) is that they've
sold you a good core product, surrounded by useless fluff that's
switched on by default. ASP developers should learn (because M$ doesn't
tell them) that sessions need to be switched off explicitly (for most
sites) and that they should only be turned on for the sections of a
site that really need them (issues of what you do with global.asa). As
for most M$oft dev products, you _can_ achieve good things with it, but
they don't make it clear how to do so, or that doing so is mainly a
pruning exercise.
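
For reference, the per-page switch in Classic ASP is the @-processing directive at the top of the page (sessions can also be disabled application-wide in the IIS settings). A minimal example of turning sessions off for one page:

```
<%@ Language="VBScript" EnableSessionState=False %>
```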
 
Ale

Yep... they are different... and yes, I wanted to try to operate on the server side, because I cannot rely on client scripting: I want a solution for bots, not for users... users I can handle.
However, frames or no frames... I got the point... now I will adopt the above-mentioned solution as a stopgap until I have the time to convert the structure.

Sessions <-> cookies... thanks for the suggestion. As I said before, I need to check more literature; I still do not understand this point. I always avoided cookies in my websites, so sometimes I relied on session variables instead. And they always worked with whatever browser... I never had a single complaint, from either Mac or Win users...

Thanks a lot
Ale
 
dingbat

Bots are easy - just serve them the content, serve it under a useful
URL, leave the rest to them. Your supposed "bot problem" isn't - it's
actually the old "you can't bookmark frames" problem. If you solve
(part of) this by making a child-frame self-wrapping, then the child's
URL (what the bot sees) becomes a usable URL to enter your site from
outside (i.e. a search engine) and everything instantly becomes happy.
You don't need to do _anything_ that is bot specific, or worry about
whether bots are running CS JS.

Even better, when you finally dump the frames the old child URLs become
your new URLs and the bots' old content-scraping into the search engine
still works fine with the new site.



You never hear complaints about web sites; that's a characteristic of the medium. What you hear instead is just silence.
 
Adrienne

Yep... they are different... and yes, I wanted to try to operate on the server side, because I cannot rely on client scripting: I want a solution for bots, not for users... users I can handle.
However, frames or no frames... I got the point... now I will adopt the above-mentioned solution as a stopgap until I have the time to convert the structure.

Sessions <-> cookies... thanks for the suggestion. As I said before, I need to check more literature; I still do not understand this point. I always avoided cookies in my websites, so sometimes I relied on session variables instead. And they always worked with whatever browser... I never had a single complaint, from either Mac or Win users...

Thanks a lot
Ale

Sessions ARE cookies, but not persistent ones. A persistent cookie is one that has an expiration date; one that does not lives only for the life of the session, hence it is called a "session cookie".

You set a persistent cookie like:
<% response.cookies("mycookie") = "chocolate"
response.cookies("mycookie").expires = date() + 2 'expires in two days
%>

You set a session cookie like:
<% session("mycookie") = "vanilla" %>

Or:

<% session("mycookie") = request.cookies("mycookie") 'will set session
cookie to chocolate if the user's cookie value is chocolate %>

However, to a user agent, a cookie is a cookie, and if the user agent does not or cannot accept any cookie, then your sessions don't even get to the oven, let alone get baked.

Here's a good test to see if a session cookie is available:

*** page1.asp ***
<p>The session id is: <%=Session.SessionID%><br>
<a href="page2.asp?test=<%=Session.SessionID%>">Go to Page 2</a></p>

*** page2.asp ***
<p>The session id is: <%=Session.SessionID%><br>
The session id from page one is : <%=request.querystring("test")%></p>
 
Andy Dingley

Sessions ARE cookies,

No they're not. They're a server-side presentation of apparent state,
one way of implementing this being by use of cookies.

If you try to use sessions under PHP and cookies are disabled, then the
server (quite reasonably) falls back to dropping a magic ID into the
URLs instead. No cookies at all, but you still get sessions.
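
The fallback described here is PHP's "trans sid" mechanism, controlled from php.ini (both directives exist in PHP 4 and 5):

```
; php.ini: keep sessions working without cookies by rewriting
; the session id into URLs instead
session.use_cookies = 0
session.use_trans_sid = 1
```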
 
Adrienne

Gazing into my crystal ball I observed Andy Dingley
No they're not. They're a server-side presentation of apparent state,
one way of implementing this being by use of cookies.

If you try to use sessions under PHP and cookies are disabled, then the
server (quite reasonably) falls back to dropping a magic ID into the
URLs instead. No cookies at all, but you still get sessions.

PHP may act that way, but Classic ASP does not. The OP is working with
Classic ASP.
 
