data access

Hugh Welford

hi

Running an ASP site on Windows/IIS/MS Access with a database reaching 45 MB.
Responses seem a little slow. Could anyone provide a checklist of things to
look at to optimise data access on this platform, or point me in the right
direction where I can read up on the subject?

Thanks

Hugh
 
surf_doggie

IMHO you should start looking at going to MySQL. From experience I can
tell you that once your db reaches about 60 MB it will start failing
intermittently, depending on how much traffic you have.

Earl
www.jhdesigninc.com
 
Bob Barrows [MVP]

surf_doggie said:
IMHO you should start looking at going to MySQL. From experience I can
tell you that once your db reaches about 60 MB it will start failing
intermittently, depending on how much traffic you have.
I disagree. With proper indexing, the size of the database should not
matter. It's the number of concurrent users which should dictate the
decision to move to a more powerful RDBMS.
 
Bob Barrows [MVP]

Hugh said:
hi

Running an ASP site on Windows/IIS/MS Access with a database reaching 45
MB. Responses seem a little slow. Could anyone provide a checklist
of things to look at to optimise data access on this platform, or
point me in the right direction where I can read up on the subject?

1. Proper indexing. Use the database utilities included with Access to
optimize your database. See an Access newsgroup for help.
2. Minimize the time that your pages are actually connected to the database.
This means:
a. Use recordsets only for read-only display, never data maintenance
- use UPDATE/INSERT/DELETE statements (aka DML) for maintaining data
b. Use GetRows or disconnected recordsets so the connection can be
closed as soon as the data is retrieved
i. A corollary (or perhaps: further amplification) to this is:
retrieve the data, close the connection, then process the data
c. Use saved parameter queries rather than dynamic SQL
3. Make the proper sacrifices to the database gods - virgins only <joke>
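Putting 2a-2c together, a minimal sketch of the pattern (the query name, parameter, and database path here are made-up placeholders, not Hugh's actual setup; the adXXX constants assume adovbs.inc is included, otherwise use the literal values 4, 3 and 1):

```vbscript
' Hypothetical example: a saved Access parameter query named
' qryItemsByCategory, in a db at data/site.mdb - adjust for your site.
Dim cn, cmd, rows, i
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
        "Data Source=" & Server.MapPath("data/site.mdb")

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = cn
cmd.CommandText = "qryItemsByCategory"   ' 2c: saved query, not dynamic SQL
cmd.CommandType = adCmdStoredProc
cmd.Parameters.Append cmd.CreateParameter("CatID", adInteger, adParamInput, , 5)

' 2b: pull everything into an array, then get off the connection right away
rows = cmd.Execute.GetRows()
cn.Close
Set cn = Nothing

' 2b-i: process the data only after the connection is closed
For i = 0 To UBound(rows, 2)
    Response.Write Server.HTMLEncode(rows(1, i)) & "<br>"
Next
```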

Consider moving to a more powerful RDBMS if none of the above has the
desired effect.

Bob Barrows
 
surf_doggie

Hey Bob, good advice. I think I should point out that oftentimes rogue
bots will make multiple connections in one session. There is some
company called cyberville that will send out a bot and index my entire
site (which has a db connection on almost every page) in about 3
minutes - 120 webpages in 3 mins.

Obviously they have a ban on their IPs, but I would recommend Hugh
check his server logs and see if he's getting traffic he's not counting
on.

Earl,
www.jhdesigninc.com
 
Mike Brind

surf_doggie said:
Hey Bob, good advice. I think I should point out that oftentimes rogue
bots will make multiple connections in one session. There is some
company called cyberville that will send out a bot and index my entire
site (which has a db connection on almost every page) in about 3
minutes - 120 webpages in 3 mins.

That's less than 1 connection per second. A properly written
application (get in - get out) using Jet, consisting mainly of database
reads, should be able to serve up 10-20 times that volume of traffic, in
my experience.
 
surf_doggie

That's less than 1 connection per second. A properly written
application (get in - get out) using Jet, consisting mainly of database
reads, should be able to serve up 10-20 times that volume of traffic, in
my experience.

I'm confused, then: how can the logs show 3 to 5 pages being grabbed at
the same time, down to the second, and it not be a concurrent connection?
I've put a cross-section of a log below; am I missing something?


5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Special-Campaign-Package.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\disclaimer.asp Page Viewed
63.148.99.237 Remote Address

5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Election-Day-Polls.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\Poly-Sleeve-YardSigns.asp
Page Viewed
63.148.99.237 Remote Address

5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Election-Day-Polls.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\Political-Posters.asp Page
Viewed
63.148.99.237 Remote Address

It appears to me that this bot has 3 concurrent connections - am I
incorrect in that assumption? The original question refers to refining
db connections. This site only has a 1 MB Access db. I had a client
with a 60 MB site, and once a week or so it would go down; once I
changed them to MySQL, no worries. Later, while checking the logs, I
realized there was a lot of traffic hitting the Access db I didn't know
about. As soon as I went to MySQL it was smooth sailing.

Earl
www.jhdesigninc.com
 
Mike Brind

Bob said:
1. Proper indexing. Use the database utilities included with Access to
optimize your database. See an Access newsgroup for help.
2. Minimize the time that your pages are actually connected to the database.
This means:
a. Use recordsets only for read-only display, never data maintenance
- use UPDATE/INSERT/DELETE statements (aka DML) for maintaining data
b. Use GetRows or disconnected recordsets so the connection can be
closed as soon as the data is retrieved
i. A corollary (or perhaps: further amplification) to this is:
retrieve the data, close the connection, then process the data
c. Use saved parameter queries rather than dynamic SQL
3. Make the proper sacrifices to the database gods - virgins only <joke>

Consider moving to a more powerful RDBMS if none of the above has the
desired effect.

In addition to what Bob says, make sure you are using the Jet 4.0 OLE DB
provider, and not an ODBC driver.

Look for opportunities to cache data rather than forcing database reads
for infrequently changing data. Typical examples of this include dropdown
lists of countries derived from the db, or a list of headlines that
changes once every few days but is requested hundreds or thousands of
times a day, etc.
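For reference, the difference between the two is just the connection string; a sketch (the file path is a placeholder):

```vbscript
Dim cn
Set cn = Server.CreateObject("ADODB.Connection")

' ODBC driver (avoid - goes through the older ODBC layer):
' cn.Open "Driver={Microsoft Access Driver (*.mdb)};DBQ=C:\data\site.mdb"

' Jet 4.0 OLE DB provider (preferred):
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\site.mdb"
```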
 
surf_doggie

Look for opportunities to cache data rather than forcing database reads
for infrequently changing data.

Do you have any links that give good examples of the best way to handle
data caching? That had not even crossed my mind. I have price tables,
but they don't change more than every couple of months.

Thanks Mike!

Earl
www.jhdesigninc.com
 
Mike Brind

surf_doggie said:
I'm confused, then: how can the logs show 3 to 5 pages being grabbed at
the same time, down to the second, and it not be a concurrent connection?
I've put a cross-section of a log below; am I missing something?


5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Special-Campaign-Package.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\disclaimer.asp Page Viewed
63.148.99.237 Remote Address

5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Election-Day-Polls.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\Poly-Sleeve-YardSigns.asp
Page Viewed
63.148.99.237 Remote Address

5/25/2006-----------------3:46:42 PM
http://www.jhdesigninc.com/Campaign-Items/Election-Day-Polls.asp
Refering Web Address
Mozilla/4.0 (compatible; MSIE 6.0; Windows XP) Browser Type
C:\websites\websites\jhdesign\Campaign-Items\Political-Posters.asp Page
Viewed
63.148.99.237 Remote Address

It appears to me that this bot has 3 concurrent connections - am I
incorrect in that assumption? The original question refers to refining
db connections. This site only has a 1 MB Access db. I had a client
with a 60 MB site, and once a week or so it would go down; once I
changed them to MySQL, no worries. Later, while checking the logs, I
realized there was a lot of traffic hitting the Access db I didn't know
about. As soon as I went to MySQL it was smooth sailing.

Earl
www.jhdesigninc.com

That might be 3 concurrent connections - it might not. It depends on
what db interaction each page involves.

A typical SELECT of 14,000 rows from a db on my machine (XP Pro, IIS
5.1, 52Mb Ram - nothing special) might take 0.05 secs to execute from
instantiation of connection object to closing it again. So you can
have 20 separate connections per second serially using that example.
MS claim that Jet supports 225 concurrent users (connections), so
theoretically, you could perform my test operation 4,500 times a
second. But they also recommend that you do not use Jet where you may
exceed 10 concurrent users, so that's 200 times per second.

There is no doubt whatsoever that SQL Server is a far superior
solution to Jet, but it is highly unlikely that the SIZE of your
client's db was the problem. Inefficient use of connections, use of
ODBC, version of driver, and weight of traffic are all more likely
contributors.

I had a site that "fell over" regularly using Access 97 and a system
DSN on 20,000 page impressions a month (5 db calls per page; db about
10MB). I changed it to Access 2000 using the Jet 4 provider, and it now
serves 5 times as much traffic on a db approaching 200MB, and hasn't
had a problem in 3 years.

I've been trying to find out just how scalable Jet is in a web
environment, but there isn't very much information available. The
difficulty, I have found, is that there are so many variables involved
in scalability - but the size of the db isn't one of them. I bookmarked
an article I found a few years ago that suggested Jet should be
comfortable serving up 50,000 reads an hour, but I lost it along with
the PC I was using when the "magic smoke" escaped from it.
 
Mike Brind

surf_doggie said:
Do you have any links that give good examples of the best way to handle
data caching. That had not even crossed my mind. I have price tables
but they dont change but every couple of months.

Thanks Mike!

Earl
www.jhdesigninc.com

Here's one that looks at caching data in an Application variable:
http://www.mikebrind.com/article.asp?article=18. But you can just as
easily use the Scripting.FileSystemObject to create dynamic include
files and save your data to disk that way.
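As a rough illustration of the Application-variable approach (the names and the one-hour refresh interval are made up; note that VBScript arrays can be stored in Application state, but objects such as recordsets cannot, which is why GetRows is used first):

```vbscript
' Serve the price table from Application state, refreshing at most
' once per hour instead of hitting the db on every request.
Dim prices
If IsEmpty(Application("PriceRows")) Or _
   DateDiff("h", Application("PricesLoadedAt"), Now()) >= 1 Then
    Dim cn, rs
    Set cn = Server.CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=" & Server.MapPath("data/site.mdb")
    Set rs = cn.Execute("SELECT ItemID, Price FROM Prices")
    Application.Lock
    Application("PriceRows") = rs.GetRows()   ' arrays are fine in Application
    Application("PricesLoadedAt") = Now()
    Application.Unlock
    rs.Close: cn.Close
End If
prices = Application("PriceRows")   ' use this copy for the rest of the page
```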
 
Mike Brind

Mike said:
That might be 3 concurrent connections - it might not. It depends on
what db interaction each page involves.

A typical SELECT of 14,000 rows from a db on my machine (XP Pro, IIS
5.1, 52Mb Ram - nothing special)...

That's 512Mb Ram.

...MS claim that Jet supports 225 concurrent users (connections)...

That's 255 concurrent users.

...theoretically, you could perform my test operation 4,500 times a...

That's 5,100.


That's time to clean my keyboard....

:-^)
 
