MSXML2.XMLHTTP


Roland Hall

I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out, and displays them in scrolling divs.
The first file has a little over 27k records; the second has fewer. It
retrieves the data pretty quickly, but it takes a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
dim objXML
set objXML = CreateObject("MSXML2.ServerXMLHTTP")
objXML.Open "GET", strURL, False
objXML.Send
strQuote = objXML.ResponseText
set objXML = nothing
end function

I split the data into an array and then split that into a new array because
the delimiters are line feed and comma, respectively.
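For reference, the two-level split described above can be sketched like this, with JavaScript standing in for the VBScript and a stand-in CSV string (the field names here are hypothetical, not the actual file layout):

```javascript
// Stand-in for the downloaded CSV text: records are separated by
// line feeds, fields within a record by commas.
var csv = "Domain,Status,Date\nexample.com,pendingDelete,2004-12-10";

// First split: one entry per record (VBScript: Split(str, vbLf))
var records = csv.split("\n");

// Second split: one entry per field within a record (Split(rec, ","))
var fields = records[1].split(",");
// fields is ["example.com", "pendingDelete", "2004-12-10"]
```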

TIA...

--
Roland Hall
/* This information is distributed in the hope that it will be useful, but
without any warranty; without even the implied warranty of merchantability
or fitness for a particular purpose. */
Technet Script Center - http://www.microsoft.com/technet/scriptcenter/
WSH 5.6 Documentation - http://msdn.microsoft.com/downloads/list/webdev.asp
MSDN Library - http://msdn.microsoft.com/library/default.asp
 

Bob Barrows [MVP]

Roland said:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records, the
second has fewer. It retrieves the data pretty quickly, but it takes
a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
dim objXML
set objXML = CreateObject("MSXML2.ServerXMLHTTP")
objXML.Open "GET", strURL, False
objXML.Send
strQuote = objXML.ResponseText
set objXML = nothing
end function

I split the data into an array and then split that into a new array
because the delimiters are line feed and comma, respectively.

TIA...

It's pretty tough to comment on this. You've identified the bottleneck as
the process of writing the data to the page, so the strQuote function is not
relevant, is it? What you do with the array contents seems to be more
relevant, at least to me.

Somebody (I think it might have been Chris Hohmann) posted an analysis of
different techniques for generating large blocks of html a few weeks ago
that you may find interesting.

Bob Barrows
 

Roland Hall

"Bob Barrows [MVP]" wrote in message:
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data
: > files] from a remote web site, parses them out and displays them in
: > scrolling divs. The first file has a little over 27k records, the
: > second has fewer. It retrieves the data pretty quickly, but it takes
: > a while to write the page.
: >
: > Is there a better alternative to this approach?
: > This is my page:
: > http://kiddanger.com/lab/getsaveurl.asp
: >
: > This is the relevant code to retrieve the data:
: >
: > function strQuote(strURL)
: > dim objXML
: > set objXML = CreateObject("MSXML2.ServerXMLHTTP")
: > objXML.Open "GET", strURL, False
: > objXML.Send
: > strQuote = objXML.ResponseText
: > set objXML = nothing
: > end function
: >
: > I split the data into an array and then split that into a new array
: > because the delimiters are line feed and comma, respectively.
: >
: > TIA...
: >
:
: It's pretty tough to comment on this. You've identified the bottleneck as
: the process of writing the data to the page, so the strQuote function is not
: relevant, is it? What you do with the array contents seems to be more
: relevant, at least to me.

Hi Bob. Thanks for responding.

Perhaps. I'm assuming the data has been retrieved quickly, judging by the
activity light on my switch. I have not actually put timers in, which I guess
would be the next test.

:
: Somebody (I think it might have been Chris Hohmann) posted an analysis of
: different techniques for generating large blocks of html a few weeks ago
: that you may find interesting.

I searched in this NG for all of Chris's postings and didn't find anything.
Then I searched for the reference you made and didn't find anything that way
either. Here is my subroutine for parsing the data; perhaps someone will
notice something that will help speed it up.

sub strWrite(str)
dim arr, i, arr2, j
arr = split(str,vbLf)
prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
prt("<table style=""padding: 3px"">")
for i = 1 to ubound(arr)
arr2 = split(arr(i),",")
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if
for j = 0 to ubound(arr2)
prt("<td>" & arr2(j))
next
next
prt("</table>")
prt("</div>")
prt("</fieldset>")
end sub

These are the calls for the two files:

dim strURL
strURL = "http://neustar.us/reports/rgp/domains_in_rgp.csv"
strWrite strQuote(strURL)
strURL = "http://neustar.us/reports/rgp/domains_out_rgp.csv"
strWrite strQuote(strUrl)

I made some changes to my buffer and some variables and it's noticeably
faster. It still takes about 4-5 seconds to parse the data, but I'm not sure
if that's all that bad for this amount.

I'm testing with two links, one on the Internet and one on my Intranet. The
Internet link normally displays both files almost simultaneously. The
Intranet link displays the first file, then takes almost as long again before
the next, which is what I expected.

http://kiddanger.com/lab/getsaveurl.asp Internet
http://netfraud.us/asp/rgpr.asp Intranet

I wonder whether it would be faster to build everything into one string and
then make only a single write statement. Any ideas?
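For what it's worth, the single-write idea can be sketched like this in JavaScript terms (classic ASP also supports JScript; the sample lines are a stand-in for the downloaded CSV, not the actual data):

```javascript
// Build every row into an array, then emit the whole table with a
// single write instead of thousands of small ones.
var lines = ["Domain,Status", "example.com,pendingDelete", "example.net,redemptionPeriod"];
var rows = [];
for (var i = 1; i < lines.length; i++) {
    // One substitution per line turns the commas into cell boundaries
    rows.push("<tr><td>" + lines[i].split(",").join("</td><td>") + "</td></tr>");
}
// The joined string goes out in one Response.Write-style call
var html = "<table>" + rows.join("\r\n") + "</table>";
```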

--
Roland Hall
 

Roland Hall

I added the record count to the legend, and now I know why the second one is
a lot faster: 1/10 the number of records.
 

Dave Anderson

Roland said:
...Here is my subroutine for parsing the data and perhaps someone
will notice something that will help speed it up...

for i = 1 to ubound(arr)
arr2 = split(arr(i),",")
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if
for j = 0 to ubound(arr2)
prt("<td>" & arr2(j))
next
next

Have you tried using Replace() instead of split?

for i = 1 to ubound(arr)
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if

prt(Replace(arr(i),",","<td>"))
next



--
Dave Anderson

Unsolicited commercial email will be read at a cost of $500 per message. Use
of this email address implies consent to these terms. Please do not contact
me directly or ask me to contact you directly for assistance. If your
question is worth asking, it's worth posting.
 

Bob Barrows [MVP]

Roland said:
I searched in this NG for all of Chris' posting and didn't find
anything. Then I searched for the reference you made and didn't find
anything that way either.

Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method. Combined with Dave's idea, you would
get something like this:

sub strWrite(str)
dim arr, i, arr2, j
dim arHTML(), sRow
arr = split(str,vbLf)
prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
prt("<table style=""padding: 3px"">")
redim arHTML(ubound(arr))
for i = 1 to ubound(arr)
if i = 1 then
sRow= "<tr style=""font-weight: bold"">"
else
if i mod 2 = 0 then
sRow="<tr style=""background-color: #ddd"">"
else
sRow="<tr>"
end if
end if
sRow = sRow & vbCrLf & vbTab & Replace(arr(i), ",", "<td>")
arHTML(i) = sRow
next
prt(Join(arHTML,vbCrLf))
prt("</table>")
prt("</div>")
prt("</fieldset>")
end sub

Bob Barrows
 

Roland Hall

"Dave Anderson" wrote in message:
: Roland Hall wrote:
: > ...Here is my subroutine for parsing the data and perhaps someone
: > will notice something that will help speed it up...
: >
: > for i = 1 to ubound(arr)
: > arr2 = split(arr(i),",")
: > if i = 1 then
: > prt("<tr style=""font-weight: bold"">")
: > else
: > if i mod 2 = 0 then
: > prt("<tr style=""background-color: #ddd"">")
: > else
: > prt("<tr>")
: > end if
: > end if
: > for j = 0 to ubound(arr2)
: > prt("<td>" & arr2(j))
: > next
: > next
:
: Have you tried using Replace() instead of split?
:
: for i = 1 to ubound(arr)
: if i = 1 then
: prt("<tr style=""font-weight: bold"">")
: else
: if i mod 2 = 0 then
: prt("<tr style=""background-color: #ddd"">")
: else
: prt("<tr>")
: end if
: end if
:
: prt(Replace(arr(i),",","<td>"))
: next
:
:
:

Thanks, Dave. I'll put a timer on it to see the difference; it's hard to tell
just by looking. I know it's hard to write this stuff off the top of your
head, especially without seeing the raw data, but I needed to make one mod to
your suggestion. There is no leading comma, so another <td> had to be
inserted.

prt("<td>" & replace(arr(i),",","<td>"))

Thanks for your insight. I like that a lot better than the array loop.

Roland
 

Michael D. Kersey

Roland said:
I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out and displays them in scrolling divs.
The first file has a little over 27k records, the second has less. It
retrieves the data pretty quick but it takes awhile to write the page.

Is there a better alternative to this approach?

How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server when
updated or by a periodically-executed script or Windows service. The
files could then be accessed locally (more quickly).
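A periodic transfer job needs some notion of cache freshness. Assuming the roughly once-a-day update cycle described above, the check could be sketched like this (JavaScript standing in for a JScript/ASP job; all names are hypothetical):

```javascript
// A cached copy older than one day should be re-fetched from the
// remote site; anything newer can be served locally.
var DAY_MS = 24 * 60 * 60 * 1000;

function isStale(lastFetchedMs, nowMs) {
    return (nowMs - lastFetchedMs) > DAY_MS;
}

var now = Date.now();
// Fetched 25 hours ago -> stale; fetched 1 hour ago -> still fresh
var needsRefresh = isStale(now - 25 * 60 * 60 * 1000, now);
var stillFresh = !isStale(now - 1 * 60 * 60 * 1000, now);
```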

BTW 27,000 rows seems like an excessive amount of data for a user to
digest at once. Could the program present a search page (by field, by
alphabetic order, etc.) or summary page (listing categories) first? Then
the user could limit the search somewhat.

I would be tempted to periodically transfer the CSV file(s) to a local
directory and import the data into a database. Then an ASP page would
handle the search and presentation.
 

Dave Anderson

Roland said:
There is no leading , (comma) so another <td> had to be inserted.

prt("<td>" & replace(arr(i),",","<td>"))

I would go even further and reach for HTML completeness:

prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")



--
Dave Anderson

 

Dave Anderson

Michael said:
BTW 27,000 rows seems like an excessive amount of data for
a user to digest at once...

That raises another point I forgot to address in my other post. If the
client is Internet Explorer, the table will be displayed all at once,
rather than line by line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set the
script up to spit out every changed record and/or every 100th record, or
something similar. I typically break the table every 10 or 20 rows by
inserting one of these: "</table><table>".
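The every-N-rows break described above can be sketched as follows (a hypothetical helper, JavaScript standing in for JScript/ASP output):

```javascript
// Close and reopen the table every n rows so a browser that waits
// for the closing </table> tag can paint partial results.
function breakTable(cells, n) {
    var out = ["<table>"];
    for (var i = 0; i < cells.length; i++) {
        if (i > 0 && i % n === 0) {
            out.push("</table><table>");
        }
        out.push("<tr><td>" + cells[i] + "</td></tr>");
    }
    out.push("</table>");
    return out.join("");
}

var html = breakTable(["a", "b", "c"], 2);
// Two complete tables: rows a,b in the first, row c in the second
```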

It has been my observation that IE displays nothing at all until the table
is closed, while Mozilla/Firefox/Opera will display each row as it arrives
(buffering must be off to see this in effect).



--
Dave Anderson

 

Evertjan.

Dave Anderson wrote on 13 dec 2004 in
microsoft.public.inetserver.asp.general:
That raises another point I forgot to address in my other post. If the
client machine is Internet Explorer, the table will be displayed all
at once, rather than line-by-line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set
the script up to spit out every changed record and/or every 100th
record, or something similar. I typically break the table every 10 or
20 rows by inserting one of these: "</table><table>".

It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row as
it arrives (buffering must be off to see this in effect).

my observation is otherwise
 

Chris Hohmann

Bob Barrows said:
Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method.

It sounds familiar but I couldn't find it either. Maybe the underpants
gnomes stole it. :) The closest thing I could come up with is this:

IsArray doesn't work with array var populated with xxx.GetRows()
http://groups-beta.google.com/group..._frm/thread/d14477b8cb5b682e/788211a93f83f823

Here are some older threads:

return single value in asp/sql
http://groups-beta.google.com/group..._frm/thread/b09df71bacf40ff2/2161356799605006

logical problem
http://groups-beta.google.com/group..._frm/thread/8804807f08dc4c88/c89c1498e99d805e

Response.Write speed problem
http://groups-beta.google.com/group...components/browse_frm/thread/e8879828821abe40
 

Roland Hall

"Bob Barrows [MVP]" wrote in message:
: Roland Hall wrote:
: >> Somebody (I think it might have been Chris Hohmann) posted an
: >> analysis of different techniques for generating large blocks of html
: >> a few weeks ago that you may find interesting.
: >
: > I searched in this NG for all of Chris' posting and didn't find
: > anything. Then I searched for the reference you made and didn't find
: > anything that way either.
:
: Darn. I just tried to find it as well, and failed. ISTR that the consensus
: was that adding the individual strings to an array and then using Join to
: combine them was the fastest method. Combined with Dave's idea, you would
: get something like this:
:
: sub strWrite(str)
: dim arr, i, arr2, j
: dim arHTML(), sRow
: arr = split(str,vbLf)
: prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
: prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
: prt("<table style=""padding: 3px"">")
: redim arHTML(ubound(arr))
: for i = 1 to ubound(arr)
: if i = 1 then
: sRow= "<tr style=""font-weight: bold"">"
: else
: if i mod 2 = 0 then
: sRow="<tr style=""background-color: #ddd"">"
: else
: sRow="<tr>"
: end if
: end if
: sRow = sRow & vbCrLf & vbTab & Replace(arr(i), ",", "<td>")
: arHTML(i) =sRow
: next
: prt(Join(arHTML,vbCrLf))
: prt("</table>")
: prt("</div>")
: prt("</fieldset>")
: end sub

Thanks for your help Bob. I only had to make a few adjustments.
 

Roland Hall

"Dave Anderson" wrote in message:
: Roland Hall wrote:
: > There is no leading , (comma) so another <td> had to be inserted.
: >
: > prt("<td>" & replace(arr(i),",","<td>"))
:
: I would go even further and reach for HTML completeness:
:
: prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

HTML completeness? I thought ending tags were no longer required? However,
wouldn't it then be:
prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

Roland
 

Roland Hall

"Michael D. Kersey" wrote in message:
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data files]
: > from a remote web site, parses them out and displays them in scrolling divs.
: > The first file has a little over 27k records, the second has fewer. It
: > retrieves the data pretty quickly, but it takes a while to write the page.
: >
: > Is there a better alternative to this approach?
:
: How often are the CSV files updated at their remote site? If it's not
: too frequently, then the files could be transferred to your server when
: updated or by a periodically-executed script or Windows service. The
: files could then be accessed locally (more quickly).

I think they are updated once a day.

: BTW 27,000 rows seems like an excessive amount of data for a user to
: digest at once. Could the program present a search page (by field, by
: alphabetic order, etc.) or summary page (listing categories) first? Then
: the user could limit the search somewhat.
:
: I would be tempted to periodically transfer the CSV file(s) to a local
: directory and import the data into a database. Then an ASP page would
: handle the search and presentation.

These files are lists of domains being deleted and their status in the
deletion process. Sure, if you knew a domain, a simple record lookup would be
great, but I believe this is a list that is mostly unknown to those seeking
it, which is why it is only available as a CSV file.

--
Roland Hall
 

Roland Hall

: Michael D. Kersey wrote:
: > BTW 27,000 rows seems like an excessive amount of data for
: > a user to digest at once...
:
: That raises another point I forgot to address in my other post. If the
: client machine is Internet Explorer, the table will be displayed all at
: once, rather than line-by-line, no matter what buffering you use.

That's what happens.

: I have jobs that I occasionally run with ASP scripts, and I often set the
: script up to spit out every changed record and/or every 100th record, or
: something similar. I typically break the table every 10 or 20 rows by
: inserting one of these: "</table><table>".
:
: It has been my observation that IE displays nothing at all until the table
: is closed, while Mozilla/Firefox/Opera will display each row as it arrives
: (buffering must be off to see this in effect).

There are only 3 columns in the first file and 2 in the second. Roughly 21k
rows in the first and 2k in the second. This also varies because it is based
upon the date each domain was registered. The following day could have twice
as many or half as many, but I doubt they'll vary greatly.

Currently it appears splitting it up will just slow down the process, since
retrieving the file is where most of the latency occurs. I'll probably end
up writing an app to grab the file daily, which will decrease the bandwidth
usage by almost 50%.

--
Roland Hall
 

Bob Barrows [MVP]

Roland said:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records, the
second has fewer. It retrieves the data pretty quickly, but it takes
a while to write the page.

Is there a better alternative to this approach?

Michael said:
How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server
when updated or by a periodically-executed script or Windows
service. The files could then be accessed locally (more quickly).

Roland said:
I think they are updated once a day.

You might want to consider caching them, refreshing the cache each day.
Generate the HTML strings once each day and put them in SSI files.

I would consider making them filterable, either by importing them into a
database, or converting them into xml.

Bob Barrows
 

Roland Hall

"Bob Barrows [MVP]" wrote in message:
: You might want to consider caching them, refreshing the cache each day.
: Generate the html strings once each day and put them in SSI files.

I'm not familiar.

: I would consider making them filterable, either by importing them into a
: database, or converting them into xml.

I plan on putting them in SQL.

--
Roland Hall
 

Bob Barrows [MVP]

Roland said:
I'm not familiar.

SSI = server-side includes. In other words, each day, generate the HTML and
write it into a file which you include in your display page using
<!--#include etc.

Roland said:
I plan on putting them in SQL.

Good.

Bob Barrows
 

Dave Anderson

Evertjan. said:
my observation is otherwise

From your detailed response I infer you observed a powered-off CRT. That is
most certainly "otherwise".



--
Dave Anderson

 
