MSXML2.XMLHTTP

Discussion in 'ASP General' started by Roland Hall, Dec 13, 2004.

  1. Roland Hall

    Roland Hall Guest

    I wrote a small script that grabs two CSV files [links to the data files]
    from a remote web site, parses them out and displays them in scrolling divs.
    The first file has a little over 27k records, the second has less. It
    retrieves the data pretty quickly, but it takes a while to write the page.

    Is there a better alternative to this approach?
    This is my page:
    http://kiddanger.com/lab/getsaveurl.asp

    This is the relevant code to retrieve the data:

    function strQuote(strURL)
    dim objXML
    set objXML = CreateObject("MSXML2.ServerXMLHTTP")
    objXML.Open "GET", strURL, False
    objXML.Send
    strQuote = objXML.ResponseText
    set objXML = nothing
    end function

    I split the data into an array and then split that into a new array because
    the delimiters are line feed and comma, respectively.
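The two-level split can be sketched like this (JavaScript for illustration; the thread's code is VBScript, and the function name is mine):

```javascript
// Split a CSV payload into rows on line feeds, then each row into
// fields on commas -- the same two-level Split described above.
// Naive parsing: assumes no quoted fields that themselves contain commas.
function parseCsv(text) {
  return text
    .split("\n")                                          // first delimiter: line feed
    .filter(function (line) { return line.length > 0; })  // drop trailing empty line
    .map(function (line) { return line.split(","); });    // second delimiter: comma
}
```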

    TIA...

    --
    Roland Hall
    /* This information is distributed in the hope that it will be useful, but
    without any warranty; without even the implied warranty of merchantability
    or fitness for a particular purpose. */
    Technet Script Center - http://www.microsoft.com/technet/scriptcenter/
    WSH 5.6 Documentation - http://msdn.microsoft.com/downloads/list/webdev.asp
    MSDN Library - http://msdn.microsoft.com/library/default.asp
     
    Roland Hall, Dec 13, 2004
    #1

  2. Roland Hall wrote:
    > I wrote a small script that grabs two CSV files [links to the data
    > files] from a remote web site, parses them out and displays them in
    > scrolling divs. The first file has a little over 27k records, the
    > second has less. It retrieves the data pretty quick but it takes
    > awhile to write the page.
    >
    > Is there a better alternative to this approach?
    > This is my page:
    > http://kiddanger.com/lab/getsaveurl.asp
    >
    > This is the relevant code to retrieve the data:
    >
    > function strQuote(strURL)
    > dim objXML
    > set objXML = CreateObject("MSXML2.ServerXMLHTTP")
    > objXML.Open "GET", strURL, False
    > objXML.Send
    > strQuote = objXML.ResponseText
    > set objXML = nothing
    > end function
    >
    > I split the data into an array and then split that into a new array
    > because the delimeters are line feed and comma, respectively.
    >
    > TIA...
    >


    It's pretty tough to comment on this. You've identified the bottleneck as
    the process of writing the data to the page, so the strQuote function is not
    relevant, is it? What you do with the array contents seems to be more
    relevant, at least to me.

    Somebody (I think it might have been Chris Hohmann) posted an analysis of
    different techniques for generating large blocks of html a few weeks ago
    that you may find interesting.

    Bob Barrows
    --
    Microsoft MVP -- ASP/ASP.NET
    Please reply to the newsgroup. The email account listed in my From
    header is my spam trap, so I don't check it very often. You will get a
    quicker response by posting to the newsgroup.
     
    Bob Barrows [MVP], Dec 13, 2004
    #2

  3. Roland Hall

    Roland Hall Guest

    "Bob Barrows [MVP]" wrote in message
    news:...
    : Roland Hall wrote:
    : > I wrote a small script that grabs two CSV files [links to the data
    : > files] from a remote web site, parses them out and displays them in
    : > scrolling divs. The first file has a little over 27k records, the
    : > second has less. It retrieves the data pretty quick but it takes
    : > awhile to write the page.
    : >
    : > Is there a better alternative to this approach?
    : > This is my page:
    : > http://kiddanger.com/lab/getsaveurl.asp
    : >
    : > This is the relevant code to retrieve the data:
    : >
    : > function strQuote(strURL)
    : > dim objXML
    : > set objXML = CreateObject("MSXML2.ServerXMLHTTP")
    : > objXML.Open "GET", strURL, False
    : > objXML.Send
    : > strQuote = objXML.ResponseText
    : > set objXML = nothing
    : > end function
    : >
    : > I split the data into an array and then split that into a new array
    : > because the delimeters are line feed and comma, respectively.
    : >
    : > TIA...
    : >
    :
    : It's pretty tough to comment on this. You've identified the bottleneck as
    : the process of writing the data to the page, so the strQuote function is
    : not relevant, is it? What you do with the array contents seems to be more
    : relevant, at least to me.

    Hi Bob. Thanks for responding.

    Perhaps. I'm assuming the data is retrieved quickly based on the activity
    light on my switch. I have not actually put timers in, which I guess would
    be the next test.

    :
    : Somebody (I think it might have been Chris Hohmann) posted an analysis of
    : different techniques for generating large blocks of html a few weeks ago
    : that you may find interesting.

    I searched this NG for all of Chris's postings and didn't find anything.
    Then I searched for the reference you made and didn't find anything that way
    either. Here is my subroutine for parsing the data; perhaps someone will
    notice something that will help speed it up.

    sub strWrite(str)
      dim arr, i, arr2, j
      arr = split(str,vbLf)
      prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
      prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
      prt("<table style=""padding: 3px"">")
      for i = 1 to ubound(arr)
        arr2 = split(arr(i),",")
        if i = 1 then
          prt("<tr style=""font-weight: bold"">")
        else
          if i mod 2 = 0 then
            prt("<tr style=""background-color: #ddd"">")
          else
            prt("<tr>")
          end if
        end if
        for j = 0 to ubound(arr2)
          prt("<td>" & arr2(j))
        next
      next
      prt("</table>")
      prt("</div>")
      prt("</fieldset>")
    end sub

    These are the calls for the two files:

    dim strURL
    strURL = "http://neustar.us/reports/rgp/domains_in_rgp.csv"
    strWrite strQuote(strURL)
    strURL = "http://neustar.us/reports/rgp/domains_out_rgp.csv"
    strWrite strQuote(strUrl)

    I made some changes to my buffer and some variables and it's noticeably
    faster. It still takes about 4-5 seconds to parse the data, but I'm not sure
    if that's all that bad for that amount.

    I'm testing with two links, one on the Internet and one on my Intranet. The
    Internet link normally displays both files almost simultaneously. The
    Intranet page displays the first file, then, after almost as long a delay,
    the second, which is what I expected.

    http://kiddanger.com/lab/getsaveurl.asp Internet
    http://netfraud.us/asp/rgpr.asp Intranet

    I wonder whether it would be faster if I wrote everything to a string and
    then made only one write statement. Any ideas?
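The single-write idea can be sketched as follows: accumulate each row into an array and join once at the end, so only one write is issued (JavaScript for illustration; `prt` in the thread wraps the ASP write call, and `buildRows` is my name, not from the thread):

```javascript
// Build all the table markup in memory, then emit it with one write.
// Pushing fragments into an array and joining once avoids the overhead
// of thousands of small write calls or repeated string concatenation.
function buildRows(lines) {
  var parts = [];
  for (var i = 0; i < lines.length; i++) {
    var shade = (i % 2 === 0) ? ' style="background-color: #ddd"' : "";
    parts.push("<tr" + shade + "><td>" + lines[i].split(",").join("<td>"));
  }
  return parts.join("\n"); // one big string; the caller writes it once
}
```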

    --
    Roland Hall
     
    Roland Hall, Dec 13, 2004
    #3
  4. Roland Hall

    Roland Hall Guest

    I added the record count to the legend and now I know why the second one is
    a lot faster: it has 1/10 the number of records.
     
    Roland Hall, Dec 13, 2004
    #4
  5. Roland Hall wrote:
    > ...Here is my subroutine for parsing the data and perhaps someone
    > will notice something that will help speed it up...
    >
    > for i = 1 to ubound(arr)
    > arr2 = split(arr(i),",")
    > if i = 1 then
    > prt("<tr style=""font-weight: bold"">")
    > else
    > if i mod 2 = 0 then
    > prt("<tr style=""background-color: #ddd"">")
    > else
    > prt("<tr>")
    > end if
    > end if
    > for j = 0 to ubound(arr2)
    > prt("<td>" & arr2(j))
    > next
    > next


    Have you tried using Replace() instead of split?

    for i = 1 to ubound(arr)
      if i = 1 then
        prt("<tr style=""font-weight: bold"">")
      else
        if i mod 2 = 0 then
          prt("<tr style=""background-color: #ddd"">")
        else
          prt("<tr>")
        end if
      end if

      prt(Replace(arr(i),",","<td>"))
    next



    --
    Dave Anderson

    Unsolicited commercial email will be read at a cost of $500 per message. Use
    of this email address implies consent to these terms. Please do not contact
    me directly or ask me to contact you directly for assistance. If your
    question is worth asking, it's worth posting.
     
    Dave Anderson, Dec 13, 2004
    #5
  6. Roland Hall wrote:
    >> Somebody (I think it might have been Chris Hohmann) posted an
    >> analysis of different techniques for generating large blocks of html
    >> a few weeks ago that you may find interesting.

    >
    > I searched in this NG for all of Chris' posting and didn't find
    > anything. Then I searched for the reference you made and didn't find
    > anything that way either.


    Darn. I just tried to find it as well, and failed. ISTR that the consensus
    was that adding the individual strings to an array and then using Join to
    combine them was the fastest method. Combined with Dave's idea, you would
    get something like this:

    sub strWrite(str)
      dim arr, i
      dim arHTML(), sRow
      arr = split(str,vbLf)
      prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
      prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
      prt("<table style=""padding: 3px"">")
      redim arHTML(ubound(arr))
      for i = 1 to ubound(arr)
        if i = 1 then
          sRow = "<tr style=""font-weight: bold"">"
        else
          if i mod 2 = 0 then
            sRow = "<tr style=""background-color: #ddd"">"
          else
            sRow = "<tr>"
          end if
        end if
        sRow = sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
        arHTML(i) = sRow
      next
      prt(Join(arHTML,vbCrLf))
      prt("</table>")
      prt("</div>")
      prt("</fieldset>")
    end sub

    Bob Barrows

    --
    Microsoft MVP -- ASP/ASP.NET
     
    Bob Barrows [MVP], Dec 13, 2004
    #6
  7. Roland Hall

    Roland Hall Guest

    "Dave Anderson" wrote in message
    news:%...
    : Roland Hall wrote:
    : > ...Here is my subroutine for parsing the data and perhaps someone
    : > will notice something that will help speed it up...
    : >
    : > for i = 1 to ubound(arr)
    : > arr2 = split(arr(i),",")
    : > if i = 1 then
    : > prt("<tr style=""font-weight: bold"">")
    : > else
    : > if i mod 2 = 0 then
    : > prt("<tr style=""background-color: #ddd"">")
    : > else
    : > prt("<tr>")
    : > end if
    : > end if
    : > for j = 0 to ubound(arr2)
    : > prt("<td>" & arr2(j))
    : > next
    : > next
    :
    : Have you tried using Replace() instead of split?
    :
    : for i = 1 to ubound(arr)
    : if i = 1 then
    : prt("<tr style=""font-weight: bold"">")
    : else
    : if i mod 2 = 0 then
    : prt("<tr style=""background-color: #ddd"">")
    : else
    : prt("<tr>")
    : end if
    : end if
    :
    : prt(Replace(arr(i),",","<td>"))
    : next
    :
    :
    :

    Thanks, Dave. I'll put a timer on it to see the difference; it's hard to
    tell just by looking. I know it's hard to write this stuff off the top of
    your head, especially without seeing the raw data, but I needed to make one
    mod to your suggestion. There is no leading , (comma), so another <td> had
    to be inserted.

    prt("<td>" & replace(arr(i),",","<td>"))

    Thanks for your insight. I like that a lot better than the array loop.

    Roland
     
    Roland Hall, Dec 13, 2004
    #7
  8. Roland Hall wrote:
    > I wrote a small script that grabs two CSV files [links to the data files]
    > from a remote web site, parses them out and displays them in scrolling divs.
    > The first file has a little over 27k records, the second has less. It
    > retrieves the data pretty quick but it takes awhile to write the page.
    >
    > Is there a better alternative to this approach?


    How often are the CSV files updated at their remote site? If it's not
    too frequently, then the files could be transferred to your server when
    updated or by a periodically-executed script or Windows service. The
    files could then be accessed locally (more quickly).

    BTW 27,000 rows seems like an excessive amount of data for a user to
    digest at once. Could the program present a search page (by field, by
    alphabetic order, etc.) or summary page (listing categories) first? Then
    the user could limit the search somewhat.

    I would be tempted to periodically transfer the CSV file(s) to a local
    directory and import the data into a database. Then an ASP page would
    handle the search and presentation.
     
    Michael D. Kersey, Dec 13, 2004
    #8
  9. Roland Hall wrote:
    > There is no leading , (comma) so another <td> had to be inserted.
    >
    > prt("<td>" & replace(arr(i),",","<td>"))


    I would go even further and reach for HTML completeness:

    prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")
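Dave's row-building trick, with the closing tags, can be sketched in JavaScript (for illustration only; the thread's code is VBScript, and the function name is mine):

```javascript
// Equivalent of "<td>" & Replace(row, ",", "</td><td>") & "</td>":
// every comma becomes a cell boundary, and the row is fully closed.
function rowToCells(row) {
  return "<td>" + row.split(",").join("</td><td>") + "</td>";
}
// rowToCells("a,b,c") -> "<td>a</td><td>b</td><td>c</td>"
```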



    --
    Dave Anderson

     
    Dave Anderson, Dec 13, 2004
    #9
  10. Michael D. Kersey wrote:
    > BTW 27,000 rows seems like an excessive amount of data for
    > a user to digest at once...


    That raises another point I forgot to address in my other post. If the
    client browser is Internet Explorer, the table will be displayed all at
    once, rather than line-by-line, no matter what buffering you use.

    I have jobs that I occasionally run with ASP scripts, and I often set the
    script up to spit out every changed record and/or every 100th record, or
    something similar. I typically break the table every 10 or 20 rows by
    inserting one of these: "</table><table>".

    It has been my observation that IE displays nothing at all until the table
    is closed, while Mozilla/Firefox/Opera will display each row as it arrives
    (buffering must be off to see this in effect).
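The table-breaking technique can be sketched like this (JavaScript for illustration; `chunkedTable` is my name, not from the thread):

```javascript
// Close and reopen the table every `chunk` rows so a browser that
// waits for </table> before rendering (as IE did) can paint the page
// incrementally instead of all at once.
function chunkedTable(rows, chunk) {
  var out = ["<table>"];
  for (var i = 0; i < rows.length; i++) {
    if (i > 0 && i % chunk === 0) out.push("</table><table>");
    out.push("<tr><td>" + rows[i] + "</td></tr>");
  }
  out.push("</table>");
  return out.join("\n");
}
```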



    --
    Dave Anderson

     
    Dave Anderson, Dec 13, 2004
    #10
  11. Roland Hall

    Evertjan. Guest

    Dave Anderson wrote on 13 dec 2004 in
    microsoft.public.inetserver.asp.general:

    > Michael D. Kersey wrote:
    >> BTW 27,000 rows seems like an excessive amount of data for
    >> a user to digest at once...

    >
    > That raises another point I forgot to address in my other post. If the
    > client machine is Internet Explorer, the table will be displayed all
    > at once, rather than line-by-line, no matter what buffering you use.
    >
    > I have jobs that I occasionally run with ASP scripts, and I often set
    > the script up to spit out every changed record and/or every 100th
    > record, or something similar. I typically break the table every 10 or
    > 20 rows by inserting one of these: "</table><table>".
    >
    > It has been my observation that IE displays nothing at all until the
    > table is closed, while Mozilla/Firefox/Opera will display each row as
    > it arrives (buffering must be off to see this in effect).


    my observation is otherwise

    --
    Evertjan.
    The Netherlands.
    (Please change the x'es to dots in my emailaddress)
     
    Evertjan., Dec 13, 2004
    #11
  12. "Bob Barrows [MVP]" <> wrote in message
    news:%23Qld%...
    > Roland Hall wrote:
    > >> Somebody (I think it might have been Chris Hohmann) posted an
    > >> analysis of different techniques for generating large blocks of html
    > >> a few weeks ago that you may find interesting.

    > >
    > > I searched in this NG for all of Chris' posting and didn't find
    > > anything. Then I searched for the reference you made and didn't find
    > > anything that way either.

    >
    > Darn. I just tried to find it as well, and failed. ISTR that the consensus
    > was that adding the individual strings to an array and then using Join to
    > combine them was the fastest method.


    It sounds familiar but I couldn't find it either. Maybe the underpants
    gnomes stole it. :) The closest thing I could come up with is this:

    IsArray doesn't work with array var populated with xxx.GetRows()
    http://groups-beta.google.com/group..._frm/thread/d14477b8cb5b682e/788211a93f83f823

    Here are some older threads:

    return single value in asp/sql
    http://groups-beta.google.com/group..._frm/thread/b09df71bacf40ff2/2161356799605006

    logical problem
    http://groups-beta.google.com/group..._frm/thread/8804807f08dc4c88/c89c1498e99d805e

    Response.Write speed problem
    http://groups-beta.google.com/group...components/browse_frm/thread/e8879828821abe40
     
    Chris Hohmann, Dec 13, 2004
    #12
  13. Roland Hall

    Roland Hall Guest

    "Bob Barrows [MVP]" wrote in message
    news:%23Qld%...
    : Roland Hall wrote:
    : >> Somebody (I think it might have been Chris Hohmann) posted an
    : >> analysis of different techniques for generating large blocks of html
    : >> a few weeks ago that you may find interesting.
    : >
    : > I searched in this NG for all of Chris' posting and didn't find
    : > anything. Then I searched for the reference you made and didn't find
    : > anything that way either.
    :
    : Darn. I just tried to find it as well, and failed. ISTR that the consensus
    : was that adding the individual strings to an array and then using Join to
    : combine them was the fastest method. Combined with Dave's idea, you would
    : get something like this:
    :
    : sub strWrite(str)
    : dim arr, i, arr2, j
    : dim arHTML(), sRow
    : arr = split(str,vbLf)
    : prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
    : prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
    : prt("<table style=""padding: 3px"">")
    : redim arHTML(ubound(arr))
    : for i = 1 to ubound(arr)
    : if i = 1 then
    : sRow= "<tr style=""font-weight: bold"">"
    : else
    : if i mod 2 = 0 then
    : sRow="<tr style=""background-color: #ddd"">"
    : else
    : sRow="<tr>"
    : end if
    : end if
    : sRow=sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
    : arHTML(i) =sRow
    : next
    : prt(Join(arHTML,vbCrLf))
    : prt("</table>")
    : prt("</div>")
    : prt("</fieldset>")
    : end sub

    Thanks for your help Bob. I only had to make a few adjustments.
     
    Roland Hall, Dec 14, 2004
    #13
  14. Roland Hall

    Roland Hall Guest

    "Dave Anderson" wrote in message
    news:%...
    : Roland Hall wrote:
    : > There is no leading , (comma) so another <td> had to be inserted.
    : >
    : > prt("<td>" & replace(arr(i),",","<td>"))
    :
    : I would go even further and reach for HTML completeness:
    :
    : prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

    HTML completeness? I thought ending tags were no longer required? However,
    wouldn't it then be:
    prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

    Roland
     
    Roland Hall, Dec 14, 2004
    #14
  15. Roland Hall

    Roland Hall Guest

    "Michael D. Kersey" wrote in message
    news:O%...
    : Roland Hall wrote:
    : > I wrote a small script that grabs two CSV files [links to the data
    : > files] from a remote web site, parses them out and displays them in
    : > scrolling divs. The first file has a little over 27k records, the
    : > second has less. It retrieves the data pretty quick but it takes
    : > awhile to write the page.
    : >
    : > Is there a better alternative to this approach?
    :
    : How often are the CSV files updated at their remote site? If it's not
    : too frequently, then the files could be transferred to your server when
    : updated or by a periodically-executed script or Windows service. The
    : files could then be accessed locally (more quickly).

    I think they are updated once a day.

    : BTW 27,000 rows seems like an excessive amount of data for a user to
    : digest at once. Could the program present a search page (by field, by
    : alphabetic order, etc.) or summary page (listing categories) first? Then
    : the user could limit the search somewhat.
    :
    : I would be tempted to periodically transfer the CSV file(s) to a local
    : directory and import the data into a database. Then an ASP page would
    : handle the search and presentation.

    These files are lists of domains being deleted and their status in the
    deletion process. Sure, if you knew a domain, a simple record lookup would
    be great, but I believe this list is mostly unknown to those seeking it,
    which is why it is only available as a CSV file.

    --
    Roland Hall
     
    Roland Hall, Dec 14, 2004
    #15
  16. Roland Hall

    Roland Hall Guest

    "Dave Anderson" <> wrote in message
    news:...
    : Michael D. Kersey wrote:
    : > BTW 27,000 rows seems like an excessive amount of data for
    : > a user to digest at once...
    :
    : That raises another point I forgot to address in my other post. If the
    : client machine is Internet Explorer, the table will be displayed all at
    : once, rather than line-by-line, no matter what buffering you use.

    That's what happens.

    : I have jobs that I occasionally run with ASP scripts, and I often set the
    : script up to spit out every changed record and/or every 100th record, or
    : something similar. I typically break the table every 10 or 20 rows by
    : inserting one of these: "</table><table>".
    :
    : It has been my observation that IE displays nothing at all until the table
    : is closed, while Mozilla/Firefox/Opera will display each row as it arrives
    : (buffering must be off to see this in effect).

    There are only 3 columns in the first file and 2 in the second. Roughly 21k
    rows in the first and 2k in the second. The counts also vary because they
    are based upon the date each domain was registered. The following day could
    have twice as many or half as many, but I doubt they'll vary greatly.

    Currently it appears splitting it up will just slow down the process, since
    retrieving the file is where most of the latency occurs. I'll probably end
    up writing an app to grab the file daily, which will decrease the bandwidth
    usage by almost 50%.

    --
    Roland Hall
     
    Roland Hall, Dec 14, 2004
    #16
  17. Roland Hall wrote:
    > "Michael D. Kersey" wrote in message
    > news:O%...
    >> Roland Hall wrote:
    >>> I wrote a small script that grabs two CSV files [links to the data
    >>> files] from a remote web site, parses them out and displays them in
    >>> scrolling divs. The first file has a little over 27k records, the
    >>> second has less. It retrieves the data pretty quick but it takes
    >>> awhile to write the page.
    >>>
    >>> Is there a better alternative to this approach?

    >>
    >> How often are the CSV files updated at their remote site? If it's not
    >> too frequently, then the files could be transferred to your server
    >> when updated or by a periodically-executed script or Windows
    >> service. The files could then be accessed locally (more quickly).

    >
    > I think they are updated once a day.
    >

    You might want to consider caching them, refreshing the cache each day.
    Generate the html strings once each day and put them in SSI files.

    I would consider making them filterable, either by importing them into a
    database, or converting them into xml.

    Bob Barrows

    --
    Microsoft MVP -- ASP/ASP.NET
     
    Bob Barrows [MVP], Dec 14, 2004
    #17
  18. Roland Hall

    Roland Hall Guest

    "Bob Barrows [MVP]" wrote in message
    news:%...
    : You might want to consider caching them, refreshing the cache each day.
    : Generate the html strings once each day and put them in SSI files.

    I'm not familiar.

    : I would consider making them filterable, either by importing them into a
    : database, or converting them into xml.

    I plan on putting them in SQL.

    --
    Roland Hall
     
    Roland Hall, Dec 14, 2004
    #18
  19. Roland Hall wrote:
    > "Bob Barrows [MVP]" wrote in message
    > news:%...
    >> You might want to consider caching them, refreshing the cache each
    >> day. Generate the html strings once each day and put them in SSI
    >> files.

    >
    > I'm not familiar.


    SSI = server-side includes
    In other words, each day, generate the html and write it into a file which
    you include in your display page using <!--#include etc.
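The refresh decision at the heart of this caching scheme can be sketched as follows (JavaScript for illustration; in the real page, the file I/O, reading the include file's timestamp and rewriting its contents, would wrap around this check):

```javascript
// Regenerate the cached include file once the existing copy is more
// than a day old; otherwise serve the cached HTML as-is.
var ONE_DAY_MS = 24 * 60 * 60 * 1000;

function needsRefresh(cacheMtimeMs, nowMs) {
  return (nowMs - cacheMtimeMs) >= ONE_DAY_MS;
}
```

On a refresh, the page would fetch the CSVs, build the table HTML once, and write it to the file that the display page pulls in with the #include directive.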

    >
    >> I would consider making them filterable, either by importing them
    >> into a database, or converting them into xml.

    >
    > I plan on putting them in SQL.
    >

    Good

    Bob Barrows
    --
    Microsoft MVP -- ASP/ASP.NET
     
    Bob Barrows [MVP], Dec 14, 2004
    #19
  20. Evertjan. wrote:
    >> It has been my observation that IE displays nothing at all until the
    >> table is closed, while Mozilla/Firefox/Opera will display each row as
    >> it arrives (buffering must be off to see this in effect).

    >
    > my observation is otherwise


    From your detailed response I infer you observed a powered-off CRT. That is
    most certainly "otherwise".



    --
    Dave Anderson

     
    Dave Anderson, Dec 14, 2004
    #20
