ServerXMLHTTP uses 100% CPU for a long time

Discussion in 'ASP General' started by Ed McNierney, Dec 2, 2005.

  1. Ed McNierney

    Ed McNierney Guest

    I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to retrieve
    large binary data from a remote server. When the request is large (more than
    a few megabytes), the ServerXMLHTTP page jumps to nearly 100% CPU utilization
    for an unusually long time. The remote server needs a few seconds to prepare
    the request, during which time the CPU seems OK. It seems that as soon as
    the data is ready to retrieve, the CPU usage jumps and remains that way until
    the data has all been copied to the requesting server. That takes way too
    long - about 35 seconds when requesting a 12 MB file over a gigabit Ethernet.

    I use ServerXMLHTTP hundreds of thousands of times daily on this same system
    on the same network, with absolutely no problem - but for smaller requests.
    There's something about the size of the request that makes it blow up.

    I saw some reports of older systems with this problem (Windows 2000), but
    I'm running IIS 6 on Windows Server 2003, SP1. Thanks!
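    For reference, the retrieval pattern in question looks roughly like this.
    This is a minimal classic-ASP sketch, not Ed's actual code; the URL,
    content type, and timeout values are placeholders:

```vbscript
<%
' Minimal sketch: fetch binary data server-side with ServerXMLHTTP
' and relay it to the browser. URL and timeouts are illustrative.
Dim http
Set http = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")

' resolve / connect / send / receive timeouts, in milliseconds
http.setTimeouts 5000, 5000, 15000, 120000

http.open "GET", "http://imageserver.example.com/render?id=12345", False
http.send

If http.status = 200 Then
    Response.ContentType = "image/jpeg"
    ' responseBody is a byte array; BinaryWrite sends it unmodified.
    Response.BinaryWrite http.responseBody
End If

Set http = Nothing
%>
```

    The CPU spike Ed describes happens during the receive phase of a
    synchronous send like the one above.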
     
    Ed McNierney, Dec 2, 2005
    #1

  2. Ed McNierney wrote:
    > <snip quote of message #1>


    Reminds me of the oldie, but goodie:

    Patient: Doctor, it hurts when I raise my arm
    Doctor: So stop raising your arm!
    ;-)

    Sounds to me as if a different technology is needed for this - perhaps FTP?
    Bob Barrows

    --
    Microsoft MVP -- ASP/ASP.NET
    Please reply to the newsgroup. The email account listed in my From
    header is my spam trap, so I don't check it very often. You will get a
    quicker response by posting to the newsgroup.
     
    Bob Barrows [MVP], Dec 2, 2005
    #2

  3. Ed McNierney

    Ed McNierney Guest

    Bob -

    Thanks for the quick reply!

    First, I'd like to understand the problem, not ignore it. That won't get it
    fixed.

    Second, I don't have an option of a different technology. The service that
    is producing these files (they're images, produced on the fly based on an
    HTTP request) serves them via an HTTP interface, not FTP or any other.

    I did a lot of searching and cannot find any other example of this problem
    (other than old ones). The "alternative technology" available to me is to
    move this portion of the site to a Linux server, where my older PHP code
    works flawlessly. The intent was to move the entire site to Windows, but if
    Windows can't cut it, I'll need to stick to Linux.

    "Bob Barrows [MVP]" wrote:

    > <snip quote of messages #1 and #2>
     
    Ed McNierney, Dec 2, 2005
    #3
  4. From
    http://support.microsoft.com/default.aspx?scid=/servicedesks/webcasts/wc052802/WCT052802.asp:

    .... Another limitation, which we touched on earlier, is that WinInet doesn't
    handle some of the higher-level content-related services with regard to HTTP
    data. Some of those things are handled by URLMON. Particularly, URLMON
    implements MIME type detection and implements HTTP compression.
    HTTP compression is a unique technology on your server that says, "Please
    gzip this data, compress it before it gets sent to the client." The client
    sees it, sees the header indicating that it's gzipped content, and
    decompresses it before displaying. If you have a large amount of content
    you're sending, then the cost of performing this compression and
    decompression can be much less than the cost of transmitting the
    uncompressed content down from the server to the client. However, this is
    implemented at the URLMON level. Because ServerXMLHTTP doesn't use URLMON,
    it goes through WinHTTP, it uses a more bare-bones interface, it can't
    handle HTTP compression and, again, there is no MIME type detection at all.
    Use that at your own risk and your own best judgment.

    However, according to this:
    http://groups.google.com/group/micr...MLHTTP 100% CPU&rnum=4&hl=en#6c4482f75218b1b1

    There is a known performance issue that was fixed in SP3 for MSXML 3.0.

    What version of MSXML are you using?
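    Since the installed MSXML version matters here, one quick way to see
    which versions are registered is to probe the version-dependent ProgIDs.
    A hedged sketch (it only reports which ProgIDs will instantiate):

```vbscript
<%
' Probe which version-dependent ServerXMLHTTP ProgIDs are registered.
Dim progIds, i, obj
progIds = Array("Msxml2.ServerXMLHTTP.6.0", _
                "Msxml2.ServerXMLHTTP.5.0", _
                "Msxml2.ServerXMLHTTP.4.0", _
                "Msxml2.ServerXMLHTTP.3.0")

On Error Resume Next
For i = 0 To UBound(progIds)
    Err.Clear
    Set obj = Server.CreateObject(progIds(i))
    If Err.Number = 0 Then
        Response.Write progIds(i) & ": available<br>"
        Set obj = Nothing
    Else
        Response.Write progIds(i) & ": not registered<br>"
    End If
Next
On Error GoTo 0
%>
```

    Note that the version-independent ProgID "Msxml2.ServerXMLHTTP" maps to
    MSXML 3.0, so code that uses it may not be running the version you think.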

    Ed McNierney wrote:
    > <snip quote of earlier messages>


    --
    Microsoft MVP -- ASP/ASP.NET
    Please reply to the newsgroup. The email account listed in my From
    header is my spam trap, so I don't check it very often. You will get a
    quicker response by posting to the newsgroup.
     
    Bob Barrows [MVP], Dec 2, 2005
    #4
  5. Ed McNierney

    Ed McNierney Guest

    Bob -

    Thanks again for the quick replies. There is no HTTP compression involved,
    and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see if this
    bug was fixed. There was no apparent difference in behavior between 4.0 and
    6.0.

    I did read the item you mention about the MSXML 3.0 bug because the symptom
    sounds virtually identical. But I have found no mention of a similar bug in
    4.0 or 6.0, which I would have expected if there was regression from 3.0
    (e.g. if the SP3 bug fix never made it to 4.0).

    - Ed

    "Bob Barrows [MVP]" wrote:

    > <snip quote of earlier messages>
     
    Ed McNierney, Dec 2, 2005
    #5
  6. I think what he was saying is that with URLMon, http compression is
    automatically used, reducing the download time. With WinInet, it can't be
    used.

    Otherwise, I am out of my depth here. You may want to try the
    .inetserver.iis group (or even one of the xml groups) if nobody else steps
    up to the plate here.

    If you do get a solution, I would appreciate hearing about it.

    Bob

    Ed McNierney wrote:
    > <snip quote of earlier messages>


    --
    Microsoft MVP -- ASP/ASP.NET
    Please reply to the newsgroup. The email account listed in my From
    header is my spam trap, so I don't check it very often. You will get a
    quicker response by posting to the newsgroup.
     
    Bob Barrows [MVP], Dec 2, 2005
    #6
  7. "Ed McNierney" <> wrote in message
    news:...
    > <snip quote of message #5>


    Hi Ed,

    You can use some alternatives: ADODB.Record and ADODB.Stream can do
    HTTP uploads and log on to remote pages.

    Additionally, you should always send large data chunked, in a loop, in
    blocks of, say, 4096 KB. Within the loop, you can test for connectivity
    issues.
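    A sketch of the chunked pattern Egbert describes, using ADODB.Stream to
    buffer the response and Response.BinaryWrite in a loop. The chunk size
    and URL are illustrative, and Response.IsClientConnected is one way to
    "test for connectivity" mid-loop:

```vbscript
<%
' Stream a large remote response to the client in chunks instead of
' one huge BinaryWrite. URL and chunk size are illustrative.
Const CHUNK_SIZE = 65536   ' 64 KB per write
Dim http, stm
Set http = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")
http.open "GET", "http://imageserver.example.com/render?id=12345", False
http.send

Set stm = Server.CreateObject("ADODB.Stream")
stm.Type = 1               ' adTypeBinary
stm.Open
stm.Write http.responseBody
stm.Position = 0

Response.Buffer = False
Response.ContentType = "image/jpeg"
Do While Not stm.EOS
    Response.BinaryWrite stm.Read(CHUNK_SIZE)
    ' stop sending if the client has disconnected
    If Not Response.IsClientConnected Then Exit Do
Loop

stm.Close
Set stm = Nothing
Set http = Nothing
%>
```

    This doesn't avoid the cost of ServerXMLHTTP receiving the full response,
    but it keeps the worker from holding one large buffered write and lets
    the loop abort early on a dead connection.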
     
    Egbert Nierop \(MVP for IIS\), Dec 3, 2005
    #7
  8. Egbert Nierop (MVP for IIS) wrote:
    > <snip quote of message #7>


    Both good suggestions. I wish I had thought of making them.

    Bob
    --
    Microsoft MVP - ASP/ASP.NET
    Please reply to the newsgroup. This email account is my spam trap so I
    don't check it very often. If you must reply off-line, then remove the
    "NO SPAM"
     
    Bob Barrows [MVP], Dec 3, 2005
    #8