ServerXMLHTTP uses 100% CPU for a long time

 
 
Ed McNierney
 
      12-02-2005
I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to retrieve
large binary data from a remote server. When the request is large (more than
a few megabytes), the ServerXMLHTTP page jumps to nearly 100% CPU utilization
for an unusually long time. The remote server needs a few seconds to prepare
the request, during which time the CPU seems OK. It seems that as soon as
the data is ready to retrieve, the CPU usage jumps and remains that way until
the data has all been copied to the requesting server. That takes way too
long - about 35 seconds when requesting a 12 MB file over a gigabit Ethernet.

I use ServerXMLHTTP hundreds of thousands of times daily on this same system
on the same network, with absolutely no problem - but for smaller requests.
There's something about the size of the request that makes it blow up.

I saw some reports of older systems with this problem (Windows 2000), but
I'm running IIS 6 on Windows Server 2003, SP1. Thanks!
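
For reference, the retrieval pattern boils down to something like this minimal
sketch (not the actual page; the back-end URL, content type, and ProgID version
are placeholders, and it assumes the usual responseBody/BinaryWrite proxying):

<%
' Minimal sketch of the proxy pattern (hypothetical back-end URL).
Dim xhr
Set xhr = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")

' Synchronous GET of a large binary resource from the image server.
xhr.open "GET", "http://imageserver.example.com/render?id=12345", False
xhr.send

If xhr.status = 200 Then
    Response.ContentType = "image/png"
    ' responseBody returns the whole payload as a byte array in one call;
    ' this is where the CPU spike shows up once the response gets large.
    Response.BinaryWrite xhr.responseBody
Else
    Response.Status = "502 Bad Gateway"
End If

Set xhr = Nothing
%>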
 
Bob Barrows [MVP]
 
      12-02-2005
Ed McNierney wrote:
> I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
> retrieve large binary data from a remote server. When the request is
> large (more than a few megabytes), the ServerXMLHTTP page jumps to
> nearly 100% CPU utilization for an unusually long time. The remote
> server needs a few seconds to prepare the request, during which time
> the CPU seems OK. It seems that as soon as the data is ready to
> retrieve, the CPU usage jumps and remains that way until the data has
> all been copied to the requesting server. That takes way too long -
> about 35 seconds when requesting a 12 MB file over a gigabit
> Ethernet.
>
> I use ServerXMLHTTP hundreds of thousands of times daily on this same
> system on the same network, with absolutely no problem - but for
> smaller requests. There's something about the size of the request
> that makes it blow up.
>
> I saw some reports of older systems with this problem (Windows 2000),
> but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!


Reminds me of the oldie, but goodie:

Patient: Doctor, it hurts when I raise my arm
Doctor: So stop raising your arm!


Sounds to me as if a different technology is needed for this - perhaps FTP?
Bob Barrows

--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.


 
Ed McNierney
 
      12-02-2005
Bob -

Thanks for the quick reply!

First, I'd like to understand the problem, not ignore it. That won't get it
fixed.

Second, I don't have an option of a different technology. The service that
is producing these files (they're images, produced on the fly based on an
HTTP request) serves them via an HTTP interface, not FTP or any other.

I did a lot of searching and cannot find any other example of this problem
(other than old ones). The "alternative technology" available to me is to
move this portion of the site to a Linux server, where my older PHP code
works flawlessly. The intent was to move the entire site to Windows, but if
Windows can't cut it, I'll need to stick to Linux.

"Bob Barrows [MVP]" wrote:

> Ed McNierney wrote:
> > I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
> > retrieve large binary data from a remote server. When the request is
> > large (more than a few megabytes), the ServerXMLHTTP page jumps to
> > nearly 100% CPU utilization for an unusually long time. The remote
> > server needs a few seconds to prepare the request, during which time
> > the CPU seems OK. It seems that as soon as the data is ready to
> > retrieve, the CPU usage jumps and remains that way until the data has
> > all been copied to the requesting server. That takes way too long -
> > about 35 seconds when requesting a 12 MB file over a gigabit
> > Ethernet.
> >
> > I use ServerXMLHTTP hundreds of thousands of times daily on this same
> > system on the same network, with absolutely no problem - but for
> > smaller requests. There's something about the size of the request
> > that makes it blow up.
> >
> > I saw some reports of older systems with this problem (Windows 2000),
> > but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!

>
> Reminds me of the oldie, but goodie:
>
> Patient: Doctor, it hurts when I raise my arm
> Doctor: So stop raising your arm!
>
>
> Sounds to me as if a different technology is needed for this - perhaps FTP?
> Bob Barrows
>
> --
> Microsoft MVP -- ASP/ASP.NET
> Please reply to the newsgroup. The email account listed in my From
> header is my spam trap, so I don't check it very often. You will get a
> quicker response by posting to the newsgroup.
>
>
>

 
 
Bob Barrows [MVP]
 
      12-02-2005
From
http://support.microsoft.com/default...WCT052802.asp:

... Another limitation, which we touched on earlier, is that WinInet doesn't
handle some of the higher-level content-related services with regard to HTTP
data. Some of those things are handled by URLMON. Particularly, URLMON
implements MIME type detection and implements HTTP compression.
HTTP compression is a unique technology on your server that says, "Please
gzip this data, compress it before it gets sent to the client." The client
sees it, sees the header indicating that it's gzipped content, and
decompresses it before displaying. If you have a large amount of content
you're sending, then the cost of performing this compression and
decompression can be much less than the cost of transmitting the
uncompressed content down from the server to the client. However, this is
implemented at the URLMON level. Because ServerXMLHTTP doesn't use URLMON,
it goes through WinHTTP, it uses a more bare-bones interface, it can't
handle HTTP compression and, again, there is no MIME type detection at all.
Use that at your own risk and your own best judgment.

However, according to this:
http://groups.google.com/group/micro...4482f75218b1b1

There is a known performance issue that was fixed in SP3 for MSXML 3.0.

What version of MSXML are you using?
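
If it helps to pin down which parser the page is actually binding to, a quick
sketch like this could be used (just an illustration; as far as I know the
version-independent ProgID resolves to MSXML 3.0, while the suffixed ProgIDs
pin a specific release):

<%
' Rough check of which ServerXMLHTTP ProgIDs are registered on this box.
On Error Resume Next
Dim o

Set o = Server.CreateObject("Msxml2.ServerXMLHTTP")      ' version-independent (MSXML 3.0)
If Err.Number = 0 Then Response.Write "Msxml2.ServerXMLHTTP (3.0) is available<br>"

Err.Clear
Set o = Server.CreateObject("Msxml2.ServerXMLHTTP.4.0")  ' MSXML 4.0
If Err.Number = 0 Then Response.Write "Msxml2.ServerXMLHTTP.4.0 is available<br>"

Err.Clear
Set o = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")  ' MSXML 6.0
If Err.Number = 0 Then Response.Write "Msxml2.ServerXMLHTTP.6.0 is available<br>"
%>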

Ed McNierney wrote:
> Bob -
>
> Thanks for the quick reply!
>
> First, I'd like to understand the problem, not ignore it. That won't
> get it fixed.
>
> Second, I don't have an option of a different technology. The
> service that is producing these files (they're images, produced on
> the fly based on an HTTP request) serves them via an HTTP interface,
> not FTP or any other.
>
> I did a lot of searching and cannot find any other example of this
> problem (other than old ones). The "alternative technology"
> available to me is to move this portion of the site to a Linux
> server, where my older PHP code works flawlessly. The intent was to
> move the entire site to Windows, but if Windows can't cut it, I'll
> need to stick to Linux.
>
> "Bob Barrows [MVP]" wrote:
>
>> Ed McNierney wrote:
>>> I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
>>> retrieve large binary data from a remote server. When the request
>>> is large (more than a few megabytes), the ServerXMLHTTP page jumps
>>> to nearly 100% CPU utilization for an unusually long time. The
>>> remote server needs a few seconds to prepare the request, during
>>> which time the CPU seems OK. It seems that as soon as the data is
>>> ready to retrieve, the CPU usage jumps and remains that way until
>>> the data has all been copied to the requesting server. That takes
>>> way too long - about 35 seconds when requesting a 12 MB file over a
>>> gigabit Ethernet.
>>>
>>> I use ServerXMLHTTP hundreds of thousands of times daily on this
>>> same system on the same network, with absolutely no problem - but
>>> for smaller requests. There's something about the size of the
>>> request that makes it blow up.
>>>
>>> I saw some reports of older systems with this problem (Windows
>>> 2000), but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!

>>
>> Reminds me of the oldie, but goodie:
>>
>> Patient: Doctor, it hurts when I raise my arm
>> Doctor: So stop raising your arm!
>>
>>
>> Sounds to me as if a different technology is needed for this -
>> perhaps FTP? Bob Barrows
>>
>> --
>> Microsoft MVP -- ASP/ASP.NET
>> Please reply to the newsgroup. The email account listed in my From
>> header is my spam trap, so I don't check it very often. You will get
>> a quicker response by posting to the newsgroup.


--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.


 
 
Ed McNierney
 
      12-02-2005
Bob -

Thanks again for the quick replies. There is no HTTP compression involved,
and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see if this
bug was fixed. There was no apparent difference in behavior between 4.0 and
6.0.

I did read the item you mention about the MSXML 3.0 bug because the symptom
sounds virtually identical. But I have found no mention of a similar bug in
4.0 or 6.0, which I would have expected if there was a regression from 3.0
(e.g. if the SP3 bug fix never made it to 4.0).

- Ed

"Bob Barrows [MVP]" wrote:

> From
> http://support.microsoft.com/default...WCT052802.asp:
>
> ... Another limitation, which we touched on earlier, is that WinInet doesn't
> handle some of the higher-level content-related services with regard to HTTP
> data. Some of those things are handled by URLMON. Particularly, URLMON
> implements MIME type detection and implements HTTP compression.
> HTTP compression is a unique technology on your server that says, "Please
> gzip this data, compress it before it gets sent to the client." The client
> sees it, sees the header indicating that it's gzipped content, and
> decompresses it before displaying. If you have a large amount of content
> you're sending, then the cost of performing this compression and
> decompression can be much less than the cost of transmitting the
> uncompressed content down from the server to the client. However, this is
> implemented at the URLMON level. Because ServerXMLHTTP doesn't use URLMON,
> it goes through WinHTTP, it uses a more bare-bones interface, it can't
> handle HTTP compression and, again, there is no MIME type detection at all.
> Use that at your own risk and your own best judgment.
>
> However, according to this:
> http://groups.google.com/group/micro...4482f75218b1b1
>
> There is a known performance issue that was fixed in SP3 for MSXML 3.0.
>
> What version of MSXML are you using?
>
> Ed McNierney wrote:
> > Bob -
> >
> > Thanks for the quick reply!
> >
> > First, I'd like to understand the problem, not ignore it. That won't
> > get it fixed.
> >
> > Second, I don't have an option of a different technology. The
> > service that is producing these files (they're images, produced on
> > the fly based on an HTTP request) serves them via an HTTP interface,
> > not FTP or any other.
> >
> > I did a lot of searching and cannot find any other example of this
> > problem (other than old ones). The "alternative technology"
> > available to me is to move this portion of the site to a Linux
> > server, where my older PHP code works flawlessly. The intent was to
> > move the entire site to Windows, but if Windows can't cut it, I'll
> > need to stick to Linux.
> >
> > "Bob Barrows [MVP]" wrote:
> >
> >> Ed McNierney wrote:
> >>> I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
> >>> retrieve large binary data from a remote server. When the request
> >>> is large (more than a few megabytes), the ServerXMLHTTP page jumps
> >>> to nearly 100% CPU utilization for an unusually long time. The
> >>> remote server needs a few seconds to prepare the request, during
> >>> which time the CPU seems OK. It seems that as soon as the data is
> >>> ready to retrieve, the CPU usage jumps and remains that way until
> >>> the data has all been copied to the requesting server. That takes
> >>> way too long - about 35 seconds when requesting a 12 MB file over a
> >>> gigabit Ethernet.
> >>>
> >>> I use ServerXMLHTTP hundreds of thousands of times daily on this
> >>> same system on the same network, with absolutely no problem - but
> >>> for smaller requests. There's something about the size of the
> >>> request that makes it blow up.
> >>>
> >>> I saw some reports of older systems with this problem (Windows
> >>> 2000), but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!
> >>
> >> Reminds me of the oldie, but goodie:
> >>
> >> Patient: Doctor, it hurts when I raise my arm
> >> Doctor: So stop raising your arm!
> >>
> >>
> >> Sounds to me as if a different technology is needed for this -
> >> perhaps FTP? Bob Barrows
> >>
> >> --
> >> Microsoft MVP -- ASP/ASP.NET
> >> Please reply to the newsgroup. The email account listed in my From
> >> header is my spam trap, so I don't check it very often. You will get
> >> a quicker response by posting to the newsgroup.

>
> --
> Microsoft MVP -- ASP/ASP.NET
> Please reply to the newsgroup. The email account listed in my From
> header is my spam trap, so I don't check it very often. You will get a
> quicker response by posting to the newsgroup.
>
>
>

 
 
Bob Barrows [MVP]
 
      12-02-2005
I think what he was saying is that with URLMon, HTTP compression is used
automatically, reducing the download time. With WinHTTP (which ServerXMLHTTP
goes through), it can't be used.

Otherwise, I am out of my depth there. You may want to try the
.inetserver.iis group (or even one of the xml groups) if nobody else steps
up to the plate here.

If you do get a solution, I would appreciate hearing about it.

Bob

Ed McNierney wrote:
> Bob -
>
> Thanks again for the quick replies. There is no HTTP compression
> involved, and I was running on MSXML 4.0 and then upgraded to MSXML
> 6.0 to see if this bug was fixed. There was no apparent difference
> in behavior between 4.0 and
> 6.0.
>
> I did read the item you mention about the MSXML 3.0 bug because the
> symptom sounds virtually identical. But I have found no mention of a
> similar bug in
> 4.0 or 6.0, which I would have expected if there was regression from
> 3.0 (e.g. if the SP3 bug fix never made it to 4.0).
>
> - Ed
>
> "Bob Barrows [MVP]" wrote:
>
>> From
>> http://support.microsoft.com/default...WCT052802.asp:
>>
>> ... Another limitation, which we touched on earlier, is that
>> WinInet doesn't handle some of the higher-level content-related
>> services with regard to HTTP data. Some of those things are handled
>> by URLMON. Particularly, URLMON implements MIME type detection and
>> implements HTTP compression.
>> HTTP compression is a unique technology on your server that says,
>> "Please gzip this data, compress it before it gets sent to the
>> client." The client sees it, sees the header indicating that it's
>> gzipped content, and decompresses it before displaying. If you have
>> a large amount of content you're sending, then the cost of
>> performing this compression and decompression can be much less than
>> the cost of transmitting the uncompressed content down from the
>> server to the client. However, this is implemented at the URLMON
>> level. Because ServerXMLHTTP doesn't use URLMON, it goes through
>> WinHTTP, it uses a more bare-bones interface, it can't handle HTTP
>> compression and, again, there is no MIME type detection at all. Use
>> that at your own risk and your own best judgment.
>>
>> However, according to this:
>> http://groups.google.com/group/micro...4482f75218b1b1
>>
>> There is a known performance issue that was fixed in SP3 for MSXML
>> 3.0
>>
>> What version of MSXML are you using?
>>
>> Ed McNierney wrote:
>>> Bob -
>>>
>>> Thanks for the quick reply!
>>>
>>> First, I'd like to understand the problem, not ignore it. That
>>> won't get it fixed.
>>>
>>> Second, I don't have an option of a different technology. The
>>> service that is producing these files (they're images, produced on
>>> the fly based on an HTTP request) serves them via an HTTP interface,
>>> not FTP or any other.
>>>
>>> I did a lot of searching and cannot find any other example of this
>>> problem (other than old ones). The "alternative technology"
>>> available to me is to move this portion of the site to a Linux
>>> server, where my older PHP code works flawlessly. The intent was to
>>> move the entire site to Windows, but if Windows can't cut it, I'll
>>> need to stick to Linux.
>>>
>>> "Bob Barrows [MVP]" wrote:
>>>
>>>> Ed McNierney wrote:
>>>>> I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
>>>>> retrieve large binary data from a remote server. When the request
>>>>> is large (more than a few megabytes), the ServerXMLHTTP page jumps
>>>>> to nearly 100% CPU utilization for an unusually long time. The
>>>>> remote server needs a few seconds to prepare the request, during
>>>>> which time the CPU seems OK. It seems that as soon as the data is
>>>>> ready to retrieve, the CPU usage jumps and remains that way until
>>>>> the data has all been copied to the requesting server. That takes
>>>>> way too long - about 35 seconds when requesting a 12 MB file over
>>>>> a gigabit Ethernet.
>>>>>
>>>>> I use ServerXMLHTTP hundreds of thousands of times daily on this
>>>>> same system on the same network, with absolutely no problem - but
>>>>> for smaller requests. There's something about the size of the
>>>>> request that makes it blow up.
>>>>>
>>>>> I saw some reports of older systems with this problem (Windows
>>>>> 2000), but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!
>>>>
>>>> Reminds me of the oldie, but goodie:
>>>>
>>>> Patient: Doctor, it hurts when I raise my arm
>>>> Doctor: So stop raising your arm!
>>>>
>>>>
>>>> Sounds to me as if a different technology is needed for this -
>>>> perhaps FTP? Bob Barrows
>>>>
>>>> --
>>>> Microsoft MVP -- ASP/ASP.NET
>>>> Please reply to the newsgroup. The email account listed in my From
>>>> header is my spam trap, so I don't check it very often. You will
>>>> get a quicker response by posting to the newsgroup.

>>
>> --
>> Microsoft MVP -- ASP/ASP.NET
>> Please reply to the newsgroup. The email account listed in my From
>> header is my spam trap, so I don't check it very often. You will get
>> a quicker response by posting to the newsgroup.


--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.


 
 
Egbert Nierop (MVP for IIS)
 
      12-03-2005

"Ed McNierney" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...
> Bob -
>
> Thanks again for the quick replies. There is no HTTP compression
> involved,
> and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see if
> this
> bug was fixed. There was no apparent difference in behavior between 4.0
> and
> 6.0.
>
> I did read the item you mention about the MSXML 3.0 bug because the
> symptom
> sounds virtually identical. But I have found no mention of a similar bug
> in
> 4.0 or 6.0, which I would have expected if there was regression from 3.0
> (e.g. if the SP3 bug fix never made it to 4.0).


Hi Ed,

You can use some alternatives: ADODB.Record and ADODB.Stream can do HTTP
uploads and log on to remote pages.

Additionally, you should always send large data chunked, in a loop, in
blocks of, say, 4096 KB. Within the loop, you can test for connectivity
issues.
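
Something along these lines is what I mean; a rough sketch only (the back-end
URL and the block size are placeholders, and note that responseBody still
materializes the whole payload once, so this mainly changes how the bytes go
back out to the client):

<%
' Sketch: buffer the response in an ADODB.Stream, then push it to the
' client in fixed-size blocks, checking the connection inside the loop.
Response.Buffer = True
Const CHUNK_SIZE = 262144   ' 256 KB per block, illustrative only

Dim xhr, stm
Set xhr = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")
xhr.open "GET", "http://imageserver.example.com/render?id=12345", False
xhr.send

Set stm = Server.CreateObject("ADODB.Stream")
stm.Type = 1                ' adTypeBinary
stm.Open
stm.Write xhr.responseBody  ' still builds the full byte array once
stm.Position = 0

Response.ContentType = "image/png"
Do While Not stm.EOS
    Response.BinaryWrite stm.Read(CHUNK_SIZE)
    Response.Flush
    If Not Response.IsClientConnected Then Exit Do
Loop

stm.Close
Set stm = Nothing
Set xhr = Nothing
%>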

 
 
Bob Barrows [MVP]
 
      12-03-2005
Egbert Nierop (MVP for IIS) wrote:
> "Ed McNierney" <(E-Mail Removed)> wrote in
> message news:(E-Mail Removed)...
>> Bob -
>>
>> Thanks again for the quick replies. There is no HTTP compression
>> involved,
>> and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see
>> if this
>> bug was fixed. There was no apparent difference in behavior between
>> 4.0 and
>> 6.0.
>>
>> I did read the item you mention about the MSXML 3.0 bug because the
>> symptom
>> sounds virtually identical. But I have found no mention of a
>> similar bug in
>> 4.0 or 6.0, which I would have expected if there was regression from
>> 3.0 (e.g. if the SP3 bug fix never made it to 4.0).

>
> Hi Ed,
>
> You can use some alternatives: ADODB.Record and ADODB.Stream can do HTTP
> uploads and log on to remote pages.
>
> Additionally, you should always send large data chunked, in a loop, in
> blocks of, say, 4096 KB. Within the loop, you can test for connectivity
> issues.


Both good suggestions. I wish I had thought of making them.

Bob
--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"


 