Different behavior locally vs remotely

Raghu Rudra

I have an ASP.NET web application that accepts posted XML to do its work and
returns response XML. The XML is large, on the order of 1 MB. I started
testing this web app with a client utility that spawns multiple threads to
post XML to the site. I noticed an interesting thing: when I kept the client
utility and the web app on the same machine (I know this is not ideal
testing), perf monitor showed that the maximum number of requests executing
at any time was something like 15 when the client spawned 20 threads. (I did
not use localhost in the URL, only the machine name.)
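
Roughly, each client thread does something like the following (a minimal
sketch from memory; the URL, file name, and class names are placeholders,
not my actual code):

using System;
using System.IO;
using System.Net;
using System.Text;
using System.Threading;

class PostClient
{
    // Placeholder URL for the web app under test.
    const string Url = "http://webserver/myapp/handler.aspx";
    const int ThreadCount = 20;

    static void Main()
    {
        // Load the ~1 MB request document once and share it across threads.
        string xml = File.ReadAllText("request.xml");
        for (int i = 0; i < ThreadCount; i++)
        {
            new Thread(() => PostOnce(xml)).Start();
        }
    }

    static void PostOnce(string xml)
    {
        byte[] body = Encoding.UTF8.GetBytes(xml);

        // POST the XML to the web app.
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(Url);
        req.Method = "POST";
        req.ContentType = "text/xml";
        req.ContentLength = body.Length;
        using (Stream s = req.GetRequestStream())
        {
            s.Write(body, 0, body.Length);
        }

        // Read the response XML back.
        using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
        using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
        {
            string responseXml = reader.ReadToEnd();
            Console.WriteLine("Thread {0}: received {1} characters",
                Thread.CurrentThread.ManagedThreadId, responseXml.Length);
        }
    }
}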

However, as soon as I moved the client utility to a different machine (say
A), the maximum requests executing at any time dropped to 2, irrespective of
the number of client threads. At this point I thought there might be a
problem with machine A, so I moved the client to yet another machine (say
B). The same pattern repeated. Just for experimenting, I ran the client
utility on machines A and B simultaneously, each running about 10 threads;
the maximum requests executing changed to 4. I also moved the web app to
another machine and ran the client utility from different machines, and
again the same pattern repeated. So it appears that my web app allows only 2
simultaneous requests from each machine when called remotely, but it works
as expected locally.

I checked the processModel element in machine.config to find answers but got
nowhere. I have not modified this file at all.
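As far as I can tell it still looks like the stock element, roughly along
these lines (attribute values quoted from memory, so they may not match the
file exactly):

<processModel enable="true"
              timeout="Infinite"
              requestQueueLimit="5000"
              maxWorkerThreads="20"
              maxIoThreads="20"
              webGarden="false" ... />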

Is there any explanation for this behavior?

Thanks.
Raghu/..
 
