Kerry
Hello, I'm having a problem with connection pooling from my ASP.NET Web
Service. I have two servers that are identical software-wise (the hardware
differs). Server A and Server B are both running Windows 2000 Server SP4,
.NET Framework 2.0.50727 and 1.1.4322, IIS 5, and the .NET SqlClient Data
Provider. When my .NET client app connects to the service on Server A, the
DB connections to SQL Server 2005 SP2 are re-used perfectly. However, when
connecting to Server B, which runs the exact same service, connections are
used up until I get the dreaded "The timeout period elapsed prior to
obtaining a connection from the pool." I have compared both servers and
cannot find the difference... I'm thinking somewhere down the line I
installed something on Server A that enabled connection pooling to function
correctly. Server B is a new server that will replace Server A. I have
checked all the usual places for enabling connection pooling. With each DB
read operation another connection is consumed, as if each read were coming
from a different user.
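For context, each read in the service is structured roughly like this (a minimal sketch; the connection string, query, and names are placeholders, not my actual code). My understanding is that Close/Dispose should return the connection to the pool for re-use rather than destroying it:

```csharp
using System.Data.SqlClient;

// Sketch of one pooled read; connString is a placeholder.
using (SqlConnection conn = new SqlConnection(connString))
{
    conn.Open(); // draws an existing connection from the pool, or opens a new one
    using (SqlCommand cmd = new SqlCommand("SELECT 1", conn))
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // process rows
        }
    }
} // Dispose closes the reader and returns the connection to the pool
```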
1. Neither machine has a registry entry for StartConnectionPool = 1, and
since pooling works on Server A without it I assume Server B does not need
it either. I created the entry just in case, with no luck, so I removed it.
2. The connection string on both servers uses Max Pool Size=10;Min Pool
Size=1. Since this is development, 10 is sufficient, and again Server A runs
fine with this; I don't even need to specify the Pooling=true property.
3. In the ODBC Administrator, Connection Pooling is 'On' for the drivers
SQL Server (200.85.1128) and SQL Native Client (2005.90.3042) on both
servers. Connections are set to remain in the pool for 60 seconds.
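Regarding point 2, the full connection string looks roughly like the following (the server, database, and security values here are placeholders, not my real ones). As I understand it, the SqlClient keyword for pooling is Pooling, and it defaults to true, which is why I haven't specified it:

```
Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI;Min Pool Size=1;Max Pool Size=10;Pooling=true
```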
Is there something I could be missing, or some utility that could turn on
connection pooling for the .NET SqlClient Data Provider: a registry entry,
a config file setting, or an IIS 5 command-line parameter?
Thanks