Behavior of ADODB.Command .Execute changes on different servers???

SaltedSnacks

Hello.

We have an ASP 3.0 application that currently works "correctly" on one
server, Server A, and we're testing it on another server, Server B,
which is 64 bit.

The connection string for Server A is:

DRIVER={SQL Server};SERVER=...;DATABASE=...;UID=...;PWD=...

The connection string for Server B is:

PROVIDER=sqloledb;SERVER=...;DATABASE=...;UID=...;PWD=...

(Note: Both servers point to the same database on the same server)


The unexpected behavior occurs after calling .Execute on a command.
Here is some sample code:

Dim DBConn
Set DBConn = CreateObject("ADODB.Connection")
DBConn.Open strDBConnection '(the ones shown above)

Dim objCmd
Set objCmd = Server.CreateObject("ADODB.Command")
objCmd.ActiveConnection = DBConn

objCmd.CommandType = adCmdStoredProc
objCmd.CommandText = "sp_TestProcedure"

objCmd.Parameters.Append objCmd.CreateParameter("@Name", adVarChar, _
    adParamInput, 50, "Test")

Dim rs
Set rs = objCmd.Execute


For this example, the stored procedure sp_TestProcedure is:

CREATE PROCEDURE sp_TestProcedure
@Name varchar(50)
AS
INSERT INTO tblTest ([Name], Date) VALUES (@Name, getDate())
SELECT COUNT(*) AS 'Count' FROM tblTest

The basic point is the stored procedure does an INSERT and then a
SELECT.


Now... to the issue. On Server A, the variable rs above ends up with
a single open Recordset which is the results of the SELECT statement.
However, on Server B, rs is set to a closed recordset, and
rs.NextRecordset() gets a second recordset of the results of the
SELECT statement.

I understand what's going on. Server B is first returning the number
of rows affected by the INSERT which translates to a closed
recordset. But Server A does not do this.

I would like to know why the default behavior of the
command's .Execute is different on the different servers. Does it
relate to the Provider/Driver settings in the connection string? Does
it have anything to do with 64 bit VS. 32 bit servers?

I know that one way to address this issue is to add SET NOCOUNT ON to the
start of the stored procedure. But we have many stored procedures, so
if the solution is a change to the connection string, that would be
preferred. Also, whatever the solution turns out to be, I'm also looking
to discover *why* it's happening.

Any help would be greatly appreciated.
 

Bob Barrows [MVP]

> Hello.
>
> We have an ASP 3.0 application that currently works "correctly" on one
> server, Server A, and we're testing it on another server, Server B,
> which is 64 bit.
>
> The connection string for Server A is:
>
> DRIVER={SQL Server};SERVER=...;DATABASE=...;UID=...;PWD=...

Using the deprecated OLE DB Provider for ODBC - not recommended.

> The connection string for Server B is:
>
> PROVIDER=sqloledb;SERVER=...;DATABASE=...;UID=...;PWD=...

Using the native OLE DB Provider for SQL Server.

> (Note: Both servers point to the same database on the same server)


> The unexpected behavior occurs after calling .Execute on a command.
>
> Now... to the issue. On Server A, the variable rs above ends up with
> a single open Recordset which is the results of the SELECT statement.
> However, on Server B, rs is set to a closed recordset, and
> rs.NextRecordset() gets a second recordset of the results of the
> SELECT statement.

Actually, I would expect the Server B behavior.
> I understand what's going on. Server B is first returning the number
> of rows affected by the INSERT, which translates to a closed
> recordset. But Server A does not do this.

That is strange. It's been years since I used ODBC, but I'm pretty sure
I had to handle this issue back then by using SET NOCOUNT ON.
> I would like to know why the default behavior of the
> command's .Execute is different on the different servers.

It's not. The results delivered by the different providers are obviously
different. As to why that is the case ...

> Does it relate to the Provider/Driver settings in the connection string?

Maybe, but ISTR the ODBC driver doing the same thing.

> Does it have anything to do with 64 bit vs. 32 bit servers?

The only way to determine this is to try both connection strings on
server B.

> I know that one way to address this issue is to add SET NOCOUNT ON to the
> start of the stored procedure.

That's one way. The other is to modify your code to check the state of
the recordset object, calling NextRecordset if the recordset is closed.
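That check might look something like this - a minimal sketch, assuming the same objCmd setup as in the original post (adStateClosed is the ADO ObjectStateEnum value 0, declared here in case adovbs.inc isn't included):

```
Const adStateClosed = 0  ' ADO ObjectStateEnum: recordset is closed

Dim rs
Set rs = objCmd.Execute

' A closed recordset here is just the INSERT's rows-affected marker.
' Advance until we reach the open recordset produced by the SELECT,
' or run out of recordsets entirely.
Do While Not (rs Is Nothing)
    If rs.State <> adStateClosed Then Exit Do
    Set rs = rs.NextRecordset
Loop
```

This works against either provider, since on Server A the first recordset is already open and the loop exits immediately.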
> But we have many stored procedures,
> and if the solution is a change in the connection string, that would
> be preferred.

I would think you would want to confirm that changing back to the
ODBC string on server B would indeed make a difference. Keep in mind,
however, that the older ODBC provider is deprecated and may not
work as expected with SQL 2005+. You may want to bite the bullet now and
add SET NOCOUNT ON to the procedures that need it.
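Applied to the example procedure from the original post, that change is one line:

```
CREATE PROCEDURE sp_TestProcedure
@Name varchar(50)
AS
SET NOCOUNT ON  -- suppress the rows-affected message from the INSERT
INSERT INTO tblTest ([Name], Date) VALUES (@Name, getDate())
SELECT COUNT(*) AS 'Count' FROM tblTest
```

With NOCOUNT on, the INSERT no longer sends a done-in-proc row count to the client, so .Execute returns only the one open recordset from the SELECT regardless of provider.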
> Also, whatever the possible solution is, I'm also looking
> to discover *why* it's happening.

I'm totally baffled as to how knowing the "why" is going to help you
make a decision as to what course to take. Assuming that using the ODBC
string on server B "works", is it still the correct decision to use the
obsolete technology which may get broken in future versions of SQL
Server simply to avoid a few hours of work?
 
