Large row count from stored procedure

Dooza

Hi there,
I am using an ASP page to output to an Excel file. It's using this at the
top of the page:

<%
' Change HTML header to specify Excel's MIME content type
Response.Buffer = TRUE
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader "Content-Disposition", _
    "attachment;Filename=Export.xls"
%>

I am using an ADODB command to access a stored procedure in SQL 2000.
This stored procedure takes 5 inputs, and returns 2 recordsets.

The first recordset is always just 1 row, and is displayed in a table.

The second recordset can vary between 10 rows and 35,000 rows.
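Roughly, the call is wired up along these lines (the procedure name, parameter
names and variables below are only illustrative, not the exact code in the page):

<%
Dim cnn, cmd, rsTotals, rsData
Set cnn = Server.CreateObject("ADODB.Connection")
cnn.Open sConnString                     ' sConnString: whatever the page connects with

Set cmd = Server.CreateObject("ADODB.Command")
With cmd
    Set .ActiveConnection = cnn
    .CommandType = 4                     ' adCmdStoredProc
    .CommandText = "usp_SalesReport"     ' illustrative name
    .Parameters.Append .CreateParameter("@ItemCodes", 200, 1, 8000, sItemCodes)     ' adVarChar, input
    .Parameters.Append .CreateParameter("@ManufacturerID", 3, 1, , iManufacturerID) ' adInteger, input
    .Parameters.Append .CreateParameter("@Freight", 5, 1, , dblFreight)             ' adDouble, input
    .Parameters.Append .CreateParameter("@Duty", 5, 1, , dblDuty)                   ' adDouble, input
    .Parameters.Append .CreateParameter("@CountryID", 3, 1, , iCountryID)           ' adInteger, input
End With

Set rsTotals = cmd.Execute               ' first recordset: the single totals row
' ... totals table written here ...
Set rsData = rsTotals.NextRecordset      ' second recordset: the detail rows
%>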

With smaller result sets it works just fine, but with large numbers of rows the
object is destroyed before it gets displayed. I am assuming it's down to
the size, or the server giving up, but it doesn't time out, which is what
I would expect to happen.

Is there a limit, or is it a resources issue? Can I set the limit if
there is one? Or is there a better way to do this?

Cheers,

Steve
 
Bob Barrows

Dooza said:
Hi there,
I am using an ASP page to output to an Excel file. It's using this at the
top of the page:

<%
' Change HTML header to specify Excel's MIME content type
Response.Buffer = TRUE
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader "Content-Disposition", _
    "attachment;Filename=Export.xls"
%>

I am using an ADODB command to access a stored procedure in SQL 2000.
This stored procedure takes 5 inputs, and returns 2 recordsets.

The first recordset is always just 1 row, and is displayed in a table.

The second recordset can vary between 10 rows and 35,000 rows.

35000?!?!?
Err ... did you consider paging these records ... ?
With smaller result sets it works just fine, but with large numbers of rows the
object is destroyed before it gets displayed. I am assuming it's down
to the size, or the server giving up, but it doesn't time out, which
is what I would expect to happen.

Is there a limit, or is it a resources issue? Can I set the limit if
there is one? Or is there a better way to do this?
Paging immediately suggests itself to me. Look it up at www.aspfaq.com
 
Dooza

Bob said:
35000?!?!?
Err ... did you consider paging these records ... ?

Paging immediately suggests itself to me. Look it up at www.aspfaq.com

I don't know if paging would work when dumping it to Excel, or maybe it
would, I don't know.

Currently there is a form: you choose various options, submit, and output
is via Excel, as that is the format I have been asked to make the report
in. Excel can take over 65,000 rows, but I guess ADODB must have a limit,
and I can't find any information about this limit.

I was thinking about maybe outputting XML from the stored procedure and
streaming the output. What do you think? Would that work? Can I get that
into Excel?

Steve
 
Daniel Crichton

Dooza wrote on Thu, 25 Jun 2009 12:20:59 +0100:
Hi there,
I am using an ASP page to output to an Excel file. It's using this at the top
of the page:
<%
' Change HTML header to specify Excel's MIME content type
Response.Buffer = TRUE
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader "Content-Disposition", _
    "attachment;Filename=Export.xls"
%>
I am using an ADODB command to access a stored procedure in SQL 2000. This
stored procedure takes 5 inputs, and returns 2 recordsets.
The first recordset is always just 1 row, and is displayed in a table.
The second recordset can vary between 10 rows and 35,000 rows.
With smaller result sets it works just fine, but with large numbers of rows the
object is destroyed before it gets displayed. I am assuming it's down to
the size, or the server giving up, but it doesn't time out, which is
what I would expect to happen.
Is there a limit, or is it a resources issue? Can I set the limit if there
is one? Or is there a better way to do this?

Steve

Use

Response.Buffer = False

and also in your loop that is writing the data out add a

Response.Flush

every 100 rows or so. I do both of these because I've found that even with
the buffer set to False there is some buffering going on, and the flush
helps stop this from causing the ASP engine to throw an error due to the
buffer filling up.
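
A rough sketch of that loop (the column names are only illustrative; note that
some IIS versions refuse to Flush unless buffering is on, in which case keep
Response.Buffer = True and just flush every 100 rows or so):

<%
Dim lRowCount
lRowCount = 0

Response.Write "<table>"
Do While Not rsData.EOF
    Response.Write "<tr><td>" & Server.HTMLEncode(rsData("ItemCode") & "") & "</td>" & _
                   "<td>" & rsData("Qty") & "</td></tr>" & vbCrLf

    lRowCount = lRowCount + 1
    If lRowCount Mod 100 = 0 Then
        Response.Flush          ' push what has been built up so far to the browser
    End If

    rsData.MoveNext
Loop
Response.Write "</table>"
%>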
 
Dooza

Daniel said:
Response.Buffer = False

and also in your loop that is writing the data out add a

Response.Flush

every 100 rows or so. I do both of these because I've found that even with
the buffer set to False there is some buffering going on, and the flush
helps stop this from causing the ASP engine to throw an error due to the
buffer filling up.


Hi Daniel,
I have tried as you suggested, but it doesn't get past the start of my
loop:

Microsoft VBScript runtime error '800a01a8'

Object required

/spreadsheet/excel.asp, line 140

Line 140 is <% Do While NOT rsData.EOF %>

With a smaller dataset I am informed that buffering must be on when I
try to do the flush.

Steve
 
Bob Barrows

Dooza said:
Hi Daniel,
I have tried as you suggested, but it doesn't get past the start of
my loop:

Microsoft VBScript runtime error '800a01a8'

Object required

/spreadsheet/excel.asp, line 140

Line 140 is <% Do While NOT rsData.EOF %>

With a smaller dataset I am informed that buffering must be on when I
try to do the flush.
Can you successfully use getrows to pull the data into an array as a
test?
 
Bob Barrows

Dooza said:
I don't know if paging would work when dumping it to Excel, or maybe it
would, I don't know.
Oh yeah ... <blush>

Hmm, is it possible to use DTS/SSIS to export this data to a spreadsheet
instead? You can then stream it to the user. I guess the answer would
depend on whether this was for an intranet or for the internet.
 
Dooza

Bob said:
Oh yeah ... <blush>

Hmm, is it possible to use DTS/SSIS to export this data to a spreadsheet
instead? You can then stream it to the user. I guess the answer would
depend on whether this was for an intranet or for the internet.

It's an intranet, but it needs data from the user first.

It's a sales report. You can either give a string of item codes, comma
separated, or select a manufacturer from a drop-down list.

You then enter the freight and duty multipliers, and optionally select a
country from another drop-down list.

Steve
 
Dooza

Bob said:
Can you successfully use getrows to pull the data into an array as a
test?

I decided to change my stored procedure to output just one recordset,
and then to use getrows, but I get the same error message; only now it
points to the getrows line.

Steve
 
Dooza

Dooza said:
I decided to change my stored procedure to output just one recordset,
and then to use getrows, but I get the same error message; only now it
points to the getrows line.

Steve

Just noticed I am using a DSN on the server; is this perhaps limiting the
rows returned?

Steve
 
Bob Barrows

Dooza said:
It's an intranet, but it needs data from the user first.
That does not invalidate using a package. The learning curve may be
steep, but it's the correct technology to use for this.
 
Bob Barrows

This is the first mention I've heard of an error message. What is the
error message?
Just noticed I am using a DSN on the server; is this perhaps limiting
the rows returned?
Probably not, but you should jettison the ODBC connection.
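
That is, replace the DSN with a connection string that goes straight to the
OLE DB provider, something like this (server and database names are placeholders):

<%
Dim cnn
Set cnn = Server.CreateObject("ADODB.Connection")

' Instead of:  cnn.Open "DSN=MyDsn"
cnn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
         "Initial Catalog=MyDatabase;Integrated Security=SSPI;"
%>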
 
Dooza

Dooza said:
I decided to change my stored procedure to output just one recordset,
and then to use getrows, but I get the same error message; only now it
points to the getrows line.

Steve

I was wrong, it does work. Well, I had used rsDate instead of rsData. I got
rid of the getrows and it's working... slowly, but it's working.

So the question is, why did it not like having a big second recordset? I
was using the first one for totals along the top, as that was where I was
asked to put them. Looks like I can't do it that way.

Steve
 
Dooza

Bob said:
This is the first mention I've heard of an error message. What is the
error message?

Check a couple of messages up; it was even in the quoted reply. Basically
an "Object required" error.
Probably not, but you should jettison the ODBC connection.

I agree; I'm not sure why it was using it in the first place.

Steve
 
Dooza

Bob said:
That does not invalidate using a package. The learning curve may be
steep, but it's the correct technology to use for this.

DTS may be a better solution for this; I will have a closer look and see
if I can schedule some research in.

Steve
 
Dooza

I was wrong, it does work. Well, I had used rsDate instead of rsData. I got
rid of the getrows and it's working... slowly, but it's working.

So the question is, why did it not like having a big second recordset? I
was using the first one for totals along the top, as that was where I was
asked to put them. Looks like I can't do it that way.

I think I will try and have another recordset with the totals, rather
than trying to squeeze them both from the same stored procedure.

Steve
 
Adrienne Boswell

I think I will try and have another recordset with the totals, rather
than trying to squeeze them both from the same stored procedure.

Steve

I would keep on trying the getrows method. Much better to put the data
into an array, close the recordset, and loop through the array.
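
Something along these lines (the field positions are only examples):

<%
Dim arrData, lRow

If Not rsData.EOF Then
    arrData = rsData.GetRows()   ' pulls the whole recordset into a 2-D array
End If
rsData.Close : Set rsData = Nothing
cnn.Close : Set cnn = Nothing    ' release the database objects before writing

If IsArray(arrData) Then
    For lRow = 0 To UBound(arrData, 2)       ' second dimension holds the rows
        Response.Write "<tr><td>" & arrData(0, lRow) & "</td>" & _
                       "<td>" & arrData(1, lRow) & "</td></tr>" & vbCrLf
    Next
End If
%>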
 
