Luther Miller
I have noticed that performance is very poor when a web service
returns a large dataset.
Further testing has shown that the XML version of the dataset is
approximately 1.3GB in size - about 10x the binary representation of
the data. It is the transfer of this data over the network that is
causing the performance problem.
We are pulling a large result set into an Excel application for pivot
table processing.
Connecting directly to the database using ADO takes only about 30
seconds to retrieve all the data - presumably because the data comes
across the wire in an efficient binary format.
We also looked into using a third-party product to compress the XML
dataset before returning it from the web service, but that too is
prohibitively slow.
How is Microsoft recommending that applications deal with large
datasets? How are other people dealing with this issue (if at all)? I
realize that .NET 2.0 is supposed to support binary serialization of
DataSets over web services, but this needs to be a .NET 1.1 solution.