Datasets vs Custom Objects for a webservice

John Sheppard

Hello there,

We currently use datasets to pass data between our webservice and our
client. The performance is abysmal (this is subjective, of course); the users
are complaining, and I'm not happy with the speed either.

I was wondering about two possible solutions:

a) Has anyone used LLBLGen over a webservice? Am I a fool to go down this
route? Has anyone seen any quantification of the speed of custom objects vs
datasets?
b) How difficult is it to convert a webservice to .NET remoting?

We do have compression happening, and our internet isn't the fastest in the
world (it's Australia). Gotta be a way to make it faster :(

Thank you kindly for reading my message
John Sheppard
 
Tiago Halm

From a conceptual point of view, one would create:
- DAL (Data Access Layer)
- BAL (Business Access Layer)
- WebService to expose the BAL

All these layers can be logical (they do not need to be in separate assemblies) or
physical (separate assemblies/modules). Given the above, it makes sense to
expose a custom object or a custom list of objects (when you return a list of
these). You know that most data controls give you the ability, using an object
data source, to bind to custom objects.

The serialized dataset has a lot of properties which you don't need, and
sometime later you may even need to deal with the objects the dataset
represents to do intermediate logic.
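As a rough sketch of that idea (all names here are hypothetical, not from the thread): a plain serializable class returned from an ASMX [WebMethod] serializes only the fields you declare, while a returned DataSet also carries its schema and row-state metadata over the wire.

```csharp
// Hypothetical DTO for an ASMX web service: only the declared
// fields are serialized, unlike a DataSet, which also ships
// schema and row-state information.
[Serializable]
public class CustomerDto
{
    public int Id;
    public string Name;
    public string City;
}

public class CustomerService : System.Web.Services.WebService
{
    [System.Web.Services.WebMethod]
    public CustomerDto[] GetCustomers()
    {
        // Map rows from the DAL into lightweight DTOs here.
        return new[] { new CustomerDto { Id = 1, Name = "Acme", City = "Sydney" } };
    }
}
```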

From a performance point of view, netTcpBinding is quite performant and
should give you speed similar to what you see with .NET remoting, with less
under-the-covers functionality and more extensibility points, plus tracing,
plus management, plus a lot of other features available in WCF (the WS-*
standards are quite rich and flexible for that matter).
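For reference, a minimal app.config sketch of a netTcpBinding endpoint; the service and contract names are made up for illustration:

```xml
<!-- Minimal WCF endpoint sketch (MyApp.* names are hypothetical) -->
<system.serviceModel>
  <services>
    <service name="MyApp.OrderService">
      <endpoint address="net.tcp://localhost:8081/orders"
                binding="netTcpBinding"
                contract="MyApp.IOrderService" />
    </service>
  </services>
</system.serviceModel>
```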

Tiago Halm
 
John Sheppard

----- Original Message -----
From: "Tiago Halm" <[email protected]>
Newsgroups: microsoft.public.dotnet.framework.aspnet.webservices
Sent: Wednesday, March 19, 2008 9:45 AM
Subject: Re: Datasets vs Custom Objects for a webservice

[quoted text trimmed]

Thanks Tiago,

Our current layers are:
DAL (as a webservice... I'm sure everyone will shudder)
BAL
UI

I know that we have it wrong; I've known for a while. Not sure how to go from
datasets to that, though... I suppose if we use LLBLGen it's irrelevant how we
access the data then...

I have had a look at WCF, and I found it badly documented and very
convoluted. I must admit I didn't have a good look, though.

One thing I really hate is debugging on the webservice... everything has to
stop :( When our business logic goes there, I'd imagine that will slow me down
a lot. Wish I could somehow edit stuff on the webserver without having to
stop execution.

Thanks Tiago,
I will have to contemplate this...(wish I had more time :(...*sigh*)
John Sheppard
 
Roy Lawson

I've been developing WCF for over a year now. I have found it to be a very
robust solution.

So the problem, as I understand it, is performance across the wire. Yes, you
could use LLBLGen, or what I like to use is NetTiers via CodeSmith for the DAL.
But that is really not relevant to your problem as I understand it: LLBLGen
and CodeSmith will both produce datasets or business entities.

You really have several choices here to solve the problem of moving large
volumes of data across the wire. One is to use another (more efficient)
protocol, as mentioned already. Another is to pass business entities
instead of datasets (collections), since that would reduce the size of data
moving across the wire. You already know that. And obviously make sure the
service side filters out unwanted/unneeded data.

Assuming all of those things are done, another possibility would be
compression. You may want to check out this solution for WS-compression:
http://weblogs.asp.net/cibrax/archive/2006/03/29/441398.aspx

If that doesn't work for you, I'm sure there are other compression solutions
out there. Please let us know the ultimate solution you choose and how it
goes!
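As a hedged sketch of one such approach (separate from the WS-compression extension linked above): gzip the serialized payload before returning it as a byte array from the service, and have the client reverse it with a GZipStream in Decompress mode. The helper name is made up for illustration.

```csharp
using System.IO;
using System.IO.Compression;

public static class PayloadCompressor
{
    // Compresses a serialized payload (e.g. a DataSet written with
    // WriteXml, or a binary-serialized entity) so the web method can
    // return byte[] instead of the raw XML.
    public static byte[] Compress(byte[] payload)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(payload, 0, payload.Length);
            }
            // MemoryStream.ToArray is still valid after the stream
            // has been closed by the GZipStream's Dispose.
            return output.ToArray();
        }
    }
}
```

Whether this wins anything depends on the payload: verbose DataSet XML compresses very well, so the trade of CPU for bandwidth tends to pay off on a slow link.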


Roy Lawson
president of the Central Florida .Net Users Group
www.cfdotnet.org
 
John Sheppard

Roy Lawson said:

[quoted text trimmed]

For the very short term I have made it cache the retrieved datasets where I
can. It helps somewhat; maybe it cuts 10% of the calls.

In the long term I think we will go with WCF and custom entities, that will
all hinge on a proof of concept and approval from the boss.

I have tried the WSE version of that compression link from rodolfof; I had a
hell of a time getting them to work. I have tried a few different methods,
and that one was the only one I could get working. I even tried compressing
the dataset into bytes and transferring it that way.

Thanks Roy
John Sheppard
 
