ASP.Net Caching Questions

Phil Sandler

All,

I am designing a system that will involve an IIS/ASP.Net application
server. The main purpose of the application server will be to load
large amounts of static data into memory and do fast lookups on it.
The total cache size at the time of first installation will be roughly
350MB, but may grow to as large as 10GB over the next five years.

My questions:

1. Can the ASP.Net cache utilize this much memory, assuming the
processor and OS are 64-bit?

2. If the cache size were 10GB, how much memory would the machine need
to prevent cache/application recycling (assume very little memory
usage in the app outside of the cache)?

3. Since the loading of the static data would be expensive and time
consuming, would any other steps need to be taken to avoid application
restarts?

4. Is there any limit to how large a single object in the cache can be?
(It's likely one dataset or hashtable in our system could approach
200MB; see the sketch after this list.)

5. Finally, is there another caching option that is more efficient/
usable/scalable/etc. than the one provided with ASP.Net?
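
For context on questions 3 and 4, the kind of insert I have in mind looks
roughly like this (LoadStaticData() is just a placeholder for our real load
routine, and the cache key is made up):

    using System;
    using System.Collections;
    using System.Web;
    using System.Web.Caching;

    public static class StaticDataLoader
    {
        public static void Load()
        {
            // Hypothetical loader that builds the large lookup structure.
            Hashtable data = LoadStaticData();

            // No expiration, and NotRemovable priority so the ASP.NET
            // scavenger won't evict the item under memory pressure.
            HttpRuntime.Cache.Insert(
                "StaticLookupData",
                data,
                null,                              // no cache dependency
                Cache.NoAbsoluteExpiration,
                Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable,
                null);                             // no removal callback
        }

        private static Hashtable LoadStaticData()
        {
            // Placeholder: the real version would read the static data from
            // the database and build the in-memory lookup structure.
            return new Hashtable();
        }
    }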


Thanks for any insight.


Phil
 
KJ

To answer #5, custom serialization (ala XmlSerializer) using SQL Server
would likely scale better than a massive cache, although speed of access and
serializability of your objects might be issues.
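
Roughly what I mean (LookupEntry is just an illustrative type; your real
objects would need to be XML-serializable):

    using System.IO;
    using System.Xml.Serialization;

    public class LookupEntry
    {
        public string Key;
        public decimal Value;
    }

    public static class XmlPersistence
    {
        // Serialize the lookup entries to an XML string, which could then
        // be stored in an xml or varchar(max) column in SQL Server.
        public static string ToXml(LookupEntry[] entries)
        {
            XmlSerializer serializer = new XmlSerializer(typeof(LookupEntry[]));
            using (StringWriter writer = new StringWriter())
            {
                serializer.Serialize(writer, entries);
                return writer.ToString();
            }
        }
    }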

Regarding #4, my question to you is, if the dataset or hashtable is going to
be 200MB, isn't that just an in-memory representation of a database table or
entire database (and why not just use tables)?
 
Phil Sandler

To answer #5, custom serialization (ala XmlSerializer) using SQL Server
would likely scale better than a massive cache, although speed of access and
serializability of your objects might be issues.

Performance of the data lookup is the #1 priority of this particular
part of the system. So unless I'm misunderstanding your suggestion, I
don't think XML is the answer.

Regarding #4, my question to you is, if the dataset or hashtable is going to
be 200MB, isn't that just an in-memory representation of a database table or
entire database (and why not just use tables)?

Yes, it's basically a modified representation of the static data in
the database. We are not simply using the database tables themselves
(I assume that's what you meant) because in-memory lookups are
much, much faster than going to the database for each one.


Thanks for your reply,

Phil
 
KJ

A serialization option (other than XML) you might consider is binary
serialization (saving an object as a stream of bytes to some medium,
such as a varbinary SQL Server column). Check out the topic "basic
serialization" in msdn.

One thing I thought of: What if for some reason the aspnet or iis
process tanks and has to restart? What would happen to your
application (I imagine that such an application would require serious
ramp-up time to load all the data)? SQL Server is built to withstand
these kinds of events, and a properly designed and optimized SQL
database will perform comparably to using Cache.
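
If you do go the all-in-memory route, presumably the data would have to be
rebuilt every time the worker process starts, something along these lines
(Global.asax, reusing the hypothetical loader sketched earlier in the thread):

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Rebuild the in-memory lookup data whenever the process starts
            // (or restarts after a recycle); this is where the ramp-up cost
            // gets paid.
            StaticDataLoader.Load();
        }
    }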
 
Phil Sandler

A serialization option (other than XML) you might consider is binary
serialization (saving an object as a stream of bytes to some medium,
such as a varbinary SQL Server column). Check out the topic "basic
serialization" in msdn.

As stated, the lookups have to be as fast as possible, so the idea is
to have all the information in memory so that the lookups are
instant. Loading the information from a file or sql column will not
perform nearly as well.

One thing I thought of: What if for some reason the aspnet or iis
process tanks and has to restart? What would happen to your
application (I imagine that such an application would require serious
ramp-up time to load all the data)?

Yes, there would be ramp up time. This is an expected and acceptable
condition, so long as the application performs well when it's running.

SQL Server is built to withstand
these kinds of events, and a properly designed and optimized SQL
database will perform comparably to using Cache.

With all due respect, what are you basing this on? Have you ever
tested this? Looking up information in cache is many, many times
faster than querying a database for it.
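
For what it's worth, the kind of rough comparison I've been running looks
like this (the connection string and the Lookups table are placeholders):

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;

    public static class LookupTimer
    {
        public static void Compare(Dictionary<string, decimal> cache, string[] keys)
        {
            // Time the in-memory lookups.
            Stopwatch sw = Stopwatch.StartNew();
            foreach (string key in keys)
            {
                decimal value;
                cache.TryGetValue(key, out value);
            }
            sw.Stop();
            Console.WriteLine("In-memory: {0} ms", sw.ElapsedMilliseconds);

            // Time the same lookups as individual database round trips.
            sw = Stopwatch.StartNew();
            using (SqlConnection conn = new SqlConnection(
                "Server=.;Database=StaticData;Integrated Security=true"))
            {
                conn.Open();
                using (SqlCommand cmd = new SqlCommand(
                    "SELECT Value FROM Lookups WHERE LookupKey = @key", conn))
                {
                    cmd.Parameters.Add("@key", SqlDbType.NVarChar, 50);
                    foreach (string key in keys)
                    {
                        cmd.Parameters["@key"].Value = key;
                        cmd.ExecuteScalar();
                    }
                }
            }
            sw.Stop();
            Console.WriteLine("Database:  {0} ms", sw.ElapsedMilliseconds);
        }
    }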

I appreciate your taking the time to respond, but I need to find
answers to my specific questions before I start looking at alternate
solutions.


Thanks,

Phil
 
KJ

Hi Phil,

I don't have specific testing results, and I have no trouble conceding to
your assertion about Cache vs. database speed.

I'm only trying to get across the general idea that a stored procedure
executing in roughly 5-20 ms is sufficient for 100% of the ASP.NET
applications I have encountered or am likely to encounter. Sending HTML
across the wire is not expected to happen instantaneously, as the speed of
the network, number of hops, etc, is always a limiting factor.

Is yours a real-time-dependent application where the consequences of waiting
a few extra ms are potentially disastrous or extremely problematic? Unless
it can be proven that a massive aspnet process is safe, reliable, and
manageable at the O/S level, and doesn't introduce any resource-based
performance concerns of its own, I'd be hesitant to rely on it. (Maybe other
folks with more knowledge about aspnet internals will chime in on this
thread and provide specifics?)

Now, for curiosity's sake, could you tell us why this particular application
has to be so fast, or is that proprietary info (etc)?

-KJ
 
Phil Sandler

Hi Phil,

I don't have specific testing results, and I have no trouble conceding to
your assertion about Cache vs. database speed.

I'm only trying to get across the general idea that a stored procedure
executing in roughly 5-20 ms is sufficient for 100% of the ASP.NET
applications I have encountered or am likely to encounter. Sending HTML
across the wire is not expected to happen instantaneously, as the speed of
the network, number of hops, etc, is always a limiting factor.

Is yours a real-time-dependent application where the consequences of waiting
a few extra ms are potentially disastrous or extremely problematic? Unless
it can be proven that a massive aspnet process is safe, reliable, and
manageable at the O/S level, and doesn't introduce any resource-based
performance concerns of its own, I'd be hesitant to rely on it. (Maybe other
folks with more knowledge about aspnet internals will chime in on this
thread and provide specifics?)

Now, for curiosity's sake, could you tell us why this particular application
has to be so fast, or is that proprietary info (etc)?

Essentially, this is not a website; it's an application server. The
purpose of the application server is to receive a request for
processing, and then process the request as fast as possible. I can't
get deep into the details of what the system does, but a big part of
what it has to do is look up many (!) thousands of values as fast as
possible. When I say as fast as possible I mean: if each lookup takes
2ms instead of 1ms, it would make a huge difference in how we measure
the success of the system.

So the added overhead of loading the huge set of data up front is not
a problem, as long as the goal of making the system respond as quickly
as possible is met. Looking up thousands of values in a database,
even if the database is very, very fast, is significantly slower than
looking up the values in memory (at least from my admittedly
unscientific tests).
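
To put rough numbers on it: if a single request needs, say, 5,000 lookups,
an extra 1ms per lookup adds 5 seconds to that request. The per-request work
is roughly this (the cache key matches the loading sketch in my first post):

    using System.Collections;
    using System.Collections.Generic;
    using System.Web;

    public static class RequestProcessor
    {
        // Resolve thousands of keys per request against the structure
        // loaded once at startup; any per-lookup overhead is multiplied by
        // the number of keys.
        public static List<object> Resolve(IEnumerable<string> keys)
        {
            Hashtable data = (Hashtable)HttpRuntime.Cache["StaticLookupData"];
            List<object> results = new List<object>();
            foreach (string key in keys)
            {
                results.Add(data[key]);   // in-process lookup, no database round trip
            }
            return results;
        }
    }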

Your question of whether this method would be safe, reliable and
manageable is exactly what I need to get to. :)


Thanks,

Phil
 
