Asynchronous Logins


Simon Gorski

I have a large problem, and I believe there is not yet a way to solve this
using IIS and ASP.NET. I hope someone has a solution which we couldn't
find.

The current situation
When a user logs in to our website, we perform a single login for multiple
services (let's call them services A-D). This means that on the backend,
while sampleuser logs in to our service with only one username and password,
he is automatically logged in to many different services in the background.
The problem with this method is that these other services are often
provided by third parties, which are halfway around the world and sometimes
have unexpected downtime. When a service goes down (service B, for
example), the user's login takes 20 seconds while we wait for the timeout
from the affected service. If the user has no interest in that service
(perhaps he only wants to use service A today), this makes for quite a bad
user experience.

The suggested solution
Naturally, our first thought was to implement an asynchronous login in the
background, which updates the website as the user is logged in to each
service, to indicate that the login has completed. In the meantime, the user
can surf around the site and use those services where he is already logged
in (service A, for example). We thought this would be no problem with
ASP.NET 2.0.

The limitations/requirements
For the service logins to work, we need access to the user's session. We
set things like login-specific data, as well as a LoggedIn flag for each
service. The background login should not stop the user from using other
services where he is already logged in. Our session data is currently
stored in a SQL database, and our web servers are in a load-balanced web
farm. Any solution has to take all this into account.

What we tried
At first, we tried simply spawning a new thread to perform the login. The
problem with this is that the new thread has no session access.
Next, we tried the methods described in the PPT on this site:
http://blogs.msdn.com/dmitryr/archive/2005/11/09/490980.aspx. The problem,
we realized, is that all of these "Async" methods are simply ways of
performing multiple tasks inside the processing of a single page. We needed
something out-of-band.
Finally, we started looking into AJAX/Atlas. This brought us closest to a
real solution: an asynchronous Web Method call, where the Web Method has
the EnableSession=true flag. This very nearly works, except for one large
problem: while the method is executing, the Session object is locked, so
the user cannot do anything on our site that requires a postback until the
WebMethod finishes. Watching the webcast at
http://support.microsoft.com/default.aspx?kbid=820913 (specifically part
9 on Threading Synchronization) led us to the conclusion that this approach
will not solve it either.
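To make the problem concrete, here is a minimal sketch of the pattern we tried (the class and method names are just examples we made up, not from any documentation). EnableSession=true gives the Web Method session access, but ASP.NET takes an exclusive lock on the session for the whole call, so every session-bound request from the same user waits until it returns:

```csharp
// Sketch of the Atlas Web Method approach we tried (names are examples).
using System.Threading;
using System.Web.Services;

public class LoginService : System.Web.Services.WebService
{
    [WebMethod(EnableSession = true)]
    public void LoginToServices()
    {
        // Stands in for the slow third-party logins (up to 20s on timeout).
        // The session is locked for this entire duration, so the user's
        // postbacks queue up behind this call.
        Thread.Sleep(20000);

        Session["ServiceB_LoggedIn"] = true;
        // Only when this method returns is the session lock released.
    }
}
```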

So the question seems simple, but the solution evades us: how can you
perform LongProcess() in the background of a user's web activity when
LongProcess() requires Session access?
 

Nick Malik [Microsoft]

Your design is frail and needs to be rethought.

Here are the odd assumptions that I am seeing.
When a user logs in to our website, we implement a single login for
multiple services (let's call them services A-D). This means, that on the
backend, while sampleuser logs in to our service with only one username
and password, he is automatically logged in to many different services in
the background. The problem with this method is, that these other services
are often provided by 3rd parties, which are halfway around the world, and
sometimes have unexpected downtime.

OK. First odd thing: services provided by third parties all have the
same userid and password. These services must either be provided by third
parties CLOSELY associated with your company, or they don't believe in
security very much. The problem with this assumption is that it makes for a
frail design: if you ever need to get a service from a company that ISN'T
closely associated with you, where the userid and password are not shared,
your design fails.
In the case where a service goes down (service B, for example), the user's
login will take 20 seconds while we wait for the timeout to reach the
affected service. If the user has no interest in the service (perhaps he
only wants to use service A today), this makes for quite a bad user
experience.

So the user's access to service B has to occur when the user logs in to your
site because the information from those services is (a) user-specific,
personalized, or individually purchased, and (b) not cached on your site.
That is another odd requirement. It sounds like your app is NOT providing
access to services per se. I could be wrong, but it sounds like you are
providing access to sites.

So it sounds like you are not so much providing a web site as an access
portal.

Please tell us what your site does. You don't have to tell us what content
is served. Personally I don't care if it is financial data or prurient
images. You are a software developer and I help all software developers.
What I do care about is knowing what your site does with respect to these
'services'.

Do the services produce HTML or just data? Do they pipe data to you that
you reformat or interpret, or do you push the data to the user?
Do the services have separate financial relationships with the user, or just
with you?
Does the customer purchase these services as a block (through you), or can
they subscribe to one service separately from another?

The reason I ask is that the problems of federated identity and single
sign-on have been solved many times. I'm happy to give you some pointers,
but it's a big topic, and I want to make sure that this is what you are
doing (just to save my fingers from giving you information that is not
useful to you).


--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
 

Simon Gorski

Perhaps I should clarify a bit. I didn't mean to say that the external
services use the same login that users have on our site. All the
translation from our account information to the login for the external
services is done by us. The external services have no data about our
customers beyond what is necessary. The idea is just that the user has a
single sign-on for the different services our site offers. The services we
provide are sometimes supplied by third-party providers.

Regarding the interfaces to the services: they each have an API, through
which we can query things like the account balance for a particular
customer and service. I guess you could say this is "access to sites", but
I would say we are providing access to external services on our site, with
our own interface, and of course those services have information about the
customer which we would like to display on our site. The issue of caching
is a valid one, but you can imagine the problems that could create for
things like an account balance from the external service. That data cannot
be cached without some customer backlash.

Without saying too much about our site, we can think up a similar situation
without too much trouble, and perhaps we can find a good solution for that.
Imagine that our site is a bank account management site: you add your
account information for various banks and accounts, and our site displays a
summary of all your portfolios. Ignoring the obvious security issues such a
site would present, let's focus on the technical aspect: how can you get
information like the account balance from the various external bank
accounts, without delaying the updates/logins when one bank site goes down?
Also assume that getting the bank account balance requires some access to
Session state, as this is one of our limitations.

Thanks for the feedback
 

Nick Malik [Microsoft]

The one thing that isn't clear, so far, is whether you have participation
from the sites whose services you are mashing up. If yes, you have one
solution. If no, you have another.

Let's assume that the sites offer services and that you and they are in a
tight corporate relationship. Then you can create a trust relationship
where the user logs in to you, and you simply pass credentials to their
site. They take your credentials as an "instant login" because they trust
you, and let the user data flow. This is called federated security. Look
for articles on ADFS for more information on how this can be done easily
using MS technology. (There are less easy ways to do the same thing, but
it's the same concept, as long as the service sites trust your credentials
without requiring a re-login.)

OK... the other option: you are 'mashing up' these services without the
direct cooperation of the site owners. Perhaps you do have their API, or
perhaps you are doing glorified screen scraping. Either way, you don't have
close cooperation, and they are not willing to trust credentials that your
app issues.

That's more common anyway.

In this case, you have an interesting issue. You have to log in to their
sites when you need the data. Remember: in this case, the user had to
provide you with the userid and password for the remote site.

Mechanism 1: IMHO, the best and most honest way is to let them log into
your site, but don't log into the remote sites at all until your customer
asks for information that comes from them. Perhaps they see a list of
sites whose credentials you are managing, and when they click a link, you
show a page that says 'establishing remote site login... please wait' and
then lets them through as soon as the login succeeds. This gives you some
advantages: (a) you don't have to log in to sites where the login is not
actually needed, and (b) if the remote site is down, you can blame the
remote site in your error message, without it looking like your fault.
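A sketch of Mechanism 1 in page code (every name here is invented for illustration; RemoteLogin stands in for whatever API call or scrape you use). The remote login happens lazily, on the first click, never at sign-in:

```csharp
// Mechanism 1 sketch: log in to the remote service only when the user
// actually asks for it. All identifiers here are hypothetical.
protected void ServiceLink_Click(object sender, System.EventArgs e)
{
    string serviceId = ((System.Web.UI.WebControls.LinkButton)sender)
                           .CommandArgument;

    if (Session[serviceId + "_LoggedIn"] == null)
    {
        // This is where the 'establishing remote site login... please
        // wait' page would show. RemoteLogin is your API/scrape call.
        bool ok = RemoteLogin(serviceId);
        if (!ok)
        {
            // The remote site is down: blame it in the error message.
            Response.Redirect("RemoteSiteDown.aspx?svc=" + serviceId);
            return;
        }
        Session[serviceId + "_LoggedIn"] = true;
    }
    Response.Redirect("ServiceView.aspx?svc=" + serviceId);
}
```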

Mechanism 2: You create an in-memory cache, not of the remote site's actual
data, but of the in-memory cookie that it issues to you when you log in.
It works like this: the user logs in to your site. You QUEUE requests for
logins to the remote sites. A service reads the queue and picks up the
requests, spawns threads, logs in, captures the in-memory cookies issued
by the remote site, ties each cookie to the user's information, and stores
it in the in-memory cache. Your app, after queueing the requests, simply
keeps the user busy with something else, like your own data or content.
When they ask for data from the remote site, you check whether the login
request has come back with the cached cookies. If it has, load the cookies
into your request header and get the data the user wants. If not, put up a
message saying that "the information is currently not available from the
remote server."
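Here is a rough sketch of Mechanism 2 (all type and member names are invented; the worker is shown in-process with a dedicated thread for brevity, where my description puts it in a separate service). The point is the cookie cache keyed by user and service:

```csharp
// Mechanism 2 sketch: queue login requests, capture the cookies the
// remote site issues, and cache them for later data requests.
using System.Collections.Generic;
using System.Net;
using System.Threading;

public class LoginRequest
{
    public string UserId, ServiceUrl, RemoteUser, RemotePassword;
}

public static class CookieCache
{
    static readonly Queue<LoginRequest> queue = new Queue<LoginRequest>();
    static readonly Dictionary<string, CookieContainer> cache =
        new Dictionary<string, CookieContainer>();

    public static void Enqueue(LoginRequest r)
    {
        lock (queue) queue.Enqueue(r);
        new Thread(Worker).Start();  // in my description: a separate service
    }

    static void Worker()
    {
        LoginRequest r;
        lock (queue) { if (queue.Count == 0) return; r = queue.Dequeue(); }

        // Log in and capture whatever cookies the remote site issues.
        CookieContainer jar = new CookieContainer();
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(r.ServiceUrl);
        req.CookieContainer = jar;
        // ... POST r.RemoteUser / r.RemotePassword here ...
        using (req.GetResponse()) { }

        lock (cache) cache[r.UserId + ":" + r.ServiceUrl] = jar;
    }

    // Later data requests attach the cached cookies, if the login is back.
    public static CookieContainer TryGet(string userId, string serviceUrl)
    {
        lock (cache)
        {
            CookieContainer jar;
            cache.TryGetValue(userId + ":" + serviceUrl, out jar);
            return jar;  // null => "currently not available" message
        }
    }
}
```

Note that this cache is per-machine; in your load-balanced farm, it would have to live somewhere shared, which is one more reason to host it in a dedicated service rather than inside each web server's process.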

Advantages: (a) the user doesn't wait for remote logins to occur, and (b)
you get to blame the remote site for login failures. The downside: you are
still logging in to remote sites, even when that login is not needed. This
could allow a number of attacks, including 'man in the middle' and any
number of attacks involving processes loaded on your server that gain
access to your credentials cache. Not the most likely thing in the world,
but if you get to the point where large dollars are flowing through, expect
someone to try.

I do hope this helps.

 

Simon Gorski

Thanks for the great feedback so far! I'm starting to get the impression,
though, that with our design limitations, what we want won't be possible.
Let's continue, though, just so we can be sure. The closest suggestion to
what we need was Mechanism 1. Essentially, we have to assume that we don't
have the full cooperation of the services, similar to a mashup site which
only accesses other services' APIs.
Now, the only problem we have with Mechanism 1 is that our design (which
cannot be changed) requires that the logins to all services be triggered as
soon as the login to our site happens. (Blame the designers ;) ) The
challenge, of course, is doing that without blocking our site itself.
I have seen this done on sites like meebo.com. It must be possible
somehow... maybe just not with ASP.NET?

 

Nick Malik [Microsoft]

Hi Simon,

I took a look at meebo to get context. Fascinating.

You mention that mechanism 1 is closer to what you are looking for than
mechanism 2, yet you also say that the designers need you to log in
immediately... and that is really mechanism 2.

When you set off multiple threads, you are basically saying "I want little
elves to handle the logins for me and report back to me when they are done."
The queueing mechanism in mechanism 2 is basically that: Little elves. The
difference is that the little elves are running in a different process.

If you run the little elves in the same process that returns the HTML page
to the user, then you run into interesting problems like delaying the
response and consuming response threads. Not usually a good option.

You don't have to run in a different process if you don't want to. You can
certainly set up the queueing within IIS. At the end of the day though,
the fundamental design is the same.

Steps are like this:

1. User logs in to your site
2. You look in DB and find out that they have registered for four services.
3. You get their credentials from the db and place them in a list.
4. You send back a page with something cute on it, like "working" or some
pithy marketing splash. That way, they know you haven't died.
4a. The page you sent back has some ajax controls in it that automatically
come back for status every few seconds or so.

5. Server side: after you send off the response, you spawn off a thread for
each credential in your list or use an async handler. Each element in the
list has a status value: not-logged-in, logging-in, waiting-for-retry,
logged-in.
6. Each thread logs in and updates status in the list and provides info to
allow a follow-on request to the service. Threads terminate.

7. The client comes back periodically, using Ajax, and your server code
responds with info from the list about status.
8. Perhaps, when the client finds out that something is logged in, the
client requests information from you that you turn around and request from
that service. (Meebo style, it could request a list of IM contacts.)
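The server side of these steps can be sketched like this (all names are invented; DoRemoteLogin stands in for your actual API login call, and in your web farm this per-user list would have to move into your shared SQL store rather than a static dictionary):

```csharp
// Sketch of steps 5-7: one status entry per registered service, worker
// threads update the entries, and the periodic Ajax call reads them.
using System.Collections.Generic;
using System.Threading;

public enum LoginStatus { NotLoggedIn, LoggingIn, WaitingForRetry, LoggedIn }

public class ServiceLogin
{
    public string ServiceId;
    public LoginStatus Status = LoginStatus.NotLoggedIn;
}

public static class LoginTracker
{
    // Keyed by your user id; shared between workers and Ajax requests.
    static readonly Dictionary<string, List<ServiceLogin>> byUser =
        new Dictionary<string, List<ServiceLogin>>();

    // Step 5: after the response goes out, spawn a thread per credential.
    public static void Begin(string userId, List<ServiceLogin> services)
    {
        lock (byUser) byUser[userId] = services;
        foreach (ServiceLogin s in services)
        {
            ServiceLogin entry = s;  // capture per iteration
            new Thread(delegate()
            {
                entry.Status = LoginStatus.LoggingIn;
                bool ok = DoRemoteLogin(entry.ServiceId); // your API call
                entry.Status = ok ? LoginStatus.LoggedIn
                                  : LoginStatus.WaitingForRetry;  // step 6
            }).Start();
        }
    }

    // Step 7: the periodic Ajax request asks for the current status.
    public static LoginStatus StatusOf(string userId, string serviceId)
    {
        lock (byUser)
        {
            List<ServiceLogin> list;
            if (byUser.TryGetValue(userId, out list))
                foreach (ServiceLogin s in list)
                    if (s.ServiceId == serviceId) return s.Status;
        }
        return LoginStatus.NotLoggedIn;
    }

    static bool DoRemoteLogin(string serviceId)
    {
        return true;  // placeholder for the real remote login
    }
}
```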

There is a discussion on the asp.net forum that may help.
http://forums.asp.net/thread/1344595.aspx

There is also an article in MSDN mag that may help:
http://msdn.microsoft.com/msdnmag/issues/03/06/Threading/default.aspx

I hope this helps. I don't think the problem is ASP.NET or IIS. It is the
web HTTP model. Everyone faces this. Apache and Sun have no grand
advantage on this one.

 

Simon Gorski

First off, thanks again for the help on this issue.

I must be missing something very simple, so I am going to pose a simple
question:
At which point, and how, do you propose updating the user's session object
to indicate that he has been logged in?

The methods described, as far as I can find and test, do not offer Session
access.
 

Nick Malik [Microsoft]

Sorry my reply took so long. I'm working on the design of a large project
at the moment; juggling is not easy.

Simon Gorski said:
First off, thanks again for the help on this issue.

I must be missing something very simple, so I am going to pose a simple
question:
At which point, and how, do you propose updating the user's session object
to indicate that he has been logged in?

I'll put up my previous sequence and add a note for where the user is marked
as 'logged in'.

At this point, the user is logged in to YOUR site. They are not logged in
to all of the other sites yet.

The client makes requests of the server on a periodic basis to find out
whether any of the threads have completed their login. The server has to
use the current user's info to get the details on the threads. You can
update the session when the data indicating success in one of the login
processes passes through.
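Concretely, that update could look like this (a sketch; every name is invented, and GetStatus stands in for a read of whatever shared, thread-safe store your login threads write to). The key point: the periodic Ajax status request is a normal session-bound request, so it is the safe place to touch the session, and since it returns immediately, the session lock is held only briefly:

```csharp
// Sketch: the quick, periodic status call updates the session, not the
// slow login threads. GetStatus is a hypothetical read of the shared
// status store that the background login threads update.
[WebMethod(EnableSession = true)]
public string CheckLogin(string serviceId)
{
    string status = GetStatus((string)Session["UserId"], serviceId);

    if (status == "LoggedIn")
        Session[serviceId + "_LoggedIn"] = true;  // session updated HERE

    return status;  // client-side script enables the service on success
}
```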


Note that I'm not a big fan of the design approach of using a session object
for state management. However, your site offers one of the best reasons to
use it, so in your case, I think it makes sense to use the session.
The methods described, as far as I can find and test, do not offer Session
access.

See the illustration above. That said, if you update the session in the
threads, which you can do assuming you are using the IIS thread pool, you
can get immediate updates to the session object. The downside is that you
would need to use the IIS thread pool, which is a really bad idea, because
those threads are needed to let your site handle other requests. You can
harm your site's scalability substantially by using the IIS thread pool
just to update the session right away. The Ajax method I describe is better
(in my humble opinion ;-).
