IHttpHandler, IHttpModule, and cleaning up after file uploading...

Sky Sigal

I have created an IHttpHandler that waits for uploads (attachments for a
webmail interface) and saves them to a directory that is defined in
config.xml.

My question is the following:
assuming that this is supposed to end up as a component for others to use,
and therefore I do NOT have access to their global.cs::Session_End(),
how do I clean up files that were uploaded but obviously left stranded
when the user aborted/gave up writing the email?

My thoughts were:
* Look for files in the directory that have a create date older than Now() - 20
minutes or so.
* But how do I launch it? MyUploader.Dispose() == too early. Maybe I could
launch a sleep/timer -- but which user/thread launches it? If every user
is launching one, this could turn into a total fiasco, with each user
launching secondary threads...
* I could look for these 'old files' every time a new user uploads -- and
clean up then, I guess. Meaning there would always be files there, never
an empty directory. Doesn't feel right... but doable...
* I am worried about this timer business because some writers are
SLOW... They might finish their email after 20 minutes... any ideas?

I looked into the fact that Session_Start() is an event on
HttpApplication -- but I don't think I could wire into that as easily as an
IHttpHandler or IHttpModule.


Any suggestions?

Thank you very much,
Sky
 
John Saunders

Sky Sigal said:
I have created an IHttpHandler that waits for uploads as attachments for a
webmail interface, and saves it to a directory that is defined in
config.xml.

My question is the following:
assuming that this is supposed to end up as a component for others to use,
and therefore I do NOT have access to their global.cs::Session_End(),
how do I clean up files that were uploaded but obviously left stranded
when the user aborted/gave up writing the email?

Are you sure that your handler will be called if the e-mail is abandoned? I
would think that it wouldn't be called until the entire request has been
received by ASP.NET. That would include any and all uploaded files.
 

Sky Sigal

Exactly -- I'm not sure what to do...
The only idea I have right now is calling the following method each time
someone uploads a new file... It will clean the dir of old/dead files,
leaving only the new ones.

What I absolutely hate about it is that it will never get back to 0 files
by itself... Hence my post, looking for ideas on how others would/have
tackled the problem of cleaning up an upload directory...


/// <summary>
/// Deletes all files in a directory that are older than 12 hours.
/// </summary>
/// <param name="qDestinationDir"></param>
/// <returns>-1 if failed; otherwise, the number of files deleted.</returns>
/// <remarks>
/// It goes without saying that this method should be used with EXTREME care!
/// Because it is so dangerous, it comes with two criteria built into it:
/// a) the directory name given must include the word 'upload' somewhere in it,
/// b) if the directory has files in it that are older than 7 days, then it's
/// most probably NOT a current/working upload directory, so nothing is deleted.
/// This might seem like a pain -- but the option of deleting your whole base
/// virtual directory by accident is a lot worse...
/// </remarks>
public static int CleanDir(string qDestinationDir){
    int tResult = -1;
    if (qDestinationDir.ToUpper().IndexOf("UPLOAD") < 0){
        return tResult;
    }
    //Cut-off times in the past (note: positive TimeSpans -- negative ones
    //would put the cut-offs in the future and break the checks below):
    System.DateTime tCheck = System.DateTime.Now.Subtract(new TimeSpan(12,0,0));
    System.DateTime tCheckWeekAgo = System.DateTime.Now.Subtract(new TimeSpan(7,0,0,0));
    System.DateTime tLastWriteTime = System.IO.Directory.GetLastWriteTime(qDestinationDir);
    if (tLastWriteTime < tCheckWeekAgo){
        //Directory remained untouched for over a week.
        //Not ok to continue.
        return tResult;
    }
    //Time to look:
    ArrayList tDeleteMe = new ArrayList();
    string[] tFiles = System.IO.Directory.GetFiles(qDestinationDir);
    foreach (string tFileName in tFiles){
        tLastWriteTime = System.IO.File.GetLastWriteTime(tFileName);
        if (tLastWriteTime < tCheckWeekAgo){
            //File has been there over a week. Alarm!
            //We're probably not in a valid upload directory!
            return tResult;
        }
        if (tLastWriteTime < tCheck){
            //File has been there over [n] hours. Ok to delete at end:
            tDeleteMe.Add(tFileName);
        }
    }
    tResult = 0;
    foreach (string tFileName in tDeleteMe){
        System.IO.File.Delete(tFileName);
        tResult += 1;
    }
    return tResult;
}//Method:End



PS: I've just written it now, so it compiles, but I haven't had a chance to
actually run it yet...
 

Sky Sigal

Hi John, and thanks again for your feedback!

I do hope you are right (as it would really simplify life) -- but I ... I
beg to differ :)

If I make a form for the user to write emails and attach attachments, and
he uploads 3 files, then continues writing his email, then decides to make a
phone call instead and cancels the email, the server now has 3 files
sitting in the upload dir, waiting for the Send() function to bring the
form's content (subject/body) together with the files, wrap it all up
into an email, and send it.

Since the email was cancelled, and the upload control had no notification of
this, it's still waiting for this Send() command, which will never happen...

Or am I missing something really really big and obvious about now?
Sky
 

Joe Fallon

One trick I just implemented to solve this problem is to use a Cache
Callback that simulates Session_End.

In Global.AcquireRequestState you set up the callback and insert the
SessionId into the cache as the key.
The trick is to set the cache expiration as a sliding value equal to the
Session.Timeout. So if their session expires, the cache expires at virtually
the exact same moment.

Then in the actual callback method you call your clean up function.
DoCleanup(key)

'which does this logic:
Delete files for SessionID = key

This assumes you store files based on SessionID and can identify them as
such.
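A minimal sketch of what that DoCleanup might look like in C# (names and the upload root are hypothetical; this assumes each session's files are saved under a subdirectory named after the SessionID, as Joe describes):

```
using System.IO;

public class UploadCleaner {
    //Hypothetical upload root; in Sky's case this would come from config.xml:
    private static string _uploadRoot = @"C:\Inetpub\wwwroot\Upload";

    public static void DoCleanup(string sessionId) {
        //Files for a session live in <uploadRoot>\<SessionID>, so when the
        //session expires we just remove that one subdirectory:
        string dir = Path.Combine(_uploadRoot, sessionId);
        if (Directory.Exists(dir)) {
            Directory.Delete(dir, true); //true = recursive
        }
    }
}
```

Storing per-session files in their own subdirectory is what makes the cleanup a one-liner; with a shared flat directory you would have to track individual filenames instead.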

This worked for me even when the user closed their browser.
The "dead" files exist until the session expires.

I use StateServer for Session state, so it should survive a recycle of the
worker process too.
 

Sky Sigal

Hi Joe: ...sounds very very interesting!

a) Would you be so kind as to show a snippet of code? This is new stuff to me...
b) Are you able to recycle/reuse the callbacks? In other words, if UserA has
gone through ten pages that trigger this Cache Callback, are there 10 waits
running (for lack of a better term -- the equivalent of window.setTimeout in
JScript)? Or are you able to recycle them, so that only one wait/callback is
being used per person (sort of like using
window.clearTimeout(o); o = window.setTimeout(...); in JScript)?

Very best -- and thanks again -- I think this is what I was looking for :)
Sky
 

Joe Fallon

Sky,
Here is some code: watch out for line wrapping.

This goes in Global.asax - AcquireRequestState:
======================================================
'Set up a callback to simulate Session_End
Dim key As String = HttpContext.Current.Session.SessionID
Dim onCacheRemove As System.Web.Caching.CacheItemRemovedCallback
onCacheRemove = New System.Web.Caching.CacheItemRemovedCallback(AddressOf CacheRemoveCallback)
'Sliding expiration equal to Session.Timeout (as described above), so the
'cache entry expires at virtually the same moment as the session:
HttpContext.Current.Cache.Insert(key, key, Nothing, _
    System.Web.Caching.Cache.NoAbsoluteExpiration, _
    TimeSpan.FromMinutes(HttpContext.Current.Session.Timeout), _
    System.Web.Caching.CacheItemPriority.Normal, onCacheRemove)
======================================================

This is also in Global.asax:
======================================================
Private Sub CacheRemoveCallback(ByVal key As String, ByVal source As Object, _
    ByVal reason As System.Web.Caching.CacheItemRemovedReason)
    If reason = System.Web.Caching.CacheItemRemovedReason.Expired Then
        YourCleanupCode.Clear(key)
    End If
End Sub
======================================================
 

John Saunders

Sky Sigal said:
Hi John, and thanks again for your feedback!

I do hope you are right (as it would really simplify life) -- but I ... I
beg to differ :)

If I make a form for the user to write emails and attach attachments, and
he uploads 3 files, then continues writing his email, then decides to make a
phone call instead and cancels the email, the server now has 3 files
sitting in the upload dir, waiting for the Send() function to bring the
form's content (subject/body) together with the files, wrap it all up
into an email, and send it.

Since the email was cancelled, and the upload control had no notification of
this, it's still waiting for this Send() command, which will never happen...

Or am I missing something really really big and obvious about now?
Sky

Sky, maybe _I'm_ missing something.

What do you mean when you say "and he uploads 3 files then continues writing
his email"? The HTML file upload control (<input type="file">) does not
upload the file at the time the file is chosen. It simply allows the user to
specify which file will eventually be uploaded, and it stores the file name.
The file upload only occurs upon the form post. It's not possible for the
user to upload three files and then continue writing the e-mail.

Unless you actually allow an "upload" action separate from the submission of
the completed e-mail. I'm not sure why you would do that, as the uploaded
files (which I am assuming are the attachments to the e-mail) have no
purpose separate from the completed e-mail.

So, perhaps you should give us more detail on the user interaction you're
talking about. Also, if you have a separate file upload step, perhaps you'll
tell us why.
 

Sky Sigal

Hi John:
Yes, of course the <input type=file> doesn't upload till post time -- but there are different ways
to upload, including using the ADOStream ActiveX objects with an IHttpHandler listening for them.
This way allows for uploading multiple files, and because it is asynchronous you don't have to wait
till it ends before you can continue typing...

And even if I were using a plain input type=file, we don't always know how many files to upload --
sometimes one, sometimes 10 -- so this causes 2 problems:
a) an unknown number of type=file buttons to generate for a form.
b) the user has to wait a long time at the end to ensure the mail went out successfully...

There are more 'desktop feeling' ways of skinning that cat -- and they involve uploading to a dir
before the email has been finished. In case you have not looked at this type of async uploading
before, I attach the following two pieces:
a) the IHttpHandler that I am still fiddling with, so it could have bugs, but the basics are there
b) the JS to inject into a page to prepare the ADOStream object for binary upload via an XML doc

Cheers,
Sky


//IHTTPHANDLER
using System;
using System.Collections;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Web;
using System.Web.SessionState;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.HtmlControls;
using System.Xml;
using System.IO;
namespace XAct.Web.HttpHandlers {
//NOTE: a companion, visible Listener Control shares this code, provided only
//for people who prefer working in a visual IDE. You should NOT use that
//control -- it forces you to dedicate a whole aspx page to hosting it,
//whereas this IHttpHandler does not, and is therefore much cleaner.
/// <summary>
/// IHttpHandler to handle uploads via the HttpFile control, or via ADOStream ActiveX methods.
/// </summary>
public class FileUploader : System.Web.IHttpHandler {
//================================================
//FIELDS
//================================================
private const string _PathFilter = "XAct.Uploader.aspx";
//================================================
//PROPERTIES
//================================================
/// <summary>Override of IHttpHandler.IsReusable to set it to true always.</summary>
public bool IsReusable {get { return true; }}//Must be true!
//================================================
//PUBLIC METHODS:
//================================================
/// <summary>
/// Called by the .NET framework to process the request.
/// </summary>
/// <param name="hc"></param>
public void ProcessRequest(HttpContext hc) {
//NB: this will never be called if Install() hasn't been called before:
_FileUploader.Process(hc, string.Empty);
}
/// <summary>
/// Static function that modifies web.config xml file as needed to install this IHttpHandler.
/// </summary>
/// <returns>True if successful.</returns>
public static bool Install(bool qThrowExceptionIfUnsuccessful){
System.Type tType = System.Reflection.MethodBase.GetCurrentMethod().DeclaringType;
return XAct.Web.Controls.Tools.InstallHandler(tType,_PathFilter,true);
}
/// <summary>
/// Shared Class used by both the FileUploader Control and IHttpHandler.
/// </summary>
public class _FileUploader{
/// <summary>
/// Processes request and saves uploaded file.
/// </summary>
/// <param name="hc">Current HttpContext.</param>
/// <param name="qUploadDir">Directory where Uploaded files are to be saved.</param>
/// <returns>True if upload and save was successful.</returns>
public static bool Process(System.Web.HttpContext hc, string qUploadDir){
//Clean up any old files in the upload directory.
_CleanUploadDir(qUploadDir);
if (hc.Request.Params["mode"]=="BINARY"){
return _Process_Upload_ADOStream(hc,qUploadDir);
}
else{
return _Process_Upload_Normal(hc,qUploadDir);
}
}


/// <summary>
/// Private Method. Used when Client sends file via Form and HtmlFile controls. (Traditional)
/// </summary>
/// <param name="hc">Current HttpContext</param>
/// <param name="qUploadDir">Directory where Uploaded files are to be saved.</param>
/// <returns>True if upload and save was successful.</returns>
private static bool _Process_Upload_Normal(System.Web.HttpContext hc, string qUploadDir){
string tDir;
//Get the upload directory from the config file if it was not given:
if (!_GetUploadDirectory(ref qUploadDir)){
//Error: no upload directory specified.
return false;
}
try {
tDir = hc.Server.MapPath(qUploadDir);
System.IO.Directory.CreateDirectory(tDir);
}catch{
//Error: could not Server.MapPath (or create) the upload directory specified.
return false;
}
//Loop through each file that has been posted.
//NB: enumerating HttpFileCollection yields its string keys, not the files,
//so index into the collection instead:
for (int i = 0; i < hc.Request.Files.Count; i++){
System.Web.HttpPostedFile oFile = hc.Request.Files[i];
//Allocate a byte buffer and read the uploaded file from the stream:
byte[] myData = new byte[oFile.ContentLength];
oFile.InputStream.Read(myData, 0, myData.Length);
//Make the save path:
string tPath = System.IO.Path.Combine(tDir, System.IO.Path.GetFileName(oFile.FileName));
//Create the file, write the data, and close it:
FileStream newFile = new FileStream(tPath, FileMode.Create);
newFile.Write(myData, 0, myData.Length);
newFile.Close();
}
return true;
}//Method:End
/// <summary>
/// Private Method. Used when Client sends file via ADOStream/BinaryXML technique.
/// </summary>
/// <param name="hc">Current HttpContext</param>
/// <param name="qUploadDir">Directory where Uploaded files are to be saved.</param>
/// <returns>True if upload and save was successful.</returns>
/// <remarks>
/// This is my preferred technique as it doesn't
/// require a Post/Refresh on each file upload.
/// But it does require JS code to be injected into the Client's page
/// in order to create the XML and send it to this IHttpHandler...
/// </remarks>
private static bool _Process_Upload_ADOStream(System.Web.HttpContext hc, string qUploadDir){
//Get the incoming stream:
System.IO.Stream tStreamIn = hc.Request.InputStream;
//Just in case it has been switched at some point?
hc.Response.Expires = 0;
hc.Response.ContentType = "text/xml";
//Create a Doc to read the XML stream that is being sent:
System.Xml.XmlDocument oXDocIn = new XmlDocument();
//Create a Doc to write the answer to:
System.Xml.XmlDocument oXMLReport = new XmlDocument();
//Structure we want to send back is:
//<report>
// <status>True</status>
// <status_msg>Doing Fine...</status_msg>
//</report>
System.Xml.XmlNode oReportRoot = oXMLReport.CreateElement("report");
oXMLReport.AppendChild(oReportRoot);
System.Xml.XmlNode oReportStatus = oXMLReport.CreateElement("status");
oReportRoot.AppendChild(oReportStatus);
System.Xml.XmlNode oReportStatusMsg = oXMLReport.CreateElement("status_msg");
oReportRoot.AppendChild(oReportStatusMsg);
//Load the incoming stream text all in one go into a new doc to parse it:
oXDocIn.LoadXml(new StreamReader(tStreamIn).ReadToEnd());
//Start counters, etc.
int tCountTotal=0;
int tCountSuccessful=0;
int tBufferTransferred=0;
bool tResult =true;
string tDir=string.Empty;
string tStatusMsg = string.Empty;

//Get the Upload directory from the config file if it was not given:
if (!_GetUploadDirectory(ref qUploadDir)){
tResult = false;
oReportStatusMsg.InnerText = "- Error: No Upload directory specified.";
//We normally would get out here since we are obviously not doing
//to well...but we want this error message to go out as an xml doc.
//so we need to finish up right to the end, skipping what we would
//have done if tResult were still True...
}
if (tResult){
try {
tDir = hc.Server.MapPath(qUploadDir);
}
catch {
//There was an error.
//The qUploadDir was not a relative path
//and therefore could not be converted by MapPath...
tResult = false;
oReportStatusMsg.InnerText = "- Error: Could not Server.MapPath the upload directory specified: " + qUploadDir;
}
}
if (tResult){
try {
if (!System.IO.Directory.Exists(tDir)){
//In case the directory doesn't exist, let's see
//if it can be created.
//Might fail if App hasn't been given the right to create
//new subdirectories by itself...
System.IO.Directory.CreateDirectory(tDir);
}
}catch{
//There was an error.
//Program probably doesn't have enough rights to create sub directories...
tResult = false;
oReportStatusMsg.InnerText = "- Error: Could not gain Server.MapPath to upload directory specified: " + qUploadDir;
}
}


//We are now ready to proceed and process the incoming xml
//file.
//We are expecting an incoming xml file with the following format
//(node names matching what the client-side JS sends):
//<root>
// <file>
//  <name>MyFile.dat</name>
//  <buffer>....</buffer>
// </file>
// <file>
//  <name>MyFile.dat</name>
//  <buffer>....</buffer>
// </file>
// ...etc...
//</root>
if (tResult==true){
//Are there any FILE subnodes?
//if not - get out.
tCountTotal = oXDocIn.DocumentElement.ChildNodes.Count;
tBufferTransferred = 0;
if (tCountTotal==0){
tResult = false;
oReportStatusMsg.InnerText = "- Error: No Child Nodes (file names/info) found.";
}
}
if (tResult){
//Loop through each FILE node:
foreach (System.Xml.XmlNode oInfo in oXDocIn.DocumentElement.ChildNodes){
tStatusMsg = string.Empty;
//Each node should have at least 2 subnodes:
//1) name (the filename),
//2) buffer (the base64-encoded contents)
if (oInfo.ChildNodes.Count >1){
//Get first node (FILENAME):
string tFileName = System.IO.Path.GetFileName(oInfo.ChildNodes[0].InnerText);
//Get second node (base64 BYTEBUFFER):
System.Byte[] tBuffer= System.Convert.FromBase64String(oInfo.ChildNodes[1].InnerText);
//Now that we have both parts:
//make an outgoing stream, write to it and save it to the directory
//then close the outgoing file stream:
System.IO.FileStream tStreamOut = null;
try {
string tPath = System.IO.Path.Combine(tDir, tFileName);
//Use File.Create (not OpenWrite) so an existing, longer file is truncated:
tStreamOut = System.IO.File.Create(tPath);
tStreamOut.Write(tBuffer,0,tBuffer.Length);
tCountSuccessful +=1;
tBufferTransferred +=tBuffer.Length;
}catch (System.Exception E){
tResult = false;
oReportStatusMsg.InnerText = "Error: " + E.Message;
break;
}
finally{
//The finally block runs on both paths, so the stream is closed exactly once:
if (tStreamOut != null){tStreamOut.Close();}
}
}
}//End Foreach
}
//Report string is splittable by spaces:
if (tResult){
oReportStatusMsg.InnerText = "+ Success: " + tCountSuccessful + " Uploaded. (" + tBufferTransferred + " bytes.)";
}
oReportStatus.InnerText = (tResult)?"1":"0";
//Write the xml of the whole outgoing xml doc to the outgoing stream:
hc.Response.Write(oXMLReport.OuterXml);
//Done.
return tResult;
}//Method:End
private static bool _GetUploadDirectory(ref string qUploadDir){
if (qUploadDir == string.Empty){
qUploadDir = System.Configuration.ConfigurationSettings.AppSettings["XActUploadDirectory"];
}
if ((qUploadDir == null)||(qUploadDir == string.Empty)){
return false;
}
return true;
}
/// <summary>
/// Deletes all files in a directory that are older than 12 hours.
/// </summary>
/// <param name="qUploadDir"></param>
/// <returns>-1 if Error, otherwise count of files actually deleted.</returns>
/// <remarks>
/// It goes without saying that this method should be used with EXTREME care!
/// Because it is so dangerous, it comes with two criteria built into it:
/// a) the directory name given must include the word 'upload' somewhere in it,
/// b) If the directory has files in it that are older than 7 days, then it's most
/// probably NOT a current/working upload directory, so nothing is deleted.
/// This might seem like a pain -- but the option of deleting your whole base virtual directory
/// by accident is a lot worse...
/// </remarks>
private static int _CleanUploadDir(string qUploadDir){
int tResult =-1;
if (!_GetUploadDirectory(ref qUploadDir)){
return tResult;
}
if (qUploadDir.ToUpper().IndexOf("UPLOAD") < 0){
return tResult;
}
//Cut-off times in the past (note: positive TimeSpans -- negative ones
//would put the cut-offs in the future and break the checks below):
System.DateTime tCheck = System.DateTime.Now.Subtract(new TimeSpan(12,0,0));
System.DateTime tCheckWeekAgo = System.DateTime.Now.Subtract(new TimeSpan(7,0,0,0));
System.DateTime tLastWriteTime = System.IO.Directory.GetLastWriteTime(qUploadDir);
if (tLastWriteTime < tCheckWeekAgo){
//Directory remained untouched for over a week.
//Not ok to continue.
return tResult;
}
//Time to look:
ArrayList tDeleteMe = new ArrayList();
string[] tFiles = System.IO.Directory.GetFiles(qUploadDir);
foreach (string tFileName in tFiles){
tLastWriteTime = System.IO.File.GetLastWriteTime(tFileName);
if (tLastWriteTime < tCheckWeekAgo){
//File has been there over a week.
//Alarm!
//We're not in a valid directory!!!
return tResult;
}
if (tLastWriteTime < tCheck){
//File has been there over [n] hours. Ok to delete at end:
tDeleteMe.Add(tFileName);
}
}
//Ok. Sounds like it's ok to delete after all:
//Set the counter:
tResult = 0;
foreach (string tFileName in tDeleteMe){
System.IO.File.Delete(tFileName);
tResult +=1;
}
//Get out:
return tResult;
}//Method:End
}
}
}//Namespace:End




JAVASCRIPT TO INJECT INTO PAGE:
function FileUploader(){
this._Name = "FileUploader";
//Initialize the fields (a bare "this.LocalPath;" is just a property
//access and does nothing):
this.LocalPath = null;
this.LocalFileName = null;
this.LastFilePath = null;
this.LastStatus = null;
this.LastStatusMsg = null;
this.Img = null;
this.ServerPath = "./XAct/Upload/";
}
FileUploader.prototype.Send=function(qFileName,qImgObject){
var oC = this;
oC.LocalPath = qFileName;
oC.LocalFileName = oC.LocalPath.substr(oC.LocalPath.lastIndexOf("\\")+1);
oC.Img = qImgObject;
//Allows for asynch going...nice
window.setTimeout(function(){oC._SendII()},100);
}
FileUploader.prototype._SendII=function(){
var oC = this;
var oDoc,oRoot,oFileInfo,oFileName, oFileBuffer,tStream,xmlhttp;
// create ADO-stream Object
try {
tStream = new ActiveXObject("ADODB.Stream");
oDoc = new ActiveXObject("MSXML.DOMDocument");
xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
}
catch (e){
alert ("Error uploading File:\n" + e.message);
return;
}
oRoot = oDoc.createElement("root");oDoc.appendChild(oRoot);
// specify namespaces datatypes
oDoc.documentElement.setAttribute("xmlns:dt", "urn:schemas-microsoft-com:datatypes");
oFileInfo = oDoc.createElement("file");oRoot.appendChild(oFileInfo);
oFileName = oDoc.createElement("name");oFileInfo.appendChild(oFileName);
oFileBuffer = oDoc.createElement("buffer");oFileInfo.appendChild(oFileBuffer);
oFileBuffer.dataType = "bin.base64";
//Get the LocalPath:
oFileName.text = oC.LocalPath;
//Get the File contents:
try {
tStream.Type = 1; // 1=adTypeBinary
tStream.Open();
tStream.LoadFromFile(oC.LocalPath);
oFileBuffer.nodeTypedValue = tStream.Read(-1); // -1=adReadAll
tStream.Close();
}catch (e){
alert ("Error encoding file: " + "\n" + e.message);
return;
}
// send the XML document to the Web server
xmlhttp.open("POST","./XAct.Uploader.aspx",false);
xmlhttp.send(oDoc);
oC.LastFilePath = oC.LocalPath;
oC.LocalPath = "";
var oReturned = new ActiveXObject("MSXML.DOMDocument");
oReturned.loadXML(xmlhttp.responseText);
var oRoot = oReturned.documentElement;
if (!oRoot){
oC.LastStatus = 0;
oC.LastStatusMsg = "Incorrect XML";
}else{
//NB: parse to a number -- the string "0" would otherwise be truthy:
oC.LastStatus = parseInt(oRoot.childNodes[0].text, 10);
oC.LastStatusMsg = oRoot.childNodes[1].text;
}
if (!oC.LastStatus){
alert(oC.LastStatusMsg);
}
if (oC.Img){
oC.Img.src = oC.ServerPath + oC.LocalFileName;
}
}
/*
function DoSend(qFileName,qCanvas){
var tObj = document.all[qFileName];
if (tObj){qFileName = tObj.value;}
var tObj = document.all[qCanvas];
if (tObj){qCanvas =tObj}
if (!qCanvas){qCanvas = document.body;}
oImg = document.createElement("IMG");
qCanvas.appendChild(oImg);
FU.Send(qFileName,oImg);
}
*/
 

John Saunders

Sky Sigal said:
Hi John:
Yes, of course the <input type=file> doesn't upload till post time -- but there are different ways
to upload, including using the ADOStream ActiveX objects with an IHttpHandler listening for them.
This way allows for uploading multiple files, and because it is asynchronous you don't have to wait
till it ends before you can continue typing...

Sky, I wish you had mentioned this earlier...
And even if I were using a plain input type=file, we don't always know how many files to upload --
sometimes one, sometimes 10 -- so this causes 2 problems:
a) an unknown number of type=file buttons to generate for a form.

I've seen a multiple upload control (from Microsoft's web site, I think)
b) the user has to wait a long time at the end to ensure the mail went out
successfully...

I'm not sure that a user should be surprised that uploading a large file
takes a long time.
There are more 'desktop feeling' ways of skinning that cat -- and they involve uploading to a dir
before the email has been finished:

I often worry about people who want to make the Web look just like the
desktop when it's not like the desktop. This may or may not be a source of
the problems you're seeing. At the very least, you may want to carefully
review the differences between a desktop solution and a web-based solution.
For instance, although a desktop solution has to deal with the user exiting
the program or turning off his PC, a web-based solution also has to consider
what happens if the network connection breaks or if the user starts a new
browser session.
 

Sky Sigal

Hi John --
Thanks for the code! Learnt something new here :)

One small thing I still have to work around: as I said in the first post, I wish to make this a
control that takes care of its own messes by default, without asking the end user to modify the
global.asax file or other files... So I have to figure out how to do the equivalent of the above
code, but all from within the Control dll I am making... Hmm...
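One way to get Joe's cache-callback trick without touching Global.asax is to register the callback from inside the handler's own ProcessRequest. A hedged sketch (method names are hypothetical; it assumes the handler can see session state, which for an IHttpHandler means also implementing IRequiresSessionState):

```
//Inside the handler class (which implements IRequiresSessionState so that
//hc.Session is available):
public void ProcessRequest(HttpContext hc) {
    string key = hc.Session.SessionID;
    //Only insert once per session -- re-inserting on every request would
    //reset the entry and pile up callbacks (this also answers the
    //"one wait per person" question):
    if (hc.Cache[key] == null) {
        hc.Cache.Insert(
            key, key, null,
            System.Web.Caching.Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(hc.Session.Timeout), //sliding, same as the session
            System.Web.Caching.CacheItemPriority.NotRemovable, //don't let scavenging fire it early
            new System.Web.Caching.CacheItemRemovedCallback(OnSessionExpired));
    }
    //...normal upload processing...
}

private static void OnSessionExpired(string key, object value,
    System.Web.Caching.CacheItemRemovedReason reason) {
    if (reason == System.Web.Caching.CacheItemRemovedReason.Expired) {
        //e.g. delete the uploaded files tracked for this SessionID here.
    }
}
```

Because the Cache is application-wide, the component DLL can do this on its own first request from each session, with no Global.asax changes required of the end user.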

Anyway -- very best, and thanks for the help!
Sky
 

Sky Sigal

Dear John:
On a totally different subject since I think we have exhausted this one :)
Been looking around for answers to another problem (see my post about adding buttons in PreRender,
which I added several hours ago...) and came across this post of yours:

Found at: http://www.dotnet247.com/247reference/msgs/53/266273.aspx
<<When I first begun .NET development, my initial instinct was to load
controls on page load as well, but that soon proved to be a pain. Since
then, I started using PreRender, and never looked back. Seems as though
you are spending the extra effort, going outside of the framework
provided, to achieve something you could have accomplished much faster
using the intrinsic event handlers and processing postback data on
Prerender.>>>

This seems close to an initial conclusion I came to at some point -- until I ran into trouble
adding controls that I want to track events on: event handlers don't fire if the controls are
added after Page_Load.

How are you handling the event wiring of buttons if you are adding them after Page_Load?

Very best,
Sky
 

Sky Sigal

Duh. The answer to my problem just dawned on me...

It works because of Session and the use of Dispose() to do the cleanup -- no need to modify
Global.asax or anything else:

a) Make a class that basically has one field, an ArrayList.
b) Override the class's Dispose() to loop through any members of this array -- which are
filenames -- and, if they exist, delete them.
c) Instantiate this class and stick it in the user's Session collection.
d) Every time the user uploads a file, add the filename (the server-side version of it) to the list:
((MyDumbClass)Session["FILEWATCHER"]).List.Add(FileNameJustUploaded)

When the session ends, it will dispose the Session instance, which will dispose the instance of
my DumbClass, which will... delete the files that were not already moved/deleted prior...

In other words, the file count in the dir should go back to 0... which is what I wanted in the
first place.
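A sketch of the class described above (names hypothetical). One caveat worth flagging: ASP.NET does not call Dispose() on session items by itself, so something still has to trigger the cleanup -- e.g. the cache-callback trick earlier in this thread, or the finalizer included here as a fallback once the abandoned item is garbage-collected:

```
using System;
using System.Collections;
using System.IO;

public class UploadFileWatcher : IDisposable {
    //Server-side paths of files uploaded during this session:
    public ArrayList List = new ArrayList();

    public void Dispose() {
        //Delete any files that were not already moved/deleted:
        foreach (string fileName in List) {
            if (File.Exists(fileName)) {
                File.Delete(fileName);
            }
        }
        List.Clear();
        GC.SuppressFinalize(this);
    }

    //Fallback for when nothing calls Dispose() explicitly:
    ~UploadFileWatcher() { Dispose(); }
}

//Usage, per the steps above (hypothetical session key):
//  Session["FILEWATCHER"] = new UploadFileWatcher();
//  ((UploadFileWatcher)Session["FILEWATCHER"]).List.Add(serverSideFileName);
```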

I could KICK MYSELF for not seeing this sooner!

Hmm. One last question about how Sessions are handled in ASP.NET during updates to its dlls:
if I upload a new build to the server while someone is uploading... will the user's session remain
intact -- or will it be disposed (and therefore delete his uploaded files) while it hot-swaps the
dlls?

And when else could the count of files left to delete get out of whack?
 
