What's the best way to send large amounts of data from .NET to a Linux platform?

Discussion in 'ASP .Net Web Services' started by Jonah Olsson, Jun 10, 2004.

  1. Jonah Olsson

    Jonah Olsson Guest

    Dear All,

    I'm currently developing a solution where large amounts of personalised
    emails are created (and no, this is not spam...) on the ASP.NET platform
    and then delivered by a Debian Linux server running Qmail and MySQL.
    Currently the .NET application just connects to the SMTP port on the
    Linux server and sends each mail one by one. This creates an awful lot of
    traffic and isn't really a good way of handling more than 100,000
    emails/month.
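
    For reference, the .NET sending code is roughly along these lines today (a
    minimal sketch using the 1.x System.Web.Mail classes; the host name and
    addresses are placeholders, not the real values):

        using System.Web.Mail;

        public class OneByOneSender
        {
            public static void Send(string to, string subject, string body)
            {
                MailMessage msg = new MailMessage();
                msg.From = "newsletter@example.com";      // placeholder sender
                msg.To = to;
                msg.Subject = subject;
                msg.Body = body;

                SmtpMail.SmtpServer = "mail.example.org"; // the Debian/Qmail box
                SmtpMail.Send(msg);                       // one SMTP delivery per call
            }
        }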

    I would like a solution where all this data is first prepared on the .NET
    platform and then transferred to the Linux platform to be handled and
    sent there. But how do I make this both secure/reliable and efficient?

    So basically I have two questions:

    Should I prepare a large XML dataset and ship it to the Linux server to be
    handled locally (Perl + MySQL + Qmail)? This would need some kind of status
    check, since if the Linux server went down, some of the mail might already
    have been sent. (See the export sketch below.)

    Can I use Web Services here? If so, I suppose I should create two Web
    Services: one on the Linux platform to receive the dataset with the
    personalised emails, and one on the .NET platform to receive status and
    results.
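
    To illustrate the first option, here is a minimal sketch of how such a
    batch could be serialised on the .NET side before shipping it over (table,
    column and file names are made up for illustration):

        using System.Data;

        public class BatchExport
        {
            public static void WriteBatch()
            {
                // Build the batch; in reality the rows would come from the database.
                DataSet batch = new DataSet("MailBatch");
                DataTable t = batch.Tables.Add("Messages");
                t.Columns.Add("Recipient", typeof(string));
                t.Columns.Add("Subject", typeof(string));
                t.Columns.Add("Body", typeof(string));
                t.Rows.Add(new object[] { "someone@example.com", "Hello", "Personalised text..." });

                // Write the schema too, so the Perl script on the Linux box
                // knows the column layout when it parses the file.
                batch.WriteXml("outbox-batch.xml", XmlWriteMode.WriteSchema);
            }
        }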

    Am I missing something here? Qmail is currently the most reliable part of
    this, I think, since it basically never loses mail even if the network or
    server goes down. But the data sent to Qmail might be lost due to network
    trouble etc., and this is an important part of the problem.

    Does anyone have similar experience?
    Thanks for any kind of help/hints!

    Best regards
    Jonah Olsson
    Generation Software
     
    Jonah Olsson, Jun 10, 2004
    #1

  2. Kevin Spencer

    Kevin Spencer Guest

    Hi Jonah,

    You could certainly use Web Services, but you only need the service on one
    end. The client that consumes the service is at the other end and calls a
    WebMethod on the Web Service. If the service is on the Windows machine, the
    method can return the data needed by the Unix machine. However, you should
    be aware that by having the machines use a Web Service to ship all the
    emails to the Unix machine, and having that machine send them all at once,
    you may actually end up with MORE total processing and memory usage across
    both computers than with the simpler method you're already using. You're
    adding an extra SOAP layer to the process. On the other hand, if one or the
    other of the machines is under heavy load, you may be able to balance it
    out somewhat by using more of the other machine's resources.
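
    As a rough sketch of that layout (the names and details here are made up,
    not a finished design), the Windows machine could expose something like
    this, and the Unix side could pull batches from it with any SOAP client:

        using System.Data;
        using System.Web.Services;

        public class MailBatchService : WebService
        {
            [WebMethod]
            public DataSet GetPendingMessages(int maxCount)
            {
                // A real implementation would read up to maxCount queued,
                // personalised messages from the database and return them here.
                DataSet batch = new DataSet("MailBatch");
                return batch;
            }
        }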

    Good question!

    --
    HTH,
    Kevin Spencer
    .Net Developer
    Microsoft MVP
    Big things are made up
    of lots of little things.

    "Jonah Olsson" <> wrote in message
    news:...
     
    Kevin Spencer, Jun 10, 2004
    #2

  3. bruce barker

    bruce barker Guest

    It's hard to believe you could come up with something better. SMTP mail is
    pretty simple: you do a socket connect and send the data. The SMTP daemon
    just writes the data to a directory (after validating the headers). Another
    daemon scans the directory for new mail and sends it on its way. This is
    why spamming is so cheap.
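
    just to show how little is involved, here's a bare-bones sketch of that
    socket exchange from .NET (no response-code checking or error handling;
    host and addresses are placeholders):

        using System.IO;
        using System.Net.Sockets;
        using System.Text;

        public class RawSmtp
        {
            public static void Send(string host, string from, string to, string data)
            {
                // connect to the SMTP port and walk through the usual dialogue
                TcpClient client = new TcpClient(host, 25);
                StreamReader reader = new StreamReader(client.GetStream(), Encoding.ASCII);
                StreamWriter writer = new StreamWriter(client.GetStream(), Encoding.ASCII);
                writer.AutoFlush = true;

                reader.ReadLine();                            // 220 greeting
                writer.WriteLine("HELO example.com");         reader.ReadLine();
                writer.WriteLine("MAIL FROM:<" + from + ">"); reader.ReadLine();
                writer.WriteLine("RCPT TO:<" + to + ">");     reader.ReadLine();
                writer.WriteLine("DATA");                     reader.ReadLine();
                writer.WriteLine(data);                       // headers + body
                writer.WriteLine(".");                        reader.ReadLine();
                writer.WriteLine("QUIT");
                client.Close();
            }
        }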

    -- bruce (sqlwork.com)


    "Jonah Olsson" <> wrote in message
    news:...
     
    bruce barker, Jun 10, 2004
    #3
  4. Jonah Olsson

    Jonah Olsson Guest

    Hi Kevin and thanks for your reply.
    I'm sorry I haven't responded earlier, but I'm on a short vacation.

    I now realise that such a solution as discussed above will probably
    require a lot more system resources (and development resources as well)
    than the current version (or a slightly modified one).
    Maybe I should stick to an SMTP connection and let Qmail do the entire
    queuing, like Bruce Barker suggested in his reply?

    However, a Web Service on the .NET server would probably be well suited to
    receiving bounce statistics from the Linux mail server!
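
    Something along these lines is what I have in mind (just a sketch; the
    class, method and parameter names are made up):

        using System;
        using System.Web.Services;

        public class BounceReportService : WebService
        {
            [WebMethod]
            public void ReportBounce(string recipient, string reason, DateTime bouncedAt)
            {
                // A real implementation would store the bounce here so the
                // sending application can suppress or flag this address in
                // future mailings.
            }
        }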

    Thanks!
    /Jonah

    "Kevin Spencer" <> wrote in message
    news:u$...
     
    Jonah Olsson, Jun 13, 2004
    #4
  5. Jonah Olsson

    Jonah Olsson Guest

    Hi Bruce and thanks for your reply.

    So basically there will be no trouble sending 30,000+ mails in a row (as
    they're being created) to the Linux (mail) server?

    /Jonah


    "bruce barker" <> skrev i meddelandet
    news:...
     
    Jonah Olsson, Jun 13, 2004
    #5
