Serialization format of gdb trace data between many different machines.

Discussion in 'Python' started by joshbaptiste, Apr 15, 2009.

  1. joshbaptiste

    joshbaptiste Guest

    I have a corewatch.py program on ~400+ Linux machines that collects
    core-file gdb output and sends the traces to developers via email. I
    want to convert the script so that all output is sent to a
    centralized server, also written in Python, and displayed on a web
    page. My first thought was a simple HTTP POST to the server from
    all nodes, but since the gdb output can be very large, multiplied
    by the number of servers, I worry that 400+ nodes sending to 1
    server may saturate the network.

    What would be the most efficient/smallest serialization format to
    use to send this trace data from the nodes to the server over the
    network? Would pickling be OK?
     
    joshbaptiste, Apr 15, 2009
    #1

  2. MRAB

    MRAB Guest

    joshbaptiste wrote:
    > [snip]
    > What would be the most efficient/smallest serialization format to
    > use to send this trace data from the nodes to the server over the
    > network? Would pickling be OK?

    I'd probably compress the data first using the zipfile module.
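A minimal sketch of that zipfile approach, compressing the trace in memory so the raw archive bytes can be used as an HTTP POST body (the member name `trace.txt` and the helper names are illustrative, not from the thread):

```python
import io
import zipfile

def compress_trace(trace_text):
    """Pack a gdb trace into an in-memory zip archive.

    Returns the raw bytes of the archive, suitable as the body
    of an HTTP POST. Backtrace text is highly repetitive, so
    DEFLATE typically shrinks it dramatically.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("trace.txt", trace_text)
    return buf.getvalue()

def decompress_trace(payload):
    """Reverse compress_trace() on the server side."""
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        return zf.read("trace.txt").decode()
```

Since the trace is a single text blob, `zlib.compress()`/`gzip` would work just as well; zipfile mainly buys you a named archive member and a familiar container format.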
     
    MRAB, Apr 15, 2009
    #2

  3. joshbaptiste

    joshbaptiste Guest

    On Apr 15, 2:21 pm, MRAB <> wrote:
    > joshbaptiste wrote:
    > [snip]
    > > What would be the most efficient/smallest serialization format
    > > to use to send this trace data from the nodes to the server
    > > over the network? Would pickling be OK?
    >
    > I'd probably compress the data first using the zipfile module.


    ok.. zipfile, sounds good thanks..
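For what it's worth, posting the zipped payload from a node could then look something like the sketch below (the collector URL, endpoint, and `application/zip` content type are assumptions, not anything specified in the thread):

```python
import urllib.request

def build_upload_request(url, payload):
    """Build a POST request carrying compressed trace bytes.

    payload: the raw archive bytes (e.g. produced with the
    zipfile module). The server would unzip the body and store
    or display the trace.
    """
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/zip"},
        method="POST",
    )

# On a node, the upload itself would be:
#   urllib.request.urlopen(build_upload_request(url, payload))
```

Separating request construction from the network call keeps the node-side script easy to test without a live collector.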
     
    joshbaptiste, Apr 15, 2009
    #3
