Serialization format of gdb trace data between many different machines.

Discussion in 'Python' started by joshbaptiste, Apr 15, 2009.

  1. joshbaptiste

    joshbaptiste Guest

    I have a corewatch.py program on ~400+ Linux machines that collects
    core file gdb output and sends the traces via email to
    developers. I want to convert the script so that all output is sent to
    a centralized server, also written in Python, and displayed on a
    webpage. My first thought was a simple HTTP POST to the server from all
    nodes, but the gdb output can be very large, and multiplied by the
    number of nodes, I worry that 400+ machines sending to 1 server may
    cause network saturation.

    What would be the most efficient/smallest serialization format to use
    to send this trace data from the nodes to the server over the network?
    Would pickling be OK?
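    For a single blob of text, pickling buys nothing; the win is
    compression. A minimal sketch of the client side, assuming a
    hypothetical collection endpoint (the URL and header names here are
    illustrative, not part of the original setup):

    ```python
    import gzip
    import socket
    import urllib.request

    def compress_trace(trace_text: str) -> bytes:
        """gzip-compress a gdb trace; text-heavy output shrinks dramatically."""
        return gzip.compress(trace_text.encode("utf-8"))

    def post_trace(trace_text: str,
                   url: str = "http://collector.example/upload") -> None:
        """POST the compressed trace to the (hypothetical) central server."""
        body = compress_trace(trace_text)
        req = urllib.request.Request(
            url,
            data=body,
            headers={
                "Content-Type": "application/octet-stream",
                "Content-Encoding": "gzip",
                # identify which node sent the trace (assumed convention)
                "X-Hostname": socket.gethostname(),
            },
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            resp.read()
    ```

    The server can recover the text with gzip.decompress(); adding a
    small random delay on each node before posting would also spread the
    load so the 400+ machines don't all hit the server at once.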
     
    joshbaptiste, Apr 15, 2009
    #1

  2. MRAB

    MRAB Guest

    I'd probably compress the data first using the zipfile module.
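    For example, a sketch with the zipfile module, building the archive
    in memory (the entry name "core.trace" is just illustrative):

    ```python
    import io
    import zipfile

    def zip_trace(name: str, trace_text: str) -> bytes:
        """Pack a single gdb trace into an in-memory zip archive."""
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
            zf.writestr(name, trace_text)
        return buf.getvalue()

    def unzip_trace(data: bytes, name: str) -> str:
        """Recover the trace on the server side."""
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            return zf.read(name).decode("utf-8")
    ```

    The zip bytes can then be sent as the POST body; one advantage over
    raw gzip is that several traces can be bundled into one archive if a
    node batches its uploads.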
     
    MRAB, Apr 15, 2009
    #2

  3. joshbaptiste

    joshbaptiste Guest

    ok.. zipfile, sounds good thanks..
     
    joshbaptiste, Apr 15, 2009
    #3
