Serialization format of gdb trace data between many different machines


joshbaptiste

I have a corewatch.py program on ~400+ Linux machines that collects
core file gdb output and sends the traces via email to developers. I
want to convert the script so that all output is sent to a centralized
server, also written in Python, and displayed via a webpage. My first
thought was a simple HTTP POST to the server from all nodes, but since
the gdb output can be very large, multiplied by the # of servers, I
worry that 400+ nodes sending to 1 server may saturate the network.

What would be the most efficient/smallest serialization format to use
to send this trace data from the nodes to the server over the
network? Would pickling be OK?
 

MRAB

joshbaptiste said:
I have a corewatch.py program on ~400+ Linux machines that collects
core file gdb output and sends the traces via email to developers. I
want to convert the script so that all output is sent to a centralized
server, also written in Python, and displayed via a webpage. My first
thought was a simple HTTP POST to the server from all nodes, but since
the gdb output can be very large, multiplied by the # of servers, I
worry that 400+ nodes sending to 1 server may saturate the network.

What would be the most efficient/smallest serialization format to use
to send this trace data from the nodes to the server over the
network? Would pickling be OK?
I'd probably compress the data first using the zipfile module.
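A minimal sketch of that suggestion: since gdb backtraces are plain text (so no pickling is needed), each node can compress the text in memory with the zipfile module and POST the resulting bytes. The sample trace string here is made up for illustration.

```python
import io
import zipfile

# Hypothetical gdb backtrace output (repetitive text compresses well)
trace = "#0  0x0000 in main () at main.c:42\n" * 500

# Compress in memory before sending; no temp file needed
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("trace.txt", trace)
payload = buf.getvalue()  # bytes suitable for an HTTP POST body

# The server recovers the original text the same way
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    restored = zf.read("trace.txt").decode()

assert restored == trace
assert len(payload) < len(trace)
```

The plain zlib or gzip modules would also work for a single blob; zipfile just has the convenience of naming each trace inside the archive.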
 
