distributing modules to machines


r1pp3r

I'm in the process of designing a build system written in Python.
It runs from a central server, with various build machines hosting
server processes that are also written in Python. Pyro is the chosen
RPC mechanism. What I would like to do is have the system update
itself on demand: that is, pass pickled objects (the code comprising
the server) down the pipeline to each server, have them copied to the
right place, and then restart the server so that it runs the new
code. Is this feasible? Are there any other issues I should be aware
of?
 

Fredrik Lundh

r1pp3r said:
What I would like to do is have the system update itself on demand:
that is, pass pickled objects (the code comprising the server) down
the pipeline to each server, have them copied to the right place, and
then restart the server so that it runs the new code.

should work (assuming you trust the various systems involved, and your
own ability to avoid deploying broken code)

using a custom protocol for this sounds like slight overkill, though.
I would probably use rsync (over ssh) at regular intervals, or,
if we're only talking about small amounts of code, a bootstrap script
that fetches the current version over http(s) every time the server starts.
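
something along these lines would do as a bootstrap (a sketch only:
the URL and file name are made up, and there's no error handling):

    import os
    import sys
    import urllib.request

    CODE_URL = "https://buildmaster/server.py"  # hypothetical location
    LOCAL_PATH = os.path.join(
        os.path.dirname(os.path.abspath(__file__)), "server.py")

    def fetch_latest():
        # unconditionally download the current server code at startup
        with urllib.request.urlopen(CODE_URL) as resp:
            data = resp.read()
        with open(LOCAL_PATH, "wb") as f:
            f.write(data)

    if __name__ == "__main__":
        fetch_latest()
        # hand control to the freshly fetched server code
        os.execv(sys.executable, [sys.executable, LOCAL_PATH])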

(in the http case, you can use etag/if-modified-since to avoid
downloading things if they haven't changed, but if you're on a fast
network, that probably won't matter much).
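
a rough sketch of such a conditional fetch, using urllib from the
standard library (the cache-file name is my own choice):

    import urllib.error
    import urllib.request

    def fetch_if_changed(url, etag_file=".etag"):
        # send the cached ETag, if any, as If-None-Match
        req = urllib.request.Request(url)
        try:
            with open(etag_file) as f:
                req.add_header("If-None-Match", f.read().strip())
        except OSError:
            pass  # no cached ETag yet; do an unconditional fetch
        try:
            with urllib.request.urlopen(req) as resp:
                data = resp.read()
                etag = resp.headers.get("ETag")
        except urllib.error.HTTPError as e:
            if e.code == 304:
                return None  # not modified; keep the code we already have
            raise
        if etag:
            with open(etag_file, "w") as f:
                f.write(etag)
        return data  # new version downloaded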

</F>
 
