executing multiple functions in background simultaneously

Discussion in 'Python' started by Catherine Moroney, Jan 14, 2009.

  1. Hello everybody,

    I know how to spawn a sub-process and then wait until it
    completes. I'm wondering if I can do the same thing with
    a Python function.

    I would like to spawn off multiple instances of a function
    and run them simultaneously and then wait until they all complete.
    Currently I'm doing this by calling them as sub-processes
    executable from the command-line. Is there a way of accomplishing
    the same thing without having to make command-line executables
    of the function call?

    I'm primarily concerned about code readability and ease of
    programming. The code would look a lot prettier and be shorter
    to boot if I could spawn off function calls rather than
    subprocesses.

    Thanks for any advice,

    Catherine
     
    Catherine Moroney, Jan 14, 2009
    #1

  2. Catherine Moroney

    James Mills Guest

    On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
    <> wrote:
    > I would like to spawn off multiple instances of a function
    > and run them simultaneously and then wait until they all complete.
    > Currently I'm doing this by calling them as sub-processes
    > executable from the command-line. Is there a way of accomplishing
    > the same thing without having to make command-line executables
    > of the function call?


    Try using the python standard threading module.

    Create multiple instances of Thread with target=your_function
    Maintain a list of these new Thread instances
    Join (wait) on them.

    pydoc threading.Thread

    cheers
    James
     
    James Mills, Jan 14, 2009
    #2
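James's recipe above (create the threads with target=your_function, keep them in a list, join them all) can be sketched as follows; this is not code from the thread, and `work` and its argument are placeholder names, shown in modern Python 3 syntax:

```python
import threading

def work(n):
    # stand-in for the OP's function; name and argument are illustrative
    print("running task", n)

# create one Thread per task and keep them in a list
threads = [threading.Thread(target=work, args=(n,)) for n in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # blocks until that thread has finished
print("all done")
```

The join loop is what gives the "wait until they all complete" behaviour the OP asked for.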

  3. Catherine Moroney

    MRAB Guest

    James Mills wrote:
    > On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
    > <> wrote:
    >> I would like to spawn off multiple instances of a function and run
    >> them simultaneously and then wait until they all complete.
    >> Currently I'm doing this by calling them as sub-processes
    >> executable from the command-line. Is there a way of accomplishing
    >> the same thing without having to make command-line executables of
    >> the function call?

    >
    > Try using the python standard threading module.
    >
    > Create multiple instances of Thread with target=your_function
    > Maintain a list of these new Thread instances; join (wait) on them.
    >
    > pydoc threading.Thread
    >

    The disadvantage of threads in Python (CPython, actually) is that
    there's the GIL (Global Interpreter Lock), so you won't get any speed
    advantage if the threads are mostly processor-bound.
     
    MRAB, Jan 14, 2009
    #3
  4. Catherine Moroney

    James Mills Guest

    On Wed, Jan 14, 2009 at 11:35 AM, MRAB <> wrote:
    > The disadvantage of threads in Python (CPython, actually) is that
    > there's the GIL (Global Interpreter Lock), so you won't get any speed
    > advantage if the threads are mostly processor-bound.


    The OP didn't really say what this function
    does :) *sigh*

    @OP: You have (at least in 2.6+) threading and multiprocessing modules
    at your disposal.

    --JamesMills
     
    James Mills, Jan 14, 2009
    #4
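For the processor-bound case MRAB warns about, the multiprocessing module James mentions has a nearly identical feel; a minimal sketch (not from the thread, in Python 3 syntax, with `square` and its inputs as illustrative placeholders):

```python
import multiprocessing

def square(n):
    # stand-in CPU-bound task; the name is illustrative
    return n * n

def run_pool():
    # separate worker processes sidestep the GIL for processor-bound work
    with multiprocessing.Pool(processes=3) as pool:
        return pool.map(square, [1, 2, 3])

if __name__ == "__main__":
    print(run_pool())  # prints [1, 4, 9]
```

The `if __name__ == "__main__"` guard matters here: child processes may re-import the module, and the guard keeps them from spawning pools of their own.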
  5. On Jan 14, 2:02 am, Catherine Moroney
    <> wrote:
    > Hello everybody,
    >
    > I know how to spawn a sub-process and then wait until it
    > completes.  I'm wondering if I can do the same thing with
    > a Python function.
    >
    > I would like to spawn off multiple instances of a function
    > and run them simultaneously and then wait until they all complete.
    > Currently I'm doing this by calling them as sub-processes
    > executable from the command-line.  Is there a way of accomplishing
    > the same thing without having to make command-line executables
    > of the function call?
    >
    > I'm primarily concerned about code readability and ease of
    > programming.  The code would look a lot prettier and be shorter
    > to boot if I could spawn off function calls rather than
    > subprocesses.
    >
    > Thanks for any advice,
    >
    > Catherine


    There is an example explaining how to implement exactly
    this use case in the documentation of my decorator module:
    http://pypi.python.org/pypi/decorator/3.0.0#async
    The Async decorator works both with threads and with multiprocessing.
    Here is an example of printing from multiple processes
    (it assumes you downloaded the tarball of the decorator
    module, documentation.py is the file containing the documentation
    and the Async decorator; it also assumes you have the multiprocessing
    module):

    $ cat example.py
    import os, multiprocessing
    from documentation import Async

    async = Async(multiprocessing.Process)

    @async
    def print_msg():
        print 'hello from process %d' % os.getpid()

    for i in range(3):
        print_msg()

    $ python example.py
    hello from process 5903
    hello from process 5904
    hello from process 5905
     
    Michele Simionato, Jan 14, 2009
    #5
  6. Catherine Moroney

    Aaron Brady Guest

    On Jan 13, 7:02 pm, Catherine Moroney
    <> wrote:
    > Hello everybody,
    >
    > I know how to spawn a sub-process and then wait until it
    > completes.  I'm wondering if I can do the same thing with
    > a Python function.
    >
    > I would like to spawn off multiple instances of a function
    > and run them simultaneously and then wait until they all complete.
    > Currently I'm doing this by calling them as sub-processes
    > executable from the command-line.  Is there a way of accomplishing
    > the same thing without having to make command-line executables
    > of the function call?
    >
    > I'm primarily concerned about code readability and ease of
    > programming.  The code would look a lot prettier and be shorter
    > to boot if I could spawn off function calls rather than
    > subprocesses.
    >
    > Thanks for any advice,
    >
    > Catherine


    'multiprocessing' does what you mentioned, as others said. The
    abstraction layer is solid, which makes your code pretty. However, it
    just creates a command line like this:

    '"c:\\programs\\python26\\python.exe" "-c" "from
    multiprocessing.forking import main; main()" "--multiprocessing-fork"
    "1916"'

    The handle '1916' is a pipe used to read further instructions. They
    arrive in 'main()' in the form of a pickled (serialized) dictionary.
    In it, the 'main_path' key contains the path to your program.
    'main()' calls the 'prepare()' function, which calls
    'imp.find_module', using that path. Pretty sophisticated.

    You can do it yourself by creating your own command line. Create a
    subprocess by this command line (untested & lots of caveats):

    '"c:\\programs\\python26\\python.exe" "-c" "from myprogram import
    myfunc; myfunc()"'

    But you have practically no communication with it. If you need
    parameters, you can include them on the command line, since you're
    building it yourself (untested & highly vulnerable):

    '"c:\\programs\\python26\\python.exe" "-c" "from myprogram import
    myfunc; myfunc( literal1, literal2 )"'

    For a return value, unless it can be a simple exit code, you'll need a
    communication channel. For it, a socket wouldn't be bad, or a pipe if
    you're not on Windows (include the port or descriptor on the command
    line). (Even with 'multiprocessing', you're limited to pickleable
    objects, however, I believe.)
     
    Aaron Brady, Jan 14, 2009
    #6
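The communication channel Aaron describes can be sketched with multiprocessing's own Pipe, which handles the pickling for you; this is a sketch rather than code from the thread, in Python 3 syntax, with `worker` and the operands as illustrative names:

```python
import multiprocessing

def worker(conn, a, b):
    # anything sent through the pipe must be pickleable
    conn.send(a + b)
    conn.close()

def run():
    parent_end, child_end = multiprocessing.Pipe()
    p = multiprocessing.Process(target=worker, args=(child_end, 1, 2))
    p.start()
    result = parent_end.recv()  # blocks until the child sends
    p.join()
    return result

if __name__ == "__main__":
    print(run())  # prints 3
```

This avoids building command lines by hand while still getting a real return value, subject to the pickleability limit Aaron notes.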
  7. Catherine Moroney

    brooklineTom Guest

    > The disadvantage of threads in Python (CPython, actually) is that
    > there's the GIL (Global Interpreter Lock), so you won't get any speed
    > advantage if the threads are mostly processor-bound.


    On a single-processor machine with compute-bound threads, I don't
    think the GIL is the bottleneck. No matter how you slice it, there's
    still only one CPU.

    It might be interesting to see what it takes to make CPython do
    something useful with multicore machines, perhaps using approaches
    similar to that offered by Cilk Arts (http://www.cilk.com).
     
    brooklineTom, Jan 14, 2009
    #7
  8. James Mills wrote:
    > On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
    > <> wrote:
    >> I would like to spawn off multiple instances of a function
    >> and run them simultaneously and then wait until they all complete.
    >> Currently I'm doing this by calling them as sub-processes
    >> executable from the command-line. Is there a way of accomplishing
    >> the same thing without having to make command-line executables
    >> of the function call?

    >
    > Try using the python standard threading module.
    >
    > Create multiple instances of Thread with target=your_function
    > Maintain a list of these new Thread instances
    > Join (wait) on them.
    >
    > pydoc threading.Thread
    >
    > cheers
    > James


    What is the proper syntax to use if I wish to return variables
    from a function run as a thread?

    For example, how do I implement the following code to return
    the variable "c" from MyFunc for later use in RunThreads?
    Trying to return anything from the threading.Thread call results
    in a "unpack non-sequence" error.

    import threading, sys

    def MyFunc(a, b):
        c = a + b
        print "c =", c
        return c

    def RunThreads():
        args = (1, 2)
        threading.Thread(target=MyFunc, args=args).start()

    if __name__ == "__main__":
        RunThreads()
        sys.exit()
     
    Catherine Moroney, Jan 14, 2009
    #8
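Thread discards the target's return value, which is why trying to unpack anything from it fails; a common workaround (a sketch in Python 3 syntax, not from the thread, with names adapted from the code above) is to hand results back through a Queue:

```python
import threading
import queue

def my_func(a, b, results):
    # put the result on the queue instead of returning it
    results.put(a + b)

def run_threads():
    results = queue.Queue()
    threads = [threading.Thread(target=my_func, args=(i, i + 1, results))
               for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # drain the queue only after every worker has finished
    return [results.get() for _ in threads]

if __name__ == "__main__":
    print(sorted(run_threads()))  # prints [1, 3, 5]
```

queue.Queue is thread-safe, so the workers can all put to it without extra locking; sorting is only to make the output deterministic, since thread completion order is not.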
  10. Catherine Moroney

    James Mills Guest

    Speaking of Threading ..

    http://codepad.org/dvxwAphE

    Just a really interesting way of doing this :)

    cheers
    James

    --
    -- "Problems are solved by method"
     
    James Mills, Jan 15, 2009
    #10
