executing multiple functions in background simultaneously

 
 
Catherine Moroney
 
      01-14-2009
Hello everybody,

I know how to spawn a sub-process and then wait until it
completes. I'm wondering if I can do the same thing with
a Python function.

I would like to spawn off multiple instances of a function
and run them simultaneously and then wait until they all complete.
Currently I'm doing this by calling them as sub-processes
executable from the command-line. Is there a way of accomplishing
the same thing without having to make command-line executables
of the function call?

I'm primarily concerned about code readability and ease of
programming. The code would look a lot prettier and be shorter
to boot if I could spawn off function calls rather than
subprocesses.

Thanks for any advice,

Catherine
 
 
 
 
 
James Mills
 
      01-14-2009
On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
<(E-Mail Removed)> wrote:
> I would like to spawn off multiple instances of a function
> and run them simultaneously and then wait until they all complete.
> Currently I'm doing this by calling them as sub-processes
> executable from the command-line. Is there a way of accomplishing
> the same thing without having to make command-line executables
> of the function call?


Try using the Python standard threading module.

Create multiple instances of Thread with target=your_function.
Maintain a list of these new Thread instances.
Join (wait) on them.

pydoc threading.Thread
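
A minimal sketch of that recipe (the worker function and the number of
threads here are just placeholders):

import threading

def work(n):
    print "worker", n, "running"

# one Thread per call; start them all, then wait for them all
threads = [threading.Thread(target=work, args=(n,)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()   # blocks until every worker has finished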

cheers
James
 
 
 
 
 
MRAB
 
      01-14-2009
James Mills wrote:
> On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
> <(E-Mail Removed)> wrote:
>> I would like to spawn off multiple instances of a function and run
>> them simultaneously and then wait until they all complete.
>> Currently I'm doing this by calling them as sub-processes
>> executable from the command-line. Is there a way of accomplishing
>> the same thing without having to make command-line executables of
>> the function call?

>
> Try using the Python standard threading module.
>
> Create multiple instances of Thread with target=your_function.
> Maintain a list of these new Thread instances.
> Join (wait) on them.
>
> pydoc threading.Thread
>

The disadvantage of threads in Python (CPython, actually) is that
there's the GIL (Global Interpreter Lock), so you won't get any speed
advantage if the threads are mostly processor-bound.
 
 
James Mills
 
      01-14-2009
On Wed, Jan 14, 2009 at 11:35 AM, MRAB <(E-Mail Removed)> wrote:
> The disadvantage of threads in Python (CPython, actually) is that
> there's the GIL (Global Interpreter Lock), so you won't get any speed
> advantage if the threads are mostly processor-bound.


The OP didn't really say what this function does. *sigh*

@OP: You have (at least in 2.6+) threading and multiprocessing modules
at your disposal.
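
For example, a sketch of the multiprocessing variant, which runs the
workers in separate processes and so sidesteps the GIL (the worker
function is again a placeholder):

import multiprocessing, os

def work(n):
    print "hello from process %d, job %d" % (os.getpid(), n)

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=work, args=(n,)) for n in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()   # wait for all child processes to exit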

--JamesMills
 
 
Michele Simionato
 
      01-14-2009
On Jan 14, 2:02 am, Catherine Moroney
<(E-Mail Removed)> wrote:
> Hello everybody,
>
> I know how to spawn a sub-process and then wait until it
> completes. I'm wondering if I can do the same thing with
> a Python function.
>
> I would like to spawn off multiple instances of a function
> and run them simultaneously and then wait until they all complete.
> Currently I'm doing this by calling them as sub-processes
> executable from the command-line. Is there a way of accomplishing
> the same thing without having to make command-line executables
> of the function call?
>
> I'm primarily concerned about code readability and ease of
> programming. The code would look a lot prettier and be shorter
> to boot if I could spawn off function calls rather than
> subprocesses.
>
> Thanks for any advice,
>
> Catherine


There is an example explaining how to implement exactly
this use case in the documentation of my decorator module:
http://pypi.python.org/pypi/decorator/3.0.0#async
The Async decorator works both with threads and with multiprocessing.
Here is an example of printing from multiple processes
(it assumes you have downloaded the tarball of the decorator
module, in which documentation.py is the file containing the
documentation and the Async decorator, and that you have the
multiprocessing module):

$ cat example.py
import os, multiprocessing
from documentation import Async

async = Async(multiprocessing.Process)

@async
def print_msg():
    print 'hello from process %d' % os.getpid()

for i in range(3):
    print_msg()

$ python example.py
hello from process 5903
hello from process 5904
hello from process 5905
 
 
Aaron Brady
 
      01-14-2009
On Jan 13, 7:02 pm, Catherine Moroney
<(E-Mail Removed)> wrote:
> Hello everybody,
>
> I know how to spawn a sub-process and then wait until it
> completes. I'm wondering if I can do the same thing with
> a Python function.
>
> I would like to spawn off multiple instances of a function
> and run them simultaneously and then wait until they all complete.
> Currently I'm doing this by calling them as sub-processes
> executable from the command-line. Is there a way of accomplishing
> the same thing without having to make command-line executables
> of the function call?
>
> I'm primarily concerned about code readability and ease of
> programming. The code would look a lot prettier and be shorter
> to boot if I could spawn off function calls rather than
> subprocesses.
>
> Thanks for any advice,
>
> Catherine


'multiprocessing' does what you mentioned, as others said. The
abstraction layer is solid, which makes your code pretty. However, it
just creates a command line like this:

'"c:\\programs\\python26\\python.exe" "-c" "from
multiprocessing.forking import main; main()" "--multiprocessing-fork"
"1916"'

The handle '1916' is a pipe used to read further instructions. They
arrive in 'main()' in the form of a pickled (serialized) dictionary.
In it, the 'main_path' key contains the path to your program.
'main()' calls the 'prepare()' function, which calls 'imp.find_module'
using that path. Pretty sophisticated.

You can do it yourself by creating your own command line. Create a
subprocess by this command line (untested & lots of caveats):

'"c:\\programs\\python26\\python.exe" "-c" "from myprogram import
myfunc; myfunc()"'

But you have practically no communication with it. If you need
parameters, you can include them on the command line, since you're
building it yourself (untested & highly vulnerable):

'"c:\\programs\\python26\\python.exe" "-c" "from myprogram import
myfunc; myfunc( literal1, literal2 )"'

For a return value, unless it can be a simple exit code, you'll need a
communication channel. For it, a socket wouldn't be bad, or a pipe if
you're not on Windows (include the port or descriptor on the command
line). (Even with 'multiprocessing', you're limited to pickleable
objects, however, I believe.)
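
A sketch of that do-it-yourself return channel using a stdout pipe
(untested, in the same spirit as the command lines above; 'myprogram'
and 'myfunc' are the hypothetical names from those examples, and the
child simply prints its result for the parent to read back):

import subprocess, sys

# build the command line ourselves and capture the child's stdout
cmd = [sys.executable, "-c",
       "from myprogram import myfunc; print myfunc(1, 2)"]
child = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output, _ = child.communicate()   # wait for the child and grab stdout
print "child said:", output.strip()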
 
 
brooklineTom
 
      01-14-2009
> The disadvantage of threads in Python (CPython, actually) is that
> there's the GIL (Global Interpreter Lock), so you won't get any speed
> advantage if the threads are mostly processor-bound.


On a single-processor machine with compute-bound threads, I don't think
the GIL is the bottleneck. No matter how you slice it, there's still only
one CPU.

It might be interesting to see what it takes to make CPython do
something useful with multicore machines, perhaps using approaches
similar to that offered by Cilk Arts (http://www.cilk.com).
 
 
Catherine Moroney
 
      01-14-2009
James Mills wrote:
> On Wed, Jan 14, 2009 at 11:02 AM, Catherine Moroney
> <(E-Mail Removed)> wrote:
>> I would like to spawn off multiple instances of a function
>> and run them simultaneously and then wait until they all complete.
>> Currently I'm doing this by calling them as sub-processes
>> executable from the command-line. Is there a way of accomplishing
>> the same thing without having to make command-line executables
>> of the function call?

>
> Try using the Python standard threading module.
>
> Create multiple instances of Thread with target=your_function.
> Maintain a list of these new Thread instances.
> Join (wait) on them.
>
> pydoc threading.Thread
>
> cheers
> James


What is the proper syntax to use if I wish to return variables
from a function run as a thread?

For example, how do I implement the following code to return
the variable "c" from MyFunc for later use in RunThreads?
Trying to return anything from the threading.Thread call results
in an "unpack non-sequence" error.

import threading, sys

def MyFunc(a, b):
    c = a + b
    print "c =", c
    return c

def RunThreads():
    args = (1, 2)
    threading.Thread(target=MyFunc, args=args).start()

if __name__ == "__main__":
    RunThreads()
    sys.exit()
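
A minimal sketch of one common answer, assuming the result is handed
back through a Queue.Queue rather than a return statement (Thread
itself discards the target's return value):

import threading, Queue

def MyFunc(a, b, results):
    c = a + b
    results.put(c)        # hand the result back via the queue

def RunThreads():
    results = Queue.Queue()
    t = threading.Thread(target=MyFunc, args=(1, 2, results))
    t.start()
    t.join()              # wait for the worker to finish
    print "c =", results.get()

if __name__ == "__main__":
    RunThreads()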

 
 
James Mills
 
      01-15-2009
Speaking of threading...

http://codepad.org/dvxwAphE

Just a really interesting way of doing this.

cheers
James

--
-- "Problems are solved by method"
 