Velocity Reviews - Computer Hardware Reviews

Re: run a function in another processor in python

 
 
Astan Chee
Guest
Posts: n/a
 
      12-09-2010
Thanks but I'm having trouble with that module too. Currently what I
have is something like this:

import sys
import os
import multiprocessing

import time

def functionTester(num):
    return ((num+2)/(num-2))**2

num_args = [61,62,33,7,12,16,19,35,36,37,38,55,56,57,63]

max_result = 0

start = time.time()

num_processes = multiprocessing.cpu_count()

threads = []
len_stas = len(num_args)

for list_item in num_args:
    if len(threads) < num_processes:
        p = multiprocessing.Process(target=functionTester, args=[list_item])
        p.start()
        print p, p.is_alive()
        threads.append(p)
    else:
        for thread in threads:
            if not thread.is_alive():
                threads.remove(thread)

print "Result " + str(max_result)
end = time.time()
elapsed= end - start
print "Took", elapsed, "seconds to execute"

But it doesn't give me any return data. It also spawns an infinite
number of (sub)processes that crashes my machine. What am I doing
wrong here?

On 12/9/10, Jean-Michel Pichavant <(E-Mail Removed)> wrote:
> Astan Chee wrote:
>> Hi,
>> I've got a python script that calls a function many times with various
>> arguments and returns a result. What I'm trying to do is run this
>> function each on different processors and compile the result at the
>> end based on the function result. The script looks something like
>> this:
>>
>>
>> import time
>>
>> def functionTester(num):
>>     return ((num+2)/(num-2))**2
>>
>> num_args = [1,2,3,7,12,16,19,35,36,37,38,55,56,57,63,44,71,81,91]
>>
>> max_result = 0
>>
>> start = time.time()
>>
>> for n in num_args:
>>     result = functionTester(n)
>>     if result > max_result:
>>         max_result = result
>>
>> print "Result " + str(max_result)
>> end = time.time()
>> elapsed= end - start
>> print "Took", elapsed, "seconds to execute"
>>
>>
>> What I'm trying to do is run each function on a processor and when its
>> done, move on to the next function-argument specifically on windows 7
>> x64 using python 2.6. How do I do this?
>> Thanks for any help
>>

> If I'm not wrong, CPU management is handled by your system, meaning
> there's no way to 'force' anything to run on a specific CPU. However,
> you may try to execute your function in a subprocess, so that the system
> will use different CPUs (hopefully). You then just need to limit the
> number of subprocesses alive at the same time.
>
> Have a look here
> http://docs.python.org/library/multiprocessing.html
>
> JM
>
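A sketch of the capped-subprocess pattern Jean-Michel describes (hypothetical code, not from the thread): a `Process` cannot return a value to the parent directly, which is why the snippet above yields no return data, so a `multiprocessing.Queue` carries results back, and the loop keeps at most `cpu_count()` workers alive at once:

```python
import multiprocessing
import time

def function_tester(num, queue):
    # A worker cannot "return" to the parent; it pushes its
    # result onto a shared queue instead.
    queue.put(((num + 2) / (num - 2)) ** 2)

if __name__ == "__main__":
    num_args = [61, 62, 33, 7, 12, 16, 19, 35]
    queue = multiprocessing.Queue()
    limit = multiprocessing.cpu_count()
    running = []
    for num in num_args:
        # Cap the number of live workers at the CPU count,
        # pruning finished processes until a slot frees up.
        while len(running) >= limit:
            running = [p for p in running if p.is_alive()]
            time.sleep(0.05)
        p = multiprocessing.Process(target=function_tester, args=(num, queue))
        p.start()
        running.append(p)
    for p in running:
        p.join()
    results = [queue.get() for _ in num_args]
    print(max(results))
```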

 
Peter Otten
Guest
Posts: n/a
 
      12-09-2010
Astan Chee wrote:

> Thanks but I'm having trouble with that module too. Currently what I
> have is something like this:
>
> import sys
> import os
> import multiprocessing
>
> import time
>
> def functionTester(num):
>     return ((num+2)/(num-2))**2
>
> num_args = [61,62,33,7,12,16,19,35,36,37,38,55,56,57,63]
>
> max_result = 0
>
> start = time.time()
>
> num_processes = multiprocessing.cpu_count()
>
> threads = []
> len_stas = len(num_args)
>
> for list_item in num_args:
>     if len(threads) < num_processes:
>         p = multiprocessing.Process(target=functionTester, args=[list_item])
>         p.start()
>         print p, p.is_alive()
>         threads.append(p)
>     else:
>         for thread in threads:
>             if not thread.is_alive():
>                 threads.remove(thread)
>
> print "Result " + str(max_result)
> end = time.time()
> elapsed= end - start
> print "Took", elapsed, "seconds to execute"
>
> But it doesn't give me any return data. It also spawns an infinite
> number of (sub)processes that crashes my machine. What am I doing
> wrong here?


I can't replicate the crash. However, your problem looks like there is a
ready-to-use solution:

http://docs.python.org/library/multi...ool-of-workers
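For reference, a minimal sketch of the pool-of-workers approach that page describes, applied to the function from this thread (a sketch only, not verified on Windows 7 x64; note that `Pool.map` pickles its target, so the target must be a module-level function):

```python
import multiprocessing

def function_tester(num):
    # Same computation as the original post; under Python 2 the
    # integer division truncates, exactly as in the serial version.
    return ((num + 2) / (num - 2)) ** 2

if __name__ == "__main__":
    num_args = [61, 62, 33, 7, 12, 16, 19, 35, 36, 37, 38, 55, 56, 57, 63]
    pool = multiprocessing.Pool(multiprocessing.cpu_count())
    # map() blocks until every worker has finished, then returns the
    # results in input order -- no manual Process bookkeeping needed.
    results = pool.map(function_tester, num_args)
    pool.close()
    pool.join()
    print(max(results))
```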

 
geremy condra
Guest
Posts: n/a
 
      12-10-2010
On Fri, Dec 10, 2010 at 4:42 AM, Astan Chee <(E-Mail Removed)> wrote:
> On Fri, Dec 10, 2010 at 1:37 AM, Peter Otten <(E-Mail Removed)> wrote:
>>
>> I can't replicate the crash. However, your problem looks like there is a
>> ready-to-use solution:
>>
>>
>> http://docs.python.org/library/multi...ool-of-workers
>>
>> --
>> http://mail.python.org/mailman/listinfo/python-list

>
>
> Pool.map doesn't seem to be able to support multiple argument functions
> which is what I'm trying to do here. Any other suggestions?
> Thanks again


1. Post the real code you're using, and
2. Put the arguments you want in a tuple and pass that. As an example,
let's say I have the following function:

def my_func(x, y, z):
    return x + y * z

you could rewrite this as:

def my_func(*args):
    return args[0] + args[1] * args[2]


Here's a worked-out example:

#! /usr/bin/env python3

import multiprocessing

def my_func(x, y, z):
    return x + y * z

def my_func_wrapper(t):
    return my_func(*t)

# assume we can get an iterable over each argument
xs = [1, 2, 3, 4]
ys = [5, 6, 7, 8]
zs = [9, 1, 2, 3]

# set up the pool to match the number of CPUs
num_processes = multiprocessing.cpu_count()
pool = multiprocessing.Pool(num_processes)

#will call my_func(1, 5, 9), my_func(2, 6, 1), etc.
results = pool.map(my_func_wrapper, zip(xs, ys, zs))
print(results)


Interesting factoid: you can't seem to use lambda or a decorator to do
this, which would have been my first instinct. Pickle apparently
chokes, although marshal wouldn't.
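A quick demonstration of that pickling limitation (the function name `add` here is made up for the example): pickle serializes a function by its qualified name, which a lambda doesn't have.

```python
import pickle

def add(x, y):
    return x + y

# A plain module-level function pickles by qualified name...
payload = pickle.dumps(add)

# ...but a lambda has no importable name, so pickle refuses it.
try:
    pickle.dumps(lambda t: add(*t))
    print("lambda pickled")
except Exception as exc:
    print("pickle choked: " + type(exc).__name__)
```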

Geremy Condra
 
Peter Otten
Guest
Posts: n/a
 
      12-11-2010
geremy condra wrote:

> On Fri, Dec 10, 2010 at 4:42 AM, Astan Chee <(E-Mail Removed)> wrote:
>> On Fri, Dec 10, 2010 at 1:37 AM, Peter Otten <(E-Mail Removed)> wrote:
>>>
>>> I can't replicate the crash. However, your problem looks like there is a
>>> ready-to-use solution:
>>>
>>>
>>> http://docs.python.org/library/multi...ing-a-pool-of-workers
>>>
>>> --
>>> http://mail.python.org/mailman/listinfo/python-list

>>
>>
>> Pool.map doesn't seem to be able to support multiple argument functions
>> which is what I'm trying to do here. Any other suggestions?
>> Thanks again

>
> 1. Post the real code you're using, and
> 2. Put the arguments you want in a tuple and pass that. As an example,
> let's say I have the following function:
>
> def my_func(x, y, z):
>     return x + y * z
>
> you could rewrite this as:
>
> def my_func(*args):
>     return args[0] + args[1] * args[2]
>
>
> Here's a worked-out example:
>
> #! /usr/bin/env python3
>
> import multiprocessing
>
> def my_func(x, y, z):
>     return x + y * z
>
> def my_func_wrapper(t):
>     return my_func(*t)
>
> # assume we can get an iterable over each argument
> xs = [1, 2, 3, 4]
> ys = [5, 6, 7, 8]
> zs = [9, 1, 2, 3]
>
> # set up the pool to match the number of CPUs
> num_processes = multiprocessing.cpu_count()
> pool = multiprocessing.Pool(num_processes)
>
> #will call my_func(1, 5, 9), my_func(2, 6, 1), etc.
> results = pool.map(my_func_wrapper, zip(xs, ys, zs))
> print(results)
>
>
> Interesting factoid: you can't seem to use lambda or a decorator to do
> this, which would have been my first instinct. Pickle apparently
> chokes, although marshal wouldn't.


You basically have to ensure that the resulting function is found under its
__name__ in the global namespace of its __module__. This can be done with
functools.wraps():

#! /usr/bin/env python3

import multiprocessing
import functools

def starcall(f):
    @functools.wraps(f)
    def g(args):
        return f(*args)
    return g

@starcall
def my_func(x, y, z):
    return x + y * z

xs = [1, 2, 3, 4]
ys = [5, 6, 7, 8]
zs = [9, 1, 2, 3]

num_processes = multiprocessing.cpu_count()
pool = multiprocessing.Pool(num_processes)

results = pool.map(my_func, zip(xs, ys, zs))
print(results)

 
Peter Otten
Guest
Posts: n/a
 
      12-11-2010
Astan Chee wrote:

> Sorry about that, here is a summary of my complete code. I haven't cleaned
> it up much or anything, but this is what it does:
>
> import time
> import multiprocessing
>
> test_constx =0
> test_consty =0
>
> def functionTester(x):
>     global test_constx


You don't need to declare a variable as global unless you want to rebind
(assign to) it.

>     global test_consty
>     print "constx " + str(test_constx)
>     print "consty " + str(test_consty)
>     return (test_constx*x[0]-x[1]+test_consty*x[0]+x[2])
>
> def functionTesterMain(constx,consty):
>     global test_constx
>     global test_consty
>     test_constx = constx
>     test_consty = consty
>     num_args = [(61,12,1),(61,12,2),(61,12,3),(61,11,4),(61,12,4),(62,33,4),
>                 (7,12,4),(16,19,4),(35,36,4),(37,38,3),(55,56,3),(57,63,3)]
>     num_processes = multiprocessing.cpu_count()
>     pool = multiprocessing.Pool(num_processes)


I think you need to create the pool outside the function; in the current
configuration you get three Pool instances, not one.

>     rs = []
>     start = time.time()
>     rs = pool.map(functionTester,num_args)
>     end = time.time()
>     elapsed = end - start
>     min = elapsed/60
>     print "Took", elapsed, "seconds to run, which is the same as", min, "minutes"
>     pos = 0
>     high = 0
>     n = None
>     for r in rs:
>         if r > high:
>             n = num_args[pos]
>             high = r
>         pos += 1
>     print "high " + str(high)
>     print "n " + str(n)
>
>     return high,n
>
> if __name__ == '__main__':
>     for i in range(1,4):
>         a,b = functionTesterMain(i,7)
>         print "-----------"
>         print "a " + str(a)
>         print "b " + str(a)
>
>
> Which doesn't seem to work, because functionTester() needs to be
> simpler and not use global variables.
> I'm using global variables because I'm also trying to pass a few other
> variables, and when I tried using a class instead it just gave me an
> unpickleable error. I tried using zip but I'm confused about how to
> pass the data that way.


A simple approach would be to pass an index into a list

const_data = zip(range(1, 4), [7]*3)

> I know I can probably combine the data into tuples but that means that
> there is a lot of data duplication, especially if the constx and consty are
> large dictionaries (or even custom objects), which might happen later.
> So it seems that map doesn't quite like functions like these. Anyway, I'll
> try and see if threads or something can substitute. I'd appreciate any
> help. Thanks

Your code, slightly modified and cleaned up (yes, four-space indent improves
readability):

import time
import multiprocessing

const_data = zip(range(1, 4), [7]*3)
num_args = [(61, 12, 1), (61, 12, 2), (61, 12, 3), (61, 11, 4),
            (61, 12, 4), (62, 33, 4), (7, 12, 4), (16, 19, 4),
            (35, 36, 4), (37, 38, 3), (55, 56, 3), (57, 63, 3)]

def functionTester(args):
    i, x, y, z = args
    constx, consty = const_data[i]

    print "constx", constx
    print "consty", consty
    return constx*x - y + consty*x + z

def functionTesterMain(pool, index):
    start = time.time()
    rs = pool.map(functionTester, (((index,) + x) for x in num_args))
    end = time.time()
    elapsed = end - start
    min = elapsed/60
    print "Took", elapsed,
    print "seconds to run, which is the same as", min, "minutes"
    return max(zip(rs, num_args))

if __name__ == '__main__':
    num_processes = multiprocessing.cpu_count()
    pool = multiprocessing.Pool(num_processes)
    for i, _ in enumerate(const_data):
        a, b = functionTesterMain(pool, i)
        print "-----------"
        print "a", a
        print "b", b

 