I'm trying to run a function with 'X' cores simultaneously, where the function takes multiple parameters. With Process I can't set the number of cores, and with Pool I can't pass multiple parameters to the function.

def funct(a, b):
    # run .bat file with params (a, b)

if __name__ == '__main__':
    cores = int(cores)  # set with input
    pool = Pool(processes=cores)
    pool.map(funct(3, 44))

I'm getting this error:

TypeError: Pool.map() missing 1 required positional argument: 'iterable'

I can't use this, because I can't set the number of cores:

p = Process(target=funct, args=(1, 2))
asked Feb 11 at 2:35 by glewi3
5 Answers
You can't use multiple cores to run a single function call with multiprocessing. Whether you use Process or Pool doesn't matter. Parallel processing doesn't work like that.
If you want to use multiple cores, you need to divide your program's work in such a way that different processes can receive different parts of the work. Since the entire work of your Python program consists of running a batch file once, though, there's no room to divide the work like that from the Python end. You might be able to rewrite your batch file to parallelize things in there.
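To illustrate what "dividing the work" means, here is a minimal sketch using a hypothetical CPU-bound `square` task (not the asker's batch file): each worker process handles a share of the inputs.

```python
from multiprocessing import Pool

def square(n):
    # Each worker process receives a share of the inputs.
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # map splits the iterable across the pool's workers.
        print(pool.map(square, range(10)))
        # prints [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```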
If you wanted to call your function more than once, you could use Pool.starmap to run separate calls in parallel:
import multiprocessing

if __name__ == '__main__':
    args = [
        (1, 2),
        (3, 4),
        (5, 6),
        (7, 8),
    ]
    with multiprocessing.Pool(processes=whatever) as p:
        p.starmap(funct, args)
but with only one call to make, that's not going to help: a single call can't be spread across multiple cores, and you're only making one call.
Also, you'd need to make sure your batch file can safely be run multiple times in parallel. Considering the kinds of things batch files often do, it might not be safe to run it like that.
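As a sketch of what `funct` might look like when it shells out: the `cmd /c` line in the comment is the Windows form for a `.bat` file (with a hypothetical name `myscript.bat`); the `echo` stand-in below is only there so the sketch is runnable anywhere.

```python
import subprocess

def funct(a, b):
    # On Windows the real call would be something like:
    #   subprocess.run(["cmd", "/c", "myscript.bat", str(a), str(b)], check=True)
    # Portable stand-in command so the sketch runs on any OS:
    result = subprocess.run(["echo", str(a), str(b)],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print(funct(3, 44))  # prints "3 44"
```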
Use starmap for multiple parameters.
Simple demonstration showing parallelism:
import multiprocessing as mp
import time

def funct(a, b):
    time.sleep(1)  # Simulate longer work.
    return a + b

if __name__ == '__main__':
    with mp.Pool() as pool:
        n = mp.cpu_count()
        print(n)
        start = time.perf_counter()
        # Call funct 2x the number of CPUs. Should take ~2 seconds in parallel.
        avalues = range(1, n*2 + 1)  # 1, 2, 3, ...
        bvalues = range(2, n*2 + 2)  # 2, 3, 4, ...
        # result = 3, 5, 7, ...
        result = pool.starmap(funct, zip(avalues, bvalues))
        print(time.perf_counter() - start)
        print(result)
Output (56 calls to funct took ~2 seconds on my 28 cores):
28
2.085264699999243
[3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 99, 101, 103, 105, 107, 109, 111, 113]
Use apply_async instead of map:
from multiprocessing import Pool

def funct(a, b):
    # run .bat file with params (a, b)
    print(f"{a = }")
    print(f"{b = }")

if __name__ == "__main__":
    cores = input("Cores: ")
    cores = int(cores)  # set with input
    pool = Pool(processes=cores)
    pool.apply_async(funct, (3, 44))
    pool.close()
    pool.join()
More complex example:
import time
from multiprocessing import Pool

def funct(a, b):
    # run .bat file with params (a, b)
    print(f"{a = }")
    print(f"{b = }")
    time.sleep(1)

if __name__ == "__main__":
    cores = input("Cores: ")
    start = time.time()
    cores = int(cores)  # set with input
    pool = Pool(processes=cores)
    pool.apply_async(funct, (3, 44))
    for params in [(2, 6), (3, 7), (9, 8), (10, 11)]:
        pool.apply_async(funct, params)
    pool.close()
    pool.join()
    end = time.time()
    print('Cost:', round(end - start, 1), 'second(s).')
Output:
- cores=3
Cores: 3
a = 3
b = 44
a = 2
b = 6
a = 3
b = 7
a = 9
b = 8
a = 10
b = 11
Cost: 2.1 second(s).
- cores=5
Cores: 5
a = 3
b = 44
a = 2
b = 6
a = 3
b = 7
a = 9
b = 8
a = 10
b = 11
Cost: 1.2 second(s).
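One caveat with apply_async: exceptions raised in a worker are silently swallowed unless you keep the AsyncResult and call .get() on it. A minimal sketch, with a hypothetical division task standing in for the batch file:

```python
from multiprocessing import Pool

def funct(a, b):
    # Stand-in task: raises ZeroDivisionError if b is zero.
    return a / b

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        # Keep each AsyncResult so errors are not lost.
        results = [pool.apply_async(funct, args) for args in [(6, 3), (9, 0)]]
        for r in results:
            try:
                print(r.get(timeout=10))  # re-raises worker exceptions here
            except ZeroDivisionError as e:
                print("task failed:", e)
```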
Have you tried Pool.starmap()? It uses Pool to set the number of cores and starmap() for multiple arguments, so you can avoid map().
from multiprocessing import Pool

def funct(a, b):
    print(f"Running .bat with params: {a}, {b}")
    # Add the actual .bat execution logic here

if __name__ == '__main__':
    cores = int(input("number of cores: "))
    param_pairs = [
        (3, 44),
        (5, 10),
        (6, 11),
        (7, 12),
    ]  # List of (a, b) pairs to run
    with Pool(processes=cores) as pool:
        pool.starmap(funct, param_pairs)
Note: multiprocessing doesn't work reliably in IDLE; run the script from a terminal or a full IDE such as IntelliJ/PyCharm instead.
import multiprocessing

def funct(a, b):
    # run .bat file with params (a, b)
    print(a, b)

if __name__ == "__main__":
    # Start as many processes as the range specifies.
    processes = []
    for _ in range(4):
        p = multiprocessing.Process(target=funct, args=[2, "hello"])
        p.start()
        processes.append(p)
    for process in processes:
        process.join()
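If you do want Process directly while still capping concurrency at 'X' cores, one approach is to launch the processes in batches of that size (a sketch; `run_in_batches` is a hypothetical helper, and a Pool would normally do this bookkeeping for you):

```python
import multiprocessing

def funct(a, b):
    # Stand-in for the real work.
    print(a, b)

def run_in_batches(arglist, cores):
    # Start at most `cores` processes at a time and join each batch
    # before launching the next, so no more than `cores` run at once.
    for i in range(0, len(arglist), cores):
        batch = [multiprocessing.Process(target=funct, args=args)
                 for args in arglist[i:i + cores]]
        for p in batch:
            p.start()
        for p in batch:
            p.join()

if __name__ == "__main__":
    run_in_batches([(1, 2), (3, 4), (5, 6), (7, 8), (9, 10)], cores=2)
```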
pool.map(funct(3, 44)) is totally wrong; this calls the function in the main process and then passes the result to pool.map. – juanpa.arrivillaga, commented Feb 11 at 4:46