Commit f8e3b45 (0 parents): first commit

34 files changed: +786 -0 lines

Diff for: .gitignore

+3
@@ -0,0 +1,3 @@
# git ignore
**/.DS_Store
.DS_Store

Diff for: README.md

+55
@@ -0,0 +1,55 @@
# Python ProcessPoolExecutor Jump-Start

![Python ProcessPoolExecutor Jump-Start](cover.png)

* <https://github.com/SuperFastPython/PythonProcessPoolExecutorJumpStart>

This repository provides all source code for the book:

* **Python ProcessPoolExecutor Jump-Start**: _Execute CPU-Bound Tasks in Parallel With Modern Process Pools_, Jason Brownlee, 2022.

## Source Code

You can access all Python .py files directly here:

* [src/](src/)

## Get the Book

You can learn more about the book here:

* Coming soon

### Book Blurb

> How much faster could your Python code run (if it used all CPU cores)?
>
> The ProcessPoolExecutor class provides modern process pools for CPU-bound tasks.
>
> This is not some random third-party library; this is a class provided in the Python standard library (already installed on your system).
>
> This is the class you need to make your code run faster.
>
> There's just one problem. No one knows about it (or how to use it well).
>
> Introducing: "Python ProcessPoolExecutor Jump-Start". A new book designed to teach you modern process pools in Python, super fast!
>
> You will get a rapid-paced, 7-part course to get you started and make you awesome at using the ProcessPoolExecutor.
>
> Including:
>
> * How to create process pools and when to use them.
> * How to configure process pools, including the number of workers.
> * How to execute tasks with worker processes and handle their results.
> * How to execute tasks in the process pool asynchronously.
> * How to query and get results from handles on asynchronous tasks, called futures.
> * How to wait on and manage diverse collections of asynchronous tasks.
> * How to develop a parallel Fibonacci calculator 4x faster than the sequential version.
>
> Each of the 7 lessons was carefully designed to teach one critical aspect of the ProcessPoolExecutor, with explanations, code snippets, and worked examples.
>
> Each lesson ends with an exercise for you to complete to confirm you understood the topic, a summary of what was learned, and links for further reading if you want to go deeper.
>
> Stop copy-pasting code from StackOverflow answers.
>
> Learn Python concurrency correctly, step-by-step.
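The parallel Fibonacci calculator promised in the blurb is not among the files shown in this commit, so here is a minimal sketch of the idea only: a deliberately naive, CPU-bound recursive `fib` fanned out over a pool of 4 workers. The function name and the input range are hypothetical, not taken from the book's `src/`.

```python
# sketch of a parallel Fibonacci calculator (hypothetical, not from src/)
from concurrent.futures import ProcessPoolExecutor

# naive recursive Fibonacci, deliberately CPU-bound
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# protect the entry point
if __name__ == '__main__':
    # compute several Fibonacci numbers in parallel
    with ProcessPoolExecutor(4) as exe:
        # map preserves input order, so numbers and results line up
        for number, result in zip(range(20, 24), exe.map(fib, range(20, 24))):
            print(f'fib({number}) = {result}')
```

Because each `fib(n)` call is independent and pure CPU work, the speedup from the pool approaches the number of cores used.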

Diff for: cover.png

683 KB

Diff for: src/lesson01_process.py

+17
@@ -0,0 +1,17 @@
# SuperFastPython.com
# example of running a function in a new process
from multiprocessing import Process

# custom function to be executed in a child process
def task():
    # report a message
    print('This is another process', flush=True)

# protect the entry point
if __name__ == '__main__':
    # define a task to run in a new process
    process = Process(target=task)
    # start the task in a new process
    process.start()
    # wait for the child process to terminate
    process.join()

Diff for: src/lesson01_processpoolexecutor.py

+18
@@ -0,0 +1,18 @@
# SuperFastPython.com
# example running a function in the process pool
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task():
    # report a message
    print('This is another process', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue the task
        future = exe.submit(task)
        # wait for the task to finish
        future.result()
    # close the process pool automatically

Diff for: src/lesson02_default_config.py

+12
@@ -0,0 +1,12 @@
# SuperFastPython.com
# example reporting the details of a default pool
from concurrent.futures import ProcessPoolExecutor

# protect the entry point
if __name__ == '__main__':
    # create a process pool
    exe = ProcessPoolExecutor()
    # report the status of the process pool
    print(exe._max_workers)
    # shutdown the process pool
    exe.shutdown()
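Note that `_max_workers` is a private attribute and could change between Python releases. A sketch of cross-checking it against the documented default, assuming a recent CPython where the default pool size is the machine's CPU count:

```python
# sketch: compare the pool's private worker count with the public CPU count
import os
from concurrent.futures import ProcessPoolExecutor

# protect the entry point
if __name__ == '__main__':
    # documented default: one worker per CPU
    print(os.cpu_count())
    # the pool's private attribute should agree (may change between versions)
    exe = ProcessPoolExecutor()
    print(exe._max_workers)
    # shutdown the process pool
    exe.shutdown()
```

Preferring `os.cpu_count()` keeps the code on the public API when all you need is the default worker count.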

Diff for: src/lesson02_initialize_processes.py

+24
@@ -0,0 +1,24 @@
# SuperFastPython.com
# example initializing worker processes in the pool
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    # report a message
    print(f'Worker task {number}...', flush=True)
    # block for a moment
    sleep(1)

# initialize a worker in the process pool
def init():
    # report a message
    print('Initializing worker...', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create and configure the process pool
    with ProcessPoolExecutor(2,
            initializer=init) as exe:
        # issue tasks to the process pool
        _ = exe.map(task, range(4))

Diff for: src/lesson02_num_processes.py

+19
@@ -0,0 +1,19 @@
# SuperFastPython.com
# example of setting a large number of workers
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom task function executed in the process pool
def task(number):
    # block for a moment
    sleep(1)
    # report a message
    if number % 10 == 0:
        print(f'>task {number} done', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create a process pool
    with ProcessPoolExecutor(50) as exe:
        # issue many tasks to the pool
        _ = exe.map(task, range(50))

Diff for: src/lesson03_map_chunksize.py

+16
@@ -0,0 +1,16 @@
# SuperFastPython.com
# example of executing multiple tasks in chunks
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    return number*2

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks to execute concurrently
        _ = exe.map(task, range(10000), chunksize=500)
    # wait for all tasks to complete
    print('All done')

Diff for: src/lesson03_map_multiple_arguments.py

+25
@@ -0,0 +1,25 @@
# SuperFastPython.com
# example executing tasks concurrently with multiple arg
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number, value):
    # report a message
    print(f'Task using {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # prepare random numbers between 0 and 1
        values = [random() for _ in range(10)]
        # issue tasks to execute concurrently
        for result in exe.map(task, range(10), values):
            # report results
            print(result)

Diff for: src/lesson03_map_no_chunksize.py

+16
@@ -0,0 +1,16 @@
# SuperFastPython.com
# example of executing multiple tasks with no chunks
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    return number*2

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks to execute concurrently
        _ = exe.map(task, range(10000), chunksize=1)
    # wait for all tasks to complete
    print('All done')

Diff for: src/lesson03_map_no_return.py

+22
@@ -0,0 +1,22 @@
# SuperFastPython.com
# example of executing tasks concurrently with no return
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    # generate a random value between 0 and 1
    value = random()
    # report a message
    print(f'Task generated {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks to execute concurrently
        _ = exe.map(task, range(10))
        # wait automatically for all tasks to finish...

Diff for: src/lesson03_map_one_argument.py

+25
@@ -0,0 +1,25 @@
# SuperFastPython.com
# example of executing tasks concurrently with one arg
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    # generate a random value between 0 and 1
    value = random()
    # report a message
    print(f'Task generated {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks to execute concurrently
        for result in exe.map(task, range(10)):
            # report results
            print(result)

Diff for: src/lesson03_map_timeout.py

+32
@@ -0,0 +1,32 @@
# SuperFastPython.com
# example of executing tasks concurrently with timeout
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor
from concurrent.futures import TimeoutError

# custom function to be executed in a worker process
def task(number):
    # generate a random value between 0 and 1
    value = random()
    # report a message
    print(f'Task generated {value}', flush=True)
    # block for a moment to simulate work
    sleep(number + value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        try:
            # issue tasks to execute concurrently
            for result in exe.map(task, range(10),
                    timeout=2):
                # report results
                print(result)
        except TimeoutError:
            print('Gave up, took too long')
        # report that we will wait for the tasks to complete
        print('Waiting for tasks to complete...')

Diff for: src/lesson04_submit_many_tasks.py

+26
@@ -0,0 +1,26 @@
# SuperFastPython.com
# example issuing many asynchronous tasks systematically
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    # generate a random value between 0 and 1
    value = random()
    # report a message
    print(f'Task generated {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as ex:
        # issue many asynchronous tasks systematically
        futures = [ex.submit(task, i) for i in range(5)]
        # enumerate futures and report results
        for future in futures:
            print(future.result())
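The loop above consumes the futures in submission order, blocking on each result in turn. Results can also be handled in completion order with `concurrent.futures.as_completed`; a minimal sketch, where the doubling task is a hypothetical stand-in:

```python
# sketch: handle future results in completion order, not submission order
from concurrent.futures import ProcessPoolExecutor, as_completed

# trivial stand-in task executed in a worker process
def task(number):
    return number * 2

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue many asynchronous tasks and collect the futures
        futures = [exe.submit(task, i) for i in range(5)]
        # report each result as soon as its task finishes
        for future in as_completed(futures):
            print(future.result())
```

Completion order is useful when tasks have very different durations, since fast results are not held up behind a slow first task.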

Diff for: src/lesson04_submit_multiple_arg.py

+25
@@ -0,0 +1,25 @@
# SuperFastPython.com
# example issuing an async task with multiple arguments
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number, value):
    # report a message
    print(f'Task received {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue an asynchronous task
        future = exe.submit(task, 100, random())
        # get the result once the task completes
        result = future.result()
        # report the result
        print(result)

Diff for: src/lesson04_submit_no_args.py

+27
@@ -0,0 +1,27 @@
# SuperFastPython.com
# example issuing an asynchronous task with no arguments
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task():
    # generate a random value between 0 and 1
    value = random()
    # report a message
    print(f'Task generated {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue an asynchronous task
        future = exe.submit(task)
        # get the result once the task completes
        result = future.result()
        # report the result
        print(result)

0 commit comments
