Running code in parallel with Python multiprocessing
Using Python's multiprocessing module can improve performance, but you need to pay attention to start methods, Pool usage, inter-process communication, and exception handling. 1. Choose the right start method: fork (Unix, fast but can be unstable), spawn (recommended for cross-platform use), forkserver (useful when child processes are created frequently); 2. Use Pool to manage concurrent tasks, control the number of processes, and choose between map and apply_async appropriately; 3. For inter-process communication, use Queue, Pipe, Value, Array, or Manager, keeping performance and safety in mind; 4. Strengthen exception handling, use logging for debugging, and simulate with a single process during development.

Python's multiprocessing module really can make your code run faster, provided it is used correctly. It sidesteps the GIL (global interpreter lock) by creating multiple processes, achieving true parallel computing. But there are traps that many people fall into, such as inter-process communication, resource contention, and efficiency losses.

The points below are the ones that come up most often and cause the most problems in practice. Let's go through them one by one.
How to choose the right start method
There are three start methods: fork, spawn, and forkserver. The default differs between operating systems, and so does the behavior.

- fork (Unix only) copies the state of the current process, including all variables and threads. It is fast but can be unstable; for example, fork may deadlock in a multi-threaded program.
- spawn starts a clean new process that inherits only the necessary resources. It has good cross-platform compatibility and is recommended for production environments.
- forkserver is a compromise between the two, suitable for scenarios where child processes are created frequently.
If the program misbehaves on some platforms, try setting the start method explicitly:
```python
import multiprocessing as mp

if __name__ == '__main__':
    mp.set_start_method('spawn')
```
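Note that set_start_method() can only be called once per program. If only part of a program needs a particular start method, an alternative (a minimal sketch, not from the original article; square is a made-up helper) is multiprocessing.get_context(), which binds one start method to a context object without changing the global default:

```python
import multiprocessing as mp

def square(x):
    return x * x

if __name__ == '__main__':
    # get_context() returns a context tied to one start method,
    # leaving the process-wide default untouched.
    ctx = mp.get_context('spawn')
    with ctx.Pool(2) as pool:
        print(pool.map(square, [1, 2, 3]))
```

The context object exposes the same API (Pool, Process, Queue, etc.) as the top-level module, so libraries can pick a start method without interfering with each other.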
Use Pool to manage concurrent tasks more efficiently
When you have a pile of independent tasks to process, such as batch downloads, image processing, or data analysis, Pool is the most hassle-free option.

It lets you cap the number of concurrent workers and automatically schedules tasks across processes. Commonly used methods include map and apply_async.
For example, suppose you want to process a set of files in parallel:
```python
from multiprocessing import Pool

def process_file(filename):
    # Assume this is the time-consuming processing logic
    return f"Processed {filename}"

if __name__ == '__main__':
    files = ['file1.txt', 'file2.txt', 'file3.txt']
    with Pool(4) as pool:
        results = pool.map(process_file, files)
    print(results)
```
A few suggestions:
- Keep the Pool size reasonable; opening too many processes will slow the system down.
- If there are no dependencies between tasks, prefer map or imap, which are more concise.
- If you need asynchronous callbacks or error handling, use apply_async, but remember to call close()/join() and capture exceptions.
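As a sketch of the apply_async suggestion above (risky_task is a hypothetical function, not from the original article): calling .get() on each AsyncResult re-raises any worker exception in the parent, which is where you catch it:

```python
from multiprocessing import Pool

def risky_task(x):
    if x == 3:
        raise ValueError("bad input")
    return x * 10

if __name__ == '__main__':
    with Pool(2) as pool:
        # Submit asynchronously and keep the AsyncResult handles.
        async_results = [pool.apply_async(risky_task, (i,)) for i in range(5)]
        results = []
        for r in async_results:
            try:
                # .get() re-raises the worker's exception here in the parent.
                results.append(r.get(timeout=10))
            except ValueError as e:
                results.append(f"failed: {e}")
    print(results)
```

The timeout on .get() also guards against a worker that never finishes, which would otherwise block the parent forever.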
Pay attention to communication and data sharing between processes
Sometimes multiple processes need to exchange data, and then you have to think about how to pass it both safely and efficiently.
- Queue and Pipe are the common communication mechanisms; Queue is better suited to multi-producer/multi-consumer models.
- If you only need to share simple values, Value or Array will do, but write operations must be protected by a lock.
- Manager can create shared objects of many types (dictionaries, lists, etc.), but with some performance overhead.
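To illustrate the point about locking writes to Value, here is a minimal sketch (add_many is a made-up helper, not from the original article):

```python
from multiprocessing import Process, Value, Lock

def add_many(counter, lock, n):
    for _ in range(n):
        # Without the lock, counter.value += 1 is a read-modify-write
        # sequence, so concurrent increments can be lost.
        with lock:
            counter.value += 1

if __name__ == '__main__':
    counter = Value('i', 0)  # 'i' = a signed int living in shared memory
    lock = Lock()
    procs = [Process(target=add_many, args=(counter, lock, 1000)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)  # 4000 with the lock; possibly less without it
```

Value also carries its own internal lock, reachable via counter.get_lock(), which can replace the separate Lock object here.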
Here is a simple queue example:
```python
from multiprocessing import Process, Queue

def worker(q):
    while True:
        item = q.get()
        if item is None:  # sentinel: tell the worker to exit
            break
        print(f"Processing: {item}")

if __name__ == '__main__':
    q = Queue()
    for i in range(5):
        q.put(i)
    processes = [Process(target=worker, args=(q,)) for _ in range(3)]
    for p in processes:
        p.start()
    for _ in processes:
        q.put(None)  # one sentinel per worker so every process stops
    for p in processes:
        p.join()
```
Remember: avoid repeatedly copying large data between processes, or performance will suffer. Consider shared memory if necessary.
Don't ignore exception handling and debugging
When a multi-process program goes wrong, debugging is much harder than in single-process code. The most common symptoms are hangs and silent failures.
You can do this:
- Add try-except inside each child process and surface the exception instead of letting the process die quietly.
- Use logging instead of print so you can tell which process produced which output.
- If the program hangs, check for deadlocks, blocked queues, or a join() that never returns.
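The first two bullets can be sketched like this (safe_worker is a made-up example, not from the original article; it catches its own exceptions and tags log records with the process name):

```python
import logging
from multiprocessing import Pool

def safe_worker(x):
    # print() output from children is easy to lose; log records
    # carrying %(processName)s show which worker said what.
    logging.basicConfig(
        level=logging.INFO,
        format='%(processName)s %(levelname)s %(message)s')
    try:
        if x < 0:
            raise ValueError(f"negative input: {x}")
        logging.info("processed %s", x)
        return x * 2
    except Exception:
        logging.exception("task failed for input %s", x)
        return None  # signal failure instead of dying silently

if __name__ == '__main__':
    with Pool(2) as pool:
        print(pool.map(safe_worker, [1, -2, 3]))  # [2, None, 6]
```

Returning a sentinel such as None keeps one bad input from killing the whole batch; the parent can inspect the results and retry failures.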
Another trick: during development, run the logic in a single process first, and only switch to multi-process mode once it is confirmed to work.
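One way to wire up that trick (run and transform are hypothetical names for this sketch): a flag switches between the built-in map and Pool.map, so the same code path can be stepped through with pdb in a single process:

```python
from multiprocessing import Pool

USE_MULTIPROCESSING = False  # flip to True once the logic is verified

def transform(x):
    return x + 1

def run(items):
    if USE_MULTIPROCESSING:
        with Pool(4) as pool:
            return pool.map(transform, items)
    # Built-in map runs in this process, so breakpoints and
    # stack traces behave normally.
    return list(map(transform, items))

if __name__ == '__main__':
    print(run([1, 2, 3]))  # [2, 3, 4]
```

Because Pool.map and the built-in map share the same call shape, flipping the flag changes nothing else in the program.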
Basically that's it. multiprocessing is powerful but comes with real complexity, especially around cross-platform behavior and resource control. Used well, it significantly improves program performance; used poorly, it just increases maintenance cost.