Suppose a Python script runs multiple 'cat | zgrep' commands sequentially on a remote server and collects each command's output for processing. To improve efficiency, we want to run these commands in parallel.
Rather than reaching for multiprocessing or threading, you can run subprocesses in parallel directly with the following approach:
<code class="python">#!/usr/bin/env python
from subprocess import Popen

# start all subprocesses at once; Popen returns without waiting
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True)
             for i in range(5)]

# collect exit statuses of the subprocesses
exitcodes = [p.wait() for p in processes]</code>
This code launches five shell commands concurrently and then collects their exit codes. The & character is unnecessary here because Popen does not wait for a command to complete by default; you call .wait() explicitly to collect each status.
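To see that Popen really does start the commands concurrently, here is a minimal timing sketch (it uses `sys.executable` to spawn short `time.sleep` subprocesses instead of shell commands, purely so the example is portable):

```python
#!/usr/bin/env python3
import sys
import time
from subprocess import Popen

start = time.monotonic()

# start five 1-second subprocesses without waiting for any of them
cmd = [sys.executable, "-c", "import time; time.sleep(1)"]
processes = [Popen(cmd) for _ in range(5)]

# now wait for all of them to finish
exitcodes = [p.wait() for p in processes]
elapsed = time.monotonic() - start

# because the sleeps overlap, the total elapsed time is close to
# one second rather than five
```

If the subprocesses ran one after another, `elapsed` would be about five seconds; because they overlap, it stays near one.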
Collecting the output of each subprocess sequentially is convenient, but you can also use threads to collect outputs in parallel. Consider the following example:
<code class="python">#!/usr/bin/env python
from multiprocessing.dummy import Pool  # thread pool
from subprocess import Popen, PIPE, STDOUT

# start all subprocesses at once, capturing stdout (with stderr merged into it)
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True,
                   stdin=PIPE, stdout=PIPE, stderr=STDOUT, close_fds=True)
             for i in range(5)]

# read each subprocess's output in a separate thread
def get_lines(process):
    return process.communicate()[0].splitlines()

outputs = Pool(len(processes)).map(get_lines, processes)</code>
This code runs subprocesses in parallel and collects their outputs concurrently using threads.
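If you have many commands, you may not want one Popen object per command all running at once. A smaller thread pool caps concurrency, since each worker starts and waits on its own subprocess. A minimal sketch (it uses `check_output` with `sys.executable` instead of pre-created Popen objects, so the example is self-contained):

```python
#!/usr/bin/env python3
import sys
from multiprocessing.dummy import Pool  # thread pool
from subprocess import check_output

def run(i):
    # each worker thread starts one subprocess and blocks for its output
    return check_output([sys.executable, "-c", f"print({i})"]).decode().strip()

# a pool of 2 workers: at most two subprocesses run at any moment,
# yet map() still returns results in input order
outputs = Pool(2).map(run, range(5))
```

`Pool.map` preserves the order of its input, so each result lines up with the command that produced it even though the subprocesses finish in an arbitrary order.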
For Python 3.7 and above, asyncio offers an elegant way to execute subprocesses concurrently. Here's an example:
<code class="python">#!/usr/bin/env python3
import asyncio
import sys
from subprocess import PIPE, STDOUT

async def get_lines(shell_command):
    p = await asyncio.create_subprocess_shell(
        shell_command, stdin=PIPE, stdout=PIPE, stderr=STDOUT
    )
    return (await p.communicate())[0].splitlines()

async def main():
    # one coroutine per subprocess
    coros = [
        get_lines(f'"{sys.executable}" -c "print({i:d}); import time; time.sleep({i:d})"')
        for i in range(5)
    ]
    # run the subprocesses concurrently and gather their outputs
    print(await asyncio.gather(*coros))

if __name__ == "__main__":
    asyncio.run(main())</code>
This code demonstrates how to run subprocesses concurrently within a single thread.
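With asyncio you can also cap concurrency using a semaphore, so only a fixed number of subprocesses exist at once. A sketch under the same assumptions as above (portable `sys.executable` commands standing in for the real shell pipelines):

```python
#!/usr/bin/env python3
import asyncio
import sys

async def run(sem, i):
    # the semaphore caps how many subprocesses exist at once
    async with sem:
        p = await asyncio.create_subprocess_exec(
            sys.executable, "-c", f"print({i})",
            stdout=asyncio.subprocess.PIPE,
        )
        out, _ = await p.communicate()
        return out.decode().strip()

async def main():
    sem = asyncio.Semaphore(2)  # at most two subprocesses at a time
    # gather() preserves input order regardless of completion order
    return await asyncio.gather(*(run(sem, i) for i in range(5)))

results = asyncio.run(main())
```

`asyncio.gather` returns results in the order the coroutines were passed in, so this pattern keeps each output paired with its command.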
With these approaches, you can significantly speed up your script by executing the 'cat | zgrep' commands in parallel on the remote server.