
How to Execute Multiple 'cat | zgrep' Commands Concurrently in Python?

Release: 2024-10-27 07:09:29


Execute Multiple 'cat | zgrep' Commands Concurrently

Suppose a Python script runs several 'cat | zgrep' commands on a remote server one after another, collecting each command's output for processing. Running them sequentially wastes time, so the goal is to execute them in parallel.

Using Subprocess Without Threading

You do not need multiprocessing or threading to run subprocesses in parallel; Popen alone is enough:

<code class="python">#!/usr/bin/env python
from subprocess import Popen

# create a list of subprocesses
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True) for i in range(5)]

# collect statuses of subprocesses
exitcodes = [p.wait() for p in processes]</code>

This code launches five shell commands concurrently and then collects their exit codes. A trailing & is unnecessary here because Popen does not wait for a command to finish by default; you call .wait() explicitly to retrieve each exit status.
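The same pattern applies directly to the 'cat | zgrep' use case. A minimal sketch, using echo-and-grep pipelines as portable stand-ins for the real zgrep commands (substitute your own pipelines and file paths):

```python
#!/usr/bin/env python
from subprocess import Popen

# Stand-in pipelines; replace with e.g. "cat app.log.gz | zgrep ERROR"
commands = [
    "echo 'line with ERROR' | grep ERROR",
    "echo 'all good here' | grep ERROR",
]

# Launch every pipeline at once; Popen returns immediately
processes = [Popen(cmd, shell=True) for cmd in commands]

# Wait for all pipelines; grep exits 0 on a match, 1 when nothing matched
exitcodes = [p.wait() for p in processes]
```

The exit codes let you tell which files contained the pattern without parsing any output.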

Subprocesses with Output Collection

Collecting each subprocess's output sequentially is simplest, but you can also collect outputs in parallel with a thread pool. Consider the following example:

<code class="python">#!/usr/bin/env python
from multiprocessing.dummy import Pool # thread pool
from subprocess import Popen, PIPE, STDOUT

# create a list of subprocesses with output handling
processes = [Popen("echo {i:d}; sleep 2; echo {i:d}".format(i=i), shell=True,
                   stdin=PIPE, stdout=PIPE, stderr=STDOUT, close_fds=True)
             for i in range(5)]

# collect outputs in parallel
def get_lines(process):
    return process.communicate()[0].splitlines()

outputs = Pool(len(processes)).map(get_lines, processes)</code>

This code runs subprocesses in parallel and collects their outputs concurrently using threads.
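An equivalent sketch using concurrent.futures, which some may find more readable than multiprocessing.dummy; this is an assumption-free variant built on subprocess.run, which blocks its own worker thread until the command completes:

```python
#!/usr/bin/env python
from concurrent.futures import ThreadPoolExecutor
from subprocess import run, PIPE, STDOUT

commands = ["echo {i:d}; echo {i:d}".format(i=i) for i in range(5)]

def get_lines(cmd):
    # run() waits for the command; each call blocks one worker thread
    result = run(cmd, shell=True, stdout=PIPE, stderr=STDOUT)
    return result.stdout.splitlines()

# One worker per command, so all commands run concurrently
with ThreadPoolExecutor(max_workers=len(commands)) as pool:
    outputs = list(pool.map(get_lines, commands))
```

Because map preserves input order, outputs[i] always corresponds to commands[i] regardless of which command finished first.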

Asyncio-Based Parallel Execution

On Python 3.7 and above (which provides asyncio.run), asyncio offers an elegant way to execute subprocesses concurrently. Here's an example:

<code class="python">#!/usr/bin/env python3
import asyncio
import sys
from subprocess import PIPE, STDOUT

async def get_lines(shell_command):
    p = await asyncio.create_subprocess_shell(
        shell_command, stdin=PIPE, stdout=PIPE, stderr=STDOUT
    )
    return (await p.communicate())[0].splitlines()


async def main():
    # create a list of coroutines for subprocess execution
    coros = [get_lines(f'"{sys.executable}" -c "print({i:d}); import time; time.sleep({i:d})"') for i in range(5)]

    # get subprocess outputs in parallel
    print(await asyncio.gather(*coros))

if __name__ == "__main__":
    asyncio.run(main())</code>

This code demonstrates how to run subprocesses concurrently within a single thread.
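When there are many files, launching every subprocess at once can overwhelm the remote server. A sketch of bounding concurrency with an asyncio.Semaphore (the limit of 2 here is arbitrary; tune it for your workload):

```python
#!/usr/bin/env python3
import asyncio
from subprocess import PIPE, STDOUT

async def get_lines(sem, shell_command):
    # The semaphore caps how many subprocesses run at the same time
    async with sem:
        p = await asyncio.create_subprocess_shell(
            shell_command, stdout=PIPE, stderr=STDOUT
        )
        return (await p.communicate())[0].splitlines()

async def main():
    sem = asyncio.Semaphore(2)  # at most 2 concurrent subprocesses
    coros = [get_lines(sem, "echo {i:d}".format(i=i)) for i in range(5)]
    # gather preserves order, so results line up with the commands
    return await asyncio.gather(*coros)

outputs = asyncio.run(main())
```

Each coroutine waits its turn at the semaphore, so only two pipelines run at any moment while the rest stay queued inside the event loop.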

By implementing these approaches, you can significantly improve the efficiency of your script by executing multiple 'cat | zgrep' commands in parallel on the remote server.
