Running Bash Commands in Python

Mateen Kiani

Published on Wed Jul 23 2025 · 4 min read


Ever had to call a shell command from your Python script? It’s often the quickest way to move files, check system status, or kick off a build process. But there’s a detail many developers gloss over: how to manage errors, security, and output when invoking shell commands. How can you safely and effectively run and manage bash commands in Python?

In this guide, we’ll explore the built-in subprocess module, dive into asynchronous execution with asyncio, and look at popular third-party libraries. Understanding these options helps you capture output reliably, prevent security issues, and keep your code clean. With this knowledge, you’ll automate tasks confidently and avoid unexpected failures.

Using subprocess

The subprocess module is the standard way to run shell commands in Python. It replaces older functions like os.system by offering more control and safety. Here’s a basic example:

import subprocess

result = subprocess.run(
    ["ls", "-l", "/etc"],
    capture_output=True,
    text=True
)
print("Exit code:", result.returncode)
print("STDOUT:\n", result.stdout)

Key points:

  • capture_output=True grabs both stdout and stderr.
  • text=True returns strings instead of bytes.
  • result.returncode tells you whether the command succeeded (zero) or failed (non-zero).

Practical tip: if you need to move or rename files, you can call mv via subprocess instead of writing your own routine. For a pure Python alternative, see the python-mv-file guide.
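
As a quick illustration, here is a minimal sketch of both routes (the file paths are made up for the example):

import shutil
import subprocess

# Shelling out to mv (Unix-only)
subprocess.run(["mv", "report.txt", "reports/report.txt"], check=True)

# Pure Python equivalent that also works on Windows
shutil.move("reports/report.txt", "archive/report.txt")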

Capturing Output and Errors

In real-world scripts, logging and error handling are crucial. You can redirect output to files, parse results, or raise exceptions on failure. For example:

import subprocess

try:
    completed = subprocess.run(
        ["grep", "error", "app.log"],
        capture_output=True,
        text=True,
        check=True
    )
    with open("errors.txt", "w") as f:
        f.write(completed.stdout)
    print("Errors extracted to errors.txt")
except subprocess.CalledProcessError as e:
    print("Command failed, exit code", e.returncode)

Here we use check=True to raise a CalledProcessError if the command fails. Writing output to a file is easy; for structured data you might combine this with JSON routines. Check out how to write JSON files in Python for that workflow: python-write-json-to-file.
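
Building on that, here is a minimal sketch that saves the grep matches as JSON instead of plain text (it assumes app.log exists):

import json
import subprocess

completed = subprocess.run(
    ["grep", "error", "app.log"],
    capture_output=True,
    text=True,
    check=True
)

# Shape the raw output into a structured record before saving it
report = {"pattern": "error", "matches": completed.stdout.splitlines()}
with open("errors.json", "w") as f:
    json.dump(report, f, indent=2)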

Tip: Always capture stderr when diagnosing failures. You can use stderr=subprocess.STDOUT to merge it with stdout.
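
For example, a small sketch (the make target is illustrative; capture_output=True cannot be combined with stderr=subprocess.STDOUT, so stdout is set to PIPE explicitly):

import subprocess

# Merge stderr into stdout so one stream holds the full log
result = subprocess.run(
    ["make", "build"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True
)
print(result.stdout)  # stdout and stderr, interleaved in order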

Avoiding Injection Risks

Passing user input into shell commands can open the door to code injection. Avoid this by using argument lists instead of a single shell string. Unsafe:

import os

user = input("Enter name: ")
os.system(f"grep {user} /etc/passwd")  # user input is interpolated straight into the shell

Safe:

import subprocess
user = input("Enter name: ")
subprocess.run(["grep", user, "/etc/passwd"], check=True)

Never set shell=True with untrusted input. If you must run a shell pipeline, sanitize inputs or use specialized libraries.

Practical steps:

  1. Always pass a list of arguments.
  2. Use shlex.quote() for any dynamic values when shell=True is unavoidable (see the sketch after this list).
  3. Validate or whitelist input patterns before calling.
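
When a pipeline truly needs the shell, step 2 might look like this sketch:

import shlex
import subprocess

user = input("Enter name: ")

# shlex.quote() wraps the value so shell metacharacters are inert
cmd = f"grep {shlex.quote(user)} /etc/passwd | cut -d: -f1"
subprocess.run(cmd, shell=True)  # grep exits 1 on no match, so check=True is omitted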

Async Execution with asyncio

For high-throughput tasks, you might launch multiple commands concurrently. Python’s asyncio supports this:

import asyncio

async def run(cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE
    )
    out, err = await proc.communicate()
    return proc.returncode, out.decode(), err.decode()

async def main():
    tasks = [run(["sleep", "1"]) for _ in range(5)]
    results = await asyncio.gather(*tasks)
    print(results)

asyncio.run(main())

Why use async?

  • You can start many processes without blocking the event loop.
  • Better resource management for I/O-bound tasks.
  • Ideal for CI/CD pipelines that run tests in parallel.

Note: asyncio shines for I/O-bound work. For CPU-bound tasks, reach for a process pool instead.
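
Since waiting on a child process is itself I/O-bound, a thread pool is a simple alternative when you do not want an event loop. A minimal sketch reusing the sleep example:

import subprocess
from concurrent.futures import ThreadPoolExecutor

def run(cmd):
    # subprocess.run blocks in a wait syscall, so threads overlap nicely
    return subprocess.run(cmd, capture_output=True, text=True).returncode

with ThreadPoolExecutor(max_workers=5) as pool:
    codes = list(pool.map(run, [["sleep", "1"]] * 5))

print(codes)  # five zeros after roughly one second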

Comparing Third-Party Tools

Beyond built-ins, these libraries offer more features:

Library   Features                             Best For
sh        Pythonic shell calls, piping         Quick scripts
pexpect   Interactive sessions, prompts        Automating interactive CLIs
plumbum   Local & remote commands, pipelines   Complex workflows

Example with sh:

from sh import ls, grep
print(ls("-la"))
print(grep("def", "script.py"))

Choose based on your needs: simple syntax (sh), interactive control (pexpect), or full DSL (plumbum).
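
As one more taste, plumbum composes commands with the | operator and runs nothing until the pipeline is called (a sketch; install with pip install plumbum first):

from plumbum.cmd import grep, ls

# Build the pipeline lazily, then invoke it
pipeline = ls["-la"] | grep["py"]
print(pipeline())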

Conclusion

Running bash commands from Python is a powerful tool in any developer’s kit. Whether you pick the built-in subprocess module, embrace asyncio for concurrency, or leverage third-party libraries like sh, you can automate system tasks with confidence. Remember to always capture output, handle errors, and guard against injection by passing argument lists or sanitizing input. With these practices, your scripts will be more reliable, secure, and maintainable. Now it’s your turn—pick a method, write a helper function, and start integrating shell commands into your next Python project!


Mateen Kiani
kiani.mateen012@gmail.com
I am a passionate full-stack developer with around 4 years of experience in MERN stack development and 1 year of experience in blockchain application development. I have completed several projects in the MERN stack, Next.js, and blockchain, including some NFT marketplaces. I have extensive experience with Node.js, Express, React, and Redux.