Mateen Kiani
Published on Wed Jul 23 2025·4 min read
Ever had to call a shell command from your Python script? It’s often the quickest way to move files, check system status, or kick off a build process. But there’s a related concern that many developers gloss over: how to manage errors, security, and output when invoking bash commands. How can you safely and effectively run and manage bash commands in Python?
In this guide, we’ll explore the built-in `subprocess` module, dive into asynchronous execution with `asyncio`, and look at popular third-party libraries. Understanding these options helps you capture output reliably, prevent security issues, and keep your code clean. With this knowledge, you’ll automate tasks confidently and avoid unexpected failures.
The `subprocess` module is the standard way to run shell commands in Python. It replaces older functions like `os.system` by offering more control and safety. Here’s a basic example:
```python
import subprocess

result = subprocess.run(
    ["ls", "-l", "/etc"],
    capture_output=True,
    text=True,
)

print("Exit code:", result.returncode)
print("STDOUT:\n", result.stdout)
```
Key points:

- `capture_output=True` grabs both stdout and stderr.
- `text=True` returns strings instead of bytes.
- `result.returncode` tells if the command succeeded (zero) or failed (non-zero).

Practical tip: if you need to move or rename files, you can call `mv` via subprocess instead of writing your own routine. For a pure Python alternative, see how to handle file moves with the python-mv-file guide.
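For instance, a minimal sketch of both approaches might look like this (the file paths are placeholders):

```python
import shutil
import subprocess

# Shelling out to mv (Unix only); check=True raises if the move fails.
subprocess.run(["mv", "build/output.log", "logs/output.log"], check=True)

# Pure-Python alternative that also works on Windows.
shutil.move("logs/output.log", "logs/output.old.log")
```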
In real-world scripts, logging and error handling are crucial. You can redirect output to files, parse results, or raise exceptions on failure. For example:
```python
import subprocess

try:
    completed = subprocess.run(
        ["grep", "error", "app.log"],
        capture_output=True,
        text=True,
        check=True,
    )
    with open("errors.txt", "w") as f:
        f.write(completed.stdout)
    print("Errors extracted to errors.txt")
except subprocess.CalledProcessError as e:
    print("Command failed, exit code", e.returncode)
```
Here we use `check=True` to raise a `CalledProcessError` if the command fails. Writing output to a file is easy; for structured data you might combine this with JSON routines. Check out how to write JSON files in Python for that workflow: python-write-json-to-file.
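A hedged sketch of that combination, capturing `df -h` output and dumping structured results with the standard `json` module (the command and output path are just examples):

```python
import json
import subprocess

completed = subprocess.run(
    ["df", "-h"],
    capture_output=True,
    text=True,
    check=True,
)

# Store the exit code and raw output lines as structured JSON.
report = {
    "returncode": completed.returncode,
    "lines": completed.stdout.splitlines(),
}

with open("disk_report.json", "w") as f:
    json.dump(report, f, indent=2)
```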
Tip: Always capture stderr when diagnosing failures. You can use `stderr=subprocess.STDOUT` to merge it with stdout.
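A small illustration of that merge (note that `capture_output=True` cannot be combined with an explicit `stderr` argument, so `stdout=subprocess.PIPE` is set directly; the `make build` command is just an example):

```python
import subprocess

# stderr is redirected into stdout, so one stream holds the full story.
result = subprocess.run(
    ["make", "build"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
print(result.stdout)
```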
Passing user input into shell commands can open the door to code injection. Avoid this by using argument lists instead of a single shell string. Unsafe:
```python
import os

# Unsafe: user input is interpolated straight into a shell string.
user = input("Enter name: ")
os.system(f"grep {user} /etc/passwd")
```
Safe:
```python
import subprocess

# Safe: the value is passed as a single argument, never parsed by a shell.
user = input("Enter name: ")
subprocess.run(["grep", user, "/etc/passwd"], check=True)
```
Never set `shell=True` with untrusted input. If you must run a shell pipeline, sanitize inputs or use specialized libraries.
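If a pipeline really does force `shell=True` on you, one hedged approach is to quote every dynamic value with `shlex.quote()`. A minimal sketch (the log file and pipeline are made up for illustration):

```python
import shlex
import subprocess

pattern = input("Search pattern: ")

# shlex.quote() wraps the value in safe quoting so the shell treats it
# as a single literal argument rather than extra shell syntax.
cmd = f"grep {shlex.quote(pattern)} app.log | sort | uniq -c"
subprocess.run(cmd, shell=True, check=True)
```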
Practical steps:

- Use `shlex.quote()` for any dynamic values when `shell=True` is unavoidable, as in the sketch above.

For high-throughput tasks, you might launch multiple commands concurrently. Python’s `asyncio` supports this:
```python
import asyncio

async def run(cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    out, err = await proc.communicate()
    return proc.returncode, out.decode(), err.decode()

async def main():
    tasks = [run(["sleep", "1"]) for _ in range(5)]
    results = await asyncio.gather(*tasks)
    print(results)

asyncio.run(main())
```
Why use async? Because the commands run concurrently, the five `sleep 1` calls above finish in roughly one second instead of five, and the event loop stays free for other work while the children run.
Note: asyncio only helps while commands are waiting; for CPU-bound work in Python itself, use a process pool instead, since threads are limited by the GIL.
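If you would rather stay synchronous, a thread pool is another way to fan out blocking `subprocess.run()` calls, since each call mostly waits on its child process; a rough sketch:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

commands = [["sleep", "1"] for _ in range(5)]

def run(cmd):
    # Each worker thread blocks on its own child process.
    return subprocess.run(cmd, capture_output=True, text=True).returncode

with ThreadPoolExecutor(max_workers=5) as pool:
    print(list(pool.map(run, commands)))
```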
Beyond built-ins, these libraries offer more features:
| Library | Features | Best For |
| --- | --- | --- |
| sh | Pythonic shell calls, piping | Quick scripts |
| pexpect | Interactive sessions, prompts | Automating interactive CLI |
| plumbum | Local & remote commands, pipelines | Complex workflows |
Example with `sh`:
```python
from sh import ls, grep

print(ls("-la"))
print(grep("def", "script.py"))
```
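For comparison, a small `pexpect` sketch might drive an interactive Python prompt (any program with a predictable prompt works the same way):

```python
import pexpect

# Spawn an interactive interpreter inside a pseudo-terminal.
child = pexpect.spawn("python3", encoding="utf-8")
child.expect(">>> ")            # wait for the first prompt
child.sendline("print(6 * 7)")
child.expect(">>> ")            # wait for the prompt after the command
print(child.before)             # echoed input plus the output "42"
child.sendline("exit()")
child.expect(pexpect.EOF)
```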
Choose based on your needs: simple syntax (`sh`), interactive control (`pexpect`), or a full DSL (`plumbum`).
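And a short `plumbum` sketch of a local pipeline, to show the DSL flavor (the commands are arbitrary):

```python
from plumbum import local

# Bind commands once, then compose them like shell pipelines.
ls = local["ls"]
grep = local["grep"]

pipeline = ls["-la"] | grep["py"]
print(pipeline())  # runs the pipeline and returns its stdout
```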
Running bash commands from Python is a powerful tool in any developer’s kit. Whether you pick the built-in `subprocess` module, embrace `asyncio` for concurrency, or leverage third-party libraries like `sh`, you can automate system tasks with confidence. Remember to always capture output, handle errors, and guard against injection by passing argument lists or sanitizing input. With these practices, your scripts will be more reliable, secure, and maintainable. Now it’s your turn: pick a method, write a helper function, and start integrating shell commands into your next Python project!