Mateen Kiani
Published on Tue Jul 22 2025 · 4 min read
Listing directory contents is one of the first tasks you often need when scripting or automating workflows in Python. Whether you’re building a file management tool, cleaning up old logs, or just checking what’s in a folder, knowing how to “ls” in Python gives you flexibility beyond shell commands. But did you know that Python’s different libraries let you filter, format, and even traverse directories recursively with ease?
In this article, we’ll explore built-in modules like `os`, `pathlib`, and `glob`, show you how to present results cleanly, and even demonstrate exporting file lists to JSON. By the end, you’ll have a solid toolbox to replicate and extend the classic `ls` command in your own scripts.
The `os` module provides a straightforward way to list directory items:
```python
import os

entries = os.listdir('.')  # List names in current directory
for name in entries:
    print(name)
```
If you need file metadata or want to skip hidden files, `os.scandir` is faster and richer:
```python
import os

with os.scandir('.') as it:
    for entry in it:
        if not entry.name.startswith('.'):  # skip hidden entries
            kind = 'dir' if entry.is_dir() else 'file'
            print(f"{entry.name} ({kind})")
```
This gives you `DirEntry` objects with methods like `.is_file()`, `.stat()`, and `.name`.
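As a quick illustration of those methods, here is a minimal sketch (not from the original example) that uses `.stat()` to print each file's size:

```python
import os

with os.scandir('.') as it:
    for entry in it:
        if entry.is_file():
            size = entry.stat().st_size  # the stat result is cached on the DirEntry
            print(f"{entry.name}: {size} bytes")
```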
Tip: `os.scandir` is much faster on large directories than `os.listdir` + `os.stat` calls.
Python 3’s `pathlib` brings an object-oriented approach:
```python
from pathlib import Path

path = Path('.')
for entry in path.iterdir():
    status = '<DIR>' if entry.is_dir() else '<FILE>'
    print(entry.name, status)
```
Advantages:

- Rich `Path` objects with methods like `.is_dir()`, `.suffix`, and `.stat()`
- Natural path joining with the `/` operator, e.g. `path / 'subfolder'`
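The `/` operator is worth a closer look. Here is a minimal sketch (the directory and file names are placeholders) showing how paths are built without string concatenation:

```python
from pathlib import Path

# 'data', 'configs', and 'settings.toml' are hypothetical names for illustration
config = Path('data') / 'configs' / 'settings.toml'
print(config)           # data/configs/settings.toml (separator matches the OS)
print(config.suffix)    # .toml
print(config.exists())  # False unless the path really exists
```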
Example: filter by extension:
```python
for py_file in path.glob('*.py'):
    print(py_file.name)
```
When you need a full directory tree, `os.walk` is ideal:
```python
import os

def walk_dir(root):
    for dirpath, dirnames, filenames in os.walk(root):
        print(f"Directory: {dirpath}")
        for fname in filenames:
            print(f" - {fname}")

walk_dir('.')
```
Features:

- Yields a `(current_path, [dirs], [files])` tuple for each directory visited
- You can modify `dirnames` in place to prune the walk

Note: You can limit depth by counting separators in `dirpath`.
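Putting both of those features together, here is a minimal sketch (the `.git` exclusion and depth of 2 are just example choices) that prunes the walk and stops below a fixed depth:

```python
import os

def walk_limited(root, max_depth=2):
    root = root.rstrip(os.sep) or os.sep
    root_depth = root.count(os.sep)
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune in place: os.walk skips any directory removed from dirnames
        dirnames[:] = [d for d in dirnames if d != '.git']
        if dirpath.count(os.sep) - root_depth >= max_depth:
            dirnames[:] = []  # depth limit reached, don't descend further
        print(dirpath, filenames)

walk_limited('.')
```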
Beyond `Path.glob`, the `glob` module provides shell-style patterns:
```python
import glob

# All JPG and PNG files:
for image in glob.glob('images/*.[pj][pn]g'):
    print(image)
```
Or use `fnmatch` for in-memory filtering:
```python
import fnmatch, os

for name in os.listdir('.'):
    if fnmatch.fnmatch(name, 'data_*.csv'):
        print(name)
```
These tools let you match complex patterns without manual string checks.
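The `glob` module also understands the `**` wildcard when you pass `recursive=True`, which is handy for matching across nested folders; a small sketch:

```python
import glob

# Every Python file at or below the current directory
for path in glob.glob('**/*.py', recursive=True):
    print(path)
```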
For a user-friendly view, you can format columns or add color:
```python
from pathlib import Path
from shutil import get_terminal_size

cols = get_terminal_size().columns
files = list(Path('.').iterdir())
batch = cols // 20 or 1  # entries per row, assuming ~20 characters each
for idx, f in enumerate(files, start=1):
    print(f.name.ljust(18), end='')
    if idx % batch == 0:
        print()
print()  # finish the last row with a newline
```
For colored output, consider using `colorama`:

```python
from pathlib import Path

from colorama import Fore, Style, init

init()  # enable ANSI colors on Windows terminals
for entry in Path('.').iterdir():
    color = Fore.BLUE if entry.is_dir() else Fore.GREEN
    print(color + entry.name + Style.RESET_ALL)
```
Often you’ll want to export a directory list to a file. You can dump it as JSON:
```python
import json
from pathlib import Path

files = [str(p) for p in Path('.').iterdir()]
with open('files.json', 'w') as f:
    json.dump(files, f, indent=2)
```
This approach works great for APIs or reporting. Learn more about saving lists in Python in this guide on how to save a Python list to a file.
If you need advanced JSON writing, check out writing JSON to file for detailed patterns.
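For richer reports you may want metadata alongside each name. Here is a minimal sketch (the field names are my own, not from the article) that records size and modification time per entry:

```python
import json
from datetime import datetime
from pathlib import Path

records = []
for p in Path('.').iterdir():
    info = p.stat()
    records.append({
        'name': p.name,
        'is_dir': p.is_dir(),
        'size_bytes': info.st_size,
        'modified': datetime.fromtimestamp(info.st_mtime).isoformat(),
    })

with open('files_report.json', 'w') as f:
    json.dump(records, f, indent=2)
```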
- Prefer `pathlib` for new scripts for cleaner, cross-platform code.
- Consider the `scandir` backport (on older Pythons) or `watchdog` for live tracking.
- Pro Tip: Combine filters and recursive walks to gather only certain file types across nested folders (see the sketch below).
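As a concrete version of that tip, a minimal sketch using `Path.rglob` to collect only certain extensions (the `.csv`/`.json` choice is just an example):

```python
from pathlib import Path

wanted = {'.csv', '.json'}  # example extensions; adjust as needed
matches = [p for p in Path('.').rglob('*') if p.is_file() and p.suffix in wanted]
for match in matches:
    print(match)
```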
Python makes it simple to replicate and extend the power of the Unix `ls` command within your scripts. The `os` module gives you raw speed, `pathlib` offers elegance, and `glob` or `fnmatch` lets you filter by patterns. You can traverse recursively with `os.walk`, format columns for readability, colorize entries, and even export lists to JSON for downstream processing.
With these tools in hand, you can build automated scans, reporting tools, or interactive file explorers without ever leaving Python. Next time you need to inspect a filesystem or generate reports, skip the shell invocation and stay in your script — Python’s filesystem APIs have you covered.