Python Write JSON to File

Mateen Kiani

Published on Wed Jul 16 2025 · 9 min read

Working with JSON is a daily part of many Python projects. We use JSON for config files, API payloads, and data exchange. But one often overlooked aspect is handling Unicode characters and ensuring proper encoding when writing to a file. Forget this and you might end up with escaped characters or broken text. How can you make sure your JSON file stays readable and correctly encoded?

You can use Python’s built-in json module with the ensure_ascii parameter set to False and open the file with the right encoding. This combination ensures Unicode characters are written as is, not escaped. Understanding this lets you maintain readability and data integrity for international text. It also prevents surprises when your application reads the file later.
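
For example, here is a minimal sketch of that combination (the file name greeting.json and the sample text are just illustrations):

import json

data = {"city": "Zürich", "greeting": "こんにちは"}
with open("greeting.json", "w", encoding="utf-8") as f:
    # ensure_ascii=False keeps "Zürich" and "こんにちは" readable on disk
    # instead of escaping them as \u00fc, \u3053, and so on.
    json.dump(data, f, ensure_ascii=False)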

Why JSON Matters

JSON is more than just a data-interchange format. It powers configuration files, API responses, and logs across dozens of Python libraries. You will find JSON in web applications, data processing scripts, and even microservices. Knowing how to write JSON files helps you standardize outputs. It keeps data structured and portable.

Here are common scenarios where writing JSON to a file adds value:

  • Config storage: Save settings that your script can load at runtime.
  • Data exchange: Share results between programs or machines.
  • Caching results: Speed up repeated operations by storing data locally.
  • Audit logs: Record actions in a structured, searchable format.
  • Backup snapshots: Preserve complex data trees between runs.

By storing JSON on disk, you can inspect files directly. Version control tools will also show changes clearly. This transparency helps during debugging. As your projects grow in complexity, writing clear JSON files becomes a cornerstone of maintainable code.

Whenever you write a JSON file, choose a clear name and use the .json extension. This ensures your editor applies the right syntax highlighting. You can open the file in any text editor to verify its contents. Many CI/CD pipelines automatically validate JSON files before deploying. If the file is malformed, automated checks can fail early and save you from runtime errors in production.
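
If you want a quick manual check, Python ships with a small command-line validator; a hedged example (your pipeline may use a different tool):

python -m json.tool user.json

It pretty-prints valid JSON and exits with a non-zero status if the file is malformed.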

For cross-language projects, JSON acts as a neutral format. A file written in Python can be consumed by JavaScript, Java, Go, or any language with a JSON parser. This flexibility makes JSON the go-to format for data interchange. Understanding why JSON files matter sets the stage for learning how to write them correctly in Python.

Basic json.dump Example

Writing JSON in Python is straightforward thanks to the built-in json module. Start by importing the module and preparing your data. Here is a minimal example:

import json

data = {
    "name": "Alice",
    "age": 30,
    "languages": ["Python", "JavaScript", "Go"]
}

with open("user.json", "w", encoding="utf-8") as file:
    json.dump(data, file)

In this snippet:

  • We import json.
  • We define a Python dictionary.
  • We open a file called user.json in write mode.
  • We pass the data and file handle to json.dump().

Note that json.dump() writes the JSON text directly to the file. If you need a string instead, use json.dumps(). Dumping directly is more efficient because it does not build a large in-memory string.
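
For comparison, a quick sketch of json.dumps, which returns the JSON text as a string you can log or send elsewhere:

import json

data = {"name": "Alice", "age": 30}
text = json.dumps(data)
print(text)  # {"name": "Alice", "age": 30}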

Tip: Add a newline at the end of the file for better command-line display:

with open("user.json", "w", encoding="utf-8") as file:
    json.dump(data, file)
    file.write("\n")

When you open the file in write mode ("w"), Python creates the file if it does not exist. If the file exists, it is overwritten. To append JSON objects, use append mode ("a") but be cautious: you will need to manage commas and array brackets manually. For most use cases, writing a complete JSON structure in one shot is simpler.
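
To see why append mode needs care, here is a small sketch (not from the original example): two plain json.dump calls in "a" mode produce back-to-back objects, which is not a single valid JSON document.

import json

with open("events.json", "a", encoding="utf-8") as f:
    json.dump({"event": "start"}, f)
    json.dump({"event": "stop"}, f)
# The file now contains {"event": "start"}{"event": "stop"},
# which json.load() rejects with an "Extra data" error.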

Remember that file paths can be relative or absolute. If you point to a nested directory, ensure the directory exists first. Use os.makedirs() with exist_ok=True to create folders automatically:

import os

os.makedirs("data", exist_ok=True)
with open("data/user.json", "w", encoding="utf-8") as file:
    json.dump(data, file)

Handling paths and modes correctly prevents runtime errors and makes your scripts more reliable. Up next, learn how to format JSON for humans.

Formatting and Readability

By default, json.dump writes everything on one long line. That is efficient but hard for humans to read. You can improve readability with the indent parameter, which adds line breaks and spaces. Here is how:

with open("user.json", "w", encoding="utf-8") as file:
    json.dump(data, file, indent=4)

Common formatting options:

  • indent: Number of spaces for each indent level (e.g., 2 or 4).
  • separators: A tuple controlling the item and key separators. The default is (', ', ': '), or (',', ': ') when indent is set.
  • sort_keys: Set to True to sort dictionary keys alphabetically.
  • ensure_ascii: Set to False to allow Unicode characters without escaping.

Example with all options:

json.dump(
    data,
    file,
    indent=2,
    separators=(',', ': '),
    sort_keys=True,
    ensure_ascii=False
)

Tip: Use ensure_ascii=False to prevent Unicode characters from being written as escape sequences like "\u2013". This makes the JSON file easier to read and edit manually.

Readable JSON is easier to validate with linters and diff tools. It also makes debugging faster, especially when teams collaborate on configuration files or API samples.

You can further customize formatting by writing a custom JSONEncoder or post-processing with .replace() for special formatting needs. But for most cases, indent and separators cover the essentials. Next, we will show how to handle complex data types.

Complex Data Encoding

Standard JSON types include strings, numbers, arrays, and objects. But real-world data often includes types like datetime, Decimal, or custom classes. json.dump will fail on these by default. You need to convert or encode them manually. There are two main strategies:

  1. Preprocessing: Convert objects to built-in types before dumping.
  2. Custom JSONEncoder: Subclass json.JSONEncoder to handle special types.

Example using preprocessing:

from datetime import datetime

record = {
    "user": "Bob",
    "timestamp": datetime.utcnow()
}

# Convert datetime to ISO string
record["timestamp"] = record["timestamp"].isoformat()

with open("record.json", "w", encoding="utf-8") as f:
    json.dump(record, f, indent=2, ensure_ascii=False)

Example using a custom encoder:

import json
from datetime import datetime

class MyEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        return super().default(obj)

data = {"time": datetime.now()}

with open("data.json", "w", encoding="utf-8") as f:
    json.dump(data, f, cls=MyEncoder)

This approach keeps your serialization logic in one place and scales better for multiple special types. If you plan to serve JSON data over HTTP, take a look at this REST API with Flask guide. By customizing the encoder, you ensure consistent output across your application.
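
As a sketch of how a single encoder can grow to cover several types (the AppEncoder name and the Decimal and set handling here are illustrative, not from the snippet above):

import json
from datetime import datetime
from decimal import Decimal

class AppEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        if isinstance(obj, Decimal):
            # Store exact decimals as strings to avoid float rounding.
            return str(obj)
        if isinstance(obj, set):
            return sorted(obj)
        return super().default(obj)

payload = {"when": datetime.now(), "price": Decimal("19.99"), "tags": {"a", "b"}}
print(json.dumps(payload, cls=AppEncoder))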

Error Handling Strategies

When writing JSON files, you may encounter two main errors: I/O errors and serialization errors. Proper error handling prevents crashes and data loss. Here is a pattern to follow:

import json

try:
    with open("config.json", "w", encoding="utf-8") as file:
        json.dump(config_data, file, indent=2)
except (TypeError, ValueError) as err:
    # json.dump raises TypeError for objects it cannot serialize
    # and ValueError for problems such as circular references.
    print("Serialization error:", err)
except OSError as ose:
    print("File write error:", ose)

Tip: Catch json.JSONDecodeError when reading with json.load or json.loads. When writing with json.dump, serialization problems surface as TypeError or ValueError, so catch those before bad data reaches the disk.

For large data, you might want to write in chunks. In that case, wrap json.dumps calls:

def write_chunked(data_list, filepath):
    try:
        with open(filepath, "w", encoding="utf-8") as f:
            f.write("[\n")
            for i, item in enumerate(data_list):
                chunk = json.dumps(item, ensure_ascii=False, indent=2)
                # Write a comma before every item except the first,
                # so the result stays valid JSON (no trailing comma).
                if i:
                    f.write(",\n")
                f.write(chunk)
            f.write("\n]\n")
    except Exception as e:
        print("Error writing chunked JSON:", e)

Using with ensures the file is closed even if an error occurs. Always validate your data shapes before dumping to avoid unexpected exceptions.
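
One lightweight way to pre-check a data shape (just a sketch; the is_json_ready helper is hypothetical, not a standard function) is a recursive type test before calling json.dump:

def is_json_ready(value):
    # Conservative check: string keys and JSON-native value types only.
    if isinstance(value, (str, int, float, bool)) or value is None:
        return True
    if isinstance(value, (list, tuple)):
        return all(is_json_ready(v) for v in value)
    if isinstance(value, dict):
        return all(isinstance(k, str) and is_json_ready(v) for k, v in value.items())
    return False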

Performance and Best Practices

Writing JSON files fast and safely matters when dealing with large datasets. Here are some tips to optimize performance:

  • Use json.dump over json.dumps to avoid large intermediate strings.
  • Stream data in chunks for very large lists.
  • Avoid loading huge datasets into memory at once.
  • Consider alternative formats (like YAML or msgpack) only if JSON becomes a bottleneck.

Example of streaming:

def stream_json(data_iter, filepath):
    with open(filepath, "w", encoding="utf-8") as f:
        f.write("[")
        first = True
        for item in data_iter:
            if not first:
                f.write(",")
            f.write(json.dumps(item))
            first = False
        f.write("]")

Data integrity is also key. Always write to a temp file first and rename on success:

import os
import json

temp_path = filepath + ".tmp"
with open(temp_path, "w", encoding="utf-8") as f:
    json.dump(data, f)
os.replace(temp_path, filepath)

This approach prevents corrupt files if the process is interrupted. Finally, track your JSON files in version control. For an introduction to Git and GitHub, check this guide to Git and GitHub.

Conclusion

Writing JSON to a file in Python is simple at first glance, but mastering it opens up safe data storage, readable configurations, and reliable APIs. We covered the basics with json.dump, improved readability using indent and separators, handled complex data via custom encoders, and implemented error strategies to prevent crashes. Along the way, we learned how to optimize for performance with streaming writes and atomic file replaces. By following these practices, you ensure your JSON files are robust, maintainable, and ready for collaboration.

Next time you need to dump data in Python, use these patterns to avoid common pitfalls. Whether you are developing a small script or a large application, proper JSON writing can save hours of debugging. Dive into the code examples, adapt them to your needs, and consider exploring more on REST API with Flask when building web services. Now you’re equipped to write JSON files like a pro—go ahead and give it a try!



Mateen Kiani
kiani.mateen012@gmail.com
I am a passionate Full stack developer with around 3 years of experience in MERN stack development and 1 year experience in blockchain application development. I have completed several projects in MERN stack, Nextjs and blockchain, including some NFT marketplaces. I have vast experience in Node js, Express, React and Redux.