Mateen Kiani
Published on Sun Jul 06 2025 · 4 min read
We all work with JSON data in Node.js daily: parsing configs, loading fixtures, or seeding databases. But there's a specific, often overlooked detail: the difference between synchronous require calls, asynchronous file reads, and ES module imports when bringing JSON into your code. Have you ever wondered which approach is best for your project, and what pitfalls to watch out for?
In this guide, we will walk through the main methods to import JSON files in Node.js, from the classic require syntax to modern ES module imports and file system reads. By understanding these options, you can choose a method that fits your performance needs, code style, and module format. Let's explore these techniques and avoid common surprises.
Importing JSON in Node.js is a core task for many applications. Whether you need to load configuration settings, seed test data, or bundle static content, JSON is a portable and human-readable format. But before diving into code, it helps to ask: what do you expect from the import? Do you need synchronous loading during startup, or async handling at runtime? Might you run into caching issues or file path confusion?
Node.js supports multiple ways to import and parse JSON. Using require is fast and simple, but it works only in CommonJS modules and caches the result. Reading with fs gives you a chance to handle streams and async events but adds boilerplate. With ES modules, you can use a new import assertion or dynamic import, though extra flags may be needed. Knowing the trade-offs helps you pick the right tool for each use case.
The simplest way to bring JSON into a CommonJS module is require. Node.js treats a .json file as a special case, parsing it to an object at load time. For example:
```js
// config.js
const settings = require('./settings.json');
console.log(settings.databaseUrl);
```
Require is synchronous, which is fine when you need the data during app startup. Once loaded, Node caches the parsed result in memory, so repeated require calls are fast. But caching means that if you change the JSON file at runtime, require will not reload it.
Tip: Because require only works in CommonJS, you cannot use it in ESM without module.createRequire or a loader. Also, excessive synchronous I/O at startup can slow down cold starts in serverless functions.
If you need asynchronous loading or dynamic file paths, fs.readFile is your friend. You can read the file, parse JSON, and handle errors in a callback or with promises. Example using promises:
```js
import fs from 'fs/promises';

async function loadData(path) {
  try {
    const raw = await fs.readFile(path, 'utf8');
    return JSON.parse(raw);
  } catch (err) {
    console.error('Failed to load JSON:', err);
    throw err;
  }
}

loadData('./data.json').then(data => console.log(data));
```
This approach avoids blocking the event loop and works in both CommonJS and ESM. It also gives you control to check the file existence first, as shown in the check if file exists guide, before attempting a read.
With modern Node.js versions, you can import JSON directly in ESM modules using an import attribute. Current releases (Node 20.10+) use the `with` keyword; older versions used `assert` instead, and some releases required the --experimental-json-modules flag:
```js
// index.mjs
// Older Node releases used `assert { type: 'json' }` here instead of `with`.
import config from './config.json' with { type: 'json' };
console.log(config.key);
```
If you prefer dynamic import, you can do:
```js
async function getConfig() {
  // On older Node releases, the option key was `assert` instead of `with`.
  const mod = await import('./config.json', { with: { type: 'json' } });
  return mod.default;
}
```
Note: Ensure your package.json has `"type": "module"` to use ESM syntax. Without it, Node treats .js files as CommonJS.
Whether you use require, fs.readFile, or import, robust error handling matters. Common issues include missing files, invalid JSON, or wrong file paths. Here are some practices:
```js
try {
  const data = require('./config.json');
  if (!data.apiKey) throw new Error('Missing apiKey');
} catch (err) {
  console.error('Config load failed:', err.message);
  process.exit(1);
}
```
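Invalid JSON surfaces as a SyntaxError from JSON.parse, which you can catch specifically. A small sketch (the helper name `safeParse` is made up for illustration) that falls back to a default value on malformed input:

```javascript
// Parse JSON text, returning a fallback value when the text is malformed.
function safeParse(text, fallback) {
  try {
    return JSON.parse(text);
  } catch (err) {
    if (err instanceof SyntaxError) return fallback; // malformed JSON
    throw err; // anything else is unexpected, so rethrow
  }
}

console.log(safeParse('{"ok": true}', {})); // { ok: true }
console.log(safeParse('{oops}', {}));       // {}
```

Rethrowing non-SyntaxError values keeps genuine bugs visible instead of silently swallowing them.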
When importing JSON files in Node.js, keep these tips in mind:

- Prefer require (or an ESM import) for small, static configs loaded once at startup, and remember the result is cached.
- Use fs/promises with JSON.parse for large files, dynamic paths, or data that can change at runtime.
- Resolve paths relative to the module rather than the process working directory.
- Validate required fields after parsing and fail fast with a clear error message.

These practices help you maintain clarity, performance, and resilience.
Importing JSON in Node.js may seem trivial at first, but picking the right method can have a real impact on load times, memory usage, and developer experience. We covered how to use require in CommonJS, the flexibility of fs.readFile in both module systems, and the modern import attribute syntax in ESM. You also learned to handle errors gracefully and follow best practices to keep your codebase stable.
Next time you reach for a JSON file, think about the context: is it a small config at startup or a dynamic data payload at runtime? Choose the approach that aligns with your module type and performance goals. With these techniques in your toolkit, you can import JSON confidently and avoid those hidden pitfalls.