Node.js Write to File

Mateen Kiani

Published on Mon Jul 14 2025·4 min read


Introduction

Writing to files is a core part of many Node.js applications, from logging and reporting to saving user data. Yet, developers often overlook the nuances between different file-writing methods and how they impact performance and reliability. Have you ever wondered which approach suits your needs best: callbacks, promises, or streams?

Each method has its own trade-offs. In this guide, we’ll explore how understanding these options helps you choose the right tool for tasks like saving small logs or streaming large data sets. Mastering file writes can prevent data loss, improve throughput, and make your app more robust.

Basic File Writing

The simplest way to write a file in Node.js is with the fs.writeFile function. It takes a path, data, and a callback that fires on completion. For example:

const fs = require('fs');

fs.writeFile('output.txt', 'Hello, world!', err => {
  if (err) {
    console.error('Write failed:', err);
  } else {
    console.log('File saved.');
  }
});

This callback style is straightforward for small tasks. The write is asynchronous, so it won’t block your event loop. However, if you need to write many files or handle errors in a promise chain, callbacks can become nested and hard to read.

Tip: For quick scripts or small logs, fs.writeFile is often enough. Move to promises or streams as your needs grow.

Promises API

Modern Node.js offers a promise-based API under fs.promises. This lets you use async/await and cleaner error handling:

const fs = require('fs').promises;

async function saveData() {
  try {
    await fs.writeFile('data.txt', 'Important data');
    console.log('Data saved with promises.');
  } catch (err) {
    console.error('Error writing file:', err);
  }
}

saveData();

Using promises makes your control flow linear and readable. It reduces callback nesting and pairs nicely with other async tasks. If you’re saving JSON or doing multiple writes in sequence, fs.promises keeps code tidy and predictable.

Tip: Always wrap your await calls in try/catch to handle file-system errors gracefully.

Using Streams

For large files or continuous data, streams shine. Instead of loading all data into memory, you pipe chunks to a writable stream:

const fs = require('fs');

const readStream = fs.createReadStream('largefile.txt');
const writeStream = fs.createWriteStream('copy.txt');

readStream.pipe(writeStream);

writeStream.on('finish', () => {
  console.log('File copy completed.');
});

Streams offer back-pressure management and lower memory usage. You can transform data on the fly, for example compressing or encrypting it. Use streams when working with gigabytes of logs or media files.

Tip: Always handle error events on both read and write streams to avoid crashes.

Writing JSON Data

Saving objects to JSON files is a common task. You simply JSON.stringify your data and write it:

const fs = require('fs').promises;

async function saveJson(obj) {
  const json = JSON.stringify(obj, null, 2);
  await fs.writeFile('config.json', json);
  console.log('JSON saved.');
}

saveJson({ name: 'Alice', age: 30 });

Passing null, 2 to JSON.stringify pretty-prints the output with two-space indentation, which keeps the file human-readable. Before writing, validate your data to avoid corrupt files. For more advanced examples on saving JSON, see How to Save JSON Data to a File in Node.js.

Tip: Use a temp file and rename it after write completes, ensuring you don’t end up with half-written files on errors.

Handling Errors

Robust file operations mean planning for failures. Always check for:

  • Permission errors (EACCES)
  • Disk full or quota limits (ENOSPC)
  • Path not found (ENOENT)

Example with detailed logging:

const fs = require('fs').promises;

async function safeWrite(path, data) {
  try {
    await fs.writeFile(path, data);
    console.log(`Wrote to ${path}`);
  } catch (err) {
    console.error(`Failed to write ${path}:`, err.code || err.message);
    // Decide whether to retry, alert, or exit
  }
}

Tip: Use error codes (err.code) to handle different failures differently—for example, request elevated permissions on EACCES.

Checking File Existence

Before writing, you may want to verify if a file already exists to avoid overwriting:

const fs = require('fs').promises;

async function writeIfNew(path, content) {
  try {
    await fs.access(path);
    console.log('File exists, aborting write.');
  } catch {
    // File not found, safe to write
    await fs.writeFile(path, content);
    console.log('File created.');
  }
}

writeIfNew('newfile.txt', 'Some content');

For a deeper dive on checking file existence, check out Node.js Check If File Exists.

Tip: This check is racy: access may report the file as missing, yet another process could create it before your write runs. For critical apps, use exclusive flags like wx so the write itself fails if the file already exists.

Conclusion

Mastering file-writing in Node.js unlocks a range of powerful use cases, from simple logs to high-throughput data pipelines. Whether you use callback-based fs.writeFile, the cleaner fs.promises API, or streams for large transfers, each method fits different scenarios. Remember to handle errors proactively, validate JSON data, and protect against race conditions when checking file existence.

By choosing the right approach, you ensure your application writes data reliably and performs efficiently. Start with the easiest method that works for you, then evolve into streams or advanced patterns as your requirements grow. Now you’re ready to build robust, file-based features with confidence.


Mateen Kiani
kiani.mateen012@gmail.com
I am a passionate full-stack developer with around 3 years of experience in MERN stack development and 1 year of experience in blockchain application development. I have completed several projects in the MERN stack, Next.js, and blockchain, including some NFT marketplaces. I have vast experience in Node.js, Express, React, and Redux.