Mateen Kiani
Published on Fri Jul 04 2025 · 3 min read
Reading text files is a common task for many JavaScript applications, from building command-line tools with Node.js to handling user uploads in the browser. Whether you need to process logs on the server or parse user-provided documents in a web app, JavaScript offers different APIs to read files in both environments.
In this guide, we'll explore how to read text files using the Node.js fs module, handle large files with streams, and work with the FileReader API in browsers. Along the way, we'll cover best practices, error handling, and common pitfalls to help you write reliable file-reading code.
Node.js provides the built-in fs module to work with the file system. To read a text file, you can use either synchronous or asynchronous methods.
```js
const fs = require('fs');

// Asynchronous read
fs.readFile('./notes.txt', 'utf8', (err, data) => {
  if (err) return console.error(err);
  console.log(data);
});

// Synchronous read
try {
  const content = fs.readFileSync('./notes.txt', 'utf8');
  console.log(content);
} catch (err) {
  console.error(err);
}
```
Tip: You can check whether a file exists with fs.existsSync, but it is usually safer to attempt the read and handle the error, since the file can be deleted between the check and the read. See node and file exists for patterns and examples.
Choosing between synchronous and asynchronous file reads depends on your use case:

- Synchronous reads (fs.readFileSync) block the event loop, which is fine for CLI tools and one-off startup scripts.
- Asynchronous reads (callbacks or promises) keep the process responsive, which matters in servers handling concurrent requests.
Example of async/await with promises:
```js
const fsPromises = require('fs').promises;

async function readFileAsync(path) {
  try {
    const data = await fsPromises.readFile(path, 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readFileAsync('./data.txt');
```
Using promises simplifies error handling and makes code more readable.
When dealing with very large files, reading the entire content at once can cause high memory use. Streams let you process chunks:
```js
const fs = require('fs');

const readStream = fs.createReadStream('./bigfile.txt', 'utf8');

readStream.on('data', (chunk) => {
  console.log('Received chunk of length:', chunk.length);
});

readStream.on('end', () => {
  console.log('Finished reading file');
});

readStream.on('error', (err) => {
  console.error('Stream error:', err);
});
```
Benefits of streams:

- Memory use stays low and roughly constant, regardless of file size.
- Processing can begin as soon as the first chunk arrives.
- Streams can be piped to other streams with built-in backpressure handling.
Use streams when file size is unknown or exceeds available memory.
In the browser, JavaScript can't directly access the file system for security reasons. Instead, users select files via an input element, and you use the FileReader API.
```html
<input type='file' id='fileInput' />
<script>
  const input = document.getElementById('fileInput');
  input.addEventListener('change', () => {
    const file = input.files[0];
    const reader = new FileReader();
    reader.onload = () => {
      console.log(reader.result); // file content as text
    };
    reader.onerror = () => {
      console.error(reader.error);
    };
    reader.readAsText(file);
  });
</script>
```
Note: Always validate file type and size before reading.
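A minimal sketch of such a check, assuming a 1 MB cap and text/* MIME types (isReadableTextFile and the size limit are illustrative choices, not a standard API):

```js
// Illustrative limit: accept only text MIME types up to 1 MB.
const MAX_SIZE = 1024 * 1024;

// isReadableTextFile is a hypothetical helper; File objects from an
// <input type="file"> expose `type` (MIME string) and `size` (bytes).
function isReadableTextFile(file) {
  return file.type.startsWith('text/') && file.size <= MAX_SIZE;
}
```

In the change handler above, you would call isReadableTextFile(file) before invoking reader.readAsText(file).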
If you also need to write back or save processed data, check out how to save JSON data to a file for patterns on writing and then reading file content.
Tip: Wrap file operations in helper functions to reuse error handling logic.
Reading text files in JavaScript spans from the Node.js fs module to the browser's FileReader. Knowing when to use sync or async methods, when to reach for streams for big data, and how to handle errors properly will make your code robust and performant. Practice these patterns in your next project to handle file I/O like a pro.