How does the File System (fs) module work in Node.js?
The fs Module in Node.js
The fs (File System) module is a core Node.js module that provides an API for interacting with the file system. It allows you to read, write, update, delete, and watch files and directories.
Three API Styles
Node.js exposes the fs API in three styles:
1. Callback-based (original)
```js
const fs = require('fs');

fs.readFile('data.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});
```

2. Synchronous (blocking)
```js
const fs = require('fs');

const data = fs.readFileSync('data.txt', 'utf8');
console.log(data);
```

3. Promise-based (recommended)
```js
const fs = require('fs/promises');

async function readData() {
  const data = await fs.readFile('data.txt', 'utf8');
  console.log(data);
}
```

Common Operations
Reading Files
```js
const fs = require('fs/promises');

// (await must be inside an async function, or at the top level of an ES module)

// Read as a string
const text = await fs.readFile('file.txt', 'utf8');

// Read as a Buffer (binary)
const buffer = await fs.readFile('image.png');
```

Writing Files
```js
// Write (creates or overwrites)
await fs.writeFile('output.txt', 'Hello World');

// Append to a file
await fs.appendFile('log.txt', 'New log entry\n');
```

Working with Directories
```js
// Create a directory (recursive: true also means no error if it exists)
await fs.mkdir('new-folder', { recursive: true });

// Read directory contents (array of names)
const files = await fs.readdir('src');

// Read with file types (array of Dirent objects)
const entries = await fs.readdir('src', { withFileTypes: true });
entries.forEach(entry => {
  console.log(`${entry.name} - ${entry.isDirectory() ? 'dir' : 'file'}`);
});
```

File Information
```js
const stats = await fs.stat('file.txt');

console.log({
  size: stats.size,        // bytes
  isFile: stats.isFile(),
  isDir: stats.isDirectory(),
  created: stats.birthtime,
  modified: stats.mtime
});
```

Delete and Rename
```js
// Delete a file
await fs.unlink('old-file.txt');

// Delete a directory (recursively, ignoring missing paths)
await fs.rm('old-folder', { recursive: true, force: true });

// Rename / move
await fs.rename('old-name.txt', 'new-name.txt');

// Copy a file
await fs.copyFile('source.txt', 'dest.txt');
```

Streaming Large Files
For large files, use streams instead of readFile to avoid loading the entire file into memory:
```js
const fs = require('fs');

const readStream = fs.createReadStream('large-file.csv', 'utf8');
const writeStream = fs.createWriteStream('output.csv');

readStream.pipe(writeStream);

readStream.on('end', () => console.log('Done!'));
readStream.on('error', (err) => console.error(err));
```

Note that .pipe() does not forward errors between streams, so attach an 'error' handler to each stream (or use stream.pipeline, which propagates errors and cleans up automatically).

Watching Files
```js
const fs = require('fs');

// Watch for file changes (event-based; behavior varies by platform)
const watcher = fs.watch('config.json', (eventType, filename) => {
  console.log(`${filename} changed: ${eventType}`);
});

// Or use fs.watchFile for polling-based watching
fs.watchFile('data.txt', { interval: 1000 }, (curr, prev) => {
  console.log(`File modified at: ${curr.mtime}`);
});
```

Important Considerations
| Topic | Best Practice |
|---|---|
| Encoding | Always specify 'utf8' for text files |
| Error handling | Always handle errors (try/catch or callback) |
| Large files | Use streams, not readFile |
| Sync methods | Avoid in server code (blocks event loop) |
| Paths | Use path.join() for cross-platform paths |
| Security | Validate user-provided paths to prevent directory traversal |
```js
const path = require('path');

// ✅ Safe path construction
const safePath = path.join(__dirname, 'uploads', path.basename(userInput));

// ❌ Vulnerable to directory traversal, e.g. userInput = '../../etc/passwd'
const unsafePath = './uploads/' + userInput;
```

Tip: Prefer fs/promises for modern async code. Use streams for files larger than roughly 50 MB. Never use sync methods in server request handlers.