Reading Text Files Line-by-Line in Node.js
Reading a large text file one line at a time in Node.js is a common requirement when processing extensive datasets. While the original question from Quora tackles reading from STDIN, this article extends the concept to reading from a text file.
The initial approach, based on fs.open, serves as a foundation; the missing step was to use the Lazy module to read line by line from the opened file descriptor. Since Node.js v0.12, however, there is a more robust solution using the built-in readline core module.
Let's explore two approaches using readline (note that the first, which uses for await...of, requires a modern Node.js version with async-iteration support):
```javascript
const fs = require('fs');
const readline = require('readline');

async function processLineByLine() {
  const fileStream = fs.createReadStream('input.txt');

  // Note: we use the crlfDelay option to recognize all instances of CR LF
  // ('\r\n') in input.txt as a single line break.
  const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity
  });

  for await (const line of rl) {
    // Each line in input.txt will be successively available here as `line`.
    console.log(`Line from file: ${line}`);
  }
}

processLineByLine();
```
Alternatively, you can use:
```javascript
const lineReader = require('readline').createInterface({
  input: require('fs').createReadStream('file.in')
});

lineReader.on('line', function (line) {
  console.log('Line from file:', line);
});

lineReader.on('close', function () {
  console.log('all done, son');
});
```
Both approaches use the readline module to read lines from a text file one at a time. The last line is read correctly (as of Node.js v0.12 or later), even if the file has no trailing line break.