
Java solution to memory overflow when reading large files

黄舟
Release: 2017-08-10 09:21:33

This article presents a solution to the problem of memory overflow when reading large files in Java. It is shared here as a reference; follow along and take a look.

1. Traditional method: reading the file content into memory

The standard way to read a file's lines is to load them into memory. Both Guava and Apache Commons IO provide methods that read all of a file's lines at once:


Files.readLines(new File(path), Charsets.UTF_8);  // Guava
FileUtils.readLines(new File(path));              // Apache Commons IO

In fact, these methods use a BufferedReader, or its subclass LineNumberReader, to read the file.
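As a rough sketch (not taken from the original article), the helper below shows what such a read-all-lines method does internally: every line is appended to an in-memory list. The class name ReadAllLinesSketch and the path argument are illustrative placeholders.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ReadAllLinesSketch {

    // Roughly what Files.readLines / FileUtils.readLines do: every line
    // is appended to a list that stays in memory until the whole file is read.
    static List<String> readAllLines(String path) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(Paths.get(path), StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line); // the entire file ends up in this list
            }
        }
        return lines;
    }
}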

The problem with the traditional approach is that all lines of the file are kept in memory. Once the file is large enough, the program will quickly throw an OutOfMemoryError.

Thinking about the problem: we usually do not need to hold all of the file's lines in memory at once. We only need to traverse the file line by line, process each line, and then discard it. So we can read the file by iterating over its lines instead of loading all of them into memory.
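As a minimal illustration of that idea (not one of the article's own examples), a plain BufferedReader can already read and discard one line at a time; the file name big-file.txt below is a placeholder.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LineByLineSketch {
    public static void main(String[] args) throws IOException {
        // Only the current line is held in memory; it is processed and then discarded.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("big-file.txt"), StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                // process the line here, e.g. parse it or count it
            }
        }
    }
}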

2. Methods for reading large files

The following approaches process a large file without reading it repeatedly and without running out of memory:

(1) File stream: use the java.util.Scanner class to scan the file and read it continuously, one line at a time.


FileInputStream inputStream = null;
Scanner sc = null;
try {
    inputStream = new FileInputStream(path);
    sc = new Scanner(inputStream, "UTF-8");
    while (sc.hasNextLine()) {
        String line = sc.nextLine();
        // process the line, e.g. System.out.println(line);
    }
} catch (IOException e) {
    logger.error(e);
} finally {
    if (inputStream != null) {
        try {
            inputStream.close();
        } catch (IOException e) {
            logger.error(e);
        }
    }
    if (sc != null) {
        sc.close();
    }
}

This approach iterates over all the lines in the file, allowing each line to be processed without keeping a reference to it. In short, the lines are not stored in memory!
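For convenience, the Scanner loop above can be wrapped in a reusable method. The sketch below shows one possible way to do this; the method name processLargeFile and the Consumer-based callback are illustrative choices, not part of the original article.

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Scanner;
import java.util.function.Consumer;

public class ScannerFileProcessor {

    // Streams the file line by line and hands each line to the given callback.
    public static void processLargeFile(String path, Consumer<String> lineHandler) throws IOException {
        try (FileInputStream inputStream = new FileInputStream(path);
             Scanner sc = new Scanner(inputStream, "UTF-8")) {
            while (sc.hasNextLine()) {
                lineHandler.accept(sc.nextLine());
            }
            // Scanner swallows IOExceptions; surface any that occurred during reading
            if (sc.ioException() != null) {
                throw sc.ioException();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        processLargeFile("big-file.txt", line -> {
            // process each line here
        });
    }
}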

(2) Apache Commons IO stream: use the LineIterator provided by the Commons IO library.


LineIterator it = FileUtils.lineIterator(theFile, "UTF-8");
try {
    while (it.hasNext()) {
        String line = it.nextLine();
        // do something with line
    }
} finally {
    LineIterator.closeQuietly(it);
}

In this solution, the entire file is never held in memory at once, which keeps memory consumption quite conservative.
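In recent versions of Commons IO, LineIterator also implements Closeable, so the same loop can be written with try-with-resources instead of closeQuietly. The sketch below assumes such a version is on the classpath; the file name is a placeholder.

import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.LineIterator;

public class LineIteratorExample {
    public static void main(String[] args) throws IOException {
        File theFile = new File("big-file.txt");
        // The iterator reads one line at a time from an underlying reader
        // and is closed automatically when the try block exits.
        try (LineIterator it = FileUtils.lineIterator(theFile, "UTF-8")) {
            while (it.hasNext()) {
                String line = it.nextLine();
                // do something with line
            }
        }
    }
}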

