
I have been looking for the fastest way to read and write large files (0.5 - 1 GB) in Java with limited memory (about 64 MB). Each line in the file represents a record, so I need to get them line by line. The file is a normal text file.

I tried BufferedReader and BufferedWriter, but they don't seem to be the best option. Reading and writing a 0.5 GB file takes about 35 seconds with no processing, just reading and writing. I think the bottleneck here is the writing, as reading alone takes about 10 seconds.

I tried reading into byte arrays, but then searching for the lines in each array that was read took more time.

Any suggestions, please? Thanks.


6 Answers


I suspect your real problem is that you have limited hardware, and that what you do in software won't make much difference. If you have plenty of memory and CPU, more advanced tricks can help, but if you are just waiting on your hard drive because the file is not cached, it won't make much difference.

BTW: 500 MB in 10 seconds, i.e. 50 MB/s, is a typical read speed for an HDD.

Try running the following to see at what point your system is unable to cache the file efficiently.

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Arrays;

public class FileCacheBenchmark {
    public static void main(String... args) throws IOException {
        for (int mb : new int[]{50, 100, 250, 500, 1000, 2000})
            testFileSize(mb);
    }

    private static void testFileSize(int mb) throws IOException {
        File file = File.createTempFile("test", ".txt");
        file.deleteOnExit();
        char[] chars = new char[1024];
        Arrays.fill(chars, 'A');
        String longLine = new String(chars);
        long start1 = System.nanoTime();
        PrintWriter pw = new PrintWriter(new FileWriter(file));
        for (int i = 0; i < mb * 1024; i++)
            pw.println(longLine);
        pw.close();
        long time1 = System.nanoTime() - start1;
        System.out.printf("Took %.3f seconds to write to a %d MB, file rate: %.1f MB/s%n",
                time1 / 1e9, file.length() >> 20, file.length() * 1000.0 / time1);

        long start2 = System.nanoTime();
        BufferedReader br = new BufferedReader(new FileReader(file));
        for (String line; (line = br.readLine()) != null; ) {
            // discard the line; we only measure read speed
        }
        br.close();
        long time2 = System.nanoTime() - start2;
        System.out.printf("Took %.3f seconds to read to a %d MB file, rate: %.1f MB/s%n",
                time2 / 1e9, file.length() >> 20, file.length() * 1000.0 / time2);
        file.delete();
    }
}

On a Linux machine with lots of memory.

Took 0.395 seconds to write to a 50 MB, file rate: 133.0 MB/s
Took 0.375 seconds to read to a 50 MB file, rate: 140.0 MB/s
Took 0.669 seconds to write to a 100 MB, file rate: 156.9 MB/s
Took 0.569 seconds to read to a 100 MB file, rate: 184.6 MB/s
Took 1.585 seconds to write to a 250 MB, file rate: 165.5 MB/s
Took 1.274 seconds to read to a 250 MB file, rate: 206.0 MB/s
Took 2.513 seconds to write to a 500 MB, file rate: 208.8 MB/s
Took 2.332 seconds to read to a 500 MB file, rate: 225.1 MB/s
Took 5.094 seconds to write to a 1000 MB, file rate: 206.0 MB/s
Took 5.041 seconds to read to a 1000 MB file, rate: 208.2 MB/s
Took 11.509 seconds to write to a 2001 MB, file rate: 182.4 MB/s
Took 9.681 seconds to read to a 2001 MB file, rate: 216.8 MB/s

On a Windows machine with lots of memory.

Took 0.376 seconds to write to a 50 MB, file rate: 139.7 MB/s
Took 0.401 seconds to read to a 50 MB file, rate: 131.1 MB/s
Took 0.517 seconds to write to a 100 MB, file rate: 203.1 MB/s
Took 0.520 seconds to read to a 100 MB file, rate: 201.9 MB/s
Took 1.344 seconds to write to a 250 MB, file rate: 195.4 MB/s
Took 1.387 seconds to read to a 250 MB file, rate: 189.4 MB/s
Took 2.368 seconds to write to a 500 MB, file rate: 221.8 MB/s
Took 2.454 seconds to read to a 500 MB file, rate: 214.1 MB/s
Took 4.985 seconds to write to a 1001 MB, file rate: 210.7 MB/s
Took 5.132 seconds to read to a 1001 MB file, rate: 204.7 MB/s
Took 10.276 seconds to write to a 2003 MB, file rate: 204.5 MB/s
Took 9.964 seconds to read to a 2003 MB file, rate: 210.9 MB/s
answered 2012-10-31T10:52:36.017

The first thing I would try is increasing the buffer size of BufferedReader and BufferedWriter. The default buffer sizes are not documented, but at least in the Oracle VM they are 8192 characters, which doesn't bring much performance advantage.
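As an illustration, both classes take an explicit buffer size in their constructors. A minimal line-by-line copy sketch; the 256 KB buffer size and the class name are arbitrary assumptions for the example, not measured recommendations:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class BufferedCopy {
    // Both BufferedReader and BufferedWriter accept an explicit buffer size
    // (in chars); the default is 8192. 256 KB here is an assumed value.
    static final int BUF_SIZE = 256 * 1024;

    static void copyLines(String src, String dst) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader(src), BUF_SIZE);
             BufferedWriter out = new BufferedWriter(new FileWriter(dst), BUF_SIZE)) {
            for (String line; (line = in.readLine()) != null; ) {
                out.write(line);
                out.newLine(); // note: normalizes the original line terminators
            }
        }
    }
}
```

Only one line is held in memory at a time, so this stays within a small heap regardless of file size.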

If you only need to make a copy of the file (and don't need actual access to the data), I would drop the Reader/Writer approach and work directly with an InputStream and OutputStream, using a byte array as the buffer:

FileInputStream fis = new FileInputStream("d:/test.txt");
FileOutputStream fos = new FileOutputStream("d:/test2.txt");
int bufferSize = 64 * 1024; // bufferSize was left undeclared in the original; 64 KB is an illustrative choice
byte[] b = new byte[bufferSize];
int r;
while ((r=fis.read(b))>=0) {
    fos.write(b, 0, r);         
}
fis.close();
fos.close();

Or actually use NIO:

FileChannel in = new RandomAccessFile("d:/test.txt", "r").getChannel();
FileChannel out = new RandomAccessFile("d:/test2.txt", "rw").getChannel();
out.transferFrom(in, 0, Long.MAX_VALUE);
in.close();
out.close();

However, when benchmarking the different copy methods, the differences in duration between benchmark runs were much larger than the differences between the implementations. I/O caching (both at the OS level and in the hard-disk cache) plays a big role here, and it is hard to say which is faster. On my hardware, copying a 1 GB text file line by line with BufferedReader and BufferedWriter took less than 5 seconds in some runs and more than 30 seconds in others.

answered 2012-10-31T12:48:10.120

In Java 7 you can use the Files.readAllLines() and Files.write() methods. Here is an example:

List<String> readTextFile(String fileName) throws IOException {
    Path path = Paths.get(fileName);
    return Files.readAllLines(path, StandardCharsets.UTF_8);
}

void writeTextFile(List<String> strLines, String fileName) throws IOException {
    Path path = Paths.get(fileName);
    Files.write(path, strLines, StandardCharsets.UTF_8);
}
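One caveat: with only ~64 MB of heap, readAllLines() on a 0.5 - 1 GB file will itself run out of memory, since it holds every line in the returned List. A streaming sketch using the same java.nio.file API (the class and method names here are made up for the example) keeps only one line in memory at a time:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class StreamingCopy {
    // Processes the file record by record, so heap use stays flat
    // regardless of file size.
    static void copyTextFile(String inName, String outName) throws IOException {
        try (BufferedReader reader = Files.newBufferedReader(Paths.get(inName), StandardCharsets.UTF_8);
             BufferedWriter writer = Files.newBufferedWriter(Paths.get(outName), StandardCharsets.UTF_8)) {
            for (String line; (line = reader.readLine()) != null; ) {
                writer.write(line);
                writer.newLine();
            }
        }
    }
}
```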
answered 2013-08-16T16:02:24.600

I would suggest looking at the classes in the java.nio package. As with sockets, non-blocking IO may be faster:

http://docs.oracle.com/javase/6/docs/api/java/nio/package-summary.html

The benchmarks in this article suggest that this is true:

http://vanillajava.blogspot.com/2010/07/java-nio-is-faster-than-java-io-for.html
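As a small file-oriented sketch of the java.nio APIs (the class name and the 64 KB buffer are assumptions for the example), a FileChannel can be drained through a reusable ByteBuffer, here counting newline bytes instead of materializing a String per record:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class NioLineCount {
    // Counts '\n' bytes; assumes an ASCII-compatible encoding such as UTF-8.
    static long countLines(String fileName) throws IOException {
        long lines = 0;
        ByteBuffer buf = ByteBuffer.allocateDirect(64 * 1024); // reused for every read
        try (FileChannel ch = FileChannel.open(Paths.get(fileName), StandardOpenOption.READ)) {
            while (ch.read(buf) != -1) {
                buf.flip();                  // switch from filling to draining
                while (buf.hasRemaining()) {
                    if (buf.get() == '\n') lines++;
                }
                buf.clear();                 // ready for the next read
            }
        }
        return lines;
    }
}
```

Because the buffer is allocated once and reused, this avoids per-line allocation entirely; whether it beats BufferedReader depends on what you then do with each record.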

answered 2012-10-31T10:23:39.023

I wrote a comprehensive article on the many ways to read files in Java and tested them against each other using sample files from 1 KB to 1 GB. I found the following 3 methods were the fastest for reading a 1 GB file:

1) java.nio.file.Files.readAllBytes() - took just under 1 second to read the 1 GB test file.

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class ReadFile_Files_ReadAllBytes {
  public static void main(String [] pArgs) throws IOException {
    String fileName = "c:\\temp\\sample-10KB.txt";
    File file = new File(fileName);

    byte [] fileBytes = Files.readAllBytes(file.toPath());
    char singleChar;
    for(byte b : fileBytes) {
      singleChar = (char) b;
      System.out.print(singleChar);
    }
  }
}

2) java.nio.file.Files.lines() - took about 3.5 seconds to read the 1 GB test file.

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.stream.Stream;

public class ReadFile_Files_Lines {
  public static void main(String[] pArgs) throws IOException {
    String fileName = "c:\\temp\\sample-10KB.txt";
    File file = new File(fileName);

    try (Stream<String> linesStream = Files.lines(file.toPath())) {
      linesStream.forEach(line -> {
        System.out.println(line);
      });
    }
  }
}

3) java.io.BufferedReader - took about 4.5 seconds to read the 1 GB test file.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadFile_BufferedReader_ReadLine {
  public static void main(String [] args) throws IOException {
    String fileName = "c:\\temp\\sample-10KB.txt";
    FileReader fileReader = new FileReader(fileName);

    try (BufferedReader bufferedReader = new BufferedReader(fileReader)) {
      String line;
      while((line = bufferedReader.readLine()) != null) {
        System.out.println(line);
      }
    }
  }
}
answered 2018-04-09T13:45:53.310

All of this can be handled without any OutOfMemoryException by using the Scanner class as an iterator. It reads the file line by line rather than in bulk.

The code below solves the problem:

try (FileInputStream inputStream = new FileInputStream("D:\\File\\test.txt");
     Scanner sc = new Scanner(inputStream, "UTF-8")) {
    while (sc.hasNextLine()) {
        String line = sc.nextLine();
        System.out.println(line);
    }
} catch (IOException e) {
    e.printStackTrace();
}
answered 2019-04-24T15:01:51.773