
java - Fastest way to incrementally read a large file

When given a buffer of MAX_BUFFER_SIZE, and a file that far exceeds it, how can one:

  1. Read the file in blocks of MAX_BUFFER_SIZE?
  2. Do it as fast as possible

I tried using NIO:

    RandomAccessFile aFile = new RandomAccessFile(fileName, "r");
    FileChannel inChannel = aFile.getChannel();
    ByteBuffer buffer = ByteBuffer.allocate(CAPACITY);

    int bytesRead = inChannel.read(buffer);
    while (bytesRead != -1) {
        buffer.flip();
        while (buffer.hasRemaining()) {
            buffer.get();                   // consume one byte at a time
        }
        buffer.clear();                     // make room for the next read
        bytesRead = inChannel.read(buffer);
    }

    aFile.close();

And regular IO:

    File file = new File(fileName);
    InputStream in = new FileInputStream(file);

    long length = file.length();

    if (length > Integer.MAX_VALUE) {
        throw new IOException("File is too large!");
    }

    byte[] bytes = new byte[(int) length];

    int offset = 0;

    int numRead = 0;

    while (offset < bytes.length
            && (numRead = in.read(bytes, offset, bytes.length - offset)) >= 0) {
        offset += numRead;
    }

    if (offset < bytes.length) {
        throw new IOException("Could not completely read file " + fileName);
    }

    in.close();

It turns out that regular IO is about 100 times faster at doing the same thing as NIO. Am I missing something? Is this expected? Is there a faster way to read the file in buffer chunks?

Ultimately, I am working with a large file that I don't have enough memory to read all at once. Instead, I'd like to read it incrementally, in blocks that would then be used for processing.
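For illustration, this is roughly the kind of chunked loop I have in mind; MAX_BUFFER_SIZE and processChunk are placeholders for my own block size and processing code:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class ChunkedRead {
        static final int MAX_BUFFER_SIZE = 8 * 1024;   // placeholder block size

        public static void main(String[] args) throws IOException {
            byte[] block = new byte[MAX_BUFFER_SIZE];
            try (InputStream in = new FileInputStream(args[0])) {
                int n;
                // each read() returns at most MAX_BUFFER_SIZE bytes; -1 means end of file
                while ((n = in.read(block)) != -1) {
                    processChunk(block, n);             // hand the block off for processing
                }
            }
        }

        // placeholder for whatever per-block processing is needed
        static void processChunk(byte[] data, int length) {
        }
    }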



1 Answer


If you want to make your first example faster:

    FileChannel inChannel = new FileInputStream(fileName).getChannel();
    ByteBuffer buffer = ByteBuffer.allocateDirect(CAPACITY);

    while (inChannel.read(buffer) > 0) {
        buffer.flip();
        // do something with the data here
        buffer.clear(); // or compact() if you did not consume everything
    }

    inChannel.close();
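As a rough sketch of what "do something with the data" could look like, the loop below counts newline bytes while reusing the same direct buffer; the counting is just a stand-in for real processing, and the buffer size is an arbitrary choice:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;

    public class DirectBufferCount {
        public static void main(String[] args) throws IOException {
            final int CAPACITY = 64 * 1024;           // assumed buffer size
            long newlines = 0;
            try (FileChannel inChannel = new FileInputStream(args[0]).getChannel()) {
                ByteBuffer buffer = ByteBuffer.allocateDirect(CAPACITY);
                while (inChannel.read(buffer) > 0) {
                    buffer.flip();                    // switch from filling to draining
                    while (buffer.hasRemaining()) {
                        if (buffer.get() == '\n') {
                            newlines++;               // stand-in for real per-byte processing
                        }
                    }
                    buffer.clear();                   // reuse the buffer for the next read
                }
            }
            System.out.println("newlines: " + newlines);
        }
    }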

If you want it to be even faster:

    FileChannel inChannel = new RandomAccessFile(fileName, "r").getChannel();
    MappedByteBuffer buffer = inChannel.map(FileChannel.MapMode.READ_ONLY, 0, inChannel.size());
    // access the buffer as you wish.
    inChannel.close();

This can take 10 to 20 microseconds for files up to 2 GB in size.
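Note that a single map() call is limited to Integer.MAX_VALUE bytes, so for files larger than about 2 GB one option is to map the file region by region. A minimal sketch, using an arbitrary 1 GB region size:

    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class MappedRegions {
        public static void main(String[] args) throws IOException {
            final long REGION_SIZE = 1L << 30;        // 1 GB per mapping, an arbitrary choice
            try (RandomAccessFile aFile = new RandomAccessFile(args[0], "r");
                 FileChannel inChannel = aFile.getChannel()) {
                long fileSize = inChannel.size();
                for (long pos = 0; pos < fileSize; pos += REGION_SIZE) {
                    long size = Math.min(REGION_SIZE, fileSize - pos);
                    MappedByteBuffer region =
                            inChannel.map(FileChannel.MapMode.READ_ONLY, pos, size);
                    // access the region as you wish, e.g. region.get(...) or region.getLong(...)
                }
            }
        }
    }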

