I'm trying to process a large file 10MB at a time, as separate byte arrays. I want to get the byte arrays one at a time, not read the entire file into one huge byte array and then split it, since memory use is the whole problem in the first place.
This is what I have so far:
private byte[] readFile(File file, int offset) throws IOException
{
    BufferedInputStream inStream = null;
    ByteArrayOutputStream outStream = null;
    byte[] buf = new byte[1048576];
    int read = 0;
    try
    {
        inStream = new BufferedInputStream(new FileInputStream(file));
        outStream = new ByteArrayOutputStream();
        long skipped = inStream.skip(offset);
        read = inStream.read(buf);
        if (read != -1)
        {
            outStream.write(buf, 0, read);
            return outStream.toByteArray();
        }
    }
    finally
    {
        if (inStream != null) {try {inStream.close();} catch (IOException e) {}}
        if (outStream != null) {try {outStream.close();} catch (IOException e) {}}
    }
    return null;
}
The offset parameter will be passed in 10MB increments as well.
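For context, this is roughly how I call it, walking the file in fixed-size increments (the loop below is just an illustration of the calling pattern, with the same 1048576 chunk size as in the method):

    int chunkSize = 1048576;
    long length = file.length();
    // Walk the file in fixed increments, asking readFile for one chunk per call.
    for (int offset = 0; offset < length; offset += chunkSize)
    {
        byte[] chunk = readFile(file, offset);
        if (chunk == null)
        {
            break; // nothing more to read
        }
        // ... process chunk ...
    }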
The problem I'm having is that, even though the skipped long variable tells me 1048576 bytes were skipped, the second 10MB I'm supposed to receive from calling readFile(file, 1048576) is identical to the byte array from the first 10MB. So it didn't really skip the first 10MB at all.
What's the problem here? Is there another way of implementing this idea?
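To show the kind of alternative I have in mind, here's an untested sketch that seeks straight to the offset with RandomAccessFile instead of skipping from the start of a stream each call (readChunk and CHUNK_SIZE are names I made up for illustration; I haven't verified that this avoids the problem):

    private static final int CHUNK_SIZE = 10 * 1024 * 1024; // 10MB per call

    private byte[] readChunk(File file, long offset) throws IOException
    {
        RandomAccessFile raf = null;
        try
        {
            raf = new RandomAccessFile(file, "r");
            raf.seek(offset);                 // jump directly to the start of this chunk
            byte[] buf = new byte[CHUNK_SIZE];
            int read = raf.read(buf);         // may read fewer bytes near end of file
            if (read == -1)
            {
                return null;                  // offset was at or past end of file
            }
            return (read == buf.length) ? buf : Arrays.copyOf(buf, read);
        }
        finally
        {
            if (raf != null) {try {raf.close();} catch (IOException e) {}}
        }
    }

Would something like that be a better fit here, or can the skip-based version be made to work?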