
I have a "Files Downloading Center" for large files (100 MB – 2 GB).

I'm using PHP.

My problem is that when I force files to download using PHP headers, the server consumes a lot of memory, even though I read the file in chunks during the download. This means that when five users download a large file at the same time, the server stops working.

How can I let users download large files from my server without this problem?

For example, if I use header("Location: path/to/files/2GB.zip");, the problem goes away. But that is not what I want, because for security reasons I don't want to give users a direct link to the files.

What is the solution?


1 Answer


You could store the files outside of your web root, then read them at download time and use the header() function to send them to the user. This is a little overkill, but it works:

// Open the file (stored outside the web root) and read it into memory.
$FILEHANDLE = fopen($FILEPATH, 'rb');
$FILESIZE   = filesize($FILEPATH);
$getFiles   = fread($FILEHANDLE, $FILESIZE);
fclose($FILEHANDLE);

header("Content-Transfer-Encoding: Binary");
header("Content-Length: " . $FILESIZE);
header("Content-Type: " . $FILETYPE);
header('Content-Disposition: attachment; filename="' . $FILENAME . '"');

echo $getFiles;

This consumes more memory, since the server reads the entire file before transferring it, but your downloaders will never know where the files live. YMMV with very large files for obvious reasons.
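Since the asker's actual problem is memory exhaustion, a chunked variant may serve better: instead of fread()-ing the whole file, stream it in small fixed-size pieces so memory use stays flat regardless of file size. This is a sketch, not the answer's original code; the path and the 8 KB chunk size are assumptions:

```php
<?php
// Sketch: stream a large file in 8 KB chunks.
// $FILEPATH is a hypothetical location outside the web root.
$FILEPATH = '/var/files/2GB.zip';

header('Content-Transfer-Encoding: Binary');
header('Content-Length: ' . filesize($FILEPATH));
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($FILEPATH) . '"');

$handle = fopen($FILEPATH, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // send one 8 KB chunk
    flush();                   // push it to the client right away
}
fclose($handle);
```

readfile($FILEPATH) achieves much the same thing in one call, since it writes the file to the output buffer without loading it all into memory, provided output buffering is disabled.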

answered 2013-10-16T20:39:08.560