
I have a file-sharing website written in PHP that lets users download files. But since I am on a hosted server, there are limits on execution time and other factors that affect a download script written in PHP. This causes large files to get corrupted during download. I am looking for a solution. I have tried using symbolic links, but then the forced download no longer happens.

I am thinking of using Perl to serve the downloads, but I don't have any clue where to start.

Can anyone help me out with this problem?


2 Answers


You can run bash commands from within a PHP script:

<?php
    // shell_exec() runs the command and returns its stdout as a string
    $command = "ls -lrat";
    $output  = shell_exec($command);
    echo $output;
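Note that shell_exec() returns the command's output rather than printing it (and null on failure), so you normally echo the result yourself. Running a shell command alone will not force a browser download, though; you still have to send the right headers, as the Perl answer below shows.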
answered 2013-11-02T21:41:29.833

Here is a rough Perl script that uses the CGI module. It was written to demonstrate the core concepts and is not production-ready.

It assumes:

  • You will invoke it via a URL, directly, in an iframe, etc. (an example request follows this list).
  • You will pass a parameter named filename whose value is the name of the file to download.
  • The directory containing the file is the current working directory.
  • The path separator is / (Linux).
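
For example, a request might look like this (the host and script path are hypothetical):

  http://example.com/cgi-bin/download.pl?filename=report.pdf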

Fundamentally, printing the appropriate headers (a generic content type plus an attachment disposition) is what forces the browser to save the file to disk.

#!/usr/bin/perl
use strict;
use warnings FATAL => qw(all);
use CGI::Carp qw(fatalsToBrowser);
use CGI;
use Cwd;

my $cgi = CGI->new;

my $dirpath  = getcwd;
# NOTE: the filename comes straight from the client; a production script
# must sanitize it to block path traversal (e.g. filename=../../etc/passwd)
my $filename = $cgi->param('filename') || die 'filename required';
my $filepath = $dirpath . '/' . $filename;

# Read in raw (binary-safe) mode; 'local $/' confines the slurp setting
# to this block instead of changing the global record separator
open my $fh, '<:raw', $filepath or die "cannot open '$filepath': $!";
my $filedata = do { local $/; <$fh> };
close $fh;

# The attachment disposition in the header is what makes the browser
# save to disk instead of rendering the response
print $cgi->header( -type       => 'application/octet-stream',
                    -attachment => $filename );

print $filedata;
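
One caveat: the script above slurps the entire file into memory before printing it, which is exactly what tends to fail for the large files described in the question. Below is a minimal sketch of a streaming variant under the same assumptions; the 64 KB buffer size is an arbitrary choice:

#!/usr/bin/perl
use strict;
use warnings FATAL => qw(all);
use CGI::Carp qw(fatalsToBrowser);
use CGI;
use Cwd;

my $cgi = CGI->new;

my $filename = $cgi->param('filename') || die 'filename required';
my $filepath = getcwd() . '/' . $filename;

open my $fh, '<:raw', $filepath or die "cannot open '$filepath': $!";

print $cgi->header( -type       => 'application/octet-stream',
                    -attachment => $filename );

# Send the file in fixed-size chunks instead of loading it all at once;
# read() returns the number of bytes read, 0 at end of file
my $buffer;
print $buffer while read( $fh, $buffer, 64 * 1024 );
close $fh;

This keeps memory use constant regardless of file size; it does not lift the host's execution time limit, but it avoids the memory pressure that commonly truncates large transfers.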
answered 2013-11-03T03:38:00.227