Download a large file with the admin panel [message #164887]
Tue, 29 March 2011 13:14
Atomicrun
Messages: 54  Registered: November 2010  Location: Lund
Karma: 0
Member
I made a backup of my server (a Ghost drive dump). Now, the server is a bit restricted in its configuration, so to move the backup file off the server I tried to download it using the admin file controls. I don't have FTP, and I have turned off most other alternatives, such as local network access.
The admin file-controls download is OK for small files, but it didn't work with my "bigger" file.
/* Download file code. */
if (isset($_GET['down']) && $dest && @file_exists($cur_dir .'/'. $dest)) {
    if (is_file($cur_dir .'/'. $dest)) {
        header('Content-type: application/octet-stream');
        header('Content-Disposition: attachment; filename='. $dest);
        readfile($cur_dir .'/'. $dest);
    } else {
        header('Content-type: application/x-tar');
        header('Content-Disposition: attachment; filename='. $dest .'.tar');
        echo make_tar($cur_dir .'/'. $dest);
    }
    exit;
}
PHP cannot serve a large file with readfile(). readfile() is a known trouble spot in PHP (especially when output buffering is active, since the whole file may end up buffered in memory), and you will find many fixes, problems and issues reported on www.php.net.
I just needed the file, so I put in the following fix:
- I added a function that sends the file in "chunks".
- I supply the file length to the downloading browser.
- I try to side-step the PHP timeout with a sleep(0); call.
Insert this function in the adm/admbrowse.php file, among the other functions:
function readfile_chunked($filename, $retbytes = true)
{
    $chunksize = 10 * 1024 * 1024; // how many bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        sleep(0);
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}
Then modify the file download to use this function:
/* Download file code. */
if (isset($_GET['down']) && $dest && @file_exists($cur_dir .'/'. $dest)) {
    if (is_file($cur_dir .'/'. $dest)) {
        header('Content-type: application/octet-stream');
        header('Content-Disposition: attachment; filename='. $dest);
        header('Content-Length: '. filesize($cur_dir .'/'. $dest));
        // readfile($cur_dir .'/'. $dest);
        readfile_chunked($cur_dir .'/'. $dest);
    } else {
        header('Content-type: application/x-tar');
        header('Content-Disposition: attachment; filename='. $dest .'.tar');
        echo make_tar($cur_dir .'/'. $dest);
    }
    exit;
}
#############################
Note that I cannot upload the (large) backup file again once it has been downloaded. The upload would need a similar adjustment.
I have not applied the fix to the directory download.
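As a side note on the upload limitation above: large uploads in PHP are usually capped by configuration rather than by code. A minimal php.ini sketch; the exact limits below are assumptions, tune them to your file sizes:

```ini
; php.ini - raise the limits that typically block large uploads
upload_max_filesize = 2G   ; max size of one uploaded file
post_max_size = 2G         ; must be >= upload_max_filesize
max_execution_time = 0     ; no script timeout (0 = unlimited)
memory_limit = 256M        ; the upload itself is streamed to a temp file
```

Whether this is enough also depends on any limits enforced by the web server in front of PHP.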
Re: Download a large file with the admin panel [message #165475 is a reply to message #164888]
Sat, 25 June 2011 13:44
Atomicrun
I would like to ask what code is used when attachments are downloaded. I typically have many large attachments, since the forum is used to store information, and I would like to use the function I provided above there as well.
A clue as to where that code lives would be appreciated.
No, that function is not good enough after all. The following code comes from www.php.net.
The function, add it in:
function dl_file_resume($file)
{
    // First, see if the file exists
    if (!is_file($file)) {
        die("<b>404 File not found!</b>");
    }
    // Gather relevant info about the file
    $filename = basename($file);
    $file_extension = strtolower(substr(strrchr($filename, "."), 1));
    // This sets the Content-Type to the appropriate setting for the file
    switch ($file_extension) {
        case "asf":  $ctype = "video/x-ms-asf"; break;
        case "avi":  $ctype = "video/x-msvideo"; break;
        case "exe":  $ctype = "application/octet-stream"; break;
        case "mov":  $ctype = "video/quicktime"; break;
        case "mp3":  $ctype = "audio/mpeg"; break;
        case "mpg":  $ctype = "video/mpeg"; break;
        case "mpeg": $ctype = "video/mpeg"; break;
        case "rar":  $ctype = "encoding/x-compress"; break;
        case "txt":  $ctype = "text/plain"; break;
        case "wav":  $ctype = "audio/wav"; break;
        case "wma":  $ctype = "audio/x-ms-wma"; break;
        case "wmv":  $ctype = "video/x-ms-wmv"; break;
        case "zip":  $ctype = "application/x-zip-compressed"; break;
        case "gz":   $ctype = "application/x-gzip-compressed"; break;
        default:     $ctype = "application/force-download"; break;
    }
    // Begin writing headers
    header("Cache-Control:");
    header("Cache-Control: public");
    // Use the switch-generated Content-Type
    header("Content-Type: $ctype");
    if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE")) {
        # workaround for IE filename bug with multiple periods / multiple dots in filename
        # that adds square brackets to filename - eg. setup.abc.exe becomes setup[1].abc.exe
        $iefilename = preg_replace('/\./', '%2e', $filename, substr_count($filename, '.') - 1);
        header("Content-Disposition: attachment; filename=\"$iefilename\"");
    } else {
        header("Content-Disposition: attachment; filename=\"$filename\"");
    }
    header("Accept-Ranges: bytes");
    $size = filesize($file);
    $range = 0;
    // check if HTTP_RANGE is sent by the browser (or download manager)
    if (isset($_SERVER['HTTP_RANGE'])) {
        list($a, $range) = explode("=", $_SERVER['HTTP_RANGE']);
        // if yes, send the missing part; strip the trailing "-" of "bytes=N-"
        $range = str_replace("-", "", $range);
        $size2 = $size - 1;
        $new_length = $size - $range; // bytes $range..$size2 inclusive
        header("HTTP/1.1 206 Partial Content");
        header("Content-Length: $new_length");
        header("Content-Range: bytes $range-$size2/$size");
    } else {
        $size2 = $size - 1;
        header("Content-Range: bytes 0-$size2/$size");
        header("Content-Length: " . $size);
    }
    // open the file
    $fp = fopen($file, "rb");
    if ($fp) {
        // seek to the start of the missing part
        fseek($fp, $range);
        // start buffered download
        while (!feof($fp)) {
            // reset the time limit for big files
            set_time_limit(0);
            print(fread($fp, 1024 * 8));
            flush();
            ob_flush();
        }
        fclose($fp);
    }
}
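The Range handling above only covers the simple `bytes=N-` form (and the original `str_replace` call from php.net was a no-op, which I fixed). A more defensive parser, as a sketch; `parse_range_start()` is a hypothetical helper name, not part of the forum code:

```php
<?php
// Sketch: extract the start offset from a "bytes=N-" Range header.
// Returns 0 when the header is absent or not in the simple one-range form.
function parse_range_start($range_header, $filesize)
{
    if ($range_header === null ||
        !preg_match('/^bytes=(\d+)-/', $range_header, $m)) {
        return 0; // no usable range: send the whole file
    }
    $start = (int)$m[1];
    // a start beyond the end of the file is invalid; fall back to 0
    return ($start < $filesize) ? $start : 0;
}
```

Multi-range requests (`bytes=0-99,200-299`) would still be ignored, which is acceptable: a server may always answer with the full file.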
/* Download file code. */
if (isset($_GET['down']) && $dest && @file_exists($cur_dir .'/'. $dest)) {
    if (is_file($cur_dir .'/'. $dest)) {
        // *** header('Content-type: application/octet-stream');
        // *** header('Content-Disposition: attachment; filename='. $dest);
        // *** header('Content-Length: '. filesize($cur_dir .'/'. $dest));
        // *** readfile($cur_dir .'/'. $dest);
        dl_file_resume($cur_dir .'/'. $dest);
    } else {
        header('Content-type: application/x-tar');
        header('Content-Disposition: attachment; filename='. $dest .'.tar');
        echo make_tar($cur_dir .'/'. $dest);
    }
    exit;
}
/* Delete file/directory code. */
Re: Download a large file with the admin panel [message #165657 is a reply to message #165651]
Sun, 31 July 2011 14:16
Atomicrun
The file download is complex because:
- It is platform dependent. Typically, everything works on Linux.
- It serves a client on the other end of a network, so it is platform dependent on the client side too: the browser used, and so on. Microsoft, once again.
- www.php.net is filled with different versions; typically, hacking goes on until a solution is found.
- Interaction between systems is generally difficult.
I run on a Windows server, and I also have a slow uplink, so a client's download time will exceed the usual timeout settings. The problem surfaced when I downloaded the entire Ghost dump through the admin panel.
The new, improved solution includes:
- Every effort is made to set up proper headers for the download.
- If the client sends $_SERVER['HTTP_RANGE'], a partial (resumed) download is possible.
- The usual timeouts are simply turned off completely!
- The server dives, intentionally, into an endless loop!
- A zero-length file is handled separately.
- If fread($fp, ...) returns zero bytes, this is accepted (it can occur during a CPU panic) and the call is retried.
- The transfer from the server is paused intermittently to let any router reschedule traffic, flush buffers, etc. I don't know exactly how this works, but an occasional "hole" likely improves reliability.
Now the code:
(Admin panel; admbrowse.php)
function dl_file_resume($file)
{
    // First, see if the file exists
    if (!is_file($file)) {
        die("<b>404 File not found!</b>");
    }
    // Gather relevant info about the file
    $filename = basename($file);
    $file_extension = strtolower(substr(strrchr($filename, "."), 1));
    // This sets the Content-Type to the appropriate setting for the file
    switch ($file_extension) {
        case "asf":  $ctype = "video/x-ms-asf"; break;
        case "avi":  $ctype = "video/x-msvideo"; break;
        case "exe":  $ctype = "application/octet-stream"; break;
        case "mov":  $ctype = "video/quicktime"; break;
        case "mp3":  $ctype = "audio/mpeg"; break;
        case "mpg":  $ctype = "video/mpeg"; break;
        case "mpeg": $ctype = "video/mpeg"; break;
        case "rar":  $ctype = "encoding/x-compress"; break;
        case "txt":  $ctype = "text/plain"; break;
        case "wav":  $ctype = "audio/wav"; break;
        case "wma":  $ctype = "audio/x-ms-wma"; break;
        case "wmv":  $ctype = "video/x-ms-wmv"; break;
        case "zip":  $ctype = "application/x-zip-compressed"; break;
        case "gz":   $ctype = "application/x-gzip-compressed"; break;
        default:     $ctype = "application/octet-stream"; break;
    }
    // Begin writing headers
    header("Cache-Control:");
    header("Cache-Control: public");
    // Use the switch-generated Content-Type
    header("Content-Type: $ctype");
    if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE")) {
        # workaround for IE filename bug with multiple periods / multiple dots in filename
        # that adds square brackets to filename - eg. setup.abc.exe becomes setup[1].abc.exe
        $iefilename = preg_replace('/\./', '%2e', $filename, substr_count($filename, '.') - 1);
        header("Content-Disposition: attachment; filename=\"$iefilename\"");
    } else {
        header("Content-Disposition: attachment; filename=\"$filename\"");
    }
    header("Accept-Ranges: bytes");
    $size = filesize($file);
    $range = 0;
    // check if HTTP_RANGE is sent by the browser (or download manager)
    if (isset($_SERVER['HTTP_RANGE'])) {
        list($a, $range) = explode("=", $_SERVER['HTTP_RANGE']);
        // if yes, send the missing part; strip the trailing "-" of "bytes=N-"
        $range = str_replace("-", "", $range);
        $size2 = $size - 1;
        $new_length = $size - $range; // bytes $range..$size2 inclusive
        header("HTTP/1.1 206 Partial Content");
        header("Content-Length: $new_length");
        header("Content-Range: bytes $range-$size2/$size");
    } else {
        $size2 = $size - 1;
        header("Content-Range: bytes 0-$size2/$size");
        header("Content-Length: " . $size);
    }
    // open the file
    @ini_set('magic_quotes_runtime', 0);
    ignore_user_abort(true);
    set_time_limit(0);
    $fp = fopen($file, "rb");
    if ($fp && $size > 0) {
        // seek to the start of the missing part
        fseek($fp, $range);
        // start buffered download
        $More_File = 10;
        $Block_counter = 0;
        while (!feof($fp) && $More_File > 0 && (connection_status() & 1) == 0) {
            $file_data = fread($fp, 1000);
            if (!$file_data) {
                sleep(3);         // zero-byte read: wait, then retry
                $More_File--;
            } else {
                print($file_data);
                $More_File = 3;
                ob_flush();
                flush();
                $Block_counter++;
                if ($Block_counter > 2000) {
                    sleep(3);     // intermittent pause to let routers catch up
                    $Block_counter = 1000;
                }
            }
        }
        ob_flush();
        flush();
        fclose($fp);
    }
}
Together, ignore_user_abort(true); and set_time_limit(0); kill the timeout. Note that you cannot set the timeout to a finite value, as the lame Microsofters have implemented it with polling and not a clock! You should also not call set_time_limit inside the loop; the performance is terrible!
The code checks the return value of fread($fp, 1000); note that it can be both an empty string and false on error. An empty string does not have to indicate an error; files are, by rumour, said to come without an EOF marker, so a read can end with an empty string.
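That distinction matters in code: `!$file_data` is true for both `false` (a real read error) and `""` (a benign empty read). A minimal sketch of telling them apart with strict comparison; `classify_read()` is a hypothetical helper name for illustration:

```php
<?php
// Sketch: distinguish fread()'s two "falsy" return values.
function classify_read($chunk)
{
    if ($chunk === false) {
        return "error"; // the read call itself failed
    }
    if ($chunk === "") {
        return "empty"; // no data right now, not necessarily an error
    }
    return "data";
}
```

The loop above deliberately lumps both falsy cases together and retries, which is a pragmatic choice when a local read error is considered unlikely.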
Performance cannot be increased with longer reads; this is likely because longer packets are split by the routers. The CPU load (on MY machine, imagine!) is 25% when running at 1000 kB/s. (Your computer should be able to run at 100 MB/s.)
The loop checks for (connection_status() & 1) == 0, which ignores the timeout flag that will be set very early on. It is possible to pause the download indefinitely, limited only by the reliability of the network. The loop ends when the connection is lost.
The !$file_data case is treated as a local and intermittent error, and several retries are made before the transfer is cut. A no-error zero-length read and a fatal file-system error are thus handled the same way. Note that a read error from a local file is unlikely. Never heard of one!
The solution counts the number of blocks and inserts a 3-second sleep after 2000 blocks, and then after each further 1000 blocks. My solution is not your solution; this is work-priority, CPU-speed and network-speed dependent.
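To put rough numbers on that pacing: with 1000-byte blocks and a 3-second sleep every 1000 blocks, the sleeps alone cap the sustained rate at about 333 kB/s. A sketch of the arithmetic; the function name and the figures are illustrative, not measured:

```php
<?php
// Sketch: upper bound on the average rate implied by "N bytes per block,
// pause P seconds every K blocks", ignoring the time spent in fread/print.
function throttled_rate_bytes_per_sec($block_bytes, $blocks_per_pause, $pause_sec)
{
    $bytes_per_cycle = $block_bytes * $blocks_per_pause;
    return $bytes_per_cycle / $pause_sec;
}
```

Since the real network and disk time also count, the observed rate will be lower still; tune the three numbers to your own line.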
Note that the headers are extremely important, so "simple" headers have been avoided.
The headers are different on Microsoft (compared to all other platforms that ...)
This is the "call" part:
/* Download file code. */
if (isset($_GET['down']) && $dest && @file_exists($cur_dir .'/'. $dest)) {
    if (is_file($cur_dir .'/'. $dest)) {
        // *** header('Content-type: application/octet-stream');
        // *** header('Content-Disposition: attachment; filename='. $dest);
        // *** header('Content-Length: '. filesize($cur_dir .'/'. $dest));
        // *** readfile($cur_dir .'/'. $dest);
        dl_file_resume($cur_dir .'/'. $dest);
    } else {
        header('Content-type: application/x-tar');
        header('Content-Disposition: attachment; filename='. $dest .'.tar');
        echo make_tar($cur_dir .'/'. $dest);
    }
    exit;
}
Please remember to exit; the script properly, since some timeouts are now entirely disabled.
The "tar" part that downloads a directory also needs some handling; a directory download can take several hours.
echo make_tar($cur_dir .'/'. $dest);
How on earth is that supposed to work!?
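The problem with that line is that make_tar() apparently builds the entire archive in memory before echoing it. One possible way around this, assuming a command-line tar is available on the server (which it may well not be on Windows), is to stream the archive through a pipe. `stream_tar_dir()` is a hypothetical sketch, not FUDforum code:

```php
<?php
// Sketch: stream a directory as a tar archive without building it in memory.
// Assumes a `tar` binary on PATH; the path is escaped before shelling out.
function stream_tar_dir($dir)
{
    $cmd = 'tar -cf - -C ' . escapeshellarg(dirname($dir))
         . ' ' . escapeshellarg(basename($dir));
    $pipe = popen($cmd, 'r');
    if ($pipe === false) {
        return false;
    }
    while (!feof($pipe)) {
        echo fread($pipe, 8192); // pass each chunk straight to the client
        flush();
    }
    return pclose($pipe) === 0; // true if tar exited cleanly
}
```

The same chunked-loop ideas from dl_file_resume() (retries, pauses, connection_status checks) could be layered on top if needed.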
The attachment download in getfile.php.t:
function dl_file_resume_noheader($file)
{
    // First, see if the file exists
    if (!is_file($file)) {
        die("<b>404 File not found!</b>");
    }
    // open the file
    @ini_set('magic_quotes_runtime', 0);
    ignore_user_abort(true);
    set_time_limit(0);
    $fp = fopen($file, "rb");
    if ($fp && filesize($file) > 0) {
        // start buffered download
        $More_File = 10;
        $Block_counter = 0;
        while (!feof($fp) && $More_File > 0 && (connection_status() & 1) == 0) {
            $file_data = fread($fp, 1000);
            if (!$file_data) {
                sleep(2);         // zero-byte read: wait, then retry
                $More_File--;
            } else {
                print($file_data);
                $More_File = 3;
                ob_flush();
                flush();
                $Block_counter++;
                if ($Block_counter > 1500) {
                    sleep(4);     // intermittent pause to let routers catch up
                    $Block_counter = 1000;
                }
            }
        }
        ob_flush();
        flush();
        fclose($fp);
    }
}
The loop is basically the same, but now I know that the maximum transfer rate will be 100 kB/s, as the firewall router (my router!) enforces this on purpose. I have therefore set a "pause" every 20-30 s of the transfer. My brain-dead CPU also needs this to handle other simultaneous clients. Imagine that.
The routine is invoked by simply replacing what was there before:
attach_inc_dl_count($id, $r[3]);
// @readfile($r[2]);
dl_file_resume_noheader( $r[2]);
The headers are super-important, and I guess the correct MIME type comes from the database. Still, I get complaints that "on your server only Explorer works; I cannot see your pictures in Firefox". Please kindly, this must be checked.
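If the MIME type stored in the database turns out to be unreliable, one option is to derive it from the file contents instead. A sketch using PHP's fileinfo extension, assuming it is enabled on the server; `detect_mime()` is a hypothetical helper name:

```php
<?php
// Sketch: determine the MIME type from file contents, with a safe fallback.
function detect_mime($path)
{
    if (function_exists('finfo_open')) {
        $fi = finfo_open(FILEINFO_MIME_TYPE);
        $mime = finfo_file($fi, $path);
        finfo_close($fi);
        if ($mime !== false) {
            return $mime;
        }
    }
    return 'application/octet-stream'; // safe default for downloads
}
```

Sending the real image type instead of a generic download type is exactly what would let Firefox display the pictures inline.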
The server (my server!) is used to store original material, and download times can be very long. I have set up an example GIF of full size, approx. 10 pictures for a total of 15000 kB, and sent the link to an administrator for testing and evaluation. Note that I cannot test locally; I have a 1000 kB/s line and no DNS, which is much different from an external connection.
Ugh! Best possible code! Could a PHP expert check the solution, put it in the right place, and send help? Then I'll see if the "tar" thing can be fixed as well.
Best regards!