I have a PHP script that downloads files via direct links from a remote server that I own. Sometimes the files are large (~500-600 MB) and sometimes small (~50-100 MB).
Some code from the script:
$links[0]="file_1";
$links[1]="file_2";
$links[2]="file_3";
for($i=0;$i<count($links);$i++){
    $file_link=download_file($links[$i]); //this function downloads the file with cURL and returns the path to the downloaded file on the local server
    echo "Download complete";
    rename($file_link,"some other_path/..."); //this moves the downloaded file to some other location
    echo "Downloaded file moved";
}
My problem is that when I download a large file and run the script from a web browser, it takes up to 5-10 minutes to complete, the script echoes up to "Download complete", and then it dies completely. I always find that the file that was being downloaded before the script died is 100% downloaded.
On the other hand, if I download small files (50-100 MB) from a web browser, or run the script from the command shell, this problem does not occur at all and the script completes fully.
I am using my own VPS for this and there is no time limit set on the server. There is no fatal error or memory overload problem either.
I also tried ssh2_sftp to copy files from the remote server, but the same problem occurs when I run the script from a web browser. It always downloads the file, executes the next line, and then dies! Very strange!
What should I do to get over this problem?
To make sure you can download larger files, you have to make sure that there is:
enough memory available for PHP
a maximum execution time limit set high enough.
Judging from what you said about ssh2_sftp (I assume you are running it via PHP), your problem is the second one. Check your error logs to confirm that this really is the cause. If so, simply increase the maximum execution time in your settings/php.ini and that should fix it.
Note: I would encourage you not to let PHP handle these large files itself. Call some external program (via system() or exec()) that will do the download for you, as PHP still has garbage collection issues.
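A minimal sketch of that approach, delegating the transfer to the curl binary via exec(). The URL and destination path are placeholders; the command is built by a pure helper so the shell quoting can be checked without touching the network:

```php
<?php
// Build a shell-safe curl command that downloads $url to $dest.
// Kept as a pure function so it is easy to test in isolation.
function build_download_cmd(string $url, string $dest): string
{
    return 'curl -sS -o ' . escapeshellarg($dest) . ' ' . escapeshellarg($url);
}

// Usage (hypothetical URL and path):
// exec(build_download_cmd('https://example.com/file_1', '/tmp/file_1'), $output, $status);
// if ($status !== 0) { /* handle the failed download */ }
```

Because the external curl process does the transfer, PHP's own memory use stays flat regardless of file size.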
I'm uploading an image taken from a Raspberry Pi camera to a LiteSpeed server through "ncftpput" every minute.
I want to be able to show the updated image and I know how I can force the browser to use the latest version instead of the cached image.
So, everything works properly, except that if I refresh the image (e.g. with Shift-F5) during the upload, the browser reports that the image contains errors (or shows it only partially).
Is there any way to ensure the image is fully uploaded before the new one is served?
I'm not sure whether I should operate on ncftp or use PHP to ensure the swap happens only after the upload is complete.
The image is a progressive JPG, but that doesn't help...
Any suggestion?
Thanks
I ended up not using FTP because, as Viney mentioned, the web server doesn't know when the upload is complete.
I'm now using "curl", which has the advantage of being preinstalled on the Raspberry Pi distro, together with a PHP upload page.
PHP makes the new image visible only once it is fully uploaded, which avoids the issue of serving a partially uploaded image.
So, to recap:
On the Raspberry Pi (webcam), after taking the image:
curl -F "somepostparam=abcd" -F "operation=upload" -F "file=@filename.jpg" https://www.myserver.com/upload.php
PHP server code:
$uploadfile = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile); // the finished upload only becomes visible here
$content = file_get_contents($uploadfile);
The problem is this: you open the browser at 07:00:10 AM and image.jpg gets rendered. Say at 07:01:00 you hit refresh, but the Raspberry Pi has already started uploading image.jpg and needs, say, 3 seconds to complete the full upload. The server knows nothing about the FTP transfer, so it reads whatever bytes are currently in image.jpg and flushes them to your browser. Had it been a baseline JPG, you would have seen an image cropped in height; since it's a progressive JPG, it comes out messed up. I am not sure whether this is possible, but look into whether your FTP server supports file locking.
How to solve it?
The best way is to let your server know that the file it's accessing is in the middle of a write. If your FTP server supports advisory locks, use them: when the web server tries to access the file (via a system call), the kernel will report that the file is currently locked, and the server will wait until the FTP daemon releases the lock.
In vsftpd there is an option, lock_upload_files, in vsftpd.conf; setting it to YES enables this feature.
If you cannot make the above work, you can fall back to a trick: check the file's last-modified time, and if it is almost the same as the current time, make PHP wait for a guessed average upload time before serving. For this method you should let PHP send the image to the browser instead of the web server: just change the src of your image from '/path/to/image.jpg' to 'gen-image.php'. That script reads the image and flushes it to the browser.
gen-image.php
$file = '/path/to/image.jpg';
$guessedUploadTime = 3; // guessed time (in seconds) that ncftpput takes to finish
$currTime = time();
$modTime = filemtime($file);
if( ($currTime - $modTime) < $guessedUploadTime)
{
    sleep($guessedUploadTime); // modified very recently: assume an upload is still in progress
}
$type = 'image/jpeg';
header('Content-Type: ' . $type);
readfile($file);
Note that the above solution is not ideal: if the file has only just finished uploading, it won't be modified again for another 57 seconds, yet a browser request at, say, 07:02:04 still has to wait 3 seconds unnecessarily, because the mtime would be 07:02:03 and the browser would only get the file at 07:02:06. I would recommend searching for some (probably command-line-based) way to make the web server and the FTP daemon aware of each other's status, because that is the root cause of this problem.
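If the upload side does take an advisory lock (e.g. vsftpd's lock_upload_files), the PHP side can cooperate with flock() instead of guessing from mtime. A sketch, with the file path as a placeholder:

```php
<?php
// Read a file only after any exclusive (write) lock on it has been released.
// flock() with LOCK_SH blocks while the uploader holds LOCK_EX on the file.
function read_when_unlocked(string $path)
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_SH);              // waits here while the upload is in progress
    $data = stream_get_contents($fp); // safe: no writer holds the file now
    flock($fp, LOCK_UN);
    fclose($fp);
    return $data;
}

// In gen-image.php you would then send it:
// header('Content-Type: image/jpeg');
// echo read_when_unlocked('/path/to/image.jpg');
```

Note this only helps if the writer actually takes an exclusive lock; a plain FTP upload without locking will not block the shared lock.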
I'm having a problem with a file upload script.
The user uploads files through the site (large files, like WeTransfer).
The upload percentage is shown with Ajax.
When it completes, a notice is shown.
But the problem starts here.
Because the files are huge, moving them to the appropriate folder and zipping them takes time.
If the user closes the browser during this step, the process cannot complete.
How do I ensure that the operation continues even if the user closes the browser?
I tried ignore_user_abort(), but without success.
Send a response to the browser saying that you are moving the file, and either put the work in a queue and execute it as a background job, or just do it in your script. This should help: https://stackoverflow.com/a/5997140/4099089
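A minimal sketch of handing the work off to a detached background process, so it survives the browser closing. The worker script name is a placeholder; the command string is built by a pure helper so it can be tested without actually spawning anything:

```php
<?php
// Build a command that runs $cmd detached from the web request:
// nohup keeps it alive after the parent exits, output is discarded,
// and the trailing '&' backgrounds it so exec() returns immediately.
function build_background_cmd(string $cmd): string
{
    return 'nohup ' . $cmd . ' > /dev/null 2>&1 &';
}

// Usage: acknowledge the upload first, then launch the worker.
// echo json_encode(['status' => 'processing']);
// exec(build_background_cmd('php move_and_zip.php ' . escapeshellarg($uploadedFile)));
```

The browser gets its answer right away, and the move/zip runs to completion in the detached process regardless of what the user does next.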
I want to upload a local file to Google Drive via PHP. If I upload a small file (<5 MB) everything works. If I upload a larger file I get "Fatal error: Maximum execution time of 30 seconds exceeded". I could set max_execution_time in my php.ini to a very high value, but that seems like a bad solution. If I use "set_time_limit(seconds);" I can't upload more files simultaneously (I don't know why).
So my question is how to upload a large file without putting a bad execution time in my php.ini. How is this supposed to be solved in PHP? I would like to just use a simple curl CLI command, but that does not work with Google Drive because curl can't handle OAuth.
Setting the max_execution_time with set_time_limit is the right solution. If your script genuinely needs more than 30 seconds to upload the file, then it simply has to be allowed to run for more than 30 seconds.
set_time_limit(600); //File upload should not take longer than 10 minutes
do_your_file_upload();
set_time_limit(30); //Change it back to 30 seconds for further processing
I can't upload more files simultaneously
Can you specify what you mean?
If you cannot reach your PHP script while another one is running, it is often because of PHP's session management. When you start the session with session_start(), PHP (creates and) opens the session file and locks it for writing. The file stays locked until your script calls session_write_close() or the script execution ends. To work with two or more requests simultaneously, you have to call session_write_close():
session_start();
//do stuff
session_write_close();
set_time_limit(600); //File upload should not take longer than 10 minutes
do_your_file_upload();
set_time_limit(30); //Change it back to 30 seconds for further processing
session_start();
//do more stuff
session_write_close();
It is a bad idea to use a simple PHP form alone to upload a very large file.
You should use JavaScript (AJAX) to upload large files; it also lets you display a progress bar.
I did a little research. Here are some nice scripts using jQuery:
http://designscrazed.org/html5-jquery-file-upload-scripts/
Or another script which does not use jQuery (I don't know if it's a good one):
http://igstan.ro/posts/2009-01-11-ajax-file-upload-with-pure-javascript.html
Good luck!
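Whichever AJAX uploader you pick, the PHP endpoint it posts to should validate the upload before accepting it. A minimal sketch (field names and the size limit are illustrative, not from any of the linked scripts); the checks are a pure function so they can be tested with a fake $_FILES entry:

```php
<?php
// Validate one entry from $_FILES. Returns null if the upload is
// acceptable, or a human-readable error string otherwise.
function upload_error(array $file, int $maxBytes)
{
    $code = $file['error'] ?? UPLOAD_ERR_NO_FILE;
    if ($code !== UPLOAD_ERR_OK) {
        return 'upload failed with error code ' . $code;
    }
    if ($file['size'] > $maxBytes) {
        return 'file too large';
    }
    return null;
}

// In the real endpoint (names are placeholders):
// if (upload_error($_FILES['file'], 200 * 1024 * 1024) === null) {
//     move_uploaded_file($_FILES['file']['tmp_name'], $target);
// }
```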
I want to zip a large folder of 50K files on Windows Server. I'm currently using this code:
include_once("CreateZipFile.inc.php");
$createZipFile=new CreateZipFile;
$directoryToZip="repository";
$outputDir=".";
$zipName="CreateZipFileWithPHP.zip";
define("ZIP_DIR",1); // 1 = zip a whole directory
if(ZIP_DIR)
{
    //Code to zip a directory and all its files/subdirectories
    $createZipFile->zipDirectory($directoryToZip,$outputDir);
}else
{
    //?
}
$fd=fopen($zipName, "wb");
$out=fwrite($fd,$createZipFile->getZippedfile());
fclose($fd);
$createZipFile->forceDownload($zipName);
#unlink($zipName);
Everything works fine up to around 2K image files, but that is not enough: I need to be able to zip at least 50K images. At that point my script hits this error:
Fatal error: Maximum execution time of 360 seconds exceeded in C:\xampp\htdocs\filemanager\CreateZipFile.inc.php on line 92
$newOffset = strlen(implode("", $this->compressedData));
I'm searching for any solution to process such a huge number of files. I currently use XAMPP on Windows Server 2008 Standard. Would it be possible to build the zip in small parts, use a system command or an external tool to pack them, and then send the result to the browser for download?
http://pastebin.com/iHfT6x69 for CreateZipFile.inc.php
Try this to increase the execution time:
ini_set('max_execution_time', 500);
500 is the number of seconds; change it to whatever you like.
Do you need a smaller file, or a fast-served file?
For fast serving without compression and without memory problems, you could try using a system command with an archiving tool such as zip, with compression turned off.
The file would probably get huge, but it would be served fast as one file.
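A sketch of that idea: shell out to the zip command-line tool with compression disabled (-0 = store only), which avoids holding the archive in PHP memory entirely. It assumes the zip binary is installed and on the PATH; paths are placeholders:

```php
<?php
// Build a zip command that stores files without compressing them
// (-0 = store only), which is much faster for already-compressed images.
function build_zip_cmd(string $zipPath, string $dir): string
{
    return 'zip -r -0 ' . escapeshellarg($zipPath) . ' ' . escapeshellarg($dir);
}

// Usage (hypothetical paths):
// exec(build_zip_cmd('CreateZipFileWithPHP.zip', 'repository'), $out, $status);
// if ($status === 0) { /* stream the archive to the client with readfile() */ }
```

readfile() streams the archive to the client in chunks, so even a multi-gigabyte zip never has to fit in PHP's memory limit.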
We have a script which downloads a CSV file. When we run this script on the command line on the EC2 console it runs fine; it downloads the file and sends a success message to the user.
But if we run it through a browser then we get:
error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
When we checked the backend for the downloaded file, it's there, but the success message sent after the download is not received by the browser.
We are using cURL to download from a remote location with authentication. The user group and ownership of the folder is "ec2-user" and the folder has full rights (777).
To summarize: the file is downloaded, but on the browser end we are not getting any data or the success message which we print.
P.S.: The problem occurs when the downloaded file size is 8-9 MB; with a smaller file, say 1 MB, it works. So either script execution time, download file size, or some EC2 instance config is blocking the response to the browser. The same script works perfectly fine on our GoDaddy Linux VPS. We have already raised the max execution time for the script.
Sadly, this is a known problem without a good solution. There's a very long thread on the Amazon forum here: https://forums.aws.amazon.com/thread.jspa?threadID=33427. The solution offered there is to send a keep-alive message to keep the connection from dying after 60 seconds. Not a great solution, but I don't think there's a better one unless Amazon fixes the problem, which doesn't seem likely given that the thread has been open for three years.
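A sketch of the keep-alive idea: split the long-running work into units and emit a harmless whitespace byte between units, so the connection is never idle long enough to be cut. How you chunk the work (e.g. one cURL range request per unit) is up to your script:

```php
<?php
// Run a list of work units, flushing a single space to the client
// between units so an idle-timeout (like EC2's ~60s one) never fires.
function run_with_keepalive(array $tasks): void
{
    foreach ($tasks as $task) {
        $task();    // one unit of work (e.g. download one chunk of the file)
        echo ' ';   // harmless byte: keeps the HTTP connection alive
        flush();    // push it to the client immediately
    }
}

// Usage: run_with_keepalive([$chunk1, $chunk2]); echo "Download complete";
```

The leading spaces are invisible in a plain-text response; for a JSON or HTML response you would instead emit them before the real body starts.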