I have a PHP script that automatically backs up cPanel and then uploads the .tar.gz to an FTP server.
The script works fine.
The script requests the backup file, then starts the FTP transfer, and when the upload is complete it also sends a confirmation email.
The only problem is that even if the backup is, for example, 1 GB, the uploaded file is only 170 MB.
It seems it's not able to upload large files. In fact, with a small backup (for example 16 MB or 20 MB) everything works fine.
You can see the complete file here http://pastie.org/949680
This is probably due to php.ini settings.
For instance, max_execution_time needs to be high enough to upload such a large file. I'm not sure if any other ini directives are pertinent.
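For reference, a minimal sketch of what that could look like inside the upload script itself; the host, credentials, and paths are placeholders, and the timeout and binary-mode settings are the other things I'd check:

    <?php
    // Sketch only: lift the PHP limits that can cut a long upload short,
    // then do the transfer in binary mode. Names and paths are placeholders.
    set_time_limit(0);                            // lifts max_execution_time for this run
    ini_set('memory_limit', '256M');              // only matters if you load the file into memory

    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');
    ftp_set_option($conn, FTP_TIMEOUT_SEC, 600);  // raise the FTP socket timeout as well

    // FTP_BINARY is important for .tar.gz archives; ASCII mode can mangle them
    ftp_put($conn, 'backup.tar.gz', '/local/backup.tar.gz', FTP_BINARY);
    ftp_close($conn);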
I have a VPS (on Hostinger) running OpenLiteSpeed (CyberPanel). The website I built uses PHP.
I created a file upload service and set my PHP config's post_max_size to 1024M and upload_max_filesize to 1024M as well, since both of those settings work on my local machine when testing large file uploads under 1 GB. (I've restarted PHP, as well as the server, to see if that solved the issue, but it still persists. I've also created a phpinfo file and confirmed that post_max_size and upload_max_filesize were changed to 1024M/1G, and they were.)
But on my web server, uploading files larger than 500 MB gives me a "POST net::ERR_HTTP2_PROTOCOL_ERROR" in the console, pointing to the line in my JavaScript code with the xhttp.send(formData) XMLHttpRequest call. That request forwards the selected file to an upload.php script that processes the data, but the file never reaches that script because of the error.
When I get that error while uploading files larger than 500 MB, I also get an error in my server's error log:
[NOTICE] [xxxx] [T0] [xx.xxx.xx.xxx:xxxxx:HTTP2-3#APVH_*:443_website.com] Request body size: <filesize> is too big!
The uploader works fine with files under 500 MB: it sends the FormData to the PHP upload script, which stores the file on the server and records it in the database. But it fails with anything larger.
I've looked everywhere to solve this issue. I've also looked into OpenLiteSpeed's max request body size, since that seems to be what the error in my log refers to, but most of the answers I found were from several years ago, no longer apply, and didn't resolve the issue.
There doesn't seem to be any issue with my PHP script or my JavaScript, since both work with no problems on my local server and work fine with small files on the web server.
Is there any way to resolve this issue?
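For what it's worth, that [NOTICE] line looks like OpenLiteSpeed's own request-body limit check, which is enforced separately from PHP's post_max_size and upload_max_filesize. If so, the relevant setting is the server-level tuning directive, shown here as it would appear in a stock httpd_config.conf (verify the name against your installed version):

    # WebAdmin console: Configuration > Server > Tuning, or in httpd_config.conf:
    tuning {
      maxReqBodySize          2047M   # raise above your largest expected upload
    }

A graceful restart of OpenLiteSpeed would be needed for the change to take effect.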
My application watches a set of folders where users can upload files. When a file upload finishes I have to process the file, but I don't know how to detect that a file hasn't finished uploading yet.
Is there any way to detect that a file has not yet been released by the FTP server?
There's no generic solution to this problem.
Some FTP servers lock the file being uploaded, preventing you from accessing it while the upload is still in progress. For example, the IIS FTP server does that. Most other FTP servers do not. See my answer at Prevent file from being accessed as it's being uploaded.
There are some common workarounds to the problem (originally posted in SFTP file lock mechanism, but they are relevant to FTP too):
You can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file to its final name once the upload finishes. Make your automated system ignore the ".filepart" files. (A PHP sketch of this convention appears after this list.)
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some FTP servers have this functionality built-in. For example ProFTPD with its HiddenStores directive.
A gross hack is to periodically check the file attributes (size and timestamp) and consider the upload finished if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell that a file is still incomplete.
Some FTP servers allow you to configure a hook to be called, when an upload is finished. You can make use of that. For example ProFTPD has a mod_exec module (see the ExecOnCommand directive).
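As a concrete illustration of the temporary-name convention above, here is a rough PHP sketch of the client side (the server, credentials, and file names are placeholders, not a prescribed API):

    <?php
    // Upload under a ".filepart" name, then atomically rename, so the
    // watching process never sees a half-written file under its final name.
    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');

    $final = 'data.csv';
    $temp  = $final . '.filepart';

    if (ftp_put($conn, $temp, '/local/data.csv', FTP_BINARY)) {
        ftp_rename($conn, $temp, $final);   // the rename is atomic on the server
    }
    ftp_close($conn);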
I use ftputil to implement this workaround (a rough PHP equivalent is sketched after the steps):
connect to ftp server
list all files of the directory
call stat() on each file
wait N seconds
For each file: call stat() again. If the result is different, skip the file, since it was modified within the last N seconds.
If the stat() result is unchanged, download the file.
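The steps above use Python's ftputil; a rough PHP equivalent of the same stat-twice idea (connection details are placeholders) could look like this:

    <?php
    // First pass: record size and modification time for every file.
    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');

    $first = [];
    foreach (ftp_nlist($conn, '.') as $file) {
        $first[$file] = [ftp_size($conn, $file), ftp_mdtm($conn, $file)];
    }

    sleep(30);   // wait N seconds

    // Second pass: download only the files whose attributes did not change.
    foreach ($first as $file => $stat) {
        if ([ftp_size($conn, $file), ftp_mdtm($conn, $file)] === $stat) {
            ftp_get($conn, "/local/$file", $file, FTP_BINARY);
        }
    }
    ftp_close($conn);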
This whole FTP-fetching business is old, obsolete technology. I hope the customer will use a modern HTTP API next time :-)
If you are reading files of particular extensions, then use WinSCP for the transfer. It creates a temporary file with the .filepart extension and renames it to the actual file name once the file has fully transferred.
I hope it helps someone.
This is a classic problem with FTP transfers. The only mostly reliable method I've found is to send the file, then send a second short "marker" file just to tell the recipient that the transfer of the first is complete. You can use a file naming convention and just check for the existence of the second file.
You might get fancy and make the content of the second file a checksum of the first file. Then you could verify the first file. (You don't have the same problem with the second file, because you just wait until its size equals the expected size of a checksum.)
And of course this only works if you can get the sender to send a second file.
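For illustration, the receiving side of the checksum variant might look like this in PHP (assuming the sender uploads data.bin and then data.bin.md5 containing the hex digest; all names here are made up):

    <?php
    $data   = '/incoming/data.bin';
    $marker = $data . '.md5';

    if (file_exists($marker)) {                    // the marker arrives last
        $expected = trim(file_get_contents($marker));
        if (md5_file($data) === $expected) {
            // transfer is complete and intact; safe to process $data
        }
    }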
So, I'm creating an FTP client using PHP.
What I want is this: if the upload fails (timeout, connection error, etc.), the script tries to upload the file again after 1 minute, and if it fails again, after 10 minutes, and so on until it reaches the maximum time allowed for uploading the file.
Let's say I have these files in my local folder:
File1.ext, File2.ext, File3.ext. I successfully uploaded File1.ext and File3.ext, so I need to re-upload File2.ext. How should I do it? Any ideas?
I am running my script in the background using exec(), and when the upload is done it sends me an email about the process. I am also uploading recursively: the script checks the files in my local folder, uploads them one by one, then deletes each file after uploading it.
Thanks!
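One possible shape for that retry loop, as a sketch only (the delay schedule, paths, and credentials are placeholders, not a recommendation):

    <?php
    // Sketch: try each file, backing off 1 minute, then 10 minutes,
    // and collect anything that still failed for the status email.
    $delays = [60, 600];
    $failed = [];

    foreach (glob('/local/folder/*') as $file) {
        $uploaded = false;
        for ($attempt = 0; $attempt <= count($delays); $attempt++) {
            $conn = ftp_connect('ftp.example.com');
            if ($conn && ftp_login($conn, 'user', 'password')
                && ftp_put($conn, basename($file), $file, FTP_BINARY)) {
                $uploaded = true;
            }
            if ($conn) {
                ftp_close($conn);
            }
            if ($uploaded || $attempt === count($delays)) {
                break;
            }
            sleep($delays[$attempt]);   // wait before the next attempt
        }
        if ($uploaded) {
            unlink($file);              // delete only after a successful upload
        } else {
            $failed[] = $file;          // report these in the email
        }
    }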
I have a backup of my website in a WinRAR file, and I want to upload it to my new website... So I uploaded that WinRAR file through my new website's panel, but it didn't work...
Actually, I'm moving my website to another host, so I downloaded the backup as a WinRAR file and uploaded it to the new hosting. It uploaded successfully, but it's not working; it just shows the host's default page. Check this: http://kownleg.comyr.com. I don't know what this error is all about; I'm a newbie... please help.
Any suggestions as to why this is happening, or how to upload a WinRAR backup file?
Upload the individual files to the server using the same method you used to upload your compressed file. The key is to NOT send the server a compressed file if you are unable to decompress it on the server.
This is probably related to file permissions...
Most hosting providers enforce restrictive permissions for security reasons.
Check your file permissions after decompressing the archive.
There should be no difference between uploading a single compressed file (and then decompressing it) and uploading every file individually.
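If you need to reset permissions in bulk after extracting, here is a quick PHP sketch (0755 for directories and 0644 for files are common shared-hosting defaults, but check with your host; the path is a placeholder):

    <?php
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator('/home/user/public_html',
            FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST
    );
    foreach ($it as $item) {
        chmod($item->getPathname(), $item->isDir() ? 0755 : 0644);
    }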
I built a website where I upload ten fairly large (10 MB) images. When the upload starts, it runs for some time and then a blank page appears. I tried changing php_value settings in the .htaccess file, because I don't have permission to change settings in php.ini (it's a shared server). I have some doubts about this:
1) What happens to the files when they are sent in the POST request? I want the files to upload quickly.
2) Is the time spent posting the request or processing the upload? I am cropping the images in a loop using PHP GD functions.
It is because of the limits your web hosting provider has set. Which values did you try to change in the .htaccess?
You could try using a Flash-based uploader; it should work despite the limits imposed by the server. A good one is SWFUpload.
That is because of the execution time of the script. You can edit your php.ini file; if that is not permitted, you can set max_execution_time for the script in your .htaccess file.
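For example (these directives only take effect when PHP runs as an Apache module; on CGI/FastCGI setups the host usually provides another mechanism, so treat the values as a starting point):

    # .htaccess
    php_value max_execution_time 300
    php_value max_input_time 300
    php_value upload_max_filesize 64M
    php_value post_max_size 64M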