PHP - FTP Error Handling

So, I'm creating an FTP client using PHP.
What I want is: if the upload fails (timed out, connection error, etc.), it should try to upload the file again in 1 minute, and if it fails again, try again in 10 minutes, until it reaches the maximum time allowed for the upload.
Let's say I have these files in my local folder:
File1.ext, File2.ext, File3.ext. I successfully uploaded File1.ext and File3.ext, so I need to reupload File2.ext. How should I do it? Any ideas?
I am running my script in the background using exec(), and when the upload is done it sends me an email about the process. I am doing a recursive upload: the script checks the files in my local folder, uploads them one by one, and deletes each file after uploading it.
Thanks!
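One way to sketch the retry-with-backoff behaviour described above (the host, credentials, and the 1-minute/10-minute schedule below are assumptions to adapt):

```php
<?php
// Sketch: try the upload; on failure wait 1 minute, then 10 minutes,
// repeating until a total time limit is reached. Host and credentials
// are placeholders.

/**
 * Return the next delay in seconds for a given attempt number,
 * capped at the last entry of the schedule.
 */
function nextDelay(int $attempt, array $schedule = [60, 600]): int
{
    return $schedule[min($attempt, count($schedule) - 1)];
}

function uploadWithRetry(string $localFile, string $remoteFile, int $maxSeconds = 3600): bool
{
    $start = time();
    $attempt = 0;
    while (true) {
        $conn = @ftp_connect('ftp.example.com', 21, 30); // hypothetical host
        if ($conn !== false
            && @ftp_login($conn, 'user', 'pass')
            && @ftp_put($conn, $remoteFile, $localFile, FTP_BINARY)) {
            ftp_close($conn);
            return true; // success: the caller can now delete the local file
        }
        if ($conn !== false) {
            ftp_close($conn);
        }
        $delay = nextDelay($attempt++);
        if (time() + $delay - $start > $maxSeconds) {
            return false; // give up: maximum time reached
        }
        sleep($delay);
    }
}
```

For the File2.ext case, the recursive scan would simply call uploadWithRetry() for each remaining file and only delete the local copy when it returns true, so a failed file stays in the folder and is picked up again.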

Related

Saving a file while it is still being uploaded to the server

I want to save a file while it is still uploading to the server.
For example, a client uploads a video to the server, and I want to start saving that file while the client is still waiting for the upload to finish. In other words, I want to save the uploaded file every 100 KB.
You may ask why I want to do this. Here is why:
I am creating a video-conferencing stream in PHP; I want to save the uploading video and deliver it to other clients' devices!
Can I use php:// to get that file?

php file_exists on FTP Uploading file [duplicate]

My application keeps watch on a set of folders where users can upload files. When a file upload is finished I have to apply a treatment, but I don't know how to detect that a file has not finished uploading.
Is there any way to detect whether a file has not yet been released by the FTP server?
There's no generic solution to this problem.
Some FTP servers lock the file being uploaded, preventing you from accessing it while the upload is still in progress. For example, the IIS FTP server does that. Most other FTP servers do not. See my answer at Prevent file from being accessed as it's being uploaded.
There are some common workarounds to the problem (originally posted in SFTP file lock mechanism, but relevant for the FTP too):
You can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some FTP servers have this functionality built-in. For example ProFTPD with its HiddenStores directive.
A gross hack is to periodically check for file attributes (size and time) and consider the upload finished, if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have clear end-of-the-file marker (like XML or ZIP). So you know, that the file is incomplete.
Some FTP servers allow you to configure a hook to be called, when an upload is finished. You can make use of that. For example ProFTPD has a mod_exec module (see the ExecOnCommand directive).
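On the uploader's side, the ".filepart" convention from the list above can be sketched with PHP's FTP functions (the helper names here are ours, not part of any library):

```php
<?php
// Uploader-side sketch of the ".filepart" convention: upload under a
// temporary name, then rename server-side, so a watcher never sees a
// half-written file under its final name.

function filepartName(string $finalName): string
{
    return $finalName . '.filepart';
}

/**
 * Upload $localFile over an already-authenticated FTP connection,
 * then atomically rename it to its final name.
 */
function uploadAtomically($conn, string $localFile, string $finalName): bool
{
    $temp = filepartName($finalName);
    if (!ftp_put($conn, $temp, $localFile, FTP_BINARY)) {
        return false;
    }
    // The rename (RNFR/RNTO) is a single server-side operation: the
    // automated system either sees the ignorable .filepart name or the
    // complete final file.
    return ftp_rename($conn, $temp, $finalName);
}
```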
I use ftputil to implement this work-around:
connect to ftp server
list all files of the directory
call stat() on each file
wait N seconds
For each file: call stat() again. If the result is different, skip the file, since it was modified in the meantime.
If the stat() result is unchanged, download the file.
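The same list, stat, wait, stat-again idea can be sketched with PHP's FTP functions instead of ftputil (the wait time is an assumption to tune):

```php
<?php
// Poll each file's size and mtime; a file whose stats are unchanged
// after the wait is assumed complete and safe to download.

function isStable(array $before, array $after): bool
{
    // Complete when neither size nor modification time has changed.
    return $before === $after;
}

function stableFiles($conn, string $dir, int $waitSeconds = 10): array
{
    $before = [];
    foreach (ftp_nlist($conn, $dir) as $file) {
        $before[$file] = [ftp_size($conn, $file), ftp_mdtm($conn, $file)];
    }
    sleep($waitSeconds);
    $stable = [];
    foreach ($before as $file => $stat) {
        if (isStable($stat, [ftp_size($conn, $file), ftp_mdtm($conn, $file)])) {
            $stable[] = $file; // unchanged: safe to download
        }
    }
    return $stable;
}
```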
This whole FTP-fetching approach is old and obsolete technology. I hope the customer will use a modern HTTP API next time :-)
If you are reading files of particular extensions, use WinSCP for the file transfer. It creates a temporary file with the extension .filepart and renames it to the actual file name once the file has fully transferred.
I hope it helps someone.
This is a classic problem with FTP transfers. The only mostly reliable method I've found is to send the file, then send a second short "marker" file just to tell the recipient that the transfer of the first is complete. You can use a file-naming convention and simply check for the existence of the second file.
You might get fancy and make the content of the second file a checksum of the first file. Then you can verify the first file. (You don't have the same problem with the second file, because you just wait until its size equals the checksum's size.)
And of course this only works if you can get the sender to send a second file.
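A minimal sketch of the marker-plus-checksum idea, assuming an MD5 checksum and a ".done" naming convention (both are our choices, not prescribed by the answer):

```php
<?php
// Sender side: after the data file is transferred, write a small
// marker file whose content is the data file's checksum.
function makeMarker(string $localFile): string
{
    $marker = $localFile . '.done';
    file_put_contents($marker, md5_file($localFile));
    return $marker;
}

// Recipient side: the transfer is complete and intact when the marker
// exists and its content matches the data file's checksum.
function transferComplete(string $dataFile, string $markerFile): bool
{
    return is_file($markerFile)
        && trim(file_get_contents($markerFile)) === md5_file($dataFile);
}
```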

Getting images onto server from separate web host

I have GoDaddy shared webspace with FTP access that contains a folder with images. These images change every day.
I'm looking for advice on what I need to do to get these images onto my server in the workplace, maybe every hour or so. The server doesn't have IIS installed; is there any other way to do this?
Am I able to write a PHP script that can put all the images onto the server using the IP or something?
I had the same issue as I had two images on a remote server which I needed to copy to my local server at a predefined time each day and this is the code I was able to come up with...
try {
    if (!@copy('url/to/source/image.ext', 'local/absolute/path/on/server/' . date("d-m-Y") . ".gif")) {
        $errors = error_get_last();
        throw new Exception($errors['type'] . " - " . $errors['message']);
    }
} catch (Exception $e) {
    // An error occurred downloading the requested image.
    // Place exception handling code here.
}
I then created a cron job which runs this file on a daily basis, but you could run it as frequently as you need based on how often the source image changes on the source server.
A few points to explain the code...
As the code runs in the background, I use the @ operator to suppress any errors raised by the copy command, since they wouldn't be viewable anyway. Instead, if copy fails it returns false, which triggers throwing a new exception carrying the error type and message from error_get_last(). That exception can then be handled in the catch block however you need, such as writing a local log entry or sending an error email.
In the second parameter of the copy command you will see that the path results in the filename being based on the current date. This name can be anything, but it needs to be unique in the destination for this to work. If you plan on overwriting the image each time the copy runs, you can hard-code the same name; but if you want to maintain a history, you will need a file-naming scheme on your local server to ensure the files don't get overwritten each time the cron job runs.
The first parameter of the copy statement assumes the source image file is named the same each time; if the name changes, you will need to identify how the naming works and build that name in a variable to use as the source filename.
This code does not alter the format of the source image file, so to avoid corruption and ensure the image can still be displayed after the copy, make sure the source image file and the local copy have the same file extension: if the source is a .gif file, the extension in the second copy parameter must also be .gif.
I'll try to answer here instead of continuing the comment spam ;)
You have your webspace with FTP access; let's just call it webspace.
Then you have your server at your workplace; let's call it workplace.
Finally, you need one server (which can also be the webspace, for example) where you are able to run PHP; let's call it php-server.
Step 1
At the workplace, set up an FTP server, for example FileZilla Server. Configure it so that you can connect to it: create an account and give it access to the folder(s) where you want to save your images. Also make sure it is reachable from outside your workplace (firewall settings etc.).
Step 2
If you can run PHP scripts on your webspace, this is the easiest way: you can directly access the image files, establish a connection to the FTP server at your workplace, and upload the files.
If the PHP server is somewhere else, you have to establish a connection from the php-server to your webspace, download the files to the php-server, and then upload them to the FTP server at your workplace.
It will be a bit of work, as you can see, but it should be possible.
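A sketch of the second variant (separate php-server): fetch each image from the webspace over HTTP, then push it to the workplace FTP server. All hosts, paths, and helper names below are placeholders.

```php
<?php
// Download an image from the webspace, then upload it over an
// already-authenticated FTP connection to the workplace server.

function remotePathFor(string $sourceUrl, string $remoteDir): string
{
    return rtrim($remoteDir, '/') . '/' . basename(parse_url($sourceUrl, PHP_URL_PATH));
}

function mirrorImage(string $sourceUrl, $ftpConn, string $remoteDir): bool
{
    $tmp = tempnam(sys_get_temp_dir(), 'img');
    if (!@copy($sourceUrl, $tmp)) {        // download from the webspace
        unlink($tmp);
        return false;
    }
    $ok = ftp_put($ftpConn, remotePathFor($sourceUrl, $remoteDir), $tmp, FTP_BINARY);
    unlink($tmp);                          // clean up the staging copy
    return $ok;
}
```

A cron job on the php-server could call mirrorImage() for each known image URL every hour.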
Create a .bat file that uses the command-line ftp client to get the data you want from the server. Save that .bat file somewhere and create a Scheduled Task to run it every hour.
You can even store the actual sequence of ftp commands in a separate file (e.g. ftpcmd.dat) and call it from the script:
ftp -n -s:ftpcmd.dat SERVERNAME.COM
Example ftp command file:
user MyUserName
Password
bin
cd \your\path\images
mget * C:\Temp\
quit

Is there a better way to see if a file is being written to?

We have a FreeBSD server with Samba that employees copy image files onto, which then get uploaded to our web servers (this way they don't have to mess with FTP). Sometimes, if the upload script runs at the same time as files are being copied, it can upload an incomplete file.
We fixed this by getting the list of files along with their sizes, then waiting 5 seconds and rechecking the sizes. If the sizes match, it's safe to upload; if they don't match, it checks again after another 5 seconds.
This seems like an odd way to check whether the files are being written to. Is there a better, simpler way of doing this?
Use the flock() function (http://php.net/flock): when writing a file, obtain an exclusive lock with flock($handle, LOCK_EX); after it is written, release the lock with flock($handle, LOCK_UN).
The upload script can try to obtain the exclusive write lock too; if it succeeds, it is okay to move the file, otherwise not.
EDIT: Sorry, I forgot about the users copying the files to the server through Samba... so there is no place to call flock() while copying. But the upload script can still use flock($handle, LOCK_EX) and see whether it succeeds.
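The upload script's side of this check can be sketched as a non-blocking lock attempt (the helper name is ours; whether Samba holds an flock-visible lock depends on the server's configuration, so treat this as an assumption to verify):

```php
<?php
// Try to take an exclusive, non-blocking lock; if that fails, another
// process still has the file open for writing.
function isFileFree(string $path): bool
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        return false;
    }
    $free = flock($handle, LOCK_EX | LOCK_NB); // don't wait if locked
    if ($free) {
        flock($handle, LOCK_UN);
    }
    fclose($handle);
    return $free;
}
```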
I recommend calling smbstatus(1) via shell_exec(), e.g. smbstatus -LB, to check for locked files.
Write a script to copy the files to a temp folder on the Samba server and then, once they are fully copied and flushed, move them (i.e. unlink/link, not copy again) to the upload folder.
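A minimal sketch of that staging idea, assuming the staging and upload folders live on the same filesystem (so the move is atomic); paths and the helper name are placeholders:

```php
<?php
// Copy into a staging directory first, then rename() into the watched
// upload folder. rename() within one filesystem is equivalent to
// unlink/link, so the upload script never sees a partial file.
function stageAndPublish(string $source, string $stagingDir, string $uploadDir): bool
{
    $name   = basename($source);
    $staged = rtrim($stagingDir, '/') . '/' . $name;
    if (!copy($source, $staged)) {
        return false;
    }
    // Atomic within one filesystem.
    return rename($staged, rtrim($uploadDir, '/') . '/' . $name);
}
```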

PHP file uploads being "hijacked" by partial uploads

I have a site that is receiving 30-40k photo uploads a day, and I've been seeing an issue pop up more frequently now. The issue is this:
Our upload script receives (via $_FILES['name']['tmp_name']) a file (photo) that was NOT uploaded by the user, and the majority of the time the file received is a "partial" upload.
Of course at first I thought it was my PHP code making a simple mistake, and I've spent days looking over it to make sure. But after placing checks in the code, I've found that the file received via an HTTP POST upload to PHP is actually the wrong file, so the issue is happening before it reaches my code. The tmp file (phpxxxx) received by the script is sometimes incorrect, as if it were somehow being overwritten by another process, and it's usually overwritten by a file that was partially uploaded.
Has anyone ever seen an issue like this? Any help is greatly appreciated. I'm turning to this as a last resort after days of searching and asking other PHP devs.
So to recap:
User uploads a photo
PHP script receives a file that was not uploaded by the user (pre-code, via $_FILES in /var/tmp)
Usually the incorrect file received is a partial or broken upload
It seems to happen randomly and not all the time
First off, check your PHP version.
Second, check your file upload limits and post_max_size in php.ini.
It might just be that someone is trying to upload a file that's too large :-)
Can you try different names for the temp file to avoid its being overwritten? Can you identify the origin of the new, incorrect and incomplete file?
Is this a development environment? Is it possible that more than one user is uploading files at the same time?
Try your program with very small images to check if SchizoDuckie is correct about filesize problems.
Try with different browsers to eliminate the admittedly remote possibility that it is a local problem.
Check permissions on the directory where the temp file is stored.
PHP's built-in file handling does not support partial uploads.
Turn off KeepAlives and/or send a 'Connection: close' header after each upload.
Configure your webserver to send the header 'Accept-Ranges: none'.
