File upload with breaks - PHP

I'd like to upload large files to my server, but I would like to be able to pause and resume the upload (for example, the user must be able to shut down his computer and continue after a reboot).
I think I can handle the client-side upload, but I don't know how to build the server side. What is the best way to do it on the server side? Is PHP able to do that? Is PHP the most efficient choice?
Thanks a lot

If you manage to get the client side to post the file in chunks, you could do something like this on the server side:
// path of the file being uploaded, as sent by the client
$path = $_GET['path'];
// set the `append` parameter to 1 when uploading a new chunk of data that should be appended to an existing file
$append = intval($_GET['append']);
// convert the client-supplied path to a physical filename on the server
// (convertToPhysicalPath() is your own helper; it must also sanitize the path)
$filename = convertToPhysicalPath($path);
// the temporary file PHP created for this upload
$tmp_file = $_FILES['file']['tmp_name'];
if ($append == 0) {
    // first chunk: just copy the uploaded file into place
    copy($tmp_file, $filename);
} else {
    // later chunk: append its contents to the existing file
    $write_handle = fopen($filename, "ab");
    $read_handle = fopen($tmp_file, "rb");
    fwrite($write_handle, fread($read_handle, filesize($tmp_file)));
    fclose($write_handle);
    fclose($read_handle);
}
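For completeness, here is a hypothetical client-side counterpart in PHP; the upload URL, the 1 MB chunk size and the field names are illustrative assumptions, not part of the answer above. A real client would also persist how many bytes were already sent, so it can seek to that offset and resume after a reboot.
<?php
// Hypothetical chunked-upload client (all names and URLs are assumptions).
$path  = 'bigfile.bin';
$chunk = 1024 * 1024; // 1 MB per request
$in    = fopen($path, 'rb');
$first = true;
while (!feof($in)) {
    $data = fread($in, $chunk);
    if ($data === false || $data === '') {
        break;
    }
    // curl can only attach real files, so stage the chunk in a temp file
    $tmp = tempnam(sys_get_temp_dir(), 'chunk');
    file_put_contents($tmp, $data);
    $ch = curl_init('https://example.com/upload.php?path=' . urlencode($path)
        . '&append=' . ($first ? 0 : 1));
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => new CURLFile($tmp)));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
    unlink($tmp);
    $first = false;
}
fclose($in);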

If you are trying to design a web interface that lets anyone upload a large file and resume the upload partway through, I don't know how to help you. But if all you want to do is get files from your computer to a server in a resumable fashion, you may be able to use a tool like rsync. Rsync compares the files on the source and destination, and then only copies the differences between the two. This way, if you have 50 GB of files that you upload to your server and then change one, rsync will very quickly check that all the other files are the same, and then only send your one changed file. This also means that if a transfer is interrupted partway through, rsync will pick up where it left off.
Traditionally rsync is run from the command line (terminal), and it is installed by default on most Linux distributions and on Mac OS X.
rsync -avz /home/user/data server:src/data
This would transfer all files from /home/user/data to src/data on the server. If you then change any file in /home/user/data, you can run the command again to resync it.
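If transfers are often interrupted mid-file, rsync's --partial option (bundled with --progress in the -P shortcut) is also worth knowing: it keeps partially transferred files, so a rerun resumes an interrupted file instead of starting it over.
rsync -avzP /home/user/data server:src/data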
If you use Windows, the easiest solution is probably to use DeltaCopy, which is a GUI around rsync.

Related

Replace image on server and show previous image until the new one is fully uploaded

I'm uploading an image taken by a Raspberry Pi camera every minute to a lightspeed server through "ncftpput".
I want to be able to show the updated image, and I know how to force the browser to use the latest version instead of the cached image.
So everything works properly, except that if I refresh the image (e.g. with Shift-F5) during the upload, the browser reports that the image contains errors (or shows it only partially).
Is there any way to ensure the new image is only served once it has been fully uploaded?
I'm not sure whether I should operate on ncftp or use PHP to ensure the swap happens only after the upload is complete.
The image is a progressive JPG, but that doesn't help...
Any suggestion?
Thanks
I ended up NOT using FTP because, as Viney mentioned, the web server doesn't know when the upload is completed.
I'm using curl instead, which has the advantage of being preinstalled on the Raspberry Pi distro, plus a PHP upload page.
PHP only exposes the new image once it is fully uploaded, which avoids the issue of serving a partially uploaded image.
So, to recap:
On the Raspberry Pi (webcam), after taking the image:
curl -F"somepostparam=abcd" -F"operation=upload" -F"file=@filename.jpg" https://www.myserver.com/upload.php
PHP server code:
<?php
// upload.php - move the finished upload into place in the web root
$uploadfile = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
// (optional) read the stored image back, e.g. to validate it
$content = file_get_contents($uploadfile);
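A slightly safer variant (just a sketch, with placeholder paths) uploads to a temporary name first and then renames it into place; rename() is atomic on the same filesystem, so a reader can never catch the image half-written:
<?php
// upload.php - sketch of an atomic replace; paths are placeholders
$target = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
$tmp    = $target . '.part';
if (move_uploaded_file($_FILES['file']['tmp_name'], $tmp)) {
    rename($tmp, $target); // atomic on the same filesystem
}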
The problem is this: you open the browser (at 07:00:10 AM) and image.jpg gets rendered. Now say it's 07:01:00 and you hit refresh, but the Raspberry Pi has already started uploading image.jpg. Say it would take 3 seconds to complete the full upload; the server knows nothing about the FTP transfer, so it reads whatever bytes are currently in image.jpg and flushes them to your browser. Had it been a baseline JPG it would have shown an image cropped in height, but since it's a progressive JPG it gets messed up. I am not aware of whether it's possible, but try looking up whether your FTP server supports file locking.
How to solve it?
The best way is to let your server know that the file it's accessing is being written to. If your FTP server supports advisory locks then you can use them: when the web server tries to access your file (via a system call), the kernel will tell it that the file is currently locked, so the server will wait until the FTP daemon releases the lock.
In vsftpd there is an option, lock_upload_files, in vsftpd.conf; setting it to YES enables this feature.
If you are unable to work out the solution above, then you can use a trick like checking the file's last-modified time: if it's almost the same as the current time, make PHP wait for some guessed time that you think is the average upload time of your file. For this method you should use PHP to send the image to the browser instead of the web server, i.e. just change the src of your image from '/path/to/image.jpg' to 'gen-image.php'. That script reads the image and flushes it to the browser:
gen-image.php
<?php
// gen-image.php - serve the image, waiting briefly if it looks mid-upload
$guessedUploadTime = 3; // guessed time (seconds) that ncftpput takes to finish
$file = '/path/to/image.jpg';
$type = 'image/jpeg';
$currTime = time();
$modTime  = filemtime($file);
// modified within the last few seconds? assume the upload is still running
if (($currTime - $modTime) < $guessedUploadTime) {
    sleep($guessedUploadTime);
}
header('Content-Type: ' . $type);
readfile($file);
Note that the above solution is not ideal: if the file has just finished uploading, it won't be modified again for another 57 seconds, yet a browser request at, say, 07:02:04 still has to wait unnecessarily for 3 seconds, because the mtime would be 07:02:03 and the browser would only get the file at 07:02:06. I would recommend searching for some way (probably command-line based) to make the web server and the FTP server work hand in hand; one should know the status of the other, because that is the root cause of this problem.

How to upload a file in PHP from the browser with no user input

I am working on a PHP web app.
I need to upload a file with customer info - customers.csv - to the web server, but this process needs to be automated.
The file will be generated in a Point of Sale app, and the app can open a browser window with the URL...
First I thought I would do something like www.a.com/upload/&file=customers.csv, but I read on here that that is not possible.
Then I thought I would set a value for the file upload field and submit the form automatically after x seconds. Discovered that's not possible either.
Anybody with a solution? It will be appreciated.
EDIT
I have tried this and it works - the file is uploaded to the remote server.
Is it working only because the PHP script is running on the same PC where the CSV is sitting?
$file = 'c:\downloads\customers.csv';
$remote_file = 'customers.csv';
// set up basic connection
$conn_id = ftp_connect('host.com');
// login with username and password
$login_result = ftp_login($conn_id,'user','password');
// upload a file
if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
echo "successfully uploaded $file\n";
} else {
echo "There was a problem while uploading $file\n";
}
// close the connection
ftp_close($conn_id);
This is of course not possible; imagine how it could be abused to upload, on Linux for example, /etc/passwd. The only way it might be possible is to use a Java applet, but that is surely not the best way.
You could try to let your PoS application make a web request with the customers.csv file and let a web API handle the upload. This may be possible, but I have no experience with Point of Sale applications.
Best might be, if the solution above cannot be considered, to just prompt the user to provide the file and to check, by name and content, that it is the correct one.
This is a bit tricky, but if your CSV is not too long, you could encode it in base64, send it to the web server as a GET parameter and then, on the server side, decode and store it as a CSV file.
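A minimal sketch of the receiving end (the csv parameter name and the target path are assumptions; the client must urlencode the base64 string, since it can contain +, / and =, and URL length limits make this unsuitable for large files):
<?php
// receive.php - decode a base64-encoded CSV passed as a GET parameter
$encoded = isset($_GET['csv']) ? $_GET['csv'] : '';
$data = base64_decode($encoded, true); // strict mode rejects invalid input
if ($data !== false) {
    file_put_contents('/path/to/customers.csv', $data);
}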
If the file is too big for that, you have to use another method, like the Java applet pointed out by @D.Schalla, or even install and configure an FTP server and make the Point of Sale app upload the file there.
Another alternative, especially good if you cannot modify the sale app, is to install a web server on the client side and write a small PHP script to handle the upload process. That way, the sale app can call a local URL (something like http://localhost/upload.php) and this script is the one in charge of uploading the file, which can be achieved with a classical HTTP POST, an FTP connection or any other way you can think of.
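A sketch of such a local script (hosts, credentials and field names are placeholders): it accepts the POSTed file and relays it to the remote server over FTP, much like the snippet earlier in this thread.
<?php
// upload.php, served by a small web server on the PoS machine:
// accept the POSTed CSV, then relay it to the remote host over FTP
$local = sys_get_temp_dir() . '/customers.csv';
if (!move_uploaded_file($_FILES['file']['tmp_name'], $local)) {
    die('no file received');
}
$conn = ftp_connect('host.com');
if ($conn && ftp_login($conn, 'user', 'password')
          && ftp_put($conn, 'customers.csv', $local, FTP_ASCII)) {
    echo 'uploaded';
}
if ($conn) {
    ftp_close($conn);
}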
My solution, which works without setting up a web server on the client side.
This is for Windows but can be adapted to Linux.
On the client side:
The local application opens cmd and runs this command: ftp -n -s:C:\test.scr
which executes test.scr - a file of FTP commands, e.g.:
open host.com
user1
passwOrd
put C:\downloads\customers.csv public_html/customers.csv
More info here:
http://support.microsoft.com/kb/96269
More commands:
http://www.nsftools.com/tips/MSFTP.htm#put

PHP Read/Write files on remote server

I'm making a utility that provides a GUI to easily edit certain values in a CSV file on a remote server. My boss wants the utility in PHP, running on our private web server. I'm new to PHP, but I was able to get the GUI file modifier working locally without issues. The final piece is that, rather than using the local test file, I need to grab a copy of the requested file from the remote server, edit it, and then replace the old file with the edited one. My issue is uploading and downloading the file.
When I searched for a solution I found the following:
(note: in each of these I am just trying to move a test file)
$source = "http://<IP REMOTE SERVER>/index.html";
$dest = $_SERVER['DOCUMENT_ROOT']."/index.html";
copy($source, $dest);
This solution ran into a permissions error.
$source ="http://<IP REMOTE SERVER>/index.html";
$destination = $_SERVER['DOCUMENT_ROOT']."/newfile.html";
$data = file_get_contents($source);
$handle = fopen($destination, "w");
fwrite($handle, $data);
fclose($handle);
This also had a permissions error.
$connection = ssh2_connect('<IP REMOTE SERVER>', 22);
ssh2_auth_password($connection, 'cahenk', '<PASSWORD>');
ssh2_scp_recv($connection, '/tmp/CHenk/CHenk.csv', 'Desktop/CHenk.csv');
This solution produced the error Fatal error: Call to undefined function ssh2_connect(), which I have learned is because the function is not part of the default PHP installation.
In closing: is there any easy way to read/write files on the remote server through PHP, whether by changing permissions, having the PHP extension installed, or a different way entirely? Basically I'm trying to find the solution that requires the fewest settings changes on the server, because I am not the administrator and would have to go through a roundabout process to get any changes made. If something does need to be changed, instructions on doing so, or a link to instructions, would be greatly appreciated.
Did you set allow_url_fopen in your php.ini? (The enable-url-fopen-wrapper build option only applies to older PHP versions.)
Have a look at the PHP manual's page on remote files; storing to a remote file is shown in example #2 there.
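If allow_url_fopen is enabled, the same wrapper approach can also cover the write-back step over FTP; a sketch with placeholder credentials (note the ftp:// wrapper refuses to overwrite an existing file unless the overwrite context option is set):
<?php
// download the CSV, edit it, then push it back over FTP
$url  = 'ftp://user:password@<IP REMOTE SERVER>/path/file.csv';
$data = file_get_contents($url);
// ... edit $data here ...
$context = stream_context_create(array('ftp' => array('overwrite' => true)));
file_put_contents($url, $data, 0, $context);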

How to determine whether a file is still being transferred via FTP

I have a directory with files that need processing in a batch with PHP. The files are copied to the server via FTP. Some of the files are very big and take a long time to copy. How can I determine in PHP whether a file is still being transferred (so I can skip processing that file and handle it in the next run of the batch process)?
A possibility is to get the file size, wait a few moments, and check whether the file size has changed. This is not foolproof, because there is a slight chance that the transfer simply stalled for a few moments...
One of the safest ways of doing this is to upload the files with a temporary name, and rename them once the transfer is finished. Your program should skip files with the temporary name (a simple extension works just fine). Obviously this requires the client (uploader) to cooperate, so it's not ideal.
[This also allows you to delete failed (partial) transfers after a given time period if you need that.]
Anything based on polling the file size is racy and unsafe.
Another scheme (that also requires cooperation from the uploader) can involve uploading the file's hash and size first, then the actual file. That allows you to know both when the transfer is done, and if it is consistent. (There are lots of variants around this idea.)
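A rough sketch of the manifest idea (the "<size> <md5>" format and the .manifest suffix are assumptions for illustration; the uploader sends file.csv.manifest first, then file.csv):
<?php
// returns true once $path matches the size and MD5 recorded in its manifest
function isTransferComplete($path)
{
    $manifest = $path . '.manifest';
    if (!is_file($manifest) || !is_file($path)) {
        return false;
    }
    list($size, $md5) = explode(' ', trim(file_get_contents($manifest)));
    clearstatcache(); // avoid a stale cached filesize()
    return filesize($path) == $size && md5_file($path) === $md5;
}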
Something that doesn't require cooperation from the client is checking whether the file is open by another process or not. (How you do that is OS dependent - I don't know of a PHP builtin that does this. lsof and/or fuser can be used on a variety of Unix-type platforms, Windows has APIs for this.) If another process has the file open, chances are it's not complete yet.
Note that this last approach might not be fool-proof if you allow restarting/resuming uploads, or if your FTP server software doesn't keep the file open for the entire duration of the transfer, so YMMV.
Our server admin suggested ftpwho, which reports which files are currently being transferred.
http://www.castaglia.org/proftpd/doc/ftpwho.html
So the solution is to parse the output of ftpwho to see whether a file in the directory is being transferred.
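For example, a crude check along these lines (assuming ftpwho is installed and on the web server's PATH, and that its output mentions the name of each file in transit; verify the output format on your system first):
<?php
// crude check: does ftpwho's output mention the file we want to process?
// 'somefile.csv' is a placeholder filename.
$output = shell_exec('ftpwho');
$inTransfer = is_string($output) && strpos($output, 'somefile.csv') !== false;
if (!$inTransfer) {
    // safe to process somefile.csv
}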
Some FTP servers allow running commands when certain events occur. If your FTP server allows this, you can build a simple signalling scheme to let your application know that the file has been uploaded more or less successfully ("more or less" because you don't know whether the user intended to upload the file completely or in parts). The signalling scheme can be as simple as creating an "uploaded_file_name.ext.complete" file and monitoring for files with the ".complete" extension.
Also, you can check whether you can open the file for writing: most FTP servers won't let you do this while the file is still being uploaded.
One more approach, mentioned by Mat, is to use system-specific techniques to check whether the file is opened by another process.
The best way to check would be to try to get an exclusive lock on the file using flock; the SFTP/FTP process will be using the fopen libraries.
// try to get an exclusive lock on the file without blocking
$fp = fopen($pathname, "r+");
if ($fp !== false && flock($fp, LOCK_EX | LOCK_NB)) { // lock acquired: nothing else is writing
    flock($fp, LOCK_UN); // release the lock
    fclose($fp);
}
else {
    if ($fp !== false) {
        fclose($fp);
    }
    error_log("Failed to get exclusive lock on $pathname. The file may still be uploading.");
}
It's not a really nice trick, but it's simple :-); you can do the same with filemtime.
$result = false;
$tries = 5;
$filesize = array();
if (file_exists($filepath)) {
    // sample the file size once a second; if it grows, the upload is still running
    for ($i = 0; $i < $tries; $i++) {
        sleep(1);
        clearstatcache(); // filesize() results are cached between calls
        $filesize[] = filesize($filepath);
    }
    $filesize = array_unique($filesize);
    // a single unique value means the size did not change while we watched
    $result = (count($filesize) == 1);
}
return $result;

Read content of 12000 files from another FTP server

What I would like to script: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files on another FTP server without actually downloading those files (ftp_get)?
If not, would downloading them ONCE -> if already exists = skip / filesize differs = redownload -> search for the string -> ...
be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files
foreach ($files as $file)
{
    $data = file_get_contents($file);
    if (strpos($data, $find) !== FALSE)
        echo "found in $file" . PHP_EOL;
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you have to use them like this:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served by just downloading or updating all of the files, then searching through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may want to also keep track of which ones you searched, and their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents(), do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if (!$dir)
    die;
while (($file = readdir($dir)) !== false)
    echo htmlspecialchars($file) . '<br />';
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client: SSH in PHP.
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have and how big the files are, but this might be an interesting use case for running it from the cloud, like Amazon EC2 or google-apps (if you can download the files within the time limit).
In the EC2 case you would spin up the server for an hour to check for updates in the files and shut it down again afterwards. This would cost a couple of bucks per month and save you from potentially having to upgrade your line or hosting contract.
If this is a regular task then it might be worth using a simple queue system so you can run multiple processes at once (this will hugely increase speed). It would involve these steps:
Get a list of all files on the remote server.
Put the list into a queue (you can use memcached for a basic message queuing system).
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality (in a do-while loop), as in this pseudocode:
ftp_connect
do
    item = next item from queue
    $contents = file_get_contents(item);
    preg_match(..., $contents);
while (true);
ftp_close
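A runnable version of that loop, under the assumption (for illustration) that the queue is a plain queue.txt with one remote path per line, consumed under an exclusive lock so several workers can run at once:
<?php
// worker.php - pops remote paths off queue.txt and searches each file.
// queue.txt, the FTP credentials and the pattern are placeholders.
$pattern = '/mytext/';
while (true) {
    // pop the next item from the queue file under an exclusive lock
    $fp = fopen('queue.txt', 'c+');
    if ($fp === false || !flock($fp, LOCK_EX)) {
        break;
    }
    $lines = array_filter(array_map('trim', explode("\n", stream_get_contents($fp))));
    if (empty($lines)) { // queue drained: this worker is done
        flock($fp, LOCK_UN);
        fclose($fp);
        break;
    }
    $item = array_shift($lines);
    rewind($fp);
    ftruncate($fp, 0);
    fwrite($fp, implode("\n", $lines));
    flock($fp, LOCK_UN);
    fclose($fp);
    // fetch the remote file over FTP and search it
    $contents = file_get_contents('ftp://user:pass@ftp.example.com/' . $item);
    if ($contents !== false && preg_match($pattern, $contents)) {
        echo "found in $item" . PHP_EOL;
    }
}
Each worker exits once the queue file is empty, so you can start several of them from the command line; that is where the speed-up comes from.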
You could then in theory fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing, but it might work in this situation too.
