I am trying to transfer an entire folder to an FTP server using PHP.
Right now I am using this code:
function ftp_copyAll($conn_id, $src_dir, $dst_dir) {
    if (is_dir($dst_dir)) {
        return "<br> Dir <b> $dst_dir </b> Already exists <br> ";
    } else {
        $d = dir($src_dir);
        ftp_mkdir($conn_id, $dst_dir);
        echo "create dir <b><u> $dst_dir </u></b><br>";
        while (($file = $d->read()) !== false) { // do this for each entry in the directory
            if ($file != "." && $file != "..") { // skip . and .. to prevent an infinite loop
                if (is_dir($src_dir."/".$file)) { // do the following if it is a directory
                    ftp_copyAll($conn_id, $src_dir."/".$file, $dst_dir."/".$file); // recursive part
                } else {
                    $upload = ftp_put($conn_id, $dst_dir."/".$file, $src_dir."/".$file, FTP_BINARY); // put the files
                    echo "create files::: <b><u>".$dst_dir."/".$file." </u></b><br>";
                }
            }
            ob_flush();
            sleep(1); // pauses one second per directory entry
        }
        $d->close();
    }
    return "<br><br><font size=3><b>All Copied ok </b></font>";
}
But is it possible to transfer the entire folder without iterating through the files? I have 100+ files and PHP is taking a lot of time for the transfer.
Is there any way to increase the transfer speed?
No, there's no other generic way supported by a common FTP server.
Except packing the files (zip, gzip, etc.) locally, uploading the archive, and unpacking it remotely.
But if you have FTP access only, you have no way to unpack the archive remotely anyway, unless the FTP server explicitly allows it: either by letting you execute an arbitrary remote shell command (typically not allowed) or via a proprietary "unpack" extension (very few servers support that).
The FTP protocol is generally very inefficient at transferring a large number of small files, because each file transfer carries the overhead of opening a separate data-transfer connection.
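If packing is an option (that is, the archive can be unpacked on the remote side somehow, or is useful as-is), a minimal sketch of the "zip locally, upload once" idea could look like the following; the paths, hostname and credentials are placeholders:

<?php
// Minimal sketch: zip a local directory into a single archive and upload it
// in one FTP transfer. $src_dir, $zip_path, host and credentials are placeholders.
$src_dir  = '/path/to/local/folder';
$zip_path = sys_get_temp_dir() . '/folder.zip';

$zip = new ZipArchive();
if ($zip->open($zip_path, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create archive');
}

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($src_dir, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $file) {
    // store each file under its path relative to $src_dir
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($src_dir) + 1));
}
$zip->close();

$conn_id = ftp_connect('ftp.example.com');
ftp_login($conn_id, 'user', 'password');
ftp_pasv($conn_id, true);
ftp_put($conn_id, 'folder.zip', $zip_path, FTP_BINARY); // one transfer instead of one per file
ftp_close($conn_id);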
I have an app that ingests photos from SD cards. After they are copied the cards will be reformatted and put back in cameras and more photos will be stored on them.
Currently, instead of using the PHP copy() function, I am doing the following (roughly):
$card   = '/Volumes/SD_Card/DCIM/EOS/';
$files  = scandir($card);
$target = '/Volumes/HARD_DRIVE/photos/';
foreach ($files as $k => $file) {
    if (strtolower(pathinfo($file, PATHINFO_EXTENSION)) == 'jpg') {
        $img_data = file_get_contents($card . $file); // read from the card path, not the current directory
        $orig_md5 = md5($img_data);
        $success  = file_put_contents($target . $file, $img_data);
        unset($img_data);
        if ($success === false) {
            echo "an error occurred copying $file\n"; exit;
        } elseif ($orig_md5 != md5_file($target . $file)) {
            echo "an error occurred confirming data of $file\n"; exit;
        } else {
            echo "$file copied successfully.\n";
            unlink($card . $file); // unlink() takes a path, not the file contents
        }
    }
}
I am currently doing it this way so I can compare the md5 hashes to make sure the copy is a bit-for-bit match of the original.
My questions are:
1) Would using php copy() be faster? I assume it would, because the target file doesn't have to be read into memory to check the md5 hash.
2) Does copy() do some sort of hash check as part of the function, to ensure the integrity of the copy, before returning TRUE/FALSE?
PHP's copy() would not only be faster, it also copies through buffers, so it avoids reading the whole file into memory, which is a problem for big files. The returned boolean only indicates that the write succeeded; you can rely on that, but if you want to check the hash, use md5_file() instead of passing the contents to md5(), because it reads the file in the same memory-efficient way.
However, if you just need to move the file and the source and destination are on the same filesystem, rename() is far better: it is practically instant and reliable.
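As a rough sketch of that approach applied to the loop in the question (reusing the $card and $target variables from the question and keeping error handling minimal):

foreach (scandir($card) as $file) {
    if (strtolower(pathinfo($file, PATHINFO_EXTENSION)) != 'jpg') {
        continue;
    }
    $src = $card . $file;
    $dst = $target . $file;
    // copy() streams through an internal buffer, so the whole image
    // is never held in PHP memory at once
    if (!copy($src, $dst)) {
        echo "an error occurred copying $file\n";
        exit;
    }
    // md5_file() also reads in a buffered way, so this is cheaper
    // than file_get_contents() + md5()
    if (md5_file($src) !== md5_file($dst)) {
        echo "an error occurred confirming data of $file\n";
        exit;
    }
    echo "$file copied successfully.\n";
    unlink($src);
}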
No, copy() doesn't perform any additional integrity checks; it assumes that the operating system's filesystem API is reliable.
You could use md5_file() on both the source and destination:
if (copy($source, $dest) && md5_file($dest) == md5_file($source)) {
    echo "File copied successfully";
} else {
    echo "Copy failed";
}
Note that your integrity checks do not actually verify that the file was written to disk properly. Most operating systems use a unified buffer cache, so when you call md5_file() immediately after writing the file, it will get the file contents from the kernel buffers, not from the disk. In fact, it's possible that the target file hasn't even been written to disk yet; it is still sitting in kernel buffers waiting to be flushed. PHP doesn't have a function to call sync(2), but even if it did, it would still read from the buffer cache rather than re-reading from disk.
So you're basically at the mercy of the OS and hardware, which you must assume is reliable. Applications that need more reliability tests must perform direct device I/O rather than going through the filesystem.
EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct and I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but only their name and extension - no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the site with PHP on localhost and then put it on a server as plain HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach ($paths as $path)
{
    echo "<br>";
    $path   = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest   = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only for the webserver to understand web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files; it's about accessing the HTML that the script generates when run through a webserver.
You can do this in a few different ways. I'm not sure exactly how the generator script works, but it seems like that script is trying to copy the supposed output from loads of PHP files.
To get the generated content from a PHP file you can either use the command-line php command to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <-- UPDATED

foreach ($paths as $path)
{
    echo "<br>";
    $path   = str_replace($localroot, "", $path);
    $path   = str_replace("\\", "/", $path); // <-- ADDED
    $source = $hosted . $path;
    $dest   = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the URL you are using (https://...), which in this case, unlike copy(), will call the webserver and trigger the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing the .php with .html in the filename.
EDIT
Added and revised the code a bit above.
I made an AJAX-based multiple file upload which is working well enough. The issue is that I have to upload very large files, up to 200 MB each, and maybe 20-25 files of that size at the same time. The files are being uploaded, but it takes a very long time.
I have changed several settings in php.ini:
post_max_size 10G
upload_max_size 10G
max_execution_time 3600
memory_limit -1
So what is the best solution for handling this type of file upload so that it performs fast?
My internet connection is 100 MB/s, with an upload speed of 20 MB/s.
Please suggest a good solution.
Depending on which type of file you're uploading, you can try to send a zipped file to the server and unzip it when the upload is finished.
You can see more about zip functions in PHP here.
To zip the files:
Source: http://davidwalsh.name/create-zip-php
/* creates a compressed zip file */
function create_zip($files = array(), $destination = '', $overwrite = false) {
    //if the zip file already exists and overwrite is false, return false
    if (file_exists($destination) && !$overwrite) { return false; }
    //vars
    $valid_files = array();
    //if files were passed in...
    if (is_array($files)) {
        //cycle through each file
        foreach ($files as $file) {
            //make sure the file exists
            if (file_exists($file)) {
                $valid_files[] = $file;
            }
        }
    }
    //if we have good files...
    if (count($valid_files)) {
        //create the archive
        $zip = new ZipArchive();
        if ($zip->open($destination, $overwrite ? ZIPARCHIVE::OVERWRITE : ZIPARCHIVE::CREATE) !== true) {
            return false;
        }
        //add the files
        foreach ($valid_files as $file) {
            $zip->addFile($file, $file);
        }
        //debug
        //echo 'The zip archive contains ', $zip->numFiles, ' files with a status of ', $zip->status;
        //close the zip -- done!
        $zip->close();
        //check to make sure the file exists
        return file_exists($destination);
    }
    else
    {
        return false;
    }
}
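A quick usage example for the function above (the file names are placeholders):

$files_to_zip = array('photo1.jpg', 'photo2.jpg', 'docs/report.csv');
// third argument true = overwrite an existing my-archive.zip
$result = create_zip($files_to_zip, 'my-archive.zip', true);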
To unzip the files on the server:
<?php
$zip = new ZipArchive;
$res = $zip->open('file.zip');
if ($res === TRUE) {
    $zip->extractTo('/myzips/extract_path/');
    $zip->close();
    echo 'woot!';
} else {
    echo 'doh!';
}
?>
From your application's point of view, the only thing you can do is limit the number of simultaneous file uploads. Extend your AJAX multiple file upload script to upload just one file at a time; this will probably speed things up a bit.
However, your problem is probably not caused by the application itself, but by your server's network speed or its disk write speed. Some VPS providers also limit the number of disk write operations per second. So the best thing might be to migrate your app to another server with better network speed and performance. :)
The best way is to use good old FTP and an FTP client such as FileZilla.
I've written a script that transfers local files into a folder structure on a remote FTP server with PHP. I'm currently using ftp_connect() to connect to the remote server and ftp_put() to transfer the file, in this case, a CSV file.
My question is: how would one verify that the contents of a file on the remote FTP server are not a duplicate of the local file's contents? Is there any way to fetch the contents of the remote file in question, as well as the local version, and then compare them using a PHP function?
I have tried comparing the size of the local file using filesize() and of the remote file using ftp_size(), respectively. However, even with different data but the same number of characters, this generates a false positive for duplication, as the file sizes are the same number of bytes.
Please note, the FTP in question is not under my control, so I can't put any scripts on the remote server.
Update
Thanks to both Mark and gafreax, here is the final working code:
$temp_local_file = fopen($conf['absolute_installation_path'] . 'data/temp/ftp_temp.csv', 'w');

if (ftp_fget($connection, $temp_local_file, $remote_filename, FTP_ASCII)) {
    fclose($temp_local_file); // flush the download to disk before reading it back

    $temp_local_stream = file_get_contents($conf['absolute_installation_path'] . 'data/temp/ftp_temp.csv');
    $local_stream      = file_get_contents($conf['absolute_installation_path'] . $local_filename);

    $temp_hash  = md5($temp_local_stream);
    $local_hash = md5($local_stream);

    if ($temp_hash !== $local_hash) {
        $remote_file_duplicate = FALSE;
    } else {
        $remote_file_duplicate = TRUE;
    }
}
You can use a hashing function like md5 and check whether the two generated hashes match.
For example:
$a = file_get_contents('a_local');
$b = file_get_contents('b_local');

$a_hash = md5($a);
$b_hash = md5($b);

if ($a_hash !== $b_hash) {
    echo "Files differ";
} else {
    echo "Files are the same";
}
The md5 hash is also useful for avoiding problems when the files contain unusual or binary data.
You could also compare the last modified time of each file. You'd upload the local file only if it is more recent than the remote one. See filemtime and ftp_mdtm. Both of those return a UNIX timestamp you can easily compare. This is faster than getting the file contents and calculating a hash.
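A minimal sketch of that timestamp check, assuming an open FTP $connection and that $local_path and $remote_filename (hypothetical names) hold the full local path and the remote path:

$local_mtime  = filemtime($local_path);                   // UNIX timestamp of the local file
$remote_mtime = ftp_mdtm($connection, $remote_filename);  // -1 if the file is missing or MDTM is unsupported

if ($remote_mtime === -1 || $local_mtime > $remote_mtime) {
    // remote copy is missing, or older than the local one: upload it
    ftp_put($connection, $remote_filename, $local_path, FTP_ASCII);
}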
Is it possible to list all the files on a remote server? I am running this code on serverOne.com, which is a PHP server. I want to access serverTwo.com/dirOne, which is a Tomcat server.
$path = "http://www.serverTwo.com/dirOne";
if ($handle = opendir($widget_path)) {
while (false !== ($widgetfile = readdir($handle)))
{
if ($widgetfile != "." && $widgetfile != "..")
{
echo $widgetfile;
}
}
closedir($handle);
}
The short answer is No. Definitely not like that because that would be a serious security problem, don't you think?!
There are ways you could establish a link between two servers but just allowing pretty much anyone to list files and read files off of one server from another would be quite bad.
If the other server has directory browsing enabled and there's no default document in the directory, then you generally get back an HTML page containing a file listing. But if directory browsing is disabled and/or the directory has a default document, then no, you can't. Not directly.
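If directory browsing does happen to be enabled on the other server, a rough sketch of scraping that listing page could look like this (the URL is a placeholder):

// Hypothetical sketch: a directory listing is just an HTML page of links.
$html = file_get_contents('http://www.serverTwo.com/dirOne/');
if ($html !== false) {
    $dom = new DOMDocument();
    @$dom->loadHTML($html);  // listing markup is often sloppy; suppress parser warnings
    foreach ($dom->getElementsByTagName('a') as $link) {
        echo $link->getAttribute('href') . "\n";
    }
}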
You can probably connect to your second server using FTP instead.
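If FTP access to the second server is available, listing a directory could look roughly like this (host, credentials and path are placeholders):

$conn_id = ftp_connect('serverTwo.com');
if ($conn_id && ftp_login($conn_id, 'username', 'password')) {
    ftp_pasv($conn_id, true);                  // passive mode is usually friendlier to firewalls
    $files = ftp_nlist($conn_id, '/dirOne');   // plain list of file names, or false on failure
    if ($files !== false) {
        foreach ($files as $file) {
            echo $file . "\n";
        }
    }
    ftp_close($conn_id);
}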