PHP FTP code fails, even though file is downloaded successfully - php

I'm using the following code to download a file via FTP in PHP:
$fp = fopen($local_file, 'w+');
$conn_id = ftp_connect($host);
$login_result = ftp_login($conn_id, $user, $pass);
$ret = ftp_nb_fget($conn_id, $fp, $remote_file, FTP_BINARY);
while ($ret == FTP_MOREDATA) {
$ret = ftp_nb_continue($conn_id);
}
if ($ret != FTP_FINISHED) {
echo "<span style='color:red;'><b>There was an error downloading the file!</b></span><br>";
logThis("log.txt", date('h:i:sa'), "ERROR DOWNLOADING FILE!");
exit();
}
fclose($fp);
<<php code continues below this....>>
This code seems to be working fine. The file is downloaded, and the MD5 hash of the file matches the hash of the file on the other server before it was downloaded. So the download does complete.
Yet, even with the file downloading successfully, that code still hits the block inside the if ($ret != FTP_FINISHED) condition.
If the file downloads fine, why is FTP_FINISHED not true?
EDIT
When I check the value of $ret after the while loop, $ret is 1 on the runs where the script completes fine, and 0 on the runs where it fails.
However, there are times when $ret is 0 and the script reports failure even though the file was actually downloaded properly, which I can confirm with an MD5 comparison.
Also, 0 and 1 are not values these functions should return. The official PHP documentation gives three possible return values: FTP_FAILED, FTP_FINISHED, or FTP_MOREDATA.
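One thing worth noting (based on the PHP manual, not anything specific to this question's code): the FTP_* status codes are plain integer constants, so a raw var_dump() of $ret prints an integer rather than a constant name. A minimal check:

```php
// The FTP_* status codes are integer constants in PHP's FTP extension,
// so var_dump($ret) shows an integer, not a name like FTP_FINISHED.
var_dump(FTP_FAILED, FTP_FINISHED, FTP_MOREDATA);

// Compare against the constants rather than literal numbers:
if ($ret === FTP_FINISHED) {
    echo "download reported finished";
}
```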

I have thought of one solution. Since the file does get downloaded correctly, as determined by an MD5 check from the original source (which we do have), I could modify the code this way:
if ($ret != FTP_FINISHED) {
$localMD5 = md5_file($local_file);
if($localMD5 != $remoteMD5){
echo "<span style='color:red;'><b>There was an error downloading the file!</b></span><br>";
logThis("log.txt", date('h:i:sa'), "ERROR DOWNLOADING FILE!");
exit();
}
}
In most cases the script completes as expected, so this block of code never runs. In the cases where the error above does occur, this code verifies the MD5 hash of the downloaded file and only runs the error handling if it doesn't match the MD5 of the original source file. If the MD5s match, the download was successful anyway, so the error code shouldn't run.

Edited:
My first solution wasn't correct. After checking your comments below, I must say your code is correct, and the problem probably lies in the upload_max_filesize and post_max_size values.
See here "Upload large files to FTP with PHP" and mainly here: "Changing upload_max_filesize on PHP"
So, the proposed solution is to add this to the .htaccess file:
php_value upload_max_filesize 2G
php_value post_max_size 2G
or, if the server is yours (dedicated), set them in php.ini (you'll need to restart the server so the changes take effect).
You may also find the post_max_size info on php.net useful. I found this part particularly interesting:
If the size of post data is greater than post_max_size, the $_POST and
$_FILES superglobals are empty. This can be tracked in various ways,
e.g. by passing the $_GET variable to the script processing the data,
i.e. , and then checking if
$_GET['processed'] is set.
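The pattern that quote describes can be sketched roughly like this (the inline form example was lost from the quote, so the names and markup below are illustrative, not the manual's exact example):

```php
<!-- Append a GET flag to the form action so it survives even when the
     POST body is dropped for exceeding post_max_size. -->
<form action="process.php?processed=1" method="POST" enctype="multipart/form-data">
  <input type="file" name="userfile">
  <input type="submit">
</form>

<?php
// process.php: if the POST body exceeded post_max_size, $_POST and
// $_FILES arrive empty, but the GET flag is still set.
if (isset($_GET['processed']) && empty($_POST) && empty($_FILES)) {
    echo 'The upload probably exceeded post_max_size.';
}
?>
```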

Related

PHP filesize() showing old filesize with a file inside a windows shared (network) folder

I have the following script that runs to read new content from a file:
<?php
clearstatcache();
$fileURL = "\\\\saturn\extern\seq_ws.csv";
$fileAvailable = file_exists($fileURL);
$bytesRead = file_get_contents("bytes.txt");
if($fileAvailable){
$fileSize = filesize($fileURL);
//Statuses: 1 = partial read, 2 = complete read, 0 = no read, -1 = file not found; each followed by !!
if($bytesRead < $fileSize){
//$bytesRead till $fileSize bytes read from file.
$content = file_get_contents($fileURL, NULL, NULL, $bytesRead);
file_put_contents("bytes.txt", ((int)$bytesRead + strlen($content)));
echo "1!!$content";
}else if($bytesRead > $fileSize){
//File edit or delete detected, whole file read again.
$content = file_get_contents($fileURL);
file_put_contents("bytes.txt", strlen($content));
echo "2!!$content";
}else if($bytesRead == $fileSize){
//No new data found, no action taken.
echo "0!!";
}
}else{
//File delete detected, reading whole file when available
echo "-1!!";
file_put_contents("bytes.txt", "0");
}
?>
It works perfectly when I run it and does what is expected.
When I edit the file from the same PC as my server, it works instantly and returns the correct values.
However, when I edit the file from another PC, my script takes about 4-6 seconds to read the correct filesize.
I added clearstatcache(); at the top of my script because I thought it was a caching issue. But the strange thing is that when I change the file from the server PC it responds instantly, while from another PC it doesn't.
On top of that, as soon as the other PC changes the file, I can see the change in Windows (both filesize and content), but for some reason it takes Apache about 4-6 seconds to detect it. During those 4-6 seconds it still reports the old filesize from before the change.
So I have the following questions:
Is the filesize information cached anywhere, maybe either in the Apache server or inside Windows?
If question 1 applies, is there any way to remove or disable this caching?
Is it possible this isn't a caching problem at all?
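For what it's worth, clearstatcache() can also be told to clear the cache for one specific file, including the realpath cache; a minimal sketch using the question's own path:

```php
// Clear both the stat cache and the realpath cache for this one file
// before asking for its size (path taken from the question).
$fileURL = "\\\\saturn\\extern\\seq_ws.csv";
clearstatcache(true, $fileURL); // true also clears the realpath cache
$fileSize = filesize($fileURL);
```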
I think PHP on your local PC has development settings, so I suggest checking php.ini for this parameter: realpath_cache_ttl
Which is:
realpath_cache_ttl integer
Duration of time (in seconds) for which to cache realpath information for a given file or directory. For systems with rarely changing files, consider increasing the value.
To test it, run phpinfo() both locally and on the server and check that value:
<?php phpinfo();

PHP: lock files when writing

I am testing my code using a small database stored in text files. The biggest problem I have found is when users write to one file at the same time. To solve this I am using flock.
My computer runs Windows with XAMPP installed (I mention this because I understand flock works reliably on Linux, not Windows). However, I need to run this test on a Linux server.
I have tested my code by loading the same script in 20 browser windows at the same time. The first results work fine, but after the test the database file turns up empty.
My Code :
$file_db=file("test.db");
$fd=fopen($db_name,"w");
if (flock($fd, LOCK_EX))
{
ftruncate($fd,0);
for ($i=0;$i<sizeof($file_db);$i++)
{
fputs($fd,"$file_db[$i]"."\n");
}
fflush($fd);
flock($fd, LOCK_UN);
fclose($fd);
}
else
{
print "Db Busy";
}
How is it possible that the script deletes the database file's content? And what is the proper way forward: keep flock and fix the existing code, or use some alternative technique?
I have rewritten the script using #lolka_bolka's answer and it works. So, in answer to your question: the file $db_name could be empty if the file test.db is empty.
Also, calling ftruncate right after fopen with "w" is useless, since the "w" mode already truncates the file.
file function
Returns the file in an array. Each element of the array corresponds to a line in the file, with the newline still attached. Upon failure, file() returns FALSE.
Since the newline is still attached, you do not have to add an additional end-of-line character.
flock function
PHP supports a portable way of locking complete files in an advisory way (which means all accessing programs have to use the same way of locking or it will not work).
This means the file() call is not affected by the lock: $file_db=file("test.db"); can read the file while another process is somewhere between ftruncate($fd,0); and fflush($fd);. So you need to read the file's contents inside the lock.
$db_name = "file.db";
$fd = fopen($db_name, "r+"); // w changed to r+ for getting file resource but not truncate it
if (flock($fd, LOCK_EX))
{
$file_db = file($db_name); // read file contents while lock obtained
ftruncate($fd, 0);
for ($i = 0; $i < sizeof($file_db); $i++)
{
fputs($fd, "$file_db[$i]");
}
fflush($fd);
flock($fd, LOCK_UN);
}
else
{
print "Db Busy";
}
fclose($fd); // fclose should be called anyway
P.S. You can test this script from the console:
$ for i in {1..20}; do php 'file.php' >> file.log 2>&1 & done

PHP - Chunked file copy (via FTP) has missing bytes?

So, I'm writing a chunked file transfer script that is intended to copy files--small and large--to a remote server. It almost works fantastically (and did with a 26 byte file I tested, haha) but when I start to do larger files, I notice it isn't quite working. For example, I uploaded a 96,489,231 byte file, but the final file was 95,504,152 bytes. I tested it with a 928,670,754 byte file, and the copied file only had 927,902,792 bytes.
Has anyone else ever experienced this? I'm guessing feof() may be doing something wonky, but I have no idea how to replace it, or how to test that. I've commented the code for your convenience. :)
<?php
// FTP credentials
$server = CENSORED;
$username = CENSORED;
$password = CENSORED;
// Destination file (where the copied file should go)
$destination = "ftp://$username:$password@$server/ftp/final.mp4";
// The file on my server that we're copying (in chunks) to $destination.
$read = 'grr.mp4';
// If the file we're trying to copy exists...
if (file_exists($read))
{
// Set a chunk size
$chunk_size = 4194304;
// For reading through the file we want to copy to the FTP server.
$read_handle = fopen($read, 'rb');
// For appending to the destination file.
$destination_handle = fopen($destination, 'ab');
echo '<span style="font-size:20px;">';
echo 'Uploading.....';
// Loop through $read until we reach the end of the file.
while (!feof($read_handle))
{
// So Rackspace doesn't think nothing's happening.
echo PHP_EOL;
flush();
// Read a chunk of the file we're copying.
$chunk = fread($read_handle, $chunk_size);
// Write the chunk to the destination file.
fwrite($destination_handle, $chunk);
sleep(1);
}
echo 'Done!';
echo '</span>';
// Close the handles inside the if, so they are only closed when opened.
fclose($read_handle);
fclose($destination_handle);
}
?>
EDIT
I (may have) confirmed that the script is dying at the end somehow, and not corrupting the files. I created a simple file with each line corresponding to the line number, up to 10000, then ran my script. It stopped at line 6253. However, the script is still returning "Done!" at the end, so I can't imagine it's a timeout issue. Strange!
EDIT 2
I have confirmed that the problem exists somewhere in fwrite(). By echoing $chunk inside the loop, the complete file is returned without fail. However, the written file still does not match.
EDIT 3
It appears to work if I add sleep(1) immediately after the fwrite(). However, that makes the script take a million years to run. Is it possible that PHP's append has some inherent flaw?
EDIT 4
Alright, I've further isolated the problem to being an FTP problem, somehow. When I run this file copy locally, it works fine. However, when I use the FTP stream wrapper (line 9), the bytes go missing. This occurs despite the binary flags in the two calls to fopen(). What could possibly be causing this?
EDIT 5
I found a fix. The modified code is above--I'll post an answer on my own as soon as I'm able.
I found a fix, though I'm not sure exactly why it works: simply sleeping after writing each chunk fixes the problem. I upped the chunk size quite a bit to speed things up. Though this is arguably a bad solution, it should work for my purposes. Thanks anyway, guys!
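For anyone hitting the same issue, an alternative worth trying (not the poster's fix, and untested against this exact setup) is to skip the ftp:// stream wrapper entirely and upload with the FTP extension, which gives explicit control over transfer mode and passive mode. Credentials and paths below are placeholders:

```php
// Sketch: upload with the FTP extension instead of the ftp:// wrapper.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true); // passive mode often behaves better behind NAT
$ok = ftp_put($conn, '/ftp/final.mp4', 'grr.mp4', FTP_BINARY);
ftp_close($conn);
echo $ok ? 'Done!' : 'Upload failed';
```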

Weird PHP file upload issue

I am having a strange issue with file uploads on my Windows system. I am using Windows 7 with IIS7 on the server. I am testing from a client machine with local IP 10.47.47.13; the server is 10.47.47.1.
I have a very simple form which I couldn't make work in some cases. The page sits in wwwroot (http://10.47.47.1/3.php).
3.php
<?php
$source_file=$_FILES["newsimg"]["tmp_name"];
$destination_file="123.jpg";
$ftp_server="localhost";
$ftp_username="admin";
$ftp_password="apple";
if ($source_file!="") {
$mrph_connect = ftp_connect($ftp_server,21);
$mrph_login= ftp_login($mrph_connect, $ftp_username, $ftp_password);
if (($mrph_connect) && ($mrph_login)) {
$upload = ftp_put($mrph_connect, $destination_file, $source_file, FTP_BINARY);
if ($upload) echo "ok"; else echo "nok";
}
}
?>
<body>
<form enctype="multipart/form-data" action="3.php" method="POST">
<input type=file name=newsimg>
<input type=submit name=mrph>
</form>
</body>
The form posts to itself to upload the file. When I select a file of 1 or 2 KB it works, but when I select a file of even 10-15 KB the page times out after a while. I checked the php.ini settings: file uploads are on, and I set the temp folder to c:\uploads just to test. It works for files of 1 or 2 KB but not when I select a file of 10 or 20 KB. I even removed the PHP code (commented everything out) to see whether it works when nothing is done, but it didn't.
Any help would be appreciated.
To me, the problem seems to be on the server you are uploading to. There is nothing wrong with your upload code, because you are able to upload smaller files; since uploads of 20 KB fail, check that the right upload settings are specified on the destination server. Uploading via FTP to a different server/location is itself a slow process, though. Your code also seems to be right.
My guess is that your ftp_put is timing out, try setting your FTP timeout threshold below PHP's default (30 seconds):
$mrph_connect = ftp_connect($ftp_server,21);
ftp_set_option($mrph_connect, FTP_TIMEOUT_SEC, 20);
$mrph_login= ftp_login($mrph_connect, $ftp_username, $ftp_password);
if (($mrph_connect) && ($mrph_login)) {
$upload = ftp_put($mrph_connect, $destination_file, $source_file, FTP_BINARY);
if ($upload) echo "ok"; else echo "nok";
}
If making that adjustment causes your script to return 'nok' then you'll know the put is taking too long.
If the put is your problem, you can try a non-blocking put with ftp_nb_put to FTP the file asynchronously:
$mrph_connect = ftp_connect($ftp_server,21);
$mrph_login= ftp_login($mrph_connect, $ftp_username, $ftp_password);
if (($mrph_connect) && ($mrph_login)) {
$ret = ftp_nb_put($mrph_connect, $destination_file, $source_file, FTP_BINARY);
while ($ret == FTP_MOREDATA) {
$ret = ftp_nb_continue($mrph_connect);
}
if ($ret == FTP_FINISHED) echo "ok"; else echo "nok";
}
I think Cryo is onto something: could it be that the php.ini file isn't correctly configured and the maximum filesize is too low?
This might not be it, but for the record your form should have a MAX_FILE_SIZE hidden input with the number of bytes corresponding to the maximum upload size.
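For reference, a MAX_FILE_SIZE field would look like this in the question's form (the 2 MB value is just an example; note this field is advisory and browsers may ignore it, so treat it as a hint rather than a security measure):

```html
<form enctype="multipart/form-data" action="3.php" method="POST">
  <!-- Must come before the file input; value is in bytes (2 MB here) -->
  <input type="hidden" name="MAX_FILE_SIZE" value="2097152">
  <input type="file" name="newsimg">
  <input type="submit" name="mrph">
</form>
```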
You might have a low filesize limit. To check this: create a new php file, called info.php or whatever and just write
<?php
phpinfo();
?>
Open that page in your browser and search for upload_max_filesize. If the value is only a few kilobytes, that's your problem. If so, you will have to modify your php.ini (under Apache you could use a directive in a .htaccess file as well, but I don't think there's anything like that for IIS). The location of this file can differ depending on your installation, but it's probably C:\Windows\php.ini. Find the upload_max_filesize directive and change it to something bigger. The default is 2 megabytes (2M), but you can make it whatever you need.

PHP fopen/fwrite problem on IIS7

I am running PHP5 on IIS7 on Windows Server 2008 R2. Check out the below code which writes a string received via a POST request parameter into an XML file.
<?php
$temp = "";
if($_SERVER['REQUEST_METHOD']=="POST"){
if($_POST["operation"]=="saveLevels"){
$fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\levels.xml", 'w');
fwrite($fileHandle, stripslashes($_POST["xmlString"]));
fclose($fileHandle);
$temp = "success";
}elseif($_POST["operation"]=="saveRules"){
$fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\rules.xml", 'w');
fwrite($fileHandle, stripslashes($_POST["xmlString"]));
fclose($fileHandle);
$temp = "success";
}
}
When I make a POST request to invoke this code, the application pool owning/hosting the site containing the PHP files stops (due to some fatal errors, according to the event viewer), and IIS keeps responding with HTTP 503 after that. I have given proper permissions to IUSR and IIS_IUSRS on that (test/xml) directory. Those two XML files don't already exist; I have also tried the code with an XML file already present, but it behaved the same way.
What's wrong with that php code? I have tried it on a linux-box and it behaved as expected.
edit: I have tried various versions of this code and found the following: when triggered by a POST request, the fopen call always returns FALSE (or sometimes NULL) and causes its application pool to stop. The exact same code works fine with a GET request with exactly the same parameters. So I don't know what the problem is, but for the time being I'm just going to use GET requests for this operation.
Can you var_dump($fileHandle) for both options, to show us what it contains? I notice you're just assuming the file is opened rather than checking the return value (if it's FALSE, the fwrite will fail).
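A sketch of that check against the question's own code (the logging and exit behaviour are my additions, not part of the original):

```php
$fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\levels.xml", 'w');
if ($fileHandle === false) {
    // fopen failed: log the last PHP error instead of writing blindly.
    error_log('fopen failed: ' . print_r(error_get_last(), true));
    exit('could not open file');
}
fwrite($fileHandle, stripslashes($_POST["xmlString"]));
fclose($fileHandle);
```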