I am running PHP 5 on IIS 7 on Windows Server 2008 R2. The code below writes a string, received as a POST request parameter, into an XML file.
<?php
$temp = "";
if ($_SERVER['REQUEST_METHOD'] == "POST") {
    if ($_POST["operation"] == "saveLevels") {
        $fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\levels.xml", 'w');
        fwrite($fileHandle, stripslashes($_POST["xmlString"]));
        fclose($fileHandle);
        $temp = "success";
    } elseif ($_POST["operation"] == "saveRules") {
        $fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\rules.xml", 'w');
        fwrite($fileHandle, stripslashes($_POST["xmlString"]));
        fclose($fileHandle);
        $temp = "success";
    }
}
When I make a POST request to invoke this code, the application pool hosting the site stops (due to a fatal error, according to the Event Viewer), and IIS keeps responding with HTTP 503 after that. I have given the proper permissions to IUSR and IIS_IUSRS on the test/xml directory. The two XML files do not already exist; I have also tried the code with an XML file already present, but it behaved the same way.
What's wrong with that PHP code? I have tried it on a Linux box and it behaved as expected.
Edit: I have tried several variations of this code and came up with this result: when invoked by a POST request, the fopen call always returns FALSE (or sometimes NULL) and causes its own application pool to stop. The exact same code works fine with a GET request with the exact same parameters. I don't know what the problem is, but for now I'm just going to use GET requests for this operation.
Can you var_dump($fileHandle) for both request methods, to show us what it contains? I notice you're just assuming the file was opened, rather than checking the return value (if it's FALSE, the fwrite will fail).
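For example, something along these lines (a minimal sketch using the path from the question):
<?php
$fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\levels.xml", 'w');
var_dump($fileHandle); // resource(...) on success, bool(false) on failure
if ($fileHandle === false) {
    // Surface the failure instead of writing to an invalid handle.
    die('fopen failed - check the path and the application pool identity permissions');
}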
Hello, I am relatively new to PHP and I was trying to replace a row in a CSV file. I didn't find an optimal solution, so I concocted a workaround script that suits my needs for the time being, until I gain a better understanding of PHP.
I tested it on my localhost using XAMPP and everything was working fine; it replaced the row as intended. But when I uploaded the files to my cPanel host, it stopped replacing and instead just writes the row on a new line.
This is my code:
$fileName = 'Usecase.csv'; // the CSV file
$tempName = 'temp.csv';
$inFile = fopen($fileName, 'r');
$outFile = fopen($tempName, 'w');
while (($line = fgetcsv($inFile)) !== FALSE)
{
    // If the first field matches $fin, the row already exists, so replace
    // it with the new values in $tempstr10 before writing it back out.
    if ($line[0] == $fin)
    {
        $line = explode(",", $tempstr10);
        $asd = $asd + 1; // $asd is defined and set to 0 near the top of the script
    }
    fputcsv($outFile, $line);
}
fclose($inFile);
fclose($outFile);
unlink($fileName);
rename($tempName, $fileName);
// If $asd is still 0, the row wasn't found above (so nothing was replaced);
// append the row here instead, to avoid writing the same string twice.
if ($asd == 0 && filesize("Usecase.csv") > 0)
{ file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX); }
if ($asd == 0 && filesize("Usecase.csv") == 0)
{ file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX); }
As I mentioned above, it's working on localhost but not on cPanel. Can someone point out whether something is wrong with the code, or whether it's something else?
Thank you.
The most likely problem is that your local version of PHP or your local configuration of PHP is different from what is on the server.
For example, fopen can be disabled on some shared servers (via the disable_functions directive).
You can check this by creating a PHP file with the following contents:
<?php phpinfo();
Then visit that PHP file in your browser. Do this for both your local dev environment and your cPanel server to compare the configuration to identify the differences that may be contributing to the differing behavior.
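As a quick sanity check, you can also inspect the relevant directives from a script. A small sketch (assuming the host restricts things via disable_functions, as most shared hosts do):
<?php
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
var_dump(in_array('fopen', $disabled, true)); // true if fopen is disabled
var_dump(ini_get('allow_url_fopen'));         // matters when opening URLs
var_dump(ini_get('open_basedir'));            // path restrictions can also make fopen fail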
You should also check the error logs. They can be found in multiple different places depending on how your hosting provider has things configured. If you can't find them, you'll need to ask your hosting provider to know for sure where the error logs are.
Typical locations are:
The "Errors" icon in cPanel
A file named "error_log" in one of the folders of your site. Via ssh or the Terminal icon in cPanel you can use this command to find those files: find $PWD -name error_log
If your server is configured to use PHP-FPM, the php error log is located at ~/logs/yourdomain_tld.php.error.log
You should also consider turning on error reporting for the script by putting this at the very top. Please note that this should only be used temporarily while you are actively debugging the application. Leaving this kind of debugging output on could expose details about your application that may invite additional security risks.
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
... Your code here ...
I have PHP code that connects to an FTP server, reads a file's content, makes a minor change, and then overwrites the original file.
It looks like:
$stream_context = stream_context_create(array('ftp' => array('overwrite' => true)));
$file_content = file_get_contents($ftp_file); // this line works
$file_content = str_replace('some content', 'another content', $file_content); // this also..
file_put_contents($ftp_file, $file_content, 0, $stream_context); // this one doesn't :/
The real issue is that the file_put_contents call worked for a long time, but now it doesn't.
What it does now is weird: it deletes the original file from the server..
Also, if I change it to something like:
file_put_contents($new_ftp_file, $file_content);
from what I know, it should create the new file and put the content in it, but it doesn't create it at all.
The hosting service I'm using changed PHP versions a few days ago. I don't remember what the previous version was, but the current one is 5.2.17.
Thanks! :)
EDIT: some changes
I found this piece: http://www.php.net/manual/en/function.file-put-contents.php#86864
It does the same as file_put_contents but with fopen, fwrite and fclose (I mention that example because of its return values): the function returns false if it couldn't fopen the file, or the number of bytes written if it succeeded. I got false :/
Which means it couldn't even do the:
@fopen($filename, 'w');
although file_get_contents with the same file address works.
Reading works (and if you take the same $filename and use it yourself in an FTP client, it works):
@fopen($filename, 'r');
The open_basedir setting on my hosting (which performs the action) is Off, but on the target hosting (which has the target file) it is On.
I had an idea to save the new content into a new file, so I tried something like:
$f = @fopen($new_ftp_file, 'w'); // this one seems to work and connect
fwrite($f, $file_content); // this one also seems to work, returning the number of bytes written
fclose($f);
The problem is that none of this really works. I logged in to the FTP server using the same credentials my script uses, and I couldn't find the new file; it wasn't created at all. (As a reminder, $new_ftp_file is a path to a file that doesn't exist, so the 'w' mode on fopen should create it.)
file_put_contents does overwrite the file by default if it already exists. You have to tell it not to.
Look at here:
http://www.php.net/manual/en/function.file-put-contents.php
See: If filename does not exist, the file is created. Otherwise, the existing file is overwritten, unless the FILE_APPEND flag is set.
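To make the two behaviours concrete (a minimal sketch; the file name is an example and $content stands in for your data):
<?php
// Default: the existing file is truncated and replaced.
file_put_contents('example.txt', $content);

// With FILE_APPEND: the existing file is kept and the data is appended.
file_put_contents('example.txt', $content, FILE_APPEND);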
This is definitely not the best solution, but I managed to figure something out..
I used 2 different hosts on the same hosting service, both of them having open_basedir = On and safe_mode = Off.
It looks something like:
$ftpstring = "ftp://user:password@anotherhosteddomain.com";
$file = file_get_contents($ftpstring . 'index.html'); // this line works as expected (yeah, the other hosting has this file)
and then, if you're trying to write something like:
$handler = fopen($ftpstring.'index.html', "w");
it doesn't work; it tells you it cannot access an existing file in write mode.
so if you're doing something like:
$newfile_handler = fopen($ftpstring.'index_new_version.html', "w");
fwrite($newfile_handler, "1122");
so yeah - it works!
But now comes a tricky issue.. when I add this line:
fclose($newfile_handler);
the new file is deleted from the hosting!!
I couldn't find any reason why "fclose" would delete the file after it was created by "fopen" and written to by "fwrite".
So if you don't add the "fclose" line, it works, but the connection isn't closed, and I also have to actually delete the existing file before I can overwrite it with new content, which makes it silly..
Although it works, I would really like someone to give me a better solution than mine.
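One possible alternative (a sketch, untested against this particular host) is to skip the ftp:// stream wrapper entirely and use PHP's FTP extension, which reports errors explicitly at each step. The host name, credentials, and file name below are placeholders:
<?php
$conn = ftp_connect('anotherhosteddomain.com');
if ($conn === false || !ftp_login($conn, 'user', 'password')) {
    die('FTP connection or login failed');
}
// Write the new content to a local temp file, then upload it,
// overwriting the remote file in one step.
$tmp = tempnam(sys_get_temp_dir(), 'ftp');
file_put_contents($tmp, $file_content);
if (!ftp_put($conn, 'index.html', $tmp, FTP_BINARY)) {
    echo 'Upload failed';
}
ftp_close($conn);
unlink($tmp);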
I want to store some data retrieved using an API on my server. Specifically, these are .mp3 files of (free) learning tracks. I'm running into a problem, though. The mp3 link returned from the request isn't a direct link to an .mp3 file; rather, it makes an ADDITIONAL API call which would normally prompt you to download the mp3 file.
file_put_contents doesn't seem to like that: the resulting mp3 file is empty.
Here's the code:
$id = $_POST['cid'];
$title = $_POST['title'];
if (!file_exists("tags/".$id."_".$title))
{
    mkdir("tags/".$id."_".$title);
}
else
{
    echo "Dir already exists";
}
file_put_contents("tags/{$id}_{$title}/all.mp3", fopen($_POST['all'], 'r'));
And here is an example of the second API I mentioned earlier:
http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts
Is there some way to bypass this intermediate step? If there's no way to access the direct URL of the mp3, is there a way to redirect the file download prompt to my server?
Thank you in advance for your help!
EDIT
Here is the current snippet. I should be echoing something, correct?
$handle = fopen("http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts", 'rb');
$contents = stream_get_contents($handle);
echo $contents;
Because this echoes nothing.
SOLUTION
Ok, I guess file_get_contents is supposed to handle redirects just fine, but this wasn't happening. So I found this function: https://stackoverflow.com/a/4102293/2723783 to return the final redirect URL of the API. I plugged that URL into file_get_contents and voilà!
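For reference, the same idea can be sketched with cURL (this is an approximation of the linked answer, not its exact code; $id and $title come from the earlier snippet):
<?php
// Resolve the final URL behind the redirect without downloading the body.
$ch = curl_init('http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the redirect chain
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo anything
curl_setopt($ch, CURLOPT_NOBODY, true);         // a HEAD request is enough
curl_exec($ch);
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
curl_close($ch);

// Then fetch the actual mp3 from the resolved URL.
file_put_contents("tags/{$id}_{$title}/all.mp3", file_get_contents($finalUrl));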
You seem to be just opening the file handle and not getting the contents with fread() or a similar function:
http://www.php.net/manual/en/function.fread.php
$handle = fopen($_POST['all'], 'rb');
file_put_contents("tags/{$id}_{$title}/all.mp3", stream_get_contents($handle));
So, I'm writing a chunked file transfer script that is intended to copy files--small and large--to a remote server. It almost works fantastically (and did with a 26 byte file I tested, haha) but when I start to do larger files, I notice it isn't quite working. For example, I uploaded a 96,489,231 byte file, but the final file was 95,504,152 bytes. I tested it with a 928,670,754 byte file, and the copied file only had 927,902,792 bytes.
Has anyone else ever experienced this? I'm guessing feof() may be doing something wonky, but I have no idea how to replace it, or test that. I commented the code, for your convenience. :)
<?php
// FTP credentials
$server = CENSORED;
$username = CENSORED;
$password = CENSORED;

// Destination file (where the copied file should go)
$destination = "ftp://$username:$password@$server/ftp/final.mp4";

// The file on my server that we're copying (in chunks) to $destination.
$read = 'grr.mp4';

// If the file we're trying to copy exists...
if (file_exists($read))
{
    // Set a chunk size
    $chunk_size = 4194304;

    // For reading through the file we want to copy to the FTP server.
    $read_handle = fopen($read, 'rb');

    // For appending to the destination file.
    $destination_handle = fopen($destination, 'ab');

    echo '<span style="font-size:20px;">';
    echo 'Uploading.....';

    // Loop through $read until we reach the end of the file.
    while (!feof($read_handle))
    {
        // So Rackspace doesn't think nothing's happening.
        echo PHP_EOL;
        flush();

        // Read a chunk of the file we're copying.
        $chunk = fread($read_handle, $chunk_size);

        // Write the chunk to the destination file.
        fwrite($destination_handle, $chunk);

        // Sleeping after each write turned out to be the fix (see EDIT 5).
        sleep(1);
    }

    echo 'Done!';
    echo '</span>';

    fclose($read_handle);
    fclose($destination_handle);
}
?>
EDIT
I (may have) confirmed that the script is dying at the end somehow, and not corrupting the files. I created a simple file with each line corresponding to the line number, up to 10000, then ran my script. It stopped at line 6253. However, the script is still returning "Done!" at the end, so I can't imagine it's a timeout issue. Strange!
EDIT 2
I have confirmed that the problem exists somewhere in fwrite(). By echoing $chunk inside the loop, the complete file is returned without fail. However, the written file still does not match.
EDIT 3
It appears to work if I add sleep(1) immediately after the fwrite(). However, that makes the script take a million years to run. Is it possible that PHP's append has some inherent flaw?
EDIT 4
Alright, I've isolated the problem further: it is somehow an FTP problem. When I run this file copy locally, it works fine. However, when I copy to the ftp:// destination, bytes go missing. This occurs despite the binary flags in both calls to fopen(). What could possibly be causing this?
EDIT 5
I found a fix. The modified code is above; I'll post an answer of my own as soon as I'm able.
I found a fix, though I'm not sure exactly why it works: simply sleeping after writing each chunk fixes the problem. I upped the chunk size quite a bit to speed things up. Though this is arguably a bad solution, it should work for my uses. Thanks anyway, guys!
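For what it's worth, short writes would also explain the symptoms: fwrite() can write fewer bytes than it was given, especially over a network stream, and the code above never checks its return value. A defensive write loop (a sketch of that idea, not the fix the asker used) looks like this:
<?php
// Keep writing until the whole chunk is on the wire, or fail loudly.
function fwrite_all($handle, $data)
{
    $total = strlen($data);
    $written = 0;
    while ($written < $total) {
        $n = fwrite($handle, substr($data, $written));
        if ($n === false || $n === 0) {
            return false; // a hard error or a stalled stream
        }
        $written += $n;
    }
    return $written;
}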
I'm having the following problem with my VPS server.
I have a long-running PHP script that sends big files to the browser. It does something like this:
<?php
header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>
This basically reads the file from the server's file system and sends it to the browser. I can't just use direct links (and let Apache serve the file) because there is business logic in the application that needs to be applied.
The problem is that while such download is running, the site doesn't respond to other requests.
The problem you are experiencing is related to the fact that you are using sessions. When a script has a running session, it locks the session file to prevent concurrent writes which may corrupt the session data. This means that multiple requests from the same client - using the same session ID - will not be executed concurrently, they will be queued and can only execute one at a time.
Multiple users will not experience this issue, as they will use different session IDs. This does not mean that you don't have a problem, because you may conceivably want to access the site whilst a file is downloading, or set multiple files downloading at once.
The solution is actually very simple: call session_write_close() before you start to output the file. This will close the session file, release the lock and allow further concurrent requests to execute.
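Applied to the script above, that looks like this (a sketch; it assumes your bootstrap code calls session_start() as usual):
<?php
session_start();
// ... business logic that needs the session ...
session_write_close(); // release the session lock before the long transfer
header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>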
Your server setup is probably not the only place you should be checking.
Try doing a request from your browser as usual and then do another from some other client.
Either wget from the same machine or another browser on a different machine.
In what way doesn't the server respond to other requests? Is it "Waiting for example.com..." or does it give an error of any kind?
I do something similar, but I serve the file chunked, which gives the file system a break while the client accepts and downloads a chunk. That is better than offering up the entire thing at once, which is pretty demanding on the file system and the entire server.
EDIT: While this is not the answer to this question, the asker also asked about reading a file chunked. Here's the function that I use. Supply it the full path to the file.
function readfile_chunked($file_path, $retbytes = true)
{
$buffer = '';
$cnt = 0;
$chunksize = 1 * (1024 * 1024); // 1 = 1MB chunk size
$handle = fopen($file_path, 'rb');
if ($handle === false) {
return false;
}
while (!feof($handle)) {
$buffer = fread($handle, $chunksize);
echo $buffer;
ob_flush();
flush();
if ($retbytes) {
$cnt += strlen($buffer);
}
}
$status = fclose($handle);
if ($retbytes && $status) {
return $cnt; // return num. bytes delivered like readfile() does.
}
return $status;
}
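Usage is essentially a drop-in replacement for readfile(); the path below is a placeholder:
header("Content-type: application/octet-stream");
$bytes = readfile_chunked('/path/to/really-big-file.zip');
if ($bytes === false) {
    // the file could not be opened
}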
I have tried different approaches (reading and sending the files in small chunks [see the comments on readfile in the PHP docs], using PEAR's HTTP_Download), but I always ran into performance problems when the files got big.
There is an Apache module, mod_xsendfile, with which you can do your business logic and then delegate the download to Apache. The download will not be publicly available. I think this is the most elegant solution for the problem.
More Info:
http://tn123.org/mod_xsendfile/
http://www.brighterlamp.com/2010/10/send-files-faster-better-with-php-mod_xsendfile/
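A minimal sketch of the PHP side (this assumes mod_xsendfile is installed with XSendFile On for the vhost or directory; the path and file name are placeholders):
<?php
// Run your business logic (auth checks etc.) first, then let Apache
// stream the file; PHP exits immediately instead of pumping bytes itself.
header('X-Sendfile: /var/www/private/really-big-file.zip');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="really-big-file.zip"');
exit();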
The same happens to me, and I'm not using sessions.
session.auto_start is set to 0.
My example script only runs sleep(5), and adding session_write_close() at the beginning doesn't solve the problem.
Check your httpd.conf file. Maybe you have "KeepAlive On", and that is why your second request hangs until the first one completes. In general, your PHP script should not make visitors wait a long time. If you need to download something big, do it in a separate internal request that the user has no direct control over. Until it's done, return some "executing" status to the end user, and when it's done, process the actual results.