Executing file via file_get_contents on remote host - php

I have a script and I don't fully understand why and how it works; one reason is that I have found contradictory information about file_get_contents.
I have three (internal) webservers, all set up the same way and running the same software.
I need to count the number of files in one specific folder on each server (in order to get the number of users logged into a certain application).
For the local server my file-counting PHP script is called via a simple include; for the two remote servers I use file_get_contents.
In both cases I refer to the same PHP file, and that works: I get the correct number of files for the folder on each server.
Sometimes you read that file_get_contents just returns the file's content but does not execute the file. In my case the file is executed and I get the correct number of files, so I'm a bit confused about why my scripts actually work.
My scripts were saved on one server. I want to be more flexible and be able to call the scripts from each server, so I created a new virtual directory on a network folder and moved the script files there; the virtual folder is set up the same way on each server. I had to change my script slightly to get the same result again: instead of a return $num I now have echo $num. If I use return I get no result; if I use echo the correct number of files is given. I would prefer to receive the result via return, but I don't know why this no longer works in the new context.
Script which shows the number of files:

function getUserNum($basis_url_server, $url_vaw_scripte, $script_number_users)
{
    $serverName = strtoupper($_SERVER['SERVER_NAME']);
    // local server
    if (strpos(strtoupper($basis_url_server), $serverName) !== false) {
        $numUsers = (include($script_number_users));
    }
    // remote server
    else {
        $path = $basis_url_server.$url_vaw_scripte.$script_number_users;
        $numUsers = file_get_contents($path);
        //include($path);
    }
    return $numUsers;
}

echo getUserNum($basis_url_server1, $url_vaw_scripte, $script_number_users)."($label_server1)";
echo getUserNum($basis_url_server2, $url_vaw_scripte, $script_number_users)."($label_server2)";
echo getUserNum($basis_url_server3, $url_vaw_scripte, $script_number_users)."($label_server3)";
Script for counting the files (referred to as $script_number_users above):

<?php
// the include only contains $pfadSessionRepository = "E:\Repository\Session";
include dirname(__DIR__).'/vaw_settings.php';
$fi = new FilesystemIterator($pfadSessionRepository, FilesystemIterator::SKIP_DOTS);
$number = (iterator_count($fi) - 1) / 2;
//return $number;
echo $number;
?>

file_get_contents() will execute a GET request if given a URL, and will read a file if given a filesystem path. It behaves like two different functions behind the same call.
You are actually building a primitive REST webservice rather than loading the files as you thought: the remote files are executed by the remote webserver, and you get the output you would see if you loaded them manually in a browser.

file_get_contents() will return the raw content of a local file. For remote files it returns whatever the webserver delivers: if the webserver executes the script in the file, you get the result of that script; if it doesn't execute the script (due to a misconfiguration, for example), you will still get the raw content of the remote script.
In your case I'd just remove the include path and fetch all three scripts over HTTP. It reduces the complexity, and the overhead of calling one of the three scripts via HTTP instead of loading it directly is negligible.
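That read-vs-execute distinction can be shown in a few lines. In this sketch (the temp file and its contents are illustrative), file_get_contents() on a filesystem path hands back the raw PHP source, while include executes it, just as the remote webserver does. It also shows why the counting script must echo its result: over HTTP the caller only ever sees the script's output, and a return value goes nowhere.

```php
<?php
// Write a tiny PHP script to a temp file.
$script = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($script, '<?php echo 2 + 3;');

// Filesystem path: the PHP source comes back verbatim, unexecuted.
$raw = file_get_contents($script);

// include executes the script; its echoed output can be captured,
// which is effectively what the webserver does for a remote URL.
ob_start();
include $script;
$executed = ob_get_clean();

unlink($script);
// $raw is '<?php echo 2 + 3;' while $executed is '5'
```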

Related

PHP replace a row in csv works fine on my localhost but does not replace the row when uploaded to cpanel?

Hello, I am relatively new to PHP and I was trying to replace a row in a CSV file. I didn't find an optimal solution, so I concocted a workaround script which suits my needs for the time being, until I gain a better understanding of PHP.
I tested it on my localhost using XAMPP and everything worked fine; it replaced the row as intended. But when I uploaded the files to my cPanel it stopped replacing and instead just goes the normal route and writes the row on a new line.
This is my code:
$fileName = 'Usecase.csv'; // this is the CSV file
$tempName = 'temp.csv';
$inFile  = fopen($fileName, 'r');
$outFile = fopen($tempName, 'w');

while (($line = fgetcsv($inFile)) !== FALSE)
{
    // if the first field matches $fin, replace the whole row with
    // the fields from $tempstr10, which is later written to the CSV
    if ($line[0] == "$fin")
    {
        $line = explode(",", "$tempstr10");
        $asd = $asd + 1; // counter assigned 0 in the topmost section; used below
    }
    fputcsv($outFile, $line);
}
fclose($inFile);
fclose($outFile);
unlink($fileName);
rename($tempName, $fileName);

// if $asd is still 0 the loop never matched, meaning the value wasn't
// in the file; append it (guarded so the same string isn't written twice)
if ($asd == 0 && filesize("Usecase.csv") > 0)
{ file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX); }
if ($asd == 0 && filesize("Usecase.csv") == 0)
{ file_put_contents("Usecase.csv", "$tempstr10\r\n", FILE_APPEND | LOCK_EX); }
As I mentioned above, it works on localhost but not on cPanel. Can someone point out whether something is wrong with the code, or whether it's something else?
Thank you
The most likely problem is that your local version or configuration of PHP differs from what is on the server.
For example, fopen is a feature that can be disabled on some shared servers (via the disable_functions directive).
You can check this by creating a PHP file with the following contents:
<?php phpinfo();
Then visit that PHP file in your browser. Do this for both your local dev environment and your cPanel server to compare the configuration to identify the differences that may be contributing to the differing behavior.
You should also check the error logs. They can be found in multiple different places depending on how your hosting provider has things configured. If you can't find them, you'll need to ask your hosting provider to know for sure where the error logs are.
Typical locations are:
The "Errors" icon in cPanel
A file named "error_log" in one of the folders of your site. Via ssh or the Terminal icon in cPanel you can use this command to find those files: find $PWD -name error_log
If your server is configured to use PHP-FPM, the php error log is located at ~/logs/yourdomain_tld.php.error.log
You should also consider turning on error reporting for the script by putting this at the very top. Please note that this should only be used temporarily while you are actively debugging the application. Leaving this kind of debugging output on could expose details about your application that may invite additional security risks.
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
... Your code here ...

ftp listing and download file in current date

I have a case:
I have a remote server that contains a great many generated transaction files (.txt) from 2015 until now. I must download them every day, close to real time. For now I use PHP to download them all, but I don't think that method is effective. First I list all the files, and then I read each file's metadata, such as the date modified; this is painful, makes my program run slowly, and takes a lot of time.
This is my code (I've used PHP Yii2):
public function actionDownloadfile()
{
    // this line takes a lot of time to execute
    $contents = Yii::$app->ftpFs->listContents('/backup', ['timestamp', 'path', 'basename']);
    var_dump($contents);
    foreach ($contents as $value) {
        if (date('Y-m-d', $value['timestamp']) == date('Y-m-d')) {
            echo "[".date('Y-m-d H:i:s')."] : Downloading file ".$value['basename']."\n";
            $isi = Yii::$app->ftpFs->read($value['path']);
            $dirOut = Yii::$app->params['out'];
            $fileoutgoing = $dirOut."/".$value['basename'];
            $file = fopen($fileoutgoing, "w");
            fwrite($file, $isi);
            fclose($file);
        }
    }
}
I have a question:
Is it possible to list and download only the files for the current date on the FTP server, without listing them all first?
Any solution, using either PHP or shell script, is OK.
Thank you so much (y)
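No standard FTP command filters by date on the server side, but the per-entry metadata reads can be collapsed into a single round trip with ftp_mlsd() (PHP 7.2+), which returns each entry's modification time in one MLSD command. A sketch under those assumptions; the host, directory, and the filterToday helper are illustrative:

```php
<?php
// MLSD reports 'modify' as a YYYYMMDDHHMMSS string; keep only plain
// files whose date prefix matches the given day.
function filterToday(array $entries, string $today): array
{
    return array_values(array_filter($entries, function ($e) use ($today) {
        return ($e['type'] ?? '') === 'file'
            && strncmp($e['modify'] ?? '', $today, 8) === 0;
    }));
}

// With a live connection, one listing call replaces the slow
// per-file metadata reads:
//   $ftp = ftp_connect('ftp.example.com');
//   ftp_login($ftp, $user, $pass);
//   ftp_pasv($ftp, true);
//   foreach (filterToday(ftp_mlsd($ftp, '/backup'), date('Ymd')) as $e) {
//       ftp_get($ftp, $dirOut.'/'.$e['name'], '/backup/'.$e['name'], FTP_BINARY);
//   }
```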

Check files on remote FTP server for duplicate content with PHP

I've written a script that transfers local files into a folder structure on a remote FTP server with PHP. I'm currently using ftp_connect() to connect to the remote server and ftp_put() to transfer the file, in this case, a CSV file.
My question is, how would one verify that a file's contents (on the remote FTP server) are not a duplicate of the local file's contents? Is there any way to parse the contents of the remote file in question, as well as a local version and then compare them using a PHP function?
I have tried comparing the size of the local file using filesize() and of the remote file using ftp_size(), respectively. However, two files with different data but the same number of characters have the same size in bytes, which generates a false positive for duplication.
Please note, the FTP in question is not under my control, so I can't put any scripts on the remote server.
Update
Thanks to both Mark and gafreax, here is the final working code:
$temp_local_file = fopen($conf['absolute_installation_path'] . 'data/temp/ftp_temp.csv', 'w');
if (ftp_fget($connection, $temp_local_file, $remote_filename, FTP_ASCII)) {
    fclose($temp_local_file); // flush the download to disk before reading it back
    $temp_local_stream = file_get_contents($conf['absolute_installation_path'] . 'data/temp/ftp_temp.csv');
    $local_stream = file_get_contents($conf['absolute_installation_path'] . $local_filename);
    $temp_hash = md5($temp_local_stream);
    $local_hash = md5($local_stream);
    if ($temp_hash !== $local_hash) {
        $remote_file_duplicate = FALSE;
    } else {
        $remote_file_duplicate = TRUE;
    }
}
You can use a hashing function like md5 and check whether the two generated hashes match.
For example:

$a = file_get_contents('a_local');
$b = file_get_contents('b_local');
$a_hash = md5($a);
$b_hash = md5($b);
if ($a_hash !== $b_hash)
    echo "Files differ";
else
    echo "Files are the same";

Comparing md5 hashes is also useful to avoid problems when the files contain unusual data.
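For local files the same comparison can be shortened with md5_file(), which reads and hashes in one call; a minimal sketch using temp files:

```php
<?php
// md5_file() hashes a file's content directly, so there is no need
// for an intermediate file_get_contents() string.
$fileA = tempnam(sys_get_temp_dir(), 'a');
$fileB = tempnam(sys_get_temp_dir(), 'b');
file_put_contents($fileA, "same content");
file_put_contents($fileB, "same content");

$duplicate = (md5_file($fileA) === md5_file($fileB)); // true: contents match

unlink($fileA);
unlink($fileB);
```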
You could also compare the last modified time of each file. You'd upload the local file only if it is more recent than the remote one. See filemtime and ftp_mdtm. Both of those return a UNIX timestamp you can easily compare. This is faster than getting the file contents and calculating a hash.
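The decision logic of that timestamp approach is small enough to sketch; the shouldUpload helper is illustrative, and it treats ftp_mdtm()'s -1 failure value as "remote time unknown, upload to be safe":

```php
<?php
// Upload only when the local copy is newer than the remote one.
// ftp_mdtm() returns -1 when the server cannot report a time.
function shouldUpload(int $localMtime, int $remoteMtime): bool
{
    return $remoteMtime === -1 || $localMtime > $remoteMtime;
}

// With a live connection this would be driven by:
//   if (shouldUpload(filemtime($localFile), ftp_mdtm($ftp, $remoteFile))) {
//       ftp_put($ftp, $remoteFile, $localFile, FTP_ASCII);
//   }
```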

check file for changes using php

Is there any way to check if a file is being accessed or modified by another process from a PHP script? I have attempted to use the filemtime(), fileatime() and filectime() functions, but although my script checks continuously in a loop, it seems that once the script has started it only ever returns the time from the first time the file was checked. An example would be uploading files to an FTP or SMB share. I attempted this below:
while (true)
{
    $LastMod = filemtime("file");
    if (($LastMod + 60) > time())
    {
        echo "file in use please wait... last modified : $LastMod";
        sleep(10);
    } else {
        // process file
    }
}
I know the file is constantly changing, but the $LastMod variable is not updating. Ending the process and executing the script again picks up a new $LastMod from the file, but it doesn't seem to update each time the file is checked inside the loop.
I have also attempted this with filesize() and get the same symptoms. I also looked into flock(), but as the file is created and modified outside PHP I don't see how that would work.
If anyone has any solutions please let me know.
Thanks, Vip32
PS. I'm using PHP to process the files as it requires interaction with MySQL and querying external websites.
The file metadata functions all work off stat() output, which caches its data, as a stat() call is a relatively expensive operation. You can empty that cache, forcing stat() to fetch fresh data, with clearstatcache().
There are other mechanisms that let you monitor for file changes. Instead of looping in PHP and repeatedly stat()ing, consider using an external monitoring app/script which hooks into the OS-provided mechanism and calls your PHP script on demand when the file truly does change.
Add clearstatcache(); to your loop:

while (true)
{
    clearstatcache(); // force a fresh stat() on every pass
    $LastMod = filemtime("file");
    if (($LastMod + 60) > time())
    {
        echo "file in use please wait... last modified : $LastMod";
        sleep(10);
    } else {
        // process file
    }
}

PHP fopen/fwrite problem on IIS7

I am running PHP5 on IIS7 on Windows Server 2008 R2. Check out the below code which writes a string received via a POST request parameter into an XML file.
<?php
$temp = "";
if ($_SERVER['REQUEST_METHOD'] == "POST") {
    if ($_POST["operation"] == "saveLevels") {
        $fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\levels.xml", 'w');
        fwrite($fileHandle, stripslashes($_POST["xmlString"]));
        fclose($fileHandle);
        $temp = "success";
    } elseif ($_POST["operation"] == "saveRules") {
        $fileHandle = fopen("c:\\inetpub\\wwwroot\\test\\xml\\rules.xml", 'w');
        fwrite($fileHandle, stripslashes($_POST["xmlString"]));
        fclose($fileHandle);
        $temp = "success";
    }
}
When I make a POST request to invoke this code, the application pool owning/hosting the site containing the PHP files stops (due to some fatal errors, according to the Event Viewer), and IIS keeps responding with HTTP 503 after that. I have given proper permissions to IUSR and IISUSRS on that (test/xml) directory. The two XML files do not already exist; I have also tried the code with an XML file already present, but it behaved the same way.
What's wrong with that PHP code? I have tried it on a Linux box and it behaved as expected.
edit: I have tried various versions of this code and came up with this result: the fopen call, when triggered by a POST request, always returns FALSE or sometimes NULL, and causes its application pool to stop. The exact same code works fine with a GET request with the exact same parameters. So I don't know what the problem is, but for the time being I'm just going to use GET requests for this operation.
Can you var_dump($fileHandle) for both operations, to show us what it contains? I notice you're just assuming the file is opened, rather than checking the return value (if it's FALSE, the fwrite will fail).
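A defensive version of one branch, sketched against a temp directory instead of the original c:\inetpub path, checks fopen()'s return value before writing:

```php
<?php
// If fopen() fails it returns FALSE; report that instead of handing
// FALSE to fwrite(). The path stands in for the IIS xml folder.
$path = sys_get_temp_dir() . '/levels.xml';
$fileHandle = fopen($path, 'w');
if ($fileHandle === false) {
    $temp = "error: could not open $path for writing";
} else {
    fwrite($fileHandle, '<levels/>');
    fclose($fileHandle);
    $temp = "success";
}
```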
