PHP fopen function only works one way

OK, let me explain what's going on:
I have two hosts/websites, Host A and Host B. When I use PHP's fopen() on Host A to fetch a page from Host B, it works: I can read the title of the page. But when I go from Host B to Host A, it does not work. And here is the strange thing: going from Host B to example.com does work.
I think there is a wrong setting on Host A, but I can't change much on that server. Does anybody know how to fix this, so that I can read the title of a file on Host A from Host B?
Code that I use (hosted on Host B) to open the file and search for the title:
$file = fopen("http://promike360.esy.es/main_site/", "r"); // This is Host A
if ($file) {
    echo "<p>Loading remote file succeeded.</p>";
} else {
    echo "<p>Unable to open remote file.</p>";
    exit;
}
while (!feof($file)) {
    $line = fgets($file, 1024);
    /* This only works if the title and its tags are on one line */
    if (preg_match("#<title>(.*)</title>#i", $line, $out)) {
        $title = $out[1];
        echo $title;
        break;
    }
}
fclose($file);

I fixed it!
After a long time, I came up with a solution to my problem. But first, the problem:
I have two hosts, Host A and Host B. I wanted to read a file on Host A using a PHP script (one that uses the fopen() function) on Host B. This did not work. What did work was the reverse: reading a file hosted on Host B from the script on Host A. (And no, the file permissions were configured correctly.)
This is really strange, and even now, as I'm writing this, I still don't know why fopen() itself fails.
The solution
The solution to this problem is to request the file on Host A with jQuery and add one line of PHP code to the file on Host A.
[PHP] The line that needs to be inside the PHP file on Host A:
header('Access-Control-Allow-Origin: *');
This tells your browser that it is allowed to read responses from this host; otherwise the browser will block your cross-origin request (this is what someone told me).
[jQuery/JavaScript] The code that I used to read the file:
$.get("http://example.com/file.php", function( data ) {
});
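Putting the two pieces together, a minimal sketch of what the file on Host A could look like (the filename title.php and the title text are assumptions for illustration, not part of the original setup):

```php
<?php
// Hypothetical title.php on Host A (name and title text are illustrative).
// The CORS header lets a browser on another origin (Host B) read the response;
// without it, the browser blocks the cross-origin $.get().
header('Access-Control-Allow-Origin: *');

// Return only the title, so the jQuery callback receives it directly
// instead of having to scrape <title> out of a full HTML page.
echo 'My page title';
```

With a file like this, the data argument of the jQuery callback already contains the title, and no tag scraping is needed.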

Related

fopen & cURL on php ubuntu web server

Whether I'm trying to create a file or use file_get_contents() on an external URL, I just can't get fopen() (or cURL) code to work. I have checked phpinfo() and allow_url_fopen, allow_url_include, and cURL are all enabled; I even looked at php5/apache2/php.ini and they are all set to On there too.
But no matter what I do, the PHP code doesn't work. Below are some examples (they work on WAMP/localhost but not when moved to the live web server).
Create an HTML file in a folder (works on a Windows localhost/WAMP server):
if (!file_exists('media/folder')) {
    mkdir('media/folder', 0755, true);
}
$config_file = 'media/folder/config.html';
$config_file_handle = fopen($config_file, 'w') or die('Cannot open file: ' . $config_file); // implicitly creates the file
$config_data = 'add to my file content' . PHP_EOL;
fwrite($config_file_handle, $config_data);
This does not work on the live server, which is quite frustrating, as it needs to work; I have tried many cURL options as well, but they also fail.
Below is another PHP snippet that works locally but not live.
<?php
// Works locally with an external URL
// Works locally with a local file
// Does NOT work on the live server with an external URL
// DOES work on the live server with a local URL
$string = file_get_contents('my url');
$json_a = json_decode($string, true);
foreach ($json_a as $person_name) {
    // Person name
    echo $person_name['PersonName'];
}
?>
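One thing worth ruling out first (a diagnostic sketch, not part of the original question): a phpinfo() page and the php.ini you edit can belong to a different SAPI than the one serving the failing script, so the script itself can be asked what it actually sees:

```php
<?php
// Query the running PHP process directly: Apache/mod_php, php-fpm and the
// CLI can each load a different php.ini, so a phpinfo() page generated by
// another SAPI may not reflect the settings of the one that is failing.
echo 'allow_url_fopen: ', ini_get('allow_url_fopen') ? 'On' : 'Off', PHP_EOL;
echo 'cURL loaded:     ', extension_loaded('curl') ? 'yes' : 'no', PHP_EOL;
echo 'loaded php.ini:  ', var_export(php_ini_loaded_file(), true), PHP_EOL;
```

If allow_url_fopen reports Off here while phpinfo() elsewhere says On, the live server is simply reading a different configuration file.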
I can't really achieve what I have set out to do if I can't get this to work.
Any help appreciated.
This is an extremely common error: you may need to allow the www-data user (the account the web server runs as) to access the files in question. Simply right-click the file or folder, click Properties, go to the Permissions tab, and set www-data as the owner.
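To confirm from the PHP side that permissions, rather than the code, are the problem, a sketch like this (based on the question's own snippet and paths) reports which step fails instead of dying blindly:

```php
<?php
// Diagnostic sketch around the question's code: report *which* step fails,
// so an ownership/permissions problem shows up clearly in the message.
$dir  = 'media/folder';
$file = $dir . '/config.html';

if (!is_dir($dir) && !mkdir($dir, 0755, true)) {
    die("Cannot create $dir - the web-server user (e.g. www-data) may not own the parent directory.");
}
if (!is_writable($dir)) {
    die("$dir exists, but this process cannot write to it - check its owner and mode.");
}

$handle = fopen($file, 'w') or die("Cannot open $file for writing.");
fwrite($handle, 'add to my file content' . PHP_EOL);
fclose($handle);
echo "wrote $file\n";
```

If the script dies at the is_writable() check on the live server but not locally, changing the owner to www-data as described above is the fix.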

PHP under IIS6 can't open/read file on network

I have a simple PHP script that runs on an IIS 6 (Windows Server 2003) server. All the script does is try to open a file on a shared network location and display its contents on the screen. Try as I might, I cannot get that to work: I keep getting a "failed to open stream" error message.
After a lot of reading (I'm in my second week of working on this), I have narrowed the problem down to a server configuration issue. I can run the script from the command prompt and it works fine. If I run var_dump(shell_exec('whoami')) through IIS, it returns NULL; the same command at the command prompt returns the currently logged-in user (i.e. me). Task Manager reports that the user for w3wp.exe is NETWORK SERVICE.
I'm including the code below, although I'm 100% sure the code is not the problem; some people like to look at code, so there it is. How do you configure the server so that it allows reading from a network location? For what it's worth, the network location I'm trying to access has been set up with full permissions for everybody, so that we can isolate this one issue.
<?php
$theFile = "\\\\192.168.0.16\\geo\\junk.txt"; # network file: does not work
#$theFile = "junk.txt";                       # local file: works fine
$handle = fopen($theFile, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle);
        echo $buffer . "<br />";
    }
}
?>

PHP can't access network drive but can if address is explicit

I've been battling this problem for days now. I'm 100% sure it is a user configuration issue somewhere, but I don't know where to look or what to change.
Here is the problem. I have a file I want to read; it just contains five lines like "this is line 1", "this is line 2", etc. It's for debugging purposes. I have two copies of this file: one lives locally and one lives on a network drive. I can access the local file with no problem. I can also access the file on the network drive if I specify the address explicitly, as in \\192.168.0.16\geo\junk.txt. What I cannot do is access the file via a mapped drive, as in U:\junk.txt, where U: is mapped to the 192.168... address above.
Below is my code. Again, after A LOT!!! of reading, I've come to the conclusion that it's a user permission issue between the machine the code lives on and the Apache server that runs the code; I don't think the two are talking to each other. In case it matters, this is on a Windows 7 machine with Apache 2.2.
<?php
#$theFile = "\\\\192.168.0.16\\geo\\junk.txt"; # explicit UNC path: works fine
#$theFile = "junk.txt";                        # local file: works fine
$theFile = "U:\\junk.txt";                     # mapped drive: DOESN'T WORK. AAARRRGGG!!!!!
$handle = fopen($theFile, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle);
        echo $buffer . "<br />";
    }
}
?>

php fwrite() doesn't finish writing string data to file, why?

I'm trying to write a sizable chunk of data to a file that is opened via fopen() in PHP. The protocol wrapper I'm using is FTP, so the file is remote to the server running the PHP code. The file I'm writing to is on a Windows server.
I verified that my PHP code does, in fact, create the file, but the problem is that the data within the file is either non-existent (0 KB) or the write stops prematurely. I'm not sure why.
Here is the code I am using for handling the operation:
$file_handle = fopen($node['ftp'].$path_to_lut, "wb", 0, $node['ftp_context']);
include_once($file);
if ($file_handle) {
    fwrite($file_handle, $string); // $string is defined inside the included $file
    fclose($file_handle);
} else {
    die('There was a problem opening the file.');
}
This code works fine when I host it on my local machine, but when I upload it to my web host (Rackspace Cloud), it fails. This leads me to believe it's an issue with the configuration of my server at Rackspace, but I'd like to know if there is anything I can do to my PHP code to make it more robust.
Any ideas on how to ensure fwrite() actually finishes writing the string to the remote machine?
Thanks!
Okay, I changed the code that writes to the file like so:
if ($file_handle) {
    if ($bytesWritten = fwrite($file_handle, $string)) {
        echo "There were " . $bytesWritten . " bytes written to the text file.";
    }
    if (!fflush($file_handle)) {
        die("There was a problem outputting all the data to the text file.");
    }
    if (!fclose($file_handle)) {
        die("There was a problem closing the text file.");
    }
} else {
    die("No file to write data to. Sorry.");
}
What is strange is that the echo statement shows the following:
There were 10330 bytes written to the text file.
And yet, when I check the file size via FTP, it shows 0 KB, and the data inside the file is, in fact, truncated. I can't imagine it has to do with the FTP server itself, because everything works when the PHP is hosted on a machine other than the Rackspace Cloud one.
** UPDATE **
I spoke to a Rackspace Cloud rep, who mentioned that they require passive FTP if you're going to FTP from their servers. I set up the remote server to handle passive FTP connections and verified, via the OS X Transmit FTP client, that passive FTP now works on it. I added:
ftp_pasv($file_handle, true);
right after the fopen() statement, but I get an error from PHP saying that I didn't provide a valid resource to ftp_pasv(). How can I ensure that the connection PHP makes to the FTP site is passive rather than active, and still use fwrite()? Incidentally, I've noticed that the Windows machine reports the file being written by my PHP code as 4096 bytes on disk; it never gets beyond that amount. This led me to change the output_buffering PHP value to 65536 just to troubleshoot, but that didn't fix the issue either.
** UPDATE PART DEUX **
Troubleshooting the problem on my virtual server on the Rackspace Cloud Sites product was proving too difficult, because they don't grant enough admin rights. So I created a very small cloud server on Rackspace's Cloud Servers product and configured everything to the point where I could reproduce the same error with fwrite(). To make sure that I could write a file from that server to a remote server at all, I used basic ftp commands in my bash shell on the cloud server; that worked fine. So I assume there is a bug in the PHP implementation of fwrite(), probably due to some kind of data-throttling issue: when I write to the remote server from my local environment, which has a slow upstream speed compared to the Rackspace Cloud server, it works fine. Is there any way to effectively throttle down the speed of the write? Just askin' :)
** UPDATE PART III **
So, I took the suggestion from @a sad dude and implemented a function that might help anybody trying to write a new file and send it off in its entirety via FTP:
function writeFileAndFTP($filename=null, $data=null, $node=null, $local_path=null, $remote_path=null)
{
    // !Determine the path and the file to upload from the web server
    $file = $local_path.'/'.$filename;

    // !Open a new file to write to on the local machine
    if (!($file_handle = fopen($file, "wb", 0))) {
        die("There was a problem opening ".$file." for writing!");
    }

    // !Write the file to local disk
    if ($bytesWritten = fwrite($file_handle, $data)) {
        //echo "There were " . $bytesWritten . " bytes written to " . $file;
    }

    // !Close the file after writing
    if (!fclose($file_handle)) {
        die("There was a problem closing " . $file);
    }

    // !Create a connection to the remote FTP server
    $ftp_cxn = ftp_connect($node['addr'], $node['ftp_port']) or die("Couldn't connect to the ftp server.");

    // !Log in to the remote server
    ftp_login($ftp_cxn, $node['user'], getPwd($node['ID'])) or die("Couldn't login to the ftp server.");

    // !Set PASV or ACTIVE FTP
    ftp_pasv($ftp_cxn, true);

    // !Upload the file
    if (!ftp_put($ftp_cxn, $remote_path.'/'.$filename, $file, FTP_ASCII)) {
        die("There was an issue ftp'ing the file to ".$node['addr'].$remote_path);
    }

    // !Close the ftp connection
    ftp_close($ftp_cxn);
}
The length of the string fwrite() can write in one go is limited on some platforms (which is why it returns the number of bytes written). You can run it in a loop until the whole string has been consumed, but a better idea is simply to use file_put_contents(), which guarantees that the whole string will be written.
http://www.php.net/manual/en/function.file-put-contents.php
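Both suggestions can be sketched as follows. Local output paths are used here for illustration; with an ftp:// stream wrapper the same calls apply:

```php
<?php
// 1) Loop until fwrite() has consumed the whole string, since a single
//    call may write fewer bytes than requested on some streams.
function write_all($handle, $string) {
    $written = 0;
    $length  = strlen($string);
    while ($written < $length) {
        $chunk = fwrite($handle, substr($string, $written));
        if ($chunk === false || $chunk === 0) {
            return false; // the stream refused further data
        }
        $written += $chunk;
    }
    return true;
}

$data = str_repeat('x', 10330); // same size the asker reported

$h = fopen('out_loop.txt', 'wb');
write_all($h, $data) or die('short write');
fclose($h);

// 2) Or let file_put_contents() handle the open/loop/close in one call.
file_put_contents('out_fpc.txt', $data);

echo filesize('out_loop.txt'), ' ', filesize('out_fpc.txt'), PHP_EOL; // 10330 10330
```

The loop matters precisely because a partial write is not an error: fwrite() reporting fewer bytes than requested just means the call must be repeated with the remainder.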

Remotely include directories with PHP

I've done some research on an issue I'm having: taking a remote directory from Server A and linking that directory to Server B. I'm not fully sure whether I can fetch the remote directory using PHP and use the contents of that directory on Server B.
Here's what I want to happen between the two servers:
Server A (code.php)
<?php
$FileTitle = '/code/';
if (!isset($_GET['file'])) {
    exit('This file cannot be found');
}
$FileTitle = $_GET['file'];
?>
What I have going on with this script is that every time a person enters a URL ending with /code.php?file=testfile.txt, that file name (or any other file in the /code/ directory on Server A) will be echoed using <?php echo $FileTitle; ?>. My problem is that I host all the files on Server A rather than Server B. I want the title of the file from the URL to show up in index.php on Server B.
Server B (index.php)
<?php
include 'http://example.com/code.php';
?>
<?php echo $FileTitle; ?>
I'm planning for this to take the code from Server A and be able to find the /code/ directory on that server as well.
I've done a ton of research the past few days, both on Stack Overflow and around the internet, and I haven't found anything even close to what I'm trying to do. If you can, please show me how to do this; I would really appreciate figuring out how to have a remote connection to a file on another server and be able to use that file remotely. Thanks :)
code.php will execute on the remote server, so you will only get the output of code.php, if any. The only thing I can think of is writing a script that outputs the source of code.php.
Ex:
Server B, index.php
<?php
eval(str_replace(array('<?php', '?>'), '', file_get_contents('http://example.com/sendcode.php')));
?>
Server A, sendcode.php
<?php
$code = file_get_contents('code.php');
echo $code;
?>
Completely insecure, but it works.
Edit: try the new Server B code. If that doesn't work, I'm out of ideas.
