PHP script causes page to hang while loading - php

I've done a lot of coding in the past in HTML and other languages, but have only touched the surface of PHP.
I am trying to build a self-updating server status block for my game server's site. My code has reached a stage where it should work in theory, and the page produces no error messages, but the page simply hangs whenever I try to load it on my site. (Viewing it locally just displays the raw PHP code, as I do not have Apache/PHP installed on my local machine.)
I have done some testing, and the page loads absolutely fine if I remove or comment out the PHP code blocks, which suggests my code is causing the problem.
Here is the code I am trying to implement in the webpage. I am simply putting it inline inside a <span> element, not including it from a separate file.
<?php
$conn = ftp_connect('SERVER');
$file = 'status.txt';
if (@ftp_login($conn, 'USERNAME', 'PASSWORD')) {
    ftp_pasv($conn, true);
    ftp_get($conn, $file, $file, FTP_ASCII);
    $status = file_get_contents($file);
    if ($status == 'online') {
        echo '<span style="color:#00FF00;">Online</span>';
    } elseif ($status == 'offline') {
        echo '<span style="color:red;">Offline</span>';
    } else {
        echo '<span style="color:red;">Unknown</span>';
    }
} else {
    echo 'Could not authenticate FTP server';
}
ftp_close($conn);
?>
This is my second attempt at this code. I have also tried using file_get_contents('ftp://ADDRESS'); instead of transferring the file from the FTP server, but the page still hung (or displayed a PHP error if it produced one).
(If I leave the page long enough, I eventually get a 'Server sent no data' or 'Took too long to respond' error, and I tend to get locked out of the website completely, seeing those same errors on pages I know worked fine before, until I reset my IP address.
I have tried the page in both Chrome and Internet Explorer; both browsers hang on it.)

Related

PHP 7 - SFTP keeps interrupting page to load

I'm currently trying to get a file from a server via PHP SFTP. I managed to authenticate and connect to the server. The problem is that if I open a directory on said server, the page just keeps loading until my browser tells me the loading of the page has been interrupted. This only happens if I try to open a directory that EXISTS; if I open a directory that doesn't exist, I get a normal error message.
Therefore I'm not quite sure whether this is a mistake in my code or a problem with the FTP server.
My Code:
ini_set("display_errors", "1");
$host = "<host>";
$port = 22;
$conn = ssh2_connect($host);
$username = "<user>";
$pub_key = "/home/<user>/.ssh/id_rsa.pub";
$pri_key = "/home/<user>/.ssh/id_rsa";
if (ssh2_auth_pubkey_file(
    $conn,
    $username,
    $pub_key,
    $pri_key
)) {
    if (!$sftp = ssh2_sftp($conn)) {
        die("SFTP Connection failed");
    }
    opendir("ssh2.sftp://" . intval($sftp) . "/./");
}
Has anyone ever experienced something similar?
I'd be glad for any help :)
~François
That is the expected behaviour.
opendir() returns a handle. Your function is working; it's just that you do nothing with the data, so your PHP script does nothing and simply waits with the information.
Handle the data, or at least write an echo, and it should be fine.
Check the manual; there is a working example: http://php.net/manual/en/function.opendir.php
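Following the manual's pattern, a minimal sketch of actually consuming the handle (list_dir() is an illustrative helper name; for the question's case, $dir would be the "ssh2.sftp://" . intval($sftp) . "/./" path, but any readable directory works):

```php
// Iterate a directory handle instead of discarding it, as in the manual's
// opendir() example. list_dir() is a hypothetical helper name.
function list_dir($dir)
{
    $entries = [];
    if ($handle = opendir($dir)) {
        while (false !== ($entry = readdir($handle))) {
            $entries[] = $entry;   // collect (or echo) each entry name
        }
        closedir($handle);
    }
    return $entries;
}
```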

Does file_get_contents store any data

I'm using file_get_contents as below and set cron job to run that file every hour, so it opens the described url which is for running some other functions. Now I have two questions completely similar.
<?php
file_get_contents('http://107.150.52.251/~IdidODiw/AWiwojdPDOmwiIDIWDIcekSldcdndudsAoiedfiee1.php');
?>
1) If the above URL returns a null value, does it store anything on the server (a temporary value or log)?
2) If the above URL returns an error, does it store anything like errors or temporary values on the server permanently?
The function itself does not leave any trace.
Since you are running this code in a cron job, you cannot directly inspect its output, so you need to log the result to a log file. Look into Monolog, for instance.
You would then log the result of your function like this:
$contents = file_get_contents( ... );
if ($contents === false) {
    $log->error("An error occurred");
} else {
    $log->debug("Result", array('content' => $contents));
}
If you suspect anything wrong with the above command, or want to debug it, you can print the error/success message with the following code and redirect it to a log file:
$error = error_get_last();
echo $error['message'];
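For a cron context specifically, a small wrapper along these lines combines both suggestions: a strict === false check plus error_get_last() written to a file. The function name and log format here are illustrative, not from the original post:

```php
// Sketch: wrap file_get_contents() so every cron run leaves a trace in a
// log file. fetch_and_log() and the log format are hypothetical choices.
function fetch_and_log($url, $logfile)
{
    $contents = @file_get_contents($url);
    if ($contents === false) {             // strict check: '' is a valid body
        $error = error_get_last();
        $msg = $error ? $error['message'] : 'unknown error';
        file_put_contents($logfile, date('c') . " ERROR: $msg\n", FILE_APPEND);
        return false;
    }
    file_put_contents($logfile, date('c') . " OK: " . strlen($contents) . " bytes\n", FILE_APPEND);
    return $contents;
}
```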

PHP script to check on remote server, a file exists

I am having problems locating a PHP script that lets me obtain the contents of a txt file on a remote server and output it to a variable. Outputting something to a variable is not the hard part; it's picking up and reading the contents of the file that's hard. Anyone have any ideas?
I have trawled the forum and can only locate a method that works locally. Not ideal, as the target is remote.
The objective, really, is: how do I find out whether a file exists on the remote server and output a status in HTML?
Ideas?
Assuming your remote server is accessible by HTTP or FTP, you can use file_exists():
if (file_exists("http://www.example.com/somefile.txt")) {
    echo "Found it!";
}
or
if (file_exists("ftp://user:password@www.example.com/somefile.txt")) {
    echo "Found it!";
}
Use this:
$url = 'http://php.net';
$file_headers = @get_headers($url);
if ($file_headers[0] == 'HTTP/1.1 404 Not Found') {
    echo "URL does not exist";
} else {
    echo "URL exists";
}
Source: http://www.php.net/manual/en/function.file-exists.php#75064
You can try this code:
if (file_exists($path)) {
    echo "it exists";
} else {
    echo "it does not exist";
}
As you can see, $path is the path of your file. Of course, you can write anything else instead of those echo statements.
Accessing files on other servers can be quite tricky! If you have access to the file via FTP, you can use FTP to fetch it, for example with ftp_fget().
If you do not have access to the file system via SSH, you can only check the response the server gives when requesting the file. If the server responds with a 404 error, the file either does not exist or is not accessible via HTTP due to the server configuration.
You can check this through cURL; see this tutorial for a detailed explanation of obtaining the response code through cURL.
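As a sketch of that approach (assuming the cURL extension is available; the helper name is ours, not from a library), a HEAD request learns the status code without downloading the file body:

```php
// Check whether a URL exists by asking only for the response headers.
// remote_file_exists() is a hypothetical helper name.
function remote_file_exists($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo output
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // 0 if request failed
    curl_close($ch);
    return $code === 200;
}
```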
I know this is an old thread, but as Lars Ebert points out, checking for the existence of a file on a remote server can be tricky, so checking the server response, using cURL, is how I have been able to do it on our big travel site. Using file_exists() threw an error every time, but checking for a "200 OK" has proved quite successful. Here is the code we are using to check for images for our hotel listings page:
$media_url = curl_init("http://pathto/remote_file.png");
curl_setopt($media_url, CURLOPT_RETURNTRANSFER, true);
$media_img = curl_exec($media_url);
$server_response = curl_getinfo($media_url, CURLINFO_HTTP_CODE);
if ($server_response != 200) {
    echo "pathto/graphics/backup_image.png";
} else {
    echo "http://pathto/remote_file.png";
}
Where "http://pathto/remote_file.png" is the remote image we seek, but we need to know whether it is really there. And "pathto/graphics/backup_image.png" is what we display if the remote image does not exist.
I know it's awfully verbose, compared to file_exists(), but it's also more accurate, at least so far.

I am getting an Error (-32300): transport error - HTTP status code was not 200

I am getting an error when I try to upload data using XML-RPC in WordPress. The code used to work fine, but all of a sudden this error started appearing. I have not changed anything in the code.
Error (-32300): transport error - HTTP status code was not 200
Also, I know my script works, because Google Chrome returns an 'ok' status on the GET request.
php.ini has 128MB of memory allocated.
Here is the code that is used to make post
/**
 * Make posts using the XMLRPC classes
 */
function makePosts() {
    $data_set = $this->getMovieLinks();
    $xml_client = new XMLRPClientWordPress();
    foreach ($data_set as $key) {
        echo '<pre>';
        echo 'This is the title of the movie about to be added ======== : ' . $key['title'];
        echo '</pre>';
        // new_post($title, $summary, $category, $image_url, $internal_links)
        if ($xml_client->new_post($key['title'], $key['summary'], $key['category'], $key['image'], $key['internal_links'])) {
            $status = 1;
        } else {
            $status = 0;
        }
        // Note: isset($status) is always true once $status has been assigned;
        // test the value itself instead.
        if ($status) {
            echo ' ====== ADDED';
        } else {
            echo ' ====== ERROR ADDING';
        }
    }
} // Function makePosts ends here
You can do a few things to debug the error:
Take a look at the server logs; they may include the real reason for the problem.
Look for "memory_limit" in your php.ini. Try a higher number and see whether that's the problem.
Try deactivating one plugin at a time; one of them may be causing the error.
I received the same error, and finally found the reason: I had enabled code like this in .htaccess (on the XML-RPC server side), so I had blocked myself.
order deny,allow
deny from all
allow from 211.111.0.0/16
The server hosting "http://example.com/xmlrpc.php" was blocking the IP the post script came from.
You should:
Add the XML-RPC client script's IP on the XML-RPC server side, even if the client and server are on the same site.
Or simply remove "deny from all" from .htaccess.
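A sketch of the first option; the 203.0.113.10 address below is a placeholder for the client script's actual IP:

```apache
order deny,allow
deny from all
allow from 211.111.0.0/16
# hypothetical: also allow the XML-RPC client script's own IP
allow from 203.0.113.10
```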
I had the same error, but I solved it:
I had typed http://www.example.com/xmlrpc.php, but the correct URL is http://example.com/xmlrpc.php. With the "www" prefix, the request was redirected with a status code other than 200.
If none of the above solutions work:
Make sure your IP is not being blocked (or left off a whitelist) by the hosting provider. Our client uses WP Engine, and we had this exact issue when posting media items.
After making the same request from an outside network, with a different IP, we got a 200 (OK) status code.

Attempting to load again a URL when it fails

The following function receives a string parameter representing a URL and loads that URL into a simple_html_dom object. If loading fails, it attempts to load the URL again.
public function getSimpleHtmlDomLoaded($url)
{
    $ret = false;
    $count = 1;
    $max_attemps = 10;
    while ($ret === false) {
        $html = new simple_html_dom();
        $ret = $html->load_file($url);
        if ($ret === false) {
            echo "Error loading url: $url\n";
            sleep(5);
            $count++;
            $html->clear();
            unset($html);
            if ($count > $max_attemps) {
                return false;
            }
        }
    }
    return $html;
}
However, if loading a URL fails once, it keeps failing for that URL, and after the max attempts are used up it also keeps failing in subsequent calls to the function, for the rest of the URLs it has to process.
It would make sense for it to keep failing if the URLs were temporarily offline, but they are not (I checked while the script was running).
Any ideas why this is not working properly?
I would also like to point out that when it starts failing to load the URLs, it only gives a single warning (instead of multiple ones), with the following message:
PHP Warning: file_get_contents(http://www.foo.com/resource): failed
to open stream: HTTP request failed! in simple_html_dom.php on line
1081
Which is prompted by this line of code:
$ret = $html->load_file($url);
I have tested your code and it works perfectly for me; every time I call the function it returns a valid result on the first try.
So even if you load the pages from the same domain, there can be some protection on the page or the server.
For example, the page can look for certain cookies, or the server can look at your user agent; if it sees you as a bot, it will not serve the correct content.
I had similar problems while parsing some websites.
The answer for me was to see what the page/server expects and make my code simulate that: everything from faking the user agent to generating cookies and such.
By the way, have you tried creating a simple PHP script just to test that the 'simple html dom' parser runs on your server with no errors? That is the first thing I would check.
In the end, I must add that in one case, after numerous failed tries at parsing one page, I could not win the masking game. I ended up making a script that loaded the page in the Linux command-line text browser lynx, saved the whole page locally, and then parsed that local file, which worked perfectly.
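A minimal sketch of the user-agent idea (the UA string and URL below are placeholders): file_get_contents() accepts a stream context, and simple_html_dom's load_file() forwards its extra arguments to file_get_contents(), so the same context can be passed through it.

```php
// Fake a browser-like User-Agent via a stream context; the UA string and
// URL are placeholders, not values from the original post.
$context = stream_context_create([
    'http' => [
        'user_agent' => 'Mozilla/5.0 (X11; Linux x86_64)',
        'timeout'    => 5,   // seconds; also bounds how long a retry hangs
    ],
]);
// Used directly:
$page = @file_get_contents('http://www.example.com/', false, $context);
// Or through simple_html_dom, which forwards the arguments:
// $html->load_file('http://www.example.com/', false, $context);
```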
It may be a problem with the load_file() function itself.
The problem was that error_get_last() returns all previous errors too; I don't know, maybe it depends on the PHP version?
I solved the problem by changing it to check whether the error changed, not whether it is null
(or use the non-object function file_get_html()):
function load_file()
{
    $preerror = error_get_last();
    $args = func_get_args();
    $this->load(call_user_func_array('file_get_contents', $args), true);
    // Throw an error if we can't properly load the dom.
    if (($error = error_get_last()) !== $preerror) {
        $this->clear();
        return false;
    }
}
