How can I connect to another server - PHP

I want to build a platform that reports the disk space usage of several servers.
How can I do it?
$df = disk_free_space("/");
I want this line of code to be executed on all my servers.

It's not easy (or maybe impossible) with PHP alone. The best solution, in my opinion, is to send a request with cURL to each server and have it return its disk space usage.

One way is to use the ssh2_* functions of PHP to log in to each server and run df / (see the sketch below).
Another way is to create a web page on each server that runs disk_free_space('/') and echoes the result. Then you can use file_get_contents("http://servername/disk_free.php") to query each server.
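A rough sketch of the SSH approach, assuming the PECL ssh2 extension is installed; the host list and credentials below are placeholders, not part of the original question:
<?php
// Hypothetical host list and credentials -- replace with your own.
$servers = array('server1.com', 'server2.com', 'server3.com');
foreach ($servers as $host) {
    $conn = ssh2_connect($host, 22);                 // open an SSH connection
    if (!$conn || !ssh2_auth_password($conn, 'monitor', 'secret')) {
        echo "$host: connection/auth failed\n";
        continue;
    }
    $stream = ssh2_exec($conn, 'df -k /');           // run df on the remote root
    stream_set_blocking($stream, true);              // wait for the full output
    echo "$host:\n" . stream_get_contents($stream) . "\n";
    fclose($stream);
}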

On each server, you can add a page that returns the free space:
http://server1.com/GetFreeSpace.php
http://server2.com/GetFreeSpace.php
http://server3.com/GetFreeSpace.php
GetFreeSpace.php
<?php
echo disk_free_space("/");
You can use cURL to query several servers
<?php
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL => 'http://server1.com/GetFreeSpace.php',
    CURLOPT_RETURNTRANSFER => true
));
echo curl_exec($ch);
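To poll all of the servers, the same request can be wrapped in a loop; a rough sketch reusing one cURL handle and the three example URLs above:
<?php
$servers = array(
    'http://server1.com/GetFreeSpace.php',
    'http://server2.com/GetFreeSpace.php',
    'http://server3.com/GetFreeSpace.php',
);
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
foreach ($servers as $url) {
    curl_setopt($ch, CURLOPT_URL, $url);   // point the handle at the next server
    $bytes = curl_exec($ch);               // the page echoes disk_free_space("/")
    echo $url . ' => ' . $bytes . " bytes free\n";
}
curl_close($ch);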

There are many ways to do this. cURL is an option; you can use a cron job on each server that runs at a specific time and pushes the value to a central web server; or you can use SSH to get the same info.

Related

Calling an external API using a server [duplicate]

I have to send an SMS by making an HTTP request via GET method. The link contains information in the form of GET variables, e.g.
http://www.somelink.com/file.php?from=12345&to=67890&message=hello%20there
After I run the script it has to be as if someone clicked the link and activated the SMS sending process.
I have found some links about GET requests and cURL and whatnot; it's all so confusing!
I think the easiest way to make an HTTP request via the GET method from PHP is using file_get_contents().
<?php
$response = file_get_contents('http://example.com/send-sms?from=12345&to=67890&message=hello%20there');
echo $response;
Don’t forget to see the notes section for info on PHP configuration required for this to work. You need to set allow_url_fopen to true in your php.ini.
Note that this works for GET requests only and that you will have no access to the headers (neither request nor response). Also, enabling allow_url_fopen might not be a good choice for security reasons.
The easiest way is probably to use cURL. See https://web.archive.org/web/20180819060003/http://codular.com/curl-with-php for some examples.
Let's assume that we want to retrieve http://www.google.com:
$cURL = curl_init();
$setopt_array = array(
    CURLOPT_URL => "http://www.google.com",
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => array()
);
curl_setopt_array($cURL, $setopt_array);
$response_data = curl_exec($cURL);
print_r($response_data);
curl_close($cURL);
/*
cURL comes preinstalled on GoDaddy and many other PHP hosting providers;
it is also bundled with WAMP and XAMPP.
Good luck.
*/

Calling function or script on another server and receiving response

What I essentially want to do is to be able to call a function or script on another server using PHP and receive a response. Let me set up an example:
On my web application, I have a page, and I need to send some data to the server, have the server do some work and then return a response.
Web application:
<?php
// call server script, sending it for example a string, "testString", then wait for a response
?>
Server script:
<?php
// get the string "testString", do some work on it and return it, displaying it on the web app
?>
I'm not looking for someone to complete this for me, just to head me in the right direction. I've read that cURL can be useful for this, but have yet to be able to get any examples to work for me, such as these ones:
http://www.jonasjohn.de/snippets/php/curl-example.htm
How to get response using cURL in PHP
So what's an ideal route to head in to solve my problem?
cURL operates just like a browser basically. It makes an HTTP request to a server and gets back the response. So one server is the 'client' and one server is the 'server'.
On the server server (lol), set up a page called index.php that outputs some text
<?php
echo 'hello from the server server';
Then from the client server, create a page called index.php to make the cURL request to the server server.
<?php
// init curl object
$ch = curl_init();
// define options
$optArray = array(
    CURLOPT_URL => 'http://www.serverserver.com', // <-- edit that URL
    CURLOPT_RETURNTRANSFER => true
);
// apply those options
curl_setopt_array($ch, $optArray);
// execute request and get response
$result = curl_exec($ch);
var_dump($result);
Then, when you go to the client server URL, the client server will hit the server server, just like it was a person using a browser over HTTP, and print the result to the screen.
Hope that helps. Didn't actually test it.
If you do not have control over both machines, or the host server does not provide an API for this, it is not doable. Otherwise it is as simple as setting up some code on the host server that receives commands from your client, then processes them and responds accordingly. Once that is set up, you can easily call your server code with either cURL or even file_get_contents().

Something faster than get_headers()

I'm trying to make a PHP script that will check the HTTP status of a website as fast as possible.
I'm currently using get_headers() and running it in a loop over 200 random URLs from a MySQL database.
To check all 200 - it takes an average of 2m 48s.
Is there anything I can do to make it (much) faster?
(I know about fsockopen - it can check port 80 on 200 sites in 20s - but it's not the same as requesting the HTTP status code, because the server may be responding on the port but might not be serving the website correctly, etc.)
Here is the code..
<?php
function get_httpcode($url) {
    $headers = get_headers($url, 0);
    // Return HTTP status code
    return substr($headers[0], 9, 3);
}
###
## Grab task and execute it
###
// Loop through task
while ($data = mysql_fetch_assoc($sql)):
    $result = get_httpcode('http://'.$data['url']);
    echo $data['url'].' = '.$result.'<br/>';
endwhile;
?>
You can try the cURL library. You can send multiple requests in parallel with curl_multi_exec().
Example:
$ch = curl_init('http_url');
curl_setopt($ch, CURLOPT_HEADER, 1);
$c = curl_exec($ch);
$info = curl_getinfo($ch, CURLINFO_HTTP_CODE);
print_r($info);
UPDATED
Look at this example: http://www.codediesel.com/php/parallel-curl-execution/
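A rough sketch of the curl_multi approach; the URL list here is a placeholder, in practice you would build it from your database rows:
<?php
$urls = array('http://example.com', 'http://example.org');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the bodies
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: status code only
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}
// Run all handles until every transfer has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
foreach ($handles as $url => $ch) {
    echo $url . ' = ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);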
I don't know if this is an option you can consider, but you could run all of them almost at the same time using a fork; this way the script will take only a bit longer than one request.
http://www.php.net/manual/en/function.pcntl-fork.php
You could add this to a script that is run in CLI mode and launch all the requests at the same time, for example (see the sketch below).
Edit: you say that you have 200 calls to make, so one thing you might experience is losing the database connection. The problem is caused by the fact that the link is destroyed when the first child script completes. To avoid that, you could create a new connection for each child. I see that you are using the standard mysql_* functions, so be sure to pass the fourth parameter (new_link) to make sure you create a new link each time. Also check the maximum number of simultaneous connections on your server.
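A rough sketch of the fork approach, reusing the get_httpcode() function from the question; the $urls array is a placeholder, and the script is meant to be run from the CLI with the pcntl extension enabled:
<?php
// Assumes get_httpcode() from the question is defined above.
// Hypothetical URL list -- in practice, fetch it from the database before forking.
$urls = array('example.com', 'example.org');
$pids = array();
foreach ($urls as $url) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die('could not fork');
    } elseif ($pid == 0) {
        // Child process: check one URL and exit.
        echo $url . ' = ' . get_httpcode('http://' . $url) . "\n";
        exit(0);
    }
    $pids[] = $pid;   // parent keeps track of its children
}
// Parent: wait for every child to finish.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}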

PHP: Remote Function Call and returning the result?

I'm not very experienced with PHP. I want to know how to communicate between two web servers. To clarify: from the 1st server, run a (querying) function on a remote server and return the result to the 1st server.
The setup will be:
Web Server (1) ----------------> Web Server (2) ---------------> Database Server
Web Server (1) <---------------- Web Server (2) <--------------- Database Server
The query function() will only be located on Web Server (2). Then I need to run that query function() remotely from Web Server (1).
What is this called? And is it possible?
Yes.
A nice way of doing this would be to send a request to the 2nd server via a URL. In the GET (or POST) parameters, specify which method you'd like to call, and (for security) some sort of hash that changes with time. The hash is in there to ensure no third party can run the function arbitrarily on the 2nd server.
To send the request, you could use cURL:
function get_url($request_url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $request_url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
This sends a GET request. You can then use:
$request_url = 'http://second-server-address/listening_page.php?function=somefunction&securityhash=HASH';
$response = get_url($request_url);
On your second server, set up the listening_page.php (with whatever filename you like, of course) that checks for GET requests and verifies the integrity of the request (i.e. the hash, correct & valid params).
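A rough sketch of what listening_page.php could look like; the hash scheme and the somefunction name are placeholders taken from the example URL above, not a fixed API:
<?php
// listening_page.php -- hypothetical dispatcher on the second server.
$function = isset($_GET['function']) ? $_GET['function'] : '';
$hash     = isset($_GET['securityhash']) ? $_GET['securityhash'] : '';
// Verify the hash (placeholder scheme: shared secret + current date).
$expected = md5('my-shared-secret' . date('Y-m-d'));
if ($hash !== $expected) {
    http_response_code(403);
    exit('invalid hash');
}
// Only allow whitelisted functions to be called.
$allowed = array('somefunction');
if (in_array($function, $allowed, true) && function_exists($function)) {
    echo $function();   // the response body is whatever the function returns
} else {
    http_response_code(400);
    echo 'unknown function';
}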
You can do this by using an API. Create a page on the second server that takes variables and talks to the database server using those vars (depending on what you need). The standard reply from that page should be either JSON or XML. Then read that from server 1 by requesting that file and parsing the reply from the 2nd server (a sketch follows below).
Note: if it's a private file, make sure you use an authentication method to prevent users from accessing the file.
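A rough sketch of that idea, assuming a hypothetical api.php on the second server and JSON as the reply format:
<?php
// On Web Server (2): api.php -- runs the query against the database server and replies with JSON.
header('Content-Type: application/json');
$name = isset($_GET['name']) ? $_GET['name'] : '';      // variable passed by server 1
$rows = array();                                        // placeholder: fill from your query function
echo json_encode(array('name' => $name, 'rows' => $rows));
And on Web Server (1), request that page and decode the reply:
<?php
$response = file_get_contents('http://server2.example.com/api.php?name=test');
$data = json_decode($response, true);   // true = return an associative array
print_r($data);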
What you are aiming to do is definitely possible. You will need to set up some sort of API in order for server 1 to make a request to server 2.
I suggest you read up on SOAP and REST APIs:
http://www.netmagazine.com/tutorials/make-your-own-soap-api
Generally you will use something like cURL to contact server 2 from server 1.
Google cURL and you should quickly get the idea.
It's not going to be easy to give you a complete solution, so I hope this nudge in the right direction is helpful.

How do I check for valid (not dead) links programmatically using PHP?

Given a list of urls, I would like to check that each url:
Returns a 200 OK status code
Returns a response within X amount of time
The end goal is a system that is capable of flagging urls as potentially broken so that an administrator can review them.
The script will be written in PHP and will most likely run on a daily basis via cron.
The script will be processing approximately 1000 urls at a go.
Question has two parts:
Are there any big gotchas with an operation like this? What issues have you run into?
What is the best method for checking the status of a url in PHP considering both accuracy and performance?
Use the PHP cURL extension. Unlike fopen(), it can also make HTTP HEAD requests, which are sufficient to check the availability of a URL and save you a ton of bandwidth, as you don't have to download the entire body of the page.
As a starting point you could use some function like this:
function is_available($url, $timeout = 30) {
    $ch = curl_init(); // get cURL handle
    // set cURL options
    $opts = array(
        CURLOPT_RETURNTRANSFER => true,   // do not output to browser
        CURLOPT_URL => $url,              // set URL
        CURLOPT_NOBODY => true,           // do a HEAD request only
        CURLOPT_TIMEOUT => $timeout       // set timeout
    );
    curl_setopt_array($ch, $opts);
    curl_exec($ch); // do it!
    $retval = curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200; // check if HTTP OK
    curl_close($ch); // close handle
    return $retval;
}
However, there's a ton of possible optimizations: You might want to re-use the cURL instance and, if checking more than one URL per host, even re-use the connection.
Oh, and this code checks strictly for HTTP response code 200. It does not follow redirects (302), but there is also a cURL option for that (see the sketch below).
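If redirects should count as available, a sketch of the relevant options (added to the $opts array above):
$opts[CURLOPT_FOLLOWLOCATION] = true;  // follow 301/302 redirects
$opts[CURLOPT_MAXREDIRS] = 5;          // but don't follow them forever
// CURLINFO_HTTP_CODE will then report the status of the final location.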
Look into cURL. There's a library for PHP.
There's also an executable version of cURL so you could even write the script in bash.
I actually wrote something in PHP that does this over a database of 5k+ URLs. I used the PEAR class HTTP_Request, which has a method called getResponseCode(). I just iterate over the URLs, passing them to getResponseCode and evaluate the response.
However, it doesn't work for FTP addresses, URLs that don't begin with http or https (unconfirmed, but I believe it's the case), and sites with invalid security certificates (a 0 is not found). Also, a 0 is returned for server-not-found (there's no status code for that).
And it's probably easier than cURL as you include a few files and use a single function to get an integer code back.
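A rough sketch of that approach, assuming the (old, now deprecated) PEAR HTTP_Request package is installed; the exact API may differ between versions:
<?php
require_once 'HTTP/Request.php';  // PEAR package, not bundled with PHP
function check_url($url) {
    $req = new HTTP_Request($url);
    $req->sendRequest();              // perform the request
    return $req->getResponseCode();   // integer status code, or 0 on failure
}
echo check_url('http://example.com');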
fopen() supports HTTP URIs.
If you need more flexibility (such as timeout), look into the cURL extension.
Seems like it might be a job for curl.
If you're not stuck on PHP, Perl's LWP might be an answer too.
You should also be aware of URLs returning 301 or 302 HTTP responses which redirect to another page. Generally this doesn't mean the link is invalid. For example, http://amazon.com returns 301 and redirects to http://www.amazon.com/.
Just returning a 200 response is not enough; many valid links will continue to return "200" after they change into porn / gambling portals when the former owner fails to renew.
Domain squatters typically ensure that every URL in their domains returns 200.
One potential problem you will undoubtedly run into is when the box this script is running on loses access to the Internet... you'll get 1000 false positives.
It would probably be better for your script to keep some type of history and only report a failure after 5 days of failure.
Also, the script should be self-checking in some way (like checking a known good web site [google?]) before continuing with the standard checks.
You only need a bash script to do this. Please check my answer on a similar post here. It is a one-liner that reuses HTTP connections to dramatically improve speed, retries n times for temporary errors and follows redirects.
