I want to create a PHP script that will ping a domain and list the response time along with the total size of the request.
This will be used for monitoring a network of websites. I tried it with cURL; here is the code I have so far:
function curlTest2($url) {
    clearstatcache();
    $return = '';
    if (substr($url, 0, 4) != "http") $url = "http://" . $url;
    $userAgent =
        'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)';
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_NOBODY, 1);
    curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_FRESH_CONNECT, 1);
    $execute = curl_exec($ch);
    // Check if any error occurred
    if (!curl_errno($ch)) {
        $bytes = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
        $total_time = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
        $return = 'Took ' . $total_time . ' / Bytes: ' . $bytes;
    } else {
        $return = 'Error reaching domain';
    }
    curl_close($ch);
    return $return;
}
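One caveat with this approach: CURLOPT_NOBODY turns the request into a HEAD request, so CURLINFO_CONTENT_LENGTH_DOWNLOAD depends on the server sending a Content-Length header and may come back as -1. A minimal variant (the name curlTimeAndSize is made up here) that downloads the body and reports the bytes actually transferred:
function curlTimeAndSize($url) {
    if (substr($url, 0, 4) != "http") $url = "http://" . $url;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
    curl_exec($ch);
    // CURLINFO_SIZE_DOWNLOAD reports the bytes actually transferred,
    // so it works even when the server omits Content-Length
    $bytes = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
    $total_time = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_close($ch);
    return 'Took ' . $total_time . ' / Bytes: ' . $bytes;
}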
And here is one using fopen:
function fopenTest($link) {
    if (substr($link, 0, 4) != "http") {
        $link = "http://" . $link;
    }
    $timestart = microtime(true);
    $churl = @fopen($link, 'r'); // @ (not #) suppresses the warning; the handle is checked below
    $timeend = microtime(true);
    $diff = number_format(($timeend - $timestart) * 1000, 4); // elapsed time in milliseconds
    if (!$churl) {
        $message = "Offline";
    } else {
        $message = "Online. Time : " . $diff . "ms ";
        fclose($churl); // only close a handle that was actually opened
    }
    return $message;
}
Is there a better way to ping a website using PHP?
Obviously cURL's got all kinds of cool things, but remember, you can always make use of built-in tools by invoking them from the command line, like this:
$site = "google.com";
ob_start();
system("ping " . escapeshellarg($site));
ob_end_flush();
The only thing to keep in mind is that this isn't going to be as cross-platform as cURL might be; although the cURL extension is not enabled by default either.
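If you need the result programmatically instead of printed, exec() exposes both the output lines and the exit status. A minimal sketch, assuming a Unix-like ping that accepts -c for the packet count (on Windows the flag is -n):
$site = "google.com";
exec("ping -c 1 " . escapeshellarg($site), $output, $status);
// an exit status of 0 means at least one reply came back
echo $status === 0 ? "$site is up" : "$site is down";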
When doing quick scripts for one-time tasks I just exec() wget:
$response = `wget http://google.com -O -`;
It's simple and takes care of redirects.
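In the same spirit, the curl command-line tool can report timing and size directly through its -w format string, which maps neatly onto the original question. A sketch, assuming a Unix-like system for the -o /dev/null part:
$site = escapeshellarg("http://google.com");
// %{time_total} and %{size_download} are built-in curl write-out variables
$result = shell_exec("curl -s -o /dev/null -w '%{time_total} %{size_download}' $site");
list($time, $bytes) = explode(' ', trim($result));
echo "Took $time / Bytes: $bytes";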
If you're using the Suhosin patches together with cURL, you may encounter problems with HTTP redirects (301, 302, ...); Suhosin won't allow them.
I'm not sure about cURL vs. fopen, but this benchmark says file_get_contents() has better performance than fopen().
You could use XML-RPC (xmlrpc_client). I'm not sure what the advantages/disadvantages relative to cURL are.
Drupal uses XML-RPC for this purpose (look at the ping module).
Using cURL is fine.
I'm not sure I'd use that user-agent string, though; rather, make a custom one unless you specifically need that one.
Maybe the PEAR package Net_Ping is what you are looking for. It's no longer maintained, but it works.
If remote fopen is enabled (the allow_url_fopen ini setting), file_get_contents() will do the trick too.
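To tie that back to the original goal (response time plus size), a minimal sketch around file_get_contents():
$start = microtime(true);
$data = @file_get_contents('http://google.com'); // @ suppresses the warning on failure
$elapsed = microtime(true) - $start;
if ($data === false) {
    echo "Offline";
} else {
    echo "Online. Took " . number_format($elapsed, 4) . "s / Bytes: " . strlen($data);
}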
As mentioned above, the PHP file_get_contents() function, and even the fopen()/fread() combination, gets stuck and times out when trying to read this simple image URL:
http://pics.redblue.de/artikelid/GR/1140436/fee_786_587_png
but the same image is easily loaded by browsers. What's the catch?
EDITED:
As requested in the comments, here is the function I used to get the data:
function customRead($url)
{
    $contents = '';
    $handle = fopen($url, "rb");
    $dex = 0;
    while (!feof($handle))
    {
        if ($dex++ > 100) {
            echo "\nbreaking due to too many calls...\n"; // only report when the loop actually bails out
            break;
        }
        $contents .= fread($handle, 2048);
    }
    fclose($handle);
    return $contents;
}
I also tried simply this:
echo file_get_contents('http://pics.redblue.de/artikelid/GR/1140436/fee_786_587_png');
Both exhibit the same issue.
EDITED:
As suggested in the comments, I used cURL:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.1 Safari/537.11');
$res = curl_exec($ch);
$rescode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
$reserror = curl_error($ch); // read the error before closing the handle
curl_close($ch);
echo "\n\n\n[DATA:";
echo $res;
echo "]\n\n\n[CODE:";
print_r($rescode);
echo "]\n\n\n[ERROR:";
echo $reserror;
echo "]\n\n\n";
this is the result:
[DATA:]
[CODE:0]
[ERROR:]
If you don't get the remote data with file_get_contents(), you can try cURL, as it can provide error messages via curl_error(). If you get nothing, not even an error, then something on your server is blocking outgoing connections. You may also want to try curl from the command line over SSH; I'm not sure if that makes any difference, but it's worth a try. If you still don't get anything, consider contacting the server admin (if you're not the admin) or the hosting provider.
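A minimal sketch of that diagnostic, reading curl_error() before the handle is closed (using the image URL from the question):
$ch = curl_init('http://pics.redblue.de/artikelid/GR/1140436/fee_786_587_png');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
$data = curl_exec($ch);
if ($data === false) {
    // curl_error() is only meaningful while the handle is still open
    echo "cURL error: " . curl_error($ch) . "\n";
}
curl_close($ch);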
I am using PHP cURL to grab files from a URL. Currently the file names are hard-coded, and I am trying to make it download all the log files in a specific directory. Any pointer in the right direction would be nice. Thank you.
Please note that I only have read rights to the directory; I don't have FTP access or anything else.
Server_url : http://192.168.2.45/logfiles/
Server : server1
Files in that directory : 140512.log ... 150316.log and growing
<?php
$server_url = $_GET['server_url'];
$server = $_GET['server'];
// This needs to be changed to get all files
for ($i = 140512; $i <= 150316; $i++) {
    $id = base64_encode($i);
    $file_name = $server_url . $i . '.log';
    $ch = curl_init(); // the handle was never initialised in the original
    curl_setopt($ch, CURLOPT_URL, $file_name);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return data as string
    // disable peer verification
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    // disable host verification
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
    // spoof a real browser client string
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)");
    $output = curl_exec($ch); // capture data into $output variable
    if (!is_dir('sites/' . $server . '_LOGS')) { // dir() opens a handle; is_dir() is the existence test
        mkdir('sites/' . $server . '_LOGS');
    }
    if ($output != false) {
        file_put_contents('sites/' . $server . '_LOGS' . '/u_ex' . base64_decode($id) . '.log', $output);
    }
    curl_close($ch);
}
?>
cURL supports FTP; you can use it to get the file list and then download each file. I found an example in a previous answer using PHP cURL here: downloading all the files in a directory with cURL.
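Since the question says there is only HTTP read access, another option is to scrape the directory index itself, assuming the server has directory listing enabled for /logfiles/. A hedged sketch (the regex and paths are assumptions about what the listing page looks like):
$index = file_get_contents('http://192.168.2.45/logfiles/');
// pull every href that looks like a .log file out of the listing page
preg_match_all('/href="([^"]+\.log)"/i', $index, $matches);
foreach ($matches[1] as $file) {
    $data = file_get_contents('http://192.168.2.45/logfiles/' . basename($file));
    if ($data !== false) {
        // assumes the sites/server1_LOGS directory already exists (see the script above)
        file_put_contents('sites/server1_LOGS/' . basename($file), $data);
    }
}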
I am currently attempting to configure a cURL & PHP function found online that, when called, checks whether the HTTP response code is in the 200-300 range to determine if the web page is up. This works when run against an individual website with the code below (not the function itself, but the if statements etc.). The function returns true or false depending on the HTTP response code:
$page = "www.google.com";
$page = gzdecode($page);
if (Visit($page))
{
echo $page;
echo " Is OK <br>";
}
else
{
echo $page;
echo " Is DOWN <br>";
}
However, when running against an array of URLs stored within the script, via a foreach loop, it reports every web page in the list as down, even though the code is the same apart from the added loop.
Does anyone know what the issue might be?
Edit - adding the Visit function
My bad, sorry; I wasn't thinking fully.
The Visit function is the following:
function Visit($url) {
    $agent = "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERAGENT, $agent);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_VERBOSE, false);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_SSLVERSION, 3);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
    $page = curl_exec($ch);
    //echo curl_error($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    if ($httpcode >= 200 && $httpcode < 310) return true;
    else return false;
}
The foreach loop as mentioned looks like this:
foreach($Urls as $URL)
{
$page = $URL;
$page = gzdecode($page);
if (Visit($page))
The if statement around the Visit() call is the same as before.
$page = $URL;
$page = gzdecode($page);
Why are you trying to uncompress the non-compressed URL? Assuming you really meant to uncompress the content returned from the URL, why would the remote server compress it when you've told it that the client does not support compression? And why are you fetching the entire page just to see the headers?
The code you've shown us here has never worked.
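In other words, the implied fix is simply to drop the gzdecode() call and pass each URL straight to Visit():
foreach ($Urls as $URL) {
    if (Visit($URL)) {
        echo $URL . " Is OK <br>";
    } else {
        echo $URL . " Is DOWN <br>";
    }
}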
I have a repetitive task that I do daily: log in to a web portal, click a link that pops open a new window, and then click a button to download an Excel spreadsheet. It's a time-consuming task that I would like to automate.
I've been doing some research with PHP and cURL, and while it seems like it should be possible, I haven't found any good examples. Has anyone ever done something like this, or do you know of any tools better suited for it?
Are you familiar with the basics of HTTP requests? Like, do you know the difference between a POST and a GET request? If what you're doing amounts to nothing more than GET requests, then it's actually super simple and you don't need to use cURL at all. But if "clicking a button" means submitting a POST form, then you will need cURL.
One way to check this is by using a tool such as Live HTTP Headers and watching what requests happen when you click on your links/buttons. It's up to you to figure out which variables need to get passed along with each request and which URLs you need to use.
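If it turns out everything is a GET, a minimal illustration (with a hypothetical URL) of why cURL isn't strictly needed:
// a plain GET needs nothing more than file_get_contents()
$html = file_get_contents('http://www.website.com/path/to/report');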
But assuming that there is at least one POST request, here's a basic script that will post data and get back whatever HTML is returned.
<?php
if ($ch = curl_init()) {
    $data = 'field1=' . urlencode('somevalue');
    $data .= '&field2[]=' . urlencode('someothervalue');
    $url = 'http://www.website.com/path/to/post.asp';
    $userAgent = 'Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)';
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
    $html = curl_exec($ch);
    curl_close($ch);
} else {
    $html = false;
}
// write code here to look through $html for
// the link to download your excel file
?>
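To complete the placeholder comment at the end, a hedged sketch that scans $html for the first link ending in .xls or .xlsx (the extension, the absolute-URL prefix, and the lack of session cookies are all assumptions) and downloads it with a second request:
if ($html !== false && preg_match('/href="([^"]+\.xlsx?)"/i', $html, $m)) {
    // the link may be relative; prefixing the site root is an assumption
    $fileUrl = 'http://www.website.com' . $m[1];
    $excel = file_get_contents($fileUrl);
    if ($excel !== false) {
        file_put_contents('report.xlsx', $excel);
    }
}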
Try this:
$ch = curl_init();
$csrf_token = $this->getCSRFToken($ch); // this function gets the CSRF token from the website, if you need one
$ch = $this->signIn($ch, $csrf_token); // you must implement the sign-in function and return the handle
curl_setopt($ch, CURLOPT_HTTPGET, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 300); // in case the file is large
curl_setopt($ch, CURLOPT_URL, "https://your-URL/anything");
$return = curl_exec($ch);
// the important part
$destination = "files.xlsx";
if (file_exists($destination)) {
    unlink($destination);
}
$file = fopen($destination, "w+");
fputs($file, $return);
if (fclose($file)) {
    echo "downloaded";
}
curl_close($ch);
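The signIn() call above is left for you to implement; a hedged sketch of what a standalone version might look like, using cURL's cookie jar so the session survives across requests (the URL, field names, and jar path are all assumptions):
function signIn($ch, $csrf_token) {
    // keep session cookies across requests in a local jar file
    curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/portal_cookies.txt');
    curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/portal_cookies.txt');
    curl_setopt($ch, CURLOPT_URL, 'https://your-URL/login');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'username' => 'user',
        'password' => 'pass',
        'csrf_token' => $csrf_token,
    )));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch); // perform the login POST; cookies now live in the jar
    return $ch;
}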
I have this problem when I'm trying to upload a file to Amazon S3; it gives me this error, but I don't seem to understand it:
Warning: curl_setopt() [function.curl-setopt]: CURLOPT_FOLLOWLOCATION cannot be activated when safe_mode is enabled or an open_basedir is set in /var/www/vhosts/??????/httpdocs/actions/S3.php on line 1257
There is a lengthy workaround posted in the comments to the curl functions:
http://php.net/manual/en/function.curl-setopt.php#102121
Though the better solution would be not to use cURL. (See PEAR HTTP_Request2 or Zend_Http for alternatives, or use PHP's built-in HttpRequest if available.)
The problem is exactly what it says in the error message: you have safe_mode or open_basedir enabled in php.ini. Either edit php.ini to disable whichever one of those you have on, or don't use PHP's flavor of cURL. If you can't edit php.ini, you'll have to find a new host or a new solution.
The best solution would be to get a new host. open_basedir isn't a great security feature (a good host will use the far better approach of setting up a jail). safe_mode is deprecated. So the best result will come from disabling both directives (or finding a new host if yours is unwilling to do so).
However, if that's not an option, you can always implement something like this (from a comment on php.net)...
I have a shorter and less safe variant of the workaround posted by mario, but you may find it useful for URLs with a known number of redirects (for example, Facebook Graph API image calls -- graph.facebook.com/4/picture):
function cURLRequest($url) {
    $ch = curl_init();
    // curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
    $result = curl_exec($ch);
    if ($result) {
        curl_close($ch);
        return $result;
    } else {
        $info = curl_getinfo($ch);
        curl_close($ch);
        // PHP safe mode fallback for a 302 redirect
        if (!empty($info['http_code']) && !empty($info['redirect_url'])) {
            return cURLRequest($info['redirect_url']);
        }
        return null;
    }
}
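For instance, with the Graph API image call mentioned above:
// follows the 302 from graph.facebook.com to the CDN image, even under safe mode
$image = cURLRequest('http://graph.facebook.com/4/picture');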
Use this cURL version:
//=================== compressed version ===============(https://github.com/tazotodua/useful-php-scripts/)
function get_remote_data($url, $post_paramtrs = false) {
    $c = curl_init();
    curl_setopt($c, CURLOPT_URL, $url);
    curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
    if ($post_paramtrs) {
        curl_setopt($c, CURLOPT_POST, TRUE);
        curl_setopt($c, CURLOPT_POSTFIELDS, "var1=bla&" . $post_paramtrs);
    }
    curl_setopt($c, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($c, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($c, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 6.1; rv:33.0) Gecko/20100101 Firefox/33.0");
    curl_setopt($c, CURLOPT_COOKIE, 'CookieName1=Value;');
    curl_setopt($c, CURLOPT_MAXREDIRS, 10);
    // only let curl follow redirects itself when safe_mode/open_basedir permit it
    $follow_allowed = (ini_get('open_basedir') || ini_get('safe_mode')) ? false : true;
    if ($follow_allowed) {
        curl_setopt($c, CURLOPT_FOLLOWLOCATION, 1);
    }
    curl_setopt($c, CURLOPT_CONNECTTIMEOUT, 9);
    curl_setopt($c, CURLOPT_REFERER, $url);
    curl_setopt($c, CURLOPT_TIMEOUT, 60);
    curl_setopt($c, CURLOPT_AUTOREFERER, true);
    curl_setopt($c, CURLOPT_ENCODING, 'gzip,deflate');
    $data = curl_exec($c);
    $status = curl_getinfo($c);
    curl_close($c);
    // rewrite relative src/href/action attributes in the fetched HTML to absolute URLs
    preg_match('/(http(|s)):\/\/(.*?)\/(.*\/|)/si', $status['url'], $link);
    $data = preg_replace('/(src|href|action)=(\'|\")((?!(http|https|javascript:|\/\/|\/)).*?)(\'|\")/si', '$1=$2' . $link[0] . '$3$4$5', $data);
    $data = preg_replace('/(src|href|action)=(\'|\")((?!(http|https|javascript:|\/\/)).*?)(\'|\")/si', '$1=$2' . $link[1] . '://' . $link[3] . '$3$4$5', $data);
    if ($status['http_code'] == 200) {
        return $data;
    } elseif ($status['http_code'] == 301 || $status['http_code'] == 302) {
        // manual redirect fallback for when FOLLOWLOCATION is blocked
        if (!$follow_allowed) {
            if (empty($redirURL)) {
                if (!empty($status['redirect_url'])) {
                    $redirURL = $status['redirect_url'];
                }
            }
            if (empty($redirURL)) {
                preg_match('/(Location:|URI:)(.*?)(\r|\n)/si', $data, $m);
                if (!empty($m[2])) { $redirURL = $m[2]; }
            }
            if (empty($redirURL)) {
                preg_match('/href\=\"(.*?)\"(.*?)here\<\/a\>/si', $data, $m);
                if (!empty($m[1])) { $redirURL = $m[1]; }
            }
            if (!empty($redirURL)) {
                $t = debug_backtrace();
                return call_user_func($t[0]["function"], trim($redirURL), $post_paramtrs);
            }
        }
    }
    return "ERRORCODE22 with $url!!<br/>Last status codes<b/>:" . json_encode($status) . "<br/><br/>Last data got<br/>:$data";
}