PHPMailer and asynchronous email sending - PHP

I read some other questions before asking here, because the other answers don't address my problem.
I have a custom-made CMS in PHP. For example, when I insert a new payment, the script sends a notification to every admin user:
function insert_payment() {
// CODE TO INSERT PAYMENT INSIDE MYSQL DB
$sql_payment = "INSERT INTO payments ($amount, ...) VALUES (?, ...);"
...
// NOTIFY ALL ADMINS
foreach ( $array_emails as $email ) {
send_email_to_admin($email, $subject, $body);
}
// redirect to dashboard
header("Location: " . $homepage);
}
Here is an example of the send_email_to_admin() function:
function send_email_to_admin($email, $subject, $body) {
// return example:
// $result = array(
//     "Error" => true or false,
//     "DateTimeOfSent" => datetime,
//     "Email" => string
// )
// SAVE RESULTS IN MYSQL DB (I need to record the results to be sure emails are sent without errors... the table can then be viewed on a specific page in the CMS admin panel)
$mail = new PHPMailer;
...
...
if(!$mail->send()) {
$result = array("Error" => true, "DateTimeOfSent" => date("Y-m-d"), "Email" => $mail);
} else {
$result = array("Error" => false, "DateTimeOfSent" => date("Y-m-d"), "Email" => $mail);
}
$sql_result = "INSERT INTO SentResult ("Error", "DateTimeSent", "Email", ...) VALUES ( $result['Error'], $result['DateTimeOfSent'], $result['Email'] )"
...
//end function
}
Now with 1 or 2 admins this is fine, but with many admins the time spent waiting for each send result is too long.
I'd like to hand the foreach loop off to a child process, if possible, so it can asynchronously run the entire loop of SENDING and SAVING the results to MySQL.
That way header("Location: " . $homepage) can be executed immediately.
Some additional info:
I'm on shared hosting, so I can't install packages or libraries.
I can only use the functions provided by the default PHP configuration.
I can't use a cron-based queue method because my host doesn't provide that service for free.
I'd like a solution that works on both an IIS Windows server and a Linux-based server.
I'd appreciate a small example script based on my code, because I've never used an async method in PHP and know nothing about it. :(
Sorry for my English.

You could implement a queue and process it asynchronously with a cURL call.
Instead of sending the emails directly from send_email_to_admin(), insert a new row into a dedicated SQL table, EmailQueue. Then write a recursive function that processes this queue (all emails waiting to be sent) until the EmailQueue table is empty.
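The EmailQueue table structure is not shown here; a minimal sketch of what it might look like (column names and types are assumptions) would be:
CREATE TABLE IF NOT EXISTS `emailQueue` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `email` VARCHAR(255) NOT NULL,
  `subject` VARCHAR(255) NOT NULL,
  `body` TEXT NOT NULL,
  PRIMARY KEY (`id`)
);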
insert payment:
...
// NOTIFY ALL ADMINS
foreach ( $array_emails as $email ) {
queue_email($email, $subject, $body);
}
curl_process_email_queue();
...
Make the cURL call to detach from the parent script (source):
function curl_process_email_queue() {
$c = curl_init();
curl_setopt($c, CURLOPT_URL, $url . '/send_queued_emails.php'); // $url = your site's base URL
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true); // Follow the redirects (needed for mod_rewrite)
curl_setopt($c, CURLOPT_HEADER, false); // Don't retrieve headers
curl_setopt($c, CURLOPT_NOBODY, true); // Don't retrieve the body
curl_setopt($c, CURLOPT_RETURNTRANSFER, true); // Return from curl_exec rather than echoing
curl_setopt($c, CURLOPT_FRESH_CONNECT, true); // Always ensure the connection is fresh
// Timeout super fast once connected, so it goes into async.
curl_setopt( $c, CURLOPT_TIMEOUT, 1 );
return curl_exec( $c );
}
queue email:
function queue_email($email, $subject, $body) {
$sql = "INSERT INTO emailQueue ("email", "subject", "body") VALUES ($email, $subject, $body)";
...
};
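Since the values above go straight into the SQL string, a safer sketch of queue_email() using a mysqli prepared statement (the $mysqli connection object is an assumption, not part of the original code):
function queue_email($mysqli, $email, $subject, $body) {
    // parameterized INSERT; the driver handles quoting and escaping
    $stmt = $mysqli->prepare("INSERT INTO emailQueue (email, subject, body) VALUES (?, ?, ?)");
    $stmt->bind_param("sss", $email, $subject, $body);
    $stmt->execute();
    $stmt->close();
}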
A separate PHP script, send_queued_emails.php, is called via URL by cURL; it actually sends the queued emails (recursively, until the queue is empty):
<?php
// close connection early, but keep executing script
// https://stackoverflow.com/a/141026/5157195
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo('Some status message');
$size = ob_get_length();
header("Content-Length: $size");
header("Content-Encoding: none");
ob_end_flush();
flush();
// connection is closed at this point
// start actual processing here
send_queued_emails();
function send_queued_emails() {
// avoid concurrent access (assumes an open mysqli connection $link)
mysqli_query($link, 'START TRANSACTION');
// read one item from the queue
$sql = 'SELECT id, email, subject, body FROM emailQueue LIMIT 1';
$result = mysqli_query($link, $sql);
// if no more rows are found, exit the function
if (!$result || (mysqli_num_rows($result) == 0))
return;
// mail the queried data
$mail = new PHPMailer;
...
// optionally write the result back to the database
$sql_result = 'INSERT INTO SentResult ... ';
mysqli_query($link, $sql_result);
// delete the email from the queue
$sql = 'DELETE FROM emailQueue WHERE id = ...';
mysqli_query($link, $sql);
// commit transaction
mysqli_query($link, 'COMMIT');
// recursively call the function
send_queued_emails();
};
To improve reliability you may want to use transactions, to prevent issues from concurrent calls of the send_queued_emails.php script; one way to do that is sketched below. For other options, also see Methods for asynchronous processes in PHP 5.4.
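A sketch of such a locking read, assuming InnoDB tables and a $mysqli connection object (both assumptions, not part of the original answer):
$mysqli->begin_transaction();
// FOR UPDATE locks the selected row until COMMIT, so a second, concurrent
// run of this script blocks on the row instead of mailing the same entry twice
$result = $mysqli->query('SELECT id, email, subject, body FROM emailQueue LIMIT 1 FOR UPDATE');
if ($row = $result->fetch_assoc()) {
    // ... send the mail and write the SentResult row here ...
    $mysqli->query('DELETE FROM emailQueue WHERE id = ' . (int) $row['id']);
}
$mysqli->commit();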
EDIT: added the "close connection early, but keep executing script" technique as proposed in this thread. This should let you set an even higher timeout for the cURL call.
EDIT 2: added header("Content-Encoding: none"); as proposed by itajackass (refer to comments).

Related

Send FCM push notifications to specific devices in an Android app using a MySQL query as identification from a PHP script

I want to send FCM push notifications only to specific Android users, using their tokens saved in a MySQL database as identification. Here's my current progress.
PHP Script Snippet Code: Report_Status.php (File 1)
//Gets the token of every user and sends it to Push_User_Notification.php
while ($User_Row = mysqli_fetch_array($Retrieve_User, MYSQLI_ASSOC)){
$User_Token = $User_Row['User_Token'];
include "../Android_Scripts/Notifications/Push_User_Notification.php";
$message = "Your Report has been approved! Please wait for the fire fighters to respond!";
send_notification($User_Token, $message);
}
PHP code for File 2: Push_User_Notification.php
<?php //Send FCM push notifications process
include_once("../../System_Connector.php");
function send_notification ($tokens, $message)
{
$url = 'https://fcm.googleapis.com/fcm/send';
$fields = array(
'registration_ids' => $tokens,
'data' => $message
);
$headers = array(
'Authorization: key=API_ACCESS_KEY',
'Content-Type: application/json'
);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt ($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($fields));
$result = curl_exec($ch);
if ($result === FALSE) {
die('Curl failed: ' . curl_error($ch));
}
curl_close($ch);
}
?>
Problem:
The page always gets stuck on Report_Status.php every time I run the script. It is supposed to go into Push_User_Notification.php and return to Report_Status.php once the process is done. Am I wrong in how I call Push_User_Notification.php, or in the parameters I pass to it?
P.S.
Here's my full source code of Report_Status.php in case anyone wants to check it: Report_Status.php
I think the problem may be that you are sending a lot of notifications to several devices in a short amount of time, and it might be getting picked up as spamming. My suggestion is to send one notification to multiple devices.
Try changing your code in Report_Status.php to this:
include "../Android_Scripts/Notifications/Push_User_Notification.php";
$message = "Your Report has been approved! Please wait for the fire fighters to respond!";
while ($User_Row = mysqli_fetch_array($Retrieve_User, MYSQLI_ASSOC)){
$User_Token[] = $User_Row['User_Token'];
}
$tokens = implode(",", $User_Token);
send_notification($tokens, $message);
The idea is that you collect the user tokens in the $User_Token[] array, then comma-separate the tokens and send the message once to all the devices associated with those tokens. FCM allows you to send to multiple tokens in one go.
Updated
$User_Token needs to be an array, so remove the implode; that was my mistake.
Secondly, $message needs to be in the following format:
$message = array(
'title' => 'This is a title.',
'body' => 'Here is a message.'
);
Also note that there are 2 types of messages you can send using FCM: Notification Messages and Data Messages. Read more here: https://firebase.google.com/docs/cloud-messaging/concept-options
I don't know if your app handles the receipt of messages (i.e. whether you have implemented the onMessageReceived method), so I would suggest making a small change to the $fields array in the send_notification function. Adding the notification field allows Android to handle notifications automatically while your app is in the background, so make sure your app is in the background when testing. https://firebase.google.com/docs/cloud-messaging/android/receive
$fields = array(
'registration_ids' => $tokens,
'data' => $message,
'notification' => $message
);
So try the code below. I have tried and tested it, and it works for me. If it does not work, echo $result in the send_notification function to get the error message: echo $result = curl_exec($ch); Then we can work from there to see what is wrong. You can see what the errors mean here: https://firebase.google.com/docs/cloud-messaging/http-server-ref#error-codes
include "../Android_Scripts/Notifications/Push_User_Notification.php";
$message = array(
'title' => 'Report Approved',
'body' => 'Your Report has been approved! Please wait for the fire fighters to respond!'
);
while ($User_Row = mysqli_fetch_array($Retrieve_User, MYSQLI_ASSOC)){
$User_Token[] = $User_Row['User_Token'];
}
send_notification($User_Token, $message);

Google Sitemap Ping Success [closed]

I have a php script that creates an xml sitemap. At the end, I use
shell_exec('ping -c1 www.google.com/webmasters/tools/ping?sitemap=sitemapurl');
to submit the updated sitemap to Google Webmaster tools.
Having read the Google documentation, I'm unsure whether I need to do this each time or not. Entering the link manually results in a success page from Google, but using the ping command I receive no confirmation. I would also like to know if there is any way of checking whether the command has actually worked.
Here is a script to automatically submit your sitemap to Google, Bing/MSN and Ask:
/*
* Sitemap Submitter
* Use this script to submit your site maps automatically to Google, Bing/MSN and Ask
* Trigger this script on a schedule of your choosing or after your site map gets updated.
*/
//Set this to be your site map URL
$sitemapUrl = "http://www.example.com/sitemap.xml";
// cUrl handler to ping the Sitemap submission URLs for Search Engines…
function myCurl($url){
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
$httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
return $httpCode;
}
//Google
$url = "http://www.google.com/webmasters/sitemaps/ping?sitemap=".$sitemapUrl;
$returnCode = myCurl($url);
echo "<p>Google Sitemaps has been pinged (return code: $returnCode).</p>";
//Bing / MSN
$url = " https://www.bing.com/webmaster/ping.aspx?siteMap=".$sitemapUrl;
$returnCode = myCurl($url);
echo "<p>Bing / MSN Sitemaps has been pinged (return code: $returnCode).</p>";
//ASK
$url = "http://submissions.ask.com/ping?sitemap=".$sitemapUrl;
$returnCode = myCurl($url);
echo "<p>ASK.com Sitemaps has been pinged (return code: $returnCode).</p>";
You can also send yourself an email if the submission fails:
function return_code_check($pingedURL, $returnedCode) {
$to = "webmaster#yoursite.com";
$subject = "Sitemap ping fail: ".$pingedURL;
$message = "Error code ".$returnedCode.". Go check it out!";
$headers = "From: hello#yoursite.com";
if($returnedCode != "200") {
mail($to, $subject, $message, $headers);
}
}
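Wiring the two together, each ping would then be followed by the check, e.g.:
$returnCode = myCurl($url);
return_code_check($url, $returnCode);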
Hope that helps
Since commands like shell_exec(), exec(), passthru(), etc. are blocked by many hosts, you should use cURL and check for a response code of 200.
You could also use fsockopen if cURL is not available. I'll look for the code snippet and update the answer when I find it.
UPDATE:
Found it. I knew I had used it somewhere. The funny coincidence: it was in my Sitemap class xD
You can find it here on GitHub: https://github.com/func0der/Sitemap. It is in the Sitemap\SitemapOrg class.
There is also an example of the cURL call implemented there.
Either way, here is the code for a stand-alone implementation.
/**
 * Call url with fsockopen and return the response status.
 *
 * @param string $url
 *   The url to call.
 *
 * @return mixed(boolean|int)
 *   The http status code of the response. FALSE if something went wrong.
 */
function _callWithFSockOpen($url) {
$result = FALSE;
// Parse url.
$url = parse_url($url);
// Append query to path.
$url['path'] .= '?'.$url['query'];
// Setup fsockopen.
$port = 80;
$timeout = 10;
$fso = fsockopen($url['host'], $port, $errno, $errstr, $timeout);
// Proceed if connection was successfully opened.
if ($fso) {
// Create headers.
$headers = 'GET ' . $url['path'] . ' HTTP/1.0' . "\r\n";
$headers .= 'Host: ' . $url['host'] . "\r\n";
$headers .= 'Connection: close' . "\r\n";
$headers .= "\r\n";
// Write headers to socket.
fwrite($fso, $headers);
// Set timeout for stream read/write.
stream_set_timeout($fso, $timeout);
// Use a loop in case something unexpected happens.
// I do not know what, but that's why it is unexpected.
while (!feof($fso)){
// 128 bytes is getting the header with the http response code in it.
$buffer = fread($fso, 128);
// Filter only the http status line (first line) and break loop on success.
if(!empty($buffer) && ($buffer = substr($buffer, 0, strpos($buffer, "\r\n")))){
break;
}
}
// Match status.
preg_match('/^HTTP.+\s(\d{3})/', $buffer, $match);
// Extract status.
list(, $status) = $match;
$result = $status;
}
else {
// #XXX: Throw exception here??
}
return (int) $result;
}
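Usage would look something like this ($sitemapUrl as defined in the earlier script):
$status = _callWithFSockOpen('http://www.google.com/webmasters/tools/ping?sitemap=' . urlencode($sitemapUrl));
if ($status !== 200) {
    // non-200 (or 0 on connection failure): the ping did not go through
}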
If you find any flaws or possible improvements in this code, don't hesitate to open a ticket/pull request on GitHub. ;)
Simplest solution: file_get_contents("https://www.google.com/webmasters/tools/ping?sitemap={$sitemap}");
That will work on every major hosting provider. If you want optional error reporting, here's a start:
$data = file_get_contents("https://www.google.com/webmasters/tools/ping?sitemap={$sitemap}");
$status = ( strpos($data,"Sitemap Notification Received") !== false ) ? "OK" : "ERROR";
echo "Submitting Google Sitemap: {$status}\n";
As for how often you should do it, as long as your site can handle the extra traffic from Google's bots without slowing down, you should do this every time a change has been made.

Copying images from live server to local

I have around 600k image URLs in different tables and am downloading all the images with the code below, and it is working fine. (I know FTP is the best option, but somehow I can't use it.)
$queryRes = mysql_query("SELECT url FROM tablName LIMIT 50000"); // everytime I am using LIMIT
while ($row = mysql_fetch_object($queryRes)) {
$info = pathinfo($row->url);
$fileName = $info['filename'];
$fileExtension = $info['extension'];
try {
copy("http:".$row->url, "img/$fileName"."_".$row->id.".".$fileExtension);
} catch(Exception $e) {
echo "<br/>\n unable to copy '$fileName'. Error:$e";
}
}
The problems are:
After some time, say 10 minutes, the script gives a 503 error, but it still continues downloading the images. Why doesn't it stop copying?
It does not download all the images; each time there is a difference of 100 to 150 images. How can I trace which images were not downloaded?
I hope I have explained this well.
First of all, copy will not throw any exception, so you are not doing any error handling; that's why your script continues to run.
Second, you should use file_get_contents, or even better, cURL.
For example, you could try this function (I know it opens and closes cURL every time; it's just an example I found here: https://stackoverflow.com/a/6307010/1164866):
function getimg($url) {
$headers[] = 'Accept: image/gif, image/x-bitmap, image/jpeg, image/pjpeg';
$headers[] = 'Connection: Keep-Alive';
$headers[] = 'Content-type: application/x-www-form-urlencoded;charset=UTF-8';
$user_agent = 'php';
$process = curl_init($url);
curl_setopt($process, CURLOPT_HTTPHEADER, $headers);
curl_setopt($process, CURLOPT_HEADER, 0);
curl_setopt($process, CURLOPT_USERAGENT, $user_agent);
curl_setopt($process, CURLOPT_TIMEOUT, 30);
curl_setopt($process, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($process, CURLOPT_FOLLOWLOCATION, 1);
$return = curl_exec($process);
curl_close($process);
return $return;
}
Or, even better, try to do it with curl_multi_exec and get your files downloaded in parallel, which will be a lot faster.
Take a look here:
http://www.php.net/manual/en/function.curl-multi-exec.php
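To illustrate, a minimal curl_multi sketch (the URL list and what you do with the downloaded bytes are assumptions; this is an example, not code tested against your tables):
function download_parallel(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }
    // drive all transfers until every handle has finished
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // wait for activity instead of busy-looping
    } while ($running > 0);
    $results = array();
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}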
Edit:
To track which files failed to download, you need to do something like this:
$queryRes = mysql_query("SELECT id, url FROM tablName LIMIT 50000"); // every time using LIMIT
while($row = mysql_fetch_object($queryRes)) {
$info = pathinfo($row->url);
$fileName = $info['filename'];
$fileExtension = $info['extension'];
if (!@copy("http:".$row->url, "img/$fileName"."_".$row->id.".".$fileExtension)) {
$errors= error_get_last();
echo "COPY ERROR: ".$errors['type'];
echo "<br />\n".$errors['message'];
//you can add whatever code you want here... output to console, log to a file, or exit() to stop downloading...
}
}
more info: http://www.php.net/manual/es/function.copy.php#83955
I haven't used copy myself; I'd use file_get_contents, which works fine with remote servers.
Edit:
It also returns false on failure, so...
if( false === file_get_contents(...) )
trigger_error(...);
I think 50000 is too large. Networking is time-consuming; downloading an image might take over 100 ms (depending on your network condition), so 50000 images might, in the most stable case (without timeouts or other errors), take 50000*100/1000/60 = 83 minutes. That's really a long time for a PHP script. If you run this script as CGI (not CLI), you normally only get 30 seconds by default (without set_time_limit). So I recommend making this script a cron job and running it every 10 seconds to fetch about 50 URLs each time.
To make the script fetch only a few images each run, you must remember which ones have already been processed (successfully). For example, you can add a flag column to the url table: by default flag = 1; if a URL is processed successfully it becomes 2, or it becomes 3, which means something went wrong with the URL. Each time, the script selects only the ones with flag = 1 (3 might also be included, but sometimes a URL may be so wrong that retrying won't work).
The copy function is too simple; I recommend using cURL instead. It's more reliable, and you get the exact network info for the download.
Here is the code:
//only fetch 50 urls each time
$queryRes = mysql_query ( "select id, url from tablName where flag=1 limit 50" );
//just prefer absolute path
$imgDirPath = dirname ( __FILE__ ) . '/';
while ( $row = mysql_fetch_object ( $queryRes ) )
{
$info = pathinfo ( $row->url );
$fileName = $info ['filename'];
$fileExtension = $info ['extension'];
//url in the table is like //www.example.com???
$result = fetchUrl ( "http:" . $row->url,
$imgDirPath . "img/$fileName" . "_" . $row->id . "." . $fileExtension );
if ($result !== true)
{
echo "<br/>\n unable to copy '$fileName'. Error:$result";
//update flag to 3, finish this func yourself
set_row_flag ( 3, $row->id );
}
else
{
//update flag to 2
set_row_flag ( 2, $row->id );
}
}
function fetchUrl($url, $saveto)
{
$ch = curl_init ( $url );
curl_setopt ( $ch, CURLOPT_FOLLOWLOCATION, true );
curl_setopt ( $ch, CURLOPT_MAXREDIRS, 3 );
curl_setopt ( $ch, CURLOPT_HEADER, false );
curl_setopt ( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt ( $ch, CURLOPT_CONNECTTIMEOUT, 7 );
curl_setopt ( $ch, CURLOPT_TIMEOUT, 60 );
$raw = curl_exec ( $ch );
$error = false;
if (curl_errno ( $ch ))
{
$error = curl_error ( $ch );
}
else
{
$httpCode = curl_getinfo ( $ch, CURLINFO_HTTP_CODE );
if ($httpCode != 200)
{
$error = 'HTTP code not 200: ' . $httpCode;
}
}
curl_close ( $ch );
if ($error)
{
return $error;
}
file_put_contents ( $saveto, $raw );
return true;
}
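set_row_flag() is left as an exercise above; a minimal version in the same mysql_* style as the rest of the answer, assuming the flag column already exists (e.g. via ALTER TABLE tablName ADD flag TINYINT NOT NULL DEFAULT 1):
function set_row_flag($flag, $id)
{
    // mark the row as successfully processed (2) or failed (3)
    $flag = (int) $flag;
    $id = (int) $id;
    mysql_query("UPDATE tablName SET flag = {$flag} WHERE id = {$id}");
}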
Strict checking of the mysql_fetch_object return value is IMO better, as many similar functions may return a non-boolean value evaluating to false when checked loosely (e.g. via !=).
You do not fetch the id attribute in your query, so your code should not work as you wrote it.
You define no order for the rows in the result. It is almost always desirable to have an explicit order.
The LIMIT clause leads to processing only a limited number of rows. If I get it correctly, you want to process all the URLs.
You are using a deprecated API to access MySQL. You should consider using a more modern one. See the database FAQ at PHP.net. I did not fix this one.
As already said multiple times, copy does not throw; it returns a success indicator.
Variable expansion was clumsy. This one is a purely cosmetic change, though.
To be sure the generated output gets to the user ASAP, use flush. When using output buffering (ob_start etc.), it needs to be handled too.
With fixes applied, the code now looks like this:
$queryRes = mysql_query("SELECT id, url FROM tablName ORDER BY id");
while (($row = mysql_fetch_object($queryRes)) !== false) {
$info = pathinfo($row->url);
$fn = $info['filename'];
if (copy(
'http:' . $row->url,
"img/{$fn}_{$row->id}.{$info['extension']}"
)) {
echo "success: $fn\n";
} else {
echo "fail: $fn\n";
}
flush();
}
Issue #2 is solved by this: you will see which files were and were not copied. If the process (and its output) stops too early, you know the id of the last processed row and can query your DB for the higher ones (not yet processed). Another approach is adding a boolean column copied to tblName and updating it immediately after successfully copying the file; then you may want to change the query in the code above to exclude rows that already have copied = 1 set, as sketched below.
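A sketch of that copied-flag variant (column name is an assumption):
// one-time migration
mysql_query("ALTER TABLE tblName ADD copied TINYINT(1) NOT NULL DEFAULT 0");
// select only the rows that still need copying
$queryRes = mysql_query("SELECT id, url FROM tblName WHERE copied = 0 ORDER BY id");
// ... and after each successful copy():
mysql_query("UPDATE tblName SET copied = 1 WHERE id = " . (int) $row->id);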
Issue #1 is addressed in Long computation in php results in 503 error here on SO and 503 service unavailable when debugging PHP script in Zend Studio on SU. I would recommend splitting the large batch into smaller ones, launched at a fixed interval. Cron seems like the best option to me. Is there any need to launch this huge batch from a browser? It will run for a very long time.
It is better handled batch by batch.
The actual script
Table structure
CREATE TABLE IF NOT EXISTS `images` (
`id` int(60) NOT NULL AUTO_INCREMENT,
`link` varchar(1024) NOT NULL,
`status` enum('not fetched','fetched') NOT NULL DEFAULT 'not fetched',
`timestamp` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`id`)
);
The script
<?php
// how many images to download in one go?
$limit = 100;
/* if set to true, the scraper reloads itself. Good for running on localhost without cron job support. Just keep the browser open and the script runs by itself ( javascript is needed) */
$reload = false;
// to prevent php timeout
set_time_limit(0);
// db connection ( you need pdo enabled)
try {
$host = 'localhost';
$dbname= 'mydbname';
$user = 'root';
$pass = '';
$DBH = new PDO("mysql:host=$host;dbname=$dbname", $user, $pass);
}
catch(PDOException $e) {
echo $e->getMessage();
}
$DBH->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );
// get n number of images that are not fetched
$query = $DBH->prepare("SELECT * FROM images WHERE status = 'not fetched' LIMIT {$limit}");
$query->execute();
$files = $query->fetchAll();
// if no result, don't run
if(empty($files)){
echo 'All files have been fetched!!!';
die();
}
// where to save the images?
$savepath = dirname(__FILE__).'/scrapped/';
// fetch 'em!
foreach($files as $file){
// get_url_content uses curl. Function defined later-on
$content = get_url_content($file['link']);
// get the file name from the url. You can use random name too.
$url_parts_array = explode('/' , $file['link']);
/* assuming the image url is http://abc.com/images/myimage.png : if we explode the string by /, the last element of the exploded array will hold the filename */
$filename = $url_parts_array[count($url_parts_array) - 1];
// save fetched image
file_put_contents($savepath.$filename , $content);
// did the image save?
if(file_exists($savepath.$filename))
{
// yes? Okay, let's save the status
$query = $DBH->prepare("update images set status = 'fetched' WHERE id = ".$file['id']);
// output the name of the file that just got downloaded
echo $file['link']; echo '<br/>';
$query->execute();
}
}
// function definition get_url_content()
function get_url_content($url){
// ummm let's make our bot look like human
$agent= 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.0.3705; .NET CLR 1.1.4322)';
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, $agent);
curl_setopt($ch, CURLOPT_URL,$url);
return curl_exec($ch);
}
//reload enabled? Reload!
if($reload)
echo '<script>location.reload(true);</script>';
503 is a fairly generic error, which in this case probably means something timed out. This could be your web server, a proxy somewhere along the way, or even PHP.
You need to identify which component is timing out. If it's PHP, you can use set_time_limit.
Another option might be to break the work up so that you only process one file per request, then redirect back to the same script to continue processing the rest. You would have to somehow maintain a list of which files have been processed between calls. Or process in order of database id, and pass the last used id to the script when you redirect.
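A rough sketch of that last idea, with the script name copy-images.php and the last_id parameter made up for illustration:
$lastId = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;
$res = mysql_query("SELECT id, url FROM tablName WHERE id > {$lastId} ORDER BY id LIMIT 1");
if ($row = mysql_fetch_object($res)) {
    // ... copy this single image here ...
    // then hand off to a fresh request before any timeout can hit
    header("Location: copy-images.php?last_id=" . $row->id);
    exit;
}
echo "All images processed.";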

Execute HTTP Post automatically

I have a free script and would like to ask whether it's possible to replace or automate the search function, for example every hour. Right now I have to press the search button to find new proxies, but I want the search to run automatically and update them in my database, maybe using a cron job.
if(isset($_POST['search'])) { // hit search button
$script_start = $pb->microtime_float();
ob_flush();
flush();
$proxylisttype = $pb->returnProxyList($_REQUEST['listtype']); // make sure request vars are clean
$sitestoscour = $pb->returnSitesScour($_REQUEST); // make sure request vars are clean
$finallist = $pb->returnFinalList($sitestoscour);
$finallist = $pb->arrayUnique($finallist); // eliminate the dupes before moving on
if(AUTO_BAN == 1) { // remove banned proxies
$finallist = $pb->autoBan($finallist);
}
$script_end = $pb->microtime_float(); // stop the timer
}
You can do it with cURL, either from a PHP script or from the command line (or with wget). That way you can set the $_POST fields:
$ch = curl_init();
curl_setopt($ch,CURLOPT_URL, "http://yoururl.com");
curl_setopt($ch,CURLOPT_POST, true);
curl_setopt($ch,CURLOPT_POSTFIELDS, "search=your_query");
$result = curl_exec($ch);
curl_close($ch);
Then make that script run every hour by setting up a cron job.
You could also do it with wget:
wget --post-data="search=query" http://yoururl.com
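Either variant can then be scheduled with cron; a sample crontab entry running the PHP version at the top of every hour (the binary and script paths are assumptions):
0 * * * * /usr/bin/php /path/to/search-cron.php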

PHP script to check webserver status using a cron job

I'm looking for a PHP script that can be run as a cron job on my web host. It needs to run through a list of websites and check that each returns an HTTP 200 OK response. If a site doesn't return that response, or isn't available, it needs to send an email to the website admin.
I've since refined this script to check whether your website/webserver is still up and running. I've improved the error handling slightly and added a comfort email to let you know that the script is running successfully.
The comfort email relies on another file, healthcheck.txt, to store some values until the script runs next time. If it doesn't get created automatically, just create a 0-byte text file, upload it, and set the correct (read/write) file permissions on it.
<?php
// set email server parameters
ini_set('sendmail_from', 'server.status@host.example.com' );
ini_set('SMTP', '127.0.0.1' );
ini_set('smtp_port', '25' );
ini_set('allow_url_fopen', true); //enable fopen
// define list of webservers to check
$webservers = array('www.example.com', 'www.example2.com');
function sendemail($subject,$message) // email function using standard php mail
{
$wrapmessage = wordwrap($message,70,"\n",true); // mail function can't support a message more than 70 characters per line
$to = 'you@example.com'; // who to send the emails to
// Headers ensure a properly formatted email
$headers = 'From: server.status@host.example.com' . "\r\n" .
'Reply-To: server.status@host.example.com' . "\r\n" .
'X-Mailer: PHP/' . phpversion();
return mail($to, $subject, $wrapmessage, $headers); //send the email
}
function getresponse($url) //queries a url and provides the header returned and header response
{
$ch = curl_init(); // create cURL handle (ch)
if (!$ch) { // send an email if curl can't initialise
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url.". Curl did not initialise correctly and issued the error - ".curl_error($ch)." The script has died and not completed any more tasks.";
sendemail($subject,$message);
die();
}
// set some cURL options
$ret = curl_setopt($ch, CURLOPT_URL, "http://".$url."/");
$ret = curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
$ret = curl_setopt($ch, CURLOPT_HEADER, true);
$ret = curl_setopt($ch, CURLOPT_NOBODY, true);
$ret = curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$ret = curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
$ret = curl_setopt($ch, CURLOPT_TIMEOUT, 30);
// execute
$ret = curl_exec($ch);
if (empty($ret)) {
// some kind of an error happened
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url.". Curl was trying to execute and issued the error - ".curl_error($ch)." Further URLs will be tried.";
sendemail($subject,$message);
curl_close($ch); // close cURL handler
} else {
$info = curl_getinfo($ch); //get header info - output is an array
curl_close($ch); // close cURL handler
if (empty($info['http_code'])) {
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url."\r\nNo HTTP code was returned";
sendemail($subject,$message);
} else {
// load the HTTP code descriptions
$http_codes = parse_ini_file("/server/path/to/http-response-codes.ini");
// results - code number and description
$result = $info['http_code'] . " " . $http_codes[$info['http_code']];
return $result; // $result contained a code, so return it
}
return null; // $info was empty so return nothing
}
return null; // $ret was empty so return nothing
}
// this bit of code initiates the checking of the web server
foreach ($webservers as $webserver) { //loop through the array of webservers
$status = getresponse($webserver); //get the status of the webserver
if (empty($status)) {
// nothing happens here because if $status is empty, the function returned nothing and an email was already sent.
} else {
if (strstr($status, "200")) { //search for the error code that means everything is ok
// If found, don't do anything, just process the next one
} else {
$timestamp = date("m/d/Y H:i:s a", time()); //get the current date and time
$error = $webserver." - ".$status." status error detected"; //set error message with server and response code
$message = "At - ".$timestamp." - a http response error was detected on ".$webserver.".\r\nInstead of a 200 OK response, the server returned ".$status."\r\nThis requires immediate attention!"; //At what time was an error detected on which server and what was the error message
sendemail($error,$message); //trigger the sendemail function
}
}
}
// Health Check. Comfort email twice a day to show script is actually running.
$healthfile = "/server/path/to/healthcheck.txt"; // path with the name of the file to store array data
$hfsize = filesize($healthfile); // filesize of healthcheck file
$notify = "16:00"; // specify the earliest time in the day to send the email - cron job settings dictate how close you'll get to this
$datenow = date("d-m-Y"); //what is current date as of now
if (file_exists($healthfile) && $hfsize !== 0) { //read contents of array from file if it exists and has data, otherwise create array with some defaults
$valuestor = unserialize(file_get_contents($healthfile));
} else { // file doesn't exist so we'll create an array with some defaults
$valuestor = array("email_sent"=>0, "sent_date"=>$datenow, "iterations"=>0);
}
$i = $valuestor['iterations']; //get the iterations number from the valuestor array
$curdate = strtotime($datenow); //convert current date to seconds for comparison
$stordate = strtotime($valuestor['sent_date']); //convert stored date to seconds
if ($valuestor['email_sent'] == 1) { // has the email already been sent today
if ($curdate == $stordate) { // if it has, is the current date equal to the stored date
$i++; // yes it is, just increment the iterations
} else { // it's a new day, reset the array
$timestamp = date("m/d/Y H:i:s a", time()); //get the current date and time
$subject = "Web Server Checking Script Health Status"; //set email subject line
$message = "Message created: ".$timestamp."\r\nThe Web Server Checking script ran successfully for ".$i." time(s) on the ".$valuestor['sent_date']; //email message
sendemail($subject,$message); //trigger the sendemail function
$valuestor['email_sent'] = 0; // set email sent to false
$valuestor['sent_date'] = $datenow; // set email send date to today
$i = 1; // this is the first time the script has run today, so reset i to 1. It gets written to the array later.
// echo $message;
}
} else { // email has not been sent today
$checktime = strtotime($notify); //convert $notify time (for current date) into seconds since the epoch
if (time() >= $checktime) { // are we at or have we gone past checktime
$i++; // increase the number of script iterations by 1
$timestamp = date("m/d/Y H:i:s a", time()); //get the current date and time
$subject = "Web Server Checking Script Health Status"; //set email subject line
$message = "Message created: ".$timestamp."\r\nThe Web Server Checking script has successfully run and completed ".$i." time(s) today."; //email message
sendemail($subject,$message); //trigger the sendemail function
$valuestor['email_sent'] = 1; // set array to show that email has gone
// echo $message;
} else { // we haven't reached the check time yet
$i++; // just increment the iterations
}
}
$valuestor['iterations'] = $i; // update the array with the iterations number
// save the array to the file again
$fp = fopen($healthfile, 'w+'); // open or create the file, clear its contents and write to it
if (!$fp) { // handle the error with an email if the file won't open
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when trying to open or create the file ".$healthfile." The script was ended without any new information being stored.";
sendemail($subject,$message);
} else {
fwrite($fp, serialize($valuestor)); // write to the file and serialise the array valuestor
fclose($fp); // close the file connection
}
die(); // make sure that script dies and cron job terminates
?>
I found it took me a while to research a good answer to this question. So for the benefit of the community, here's what I came up with after research on Stackoverflow and other forums.
You need two files for this to work: the PHP script that you execute via cron, and an ini file that contains detailed descriptions of what the HTTP response codes mean.
I hope this is of use to others.
server-check.php
<?php
// set email server parameters
ini_set('sendmail_from', 'server.status@host.example.com' );
ini_set('SMTP', '127.0.0.1' );
ini_set('smtp_port', '25' );
// define list of webservers to check
$webservers = array('www.example.com', 'www.example2.com');
function sendemail($subject,$message) // email function using standard php mail
{
$wrapmessage = wordwrap($message,70,"\n",true); // mail function can't support a message more than 70 characters per line
$to = 'you@example.com'; // who to send the emails to
// Headers ensure a properly formatted email
$headers = 'From: server.status@host.example.com' . "\r\n" .
'Reply-To: server.status@host.example.com' . "\r\n" .
'X-Mailer: PHP/' . phpversion();
return mail($to, $subject, $wrapmessage, $headers); //send the email
}
function getresponse($url) //queries a url and provides the header returned and header response
{
$ch = curl_init(); // create cURL handle (ch)
if (!$ch) {
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url."\r\nCouldn't initialize a cURL handle";
sendemail($subject,$message);
die();
}
// set some cURL options
$ret = curl_setopt($ch, CURLOPT_URL, "http://".$url."/");
$ret = curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
$ret = curl_setopt($ch, CURLOPT_HEADER, true);
$ret = curl_setopt($ch, CURLOPT_NOBODY, true);
$ret = curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$ret = curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
$ret = curl_setopt($ch, CURLOPT_TIMEOUT, 30);
// execute
$ret = curl_exec($ch);
if (empty($ret)) {
// some kind of an error happened
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url."\r\ncurl_error ".$ch;
sendemail($subject,$message);
curl_close($ch); // close cURL handler
die();
} else {
$info = curl_getinfo($ch);
curl_close($ch); // close cURL handler
if (empty($info['http_code'])) {
$subject = "Web Server Checking Script Error";
$message = "The web server checking script issued an error when it tried to process ".$url."\r\nNo HTTP code was returned";
sendemail($subject,$message);
die();
} else {
// load the HTTP codes
$http_codes = parse_ini_file("/server/path/to/http-response-codes.ini");
// results
$result = $info['http_code'] . " " . $http_codes[$info['http_code']];
}
}
return $result;
}
foreach ($webservers as $webserver) { //loop through the array of webservers
$status = getresponse($webserver); //get the status of the webserver
if (strstr($status, "200")) { //search for the status code that means everything is ok
continue; // Don't do anything, just process the next one
} else {
$timestamp = date("m/d/Y H:i:s a", time()); //get the current date and time
$error = $webserver." - ".$status." status error detected"; //set error message with server and response code
$message = "At - ".$timestamp." - a http response error was detected on ".$webserver.".\r\nInstead of a 200 OK response, the server returned ".$status."\r\nThis requires immediate attention!"; //At what time was an error detected on which server and what was the error message
sendemail($error,$message); //trigger the sendemail function
}
}
?>
http-response-codes.ini
[Informational 1xx]
100="Continue"
101="Switching Protocols"
[Successful 2xx]
200="OK"
201="Created"
202="Accepted"
203="Non-Authoritative Information"
204="No Content"
205="Reset Content"
206="Partial Content"
[Redirection 3xx]
300="Multiple Choices"
301="Moved Permanently"
302="Found"
303="See Other"
304="Not Modified"
305="Use Proxy"
306="(Unused)"
307="Temporary Redirect"
[Client Error 4xx]
400="Bad Request"
401="Unauthorized"
402="Payment Required"
403="Forbidden"
404="Not Found"
405="Method Not Allowed"
406="Not Acceptable"
407="Proxy Authentication Required"
408="Request Timeout"
409="Conflict"
410="Gone"
411="Length Required"
412="Precondition Failed"
413="Request Entity Too Large"
414="Request-URI Too Long"
415="Unsupported Media Type"
416="Requested Range Not Satisfiable"
417="Expectation Failed"
[Server Error 5xx]
500="Internal Server Error"
501="Not Implemented"
502="Bad Gateway"
503="Service Unavailable"
504="Gateway Timeout"
505="HTTP Version Not Supported"
