Please don't laugh at me, but I believe I just did something extremely stupid. I was working on setting up a newsletter for a site I'm building, and I ran it for a test while there was still a typo in it. While scanning through the database and sending emails, I messed up the part that makes the loop stop. I have since fixed the code, but the emails are still being sent (to my mom :O) and they don't seem to be stopping.
This is the script when I executed it:
$message = $_POST['emailmessage'];
$subject = $_POST['subject'];
$query = mysql_query("SELECT `email` FROM `members` WHERE `active`='1'");
//This line underneath should not be there
$rows = mysql_fetch_assoc($query);
$headers = array(
"From: contact#thestopitcampaign.com",
"Content-Type: text/html"
);
//the condition should have been 'while($rows = mysql_fetch_assoc($query))' instead of 'while($rows)'
while($rows)
{
    mail($rows['email'], $subject, $message, implode("\r\n", $headers));
    echo "<p>Sent to: " . $rows['email'] . "</p>";
}
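For reference, the corrected loop fetches a fresh row on each iteration, so it stops once the result set is exhausted:

while($rows = mysql_fetch_assoc($query))
{
    mail($rows['email'], $subject, $message, implode("\r\n", $headers));
    echo "<p>Sent to: " . $rows['email'] . "</p>";
}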
I contacted FatCow to see if they could stop the script, but they said they could not do that; they would have to delete my entire account and put me on a different server. I cannot do that. Is there any way to generate an error or something that would make the rogue script stop? FYI, I do not have SSH access.
--I looked in my PHP config file and the timeout for a script is 300 seconds. That is enough time to send a lot of emails. Is there any way to stop those emails?
What has been sent can't be stopped any more. But it won't run forever and probably has already stopped.
If the server is not grossly misconfigured by the provider, your script didn't run any longer than a certain time limit, e.g. 60 seconds. Even though messages continue to come through, the script is probably no longer running; the mail server is just taking its time to handle all the messages it created.
I would wait and see - the flood is likely to end soon.
What the provider says about moving the account to a different server doesn't seem to make any sense at all: if there is a rogue process sending email, they should be able to kill it easily. But anyway... I would wait.
My guess is that the emails are just queued up and will eventually stop, as a PHP script won't keep executing after the page has stopped loading.
I'd suggest using a newsletter provider like MailChimp: they are more reliable, safe, and easy to integrate into your website.
Hope that helps!
I have a WordPress website with a working order system. Now I want to make an Android app which displays every new order in a list view as soon as the order is made.
Over the last two days I have thought about the following solutions:
1. Simple HTTP GET requests every 10 seconds
2. Websockets
3. MySQL binary log + Pusher
4. Server Sent Events
My thoughts (working with a LAMP stack):
1. Simple HTTP requests are obviously the most ineffective solution.
2. I figured out that websockets and Apache don't work well together.
3. Feels quite hacky, and I want to avoid any 3rd-party service if I can.
4. This looks like the optimal way for me; however, from what I have experienced there are some problems with Apache/PHP and Server Sent Events.
I tried to implement a simple demo script, but I don't understand why some examples use an infinite while loop to keep the connection open and others don't.
Here is an example without a loop, and here one with an infinite loop (also here).
In addition to that, when I tested the variant with the infinite loop, my whole page wouldn't load because of that sleep() call; it looks like the whole server freezes whenever I use it.
Does anyone have an idea how to fix that? Or do you have other suggestions?
This is the code that causes trouble (copied from here, with a missing curly bracket added):
<?php
// make session read-only
session_start();
session_write_close();
// disable default disconnect checks
ignore_user_abort(true);
// set headers for stream
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: *");
// Is this a new stream or an existing one?
$lastEventId = floatval(isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : 0);
if ($lastEventId == 0) {
    $lastEventId = floatval(isset($_GET["lastEventId"]) ? $_GET["lastEventId"] : 0);
}
echo ":" . str_repeat(" ", 2048) . "\n"; // 2 kB padding for IE
echo "retry: 2000\n";
// start stream
while (true) {
    if (connection_aborted()) {
        exit();
    }
    else {
        // here you will want to get the latest event id you have created on the server,
        // but for now we will increment and force an update
        $latestEventId = $lastEventId + 1;
        if ($lastEventId < $latestEventId) {
            echo "id: " . $latestEventId . "\n";
            echo "data: Howdy (" . $latestEventId . ") \n\n";
            $lastEventId = $latestEventId;
            ob_flush();
            flush();
        }
        else {
            // no new data to send
            echo ": heartbeat\n\n";
            ob_flush();
            flush();
        }
    }
    // 2 second sleep then carry on
    sleep(2);
}
?>
I'm thankful for any advice I can get! :)
EDIT:
The main idea is to frequently check my MySQL database for new entries, and if a new order is present, format the data nicely and send the information over SSE to my Android application.
I already found libraries to receive SSEs on Android; the main problem is on the server side.
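For illustration, a minimal sketch of that polling loop inside the SSE stream, assuming a hypothetical orders table with an auto-increment id column (table, column names, and credentials are made up; uses mysqli with the mysqlnd driver for get_result()):

<?php
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

// illustrative credentials and schema -- adjust to your setup
$db = new mysqli("localhost", "user", "pass", "shop");
$lastId = isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? (int)$_SERVER["HTTP_LAST_EVENT_ID"] : 0;

while (!connection_aborted()) {
    // fetch only orders newer than the last one we pushed
    $stmt = $db->prepare("SELECT id, customer, total FROM orders WHERE id > ? ORDER BY id");
    $stmt->bind_param("i", $lastId);
    $stmt->execute();
    $result = $stmt->get_result();
    while ($row = $result->fetch_assoc()) {
        echo "id: " . $row['id'] . "\n";
        echo "data: " . json_encode($row) . "\n\n";
        $lastId = $row['id'];
    }
    $stmt->close();
    @ob_flush();
    flush();
    sleep(2); // poll interval
}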
Based on your question, I think you could implement SSE (Server-Sent Events), which is part of the HTML5 standard. It is one-way communication from server to client. It needs HTML/JavaScript on the client and a backend language, e.g. PHP.
The client subscribes to events, and once the subscription is up and running the server sends updates whenever the input data changes. By default the client retries a dropped connection after 3 seconds; this can be adjusted.
I would recommend first building a basic, functioning web-browser client. When, and if, it works as you expect, only then judge the effort of building the client as an app.
You would probably need to add functions on the client side, such as starting/stopping the subscription.
My understanding of why people don't recommend combining Server-Sent Events with Apache is the lack of control over how many connections stay open and the continuous need to close them; each open SSE stream ties up an Apache worker, which could lead to severe server performance problems.
Using, for example, node.js would not have that problem.
Here are some links to get started:
MDN:
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
Stream Updates with Server-Sent Events:
https://www.html5rocks.com/en/tutorials/eventsource/basics/
I have a script that processes sending emails. There could be thousands of emails to send. I want to show the user a page that says something along the lines of "Your messages are being sent.", but I don't want that page to do the actual sending of the messages, because I don't want the user to see a blank page until the script finishes, and I also don't want the user to have to wait for the script to finish.
I would like to pass a list of IDs to another PHP page (let's call it run.php) and have it execute without the user actually visiting it. So I would pass the IDs to that page, which starts the sending, while the current page finishes loading and shows the message "Your messages are being sent.".
<?php
/* SESSION DATA HERE */
executeMailSend($ids);
?>
<html>
<head>
....
</head>
<body>
Your messages are being sent.
</body>
</html>
I think I could do something like this using AJAX, but I would prefer not to make this rely on client-side code.
Also, if at all possible, I need the script running in the background to keep session data from the session that starts it, and I would prefer not to have to pass this data along as separate variables.
Also, I don't want to use anything like exec() or system().
Any ideas?
Utilize the power of AJAX. For more information, check out this tutorial by w3schools.com:
http://www.w3schools.com/ajax/default.asp
A cron job is one option. You could do it by implementing a queue in the database: you'd just add entries for the mail to be sent, and the cron job processes those entries. But maybe read this before going down that road: http://www.engineyard.com/blog/2011/5-subtle-ways-youre-using-mysql-as-a-queue-and-why-itll-bite-you/
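For illustration, a minimal sketch of such a queue, assuming a hypothetical email_queue table with recipient/subject/body/sent_at columns (names and credentials are made up; uses mysqli rather than the old mysql_* functions):

<?php
// cron-driven queue processor: send a batch of pending mails, then mark them sent
$db = new mysqli("localhost", "user", "pass", "site");

$pending = $db->query(
    "SELECT id, recipient, subject, body FROM email_queue
     WHERE sent_at IS NULL ORDER BY id LIMIT 100"
);
while ($job = $pending->fetch_assoc()) {
    if (mail($job['recipient'], $job['subject'], $job['body'])) {
        // only mark the row sent if mail() accepted the message
        $stmt = $db->prepare("UPDATE email_queue SET sent_at = NOW() WHERE id = ?");
        $stmt->bind_param("i", $job['id']);
        $stmt->execute();
        $stmt->close();
    }
}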
If you use a cron job, you're going to be implementing a queue somewhere (flat file...blech, MySQL, other DB, etc.)
Why not use something that is actually made for queuing? For example you could use Amazon (http://aws.amazon.com/sqs/). But whatever software/provider you use, this sounds like it would be best with proper job queuing. You want the user to go to a page which will add some work items to your queue (which in this case are email addresses and maybe messages depending on exactly what you are doing). Then the queue software should have some means of processing these jobs (i.e. actually sending the emails).
You say you don't want to pass variables, but you'd likely have to send whatever data you need for each job to your queue. You could potentially store the extra data in your database; just think about locking and performance.
You could make your script add the emails to an email queue and schedule a cron job to send them. I don't think there is any other way to call a PHP script asynchronously.
Maybe this could point you in right direction: http://blog.markturansky.com/archives/205
If you have access to a Linux box, you can use cron jobs.
You can create a separate script with its own database table, which it uses for reference.
On a set interval, cron runs the script through the PHP CLI; no user needs to be on the page. The database needs to be configured properly, though.
The only problem with cron jobs is that all output is emailed to the administration email address (/etc/aliases).
Example of a crontab entry:
nano /etc/crontab
* * * * * root /usr/bin/php /var/www/cron.php
And in your cron.php you will have a script set up for a specific function, in this case emailing users.
http://www.thegeekstuff.com/2011/07/php-cron-job/
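If you want to avoid the emailed output mentioned above, a common idiom is to redirect it in the crontab entry:

* * * * * root /usr/bin/php /var/www/cron.php > /dev/null 2>&1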
I guess I will have to stick with my current option, which is a script I've used before. I was hoping to find a better option, but here is the code I am going to use:
function backgroundPost($url){
    $parts = parse_url($url);
    $query = isset($parts['query']) ? $parts['query'] : ''; // guard against URLs without a query string
    $fp = fsockopen($parts['host'], isset($parts['port']) ? $parts['port'] : 80, $errno, $errstr, 30);
    if (!$fp) {
        return false;
    } else {
        // fire-and-forget POST: write the raw request, then close without reading the response
        $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
        $out .= "Host: " . $parts['host'] . "\r\n";
        $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $out .= "Content-Length: " . strlen($query) . "\r\n";
        $out .= "Connection: Close\r\n\r\n";
        $out .= $query;
        fwrite($fp, $out);
        fclose($fp);
        return true;
    }
}
backgroundPost($siteURL.'/run.php?ids='.urlencode($ids).'&sessiondata='.urlencode($sessiondata));
Does anyone see a problem with using this? Security, bad code, etc?
Are you on a Linux server? You can fork the script (within PHP, via pcntl_fork()) and then daemonize the child process, detaching it so that it continues to run after the initiating HTTP request is finalized. Just make sure to kill the child processes correctly.
http://www.re-cycledair.com/php-dark-arts-daemonizing-a-process
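For illustration, a minimal sketch of the fork-and-detach approach, assuming the pcntl and posix extensions are available (CLI builds usually have them; mod_php often doesn't):

<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die("could not fork");
} elseif ($pid > 0) {
    // parent: respond to the user immediately
    echo "Your messages are being sent.";
} else {
    // child: start a new session so we are detached from the parent,
    // then do the long-running work
    posix_setsid();
    executeMailSend($ids); // the hypothetical function from the question
    exit(0);
}
?>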
I am trying to create a simple newsletter-sending application using PHP. I was wondering if I could create a script in PHP which keeps running even after the browser is closed.
I have written the code as:
<?php
include "db.php";
$newsletterid=$_GET['id'];
$t="select * from tbl_newsletters where newsletterid='".$newsletterid."'";
$q=mysql_query($t);
$r=mysql_fetch_array($q);
$Subject=$r['subject'];
$Message=$r['message'];
$headers = 'MIME-Version: 1.0' . "\r\n";
$headers .= 'Content-type: text/html; charset=iso-8859-1' . "\r\n";
$headers .= 'To: '.$_SESSION['login_name'].'<'.$email.'>' . "\r\n";
$headers .= 'From: My Newsletter Manager <newsletter@myexample.com>' . "\r\n";
$t="select email from tbl_subscribers where status='SUBSCRIBED'";
$q=mysql_query($t);
while($r = mysql_fetch_array($q))
{
    $addresses[] = $r['email'];
}
foreach($addresses as $address)
{
    mail($address, $Subject, $Message, $headers) or die("Cannot Send Email");
    $t = "insert into tbl_activities set
        username='" . $_SESSION['login_name'] . "',
        activity='Sent News Letter [$Subject] to $address',
        date='" . date("Y-m-d H:i:s") . "',
        status='SUCCESS'";
    mysql_query($t) or die("Error Saving to Database");
}
?>
I am fairly confident that this script works for sending newsletters to all subscribers, but what happens if the newsletter operator closes the browser? Will this script keep sending newsletters until the foreach loop has completed?
Even if this script works, I would like to display a progress bar for each email sent, which the operator can view at any time.
At the same time, I wish I could prevent the operator from re-sending the newsletter until the previous run has completed.
I am not an expert and don't have advanced knowledge of OOP and class instances.
You should read up on PHP's Connection Handling. This section of the documentation talks about ignore_user_abort(), a function which tells the PHP interpreter to continue executing after the browser severs its connection with the PHP script. It also tells you about set_time_limit(), a function to set the timeout duration for the script.
For example, you might want to add code like this to your script:
set_time_limit( 0 ); // 0 means never timeout
ignore_user_abort(true); // continue running when browser closes
As your code runs outside of a connection with a browser, issues like error handling, memory management, and event logging become more important. It's helpful to know if your code, running long after the browser went away, was successful, and what it actually did.
Instead of having your PHP code run after the browser connection is broken, another option to consider is to write the long-running code as a server-side process, and have the PHP script only invoke this server process. That way, all the issues of error handling, memory management, and event logging can be handled in a server-side application programming environment, like Java, which might have better capabilities than PHP.
What the script does when a user closes the client is dependent on how you deal with PHP's connection handling. You can abort the script, or you can have it continue.
If you decide to abort the script, you can implement a function that saves the last email address the newsletter was sent to. Optionally, you can write to a log file within the loop; if the script gets aborted, you'll know which emails were already sent.
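For example, a minimal sketch of logging each send inside the loop from the question (the log file name is illustrative):

foreach ($addresses as $address)
{
    $ok = mail($address, $Subject, $Message, $headers);
    // append one line per recipient so an aborted run leaves a record
    file_put_contents('newsletter.log',
        date('c') . ' ' . $address . ' ' . ($ok ? 'SENT' : 'FAILED') . "\n",
        FILE_APPEND);
}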
The PHP interpreter imposes a time limit on scripts, and will terminate a script that runs for too long. I think the default time limit is 30 seconds, though it can be changed in the server's configuration.
As far as I know, PHP scripts only run in response to requests from clients (e.g. a browser), and the script's purpose is just to respond to that specific request. You can't do background processing, independent of any request.
I wrote a simple test script:
<?php
set_time_limit(0); //set time limit = unlimited
ini_set('memory_limit','900M'); //memory max 900MB
for ($i = 0; $i <= 30000; $i++)
{
    file_put_contents('temp/'.$i.'.txt', 'Some data...');
}
?>
After opening it in the browser and then closing the browser, PHP kept processing my instructions and successfully wrote all the txt files (0-30000). Maybe you can create your own test using the mail() function. Hope this simple test gives you more ideas.
Maybe you want to see how PHP connection status is maintained: PHP Connection handling.
I would like to be able to take an array of emails and make sure each one is sent (e.g. array > send each email > result). I kind of changed the question here because this is more important, plus I have added a 50-rep bounty. Code-wise, how can I do this?
Apart from still using the mail() function, you probably want to set up a cron job for sending out the mails. For spooling mail-send jobs, use a separate database table. Or, if it's about some sort of mailing-list functionality, a simple recipient list will do.
If you just want to send out a bunch of the same email at once, you could call implode() on your array of emails to turn it into a string:
$to_string = implode(', ', $to_array);
Or, if you want to try something more complicated, you could use a foreach loop to cycle through each email and keep track of successes and failures:
$success = array();
$failure = array();
foreach ($to_array as $to_email)
{
    if (mail($to_email, $subject, $message, ...))
        $success[] = $to_email;
    else
        $failure[] = $to_email;
}
I'm guessing your original question had to do with sending out all these emails every day or something, without you having to hit a button. If you have SSH access, see what happens if you type:
crontab -e
If you get some sort of error, you will have to speak with your system administrator about cron. If you get a file, then you can use cron. This is not part of your current question, though, so I'll leave it.
The same way. You must have code that sends an email to an email address. Whether they are on the site or not, it is the same code. You just need to know their email address.
EDIT: If you are wondering how you would trigger the email to be sent, maybe you want to schedule it using a cron job, for example send an email every day at midnight.
This says it all, really: http://php.net/manual/en/function.mail.php
You just need an outgoing mail server installed (Postfix, Exim, Sendmail).
Easy way to send an email:
$to = "usermail#test.com";
$from = "my_email#mydomain.com";
$subject = "Hello!";
$contents = "This is an test mail!";
mail($to, $subject, $contents, "From: $from");
If you don't have access to cron jobs, then you will probably struggle to run anything without user interaction.
A common method for dealing with this is to run the task on every nth page load, or every so often. This only works if you have a site that's visited about as often as you want to send email. You'll also want to use an ACID-compliant database. Pseudo-code follows.
if (1 == rand(1,100)) { // run once every 100 page loads
    $emails = get_emails_to_send();
    mark_emails_as_sent($emails);
    $results = send_emails($emails);
    mark_failures_as_needing_to_be_sent($results);
}
Alternately, you can run it on a timer:
if (time() - get_last_time_run() > $run_at_least_once_every_this_many_seconds) {
    $emails = get_emails_to_send();
    mark_emails_as_sent($emails);
    $results = send_emails($emails);
    mark_failures_as_needing_to_be_sent($results);
}
Or you can combine both with an &&. Which to use depends on how often your page gets hit.
If you want to send email more often than you get page hits... too bad ;). Or have an external service call the page every so often.
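For what it's worth, get_last_time_run() from the pseudo-code above could be as simple as a timestamp file (a sketch; the file path is illustrative):

function get_last_time_run() {
    // read the Unix timestamp of the last run, or 0 if we have never run
    $f = __DIR__ . '/last_run.txt';
    return file_exists($f) ? (int) file_get_contents($f) : 0;
}

function set_last_time_run() {
    file_put_contents(__DIR__ . '/last_run.txt', time());
}

You would call set_last_time_run() inside the if block, right after sending, so the timer resets on each run.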
I have a hefty PHP script.
So much so that I have had to do
ini_set('memory_limit', '3000M');
set_time_limit (0);
It runs fine on one server, but on another I get: Out of memory (allocated 1653342208) (tried to allocate 71 bytes) in /home/writeabo/public_html/propturk/feedgenerator/simple_html_dom.php on line 848
Both are on the same package from the same host, but different servers.
Above problem solved; new problem below for the bounty.
Update: The script is so big because it crawls a site and parses data from 252 pages, including over 60,000 images, of which it makes two copies. I have since broken it down into parts.
I have another problem now, though. When I am writing an image from an outside site to my server like this:
try {
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo,'w');
    fwrite($h,$imgcont);
    fclose($h);
} catch(Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
All of a sudden it goes to a 500 internal server error page, and I have to run it again, at which point it works, because files are only copied if they don't already exist. Is there any way I can catch the 500 response code and resend the request to the URL to make it go again? This all needs to be an automated process.
If this is memory-related, I would personally use copy() rather than file_get_contents(). It supports the file wrappers the same way, and I don't see any advantage in loading the whole file into memory just to write it back out to the filesystem.
Otherwise, your error_log might give you more information as to why the 500 happens.
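For example, the download then collapses to a single call (a sketch using the variables from the question):

if (!copy($va, $writeTo)) {
    // copy() streams the remote file straight to disk instead of buffering it in memory
    $error .= "error with <img src='" . $va . "' />";
}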
There are three parties involved here:
Remote - The server(s) that contain the images you're after
Server - The computer that is running your php script
Client - Your home computer if you are running the script from a web browser, or the same computer as the server if you are running it from Cron.
Is the 500 error you are seeing generated by 'Remote' and seen by 'Server' (i.e. the images are temporarily unavailable), or is it generated by 'Server' and seen by 'Client' (i.e. there is a problem with your script)?
If it is being generated by 'Remote', then see Ali's answer for how to retry.
If it is being generated by your script on 'Server', then you need to identify exactly what the error is - the php error logs should give you more information. I can think of two likely causes:
Reaching PHP's time limit. PHP will only spend a certain amount of time working before returning a 500 error. You can set this to a higher value, or regularly reset the timer with a call to set_time_limit(), but that won't work if your server is configured in safe mode.
Reaching PHP's memory limit. You seem to have encountered this already, but it's worth making sure your script still isn't eating lots of memory. Consider outputting debug data (possibly only if you set $config['debug_mode'] = true or something). I'd suggest:
try {
    echo 'Getting '.$va.'...';
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo,'w');
    fwrite($h,$imgcont);
    fclose($h);
    echo 'saved. Memory usage: '.(memory_get_usage() / (1024 * 1024)).' <br />';
    unset($imgcont);
} catch(Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
I've also added a line to remove the image from memory, in case PHP isn't doing this correctly itself (in theory that line shouldn't be necessary).
You can avoid both problems by making your script process fewer images at a time and calling it regularly, either using cron on the server (the ideal solution, although not all shared webhosts allow this) or some software on your desktop computer. If you do this, make sure you consider what will happen if two copies of the script run at the same time: will they both fetch the same image simultaneously?
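A simple guard against two copies running at once is an exclusive lock on a file (a sketch; the lock file path is illustrative):

$lock = fopen('/tmp/image-fetch.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    // another instance holds the lock, so bail out instead of double-fetching
    exit("Another copy is already running\n");
}
// ... fetch the batch of images ...
flock($lock, LOCK_UN);
fclose($lock);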
So it sounds like you're running this process via a web browser. I'm guessing that you may be getting the 500 error from Apache timing out after a certain period, or the process dying, or something funky. I would suggest you do one of the following:
A) Move the image downloading to a background process. You can run the crawl script in the browser, which writes the URLs of the images to be downloaded to the DB or something, and another script fires up via cron and fetches all the images. You could also have this script work in batches of 100 or so at a time to keep memory consumption down.
B) Call the script directly from the command line (this is really the preferred method for something like this anyway, and you should still probably separate the image fetching into another script).
C) If the command line is not an option for some reason, have your browser-loaded script touch a file, and have a cron job that runs every minute and looks for that file. When the file exists, cron fires up your script; you can have the output written to a file to check later, or have an email sent when it's completed.
Is there any way I can receive the 500 response code and send it back to the URL to make it go again? As this is all to be an automated process?
Here's the simple version of how I would do it:
function getImage($va, $writeTo, $retries = 3)
{
    while ($retries > 0) {
        if ($imgcont = file_get_contents($va)) {
            file_put_contents($writeTo, $imgcont);
            return true;
        }
        $retries--;
    }
    return false;
}
This doesn't create the file unless we successfully get the image, and it will retry three times by default. You will of course need to add any required exception handling, error checking, etc.
I would definitely stop using file_get_contents() and write the files in chunks, like this:
$read = fopen($url, 'rb');
$write = fopen($local, 'wb');
$chunk = 8096;
while (!feof($read)) {
    fwrite($write, fread($read, $chunk));
}
fclose($read);
fclose($write);
This will be nicer to your server and should hopefully solve your 500 problems. As for "catching" a 500 error, this is simply not possible: it is an irretrievable error thrown by your script and written to the client by the web server.
I'm with Swish; this is not really the kind of task that PHP is intended for. You'd be much better off using some sort of server-side scripting.
Is there any way I can receive the 500 response code and send it back to the URL to make it go again?
Have you considered using another library? Fetching files from an external server seems to me more like a job for curl or FTP than file_get_contents() etc. If the error is external and you're using curl, you can detect the 500 return code and handle it appropriately without crashing. If not, then maybe you should split your program into two files: one that fetches a single file/image, and another that uses curl to repeatedly call the first one. Unless the 500 error means that all PHP execution crashes, you would be able to detect the failure and handle it.
Something like this pseudocode:
file1.php:
foreach(list_of_files as filename){
    do {
        x = call_curl('file2.php', filename);
    } while(x == 500);
}
file2.php:
filename=$_GET['filename'];
results = use_curl_to_get_page(filename);
echo results;
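For completeness, a runnable version of the fetch-with-retry idea using PHP's cURL extension (the function name is illustrative):

function fetchWithRetry($url, $maxRetries = 3) {
    for ($i = 0; $i < $maxRetries; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $body = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        if ($body !== false && $code == 200) {
            return $body; // success
        }
        // non-200 (e.g. 500) or transport error: try again
    }
    return false;
}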
Thanks for all your input. I had separated everything by the time I wrote this question, so the crawler fires the image grabber, etc.
I took on board the solution to split the number of images, and that also helped.
I also added a try/catch around the file read.
This was only being called from the browser during testing, but now that it is all up and running it is going to be a cron job.
Thanks Swish and Benubird for your particularly detailed and educational answers. Unfortunately I had no cooperation with the developers on the backend where the images are coming from (long and complicated story).
Anyway, all good now, so thanks. (Swish, how do you call a script from the command line? My knowledge of this field is severely lacking.)