I have a PHP application (on a LAMP stack) that sends thousands of emails. I would like to forcibly stop the emails from being sent; obviously I can't stop the sending just by closing the browser.
Should I kill processes, or is there another way to do it? Which process should I kill? There may be more than one..?
PS: Of course, the application is badly designed, but that is not the question here.
If it's your own (self written) application, perhaps you should add some functionality that allows you to suspend or halt the execution.
One approach: every X iterations, the script checks a resource for commands. If there are commands in the resource queue, it executes them in order, removes them, and continues (if applicable).
For example, using a flat file or a database table, you could add a STOP/SUSPEND_EXECUTION command. When your script reads that line or row, it suspends normal execution but keeps checking the resource periodically. If a RESUME command is later read, execution resumes from where it left off, since the script never left its iterative loop.
Now you can, either by CLI or other interface, add commands to the queue, and the application will respond accordingly.
You could even get fancy, adding timestamps to defer command execution.
PS: If you're performing tasks like mass mailing, consider moving these scripts to a command line interface. I mention this only because of your comment about "closing the browser".
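As a rough illustration of the polling approach described above, here is a minimal sketch using a flat file as the command resource. The file name, command words, and function names are all assumptions for the example, not part of any existing application:

```php
// Minimal sketch of the polling pattern described above.
// Assumptions: a flat file (commands.txt) holds a single command,
// "suspend" or "resume"; all names here are invented for the example.
function readCommand($file) {
    return file_exists($file) ? trim(file_get_contents($file)) : '';
}

function processBatch(array $items, $commandFile, $checkEvery = 10) {
    $processed = 0;
    foreach ($items as $i => $item) {
        // every N iterations, consult the command resource
        if ($i % $checkEvery === 0) {
            while (readCommand($commandFile) === 'suspend') {
                sleep(1); // paused: keep polling until the command changes
            }
        }
        // ... do the real work here (e.g. send one email) ...
        $processed++;
    }
    return $processed;
}
```

Writing `suspend` into the file pauses the loop at the next check; clearing the file or writing `resume` lets it continue.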
It could use some work, but it does the trick. run() takes a callback function $job and an array $data. $job represents a single iteration of whatever batch job you're doing (mass mailing, etc.). With each iteration, $job is given the next element of the $data array as its set of arguments.
$data = array(
    array('name' => 'bob', 'email' => 'bob@site.com'),
    array('name' => 'jim', 'email' => 'jim@site.com'),
    array('name' => 'ann', 'email' => 'ann@site.com'),
);
$job = function($name, $email){
    // do something with $name
    // and $email
};
$batch->run($job, $data);
You need some tables (à la MySQL Workbench):
CREATE SCHEMA IF NOT EXISTS `batchtest` DEFAULT CHARACTER SET latin1 COLLATE latin1_swedish_ci;
USE `batchtest`;

CREATE TABLE IF NOT EXISTS `batchtest`.`job` (
  `id` CHAR(24) NOT NULL,
  `alias` VARCHAR(255) NOT NULL,
  `status` INT NOT NULL DEFAULT 0,
  `timestamp` TIMESTAMP NOT NULL,
  PRIMARY KEY (`id`))
ENGINE = InnoDB;

CREATE TABLE IF NOT EXISTS `batchtest`.`queue` (
  `id` INT UNSIGNED NOT NULL AUTO_INCREMENT,
  `job_id` CHAR(24) NOT NULL,
  `action` VARCHAR(255) NOT NULL,
  `params` TEXT NULL,
  `timestamp` TIMESTAMP NOT NULL,
  PRIMARY KEY (`id`))
ENGINE = InnoDB;
Whenever you want to pause/resume/abort a job, add a row to the queue table with the job_id and action (pause, resume, or abort) and the job will respond. The job will automatically remove the completed commands from the queue table.
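For example, pausing and later resuming a running job amounts to two inserts (the job id shown is a placeholder for a real id from the job table):

```sql
INSERT INTO queue (job_id, action) VALUES ('00a1b2c3d4e5f60718293a4b', 'pause');
-- ...and later:
INSERT INTO queue (job_id, action) VALUES ('00a1b2c3d4e5f60718293a4b', 'resume');
```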
That's the gist of it.
class BatchJob{

    const STATUS_STARTING = 0;
    const STATUS_RUNNING = 1;
    const STATUS_PAUSED = 2;
    const STATUS_ABORTED = 4;
    const STATUS_COMPLETED = 5;

    protected $_id = null;
    protected $_alias = null;
    protected $_pdo = null;
    protected $_pauseSleep = null;
    protected $_status = self::STATUS_STARTING;
    protected $_jobTable = 'job';
    protected $_queueTable = 'queue';

    public function __construct($pdo, $alias){
        $this->_pdo = $pdo;
        $this->_alias = $alias;
        $this->_id = vsprintf('%04x%04x%04x%04x%04x%04x', array(
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
        ));
        $this->output("Initializing job");
        $this->_pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $statement = $this->_pdo->prepare("INSERT INTO {$this->_jobTable} (id, alias, status) VALUES (:id, :alias, :status)");
        $statement->execute(array(
            ':id' => $this->_id,
            ':alias' => $this->_alias,
            ':status' => $this->_status,
        ));
    }

    public function run($job, Array $data, $pauseSleep = 10){
        $this->_pauseSleep = $pauseSleep;
        $iteration = 0;
        $this->updateStatus(self::STATUS_RUNNING);
        while($this->_status != self::STATUS_ABORTED
            && $this->_status != self::STATUS_COMPLETED){
            $statement = $this->_pdo->prepare("SELECT id, action, params FROM {$this->_queueTable} WHERE job_id = :job_id");
            $statement->execute(array(
                ':job_id' => $this->_id,
            ));
            foreach($statement->fetchAll() as $command){
                switch($command['action']){
                    case 'resume':
                        $this->updateStatus(self::STATUS_RUNNING);
                        break;
                    case 'pause':
                        $this->updateStatus(self::STATUS_PAUSED);
                        break;
                    case 'abort':
                        $this->updateStatus(self::STATUS_ABORTED, true, false);
                        exit;
                        break;
                }
                $statement = $this->_pdo->prepare("DELETE FROM {$this->_queueTable} WHERE id = :id");
                $statement->execute(array(
                    ':id' => $command['id'],
                ));
            }
            if($this->_status == self::STATUS_PAUSED){
                sleep($this->_pauseSleep);
                continue;
            }
            call_user_func_array($job, (Array) current($data));
            if(!next($data)){
                $this->updateStatus(self::STATUS_COMPLETED, true, false);
            }
        }
    }

    protected function output($string){
        echo ">>> [{$this->_alias}:{$this->_id}] [" . date('Y-m-d H:i:s') . "] {$string}" . PHP_EOL;
    }

    protected function updateStatus($status = null, $updateDatabase = true, $updateOutput = true){
        if(!is_null($status)){
            $this->_status = $status;
        }
        if($updateDatabase){
            $statement = $this->_pdo->prepare("UPDATE {$this->_jobTable} SET status = :status WHERE id = :id");
            $statement->execute(array(
                ':id' => $this->_id,
                ':status' => $this->_status,
            ));
        }
        if($updateOutput){
            $reflection = new ReflectionClass(__CLASS__);
            $statusCodes = array_flip($reflection->getConstants());
            $this->output("Job status change [{$statusCodes[$this->_status]}]");
        }
    }

    public function __destruct(){
        $this->updateStatus();
    }
}
If you can modify the script, you can insert a line like this in the main loop of the script (preferably before the mail() line):
if (connection_aborted())
    exit();
This will stop the PHP script when you close the browser window. Although this is the default behavior, PHP often fails to stop scripts right away.
You can do this without knowing much about the inner workings of the script, and it is nicer than killing Apache.
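One caveat worth adding: connection_aborted() usually only notices the disconnect after the script has tried to send output to the browser, so it helps to pair the check with a flushed write. A sketch (the recipient list is placeholder data, and the mailing call is elided):

```php
// Sketch of an abort-aware mailing loop. connection_aborted() generally
// reports the disconnect only after PHP has attempted to send output,
// so emit and flush a byte before checking.
$recipients = array('a@example.com', 'b@example.com'); // placeholder data

foreach ($recipients as $recipient) {
    echo ' ';  // try to send a byte to the (possibly closed) connection
    flush();   // push it out so the abort is actually noticed
    if (connection_aborted()) {
        exit();
    }
    // mail($recipient, ...) would go here
}
```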
Typically you would kill the web server; killing the httpd process will stop all of Apache. If you run PHP as a CGI exe, you can kill that process instead.
Are you asking how to shut down a PHP script gone wild? If so, you could always restart Apache. If I misunderstood your question, I apologize in advance.
Edit your php.ini to add mail to disable_functions; PHP will then refuse to call the mail() function.
disable_functions = mail
http://www.php.net/manual/en/ini.core.php#ini.disable-functions
I have a script that inserts into a database. The script is called from 2 cron jobs simultaneously, so in many cases I can get the same data at the same time (hour, minute, second).
I don't have control over the cron jobs, but I do have control over the script.
So, is there any way to prevent duplicate rows?
code :
function checkDuplicate($email) {
    global $db; // note: $db is not otherwise visible inside this function
    $return = 0;
    if($email != ""){
        // note: $email should be escaped or bound as a parameter
        $sql = "SELECT d.id_data as nbre FROM data d WHERE d.email = '" . $email . "' ";
        $nbreEmails = $db->run($sql);
        $return = (sizeof($nbreEmails) > 0) ? 1 : 0;
    }
    return $return;
}
if(!checkDuplicate($email)){
    $insert = array(
        "id_client" => $id_client,
        "email" => $email,
        "valide" => 1,
        "stat" => "valide",
        "date_insert" => date("Y-m-d H:i:s"),
        "date_refus" => null
    );
    $db->insert("data", $insert);
}
Thanks.
You should set your timestamp field as unique.
If you can't set it, or there are more complicated conditions that must guarantee that script1 and script2 never insert into (or otherwise write to) this table at the same time, you should lock the tables, i.e.:
// not sure what you use as $db
$db->execute('LOCK TABLES data WRITE;');
// ... inserts ...
$db->execute('UNLOCK TABLES;');
Or use transactions (for MySQL).
I suggest using PDO (PDO::beginTransaction).
Or update your post with the name of the framework you use; most of them provide a way to do this.
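As a sketch of the unique-constraint route, using the table and column names from the question (the index name and sample values are assumptions; the question's checkDuplicate() keys on email, so the sketch uses that column), the database itself can reject or merge duplicates:

```sql
-- Add a unique index so the database itself rejects duplicates
ALTER TABLE data ADD UNIQUE INDEX uniq_email (email);

-- Then either silently skip duplicate inserts...
INSERT IGNORE INTO data (id_client, email, valide, stat, date_insert)
VALUES (1, 'someone@example.com', 1, 'valide', NOW());

-- ...or update the existing row instead
INSERT INTO data (id_client, email, valide, stat, date_insert)
VALUES (1, 'someone@example.com', 1, 'valide', NOW())
ON DUPLICATE KEY UPDATE date_insert = VALUES(date_insert);
```

This removes the race window entirely: even if both cron jobs pass the PHP-side check at the same instant, only one row can land.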
I have a crawler which scrapes a website for information and then inserts the values into a database. It seems to insert the first ~4000 rows fine, but then it suddenly stops inserting values into the MySQL database, even though the crawler is still scraping the website.
Database table
CREATE TABLE IF NOT EXISTS `catalog` (
  `id` varchar(100) NOT NULL DEFAULT '',
  `title` varchar(100) DEFAULT NULL,
  `value` double DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
PHP insert function
function addToCatalog($id, $title, $value){
    // note: $id and $title go into the query unescaped; a stray quote or a
    // duplicate primary key makes this INSERT fail silently, since the
    // return value is never checked by the caller
    $q = "INSERT INTO catalog VALUES('$id', '$title', $value)";
    return mysql_query($q, $this->connection);
}
php scrape function
function scrape($pageNumber){
    global $database; // note: $database is not otherwise in scope here
    // file_get_html() comes from the simple_html_dom library
    $page = file_get_html('http://example.com/p='.$pageNumber);
    if($page){
        $id = array();
        $title = array();
        $value = array();
        //id
        if($page->find('.productid')){
            foreach ($page->find('.productid') as $p) {
                $id[] = $p->innertext;
            }
        }
        //title
        if($page->find('.title')){
            foreach($page->find('.title') as $p){
                $title[] = $p->innertext;
            }
        }
        //value
        if($page->find('.value')){
            foreach($page->find('.value') as $p){
                $value[] = $p->innertext;
            }
        }
        for($i=0; $i<sizeof($id); $i++){
            $add = $database->addToCatalog($id[$i], $title[$i], $value[$i]);
            echo $id[$i]." ".$title[$i]." ".$value[$i]."<br>";
        }
    }
}
for($i=0; $i<31300; $i++){
    scrape($i);
}
Any help with this problem would be appreciated.
If the execution of the process stops after about 30 seconds, your problem is probably the max_execution_time setting.
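If max_execution_time is indeed the culprit, the limit can be lifted from inside the script for a long batch run. A minimal sketch (the memory value is an example to tune, not a recommendation):

```php
// Lift PHP's runtime limits for a long-running batch script.
// (CLI PHP already defaults to no time limit; this matters most when
// the script is invoked through the web server.)
set_time_limit(0);               // 0 = no execution time limit
ini_set('memory_limit', '256M'); // example value; tune as needed
ignore_user_abort(true);         // keep running if the client disconnects
```

Note that under FastCGI or on shared hosting, an external daemon can still terminate the process regardless of these settings, which is why the answer above suggests measuring how long the script survives.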
I had a similar issue not too long ago; it turned out to be PHP running as FastCGI and a process daemon terminating the script. Try counting the number of seconds the script runs before it exits: if it's the same amount each time, try switching to CGI and running it again.
It could also be your web host terminating the script to protect shared resources, so if you are using a shared hosting server, it may be worth an upgrade.
I have spent many hours debugging and scouring the internet for a solution to this unusual problem. Here's the deal:
I am working on a Work Order Submission and Tracking system. There are two databases involved:
The database where the submission data gets posted, which is located on the same physical machine as the webserver serving the PHP, but on a separate virtual machine. They are on the same class C subnet.
The database of our tracking system, located on a different physical server with a different IP altogether, also a virtual machine.
Our work order system allows for multiple 'services requested', stored in an array. In our submissions database this is stored as a comma-separated string, i.e. "40,60,70", but in our tracking system database each 'service requested' needs a separate entry, so that the different aspects of the project can be tracked and completed at different times, by different staff.
THE PROBLEM IS: when I place my second insert statement (the one destined for the tracking database) in a for loop, it completely hangs and takes maybe 5 to 15 minutes before it passes that point in the code and sends the confirmation email. The data does not get inserted either.
When I take it out of the for loop and simply do one insert into the submissions database and one insert into the tracking system, it works fine.
First, I'll post the code that works, but only posts one 'service' to the tracking system:
public function insertOrder()
{
    $services = implode( ',', $this->model->chk );
    $curdate = $this->model->getMySQLDate( $this->model->curdate );
    $dueDate = $this->model->getMySQLDate( $this->model->dueDate );
    $sql = "INSERT INTO orders VALUES(DEFAULT,
        {$this->sanitize($services)},
        {$this->sanitize($curdate)},
        {$this->sanitize($this->model->submittedBy)},
        {$this->sanitize($this->model->shortDesc)},
        {$this->sanitize($this->model->projDetails)},
        {$this->sanitize($dueDate)},
        {$this->sanitize($this->model->dueDateNotes)},
        {$this->sanitize( $this->model->approveBy)},
        {$this->sanitize( $this->model->cost )} )";
    $this->execute( $sql );
    $this->convertServicesToTracks();
    $notes = $this->model->getTracksNotes();
    $dueDate = $dueDate.' 12:00:00';
    $shortDescNoQuotes = str_replace("\"","'",$this->model->shortDesc);
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
        {$this->sanitizeTracks($this->model->chk[0])},
        NULL,
        {$this->sanitizeTracks($shortDescNoQuotes)},
        {$this->sanitizeTracks($notes)},
        now(),
        {$this->sanitizeTracks($dueDate)},
        NULL,
        12,
        NULL,
        'active',
        NULL,
        now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}

private function executeTracks( $sql )
{
    $db = $this->getTracksDB( );
    $this->check4Error( $db, $sql );
    return $result;
}

private function getTracksDB()
{
    if (!$this->tracksdb) $this->tracksdb = new mysqli(AbstractSQL::TRACKS_HOST, AbstractSQL::USER, AbstractSQL::PASS, AbstractSQL::TRACKS_SCHEMA);
    return $this->tracksdb;
}

private function convertServicesToTracks()
{
    //converts submission data to tracking system data
}

private function sanitizeTracks($arg)
{
    if (!isset($arg)) return "NULL";
    if (is_numeric($arg) && !is_double( $arg) ) return $arg;
    return "'{$this->getTracksDB()->escape_string($arg)}'";
}
When I add this simple for loop around the second INSERT statement, it hangs, even if the array has only one item!
for($i = 0; $i < count($this->model->chk); ++$i)
{
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
        {$this->sanitizeTracks($this->model->chk[$i])},
        NULL,
        {$this->sanitizeTracks($shortDescNoQuotes)},
        {$this->sanitizeTracks($notes)},
        now(),
        {$this->sanitizeTracks($dueDate)},
        NULL,
        12,
        NULL,
        'active',
        NULL,
        now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}
Any help would be greatly appreciated. And I apologize for the long code snippets!!
Is it iterating through the for loop? I see you have an echo; did that write anything out? How many items does it have to iterate through? Five minutes seems like a long time, but if there are a lot of items, that could be why it's taking so long. Are you seeing any errors in your logs?
Something you might try is holding the count in a variable so it doesn't have to be recalculated on every iteration. That might speed up your for loop, although I'm not sure it will make the data insert.
for($i = 0, $count = count($this->model->chk); $i < $count; ++$i)
{
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
        {$this->sanitizeTracks($this->model->chk[$i])},
        NULL,
        {$this->sanitizeTracks($shortDescNoQuotes)},
        {$this->sanitizeTracks($notes)},
        now(),
        {$this->sanitizeTracks($dueDate)},
        NULL,
        12,
        NULL,
        'active',
        NULL,
        now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}
I found this in the PHP for loop reference: http://php.net/manual/en/control-structures.for.php
Well, this may not be the problem, but shouldn't you generally use a foreach loop to avoid hitting parts of the array that may not exist? There is more about this here. If you loop over an index that is empty, it will break your SQL statement. Like this:
foreach($this->model->chk as $val)
I have a checking script; it checks whether a server/switch/router is alive.
The records are all stored in one DB:
CREATE TABLE IF NOT EXISTS `mod_monitoring` (
  `id` int(11) NOT NULL,
  `parentid` int(11) NOT NULL,
  ...
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
So a router could have a switch below it (connected via parent ID), and that switch could have a server under it. If a server goes down, that's fine: nothing is under it, so no duplicate email gets sent out. But suppose a router goes down that has a switch under it and a couple of servers.
Because we check them all, we would send the admin an email for each item telling them it is dead, but I need to send only one email, about the router going down. Hope that makes sense: I need to somehow build an array of only the IDs that have no dead parent above them.
I could make an array of all the nodes that are down, but then how do I check which one is topmost in the tree, and remove all the ones under it?
Can anyone help? I've been thinking about this for ages now!
If I understood correctly, you want to iterate from parent to parent, which requires an unspecified number of JOINs; for that you need a stored procedure. In fact, this amounts to computing a transitive (Kleene) closure, which is not doable in a single SQL query.
In the end I made an array of all the dead IDs ($key => $id)
and then used the following:
if(is_array($dead)) {
    foreach($dead as $key => $id) {
        $conn = $db->query("SELECT * FROM mod_monitoring WHERE id = {$id}");
        $data = $db->fetch_array($conn);
        if($data['parentid'] == 0) {
            $final[] = $id;
            unset($dead[$key]);
        }
    }
}
if(is_array($dead)) {
    foreach($dead as $key => $id) {
        $conn = $db->query("SELECT * FROM mod_monitoring WHERE id = {$id}");
        $data = $db->fetch_array($conn);
        if(in_array($data['parentid'], $final)) {
            unset($dead[$key]);
        }
        if(in_array($id, $dead)) {
            unset($dead[$key]);
        }
    }
}
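The same pruning can also be done in one pass without per-row queries, given a map of each node's parent. This is a sketch under the assumption that $parentOf (id => parentid, 0 meaning no parent) has been built from a single SELECT over mod_monitoring; the function name is invented, and the walk assumes the parent chain has no cycles:

```php
// Keep only the topmost dead nodes: a dead node with a dead ancestor is
// suppressed, so a single email goes out per failed subtree.
// $parentOf maps id => parentid (0 = no parent); assumes no cycles.
function topmostDead(array $dead, array $parentOf) {
    $deadSet = array_flip($dead); // id => index, for O(1) membership tests
    $result = array();
    foreach ($dead as $id) {
        $parent = isset($parentOf[$id]) ? $parentOf[$id] : 0;
        // walk up the tree; if any ancestor is dead, skip this node
        $suppressed = false;
        while ($parent != 0) {
            if (isset($deadSet[$parent])) { $suppressed = true; break; }
            $parent = isset($parentOf[$parent]) ? $parentOf[$parent] : 0;
        }
        if (!$suppressed) {
            $result[] = $id;
        }
    }
    return $result;
}
```

With a chain router 1 -> switch 2 -> server 3 all dead, only router 1 is reported; unlike the two-pass version above, this also handles trees deeper than two levels.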
I am filtering NULL values in PHP on MySQL. When a NULL value is read, I need MySQL to read the next record instead.
How do I go about doing that?
Why not filter these NULLs out at the source, i.e. in the SQL query, by adding something like the following to the WHERE clause:
WHERE ...  -- existing conditions
  AND TheFieldOfInterest IS NOT NULL
Exactly as mjv already mentioned, you want to tell MySQL to skip rows that have a NULL value in a particular column. As your question puts it, "When a null value is read, I need the MySQL to read the next record": this is exactly what MySQL will do when you tell it not to include NULLs in the result set by specifying the WHERE condition.
Have fun hacking :)
In PHP you can use the is_null() function to detect whether a variable is null:
$result = mysql_query("SELECT foo FROM bar;");
while($values = mysql_fetch_assoc($result))
{
    if (is_null($values["foo"]))
        continue; // skip processing of this row
    echo "Data: ".$values["foo"];
}
I agree that you shouldn't query all the data and then filter the result set on the MySQL client (your PHP script). But the asker commented: "done that, but I 'just' want to know another way :D". There's nothing wrong with being curious. And: more power to PDO and SPL, especially FilterIterator in this case.
class ElementIssetFilter extends FilterIterator {

    protected $index;

    public function __construct(Iterator $iter, $index) {
        parent::__construct($iter);
        $this->index = $index;
    }

    public function accept() {
        $c = $this->current();
        return isset($c[$this->index]);
    }
}

$pdo = new PDO('mysql:host=localhost;dbname=test', 'localonly', 'localonly');
$pdo->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );

// testtable and -data
$pdo->exec("CREATE TEMPORARY TABLE foo (id int auto_increment, v varchar(16), primary key(id))");
$pdo->exec("INSERT INTO foo (v) VALUES ('1'), (null), ('3'), ('4'), (null), ('6')");

$result = $pdo->query('SELECT id,v FROM foo');
$iter = new IteratorIterator($result);
$filterIter = new ElementIssetFilter($iter, 'v');
foreach( $filterIter as $e) {
    echo $e['id'], " ", $e['v'], "\n";
}
$filterIter will act like $result, except that rows with NULL values in ['v'] will be filtered out. You don't have to change the "consuming" code, i.e. the same foreach-loop (or function/method call or whatever) would work with $result instead of $filterIter.