Can a PHP CLI script call itself without forking?

I want a PHP CLI* script to run, do a task, sleep for two seconds, and then run again. Currently, this looks like this:
#!/usr/bin/env php
<?php
require __DIR__ . '/config/app.php';
$w = new Worker;
if ($w->running) {
    exit;
} elseif ($job = $w->next()) {
    $w->run($job);
    sleep(2);
    exec(__FILE__);
    exit;
} else {
    exit;
}
However, it occurs to me that the new run starts before the old run completes. I am mostly a web developer, so I'm unfamiliar with this level (I'm at home at a somewhat higher level of abstraction), but I think this eventually becomes what's known as a fork bomb. How can I do this safely?
I've read the PHP manual page for pcntl_exec(), but I'm not confident that I'm understanding it correctly.
* It’s done as PHP so most of the actual functionality can be in a library which can also be called from a web interface.

You could simply put a loop around your worker and run it for as long as it has jobs to do.
#!/usr/bin/env php
<?php
require __DIR__ . '/config/app.php';
$w = new Worker;
if ($w->running) {
    exit;
}

while ($job = $w->next()) {
    $w->run($job);
    sleep(2); // Not sure, if you really need this?
}
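If you really do want the script to replace itself rather than loop, pcntl_exec() (which you mention) swaps the current process image for a new one, so no extra process is ever created and there is nothing to fork-bomb. This is only a sketch under the question's assumptions (the Worker class and config path come from the original snippet), and it requires the pcntl extension, which is only available in CLI builds on *nix:
#!/usr/bin/env php
<?php
require __DIR__ . '/config/app.php';

$w = new Worker;
if ($w->running) {
    exit;
}

if ($job = $w->next()) {
    $w->run($job);
    sleep(2);
    // Replace the current process with a fresh run of this script.
    // PHP_BINARY points at the running PHP executable (PHP >= 5.4).
    pcntl_exec(PHP_BINARY, array(__FILE__));
    // pcntl_exec() only returns if the call failed.
    exit(1);
}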

Related

Call php script multiple times, with unique include

I'm trying to set up a cron job to update all of our clients. They each have their own db and directory in our web root. An individual call uses this script:
<?php
include_once 'BASEPATH'.$_REQUEST['client'].'PATHTOPHPLIB';

// Call some functions here

// backup db
$filename = 'db_backup_'.date('G_a_m_d_y').'.sql';
$result = exec('mysqldump '.Config::read('db.basename').' --password='.Config::read('db.password').' --user='.Config::read('db.user').' --single-transaction >BACKUPDIRECTORYHERE'.$filename, $output);
if ($output == '') {
    /* no output is good */
} else {
    logit('Could not backup db');
    logit($output);
}
?>
I need to call this same script multiple times, each with a unique include based on a client variable being passed in. We originally had a separate cron job for each client, but this is no longer an option. What is the best way to call this script? I'm considering a new PHP script that holds an array of our clients and loops through it, running this script for each one, but I can't just include it because the libraries have overlapping function names. I'm not considering cURL because these scripts are not in the web root.
First off, a quick advert for the Symfony Console component. There are others, but I've been using Symfony for a while and gravitate towards it. Hopefully your project is PSR-0/Composer-able; even if it isn't, this could give you an excuse to do something self-contained.
You absolutely don't want these sorts of scripts under the webroot. There is no value in having them run through Apache, and the memory and runtime limits imposed there are different from those in a command-line PHP context.
Base script:
<?php
if (PHP_SAPI != "cli") {
    echo "Error: This should only be run from the command line environment!";
    exit;
}

// Script name is always passed, so $argc with 1 arg == 2
if ($argc !== 2) {
    echo "Usage: $argv[0] {client}\n";
    exit;
}

// Set up your constants
define('BASEPATH', '....');
define('PATHTOPHPLIB', '...');

require_once BASEPATH . $argv[1] . PATHTOPHPLIB;

// Call some functions here

// backup db
$filename = 'db_backup_'.date('G_a_m_d_y').'.sql';
$result = exec('mysqldump '.Config::read('db.basename').' --password='.Config::read('db.password').' --user='.Config::read('db.user').' --single-transaction >BACKUPDIRECTORYHERE'.$filename, $output);
if ($output == '') {
    /* no output is good */
} else {
    logit('Could not backup db');
    logit($output);
}
Calling script (run from cron):
<?php
// Bootstrap your master DB
// Query the list of clients

define('BASE_SCRIPT', 'fullpath_to_base_script_here');

foreach ($clients as $client) {
    exec('/path/to/php ' . BASE_SCRIPT . " $client");
}
If you want to keep things decoupled inside the caller script, you could pass the path to the backup-processing script rather than hardwiring it; if you do, use the same $argc/$argv technique to read the parameter.
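As a rough illustration of that idea (the paths and client names below are placeholders, not part of the original setup), the decoupled caller could look something like this:
<?php
// Hypothetical variant: the worker script path is passed in on the
// command line instead of being hardwired as a constant.
if ($argc < 2) {
    echo "Usage: {$argv[0]} /full/path/to/base_script.php\n";
    exit(1);
}
$workerScript = $argv[1];

// Normally this list would come from the master DB.
$clients = array('client_a', 'client_b');

foreach ($clients as $client) {
    // escapeshellarg() guards against unexpected characters in either value.
    exec('/path/to/php ' . escapeshellarg($workerScript) . ' ' . escapeshellarg($client));
}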

Can 32-Bit PHP run a .vbs script on a 64-Bit IIS Server?

There is a vbscript that we must run to consolidate information gathered in a custom web application into our management software. The .vbs is in the same folder as the web application which is built in CodeIgniter 2.
Here is the controller code:
public function saveToPM( $budgetType ){
    // run it
    $obj = new COM( 'WScript.Shell' );
    if ( is_object( $obj ) ) {
        $obj->Run( 'cmd /C wscript.exe D:\pamtest\myload.vbs', 0, true );
        var_dump($obj->Run);
    } else {
        echo 'can not create wshell object';
    } // end if
    $obj = null;
    //$this->load->view('goodPush');
} // end saveToPM function
We have enabled DCOM in the php.ini file and used dcomcnfg to grant permissions to the user.
I borrowed the code from http://www.sitepoint.com/forums/showthread.php?505709-run-a-vbs-from-php.
The screen echos "Code executed" but the vbscript does not run.
We have been fighting with this for a while so any help is GREATLY appreciated.
It's a bit messy. PHP calls WScript.Shell.Run, which calls cmd (with /C, i.e. terminate cmd.exe when it's done its thing), which in turn calls wscript.exe to run and interpret the .vbs. As you can see, quite a few things have to go right! :)
What if you 'wait' for the WScript.Shell.Run call to finish (your bWaitOnReturn argument) before continuing execution of the WSH script, which will in turn allow PHP to continue execution, etc.?
Since you're not waiting for the call to finish, PHP thinks it's all good and continues on to the next line (interpreted language).
Also, maybe have the .vbs create an empty text file? Just so you have an indication that it has actually run.
Just take a step back, have a beer and it'll come to you! Gogo troubleshoot!
And - http://ss64.com/vb/run.html
If bWaitOnReturn is set to TRUE, the Run method returns any error code returned by the application.
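As a quick way to see that in practice (a minimal sketch only, with a placeholder path for the .vbs), you can capture the value Run() returns when bWaitOnReturn is TRUE:
<?php
$shell = new COM('WScript.Shell');
// Third argument TRUE = wait for the script to finish and return its exit code.
$exitCode = $shell->Run('wscript.exe D:\path\to\test.vbs', 0, true);
if ($exitCode != 0) {
    echo "vbscript exited with error code $exitCode\n";
}
$shell = null;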
I've tested your code with a stand-alone PHP script (without Codeigniter) on a Windows XP machine, with the PHP 5.4.4 built-in web server, and I've noticed that the vbscript gets executed, but PHP dies (the event viewer shows a generic "Application Error" with ID 1000).
However I've also discovered that removing the "cmd /C" from the command string solves the problem.
Here is the simple script that I've used for my test:
<?php
$obj = new COM('WScript.Shell');
if (is_object($obj)) {
    //$obj->Run('cmd /C wscript.exe test.vbs', 0, true); // This doesn't work
    $obj->Run('wscript.exe test.vbs', 0, true); // This works
    var_dump($obj->Run);
} else {
    echo 'can not create wshell object';
}
$obj = null;
?>
And this is my simple "test.vbs" script:
WScript.Echo "vbscript is running"
Another solution that seems to be working (at least on my platform) is the "system" call:
system('wscript.exe test.vbs');
Unfortunately I don't have a 64-bit IIS system to test with, so I can't really say if there are specific problems on this platform, but I hope this helps.

How to detect whether a PHP script is already running?

I have a cron script that executes a PHP script every 10 minutes. The script checks a queue and processes the data in the queue. Sometimes the queue has enough data to last over 10 minutes of processing, creating the potential for two scripts to access the same data. I want to be able to detect whether the script is already running, to prevent launching multiple copies of it. I thought about creating a database flag that says a script is processing, but if the script ever crashed it would be left in that state. Is there an easy way to tell whether the PHP script is already running, from within a PHP or shell script?
You can just use a lock file. PHP's flock() function provides a simple wrapper for Unix's flock function, which provides advisory locks on files.
If you don't explicitly release them, the OS will automatically release these locks for you when the process holding them terminates, even if it terminates abnormally.
You can also follow the loose Unix convention of making your lock file a 'PID file' - that is, upon obtaining a lock on the file, have your script write its PID to it. Even if you never read this from within your script, it will be convenient for you if your script ever hangs or goes crazy and you want to find its PID in order to manually kill it.
Here's a copy/paste-ready implementation:
#!/usr/bin/php
<?php
$lock_file = fopen('path/to/yourlock.pid', 'c');
$got_lock = flock($lock_file, LOCK_EX | LOCK_NB, $wouldblock);
if ($lock_file === false || (!$got_lock && !$wouldblock)) {
    throw new Exception(
        "Unexpected error opening or locking lock file. Perhaps you " .
        "don't have permission to write to the lock file or its " .
        "containing directory?"
    );
}
else if (!$got_lock && $wouldblock) {
    exit("Another instance is already running; terminating.\n");
}

// Lock acquired; let's write our PID to the lock file for the convenience
// of humans who may wish to terminate the script.
ftruncate($lock_file, 0);
fwrite($lock_file, getmypid() . "\n");

/*
    The main body of your script goes here.
*/
echo "Hello, world!";

// All done; we blank the PID file and explicitly release the lock
// (although this should be unnecessary) before terminating.
ftruncate($lock_file, 0);
flock($lock_file, LOCK_UN);
Just set the path of your lock file to wherever you like and you're set.
If you need it to be absolutely crash-proof, you should use semaphores, which are released automatically when PHP finishes handling the request.
A simpler approach would be to create a DB record or a file at the beginning of the execution and remove it at the end. You could always check the "age" of that record/file, and if it's older than, say, 3 times the normal script runtime, assume it crashed and remove it.
There's no "silver bullet", it just depends on your needs.
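For completeness, here is a minimal semaphore sketch along the lines of the suggestion above. It assumes the sysvsem extension on a *nix system, and the key value is an arbitrary placeholder (ftok() on a real file is the more usual way to derive one):
<?php
// One slot in the semaphore means only one process can hold it at a time.
$sem = sem_get(0x1234, 1);

// Second argument true = non-blocking (PHP >= 5.6.1); returns false if
// another process already holds the semaphore.
if (!sem_acquire($sem, true)) {
    exit("Another instance is already running.\n");
}

// ... the real work goes here ...

sem_release($sem);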
If you are running Linux, this should work at the top of your script:
$running = exec("ps aux | grep " . basename(__FILE__) . " | grep -v grep | wc -l");
if ($running > 1) {
    exit;
}
A common way for *nix daemons (though not necessarily PHP scripts, but it will work) is to use a .pid file.
When the script starts, check for the existence of a .pid file named after the script (generally stored in /var/run/). If it doesn't exist, create it, setting its contents to the PID of the process running the script (using getmypid()), then continue with normal execution. If it does exist, read the PID from it and see whether that process is still running, for example by running ps $pid. If it is running, exit. Otherwise, overwrite its contents with your PID (as above) and continue normal execution.
When execution finishes, delete the file.
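A rough sketch of that convention, assuming a *nix system; the /var/run path typically needs root, so adjust it (or use /tmp) to suit:
<?php
$pidFile = '/var/run/' . basename(__FILE__, '.php') . '.pid';

if (file_exists($pidFile)) {
    $oldPid = (int) trim(file_get_contents($pidFile));
    // ps -p exits with status 0 only if a process with that PID exists.
    exec('ps -p ' . $oldPid, $out, $status);
    if ($status === 0) {
        exit("Already running as PID $oldPid\n");
    }
}

// Not running (or stale PID file): record our own PID and continue.
file_put_contents($pidFile, getmypid() . "\n");

// ... normal execution ...

unlink($pidFile);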
I know this is an old question, but in case someone else is looking here I'll post some code. This is what I did recently in a similar situation and it works well. Put this code at the top of your file, and if the same script is already running it will be left alone and the new one will exit.
I use it to keep a monitoring system running at all times. A cron job starts the script every 5 minutes, but unless the other one has stopped for some reason (usually because it crashed, which is very rare!) the new one just exits.
// The file to store our process ID
define('PROCESS_FILE', 'process.pid');

// Check I am running from the command line
if (PHP_SAPI != 'cli') {
    log_message('Run me from the command line');
    exit;
}

// Check if I'm already running and kill myself off if I am
$pid_running = false;
if (file_exists(PROCESS_FILE)) {
    $data = file(PROCESS_FILE);
    foreach ($data as $pid) {
        $pid = (int)$pid;
        if ($pid > 0 && file_exists('/proc/' . $pid)) {
            $pid_running = $pid;
            break;
        }
    }
}

if ($pid_running && $pid_running != getmypid()) {
    if (file_exists(PROCESS_FILE)) {
        file_put_contents(PROCESS_FILE, $pid);
    }
    log_message('I am already running as pid ' . $pid . ' so stopping now');
    exit;
} else {
    // Make sure file has just me in it
    file_put_contents(PROCESS_FILE, getmypid());
    log_message('Written pid with id ' . getmypid());
}
It will NOT work without modification on Windows, but should be fine in unix based systems.
You can use the new Symfony 2.6 LockHandler.
Source
use Symfony\Component\Filesystem\LockHandler;

$lock = new LockHandler('update:contents');
if (!$lock->lock()) {
    echo 'The command is already running in another process.';
}
This worked for me. Set a database record with a lock flag and a timestamp. My script should complete well within 15 minutes, so I added a last-locked field to check:
$lockresult = mysql_query("
    SELECT *
    FROM queue_locks
    WHERE `lastlocked` > DATE_SUB(NOW(), INTERVAL 15 MINUTE)
    AND `locked` = 'yes'
    AND `queid` = '1'
    LIMIT 1
");
$LockedRowCount = mysql_num_rows($lockresult);

if ($LockedRowCount > 0) {
    echo "this script is locked, try again later";
    exit;
} else {
    // Set the DB record to locked and carry on son
    $result = mysql_query("
        UPDATE `queue_locks` SET `locked` = 'yes', `lastlocked` = CURRENT_TIMESTAMP WHERE `queid` = 1;
    ");
}
Then unlock it at the end of the script:
$result = mysql_query("UPDATE `queue_locks` SET `locked` = 'no' WHERE `queid` = 1;");
I know this is an old question, but there's an approach which hasn't been mentioned before that I think is worth considering.
One of the problems with a lockfile or database flag solution, as already mentioned, is that if the script fails for some reason other than normal completion it won't release the lock. And therefore the next instance won't start until the lock is either manually cleared or cleared by a clean-up function.
If, though, you are certain that the script should only ever be running once, then it's relatively easy to check from within the script whether it is already running when you start it. Here's some code:
function checkrun() {
    exec("ps auxww", $ps);
    $r = 0;
    foreach ($ps as $p) {
        if (strpos($p, basename(__FILE__))) {
            $r++;
            if ($r > 1) {
                echo "too many instances, exiting\n";
                exit();
            }
        }
    }
}
Simply call this function at the start of the script, before you do anything else (such as open a database handler or process an import file), and if the same script is already running then it will appear twice in the process list - once for the previous instance, and once for this one. So, if it appears more than once, just exit.
A potential gotcha here: I'm assuming that you will never have two scripts with the same basename that may legitimately run simultaneously (e.g. the same script running under two different users). If that is a possibility, you'd need to extend the check to something more sophisticated than a simple substring match on the file's basename. But this works well enough if you have unique filenames for your scripts.
Assuming this is a Linux server and you have cron jobs available:
#!/bin/bash
# Check for the running script and start it if it isn't already running
check=$(ps -fea | grep -v grep | grep script.php | wc -l)
date=$(date +"%Y-%m-%d %H:%M:%S")
if [ "$check" -lt 1 ]; then
    echo "[$date] Starting script" >> /path/to/script/log/
    /sbin/script   # Call the script here - see below
fi
The script file (/sbin/script above) simply invokes PHP on your script:
#!/bin/bash
/usr/bin/php /path/to/your/php/script.php
Check if a PHP script is already running
If you have long-running batch processes with PHP that are run by cron and you want to ensure there's only ever one running copy of the script, you can use the functions getmypid() and posix_kill() to check whether you already have a copy of the process running. This post has a PHP class for checking if the script is already running.
Each process running on a Linux/Unix computer has a pid, or process identifier. In PHP this can be retrieved using getmypid(), which returns an integer. This pid can be saved to a file, and each time the script runs a check is made to see whether that file exists. If it does, the posix_kill() function can be used to check whether a process with that pid is running.
My PHP class for doing this is below. Please feel free to use it and modify it to suit your individual requirements.
class pid {
    protected $filename;
    public $already_running = false;

    function __construct($directory) {
        $this->filename = $directory . '/' . basename($_SERVER['PHP_SELF']) . '.pid';
        if (is_writable($this->filename) || is_writable($directory)) {
            if (file_exists($this->filename)) {
                $pid = (int)trim(file_get_contents($this->filename));
                if (posix_kill($pid, 0)) {
                    $this->already_running = true;
                }
            }
        }
        else {
            die("Cannot write to pid file '$this->filename'. Program execution halted.\n");
        }
        if (!$this->already_running) {
            $pid = getmypid();
            file_put_contents($this->filename, $pid);
        }
    }

    public function __destruct() {
        if (!$this->already_running && file_exists($this->filename) && is_writeable($this->filename)) {
            unlink($this->filename);
        }
    }
}
Use the class like this:
$pid = new pid('/tmp');
if ($pid->already_running) {
    echo "Already running.\n";
    exit;
}
else {
    echo "Running...\n";
}
Inspired by Mark Amery's answer, I created this class. It might help someone. Simply change "temp/lockFile.pid" to wherever you want the file placed.
class ProcessLocker
{
    private $lockFile;
    private $gotLock;
    private $wouldBlock;

    function __construct()
    {
        $this->lockFile = fopen('temp/lockFile.pid', 'c');
        if ($this->lockFile === false) {
            throw new Exception("Unable to open the file.");
        }
        $this->gotLock = flock($this->lockFile, LOCK_EX | LOCK_NB, $this->wouldBlock);
    }

    function __destruct()
    {
        $this->unlockProcess();
    }

    public function isLocked()
    {
        if (!$this->gotLock && $this->wouldBlock) {
            return true;
        }
        return false;
    }

    public function lockProcess()
    {
        if (!$this->gotLock && !$this->wouldBlock) {
            throw new Exception("Unable to lock the file.");
        }
        ftruncate($this->lockFile, 0);
        fwrite($this->lockFile, getmypid() . "\n");
    }

    public function unlockProcess()
    {
        ftruncate($this->lockFile, 0);
        flock($this->lockFile, LOCK_UN);
    }
}
Simply use the class like this at the beginning of your script:
$locker = new ProcessLocker();
if (!$locker->isLocked()) {
    $locker->lockProcess();
} else {
    // The process is locked
    exit();
}

PHP Launch script after background process completes?

I am converting a PDF with PDF2SWF and indexing it with XPDF, both via exec(). The only problem is that the execution time is really long.
Is it possible to run it as background process and then launch a script when it is done converting?
In general, PHP does not implement threads.
However, there is a Zend Framework class which may be suitable for you:
http://framework.zend.com/manual/en/zendx.console.process.unix.overview.html
ZendX_Console_Process_Unix allows developers to spawn an object as a new process, and so do multiple tasks in parallel on console environments. Through its specific nature, it is only working on *nix based systems like Linux, Solaris, Mac/OSX and such. Additionally, the shmop_*, pcntl_* and posix_* modules are required for this component to run. If one of the requirements is not met, it will throw an exception after instantiating the component.
A suitable example:
class MyProcess extends ZendX_Console_Process_Unix
{
    protected function _run()
    {
        // doing pdf and flash stuff
    }
}

$process1 = new MyProcess();
$process1->start();

while ($process1->isRunning()) {
    sleep(1);
}

echo 'Process completed';
Try using popen() instead of exec().
This hack will work on any standard PHP installation, even on Windows; no additional libraries are required. You can't really control all aspects of the processes you spawn this way, but sometimes this is enough:
$p1 = popen("/bin/bash ./some_shell_script.sh argument_1", "r");
$p2 = popen("/bin/bash ./some_other_shell_script.sh argument_2", "r");
$p3 = popen("/bin/bash ./yet_other_shell_script.sh argument_3", "r");
The three spawned shell scripts will run simultaneously, and as long as you don't call pclose($p1) (or $p2 or $p3) or try to read from any of these pipes, they will not block your PHP execution.
When you're done with your other work (whatever your PHP script is doing in the meantime), you can call pclose() on the pipes; that will pause your script's execution until the process you are pclosing finishes. Then your script can do something else.
Note that your PHP will not conclude or die() until those scripts have finished. Reaching the end of the script or calling die() will make it wait.
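Applied to the original question (PDF2SWF then XPDF), a minimal sketch might look like this; the script names are placeholders:
<?php
// Start the long-running conversion in the background.
$conversion = popen('/bin/bash ./convert_pdf_to_swf.sh input.pdf', 'r');

// ... do other PHP work here while the conversion runs ...

// pclose() blocks until the conversion has finished...
pclose($conversion);

// ...so at this point it is safe to launch the follow-up indexing script.
exec('/usr/bin/php ./index_with_xpdf.php input.pdf');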
If you are running it from the command line, you can fork a PHP process using pcntl_fork().
There are also daemon classes that will do the same trick:
http://pear.php.net/package/System_Daemon
$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} else if ($pid) {
    // We are the parent, exit
    exit();
} else {
    // We are the child, do something interesting then call the script at the end.
}

PHP showing 97% CPU usage

I have a game site developed using Flash and PHP. The PHP code contains 4000 lines and runs as a cron job. Inside the code there is one while loop that runs indefinitely, checking whether any data has been written to the socket, calling different functions accordingly, and sending the results back to the sockets. Flash reads the results and displays them.
The problem I'm facing is that somewhere in the PHP code, memory is leaking. Since the code is very big, I cannot find out where it is happening. Moreover, it can only be run as a cron job. Is there any tool to find the memory leak? I have heard about Xdebug but haven't used it. Any other suggestions?
check.php (as cron)
$sock = fsockopen(IP_ADDRESS, PORT, $sock_error_code, $sock_error_string, 10);
if (!$sock) {
    $message = "Server was down, restarting...\n\n";
    $last_line = system("php -q gameserver/server.php", $retval);
} else {
    $message = "Server is up...";
    $message .= $sock_error_string." (".$sock_error_code.")\n\n";
}
server.php (only some part)
class gameserver {
    var $server_running = true;

    function gameserver() {
        global $cfg, $db;
        $this->max_connections = $cfg["server"]["max-connections"];
        $this->start_socket();
        echo "Gameserver initialized\n";
        while ($this->server_running) {
            $read = $this->get_socket_list();
            $temp = socket_select($read, $null, $null, 0, 15);
            if (!empty($read)) {
                $this->read_sockets($read);
            }
            $db->reconnection();
            $this->update_DB_records();
            $this->check_games_progress();
            if ($this->soft_shutdown && $this->active_games == 0) {
                $this->server_running = false;
                echo "soft shutdown complete\n";
            }
        }
        $this->stop_socket();
        echo "Server shut down\n";
    }
}

$server = new gameserver();
Two things: first, ensure that you sleep at least once inside the loop so you don't use 97% of the CPU.
Second, a trick I've found is, if there is any database activity, to call mysql_free_result() (or its equivalent for other DBMSs) to free up the memory used to store the query result.
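A hedged illustration of both points, using the same legacy mysql_* API as the question (the table and column names are made up):
<?php
$running = true;
while ($running) {
    $result = mysql_query("SELECT * FROM game_events WHERE processed = 0");
    while ($row = mysql_fetch_assoc($result)) {
        // ... handle the event ...
    }
    // Free the result set each iteration so memory doesn't accumulate.
    mysql_free_result($result);

    // Sleep briefly so the loop doesn't spin at ~100% CPU.
    usleep(100000); // 0.1 seconds
}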
Are you starting never-ending programs from cron? Cron will start a new instance according to the schedule you specify, and you'll end up with several running programs doing the same thing. Could this be your problem?
I am assuming that you are not starting a new instance using cron every minute, with each one running an infinite loop:
Xdebug is probably your best bet. Other than that, you could use memory_get_usage() and log memory usage at specific points in your loop.
Could it simply be that your script accumulates data and doesn't clean it up properly at the end of each loop?
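A small sketch of the memory_get_usage() logging suggested above (the log path is a placeholder); comparing successive values usually narrows down which part of the loop is growing:
<?php
$logFile = '/tmp/gameserver_memory.log';

// $server_running stands in for the loop condition used in server.php.
while ($server_running) {
    // ... one iteration of the real work ...

    file_put_contents(
        $logFile,
        date('c') . ' ' . memory_get_usage(true) . " bytes\n",
        FILE_APPEND
    );
}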
