PHP check if script/function is running

Is there a way in PHP to check if a function has completed processing before allowing it to run again?
I have a function that, on page loads/reloads/timer events, checks whether DB items should be expired based on an end date (date less than now). If so, it makes a duplicate of the record with the exact same information but adds 10 days to the end date, then sets the original record's status to inactive. This is required to keep an original copy of the item in the DB, and the process repeats for every record.
Sometimes the script creates multiple duplicates of the same item, so it seems the script is not setting the status to inactive quickly enough: when the page is reloaded/visited, another instance of the script runs and produces another duplicate record.
So is there a way to check whether a function is currently running and, if so, ignore the new call, so that only a single instance ever runs?
Many many thanks

Sounds like you need a "mutex". You can use an existing implementation, or build one yourself with a shared resource, such as writing an empty file to disk, checking for its existence, and removing it when you're done.
A better solution for your specific problem, though, would be to set up a cron job that periodically runs your database maintenance script, rather than relying on random user requests to your page. This ensures it won't run too often and reduces the processing done per request.
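If you go the lock-file route, a minimal untested sketch using flock() might look like this (the path and the function name are placeholders); unlike a bare "does the file exist" check, the lock is released automatically if the process dies:
$fp = fopen('/tmp/expire_items.lock', 'c'); // hypothetical lock file
if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
    // another instance is already running; skip this call
    exit;
}

expireAndDuplicateItems(); // hypothetical: your expiry/duplication logic goes here

flock($fp, LOCK_UN);
fclose($fp);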

For PHP:
$Running = false;

function functionName() {
    global $Running; // needed: without it, $Running inside the function is a separate local variable

    if (!$Running) {
        $Running = true;
        // Your logic implementation here
        $Running = false;
    }
}
You can use the same principle for jQuery functions and cron jobs too. (Note that in PHP this flag only guards against re-entry within a single request; it is not shared across separate requests.)

Use a try/catch:
function blah() {
    try {
        // your logic here
    } catch (Exception $e) {
        echo 'Message: ' . $e->getMessage();
    }
}

Related

Best way to guarantee a job has been executed

I have a script that runs continuously on the server, in this case a PHP script, like:
php path/to/my/index.php.
It gets executed, and when it's done, it's executed again, and again, forever.
I'm looking for the best way to be notified if that script stops running (being executed).
There are many reasons why it might stop being called: server memory, a new deployment, human error... etc.
I just want to be notified (email, SMS, Slack...) if the script has not been executed for a certain amount of time (like 1 hour, 1 day, etc.).
My server is Ubuntu running on AWS.
An idea:
I was thinking of having a key in Redis/Memcached/etc. with a TTL. Every time the script runs, it refreshes the TTL on that key.
If the script stops working for longer than the TTL, the key will expire. I just need a way to trigger a notification when that expiration happens, but it looks like Redis/Memcached aren't prepared for that.
register_shutdown_function might help, but might not... https://www.php.net/manual/en/function.register-shutdown-function.php
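If you do try it, a minimal untested sketch might be (the log path is a placeholder):
register_shutdown_function(function () {
    $err = error_get_last();
    // count only fatal error types as failure; notices/warnings still count as a run
    $fatal = [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR];
    if ($err === null || !in_array($err['type'], $fatal, true)) {
        file_put_contents('/tmp/last_run.txt', time()); // hypothetical path
    }
});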
I can't say I've ever seen a script that needs to run indefinitely in PHP. Perhaps there is another way to solve the problem you are after?
Update - Following your Redis idea, I'd look at keyspace notifications: https://redis.io/topics/notifications
I've not tested the idea since I'm not actually a Redis user, but it may be possible to subscribe to and capture the expiration event (perhaps from another server?) and generate your notification.
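Something like this untested sketch, assuming the phpredis extension, database 0, and that keyspace events have been enabled with CONFIG SET notify-keyspace-events Ex (the key name is hypothetical):
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->setOption(Redis::OPT_READ_TIMEOUT, -1); // block indefinitely while subscribed

$redis->psubscribe(['__keyevent@0__:expired'], function ($redis, $pattern, $channel, $key) {
    if ($key === 'heartbeat:index_script') { // hypothetical heartbeat key set by the main script
        mail('ops@example.com', 'Script stopped', "TTL on $key expired"); // or Slack/SMS instead
    }
});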
There's no 'best' way to do this. Ultimately, what works best will boil down to the specific workflow you're supporting.
tl;dr version: Find what constitutes success and record the most recent time it happened. Use that for your notification trigger in another script.
Long version:
That said, persistent storage with a separate watcher is probably the most straightforward way to do this. Record the last successful run, then check it with a cron job every so often.
For what it's worth, for scripts like this I generally monitor exit codes or logs produced by the script in question. This isolates the error-notification process from the script itself, so a flaw in the script (hopefully) doesn't hamper the notification.
For a barebones example, say we have a script to invoke the actual script... (This is very much untested pseudo-code)
<?php
// Run and record.
exec("php path/to/my/index.php", $output, $return_code);

// $return_code will be 255 on fatal errors. You can use other return codes
// with exit in your called script to report other fail states.
if ($return_code == 0) {
    file_put_contents('/path/to/folder/last_success.txt', time());
} else {
    file_put_contents('/path/to/folder/error_report.json', json_encode([
        'return_code' => $return_code,
        'time'        => time(),
        'output'      => implode("\n", $output),
        // assuming here that error output isn't silently logged somewhere already
    ], JSON_PRETTY_PRINT));
}
And then a watcher.php that monitors these files on a cron job.
<?php
// Notify us immediately on failure, maybe?
// If you have a lot of transient failures it may make more sense to
// aggregate them into a single report sent at a specific time instead.
if (is_file('/path/to/folder/error_report.json')) {
    // Mail details stored in JSON here.
    // Rename the file so it's recorded, but we don't report it again.
    rename('/path/to/folder/error_report.json', '/path/to/folder/error_report.json' . '-sent-' . date('Y-m-d-H-i-s'));
} else {
    if (is_file('/path/to/folder/last_success.txt')) {
        $last_success = intval(file_get_contents('/path/to/folder/last_success.txt'));
        if (strtotime('-24 hours') > $last_success) {
            // Our script hasn't run in 24 hours; let someone know.
        }
    } else {
        // No successful run recorded. Might want to put code here if that's unexpected.
    }
}
Notes: There are some caveats to the specific approach displayed above. A script can fail in a non-fatal way, and if you're not checking for it, this example could record that as a successful run. For example: permissions errors cause warnings, but the script still runs its full course and exits normally without hitting an exit call with a specific return code. Our example invoker here would log that as a successful run - even though it isn't.
Another option is to log success from your script and only check for error exits from the invoker.
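For what it's worth, the watcher could be wired up with a crontab entry along these lines (path and interval are placeholders):
# hypothetical: check the recorded state every ten minutes
*/10 * * * * php /path/to/watcher.php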

Long PHP script runs multiple times

I have a products database that synchronizes with product data every morning.
The process is very clear:
Get all products from the database by query
Loop through all products, and get an XML from the other server by product_id
Update data from the XML
Log the changes to a file.
If I limit the query to a low number of items, 500 random products for example, everything goes fine. But when I query all products, my script SOMETIMES goes on the fritz and starts looping multiple times. Hours later I still see my log file growing and products being added.
I checked everything I could think of, for example:
Variables are not used twice or overwriting each other
Does the function call itself? No.
Does it happen with a low number of products too? No.
The script is called using a cronjob; are the settings OK? (Yes)
What makes it especially weird is that it sometimes goes right and sometimes it doesn't. Could this be a memory problem?
EDIT
wget -q -O /dev/null http://example.eu/xxxxx/cron.php?operation=sync - it's set up in Webmin, called at a specific hour and minute.
The code is hundreds of lines long...
Thanks
You have:
max_execution_time disabled. Your script won't end until the process is complete, however long that takes.
memory_limit disabled. There is no limit to how much data can be stored in memory.
500 records were completed without issues. This indicates that the script completes its process before the next cronjob iteration. For example, if your cron runs every hour, the 500 records are processed in less than an hour.
If you have a cronjob that is going to process a large number of records, consider adding a lock mechanism to the process: only allow the script to run once, and start again when the previous process is complete.
You can create a script lock as part of a shell script before executing your PHP script (a shell-based sketch follows the class below). Or, if you don't have access to your server, you can use a database lock within the PHP script, something like this:
class ProductCronJob
{
    protected $lockValue;

    public function run()
    {
        // Obtain a lock
        if ($this->obtainLock()) {
            // Run your script if you have a valid lock
            $this->syncProducts();
            // Release the lock on complete
            $this->releaseLock();
        }
    }

    protected function syncProducts()
    {
        // your long-running script
    }

    protected function obtainLock()
    {
        $time = new \DateTime;
        $timestamp = $time->getTimestamp();
        $this->lockValue = $timestamp . '_syncProducts';

        $db = JFactory::getDbo();

        $lock = [
            'lock'         => $this->lockValue,
            'timemodified' => $timestamp
        ];

        // lock = '0' indicates that the cronjob is not active.
        // UPDATE #__cronlock SET lock = $this->lockValue, timemodified = $timestamp
        //     WHERE name = 'syncProducts' AND lock = '0'
        // $result = $db->updateObject('#__cronlock', $lock, 'id');

        // $lock = SELECT lock FROM #__cronlock WHERE name = 'syncProducts';

        if ($lock !== false && (string) $lock !== (string) $this->lockValue) {
            // Currently there is an active process - can't start a new one
            return false;

            // You can return false as above or add extra logic as below:
            // Check the current lock age - how long it's been running for
            // $diff = $timestamp - $lock['timemodified'];
            // if ($diff >= 25200) {
            //     // The current script has been active for 7 hours.
            //     // You can change 25200 to any number of seconds you want.
            //     // Here you can send a notification email to the site administrator.
            //     // ...
            // }
        }

        return true;
    }

    protected function releaseLock()
    {
        // UPDATE #__cronlock SET lock = '0' WHERE name = 'syncProducts'
    }
}
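For the shell-script lock mentioned before the class, a minimal sketch using the flock(1) utility from util-linux could be (paths are placeholders):
# refuse to start if another instance still holds the lock
flock -n /tmp/sync_products.lock php /path/to/cron.php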
Your script is running for quite some time (~45 min) and wget thinks it's "timing out" since you don't return any data. By default wget has a 900-second timeout value and a retry count of 20, so first you should probably change your wget command to prevent this:
wget --tries=0 --timeout=0 -q -O /dev/null http://example.eu/xxxxx/cron.php?operation=sync
Now, removing the timeout could lead to other issues, so instead you could send data from your script (and flush it, to force the web server to send it) to make sure wget doesn't think the script "timed out" - say, something every 1000 loops. Think of this as a progress bar...
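A rough sketch of that keep-alive idea (the per-product function is a placeholder):
$i = 0;
foreach ($productsToSync as $product) {
    syncProduct($product); // hypothetical per-item work
    if (++$i % 1000 === 0) {
        echo '.'; // a byte of "progress" output so wget sees activity
        if (ob_get_level() > 0) {
            ob_flush(); // flush PHP's output buffer if one is active
        }
        flush(); // push the output through to the web server
    }
}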
Just keep in mind that you will hit an issue when the run time gets close to your cron period, as two crons will then run in parallel. You should optimize your process and/or add a lock mechanism, maybe?
I see two possibilities:
- cron calls the script much more often
- the script somehow takes too long
You can try to estimate the time a single iteration of the loop takes; this can be done with time(). Perhaps the result is surprising, perhaps not. You can probably get the number of results too. Multiply the two, and you will have an estimate of how long the process should take.
$productsToSync = $db->loadObjectList();
and
foreach ($productsToSync AS $product) {
It seems you load every result into an array. This won't work for huge databases, because obviously a million rows won't fit in memory. You should fetch just one result at a time. With MySQL there are methods that fetch one row at a time from the resource; I hope your driver allows the same.
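For example, with mysqli an unbuffered query streams rows one at a time instead of loading the whole result set into memory; a sketch, with hypothetical credentials and table:
$mysqli = new mysqli('localhost', 'user', 'pass', 'shop'); // hypothetical connection

// MYSQLI_USE_RESULT keeps the result set on the server and fetches row by row,
// so memory use stays flat no matter how many products there are.
$result = $mysqli->query('SELECT * FROM products', MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) {
    // process one product here
}
$result->free(); // must be freed before issuing another query on this connection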
I also see you execute another query on each iteration of the loop. This is something I try to avoid. Perhaps you can move it to after the first query has ended and do all of those in one big query? On the other hand, this may bite my first suggestion.
Also, if something goes wrong, be paranoid when debugging. Measure as much as you can; when it's a performance issue, time as much as you can. Put the timings in your log file. Usually you will find the bottleneck.
I solved the problem myself. Thanks for all the replies!
My MySQL connection timed out; that was the problem. As soon as I added:
ini_set('mysql.connect_timeout', 14400);
ini_set('default_socket_timeout', 14400);
to my script, the problem stopped. I really hope this helps someone. I'll upvote all the locking answers, because those were very helpful!

cakephp/php run function in background

I've never done this before, so I need some input.
Code (in general):
$config = Configure::read('config');

if ($config['stuff'] == 1) {
    $this->Session->setFlash('it is already done this month');
    $this->redirect('/to/some/where');
}
elseif ($config['stuff'] == 2) {
    $this->Session->setFlash('it is already running');
    $this->redirect('/to/some/where');
}
else {
    SomeComponent::SomeFunction(); // this I need to launch in the background while the user continues
    $this->Session->setFlash('you have launched it');
    $this->redirect('/to/some/where');
}
"SomeComponent" contains several functions. I need to launch a specific function "SomeFunction()" in the bacground while user continues further.
Function "SomeComponent::SomeFunction()" generates bunch of pdfs, interacts with database and uses Cakephp specific methods'n'crap to do all that. Users receive output via database, so I don't need to retrieve it form the function itself.
So i'm not clear which method can do that, which best to use and what are/may be drawbacks of each one (security issues in particular).
I hope I explained everything in a understandable way. If you have read this far - thanks.
Is that process "SomeComponent::SomeFunction()" taking too long? If the answer is yes, then I would wrap that function in a shell, and if you want to take it to the next level, I would use an Observer to dispatch that background process.
Here is an introduction to Event Handling in CakePHP.
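As a rough, untested sketch of the shell idea for CakePHP 2 (the shell name is hypothetical):
// app/Console/Command/PdfShell.php
App::uses('AppShell', 'Console/Command');

class PdfShell extends AppShell {
    public function main() {
        // move the SomeComponent::SomeFunction() logic here:
        // generate the PDFs and write the results to the database
    }
}

// In the controller, dispatch it detached so the user is redirected immediately:
// exec(ROOT . '/app/Console/cake pdf > /dev/null 2>&1 &');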

Stop PHP with ajax

I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
I want to explain the situation better:
I have a PHP file which uses stream_copy_to_stream($src, $dest) to retrieve a stream on my local network. The function has to keep working until I decide: I can stop it at the end of the stream or whenever I want, so I can use one button to start and another to stop. The problem is the new instance created by the AJAX call: I can't work on the instance that is recording, because each call spawns a different instance. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop checking a certain condition on every pass, you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required).
It looks like a bad idea, however. Why do you want to do it?
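If you do need it, a minimal sketch of that file-check idea (flag path and chunk function are placeholders):
set_time_limit(0);
$flag = sys_get_temp_dir() . '/stop_copy.flag'; // hypothetical flag file

while (!file_exists($flag)) {
    copyNextChunk(); // hypothetical: copy the next piece of the stream
    clearstatcache(); // make sure file_exists() sees a freshly created flag
}
unlink($flag); // reset for the next run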
var running = true;

function doSomething() {
    // do something........
}

setInterval(function () { if (running) { doSomething(); } }, 2000); // runs doSomething every 2 seconds
On button click, simply set running = false;
Your code looks like:
set_time_limit(0);
while (true) { // infinite loop
    doSomething(); // your code
}
Let's upgrade it:
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php can acquire it

function should_i_stop_loop() {
    // reopen the session to pick up any change made by stopit.php
    session_start();
    $stop = ($_SESSION['do_a_loop'] == false);
    session_write_close();
    if ($stop) {
        // let's stop the loop
        exit();
    }
}

while (true) {
    doSomething();
    should_i_stop_loop(); // your new function
}
Create a new file, stopit.php:
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is send a request to the stopit.php file (with AJAX or something).
Edit the code according to your needs; this is just one of many possible solutions.
Sorry for my English
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script, the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you had previously started.
There are a number of workarounds.
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times.
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can be remotely controlled by what you write to the file/database.
Try another programming language (this is a silly option, but I'm a fan of Node). Node.js does this sort of thing very, very easily.

PHP Singleton class for all requests

I have a simple problem. I use PHP on the server side and produce HTML output. My site shows the status of another server, so the flow is:
Browser user goes to www.example.com/status
Browser contacts www.example.com/status
The PHP server receives the request and asks for the status on www.statusserver.com/status
PHP receives the data, transforms it into readable HTML output and sends it back to the client
The browser user can see the status.
Now, I've created a singleton class in PHP which accesses the status server only every 8 seconds, so it updates the status every 8 seconds. If a user requests an update in between, the server returns the locally (on www.example.com) stored status.
That's nice, isn't it? But then I did an easy test and opened 5 browser windows to see if it works. Here it comes: the PHP server created a singleton instance for each request. So now 5 clients are requesting the status from the status server every 8 seconds; this means I have 5 calls to the status server every 8 seconds instead of one!
Isn't there a possibility to provide only one instance to all users within an Apache server? That would solve the problem in case 1000 users connect to www.example.com/status....
thx for any hints
=============================
EDIT:
I already use caching on the hard drive:
public function getFile($filename)
{
    $diff = (time() - filemtime($filename));
    // echo "diff:$diff<br/>";
    if ($diff > 8) {
        // echo 'greater than 8<br/>';
        self::updateFile($filename);
    }
    if (is_readable($filename)) {
        try {
            $returnValue = @ImageCreateFromPNG($filename);
            if ($returnValue == '') {
                sleep(1);
                return self::getFile($filename);
            } else {
                return $returnValue;
            }
        } catch (Exception $e) {
            sleep(1);
            return self::getFile($filename);
        }
    } else {
        sleep(1);
        return self::getFile($filename);
    }
}
This is the call in the singleton. I ask for a file and save it to the hard drive, but all the requests call it at the same time and start requesting the status server.
I think the only solution would be a standalone application which updates the file every 8 seconds... All requests should just read the file and no longer be able to update it.
This standalone could be a Perl script or something similar...
PHP requests are handled by different processes, and each of them has a different state; there isn't any resident process like in other web development frameworks. You should handle that behavior directly in your class, for instance using some caching.
The method which queries the server status should have this logic:
public function getStatus() {
    if (!$status = $cache->load()) {
        // cache miss
        $status = $this->fetchStatus(); // placeholder: do your real query here
        $cache->save($status); // store the result in cache
    }
    return $status;
}
In this way only one request out of X will fetch the real status. The X value depends on your cache configuration.
Some cache libraries you can use:
APC
Memcached
Zend_Cache which is just a wrapper for actual caching engines
Or you can store the result in a plain text file, check the mtime of the file on every request, and rewrite it if more than xx seconds have passed.
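The plain-file variant could be as simple as this sketch (the cache path is a placeholder; the 8-second window is from the question):
$file = '/tmp/status.cache'; // hypothetical cache path

// rewrite the cache only when it is missing or older than 8 seconds
if (!is_file($file) || time() - filemtime($file) > 8) {
    $status = file_get_contents('http://www.statusserver.com/status');
    file_put_contents($file, $status, LOCK_EX);
} else {
    $status = file_get_contents($file);
}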
Update
Your code is pretty strange: why all those sleep calls? And why a try/catch block when ImageCreateFromPNG does not throw?
You're asking a different question. Since PHP is not an application server and cannot store state across processes, your approach is correct. I suggest you use APC (it uses shared memory, so it would be at least 10x faster than reading a file) to share status across different processes. With this approach your code could become:
public function getFile($filename)
{
    $latest_update = apc_fetch('latest_update');
    if (false == $latest_update) {
        // cache expired or first request
        apc_store('latest_update', time(), 8); // 8 is the TTL in seconds
        // fetch the file here and save it on local storage
        self::updateFile($filename);
    }
    // here you can process the file
    return $your_processed_file;
}
With this approach, the code in the if block will be executed by two different processes only if a process is interrupted just after the if line, which should not happen because it's almost an atomic operation.
Furthermore, if you want to guarantee that, you could use something like semaphores to handle it, but that would be an oversized solution for this kind of requirement.
Finally, IMHO 8 seconds is a small interval; I'd use something bigger, at least 30 seconds, but this depends on your requirements.
As far as I know, it is not possible in PHP. However, you can certainly serialize and cache the object instance.
Check out http://php.net/manual/en/language.oop5.serialization.php
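A tiny, untested sketch of that approach (class name and path are placeholders):
$cacheFile = '/tmp/status_instance.ser'; // hypothetical path

// restore the previous instance if one was cached, otherwise build a fresh one
$checker = is_file($cacheFile)
    ? unserialize(file_get_contents($cacheFile))
    : new StatusChecker(); // hypothetical class

$status = $checker->getStatus();

// persist the instance (and whatever state it accumulated) for the next request
file_put_contents($cacheFile, serialize($checker), LOCK_EX);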
