Prevent PHP script from running simultaneously and twice

I've been looking for a way to prevent a PHP script from running simultaneously, and I found a solution on this site. This is what I came up with (test file).
Link to the solution I found on Stack Overflow: How to prevent PHP script running more than once?
test.php
echo "started: ".microtime()."<br>";
$lock = $_SERVER['DOCUMENT_ROOT'].'/tmp/test.lock';
$f = fopen($lock, 'x');
if($f === false){
die("\nCan't aquire lock\n");
}else{
// Do processing
echo "Working: ".microtime()."<br>";
sleep(5);
echo "Still working: ".microtime()."<br>";
sleep(5);
echo "Ready: ".microtime()."<br>";
fclose($f);
unlink($lock);
}
When running this script for the first time, the output will be like this:
started: 0.87157000 1389879936
Working: 0.87532100 1389879936
Still working: 0.87542000 1389879941
Ready: 0.87551800 1389879946
Now when I run the same script in the same browser in two tabs, both are executed, but the second one runs after the first one has finished. So not simultaneously, but twice. I didn't expect that, because the second run should die if the test.lock file already exists.
So, running the script in the same browser with two tabs, this is the result:
tab1:
started: 0.87157000 1389879936
Working: 0.87532100 1389879936
Still working: 0.87542000 1389879941
Ready: 0.87551800 1389879946
tab2:
started: 0.92684500 1389879946
Working: 0.92911700 1389879946
Still working: 0.92920300 1389879951
Ready: 0.92930400 1389879956
As you can see, the script in the second tab is started when the script in the first tab has finished. Isn't that weird?
When I do this with two different browsers, the script started second is terminated, so the lock works.
browser 1:
started: 0.62890800 1389880056
Working: 0.63861900 1389880056
Still working: 0.63878800 1389880061
Ready: 0.63893300 1389880066
Browser 2:
started: 0.10137700 1389880058
Warning: fopen(/home/users/domain/tmp/test.lock) [function.fopen]: failed to open stream: File exists in /home/users/domain/test.php on line 8
Can't acquire lock
The question
I'm now able to prevent the script from executing simultaneously, but how do I prevent the second request in the same browser from being executed after the first one has finished?

I'm guessing that two tabs in the same browser count as the same client: the browser uses the same session, and the server answers requests from the same session sequentially. That is why you can be logged into the same service with multiple tabs (e.g. have several tabs open on Stack Overflow).
Requests from a different session (browser) may be processed simultaneously. I guess this depends on your server.
You can't really prevent the script from being executed twice with a simple lock-file. You can only prevent simultaneous execution, as you have demonstrated.
If you wanted to prevent the same client from executing a script too often, you'd need to keep track of the last time they executed it (possibly in a cookie or database).
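If you went the database route, a minimal sketch could look like this (the script_state table, the last_run key, the connection details and the 60-second window are all illustrative, not from the question; REPLACE INTO assumes MySQL):
<?php
// Sketch: refuse to run again within 60 seconds of the last run.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

$lastRun = (int) $pdo->query(
    "SELECT value FROM script_state WHERE name = 'last_run'"
)->fetchColumn();

if (time() - $lastRun < 60) {
    die("Already ran recently\n");
}

// Record this run before doing the actual work.
$pdo->prepare("REPLACE INTO script_state (name, value) VALUES ('last_run', ?)")
    ->execute([time()]);

// ... do the processing here ...
?>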

Related

PHP flock() not locking

I am having trouble figuring out why flock() is not behaving properly in the following scenario.
The following code is placed into two different PHP scripts one "test1.php" and the other "test2.php". The point of the code is to create a file which no other process (which properly uses the flock() code) should be able to write to. There will be many different PHP scripts which try to obtain an exclusive lock on this file, but only one should have access at any given time and all the rest should fail gracefully when they fail to get the lock.
The way I am testing this is very simple. Both "test1.php" and "test2.php" are placed in a web-accessible directory on my server. Then, from a browser such as Firefox, the first script is executed, and immediately after, the second script is executed from a different browser tab. This seems to work when the code is run from two different PHP scripts such as "test1.php" and "test2.php", but when the code is run twice from the same "test1.php" or "test2.php" script, the second run does not immediately return with a failure.
The only reason I can think of for this is that flock() treats all PHP processes running the same file as the same process. If this is the case, then when "test1.php" or "test2.php" is run twice (from two different browser tabs), PHP sees them as the same process and thus does not fail the lock. But to me, it does not make sense for PHP to be designed like that, so I am here to see if anyone else can solve this problem for me.
Thanks in advance!
<?php
$file = 'command.bat';
echo "Starting script...";
flush();
$handle = fopen($file, 'w+');
echo "Lets try locking...";
flush();
if (is_resource($handle)) {
    echo "good resource...";
    flush();
    if (flock($handle, LOCK_EX | LOCK_NB) === TRUE) {
        echo "Got lock!";
        flush();
        sleep(100);
        flock($handle, LOCK_UN); // was flock($fp, ...): $fp is undefined
    } else {
        echo "Failed to get lock!";
        flush();
    }
} else {
    echo "bad resource...";
    flush();
}
exit;
Any help with the above is greatly appreciated!
Thank you,
Daniel
I had the same situation and found the problem to be with the browser.
When making multiple requests to the same URL, even if doing so across tabs or windows, the browser is "smart" enough to wait until the first request completes, and then the browser attempts to run the subsequent request(s).
So, while it may look like the lock is not working, what is actually happening is that the browser (both Chrome and Firefox) is waiting for the first request to complete before running the second request.
You can verify that this is the case by opening the same URL once in Chrome and once in Firefox. By doing so, as I did, you would probably see that the lock is indeed working as expected.
flock() has many restrictions, including on multi-threaded servers, NFS volumes, etc.
The accepted solution is apparently to attempt to create a link instead.
Lots of discussion on this topic: http://www.php.net/manual/en/function.flock.php
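For reference, a minimal sketch of that link()-based idea (paths are illustrative): creating a hard link is atomic, even over NFS, so only one process can succeed.
<?php
// Sketch of lock-by-link(): link() fails atomically if the lock already
// exists, which avoids flock()'s NFS and multi-threading caveats.
$lockFile = '/tmp/myscript.lock';
$tmpFile  = '/tmp/myscript.' . getmypid();

touch($tmpFile);
if (@link($tmpFile, $lockFile)) {
    // We own the lock; do the protected work here.
    sleep(10); // stand-in for real work
    unlink($lockFile); // release the lock
} else {
    echo "Failed to get lock!";
}
unlink($tmpFile); // clean up the temporary file
?>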

Windows PHP repeating script via popen

I'm trying to create a browser-started self-calling/repeating PHP script on Windows with PHP (currently 5.3.24 but soon will be latest). It will act as a daemon to monitor changes in a database (every few seconds, so cron/schedule is out) and then call other PHP scripts to perform work when changes are found. For the purposes of this question please ignore the fact that I'd be better off doing this in C# or some other language :)
To keep things simple I started out by trying to use popen to run a second PHP script in the background...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\Test.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
// Test.php
SaveToTestTable(1);
Sleep(10);
SaveToTestTable(2);
exit();
If I run BatchMonitor.php in the browser it works fine. As expected it will save 1 to the monitor table, call Test.php which saves 1 to the test table, the original BatchMonitor.php will continue without waiting for a response and save 2 to the monitor table before exiting, then 10 seconds later the test page saves 2 to the test table before exiting. The second script starts fine, the first script does not wait for a reply and all parameters are correctly passed between scripts. With everything working as intended I then changed the system to work as a repeating loop by calling itself (with delay) instead of another script...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\BatchMonitor.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
If I run BatchMonitor.php in the browser it runs once and that is it. It will save 1 to the database, wait 10 seconds and then save 2 to the database before exiting. The page returns successfully with no script or PHP errors but it doesn't repeat as it should.
Both BatchMonitor.php and Test.php use line-for-line identical functions to get the parameters, and both files run correctly and identically on the first iteration. If I use exec instead of popen then the page loops correctly with all logic working as expected (with the one obvious flaw of creating a never-ending chain of scripts waiting for response values that will never come).
Am I missing something obvious? Does popen have some sort of secret rule that prevents a page/process from opening duplicates of itself? Are there any alternatives to using popen or exec? I read about WScript.Shell but it might be a while before I can schedule that to get enabled so for now it's not an option and I'm hoping there is something more standard that I can use.
I don't feel like this should be your actual answer, but why did you abandon scheduled tasks/cron jobs just because you want something done every X seconds? Having the script minute.php call 5seconds.php with, of course, 5-second intervals in between would create a repeated task every 5 seconds, right?
Strangely enough, you are kind of using the same sort of mechanism from your browser already.
My only concern would be to take the processing time into account and create a safe script which ensures no more than one '5seconds.php' can run at any given time.
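A minimal sketch of that idea, assuming the minute.php/5seconds.php names suggested above (the PHP path is illustrative):
<?php
// minute.php -- schedule this via cron / Task Scheduler once per minute.
// It triggers the real job twelve times with 5-second pauses in between.
// If 5seconds.php itself takes noticeable time, measure the elapsed time
// instead of using a fixed iteration count.
for ($i = 0; $i < 12; $i++) {
    exec('php C:\inetpub\wwwroot\5seconds.php'); // waits for the job to finish
    sleep(5);
}
?>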

PHP scripts seem to be executing twice

I have a generic PHP maintenance script that sets a database value to prevent running at the same time as another instance. Usually run via cronjob.
I cleared the database value and tried running this script manually with the cronjob turned off. Every time I run it from the browser, it terminates immediately stating it is already running.
The script will run for about 30 seconds as a background process then terminate automatically as if PHP detected the browser was closed (should take about 15 min to complete).
So I added code to echo when the database value is set or read. It never echoes when it's set, only when it's read, but I can see the database value is stored each time.
Script always finishes as expected if run from cron.
What could be going on? Could the server be executing scripts twice on each browser based request?
Server runs Hive so different dirs can have different PHP versions. Don't know if this could have something to do with it.
PHP 5.2.17 (default)
PHP 5.3.27 (dir this script is in)
Apache 2.2.25
The code that dictates whether it runs is simply this:
$DB = new DbConnector($db_name, $db_user, $db_pass);
if ($DB->queryOne("SELECT COUNT(*) FROM data_vars WHERE name = 'maintenance_running'")) {
    exit('Already running!');
} else {
    $DB->query("INSERT INTO data_vars (name, value) VALUES ('maintenance_running', 1)");
}
At the end of the script, the value is cleared. Again, this problem only happens when run from the browser.
You should set a longer execution time limit with:
set_time_limit(30 * 60); // 30 minutes
Also, echo probably is working, but since the response is not sent to the browser until the whole script finishes executing, you can't see the output: the script is killed by the execution limit before it ends and prints the response.
You can try echoing the response partially with the help of ob_start and ob_flush. You can read more about the ob_* functions here.
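A minimal sketch of flushing partial output while a long script is still working:
<?php
// Sketch: stream progress to the browser while a long job runs.
ob_start();
for ($i = 1; $i <= 10; $i++) {
    echo "Processed chunk $i<br>\n";
    ob_flush(); // push PHP's output buffer to the web server
    flush();    // push the web server's buffer to the browser
    sleep(1);   // stand-in for real work
}
ob_end_flush();
?>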

Don't run script if it's already running

I've been completely unsuccessful finding an answer to this question. Hopefully someone here can help.
I have a PHP script (a WordPress template, to be specific) that automatically imports and processes images when a user hits it. The problem is that the image processing takes up a lot of memory, particularly if multiple users are accessing the template at the same time and initiating the image processing. My server crashed multiple times because of this.
My solution to this was to not execute the image-processing function if it was already running. Before the function started running, I would check a database entry named image_import_running to see if it was set to false. If it was, the function then ran. The very first thing the function did was set image_import_running to true. Then, after it was all finished, I set it back to false.
It worked great -- in theory. The site hasn't crashed since, I can tell you that. But there are two major problems with it:
If the user closes the page while it's loading, the script never finishes processing the images and therefore never sets image_import_running back to false. The template will never process images again until it's manually set to false.
If the script times out while it's processing images -- and that's a strong possibility if there are many images in the queue -- you have essentially the same problem as No. 1: the script never gets to the point where it sets image_import_running back to false.
To handle No. 1 (the first one of the two problems I realized), I added ignore_user_abort(true) to the script. Did it work? I don't know, because No. 2 is still an issue. That's where I'm stumped.
If I could ask the server whether the script was running or not, I could do something like this:
if ($import_running && $script_not_running) {
    $import_running = false;
}
But how do I set that $script_not_running variable? Beats me.
I've shared this entire story with you just in case you have some other brilliant solution.
Try using ignore_user_abort(true); the script will continue to run even if the person leaves and closes the browser.
You might also want to put a number instead of true/false in the DB record and set a maximum number of processes that can run together.
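A rough sketch of that counter idea (table/column names and credentials are illustrative, and the check-then-update is not atomic, so treat it as a starting point only):
<?php
// Sketch: allow up to $maxProcesses concurrent runs via a DB counter.
$maxProcesses = 3;
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

$running = (int) $pdo->query(
    "SELECT value FROM script_state WHERE name = 'import_count'"
)->fetchColumn();

if ($running >= $maxProcesses) {
    exit('Too many imports running');
}

$pdo->exec("UPDATE script_state SET value = value + 1 WHERE name = 'import_count'");
// ... process images ...
$pdo->exec("UPDATE script_state SET value = value - 1 WHERE name = 'import_count'");
?>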
As others have suggested, it would be best to move the image processing out of the request itself.
As an interim "fix", store a timestamp alongside image_import_running when a processing job begins (e.g., image_import_commenced). This is a very crude mechanism, but if you know the maximum time that a job can run before timing out, the script can check whether that period of time has elapsed.
e.g., if image_import_running is still true but the current time is more than 10 minutes since image_import_commenced, run the processing anyway.
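Since the question concerns a WordPress template, a sketch using the WordPress option API might look like this (the option names follow the ones above; the 10-minute cutoff is an assumed value):
<?php
// Sketch of the stale-lock check: if the flag is older than the maximum
// plausible runtime, assume the previous job died and run anyway.
// get_option()/update_option() are standard WordPress functions.
$running   = get_option('image_import_running');
$commenced = (int) get_option('image_import_commenced');
$max_run   = 10 * 60; // assumed maximum job runtime in seconds

if (!$running || (time() - $commenced) > $max_run) {
    update_option('image_import_running', true);
    update_option('image_import_commenced', time());
    // ... process the images here ...
    update_option('image_import_running', false);
}
?>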
What about setting a transient with an expiry time that would throttle the operation?
if (!get_transient('import_running')) {
    set_transient('import_running', true, 30); // set a 30-second transient on the import
    run_the_import_function();
}
I would rather store the job in the database, flag it as pending, and set up a cron job to execute the processing one job at a time.
For me, I just use this simple idea with a text document, for example a run.txt file.
At the top of the script, use:
if (file_get_contents('run.txt') != 'run') { // here the script will work
    $file = fopen('run.txt', 'w+');
    fwrite($file, 'run');
    fclose($file); // was fclose('run.txt'): fclose() needs the file handle
} else {
    exit(); // if it finds 'run' in run.txt the script will stop
}
And add this at the end of your script file:
$file = fopen('run.txt', 'w+');
fwrite($file, ''); // will clear the 'run' word for the next try ;)
fclose($file);
That will check whether the script is already running by checking the contents of run.txt: if the word 'run' exists in run.txt, the script will not run.
Running a cron would definitely be a better solution. The idea of storing the URL in a table is a good one.
To answer the original question, you may run a ps auxwww command with exec (check this page: How to get list of running php scripts using PHP exec()?) and move your function into a separate PHP file.
exec("ps auxwww|grep myfunction.php|grep -v grep", $output);
Just add the following at the top of your script.
<?php
// Ensures a single instance of the script runs at a time.
$fileName = basename(__FILE__);
$output = shell_exec("ps -ef | grep -v grep | grep $fileName | wc -l");
//echo $output;
if ($output > 2) {
    echo "Already running - $fileName\n";
    exit;
}
// Your php script code.
?>

Apache + PHP multiple scripts at the same time

Good day.
First of all, sorry for my bad English =)
So. I created script:
<?php
sleep(10);
?>
My Apache uses an MPM module, and I obviously didn't use sessions in this script, just... just sleep(10).
When I open 2 tabs in my browser simultaneously, first tab loads in 10 seconds, second tab - 20 seconds.
But. When I open this script in 2 different browsers at the same time, it loads in each one after 10 seconds.
So I started thinking that my problem was "Connection: Keep-Alive". I changed my script:
<?php
header('Connection: close');
phpinfo();
sleep(10);
?>
phpinfo() is there to be sure that headers are sent before sleep(). Buuuut... I hit the same problem. In the first tab of Chrome I get headers with "Connection: close"; in the second tab I can't get response headers until the first script has ended. In two different browsers, everything is normal.
And now I have absolutely no idea what I'm doing wrong. Why can't Chrome make two parallel requests to my site? What should I do to solve this problem?
P.S. I don't want to disable keep-alive for my whole site. I don't mind if it speeds up loading of CSS, images and other stuff, even other scripts. But I want the ability to run some scripts in parallel in one browser.
P.P.S. For example: one page will run a very long AJAX query (say, processing some big data server-side), plus AJAX queries at short intervals to get the status of the first query. Obviously, they must run in parallel.
I know it's an old question but I just had the same problem and solved it with session_write_close()!
Without it, PHP purposely queues scripts for the same session.
Simplest possible example:
Looong Script #1:
<?php
session_start();
$_SESSION['progress'] = 0;
session_write_close(); // release the session lock so other requests aren't blocked
for ($i = 0; $i < 100; $i++) {
    session_start();       // re-acquire the session
    $_SESSION['progress']++;
    session_write_close(); // release it again right away
    sleep(1);              // this slows the script down on purpose!
}
?>
Short script #2:
<?php
session_start();
print_r($_SESSION['progress']);
?>
Now try it: open the first script, which takes ages, then open the second script in a new tab and watch the progress update in a blink while the first is still running!! So easy, right?! ;)
The same principle applies to AJAX: poll a long-running script with a second AJAX call to get the progress!
