Delay time until a user can run a PHP script again - php

I've searched a lot and didn't find a simple answer. All I need is to stop a user from running the same PHP script too fast: if he has just sent the form, I need to block the same form from being run again for 10 seconds.
I tried using:
if (isset($_SESSION['var'])) {
    exit;
}
and setting the variable at the start of the script and unsetting it when the script finishes, but that only stops him from running the script while it is currently running. Is there any way to do this?

You can use a timeout to prevent the user from running the script too fast. PHP can store the timeout in the session, like this:
session_start(); // at the beginning of the PHP file
define("TIMEOUT", 10); // 10 sec

// Clear the timestamp once the timeout has elapsed
if (isset($_SESSION['expire'])) {
    if (time() - $_SESSION['expire'] > TIMEOUT) {
        unset($_SESSION['expire']);
    }
}

// Allow the submit only after 10 sec
if (!isset($_SESSION['expire'])) {
    $_SESSION['expire'] = time();
    // Your submit code
}
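Put together with the form from the question, a minimal handler sketch (the submit field name and the message are placeholders):
<?php
session_start();
define("TIMEOUT", 10);

if (isset($_POST['submit'])) {
    $last = $_SESSION['expire'] ?? 0; // 0 if this is the first submit
    if (time() - $last < TIMEOUT) {
        exit('Please wait ' . (TIMEOUT - (time() - $last)) . ' more seconds.');
    }
    $_SESSION['expire'] = time();
    // Your submit code
}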


Conserve a variable after a refresh - php

I have a PHP script that reads a CSV and exports it to a database. At the beginning of each execution, the script gets a customer name via $_POST. It runs for around 7 minutes to send 120k rows. However, my host only allows PHP scripts to run for up to 165 seconds.
My idea was to refresh the page before the 165 s limit and restart the export at the row where it stopped. I've succeeded in refreshing the page, but I'm struggling to preserve the variable that stores the row position where the script stopped, so I can use it after the refresh.
I could use $_POST or $_SESSION, but my script may run several times at the same moment, exporting a different CSV on each run. I'm afraid that changing these superglobals from scripts that run at the same time will make them collide and change their values when I don't want them to.
First: is that concern justified?
Then, if it is, how can I store the number of the row the script stopped at before refreshing the page? I thought about creating a file, putting the information inside and then reading it back. That might look like this:
customer_name : Jon
row_ended : 10584
customer_name : Jane
row_ended : 11564
But isn't there an easier and more efficient solution?
You can create a run ID and save it in the session.
Ex.
session_start();
$_SESSION['run']['id'] = 1; // or some unique ID
$_SESSION['run']['user'] = 'jon';
$_SESSION['run']['lastRow'] = 0;
$endTime = time() + 160; // total secs this run is allowed

// ... export rows here, updating $_SESSION['run']['lastRow'] as you go ...

if (time() >= $endTime) {
    // 160 seconds have passed: remember the row and redirect to the same page
    $_SESSION['run']['lastRow'] = 100000; // the row you stopped at
    header("Location: page.php");
    exit;
}
But this will not solve the problem; it can become a redirect hell.
You can try to increase the max execution time at runtime:
ini_set('max_execution_time', 0); // will run forever
or, the best solution, run it as a shell command with max_execution_time = 0.
Users may navigate away from the page if it takes too long.
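As for the file idea from the question: one file per run sidesteps the collision worry entirely, because parallel exports never share state. A minimal sketch, assuming a hypothetical run ID that you pass along with each refresh (for example in the query string):
<?php
// Hypothetical helpers; the directory and file format are assumptions.
define('PROGRESS_DIR', __DIR__ . '/progress');

function save_progress(string $runId, string $customer, int $row): void {
    // One file per run ID, so concurrent exports cannot collide.
    file_put_contents(
        PROGRESS_DIR . '/' . $runId . '.json',
        json_encode(['customer_name' => $customer, 'row_ended' => $row])
    );
}

function load_progress(string $runId): array {
    $file = PROGRESS_DIR . '/' . $runId . '.json';
    return is_file($file)
        ? json_decode(file_get_contents($file), true)
        : ['customer_name' => null, 'row_ended' => 0];
}
On each refresh the script would call load_progress($runId), resume at row_ended, and save_progress() again before redirecting.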

How to make a function sleep for a period of seconds to prevent a timeout

I have some code that takes a while to finish because it checks my database,
so I need to make this function sleep to prevent a timeout.
I used the following, but it didn't work: the browser kept loading and then hit the timeout. I am using set_time_limit(0); too, but it didn't help.
foreach ($lines as $line) {
    sleep(5);
    // do stuff here
    echo "start again";
}
What I need exactly is to make the browser stop loading for 5 seconds after the first foreach iteration succeeds, then continue the job.
You misunderstood how the sleep function works: it won't stop the browser loading. It waits (for 5 seconds, in your example) on the server side, and only then sends the response to the web client. For asynchronous calls, use AJAX.
If your script times out, try increasing the max execution time.
ini_set('max_execution_time', '0');
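If raising the limit isn't an option, another common pattern is to do a chunk of the work per request and have the client call the script again until it reports it is done, so no single request ever approaches the timeout. A minimal sketch, assuming a hypothetical process_line() helper and that the work items come one per line from a file:
<?php
// chunk.php - hypothetical endpoint that processes one batch per request
$lines  = file('data.txt');            // assumption: one work item per line
$offset = (int)($_GET['offset'] ?? 0); // where the previous request stopped
$batch  = array_slice($lines, $offset, 50);

foreach ($batch as $line) {
    process_line($line); // hypothetical: your per-item database work
}

$next = $offset + count($batch);
echo json_encode(['next' => $next, 'done' => $next >= count($lines)]);
The client re-posts with the returned next offset until done is true.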

PHP long polling script consumes 20/20 processes on my premium shared hosting. How does it work?

I have a page which makes an AJAX call to my PHP script to search for data in the database. When I refresh the page multiple times in a row, I hit my host's limit of 20 processes (20/20). After that, no more PHP scripts can be executed until the process count goes down, which takes 3 or more minutes.
What's going on on the host when the page gets called, and how do I fix it? If 30 people (or even fewer) log on to my website, it will crash.
Here is my script:
// Some logic code
session_write_close();
set_time_limit(60); // it was 400
$genMesaj = false;
do {
    sleep(5);
    // Some code to search the database
    // In case some data is found:
    session_start();
    session_write_close();
    $genMesaj = true; // or false, depending on what the search found
} while (!$genMesaj);
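For what it's worth, bounding the loop, as the long-polling answer further down also does, would at least cap the damage: each process frees itself after a fixed time instead of polling forever. A sketch against the code above:
$deadline = time() + 60; // give up after a minute so the process frees up
do {
    sleep(5);
    // Some code to search the database
} while (!$genMesaj && time() < $deadline);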

Using ignore_user_abort and set_time_limit(0)

I have a form in page1.php which both redirects to page3.php and also triggers an AJAX post to page2.php (with no success function). page2.php might need to run for an hour, but the user doesn't need the results. I do need the user to see page2.php, but he might navigate away.
Do I need to use both of these functions in page2.php? Or just one of them? Or none? I want to make sure the script in page2.php runs to the end.
Page1.php
<form id="form" action="page2.php" method="post">
<!--form input elements with names-->
</form>
<script>
$('#form').submit(function() {
    $.post('page3.php', {
        var1: $('input[name="name1"]').val(),
        var2: $('input[name="name2"]').val()
    });
});
</script>
Page2.php
<?php
ignore_user_abort(true); // Allow user to navigate away
set_time_limit(0); // Allow script to run indefinitely
// a lot of code which will run for a while - between 3 minutes and an hour
?>
Page3.php
<html>
<!--some code here including links to go to page4.php-->
</html>
I am asking partly because I thought there was no need for any of these functions, but I was told to use them. When I try using them, even though there is a die(); and the script stops, it still seems to be processing something, and I'm afraid that because of this "indefinitely" it will be too much for the server.
I don't want to add unnecessary load.
Yes, you would need both of those functions to meet your current criteria. My suggestion, though, would be to move this out of the HTTP protocol: depending on what your script actually does, if it requires no further interaction from the client, it is best run from the command line.
One approach would be to create a cron script that is called at the needed intervals; it would then read a queue which page2.php populates.
If there is anything in the queue, the cron script processes it the way page2.php currently does. Since your script runs for a long time, I suggest a locking mechanism for the cron job; see php.net/flock for a simple file-system lock. You check the file: if it's locked, the job is already running.
Here is a simple example that you can put into a standalone script and run via cron:
$fp = fopen(DATA_PATH . '/locks/isLocked', 'w+'); // DATA_PATH: your app's data directory

if (!flock($fp, LOCK_EX | LOCK_NB)) { // lock the file; LOCK_NB makes this non-blocking
    $logger->info('Already Running'); // $logger: whatever logging you have available
    exit(0);
}

fwrite($fp, time()); // record when we ran, in case we want to inspect it later

try {
    if (hasQueue()) { // checks whether any jobs are waiting in MySQL
        run(); // the processing normally done by page2.php
    }
} catch (Exception $e) {
    // something went wrong; log it / email yourself, etc.
}

flock($fp, LOCK_UN); // unlock the file
fclose($fp);
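To schedule it, a crontab entry along these lines (the path is hypothetical) would attempt the worker every minute; thanks to the flock check, overlapping attempts simply exit:
* * * * * php /path/to/cron_worker.php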

Ajax long polling (comet) + PHP on lighttpd v1.4.22: multiple instances problem

I am new to this site, so I really hope I will provide all the necessary information regarding my question.
I've been trying to create a "new message arrived" notification using long polling. Currently I initiate the polling request from the window.onLoad event of each page on my site.
On the server side I have an infinite loop:
while (1) {
    if ($newMessageCount = NewMessageArrived($current_user)) break; // assuming it returns the unread count
    sleep(10);
}
echo $newMessageCount;
On the client side I have the following (simplified) ajax functions:
function poll_new_messages(){
    xmlhttp = GetXmlHttpObject();
    //...
    xmlhttp.onreadystatechange = got_new_message_count;
    //...
    xmlhttp.send();
}
function got_new_message_count(){
    if (xmlhttp.readyState == 4){
        updateMessageCount(xmlhttp.responseText);
        //...
        poll_new_messages();
    }
}
The problem is that with each page load, the above loop starts again. The result is multiple infinite loops for each user, which eventually make my server hang.
* The NewMessageArrived() function queries the MySQL DB for new unread messages.
* At the beginning of the PHP script I call session_start() in order to obtain the $current_user value.
I am currently the only user of this site, so it is easy for me to debug this behavior by writing time() to a file inside the loop. What I see is that the file is written more often than once every 10 seconds, but this only starts when I go from page to page.
Please let me know if any additional information might help.
Thank you.
I think I found a solution to my problem. I would appreciate it if anyone could tell me whether this is the technique used in COMET, and how scalable this solution is.
I used a per-user semaphore, like this:
$sem_id = sem_get($current_user); // note: sem_get() takes an integer key, so hash the user name if it's a string
sem_acquire($sem_id);
while (1) {
    if ($newMessageCount = NewMessageArrived($current_user)) break;
    sleep(10);
}
sem_release($sem_id);
echo $newMessageCount;
It seems common for long-polling requests to time out after 30 seconds, so in your while loop you could echo 'CLOSE' after 30 seconds:
$newMessageCount = false;
$timer = 0;
while (!$newMessageCount && $timer < 30) {
    $newMessageCount = NewMessageArrived($current_user);
    if (!$newMessageCount) {
        sleep(10);
        $timer += 10;
    }
}
if ($newMessageCount) {
    echo $newMessageCount;
} else {
    echo 'CLOSE';
}
In the JavaScript, you can listen for the CLOSE:
function poll_new_messages(){
    xmlhttp = GetXmlHttpObject();
    //...
    xmlhttp.onreadystatechange = got_new_message_count;
    //...
    xmlhttp.send();
}
function got_new_message_count(){
    if (xmlhttp.readyState == 4){
        if (xmlhttp.responseText != 'CLOSE') {
            updateMessageCount(xmlhttp.responseText);
        }
        //...
        poll_new_messages();
    }
}
Now the PHP will return a response within 30 seconds, no matter what. If your user stays on the page and receives a CLOSE, you just don't update the count on the page, and ask again.
If the user moves to a new page, your PHP instance will stop the loop within 30 seconds regardless and return a response. Being on a new page, though, the XHR that cared about that connection no longer exists, so it won't start another loop.
You might try checking connection_aborted() periodically. Note that connection_aborted() might not pick up on the fact that the connection has been aborted until you've written some output and done a flush().
In fact, just producing some output periodically may be sufficient for PHP to notice the connection close by itself and automatically kill your script.
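A minimal sketch of that applied to the loop from the question (assuming, as above, that NewMessageArrived() returns the unread count; the leading whitespace is harmless if the client parses the response as a number):
ignore_user_abort(true); // let us detect the abort and exit cleanly ourselves

while (1) {
    if ($newMessageCount = NewMessageArrived($current_user)) break;

    // connection_aborted() usually only notices a dead client after
    // output has been attempted, so push a byte through first
    echo ' ';
    flush();
    if (connection_aborted()) {
        exit; // the client is gone: free the process instead of looping on
    }
    sleep(10);
}
echo $newMessageCount;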
