Everything I google tells me this should not be happening, yet it is.
I'm building a migration tool to build a 'master database'. I have an admin panel, accessible only to a few select people, with a merge button that starts an AJAX call to run the migration PHP function. I'm not positive how long this script will take since I'm still developing it, but nonetheless I'm expecting a minimum of 20 minutes once it's pushed to production and populated with the production database. I do NOT need a lecture on best practices telling me not to do it via a GUI. This will become a cron as well; however, I want to be able to trigger it manually if the admin desires.
So here's my process. The migration function immediately closes the session with session_write_close(), allowing me to run multiple PHP scripts simultaneously. I do this because I start a setInterval that checks a session variable. This is my 'progress', which is just an int tracking which loop iteration I'm on. In my migration script I open the session, add 1 to that int, and close the session again; I do this at the end of each loop.
By doing this I have successfully created a progress indicator for my AJAX. Now I noticed something: if I start my migration and then close out of my tab, or refresh, my progress continues to grow once I reload the page. This tells me that the migration script is still executing in the background.
If I close 100% out of my browser, or clear my sessions, I no longer see progress go up. This, however, is not because the script stops; it's because my progress indication relies on sessions, and once I clear my sessions or close my browser my session cookie changes. I know the script is still running because I can query the database manually and see that entries are still being added.
NOW to my question:
I do NOT want this. If my browser closes, if I press refresh, if I lose connection, etc., I want the script to be TERMINATED. I want it to stop mid-process.
I tried ignore_user_abort(false); however, I'm pretty sure that setting is specific to the command line, and it made no difference for me.
I want it to be terminated because I'm building a 'progress resume' function where we can choose where to resume the migration from later.
Any suggestions?
UPDATE:
I didn't want to go this route, but one solution I just thought of: I could have another session variable, a 'last time client was validated' timestamp. In my JavaScript, on the client side, every 30 seconds or so I could hit a PHP script to update that timestamp. And in my migration function, at the beginning of each loop, I could check that the timestamp isn't, for example, 60 seconds old. If it IS 60 seconds old, or older, I do a die(), thus stopping my script. Logically this means 'if there is no client updating this timestamp, then we can assume the user closed out of his browser/tab or refreshed'. As for the function, I can skip this check when running from the command line (cron). Not the ideal solution, but it is my plan B.
I went with the solution of pinging from the client to indicate whether the client is still alive or not.
So essentially this is what I did:
From the client, in JavaScript, I set up a setInterval that runs every 1.5 seconds and hits a PHP script via AJAX. That PHP script updates a session variable with the current timestamp (this could easily be a database value if you needed it to be; however, I didn't want the overhead of another query).
$_SESSION['migration_listsync_clientLastPing_'.$decodedJson['progressKey']] = time();
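Fleshed out, that ping endpoint might look roughly like this (the file name and the JSON payload shape are assumptions based on the description above):
<?php
// ping.php - hit by the client's 1.5-second setInterval via AJAX
session_start();
// The client is assumed to send its progressKey in the request body
$decodedJson = json_decode(file_get_contents('php://input'), true);
// Record when this client was last seen for the given migration run
$_SESSION['migration_listsync_clientLastPing_'.$decodedJson['progressKey']] = time();
// Release the session lock so the long-running migration script isn't blocked
session_write_close();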
Then, inside my migration function, I check whether that timestamp is over 10 seconds old, and if it is I die(), thus killing the script.
if (isset($_SESSION['migration_listsync_clientLastPing_'.$progressKey])) {
    // Seconds since the client last pinged
    $calc = time() - $_SESSION['migration_listsync_clientLastPing_'.$progressKey];
    if ($calc > 10) {
        die(); // the client is gone, so stop the migration mid-process
    }
}
I added a 'progressKey' param, which is a random number from 1-100 generated in JavaScript when the function is called and passed into both of my AJAX calls. This way, if the user refreshes the page and then immediately presses the button again, we won't have two instances of the function running: the 'old' instance will die after a few seconds and the new instance will take over.
This isn't an ideal solution, but it is an effective one.
I have a cron script that spiders a website for new content and saves the entries I need into the database. Entries are md5-hashed and validated to prevent dupes. However, I have noticed that there are sometimes two instances running at the same time, and the hashing method fails at that point, as each preg_match result gets inserted into the DB twice.
Can someone recommend the best way to prevent this from happening in the future?
I have considered locking execution by checking log files, but in that case the script may get permanently locked if an error occurs in the middle of a run.
I'm looking into setting $_SESSION['lock'], so that if it locks and breaks, the session is bound to expire at some point.
Any ideas?
I think $_SESSION should be left for scripts running under a web server, not the command line.
I would store the last activity time in a file. If the cron job finishes its work normally, you delete the file.
When the cron script starts, check the file. If the file doesn't exist, or the last activity is older than a certain time span, continue executing; otherwise, stop.
This would be pretty easy to implement, too.
Check if the script should run:
if (file_exists('lock.txt') && file_get_contents('lock.txt') > (time() - 60)) {
    die('Should not run!'); // another instance was active within the last 60 seconds
}
Log activity at certain points in the script's life cycle:
file_put_contents('lock.txt', time());
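Putting the pieces together, one full run might look something like this (the spidering loop is only a placeholder); deleting the file on normal completion keeps a stale lock from sticking around, and refreshing the timestamp mid-run lets a crashed run's lock expire on its own:
<?php
$lockFile = 'lock.txt';
// Refuse to start if another instance was active within the last 60 seconds
if (file_exists($lockFile) && file_get_contents($lockFile) > (time() - 60)) {
    die('Should not run!');
}
file_put_contents($lockFile, time()); // take the lock
foreach ($pagesToSpider as $page) { // $pagesToSpider is a placeholder for the real work list
    // ... fetch, md5-hash, validate, insert ...
    file_put_contents($lockFile, time()); // refresh the lock as the run progresses
}
unlink($lockFile); // finished normally: remove the lock file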
It's been bugging me for a day now and I really can't figure it out. I have a basic login/register page, and when registering, a timestamp is stored in the MySQL database (table timer, column cooldown):
$settime = mysql_query("INSERT INTO `timer` (`cooldown`) VALUES(0)") or die(mysql_error());
What I want to do now (I'm creating a browser MMORPG) is start a timer in my database whenever I receive a specific POST request. This timer should be 1 minute, and should also be shown to users, like: <?php echo $timer['cooldown']; ?>. Whenever the timer is 0, I can perform a specific function again, and the timer is set back to 60 seconds.
Sorry for the lack of knowledge, but I can't find anywhere how to do this.
What you're trying to do here, a background job, goes against the web development principle of the request-response cycle in PHP's shared-nothing environment.
But there are several ways to break out of that rigid cycle:
If you just need to do some DB updates after 1 minute, you can use MySQL events: http://dev.mysql.com/doc/refman/5.1/en/events.html
If the function you want to call doesn't run too long, you can check on every user request whether there are entries in the timer table older than 1 minute (see the sketch after this list).
Create a PHP script that is called by a cron job every minute. It checks whether there are unhandled items in the timer table and does something with them.
Create a PHP daemon script that wakes up every minute to check.
If you need to change something on the user's page after 1 minute, making the PHP script call client-side, with a JavaScript timeout and AJAX or websockets, is the better option.
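For the check-on-request option, a minimal sketch, keeping the question's mysql_* API; storing the expiry time in cooldown instead of 0, and $userId, are my assumptions:
<?php
// When the specific POST request happens: start the 60-second cooldown
mysql_query("UPDATE `timer` SET `cooldown` = " . (time() + 60) . " WHERE `id` = " . (int)$userId);
// On every request: an expired (or never-set) cooldown means the action is allowed again
$row = mysql_fetch_assoc(mysql_query("SELECT `cooldown` FROM `timer` WHERE `id` = " . (int)$userId));
$remaining = max(0, (int)$row['cooldown'] - time());
if ($remaining === 0) {
    // safe to run the specific function again
}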
For displaying the countdown to the user, you have to use JavaScript. If you are using a server-side timer, use the client-side one just for display purposes, hiding the countdown when it's finished. To work around the 'user opens a new page before the timeout is finished' problem, put the remaining seconds in an HTML data attribute where your JavaScript code can read them.
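As a rough illustration of that last point, assuming $remaining was computed server-side as in the sketch above, the PHP page could hand the value to JavaScript like this:
<span id="cooldown" data-remaining="<?php echo (int)$remaining; ?>"></span>
<script>
// Count down once per second; hide the display when the timer reaches zero
var el = document.getElementById('cooldown');
var left = parseInt(el.getAttribute('data-remaining'), 10);
el.textContent = left > 0 ? left + 's' : '';
var tick = setInterval(function () {
    left = Math.max(0, left - 1);
    el.textContent = left > 0 ? left + 's' : '';
    if (left === 0) clearInterval(tick);
}, 1000);
</script>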
I have a big script written in PHP which imports a lot of information into a Prestashop installation using web services. The script is written in 'sections': there is a function that imports the categories, another that imports products, then manufacturers, and so on; there are about 7-10 functions called in the main script. Basically, I assume this script must run for about an hour, passing from one function to the next until it arrives at the last function, then it returns some values and stops until the next night.
I would like to understand which would be better:
1) impose a time limit of 30 minutes every time I enter a new function (this will prevent the timeout)
2) make a chain of pages, each one with a single function call (and of course the time limit)
or any other idea... I would like to:
know if a function has been called (maybe using a global variable?)
be sure that the server will execute the functions in order (hence the page chain)...
I hope to have been clear; otherwise I'll update the question.
Edit:
the script is executed by another server that calls a page. The other server is 'unknown' to me, so I simply know that this page is called (they could also trigger the functions by visiting the page), but anyway I have no control over it.
For any long-running script, I would run it through the command line, probably with a cron job to kick it off. If it's triggered from the outside, I would create a job queue (for example, in the database) where you insert a new row to signify that it should run, along with any variable input params. Then the background job would run, say, every 5 minutes and check whether there's a new job in the queue. If there isn't, it just exits. If there is, it marks that work has begun and starts processing. When done, it marks the job as done.
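A minimal sketch of such a worker, where the import_jobs table, its columns, and the DSN are all placeholders:
<?php
// worker.php - kicked off by cron every 5 minutes
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass'); // placeholder credentials
$job = $pdo->query("SELECT * FROM import_jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($job === false) {
    exit; // queue is empty, nothing to do until the next run
}
// Mark that work has begun; the status check in WHERE guards against double pickup
$claim = $pdo->prepare("UPDATE import_jobs SET status = 'running' WHERE id = ? AND status = 'pending'");
$claim->execute(array($job['id']));
if ($claim->rowCount() === 0) {
    exit; // another worker claimed it first
}
// ... run the categories/products/manufacturers imports with $job's input params ...
$done = $pdo->prepare("UPDATE import_jobs SET status = 'done' WHERE id = ?");
$done->execute(array($job['id']));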
An hour of work is a looooooooong time, though. Is there nothing you can do to optimise that?
You can increase the time limit for the execution of a script as much as you want using:
set_time_limit(1800); // number of seconds, or 0 for no limit
Long-running scripts also need more memory. You can increase the memory limit using:
ini_set('memory_limit', '20M');
The other thing you have to make sure of is that you are running your script on a dedicated server, because if you are using a shared server, the host will automatically kill long-running scripts.
I run a procedure that takes about 20 minutes to complete, and I wonder if PHP can keep the connection active until the process finishes.
To be clear, the page will have a button which, when pressed, calls a PHP page to run a SQL query; in the main page I just wait for the HTTP request to complete before showing a success message.
For queries that take up some time, you should move some automation into the mix, preferably cron jobs if you have access to a Linux server.
With cron jobs, you can create entries in the database for specific values, linked to the user. The cron job will kick in, let's say, every 5 minutes to execute a query if a pending query has finished. This minimizes how long the user needs to sit on the page waiting for completion. Because you should know: the second the user navigates away from the active page, all active connections, queries, etc. will stop.
Once the script has completed, create a custom message at the end to send to the user, letting them know that their process has finished.
You should also know that PHP works down the script from line 1 to the end, so if your hang is on line 40, for example, the script will sit on that line until the query has completed, then carry on processing.
Edit: this is for example purposes only, to point you in the direction I'm getting at, and should not be used as you see it. It is merely a mock-up example.
<?php
if (isset($_POST['ButtonToExecute']))
{
    // Query a table in your database which an external PHP script will work with.
    // Example table structure:
    /*
        Username
        State
    */
    // $state would be fetched from that table for the current user
    if ($state == 0)
    {
        // Update state to 1, with username set to the user the query should run for
    }
    else
    {
        // Warn the user that their process will take longer to complete, as there
        // already is an active query in process, but add them to a secondary
        // pipeline which will be picked up on the next cronjob interval
    }
}
?>
On your cron job, you might have:
<?php
// $state would be fetched from the same table as above
if ($state == '1')
{
    // Execute your script
    // After query execution, update state to 2
    // If state == 2, insert a custom message/email to send
}
?>
Edit your crontab:
yourpreferrededitor /etc/crontab
^ Where yourpreferrededitor means your text editor, whether nano or another.
And your cron job line:
* * * * * root /usr/bin/php /var/www/cron.php
^ This is taken from a current cron job I have running constantly, set for every minute of every day.
A cron job will give the user the ability to navigate away from the page because, as I mentioned above, the second the user navigates away from the script, all ongoing processes stop. The cron job will carry on running throughout without any user interaction needed (apart from the initial request they make).
You could do it in an ordinary PHP script if you set the timeout limit and ignore user abort, but this is a bad idea because your user will have no feedback on whether the script worked.
At the company I work at, we had the same problem.
What we did:
Run the query in a separate script.
Start the script WITHOUT waiting for results.
Split the query into multiple parts (using LIMIT and OFFSET).
Have the query script write progress messages to a temp file (sketched below).
Implement a status screen fetching the current state from the log via AJAX.
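A sketch of steps 3 and 4, with the table name, chunk size, and temp-file path as assumptions:
<?php
// query_worker.php - started in the background, not from the user's request
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$chunkSize = 1000;
$total = (int)$pdo->query("SELECT COUNT(*) FROM nodes")->fetchColumn();
for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    // Step 3: process one slice of the big query
    $stmt = $pdo->prepare("SELECT * FROM nodes ORDER BY id LIMIT ? OFFSET ?");
    $stmt->bindValue(1, $chunkSize, PDO::PARAM_INT);
    $stmt->bindValue(2, $offset, PDO::PARAM_INT);
    $stmt->execute();
    // ... work on the fetched rows ...
    // Step 4: write progress where the AJAX status screen can read it
    file_put_contents('/tmp/query_status.txt', min($offset + $chunkSize, $total) . '/' . $total);
}
file_put_contents('/tmp/query_status.txt', 'done');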
BTW, an example for all the wise guys asking why it takes so long:
using a transitive hull can speed up your application a lot if you have to deal with a tree with millions of nodes, but building it can take hours.
You could try:
set_time_limit(0);
You may want to take a look at Set Time Limit
In your php.ini you can set a max execution time for scripts; however, you NEVER want to have a user sit on a page loading screen for that long.
One thing that I could suggest would be to increase your max execution time to something around 30 minutes, and then call the script in JavaScript after the page has already been rendered, so the user will not be left in the dark not knowing what the script is doing.
Hello, I have some problems with my PHP AJAX script.
I'm using PHP/MySQL.
I have a field in my accounts table that saves the time of the last request from a user; I will use that to kick idle users out of the chat. I will make a PHP function that deletes all the rows whose time field is past the time limit, but where should I call this function? Is it okay to fire it every time a new request is sent to my index.php? I think that would put a huge load on the server, wouldn't it? Do you have a better solution?
Thanks.
There are two viable solutions:
either create a small PHP script that performs this deletion in an infinite loop (and of course sleeps for a specified amount of time before doing it again), and start it via the PHP CLI (sketched below),
or create one that performs the deletion only once, then exits, and call it from cron (if you're using a UNIX-ish server) or Task Scheduler (on Windows).
The second one is simpler, but its drawback is that you can't make the interval between deletions shorter than 60 seconds.
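As a sketch of the first option (the table and column names, and the DSN, are assumptions):
<?php
// cleanup_daemon.php - start once from the command line: php cleanup_daemon.php
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass'); // placeholder DSN
while (true) {
    // Kick out everyone whose last request is past the idle limit
    $pdo->exec("DELETE FROM `accounts` WHERE `last_request` < NOW() - INTERVAL 60 SECOND");
    sleep(10); // unlike cron, the interval can be well under 60 seconds
}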
A solution could be to fire the deletion function just once every few requests.
Using rand() you could give it a 1-in-100 (for example) chance of running the function, so that roughly one page request in a hundred cleans up the expired data.
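In code, that could be as small as this at the top of index.php, keeping the question's mysql_* API; the table and column names are assumptions:
<?php
// Roughly one request in a hundred pays the cleanup cost
if (rand(1, 100) === 1) {
    mysql_query("DELETE FROM `accounts` WHERE `last_request` < NOW() - INTERVAL 60 SECOND");
}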