Timed events with PHP/MySQL

I need a way to modify a value in a table after a certain amount of time has passed. My current method is as follows:
insert end time for wait period in table
when a user loads a page requesting the value to be changed, check to see if current >= end time
if it is, change the value and clear the end time; if it isn't, do nothing
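For illustration, a minimal sketch of that check as a single query (the table name, column names, and connection details here are placeholders, not part of the actual schema):
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
// Run on page load: apply the pending change once the wait period has elapsed,
// and clear the end time so the row is not touched again.
$pdo->exec("UPDATE items
            SET value = 'new_value', change_at = NULL
            WHERE change_at IS NOT NULL AND change_at <= NOW()");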
This is going to be a major feature on the site, and so efficiency is key; with that in mind, you can probably see the problem with how I'm doing it: that same chunk of code is going to be called every time someone accesses a page that needs the information.
Any suggestions for improvements or better methods would be greatly appreciated, preferably in php or perl.
In response to cron job answers:
Thanks, and I'd like to do something like that if possible, but host limits are the problem. Since this is a major part of the app, it can't be limited.

Why not use a cron job to update this information behind the scenes? That way you offload the checks on each page hit, and can actually schedule the timing to meet your app's requirements.

Your solution sounds very logical, since you don't have access to cron. Another way would be to store the value in a file, and the next time the page is loaded, check when it was last modified (filemtime("checkfile.txt")) and decide whether it needs modifying again. You should test the performance of both methods.

Can you use a cron job to check each field in the database periodically and update that way?
A big part of this is how frequently the updates are required. A lot of shared hosts limit the frequency of cron checks, for example no more than every 15 minutes, which could affect the application.

You could use a trigger of some sort on each page load. I really have no idea how that would affect performance, but maybe somebody else can shed some light.

If performance really starts to be an issue (which means a lot more traffic than you probably realize), you could use memcached to store the info...


Is MySQL event scheduling the appropriate (or best) solution?

Basically I just want to know if I have the right idea or if a better one exists.
I have a database that stores auctions that are currently taking place on the website (or soon will be). Obviously there comes a point when each auction ends. I am pretty new to web development and was just wondering about the best way of changing the status of an auction in the database to "expired".
I am using phpMyAdmin and MySQL and have read a bit about MySQL event schedulers; from what I can gather they seem like a good way of doing this. On that basis, I would presumably just run a recurring event every minute, say, which checks the current time against the scheduled finish time of each auction and makes changes if need be? Is this correct? I could then also move the expired auctions to an archive - probably the best idea for optimum performance of my database, do you think?
Otherwise, I just wondered if there was a better (or easier) way of doing the same thing. How do sites like eBay go about it, for example? Thanks a lot for your time in advance.
P.S. I realize this is more of a conceptual question than anything specific. Nevertheless I hope it's alright to ask!
You're way overcomplicating this. You don't need to actually change the status of the auction the moment it expires; the status is already implied by the expiry date. If you have a column expiry_time in the database, any time before that the auction is active, and any time after, it's expired. As simple as that. Use appropriate queries, like this one to find all active auctions:
SELECT * FROM `auctions` WHERE `expiry_time` > NOW()
You could do it this way, but I would suggest another approach. This is something I've implemented myself (in a game, but with the same mechanics), and it worked like a charm.
The only thing you have to do is run a function on every page load (before anything else) and check whether anything needs to be moved to the archive. If anything needs taking care of, do it.
If you do it this way, you know that the page the visitors are seeing is always up to date. If you use some sort of cron job or event like you talked about, some of those auctions might "expire" a few seconds after the job is executed and would therefore be incorrectly displayed as active for almost another minute.
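As an illustration, a minimal sketch of that page-load check (the table names and connection details are assumptions; auctions_archive is assumed to have the same columns as auctions):
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
// Call this before rendering any page.
function archive_expired(PDO $pdo) {
    // Capture one cutoff timestamp so both statements affect the same rows.
    $cutoff = $pdo->query('SELECT NOW()')->fetchColumn();
    $pdo->beginTransaction();
    $pdo->prepare('INSERT INTO auctions_archive
                   SELECT * FROM auctions WHERE expiry_time <= ?')
        ->execute([$cutoff]);
    $pdo->prepare('DELETE FROM auctions WHERE expiry_time <= ?')
        ->execute([$cutoff]);
    $pdo->commit();
}
archive_expired($pdo);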

The best way to schedule an event

Is there any other option besides cron for scheduling a PHP backup script to run at a certain time? I know you can use PHP itself to schedule things, but it will only fire if the site is getting traffic.
Are there any other options?
Thanks :-)
If you're talking about a database backup, then MySQL 5.1 and above has CREATE EVENT, which can be used to trigger events (such as stored procedures that dump table structure/data to file) at regular intervals or at set times.
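For reference, a minimal sketch of the CREATE EVENT syntax (the scheduler must be enabled with SET GLOBAL event_scheduler = ON, and backup_proc is a hypothetical stored procedure that performs the actual dump):
CREATE EVENT nightly_backup
    ON SCHEDULE EVERY 1 DAY
    STARTS CURRENT_TIMESTAMP + INTERVAL 1 DAY
    DO CALL backup_proc();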
Well, cron jobs are a solution, but not a necessary one in most cases.
If your script is doing something off the site (like sending an email or something), it should be a cron job.
But...
I once made a text-based RPG where several actions were stored in the database, waiting to be triggered at a specified time. I found that it made no difference whether the script fired at the exact time it should, or when the first person visited the page after that timestamp had passed. You could run these events before displaying the content of the page. (I used a file called monitor, to keep it simple.)
Would you like to say more about your "event"?
Unless you feel like writing a daemon/service/etc., cron would be your best bet. If you need a job run more often than every minute, use a lockfile solution and a looping script, like this:
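A minimal sketch of that lockfile-plus-loop pattern, assuming cron starts the script once a minute and do_work() is your hypothetical job:
$lock = fopen('/tmp/myjob.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // a previous run is still going; let it finish
}
for ($i = 0; $i < 6; $i++) { // runs the job every ~10 seconds within the minute
    do_work();
    sleep(10);
}
flock($lock, LOCK_UN);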
Well, not really - cron jobs are your best bet.
Other than that, call a script on each page load, and if certain parameters are met (such as enough time having elapsed), run the job.

Cron job on a large database with PHP, suggestions needed

I've heard about cron jobs and don't think the actual creation of one will be that hard, but I have some concerns about how this will work with a large script.
Without going too far off-topic on my project, I will stick to the basics of my situation. I need to make a script that, every day, performs a cURL fetch of data from a remote website and updates a database entry for each featured member on my website. In short, the script currently needs to be executed approximately 1000 times, and that number will grow over time.
As you can guess, this will take a long time to perform, so I'm worried about how to keep the execution from crashing in the middle of it.
My first thought was to perhaps split the users into groups and execute on a small number of users each time, but I don't know how manageable this is (I will read further on the topic once I get some form of confirmation on this).
So, to my question: do you think there is any way for me to make this happen, and do you perhaps have any suggestions on how to make it work efficiently? All the help I can get is appreciated. Thank you for your time.
Bigger cron jobs with PHP and MySQL need to be fragmented, since there is no way for you to 'nice' them (reduce their OS priority). Even if you nice the script, the MySQL requests will be executed without this concern.
From what you're describing, there are two aspects to consider:
Congestion of network bandwidth
Congestion of database throughput
I'd recommend a fragmented solution where you call your script from cron more often, and let each invocation execute only a small part of the total job. The job should further be cancelled (postponed to the next run) if I/O bandwidth or CPU usage is above any limit that may affect response time for visitors.
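A minimal sketch of that fragmentation (the table, column, and helper names are assumptions): cron calls the script frequently, and each run touches only the stalest handful of rows.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
// Grab only the members whose data is most out of date.
$rows = $pdo->query('SELECT id, remote_url FROM members
                     ORDER BY last_fetched ASC
                     LIMIT 25')->fetchAll();
foreach ($rows as $row) {
    $data = fetch_remote($row['remote_url']); // hypothetical cURL wrapper
    $pdo->prepare('UPDATE members SET data = ?, last_fetched = NOW() WHERE id = ?')
        ->execute([$data, $row['id']]);
}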
One Way:
I'm usually against putting logic in the database, but in this case a stored procedure might help. It will run your job faster (since it's a large one), and you will also want to lock things while you do it. That way, if the script that calls the stored procedure gets hit by cron before the previous run is finished, it won't edit your database while the first one is still running.
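A minimal sketch of that guard (the procedure and lock names are hypothetical). LOCK TABLES is not permitted inside stored procedures, so a named lock via GET_LOCK() is used instead; a second invocation exits immediately while the first still holds the lock:
DELIMITER //
CREATE PROCEDURE refresh_members()
BEGIN
    IF GET_LOCK('refresh_members', 0) = 1 THEN
        -- ... the large update job goes here ...
        DO RELEASE_LOCK('refresh_members');
    END IF;
END //
DELIMITER ;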
The actual time I cannot give a straight answer on, but based on previous experience this will take longer than the max execution time.
So solve that problem. There's a reason you can have a different php.ini for the command-line interface. Then you can simply focus on processing all users in one script.
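For example, PHP's CLI binary already defaults to max_execution_time = 0 (no limit), and you can be explicit about it either per run or inside the script:
php -d max_execution_time=0 /path/to/job.php
set_time_limit(0); // from within the script itself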
I solved this problem by splitting the work into several smaller cron jobs. If you are using PHP, you can set up cron jobs for domain/cronjob1.php, domain/cronjob2.php, and so on, limiting each one to a slice of the database - say the first 10 rows with
$sql = "SELECT * FROM table LIMIT 10";
for cron job 1, and the rest (e.g. LIMIT 10, 10 for the next slice) in cron job 2.

Most efficient way to display some stats using PHP/MySQL

I need to show some basic stats on the front page of our site like the number of blogs, members, and some counts - all of which are basic queries.
I'd prefer to find a method to run these queries, say, every 30 minutes and store the output, but I'm not sure of the best approach, and I don't really want to use a cron job. Basically, I don't want to make thousands of queries per day just to display these results.
Any ideas on the best method for this type of function?
Thanks in advance
Unfortunately, cron is the better and more reliable solution.
Cron is a time-based job scheduler in Unix-like computer operating systems. The name cron comes from the word "chronos", Greek for "time". Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates. It is commonly used to automate system maintenance or administration, though its general-purpose nature means that it can be used for other purposes, such as connecting to the Internet and downloading email.
If you store the output in a disk file, you can always check whether its filemtime is less than 30 minutes old before re-running the expensive queries.
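A minimal sketch of that check (the cache file name and query function are assumptions):
$cache = 'stats.cache';
if (!file_exists($cache) || filemtime($cache) < time() - 30 * 60) {
    $stats = run_expensive_queries();             // hypothetical
    file_put_contents($cache, serialize($stats)); // refresh the cache file
} else {
    $stats = unserialize(file_get_contents($cache));
}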
There is nothing at all wrong with using a cron to store this kind of stuff somewhere.
If you're looking for slightly more sophisticated caching methods, I suggest reading up on memcached or APC, either of which could provide a solution for your problem.
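With the pecl Memcached extension, for instance, the same pattern looks roughly like this (the key name and query function are assumptions):
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);
$stats = $m->get('front_stats');
if ($stats === false) {                   // cache miss or entry expired
    $stats = run_expensive_queries();     // hypothetical
    $m->set('front_stats', $stats, 30 * 60); // cache for 30 minutes
}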
A cron job is the best approach; nothing else I've seen is feasible.
There are many ways to do this. One option (good, though not the best) is to store your data in a table and refresh it every 30 minutes using a looping script and the function sleep().
I recommend you take a look at the WordPress blog system, and especially at the BuddyPress plugin.
I did the same thing some time ago: every time someone loaded the page, the query did the job and retrieved the information from the database. I remember it was something like
SELECT COUNT(*) FROM my_table
and I got the number of posts in my case.
Anyway, there are many approaches. Good luck.
Don't forget: cron is always your best friend.
Using cron is the simplest way to solve the problem.
One good reason for not using cron: you'll be generating the stats even if nobody requests them.
Depending on the length of time it takes to generate the data (you might want to keep track of the previous counts and just add the counts where the timestamp is greater than the previous run's - with appropriate indexes!), you could trigger the regeneration when a request comes in and the data looks stale.
Note that you should keep the stats in the database and think about how to implement a mutex to avoid multiple requests trying to update the cache at the same time.
However, the right solution would be to update the stats every time a record is added. Unless you've got very large traffic volumes, the overhead would be minimal. While SELECT COUNT(*) FROM some_table will run very quickly, you'll obviously run into problems if you don't simply want to count all the rows in a table (e.g. if blogs and replies are held in the same table). Indeed, if you were to implement the stats update as a trigger on the relevant tables, you wouldn't need to make any changes to your PHP code.
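A minimal sketch of such a trigger (the table and column names are assumptions); a matching AFTER DELETE trigger would decrement the count:
CREATE TRIGGER blogs_count_up
AFTER INSERT ON blogs
FOR EACH ROW
    UPDATE site_stats SET blog_count = blog_count + 1;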

Need advice on cron job'ing a very large process

I have a PHP script that grabs data from an external service and saves it to my database. I need this script to run once every minute for every user in the system (of which I expect there to be thousands). My question is: what's the most efficient way to run this per user, per minute? At first I thought I would have a function that grabs all the user IDs from my database, iterates over them, and performs the task for each one, but I think that as the number of users grows, this will take longer and no longer fit within one-minute intervals. Perhaps I should queue the user IDs and perform the task individually for each one? In which case, I'm actually unsure of how to proceed.
Thanks in advance for any advice.
Edit
To answer Oddthinking's question:
I would like to start the processes for each user at the same time. When the process for each user completes, I want to wait 1 minute, then begin the process again. So I suppose each process for each user should be asynchronous - the process for user 1 shouldn't care about the process for user 2.
To answer sims' question:
I have no control over the external service, and the users of the external service are not the same as the users in my database. I'm afraid I don't know any other scripting languages, so I need to use PHP to do this.
Am I summarising correctly?
You want to do thousands of tasks per minute, but you are not sure if you can finish them all in time?
You need to decide what to do when you start running over your schedule.
Do you keep going until you finish, and then immediately start over?
Do you keep going until you finish, then wait one minute, and then start over?
Do you abort the process, wherever it got to, and then start over?
Do you slow down the frequency (e.g. from now on, just every 2 minutes)?
Do you have two processes running at the same time, and hope that the next run will be faster? (This might work if you are clearing a backlog the first time, so that the second run runs quickly.)
The answers to these questions depend on the application. Cron might not be the right tool for you, depending on the answer. You might be better off with a permanently running process that schedules itself.
So, let me get this straight: you are querying an external service (what is it? SOAP? MySQL?) every minute for every user in the database, and storing the results in the same database. Is that correct?
It seems like a design problem.
If the users on the external service are the same as the users in your database, perhaps the two should be more closely integrated. I don't know if PHP is the way to go for syncing this data. If you give more detail, we could think about another solution. If you are in control of the external service, you may want to have that service dump its data, or even write directly to the database. Some other syncing mechanism might be better.
EDIT
It seems that you are making an application that stores data for each user, which can then be viewed chronologically. Otherwise you may as well just fetch the data when the user requests it.
Fetch all the user IDs in one go.
Iterate over them one by one (assuming that the data being fetched is unique to each user) and (you'll have to be creative here, as PHP threads do not exist AFAIK) spawn a process for each request, since you want them all to be executed at the same time and not delayed if one user does not return data.
Said process should insert the data returned into the db as soon as it is returned.
As for cron being right for the job: As long as you have a powerful enough server that can handle thousands of the above cron jobs running simultaneously, you should be fine.
You could get creative with several PHP scripts. I'm not sure, but if every CLI call to PHP starts a new PHP process, then you could do it like this:
foreach ($users as $user)
{
    // Launch each fetch in the background (the trailing & and the redirect keep
    // shell_exec() from blocking); escapeshellarg() guards against shell injection.
    shell_exec('php fetchdata.php ' . escapeshellarg($user) . ' > /dev/null 2>&1 &');
}
This is all very heavy, and you should not expect to get it done snappily with PHP. Do some tests. Don't take my word for it.
Databases are made to process records in bulk. If you're processing them one by one, you're looking for trouble. You need to find a way to batch up your "every minute" task so that, by executing a SINGLE (complicated) query, all of the affected users' info is retrieved; then you do the PHP processing on the result; then, in another single query, you PUSH the results back into the DB.
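A minimal sketch of that batched shape (all names here are assumptions): one SELECT pulls every due user, PHP processes the whole set, and one multi-row INSERT ... ON DUPLICATE KEY UPDATE writes everything back.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$due = $pdo->query('SELECT id, payload FROM users WHERE next_run <= NOW()')->fetchAll();
$rows = [];
$params = [];
foreach ($due as $u) {
    $rows[]   = '(?, ?)';
    $params[] = $u['id'];
    $params[] = process($u['payload']); // hypothetical per-user computation
}
if ($rows) {
    $pdo->prepare('INSERT INTO results (user_id, data) VALUES '
                . implode(', ', $rows)
                . ' ON DUPLICATE KEY UPDATE data = VALUES(data)')
        ->execute($params);
}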
Based on your big-picture description it sounds like you have a dead-end design. If you are able to get it working right now, it'll most likely be very fragile and it won't scale at all.
I'm guessing that if you have no control over the external service, then that external service might not be happy about getting hammered by your script like this. Have you approached them with your general plan?
Do you really need to do all users every time? Is there any sort of timestamp you can use to be more selective about which users need "updates"? Perhaps if you could describe the goal a little better we might be able to give more specific advice.
Given your clarification of wanting to run the processing of users simultaneously...
The simplest solution that jumps to mind is to have one thread per user. On Windows, threads are significantly cheaper than processes.
However, whether you use threads or processes, having thousands running at the same time is almost certainly unworkable.
Instead, have a pool of threads. The size of the pool is determined by how many threads your machine can comfortably handle at a time. I would expect numbers like 30-150 to be about as far as you might want to go, but it depends very much on the hardware's capacity, and I might be off by an order of magnitude.
Each thread would grab the next user due to be processed from a shared queue, process it, and put it back at the end of the queue, perhaps with a date before which it shouldn't be processed.
(Depending on the amount and type of processing, this might be done on a separate box to the database, to ensure the database isn't overloaded by non-database-related processing.)
This solution ensures that you are always processing as many users as you can, without overloading the machine. As the number of users increases, they are processed less frequently, but always as quickly as the hardware will allow.
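PHP itself has no threads, but the same pool idea can be approximated by running N copies of a worker script, each claiming the next due user from a queue table. A minimal sketch (the table, column, and function names are assumptions):
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
while (true) {
    $pdo->beginTransaction();
    // FOR UPDATE makes the claim atomic, so two workers never grab the same row.
    $row = $pdo->query('SELECT user_id FROM user_queue
                        WHERE next_run <= NOW()
                        ORDER BY next_run LIMIT 1 FOR UPDATE')->fetch();
    if ($row) {
        $pdo->prepare('UPDATE user_queue SET next_run = NOW() + INTERVAL 1 MINUTE
                       WHERE user_id = ?')->execute([$row['user_id']]);
        $pdo->commit();
        process_user($row['user_id']); // hypothetical per-user job
    } else {
        $pdo->commit();
        sleep(1); // nothing due yet
    }
}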
