I need to update a database every 20 minutes, let's say (i.e. add 50 to the 'x' column, subtract 20 from the 'y' column, perform a time-based equation on the 'z' column, etc.). I have the necessary update in an update.php page, but how would I go about calling that page every 20 minutes (short of scheduling a task on a computer)? Or is there a better way to do this?
Thanks
Linux
This should be done as a cron job or alternatively using at to schedule the job. I have written a PHP wrapper for the at command that you could use for this purpose: https://github.com/treffynnon/PHP-at-Job-Queue-Wrapper
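For example, a crontab entry that runs the update script every 20 minutes could look like the following; the PHP binary location and the script path are assumptions you would adjust:

*/20 * * * * /usr/bin/php /path/to/update.php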
Windows
You need to use Scheduled Tasks. Here is a link to an old article I wrote about moving Linux cronjobs to a Windows machine: http://blog.simonholywell.com/post/374209271/linux-to-windows-server-migrating-and-securing-your-cron
Where does this page live? If it's on a Linux box, use a cron job. If it's on a Windows machine, use a Scheduled Task.
The answer to your question is no, you're going to have to schedule a task, basically because this situation is exactly what task scheduling was made for!
On Linux you can use a cron job. On Windows it is done with a Scheduled Task.
I have created a PHP script which scrapes 1 million domains and analyzes the content. I tested it locally and it takes 20 minutes per 1,000 domains scraped.
Can I just set up a server with it and let it run for 2 weeks, or is there a reason why a PHP file would crash after a certain execution time?
If you run PHP from the console, it has no max execution time. That being said, you should probably rearchitect your idea if it takes 2 weeks to execute. Maybe have a JS frontend that calls a PHP script that scrapes 5 or 10 domains at a time...
Sure, you could, if you run the code via the command line or raise max_execution_time.
With that said, I would highly recommend that you re-architect your code. If you're running this code on a Linux box, look into pthreads. The task you're trying to do seems like it would be easier with C# if you're running on a Windows machine.
NOTE: I can't stress enough that if you use threading for this task it will go much faster.
I would suggest the following:
Limit your script to X domains per execution.
Create a cron job that runs your script every minute.
This way you won't have to worry too much about memory leaks. You might also want to create a .lock file at the beginning of your process to make sure the cron job doesn't run the script again before the previous run has finished, roughly as sketched below. Sometimes requesting information from other websites can take very long...
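A minimal sketch of that lock-file guard, assuming the actual scraping lives in hypothetical fetch_next_domains() and scrape_domain() helpers and the lock path is a placeholder:

// scraper-batch.php: run by cron every minute; exits early if a previous run is still active
$lock = fopen('/tmp/scraper.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // another instance is still running
}
foreach (fetch_next_domains(50) as $domain) { // hypothetical: the next 50 unscraped domains
    scrape_domain($domain); // hypothetical scraping routine
}
flock($lock, LOCK_UN); // release the lock so the next cron run can start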
The problem with cron jobs is that they can end up over-running, and so you have more than one copy running at the same time. If you are running multiple copies from cron at once there will be a huge load spike, yet there might be nothing running for the last 30 seconds of every minute. (Trust me, I've seen it happen; it was not pretty.)
A simple shell script can be set running easily with the normal Linux startup mechanisms and will then loop forever. Here, I've added the ability to check the exit code of the PHP script (or whatever) to exit the loop. Add other checks to deliberately slow down execution. Here's my blogpost on the subject.
I would arrange for the script to run somewhere from 10-50 domain-scrapes and then exit, ready to run again, until you run out of data to look for or some other issue happens that requires attention.
#!/bin/bash
# a shell script that keeps looping until a specific exit code is given
# Start from /etc/init.d, or SupervisorD, for example.
# It will restart itself until the script it calls returns a given exit code
nice php -q -f ./cli-worker.php -- "$@"
ERR=$?
# if php does an `exit(99);` ...
if [ $ERR -eq 99 ]
then
# planned complete exit
echo "99: PLANNED_SHUTDOWN";
exit 0;
fi
sleep 1
# Call ourselves again, replacing this script without spawning a sub-process
exec $0 "$@"
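The matching PHP side could be as small as this sketch; fetch_next_domains() and scrape_domain() are hypothetical helpers, but the exit(99) convention is the one the wrapper above checks for:

// cli-worker.php: process one batch, then tell the shell wrapper what to do next
$batch = fetch_next_domains(25); // hypothetical: pull up to 25 unscraped domains
if (empty($batch)) {
    exit(99); // nothing left to do: the wrapper sees 99 and shuts down cleanly
}
foreach ($batch as $domain) {
    scrape_domain($domain); // hypothetical scraping routine
}
exit(0); // batch done: the wrapper sleeps for a second and restarts us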
Currently I'm trying to build a good scheduler system as an interface for setting and editing cron jobs on my system. My system is built using Zend Framework 1.11.11 on a Linux server.
I have 2 main problems that I want your suggestions on:
Problem 1: The setup of the application itself
I have 2 ways to run the cron jobs:
The first way is to create a scripts folder with a common bootstrap file in it, where I'll load only the resources that I need. Then for each task I'll create a separate script, and in each script I'll include the bootstrap file. Finally, I'll add a cron task in the crontab file for each one of these scripts, and the task will be something like * * * * * php /path/to/scripts/folder/cronScript_1.php (a minimal bootstrap sketch follows below).
The second way is to treat the cron job like a normal request (no special bootstrap): add a cron task in the crontab file for each action, and the task will be something like * * * * * curl http://www.mydomain.com/module/controller/action.
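For the first option, a minimal CLI bootstrap for Zend Framework 1 could look something like this sketch; the paths and the choice of resources are assumptions:

// scripts/bootstrap.php: load only what the CLI tasks need
define('APPLICATION_PATH', realpath(__DIR__ . '/../application'));
require_once 'Zend/Application.php'; // assumes the ZF library is on the include path
$application = new Zend_Application('production', APPLICATION_PATH . '/configs/application.ini');
$application->getBootstrap()->bootstrap(array('db')); // bootstrap only the db resource, for example

Each cronScript_N.php would then start with require __DIR__ . '/bootstrap.php'; and do its own work.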
Problem 2: The interface to the application
Adding a cron job also can be done in 2 ways:
For each task there will be an entry in the crontab file. When I want to add a new task I must do it via cPanel or some other means of editing the crontab (which might not be available).
Store the tasks in the database and provide a UI for interacting with it (a grid to add tasks and their configuration). Then write only 1 cron job in the crontab file that runs every minute. This job will select all jobs from the database and check whether any of them should run now (the scheduled time for each task is stored and compared with the current time of the server).
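A dispatcher for the second option could look roughly like this sketch; the jobs table, its columns (command, next_run, run_interval in minutes), and the connection details are assumptions:

// dispatcher.php: the single crontab entry, e.g. * * * * * php /path/to/dispatcher.php
$pdo = new PDO('mysql:host=localhost;dbname=scheduler', 'user', 'pass'); // placeholder credentials
$jobs = $pdo->query("SELECT id, command FROM jobs WHERE next_run <= NOW()");
foreach ($jobs->fetchAll(PDO::FETCH_ASSOC) as $job) {
    exec($job['command']); // run the task; add a per-job lock if tasks can outlive a minute
    $update = $pdo->prepare("UPDATE jobs SET next_run = DATE_ADD(NOW(), INTERVAL run_interval MINUTE) WHERE id = ?");
    $update->execute(array($job['id']));
}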
In your opinion, which way is better to implement for each part? Is there a ready-made solution for this that is better in general?
Note
I came across Quartz while searching for a ready-made solution. Is this what I'm looking for or is it something totally different?
Thanks.
Just my opinion, but I personally like both 1 & 2 depending on what your script is intended to accomplish. For instance, we mostly do 1 with all of our cron entries, as it becomes really easy to look at /etc/crontab and see at a glance when things are supposed to run. However, there are times when a script needs to be called every minute because logic within the script will then figure out what to run in that exact minute (i.e. millions of users that need to be processed continually, so you have a formula for which users to handle in each minute of the hour).
Also take a look at Gearman (http://gearman.org/). It enables you to have cron scripts running on one machine that then slice up the jobs into smaller bits and farm those bits out to other servers for processing. You have full control over how far you want to take the map/reduce aspect of it. It has helped us immensely and allows us to process thousands of algorithm scripts per minute. If we need more power we just spin up more "workhorse" nodes and Gearman automatically detects and utilizes them.
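With the pecl gearman extension for PHP, the split looks roughly like this sketch; the gearmand address, the 'scrape' function name, and the payload are assumptions:

// worker.php: run on each workhorse node
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730); // placeholder gearmand address
$worker->addFunction('scrape', function (GearmanJob $job) {
    $domain = $job->workload();
    // ... process $domain and store the result ...
});
while ($worker->work());

// client side: called from the cron script to farm out one unit of work
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('scrape', 'example.com'); // fire-and-forget background job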
We currently do everything on the command line and don't use cPanel, Plesk, etc. so I can't attest to what it's like editing the crontab from one of those backends. You may want to consider having one person be the crontab "gatekeeper" on your team. Throw the expected crontab entries into a non web accessible folder in your project code. Then whenever a change to the file is pushed to version control that person is expected to SSH into the appropriate machine and make the changes. I am not sure of your internal structure so this may or may not be feasible, but it's a good idea for developers to be able to see the way(s) that crontab will be executing scripts.
For Problem 2 (the interface to the application) I've used both methods 1 & 2, and I strongly recommend the 2nd one. It will take quite a bit more upfront work creating the database tables and building the UI. In the long run, though, it will make it much easier to add new jobs to be run. I built the UI for my current company and it's so easy to use that non-technical people (accountants, warehouse supervisors) are able to go in and create jobs.
Much easier than logging onto the server as root, editing crontab, remembering the patterns and saving. Plus you won't be known as "The crontab guy" who everyone comes to whenever they want to add something to crontab.
As for setting up the application itself, I would have cron call one script and have that script run the rest. That way you only need 1 cron entry. Just be aware that if running the jobs takes a long time, you need to make sure that the script only starts running if there are no other instances running. Otherwise you may end up with the same job running twice.
We have a small company and we have developed our own CMS in PHP/MySQL.
The code is stored in local files and in the database.
We'd like to be able to update the code of the CMS on our client's servers.
This process should be semi-automatic: once we 'publish' the update, the code gets replaced on the client's server and in the database.
I was thinking about using Bazaar in combination with Bazaar Upload. This would take care of the files.
But what about the database? Is there a standard method already available or should I upload a .sql file that gets installed when a user logs in to the CMS?
Thanks in advance for your suggestions!
For this sort of thing I'm considering Liquibase, but it needs Java to run, so you either need Java on the server or perhaps on the machine from which you trigger the deployment.
Use a combination of SVN and cron
Use an install package (.rpm, .sh, .deb, whatever) and set up a cron job to run your update script.
Pseudocode:
#!/bin/sh
# $current_version is assumed to hold the installed version; fetch_version is a hypothetical helper
version=$(fetch_version)
if [ "$version" -gt "$current_version" ]; then  # assumes numeric version numbers
cd /path/on/client/server
svn update
/path/on/client/server/update_script.sh
fi
where update_script.sh will take care of whatever you need (SQL, cron, files, certificates, ...).
Second variant
You can use something like a fake cron job:
In the administration area you can create an "Autoupdate" feature, which can be triggered by a button/link or by a timer. The timer is activated by running the autoupdate script after login to the CMS administration.
Simply check the time since the last update check and, if it is due, download the files, run the .sql scripts, or whatever else you need, roughly as sketched below.
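A minimal sketch of that on-login check; the stamp-file path, the interval, and run_autoupdate() are assumptions:

// called on every login to the CMS administration
$stampFile = '/path/to/last_update_check'; // placeholder
$interval = 86400; // at most once a day, for example
$last = is_file($stampFile) ? (int) file_get_contents($stampFile) : 0;
if (time() - $last >= $interval) {
    file_put_contents($stampFile, (string) time());
    run_autoupdate(); // hypothetical: downloads files, applies .sql scripts
}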
The question is very easy: I want to execute several PHP files every "N" minutes. For example:
every N minutes
{
execute(script1.php)
execute(script2.php)
execute(script3.php)
}
I know about crontab but I was trying to find another solution. Any suggestions? Thanks in advance.
Using a cron job is the usual solution. Can you explain why you don't want to use cron? I've also seen libraries that add cron-like features to your system; for example, in the Java/Groovy/Grails world there's the Quartz library/plugin. A quick Google search yielded a PHP library called phpJobScheduler that seems similar to Quartz. I have never used phpJobScheduler, so I can't vouch for it.
I'd be interested in why you don't want to use crontabs for this? Are you going to be the primary web operations person running this server or will you be relying on an existing sysop team? You may want to get their input since they are the ones who will be most impacted by what method you choose. I've found they tend to be fond of cron for simple scheduling.
On Windows I used the built-in program called Task Scheduler. As for Linux, yes, cron jobs are your answer.
You could create a PHP script that loops forever, doing X every N minutes, and run the command with an & to make it a background process:
/path/to/php /home/user/bgscript.php &
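A minimal sketch of what bgscript.php could contain; the interval and the script names (borrowed from the example in the question) are placeholders:

// bgscript.php: loop forever, running the workers every N minutes
$intervalMinutes = 20; // placeholder interval
while (true) {
    foreach (array('script1.php', 'script2.php', 'script3.php') as $script) {
        system('php ' . escapeshellarg(__DIR__ . '/' . $script)); // each worker in its own PHP process
    }
    sleep($intervalMinutes * 60);
}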
If you want it to always run, you'd then have to add it to your startup configuration (init.d or services, depending on the flavor of *nix).
This solution is possible, but personally I would highly recommend going with crontab: it's established, proven and works well! Why are you avoiding it?
You could build a script and let it run as a daemon, performing certain tasks on a set interval, but that's actually just simulating cron... and if you have the ability to run a PHP script as a daemon, you should really also be able to run it as a cron job, since that's what crons are made for.
If you need more info on how to run a PHP script as a daemon, read this great intro. There is also a great comparison between daemons and cron in there, worth the read.
Hey friends,
How can I develop an app that runs under php.exe in my PHP folder? I need to know whether I can develop an app that will run continuously this way, and whether I can make it work on my cPanel host.
EDIT
What I mean is: a friend suggested I develop a page with a timer in it that checks an RSS page every 5 minutes to see whether anything has changed or a new item has been added. If we develop something like this, it is difficult for us to run the PHP program again and again manually, right? So I need something like that, and I need help with it.
If you're trying to develop a GUI app, you might want to have a look at WinBinder or PHP-GTK.
If you're asking how to write a program... then the answer is a bit too long to include here.
You can develop a PHP app that runs on the command line (launched by php.exe). This way, you will not have a restriction on the running time, but it will not be integrated with cPanel.
I have to admit I probably didn't understand the question completely, so my answer is probably a bit off...
Edit:
After your update, it seems that what you need is a cron/scheduler job. The cron job will call your script at the interval you specify; a sketch of such a script follows. Google: PHP cron jobs.
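A minimal sketch of the script such a cron job could call every 5 minutes; the feed URL and state-file path are placeholders, and an RSS 2.0 feed with guid elements is assumed:

// rss-check.php, run by cron: */5 * * * * php /path/to/rss-check.php
$feedUrl = 'http://example.com/feed.xml'; // placeholder
$stateFile = '/tmp/last_rss_item'; // placeholder
$feed = simplexml_load_file($feedUrl);
if ($feed === false) {
    exit(1); // feed unreachable or malformed; try again next run
}
$latest = (string) $feed->channel->item[0]->guid;
$previous = is_file($stateFile) ? file_get_contents($stateFile) : '';
if ($latest !== $previous) {
    file_put_contents($stateFile, $latest);
    // a new item was published: handle it here
}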
If you mean periodically by "work continuously", you can use cron jobs (or the Windows scheduler). If you mean really continuously, you should build an app with service/daemon functionality. As friends here have already pointed out, it's a bit lengthy to fit in this edit box.