Closed. This question is off-topic. It is not currently accepting answers.
Closed 11 years ago.
I need a job scheduler (a library) that queries a database every 5 minutes, triggers the jobs whose scheduled time has passed, and reruns them on failure.
It should be in Python or PHP.
I researched and came up with Advanced Python Scheduler, but it is not appropriate because it only schedules jobs kept in its own job store. I want it to take jobs from a database instead.
I also found Taskforest, which fits my needs exactly, except that it is a text-file-based scheduler: the jobs have to be added to the text file, either through the scheduler or manually, which I don't want to do.
Could anyone suggest something useful?
Here's a possible solution:
- a script, either in PHP or Python, that performs your database tasks
- a scheduler: cron on Linux, or Task Scheduler on Windows, where you set the frequency of your jobs
I'm using this solution for multiple projects. It is very easy to set up.
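A minimal sketch of such a polling script in Python, assuming a hypothetical SQLite `jobs` table with `command`, `run_at` and `status` columns (adapt the names and driver to your actual schema and database):

```python
import sqlite3
import subprocess
import time

def run_due_jobs(conn):
    """Run every pending job whose scheduled time has passed.

    Failed jobs are left as 'pending' so the next poll retries them.
    Returns {job_id: succeeded} for the jobs it attempted.
    """
    now = int(time.time())
    cur = conn.execute(
        "SELECT id, command FROM jobs WHERE status = 'pending' AND run_at <= ?",
        (now,),
    )
    results = {}
    for job_id, command in cur.fetchall():
        # Exit code 0 means success; anything else keeps the job pending.
        ok = subprocess.call(command, shell=True) == 0
        conn.execute(
            "UPDATE jobs SET status = ? WHERE id = ?",
            ("done" if ok else "pending", job_id),
        )
        results[job_id] = ok
    conn.commit()
    return results
```

With a cron entry such as `*/5 * * * * /usr/bin/python /path/to/poll.py` (path hypothetical), the script wakes every 5 minutes, runs the due jobs, and exits.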
Celery runs best with RabbitMQ, but it also supports using a database as the broker, via SQLAlchemy.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I need to create a PHP application with a background thread containing a timer that keeps updating a database (by collecting data from different sites), without any user interaction. What I mean by this is: even when nobody is visiting the site, the thread has to keep updating the database. Is this possible in PHP, and how can I realise it?
The best way, I think, is to create a PHP script that does whatever you want and then set up a cron job to run that script at a specific time.
There are several options for this:
A scheduled task in your operating system, such as cron on *nix or Windows Scheduler for the Windows platform.
A permanently running script. This is not ideal for PHP, though, as memory is sometimes not correctly released, and the script can run out of memory. It is common for such scripts to be set up to die and respawn to prevent this from happening.
A scheduled task in your database server. MySQL supports this through its Event Scheduler. If your purpose is to run database updates, this can be a good option, provided you are running MySQL and your version is sufficiently recent.
A queue, where some processing is done in the background in response to a request signal. See Gearman, Resque and many others. This is useful where a user requests something in a web application but the request is too lengthy to carry out immediately. If you need something to run permanently, this may not be ideal; I add it for completeness.
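The die-and-respawn pattern mentioned for the permanently running script can be sketched as follows (a Python illustration of the idea; in practice a supervisor such as supervisord, or a wrapping shell loop, restarts the process after it exits):

```python
import time

def do_update():
    # Placeholder for the real work, e.g. the database update.
    pass

def worker_loop(max_iterations=1000, interval=0.0):
    """Run the update repeatedly, but exit after a bounded number of
    iterations so any leaked memory is reclaimed when the process dies
    and the supervisor starts a fresh one."""
    for _ in range(max_iterations):
        do_update()
        time.sleep(interval)
    # Returning here ends the process; the supervisor respawns it.
    return max_iterations
```

The bound on iterations is the whole trick: the process never lives long enough for gradual memory growth to matter.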
Having a PHP process run for a long time isn't really a good idea, because PHP isn't a very memory-efficient language and PHP processes consume a lot of memory.
It would be better to use a job manager. Take a look at Gearman.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 8 years ago.
I'm wondering what would happen if a cron job is set to run every minute but the script it runs takes 2 minutes to complete. Would it queue itself, skip runs while the previous one is still running, or run the same file simultaneously? Thanks!
Cron will start a new PHP process every minute, and they will all run simultaneously, with various terrible results (unless your script is properly guarded against such things, anyway).
After a while, either you'll constantly have a number of simultaneous processes running, or your server will crash after running out of resources, depending on whether the scripts start blocking each other while trying to access shared resources.
Either way, it probably won't be pretty and it probably won't be what you want.
Closed 8 years ago.
Is it possible, on a website/webserver (with full root access), to run a PHP script that executes MySQL queries in the background? What I mean by that:
A user clicks to process something, but should not have to wait for the query to run: in the browser it should look to the user as if it is already done.
Meanwhile, the script should keep running on the server until it finishes.
How can I do that? If there is no effective solution in PHP, is it possible with other languages?
I'm not talking about cron jobs. I'm on an Ubuntu machine (no Windows).
This would be for running many PHP scripts (all the same) in the background. Would Nginx or Apache be the better choice, or is it even relevant?
The best architecture I could recommend here is probably a queue/worker setup. For instance, this is simple to do with Gearman (or with ØMQ, RabbitMQ or similar advanced queues).
You spin up a number of workers that run in the background and can handle the database query (I'm partial to daemonizing them with supervisord). Spin up as many as you want to run in parallel; since the job is apparently somewhat taxing, you want to control the number of running workers carefully.
Then, any time you need to run this job, you fire off an asynchronous job for the Gearman workers and return immediately to your user. The workers will handle the request whenever they get around to it. This assumes you don't need any particular feedback for the user; the job can simply finish whenever, without anybody needing to know about it immediately.
If you do need to provide user feedback when the job is finished, you may simply want to execute the request via AJAX. For really sophisticated setups with realtime feedback, you can use the Gearman approach with feedback delivered via a pub/sub websocket, though that's quite an involved setup.
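The queue/worker shape described above, stripped to its essentials, looks like this (an in-process Python stand-in for Gearman, using threads; real workers would be separate daemonized processes):

```python
import queue
import threading

job_queue = queue.Queue()
results = []  # stand-in for wherever the workers record their output

def worker():
    # Each worker pulls jobs off the queue and runs them in the background.
    while True:
        job = job_queue.get()
        if job is None:  # shutdown sentinel
            job_queue.task_done()
            break
        results.append(job())  # the "slow database query"
        job_queue.task_done()

def submit(job):
    """Fire-and-forget: enqueue the job and return to the user immediately."""
    job_queue.put(job)

# Spin up a small, fixed pool of workers.
threads = [threading.Thread(target=worker, daemon=True) for _ in range(2)]
for t in threads:
    t.start()
```

The web request only pays the cost of `submit()`, which returns at once; the pool size caps how many taxing jobs ever run in parallel.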
Closed 8 years ago.
I need to get the status of services across a large number of servers in order to calculate uptime percentages. I may need to use multiple servers to do the checking. Does anyone know of a reliable way to queue them to be checked at a specific time/interval?
I'm writing the application in PHP, but I'm open to using other languages and tools for this. My only requirement is that it must run on Linux.
I've looked into things like Gearman for job queuing, but I haven't found anything that would work well.
In order to get uptime percentages for your services, you can execute commands that check the status of the services and log the results for further analysis and calculation. Here are some ways of doing that:
System commands like top, free -m, vmstat, iostat, iotop, sar, netstat, etc. Nothing comes close to these Linux utilities when you are analysing or debugging a problem; they give you a clear picture of what is going on inside your server.
SeaLion: its agent executes all the commands mentioned in #1, as well as custom commands, and their output can be accessed in a web interface. This tool comes in handy when you are working across hundreds of servers, as installation is simple. And it's free.
Nagios: the mother of all monitoring/alerting tools. It is highly customizable but can be difficult for beginners to set up, although there are many Nagios plugins available.
Munin
Server Density: a cloud-based paid service that collects important Linux metrics and gives users the ability to write their own plugins.
New Relic: another well-known hosted monitoring service.
Zabbix
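Whichever tool you pick, the underlying check is simple: attempt a TCP connection to each service, log the result, and compute uptime as successful checks over total checks. A self-contained Python sketch (the host/port list and scheduling, via cron or a loop, are up to you):

```python
import socket

def check_service(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def uptime_percentage(results):
    """Compute uptime from a list of booleans, one per recorded check."""
    if not results:
        return 0.0
    return 100.0 * sum(results) / len(results)
```

Persist each check's boolean (with a timestamp) to a database, and `uptime_percentage` over any time window gives you the figure to report.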
Closed 9 years ago.
Can someone please explain what I have to do?
I want Magento to clean its logs and create a sitemap every day.
Is this the right order:
Setup Cron in Server
Setup Cron in Magento
What will be next step?
How do I do step 1 and 2?
What will be step 3? Do I have to wait?
Setting up cron on your server
Use crontab -e to edit your cron jobs. To run a daily cron, add a line like this:
0 3 * * * /bin/sh /path/to/magento/cron.sh
This will run at 3 AM every night.
For log cleaning, you can check /path/to/magento/shell/log.php.
Set up cron in Magento
In the Magento admin, go to: System > Configuration > Advanced > System > Cron (Scheduled Tasks) and configure cron jobs you wish to run.
You should know that Magento runs cron jobs even if you don't have a daily cron job configured. Whenever Magento receives a request, it checks if there are any cron jobs to be run. Therefore, having the daily cron job would only make sense if you had no requests for an entire day.
Really, there is no next step to be done. I recommend reading How to Set Up a Cron Job.