My friend's website has an XML generator for news feeds. I read data from it and use it on my website (I have permission to do that). But the script for that job lives under my admin panel, so I have to call it manually and wait until processing is done (I can't close my browser). It works great, but the problem is that I need to update my database every 30 minutes (or every hour), and I can't sit in front of my computer doing that 24 hours a day.
I'm now wondering: is there something I can do to automate this process on the server side? In short, I want to run some kind of scheduler on my server that runs my script every (let's say) 30 minutes and does the job without my physical presence in front of the computer.
I have no experience with this at all and don't really know where to look for a solution. I don't even know whether it's possible.
So what I need as an answer is suggestions, links, or whatever helps me find a solution to my problem. It's not urgent and I have plenty of time to learn. Just tell me where to start searching.
Sorry if this is a duplicate question, but I couldn't search for anything because I didn't know what search terms to use and have no idea yet what to look for.
I really appreciate any suggestion.
What you are looking for is cron (quoting):
Cron is a time-based job scheduler in Unix-like computer operating
systems. Cron enables users to schedule jobs (commands or shell
scripts) to run periodically at certain times or dates. It is
commonly used to automate system maintenance or administration...
You could use curl to do the job you are doing right now in your browser, and then put that command in cron.
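For example, a minimal crontab sketch (the URL, paths, and log file are placeholders; adjust them to your setup):

    # run the feed import every 30 minutes by fetching the admin script with curl
    */30 * * * * curl -s "https://example.com/admin/update_feed.php" > /dev/null 2>&1

    # or, if you can run PHP from the command line, call the script directly
    */30 * * * * /usr/bin/php /var/www/scripts/update_feed.php >> /var/log/update_feed.log 2>&1

Edit the table with crontab -e; the five fields are minute, hour, day of month, month, and day of week.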
Related
I'm surprised I haven't been able to find something on this here - so if I've just completely missed it, please direct me to the proper thread.
Before I dive into any code, I'm trying to gather some good ideas for handling this situation.
We're developing a website with a list of tasks the user can select for the server to execute on their behalf. Automated emails, text messages, calendar reminders, etc.
I first went down the road of thinking about using cron, but since the times and tasks for each user will likely change every day, throughout each day, I figured that involving cron directly for each task could get pretty messy and buggy if this is to be feasibly scalable.
My next thought was to run a cron script every night at midnight and generate a task-list for the next day - but I'd still need cron or some sort of cron-like timing daemon to check the list against the time every minute.
I've run through several ideas, but they all seem fairly active or processor-heavy. I'd like to find a good lightweight solution that can handle up to several thousand user-defined tasks per day.
I'm working with your basic LAMP7 stack. If anybody has dealt with a similar task, I'm just looking for some good ideas to consider.
Thanks in advance.
You could use a ReactPHP application running in the background on your machine.
Then you can create a simple HTTP server in your ReactPHP application to receive the user data from your web server (the LAMP7 stack you mentioned). Once you have received that data, you can trigger those events by setting asynchronous timers on the event loop.
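A minimal sketch, assuming the react/event-loop, react/http, and react/socket packages are installed via Composer (class names vary between ReactPHP versions; the port and the task handling are placeholders):

    <?php
    require __DIR__ . '/vendor/autoload.php';

    use Psr\Http\Message\ServerRequestInterface;
    use React\EventLoop\Loop;
    use React\Http\HttpServer;
    use React\Http\Message\Response;
    use React\Socket\SocketServer;

    $tasks = [];

    // Simple HTTP endpoint the LAMP front end can POST new tasks to.
    $http = new HttpServer(function (ServerRequestInterface $request) use (&$tasks) {
        $tasks[] = json_decode((string) $request->getBody(), true);
        return Response::plaintext("queued\n");
    });
    $http->listen(new SocketServer('127.0.0.1:8080'));

    // Check for due tasks every 60 seconds without blocking the HTTP server.
    Loop::addPeriodicTimer(60, function () use (&$tasks) {
        foreach ($tasks as $task) {
            // ... send the email / text message / reminder when it is due ...
        }
    });
    // With recent ReactPHP versions the event loop runs automatically
    // when the script reaches the end.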
A few years ago I made a web app with AJAX and PHP, and I really loved the polling mechanism.
Now I'm working on a streaming server which is controlled by PHP. The client makes a call to the PHP to start and stop streaming. These are calls the clients initiate; now I want to do some "server maintenance" (which is not initiated by a client).
What I'm trying to do is create a script that checks for clients; the script needs to loop and query the clients table in the database every 20 seconds.
Is this possible using PHP only? Can you give me some tips and tricks?
Help is appreciated.
If you want it to run more often than once a minute, you'll need a long-running PHP process. If you can live with every minute, a cron job will suffice. Another option would be to run the check as part of requests as they come into the web server for other things in your application.
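A minimal sketch of the long-running approach, run from the command line (e.g. php worker.php under nohup or supervisord); the database credentials and the clients table are placeholders:

    <?php
    // Long-running CLI worker that polls the clients table every 20 seconds.
    set_time_limit(0); // let the script run indefinitely

    $pdo = new PDO('mysql:host=localhost;dbname=streaming', 'user', 'secret');

    while (true) {
        $clients = $pdo->query('SELECT id, status FROM clients')
                       ->fetchAll(PDO::FETCH_ASSOC);
        foreach ($clients as $client) {
            // ... perform the maintenance work for this client ...
        }
        sleep(20); // wait 20 seconds before the next check
    }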
I don't have a server, so I don't have crontab access. I am thinking of using a PHP script to check the current time and how much time has passed. For example, I run the script, it stores the current date in MySQL and checks whether 30 days have passed. If so, it does my stuff.
Is it possible to do all this without MySQL? Of course this is only my idea; I haven't tried it yet.
Keeping the script running:
The issue is that you've either got to keep that script running for a long, long time (which PHP doesn't like) or you'll have to manually run that script every day or whatever.
One thing you could do is write a script to run on your local machine that accesses that PHP script (e.g. using the commandline tool 'wget') every minute or hour or whatever.
If you want to have a long-running script, you'll need to use: http://php.net/manual/en/function.set-time-limit.php. That'll let you execute a script for much longer.
As noted in another answer, there are also services like this: https://stackoverflow.com/questions/163476/free-alternative-to-webcron
Need for MySQL?
As for whether you need MySQL - definitely not, though it isn't a bad option if you have it available. You can use a file if required (http://php.net/manual/en/function.fopen.php) or even SQLite (http://php.net/manual/en/book.sqlite.php) which is a file-based SQL database.
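For instance, a minimal sketch of the file-based approach for the "have 30 days passed?" check (the file name and the actual work are placeholders):

    <?php
    // Store the timestamp of the last run in a plain text file instead of MySQL.
    $stateFile = __DIR__ . '/last_run.txt';
    $lastRun   = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

    if (time() - $lastRun >= 30 * 24 * 60 * 60) { // 30 days in seconds
        // ... do the monthly work here ...
        file_put_contents($stateFile, (string) time()); // remember this run
    }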
As I understand it, you can only run PHP scripts that are invoked by a user request.
I tried this once, but it is risky. If the scheduled process takes too long, the user may interrupt it and hang up the script, leaving half-processed data. I don't suggest it for a production environment.
Take a look at Drupal's Poormanscron module: http://drupal.org/project/poormanscron. From the introduction text:
The module inserts a small amount of JavaScript on each page of your
site that, when a certain amount of time has passed since the last cron
run, makes an AJAX request to run the cron tasks.
You can implement something like this yourself, possibly using their code as a starting point. However, this implementation depends on regular visits to the pages of your website. If nobody visits your website, the cronjobs do not get executed.
You'd need some kind of persistent storage, but a simple file should do the trick. Give it a go and see how you get on. :) Come back for help if you get stuck. Here are some pointers to get you started:
http://php.net/manual/en/function.file-get-contents.php
http://nz.php.net/manual/en/function.file-put-contents.php
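Putting those together, a rough sketch of the do-it-yourself approach (the file name, the one-hour interval, and cron.php are placeholders):

    <?php
    // Include this on every page: if the last run is more than an hour old,
    // emit a tiny script that fires an AJAX request to the real cron script.
    $stateFile = __DIR__ . '/last_cron.txt';
    $lastRun   = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

    if (time() - $lastRun > 3600) {
        echo '<script>fetch("/cron.php");</script>';
    }

    // cron.php then does the actual work and finishes with:
    // file_put_contents(__DIR__ . '/last_cron.txt', (string) time());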
You could use a webcron (basically a cron job on another server that calls your script at a given time):
https://stackoverflow.com/questions/163476/free-alternative-to-webcron
I'm working on a project that has to move/write a bunch of files (hundreds of them). Right now they're done one after another, so I'm going to be doing it in parallel to speed up the process.
The biggest work is done, using a modified version of the PHP class found here: http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading
Right now I can add 100 URLs and then tell the script to parse 10 URLs in parallel. This however means I have to wait until the first 10 are done before doing the next 10.
I was wondering if there was a way to set it up like a queue? Start with 10, then as soon as the first one is done (the one out of those 10 that is done first), move on to number 11, etc ...
Is there any way to do this in PHP? (doesn't have to be with CURL). Any help would be greatly appreciated :)
One possible option is using cron jobs.
Another is Zend_Queue, but I think its requirement is Zend Server. I'm not sure, though.
Your script could generate a "processing request token" (DB- or filesystem-based) that would be processed by background daemons. Each daemon starts by "eating" the token and then processes the work; if each daemon can process 10 URLs, just start 10 daemons to process 100 URLs at a time.
Do you have access to the server your site is hosted on? i.e. do you have the ability to set up cron jobs / your own programs (i.e. run PHP from the console)?
Sounds like a job for Gearman. You'd submit each list of 10 urls as a job, and have a number of workers running which pick up the jobs as soon as they're available. The advantage of this is you can trivially improve the parallelism later by starting up more workers (even on different machines).
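A minimal Gearman sketch, assuming the PECL gearman extension and a gearmand server on localhost; the function name fetch_url and the URL list are placeholders:

    <?php
    // client.php: submit each URL as a background job.
    $urls = ['http://example.com/a', 'http://example.com/b']; // placeholder list

    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);
    foreach ($urls as $url) {
        $client->doBackground('fetch_url', $url);
    }

    <?php
    // worker.php: run several copies of this to increase parallelism.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('fetch_url', function (GearmanJob $job) {
        $url = $job->workload();
        // ... download and process $url here ...
    });
    while ($worker->work()) {
        // keep picking up jobs as they arrive
    }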
I wonder how I can schedule and automate tasks in PHP. Can I? Or are web server features like cron jobs needed?
I am wondering if there is a way I can, say, delete files after 3 days, when the files are likely outdated or no longer needed.
PHP doesn't natively support automating tasks; you have to build a solution yourself or search Google for available solutions. If you have a frequently visited site/page, you could store a timestamp in the database linked to each file; when your site is visited after a chosen time (e.g. 8 in the morning), a script (e.g. deleteOlderDocuments.php) runs and deletes the files that are older.
Just an idea. Hope it helps.
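As a rough sketch of such a deleteOlderDocuments.php (the directory and the 3-day age are placeholders; it could equally well be run by cron):

    <?php
    // Delete files older than 3 days from a placeholder directory.
    $dir    = __DIR__ . '/uploads';
    $maxAge = 3 * 24 * 60 * 60; // 3 days in seconds

    foreach (glob($dir . '/*') as $path) {
        if (is_file($path) && time() - filemtime($path) > $maxAge) {
            unlink($path); // remove the outdated file
        }
    }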
PHP operates under the request-response model, so it isn't PHP's responsibility to initiate and perform the scheduled job. Use cron, or make your PHP site register the cron jobs.
(Note: the script that the job executes can be written in PHP of course)
In most shared hosting environments, a PHP interpreter is started for each page request. This means that for each PHP script in said environment, all that script will know about is the fact that it's handling a request, and the information that request gave it. Technically you could check the current time in PHP and see if a task needs to be performed, but that is relying on a user requesting that script near a given time.
It is better to use cron for such tasks, especially if the tasks you need performed can be slow; otherwise, every once in a while, around a certain time, a user would get a particularly slow response because their request caused the server to do a whole bunch of scheduled work.