Is it possible to run my PHP code without opening it manually?
For example, I have "test.php" which includes the following SQL code:
`INSERT INTO Humans(Name) VALUES ('Ric')`
I want to run this code every 30 seconds, but the problem is that I can't open the page myself every 30 seconds. :)
You can schedule a task if you're using Windows, or a cron job if you're on Linux.
For cases like this, the majority of web hosting solutions offer a cron function (automatically calling PHP scripts at a specified time).
The question is whether your web hosting provider allows calling PHP scripts at intervals as short as the 30 seconds you mention.
Under Linux, cron tasks cannot run every 30 seconds, as cron's granularity is only 1 minute.
As a workaround, you can create a script that cron executes every minute and that runs the command twice:
whatever.sh
#!/bin/sh
cd /the/directory/where/the/script/is
php task.php   # first run
sleep 30       # wait half a minute
php task.php   # second run
Make sure it's executable:
chmod +x whatever.sh
Then set up cron to execute the script every minute:
* * * * * /path/to/whatever.sh
Related
I have a cron task that runs a PHP script every 3 hours.
I tried it with wget and with PHP directly, but got the same problem. Sometimes the script runs for more than 2 minutes, though mostly 30 seconds is enough. If the execution time exceeds 60 seconds, the request gets dropped (504 Gateway Timeout) and cron runs the script again. And then again and again: a fatal overload within a few hours.
I tried this with many different variations of the syntax, but failed:
php -q /var/www/webmy/data/www/website.com/news.php
/usr/bin/wget -O - -q -t 1 http://website.com/news.php
How can I make the command try to run my script only once? I don't need it running a million times within every 60 seconds. Are there any limitations I can set?
Maybe I can cap execution time at 20 seconds to prevent any runaway script. I just need the script to run; I don't need the system to wait for it, the script finishes its task either way.
You can set the maximum execution time directly in the PHP script with set_time_limit(int $seconds);
see the set_time_limit entry in the PHP manual for details.
First of all, the wget suggestion is a bad one. If you're going to use a PHP script for a cron task, you're better off running it as a command line script, thus running it directly and not through a web server.
Assuming you don't rely on server information or GET/POST variables in your script: have you tried running it once manually? Is anything strange happening when you do so?
A simple crontab entry like the one below will run your script only once every three hours:
0 */3 * * * php /path/to/script.php
If you want to guarantee that two instances of the script never run at the same time, you should use some kind of locking, e.g. simple file locking with flock.
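A minimal sketch of that locking idea, assuming the flock(1) utility from util-linux is available (the lock path and the php command are placeholders):

```shell
#!/bin/sh
# Guard the task with flock(1) so a new run is skipped while a previous
# run still holds the lock.
run_once() {
  (
    # -n: give up immediately instead of waiting if the lock is taken
    flock -n 9 || { echo "already running"; exit 1; }
    # the actual task goes here, e.g.: php /path/to/task.php
    echo "task ran"
  ) 9>/tmp/task.lock
}
```

From cron there is an even shorter form, `* * * * * flock -n /tmp/task.lock /path/to/task.sh`, which skips the run entirely if the previous one is still holding the lock.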
I am trying to run a cron job in PHP, but the exec and system functions don't work (the server administrator doesn't allow me to use these two functions for security reasons). However, I still need to use PHP to run it, because the run time is decided by a formula and is not known in advance. So how can I run the cron job without using the exec or system function in PHP?
My thought
The only way I have come up with is adding the cron job via cPanel, set to "once per minute." That means setting up a cron job that checks the target PHP page every minute to see whether it has anything to do at that moment.
Issue
There's a possible issue: if it checks the page every minute, all day, won't it strain the host? (Maybe it will burden the CPU or occupy memory, etc.) If so, how can this be improved?
You can call a shell script that invokes the PHP CLI on the file:
#!/bin/sh
/usr/bin/php /path/to/my/php/file.php
Go to the command line of the server and type "which php". Replace /usr/bin/php above with whatever is returned. Then call this shell script in your cron.
As you say you have a cPanel server, I'm guessing it's a Linux box. You run a cron job on a Linux box by typing crontab -e; then, to run a PHP script every 5 minutes, add: */5 * * * * /usr/bin/php /path/to/file.php
I'm trying to create a PHP script that creates a cron job that starts an application and shuts it down after 1 hour. I've figured out how to start the application and how to kill the process. All that is left is creating the cron job, having it execute 1 hour from now, and having it remove itself.
Is this possible? If so, how?
Have you thought about using the at daemon? It is not as popular as cron but does exactly what you want: run a certain command a single time at a particular point in time.
If you can execute shell scripts on the server, you might write one that runs the program in the background, sleeps for an hour, and stops it afterwards if necessary. This would reduce the number of at/cron queue items.
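A sketch of that shell-script alternative (the function name and arguments are illustrative): start the program in the background, sleep for the given time, then kill it if it is still running.

```shell
#!/bin/sh
# run_for SECONDS COMMAND...: start COMMAND in the background, wait
# SECONDS, then stop it if it is still alive.
run_for() {
  secs=$1; shift
  "$@" &              # start the application in the background
  pid=$!
  sleep "$secs"       # e.g. 3600 for one hour
  kill "$pid" 2>/dev/null
}
```

Cron can then launch `run_for 3600 /path/to/app` at the desired start time, with no second job needed for the shutdown.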
To start the application, simply create a crontab:
crontab -e
in your command prompt. Then write the crontab entry with the time you want it to start and the path to where the PHP script is stored:
10 10 * * * /usr/bin/php /Users/you/phpinhere/myphp.php > /Users/you/output 2>&1
You can direct it to an output file so you can read whether there are any errors, etc. (Note that cron runs commands with /bin/sh, so use > file 2>&1 rather than bash's &> form.)
Then add another cron entry underneath the first one, an hour later, that runs the script to kill it.
I would create 2 cron jobs: One starts the process and one will kill it one hour later.
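For example (the times, paths, and script names here are illustrative), the pair of crontab entries might look like:

```
# start the application at 10:00
0 10 * * * /Users/you/phpinhere/start_app.sh > /Users/you/output 2>&1
# stop it one hour later, at 11:00
0 11 * * * /Users/you/phpinhere/kill_app.sh >> /Users/you/output 2>&1
```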
I have some files on my server; how do I open them programmatically once a day?
Let them be
http://site.com/scripts/video.php
http://site.com/scripts/music.php
Without my intervention, just scheduled (automatic).
Even if I'm asleep and the server is running, they should open at the given time.
And additionally, how can I open them once every 10 seconds (for tests)?
Thanks.
The solution is very clear when you are using a Linux server: cron jobs.
You can easily set up a cron job by configuring it through the terminal. Everyone else has provided the solution already, but this answer is for people who are new to Linux servers and don't know much about cron jobs. Go to the terminal and type the commands below.
root> which php
The above command will print the path to PHP on your Linux system.
Now,
root> crontab -e
The above command opens the cron file in edit mode.
Enter how often you want a particular PHP file to run, and at what time of day, month, week, etc.
I am providing the syntax for running a particular file every 15 minutes.
So here you go,
(write this in the cron file in edit mode)
*/15 * * * * path/to/your/php path/to/the/file/you/want/to/run
Now, path/to/your/php has to be replaced by the path you got when you typed
root> which php
And you are done: just save the file and close it. You will see a message in your terminal that a new cron job is installed.
That's it.
If you're on a Linux/Unix host, using a cron job is generally the best approach, as you can simply call the command-line version of PHP as part of the cron job. (That said, you may need to tweak your script if it relies on $_SERVER variables.)
Administration middleware (such as Plesk) often offers the ability to add cron tasks as well, although you may need to check the user/group rights that such tasks are executed with.
Finally, if you use a cron task you can simply enter the required command via the command line during the testing phase. (That is, rather than forcing a 10-second update, which would be tricky unless you had cron execute a shell script, you could execute the script as required.)
It's not possible with pure PHP. You'll need a cron job for this - ask your provider or administrator whether they are available.
Cron has a resolution of 1 minute, though: calling a script once every 10 seconds would have to be done, e.g., by a PHP script that is called every minute and makes six requests, one every ten seconds.
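The same workaround can be sketched as a small shell function (the name run_every and the php invocation are illustrative); cron would call it once a minute, e.g. `run_every 6 10 php /path/to/job.php`:

```shell
#!/bin/sh
# run_every COUNT PAUSE COMMAND...: invoke COMMAND COUNT times,
# pausing PAUSE seconds between runs.
run_every() {
  count=$1; pause=$2; shift 2
  i=1
  while [ "$i" -le "$count" ]; do
    "$@"
    # no pause after the final run, so the next cron minute lines up
    [ "$i" -lt "$count" ] && sleep "$pause"
    i=$((i + 1))
  done
}
```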
Running them once a day requires a separate program to run them.
For Linux servers the usual choice is a cron job; on Windows the Task Scheduler works fine, too.
I am working on cron jobs for my PHP app and planning to trigger them via wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from crontab? Is it good practice to run such long jobs via cron and wget/curl? Does anyone have experience with this? I also have an email queue that needs to run every 10 seconds, but crontab only has minute-level resolution. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a web server. Every web server has a timeout period, so the request might time out.
Also, some hosting providers stop processes that run beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule the job using wget/curl if it takes more than a few minutes.
Try scheduling it using an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the path.
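For instance (the paths here are assumptions), a crontab entry that invokes the CLI binary directly, with output appended to a log file, could look like:

```
0 3 * * * /usr/bin/php -f /var/www/app/cron/longjob.php >> /var/log/longjob.log 2>&1
```

Because this bypasses the web server entirely, no HTTP timeout applies; only PHP's own max_execution_time matters, and that defaults to unlimited for CLI runs.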
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time.
set_time_limit(24*60*60);
Crontab only allows minute-level execution because that's the most often you should really be launching a script: there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds a better strategy might be to have a single long-running script that does that checking and uses sleep() occasionally to stop itself from hogging system resources.
On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
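A bounded sketch of such a worker loop in shell (check_queue is a placeholder for the real work, e.g. invoking a PHP script; a real daemon would loop forever and sleep longer between checks):

```shell
#!/bin/sh
# Placeholder for the real queue check, e.g.: php /path/to/check_queue.php
check_queue() {
  echo "checking queue"
}

# worker_loop N: check the queue N times, sleeping between checks.
# The iteration bound exists only so this sketch terminates.
worker_loop() {
  n=$1
  while [ "$n" -gt 0 ]; do
    check_queue
    sleep 1          # a real daemon might sleep 10 here
    n=$((n - 1))
  done
}
```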
I've just run into the problem with a long running PHP script executed via cron and wget. The script terminates after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set up the correct timeouts. The PHP script you run needs to be set to have an unlimited run time; it's best to set this at the start of your long-running script.
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry (--tries) up to 20 times if no data is read within the --read-timeout of 900 seconds. So set tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you're on cPanel and don't care about the output emails after each cron job run, also use the -O and -q flags to discard output, and redirect errors with 2>&1.
wget -O /dev/null -q -t 1 https://url/ 2>&1