Is there a way to make PHP run forever without cron?
What I want it for is to unban users after a few hours by running a MySQL query. Thanks.
If you don't have access to cron jobs on your server (I guess you are running on shared hosting?), the best alternative is to run an "external cron". Have a look at www.setcronjob.com; I have been using it for a couple of months now and it is pretty stable.
You can set it up to call a script on your website at whatever interval you want (example: http://www.yoursite.com/script.xxx).
In the script, you can run a MySQL query to check which users have been banned for a couple of hours and then unban them.
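For illustration, the script could be as simple as this (the table and column names banned_users, banned and banned_at are made up for this sketch; adapt them to your own schema):

<?php
// Hypothetical schema: banned_users(user_id, banned, banned_at)
$db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');

// Lift any ban that is more than 2 hours old.
$db->query(
    "UPDATE banned_users
        SET banned = 0
      WHERE banned = 1
        AND banned_at < NOW() - INTERVAL 2 HOUR"
);

$db->close();
?>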
You can start your script from the command line and let it run in the background. You will have to design the script so that it never exits, looping forever and using the sleep() function to avoid unnecessary processor load. Since PHP scripts invoked from the command line have no max execution time, the script will run until you manually kill it with the kill command.
Once you've written the script you can start it with:
nohup php myscript.php &
nohup keeps the script running after you log out of the console session you started it from; otherwise it would be killed at that point. The & at the end starts the script as a new process in the background so that you can continue using the console.
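A minimal sketch of what such a never-exiting myscript.php could look like (the query and the 5-minute interval are just placeholders):

<?php
// myscript.php - started with "nohup php myscript.php &" and runs until killed.
while (true) {
    $db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');

    // Placeholder work: unban users whose ban is more than 2 hours old.
    $db->query("UPDATE banned_users SET banned = 0
                WHERE banned = 1 AND banned_at < NOW() - INTERVAL 2 HOUR");

    $db->close();

    sleep(300); // wait 5 minutes so the loop doesn't hog the CPU or the database
}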
I'm creating a web service for an Android app in PHP with MySQL. I want to continuously check whether any data is available. I have no idea how to do this as a background process. How can I execute a query without any request, i.e. without a file being called?
I searched and got some code like
$command = "php -d max_execution_time=50 -f myfile.php '".$param."' >/dev/null &";
exec($command);
But where should I put this code so that the query runs continuously?
Yes, the ampersand trick will work. You can use something like supervisord to restart it every few hours, so that any memory leaks are dealt with. This also makes it less fragile if it were to crash or hang.
Also, you can use something like cron to run the task for 10 minutes, then let it exit and wait for cron to start it again. Bear in mind that with most background tasks it doesn't matter if there's a short period during which the task is not running, since it will catch up. It's worth checking in each run whether the previous one is still running, and exiting early if it is: that way you don't get two background tasks causing race conditions when retrieving work from your database.
Finally you can use a job server, such as Gearman. This will allow you to send tasks to it in an asynchronous fashion, and they will be run by worker tasks (in either time or priority order). This is probably the most reliable approach, but it takes a bit more work to set up. There's a PHP module for this, but in my experience it's more of a hassle to use than Net_Gearman, which is available in PEAR.
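Just to show the shape of it, here's a rough sketch using the PECL gearman extension (the "PHP module" mentioned above; the task name, server address and payload are placeholders, and Net_Gearman's API looks a bit different):

<?php
// worker.php - keep one or more of these running (e.g. under supervisord).
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

// 'check_data' is a made-up task name for this sketch.
$worker->addFunction('check_data', function (GearmanJob $job) {
    $payload = $job->workload();
    // ... do the actual work here (query MySQL, fetch data, etc.) ...
    return 'done';
});

while ($worker->work()); // blocks, handling one job at a time

<?php
// client.php - queue a job asynchronously and return immediately.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('check_data', json_encode(array('user_id' => 42)));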
I'm having a problem where PuTTY gets disconnected regularly, so when I run a PHP script from the terminal it always gets interrupted. The script is supposed to run for several hours, so I'm not having any luck with it.
How can I completely run this from the server side? I'm reading about cron jobs, but I'm having a hard time understanding at this time. Is there any alternative to cron for what I need?
I have several PHP script files that need to be run, one by one, or perhaps two at a time. Any ideas?
You don't need to run it as a cron job; you can just run the PHP script inside a screen.
Simply type:
screen php /path/to/myphpscript.php
A screen will continue running even after you disconnect from PuTTY. If you need to check up on it, you can use:
screen -r
to reattach to the session and view any output.
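If you run more than one script this way, it can help to name the sessions so you can tell them apart (the session name below is just an example):
screen -S mailfetch php /path/to/myphpscript.php
screen -ls
screen -r mailfetch
Detach from a session without stopping it by pressing Ctrl-A and then D.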
You need to prevent the process from terminating when the session disconnects.
Something like this would work:
nohup php myscript.php
You can create a cron job to start the PHP script periodically based on a schedule of timed tasks. You could also start the task in the background from the console, i.e. php-cgi script.php &, which would make the script a background task.
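For example, a crontab entry that starts the script every five minutes could look like this (paths are placeholders):
*/5 * * * * php /path/to/script.php >> /path/to/script.log 2>&1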
Take a look at GNU Screen; it allows you to detach and reattach a session later, which is perfect for long-running scripts. Cron is a good option if you want it to happen in a recurring fashion; one-off batch jobs can be scheduled with something like at. For more intense computing needs, you might want to look into a more full-fledged job scheduling system like TORQUE.
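For instance, a one-off run a couple of hours from now could be queued with at roughly like this (the path is a placeholder):
echo "php /path/to/script.php" | at now + 2 hours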
You can run your program in the background:
php ./yourscript.php &
In the past, I ran a bunch of scripts each as a separate cron job. Now I'd like to run a controller script with one cron job, then have that call the scripts separately (and in parallel, all at the same time), so I don't have to create a new cron job every time I add another script.
I looked up pcntl_fork() but we don't have that installed. Can fsockopen() do this as well?
A few questions:
I saw this example, http://phplens.com/phpeverywhere/?q=node/view/254, that uses fsockopen(). Will this allow me to run PHP scripts in parallel? Note, the scripts don't interact, but I would still like to know if any of them exited prematurely with an error.
Secondly, the scripts I'm running aren't externally accessible; they are internal only. The script was previously run like so: php -f /path/to/my/script1.php. It's not a web-accessible path. Would the example in #1 work with this, or only with web-accessible paths?
Thanks for any advice you can offer.
You can use proc_open to run multiple processes without waiting for each process to finish.
You will have a process handle for each one; you can terminate each process at any time and read its standard output.
You can also communicate via pipes, which is optional.
Passing php /your/path/to/script.php param1 "param2 x" as the first parameter starts a separate PHP process.
proc_open (see Example #1 in the PHP manual)
Ultimately you will want an infinite while loop plus usleep (or sleep) to avoid maxing out the CPU; break out of it when all processes have finished, or after you have killed them.
Edit: you can also tell whether a process has exited prematurely.
Edit 2: a simpler way of doing the above is popen().
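Putting those pieces together, a rough sketch could look like this (the script paths and polling interval are placeholders, not a drop-in solution):

<?php
// Start several PHP scripts in parallel and wait for them all to finish.
$scripts = array('/path/to/my/script1.php', '/path/to/my/script2.php');

$procs = array();
foreach ($scripts as $script) {
    $descriptors = array(
        1 => array('pipe', 'w'),  // child's stdout
        2 => array('pipe', 'w'),  // child's stderr
    );
    $pipes = array();
    $proc = proc_open('php -f ' . escapeshellarg($script), $descriptors, $pipes);
    $procs[] = array('proc' => $proc, 'pipes' => $pipes, 'script' => $script);
}

// Poll until every process has exited.
while (!empty($procs)) {
    foreach ($procs as $i => $p) {
        $status = proc_get_status($p['proc']);
        if (!$status['running']) {
            echo stream_get_contents($p['pipes'][1]);   // whatever the script printed
            if ($status['exitcode'] !== 0) {
                echo $p['script'] . ' exited with code ' . $status['exitcode'] . "\n";
            }
            fclose($p['pipes'][1]);
            fclose($p['pipes'][2]);
            proc_close($p['proc']);
            unset($procs[$i]);
        }
    }
    usleep(200000); // don't max out the CPU while waiting
}
?>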
Please correct me if I'm wrong, but if I understand things correctly, the solution Tiberiu-Ionut Stan proposed implies that starting the processes with proc_open and waiting for them to finish would not be run as a cron script, but as part of a long-running program/service, right?
As far as I understand cron jobs, the controller script user920050 was thinking of using would be started by cron on a schedule; each new instance would launch the processes all over again, wait for them to finish, and probably run in parallel with other cron-launched instances of the controller script.
I just can't figure this out.
I have a script that gets data from Facebook API and this script runs all the time. (using set_time_limit(0); )
However, sometimes the Facebook API gives errors and the script stops. Therefore, I would like a cron task every 5 minutes or so that checks whether the script is still running and, if not, starts it again.
I tried several things, but it looks like I cannot run an exec() command from a cron job because of different user rights or something? How would you guys do this?
I use CentOS and PHP 5.3+
Set up the cron under a different user (say, root), which will get around any rights issues. However, PeeHaa makes a good point: if this is a cron script, there's no reason to use exec, as exec's job is to send commands out to the OS... these commands can be run directly from the crontab rather than having cron execute a php file.
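One rough way to do the "restart it if it died" check straight from root's crontab (the script name, paths and interval here are placeholders):
# relaunch the worker every 5 minutes unless it is already running
*/5 * * * * pgrep -f facebook_worker.php > /dev/null || /usr/bin/php /path/to/facebook_worker.php >> /var/log/facebook_worker.log 2>&1 &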
You may want to look into creating a Daemon which is better suited to running a script continuously. You can create one using PHP with this PEAR package System_Daemon
If this process runs very frequently, run it in an endless loop and just have it sleep between iterations. No need for crontabs.
while (true) {
    // magical code stuff
    sleep(60);
}
I'm trying to figure out the most efficient way to run a pretty hefty PHP task thousands of times a day. It needs to make an IMAP connection to Gmail, loop over the emails, save this info to the database and save images locally.
Running this task every so often using a cron isn't that big of a deal, but I need to run it every minute and I know eventually the crons will start running on top of each other and cause memory issues.
What is the next step up when you need to efficiently run a task multiple times a minute? I've been reading about beanstalk & pheanstalk and I'm not entirely sure if they will do what I need. Thoughts?
I'm not a PHP guy but ... what prevents you from running your script as a daemon? I've written many a perl script that does just that.
Either create a locking mechanism so the scripts won't overlap. This is quite simple since the scripts only run every minute; a simple .lock file would suffice:
<?php
if (file_exists("foo.lock")) exit(0);
file_put_contents("foo.lock", getmypid());
do_stuff_here();
unlink("foo.lock");
?>
This will make sure the scripts don't run in parallel; you just have to make sure the .lock file is deleted when the program exits, so you should have a single point of exit (apart from the exit at the beginning).
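If you're worried about a stale .lock file being left behind when the script dies unexpectedly, a variation on the same idea uses flock(), because the operating system releases the lock automatically when the process exits (the lock file path is arbitrary):
<?php
$fp = fopen('/tmp/foo.lock', 'c');       // create the lock file if it doesn't exist
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit(0);                             // a previous run still holds the lock
}

do_stuff_here();

flock($fp, LOCK_UN);                     // released automatically on exit anyway
fclose($fp);
?>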
A good alternative - as Brian Roach suggested - is a dedicated server process that runs all the time and keeps the connection to the IMAP server up. This reduces overhead a lot and is not much harder than writing a normal php script:
<?php
connect();
while (is_world_not_invaded_by_aliens())
{
    get_mails();
    get_images();
    sleep(time_to_next_check());
}
disconnect();
?>
I've got a number of scripts like these where I don't want to run them from cron in case they stack up.
#!/bin/sh
php -f fetchFromImap.php
sleep 60
exec $0
The exec $0 part starts the script running again, replacing itself in memory, so it will run forever without issues. Any memory the PHP script uses is cleaned up whenever it exits, so that's not a problem either.
A simple line will start it, and put it into the background:
cd /x/y/z ; nohup ./loopToFetchMail.sh &
or it can similarly be started when the machine boots, by various means (such as cron's '@reboot ...' entry).
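For example, a crontab line roughly like this (using the same placeholder paths as above) would start it at boot:
@reboot cd /x/y/z && nohup ./loopToFetchMail.sh > /dev/null 2>&1 &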
fcron (http://fcron.free.fr/) will not start a new job if the old one is still running, so you could use the command from #1 and not worry about race conditions.