I am creating a small plugin that gets data from different websites. The data does not have to be up to date, and I do not want to use a cronjob for this.
Instead, on every visit to the website I want to check whether the DB needs updating. Updating the whole DB takes a while, and I do not want the user to wait for that.
Is there a way to fire the function in the background? The user would continue working as normal while the DB updates behind the scenes.
You could also fork the process using pcntl_fork().
As you can see in the php.net example, you get two execution paths following the function call. The parent process can complete as usual, while the child process goes on doing its thing.
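A minimal sketch of that pattern, assuming a hypothetical updateDatabase() function doing the slow work (pcntl requires the CLI SAPI and the pcntl extension):
<?php
// Sketch only: fork, and let the child do the slow work.
$pid = pcntl_fork();
if ($pid === -1) {
    exit('Could not fork');
} elseif ($pid === 0) {
    // Child process: runs the background job, then exits.
    updateDatabase(); // hypothetical slow function
    exit(0);
}
// Parent process ($pid holds the child's PID): returns to the user immediately.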
You'd want to use exec() with a command that redirects output to a file or /dev/null and backgrounds the process with &; otherwise PHP will wait for the command to complete before continuing with the script.
exec('/path/to/php /path/to/myscript.php > /dev/null 2>&1 &');
There are many solutions for executing PHP code asynchronously. The simplest is calling shell exec asynchronously (see Asynchronous shell exec in PHP). For more sophisticated, true parallel processing in PHP, try Gearman. Here is a basic example of how to use Gearman.
The idea behind Gearman is that you have a daemon that manages jobs for you by assigning tasks to workers. You will write two PHP files:
Worker: which contains the code you want to run asynchronously.
Client: the code that will call your asynchronous function (a minimal sketch of both follows).
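A minimal sketch of the two files, assuming a gearmand daemon on localhost:4730; the job name 'update_db' and the workload are placeholders:
worker.php:
<?php
// worker.php - registers a function with the daemon and processes jobs
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('update_db', function (GearmanJob $job) {
    $site = $job->workload();
    // ... fetch and store data for $site (the slow part) ...
    return "done: $site";
});
while ($worker->work()); // block and handle jobs as they arrive
client.php:
<?php
// client.php - queues the job and returns immediately
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('update_db', 'https://example.com'); // fire and forget
echo "Job queued\n";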
Related
I have a mailing function and 100k email IDs. I want to call a function multiple times, say 10 times, with each call processing 10k emails. I want to fire each call without waiting for a response: just call one, then another, then another, without collecting the results in between.
I tried pthreads for multi-threading but couldn't run it successfully.
I am using a MySQL database.
You can use multiple PHP processes for that, just like you could use multiple threads; there isn't much of a difference for PHP, as PHP is shared-nothing.
You probably want to wait for it to finish and to notice any errors, but don't want to wait for completion before launching another process. Amp is perfectly suited for such use cases. Its amphp/parallel package makes multi-processing easier than using PHP's native API.
You can find a usage example in the repository.
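If you'd rather not add a dependency, the same "multiple processes" idea can be sketched with plain proc_open() rather than amphp's API; the worker script name and its offset/limit arguments here are hypothetical:
<?php
// Launch ten PHP worker processes concurrently, 10k emails each.
$procs = [];
$pipes = [];
for ($i = 0; $i < 10; $i++) {
    $offset = $i * 10000;
    $cmd = sprintf('php send_mails.php %d 10000', $offset); // hypothetical worker script
    $procs[$i] = proc_open($cmd, [1 => ['pipe', 'w'], 2 => ['pipe', 'w']], $pipes[$i]);
}
// Now wait for all of them and surface any errors.
// (Assumes each worker prints little; heavy output would need interleaved reads.)
foreach ($procs as $i => $proc) {
    echo stream_get_contents($pipes[$i][1]);            // worker stdout
    fwrite(STDERR, stream_get_contents($pipes[$i][2])); // worker stderr
    fclose($pipes[$i][1]);
    fclose($pipes[$i][2]);
    if (proc_close($proc) !== 0) {
        echo "Worker $i failed\n";
    }
}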
php name_of_script.php &>/dev/null &
This line will start your script in the background.
I suppose you do not want to control a critical process like sending mails from your browser? What if your connection breaks for a few seconds?
If so: still use the command-line approach, but using exec():
$command = '/usr/bin/php path/to/script.php > /dev/null 2>&1 & echo $!';
exec($command, $output); // $output[0] will hold the background process's PID
I've got a need for some PHP code to launch more than one instance of shell_exec and run a command.
At present my code tries to run both commands in the same window, which makes the second command fail. As the commands need to run at the same time, separate windows are best for me; the command is launching an Adobe product.
Any help would be much appreciated.
You need to use threading to do that.
One simple method would be to trigger your shell_exec calls through AJAX, so you can hit two URLs in parallel, each of which triggers a shell_exec.
Most IIS servers by default allow up to six parallel connections from the same client.
In my case I need to echo a flag to the client side and send an email.
Right now the client side has to wait until the email is sent...
But I want to separate these two steps. How do I do that?
You could take a look at Run PHP Task Asynchronously which is pretty much what you want to accomplish.
You could take a look at Gearman
Gearman is a system to farm out work to other machines, dispatching function calls to machines that are better suited to do work, to do work in parallel, to load balance lots of function calls, or to call functions between languages.
Have another PHP file to send the emails, and call it with some parameters using shell_exec().
You can also call the URL on the command line using cURL with some parameters.
That would work fine; you can track your email success status in the target file.
Pseudo code, my main file:
<?php
// ... other PHP stuff ...
// Launch the mail script in the background, discarding its output
shell_exec('/usr/bin/php mySecondfile.php someParam > /dev/null 2>&1 &');
// ... other stuff continues immediately ...
echo 'SUCCESS';
You can use the pcntl_fork() function for that. With pcntl you can fork the process into a child process with a different PID.
http://php.net/manual/en/function.pcntl-fork.php
I have written a PHP script which generates an SQL file containing all tables in my database.
What I want to do is execute this script daily or every n days. I have read about cron jobs but I am using Windows. How can I automate the script execution on the server?
You'll need to add a scheduled task to call the URL.
First of all, read up here:
MS KB - this is for Windows XP.
Second, you'll need some way to call the URL. I'd recommend using something like wget; this way you can call the URL and save the output to a file, so you can see what the debug output is. You can get hold of wget on this page.
Final step is, as Gabriel says, write a batch file to tie all this up, then away you go.
Edit: wget is pretty simple to use, but if you have any issues, leave a comment and I'll help out.
Edit 2: thinking about it, you don't even really need a batch file; you could just call wget directly...
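For example, a one-line batch file for the scheduled task might look like this (the URL and paths are placeholders):
@echo off
REM Fetch the backup URL and keep the output for debugging
wget -q -O C:\logs\backup-output.html http://example.com/backup.php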
Add a scheduled task to request the URL, either using a batch file or a script file (WSH).
http://blog.netnerds.net/2007/01/vbscript-download-and-save-a-binary-file/
This script will allow you to download binary data from a web source. Modify it to work for your particular case. The .vbs file can either be run directly or executed from within a batch script. Alternatively, you do not have to save the file using the script; you can just output the contents (WScript.Echo objXMLHTTP.ResponseBody) and use CMD's redirect-to-file argument:
cscript download.vbs > logfile.log
save that bad boy in a .bat file somewhere useful and call it in the scheduler: http://lifehacker.com/153089/hack-attack-using-windows-scheduled-tasks
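Or register it from the command line with schtasks (the task name and path are placeholders):
schtasks /create /tn "Daily PHP Backup" /tr "C:\scripts\run-backup.bat" /sc daily /st 02:00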
Cron is not always available on many hosting accounts.
But try this:
http://www.phpjobscheduler.co.uk/
It's free, has a useful interface so you can see all the scheduled tasks, and will run on any host that provides PHP and MySQL.
You can use the ATrigger scheduling service. A PHP library is also available to create scheduled tasks without overhead, and you get reporting, analytics, error handling, and other benefits.
Disclaimer: I was on the ATrigger team. It's freeware and I have no commercial motive here.
Windows doesn't have cron, but it does come with the 'at' command. It's not as flexible as cron, but it will allow you to schedule arbitrary tasks for execution from the command line.
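For example, something along these lines would run a script every day at 2 AM (the paths are placeholders):
at 02:00 /every:M,T,W,Th,F,S,Su "C:\php\php.exe C:\scripts\backup.php"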
Yes, you can schedule your PHP script to run automatically on Windows. On Linux-like OSes you have cron, but on Windows you can schedule the task using Task Scheduler.
If your code is on a remotely hosted server, create a cron job for it.
If it's running locally, use a scheduled task on Windows. It's easy to implement; I have servers with plenty of scheduled tasks running.
How can I run several PHP scripts from within another PHP script, like a batch file? I don't think include will work, if I understand what include is doing; because each of the files I'm running will redeclare some of the same functions, etc. What I want is to execute each new PHP script like it's in a clean, fresh stack, with no knowledge of the variables, functions, etc. that came before it.
Update: I should have mentioned that the script is running on Windows, but not on a web server.
You could use the exec() function to invoke each script as an external command.
For example, your script could do:
<?php
exec('php -q script1.php');
exec('php -q script2.php');
?>
Exec has some security issues surrounding it, but it sounds like it might work for you.
<?php
// Use exec: http://www.php.net/manual/en/function.exec.php
exec('/usr/local/bin/php somefile1.php');
exec('/usr/local/bin/php somefile2.php');
?>
In the old days I've done something like creating a frameset containing a link to each file. Call the frameset and you're calling all the scripts. You could do the same with iframes or with AJAX these days.
exec() is a fine function to use, but you will have to wait until termination of the process to keep going with the parent script. If you're doing a batch of processes where each process takes a bit of time, I would suggest using popen().
The handle you get back is a pointer to a pipe, which lets you start a handful of processes at a time, store the handles in an array, and then read them one after another once they have all finished (much more concurrently) using stream_get_contents().
This is especially useful if you're making API calls or running scripts which may not be memory-intensive or computationally intensive but do require a significant wait for each to complete.
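A sketch of that pattern (the script names are placeholders): launch each script with popen() so they run concurrently, then read each handle with stream_get_contents(), which blocks only until that particular script finishes.
<?php
$handles = [];
foreach (['script1.php', 'script2.php', 'script3.php'] as $script) {
    // 'r' opens the process's stdout for reading; all three start immediately.
    $handles[$script] = popen('php ' . escapeshellarg($script), 'r');
}
foreach ($handles as $script => $handle) {
    echo "--- $script ---\n";
    echo stream_get_contents($handle); // waits for this script only
    pclose($handle);
}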
If you need any return results from those scripts, you can use the system() function; note that it returns only the last line of the command's output.
$result = system('php myscript.php');
$otherresult = system('php myotherscript.php');