In my case I need to echo a flag to the client side and send an email.
Right now the client side has to wait until the email is sent...
But I want to separate these two steps; how can I do that?
You could take a look at Run PHP Task Asynchronously, which is pretty much what you want to accomplish.
You could take a look at Gearman
Gearman is a system to farm out work to other machines, dispatching function calls to machines that are better suited to do work, to do work in parallel, to load balance lots of function calls, or to call functions between languages.
Have another PHP file that sends the emails, and call it with some parameters using shell_exec.
You can also call the URL from the command line using cURL with some parameters. That would work fine, and you can track your email success status in the target file.
pseudo code for my main file:

// ... PHP stuff ...
// launch the mail script in the background; note the leading slash on /usr/bin/php
// and the trailing & so the shell returns immediately instead of blocking
shell_exec('/usr/bin/php mySecondfile.php someParam > /dev/null 2>/dev/null &');
// ... other stuff continues ...
echo 'SUCCESS';
You can use the pcntl_fork() function for that. With pcntl you can fork the current process into a child process with its own PID.
http://php.net/manual/en/function.pcntl-fork.php
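A minimal sketch of that approach (assuming the pcntl extension is available, which is usually the case only on the CLI, not under a web server):

$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} elseif ($pid) {
    // parent process: echo the flag to the client right away
    echo 'SUCCESS';
} else {
    // child process: send the email without blocking the parent
    mail('user@example.com', 'Subject', 'Body'); // hypothetical recipient
    exit(0);
}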
Related
I have a mailing function and 100k email IDs. I want to call the function multiple times, say 10 times, with each call processing 10k emails. I want to fire these calls without waiting for a response: just call one, then another, then another, without waiting for any of them to return.
I tried pthreads for multi-threading but couldn't get it to run successfully.
I am using a MySQL database.
You can use multiple PHP processes for that, just like you can use multiple threads; there isn't much of a difference for PHP, as PHP is shared-nothing.
You probably want to wait for each one to finish and to notice any errors, but you don't want to wait for completion before launching another process. Amp is perfectly suited for such use cases; its amphp/parallel package makes multi-processing easier than using PHP's native API.
You can find a usage example in the repository.
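If you'd rather not pull in a library, here is a minimal sketch of the same idea with bare proc_open (worker.php and its batch-number argument are made up for illustration):

$procs = [];
for ($i = 0; $i < 10; $i++) {
    // launch one worker per 10k batch; proc_open returns immediately
    $procs[$i] = proc_open("php worker.php $i", [], $pipes);
}
foreach ($procs as $i => $proc) {
    // reap each worker and check its exit code for errors
    $status = proc_close($proc);
    if ($status !== 0) {
        echo "batch $i failed with exit code $status\n";
    }
}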
php name_of_script.php &>/dev/null &
This line will start your script in the background (note that &> is Bash syntax; under plain sh, use > /dev/null 2>&1 instead).
I suppose you do not want to control a critical process like sending mail from your browser? What if your connection breaks for a few seconds?
If so: still use the command-line approach, but with exec, so you can capture the process ID:

$command = '/usr/bin/php path/to/script.php > /dev/null 2>&1 & echo $!';
$pid = (int) exec($command); // returns immediately; $pid identifies the background process
Let's assume I have two PHP scripts, s1.php and s2.php. Let's also assume that s2.php takes about 30 minutes to run.
I would like to use s1.php to call s2.php asynchronously. When s2.php is called, it should run on its own without returning any value to s1.php. s1.php should not wait for s2.php to finish; it should continue with the next command while s2.php runs on its own.
So here is the pseudo code for s1.php
Do something
Call s2.php
Continue s1.php while s2.php is running (this step does not need to wait for s2.php to return in order to continue; it starts immediately after s2.php starts).
How can I do that?
IMPORTANT NOTE: I am using a shared hosting environment
Out of the box, PHP does not support async processing or threading. What you're most likely after is Queueing and/or Messaging.
This can be as simple as storing a row in a database in s1.php and running s2.php on a cron as suggested in the comments, which reads that database, pulls out new rows and executes whatever logic you need.
It would be up to you to clean up - making sure you aren't reprocessing the same rows multiple times.
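As a sketch of that idea (the jobs table and its columns are made up, and a PDO connection in $pdo is assumed): s1.php enqueues a row, and the cron-driven s2.php claims rows before working on them, so a rerun never picks up the same rows twice:

// s1.php: enqueue the job
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'new')")->execute([$payload]);

// s2.php, run by cron: claim the pending rows, then process them
$pdo->exec("UPDATE jobs SET status = 'processing' WHERE status = 'new'");
foreach ($pdo->query("SELECT id, payload FROM jobs WHERE status = 'processing'") as $row) {
    // ... the long-running work for $row['payload'] goes here ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$row['id']]);
}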
Other solutions would be using something like RabbitMQ or IronMQ. IronMQ might be a good place to look because it's a cloud-based service, which would work well in your shared hosting environment, and they allow for a 'dev tier' account which is free and probably offers far more API calls than you'll ever need.
Another fun thing to look at is ReactPHP, as that does allow for non-blocking IO in PHP.
afaik, there's no good way to do this :/
You can use proc_open / exec('nohup php5 s2.php > /dev/null 2>&1 &'), or fire off the request with curl_multi:

$cmh = curl_multi_init();
$ch = curl_init('https://example.org/s2.php');
curl_multi_add_handle($cmh, $ch);
curl_multi_exec($cmh, $running);

(Or, if you don't have curl, substitute fopen... or, if allow_url_fopen is false, you can even go as far as socket_create ~~ :/)
I have a PHP file that is to be executed as a cron job; this PHP file involves some JavaScript.
I will explain the flow:
The PHP is used to retrieve some data (a list of URLs) from the DB.
For each URL obtained, a JavaScript API is used.
The result object returned from the API contains data for each URL.
The data is then sent back, as an AJAX call per URL, to a PHP file.
Can this be implemented via cron jobs?
OR
Is there any method to schedule JavaScript to run periodically, like cron for PHP?
UPDATE: I could manage the JavaScript call to the API with PHP cURL, and the cron job is getting executed perfectly. But I don't think that is the correct solution to this question; maybe Node.js is the solution (I didn't test it yet).
You can't run JavaScript in cron jobs directly, because that JavaScript is run by browsers. I think you should take a look at curl in PHP to call the API instead.
http://www.php.net/manual/en/book.curl.php
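A minimal sketch of such a call with PHP's cURL (the endpoint URL is a placeholder):

$ch = curl_init('https://api.example.com/data'); // hypothetical API endpoint
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the response instead of printing it
$response = curl_exec($ch);
curl_close($ch);
$data = json_decode($response, true); // assuming the API returns JSON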
You have to split the work: cron the JS, cron the PHP, and in the middle deliver one's results to the other. I agree with the phantomjs suggestion for JS execution (or CasperJS, which I prefer). Execute the JS, write its output to a JSON file, read that file with file_get_contents from PHP, and define these actions in two different cron jobs.
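Sketched out with made-up paths, the two crontab entries and the PHP side might look like:

# crontab: run the JS on the hour, then the PHP five minutes later
0 * * * * phantomjs /path/to/scrape.js > /path/to/results.json
5 * * * * php /path/to/process.php

// process.php: pick up whatever the JS wrote
$data = json_decode(file_get_contents('/path/to/results.json'), true);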
You can run JavaScript via cron in a JavaScript runtime like Node.js: http://nodejs.org/
phantomjs is one possibility; see this thread: wget + JavaScript?
Otherwise you could run Node.js on your server to execute JavaScript in a CLI-type environment, but mixing Node.js and PHP could become complicated.
You can schedule JavaScript with cron by using Node.js.
I am creating a small plugin that gets data from different websites. The data does not have to be up to date, and I do not want to use a cron job for this.
Instead, on every visit to the website I want to check whether the DB needs updating. It takes a while before the whole DB is updated, and I do not want the user to wait for that.
Is there a way to have the function fired in the background? The user would just work as normal while the DB updates behind the scenes.
You could also fork the process using pcntl_fork.
As you can see in the php.net example, you get two processes following the function call. The parent process can complete as usual, while the child can go on doing its thing.
You'd want to use exec() with a command that redirects output to a file or /dev/null and backgrounds the process with &; otherwise PHP will wait for the command to complete before continuing with the script.

exec('/path/to/php /path/to/myscript.php > /dev/null 2>&1 &');
There are many solutions for executing PHP code asynchronously. The simplest is calling shell exec asynchronously (see Asynchronous shell exec in PHP). For more sophisticated, truly parallel processing in PHP, try Gearman. Here is a basic example of how to use Gearman.
The idea behind Gearman is that you have a daemon that manages jobs for you by assigning tasks to workers. You write two PHP files:
Worker: contains the code you want to run asynchronously.
Client: the code that calls your asynchronous function, as sketched below.
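A minimal sketch using the pecl/gearman extension (assuming a Gearman server running on localhost and a made-up job name, send_email):

// worker.php: register the function and wait for jobs
$worker = new GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730
$worker->addFunction('send_email', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    mail($data['to'], $data['subject'], $data['body']);
});
while ($worker->work());

// client.php: fire the job and return immediately
$client = new GearmanClient();
$client->addServer();
$client->doBackground('send_email', json_encode([
    'to' => 'user@example.com', 'subject' => 'Hi', 'body' => '...',
]));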
I have a forum, where users can subscribe to a particular thread. When someone replies to the thread, the reply is added to the DB and an email alert is sent to all the subscribers before the success page is displayed.
FYI - I am not using PHP's mail function, as my hosting provider has a limit of 500 per hour. I am using Google App Engine to send out the emails via curl.
The problem - when the number of subscribers is more than a hundred or so, the user has to wait for way too long before the success page is displayed, because PHP has to loop through each subscriber first. Is there a way to circumvent this without resorting to inserting the emails in the DB and then processing them via a cron?
Hope I've made sense - thanks in advance for your advice!
You can write your mail script and force it to run in the background using a shell command.
shell_exec("/path/to/php /path/to/send_notifications.php 'var1' 'var2' >> /path/to/alert_log/paging.log &");
Using the shell_exec function of PHP, you can tell PHP to run a script in the background. The parameters you see in the shell_exec command are separated by spaces:
/path/to/php - The actual path to PHP on the server (you should be able to just use 'php' here)
/path/to/send_notifications.php - The path to your mail script
'var1' 'var2' - $_SERVER['argv'] variables you can send to your script; put each one in single quotes, space-separated, after the path to your script
>> /path/to/alert_log/paging.log - Path to a log file (optional); anything echoed in your mail script will be written to this file
& - The most important part of this. The & tells your server to run this script in the background
Doing it this way, when you execute the mail script, the user submitting the form will not have to wait for the script to finish. The browser will load the page faster, and the user can browse away from the confirmation page or even close the browser; the script will continue to run until it's completed.
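For illustration, the receiving script picks those quoted parameters out of $argv (the script and variable names here are the hypothetical ones from above):

// send_notifications.php
$var1 = $argv[1]; // 'var1' from the shell_exec call
$var2 = $argv[2]; // 'var2'
echo "sending notifications for $var1 / $var2\n"; // ends up in paging.log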
If you're running PHP as FastCGI managed by PHP-FPM, you can use the fastcgi_finish_request() function. It's a very powerful function: it closes the connection to the browser when you call it, but continues executing the rest of the script.
Ever since I first discovered this function, I've been using it and loving it.
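A minimal sketch (this only works under PHP-FPM; $subscribers and send_alert are placeholders):

echo 'SUCCESS'; // everything echoed up to this point is what the browser receives
fastcgi_finish_request(); // flush the response and close the connection

foreach ($subscribers as $email) {
    send_alert($email); // hypothetical mail function; runs after the user already has the page
}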
No, the PHP process will not exit before all the API calls to GAE have been executed. You could however send status updates to the user while the script is executing:

// inside the mail loop: after sending one message,
// push the progress line out to the browser
echo "Sent email to user $email_user \n";
ob_flush();
flush();

ob_flush() (followed by flush()) will force Apache to send the current buffer to the client. Alternatively you could show the mail page in a separate <div> and call the mail script via AJAX.
Edit: I guess it should be possible to send the messages in batches to GAE? What kind of system is sitting at the GAE end (Python/Java)?
I think the problem is more with your implementation. Sending mail, especially in large quantities, is best done with forking or delayed jobs. The idea is that you enqueue "jobs" and execute them at a later time; the client is not stuck waiting for processes to complete.
One of the more popular projects for PHP is Gearman.
Some other advantages to this approach are:
error catching and automatic retries
faster serving of pages
you can write custom scripts to garbage collect, etc...
You need to move the email sending into a separate script. One way of doing this would be to create a mail_queue table, so inside your loop you would simply write a row to the database.
Then, in another script, you would loop over the set of emails from the database and send them out. This script can be run by a cron task, and it can just update each row to 'sent'.
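Sketched with a hypothetical mail_queue table (a PDO connection in $pdo and the send_via_gae helper are assumptions):

// in the reply handler: queue instead of sending
$insert = $pdo->prepare("INSERT INTO mail_queue (recipient, subject, body, status) VALUES (?, ?, ?, 'pending')");
foreach ($subscribers as $email) {
    $insert->execute([$email, $subject, $body]);
}

// mail_worker.php, run by cron: send the pending rows and mark them sent
foreach ($pdo->query("SELECT id, recipient, subject, body FROM mail_queue WHERE status = 'pending'") as $row) {
    send_via_gae($row['recipient'], $row['subject'], $row['body']); // hypothetical GAE/cURL sender
    $pdo->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?")->execute([$row['id']]);
}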
I don't think you should use cron or anything like that; you should use the Task Queue to do the processing offline!
App Engine applications can perform background processing by inserting tasks (modeled as web hooks) into a queue. App Engine will detect the presence of new, ready-to-execute tasks and automatically dispatch them for execution, subject to scheduling criteria.
In the webhook (of the task queue) you could make an asynchronous fetch to your web service; this way the call will be made asynchronously, which ensures your user does not have to wait for it to complete. I hope you understand my solution.