Send GCM message from PHP after page load (without ajax) - php

Hi guys! My story: I'm making a PHP application with CodeIgniter. When my page is loaded I can click a button that calls my PHP API, which makes some changes in the database and returns the result (true or false depending on whether the change was successful). After the database change I also call a PHP script that sends push notifications to the registered Android devices stored in my database.
My problem: when there are a lot of registered Android devices it takes a long time to load the page, because PHP waits for every GCM request to come back. Is there a way to load the page right after the database changes AND make the GCM requests in the background/asynchronously?
EDIT #1: I am on an Ubuntu server.

There are a number of different ways to address this, but the most common solution is to use some form of message queue to offload the work to separate processes.
You could simply store the messages in a separate table in your database and have a cron script run every few minutes to send them (deleting them from the table only when they have been sent successfully), or you could look into RabbitMQ, Gearman or Beanstalkd, which are designed to be more robust and more easily scaled.
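As a rough illustration of the database-table approach, the producer side can be as small as this (a sketch only; the notification_queue table and its columns are made up for the example):
// enqueue the notification instead of sending it during the page request
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$deviceToken = 'REGISTRATION_ID_FROM_YOUR_DEVICES_TABLE'; // placeholder
$stmt = $pdo->prepare('INSERT INTO notification_queue (device_token, payload, created_at) VALUES (?, ?, NOW())');
$stmt->execute([$deviceToken, json_encode(['message' => 'Something changed'])]);
// a cron-driven worker sends queued rows later and deletes them only when delivery succeeds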
Recommended reading:
http://www.slideshare.net/appdynamics/scaling-php-in-the-real-world-23619565
https://github.com/kr/beanstalkd/wiki/client-libraries
http://www.sitepoint.com/introduction-gearman-multi-tasking-php/

If you can separate the push code into its own standalone script, you could call it with
exec("php /path/to/script.php > /dev/null &");
This should run it in the background (on Linux) without the calling script waiting for it to finish.
Another option might be to store the notifications in the database as a queue and have a script run via cron every N minutes to check the queue and push the notifications from it.
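If you go the exec() route, /path/to/script.php could be a standalone sender along the lines of the sketch below. It assumes a gcm_devices table and uses the legacy GCM HTTP endpoint with a placeholder API key; adjust to your own schema and keys.
// script.php - runs in the background and pushes to every registered device
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$tokens = $pdo->query('SELECT device_token FROM gcm_devices')->fetchAll(PDO::FETCH_COLUMN);

foreach (array_chunk($tokens, 1000) as $chunk) { // GCM accepts up to 1000 registration ids per request
    $ch = curl_init('https://android.googleapis.com/gcm/send');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            'Authorization: key=YOUR_GCM_API_KEY', // placeholder
            'Content-Type: application/json',
        ],
        CURLOPT_POSTFIELDS     => json_encode([
            'registration_ids' => $chunk,
            'data'             => ['message' => 'Database updated'],
        ]),
    ]);
    curl_exec($ch);
    curl_close($ch);
}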

Related

Kick off Process on Server (In Background) Based on User Form Submit

So I have a page (PHP) that is used by a person to verify some quantities for some product, and then they finalize the page (confirm correct quantities and submit the page).
After they submit (finalize) the quantities, I then have to manually run another page (PHP) that processes this user's accepted quantities.
Is there a way to automate the manual portion based on when the user submits their finalized numbers? I want to have this process kick off on the server side so that it does not require them to keep their web page open.
I was thinking of using shell_exec to run a command file that 'runs' the other PHP page, but that would require the user to keep open the page they submitted the form on, correct?
EDIT
The Web Server is running WAMP. Forgot to mention this, as it can affect the suggestions/solutions provided.
What you are trying to accomplish is done by using message queues like Beanstalkd (as software) or AWS SQS (as a service).
This is the application flow of a message queue in Laravel, for example; however, the base concept remains the same:
1. You send the message queue server some raw data describing the action that needs to be fulfilled.
2. You run a daemon listener that watches the message queue and executes a PHP file which is responsible for fulfilling the requested action.
3. Your PHP file is executed, the action is fulfilled, and the job is removed from the queue. (A rough Beanstalkd-based sketch of this flow follows below.)
Image Source: http://laravelcoding.com/blog/laravel-5-beauty-sending-mail-and-using-queues
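For a concrete, non-Laravel flavour of the same flow, here is a rough Beanstalkd sketch using the pda/pheanstalk client library; method names differ between library versions, so treat it as an outline rather than a drop-in implementation:
// producer - push a job describing the action onto the queue
require 'vendor/autoload.php';
$queue = new Pheanstalk\Pheanstalk('127.0.0.1');
$queue->useTube('actions')->put(json_encode(['action' => 'process_quantities', 'user_id' => 42]));

// worker - long-running listener (typically kept alive by supervisord or similar)
$queue->watch('actions');
while ($job = $queue->reserve()) {
    $data = json_decode($job->getData(), true);
    // ... fulfil the requested action here ...
    $queue->delete($job); // remove the job from the queue once it is done
}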
A fairly easy solution for this would be a task queue, like php-resque. You can define the task you'd like to run, and then trigger it to execute after the user submits and finalizes the quantities. It won't require the user to keep a browser window open either.
Correct. You can use crontab to execute your PHP scripts each minute. Some other ideas:
1. Add the submission to a database, where it can form a queue of submitted orders, then iterate through that queue each minute via crontab.
2. The PHP processing script that runs after they submit could execute the shell command; if the process is not expected to take a long time, the shell command can write data to stdout or to a file, and the PHP page can grab the status from that, which makes it feel less like a dead, waiting-to-load page.
3. In combination with number 2 above, your page can perform refreshes, updating the status based on a database query or a text file with updates.
I am assuming you have access to crontab. In general, most websites use a database to store information like this and then have background processes run to do further processing of the data, which sounds a lot like your situation.
Also, something to think about: PHP has an execution time limit set in the php.ini file. If your processing takes longer than this value, it will kill the process.

How can I return data from a PHP script whilst running another process asynchronously

So I've built an app that uses an API.
When someone uses that API to submit some data to my database, I would like them to be able to continue using the application as quickly as possible. Now, my problem is that when they submit data I want to inform other users that the user has done this, and the best way to achieve this, in my eyes, was push notifications.
However, when they have submitted the data I collect all the users that I want to send a push notification to and loop through them; if there are a lot of users, the API takes a long time before the function has finished executing.
I want to be able to send data back to the client and still continue executing the PHP script, since there is no point in the client waiting for this to finish. Is it somehow possible with PHP to send data back asynchronously and still continue executing the script?
There are a number of ways to create async messages in PHP.
Use an event driven library such as React
Use ZeroMQ (with or without React) to send messages via TCP sockets to another service.
Execute a background shell script using shell_exec("your-cmd > /dev/null 2>/dev/null &") where your-cmd is a script or other executable; the rest ensures that PHP lets the process run in the background.
Make an asynchronous HTTP request, as detailed in the question "Asynchronous HTTP requests in PHP" (a fire-and-forget sketch follows below).
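A common fire-and-forget variant of the last option is to open a socket, write the HTTP request, and close the connection without reading the response. A minimal sketch (the host and path are placeholders):
// fire_and_forget.php - issue an HTTP request without waiting for the reply
function fire_and_forget($host, $path)
{
    $fp = @fsockopen($host, 80, $errno, $errstr, 1); // 1 second connect timeout
    if (!$fp) {
        return false; // could not connect; log $errstr if needed
    }
    fwrite($fp, "GET {$path} HTTP/1.1\r\nHost: {$host}\r\nConnection: Close\r\n\r\n");
    fclose($fp); // close immediately instead of waiting for the response
    return true;
}

fire_and_forget('example.com', '/jobs/send-push.php');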

Run PHP script like an application without Browser

Hi, I am new to PHP and have no idea if what I am about to ask is even possible or even makes sense, but here goes.
I want to execute a PHP script as if it were a standalone application on the web server. What I am trying to implement: when a customer purchases something on the website and sees the payment confirmation notice, he should be able to close the browser window or log off without affecting the big order-generation process that starts once he is taken to the page saying that his payment was successful.
Right now I am making use of AJAX to call my after-payment processing PHP script, and I have set that script to ignore any user abort.
thankyou.php - the page that tells the user that the payment was received successfully.
FinishCheckoutProcess.inc.php - the page that performs the processing that needs to be done only after successful receipt of payment.
Now, thankyou.php uses AJAX to execute FinishCheckoutProcess.inc.php asynchronously, and FinishCheckoutProcess.inc.php contains this call:
ignore_user_abort(true);
Now, the combination of AJAX and ignore_user_abort(true) allows the after-payment process to run without errors even if the user closes his browser window. But since this script has nothing to do with the user or the browser, I just wanted to know whether it is possible to run it in the background like a standalone application, independent of the browser.
Also, my web server is Apache and the OS is Linux (Ubuntu).
My work is getting done but I just want to know if there is a better/safer way to do it.
Anyway thanks in advance to everyone, this site has helped me more than any book could have. So all you experts out there who donate their times to newbies like me you guys are awesome. Please keep up the good work.
Again thanks a lot.
Based on the suggestions received:
If I use the "exec" method to execute the FinishCheckoutProcess.inc.php, will this execute database related commands and will it be able to run further PHP scripts.
FinishCheckoutProcess.inc.php in turn executes a series of other PHP scripts which in turn executes other PHP scripts, so will using "exec" command to run FinishCheckoutProcess.inc.php create any difficulties.
FinishCheckoutProcess.inc.php process also does interaction with the MySQL database, so will I be able to do this if I execute this script using "exec" command. I am passing the necessary MySQLi connection object to this PHP script right now. So can I pass it the same way to it using "exec"
Also the process is quite heavy as it generates a set of 4 image files using IMagick and ImageMagick.
It generates a set of 4 image files for every product ordered, so if the quantity of 1 product is 10 then the total files generated will be 1x10x4 = 40
If there are two products, one with quantity 2 and the other with quantity 4, then the total number of files generated will be 1x2x4 = 8 plus 1x4x4 = 16, i.e. 24 files.
So this script might need to run for a long time and cannot be allowed to stop because of timeouts; it needs to finish what it started.
Basically, the FinishCheckoutProcess.inc.php logic and process are quite complex, so I just want to confirm whether "exec" can handle it or not.
Also, I am not sure, but some of these scripts make use of $_SESSION variables; if that is a problem I can modify it. $_SESSION variables are only used in one place, and yes, the $_SESSION values get set in the browser session before the FinishCheckoutProcess.inc.php script is executed, by a previous PHP script.
I just want to execute the FinishCheckoutProcess.inc.php script independently of the parent/calling script, i.e. thankyou.php, so that if the user closes the browser, FinishCheckoutProcess.inc.php will not stop or abort because the parent/calling script (thankyou.php) is no longer running.
FYI, you can run PHP scripts from the command line, like php my/script.php.
A safer way to do it would be to have a master/worker process workflow. The master process runs on the server, checks a queue of work, and spawns worker processes to handle items on the queue as they arrive.
In your scenario, you add work to the queue when the user pays. Once it is added to the queue, you can send thankyou.php back to the user and they can continue, leave, or whatever. Once the work is on the queue, your master process spawns a worker process to handle it (basically doing everything in FinishCheckoutProcess.inc.php).
You can implement this in PHP with: php master.php
master.php
while (true) {
    // check the queue for new items
    // if a queue item is found:
    //     shell_exec('php worker.php > /dev/null &');
    sleep(5); // avoid busy-waiting while the queue is empty
}
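A worker.php along these lines could then claim and process one pending item (a sketch; the jobs table and its columns are invented for the example, and a real implementation should claim rows atomically so two workers never grab the same job):
// worker.php - process a single pending job from a hypothetical jobs table
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'dbuser', 'dbpass');
$job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' LIMIT 1")->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    // ... do the work FinishCheckoutProcess.inc.php currently does: generate images, update orders, etc. ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}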
From what I understand, you are looking for something like what Laravel offers with its illuminate/queue package:
Queues allow you to defer the processing of a time consuming task, such as sending an e-mail, until a later time which drastically speeds up web requests to your application.
This isn't something that only Laravel offers, though it does simplify/ease the implementation of such a mechanism.
In the background you have supervisord executing a "worker" PHP script that executes tasks you put in a common place (a database table, the filesystem, anything); those tasks are usually references to a certain class/method together with some variables to pass to it.
The following links might give you a better understanding:
http://supervisord.org/index.html
https://laravel.com/docs/5.1/queues
https://laravel.com/docs/5.1/queues#supervisor-configuration
There are many ways you could implement a queue system, including without the use of supervisord. But I recently implemented this method myself because it guarantees my tasks are processed, even after a server restart (if configured properly).
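For reference, a supervisord program entry that keeps such a worker running could look roughly like this (paths and names are placeholders):
[program:queue-worker]
command=php /var/www/app/worker.php
autostart=true
autorestart=true
stdout_logfile=/var/log/queue-worker.log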

Can I have an hour-long sleep in a website PHP script?

I have a PHP script that processes my email subscriptions.
It does something like:
foreach ($emails as $email) {
    $mailer->sendEmail($email);
    echo "Email sent to {$email}.";
}
I'm now encountering rate-limiting from my web host. The mailing library has a built-in throttler that will sleep to ensure I stay under the rate limit. However, this could result in the web page taking multiple hours to actually load.
Will the client side browser ever give up on the page loading? Any suggested better solutions to this?
Why is this being done on a page load? This should be an offline back-end process which is scheduled to run. (Look into cron for scheduling tasks.)
Any long-running process should be delegated to a back-end service. Application interfaces (such as a web page) should respond to the user as quickly as possible instead of forcing the user to wait (for upwards of an hour?) for a response.
The application can track progress, usually by means of some shared data source (a simple database, for example), of the back-end process and present that progress to the user. That's fine. But the process itself should happen outside of the application.
For example, at a high level...
1. Have a PHP script scheduled to run to process the emails.
2. When the script starts, save a record to a database indicating that it has started.
3. Each time the script reaches a milestone of some kind, update the database record to indicate this.
4. When the script finishes, update the database record to indicate this.
5. Have a web application which checks that database record and shows the user the current status of the back-end process. (A minimal sketch of this record-keeping follows below.)
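A minimal sketch of that record-keeping, assuming invented job_status and subscribers tables:
// process_emails.php - run by cron; records its own progress as it goes
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$subscribers = $pdo->query('SELECT email FROM subscribers')->fetchAll(PDO::FETCH_COLUMN);

$pdo->exec("INSERT INTO job_status (name, state, updated_at) VALUES ('email_run', 'started', NOW())");
$id = $pdo->lastInsertId();

foreach ($subscribers as $i => $email) {
    // ... send one (throttled) email here ...
    if ($i % 50 === 0) { // milestone: record progress every 50 emails
        $pdo->prepare('UPDATE job_status SET state = ?, updated_at = NOW() WHERE id = ?')
            ->execute(["sent {$i}", $id]);
    }
}
$pdo->prepare("UPDATE job_status SET state = 'finished', updated_at = NOW() WHERE id = ?")->execute([$id]);
// the web page then simply reads the latest job_status row and displays it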
You may not care, but even if you coerce this script into staying alive, you shouldn't purposely run a long-running script through the web server. Web servers use resource-heavy threads or processes to run your script, and they have a finite number of them available to serve web requests. A long-running script basically takes one of them out of the pool of processes that could be serving web visitors.
Instead, use a cron job that executes the PHP binary directly. Specifically, do not use wget or lynx or any other browser-like program as part of the cron job, because those methods run the script through the web server. The cron command should include something like
php /full/path/to/the/script.php
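For example, a crontab entry that runs the script every five minutes might look like the line below (the log path is a placeholder, and depending on cron's PATH you may need the full path to the php binary):
*/5 * * * * php /full/path/to/the/script.php >> /var/log/email-run.log 2>&1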

Using PHP mail in a loop forces user to wait until the page displays

I have a forum, where users can subscribe to a particular thread. When someone replies to the thread, the reply is added to the DB and an email alert is sent to all the subscribers before the success page is displayed.
FYI - I am not using PHP's mail function, as my hosting provider has a limit of 500 per hour. I am using Google App Engine to send out the emails via curl.
The problem - when the number of subscribers is more than a hundred or so, the user has to wait way too long before the success page is displayed, because PHP has to loop through each subscriber first. Is there a way to circumvent this without resorting to inserting the emails into the DB and then processing them via a cron?
Hope I've made sense - thanks in advance for your advice!
You can write your mail script and force it to run in the background using a shell command.
shell_exec("/path/to/php /path/to/send_notifications.php 'var1' 'var2' >> /path/to/alert_log/paging.log &");
Using the shell_exec function of PHP, you can tell PHP to run a script in the background. The parameters you see in the shell_exec command are separated by spaces:
/path/to/php - The actual path to PHP on the server (you should be able to just use 'php' here)
/path/to/send_notifications.php - The path to your mail script
'var1' 'var2' - $_SERVER['argv'] variables you can send to your script; put each one in single quotes, separated by spaces, after the path to your script.
>> /path/to/alert_log/paging.log - Path to a log file (optional); anything echoed in your mail script will be written to this file.
& - The most important part of this. The & tells your server to run this script in the background
Doing it this way, when you execute the mail script the user submitting the form will not have to wait for it to finish. The browser will load the page faster, and the user can browse away from the confirmation page or even close the browser, and the script will continue to run until it has completed.
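A rough sketch of what send_notifications.php could look like on the receiving end; the argument meanings, table, and columns are made up for the example:
// send_notifications.php - launched via shell_exec, reads its arguments from $argv
$threadId = isset($argv[1]) ? (int) $argv[1] : 0; // 'var1' on the command line
$replyId  = isset($argv[2]) ? (int) $argv[2] : 0; // 'var2' on the command line

$pdo  = new PDO('mysql:host=localhost;dbname=forum', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT email FROM subscriptions WHERE thread_id = ?');
$stmt->execute([$threadId]);

foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $email) {
    // ... send the alert about reply $replyId via your mailer or App Engine endpoint ...
    echo "Notified {$email}\n"; // ends up in paging.log because of the >> redirect
}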
If you are using PHP as FastCGI managed by PHP-FPM, then you can use the fastcgi_finish_request() function. It is a very powerful function: it closes the connection to the browser when you call it but continues to execute the rest of the script.
Ever since I first discovered this function, I've been using it and loving it.
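Used in this context it might look roughly like the sketch below (assuming PHP-FPM; the subscriber list is a placeholder):
// reply.php - respond to the browser immediately, then keep working
echo json_encode(['status' => 'ok']); // this is all the client waits for
fastcgi_finish_request();             // flush the response and close the connection

// everything below runs after the user has already received the reply
$subscribers = ['a@example.com', 'b@example.com']; // placeholder list
foreach ($subscribers as $email) {
    // ... send the push notification / email alert ...
}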
No, the PHP process will not exit before all the API calls to GAE have been executed. You could, however, send status updates to the user while the script is executing:
// inside the loop, after each message is sent
echo "Sent email to user $email_user \n";
ob_flush();
flush(); // push the buffered output through to the client
ob_flush() (together with flush()) forces Apache to send the current output buffer to the client. Alternatively, you could show the mail page in a separate <div> and call the mail script via AJAX.
Edit: I guess it should be possible to send the messages in batches to GAE? What kind of system is sitting at the GAE end (Python/Java)?
I think the problem is more with your implementation. Sending mail, especially in large quantities, is best done with forking or delayed jobs. The idea is that you enqueue "jobs" and execute them at a later time, so the client is not stuck waiting for processes to complete...
One of the more popular projects for PHP is gearman.
Some other advantages to this approach are:
error catching and automatic retries
faster serving of pages
you can write custom scripts to garbage collect, etc...
You need to move the email sending into a separate script. One way of doing this would be to create a mail_queue table, so inside your loop you would simply write a row to the database.
Then, in another script, you would loop over the set of emails from the database and send them out. This script can be run by a cron task, and it can just update each row to 'sent'.
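A minimal sketch of that cron-run sender, with an assumed mail_queue table:
// send_queue.php - run by cron; sends queued mail and marks each row as sent
$pdo  = new PDO('mysql:host=localhost;dbname=forum', 'dbuser', 'dbpass');
$rows = $pdo->query("SELECT id, recipient, subject, body FROM mail_queue WHERE status = 'pending'")
            ->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    // ... hand the message to your mailer or App Engine endpoint here ...
    $pdo->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?")->execute([$row['id']]);
}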
I don't think you need cron or anything like that; you should use the Task Queue to do the processing offline!
App Engine applications can perform background processing by inserting tasks (modeled as web hooks) into a queue. App Engine will detect the presence of new, ready-to-execute tasks and automatically dispatch them for execution, subject to scheduling criteria.
In the webhook (of the task queue) you could make an asynchronous fetch to your web service; this way the call is made asynchronously, which ensures your user does not have to wait for it to complete. I hope you understand my solution.
