I am working on a REST API for an Android app in Symfony2. Recently I implemented sending push notifications via FCM. Everything works fine; the problem is that my implementation sends notifications synchronously in the controller, so the client has to wait for all notifications to be sent before it gets a response from the server, which of course leads to performance issues. Could anyone give me a hint about the best way to handle sending notifications, e.g. in a separate thread or with some scheduler? I just don't know what my options are. Thanks in advance.
Well, what you are looking for is some kind of asynchronous worker.
That can be accomplished in different ways.
The easiest way might be to store all notifications that need to be sent in some kind of queue, such as a table in your database, and process these entries using a console command (https://symfony.com/doc/current/console.html) that is executed regularly via crontab.
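A minimal sketch of such a command could look like this (the NotificationQueue entity, its fields, and the "fcm.client" service are hypothetical placeholders, not anything from your project):

```php
<?php
// src/AppBundle/Command/SendPushNotificationsCommand.php
// Sketch only: NotificationQueue and the "fcm.client" service are placeholders.
namespace AppBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class SendPushNotificationsCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('app:push:send')
             ->setDescription('Sends all queued FCM push notifications');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $em  = $this->getContainer()->get('doctrine')->getManager();
        $fcm = $this->getContainer()->get('fcm.client'); // hypothetical FCM service

        // Everything that has not been sent yet is waiting in the queue table.
        $pending = $em->getRepository('AppBundle:NotificationQueue')
                      ->findBy(array('sentAt' => null));

        foreach ($pending as $notification) {
            $fcm->send($notification->getDeviceToken(), $notification->getPayload());
            $notification->setSentAt(new \DateTime());
        }

        $em->flush();
        $output->writeln(sprintf('Sent %d notifications.', count($pending)));
    }
}
```

Crontab would then run it every minute or so, e.g. `* * * * * php /path/to/app/console app:push:send`, so the controller only has to insert a row and return immediately.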
Another way would be to use something like RabbitMQ and write a custom consumer that sends the notifications. That's quite straightforward and requires something like supervisord to daemonize the consumer process.
Maybe the cron-job approach is the best for you. I didn't quite get whether these push notifications need to rely on the requests sent by your clients, but generally you should try to move all logic out of your controllers and into services.
The Symfony documentation is always a good starting point for these kinds of questions and should give you some more detailed examples and hints:
https://symfony.com/doc/current/index.html
Related
I am using the PHPMailer library to handle the sending of emails from within my application.
The problem is, when some emails are triggered to be sent (such as when a contact form has been submitted, a new user registers, etc.), it could take 1-3 seconds for the page to load while the email is being sent. If there is ever a problem sending the mail, the delay can be even longer.
I was thinking of saving any emails that need to be sent into a pending_emails table in my database, then just have a cron job run every minute which would send out all those emails, then remove them from the table.
My question is, does this seem like a logical thing to do? Are there any potential resource concerns I should have with a cron job running every minute versus sending the email at runtime? (I need to run the cron job often, as someone may be waiting on an urgent message, for example a "reset password" email.)
You got everything right already.
Sending at runtime, just when you respond to the user's HTTP request, is the easiest thing to do. But the response is slowed down a bit by this, of course. That's not too bad in a small application, because sending email is faster than one might think. It definitely works.
Implementing a message queue is the more elegant and scalable approach, of course. But it takes a little more work. Your idea of using a pending_emails database table is totally valid. There are libraries and components for such queues, but you don't have to use them.
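A rough sketch of the cron side of that idea, assuming a pending_emails table with recipient/subject/body columns (names are made up) and PHPMailer 6's namespaced classes:

```php
<?php
// send_pending_emails.php - run via cron: * * * * * php /path/to/send_pending_emails.php
require 'vendor/autoload.php';

use PHPMailer\PHPMailer\PHPMailer;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Fetch queued emails (table/column names are placeholders).
$rows   = $pdo->query('SELECT id, recipient, subject, body FROM pending_emails')
              ->fetchAll(PDO::FETCH_ASSOC);
$delete = $pdo->prepare('DELETE FROM pending_emails WHERE id = ?');

foreach ($rows as $row) {
    $mail = new PHPMailer(true); // true = throw exceptions on failure
    try {
        $mail->setFrom('noreply@example.com');
        $mail->addAddress($row['recipient']);
        $mail->Subject = $row['subject'];
        $mail->Body    = $row['body'];
        $mail->send();

        // Only remove the row once the mail actually went out.
        $delete->execute(array($row['id']));
    } catch (Exception $e) {
        // Leave the row in place so the next cron run retries it.
        error_log('Mail to ' . $row['recipient'] . ' failed: ' . $e->getMessage());
    }
}
```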
This is a very opinion-based question, so you're going to get a lot of different, conflicting answers. Some might tell you it's OK to make a user wait 1-3 seconds since that's not very long, but I tend to disagree. What I typically do instead is use a queue.
There are ways to create a queue WITHOUT using 3rd-party software, but there are some excellent tools out there such as RabbitMQ, Iron.io or Beanstalkd which can be extremely helpful for performing tasks in the background. These services push your task into a queue, and the items in the queue are processed in a timely manner in the background, while the user gets an almost immediate response (depending on what you're doing). This is how I usually handle more resource-intensive tasks, like sending an email, in the background to avoid holding up the response to a user.
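As a rough illustration with Beanstalkd via the pheanstalk client (this assumes the Pheanstalk 3.x-style API; tube name and payload are made up):

```php
<?php
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

// Producer: called from the web request, returns almost immediately.
$queue = new Pheanstalk('127.0.0.1');
$queue->useTube('emails')->put(json_encode(array(
    'to'      => 'user@example.com',
    'subject' => 'Welcome!',
)));

// Consumer: a separate long-running CLI worker process.
$worker = new Pheanstalk('127.0.0.1');
$worker->watch('emails')->ignore('default');
while (true) {
    $job  = $worker->reserve();            // blocks until a job is available
    $data = json_decode($job->getData(), true);
    // ... send the email here ...
    $worker->delete($job);                 // remove the job once handled
}
```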
Best of luck.
Look into threading (PHP Threading). I would suggest you create a new thread which invokes the sending of the email. This way, you can return a response to the user without waiting for the email to be sent, and the email sending process would run in parallel in another thread.
I'm working on an API project that needs to send emails and store statistics after (or before, depending on the implementation) the response has been returned to the client. For both cases I'm considering Symfony's EventDispatcher component (I'm not using Symfony as the framework), so each controller action will dispatch an event to add an email to the queue or insert the data into the statistics database table.
So the thing would look something like this
Controller
=> Send Response to client
=> Dispatch Event email => EmailEventListener => Mail queue
=> Dispatch Event stats => StatsEventListener => Database
I'm considering this because I want these internal actions to be as asynchronous as they can be. Is this an appropriate solution for this case?
EDIT: As Jovan Perovic suggested, I'm adding more information. The API is a REST API that users communicate with via web or mobile apps, and I want to log, store stats and send notifications (emails primarily) without penalizing the performance of the API. The first idea was to use something that runs after returning the response to the client, but I don't know if that's possible using the EventDispatcher. Even if I use a queue to process stats or notifications, I need a centralized place where all the controllers can send information so that logs are written and stats stored.
I hope my goal is now more clear. Sorry.
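For context, a bare-bones sketch of the wiring described above, using the standalone EventDispatcher component (event names and the EmailEvent class are made up, and the dispatch() argument order shown is the one used by older component versions, i.e. dispatch($name, $event)):

```php
<?php
require 'vendor/autoload.php';

use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\EventDispatcher\EventDispatcher;

// Hypothetical event carrying the email data.
class EmailEvent extends Event
{
    public $to;
    public $subject;

    public function __construct($to, $subject)
    {
        $this->to      = $to;
        $this->subject = $subject;
    }
}

$dispatcher = new EventDispatcher();

// EmailEventListener: only enqueues the message, it does not send it inline.
$dispatcher->addListener('app.email', function (EmailEvent $event) {
    // In a real app: push onto a mail queue (DB table, Redis, RabbitMQ, ...).
    error_log('queued mail to ' . $event->to);
});

// StatsEventListener: records the hit.
$dispatcher->addListener('app.stats', function (Event $event) {
    // In a real app: insert a row into the statistics table.
    error_log('stats recorded');
});

// Inside a controller action:
$dispatcher->dispatch('app.email', new EmailEvent('user@example.com', 'Hello'));
$dispatcher->dispatch('app.stats', new Event());
```

Note that the listeners still run in the same PHP process; the asynchrony comes from the fact that they only enqueue work instead of performing it.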
I think you could use request filters (an "after" filter would be suitable for you), although I have never attempted to use them outside of the Symfony2 framework.
As for async operations, in general, sockets are your friend. You could externalize the logic, by sending data to some socket which will in turn process the data accordingly. If that processing is non-essential (e.g. email and stats), your request could be finished even if your external mechanism fails.
I read some time ago about Gearman here (just an example), which might help you externalize that by creating separate jobs.
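If you go the Gearman route, the split between the web request and the background job could look roughly like this (function name and payload are placeholders; it requires the pecl gearman extension and a running gearmand):

```php
<?php
// --- In the web request (client side): fire and forget ---
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('send_email', json_encode(array(
    'to'      => 'user@example.com',
    'subject' => 'Stats ready',
)));
// The HTTP response can be returned immediately; the job runs elsewhere.

// --- In a separate long-running worker process ---
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_email', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ... actually send the email / write the stats here ...
});
while ($worker->work()) {
    // handle one job per iteration, forever
}
```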
Hope this sheds some light here :)
I am a PHP developer and the title basically says it all. However, I was hoping for some more in-depth information, as I am starting to get confused about how the flow for the project I work on should go.
For a web application I need to implement a feature like Facebook has: notifying users about replies/comments and instantly showing these.
I figured I could use long-polling with AJAX requests, but this does not seem to be a nice solution, as the notifications are never really instant and it is resource-heavy.
So I should use some form of sockets if I understand correctly, and Node.js would be a good choice. Based on that assumption I now get confused about the workflow.
I thought about two possible solutions:
1) It seems to me that if I used Node.js I could skip PHP altogether and base the application on Node.js only.
2) Or I could use PHP as a base and only use Node.js for notifying users and instantly showing messages, while saving the data using PHP and MySQL.
These two possibilities confuse me and I can't make up my mind about what would be the "best" and cleanest way.
I do not have much experience in Node.js, played with it for a while. But managing and saving data seems to be hard in Node.js so that is why I came up with option 2.
I know Facebook is built on PHP, so I am assuming that they save the data via PHP and notify / instantly show replies and comments via Node.
Could someone help me out on this?
Thanks in advance!
EDIT:
I just noticed Stack Overflow does something similar. I get a notification in the upper left, and below my question a box with "new answer to this question". I am really interested in the technology (or technologies) used.
Well you could use node.js for the notifications and PHP for your app.
By googling I found this about real-time-notifications.
You could also just use node.js with socket.io, but this means that you have to learn new technologies as you mention that you have no experience with node.
I haven't used it but you could check this project, for websockets in PHP.
When you have an update that you want to notify users about, you can use the publish/subscribe pattern to notify those interested in this update.
Take a look at Gearman too.
Personally, I've built a notification system using the pub/sub mechanism of Redis, with node.js + socket.io. Every time there is an update on a record, a message is published on the appropriate channel. If the channel has listeners, they will be notified. I also store the last 20 notifications in a Redis list.
The application is built in PHP. The notification system is built in node.js. They are different applications that see the same data. The communication occurs via Redis. For example, in the Facebook context:
1) A user updates his status.
2) PHP stores this to the database and Redis
3) Redis knows that this update must publish to the status channel of the specific user and it does.
4) All the friends of the specific user are listening to his status channel (here comes node.js)
5) Node.js pushes the notification in the browser with socket.io
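The PHP side of steps 2-3 could look roughly like this with the phpredis extension (channel and key names are made up; the node.js process would SUBSCRIBE to the same channel and forward the payload to browsers via socket.io):

```php
<?php
// After saving the status update to MySQL...
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$userId  = 42; // the user whose status changed
$payload = json_encode(array(
    'type' => 'status_update',
    'user' => $userId,
    'text' => 'Hello world',
));

// Publish on the user's status channel; node.js subscribers push it to browsers.
$redis->publish('status:' . $userId, $payload);

// Keep the last 20 notifications around, e.g. for users who were offline.
$redis->lPush('notifications:' . $userId, $payload);
$redis->lTrim('notifications:' . $userId, 0, 19);
```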
As for Facebook, I have read in an article that it uses long polling to support older browsers. Not sure about this though; it needs a citation...
AFAIK it could be done via two simple methods:
The first, which is very simple, is adding a Boolean column to each record that indicates whether it has been notified or not.
The second method is creating a separate table to store all notifications.
However, I'm not sure if there are alternative methods with better performance, but the first method is what I commonly use myself. I think Facebook uses the second method, though, because it has to deliver each notification to a lot of users.
Your question may be a duplicate of:
Facebook like notifications tracking (DB Design)
Database design to store notifications to users
You could use Server-Sent Events; it involves a bit of JavaScript, but nothing overly complicated, I think.
The main bulk of this method is PHP though, so you would just use PHP to query your DB for notifications and SSE will push them to the user.
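A minimal SSE endpoint in PHP might look like this (the notifications table and query are placeholders; on the browser side you would consume it with new EventSource('/sse.php')):

```php
<?php
// sse.php - the browser connects with: new EventSource('/sse.php')
set_time_limit(0);
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$seen = 0; // id of the last notification already delivered

while (true) {
    // Placeholder query: adapt table/columns to your schema.
    $stmt = $pdo->prepare('SELECT id, message FROM notifications WHERE id > ? ORDER BY id');
    $stmt->execute(array($seen));

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        echo 'id: '   . $row['id'] . "\n";
        echo 'data: ' . $row['message'] . "\n\n";
        $seen = $row['id'];
    }

    // Flush the output to the client, then wait before polling the DB again.
    @ob_flush();
    flush();
    sleep(2);
}
```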
It does have some limitations though, most notably that it's not supported by IE (huge surprise). Thought I'd mention it anyway to let you know of other possibilities.
Hope this helps
I'd like to create an application where, when a super user clicks a link, the users get a notification, or rather content such as a PDF, for them to access on the screen.
Use case: when a teacher wants to share a PDF with his students, he should be able to notify them that the PDF is available for download, and a link has to be provided for it.
There are several ways you can accomplish this. The most supported way is through a technique called Comet or long-polling. Basically, the client sends a request to the server and the server doesn't send a response until some event happens. This gives the illusion that the server is pushing to the client.
There are other methods and technologies that actually allow pushing to the client instead of just simulating it (i.e. Web Sockets), but many browsers don't support them.
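A long-polling (Comet-style) endpoint in PHP is essentially a loop that holds the request open until something happens or a timeout is reached; a rough sketch (table and column names are made up):

```php
<?php
// poll.php - the client requests this and immediately re-requests it when it returns.
$pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;
$start = time();

header('Content-Type: application/json');

while (time() - $start < 25) { // give up before typical 30s gateway timeouts
    $stmt = $pdo->prepare('SELECT id, message FROM notifications WHERE id > ?');
    $stmt->execute(array($since));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($rows) {
        // Something happened: answer immediately and let the client reconnect.
        echo json_encode($rows);
        exit;
    }
    sleep(1); // nothing yet, keep the request open and check again
}

echo json_encode(array()); // timeout: empty response, the client simply retries
```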
As you want to implement this in CakePHP (so I assume it's a web-based application), the user will have to have an 'active' page open in order to receive the push messages.
It's worth looking at the first two answers to this, but also just think about how other sites might achieve this. Sites like Facebook, the BBC and Stack Overflow all use techniques to keep pages up to date.
I suspect Facebook just uses some AJAX that runs in a loop/timer to periodically pull updates in a way that makes it look like push. If the update request is frequent enough (a short period), it'll look almost real-time. If it's a long period, it'll look like a pull. Finding the right balance between up-to-dateness and browser/processor/network thrashing is the key.
The actual request shouldn't thrash the system, but the reply in some applications may be much bigger. In your case, the data in each direction is tiny, so you could make the request loop quite short.
Experiment!
The standard HTTP protocol doesn't allow pushing from server to client. You can emulate this by using, for example, AJAX requests at a small interval.
Have a look at php-amqplib and RabbitMQ. Together they can help you implement AMQP (Advanced Message Queuing Protocol). Essentially your web page can be made to update by pushing a message to it.
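For reference, publishing a message with php-amqplib looks roughly like this (queue name and payload are placeholders; a separate consumer script would basic_consume from the same queue and forward updates to the browser):

```php
<?php
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();

// Durable queue so messages survive a broker restart.
$channel->queue_declare('notifications', false, true, false, false);

$channel->basic_publish(
    new AMQPMessage(json_encode(array('pdf' => '/files/lecture-notes.pdf'))),
    '',              // default exchange
    'notifications'  // routing key = queue name
);

$channel->close();
$connection->close();
```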
[EDIT] I recently came across Pusher, which I have implemented for a project. It is an HTML5 WebSocket-powered realtime messaging service. It works really well and has a free bottom-tier plan. It's also extremely simple to implement.
Check out node.js in combination with socket.io and express. Great starting point here
Hi, I am trying to make an AJAX instant messenger. I currently have a website (with user login, admin area, etc.) using PHP, MySQL, JavaScript, etc., and an AJAX chat program with two chat rooms (with a users-in-room list, etc.). It works really well, but I don't really know where to go from here (instant-messenger-wise). I have done some research which suggested using an AJAX listener for new messages, but I can't find much information on it... or whether this is even needed or I should use something else. If anyone has any advice on where I should go next, it would be very, very much appreciated. Thanks :)
For a chat or chat-like application that needs real-time, immediate responses, node.js is probably the way to go. The socket.io library mentioned above is also built on node.js and can be used on both the server and client side.
There are lots of blogs/tutorials about node.js. Or you may like this, even if it comes for a small fee.
I'd suggest taking a look at www.socket.io for the real-time stuff.
There's even an instant messenger example on the site IIRC.
Why don't you go with something like AJAX Chat? It's free and open source!
I think it might get you going!
Use Stream Hub. Reverse AJAX - pretty cool stuff
You can try CometD from the Dojo Foundation!
http://cometd.org/
Node.js
Like a lot of people have mentioned, I would use node.js/socket.io for this instead of PHP. It was created to tackle this sort of problem.
Redis
But if you really want to create something like this in PHP, I would do it using Redis (which needs to be installed). It has blocking list operations which really help you build something like this. When a user sends a message to another user, we push the message onto the corresponding blocking list of that user. The recipient listens on a unique blocking list (key) to receive messages.
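A rough sketch of that with the phpredis extension (key names and payload are made up):

```php
<?php
// Sender: push the message onto the recipient's list.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('messages:42', json_encode(array(
    'from' => 7,
    'text' => 'Hi there!',
)));

// Receiver: block for up to 30 seconds waiting for a new message.
// (Typically called from a long-polling AJAX endpoint for user 42.)
$result = $redis->blPop(array('messages:42'), 30);
if ($result) {
    // blPop returns array(key, value); the value is our JSON payload.
    echo $result[1];
}
```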
Cannot install Redis
Then you have to use MySQL: insert into a table and poll the table frequently, but not so often that you kill your server/database.