Currently I'm making a chat application where only the admin and users chat; there is no user-to-user chat. The design is: every chat message is stored in the database, and every 2 seconds both the user and the admin make an AJAX request (to a PHP file) to see if there is a new chat message, and if there is, pull the data into the textbox. It all seems normal and works fine.
The problem is that as more users talk to the admin at the same time, the number of AJAX requests becomes very large; in testing, web performance already degraded with only 5 users chatting at the same time. Input is slow too: every time a user presses Enter, the message has to be written to the database before the admin can read it (and vice versa).
I have been told that using JSON is the recommended way, but I have no idea how to do it. Can someone please at least tell me what the design or flow would be if I used JSON? Or is there a better way to do this? (By the way, Node.js is currently impossible on my hosting, so please don't put it in the suggestion list. Sucks, I know.)
You should change the AJAX responder PHP script's output to JSON (you can use PHP's json_encode function, for example) and parse it in JavaScript with JSON.parse (or jQuery's dataType: 'json') rather than eval.
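As a rough sketch, assuming a hypothetical messages table with id, sender and body columns, the responder could look like this:

    <?php
    // poll.php - hypothetical AJAX responder that returns recent chat lines as JSON.
    // The table and column names (messages, id, sender, body) are assumptions.
    $pdo  = new PDO('mysql:host=localhost;dbname=chat', 'db_user', 'db_pass');
    $rows = $pdo->query('SELECT id, sender, body FROM messages ORDER BY id DESC LIMIT 20')
                ->fetchAll(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode(array_reverse($rows)); // oldest first, ready for the chat box

On the client side, something like $.getJSON('poll.php', function (data) { ... }) in jQuery gives you the decoded messages directly, with no eval needed.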
I am a bit sceptical, but I think it could reduce the network usage by more than 50%.
Maybe you can try a message queue, like ZeroMQ (0MQ) or RabbitMQ.
There are a lot of chat examples around.
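If you try RabbitMQ, a rough sketch of publishing a chat message from PHP with the php-amqplib library might look like this; the broker address, queue name and payload shape are all assumptions:

    <?php
    // Requires: composer require php-amqplib/php-amqplib
    require __DIR__ . '/vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    // Connect to a local RabbitMQ broker (host and credentials are assumptions).
    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel    = $connection->channel();

    // Declare a durable queue for chat messages ('chat' is an assumed name).
    $channel->queue_declare('chat', false, true, false, false);

    $payload = json_encode(['from' => 'user42', 'body' => 'Hello admin']);
    $channel->basic_publish(new AMQPMessage($payload), '', 'chat');

    $channel->close();
    $connection->close();

A consumer on the admin side would then read from the same queue instead of polling the database directly.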
I am currently creating an app where 2 users will have the ability to chat with one another. Specifically, it will be an iOS app using Swift as the main language. Most chat app tutorials on the web recommend using Firebase but I personally want to use MySQL since the rest of my database activities for this app are done using MySQL. I also do not want to use any existing libraries and want to do this all on my own.
I only have questions regarding the efficiency of using MySQL. When accessing the database, I create a URLSession in Swift which uses a predetermined URL that points to my PHP scripts on the backend to handle the database access. The only problem with this is that the chat functionality of the app will have to refresh messages (to see messages that the other user has sent you within a second or so), and I am confused about how to go about this. My current idea is to have a Timer that calls the URLSession data task every second or so to retrieve new messages from the database and then display them on the user's screen. Would this be efficient, or is there a better way to do this? I feel as if this would bog down MySQL in some way and would overall slow down the database. Is there a better way to go about this?
Thanks in advance.
If you really want to use MySQL as a way of delivering messages, then you can look into #TekShock's comment about using Apple's push notifications. You can also use long polling, but it is not favorable at all.
I personally would not use MySQL as a way of delivering messages, simply because there are much better options. You can pick from messaging protocols like XMPP and MQTT to deliver your messages. I have personally used MQTT in the past and thought it was really simple to get the hang of, and it will fit your needs perfectly. It has a couple of really good Swift clients like SwiftMQTT. You have each device subscribe and publish to a room (topic) so it can receive and send messages. So in your case, User A and User B both subscribe to ROOM 1, and they will both receive all the messages published to that room.
If you want, you can then store delivered messages in a MySQL DB so that when the user opens the app again you can load all their previous messages. You could also use SQLite or Realm to store these messages locally instead of storing them online.
EDIT:
Scaling is also pretty simple with MQTT if that is something you need to consider. You could place a queuing system between your application and the MQTT broker, possibly something like Apache Kafka, which would be your best bet.
I am a programmer at an internet marketing company that primarily makes tools. These tools have certain requirements:
They run in a browser and must work in all of the major browsers.
The user either uploads something (.csv) to process or they provide a URL and API calls are made to retrieve information about it.
They move around THOUSANDS of lines of data (think large databases). These tools literally run for hours, usually overnight.
The user must be able to watch live as their information is processed and is presented to them.
Currently we are writing in PHP, MySQL and Ajax.
My question is: how do I process LARGE quantities of data and still provide a live user experience while the tool is running? Currently I use a custom queue system that sends AJAX calls and inserts rows into tables or data into divs.
This method is a huge pain in the ass and couldn't possibly be the correct approach. Should I be using a templating system, or is there a better way to refresh chunks of the page with a lot of data? And I really mean a lot of data, because we come close to maxing out PHP's memory limit, which is something we always have to watch for.
Also, I would love to make it so these tools could run on the server by themselves: upload a .csv, close the browser window, and then have an email sent to the user when the tool is done.
Does anyone have any methods (programming standards) for me that are better than using .ajax calls? Thank you.
I wanted to update with some notes in case anyone has the same question. I am looking into the following to see which is the best solution:
SlickGrid / DataTables
GearMan
Web Socket
Ratchet
Node.js
These are in no particular order and the one I choose will be based on what works for my issue and what can be used by the rest of my department. I will update when I pick the golden framework.
First of all, you cannot handle big data via AJAX alone. To let users watch the process live, you can use WebSockets. As you are experienced in PHP, I can suggest Ratchet, which is quite new.
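To give a rough idea (not a drop-in implementation), a minimal Ratchet component that relays progress messages to every connected browser could look like this; the class name, port and message format are assumptions:

    <?php
    // Requires: composer require cboden/ratchet
    require __DIR__ . '/vendor/autoload.php';

    use Ratchet\ConnectionInterface;
    use Ratchet\Http\HttpServer;
    use Ratchet\MessageComponentInterface;
    use Ratchet\Server\IoServer;
    use Ratchet\WebSocket\WsServer;

    // Broadcasts every message it receives (e.g. progress updates from the
    // long-running tool) to all connected clients.
    class ProgressBroadcaster implements MessageComponentInterface
    {
        private $clients;

        public function __construct()
        {
            $this->clients = new \SplObjectStorage();
        }

        public function onOpen(ConnectionInterface $conn)
        {
            $this->clients->attach($conn);
        }

        public function onMessage(ConnectionInterface $from, $msg)
        {
            foreach ($this->clients as $client) {
                $client->send($msg); // relay progress to every watcher
            }
        }

        public function onClose(ConnectionInterface $conn)
        {
            $this->clients->detach($conn);
        }

        public function onError(ConnectionInterface $conn, \Exception $e)
        {
            $conn->close();
        }
    }

    // Listen for WebSocket connections on port 8080 (the port is an assumption).
    $server = IoServer::factory(
        new HttpServer(new WsServer(new ProgressBroadcaster())),
        8080
    );
    $server->run();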
On the other hand, to run the calculations and store that much data, I would use a NoSQL store instead of MySQL.
Since you're kind of pinched for time already, migrating to Node.js may not be something you can do right away. It would help, though, with notifying users when their results are ready, since it can push notifications to the browser without polling. And because it uses JavaScript, you might find some of your client-side code is reusable.
I think you can run what you need in the background with some kind of queue manager. I use something similar with CakePHP, and it lets me run time-intensive processes in the background asynchronously, so the browser does not need to stay open.
Another plus of this approach is that it's scalable, as it's easy to increase the number of queue workers running.
Basically, with PHP you just need a cron job that runs every once in a while and starts a worker; the worker checks a queue table for pending tasks, and if none are found it keeps polling in a loop until one shows up.
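A very rough sketch of such a worker, assuming a hypothetical jobs table with id, payload and status columns and a hypothetical process_csv() task function:

    <?php
    // worker.php - started by cron, e.g.:  * * * * * php /path/to/worker.php
    // The jobs table, its columns and process_csv() are assumptions.
    $pdo = new PDO('mysql:host=localhost;dbname=tools', 'db_user', 'db_pass');

    while (true) {
        // Grab the oldest pending job, if any.
        $stmt = $pdo->query(
            "SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
        );
        $job = $stmt->fetch(PDO::FETCH_ASSOC);

        if ($job === false) {
            sleep(5);      // nothing to do; check again in a few seconds
            continue;
        }

        // Mark it as running (a real setup needs locking so two workers
        // don't grab the same job).
        $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
            ->execute([$job['id']]);

        process_csv($job['payload']);   // the long-running work itself

        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
            ->execute([$job['id']]);
        // This is also where you could mail() the user that the tool finished.
    }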
I am a PHP developer, and the title basically says it all. However, I was hoping for some more in-depth information, as I am starting to get confused about how the flow of the project I work on should go.
For a web application I need to implement a feature like Facebook has: notifying users about replies/comments and showing these instantly.
I figured I could use long polling with AJAX requests, but this does not seem to be a nice solution, as the notifications are never really instant and it is resource-heavy.
So if I understand correctly I should use some form of sockets, and Node.js would be a good choice. Based on that assumption I now get confused about the workflow.
I thought about two possible solutions:
1) It seems to me that if I used Node.js I could skip PHP entirely and base the application on Node.js only.
2) Or I could use PHP as the base and use Node.js only for notifying users and instantly showing messages, while saving the data with PHP and MySQL.
These two possibilities confuse me and I can't make up my mind about what would be the "best" and cleanest way.
I do not have much experience with Node.js; I have only played with it for a while. But managing and saving data seems to be hard in Node.js, so that is why I came up with option 2.
I know Facebook is built on PHP, so I am assuming that they save the data via PHP and notify / instantly show replies and comments via Node.
Could someone help me out on this?
Thanks in advance!
EDIT:
I just noticed Stack Overflow does something similar. I get a notification in the upper left, and below my question a box with "new answer to this question". I am really interested in the technology (or technologies) used.
Well, you could use Node.js for the notifications and PHP for your app.
By googling I found this about real-time notifications.
You could also just use Node.js with Socket.IO, but this means that you have to learn new technologies, as you mention that you have no experience with Node.
I haven't used it, but you could check this project for WebSockets in PHP.
When you have an update that you want to notify users about, you can use the publish/subscribe pattern to notify those interested in that update.
Take a look at Gearman too.
Personally, I've built a notification system using the pub/sub mechanism of Redis, with Node.js + Socket.IO. Every time there is an update on a record, a message is published on the appropriate channel. If the channel has listeners, they are notified. I also store the last 20 notifications in a Redis list.
The application is built in PHP. The notification system is built in Node.js. They are different applications that see the same data; the communication happens via Redis. For example, in the Facebook context:
1) A user updates his status.
2) PHP stores this in the database and in Redis (see the PHP sketch after this list).
3) The update is published on that specific user's status channel in Redis.
4) All the friends of that user are listening on his status channel (this is where Node.js comes in).
5) Node.js pushes the notification to the browser with Socket.IO.
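A minimal sketch of the PHP side of steps 2-3, using the phpredis extension; the channel naming ('status:<user id>') and payload shape are assumptions:

    <?php
    // After storing the new status in MySQL, hand it over to the Node.js side.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    $userId  = 42;                 // the user who updated their status
    $payload = json_encode([
        'type' => 'status_update',
        'user' => $userId,
        'text' => 'Out for lunch',
    ]);

    // Anyone subscribed to this user's status channel gets it immediately.
    $redis->publish('status:' . $userId, $payload);

    // Keep only the last 20 notifications, as described above.
    $redis->lPush('notifications:' . $userId, $payload);
    $redis->lTrim('notifications:' . $userId, 0, 19);

The Node.js process subscribes to the same channels and forwards each message to the matching Socket.IO clients.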
As for Facebook, I have read in an article that it uses long polling to support older browsers. I'm not sure about this, though; it needs a citation...
AFAIK it would be done via one of two simple methods:
The first, which could be very simple, is adding a Boolean column to each record that indicates whether the user has been notified about it or not.
The second method is creating a separate table into which all notifications are inserted.
However, I'm not sure whether there are alternative methods with better performance, but the first method is what I commonly do myself. I think Facebook uses the second method, though, because it has to deliver each notification to a lot of users (a rough sketch of that approach follows below).
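A minimal sketch of the second method, assuming a hypothetical notifications table; all names here are just an illustration:

    <?php
    // Assumed schema:
    //   CREATE TABLE notifications (
    //     id INT AUTO_INCREMENT PRIMARY KEY,
    //     user_id INT NOT NULL,              -- who should see it
    //     message VARCHAR(255) NOT NULL,
    //     is_read TINYINT(1) NOT NULL DEFAULT 0,
    //     created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    //   );
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'db_user', 'db_pass');

    // Insert one notification row per recipient when something happens.
    $insert = $pdo->prepare(
        'INSERT INTO notifications (user_id, message) VALUES (?, ?)'
    );
    $insert->execute([42, 'Alice commented on your post']);

    // When the user checks, fetch (and later mark as read) their unread rows.
    $select = $pdo->prepare(
        'SELECT id, message, created_at FROM notifications
         WHERE user_id = ? AND is_read = 0 ORDER BY id DESC'
    );
    $select->execute([42]);
    $unread = $select->fetchAll(PDO::FETCH_ASSOC);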
Your question may be a duplicate of:
Facebook like notifications tracking (DB Design)
Database design to store notifications to users
You could use Server-Sent Events (SSE); it involves a bit of JavaScript, but nothing overly complicated, I think.
The bulk of this method is PHP, though, so you would just use PHP to query your DB for notifications, and SSE will push them to the user.
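A rough sketch of such an endpoint; the notifications table and query are assumptions:

    <?php
    // sse.php - the browser connects with:  new EventSource('sse.php');
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    $pdo    = new PDO('mysql:host=localhost;dbname=app', 'db_user', 'db_pass');
    $lastId = 0;

    while (true) {
        $stmt = $pdo->prepare(
            'SELECT id, message FROM notifications WHERE id > ? ORDER BY id'
        );
        $stmt->execute([$lastId]);

        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            $lastId = $row['id'];
            // SSE wire format: a "data:" line followed by a blank line.
            echo 'data: ' . json_encode($row) . "\n\n";
        }

        // Push anything buffered out to the client immediately.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();

        sleep(2);   // check for new notifications every couple of seconds
    }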
It does have some limitations, though; most notably it's not supported by IE (huge surprise). I thought I'd mention it anyway to let you know of other possibilities.
Hope this helps
I'm starting a new project very soon, and I'm considering long polling to notify users that they have a new private message/notification to check, really similar to how Facebook notifies you that someone has posted something about you or liked a photo of you.
From what I have read, CometD seemed like a really good option to start with.
Then other options started emerging, like Socket.IO and Node.js.
Now, my question is: which one do you think is the best option for this case, and why?
What I need to do exactly is the following:
User 1 logs into their account
User 2 sends User 1 a message, which gets stored in the database, and a flag is generated (if possible?!)
The PHP script responsible for User 1 should pick up the flag and push a notification to User 1.
I know how to take care of the JavaScript side, but I have never done anything similar to long polling.
I'm using jQuery as my JavaScript library and PHP on the server side.
So, any recommendations or good resources for doing this?
It would be beneficial to use a combination of PHP and Node.js. Node.js is made for persistent connections and push communication instead of polling.
http://nodejs.org/
Here is a quick video I found: http://vimeo.com/29099827
Hi, I am designing a chat application. I have tried using PHP + MySQL + JavaScript and developed a web-based chat app, but the application is quite slow. All I do is store the messages from the clients in a table, and each client retrieves the table at a lag of 2 seconds. I manage this with AJAX.
But the app's responsiveness becomes very bad when many users join.
Is there any other technique to build the chat app, or what else can I do to make my app better?
Any help is appreciated; thanks in advance. :)
I don't know if you've touched upon it, but I'd recommend using an IRC server as the backend.
If you have the ability to install Node.js (basically server-side JavaScript) on your server, you should take a look at Socket.IO, which is a library for Node.js. This gives you an almost instant response time, even for quite large numbers of users.
There are almost no libraries/projects for real-time web applications in PHP. But if you really want to stick with it, you can take a look at a technique called long polling. This can still be quite heavy on your server, though.
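For reference, a bare-bones long-polling endpoint could look roughly like this (the table, columns and timeout are assumptions); the client re-issues the request as soon as a response comes back:

    <?php
    // long_poll.php - holds the request open until there is something new
    // or a timeout is reached.
    $pdo     = new PDO('mysql:host=localhost;dbname=chat', 'db_user', 'db_pass');
    $sinceId = isset($_GET['since_id']) ? (int) $_GET['since_id'] : 0;
    $start   = time();

    header('Content-Type: application/json');

    do {
        $stmt = $pdo->prepare(
            'SELECT id, sender, body FROM messages WHERE id > ? ORDER BY id'
        );
        $stmt->execute([$sinceId]);
        $new = $stmt->fetchAll(PDO::FETCH_ASSOC);

        if ($new) {
            echo json_encode($new);   // client renders these and polls again
            exit;
        }

        sleep(1);                     // wait a moment before checking again
    } while (time() - $start < 25);   // give up after ~25 s; client reconnects

    echo json_encode([]);             // nothing new within the timeout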
Maybe it is slow because each client retrieves the full table of chat contents; try fetching and appending only the new messages.
I did the same thing last year.
I suggest retrieving only a limited number of chats when the user first logs in, e.g. the last 30, and displaying them. Then, on each subsequent AJAX call, only retrieve new chat messages rather than everything again. You can do this by storing the latest chat id (a unique id) when you first retrieve the chat messages, and then only retrieving anything with a chat id greater than that.
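A minimal sketch of that endpoint, assuming a hypothetical chats table with an auto-increment id column:

    <?php
    // messages.php - returns only the chat lines the client has not seen yet.
    // The client sends the highest id it already has as ?last_id=...
    $pdo    = new PDO('mysql:host=localhost;dbname=chat', 'db_user', 'db_pass');
    $lastId = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;

    if ($lastId === 0) {
        // First load: just the most recent 30 messages, oldest first.
        $stmt = $pdo->query(
            'SELECT id, sender, body FROM chats ORDER BY id DESC LIMIT 30'
        );
        $rows = array_reverse($stmt->fetchAll(PDO::FETCH_ASSOC));
    } else {
        // Subsequent polls: only messages newer than what the client has.
        $stmt = $pdo->prepare(
            'SELECT id, sender, body FROM chats WHERE id > ? ORDER BY id'
        );
        $stmt->execute([$lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    }

    header('Content-Type: application/json');
    echo json_encode($rows);   // the client remembers the last id it received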
How often do you check for new messages? You could also increase that interval.