I'm building a groceries application that allows syncing with other members of your family who are part of a 'family' group, so that when one of them updates the grocery list on their device, it updates automatically on the other devices. I can do pretty much everything except the 'automatically' bit.
I've got .load working with setInterval, but it's only stable when the interval is set to a few minutes, because making the call once every few seconds is a bit excessive on the server :\
I believe the way to do this is with long polling, which I still have no idea how to do, but could anyone suggest how I could do this efficiently? In a way that might not lag like crazy on mobile too? Because I do intend to push this over to mobile.
Or, if it means less load on the server, does anyone know how to do it the way Twitter does, showing '1 new Tweet' when new content is detected?
Any help greatly appreciated! :)
Cheers,
Karan
If your frequent polling is too excessive for the server, then you need to revise the logic on the server. Rather than hitting the database on every single request, the server should cache a status in a session variable or something similar. Then, when a user makes a change, invalidate the cache, so that only after a change does one of those poll requests from the JavaScript actually incur the full hit on the server.
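As a minimal sketch of that idea (names, the APCu cache and the table are all illustrative, not from the question), the poll endpoint compares a cheap "list version" value against what the client last saw, and only touches MySQL when they differ; the save path just stores a new version, e.g. apcu_store('grocery_list_version', time()):

```php
<?php
// poll.php -- hypothetical endpoint; any shared cache (APCu, memcached, a file) works.
$clientVersion  = (int) ($_GET['version'] ?? 0);
$currentVersion = apcu_fetch('grocery_list_version') ?: 0;

if ($clientVersion >= $currentVersion) {
    // Nothing changed since the client last polled: answer cheaply, no database hit.
    header('Content-Type: application/json');
    echo json_encode(['version' => $currentVersion, 'changed' => false]);
    exit;
}

// Something changed -> do the real (expensive) work once.
$pdo   = new PDO('mysql:host=localhost;dbname=groceries', 'user', 'pass');
$items = $pdo->query('SELECT id, name, qty FROM grocery_items')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode(['version' => $currentVersion, 'changed' => true, 'items' => $items]);
```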
Another thing I would say is to be careful with the long-running response paradigm. It's a handful, and practically everything I've seen in the wild has used frequent polling instead. Take a peek at this old thread:
Comet VS Ajax polling
If you are still interested in the long running response have a look at Comet.
So as you know, in some browser games such as Travian, Tribalwars and so on, you can start constructing a building, it takes X amount of time, and then it finishes.
So I'm curious how that is done?
Is it a cron job running every second, or what? And how do they handle troops? They can't have a cron job running every millisecond; that wouldn't be resource friendly, right?
I'm really curious about this, and since I have no idea where to start, I can't really say I've tried anything. I have searched around, though, and never found anything helpful.
Thanks.
The easiest way to implement something like this is with simple timestamps. The request on the front end generates a timestamp based on the constraints given by the details of the request (what type of building you are building, what level you are, whether you have bought the upgrade). That completion timestamp is then inserted into the database. Then, if you want the browser to refresh when the job is up, you write a script in JS that requests all timestamps in the queue and reloads when they come up.
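A rough PHP sketch of that approach; the table and column names (build_queue, finishes_at) are invented for illustration. Nothing runs on a timer server-side: the completion time is stored once, and anything whose timestamp has passed is treated as finished the next time it's looked at.

```php
<?php
// start_build.php -- illustrative only; table/column names are made up.
session_start();
$playerId = (int) $_SESSION['player_id'];   // however you identify the player
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$buildSeconds = 300;                 // e.g. derived from building type / level / upgrades
$finishesAt   = time() + $buildSeconds;

$stmt = $pdo->prepare('INSERT INTO build_queue (player_id, building, finishes_at) VALUES (?, ?, ?)');
$stmt->execute([$playerId, 'barracks', $finishesAt]);

// Elsewhere, whenever the player loads a page (or the JS polls for queued jobs),
// anything whose timestamp has passed is simply treated as complete:
$done = $pdo->prepare('SELECT id, building FROM build_queue WHERE player_id = ? AND finishes_at <= ?');
$done->execute([$playerId, time()]);
foreach ($done->fetchAll(PDO::FETCH_ASSOC) as $job) {
    // apply the completed building, remove the queue row, etc.
}
```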
One way: you let the client handle the timer. The timer sits on the browser side, counting down in JavaScript. When the time is up, it contacts the server to see if it's valid (never trust client-side code). The server looks up the building and sees whether it would have finished by then. The server side doesn't need to keep any timers; it just answers requests. Timers are UI-side.
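A sketch of that server-side check, reusing the hypothetical finishes_at column from the previous example: the client's "my timer is done" request is never trusted, the stored timestamp decides.

```php
<?php
// finish_build.php -- hypothetical endpoint the client calls when its countdown hits zero.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$jobId = (int) ($_POST['job_id'] ?? 0);
$stmt  = $pdo->prepare('SELECT finishes_at FROM build_queue WHERE id = ? AND player_id = ?');
$stmt->execute([$jobId, (int) $_SESSION['player_id']]);
$finishesAt = $stmt->fetchColumn();

header('Content-Type: application/json');
if ($finishesAt !== false && $finishesAt <= time()) {
    // The server's own clock agrees: mark the building complete.
    echo json_encode(['done' => true]);
} else {
    // Client clock was wrong (or someone is cheating): tell it to keep waiting.
    echo json_encode(['done' => false, 'remaining' => $finishesAt ? max(0, $finishesAt - time()) : null]);
}
```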
Well, PHP is one of the worst things you could use to build a game... In games, everything is controlled by the main loop. So basically, in a game, everything runs inside an infinite loop, albeit one that allows for user input without freezing the computer, obviously. That loop takes care of the timing as well, and how to compute timing depends on the language the game is developed in. For web-based games, Java, JavaScript and Flash are the usual choices.
I am a programmer at an internet marketing company that primarily makes tools. These tools have certain requirements:
They run in a browser and must work in all of them.
The user either uploads something (.csv) to process or they provide a URL and API calls are made to retrieve information about it.
They are moving around THOUSANDS of lines of data (think large databases). These tools literally run for hours, usually overnight.
The user must be able to watch live as their information is processed and is presented to them.
Currently we are writing in PHP, MySQL and Ajax.
My question is: how do I process LARGE quantities of data and still provide a live user experience while the tool is running? Currently I use a custom queue system that sends Ajax calls and inserts rows into tables or data into divs.
This method is a huge pain in the ass and can't possibly be the correct approach. Should I be using a templating system, or is there a better way to refresh chunks of the page with A LOT of data? And I really do mean a lot of data, because we come close to maxing out PHP's memory limit, which is something we always have to watch out for.
Also I would love to make it so these tools could run on the server by themselves. I mean upload a .csv and close the browser window and then have an email sent to the user when the tool is done.
Does anyone have any methods (programming standards) for me that are better than using .ajax calls? Thank you.
I wanted to update with some notes in case anyone has the same question. I am looking into the following to see which is the best solution:
SlickGrid / DataTables
GearMan
Web Socket
Ratchet
Node.js
These are in no particular order and the one I choose will be based on what works for my issue and what can be used by the rest of my department. I will update when I pick the golden framework.
First of all, you cannot handle big data via Ajax. To let users watch the processing live, you can use WebSockets. Since you are experienced in PHP, I can suggest Ratchet, which is quite new.
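A rough sketch of what a Ratchet push server looks like (composer require cboden/ratchet); the class name, port and "broadcast everything" behaviour are illustrative, Ratchet's own hello-world tutorial is the place to start:

```php
<?php
// push_server.php -- minimal Ratchet sketch; Progress and port 8080 are made up for the example.
require __DIR__ . '/vendor/autoload.php';

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

class Progress implements MessageComponentInterface {
    protected $clients;

    public function __construct() {
        $this->clients = new \SplObjectStorage;
    }

    public function onOpen(ConnectionInterface $conn) {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg) {
        // Relay progress updates to every connected browser.
        foreach ($this->clients as $client) {
            $client->send($msg);
        }
    }

    public function onClose(ConnectionInterface $conn) {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e) {
        $conn->close();
    }
}

$server = IoServer::factory(new HttpServer(new WsServer(new Progress())), 8080);
$server->run();
```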
On the other hand, to do the calculations and store big data, I would use NoSQL instead of MySQL.
Since you're kind of pinched for time already, migrating to Node.js may not be practical right now. It would, however, help with notifying users when their results are ready, as it can push browser notifications without polling. And since it uses JavaScript, you might find some of your client-side code is reusable.
I think you can run what you need in the background with some kind of Queue manager. I use something similar with CakePHP and it lets me run time intensive processes in the background asynchronously, so the browser does not need to be open.
Another plus side for this is that it's scalable, as it's easy to increase the number of queue workers running.
Basically, with PHP you just need a cron job that runs every once in a while and starts a worker that checks a queue database for pending tasks. If none are found, it keeps running in a loop until one shows up.
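A bare-bones sketch of such a worker, with a made-up jobs table and a placeholder processCsv() for the long-running import; cron would launch it (and you'd normally add a lock file so only one copy runs at a time):

```php
<?php
// worker.php -- hypothetical queue worker started by cron, e.g.  * * * * * php worker.php
$pdo = new PDO('mysql:host=localhost;dbname=tools', 'user', 'pass');

while (true) {
    $stmt = $pdo->query(
        "SELECT id, csv_path, user_email FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
    );
    $job = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($job === false) {
        sleep(5);          // nothing to do: wait, then check the queue again
        continue;
    }

    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);

    processCsv($job['csv_path']);   // the long-running import, defined elsewhere

    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
    mail($job['user_email'], 'Your import is finished', 'The tool has finished processing your file.');
}
```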
I recently inherited a PHP application using the CodeIgniter framework - which handles authentication, user sessions, CRUD operations, routing, templating, and pretty much all aspects of the application just beautifully. However, one feature requires the use of long-polling. Certain admins need near real-time updates on what other users are doing. Everything is working fine now for a few hundred users, but we're scaling up to support a few thousand, and I'm worried the long-polling will cause some performance issues.
Here's the basics of the long-poll process:
The browser makes a GET request which kicks off the long-poll process.
The long-poll process checks the timestamp of a txt file every quarter of a second.
a) If the txt file is updated, the long-poll process returns the changes to the browser and the view is updated.
b) If no changes are found in 25 seconds, the long-poll ends and returns null.
Repeat step 1.
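The check loop itself is only a few lines of PHP; this is a rough equivalent of what's running now, with the file path and response shape simplified as placeholders:

```php
<?php
// longpoll.php -- rough equivalent of the process described above.
$statusFile = '/path/to/activity.txt';               // placeholder path
$lastSeen   = (int) ($_GET['last_seen'] ?? 0);       // mtime the browser saw last time
$deadline   = time() + 25;                           // give up after 25 seconds

while (time() < $deadline) {
    clearstatcache(true, $statusFile);
    $mtime = filemtime($statusFile) ?: 0;

    if ($mtime > $lastSeen) {
        // Something changed: hand the new contents straight back to the browser.
        header('Content-Type: application/json');
        echo json_encode(['mtime' => $mtime, 'data' => file_get_contents($statusFile)]);
        exit;
    }
    usleep(250000);   // 1/4 second, as in the existing implementation
}

// Nothing changed within 25 seconds: return null and let the browser reconnect.
header('Content-Type: application/json');
echo json_encode(null);
```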
This process takes place outside of the CodeIgniter framework. I'm thinking it would be a good idea to replace it with a socket.io implementation.
Will socket.io be a better solution than the long-polling? If so, what convincing evidence do I have without actually building a demo and doing load testing? It seems like a good idea in my head, but I need to justify the time and effort to make the switch.
Also, sorry if this doesn't make sense, or 'is not a real question' by SO standards. I'm pretty new to back-end scalability and most of this stuff is brand new to me. Please offer helpful guidance for rewording (if needed) before downvoting. Thanks.
EDIT: The preference here is to keep things as-is, because it's expensive to rip out code and replace it with new things, especially when new things are untested. So I guess my question is, when/if the long-polling solution hits a brick wall, will socket.io be a viable replacement?
I think WebSockets are better than long-polling: with polling you incur unnecessary network overhead, which only grows as your application scales. Check these two links:
http://www.websocket.org/quantum.html
http://dsheiko.com/weblog/websockets-vs-sse-vs-long-polling
Why is HTTP-Polling so laggy?
What I have is a button, and whenever a user clicks it, a MySQL database field gets updated and the value is displayed to the user. I'm polling every 800 milliseconds and it's very laggy/glitchy. Sometimes a button click doesn't register at all. And I actually need to poll quite a bit more frequently than every 800 milliseconds.
This is also with just one user on the website at a time, when in the end there are going to be many at once.
HTTP-streaming/Long-polling/Websockets instead of polling
When you need real-time information you should avoid (frequent) polling. Below I'll try to explain why it's the wrong approach. You could compare it to a child in the back of your car screaming "are we there yet?" every second while you reply "we are not there yet" over and over.
Instead you want something like long-polling/HTTP streaming or WebSockets. You could compare that to a child in the back of your car who asks you to let him know when "we are there" instead of asking every second. You can imagine this is far more efficient than the previous example.
To be honest, I don't think PHP is the right tool for this kind of application (yet). Some options you have available are:
hosted solutions:
http://pusherapp.com:
Pusher is a hosted API for quickly, easily and securely adding scalable realtime functionality via WebSockets to web and mobile apps.
Our free Sandbox plan includes up to 20 connections and 100,000 messages per day. Simply upgrade to a paid plan when you're ready.
http://beaconpush.com/
Beaconpush is a push service for creating real-time web apps using HTML5 WebSockets and Comet.
host yourself:
http://socket.io:
Socket.IO aims to make realtime apps possible in every browser and mobile device, blurring the differences between the different transport mechanisms.
When you get very big, the "host yourself" solution is going to be less expensive, but on the other hand something like Pusher will get you started more easily (friendly API) and is not that expensive either. For example, Pusher's "Bootstrap" plan allows 100 concurrent connections and 200,000 messages per day for $19 per month (though while you're small, Beaconpush is cheaper => do the math :)). As a side note, this plan does not include SSL, so it can't be used for sensitive data. I guess a dedicated machine (VPS) will cost you about the same amount of money (for a simple website) and you will also have to manage the streaming solution yourself, but once you get bigger that is probably far more attractive.
Memory instead of Disc
whenever a user clicks it a MySQL database field gets updated and the value is displayed to the user
Compared to memory, disc I/O (MySQL in standard mode) is extremely slow. You should be using an in-memory store like redis (which also has persistent snapshots) or memcached (completely in memory) to speed up the process. I really like redis for its insane speed, simplicity and persistent snapshots. http://redistogo.com/ offers a free plan with 5MB of memory which will probably cover your needs. If not, the mini plan at $5 a month will probably cover you, but as you get even bigger a VPS will be cheaper and, in my opinion, the preferred solution.
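As a sketch of what that looks like in PHP (using the phpredis extension; Predis has the same method names, and the key name is made up), the click becomes a single in-memory increment instead of a MySQL write on the hot path:

```php
<?php
// Hypothetical click counter kept in Redis instead of a MySQL column.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// When the button is clicked: one in-memory increment, no disc I/O.
$count = $redis->incr('button:clicks');

// When polling (or pushing) the value back to the browser:
header('Content-Type: application/json');
echo json_encode(['clicks' => (int) $redis->get('button:clicks')]);
```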
Best solution
The best solution (especially if you are getting big) is to host socket.io/redis yourself on a VPS (costs money). If you're really small I would use redistogo; if not, I would host it myself. I would also start with something like Beaconpush/Pusher because of its simplicity (you can get started immediately). Hosting socket.io yourself (I'd advise playing with it on your own machine before you get big) is pretty simple, but in my opinion more difficult than Beaconpush/Pusher.
Laggy/glitchy? Sounds like a client-side problem. As does the button thing. I'd get your JavaScript in order first.
As for polling, 0.8 seconds sounds a bit time-critical. I don't know about most countries, but here in the third world simple network packets can get delayed for as long as a few seconds. (Not to mention connection drops, packet losses and the speed of light.) Is your application ready to cope with all that?
As for an alternate approach, I agree with @Vern that an interrupt-driven one would be much better. In HTTP terms, it translates to a long-standing HTTP request that does not receive a response until the server has some actual data to send, minimizing delay and bandwidth. (AFAIK) it's an older technique than AJAX, though it has been named more recently. Search for "COMET" and you'll find both client- and server-side libraries.
There are many things that might cause the lag you are experiencing. Your server might be able to process the requests fast enough, but if the connection between the client and the server is slow, you'll still see obvious lag.
The first thing you should try is to ping the server and see what response time you're getting.
Secondly, rather than poll, you might want to consider an interrupt-driven approach. This means that only when your server replies will you send out your next request. That way, many clients won't flood the server with requests to the point where it cannot cope. This is especially true when the RTT (round-trip time) of your requests is fairly long.
Hope it helps. Cheers!
A good place to start would be to use a tool like Firebug in Mozilla Firefox that will allow you to watch the requests being sent to the server and look for bottlenecks.
Firebug will break down each part of the request, so you can see if you are having trouble talking to the server or if it is simply taking a long time to come up with a response.
Along with @Vern's answer, I would also say that, if at all possible, I would have the server cache the data ahead of time, so that all of the clients pull from that same cache and don't need separate MySQL calls to reach the same data on every update. Then you just have your PHP update the cache whenever the actual DB data changes.
By cache I mean having PHP write to a file on the server side; clients then simply look at the contents of that one file to see the most up-to-date info. There might be better ways of caching, but since I have never done this personally before, this is the first solution that popped into my mind.
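A tiny sketch of that file-based cache, with made-up file and field names: the write happens only when the data actually changes, and every polling client just reads the file, never MySQL.

```php
<?php
// update.php -- runs only when the underlying data actually changes.
$cacheFile = '/var/www/cache/button_state.json';     // made-up path
$newValue  = (int) ($_POST['value'] ?? 0);            // however the change arrives
file_put_contents($cacheFile, json_encode(['value' => $newValue, 'updated' => time()]), LOCK_EX);

// poll.php -- what every client hits; no MySQL involved.
header('Content-Type: application/json');
readfile('/var/www/cache/button_state.json');
```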
I wrote a PHP chat script with jQuery on the user end, pull-style. It reloaded a pull.php page once a second and retrieved only the chat records added to my SQL database since the last check.
However, I got an email from my host saying I was using too much bandwidth. Bleh. My chat was working just as I wanted, and then this happened. I'm scared to do a COMET PHP chat because I was told it uses a separate process or thread for each user.
I suppose I just need a strategy that's good, and more efficient than what I had.
Well, you've got the premise down. One-second recalls are far too frequent; can anyone bang out a response in a second and expect it to go through? It takes me at least 2 seconds to get my SO messages together and relatively typo-free.
So part 1 - increase the recall time. If you kick it up to 10 seconds, the app will still feel pretty speedy, but you're decreasing processing by a factor of 10, which is huge. Remember, every message includes extra data (in the HTML and in all the network layers that support it), and every message the server receives requires processing to decide what to do with it.
Part 2 - You might want to introduce a sliding scale of interactivity speed. When the user isn't typing and hasn't received a message in a while the recheck rate could be decreased to perhaps 30 seconds. If the user is receiving more than one message in a "pull" or has received a message for the last 3 consecutive pulls then the speed should increase (perhaps even more frequent than 10 seconds). This will feel speedy to users during conversations, and not slow/laggy when there's nothing going on.
Part 3 - Sounds like you're already doing it, but your pull should return the minimum possible reply: just the HTML fragment to render the new messages, or perhaps even just a JSON structure that can be rendered by client-side JavaScript. If you're currently getting all the messages again (bad) or the whole page again (worse), you can seriously decrease your bandwidth this way.
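A sketch of such a minimal pull endpoint (table, column and parameter names are invented): the client sends the id of the last message it has, and gets back only the newer rows as JSON for client-side rendering.

```php
<?php
// pull.php -- returns only messages newer than the last id the client has seen.
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');

$lastId = (int) ($_GET['last_id'] ?? 0);
$stmt = $pdo->prepare('SELECT id, author, body, posted_at FROM messages WHERE id > ? ORDER BY id');
$stmt->execute([$lastId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));   // [] when there is nothing new
```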
Part 4 - You may want to look at alternate server software. Apache has some serious overhead... and for good reason: you've got all your modules running, Apache config, .htaccess config, rewrites, the kitchen sink. You might be able to switch to a lighter-weight HTTP server such as nginx or lighttpd and see a serious performance increase (strictly speaking it won't help much with bandwidth, but one wonders whether that's what the host is really complaining about, or whether your service is consuming serious server resources).
Part 5 - You could look at switching host or package. A horrible thought, I'm sure, but there may be packages with more bandwidth/processor available (the latter being more key, I suspect)... you could even move to your own server, which lets you do what you want without the service provider getting involved.
You don't need to check for a new message every second, kick it down to maybe 3 or 4. Make sure you are only using 1 query and that the file loaded is as small as possible. And I would always add a timeout... it would stop reloading after a minute or 2 of no activity.
Make these changes and see where that puts you.
Post some code if you want to see if we can help.
I believe that comet is the only solution.
Threads are much lighter than processes, so you would want to use a COMET-type technology with a lightweight threaded web server like lighttpd.
Take a look at Simple Chat. It's very tiny and extensible, I think.
All you need to do is "move ahead with one of the standard solutions available"....
Comet, WebSocket and XMPP are a few terms you will hear every now and then; just pick one of them and it should be more efficient than a classic periodic-Ajax-call-based PHP chat.
If you want to go ahead with PHP and jQuery based solution, try these two working examples of browser based chat system using XMPP: http://bit.ly/jaxlBoshChat (for one-to-one chat example) and http://bit.ly/jaxlBoshMUC (for multi-user chat example). Let me know if you face any problem setting them up.