Check Internet connection before displaying web page content - php

In web development, is it possible to test the bandwidth/internet connection speed of the client device and, based on that test, filter which items are displayed?
For instance, is it possible to avoid showing social media buttons if the user's connection speed is low?
I am using PHP for server, HTML/jquery for client side.

There is no easy way to do this; there are no built-in functions for it. Using a combination of PHP and JS there are workarounds, but honestly, if your user has a slow connection, all that extra JavaScript to download will slow things down even more. You're really better off just making your website as compact as possible: use style sheets, avoid excessive images, and so on.

You'll have to use JavaScript to download files of varying sizes and time how long each takes. Then, based on those timings, you can handle things differently. Take a look here: Bandwidth utility using javascript
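As a rough illustration of the server side of that idea, the PHP below assumes the client-side timing script has already stored its measurement in a cookie; the cookie name, the threshold, and the include file are all hypothetical choices, not part of the original answer.

```php
<?php
// Sketch only: assumes client-side JavaScript timed a small download and
// stored the result (in kbit/s) in a cookie named "conn_kbps". The cookie
// name, the 512 kbit/s threshold, and social_buttons.php are illustrative.
$speed = isset($_COOKIE['conn_kbps']) ? (int) $_COOKIE['conn_kbps'] : null;

// Unknown speed: assume the worst and keep the page lean.
$showSocialButtons = ($speed !== null && $speed >= 512);

if ($showSocialButtons) {
    include 'social_buttons.php'; // hypothetical partial containing the share widgets
}
```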

Related

Many HTTP requests (API) Vs Everything in single php

I need some advice on website design.
Let's take Twitter as an example for my question. Let's say I am building Twitter. On home_page.php I need both data about the tweets (tweet id, who tweeted, tweet time, etc.) and data about the user (userId, username, user profile pic).
To display all this, I have two options in mind:
1) Make separate PHP files like tweets.php and userDetails.php, and use AJAX requests to get the data onto home_page.php.
2) Put all the PHP code (connecting to the DB, fetching data) in home_page.php itself.
With option one, I need to make many HTTP requests, which (I think) will put load on the network and so might slow down the website.
On the other hand, with option one I will have a defined REST API, which will be good for adding more features in the future.
Please give me some advice on picking the best approach. I am still a learner, so if there are other ways to implement this, please share.
With option 1 you're reliant on JavaScript, which doesn't follow progressive enhancement or graceful degradation; if a user doesn't have JS they will see no content at all, which is obviously bad.
Split your code into manageable PHP files to make it easier to read, and require them all from one main PHP file; this won't cost any extra HTTP requests because all the includes are done server side and one page is sent back.
You can add additional JavaScript to grab more "tweets" like Twitter does, but don't make the main functionality rely on JavaScript.
Don't think of PHP applications as a collection of PHP files that map to different URLs. A single PHP file should handle all your requests and include functionality as needed.
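A minimal sketch of that single-entry-point idea follows; the file names and routes are purely illustrative.

```php
<?php
// index.php - minimal front-controller sketch. File names and routes are
// illustrative; the point is that one entry file includes everything it needs
// server side, so the browser still makes only one request per page.
require 'config.php';     // DB credentials, constants
require 'database.php';   // e.g. sets up a PDO connection in $db
require 'functions.php';  // shared helpers

$page = isset($_GET['page']) ? $_GET['page'] : 'home';

switch ($page) {
    case 'home':
        require 'views/home_page.php'; // renders tweets + user details together
        break;
    case 'profile':
        require 'views/profile.php';
        break;
    default:
        header('HTTP/1.1 404 Not Found');
        require 'views/404.php';
}
```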
In network programming, it's usually good to minimize the number of network requests, because each request introduces an overhead beyond the time it takes for the raw data to be transmitted (due to protocol-specific information being transmitted and the time it takes to establish a connection for example).
Don't rely on JavaScript. JavaScript can be used for usability enhancements, but must not be used to provide essential functionality of your application.
Adding to Kiee's answer:
It can also depend on the size of your content. If your tweets and user info are very large, the response from the single PHP file will take considerable time to prepare and deliver. In that case you should go for a "minimal viable response" (i.e. the last 10 tweets plus the 10 most popular users, or similar).
But one thing you will definitely have to do is create an API to bring your page to life, no matter which approach you use...
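As a rough sketch of such a minimal endpoint, it could look something like the following; the table names, columns, and credentials are all assumptions for illustration.

```php
<?php
// tweets.php - hypothetical "minimal viable response" endpoint: only the
// latest 10 tweets joined with each poster's basic details, returned as JSON.
// Schema and credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'db_user', 'db_pass');

$stmt = $pdo->query(
    'SELECT t.id, t.body, t.created_at,
            u.id AS user_id, u.username, u.profile_pic
       FROM tweets t
       JOIN users u ON u.id = t.user_id
   ORDER BY t.created_at DESC
      LIMIT 10'
);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```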

Reduce MySQL query amount with jQuery and PHP

I am building a "multiplayer world" with jQuery and PHP. Here is a bit how it works:
Each user's character position is taken from a database and the character is plotted accordingly (the position values are CSS values: left and top).
The user can move about using the arrow keys, which moves their character with jQuery animations. While this is happening (on each arrow press), the user's position values are inserted into the database and updated.
To make this "global" (so users see each other), the values need to be fetched and updated for every user via AJAX.
The problem I am having is that I need to continuously call a JavaScript function I wrote which connects to the MySQL server and grabs values from a database table, and it has to be called constantly via setInterval(thisFunction, 1000);. My host just suspended me for overloading the server's resources, and I think this was because of all my MySQL queries. On top of grabbing values from the database repeatedly, I also had to insert values every few seconds, so I can imagine that would cause a crash over time if enough clients logged in. How can I reduce the number of queries I am making? Is there another way to do what I need to do? Thank you.
This is essentially the same thing as a chat system with regard to resource usage. Try a search and you'll find many different solutions, including concepts like long polling and memcached. For example, check this thread: Scaling a chat app - short polling vs. long polling (AJAX, PHP)
You should look into long polling - http://en.wikipedia.org/wiki/Push_technology. This method lets you establish a connection with your server and then update only when you need to. By the sounds of it, though, you have a pretty intensive thing going on if you want to update on every change, so you may want to look into another way of storing this data. If you're wondering how big companies do it, they tend to have huge numbers of servers to handle it, but they will also use a technique similar to long polling.
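A very rough sketch of the long-polling idea in PHP is below; last_update() and fetch_positions() are hypothetical helpers standing in for however you track changes, not functions from the original answer.

```php
<?php
// long_poll.php - rough long-polling sketch. The client requests this script
// and it only answers once something has changed (or a timeout is reached),
// instead of the client hammering the server every second.
// last_update() and fetch_positions() are hypothetical helpers.
$clientVersion = isset($_GET['version']) ? (int) $_GET['version'] : 0;
$timeout = 25;        // seconds before giving up and answering anyway
$start   = time();

while (time() - $start < $timeout) {
    $serverVersion = last_update();           // e.g. a counter bumped on every move
    if ($serverVersion > $clientVersion) {
        header('Content-Type: application/json');
        echo json_encode(array(
            'version'   => $serverVersion,
            'positions' => fetch_positions(), // all players' left/top values
        ));
        exit;
    }
    usleep(250000); // wait 250 ms before checking again
}

// Nothing changed within the timeout; tell the client to reconnect.
header('Content-Type: application/json');
echo json_encode(array('version' => $clientVersion, 'positions' => null));
```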
You could keep all the positions in memory using Memcached (see http://php.net/manual/fr/book.memcached.php) and save them all to the database at once every few seconds (if needed).
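A minimal sketch of that idea, assuming the Memcached extension is available; the key names and flush strategy are illustrative.

```php
<?php
// Sketch: keep player positions in Memcached and only flush them to MySQL
// periodically. Key names and the flush strategy are assumptions.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// On every arrow-key update: write to memory only (cheap, no MySQL query).
function save_position(Memcached $mc, $userId, $left, $top) {
    $mc->set("pos:$userId", array('left' => $left, 'top' => $top));
}

// For the periodic AJAX poll: read every player's position without touching MySQL.
function load_positions(Memcached $mc, array $userIds) {
    $keys = array();
    foreach ($userIds as $id) {
        $keys[] = "pos:$id";
    }
    return $mc->getMulti($keys);
}

// A cron job (or one request in N) can then persist the cached positions to
// MySQL every few seconds, turning thousands of writes into one batch.
```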
You could use web sockets to overcome this problem. Check out this nettuts tutorial.
There is another way: emulate or use actual sockets. Instead of constantly pulling the data (polling to check whether there are new records), you can push the data over WebSockets, which work in Chrome at the moment (at least to my knowledge; I didn't try it in FF4), or you can use Node.js for leaner long polling. That way, communication between players is bi-directional without needing MySQL to store positions.
Check out Tornado
From their site:
Tornado is an open source version of the scalable, non-blocking web server and tools that power FriendFeed. The FriendFeed application is written using a web framework that looks a bit like web.py or Google's webapp, but with additional tools and optimizations to take advantage of the underlying non-blocking infrastructure.
The framework is distinct from most mainstream web server frameworks (and certainly most Python frameworks) because it is non-blocking and reasonably fast. Because it is non-blocking and uses epoll, it can handle thousands of simultaneous standing connections, which means it is ideal for real-time web services. We built the web server specifically to handle FriendFeed's real-time features — every active user of FriendFeed maintains an open connection to the FriendFeed servers. (For more information on scaling servers to support thousands of clients, see The C10K problem.)

Client-side or server-side processing?

So, I'm new to dynamic web design (my sites have been mostly static with some PHP), and I'm trying to learn the latest technologies in web development (which seems to be AJAX). I was wondering: if you're transferring a lot of data, is it better to construct the page on the server and "push" it to the user, or to "pull" only the data needed and build the HTML around it on the client side using JavaScript?
More specifically, I'm using CodeIgniter as my PHP framework and jQuery for JavaScript, and if I wanted to display a table of data to the user (dynamically), would it be better to format the HTML with CodeIgniter (create the tables, add CSS classes to elements, etc.), or to serve the raw data as JSON and then build it into a table with jQuery? My intuition says to do it client-side, as it would save bandwidth and the page would probably load quicker with the new JavaScript optimizations all these browsers have now; however, the site would then break for anyone not using JavaScript...
Thanks for the help
Congratulations on moving to dynamic sites! I would say the following conditions have to be met for you to do client-side layout (it goes without saying that you should always be doing things like filtering DB queries and controlling access rights server side):
Client browser and connection capabilities are up to snuff for the vast majority of use cases
SEO and mobile/legacy browser degradation are not a big concern (much easier when you synthesize HTML server side)
Even then, doing client-side layout makes testing a lot harder. It also produces rather troublesome synchronization issues. With an AJAX site that loads partials, if part of the page screws up, you might never know, but with regular server-side composition, the entire page is reloaded on every request. It also adds additional challenges to error/timeout handling, session/cookie handling, caching, and navigation (browser back/forward).
Finally, it's a bit harder to produce perma-URLs in case someone wants to share a link with their friends or bookmark a link for themselves. I go over a workaround in my blog post here, or you can have a prominent "permalink" button that displays a dynamically rendered permalink.
Overall, especially when starting out, I would say go with the more kosher, better supported, more tutorialed, traditional approach of putting together the HTML server side. Then dip in some AJAX here and there (maybe start out with form validation or auto-completion), and then move on up.
Good luck!
It is much better to do the heavy lifting on the server side.
In CodeIgniter you create a view that loops through all the rows in the table, adding the classes or whatever else you need. There is no reason at all to do this in JavaScript.
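As a sketch of what such a view might look like, assuming the controller passes in a $rows array (column names and the file name are illustrative):

```php
<!-- views/users_table.php - illustrative CodeIgniter-style view; $rows is assumed
     to be passed in by the controller, e.g.
     $this->load->view('users_table', array('rows' => $rows)); -->
<table class="data-table">
    <thead>
        <tr><th>Name</th><th>Email</th><th>Joined</th></tr>
    </thead>
    <tbody>
        <?php foreach ($rows as $i => $row): ?>
        <tr class="<?php echo ($i % 2) ? 'odd' : 'even'; ?>">
            <td><?php echo htmlspecialchars($row['name']); ?></td>
            <td><?php echo htmlspecialchars($row['email']); ?></td>
            <td><?php echo htmlspecialchars($row['joined']); ?></td>
        </tr>
        <?php endforeach; ?>
    </tbody>
</table>
```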
JavaScript is a sickly abused language with unfortunate syntax. Why on earth you would want to load a page and then issue an AJAX call to load up some JSON objects to push into a table is beyond me. There is little reason to do that.
JavaScript (and jQuery) is for end-user enhancement. Make things slide, flash, disappear! It is not for data processing under even the mildest of loads. The end-user experience would be crap because you're relying on their machine to process all the data when you have a server that is infinitely more capable and designed specifically for this.
It depends on your target market and the goal of your site.
I strongly believe in using the client side wherever you can to offload work from the server. Obviously it's important to do it correctly so it remains fast for the end user.
On sites where no-js support is important (public websites, etc), you can have fallbacks to the server. You end up doubling code in these situations but the gains are very beneficial.
For advanced web applications, you can decide whether making JS a requirement is worth the trade-off of losing a (very) few users. For me, if I have some control over the target market, I make it a requirement and move on. It almost never makes sense to spend a ton of time supporting a small percentage of the potential audience. (Unless the time is spent on accessibility, which is different, and VERY important regardless of how many people fall into this group on your site.)
The important thing to remember is to touch the DOM as little as possible to get the job done. This often means building up an HTML string and adding it to the page with a single append, versus looping through a large table and adding one row at a time.
It's better to do as much as possible on the server-side because 1) you don't know if the client will even have JavaScript enabled and 2) you don't know how fast the client-side processing will be. If they have a slow computer and you make them process the entire site, they're going to get pretty ticked off. JavaScript/jQuery is only supposed to be used to enhance your site, not process it.
You've got the trade-off right. However, keep in mind that you can enable compression on the server side, which will make the repetitive markup used to format the table a small bandwidth cost.
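For instance, a common way to turn on compression from within a PHP script is the standard ob_gzhandler idiom (enabling gzip at the web-server level or via zlib.output_compression in php.ini are alternatives):

```php
<?php
// Compress the response so the repetitive table markup costs little extra
// bandwidth. If the client doesn't accept gzip, fall back to plain buffering.
if (!ob_start('ob_gzhandler')) {
    ob_start();
}
// ... render the page (tables, markup, etc.) as usual ...
```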
Keep in mind also that writing JavaScript that works in all browsers (including hand-helds) is more complicated than doing the same thing server side in PHP. And don't forget that the "new JavaScript optimizations" do not apply to the same extent to the browsers of handheld devices.
I do a great deal of AJAX app development, and I can tell you this from experience: a good balance between the two is key.
Produce the raw data server-side, but use JavaScript for any modifications you need to make to it, such as paging, column sorting, row striping, etc.
I absolutely love doing everything in AJAX, but there is one shortfall to it, and that's SEO: search engines do not read JavaScript, so for the sake of your website's page rank I would have all data served up server-side and then formatted and made to look good client-side.
The reason I love AJAX so much is that it drastically speeds up the app for the user, because it loads only the data you need, where you need it, instead of reloading the entire page every time you do something. You can do a whole bunch of things, such as hiding/showing rows and columns (we are talking about table manipulation here because you mentioned a table), and you can even attach delete actions so that clicking a delete button removes the row not only visually but also from the database, all via AJAX calls to server-side code.
In short:
Raw data: server-side, sending the client the raw data in HTML layout (tables for table-structured data; I do everything else in divs and other flexible HTML tags and only use tables for column/row data).
Data formatting: client-side, which also includes any means of interacting with the data: adding to it, deleting from it, sorting it differently, etc. This achieves two things: SEO and user experience (UX).

What's the most efficient way to scrape data from a website (in PHP)?

I'm trying to scrape data from IMDb, but naturally there are a lot of pages, and doing it serially takes way too long, even when I use multi-threaded cURL.
Is there a faster way of doing it?
Yes, I know IMDb offers text files, but they don't cover everything in any sane fashion.
I've done a lot of brute-force scraping with PHP, and sequential processing seems to be fine. I'm not sure what "a long time" is to you, but I often do other stuff while it scrapes.
Typically nothing depends on my scraping in real time; it's the data that counts, and I usually scrape it and massage it at the same time.
Other times I'll use a crafty wget command to pull down a site and save it locally, then have a PHP script with some regex magic extract the data.
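A rough sketch of that extraction step is below; the pattern assumes a particular markup and will need adjusting to the real pages, and a DOM parser is usually sturdier than regex.

```php
<?php
// Sketch: pull the title out of locally saved HTML pages with a regex.
// The <h1 class="header"> pattern is an assumption about the markup and
// will need adapting; consider DOMDocument for anything less regular.
foreach (glob('pages/*.html') as $file) {
    $html = file_get_contents($file);
    if (preg_match('/<h1[^>]*class="header"[^>]*>(.*?)<\/h1>/si', $html, $m)) {
        echo trim(strip_tags($m[1])), "\n";
    }
}
```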
I use curl_* in PHP and it works very well.
You could have a parent job that forks child processes providing them URL's to scrape, which they process and save the data locally (db, fs, etc). The parent is responsible for making sure the same URL isn't processed twice and children don't hang.
This is easy to do on Linux (pcntl_fork, etc.), harder on Windows boxes.
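A rough parent/worker sketch along those lines; scrape_url() is a hypothetical function standing in for whatever fetches and stores one page.

```php
<?php
// Sketch: fork a handful of workers, each scraping its own slice of URLs.
// Linux only (pcntl extension). scrape_url() is a hypothetical helper.
$urls       = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$maxWorkers = 5;
$chunks     = array_chunk($urls, (int) ceil(count($urls) / $maxWorkers));

$children = array();
foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: process its own slice of URLs, then exit.
        foreach ($chunk as $url) {
            scrape_url($url);
        }
        exit(0);
    }
    $children[] = $pid; // Parent: remember the child and keep forking.
}

// Parent waits for all children so none are left hanging.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```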
You could also add some logic to look at the last-modified time (which you previously stored) and skip scraping a page if its content hasn't changed or you already have it. There are probably a bunch of optimization tricks like that you could do.
If you are properly using cURL with curl_multi_add_handle and curl_multi_select, there is not much more you can do. You can test to find the optimal number of handles to process for your system: too few and you will leave your bandwidth unused, too many and you will lose too much time switching between handles.
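For reference, a minimal curl_multi sketch of that approach; the URLs and options are illustrative, and the batch size is what you would tune.

```php
<?php
// Minimal curl_multi sketch: fetch a batch of URLs in parallel.
// The URLs and options are illustrative placeholders.
$urls = array(
    'http://www.example.com/page1',
    'http://www.example.com/page2',
);

$mh      = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 1.0); // wait for activity instead of busy-looping
} while ($running > 0);

$pages = array();
foreach ($handles as $url => $ch) {
    $pages[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```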
You can try a master-worker multi-process pattern to have many script instances running in parallel, each one using cURL to fetch and later process a block of pages. Frameworks like http://gearman.org/?id=gearman_php_extension can help in creating an elegant solution, but using process control functions on Unix or calling your script in the background (either via the system shell or over non-blocking HTTP) can also work well.

Load testing the UI

I have been working on a site that makes pretty heavy use of AJAX and dynamic JavaScript on the front end, and it's time to start stress testing. But how do you properly stress test something that requires clicking several links on the front end? One way I was able to hit every page of the site quickly and repeatedly was to point a Google Mini at it, but that's not going to click links, navigate modal windows, and things like that.
Edit - I should point out that the site is done in PHP5 and the JavaScript library used is jQuery. Not sure if this would make any difference but felt it might be useful to know.
JMeter is great at this. You may record your sessions and tweak them to your liking.
So-called 'ajax load testing' is a recurring subject on this site, and is often confused. So let's get it straight: There is really no difference between load testing a normal web page and load testing with ajax. It all boils down to discrete requests; they just happen to not be full page refreshes.
One thing to keep in mind is there is a distinct difference between load testing the server processing the requests (a load test) and the performance on screen of the UI components being updated (how well your javascript performs.)
Simple load test example:
initial page load
login
navigate?
5-10 'ajax' requests (or whatever may fit your application usage pattern)
logout
There are load testing tools that can support AJAX. For example, WebLoad
http://www.radview.com/solutions/ajax-load-testing.aspx
What you really want to stress test is the server's ability to handle the AJAX requests. Use a load tool that looks at the requests while "recording" the test, and then tune as appropriate. I have only used the Visual Studio Test Edition one, so I can't point you to a low-cost option.
I disagree with Nathan and Freddy to some degree. They are correct that "AJAX testing" is really no different in that HTTP requests are made. But it's not that simple. See my article on Ajaxian.com on Why Load Testing Ajax is Hard.
JMeter, Pylot, and The Grinder are all great tools for generating HTTP requests (I personally recommend Pylot). But at their core, they don't act as a browser and process JavaScript, meaning all they do is replay the traffic they saw at record time. If those AJAX requests were unique to that session, they may not be suitable/correct to replay in large volumes.
The fact is that as more logic is pushed down in to the browser, it becomes much more difficult (if not impossible) to properly simulate the traffic using traditional load testing tools.
In my article, I give a simple example of how difficult it becomes to test something like Google's home page when you want to query thousands of different search terms (an important goal during load testing). To do it with JMeter/Pylot/Grinder you effectively end up re-writing parts of the AJAX code (in your case, the jQuery code) over again in the native language of the tool.
It gets even more complex if your goal is to measure the response time as perceived by the user (which is arguably the most important thing at the end of the day). For really complex applications that use Comet/"Reverse Ajax" (a technique that keeps open sockets for long periods of time), traditional load tools don't work at all.
My company, BrowserMob, provides a load testing service that uses Firefox browsers, powered by Selenium, to drive hundreds or thousands of real browsers, allowing you to measure and time the performance of visual elements as seen in the browser. We also support traditional virtual users (blind HTTP traffic) and a simulated browser (via HtmlUnit).
All that said, usually a mix of a service like BrowserMob plus traditional load testing is the right approach. That is, real browsers are great for a full-fidelity load test, but they will never be as economical as "virtual users", since they require 10-100X more RAM and CPU. See my recent blog post on whether to simulate or not to simulate virtual users.
Hope that helps!
You could use something like openSTA.
This allows a session with a web site to be recorded and then played back via a relatively simple script language.
You can also easily test web services and write your own scripts.
It allows you to put scripts together in a test in any way you want and configure the number of iterations, the number of users in each iteration, the ramp up time to introduce each new user and the delay between each iteration. Tests can also be scheduled in the future.
It's open source and free.
It produces a number of reports which can be saved to a spreadsheet. We then use a pivot table to easily analyse and graph the results.
