I have an HTML/PHP/jQuery/MySQL web application.
The front end is an HTML Bootstrap base, with jQuery and other libraries plus my custom scripts.
On the backend, I have several PHP files that serve the data.
For this example, say I have a CONTACTS PHP page where I need to display several data sets:
1) List of contacts
2) List of groups
3) List of tags associated with contacts
I have a backend PHP file at engine/contacts.php.
This is the PHP script that serves the contacts data as requested, based on GET flags, e.g.:
engine/contacts.php?list=contacts
engine/contacts.php?list=groups
engine/contacts.php?list=tags
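For reference, inside that file it's essentially a switch on the flag; a stripped-down sketch (db.php and the table/column names are placeholders for my actual setup):

    <?php
    // engine/contacts.php: dispatches on the ?list= flag.
    // db.php and the table/column names are placeholders for the real setup.
    require __DIR__ . '/db.php'; // assumed to provide $pdo (a PDO connection)

    header('Content-Type: application/json');

    switch ($_GET['list'] ?? '') {
        case 'contacts':
            $stmt = $pdo->query('SELECT id, name, email FROM contacts');
            break;
        case 'groups':
            $stmt = $pdo->query('SELECT id, name FROM groups');
            break;
        case 'tags':
            $stmt = $pdo->query('SELECT id, label FROM tags');
            break;
        default:
            http_response_code(400);
            exit(json_encode(['error' => 'unknown list']));
    }

    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

One file with one flag per dataset keeps the three areas of the page independently refreshable.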
Sure, I could serve them all up in one call, but by design each part of the web page (contacts, groups, tags) is a separate dataset, and this way, if one dataset is updated, I can refresh that part only. E.g.:
the user adds a group, then JS will AJAX-load:
engine/contacts.php?list=groups
to update the groups area (div).
So, basically, on page load three separate JS calls are fired at the same time to load data from the same contacts.php file.
Is this an OK practice? I assume it should be, because I see lots of sites doing the same.
And how does this impact the server side? Will the server execute this PHP file one request at a time? Would it be better to separate the files, like:
contacts.php
contacts_groups.php
contacts_tags.php
and simultaneously call them?
The reason I ask is that I'm currently debugging some performance issues. Simply put, I have a very lightweight PHP/MySQL web application with an HTML5/jQuery front end. The datasets being handled are minimal, and the database tables have fewer than 50 rows.
But somehow my application is hitting resource limits on the shared hosting server, particularly the 1 GB RAM limit. I have reproduced this situation on a standalone domain with no other users, and it still hits the limits.
I have gone through the PHP scripts and can't find anything. I do have loops, yes, but they are thoughtfully done and terminate after a few iterations.
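One thing I'm trying, in case it helps others poking at the same problem: logging the peak memory of every request at shutdown, to see which script is the offender. A minimal sketch (the log path is arbitrary):

    <?php
    // memlog.php: include (or auto_prepend_file) this in the suspect scripts.
    // Logs the peak memory of every request; the log path is arbitrary.
    register_shutdown_function(function () {
        $line = sprintf(
            "%s %s peak=%.2f MB\n",
            date('c'),
            $_SERVER['REQUEST_URI'] ?? 'cli',
            memory_get_peak_usage(true) / 1048576
        );
        file_put_contents('/tmp/mem.log', $line, FILE_APPEND);
    });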
I'm out of ideas, so I'm just trying to explore what else I can poke at.
Would appreciate some guidance, thanks.
I think if you use an OOP structure, you can have a method for handling each request on the backend. Though the best way is to use an MVC framework that dispatches requests via URL routing to dedicated methods:
engine/contacts/contacts
engine/contacts/groups
engine/contacts/tags
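Without pulling in a full framework, a stripped-down dispatcher for those routes could look like this (the ContactsController class and the rewrite setup are invented for illustration):

    <?php
    // engine/index.php: a minimal front controller.
    // Assumes requests like engine/contacts/groups are rewritten to this file
    // (e.g. via .htaccess); the ContactsController class is invented here.
    class ContactsController
    {
        public function contacts() { /* query contacts, echo JSON */ }
        public function groups()   { /* query groups, echo JSON   */ }
        public function tags()     { /* query tags, echo JSON     */ }
    }

    $parts  = explode('/', trim($_SERVER['PATH_INFO'] ?? '', '/'));
    $action = $parts[1] ?? 'contacts'; // e.g. "contacts/groups" -> "groups"

    $controller = new ContactsController();
    if (method_exists($controller, $action)) {
        $controller->$action();
    } else {
        http_response_code(404);
    }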
Here's the context: we're currently using a basic web stack, and our website builds HTML templates with data it gets from the database directly.
For tons of reasons, we're splitting this into two projects: one will be responsible for talking to the database directly, the other for displaying the data.
To put it simply, one is the API, the other is the client.
Now we're wondering how we should ask our API for data. To us, there are two totally different options:
One request, one route, for one page. We would get one huge object containing everything needed to build the corresponding page.
One request per small chunk of data. For example, on a listing page we'd make one request for the currently logged-in user to display their name and avatar, another request for all the articles, another request for the current page's category...
Some like the first option; I don't like it at all. I feel like we're going to have a lot of redundancy. I'm also not sure one huge request is that much faster than X tiny requests. I also don't like binding data to a specific page, as I feel the API should be (somewhat) independent from our front website.
Some also don't like the second option; they fear we'll overload the server by making too many calls, and I can understand this fear. It also looks like it will be hard to properly define the scope of what to send and what not to send without any redundancy. If we're sending only what's needed to display a page, isn't that the first option in the end? But isn't sending unneeded information a waste?
What do you guys think?
The first approach is good if getting all the data is fast enough. The fewer the requests, the faster the app. By redundancy I think you mean code redundancy, because sending the same amount of data in one request will definitely be faster than in ten small non-parallel ones (network overhead). If you send a few parallel requests from the UI, you can of course get a performance gain. But take into account that browsers limit the number of parallel requests.
In another case, if some data is fast to fetch but other data is slow, you can return the fast data first, show a loading indicator in the UI, and render the slow data when it arrives. This improves the user experience by showing the page as fast as possible.
The second approach is more flexible, as you can reuse some requests from other pages. But it comes at a price: the logic for making these requests (gathering the information) has to move into the UI code, making it more complex. And if you need the same data in another app, like a mobile one, you have to copy this logic. As a rule, creating such code on the backend side is easier.
You can also take a look at the pattern that lets you keep business/domain logic in one service and “frontend friendly” logic in another (an orchestration service).
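A rough sketch of such an orchestration service, which fans out to the domain API and returns one page-shaped payload (all URLs below are invented for illustration):

    <?php
    // listing_page.php: a "frontend friendly" orchestration endpoint.
    // It fans out to the domain API and returns one page-shaped payload.
    // All URLs below are invented for illustration.
    $api = 'https://api.example.com';

    $payload = [
        'user'     => json_decode(file_get_contents($api . '/users/me'), true),
        'articles' => json_decode(file_get_contents($api . '/articles?limit=20'), true),
        'category' => json_decode(file_get_contents($api . '/categories/current'), true),
    ];
    // (curl_multi_exec could fetch these in parallel; kept sequential for brevity.)

    header('Content-Type: application/json');
    echo json_encode($payload);

This way the client still makes one request per page, but the per-page shaping lives in its own service instead of leaking into the domain API.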
How would one go about loading this sort of site properly? Currently a page takes roughly 10 seconds to load, and that is obviously far too long.
Some things to just assume for now:
The server is not causing the slowness
Media like images aren't causing the slowness
The slowness is due to the poor way I currently have things set up in my current 'pass' at learning HTML/CSS/PHP (I don't know what I'm doing, but I'm trying to learn as I go, redoing everything via full 'passes' as I gain knowledge to reduce code etc., and I'm due for another pass)
Basically I am making a site that people can go to for useful information about a game. This site uses the game's API to pull data that is mainly in the form of arrays.
Here is a sample of the starting array (Career): http://pastebin.com/X33k0NKw
One of the arrays that comes from that (one of the heroes): http://pastebin.com/fA4tjgxd
The item array that comes from that (also large): http://pastebin.com/gaqba3i7
From the item array I then fetch a further file that gives me data I can use in tooltips.
Now I have all this data coming from the server, and it's taking ~10 seconds to finish. Way too long. You're thinking I need to set up a database and get the data in there, and I plan to. However, I can't possibly be doing this correctly, so I want to get this done right before I start doing the MySQL stuff.
So my question is: what should I be searching for to help with this sort of thing?
My rule of thumb when dealing with any RESTful API data is this: a RESTful API is not a database and should never be used for direct real-time data calls on request. They are simply not built for speed, especially for arrays of data this large. The data array is simply so large that the combined weight of fetching the data and acting on it is slowing down your page.
The solution? If the RESTful API calls return data you need, cache it in your local database (or on the file system) and then act on the data you have stored locally. Even if the caching time between accesses of the API is 5 seconds, the speed of accessing the data locally will still be great.
So that all means your setup is like this:
[Your Page] -> [Your Code] -> [Your Call to the RESTful API]
You need to add some local storage/caching layer, so that the setup becomes:
[Your Page] -> [Your Code] -> [Your Local Storage of API Data] -> [Your Call to the RESTful API to Update Local Storage]
Now how long you cache the data locally in a database or on the file system is your call. I have dealt with RESTful APIs where data only needs to be updated once a day as well as systems where the caching adds up to maybe just a minute. But even if the caching is a minute, that could be enough breathing room to help your system deal with the size of the data.
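A minimal sketch of that caching layer on the file system (the helper name, API URL, and TTL are my own placeholders):

    <?php
    // cache_fetch.php: a file-system cache in front of a slow RESTful API.
    function cached_fetch($url, $ttl = 60)
    {
        $cacheFile = sys_get_temp_dir() . '/api_' . md5($url) . '.json';

        // Serve from cache while it is still fresh.
        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
            return json_decode(file_get_contents($cacheFile), true);
        }

        // Otherwise hit the API once and refresh the cache.
        $raw = file_get_contents($url);
        if ($raw !== false) {
            file_put_contents($cacheFile, $raw);
            return json_decode($raw, true);
        }

        // API unreachable: fall back to a stale copy if one exists.
        return is_file($cacheFile)
            ? json_decode(file_get_contents($cacheFile), true)
            : null;
    }

    // Usage: $career = cached_fetch('https://api.example.com/career', 300);

Point your page code at cached_fetch() instead of the API directly, and the slow fetch happens at most once per TTL window.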
And past all of that, you should also consider adding an AJAX layer to your front-end functionality, which would be encompassed in the [Your Page] area of the short diagrams above.
AJAX stands for Asynchronous JavaScript and XML. The general concept for data issues like this is: the core page HTML is rendered by standard methods in whatever language you are using. Then, after the HTML loads, a JavaScript call is made on some basis (a time interval, user interaction, a mix of both, something else...) which requests data via JSONP from your core application framework, and the JavaScript then updates elements on the page based on that data.
So if you were to break down the [Your Page] functionality, it could be like this:
[Your Server Side Script Renders a Page] -> [Browser Loads HTML, CSS & JavaScript] -> [JavaScript AJAX Calls Adjust the Content in Your Rendered HTML]
The beauty of this is your core page is never reloaded again. It stays there like a framework waiting to be filled with content. And AJAX calls fill that framework with content.
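On the server side, the endpoint those AJAX calls hit can be tiny; a sketch, assuming the hypothetical cached_fetch() helper from the caching sketch above:

    <?php
    // ajax_data.php: the endpoint the page's JavaScript polls for fresh content.
    // Reuses the hypothetical cached_fetch() helper from the caching sketch above.
    require __DIR__ . '/cache_fetch.php';

    header('Content-Type: application/json');
    echo json_encode([
        'heroes' => cached_fetch('https://api.example.com/heroes', 60),
    ]);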
I need some advice on website design.
Let's take Twitter as an example for my question. Say I am making Twitter. On home_page.php I need both data about tweets (tweet id, who tweeted, tweet time, etc.) and data about the user (userId, username, user profile pic).
Now, to display all this, I have two options in mind:
1) Making separate PHP files like tweets.php and userDetails.php, and getting the data into home_page.php via AJAX queries.
2) Adding all the PHP code (connecting to the DB, fetching data) to home_page.php itself.
With option one, I need to make many HTTP requests, which (I think) will put load on the network, so it might slow down the website.
But with option one I will also have a defined REST API, which will be good for adding more features in the future.
Please give me some advice on picking the best one. Also, I am still a learner, so if there are more options for implementing this, please share.
In number 1 you're reliant on JavaScript, which doesn't follow progressive enhancement or graceful degradation; if a user doesn't have JS they will see zero content, which is obviously bad.
Split your code into manageable PHP files to make it easier to read, and require them all in one main PHP file; this won't cost any extra HTTP requests because all the includes are done server-side and one page is sent back.
You can add additional JavaScript to grab more "tweets" like Twitter does, but don't make the main functionality rely on JavaScript.
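A rough sketch of that layout (all file and function names here are invented for illustration):

    <?php
    // home_page.php: one request, everything rendered server-side.
    // db.php, tweets.php and user_details.php are hypothetical includes.
    require 'db.php';           // provides $pdo (a PDO connection)
    require 'tweets.php';       // provides fetch_tweets($pdo)
    require 'user_details.php'; // provides fetch_user($pdo, $id)

    $user   = fetch_user($pdo, 1); // placeholder user id
    $tweets = fetch_tweets($pdo);
    ?>
    <p>Logged in as <?= htmlspecialchars($user['username']) ?></p>
    <ul>
    <?php foreach ($tweets as $tweet): ?>
      <li><?= htmlspecialchars($tweet['text']) ?></li>
    <?php endforeach; ?>
    </ul>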
Don't think of PHP applications as a collection of PHP files that map to different URLs. A single PHP file should handle all your requests and include functionality as needed.
In network programming, it's usually good to minimize the number of network requests, because each request introduces overhead beyond the time it takes to transmit the raw data (protocol-specific information being sent, the time it takes to establish a connection, and so on).
Don't rely on JavaScript. JavaScript can be used for usability enhancements, but must not be used to provide essential functionality of your application.
Adding to Kiee's answer:
It can also depend on the size of your content. If your tweets and user info are very large, the response from the single PHP file will take considerable time to prepare and deliver. In that case you should go for a "minimal viable response" (i.e. the last 10 tweets + the 10 most popular users, or similar).
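A minimal sketch of such a response, with an invented tweets table, capping the payload with LIMIT:

    <?php
    // latest_tweets.php: returns only a "minimal viable response".
    // The table and column names are invented for illustration.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $stmt = $pdo->query(
        'SELECT id, user_id, text, created_at
           FROM tweets
          ORDER BY created_at DESC
          LIMIT 10' // cap the payload instead of sending everything
    );

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));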
But what you will definitely have to do is create an API to bring your page to life, no matter which approach you use...
We need to load an embedded version of a site written in Flash, and not originally designed to load multiple instances of itself, on an HTML page. The specific issue is how to get the instances to load in order when embedded, given that they are all opened by the same instance of the Flash player.
It's a complicated mapping application, and at the moment the maps and data get intermixed as the session variables are overwritten by one instance starting to load before the previous one has finished. We need a way to have them load sequentially, one finishing before the next starts to load.
The most we can specify in the URL is an &order=1 or similar. We have PHP and SQL on the backend.
Edit:
The embedded versions are being loaded in an iframe on a parent site. One PHP file loads one SWF, as many times as the parent site desires.
I would use JavaScript and SWFObject to load the Flash apps sequentially onto the page. The JavaScript code would contain the order in which to load the Flash apps, and each Flash app should notify the JavaScript on the containing page when it has finished loading by calling a function via ExternalInterface, which would then trigger the loading of the next Flash app in the list.
I'm sorry to say my solution may not seem the simplest, but it may be the most realistic way to keep a stable version up, alas. You will probably need to spend a couple of days refactoring the application's setup core so it matches the current requirements.
It's often not a good idea to look for an external 'hack' to make things kind-of-work, as the effort may be wasted and complicate things even more further on. In my experience, all summed up, you always spend more time (and more painful time) tweaking that kind of 'quick fix' than you would have spent sitting down and thinking it over for a while.
For the following pretty straightforward task, query a list of products from a DB and present it on a webpage, consider two setups:
Setup 1: A PHP script queries the DB. All content is built on the server, and the entire page is served back to the client.
Setup 2: A static HTML "page skeleton" requests content using AJAX. The received content is parsed on the client side with JavaScript and rendered using innerHTML or similar.
Of course the second setup only makes sense when you have pages, categories and tags for the client user to choose from.
I need to compare these two, at least in terms of:
the time it will take for content to be served
user experience (setup 1 is delivered as a whole, setup 2 is delivered in "two parts")
scalability: how do the setups compare when I have 100,000 queries daily
Any thoughts on the issue will be much appreciated.
You may find the following question helpful: Smarty Vs. Javascript/AJAX
I brought up a few points in my answer to that question:
You should use server-side scripts to show any data that is known at the moment the page is loaded. In this case, you know the list of products should be displayed. The fact that a question's answers should be shown is known at page load.
You should only use AJAX calls to load dynamic data that is not known at the moment the page is loaded. For example, when you click the "comments" link under a question or answer on Stack Overflow. The fact that you want to view a particular question's comments is not known at page load.
Javascript should not be required to access core functionality of your site.
You should gracefully degrade functionality when Javascript is disabled. For example, Stack Overflow works just fine with Javascript disabled. You don't have access to real-time Markdown previews or dynamic badge notices, but the core functionality is still intact.
A single HTTP request to a server-generated page will load significantly faster than a request to load a page that makes five or six additional AJAX calls, especially on high latency connections (like cellular networks). See Yahoo's Best Practices for Speeding Up Your Website.
You should think of Javascript as a bonus feature that might not be enabled, not as something that should be used to construct critical pieces of your website. There are exceptions to this rule. If you want to do some sort of pagination in which you click a "next page" button and only the product list changes, AJAX might be the right choice. You should, however, make sure users without Javascript are not excluded from viewing the entire list.
There's nothing more frustrating than when a page can't be accessed because the web developer didn't obey the KISS principle. As an example, take Friendly's Restaurants. I wanted to check out their menu while I was at the mall, so I loaded their website on my iPhone, only to find out that you literally can't get any meaningful information about the restaurant without Flash. It's nice to have fancy menus with swooshing desserts flying everywhere, but in the end, I just wanted to see the items on their menu. I couldn't do that because they required Flash. Graceful degradation in service would have been helpful in that case.
Some things on the web can't be done effectively without Javascript. Displaying a list of products is not one of them. If you are still unsure, look at how other popular websites do things. I think you'll find most of the successful, well-engineered websites follow the guidelines listed above.
AJAX is probably the better choice when only a small part of the page changes.
I would recommend starting with the server-side version and then building AJAX on top of that. This way you will also get a version of your site that works without JavaScript, which you probably need anyway if you care about being indexed by search engines.
But first, concentrate on creating a page that just works; you can always optimize it later.
Performance on the client has many factors: what is running at the time, what browser, what the content is, what the CSS of the page is, how full the browser's cache is, what plug-ins are installed, what is happening on the network, etc. Just remember that when you are playing with the numbers.
Unless the implementation sucks, AJAX should win hands down. Among the benefits are:
parallelism, due to parallel requests from the client side (i.e. you can use multiple server CPU cores to serve parts of one web page, which can't be done easily with a single PHP request)
refreshing only small parts of the page is faster (less data to transfer, generate, ...)
it scales much better, since the server has less work to do (especially if you can offload some of the processing needed for generating HTML to the client instead of just delivering it)
Dynamic pages like http://www.startpagina.nl/ have been doing this successfully since way before the recent AJAX fad (one static file delivered, all customization done on the client side; last time I checked, anyway).
Of course you can screw things up with either method so that it becomes slower than the other.