I'm using an easy way to display my Facebook pictures on my website. It's called Facebook Album Gallery and it's from CodeCanyon.
Although they claim to cache the data, and their example is rather fast (5 sec.), mine is very slow (15-20 sec.) compared to theirs.
Is there a way to improve the loading speed, or is there any way to find out why my page is loading slowly when the example isn't?
I'm not familiar with caching, so if there is any more information you need, please let me know!
Xdebug has a profiler. Try it to determine where the bottleneck is.
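For example, with Xdebug 2 you would enable the profiler with a couple of php.ini settings (the output directory is just an example) and open the resulting cachegrind files in KCachegrind or webgrind:

; php.ini (Xdebug 2): profile every request...
xdebug.profiler_enable = 1
; ...or profile only requests that carry the XDEBUG_PROFILE parameter:
; xdebug.profiler_enable_trigger = 1
; where the cachegrind.out.* files are written:
xdebug.profiler_output_dir = /tmp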
First off, from my connection the photos/website loaded pretty fast.
If PHP is the problem, then Xdebug's profiler, as Elszo pointed out, could help you find it. I doubt that's the problem right now, though.
Test your site using Yslow/Google Page speed and read up on their information.
If JavaScript is the problem, use a JavaScript profiler, for example Firebug or Google Chrome's built-in profiler.
You're displaying a lot more photos than the example.
Your server might be slower.
Your images might be bigger than the ones used in the example.
And make sure you are implementing the code properly and enabling caching. I don't really know how their code works; look for an FAQ or some documentation.
I am currently working on the development of a platform, and I was wondering how sites like Behance, ArtStation and Upwork manage to partially reload their pages. (When you click on a link, the page loads, but the menu does not move, and neither does the footer.) I first thought of Nginx, but ArtStation does not seem to be using Nginx.
I would like to know how to reproduce this kind of page loading, if someone could enlighten me on the subject.
EDIT: I already know Ajax, but the thing is that on Behance the server really does seem to redirect.
Thanks
You should probably learn about AJAX; here is a starting place:
https://developer.mozilla.org/en-US/docs/Web/Guide/AJAX/Getting_Started
And if you use jQuery:
https://learn.jquery.com/ajax/
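On the server side, partial reloads boil down to returning just the content fragment when the request comes in via XHR (jQuery sets the X-Requested-With header for you). A minimal PHP sketch, where render_header(), render_content() and render_footer() are hypothetical helpers you would write yourself:

<?php
// page.php - sketch of an endpoint that supports partial page loads
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if ($isAjax) {
    // AJAX: send only the content block; the client swaps it in,
    // so the menu and footer never move.
    render_content();
} else {
    // Normal request: send the complete page.
    render_header();
    render_content();
    render_footer();
}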
First of all, Behance uses Nginx.
Second, if you want to create a real-time web application, you can do it with NodeJS and Express or any framework based on NodeJS. But there is an alternative, which is Ajax.
Using Ajax will help you reload content on your website every couple of seconds, or load just a part of your page.
See: https://www.youtube.com/watch?v=ytKc0QsVRY4
Maybe I was not clear enough, or I was a bit vague when asking the question; I'm sorry.
I already use Vue.js for my project (all the front end: router, store; I use axios to fetch data from my API), but I'm looking for a way to lighten JS files that weigh more than 2 MB. That's why Behance interests me: it seems to give the impression of a redirect without an actual redirect, and it is that effect I would like to reproduce.
Next time I will try to ask questions more clearly :)
Sorry for the inconvenience
I am the webmaster of a dynamic website, and because of the many complicated queries that I have to use on the front page and some other pages, the server sometimes gets overloaded when the number of visitors to our website is high.
So I got the idea of periodically (every 2 minutes) generating a static HTML snapshot of these pages. This would load the server only once every 2 minutes, for just one user.
My question is: is this a good idea? I plan to generalize it over many other pages, and I don't want to be surprised and have to go back again.
If it isn't, are there any good ideas for avoiding this load?
Thank you in advance
PS: I would maybe publish the method I use to do this, to see if there is a better way.
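(Roughly, the method would look something like this sketch; build_front_page() is a hypothetical stand-in for the heavy queries, and the path/TTL are examples:)

<?php
// front.php - serve a snapshot if it is younger than 2 minutes,
// otherwise rebuild it once and serve the fresh copy.
$cacheFile = __DIR__ . '/cache/front_page.html'; // dir must exist
$ttl = 120; // seconds

if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
    readfile($cacheFile); // no database work at all
} else {
    $html = build_front_page();            // the heavy queries
    file_put_contents($cacheFile, $html, LOCK_EX);
    echo $html;
}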
I don't think it's a bad idea, but you should use an existing caching solution rather than implementing your own. Why not use memcached? I think that's what you are looking for; just use it for those parts of your code that are taking a long time.
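A minimal sketch of that approach (assuming the pecl/memcached extension; the key name, TTL and run_complicated_query() helper are made-up examples):

<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$rows = $mc->get('front_page_rows');
if ($rows === false) {
    // cache miss: run the slow query once, keep the result for 2 minutes
    $rows = run_complicated_query();
    $mc->set('front_page_rows', $rows, 120);
}
// ...render the page from $rows as usual...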
Caching is a good idea to protect your server from overloading. Many CMSs (Content Management Systems) use this technique.
Sure, it's called caching :)
However, most sites are caching just parts of their content. You can't cache a whole page if you are using user specific content, for example the name of a logged-in user. But you can cache the heavy parts of your site and combine it with a dynamic page.
Your idea is really good, and many big websites use this concept. You can also use caching techniques: if you want to avoid hitting the database, caching will serve you better. You can use Memcached: http://memcached.org/.
Currently I am wondering whether or not to use a MySQL DB to provide the content on my website.
An example of what i mean is based loosely here: http://www.asual.com/jquery/address/samples/
Find the sample called SEO.
Anyone with an HTML5-capable browser will notice the URL is 'pretty' and is what you'd expect to find on any standard website.
Anyone with IE8, or a browser which isn't 'WebKit'-enabled, will see the use of the hashbang (#!) for SEO.
The problem is this: the content is pulled from a MySQL DB. I have approx. 30 pages (some are PACKED with content), and I'm wondering whether all this tedious modification of my website is necessary.
I use jQuery, MySQL and PHP through a single-page interface, so my content is not indexable at all. What are your views?
Help me!!
PS: Would it be easier to provide PHP Includes in my DB content to fetch pages, without having to upload all my pages into my DB?
your question is made up of a lot of questions. :)
to mysql or not to mysql: most of the PHP-using web world uses mysql as a database to store content. i don't see much of a problem there. 30 pages is peanuts.
jquery and php for a single-page interface, indexable: depends on the search engine. i've read somewhere (too lazy to look things up) that google uses a javascript-enabled crawler. not sure if they use it in production already.
PHP includes in DB content: textpattern uses this approach. your worry is a problem of scale.
if your PHP code can serve pages properly, it wouldn't matter where it pulls content from. DB or filesystem wouldn't matter at this point.
just do it.
There is no such question.
Mysql is okay.
It's a general-purpose solution for storing site data; everyone uses it without a single problem, and even Wikipedia is happy with it.
Mysql is irrelevant to any problems of your site.
Your problem is somewhere else, but you forgot to state it. Let me suggest you ask another question, pointing to the real problem you have, not some guess you made about its cause.
Well, if you can, avoid storing pages inside MySQL, unless you want to give the administrator the possibility to edit the pages.
Aside from that, there is no problem in storing pages in a DB, be it MySQL or another. A lot of CMSs do it (Drupal, Joomla, etc.).
You might encounter some performance issues on your DB server if your traffic becomes high, but this is another problem.
In my tests and comparisons, MySQL connectivity and queries do slow down responses. If your site is simple and you are only doing updates yourself, then using a template engine and storing content in files is not a bad choice.
If you decide to put it into SQL, then eventually you might need to build a cache - preferably with nginx rather than a PHP-level cache - so it shouldn't be a problem either.
The deciding factor is how you are willing to edit the content. I found that my team and I are much more comfortable editing HTML files in Notepad++, Vim or Coda. If content is inside a database, you get a poorly-performing (compared to a desktop app) WYSIWYG editor.
Always use SQL if the content is generated by your users. And do use some lightweight CMS.
I am using the one bundled with Agile Toolkit myself and templates look like this:
https://github.com/atk4/atk4-web/tree/master/templates/jui
would it be easier to provide PHP Includes in my DB content
I think you'll find your site far easier to maintain for years IF you keep a very clear separation of duties: data goes in a database, presentation and code go in files.
While there is some contention whether it is a good idea to store templates in a database, my gut feeling says that you should avoid that temptation unless you have a very good reason.
But storing code (your PHP include statements) in the database is almost certainly not the best way forward.
I've been asked to create a custom 'tracker' in PHP, to know where users are coming from and where they are going on the site.
I'm thinking of writing a simple script which connects to a database, writes the IP, browser, and time of the visit, then closes the DB link.
Is this the right way to do it?
I've found a few similar questions on stackoverflow, but none mentioned performance.
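For what it's worth, the script described above could be as simple as this sketch (the table and column names are made up; adjust the DSN and credentials to your setup):

<?php
// track.php - one INSERT per page view.
// Assumed table: visits(ip, user_agent, referer, visited_at).
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

$stmt = $pdo->prepare(
    'INSERT INTO visits (ip, user_agent, referer, visited_at)
     VALUES (?, ?, ?, NOW())'
);
$stmt->execute(array(
    $_SERVER['REMOTE_ADDR'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
    isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '',
));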
Is there a reason you can't use a solution such as Google Analytics? It's free and has some nice features, such as heat maps which show traffic flow.
The main disadvantage is that it requires you to embed some JavaScript in all the pages - which means that it's client-side.
I suppose it's another question of the kind "I want superior performance, however I have no certain reason for that".
in fact, any solution will be fast enough, as writing logs is not too heavy an operation.
the only thing one has to keep in mind is not to use any indexes (they slow down inserts) in case an SQL database is used.
that's all.
So, let's put that performance stuff aside.
The only complete solution would be analyzing web-server logs.
Any other method will not give you the complete picture. Say, if some image is hotlinked on other sites and causes heavy load because of that, you'd never notice it if you log only requests to PHP scripts.
So, you can run a crontab-based script every night that parses the access logs and builds comprehensive information about all user and bot activity.
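A rough sketch of that cron job (assuming Apache's "combined" log format; the log path and the aggregation step are placeholders):

<?php
// parse_logs.php - run from cron, e.g. "0 3 * * * php parse_logs.php".
// Pulls IP, timestamp, request line and status out of each line.
$pattern = '/^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3})/';

$fh = fopen('/var/log/apache2/access.log', 'r');
while (($line = fgets($fh)) !== false) {
    if (preg_match($pattern, $line, $m)) {
        list(, $ip, $time, $request, $status) = $m;
        // ...aggregate into your own stats tables here...
    }
}
fclose($fh);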
Check out Piwik or New Relic. If you need more customization, take a look at Webalizer and Visitors.
N.B.: You can customize Piwik by creating plugins: http://geekmonkey.org/articles/34-how-to-write-a-piwik-plugin
Perhaps you need some special software like Webalizer? (It's free and quite powerful.)
Performance is easy to say but much harder to define. It depends on a zillion circumstances, and while I may say "this is the best performance I can get", you might say "hey, what's this?".
Personally, I recommend Google Analytics. It does almost everything you need (and plenty of things you didn't know you needed). Maybe you can get a small 'performance' boost by hosting its source locally, but there's a chance it's already cached in your users' browsers anyway.
Or, if you prefer open-source solutions, give Piwik a shot.
Piwik does just that, and it does it very well. There is also a Tracking API that you can use to track a lot of things about your visitors, using PHP or any other language (REST API). See more information on http://piwik.org/docs/tracking-api/
Also, it is very modular and fast - don't reinvent the wheel :)
I have designed a new website and hosted it online. I want it to perform as well as possible and load pages fast.
The website is built in PHP 5.0+ using CodeIgniter, with MySQL as the DB. I have images on it, and I am using a Nitobi grid for displaying sets of records on a page. The rest is all normal page controls.
As I am not very experienced with website performance factors, I would like suggestions and details on the factors that can improve a website's performance. Please let me know how I can improve mine.
Also, please let me know if there are ways to measure the performance of a website, and any websites or tools that help test it.
To start, get Firefox and Firebug and then install YSlow. YSlow gives great information about the client-side performance of the website in question. Here's a User Guide.
For the server-side performance, have a look at Apache JMeter.
Have you looked into opcode caching, APC, memcache, etc.? As others have said, you need to time the loading of your pages and try to find potential SQL bottlenecks and/or scripts that can be refactored. You may also want to look at getting something like webgrind installed so you can see what happens on a page load and how long each process takes.
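For instance, APC's user cache (on top of its opcode caching) boils down to a fetch-or-store pattern; a small sketch where the key name, TTL and the build_nav_menu_from_db() loader are made-up examples:

<?php
$menu = apc_fetch('nav_menu');
if ($menu === false) {
    $menu = build_nav_menu_from_db();  // the slow part, run once
    apc_store('nav_menu', $menu, 300); // keep it for 5 minutes
}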
You can see loading times of the main page and the components it contains with the Net tab in the already mentioned Firebug addon for Firefox. There you can see if a page is slow due to having a lot of external content (like user added images or so) or because of itself.
In the first case there is not much you can do except remove the content that takes the most time; in the second case you will need to take a look at your PHP code, considering that most of the time, performance issues in PHP applications come down to imperfect database interaction (badly written queries, repeated queries when one would suffice, etc.).
Profiling is the key word in the world of performance optimization.
To profile your site you have to measure two different areas: PHP script running time, and whole-page load time (including pictures, JavaScript, style sheets, etc.). Measuring PHP scripts is quite easy. The easiest way is to place this line at the top of your page:
$TIMER['start'] = microtime(true);
and this line at the bottom:
echo "Time taken: ".round(microtime(1) - $TIMER['start'],3);
If it stays below 0.1 sec, it's OK. Now on to the whole-page loading. Dunno if there is an HTTP sniffer with response-time recording.
Edit: Looks like Firebug's Net tab mentioned above is the right tool for this
Like Kevin said, I suggest trying opcode caching with PHP. I'm not sure which is currently best, but when I looked it up a year ago I decided to go with eAccelerator (http://en.wikipedia.org/wiki/EAccelerator) and it works great. I've also used APC on another server, but I prefer eAccelerator.
You should probably go with Col. Shrapnel's advice and do some profiling as well.
from the server perspective:
- as others wrote: use a PHP accelerator (I use APC, which is supposed to become standard in PHP)
- take care of the database; the number of queries, the complexity of queries, the amount of data in the result set, ... can all have a big impact
- cache dynamic pages
and from the browser perspective:
- minimize the number of JS and CSS files (one of each is ideal); put CSS in the head and JS at the bottom
- avoid calling 3rd-party JavaScript (analytics, widgets, ...)
- check the size of your images (I use smush.it)
The impact of these can be huge, cf. the tests I ran on my (WordPress-based) site.
If you have time to play, try HipHop, developed and used by Facebook.
Page generated in 0.0074 secs.
DB runtime 0.0006 secs (7.87 %) using 1 DB queries, 7 DB cache fetches, 3 RSS cache fetches and 61.88 K memory.
http://i42.tinypic.com/2m31frp.jpg
ouch !!
don't bump - this is his benchmark ;)
This site will measure an integrated performance mark for your site, as well as give you some relevant advice. All you have to do is type in the URL.
I would suggest giving Clicktale a try. I've been using it for 2 months, and it is neat to watch what your users do; I learned a lot.