I'd like to create a button on my site that COMPLETELY clears my cache, since neither Safari's nor Chrome's built-in features seem to work at all. Is this possible?
Not possible. That would expose low-level functionality to public access. Even if an exploit could do nothing more than empty your cache, it would still be undesirable. Firefox and Chrome both use Shift+Ctrl+Del for this, so at the cost of actually having to use your keyboard, you can do the same thing without the security risk.
It sounds like you want to clear the cached data that sits between your server and your browser, not the data that the browser itself has cached. A copy of your resources may be sitting on a machine between your client computer and your server, and that machine is returning the cached copy instead of asking your server for the data again.
You should read up on different caching methods, so you can set up certain types of files to be cached for a certain amount of time etc. Try this for starters.
I usually set up static resources (CSS, JS, etc.) to be cached for a long period of time, but I change the URL when I have made changes to them. I usually do this by rewriting the request URL so /resources/dummy/file.css becomes /resources/file.css, and I can change dummy whenever I want. This creates the illusion of a different file (one that hasn't been cached yet), but I don't have to rename the file.
RewriteRule ^resources/[^/]+/([^/]+)$ resources/$1 [L]
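If it helps, here's a minimal sketch of the PHP side of this trick; the helper name and the version string are just illustrative, and the only requirement is that the "dummy" segment changes whenever you ship new assets:

<?php
// Hypothetical helper: bump $assetVersion on every release.
// The RewriteRule above strips the version segment, so no files are renamed.
$assetVersion = '20130501';

function versioned($path) {
    global $assetVersion;
    // /resources/file.css  ->  /resources/20130501/file.css
    return '/resources/' . $assetVersion . '/' . ltrim($path, '/');
}
?>
<link rel="stylesheet" href="<?php echo versioned('file.css'); ?>">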
I highly doubt this is possible, especially cross-browser. It would create a serious security flaw waiting to be exploited.
I develop on a remote server. I'm not sure if this is a Laravel caching tendency, but CSS and JavaScript changes tend to be delayed longer than usual. I can see HTML and PHP changes instantly, but sometimes it takes more than a few minutes for CSS and JavaScript changes to be reflected on a page. I do know that .blade.php files are generated and cached in an app/storage/views folder, but even when I delete those, the changes are not reflected right away.
I have tried on Chrome and Firefox.
Any ideas? Thanks!
CSS and JavaScript are not handled by Laravel.
You can confirm this by looking at the .htaccess file in the public folder.
Most likely you are hitting the browser cache. One solution to avoid browser caching is to append a unique identifier to the CSS or JavaScript URL whenever there is a new release, e.g.:
site/my/css.css?12345
                ^
                |
                +-- Change this to force a fresh copy
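A minimal sketch of that idea in PHP, assuming the assets live under the public document root; the helper name is made up, and filemtime() makes the identifier change automatically whenever the file changes:

<?php
// Hypothetical helper: append the file's modification time as a cache-buster.
function asset_url($relativePath) {
    $file = $_SERVER['DOCUMENT_ROOT'] . '/' . ltrim($relativePath, '/');
    $version = file_exists($file) ? filemtime($file) : time();
    return '/' . ltrim($relativePath, '/') . '?' . $version;
}
?>
<link rel="stylesheet" href="<?php echo asset_url('my/css.css'); ?>">
<!-- renders something like /my/css.css?1367411230 -->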
What ended up working best was to clear the cached views on the server, as #dynamic hinted at (Laravel and Blade store these cached files).
As well as that, closing the tab completely and re-accessing the page fresh helped. In cases where it was still cached, I would just switch to another browser.
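For reference, a small PHP sketch of clearing those compiled views by hand, using the app/storage/views path mentioned above (adjust the path to your project layout; this only deletes the compiled Blade files, which Laravel regenerates on the next request):

<?php
// Delete compiled Blade views so Laravel rebuilds them on the next request.
$compiled = __DIR__ . '/app/storage/views';  // adjust to your project layout

foreach (glob($compiled . '/*') as $file) {
    if (is_file($file)) {
        unlink($file);
    }
}
echo "Compiled views cleared\n";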
I'm working on a website that can be found here:
http://odesktestanswers2013.com/Metareviewer
The index appears to be unusually slow (slowing down the browser as it loads), even though YSlow doesn't see anything particularly wrong with it and my PHP microtime() measurements return a fine value.
What other things should I be looking into?
Using Chrome Developer Tools, the Network tab shows a timeline of what's loading in your page.
There are also plenty of good practices that aren't being followed here. Some of these can also be flagged up by Google Chrome's Audit tool (F12 menu), but in my opinion the most important are:
Use a CDN for serving common library code. Do you really need to host jQuery yourself? (Side rant: do you really need jQuery at all?)
Your JavaScript files are taking a long time to load, because they are all served as separate HTTP calls. You can combine them into a single JavaScript file, and also minify them to save lots of bandwidth.
Foundation.css is very large - not that there's a problem with large CSS files, but it looks like there are over 2000 rules in the CSS file that aren't being used on your site. Do you need this file?
CACHE ALL THE THINGS - there are 26 HTTP requests being made that are uncached, meaning that everyone who visits your site has to download everything again on every request.
Total bandwidth can be reduced by about two thirds if you enable gzip compression on your server (or, even better, implement SPDY, but that's a newer technology with less community support); see the sketch after this list.
Take a look at http://caniuse.com - there are a lot of CSS features that are supported in modern browsers without the need for -webkit or -moz prefixes, which could save a fortune in kilobytes.
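As promised above, a minimal sketch of enabling gzip for PHP-generated output. Note this only covers pages PHP renders (static files are normally compressed by the web server itself, e.g. with mod_deflate), and ob_gzhandler checks the client's Accept-Encoding header for you:

<?php
// Compress PHP output if the browser supports it; fall back to plain buffering otherwise.
if (!ob_start('ob_gzhandler')) {
    ob_start();
}

echo '<html><body>... your page here ...</body></html>';

ob_end_flush();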
If I could change one thing on your site...
Having said all of that, each point above will make a very small (but cumulative) difference to the speed of your site, so it's probably a good idea to attack the worst offender first.
Look at the network graph: while all that JavaScript is downloading, it blocks the rest of the site from downloading.
If you're lazy, just move it all to the end of the document body. That way, the rest of the page will download before the JavaScript does, but this could break your scripts if they are written in particular styles.
Hope this helps.
You should also consider using http://www.webpagetest.org/
It's one of the best tools when it comes to benchmarking your site's performance.
You can use this site (http://gtmetrix.com/) to analyze the causes and to fix them. The site gives the reasons as well as solutions, such as optimized versions of your JS and CSS.
As per its report, you need to optimize your images and minify your JS and CSS files. The optimized images and JS and CSS files can be downloaded from the site.
Use Google Chrome -> F12 -> Network and check the connect, send, receive, etc. times for each resource used in your page.
It looks like your CSS and JS files have very long connect and wait times.
You can use one of the best add-ons, available for both Chrome and Firefox: YSlow.
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high-performance web pages.
The add-on is freely available for both Firefox and Chrome.
YSlow gives you details about your website's front end. Most likely you have a script that is looping one too many times in the background.
If you suspect that a sequence of code is hanging server side then you need to do a stack trace to pinpoint exactly where the overhead is taking place.
I recommend using New Relic.
Try to use Opera. Right click -> Inspect element -> Profiler.
Look to Inspect element -> Errors.
I have a dynamic PHP (Yii framework based) site. Users have to log in to do anything on the site. I am trying to understand how caching and CDNs work, and I am a bit confused.
Caching (memcache):
My site has a good amount of CSS, JS, and images. I've been given to understand that enabling caching ("memcache"?) will GREATLY speed up my site. But this has me confused. How does caching help? I mean, how can you cache something that comes out of the DB separately for each user? For instance, user 1 logs in and sees his control panel; user 2 logs in and sees their own control panel.
How do I determine what to cache? Plus, how do I enable caching (memcaching)?
CDN:
I have been told to use a content delivery network like CloudFlare. It is supposed to automatically cache my site. So, when user 1 logs in, what will it cache? Will it cache only the homepage CSS, JS, and homepage images, because everything else requires login? What happens when the user logs out? I mean, do "sessions" interfere with the working of a CDN?
Does serving up images via CDN reduce significant load on my server? I don't have much cash for getting a clustered-server configuration. So, I just want my (shared) server to be able to devote all its resources in processing PHP code. So, how much load can I save by using "caching" (something like memcache) and/or "CDN" (something like CloudFlare)?
Finally,
What would be general strategy to implement in this scenario for caching, cdn, and basic performance optimization? do I need to make any changes to my php-code to enable CDN like CloudFlare and to enable/implement/configure caching? What can I do that would take least amount of developer/coding time and will make my site run much much faster?
Oh wait, some of my pages like "about us" page etc. are going to be static html too. But they won't get as many hits. Except for maybe the iFrame page that will be used for my Facebook Page.
I actually work for CloudFlare & thought I would hop in to address some of the concerns.
"do I need to make any changes to my php-code to enable CDN like
CloudFlare and to enable/implement/configure caching? What can I do
that would take least amount of developer/coding time and will make my
site run much much faster?"
No, there's nothing like a need to rewrite URLs, etc. We automatically cache static content by file extension. It does require changing your DNS to point to us, however.
"Does serving up images via CDN reduce significant load on my server?"
Yes, and it should also help most visitors access the site faster and save you a fair amount on bandwidth.
"Oh wait, some of my pages like "about us" page etc. are going to be
static html too."
CloudFlare doesn't cache HTML by default. You can use Page Rules to set up more advanced caching options for things like static HTML.
Caching helps because instead of performing disk I/O for each user, the data is stored in memory, i.e. in memcached. This provides a SIGNIFICANT increase in performance.
Memcache is generally used for caching data, i.e. query results.
http://pureform.wordpress.com/2008/05/21/using-memcache-with-mysql-and-php/
There are lots of tutorials.
I have only ever used Amazon S3, which is not quite a CDN. It is more of a storage platform, but it still helps to take the load off of my own servers when serving media.
I would put all of your static resources on a CDN so your own server would not have to serve them. It would not require any modification to your PHP code. This includes JS and CSS.
For your static pages (your about page) I'd make sure PHP isn't processing them, since there is no reason for it; your web server should serve them directly.
Caching will require changes to your code. For caching, a normal flow is:
1) user makes a request
2) check if data is in cache
3) if it is not in cache do the DB query and put it in cache
4) if it is in cache retrieve it
5) return data.
You can cache anything that requires disk I/O, and you should see a speed-up.
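A minimal PHP sketch of that flow using the Memcached extension; the server address, key name, table, and query are just placeholders:

<?php
// Cache-aside lookup: check memcached first, fall back to the database (steps 2-5 above).
function getUserProfile(PDO $pdo, Memcached $cache, $userId) {
    $key  = 'user_profile_' . $userId;
    $data = $cache->get($key);

    if ($data === false) {                      // not in cache: query the DB
        $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
        $stmt->execute(array($userId));
        $data = $stmt->fetch(PDO::FETCH_ASSOC);
        $cache->set($key, $data, 300);          // keep it for 5 minutes
    }
    return $data;                               // return data
}

$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);
// $pdo = new PDO(...);  // your existing database connection
// $profile = getUserProfile($pdo, $cache, 42);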
Memcached works by storing data (usually the results of database queries, whether from a remote server or a database engine on the same machine) in memory on the web server or on a dedicated cache server. Reading data back from an in-memory key-value store is much, much faster than running the query against the database again each time. This is typically useful when you have data that can safely be stored for certain periods of time because it is not subject to constant change.
The way this works: say you want to cache a user's account information to speed up loading pages where that user is logged in. You would load the information once and cache it. On any subsequent requests for that data, it will load in a fraction of the time it would normally take to fetch it from the database itself. Obviously you will need to make sure that you update/re-cache that information if the user changes it while logged in, but you will greatly reduce the time it takes to serve up pages if you implement a caching system that minimizes the time spent waiting on the database.
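A tiny sketch of the "update/re-cache when the user changes it" part; the key name, TTL, and helper names are arbitrary. After writing the change to the database, either delete the cached copy so the next read repopulates it, or overwrite it directly:

<?php
// After the user updates their account, keep the cache in sync.
function updateEmail(PDO $pdo, Memcached $cache, $userId, $newEmail) {
    $stmt = $pdo->prepare('UPDATE users SET email = ? WHERE id = ?');
    $stmt->execute(array($newEmail, $userId));

    // Option A: drop the stale entry; the next read will re-cache it.
    $cache->delete('user_profile_' . $userId);

    // Option B (alternative): overwrite it immediately instead of deleting, e.g.
    // $cache->set('user_profile_' . $userId, $freshlyLoadedProfile, 300);
}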
I'm personally not familiar with CloudFlare so I can't offer any advice to that effect, but in terms of implementing caching in your application, you should check out:
http://code.google.com/p/memcached/wiki/NewOverview
And read the rest of the Wiki entries there which cover installation/implementation/etc. That should get you started on the right track.
If you open up Mozilla Firefox and turn on Firebug to check incoming and outgoing network traffic, you'll see that, when you look at Wikipedia articles, the amount of cached content is very large.
Unless the article in question has many pictures, most of the content comes from the cache.
I'd like to know whether that is done by the browser itself or by some underlying PHP caching mechanism (is that what they call memcache? APC?). It works very well, so I'd like to know how they do it.
Memcache, APC, etc. are server-side data stores. You basically use them as a key-value store so you don't have to ping your database all the time.
However, what you're actually seeing is a site being loaded with a primed cache. This is the technique of having your web server tell the browser that your commonly used resources haven't changed since the last time it fetched them. The effect is achieved by setting far-future expiry headers so that the browser doesn't keep re-requesting the resources. A lot of sites use this technique, including SO.
Here's a great source to read up on, if you want more info : http://developer.yahoo.com/performance/rules.html
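If you're curious what those far-future headers look like, here's a minimal PHP illustration; in practice sites like Wikipedia usually set these at the web-server level for static files, but the headers themselves are the same (the file path is made up):

<?php
// Tell the browser it may reuse this response for a year without asking again.
header('Content-Type: text/css');
header('Cache-Control: public, max-age=31536000');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');

readfile('styles/site.css');   // hypothetical static asset served through PHP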
I am using PHP's header() function to send the file to the browser with some small code. It works well, and I have it set up so that if anyone requests the file with a Referer other than my site, they are redirected to a page first.
Unfortunately it's not working with Internet Download Manager.
What I want to know is how sites like RapidShare and 4shared do this.
You could use sessions to make sure the download is being requested by a valid user.
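A rough sketch of that idea (the file path and session key are made up); the point is that the script refuses to stream the file unless the session says the user is logged in:

<?php
session_start();

// Only logged-in users get the file; everyone else is bounced to the login page.
if (empty($_SESSION['user_id'])) {
    header('Location: /login.php');
    exit;
}

$file = '/var/files/report.pdf';             // hypothetical protected file
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file);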
Not all browsers / pieces of software that can view web pages will send a Referer to your server. Some sites build a browser "fingerprint", usually hashed, which might be the Referer, User-Agent and a couple of other headers strung together to make a unique identifier for that user, and thus restrict access as you describe.
Of course, I may have completely missed the point of your post!
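For what it's worth, a toy version of such a fingerprint might look like the following; which headers you include, and the secret salt, are up to you, and as implied above clients can spoof all of them:

<?php
// Build a rough client fingerprint from request headers; not real security, just a speed bump.
$parts = array(
    isset($_SERVER['HTTP_REFERER'])    ? $_SERVER['HTTP_REFERER']    : '',
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
    $_SERVER['REMOTE_ADDR'],
    'some-secret-salt',                // hypothetical salt
);
$fingerprint = sha1(implode('|', $parts));

// Compare $fingerprint against the one recorded when the download link was issued.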
A typical design pattern is using a front controller to have a single entry point for all requests. By having a front controller, you can control exactly what the client sees.
You can configure this in Apache so that all requests go through a single file (it's been a while since I've done this because I now concentrate on Java). I think you would need to look at pathinfo documentation for Apache.
This might require a significant change in the rest of your application code. But, the code will be more secure and maintainable in the long run.
I've served images and other binary files through this pattern. This allowed me to easily verify users were authenticated before actually sending them the file. Obfuscation is not security, so if you rely on obfuscating your URL, an attacker may be delayed in getting in, but it is just a matter of time.
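As an illustration only (the route, auth check, and paths are invented), a front controller in this style boils down to one script that inspects the request path and decides what to serve:

<?php
// index.php - single entry point; Apache rewrites every request to this file.
session_start();

$path = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '/';

if (strpos($path, '/files/') === 0) {
    // Protected binaries: verify the user before streaming anything.
    if (empty($_SESSION['user_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit('Not authorized');
    }
    $name = basename($path);                         // avoid directory traversal
    readfile('/var/private-files/' . $name);         // hypothetical storage location
    exit;
}

// ...dispatch everything else (pages, forms, etc.) here...
echo 'Hello from the front controller';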
Walter
The problem is probably that sending the file through a PHP script (with the headers you mentioned) doesn't support starting the download at an arbitrary position. Download managers use this feature to download a file over several simultaneous connections (assuming the server limits the speed of a single connection).
For a small project I would recommend making a copy of the file with a unique filename just for the duration of the download and redirecting the user to this copy. That way they get the web server's full download features, and it doesn't load the processor the way PHP does. Disadvantages: more disk space is required and you need to clean up the download directory.
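A rough sketch of that copy-and-redirect approach (the paths are invented); a cron job or scheduled script would then delete copies older than, say, an hour:

<?php
// Copy the protected file into a public download directory under a throwaway name,
// then let the web server (not PHP) handle the actual transfer, including resuming.
$source    = '/var/protected/bigfile.zip';               // hypothetical protected file
$token     = uniqid('dl_', true) . '.zip';
$publicDir = $_SERVER['DOCUMENT_ROOT'] . '/downloads/';

if (!copy($source, $publicDir . $token)) {
    die('Could not prepare download');
}

header('Location: /downloads/' . $token);
exit;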