I have page caching in place and I am trying to improve server response time. When I grab the cached page content via PHP, for whatever reason it takes about 800ms to output it to the browser:
require_once(PATH_TO_CACHED_FILE);
When I copy this exact same content and put it into a .html file, I get the same content in the browser in about 250ms.
I also get about 250ms when I switch the above require_once to this:
echo 'a';
So with all this in mind, I'm thinking this is probably related to the buffer size (the bigger the buffer, the longer it takes to output it)? Right? I mean, this is a huge difference. What could one do to more or less match outputting the content via PHP to simply loading HTML, since both files do virtually the same thing (grab content / push to browser)?
Thanks!
btw: I also tested copying the output to a .php file (so it just echoed the cached HTML; there are no calculations done in this file) and it still takes ~800ms. How can simply changing the extension from .php to .html make a 500ms difference?
btw2: not sure if it's important, but PHP is running on nginx.
What could one do to more or less match outputting the content via PHP to simply loading HTML, since both files do virtually the same thing (grab content / push to browser)?
Nothing, see below.
How can simply changing the extension from .php to .html make a 500ms difference?
The PHP/PHP-FPM scripting engine is quite heavy compared to plain HTML access, because it has to (potentially, depending on current state) fork a worker process, do the necessary bootstrapping, load modules if they were not loaded already, and parse the script. Even when the file you are requiring is plain HTML, PHP must still parse it and look for <?php tags to process.
With direct HTML access, there is no PHP engine involved. Nothing of the above happens if you simply link/access your HTML file directly.
Aside from that, buffering can be an issue as well, if you use zlib on the PHP side.
If you must insist on some parent PHP code being used to serve cached HTML files, then it probably makes sense to experiment with using readfile in lieu of require, as the former won't be subject to parsing the file content for PHP tags.
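A minimal sketch of that approach (PATH_TO_CACHED_FILE stands in for whatever constant your cache layer defines):

<?php
// readfile() streams the file's bytes straight to the output;
// unlike require/include, the content is never scanned for <?php tags.
if (is_file(PATH_TO_CACHED_FILE)) {
    header('Content-Type: text/html; charset=utf-8');
    readfile(PATH_TO_CACHED_FILE);
    exit;
}
// Fall through to the normal page-rendering code on a cache miss.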
Otherwise, the best route to go with, if you already have complete HTML files (as in a full page cache), is completely avoiding PHP invocation for them. This can be accomplished by using NGINX (or your webserver of choice) rewrites to route requests directly to the HTML cache files, based on their presence (e.g. for NGINX this can be implemented using the try_files directive).
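As a rough illustration, assuming the full-page cache files are written under /var/www/cache mirroring the request URI (all paths here are hypothetical), the NGINX side could look something like this:

location / {
    # Serve the pre-rendered HTML file if one exists for this URI;
    # hand off to PHP only when there is no cache file.
    root /var/www/cache;
    try_files $uri $uri/index.html @php;
}

location @php {
    # Normal PHP-FPM handoff for cache misses.
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME /var/www/app/index.php;
    fastcgi_pass unix:/run/php-fpm.sock;
}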
Related
I'm confused. I have a PHP 5.6 script that produces a .js JavaScript file. It echoes arrays using PHP loops, includes some plain-text non-PHP sections, and also reads ten smaller JavaScript files and echoes them to the client. The total final file size is 560KB. The server automatically compresses the output, and it arrives at the client as 163KB compressed. It takes between 700 and 1400 milliseconds to arrive at the client.
I guess that I shouldn't complain, but it seemed to me that it didn't make sense to keep reconstructing this file via PHP, so I prepared a copy of the final file, gzencoded it with level 9 to a size of 160KB, and tried skipping PHP and loading the file either directly or via a RewriteRule in .htaccess. It's now always taking between 1200 and 1600 milliseconds to arrive, according to Chrome's Network pane.
Is it possible that PHP is so fast that it's bad to cache the file? Or is there something that might need adjustment? This is all via shared hosting, so I don't have full control.
I think that I was missing the point that it's all static data, as pointed out by @Capsule. So by using my new scheme and putting it in a file, I automatically activate support for sending back 304 responses later via the ETag (after I set the appropriate Cache-Control header in .htaccess).
The PHP system may be highly optimized, as pointed out by @arkascha, but that's only helpful the first time the user accesses the file; when max-age expires and they ask for it again, my PHP script doesn't contain a scheme for sending back 304, so it has to send the whole file out again.
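For reference, such a 304 scheme in PHP only takes a few lines. This is a minimal sketch (the file name and the max-age value are placeholders):

<?php
$path = 'bundle.js';                    // hypothetical pre-built file
$etag = '"' . md5_file($path) . '"';

header('Cache-Control: max-age=3600'); // let the browser reuse it for an hour
header('ETag: ' . $etag);

// If the browser sent the same ETag back, the file hasn't changed:
// answer 304 Not Modified and skip sending the body entirely.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && $_SERVER['HTTP_IF_NONE_MATCH'] === $etag) {
    http_response_code(304);
    exit;
}

header('Content-Type: application/javascript');
readfile($path);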
.html files can contain HTML and CSS code, but .php files can do everything an .html file can do, plus use PHP. Are there any advantages to using .html files over .php files, especially in the development of a responsive website?
Semantic:
By using *.html/*.htm, the reader immediately knows it is just plain HTML.
Performance*:
Every HTML file that is requested is sent by the webserver straight to the browser.
For every PHP file, the webserver first starts the PHP interpreter, which processes the file. Only after the file has been processed is the output sent to the browser.
This means HTML takes less CPU power/memory on your webserver. However, you will never really notice it unless you are serving thousands of requests per second.
* not as relevant as it used to be, due to newer caching technologies.
PHP (Hypertext Preprocessor) works on the server side. It generates content and sends it to the browser, which reads the content PHP produced. Once it is in the client window, the page is fixed; that's why we use JavaScript on the client side, so the page can work on its own. It is the server that deals with the PHP file, not the browser. Your browser only reads markup languages like HTML.
So if you have a PHP page, someone who loads that page may not see the server-side PHP code; but the file itself is still publicly available, because if you made it not publicly available, the person would not be able to load that page at all.
Thus someone with the right knowledge could 'grab' that file and then read the server-side script.
So is it not safer to make a 'proxy'? For example, an AJAX POST call to a PHP page (a 'script handler'), passing a string whose first 2 characters are the ID of the PHP script to run and whose remainder is the data for that script. The script handler then runs an include based on that number and returns the echoed HTML, which is then displayed.
What do you guys think? I have done this and it works quite nicely; if I grab the source, all I get is an HTML page with a div container and a JavaScript file with AJAX calls to the script handler.
No. Your 'workaround' does not fix the problem, if there ever was one.
If a client (a browser) requests a 'resource' (a page, for example) from a webserver, the webserver won't just serve the resource as it finds it on disk.
If you configured your webserver well, it will know that
An .html, .gif, .png, .css, .js file can just be served as-is.
A .php, .php5, .cgi, .pl file has to be executed first, and the resulting output has to be served.
So with a properly configured server (and most decent webservers are properly configured by default), grabbing the PHP source just by calling the page is impossible - the webserver will know to execute the source and return the result.
But
One of the most common bugs when writing your own 'upload/download script' is allowing users to upload/download .php (or other executable) files. If your own script 'serves' the .php file by reading it from disk and writing it to the net, users will be able to see your code. (A sketch of the safe way to do this follows the list below.)
Solution:
Don't write scripts unless you know what you are doing.
Avoid the not-invented-here syndrome (don't reinvent the wheel unless you are sure you NEED a better wheel AND can MAKE a better wheel)
Don't solve problems that don't exist!
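To make the download pitfall concrete, here is a minimal sketch of a download script that whitelists which extensions it will serve; the uploads/ directory and the extension list are assumptions for illustration:

<?php
$allowed = array('jpg', 'png', 'gif', 'pdf', 'txt'); // note: no .php!
$name = basename($_GET['file']);                     // strip any path components
$ext  = strtolower(pathinfo($name, PATHINFO_EXTENSION));

if (!in_array($ext, $allowed, true) || !is_file('uploads/' . $name)) {
    http_response_code(404);
    exit;
}

// Serving the raw bytes is fine here precisely because .php is excluded.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile('uploads/' . $name);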
By the by:
if your webserver were misconfigured and just served .php files as viewable/downloadable files, your 'solution' of calling them via AJAX would not change this... AJAX is still client-side, so any client could bypass the AJAX and fetch the script itself.
If your web server is configured correctly, users should never be able to view the actual contents of the PHP file. If they try, they should see the actual output of the PHP script as your web server reads and executes it, then passes that as the response to the HTTP request.
Furthermore, you need to understand that users can easily still look at the file the AJAX request is fetching; all they need to do is install Firebug, or use the Chrome developer tools, and they'll be able to see the full URL the file is fetched from.
So to sum up, firstly you shouldn't need to use this kind of 'security technique' for PHP files, and secondly, the 'security technique' will not stop anyone with more than a passing interest in your data.
Would it be faster to include a JavaScript file and output its contents in the HTML inside a <script> tag, or to just use the src attribute and let the browser make another request?
Simply outputting it instead of letting the browser make another request would obviously mean fewer requests and possibly less server load, but does it make it faster? Including the files and outputting them doesn't let the browser cache them.
If you include it, every different page will have the overhead of downloading the script again.
If you link to it externally, send far-future expiry headers, and use versioning with a cache buster (for changes), your file will be downloaded once, as required. On the topic of performance, be sure to minify or pack your production JavaScript.
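A common way to do the versioning from PHP is to bake the file's modification time into the URL, so the browser re-downloads the script only when it actually changes (app.js and its path are placeholder names):

<?php
// The query string changes whenever app.js is modified on disk,
// busting the browser's long-lived cached copy.
$version = filemtime('js/app.js');
?>
<script src="/js/app.js?v=<?php echo $version; ?>"></script>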
Of course, this depends on your JavaScript. If it is a few lines and unlikely to change, maybe you could save that one HTTP request and place it inline.
99% of the time, however, an external file is best practice.
It is quite a complex answer; obviously the techniques differ between a production environment and a development one.
The latter is quite simple: just include your scripts as they are.
For a production environment, you should concatenate the JS files you need into one file, then minify and compress it. You can retrieve libraries from a public CDN to increase download performance and relieve your server load.
The overhead of the extra HTTP request should be balanced out by caching.
To increase user-perceived performance, you should link your JS file at the bottom of the page instead of in the head section.
You should be aware of deferred execution too: it lets the browser download other resources while downloading JavaScript files (by default, the browser downloads one JavaScript file at a time, as it doesn't know whether the script it downloads will change the DOM during its execution).
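Deferred execution is just an attribute on the script tag; a minimal example:

<!-- Downloaded in parallel with the rest of the page;
     executed only after the document has been parsed. -->
<script defer src="/js/app.js"></script>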
Finally, if your script is quite short, you will get better performance by including it right in the web page.
Lastly, if you have similar questions, you should enjoy reading this:
http://developer.yahoo.com/performance/rules.html
I agree with @alex. Also, linking allows the script files to be downloaded in parallel as the page is being parsed. Most browsers use multiple connections to download content while parsing the main page's content.
Was wondering a couple of things.
Do HTTP headers cache everything on the page? And if I have some JavaScript files, will it cache them as well for subsequent pages, or is it more complicated than that? Example: if I cache all JavaScript files on page1.php, will the files still be cached on page2.php, or does caching files for page1.php only apply to page1.php?
The other question is...
Should I scrap HTTP headers and just use APC, and if so, how complicated is it? Or is it in fact possible to use both? (Asking because YSlow says to use HTTP headers.) Thanks for any info; I've been reading, but these questions weren't really answered in the text.
Your web server will take care of caching for you if you're just serving up regular .js files. The .js files will be downloaded the first time they are linked from one of your pages. When the user re-loads that page, or goes to another page entirely that uses the same .js file, the browser will use the cached copy. This applies when you load scripts via <script src="code.js"></script> tags.
That's if you have standalone, separate .js files. If, on the other hand, your JavaScript code is buried in the HTML your PHP scripts generate, for example:
<script type="text/javascript">
alert("Hello world!");
</script>
...these scripts will be re-generated each time your .php file is loaded. If you're looking to cache the output of your PHP scripts then you will need to manage caching yourself by setting the appropriate HTTP headers from your PHP scripts, be that via the Cache-Control family of headers or the If-Modified-Since and ETag style of headers.
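As a rough sketch, telling browsers they may cache a PHP response for an hour looks like this (the max-age value is arbitrary, and the headers must be sent before any output):

<?php
// Allow any cache to store this response and reuse it for 3600 seconds.
header('Cache-Control: public, max-age=3600');
// Older clients look at Expires instead of Cache-Control.
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');

echo renderPage(); // hypothetical function that produces the page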
Caching and PHP files don't generally go together, though, since you're usually generating dynamic content that changes based on user input, the time of day, cookies, etc. As caching is purely an optimization the general programming warning against premature optimization applies. If you mess up your HTTP headers you can cause yourself a lot of headaches (believe me on that!). As a rule of thumb, you can probably just let Apache or IIS take care of advanced HTTP things like this and only muck around with HTTP headers if you have a specific need to do so.
I think you're confusing the different types of caching. You've talked about 3 or 4 very different things here.
browser caching -- any normal browser will cache images, JS files, and CSS files between pages. Meaning, the second time a browser wants to display any particular image from your site, it will load it from its local disk cache instead of going back to your server for it. All this stuff just happens -- don't mess around with it, and it just works.
(exceptions: browsing user has turned off caching, you've changed headers to avoid caching, your mime.types aren't set up correctly so the browser doesn't treat these files correctly.)
server-side content caching -- if your pages are rendering slowly ON THE SERVER, you can use various disk-and-RAM caching schemes to keep the output around, and prevent the server from having to render each page each time. This only works for fairly static sites or static parts of pages.
APC content caching -- APC has functions that let you stuff arbitrary content into a server-side RAM cache. If a piece of your system takes a long time to render but can be reused across many server hits, this is a good choice (see the sketch after this list).
APC code caching -- Your text PHP scripts are "pseudo-compiled", then sent to the PHP runtime for execution. This "pseudo-compile" stage can be very slow and is redundant across requests, so APC caches the "pseudo-compiled" PHP stage in RAM. It can speed up a whole website quite handily.
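To illustrate the APC content-caching pattern above, here is a minimal sketch, assuming the APC extension is installed; the cache key, the TTL, and buildExpensiveFragment() are placeholders:

<?php
// Try the RAM cache first; fall back to the slow render on a miss.
$html = apc_fetch('homepage_fragment');
if ($html === false) {
    $html = buildExpensiveFragment();           // hypothetical slow renderer
    apc_store('homepage_fragment', $html, 300); // keep it for 5 minutes
}
echo $html;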
Sorry if this is TMI.