I am using browser caching to speed up my PHP website. If a user visits a page before logging in and comes back to the same page after logging in, the page doesn't get updated with the content that is only visible to logged-in users.
For example, the "Log in" link in the header menu doesn't change to "Log out" if the user visited the page before logging in. Likewise, if they visited a page while logged in, the "Log out" link doesn't change back to "Log in" even after they log out.
Here is a code sample:
<h2>Contact Form</h2>
<form method="post" action="" onsubmit="">
<?php
if (logged_in() === false) {
?>
Name*
<input type="text" name="name" required />
Email*
<input type="email" name="email" required />
<?php
}
?>
Message*
<textarea name="message" rows="12"></textarea>
<input type="submit" name="contact" value="Send Message" />
</form>
It would be a huge help for me if someone could help me on this issue.
Thanks in advance!
I'll formulate the answer based on the comments we had together.
Check the HTTP headers sent to the browser
The HTTP headers sent by PHP, Apache, NGINX or whatever is running on your server or even passing through (such as reverse proxies like Varnish) will inform the browser what to do with the response.
If you have an Expires or Cache-Control HTTP header that says the page can be cached, then the browser won't retrieve the page again once you're logged in.
If you don't want your dynamic pages to be cached, you have to set the HTTP headers accordingly, e.g. Cache-Control: no-cache, no-store, private
This could be done in PHP:
header('Cache-Control: no-cache, no-store, private');
It can also be done later in your web server config for page requests handled by PHP.
By the way, the web server can override what PHP sends, so check the response headers yourself in the browser's developer tools. Other layers such as proxies can also alter the HTTP headers.
The Vary HTTP header
If your server responds to a given URL in different ways, then it has to inform caches that the response can vary depending on something. Typically, almost all assets should be compressed with gzip or brotli to save bandwidth. But not all browsers handle brotli compression. So when they make the request, they say what they can handle, e.g. accept-encoding: gzip, deflate, br. Here, the browser can handle br = brotli.
The response will then be compressed with brotli to reduce the size. But if a reverse proxy caches it and someone else comes with a browser that doesn't handle brotli then we'll be in trouble. This is why the server should inform all caching layers that the response varies depending on the Accept-Encoding header by returning Vary: Accept-Encoding.
It's also common to have a language detection script for the homepage URL (if you hit the domain without a language path). In this case, we have to send Vary: Accept-Encoding, Accept-Language
For the HTML generated by PHP, it is also important to say that the response can vary depending on the Cookie header. This is because a page for a specific user could be allowed to be cached in private mode, say for a minute or more, but it must not be mixed up with another user's page.
So in PHP, you could do:
header('Vary: Accept-Language, Accept-Encoding, Cookie');
Caching static files and re-deploying them
In terms of performance, it's important to cache static data. So typically this should be done for all assets such as CSS, JS, images, favicons, fonts, etc. They are not dynamic so they can be stored for ages. This can be done by setting Cache-Control: max-age=31536000 for these static files.
If you change a CSS rule, the CSS file won't be re-downloaded by a visitor that already came before. But it's not a problem because your HTML can point to a new URL for this same static file. This can be done by adding a query parameter in the URL of this asset:
<?php
// Increment the version when you change your CSS.
// This can even be done automatically during deployment.
// You could also use the file timestamp or a hash instead.
$css_version = 3;
echo "<link rel=\"stylesheet\" media=\"all\" href=\"/css/theme.css?version=$css_version\" />\n";
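The file-timestamp variant mentioned in the comment above could be sketched like this (the path is illustrative; adjust it to your own layout):

```php
<?php
// Sketch: derive the cache-busting version from the file's modification
// time, so the URL changes automatically whenever the file is re-deployed.
// The path below is a placeholder; point it at your real CSS file.
$css_file = __DIR__ . '/css/theme.css';
$css_version = file_exists($css_file) ? filemtime($css_file) : 1;

echo '<link rel="stylesheet" media="all" href="/css/theme.css?version='
    . $css_version . "\" />\n";
```

This way no manual increment is needed: re-deploying the file updates its mtime, which updates the URL, which forces browsers to fetch the new version.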
Apache config for compressed static files
To improve performance further, compress all CSS and JS files. This can be done by minifying, aggregating and then compressing them.
I use NodeJS and made some NPM scripts to do that. There's plenty of information about this on the web.
Then you need to add a few rules in your .htaccess file:
RewriteEngine On
# Redirect to HTTPS if we are not in HTTPS:
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Redirect to www.example.com if it's example.com without the www. prefix:
RewriteCond %{HTTP_HOST} !^www\. [NC]
# But don't do it if it's on the local development machine (ex: example.com.local).
RewriteCond %{HTTP_HOST} !\.local$ [NC]
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# If the URL looks like only having the language in the path, such as
# https://www.example.com/de or https://www.example.com/EN then rewrite it
# to index.php?lang=XX where XX is the captured language.
# It's case-insensitive with [NC] because PHP lowercases it afterwards.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(en|de|fr|es|it)$ index.php?lang=$1 [NC]
# Serve brotli compressed assets if they exist.
RewriteCond %{HTTP:Accept-Encoding} br
RewriteCond %{REQUEST_FILENAME}.br -f
RewriteRule (.*\.(?:css|js|svg|ico))(?:$|\?) $1.br [NC,L]
# Serve gzipped compressed assets if they exist.
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule (.*\.(?:css|js|svg|ico))(?:$|\?) $1.gz [NC,L]
# Set MIME types and avoid double compression done by some Apache modules.
RewriteRule \.css\.(gz|br) - [NC,E=no-gzip:1,E=no-brotli:1,T=text/css]
RewriteRule \.js\.(gz|br) - [NC,E=no-gzip:1,E=no-brotli:1,T=application/javascript]
RewriteRule \.svg\.(gz|br) - [NC,E=no-gzip:1,E=no-brotli:1,T=image/svg+xml]
RewriteRule \.ico\.(gz|br) - [NC,E=no-gzip:1,E=no-brotli:1,T=image/x-icon]
# Tell proxies to cache files separately depending on the Accept-Encoding header
# value sent by the browser. This is because we respond with different content
# encodings (none, brotli, gzip) for the same requested URL. We'll only do that
# for the static files. For pages, we'll let the PHP app set the Vary header.
<FilesMatch "\.(?i:css|js|svg|ico)(?:\.(?:gz|br))?$">
Header append Vary Accept-Encoding
</FilesMatch>
# Send the correct encoding header for *.gz files served instead of the original.
<FilesMatch "\.(?i:css|js|svg|ico)\.gz$">
Header set Content-Encoding gzip
</FilesMatch>
# Send the correct encoding header for *.br files served instead of the original.
<FilesMatch "\.(?i:css|js|svg|ico)\.br$">
Header set Content-Encoding br
</FilesMatch>
# Set a long cache age of 2 years for all static files.
# You have to change the URL if you update the file. Be aware
# that if you upload documents to your website you might have
# to disable this rule in the uploaded folder.
<FilesMatch "\.(?i:br|css|gif|gz|ico|jpe?g|js|png|svg)$">
Header set Cache-Control "max-age=63072000, public"
</FilesMatch>
# PHP can be configured to compress automatically the output if the browser
# accepts gzip compression. According to the documentation it is more effective
# this way than using ob_start('ob_gzhandler') in the PHP code.
# If output_compression is enabled then it will automatically set the
# "Vary: Accept-Encoding" HTTP header.
php_flag zlib.output_compression On
php_value zlib.output_compression_level 9
# Hide PHP's X-Powered-By HTTP header if php.ini doesn't already do it.
Header unset X-Powered-By
Compress PHP's output with gzip
As you see in the Apache config above, we can enable the
zlib.output_compression functionality of PHP, which will
compress the output if the browser handles it and the Vary
HTTP header will also be automatically set.
But if you cannot enable it via Apache or php.ini then you
can still do the compression yourself in the PHP code with the
output buffering. Just put this piece of code in your PHP before
outputting any text:
// If zlib.output_compression=On then the output will be automatically compressed
// and the "Vary: Accept-Encoding" header will be set too. But if it's not enabled
// then we could compress the page ourselves with PHP's output buffering.
if (preg_match('/^(Off|0)$/i', ini_get('zlib.output_compression'))) {
// If the browser accepts gzip then compress with PHP's output buffering.
if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && preg_match('/\bgzip\b/', $_SERVER['HTTP_ACCEPT_ENCODING'])) {
// No need to set the "Vary: Accept-Encoding" header because this is
// already done by ob_gzhandler() called below.
ob_start('ob_gzhandler');
}
}
// Hide the X-Powered-By HTTP header if expose_php=Off cannot be set.
if (function_exists('header_remove')) {
header_remove('X-Powered-By');
} else {
// We cannot remove it but we can set it to empty.
header('X-Powered-By:');
// We'll also try to remove it in the .htaccess file.
}
Related
I finished my Angular project.
In my project I send POST requests with data to PHP files, and then get result from them back to Angular.
Now I want to allow requests only from the origin domain, and deny any request from any other domain.
I try to use:
header("Access-Control-Allow-Origin: example.com");
but it does not work. And I don't want to use $_SERVER['HTTP_REFERER'] because it can be manipulated.
I also tried to use .htaccess but I don't know how to implement that. I tried something like this:
order deny, allow
deny from all
allow from mydomain.com
but it does not work.
My project already has the following .htaccess file:
RewriteEngine On
# If an existing asset or directory is requested go to it as it is
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} -f [OR]
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} -d
RewriteRule ^ - [L]
# If the requested resource doesn't exist, use index.html
RewriteRule ^ /index.html
taken from here: https://angular.io/guide/deployment#routed-apps-must-fallback-to-indexhtml
What can I do?
Thanks.
Short answer:
header("Access-Control-Allow-Origin: *");
is the minimum that should work (on your server side), but you may need to add two additional headers like:
header("Access-Control-Allow-Methods: POST, GET, OPTIONS");
header("Access-Control-Allow-Headers: Content-Type");
Like any header() call, be sure to perform those BEFORE any output.
Long answer:
CORS allows cross-origin queries with relative security, relying on a pre-flight OPTIONS request to check what's allowed and what's not.
Access-Control-Allow-Origin sets which origins (domains) are allowed. You may want to use only your trusted domains.
Access-Control-Allow-Methods sets which methods are allowed. Usually, a lot of those.
Access-Control-Allow-Headers sets which optional (and custom) headers are allowed. Usually, you want to include any non-standard headers on top of Content-Type.
CORS is incredibly well documented here: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS
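To answer the original question of allowing only your own trusted domain rather than `*`, here is a minimal sketch that echoes the Origin header back only when it is on an allow-list (the domain names below are illustrative placeholders; use your real ones):

```php
<?php
// Sketch: restrict cross-origin access to an allow-list of trusted origins.
// The domains below are illustrative placeholders.
$allowed_origins = array('https://example.com', 'https://www.example.com');
$origin = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';

if (in_array($origin, $allowed_origins, true)) {
    // Echo back the matched origin instead of '*' so only trusted
    // domains may read the response from JavaScript.
    header('Access-Control-Allow-Origin: ' . $origin);
    // Caches must not serve one origin's response to another origin.
    header('Vary: Origin');
    header('Access-Control-Allow-Methods: POST, GET, OPTIONS');
    header('Access-Control-Allow-Headers: Content-Type');
}

// Answer the pre-flight request without executing the rest of the script.
$method = isset($_SERVER['REQUEST_METHOD']) ? $_SERVER['REQUEST_METHOD'] : 'GET';
if ($method === 'OPTIONS') {
    exit;
}
```

Keep in mind CORS only controls what browser JavaScript may read; it does not stop non-browser clients from sending requests, so it is an access control for browsers, not server-side authentication.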
In my project, HSTS is enabled. So if someone tries to access the site over HTTP, it redirects to HTTPS.
After a security scan, it was reported that .ttf, .woff and .woff2 files ignore HSTS.
Example 1:
On Google Chrome, if I try the URL below, it redirects to HTTPS:
http://example.com/backend/web/lib/roboto/Roboto-Light.woff2 then it
redirects to
https://example.com/backend/web/lib/roboto/Roboto-Light.woff2
If I try the same thing on Firefox, it just downloads the Roboto-Light.woff2 file over HTTP instead of redirecting to HTTPS.
Example 2:
If I try the URL below on both Google Chrome and Firefox, it just downloads the file.
http://example.com/backend/web/lib/roboto/Roboto-Black.ttf
So what should I do to fix this issue?
Update
Network log after accessing the below URL:
http://example.com/backend/web/lib/roboto/Roboto-Black.ttf
It seems the file is first loaded by visiting the HTTP URL, but the HTTPS one is not reflected in the browser's address bar; I am not sure.
VHOST Settings
<VirtualHost *:80>
ServerAdmin webmaster@localhost
DocumentRoot /var/www/html
ServerName example.com
RewriteEngine on
RewriteCond %{HTTP:X-Forwarded-Proto} ^http$
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
#RewriteCond %{HTTPS} !=on
#RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK|OPTIONS)
RewriteRule .* - [F]
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains"
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
You need to go back and ask the security scan people why they think this is the case.
You are clearly showing that HSTS is being set for the font files.
You are also showing that you correctly get the 307 internal redirect for HSTS reasons.
This is the way it's supposed to work. You get two requests in Chrome's network tab (other browsers may be different):
A fake 307 response which upgrades the request from HTTP to HTTPS. This is created by the browser, and the HTTP request never reaches the server. That's why I call it a "fake" response.
The real request sent over HTTPS.
As fonts are downloaded it's difficult to tell that this was downloaded over HTTPS except by looking in the network tab - but that's fine.
If I try the same thing on Firefox, it just downloads the Roboto-Light.woff2 file over HTTP instead of redirecting to HTTPS.
How do you know this? Are you sure you have visited the site over HTTPS to get the HSTS header? The first request may well be over HTTP (though you have a standard redirect in place so this should redirect to HTTPS and then download), but after that it should auto redirect BEFORE the request is sent.
If I try the URL below on both Google Chrome and Firefox, it just downloads the file.
It probably does. But after a redirect.
It seems the file is first loaded by visiting the HTTP URL, but the HTTPS one is not reflected in the browser's address bar; I am not sure.
No, as discussed, the first one is a dummy request. The second is the real request, which is actually sent to the server. As the font file is downloaded immediately, it doesn't do anything with the URL bar.
I have some existing PHP code on my server. Now I want to log complete information about the requests that come to my server. I don't want to make any changes to the existing code. I am using Apache mod_rewrite for this. I have a sample PHP script, stats.php, which looks something like this:
<?php
/* NOTE: This is pseudo-code! */
// Open the database connection.
// Add server info, referer info, script name and argument info to the database.
// Convert characters in the request from UTF-16 to UTF-8.
// Then redirect back to the originally requested URL:
header('Location: ' . $_SERVER['REQUEST_URI']);
?>
In httpd.conf file
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_URI} !/stats\.php
RewriteCond %{REQUEST_URI} !\/favicon\.php
RewriteRule ^/(.*)$ /stats.php?$1 [L]
RewriteLog "logs/error_log"
RewriteLogLevel 3
</IfModule>
The problem is, I am afraid this may not be best from an SEO perspective and may also be buggy. Are there better ways to do this? For example, can I use a script to process the access_log file?
Say for example, if you go to http://your-domain.com/some-page.html, you'll get a loop:
Browser contacts server with request URI /some-page.html
mod_rewrite rewrites the URI to /stats.php?some-page.html
The stats.php does its thing, then redirects the browser to /some-page.html
Browser contacts server with request URI /some-page.html
repeat starting at #2
What you need to do instead of responding with the Location: header is read the contents of the some-page.html file and return that to the browser, essentially "proxying" the request for the browser. The browser therefore doesn't get redirected.
As for how to do that in php, there's plenty of google results or even plenty of answers on Stack Overflow.
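A minimal sketch of that "proxying" idea in PHP, assuming the logging has already happened earlier in stats.php (the MIME map is deliberately tiny and the path handling is illustrative; a production version needs stricter checks):

```php
<?php
// Sketch: instead of redirecting, read the requested file and return its
// contents directly, so the browser never receives a Location header
// and the redirect loop never starts.
$root = isset($_SERVER['DOCUMENT_ROOT']) ? $_SERVER['DOCUMENT_ROOT'] : __DIR__;
$uri  = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
$path = parse_url($uri, PHP_URL_PATH);
$file = realpath($root . $path);

// Only serve real files that live inside the document root.
if ($file !== false && strpos($file, realpath($root)) === 0 && is_file($file)) {
    $types = array('html' => 'text/html', 'css' => 'text/css',
                   'js' => 'application/javascript');
    $ext = strtolower(pathinfo($file, PATHINFO_EXTENSION));
    header('Content-Type: ' . (isset($types[$ext]) ? $types[$ext] : 'application/octet-stream'));
    readfile($file); // stream the file's contents as the response body
}
```

Note this only works for static files; requests that should run through other PHP scripts would need to be included or dispatched rather than read verbatim.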
I figured out what I should do. I did the following:
1) Added a custom LogFormat to the httpd.conf file.
2) Added a CustomLog directive and piped the output to stats.php.
3) stats.php takes care of adding the data to the database.
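As a sketch, steps 1 and 2 might look like this in httpd.conf (the format nickname and script path are placeholders):

```apache
# 1) A custom log format carrying the fields stats.php will store.
LogFormat "%h %t \"%r\" %>s \"%{Referer}i\" \"%{User-Agent}i\"" statslog

# 2) Pipe each access-log line to the PHP script. Apache starts the
#    process once and writes one line per request to its stdin.
CustomLog "|/usr/bin/php /path/to/stats.php" statslog
```

stats.php then reads one line per request from STDIN (for example with fgets(STDIN) in a loop) and inserts the parsed fields into the database. Because the logging now happens out of band, no rewrite or redirect is involved at all.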
I wrote a REST-ful API using Fat Free Framework in PHP, and am making a call using backbone.js. When I try and save a new Orders model my app makes a PUT request, and the server spits back a 406 error.
Request Method:PUT
Status Code:406 Not Acceptable
Request Headers
Accept:application/json, text/javascript, */*; q=0.01
Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Connection:keep-alive
Content-Length:174
Content-Type:application/json
Cookie:__utma=239804689.76636928.1286699220.1305666110.1325104376.94; __utmz=239804689.1325104376.94.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); PHPSESSID=935d2632fd0d12a1a0df4cb0f392eb5e
X-Requested-With:XMLHttpRequest
Request Payload
{"id":0,"customerId":0,"lastPage":"items","priceConfig":null,"items":null,"saveStatus":0,"savedAt":1326588395899,"name":null}
Response Headers
Connection:Keep-Alive
Content-Length:460
Content-Type:text/html; charset=iso-8859-1
Date:Sun, 15 Jan 2012 00:46:37 GMT
Keep-Alive:timeout=5, max=98
Server:Apache
My .htaccess file looks like this:
# Enable rewrite engine and route requests to framework
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-l
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [L,QSA]
# Disable ETags
<IfModule mod_headers.c>
Header Unset ETag
FileETag none
</IfModule>
# Default expires header if none specified (stay in browser cache for 7 days)
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault A604800
</IfModule>
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
My website application works fine on my local server and only does this on my web server. Any ideas what is going wrong?
I came up with a workaround.
I believe my server is using mod_security2 to block PUT and DELETE requests. I am waiting to hear back from them, and mod_security2 can't be disabled in .htaccess files, so there's nothing I can do.
Using "Script PUT /filename" in the .htaccess file was causing a 500 error: "Script not allowed here", I am not sure why, but I decided not to deal with having my web host reconfigured to handle PUT and DELETE.
To keep my API REST-ful, I left in the normal handling of PUT and DELETE, and added this to the POST handling:
function post() {
    // If Backbone.emulateHTTP is true, the real method arrives in the
    // X-HTTP-Method-Override header while the request itself is a POST.
    $data = json_decode(F3::get('REQBODY'), true);
    $type = isset($_SERVER['HTTP_X_HTTP_METHOD_OVERRIDE'])
        ? $_SERVER['HTTP_X_HTTP_METHOD_OVERRIDE'] // PUT, DELETE or POST
        : 'POST';
    if ($type == 'PUT') {
        $this->put();
        return;
    }
    if ($type == 'DELETE') {
        $this->delete();
        return;
    }
    // Handle a normal POST here.
}
If you set Backbone.emulateHTTP = true; it keeps the request method as POST and sends the X-HTTP-Method-Override as PUT or DELETE.
I like this because I can keep my REST-ful implementation intact, and just comment out the emulateHTTP code for when I publish to my webserver.
I have been trying to force HTTPS on my osCommerce site and it works. But when it switched to HTTPS, the session breaks and login doesn't work at all.
.htaccess code for forcing HTTPS
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
The way to force HTTPS on all your pages in an osCommerce site is to use what's already set up for you in the configuration instead of making .htaccess do the work.
Edit the includes/configure.php file and put the HTTPS version of your site in both of the following:
define('HTTP_SERVER', 'https://example.com');
define('HTTPS_SERVER', 'https://example.com');
Are you definitely using Apache?
Try this instead in your .htaccess...
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI}
I'm not sure if this is directly related to your problem, but I would suggest to make sure that all the forms, links and Location headers aimed within your site point to URLs using an https prefix, if those are absolute.
The rewrite rules that turn HTTP requests into HTTPS are only really useful for securing the "entry point": the first page that the user visits. They don't prevent data from being sent in the clear if that data is sent to a URL that uses http://. Indeed, these rewrite rules only come into action after the browser has made the request in the clear first (so all headers, including login cookies unless they are secure cookies, and all the POSTed data, for example, will have been sent in the clear).
You may be interested in these related questions:
How to redirect all HTTP requests to HTTPS
Tomcat session management - url rewrite and switching from http to https
There's a chance that the sessions break because there's a seemingly invisible plain HTTP connection in the process, which may cause some session-related data not to be transmitted correctly. If you're using Firefox, it can be useful to turn on the security.warn_leaving_secure option (via the about:config URL) to track down this sort of problem.
In /includes/configure.php, modify the HTTP domain to use https://. That will make all sessions stay HTTPS-only. Do the same in /admin/includes/configure.php.
This builds upon the other answers' mod_rewrite rules, which you should still apply.
I would also add the HTTP Strict-Transport-Security header and XSS protection:
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
Header unset X-Powered-By
Header unset Server
Header set X-Content-Type-Options "nosniff"
Header set X-XSS-Protection "1; mode=block"
<FilesMatch "\.(appcache|atom|bbaw|bmp|crx|css|cur|eot|f4[abpv]|flv|geojson|gif|htc|ico|jpe?g|js|json(ld)?|m4[av]|manifest|map|mp4|oex|og[agv]|opus|otf|pdf|png|rdf|rss|safariextz|svgz?|swf|topojson|tt[cf]|txt|vcard|vcf|vtt|webapp|web[mp]|woff2?|xloc|xml|xpi)$">
Header unset X-XSS-Protection
</FilesMatch>
</IfModule>