I'm playing with the Symfony2 reverse proxy and HTTP cache, and I've done a lot of reading on the subject.
However, I'm stuck on how it works in my case.
Here is a use case.
GET /api/articles returns something like:
HTTP/1.1 200 OK
Content-Encoding: gzip
Content-Type: application/json
Set-Cookie: PHPSESSID=12345; expires=Thu, 14-Nov-2013 14:50:35 GMT; path=/
age: 0
allow: GET, POST
cache-control: must-revalidate, no-cache, private
etag: "da4b6c4f1540a12a112936e58db06df8c95fd3c4"
vary: Accept,Accept-Encoding
x-content-digest: enbf30f962b06f99bd91843741537e112fbd3300c8
x-symfony-cache: GET /api/articles: miss, store
As you can see, the Cache-Control header is marked as private, along with no-cache and must-revalidate. However, I think I'm setting the Response correctly:
$response = clone $view->getResponse();
$response
    ->setPublic()
    ->setEtag($etag)
    ->setSharedMaxAge(60)
    ->setVary(array('Accept'))
;

if ($response->isNotModified($this->getRequest())) {
    return $response;
}
I set it to public, so it should work. You may have noticed the Set-Cookie header; I don't know if it matters, but as long as I set the cache to public it shouldn't, should it?
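One thing I might try, just to rule the cookie out while testing, is dropping the session cookie from this particular response before returning it. A rough sketch (I haven't verified it changes anything; it assumes the $response from the snippet above and Symfony's standard ResponseHeaderBag API):
// Experiment only: remove the PHPSESSID cookie from this response to see
// whether Set-Cookie is what makes the gateway cache treat it as private.
$response->headers->removeCookie(session_name());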
Now, if I GET /api/articles with an If-None-Match: {etag} header, I get a 304, which is correct, but the Cache-Control header is the same.
Note that if I disable the reverse proxy, the Cache-Control header is correct and shows me:
Cache-Control: public, s-maxage=60, which is what I expect.
Related
I have an API and I've been trying to add cache-control headers to it.
The API already uses PhpFastCache for server-side caching, but I wanted to add an additional layer of browser caching. I came across this intelligent PHP cache-control page and modified it slightly.
Using PhpFastCache, I check whether the server-side cache exists; if it doesn't, I query the DB and output normally with a 200 response code. If the cache does exist, I do the following:
// get the last-modified date of this very file
$lastModified = filemtime(__FILE__);
// get a unique hash of the cached content (etag)
$etagFile = md5($CachedString->get());
// get the HTTP_IF_MODIFIED_SINCE header if set
$ifModifiedSince = (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) ? $_SERVER['HTTP_IF_MODIFIED_SINCE'] : false);
// get the HTTP_IF_NONE_MATCH header if set (etag: unique content hash)
$etagHeader = (isset($_SERVER['HTTP_IF_NONE_MATCH']) ? trim($_SERVER['HTTP_IF_NONE_MATCH']) : false);
// set last-modified header
header("Last-Modified: " . gmdate("D, d M Y H:i:s", $lastModified) . " GMT");
// set etag header
header("Etag: $etagFile");
// make sure caching is turned on
header('Cache-Control: public');
// check if the page has changed; if not, send 304 and exit
if (@strtotime($ifModifiedSince) == $lastModified || $etagHeader == $etagFile) {
    header("HTTP/1.1 304 Not Modified");
    exit;
} else {
    // no conditional match - output the cached result
    header('Content-Type: application/json');
    echo $CachedString->get();
}
I'm using this line to get an MD5 hash of the cached response:
$etagFile = md5($CachedString->get());
Then I check whether this MD5 content has changed:
if (@strtotime($ifModifiedSince) == $lastModified || $etagHeader == $etagFile) {
    header("HTTP/1.1 304 Not Modified");
    exit;
} else {
    // no conditional match - output the cached result
    header('Content-Type: application/json');
    echo $CachedString->get();
}
However, I can never seem to get a 304 response; it is ALWAYS a 200. For example:
curl -I -L https://db.ygoprodeck.com/api/v7/cardinfo.php?name=Tornado%20Dragon
With the response always being:
HTTP/1.1 200 OK
Date: Tue, 17 Mar 2020 13:37:31 GMT
Content-Type: application/json
Connection: keep-alive
Set-Cookie: __cfduid=daaab295934a2a8ef966c2c70fe0955b91584452250; expires=Thu, 16-Apr-20 13:37:30 GMT; path=/; domain=.ygoprodeck.com; HttpOnly; SameSite=Lax
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET
Access-Control-Allow-Headers: Content-Type, Authorization, X-Requested-With
Cache-Control: public
Last-Modified: Tue, 17 Mar 2020 13:15:53 GMT
Etag: 399b9ba2d69ab115f46faa44be04d0ca
Vary: User-Agent
CF-Cache-Status: DYNAMIC
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Server: cloudflare
CF-RAY: 57571be8a986a72f-DUB
Your request is being proxied through Cloudflare, which has its own caching layer. If you test directly against the origin (or with a grey-clouded DNS record), do you get a 304?
You said you were working on browser caching; the browser will cache based on the max-age you send, but I don't see one being set in the response.
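For instance (a sketch, not tested against your setup), swapping the bare Cache-Control: public in your snippet for something like this would give browsers an explicit lifetime alongside the ETag; the 300 seconds is purely illustrative:
// Let browsers reuse the response for 5 minutes before revalidating with the ETag.
header('Cache-Control: public, max-age=300');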
I did some research on preventing CRLF injection in PHP, but I couldn't find a solution for my case. I'm using Burp Suite to inject headers using CRLF characters, like below.
// Using the tool I put CRLF characters at the start of my request URL
GET /%0d%0a%20HackedHeader:By_Hacker controller/action
// This generates a header for me like the one below
HackedHeader:By_Hacker
So I can modify any header this way.
The tool acts like a proxy server: it catches the request and the response and lets us modify them however we want.
So I'm injecting headers into the request using CRLF characters, and the server then reflects those CRLF-injected headers in its response.
I'm worried because header fields like Pragma, Cache-Control and Last-Modified can lead to cache-poisoning attacks.
header() and setcookie() contain mitigations against response/header splitting, but they can't help me fix the issue above.
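Just to show what I mean by those mitigations: as far as I know, header() itself refuses a value containing a newline, so splitting through a value I build myself is already blocked. A quick illustration:
// PHP raises "Header may not contain more than a single header, new line detected"
// and does not send the header at all:
header("X-Test: foo\r\nHackedHeader: By_Hacker");
// But that doesn't help when the CRLF comes from the request URL and is
// reflected by the server itself, e.g. in the Location header shown below.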
Edit
When I request my site's contact-us page, this is the request I captured in the tool:
Request headers:
GET /contactus HTTP/1.1
Host: mysite.com
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
And i get the Response HTML for the above request
Now for the same request using the tool i'm adding custom headers just like below
Request Headers:
GET /%0d%0a%20Hacked_header:By_Hacker/contactus HTTP/1.1
Host: mysite.com
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
Response Headers:
HTTP/1.1 302 Found
Date: Fri, 10 Jul 2015 11:51:22 GMT
Server: Apache/2.2.22 (Ubuntu)
Last-Modified: Fri, 10 Jul 2015 11:51:22 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Location: mysite.com
Hacked_header:By_Hacker/..
Vary: Accept-Encoding
Content-Length: 2
Keep-Alive: timeout=5, max=120
Connection: Keep-Alive
Content-Type: text/html; charset=UTF-8
You can see the injected header Hacked_header:By_Hacker/.. in the above response.
Is there any way, in PHP or in the Apache server configuration, to prevent this kind of header injection?
Not sure why all the down votes; in fact, it is an interesting question :)
I can see that you have tagged CakePHP, which means your app is using the Cake framework. Excellent! If you are using Cake 3, it automatically strips off %0d%0a.
Alternatively, at the point where you receive the response header, just strip off %0d%0a and you are good!
Where could things like these apply? A third-party API response, or say... a webhook response! Or a badly sanitized way of handling intl, for example lang=en to lang=fr where the GET parameter is set directly as a response header. That would not be a wise move!
Ideally the data would come back in the GET response body and not in the header, but either way, just strip the %0d%0a and you are good.
Answering your edit.
You can see the injected header Hacked_header:By_Hacker/.. in the above response
That injected header cannot be controlled or stopped, mate; we have no control over what the other server does.
The question is: what do you do with the response header?
The answer is: you sanitize it. As ndm said, you need to sanitize the input, and what you get as a response IS an input. As soon as you detect %0d%0a, discard the response.
Need code work?
<?php
$cr = '/\%0d/';
$lf = '/\%0a/';
$response = '...'; // whatever your response is generated in

$cr_check = preg_match($cr, $response);
$lf_check = preg_match($lf, $response);

if (($cr_check > 0) || ($lf_check > 0)) {
    throw new \Exception('CRLF detected');
}
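If it helps, here is roughly how I would wire that check up before anything user-controlled reaches a header. The assertNoCrlf() helper and the lang parameter are just for illustration (the lang=en / lang=fr case mentioned above):
// Reject anything carrying encoded or literal CR/LF before it can reach a header.
function assertNoCrlf($value)
{
    if (preg_match('/%0d|%0a|\r|\n/i', $value)) {
        throw new \Exception('CRLF detected');
    }
    return $value;
}

$lang = isset($_GET['lang']) ? $_GET['lang'] : 'en';
header('X-Example-Lang: ' . assertNoCrlf($lang)); // illustrative header name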
My Response Header is
Access-Control-Allow-Methods GET, POST
Access-Control-Allow-Origin *
Cache-Control no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection Keep-Alive
Content-Length 81
Content-Type text/html
Date Mon, 26 Aug 2013 06:35:53 GMT
Expires Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive timeout=5, max=99
Pragma no-cache
Server Apache/2.4.3 (Win32) OpenSSL/1.0.1c PHP/5.4.7
X-Powered-By PHP/5.4.7
And The Request Header is
Accept application/json, text/javascript, */*; q=0.01
Accept-Encoding gzip, deflate
Accept-Language en-US,en;q=0.5
Cache-Control no-cache
Connection keep-alive
Content-Length 31
Content-Type application/x-www-form-urlencoded; charset=UTF-8
Cookie USERNAMEEMAIL=shan%40atlos.com; PHPSESSID=8asm46iltcqc9oahsbaaap1c16
Host localhost
Pragma no-cache
Referer http://localhost/test/
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:23.0) Gecko/20100101 Firefox/23.0
X-Requested-With XMLHttpRequest
I am getting a "Not Well Formed" error in Firefox; what is the problem here?
I am getting the data correctly in JSON form, but it also shows this error, which is very annoying.
The JavaScript code that makes the request is GetTopNotification, and the class it uses to make the Ajax request is Workspace.
Your response header is incorrect.
if(headers_sent()) die('Should not output data before json');
header('Content-type: application/json');
echo json_encode($data_for_json);
exit;
Also, nothing should be sent before the json, and nothing after it either.
In response to comment below:
Somewhere in your PHP code you're outputting JSON. However, as stated, your response header is incorrect: the Content-Type should be set to application/json; the above code does just that. A line-by-line walkthrough:
Checks whether you have already sent anything, and dies if you have
Sets the Content-Type part of your response header to the appropriate MIME type
Outputs the JSON (which, as it currently is, should be fine)
exit;
More updates in response to the comments
You're creating your JSON string manually, something I wholeheartedly advise against: build an array or object and then use json_encode to create your JSON.
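Roughly like this (a sketch; the keys and the $notifications variable are placeholders for whatever you currently concatenate by hand):
// Build the data as a plain PHP array and let json_encode produce the JSON.
$notifications = array(); // placeholder for your real data
$data_for_json = array(
    'status'        => 'ok',
    'notifications' => $notifications,
);

if (headers_sent()) die('Should not output data before json');
header('Content-type: application/json');
echo json_encode($data_for_json);
exit;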
I also added output_buffering over your code, just in case.
Give it a try, new code is here
Update three
In work-space.js, replace this line:
self.responseValue = self.getValueFromResponse( jqXHR );
With this
if(type != 'xml') self.responseValue = data;
else self.responseValue = self.getValueFromResponse( jqXHR );
Save it, clear the cache, and try again.
Okay, so I'm getting this weird, unexpected response from Internet Explorer while testing a file upload with Smarty in PHP.
Here is my Smarty code for the file upload (the view), simplified down to the main issue. For those who have not used activeCollab, Router::assemble just builds a URL from parameters that are read from the MVC.
(source: iforce.co.nz)
<div id="xero_invoice_manager_api">
{form action=Router::assemble('xero_invoice_manager_api') method=post id="xero_invoice_manager" enctype="multipart/form-data"}
<div class="content_stack_wrapper">
<input type="file" name="file_1" /><br/>
<input type="file" name="file_2" /><br/>
{wrap_buttons}
{submit success_event="api_updated" }Authenticate{/submit}
{/wrap_buttons}
{/form}
</div></div>
And here is my jQuery for the view.
App.Wireframe.Events.bind('api_event_finished.content', function(event, settings) {
App.Wireframe.Flash.success(App.lang('Xero Invoice Manager has saved/uploaded your Xero API data.'));
});
Here is my simplified controller (I have found that the issue is with Smarty and not PHP).
// api view
function api() {
    if ($this->request->isSubmitted()) {
        $this->response->respondWithData(true);
    }
}
Here is my controller with the upload occurring:
// api view
function api() {
    $this->assignSmarty();
    if ($this->request->isSubmitted()) {
        $this->XeroAuthUpdate(); // update everything
        if (isset($_FILES)) {
            $file_manager = new XeroFileManager();
            $file_manager->dumpFiles($_FILES);
            // upload the files
            foreach ($_FILES as $file) {
                $file_manager->handle_certificate_file($file);
            }
            // add the headers
            if (function_exists('headers_list')) {
                xeroDebugMode("[Controller] the headers to be sent are... ", headers_list());
            }
        }
        $this->response->respondWithData(array(
            // constraints
            'key_result' => (bool)$this->checkValue(XeroAuths::getSetting('xero_consumer')),
            'secret_result' => (bool)$this->checkValue(XeroAuths::getSetting('xero_secret')),
            // files: security certificates
            'publickey' => (bool)file_exists(XERO_PUBLIC_KEY_PATH),
            'privatekey' => (bool)file_exists(XERO_PRIVATE_KEY_PATH),
            'xero_auth' => (bool)validateXeroAuth(),
            // login constraints
            'install' => !$this->checkInstallRequirements(),
        ));
    }
}
Here is the response from Firefox with file_1 and file_2 not empty:
(source: iforce.co.nz)
Here is the response from Internet Explorer 9 with file_1 and file_2 empty (so far so good):
(source: iforce.co.nz)
Here is the problematic response from Internet Explorer 9 with file_1 (i.e. publickey.cer) and file_2 (i.e. privatekey.pem) not empty (download index.php, huh?):
(source: iforce.co.nz)
My response from activecollab
Hello Micheal,
Sorry for the late reply.
Unfortunately we cannot figure out where the problem is. It looks like everything is written OK but without dealing with the code itself there's pretty much nothing we can do. Dealing with JSON responses in IE works fine across activeCollab (well, not in IE6) since almost everything in aC 3 is based on JSON, which makes your issue specific and probably there's something wrong in your code.
Regards,
Oliver Maksimovic
activeCollab development & support
General and Pre-Sale Questions: 1-888-422-6260 (toll-free) Technical Support: support#activecollab.com
An associate has suggested:
Would suggest trying the following though:
1) open IE -> open the developer tools (press F12) -> Click "Cache" in menu -> click "Clear Browser Cache"... When thats finished click "Cache" and then click "Always refresh from server".
This forces IE not to cache anything; I've had numerous cases where IE was caching Ajax requests and causing some very strange behaviour.
Let me know if this fixes your problem. If so, we can add some PHP to your Ajax response to force all browsers to never cache the response; otherwise, if that still doesn't work, we'll probably need to do some JS debugging in IE to see what's being sent and compare it to your FF Firebug results.
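(The PHP he is referring to would presumably be the usual no-cache trio; as the headers_list() output below shows, my response already sends exactly these:)
header('Expires: Mon, 26 Jul 1997 05:00:00 GMT');
header('Cache-Control: no-cache, no-store, must-revalidate');
header('Pragma: no-cache');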
headers_sent() comes up blank, but here is the headers_list() output (taken just before respondWithData is called) for Internet Explorer:
2012-08-08 06:50:16 the headers sent from this request is... Array
(
[0] => X-Powered-By: PHP/5.3.8
[1] => Set-Cookie: ac_activeCollab_sid_yhRk0xSZku=1%2Fhkykz0Rw0796e4lDykXekNXvhMMxC8pV4akJPMvA%2F2012-08-08+06%3A50%3A15; expires=Wed, 22-Aug-2012 06:50:15 GMT; path=/
[2] => Content-Type: application/json
[3] => Expires: Mon, 26 Jul 1997 05:00:00 GMT
[4] => Cache-Control: no-cache, no-store, must-revalidate
[5] => Pragma: no-cache
)
Response Headers from Raw tab on Fiddler, on Internet Explorer
HTTP/1.1 200 OK
Date: Sat, 11 Aug 2012 08:08:46 GMT
Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/1.0.0e PHP/5.3.8 mod_perl/2.0.4 Perl/v5.10.1
X-Powered-By: PHP/5.3.8
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: ac_activeCollab_sid_yhRk0xSZku=11%2Fz8rWxiRchAh8EWinYO2d7a1mmvn2DMKUdse1vfKh%2F2012-08-11+08%3A08%3A46; expires=Sat, 25-Aug-2012 08:08:46 GMT; path=/
Content-Length: 107
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/json; charset=utf-8
{"key_result":true,"secret_result":true,"publickey":true,"privatekey":true,"xero_auth":true,"install":true}
Response Headers from Raw tab on Firefox.
HTTP/1.1 200 OK
Date: Sat, 11 Aug 2012 08:13:45 GMT
Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/1.0.0e PHP/5.3.8 mod_perl/2.0.4 Perl/v5.10.1
X-Powered-By: PHP/5.3.8
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: ac_activeCollab_sid_yhRk0xSZku=12%2FO40CbXC9Vfa7OVnderlK2MFnvnpkyeckvO0Ab5NQ%2F2012-08-11+08%3A13%3A45; expires=Sat, 25-Aug-2012 08:13:45 GMT; path=/
Content-Length: 107
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/json; charset=utf-8
{"key_result":true,"secret_result":true,"publickey":true,"privatekey":true,"xero_auth":true,"install":true}
Any ideas on what I'm doing wrong with IE, and why Internet Explorer prompts the user to download index.php when the file fields contain values? Keep in mind that no actual uploading happens on the server side (during this initial test, the index.php download prompt has nothing to do with move_uploaded_file).
It could be that IE-specific code has an error, so the returned content type is different. If you make an AJAX request for some kind of XML or JSON data and instead get an HTML error response with a different content type or disposition than expected, the browser might not know what to do with it.
You might want to find a way to view or log the response (as opposed to request) headers sent by the web server. Usually a prompt for file download comes from a content-disposition header... though in this case it might just be because it's a file coming from an asynchronous request.
You might also want to look at:
IE prompts to open or save json result from server
and
How can I convince IE to simply display application/json rather than offer to download it?
I had a similar issue using Plupload and MVC3. I know we use different technologies, but maybe my issue could help you. I had this:
public JsonResult UploadDoc(string correlationId)
{
    try
    {
        // upload code here
        return Json(new { message = "chunk uploaded", name = "test" });
    }
    catch (Exception ex)
    {
        return Json(new { message = "chunk uploaded", name = "test" });
    }
}
Now, every time I tried to upload a file, IE would ask me to open or download a file which just contained the JSON response above. If I set my return type to string and set my return code as:
return "{\"respCode\" : \"200\", \"Msg\" : \"succussful\",\"mimeType\": \"" + Request.Files[0].ContentType + "\", \"fileSize\": \"" + Request.Files[0].ContentLength + "\"}";
Then the file was uploaded successfully. The response header when it failed: "Content-Type: application/json; charset=utf-8". The response header when it works with the string return type: "Content-Type: text/html; charset=utf-8". Hope it helps, cheers.
Due to the lack of answers, I think I need to take a different approach in my jQuery until an actual solution is found.
I have a website whose content I update approximately once a month. When I check the HTTP response header fields, I get the following output:
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Given the frequency at which I update content, I am thinking about manually setting these fields to allow caching of the site. I am using the PHP header() command to do so.
Therefore, my question is: what should my Expires, Cache-Control, and Pragma response header fields be set to? Also, should I be setting any other fields in addition to those?
You could look into using ETAGs - http://en.wikipedia.org/wiki/HTTP_ETag
Your Expires header should be the date in the future at which time the content will expire and caches will be forced to fetch it again.
Get rid of the Pragma header
For Cache-Control you can add:
public, max-age=2592000
Assuming you want it cached for 30 days (2592000 seconds).
For greater control you should follow hafichuk's advice and use ETags.
For references on cache headers check out Headers
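Putting that together in PHP might look roughly like this (a sketch; the 30-day lifetime mirrors the max-age above, and header_remove() needs PHP 5.3+):
// Allow caches to keep the page for 30 days and drop the session's Pragma header.
header('Cache-Control: public, max-age=2592000');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 2592000) . ' GMT');
header_remove('Pragma');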