I am experiencing some strange behaviour on shared hosting. I am connected by FTP, and when I edit a file and save it, it takes at least a few minutes for that change to take effect. For example, I put the line echo "test";die; in my index.php file and save it: the program (I am using FileZilla) shows that the file has been uploaded to the server. Just to be sure, I run cat index.php (I am connected by PuTTY) and I can see that the change has in fact been made. But when I open the page in the browser, it works as before, without showing my "test". If I wait a few minutes and refresh the page, it shows the "test". I have cleared the browser cache (though I don't think it matters in this case; I also tried refreshing the page with CTRL+F5), but the changes still only take effect after a few minutes. The same thing happens when I delete that line and double-check that the deletion is saved: for a few minutes I still see that echo, even though there is nothing in the file any more.
So, is there some kind of cache in Apache, such that even when I change the files on the physical drive it keeps serving the old version and only updates its cache after a few minutes?
Thanks
I believe that if Varnish is set up correctly you can turn it off via PHP like so:
header('Pragma: no-cache');
header('Cache-Control: private, no-cache, no-store, max-age=0, must-revalidate, proxy-revalidate');
header('Expires: Tue, 04 Sep 2012 05:32:29 GMT');
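If you want to confirm that Varnish is actually in the request path before relying on these headers, here is a hedged check from PHP (the URL is an assumption, and the header names vary between setups):
// responses served from Varnish's cache typically carry an Age header > 0,
// often alongside an X-Varnish header; their absence suggests no cache hit
$headers = get_headers('http://example.com/index.php', 1); // URL is an assumption
var_dump(isset($headers['Age']) ? $headers['Age'] : 'no Age header in the response');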
Files are being saved, but the PHP script's behaviour did not change?
Try this in .htaccess; with newer PHP versions the opcode cache (OPcache, the successor to APC) is enabled by default:
php_flag opcache.enable Off
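If the host ignores php_flag in .htaccess, a hedged alternative (assuming OPcache really is the culprit) is to flush the cache once from a throwaway script:
// run this once via the browser, then delete the file;
// opcache_reset() clears every cached script in one go
if (function_exists('opcache_reset')) {
    var_dump(opcache_reset()); // true means the opcode cache was flushed
}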
I am working with scanning images.
Today I made a step forward, but I think I may need something more in the headers.
Currently a scan request is POSTed to an app running on an Apache server with PHP 7.0. I receive that POST, and today I figured out that I needed to reply with a custom 201 header.
I have the following custom headers set in PHP
header('HTTP/1.1 201 Created');
header('Location: /eSCL/Scans');
header('Cache-Control: no-cache, no-store, must-revalidate');
After doing this, I was able to see that a request is now made for an image, from the app to the Apache server, for /eSCL/Scans/NextDocument. "NextDocument" is actually a softlink to a .jpg (but may later also be a PDF). The app GETs this file, and I can see that the server replies with the JPG file. That image should display in the scan app that made the request, but it never does.
I think, however, that I need something in the headers to tell it the file is a JPG, especially given that it loads from a softlink with no extension. I tried the following, with the same result:
header('HTTP/1.1 201 Created');
header('Content-type:image/jpeg');
header('Location: /eSCL/Scans');
header('Cache-Control: no-cache, no-store, must-revalidate');
There must be some magic soup that causes the image to display in the app. I have tried both VueScan and the Mopria Android app, with the same results.
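One thing I am considering (a minimal sketch; the on-disk path and endpoint wiring are assumptions) is streaming the softlinked file through PHP so the Content-Type is always sent explicitly instead of being guessed from the missing extension:
// hypothetical handler for GET /eSCL/Scans/NextDocument
$file = '/var/www/eSCL/Scans/NextDocument'; // path is an assumption
if (is_readable($file)) {
    header('Content-Type: image/jpeg');           // tell the app what it is getting
    header('Content-Length: ' . filesize($file)); // some clients insist on a length
    readfile($file);                              // stream the bytes unmodified
    exit;
}
http_response_code(404);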
Thanks for any ideas in advance
I can see that in specific cases, after a setcookie() call, when I reload the page (or re-enter it), the cookies are not set unless I press CTRL+F5 (a cache-cleared request).
Have you ever come across this? It happens only on the HTTP version of the page, not on HTTPS.
This didn't help:
header('Expires: Sun, 01 Jan 2014 00:00:00 GMT');
header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");
What should I do to get the cookies read on the next page load?
Cookies generally are hard to debug when they go wrong. What you are describing does seem to be related to caching but it sounds very odd.
Here's what you can try doing to see what's wrong.
Make sure you are setting the cookie with the correct domain and path. If your page is at e.g. www.example.com, the domain should be either .example.com or www.example.com. The same idea applies to the path: if you access www.example.com/path, then the path needs to be either / or /path.
If you need the cookie to be available on http then you should not set the secure attribute on it.
Make sure the cookie is not already expired when it is set; browsers will probably ignore it if it is.
A few basic things to check.
Assume you have this setcookie() line:
setcookie('name', 'value', time() + 10000, '/test', '.example.com'); // name, value, expires, path, domain
Open the Chrome (or your favourite browser's) developer console and when you make the request check the cookies being set. There should be something along the lines of:
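(A hedged reconstruction from the setcookie() call above; the exact expires date depends on when the request is made, and newer PHP versions also append a Max-Age attribute.)
Set-Cookie: name=value; expires=<time() + 10000 as an HTTP date>; path=/test; domain=.example.com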
The duration must not be 0 and the rest should be correct.
When you make the next request the cookie should be sent as well.
This should be an entry in the "Request cookies".
If the cookie is received but then not sent back to the server, it might be worth opening the browser settings and finding the stored cookie manually for any hints as to why this is happening. In Chrome this would be in chrome://settings/siteData.
If you still don't find anything wrong with any of these, then check whether there is any intermediate caching layer running that would override the no-cache headers you are setting.
I found that there was some kind of Nginx caching enabled, which caused the browser not to re-request the page from the server.
I fixed it by including a random string in the URL, voiding the cache for that page call.
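For reference, a minimal sketch of that workaround (the link target is an assumption):
// append a random token so every generated link bypasses any URL-keyed cache
echo '<a href="/page.php?nocache=' . mt_rand() . '">reload</a>'; // target URL is an assumption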
I am facing a weird issue when I move a file to the live server. I have an XML file that is read by jQuery, and its contents are displayed in the HTML page. Yesterday I made some changes to the XML file and uploaded it to the live server. It works perfectly locally, but the live server keeps returning the old XML values. I even completely removed the file and uploaded the new one.
I thought it might be reading the file from somewhere else, so I deleted the file and checked; that produced an error, so it is reading that same file. I opened the file on the live server itself and everything is correct, yet it still shows the old content. I don't know what is going wrong on the live server.
Can anyone help me figure this out?
Try adding a no-cache header in your PHP, like:
header("Cache-Control: no-cache, must-revalidate");
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
Or try accessing your XML file with a random integer appended, like:
your_xml_file.xml?id=<?php echo time(); ?>
Or, if in JS, use a URL like:
var url = "http://www.somesite.com/your_xml.xml?"+new Date().getTime();
I have been trying to find the reason for this error for weeks now, and I have come up blank. The system uses PHP to generate dynamic .pdf files.
I have three servers: Dev (Win7 with Apache2), Test (Ubuntu 10.4 with nginx), and Live (Ubuntu 10.10 with nginx). All are running PHP 5 and the system I have developed: the same code and an equivalent config.
I have tested with many browsers: DevIE (Win7, IE8), DevFF (Win7, Firefox 3.5), DevSaf (Win, Safari), LaptopFF (WinXP, Firefox 3.5), LaptopIE (WinXP, IE8), Test (Ubuntu, FF3.5), and users (mostly IE8 on Win7 and WinXP).
When I generate a PDF from Test it works correctly in all browsers (except Users which I can't test).
When I generate a PDF from Dev it fails from DevIE, DevFF and DevSaf, but calling for it from Test works.
Apache2 always fails from the same machine.
From the laptop, using FF succeeds, and using IE8 fails (see below).
The users are reporting intermittent problems: it fails, then they repeat the request and it succeeds.
When it fails....
The log shows the generated PDF being sent with the right sort of size (500KB to 1.8MB) and a 200 OK result. This is sometimes followed about 10 seconds later by a repeat of the same URL, but that one returns the log-on screen (again a 200 OK reply, but only 2K in size). The implication is that it was requested without the cookie.
Adobe Reader tries to display the log-on page, with the inevitable 'This file does not start with "%PDF-"' error message.
Except when I try with the laptop and IE8: then it fails, with View Source showing a 4-line HTML file with an empty body!
The system had been working for over a year, and only started failing with a change of production server about 2 months ago. The test version was not changed at that time, but started to fail as well.
I have tried all sorts of headers, but nothing I have tried makes any difference. The current set of headers is:
header('Content-Disposition: inline; filename="'.$this->pdfFilename().'"');
header('Content-type: application/pdf');
header("Pragma: public");
$when = date('r',time()+20); // expire in 20 seconds
header("Expires: $when");
I've tried replacing inline with attachment, and adding and removing all sorts of no-cache headers, all to no avail.
The PDF is requested in a new window by JavaScript, and is followed 8 seconds later by a refresh. I have tested without the new window and without the refresh: no change.
I have had a few (small) PDFs served by the Dev server, so I have raised every limit I can think of; now it always fails.
So I have a Windows Apache2.2 server that fails when browsed from the same machine and succeeds when browsed from other machines in Firefox.
There is no proxy or cache mechanism involved other than that in the browsers.
Has anyone any ideas about what might be going wrong? As I said, I have been testing and eliminating things for nearly 4 weeks now, on and off, and I have not yet even identified the failing component.
This is really tough to troubleshoot. For starters (please excuse my bluntness), this is a prime example of what a pipeline should not look like:
Three different operating systems.
Probably at least two different versions of PHP.
Two different webservers.
But anyway, a few general hints on debugging PHP (a sketch of the PHP-side settings follows this list):
make sure to enable error_log and log_errors in php.ini (and set display_errors = Off)
use the most verbose error_reporting
set access_log and error_log in nginx
crank up the log level in nginx (I'm guessing you use php-cgi or php-fpm, so you should be able to see what status the backend emits when the download attempt fails).
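A sketch of the PHP-side settings, set at runtime at the top of the failing endpoint (the log path is an assumption):
// make every PHP problem visible in a log instead of in the PDF byte stream
error_reporting(E_ALL);
ini_set('display_errors', '0'); // never mix warnings into binary output
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/php/pdf-errors.log'); // path is an assumption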
Furthermore:
You haven't shared how the PDF is generated. Are you sure all the libraries used are the same, or at least roughly the same, across all systems?
In any case, just to be sure, I would save the PDF on the server before it is offered for download. This lets you troubleshoot the actual file, to see whether the PDF generation actually worked.
Since you're saving the PDF, I'd see about putting it in a public folder, so you can check whether you can simply redirect to it after it's generated. Only once this works would I move on to a force-download kind of thing.
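A minimal sketch of that idea ($pdfData and the paths are assumptions):
// keep a copy of the generated bytes for inspection, then redirect to the static copy
$publicFile = '/var/www/public/pdf/debug-' . time() . '.pdf'; // path is an assumption
file_put_contents($publicFile, $pdfData);
header('Location: /pdf/' . basename($publicFile)); // let the webserver serve it as a plain file
exit;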
I would replicate the production environment in all stages. ;-) You need your dev server to be exactly like the production environment. For your own workstation, I'd recommend a VM (e.g. through VirtualBox with Ubuntu 10.10).
Let me know if this gets you somewhere and reply with updates. :-)
Update:
I'd investigate these two headers:
header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // Date in the past
They definitely help with cache busting.
These are the headers, which finally worked in a similar situation in one of my apps:
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false);
header( "Content-Type: application/pdf" );
header("Content-Disposition: inline; filename=\"YourPDF_" . time() . ".pdf\";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ". strlen( $pdfData ) );
I added the time() call to make the filename change each time, so that it is likely to get past any proxies.
From time to time, though seldom, the problem reappears. When it does, we ask our clients to download the file using the browser context menu.
PS: The app uses ezPDF found here: http://www.ros.co.nz/pdf/
I am designing a website in PHP. After completing it, I uploaded everything to the server.
The page was working fine on localhost, but after uploading, the page does not even load.
At the top of every page I include a file called startsession.php. Its contents are as follows:
session_start();
header("Pragma: no-cache");
header("Cache-Control: no-cache, must-revalidate");
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
If I remove the session_start(), it works fine. The details of the error are as follows:
Page in which the error occurred: Unknown
Line number on which the error occurred: 0
Details of the error: Unknown(): open(/tmp/sess_723d94fdc8ae3569b1a641fd8799ece9, O_RDWR) failed: No such file or directory (2)
Error code: 2
Please help me
Check and make sure the path (/tmp) is writable by the user running the web server, as this may be a permissions problem. Also, you need to call header() BEFORE session_start(), as noted in the documentation:
http://php.net/manual/en/function.session-start.php
since session_start() may actually send the HTTP headers itself. Also, always make 100% certain you're using:
error_reporting(E_ALL)
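And to test the permissions theory directly, a quick snippet to run on the live server (hedged; an empty save_path usually means the /tmp default):
// where does PHP try to store sessions, and can the webserver user write there?
$path = session_save_path() ? session_save_path() : '/tmp';
var_dump($path, is_dir($path), is_writable($path));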
You can use your own directory for saving sessions.
Put the code below above your session_start() call:
$sessdir = dirname(dirname(__FILE__)) . '/session_dir';
if (!is_dir($sessdir)) { mkdir($sessdir, 0700, true); } // the directory must exist and be writable
ini_set('session.save_path', $sessdir);
Enjoy!
I had a different problem: no errors, but sessions were not working after installing PHP and Apache on Linux. I spent hours researching this and finally solved the problem by simply deleting the contents of /tmp.
Thanks to KramerC