I have a requirement to make a page available even if Apache is down for any reason.
The way I thought of doing it is to make the page "cached" so that it is always available.
However, I have two problems:
- I want the page to always be available (maybe I can set the cache limit to a very big number)
- Once the page is cached, the browser always retrieves the cached copy even when Apache is up.
So can anyone advise on what I should do here? Maybe there is a better alternative to the one I am using?
This is the code I use for reference:
<?php
session_cache_limiter('public');
$cache_limiter = session_cache_limiter();
session_cache_expire(60);
$cache_expire = session_cache_expire();
session_start();
echo "hello world 2222";
?>
Thanks in advance
John
I'm not sure how this would work. If apache is down, how will this default page get served up? How will the client be directed to the web root? Who is telling the client where this default page is located?
I'm very interested in the idea of "the page is cached". Have you had any success with this after taking apache offline? Does it require that the browser visit the page once before in order to cache the page?
Here's an odd addition to the idea. How about caching some javascript into the page? The javascript attempts to make an ajax call. If it is unsuccessful, it assumes Apache is down and then redirects the user to another server's webpage, or rewrites the entire page with the "Server is down" page you have in mind.
Not sure it's worth the resources, but it's an interesting idea.
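A minimal sketch of that fallback idea. The `/ping.php` probe URL, the element id, and the `render_fallback_page()` helper are all made up for illustration; the assumption is that the page itself is served with long-lived cache headers so the browser can still show it when Apache is unreachable.

```php
<?php
// Sketch: serve the page with aggressive cache headers, and embed
// JavaScript that probes the server. If the probe fails, the script
// assumes Apache is down and rewrites the page content.
// '/ping.php' and the 'status' element id are hypothetical.

function render_fallback_page(): string {
    return <<<'HTML'
<div id="status">Loading…</div>
<script>
// Probe the server; on failure, assume Apache is down.
fetch('/ping.php', {cache: 'no-store'})
    .then(function (r) { if (!r.ok) throw new Error('bad status'); })
    .catch(function () {
        document.getElementById('status').textContent =
            'Server is down - showing cached copy.';
    });
</script>
HTML;
}

header('Cache-Control: public, max-age=31536000'); // cache for a year
echo render_fallback_page();
```

Note this still only works if the browser has visited (and cached) the page at least once before the outage.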
Thanks all for your answers
I managed to do it by setting a cookie that contains the "dc" parameter to be appended to ajax calls.
The whole page uses ajax.
So I make a dummy request at the start of the page; if I get no response, I fall back to the cached request using the "dc" parameter stored in the cookie.
You can't do this using caching.
In order to handle the request and detect the non-availability of the webserver, you need something which can process http requests and then substitute content when Apache is not available... i.e. a second webserver. And if you've got a second webserver, why not just load balance them.
Apologies if this question duplicates some other question, but I can't find one exactly like it in S.O.
I am writing a remotely hosted app, the kind you run by putting a script tag on your own website page with src="some remote javascript.js". The script performs every operation as a JSONP ajax call. That means a lot of JSONP housekeeping, but otherwise it works surprisingly well.
In the main remote js script, I set a user cookie when the user logs in. It works fine, the cookie is set for a year, and when you return to the page it continues to recognize you.
However, when I try to output the cookie (even after it has been set) using php, my php code does not see it for some reason.
If I alert(document.cookie); the cookie is displayed.
If I do a var_dump($_COOKIE); php returns array(0) { }.
This isn't a "you have to reload the page after setting the cookie with javascript" problem.
As far as I know, when I use the Firefox Web Developer extension to View Cookie Information, it is all happening on the same domain.
Looking over many other examples, it is clear that PHP should be able to read a cookie, even if set by javascript.
Even as I write this, I think a glimmer of the problem is starting to form in my head: (possibly) a JSONP'd PHP script isn't going to see a cookie set by javascript, since the JSONP request goes to the remote domain rather than my own.
I've searched but haven't been able to find any discussion on this topic:
I have a private webpage (implemented in PHP) that scrolls through a few different PHP pages displaying info, refreshing to the next one every minute, for use in a fire station displaying the latest jobs, weather etc. I currently use header( 'Refresh: 60; url=screen2.php' ); in each PHP file, simply pointing to the next file.
The problem is that every once in a while the page fails to load for one reason or another, at which point the "Server not found" page is displayed. When that happens, of course, the refresh instruction is lost and the error page stays there until someone notices the problem and manually refreshes the page. Not ideal..
I should mention the page is displayed on multiple monitors around the fire station and the pc running it is locked away. Hence the hassle of manually refreshing it every so often!
It's important if possible to automatically recover from this situation without human intervention. Is there any way, using frames, php, javascript or otherwise, to refresh again if the error page is shown?
Any ideas would be greatly appreciated.
Thanks!
Adam
You could just use javascript window.location inside a setTimeout function or a meta refresh tag.
There's no reason to need to do it on the PHP side unless you care whether the user can change the refresh information.
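A sketch of that suggestion, emitting both a meta refresh tag and a javascript setTimeout fallback from PHP. The `refresh_snippet()` helper is made up for illustration; 'screen2.php' and the 60-second interval come from the question.

```php
<?php
// Sketch: emit both a meta-refresh tag and a JavaScript timer, so the
// rotation to the next screen happens client-side even if the server's
// Refresh header is lost. refresh_snippet() is a hypothetical helper.

function refresh_snippet(string $next, int $seconds = 60): string {
    $url = htmlspecialchars($next, ENT_QUOTES);
    return "<meta http-equiv=\"refresh\" content=\"$seconds;url=$url\">\n"
         . "<script>setTimeout(function () {"
         . " window.location.href = '$url'; }, " . ($seconds * 1000) . ");"
         . "</script>";
}

echo refresh_snippet('screen2.php', 60);
```

The javascript timer only helps while the page itself loaded; it does not recover from the browser's own "Server not found" page, which is the harder case discussed below.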
You say it "fails to load for one reason or another" - do you know why this is? If the problem is at your server end, then you could set up your server so that its 404 response (or 500 internal server error response, or whatever error it is) contains the header to redirect to the start PHP page.
If the issue is DNS or connection related, i.e. the server isn't even contacted and so the "not found" error page is being displayed by the browser, then what you could do is alter this page. This is not easy to do, but in Firefox you can e.g. follow the instructions on http://forums.mozillazine.org/viewtopic.php?f=7&t=492177&start=0 in order to customise the error page shown.
Other than that, it might be better to fix the problem with the page failing to load; that might be simpler :)
You could also save the information in a session.
http://php.net/manual/en/features.sessions.php
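As a sketch of the session idea: keep the rotation index server-side, so each request just asks "what's next?". The `next_screen()` helper and the screen file names are made up for illustration; in a real page the array passed in would be `$_SESSION` after `session_start()`.

```php
<?php
// Sketch: advance a rotation index stored in the session and emit the
// Refresh header for the next screen. next_screen() is hypothetical.

function next_screen(array &$session, array $screens): string {
    $session['screen'] = (($session['screen'] ?? -1) + 1) % count($screens);
    return $screens[$session['screen']];
}

// In a real page this would be $_SESSION after session_start();
$session = [];
$screens = ['screen1.php', 'screen2.php', 'screen3.php'];

header('Refresh: 60; url=' . next_screen($session, $screens));
```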
Another way that is server-independent is the Web Storage API of javascript. Most modern browsers already support it.
I use KO3.1 & php 5.3.3
In my controller 'action_lang' with route 'lang/code' I set user UI language and save it into the cookie with:
Cookie::set('language', $code)
Right after this I call:
Request::current()->redirect('/')
to move to the main page, where I have
echo Cookie::get('language')
to print current language.
The problem is my browser reads the redirected page from its CACHE, thus Cookie::get('language') always shows the SAME value. The only way is to force the browser to refresh with F5; then it changes as expected, but that's not the way it should work. It should change instantly!
Is there something wrong with this method? or its just me too tired...
I'm sure this worked fine some time ago, with the very same browser and KO2.
please help
The behavior of the browser might actually be correct, if the cache headers indicate it. The correct solution would be to forbid caching of the URI contents. However, some browsers have issues here, and you may not want to forbid the browser from caching the URL at all.
A simple solution to fool the browser into reloading the page in your exact case is to append a random parameter to the URL, like ?refresh=$time with $time being the current timestamp.
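A sketch of that cache-busting trick. The `cache_busted()` helper is made up for illustration, and Kohana's Request::redirect() is replaced by a plain Location header:

```php
<?php
// Sketch: append a throwaway ?refresh= parameter so the browser treats
// the redirect target as a new URL and won't serve it from cache.
// cache_busted() is a hypothetical helper name.

function cache_busted(string $url): string {
    $sep = (strpos($url, '?') === false) ? '?' : '&';
    return $url . $sep . 'refresh=' . time();
}

header('Location: ' . cache_busted('/'));
```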
How to make sure a human doesn't view the results from a PHP script URL?
Recently when viewing the source of a site that was making an AJAX call, I tried to follow the link in the browser
www.site.com/script.php?query=value
Instead of getting the result I expected to see, I saw a message stating only scripts should view that page.
How do you restrict a page so that only a script can access it?
UPDATE:
Here is the DEMO page
Short answer: you can't.
Long answer: You can make it harder to do it by requiring special header values in the HTTP request (setting Accept to application/json is a common one). On the server side just check to make sure that header is set to the value you expect. This will make it so that regular users will get the message you mention and your scripts will work just fine. Of course advanced users will be able to easily work around that sort of limitation so don't rely on it for security.
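A sketch of that header check on the server side. The `is_script_request()` helper is made up for illustration; as the answer says, any client can forge the Accept header, so this is a deterrent and not security.

```php
<?php
// Sketch: require Accept: application/json and refuse everything else.
// Regular browser visits get the "scripts only" message; scripts that
// set the header get the real response. is_script_request() is
// a hypothetical helper name.

function is_script_request(array $server): bool {
    return ($server['HTTP_ACCEPT'] ?? '') === 'application/json';
}

if (is_script_request($_SERVER)) {
    echo json_encode(['query' => $_GET['query'] ?? null]);
} else {
    http_response_code(403);
    echo 'Only scripts should view this page.';
}
```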
With PHP you can check for this header and only display results if the page is called via ajax:

function isAjax() {
    return (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest'));
}

if (isAjax()) {
    // display content
} else {
    // not ajax, don't show
    echo 'Invalid Request';
}
You can't. A human being can always spoof the request. You can send your request with a post variable, to make sure a human doesn't end up on the page by accident.
One possible solution is to check the HTTP request for its origin.
Another solution is to send a "password" with every request. Take a look at this tutorial on how to do this.
But it's never 100% secure; it only makes it harder for possible intruders.
As Tim stated, this script is almost certainly looking for this request header, which is being sent with each request to rpc.php (found via the net panel in firebug, naturally):
X-Requested-With : XMLHttpRequest
As to cross-browser compatibility, the setRequestHeader method appears to be available with both the ActiveX and native XMLHttpRequest objects, so this should work in all major modern browsers.
If you are calling the script by AJAX, then it MUST be accessible to you, because an AJAX call is similar to your browser actually asking for the page; thus it is not only script-accessible but accessible to anyone.
If it was actually called by PHP or by some other means, you could "maybe" use Apache rules or PHP scripting to diminish the accessibility.
You could set a secret value into the php session with the 'view' script and check for it with the ajax scripts.
- Request 'index.php' with the browser.
- PHP builds the page, saves a key into the session, sends the content back to the browser.
- The browser gets the page content and makes some ajax request to your site.
- Those ajax scripts also have access to the same session your main page did, which allows you to check for a key.
This ensures only authenticated browsers are allowed to make the ajax requests.
Don't count on the ajax request being able to write to the session though. With many requests being satisfied at the same time, the last one in will be the last one written back to your session storage.
http://us.php.net/manual/en/book.session.php
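A sketch of that session-key handshake. The first helper would run in index.php when building the page; the second would run in the ajax endpoint. The key name 'ajax_key' and both function names are made up for illustration.

```php
<?php
// Sketch of the session-key handshake: index.php mints a key into the
// session and embeds it in the page; each ajax endpoint verifies the
// key sent back by the browser. Names here are hypothetical.

// --- in index.php: mint a key while building the page ---
function mint_key(array &$session): string {
    $session['ajax_key'] = bin2hex(random_bytes(16));
    return $session['ajax_key']; // embed this in the page for the ajax calls
}

// --- in the ajax endpoint: verify the key the browser sent back ---
function key_is_valid(array $session, ?string $sent): bool {
    return isset($session['ajax_key'])
        && $sent !== null
        && hash_equals($session['ajax_key'], $sent);
}
```

In real code both functions would operate on `$_SESSION` after `session_start()`; plain arrays are used here so the sketch is self-contained.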
A lot of open source applications use a variation of this on top of every php file:
if (!defined('SOMETHING')) {
    die('only scripts have direct access');
}
Then in index.php they define SOMETHING:
define("SOMETHING", "access granted.");
edit: I'm not saying this is a good approach btw
edit2: Seems I missed the part about it being an ajax request. I agree in this case this isn't a solution.
I have a PHP redirect page to track clicks on links. Basically it does:
- get url from $_GET
- connect to database
- create row for url, or update with 1 hit if it exists
- redirect browser to url using Location: header
I was wondering if it's possible to send the redirect to the client first, so it can get on with its job, and then log the click. Would simply switching things around work?
You can do this:
- get url from $_GET
- send Location: header
- call flush()
- connect to database
- create row for url, or update with 1 hit if it exists
This will not necessarily cause the browser to go on its way immediately, since your connection is still being held open and what the browser does about that is up to it, but (barring conditions that cause flush() to not work) it will at least get the Location: header out to the browser so it can do the redirect if it likes.
If that's not good enough, I'd recommend dumping your data to per-minute or per-second files that a script then picks up for postprocessing into the database. If that's not good enough, you can play with pcntl_fork(); this is an extremely hairy thing to do under Apache SAPI.
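A sketch of that flush-first ordering. The `log_click()` helper stands in for the database insert and is made up for illustration:

```php
<?php
// Sketch: send the Location header, flush it to the client, then do
// the slow logging work afterwards. log_click() is a hypothetical
// placeholder for the real database insert.

function log_click(string $url): void {
    // placeholder for:
    // INSERT INTO clicks (url, hits) VALUES (?, 1)
    //   ON DUPLICATE KEY UPDATE hits = hits + 1
}

$url = $_GET['url'] ?? '/';

header("Location: $url", true, 302);
header('Content-Length: 0'); // no body, so the client need not wait for one
flush();                     // push the headers out before the slow work

log_click($url);             // browser may already be on its way
```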
Some databases (e.g. MySQL with MyISAM tables) let you run a delayed insert, so it'll only be processed when the database is idle. You're most likely best off doing that...so long as the data isn't critical (you'll lose the query if mysqld dies or the RAM is flushed).
http://dev.mysql.com/doc/refman/5.1/en/insert-delayed.html
That is fine. PHP will finish executing before flushing the redirect header to the client. If you still have doubts try saving this in a file and loading from browser:
<?php
header('location: /');
sleep(2);
?>
The redirect should not happen for 2 seconds.