First of all, I'm sorry about the title. I couldn't find a better one.
I have an image file generated by a PHP script. The script (image) is connected to a database and saves its referrer URL in a table.
Because the output image doesn't change, I think it's better to cache it.
But as far as I know, if I cache one file (for example http://www.example.com/img.png.php), the browser reads it from its cache on every page, and that's not good for my script: on the first call it saves the referrer URL and gets cached by the browser, and on later calls from different websites (referrers) the cached version is used, the browser doesn't send any request to the server, and the referrer URL never gets saved in the database.
Can I tell the browser to cache one copy of the image per domain?
I mean:
1. http://www.abc.com/index.html sends a request to get my image (script). The browser checks its cache, doesn't find it, and fetches it from the server; the PHP script saves the referrer URL.
2. The user goes to another page of abc.com (for example http://www.abc.com/about.html). The browser checks its cache and finds the image, so it doesn't send a request to the server for the file content, and the PHP script doesn't run.
3. Another site (http://www.efg.com/index.html) sends a request to get my image (script). The browser checks its cache, does NOT find it, and sends a request for the file content, so the PHP script runs.
Is it possible?
(sorry for the long text, with lots of grammatical problems)
You could use a redirect page (that is not cached) that saves the referrer to your database and then redirects to the cached image.
That way you always get a hit but the actual image is cached.
In your HTML you could use:
<img src="/image.php">
And in image.php:
<?php
// save the referrer in here
header('Location: /image.jpg');
?>
and /image.jpg is your actual image (which can be cached)
First of all, think about the user's experience: Do you really need to increase page load time just for the referrer feature? Also, you should be aware that many browser/privacy tool configurations suppress or don't send the Referer header in the first place.
If you're really sure that you want the resource (JavaScript, stylesheet, image, ...) to load each time, you can send the Cache-Control HTTP header with the resource to prevent caching. For example, to prevent caching of referer.js when served with Apache, add the following .htaccess file in the same directory (requires mod_headers):
<FilesMatch "^referer\.js$">
Header set Cache-Control no-cache
</FilesMatch>
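If the resource is a PHP script, as with the image in this question, the same effect can be had by sending the header from PHP itself. A minimal sketch, where the referrer logging and the image file name are only placeholders:
<?php
// img.png.php - record the referrer, then serve the image uncached
// ... save $_SERVER['HTTP_REFERER'] to your database here ...
header('Cache-Control: no-cache');   // same directive as the .htaccess rule above
header('Content-Type: image/png');
readfile('img.png');                 // placeholder: the static image to send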
Seems to be a counter, am I right?
AFAIK you cannot do exactly what you've explained.
But you can always "cache" an image on the server side so you don't need to redraw it:
<?php
/*
do some stuff
*/
// send an image: the content-type first
header('Content-type: image/png');
// and the image
readfile('myImage.png');
Related
I have a PHP file which generates a file to be downloaded using POST parameters. I already have everything working to resume file downloading (using the Range header and everything related).
However, if I pause the download and then try to resume it, the browser does not send the POST data in the request. This works fine using GET data instead, but I'd like to keep using POST. How could I make this work?
Notes:
The file is generated on the fly and sent to the browser (simply printed by PHP) using the right headers.
I cannot save the file somewhere and serve it. It has to be served on the fly.
*Sorry for my bad English.
Well, let's say that you have a database table to save the POST params.
Then you need to create a unique URL for the download based on those params.
Let's say the download link is download.php <- you send the POST there.
Now you save the params in that script, and let's say you get back a unique_id.
After that you redirect to a new page (with the unique_id as a parameter) that processes the download, for example resume_download.php.
An example download URL would be resume_download.php?req=[your_unique_id], or you can use .htaccess to make the URL more friendly.
This doesn't guarantee the download will resume, but at least the user doesn't need to re-enter the form.
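A minimal sketch of that flow, assuming a PDO connection and a download_requests table (both names are only placeholders):
<?php
// download.php - receives the POST, stores the params, redirects
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO download_requests (params) VALUES (?)');
$stmt->execute(array(json_encode($_POST)));
$unique_id = $pdo->lastInsertId();
header('Location: resume_download.php?req=' . $unique_id);
exit;
Then resume_download.php looks the row up by req, rebuilds the parameters, and serves the file with the Range handling you already have; since the download is now a plain GET URL, the browser can resume it without re-sending the form.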
Can I detect if the web browser is requesting a certain image from the server?
I want to check whether the user downloads the image or whether it is already cached by their browser.
The main idea:
I am counting unique visitors per profile page. I use IPs and cookies for now but want to add this too. An IP can change easily, and a cookie can be blocked or deleted.
My idea is to use this information just like a Flash cookie. The image will be 1px x 1px in size and invisible to the user. I don't have any experience with ActionScript and Flash, so I can't use a Flash cookie and want to try this instead.
EDIT:
As I understand from Sven's answer, maybe I couldn't explain what I need. My question is the same as Sven's answer: how do I wait for the request to appear on the server? I want the browser to cache the image, so it will be downloaded only if the user is a unique visitor, i.e. they are viewing the page for the very first time.
I want to get this information and check whether the image was requested or not (i.e. whether it is cached). Something like:
$requested_files = $_SERVER['REQUESTS']; // Or something similar, this is the question.
$file_name = $profile_page_owner_id.'.png'; // For example.
if(in_array($file_name, $requested_files)) {
// File is requested, so it is not cached. This is a unique visitor.
// Of course except this I will continue to check IP and Cookie.
// This will be the 3rd check.
} else {
// File is not requested, so it is already cached.
// Page was viewed before, so this is not a unique visitor.
}
Have your image path point to, let's say, user_track.php. The browser will request that file; in it you do your logging, then send the appropriate headers and the image itself.
You can even send cache-denial headers, so that the image won't be cached by default.
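A minimal sketch of such a tracking script; the logging code and the pixel file name are placeholders:
<?php
// user_track.php - log the request, then return a 1x1 PNG
// ... record IP, referrer, profile id, timestamp, ... in your database here ...
// To count every single view, deny caching:
header('Cache-Control: no-cache, no-store, must-revalidate');
// Or, for the unique-visitor idea in the question, allow long caching instead,
// so a returning visitor's browser never re-requests the pixel:
// header('Cache-Control: max-age=37739520, public');
header('Content-Type: image/png');
readfile('pixel.png');   // a prepared 1x1 transparent PNG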
Just create a PHP file that outputs an image, add the logic you need (counting and so on) before the output, reference the PHP file in an HTML image tag, and force the image to be cached by sending a header like header('Cache-Control: max-age=37739520, public');
You can take a look at this post: How to get the browser to cache images, with php? for more information about caching.
You can detect it by simply waiting for the request to appear on the server.
If you get a request, the browser has not cached it.
If you do not want the browser to cache it, simply say so in the http headers. Then you'll get the request every time.
Edit: Ok, if you WANT caching of the image, simply send cache headers that allow for indefinite caching in the browser. Usually the image will then be requested only once. The detection of the request stays the same.
I have a requirement to make a page available if apache is down for any reason.
The way I thought of it is to make the page "cached" so that it is always available.
However I have 2 problems:
- I want the page to always be available (maybe I can set the cache limit to a very big number)
- When I make the page cached, the browser always retrieves the cached page even if apache is up.
So can anyone advise on what I should do here? Maybe there is a better alternative to the one I am using?
This is the code I use for reference:
<?php
// use a "public" cache limiter so the response may be cached by the browser
session_cache_limiter('public');
$cache_limiter = session_cache_limiter();
// session cache expiry is given in minutes
session_cache_expire(60);
$cache_expire = session_cache_expire();
session_start();
echo "hello world 2222";
?>
Thanks in advance
John
I'm not sure how this would work. If apache is down, how will this default page get served up? How will the client be directed to the web root? Who is telling the client where this default page is located?
I'm very interested in the idea of "the page is cached". Have you had any success with this after taking apache offline? Does it require that the browser visit the page once before in order to cache the page?
Here's an odd addition to your idea. How about caching some JavaScript in the page? The JavaScript attempts to make an ajax call. If it is unsuccessful, it assumes apache is down and then redirects the user to another server's webpage, or rewrites the entire page with the "Server is down" page you have in mind.
Not sure it's worth the resources, but it's an interesting idea.
Thanks all for your answers
I managed to do it by setting a cookie that contains the "dc" param to be appended to ajax calls.
And the whole page uses ajax.
So I make a dummy request at the start of the page; if I get no response, I use the cached request via the "dc" parameter set in the cookie.
You can't do this using caching.
In order to handle the request and detect the non-availability of the webserver, you need something which can process http requests and then substitute content when Apache is not available... i.e. a second webserver. And if you've got a second webserver, why not just load balance them.
Do dynamic pages like CGI, PHP, ASP, or SSI always contain a Content-Length field in the HTTP headers? If not, why? Please provide links to web pages hosted on servers which don't include the Content-Length field in the headers. I want to see it first hand.
Per RFC 2616:
In HTTP, it SHOULD be sent whenever the message's length can be determined prior to being transferred,
It is often the case that the length cannot be determined beforehand. If you want to check out headers, try curl -I http://www.example.com. You'll quickly see that some sites do and some sites don't.
I think that pages do NOT always need to send their Content-Length.
From the browser side, if the browser knows the Content-Length it can show a loading bar; otherwise it just waits until it sees the "end of the file". If you send a file it's better to send the Content-Length, or else the user cannot see the loading bar and can't be sure the file has fully loaded. But if you just have a page, the browser simply loads until it reaches the end.
The reason is that some pages generate their content while they are sending their data to the client. This way the user doesn't need to wait long to see the first data coming.
This Dogs page is an example. Amazon also doesn't send the Content-Length on most pages for the same reason.
The page flushes the data after finding the first item, and then keeps flushing from time to time, so the user doesn't have to wait for the program to first find all the items, then calculate the size of the page, and only then start sending the data.
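A minimal sketch of that streaming approach; the loop is just a stand-in for whatever actually finds the items:
<?php
// No Content-Length header is sent: the total size of the page
// isn't known before the loop finishes, so the server streams instead.
header('Content-Type: text/html; charset=utf-8');
for ($i = 1; $i <= 10; $i++) {
    echo "<p>Item $i</p>\n";   // pretend finding each item takes time
    if (ob_get_level() > 0) {
        ob_flush();            // flush PHP's output buffer, if one is active
    }
    flush();                   // push the data to the client right away
    sleep(1);
}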
There is a page that I need to POST a password to, and then I get a file to download.
The POST goes to the same page address; it loads again and pops up the download manager (the download starts automatically).
Now I want to do the same with curl. I posted the data to the URL and it sends the file back, but I don't want my script to download the whole file; I only want to get a link so I can download it by myself.
How can I do that?
Actually, you most probably can't. Such password-protected download systems usually check either cookies or browser/environment-based variables. Getting the link itself shouldn't be a problem; however, you could not use it outside this generator's scope anyway.
Firstly you need to POST that password with curl, assuming it goes to a specific form and the form takes you to the download page. Then you need to use regex (regular expressions):
filter out the data you want, then save it in another variable to reuse it.
There is for sure a redirection after you hit the first page with the POST. Look for that redirection with curl and read the HTTP response headers: Content-Location, Location, or even Refresh.
To prevent the automatic download, make sure curl is not set to follow redirects (CURLOPT_FOLLOWLOCATION in PHP's curl, or -L/--location on the command line). In a browser these auto-refreshes and URL redirects happen in a split second, so humans don't actually see them happening, but with redirect-following disabled curl will stop at the redirect response and let you read it.
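A minimal sketch of that idea with PHP's curl functions; the URL and form field name are only placeholders, and as noted above the resulting link may still only work within the same cookie/session:
<?php
// POST the password but do NOT follow the redirect, so we can read
// the Location header that points at the actual file.
$ch = curl_init('http://example.com/download-page');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('password' => 'secret'));
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // stay on the redirect response
curl_setopt($ch, CURLOPT_HEADER, true);          // include headers in the output
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
// Pull the redirect target (if any) out of the response headers.
if (preg_match('/^Location:\s*(.+)$/mi', $response, $m)) {
    echo trim($m[1]);
}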
I kinda don't understand what you really want to do, but if you just want a link, then have the PHP script perform the entire curl POST and everything when they click it. Whatever you do, the web server will require a password before giving access to the file; you can't skip that step.