Do dynamic pages like CGI, PHP, ASP, and SSI always contain a Content-Length field in the HTTP headers? If not, why not? Please provide links to web pages hosted on servers that don't include the Content-Length field in the headers. I want to see it first-hand.
Per RFC 2616:
In HTTP, it SHOULD be sent whenever the message's length can be determined prior to being transferred.
It is often the case that the length cannot be determined beforehand. If you want to check out headers, try curl -I http://www.example.com. You'll quickly see that some sites do and some sites don't.
Pages do not always need to send a Content-Length.
From the browser's side: if the browser knows the Content-Length, it can show a loading bar; otherwise it just waits until it reaches the end of the response. If you are sending a file, it is better to send the Content-Length, or the user can't see the loading bar and can't be sure the file has fully loaded. But for an ordinary page, the browser simply loads until the response ends.
The reason is that some pages generate their content while they are already sending data to the client, so the user doesn't have to wait long before the first data arrives.
This Dogs page is an example. Amazon also omits Content-Length on most pages for the same reason.
The page flushes data after finding the first item, and then flushes again from time to time, so the user doesn't have to wait for the program to first find every item, then calculate the size of the page, and only then start sending data.
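As a rough sketch of the flush-as-you-go technique this answer describes (the item renderer and item count are made up for illustration), here is a minimal PHP script. Served this way, the total body size is unknown when the headers go out, so the response typically uses chunked transfer encoding and carries no Content-Length:

```php
<?php
// Sketch: emit each result as soon as it is available instead of
// buffering the whole page. Because output is flushed incrementally,
// the body size is unknown up front, so the server typically sends
// Transfer-Encoding: chunked and omits Content-Length.

// Hypothetical helper that renders one result row.
function renderItem(int $i): string {
    return "<p>Result item $i</p>\n";
}

header('Content-Type: text/html; charset=utf-8');

for ($i = 1; $i <= 5; $i++) {
    echo renderItem($i); // send this piece now...
    flush();             // ...without waiting for the rest
    // usleep(200000);   // imagine a slow lookup for the next item
}
```

The trade-off is exactly the one described above: the user sees content sooner, but the browser cannot show a byte-accurate progress bar.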
Related
Can I detect if the web browser is requesting a certain image from the server?
I want to check whether the user downloads the image or whether it is already cached by their browser.
The main idea:
I am counting unique visitors per profile page. I use IPs and cookies for now, but want to add this as well: an IP can change easily, and a cookie can be blocked or deleted.
My idea is to use this information much like a Flash cookie. The image will be 1px x 1px in size and invisible to the user. I have no experience with ActionScript and Flash at all, so I can't use a Flash cookie and want to try this instead.
EDIT:
As I understand from Sven's answer, maybe I couldn't explain what I need. My question is the same thing Sven's answer describes: how do I wait for the request to appear on the server? I want the browser to cache the image, so it will be downloaded only if the user is a unique visitor, i.e. is viewing the page for the very first time.
I want to get this information and check whether the image is requested or not (i.e. whether it is already cached). Something like:
$requested_files = $_SERVER['REQUESTS']; // Or something similar, this is the question.
$file_name = $profile_page_owner_id.'.png'; // For example.
if(in_array($file_name, $requested_files)) {
// File is requested, so it is not cached. This is a unique visitor.
// Of course, besides this I will continue to check IP and cookie;
// this will be the 3rd check.
} else {
// File is not requested, so it is already cached.
// The page was viewed before; this is not a unique visitor.
}
Point your image's src at, say, user_track.php. The browser will request that file; there you do your logging, then send the appropriate headers and the image itself.
You can even send cache-denial headers, so that the image won't be cached by default.
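A minimal sketch of such a user_track.php. The logVisit() helper is a hypothetical placeholder for your own counting logic, and the inlined GIF is a standard transparent 1x1 pixel:

```php
<?php
// user_track.php -- sketch of a logging endpoint that returns a 1x1 image.
// logVisit() is a stand-in for your own counting code (IP/cookie checks,
// database insert, etc.).
function logVisit(string $ip, string $referer): void {
    // e.g. insert into your visits table here
}

logVisit($_SERVER['REMOTE_ADDR'] ?? '', $_SERVER['HTTP_REFERER'] ?? '');

// Cache-denial headers, so the browser requests the pixel every time.
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Pragma: no-cache');
header('Expires: 0');
header('Content-Type: image/gif');

// A transparent 1x1 GIF, inlined so no image file is needed on disk.
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```

In the HTML you would reference it as something like `<img src="user_track.php" width="1" height="1" alt="">`.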
Just create a PHP file that outputs an image, add the logic you need (counting and so on) before the output, reference the PHP file in an HTML <img> tag, and force the image to be cached by sending a header like header('Cache-Control: max-age=37739520, public');
You can take a look at this post: How to get the browser to cache images, with php? for more information about caching.
You can detect it by simply waiting for the request to appear on the server.
If you get a request, the browser has not cached it.
If you do not want the browser to cache it, simply say so in the http headers. Then you'll get the request every time.
Edit: Ok, if you WANT caching of the image, simply send cache headers that allow for indefinite caching in the browser. Usually the image will then be requested only once. The detection of the request stays the same.
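A sketch of that cache-friendly variant (the max-age value, file name, and counter helper are illustrative): every request that actually reaches the script is a cache miss, which is the "unique visitor" signal the question asks for.

```php
<?php
// counter.php -- sketch: the request itself is the "first visit" signal.
// Long-lived cache headers mean a returning visitor's browser usually
// serves the image from its cache and never hits this script again.

function cacheHeaders(int $maxAge): array {
    // Build the header lines; kept in a function so they are easy to test.
    return [
        'Cache-Control: max-age=' . $maxAge . ', public',
        'Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT',
    ];
}

// Hypothetical counter update -- reaching this line means a cache miss,
// i.e. (roughly) a first-time viewer:
// incrementUniqueVisitors($profileId);

foreach (cacheHeaders(31536000) as $h) {   // one year
    header($h);
}
header('Content-Type: image/png');

$pixel = __DIR__ . '/pixel.png';           // assumed to exist on disk
if (is_file($pixel)) {
    readfile($pixel);
}
```

Note the caveat from the thread: this is only a rough signal, since caches can be cleared or bypassed (e.g. a forced reload).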
I have some PHP code that frequently serves page redirects to clients via the header('Location: x') function. The redirect works fine; I have no output before the header() call, and the user is successfully redirected to the new page. Some of the clients that connect send HTTP byte-range requests, with the intent of grabbing only a certain portion of the file I redirect to.
I need to preserve this range request when sending to the new site; the site I redirect to should also see the range info in their headers, and be able to correctly process the user's request.
I understand that I can see the byte range they're requesting in my PHP code by looking at $_SERVER['HTTP_RANGE'], but I'm unable to think of a way to pass this range along in the headers sent to the redirect target. I'm pretty sure trying to implement this via the header() function is wrong, since that sets the headers of my own response. Instead, I need to set the headers of the request that goes to the page I'm redirecting to.
Does anybody have any ideas on how to implement this?
If you don't actually need to redirect the user to another site, but just need to give them content from another site, you might want to use the cURL functions, which let you set the Range header on the request, fetch the result, and then serve it up to the end user.
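A sketch of that proxying approach (the target URL is a placeholder, and a production version would also relay upstream headers such as Content-Range and Content-Type):

```php
<?php
// Sketch: rather than redirecting, fetch the ranged content server-side
// with cURL and relay it to the client.

// Build the extra request headers: forward the client's Range, if any.
function rangeHeaderLines(?string $clientRange): array {
    return ($clientRange !== null && $clientRange !== '')
        ? ['Range: ' . $clientRange]
        : [];
}

function proxyRanged(string $target, ?string $clientRange): void {
    $ch = curl_init($target);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, rangeHeaderLines($clientRange));
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // Relay status (206 Partial Content for a honored range) and body.
    http_response_code($status ?: 502);
    echo $body;
}

// Example call -- URL is hypothetical:
// proxyRanged('https://example.com/file.bin', $_SERVER['HTTP_RANGE'] ?? null);
```

The cost of this design is that the ranged bytes now flow through your server instead of going directly from the target site to the client.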
I have a PHP application that I have been having some problems with: some pages take a very long time to load.
After a couple of hours I have figured out the problem, but I have no idea how to fix it.
The problem seems to be with the header Connection: keep-alive. I used a Firefox plugin called "Tamper Data" which allows you to "tamper" with the headers and stuff. Once I used that tool to change the connection header to Connection: close the delay on some pages stopped.
How, in PHP, can I make sure that the Connection: close header is used?
I tried putting header("Connection: close"); at the top of a PHP file, and reloaded the page. It still sends the Connection: keep-alive header, not the one I am trying to send.
How can I achieve what I am trying to do?
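One thing worth trying (a sketch; whether Connection: close is actually honored is ultimately up to the web server/SAPI, not PHP -- in Apache, keep-alive can also be disabled via KeepAlive Off in the server config): render the whole body first so its size is known, then send an explicit Content-Length together with the close header. The page renderer below is a hypothetical placeholder:

```php
<?php
// Sketch: know the body size before any output, then send an explicit
// Content-Length together with Connection: close. Note that the web
// server has the final say over the Connection header; PHP can only
// suggest it.

// Hypothetical page renderer -- replace with your real output.
function renderPage(): string {
    return '<html><body>...your page...</body></html>';
}

$body = renderPage();
header('Connection: close');
header('Content-Length: ' . strlen($body));
echo $body;
flush();
```

With a correct Content-Length, the client knows the response is complete and does not sit waiting on the open keep-alive connection.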
EDIT: I have just realized that on this subdomain the content-length header is not sent at all for most pages. It is only sent right after a form submission followed by a redirect.
EDIT 2:
This is the page: http://volunteer.essentialtransit.com/job/13/just-a-test-at-eta/
Click the "Apply now" link and fill out some random text; you don't need to attach a file. Notice that when you are redirected back to the "job" detail page, it takes a very long time to load.
Your problem has nothing to do with connection states. It might seem related to connections because Apache automatically spawns a new child process or thread for each request from a different source; with keep-alive, it tries to reuse the previous one, which may still be busy with a PHP script (from your application). It's actually a little more complicated than that, but that's the basic idea. Just note that "Connection: close" is being sent, but it is supposed to close the connection only after the script has finished (all output buffers sent).
Now I'm going to tell you how to debug your script. I'll do this because if you don't fix your problem and you gain more traffic, your host will kick you out for extreme resource usage.
So:
Add set_time_limit(5) (or higher) to confirm whether a background script is the problem
Check for requests to local resources, requests that would only work on your staging server (you can use WireShark for this)
Check for external requests, cURL, file_get_contents() calls, anything with a timeout
Benchmark and optimize lengthy scripts (you can try xdebug for this)
Log all PHP notices, warnings and errors to a file; you should be aiming for zero errors
Finally, it's good practice to triple-check your entire application: once for data entry, once for data operations, and once for how the modules interconnect. But you should focus on AJAX background scripts that can't display their output
Of course, skip anything that doesn't apply.
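For the logging step in the list above, a minimal sketch (the log file path is an example; pick any writable location):

```php
<?php
// Sketch: turn on full error logging at the top of your front controller
// (or set the same options in php.ini). Errors go to a file instead of
// the visitor's screen.
error_reporting(E_ALL);
ini_set('display_errors', '0');                   // don't leak errors to visitors
ini_set('log_errors', '1');
ini_set('error_log', __DIR__ . '/php-errors.log'); // example path

// From here on, any notice/warning/error lands in the log file.
```

Tail the log while reproducing the slow page; a hung external call or a flood of notices usually shows up immediately.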
So, I determined what the problem was, and found a work-around to this issue.
The script that processed the form just processed the input and redirected to another page; it didn't actually output anything. On most pages of the site the Content-Length header is either not sent or set to the correct value. But for some reason, when posting to a page and then redirecting without the processing script outputting anything to the browser, the Content-Length was being set to 0.
I tried setting the content-length myself, but didn't have much luck, as it didn't seem to make a difference.
So all I did was make the processing script produce some output. Now, when the form is submitted, the processing script outputs a page with a redirect script (and a 'click to continue' message, just in case) that leads to the correct page. While this adds a very brief delay between the form submission and the correct page being shown, it causes the Content-Length to be set correctly, and the problem is solved.
While this is not an ideal solution it is manageable and makes the script work.
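The workaround described above might look roughly like this (the URL and wording are illustrative, not the poster's actual code):

```php
<?php
// Sketch of the interstitial page the processing script outputs after
// handling the form; $next is the page the user should end up on.
function interstitial(string $next): string {
    $url = htmlspecialchars($next, ENT_QUOTES);
    return '<html><head>'
         . '<meta http-equiv="refresh" content="0;url=' . $url . '">'
         . '</head><body>'
         . '<p><a href="' . $url . '">Click to continue</a></p>'
         . '</body></html>';
}

$page = interstitial('/job/13/just-a-test-at-eta/');
header('Content-Length: ' . strlen($page)); // now non-zero
echo $page;
```

Because the script now produces a real body, the Content-Length is non-zero and the keep-alive stall no longer occurs.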
First of all, I'm sorry about the title. I couldn't find a better one.
I have an image file generated by a PHP script. This script (image) connects to a database and saves its referrer URL in a table.
Because the output image doesn't change, I think it's better to cache it.
But as far as I know, if I cache the file (for example http://www.example.com/img.png.php), the browser reads it from cache on every page, and that's bad for my script: on the first call the referrer URL is saved and the image is cached by the browser; on subsequent calls, from different websites (referrers), the cached version is used, the browser doesn't send any request to the server, and so the referrer URL never gets saved to the database.
Can I tell the browser to cache one copy of the image per domain?
I mean:
1. http://wwww.abc.com/index.html sends a request for my image (script). The browser checks its cache and doesn't find it, so it fetches the image from the server, and the PHP script saves the referrer URL.
2. The user goes to another page of ABC.COM (for example http://wwww.abc.com/about.html). The browser checks its cache and finds the image, so it doesn't send a request to the server for the file content, and the PHP script doesn't run.
3. Another site (http://wwww.efg.com/index.html) sends a request for my image (script). The browser checks its cache and will NOT find it, so it requests the file content, and the PHP script runs.
Is it possible?
(sorry for the long text, with lots of grammatical problems)
You could use a redirect page (that is not cached) that saves the referrer to your database and then redirects to the cached image.
That way you always get a hit but the actual image is cached.
In your HTML you could use:
<img src="/image.php">
And in image.php:
<?php
// save the referrer in here
header('Location: /image.jpg');
?>
and /image.jpg is your actual image (which can be cached)
First of all, think about the user's experience: do you really need to increase page load time just for the referrer feature? Also, you should be aware that many browser/privacy-tool configurations suppress the Referer header or don't send it in the first place.
If you're really sure you want the resource (JavaScript, stylesheet, image, ...) to be loaded each time, you can send the Cache-Control HTTP header with the resource to prevent caching. For example, to prevent caching of referer.js when served with Apache, add the following .htaccess file in the same directory (requires mod_headers):
<FilesMatch "^referer\.js$">
Header set Cache-Control no-cache
</FilesMatch>
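If you would rather set this from PHP than from .htaccess, the equivalent at the top of the script is roughly (a sketch; the extra Pragma/Expires lines are for some older clients and proxies):

```php
<?php
// Sketch: send the same cache-busting headers from the script itself,
// before any output, so the referrer-logging code runs on every view.
function noCacheHeaders(): array {
    return [
        'Cache-Control: no-cache',
        'Pragma: no-cache',   // legacy clients/proxies
        'Expires: 0',
    ];
}

foreach (noCacheHeaders() as $h) {
    header($h);
}
```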
Seems to be a counter, am I right?
AFAIK you cannot do exactly what you've explained.
But you can always "cache" the image on the server side so you don't need to redraw it:
<?php
/*
do some stuff
*/
// send the image: the content type first
header('Content-Type: image/png');
// and then the image data
readfile('myImage.png');
On a website, I enter some parameters in a form, click on search and then get a page with a message "retrieving your results". After the search is complete, I get another page with my results displayed.
I am trying to recreate this programmatically, and I used Live HTTP Headers to peek at what is going on behind the scenes, i.e. the URL, form variables, etc. However, I'm only getting information about what happens up to the page that shows "retrieving your results". Live HTTP Headers is not giving me information up to the page that contains the final results.
What can I do to get this final bit of information (i.e. the URL, form variables, etc.)?
I use Charles HTTP Proxy for all my HTTP troubleshooting needs. It has a ton of options and works with any browser.
"Web Developer" does this:
https://addons.mozilla.org/en-US/firefox/addon/60
#Mark Harrison
I have Web Developer installed. Initially, I used it to turn off meta-redirects and referrers to get a clearer picture of the HTTP interaction. But when I do this, the website does not work (i.e. it is not able to complete the process of retrieving my search results), so I turned it back on.
I'm wondering if anyone has had to capture HTTP information for a site that has a processing page in between the user input page and the results page.
That sounds weird; I'm pretty sure Live HTTP Headers should show this. Can you double-check that you aren't missing something? Otherwise, try Firebug. It has a tab for network activity, which shows all requests made.
I'm using Fiddler2, which is a free (as in beer), highly configurable proxy; works with all browsers, allows header inspection/editing/automodification on request/response.
Disclaimer: I'm in no way affiliated with Fiddler, just a (very happy) user.
For such problems I always fire up Ethereal (now Wireshark) or a similar network sniffing tool to see exactly what is going on.
The document creates a browser component called XMLHttpRequest; on the submit event the object's send() method is called; while waiting for the server response, an HTML element is replaced with a "waiting" message; on a successful response a callback is invoked with the new HTML, which is then inserted into the selected element. (That's called Ajax.)
If you want to follow that process, you can use the Firefox Live HTTP Headers extension, or Wireshark, to view the full HTTP headers and actions (GET/POST).