I have this:
echo "<font color=\"#000000\">text</font>";
usleep(2000000);
header("location:/otherpage.php");
?>
Please note that this will be included in an iframe...
The problem is that it isn't echoing the echo statement; it just sleeps for two seconds and then redirects (which is all correct except for the missing echo)...
Anyone have an idea why this is happening? It's not the iframe, because the same thing happens when you go straight to the file (outside the iframe). Could it be the usleep?
Thanks
What you are doing above will not work. First, you would need to flush in order to make sure the data was sent. Second, and more importantly, you can't change the headers after the flush, so either the header or the text would not be sent.
If all you want to do is change the data after a delay, did you consider doing the following:
header('Refresh: 2; url=http://my.site.com/otherpage.php');
echo "<font color=\"#000000\">text</font>";
This sends the text to the browser and instructs the browser to change to the new URL after 2 seconds.
This won't work, since you can't change the headers after outputting text.
The only options are a meta refresh or JavaScript when you want to replicate this behaviour exactly.
You can solve the output problem by flushing the buffer, but then no redirection is possible, as mentioned before.
Another very important thing is:
DON'T USE USLEEP FOR SUCH THINGS.
Why? Because when your server is heavily loaded, every request that takes a long time hurts, and you can run out of PHP workers (depending on your PHP/webserver implementation). So for timeouts like this you should use client-side code (if possible).
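For example, a minimal sketch of the meta-refresh approach (using the otherpage.php target from the question; the 2 in content is the delay in seconds):
<?php
// Sketch: output the text, then let the browser handle the delayed redirect.
echo '<meta http-equiv="refresh" content="2; url=/otherpage.php">';
echo '<font color="#000000">text</font>';
?>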
It's sleeping on the server and then sending the output. Also, you can't send headers after echoing something.
You should use JavaScript or a meta redirect; either one lets you wait a few seconds before redirecting, and the delay and URL for both can be generated by your PHP script.
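For instance, a rough sketch where PHP supplies the delay and target URL to a client-side redirect ($delay and $url are just made-up names for this example):
<?php
// Sketch: PHP only generates the values; the browser does the waiting.
$delay = 2000;             // milliseconds
$url   = '/otherpage.php'; // target page from the question
echo '<font color="#000000">text</font>';
echo '<script>setTimeout(function() { window.location = "' . $url . '"; }, ' . $delay . ');</script>';
?>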
You can't do that. I have a feeling you're misunderstanding the purpose of PHP and its role as a server-side language.
If you want to "redirect" to another page in PHP, you do so using HTTP headers, which you did. The thing is, those are headers, so they must be sent before any text body (like the output of an echo statement). You're trying to do on the server something that should be done client-side.
Instead, make an HTML page with some JavaScript code to do the redirection, like this:
<script>
setTimeout(function() {
    window.location = "otherpage.html";
}, 2000);
</script>
Expanding my comment into an answer:
It's not possible to redirect like this (outputting some content and then trying to send a header), because headers must be sent before all content. PHP will also complain about this; with default settings you should see a warning.
Second, the usleep delay might not be observable due to the server buffering the content throughout the sleep and only sending it in one big chunk afterwards. In general, it isn't reliable to make the browser display content "in steps" like this, although it can be made to work more or less if you pile the hacks high enough.
May I suggest that if this kind of behavior is what you need, you look into making an appropriate AJAX call (or a series of them), which can be orchestrated precisely.
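To illustrate the ordering rule from the start of this answer, a minimal sketch (not the original poster's code): the header() call has to come before any output, otherwise PHP warns that headers were already sent and the redirect is dropped.
<?php
// Sketch: a redirect must be issued before any output is sent.
header('Location: /otherpage.php');   // fine: nothing has been echoed yet
exit;

// If an echo came before header(), PHP would emit a warning like
// "Cannot modify header information - headers already sent by ..."
?>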
What everyone else said. Just adding that usleep() will make clients hold connections open on your server, which is a very inefficient use of limited server resources. Your PHP should always send everything it can as quickly as possible so your web server can close the connection.
Related
So yeah, this came to mind randomly when I was teaching someone how to redirect their page. I wasn't really sure what the main difference was... Is there a reason you would use one over the other? I guess if you are not coding in PHP you would have to use JavaScript's window.location to redirect, but would you ever use window.location over PHP's header() if you were developing in PHP? I feel they have very similar functions, but perhaps I am missing something.
The browser will process the header redirect right away, whereas the JavaScript redirect will not be executed until the page has loaded (or at least enough of it to run the JavaScript). Also, it will be the JavaScript engine executing the redirect rather than the browser itself.
Doing it via the header will perform better (slightly, anyway).
PHP's server-side header() can send other headers than just Location. JavaScript's client-side window.location can be used to read, inspect, and alter (parts of) the current URL, including the hash. Really, they can do quite different things, and about their only overlap is that both can redirect.
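For example, a small sketch of the server-side option, which can set the status code and other headers in the same response (the no-store header and the 301 status are just illustrative choices):
<?php
// Sketch: header() is not limited to Location; other headers and the
// HTTP status code can be sent along with the redirect.
header('Cache-Control: no-store');
header('Location: /otherpage.php', true, 301); // permanent redirect
exit;
?>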
Is it possible to have print_r() displayed live? By live I mean while the script is executing; I do not want to wait for the end of the script to have it displayed. Hope I am clear. Thank you in advance for your replies. Cheers, Marc
Likely you are using PHP through a webserver like Apache.
Webservers have buffering implemented; they tend to send their data out in larger blocks.
Browsers also buffer; they only refresh the display from time to time, and again at the end when they have finished loading the page.
Finally, PHP has output buffering built in as well.
HTTP was not made for "live" display; it's more like a static page. That's why people invented AJAX and JavaScript polling for changed/live events after a page has loaded.
What you can do:
To make sure the data from PHP is sent to the webserver you can use the command flush()
There is also a php setting called implicit_flush you might want to look up.
The webserver is likely using gzip/mod_gzip to compress output. You need to disable that behaviour.
Maybe this will do it: apache_setenv('no-gzip', 1);
Add some more content than just pure text; if you put the data inside a simple <table> (including the closing </table>), it's more likely browsers will display it during load.
Look in php.ini for these:
output_buffering = Off
zlib.output_compression = Off
You can do this at runtime too: ini_set('zlib.output_compression', 0);
Some browsers will only display data once they have received a certain number of bytes.
If I recall correctly, 256 bytes might help:
echo str_repeat(" ", 256); (or anything else)
I'd like to add that these steps can help solve the issue, but in my experience the results are not perfect.
Every browser and browser version may behave differently.
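Putting the steps above together, a rough sketch (the exact combination needed depends on your server setup; results still vary by browser) could look like this:
<?php
// Sketch: combine the flushing/anti-buffering steps from above.
@apache_setenv('no-gzip', 1);            // disable mod_gzip/mod_deflate (Apache module SAPI only)
@ini_set('zlib.output_compression', 0);  // disable PHP-level compression
@ini_set('implicit_flush', 1);
while (ob_get_level() > 0) {             // drop any active output buffers
    ob_end_flush();
}
ob_implicit_flush(true);

echo str_repeat(' ', 256);               // padding for browsers that wait for a minimum amount of data

for ($i = 0; $i < 5; $i++) {
    print_r(array('step' => $i));
    echo '<br>';
    flush();                             // hand the chunk to the webserver right away
    sleep(1);
}
?>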
What is faster to open and process:
having all the jQuery functions in one file, or
putting each function in a separate file and loading it wherever you need it?
Ex. I have a blabla.js file that has 4 functions in it,
and my xaxa.php calls blabla.js.
Now, when I first open my page it's fast enough. No problem (even with cookies cleared and all).
BUT... when I first (and afterwards) click a button that activates a part of my blabla.js, all my links and functions start opening/working slower.
So should I separate my functions and load each JS file only where I need it, or is my problem somewhere else?
Thank you
(As I said, I'm starting to suspect something in my structure.)
So here is a sample of my jq:
$(document).ready(function(){
    $(".avoid_ref_add").click(function(){
        var keyValues = {
            pid : $(this).parent().find('input[name="pid"]').val()
        };
        $.post('help_scripts/cartfunct.php', keyValues, function(rsp){
            $('#content').load("p_body.php");
        });
        return false;
    });
    function remove
    function update
});
and I have my items.php, items2.php, items3.php...
Now, MY COOKIES: I HAVE THEM CLEARED, NO CACHE... When I first open the site, it loads fast and all links are fast...
But if I click that add button, everything starts to work REALLY slowly...
If I just refresh the whole page, it starts working fast again, and so on...
To me this is quite strange and I cannot figure out what I did wrong... Because if the page were slow, it wouldn't load fast the first time... Correct? Is it something in my code?
You should optimally put all your JavaScript in one JavaScript file.
This file can be served with gzip compression and far-future expiry headers to limit bandwidth usage (and give faster page loads). You could even run a minifier on your JavaScript to reduce the file size.
What you are really asking about is the art of minimization/optimisation, and this article is a good read: http://developer.yahoo.com/performance/rules.html
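As an illustration only (the combined.js filename and the one-year lifetime are arbitrary choices), such a file could be served from PHP with compression and a far-future Expires header roughly like this:
<?php
// Sketch: serve one combined JS file with gzip and long-lived caching headers.
header('Content-Type: application/javascript');
header('Cache-Control: public, max-age=31536000');                      // about one year
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
ob_start('ob_gzhandler');                                                // gzip if the client accepts it
readfile('combined.js');                                                 // made-up filename
?>
In practice the same headers are usually configured in the webserver itself rather than in PHP.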
It is clearly better to keep all your JS functions in a single file, and this should be faster. Consider that when a different function needs to be called, you want to avoid the delay introduced by another round trip to the server.
Execution speed should not be affected by the way you distribute functions among the files. However, downloading all the resources related to a page is faster when fewer files need to be downloaded. That's because each HTTP request comes with an HTTP request header and each HTTP response comes with an HTTP response header. If the keep-alive feature of HTTP is not used, additional overhead comes from establishing a TCP connection for each request.
You don't really have to separate each function into its own file. I would suggest keeping functions that are related to each other in the same JS file and putting the others in another JS file.
This makes it easier for you to manage the code and increases the reusability/modularity of the JS file.
After that, you can use Minify (http://code.google.com/p/minify/) to combine the necessary JS files when you need to, thus reducing the number of requests made to retrieve them.
If I have something like this in PHP:
$foo = 0;
while ($foo < 20) {
    echo "hello";
    usleep(1000000);
    $foo = $foo + 1;
}
and I make an AJAX request to that PHP file, can I do anything with the data while the request is in progress?
I mean, the script echoes "hello" every second, and I saw that the request only shows its data once the whole loop is finished. So isn't there a way I can access each "hello" as it's echoed out?
Look at the Firebug extension for Firefox.
There are a few reasons why you can't see it.
The content coming from the AJAX request is processed by the server like any other HTTP/PHP request.
What is happening is that the data is being held in the PHP output buffer, and only when the script is done is it flushed to the output, which Apache then delivers to you.
There is so little data that there is no need to flush this buffer before the process is done, so you only see the final result.
If you output so much data that the buffer gets flushed beforehand, then you may get part of it earlier.
The other problem is going to be your AJAX request handler. I'm pretty sure the onComplete (or similar) callback that you (and everyone else) are using will only be called when the output from the server request is finished and your browser has the full data.
It may be possible to use a different event, or perhaps to write the AJAX code yourself (without using something like jQuery), but I'm not even sure that would solve your problem, as this might also be down to the XMLHttpRequest implementation.
May I ask what you are trying to do this for? There may be an easier solution to your actual problem (I'm assuming this isn't the code you actually use on your site).
Dan
If you execute the flush() command in PHP, you will send content. If you're compressing at the server level, you may need to pad the output to fill up a packet in order to make it send.
flush();
Here's an example: http://example.preinheimer.com/flush.php
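Applied to the loop from the question, a sketch with flushing and padding might look like this (whether the chunks actually arrive one by one still depends on buffering in the webserver and the browser):
<?php
// Sketch of the question's loop with explicit flushing.
@ini_set('zlib.output_compression', 0);
while (ob_get_level() > 0) {   // drop any active output buffers
    ob_end_flush();
}

echo str_repeat(' ', 1024);    // padding so compressed/buffered output gets pushed out

$foo = 0;
while ($foo < 20) {
    echo "hello\n";
    flush();                   // hand each chunk to the webserver right away
    usleep(1000000);
    $foo = $foo + 1;
}
?>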
The correct answer is that you CAN see the content while it's being returned.
The other answers were partially correct in mentioning that the PHP output buffer will keep the output "bottled up"... but the output buffer can be disabled.
Once you disable the output buffer, you need to show the response in jQuery before the request completes; you do this by updating the browser periodically while the connection to the server is still active. This concept is called "Comet" or "long polling".
See these questions:
Comet and jQuery
How do I implement basic "Long Polling"?
Comet In PHP
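As a very rough server-side long-polling sketch (has_new_data() and get_new_data() are hypothetical stubs standing in for your own storage check, not real APIs): the endpoint holds the request open until something is available, returns it, and the client immediately issues the next request.
<?php
// Sketch of a long-polling endpoint. The two functions below are
// hypothetical stubs; replace them with your own data check.
function has_new_data() { return false; }
function get_new_data() { return array('msg' => 'hi'); }

$timeout = 25;   // give up before typical gateway timeouts
$start   = time();

while (time() - $start < $timeout) {
    if (has_new_data()) {
        echo json_encode(get_new_data());
        exit;                    // client receives the data and re-polls
    }
    usleep(250000);              // wait a quarter of a second, then check again
}

echo json_encode(array());       // nothing new: client re-polls anyway
?>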
Hey guys, quick question. I am currently echoing a lot of JavaScript conditionally, based on login status and other variables. I was wondering whether it would be better to simply echo the script include, like <script type="text/javascript" src="javascript/openlogin.js"></script>, pointing at a file that has been run through a minifier and gzipped, or to echo the full script in raw form. The latter is messier to me, but it reduces HTTP requests, while the former would probably be smaller but take more CPU. Just wondering what some other people think. Thanks in advance for any advice.
I would go with the first option. Even though it's an extra request, it means the HTML/PHP page will be smaller. Also, it is my understanding that once the JavaScript is cached it won't be requested again, whereas the HTML/PHP page will be requested every time.
Depending on your JavaScript's functionality, you could also add async="true" to the script include to ensure the page is downloaded first and then the JavaScript.
Include it externally (your first option). Then when you're doing javascript maintenance, you're not doing it inside PHP as well.
Including the raw text is preferable only if you do not expect much more than one page load per user. If you expect your users to request your page multiple times, then the external, cacheable include is the right option, and this is usually the case.
Echo the script include so that the JavaScript is in an external file; then the browser's cache can do its job.
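A sketch of that first option (openlogin.js is taken from the question; $is_logged_in is a made-up flag for your own login check): only the include line is conditional, while the JavaScript itself stays in a cacheable external file.
<?php
// Sketch: conditionally echo only the include; the JS file itself stays cacheable.
$is_logged_in = true; // hypothetical flag set by your own login check

if ($is_logged_in) {
    echo '<script type="text/javascript" src="javascript/openlogin.js"></script>';
}
?>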