Apache + PHP multiple scripts at the same time - php

Good day.
First of all, sorry for my bad English =)
So, I created this script:
<?php
sleep(10);
?>
My Apache uses an MPM module, and I obviously don't use sessions in this script, just... just sleep(10).
When I open 2 tabs in my browser simultaneously, the first tab loads in 10 seconds and the second tab in 20 seconds.
But when I open this script in 2 different browsers at the same time, each one loads after 10 seconds.
So I started thinking that my problem was "Connection: Keep-Alive", and I changed my script:
<?php
header('Connection: close');
phpinfo();
sleep(10);
?>
phpinfo() is there to make sure the headers are sent before sleep(). Buuuut... I get the same problem. In the first tab of Chrome I get headers with "Connection: close"; in the second tab I can't get response headers until the first script has finished. In two different browsers, everything is normal.
And now I have absolutely no idea what I'm doing wrong. Why can't Chrome make 2 parallel requests to my site? What should I do to solve this problem?
P.S. I don't want to disable keep-alive for the whole site. I don't mind if it speeds up the loading of CSS, images and other stuff. Even other scripts. But I want the ability to run some scripts in parallel in one browser.
P.P.S. For example: one page runs a very long ajax query, say processing some big data server-side, while other ajax queries fire at short intervals to fetch the status of the first one. Obviously, they must run in parallel.

I know it's an old question, but I just had the same problem and solved it with session_write_close()!
Without it, PHP deliberately queues scripts that share the same session.
Simplest possible example:
Looong Script #1:
<?php
session_start();
$_SESSION['progress'] = 0;
session_write_close(); // release the session lock before the long loop
for ($i = 0; $i < 100; $i++) {
    session_start();       // re-acquire the lock just long enough to update
    $_SESSION['progress']++;
    session_write_close(); // release it again immediately
    sleep(1);              // this slows the script down on purpose!
}
?>
Short script #2:
<?php
session_start();
print_r($_SESSION['progress']);
?>
Now try it: open the first script, which takes ages, then open the second script in a new tab and see the progress update in a blink while the first is still running! So easy, right?! ;)
The same principle works for ajax: one long polling script and a second ajax call to get the progress!

Related

I want PHP to output while it's still running [duplicate]

Basically, I'm trying to run a loop every second for 25 seconds.
for ($i = 0; $i <= 25; $i += 1) {
    echo $i;
    sleep(1);
}
The thing is, it doesn't output anything until it's fully done, after the loop has run 25 times. Is there a way to make it output before each sleep, rather than waiting until the full loop is complete?
Thanks!
I just hashed through this same problem from a beginner's perspective and came up with this bare-bones script, which does what you want.
<?php
ob_start();
// Pad each chunk past typical buffer sizes so the browser renders it immediately
$buffer = str_repeat(" ", 4096) . "\r\n<span></span>\r\n";
for ($i = 0; $i < 25; $i++) {
    echo $buffer . $i;
    ob_flush(); // flush PHP's output buffer...
    flush();    // ...and ask the web server to pass it along
    sleep(1);
}
ob_end_flush();
?>
You may wonder about the \r\n padding and the ob_flush() call; the comments above cover the essentials. Hope that helps you out.
What you're trying to achieve is incremental output to the browser from PHP.
Whether this is achievable can depend on your server and how you're invoking PHP.
PHP under FastCGI
You're probably a bit more likely to run into this kind of problem when PHP is running under FastCGI rather than as an Apache module, because the server and the PHP processes are less tightly coupled. FastCGI communication buffers the output once the data has left the PHP processes, sending it to the browser only once the request is fully complete or this buffer has filled up. On top of this, the PHP processes tend to be terminated after a certain amount of time, to avoid letting any one run for too long.
That said, a combination of ob_end_flush() (or ob_flush()) and flush() should still cause PHP to request that the downstream buffers be cleared, so this may still work. You may also need to investigate whether you need to lengthen the time limit for PHP scripts.
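For example, here is a rough sketch of that combination; whether each chunk actually reaches the browser depends on your FastCGI buffer configuration, so nothing here is guaranteed:
<?php
// Sketch of incremental output under FastCGI: lengthen the time limit,
// drop any output buffers the configuration opened, then flush each chunk.
set_time_limit(120);
while (ob_get_level() > 0) {
    ob_end_flush();
}
for ($i = 0; $i < 25; $i++) {
    echo "chunk $i\n";
    flush(); // ask PHP to push its output downstream
    sleep(1);
}
?>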
PHP under mod_php
If you're using mod_php, you can write incrementally out to the browser. Use the flush() command to ensure that the PHP module will flush it instantly. If you don't have output buffering, or some Apache module such as mod_gzip, then it should go out instantly to the user's browser. What's more, you can keep your PHP script running as long as you like (with set_time_limit() in PHP), under the default configurations, though of course it will consume some memory.
You may run into trouble with some browsers which don't start rendering the page until a certain amount of a page is downloaded. Some versions of IE may wait for 1KB. I've found that Chrome can wait for more. A lot of people get around this by adding padding, such as a long comment 1 or 2 KB long at the top of the document.
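If you run into that, the padding workaround is as simple as the following; the 2 KB figure is a guess at the threshold, not a documented number:
<?php
// Hypothetical padding: browsers ignore leading whitespace in HTML,
// but it pushes the response past their initial render threshold.
echo str_repeat(' ', 2048);
flush();
?>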
Calling flush() will force PHP to push all of the output buffer to the client before proceeding.
for ($i = 0; $i <= 25; $i += 1) {
    echo $i;
    flush();
    sleep(1);
}
EDIT:
After testing this on my lighttpd server I noticed that it buffered my output in blocks of 4096 characters, and I assume other servers may have similar buffering schemes. GZIP can also prevent flush() from working completely. Unfortunately there is no way to verify that it's working, due to the nature of HTTP.
Another issue with this strategy is that it keeps that PHP process blocked from serving other requests, which can cause requests to pile up.

PHP ajax multiple calls

I have been looking through several answers around the web and here, but I could not find one that solved my problem.
I am making several jQuery ajax calls to the same PHP script. At first, each call was being executed only after the previous one was done. I changed this by adding session_write_close() at the beginning of the script, to prevent PHP from locking the session against the other ajax calls. I am not editing the $_SESSION variable in the script, only reading from it.
Now the behaviour is better, but instead of all my requests starting simultaneously, they go in blocks, as you can see in the image:
What should I do to get all my requests starting at the same moment and executing independently of one another?
For better clarity, here is my js code:
var promises = [];
listMenu.forEach(function(menu) {
    var res = sendMenu(menu); // AJAX CALL
    promises.push(res);
});
$.when.apply(null, promises).done(function() {
    $('#ajaxSpinner').hide();
    listMenu = null;
});
My PHP script just inserts/updates data, and starts with:
<?php
session_start();
session_write_close();
//execution
I guess I am doing things the wrong way. Thank you in advance for your precious help!!
Thomas
This is probably a browser limitation: there is a maximum number of concurrent connections to a single server per browser instance. In Chrome this has been 6, which matches the size of the blocks shown in your screenshot. Though this report is from 2009, I believe it's still relevant: https://bugs.chromium.org/p/chromium/issues/detail?id=12066

Progress Bar when running function inside foreach loop

I have a foreach loop that calls a function to set values in an array. Sometimes it takes hours to complete, depending on how many times it has to run through the function.
What I would like is a progress bar, or at least a "1/1000 completed" style progress indicator.
Is this possible? If so, how could I implement it in my code? Would it go in the function or in the foreach loop? I've been researching and found some examples using for and $i++, but I'm not really sure how to apply that since I'm already using a foreach loop.
Thanks much.
function scrape_amazon($links) {
    // my code runs here to set all values in the $ret array
}
foreach ($links as $link) {
    $ret = scrape_amazon($link);
}
PHP probably isn't really the right tool for this task; however, what you could do is:
Launch the slow code as a background process, and output progress to a file.
Have a PHP script that polls that file for progress information (either by page refresh or AJAX)
Launching the background process can be done in several ways, including:
Launch via cron every 60 seconds, and poll for new jobs spooled in some readable area
Launch via a fork/exec mechanism from a web page
Launch as a daemon at system startup
It will take some effort to avoid problems with multiple executions and/or overlap.
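As a bare-bones sketch of the progress-file idea (the file path is made up, and sleep() stands in for the real work):
worker.php:
<?php
// Hypothetical background job, launched by cron or exec()
$total = 1000;
for ($i = 1; $i <= $total; $i++) {
    sleep(1); // stand-in for one unit of the real work
    file_put_contents('/tmp/job_progress.txt', "$i/$total completed");
}
progress.php:
<?php
// Polled by the page (refresh or AJAX) to report progress
echo file_exists('/tmp/job_progress.txt')
    ? file_get_contents('/tmp/job_progress.txt')
    : '0/1000 completed';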
I use this approach, which, well, isn't ajax and only does flushing, but it's not so ugly.
I place an image:
<img src='progress.gif' height='18' width='0' name='probar'>
Then, for every step completed on the server, I echo a line and flush:
echo "<script language='JavaScript'>\ndocument.probar.width=".(($sys["probar_width"]/$task_all)*$task_i).";\n</script>\n";
flush();
If your server (e.g. Apache) uses caching (e.g. gzip is enabled), it won't work well.

Abandon Long Processes in PHP (But let them complete)

I have an HTML form that submits to a PHP page which initiates a script. The script can take anywhere from 3 seconds to 30 seconds to run - the user doesn't need to be around for this script to complete.
Is it possible to initiate a PHP script, immediately print "Thanks" to the user (or whatever) and let them go on their merry way while your script continues to work?
In my particular case, I am sending form-data to a php script that then posts the data to numerous other locations. Waiting for all of the posts to succeed is not in my interest at the moment. I would just like to let the script run, allow the user to go and do whatever else they like, and that's it.
Place your long-term work in another PHP script, for example:
background.php:
sleep(10);
file_put_contents('foo.txt',mktime());
foreground.php
$unused_but_required = array();
proc_close(proc_open ("php background.php &", array(), $unused_but_required));
echo("Done");
You'll see "Done" immediately, and the file will get written 10 seconds later.
I think proc_close() works because we're giving proc_open() no pipes and no file descriptors.
In the script you can set:
<?php
ignore_user_abort(true);
That way the script will not terminate when the user leaves the page. However, be very careful when combining this with
set_time_limit(0);
since the script could then execute forever.
You can use set_time_limit() and ignore_user_abort(), but generally speaking I would recommend that you put the job in a queue and use an asynchronous script to process it. It's a much simpler and more durable design.
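A minimal sketch of that queue design, assuming a jobs table with id, payload and status columns (the table, columns and credentials are all made up for illustration):
enqueue.php:
<?php
// The web request only records the job, then returns at once
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // hypothetical DSN
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode($_POST)]);
echo "Thanks";
worker.php (run from cron):
<?php
// Claims one pending job and processes it
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    // Flag it first so an overlapping cron run is less likely to re-claim it
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    sleep(10); // stand-in for the real work (posting the data elsewhere)
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}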
You could try flush() and the related output-buffering functions to immediately send whatever is in the buffer to the browser.
There's an API wrapper around pcntl_fork() called php_fork.
But also, this question was on The Daily WTF... don't pound a nail with a glass bottle.
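For reference, raw pcntl_fork() (a real PHP function, though normally available only in CLI builds, not under mod_php) looks roughly like this:
<?php
// Sketch only: pcntl_fork() generally isn't available under mod_php,
// so this pattern suits CLI scripts rather than web requests.
$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} elseif ($pid) {
    echo "Done"; // parent: reply immediately
} else {
    sleep(10);   // child: carry on with the long-running work
    file_put_contents('/tmp/foo.txt', time());
}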
I ended up with the following.
<?php
// Ignore User-Requests to Abort
ignore_user_abort(true);
// Maximum Execution Time In Seconds
set_time_limit(30);
header("Content-Length: 0");
flush();
/*
Loooooooong process
*/
?>

Can a PHP script trick the browser into thinking the HTTP request is over?

First, I configure my script to run even after the HTTP request is over:
ignore_user_abort(true);
Then I flush out some text:
echo "Thats all folks!";
flush();
Now, how can I trick the browser into thinking the HTTP request is over, so I can continue doing my own work without the browser showing "page loading"?
header(??) // something like this?
Here's how to do it. You tell the browser to read in the first N characters of output and then close the connection, while your script keeps running until it's done.
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Neither call is enough on its own;
flush();        // both are required!
// At this point, the browser has closed its connection to the web server
// Do processing here
echo('Text user will never see');
?>
Headers won't work (they're headers, so they come first)
I don't know of any way to close the http connection without terminating the script, though I suppose there's some obscure way of doing it.
Telling us what you want to do after the request is done would help us give better suggestions.
But generally, I'd be thinking about one of the following:
1) Execute some simple command-line script (using exec()) that looks like:
#!/bin/sh
php myscript.php <arg1> <arg2> .. <argN> &
Then kick that off from your http-bound script like:
<?php
exec('/path/to/my/script.sh');
?>
Or:
2) Write another program (possibly a continuously-running daemon, or just some script that is cronned every so often), and figure out how your in-request code can pass it instructions. You could have a database table that queues work, or try to make it work with a flat file of some sort. You could also have your web-based script call some command-line command that causes your out-of-request script to queue some work.
At the end of the day, you don't want your script to keep executing after the HTTP request. Assuming you're using mod_php, that means you'll be tying up an Apache process until the script terminates.
Maybe this particular comment on php.net manual page will help: http://www.php.net/manual/en/features.connection-handling.php#71172
Theoretically, if HTTP 1.1 keep-alive is enabled and the client receives the number of characters it expects from the server, it should treat that as the end of the response and go ahead and render the page (while keeping the connection open). Try sending these headers (if you can't enable them another way):
Connection: keep-alive
Content-Length: n
where n is the number of characters you've sent in the response body (output buffering can help you count that). I'm sorry that I don't have the time to test this out myself; I'm just throwing in the suggestion in case it works.
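Equally untested, but here is roughly how those headers could be produced, using output buffering to do the byte counting:
<?php
ob_start();
echo 'Thats all folks!';
header('Connection: keep-alive');
header('Content-Length: ' . ob_get_length()); // n = bytes in the body so far
ob_end_flush();
flush();
// ... keep working; the browser already has its n promised bytes ...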
The best way to accomplish this is with output buffering. PHP sends the headers when it's good and ready, but if you wrap your output to the browser with the ob_* functions you can control the headers every step of the way.
You can hold a rendered page in the buffer if you want and send headers till the sun comes up in China. This practice is why you see a lot of opening <?php tags but no closing tags nowadays: it keeps the script from sending any headers prematurely, since there might be some includes to consider.
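A tiny illustration of that: with a buffer open, you can still set a header after echoing output, because nothing has actually left PHP yet.
<?php
ob_start();                       // buffer everything from here on
echo 'some rendered page content';
header('X-Example: still-works'); // hypothetical header; works because no output has been sent yet
ob_end_flush();                   // headers go out first, then the buffered body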
