If I'm generating a stream of data to send out to a browser, and the user closes the browser, can I tell within PHP that I don't need to bother generating or sending the rest of the stream? I'd like to insert something into this loop:
while (!feof($pipes[1])) {
    echo fgets($pipes[1]);
}
My fallback plan is to have the browser use a JavaScript onunload to hit another PHP page to kill the process that's generating the data, but it would be cleaner if PHP could tell when I'm echoing to nowhere.
By default PHP will abort the script if the user navigates away. There are, however, times when you don't want this to happen, so PHP has a configuration setting for it called ignore_user_abort.
http://php.net/manual/en/misc.configuration.php
There's also a function called register_shutdown_function() that is supposedly executed when execution halts. I've never actually used it, so I won't vouch for how well it works, but I thought I'd mention it for completeness.
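For the streaming case in the question, a minimal sketch (reusing $pipes[1] from the question; the shutdown callback and its log message are just placeholders) might look like this:
ignore_user_abort(true); // keep running even after the browser disconnects
register_shutdown_function(function () {
    // Hypothetical cleanup hook: runs when the script finally stops.
    error_log('stream finished, connection_aborted=' . (int) connection_aborted());
});
while (!feof($pipes[1])) {
    echo fgets($pipes[1]);
    flush(); // a failed write here is what lets PHP notice the disconnect
    if (connection_aborted()) {
        break; // client is gone; stop generating (or keep going if you prefer)
    }
}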
I believe the script will automatically abort when the page is loaded normally (no Ajax). But if you want to implement some sort of long polling in PHP using XMLHttpRequest, I think you will have to handle it with some JavaScript, because PHP can't detect the disconnect in that case. I'd also like to know the precise case.
These answers pointed me towards what I was looking for. The underlying process needed special attention to kill it. I needed to jump out of the loop. Thanks again, Stack Overflow.
while (!feof($pipes[1]) && !connection_aborted()) {
    echo fgets($pipes[1]);
}
if (connection_aborted()) {
    exec('kill -4 ' . $mypid);
}
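For reference, $mypid here is presumably the PID of the process behind $pipes[1]. A minimal sketch of obtaining it, assuming the stream was opened with proc_open() (the command path is a placeholder):
$process = proc_open('/path/to/generator', [1 => ['pipe', 'w']], $pipes); // hypothetical command
$mypid = proc_get_status($process)['pid']; // PID of the child process
// ... streaming loop as above ...
if (connection_aborted()) {
    proc_terminate($process); // sends SIGTERM; avoids shelling out to kill
}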
Related
I have two functions, let's call them login and doSomething, and currently I have implemented them this way:
$member=$this->login();
$this->doSomething($member);
//show welcome page
When a user logs in, I want to do some extra work, but it takes around 20 seconds or more to complete. Is there any way to show the welcome page immediately after login() has run, while the method doSomething() is executed separately? The method doSomething() doesn't return any values, so it does not affect the welcome page.
Please try the following.
ob_start();
$member = $this->login();
ob_end_flush(); // flush the buffer contents and turn output buffering off
flush();        // push the output out to the browser
$this->doSomething($member);
If you do not want to print anything after login, you can use:
ob_start();
$this->doSomething($member);
ob_end_clean();
Alternatively, using Ajax from the front site's login page (after it has loaded), you can start processing
$this->doSomething($member);
silently in a separate Ajax call handled in the back end.
There are other ways of achieving threading or pseudo-threading-like behaviour, but these are the easiest ones for your scenario :)
You can also look at worker threads (for example, PHP's pthreads extension); implementation examples and documentation are available on the net.
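If the site runs under PHP-FPM, another option worth mentioning (a minimal sketch, not taken from the answers here) is fastcgi_finish_request(), which sends the response to the browser and then lets the script keep working:
// Sketch only: fastcgi_finish_request() exists under PHP-FPM, not other SAPIs.
$member = $this->login();
// ... render the welcome page here ...
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // response is sent; the browser stops waiting
}
$this->doSomething($member);  // keeps running on the server after the response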
If you really, really want to run it in parallel, then you need to run it in a separate process. That means it runs in a different scope, so while the code you invoke might contain $this->doSomething($member), that "this" won't be this "this".
Assuming that is possible, then your question is a duplicate of this one (but beware: the accepted answer is not good). Note that you will run into blocking problems if both parts of the script depend on the session.
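A minimal sketch of that separate-process approach (the worker script name and the id property are hypothetical): the web request hands the member ID to a CLI script and returns immediately.
// In the web request, after login():
$memberId = (int) $member->id; // hypothetical property
exec('php /path/to/do_something_worker.php ' . escapeshellarg((string) $memberId)
    . ' > /dev/null 2>&1 &');  // run in the background, don't wait for output

// do_something_worker.php would bootstrap the application itself, load the
// member by ID and call doSomething() there; it cannot reuse $this.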
I have been looking for several answers around the web and here, but I could not find one that solved my problem.
I am making several jQuery Ajax calls to the same PHP script. At first, I was seeing each call being executed only after the previous one was done. I changed this by adding session_write_close() to the beginning of the script, to prevent PHP from locking the session against the other Ajax calls. I am not editing the $_SESSION variable in the script, only reading from it.
Now the behaviour is better, but instead of having all my requests start simultaneously, they go in blocks, as you can see in the image:
What should I do to get all my requests starting at the same moment and actually being executed independently of the other requests?
For better clarity, here is my js code:
var promises = [];
listMenu.forEach(function(menu) {
    var res = sendMenu(menu); // AJAX call
    promises.push(res);
});
$.when.apply(null, promises).done(function() {
    $('#ajaxSpinner').hide();
    listMenu = null;
});
My PHP script is just inserting/updating data, and start with:
<?php
session_start();
session_write_close();
//execution
I guess I am doing things the wrong way. Thank you in advance for your precious help!
Thomas
This is probably a browser limitation: there is a maximum number of concurrent connections to a single server per browser instance. In Chrome this has been 6, which reflects the size of the blocks shown in your screenshot. Though this is from 2009, I believe it's still relevant: https://bugs.chromium.org/p/chromium/issues/detail?id=12066
Good day!
I am having some issues getting the echo statement to output before the execution of the exec():
<?php
if (isset($_POST['ipaddress'])) {
    $escaped_command = escapeshellcmd($_POST['ipaddress']);
    if (filter_var($escaped_command, FILTER_VALIDATE_IP)) {
        echo "Gleaning ARP information, please wait..";
        $command = exec('sudo /sbin/getarp.exp');
        // ...
    }
}
The echo statement is being output after the execution of $command. The execution time can be anywhere from 15-30 seconds depending on how large the ARP table on the remote router is. Is there an order of operations that I am not aware of? It appears that all the statements within the if statement are executed in parallel and not line by line as I had assumed.
I would rather not have a solution provided, but some documentation links that would lead me to finding one. I have searched what I could, but was not able to find a viable solution.
Any help would be appreciated.
Thanks.
This is happening because the script will run in its entirety before any result/output is sent to the browser.
In PHP there is a concept of "output buffering".
Whenever you output something (e.g. using echo, print, etc.) the text is thrown into a buffer. This buffer is only sent at certain times (at the end of the request, for instance, or when the buffer is full).
In order to empty the buffer (to "flush" it) you need to do it manually. The flush() function will do this. Sometimes you also need to call ob_flush() (this is if you have opened custom output buffers yourself). It is generally a good idea to just call both functions and be done with it:
echo 'Wait a few seconds...';
flush(); ob_flush();
sleep(3);
echo ' aaand we are done!';
See Output Buffering Control for more information on output buffering in PHP.
This is probably an issue with the output buffer. PHP buffers output and writes it to the browser in chunks. Try adding a call to ob_flush() between the echo and the exec(); this will force PHP to write the current contents of the buffer to the browser.
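Applied to the snippet from the question, that might look roughly like this (flush() is added alongside ob_flush(), since both are usually needed, and server-level buffering such as output compression can still delay delivery):
echo "Gleaning ARP information, please wait..";
if (ob_get_level() > 0) {
    ob_flush();  // flush PHP's own output buffer, if one is active
}
flush();         // ask the web server to send it to the browser now
$command = exec('sudo /sbin/getarp.exp'); // the long-running command from the question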
By default, PHP does not send any of the output until the script has finished running completely. There is a solution; however, I hear it is a little browser-dependent. I would test it on different systems and browsers to see if it is working:
ob_implicit_flush(true);
Put that before any of your echo/print commands and that should allow anything printed to show up right away in the browser.
A more universal approach would be to integrate your page with asynchronous JavaScript, a technique commonly referred to as "AJAX". It is a little more difficult because it requires the use of many interacting scripts, some client-side and some server-side. However, AJAX is the de facto way to do things like this on the web.
I have a PHP script that I need to execute from inside another PHP webpage. However for the second one to run properly the first needs to have fully completed. Essentially I need the first page to spawn a new process/thread for the second script which will wait 1 second before starting.
Doing an include causes blocking, which prevents it from working, and I can't get it to start using exec().
Edit:
Should have clarified. These pages have no output and are not interfaced with through a web interface. All pages are called by POST requests from another server.
Edit 2:
Solution: make the server requesting the page send a request directly to the second page 1 second after the first returns.
proc_open is the correct choice, as @ChristopherMorrissey pointed out. I want to elaborate a little here, as there are some caveats to using proc_open that aren't entirely obvious.
The first code example at http://php.net/manual/en/function.proc-open.php shows the overall usage, and I will reference that.
The first caveat is with the pipes. The pipes are file streams in PHP that link to STDIN, STDOUT and STDERR of the child process. In the example, pipe index 0 represents a file stream from the parent PHP process's perspective. If the parent process writes to this stream, it will appear as STDIN input to the child process.
On POSIX-compliant OSes, STDIN of a process needs to close before the process can terminate. It's very important to call fclose on the pipe from the parent, or your child process will be stuck. That is done with this line in the example:
fclose($pipes[0]);
The other caveat is checking the exit code of the child process. Checking the exit code is the best way to determine whether the child process exited correctly or erred out. At the very least, you will need to know when the child process has completed. Checking this and the exit code are both done with http://www.php.net/manual/en/function.proc-get-status.php
If you want to ensure the process exits correctly, you will need to look at the exitcode field returned in the array from proc_get_status. Keep in mind this exit code will only return a valid value once; all other times it will return -1. So, the one time it returns > -1 is your actual exit code. In other words: the first time running == false, check exitcode.
I hope this helps.
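Putting those caveats together, a minimal sketch might look like this (the command and script path are placeholders):
$descriptors = [
    0 => ['pipe', 'r'], // child's STDIN  - the parent writes to it
    1 => ['pipe', 'w'], // child's STDOUT - the parent reads from it
    2 => ['pipe', 'w'], // child's STDERR
];
$process = proc_open('php /path/to/second_script.php', $descriptors, $pipes);

fclose($pipes[0]); // close STDIN so the child is able to terminate

// Read until the child closes STDOUT; this also keeps the pipe from filling up.
$output = stream_get_contents($pipes[1]);
fclose($pipes[1]);
fclose($pipes[2]);

// exitcode is only valid the first time proc_get_status() reports running = false.
do {
    $status = proc_get_status($process);
    if (!$status['running']) {
        $exitCode = $status['exitcode'];
        break;
    }
    usleep(100000); // still running; wait 0.1s and check again
} while (true);

proc_close($process);
echo $output, PHP_EOL, "child exited with code $exitCode", PHP_EOL;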
One option would be to redirect the page after the first script has finished.
You could do it this way:
//first script here
sleep(1); //wait one second
echo "<meta http-equiv=\"refresh\" content=\"0;URL='yoursecondscript.php'\" />"; //redirect
or even:
//first script here
//redirect after one sec
echo "<meta http-equiv=\"refresh\" content=\"1;URL='http://thetudors.example.com/'\" />";
You might be able to send a header redirecting to the second page when you want it, then send a header back to the original page at the end of the second page. Though there are many reasons this fix might not work, such as the PHP code needing to be in the HTML.
Headers reference
So something like: header('Location: seconddocument.php');
Or maybe you could put the PHP to execute in a function, and then call it from the original PHP document. I can't be exactly sure of your requirements here, but those would be my two best answers.
I want to watch for a cookie and, if this cookie exists, do something. I tried this:
while (true) {
    sleep(1);
    if (isset($_COOKIE['name'])) {
        doSomethink();
    }
}
But this code really makes execution 'sleep'.
Is there a way to watch without stopping the execution?
If this is browser activity, a better way to do this may be a JavaScript setInterval() with an Ajax call to a PHP function.
Otherwise, I'd recommend running the PHP script via cron. There's no other way that I know of to run PHP asynchronously, which is probably what you're trying to accomplish.