I have a PHP file that takes a while to run and has several parts to it. Is there a way to send a command back to JavaScript after part of the PHP file has completed?
PHP runs on the server side and JavaScript on the client side.
PHP runs first on the server, and only once it has completed does it send its output to the client (browser).
You can use AJAX for this requirement.
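One common sketch of this idea is to have the long PHP script record its progress somewhere the browser can poll with a separate AJAX request. A minimal file-based version follows; the file name and stage labels are assumptions for illustration, not anything from the question:

```php
<?php
// long_job.php — hypothetical multi-part script. After each part it
// records its progress in a small text file; client-side JavaScript
// can poll that file (or a tiny progress.php that reads it) via AJAX
// while this script is still running.
$progressFile = __DIR__ . '/progress.txt'; // assumed shared location

function reportProgress(string $file, string $stage): void {
    file_put_contents($file, $stage);
}

reportProgress($progressFile, 'part 1 done');
// ... long-running work for part 2 ...
reportProgress($progressFile, 'part 2 done');
// ... long-running work for part 3 ...
reportProgress($progressFile, 'all done');
```

The polling request is served independently of the long-running one, so the client sees intermediate states even though the main script has not yet produced its final output.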
I have a PHP script which is run by AJAX. At the end of the script, the server makes an HTTP request to a different server to log the successful completion of the script. The problem is, this second server sometimes takes a while to respond, and I would like this to happen after the AJAX client finishes its request.
Is there some PHP library or similar which could do this? Is my best bet to log the completion to a file on the first server, then have a cron script making the HTTP requests to the second server based on the contents of the file?
You can use file_get_contents() to call a remote server from your PHP, or use PHP's more complex but more feature-rich cURL wrapper library.
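For the fire-and-forget case, one approach (a sketch under assumptions, not a drop-in solution; the host, path, and payload are placeholders) is to open the socket yourself, write the request, and close without reading the response, so the first script never waits on the slow second server:

```php
<?php
// Fire-and-forget HTTP POST: write the request and close the socket
// without waiting for the remote server's response.
function fireAndForget(string $host, string $path, string $payload): void {
    $fp = @fsockopen($host, 80, $errno, $errstr, 1); // 1s connect timeout
    if ($fp === false) {
        return; // could not connect; log locally if needed
    }
    $request  = "POST $path HTTP/1.1\r\n";
    $request .= "Host: $host\r\n";
    $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $request .= "Content-Length: " . strlen($payload) . "\r\n";
    $request .= "Connection: Close\r\n\r\n";
    $request .= $payload;
    fwrite($fp, $request);
    fclose($fp); // close without reading the response
}
```

The cron-plus-logfile idea from the question is the more robust option if the completion record must never be lost; the socket sketch trades that guarantee for simplicity.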
Apparently I use
var x=ajax.....
if(x){x.abort();}
but this does not abort execution inside my PHP file. By the way, although it is the default behaviour anyway, the script explicitly sets ignore_user_abort(false),
meaning that if the connection were lost, the script should not continue execution.
The problem is that the PHP script appears to keep executing even though AJAX on the JS client side aborted.
How can I accomplish, via JS, a successful abort on both sides, i.e. stop both the client-side request and the server-side PHP script?
That is not the way AJAX works in most cases. Once the request is made to the server, the server starts to fulfill it. Even if PHP was set to stop script execution if the client connection was dropped, in a web server configuration, it does not make such a check until output is actually attempted to be sent to the client. So if you are executing a typical AJAX type of script where you simply have a single output event, a client side abort is not going to have any effect on the server side.
If you need to confirm that something was aborted you have to send another request to ask if your job finished or not. That means that after you call xhr.abort() you have to send a request to something like /abort/jobId to find out whether it was able to abort before finishing processing.
There's no magic way to guarantee that the PHP script will stop immediately after the abort() call.
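The point about output is easy to demonstrate. In the sketch below (assuming the default ignore_user_abort(false)), the abort can only be noticed at the echo/flush; a script that never writes anything will simply run to completion regardless of xhr.abort():

```php
<?php
// PHP only notices a client abort when it actually tries to write
// output to the (now closed) connection.
ignore_user_abort(false);          // the default, shown for clarity

for ($i = 0; $i < 5; $i++) {
    // ... a chunk of real work ...

    echo "chunk $i\n";             // attempt to send something
    flush();                       // force it onto the wire

    if (connection_aborted()) {    // becomes true only after a failed write
        error_log('client aborted, stopping');
        exit;
    }
}
```

This is why a typical single-output AJAX script is unaffected by a client-side abort: there is no write, and therefore no abort check, until the work is already done.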
I wrote an HTML page with JavaScript, a PHP file, and a shell script.
They are all on the same machine.
When I run the shell script, it opens the HTML page with Firefox.
When the JavaScript finishes, it POSTs to getdata.php:
<form id="hidden_form" method="POST" action="getdata.php" ></form>
getdata.php will do some work and then send a signal to the shell script.
That is the normal behaviour. I'm afraid that at some point the PHP or JavaScript could run into an error and never send the signal to the shell script.
Is there a good and simple way for the shell script to detect whether the JavaScript is still running?
Someone below mentioned that I can have the JavaScript send a request to the server periodically, say once every 2 minutes, but how can the shell script notice this / get the signal?
As someone suggested, you can poll the server with setInterval to see whether the JavaScript engine is still running, but that might only mean that the setInterval callback is the one piece of JavaScript still alive (depending on what kind of error arises). You could POST to a "status.php" which would touch or write to a file on disk, and the bash script could poll that file at regular intervals to see whether it is being updated.
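A minimal sketch of that idea (the file path and the interval are assumptions): the in-page JavaScript posts a heartbeat to status.php, which touches a file, and the shell script watches the file's modification time.

```php
<?php
// status.php — hypothetical heartbeat endpoint. The in-page
// JavaScript POSTs here every couple of minutes (e.g. via
// setInterval); touching the file updates its mtime, which is the
// signal the shell script watches.
$heartbeatFile = '/tmp/js_heartbeat'; // assumed path shared with the shell script

touch($heartbeatFile);
echo 'ok';
```

On the shell side, something like `find /tmp/js_heartbeat -mmin -3` run in a loop would answer the question: non-empty output means a heartbeat arrived within the last 3 minutes, i.e. the page is still alive.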
I'd however suggest that you look into something like phantomjs, which lets you solve this kind of problem.
No.
Remember: JavaScript runs remotely in the client's browser; PHP (and your shell script) run on the server.
What you can do is set up an alarm (on the server) when you send the web page, and invoke an alarm handler if it triggers:
http://www.woodwose.net/thatremindsme/2011/05/forking-parallel-processes-in-bash/
If your JavaScript is running a long script, you can have it submit a request to the server at a set interval. If you don't see a request after a set period (the interval ×2 or so) and the completion signal has not been received, you can assume that the JavaScript itself has stopped running or broken.
setInterval(methodOfCommunicatingWithTheServer, 1000); // pass a function reference, not a string
etc.
As stated before: no.
Your best options in this case are, either:
Re-design your script. Is it absolutely necessary for it to send the request to the php page via the javascript in-browser?
If the answer to the above is yes, have a look at Selenium, especially Selenium WebDriver.
It does not provide bindings for bash scripts out-of-the-box, but it's pretty easy to use from other languages such as Python (or PHP, they say).
I'm sending an HTTP POST request from my C++ app to a PHP script on a server, using HttpOpenRequest/HttpSendRequest/etc. Currently it waits for the PHP script to finish executing before HttpSendRequest returns. Is there any way to change this behaviour?
I'm sending the data just before my C++ application closes, so I don't want it to sit there for 10+ seconds waiting for the PHP script to finish executing. I just want it to send the data, then close the app.
One thing I was thinking was to have the PHP script spawn another PHP script in a different process using the exec command, so the first script would return straight away. However, I'm sending a lot of data with my HTTP POST request, so the first script would need to pass this data on to the spawned process.
Ideally I just want a flag to set to tell HttpSendRequest not to wait. I couldn't see anything obvious in the docs, but may have missed something.
You can call InternetOpen with the INTERNET_FLAG_ASYNC flag and have your status callback do nothing of consequence.
Here's some example code to get you started:
http://www.codeproject.com/KB/IP/asyncwininet.aspx
Then (as rik suggests), call ignore_user_abort(true); at the top of your PHP script to ensure it executes fully.
ignore_user_abort
You may want to ignore_user_abort() in your PHP script. Then you can close the connection from your C client after the data is sent and PHP will continue to do whatever it's supposed to do.
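Putting those two answers together, the PHP side might look like the sketch below: acknowledge the POST immediately, then keep working after the C++ app has disconnected. Note that fastcgi_finish_request() only exists under PHP-FPM, hence the guard; under other SAPIs the Content-Length plus flush approach is the usual fallback.

```php
<?php
// Accept the POST, acknowledge it right away, and continue the slow
// work after the client has closed the connection.
ignore_user_abort(true);   // don't stop when the client disconnects
set_time_limit(0);         // let the long part run without a time limit

$body = 'received';
header('Connection: close');
header('Content-Length: ' . strlen($body));
echo $body;
flush();                   // the client can disconnect at this point

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // fully end the response under PHP-FPM
}

// ... long-running work continues here ...
```

With this in place, the C++ app only waits for the short acknowledgement, not for the full script.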
I was wondering about the lifespan of a PHP script when called via AJAX. Assume there is a long-running (i.e. 30-second) PHP script on a server and that page is loaded via AJAX. Before the script completes, the user closes the browser. Does the script continue running to completion, is it terminated, or is this a function of the server itself (I'm running Apache, fwiw)? Thanks for any help.
This may be of interest: ignore_user_abort()
ignore_user_abort — Set whether a client disconnect should abort script execution
However note
PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client.
The script will continue running. Closing the browser on the client does not notify the server to stop processing the request.
If you have a large, time-consuming script, I would suggest splitting it up into chunks; it is much more manageable that way.
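A sketch of that chunking idea, with the job's position persisted between AJAX calls so nothing is lost if the user leaves midway (the file-based state, item counts, and names are all assumptions for illustration):

```php
<?php
// Each AJAX request processes one slice of the job and records where
// it stopped, so the next request (or a retry) resumes from there.
$stateFile = __DIR__ . '/job_state.txt';
$chunkSize = 100;
$total     = 1000; // hypothetical number of items to process

$done = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

$end = min($done + $chunkSize, $total);
for ($i = $done; $i < $end; $i++) {
    // ... process item $i ...
}

file_put_contents($stateFile, (string) $end);
echo $end >= $total ? 'finished' : "progress: $end/$total";
```

Each request now finishes quickly, so an abandoned browser at worst wastes one small chunk rather than the whole 30-second run.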