Calling a webpage without pausing the script - PHP

Is there any method to call a webpage using cURL or anything else, without pausing the calling script?
In other words:
php code here
.
.
.
.
call_web_page();
.
.
.
php code 2 here
The script runs through the PHP code, then calls the webpage, and resumes the PHP code to the end without waiting for a result from the webpage being called.
call_web_page(); doesn't need to be a function; it could be a few lines of code that call the page ...
PS: No AJAX or exec.

You could do an Ajax request to the PHP script that executes when the page has finished loading. This way you can let the user know that you are waiting for a response while the page finishes loading. If you need some of the data you are retrieving, you could consider hiding the real page's elements and showing only a progress bar, then populating the elements once you have the data. Just an idea.
Update: you could maybe delegate the task to a process running on the machine (if you have that level of access).
And have a look at Run PHP Task Asynchronously; maybe that helps too.

You could execute a second PHP script (or a system command, such as curl) as a background process using a method similar to this answer: php execute a background process
Edit: since exec and other system commands are ruled out:
You could use file('http://yourserver.com/another.php');, which makes a request that starts another PHP process. That file contains the code below and returns immediately, so the request happens entirely on your server without going out to the internet.
ignore_user_abort(true); // keep running even if the client disconnects early
header("Connection: close");
header("Content-Length: " . mb_strlen($response, '8bit')); // byte length, not character count
echo $response;
flush(); // the caller's file() call returns here
do_function_that_takes_five_mins(); // continues after the response is sent
Code taken from: How to continue process after responding to ajax request in PHP?
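If you need the caller side without exec, a fire-and-forget socket request is a common sketch of the same idea: open a socket to your own server, send a minimal HTTP request, and close without reading the response. The helper name and the one-second timeout below are assumptions, not part of the answer above.

```php
<?php
// Hypothetical caller-side helper: send the request and return immediately,
// without waiting for (or reading) the response.
function fire_and_forget(string $host, string $path, int $port = 80): void
{
    $fp = @fsockopen($host, $port, $errno, $errstr, 1); // 1s connect timeout
    if ($fp === false) {
        return; // server unreachable; nothing to wait for
    }
    fwrite($fp, "GET {$path} HTTP/1.1\r\nHost: {$host}\r\nConnection: close\r\n\r\n");
    fclose($fp); // close without reading; the remote script keeps running
}
```

Combined with the Connection: close / flush() trick on the receiving script, the calling page never blocks on the slow work.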

What you want to do is make asynchronous calls, which is achieved by running some parallel threads and then waiting at a certain join point for all the threads to finish.
There is actually no native multithreading support in PHP, but you can look at this post: Does PHP have threading?
It has some suggestions on what to use if you want to realize multithreading in PHP.

Why not just grab the pages and cache them for, say, 30 minutes (or longer, depending on the content)? Then you don't need to wait each time a user opens the page.
You would use a process something like:
check if a local cached copy exists and is not too old
if old or missing -> fopen the remote file and save it to the cache
fopen the local file cache
repeat for as many files as you need
More reading on SO:
How can I force PHP's fopen() to return the current version of a web page?
Does PHPs fopen function implement some kind of cache?
5-minute file cache in PHP
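The steps above can be sketched in a few lines. The helper name, cache path, and 30-minute TTL below are illustrative assumptions:

```php
<?php
// Minimal sketch of the caching approach: serve a fresh local copy if one
// exists, otherwise fetch the remote page and refresh the cache.
function fetch_with_cache(string $url, string $cacheFile, int $ttl = 1800): string
{
    clearstatcache(); // make sure filemtime() below is not stale
    // 1. check if a local cached copy exists and is not too old
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    // 2. otherwise fetch the remote file ...
    $content = file_get_contents($url);
    // 3. ... and refresh the local file cache
    file_put_contents($cacheFile, $content);
    return $content;
}
```

With this in place, only the first visitor within each TTL window pays the cost of the remote fetch.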

Since PHP isn't event-based, I/O will block, and since you're dismissing any AJAX/exec solution, I don't think you'll be able to implement this in PHP.
Maybe try Node, Ruby EventMachine or Twisted Python?

Related

Laravel hangs when running command via exec without sending it to background

I have a weird issue that I've been stuck on for a couple of days now. I'm trying to generate a PDF in a Laravel app using headless Chrome with this command:
google-chrome --headless --disable-gpu --print-to-pdf=outputfile.pdf http://localurl/pdf-html
The command opens Chrome in headless mode, navigates to the given URL and prints it as a PDF, saving the file in the specified location. It works perfectly when run in my system's shell (I'm using Ubuntu 18.04). My issue arises when trying to run the same command from a Laravel controller: I've tried exec, shell_exec, system and passthru, and all give me the same problem. If I run the command without redirecting output and sending the process to the background (that is, without adding >> tmpfile 2>&1 & to the end of the command), the request hangs. Running the command in the background would not normally be a problem, except that I need the command to finish in order to send the file back to the client as a download. Running it in the background executes it asynchronously, and I have no way of knowing when the process ends (or of waiting until it ends) so that I can send the file as a download in the response.
I've tried other alternatives, to no avail. I've tried Symfony's Process, which comes bundled with Laravel, and it also fails. I've tried using Puppeteer: instead of running the google-chrome command, I use a Node.js script with code from the Puppeteer documentation (which, by the way, also works when run directly in my system's shell), but when run from Laravel it throws a Navigation Timeout Error exception.
Finally I created a simple php file with the following code:
<?php
$chromeBinary    = 'google-chrome';
$pdfRenderUrl    = "http://localhost:8000/pdf-html";
$fileName        = 'invoice.pdf';
$outputDirectory = "/path/to/my/file/" . $fileName;

$command = sprintf(
    '%s --headless --disable-gpu --print-to-pdf=%s %s',
    escapeshellarg($chromeBinary),
    escapeshellarg($outputDirectory),
    escapeshellarg($pdfRenderUrl)
);
exec($command);

echo (file_exists("/path/to/my/file/" . $fileName) ? 'TRUE' : 'FALSE');
?>
The code runs just fine when executed from the shell with php thefile.php, printing TRUE, meaning the command in exec was launched and, after it ended, the file existed. And THAT is the exact code I'm using in Laravel, except it only works, as mentioned above, when I send the process to the background.
Can anybody throw me a line here, please? Thanks
EDIT: #namoshek thanks for the quick reply, and sorry if I did not make myself clear. The problem is not long wait times; perhaps I could live with those. The problem is that exec never finishes and I eventually have to forcefully terminate the process (neither exec nor any other alternative works; they all freeze the request completely, forever, with the exception of Process, which fails by throwing a TimeoutException). I'm using Postman to query the endpoint. The frontend is an Angular app, meaning the request for the invoice download will eventually be made asynchronously. Furthermore, the task itself is not long-running; as a matter of fact it finishes pretty quickly. Using a polling strategy or a notification system, to me, does not seem like a viable solution. Imagine an app with a button to download a simple document, where you have to click the button and then wait for the app to notify you via email (or some other way) that the document is ready. I could understand it if it were a more complicated process, but a document download seems trivial. What has me at a loss is why running the task from a plain PHP script works as I want it to (synchronously), while I can't replicate that behaviour in the Laravel controller.
EDIT: I've also tried using Browsershot, which, by the way, also fails. Browsershot interacts behind the scenes with Puppeteer, using Process, to generate a PDF file. And even though it's an external program, the behaviour I'm getting still doesn't seem normal: I should be able to obtain the download even if the request took 10 seconds to finish, because the external program was executed synchronously. But in my case it fails with a timeout error.
EDIT: So after a while I came upon the apparent reason for the server hang-up. The problem is that I was using artisan's development server. This did not initially seem like a problem to me, but it appears artisan can't handle that load. In the feature I'm implementing, I perform a request to a particular endpoint, let's call it endpoint 1, to generate the PDF. The code on this endpoint triggers the external command, and when executed synchronously, the code in endpoint 1 waits for the external command to finish. The external command in turn needs to browse to endpoint 2 on the same server, which contains an HTML view with the content to be put in the PDF. Since the server is still waiting on endpoint 1 for the external command to return, endpoint 2 is unresponsive, which creates a deadlock that artisan's development server can't handle. The problem is that a quick search turned up nothing indicating this deficiency in artisan's development server. I moved the environment to Apache just to test my theory, and it worked, though it should be noted that the request takes a very long time to finish (around 10-20 seconds). This, so far, seems like the only reasonable explanation for the issue. If anyone knows how I can improve the performance of this request, or can provide a better explanation of the original issue, I'd appreciate it.
#hrivera I'm a bit late to the game here, but regarding your last edit I believe you're almost correct. My take is that PHP's built-in server, which Laravel uses for development, is single-threaded. The issue I had is that any assets within the page being passed to Chrome (CSS, JS, etc.) couldn't be loaded, as the single thread was already in use, and so it hung. Removing the assets from the HTML fixed the issue.
Production servers are multi-threaded, so we should have no issues there. I'm not entirely sure I'm right, but I wanted to comment anyway.
I don't really get what you are asking for, because it seems you already understood that executing a long-running task like creating a snapshot will block the request if run synchronously. Using other software such as Puppeteer will not change that. If your request needs to wait for the result of this process, then the only way to have it return faster is to speed up the task itself, which is most likely not possible.
So there are basically only two options left: live with the long wait times (if you want to perform the task synchronously) or execute the request/task asynchronously. The latter can be achieved in two ways:
Make the actual HTTP request run in the background (using ajax) and use a loading indicator to keep your users patient. This way you could still run the process synchronously, but I would not recommend it, as you would have to use high timeout values for the ajax request, and in some situations the requests would probably still time out (depending on the workload of your server).
Use the power of Laravel and make use of queue workers to perform the calculation in background. When the snapshot generation is finished, you can then use one of the following three options to return the result:
Use polling on the client side to see if the result is available.
Send the result or a link to the result per mail or something similar to the user.
Use a notification system to notify the user about the finished process and return the result in some way (i.e. fetch it or send it as part of the notification - there are plenty of options available). A built-in notification system that does exactly what I described is Laravel Echo. On receiving the notification that tells you the process finished, you could then fetch the result from the server.
These days, the standard for web apps and user experience is option 2 with the notification system (the third point).

How to display progress bar in PHP exec functions

I am running an external script in PHP using its exec() function, and I have been looking at various options for creating a progress bar. I am able to create a plain rotating loader through AJAX, but I couldn't achieve a percentage progress bar. Is there any way to do that?
Depending on the program you want to execute, you could use proc_open() instead of exec(), so you can process the output, calculate a percentage, and send it back to your ajax script using ob_flush() and flush().
It's not true that you need to wait for the execution to finish before sending any output, but there are some caveats with server cache configuration and browser rendering engines, so it is not very reliable.
If you're not using WebSockets (the clean, modern option, which can be achieved in PHP using Ratchet, or in Node.js using various options), the most reliable way of doing what you want is polling.
Briefly, instead of calling your script once, you make a first ajax request to start the process and then poll the server repeatedly to ask for the execution status of your script.
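A minimal sketch of the proc_open() idea: read the command's output line by line and turn it into a percentage. The helper name, the example command, and the assumption that one output line equals one step of a known total are all illustrative:

```php
<?php
// Hypothetical helper: run a command, capture its stdout, and report a
// percentage for each line of output it produces.
function run_with_progress(string $cmd, int $totalSteps, callable $onProgress): int
{
    $proc = proc_open($cmd, [1 => ['pipe', 'w']], $pipes); // capture stdout
    $done = 0;
    while (($line = fgets($pipes[1])) !== false) {
        $done++;
        // An ajax endpoint would echo this percentage and call flush()/ob_flush()
        $onProgress((int) round(100 * $done / $totalSteps), rtrim($line));
    }
    fclose($pipes[1]);
    return proc_close($proc); // the command's exit code
}
```

In a polling setup, you would instead write the latest percentage to a file or cache, and let the repeated ajax requests read it from there.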
For more information, take a look at those answers:
Run process with realtime output in PHP
PHP - Flushing While Loop Data with Ajax
Grab results from a php exec() while the command is still running?
PHP runs on the server and thus cannot achieve this by itself (to my knowledge);
here are some answered questions that might be able to help you.
How to show loading status in percentage for ajax response?
Jquery:: Ajax powered progress bar?

Terminate PHP script on server through AJAX call

I have written a PHP script to import a large amount of data. The import process is triggered through an Ajax call, and the ajax request keeps waiting for the server response. As I am working on a dedicated server, there is no timeout issue.
The problem is that we require a feature to terminate the import process, for example a stop button on the client side. We thought that if we killed the waiting ajax call, the process on the server would also stop, as there would be no request left to serve. But unfortunately that is not the case: the script keeps executing on the server side even after the Ajax request has been killed from the client.
Secondly, we use PHP sessions in this project. Say the cancel button triggers an Ajax call to another script on the server to stop the process: how could that request reach the server if there is already a waiting ajax request? PHP/Apache will hold the second request until the first request cycle is completed.
Note: As per our project architecture we require session_start() on every page. It will be good if anyone can guide on these issues.
You can write the process ID, at the start of the script, to a file, the DB, or a cache. You can get the process ID with http://php.net/manual/en/function.getmypid.php. This assumes each script has its own process ID.
The kill script (not using the locked session) could read that process ID and try to kill the process.
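A sketch of the PID-file idea, assuming the POSIX extension is available (it usually is on Linux) and a writable location for the PID file; both helper names are hypothetical:

```php
<?php
// Called by the import script when it starts.
function record_pid(string $pidFile): void
{
    file_put_contents($pidFile, getmypid());
}

// Called by the separate "stop" endpoint, which must NOT hold the locked session.
function request_stop(string $pidFile): bool
{
    $pid = (int) trim((string) @file_get_contents($pidFile));
    if ($pid <= 0 || !posix_kill($pid, 0)) { // signal 0 only checks existence
        return false;                        // no such process (already finished?)
    }
    return posix_kill($pid, 15);             // 15 = SIGTERM: ask it to stop
}
```

SIGTERM lets the import script clean up (via a shutdown function or signal handler) rather than being killed outright.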
Be careful when doing long-running processes in PHP, as PHP's Zend Engine and garbage collector are not suited to long-running processes.
So I strongly suggest using a proper job manager like Gearman, which has a PHP extension. Using a job manager will give you full control over each process: you can start and stop processes and tasks.
Another option is to use a queue, like AMQP, to handle these tasks more cleanly. Which one is more suitable for your use case, I'll let you decide.
How about setting a time limit slightly higher than your AJAX timeout on your PHP script? This should kill the PHP script if it runs over time. Something similar to this:
<?php
set_time_limit(20);
$i = 1; // the original snippet never initialized $i
while ($i <= 10) {
    echo "i=$i ";
    sleep(100); // note: on Unix, time spent in sleep() does not count toward the limit
    $i++;
}
?>
Source: http://php.net/manual/en/function.set-time-limit.php
Once a script runs it can only be stopped by ending the php process working on the script. One possibility would be to use the session to store a "continue" condition when another script is called.
For example:
Script 1 is the worker (importer)
Script 2 is a function called repeatedly by ajax as long as the importer shall work.
Script 1 and 2 share let's say $_SESSION['lastPing'].
so
Script 2 sets $_SESSION['lastPing'] = time(); on each call.
Script 1 has the condition if (time() - $_SESSION['lastPing'] > 30) { die(); }
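The heartbeat check can be isolated into a small helper (the name is hypothetical; 30 seconds matches the value above):

```php
<?php
// True when the client has not pinged within the timeout window.
function pingIsStale(int $lastPing, int $timeout = 30): bool
{
    return (time() - $lastPing) > $timeout;
}

// Script 2 (the repeated ajax ping) would run:
//   session_start(); $_SESSION['lastPing'] = time(); session_write_close();
// Script 1 (the importer) re-reads the session between batches and checks:
//   if (pingIsStale($_SESSION['lastPing'])) { die(); }
// Note: the importer must call session_write_close() while working, or it
// will hold the session lock and the ping requests will never get through.
```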
You might be able to handle this using the proc_ functions. A possible outline of the steps:
Create a unique token and pass it to the server along with the order to begin the import.
When order to import is received (via AJAX):
a) Store a record showing that this token is being processed (via file, db, memcached).
b) Run the import script using proc_open.
c) Begin a polling loop, checking the record from (2a) to see that this token is still in processing status. If not, call proc_terminate and exit the loop.
If the order to stop import is given (in a separate AJAX call), update the persisted record to indicate that the particular token should be stopped, which will be picked up in (2c)
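Steps 2b/2c might look roughly like this, using a stop-flag file as a stand-in for the persisted record keyed by token (the function name and polling interval are assumptions):

```php
<?php
// Run a command under proc_open() and poll a stop flag; terminate the
// child if the flag appears. Returns true if the command finished on its
// own, false if it was stopped on request.
function run_cancellable(string $cmd, string $stopFlagFile): bool
{
    $proc = proc_open($cmd, [], $pipes); // child inherits stdio
    while (true) {
        $status = proc_get_status($proc);
        if (!$status['running']) {
            proc_close($proc);
            return true;               // import finished by itself
        }
        if (is_file($stopFlagFile)) {  // the separate AJAX call set the flag
            proc_terminate($proc);
            proc_close($proc);
            return false;              // stopped on request
        }
        usleep(200000);                // poll five times per second
    }
}
```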
Go to the following link for the exact solution to your problem:
PHP auto-kill a script if the HTTP request is cancelled/closed

Is there a way to launch a PHP script and get its status?

Is it possible to launch a PHP script in the background on the webserver with JS, let it keep running even if you change pages or don't visit the site at all, and then get its current status if you call the PHP script again later?
This PHP script will process data for hours, sleeping for X seconds/minutes in each loop. If the above is possible, how can I even get "echos" from it, given that PHP only generates output when the script ends?
Maybe this is not a job for PHP?
thank you
EDIT: on a windows machine with apache
It certainly is possible - I have several scripts that run 24/7 written in PHP. Check out Creating Daemons in PHP. It has good info on how to 'daemonize' a php script so that it will run like a service, and it also covers signal handling.
To get debugging output you would redirect to a log file. Do a search on "unix redirect output" as there is a lot of info available.
In Windows it's not much different from UNIX.
First of all, you need to create a PHP script with a run loop. For example, take a look at this: http://code.google.com/p/php-apns/ . This is a PHP "daemon": the main script, PushMonitor.php, runs forever because it has an infinite loop. It polls a queue at regular intervals, executes the actions, and then waits. Really simple, actually!
The problem, in your case, is that you want to launch the "daemon" from a PHP script.
You may want to look at this: http://robert.accettura.com/blog/2006/09/14/asynchronous-processing-with-php/ (first example code). You would execute something like launchBackgroundProcess('php myscript.php').
Note that the code uses the "start /b" command (and the "&" at the end of the command for UNIX). That is important, because otherwise your process would be killed when the PHP script of the web page terminated (child processes die after the parent dies!).
Also, remember that the "php" executable (CLI) must be in your path (so you can run "php" from the command line).
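A sketch of what such a launch helper could look like; the function name comes from the linked post, but this body is an assumption:

```php
<?php
// Launch a command in the background and return immediately, so the web
// request can finish while the command keeps running.
function launchBackgroundProcess(string $cmd): void
{
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        // "start /b" detaches the process on Windows
        pclose(popen('start /b ' . $cmd, 'r'));
    } else {
        // redirect output and append "&" so the shell returns at once
        exec($cmd . ' > /dev/null 2>&1 &');
    }
}
```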
Since the PHP script of the page launching the background process is going to terminate, you can't directly catch the "echoes" in a simple way. My suggestion is to write all output to a file (or a database, etc.) and then read the contents from that source when necessary.
So, instead of echo, you would use file_put_contents() etc.
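A minimal sketch of the log-file idea; the paths and function names are assumptions:

```php
<?php
// The daemon appends a timestamped line instead of echoing.
function report_status(string $statusFile, string $message): void
{
    file_put_contents($statusFile, date('c') . ' ' . $message . PHP_EOL, FILE_APPEND);
}

// The status page simply returns whatever the daemon has written so far.
function current_status(string $statusFile): string
{
    return is_file($statusFile) ? file_get_contents($statusFile) : 'not started';
}
```

The background script calls report_status() from inside its loop; any later web request can call current_status() to see how far it has gotten.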

Asynchronous external application execution in PHP

I was wondering if it is possible to send the output of an application run by PHP to the client.
For example i have application that outputs:
Hello world
And after 10 seconds it outputs
10 seconds passed
I'd like to know if it is possible to send "Hello world" and "10 seconds passed" to the client without waiting until the whole program finishes its job. The client would receive "Hello world" first, and the second output after 10 seconds.
Thank you.
Your title says "Asynchronous external application execution". By this, you would mean something that executes a program from your PHP script yet continues as its own process and does not hang the page load. You may want passthru(), specifically setting the command to output to a local file rather than to your script (personally not tested, though the PHP manual says you can), or pcntl_fork() to split your script into a separate process which will handle the program execution on the side. However, sending more data to a browser after it has already disconnected from your server, and expecting it to display your uninvited message, is impossible: the client would have to accept a second, new TCP connection forced on it by the server.
But if you want a progress message during your page load, simply echo "still loading..." at points along your for or while loops. File download progress bars, on the other hand, cannot be handled in PHP: echoing "still loading..." in the middle of the download would corrupt the file. At the moment, I'm not aware of any facility to do this in PHP, JavaScript, or VB, except through the browser's own API (if documented) and only if the client allows it by installing a plugin you authored. But why bother, when browsers already have built-in progress bars?
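If the goal is merely to relay the program's lines to the client as they appear, a streaming sketch with popen() looks like this. Whether the browser actually renders the chunks incrementally depends on buffering along the way (PHP output buffering, the web server, proxies); the helper name is an assumption:

```php
<?php
// Echo each line of the command's output as soon as it is produced,
// flushing it toward the client instead of waiting for the program to end.
function stream_command(string $cmd): void
{
    $handle = popen($cmd, 'r');
    while (($line = fgets($handle)) !== false) {
        echo $line;
        flush(); // push this chunk toward the client immediately
    }
    pclose($handle);
}
```

For the example in the question, stream_command('php slow-program.php') (with slow-program.php as a placeholder) would emit "Hello world" right away and "10 seconds passed" when the program prints it.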
I think you should do this with JavaScript. It's totally unnecessary to use CPU cycles on the server when all you require is to show how much time has passed.
Usually, a client pulls content from the server. If you want to push from the server to the client, you need to look into push technologies like Comet. There is not much available for PHP, though. Periodic pushing, with the PHP script terminating in between, requires a message queue.
I don't understand your application, but for batch processing this comes to mind:
php hello-world.php | php client.php
To scale it, use Hadoop.
