File upload aborted in the middle of uploading - php

A brief overview of how the upload process flows in my team's project.
On the client side we call the upload service on the application server via AJAX. The application server then forwards the file to the file server via cURL (the file server is private and can only be accessed by the application server).
The situation is like this.
While a file is uploading, it has already passed through the application server and reached the file server, but before the response gets back to the client side, the user clicks the cancel button.
How can the application server detect that the user aborted the request, and then call delete on the file server if the file was already uploaded?
My solution
With the PHP setting ignore_user_abort=false I can't check whether the upload was cancelled, so I set it to true before the cURL call.
ini_set('ignore_user_abort', TRUE);
** Btw, even with ignore_user_abort=false, cURL execution is still not terminated immediately after the client aborts.
Set CURLOPT_NOPROGRESS to false and register a progress callback to detect whether the call was aborted.
curl_setopt($ch, CURLOPT_NOPROGRESS, false);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, 'callback_progress');
Handle progress
$cancelled = false; // cancel flag
function callback_progress($download_size, $downloaded, $upload_size, $uploaded){
    // Really needed; without this, connection_aborted() never becomes true and cURL keeps running even after the client aborts
    print " "; ob_flush(); flush();
    if (connection_aborted() != 0) {
        if (!$cancelled) $cancelled = true;
        return 0; // continue the transfer and handle the cancellation later
    }
}
Proceed
curl_close($ch); // close cURL
ini_set('ignore_user_abort', FALSE); // set back to false
// If $cancelled is true, make a delete call to the file server using the file id
if ($cancelled && isset($response['id'])) return $this->removeFile($response['id']);
But it doesn't work. $cancelled is still false, even though it was set to true inside the callback_progress function.
Is there a better way to do this? I can't find the right solution for this situation anywhere on the net.

I believe this is a case of the $cancelled variable and the call to the removeFile method not being visible where your progress function runs: the part of the script you put them in will definitely have run before the first call to callback_progress is made.
Try putting the check on $cancelled and $response['id'] inside callback_progress. That way the removeFile call should run when $cancelled is indeed true.
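Another way to make the flag visible outside the callback is to register the progress handler as a closure that captures $cancelled by reference, so the value set inside the callback can be checked after curl_exec() returns. This is only a sketch, not the asker's exact setup; $ch, removeFile(), and the JSON "id" field in the file server's response are assumed from the question:

$cancelled = false;

curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false);
// The closure captures $cancelled by reference, so the outer scope sees the change.
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($download_size, $downloaded, $upload_size, $uploaded) use (&$cancelled) {
    // Push a byte to the client so connection_aborted() gets updated.
    print " "; ob_flush(); flush();
    if (connection_aborted() != 0) {
        $cancelled = true;
    }
    return 0; // keep the transfer going; clean up after curl_exec() returns
});

$response = json_decode(curl_exec($ch), true); // assumes the file server replies with JSON containing an "id"
curl_close($ch);
ini_set('ignore_user_abort', FALSE);

if ($cancelled && isset($response['id'])) {
    return $this->removeFile($response['id']); // delete the orphaned file from the file server
}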


Send response and continue executing script

I'm creating a high-load Telegram bot, which means many requests come in and they take time to handle.
I use a webhook (Telegram sends updates to, say, handler.php), which requires responding with a correct [] answer and fully closing the connection before Telegram will send the next update. Otherwise Telegram only increases pending_update_count and won't send any new updates until the previous one is handled.
So what I'm trying to do is respond correctly and close the connection before the heavy code is executed.
StackOverflow suggests some solutions (e.g. "How do I close a connection early?" and "Send response and continue executing script - PHP"), but none of them work for me because they only close the connection if there was no output, and I need to respond with [].
The only workaround I came up with is to call shell_exec('php sendMessage.php >/dev/null 2>/dev/null &') from handler.php. It works perfectly well, but that's not what I need. Any other suggestions for responding and closing the connection so the code can execute in the background?
I found a simple solution that works for Telegram Bot API webhooks. It allows me to call the handling script from within the script itself without using exec() or shell_exec().
Time-consuming code
$bool = true; // some condition in your regular code
if ($bool) {
    sleep(10);
}
This will cause a huge delay for other bot users since Telegram didn't get the response and won't send you any other updates.
Solution
if (isset($_GET['action'])) {
    // Background invocation: do the heavy work here, then stop so this
    // run doesn't fire yet another request to itself below.
    sleep(10);
    exit;
}

// Foreground invocation from Telegram: kick off a request to ourselves
// with ?action set, wait at most 50 ms, and return [] right away.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/handler.php?action');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 50);
curl_exec($ch);
curl_close($ch);
This way the script calls itself (or any other script), Telegram gets a correct and quick response, and the heavy part is executed separately.
It is strongly recommended not to go below 50 ms for CURLOPT_TIMEOUT_MS; with lower values the self-request sometimes just doesn't go through when handling requests from Telegram.

How to use amphp/parallel library for non-blocking process

I want to use the amphp/parallel library for a non-blocking process. I have a simple download-file function that does a cURL hit to a remote image file and saves it locally. I'm hitting this method through a REST API. Basically, I want the download to run asynchronously in the background: the REST API hits the function and the function effectively says, "OK, I'm downloading in the background, you can proceed." In other words, it should be non-blocking, and the API gets an OK response without waiting. Meanwhile, if there is a network failure during the download, a worker can restart the process after some time. How do I start?
I have tried the following code, but it did not work.
require_once "vendor/autoload.php";
use Amp\Loop;
use Amp\Parallel\Worker\CallableTask;
use Amp\Parallel\Worker\DefaultWorkerFactory;
\Amp\Loop::run(function () {
    $remote_file_url = "some remote image url"; // http://example.com/some.png
    $file_save_path = "save path for file"; // var/www/html/some.png
    $factory = new DefaultWorkerFactory();
    $worker = $factory->create();
    $result = yield $worker->enqueue(new CallableTask('downloadFile', [$remote_file_url, $file_save_path]));
    $code = yield $worker->shutdown();
});
//downloadFile is a simple download function
function downloadFile($remoteFile, $localFile) {
    if (!$remoteFile || !$localFile) {
        return;
    }
    set_time_limit(0);
    $fp = fopen($localFile, 'w+');
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $remoteFile);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $result = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $result ? true : false;
}
I'm getting this error:
PHP Fatal error: Uncaught Amp\Parallel\Worker\TaskError: Uncaught Error in worker with message "Call to undefined function downloadFile()" and code "0" in /var/www/html/test/vendor/amphp/parallel/lib/Worker/Internal/TaskFailure.php:45
Stack trace:
#0 /var/www/html/test/vendor/amphp/parallel/lib/Worker/TaskWorker.php(126): Amp\Parallel\Worker\Internal\TaskFailure->promise()
#1 [internal function]: Amp\Parallel\Worker\TaskWorker->Amp\Parallel\Worker\{closure}()
#2 /var/www/html/test/vendor/amphp/amp/lib/Coroutine.php(76): Generator->send(Object(Amp\Parallel\Worker\Internal\TaskFailure))
#3 /var/www/html/test/vendor/amphp/amp/lib/Internal/Placeholder.php(130): Amp\Coroutine->Amp\{closure}(NULL, Object(Amp\Parallel\Worker\Internal\TaskFailure))
#4 /var/www/html/test/vendor/amphp/amp/lib/Coroutine.php(81): Amp\Coroutine->resolve(Object(Amp\Parallel\Worker\Internal\TaskFailure))
#5 /var/www/html/test/vendor/amphp/amp/lib/Internal/Placeholder.php(130): Amp\Coroutine->Amp\{closure}(NULL, Object(Amp\Parallel\Worker\Internal\TaskFailur in /var/www/html/test/vendor/amphp/parallel/lib/Worker/Internal/TaskFailure.php on line 45
I have a similar requirement to the one asked in "How does amphp work" regarding the background running process.
Generally, Amp doesn't work magically in the background. If you use PHP via PHP-FPM or the like, Amp will be shut down once the response is done, just like anything else.
If you want to move work from these requests into background processes, you need some kind of queue (e.g. beanstalkd) and a (permanent) worker to process these queued jobs. You can write such a daemonized worker with Amp, but it will have to be started out-of-band.
That said, if you just want concurrent downloads, amphp/artax is better suited than amphp/parallel, as it has far lower overhead than a separate PHP process per HTTP request.
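For context, a rough sketch of what concurrent downloads with amphp/artax could look like (this assumes artax v3's DefaultClient API; the URLs and save paths are placeholders, not from the question):

<?php
require_once "vendor/autoload.php";

use Amp\Artax\DefaultClient;
use Amp\Loop;

Loop::run(function () {
    $client = new DefaultClient;

    // Placeholder URLs and save paths.
    $downloads = [
        'http://example.com/a.png' => '/tmp/a.png',
        'http://example.com/b.png' => '/tmp/b.png',
    ];

    $promises = [];
    foreach ($downloads as $url => $path) {
        $promises[$path] = Amp\call(function () use ($client, $url, $path) {
            $response = yield $client->request($url); // non-blocking HTTP request
            file_put_contents($path, yield $response->getBody()); // buffer and save the body
            return $path;
        });
    }

    // Both downloads run concurrently; wait until all of them finish.
    $saved = yield Amp\Promise\all($promises);
    var_dump($saved);
});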
The question doesn't clarify where the downloadFile() function is defined. As per the amphp/parallel documentation, the callable must be autoloadable so that Amp can find it when the task is executed in the worker process.
Here's a suggestion:
Put the downloadFile() function in a separate file, say functions.inc.
In your composer.json, under autoload/files, add an entry for functions.inc.
{
    "autoload": {
        "files": ["functions.inc"]
    }
}
Run composer install (or composer dump-autoload) so that autoload.php is regenerated to reflect the above change.
Then execute the file containing your first code snippet (the one with Loop::run()).
I think this should do the trick. Apart from this, please refer to kelunik's comment, which contains valuable information.
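For illustration, functions.inc (the file name suggested above) would hold nothing but the task callable, so the regenerated autoloader pulls it into both the parent script and the worker processes:

<?php
// functions.inc — loaded via the "files" autoload entry above,
// so the worker process can resolve 'downloadFile' when the task runs.

function downloadFile($remoteFile, $localFile) {
    // ... same implementation as in the question ...
}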

Why does this code so negatively affect my server's performance?

I have a Silverstripe site that deals with very big data. I made an API that returns a very large dump, and I call that API from the front end via an AJAX GET.
When the AJAX call hits the API, it takes 10 minutes for the data to return (it's very long JSON data, and the customer has accepted that).
While they are waiting for the data to return, they open the same site in another tab to do other things, but the site is very slow until the previous AJAX request finishes.
Is there anything I can do to stop everything else going unresponsive while waiting for the big JSON data?
Here's the code and an explanation of what it does:
I created a method named geteverything on the web server, shown below. It accesses another server (the data server) to get data via a streaming API that sits on the data server. There's a lot of data, and the data server is slow; my customer doesn't mind the request taking long, they mind how slow everything else becomes. Sessions are used to determine the particulars of the request.
protected function geteverything($http, $id) {
    if (($System = DataObject::get_by_id('ESM_System', $id))) {
        if (isset($_GET['AAA']) && isset($_GET['BBB']) && isset($_GET['CCC']) && isset($_GET['DDD'])) {
            /**
             -- some condition check and data format for AAA BBB CCC and DDD goes here
            **/
            $request = "http://dataserver/streaming?method=xxx";
            set_time_limit(120);
            $jsonstring = file_get_contents($request);
            echo($jsonstring);
        }
    }
}
How can I fix this, or what else would you need to know in order to help?
The reason it's taking so long is that you're downloading the entirety of the JSON to your server and THEN sending it all to the user. There's no need to wait until you have the whole file before you start sending it.
Rather than using file_get_contents, make the connection with cURL and write the output directly to php://output.
For example, this script will copy http://example.com/ exactly as is:
<?php
// Initialise cURL. You can specify the URL in curl_setopt instead if you prefer
$ch = curl_init("http://example.com/");
// Open a file handler to PHP's output stream
$fp = fopen('php://output', 'w');
// Turn off headers, we don't care about them
curl_setopt($ch, CURLOPT_HEADER, 0);
// Tell curl to write the response to the stream
curl_setopt($ch, CURLOPT_FILE, $fp);
// Make the request
curl_exec($ch);
// close resources
curl_close($ch);
fclose($fp);
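Applied to the geteverything method from the question, that approach might look roughly like this (a sketch; the data-server URL and the surrounding Silverstripe checks are carried over from the question as-is):

protected function geteverything($http, $id) {
    if (($System = DataObject::get_by_id('ESM_System', $id))) {
        if (isset($_GET['AAA']) && isset($_GET['BBB']) && isset($_GET['CCC']) && isset($_GET['DDD'])) {
            // ... condition checks and data formatting for AAA, BBB, CCC and DDD ...
            set_time_limit(120);

            // Stream the data server's response straight to the client
            // instead of buffering it all into $jsonstring first.
            $ch = curl_init("http://dataserver/streaming?method=xxx");
            $fp = fopen('php://output', 'w');
            curl_setopt($ch, CURLOPT_HEADER, 0);
            curl_setopt($ch, CURLOPT_FILE, $fp);
            curl_exec($ch);
            curl_close($ch);
            fclose($fp);
        }
    }
}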

Timing out on command line

I had previously asked a question, and got the answer, but I think I've run into another problem.
The php script I'm using does this:
1 - transfers a file to my server from my backup server
2 - when it's done transferring, it sends some POST data to it using cURL, which creates a zip file
3 - when that's done, the result is echoed and, depending on what the result is, the file is transferred back or nothing happens.
My problem is this:
When the file is small enough (under 500 MB) it creates it and transfers it back, no problem. When it's larger, it times out; the zip still finishes being created on the remote server, but because of the timeout it doesn't get transferred back.
I'm running this from a command line on the backup server. I have this in the php script:
set_time_limit(0); // ignore php timeout
ignore_user_abort(true); // keep on going even if user pulls the plug*
while(ob_get_level())ob_end_clean(); // remove output buffers
But it still times out when I run sudo php backup.php
Is cURL making it time out like a browser would on the other end where the zip is being made? I think the problem is that the response isn't being echoed out.
Edits:
(#symcbean)
I'm not seeing anything, which is why I'm struggling. When I run it from the browser, I see the loading thing in the address bar. After about 30 seconds it just stops. When I do it from the command line, same deal. 30 seconds and it just stops. This only happens when large zips need to be created.
It's being invoked via a file. The file loads a class and passes the connection information to it; the class contacts the server to make the zip, transfers the zip back, does some stuff to it, then transfers it to S3 for archiving.
It logs into the remote server and uploads a file with cURL. Upon a valid response, it curls again with the location of that file as a URL (I'll always know what it is), which fires up the PHP file I just transferred over. The zip ALWAYS gets created, no problem, even up to 22 GB; it just sometimes takes a long time, of course. After that it waits for a response of "created". Waiting for that response is where it dies.
So the zip always gets created, but the waiting time is what "I think" is making it die.
Second Edit:
I tried this from the command line:
$ftp_connect = ftp_connect('domain.com');
$ftp_login = ftp_login($ftp_connect, 'user', 'pass');
ftp_pasv($ftp_connect, true);
$upload = ftp_put($ftp_connect, 'filelist.php', 'filelist.php', FTP_ASCII);
$get_remote = 'filelist.php';
$post_data = array (
    'last_bu' => '0'
);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'domain.com/'.$get_remote);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the response in $response instead of printing it directly
// adding the post variables to the request
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$response = curl_exec($ch);
curl_close($ch);
echo $response;
and got this:
<HTML>
<HEAD>
<TITLE>500 Internal Server Error</TITLE>
</HEAD><BODY>
<H1>Internal Server Error</H1>
The server encountered an internal error or
misconfiguration and was unable to complete
your request.<P>
Please contact the server administrator to inform of the time the error occurred
and of anything you might have done that may have
caused the error.<P>
More information about this error may be available
in the server error log.<P>
<HR>
<ADDRESS>
Web Server at domain.com
</ADDRESS>
</BODY>
</HTML>
Again, the error log is blank and the zip still gets created, but because of the timeout at around 650 MB into its creation I can't get the response.
The problem is in the server code that generates the file to be returned.
Check the PHP error log.
It may be timing out for a few reasons, but the log should tell you why.
I fixed it. Thank you so much to everyone who helped me; it pointed me in the right direction.
In the end, the problem was on the remote server. What was happening was that it was timing out the cURL connection, so the result I needed never got sent back.
What I did to fix it was add a function to my class that (again, using cURL) checks the HTTP status code of the zip file I know is being created. When it's finished, the result is returned locally; if it's not finished yet, the function sleeps for a few seconds and checks again.
private function watchDog(){
    $curl = curl_init($this->host.'/'.$this->grab_file);
    // don't fetch the actual page, you only want to check the connection is ok
    curl_setopt($curl, CURLOPT_NOBODY, true);
    // do request
    $result = curl_exec($curl);
    // if request did not fail
    if ($result !== false) {
        // if request was ok, check response code
        $statusCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
        curl_close($curl);
        if ($statusCode == 404) {
            // zip not finished yet: wait and check again
            sleep(7);
            return self::watchDog();
        }
        return 'zip created';
    }
    curl_close($curl);
}
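A hypothetical call site, just to show how the result might be used (the surrounding method name is illustrative, not from the question):

// After triggering zip creation on the remote server:
if ($this->watchDog() === 'zip created') {
    $this->transferZipBack(); // hypothetical method that downloads the finished zip
} else {
    // curl_exec() failed outright; log it and retry later
    error_log('watchDog: could not reach ' . $this->host);
}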

Sending POST Requests without waiting for response?

I am writing a simple REST service, which responds to requests from clients. All in PHP.
My concern is that when my server responds to a request, it could end up tying up resources if the client side is too slow in sending back its "ok" response.
How do I send a POST request via lib_curl, setting it to not wait for any response but rather quit immediately after the POST data has been sent?
Is this even possible? Thank you !
You cannot just send data without receiving an answer with HTTP. HTTP always goes request -> response. Even if the response is just very short (like a simple 200 with no text), there needs to be a response. And every HTTP socket will wait for that response.
If you don't care about the response, you could add a process to the server that makes your requests, and you just push your request data to it (like a service running in the background, checking a request database and starting the request whenever a new entry is added). That way you make the request asynchronously and can quit as soon as you have added the request to the queue; a rough sketch of this idea follows the next paragraph.
Also, as meouw said, the client is not part of any communication you are doing with PHP. PHP is a server-side language, so when the client requests a webpage (the PHP file), the server executes that file (and makes all the requests the PHP file specifies) and then returns the result to the client.
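A very rough sketch of that queue idea, using a plain spool directory as the queue (the paths and field names are made up for illustration; a real setup would more likely use a proper job queue such as beanstalkd or a database table):

<?php
// enqueue.php — called from the web request: drop the job and return immediately.
$job = ['url' => 'https://example.com/endpoint', 'payload' => ['foo' => 'bar']];
file_put_contents('/var/spool/myapp/' . uniqid('job_', true) . '.json', json_encode($job));
// ... respond to the client right away ...

// worker.php — long-running background process (started out-of-band, e.g. via cron or systemd).
while (true) {
    foreach (glob('/var/spool/myapp/*.json') as $file) {
        $job = json_decode(file_get_contents($file), true);
        $ch = curl_init($job['url']);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($job['payload']));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);   // the worker waits for the response; the web request never did
        curl_close($ch);
        unlink($file);    // job done, remove it from the queue
    }
    sleep(1);             // poll again shortly
}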
This solution is for when the sender only needs a minimal acknowledgement before continuing its script. If you don't care about the response at all and have access to exec, then use exec and call the script in the background instead (see the background part further down). First, the receiver file:
receiver.php
ignore_user_abort(true); // continue the script even if the connection is closed by the web browser (client) while the script is working
ob_end_clean(); // these four lines just tell the client the connection is closed, just in case
header("Connection: close\r\n"); // tell the client to close the connection
header("Content-Encoding: none\r\n");
header("Content-Length: 1");
fastcgi_finish_request(); // close the nginx/Apache connection to php-fpm (PHP keeps working, but nginx/Apache stops communicating with PHP)
// continue scripting
// ...DO HERE WHAT YOU WANT ...
// test against your mongo or mysql to make sure PHP still keeps its connection to the db
FOREGROUND by PHP request to HTTP:
This solution is better than the background approach, and you only need to wait 1 ms.
sender.php:
curl_setopt($curl, CURLOPT_TIMEOUT_MS, 1); // HERE'S THE MAGIC: wait only 1 ms for the connection. The script stops waiting after 1 ms and continues, but the request that has already been handed off still travels to its destination; we just don't care what happens to it after that. Like cutting an apple from the tree and walking away: the apple still falls, but we don't watch it land :)
curl_setopt($curl, CURLOPT_NOSIGNAL, 1); // needed for the millisecond timeout to be honoured; see the manual
--------- Check the next part for the complete solution ------------
BACKGROUND by server request to HTTP: this will execute $cmd in the background (no cmd window) without PHP waiting for it to finish, on both Windows and Unix. Source: https://www.php.net/manual/en/function.exec.php
<?php
function execInBackground($cmd) {
if (substr(php_uname(), 0, 7) == "Windows"){
pclose(popen("start /B ". $cmd, "r"));
}
else {
exec($cmd . " > /dev/null &");
}
}
?>
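For example (the worker script path is just an illustration):

// Start a long-running PHP worker and return to the caller immediately.
execInBackground('php /var/www/html/worker.php');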
If you really don't care about the response, you're probably best off exec-ing a curl or wget command. This is mentioned in passing in some of the other answers, but here's a super easy function for sending a POST payload via this approach (which is asynchronous and takes 1-2 ms):
function wget_request($url, $post_array, $check_ssl=true) {
    $cmd = "curl -X POST -H 'Content-Type: application/json'";
    $cmd .= " -d '" . json_encode($post_array) . "' '" . $url . "'";
    if (!$check_ssl){
        $cmd .= " --insecure"; // this can speed things up, though it skips certificate verification
    }
    $cmd .= " > /dev/null 2>&1 &"; // just dismiss the response
    exec($cmd, $output, $exit);
    return $exit == 0;
}
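Usage might look like this (the URL and payload are placeholders):

// Fire-and-forget: returns almost immediately; the response is discarded.
wget_request('https://example.com/webhook', ['event' => 'signup', 'user_id' => 42]);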
Credits: Function was adapted from
https://segment.com/blog/how-to-make-async-requests-in-php/
http://curl.haxx.se/mail/lib-2002-05/0090.html
libcurl has no asynchronous interface. You can do that yourself either by using threads or by using the non-blocking "multi interface" that libcurl offers. Read up on the multi interface here:
http://curl.haxx.se/libcurl/c/libcurl-multi.html
PHP example of multi interface is here:
http://www.phpied.com/simultaneuos-http-requests-in-php-with-curl/
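A minimal sketch of PHP's curl_multi interface, which runs several requests concurrently within one script (the URLs are placeholders; note this still waits for the responses, it just overlaps them):

$urls = ['https://example.com/a', 'https://example.com/b'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until none are still running.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);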
I have never tried this, but setting the CURLOPT_TIMEOUT to a very low value might do the trick. Try 0 or 0.1.
However, I don't know how cURL and the client will behave with this, or whether the connection will be actively cancelled once it is already established and the timeout is reached. You would have to try it out. If you're calling PHP scripts, maybe ignore_user_abort() can make sure your scripts run through either way.
If you have two PHP servers communicating with each other, e.g. server 1 wants to send JSON data to server 2, and server 2 does some heavy work but terminates the connection right after it receives the data so that server 1 doesn't have to wait for the result, you can do it like this:
Server 1 (client creating POST request with JSON data):
Use cURL; don't use file_get_contents(), because in my experience file_get_contents() doesn't handle the Connection: close HTTP header correctly and doesn't terminate the connection as it should.
$curl = curl_init('http://server2.com/');
curl_setopt($curl, CURLOPT_HEADER, false);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, ["Content-type: application/json"]);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, json_encode(['some data']));
$response = curl_exec($curl);
$status = curl_getinfo($curl, CURLINFO_HTTP_CODE);
if ($status !== 200) {
exit("Failed with status {$status}, response {$response}, curl_error " . curl_error($curl) . ", curl_errno " . curl_errno($curl));
}
curl_close($curl);
echo $response;
Server 2:
This uses modified code from bubba-h57.
// Cause we are clever and don't want the rest of the script to be bound by a timeout.
// Set to zero so no time limit is imposed from here on out.
set_time_limit(0);
// Client disconnect should NOT abort our script execution
ignore_user_abort(true);
// Clean (erase) the output buffer and turn off output buffering
// in case there was anything up in there to begin with.
ob_end_clean();
// Turn on output buffering, because ... we just turned it off ...
// if it was on.
ob_start();
echo 'I received the data, closing connection now, bye.';
// Return the length of the output buffer
$size = ob_get_length();
// Send headers to tell the browser to close the connection
// Remember, the headers must be called prior to any actual
// input being sent via our flush(es) below.
header("Connection: close");
// Hack how to turn off mod deflate in Apache (gzip compression).
header("Content-Encoding: none");
header("Content-Length: {$size}");
// Set the HTTP response code
http_response_code(200);
// Flush (send) the output buffer and turn off output buffering
ob_end_flush();
// Flush (send) the output buffer
// This looks like overkill, but trust me. I know, you really don't need this
// unless you do need it, in which case, you will be glad you had it!
#ob_flush();
// Flush system output buffer
// I know, more over kill looking stuff, but this
// Flushes the system write buffers of PHP and whatever backend PHP is using
// (CGI, a web server, etc). This attempts to push current output all the way
// to the browser with a few caveats.
flush();
// Close current session.
session_write_close();
// Here, you can proceed with some heavy work.
echo "This won't be sent, the connection should be already closed";
In Laravel
use Illuminate\Support\Facades\Http;

// ... some code here ...
$prom = Http::timeout(1)->async()->post($URL_STRING, $ARRAY_DATA)->wait();
// ... some more important code here ...
return "Request sent"; // or whatever you need to return
This works for me, as I don't always need to know the response.
As other people have said, when you make an HTTP request you have to wait for the response.
In PHP what you can do is make the request using the exec function.
Check this link: php exec command (or similar) to not wait for result
