I have a script that uploads a file to my web server via HTTP. What I need is for the server to send a response to the HTTP POST, but then continue to run some commands.
For example:
The file is posted via HTTP.
The server saves the file, then inserts data into the database and sends a response to the HTTP request.
But I need the server to keep running after the response and carry on with some editing of the file.
Example PHP:
function uploadFile() {
    $imageName = $tmpname . '.jpg';
    move_uploaded_file($_FILES["foto"]["tmp_name"], $targetDir . $imageName);
    $data = array('ALL THE DATABASE DATA');
    $returnid = $this->uploader_model->addData($data);
    // I NEED IT TO RETURN THIS TO THE AJAX HTTP REQUEST
    echo json_encode(array(
        'returned' => 'Successfully uploaded..',
        'id' => $returnid
    ));
    // I NOW NEED IT TO KEEP RUNNING SOME EDITING ON THE IMAGE HERE, IE..
    shell_exec("run some image editing here 2>&1");
}
I have seen ignore_user_abort, like:
ignore_user_abort(1); // run script in background
set_time_limit(0); // run script forever
But does this mean the script will just run forever and crash the server if it gets multiple requests? I'm just a bit confused about how to use this function correctly in this scenario.
Any help please.
Your script won't run forever unless you make it, e.g. by using an infinite loop. Having said that, that's not necessarily a good way to go about it.
To ensure the best user experience, you should make sure your script executes quickly by deferring any time-consuming tasks. For example, you could create a task queue and run the queued tasks with cron.
Alternatively, if you do decide to use the solution you described in your question, you may be better off running that with AJAX after a file has been successfully uploaded. AJAX requests may be terminated when the user navigates off the webpage, so you'd still need to make sure the server continues executing your script.
BTW, you should also call flush() if you actually want to send something to the client before the script finishes.
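For reference, a minimal sketch of the "respond first, keep working" pattern the question is after might look like this, assuming an upload handler along the lines of the one above; processImage() is a hypothetical placeholder for the follow-up work, not a real function:
<?php
ignore_user_abort(true);   // keep running even if the client disconnects
set_time_limit(0);         // allow the follow-up work to take a while

ob_start();
echo json_encode(array('returned' => 'Successfully uploaded..', 'id' => $returnid));

header('Content-Length: ' . ob_get_length());
header('Connection: close');
ob_end_flush();
flush();                   // the client now has its response

// Under PHP-FPM you can end the request explicitly instead:
// if (function_exists('fastcgi_finish_request')) { fastcgi_finish_request(); }

// Everything from here on runs after the response has been sent.
processImage($targetDir . $imageName); // hypothetical long-running step
Whether the connection actually closes at this point depends on the web server and SAPI, so treat this as a sketch rather than a guarantee.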
Related
I have a PHP script that is currently invoked directly by a webhook. The webhook method was fine up until this past week, when the volume of requests became problematic for API rate limits.
What I have been trying to do is make a second PHP file ($path/webhook-receiver.php) to be invoked by webhooks only when there isn't already a process running. I'll be using the process user webb recommended, which is: in the invoked script ($path/event-finance-reporting.php), create a file as the first action, and delete that file as the last action.
Before invoking the script, the automation will check the directory to make sure it is empty; otherwise it will kick back an error to the user telling them to wait until the current job is completed before submitting another one.
The problem I'm running into now is that both $command1 and $command2 end up invoking $path/webhook-receiver.php instead of $path/event-finance-reporting.php.
$command1 = "php -f $path/event-finance-reporting.php 123456789";
$command2 = "/usr/bin/php -q -f $path/event-finance-reporting.php 123456789";
Anyone know why that would be?
The goal is to have only one instance of event-finance-reporting.php running at a time. One strategy is to create a unique lockfile, refuse to run if it exists, and delete it when the script finishes, e.g.:
$lockfilepath = '.../event-finance-reporting.lock';
if (file_exists($lockfilepath)) {
    print("try again later");
    exit();
}
touch($lockfilepath);
...
// event-finance-reporting.php code
...
unlink($lockfilepath);
You could also do something more complicated in the if, such as checking the age of the lockfile, then deleting and ignoring it if it was left behind a while ago by a crashed instance of event-finance-reporting.php.
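A minimal sketch of that age check might look like the following; the 15-minute threshold is an arbitrary assumption and should be tuned to however long a normal run takes:
$lockfilepath = '.../event-finance-reporting.lock';
$maxlockage = 15 * 60; // assume a healthy run never takes longer than 15 minutes
if (file_exists($lockfilepath)) {
    if (time() - filemtime($lockfilepath) > $maxlockage) {
        // stale lock left behind by a crashed run: remove it and carry on
        unlink($lockfilepath);
    } else {
        print("try again later");
        exit();
    }
}
touch($lockfilepath);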
With this strategy, you also don't need two separate PHP files.
I am building a web service using PHP:
Basically,
The user sends a request to the server via HTTP ('request.php', for example).
The server starts some PHP code asynchronously ('update.php', for example).
The connection with the user is finished.
The code in 'update.php' is still running, and will finish after some time.
The code in 'update.php' finishes.
The problem is getting PHP to run some external code asynchronously.
Is that possible? Is there another way to do it? With shell_exec?
Please, I need insights! An elegant way is preferable.
Thank you!
The best approach is using a message queue like RabbitMQ, or even a simple MySQL table.
Each time you add a new task in the front controller, it goes into the queue. Then update.php, run by a cron job, fetches it from the queue, processes it, saves the results and marks the task as finished.
This will also help you distribute load over time, preventing a DoS caused by your own script.
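As a rough illustration of the MySQL-table variant (the tasks table layout, DSN and credentials below are placeholders, not a prescribed schema):
<?php
// request.php (front controller): enqueue the work and respond immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO tasks (payload, status) VALUES (?, 'pending')");
$stmt->execute(array(json_encode(array('image_id' => 42))));
echo json_encode(array('queued' => true));

<?php
// update.php (run by cron, e.g. every minute): drain the pending tasks.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id, payload FROM tasks WHERE status = 'pending'") as $task) {
    // ... do the slow work described by $task['payload'] here ...
    $upd = $pdo->prepare("UPDATE tasks SET status = 'done' WHERE id = ?");
    $upd->execute(array($task['id']));
}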
You could have the user connect to update.php, generate some sort of unique ID to keep track of the process, and then call fsockopen() on itself with a special GET variable to signify that it's doing the heavy lifting rather than user interaction. Close that connection immediately, and then print out the appropriate response to the user.
Meanwhile, look for the special GET variable you specified, and when present call ignore_user_abort() and proceed with whatever operations you need in that branch of the if clause. So here's a rough skeleton of what your update.php file would look like:
<?php
if ( isset($_GET['asynch']) ) {
    ignore_user_abort();
    // check for $_GET['id'] and validate,
    // then execute long-running code here
} else {
    // generate $id here
    $host = $_SERVER['SERVER_NAME'];
    $url  = "/update.php?asynch&id={$id}";
    if ( $handle = fsockopen($host, 80, $n, $s, 5) ) {
        $data = "GET {$url} HTTP/1.0\r\nHost: {$host}\r\n\r\n";
        fwrite($handle, $data);
        fclose($handle);
    }
    // return a response to the user
    echo 'Response goes here';
}
?>
You could build a service with PHP.
Or launch a PHP script from the shell: system("php myScript.php param param2 &")
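Note that with system() and its relatives, PHP will keep waiting for the command unless its output is redirected, even with a trailing &. A non-blocking variant might look like this (the script name and arguments are placeholders):
// Redirect output and background the process so the call returns immediately.
system("php myScript.php param param2 > /dev/null 2>&1 &");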
Look into worker processes with Redis (Resque) or Gearman.
I have a PHP script that has to reload a page on the client (server push) when something specific happens on the server. So I have to listen for changes. My idea is to have a text file that contains the number of page loads for the current page. So I would like to monitor the file and as soon as it is modified, to use server push in order to update the content on the client. The question is how to track the file for changes in PHP?
You could do something like:
<?php
$lastModified = 0;
while (true) {
    clearstatcache();                 // stat() results are cached between calls
    $file = stat('/file');
    if ($file['mtime'] > $lastModified) {
        $lastModified = $file['mtime'];
        //... Do Something Here ..//
    }
    sleep(1);
}
This continuously checks the file's modification time once a second. If you don't constrain it, you could hammer your disk I/O, and you may need to adjust your ulimit.
This will check your file for a change:
<?php
$current_contents = "";
function checkForChange($filepath) {
    global $current_contents;
    $new_contents = file_get_contents($filepath);
    if (strcmp($new_contents, $current_contents) !== 0) {
        $current_contents = $new_contents;
        return true;
    }
    return false;
}
But that will not solve your problem. The PHP file that serves the client finishes executing before the rendered HTML is sent to the client. The client will then need to call back to some PHP file to check for a change... and since that is also an HTTP request, that file will finish executing and forget anything it held in memory.
In order to properly solve this, you'll probably have to back off the idea of checking a file. Either the server needs to know when and how to contact currently connected clients, or those clients need to poll a lightweight service at a regular interval.
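One minimal shape for such a lightweight service is an endpoint the client polls, which simply reports the watched file's modification time (the file path and JSON field name here are illustrative):
<?php
// check.php: the client polls this and reloads when 'mtime' changes.
clearstatcache();
header('Content-Type: application/json');
echo json_encode(array('mtime' => filemtime('/path/to/counter.txt')));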
This is sort of hacky, but what about creating a cron job that fetches the page, stores it in a cache or table, and then simply compares it every 30 seconds?
I'm building a forum that will allow users to upload images. Images will be stored on my web server temporarily (so I can upload them with a progress bar) before being moved to an S3 bucket. I haven't figured out a brilliant way of doing this, but I think the following makes sense:
Upload image(s) to the web server using XHR with progress bar
Wait for user to submit his post
Unlink images he did not end up including in his post
Call a URL that uploads the remaining images to S3 (and update image URLs in post body when done)
Redirect user to his post in the topic
Now, since step 4 can take a considerable amount of time, I'm looking for a cron like solution, where I can call the S3 upload script in the background and not have the user wait for it to complete.
Ideally, I'm looking for a solution that allows me to request a URL within my framework and pass some image id's in the query, i.e.:
http://mysite.com/utils/move-to-s3/?images=1,2,3
Can I use cURL for this purpose? Or, if it has to be exec(), can I still have it request a URL (wget?) instead of running a PHP script (php-cli)?
Thanks a heap!
PHP's register_shutdown_function() is your friend [reference].
The shutdown function keeps running after your script has terminated.
Thus, if everything is available, send the final page and exit. Then the registered shutdown function continues and performs the time-consuming job.
In my case, I prepared a class CShutdownManager, which allows several methods to be registered to be called after script termination. For example, I use CShutdownManager to delete temporary files that are no longer needed.
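A minimal sketch of the idea, where generateThumbnails() stands in for the time-consuming job (it is not a real function):
<?php
// Register the slow work first; it runs after the script has finished.
register_shutdown_function(function () {
    generateThumbnails(); // hypothetical long-running task
});

// Send the final page to the client and end the script.
echo 'Upload received';
exit;
Whether the client actually sees the response before the shutdown function completes depends on the SAPI; under PHP-FPM it is common to pair this with fastcgi_finish_request().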
Try the following statement:
shell_exec('php scriptname.php');
I found the solution to this problem which I'm happy to share. I actually found it on SO, but it needed some tweaking. Here goes:
The solution requires either exec() or shell_exec(). It doesn't really matter which one you use, since all output will be discarded anyway. I chose exec().
Since I am using MAMP, rather than a system-level PHP install, I needed to point to the PHP binary inside the MAMP package. (This actually made a difference.) I decided to define this path in a constant named PHP_BIN, so I can set a different path for local and live environments. In this case:
define('PHP_BIN', '/Applications/MAMP/bin/php/php5.3.6/bin/php');
Ideally, I wanted to execute a script inside my web framework instead of some isolated shell script. I wrote a wrapper script that accepts a URL as an argument and named it my_curl.php:
if (isset($argv[1]))
{
    $url = $argv[1];
    if (preg_match('/^http(s)?:\/\//', $url))
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_exec($ch);
        curl_close($ch);
    }
}
In this SO question I found the way to execute a shell command in the background. This is necessary because I don't want the user to have to wait until it's finished.
Finally, I run this bit of PHP code to execute the other (or ANY) web request in the background:
exec(PHP_BIN . ' /path/to/my_curl.php http://site.com/url/to/request &> /dev/null &');
Works like a charm! Hope this helps.
I want to download a large number of files to my server. I have a list of different files to download and locations where to put them. This is all not a problem; I use wget to download each file and execute it with shell_exec:
$command = 'wget -b -O ' . $filenameandpathtoput . ' ' . $submission['url'];
shell_exec($command);
This works great; the server starts all the downloads and the files arrive in no time.
The problem is, I want to notify the user when the files are downloaded, and this does not work with my current way of doing things. So how would you implement this?
Any Suggestions would be helpful!
I guess that you are able to check whether all files are in place with something like
function checkFiles()
{
    foreach ($_SESSION["targetpaths"] as $p)
    {
        if (!is_file($p)) return false;
    }
    return true;
}
Now all you have to do is to call a script on your server that calls this function every second (or so). You can either accomplish this with Meta Refresh (forcing the browser to reload the page after n seconds) or by using AJAX (have a look at jQuery's .getJSON, for example).
If the script is called and the files are not yet all downloaded, print something like "Please wait" and refresh again later. Otherwise, show the success message. That's all.
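A minimal sketch of such a status script, reusing the checkFiles() function above (the 'status.php' name and the JSON shape are just placeholders):
<?php
// status.php: polled by the browser (e.g. via jQuery's .getJSON) every few seconds.
// include or paste the checkFiles() definition here
session_start(); // checkFiles() reads $_SESSION["targetpaths"]
header('Content-Type: application/json');
echo json_encode(array('done' => checkFiles()));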
You can consider using exec to run the external wget command. Your PHP script will block until the external command completes. Once it completes, you can echo the name of the completed file.
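For instance, a blocking variant might look like this (dropping wget's -b flag so the call waits; the variables are the same placeholders used in the question):
// exec() returns only after wget finishes, so the next line runs once the file is complete.
exec('wget -O ' . escapeshellarg($filenameandpathtoput) . ' ' . escapeshellarg($submission['url']));
echo 'Finished downloading ' . basename($filenameandpathtoput);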