I'm currently working on an internal website that displays a lot of statistics, and some pages or Ajax scripts are extremely slow because of the large amounts of data involved.
What I'm looking for is a way to launch these scripts in the background with one request, and then poll with Ajax requests to track the progress of the background script.
Is there any way to achieve this? I'm working with PHP 7.0 and an Apache 2 server (I don't have direct access to the Apache configuration, so if possible I'm looking for a client-side option).
If anyone else is looking for a way to achieve this, here is the solution I found:
I call a script via Ajax; it forks itself and saves the PID of the child process in the database.
Then I call session_write_close() in the child process to let the user make new requests, and the parent process exits without waiting for the child to finish.
As the parent exits, the user receives an answer to his request while the child process continues its job.
Then, via Ajax, I call another script to get the worker's progress, and finally I fetch the result and kill the child process when everything is done.
Here is the code of my worker class:
class AsyncWorker
{
    private $pid;
    private $worker;
    private $wMgr;

    public function __construct($action, $content, $params = NULL)
    {
        $this->wMgr = new WorkersManager();
        $pid = pcntl_fork(); // Process fork
        if ($pid < 0) {
            Ajax::Response(AJX_ERR, "Unable to fork the process");
        } else if ($pid == 0) { // In the child, we start the job and save the worker properties
            sleep(1);
            $this->pid = getmypid();
            $this->worker = $this->wMgr->fetchBy(array("pid" => $this->pid));
            if (!$this->worker) {
                $this->worker = $this->wMgr->getEmptyObject();
                $this->wMgr->create($this->worker);
            }
            $this->worker->setPid($this->pid);
            $this->worker->setAction($action);
            $this->worker->setContent($content);
            $this->worker->setPercent(0.00);
            $this->worker->setResult("");
            $this->wMgr->update($this->worker);
            $this->launch($params);
        } else { // In the parent, we save the child's PID to the DB and answer the request
            $this->worker = $this->wMgr->fetchBy(array("pid" => $pid)); // use $pid here: $this->pid is only set in the child
            if (!$this->worker) {
                $this->worker = $this->wMgr->getEmptyObject();
                $this->worker->setPid($pid);
                $this->wMgr->create($this->worker);
            }
            Ajax::Response(AJX_OK, "Worker started", $this->worker->getId());
        }
    }

    // Worker job
    private function launch($params = NULL)
    {
        global $form, $_PHPPATH, $url, $session;
        session_write_close(); // Release the session lock so the user can make new requests
        ob_start();            // Buffer all output instead of sending it
        /*
        ** Some stuff specific to my app (include the worker files, etc.)
        */
        $result = ob_get_contents(); // Grab the buffered output and save it to the DB as the result
        $this->worker->setResult($result);
        $this->worker->setPercent(100);
        $this->wMgr->update($this->worker); // Persist the final state
        ob_end_clean();
    }
}
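For completeness, the Ajax progress script on the other end can stay very small: it just reads the worker row back and returns the stored percentage and result. A minimal sketch, reusing the same hypothetical `WorkersManager`/`Ajax` helpers and `AJX_*` codes as the class above (they are app-specific, not a standard API):

```php
<?php
// getProgress.php - polled by the client every few seconds (sketch)
$wMgr   = new WorkersManager();
$worker = $wMgr->fetchBy(array("id" => (int)$_GET['id']));
if (!$worker) {
    Ajax::Response(AJX_ERR, "Unknown worker");
} else if ($worker->getPercent() < 100) {
    // Still running: report the stored percentage
    Ajax::Response(AJX_OK, "Running", $worker->getPercent());
} else {
    // Job finished: return the result and clean up the child process
    posix_kill($worker->getPid(), SIGTERM);
    Ajax::Response(AJX_OK, "Done", $worker->getResult());
}
```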
It's a bit tricky, but I had no choice, as I have no access to server plugins and libraries.
You can make the PHP script execute a shell/Bash script, or use the exec() function for that.
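For example, a long-running script can be detached from the web request by redirecting its output and backgrounding it; the worker file path below is only a placeholder:

```php
<?php
// Launch worker.php in the background and return immediately.
// Redirecting output and appending '&' prevents exec() from blocking;
// 'nohup' keeps the worker alive after the web request ends.
$cmd = 'nohup php /path/to/worker.php > /dev/null 2>&1 & echo $!';
$pid = (int)exec($cmd); // $! is the PID of the backgrounded process
echo "Worker started with PID $pid";
```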
Related
I'm in the process of updating a web app presenting real-time sensor data, which in its current first iteration does this by continuous AJAX polling.
However, in order to make this more of a 'true real-time' app, I would like it to be event-based.
I've been reading up on event-based techniques, and since the real-time communication only has to go one way (server -> client), I have chosen to go with Server-Sent Events (SSE) for now instead of something like WebSockets.
As described here on the Mozilla Docs, this is easily implemented on the server side with something like (a little simplified):
<?php
// SSEscript.php
date_default_timezone_set("America/New_York");
header("Content-Type: text/event-stream"); // no trailing newlines in the header value
while (1) {
    if ($new_data_available) {
        echo "data: " . $data . "\n\n"; // a blank line terminates each SSE message
        flush();
    }
    sleep($short_time_to_spare_cpu);
}
?>
and on the client side with:
<script>
var evtSource = new EventSource("SSEscript.php");
evtSource.onmessage = function(e) {
    var data = e.data;
    // Do something with data object
};
</script>
All the above works fine for me.
However, the sensor data is initially retrieved by a Python script running continuously on the server, so how do I transfer the sensor data from the Python script to the PHP script IMMEDIATELY when it is retrieved, so that an event can be generated and sent?
Can I do something like depicted below? :
At the same time all new data is stored in a MySQL db, so I could of course make the PHP script query the db really often for new entries, but there has to be a smarter way. So can I make 2.1 and 2.2 in the image happen at the same time?
All the answers I could find in here describes how data can be transferred by making the PHP execute the Python script, but that is not what I want as this has to run whether or not a user asks for data.
Is a kind of socket the way to go, and if so, can you point me in the direction of how to do so?
I hope you can help me out!
I use Redis to do the signalling between Python and PHP. After the Python side has done all its work, it RPUSHes a token (or the latest data) onto a Redis list. In the PHP script I use a while(true) loop to hold the request open and a blocking pop (BLPOP with a timeout) to wait for the token, then send the data out, something like this:
<?php
require __DIR__.'/predis-1.0/autoload.php';

header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Connection: keep-alive");

$lastId = isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : null;
if (isset($lastId) && !empty($lastId) && is_numeric($lastId)) {
    $lastId = intval($lastId);
    $lastId++;
}

$index = isset($_GET['index']) ? $_GET['index'] : null;

echo "retry: 2000\n";

$client = new Predis\Client();
while (true) {
    // BLPOP returns array(listName, value), or null on timeout
    $item = $client->blpop('queue', 5);
    if ($item) {
        $data = $item[1];
        error_log("$index : " . strlen($data));
        sendMessage($lastId, $data);
        $lastId++;
    }
}

function sendMessage($id, $data) {
    echo "id: $id\n";
    echo "data: $data\n\n";
    ob_flush();
    flush();
}
I have a website on an Ubuntu LAMP server that has a form which gathers variables and submits them to a handler function. That function calls other functions in the controller that explode the variables, arrange them in an array, and run a for loop over each one, fetching new data from slow APIs and inserting the new data into the relevant tables in the database.
Whenever I submit the form, the whole website gets stuck (only for my IP; on other desktops the website keeps working normally) and I am not redirected until processing finishes and the requested redirect("new/url"); runs.
I have been researching this issue for a while and found this post as an example:
Continue PHP execution after sending HTTP response
After studying how this works on the server side, which is explained really well in this video: https://www.youtube.com/watch?v=xVSPv-9x3gk
I wanted to start learning its syntax and found out that this only works from the CLI, not from Apache, though I wasn't sure.
I opened this post a few days ago: PHP+fork(): How to run a fork in a PHP code
and after getting everything working on the server side, installing pcntl and figuring out the differences between the php.ini files on the server (I edited the Apache2 php.ini, don't get mistaken), I stopped getting the errors I used to get for the fork, but the processes don't run in the background and I don't get redirected.
This is the controller after adding fork:
<?php
// Registers a new keyword for prod to the DB.
public function add_keyword() {
    $keyword_p = $this->input->post('key_word');
    $prod      = $this->input->post('prod_name');
    $prod      = $this->kas_model->search_prod_name($prod);
    $prod      = $prod[0]->prod_id;
    $country   = $this->input->post('key_country');
    $keyword   = explode(", ", $keyword_p);
    var_dump($keyword);
    $keyword_count = count($keyword);
    echo "the keyword count: $keyword_count";
    for ($i = 0; $i < $keyword_count; $i++) {
        // create your next fork
        $pid = pcntl_fork();
        if (!$pid) {
            //*** get new vars from $keyword_count
            //*** run API functions to get new data_arrays
            //*** insert new data for each $keyword_count into the DB
            print "In child $i\n";
            exit($i);
            // end child
        }
    }
    // we are the parent (main), check children (optional)
    while (pcntl_waitpid(0, $status) != -1) {
        $status = pcntl_wexitstatus($status);
        echo "Child $status completed\n";
    }
    // your other main code: redirect to the main page.
    redirect('banana/kas');
}
?>
And this is the controller without the fork:
// Registers a new keyword for prod to the DB.
public function add_keyword() {
    $keyword_p = $this->input->post('key_word');
    $prod      = $this->input->post('prod_name');
    $prod      = $this->kas_model->search_prod_name($prod);
    $prod      = $prod[0]->prod_id;
    $country   = $this->input->post('key_country');
    $keyword   = explode(", ", $keyword_p);
    var_dump($keyword);
    $keyword_count = count($keyword);
    echo "the keyword count: $keyword_count";
    // problematic part that needs forking
    for ($i = 0; $i < $keyword_count; $i++) {
        // get new vars from $keyword_count
        // run API functions to get new data_arrays
        // insert new data for each $keyword_count into the DB
    }
    // Redirect to main page.
    redirect('banana/kas');
}
The for ($i=0; $i < $keyword_count; $i++) { loop is the part that I want to run in the background, because it takes too much time.
So now:
How can I get this working the way I explained? Because from what I see, fork isn't what I'm looking for, or I might be doing this wrong.
I will be happy to learn new techniques, so suggestions for different ways of doing this are welcome. I am a self-learner, and I have found out about the great advantages of Node.js, for example, which could have worked perfectly in this case had I already learnt it. I will consider learning Node.js in the future; sending server requests and getting back responses is awesome ;).
***** If there is a need to add more information about something, please tell me in comments and I will add more information to my post if you think it's relevant and I missed it.
What you're really after is a queue or a job system: one script runs all the time, waiting for something to do. Once your original PHP script runs, it just adds a job to the list and can continue its process as normal.
There are a few implementations of this; take a look at something like https://laravel.com/docs/5.1/queues
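In Laravel 5.x terms, the slow loop would become a queued job class, dispatched from the controller so the response returns immediately. A rough sketch only; `ProcessKeywords` is an invented name and the slow API calls are left as a comment:

```php
<?php
// app/Jobs/ProcessKeywords.php (sketch for Laravel 5.x)
class ProcessKeywords extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $keywords;

    public function __construct(array $keywords)
    {
        $this->keywords = $keywords;
    }

    // Runs later on the queue worker, not during the web request
    public function handle()
    {
        foreach ($this->keywords as $keyword) {
            // call the slow APIs and insert the results into the DB here
        }
    }
}

// In the controller: queue the work and redirect immediately.
$this->dispatch(new ProcessKeywords($keyword));
redirect('banana/kas');
```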
I'm relatively new to PHP, but I can program in other languages (JS, Java, C/C++). I have a problem I can't seem to solve, so I'm hoping that someone can help me out here :)
I created a server_class.php file which manages connections between multiple clients. The clients connect to the PHP server via the web. When I launch server_class.php, it executes two applications (each in an endless loop) that print data to the terminal. When a client connects to the PHP server, I want the server to start sending the output of each application to each client, so the clients can see the current output of both applications. I have partially achieved this; however, it only sends the output of one application and not the other.
The function below is executed when the connection between the server and the client is performed:
private function startProc($client) {
    $this->output("Start a client process");
    $pid = pcntl_fork();
    if ($pid == -1) {
        $this->output("fork terminated!");
    }
    elseif ($pid) { // parent process
        $client->setPid($pid);
    }
    else {
        $this->output("Starting app1 data pipe...");
        exit($this->launchAppOneProc($client));
        $this->output("Starting app2 data pipe...");
        exit($this->launchAppTwoProc($client));
    }
}
Ok, once the connection between the client and the server is done, this function is executed. As you can see, I create a new process which then executes two methods: launchAppOneProc and launchAppTwoProc. These two functions contain the following code:
private function launchAppOneProc($client) {
    while (@ob_end_flush()); // end all output buffers if any
    while (!feof($this->appone_proc)) {
        $appone_text = fread($this->appone_proc, 4096);
        $this->send($client, $appone_text);
        flush();
    }
    $this->output("AppOne has stopped running!");
}
The function above is the same as launchAppTwoProc(). The $this->output function prints the specified text to the terminal running server_class.php.
So the problem is that it only executes launchAppOneProc() function and does not execute the next function launchAppTwoProc().
Any ideas on how I can execute both functions?
Thank you
David
Ok, I found a solution to my problem. To clear up some confusion: I have two applications running in the background and writing to a buffer. When a client connects to the server, they connect to both buffers and start receiving data. So what I did was create a new process for each channel. When the client connects to the server, two processes are created and assigned to the client. Each process is connected to a buffer and starts sending data to the client.
private function startProc($client) {
    $this->output("Start a client process");
    for ($i = 0; $i < 2; ++$i) {
        // Create a new process.
        $pid = pcntl_fork();
        switch ($pid) {
            case -1: // Failed to create a new process.
                $this->output("Failed to create new process!");
                break;
            case 0: // Child process created, execute code...
                switch ($i) {
                    case 0:
                        $this->output("Connect to AppOne data pipe...");
                        $this->launchAppOneProc($client);
                        break;
                    case 1:
                        $this->output("Connect to AppTwo data pipe...");
                        $this->launchAppTwoProc($client);
                        break;
                }
                exit(0); // Make sure the child never falls through to the parent code below.
            default: // Back to parent.
                switch ($i) {
                    case 0:
                        $client->setAppOnePid($pid);
                        break;
                    case 1:
                        $client->setAppTwoPid($pid);
                        break;
                }
        }
    }
}
I know I could use an array to store the process IDs, etc., but for now this works. Once the client disconnects, each process belonging to the client that just left has to be terminated, like so:
posix_kill($client->getAppOnePid(), SIGTERM);
posix_kill($client->getAppTwoPid(), SIGTERM);
I hope this makes sense and helps anyone who runs into the same problem.
In a script I'm trying to check whether the same script is already running, using MySQL GET_LOCK. The problem is that when a script tries to get a lock which isn't free, it blocks forever, regardless of the timeout parameter I provide.
<?php
class Controller_Refresher extends Controller {
    public function action_run($key) {
        echo date('H:i:s'."\n", time());
        $this->die_if_running();
        $this->run();
        echo date('H:i:s'."\n", time());
    }

    private function die_if_running() {
        $result = DB::query(Database::SELECT, "SELECT IS_FREE_LOCK('refresher_running') AS free")->execute();
        if (! intval($result[0]['free'])) die('Running already');
        $result = DB::query(Database::SELECT, "SELECT GET_LOCK('refresher_running', 1)")->execute();
    }

    private function run() {
        echo "Starting\n";
        ob_flush();
        sleep(10);
        DB::query(Database::SELECT, "SELECT RELEASE_LOCK('refresher_running')")->execute();
    }
}
When I run this in 2 tabs in browser, I get e.g.:
-tab 1-
20:48:16
Starting
20:48:26
-tab 2-
20:48:27
Starting
20:48:37
While what I want is for the second tab to die('Running already');.
Watch out - this problem might actually be caused by php locking the session file:
https://stackoverflow.com/a/5487811/539149
So you should call session_write_close() before any code that needs to run concurrently. I discovered this after trying this Mutex class:
http://code.google.com/p/mutex-for-php/
The class worked great but my php scripts were still running one by one!
Also, you don't need IS_FREE_LOCK(). Just call GET_LOCK('refresher_running', 0) and it will either return 1 if it gives you the lock or 0 if the lock is taken. It's more atomic that way. Of course, lock timeouts can still be useful in certain situations, like when you want to queue up tasks, but watch out for the script timing out if you get too many simultaneous requests.
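Applied to the code in the question, the whole die_if_running check collapses into a single atomic call. A sketch against the same DB wrapper used above:

```php
<?php
private function die_if_running() {
    // GET_LOCK with a 0-second timeout returns 1 if we acquired the lock,
    // 0 if another connection holds it -- no separate IS_FREE_LOCK needed,
    // and no window for a race between the check and the acquisition.
    $result = DB::query(Database::SELECT,
        "SELECT GET_LOCK('refresher_running', 0) AS got")->execute();
    if (!intval($result[0]['got'])) die('Running already');
}
```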
Zack Morris
One option would be to rely on a filesystem lock instead of a database one. Since it's the script execution that needs guarding, where the lock lives should not matter. A sample from the manual with a non-blocking exclusive lock:
$fp = fopen('/tmp/lock.txt', 'c+'); // 'c+' creates the file if it does not exist
/* Activate the LOCK_NB option on a LOCK_EX operation */
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    die('Running already');
}
/* ... */
fclose($fp);
Edit
Another option would be to use a status file that gets created at the beginning of each execution and is automatically deleted by a register_shutdown_function callback upon script completion.
The script would simply check the existence of the status file and if it's already there, execution would stop:
define('statusFile', sys_get_temp_dir() . DIRECTORY_SEPARATOR . 'myjob.running');

//
// If a process is already running then exit
//
if (file_exists(statusFile)) {
    die('Running already');
} else {
    file_put_contents(statusFile, date('Y-m-d H:i:s')); // filename first, then contents
}

//
// Other code here
//

function shutdown() {
    unlink(statusFile);
}

//
// Remove the status file on completion
//
register_shutdown_function('shutdown');
I have a web app that has a few processes that can take up to 10 minutes to run. Sometimes these processes are triggered by a user and they need the output as it is processed.
For instance, the user is looking for a few records that they need. They click the button to retrieve the records (this is the part that can take 10 minutes). They can continue to work on other things, but when they click back to view the returns, the view is updated as the records are downloaded into the system.
Right now, the user is locked while the process runs. I know about pcntl_fork() to fork a child process so that the user doesn't have to wait until the long process completes.
I was wondering if it's possible to tie that forked process to the specific user that triggered the request in a $_SESSION variable so that I can update the user when the process is complete. Also, is this the best way to update a user on a long-running process?
I think Gearman fits your needs. Look at this sample code, taken from the docs:
<?php
/* create our object */
$gmclient = new GearmanClient();

/* add the default server */
$gmclient->addServer();

/* run reverse client */
$job_handle = $gmclient->doBackground("reverse", "this is a test");

if ($gmclient->returnCode() != GEARMAN_SUCCESS) {
    echo "bad return code\n";
    exit;
}

$done = false;
do {
    sleep(3);
    $stat = $gmclient->jobStatus($job_handle);
    if (!$stat[0]) // the job is no longer known to the server, so it is done
        $done = true;
    echo "Running: " . ($stat[1] ? "true" : "false") . ", numerator: " . $stat[2] . ", denominator: " . $stat[3] . "\n";
} while (!$done);
echo "done!\n";
?>
If you store the $job_handle in the session, you can adapt the sample to make a control script.
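A minimal sketch of that idea (the two endpoint names are invented): store the handle in the session when the job is queued, then let an Ajax status endpoint read it back for that user:

```php
<?php
// start.php - queue the job and remember the handle for this user
session_start();
$gmclient = new GearmanClient();
$gmclient->addServer();
$_SESSION['job_handle'] = $gmclient->doBackground("reverse", "this is a test");

// status.php - polled by the client via Ajax
session_start();
$gmclient = new GearmanClient();
$gmclient->addServer();
$stat = $gmclient->jobStatus($_SESSION['job_handle']);
// $stat[0]: job known, $stat[1]: running, $stat[2]/$stat[3]: progress
echo json_encode(array(
    'done'     => !$stat[0],
    'running'  => (bool)$stat[1],
    'progress' => $stat[3] ? $stat[2] / $stat[3] : 0,
));
```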