I've got the following scenario: multiple users on a local network access a web application coded in PHP and hosted on an IIS server. On every page load, three AJAX calls to three separate PHP scripts are performed, and these calls repeat every X minutes (timed with jQuery). For every AJAX call of every connected user a php-cgi process is started on the server, quickly adding up to 20 or so processes. The problem is that after the AJAX call these processes remain open, using a large amount of memory on the server and causing performance problems (at times a total freeze).
All the PHP scripts are called via jQuery's $.post function; each performs one or more queries on an MSSQL database and ends by echoing a JSON-encoded object or array. Is there a way to make these processes close after the PHP script has finished executing? I would like to avoid switching to serial calls instead of parallel ones.
Any help is strongly appreciated.
Thanks
Don't know if you have already got a solution but anyways:
Try adding a
die();
?>
at the end of the PHP scripts that are called. That should terminate the called scripts once they have completed execution. As each call creates its own process, there should be no issues even if things are done in parallel.
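For illustration, the end of one of the AJAX-called scripts might then look roughly like this (the $result array is a placeholder for whatever your MSSQL queries produce):
<?php
// ... run the queries against the MSSQL database here ...

$result = array('status' => 'ok');   // placeholder for the real query results

header('Content-Type: application/json');
echo json_encode($result);           // the JSON the jQuery $.post callback receives

die();                               // explicitly end the script once the response is sent
?>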
I have a webpage that, when users go to it, instantly makes multiple (10-20) AJAX requests to a single PHP script, which, depending on the parameters in the request, returns a different report with highly aggregated data.
The problem is that a lot of the reports require heavy SQL calls to get the necessary data, and in some cases, a report can take several seconds to load.
As a result, because one client is sending multiple requests to the same PHP script, you end up seeing the reports slowly load on the page one at a time. In other words, the generating of the reports is not done in parallel, and thus causes the page to take a while to fully load.
Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?
Thank you.
As far as I know, it is possible to do multi-threading in PHP.
Have a look at the pthreads extension.
What you could do is make the report generation part/function of the script to be executed in parallel. This will make sure that each function is executed in a thread of its own and will retrieve your results much sooner. Also, set the maximum number of concurrent threads <= 10 so that it doesn't become a resource hog.
Here is a basic tutorial to get you started with pthreads.
And here are a few more examples that could be of help (notably the SQLWorker example, in your case).
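For what it's worth, a minimal sketch of that idea with a pthreads Pool might look like this; the ReportTask class and its report-generation body are placeholders for your own SQL/aggregation logic:
<?php
// Requires the pthreads extension (ZTS build of PHP; pthreads v3 is CLI-only).

class ReportTask extends Threaded
{
    private $reportId;

    public function __construct($reportId)
    {
        $this->reportId = $reportId;
    }

    public function run()
    {
        // placeholder: run the heavy SQL / aggregation for report $this->reportId
    }
}

// Cap the pool at 10 workers so the parallelism doesn't become a resource hog.
$pool = new Pool(10);

foreach (range(1, 20) as $reportId) {
    $pool->submit(new ReportTask($reportId));
}

$pool->shutdown(); // wait for all submitted tasks to finish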
Server setup
This is more of a server configuration issue and depends on how PHP is installed on your system: If you use php-fpm you have to increase the pm.max_children option. If you use PHP via (F)CGI you have to configure the webserver itself to use more children.
Database
You also have to make sure that your database server allows that many concurrent processes to run. It won’t do any good if you have enough PHP processes running but half of them have to wait for the database to notice them.
In MySQL, for example, the setting for that is max_connections.
Browser limitations
Another problem you're facing is that browsers won't make 10-20 parallel requests to the same host. It depends on the browser, but to my knowledge modern browsers will only open 2-6 connections to the same host (domain) simultaneously. So any further requests will just get queued, regardless of server configuration.
Alternatives
If you use MySQL, you could try to merge all your calls into one request and run parallel SQL queries using mysqli::poll() (see the sketch below).
If that’s not possible you could try calling child processes or forking within your PHP script.
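A rough sketch of the mysqli::poll() approach, assuming a few independent queries against the same MySQL server (connection credentials and table names are placeholders):
<?php
$queries = array(
    'SELECT COUNT(*) FROM orders',
    'SELECT COUNT(*) FROM customers',
    'SELECT COUNT(*) FROM products',
);

$pending = array();
foreach ($queries as $i => $sql) {
    // one connection per asynchronous query
    $pending[$i] = mysqli_connect('localhost', 'user', 'pass', 'db');
    $pending[$i]->query($sql, MYSQLI_ASYNC);
}

while (count($pending) > 0) {
    $reads = $errors = $rejects = $pending;
    if (mysqli_poll($reads, $errors, $rejects, 1) < 1) {
        continue; // nothing ready yet, poll again
    }
    foreach ($reads as $link) {
        if ($result = $link->reap_async_query()) {
            print_r($result->fetch_row()); // handle one finished result set
            $result->free();
        }
        // this connection is done, stop polling it
        foreach ($pending as $key => $l) {
            if ($l === $link) {
                unset($pending[$key]);
            }
        }
    }
}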
Of course PHP can execute multiple requests in parallel, if it runs behind a web server like Apache or Nginx. The PHP dev server is single-threaded, but that should only be used for development anyway. If you are using PHP's file-based sessions, however, access to the session is serialized: only one script can have the session file open at any time. Solution: fetch the information you need from the session at script start, then close the session.
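A minimal sketch of that pattern (the user_id key is just an example):
<?php
session_start();

// copy whatever the script needs out of the session
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// release the session file lock so parallel AJAX requests aren't serialized
session_write_close();

// ... long-running queries / report generation using $userId ...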
Hi, I am new to PHP and have no idea if what I am about to ask is even possible or even makes sense, but here goes.
I want to execute a PHP script as if I were running a standalone application on the web server. What I am trying to implement is this: once the customer purchases something on the website and sees the payment confirmation notice, he should be able to close the browser window or log off without affecting the big order-generation process that gets started on the page which displays that his payment was successful.
Right now I am using AJAX to call my after-payment processing PHP script, and have set that script to ignore any user abort.
This is the page that tells the user that the payment was received successfully.
thankyou.php
This is the page that performs the processing that needs to be done only after successful receipt of payment
FinishCheckoutProcess.inc.php
Now thankyou.php uses AJAX to execute FinishCheckoutProcess.inc.php asynchronously, and FinishCheckoutProcess.inc.php contains the following call:
ignore_user_abort(true);
Now the combination of AJAX and ignore_user_abort(true) allows the after payment process to run without any errors even if the user closes his browser window, but since this script has nothing to do with the user or the browser I just wanted to know if it is possible to run this script in the background like a standalone application independent of the browser.
Also my WebServer is Apache and OS is Linux(Ubuntu OS).
My work is getting done but I just want to know if there is a better/safer way to do it.
Anyway thanks in advance to everyone, this site has helped me more than any book could have. So all you experts out there who donate their times to newbies like me you guys are awesome. Please keep up the good work.
Again thanks a lot.
Based on suggestions received
If I use the exec method to execute FinishCheckoutProcess.inc.php, will it still be able to run database-related commands, and will it be able to run further PHP scripts?
FinishCheckoutProcess.inc.php in turn executes a series of other PHP scripts which in turn execute other PHP scripts, so will using the exec command to run FinishCheckoutProcess.inc.php create any difficulties?
FinishCheckoutProcess.inc.php also interacts with the MySQL database, so will I be able to do this if I execute the script using the exec command? I am passing the necessary MySQLi connection object to this PHP script right now; can I pass it the same way using exec?
Also the process is quite heavy as it generates a set of 4 image files using IMagick and ImageMagick.
It generates a set of 4 image files for every product ordered, so if the quantity of 1 product is 10 then the total files generated will be 1x10x4 = 40
If there are two products, one with quantity 2 and the other with quantity 4, then the total files generated will be 1x2x4 = 8 plus 1x4x4 = 16, i.e. 24 in total.
So this script might need to run for a long time and cannot be allowed to stop because of timeouts; it needs to finish what it started.
Basically the FinishCheckoutProcess.inc.php logic and process is quite complex, so I just want to confirm whether exec can handle it or not.
Also, I am not sure, but some of them also make use of $_SESSION variables; if this is a problem I can modify it. $_SESSION variables only get used in one place, and yes, the $_SESSION values get set (by some previous PHP script) before the FinishCheckoutProcess.inc.php script is executed.
I just want to execute the FinishCheckoutProcess.inc.php script independently of the parent/calling script, i.e. thankyou.php, so that if the user closes the browser, FinishCheckoutProcess.inc.php will not stop or abort because the parent/calling script thankyou.php is no longer running.
FYI, you can run PHP scripts from the command line like php my/script.php.
A safer way to do it would be a master/worker process workflow. The master process runs on the server, checks a queue of work, and spawns worker processes to handle items on the queue as they arrive.
In your scenario you add stuff to the queue when the user pays. Once it is added to the queue you can send back thankyou.php to the user and they can continue or leave or whatever. Once the work is on the queue your master process spawns a worker process to handle the stuff (basically does everything in FinishCheckoutProcess.inc.php).
You can implement this in php with: php master.php
master.php
while (true) {
    // check the queue for new items (db table, directory, whatever you use)
    // if an item is found, spawn a worker for it, e.g.:
    //     shell_exec('php worker.php > /dev/null 2>&1 &');
    sleep(5); // avoid busy-waiting between checks
}
From what I understand, you are looking for something like what Laravel offers with its illuminate/queue package:
Queues allow you to defer the processing of a time consuming task, such as sending an e-mail, until a later time which drastically speeds up web requests to your application.
This isn't something that only Laravel offers, though it does simplify/ease the implementation of such mechanism.
In the background you have supervisord executing a "worker" PHP script that executes tasks you put in a common place (db table, filesystem, anything); those tasks are usually references to a certain class/method along with some variables to pass to it.
The following links might give you a better understanding:
http://supervisord.org/index.html
https://laravel.com/docs/5.1/queues
https://laravel.com/docs/5.1/queues#supervisor-configuration
There are many ways you could implement a queue system, even without supervisord. But I recently implemented this method myself because it guarantees my tasks are processed, even after a server restart (if configured properly).
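For reference, a minimal supervisord program entry for such a worker script could look roughly like this (the program name and paths are placeholders):
[program:queue-worker]
command=php /var/www/app/worker.php
autostart=true
autorestart=true
numprocs=1
stdout_logfile=/var/log/queue-worker.log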
I have written a PHP script to import a large amount of data. The import process is triggered through an AJAX call, and the AJAX request keeps waiting for the server response. As I am working on a dedicated server, there is no timeout issue.
The problem is that we require a feature by which we can terminate the import process, for example a stop button on the client side. We thought that if we killed the waiting AJAX call, the process on the server would also stop, as there would be no request left to serve. But unfortunately that is not the case: the script keeps executing on the server side even after the AJAX request has been aborted on the client.
Secondly, we use PHP sessions in this project. Let's say the cancel button makes an AJAX call to another script on the server to stop the process: how could that request reach the server if there is already a waiting AJAX request? PHP/Apache will hold the second request until the first request cycle is completed.
Note: as per our project architecture, we require session_start() on every page. It would be good if anyone could offer guidance on these issues.
At the start of the script you can write its process ID to a file, the DB, or a cache. You can get the process ID with getmypid() (http://php.net/manual/en/function.getmypid.php). This assumes each script runs in its own process.
The kill script (not using the locked session) could read that Process ID and try to kill it.
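A rough sketch of that idea, using a file to hold the PID (the file path is a placeholder; posix_kill() needs the posix extension, otherwise exec("kill $pid") works too):
<?php
// import.php - the long-running import
file_put_contents('/tmp/import.pid', getmypid()); // record our own process id
session_write_close(); // release the session lock so the stop request isn't blocked
// ... long-running import work ...

<?php
// kill.php - called by the stop button (must not hold the session lock)
$pid = (int) file_get_contents('/tmp/import.pid');
if ($pid > 0) {
    posix_kill($pid, 15); // 15 = SIGTERM
}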
Be careful with long-running processes in PHP, as PHP's Zend engine & GC are not well suited to them.
So I strongly suggest using a proper job manager like Gearman, which has a PHP extension. Using a job manager will give you full control over each process: you can start/stop processes & tasks.
Another option is to use a queue, like amqp, to handle these tasks more cleanly. Which one is more suitable for your use case, I'll let you decide.
How about setting a time limit slightly higher than your AJAX timeout on your PHP script? This should kill the PHP script if it runs over time. Something similar to this:
<?php
set_time_limit(20);
$i = 1;
while ($i <= 10)
{
    echo "i=$i ";
    sleep(100);
    $i++;
}
?>
Source: http://php.net/manual/en/function.set-time-limit.php
Once a script runs it can only be stopped by ending the php process working on the script. One possibility would be to use the session to store a "continue" condition when another script is called.
For example:
Script 1 is the worker (importer)
Script 2 is a script called repeatedly via AJAX for as long as the importer should keep working.
Script 1 and 2 share let's say $_SESSION['lastPing'].
so
Script 2 sets $_SESSION['lastPing'] = time(); on each call.
Script 1 has a condition: if (time() - $_SESSION['lastPing'] > 30) { die(); }
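A rough sketch of that scheme; note that the importer has to release the session lock between checks with session_write_close(), otherwise script 2 can never get in to update lastPing:
<?php
// Script 2 (ping.php) - called repeatedly by AJAX while the import should keep running
session_start();
$_SESSION['lastPing'] = time();
session_write_close();

<?php
// Script 1 (importer) - inside its main loop, e.g. once per imported batch
session_start();
$lastPing = isset($_SESSION['lastPing']) ? $_SESSION['lastPing'] : 0;
session_write_close(); // release the lock immediately so ping.php isn't blocked
if (time() - $lastPing > 30) {
    die('No ping for 30 seconds - aborting import.');
}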
You might be able to handle this using the proc_ functions. A possible outline of the steps:
Create a unique token and pass it to the server along with the order to begin the import.
When order to import is received (via AJAX):
a) Store a record showing that this token is being processed (via file, db, memcached).
b) Run the import script using proc_open.
c) Begin a polling loop, checking the record from (2a) to see that this token is still in processing status. If not, call proc_terminate and exit the loop.
If the order to stop import is given (in a separate AJAX call), update the persisted record to indicate that the particular token should be stopped, which will be picked up in (2c)
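A rough sketch of steps 2 and 3 using the proc_ functions; import.php, the token parameter and the status-file location are placeholders for your own script and storage:
<?php
// Launch the import as a child process and poll a per-token status record.
$token      = $_POST['token'];
$statusFile = '/tmp/import-' . basename($token) . '.status';
file_put_contents($statusFile, 'processing');                       // step 2a

$descriptors = array(1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
$process = proc_open('php import.php ' . escapeshellarg($token), $descriptors, $pipes); // step 2b

while (true) {                                                       // step 2c
    $status = proc_get_status($process);
    if (!$status['running']) {
        break;                                   // the import finished on its own
    }
    if (trim(file_get_contents($statusFile)) === 'stop') {
        proc_terminate($process);                // the stop order was given (step 3)
        break;
    }
    sleep(1);
}
proc_close($process);

// The separate "stop" AJAX handler simply writes 'stop' into the status file for that token.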
Go to the following link for the exact solution to your problem:
PHP auto-kill a script if the HTTP request is cancelled/closed
I have some PHP code that executes for a very long time.
I need to realise the following scheme:
User enters some page (page 1).
This page starts execution of my large PHP script in the background (every change is written to the database).
Every N seconds we send a query to the database to get the current status of the execution.
I don't want to use the exec command because 1,000 users means 1,000 PHP processes. That's not the way for me...
So you basically want a queue (possibly stored in a database) and a command-line script run by cron that processes queued items.
Clarification: I'm not sure what's unclear about my answer, but it complies with the two requirements imposed by the question:
The script cannot be aborted by the client
You share a single process between 1,000 clients
Use HTTP requests to the local HTTP server from within your script in combination with PHP's ignore_user_abort() function.
That way you keep the load inside the HTTP server's worker processes, you have a natural limit, and queuing of requests comes for free.
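One way to sketch that idea: the front-end script fires a request at the local server and disconnects immediately, while the called script uses ignore_user_abort() so it keeps running (worker.php and the host/port are placeholders):
<?php
// caller: fire a request at the local web server and don't wait for the response
$fp = fsockopen('127.0.0.1', 80, $errno, $errstr, 1);
if ($fp) {
    fwrite($fp, "GET /worker.php HTTP/1.1\r\nHost: localhost\r\nConnection: Close\r\n\r\n");
    fclose($fp); // disconnect right away; the worker keeps running
}

<?php
// worker.php: keep going even though the caller has already disconnected
ignore_user_abort(true);
set_time_limit(0);
// ... the long-running work, writing progress to the database ...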
You can use CLI to execute multiple PHP scripts
or
you can try Easy Parallel Processing in PHP
I'm currently running a Linux based VPS, with 768MB of Ram.
I have an application which collects details of domains and then connect to a service via cURL to retrieve details of the pagerank of these domains.
When I run a check on about 50 domains, it takes the remote page about 3 mins to load with all the results, before the script can parse the details and return it to my script. This causes a problem as nothing else seems to function until the script has finished executing, so users on the site will just get a timer / 'ball of death' while waiting for pages to load.
(The remote page retrieves the domain details and updates the page via AJAX, but the cURL request doesn't (rightly) return the page until loading is complete.)
Can anyone tell me if I'm doing anything obviously wrong, or if there is a better way of doing it? (There can be anything between 10 and 10,000 domains queued, so I need a process that can run in the background without affecting the rest of the site.)
Thanks
A more sensible approach would be to "batch process" the domain data via the use of a cron triggered PHP cli script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set as false, the background script would then:
Scan the database for domains that aren't marked as processed.
Carry out the CURL lookup, etc.
Update the database record accordingly and mark it as processed.
...
To ensure no overlap with an already executing batch-processing script, you should only invoke the PHP script every five minutes from cron and (within the PHP script itself) check how long the script has been running at the start of the "scan" stage, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically start after reboots, etc.) and simply add domains to the database/review the results of processing, etc. via a separate web front end.
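A rough outline of such a cron-triggered script; the database credentials, table and column names are placeholders, and the pagerank lookup is reduced to a stub:
<?php
// batch_pagerank.php - invoked by cron every five minutes
$started = time();
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$result = $db->query("SELECT id, domain FROM domains WHERE processed = 0");
while ($row = $result->fetch_assoc()) {
    if (time() - $started > 240) {
        exit; // been running ~4 minutes, let the next cron invocation take over
    }

    // placeholder for the real cURL pagerank lookup
    $ch = curl_init('https://example.com/pagerank?domain=' . urlencode($row['domain']));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $pagerank = curl_exec($ch);
    curl_close($ch);

    $stmt = $db->prepare("UPDATE domains SET pagerank = ?, processed = 1 WHERE id = ?");
    $stmt->bind_param('si', $pagerank, $row['id']);
    $stmt->execute();
}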
This isn't the ideal solution, but if you need to trigger this process based on a user request, you can add the following at the end of your script.
set_time_limit(0); // remove the execution time limit
flush();           // push the buffered output to the client
This will allow the PHP script to continue running while still returning output to the user. But seriously, you should use batch processing. It will give you much more control over what's going on.
Firstly I'm sorry, but I'm an idiot! :)
I've loaded the site in another browser (FF) and it loads fine.
It seems Chrome puts some sort of lock on a domain when it's waiting for a server response, and I was testing the script manually through a browser.
Thanks for all your help and sorry for wasting your time.
CJ
While I agree with others that you should consider processing these tasks outside of your webserver, in a more controlled manner, I'll offer an explanation for the "server standstill".
If you're using native PHP sessions, PHP uses an exclusive locking scheme so only a single PHP process can deal with a given session ID at a time. Having a long-running PHP script which uses sessions can certainly cause this.
You can search for combinations of terms like:
php session concurrency lock session_write_close()
I'm sure it's been discussed many times here. I'm too lazy to search for you. Maybe someone else will come along and make an answer with bulleted lists and pretty hyperlinks in exchange for stackoverflow reputation :) But not me :)
good luck.
I'm not sure how your code is structured but you could try using sleep(). That's what I use when batch processing.