Start & Stop PHP Script from Backend Administrative Webpage

I'm trying to create a webpage that will allow me to start and stop a PHP script. The script is part of the site's backend and needs to read data from, process, and update a database on the same server it lives on. Additionally, I'd like the webpage to allow an administrator to view the current status (running or stopped), and even link to logs created by the script.
I'm starting to go down the path of learning about PHP's exec, passthru, and related functions. Is this the correct path to take? Are there other ways to do this that would be more suitable? Is it possible to do this in a platform-agnostic way? I'm developing on a LAMPP+CakePHP stack on Windows, and would like the functionality to work on any webhost I choose.

I've done this in a recent job, but it's probably overkill for you. I built a job processor that basically needs 2 database tables, at least 2 model objects, and at least 2 controllers.
The first part is the job processing unit. It is composed of a job processor controller that manages requests to start or continue a job, and it comes with two ActiveRecord-style models, JobQueue and Job. You could remove the queue, but it's always practical to have queuing in such systems, so that 2, 3, or 4 jobs can execute at once.
The queue is just that: a slot that gets several jobs attached to it, with a queue status to indicate whether it is currently running or not.
The job is a virtual object that maps to a job table describing what has to be done. In my implementation, I created an interface that must be implemented by the called controller, plus a field and a type in the database. The Job instantiates the controller class to call (not the job processor controller; another controller that manages the operation to do) and calls a method on it to start the task processing.
Now, to get tricky, I forced my system to run on a dedicated server just for that portion, because I didn't want the tasks to load the main server or jam Apache's processing queue. So I had two servers, and my Queue class was in charge of calling, via an IP address, a page on the other server to run the job there specifically. When the job was done, it called itself back using an HTTP request to restart processing and do the next task. If no task was left, it would simply die normally.
The advantage of doing it this way is that it doesn't require a cronjob (as long as your script is super stable and cannot crash), because you trigger it when you want, then let it go; it calls itself back with fsockopen to trigger another page view that starts the next job.
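For illustration, here is a minimal sketch of that self-triggering call. The host and processor path are assumptions, and the page being hit should call ignore_user_abort(true) so that closing the socket doesn't kill it:

<?php
// Fire-and-forget HTTP request: trigger the next processing round and
// disconnect without waiting for a response.
function triggerNextRound($host, $path)
{
    $fp = fsockopen($host, 80, $errno, $errstr, 5);
    if ($fp === false) {
        error_log("could not trigger processor: $errstr ($errno)");
        return;
    }
    fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\nConnection: Close\r\n\r\n");
    fclose($fp); // the processor page keeps running server-side
}

triggerNextRound('jobs.example.com', '/job_processor/process');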
Work units
It is important to understand that if your jobs are very large, you should segment them. I used the principle of a "work unit" to describe one part of the job, to be done any number of times. The queue processor then became a time manager too: if it detected that a job had taken more than X seconds, it would simply defer the remaining steps for later, call itself back, and continue where it left off. That way, you don't need set_time_limit and you don't jam your server while a 30s script executes.
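As a rough sketch of that time-management idea (the 20-second budget and the helper functions are hypothetical):

<?php
// Process small work units until the time budget runs out, then defer.
$budget  = 20.0;             // seconds allowed for this invocation
$started = microtime(true);

foreach (loadPendingWorkUnits() as $unit) {  // hypothetical loader
    processWorkUnit($unit);                  // hypothetical: one small step
    markUnitDone($unit);                     // persist progress every unit
    if (microtime(true) - $started > $budget) {
        // Out of time: remaining units stay pending; the next call
        // (cron or the fsockopen trick above) continues where we left off.
        break;
    }
}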
I hope this helps!

To run a script continually, keep the following in mind:
Your PHP script should be launched as a CLI (command line) process by a job scheduler like cron or something similar. Don't forget that your web server configuration defines a timeout on executed scripts.
To run 24 hours a day, you might imagine implementing an infinite loop. In that case, you can write a check like jobIsActive, which reads a file or the database on every iteration to decide whether the job should execute or not. Clicking a button then just changes the job status (updating the file, the DB, ...). Both buttons stop or activate the treatment, but neither stops the infinite loop.
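A minimal sketch of such a loop, assuming a job_control table whose active flag is toggled by the start/stop buttons (the table and the doTreatment helper are made up):

<?php
// CLI daemon: loop forever, but only work while the flag says "active".
set_time_limit(0);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $active = (bool) $pdo->query('SELECT active FROM job_control WHERE id = 1')
                         ->fetchColumn();
    if ($active) {
        doTreatment();   // hypothetical: the actual job body
    }
    sleep(10);           // don't hammer the database between checks
}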
An infinite loop isn't the most elegant solution, though. Why not write an entry in the crontab to execute the job each night, and let a click on a button fire it manually?
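For example, a nightly crontab entry might look like this (the paths are assumptions):

# m h dom mon dow  command
0 2 * * * /usr/bin/php /var/www/app/cron/job.php >> /var/log/app_job.log 2>&1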

Related

How to make PHP listen to the job table

I am working on a system developed in PHP without a framework.
It has a feature that automatically runs some jobs via a third-party API every night: it loops over all the jobs in a table and calls the API using cURL.
// the cron job loops over this table:
ID  JOB
1   updateUser
2   getLatestInfo/topicA
……
// code:
// if updateUser:
//   loop over the user table and call the API to get the latest info…
// other cURL tasks also run here, like sending emails / notifications…
It worked perfectly before, but recently we have gained many new users, so the job now makes 50-100 API calls in one run.
Each API call takes 10-20 seconds to respond, and we retry the call if it times out.
I checked the log: the first job alone takes 3-4 hours in total (with many errors).
I could make the cron job queue the cURL calls, e.g. take the first 5 and run them every minute. But if we keep adding users or tasks, and the third-party API stays slow, it may take even more hours to finish.
Is there any solution that keeps listening to the job table and runs the cURL calls one by one?
I want it to be auto-triggered when a new row is added to the table (like a websocket?), and not a single PHP script running an infinite loop (to avoid some error occurring and the task needing a manual rerun).
(The API keys are in the PHP project, so I hope I can do this within the same project.)
PHP scripts need to be triggered in order to do something; they can't really "run in the background" (I mean, they can, technically, but PHP isn't meant to be used that way).
Instead, one of three options is usually used for job management:
run jobs on every web request, along with the actual code that generates output
use an external web cron service to query specific URLs tied to job execution
use a local cron job on your system to call the php executable and have it execute jobs periodically
If you want an event-based system, PHP is likely the wrong option. Depending on your DB system, though, you might be able to create a small wrapper that subscribes to DB changes and is triggered on inserts, and which then calls PHP again; but it's definitely cleaner to use a more suitable programming language/environment.
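If you do stay with PHP and cron (the third option above), a hedged sketch for this job table might look like the following: a cron-started script that claims one pending row at a time, so the cURL calls run strictly one by one. The table, columns, and endpoint are assumptions:

<?php
// Cron-started worker: drain the job table one row at a time, then exit.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    // Atomically claim the oldest pending job.
    $pdo->beginTransaction();
    $row = $pdo->query("SELECT id, job FROM jobs WHERE status = 'pending'
                        ORDER BY id LIMIT 1 FOR UPDATE")
               ->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        $pdo->commit();
        break;           // queue drained; the next cron run starts us again
    }
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
        ->execute([$row['id']]);
    $pdo->commit();

    // One cURL call at a time, with a hard timeout.
    $ch = curl_init('https://api.example.com/' . $row['job']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $ok = curl_exec($ch) !== false;
    curl_close($ch);

    $pdo->prepare("UPDATE jobs SET status = ? WHERE id = ?")
        ->execute([$ok ? 'done' : 'failed', $row['id']]);
}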

Is there any way to make PHP on the server side perform some kind of actions on the data on its own?

I have this scenario:
A user submits a link to my PHP website and closes the browser. Now that the server has the link, it will analyse the submitted link (page) for broken links, and after it has completely analysed the posted link, it will send an email to the user. I have a complete understanding of the second part, i.e. how to analyse the page for broken links and send the mail to the user. The only problem I have is the first part: how can I make the server keep running these actions on its own, even if no request is made by the client end?
I have learned that crontab or a fork may work for me. What do you say about these? Is it possible to achieve what I want using them? What are the alternatives?
crontab would be the way to go for something like this.
Essentially you have two applications:
A web site where users submit data to a database.
An offline script, scheduled to run via cron, which checks for records in the database and performs the analysis, sending notifications of the results when complete.
Both of these applications share the same database, but are otherwise oblivious to each other.
A website itself isn't well suited for this sort of offline work; it's mainly a request/response system. But a scheduled task works well here. Unless the user expects an immediate response, a small delay while waiting for the next scheduled run of the offline task is fine.
The server should run the script independently of the browser. Once the request is submitted, the PHP server runs the script and returns the result to the browser (if it has a result to return).
An alternative would be to add the request to a database and then use crontab to run the PHP script at a given interval. The script would check the database to see if there's anything that needs to be processed. You could limit the script to one database entry per minute (or whatever works). This helps prevent performance problems if you get many requests at once, but will be slower to send the email.
A typical approach would be to enter the link into a database when the user submits it. You would then use a cron job to execute a script periodically, which will process any pending links.
Exactly how to set up a cron job (or equivalent scheduled task) depends on your server. If your host provides a web-based admin tool (such as cPanel), there will often be a way to do it there.
The PHP script will keep running after the client closes the browser (terminating the connection).
Just keep in mind that a PHP script's maximum execution time is limited by the max_execution_time directive.
Of course, I'm supposing here that the link submission happens by calling your script page... I'm not sure if this is your use case...
For the sake of simplicity, a cronjob could work wonders. The user submits a link, and the web handler simply saves the link into a DB (let me pretend here that the table is named "queued_links"). Then a cronjob scheduled to run each minute (for example) selects every link from queued_links, does the application logic (finds broken page links), and sends the email. It then also deletes the link from queued_links (or updates a flag to mark the link as already processed).
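A minimal sketch of that cronjob, assuming the queued_links table above with a processed flag; checkBrokenLinks() is a hypothetical stand-in for the analysis code the asker already has:

<?php
// Runs every minute from cron: process pending links, then flag them.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT id, url, email FROM queued_links WHERE processed = 0')
            ->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    $broken = checkBrokenLinks($row['url']);   // hypothetical analyser
    mail($row['email'],
         'Link report for ' . $row['url'],
         count($broken) . ' broken link(s) found.');
    $pdo->prepare('UPDATE queued_links SET processed = 1 WHERE id = ?')
        ->execute([$row['id']]);
}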
For the sake of scale and speed, a cronjob wouldn't fit as well as a message queue (see rabbitmq, activemq, gearman, and beanstalkd; gearman and beanstalkd are my two favourites, simple and a good fit for php). Instead of spawning a cronjob every minute, a queue processor listens for 'events' (think 'onLinkSubmission($link)') and processes the messages asynchronously, as soon as possible. The cronjob solution is just a simplified implementation of one of these MQ solutions; the MQ route gives better and more predictable results, but at the cost of new services to maintain, etc.
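As an illustration of the MQ route with beanstalkd, here is a sketch using the Pheanstalk client (composer require pda/pheanstalk). The tube name and payload are assumptions, and the exact API varies between Pheanstalk versions:

<?php
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$queue = Pheanstalk::create('127.0.0.1');

// Producer (in the web request): publish an onLinkSubmission event.
$queue->useTube('links')->put(json_encode(['url' => 'https://example.com']));

// Consumer (long-running worker): block until a message arrives.
$queue->watch('links');
while ($job = $queue->reserve()) {
    $payload = json_decode($job->getData(), true);
    // ... analyse $payload['url'] for broken links, send the email ...
    $queue->delete($job);   // acknowledge so it isn't redelivered
}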
Well, there are a couple of ways; the simplest would be:
When a user submits a request, save it somewhere (let's call it a jobs table) and inform the customer that the request has been received and they'll be updated once the site finishes processing it, or whatever suits you.
Now create one (or multiple) scripts, depending on requirements, and run them from cron; each script picks requests from the jobs table, processes them, and does whatever is required.
Alternatively, you can evaluate using a message queue, or maybe a job server, for this.
So it all depends on your requirements.

Run PHP script like an application without Browser

Hi, I am new to PHP and have no idea if what I am about to ask is even possible or makes sense, but here goes.
I want to execute a PHP script as if I were executing a standalone application on the web server. What I am trying to implement: when a customer purchases something on the website and sees the payment confirmation notice, they should be allowed to close the browser window or log off without affecting the big order-generation process that gets started once they reach the page confirming that the payment was successful.
Right now I am using AJAX to call my after-payment-processing PHP script, and I have set that script to ignore any user abort call.
This is the page that tells the user that the payment was received successfully.
thankyou.php
This is the page that performs the processing that needs to be done only after successful receipt of payment
FinishCheckoutProcess.inc.php
Now, thankyou.php uses AJAX to execute FinishCheckoutProcess.inc.php asynchronously, and FinishCheckoutProcess.inc.php starts with this call:
ignore_user_abort(true);
The combination of AJAX and ignore_user_abort(true) lets the after-payment process run without errors even if the user closes the browser window. But since this script has nothing to do with the user or the browser, I just wanted to know if it's possible to run it in the background, like a standalone application, independent of the browser.
Also, my web server is Apache and the OS is Linux (Ubuntu).
My work is getting done but I just want to know if there is a better/safer way to do it.
Anyway, thanks in advance to everyone; this site has helped me more than any book could have. To all you experts out there who donate their time to newbies like me: you guys are awesome. Please keep up the good work.
Again thanks a lot.
Based on suggestions received
If I use the "exec" method to execute the FinishCheckoutProcess.inc.php, will this execute database related commands and will it be able to run further PHP scripts.
FinishCheckoutProcess.inc.php in turn executes a series of other PHP scripts which in turn executes other PHP scripts, so will using "exec" command to run FinishCheckoutProcess.inc.php create any difficulties.
FinishCheckoutProcess.inc.php process also does interaction with the MySQL database, so will I be able to do this if I execute this script using "exec" command. I am passing the necessary MySQLi connection object to this PHP script right now. So can I pass it the same way to it using "exec"
Also, the process is quite heavy, as it generates a set of 4 image files using Imagick and ImageMagick.
It generates a set of 4 image files for every product ordered, so if the quantity of one product is 10, the total files generated will be 1 x 10 x 4 = 40.
If there are two products, one with quantity 2 and the other with quantity 4, the total files generated will be:
1 x 2 x 4 = 8, plus
1 x 4 x 4 = 16, for a total of 24.
So this script might need to run for a long time and cannot be allowed to stop for timeout reasons; it needs to finish what it started.
Basically, the FinishCheckoutProcess.inc.php logic and process is quite complex, so I just want to confirm whether "exec" can handle it or not.
Also, I'm not sure, but some of the scripts make use of $_SESSION variables. If that's a problem I can modify it; $_SESSION only gets used in one place, and yes, the $_SESSION values get set in the browser session by a previous PHP script, before FinishCheckoutProcess.inc.php executes.
I just want to execute FinishCheckoutProcess.inc.php independently of the parent/calling script, i.e. thankyou.php, so that if the user closes the browser, FinishCheckoutProcess.inc.php will not stop or abort because thankyou.php is no longer running.
FYI, you can run PHP scripts from the command line, like: php my/script.php
A safer way to do it would be a master/worker process workflow. The master process runs on the server, checks a queue of work, and spawns worker processes to handle items on the queue as they arrive.
In your scenario, you add stuff to the queue when the user pays. Once it is added to the queue, you can send thankyou.php back to the user, and they can continue or leave or whatever. Once the work is on the queue, your master process spawns a worker process to handle it (basically doing everything in FinishCheckoutProcess.inc.php).
You can implement this in PHP with: php master.php
master.php
<?php
// master.php - poll the queue forever, spawning a worker per item.
while (true) {
    if (queueHasItems()) {   // hypothetical helper: check the queue
        // Run the worker in the background so the master keeps looping.
        shell_exec('php worker.php > /dev/null 2>&1 &');
    }
    sleep(5);                // avoid a busy loop while the queue is empty
}
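And a hypothetical worker.php to match, which claims one queue item, does the FinishCheckoutProcess work, and exits (the schema and the finishCheckoutProcess helper are assumptions):

<?php
// worker.php - handle exactly one queued order, then exit.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$row = $pdo->query("SELECT id, order_id FROM work_queue
                    WHERE status = 'pending' ORDER BY id LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($row === false) {
    exit(0);   // nothing to do
}
$pdo->prepare("UPDATE work_queue SET status = 'running' WHERE id = ?")
    ->execute([$row['id']]);

finishCheckoutProcess($row['order_id']);   // hypothetical: the heavy lifting

$pdo->prepare("UPDATE work_queue SET status = 'done' WHERE id = ?")
    ->execute([$row['id']]);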
From what I understand, you are looking for something like Laravel offers with its illuminate/queue package:
Queues allow you to defer the processing of a time consuming task, such as sending an e-mail, until a later time which drastically speeds up web requests to your application.
This isn't something that only Laravel offers, though it does simplify/ease the implementation of such a mechanism.
In the background, supervisord executes a "worker" PHP script that executes tasks you put in a common place (a db table, the filesystem, anything); those tasks are usually references to a certain class/method with some variables to send to it.
The following links might give you a better understanding:
http://supervisord.org/index.html
https://laravel.com/docs/5.1/queues
https://laravel.com/docs/5.1/queues#supervisor-configuration
There are many ways you could implement a queue system, also without the use of supervisord. But I recently implemented this method myself because it guarantees my tasks are processed, even after a server restart (if configured properly).
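For reference, a supervisord program entry for such a worker might look like this (the program name, command, and paths are assumptions; the Laravel docs linked above show the canonical version):

[program:queue-worker]
command=php /var/www/app/artisan queue:work --sleep=3 --tries=3
process_name=%(program_name)s_%(process_num)02d
numprocs=2
autostart=true
autorestart=true
user=www-data
redirect_stderr=true
stdout_logfile=/var/log/queue-worker.log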

Commercial PHP script, long running processes. daemons vs. cronjobs?

I'm putting together my first commercial PHP application. It's nothing really huge, as I'm still eagerly learning PHP :)
Right now I'm still in the conceptual stage of planning my application, but I keep running into one problem: the application is supposed to be self-hosted by my customers on their own servers, and it will include some very long-running scripts, depending on how much data each customer enters into the application.
Now I think I have two options: either use cronjobs (for example, one or more cronjobs that each customer can set up himself), OR make the whole data processing run as daemons in the background...
My question is, since it's a self-hosted application (and every server is different): is it even recommended to write PHP that starts background processes on a customer's server, or is that something you can only do reliably on your own server?
Or should I use cronjobs for these long running processes?
(Depending on the amount of data my customers enter into the application, a process could run for 3+ hours.)
Is that even a problem that can be solved reliably with PHP? Excuse me if this is a weird question; I'm really not experienced with PHP daemons and/or long-running cronjobs created by PHP.
So to recap everything:
A commercial self-hosted application including long-running processes: cronjobs or daemons? And is either (or both) a reliable solution for a paid application that you can give to your customers with a clear conscience, knowing it will work reliably on all kinds of different servers?
EDIT
PS: Sorry, I forgot to mention that the application targets only Linux servers: Debian, Ubuntu, etc.
Short answer: no, don't go for background processes if this will be a client-hosted solution. If you go towards the ASP concept (Application Service Provider... not Active Server Pages ;)), then you can do some wacky stuff with background processes and external apps connecting to your SQL servers and processing stuff for you.
What I suggest is to create a strong task-management backbone and link it to a solid task-processing infrastructure. I recommend you read an old post I wrote quite some time ago about background processes and a strategy I adopted to fix long-running processes:
Start & Stop PHP Script from Backend Administrative Webpage
Happy reading...
UPDATE
I realize that my old post is far from easy to understand so here goes:
You need 2 models, Job and JobQueue, and 2 controllers, JobProcessor and XYZProcessor.
JobProcessor is called either by a user when a page triggers it, or by a cronjob, as you wish. JobProcessor::process() is the key that starts or continues the whole processing. It loads the JobQueues and asks them if there is work to do. If there is, it asks the jobqueue to start/continue its job.
JobQueue model: used to queue several jobs one behind the other; it controls which job is current by keeping some kind of ID and STATE for the running job.
Job model: represents exactly what needs to be done. It contains, for example, the name of the controller that will process the data, the function to call to process the data, and a serialized configuration property that describes what must be done.
XYZController: the one that contains the processing method. When the processing method is called, the controller must load everything it needs into memory and then process each individual unit of work as fast as possible.
Example:
index.php is called
index.php creates a JobProcessor controller
index.php calls the JobProcessor's process()
JobProcessor::process() loads all the queues and processes them
For each JobQueue::process(), the job queue loads its possible Jobs and detects whether one is currently running. If none is running, it starts the next one by calling Job::process()
Job::process() creates the XYZController that will work the task at hand. For example, my old system had an InvoicingController and a MassmailingController that worked hand in hand.
Job::process() calls XYZController::prepare() so that it loads the information it needs to process (for example, load a batch of emails to process, or a batch of invoices to create)
Job::process() calls XYZController::runWorkUnit() so that it processes a single unit of work (for example, create one invoice, send one email)
Job::process() asks JobProcessingController::doIStillHaveTimeToProcess() and, if so, continues processing the next element.
Job::process() runs out of time and calls XYZController::cleanup() so that all resources are released
JobQueue::process() ends and returns to the JobController
JobController::process() is about to end? Open a socket, call myself back so I can start another round of processing, until there is nothing left to do
The user request that started at step 1 is finally handled and returned.
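A hypothetical skeleton of the pieces named above; beyond the method names from the post, everything here is an assumption:

<?php
// Contract every task controller (InvoicingController, ...) implements.
interface WorkUnitProcessor
{
    public function prepare(array $config);  // load a batch into memory
    public function runWorkUnit();           // do one unit; false when done
    public function cleanup();               // release resources
}

class Job
{
    public function process(WorkUnitProcessor $controller, array $config)
    {
        $controller->prepare($config);
        while ($controller->runWorkUnit() !== false) {
            if (!$this->stillHaveTimeToProcess()) {
                break;   // defer remaining units; the socket callback resumes
            }
        }
        $controller->cleanup();
    }

    private function stillHaveTimeToProcess()
    {
        // e.g. compare elapsed time against a per-request budget
        return microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'] < 20.0;
    }
}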
Ultimately, you can instead open a socket each time and ask the processor to do something, or you can queue a cronjob to call your processor. This way your users won't get stuck waiting for 3 or 4 work units to complete each time.
It's worth noting that, in addition to running daemons or cron jobs, you can kick off long-running processes from a web request (but note that they must run outside the webserver process group), and of course there's asynchronous message processing (which is essentially a variant of the batch approach).
All four of these approaches are very different in terms of how they behave and how concurrency and timing are managed. The factors which make them different are the same ones you omitted from your question, so it's not really possible to answer.
Unfortunately, all of them rely on facilities which differ greatly between MS Windows and POSIX systems, so although PHP will run on both, if you want to sell your app on both platforms it's going to need 2 versions.
Maybe you should talk to your potential customer base and ask them what they want?

Infrastructure for Running your Zend Queue Receiver

I have a simple messaging queue set up and running using the Zend_Queue object hierarchy. I'm using a Zend_Queue_Adapter_Db back-end. I'm interested in using this as a job queue, to schedule things for processing at a later time. They're jobs that don't need to happen immediately, but should happen sooner rather than later.
Is there a best-practices/standard way to set up your infrastructure to run jobs? I understand the code for receiving a message from the queue, but what's not so clear to me is how to run the program that does the receiving. A cron that receives n messages on the command line, run once a minute? A cron that fires off multiple web requests, each one running the receiver script? Something else?
Tangential bonus question: if I'm running other queries with Zend_Db, will the message queue queries be considered part of that transaction?
You can do it like a thread pool. Create a command-line PHP script to handle the receiving. It should be started by a shell script that automatically restarts the process if it dies. The shell script should not start the process if it is already running (use a $pid.running file or similar). Have cron run several of these every 1-10 minutes. That should handle the receiving nicely.
I wouldn't have the cron fire a web request unless your cron is on another server for some strange reason.
Another way to use this would be to have some background process creating data, with web users consuming it as they naturally browse the site. A report generator might work this way: company-wide reports are available to all users, but you don't want them all generating this db/time-intensive report. So you create a queue and process one at a time, possibly removing duplicates. All users can view the report(s) when ready.
According to the docs, it doesn't look like Zend_Queue even uses the same connection as your other Zend_Db queries. But of course the best way to find out is to make a simple test.
EDIT
The multiple lines in the cron are for concurrency: each line represents a worker for the pool. I was not clear before; you don't want the pid as the identifier, you want to pass a process name as a parameter:
/home/byron/run_queue.sh Process1
/home/byron/run_queue.sh Process2
/home/byron/run_queue.sh Process3
The bash script would check for the $process.running file; if it finds it, it exits.
Otherwise:
Create the $process.running file.
Start the PHP process; block/wait until it finishes.
Delete the $process.running file.
This allows the PHP script to die without causing the pool to lose a worker.
If the queue is empty, the PHP script exits immediately and is started again by the next invocation of cron.
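If you'd rather keep the lock logic in PHP than in the shell wrapper, an flock-based equivalent is possible (the worker name is passed as a parameter, as above; receiveQueueMessages() is a hypothetical stand-in for your Zend_Queue receive loop, and the lock path is an assumption):

<?php
// run_queue.php <ProcessName> - exit immediately if this worker is running.
$name = isset($argv[1]) ? $argv[1] : 'Process1';
$lock = fopen("/tmp/{$name}.running", 'c');

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0);   // another instance of this worker already holds the lock
}

receiveQueueMessages();   // hypothetical: your Zend_Queue receive loop

flock($lock, LOCK_UN);    // also released automatically on exit
fclose($lock);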
