We are working on a project where we use Node.js in the background with sockets to respond continuously to a web application. Sometimes one of these processes stops on its own.
We would like to know how we can check all the processes running under forever.
We are using sudo forever list to list all processes. Is there any way to use this command (forever list) in a .sh (shell script) file to check whether a specific process, e.g. responsclient, is running or not? If that particular process is not running, then we need to start it.
There are several solutions that will ensure that your service is always running.
One of them is even called forever. Here you have an overview prepared by the Express project.
However, for production services I recommend Passenger. The result is almost the same, but with much greater scalability; for example, you can configure it so that another instance is added automatically.
Almost, because Passenger is designed to ensure the availability of HTTP services, not the constant operation of an arbitrary application.
BTW: your service probably stops because you have an uncaught exception.
Update
If you insist on forever (we are talking about the same forever, right?), then:
Make sure that forever is always run by the same user; forever keeps a separate process manager for each user.
Make sure you save your data in the same place (an automatic run, e.g. from cron, differs from a manual startup in its environment variables).
forever has a --pidFile option, which makes it very easy to check whether the process is running.
Also, ps aux | grep node should be your big friend.
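To answer the original question, here is a minimal shell sketch of such a check (the script path and the responsclient pattern are placeholders; adjust them, and run this from cron as the same user that started forever):

#!/bin/bash
# Restart responsclient if forever no longer lists it.
if ! forever list | grep -q "responsclient"; then
    forever start /path/to/responsclient.js
fi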
No, I never combined all of the above myself. When I started having problems, I switched to Passenger. In the end that worked out well, because I got professional monitoring running in less time than it would have taken to figure out how to combine the points above.
Related
I have a website, created using PHP and running on Apache. I want a subscriber to be able to log in and start a process on the server. They can then log out or close the browser without interrupting the process. Later they can log in and see the progress or see the results of the original process. What is the best way to accomplish this (having the process run until completion, after the browser is closed)?
Just looking for someone to point me in the right direction. A few people mentioned Gearman.
Gearman would be an ideal candidate, and I would use it for exactly the purpose you describe. It has everything you need out of the box to meet your requirements ("background" a long running, CPU-bound process to another machine, e.g. video encoding).
There is a Gearman PHP library, but you can write your worker code in a different language if it's better suited to doing the work.
For reporting progress information, I recommend having the worker write to Redis or Memcached - some kind of temporary storage that your web server can also access.
Check out the simple PHP example on the Gearman site. For learning, I recommend setting up a lab environment that contains three separate VMs: one for your web server (the client), one for the Gearman job queue (the server), and one for processing jobs (the workers).
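As a rough illustration of that split (a sketch only, assuming the pecl gearman and phpredis extensions; the encode_video function name, host names, and Redis key are placeholders):

// client.php -- runs inside the web request and returns immediately
$client = new GearmanClient();
$client->addServer('gearman-server', 4730);        // the job-queue VM
$client->doBackground('encode_video', json_encode(array('video_id' => 42)));

// worker.php -- long-running CLI process on the worker VM
$worker = new GearmanWorker();
$worker->addServer('gearman-server', 4730);
$worker->addFunction('encode_video', function (GearmanJob $job) {
    $task = json_decode($job->workload(), true);
    $redis = new Redis();
    $redis->connect('redis-host', 6379);
    // ... do the actual encoding, updating progress as you go ...
    $redis->set('progress:' . $task['video_id'], '100');
});
while ($worker->work());                           // block, one job at a time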
How can I check whether the cron service is running, using a PHP script?
I want to have a PHP script that checks whether the cron service is running and, if not, notifies the admin via email so that they can take immediate action.
Thank you
Depending on your OS, there are three approaches (all of which add considerable performance overhead but might be acceptable for your app):
Check the process list - You can execute a console command to check the list of running processes (see the sketch after this list). I don't think this is possible on Windows, but it is no problem on Linux. Take EXTRA care to escape any and all variables used there, as running console commands can be a big security risk.
Run files - Create a file at the start of your script and check for its existence. I think this is how most (even non-PHP) processes check whether they are running. Performance loss and security issues should be minimal; you do have to take care that the file is properly removed, though, even in case of an error.
Info in storage - Like the file solution, you can add information to your database or another storage system. The performance loss is slightly bigger than file I/O, but if you already have a DB connection for your script, it might be worth it. It is also easier to store more information about your current process there, or to add logging to it.
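A minimal sketch of the process-list approach (the admin address is a placeholder; some distros name the daemon crond rather than cron):

// check-cron.php -- emails the admin if no cron daemon is found
$output = shell_exec('pgrep -x cron');             // try 'crond' on RHEL/CentOS
if (trim((string) $output) === '') {
    mail('admin@example.com', 'cron is down', 'No cron process found on ' . php_uname('n'));
}

Note that nothing user-supplied goes into the shell_exec() call, in line with the security warning above.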
I'm developing a PHP service which performs numerous operations per customer, and I want it to run continuously. I've already taken a look at cron, but as far as I understood, cron makes it possible to run code at set times. This can be a bit dangerous, since we depend on the code having finished running before it starts over, and the time for each run may vary as the customer base increases. So refreshes, cron, or other timed intervals can't be used, as far as I'm aware.
So I'm wondering if you know of any solution where I can restart my service when it finishes, and under no circumstances have it re-run before all the code has been executed?
I'm sorry if this has been answered before or is easily found on Google; I have tried to find something, but to no avail.
Edit: I could set the timed interval to 1 hour to be absolutely sure, but I want as little time as possible between each run.
Look at this:
http://www.godlikemouse.com/2011/03/31/php-daemons-tutorial/
What you need is a daemon that keeps running. There are more solutions than the while loop shown there.
The following I once used in a project: http://kvz.io/blog/2009/01/09/create-daemons-in-php/ , it's also a package for PEAR: http://pear.php.net/package/System_Daemon
For more information, see the following SO links:
What is a daemon: What is daemon? Their practical use? Usage with php?
How to use: PHP script that works forever :)
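For flavour, the core trick those links describe looks roughly like this (a bare-bones sketch, assuming the pcntl and posix extensions; real daemons also need signal handling and logging):

// daemon.php -- fork, detach from the terminal, then loop forever
$pid = pcntl_fork();
if ($pid === -1) { exit(1); }    // fork failed
if ($pid > 0)  { exit(0); }      // parent exits; the child lives on
posix_setsid();                  // become session leader, detach from the TTY
while (true) {
    // do one round of work here
    sleep(5);
}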
Have you tried running the PHP script as a background process? This post has more details: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
If you do not want to learn how to code a daemon, I recommend using software that manages processes in userland: Supervisor (http://supervisord.org/).
You just need to write a configuration file to specify which processes you want to run, and how.
It is extremely simple to configure and it is very adaptable (you can force having only one instance of your process, or instead have a fixed number of instances... etc).
It will also handle automatic restart in case your script crashes, and logging.
On the PHP side, just create a script that never quits, using a while(true) { ... } loop, and add an entry like this in supervisord's conf:
[program:your-script]
command=/usr/bin/php /path/to/your_script.php
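A slightly fuller entry showing the behaviour mentioned above (the program name and paths are placeholders; the option names are standard supervisord settings):

[program:your-script]
command=/usr/bin/php /path/to/your_script.php
numprocs=1                    ; exactly one instance
autostart=true                ; start when supervisord starts
autorestart=true              ; restart the script if it crashes
stdout_logfile=/var/log/your_script.log
redirect_stderr=true          ; capture errors in the same log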
I'm using that software in production for a few projects (to run Ruby and PHP Gearman asynchronous workers for websites).
Try having custom logic where you can set a flag ON and OFF, and check that flag in your cron job before running the code inside it. I wanted to suggest something like a queue-based solution: once you get an entry, run your processing logic, either as a daemon or from cron. That gives you more control over whether your task is OK to execute right now.
I'm putting together my first commercial PHP application; it's nothing really huge, as I'm still eagerly learning PHP :)
Right now I'm still in the conceptual planning stage, but I keep running into one problem: the application is supposed to be self-hosted by my customers on their own servers, and it will include some very long-running scripts, the duration depending on how much data each customer enters into the application.
Now I think I have two options: either use cron jobs, for example one or more cron jobs whose schedule every customer can set himself, OR implement the whole data processing as daemons that run in the background...
My question is, since it's a self-hosted application (and every server is different), is it even advisable to write PHP that starts background processes on a customer's server, or is this something that you can do reliably only on your own server...?
Or should I use cron jobs for these long-running processes?
(Depending on the amount of data my customers enter into the application, a process could run 3+ hours.)
Is that even a problem that can be solved, reliably, with PHP...? Excuse me if this is a weird question; I'm really not experienced with PHP daemons and/or long-running cron jobs created by PHP.
So to recap everything:
A commercial self-hosted application, including long-running processes: cron jobs or daemons? And is either (or maybe both) a reliable enough solution for a paid application that you can hand to your customers with a clear conscience, because you know it will work reliably on all kinds of different servers...?
EDIT:
PS: Sorry, I forgot to mention that the application targets only Linux servers, e.g. Debian, Ubuntu, etc.
Short answer: no, don't go for background processes if this will be a client-hosted solution. If you go towards the ASP concept (Application Service Provider... not Active Server Pages ;)), then you can do some wacky stuff with background processes and external apps connecting to your SQL servers and processing stuff for you.
What I suggest is to create a strong task-management backbone and link that to a solid task-processing infrastructure. I recommend you read an old post I wrote quite some time ago regarding background processes and a strategy I adopted to handle long-running processes:
Start & Stop PHP Script from Backend Administrative Webpage
Happy reading...
UPDATE
I realize that my old post is far from easy to understand, so here goes:
You need two models, Job and JobQueue, and two controllers, JobProcessor and XYZController.
JobProcessor is called either by a user when a page triggers it, or from a cron job, as you wish. JobProcessor::Process() is the key method that starts the whole processing, or continues it. It loads the JobQueues and asks each queue whether there is work to do. If there is, it asks the job queue to start or continue its job.
JobQueue model: Used to queue several jobs one behind the other; it controls which job is current by keeping some kind of ID and STATE for the running job.
Job model: Represents exactly what needs to be done. It contains, for example, the name of the controller that will process the data, the function to call to process the data, and a serialized configuration property that describes what must be done.
XYZController: The one that contains the processing method. When the processing method is called, the controller must load everything it needs into memory and then process each individual unit of work as fast as possible.
Example:
A call comes in to index.php.
index.php creates a JobProcessor controller.
index.php calls the JobProcessor's Process() method.
JobProcessor::Process() loads all the queues and processes them.
For each JobQueue::Process(), the job queue loads its pending Jobs and detects whether one is currently running. If none is running, it starts the next one by calling Job::Process().
Job::Process() creates the XYZController that will work the task at hand. For example, my old system had an InvoicingController and a MassmailingController that worked hand in hand.
Job::Process() calls XYZController::Prepare() so that it loads the information it needs to process (for example, a batch of emails to send or a batch of invoices to create).
Job::Process() calls XYZController::RunWorkUnit() so that it processes a single unit of work (for example, create one invoice, send one email).
Job::Process() asks JobProcessor::DoIStillHaveTimeToProcess() and, if so, continues processing the next element.
Job::Process() runs out of time and calls XYZController::Cleanup() so that all resources are released.
JobQueue::Process() ends and returns to JobProcessor.
JobProcessor::Process() is about to end? Open a socket, call yourself back so you can start another round of processing, and repeat until there is nothing left to do.
Finally, handle the user request that started everything in step 1.
Ultimately, you can instead open a socket each time and ask the processor to do something, or you can queue a cron job to call your processor. This way your users won't get stuck waiting for three or four work units to complete each time.
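A compressed sketch of the time-boxed loop at the heart of Job::Process() (class and method names follow the description above; everything else, such as how a controller reports a finished batch, is an illustrative assumption):

// Inside Job::Process() -- hypothetical sketch of the pattern described above
$class = $this->controllerName;               // e.g. 'InvoicingController'
$controller = new $class();
$controller->Prepare();                       // load a batch of work into memory
while ($this->processor->DoIStillHaveTimeToProcess()) {
    if (!$controller->RunWorkUnit()) {        // assume false means the batch is done
        break;
    }
}
$controller->Cleanup();                       // release resources before returning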
It's worth noting that, in addition to running daemons or cron jobs, you can kick off long-running processes from a web request (but note that they must run outside of the webserver's process group), and of course there is asynchronous message processing (which is essentially a variant on the batch approach).
All four of these approaches behave very differently in terms of how concurrency and timing are managed. The factors that differentiate them are the same ones you omitted from your question, so it's not really possible to give a definitive answer.
Unfortunately, all of them rely on facilities which differ greatly between MS Windows and POSIX systems, so although PHP will run on both, if you want to sell your app on both platforms it's going to need two versions.
Maybe you should talk to your potential customer base and ask them what they want?
I am developing a website that requires a lot of background processes for the site to run, for example a queue, a video encoder, and a few other types of background processes. Currently I have these running as PHP CLI scripts that contain:
while (true) {
    // some code
    sleep($someAmountOfSeconds);
}
OK, these work fine and everything, but I was thinking of setting them up as daemons, which would give them an actual process ID that I can monitor; I could also run them in the background without keeping a terminal open all the time.
I would like to know if there is a better way of handling these. I was also thinking about cron jobs, but some of these processes need to loop every few seconds.
Any suggestions?
Creating a daemon which you can make calls to and ask questions of would seem the sensible option. It depends on whether your host permits such things; especially if you need it to do work every few seconds, an OS-based service/daemon is definitely far more sensible than anything else.
You could create a daemon in PHP, but in my experience this is a lot of hard work and the result is unreliable due to PHP's memory management and error handling.
I had the same problem: I wanted to write my logic in PHP but have it daemonised by a stable program that could restart the PHP script if it failed, and so I wrote The Fat Controller.
It's written in C, runs as a daemon and can run PHP scripts, or indeed anything. If the PHP script ends for whatever reason, The Fat Controller will restart it. This means you don't have to take care of daemonising or error recovery - it's all handled for you.
The Fat Controller can also do lots of other things such as parallel processing which is ideal for queue processing, you can read about some potential use cases here:
http://fat-controller.sourceforge.net/use-cases.html
I've done this for five years, using PHP to run background tasks, and it's no different from doing it in any other language. Just use cron and lock files; the lock file will prevent multiple instances of your script from running.
It's also important to monitor your code. One check I always do, to prevent stale lock files from blocking scripts, is to have a second cron job that checks whether the lock file is older than a few minutes and whether an instance of the PHP script is running; if not, it removes the lock file.
Using this technique allows you to set your cron to run the script every minute without issues.
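A minimal sketch of a flock()-based variant of that lock-file idea (the lock path is a placeholder; because flock() locks are released automatically when the process dies, this variant also sidesteps the stale-lock cleanup described above):

// worker.php -- run from cron every minute; exits at once if already running
$fp = fopen('/tmp/my-worker.lock', 'c');      // 'c' creates the file if missing
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit;                                     // another instance holds the lock
}
// ... do the background work here ...
flock($fp, LOCK_UN);
fclose($fp);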
Use the System_Daemon package from PEAR.
One solution (that I really need to try myself, as I may need it) is to use cron, but have the process loop for five minutes or so, and get cron to kick it off every five minutes. As one instance dies, the next one should just be starting (or close to starting).
Bear in mind that the two may overlap a bit, and so you need to ensure that this doesn't cause a clash (e.g. writing to the same video file). Some simple inter-process communication may be useful, even if it is just writing to a PID file in the temp directory.
This approach is a bit low-tech but helps avoid PHP hanging onto memory over the longer term: a sort of built-in task restart!
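A sketch of that time-boxed loop (the 290-second budget and PID-file path are illustrative assumptions, leaving a little slack before the next cron run):

// run from cron every 5 minutes; loops for just under 5 minutes, then exits
file_put_contents('/tmp/video-worker.pid', getmypid());
$deadline = time() + 290;
while (time() < $deadline) {
    // do one small unit of work here
    sleep(2);
}
unlink('/tmp/video-worker.pid');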