Automated scripts on a Linux Server (not cron jobs) - php

I am programming a website on a CentOS Linux server (I am planning to upgrade to a VPS plan where I will have root access). Much of the website will rely on automated scripts.
I have 2 questions about starting automated processes.
Is there any way I can start a daemon, or anything like that, which will run constantly? I need to execute a script every time an email account gets a new e-mail. I am aware of cron jobs that can run every minute, but a script that runs constantly would be ideal, so I can execute the script the moment a new e-mail arrives.
Is there any way from code (ideally PHP) to start a thread that runs concurrently with the main program? In the script I am using, imap_open is used to connect to an e-mail account, which takes a few seconds each time. If I could fire off multiple scripts concurrently, that should reduce the overall running time. Is there any way to do this?
Any help with these questions would be greatly appreciated.

You can certainly write a daemon / service that runs constantly. For a starting tutorial see
http://www.netzmafia.de/skripten/unix/linux-daemon-howto.html
Your daemon can use IMAP or POP3 (there are existing libraries available to support this) to periodically check the email account for new emails and act accordingly.
Here's a Stack Overflow question whose answers discuss how to accomplish all of this with Python:
How to make a Python script run like a service or daemon in Linux
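Since the asker already uses PHP's IMAP extension, the polling loop at the heart of such a daemon might look like the following sketch (the mailbox string, credentials, and handleMessage() are placeholders):

<?php
// Minimal polling-daemon sketch using PHP's IMAP extension (run from the CLI).
while (true) {
    $mbox = imap_open('{mail.example.com:993/imap/ssl}INBOX', 'user', 'pass');
    if ($mbox !== false) {
        // Find messages that have not been seen yet.
        $unseen = imap_search($mbox, 'UNSEEN');
        if ($unseen !== false) {
            foreach ($unseen as $msgno) {
                handleMessage(imap_fetchbody($mbox, $msgno, '1')); // hypothetical handler
            }
        }
        imap_close($mbox);
    }
    sleep(10); // poll interval; a procmail/inotify trigger would remove this delay entirely
}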

For the first part, there are two easy solutions:
Use the Vixie cron @reboot start specification to start your daemon at reboot as a standard user. This and every-minute cron jobs are the only mechanisms that make it easy to run a daemon-style service as a user.
Use procmail to start a new script on every email delivery. The downside here is that procmail will run and then start a new program on every email -- when you're getting a hundred emails per second, this could be a serious hindrance compared to a daemon that uses inotify(7) to alert a long-lived program about new emails.
For the second part, look for a wrapper for the fork(2) system call. It cleaves a program cleanly in half -- parent and child -- and allows each to continue independent execution from then on. If the child and parent need to communicate again in the future, then perhaps see if PHP supports threaded execution.
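In PHP that wrapper is pcntl_fork() (CLI only). A minimal sketch, assuming each child opens its own IMAP connection so the slow imap_open() calls run in parallel ($accounts and checkMailbox() are hypothetical):

<?php
$accounts = array('{host1}INBOX', '{host2}INBOX', '{host3}INBOX');

$pids = array();
foreach ($accounts as $account) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        checkMailbox($account); // child: do the slow IMAP work here
        exit(0);                // child must exit, or it continues the loop
    }
    $pids[] = $pid;             // parent: remember the child and keep going
}

// Parent waits for every child to finish.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}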

And what about incron? Maybe there is a way to use it in your case, but you would have to produce a filesystem event (for example, create a new file) whenever mail arrives.

Related

Running an infinite loop in cron job

Suppose I have written a PHP script to run on my server using a cron job, and I want to use an infinite loop in that PHP script. Any ideas for running an infinite loop in a cron job?
Infinitely looping applications are usually called daemons. They are system services that perform some kind of constant processing and/or stand ready to accept incoming work.
Gearman is a system daemon you can install that can handle various tasks you give it. It's a complex tool that allows many things, but it could be used to implement your requirements.
PHP::Gearman is a Gearman client that talks to the Gearman daemon and sends tasks to it, specifying the conditions under which each task must be executed.
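With the pecl/gearman extension, handing work to the daemon looks roughly like this sketch (the 'send_email' function name and payload are assumptions; your worker registers whatever names it likes):

<?php
// Client side: queue a background task with the Gearman daemon.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730); // default gearmand port
$client->doBackground('send_email', json_encode(array(
    'to'      => 'user@example.com',
    'subject' => 'Hello',
)));

// Worker side (a separate long-running CLI process):
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_email', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    mail($data['to'], $data['subject'], '...'); // or a proper mailer
});
while ($worker->work()) {
    // loop forever, handling jobs as they arrive
}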
The limitations that @Jeffrey emphasized about PHP are true because PHP was designed as a share-nothing architecture (one page load equals one script execution; each page load works under its own data context).
Perhaps System_Daemon (a PEAR package) may assist in overcoming some or all of the limitations mentioned above. I haven't used it, so I can't tell you much more about it, but it's as good a place to start as any.

How can I send my emails and perform some other tasks in batches using PHP

I run a website and my subscriber base is gradually increasing.
I had to manually batch my subscribers, that is, Batch A (1–700), Batch B (701–1400), etc., and manually trigger the email sending every hour.
In addition to sending them emails, I want to perform some other tasks alongside each send.
I believe there should be a way of triggering the sending once from the web interface (that is, from my website backend, please, not from the command line) and having it batch the emails and process them automatically every hour.
Looking forward to replies on how I can get this done.
Thanks in advance.
If you are unable to schedule cron jobs on your server (as is the case with most cheap hosting solutions), there are some pure PHP alternatives for running scheduled jobs: phpjobscheduler is one of those alternatives.
In UNIX-like systems, this can be done with cron. In Windows, see the Task Scheduler, schtasks or at.
If you don't have access to these tools, you cannot programmatically run scripts at a given interval (short of having another machine call your scripts via HTTP).
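If cron is available, the hourly batching described in the question reduces to a small script that cron runs once an hour. A sketch, assuming a subscribers table and a one-row batch_state table for tracking progress (all names are placeholders):

<?php
// Run hourly from cron:  0 * * * * php /path/to/send_batch.php
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

// Where did the previous run stop?
$offset = (int) $db->query('SELECT next_offset FROM batch_state')->fetchColumn();

$stmt = $db->prepare('SELECT email FROM subscribers ORDER BY id LIMIT 700 OFFSET ?');
$stmt->bindValue(1, $offset, PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $email) {
    mail($email, 'Newsletter', '...'); // plus whatever per-subscriber tasks you need
}

// Advance the offset for the next hourly run.
$db->exec('UPDATE batch_state SET next_offset = next_offset + 700');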

Threads in PHP?

I am creating a web application using Zend. I built an interface from which user A can send email to more than one user, and it works well, but it slows execution, so user A waits too long for the acknowledgement response (which is shown after the emails have been sent).
In Java there are threads, with which we can perform such a task (sending emails) without slowing the rest of the application.
Is there any technique in PHP/Zend, like threads in Java, by which we can split off tasks that could take a long time, e.g. sending emails?
EDIT (thanks @Efazati, there seems to be new development in this direction)
http://php.net/manual/en/book.pthreads.php
Caution (from the bottom of the linked page):
pthreads was, and is, an experiment with pretty good results. Any of its limitations or features may change at any time; [...]
/EDIT
No threads in PHP!
The workaround is to store jobs in a queue (say, rows in a table with the emails) and have a cron job call your PHP script at a given interval (say, every 2 minutes) to poll for jobs. When jobs are present, fetch a few (depending on your PHP install's timeout) and send the emails.
The main idea is to defer execution:
the main script adds jobs to the queue
the cron script sends them in small slices
Gotchas:
make sure you don't send an email without deleting it from the queue (worst case: a user receives the same spam at 2-minute intervals ...)
make sure you don't delete a job without executing it first ...
handle bouncing emails using a score algorithm
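A minimal sketch of a dequeue loop that respects the first two gotchas: each row is claimed before sending and deleted only after the send succeeds (table and column names are assumptions, and this assumes a single cron worker):

<?php
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Claim a small slice first (MySQL syntax), so a sent mail is never left 'pending'.
$db->exec("UPDATE email_queue SET status = 'sending'
           WHERE status = 'pending' ORDER BY id LIMIT 10");

$jobs = $db->query("SELECT id, recipient, subject, body
                    FROM email_queue WHERE status = 'sending'");

foreach ($jobs as $job) {
    if (mail($job['recipient'], $job['subject'], $job['body'])) {
        // Delete only after a successful send, so no job is lost unexecuted.
        $db->prepare('DELETE FROM email_queue WHERE id = ?')->execute(array($job['id']));
    } else {
        // Keep the row for inspection/retry instead of deleting it blindly.
        $db->prepare("UPDATE email_queue SET status = 'failed' WHERE id = ?")
           ->execute(array($job['id']));
    }
}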
You could look into using multiple processes, such as with fork. Communication between them wouldn't be as simple as with threads (but then, it won't come with all of their pitfalls either), but if you're just sending emails, it might not be necessary to communicate much, if at all.
Watch out for doing forks on an Apache process. You may get some behaviors that you are not expecting. If you are looking to do any kind of asynchronous execution it should be via some kind of queuing mechanism. Gearman is one. Zend Server Job Queue is another. I have some demo code at Do you queue? Introduction to the Zend Server Job Queue. Cron can be used, but you'll have the problem of depending on your cron scheduler to run tasks whereas asynchronous computing often needs to be run immediately. Using a queuing system allows you to do that without threading.
There is a Threading extension being developed based on PThreads that looks promising at https://github.com/krakjoe/pthreads
There is pcntl, which allows you to create sub-processes, but PHP doesn't lend itself well to this kind of architecture. You're probably better off creating a long-running script (a daemon) and spawning multiple copies of it.
PHP itself has no threads. However, you can have a look at this roundabout way:
http://www.alternateinterior.com/2007/05/multi-threading-strategies-in-php.html
You may want to use a queue system for your email sending and send the email from another system that supports threads. PHP is just a tool, and you should use the tool that is best suited to the job.
PHP doesn't include threading as part of the language; there are some methods that can emulate it, but they aren't foolproof.
A quick web search turns up a few potential workarounds.

Multithreaded Programming in PHP to avoid runtime limitations

I know about PHP not being multithreaded, but I talked with a friend about this: if I have a large algorithmic problem I want to solve with PHP, isn't the solution simply to use the curl_multi_xxx interface and start n HTTP requests against the same server? This is what I would call PHP-style multithreading.
Are there any problems with this in a typical webserver environment? The master request, which is waiting in curl_multi_exec, shouldn't count any of that time against its maximum runtime or memory limit.
I have never seen this promoted anywhere as a solution to prevent a script being killed by overly restrictive admin settings for PHP.
If I add this as a feature to a popular PHP system, will server admins hire a Russian mafia hitman to get revenge for this hack?
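For reference, the pattern the question describes looks roughly like this sketch (worker.php is a hypothetical script that computes one chunk of the problem):

<?php
// Fan out n HTTP requests to the same server and collect the results.
$mh = curl_multi_init();
$handles = array();
for ($i = 0; $i < 4; $i++) {
    $ch = curl_init("http://localhost/worker.php?chunk=$i"); // hypothetical worker URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// The "master" request blocks here while the workers run in parallel.
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100000); // select failed; back off briefly instead of busy-looping
    }
} while ($running > 0);

$results = array();
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);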
If I add this as a feature to a popular PHP system, will server admins hire a Russian mafia hitman to get revenge for this hack?
No, but it's still a terrible idea, for no other reason than that PHP is supposed to render web pages, not run big algorithms. I see people trying to do this in ASP.NET all the time. There are two proper solutions:
Have your PHP script spawn a process that runs independently of the web server and updates a common data store (probably a database) with information about the progress of the task, which your PHP scripts can then read.
Have a constantly running daemon that checks for jobs in a common data store, to which the PHP scripts can issue jobs and from which they can view the progress of currently running jobs.
By using curl, you are adding a network timeout dependency into the mix. Ideally you would run everything from the command line to avoid timeout issues.
PHP does support forking (pcntl_fork). You can fork some processes and then monitor them with something like pcntl_waitpid. You end up with one "parent" process that monitors the children it spawned.
Keep in mind that while one process can start up, load everything, and then fork, you can't share things like database connections. So each forked process should establish its own. I've used forking for up to 50 processes.
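A sketch of that pattern: a bounded pool of forked workers, reaped with pcntl_waitpid, where each child opens its own database connection (processJob() and the DSN are hypothetical):

<?php
$jobs = range(1, 200); // hypothetical work items
$maxWorkers = 50;
$running = 0;

foreach ($jobs as $job) {
    if ($running >= $maxWorkers) {
        pcntl_waitpid(-1, $status); // reap one finished child before forking more
        $running--;
    }
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: open its OWN connection; handles from before the fork can't be shared.
        $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
        processJob($db, $job);
        exit(0);
    }
    $running++;
}

while ($running-- > 0) {
    pcntl_waitpid(-1, $status); // wait for the stragglers
}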
If forking isn't available for your install of PHP, you can spawn a process as Spencer mentioned. Just make sure you spawn the process in such a way that it doesn't stop processing of your main script. You also want to get the process ID so you can monitor the spawned processes.
exec("nohup /path/to/php.script > /dev/null 2>&1 & echo $!", $output);
$pid = $output[0];
You can also use the above exec() setup to spawn a process started from a web page and get control back immediately.
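With the PID captured above, the parent can later check whether the worker is still alive. A short sketch (requires the posix extension; signal 0 delivers nothing and merely asks the kernel whether the process exists):

<?php
if (posix_kill((int) $pid, 0)) {
    echo "worker $pid is still running\n";
} else {
    echo "worker $pid has exited\n";
}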
Out of curiosity - what is your "large algorithmic problem" attempting to accomplish?
You might be better to write it as an Amazon EC2 service, then sell access to the service rather than the package itself.
Edit: you now mention "mass emails". There are already services that do this, they're generally known as "spammers". Please don't.
Lothar,
As far as I know, PHP doesn't work with services the way its competitors do, so PHP has no way of knowing how much time has passed unless you constantly interrupt the process to check the elapsed time. So, IMO, no, you can't do that in PHP :)

Infrastructure for Running your Zend Queue Receiver

I have a simple messaging queue set up and running using the Zend_Queue object hierarchy. I'm using a Zend_Queue_Adapter_Db back-end. I'm interested in using this as a job queue, to schedule things for processing at a later time. They're jobs that don't need to happen immediately, but should happen sooner rather than later.
Is there a best-practices/standard way to set up your infrastructure to run jobs? I understand the code for receiving a message from the queue, but what's not so clear to me is how to run the program that does the receiving. A cron job that receives n messages on the command line, run once a minute? A cron job that fires off multiple web requests, each running the receiver script? Something else?
Tangential bonus question. If I'm running other queries with Zend_Db, will the message queue queries be considered part of that transaction?
You can do it like a thread pool. Create a command line php script to handle the receiving. It should be started by a shell script that automatically restarts the process if it dies. The shell script should not start the process if it is already running (use a $pid.running file or similar). Have cron run several of these every 1-10 minutes. That should handle the receiving nicely.
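The receiving side of such a CLI worker is short with Zend_Queue. A sketch using the Db adapter from the question (queue name, connection options, and processJob() are placeholders):

<?php
$queue = new Zend_Queue('Db', array(
    'name'          => 'jobs',
    'driverOptions' => array(
        'host'     => 'localhost',
        'username' => 'user',
        'password' => 'pass',
        'dbname'   => 'app',
        'type'     => 'pdo_mysql',
    ),
));

// Pull up to 5 messages, process each, and delete it only once handled.
foreach ($queue->receive(5) as $message) {
    processJob($message->body);
    $queue->deleteMessage($message);
}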
I wouldn't have the cron fire a web request unless your cron is on another server for some strange reason.
Another way to use this would be to have some background process creating data, and web users consume it as they naturally browse the site. A report generator might work this way. Company-wide reports are available to all users, but you don't want them all generating this DB/time-intensive report. So you create a queue and process one at a time, possibly removing duplicates. All users can view the report(s) when ready.
According to the docs, it doesn't look like Zend_Queue even uses the same connection as your other Zend_Db queries. But of course the best way to find out is to run a simple test.
EDIT
The multiple lines in the cron are for concurrency. Each line represents a worker in the pool. I was not clear: you don't want the PID as the identifier; you want to pass a name as a parameter.
/home/byron/run_queue.sh Process1
/home/byron/run_queue.sh Process2
/home/byron/run_queue.sh Process3
The bash script would check for the $process.running file and, if it finds it, exit.
Otherwise:
Create the $process.running file.
Start the PHP process. Block/wait until it finishes.
Delete the $process.running file.
This allows the PHP script to die without causing the pool to lose a worker.
If the queue is empty, the PHP script exits immediately and is started again by the next invocation of cron.
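The same guard can be written in PHP using flock(), which avoids stale .running files after a crash, since the OS releases the lock when the process dies (paths and the worker script are placeholders):

<?php
// run_queue.php <name> -- PHP equivalent of the wrapper script described above.
$name = isset($argv[1]) ? $argv[1] : 'Process1';
$lock = fopen("/tmp/$name.lock", 'c');

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // another copy of this worker is already running
}

// Block/wait until the actual receiver finishes; the lock frees itself on exit.
passthru('php /path/to/receiver.php ' . escapeshellarg($name));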
