How to create a continuously executing script in PHP with no execution limit - php

I have created a script to listen continuously to an SMSC and catch delivered SMS messages. The script looks OK, but I want to make sure it never stops.
In one post it is stated that the server may interfere with
continuous execution and could stop it, but I could not
find any option in httpd.conf to disable the execution limit.
Secondly, is the script below going to take 100% CPU usage? Some
other applications are running on the same server.
Third, which medium is better: the browser or the PHP CLI?
I would appreciate any help or reading material on the stated problem. I am using Windows Server, XAMPP with Apache 2.2, and PHP 5.2.9.
set_time_limit(0);
do {
    // read incoming sms
    $sms = $tx->readSMS();
    // check sms data
    if ($sms && !empty($sms['source_addr']) && !empty($sms['destination_addr']) && !empty($sms['short_message'])) {
        // send sms for processing in smsadv
        $from    = $sms['source_addr'];
        $to      = $sms['destination_addr'];
        $message = $sms['short_message'];
        // insert in database...
    }
    // continuously check for SMS
} while (1);
EDIT
As per the response below, I used a PHP script through the CLI and without set_time_limit(0), since a script run through the CLI has no execution limit.

You should use a CLI script, because the typical scenario of using PHP on the server side to provide dynamic web pages isn't designed to have a script running "forever". As advised in the comments, you might use a scheduler to request the script through a browser frequently. If you don't plan to use a browser at all, use the CLI instead.
Regarding CPU load, you might consider inserting sleep(1) in your while loop (in case you are polling for incoming SMS rather than blocking while waiting for one, as presumed in the comments on the question).
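A minimal sketch of such a throttled polling loop for the CLI. StubTransceiver and the $maxIdleCycles guard are hypothetical additions (the guard only exists so the sketch terminates); in the question's real script, $tx comes from the SMSC library and the loop runs forever:

```php
<?php
// StubTransceiver stands in for the question's $tx object (hypothetical):
// readSMS() polls and returns null when no message is waiting.
class StubTransceiver {
    private $queue;
    public function __construct(array $queue) { $this->queue = $queue; }
    public function readSMS() { return array_shift($this->queue); }
}

function pollLoop($tx, $maxIdleCycles = 3) {
    $processed = [];
    $idle = 0;
    while ($idle < $maxIdleCycles) {      // in production: while (true)
        $sms = $tx->readSMS();
        if ($sms && !empty($sms['source_addr'])
                 && !empty($sms['destination_addr'])
                 && !empty($sms['short_message'])) {
            $processed[] = $sms;          // in production: insert into database
            $idle = 0;
        } else {
            $idle++;
            sleep(1);                     // yield the CPU between empty polls
        }
    }
    return $processed;
}
```

If readSMS() blocks while waiting for a message, the sleep() is unnecessary; the throttle only matters when the call returns immediately.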

Related

Limit of concurrent PHP scripts

I have some PHP scripts that process data.
Unfortunately, I can only run 6 scripts simultaneously.
It seems that there is some sort of limitation in PHP or Apache that makes the 7th script wait until another script ends.
The scripts only process data; there isn't any kind of connection to the database or any web request.
How can I increase this limit?
@Leo - How are you running these scripts simultaneously? Is a web browser calling them (as you have mentioned Apache in the question)? Browsers have a limit on simultaneous connections. Or is it some other scenario?
Place this function at the top of each page:
set_time_limit(0);
More details can be found at
http://php.net/manual/en/function.set-time-limit.php

Pass SNMP trap packet to a php daemon on Ubuntu

I have an Ubuntu server which is collecting incoming SNMP traps. Currently these traps are handled and logged by a PHP script.
In the file /etc/snmp/snmptrapd.conf:
traphandle default /home/svr/00-VHOSTS/nagios/scripts/snmpTrap.php
This script is quite long and contains many database operations. The server usually receives thousands of traps per day, so this script is taking too much CPU time. My understanding is that this is due to the high start-up cost of the PHP script every time a trap is received.
I was asked to rewrite this, and I was thinking of running the script as a daemon. I can create an Ubuntu daemon; my question is how I can pass the trap handler to this daemon via the snmptrapd.conf file.
Thank you in advance.
One suggestion is to use the MySQL support that's built into version 5.5 of snmptrapd. That way you can use MySQL as a queue and process the traps in bulk.
Details are on the snmptrapd page: http://www.net-snmp.org/wiki/index.php/Snmptrapd
If you are not using MySQL, another option is a named pipe.
Run mkfifo snmptrapd.log
Now change snmptrapd to write to this log. It's not a regular file, but it looks like one. You then write another daemon that watches the named pipe for new data.
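A sketch of such a pipe-watching daemon, assuming one trap per line; runFifoReader is a made-up name, and the $maxLines parameter exists only so the sketch can terminate:

```php
<?php
// Read trap lines from a named pipe so the PHP interpreter starts once,
// instead of once per trap as with the traphandle script.
function runFifoReader($path, callable $handle, $maxLines = PHP_INT_MAX) {
    $fh = fopen($path, 'r');          // on a fifo, blocks until a writer opens it
    $n = 0;
    while ($n < $maxLines && ($line = fgets($fh)) !== false) {
        $handle(rtrim($line, "\n"));  // hand one trap line to the processing code
        $n++;
    }
    fclose($fh);
    return $n;                        // number of traps processed
}
```

In production you would pass the path created with mkfifo and do the database work inside the $handle callback, batching inserts where possible.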
You can probably use php-fpm / php-fcgi to minimize the PHP script's start-up cost,
although you will probably need to write a wrapper shell script to forward requests from snmptrapd to the FastCGI protocol.
But first I'd recommend checking the PHP script itself: PHP's start-up cost is not so high that a few requests per minute should raise CPU usage noticeably.

long running php script called via ajax (Don't want to wait for response)

I have a long-running script that can run for a while (it sends an email every 5 seconds to many users). This script is triggered via an AJAX request.
If the response is never received, for example because the browser is closed, will the script continue to run? (It appears it does, but are there any conditions under which it won't?)
Does sleep count towards the max execution time? (It appears this is false also.)
1.
The short answer is: it depends.
In fact, it can be configured both in PHP and in the web server you use. It depends on the mode you run PHP in (as a module, as CGI, or otherwise).
You can configure it sometimes, though. There is an option in php.ini:
; If enabled, the request will be allowed to complete even if the user aborts
; the request. Consider enabling it if executing long requests, which may end up
; being interrupted by the user or a browser timing out. PHP's default behavior
; is to disable this feature.
; http://php.net/ignore-user-abort
;ignore_user_abort = On
2. Sleep almost always counts. There are conditions when it does not, but in those cases it is not wall-clock execution time that is measured but CPU time. IIS counts CPU usage by application pool; I am not sure how that applies to PHP scripts.
It is true that PHP does not kill a script that is in a sleep right now. That means the script will be killed once the sleep is over (easy test: add sleep(15); to your PHP and set the max execution time to 10. You will get an error about the time limit, but after 15 seconds, not 10).
In general, you cannot rely on freely using sleeps in a script, and you should not rely on a script running without a user (browser) within a web server, either. You are apparently solving the problem with the wrong methods: you should really consider using cron jobs / separate tasks.
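The pattern the answers describe, sketched for a script kicked off by an AJAX request. Only ignore_user_abort() and set_time_limit() come from the answers; sendBatch and its $send callback are hypothetical (a real script would call mail() there):

```php
<?php
// Keep running even if the AJAX caller closes the browser.
ignore_user_abort(true);   // don't abort when the client disconnects
set_time_limit(0);         // lift the execution time limit where it applies

// Hypothetical batch sender: one message every $delay seconds.
function sendBatch(array $recipients, callable $send, $delay = 5) {
    $sent = 0;
    foreach ($recipients as $i => $to) {
        if ($i > 0 && $delay > 0) {
            sleep($delay);     // throttle between messages
        }
        if ($send($to)) {      // in production: mail($to, $subject, $body)
            $sent++;
        }
    }
    return $sent;
}
```

Whether this survives a disconnect still depends on the server setup, as the answers note; a cron-driven worker avoids the question entirely.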
This depends on the server. Some servers may terminate the script when the socket is closed (which will probably happen when the browser is closed); others may let the script execute until the timeout is reached.
Again, it depends on the server. I can well imagine an implementation that looks at the time the script puts load on a CPU; then again, just measuring how long ago the script was started is an equally good approach. It all depends on what the people making the server software were after.
If you want definite answers, I would suggest sifting through the documentation for the web server and the PHP implementation your script is running on.

Execute PHP file in background

I have some PHP code that executes for a very long time.
I need to realise the following scheme:
The user enters some page (page 1).
This page starts execution of my large PHP script in the background (every change is written to the database).
Every N seconds we send a query to the database to get the current status of the execution.
I don't want to use the exec command, because 1000 users would mean 1000 PHP processes. That's not an option for me...
So you basically want a queue (possibly stored in a database) and a command-line script, run by cron, that processes queued items.
Clarification: I'm not sure what is unclear about my answer, but it complies with the two requirements imposed by the question:
The script cannot be aborted by the client
A single process is shared between 1,000 clients
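A minimal sketch of that idea, with an in-memory array standing in for the database table (the job fields and status values are made up; in production, enqueue() would INSERT a row and the cron-run worker would SELECT pending rows):

```php
<?php
// Page 1 only enqueues a job; a single cron-run CLI worker drains the queue.
function enqueue(array &$queue, array $job) {
    $job['status'] = 'pending';   // page 1 later polls this status column
    $queue[] = $job;
}

function runWorker(array &$queue, callable $process) {
    $done = 0;
    foreach ($queue as &$job) {
        if ($job['status'] === 'pending') {
            $process($job);            // the long-running work happens here
            $job['status'] = 'done';   // progress visible to the polling page
            $done++;
        }
    }
    unset($job);
    return $done;                      // jobs processed in this cron run
}
```

This is what keeps the process count at one regardless of how many users enqueue work.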
Use HTTP requests to the local HTTP server from within your script, in combination with PHP's ignore_user_abort() function.
That way you keep the load inside the HTTP server's worker processes, you have a natural limit, and queuing of requests comes for free.
You can use CLI to execute multiple PHP scripts
or
you can try Easy Parallel Processing in PHP

php script that runs on the server without a client request

I am working on a site that requires a PHP script running on a server without any request;
it is a bot script that keeps checking client accounts (not full-time, but at least once a day) and sends alert messages to clients when something happens.
Any ideas are appreciated.
Assuming you need to do this on Linux, you may run any PHP script from the browser and from the CLI as well.
You may run a simple PHP script:
<?php echo "Ana are mere"; ?>
like this:
php -f ./index.php
Be careful about file permissions; any bug that creeps into your code, memory leaks, or unreleased variables will become VERY visible now, as the process runs continuously.
If you don't want it running in the background all the time, take a look at crontab (http://unixgeeks.org/security/newbie/unix/cron-1.html) to be able to start jobs regularly.
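A crontab entry for the once-a-day case might look like this (the paths and time are examples, not from the question):

```
# m h dom mon dow  command
# Run the account-checking bot every day at 06:00 and append its output to a log.
0 6 * * * /usr/bin/php -f /var/www/bot/check_accounts.php >> /var/log/check_accounts.log 2>&1
```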
-- edit--
take a look at "php execute a background process" and "PHP: How to return information to a waiting script and continue processing".
Basically you want to start a background process, and you can do this by using exec(), fsockopen(), or a file_get_contents() call on your own script, probably in that order if you don't have access to exec or the socket functions.
Also take a look at http://us2.php.net/manual/en/function.session-write-close.php so the "background script" won't "block" the request, and at http://us2.php.net/manual/en/function.ignore-user-abort.php
Use a cron job to do it: http://www.cronjobs.org/
You can automatically call a script at any interval you like, indefinitely. Your hosting provider should support cron jobs if they are any good.
You should also consider putting a unique key at the end of the URL,
i.e. www.yoursite.com/cronjob.php?key=randomstring
and then only running the script if the key is correct, to prevent bots and other users from running the script when you don't want it run.
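The key check could be sketched like this at the top of cronjob.php; the function name is made up, and 'randomstring' is the example key from the answer, not a real secret:

```php
<?php
// Guard a cron-triggered page behind a shared key passed in the query string.
function isAuthorizedRequest(array $query, $expectedKey) {
    return isset($query['key'])
        && hash_equals($expectedKey, (string) $query['key']); // timing-safe compare (PHP 5.6+)
}

// Typical usage at the top of cronjob.php:
// if (!isAuthorizedRequest($_GET, 'randomstring')) { header('HTTP/1.0 403 Forbidden'); exit; }
```

On the PHP 5.2 mentioned in the first question, a plain === comparison would have to stand in for hash_equals().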
If you can't create a cron job, then create a page that does what you want and set up a scheduled task on another machine (maybe your PC?) that just goes out and hits that page at a certain time every day.
It's really a hack, but if you absolutely can't set up a cron job, it is an option.
As Evernoob and Quamis said, you want either a cron job (UNIX/Linux/Mac OS) or a scheduled task (MS Windows). You can have the PHP script run via the PHP command-line interface (CLI), in which case you invoke the PHP executable followed by your script name. Alternatively, you can use a tool like wget (available on all platforms) to invoke the PHP script as if someone had typed the URL into the location bar of a web browser.
A PHP script cannot normally be used the way you imagine here, because it is executed through Apache after a request from somewhere.
Even if you put while(1) in your script, Apache/PHP will automatically stop it.
Responding to your comment: yes, you'll need SSH access to do this, unless your web interface allows you to add cron jobs.
Maybe you can write a service which is executed by a program on another server and does the job.
If you have no access to the server, the easiest way would probably be to hit it through the browser, but that would require you or an external script to hit the URL at the same interval each day. You may also be able to set up a Selenium test suite that runs locally on a schedule and hits the page. I'm not 100% sure that's possible with Selenium, though; you may need some third-party apps to make it happen.
Something else you could try is PHP's Process Control Functions (link). These let you create a script that is a daemon and runs in the background. You may be able to use this to keep the script running on the server and fire off commands at programmed intervals. You will still need some way to get it running the first time (a browser request or the command line), though.
