Run a PHP script every second using CLI

I have a dedicated server running CentOS with a Parallels Plesk panel. I need to run a PHP script every second to update my database. There is no alternative time-wise; it needs to be updated every second.
I can find my script using the URL http://www.somesite.com/phpfile.php?key=123.
Can the file be executed locally every second, like phpfile.php?
Update:
It has been a few months since I asked this question. I ended up using the following code:
#!/user/bin/php
<?php
$start = microtime(true);
set_time_limit(60);
for ($i = 0; $i < 59; ++$i) {
    doMyThings();
    time_sleep_until($start + $i + 1);
}
?>
My cronjob is set to every minute. I have been running this for some time now in a test environment, and it has worked great. It is really fast, and I see no increase in CPU or memory usage.
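For reference, the crontab entry for that setup might look like this (the path is an assumption; the script needs the shebang line and execute permission):

```
* * * * * /path/to/phpfile.php >/dev/null 2>&1
```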

You could actually do it in PHP. Write one program which will run for 59 seconds, doing your checks every second, and then terminates. Combine this with a cron job which runs that process every minute and hey presto.
One approach is this:
set_time_limit(60);
for ($i = 0; $i < 59; ++$i) {
    doMyThings();
    sleep(1);
}
The only thing you'd probably have to watch out for is the running time of your doMyThings() function. Even if that's a fraction of a second, over 60 iterations it could add up and cause problems. If you're running PHP >= 5.1 (or >= 5.3 on Windows), you can use time_sleep_until():
$start = microtime(true);
set_time_limit(60);
for ($i = 0; $i < 59; ++$i) {
    doMyThings();
    time_sleep_until($start + $i + 1);
}

Have you thought about using "watch"?
watch -n 1 /path/to/phpfile.php
Just start it once and it will keep going. This way it is immune to PHP crashing (not that it happens, but you never know). You can even add this to inittab to make it completely bullet-proof.
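If you do go the inittab route, the respawn entry might look like this (SysV init only; the id, runlevels, and path are assumptions):

```
w1:2345:respawn:/usr/bin/watch -n 1 /path/to/phpfile.php
```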

Why not run a cron job and, in the PHP file, loop 60 times with a short sleep? That is how I have overcome this to run a PHP script 5 times a minute.
To set up your file to be run as a script, add the path to your PHP binary on the first line, as you would for a Perl script:
#!/usr/bin/php
<?php
$i = 0;
while ($i < 60) {
    sleep(1);
    // do stuff
    $i++;
}
?>

This is a simple upgraded version of nickf's second solution, which allows you to specify the desired interval in seconds between each execution.
$duration = 60; // Duration of the loop in seconds
$sleep = 5;     // Sleep between each execution (including the work itself)
for ($i = 0; $i < floor($duration / $sleep); ++$i) {
    $start = microtime(true);
    // Do your stuff here
    time_sleep_until($start + $sleep);
}
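As a quick sanity check of the pattern, here is the same loop with small values and a counter standing in for the real work (the numbers are only for illustration):

```php
<?php
$duration = 2; // Duration of the loop in seconds
$sleep = 1;    // Sleep between each execution
$runs = 0;
$begin = microtime(true);
for ($i = 0; $i < floor($duration / $sleep); ++$i) {
    $start = microtime(true);
    $runs++; // stand-in for the real work
    time_sleep_until($start + $sleep);
}
$elapsed = microtime(true) - $begin;
```

With these values the loop body runs twice and the whole thing takes about two seconds, regardless of how long the work itself took (as long as it stays under $sleep).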

I noticed that the OP edited the question to include his solution. That solution did not work on my box (the path to PHP is incorrect, and the PHP syntax is not correct).
This version worked (save it as whatever.sh and chmod +x whatever.sh so it can execute):
#!/usr/bin/php
<?php
$start = microtime(true);
set_time_limit(60);
for ($i = 0; $i < 59; ++$i) {
    echo $i;
    time_sleep_until($start + $i + 1);
}
?>

You can run your infinite-loop script with the nohup command on your server; it keeps working even after you log out. Only a restart or a physical shutdown will kill the process. Don't forget to add sleep(1) in your PHP script.
nohup php /path/to/you/script.php
In case you don't need to use the console while it's working, it'll write its output to the nohup.out file in your working directory (use the pwd command to find it).

Related

Performance degrades when php getmxrr() is called from inside shell for loop

I noticed a big performance difference when I tried to fetch the MX records of gmail.com 100,000 times using PHP and a shell script.
The PHP script takes around 1.5 minutes.
<?php
$time = time();
for ($i = 1; $i <= 100000; $i++) {
    getmxrr('gmail.com', $hosts, $mxweights);
    unset($hosts, $mxweights);
}
$runtime = time() - $time;
echo "Time Taken : $runtime Sec.";
?>
But the same thing done inside a shell for loop is almost 10 times slower:
time for i in {1..100000}; do (php -r 'getmxrr("gmail.com", $mxhosts, $mxweight);');done
I am curious to know why the shell script takes so much more time to complete exactly the same thing that the PHP script does very fast.

executing script before 60 seconds

I am creating a simple script to test that I can echo inside my while loop before it reaches 60 seconds, but it will not echo inside my loop. I don't know if the loop body is really being executed. Then my browser crashes.
$timelimit = 60; //seconds
set_time_limit($timelimit);
$start_time = time(); //set startup time;
while (((time() - $start_time) < $timelimit) || !$timelimit) {
    echo "executing..<br/>";
}
Thank you in advance.
This is a very tight loop. It will run very fast and will create a very large output, which will eventually kill the browser (it will have hundreds of thousands of lines). You may add some delay to your loop:
while (((time() - $start_time) < $timelimit) || !$timelimit) {
    sleep(1); // pause for 1 second
    echo "executing..<br/>";
}
In this case the output will be only 60 lines, and the browser should render it after a minute of waiting.
CPU execution is very fast (on the order of nanoseconds per operation). Your loop runs for 60 seconds, so hundreds of millions of iterations may occur. If you try to print something on every one of them, your browser will be killed.
If you're expecting to see the output as the script generates it, you'll want to add flush(); after your echo. However, if I recall correctly, PHP may still wait to send the output until it has a certain number of bytes (1024, maybe?).
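A minimal sketch of that idea (the $sent variable is only there for illustration; in a real script you would just echo and flush):

```php
<?php
$sent = '';
for ($i = 0; $i < 3; $i++) {
    $chunk = "executing..<br/>";
    echo $chunk;     // goes into PHP's output buffer
    $sent .= $chunk; // track what was written, for illustration
    flush();         // ask PHP and the SAPI to send it now;
                     // call ob_flush() first if an output buffer is active
}
```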

How do I run a batch file from PHP that loops x times (PHP LOOP)?

My nearish pseudo code: -
for ($i = 1; $i < 5; $i++) {
    exec("C:\wamp\www\googletodaybatch.bat");
    echo $i;
}
The bat file contains the following: -
START C:\wamp\bin\php\php5.5.12\php.exe -f "C:\wamp\www\googletoday-task.php"
I want to use this approach instead of including the file, as I need it to start 5 instances of the task.
Each instance takes 2 minutes, so I don't want it to wait. It updates many databases all at the same time.
Do I use "exec" or "shell_exec"?
Summary: -
Instead of having many lines in my bat file that are all the same, I want one line running many times using a loop from PHP.
Help please!
The code above seems to just keep loading and never stops.
Use just a batch file to start the five tasks
for /l %%a in (1 1 5) do (
    START "" "c:\wamp\bin\php\php5.5.12\php.exe" -f "C:\wamp\www\googletoday-task.php"
)
And here you will find the exec vs shell_exec information.
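In short (a sketch, assuming a POSIX shell is available; the question itself is on Windows): exec() returns only the last line of output and can fill an array line by line, while shell_exec() returns the whole output as a single string.

```php
<?php
$lines = [];
$last  = exec('printf "first\nsecond"', $lines);   // last line only; $lines gets each line
$whole = shell_exec('printf "first\nsecond"');     // the complete output as one string
```

Note that both of them block until the command finishes, which is why the batch file above uses START to detach the five PHP instances.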

Show an indicator on CLI while query runs

I have a PHP script running via CLI that's working well, but it runs a couple long queries (2-5 minutes) that would ideally give you some idea that something is still happening. When iterating through results, I do have a function running that updates the progress, but when PHP is waiting for a query to return, silence ensues.
I don't need to know anything about when the query will complete, but some sort of indication on the CLI that it's doing something would be a huge gain (a blinking ..., or something). Possible?
I've found that using carriage returns \r without newlines to be extremely helpful. They reset the output to the beginning of the line, but do not move down a line, allowing you to overwrite the current text.
Please note that you'll need to pad the line to the full length, otherwise previous characters will still linger. For example:
$iteration = 0;
while (/* wait condition */) {
    printf("Process still running%-5s\r", str_repeat('.', $iteration % 5));
    sleep(1);
    $iteration++;
}
echo "\n";
echo "Task completed!";
If you're using a for loop for processing, something like this would be much more useful:
// Display progress every X iterations
$update_interval = 1000000;
for ($i = 0; $i < $massive_number; $i++) {
    // Do processing
    if ($i % $update_interval == 0) {
        printf("Progress: %.2f%%\r", (100 * $i / $massive_number));
    }
}

php loop interactions taking too long to complete

The following loop takes 13 seconds to run on a Windows i7 @ 3.4GHz with 16GB RAM.
The script is running from the command line - php loop.php
$start = microtime(true);
for ($i = 0; $i <= 150000; $i++) {
    $running_time = date('i:s', microtime(true) - $start);
    echo "$i - $running_time\n";
}
If I take out the 'echo', it takes less than a second, why?
This has to do with the lack of buffering of your output. If you run this in a Windows console, you'll find that the console itself is your bottleneck.
Hold the scroll bar and watch your program hang until you release it again, to prove this.
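One way around it (a sketch): accumulate the output in memory and write it once, so the console renders a single large write instead of 150,001 small ones.

```php
<?php
$start = microtime(true);
$buf = '';
for ($i = 0; $i <= 150000; $i++) {
    $buf .= $i . "\n";        // accumulate in memory instead of echoing each line
}
fwrite(STDOUT, $buf);         // one big write at the end
$elapsed = microtime(true) - $start;
```

On the hardware described above, this should finish in well under a second, since the per-line console rendering is taken out of the loop.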
