How to run a PHP file in a shell script infinite times - php

I'm new to Linux systems and I'm trying to make a PHP script run an infinite number of times. Note that I'm using Debian 7.
I'm using screen to open a window, and so far so good: the worker.php file is already running successfully, and now I need a shell script that runs the PHP script infinitely.
So I've come up with this:
#!/bin/sh
for (( ; ; ))
do
/usr/bin/php worker.php
sleep 1
done
The problem is, when trying to run ./worker.sh in the screen, I get this error:
bash: ./worker.sh: /bin/sh^M: bad interpreter: No such file or directory
So I stripped out the for and replaced it with a simple echo, which results in the same error. That's why I wrote this question: I don't know what's wrong. Both sh and bash exist on the server. I wondered if the shebang was wrong, but the automysqlbackup script starts with the same shebang.
Do you have any idea what is wrong? I'm just a newbie and don't really know much.
If you're wondering why I'm running a file every second: the file serves as a command processor for a queue in a game. Running it with cron every minute is too slow, and MySQL triggers don't fit my needs, so I'm forced to do it this way.
Regards.

From the message it looks like you have a <cr><lf> at the end of your shebang line (the #! one). Since <cr><lf> isn't a valid line ending on Debian (it is on Windows and some other varieties of Unix), the carriage return is taken as part of the interpreter's filename, so the OS can't find the program to run.
Fixing it would require something like this:
tr -d '\015' < worker.sh > worker_nocr.fixed
Also, as you're using bash as your shell, you might wish to change the shebang to use bash as well; otherwise things that work fine when you type them at the command prompt might not work in the script.
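For reference, a minimal worker.sh that sidesteps both issues might look like the sketch below (assuming the file is saved with Unix line endings and that /usr/bin/php and worker.php are the paths from the question). Note that while true is portable, whereas the C-style for (( ; ; )) loop is a bashism and fails under /bin/sh on Debian:
#!/bin/bash
# Run the worker once per second, forever.
while true
do
    /usr/bin/php worker.php
    sleep 1
done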

Related

Logging script commands and errors in linux

2 part question,
I am running a script that executes a second script.
I have it set up this way because I read that if I put a script in the /etc/init.d directory it will run at startup. (True or false?)
I have tried adding >> LoopTriggerLogging.log at the end of each line, but nothing comes out in the log file.
So I have a first script as follows:
#!/bin/bash
/var/www/Dev/LoopTrigger.sh
exit
This triggers the following script to run
#!/bin/bash
while true; do
# do some work
php /var/www/Dev/FindNearestDriverAndSendPush.php
php /var/www/Dev/ProcessCustomerPayment.php
php /var/www/Dev/ProcessDriversPayment.php
# write to LoopTriggerLogging.log
sleep 2 # sleep and repeat
done
What I would like is to have the commands logged along with any errors. I have tried to read a little on this but get lost in the answers and what they are trying to tell the user. I am still new at this and learning, so kindly define any commands or options. I am open to a best-practice scenario.
Also, will putting it in the /etc/init.d directory tell the script to run at startup?
Is there a way to run this script without it occupying the command line, since it's an endless script?
My ultimate goal is to get the 3 php files to execute every 2 seconds with some sort of logging.
I did some reading on cron, but it seems it is not meant for this type of use case.
I've also seen this:
exec > logfile 2>&1 (don't know what this does)
set -x makes bash print every command before executing it
FOO=BAR (don't know what this means)
echo $FOO (don't know what this means)
if I put a script in the /etc/init.d directory it will run it at start up. (true or false)
True. If you put a script in /etc/init.d (and register it with update-rc.d), it will run at every startup.
My ultimate goal is to get the 3 php files to execute every 2 seconds
You are using the correct approach: it runs roughly every 2 seconds, depending on how long your PHP scripts take. Crontab runs at most once a minute, so it would not be an option here.
I have tried adding >> LoopTriggerLogging.log at the end of each line but nothing comes out in the log file
You can use /var/www/Dev/LoopTrigger.sh >> LoopTriggerLogging.log in your first script so that whenever it runs it will:
Create the file the first time and append to it on subsequent runs.
Capture all the logs of the second script into the file.
Note: as logs keep appending to a single file, it will become very large at some point, so make sure you handle that (for example with log rotation).
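As a rough sketch of how the second script could capture both output and errors in one place (the paths and log file name are the ones from the question; the timestamp line is just an illustration):
#!/bin/bash
# Send everything this script prints (stdout and stderr) to the log file.
exec >> /var/www/Dev/LoopTriggerLogging.log 2>&1
while true; do
    echo "=== run started at $(date) ==="
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    sleep 2 # sleep and repeat
done
Here exec >> file 2>&1 redirects the rest of the script's stdout and stderr to the file, which is what the exec > logfile 2>&1 line from the question does (with > truncating instead of appending).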

shell_exec() not working when given sh file or command

I'm trying to run a shell script (starting a Raspberry Pi camera, following the second answer from this question: raspivid -o - -t 0 -hf -w 640 -h 360 -fps 25 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554}' :demux=h264), and I'm executing it from a PHP file. When I try to run the script, though, it doesn't work.
Running the script normally, either by putting it in an .sh file or just entering the command into the shell, works to turn the camera on. When I put shell_exec('pwd'); in the PHP file I get the directory back, but when I try to run the camera script either by shell_exec('[script here]'); or shell_exec('sh path/to/script.sh'); (or system([script]), or putting the script in backticks), nothing happens.
I saw that some people had the same problem, but the reason was that PHP could not access a specific directory, which is not the issue here since I can run a test script the same way. It seems as if this might be a problem with the specific script I'm running, but I can't imagine why, if it runs fine outside of shell_exec(...);
Does anyone have any insight as to what might be wrong? Many thanks!

Running PHP script from Perl or sh error: not found

I'm currently running a cron job that loads a PHP script.
I keep getting an error: sh: 1: /usr/bin/php: not found.
I tried it two other ways, but to no avail.
In a Perl script, I tried:
my $x = qx('/usr/bin/php /home/script.here');
This doesn't generate anything and sends an error message to my mail.
But if I run the line
/usr/bin/php /home/script.here
on my shell, it works.
I also created a script 1.sh containing just this:
#!/usr/bin/php -v
I run the script with ./1.sh and it shows the result. But as soon as I try to call it via cron or /bin/sh 1.sh, it fails and can't find the PHP path, even though it was explicitly stated.
Am I missing anything?
I also tried this with php5, but got the same error.
The problem is the single quotes inside the qx() operator. Remove them:
my $x = qx(/usr/bin/php /home/script.here);
As long as they are there, the shell tries to find a command named "script.here" in the directory "/usr/bin/php /home" (yes, with the space in the directory name).
Totally forgot about this question. I found a solution.
I just added SHELL=/bin/bash to my crontab and the scripts worked.
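For context, a crontab along these lines reflects that fix (the schedule, PATH, and log file are placeholders; /home/script.here is the path from the question):
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin
# Run the PHP script every minute and keep its output for debugging.
* * * * * /usr/bin/php /home/script.here >> /tmp/script.log 2>&1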

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check whether a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah-size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
}
// If the size matches, die
else {
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but calling the command-line binary from Apache works with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows box is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this by shell exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
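A minimal sketch of that idea, using a directory of job files as the FIFO queue (every name here — /var/spool/dumpqueue, enqueue.sh, run-queue.sh, dump-script.php — is hypothetical). The trigger page would run enqueue.sh instead of the dump:
#!/bin/bash
# enqueue.sh: record that a dump was requested, then return immediately.
QUEUE_DIR=/var/spool/dumpqueue
mkdir -p "$QUEUE_DIR"
touch "$QUEUE_DIR/job-$(date +%s%N)"   # one file per job, named by timestamp
And a cron job would run run-queue.sh at regular intervals, processing at most one dump per run:
#!/bin/bash
# run-queue.sh: pick the oldest job, run the dump, then remove the job file.
QUEUE_DIR=/var/spool/dumpqueue
oldest=$(ls -1tr "$QUEUE_DIR" 2>/dev/null | head -n 1)
if [ -n "$oldest" ]; then
    php dump-script.php
    rm -f "$QUEUE_DIR/$oldest"
fi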
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
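If you go the nohup route, note that nohup alone does not return immediately; a sketch like this (redirecting output and backgrounding with &) keeps the script running after the shell exits:
nohup php long-running-script.php > /dev/null 2>&1 &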

PHP shell_exec() issue

I am having an issue using the PHP function shell_exec().
I have an application which I can run from the Linux command line perfectly fine. The application takes several hours to run, so I am trying to spawn a new instance using shell_exec() to manage it better. However, when I run the exact same command (which works on the command line) through shell_exec(), it returns an empty string, and it doesn't look like any new processes were started. Plus, it completes almost instantly. shell_exec() is supposed to wait until the command has finished, correct?
I have also tried variations of exec() with the same outcome.
Does anyone have any idea what could be going on here?
There are no symbolic links or anything funky in the command: just the path to the application and a few command-line parameters.
Something may be off with your environment.
Compare the output of env from the CLI (command-line interface) and from the PHP script.
Also, which shell interpreter is being used?
And do the script and the CLI application run as the same user?
Also check the safe_mode option.
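A quick way to make that comparison (a sketch; php -r runs the CLI binary, so from a web page you would instead echo shell_exec('env | sort')):
# Environment seen by your interactive shell:
env | sort > /tmp/env-cli.txt
# Environment seen by a command launched from PHP:
php -r 'echo shell_exec("env | sort");' > /tmp/env-php.txt
diff /tmp/env-cli.txt /tmp/env-php.txt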
Make sure the user Apache is running as (probably www-data) has access to the files and that they are executable (check with ls -la). A simple chmod 777 [filename] would fix that.
By default PHP will timeout after 30 sec. You can disable the limit like this:
<?php
set_time_limit(0);
?>
Edit:
Also consider this: http://www.rabbitmq.com/
