First of all, I would like to point out that this is NOT a duplicate of this or this (or other related questions). I am asking how to get the output as well, and my question is mostly about that.
I need to run a bash script from PHP and get its output in PHP. However, doing:
echo shell_exec('script');
dies after 60 seconds due to Apache's FcgidIOTimeout. If I use the following to run it in the background, I won't be able to get the output:
shell_exec('<myscript> > /dev/null 2>/dev/null &');
My constraints:
I cannot modify the bash script;
The code and all the functionality need to be in a single file (unless I use a temp file);
If I were to write the output of shell_exec() to a temp file, I would also need a way to verify whether the process has finished;
Using a database is overkill that I cannot afford.
My current (and only) idea is to create a tmp file with PHP and write the output to it, then use top -c | grep <myscript> (behind a refresh button): if it matches, the script is still running. However, I suspect that would be neither practical nor efficient most of the time.
It is important that I use a temp file and not create a "permanent" file to write to.
Solution for a similar problem: A few years ago I had a similar problem. I had to upload a file to an FTP server and needed to signal to the FTP server that the file had been uploaded completely, so that the server could perform some tasks on it. My solution was to rename the file to something like *.completed after the upload finished. The process on the FTP server could then look for *.completed files only.
Solution adjusted to your problem: I'd suggest renaming your temp file after it has been generated by your bash script. This way you can independently find out whether the script finished successfully. The shell_exec() call could look like this:
shell_exec('<myscript> > tmp-file && mv tmp-file tmp-file.completed &');
Be aware that this only redirects the STDOUT channel into the tmp file. If you also want to redirect STDERR into the tmp file, try this:
shell_exec('<myscript> &> tmp-file && mv tmp-file tmp-file.completed &');
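Note also that &> is a bash extension; PHP's shell_exec() typically runs the command through /bin/sh, where the portable equivalent would be:

shell_exec('<myscript> > tmp-file 2>&1 && mv tmp-file tmp-file.completed &');

To tie this together, here is a minimal sketch of the PHP side, using tempnam() for the temp file. The script name <myscript> is the placeholder from above, and in a real multi-request flow you would persist the temp file path between requests (e.g. in the session):

<?php
// Start the script detached, writing into a temp file that gets renamed
// once the script has finished (names and paths here are illustrative).
$tmp = tempnam(sys_get_temp_dir(), 'myscript_');
shell_exec("<myscript> > $tmp 2>&1 && mv $tmp $tmp.completed &");

// Later (e.g. when the user hits a "refresh" button):
if (file_exists("$tmp.completed")) {
    echo file_get_contents("$tmp.completed"); // the script is done
    unlink("$tmp.completed");                 // clean up the temp file
} else {
    echo 'Still running...';
}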
Related
I have a client who uploads an XML file to a directory on my server using a third-party script over which I have no control. Upon upload, the file gets processed by a PHP script.
I have a cronjob which checks the directory every hour to see whether a file has been uploaded, but if the upload takes place at, say, 3:01pm, it won't be processed until 4:00pm. My host forbids running a cronjob more frequently than once per hour.
Is there a way to detect a file arriving and act on it immediately?
The short answer is yes, it is possible to detect a file upload and perform an immediate action. However, this is not what I would call a "simple" solution. You would want to set up some sort of daemon so that you can run PHP continually (or use a different language entirely to monitor for the file).
The simple solution is to lower your cron interval, and then just add some kind of anti-stacking mechanism (like a lockfile) to ensure your script doesn't execute if it's already running.
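For instance, a minimal lockfile sketch in PHP, using flock() with a non-blocking exclusive lock (the lock path is illustrative):

<?php
// Open (or create) the lock file and try to grab an exclusive,
// non-blocking lock; bail out if a previous cron run still holds it.
$lock = fopen('/tmp/process_upload.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // previous run still in progress, don't stack
}

// ... check the directory and process any uploaded files here ...

flock($lock, LOCK_UN); // release the lock for the next cron run
fclose($lock);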
Make a file checker.sh and run it in the background with ./checker.sh > /dev/null 2>&1 &, or run it in screen:
#!/bin/bash
COUNTER=0
while [ "$COUNTER" -lt 99999 ]; do
    if [ -f "/path/to/your/file" ]; then
        # The file is there: do your action here (run a script, a program, etc.),
        # then move or delete the file so it isn't picked up again next loop.
        # The ':' below is a no-op placeholder; replace it with your command.
        :
    fi
    sleep 5
    let COUNTER=COUNTER+1
done
If you are running Linux, you could look at the inotifywait command, which is part of the inotify-tools package. It can be set up to wait until a file is created in a particular directory, as follows:
#!/bin/bash
while true
do
    inotifywait -e create /home/myuser/upload_dir/ && php /home/myuser/scripts/process_file.php
done
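If the processing script should receive the name of the file that was just created, inotifywait can print it; here is a small variant of the loop above (same paths assumed; --format '%w%f' prints the watched directory plus the filename):

#!/bin/bash
while true
do
    # -q keeps inotifywait quiet; --format '%w%f' prints the new file's full path
    FILE=$(inotifywait -q -e create --format '%w%f' /home/myuser/upload_dir/)
    php /home/myuser/scripts/process_file.php "$FILE"
done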
I currently use nohup to run a long PHP script and redirect the live output to a file with this command:
nohup php long_file.php >logs 2>&1 &
so I just keep checking the logs file to see the results.
Now I want to do the exact same thing, but executing the above command from another PHP file.
I tried the above command with PHP's exec(), and the output redirection doesn't seem to be working.
I know I could retrieve the output with PHP and store it using a file-write function, but the output is too long; that's why I keep the script running in the background on the server.
A similar question: Shell_exec php with nohup, but it had no answer.
Any solution?
Please try with -q (quiet mode, which suppresses HTTP header output in CGI builds of PHP):
nohup php -q long_file.php >logs 2>&1 &
http://ubuntuforums.org/showthread.php?t=977332
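If the redirection still gets lost through exec(), here is a sketch of a pattern that usually works (paths are illustrative): hand the whole command line, including the redirections and the trailing &, to shell_exec() as one string, and append echo $! so you get the background PID back immediately:

<?php
// Launch long_file.php detached; all output goes to the log file, and
// 'echo $!' prints the PID of the backgrounded job.
$cmd = 'nohup php -q /path/to/long_file.php > /path/to/logs 2>&1 & echo $!';
$pid = (int) shell_exec($cmd);
echo "Started with PID $pid\n";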
Did you try passthru instead of exec?
You are redirecting STDOUT to a file by using >
This will truncate the file each time the script is run. If two scripts simultaneously redirect their output to the same file, the one started last will truncate the output from the first script.
If you want multiple concurrent scripts to properly append to the same log file, use >> instead, to avoid having the log file truncated.
The side effect, however, is that the log file is never truncated and keeps growing, so if it gets really large you may want to include it in a logrotate scheme.
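Applied to the command from the question, the append form would be:

nohup php long_file.php >> logs 2>&1 &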
I have a project developed in the PHP framework CodeIgniter. It includes a script that creates 300,000 PDFs of DB records for my client. The problem is that I have to run the code in chunks, because otherwise it gives errors: sometimes the server times out, sometimes PHP runs out of memory. I tried to fix all the errors by changing php.ini, but in vain; I still have to generate the PDFs in chunks, using LIMIT with an offset of 7000. If I increase the limit/offset, it fails with an error. It's very hard to babysit 300,000 records while generating PDFs for them. I just want to run it as one single chunk. Please give me a solution, I really need it. Thanks in advance.
I would recommend using a separate shell script for this, or if you insist on using a PHP script, treat it as you would a shell script. You can just "fire and forget" the script, so it runs in the background without affecting anything else and without eating resources from your web application. That would be something like:
shell_exec('yourscript.sh > /dev/null 2>/dev/null &');
or with a PHP script:
shell_exec('php yourscript.php > /dev/null 2>/dev/null &');
Note that in the above, I'm redirecting both stdout and stderr to /dev/null. If you want them, you'd rather redirect them somewhere else.
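For example, to keep a log of the run instead of discarding everything (the log path is illustrative):

shell_exec('php yourscript.php > /path/to/yourscript.log 2>&1 &');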
I have a large PHP application and I'm looking for a way to know which PHP script is running at a given moment. Something like running "top" on the Linux command line, but for PHP.
Are you trying to do so from within the PHP application, or outside of it? If you're inside the PHP code, calling debug_print_backtrace(); at that point will show you the 'tree' of PHP files that were included to get to that point.
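For example, dropping it into the suspect code path (function names here are just for illustration):

<?php
function process() {
    debug_print_backtrace(); // prints each frame: function, file and line
}

function handle_request() {
    process();
}

handle_request(); // backtrace shows process() was reached via handle_request()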
If you're outside the PHP script, you can only see the one process that called the original PHP script (index.php or whatnot), unless the application spawns parallel threads as part of its execution.
If you're looking for this information at the system level, e.g. all PHP files running under any Apache child process, or even PHP files in use by other apps, there is the lsof program (list open files), which by default spits out ALL open files on the system (executables, sockets, FIFOs, .so's, etc.). You can grep the output for '.php' to get a pretty complete picture of what's in use at that moment.
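For instance (run it as root if you need to see other users' processes):

lsof | grep '\.php'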
This old post shows a way you can wrap your calls to php scripts and get a PID for each process.
Does PHP have threading?
$cmd = 'nohup nice -n 10 /usr/bin/php -c /path/to/php.ini -f /path/to/php/file.php action=generate var1_id=23 var2_id=35 gen_id=535 > /path/to/log/file.log & echo $!';
$pid = shell_exec($cmd);
The trailing & echo $! backgrounds the PHP process and prints its PID, which shell_exec() then returns, so you can check on (or kill) the process later.
I have a PHP script I want to run every minute to see if there are draft news posts that need to be published. I was using "wget" for the cron command in cPanel, but I noticed (after a couple of days) that this was creating a blank file in the main directory every single time it ran. Is there something I can do to stop that from happening?
Thanks.
When wget runs, by default it saves the downloaded document to a file in the current directory, if I remember correctly.
You probably need to use a wget option to specify which file it should write its output to, and use /dev/null as the destination file (it's a "special file" that will "eat" everything written to it).
Judging from man wget, the -O or --output-document option would be a good candidate:
-O file
--output-document=file
The documents will not be written to the appropriate files, but all will be concatenated together and written to file.
so you might need to use something like this:
wget -O /dev/null http://www.example.com/your-script.php
And, btw, the output of scripts run from the crontab is often redirected to a logfile -- it can always help.
Something like this might help with that:
wget -O /dev/null http://www.example.com/your-script.php >> /YOUR_PATH/logfile.log
And you might also want to redirect the error output to another file (useful for debugging, the day something goes wrong):
wget -O /dev/null http://www.example.com/your-script.php >>/YOUR_PATH/log-output.log 2>>/YOUR_PATH/log-errors.log
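Alternatively, if you don't care about wget's own messages at all, the -q (quiet) flag suppresses them entirely, so no extra redirection is needed:

wget -q -O /dev/null http://www.example.com/your-script.php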