I need to run five different Scrapy spiders at the same time, so I defined a custom Scrapy command,
scrapy crawlall
to call these spiders. It works successfully when I run it on the command line on its own.
However, it fails when I use PHP's shell_exec to call the same command. The code in the PHP file looks like:
$cmd = 'scrapy crawlall';
$results = shell_exec($cmd);
echo $results;
The web page echoes nothing and returns immediately, without waiting for Scrapy (or perhaps the command is never invoked at all; I can't tell). So how can I call this command, or otherwise run these five spiders at the same time, from a PHP file? Thanks!
I have solved the problem. If you hit the same situation, try removing all comments from your command's Python file. I don't know the reason, but after doing that it works.
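One way to diagnose the "echoes nothing" symptom (a general shell_exec fact, not specific to Scrapy): shell_exec() returns only stdout, so if scrapy crawlall dies with an error on stderr, PHP sees an empty string. Folding stderr into stdout, e.g. shell_exec('scrapy crawlall 2>&1'), makes the error visible. A minimal sketch of the difference:

```shell
# shell_exec() captures only stdout; anything the command writes to
# stderr is silently dropped. Appending 2>&1 folds stderr into the
# captured string, so a failing command at least reports its error.
lost=$(sh -c 'echo oops >&2' 2>/dev/null)   # stderr discarded: empty
kept=$(sh -c 'echo oops >&2' 2>&1)          # stderr captured: "oops"
echo "without: [$lost]  with: [$kept]"      # → without: []  with: [oops]
```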
Related
I have a PHP script that is run via the CLI. In turn, I want that script to call a bash script, but preferably broken up into individual requests so I can watch the actions as they happen.
I can set environment variables, and they seem to survive between shell_exec() calls. But even when I have a source file like:
source ./bashes/functions.sh
And even though the sourced file uses "export -f function-name" to export its functions before the next line executes, the next line does not see the functions.
$file = file('./bashes/bash_testing.sh', FILE_SKIP_EMPTY_LINES);
// we want to watch this in real time, so run each line separately
foreach ($file as $command) {
    $output->writeln(shell_exec($command));
}
The function $output->writeln is just a helper that echoes the returned result. The error I get is:
sh: now: command not found
now is defined as a function in the included bash_testing.sh shell script.
Anyone know how I can resolve this issue?
Here is the source to the ./bashes/functions.sh file:
function now {
date -u +%T
}
export -fx now
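A likely explanation (general shell behaviour, not specific to this code): every shell_exec() call launches a brand-new shell, so functions exported in one call no longer exist by the next. A sketch of the effect, using the now function from above:

```shell
# Each shell_exec() starts a fresh shell; exported functions and
# variables die when that shell exits.
bash -c 'now() { date -u +%T; }; export -f now'    # defined... then the shell exits
bash -c 'type now >/dev/null 2>&1 || echo "now: gone"'  # a new shell never saw it
# Workaround: define (or source) and call in the SAME invocation:
bash -c 'now() { date -u +%T; }; now'
```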
There is a way to maintain a single bash shell, execute commands and handle the return. I recently published a project that allows PHP to obtain and interact with a real Bash shell. Get it here: https://github.com/merlinthemagic/MTS
I would suggest not triggering a bash script but rather triggering the individual commands. That way you can handle each return value and don't have to build exception handling in bash.
After downloading you would simply use the following code:
//get a real bash shell.
$shell = \MTS\Factories::getDevices()->getLocalHost()->getShell('bash', false);
$return1 = $shell->exeCmd($command1);
//logic to handle the return
$return2 = $shell->exeCmd($command2);
//logic to handle the return
// ... etc
I'm trying to compile an executable via PHP with msbuild, which builds my C# source. The majority of the script relies on the executable being created, so it must wait for msbuild to finish compiling.
If I don't put in any sort of wait loop, it compiles fine and the executable is created, but the rest of the script executes too fast and the end result isn't correct.
So at the moment I'm using this:
exec('C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\Myprogram.sln /p:Configuration=Release');
while (!file_exists('C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\bin\Release\Myprogram.exe')) sleep(1);
However, in this scenario it's almost as if the exec command never runs at all. It gets stuck in an infinite loop and eventually times out, so the exe never gets compiled.
Any suggestions on the proper way to go about this?
Try running it as follows:
$output = array();
$cmd = 'C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe C:\Users\Administrator\AppData\Roaming\Compile\Myprogram\Myprogram.sln /p:Configuration=Release && exit';
exec($cmd, $output);
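An alternative worth noting (standard exec() semantics, not tested against MSBuild specifically): exec() blocks until the command exits, and its optional third parameter receives the exit code, so you can check exec($cmd, $output, $status) instead of polling for the output file. The shell-level equivalent of that status check:

```shell
# PHP's exec($cmd, $output, $status) blocks until $cmd exits and puts
# its exit code in $status - the same value $? holds in a shell.
sh -c 'exit 3' || status=$?                   # simulate a failing build
if [ "${status:-0}" -ne 0 ]; then
    echo "build failed with status $status"   # → build failed with status 3
fi
```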
This is really important, as I could not find what I was looking for on Google.
How do I know when the application (or is it more appropriate to call it a task?) executed by a command line is done? How does PHP know the task of copying several files is done if I do this:
exec("cp -R /test/ /var/test/test");
Does the PHP script continue to the next line of code while the command is still running in the background making copies? Or does the PHP script wait until the copy is finished? And how does a command-line application notify the script when it's done (if it does)? There must be some kind of interaction going on.
PHP's exec returns a string, so yes: your web page will block until the command is done.
For example, this simple code:
<?php
echo exec("sleep 5; echo HI;");
?>
When executed, the page will appear to load for 5 seconds, and then it will display:
HI
How does the PHP know if the task of copying several files are done if I do like this?
PHP does not know; it simply runs the command, does not care whether it worked, and returns the string the command produced. That's why it is better to use PHP's copy() function, which returns TRUE/FALSE depending on success. Or, if you are going the exec route, create a bash/sh script that prints 0 (failure) or 1 (success) so you can tell whether the command succeeded. Then in PHP:
<?php
$answer = exec("yourScript folder folder2");
if ($answer == "1") {
    // Plan A worked
} else {
    // Plan A failed; try Plan B
}
?>
It waits until the exec call returns, whatever it returns.
However, exec may return even though the command it started has not yet finished. That can happen if you detach from control of the process, for example by explicitly putting a "&" at the end of the command.
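A quick sketch of that distinction: a detached command hands control back immediately, while a plain invocation blocks for its full duration.

```shell
# Without '&', the caller waits out the full sleep; with '&' and the
# output redirected, the command is detached and control returns at once.
start=$(date +%s)
sh -c 'sleep 2 > /dev/null 2>&1 &'   # returns immediately
end=$(date +%s)
echo "elapsed: $((end - start))s"    # ~0s, not 2s
```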
I want to run a PHP script every 15 minutes using either CURL or WGET.
This PHP file is in a local folder:
/home/x/cron.php
How would I run this using CURL/WGET?
It doesn't work when I try to run
curl /home/x/cron.php
Thank you!
curl and wget are better suited to URLs like http://myhost.com/cron.php
When the script is not served over HTTP, you are better off running it with the PHP CLI:
Ex:
php -q cron.php
Just do something like this:
/usr/bin/php /home/x/cron.php
cURL/wget is for HTTP actions. If your PHP script is on the same system, you don't want to load it over HTTP. (You can, of course, if it is accessible over HTTP, but I don't think that is what you want.) Just call it directly.
Alternatively, you can set the execute permission on your script and throw in a shebang line for PHP:
#!/usr/bin/php
Then, just put your PHP script in crontab directly.
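For the every-15-minutes schedule from the question, the crontab entry would look like this (using the /usr/bin/php path mentioned above; adjust to your system):

```
*/15 * * * * /usr/bin/php /home/x/cron.php
```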
If you're using cURL or wget, you'll need to pass the path in as a URL. If you want to run the PHP script on the command line, you'll need to use the PHP CLI.
I am trying to use PHP's exec call to invoke a script that takes several seconds (it makes web service calls to a third party). After much struggling, I reduced this to the classic hello world example. The calling script looks like:
exec('/usr/bin/php /home/quote2bi/tmp/helloworld.php > /tmp/execoutput.txt 2>&1 &');
When I run this, the output file execoutput.txt contains a copy of the invoking script's page, not hello world as I expected.
Why can't I get this PHP script to execute using exec? Note that when I change the command to something like ls -l, the output is a directory listing as expected. By the way, in case it matters, I did chmod the called script to 755...
Update - I moved the exec call to the end of the calling script, and at least now I don't see the calling script executed in the output. Thanks to the posters; I will try some of these ideas.
Help!
Thanks
Steve
I had this issue too, and it turns out to be a bug in PHP (#11430). The fix is to use php-cli when calling another PHP script from within a PHP script. So you can still use exec, but rather than php, use php-cli when calling it in the browser:
exec("php-cli somescript.php");
This worked for me.
exec takes the command you give it and sends its output to your destination. If you have the shebang line in your PHP script, you shouldn't need to prefix the command with the PHP interpreter binary.
If you just want the script's output, try:
exec('/home/quote2bi/tmp/helloworld.php > /tmp/execoutput.txt 2>&1 &')
However, if you do not want the errors to end up in the file, redirect STDERR before redirecting the output to the file, like so:
exec('/home/quote2bi/tmp/helloworld.php 2> /dev/null > /tmp/execoutput.txt')
The above should write only "Hello World" to execoutput.
Edit:
Interesting that you are getting this behaviour. You stated that the "ls" command worked. Try making an alias for the PHP call and redirecting it to a file, like so:
alias pexec='php /home/quote2bi/tmp/helloworld.php'
then
exec('pexec > /tmp/execoutput.txt 2>&1 &')
It seems to be a problem with the way exec handles the command, as opposed to the shell itself.
-John
The problem is with PHP itself: it treats everything as $argv in the script and doesn't redirect the output to a file or to /dev/null.
I faced the same problem some time ago. What I did was create a runscript.php in /opt/php-bin and then have that script run whatever should be run. Something like this:
$script = $argv[1];
$params = implode(' ', array_slice($argv, 2));
$cmd = "{$script} {$params} > /dev/null &";
$output = array();
$return = 0;
exec("php {$cmd}", $output, $return);
exit((int)$return);
And then you call it using:
exec('/opt/php-bin/runscript.php /path/to/your/script.php arg1 arg2')
It's the only way I managed to get this working.
To avoid PHP's problems in this area, why not put all of this inside a shell script? PHP can then execute the shell script, which handles all the redirections internally.
If you need to dynamically change things, then why not write the shell script and then execute it (and of course, clean up afterwards)?
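A minimal sketch of that idea (the wrapper path and name are made up for illustration): the shell script owns the redirection and backgrounding, so the PHP side reduces to a single exec() call such as exec('/tmp/runscript.sh /path/to/script.php arg1').

```shell
# Write a wrapper that handles redirection internally, so PHP's exec()
# never has to parse redirection operators itself.
cat > /tmp/runscript.sh <<'EOF'
#!/bin/sh
# usage: runscript.sh script.php [args...]
php "$@" > /dev/null 2>&1 &
EOF
chmod +x /tmp/runscript.sh
echo "wrapper installed at /tmp/runscript.sh"
```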
If you are simply running a PHP script, one possible way to execute the entire file is include(), which runs the PHP file and outputs any results. You cannot direct the output to a text file, but it should appear in the browser window. If your Hello World PHP script looks like
<?php echo "Hello World!"; ?>
then it will print that in the browser. So your second script would look like
<?php include("helloWorld.php"); echo " PHP ROCKS";?>
resulting in a page that looks like
Hello world! PHP ROCKS
This runs as if you had launched the script from a browser.
I came across this while working on a project on a Linux platform:
exec('wget http://<url to the php script>');
Hope this helps!!