This entry in my Makefile happily crawls my PHP files and runs PHP's built-in lint on them:
lint:
	@find . -name "*.php" | grep -v "^./lib" | grep -v "^./vendor" | xargs -I{} php -l '{}' | grep -v "No syntax errors detected"
The grep -v suppresses all the "No syntax errors detected" messages that would otherwise be produced, while still letting any failure messages through.
The problem is that make dies when there are no syntax errors and continues when there are errors. This is because of the exit code from grep -v. It thinks it has succeeded when it finds something (an error message) and failed when it finds nothing (all files passed lint).
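Seen in isolation with made-up messages (foo.php and bar.php are just placeholders), the behaviour looks like this:
$ printf 'No syntax errors detected in foo.php\n' | grep -v "No syntax errors detected"; echo $?
1
$ printf 'PHP Parse error: syntax error in bar.php\n' | grep -v "No syntax errors detected"; echo $?
PHP Parse error: syntax error in bar.php
0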
I looked at negating the exit code of the final call to grep with !:
lint:
	@find . -name "*.php" | grep -v "^./lib" | grep -v "^./vendor" | xargs -I{} php -l '{}' | ! grep -v "No syntax errors detected"
but that gives me:
/bin/sh: -c: line 0: syntax error near unexpected token `!'
I can use ! at the command line fine, but in this context it doesn't work for some reason.
I'm curious how I negate an exit code within the context of a pipeline/xargs/grep/make. But mostly I want to solve my problem - open to any suggestions that result in a working lint target in my Makefile that does the right thing.
The return value of a pipeline is the one returned by its last command, and the shell only accepts ! at the very start of a pipeline, not in the middle of one (hence the syntax error). So you just need to invert the status of the full command line:
lint:
	@! find ... | grep -v "No syntax errors detected"
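Putting that together with the find pipeline from the question, the whole target might look something like this (a sketch, assuming GNU make with the default /bin/sh shell; the ! has to come before the entire pipeline):
lint:
	@! find . -name "*.php" | grep -v "^./lib" | grep -v "^./vendor" | xargs -I{} php -l '{}' | grep -v "No syntax errors detected"
With the negation in front of the whole pipeline, the recipe exits 0 when grep prints nothing (every file passed lint) and non-zero when an error message slips through, which is exactly what make needs.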
Related
I wrote a very convoluted, very hackjob PHP CLI script that receives and parses JSON of changeable structure, depth and content. For that reason, at the time, I found it easiest to do the parsing using PHP's shell_exec() and cat | jq | grep.
Sometimes, rarely, on certain input it gives me the message Error: writing output failed: Broken pipe, which is the last message I see in the CLI output before the script dies. However, even when it does that, the data is still parsed out correctly, for all the little good it does me.
I isolated the problematic piece of code to:
$jq1='cat '.$randfile.' | jq \'.\' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}"';
$jq1=trim(shell_exec($jq1));
And tried to debug it by seeing what it executes. The first line is the shell_exec argument, echoed before execution, the second line is the result of shell_exec.
Command: cat 5ca15f21.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: standalone
Command: cat 5ca59379.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: season
Error: writing output failed: Broken pipe
Command: cat 5ca7d271.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: extended
Command: cat 5ca7d7a8.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: season
(I have seen the error of my lazy ways and will be rewriting that whole section, but back then I was young and inexperienced and impatient. I'd still like to understand what's going wrong, where, and why.)
What would make it do that sometimes? The input is always jq's pretty-printed JSON, of varying structure.
Even if it does get the broken pipe message, the necessary value is still parsed out and stored in the variable. What causes it to die then? I would like to know for the future if there's a way to make PHP disregard the [non-fatal] error and go on executing.
Why does the shell command that produces the broken pipe error under shell_exec show no such error when invoked manually in bash? Where is the broken pipe and what makes it so broken?
Edit:
The cat command could be eliminated if this is a bash command to be executed. I would write it like this:
$jq1='grep -Po --color=none "\w{3,15}|\[" -m 1 file.txt | jq \'.\'';
$jq1=trim(shell_exec($jq1));
I combined your redundant grep commands into one, added the file directly to the grep command, and only piped the jq command.
I want to run a strict syntax check:
$ php -d error_reporting=32767 -l test.php
for all php files inside my project in Travis CI.
I tried to use find, but it always returns 0 even if the command run by the -exec flag fails:
$ find . -type f -name "*.php" -exec php -d error_reporting=32767 -l {} \;
PHP Parse error: syntax error, unexpected end of file, expecting ',' or ';' in ./test.php on line 3
$ echo $?
0
I solved this by checking whether the find + php command writes anything to STDERR.
before_script:
- '! find . -type f -name "*.php" -exec php -d error_reporting=32767 -l {} \; 2>&1 >&- | grep "^"'
How does this work? Adding:
2>&1 >&-
after any command first redirects STDERR to the current STDOUT (the pipe to grep) and then closes STDOUT, so only what the command writes to STDERR reaches the pipe.
Then we can just check if the output contains any lines with grep:
| grep "^"
Then, because grep returns 0 if it finds anything and 1 if it doesn't, we need to negate the end result with the exclamation mark at the start of the command. Otherwise the command would fail when everything is okay and succeed when things are failing.
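You can check the grep "^" and negation parts on their own with made-up input (a quick sketch, separate from the Travis config):
$ ! printf '' | grep "^"; echo $?
0
$ ! printf 'PHP Parse error: syntax error in test.php\n' | grep "^"; echo $?
PHP Parse error: syntax error in test.php
1
So the before_script line fails the build exactly when php -l writes something to STDERR, and passes otherwise.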
I am using Jenkins and doing PHPMD, PHPCS and PHP lint checks for pull requests. We basically have a separate branch for each feature, and it is supposed to be merged into the master branch again if it passes checks and tests.
We are checking all php files in the project with this command:
echo "php syntax checks are started"
find ./ -type f -name \*.php -exec php -l '{}' \; | grep -v "No syntax errors detected" && echo "PHP Syntax error(s) detected" && exit 1;
Using "php -l" for all php files takes around minute.
I was wondering if there is a way to speed up for this and came up with a solution. Please check my answer below.
Considering only few php files are going to change this takes only few seconds.
echo "php syntax checks for only changed files"
( ( (git diff --name-only origin/master $GIT_COMMIT ) | grep .php$ ) | xargs -n1 echo php -l | bash ) | grep -v "No syntax errors detected" && echo "PHP Syntax error(s) detected" && exit 1;
If you are using the Git plugin with Jenkins you can keep $GIT_COMMIT; otherwise replace it with a commit number or branch name.
This can be used for CSS and JS lints as well. Change the "php -l" part depending on what you need.
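Outside Jenkins the same idea works with a plain branch name in place of $GIT_COMMIT. A hedged sketch (the branch name my-feature is made up, and -r is a GNU xargs option that skips running php -l when no PHP files changed):
git diff --name-only origin/master my-feature | grep '\.php$' | xargs -r -n1 php -l | grep -v "No syntax errors detected" && echo "PHP Syntax error(s) detected" && exit 1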
I am extracting information from proftpd logs. I have to call this one-liner from a PHP script, but it no longer works from there.
This is the original line, which works:
(gunzip -c xferlog*.gz; cat xferlog?(*)!(.gz)) | grep 'host [0-9]\+ file a _ o r ftpuser' | sort -k 5n,5 -k 2M,2 -k 3n,3 -k 4,4 | tail -1 | cut -c 1-24
This is the call in PHP and the error I got when it executed:
$cmd = "(gunzip -c $logFile*.gz; cat $logFile?(*)!(.gz)) | grep '$host [0-9]\+ $file a _ o r $ftpUser' | sort -k 5n,5 -k 2M,2 -k 3n,3 -k 4,4 | tail -1 | cut -c 1-24";
exec($cmd);
sh: Syntax error: "(" unexpected (expecting ")")
I tried several bash scripts that would be called by PHP, but without success. I had errors like:
bash: command substitution: line 9: syntax error near unexpected token `('
bash: command substitution: line 9: `cat ${LOGS}?(*)!(.gz)'
or
bash: ./extract_date_in_xferlog.sh: line 8: syntax error near unexpected token `('
bash: ./extract_date_in_xferlog.sh: line 8: `(gunzip -c ${LOGS}*.gz; cat ${LOGS}?(*)!(.gz)) | grep "$HOST [0-9]\+ $FILE a _ o r $USER" | sort -k 5n,5 -k 2M,2 -k 3n,3 -k 4,4 | tail -1 | cut -c 1-24'
I am a bit confused, thank you for your help!
The weird wildcard uses extended globbing. You need to enable extglob either as part of your script (probably better) or in your Bash setup (probably where it was before, and then it broke when somebody changed it for unrelated reasons).
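For example, turned into a standalone script the one-liner from the question might look like this (a sketch; note also that PHP's exec() and shell_exec() hand the command line to /bin/sh, which, as the sh: Syntax error message suggests, does not understand extended globs here, so the script needs to be run with bash explicitly):
#!/bin/bash
# enable extended globbing so the ?(*)!(.gz) patterns are understood
shopt -s extglob
(gunzip -c xferlog*.gz; cat xferlog?(*)!(.gz)) | grep 'host [0-9]\+ file a _ o r ftpuser' | sort -k 5n,5 -k 2M,2 -k 3n,3 -k 4,4 | tail -1 | cut -c 1-24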
You're probably not escaping the quotes correctly in the script.
I suggest handling the shell command as a single-quoted string (assuming you don't want to embed PHP variables in the shell command), and making sure that all the single quotes in the command are escaped with \' so they don't prematurely terminate the PHP string.
Alternatively you could use a HEREDOC or NOWDOC style string to avoid escaping issues.
I know this is simple but I just can't figure it out.
I have a bunch of files output by "svn st" that I want to syntax-check with PHP on the command line.
This outputs the list of files: svn st | awk '{print $2}'
And this checks a php script: php -l somefile.php
But this, or variants of it, doesn't work: svn st | php -l '{print $2}'
Any ideas? Thanks!
Use xargs:
svn st | awk '{print $2}' | xargs -L 1 php -l
The xargs -L 1 command reads items from standard input, one per line, and runs the given command for each item separately. See the xargs(1) man page for more info.
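A convenient way to preview what xargs is about to run, before letting it loose on real files, is to put echo in front of the command (a sketch):
svn st | awk '{print $2}' | xargs -L 1 echo php -l
Each printed line is one php -l invocation; drop the echo once the output looks right.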