linux - grep based on a blacklist -- without procedural code?


It's a well-known task, simply described:

Given a text file foo.txt and a blacklist file of exclusion strings, one per line, produce foo_filtered.txt containing only those lines of foo.txt that contain no exclusion string.

A common application is filtering compiler warnings out of a build log, ignoring warnings about files that are not yours. Here foo.txt is the warnings file (itself a filtered build log), and the blacklist file excluded_filenames.txt contains file names, one per line.

I know how it's done in procedural languages like Perl or awk, and I've done it with combinations of Linux commands such as cut, comm, and sort.

But I feel the answer should be close to xargs, and I can't see the last step.

I know that if excluded_filenames.txt has only one file name in it, then

grep -v "`cat excluded_filenames.txt`" foo.txt 

will do it.

And I know I can emit the file names one per line with

xargs -L1 -a excluded_filenames.txt 

So how do I combine the two into a single solution, without explicit loops in a procedural language?

I'm looking for a simple and elegant solution.

You should use the -f option (optionally combined with -F, or equivalently fgrep, to match the blacklist entries as fixed strings):

grep -vf excluded_filenames.txt foo.txt 
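A minimal sketch of that command in action, using made-up sample data (the warning lines and file names below are illustrative, not from the question):

```shell
# Hypothetical build-log contents: one warning in a vendored file, one in ours.
printf 'warning: vendor/lib.c:10 unused var\nwarning: src/main.c:5 shadow\n' > foo.txt
printf 'vendor/lib.c\n' > excluded_filenames.txt

# -v inverts the match, -f reads one pattern per line from the blacklist,
# so only lines containing none of the blacklisted strings survive.
grep -vf excluded_filenames.txt foo.txt > foo_filtered.txt
cat foo_filtered.txt
# -> warning: src/main.c:5 shadow
```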

You can also use command substitution to more directly answer what you asked:

grep -v "`cat excluded_filenames.txt`" foo.txt 

from man grep

-f file, --file=file
        Obtain patterns from file, one per line. The empty file contains zero patterns, and therefore matches nothing.

-F, --fixed-strings
        Interpret pattern as a list of fixed strings, separated by newlines, any of which is to be matched.
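One practical point for the file-name use case: dots in file names are regex metacharacters, so a blacklist entry like "ab.c" would also exclude lines mentioning "abXc". Adding -F makes grep treat each blacklist line as a literal string. A small sketch with hypothetical data:

```shell
# Two log lines that differ only in the character matched by the regex dot.
printf 'building ab.c\nbuilding abXc\n' > foo.txt
printf 'ab.c\n' > excluded_filenames.txt

# Regex matching: the pattern "ab.c" matches both lines, so both are dropped.
grep -vf excluded_filenames.txt foo.txt
# Fixed-string matching: only the literal "ab.c" line is dropped.
grep -vFf excluded_filenames.txt foo.txt
# -> building abXc
```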
