I am trying to find all files that contain a specified string and then clear out the contents of those files. So a file with "stringtofind" in it would subsequently be an empty file.
To do this I have tried the following, but I get errors:
With the following I get the error: find: missing argument to `-exec'
find / -maxdepth 1 -xdev -type f -exec grep -i "stringtofind" -l -exec sed -i '/./d'
With the following I get the error:
grep: /nis
/tset.any
/test2.file
/test.txt
: No such file or directory
sed: no input files
find / -maxdepth 1 -xdev -type f -print | xargs -0 grep -i "stringtofind" -l | xargs sed -i '/./d'
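The cause of that second error is likely the mismatch between -print, which emits newline-separated names, and xargs -0, which expects NUL-separated input, so the whole list is read as a single filename. A minimal sketch of a NUL-safe version of the same pipeline (assuming GNU find, grep and xargs):
# -print0 / -0 keep the pipeline NUL-delimited end to end;
# grep -Z prints NUL-terminated names, and xargs -r skips sed when nothing matches
find / -maxdepth 1 -xdev -type f -print0 |
  xargs -0 grep -Zil "stringtofind" |
  xargs -0 -r sed -i '/./d'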
Your find command with the missing argument is very close to working. Try:
find / -maxdepth 1 -xdev -type f -exec grep -i "stringtofind" -l {} \; -exec sed -i '/./d' {} \;
You just need to add a backslash-escaped semicolon to the end of each -exec action and add {} to let it know where to substitute the filename.
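If you don't want the matched filenames echoed by grep -l, a variant of the same idea (just a sketch) uses grep -q purely as a test, so the second -exec only runs on files that contain the string:
find / -maxdepth 1 -xdev -type f -exec grep -qi "stringtofind" {} \; -exec sed -i '/./d' {} \;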
You need to specify a file marker ({}) and termination (either ; or +) to -exec. In addition I would suggest that you process the files with GNU parallel:
find / -maxdepth 1 -xdev -type f -exec grep -li "stringtofind" {} + |
parallel sed -i "'/./d'" '{}'
To handle filenames with newlines in them you need to use zero termination as suggested by others:
find / -maxdepth 1 -xdev -type f -exec grep -Zli "stringtofind" {} + |
parallel -0 sed -i "'/./d'" '{}'
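Since the sed step is destructive, it can be worth previewing which files would be emptied first; a sketch that only lists the NUL-terminated matches:
find / -maxdepth 1 -xdev -type f -exec grep -Zli "stringtofind" {} + | tr '\0' '\n'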
Here's an alternative that doesn't depend on GNU parallel:
find / -maxdepth 1 -xdev -type f -exec grep -Zli "stringtofind" {} + |
while IFS= read -r -d $'\0' file; do
    sed -i '/./d' "$file"
done
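Note that sed '/./d' only deletes non-blank lines, so a file that contains empty lines won't end up completely empty. If the goal is a truly empty file, truncating with a redirection is simpler; a sketch of the same loop with that change:
find / -maxdepth 1 -xdev -type f -exec grep -Zli "stringtofind" {} + |
while IFS= read -r -d '' file; do
    : > "$file"    # truncate the matching file to zero length
done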