Find and replace in multiple files
I just had to make a few changes to a website I own, which meant changing the same text in lots of files. Linux has a lot of powerful tools to do this across many directories in one go. The safe modern pattern is:
find . -type f -print0 | xargs -0 sed -i 's/string1/string2/g'
Two commands piped together. find . -type f lists every regular file under the current directory (directories and symlinks are skipped); add -iname "*.ext" before -print0 if you want to restrict the edit to a particular name pattern (if it comes after -print0, every file is printed anyway).
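For instance, to touch only HTML files (the filenames here are just illustrative):

```shell
# Illustrative setup: one matching and one non-matching file
printf 'string1\n' > index.html
printf 'string1\n' > notes.txt
# -iname filters case-insensitively by name; only the .html file is edited
find . -type f -iname "*.html" -print0 | xargs -0 sed -i 's/string1/string2/g'
```

Afterwards index.html contains string2 while notes.txt is untouched.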
-print0 | xargs -0 is important: -print0 emits filenames separated by NUL bytes instead of newlines, and xargs -0 expects that same format. Without the -0 pair, any filename containing spaces, tabs, or newlines gets split into multiple arguments and sed either fails or edits the wrong files.
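You can see the failure mode for yourself with a file whose name contains a space (filename is illustrative):

```shell
# Create a file whose name contains a space:
printf 'string1\n' > "my file.txt"
# Newline-separated output splits the name into "./my" and "file.txt",
# neither of which exists, so sed fails and xargs exits nonzero:
find . -type f -name "my file.txt" | xargs sed -i 's/string1/string2/g' 2>/dev/null \
  || echo "plain xargs failed"
# NUL-separated output hands sed the filename intact:
find . -type f -name "my file.txt" -print0 | xargs -0 sed -i 's/string1/string2/g'
```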
xargs then hands sed batches of those filenames, and sed edits each file in place (-i) using the substitution s/string1/string2/g — s is substitute, g is global-per-line (without it only the first match on each line is replaced). The pattern side is a regex, so metacharacters like . and / need care.
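Two escaping tricks worth knowing (the filenames below are just examples):

```shell
# Sample files (names are illustrative):
printf 'visit example.com and exampleXcom\n' > page.html
printf 'path=/old/path/to/site\n' > settings.conf
# A literal dot must be backslashed, or . matches any single character
# (exampleXcom would otherwise be rewritten too):
sed -i 's/example\.com/example.org/g' page.html
# Any delimiter can follow s; using | avoids escaping slashes in paths:
sed -i 's|/old/path|/new/path|g' settings.conf
```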
macOS gotcha: BSD sed (the one on macOS by default) requires an extension argument after -i, even if empty:
find . -type f -print0 | xargs -0 sed -i '' 's/string1/string2/g'
Or install GNU sed (brew install gnu-sed) and invoke it as gsed.
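If a script has to run on both platforms, one heuristic (not an official detection mechanism) is that GNU sed accepts --version while BSD sed rejects it, so you can branch on that. A sketch, with an illustrative sample file:

```shell
printf 'string1\n' > sample.txt
# Heuristic: GNU sed understands --version, BSD sed errors out on it
if sed --version >/dev/null 2>&1; then
  find . -type f -name "sample.txt" -print0 | xargs -0 sed -i 's/string1/string2/g'
else
  # BSD sed needs an explicit (empty) backup extension after -i
  find . -type f -name "sample.txt" -print0 | xargs -0 sed -i '' 's/string1/string2/g'
fi
```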
If you’d prefer to skip xargs entirely, find -exec is equivalent and slightly safer because it doesn’t need the NUL-separator dance:
find . -type f -exec sed -i 's/string1/string2/g' {} +
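Whichever variant you use, it's worth having a way back: giving -i a non-empty suffix makes sed keep a backup of every file it edits. A sketch with an illustrative filename:

```shell
printf 'string1\n' > page.txt
# -i.bak saves the original as page.txt.bak before editing page.txt
find . -type f -name "page.txt" -exec sed -i.bak 's/string1/string2/g' {} +
```

Running grep -rl 'string1' . beforehand is also a cheap dry run: it lists the files that contain the pattern without modifying anything.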