I want to recursively search through a directory with subdirectories of text files and replace every occurrence of {$replace} within the files with the contents of a multi-line string. How can this be achieved with Python?
[EDIT]
So far, all I have is the recursive code using os.walk to get a list of the files that need to be changed:
import os
import sys

fileList = []
rootdir = "C:\\test"
for root, subFolders, files in os.walk(rootdir):
    if subFolders != ".svn":
        for file in files:
            fileParts = file.split('.')
            if len(fileParts) > 1:
                if fileParts[1] == "php":
                    fileList.append(os.path.join(root, file))
print fileList
os.walk is great. However, it looks like you also need to filter file types (which I would suggest if you are going to walk some directory). To do this, add import fnmatch:
import os, fnmatch

def findReplace(directory, find, replace, filePattern):
    for path, dirs, files in os.walk(os.path.abspath(directory)):
        for filename in fnmatch.filter(files, filePattern):
            filepath = os.path.join(path, filename)
            with open(filepath) as f:
                s = f.read()
            s = s.replace(find, replace)
            with open(filepath, "w") as f:
                f.write(s)
This allows you to do something like:
findReplace("some_dir", "find this", "replace with this", "*.txt")
Check out os.walk:
import os

replacement = """some
multi-line string"""

for dname, dirs, files in os.walk("some_dir"):
    for fname in files:
        fpath = os.path.join(dname, fname)
        with open(fpath) as f:
            s = f.read()
        s = s.replace("{$replace}", replacement)
        with open(fpath, "w") as f:
            f.write(s)
The above solution has flaws: it opens and rewrites literally every file it finds, and it reads each file entirely into memory (which would be a problem with, say, a 1 GB text file), but it should be a good starting point; a filtered variant is sketched below.
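For example, you could skip filenames that do not match the extension you care about and leave files that do not contain the placeholder untouched. This is only a minimal sketch: the ".txt" filter and the placeholder are illustrative, and it still reads each matching file into memory.
import os

replacement = """some
multi-line string"""

for dname, dirs, files in os.walk("some_dir"):
    for fname in files:
        if not fname.endswith(".txt"):
            continue  # only consider the file types we expect to change
        fpath = os.path.join(dname, fname)
        with open(fpath) as f:
            s = f.read()
        if "{$replace}" not in s:
            continue  # do not rewrite files that have nothing to replace
        with open(fpath, "w") as f:
            f.write(s.replace("{$replace}", replacement))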
You may also want to look into the re module if you need a more complex find/replace than a literal string.
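For instance, re.sub() can replace any {$...}-style placeholder in one pass (a small sketch; the pattern and sample text are only illustrative):
import re

s = "header {$replace} body {$other} footer"
s = re.sub(r"\{\$\w+\}", "multi\nline", s)
print(s)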
For those using Python 3.5+, you can now glob recursively by using ** together with the recursive flag. Here's an example replacing hello with world in all .txt files:
import glob

for filepath in glob.iglob('./**/*.txt', recursive=True):
    with open(filepath) as file:
        s = file.read()
    s = s.replace('hello', 'world')
    with open(filepath, "w") as file:
        file.write(s)
To avoid recursing into .svn directories, os.walk() allows you to change the dirs list in place. To simplify the text replacement in a file without reading the whole file into memory, you could use the fileinput module. And to filter filenames using a file pattern, you could use the fnmatch module as suggested by @David Sulpy:
#!/usr/bin/env python
from __future__ import print_function
import fnmatch
import os
from fileinput import FileInput

def find_replace(topdir, file_pattern, text, replacement):
    for dirpath, dirs, files in os.walk(topdir, topdown=True):
        dirs[:] = [d for d in dirs if d != '.svn']  # skip .svn dirs
        files = [os.path.join(dirpath, filename)
                 for filename in fnmatch.filter(files, file_pattern)]
        for line in FileInput(files, inplace=True):
            print(line.replace(text, replacement), end='')

find_replace(r"C:\test", "*.php", '{$replace}', "multiline\nreplacement")