read, err := ioutil.ReadFile(path)
if err != nil {
return err
}
if strings.Contains(string(read), "STRING") {
// display line number?
// what if many occurrences of the string
// how to display for each the line number?
}
I'm trying to search files for a specific string and display the line number at which the string is located.
Use a bufio.Scanner to iterate over the file line by line, incrementing your line count on each loop.
e.g. (wrapped in a small helper function so it compiles on its own):
func findLine(path string) (int, error) {
	f, err := os.Open(path)
	if err != nil {
		return 0, err
	}
	defer f.Close()

	// Splits on newlines by default.
	scanner := bufio.NewScanner(f)

	line := 1
	// https://golang.org/pkg/bufio/#Scanner.Scan
	for scanner.Scan() {
		if strings.Contains(scanner.Text(), "yourstring") {
			return line, nil
		}
		line++
	}
	if err := scanner.Err(); err != nil {
		// Handle the error as you see fit, e.g. return it.
		return 0, err
	}
	return 0, nil // "yourstring" was not found
}
Update: if you need to do this across 'thousands of files' (as per the comment on another answer), then you would wrap this approach in a worker pool and run the searches concurrently.
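For example, here is a minimal sketch of that worker-pool idea. The pool size (8), the hard-coded file list, and the findLine name are illustrative assumptions on my part rather than part of the original answer; findLine is repeated so the example compiles on its own.

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
	"sync"
)

// findLine is the helper from the snippet above, repeated here so this example
// compiles on its own. It returns the 1-based line number of the first line
// containing "yourstring", or 0 if it is not found.
func findLine(path string) (int, error) {
	f, err := os.Open(path)
	if err != nil {
		return 0, err
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	line := 1
	for scanner.Scan() {
		if strings.Contains(scanner.Text(), "yourstring") {
			return line, nil
		}
		line++
	}
	return 0, scanner.Err()
}

func main() {
	paths := []string{"a.txt", "b.txt", "c.txt"} // in practice, your thousands of files

	jobs := make(chan string)
	var wg sync.WaitGroup

	// A fixed number of workers pull paths off the jobs channel concurrently.
	for w := 0; w < 8; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for path := range jobs {
				line, err := findLine(path)
				if err != nil {
					fmt.Fprintln(os.Stderr, path, err)
					continue
				}
				if line > 0 {
					fmt.Printf("%s:%d\n", path, line)
				}
			}
		}()
	}

	for _, p := range paths {
		jobs <- p
	}
	close(jobs)
	wg.Wait()
}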
Hi, try this out.
/* ioutil.ReadFile returns []byte, error */
data, err := ioutil.ReadFile("output.txt")
if err != nil {
	panic(err) /* handle the error properly in real code */
}

/* split the file contents on newlines */
file := string(data)
line := 0
/* func Split(s, sep string) []string */
temp := strings.Split(file, "\n")
for _, item := range temp {
	fmt.Println("[", line, "]\t", item)
	line++
}
Basically, it reads the file with the ioutil package and splits its contents on newlines. The strings package is very useful when you need to find a certain character, newlines, and so on. Take a look at Go's official documentation for further notes and examples (https://golang.org/pkg/strings/#Split).
I made an "output.txt" from the page above (https://golang.org/pkg/strings/#Split), and the expected output is:
[ 0 ]
[ 1 ] Examples
[ 2 ]
[ 3 ] Contains
[ 4 ] ContainsAny
[ 5 ] Count
[ 6 ] EqualFold
[ 7 ] Fields
[ 8 ] FieldsFunc
[ 9 ] HasPrefix
[ 10 ] HasSuffix
...
Regarding the format, you can replace
fmt.Println("[", line, "]\t", item)
with a fmt.Printf call and a format string of your choice.
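For instance, a minimal self-contained sketch of one possible format (the format string below is only an illustration; the original comment was cut off before giving one):

package main

import (
	"fmt"
	"strings"
)

func main() {
	/* stand-in for the file contents read with ioutil.ReadFile */
	file := "Examples\n\nContains\nContainsAny"
	for line, item := range strings.Split(file, "\n") {
		fmt.Printf("[ %d ]\t%s\n", line, item)
	}
}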