 

How to use grep for collecting URL links?

Let's say I have a file named plaintext.txt which has 2 lines with the following content:

you can be social online by logging in to http://www.facebook.com and http://plus.google.com
alternatively, you can also create an account in http://www.twitter.com

I know that I can display the whole contents of the file by issuing a cat command on the command line like

cat plaintext.txt

I want to collect all of the URL links from the plain text so that I can display

http://www.facebook.com
http://plus.google.com
http://www.twitter.com

I presume the command line statement for this one would be something like

cat plaintext.txt | grep something

but I don't exactly know how.

How is it possible to use grep for collecting URL links?

asked Aug 31 '25 by Abel Callejo

2 Answers

You can do this with grep's -o option.

$ cat file
you can be social online by logging in to http://www.facebook.com and http://plus.google.com
alternatively, you can also create an account in http://www.twitter.com

$ grep -o "http[^ ]*" file
http://www.facebook.com
http://plus.google.com
http://www.twitter.com

From the man page:

-o, --only-matching
              Show only the part of a matching line that matches PATTERN.
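The pattern http[^ ]* would also pick up any other word that happens to start with http (for example httpd or httpclient). If you want to require an actual URL scheme, a slightly stricter pattern works; this is just a sketch, assuming a grep that supports -o and -E (GNU and BSD grep both do), and for the sample file the output is the same:

$ grep -oE "https?://[^[:space:]]+" file
http://www.facebook.com
http://plus.google.com
http://www.twitter.com

Here https?:// requires a real scheme (and also allows https), and [[:space:]] covers tabs as well as spaces.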
answered Sep 02 '25 by jaypal singh

sed 's/\s/\n/g' plaintext.txt | grep http:

where plaintext.txt is the file containing your two lines. The sed part replaces every whitespace character with a newline so that each word ends up on its own line, and grep then keeps only the lines containing http:.
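Assuming GNU sed (\s is a GNU extension), running this pipeline on the sample file would print:

$ sed 's/\s/\n/g' plaintext.txt | grep http:
http://www.facebook.com
http://plus.google.com
http://www.twitter.com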

answered Sep 03 '25 by unxnut