awk, grep, or sed: how to match two files


How can I pass arguments to awk or grep from a previous command?

I want to match one field (Username) from two files:

Expected result:

firstname   Lastname  Username IPaddress
Mozes       Bowman    user1    34.244.47.32
Jazzmyn     Parrish   user2    3.249.198.34
Chet        Woods     user3    52.215.73.213

From these two files:

file1:
IPaddress       Username
34.244.47.32    user1
3.249.198.34    user2
52.215.73.213   user3

file2:
firstname Lastname   Username
Mozes     Bowman     user1
Jazzmyn   Parrish    user2
Chet      Woods      user3

This is as far as I have gotten:

awk '{print $2,$1}' file1 | while read Username IP ; do grep "$Username" file2 && echo "$IP"; done

This results in the IP being shown on a new line each time:

firstname   Lastname    Username
IPaddress
Mozes   Bowman  user1
34.244.47.32
Jazzmyn Parrish user2
3.249.198.34
Chet    Woods   user3
52.215.73.213
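
The IP lands on its own line because grep prints the matched line with its own trailing newline and echo then prints $IP separately. A rough sketch that keeps both on one line with printf (assuming usernames contain no spaces, and skipping the header line with NR>1):

awk 'NR>1 {print $2, $1}' file1 | while read -r Username IP ; do
    printf '%s %s\n' "$(grep -w "$Username" file2)" "$IP"
done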


2 Answers

You can try this awk:

awk 'NR==FNR {r[$3]=$0 ; next } { print r[$2]"\t" $1 }' file2 file1

Output:

firstname Lastname   Username   IPaddress
Mozes     Bowman     user1      34.244.47.32
Jazzmyn   Parrish    user2      3.249.198.34
Chet      Woods      user3      52.215.73.213
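
The NR==FNR block caches each line of file2 keyed by its Username field ($3); while file1 is read, r[$2] looks that line up and the IP from $1 is appended, which is also why the header row comes out correctly. If the coreutils join command is available, roughly the same result can be produced without awk arrays (a sketch, assuming a bash shell for process substitution and usernames without embedded spaces):

join -1 2 -2 3 <(tail -n +2 file1 | sort -k2,2) <(tail -n +2 file2 | sort -k3,3) |
    awk 'BEGIN{print "firstname","Lastname","Username","IPaddress"} {print $3, $4, $1, $2}'

The tail calls strip the header lines, and the sort calls are needed because join expects both inputs sorted on the join field.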


Please try the following awk code. It also handles the case where usernames contain spaces. Written and tested against the same format as the OP's samples.

awk '
BEGIN{
  OFS="\t"
  print "firstname   Lastname  Username IPaddress"
}
NR==1{ next }
FNR==NR{
  match($0,/^([0-9]+\.)+[0-9]+[[:space:]]+/)
  value=substr($0,RSTART,RLENGTH)
  sub(/[[:space:]]+$/,"",value)
  arr[substr($0,RSTART+RLENGTH)]=value
  next
}
{
  first=$1
  second=$2
  $1=$2=""
  sub(/^[[:space:]]+/,"")
}
($0 in arr){
  print first,second,$0,arr[$0]
}
' file1 file2

Explanation: a detailed, line-by-line walkthrough of the solution above.

awk '                                                ##Start of the awk program.
BEGIN{                                               ##BEGIN section, run before any input is read.
  OFS="\t"                                           ##Set the output field separator to a tab.
  print "firstname   Lastname  Username IPaddress"   ##Print the header line.
}
NR==1{ next }                                        ##Skip the very first input line (the header of file1).
FNR==NR{                                             ##TRUE only while the first file, file1, is being read.
  match($0,/^([0-9]+\.)+[0-9]+[[:space:]]+/)         ##Match the leading IP address followed by spaces.
  value=substr($0,RSTART,RLENGTH)                    ##value is the matched substring (IP plus trailing spaces).
  sub(/[[:space:]]+$/,"",value)                      ##Strip the trailing spaces from value.
  arr[substr($0,RSTART+RLENGTH)]=value               ##Index arr by the rest of the line (the username); the value is the IP.
  next                                               ##Skip the remaining statements for file1 lines.
}
{
  first=$1                                           ##Save $1 (firstname) of the current file2 line.
  second=$2                                          ##Save $2 (Lastname) of the current file2 line.
  $1=$2=""                                           ##Empty $1 and $2 so only the username is left in $0.
  sub(/^[[:space:]]+/,"")                            ##Strip the leading whitespace left behind.
}
($0 in arr){                                         ##If the remaining username is an index of arr.
  print first,second,$0,arr[$0]                      ##Print firstname, Lastname, username and the looked-up IP.
}
' file1 file2                                        ##Input file names: file1 first, then file2.
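
One usage note: the data rows are printed tab-separated (OFS="\t") while the header line uses literal spaces, so the columns may not line up visually. Piping the output through column -t (where the column utility is available) realigns everything; matchusers.awk here is just the program above saved to a file, and the name is purely illustrative:

awk -f matchusers.awk file1 file2 | column -t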


