You can do something like this as demonstrated in perlfaq4:
sub uniq {
    my %seen;
    grep !$seen{$_}++, @_;    # keep an element only the first time it is seen
}
my @array = qw(one two three two three);
my @filtered = uniq(@array);
print "@filtered\n";
Outputs:
one two three
If you want to use a module, try the uniq function from List::MoreUtils.
The Perl documentation comes with a nice collection of FAQs. Your question is frequently asked:
% perldoc -q duplicate
The answer, copy and pasted from the output of the command above, appears below:
How can I remove duplicate elements from a list or array? (contributed by brian d foy)
Use a hash. When you think the words "unique" or "duplicated", think "hash keys".
If you don't care about the order of the elements, you could just create the hash then extract the keys. It's not important how you create that hash: just that you use "keys" to get the unique elements.
my %hash = map { $_, 1 } @array;
# or a hash slice: @hash{ @array } = ();
# or a foreach: $hash{$_} = 1 foreach ( @array );
my @unique = keys %hash;
If you want to use a module, try the "uniq" function from "List::MoreUtils". In list context it returns the unique elements, preserving their order in the list. In scalar context, it returns the number of unique elements.
use List::MoreUtils qw(uniq);
my @unique = uniq( 1, 2, 3, 4, 4, 5, 6, 5, 7 ); # 1,2,3,4,5,6,7
my $unique = uniq( 1, 2, 3, 4, 4, 5, 6, 5, 7 ); # 7
You can also go through each element and skip the ones you've seen before. Use a hash to keep track. The first time the loop sees an element, that element has no key in %seen. The "next" statement creates the key and immediately uses its value, which is "undef", so the loop continues to the "push" and increments the value for that key. The next time the loop sees that same element, its key exists in the hash and the value for that key is true (since it's not 0 or "undef"), so the "next" skips that iteration and the loop goes to the next element.
my @unique = ();
my %seen   = ();
foreach my $elem ( @array ) {
    next if $seen{ $elem }++;
    push @unique, $elem;
}
You can write this more briefly using a grep, which does the same thing.
my %seen = ();
my @unique = grep { ! $seen{ $_ }++ } @array;
Install List::MoreUtils from CPAN
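For example, with the cpan client that ships with Perl (cpanm also works if you have App::cpanminus installed):
% cpan List::MoreUtils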
Then in your code:
use strict;
use warnings;
use List::MoreUtils qw(uniq);
my @dup_list = qw(1 1 1 2 3 4 4);
my @uniq_list = uniq(@dup_list);
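To check the result you can print it; uniq keeps the order of first occurrence, so this prints the four distinct values:
print "@uniq_list\n"; # prints: 1 2 3 4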
My usual way of doing this is:
my %unique = ();
foreach my $item (@myarray)
{
$unique{$item} ++;
}
my @myuniquearray = keys %unique;
If you use a hash and add the items to it as keys, you also get the bonus of knowing how many times each item appeared in the list.
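For example, since the loop above increments $unique{$item} on every occurrence, the hash doubles as a frequency count (this loop is a small illustrative addition, reusing %unique from above):
foreach my $item (sort keys %unique) {
    print "$item appeared $unique{$item} time(s)\n";
}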
This can also be done with a simple Perl one-liner.
my @in  = qw(1 3 4 6 2 4 3 2 6 3 2 3 4 4 3 2 5 5 32 3);  # sample data
my @out = keys %{{ map { $_ => 1 } @in }};                # perform PFM
print join ' ', sort { $a <=> $b } @out;                  # print the data back out, numerically sorted
The PFM block does this: the data in @in is fed into map, which builds an anonymous hash; keys then extracts the unique keys from that hash, and they are fed into @out.
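Written out longhand, the one-liner is equivalent to something like this (%pfm_hash is just an illustrative name):
my %pfm_hash = map { $_ => 1 } @in;  # duplicates collapse onto the same key
my @out      = keys %pfm_hash;       # the unique elements, in arbitrary hash order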
Logic: a hash can have only unique keys, so iterate over the array, assigning any value to each element and using the element itself as the hash key; the keys of the hash are then your unique array.
my @unique = keys %{ { map { $_ => 1 } @array } };
(Calling keys directly on the anonymous hash reference, as in keys {...}, relied on an experimental feature that has since been removed from Perl, so the explicit %{ ... } dereference is needed.)
It is better to make this a subroutine if we need the functionality multiple times in our code.
sub get_unique {
    my %seen;
    grep !$seen{$_}++, @_;
}
my @unique = get_unique(@array);
List::MoreUtils
use List::MoreUtils qw(uniq);
my @unique = uniq(@array);
The variable @array is the list with duplicate elements.
my %seen = ();
my @unique = grep { ! $seen{$_}++ } @array;