I've parsed a file, split the resulting string into lines, and want to keep only the unique elements in each vector. I expected vec.dedup()
to work like this:
let mut vec = vec!["a", "b", "a"];
vec.dedup();
assert_eq!(vec, ["a", "b"]);
But it fails:
thread 'main' panicked at 'assertion failed: `(left == right)`
left: `["a", "b", "a"]`,
right: `["a", "b"]`', src/main.rs:4:4
How can I remove duplicates?
As documented, Vec::dedup only removes consecutive duplicate elements from a vector (which is much cheaper than a full deduplication). It would work fine if the vector were vec!["a", "a", "b"], for example.
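For instance, this quick check of the documented behaviour passes, because the duplicates are adjacent:

let mut v = vec!["a", "a", "b"];
v.dedup();
assert_eq!(v, ["a", "b"]);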
Of course, there are multiple potential solutions.
In order to obtain a vector with all duplicates removed while retaining the original order of the elements, the itertools crate provides a unique adaptor.
use itertools::Itertools;
let v = vec!["b", "a", "b"];
let v: Vec<_> = v.into_iter().unique().collect();
assert_eq!(v, ["b", "a"]);
If element order is not important, you can sort the elements first and then call dedup.
let mut v = vec!["a", "b", "a"];
v.sort_unstable();
v.dedup();
assert_eq!(v, ["a", "b"]);
If fast element lookup is important, you may also consider using a set type instead, such as HashSet.
use std::collections::HashSet;

let v: HashSet<_> = ["a", "b", "a"].iter().cloned().collect();
let v2: HashSet<_> = ["b", "a"].iter().cloned().collect();
assert_eq!(v, v2);
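The fast lookup itself comes through HashSet::contains, which is an expected O(1) operation; continuing with the v built above:

// membership tests on the deduplicated set
assert!(v.contains("a"));
assert!(!v.contains("c"));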
The other answer points out that a HashSet is a better choice for a collection without duplicates, which I agree with. This shows how to deduplicate a Vec in place using that property of HashSet, without sorting the Vec first just to use std::vec::Vec::dedup.
use std::collections::HashSet;
use std::hash::Hash;

fn dedup<T: Eq + Hash + Copy>(v: &mut Vec<T>) { // note the Copy constraint
    // `HashSet::insert` returns false when the value is already present,
    // so `retain` drops every occurrence after the first.
    let mut uniques = HashSet::new();
    v.retain(|e| uniques.insert(*e));
}

fn main() {
    let mut v = vec!["a", "b", "a"];
    dedup(&mut v);
    assert_eq!(&v, &vec!["a", "b"]);
}
This is a fast (O(n)) solution, but creating the HashSet requires some extra memory.