I have two static slices of u8 and I would like to implement a function to concatenate them. Something like this:
fn concat_u8(first: &'static [u8], second: &'static [u8]) -> &'static [u8] {
&[&first[..], &second[..]].concat()
}
The compiler shows me the error "returns a reference to data owned by the current function". That is because the allocated memory will be freed at the end of the function. How can I "force" the lifetime to be 'static?
Edit
I have a long-running process. At start time, the process processes some input in order to calculate a result (i.e. the concat_u8 function). The result is a slice of u8 and will be used read-only for the rest of the process's life. The function concat_u8 will not be called again after an internal "start event".
I'd like to avoid Box, because the dynamic allocation implies a little overhead (maybe not measurable?), and to store the result as a &[u8].
Is there any way to do that? Is there any way to do it without using an unsafe block?
It is not possible to concatenate two slices into a new slice with a 'static lifetime without leaking memory.
Slices are stored consecutively in memory. Concatenating two slices with static lifetime requires copying them to newly allocated memory, since the result also needs to be consecutive. This newly allocated memory will necessarily be owned by the current function, so you cannot return a reference to it.
You will have to transfer ownership of the memory back to the caller instead:
pub fn concat_u8(first: &[u8], second: &[u8]) -> Vec<u8> {
[first, second].concat()
}
There is no need to require the inputs to have static lifetime anymore, and probably no need to implement this function at all, since calling it will be no shorter or clearer than simply inlining the code.
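For illustration, here is a minimal sketch of how a caller would use the version that returns a Vec<u8> (the names in main are hypothetical):

```rust
fn concat_u8(first: &[u8], second: &[u8]) -> Vec<u8> {
    [first, second].concat()
}

fn main() {
    // The caller owns `combined`; it is freed automatically when it goes
    // out of scope, so no 'static lifetime is needed.
    let combined = concat_u8(b"hello, ", b"world");
    assert_eq!(combined, b"hello, world");

    // A &[u8] view is available whenever a slice is required.
    let view: &[u8] = &combined;
    assert_eq!(view.len(), 12);
}
```

Storing the Vec<u8> in the long-lived part of the program and borrowing &[u8] views from it gives the read-only access described in the question, without any leaking or unsafe code.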
If you require a &'static [u8] instead of a Vec<u8> for some reason, you can leak() the vector. There's rarelyly a reason to do this, and accessing a static slice isn't any faster than accessing a vector.
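A minimal sketch of the leaking approach, using Vec::leak (stable since Rust 1.47); the function name is an assumption for the example:

```rust
fn concat_u8_static(first: &[u8], second: &[u8]) -> &'static [u8] {
    // The allocation is deliberately never freed, so the returned
    // reference is valid for the rest of the program ('static).
    [first, second].concat().leak()
}

fn main() {
    let combined: &'static [u8] = concat_u8_static(b"foo", b"bar");
    assert_eq!(combined, b"foobar");
}
```

This requires no unsafe block, but it leaks one allocation per call, which is only acceptable for something computed once at start-up, as in the question's scenario.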