Consider this code:
#include <iostream>
#include <type_traits>
using namespace std;
template<typename T_orig> void f(T_orig& a) {
    a = 5;
}
template<typename T_orig, typename T = T_orig&> void g(T a) {
    a = 8;
}
int main() {
int b=3;
f<decltype(b)>(b);
cout<<b<<endl;
g<decltype(b)>(b);
cout<<b<<endl;
return 0;
}
This prints
5
5
Can somebody explain to me why in the second version the &
is lost?
The problem here is that type deduction takes priority over defaulted function template parameters. Because T appears in the function parameter list, it is deduced from the argument b as plain int; the default T = T_orig& is never consulted, so a is a by-value copy and never a reference.
You can prevent this by making the type not deducible. A generic identity type trait can do this.
template <typename T>
struct identity { using type = T; };
template <typename T>
using NotDeducible = typename identity<T>::type;
template<typename T_orig, typename T = T_orig&>
void g(NotDeducible<T> a) { a = 8; }
Or, in this particular case, you can simply get rid of the template parameter altogether.
template<typename T_orig> void g(T_orig& a)