I'm trying to understand how to use the C++11 <type_traits> header.
Here's my trivial test program:
#include <type_traits>
template<class U, class S>
inline U add(typename std::enable_if<std::is_unsigned<U>::value,U>::type a,
typename std::enable_if<std::is_signed <S>::value,S>::type b)
{
return a + b;
}
int main(int argc, const char * argv[], const char * envp[])
{
unsigned int ui;
int i;
auto a = add(ui, i);
return 0;
}
When compiled with GCC 4.8.1, it fails with:
/home/per/f.cpp: In function ‘int main(int, const char**, const char**)’:
/home/per/f.cpp:15:23: error: no matching function for call to ‘add(unsigned int&, int&)’
auto a = add(ui, i);
^
/home/per/f.cpp:15:23: note: candidate is:
/home/per/f.cpp:5:10: note: template<class U, class S> U add(typename std::enable_if<std::is_unsigned<U>::value, U>::type, typename std::enable_if<std::is_signed<S>::value, S>::type)
inline U add(typename std::enable_if<std::is_unsigned<U>::value,U>::type a,
^
/home/per/f.cpp:5:10: note: template argument deduction/substitution failed:
/home/per/f.cpp:15:23: note: couldn't deduce template parameter ‘U’
auto a = add(ui, i);
^
I have no clue why GCC can't deduce the template parameter U
. Does anybody know what information my code is missing? That is, how do I write a program in C++11 that takes an unsigned integral type as its first argument and a signed integral type as its second?
typename std::enable_if<std::is_unsigned<U>::value,U>::type
is not a deduced context. In order to deduce U
from it, the compiler would need the ability to apply the reverse operation of std::enable_if
. That doesn't look too hard, true, but only because you are talking about something as simple as enable_if
. It would be impossible to require this of every trait, so C++ just plays it cool and does not make any kind of weird rule exceptions: it's not deducible in general, so it's not deducible in this case either.
You can do it this way instead:
template<class U, class S,
EnableIf<std::is_unsigned<U>, std::is_signed<S>>...>
U add(U a, S b)
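Note that EnableIf is not a standard name; it's an alias template you define yourself. A minimal C++11 sketch (the names all_true and enabler are illustrative, not from any library):

```cpp
#include <type_traits>

enum class enabler {};  // dummy non-type used only for SFINAE

// Conjunction of traits (C++11 has no std::conjunction yet)
template <typename... Ts> struct all_true : std::true_type {};
template <typename T, typename... Ts>
struct all_true<T, Ts...>
    : std::conditional<T::value, all_true<Ts...>, std::false_type>::type {};

// EnableIf<C1, C2, ...> is 'enabler' when all conditions hold,
// and SFINAEs out of overload resolution otherwise
template <typename... Condition>
using EnableIf =
    typename std::enable_if<all_true<Condition...>::value, enabler>::type;

template <class U, class S,
          EnableIf<std::is_unsigned<U>, std::is_signed<S>>...>
U add(U a, S b) { return a + b; }
```

With this in place, add(2u, -1) compiles (the unnamed non-type parameter pack is deduced empty), while add(1, 1u) is rejected during substitution.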
Or, in compilers that don't support that style properly, you can just add an extra defaulted argument:
template<class U, class S>
U add(U a, S b,
typename std::enable_if<std::is_unsigned<U>::value
&& std::is_signed<S>::value,void>::type* = nullptr)
... or move the check into the return type:
template<class U, class S>
typename std::enable_if<std::is_unsigned<U>::value
&& std::is_signed<S>::value,U>::type
add(U a, S b)
You are not giving the compiler a chance to deduce U
and S
. You can rewrite your function as follows, moving the SFINAE checks into the template parameter list:
template<class U, class S,
typename std::enable_if<std::is_unsigned<U>::value &&
std::is_signed <S>::value
>::type* = nullptr>
inline U add(U a, S b)
{
return a + b;
}
Here is a live example.
You first have to deduce the types before you can reason about the types!
It should be:
template <typename U, typename S>
typename std::enable_if<std::is_unsigned<U>::value &&
                        std::is_signed<S>::value, U>::type
add(U u, S s)
{
// ...
}
It's not possible to deduce a template parameter from a "nested typedef" expression. That is, it's possible to deduce U
from some_template<U>
, but not from some_template<U>::type
.
The compiler cannot possibly enumerate all (infinitely many!) instantiations of some_template
to see for which of them the nested typedef equals the actual argument type.