 

Overload resolution and user defined conversion

Consider this simple code:

struct A;
struct B {
  B() {}
  B(A const&) {}
};

struct A {
  operator int() const { return 0; }
};

void func(B) {}
void func(char) {}

int main()
{
  func(A()); // ambiguous call oO
}

First of all, I'm not sure I understand everything correctly, so please correct me wherever I'm wrong.

My understanding was that void func(B) should have been chosen, since the argument to func is an A, which is a class type, so the kind of conversion required is a "user-defined conversion sequence".

Now, from the IBM C++ reference:

A user-defined conversion sequence consists of the following:

  • A standard conversion sequence
  • A user-defined conversion
  • A second standard conversion sequence

Now there are two user-defined conversions present: B::B(const A&) and A::operator int() const.

so the sequences are

-> A() -> B::B(const A&) -> standard conversion (identity conversion)

-> A() -> A::operator int() const -> standard conversion (integral conversion)

Since an integral conversion is worse than the identity conversion, I thought void func(B) would be called, but the call is still ambiguous.

So please help me see at which point I am wrong and why the call is ambiguous. Thanks a lot :)

asked by Angelus Mortis

People also ask

What is the overload resolution?

The process of selecting the most appropriate overloaded function or operator is called overload resolution. Suppose that f is an overloaded function name. When you call the overloaded function f(), the compiler creates a set of candidate functions.
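
For instance, here is a minimal sketch (the function names are made up for illustration, not taken from the question) of how the compiler ranks the candidates by the conversion each argument would need:

void f(int) {}     // exact match for an int argument
void f(double) {}  // would need a floating-integral conversion from an int argument

int main()
{
  f(42);   // candidate set is {f(int), f(double)}; f(int) wins via the identity conversion
  f(3.14); // f(double) wins for the same reason
}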

What is user-defined conversion?

User-defined conversions perform conversions between user-defined types, or between user-defined types and built-in types. You can implement them as conversion constructors or as conversion functions. (For built-in conversions, see Standard Conversions.)

What is operator overloading explain conversion?

Operator overloading can define a type conversion operator that is used to produce an int from a Counter object. This operator is used whenever an implicit or explicit conversion of a Counter object to an int is required. Notice that constructors also play a role in type conversion.
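
A minimal sketch of such a Counter class (the class here is invented for illustration), showing both kinds of user-defined conversion:

struct Counter {
  int value;
  Counter(int v) : value(v) {}            // converting constructor: int -> Counter
  operator int() const { return value; }  // conversion function: Counter -> int
};

int main()
{
  Counter c(5);
  int n = c;       // implicit conversion via operator int()
  int m = c + 1;   // operator int() also applies before the arithmetic
  return n + m;    // 5 + 6
}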

What does overload resolution failed mean?

The compiler generates this error message when one overload is more specific for one argument's data type while another overload is more specific for another argument's data type.
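
A small sketch of that situation (names are illustrative): each overload is better for a different argument, so neither wins overall:

void g(int, double) {}
void g(double, int) {}

int main()
{
  // g(1, 2);  // error: ambiguous - g(int, double) is better for the first
  //           // argument, g(double, int) is better for the second
  g(1, 2.0);   // fine: g(int, double) is at least as good for every argument
}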


2 Answers

The two conversion sequences here, A -> B and A -> int, are both user-defined because they operate via functions which you defined.

The rule for ranking user-defined conversion sequences is found in 13.3.3.2 (N3797):

User-defined conversion sequence U1 is a better conversion sequence than another user-defined conversion sequence U2 if they contain the same user-defined conversion function or constructor or they initialize the same class in an aggregate initialization and in either case the second standard conversion sequence of U1 is better than the second standard conversion sequence of U2

These two conversion sequences don't contain the same user-defined conversion function, and they don't initialize the same class in aggregate initialization (since one initializes int).

So it is not true that one sequence ranks above the other; therefore, the call is ambiguous.
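
To contrast with the question's code, here is a sketch of my own (the names A2 and h are invented for the example) where both candidates contain the same user-defined conversion function, so the rule quoted above does break the tie via the second standard conversion sequence:

struct A2 {
  operator int() const { return 0; }  // the only user-defined conversion involved
};

void h(int) {}   // second standard conversion: identity
void h(long) {}  // second standard conversion: int -> long (integral conversion)

int main()
{
  h(A2()); // not ambiguous: h(int) is chosen, because identity beats integral conversion
}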

answered by M.M


so the sequences are -> A() -> B::B(const A&) -> standard conversion (identity conversion)

No! Excerpt from the standard (draft) [over.best.ics] (emphasis mine):

  1. If no conversions are required to match an argument to a parameter type, the implicit conversion sequence is the standard conversion sequence consisting of the identity conversion (13.3.3.1.1).

The conversion in func(A()) is not the identity conversion; it's a user-defined one. Again from the standard, [conv]:

For class types, user-defined conversions are considered as well; see 12.3. In general, an implicit conversion sequence (13.3.3.1) consists of a standard conversion sequence followed by a user-defined conversion followed by another standard conversion sequence.

I think you have a misunderstanding about standard conversions. They have nothing to do with user-defined types/classes. Standard conversions are only for built-in types:

  • lvalue-to-rvalue conversion
  • array-to-pointer conversion
  • function-to-pointer conversion
  • integral promotions
  • floating point promotion
  • integral conversions
  • floating point conversions
  • floating-integral conversions
  • pointer conversions
  • pointer to member conversions
  • boolean conversions
  • qualification conversions

A -> int is not any of these but a user-defined conversion. The standard on user-defined conversions, [class.conv] i.e. 12.3:

Type conversions of class objects can be specified by constructors and by conversion functions. These conversions are called user-defined conversions and are used for implicit type conversions (Clause 4), for initialization (8.5), and for explicit type conversions (5.4, 5.2.9).

You have two user-defined conversion sequences of the same rank (see M.M's answer to know why), so the compiler wants you to disambiguate.
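
If you want the call to compile, one option (just a sketch reusing the question's A, B and func) is to spell out which conversion you mean:

int main()
{
  func(B(A()));                  // explicitly pick the A -> B converting constructor, calls func(B)
  func(static_cast<char>(A()));  // explicitly go A -> int via operator int, then int -> char, calls func(char)
}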

answered by legends2k