I'm primarily a C++ developer, but recently I've been working on a project in C#. Today I encountered some behavior that was unexpected, at least to me, while using object initializers. I'm hoping someone here can explain what's going on.
Example A
public class Foo {
    public bool Bar = false;
}

PassInFoo(new Foo { Bar = true });
Example B
public class Foo {
    public bool Bar = true;
}

PassInFoo(new Foo { Bar = false });
Example A works as I'd expect: the object passed into PassInFoo has Bar set to true. However, in Example B, the object's Bar field is still true inside PassInFoo, despite being assigned false in the object initializer. What could cause the object initializer in Example B to be seemingly ignored?
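For reference, the C# specification defines an object initializer as equivalent to constructing the object and then assigning each listed member, so the initializer assignment always runs after the field initializer. A minimal sketch of what a conforming compiler should do for Example B:

```csharp
using System;

public class Foo {
    public bool Bar = true; // field initializer, as in Example B
}

public class Program {
    static void Main() {
        // `new Foo { Bar = false }` desugars to roughly this:
        Foo temp = new Foo(); // field initializer runs first: Bar == true
        temp.Bar = false;     // object initializer assignment overrides it
        Console.WriteLine(temp.Bar); // a conforming compiler prints: False
    }
}
```

So by the language rules the initializer value must win, regardless of whether it happens to equal the type's default.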
I can confirm this ugly bug in the Unity3d build of Mono (Mono 2.6.5, Unity3d 4.1.2f1, OSX). The compiler seems to mishandle assigning a value type's default value in an object initializer: passing a non-default value such as an int != 0 or (bool)true works fine, but a default value such as (int)0 or (bool)false is silently ignored.
Proof:
using UnityEngine;
using System.Collections;

public class Foo1 {
    public bool Bar = false;
}

public class Foo2 {
    public bool Bar = true;
}

public class Foo1i {
    public int Bar = 0;
}

public class Foo2i {
    public int Bar = 42;
}

public class PropTest : MonoBehaviour {
    void Start() {
        PassInFoo(new Foo1 { Bar = true });  // FOO1: True (OK)
        PassInFoo(new Foo2 { Bar = false }); // FOO2: True (FAIL!)
        PassInFoo(new Foo1i { Bar = 42 });   // FOO1i: 42 (OK)
        PassInFoo(new Foo2i { Bar = 0 });    // FOO2i: 42 (FAIL!)
        PassInFoo(new Foo2i { Bar = 13 });   // FOO2i: 13 (OK)
    }

    void PassInFoo(Foo1 f)  { Debug.Log("FOO1: "  + f.Bar); }
    void PassInFoo(Foo2 f)  { Debug.Log("FOO2: "  + f.Bar); }
    void PassInFoo(Foo1i f) { Debug.Log("FOO1i: " + f.Bar); }
    void PassInFoo(Foo2i f) { Debug.Log("FOO2i: " + f.Bar); }
}
On a non-Unity3d OSX Mono 2.10.11 (mono-2-10/2baeee2 Wed Jan 16 16:40:16 EST 2013) the same tests run fine:
FOO1: True
FOO2: False
FOO1i: 42
FOO2i: 0
FOO2i: 13
EDIT: filed a bug in Unity3d's bug tracker: https://fogbugz.unity3d.com/default.asp?548851_3gh8hi55oum1btda
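Until the bug is fixed, a simple workaround is to skip the object initializer for default values and assign the field in a separate statement, which sidesteps the buggy code path entirely. A sketch using the Foo2 class from above (Console.WriteLine stands in for Debug.Log so it runs outside Unity):

```csharp
using System;

public class Foo2 {
    public bool Bar = true; // field initializer
}

public class Program {
    static void Main() {
        // Workaround: plain assignment instead of an object initializer,
        // so the broken initializer handling is never involved.
        Foo2 f = new Foo2();
        f.Bar = false;
        Console.WriteLine("FOO2: " + f.Bar); // FOO2: False
    }
}
```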