What is the real reason for that limitation? Is it just work that had to be done? Is it conceptually hard? Is it impossible?
Sure, one couldn't use the type parameters in fields, because they are always read-write. But that can't be the answer, can it?
The reason for this question is that I'm writing an article on variance support in C# 4, and I feel that I should explain why it is restricted to delegates and interfaces. Just to reverse the burden of proof.
Update: Eric asked about an example.
What about this (I don't know if it makes sense yet :-)):
public class Lookup<out T> where T : Animal
{
    public T Find(string name)
    {
        Animal a = _cache.FindAnimalByName(name);
        return a as T;
    }
}

var findReptiles = new Lookup<Reptile>();
Lookup<Animal> findAnimals = findReptiles;
The reason for having this in one class could be the cache that is held by the class itself. And please don't give pets of different types the same name!
BTW, this brings me to optional type parameters in C# 5.0 :-)
Update 2: I'm not claiming the CLR and C# should allow this. I'm just trying to understand what led to it not being allowed.
First off, as Tomas says, it is not supported in the CLR.
Second, how would that work? Suppose you have
class C<out T> { ... how are you planning on using T in here? ... }
T can only be used in output positions. As you note, the class cannot have any field of type T because the field could be written to. The class cannot have any methods that take a T, because those are logically writes. Suppose you had this feature -- how would you take advantage of it?
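To make the type-safety problem concrete, here is a non-compiling sketch. The 'out' annotation on a class is hypothetical (C# rejects it), and the Animal, Giraffe, and Tiger types are made up for illustration:

// Hypothetical: C# does not allow 'out' on a class type parameter,
// precisely because members like these would be unsound.
class C<out T>
{
    public T Value;            // a field is readable AND writable
    public void Set(T item)    // T in an input position: logically a write
    {
        Value = item;
    }
}

class Animal { }
class Giraffe : Animal { }
class Tiger : Animal { }

class Demo
{
    static void Main()
    {
        C<Giraffe> giraffes = new C<Giraffe>();
        C<Animal> animals = giraffes;  // covariant conversion, if it were legal
        animals.Set(new Tiger());      // writes a Tiger through the Animal view...
        Giraffe g = giraffes.Value;    // ...and reads it back as a Giraffe. Unsound!
    }
}

This is the same hole that covariant arrays have; the CLR closes it there with a runtime type check on every array store, which is exactly the kind of cost the generics design avoided.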
This would be useful for immutable classes if we could, say, make it legal to have a readonly field of type T; that way we'd massively cut down on the likelihood that it be improperly written to. But it's quite difficult to come up with other scenarios that permit variance in a typesafe manner.
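In the meantime, the usual workaround is to put the variance on an interface and implement it with an ordinary invariant class. A minimal sketch based on the Lookup example from the question (the Dictionary-backed _cache is an assumed stand-in for the question's cache):

using System.Collections.Generic;

class Animal { }
class Reptile : Animal { }

interface ILookup<out T> where T : Animal
{
    T Find(string name);
}

class Lookup<T> : ILookup<T> where T : Animal
{
    // Assumed shape of the cache; not from the original question.
    private readonly Dictionary<string, Animal> _cache = new Dictionary<string, Animal>();

    public T Find(string name)
    {
        Animal a;
        _cache.TryGetValue(name, out a);
        return a as T;  // legal: the Animal constraint makes T a reference type
    }
}

class Use
{
    static void Main()
    {
        ILookup<Reptile> findReptiles = new Lookup<Reptile>();
        ILookup<Animal> findAnimals = findReptiles;  // covariance, now on the interface
        Animal found = findAnimals.Find("rex");
    }
}

The class itself stays invariant, so its internals may freely read and write T; only the read-only surface is exposed covariantly.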
If you have such a scenario, I'd love to see it. That would be points towards someday getting this implemented in the CLR.
UPDATE: See "Why isn't there generic variance for classes in C# 4.0?" for more on this question.
As far as I know, this feature isn't supported by the CLR, so adding it would require significant work on the CLR side as well. I believe co- and contravariance for interfaces and delegates was actually supported by the CLR before version 4.0, so exposing it in C# was a relatively straightforward extension to implement.
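For contrast, a quick sketch of what C# 4.0 did ship (these conversions compile and run as shown; the Animal and Reptile types are just for illustration):

using System;

class Animal { }
class Reptile : Animal { }

class VarianceDemo
{
    static void Main()
    {
        // Covariance: Func<out TResult> -- T only flows out, so this is safe.
        Func<Reptile> makeReptile = () => new Reptile();
        Func<Animal> makeAnimal = makeReptile;

        // Contravariance: Action<in T> -- T only flows in, so this is safe.
        Action<Animal> feedAnimal = a => Console.WriteLine("fed");
        Action<Reptile> feedReptile = feedAnimal;

        feedReptile(makeReptile());
    }
}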
(Supporting this feature for classes would definitely be useful, though!)