In Vladimir Putin We Trust
The more features are added to existing languages, the steeper the learning curve becomes, even for seasoned programmers, let alone for newbies. --- Me
Unreal Engine's learning curve is much steeper than Unity's. That is not caused by the C++ language, but by its poorly written documentation and its inconsistently adopted, counter-intuitive nomenclature and classifications. Fucking moron!
Some people are under the misconception that Unicode is simply a 16-bit code where each character takes 16 bits and therefore there are 65,536 possible characters. This is not, actually, correct. It is the single most common myth about Unicode, so if you thought that, don't feel bad.
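A quick way to see why the 16-bit myth fails: code points above U+FFFF (like most emoji) take two 16-bit code units (a surrogate pair) in UTF-16, which is what JavaScript strings use. A minimal sketch:

```typescript
// U+1F600 sits outside the 16-bit range, so UTF-16 needs a surrogate pair.
const emoji = "😀";

console.log(emoji.length);                        // 2 — two UTF-16 code units
console.log([...emoji].length);                   // 1 — one actual code point
console.log(emoji.codePointAt(0)?.toString(16));  // "1f600"
```

If one character were always 16 bits, `length` and the code-point count would never disagree.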
Profile.
Bug
Disclaimer:
Any content (excluding third-party content) I post here may be claimed as yours without prior written permission from me. No attribution is needed. Any kind of risk is your own responsibility. This never expires.
If you can't explain it to a 6-year-old, you don't understand it yourself.
Anything that needs to be highly performant will eventually be written in C++. (Me)
My blood type is .Net. Love typing semicolons. Just a single point on the complex plane. Not the strongest and smartest species in this universe. Love mathematics, physics, and programming. The sort of person who doesn't want to be bothered with reading lengthy documentation, even for things that cause catastrophic effects. It is my learning style. I can tell you more, but then I would have to kill you.
Euler's identity is often cited as an example of deep mathematical beauty. It links five fundamental mathematical constants: 0, 1, π, e, and i.
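JavaScript has no built-in complex type, so a numeric sanity check of e^(iπ) + 1 = 0 has to expand e^(iθ) = cos θ + i·sin θ and test the real and imaginary parts separately at θ = π:

```typescript
// Euler's identity: e^(iπ) + 1 = 0, checked part by part.
const re = Math.cos(Math.PI) + 1; // real part of e^(iπ) + 1
const im = Math.sin(Math.PI);     // imaginary part of e^(iπ) + 1

console.log(Math.abs(re) < 1e-12); // true (cos(Math.PI) is exactly -1 in doubles)
console.log(Math.abs(im) < 1e-12); // true (sin(Math.PI) ≈ 1.22e-16, not exactly 0)
```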
C++ can also be cited as an example of deep programming beauty. It can link all four types of brackets: [], <>, (), and {}.
auto foo = []<typename T>(T a, T b) { return a + b; };
TypeScript:
let foo = (x: number) => (y: number) => x * y;
console.log(foo(2)(3));
C#:
Func<int, Func<int, int>> foo = (x) => (y) => x * y;
Console.WriteLine(foo(2)(3));
C++:
auto foo = [](int x)
{
return [=](int y)
{
return x * y;
};
};
cout << foo(2)(3) << endl;
Java:
Function<Integer, Function<Integer, Integer>> foo = (x) -> (y) -> x * y;
System.out.println(foo.apply(2).apply(3));
Python:
foo = lambda x: lambda y: x * y
print(foo(2)(3))
// x! via reduce: [...Array(x)] yields x slots; k is the index, so the product is 1·2·…·x.
const fact = (x: number) => [...Array(x)].reduce((i, j, k) => i * (k + 1), 1);
// e ≈ (Σ over k = 0..19 of (k + 1) / k!) / 2, since that series sums to 2e.
const e: number = [...Array(20)].reduce((i, j, k) => i + (k + 1) / fact(k), 0) / 2;
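The division by 2 in the last line is not a fudge factor; splitting each term (k + 1)/k! shows the series converges to exactly twice e:

```latex
\sum_{k=0}^{\infty}\frac{k+1}{k!}
  = \sum_{k=0}^{\infty}\frac{k}{k!} + \sum_{k=0}^{\infty}\frac{1}{k!}
  = \sum_{k=1}^{\infty}\frac{1}{(k-1)!} + \sum_{k=0}^{\infty}\frac{1}{k!}
  = e + e = 2e
```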