Why do Java and C# differ in simple addition?

Tags:

java

c#

I have two snippets, one in Java and one in C#.

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';


byte j = 123;
short k = 4321;

System.out.println(a+b+ca+d+e+f+g+h+ia+j+k);

The Java snippet prints 7101.674,

while the C# version

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';


byte j = 123;
short k = 4321;

Console.WriteLine(a+b+ca+d+e+f+g+h+ia+j+k);

produces a result of 7103.674.

Why are the results off by 2, and which one is correct?

Asked Oct 31 '18 by Alexander Greiner

1 Answer

The difference is in the line

int g = 014;

A leading zero marks an integer literal as octal in Java, so 014 == 12. C# has no octal literals at all, so it parses the same text as decimal: 014 == 14. That accounts for the gap of exactly 2. Neither result is "wrong"; the two languages simply define the literal differently, so if you meant fourteen, drop the leading zero.
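You can confirm the octal interpretation directly in Java (the class name here is just for illustration):

```java
// Demonstrates Java's octal integer literals and the source of the 2-unit gap.
public class OctalLiteralDemo {
    public static void main(String[] args) {
        int g = 014;                 // leading zero => octal in Java: 1*8 + 4 = 12
        System.out.println(g);       // prints 12
        System.out.println(14 - g);  // C# reads "014" as decimal 14, so the gap is 2
    }
}
```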

Answered Oct 06 '22 by Dmitry Bychenko