private static void convert(int x) {
    // assume we've passed in x=640.
    final int y = (x + 64 + 127) & (~127);
    // as expected, y = 768
    final int c = y;
    // c is now 320?!
}
Are there any sane explanations for why the above code would produce the values shown? This method is called from JNI. The x that is passed in is originally a C++ int that is static_cast to a jint, like so: static_cast<jint>(x);
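For context, a minimal native-side call of this shape would look roughly like the sketch below; the class name, method lookup, and error handling here are illustrative assumptions, not taken from the actual project.

#include <jni.h>

// Hypothetical native-side caller. Only the static_cast<jint> step comes
// from the question; the class and method names are placeholders.
void callConvert(JNIEnv* env, int x) {
    jclass cls = env->FindClass("com/example/Converter");       // assumed class name
    if (cls == nullptr) return;                                  // class not found
    jmethodID mid = env->GetStaticMethodID(cls, "convert", "(I)V");
    if (mid == nullptr) return;                                  // method not found
    env->CallStaticVoidMethod(cls, mid, static_cast<jint>(x));   // int passed as jint
}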
In the debugger, with a breakpoint set on the y assignment, I see x=640. Stepping one line, I see y=768. Stepping another line, c=320. Using the debugger, I can manually set c = y and it is correctly assigned 768.
This code is single threaded and runs many times per second and the same result is always observed.
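For reference, the arithmetic alone cannot turn 640 into 320: x + 64 + 127 is 831, and & ~127 clears the low seven bits, giving 768. The standalone check below (plain C++, outside JNI, assuming an ordinary 32-bit int) confirms this:

#include <cstdio>

int main() {
    int x = 640;
    int sum = x + 64 + 127;                  // 831
    int y = sum & ~127;                      // clear the low 7 bits: 831 -> 768
    std::printf("sum=%d y=%d\n", sum, y);    // prints: sum=831 y=768
    return 0;
}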
Update from comments below
This problem has now disappeared entirely after a day of debugging it. I'd blame it on cosmic rays if it didn't happen reproducibly for an entire day. Oddest thing I've seen in a very long time.
I'll leave this question open for a while in case someone has some insight on what could possibly cause this.
Step 1: make sure it is compiled correctly; see the comments under your post.
If needed, here is the same code in C#; with this code it works:
private void callConvert(object sender, EventArgs e)
{
    string myString = Convert.ToString(convert123(640));
    textBox1.Text = myString;
}

private static int convert123(int x)
{
    // assume we've passed in x=640.
    int y = (x + 64 + 127) & (~127);
    // as expected, y = 768
    int c = y;
    // c is now 320?!
    return c;
}
But this is C# code.
And a tip for you: never give your function a name that is already used as a standard name in the language or framework. Convert is used in most languages (System.Convert).