Sign extension from 16 to 32 bits in C

I have to do a sign extension for a 16-bit integer, and for some reason it doesn't seem to be working properly. Could anyone tell me where the bug is in this code? I've been working on it for hours.

int signExtension(int instr) {
    int value = (0x0000FFFF & instr);  // keep only the low 16 bits
    int mask = 0x00008000;             // the sign bit of the 16-bit value
    int sign = (mask & instr) >> 15;   // 1 if the 16-bit value is negative
    if (sign == 1)
        value += 0xFFFF0000;           // fill the upper 16 bits with ones
    return value;
}

The instruction (instr) is 32 bits, and inside it I have a 16-bit number.

asked Jun 02 '11 by Sorin Cioban

2 Answers

What's wrong with:

int16_t s = -890;
int32_t i = s;  // this does the job, doesn't it?
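
A quick sketch to verify the conversion, using only standard headers (the test value is just an illustration):

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    int16_t s = -890;
    int32_t i = s;  // the implicit conversion sign-extends
    printf("%" PRId32 " = 0x%08" PRIX32 "\n", i, (uint32_t)i);  // -890 = 0xFFFFFC86
    return 0;
}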
answered by Nawaz


What's wrong with using the built-in types?

int32_t signExtension(int32_t instr) {
    int16_t value = (int16_t)instr;  // truncate to the low 16 bits
    return (int32_t)value;           // widening back to 32 bits sign-extends
}

or better yet (this might generate a warning if passed an int32_t):

int32_t signExtension(int16_t instr) {
    return (int32_t)instr;
}

or, for that matter, replace signExtension(value) with ((int32_t)(int16_t)value).

You obviously need to include <stdint.h> for the int16_t and int32_t data types.
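
Putting it together for the question's case, here is a minimal sketch, assuming the 16-bit immediate sits in the low half of the instruction word (the example instruction value is made up for illustration):

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

/* Sign-extend the low 16 bits of a 32-bit instruction word. */
int32_t signExtension(int32_t instr) {
    /* the conversion to int16_t wraps on two's-complement targets */
    return (int32_t)(int16_t)(instr & 0xFFFF);
}

int main(void) {
    int32_t instr = 0x1234FC86;  /* low half 0xFC86 is -890 as int16_t */
    printf("%" PRId32 "\n", signExtension(instr));  /* prints -890 */
    return 0;
}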

answered by CAFxX