
Print an int in binary representation using C

I'm looking for a function to allow me to print the binary representation of an int. What I have so far is:

```c
char *int2bin(int a) {
    char *str, *tmp;
    int cnt = 31;
    str = (char *) malloc(33); /* 32 + 1, because it's a 32 bit bin number */
    tmp = str;
    while (cnt > -1) {
        str[cnt] = '0';
        cnt--;
    }
    cnt = 31;
    while (a > 0) {
        if (a % 2 == 1) {
            str[cnt] = '1';
        }
        cnt--;
        a = a / 2;
    }
    return tmp;
}
```

But when I call

```c
printf("a %s", int2bin(aMask)); // aMask = 0xFF000000
```

I get output like:

0000000000000000000000000000000000xtpYy (and a bunch of unknown characters).

Is it a flaw in the function or am I printing the address of the character array or something? Sorry, I just can't see where I'm going wrong.

NB The code is from here

EDIT: It's not homework FYI, I'm trying to debug someone else's image manipulation routines in an unfamiliar language. If however it's been tagged as homework because it's an elementary concept then fair play.

asked Jun 21 '09 by gav

2 Answers

Here's another, more efficient option where you pass in a buffer you've allocated yourself. Make sure it's the correct size.

```c
// buffer must have room for 32 digits plus the terminating '\0',
// i.e. at least 33 bytes.
// Write to the buffer backwards so that the binary representation
// is in the correct order, i.e. the LSB is on the far right
// instead of the far left of the printed string.
char *int2bin(int a, char *buffer, int buf_size) {
    buffer += (buf_size - 1);

    for (int i = 31; i >= 0; i--) {
        *buffer-- = (a & 1) + '0';
        a >>= 1;
    }

    return buffer;
}

#define BUF_SIZE 33

int main() {
    char buffer[BUF_SIZE];
    buffer[BUF_SIZE - 1] = '\0';

    int2bin(0xFF000000, buffer, BUF_SIZE - 1);

    printf("a = %s", buffer);
}
```
answered Sep 21 '22 by Adam Markowitz


A few suggestions:

  • null-terminate your string
  • don't use magic numbers
  • check the return value of malloc()
  • don't cast the return value of malloc()
  • use binary operations instead of arithmetic ones as you're interested in the binary representation
  • there's no need for looping twice

Here's the code:

```c
#include <stdlib.h>
#include <limits.h>

char *int2bin(int i)
{
    size_t bits = sizeof(int) * CHAR_BIT;

    char *str = malloc(bits + 1);
    if (!str) return NULL;
    str[bits] = 0;

    // type punning because signed shift is implementation-defined
    unsigned u = *(unsigned *)&i;
    for (; bits--; u >>= 1)
        str[bits] = u & 1 ? '1' : '0';

    return str;
}
```
answered Sep 22 '22 by Christoph