how to determine object code size on Linux when "size" gives the wrong answer?

Tags: gcc, binutils

I want to know precisely how much object code is generated by GCC for each of a collection of compilation units, but I'm having an odd problem where the "size" command from binutils is not giving the correct result.

Let's take a C file containing only this function:

int foo (int a, int b)
{
  return a+b;
}

We can compile it and check the object code size using both "size" and "objdump":

$ gcc -O foo.c -c
$ size foo.o
   text    data     bss     dec     hex filename
     52       0       0      52      34 foo.o
$ objdump -d foo.o

foo.o:     file format elf64-x86-64


Disassembly of section .text:

0000000000000000 <foo>:
   0:   8d 04 37                lea    (%rdi,%rsi,1),%eax
   3:   c3                      retq   

From the objdump output, it is clear that the actual object code is only 4 bytes. However, size reports 52 bytes, which cannot be right if all I care about is the code.
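A quick way to see where the extra bytes come from is size's per-section mode, -A (a.k.a. --format=sysv), which lists each section individually instead of summing them into the Berkeley "text" column. The output below is illustrative; the exact section list and sizes vary by compiler version, but for this object .text should show 4 bytes, with .eh_frame accounting for the other 48 that get folded into the "text" total:

$ size -A foo.o
foo.o  :
section           size   addr
.text                4      0
.eh_frame           48      0
...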

Using the "-D" option to objdump, it looks like the exception-handling data (and maybe some other sections) is being measured by "size" and added to the size of the code I actually care about. Does anyone know a relatively straightforward way to get size to ignore these extras?
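One way to make the numbers line up is to suppress the unwind tables at compile time with GCC's -fno-asynchronous-unwind-tables. Whether that removes every extra section depends on the target and compiler version, but with no .eh_frame emitted, size's text column should drop to the 4 bytes of actual code:

$ gcc -O -fno-asynchronous-unwind-tables foo.c -c
$ size foo.o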

asked Mar 18 '13 by John Regehr

1 Answer

Do you have to stick with size? It has many issues similar to what you ran into, so I usually use this readelf snippet:

# Object file and section to measure
OBJ=foo.o
SEC=.text

# -S lists section headers, -W keeps each header on one line; sed strips
# the "[Nr]" index column so the section name becomes field 1 and the
# hex size becomes field 5.
readelf -SW "$OBJ" \
  | sed 's/^ *\[[0-9 ]*\] *//' \
  | awk '
    /NOBITS/ { next; }                      # skip sections with no file contents (e.g. .bss)
    /^'$SEC'\>/ { s += strtonum("0x" $5); } # strtonum is a gawk extension
    END { print s }'
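For the foo.o from the question, this prints 4, counting only the .text bytes. Two details worth noting: the NOBITS guard matters when SEC names a section such as .bss, whose reported size is not backed by any file contents, and the \> word boundary means the pattern also picks up per-function sections like .text.foo if the object was built with -ffunction-sections.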
answered Sep 29 '22 by yugr