I'm trying to detect array declarations and build a symbol value table for statically sized arrays, containing name/array-size pairs. I have several questions.

Given an instruction such as

%a = alloca [200 x i8], align 16

how can I extract "a", the name of the array, from it?

I'm also trying to extract the 200 as the array size, but this code:
if (AllocaInst *allocInst = dyn_cast<AllocaInst>(&*I)) {
    PointerType *p = allocInst->getType();
    if (p->getElementType()->isArrayTy()) {
        Value *v = allocInst->getOperand(0);
        errs() << *v;
    }
}
yields i32 1 when I print v. Does anyone know why that is? I didn't think there was anything 32-bit about this, except maybe the address.
Some of the answers are in the comments, but here is a fuller explanation.
There are two sources of size in an alloca: the size of the allocated type, and the number of elements of that type which are allocated. If you don't specify a number explicitly, you get the implicit default of allocating a single element. That default is the i32 1 value you get out of operand #0. If the allocated type is an array type (use dyn_cast<...> to test for this, as cast<...> will assert), then you also need to account for that size.
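In terms of the code in the question, a minimal sketch of reading both sources (this assumes the same pre-opaque-pointer LLVM API the question already uses, where PointerType::getElementType() is still available):

if (AllocaInst *allocInst = dyn_cast<AllocaInst>(&*I)) {
    // Operand #0 is the explicit element count -- this is the "i32 1" being printed.
    Value *count = allocInst->getOperand(0);
    errs() << "explicit element count: " << *count << "\n";

    // The allocated type is what actually carries the [200 x i8] size.
    PointerType *p = allocInst->getType();
    if (ArrayType *arrTy = dyn_cast<ArrayType>(p->getElementType())) {
        errs() << "array element count: " << arrTy->getNumElements() << "\n"; // 200
    }
}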
In LLVM, the optimizer canonicalizes an alloca with a static element count greater than one into an alloca of a single array with that many elements, which is why you will most often see the explicit count as the constant 1.
There are friendlier APIs for this as well: http://llvm.org/docs/doxygen/html/classllvm_1_1AllocaInst.html

In particular, AllocaInst::getArraySize() will get you the number of elements (usually 1), and AllocaInst::getAllocatedType() will get you the type of the allocated element (sometimes an array type).
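As a sketch of how those accessors could feed the name/size table described in the question (the StringMap, the helper name, and the ConstantInt check for the explicit count are illustrative choices, not part of any prescribed recipe):

#include "llvm/ADT/StringMap.h"
#include "llvm/IR/Constants.h"
#include "llvm/IR/DerivedTypes.h"
#include "llvm/IR/Instructions.h"
#include "llvm/Support/raw_ostream.h"
using namespace llvm;

// Hypothetical table: array name -> number of elements.
static StringMap<uint64_t> ArraySizes;

static void recordStaticArray(AllocaInst *allocInst) {
    ArrayType *arrTy = dyn_cast<ArrayType>(allocInst->getAllocatedType());
    if (!arrTy)
        return;                                  // not an array alloca

    uint64_t size = arrTy->getNumElements();     // e.g. 200 for [200 x i8]

    // Also account for the explicit element count; after canonicalization it
    // is usually the constant 1, but check rather than assume.
    if (ConstantInt *count = dyn_cast<ConstantInt>(allocInst->getArraySize()))
        size *= count->getZExtValue();

    if (allocInst->hasName())                    // names are not guaranteed
        ArraySizes[allocInst->getName()] = size;
}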
Finally, a note about using the name: LLVM doesn't make any guarantees about the names of instructions. Various parts of the optimizer will destroy or change them, so be careful relying on names in production code; you may be surprised when they go away.
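A defensive pattern, for illustration (hasName() and getName() are standard Value accessors; the fallback text is just a placeholder):

if (allocInst->hasName())
    errs() << allocInst->getName() << "\n";
else
    errs() << "<unnamed alloca>\n";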