https://gcc.gnu.org/onlinedocs/libstdc++/manual/using_dual_abi.html
I ran into crashes and Valgrind errors when using std::string on GCC 5. The link above explains that libstdc++ changed its ABI starting with GCC 5.x: the new default ABI conforms to C++11/14 and is not compatible with the older one. The older ABI can still be selected with a preprocessor define.
I am trying to understand what the difference between the two ABIs actually is, and I haven't found details. I'd like help understanding what changed and how to diagnose problems caused by it.
More details on the issue I ran into (https://github.com/YasserAsmi/jvar/issues/21): the project worked fine with GCC 4.8 and with Clang. With GCC 5, the same code refuses to run:
x_misc(33112,0x7fff728c2000) malloc: *** error for object 0x7fd639c034cc: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
Abort trap: 6
And here is a partial Valgrind output:
==33027== Invalid read of size 1
==33027== at 0x1006F78BA: _platform_memmove$VARIANT$Nehalem (in /usr/lib/system/libsystem_platform.dylib)
==33027== by 0x100009388: jvar::Variant::toString[abi:cxx11]() const (in bin/ex_misc)
==33027== by 0x1000023A7: bugreport() (in bin/ex_misc)
==33027== by 0x1000133B8: main (in bin/ex_misc)
The project uses std::string and has some custom memory management. It performs some non-typical but valid operations, e.g. construction via placement new. As a place to start, I am trying to understand what kind of code is affected by the ABI change and how to fix it.
These details are defined as the compiler Application Binary Interface, or ABI. From GCC version 3 onwards the GNU C++ compiler uses an industry-standard C++ ABI, the Itanium C++ ABI. The GNU C++ compiler, g++, has a compiler command line option to switch between various different C++ ABIs.
Clang is, modulo bugs, fully C++ ABI compatible with GCC on Unix systems (they both follow the intervendor Itanium ABI). But make sure you use the same standard library for all components, because libstdc++ and libc++ are different implementations with completely different object layouts.
The old std::string was not compliant with C++11, because that standard effectively prohibits a copy-on-write (COW) implementation. There was no way to make std::string compliant without breaking the ABI, so the implementers broke it, while providing a way to switch back to the non-compliant version for ABI compatibility.
Yes.
Make sure all the translation units in your program use the same value of _GLIBCXX_USE_CXX11_ABI and you should be fine. If you mix values across translation units you will definitely have problems. You might be OK with different values in translation units that never pass std::strings to each other.
There is an interesting difference between COW and non-COW strings, for example:
#include <cstdio>
#include <string>

std::string some_string = "hello";

std::string foo()
{
    return some_string;
}

int main()
{
    char const *s = foo().c_str();
    printf("%s\n", s);
}
This happens to work with COW strings: the copy returned by foo() shares its buffer with some_string, so the pointer returned by c_str() stays valid after the temporary is destroyed. With a non-COW string, s is invalidated as soon as the temporary std::string returned by foo() is destroyed, at the end of the full expression.
Given the sample in your bug report, that is the direction you should look into.