
Using Visual Studio 2010, how can one link to a DLL generated by Visual Studio 2008?

My questions are:

  • Is it possible to link to VS2008 generated DLLs using VS2010?

  • If not, why does it seem to be possible to link to static libraries generated by VS2008?

  • I see that VS2010 now has a Platform Toolset option. But will that let people set it to v90 instead of v100 even though they don't have VS2008 installed?

  • Even though I use the /Z7 compiler switch, why do I still need a .pdb to debug DLLs?

The Details

I can use Visual Studio 2010 to link to my Leptonica C static libraries generated by Visual Studio 2008 without any problems. (See the References section below for details on how I build Leptonica and link to it.)

However, when I try to link the same program (leptonlib-1.67\prog\ioformats_reg.c) with my VS2008-generated DLL version of Leptonica, the program crashes. Debugging, I can see that the problem is that ioformats_reg.c does this:

fp = fopen(filename, "rb"); /* in ioformats_reg.c */

and soon thereafter, inside leptonlib.dll, the following call is made, which crashes:

rewind(fp);                 /* in leptonlib.dll */

How to link with the correct C Run-Time (CRT) library says:

A reusable library and all of its users should use the same CRT library types and therefore the same compiler switch...

If you do choose to mix CRT libraries, remember that you have two separate copies of the CRT, with separate and distinct states, so you must be careful about what you try to do across a CRT-boundary. There are many ways to get into trouble with two CRTs. Here are just a few:

  • There are two separate heaps. You cannot allocate (explicitly with new, malloc, or so on -- or implicitly with strdup, strstreambuf::str, or so on), and then pass the pointer across a CRT-boundary to be freed.
  • You cannot pass a FILE* or file handle across a CRT-boundary and expect the "stdio low-level IO" to work.
  • You cannot set the locale in one and expect the other's locale to be set.

Beginning with Visual C++ 4.0, the linker will issue a warning (LNK4098) if a resulting module attempts to combine more than one copy of the CRT library. For more information, search the Help file for LNK4098.

But I do not get any LNK4098 error messages from the VS2010 linker.

Leptonica uses fopen(), rewind(), fclose(), etc., which the documentation categorizes as Stream I/O rather than "low-level IO", but those functions do pass FILE pointers around. I suppose this is what Microsoft means by "stdio low-level IO".
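
To make the failure mode concrete, here is a minimal sketch (the names dll_rewind and test.bmp are made up for illustration; this is not Leptonica code). The FILE* is created by the EXE's CRT but handed to the DLL's different CRT, which knows nothing about it:

/* dllside.c -- built with VS2008 and /MDd, so it uses msvcr90d.dll */
#include <stdio.h>

__declspec(dllexport) void dll_rewind(FILE *fp)
{
    rewind(fp);    /* fp was not created by this CRT: undefined behavior */
}

/* exeside.c -- built with VS2010 and /MDd, so it uses msvcr100d.dll */
#include <stdio.h>

__declspec(dllimport) void dll_rewind(FILE *fp);

int main(void)
{
    FILE *fp = fopen("test.bmp", "rb");   /* FILE* owned by msvcr100d */
    if (fp != NULL) {
        dll_rewind(fp);                   /* crashes, like rewind() in leptonlib.dll */
        fclose(fp);
    }
    return 0;
}

With both halves linked against the same CRT this sketch would run fine; it is the two separate copies of CRT state that break it.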

/MD, /MT, /LD (Use Run-Time Library) says:

All modules passed to a given invocation of the linker must have been compiled with the same run-time library compiler option (/MD, /MT, /LD).

It doesn't say that all modules have to be compiled by the same version of the compiler. I do use /MD (or /MDd) consistently and correctly for all my modules.

When using DLLs, it appears that the DLLs not only have to use the same /MD switch, but also have to be compiled with VS2010?

My test case seems to indicate that linking with static libraries generated by VS2008 works, but maybe I just got lucky? Why does linking to VS2008 generated static libraries work, while linking to a VS2008 generated DLL doesn't when using VS2010?

Does this mean that I need to ship separate DLLs for use by VS2008 and VS2010 users?


And what about the new Platform Toolset option? Can VS2010 users change that to v90 even though they don't have VS2008? If so, then I could just tell people to change that setting for my Leptonlib-1.67 project.


Finally, I use the /Z7 switch when creating my libraries. The documentation at /Z7, /Zi, /ZI (Debug Information Format) states:

/Z7

Produces an .obj file containing full symbolic debugging information for use with the debugger. The symbolic debugging information includes the names and types of variables, as well as functions and line numbers. No .pdb file is produced.

For distributors of third-party libraries, there is an advantage to not having a .pdb file. However, the .obj files for the precompiled headers are necessary during the link phase, and debugging. If there is only type information (and no code) in the .pch object files, you will also have to compile with /Yl (Inject PCH Reference for Debug Library).

I am not using any precompiled headers. However, it's only when I have the .pdb available that I can debug my Leptonica DLLs. Also, even though it says "No .pdb file is produced," a .pdb is in fact generated with my current project settings. Does having /PDB in my linker options somehow override having specified /Z7 while compiling?

Edit: Also I should mention that I am able to debug the static library version of Leptonica even without any PDB.

References

Leptonica is the open-source C Image Processing Library by Dan Bloomberg, available at http://www.leptonica.com. I provide instructions for building Leptonica using VS2008/VS2010 and also provide Windows binaries.

See http://leptonica.com/vs2008doc/building-leptonlib.html and http://leptonica.com/vs2008doc/building-image-libraries.html for details on how I build the Leptonica libraries. http://www.leptonica.org/vs2008doc/building-prog-dir.html discusses how I link ioformats_reg.

My Leptonica VS2008 Solution is available at http://www.leptonica.com/source/vs2008-1.67.zip. My binary libraries are in the zip file at http://leptonica.com/source/leptonica-1.67-win32-lib-include-dirs.zip. The Leptonica sources are at http://www.leptonica.com/source/leptonlib-1.67.tar.gz

asked Nov 13 '10 by T Powers



2 Answers

The bit I was missing is that when you use a DLL, there are two invocations of the linker: once for the DLL and once for the app that links with the DLL. When you use static libraries there is only one invocation of the linker (creating the static libraries uses LIB, not the linker).

So a DLL links to a C Runtime Library separately from any app that links with that DLL. And problems occur if these two C Runtimes are different.

I can use the VS2010 debugger to look at what modules are loaded via the Debug -> Windows -> Modules window. When I link with the Leptonica static libraries I see msvcrt.dll and msvcr100d.dll. However, when I link with the Leptonica leptonlibd.dll I can see msvcrt.dll, msvcr90d.dll, and msvcr100d.dll.

Running "dumpbin /imports leptonlibd.dll" also shows the reference to msvcr90d.dll.

I would say there are 3 solutions to the problem:

  • People can statically link with Leptonica to avoid the issue entirely.

  • Supply VS2008 and VS2010 versions of leptonlib.dll.

  • Change the Leptonica API so any FILE handles or allocated memory it creates can only be manipulated/freed through the API (a sketch of such wrappers follows this list). I've posted an issue about this at: http://code.google.com/p/leptonica/issues/detail?id=45
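
Here is a hedged sketch of what that third option could look like (the names lept_fopen, lept_fclose, lept_free, and LEPTONLIB_EXPORTS are illustrative, not the current Leptonica API): the DLL exports thin wrappers so that every FILE handle and heap block is created and destroyed by the DLL's own CRT.

/* leptwrap.h -- shared by the DLL and its users (names are hypothetical) */
#include <stdio.h>

#ifdef LEPTONLIB_EXPORTS
#define LEPT_API __declspec(dllexport)
#else
#define LEPT_API __declspec(dllimport)
#endif

LEPT_API FILE *lept_fopen(const char *filename, const char *mode);
LEPT_API int   lept_fclose(FILE *fp);
LEPT_API void  lept_free(void *ptr);

/* leptwrap.c -- compiled into leptonlib.dll */
#include <stdlib.h>
#include "leptwrap.h"

LEPT_API FILE *lept_fopen(const char *filename, const char *mode)
{
    return fopen(filename, mode);   /* FILE* created by the DLL's CRT */
}

LEPT_API int lept_fclose(FILE *fp)
{
    return fclose(fp);              /* closed by the CRT that created it */
}

LEPT_API void lept_free(void *ptr)
{
    free(ptr);                      /* freed by the heap that allocated it */
}

ioformats_reg.c would then call lept_fopen() instead of fopen() before handing the FILE* to the library, and anything the library allocates and returns would be documented as needing lept_free().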

Now that I understand the cause of the problem, I'll also probably supply a VS2010 version of the DLL in my next binary release.


I've decided to not worry about not being able to debug the DLL without a PDB. People who need to debug Leptonica will have the sources and can build their own debug versions of the library (and thus will make a PDB).


I'd still be interested in hearing if it's possible for owners of VS2010 to use the v90 Platform Toolset option even though they don't have VS2008 installed. (But the more I think about it, the more I strongly doubt they can.)

answered by T Powers


You can link to 2008 code using 2010. However, as you have answered for yourself, if you create an object (e.g. a memory block or file handle) using one runtime (2008) and pass it to another runtime (2010) to destroy, you will have problems. The two runtimes are distinct instances, each managing its own heap, so they cannot be used interchangeably: you would be passing memory block pointers to a runtime that has no idea what they are or where they came from.

The solutions are either:

  • make sure that all of these calls happen on one side or the other (so if your dll allocates memory, it should properly encapsulate that process and provide APIs to deallocate it too; a caller-side sketch follows this list). This internalisation is a good library design principle in any case.

  • provide a 2010 build of your dll for 2010 users to link to. This is the easiest solution for everyone as mucking around with linker options just isn't any fun. Forcing people to target their code to an old runtime just so they can use your library can make life impossible (as soon as they want to use another library that does the same thing, they're stuck). Good libraries are compliant and easy to use, not prescriptive and difficult.
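
A caller-side sketch of the encapsulation idea (lib_process_file is a made-up name, not a real Leptonica export): the application hands the DLL a filename rather than a FILE*, so the DLL opens, uses, and closes the file entirely inside its own CRT.

/* Caller side, built with VS2010.  Nothing created by this CRT is ever
 * handed to the DLL, so it does not matter which runtime the DLL uses. */
#include <stdio.h>

/* exported by the (hypothetical) DLL; it does fopen/rewind/fclose internally */
__declspec(dllimport) int lib_process_file(const char *filename);

int main(void)
{
    if (lib_process_file("test.bmp") != 0)
        fprintf(stderr, "processing failed\n");
    return 0;
}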

As for PDBs: compiling and linking are two different processes that are pipelined together. If you change the settings for compiling, you may need to change the settings for linking in a compatible way before the entire pipeline is set up correctly.

answered by Jason Williams