
Tips on how to deploy C++ code to work everywhere

I'm not talking about making portable code. This is more a question of distribution. I have a medium-sized project. It has several dependencies on common libraries (eg openssl, zlib, etc). It compiles fine on my machine and now it's time to give it to the world.

Essentially build engineering at its finest. I want to make installers for Windows, Linux, MacOSX, etc. I want to make a downloadable tar ball that will make the code work with a ./configure and a make (probably via autoconf). It would be icing on the cake to have a make option that would build the installers, maybe even cross-compile so a Windows installer could be built on Linux.

What is the best strategy? Where can I expect to spend the most time? Should the prime focus be autoconf or are there other tools that can help?

asked Mar 21 '10 by User1

People also ask

How to deploy a C++ project in Visual Studio?

On the Visual Studio menu bar, choose File > Recent Projects and Solutions, and then choose to reopen your project. On the menu bar, choose File > New > Project to open the Create a New Project dialog box. In the search box, type "Setup" and from the results list choose Setup Project.

How do you distribute a C++ program?

Visual Studio enables three ways to deploy the Visual C++ libraries together with your application: central deployment, local deployment, and static linking. Central deployment puts the library files under the Windows directory, where all applications can access them automatically.


2 Answers

I would recommend CMake. Advantages:

  • It is very easy to use for building simple and complex projects with static libraries, dynamic libraries, executables and their dependencies.
  • It is platform independent and generates makefiles and/or IDE project files for most compilers and IDEs.
  • It abstracts the differences between Windows and Unix; e.g. "libShared.so" and "Shared.dll" are both referred to as "Shared" (CMake handles the name differences for each platform). If Shared is part of your project, it sorts out the dependency; if not, it assumes it is in the linker path.
  • It investigates the user's system for the compiler and 3rd party libraries that are required; you can then optionally remove components when 3rd party libraries are not available, or display an error message (it ships with macros to find most common 3rd party libraries).
  • It can be run from the command line or with a simple GUI that enables the user to change any of the parameters that were discovered above (e.g. the compiler or the version of a 3rd party library).
  • It supports macros for automating common steps.
  • There is a component called CPack that enables you to create an installer; I think this is just a make install command line thing (I have not used it, but see the sketch below).
  • The CTest component integrates with other unit testing libraries like Boost.Test or Google Test.

I use CMake for everything now, even simple test projects with Visual Studio.
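To give a flavour of what that looks like, here is a minimal sketch of a CMakeLists.txt for a project like the one in the question (the project name, source file and version are placeholders; find_package uses the OpenSSL and ZLIB modules that ship with CMake):

    cmake_minimum_required(VERSION 3.10)
    project(myapp VERSION 1.0 LANGUAGES CXX)

    # Both Find modules ship with CMake and define imported targets.
    find_package(OpenSSL REQUIRED)
    find_package(ZLIB REQUIRED)

    add_executable(myapp src/main.cpp)
    target_link_libraries(myapp PRIVATE OpenSSL::SSL OpenSSL::Crypto ZLIB::ZLIB)

    # "make install" stages the files that CPack will package.
    install(TARGETS myapp RUNTIME DESTINATION bin)

    # "make package" then builds an installer/archive for the current
    # platform (e.g. NSIS on Windows, TGZ/DEB/RPM on Linux).
    include(CPack)

The same file generates Visual Studio solutions on Windows and makefiles on Linux, which is the platform-independence point above.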

I have never used autotools, but a lot of other users have commented that CMake is easier to use. The KDE project moved from autotools to CMake for this reason.

answered Sep 22 '22 by iain


The product that I work on is not too different from this. We use an autoconf-based build system, and it works pretty well.

The place where you'll spend the most time, by far, is supporting users. User systems will have all sorts of wrinkles that you don't expect until they run into them, and you'll need to add more configure options to support them. Over time, we've added options to set the include and lib paths for every library we depend on; we've added options to change compile flags to work around various weird glitches in various versions of those libraries (or API changes from one version to another that need changes in our code); we've added workarounds for the fact that some BLAS libraries use a C interface and some use a Fortran interface, so even though they're theoretically implementations of the same library they do a few things slightly differently; and so on. You can't anticipate all of this in advance, and it all needs documenting so that users can figure out which options to set.
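To illustrate the kind of options I mean (a hedged sketch; AC_ARG_WITH, AS_HELP_STRING and AC_CHECK_LIB are standard autoconf macros, but the option name and error text are just examples), a configure.ac fragment letting users point at a non-standard OpenSSL install might look like:

    # Allows: ./configure --with-openssl=/opt/openssl
    AC_ARG_WITH([openssl],
      [AS_HELP_STRING([--with-openssl=DIR], [prefix of the OpenSSL installation])],
      [CPPFLAGS="$CPPFLAGS -I$withval/include"
       LDFLAGS="$LDFLAGS -L$withval/lib"])

    # Fail early with a hint instead of a cryptic linker error later.
    AC_CHECK_LIB([ssl], [SSL_new],
      [LIBS="-lssl -lcrypto $LIBS"],
      [AC_MSG_ERROR([OpenSSL not found; try --with-openssl=DIR])])

Multiply that by every dependency, compiler quirk and platform, and you get a sense of where the time goes.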

Oh, and installers are really a pain, because they are generally OS-dependent (unless it's just a shell script and you require Cygwin), the locations to install to tend to be OS-dependent, and so forth. That's another area that will take up time, either in building a good installer or in supporting users in setting things up manually.

Setting up cross-compilation is, in my experience, well worth the trouble (at least for the Linux-to-Windows case; not sure about MacOS/X); it's much easier than trying to keep multiple different build systems in sync.
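For the Linux-to-Windows case, that usually means installing a MinGW-w64 cross toolchain and passing its target triplet to configure. A sketch (the package name is the Debian/Ubuntu one; the prefix is arbitrary):

    # Install the cross toolchain (Debian/Ubuntu package name shown).
    sudo apt-get install mingw-w64

    # --host makes autoconf use the x86_64-w64-mingw32-* compilers,
    # producing Windows binaries on a Linux box.
    ./configure --host=x86_64-w64-mingw32 --prefix=/opt/myapp-win
    make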

As an alternate perspective, there's the option that the OpenFOAM project uses for their rather large C++ library, which is to distribute it along with an "approved" G++ compiler and packages for all the other components, so that they don't have to worry about different compilers and so forth. But that really only works on one OS. I guess the Windows/MacOSX version of that is to provide pre-set-up VMware images. In some cases, there's something to be said for that....

answered Sep 26 '22 by Brooks Moses