Say you have 7 core projects in a legacy code base (an enterprise API). The code base has roughly 50 applications that reference one or more of the core projects. Only a couple of the 50 applications still work after a manually done VSS-to-TFS migration went pear-shaped. To get the applications working again, many have been taken out of the enterprise API and placed into their own TFS Project.
I am trying to persuade colleagues that they should not branch the core projects, put the copies into separate TFS Projects, and only merge additions to the core projects back into the enterprise API after a release to PROD. Obviously Continuous Integration will be much harder when integration happens that infrequently.
I am trying to convince colleagues it would be better to take the core projects out of the enterprise API, put them in their own TFS Project, and then reference the bin/Debug output.
Is it better to branch, copy the branch(es) to separate TFS Projects and then merge (and see conflicts at the end), or is it better to encapsulate the core projects and force a team of 20 to use only one copy of each core project?
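For illustration, what I have in mind is a plain file reference in each application pointing at the core project's build output, roughly like this (the path is only an example and depends on how the Team Projects are mapped locally):

    <!-- Example only: file reference from an application to CoreProject1's debug output
         in its own TFS Project; the relative path assumes both Team Projects are
         mapped under the same local root. -->
    <ItemGroup>
      <Reference Include="CoreProject1">
        <HintPath>..\..\CoreProject1\bin\Debug\CoreProject1.dll</HintPath>
      </Reference>
    </ItemGroup>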
It really depends on the maturity of your shared code. I would say there are three approaches you can follow, each with its own pros and cons:
Option 1: Directly reference them from their own Team Project
Team Project Application 1
--> Development
--> ...
--> MAIN
    --> Sources
        --> Application 1
            --> Application1.sln
Team Project Application 2
--> Development
--> ...
--> MAIN
    --> Sources
        --> Application 2
            --> Application2.sln
Team Project CoreProject 1
--> Development
--> ...
--> MAIN
    --> Sources
        --> CoreProject1.csproj
With this approach, you can configure your CI builds so that all Applications start building once you've checked in a change to a CoreProject.
You are bound to have the Team Projects locally mapped following a convention (or else compiling will break).
This is a good approach if you constantly change the CoreProjects and need those changes quickly reflected in all impacted Apps.
It also implies that you can afford instability in a given App, should a breaking change in a CoreProject affect it.
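To make the mapping convention concrete, here is a minimal sketch of what such a cross-Team-Project reference could look like in Application1.csproj (the relative path is purely an assumption, based on mapping all Team Projects side by side under one local root such as C:\TFS):

    <!-- Hypothetical project reference from Application 1 into the CoreProject 1 Team Project.
         It only resolves if every developer and build agent maps the Team Projects
         side by side under the same local root (e.g. C:\TFS). -->
    <ItemGroup>
      <ProjectReference Include="..\..\..\..\CoreProject 1\MAIN\Sources\CoreProject1.csproj">
        <Name>CoreProject1</Name>
      </ProjectReference>
    </ItemGroup>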
Option 2: Indirectly reference them via branching
Team Project Application 1
--> Development
--> ...
--> MAIN
    --> SharedSources
        --> CoreProject1_branch
            --> CoreProject1.csproj
    --> Sources
        --> Application 1
            --> Application1.sln
Team Project Application 2
--> Development
--> ...
--> MAIN
    --> SharedSources
        --> CoreProject1_branch
            --> CoreProject1.csproj
    --> Sources
        --> Application 2
            --> Application2.sln
Team Project CoreProject 1
--> Development
--> ...
--> MAIN
    --> Sources
        --> CoreProject1.csproj
With this approach, each time you check in changes to CoreProject1, you need to organize a merge to each affected Application. This takes some effort, but gives you time to stabilize the CoreProject in its own playground and only then merge it into your Apps.
This approach implies that you also have a build definition for each CoreProject.
In general this is a good way to proceed if you value the stability of the CoreProject and can't afford to 'contaminate' your Apps should changes cause trouble. This is, by the way, the approach we went for.
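With this layout, the reference itself stays inside each application's own Team Project and points at the branched copy; a rough sketch of how that might look in Application1.csproj (folder names are taken from the layout above, the relative path is an assumption):

    <!-- Hypothetical project reference to the branched copy of CoreProject1 inside
         the same Team Project. New CoreProject1 changes only show up here once
         someone merges them from the CoreProject 1 Team Project into CoreProject1_branch. -->
    <ItemGroup>
      <ProjectReference Include="..\..\SharedSources\CoreProject1_branch\CoreProject1.csproj">
        <Name>CoreProject1</Name>
      </ProjectReference>
    </ItemGroup>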
Option 3: Make a file reference in each application
Team Project Application 1
--> Development
--> ...
--> MAIN
    --> SharedBinaries
        --> CoreProject1_branch
            --> CoreProject1.dll
    --> Sources
        --> Application 1
            --> Application1.sln
Team Project Application 2
--> Development
--> ...
--> MAIN
    --> SharedBinaries
        --> CoreProject1_branch
            --> CoreProject1.dll
    --> Sources
        --> Application 2
            --> Application2.sln
Team Project CoreProject 1
--> Development
--> ...
--> MAIN
    --> Sources
        --> CoreProject1.csproj
With this approach, you'd need to check in the binary output of each CoreProject build into each Application.
This is only recommended if you're confident that the CoreProject is stable and you won't need to debug it on a regular basis.
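The reference would then be a plain file reference instead of a project reference; a rough sketch for Application1.csproj (again, the relative path is an assumption based on the layout above):

    <!-- Hypothetical file reference to the checked-in binary. The DLL under
         SharedBinaries is refreshed by checking in the output of a CoreProject1 build,
         so the application never compiles CoreProject1 itself. -->
    <ItemGroup>
      <Reference Include="CoreProject1">
        <HintPath>..\..\SharedBinaries\CoreProject1_branch\CoreProject1.dll</HintPath>
        <Private>True</Private>
      </Reference>
    </ItemGroup>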
Essentially option 2 & option 3 are similar, separated by the well-known debate "project vs. file reference". See here for an SO resource; plenty more can be retrieved via searches.
If your Core Projects are subject to constant change and/or have low unit test coverage, you should go for option 2 (or 3).
If you're confident in their quality, going for option 1 is a good choice, since it will greatly enhance your overall productivity.
Without any knowledge of your products and their quality, simply based on the fairly large numbers you provide (20 people, 50 solutions, 7 core projects), I'd go for option 2/3.
Another important hint: it happens more often than not that shared projects don't have any means of being tested in isolation. If that is the case and the Core Projects have no unit tests of their own, no dedicated test plan etc., there's no point in doing anything other than option 1.
An additional great resource on the matter is the work by the ALM Rangers.