I have split contours in an image and I want to connect them, assuming they are parametrized curves. I was considering some curve fitting or curve tracing, but I do not know how to implement it.
My split contours are exactly 1 px wide:
import itertools
import cv2
# Get all pixel directions
directions = list(itertools.product([-1, 0, 1], repeat=2))
directions.remove((0, 0))
# Load image
img = cv2.imread("image.png", cv2.IMREAD_COLOR)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # imread returns BGR
cv2.imshow("Input", img)
# Find contours
_, thresh = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(thresh, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
I know where the ends of all contours are:
for cnt in contours:
    ends = []
    # Get the ends of the contour
    # - the first point is not always an endpoint of an open contour
    # - the last point is usually not the other endpoint either
    for pix in cnt:
        pixel = tuple(pix[0])
        pixel_x, pixel_y = pixel
        total_neighbours = 0
        # Count pixel neighbours
        for plus_x, plus_y in directions:
            current_x = pixel_x + plus_x
            current_y = pixel_y + plus_y
            if current_y < 0 or current_y >= gray.shape[0]:
                continue
            if current_x < 0 or current_x >= gray.shape[1]:
                continue
            if gray[current_y][current_x]:
                total_neighbours += 1
        # If it is an end pixel
        if total_neighbours == 1:
            ends.append(pixel)
            cv2.circle(img, pixel, 3, (0, 255, 255), 1)
As a human, I know where the curves are and which contours belong to each unique curve:
For reference, here are the raw images:
But I do not know how to connect these split contours programmatically. I tried using the first and second derivatives to predict the connections, but it was not good enough:
# Make contour "predictions" using the first derivative
# - walk from one end of the contour to the other and calculate the first derivative
# - "predict" the connection beyond the second end
for end in ends:
    pixel_x, pixel_y = end

    def predict(pixel_x, pixel_y, was):
        for plus_x, plus_y in directions:
            current_x = pixel_x + plus_x
            current_y = pixel_y + plus_y
            if current_y < 0 or current_y >= gray.shape[0]:
                continue
            if current_x < 0 or current_x >= gray.shape[1]:
                continue
            if (current_x, current_y) not in ends:
                if (current_x, current_y) not in was:
                    if gray[current_y][current_x]:
                        was.append((current_x, current_y))
                        predict(current_x, current_y, was)
            else:
                derivatives_x = []
                derivatives_y = []
                # Calculate the derivative over a 3-pixel stride
                for pix in range(len(was) - 3):
                    x1, y1 = was[pix]
                    x2, y2 = was[pix + 3]
                    derivatives_x.append(x1 - x2)
                    derivatives_y.append(y1 - y2)
                if not derivatives_x:
                    continue
                # Take the last N derivatives and average them
                avg_x = derivatives_x[-20:]
                avg_y = derivatives_y[-20:]
                avg_x = sum(avg_x) / len(avg_x)
                avg_y = sum(avg_y) / len(avg_y)
                # Predict the next 7 pixels past the end
                for i in range(7):
                    pos = round(x2 - avg_x * i), round(y2 - avg_y * i)
                    cv2.circle(img, pos, 1, (0, 0, 255), cv2.FILLED)

    predict(pixel_x, pixel_y, [])

cv2.imshow("First derivative", img)
cv2.waitKey(0)
Does anybody know how to fit some curve/spline/Bézier to it and recognize the curves like a human would?
Here are more binarized source images:
Thanks.
Extrapolation is notoriously vulnerable to initial conditions (in this case, the state of the spline where the cut was made). That being said, I do believe that the first and second derivatives are sufficient for reasonable solutions to these data sets.
I think there are two primary mistakes in your extrapolation code:
(1) First and second derivative calculations on integer pixel data will carry additional noise from the discrete nature of integer space (splines are continuous). This could be solved by downsampling your integer-precision curves (using a voxel grid filter or something similar) into double-precision curves.
(2) You calculate the second derivative as the Cartesian difference between two subsequent direction vectors along the spline. A superior strategy is to take the angular change between subsequent direction vectors instead. In addition to being more accurate, this allows splines to wrap back on themselves while still propagating a lossless curve.
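To illustrate point (2), here is a minimal sketch in Python/NumPy (the language of the question, not of the code below) of what the angular change between subsequent direction vectors looks like; it assumes an already ordered, downsampled curve, and the function name is only illustrative:
import numpy as np

def turning_angles(curve):
    # curve: list of (x, y) points, already ordered and downsampled
    pts = np.asarray(curve, dtype=float)
    d = np.diff(pts, axis=0)                # direction vectors between points
    heading = np.arctan2(d[:, 1], d[:, 0])  # heading of each segment
    dang = np.diff(heading)                 # change in heading ("angular second derivative")
    # wrap into [-pi, pi) so curves can bend back on themselves
    return (dang + np.pi) % (2 * np.pi) - np.pi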
The way that I determined collision was through extrapolation of each curve against each other curve. After each step of extrapolation, three estimates of fitness were assessed to determine a local minimum (and subsequently the best fitness for a curve combination):
(1) Cartesian distance between the endpoints of each curve.
(2) Angular distance between the endpoint directions of each curve (a 180-degree offset is considered optimal).
(3) Cumulative extrapolation distance.
The total fitness is a weighted sum of these three values. Extrapolation of each test candidate continues until the total fitness stops decreasing. After testing every combination of every curve, if the best fitness is below a configurable threshold, those two curves are combined and the process is repeated.
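As a sketch of that weighted fitness in Python (the weights mirror the defaults used in main() below; the function and argument names are only illustrative):
import numpy as np

def combo_distance(end1, dir1, end2, dir2, travelled,
                   w_dist=1.0, w_angle=10.0, w_travel=0.05):
    # end1/end2: (x, y) endpoints after extrapolation; dir1/dir2: unit direction vectors
    dist = np.hypot(end1[0] - end2[0], end1[1] - end2[1])
    # angle between the two end directions, wrapped to [0, pi]; pi (opposing vectors) is optimal
    angle = np.arctan2(dir1[1], dir1[0]) - np.arctan2(dir2[1], dir2[0])
    angle = abs((angle + np.pi) % (2 * np.pi) - np.pi)
    angle_error = abs(np.pi - angle)
    return w_dist * dist + w_angle * angle_error + w_travel * travelled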
Each time ideal candidates for spline merging are determined, both are extrapolated until they meet in the middle. Those extrapolation points are then averaged using sinusoidal weighting to improve the curvature/continuity of the spline. The three sets of points (spline1, bridge, spline2) are then recombined and the spline reformed.
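A minimal sketch of that sinusoidal averaging, assuming both extrapolated runs already have the same number of points and meet in the middle (the C++ mergeSplines below handles the full bookkeeping):
import numpy as np

def blend_bridge(extrap1, extrap2):
    # extrap1: extrapolated points leaving curve 1 towards curve 2
    # extrap2: extrapolated points leaving curve 2 towards curve 1 (reversed to align)
    p1 = np.asarray(extrap1, dtype=float)
    p2 = np.asarray(extrap2, dtype=float)[::-1]
    alpha = np.sin(np.linspace(0.0, np.pi / 2.0, len(p1)))  # sinusoidal weight, 0 -> 1
    return (1.0 - alpha)[:, None] * p1 + alpha[:, None] * p2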
Code (C++11, OpenCV):
#include <stdio.h>
#include <stdexcept>
#include <string>
#include <opencv2/opencv.hpp>
#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif
using namespace std;
using namespace cv;
double sqDist(Point2d p1, Point2d p2)
{
return std::pow(p1.x - p2.x, 2) + std::pow(p1.y - p2.y, 2);
}
vector<Point2d> resampleContour(vector<Point> originalContour, int voxelWidth_px = 3)
{
//quick and dirty voxel filter downsampling to voxelWidth_px spacing
vector<Point2d> outputCurve = vector<Point2d>();
while(originalContour.size()>0)
{
int binX = std::floor(originalContour[0].x / voxelWidth_px);
int binY = std::floor(originalContour[0].y / voxelWidth_px);
int sumX = originalContour[0].x;
int sumY = originalContour[0].y;
int count = 1;
vector<Point> remainingPoints = vector<Point>();
for (int i = 1; i < originalContour.size(); i++)
{
int tempBinX = std::floor(originalContour[i].x / voxelWidth_px);
int tempBinY = std::floor(originalContour[i].y / voxelWidth_px);
if (tempBinX == binX && tempBinY == binY)
{
sumX += originalContour[i].x;
sumY += originalContour[i].y;
count++;
}
else
{
remainingPoints.push_back(originalContour[i]);
}
}
originalContour = remainingPoints;
double aveX = (double)sumX / (double)count;
double aveY = (double)sumY / (double)count;
outputCurve.push_back(Point2d(aveX, aveY));
}
return outputCurve;
}
Point2d getNN(vector<Point2d> cloud, int targetIndex, int &nnIndex, double &dist)
{
nnIndex = -1;
double shortestDist = 0;
for (int i = 0; i < cloud.size(); i++)
{
if (i == targetIndex) { continue; }
double tempDist = sqDist(cloud[targetIndex], cloud[i]);
if (nnIndex == -1 || tempDist < shortestDist)
{
nnIndex = i;
shortestDist = tempDist;
}
}
dist = shortestDist;
return cloud[nnIndex];
}
int getNN(vector<Point2d> cloud, Point2d pt)
{
int nnIndex = -1;
double shortestDist = 0;
for (int i = 0; i < cloud.size(); i++)
{
double tempDist = sqDist(pt, cloud[i]);
if (nnIndex == -1 || tempDist < shortestDist)
{
nnIndex = i;
shortestDist = tempDist;
}
}
return nnIndex;
}
vector<Point2d> getNNs(vector<Point2d> cloud, int targetIndex, int count)
{
Point2d tPt = cloud[targetIndex];
sort(cloud.begin(), cloud.end(), [tPt](const Point2d& p1, const Point2d& p2) {
return sqDist(tPt,p1) < sqDist(tPt, p2);
});
vector<Point2d> outCloud = vector<Point2d>();
for (int i = 1; i <= count && i < cloud.size(); i++) //first point will be same point
{
outCloud.push_back(cloud[i]);
}
return outCloud;
}
Vec2d normalize(Vec2d inVec)
{
double length = sqrt(pow(inVec(0), 2) + pow(inVec(1), 2));
if (length == 0)
{
throw std::runtime_error("cannot normalize a zero-length vector");
}
inVec = inVec / length;
return inVec;
}
double angleBetween(Vec2d vec1, Vec2d vec2, bool directionalFlag = false)
{
vec1 = normalize(vec1);
vec2 = normalize(vec2);
double angle = (atan2(vec1(1), vec1(0)) - atan2(vec2(1), vec2(0)));
if (angle > M_PI) { angle -= 2 * M_PI; }
else if (angle <= -M_PI) { angle += 2 * M_PI; }
if (!directionalFlag)
{ angle = abs(angle); }
return angle;
}
vector<Point> roundPoints(vector<Point2d> cloud)
{
vector<Point> outCloud = vector<Point>();
for (int i = 0; i < cloud.size(); i++)
{
outCloud.push_back(Point(round(cloud[i].x), round(cloud[i].y)));
}
return outCloud;
}
class PointNorm
{
public:
PointNorm() {}
//point at p1 with direction p1-p2
PointNorm(Point2d p1, Point2d p2)
{
X = p1.x;
Y = p1.y;
dir = Vec2d(p1.x - p2.x, p1.y - p2.y);
dir = normalize(dir);
}
PointNorm(double x,double y,double dx,double dy)
{
X = x;
Y = y;
dir = Vec2d(dx, dy);
dir = normalize(dir);
}
double X = 0;
double Y = 0;
Vec2d dir = Vec2d();
static void dif(PointNorm pn1, PointNorm pn2, double& distance, double& angle)
{
distance = sqrt(pow(pn1.X - pn2.X, 2) + pow(pn1.Y - pn2.Y, 2));
angle = angleBetween(pn1.dir, pn2.dir, false);
}
};
vector<Point2d> orderCurve(vector<Point2d> cloud)
{
if (cloud.size() < 3) { return cloud; }
vector<Point2d> outCloud = vector<Point2d>();
int currentIndex = cloud.size() / 2;
double lastDist = -1;
vector<Point2d> tempCloud = cloud;
for (int i = 0; i < cloud.size(); i++)
{
vector<Point2d> nearNeighbors = getNNs(cloud, i, 5); //technically you can increase this count to increase endpoint confidence
bool endFlag = true;
for (int ii = 0; ii < nearNeighbors.size()-1 && endFlag; ii++)
{
for (int iii = 0; iii < nearNeighbors.size() - 1 && endFlag; iii++)
{
if (ii == iii) { continue; }
PointNorm pn1 = PointNorm(cloud[i], nearNeighbors[ii]);
PointNorm pn2 = PointNorm(cloud[i], nearNeighbors[iii]);
double tempAngle = angleBetween(pn1.dir, pn2.dir);
if (tempAngle > M_PI / 2)
{ endFlag = false; }
}
}
if (endFlag)
{
outCloud.push_back(cloud[i]);
break;
}
}
tempCloud = cloud;
currentIndex = getNN(cloud, outCloud[0]);
while (tempCloud.size() > 1)
{
int testIndex = 0;
double testDist;
Point2d tempPoint = getNN(tempCloud, currentIndex, testIndex, testDist);
outCloud.push_back(tempPoint);
tempCloud.erase(tempCloud.begin() + currentIndex);
if (testIndex > currentIndex) { testIndex--; }
currentIndex = testIndex;
}
return outCloud;
}
class ModSpline
{
public:
ModSpline(vector<Point2d> orderedCurve, int dirEstimationOffset, bool _comboCurve = false)
{
//totalCurve = orderedCurve;
totalCurve = vector<Point2d>();
copy(orderedCurve.begin(), orderedCurve.end(), back_inserter(totalCurve));
int end = orderedCurve.size() - 1;
head = PointNorm(orderedCurve[0], orderedCurve[dirEstimationOffset]);//slight offset gives better estimation of direction if last point is noisy
tail = PointNorm(orderedCurve[end], orderedCurve[end- dirEstimationOffset]);
comboCurve = _comboCurve;
}
void stepExtrapolate_Head(int stepSize_px, int derivativeLookback)
{
int tempLookback = derivativeLookback;
if (tempLookback > totalCurve.size() - 1) { tempLookback = totalCurve.size() - 1; }
head.X += head.dir(0) * (double)stepSize_px;
head.Y += head.dir(1) * (double)stepSize_px;
totalCurve.insert(totalCurve.begin(), 1, Point2d(head.X, head.Y));
{
double dirChangeAngle = computeSecondDerivative(0, tempLookback);
head.dir = normalize(rotateByAngle(head.dir, dirChangeAngle * (double)stepSize_px));
}
}
void stepExtrapolate_Tail(int stepSize_px, int derivativeLookback)
{
int tempLookback = derivativeLookback;
if (tempLookback > totalCurve.size() - 1) { tempLookback = totalCurve.size() - 1; }
tail.X += tail.dir(0) * stepSize_px;
tail.Y += tail.dir(1) * stepSize_px;
totalCurve.push_back(Point2d(tail.X, tail.Y));
{
double dirChangeAngle = computeSecondDerivative(totalCurve.size() - 1, totalCurve.size() - (1 + tempLookback));
tail.dir = normalize(rotateByAngle(tail.dir, dirChangeAngle * (double)stepSize_px));
}
}
void stepExtrapolate_Bidirectional(int stepSize_px, int derivativeLookback)
{
stepExtrapolate_Head(stepSize_px, derivativeLookback);
stepExtrapolate_Tail(stepSize_px, derivativeLookback);
}
void stepExtrapolate_SingleDirection(int stepSize_px, int derivativeLookback, bool headFlag)
{
if (headFlag) { stepExtrapolate_Head(stepSize_px, derivativeLookback); }
else { stepExtrapolate_Tail(stepSize_px, derivativeLookback); }
}
static double estimateSplineDistance(ModSpline spl1, ModSpline spl2, int stepSize_px, int derivativeLookback, double distWeight, double angleWeight, double travelWeight)
{
bool dir_1_head, dir_2_head;
getNearestEndpoints(spl1, spl2, dir_1_head, dir_2_head);
//todo: run multiple times with adjusted dir and derivatives
double lastAngle, lastDist, lastComboDist;
PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), lastDist, lastAngle);
lastAngle = abs(M_PI - lastAngle);//looking for opposing vectors;
lastComboDist = lastAngle * angleWeight + lastDist * distWeight;
double totalExtrapolationDistance = 0;
while (true)
{
spl1.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_1_head);
spl2.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_2_head);
totalExtrapolationDistance += (stepSize_px + stepSize_px);
double tempAngle, tempDist, tempComboDist;
PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), tempDist, tempAngle);
tempAngle = abs(M_PI - tempAngle);//looking for opposing vectors;
tempComboDist = tempAngle * angleWeight + tempDist * distWeight +totalExtrapolationDistance* travelWeight;
if (tempComboDist > lastComboDist) { break; }
else
{
lastAngle = tempAngle;
lastDist = tempDist;
lastComboDist = tempComboDist;
}
}
return lastComboDist;
}
static ModSpline mergeSplines(ModSpline spl1, ModSpline spl2, int stepSize_px, int derivativeLookback, int dirEstimationOffset, vector<vector<Point2d>> &debugClouds)
{
bool dir_1_head, dir_2_head;
getNearestEndpoints(spl1, spl2, dir_1_head, dir_2_head);
double lastAngle, lastDist, lastComboDist;
int extrapolationCount = 0;
PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head),lastDist, lastAngle);
while (true)
{
extrapolationCount += 1;
spl1.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_1_head);
spl2.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_2_head);
double tempAngle, tempDist, tempComboDist;
PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), tempDist, tempAngle);
if (tempDist > lastDist) { break; }
else { lastDist = tempDist; }
}
vector<Point2d> outCloud = vector<Point2d>();
vector<Point2d> spline1Cloud = spl1.getTotalCurve();
if (dir_1_head) { reverse(spline1Cloud.begin(), spline1Cloud.end()); }
vector<Point2d> spline2Cloud = spl2.getTotalCurve();
if (!dir_2_head) { reverse(spline2Cloud.begin(), spline2Cloud.end()); }
//spline 1
{
vector<Point2d> debug = vector<Point2d>();
for (int i = 0; i < spline1Cloud.size() - extrapolationCount; i++)
{
outCloud.push_back(spline1Cloud[i]);
debug.push_back(spline1Cloud[i]);
}
debugClouds.push_back(debug);
}
//fused region
{
vector<Point2d> debug = vector<Point2d>();
double theta = 0;
double theta_Step = (M_PI / 2.0) / (double)extrapolationCount;
for (int i = 1; i < extrapolationCount-1; i++)
{
int invI = extrapolationCount - i;
Point2d p1 = spline1Cloud[spline1Cloud.size() - (1 + invI)];
Point2d p2 = spline2Cloud[i];
double alpha = sin(theta);
theta += theta_Step;
//sinusoidal interpolation
Point2d fusedPt = Point2d(alpha*p2.x+(1.0-alpha)*p1.x, alpha * p2.y + (1.0 - alpha) * p1.y);
outCloud.push_back(fusedPt);
debug.push_back(fusedPt);
}
debugClouds.push_back(debug);
}
//spline 2
{
vector<Point2d> debug = vector<Point2d>();
for (int i = extrapolationCount; i < spline2Cloud.size(); i++)
{
outCloud.push_back(spline2Cloud[i]);
debug.push_back(spline2Cloud[i]);
}
debugClouds.push_back(debug);
}
return ModSpline(outCloud,dirEstimationOffset,true);
}
vector<Point2d> getTotalCurve()
{
return totalCurve;
}
int getPtCount() { return totalCurve.size(); }
vector<PointNorm> getEndpoints()
{
vector<PointNorm> endPoints = vector<PointNorm>();
endPoints.push_back(head);
endPoints.push_back(tail);
return endPoints;
}
bool isComboCurve() { return comboCurve; }
Point2d getEndpoint(bool headFlag)
{
if (headFlag) { return Point2d(head.X, head.Y); }
else { return Point2d(tail.X, tail.Y); }
}
PointNorm getEndpointnorm(bool headFlag)
{
if (headFlag) { return head; }
else { return tail; }
}
static void getNearestEndpoints(ModSpline spl1, ModSpline spl2, bool& headFlag1, bool& headFlag2)
{
double dist1 = sqDist(Point2d(spl1.head.X, spl1.head.Y), Point2d(spl2.head.X, spl2.head.Y));
double dist2 = sqDist(Point2d(spl1.head.X, spl1.head.Y), Point2d(spl2.tail.X, spl2.tail.Y));
double dist3 = sqDist(Point2d(spl1.tail.X, spl1.tail.Y), Point2d(spl2.head.X, spl2.head.Y));
double dist4 = sqDist(Point2d(spl1.tail.X, spl1.tail.Y), Point2d(spl2.tail.X, spl2.tail.Y));
double minDist = min(dist1, min(dist2, min(dist3, dist4)));
if (minDist == dist1) { headFlag1 = true; headFlag2 = true; }
else if (minDist == dist2) { headFlag1 = true; headFlag2 = false; }
else if (minDist == dist3) { headFlag1 = false; headFlag2 = true; }
else if (minDist == dist4) { headFlag1 = false; headFlag2 = false; }
}
private:
vector<Point2d> totalCurve;
PointNorm head;
PointNorm tail;
bool comboCurve = false;
double computeSecondDerivative(int startIndex, int endIndex)
{
int increment = 1;
if (endIndex < startIndex) { increment = -1; }
double totAngle = 0;
double totalDistance = 0;
int count = 0;
for (int i = startIndex; i + 2 * increment != endIndex; i += increment)
{
count++;
Point2d p1 = totalCurve[i];
Point2d p2 = totalCurve[i + increment];
Point2d p3 = totalCurve[i + 2 * increment];
Vec2d v1 = Vec2d(p1.x - p2.x, p1.y - p2.y);
Vec2d v2 = Vec2d(p2.x - p3.x, p2.y - p3.y);
double tempAngle = angleBetween(v1, v2, true);
double aveDist = (sqrt(sqDist(p1, p2)) + sqrt(sqDist(p2, p3))) / 2.0;
totalDistance += aveDist;
totAngle += tempAngle;
}
totAngle = totAngle / totalDistance;
return totAngle;
}
static Vec2d rotateByAngle(Vec2d inVec, double angle)
{
Vec2d outVec = Vec2d();
outVec(0) = inVec(0)*cos(angle) - inVec(1)*sin(angle);
outVec(1) = inVec(0)*sin(angle) + inVec(1)*cos(angle);
return outVec;
}
};
vector<Scalar> colorWheel = {
Scalar(255,0,0),
Scalar(0,255,0),
Scalar(0,0,255),
Scalar(255,255,0),
Scalar(255,0,255),
Scalar(0,255,255) };
int colorWheelIndex = 0;
void rotateColorWheel()
{
colorWheelIndex++;
if (colorWheelIndex >= colorWheel.size()) { colorWheelIndex = 0; }
}
int main(int argc, char** argv)
{
//control downsampling and extrapolation (several of these could be static members of ModSpline instead)
const int stepSize_px = 1;//how far each extrapolation step travels
const int derivativeLookback = 15;//how far back along the curve is considered when determining the 2nd derivative
const int dirEstimationOffset = 2;//min 1. determines how much deviations at the ends of curves affect extrapolation (lower equals more impact)
const int voxelWidth = 5; //voxel dimension for averaging initial pixels into double-precision points
//control spline similarity calculation (each of these contributes to a weighted sum of "spline distance")
const double distWeighting = 1.0; //weight of the distance between two splines after optimal extrapolation
const double angleWeighting = 10.0; //weight of the angle between two splines after optimal extrapolation
const double travelWeighting = 0.05; //weight of how far the splines have to travel to connect
const double maxTotalSplineError = 15; //unitless weighted combination of "spline distance"
bool debugFlag = true;// flag to enable or disable debug visualizers
std::string path = "C:/Local Software/voyDICOM/resources/images/SparseCurves6.png";
Mat originalImg = cv::imread(path, cv::IMREAD_GRAYSCALE);
if (debugFlag) { cv::imshow("original", originalImg); }
Mat debugImg = cv::imread(path, cv::IMREAD_COLOR);
Mat bwImg;
threshold(originalImg, bwImg, 150, 255, THRESH_BINARY);
vector<vector<Point> > contours;
findContours(bwImg, contours, RETR_LIST, CHAIN_APPROX_NONE);
vector<vector<Point2d>> dCurves = vector<vector<Point2d>>();
Mat initialCurves = debugImg.clone();
for (int i = 0; i < contours.size(); i++)
{
vector<Point2d> tempDCurve = resampleContour(contours[i], voxelWidth);
if (tempDCurve.size() < 7) { continue; }
tempDCurve = orderCurve(tempDCurve);
dCurves.push_back(tempDCurve);
vector<Point> tempCloud = roundPoints(tempDCurve);
cv::polylines(initialCurves, { tempCloud }, false, colorWheel[colorWheelIndex], 2);
circle(initialCurves, tempCloud[0], 5, Scalar(0, 255, 0), 1);
circle(initialCurves, tempCloud[tempCloud.size() - 1], 5, Scalar(0, 0, 255), 1);
rotateColorWheel();
}
if (debugFlag) { cv::imshow("initialCurves", initialCurves); }
vector<ModSpline> travCurves = vector<ModSpline>();
vector<ModSpline> tempCurves = vector<ModSpline>();
for (int i = 0; i < dCurves.size(); i++)
{
travCurves.push_back(ModSpline(dCurves[i], dirEstimationOffset));
tempCurves.push_back(ModSpline(dCurves[i], dirEstimationOffset));
}
if (debugFlag)
{
for (int i = 0; i < 25; i++)
{
colorWheelIndex = 0;
Mat extrapViewer = debugImg.clone();
for (int ii = 0; ii < tempCurves.size(); ii++)
{
//if (!(ii == 7 || ii == 13)) { continue; }
tempCurves[ii].stepExtrapolate_Bidirectional(stepSize_px,derivativeLookback);
vector<Point2d> tempCloud = tempCurves[ii].getTotalCurve();
for (int iii = 0; iii < tempCloud.size(); iii++)
{
cv::circle(extrapViewer, tempCloud[iii], 1, colorWheel[colorWheelIndex], 1);
}
rotateColorWheel();
}
cv::imshow("extrapolation debug", extrapViewer);
cv::waitKey(100);
}
}
int fusionCounter = 1;
while (true)
{
vector<tuple<int, int>> validMerges = vector<tuple<int, int>>();
for (int i = 0; i < travCurves.size(); i++)
{
for (int ii = 0; ii < travCurves.size(); ii++)
{
if (i == ii) { continue; }
double tempComboDist = ModSpline::estimateSplineDistance(
travCurves[i],
travCurves[ii],
stepSize_px,
derivativeLookback,
distWeighting,
angleWeighting,
travelWeighting);
if (tempComboDist < maxTotalSplineError)
{
validMerges.push_back(tuple<int,int>(i,ii));
}
}
}
if (validMerges.size()>0)
{
vector<int> splineSizes = vector<int>();
for (int i = 0; i < travCurves.size(); i++)
{
splineSizes.push_back(travCurves[i].getPtCount());
}
//sort valid merges by combined spline size (favor larger spline merges before smaller ones)
sort(validMerges.begin(), validMerges.end(), [splineSizes](const tuple<int, int>& spl1, const tuple<int, int>& spl2) {
return splineSizes[get<0>(spl1)] + splineSizes[get<1>(spl1)] > splineSizes[get<0>(spl2)] + splineSizes[get<1>(spl2)];
});
int splineIndex1 = get<0>(validMerges[0]);
int splineIndex2 = get<1>(validMerges[0]);
vector<vector<Point2d>> debugClouds = vector<vector<Point2d>>();
ModSpline newCurve = ModSpline::mergeSplines(
travCurves[splineIndex1],
travCurves[splineIndex2],
stepSize_px,
derivativeLookback,
dirEstimationOffset,
debugClouds);
travCurves.erase(travCurves.begin() + max(splineIndex1, splineIndex2));
travCurves.erase(travCurves.begin() + min(splineIndex1, splineIndex2));
travCurves.push_back(newCurve);
if (debugFlag)
{
Mat debugFusions = debugImg.clone();
cv::polylines(debugFusions, { roundPoints(debugClouds[0]) }, false, Scalar(255, 0, 0), 2);
cv::polylines(debugFusions, { roundPoints(debugClouds[1]) }, false, Scalar(0, 255, 0), 1);
cv::polylines(debugFusions, { roundPoints(debugClouds[2]) }, false, Scalar(0, 0, 255), 2);
cv::imshow("debugFusion"+std::to_string(fusionCounter++), debugFusions);
cv::waitKey(500);
}
}
else
{
break;
}
}
Mat finalCurves = debugImg.clone();
colorWheelIndex = 0;
for (int i = 0; i < travCurves.size(); i++)
{
if (!travCurves[i].isComboCurve()) { continue; }
cv::polylines(finalCurves, { roundPoints(travCurves[i].getTotalCurve()) }, false, colorWheel[colorWheelIndex], 2);
rotateColorWheel();
}
cv::imshow("final curves", finalCurves);
cv::imshow("initialCurves", initialCurves);
cv::imshow("original", originalImg);
cv::waitKey(0);
}
Additional data sets (same parameters across all 3 sets):
Final notes: playing with the variables at the top of the main function allows full control over the specificity of the spline collision detection. The next step for making this more robust would be to build up a large dataset and tune the parameters as necessary to give an optimal solution for each data set. Then, by analyzing all of those individual setting configurations, you should be able to establish center points and confidence ranges for the settings such that a single configuration works across the largest number of test images.