Can the performance of K-means and fuzzy c-means be compared based on the objective functions they optimize?
The objective function in K-means

In K-means, the optimization criterion is to minimize the total squared error between the training samples and their representative prototypes (the cluster centroids). This is equivalent to minimizing the trace of the pooled within-cluster scatter matrix.
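In symbols (standard textbook notation, not taken from the question itself), with μ_j denoting the centroid of cluster C_j:

$$ J_{\text{KM}} = \sum_{j=1}^{k} \; \sum_{x_i \in C_j} \lVert x_i - \mu_j \rVert^2 $$

Minimizing J_KM over the assignments and centroids is exactly the total squared error just described, and it equals the trace of the pooled within-cluster scatter matrix.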
Fuzzy c-means (FCM) is a data clustering technique in which a data set is grouped into N clusters, with every data point in the dataset belonging to every cluster to a certain degree.

The main advantage of fuzzy c-means clustering is that it allows gradual memberships of data points to clusters, measured as degrees in [0, 1]. This gives the flexibility to express that data points can belong to more than one cluster.
Objective-function-based clustering is one way of accomplishing the grouping. In this type of clustering algorithm, there is a function (the objective function) that the algorithm tries to minimize or maximize. The examples or objects to be partitioned into clusters are described by a set of s features.
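For comparison, the FCM objective in its standard (Bezdek) form, with u_ij the degree of membership of point x_i in cluster j, c_j the cluster centers, and m > 1 the fuzzifier exponent:

$$ J_m = \sum_{i=1}^{N} \sum_{j=1}^{c} u_{ij}^{\,m} \, \lVert x_i - c_j \rVert^2, \qquad \text{subject to } \sum_{j=1}^{c} u_{ij} = 1 \text{ for every } i. $$

Setting every u_ij to 0 or 1 recovers the K-means objective above, which is the structural similarity the rest of this answer builds on.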
K-Means clustering and Fuzzy-C Means clustering are very similar in approach. The main difference is that, in Fuzzy-C Means clustering, each point has a weight associated with each cluster, so a point doesn't sit "in a cluster" so much as it has a weak or strong association to the cluster, determined by the inverse distance to the center of the cluster.
Fuzzy-C Means will tend to run slower than K-Means, since it is actually doing more work. Each point is evaluated against each cluster, and more operations are involved in each evaluation: K-Means just needs a distance calculation, whereas Fuzzy-C Means needs to do a full inverse-distance weighting.
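To make the extra work concrete, here is a minimal NumPy sketch of the two assignment steps (the function and variable names are mine, not from any particular library; m is the stiffness/fuzzifier exponent):

```python
import numpy as np

def kmeans_assign(X, centers):
    """Hard assignment: each point goes to its single nearest center."""
    # Pairwise squared distances, shape (n_points, n_clusters).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    # One comparison per point/cluster pair; output is a cluster index per point.
    return d2.argmin(axis=1)

def fcm_memberships(X, centers, m=2.0, eps=1e-12):
    """Soft assignment: inverse-distance membership of every point in every cluster."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    d2 = np.maximum(d2, eps)  # guard against a point sitting exactly on a center
    inv = d2 ** (-1.0 / (m - 1.0))  # inverse-distance weighting, sharpened by the fuzzifier m
    return inv / inv.sum(axis=1, keepdims=True)  # normalize so each row sums to 1
```

Both routines compute every point-to-center distance, but the FCM version additionally performs the exponentiation and the row normalization, which is exactly the extra per-evaluation work described above.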
BTW, the Fuzzy-C-Means (FCM) clustering algorithm is also known as Soft K-Means.
The objective functions are virtually identical, the only difference being the introduction of a vector which expresses the degree to which a given point belongs to each of the clusters. This vector is raised to a "stiffness" exponent (the fuzzifier m) aimed at giving more importance to the stronger connections (and, conversely, at minimizing the weight of weaker ones); incidentally, when the stiffness exponent tends toward 1 from above, the membership vector degenerates into a binary matrix, hence making the FCM model identical to that of K-Means. (In the opposite limit, as the exponent grows large, the memberships flatten toward the uniform 1/k.)
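To see the limit explicitly: the standard FCM membership update, with d_ij = ‖x_i − c_j‖, is

$$ u_{ij} = \left[ \sum_{k=1}^{c} \left( \frac{d_{ij}}{d_{ik}} \right)^{\frac{2}{m-1}} \right]^{-1} $$

As m → 1⁺ the exponent 2/(m−1) → ∞, the nearest center dominates every sum, and each membership row collapses to a 0/1 indicator, i.e. the hard K-Means assignment. As m → ∞ the exponent → 0 and every membership tends to the uniform 1/c.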
I think that, except for some possible issue with clusters which have no points assigned to them, it is possible to emulate the K-Means algorithm with the FCM one by simulating that hard limit, i.e. by introducing a function which changes the biggest value in the membership vector to 1 and zeros out the other values, in lieu of the exponentiation of the vector. This is of course a very inefficient way of running K-Means, because the algorithm then has to perform as many operations as with a true FCM (if only with 1 and 0 values, which does simplify the arithmetic, but not the complexity), as sketched below.
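A sketch of that hardening trick, reusing the hypothetical fcm_memberships from above:

```python
import numpy as np

def harden(U):
    """One-hot each membership row: 1 at the largest entry, 0 elsewhere."""
    hard = np.zeros_like(U)
    hard[np.arange(U.shape[0]), U.argmax(axis=1)] = 1.0
    return hard
```

Plugging harden() in after each membership update makes the FCM loop behave like K-Means, while still paying FCM's per-iteration cost.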
With regard to performance, FCM therefore needs to perform k (i.e. the number of clusters) multiplications for each point, for each dimension (not counting the exponentiation needed to take stiffness into account). This, plus the overhead needed for computing and managing the membership vector, explains why FCM is quite a bit slower than plain K-Means.
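As a rough back-of-the-envelope comparison (n points, k clusters, d dimensions; these bounds are my reading of the above, not a measured benchmark):

$$ \text{K-Means: } O(nkd) \text{ per iteration} \qquad \text{FCM: } O(nkd) + O(nk^2) \text{ per iteration} $$

Both must compute all n·k point-to-center distances; FCM then additionally forms the memberships, which costs O(k²) per point when evaluated naively from the ratio formula (or O(k) extra per point in the factored form used in the sketch above), plus the exponentiations.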
But FCM/Soft-K-Means is less "stupid" than Hard-K-Means when it comes, for example, to elongated clusters (when points otherwise consistent in the other dimensions tend to scatter along one or two particular dimensions), and that's why it's still around ;-)
From my original reply:
Also, I just thought about this but haven't put any "mathematical" thought into it: FCM may converge faster than hard K-Means, somewhat offsetting FCM's bigger computational requirement.
May 2018 edit:
There is actually no reputable research that I could identify which supports my above hunch about FCM's faster rate of convergence. Thank you, Benjamin Horn, for keeping me honest ;-)