I was reading an article on ibm.com/developerworks (can't find the article now) about developing scalable software for the cloud.
Of course the main idea was going stateless: nothing should hold state any more, which was achieved by dropping member data. Every method should get its data through the arguments passed to it.
One example was something like:
class NonScalableCircle {
    // Mutable member data: getDiameter() depends on a radius set by an earlier call.
    private int radius;

    public void setRadius(int radius) {
        this.radius = radius;
    }

    public int getDiameter() {
        return 2 * radius;
    }
}
The explanation for why this was not scalable was that you have to set the radius before you can ask for the diameter: both methods work on the same shared data, so there is an ordering between calls that must be respected for the code to behave correctly.
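For example (a hypothetical sketch, not from the article), two threads sharing one NonScalableCircle can interleave setRadius and getDiameter calls and read each other's radius, so the result depends on scheduling rather than only on the caller's own input:

class SharedStateDemo {
    public static void main(String[] args) throws InterruptedException {
        NonScalableCircle circle = new NonScalableCircle();

        // Thread A sets radius=2 and expects diameter 4; thread B sets radius=10
        // in between, so A may observe 20 instead.
        Thread a = new Thread(() -> {
            circle.setRadius(2);
            System.out.println("A expected 4, got " + circle.getDiameter());
        });
        Thread b = new Thread(() -> circle.setRadius(10));

        a.start();
        b.start();
        a.join();
        b.join();
    }
}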
The scalable example was:
class ScalableCircle {
    // Stateless: the result depends only on the argument passed in.
    public int getDiameter(int radius) {
        return 2 * radius;
    }
}
And of course it's true: stateless code scales far better. Given this, and the fact that OBJECT = data + behavior, my questions are:
Is OOP simply not good for highly concurrent applications? Is OOP going to die and be replaced by Procedural Programming?
Because even as it stands now, a lot of developers use the Anemic Domain Model and put the logic in services, as in the sketch below; there is not much real OOP being done anyway.
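A minimal, hypothetical illustration of that pattern (the class and service names are mine, not from the question): the domain object is just a bag of getters and setters, and all the behavior sits in an essentially procedural service:

// Anemic domain model: Circle holds data only, no behavior.
class Circle {
    private int radius;
    public int getRadius() { return radius; }
    public void setRadius(int radius) { this.radius = radius; }
}

// All the logic lives in a stateless service, which is essentially procedural code.
class CircleService {
    public int diameterOf(Circle circle) {
        return 2 * circle.getRadius();
    }
}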
Scalability is the measure of a system's ability to increase or decrease in performance and cost in response to changes in application and system processing demands. That runtime sense of the word is worth separating from the scalability of a codebase, which is where abstraction and OOP in general shine: large codebases make for messier changes and maintenance, and with encapsulation and abstraction working together, programmers can call objects without needing to open up the in-depth class mechanisms. Functional programs, by contrast, have features such as immutable data and pure functions that make it easier to design for runtime scalability, as in the sketch below. Abstraction, encapsulation, inheritance, and polymorphism remain the four main principles of object-oriented programming.
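To illustrate the functional point with a hedged sketch (Java parallel streams, chosen here only for illustration): because a pure diameter function has no shared state, the same computation can be spread across threads without any coordination:

import java.util.List;
import java.util.stream.Collectors;

class StatelessScaling {
    // A pure function: the result depends only on the argument,
    // so concurrent calls are trivially safe.
    static int diameter(int radius) {
        return 2 * radius;
    }

    public static void main(String[] args) {
        List<Integer> radii = List.of(1, 2, 3, 4, 5);
        List<Integer> diameters = radii.parallelStream()
                .map(StatelessScaling::diameter)
                .collect(Collectors.toList());
        System.out.println(diameters); // [2, 4, 6, 8, 10]
    }
}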
The answer is: nobody knows. There isn't much consensus yet on the "right" way to write serial software, and parallel and concurrent programming is that much more difficult.
The entire key to efficient parallel computation at scale is the distribution of data. So there is an argument to be made that by encapsulating the data too early in the design process, or by taking a data encapsulation that makes sense for a small number of tasks and naively hoping it scales up, you are hurting scalability. Maybe that means OO has one hand tied behind its back when it comes to writing scalable code, or maybe it just means OO, like everything else, requires careful planning to be massively scalable.
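A hypothetical sketch of that point (the partitioning scheme and names are illustrative, not from the answer): if the radii live in one flat array rather than being scattered across many small objects, the data can be split into disjoint slices, and each worker computes over its own slice with no shared mutable state and no locking:

import java.util.Arrays;

class DataDistributionDemo {
    public static void main(String[] args) throws InterruptedException {
        int[] radii = new int[1_000_000];
        Arrays.fill(radii, 3);
        int[] diameters = new int[radii.length];

        int workers = Runtime.getRuntime().availableProcessors();
        Thread[] threads = new Thread[workers];
        int chunk = (radii.length + workers - 1) / workers;

        // Each worker owns a disjoint slice of the data, so no synchronization is needed.
        for (int w = 0; w < workers; w++) {
            int start = w * chunk;
            int end = Math.min(start + chunk, radii.length);
            threads[w] = new Thread(() -> {
                for (int i = start; i < end; i++) {
                    diameters[i] = 2 * radii[i];
                }
            });
            threads[w].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println("first diameter = " + diameters[0]); // 6
    }
}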