I am building a library of functions that will often be reused across the project. Each function is stateless: it takes no parameters at creation and keeps no memory between calls. Some functions use others.
These functions would be passed around as arguments in the rest of the project.
Which of the following approaches is better?
1. Define all functions as global functions in a single module:

def f1(x):
    # use x
    ...

def f2(x):
    # use x and f1
    ...
2. Define all functions as methods of classes, and arrange the classes in a hierarchy based on use:

class F1:
    def __call__(self, x):
        # use x
        ...

f1 = F1()

class F2(F1):
    def __call__(self, x):
        # use x and f1
        ...

f2 = F2()
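Either way, what gets passed around is just a callable. A minimal sketch of what I mean (apply_to_list and the doubling bodies are placeholders, not my real code):

def apply_to_list(fn, values):
    # works the same whether fn is a plain function or a callable instance
    return [fn(v) for v in values]

def f1(x):              # option 1 style: a plain module-level function
    return x * 2

class F1:               # option 2 style: a callable instance
    def __call__(self, x):
        return x * 2

f1_obj = F1()

print(apply_to_list(f1, [1, 2, 3]))      # [2, 4, 6]
print(apply_to_list(f1_obj, [1, 2, 3]))  # [2, 4, 6]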
The reason I even considered option 2 is that some of my functions have something in common. E.g., functions f2, f3, f11 all start by calling f1. I was thinking I might want to do something like this:
class F1:
    def __call__(self, x):
        self.f1(x)
        self.calc(x)

    def f1(self, x):
        # do something
        ...

    # don't define calc here; F1 is an abstract base class

class F2(F1):
    def calc(self, x):
        # do something
        ...

class F3(F1):
    def calc(self, x):
        # do something
        ...
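If F1 really is meant to be abstract, the abc module can make that explicit. A sketch of the same structure (the bodies are placeholders):

from abc import ABC, abstractmethod

class F1(ABC):
    def __call__(self, x):
        self.f1(x)
        self.calc(x)

    def f1(self, x):
        # common step shared by all subclasses
        ...

    @abstractmethod
    def calc(self, x):
        # each subclass must provide this
        ...

class F2(F1):
    def calc(self, x):
        # do something
        ...

f2 = F2()   # fine
# F1()      # would raise TypeError: can't instantiate an abstract class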
Option 1 is a lot simpler; option 2 is needlessly complex.
Another suggestion that may make testing easier:
1.1. Define them all as methods of a single class in one module. Use @staticmethod and @classmethod decorators as appropriate. That can make them easier to substitute with mocks or override with alternate implementations by providing a new class or a subclass later.
spam.py:
class Spam(object):
    @staticmethod
    def f1(x):
        # use x
        ...

    @classmethod
    def f2(cls, x):
        # use x and cls.f1
        ...
This is still more complex than option 1, so you may want to stick with option 1 until you actually need the flexibility described above.
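For example, here is a minimal, self-contained sketch of the override idea (the concrete bodies and FakeSpam are invented for illustration):

class Spam(object):
    @staticmethod
    def f1(x):
        return x + 1

    @classmethod
    def f2(cls, x):
        # goes through cls, so a subclass can swap out f1
        return cls.f1(x) * 2

class FakeSpam(Spam):
    @staticmethod
    def f1(x):
        return 0    # stub replacing the real f1 in tests

print(Spam.f2(3))      # 8, uses the real f1
print(FakeSpam.f2(3))  # 0, f2 picks up the overridden f1 via cls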
If you need several functions to execute some common code in the beginning and/or in the end, you can put the common code in a decorator, as explained here.
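A minimal sketch of the decorator idea (with_f1 and the print bodies are illustrative only):

import functools

def f1(x):
    print("common first step for", x)

def with_f1(func):
    # run the shared prologue (f1) before the wrapped function
    @functools.wraps(func)
    def wrapper(x):
        f1(x)
        return func(x)
    return wrapper

@with_f1
def f2(x):
    print("f2-specific work for", x)

@with_f1
def f3(x):
    print("f3-specific work for", x)

f2(10)   # runs f1(10), then f2's own body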