I know it's not Pythonic to write functions that care about the type of the arguments, but there are cases when it's simply impossible to ignore types because they are handled differently.
Having a bunch of isinstance checks in your function is just ugly; is there any function decorator available that enables function overloads? Something like this:
@overload(str)
def func(val):
    print('This is a string')

@overload(int)
def func(val):
    print('This is an int')
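For reference, the kind of isinstance-based dispatch this is meant to replace might look like the following sketch (just an illustration of the pattern being avoided; func mirrors the name above):

def func(val):
    # Manual type checks -- exactly the boilerplate the decorator should hide.
    if isinstance(val, str):
        print('This is a string')
    elif isinstance(val, int):
        print('This is an int')
    else:
        raise TypeError('Unsupported type: %r' % type(val))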
Update:
Here are some comments I left on David Zaslavsky's answer:

With a few modifications, this will suit my purposes pretty well. One other limitation I noticed in your implementation: since you use func.__name__ as the dictionary key, you are prone to name collisions between modules, which is not always desirable. For example, if I have one module that overloads func, and another completely unrelated module that also overloads func, these overloads will collide because the function dispatch dict is global. That dict should be made local to the module, somehow. And not only that, it should also support some kind of 'inheritance'.

By 'inheritance' I mean this: say I have a module first with some overloads. Then two more modules that are unrelated but each import first; both of these modules add new overloads to the already existing ones that they just imported. These two modules should be able to use the overloads in first, but the new ones that they just added should not collide with each other between modules. (This is actually pretty hard to do right, now that I think about it.)
Some of these problems could possibly be solved by changing the decorator syntax a little bit:
first.py
@overload(str, str)
def concatenate(a, b):
    return a + b

@concatenate.overload(int, int)
def concatenate(a, b):
    return str(a) + str(b)
second.py
from first import concatenate

@concatenate.overload(float, str)
def concatenate(a, b):
    return str(a) + b
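Just to sketch the idea (this is only a rough illustration, not a finished implementation): the attribute-based registration above could be supported by having overload return a dispatcher that owns its registration table and exposes an overload attribute. It dispatches only on positional argument types, and it keeps a single shared table, so it does not attempt the per-module 'inheritance' isolation described earlier.

def overload(*types):
    # Each first use of @overload(...) builds a brand-new dispatcher with
    # its own private table, so unrelated modules can't collide by name.
    table = {}

    def dispatcher(*args):
        key = tuple(type(a) for a in args)
        try:
            impl = table[key]
        except KeyError:
            raise TypeError('no overload for argument types %r' % (key,))
        return impl(*args)

    def add_overload(*sig):
        # Used as @dispatcher.overload(int, int): register another
        # implementation in the same table and keep the same dispatcher
        # bound to the decorated name.
        def register(func):
            table[sig] = func
            return dispatcher
        return register

    dispatcher.overload = add_overload
    return add_overload(*types)

With first.py and second.py written as above, concatenate('a', 'b'), concatenate(1, 2), and (once second has been imported) concatenate(2.5, 'x') would each reach the matching implementation.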
Since Python 3.4, the functools module supports a @singledispatch decorator. (Registering implementations via type annotations, as shown below, requires Python 3.7+; on 3.4–3.6, pass the type to register explicitly.) It works like this:
from functools import singledispatch

@singledispatch
def func(val):
    raise NotImplementedError

@func.register
def _(val: str):
    print('This is a string')

@func.register
def _(val: int):
    print('This is an int')
Usage:
func("test")   # prints 'This is a string'
func(1)        # prints 'This is an int'
func(None)     # raises NotImplementedError
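On Python 3.4–3.6, or if you'd rather not rely on type annotations, the type can be passed to register explicitly; one implementation can also be registered for several types by stacking register calls. A small sketch (the combined number branch is only for illustration):

from functools import singledispatch

@singledispatch
def func(val):
    raise NotImplementedError

@func.register(str)
def _(val):
    print('This is a string')

@func.register(int)
@func.register(float)
def _(val):
    print('This is a number')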
Quick answer: there is an overload package on PyPI which implements this more robustly than what I describe below, although using a slightly different syntax. It's declared to work only with Python 3 but it looks like only slight modifications (if any, I haven't tried) would be needed to make it work with Python 2.
Long answer: In languages where you can overload functions, the name of a function is (either literally or effectively) augmented by information about its type signature, both when the function is defined and when it is called. When a compiler or interpreter looks up the function definition, then, it uses both the declared name and the types of the parameters to resolve which function to access. So the logical way to implement overloading in Python is to implement a wrapper that uses both the declared name and the parameter types to resolve the function.
Here's a simple implementation:
from collections import defaultdict

def determine_types(args, kwargs):
    # Build a hashable key from the types of the positional arguments
    # and the (name, type) pairs of the keyword arguments.
    return (tuple(type(a) for a in args),
            tuple((k, type(v)) for k, v in kwargs.items()))

# Maps a function name to {(arg_types, kwarg_types): implementation}.
function_table = defaultdict(dict)

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        named_func = function_table[func.__name__]
        named_func[arg_types, kwarg_types] = func
        def call_function_by_signature(*args, **kwargs):
            # Dispatch on the runtime types of the actual arguments.
            return named_func[determine_types(args, kwargs)](*args, **kwargs)
        return call_function_by_signature
    return wrap
overload should be called with two optional arguments: a tuple representing the types of all positional arguments, and a tuple of tuples representing the name-type mappings of all keyword arguments. Here's a usage example:
>>> @overload((str, int))
... def f(a, b):
...     return a * b
>>> @overload((int, int))
... def f(a, b):
...     return a + b
>>> print(f('a', 2))
aa
>>> print(f(4, 2))
6
>>> @overload((str,), (('foo', int), ('bar', float)))
... def g(a, foo, bar):
...     return foo*a + str(bar)
>>> @overload((str,), (('foo', float), ('bar', float)))
... def g(a, foo, bar):
...     return a + str(foo*bar)
>>> print(g('a', foo=7, bar=4.4))
aaaaaaa4.4
>>> print(g('b', foo=7., bar=4.4))
b30.800000000000004
Shortcomings of this include
It doesn't actually check that the function the decorator is applied to is even compatible with the arguments given to the decorator. You could write
@overload((str, int))
def h():
    return 0
and you'd get an error when the function was called.
It doesn't gracefully handle the case where no overloaded version exists corresponding to the types of the arguments passed (it would help to raise a more descriptive error)
It distinguishes between named and positional arguments, so something like
g('a', 7, bar=4.4)
doesn't work, even though g('a', foo=7, bar=4.4) does.
All of these could be remedied with enough fiddling, I think. In particular, the issue of name collisions is easily resolved by storing the dispatch table as an attribute of the function returned from the decorator. But as I said, this is just a simple example to demonstrate the basics of how to do it.
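For what it's worth, here's a minimal sketch of that last fix, reusing determine_types from above: each wrapper carries its own table (under a made-up __dispatch_table__ attribute), and the decorator reuses the table from the previous binding of the same name in the defining module instead of a global dict. This only covers module-level functions and isn't battle-tested.

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        # If a function of this name was already decorated in this module,
        # reuse its private dispatch table; otherwise start a fresh one.
        previous = func.__globals__.get(func.__name__)
        named_func = getattr(previous, '__dispatch_table__', {})
        named_func[arg_types, kwarg_types] = func
        def call_function_by_signature(*args, **kwargs):
            return named_func[determine_types(args, kwargs)](*args, **kwargs)
        call_function_by_signature.__dispatch_table__ = named_func
        return call_function_by_signature
    return wrap

With that change, two unrelated modules can each overload a function named f without stepping on each other's dispatch tables.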