I understand what register_buffer does and the difference between register_buffer and register_parameter.
But what is the precise definition of a buffer in PyTorch?
This can be answered by looking at the implementation of Module.register_buffer:
def register_buffer(self, name, tensor):
    if '_buffers' not in self.__dict__:
        raise AttributeError(
            "cannot assign buffer before Module.__init__() call")
    elif not isinstance(name, torch._six.string_classes):
        raise TypeError("buffer name should be a string. "
                        "Got {}".format(torch.typename(name)))
    elif '.' in name:
        raise KeyError("buffer name can't contain \".\"")
    elif name == '':
        raise KeyError("buffer name can't be empty string \"\"")
    elif hasattr(self, name) and name not in self._buffers:
        raise KeyError("attribute '{}' already exists".format(name))
    elif tensor is not None and not isinstance(tensor, torch.Tensor):
        raise TypeError("cannot assign '{}' object to buffer '{}' "
                        "(torch Tensor or None required)"
                        .format(torch.typename(tensor), name))
    else:
        self._buffers[name] = tensor
That is, the buffer's name:

- must be a string: raises if not isinstance(name, torch._six.string_classes)
- cannot contain a . (dot): raises if '.' in name
- cannot be the empty string: raises if name == ''
- cannot clash with an existing attribute of the Module, unless that attribute is already a registered buffer: raises if hasattr(self, name) and name not in self._buffers

and the tensor (guess what?):

- must be a torch.Tensor or None: raises if tensor is not None and not isinstance(tensor, torch.Tensor)
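For illustration, here is a minimal sketch of how those checks surface in practice. The module class MyModule and the buffer name scale are just placeholders for this example:

import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # valid: a plain tensor, stored in self._buffers under "scale"
        self.register_buffer("scale", torch.ones(3))

m = MyModule()

# each of these violates one of the checks listed above
try:
    m.register_buffer("bad.name", torch.zeros(1))  # '.' in name
except KeyError as e:
    print(e)

try:
    m.register_buffer("", torch.zeros(1))          # empty name
except KeyError as e:
    print(e)

try:
    m.register_buffer("scale", [1.0, 2.0, 3.0])    # not a Tensor (or None)
except TypeError as e:
    print(e)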
So a buffer is just a tensor (or None) that passes these checks, registered in the _buffers attribute of a Module.
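As a quick sanity check of that definition, here is an illustrative sketch (the module Norm and the buffer name running_mean are made up for the example). It shows that a buffer ends up in the module's buffers and state_dict but not among its parameters, and that it follows the module when it is cast or moved:

import torch
import torch.nn as nn

class Norm(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))              # learnable -> parameter
        self.register_buffer("running_mean", torch.zeros(dim))   # state, not learnable -> buffer

m = Norm(4)

print(dict(m.named_buffers()).keys())     # dict_keys(['running_mean'])
print(dict(m.named_parameters()).keys())  # dict_keys(['weight'])
print(m.state_dict().keys())              # both 'weight' and 'running_mean' are saved

# buffers are moved/cast together with the module, just like parameters
m = m.to(torch.float64)
print(m.running_mean.dtype)               # torch.float64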