I'm trying to track down the implementation of torch.nn.NLLLoss in the source code. I got as far as a call to torch._C.nll_loss in the function nll_loss in the module torch.nn.functional, but I can't find the place where _C is created. Does anyone have any info on this?
Take a look at A Tour of PyTorch Internals on the PyTorch blog. Relevant excerpt:
PyTorch defines a new package torch. In this post we will consider the ._C module. This module is known as an “extension module” - a Python module written in C. Such modules allow us to define new built-in object types (e.g. the Tensor) and to call C/C++ functions.
The ._C module is defined in torch/csrc/Module.cpp. The init_C() / PyInit__C() function creates the module and adds the method definitions as appropriate. This module is passed around to a number of different __init() functions that add further objects to the module, register new types, etc.
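You can see this directly from an interactive session: since _C is a compiled extension module, there is no _C.py anywhere in the tree, only a shared library. A quick sanity check (the exact file name depends on your platform and Python version):

    import importlib.machinery
    import torch

    # torch._C has no Python source file; it is a compiled extension module,
    # so __file__ points at a shared library (something like
    # _C.cpython-3X-<platform>.so -- the exact name varies per install).
    print(torch._C.__file__)

    # The loader also confirms it was imported as a C extension
    # rather than from Python source.
    print(isinstance(torch._C.__loader__, importlib.machinery.ExtensionFileLoader))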
Part II of that post goes into detail about the build system. In the section on NN modules, it says:
Briefly, let’s touch on the last part of the build_deps command: generate_nn_wrappers(). We bind into the backend libraries using PyTorch’s custom cwrap tooling, which we touched upon in a previous post. For binding TH and THC we manually write the YAML declarations for each function. However, due to the relative simplicity of the THNN and THCUNN libraries, we auto-generate both the cwrap declarations and the resulting C++ code.
The reason we copy the THNN.h and THCUNN.h header files into torch/lib is that this is where the generate_nn_wrappers() code expects these files to be located. generate_nn_wrappers() does a few things:
- Parses the header files, generating cwrap YAML declarations and writing them to output .cwrap files
- Calls cwrap with the appropriate plugins on these .cwrap files to generate source code for each
- Parses the headers a second time to generate THNN_generic.h - a library that takes THPP Tensors, PyTorch’s “generic” C++ Tensor Library, and calls into the appropriate THNN/THCUNN library function based on the dynamic type of the Tensor
Perhaps not that helpful without the context, but I don't think I should copy the entire post here.
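To make the auto-generation idea a bit more concrete, here is a toy sketch. This is not PyTorch's actual cwrap tooling, and the prototype below is a simplified, made-up THNN-style declaration; it only illustrates the general idea of parsing a C header line and emitting a YAML-like declaration that a binding generator could consume.

    import re

    # Simplified, hypothetical THNN-style prototype -- real THNN.h headers
    # have more arguments and macros; this only illustrates the idea.
    header_line = (
        "TH_API void THNN_(ClassNLLCriterion_updateOutput)("
        "THNNState *state, THTensor *input, THIndexTensor *target, THTensor *output);"
    )

    match = re.search(r"THNN_\((\w+)\)\((.*)\);", header_line)
    name, raw_args = match.group(1), match.group(2)

    # Split "THTensor *input" into a (type, name) pair for each argument.
    args = [a.strip().rsplit(" ", 1) for a in raw_args.split(",")]

    # Emit a minimal YAML-like declaration for the function.
    print(f"- name: {name}")
    print("  arguments:")
    for ctype, argname in args:
        print(f"    - {{type: {ctype}, name: {argname.lstrip('*')}}}")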
When I tried to track down the definition of NLLLoss without having read those posts, I ended up at aten/src/THNN/generic/ClassNLLCriterion.c, via aten/src/ATen/nn.yaml. The latter is probably the YAML the second post talks about, but I haven't checked.
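If you just want to locate the hand-off point on your own install, a few lines of introspection help. This is a rough sketch; the exact binding name differs between versions (newer releases call something under torch._C._nn rather than torch._C.nll_loss directly).

    import inspect
    import torch
    import torch.nn.functional as F

    # The wrapper is plain Python, so inspect can locate its source file...
    print(inspect.getsourcefile(F.nll_loss))   # .../torch/nn/functional.py

    # ...and the lines mentioning _C show where Python hands off to the
    # compiled extension.
    src = inspect.getsource(F.nll_loss)
    print([line.strip() for line in src.splitlines() if "_C" in line])

    # From that call onward there is no more Python source to read;
    # the implementation lives in the compiled backend.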
TL;DR: If you want to find out how a function is implemented in C code, you can check the THNN sources on GitHub here:
https://github.com/torch/nn/tree/master/lib/THNN/generic
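And if you want to double-check that you have found the right kernel, the default behaviour (no class weights, reduction "mean") is easy to reproduce by hand: nll_loss just picks out the log-probability of the target class in each row and negates the mean.

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    log_probs = F.log_softmax(torch.randn(4, 3), dim=1)   # (batch, classes)
    target = torch.tensor([0, 2, 1, 2])

    # What the backend kernel computes with the default arguments
    # (no class weights, reduction="mean").
    loss = F.nll_loss(log_probs, target)

    # The same thing written out by hand: negate the log-probability of the
    # target class in each row, then average over the batch.
    manual = -log_probs[torch.arange(4), target].mean()

    print(torch.allclose(loss, manual))   # True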