I'm using the wonderful Sphinx tool to create some documentation and I'm trying to keep the codebase in a modular form by separating chapters of the same part into separate files. (See here for definitions of 'chapter' and 'part'.)
I've tried to do this using two files, test1.rst:
######
Part 1
######
*********
Chapter 1
*********
Section 1
=========
Test Content 1.
and test2.rst:
*********
Chapter 2
*********
Section 2
=========
Test Content 2.
They are included in index.rst
like this:
.. toctree::
   :maxdepth: 2

   test1
   test2
However, when I build, Chapter 2 does not get nested within Part 1. Why is this? Is there any way to achieve the nesting without writing a script that appends the files into a single one, like the example below?
Example:
######
Part 1
######
*********
Chapter 1
*********
Section 1
=========
Test Content 1.
*********
Chapter 2
*********
Section 2
=========
Test Content 2.
reStructuredText is the default plaintext markup language used by Sphinx. This section is a brief introduction to reStructuredText (reST) concepts and syntax, intended to provide authors with enough information to author documents productively.
Since reST does not have facilities to interconnect several documents, or split documents into multiple output files, Sphinx uses a custom directive to add relations between the single files the documentation is made of, as well as tables of contents. The toctree directive is the central element.
.. toctree::
   :maxdepth: 2

   intro
   strings
   datatypes
   numeric
   (many more documents listed here)

This accomplishes two things: tables of contents from all those documents are inserted, with a maximum depth of two, which means one nested heading; and toctree directives in those documents are also taken into account.
It seems like the include directive is what you are looking for. The included file's content is parsed and inserted at the point of the directive.
test1.rst:
######
Part 1
######
*********
Chapter 1
*********
Section 1
=========
Test Content 1.
.. include:: test2.rst
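One caveat worth noting: if test2.rst sits in the Sphinx source directory, Sphinx may also build it as a standalone page (or warn that it is not referenced in any toctree) in addition to including its content in test1.rst. A common way to handle this is Sphinx's ``exclude_patterns`` setting in conf.py; this is a minimal sketch assuming the file lives at the source root, so adjust the path to your layout:

```python
# conf.py
# Keep test2.rst out of the build as its own document, since its
# content is already pulled into test1.rst via the include directive.
exclude_patterns = ['test2.rst']
```

Excluded files are still readable by the include directive, because include works at the file level rather than through the toctree.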