The default Python recursion limit is not high enough to handle complex
books. I'm sometimes receiving this error message:

	sphinx.errors.SphinxParallelError:
	RecursionError: maximum recursion depth exceeded while pickling an object

or this one:

	maximum recursion depth exceeded while calling a Python object

This can happen with very large or deeply nested source files. The
default Python recursion limit of 1000 can be carefully increased in
conf.py with e.g.:

	import sys; sys.setrecursionlimit(1500)

Increase it.

Signed-off-by: Mauro Carvalho Chehab <mchehab@xxxxxxxxxxxxxxxx>
---
 Documentation/conf.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 45a0741b39ed..ff5a5979f5d5 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -19,6 +19,7 @@ import sphinx
 
 # Get Sphinx version
 major, minor, patch = sphinx.version_info[:3]
+sys.setrecursionlimit(5000)
 
 # If extensions (or modules to document with autodoc) are in another directory,
 # add these directories to sys.path here. If the directory is relative to the
-- 
2.9.3
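
A minimal sketch (an assumption, not what the patch above does) of how a
conf.py could raise the recursion limit only when the interpreter's current
limit is lower, using only the standard sys module; the 5000 value mirrors
the patch, the guard is illustrative:

	import sys

	# Raise the recursion limit only if the current one is lower than
	# what the documentation build needs, so an environment that has
	# already configured a larger limit is left untouched.
	if sys.getrecursionlimit() < 5000:
	    sys.setrecursionlimit(5000)

sys.getrecursionlimit() and sys.setrecursionlimit() are standard CPython
calls; with the default limit of 1000 the guard behaves exactly like the
unconditional call in the patch.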