Re: [PATCH] docs: conf.py: increase recursion limit

On 13.04.2017 15:29, Mauro Carvalho Chehab wrote:
> On Thu, 13 Apr 2017 14:55:03 +0200,
> Markus Heiser <markus.heiser@xxxxxxxxxxx> wrote:

>> On 13.04.2017 12:42, Mauro Carvalho Chehab wrote:
>>> The default recursion limit is not good enough to handle
>>> complex books. I'm sometimes receiving this error message:
>>>
>>>     sphinx.errors.SphinxParallelError: RecursionError: maximum recursion depth exceeded while pickling an object
>>>
>>> or this one:
>>>
>>>     maximum recursion depth exceeded while calling a Python object
>>>     This can happen with very large or deeply nested source files.  You can carefully increase the default Python recursion limit of 1000 in conf.py with e.g.:
>>>         import sys; sys.setrecursionlimit(1500)

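(For reference, the kind of change being discussed would sit near the top
of Documentation/conf.py. A sketch following the hint in the error message
above -- the concrete value in the actual patch may differ:)

    import sys

    # The Python default of 1000 is too low for deeply nested doctrees.
    sys.setrecursionlimit(1500)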

>> Is this behavior reproducible?

>> I also observed those errors in the past. But mostly, it turned
>> out that I was the problem ;)
>>
>> Sphinx caches the doctree (they call it "pickling"). On re-compile, this
>> cached doctree is read in, and the changes from the rst-files are merged
>> into it. Afterwards, the updated doctree is pickled (cached) again.
>> This procedure is fragile in some circumstances.
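
(To illustrate the "pickling" part: the RecursionError above is not
Sphinx-specific. Python's pickle walks nested objects recursively, so any
object tree nested deeper than the recursion limit fails the same way.
A minimal sketch, using a plain dict as a stand-in for a docutils node:)

    import pickle

    node = {"body": None}       # stand-in for a doctree node
    for _ in range(2000):       # nest deeper than the default limit of 1000
        node = {"body": node}

    try:
        pickle.dumps(node)
    except RecursionError as exc:
        # prints something like:
        # maximum recursion depth exceeded while pickling an object
        print(exc)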

>> E.g., Sphinx parses every rst-file in Documentation/ and below (every
>> one, not only the files referred to in toctrees), so if you have some
>> copy&paste corpses with rst-files in ... it is all parsed into the
>> doctree, merged and pickled again.
>>
>> E.g., re-compiling on a share without cleaning first, while different
>> Sphinx versions are installed on the hosts using this share. (IMO,
>> merging the cached doctree with changes from the rst-files is not
>> stable across Sphinx versions.)

> I noticed the issue when building the docs with Sphinx 1.4.9 with
> my ABI patches, after adding xref links to it.

Ah, OK .. I hope I will find the time to test your patch in the
next days (maybe there is something wrong with the xrefs, I have
to check).


> What I suspect is that Sphinx uses some sort of fragile recursion
> algorithm to parse cross references.

This could be one reason .. cross refs are resolved by using the (pickled
and updated) doctree. When such a doctree is damaged/fragile, resolving
refs might end in (endless) recursion.
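
(A sketch of why a higher limit would not help in that case -- this is
not Sphinx's actual resolver, just a hypothetical recursive walk: if the
cached doctree is damaged in a way that introduces a cycle, a naive
recursive traversal never terminates, and raising the recursion limit
only postpones the RecursionError:)

    class Node:
        def __init__(self):
            self.children = []

    a, b = Node(), Node()
    a.children.append(b)
    b.children.append(a)        # cycle standing in for a "damaged" doctree

    def walk(node):
        for child in node.children:
            walk(child)         # no 'visited' set -> endless recursion

    try:
        walk(a)
    except RecursionError:
        print("fails at any finite limit")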

>>> Increase it.

>> I recommend increasing the recursion limit only if we have a
>> reproducible scenario.

> Why? Could this make it less stable?

No. Increasing the limit might slow down compilation, but it does not make
it unstable.

Mostly we have some other problem when we need to increase the limit, and
I want to identify/exclude that problem.

FYI: As far as I know, the default limit is usually 1000. The recursion
limit is not about Sphinx, it is about Python's stack:

  https://docs.python.org/3/library/sys.html#sys.setrecursionlimit
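
(E.g., you can inspect how much recursion headroom the interpreter gives
you; the numbers below are typical, not guaranteed:)

    import sys

    def depth(n=0):
        try:
            return depth(n + 1)
        except RecursionError:
            return n

    print(sys.getrecursionlimit())  # 1000 on a typical CPython build
    print(depth())                  # slightly less: some frames are already on the stack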

In my 15 years of Python experience I never needed to increase it (except
for some very cruel and hackish scripts), so I have doubts; it smells like
fixing the symptoms.

--Markus--




[Index of Archives]     [Kernel Newbies]     [Security]     [Netfilter]     [Bugtraq]     [Linux FS]     [Yosemite Forum]     [MIPS Linux]     [ARM Linux]     [Linux Security]     [Linux RAID]     [Samba]     [Video 4 Linux]     [Device Mapper]     [Linux Resources]

  Powered by Linux