Hi!

On 4/7/23 17:06, Eli Zaretskii wrote:
>> Date: Fri, 7 Apr 2023 09:43:19 -0500
>> From: "G. Branden Robinson" <g.branden.robinson@xxxxxxxxx>
>> Cc: alx.manpages@xxxxxxxxx, dirk@xxxxxxxxxxx, cjwatson@xxxxxxxxxx,
>>  linux-man@xxxxxxxxxxxxxxx, help-texinfo@xxxxxxx, groff@xxxxxxx
>>
>> ...which brings me to the other factor, of which I'm more confident: man
>> page rendering times are much lower than they were in Unix's early days.
>>
>> On my system, all groff man pages but one render in between a tenth and
>> a fortieth of a second.  The really huge pages like groff(7),
>> groff_char(7), and groff_diff(7) are toward the upper end of this range,
>> because they are long, at ~20-25 U.S. letter pages when formatted for
>> PostScript or PDF, or have many large tables so the tbl(1) preprocessor
>> produces a lot of output.
>>
>> The outlier is groff_mdoc(7) at just over one-third of a second.
>
> Some people consider 0.1 sec, let alone 0.3 sec, to be long enough to
> be annoying.
>
> Also, did you try with libpng.3 or gcc.1?

$ time man -w gcc | xargs zcat | groff -man -Tutf8 2>/dev/null >/dev/null

real    0m0.406s
user    0m0.534s
sys     0m0.042s

But as others said, I don't really care about the time it takes to
format the entire document, but rather the first 24 lines, which is
more like instantaneous (per your own definition of ~0.05 s).

$ time man -w gcc | xargs zcat | groff -man -Tutf8 2>/dev/null | head -n24 >/dev/null
xargs: zcat: terminated by signal 13

real    0m0.064s
user    0m0.051s
sys     0m0.030s

As a curiosity, mandoc(1) seems to be faster at rendering the entire
document, but slower to "start reading".

$ time man -w gcc | xargs zcat | mandoc >/dev/null

real    0m0.270s
user    0m0.218s
sys     0m0.057s

$ time man -w gcc | xargs zcat | mandoc | head -n24 >/dev/null

real    0m0.136s
user    0m0.119s
sys     0m0.023s

As a disclaimer, I do sometimes care about reading entire documents,
but even in that case, it's not so bad: I can read the few thousand
man pages in the Linux man-pages project in anywhere from a few
seconds to a minute. [1]

>
>> Human subjects need a minimum of about 0.1 second of visual experience
>> or about .01 to .02 second of auditory experience to perceive
>> duration; any shorter experiences are called instantaneous.
>>   -- Encyclopædia Britannica[2]
>
> IME, 0.05 sec of visual experiences is closer to reality.

That's about the time it takes to load the first 24 lines of almost
any page.  gcc(1), which is one of the longest pages I have, takes
0.06 s.  MAX(3), which is one of the shortest I have, takes 0.04 s.

>
> Anyway, I won't argue.

Cheers,
Alex

[1]:  Here's why I do care about the time to load entire pages.  I know
I could optimize this pipeline by calling groff(1) directly, or even
better, mandoc(1), now that I know it's faster for entire documents,
but since I haven't used this function in a long time, I didn't spend
time optimizing it.
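For illustration, here's a minimal, untested sketch of that
optimization; the helper name man_render() is made up.  The final
'man -P cat -l -' stage of man_section() below could pipe into
mandoc(1) directly, as the timings above already do:

man_render()
{
    # Format a man(7) document from standard input for a UTF-8
    # terminal, discarding parser warnings.  A possible replacement
    # for the 'man -P cat -l -' stage of man_section() below.
    mandoc -T utf8 2>/dev/null;
}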
man_lsfunc()
{
    # List the functions declared in the SYNOPSIS of the given manual
    # pages (or of all pages under the given man* directories).
    # (EX_USAGE and sed_rm_ccomments are helpers defined elsewhere.)

    if [ $# -lt 1 ]; then
        >&2 echo "Usage: ${FUNCNAME[0]} <manpage|manNdir>...";
        return $EX_USAGE;
    fi

    for arg in "$@"; do
        man_section "$arg" 'SYNOPSIS';
    done \
    |sed_rm_ccomments \
    |pcregrep -Mn '(?s)^ [\w ]+ \**\w+\([\w\s(,)[\]*]*?(...)?\s*\); *$' \
    |grep '^[0-9]' \
    |sed -E 's/syscall\(SYS_(\w*),?/\1(/' \
    |sed -E 's/^[^(]+ \**(\w+)\(.*/\1/' \
    |uniq;
}

man_section()
{
    # Print the requested section(s) of each man page found under the
    # given path, rendered with man(1); one-line pages (typically link
    # pages) are skipped.

    if [ $# -lt 2 ]; then
        >&2 echo "Usage: ${FUNCNAME[0]} <dir> <section>...";
        return $EX_USAGE;
    fi

    local page="$1";
    shift;
    local sect="$*";

    find "$page" -type f \
    |xargs wc -l \
    |grep -v -e '\b1 ' -e '\btotal\b' \
    |awk '{ print $2 }' \
    |sort \
    |while read -r manpage; do
        (
            sed -n '/^\.TH/,/^\.SH/{/^\.SH/!p}' <"$manpage";

            for s in $sect; do
                <"$manpage" \
                sed -n \
                    -e "/^\.SH $s/p" \
                    -e "/^\.SH $s/,/^\.SH/{/^\.SH/!p}";
            done;
        ) \
        |man -P cat -l - 2>/dev/null;
    done;
}

man_lsfunc() is quite slow, but that's acceptable to me, since I only
run it sporadically.

--
<http://www.alejandro-colomar.es/>
GPG key fingerprint: A9348594CE31283A826FBDD8D57633D441E25BB5