On Mon, Aug 19, 2024 at 13:52:18 +0200, Peter Krempa wrote:
> On Wed, Aug 14, 2024 at 23:40:25 +0200, Ján Tomko wrote:
> > While the parsing is still done by 1K buffers, the results
> > are no longer filtered during the parsing, but the whole JSON
> > has to live in memory at once, which was also the case before
> > the NSS plugin dropped its dependency on libvirt_util.
> >
> > Also, the new parser might be more forgiving of missing elements.
> >
> > Signed-off-by: Ján Tomko <jtomko@xxxxxxxxxx>
> > ---
> >  tools/nss/libvirt_nss_leases.c | 339 ++++++++++-----------------------
> >  1 file changed, 96 insertions(+), 243 deletions(-)

[...]

> > -    while (1) {
> > -        rv = read(fd, line, sizeof(line));
> > +    while (jerr != json_tokener_continue) {
>
> So this passes on first iteration, but ...
>
> > +        rv = read(fd, line, sizeof(line) - 1);
> >          if (rv < 0)
> >              goto cleanup;
> >          if (rv == 0)
> >              break;
> >          nreadTotal += rv;
> >
> > -        if (yajl_parse(parser, (const unsigned char *)line, rv) !=
> > -            yajl_status_ok) {
> > -            unsigned char *err = yajl_get_error(parser, 1,
> > -                                                (const unsigned char*)line, rv);
> > -            ERROR("Parse failed %s", (const char *) err);
> > -            yajl_free_error(parser, err);
> > -            goto cleanup;
> > -        }
> > +        line[rv] = 0;
>
> Why is this needed ...
>
> > +
> > +        jobj = json_tokener_parse_ex(tok, line, rv);
>
> ... this seems to accept a length argument.
>
> > +        jerr = json_tokener_get_error(tok);
>
> ... the docs state:
>
>   A partial JSON string can be parsed. If the parsing is incomplete,
>   NULL will be returned and json_tokener_get_error() will return
>   json_tokener_continue. json_tokener_parse_ex() can then be called with
>   additional bytes in str to continue the parsing.
>
> So unless I'm missing something very important this will not be able to
> parse a file larger than 1k.
> > >      }
> >
> > -    if (nreadTotal > 0 &&
> > -        yajl_complete_parse(parser) != yajl_status_ok) {
> > -        ERROR("Parse failed %s",
> > -              yajl_get_error(parser, 1, NULL, 0));
> > +    if (nreadTotal > 0 && jerr != json_tokener_success) {
> > +        ERROR("Cannot parse %s: %s", file, json_tokener_error_desc(jerr));
> >          goto cleanup;
> >      }

Also, at this point, if no bytes were read the original code would skip any parsing and return nothing. You unconditionally call 'findLeaseInJSON' with the 'jobj' argument being NULL, which in the better case will result in ERROR("parsed JSON does not contain the leases array"); and in the worse case the code will crash.