Hi!

On Tue, Oct 29, 2019 at 9:06 AM Fabio Ugo Venchiarutti <f.venchiarutti@xxxxxxxxx> wrote:
> You can use subqueries and array_agg() to deepen your output tree all
> the way to a stack overflow, a single <whatever>_to_json() call at the
> top will recursively traverse and convert whatever you feed it.

Yes, what you are describing is exactly the sad state of things: the only way to meaningfully retrieve the inline composite types that are created when you aggregate things like that, or when you subselect a set of fields from a table in a sub-query, is to convert the whole thing to JSON and transmit it that way, because that is the only form the client can parse. If you leave it as the raw composite-type encoding, the client cannot parse it correctly in all cases without knowing what types are stored inside the composite values it receives.

But JSON is not a lossless transport format: it does not support the full floating point spec (no Infinity, no NaN), and many field types are simply converted to their string representation, which can be problematic (binary blobs, for example). So no, JSON is a workaround, and it is sad that we have to use it. PostgreSQL seems to be almost there with its support for composite types and nested query results; it is just that the client cannot really parse them out. I mean, why does PostgreSQL even have its own binary format for results, then? It could just transmit everything as JSON. :-) But that does not really work for many data types.

I think RowDescription should be extended to provide full recursive metadata about all the data types involved. That would be the best way to do it.


Mitar

--
http://mitar.tnode.com/
https://twitter.com/mitar_m
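
P.S. A rough sketch of what I mean, using a made-up author/book schema (the JSON outputs are from memory, so take them as approximate):

    -- Hypothetical schema: author(id, name), book(id, author_id, title).

    -- The composite-type version: "books" comes back as an array of
    -- composite values, and the wire protocol does not describe the
    -- fields inside them, so a generic client cannot decode the column:
    SELECT a.id,
           a.name,
           (SELECT array_agg(b)
              FROM book b
             WHERE b.author_id = a.id) AS books
      FROM author a;

    -- The JSON workaround: any client can parse this without extra
    -- metadata, but type fidelity is lost:
    SELECT a.id,
           a.name,
           (SELECT json_agg(json_build_object('id', b.id, 'title', b.title))
              FROM book b
             WHERE b.author_id = a.id) AS books
      FROM author a;

    -- The lossiness: special float values and bytea degrade to JSON strings:
    SELECT to_json('NaN'::float8);        -- "NaN" (a string, not a number)
    SELECT to_json('\xdeadbeef'::bytea);  -- "\\xdeadbeef"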