By default, bytea_output is in hex format.
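Worth noting: hex output renders every byte as two characters, so a 700 MB bytea comes back as roughly 1.4 GB of text, which can already run up against PostgreSQL's 1 GB per-field limit as well as client-side string limits. For reference, a minimal (untested) sketch of checking the session setting from JDBC; the connection URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ShowByteaOutput {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust for your environment.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             Statement st = con.createStatement();
             // Reports 'hex' (the default since PostgreSQL 9.0) or 'escape'.
             ResultSet rs = st.executeQuery("SHOW bytea_output")) {
            if (rs.next()) {
                System.out.println("bytea_output = " + rs.getString(1));
            }
        }
    }
}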
On Tue, 15 Aug, 2023, 12:44 am Ron, <ronljohnsonjr@xxxxxxxxx> wrote:
Did you try changing bytea_output to hex?
On 8/14/23 12:31, Sai Teja wrote:
I am just running a select query to fetch the result.

Query: select id, content_data, name from table_name

Here content_data is a bytea column whose value is more than 700 MB. Even if I run this query in any DB client such as pgAdmin or DBeaver, I face the same error. The query is also being called from Java, but I don't think Java is the issue, since I am able to insert the data successfully. The problem is only with fetching the data, and only for the specific rows that hold a huge volume of data.
Thanks,
Sai
On Mon, 14 Aug, 2023, 10:55 pm Rob Sargent, <robjsargent@xxxxxxxxx> wrote:
On 8/14/23 09:29, Sai Teja wrote:
> Could anyone please suggest ideas to resolve this issue?
>
> I have increased the parameters below, but I'm still getting the same error.
>
> work_mem, shared_buffers
>
> Out of the 70k rows in the table, only the few rows with very large
> content (about 700 MB) hit the issue. I am unable to fetch the data
> for those particular rows.
>
> It would be appreciated if anyone could share insights.
>
> Thanks,
> Sai
>
>
Are you using Java? There's an upper limit on array size, hence also on
String length. You'll likely need to process the output in chunks.
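Something along these lines (untested) might be a starting point. It is only a sketch, assuming a numeric id column and the table/column names from the query above; the connection details, the row id, and the 50 MB chunk size are placeholders. Slicing with substring() on the server keeps each round trip well below any Java array or String limit:

import java.io.FileOutputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FetchByteaInChunks {
    // 50 MB per round trip -- small enough to stay far below array/String limits.
    private static final int CHUNK = 50 * 1024 * 1024;

    public static void main(String[] args) throws Exception {
        long id = 42L; // placeholder row id
        try (Connection con = DriverManager.getConnection(
                 "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             // substring() slices the bytea on the server, so only one chunk
             // at a time ever crosses the wire or lives in the Java heap.
             PreparedStatement ps = con.prepareStatement(
                 "SELECT substring(content_data FROM ? FOR ?) FROM table_name WHERE id = ?");
             OutputStream out = new FileOutputStream("content_data.bin")) {
            int offset = 1; // substring() positions are 1-based
            while (true) {
                ps.setInt(1, offset);
                ps.setInt(2, CHUNK);
                ps.setLong(3, id);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        break; // no such row
                    }
                    byte[] chunk = rs.getBytes(1);
                    if (chunk == null || chunk.length == 0) {
                        break; // read past the end of the value
                    }
                    out.write(chunk);
                    if (chunk.length < CHUNK) {
                        break; // short read: this was the last piece
                    }
                    offset += chunk.length;
                }
            }
        }
    }
}

Running select octet_length(content_data) for the row first tells you how many chunks to expect.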
--
Born in Arizona, moved to Babylonia.