
Re: Fatal Error : Invalid Memory alloc request size 1236252631

Hi Team,

I also tried PostgreSQL Large Objects, following this link, to store and retrieve large files (since bytea was not working):
https://www.postgresql.org/docs/current/largeobjects.html

But I am still unable to fetch the data in one go from a large object:

select lo_get(oid);

Here I'm getting the same error message.

But if I use select data from pg_largeobject where loid = 49374
then I can fetch the data, though only page by page (the data is split into rows of about 2 KB each).

So how can I fetch the data in a single step, rather than page by page, without hitting this error?
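For anyone hitting the same allocation error: a workaround sketch, assuming PostgreSQL 9.4 or later and the example OID 49374 from this message, is to read the large object in byte ranges with the three-argument form of lo_get, so no single result approaches the 1 GB limit:

```sql
-- Read the large object in fixed-size slices instead of all at once.
-- lo_get(loid, offset, length) returns only that slice as bytea.
SELECT lo_get(49374, 0, 104857600);          -- first 100 MB
SELECT lo_get(49374, 104857600, 104857600);  -- next 100 MB

-- Alternatively, reassemble the ~2 KB pages server-side in page order.
-- Note this still fails once the combined result exceeds the 1 GB
-- bytea limit, so it only helps for objects under that size:
SELECT string_agg(data, ''::bytea ORDER BY pageno)
  FROM pg_largeobject
 WHERE loid = 49374;
```

Client-side drivers can also stream with lo_open/loread in a loop, which avoids ever materializing the whole value in one server allocation.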

I'm also wondering how applications manage to store huge amounts of data, on the order of GBs. I know PostgreSQL sets a 1 GB limit on each field. If so, how are these situations handled? I would like to understand this for real-world scenarios.

We need to store and retrieve large content (a huge volume of data), and currently this is not possible because of the field-size limit set by PostgreSQL.

I would appreciate your insights and suggestions to help resolve this issue.


Thanks & Regards,
Sai Teja 

On Tue, 15 Aug, 2023, 8:53 am Tom Lane, <tgl@xxxxxxxxxxxxx> wrote:
Sai Teja <saitejasaichintalapudi@xxxxxxxxx> writes:
> I got to know the field size limit for the bytea datatype column is limited
> to 1 GB in postgreSQL. Then how can we increase this?

You can't.  That limit is wired-in in many ways.  Think about how to
split your data across multiple table rows.
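A minimal sketch of the row-splitting approach Tom describes, using a hypothetical file_chunk table and chunk sizes chosen by the application (all names here are assumptions, not from the thread):

```sql
-- Hypothetical schema: store each file as ordered chunks (e.g. 50 MB
-- each), so no single bytea value approaches the 1 GB limit.
CREATE TABLE file_chunk (
    file_id   bigint  NOT NULL,
    chunk_no  integer NOT NULL,
    data      bytea   NOT NULL,
    PRIMARY KEY (file_id, chunk_no)
);

-- The application writes each piece in order ...
INSERT INTO file_chunk VALUES (1, 0, '\xdeadbeef'::bytea);
INSERT INTO file_chunk VALUES (1, 1, '\xcafebabe'::bytea);

-- ... and reads them back in chunk order, concatenating client-side:
SELECT chunk_no, data
  FROM file_chunk
 WHERE file_id = 1
 ORDER BY chunk_no;
```

The client never asks the server to assemble the whole value, so nothing ever needs a multi-GB allocation on either side.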

                        regards, tom lane
