Fwd: Mount Point /datastorage is 93 percent full

Hi Team,

We are getting the following error while running the query below. As far as we can tell, the query itself is fine.
Please advise on this.

 

Query :

with recursive ctee as (
    select *, 1 as level
    from kms_chunked_content
    where processid = '6598'
      and nodeid = 1
    union all
    select e.*, c.level + 1
    from ctee c
    join kms_chunked_content e
      on e.parentnodeid = cast(c.nodeid as text)
     and e.is_active = 'Y'
     and e.processid = '6598'
)
select *
from ctee
order by position;

 

 

Exception :

org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [53400]: ERROR: temporary file size exceeds temp_file_limit (52428800kB)
    at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:134)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:467)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:407)
    at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:146)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:405)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:849)

Note: We have already set temp_file_limit = 50GB, but the query still fails with this error.




Thanks & Regards,
Satyam Naidu

On Wed, 3 Aug 2022 at 14:41, Julien Rouhaud <rjuju123@xxxxxxxxx> wrote:
Hi,

On Wed, Aug 03, 2022 at 02:32:05PM +0530, Satyamnaidu Gokeda wrote:
>
> We are getting following error while running this query. As checked

You probably forgot to copy the query here.

> org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [53400]: ERROR:
> temporary file size exceeds temp_file_limit (52428800kB)

This error is raised because the query used more than temp_file_limit (see
https://www.postgresql.org/docs/current/runtime-config-resource.html#RUNTIME-CONFIG-RESOURCE-DISK).
Your options are to raise that parameter (and maybe increase the
underlying disk given the message subject) or find a way to consume less disk
in temporary files.  It's hard to know without knowing the query and the
underlying data, but maybe rewriting the query or splitting it into multiple
queries over a temporary table would work.
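A rough sketch of what "splitting it over a temporary table" could look like here, reusing the query from the original mail. This is only an illustrative sketch, not a tested fix: the temp-table name, the index, and whether this actually reduces temp-file usage all depend on the real data and plan.

```sql
-- Sketch (untested assumption): materialize the recursive expansion
-- into a temporary table first, so the final sort runs as a separate,
-- smaller step instead of one giant query spilling to temp files.
create temporary table ctee_tmp as
with recursive ctee as (
    select *, 1 as level
    from kms_chunked_content
    where processid = '6598'
      and nodeid = 1
    union all
    select e.*, c.level + 1
    from ctee c
    join kms_chunked_content e
      on e.parentnodeid = cast(c.nodeid as text)
     and e.is_active = 'Y'
     and e.processid = '6598'
);

-- An index on the sort key may help the final ORDER BY avoid a large
-- on-disk sort (depends on the planner and row count).
create index on ctee_tmp (position);
analyze ctee_tmp;

select * from ctee_tmp order by position;
```

For the other option, raising the parameter, temp_file_limit is superuser-settable, e.g. `ALTER SYSTEM SET temp_file_limit = '100GB';` followed by `SELECT pg_reload_conf();` — but per the message subject, make sure the filesystem backing the temp tablespace actually has that much free space first.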
