It's probably worth removing the iterating code Just In Case.
Apologies for egg-suck-education, but I assume you're not doing something silly like
for (i = 0; i < strlen(bigtextstring); i++) {
    ...
}
I know it sounds stupid, but you'd be amazed how many times that crops up. For small strings it doesn't matter, but for large strings it's catastrophic: strlen() walks the whole string, so evaluating it in the loop condition makes the scan O(n²).
Geoff
On 20 October 2017 at 16:16, Cory Nemelka <cnemelka@xxxxxxxxx> wrote:
All I am doing is iterating through the characters, so I know it isn't my code.

--cnemelka

On Fri, Oct 20, 2017 at 9:14 AM, Cory Nemelka <cnemelka@xxxxxxxxx> wrote:

Yes, but I should be able to read them much faster. The psql client can display an 11MB column in a little over a minute, while in C using the libpq library it takes over an hour.

Anyone have experience with the same issue who can help me resolve it?

--cnemelka

On Thu, Oct 19, 2017 at 5:20 PM, Aldo Sarmiento <aldo@xxxxxxxxxxxxxxxx> wrote:

I believe large columns get put into a TOAST table. Max page size is 8k, so you'll have lots of pages per row that need to be joined with a size like that: https://www.postgresql.org/docs/9.5/static/storage-toast.html

Aldo Sarmiento
President & CTO
On Thu, Oct 19, 2017 at 2:03 PM, Cory Nemelka <cnemelka@xxxxxxxxx> wrote:

I am getting very poor performance using libpq to process very large TEXT columns (300MB+). I suspect it is IO related but can't be sure.

Anyone had experience with the same issue who can help me resolve it?

--cnemelka