multi terabyte fulltext searching

Hi,

I have been struggling to get full-text searching working for very large databases. I can full-text index tens of gigabytes without any problem, but when I get to hundreds of gigabytes it becomes slow. My current system is a quad core with 8GB of memory. I have the resources to throw more hardware at it, but realistically it is not cost-effective to buy a system with 128GB of memory. Are there any solutions people have come up with for indexing very large text databases?
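For concreteness, here is a rough sketch of the kind of setup I mean (the table and column names are made up, and the function names follow the built-in full-text search spellings; the tsearch2 contrib module names some of them differently, e.g. rank() instead of ts_rank()):

CREATE TABLE documents (
    id       serial PRIMARY KEY,
    body     text,
    body_tsv tsvector
);

-- Populate the tsvector column from the raw text
UPDATE documents SET body_tsv = to_tsvector('english', body);

-- GIN index over the tsvector column (GiST is the older alternative)
CREATE INDEX documents_body_tsv_idx ON documents USING gin (body_tsv);

-- A typical ranked query against the index
SELECT id, ts_rank(body_tsv, q) AS rank
FROM documents, to_tsquery('english', 'search & terms') AS q
WHERE body_tsv @@ q
ORDER BY rank DESC
LIMIT 20;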

Essentially I have several terabytes of text that I need to index. Each record is about five paragraphs of text. I am currently using TSearch2 (stemming, etc.) and getting sub-optimal results; queries take more than a second to execute. Has anybody implemented such a database using multiple systems, or some special add-on to TSearch2, to make things faster? I want to do something like partitioning the data across multiple systems and merging the ranked results at some master node. Is something like this possible with PostgreSQL, or would it have to be a custom software solution?
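Roughly, what I imagine the merging step could look like, using dblink against two hypothetical shard servers (the connection strings, names, and LIMITs here are just placeholders for illustration):

-- Fan the same ranked query out to two shards and merge on the master by rank
SELECT id, rank
FROM (
    SELECT * FROM dblink(
        'host=shard1 dbname=fts',
        $$SELECT id, ts_rank(body_tsv, to_tsquery('english', 'search & terms')) AS rank
          FROM documents
          WHERE body_tsv @@ to_tsquery('english', 'search & terms')
          ORDER BY rank DESC LIMIT 20$$
    ) AS t1(id integer, rank real)
    UNION ALL
    SELECT * FROM dblink(
        'host=shard2 dbname=fts',
        $$SELECT id, ts_rank(body_tsv, to_tsquery('english', 'search & terms')) AS rank
          FROM documents
          WHERE body_tsv @@ to_tsquery('english', 'search & terms')
          ORDER BY rank DESC LIMIT 20$$
    ) AS t2(id integer, rank real)
) merged
ORDER BY rank DESC
LIMIT 20;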

Benjamin

