Re: tsvector limitations

Tim <elatllat@xxxxxxxxx> wrote:
 
> I would be surprised if there is no general "how big is this
> object" method in PostgreSQL.
 
You could cast to text and use octet_length().
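For example, something along these lines (just a rough sketch;
the table name "docs" and column name "body_tsv" are made up):
 
    -- approximate size of each stored tsvector, via its text form
    SELECT id, octet_length(body_tsv::text) AS tsv_bytes
      FROM docs
     ORDER BY tsv_bytes DESC
     LIMIT 10;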
 
> If it's "bad design" to store large text documents (pdf,docx,etc)
> as a BLOBs or on a filesystem and make them searchable with
> tsvectors can you suggest a good design?
 
Well, I suggested that storing a series of novels as a single
entry seemed like bad design to me.  Perhaps one entry per novel
or even finer granularity would make more sense in most
applications, but there could be exceptions.  Likewise, a list of
distinct words is of dubious value in most applications' text
searches.  We extract text from court documents and store a
tsvector for each document; we don't aggregate all court
documents for a year and create a tsvector for that -- that would
not be useful for us.
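For what it's worth, that per-document layout is roughly the
following (only a sketch; the table, column, and index names are
invented, and your text search configuration may differ):
 
    -- one row per document, each with its own tsvector
    CREATE TABLE court_docs (
        id        bigserial PRIMARY KEY,
        doc_text  text,
        doc_tsv   tsvector
    );
 
    UPDATE court_docs
       SET doc_tsv = to_tsvector('english', doc_text);
 
    CREATE INDEX court_docs_tsv_idx
        ON court_docs USING gin (doc_tsv);
 
    -- search against the per-document tsvectors
    SELECT id
      FROM court_docs
     WHERE doc_tsv @@ to_tsquery('english', 'appeal & denied');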
 
> If making your own search implementation is "better" what is the
> point of tsvectors?
 
I remember you asking about doing that, but I don't think anyone
else has advocated it.
 
> Maybe I'm missing something here?
 
If you were to ask for real-world numbers, you'd probably get
further than by demanding that people volunteer their time to
perform tests that you define but don't seem willing to run
yourself.  Or if you describe your use case in more detail, with
questions about alternative approaches, you're likely to get
useful advice.
 
-Kevin

-- 
Sent via pgsql-admin mailing list (pgsql-admin@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-admin

