On Tue, Jan 13, 2009 at 03:28:18PM -0600, Jason Long wrote:
> Steve Atkins wrote:
> >On Jan 13, 2009, at 10:34 AM, Jason Long wrote:
> >>I would like to use PSQLFS (http://www.edlsystems.com/psqlfs/)
> >>to store 100 GB of images in PostgreSQL.
> >>
> >>Is there a better way to load 20,000 plus files reliably into Postgres?

That would imply that they're around 5MB on average? If they're all under,
say, 20MB (or maybe even much more) you should be able to handle it by
doing the most naive things possible.

> I just want an easy way to load the files into the DB and their original
> path they were loaded from.
>
> Is it possible through SQL to load a file into a bytea column?

You'd need to generate the SQL somehow; if you know Python it's probably a
pretty easy 20 or 30 lines of code to get this working. psycopg seems to be
the recommended way of accessing PG with Python, and you basically want to
be doing something like:

  import psycopg2

  filename = "myimage.jpeg"
  conn = psycopg2.connect("")
  conn.cursor().execute(
      "INSERT INTO pictures (filename, data) VALUES (%s, %s);",
      [filename, psycopg2.Binary(open(filename, "rb").read())])
  conn.commit()

This seems to do the right thing for me, and obviously needs to be put into
a loop of some sort. But it'll hopefully get you started.

  Sam
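
P.S. A minimal sketch of that "loop of some sort", assuming the same
pictures (filename, data) table and default connection string as above;
the root directory below is just a placeholder, and storing the original
path in the filename column is one way to keep it as requested:

  import os
  import psycopg2

  root = "/path/to/images"     # placeholder: directory holding the files

  conn = psycopg2.connect("")  # default connection, as in the snippet above
  cur = conn.cursor()

  for dirpath, dirnames, filenames in os.walk(root):
      for name in filenames:
          path = os.path.join(dirpath, name)
          f = open(path, "rb")
          try:
              data = f.read()
          finally:
              f.close()
          # store the original path alongside the file contents
          cur.execute(
              "INSERT INTO pictures (filename, data) VALUES (%s, %s);",
              [path, psycopg2.Binary(data)])

  conn.commit()  # one commit at the end keeps the load in a single transaction
  conn.close()

Committing once at the end means either all the files go in or none do;
committing per file would trade that atomicity for smaller transactions.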