On Wed, 25 Jun 2014 13:21:44 -0500 Merlin Moncure <mmoncure@xxxxxxxxx> wrote:
> > The cookbook currently uses PQexec, so multiple SQL commands are
> > wrapped in a transaction unless an explicit transaction instruction
> > appears. I don't want to change this behaviour, but the only way to
> > get exactly the same effect from psql is to use the -c option.
> >
> > I suspect some may shove rather large SQL scripts through this, to
> > the extent that it may break the command-line length limit, if not
> > on Linux then perhaps on Windows, where I gather it's 32,768
> > characters. Passing these scripts on the command line doesn't seem
> > particularly elegant in any case. I'd really like to use stdin, but
> > that has different transactional behaviour. I thought about scanning
> > for transaction instructions in advance, but I have seen that
> > PostgreSQL does not do this naively; it uses the lexer.
> >
> > Is there another way?
>
> If I understand you properly (not sure), I pretty commonly get around
> this via 'cat'.
>
> cat \
>   <(echo "BEGIN;") \
>   <(echo "\set ON_ERROR_STOP") \
>   foo.sql bar.sql etc \
>   <(echo "COMMIT;") \
>   | psql ...

This would work, but given that this will be part of a public and
widely-used cookbook, it needs to be able to deal with any scripts
that are thrown at it. Some of these may contain transactional
statements, and those will not work properly if wrapped in a big
BEGIN/COMMIT.

Having said that, Tom Lane has suggested that we should not rely on
the existing transactional behaviour, so maybe we'll need to be more
explicit about whether we actually want a transaction or not.

Thanks anyway,

James
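[For the archive: Merlin's concatenation trick above can be sketched end-to-end
as below. The foo.sql/bar.sql contents are placeholders invented for the
illustration, and the final "| psql ..." stage is left off so the pipeline can
be inspected without a running server. Process substitution requires bash.]

```shell
#!/usr/bin/env bash
# Stand-in scripts; real cookbook scripts would be supplied by the caller.
printf 'CREATE TABLE foo (id int);\n' > foo.sql
printf 'CREATE TABLE bar (id int);\n' > bar.sql

# Concatenate an explicit BEGIN/COMMIT around the scripts, exactly as in
# the quoted suggestion. Pipe the result into "psql ..." to actually run it.
cat \
  <(echo "BEGIN;") \
  <(echo "\set ON_ERROR_STOP") \
  foo.sql bar.sql \
  <(echo "COMMIT;")
```

Note the caveat from the reply still applies: if foo.sql or bar.sql contains
its own BEGIN/COMMIT, nesting it inside this wrapper will misbehave.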