Hey,
thanks, now we have good information: unlike plperl or plr, there is no easy way to preload packages.
There may be a way to make this import happen at connection start, but it would involve C modifications (I found no trace of a Python file or a hackable SQL script in the PostgreSQL source and install directories).
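For comparison, plperl does expose a startup hook: the `plperl.on_init` setting in `postgresql.conf` runs Perl code when the interpreter is initialized, which is exactly the preload facility plpython lacks. A minimal sketch (the `use` line is just an illustration of what you might preload):

```
# postgresql.conf -- plperl, unlike plpython, can run code at interpreter startup
plperl.on_init = 'use strict;'
```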
After that, further optimization is possible by skipping the now-useless 'import' (because the module is already loaded) (see the trick here).
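A common form of that trick is to cache the imported module in the per-function SD dictionary that PL/Python provides, so the import statement only runs on the first call in a session. Here is a sketch in plain Python: the module-level `SD` dict stands in for the one PL/Python supplies, and `my_python_function` with its vector-length computation is purely illustrative, not code from this thread.

```python
# Sketch of the "cache the import" trick. In PL/Python each function gets
# an SD dict that persists across calls in the same session; here we fake
# it with a module-level dict so the pattern is runnable outside PostgreSQL.
SD = {}

def my_python_function(geom):
    # Import numpy only on the first call; later calls reuse the cached module.
    if 'np' not in SD:
        import numpy
        SD['np'] = numpy
    np = SD['np']
    # Illustrative geometry work: length of a vector.
    return float(np.linalg.norm(np.asarray(geom, dtype=float)))
```

Inside an actual PL/Python function body you would drop the `SD = {}` and `def` lines and keep only the caching logic, since PL/Python injects `SD` for you.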
My use case is simple geometry manipulation functions. It is easier to use plpython than plpgsql thanks to numpy for vector manipulation. Usually the functions are called inside a complex query with many CTEs and execute over 100k rows. Total execution time is on the order of minutes. (example query at the end)
Thanks everybody,
Rémi
Example of query
CREATE TABLE holding_result AS
WITH the_geom AS (
    SELECT gid, geom
    FROM my_big_table --200k rows
)
SELECT gid, my_python_function(geom) AS result
FROM the_geom;
2014-06-27 4:27 GMT+02:00 Adrian Klaver <adrian.klaver@xxxxxxxxxxx>:
On 06/26/2014 02:14 AM, Rémi Cura wrote:
I got to thinking about this.

Hey,
thanks for your answer!
Yep you are right, the function I would like to test are going to be
called a lot (100k times), so even 15 ms per call matters.
100K over what time frame?
How is it being called?
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx