I'm building an app in Django and I want to have some functions directly in Postgres. I'd prefer to use PL/Python for the functions, since it would look better in Django migration files (Python code within Python code, instead of embedding PL/pgSQL).
But one of the functions I need to create needs to accept an array of records.
The example of what I'm trying to do:
CREATE TABLE employee (
    name text,
    salary integer,
    age integer
);
CREATE OR REPLACE FUNCTION testp(e employee)
RETURNS integer
AS $$
    plpy.notice('type', e.__class__)
$$ LANGUAGE plpythonu;

SELECT testp(
    ('asd', 10, 10)::employee
);
CREATE OR REPLACE FUNCTION testp2(es employee[])
RETURNS integer
AS $$
    for e in es:
        plpy.notice('here', e.__class__)
$$ LANGUAGE plpythonu;

SELECT testp2(
    ARRAY[
        ('asd', 10, 10)::employee
    ]::employee[]
);
Running this .sql yields:
CREATE TABLE
CREATE FUNCTION
psql:nfun.sql:15: NOTICE: ('type', <type 'dict'>)
CONTEXT: PL/Python function "testp"
testp
-------
(1 row)
CREATE FUNCTION
psql:nfun.sql:28: NOTICE: ('here', <type 'str'>)
CONTEXT: PL/Python function "testp2"
testp2
--------
(1 row)
So testp(), which receives a single "employee", gets the cast to "employee" applied correctly and sees its argument as a dict, as expected.
testp2(), which receives an array of "employee", instead sees each element of the array as a plain str...
Should I declare the function some other way?
Is there any way I can force the cast on each element? I found some hits on Google about *_fromDatum() internal functions, but I didn't understand how (or whether) I can call them explicitly.
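For context, the str elements I get inside testp2 look like composite-type literals, e.g. '(asd,10,10)'. As a stopgap I could presumably parse them by hand. Here is a minimal sketch of what I mean (plain Python 3 for illustration; it assumes the elements follow Postgres record-literal syntax, where fields are comma-separated, double-quoted when needed, and an empty field means NULL):

```python
import csv
import io

def parse_record(rec):
    """Parse a Postgres composite-type literal like '(asd,10,10)'
    into a tuple of strings.

    Quoted fields such as "a,b" are handled via the csv module, since
    the quoting rules are CSV-like. An empty (unquoted) field is
    treated as SQL NULL and returned as None. This is only a sketch:
    it cannot distinguish NULL from a quoted empty string "".
    """
    inner = rec.strip()[1:-1]  # drop the surrounding parentheses
    fields = next(csv.reader(io.StringIO(inner)))
    return tuple(f if f != '' else None for f in fields)
```

I'd still have to cast the text fields to the column types myself, which is exactly what I was hoping the employee[] declaration would do for me.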
Thanks,
Filipe