Thanks a lot, Ken,
I will try it soon.
But when the tables become huge (how big is 'huge' in Postgres?), how do I optimize such a query?
I have indexes on all the important fields, like date, productid, supplierid, customerid, and so on.
Optimization really matters to me, as I plan to keep all transaction data for as long as possible.
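For reference, my indexes are defined roughly like this (the table name "transactions" and the index names here are just illustrative sketches, not my exact schema):

CREATE INDEX idx_trans_date     ON transactions (date);
CREATE INDEX idx_trans_product  ON transactions (productid);
CREATE INDEX idx_trans_supplier ON transactions (supplierid);
CREATE INDEX idx_trans_customer ON transactions (customerid);

Would checking the plan with EXPLAIN ANALYZE be the right way to see whether a query like yours actually uses them? For example:

EXPLAIN ANALYZE
SELECT * FROM transactions WHERE supplierid = 'XXX';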
On Nov 18, 2013, at 5:37 PM, Ken Tanzer wrote:

If the tables aren't huge, you're not concerned about optimization, and you just want to get your numbers, I think something like this would do the trick. I haven't actually tried it 'cause I didn't have easy access to your tables:
SELECT a.product_id, a.product_name,
       b.initial_stock_sum, c.in_out_sum, c.in_sum, c.out_sum
FROM a
LEFT JOIN (
    SELECT product_id, SUM(initial_stock) AS initial_stock_sum
    FROM b
    GROUP BY product_id
) b USING (product_id)
LEFT JOIN (
    -- "in" is a reserved word in Postgres, so it must be double-quoted
    SELECT product_id,
           SUM(CASE WHEN date < 'BEGINNING DATE' THEN "in" - "out" ELSE 0 END) AS in_out_sum,
           SUM(CASE WHEN date BETWEEN 'BEGINNING DATE' AND 'ENDING DATE' THEN "in" ELSE 0 END) AS in_sum,
           SUM(CASE WHEN date BETWEEN 'BEGINNING DATE' AND 'ENDING DATE' THEN "out" ELSE 0 END) AS out_sum
    FROM c
    GROUP BY product_id
) c USING (product_id)
WHERE a.supplier_id = 'XXX';
Cheers, Ken