On Mar 19, 2020, at 4:13 PM, pabloa98 <pabloa98@xxxxxxxxx> wrote:
On Thu, Mar 19, 2020 at 2:50 PM Rob Sargent <robjsargent@xxxxxxxxx> wrote:
> On Mar 19, 2020, at 3:36 PM, pabloa98 <pabloa98@xxxxxxxxx> wrote:
>
> Hello,
>
> My schema requires a counter for each combination of 2 values.
> Something like:
>
> CREATE TABLE counter(
>     "group"     INT NOT NULL,  -- quoted: GROUP is a reserved word
>     element     INT NOT NULL,
>     seq_number  INT NOT NULL DEFAULT 0,
>     PRIMARY KEY ("group", element)
> );
>
> For each entry in counter, aka for each (group, element) pair,
> the model requires a seq_number.
>
> If I use a table "counter", I could still have counter collisions
> between 2 transactions. I need true sequence behavior. Is that
> possible with a table like "counter", where the counter is
> incremented outside the transaction so it behaves like a sequence,
> without race conditions between concurrent transactions?
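>
> An atomic upsert can do the increment in one statement (a sketch
> against the counter table above; the (1, 1) pair is only an
> example):
>
> INSERT INTO counter ("group", element, seq_number)
> VALUES (1, 1, 1)
> ON CONFLICT ("group", element)
> DO UPDATE SET seq_number = counter.seq_number + 1
> RETURNING seq_number;
>
> But the row lock it takes is held until the transaction commits,
> so concurrent transactions on the same pair serialize on it, and a
> rollback loses the increment. A sequence does neither.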
>
> The other option is to create a sequence for each new
> (group, element) pair using triggers. There are millions of pairs,
> so this approach will generate millions of sequences.
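>
> Each pair would then own its own sequence object, e.g. with a
> seq_<group>_<element> naming scheme (just for illustration):
>
> CREATE SEQUENCE IF NOT EXISTS seq_1_1;
> SELECT nextval('seq_1_1');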
>
> How would a PostgreSQL database behave with millions of sequences
> in a schema? Would its performance degrade? Is there any other
> negative impact?
>
> Regards
>
> Pablo
>
To clarify, are you hoping for consecutive numbers as each row is
added to the table, i.e. “serial”?
What is the intention of “seq_number”?
>
>
The idea is to have something like a serial sequence, but one per
(group, element) pair, so that when we insert rows into another
table we get something like:
group, element, event_id, ...
1, 1, 1
1, 1, 2
1, 1, 3
2, 1, 1
1, 1, 4
1, 3, 1
1, 1, 5
1, 3, 2
2, 1, 2
2, 1, 3
The 3rd column is the sequence number we get from the appropriate
sequence created by the trigger.
I want to implement a variation of
https://stackoverflow.com/a/30204854, which will generate millions
of sequences.
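
Roughly, a trigger in that style could look like this (a sketch
only: the "event" table name and the seq_<group>_<element> naming
scheme are illustrative, and "group" must stay quoted because it is
a reserved word):

-- Hypothetical target table; columns follow the example above.
CREATE TABLE event(
    "group"   INT NOT NULL,
    element   INT NOT NULL,
    event_id  INT NOT NULL,
    PRIMARY KEY ("group", element, event_id)
);

CREATE OR REPLACE FUNCTION assign_event_id() RETURNS trigger AS $$
BEGIN
    -- Create the pair's sequence on first use (PostgreSQL 9.5+).
    EXECUTE format('CREATE SEQUENCE IF NOT EXISTS seq_%s_%s',
                   NEW."group", NEW.element);
    -- Pull the next value from that pair's own sequence.
    NEW.event_id := nextval(format('seq_%s_%s',
                                   NEW."group", NEW.element)::regclass);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER event_id_trigger
    BEFORE INSERT ON event
    -- PostgreSQL 11+; use EXECUTE PROCEDURE on older versions.
    FOR EACH ROW EXECUTE FUNCTION assign_event_id();

-- Usage: event_id is filled in by the trigger.
INSERT INTO event ("group", element) VALUES (1, 1);

Two caveats: concurrent first inserts for the same pair can still
collide inside CREATE SEQUENCE IF NOT EXISTS, and every sequence is
a pg_class row plus its own relation file, which is where a schema
with millions of sequences will feel it.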