On Sep 1, 2007, at 12:44, Alban Hertroys wrote:
It would be possible to write an aggregate that returns a single random value from a set. The algorithm is something like:
n = 1
v = null
for each row:
    if random() < 1/n:
        v = value of row
    n = n + 1
return v
Doesn't this always return the first record, since random() is always less than 1/1?
I don't think this method gives a uniform distribution, but then again I don't understand what 'value of row' refers to...
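
For reference, the quoted algorithm reads to me like the following Python sketch (assuming 'value of row' just means the column value fed to the aggregate, and that n is incremented for every row, not only when the pick is replaced):

import random

def pick_random(rows):
    # Keep one "reservoir" value; the n-th row seen replaces it
    # with probability 1/n.
    v = None
    n = 0
    for value in rows:
        n += 1                        # rows seen so far
        if random.random() < 1.0 / n:
            v = value                 # new pick
    return v

print(pick_random(['a', 'b', 'c', 'd']))   # one of the four, chosen at random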
Oh, now I see... The first row guarantees that v gets a value (as random() < 1/1 always holds), and after that there is a decreasing chance that each new row gets assigned to v. That means the last row has a chance of 1/n, which would be its expected chance if the distribution were uniform. But doesn't the first row have a chance of only 1/(n!) of being returned?
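
To check that last worry, here is a quick sketch that computes the exact chance of each row surviving to the end, assuming the per-row rule above (row i is picked with probability 1/i and must then not be replaced by any later row):

from fractions import Fraction

def survival_probabilities(num_rows):
    # Row i is picked with probability 1/i and must then survive every
    # later row k, each of which replaces the pick with probability 1/k:
    # P(i) = (1/i) * product over k = i+1 .. N of (1 - 1/k)
    probs = []
    for i in range(1, num_rows + 1):
        p = Fraction(1, i)
        for k in range(i + 1, num_rows + 1):
            p *= 1 - Fraction(1, k)
        probs.append(p)
    return probs

print([str(p) for p in survival_probabilities(5)])   # ['1/5', '1/5', '1/5', '1/5', '1/5']

The product telescopes, so every row comes out at 1/N rather than the first row at 1/(n!).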
--
Alban Hertroys
alban@xxxxxxxxxxxxxxxxx
magproductions b.v.
T: ++31(0)534346874
F: ++31(0)534346876
M:
I: www.magproductions.nl
A: Postbus 416
7500 AK Enschede
// Integrate Your World //