I think this is curious. die(6) rolls a fair six-sided die. For large input n, this code almost always returns 3.
int depth(int n)
{
    if (n == 0)
    {
        return die(6);  // roll a fair 6-sided die
    }
    return max(
        min(depth(n - 1), depth(n - 1)),
        min(depth(n - 1), depth(n - 1)));
}
(I have Haskell code to compute the exact probabilities.)
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
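The exact distribution can be computed by iterating the CDF map (one recursion level sends the CDF F to (1 − (1 − F)²)², as derived further down the thread). Here is a sketch in Python rather than Haskell; the helper names are mine:

```python
# One level of depth(n) applies max(min(a, b), min(c, d)) to four
# i.i.d. copies of depth(n - 1), which sends the CDF F to (1-(1-F)^2)^2.
from fractions import Fraction

def depth_cdf(n, sides=6):
    """Exact CDF of depth(n) at the values 1..sides, for a fair die."""
    F = [Fraction(k, sides) for k in range(1, sides + 1)]
    for _ in range(n):
        F = [(1 - (1 - f) ** 2) ** 2 for f in F]
    return F

def pmf(F):
    """Turn a CDF on 1..len(F) into point probabilities."""
    return [F[0]] + [F[k] - F[k - 1] for k in range(1, len(F))]

# Mass concentrates on 3 very quickly:
print(float(pmf(depth_cdf(8))[2]))  # P(depth = 3), already > 0.999
```

With exact rationals you can watch P(depth(n) = 3) approach 1, and the convergence near the endpoints is quadratic.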
I was wondering how much you could control the resulting distribution if the only tools you have available are 6-sided dice and the functions min and max. Looks like you can keep "refining" the distribution to get it as close as you like to being supported at just one value.
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
Code https://godbolt.org/z/Thex6n8ev
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
If you replace die() with a standard normal variate you tend to get around -0.3. I wonder how hard it is to compute the exact result for the limit - assuming there is a limit.
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
@dpiponi I think the analysis in my last reply (https://assemblag.es/@qualmist/113536125549115188) applies here as well – the particular distribution doesn't matter since we're doing order statistics. For a continuous uniform distribution from 0 to 1, the probability distribution will limit to a delta at (3 - √5)/2 ≈ 0.38. To turn this into the limit for the standard normal, you apply the inverse CDF of the standard normal to this value, which Mathematica tells me gives "-Sqrt[2] InverseErfc[3 - Sqrt[5]]", aka -0.300321.
=> More informations about this toot | More toots from qualmist@assemblag.es
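This closed form can be sanity-checked with nothing but the Python standard library (NormalDist.inv_cdf is the normal quantile function); a quick sketch:

```python
# Push the uniform limit point (3 - sqrt(5))/2 through the standard
# normal inverse CDF to get the limit for the normal-variate version.
import math
from statistics import NormalDist

u = (3 - math.sqrt(5)) / 2    # ≈ 0.381966, the uniform limit point
limit = NormalDist().inv_cdf(u)
print(limit)                  # ≈ -0.300321, matching the Mathematica value
```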
@dpiponi Hmm does taking the min or max of two identical distributions always reduce the variance?
=> More informations about this toot | More toots from sjb@mstdn.io
@sjb I don't think so. If you have a distribution on {1,2} with p(1)=q and p(2)=1-q and q is close to 1 then the max of two draws I think has higher variance.
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
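The counterexample checks out numerically: for a variable on {1, 2} with P(1) = q, the max of two draws again lives on {1, 2} but with P(1) = q², and for q near 1 that increases the variance. A quick sketch in exact arithmetic:

```python
from fractions import Fraction

def var_on_1_2(p1):
    # Variance of a variable on {1, 2} with P(X = 1) = p1;
    # since the two values differ by 1, this is just p1 * (1 - p1).
    return p1 * (1 - p1)

q = Fraction(9, 10)
v_one_draw = var_on_1_2(q)    # variance of a single draw
v_max = var_on_1_2(q * q)     # max of two draws has P(max = 1) = q^2
assert v_max > v_one_draw     # taking the max increased the variance
```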
@dpiponi Yes, that seems right. So it just happens for the distributions in this case.
=> More informations about this toot | More toots from sjb@mstdn.io
@dpiponi That’s neat. So it’s a sort of… not sure what to call it… emergent distribution? based on the structure of the logic. I like it.
=> More informations about this toot | More toots from 4raylee@mathstodon.xyz
@4raylee I noticed the similarity to logic too. Not sure if there's any mileage you could get out of that.
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
@dpiponi "die() is the function you think it is" -> spent 30 seconds being very confused since I thought die(n) killed the program with an exit code of n.
=> More informations about this toot | More toots from byorgey@mathstodon.xyz
@byorgey yeah, I forgot I can come back and edit on mastodon but will do so now
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz
@dpiponi Think you can approach this analytically with cumulative distribution functions:
If F is the CDF for x, F² is the CDF for max(x₁, x₂), and 1-(1-F)² is the CDF for min(x₁, x₂). So the CDF for max(min(x₁, x₂), min(x₃, x₄)) is (1-(1-F)²)². If you iterate this map, the CDF falls toward 0 wherever F < (3 - √5)/2 ≈ 0.38 and rises toward 1 wherever F > that. For a fair die the CDF is 1/6 at 1 and 1/3 at 2 — both below the fixed point, so the CDF there drops — while from 3 onward the CDF starts at 1/2 and climbs, so the probability concentrates on 3.
=> More informations about this toot | More toots from qualmist@assemblag.es
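A quick numerical check of the fixed-point claim (a sketch: g is the one-step CDF map from the post, and the interior fixed point solves F(2 − F)² = 1, i.e. F³ − 4F² + 4F − 1 = 0):

```python
import math

def g(u):
    # One refinement step applied to a CDF value u.
    return (1 - (1 - u) ** 2) ** 2

fp = (3 - math.sqrt(5)) / 2   # ≈ 0.381966, the unstable interior fixed point
assert abs(g(fp) - fp) < 1e-12

# CDF values below the fixed point are driven to 0, values above it to 1:
lo, hi = 1 / 3, 1 / 2         # a fair die's CDF at 2 and at 3
for _ in range(20):
    lo, hi = g(lo), g(hi)
assert lo < 1e-6 and hi > 1 - 1e-6
```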
@qualmist Nice. Connecting to another thread, these formulae are very similar to how Boole originally represented truth values using 1-x for negation, multiplication for and and so on. Or, in fact, fuzzy logic, which I hadn't really connected with CDFs before.
=> More informations about this toot | More toots from dpiponi@mathstodon.xyz