Poker Mavens Statistics Question

OK, so my site has seen 156,351 unique player hands. If I do a very simple summary here, I get this. This takes into account some extra games which were not in my original set (play-test games for checking cards/chips ... but still, they were hands dealt).

Hand    Occurrences    1-in-x Hands
Total   156351
AA      827            189.0580411
KK      772            202.5272021
QQ      811            192.7879162
JJ      803            194.7085928
TT      799            195.6833542
99      820            190.6719512
88      783            199.6819923
77      777            201.2239382
66      826            189.2869249
55      739            211.5710419
44      826            189.2869249
33      805            194.2248447
22      830            188.3746988

So that's still more often, right? Maybe not as high as my original numbers, but it's happening more often than it should...
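A quick R sketch (not from the original post) of how that 1-in-x column falls out of the posted counts; the variable names are mine, and 1 in 221 is the expected rate for any single specific pocket pair:

# Occurrence counts as posted above, AA down to 22 (variable names are mine).
counts <- c(AA = 827, KK = 772, QQ = 811, JJ = 803, TT = 799, `99` = 820,
            `88` = 783, `77` = 777, `66` = 826, `55` = 739, `44` = 826,
            `33` = 805, `22` = 830)
total_hands <- 156351
round(total_hands / counts, 1)   # the "1-in-x Hands" column
total_hands / 221                # expected count for each specific pair (~707)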

No, that's LESS often. It should average to 1 in 221 hands if the data is correct (it's probably not though).

Sorry to be so blunt, but the data you are posting in this thread is all over the place. I have zero confidence in any of the numbers you are reporting. I can't even begin to make the necessary probability calculations here to tell you whether or not something is off because your numbers and assumptions are all wrong.

I see nothing in this thread that makes me think Poker Mavens is dealing out pocket pairs or royal flushes at higher than expected rates. I see plenty, however, of reasons not to trust any of the numbers you're posting. I'm confident that if I had access to the raw data, I would be reporting different numbers.
 
For anyone interested in Mavens related "enhanced services", I will just make a quick pitch for you to get in touch with @Finest, friend and data-wizard (see some of his work in the posts above).
 
No, that's LESS often. It should average to 1 in 221 hands if the data is correct (it's probably not though).
Just going to point out that 1 in 2 is more frequent than 1 in 10 and therefore 1 in 189 is more frequent than 1 in 221.

That said, I think the data that @Finest posted in #37 is a pretty solid representation, and I do think the data set @BearMetal posted in #45 is pretty reasonable and points to the same conclusion: nothing outside of a standard deviation is going on here.
 
No, that's LESS often. It should average to 1 in 221 hands if the data is correct (it's probably not though).

Sorry to be so blunt, but the data you are posting in this thread is all over the place. I have zero confidence in any of the numbers you are reporting. I can't even begin to make the necessary probability calculations here to tell you whether or not something is off because your numbers and assumptions are all wrong.

I see nothing in this thread that makes me think Poker Mavens is dealing out pocket pairs or royal flushes at higher than expected rates. I see plenty, however, of reasons not to trust any of the numbers you're posting. I'm confident that if I had access to the raw data, I would be reporting different numbers.
No offense at all. Let me see if I can explain:

The original post incorrectly used Table Hands instead of Player Hands in its calculations. Once I switched over to actual player hands dealt (see above 156,351), we can now compare the total number of pocket pairs (say AA at 827) to the total number of actual hands dealt out.

This means that out of 156,351 hands, 827 of them were AA, which is 1-out-of-189. In other words, we see AA once every 189 hands, which is MORE OFTEN than the expected 1-out-of-221. It's close (within 15%), but it's odd that all pocket pairs are more common.
 
I see nothing in this thread that makes me think Poker Mavens is dealing out pocket pairs or royal flushes at higher than expected rates. I see plenty, however, of reasons not to trust any of...
As I was reading this I was fully prepared for the next words to be "you. You're all bad at math"
 
I suspect that if you do the analysis exactly like the other one, with all hand combos, the stats might be a little different. I might be wrong on that, but that's my gut.
Variance this high isn't really possible. After tens of thousands of hands being dealt, the data will have converged. The average number of specific pocket pairs per player will be 1 in 221 hands. It probably won't even be 1 in 220 or 1 in 222. It'll be within a small decimal of 221. There's no way in hell it would be 1 in 190 after that many hands simply due to variance.
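To illustrate the convergence point, here is a minimal simulation sketch in R (mine, not from the thread): deal random two-card hands from a 52-card deck and watch the pocket-pair rate settle toward 1 in 17 rather than 1 in 14.5.

# Deal n random two-card hands and count how often both cards share a rank.
set.seed(1)
n <- 100000
is_pair <- replicate(n, {
  cards <- sample(0:51, 2)               # two distinct cards; rank = card %/% 4
  (cards[1] %/% 4) == (cards[2] %/% 4)
})
mean(is_pair)       # hovers near 1/17 = 0.0588
1 / mean(is_pair)   # the "1 in x" form; converges toward 17 as n grows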
 
No offense at all. Let me see if I can explain:

The original post incorrectly used Table Hands instead of Player Hands in its calculations. Once I switched over to actual player hands dealt (see above 156,351), we can now compare the total number of pocket pairs (say AA at 827) to the total number of actual hands dealt out.

This means that out of 156,351 hands, 827 of them were AA, which is 1-out-of-189. In other words, we see AA once every 189 hands, which is MORE OFTEN than the expected 1-out-of-221. It's close (within 15%), but it's odd that all pocket pairs are more common.
Sorry, I misunderstood your post. Yes, that would be MORE often, not less. However, I still suspect you're not querying the data correctly. There are probably some hands that are not being accounted for somehow.
 
Sorry, I misunderstood your post. Yes, that would be MORE often, not less. However, I still suspect you're not querying the data correctly. There are probably some hands that are not being accounted for somehow.
I don't think so; now that I know the right data to compare, it's pretty easy to get. They're all pretty close to 221; within 10-15%. It's just weird they are all more common.
 
Variance this high isn't really possible. After tens of thousands of hands being dealt, the data will have converged. The average number of specific pocket pairs per player will be 1 in 221 hands. It probably won't even be 1 in 220 or 1 in 222. It'll be within a small decimal of 221. There's no way in hell it would be 1 in 190 after that many hands simply due to variance.
I believe @JustinInMN has more experience in the area of statistics than I do, so I'll keep this simple and stand by if he wants to hash out details, but I'm pretty sure you're just promoting a baseless assumption here.
 
I believe @JustinInMN has more experience in the area of statistics than I do, so I'll keep this simple and stand by if he wants to hash out details, but I'm pretty sure you're just promoting a baseless assumption here.
I'm good at searching data (which I think was the fundamental problem of "establishing the haystack" at first in this thread). I'm below average with statistics involving distributions. My instinct is that a 10-15% variance is within the norm (as someone with a "C" in that class, so FWIW), but I will yield to anyone who is better with distributions than I am.

(Cs get degrees)
 
I don't think so; now that I know the right data to compare, it's pretty easy to get. They're all pretty close to 221; within 10-15%. It's just weird they are all more common.
That's not close though. You have 156,000+ hands. The probability of being off by 10% or more with a sample size this large is essentially zero. It would be essentially zero even with a sample size of just 10,000 hands, let alone 150k+. Something else is wrong. I assume it's your denominator. Are you sure you're counting 2 hands when there are two tables?
 
That's not close though. You have 156,000+ hands. The probability of being off by 10% or more with a sample size this large is essentially zero. It would be essentially zero even with a sample size of just 10,000 hands, let alone 150k+. Something else is wrong. I assume it's your denominator. Are you sure you're counting 2 hands when there are two tables?
The number of tables is irrelevant in this query:
egrep -e 'Seat.*\[\w\w \w\w\]' HandHistory/* | wc -l

It's counting every seat which played across all tables, across all games. Counting pocket pairs (like AA) is equally easy with:
egrep -e 'Seat.*\[A\w A\w\]' HandHistory/* | wc -l

Those are the numbers I get: 156,351 and 827.
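If anyone wants to slice this further, here's a rough R equivalent of those grep counts. It assumes the same HandHistory directory and the same "Seat ... [Xx Yy]" line format the commands above match; the path and layout are taken from the grep examples, not verified against Poker Mavens docs.

# Rough R equivalent of the egrep counts above (assumed file layout).
files <- list.files("HandHistory", full.names = TRUE)
lines <- unlist(lapply(files, readLines))
dealt <- grepl("Seat.*\\[\\w\\w \\w\\w\\]", lines, perl = TRUE)   # any hole cards
aa    <- grepl("Seat.*\\[A\\w A\\w\\]",     lines, perl = TRUE)   # pocket aces
sum(dealt)             # total player hands dealt
sum(aa)                # AA count
sum(dealt) / sum(aa)   # observed "1 in x" for AA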
 
I mean, I did get plenty of As in my database development classes. I figured I would do much better in statistics than I did, but it proved a challenge.
Not going to get too far off topic, but I love Database Design as well as Systems Analysis.

The geek in me gets excited about drawing up ERDs :ROFL: :ROFLMAO:
 
Not going to get too far off topic, but I love Database Design as well as Systems Analysis.

The geek in me gets excited about drawing up ERDs :ROFL: :ROFLMAO:
It's funny; I hate databases. I realized that to better run queries against the raw PM data, I'll most likely need to convert my scripts to run off of a database instead of the PM flat files. Every time I go back to designing the tables and relationships I'm like "uggghhh". Normalization? Maybe ... if I feel like it. Otherwise, it's just getting dumped in 1 column and I'll use JSONB or something :)
 
It's funny; I hate databases. I realized that to better run queries against the raw PM data, I'll most likely need to convert my scripts to run off of a database instead of the PM flat files. Every time I go back to designing the tables and relationships I'm like "uggghhh". Normalization? Maybe ... if I feel like it. Otherwise, it's just getting dumped in 1 column and I'll use JSONB or something :)
Normalization is just so.....tedious, you have to make a table for everything lol.

I'm a SQL guy. If I have my choice, I'm doing T-SQL for database querying. My last job forced me to learn PL/SQL, so whatever :rolleyes: Just adding ammo to my arsenal lol.
 
There isn't any chance that it's picking up both the hand being dealt and then the same hand again if it gets logged for being shown down?
 
It's funny; I hate databases. I realized that to better run queries against the raw PM data, I'll most likely need to convert my scripts to run off of a database instead of the PM flat files. Every time I go back to designing the tables and relationships I'm like "uggghhh". Normalization? Maybe ... if I feel like it. Otherwise, it's just getting dumped in 1 column and I'll use JSONB or something :)

Just use our "Data Lake"...

 
I believe @JustinInMN has more experience in the area of statistics than I do, so I'll keep this simple and stand by if he wants to hash out details, but I'm pretty sure you're just promoting a baseless assumption here.
Wait, is this the part of the discussion where we all start breaking out our college degrees and transcripts? :ROFL: :ROFLMAO: Probability and statistics is my area of expertise. I use it every day at work, and it's what I studied in grad school (I got As, not Cs). I assure you that my "assumption" is not baseless. In fact, I'll prove it to you - whether you guys will understand it, I don't know, but here you go anyhow, just for fun...

AA is dealt 1 in 221 hands
Pocket pairs, in general, are dealt 1 in 17 hands
221/13 = 17

If each pocket pair is dealt 1 in 189 hands as @BearMetal says his data suggests, then any pocket pair is dealt 1 in ~14.5 hands
189/13 = 14.53846

The number of pocket pairs dealt over 10,000 hands follows a binomial distribution with probability of success 1/17. The probability of the specific pocket pairs averaging out to 1 in 189 hands (i.e., any pocket pair being dealt at least once every 14.53846 hands, a rate of 0.06878308) can be calculated using statistical programming software (I'm using R) as follows:
> 1-pbinom(10000/(189/13), 10000, 1/17)
= 1.887275e-05 or 0.00001887275
Thus, as I stated earlier, the likelihood that you'd average out to 189 hands per pocket pair even over a sample size of just 10,000 hands would be essentially zero (as I've proven here, 0.00001887275 is pretty damn close to 0).

However, the probability of pocket pairs averaging 1 in 189 hands after 156,356 hands have been dealt is:
1-pbinom(156356/(189/13), 156356, 1/17)
= 0
Yep, 0. Not happening.


The standard deviation of the total number of pocket pairs dealt after 156,356 hands is defined as sqrt(n*p*(1-p)) and can be calculated here as follows:
> sqrt(156356*(1/17)*(1-1/17))
= 93.03971

99.7% of observations fall within 3 standard deviations of the mean. Given the true probability of a pocket pair being dealt is 1 in 17 hands, we know that 99.7% of the time that we run this experiment with a fair deck after 156,356 hands, we will have between 8918 and 9477 total pocket pairs.
> (156356/17) - 3 * 93.03971 # Lower bound
= 8918.293
> (156356/17) + 3 * 93.03971 # Upper bound
= 9476.531

BearMetal claims that there were ~10,755 total pocket pairs dealt, as he said they averaged out to 1 in 189 hands per pair after 156,356 hands.
156356/(189/13) = 10754.65

This is basically impossible, as this would be almost 17 standard deviations away from the expected number of 9197 total pocket pairs dealt (156356/17 = 9197). Either the deck is stacked, or his calculations/data aggregations are wrong. As I stated before, and as I proved here, variance cannot explain his results.
(10755-9197)/93.03971 = 16.74554 standard deviations
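As a side note (my addition, using the same binomial machinery): the check works just as well for a single pocket pair. Plugging in the AA numbers BearMetal reported (827 in 156,351 hands) against the true 1-in-221 rate shows how far that one count sits from expectation:

n <- 156351      # player hands as reported
observed <- 827  # AA hands as reported
p <- 1/221       # true probability of a specific pocket pair
expected <- n * p
sd_count <- sqrt(n * p * (1 - p))
(observed - expected) / sd_count     # standard deviations above expectation
1 - pbinom(observed - 1, n, p)       # P(827 or more AA hands with a fair deal)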
 
Wait, is this the part of the discussion where we all start breaking out our college degrees and transcripts? :ROFL: :ROFLMAO: Probability and statistics is my area of expertise. I use it every day at work, and it's what I studied in grad school (I got As, not Cs). I assure you that my "assumption" is not baseless...
So what you’re telling me is that at the end of the night @CraigT78 can see all my cards once the game is over?!?! WTF :ROFL: :ROFLMAO: :ROFL: :ROFLMAO: :ROFL: :ROFLMAO:
 
Wait, is this the part of the discussion where we all start breaking out our college degrees and transcripts? :ROFL: :ROFLMAO: Probability and statistics is my area of expertise. I use it every day at work, and it's what I studied in grad school (I got As, not Cs). I assure you that my "assumption" is not baseless...
That all sounds very cool. Can someone please now use this type of analysis to help me figure out how sucko keeps hitting everything he needs when he constantly gets it in behind?
 
BearMetal claims that there were ~10,755 total pocket pairs dealt, as he said they averaged out to 1 in 189 hands per pair after 156,356 hands.
156356/(189/13) = 10754.65

This is basically impossible, as this would be almost 17 standard deviations away from the expected number of 9197 total pocket pairs dealt (156356/17 = 9197). Either the deck is stacked, or his calculations/data aggregations are wrong. As I stated before, and as I proved here, variance cannot explain his results.
(10755-9197)/93.03971 = 16.74554 standard deviations
I get what you're saying. I ran my script on a friend's server just to test his output (he has fewer hands than mine). Here's what I got:

Total Player Hands Dealt: 102999
Hand    Occurrences    1-in-x Hands
AA      479            1-in-215
KK      449            1-in-229
QQ      493            1-in-209
JJ      454            1-in-227
TT      466            1-in-221
99      478            1-in-215
88      498            1-in-207
77      434            1-in-237
66      484            1-in-213
55      415            1-in-248
44      492            1-in-209
33      480            1-in-215
22      459            1-in-224

So, my same script produces much "better" results on this data. What's up with my site, then???
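Running the same kind of sanity check over those numbers (a quick sketch of mine, using the counts just listed against the 1-in-221 expectation over 102,999 hands):

# Friend's-server counts, in the order listed above (AA down to 22).
counts <- c(AA = 479, KK = 449, QQ = 493, JJ = 454, TT = 466, `99` = 478,
            `88` = 498, `77` = 434, `66` = 484, `55` = 415, `44` = 492,
            `33` = 480, `22` = 459)
n <- 102999
p <- 1/221
expected <- n * p                  # about 466 of each specific pair
sd_pair  <- sqrt(n * p * (1 - p))  # about 21.5
round((counts - expected) / sd_pair, 2)   # z-scores; a fair deal stays mostly within +/- 2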
 
Wait, is this the part of the discussion where we all start breaking out our college degrees and transcripts? :ROFL: :ROFLMAO: Probability and statistics is my area of expertise. I use it every day at work, and it's what I studied in grad school (I got As, not Cs). I assure you that my "assumption" is not baseless...

This is why I got a C. But this was the analysis I was looking for. Thanks, @RainmanTrail.

My instinct is off. What would be an acceptable margin of error within one standard deviation?
 
So, my same script produces much "better" results on this data. What's up with my site, then???
Maybe your hands (or some specific player's) are being excluded from the results somehow, perhaps somewhere in your settings? I'm not familiar with Poker Mavens software, so I can't help with their data queries, but it seems like there might be a filter being applied somewhere. I just know probability theory & statistics well enough to know that either your numbers are wrong, or the software is rigged. Although apparently it's only rigged for your game if so, and not anyone else's, which makes me think there's something wrong with your numbers.
 
