In roulette, can two events with the same probability have different statistics for how often they occur in a row (i.e., for clustering)?
Red and Black are usually described as 50/50 (ignoring the zero), but does that per-spin probability alone dictate how many reds or blacks appear in a row on average?
My suspicion is that two events could each be 50/50 per spin and yet produce very different average and maximum run lengths.
So could events share the same probability (50/50 or whatever it happens to be) while having different clustering properties?
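To illustrate the idea, here is a small Python sketch (a toy example of my own, not real wheel data): two sequences that are both 50% red overall, one with independent spins and one that alternates deterministically. Both are "50/50" per outcome, yet their run-length behaviour is completely different:

```python
import random

def run_lengths(seq):
    """Return the lengths of consecutive same-value runs in seq."""
    runs, count = [], 1
    for prev, cur in zip(seq, seq[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

random.seed(1)
n = 10_000

# Independent 50/50 outcomes: free to cluster.
iid = [random.choice("RB") for _ in range(n)]

# Strict alternation: still 50% red overall, but it can never cluster.
alternating = ["R" if i % 2 == 0 else "B" for i in range(n)]

for name, seq in [("independent", iid), ("alternating", alternating)]:
    runs = run_lengths(seq)
    print(f"{name}: average run = {sum(runs) / len(runs):.2f}, "
          f"longest run = {max(runs)}")
```

The independent sequence averages about 2 in a row with occasional long streaks, while the alternating one never exceeds 1 in a row, even though both are exactly 50% red. Below are the run-length counts I have for each dozen bet: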
Run length | Dozen 1 | Dozen 2 | Dozen 3
1          | 1462    | 1415    | 1447
2          | 495     | 501     | 475
3          | 159     | 157     | 167
4          | 55      | 36      | 57
5          | 20      | 17      | 15
6          | 3       | 8       | 8
7          | 1       | 3       | 2
8          | 0       | 0       | 1

Summary for Dozen 1 (2195 runs in total):

Exactly 1 in a row | 1462 | 67%
2+ in a row        | 733  | 33%
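For reference, counts like these can be reproduced in simulation. A minimal sketch, assuming a European single-zero wheel (so Dozen 1 hits with probability 12/37 per spin) and roughly 10,000 spins, which is about the sample size implied by the totals above:

```python
import random
from collections import Counter

random.seed(7)
N_SPINS = 10_000  # assumed sample size, in line with the totals above

# European wheel: pockets 0-36; Dozen 1 is the numbers 1-12.
spins = [random.randint(0, 36) for _ in range(N_SPINS)]
hits = [1 <= pocket <= 12 for pocket in spins]

# Tally the length of every consecutive run of Dozen 1 hits.
run_counts = Counter()
current = 0
for hit in hits:
    if hit:
        current += 1
    elif current:
        run_counts[current] += 1
        current = 0
if current:  # close a run that reaches the end of the session
    run_counts[current] += 1

for length in sorted(run_counts):
    print(f"Dozen 1, {length} in a row: {run_counts[length]}")
```

Each run is counted once at its final length (a streak of 3 counts as one run of 3, not also as runs of 1 and 2), which is the same convention as in my table.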
From the table, a run of exactly 1 is clearly more likely than a run of 2+ (67% vs 33%), right?
But when I compute the average run length for Dozen 1 I get 1.49 (3274 total hits divided by 2195 runs), and that sits almost halfway between 1 and 2, which makes it sound like a close call between 1 in a row and 2+ in a row.
What is the difference between these two statistics, and which one more accurately describes the clustering?
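To make the comparison concrete, this is how I am computing both numbers from the Dozen 1 tallies (plain Python, counts taken straight from the table):

```python
# Dozen 1 run-length tallies from the table: run length -> count.
dozen1 = {1: 1462, 2: 495, 3: 159, 4: 55, 5: 20, 6: 3, 7: 1}

total_runs = sum(dozen1.values())                             # 2195
total_hits = sum(length * n for length, n in dozen1.items())  # 3274

share_exactly_1 = dozen1[1] / total_runs  # ~0.67
share_2_plus = 1 - share_exactly_1        # ~0.33
avg_run_length = total_hits / total_runs  # ~1.49

print(f"runs of exactly 1:  {share_exactly_1:.0%}")
print(f"runs of 2 or more:  {share_2_plus:.0%}")
print(f"average run length: {avg_run_length:.2f}")
```

Both figures come from the same run-length tally: the 67% is the share of runs falling in one category, while the 1.49 is the mean of the whole distribution, where the rare long runs pull the average upward.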