I never forget this quote from Wikipedia:
"Regression toward the mean simply says that, following an extreme random event, the next random event is likely to be less extreme. In no sense does the future event "compensate for" or "even out" the previous event, though this is assumed in the gambler's fallacy (and variant law of averages). Similarly, the law of large numbers states that in the long term, the average will tend towards the expected value, but makes no statement about individual trials. For example, following a run of 10 heads on a flip of a fair coin (a rare, extreme event), regression to the mean states that the next run of heads will likely be less than 10, while the law of large numbers states that in the long term, this event will likely average out, and the average fraction of heads will tend to 1/2. By contrast, the gambler's fallacy incorrectly assumes that the coin is now "due" for a run of tails, to balance out."
So I ran 1 million trials to see the worst extremes for the even-money bets:
Low: 18 in a row (once)
High: 18 in a row (once)
Red: 19 in a row (once)
Black: 20 in a row (twice)
Odd: 18 in a row (twice)
Even: 18 in a row (once)
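For anyone who wants to reproduce this, here is a minimal sketch of the kind of simulation I mean. It assumes a European single-zero wheel (pockets 0 to 36, with zero losing every even-money bet) and a fair RNG; the function name `longest_streaks` and the fixed seed are my own choices, not anything standard.

```python
import random

def longest_streaks(n_spins, seed=42):
    """Simulate n_spins of a European wheel (0-36) and record the
    longest consecutive winning run for each even-money outcome."""
    # Red numbers on a standard European layout
    reds = {1, 3, 5, 7, 9, 12, 14, 16, 18, 19, 21,
            23, 25, 27, 30, 32, 34, 36}
    bets = {
        "Low":   lambda n: 1 <= n <= 18,
        "High":  lambda n: 19 <= n <= 36,
        "Red":   lambda n: n in reds,
        "Black": lambda n: n != 0 and n not in reds,
        "Odd":   lambda n: n % 2 == 1,
        "Even":  lambda n: n != 0 and n % 2 == 0,
    }
    rng = random.Random(seed)
    current = {name: 0 for name in bets}   # length of the run in progress
    best = {name: 0 for name in bets}      # longest run seen so far
    for _ in range(n_spins):
        n = rng.randrange(37)              # 0..36; zero breaks every streak
        for name, hits in bets.items():
            if hits(n):
                current[name] += 1
                best[name] = max(best[name], current[name])
            else:
                current[name] = 0
    return best

print(longest_streaks(1_000_000))
```

With a different seed the exact streak lengths will differ, but for 1 million spins the longest runs typically land in the high teens, consistent with the numbers above.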
So the worst case, 20 in a row, happened twice in 1 million trials, and during those streaks there was no hit and no regression toward the mean.
I just tested this because I would like to confirm that 10 events in a row is extreme, and that our expectation for the next 10 trials should include at least one winning bet.
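The arithmetic behind that expectation can be written down directly. This sketch again assumes a European single-zero wheel, where an even-money bet loses on 19 of the 37 pockets each spin, and assumes the spins are independent; note that because of that independence, the probability of at least one win in the next 10 spins is the same (about 99.87%) whether or not a losing streak just happened, which is exactly the point of the gambler's-fallacy quote above.

```python
# Even-money bet on a European wheel: 19 losing pockets out of 37 per spin,
# spins assumed independent.
p_lose = 19 / 37                        # zero plus the 18 opposite numbers
p_10_losses = p_lose ** 10              # chance of 10 losses in a row
p_win_in_next_10 = 1 - p_10_losses      # chance the next 10 spins hold a win

print(f"P(10 losses in a row)       = {p_10_losses:.5f}")
print(f"P(>=1 win in next 10 spins) = {p_win_in_next_10:.5f}")
```

So a run of 10 losses has a probability of roughly 0.13%, which is why it counts as extreme, while at least one win in any window of 10 spins is very likely but never guaranteed.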