I've been working on a new project/method which I will present on the forum once I work out the details and narrative. While putting the pieces together, I ran some tests that produced data interesting enough that I wanted to share it.
I've added an image of a graph (below) showing the results of some of my test runs using this methodology. I ran 4 sessions of 100 spins each, 400 spins total. Each session is overlaid on the graph: blue is session 1, red is session 2, yellow is session 3, and green is session 4. I also added a 5th line, shown in dark purple, representing the average of the 4.
The session lines remained fairly close together up to around the 24th spin, then they all began to diverge quite wildly. Sessions 1 and 3 drifted above the break-even point (5000) for a while, while sessions 2 and 4 started sinking significantly. Although all of them eventually tilted negative, the short time frame showed relative stability. What really caught my eye was that the divergence in each session started at nearly the same point (the 24th spin). You would expect them to scatter, but all at nearly the same point and time? Interesting to say the least. I've got some theories on this, but they would just bounce around in my head, so I'd love to hear from the members and get some input. One thing that came to mind: with such a narrow window at the start of the divergence, some loss-limit triggers (i.e., tipping points beyond which recovery becomes highly unlikely) could be devised from it. The betting method is non-progressive and does not vary.
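For anyone who wants to poke at the divergence question themselves, here's a minimal sketch of how sessions like these could be simulated. To be clear, the bet size (50), the even-money bet type, and the single-zero wheel are my own assumptions for illustration, not details of the actual method being tested:

```python
import random

def simulate_session(spins=100, bankroll=5000, bet=50, seed=None):
    """Simulate one session of flat (non-progressive) even-money bets
    on a single-zero wheel. Returns the bankroll after each spin.

    All parameter values here are illustrative assumptions.
    """
    rng = random.Random(seed)
    path = []
    for _ in range(spins):
        # An even-money bet (e.g. red) wins on 18 of 37 pockets
        bankroll += bet if rng.random() < 18 / 37 else -bet
        path.append(bankroll)
    return path

# Overlay 4 sessions and build the average line, like the graph described
sessions = [simulate_session(seed=s) for s in range(4)]
average = [sum(vals) / len(vals) for vals in zip(*sessions)]
```

Running many batches of this and checking where the paths first spread apart would show whether a shared divergence point around spin 24 is a repeatable feature or just what 4 random walks happened to do that day.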
I prefer to use and compare multiple shorter-term data sets as opposed to one long graph that stretches to infinity, as they seem more applicable to casino (and real-life) situations.