# Which casino games are the worst, and why

Those of you who know me personally may recall that back in the day, I was a pretty solid card player. Mostly blackjack, but I played around a bit (successfully in theory, not so much in practice) with poker. At any rate, I got into a conversation with some of my students about the mathematics of gambling the other day, and I put the question to them:

How does the casino figure out how much you’re going to lose?

Your immediate answer might be, “As much as they can,” but bear with me. This question isn’t quite as stupid as it seems. The answer is also very technical, so you may want to skip this one if you’re not into that sort of thing.

First things first: the motivation of the casino. A casino wants to win as much money, in aggregate, from the players as possible, while still making sure those players come back for more. Think about it this way: if you lost literally every time you played craps, for example, and nearly everyone at the other tables lost as well, there’s no way you’d go back for more. It’s the frequent wins that keep you interested.

## A 30-second review of normal distributions

Typically, you lose more than you win. In the language of probability, a win at a casino might be a “1-$\sigma$” result (though the fraction of wins that’ll keep you coming back is a question for psychology). In lots of situations, we assume or can show that the distribution of probabilities of an event is the so-called “normal distribution” (also known as a Gaussian):

A 1-$\sigma$ event will occur something like 16% of the time. This $\sigma$ is also the number people give when describing a typical error bar. If someone cites a poll saying that $40\% \pm 3\%$ support a candidate, it means there’s a 16% chance that 43% or more actually support him/her.

## How to design a game

Let’s say that I’m designing a new game for a casino. The game should be fun but profitable, given the standards above.

In every game, you either win with probability $p$ or lose with probability $1-p$. If you win, you get $W$ times your original bet. If you lose, you’re out one unit. That’s it.

Now, I realize that real games aren’t quite so simple. In blackjack, for instance, there’s doubling down, splitting, and blackjacks themselves, each of which pays off more than even odds if you’re successful. Still, for any game you can approximate the average payout. As a good approximation, you either win or lose, and if you win, you win whatever your original bet was. So $W\simeq 1$ for blackjack.

For roulette, on the other hand, a successful bet on an individual number pays off at 35:1, so $W=35$ for roulette.

The big number that the casino cares about is your expected gain per bet (which had better be a negative number, at least from their perspective). If you play $N$ hands, then you expect to win:

$\langle n_{wins}\rangle = pN$

where $\langle n_{wins}\rangle$ is the expected number of wins. Thus, your expected total gain is:

$\langle R\rangle = pN\times W - (N - pN)$

or equivalently, your expected gain per hand is

$\langle r\rangle = p(1+W) - 1$

Let’s say that you have a 49.5% chance of winning a hand of blackjack (which is a pretty accurate number, but depends slightly on local variants of the rules). The return equation gives a return of

$r_{blackjack}=-0.01$

On the other hand, consider roulette. In American-style roulette, there are 38 different possible numbers: 18 red, 18 black, plus a green 0 and a green 00. This means that your probability of winning a bet on an individual number is 1/38. Running through the numbers, your expected return in roulette is:

$r_{roulette}=-0.05$

about five times worse than blackjack! Check for yourselves, and you’ll see that you get the same return whether you bet on a color (which pays 1:1 odds and has an 18/38 chance of winning) or on an individual number.
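To make these numbers concrete, here’s a minimal sketch in Python of the per-hand return formula above, plugged in for the three bets discussed (the probabilities and payouts are the ones quoted in the text):

```python
def expected_return(p, W):
    """Expected gain per unit bet: win W with probability p, lose 1 otherwise."""
    return p * (1 + W) - 1

# Blackjack: roughly even payout (W = 1), ~49.5% chance of winning a hand
r_blackjack = expected_return(0.495, 1)        # ≈ -0.01

# American roulette, single number: 1/38 chance, pays 35:1
r_number = expected_return(1 / 38, 35)         # ≈ -0.053

# American roulette, color bet: 18/38 chance, pays 1:1
r_color = expected_return(18 / 38, 1)          # same ≈ -0.053 as a number bet
```

Running this confirms the claim above: the single-number and color bets have identical expected returns, both about five times worse than blackjack.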

So far, we’ve only worried about the average return per play. The casino wants this to be negative, and it wants you to play as long as possible. The longer you (and the thousands of other people in the casino) play, the more you’ll lose on aggregate.

But you don’t lose at a steady pace. Instead, after $N$ hands, you might be up, and you might be down. Still, your expected net return is:

$\langle R\rangle = rN$

The variation in the number of wins goes as $\sqrt{N}$. In particular, it’s straightforward to show that for a binomial game (one with only two outcomes), after $N$ hands, your return is likely to be in the range:

$R = rN \pm \sqrt{p(1-p)}\,(1+W)\sqrt{N}$

That is a lot to digest. The upshot is that the longer you play, the larger your swings. Notice also that the bigger $W$ is, the larger your expected swings.

How long can you play before you almost certainly will lose? We’re looking for the situation where a 1-$\sigma$ upswing still isn’t enough to help you. In that case:

$rN + \sqrt{p(1-p)}\,(1+W)\sqrt{N} = 0$

or, solving for $N$:

$N = \frac{p(1-p)(1+W)^2}{r^2}$
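Plugging the blackjack and roulette numbers into this break-even horizon gives a feel for the time scales involved. A quick sketch:

```python
def breakeven_hands(p, W):
    """Hands until a 1-sigma lucky streak no longer keeps you at break-even:
    N = p(1-p)(1+W)^2 / r^2, with r = p(1+W) - 1."""
    r = p * (1 + W) - 1
    return p * (1 - p) * (1 + W) ** 2 / r ** 2

N_blackjack = breakeven_hands(0.495, 1)     # ≈ 10,000 hands
N_roulette  = breakeven_hands(1 / 38, 35)   # ≈ 12,000 spins
```

Interestingly, the horizons come out in the same ballpark for both games: roulette’s much worse per-spin return is roughly offset by its much larger swings.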

You can play for a long time if either the return is very close to zero (that is, it’s a fair game) or the payoff is extremely high.

Suppose you bet \$1 per hand or spin on 1000 hands or spins. While there’s no way you’d find a table minimum that low in either Atlantic City or Vegas (at least with favorable rules), it’s not crazy to suppose you might play 1000 hands over a week-long visit to either.

What is your range of reasonable outcomes?

The blue is blackjack, and the green is roulette. Even if you don’t follow the math above, this picture should be pretty clear. In both cases, you have a decent chance of coming out ahead: about 37% for blackjack and 38% for roulette. But if you lose, you expect to be in far worse shape playing roulette.

The typical blackjack player will be down around \$10, while the typical roulette player will be down \$50. But suppose you’re unluckier than typical, what then?

A “minus 1 $sigma$” blackjack player (someone doing better than only about 16% of all other players in the casino) will be down about \$40, but a similarly unlucky roulette player will be down \$230.
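These dollar figures follow directly from the range formula above. A sketch, assuming \$1 bets over 1000 plays:

```python
import math

def outcome_band(p, W, N):
    """Expected total return and 1-sigma spread after N unit bets:
    R = rN +/- sqrt(p(1-p)) * (1+W) * sqrt(N)."""
    r = p * (1 + W) - 1
    spread = math.sqrt(p * (1 - p)) * (1 + W) * math.sqrt(N)
    return r * N, spread

mean_bj, spread_bj = outcome_band(0.495, 1, 1000)     # ≈ -$10 ± $32
mean_ro, spread_ro = outcome_band(1 / 38, 35, 1000)   # ≈ -$53 ± $182
```

Subtracting one spread from each mean reproduces the unlucky-player numbers: roughly -\$40 for blackjack and -\$230 for roulette.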

Mind you, this is if you play the games perfectly. There’s no chance to make an error in roulette or slots, but even craps, and certainly blackjack, leave ample room for player error, making your odds worse.

So, back to the original question: if a casino wanted to figure out how much you should lose per hand, they simply need to figure out how long a typical player will play in a visit, or how much play they can squeeze out before they don’t mind losing his/her business. Then invert the relation above. You get:

$r_{ideal} = -\frac{\sqrt{p(1-p)}\,(1+W)}{\sqrt{N}}$

The minus sign out front means that you’ll always design it to make players lose on average. The longer you’d like them to play, the closer to “fair” you need to make it.
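As a sanity check, you can run the design formula forward: ask for an even-payout game ($W=1$, $p$ near one half) that keeps a 1-$\sigma$ lucky player at break-even for 10,000 hands, and you get back blackjack-like odds. A sketch:

```python
import math

def ideal_return(p, W, N):
    """Per-hand return the house should target so that a 1-sigma lucky
    player only breaks even after N hands."""
    return -math.sqrt(p * (1 - p)) * (1 + W) / math.sqrt(N)

# Even-payout game, target horizon of 10,000 hands:
r = ideal_return(0.5, 1, 10_000)   # -0.01, i.e. blackjack-like odds
```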

However, the biggest effect is that if you make the payoffs large, the overall return can be very, very bad. We’ve already seen that with roulette. In craps, the typical payout is 1:1 (or nearly so), so the odds are relatively good. Slot machines, which can have typical payoffs of 100:1 (or more), have incredibly bad returns, sometimes close to -10%!

And then think about the very highest-payout gambling there is: the lottery. The payout is millions to one. And, unsurprisingly, about half of the revenue of lotteries gets used for general expenditures. In other words, your expected return is a horrifying -50%.