Poker Strategy Winning With Game Theory

Kurt Verstegen

Applying von Neumann’s principles to beat a simplified toy game is one thing; figuring out how to use game theory to win online poker riches isn’t as simple. The more complex the game, and the more variables and decision points you throw in, the harder it is to find an optimal strategy.

An application of game-theoretic betting principles can be found in the book "Poker Strategy: Winning with Game Theory" by Nesmith C. Ankeny. In this 1981 book, the author attempts to give a complete near-optimal strategy for five-card draw. The relationship between pot odds and the odds of winning is one of the most important concepts in poker strategy. Pot odds are the ratio of the size of the pot to the size of the bet required to stay in the pot: for example, if a player must call $10 for a chance to win a $40 pot (not including their $10 call), their pot odds are 4:1.

In this article we will continue talking about game theory. If you haven't read Part 1 yet, I would advise you to do so first, otherwise you might not fully understand this article. So let's get right to it.

Half-street games, what are they? These are simple games with the following characteristics:

• Player 1 (often called X) checks in the dark

• Player 2 (often called Y) then has the choice to also check or bet a certain amount in accordance with the rules of the game

• If Y bets, X can call and there is a showdown; X can also fold, but he can't raise. If Y checks, the players also see a showdown.

So to illustrate this:
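To make the structure concrete before we start calculating, here is a quick Python sketch of one hand of this game. The function name and the example frequencies are mine, purely for illustration; the derivation of those frequencies follows below.

```python
import random

def play_half_street(pot, call_freq, bluff_freq):
    """Play one hand of the half-street game and return Y's ex-showdown result.

    pot        -- P, the pot size measured in bets (the bet size is 1)
    call_freq  -- C, how often X calls when facing a bet
    bluff_freq -- B, how often Y bets a losing hand as a bluff

    Y's hand beats X's hand 50% of the time, and Y always bets the nuts.
    Ex-showdown means we only count money that changes hands because of the bet.
    """
    y_has_nuts = random.random() < 0.5            # Y holds the winning hand half the time

    # X has already checked in the dark, so the first real decision is Y's.
    if y_has_nuts:
        y_bets = True                              # always valuebet the nuts
    else:
        y_bets = random.random() < bluff_freq      # bluff B of the weak hands

    if not y_bets:
        return 0.0                                 # check behind: no extra money moves

    if random.random() < call_freq:                # X calls C of the time
        return 1.0 if y_has_nuts else -1.0         # win or lose one bet at showdown
    else:
        return pot if not y_has_nuts else 0.0      # X folds: only a bluff gains anything


# Example: P = 3 with the frequencies derived later (C = 0.75, B = 0.25).
# Y's average ex-showdown result should come out around +0.375 bets per hand.
results = [play_half_street(3, 0.75, 0.25) for _ in range(200_000)]
print(sum(results) / len(results))
```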

When we talk about the value of the game, we mean the EV of player Y, given that players X and Y both play optimally. When we talk about ex-showdown value, we mean the money that goes from one player to the other as a result of the betting in the game. In this game the hand of player Y is picked randomly from a selection, 50% of which beats the hand of player X and 50% of which loses to it. What becomes apparent immediately is that player Y can't have negative EV in this game, because he always has the option to check behind, giving him an EV of 0 at any moment. If both players always check their hands, both will win 50% of the time. What is also important is that X only gets one hand and player Y knows this hand, so player Y has an information advantage.

X and Y both only have to make one decision. Y has to come up with a range with which to bet, and X has to come up with a range with which he will call a bet from player Y. Actually, the range of player X is only one hand, so he will have to decide how often to call a bet with that hand. Y could now play a pure strategy (meaning he chooses one option 100% of the time) and always valuebet the nuts and check his worse hands. X can obviously exploit this strategy by always folding when player Y bets, because he knows that Y only bets the nuts and X would always lose if he calls. If X does this, Y can change his strategy to betting the nuts and bluffing his weak hands. If Y does this, then X can start calling 100% of the time again, and as a result Y can go back to his original strategy: bet the nuts and check his weak hands. As you can see, we have a pattern of recurring strategies here. This tells us that the optimal strategy will be a mixed strategy, that is, a strategy where you execute different options a certain percentage of the time.

In this game there are two strategic choices: X has to decide how often he will call, and Y has to decide how often he will bluff. Y will always bet the nuts, as this option is clearly more profitable than checking. X will call C% of the time and Y will bluff B% of the time. Once we find values for C and B we have our answer.

Let's start with player X. X plays optimally when player Y is indifferent between bluffing and checking. By this I mean: bluffing and checking have the same EV for player Y. If Y bluffs successfully (i.e. player X folds) he wins P bets, and he loses one bet if X calls. If C is the call frequency, then 1-C is the fold frequency. If X plays optimally, then Y is indifferent between bluffing and checking, so:

(pot size)(fold-frequency X) = (bluff bet)(call-frequency X)

(P)(1-C) = (1)(C)

P – PC = C

P = C + PC

P = C(1+P)

C = P/(1+P)

As we can see, the bigger the pot, the more often X will have to call. This relates back to the pot-odds principle: the more money in the pot, the more often X will have to call to counter Y's bluffing behaviour.
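As a quick sanity check (my own sketch, not part of the original derivation), we can plug C = P/(1+P) into Y's bluffing EV and see that it comes out to exactly 0, the same as checking, for any pot size:

```python
def bluff_ev(pot, call_freq):
    """Y's ex-showdown EV of bluffing one weak hand:
    win the pot when X folds, lose one bet when X calls."""
    return pot * (1 - call_freq) - 1 * call_freq

for P in (2, 3, 4, 6):
    C = P / (1 + P)                        # X's optimal call frequency
    print(P, round(bluff_ev(P, C), 10))    # 0.0 every time: Y is indifferent
```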

On the other side we have Y's strategy. Y will have to bluff often enough so that X is indifferent between calling and folding. If X calls, he loses one bet to Y's valuebet and wins P+1 bets when he calls a bluff. Remember that B is the bluff frequency. Since Y bets all of his nut hands, for every valuebet he makes there are B bluffs, so X is indifferent when the one bet he loses to a valuebet is balanced by the P+1 bets he wins against the bluffs:

1 = B(P+1)

B = 1/(P+1)

The value of 1/(P+1) is very important in poker analysis. Because it is so important, we will call it A, so:

A = 1/(P+1).
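The same kind of check works from X's side. With Y betting all of his nut hands and bluffing A of his weak hands, calling and folding have the same EV for X; a small sketch (again mine, with illustrative pot sizes):

```python
def call_ev_vs_fold(pot, bluff_freq):
    """X's EV of calling a bet, measured relative to folding.

    Y bets all of his nut hands and bluffs bluff_freq of his weak hands, so
    (with the 50/50 hand split) valuebets and bluffs occur in the ratio 1 : bluff_freq.
    Calling a bluff wins the pot plus Y's bet (P + 1); calling a valuebet loses 1.
    """
    p_bluff = bluff_freq / (1 + bluff_freq)      # chance the bet we face is a bluff
    p_value = 1 / (1 + bluff_freq)
    return p_bluff * (pot + 1) - p_value * 1

for P in (2, 3, 4, 6):
    A = 1 / (P + 1)                              # Y's optimal bluff frequency
    print(P, round(call_ev_vs_fold(P, A), 10))   # 0.0 every time: X is indifferent
```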

A implies two things in this game. First of all, X will have to call often enough to make Y indifferent between checking and bluffing his weak hands. X's call frequency is equal to P/(P+1), which is equal to 1-A. For those of you who don't understand why P/(P+1) = 1-A, the following might help you:

If A = 1/(P+1) then:

1-A = 1 – 1/(P+1)

1-A = (P+1)/(P+1) – 1/(P+1)

1-A = (P+1-1)/(P+1)

1-A = P/(P+1)


1-A is therefore the call frequency of X, and A is X's fold frequency when he is confronted with a bet from player Y. Furthermore, as we worked out earlier, Y will bluff a fraction A = 1/(P+1) of his weak hands. In general we can therefore say that the optimal strategy in this game is as follows: Y bets all his nut hands and bluffs A of the time with his weak hands (or you could say he bluffs A/2 of all his hands, seeing as his ratio of winning and losing hands is 50-50), and X calls 1-A of the time when facing a bet.

Let's use an example for this. Imagine that P = 3. This means that:

A = 1/(3+1)

A = 0.25 and also: 1-A = 0.75

We can see now that Y will bluff 25% of the time. So he will bet every time he has the nuts (50% of the time), and bet 25% of the time when he has a dead hand, which is equal to 0.25 x 0.5 = 0.125 = 12.5% of the time (and as we can see, A/2 is also 0.125). Furthermore, player X will call 75% of the time.

Now let's say that P = 4.

A = 1/(4+1)

A = 0.20 and also: 1-A = 0.80

We can see now that Y will bet 20% of the time as a bluff. So he will always bet when he has the nuts (50% of the time) and bet 20% of the time when he has a dead hand, which is equal to 0.20 x 0.5 = 0.10 = 10% of the time (and as we can see, A/2 is also 0.10). Furthermore, player X will call 80% of the time.
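Both examples come straight out of one small helper. The function below (my own naming) prints the full optimal strategy for any pot size:

```python
def optimal_strategy(pot):
    """Optimal frequencies for the half-street game with pot size P and a bet of 1."""
    A = 1 / (pot + 1)
    return {
        "Y valuebets (nuts)":      0.5,       # half of Y's hands are the nuts
        "Y bluffs (of all hands)": A / 2,     # A of his weak hands = A/2 of all hands
        "X calls a bet":           1 - A,
        "X folds to a bet":        A,
    }

for P in (3, 4):
    print(P, optimal_strategy(P))
# P = 3 -> Y bluffs 12.5% of all hands, X calls 75%
# P = 4 -> Y bluffs 10% of all hands, X calls 80%
```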

Observe how Y bluffs less as the pot gets bigger. This might go against your intuition, since a successful bluff in a bigger pot wins you more, but an important principle of optimal play is that against an optimal opponent a bluff is not profitable in itself. Bluffs and valuebets are combined precisely so that the optimal strategy keeps its value regardless of how your opponent reacts.

Let's now look at a real poker example where we use game theory to decide how often we can bluff. Imagine we're playing No Limit Texas Hold'em, we are heads-up waiting for the river card, and we want to know how often we can bluff in this situation. Bear in mind that we don't always need game theory to come up with the answer: against a player who calls a lot, we simply never bluff, and against a player who folds a lot, we can bluff more often. Game theory comes in handy when we don't know our opponent very well, or when we think he is better than we are, and we want to make sure that he can't exploit us.

Imagine we have a 20% chance of winning on the river (for example with a flush draw). There is $100 in the pot and $50 seems to be an appropriate bet (maybe $60 would be better, but for the sake of keeping the pot odds simple we'll choose $50). Our opponent then gets odds of 150:50, or 3:1. To find a bluff frequency here we need to make sure that our bluff odds are equal to his pot odds. By bluff odds I mean the chance that we are bluffing when we bet.

Seeing as his pot odds are equal to 3:1, our bluff odds also need to be 3:1, or 25%. If we bet on the river, we will therefore do so with the best hand 75% of the time, and 25% of the time we'll be bluffing. The 75% of valuebets corresponds to the 20% chance we have of hitting the best hand, so the 25% of bluffs corresponds to 20%/3 ≈ 6.67% of all hands. Therefore, on the river, we will bet with the best hand 20% of the time, bluff 6.67% of the time and check the remaining 73.33% of the time.

So how can you use this in practice? We can choose to bet all of our 9 outs plus 3 additional 'bluff outs'. Make sure that you can actually represent something with these bluff outs, otherwise this plan will backfire. Three outs are just about equal to 6.67% (3 divided by 46 unknown cards ≈ 0.065). So, on the river, we bet our 9 outs that give us the flush plus the 3 additional outs we chose beforehand.
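In code the bookkeeping looks like this (the pot, bet and out counts are the ones from the example above; the variable names are mine):

```python
pot, bet = 100, 50
outs, unseen = 9, 46

pot_odds = (pot + bet) / bet            # 3.0 -> our opponent gets 3:1 on a call
p_value_bet = outs / unseen             # ~0.196: we hit the flush and bet
p_bluff = p_value_bet / pot_odds        # ~0.065: one bluff for every three valuebets
p_check = 1 - p_value_bet - p_bluff     # ~0.739: everything else we check

bluff_outs = p_bluff * unseen           # ~3 cards we nominate as "bluff outs"
print(p_value_bet, p_bluff, p_check, bluff_outs)
```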

If our opponent now has to call a bet from us for which he gets odds of 3:1, he will see that we have the best hand 75% of the time and that we're bluffing 25% of the time. His chance of winning is therefore 25%. His EV of a call is: (0.25)($150) + (0.75)(-$50) = $37.50 - $37.50 = $0, and his EV of a fold is also $0. As you can see, by using game theory we made sure that we cannot be exploited by an opponent who is unknown or probably better than we are. So, follow these steps if you want to avoid being exploited when bluffing:


1) Decide on a good, believable bet size and look at the odds your opponent is getting from this bet size.

2) Make sure that your bluff odds are equal to the pot odds of your opponent. In other words, if you bet on the river, the chance that you are bluffing has to equal the pot odds he is getting, expressed as a percentage.


Just another quick example: The pot is $500 and you bet $400. Your opponent now gets odds of 2.25:1 or 30.77%. If you decide to bet in this situation, you should do so with the best hand 69.23% of the time and bluff 30.77% of the time. This way your opponent won't be able to exploit you.
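Both river examples reduce to the same two lines of arithmetic. A final sketch (function names are mine) that gives the unexploitable bluff fraction for any pot and bet size, together with the opponent's EV of calling:

```python
def bluff_fraction(pot, bet):
    """Fraction of our betting range that can be bluffs so the caller is indifferent.
    Equals 1 / (pot odds + 1), where pot odds = (pot + bet) / bet."""
    return bet / (pot + 2 * bet)

def caller_ev(pot, bet):
    """Opponent's EV of calling our bet when we bluff exactly bluff_fraction of the time."""
    b = bluff_fraction(pot, bet)
    return b * (pot + bet) - (1 - b) * bet   # wins pot + bet vs a bluff, loses bet vs a valuebet

print(bluff_fraction(100, 50), caller_ev(100, 50))    # 0.25, 0.0     -> the $100 pot / $50 bet example
print(bluff_fraction(500, 400), caller_ev(500, 400))  # ~0.3077, 0.0  -> the $500 pot / $400 bet example
```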



The examples in this article were inspired by two books, The Mathematics of Poker and The Theory of Poker. Since these books, in my opinion, fail to be very clear on a couple of things, I decided to try to present the topic a little more clearly and understandably for everyone. But if you're interested in more information on game theory, these two books are still a good source.

I hope you found it interesting. As always, questions, comments and criticism are more than welcome on the forum.

  • Tags

    Intermediate strategy, Advanced strategy