How Is This Book Organized?

Let me now provide a tour of the book and describe the logic behind its structure. After an introduction to game theory in Chapter 1, Chapter 2 is about constructing a game by using the extensive and strategic forms. My experience is that students are more comfortable with the extensive form because it maps more readily to the real world with its description of the sequence of decisions. Accordingly, I start by working with the extensive form—initiating our journey with a kidnapping scenario—and follow it up with the strategic form, along with a discussion of how to move back and forth between them. A virtue of this presentation is that a student quickly learns not only that a strategic form game can represent a sequence of decisions, but, more generally, how the extensive and strategic forms are related.

Although the extensive form is more natural as a model of a strategic situation, the strategic form is generally easier to solve. This is hardly surprising, since the strategic form was introduced as a more concise and manageable mathematical representation. We then begin by solving strategic form games in Part 2 and turn to solving extensive form games in Part 3.

The approach taken to solving strategic form games in Part 2 begins by laying the foundations of rational behavior and the construction of beliefs based upon players being rational. Not only is this logically appealing, but it makes for a more gradual progression as students move from easier to more difficult concepts. Chapter 3 begins with the assumption of rational players and applies it to solving a game. Although only special games can be solved solely with the assumption of rational players, it serves to introduce students to the simplest method available for getting a solution. We then move on to assuming that each player is rational and that each player believes that other players are rational. These slightly stronger assumptions allow us to consider games that cannot be solved solely by assuming that players are rational. Our next step is to assume that each player is rational, that each player believes that all other players are rational, and that each player believes that all other players believe that all other players are rational. Finally, we consider when rationality is common knowledge and the method of the iterative deletion of strictly dominated strategies (IDSDS). In an appendix to Chapter 3, the more advanced concept of rationalizable strategies is covered. Although some books cover it much later, this is clearly its logical home, since, having learned the IDSDS, students have the right mind-set to grasp rationalizability (if you choose to cover it).

Nash equilibrium is generally a more challenging solution concept for students because it involves simultaneously solving all players’ problems. With Chapter 4, we start slowly with some simple 2 × 2 games and move on to games allowing for two players with three strategies and then three players with two strategies. Games with n players are explored in Chapter 5. Section 5.4 examines the issue of equilibrium selection and is designed to be self-contained; a reader need only be familiar with Nash equilibrium (as described in Chapter 4) and need not have read the remainder of Chapter 5. Games with a continuum of strategies are covered in Chapter 6 and include those that can be solved without calculus (Section 6.2) and, for a more advanced course, with calculus (Section 6.3).

The final topic in Part 2 is mixed strategies, which is always a daunting subject for students. Chapter 7 begins with an introductory treatment of probability, expectation, and expected utility theory. Given the complexity of working with mixed strategies, the chapter is compartmentalized so that an instructor can choose how deeply to go into the subject. Sections 7.1–7.4 cover the basic material. More complex games, involving more than two players or more than two strategies, are in Section 7.5, while the maximin strategy for zero-sum games is covered in Section 7.6.

Part 3 tackles extensive form games. (Students are recommended to review the structure of these games described in Sections 2.2–2.4; repetition of the important stuff never hurts.) Starting with games of perfect information, Chapter 8 introduces the solution concept of subgame perfect Nash equilibrium and the algorithm of backward induction. The definition of subgame perfect Nash equilibrium is tailored specifically to games of perfect information. That way, students can become comfortable with this simpler notion prior to facing the more complex definition in Chapter 9 that applies as well to games of imperfect information. Several examples are provided, with particular attention to waiting games and games of attrition. Section 8.5 looks at some logical and experimental sources of controversy with backward induction, topics lending themselves to spirited in-class discussion. Games of imperfect information are examined in Chapter 9. After introducing the idea of a “game within a game” and how to properly analyze it, a general definition of subgame perfect Nash equilibrium is provided. The concept of commitment is examined in Section 9.4.

Part 4 covers games of incomplete information, which is arguably the most challenging topic in an introductory game theory class. My approach is to slow down the rate at which new concepts are introduced. Three chapters are devoted to the topic, which allows both the implementation of this incremental approach and extensive coverage of the many rich applications involving private information.


Chapter 10 begins with an example based on the 1938 Munich Agreement and shows how a game of imperfect information can be created from a game of incomplete information. With a Bayesian game thus defined, the solution concept of Bayes–Nash equilibrium is introduced. Chapter 10 focuses exclusively on when players move simultaneously and thereby abstracts away from the more subtle issue of signaling. It begins with two-player games in which only one player has private information and then takes on the case of both players possessing private information. Given the considerable interest in auctions among instructors and students alike, both independent private-value auctions and common-value, first-price, sealed-bid auctions are covered, and an optional chapter appendix covers a continuum of types. The latter requires calculus and is a nice complement to the optional calculus-based section in Chapter 6. (In addition, the second-price, sealed-bid auction is covered in Chapter 3.)

Chapter 11 assumes that players move sequentially, with the first player to move having private information. Signaling then emerges, which means that, in response to the first player’s action, the player who moves second uses Bayes’s rule to update her beliefs as to the first player’s type. An appendix introduces Bayes’s rule and how to use it. After the concepts of sequential rationality and consistent beliefs are defined, perfect Bayes–Nash equilibrium is introduced. This line of analysis continues into Chapter 12, where the focus is on cheap talk games. In Section 12.4, we also take the opportunity to explore signaling one’s intentions, as opposed to signaling information. Although not involving a game of incomplete information, the issue of signaling one’s intentions naturally fits in with the chapter’s focus on communication. The material on signaling intentions is a useful complement to Chapter 9, as it involves a game of imperfect information, and to Chapter 7, in that it uses mixed strategies, and it could be covered without otherwise using material from Part 4.

Part 5 is devoted to repeated games, and again, the length of the treatment allows us to approach the subject gradually and delve into a diverse collection of applications. In the context of trench warfare in World War I, Chapter 13 focuses on conveying the basic mechanism by which cooperation is sustained through repetition. We show how to construct a repeated game and begin by examining finitely repeated games, in which we find that cooperation is not achieved. The game is then extended to have an indefinite or infinite horizon, a feature which ensures that cooperation can emerge. Crucial to the chapter is providing an operational method for determining whether a strategy profile is a subgame perfect Nash equilibrium in an extensive form game with an infinite number of moves. The method is based on dynamic programming and is presented in a user-friendly manner, with an accompanying appendix to further explain the underlying idea. Section 13.5 presents empirical evidence—both experimental and in the marketplace—pertaining to cooperation in repeated Prisoners’ Dilemmas. Finally, an appendix motivates and describes how to calculate the present value of a payoff stream.

Chapters 14 and 15 explore the richness of repeated games through a series of examples. Each example introduces the student to a new strategic scenario, with the objective of drawing a new general lesson about the mechanism by which cooperation is sustained. Chapter 14 examines different types of punishment (such as short, intense punishments and asymmetric punishments), cooperation that involves taking turns helping each other (reciprocal altruism), and cooperation when the monitoring of behavior is imperfect. Chapter 15 considers environments poorly suited to sustaining cooperation—environments in which players are finitely lived or players interact infrequently. Nevertheless, in practice, cooperation has been observed in such inhospitable settings, and Chapter 15 shows how it can be done. With finitely lived players, cooperation can be sustained with overlapping generations. Cooperation can also be sustained with infrequent interactions if they occur in the context of a population of players who share information.

The book concludes with coverage of evolutionary game theory in Part 6. Chapter 16 is built around the concept of an evolutionarily stable strategy (ESS)—an approach based upon finding rest points (and thus analogous to one based on finding Nash equilibria)—and relies on Chapter 7’s coverage of mixed strategies as a prerequisite. Chapter 17 takes an explicitly dynamic approach, using the replicator dynamic (and avoids the use of mixed strategies). Part 6 is designed so that an instructor can cover either ESS or the replicator dynamic or both. For coverage of ESS, Chapter 16 should be used. If coverage is to be exclusively of the replicator dynamic, then students should read Section 16.1—which provides a general introduction to evolutionary game theory—and Chapter 17, except for Section 17.4 (which relates stable outcomes under the replicator dynamic to those which are an ESS).

How Can This Book Be Tailored to Your Course?

The Course Guideline (see the accompanying table) is designed to provide some general assistance in choosing chapters to suit your course. The Core treatment includes those chapters which every game theory course should cover. The Broad Social Science treatment covers all of the primary areas of game theory that are applicable to the social sciences. In particular, it goes beyond the Core treatment by including select chapters on games of incomplete information and repeated games. Recommended chapters are also provided in the Course Guideline for an instructor who wants to emphasize Private Information or Repeated Interaction.

If the class is focused on a particular major, such as economics or political science, an instructor can augment either the Core or Broad Social Science treatment with the concepts he or she wants to include and then focus on the pertinent set of applications. A list of applications, broken down by discipline or topic, is provided on the inside cover. The Biology treatment recognizes the unique elements of a course that focuses on the use of game theory to understand the animal kingdom.

Another design dimension to any course is the level of analysis. Although this book is written with all college students in mind, instructors can still vary the depth of treatment. The Simple treatment avoids any use of probability, calculus (which is only in Chapter 6 and the Appendix to Chapter 10), and the most challenging concepts (in particular, mixed strategies and games of incomplete information). An instructor who anticipates having students prepared for a more demanding course has the option of offering the Advanced treatment, which uses calculus. Most instructors opting for the Advanced treatment will elect to cover various chapters, depending on their interests. For an upper-level economics course with calculus as a prerequisite, for example, an instructor can augment the Advanced treatment with Chapters 10 (including the Appendices), 11, and 13 and with selections from Chapters 14 and 15.


COURSE GUIDELINE

Chapter / Core / Broad Social Science / Private Information / Repeated Interaction / Biology / Simple / Advanced

1: Introduction to Strategic Reasoning ✔ ✔ ✔ ✔ ✔ ✔ ✔

2: Building a Model of a Strategic Situation ✔ ✔ ✔ ✔ ✔ ✔ ✔

3: Eliminating the Impossible: Solving a Game when Rationality Is Common Knowledge ✔ ✔ ✔ ✔ ✔ ✔ ✔

4: Stable Play: Nash Equilibria in Discrete Games with Two or Three Players ✔ ✔ ✔ ✔ ✔ ✔ ✔

5: Stable Play: Nash Equilibria in Discrete n-Player Games ✔ ✔

6: Stable Play: Nash Equilibria in Continuous Games ✔

7: Keep ’Em Guessing: Randomized Strategies ✔ ✔ ✔ ✔

8: Taking Turns: Sequential Games with Perfect Information ✔ ✔ ✔ ✔ ✔ ✔ ✔

9: Taking Turns in the Dark: Sequential Games with Imperfect Information ✔ ✔ ✔ ✔ ✔ ✔ ✔

10: I Know Something You Don’t Know: Games with Private Information ✔ ✔

11: What You Do Tells Me Who You Are: Signaling Games ✔ ✔

12: Lies and the Lying Liars That Tell Them: Cheap Talk Games ✔

13: Playing Forever: Repeated Interaction with Infinitely Lived Players ✔ ✔ ✔ ✔

14: Cooperation and Reputation: Applications of Repeated Interaction with Infinitely Lived Players ✔ ✔ 14.3 ✔

15: Interaction in Infinitely Lived Institutions ✔

16: Evolutionary Game Theory and Biology: Evolutionarily Stable Strategies ✔

17: Evolutionary Game Theory and Biology: Replicator Dynamics ✔ ✔


Resources for Instructors

To date, supplementary materials for the instruction of game theory courses have been relatively minimal, a product of the niche nature of the course and the ever-present desire of instructors to personalize the teaching of the course to their own tastes. With that in mind, Worth has developed a variety of products that, when taken together, facilitate the creation of individualized resources for the instructor.
