Cut down on the fastballs.
That's one of the recommendations from two economists who recently analyzed all Major League regular-season games from 2002 to 2006 (excluding extra innings). Admittedly, they didn’t look at postseason contests and they're, well, economists, who tend to know a lot more about GDP and base rates than OBP and baseball.
Still, when you’ve studied the outcomes of more than 3 million pitches and applied a little game theory, you're bound to learn something. Here’s what the two number-meisters discovered:
“Pitchers appear to throw too many fastballs,” write Kenneth Kovash, an MBA and numbers whiz, and Steven Levitt, a University of Chicago professor and coauthor of the Freakonomics blog at The New York Times, in their paper for the National Bureau of Economic Research. It turns out that fastballs lead to a higher OPS – the sum of a batter’s on-base percentage and slugging percentage – than nonfastballs do (.753 vs. .620).
If batters are more likely to reach base on fastballs, pitchers should adjust, according to a strand of game theory known as the "minimax solution." According to minimax, teams gradually discover the mix of plays that minimizes their maximum possible loss. But since pitchers didn't seem to be adjusting, Mr. Kovash and Dr. Levitt decided to look deeper. What they found was the importance of the ball-strike count during an at-bat.
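For readers curious about the game theory, here is a minimal sketch of a minimax calculation for a simplified two-pitch game. All of the OPS payoff numbers are hypothetical illustrations, not figures from the Kovash-Levitt paper.

```python
# Minimax (mixed-strategy) sketch for a 2x2 zero-sum pitch-selection game.
# The pitcher picks fastball or offspeed; the batter guesses one or the
# other. Entries are the batter's expected OPS, which the pitcher wants
# to minimize. Payoff values below are made up for illustration.

def pitcher_minimax_mix(a, b, c, d):
    """Return p, the equilibrium probability of throwing a fastball.

    Payoffs: a = fastball thrown / fastball guessed
             b = fastball thrown / offspeed guessed
             c = offspeed thrown / fastball guessed
             d = offspeed thrown / offspeed guessed
    The pitcher mixes so the batter is indifferent between guesses:
        p*a + (1-p)*c = p*b + (1-p)*d
    """
    return (d - c) / ((a - c) - (b - d))

# Hypothetical payoffs: guessing right helps the batter a lot on
# fastballs (a) and a little on offspeed pitches (d).
p = pitcher_minimax_mix(a=0.900, b=0.600, c=0.500, d=0.700)
print(round(p, 2))  # → 0.4, the equilibrium fastball rate
```

At equilibrium the batter gains nothing from guessing either way, which is exactly the "unexploitable" mix minimax predicts pitchers should find.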
As long as there were fewer than two strikes during an at-bat, the difference in outcomes between fastballs and nonfastballs tended to be small. But with two strikes, fastballs generated an OPS more than 100 points higher than nonfastballs did. Thus, if a team's pitchers cut down on fastballs by 10 percentage points (particularly in two-strike situations), they would allow roughly 15 fewer runs in a season. That's about 2 percent of their total runs allowed, the authors calculate.
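That arithmetic can be checked in a couple of lines. The roughly 750 runs-allowed total below is inferred from the article's two figures, not a number stated in the paper.

```python
# Sanity check on the paper's run arithmetic: 15 runs saved is said to
# be about 2 percent of a team's total runs allowed. Working backward
# gives the implied season total (an inference, not a stated figure).
runs_saved = 15
share_of_runs = 0.02  # "about 2 percent of their total runs allowed"
implied_runs_allowed = runs_saved / share_of_runs
print(implied_runs_allowed)  # roughly 750 runs allowed over a season
```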
That's an edge. The economists found another: Pitchers are too predictable. Minimax theory predicts that the previous pitch should tell a batter nothing about the next one. In fact, patterns emerge: If the last pitch was a fastball, the likelihood that the next one would be a fastball fell by 4.1 percentage points. If the last pitch was a slider, it was 2 percentage points less likely that the next one would be a slider. Other patterns the researchers found: Fastballs were more likely to follow changeups than other nonfastballs; curveballs were most likely to follow fastballs and least likely to follow changeups.
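The kind of serial dependence the researchers measured can be sketched in a few lines. The pitch sequence below is made up for illustration; the 4.1-point gap is the paper's finding, not a property of this toy data.

```python
# Sketch: measuring serial dependence in pitch selection. Under minimax,
# P(fastball | last pitch was a fastball) should equal the overall
# fastball rate; any gap means a batter could profitably predict.
from collections import Counter

# Hypothetical sequence: F = fastball, O = offspeed pitch.
pitches = list("FFOFOFFOFOOFFOF")

pair_counts = Counter(zip(pitches, pitches[1:]))
after_fastball = pair_counts[("F", "F")] + pair_counts[("F", "O")]
p_fastball_after_fastball = pair_counts[("F", "F")] / after_fastball
p_fastball_overall = pitches.count("F") / len(pitches)

# 0.375 vs 0.6 in this toy data: after a fastball, another fastball is
# less likely than the base rate -- exactly the exploitable pattern.
print(p_fastball_after_fastball, p_fastball_overall)
```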
There's a good reason for this, as any good pitching coach knows. Making a batter react to a 70 m.p.h. changeup followed by a 96 m.p.h. fastball keeps him off-balance.
Batters who exploited these patterns could boost their OPS by .006, which should be worth about 10 to 15 runs per team per season, the authors estimated, based on interviews with Major League executives and some assumptions of their own.
Do these regular-season statistics hold in a World Series? Who knows?
One reason they may not is that the OPS gap associated with fastballs was smallest for good pitchers. And World Series teams presumably have among the best pitchers in the league. Then again, the OPS gap for fastballs only held for good and medium hitters. World Series lineups should be full of them, too. (Bad hitters, the authors found, did better with changeups than fastballs and worst of all with curveballs.)
So this Series, switch on the tube and try out your game theory. In a close-fought contest, it really could come down to a few hundredths of a percentage point.