1. Which teams best matched their relievers to leverage last season? That is, which teams managed their bullpens such that their best/worst relievers faced the highest/lowest average entering leverage?
2. Has there been a noticeable improvement in the last 30 years in how well teams match their relievers to leverage?
3. Does matching relievers to leverage more effectively improve bullpen performance?
Baseball analysts frequently criticize Major League managers for assigning specialized roles to their relief pitchers (i.e., eighth-inning guy, closer, etc.). The complaint levied against the current role-based system is that it prioritizes innings over leverage. By waiting to bring in their best pitchers in their assigned innings, managers needlessly limit their flexibility and employ a suboptimal tactical approach.
In the place of an innings-based assignment of roles, analysts argue for a role-based system of a different type -- specifically, roles based on leverage rather than innings. Simply put, managers should rank their relievers based on effectiveness, and then deploy them in a leverage-sensitive, rather than innings-sensitive, manner.
If managed correctly, at the end of the season the best reliever on the staff should face the highest average entering leverage. By "entering leverage," I mean the measure of leverage at the moment a pitcher takes the mound. Since the best reliever should most often be used to extinguish high-leverage threats, he should have the highest entering leverage on the staff, with the second-best reliever having the second-highest entering leverage, and so on.
There is more to bullpen management than just responding to leverage. Matchups matter, as do considerations of overuse. In short, managing a bullpen is much more complex than just calculating leverage and throwing the appropriate pitcher out there. Most analysts understand this, and I'm not suggesting managers should only look at leverage.
However, managers should be more sensitive to leverage than they currently are. It is more than a bit perplexing that there has been such resistance to scrapping the innings-based assignment of roles, since the approach presumably emerged in the first place because managers wanted their best pitchers pitching with the game on the line, which is generally in late innings. The leverage-based approach is simply a more effective way of ensuring that managers have their best pitchers on the mound in the most important situations. It merely argues that managers should be more flexible when it comes to deploying their bullpen resources.
The persistence of innings-based bullpen roles and this study:
The data below show how well Major League managers matched their bullpens to leverage last season. Since almost all managers still subscribe to an innings-based approach, we can't say that the highest-ranking teams deserve credit for breaking free from the traditional approach. Rather, what we can say is that these teams organized their bullpens such that their best pitchers were pitching the late innings, when, on average, leverage is higher. Moreover, in those late-middle innings, when roles are typically less defined, they generally chose wisely.
What I did
1. Collected all relief pitchers who threw at least 30 innings in 2013.
2. Downloaded each pitcher's FIP statistic and gmLI (entering leverage) from Fangraphs.com.
3. Created a scatter plot for each team, ran a linear regression trendline through it, and calculated the R-squared.
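The steps above can be sketched in a few lines of Python. The R-squared calculation below is the standard one for a simple linear regression; the bullpen numbers are invented for illustration, since the real values come from Fangraphs.

```python
# Sketch of step 3: regress average entering leverage (gmLI) on FIP
# for one team's bullpen and report the R-squared of the fit.

def r_squared(xs, ys):
    """R-squared of a simple linear regression of ys on xs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)          # variance of x (unscaled)
    syy = sum((y - mean_y) ** 2 for y in ys)          # variance of y (unscaled)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(xs, ys))                # covariance (unscaled)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical bullpen: (FIP, gmLI) pairs, one per qualifying reliever.
bullpen = [(2.1, 1.9), (2.8, 1.6), (3.2, 1.3), (3.9, 1.1), (4.5, 0.8)]
fip = [p[0] for p in bullpen]
gmli = [p[1] for p in bullpen]

print(round(r_squared(fip, gmli), 3))
```

A tightly managed bullpen like this hypothetical one produces an R-squared near 1; a bullpen whose best arms see no more leverage than its worst produces a value near 0.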
Interpretation of Results
The higher the R-squared, the more effectively a team matched its relievers to leverage.
The Atlanta Braves best matched their relief pitchers to leverage. An R-squared of .765 means that 76.5 percent of the variance in entering leverage can be explained by pitchers' FIP. The Pirates ranked fourth in the National League and ninth in Major League Baseball.
Here are the scatter plots for the Atlanta Braves, Pittsburgh Pirates and MLB.
Better than any other team in baseball, the Atlanta Braves showed a close fit between decreasing FIP (moving right to left on the X-axis) and increasing average entering leverage (moving up the Y-axis).
The dots, from left to right, represent Melancon, Grilli, Watson, Mazzaro, Wilson, Gomez, Hughes and Morris. As many on this site noted over the course of the year, Bryan Morris' performance did not match up well with his role.
Finally, here is the scatter plot for all of Major League relief pitchers with 30 or more innings pitched.
Yearly changes in leverage sensitivity
The table below shows the yearly fluctuations in what I am calling leverage sensitivity. Leverage sensitivity simply is a measure of the league-wide correlation between FIP and average entering leverage. The higher the R-squared, the more effectively managers matched their relievers to leverage.
While leverage sensitivity has been generally improving, 2013 had the lowest R-squared since 2003.
Leverage sensitivity and shutdowns/meltdowns
We would expect teams that more effectively match their relievers to leverage to show improved bullpen performance. Surprisingly, while the data do seem to show some relationship, the relationship is very weak. (Some may even dismiss it as too weak to be meaningful.) This poses some interesting questions that we can take up in the discussion section. It may be that I'm not measuring the phenomenon correctly, or that my expectations about the nature of the relationship were overly inflated.
X-Axis is the number of shutdowns divided by meltdowns.
As defined by Fangraphs, a "Shutdown is when a reliever accumulates greater than or equal to 0.06 WPA in any individual game. A Meltdown is when a reliever’s WPA is less than or equal to -0.06 in any individual game." The higher the ratio, the more effective the bullpen.
Y-Axis is the leverage sensitivity R-squared. The greater the R-squared, the more effectively relievers were matched to leverage.
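The shutdown/meltdown tally on the X-axis follows directly from the Fangraphs thresholds quoted above. A minimal sketch, with per-appearance WPA values invented for illustration:

```python
# Count shutdowns (WPA >= 0.06) and meltdowns (WPA <= -0.06)
# from a reliever's per-game WPA figures.

def shutdowns_meltdowns(game_wpa):
    """Return (shutdowns, meltdowns) per the Fangraphs definition."""
    sd = sum(1 for w in game_wpa if w >= 0.06)
    md = sum(1 for w in game_wpa if w <= -0.06)
    return sd, md

# One reliever's hypothetical WPA over a stretch of appearances.
wpa = [0.12, -0.02, 0.07, -0.15, 0.03, 0.25, -0.06, 0.06]
sd, md = shutdowns_meltdowns(wpa)
print(sd, md, sd / md)  # the ratio plotted on the X-axis
```

Summing these counts across a bullpen and dividing shutdowns by meltdowns gives the team-level ratio used in the plots.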
Each dot in the chart below represents a baseball season. Seasons with higher average leverage sensitivity tend to have better shutdown/meltdown ratios, but the relationship is fairly weak.
Each dot represents the leverage sensitivity and SD/MD ratio for each team in 2013. Again, there appears to be some relationship, but it is very weak.
1. There are tremendous differences in how well Major League Baseball managers matched their bullpens to leverage last season. The Pirates ranked in the top third in this regard. The differences between the teams probably have less to do with managers using different approaches than with some managers organizing their bullpens more effectively than others within the traditional innings-based approach.
2. We have seen a general increase in matching relievers to leverage over the course of the past 30 years. However, the year-to-year totals are volatile.
3. Surprisingly, matching relievers to leverage does not seem to have a very strong relationship with improved bullpen performance as measured by the shutdown/meltdown ratio. This is interesting and requires further research.