Catcher framing is an area of baseball that has recently come to the public’s attention as a way some teams can gain an edge over the competition. However, some think that the advantage gained may be disappearing. Ian York looks to see if catcher framing is on its way out.
The importance of catcher framing has been solidly established, thanks mainly to PITCHf/x data that let us directly measure the number of extra strikes a catcher can get, relative to his peers. Is it possible that the same tool that revealed the importance of framing will eventually be its demise?
Framing is the art of fooling an umpire. By what is essentially sleight-of-hand, the catcher tricks the umpire into believing the ball crossed the plate where it actually didn’t. If umpires were infallible robots, framing would be non-existent.
Under the new umpires’ contract that took effect in 2010, umpires have been evaluated on their ball/strike calls using video review (that is, PITCHf/x or the equivalent). Since then, two things have happened to strike calls: the strike zone has changed, moving ever closer to the official zone as defined in the rulebook, and umpires have become better at calling balls and strikes. Here is what has happened to out-of-zone ball/strike calls since 2008, when PITCHf/x became available:
(For this chart, I defined “out-of-zone” calls as called strikes that were more than one baseball diameter outside the de facto strike zone as it was called each year, or pitches inside that zone that were called balls, including all umpires who made at least 2,500 calls in the year. The chart shows error rates, with the box outlining the 25th and 75th percentiles, the whiskers showing the 95% confidence zone, and the bar in the center showing the median rate.)
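The per-umpire error rate described above can be sketched roughly as follows. The input format, the tuple fields, and the ball-diameter constant are my own illustrative choices, not the author’s actual code or PITCHf/x schema:

```python
# Sketch of the out-of-zone error-rate calculation: a call is an "error" if
# it was a called strike more than one ball diameter outside the de facto
# zone, or a called ball inside the zone. Umpires with fewer than min_calls
# calls are excluded, matching the 2,500-call cutoff in the text.
from collections import defaultdict

BALL_DIAMETER = 0.242  # feet (about 2.9 inches); assumed value for illustration

def umpire_error_rates(calls, min_calls=2500):
    """calls: iterable of (umpire, call, in_zone, dist_outside_zone) tuples,
    where call is "strike" or "ball", in_zone is a bool, and
    dist_outside_zone is the distance (feet) beyond the de facto zone edge.
    Returns {umpire: error_rate} for umpires meeting the call cutoff."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for umpire, call, in_zone, dist in calls:
        totals[umpire] += 1
        if call == "strike" and not in_zone and dist > BALL_DIAMETER:
            errors[umpire] += 1          # clear strike call outside the zone
        elif call == "ball" and in_zone:
            errors[umpire] += 1          # ball call on a pitch in the zone
    return {u: errors[u] / n for u, n in totals.items() if n >= min_calls}
```

With synthetic data, an umpire who misses 100 of 2,500 calls comes out at a 4% error rate, and an umpire below the cutoff is dropped from the result.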
Umpires started to improve their calls almost immediately after video review was put in place, and have continued to improve each year. In 2015, umpires had a mean error rate of about 2.98%, compared to 4.30% in 2010.
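Those two rates imply roughly a 30 percent relative improvement, which can be checked directly:

```python
# Relative drop in the mean out-of-zone error rate, 2010 -> 2015
rate_2010, rate_2015 = 0.0430, 0.0298
improvement = (rate_2010 - rate_2015) / rate_2010
print(round(improvement * 100, 1))  # prints 30.7
```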
A 97% accuracy rate, while not robotically infallible, is pretty good. So the question is, has the umpires’ improved accuracy reduced the effectiveness of framing?
The answer is a little unsatisfying: Maybe. I used the framing data at StatCorner.com, with a cutoff of 5,000 chances per year in order to look at 30-35 catchers each season:
Looking at either extra strikes per game or total extra strike calls, it does seem that the spread of framing has decreased slightly. The top end has dropped a little: the best framer in 2008 (Jose Molina) was more than two extra strikes per game better than the best one in 2015. But even in 2008, Molina was an outlier, and a 33-year-old outlier at that; he retired after 2014, at the age of 39. Aside from Molina and Brian McCann (the other catcher over two extra strikes per game), all the other top framers in 2008 were in the same range as in 2015.
But most of the change is at the bottom end. There are fewer very bad framers now than there were in 2008. As baseball has become aware of the importance of framing, the worst framers no longer have jobs as full-time catchers.
(This chart may underestimate the overall improvement in framing, since we are comparing each catcher only to the other catchers in the same year. If the worst framers are removed from the pool, the yearly average rises, so a catcher who is two strikes per game better than this year’s average would have looked even better in earlier years, when the average catcher he was compared to was worse. Jose Molina was about four strikes per game better than the average catcher in 2008, but the average that year included the abysmal Ryan Doumit, who was losing more than four strikes per game; catchers since then haven’t had that advantage.)
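The effect of that moving baseline can be shown with a toy example. The numbers below are made up for illustration and are not the actual StatCorner values:

```python
# Made-up per-game extra-strike figures for a small catcher pool, showing how
# dropping the worst framer raises the yearly average and shrinks the best
# framer's apparent edge, even though his absolute skill is unchanged.
pool_2008 = [4.0, 1.0, 0.0, -1.0, -4.0]  # includes a Doumit-like -4 framer
pool_now  = [4.0, 1.0, 0.0, -1.0]        # worst framer no longer catches

def edge_over_average(value, pool):
    """Extra strikes per game relative to the pool's average."""
    return value - sum(pool) / len(pool)

print(edge_over_average(4.0, pool_2008))  # prints 4.0 (average is 0.0)
print(edge_over_average(4.0, pool_now))   # prints 3.0 (average rose to 1.0)
```

The same +4 catcher looks a full strike per game worse once the bottom of the pool is cut off, which is exactly the direction of bias the parenthetical describes.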
All in all, it’s hard to say that the impact of framing has decreased much since 2010, even though umpires have improved their accuracy by 30 percent over that period. Perhaps the skills that let umpires identify the strike zone accurately remain just as vulnerable to being fooled by framing magic.