Did you watch the last MLB no-hitter? In most seasons, that question would be relatively easy to answer. There have been 268 no-hitters in the modern era, which began with the formation of the American League in 1901. That averages out to a little more than two per year, a rate that has held pretty steady over any multi-year stretch; over the four-year span from 2016 through 2019, for instance, there were nine no-hitters.
Baseball is weird, though. Some years have more no-hitters than others. In 2005, there were precisely zero. In 2015, there were seven. And though we haven’t seen a perfect game since 2012, there were three that year (and three in 2010, too; I’m counting Armando Galarraga’s and there’s nothing you can do to stop me).
But this year, well, something is different. It is not yet Memorial Day, and we have already witnessed six no-hitters. Six! The most recent came a mere two days ago from Corey Kluber, who earned his first career no-no against the Texas Rangers, a team now on the receiving end of its second no-hitter this season.
Why is this happening? Aren’t fewer and fewer pitchers even getting to the ninth inning? Well, yes. But a big reason is that no one is getting hits. The league batting average, the entire league’s batting average, is .236. If that seems low, that’s because it is. It is, in fact, the lowest league batting average in the history of Major League Baseball.
Why is this the case? There are...a lot of reasons. First, defensive shifts have lowered Batting Average on Balls in Play (BABIP), turning many traditional hits into routine outs. Second, hitters have realized that fly balls are far more valuable than ground balls and are selling out for home run swings, batting average be damned. Third, pitchers are better now than they have ever been; they’re throwing harder and have nastier breaking stuff, which has produced rising strikeouts and fewer balls in play overall.
It’s a perfect storm, and it’s resulted in a very, very odd year. It has also made judging players difficult, because the traditional metrics have been warped. A decade ago, the average, well, average was at .255, and a .235 hitter was a bad hitter. Now, that .235 hitter is sitting pretty right at league average.
Thankfully, there’s a way around that: adjusted stats. Adjusted stats are simply stats scaled to the league average for that specific year. Every adjusted stat works the same way: 100 represents league average, and each point above or below 100 represents one percentage point better or worse than league average. It’s that simple.
The beauty of adjusted stats is that they relay information that can be compared across years. A hitter who is 20% above league average in 1902 is just as productive as a hitter who is 20% above league average in 2022; the fact that the 1902 hitter might bat .290 while the 2022 hitter bats .250 is irrelevant. Different years, different eras, but adjusted stats let you compare across them.
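To make the scaling concrete, here’s a minimal Python sketch of that 100-based math. The league averages for 1902 and 2022 below are made up for illustration, and real adjusted stats like wRC+ and OPS+ also fold in park factors and other corrections beyond this simple ratio:

```python
def plus_stat(player_value: float, league_average: float) -> float:
    """Scale a raw stat onto the 100-based 'plus' scale, where 100 is
    league average and each point is one percent above or below it."""
    return 100 * player_value / league_average

# Two hitters from different eras, each about 20% above league average.
# The league averages here are hypothetical, not the real historical figures.
print(round(plus_stat(0.290, 0.242)))  # -> 120 (a .290 hitter in a .242 league)
print(round(plus_stat(0.250, 0.208)))  # -> 120 (a .250 hitter in a .208 league)
```

On this scale the two hitters grade out identically, which is the whole point: the raw averages differ, but the distance above the league is the same.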
Perhaps the most commonly used adjusted stat is wRC+, or Weighted Runs Created Plus. The “+” designates that it is an adjusted stat, and, like other adjusted stats, league average is set at 100. But there are other options, too. In 2019, FanGraphs rolled out a Plus Stats tab on their leaderboards, which lets you view adjusted batting average, walk rate, on-base percentage, and more. And Baseball-Reference lists both OPS+ and ERA+ on every team leaderboard and individual player page.
So, if you too are frustrated with how bizarre stats look in 2021, consider using some adjusted stats. You might just be surprised.