
I’d like to start today by showing a chart way, way earlier than I (or anyone, probably) usually would. It’s going to break at least a few rules of proper graph-making (no title, unlabeled axes), but it’s worth it for the point it illustrates. See if you can figure out what it shows; I’ll wait.

Okay, time’s up. Here it is again with labels in place.

In the last nine years, the number of pitches thrown at 100 mph or greater has increased by more than an order of magnitude, from 255 in 2008 to 2,977 in 2015. Plotting the same data set with alternative minimum speeds (95 mph, 97 mph) shows similar results; the shape of the curve changes slightly, but the overall upward trend doesn’t. This isn’t anything a reader of BP doesn’t already know, but it’s nice to put real data behind your assumptions sometimes.
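
For concreteness, here is a minimal sketch of the kind of aggregation behind that chart, assuming a pandas DataFrame of pitch-level, PITCHf/x-style data; the file name and column names (year, pitch_type, start_speed) are my assumptions, not the exact pipeline used for the article.

```python
import pandas as pd

# Hypothetical pitch-level table, one row per pitch; file and column names are assumptions.
pitches = pd.read_csv("pitches_2008_2016.csv")

def count_fast_pitches(df, min_speed=100.0):
    """Count fastballs thrown at or above min_speed, by season."""
    fastballs = df[df["pitch_type"].isin(["FF", "FT", "FA", "SI"])]
    return fastballs[fastballs["start_speed"] >= min_speed].groupby("year").size()

print(count_fast_pitches(pitches, 100.0))  # the series plotted above
print(count_fast_pitches(pitches, 95.0))   # alternate threshold; same upward trend
```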

Something else the BP reader, or any baseball fan really, accurately assumes is that it’s harder to hit a faster pitch. The following graph shows batter performance on contact from 2008 to 2016 as a function of fastball speed (using a linear weights runs/PA metric I’ll describe later). Allowing for a bizarre upward spike at 99 mph, the trend is evident.

Okay. With this data-supported background in place, we come to the heart of the issue, a point BP’s editor-in-chief Sam Miller talked with me about back when I was first signing on here: batters will need to improve their currently deficient ability to hit extremely fast pitches in order to keep up with pitchers, but it’s not yet clear whether they’ve made any such improvement over the time period covered above.

Sam first touched on this idea in January of this year, using some contact metrics and slugging percentage as markers of skill. He found no significant difference between batter performance in 2008 and in 2015, a result he considered concerning for batters:

“This is, I have to say, pretty surprising to me: Hitters aren’t getting better at this. I figured they would be. You get used to it. You look for it. You’ve been raised around it. You adjust to it. You force the pitcher to adjust back. But, in fact, 95 mph is no easier for 2015 hitters to hit than it was for 2005 (or, at least, 2008) hitters. Pitchers might not ever have to adjust again, if they can just keep throwing more and more pitches like these pitches.”

In the rest of this article, I’ll try to take a thorough look at the same question and see whether I arrive at the same conclusion.

The first thing to address is finding a suitable control group. Any results I find when focusing on fast pitches are meaningless without context; if batters didn’t get any better at hitting fast pitches in my data set, the meaning of that changes depending on whether their performance on slower pitches got better, stayed the same, or got worse over the same time frame. Using the entire sample of fastballs below some velocity threshold is the easy solution, but not the right one: the changing league velocity profile would change the shape and distribution of such a sample over time[1], which is the same reason I shouldn’t (and won’t) look at all pitches *over* a velocity threshold for the fast group. I decided, then, to look for the fastball velocity that was most stable (in terms of total number of pitches) over my time frame, using range and relative standard deviation to pseudo-quantify stability. I settled on 92 mph fastballs.
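
To make that stability search concrete, here is a rough sketch of how range and relative standard deviation of the yearly pitch counts could be computed per one-mph velocity bucket. It reuses the hypothetical pitches frame from the earlier sketch, and the column names remain assumptions.

```python
import pandas as pd

# Round each fastball to an integer mph bucket (column names are assumptions).
fastballs = pitches[pitches["pitch_type"].isin(["FF", "FT", "FA", "SI"])].copy()
fastballs["mph"] = fastballs["start_speed"].round().astype(int)

# Yearly pitch counts per velocity bucket: rows = mph, columns = season.
yearly_counts = fastballs.groupby(["mph", "year"]).size().unstack(fill_value=0)

# Two rough stability measures: range and relative standard deviation of the counts.
stability = pd.DataFrame({
    "range": yearly_counts.max(axis=1) - yearly_counts.min(axis=1),
    "rel_std": yearly_counts.std(axis=1) / yearly_counts.mean(axis=1),
})

# The bucket with the lowest relative spread is the candidate control group;
# in the article that works out to 92 mph.
print(stability.sort_values("rel_std").head())
```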

Choosing a sample to represent the high-velocity fastballs was more of a gut-feel call: I wanted enough separation from the control group to have a significant impact on batter reaction times, enough of a jump in the raw number of pitches that the difference from the start of the sample to the end was clear and meaningful, and a sample large enough that I felt confident drawing conclusions from the data. I settled on 97 mph fastballs, of which there were roughly 9,000 in 2008 and 23,500 in 2015.

I used four metrics to look at batter performance in these two pitch-speed groups. The first is swing rate, simply swings divided by swings plus takes; a positive change in swing rate in the high-speed group could indicate growing comfort with the pitches. The second is contact rate, the standard contact-made-over-swings measurement; increasing contact rates might indicate batters are seeing high-speed pitches better. The third is fair-ball rate, the fraction of contact not fouled off; better fair-ball rates could show an improved ability to time the faster pitches. Lastly, I used a pseudo-value metric of linear weights runs per PA: to calculate it, I assigned each event its value in runs (as determined by BP’s linear weights) above an in-play out, summed those values by group as needed, and divided by total plate appearances. Improvements in this measure may indicate that batters are making better contact with the fast pitches. I also created a sub-group within my data set of pitches with a called-strike probability of at least 80 percent, to account for the possibility that the increase in fast pitches is coming only on pitches that aren’t very hittable. Bunts were excluded throughout.
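
As a rough illustration of how these four metrics could be computed from pitch-level data, here is a sketch continuing the earlier ones. The flag and value columns (is_bunt, is_swing, made_contact, is_fair, is_pa_end, lw_runs_above_out, called_strike_prob) are stand-ins for whatever the underlying data set actually provides, not the exact fields used for the article.

```python
import pandas as pd

def batter_metrics(group: pd.DataFrame) -> pd.Series:
    """Swing rate, contact rate, fair-ball rate, and LW runs/PA for one pitch group."""
    group = group[~group["is_bunt"]]          # bunts excluded throughout
    swings = group["is_swing"].sum()
    takes = (~group["is_swing"]).sum()
    contact = group["made_contact"].sum()
    fair = group["is_fair"].sum()
    pa = group["is_pa_end"].sum()             # pitches that ended a plate appearance
    return pd.Series({
        "swing_rate": swings / (swings + takes),
        "contact_rate": contact / swings,
        "fair_ball_rate": fair / contact,
        # runs above an in-play out (BP linear weights), per plate appearance
        "lw_runs_per_pa": group["lw_runs_above_out"].sum() / pa,
    })

# Yearly metrics for the 97 mph group, plus the likely-strike (>= 80 percent) subset.
fast_97 = fastballs[fastballs["mph"] == 97]
yearly_97 = fast_97.groupby("year").apply(batter_metrics)
yearly_97_strikes = (
    fast_97[fast_97["called_strike_prob"] >= 0.80].groupby("year").apply(batter_metrics)
)
```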

Looking first at swing rate, and comparing the ratio of swing rate at 97 mph to swing rate at 92 mph, we see either no change or possibly a slight downward trend in both the full data and the high-strike-likelihood subset. By this measure, and if my reasoning that more swings would indicate more comfort is correct, batters have not gotten any more comfortable with high-velocity pitches than they were in 2008. Of course, there is an alternative hypothesis: batters might already be swinging at an optimal rate and so no change should be happening. These data do not exclude that possibility.
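
One way to express that comparison, continuing the sketch above, is the year-by-year ratio of each metric at 97 mph to the 92 mph control group; ratios near 1.0 with no trend over time would mean no change relative to the control.

```python
# Control-group metrics and the 97/92 ratios, year by year (continuing the sketch above).
fast_92 = fastballs[fastballs["mph"] == 92]
yearly_92 = fast_92.groupby("year").apply(batter_metrics)

ratios = yearly_97 / yearly_92   # a flat swing_rate ratio suggests no added comfort at 97 mph
print(ratios["swing_rate"])
```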

With contact rate, we do see some evidence of batter improvement. Although the high-strike-likelihood data is somewhat wavy, for both the full data and the subset the overall trend is upward, rising by about 5 percent over the time frame examined. The drop from 2014 to 2015 makes the pending 2016 results all the more interesting, too. I excluded current 2016 data (even though I’m showing rates, and therefore wouldn’t need to account for there being fewer pitches overall) because the portion of 2016 still missing includes September, when call-ups change the player population and could bias the results.

Fair-ball rate appears to be the least informative of the bunch, since its up-and-down behavior obscures any obvious trends. Over the entire data set it’s slightly up, but by so little that, in combination with the wild swings from year to year, no conclusions should be drawn.

Last, and probably most important, is runs per PA. In keeping with Sam’s preliminary work, I find that based on this particular measure of value/production, batters have not improved at hitting fast pitching in the last nine years; in fact, they may have even gotten slightly worse. For both the full data and the subset, the overall trend is downward.

However, take a look at this:

This is the same graph, now with the as-yet-incomplete 2016 data included. This is ABSOLUTELY NOT FAIR TO DO, since (as I said) the population of both hitters and pitchers is dramatically different in September than in all other months of the season, but the trend appears to at least potentially be reversing. The rise is so dramatic that it reminded me of something else that’s been heavily-discussed recently: the rising home run rate. This prompted me to add home run rate (home runs per PA) as a metric. Unfortunately, I didn’t find a matching trend.
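
Tacking on that check is a one-line extension of the earlier sketch; as before, is_home_run is an assumed flag rather than a field I know the article’s data set to contain.

```python
def hr_rate(group):
    """Home runs per plate appearance, bunts excluded (is_home_run is an assumed flag)."""
    group = group[~group["is_bunt"]]
    return group["is_home_run"].sum() / group["is_pa_end"].sum()

print(fast_97.groupby("year").apply(hr_rate))
```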

These findings are… hard to explain. As Sam said, it doesn’t make sense that batters wouldn’t be getting better at this. They’re seeing so many more of these pitches every year that I’d have guessed they’d get some small benefit just from the increased exposure, but there’s little evidence of that. This appears to be a major concern for the game, particularly if the turn-around in runs/PA seen in the 2016 data doesn’t hold up. It wasn’t shown above, but for 2009-2015 the runs/PA ratio graph tracks overall runs per game very closely; that is, the drop in runs/PA on fast pitches follows a nearly identical trend to overall runs per game. If batters are worse at hitting fast pitches, aren’t getting better at it, and are seeing more of them, it’s difficult to see how the current trend towards low-scoring games could be reversed without some sort of structural change to the game.

Of course, there could be more to this story that I haven’t yet examined. In everything above, all batters were lumped together into a single average. It would be interesting to see whether separating batters by age reveals anything further about this apparent lack of improvement, and I can envision other player-based splits that might also yield additional insight. I’ll take a look at some of that data, and if any results appear meaningful I’ll show them here next time.



[1] In fact, and as expected, the shape of the fastball count as a function of fastball speed does change dramatically over the time frame of the data set. It didn’t fit in the main text, but here are the KDE plots for fastballs under and over the 95 mph threshold in 2008 and 2015.
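
For anyone who wants to reproduce something like those plots, here is a sketch using seaborn; the 95 mph split and the 2008/2015 comparison follow the footnote, while the data frame and column names remain the assumptions from the earlier sketches.

```python
import matplotlib.pyplot as plt
import seaborn as sns

# Speed distributions (KDEs) for fastballs under and over 95 mph, 2008 vs. 2015.
fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
panels = [("Under 95 mph", 0, 95), ("95 mph and over", 95, 110)]
for ax, (title, lo, hi) in zip(axes, panels):
    for year in (2008, 2015):
        subset = fastballs[
            (fastballs["year"] == year)
            & (fastballs["start_speed"] >= lo)
            & (fastballs["start_speed"] < hi)
        ]
        sns.kdeplot(subset["start_speed"], ax=ax, label=str(year))
    ax.set_title(title)
    ax.set_xlabel("Fastball speed (mph)")
    ax.legend()
plt.tight_layout()
plt.show()
```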

jfranco77
9/16
What kind of data do we have on college players? Would it be interesting to see a similar study on them? Say college players have seen an increase in pitchers over... I don't know, 94 MPH... is that a similar order of magnitude to what major leaguers have seen at 100 MPH?
JohnChoiniere
9/17
That's a really interesting idea, I'm going to look into what's available...