Consider what has happened to modern baseball (lower part of Figure 19). General play has improved significantly in all aspects of the game. But the balance between hitting and pitching has not altered. (I showed on pp. 101-105 that the standard-bearers of baseball have frequently fiddled with the rules in order to maintain this balance.) The mean batting average has therefore remained constant, but this stable number represents markedly superior performance today (in both hitting and pitching). Therefore, this unchanged average must now reside much closer to the right wall. Meanwhile, and inevitably, variation in the entire system has shriveled symmetrically on both sides—at the lower end, because improvement of play now debars employment to men who field well but cannot hit; and at the upper end, for the simple reason that much less room now exists between the upwardly mobile mean and the unchanging right wall. The top hitters, pressed against the right wall, must now lie closer to the mean than did their counterparts of yore.
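The geometry of this argument can be illustrated with a toy simulation (my own sketch, not part of the original compilations): draw individual "skill" from a bell curve of fixed spread, cap it at an absolute right wall, and watch both the standard deviation and the gap between best and mean shrink as the mean climbs toward the wall. All numerical choices below are arbitrary, for illustration only.

```python
import random
from statistics import mean, pstdev

random.seed(1)
WALL = 1.0  # the absolute right wall: no one can exceed perfect play

def season(mu, spread=0.08, n=300):
    """Toy model: individual skill drawn from a fixed-spread bell curve,
    then capped at the wall (nobody can be better than perfect)."""
    return [min(WALL, random.gauss(mu, spread)) for _ in range(n)]

for era, mu in [("early", 0.70), ("middle", 0.85), ("modern", 0.97)]:
    s = season(mu)
    print(f"{era:6s}  mean={mean(s):.3f}  sd={pstdev(s):.3f}  "
          f"best minus mean={max(s) - mean(s):.3f}")
```

Even though the underlying spread never changes, the observed variation compresses as the mean approaches the wall, because the upper tail has nowhere left to go.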
The best hitters of today can’t be worse than the 0.400 hitters of the past. In fact, the modern stars may have improved slightly and may now stand an inch or two closer to the right wall. But the average player has moved several feet closer to the right wall—and the distance between ordinary (maintained at 0.260) and best has decreased, thereby erasing batting averages as high as 0.400. Ironically, therefore, the disappearance of 0.400 hitting marks the general improvement of play, not a decline in anything.
Our confidence in this explanation will increase if it can be supported with statistics for other aspects of play through time. I have compiled similar records for the other two major facets of baseball—fielding and pitching. Both support the key predictions of a model that posits increasing excellence of play with decreasing variation, as the best can no longer take such numerical advantage of poorer average performance.
Most batting and pitching records are relative, but the primary measure of good fielding is absolute (or at least effectively so). A fielding average is you against the ball, and I don’t think that grounders or fly balls have improved through time (though the hitters have). I suspect that modern fielders are trying to accomplish the same tasks, at about the same level of difficulty, as their older counterparts. Fielding averages (the percent of errorless chances) should therefore provide an absolute measure of changing excellence in play. If baseball has improved, we should note a decelerating rise in fielding averages through time. (I do recognize that some improvement might be attributed to changing conditions, rather than absolutely improving play, just as some running records may fall because modern tracks are better raked and pitched. Older infields were, apparently, lumpier and bumpier than the productions of good ground crews today—so some of the poorer fielding of early days may have resulted from lousy fields rather than lousy fielders. I also recognize that rising averages must be tied in large part to great improvement in the design of gloves— but better equipment represents a major theme of history, and one of the legitimate reasons underlying my claim for general improvement in play.)
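To pin down the measure itself: the "percent of errorless chances" is the standard fielding-average computation, where $PO$, $A$, and $E$ stand for putouts, assists, and errors:

$$\text{fielding average} = \frac{PO + A}{PO + A + E}$$

A chance is any ball a fielder must handle, so the figure asks a simple question: of everything hit or thrown your way, what fraction did you handle cleanly?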
Following the procedure of my first compilation on batting averages, I computed both the league fielding average for all regular players and the mean score of the five best for each year since the beginning of major league play in 1876. Figure 20, showing decadal averages for the National League through time, confirms the predictions in a striking manner. Not only does improvement decelerate strongly with time, but the decrease is continuous and entirely unreversed, even for the tiny increments of the last few decades, as averages reach a plateau so near the right wall.
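For the curious, the compilation procedure is simple enough to sketch in a few lines of code (a minimal sketch under my own assumptions: data arrive as (year, fielding average) pairs for each regular player; the function name and layout are inventions for illustration, not the original worksheets):

```python
from collections import defaultdict
from statistics import mean

def decadal_averages(records):
    """records: iterable of (year, fielding_avg) pairs, one per regular player.

    Returns {decade: (league_average, average_of_five_best)} -- the two
    series compiled in the text.
    """
    by_year = defaultdict(list)
    for year, fa in records:
        by_year[year].append(fa)

    by_decade = defaultdict(lambda: ([], []))
    for year, fas in by_year.items():
        league, best = by_decade[(year // 10) * 10]
        league.append(mean(fas))                          # league average for the year
        best.append(mean(sorted(fas, reverse=True)[:5]))  # mean of the five best
    return {d: (mean(l), mean(b)) for d, (l, b) in sorted(by_decade.items())}
```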
For the first half of baseball (the fifty-five years from 1876 to 1930), decadal fielding averages rose from 0.9622 to 0.9925 for the best players, for a total gain of 0.0303; and from 0.8872 to 0.9685 for average performers, for a total gain of 0.0813. (For a good sense of total improvement, note that the average player of the 1920s did a tiny bit better than the very best fielders of the 1870s.) For baseball’s second half (the fifty years from 1931 to 1980), the increase slowed substantially, but never stopped. Decadal averages for the best players rose from 0.9940 to 0.9968, for a small total gain of 0.0028—or less than 10 percent of the recorded rise of baseball’s first half. Over the same fifty years, values for league averages rose from 0.9710 for the 1930s to 0.9774 for the 1970s, a total gain of 0.0064—again less than 10 percent of the improvement recorded during the same number of years during baseball’s first half.
FIGURE 20 Unreversed, but constantly slowing, improvement in mean fielding average through the history of baseball.
These data continue to excite me. As stated before, I have spent a professional lifetime compiling statistical data of this sort for the growth of organisms and the evolution of lineages. I have a sense of the patterns expected from such data, and have learned to pay special attention to noise and inevitable departures from expectations. I am just not used to the exceptionless data produced over and over again by the history of baseball. I would have thought that any human institution must be more sensitive than natural systems to the vagaries of accident and history, and that baseball would therefore yield more exceptions and a fuzzier signal (if any at all). And yet, here again—as with the decline of standard deviations in batting averages (see page 106)—I find absolute regularity of change, even when the total accumulation is so small that one would expect some exceptions just from the inevitable statistical errors of life and computation. Again, I get the eerie feeling that I must be calculating something quite general about the nature of systems, and not just compiling the individualized numbers of a particular and idiosyncratic institution (yes, I know, it’s just a feeling, not a proof). Baseball is a truly remarkable system for statisticians, manifesting two properties devoutly to be wished, but not often encountered, in actual data: an institution that has worked by the same rules for a century, and has compiled complete data (nothing major missing) on all measurable aspects of its history.
For example, as decadal averages for the five best reach their plateau in baseball’s second half, improvement slows markedly, but never reverses—the total rise of only 0.0028 occurs in a steady climb by tiny increments: 0.9940, 0.9953, 0.9958, and 0.9968. Lest one consider these gains too small to be anything but accidental, the first achievements of individual yearly values also show the same pattern. Who would have thought that the rise from 0.990 to 0.991 to 0.992, and so on, could mean anything at all? An increment of one in the third decimal place can’t possibly be measuring anything significant about actual play. And yet 0.990 is first reached in 1907, 0.991 in 1909, 0.992 in 1914, 0.993 in 1915, 0.994 in 1922, 0.995 in 1930. Then, thank goodness, I find one tiny break in pattern (for I was beginning to think that baseball’s God had decided to mock me; the natural world is supposed to contain exceptions). The first value of 0.996 occurs in 1948, but the sharp fielders of 1946 got to 0.997 first! Then we are back on track and do not reach 0.998 until 1972.
This remarkable regularity can occur only because, as my hypothesis requires in its major contention, variation declines so powerfully through time and becomes so restricted in later years. (With such limited variation from year to year, any general signal, however weak, should be more easily detected.) For example, yearly values during the 1930s range only from 0.992 to 0.995 for best scores, and from 0.968 to 0.973 for average scores. By contrast, during baseball’s first full decade of the 1880s, the yearly best ranged from 0.966 to 0.981, and the average from 0.891 to 0.927.
This regularity may be affirmed with parallel data for the American League (shown with the National League in Table 3). Again, we find unreversed decline, though this time with one exception, as American League values fall slightly during the 1970s—and I have no idea why (if one can properly even ask such a question for such a minuscule effect). Note the remarkable similarity between the leagues in rates of improvement across decades. We are not, of course, observing two independent systems, for styles of play do alter roughly in parallel as both leagues form a single institution (with some minor exceptions, as the National League’s blessed refusal to adopt the designated hitter rule indicates in our times). But nearly identical behavior in two cases does show that we are probably picking up a true signal and not a statistical accident.
Data on fielding averages are particularly well suited to illustrate the focal concept of right walls—the key notion behind my second explanation for viewing the disappearance of 0.400 hitting as a sign of general improvement in play. Fielding averages have an absolute, natural, and logical right wall of 1.000—for 1.000 represents errorless play, and you cannot make a negative number of errors! Today’s best fielders are standing with toes already grazing the right wall—0.998 is about an error per year, and nobody can be absolutely perfect. (Outfielders, pitchers, and catchers occasionally turn in seasons of 1.000 fielding, but only one infielder has ever done so for a full season’s regular play—Steve Garvey at first base in 1984.)
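The "about an error per year" arithmetic is easy to check under an assumed workload (my assumption, not a quoted figure): if a regular fielder handles on the order of 500 chances in a season, then an average of 0.998 implies $(1 - 0.998) \times 500 = 1$ error for the entire year.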
If you doubted my explanation for shrinking variation at the upper end of the bell curve for batting averages—that as the mean moves toward the right wall, variation scrunches up into an ever smaller available space, and must therefore decrease—you will surely grant me the argument for fielding averages so close to an absolute wall. Even the 1870s didn’t provide much space, but fielders had a bit of breathing room for improvement between their first decadal best of 0.962 and the wall. And improve they did, and steadily. But now, with the five best averaging 0.9968, there just isn’t any more space, barring the construction of truly errorless robotic fielding machines.
As the mean moves toward the wall, variation must decrease. For absolute measures of fielding, high numbers persist and low values get axed. But for relative measures of hitting, the wall itself bears no number. The advancing mean retains the same value (as a balance between hitting and pitching), while both hitting and pitching move in lockstep toward their right walls of human limitation. Thus, 0.400 hitting disappears as the league mean of 0.260 marches steadily toward the wall. But the 0.400 hitters of yore are alive and well, probably more numerous than ever, and standing where they always have resided—just inches from the right wall. But their current best does not measure 0.400 anymore, because everyone else has improved so much, raising average play to a level where an unchanged (or even slightly improved) best can no longer soar so far above the norm.
The best hitters of early baseball could compile 0.400 averages by taking advantage of a standard in average play much lower than today’s premier batters encounter. Wade Boggs would hit 0.400 every year against the pitching and fielding of the 1890s, while Wee Willie Keeler would be lucky to crack 0.320 today. Since pitching and batting both feature relative records, and presumably exist in effective balance throughout the history of baseball, we should be able to detect similar phenomena in the statistics of pitching through time. The best pitchers of the past, legendary figures like Christy Mathewson, Cy Young, Walter Johnson, Three Finger Brown, and Grover Cleveland Alexander, should be no better than their modern counterparts Sandy Koufax, Bob Gibson, Tom Seaver, and Nolan Ryan. But the old pitchers, standing next to their own right wall and facing much poorer average batting, should have racked up numbers that modern hurlers just can’t equal.
The fascinating and well-known history of minimal earned run averages provides our best illustration of symmetry between batting and pitching—another indication that these statistics record the general behavior of systems, not just a peculiarity of batting in baseball. As the best batters sacrificed their 0.400 averages because variation declined while average play improved, the best pitchers lost their sub-1.50 earned run averages because ordinary hitters became too good.
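One definitional note for readers keeping score at home: earned run average is earned runs allowed per nine innings, $\mathrm{ERA} = 9 \times ER / IP$. A hypothetical pitcher charged with 38 earned runs over 300 innings (numbers chosen only for illustration) would post $9 \times 38 / 300 = 1.14$.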
The list of the hundred best seasonal ERAs shows a remarkable imbalance. More than 90 percent of the entries were achieved before 1920. Since then, only nine pitchers have obtained an earned run average in the top one hundred (and remember that the number of pitchers, hence the number of opportunities, has expanded dramatically, first with the introduction of the American League and later with expansion from an original eight to our current roster of fourteen teams per league). Moreover, of these nine modern values, seven rank in the lower half. If we consider the modern achievements, from the bottom up, we get a good sense of the obstacles that must face our superb contemporary pitchers.
Tied at number 100 are Sandy Koufax (1.74 in 1964) and Ron Guidry (1.74 in 1978). Koufax was, well, Koufax—by general agreement the greatest of modern pitchers, perhaps of all pitchers anywhere, anytime (he also holds the ninety-seventh spot at 1.73 for 1966). Guidry, a wonderful Yankee pitcher for a few years, compiled a stellar season in 1978 (with an unmatched combination of total victories and winning percentage: 25-3, for 0.893), and then threw his arm out. Nolan Ryan occupies eighty-seventh place at 1.69 for 1981. And Ryan was, well, Ryan. Nothing else need be said. Carl Hubbell, perhaps the premier pitcher of the 1930s (Lefty Grove was no slouch, either), turned in 1.66 in 1933 for seventy-sixth place and the only entry for his high-hitting decade. Dean Chance, a strictly okay pitcher of the last generation, posted an anomalous 1.65 for seventy-first place in 1964—and I can’t figure this one at all. Spud Chandler holds sixty-sixth place at 1.64 for 1943—a fine (if not fabulous) pitcher during the war years, when all decent hitters were blasting away at Germany or Japan instead. Luis Tiant, a damned fine pitcher but not among the greatest, holds sixtieth place at 1.60 for 1968—and I’ll return to him in a moment. Dwight Gooden had a fabulous sophomore season in 1985, with a 1.53 ERA that puts him in forty-second place as one of only two modern pitchers in the first half-hundred. He then fell victim to what the newspapers politely call "substance abuse."
We then come to what may be the finest record in modern sports— Bob Gibson’s truly incredible 1.12 ERA of 1968, for fourth place, surrounded by forty old-timers before we meet Doc Gooden at number forty-two. Gibson’s only superiors are Tim Keefe with 0.86 in 1880, Dutch Leonard at 0.96 for 1914, and Three Finger Brown at 1.04 for 1906. How could Gibson compile such a record—the only post-1920 value below 1.50, and way, way below at that—in our modern era of greatly improved average hitting?
I don’t want to take a thing away from Bob Gibson, who absolutely terrified me in the 1967 World Series, when he almost single-handedly beat the Red Sox by winning three games and casting a pall of inevitability over the whole proceedings. But, in slight mitigation, 1968 was a really funny year, as mentioned previously (see page 104). For some set of reasons that no one understands, pitching took a dramatic upper hand that year, capping a trend of several years’ duration. (As explained before, the rulemakers then restored the usual order by lowering the pitching mound and shrinking the strike zone; batting averages and ERAs rose appropriately in the 1969 season and have remained in balance ever since.) The 1968 season didn’t just belong to Gibson; in that year, low ERAs sprouted like dandelions in my garden. In most years of modern baseball, no pitcher in either league has posted an ERA lower than 2.00. Uniquely in 1968 (the year Yastrzemski won the batting title with a paltry 0.301), all five leading American League pitchers bettered this mark: Tiant at 1.60, McDowell at 1.81, McNally at 1.95, McLain at 1.96 (a banner year for Scotland), and John at 1.98. As I said, Tiant was a terrific pitcher and great fun to watch, but not one of the game’s greatest; if he could post 1.60 for 1968, baseball was really out of whack that year. So Gibson certainly took maximal advantage of a weird year, but let’s not take anything away from him. No one, no matter how good, had any statistical right to post a value so much better than anything achieved for sixty years, especially when general improvement in play should have made such low ERAs effectively unobtainable. Gibson had one helluva year!
In quick summary of a long and detailed argument, symmetrically shrinking variation in batting averages must record general improvement of play (including hitting, of course) for two reasons—the first (expressed in terms of the history of institutions) because systems manned by the best performers in competition, and working under the same rules through time, slowly discover optimal procedures and reduce their variation as all personnel learn and master the best ways; the second (expressed in terms of performers and human limits) because the mean moves toward the right wall, thus leaving less space for the spread of variation. Hitting 0.400 is not a thing, but the right tail of the full house for variation in batting averages. As variation shrinks because general play improves, 0.400 hitting disappears as a consequence of increasing excellence in play.
11
A Philosophical Conclusion
Some people regard this explanation as a sad story. One can scarcely decry a general improvement in play, but the increasing standardization thus engendered does seem to remove much of the fun and drama from sports. The "play" in play diminishes as activities become ever more "scientific" in the pejorative sense of operating like optimized clockwork. Perhaps no giants inhabited the earth during baseball’s early days, but the best then soared so far above the norm that their numbers seemed truly heroic and otherworldly, while our current champions cannot rise nearly so far above the vastly improved average.
But I suggest that we should rejoice in the shrinkage of variation and consequent elimination of 0.400 hitting. Yes, excellence in play does imply increasing precision and standardization, but what complaint can we lodge against repeated maximal beauty? I have now been a fan for fifty years. I have seen hundreds of perfectly executed double plays and brilliant pegs from outfield to home (that may or may not have beaten the runner charging from third)—the kind of beautifully orchestrated precision that probably occurred only rarely in baseball’s early years. I do not thrill any less with each repetition. The pinnacle of excellence is so rare, its productions so exquisite. Did we ever get bored with Caruso or Pavarotti in their prime? I would much rather have my expectation of excellence affirmed when I go to the ballpark or the opera house than to take potluck and hope for a rare glimpse of glory in a sea of mediocrity.