
2001 Predictions -- Keeping Score

By Tom Tippett
November 27, 2001

Predictions. You can't get through the month of March without seeing lots and lots of predictions for the coming season. The pre-season baseball magazines have been publishing them for years. The Sporting News used to poll vast numbers of writers and assemble standings based on a consensus of those picks. And although many writers point out that predictions are silly because so many unexpected things can happen, they still offer their own because their readers want them.

We entered the fray when we began putting together our annual Projection Disks four years ago. The main purpose of our Projection Disks is to allow you to play the coming season using Diamond Mind Baseball and over 1500 established big leaguers and top minor-league prospects. These players are rated based on projected stats that are created using our projection system, which uses stats from the past three years and adjusts for the level of competition (majors, AAA, AA), ballpark effects (including minor-league parks), league rules (DH vs non-DH), and age, among other things.
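
The exact formulas in our projection system aren't spelled out in this article, so the sketch below is purely illustrative: a few lines of Python showing the general shape of the calculation, with recent seasons weighted more heavily and each season's rate stat scaled by adjustment factors for competition level, park, and age. The weights and factor values here are hypothetical placeholders, not the ones we actually use.

    # Illustrative only -- this is not the actual Diamond Mind projection formula.
    # It shows the general idea: blend up to three seasons of a rate stat, weighting
    # recent seasons more heavily, and scale each season for competition level and
    # park before applying an age adjustment. All weights and factors are made up.

    def project_rate(seasons, age_factor=1.0):
        """seasons -- list of (rate, level_factor, park_factor) tuples,
        ordered from most recent to oldest."""
        weights = [5, 4, 3]                       # most recent season counts the most
        num = den = 0.0
        for (rate, level_factor, park_factor), weight in zip(seasons, weights):
            num += weight * rate * level_factor * park_factor
            den += weight
        return (num / den) * age_factor

    # Example: a hitter's on-base percentage from a AAA season and two AA seasons
    print(project_rate([(0.370, 0.92, 0.98),      # AAA, hitter-friendly park
                        (0.355, 0.88, 1.01),      # AA
                        (0.340, 0.88, 1.00)],     # AA
                       age_factor=1.02))          # young player, still improving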

After the projected stats have been computed and the player ratings assigned, we set up a manager profile for each team with its starting rotation, bullpen assignments, starting lineups versus left- and right-handed pitchers, and roles for bench players (such as platoons, spot starters, and defensive replacements). Using these projected stats, ratings and manager profiles, we can simulate the season many times and average the results to come up with our projected final standings for the season.
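
Here is a minimal sketch of that simulate-and-average step, assuming a placeholder simulate_season() function in place of a full Diamond Mind season replay. It shows how many simulated seasons can be rolled up into one set of projected standings; the team strength ratings are invented for the example.

    # Rough sketch of the simulate-and-average step. simulate_season() is a stand-in
    # for a full Diamond Mind season replay -- here it just draws a win total for
    # each team around a made-up strength rating.
    import random
    from collections import defaultdict

    def simulate_season(team_strengths):
        return {team: random.gauss(strength, 6) for team, strength in team_strengths.items()}

    def projected_standings(team_strengths, runs=50):
        totals = defaultdict(float)
        for _ in range(runs):
            for team, wins in simulate_season(team_strengths).items():
                totals[team] += wins
        averages = {team: total / runs for team, total in totals.items()}
        return sorted(averages.items(), key=lambda item: item[1], reverse=True)

    # Example with hypothetical strength ratings for three AL West clubs
    for team, wins in projected_standings({"Oakland": 94, "Seattle": 88, "Texas": 85}):
        print(f"{team:8s} {wins:5.1f}")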

This article is a look back at our projections and those of other prominent publications and writers. Who was most accurate this year? How about last year and the year before? Who has the best track record during the four-year period we've been doing this?

To answer those questions, we need a way to assign an accuracy score. Pete Palmer (co-author of Total Baseball and The Hidden Game of Baseball) has been projecting team standings for more than 20 years, and he routinely collects predictions and ranks them at the end of the year. His rankings are based on a simple scoring system -- for each team, subtract its actual placement from its projected placement, square that difference, and add up the squares for all the teams. For example, if you predict a team will finish fourth and they finish second, that's a difference of two places. Square that difference and you get four points. Do this for every team and you get a total score. The lower the score, the more accurate your predictions.

In 2001, there was a tie for first in one division. In a case like this, we don't try to break the tie and name one team as having finished first and the other having finished second. Instead, we say that each team finished in 1.5th place for the purposes of figuring out how many places a prediction was off. For example, if you project a team to finish third and they finish in a tie for first, that's a difference of 1.5 places. That's why you'll see fractional scores in the following table.
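
Here is a minimal Python version of that scoring method, including the fractional-place convention for ties. The division, picks, and final order in the example are made up for illustration.

    # Pete Palmer's scoring method as described above, with ties counted as
    # fractional places. The division, picks, and final order are invented.

    def actual_places(final_order):
        """final_order -- list of groups, e.g. [["HOU", "STL"], ["CHC"], ...];
        teams tied for a spot all get the average of the places they occupy."""
        places, next_place = {}, 1
        for group in final_order:
            shared = (next_place + (next_place + len(group) - 1)) / 2.0
            for team in group:
                places[team] = shared
            next_place += len(group)
        return places

    def palmer_score(predicted_order, final_order):
        places = actual_places(final_order)
        return sum((spot - places[team]) ** 2
                   for spot, team in enumerate(predicted_order, start=1))

    # A six-team division in which two teams tie for first
    predicted = ["STL", "HOU", "CIN", "MIL", "CHC", "PIT"]
    final     = [["HOU", "STL"], ["CHC"], ["MIL"], ["CIN"], ["PIT"]]
    print(palmer_score(predicted, final))   # 8.5 -- ties produce fractional scores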

This isn't the only scoring system one could use to rank these projections, of course. The folks at ESPN.com use the same approach but don't square the differences. Another web site assigns a score based on how many games each team finished out of its predicted place in the standings. The rankings would change if a different method were used. For example, the ESPN.com method makes us look a little better and the games-out method makes us look a little worse.

I should also point out that it's not entirely fair to put all of these predictions into the same group. Some of them, especially those for the spring baseball magazines, were made long before spring training started. Others, including ours, were done in early March, while some were published just before opening day. Obviously, the later you do them, the more information you have on player movement and injuries.

Nevertheless, here are the rankings for 2001, using Pete's method:

Forecaster                            Score
Sean McAdam, ESPN.com                  32.5
Lindy's                                36.5
Steve Mann                             38.5
Dan Shaughnessy, Boston Globe          44.5
USA Today                              46.5
Baseball Weekly                        46.5
Jayson Stark, ESPN.com                 46.5
The Sporting News (spring magazine)    52.5
Diamond Mind simulations               54.5
Tom Tippett, Diamond Mind              54.5
Baseball America                       54.5
Sports Illustrated                     56.5
Peter Gammons, ESPN                    56.5
Gordon Edes, Boston Globe              56.5
David Schoenfield, ESPN.com            56.5
Matt Szefc, ESPN.com                   56.5
Bob Klapisch, ESPN.com                 57.5
Zack Scott, Diamond Mind               58.5
Baseball Digest                        58.5
Baseball Yearbook                      58.5
Danny Sheridan, USA Today              59.5
Rany Jazayerli, Baseball Prospectus    62.5
2000 final standings                   64.5
Las Vegas over-under line              65.5
Rob Neyer, ESPN.com                    66.5
Athlon                                 67.5
John Sickels, ESPN.com                 68.5
Street & Smith                         68.5
Pete Palmer                            70.5
Larry Whiteside, Boston Globe          74.5
Mazeroski                              75.5
Bob Ryan, Boston Globe                 84.5
Spring training results               113.5

Three of the entries in this list represent the Diamond Mind picks. The "Diamond Mind simulations" entry is the one representing the average result of simulating the season 50 times. These simulations were done about three weeks before the season started and were based on the information we had through March 10th. Two weeks later, Zack Scott and I each made our own predictions. These were based largely on the simulation results but also took into account new information about injuries and our own hunches. For example, our simulations had Boston finishing slightly ahead of New York, but two weeks later we knew that Nomar would miss at least half the season, so I reversed these two teams in my projections.

There are three entries in this list that don't represent the views of a writer or a publication:

- if you predicted that the 2001 standings would be the same as in 2000, your score would have been 64.5

- if you put together a set of standings based on the Las Vegas over-under line, you'd have scored 65.5.

Note: I've never placed a sports wager, so I'm not an expert on this subject by any means. But I know that you can go to Las Vegas and place a bet on whether a team will win more or fewer games than the over-under number. Because casinos take a commission on every bet, they're guaranteed to make a profit if equal amounts of money are bet on both sides. They only lose money if a lot more people bet one way and those people are right. In other words, the over-under line isn't a predicted win total by the experts at the casinos. Those experts might think Baltimore will win 68 games, but they'll set the line at 65 if that's what it takes to balance the wagers. And the number can change if more bets come down on one side. So the Vegas over-under line is actually an indication of how the betting community sees things. (A quick worked example of the commission arithmetic follows this list.)

- if you predicted that the regular season standings would match the 2001 spring training standings, your score would have been an atrocious 113.5. In other words, the spring training results were almost useless as a predictor of the real season.
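
To make the commission arithmetic from the note above concrete, here is the standard case, assuming the common arrangement in which a bettor risks $110 to win $100 (the exact terms vary from book to book):

    # Why balanced action guarantees the house a profit, assuming the common
    # "risk $110 to win $100" terms (actual terms vary by sportsbook).
    bettors_per_side = 100
    handle = 2 * bettors_per_side * 110                   # $22,000 collected from both sides
    payout_to_winners = bettors_per_side * (110 + 100)    # stakes back plus $100 profit each
    print(handle - payout_to_winners)                     # $1,000 kept no matter which side wins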

Much more interesting than the overall scores, in my opinion, are the details. Leaving out the three entries that don't represent writers or publications, here are some observations about how the other 30 saw things last spring:

AL East. Twenty-three had the Yankees winning the division, the rest chose Boston. Twenty picked Baltimore for the basement, the others had Tampa Bay finishing in the cellar. This was the easiest division to forecast, with several having it right from top to bottom.

AL Central. Only six of thirty predictors thought Chicago would repeat, with the other twenty-four looking for Cleveland to bounce back. That surprised me a little bit. But I wasn't surprised to see that twenty predictors were expecting another last-place finish for Minnesota. Nobody picked them to finish second, and only three (Sean McAdam, Dan Shaughnessy, and Bob Klapisch) had them as high as third. Thanks to Minnesota, nobody got this division 100% right.

AL West. These thirty forecasters were unanimous in picking Oakland to win the division and they were evenly split on Seattle versus Texas for second place. In our simulations, Texas got just enough pitching to go with their high-powered offense and managed to finish second to Oakland, with Seattle third. In reality, Texas got no pitching and they finished last. Athlon was the only one to pick Texas for the basement; John Sickels the only one to put Seattle there. Seattle's runaway win ensured that nobody got this division completely correct.

NL East. Atlanta managed to win this division again, but just by a hair. Seven prognosticators saw their decline coming and projected a new division winner. In every case, however, their pick was New York. The Diamond Mind crew was alone in picking Philadelphia third, with sixteen predictions putting them in the basement for the second year in a row and the other eleven putting Philly fourth. On the other hand, we were the only ones who had the Marlins, third-place finishers in 2000, dropping all the way to last. Once again, nobody nailed this division from top to bottom.

NL Central. Bob Ryan picked the Reds to win this division, but everyone else settled on the Cardinals (25 picks) or the Astros (4 picks). We had St. Louis in first and Houston in second, and we were sure we were going against the grain to have the Astros doing so well. After all, they were coming off a very disappointing 2000 season. But 23 of the selectors saw things the same way and predicted that Houston would finish either first or second. The Cubs, who tied Philly for the league's worst record the year before, moved all the way from sixth to third in the division. Twenty-one predictions called for them to be last or second-last in 2001, and only one (Gordon Edes) correctly picked the Cubs for third place. This is the fourth division that nobody got entirely right.

NL West. This division killed us because we had the re-tooled Rockies winning it and they finished last instead. We weren't alone, but we didn't have much company either, as only three others picked Colorado first. This is where Sean McAdam sewed up the title -- he was one of only five to correctly pick Arizona as the division champs, and he had the second and third place teams right as well. Twenty-eight predictions had San Diego bringing up the rear, and when that didn't happen, we ended up with yet another division finish that nobody called correctly.

Looking back over the past four years, here are the rankings for those who were included in our sample every year:

Forecaster                   2001  2000  1999  1998  Total
Steve Mann                   38.5  58.0  54.0  44.0  194.5
Sports Illustrated           56.5  40.0  56.0  54.0  206.5
Diamond Mind                 54.5  68.0  42.0  44.5  209.0
Baseball Weekly              46.5  58.0  51.5  60.0  216.0
Las Vegas over-under line    65.5  51.5  48.0  52.0  217.0
Pete Palmer                  70.5  54.0  40.0  58.0  222.5
Sporting News                52.5  38.0  78.0  54.0  222.5
Athlon                       67.5  42.0  72.0  72.0  253.5
Street & Smith               68.5  58.0  68.0  64.0  258.5
Baseball Digest              58.5  66.0  76.0  58.0  258.5
Mazeroski                    75.5  48.0  84.0  88.0  295.5

We've added a bunch of new predictions to our database in the past two years, and here's how folks have done in 2000-2001:

Forecaster              2001  2000  Total
Sean McAdam             32.5  38.0   70.5
Gordon Edes             56.5  26.0   82.5
Sporting News           52.5  38.0   90.5
Steve Mann              38.5  58.0   96.5
Sports Illustrated      56.5  40.0   96.5
Dan Shaughnessy         44.5  54.0   98.5
Baseball Weekly         46.5  58.0  104.5
Peter Gammons           56.5  48.0  104.5
Rany Jazayerli          62.5  46.0  108.5
Baseball America        54.5  54.0  108.5
Athlon                  67.5  42.0  109.5
David Schoenfield       56.5  56.0  112.5
Rob Neyer               66.5  48.0  114.5
Las Vegas over-under    65.5  51.5  117.0
Diamond Mind sims       54.5  68.0  122.5
Mazeroski               75.5  48.0  123.5
Pete Palmer             70.5  54.0  124.5
Baseball Digest         58.5  66.0  124.5
John Sickels            68.5  58.0  126.5
Street & Smith          68.5  58.0  126.5
Bob Ryan                84.5  58.0  132.5
Bob Klapisch            57.5  78.0  135.5

Except for the disappointing 2000 season, our approach to developing projections seems to be providing good results. In several cases, our relatively high ranking was primarily due to putting certain over-rated teams in their places. On the other hand, our track record with very young teams and very old teams hasn't been quite as good. Before we produce our projections for 2002, we'll be taking another look at these results to see if there are any adjustments that can be made to our projection system.

In the meantime, congratulations to Sean McAdam for a very strong performance in a year that was far from easy to forecast.