Fund Ratings Are Fun, but Not Always Useful

Magazines' Survey Methods May Be Part of Problem


A seemingly surefire way for a financial magazine to jump off the newsstands is to put a mutual fund headline on its cover and a comprehensive rating of funds inside.

The ratings, which usually appear once or twice a year, are often used by investors to whittle their choices from the more than 5,000 funds on the market. But are these rankings reliable guides for fund investors?

Academic studies have found over the years that differences in risk-adjusted performance by money managers are merely blips on the screen and that no money manager has the skill to beat the market over time.

Hence, the blanket disclosure by mutual funds that past performance is no predictor of future performance. A study in 1990 concluded, though, that winners and losers in the fund industry tend to repeat. So the debate continues about both survey methodology and money managers' skills.

Shortcuts to choosing a fund, like culling a list to include only those funds that get a top grade of A+ or five stars from the magazines, can backfire, according to the latest study. It was conducted by Bob Fischer, an associate vice president of Legg Mason, in Richmond, Va., who concluded that magazine rankings were useless.

Fischer looked at four magazines - Business Week, Forbes, Kiplinger's and Money - and their fund ratings for 1992. For Business Week, which rated funds with up and down arrows, he converted the arrows to numerical values from 1 to 5. For the other magazines, he converted their letter grades to numbers.

He then selected 25 growth-and-income funds at random from each magazine and compared the funds' ratings in 1992 with their 1993 performance. He also looked at how the funds did for the three years ended in 1994.

Fischer performed a standard statistical test known as regression analysis, which measures the relationship between the funds' ratings and their subsequent returns. He found no correlation between the rankings and future performance.
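The kind of check Fischer describes is a garden-variety linear regression. The sketch below illustrates it with invented grades and returns; the data, the grade-to-number mapping and the use of scipy are assumptions for illustration only, not details taken from the study.

```python
# Illustrative sketch of regressing prior-year ratings against next-year returns.
# All figures below are made up; they are not Fischer's data.
from scipy.stats import linregress

# Hypothetical 1992 letter grades for a sample of growth-and-income funds,
# converted to numbers roughly as the article describes (A+ = 5 ... F = 1).
grade_to_score = {"A+": 5, "A": 4.5, "B": 4, "C": 3, "D": 2, "F": 1}
grades_1992 = ["A+", "A", "B", "C", "B", "D", "A", "C", "F", "B"]
ratings_1992 = [grade_to_score[g] for g in grades_1992]

# Hypothetical total returns (%) for the same funds in the following year.
returns_1993 = [8.1, 12.4, 7.7, 11.9, 6.5, 13.2, 9.0, 7.3, 10.8, 8.8]

# Regress next-year returns on prior-year ratings and inspect the fit.
result = linregress(ratings_1992, returns_1993)
print(f"slope = {result.slope:.2f}, r = {result.rvalue:.2f}, p = {result.pvalue:.3f}")
# A correlation (r) near zero with a large p-value would mean the ratings
# carried no detectable information about subsequent performance.
```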

Is such a correlation possible? Yes, he said, citing the close connection between Value Line's stock ratings and stock performance. In 22 of 30 years, Value Line's top-rated stocks outperformed those in group 2, which outperformed group 3 and so on.

Fischer conceded that the scope of his fund study was limited and his sample size small. Even so, he said that he doubted a bigger study would reach a different conclusion.

The reason is that magazines don't do much original research for their rankings, said Scott Lummer, managing director of Ibbotson Associates Inc., a Chicago consulting and market research firm. "I don't expect them to thoroughly interview each fund manager," he said. "... I can't expect to achieve nirvana for $2.50."

The managing editor of Money, Frank Lalli, conceded that fund rankings can be superficial. For that reason, Money abandoned its A-through-F grading system in August 1993, Lalli said.

"Rather than provide all-encompassing grades, Money gives its readers sophisticated information on funds' performance, their portfolios and their management styles in our monthly Fund Watch section, our semi-annual mutual fund roundup issues and in the fund feature stories we run virtually every month," Lalli wrote in a letter to the editor of Registered Representative magazine, where Fischer's study first appeared.

Business Week took exception. "He's looking for absolute returns, while our ratings are based on risk-adjusted returns over five years," said Jeffrey M. …