Friday, October 14, 2005

Are Journal Impact Factors Being Misused?

An interesting article by Richard Monastersky entitled "The Number That's Devouring Science" was published in the October 14th issue of the Chronicle of Higher Education. The article discusses how the journal impact factor, once a simple way to rank scientific journals, has become an unyielding yardstick for hiring, tenure, and grants. The problem, it seems, is that the impact factor was developed to select the most important journals for a new citation index, not as an article-level evaluation tool.

The ISI citation indexes, conceived in 1955 and developed in the 1960s primarily by Eugene Garfield, have become some of the most widely used citation tools in the sciences and social sciences. Garfield needed a way to select the most important journals for the new index he was building. Simply including the journals that received the most citations would not do, since that would tend to eliminate smaller publications. So he came up with the "impact factor," a grading system for journals that could help him pick out the most important publications from the ranks of lesser titles. The impact factor reflects the average number of citations per article for each journal.

Each year, the number by which science journals live and die is computed from a simple formula. To calculate the impact factor for journal X in a given year, Thomson ISI examines the 7,500 journals in its index and counts the citations made that year to material X published in the previous two years. That count becomes the numerator. The denominator is the number of original research articles and reviews published by X in the previous two years.
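As a rough sketch, the arithmetic amounts to a single division (the function name and sample figures below are hypothetical, used only to illustrate the formula):

# A minimal sketch of the impact-factor formula described above.
def impact_factor(citations, citable_items):
    # citations: citations received this year to articles the journal
    #            published in the previous two years
    # citable_items: original research articles and reviews the journal
    #                published in those same two years
    return citations / citable_items

# Hypothetical journal: 1,600 citations in 2004 to its 2002-2003 output
# of 500 articles and reviews gives an impact factor of 3.2.
print(impact_factor(1600, 500))  # 3.2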

Impact factors caught on because they offer an objective measure that serves many purposes. Librarians use them to decide which journals to purchase or cancel. Editors and publishers can gauge their progress relative to their competitors. Scientists can examine the numbers to see where their research papers are likely to get the most attention.

However, the measurement is an average across all the papers in a journal; it doesn't apply to any single paper, let alone to any author. According to Monastersky, a quarter of the articles in Nature in 2004 drew 89 percent of the citations to that journal, so the vast majority of its articles received far fewer than the average of 32 citations reflected in the most recent impact factor. Mr. Garfield and ISI routinely point out the problems of using impact factors to judge individual papers or people. According to the article, Jim Pringle, vice president for development at Thomson Scientific, which oversees ISI, put it bluntly: "It is a fallacy to think you can say anything about the citation pattern of an article from the citation pattern of a journal."
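The skew matters because an average says little about a typical article. A small illustration with entirely made-up citation counts shows why:

from statistics import mean, median

# Invented citation counts for 20 papers in one journal: a few blockbusters
# and many rarely cited articles, mimicking the skew Monastersky describes.
citations = [310, 220, 180, 150] + [12, 9, 7, 5, 4, 3, 2, 1] * 2

print(mean(citations))    # 47.3 -- the impact-factor-style average
print(median(citations))  # 6.0  -- what a typical article actually receives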

The pressure to publish in the highest-impact journals in order to succeed in the tenure process and in grant competitions has led researchers to compete more and more fiercely for the limited number of slots in those journals. Impact factors may also affect the kind of research that is conducted. Top journals require that papers be topical, so researchers may shift the kinds of questions they investigate to accommodate those high-impact journals. The question has become whether impact rankings have begun to control scientists, rather than the other way around.

Monastersky also noted that in 1997 the Journal of Applied Ecology cited its own one-year-old articles 30 times. By 2004 that number had grown to 91 citations, an increase of roughly 200 percent. Over the same period, citations to the journal from other publications had increased by only 41 percent.

Steve Ormerod, the journal's executive editor from 2000 through 2004, wrote several editorials during his tenure that cited his own journal dozens of times. In 2002, for example, two of his commentaries cited 103 papers published in the journal during 2000 and 2001. Those two editorials alone raised the journal's 2002 impact factor by 20 percent. The self-citations had a measurable effect: the journal's impact factor jumped from 1.3 in 1997 to 3.3 in 2004, and its ranking within the discipline rose from 29th out of 86 journals to 16th out of 107.
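Because editorial self-citations land in the same numerator as everyone else's citations, a handful of commentaries can move the number noticeably. A back-of-the-envelope sketch, with invented figures chosen only to reproduce a 20-percent jump like the one described above:

# Hypothetical counts for a mid-sized journal, reusing impact_factor() from
# the sketch above. None of these figures are the journal's actual numbers.
citable_items = 200       # articles and reviews published in the prior two years
outside_citations = 460   # citations arriving from other publications
self_citations = 92       # editorial citations to the journal's own recent papers

print(impact_factor(outside_citations, citable_items))                   # 2.3
print(impact_factor(outside_citations + self_citations, citable_items))  # 2.76, up 20%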

For More Information

Garfield E. Citation indexes to science: a new dimension in documentation through association of ideas. Science. 1955;122:108–111.

Garfield E, Sher IH. Genetics Citation Index. Philadelphia, Pa: Institute for Scientific Information; July 1963.

Garfield E. The Obliteration Phenomenon in Science -- and the Advantage of Being Obliterated! Essays of an Information Scientist. 1975;2:396–398.

Garfield E. Which medical journals have the greatest impact? Ann Intern Med. 1986;105:313–320.
