CEU's Sinatra and Barabasi on “SciSci” in Science Magazine

Mathematicians, computational social scientists, and network scientists, including CEU's Roberta Sinatra and Albert-Laszlo Barabasi, are among the researchers using big data to look at the practice of science itself. Those outside the scientific world might not know that research scientists' careers flourish or flounder based on citations in scientific journals: in other words, on groups of peer reviewers seeing merit in their work, acknowledging breakthroughs, and approving it for publication. But what, and who, gets published is not black and white. In a paper released today in Science, Sinatra, Barabasi, and their co-authors review recent developments in the study of science itself, or “SciSci.”

In the 21st century, we have access to large databases that allow scientists to study publication and citation data. But is the way scientists conduct and report research conducive to capturing the broadest scientific discoveries and breakthroughs? The group found that scientists working in the same field or on the same topics tend to cite one another.

“Most new links fall between things only one or two steps away from each other, implying that when scientists choose new topics, they prefer things directly related to their current expertise or that of their collaborators. This densification suggests that the existing structure of science may constrain what will be studied in the future.”

The risk of not being published can affect what scientists choose to study, the authors found. A lack of publications means being eliminated from consideration for big awards and accolades. The authors suggest that “one way to alleviate this conservative trap is to urge funding agencies to pro-actively sponsor risky projects that test truly unexplored hypotheses and take on special interest groups advocating for particular diseases.” Interestingly, research shows an imbalance in the allocation of biomedical resources in the U.S. that follows established patterns but likely does not address the most critical diseases.

Novelty seems to be eschewed by peer reviewers controlling grant money, despite the fact that rare combinations of ideas, and papers pairing previously disconnected ideas, have high citation rates. “In other words, a balanced mixture of new and established elements is the safest path toward successful reception of scientific advances.”

Many factors play into successful or unsuccessful scientific careers, among them the space to fail, gender, mobility, and reputation. The “publish or perish” burden seems to hurt not just scientists but the sciences themselves. The authors note that models of career growth show that short-term or non-tenure contracts “are responsible for productivity fluctuations, which often result in sudden career death.” They also cite a study showing that funding schemes that allow for early-career failure but reward long-term success are more likely to produce high-impact papers than grants run on short review cycles.

Gender inequality is still a major factor in science. “Women have fewer publications and collaborators and less funding, and they are penalized in hiring decisions when compared with equally qualified men,” they write. An experiment they cite showed that when gender was randomly assigned to the CVs in a pool of applicants, candidates presented as female were systematically penalized by the hiring committee.

Recent decades have seen a dramatic increase in team science, often bringing big names, big reputations, and big research funds together. The authors note that “today, a team-authored paper in science and engineering is 6.3 times more likely to receive 1,000 citations or more than a solo-authored paper.” Underscoring the authors' point that the ideal paper mixes novel combinations with familiar research domains, diverse teams can supply both of these critical elements.

Among the authors' suggestions is to study failure more closely in order to better understand and improve science. They also stress that a “rich-get-richer” dynamic favors those scientists and engineers who fit the established paradigm, and they suggest considering performance indicators other than citations. The authors also call for more communication about the social implications and applications of science and technology and, perhaps, incentives that would bring out the best practical research and reduce redundancy.

Finally, Sinatra, Barabasi, and their co-authors write about science funding and the biases and inconsistencies that affect it. They note some approaches that have been suggested to address these problems, including the random distribution of funding and crowd-funding.

Part of Sinatra’s research was funded by the University’s Intellectual Themes Initiative as part of the Just Data project.

Read the full Science paper here