Data Don't Lie, Do They?
When the truth of what’s published is questioned, an institution’s research fraud investigators might need to step in
Laid side by side, the slide shown at a conference in the late 1980s and a figure from an article in a leading scientific journal appeared similar, but certainly not identical.
But the article's author charged that the slide was a copy of one in his published work, recalls Margaret Dale, the former dean for faculty and research integrity, who was then part of the School's faculty affairs group.
The allegation became a case for Mortimer Litt, the School’s point person for academic and research integrity.
Litt, chief scientific investigator for the Office of Academic and Research Integrity at HMS, carefully examined the two slides. In a style that has become his signature over nearly three decades as a pioneer in the field of research integrity, Litt forged a path to find the truth: Instead of enlarging the photographs to discern similarities or marked differences in the images, he looked at the microscopic particles that are inadvertently deposited on every slide during preparation.
Litt found exact, matching patterns of debris speckling each image, indicating that both had come from the same slide. The allegation of plagiarism was true.
“It would have been very difficult to prove, but Mort found a creative way to investigate the allegation, and then presented his findings in a way that everyone could see clearly and understand,” Dale says, still amazed at her former colleague’s ability to sort through volumes of complex information to find answers to seemingly insoluble questions.
The 90-year-old Litt still arrives at his small, sunny office every morning to go about his job of finding the truth, just as he has done since the late 1980s. Litt and his colleagues in the School’s academic and research integrity group have in fact shaped how research integrity investigations are handled, not only at HMS, but at the federal level as well.
The Prepared Mind
On the bookshelf to the right of Litt’s uncluttered desk are stacks of manila folders, about seventy, he estimates, each containing a precisely detailed report of an investigation he has conducted into allegations of misconduct, fabrication, or plagiarism at research laboratories at HMS and its fifteen affiliated institutions.
How he came to be an investigator of research fraud is a story in itself, one marked by what Litt describes as “experiences—and luck.”
From a public high school in Brooklyn he enrolled at Columbia University, where he began his studies in philosophy but ended as premed. It was luck that put him in charge of a crew of medics at a U.S. Army hospital in Japan at the close of World War II, a stroke of serendipity, he says, that reinforced his interest in the possibility of a career in medicine.
After returning home from the war and graduating from Columbia, he applied to twelve medical schools, but wasn’t invited to interview at any of them.
Once again, luck intervened. One of his professors offered him a chance to “do routine work” in a leading research laboratory at the Rockefeller Institute for Medical Research in Princeton, New Jersey.
There, Litt says, he got a feel for what working in a medical laboratory could offer, and, with the recommendations of his supervisors at the institute, he was accepted into medical school at the University of Rochester.
Weight of Evidence
Litt is soft-spoken, focused, and passionate as he explains his drive to unlock secrets contained in detailed lab notes, photographs, complex calculations, techniques, and in the words used by scientists up and down the hierarchy of the research community.
“Our job is to find the truth,” he says.
At stake are not only reputations but the integrity of science, as noted in a 2015 semi-annual report from the National Science Foundation’s Office of Inspector General.
Research misconduct, it says, “damages the scientific enterprise, is a potential misuse of public funds, and undermines the trust of citizens in government-funded research.”
That same report summarizes findings of an analysis that looked for evidence of plagiarism in the more than 8,000 proposals awarded by the NSF during fiscal year 2011. Using commercial plagiarism-detection software, the office assessed how much of each proposal’s text appeared to be copied.
Although many proposals contained some copied text, the amount of copying found in others led to thirty-four plagiarism investigations. Ten of those resulted in findings of research misconduct, which, at the time of the report, led to the recovery of $357,602 in federal funds.
A 2012 study, published in the Proceedings of the National Academy of Sciences by researchers at the Albert Einstein College of Medicine, found that misconduct, not errors, was responsible for most retractions from journals.
Among the 2,047 retracted papers they analyzed, the researchers found that “21 percent of the retractions were attributable to error, while 67 percent were due to misconduct, including fraud or suspected fraud (43 percent), duplicate publication (14 percent), and plagiarism (10 percent). Miscellaneous or unknown reasons accounted for the remaining 12 percent.”
Investigations of misconduct also carry a cost, both to individual institutions and to funding agencies. A 2010 paper in PLoS Medicine by researchers at Roswell Park Cancer Institute in Buffalo, New York, dissected the direct and indirect costs of one such investigation at their institution. The factors ranged from lost productivity, to the time demanded of faculty involved in the investigation, both those accused and those reviewing the evidence, to the cost of penalties and of retractions from the literature. Tallying them all, the scientists estimated that the cost of that one case approached $525,000.
Another look at the cost of misconduct appeared in 2014 in the online journal eLife, this one focusing on work funded by the National Institutes of Health. Its authors found that research papers retracted because of misconduct “accounted for approximately $58 million in direct funding by the NIH between 1992 and 2012, less than 1 percent of the NIH budget over this period.” While the financial toll was less significant than what the authors expected, they did find that the toll on researchers’ careers was steep, with many being derailed.
It was in the mid-to-late 1980s when a particularly complicated case found its way to the desk of Eleanor Shore ’55, then dean for faculty affairs and responsible for overseeing the rare allegation of wrongdoing in research conducted at the School.
When the case arose, a colleague suggested to Shore that Litt might lend a hand.
“What Mort did,” recalls Gretchen Brodnicki, the dean for faculty and research integrity in the Office of Academic and Research Integrity, a stand-alone office at the School since 2008, “was to create a new discipline, to professionalize and standardize the assessment of concerns about data integrity, the assessment of figures, of data, and of publications. He helped establish a mechanism to determine whether there are issues of integrity.”
“It just didn’t exist until he started doing this work,” she adds.
Brodnicki says that the forensic research methodologies Litt developed at HMS are now used by the federal Office of Research Integrity and taught to people at institutions around the country.
According to Dale, Litt got involved in that first case shortly before the School’s comprehensive conflicts of interest and commitment policy was implemented. At the time, questions of research misconduct arose infrequently, and never with the expectation that another would surface anytime soon.
Today, twenty-five to thirty allegations come across Brodnicki’s desk each year, involving researchers at the School or its affiliated hospitals. The pressure to publish is one driving motivator behind the higher numbers, Brodnicki says.
Publishing drives academic research today and can mean funding, promotion, success, or that next job. For some people, it can mean staying in this country, as publishing can serve as a basis for visa applications.
But the primary reason Brodnicki thinks her office sees more cases today is that there is far more external scrutiny than in the past. Why? The internet.
“The number of queries we get from outside sources has skyrocketed,” she says, while the number of cases coming from someone who has seen, witnessed, or been privy to misconduct has remained unchanged.
Also consistent are HMS’s numbers compared with those of other medical schools. Yet, because Brodnicki’s office reviews cases from all the School’s affiliated institutions, its numbers can seem high.
Whatever It Takes
The review process follows strict federal regulatory requirements, but remains an academic review by fellow scientists: The facts and circumstances are adjudicated by full professors. The proceedings are kept confidential unless and until there has been a determination that a researcher has engaged in research misconduct.
Litt emphasizes that “an allegation is just that; it’s not proof,” so keeping all information confidential is vital to avoid the risk of seriously and unduly damaging a researcher’s reputation.
Allegations of falsification, fabrication, or plagiarism of data can be made in any number of ways. It may begin with an email from a colleague at the next laboratory bench who saw something that looked odd. Or someone might walk into Brodnicki’s office saying that “someone published this work, and it’s my work,” with supporting documentation in hand. One lab may allege wrongdoing by another lab on campus, or allegations may come from people at other institutions in the United States or elsewhere.
Dale remembers a call that came from a researcher in Russia who read a paper and saw a figure that didn’t make sense. The researcher looked up Dale’s name on the HMS website and sent her his concern.
“And there was a problem,” Dale says.
After an allegation is made, Brodnicki and members of her team, which very often includes Litt, meet to determine whether the allegation meets the definition of research misconduct, defined by the U.S. Office of Research Integrity as “fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results.”
“If someone says, ‘I tried to do this work in my own lab, and I couldn’t make it work in the same way, and so I think this person falsified the data,’ that kind of complaint will not necessarily meet the definition of research misconduct,” says Brodnicki. “The definition of research misconduct does not include honest error or differences of scientific opinion.”
When an allegation does satisfy the definition, it enters the inquiry phase. The accused is formally notified and three full professors who work outside the department of the accused but have relevant scientific expertise form a panel to assist in the investigation. The accused is given an opportunity to assess panel members for possible conflicts. Those being investigated may hire outside counsel.
At this point, all information related to the research in question, including lab notebooks, computers, computer drives, and supplemental material, is sequestered. This is also when Paul Russell, the John Homans Distinguished Professor of Surgery at Massachusetts General Hospital and the chair of the Standing Committee on Faculty Conduct at HMS and HSDM, often gets involved. A determination is made either to close a case because no misconduct was found or proved, or to move forward to a full investigation.
An investigation can take a year or two to complete. It is in this protracted period of data sifting that Litt’s work as a sleuth has helped shape the field.
Litt can spend hundreds of hours poring over manuscripts and laboratory materials in cases where plagiarism is alleged.
“I read a paper, I look at the bibliography, and I will do a character-by-character analysis,” he says.
Litt notes that as Russell’s standing committee works on a case, they reference not only the forensic findings but also the physical exhibits that Litt carefully constructs. For example, in cases where written work is thought to have been borrowed without attribution, Litt prepares exhibits displaying the original and the alleged copy and marks identical words in green and similar words in yellow. If Litt has indeed found the work to be a result of plagiarism, his displays will show it at a glance, in vivid detail. Litt dismisses software designed to find plagiarism, and Brodnicki says he does catch things that such programs miss.
“Software will identify identical text,” says Brodnicki, “but nothing can take the place of human assessment of that text and whether there may be an explanation for the reuse of certain text.”
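Litt’s hand-built green-and-yellow exhibits rely on human judgment, but the underlying comparison of identical versus merely similar words can be loosely approximated in software. The sketch below is not Litt’s method or any particular commercial tool; it is a minimal illustration, using Python’s standard difflib module, of labeling each word of a suspect passage as identical to, similar to, or absent from a source passage. The similarity threshold of 0.8 is an arbitrary assumption for the example.

```python
from difflib import SequenceMatcher

def classify_words(source: str, suspect: str, sim_threshold: float = 0.8):
    """Label each word of `suspect` as 'identical', 'similar', or 'new'
    relative to `source` -- a rough stand-in for green/yellow highlighting.
    The 0.8 character-level similarity cutoff is an illustrative choice."""
    src_words = source.lower().split()
    sus_words = suspect.lower().split()
    labels = ["new"] * len(sus_words)
    # Align the two word sequences, then inspect each aligned region.
    matcher = SequenceMatcher(a=src_words, b=sus_words)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            # Words that match the source exactly ("green").
            for j in range(j1, j2):
                labels[j] = "identical"
        elif tag == "replace":
            # Substituted words: compare pairwise at the character level
            # to catch near-matches like singular/plural ("yellow").
            for a, b in zip(range(i1, i2), range(j1, j2)):
                ratio = SequenceMatcher(a=src_words[a], b=sus_words[b]).ratio()
                if ratio >= sim_threshold:
                    labels[b] = "similar"
    return list(zip(sus_words, labels))

source = "the patterns of debris were identical on both slides"
suspect = "the pattern of debris was identical on both images"
for word, label in classify_words(source, suspect):
    print(f"{word:10s} {label}")
```

On this toy pair, “pattern” is flagged as similar to “patterns,” while most other words register as identical, which is exactly the kind of surface match Brodnicki notes still requires a human to interpret: the software sees reused text, not whether the reuse has an innocent explanation.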
Russell also points to Litt’s ability to find patterns, whether in copied photographs, in dubious calculations, or in experiments.
“Not infrequently the evidence is clear that something in the laboratory, a critical program, has gone awry,” says Russell. “And it’s up to us to decide: Has it been a matter of malfeasance, has it been a matter of mistakes, or has it been a matter of misbehavior?” he adds. “We must decide, is this person a knave or a fool?”
It is often Litt’s work, Russell says, that helps answer the question of intent, a key criterion in a finding of misconduct.
Does every miscalculation, every sloppy technique, every altered slide tip the conclusion in the direction the researcher would like to see? What is the plausibility of the mistakes? Can the circumstances in which they were made be re-created? Is there a consistency with which the mistakes occurred? Litt’s answers to these and other questions find their way into a report that Russell’s standing committee uses to make its final determination of whether there is research misconduct, and if there is, to make a recommendation on what measures should be taken.
The discussions are emotional and often difficult, according to Litt and Brodnicki, with outcomes that can range from lauding the person who has been reviewed to dismissing the individual.
“When I make connections, it satisfies me intellectually,” Litt says. “But along with that, I feel the tragedy of the situation.”
For Litt and the others who every day see work being done by brilliant, dedicated scientists who maintain strict integrity right alongside the few who are intentionally, knowingly, or recklessly trying to tip things in their favor, the difference is stark.
“It’s a double-sided experience, one positive and one negative, side by side,” Litt says.
Ellen Ishkanian is a freelance writer based in Massachusetts.
Images: John Soares