For Sale: “Your Name Here” in a Prestigious Science Journal (scientificamerican.com)
46 points by jsnell on Jan 9, 2015 | hide | past | favorite | 22 comments


It's kind of bizarre that this article focuses on the plagiarism issue but seems to miss the even bigger problem: it isn't at all clear whether these published studies were actually done!

If they aren't done and yet are published in fairly high impact journals, presumably people are relying on their conclusions. Even worse, increasingly automated systems are relying on journal articles for automated diagnosis.

OTOH, if the studies actually are done... I guess it's a new avenue for funding science!


> Even worse, increasingly automated systems are relying on journal articles for automated diagnosis.

Where can I learn more about this? Without hearing more, it sounds like a terrible idea.

As a scientist, if my beliefs were based on an unweighted sample of claims made in journal articles, I can tell you, I would believe a lot of nonsense. I hope no one is practicing medicine that way.


On mobile ATM, but IBM Watson does it. It is generally more accurate than humans and better able to keep up with the latest research.


"It's kind of bizarre that this article focuses on the plagiarism issue, but seems to miss the even bigger problem: It doesn't appear at all clear if these published studies are actually done!"

In fairness, I didn't get the impression that the article was "focused" on the plagiarism. Rather, it described using the plagiarism as a method to uncover dubious papers. If you perform a search for idiosyncratic phrases, such as the ones mentioned in the article, you turn up papers likely to have been faked entirely. The plagiarism is a giveaway. The more important point is that the studies themselves are phony -- and are being bought and sold into prestigious journals.
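The phrase-search heuristic described above can be sketched as a simple shared n-gram check. This is a toy illustration of the idea, not the method the researchers actually used; the sample abstracts and the 8-word window are invented here:

```python
# Toy sketch: flag pairs of texts that share long word sequences,
# mimicking the "search for idiosyncratic phrases" heuristic.
# Sample texts and the n-gram length are made up for illustration.
from itertools import combinations

def ngrams(text, n=8):
    """Return the set of n-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_phrases(a, b, n=8):
    """Phrases of n words that appear verbatim in both texts."""
    return ngrams(a, n) & ngrams(b, n)

papers = {
    "paper_A": "the results of this meta analysis suggest that the polymorphism "
               "is significantly associated with disease risk in the studied population",
    "paper_B": "the results of this meta analysis suggest that the polymorphism "
               "is significantly associated with cancer risk in asian cohorts",
    "paper_C": "we measured enzyme kinetics under varying temperature conditions",
}

for (name_a, text_a), (name_b, text_b) in combinations(papers.items(), 2):
    hits = shared_phrases(text_a, text_b)
    if hits:
        print(name_a, name_b, "share:", sorted(hits))
```

Here paper_A and paper_B get flagged because they share a long verbatim run with only the disease/population "plunked in" differently, while the unrelated paper_C does not.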


But that's the point. Authorship is being sold, but there is no indication either way as to whether the studies are actually being done by somebody. Is it possible that the people doing the actual research simply aren't getting the credit? Maybe only some snippets are plagiarised, and not the entire study?


"Authorship is being sold, but there is no indication for or against on whether the studies are actually being done by somebody."

I inferred that the studies are entirely bogus, i.e., no actual labwork is being done. I feel you can make this inference pretty safely from, for instance, the fact that the phony papers reference a nonexistent statistical methodology. If real work is being conducted, why bother citing a fictitious methodology?

I could be wrong, but I felt that soup-to-nuts fraud was strongly implied in the piece.


These are meta-analyses and as such don't require lab work by definition. My impression was that somebody has created some useful tools or amassed enough experience to do this kind of work efficiently.


Fair point about "labwork," but even putting that aside, it doesn't sound like any legitimate science is being done, one way or the other. I doubt they're even doing any meta-analysis. Sounds like they're just playing buzzword bingo, and citing previous studies to make it seem as if there's a legitimate basis for their "study." If you'll notice, most of the phony papers cited failed to establish any significant findings in their putative meta-analyses. If one were going to fake whole-cloth studies, it seems that this is the safest and easiest way to do it. For example: I find 20 previously published papers on a given topic, I claim I did a "Begger's funnel" meta-analysis of them, and I claim the Begger's funnel analysis found no statistically significant results. Done. I have a new paper to publish, from essentially nothing.

The giveaway, to me, is the fake statistical method that's repeatedly cited (in this case, the "Begger's funnel plot"). A legitimate meta-analysis wouldn't cite a fake methodology. It would have no need to do so. The fake method is used because it sounds arcane and advanced, and whoever is doing the peer review is going to assume it's legit.


This article actually frustrates me. I rarely comment, but for some reason this stirred me to do so. The fact is: the research matters, not the wording. Many academics 'plagiarize' wording but change it to fit the experiments they actually ran, especially foreign ESL researchers who want to convey similar information. For example: performing the exact same experiments, just with different chemicals. The papers could use almost exactly the same wording, but the data and graphs would be different. That doesn't make the research or experiments fake or any less meaningful. I do understand that this could be used as an indication that the research is not novel, but it is by no means definitive proof.


The linked original research into textual similarity [1] offers more compelling examples than those in the Scientific American article (which could plausibly be the innocent consequence of researchers diligently following a Chinese-language "how to structure a publishable meta-analysis study" guide and incorporating a few turns of phrase from the example papers).

Especially, the use of a definition copied verbatim from Wikipedia...

[1]http://blog.thegrandlocus.com/2014/10/a-flurry-of-copycats-o...


If you're looking for proof, we can always run the research through a Begger's funnel plot and graph the results; the outliers should then be revealed as plain as day!


I do agree. They do bring up additional evidence, though (late changes of authorship for example).


>"Sometimes there are minor variations in the wording but in more than a dozen articles we found almost identical language with different genes and diseases seemingly plunked into the paragraph, like an esoteric version of Mad Libs, the parlor game in which participants fill in missing words in a passage."

Though programs to auto-generate academic papers have existed for quite a while, and the garbage output papers have even been published in journals, I think a Mad Libs based program to generate "human-guided" academic papers would be a whole lot of fun at parties.

>"Now that a number of companies have figured out how to make money off of scientific misconduct, that presumption of honesty is in danger of becoming an anachronism."

Just wait until a rowdy game of Academic Mad Libs at a party or pub is more fun than quarters or beer pong as a drinking game. It would mean everyone could be Alchodemic and Sloshfisticated.

(All joking aside, the frightening thought is it would be fairly easy to automate human-guided plagiarism/fraud paper creation. It's already a solved problem in the automated "news" webspam regurgitation space.)
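The "Academic Mad Libs" automation alluded to above really would be trivial. A minimal sketch, in which the boilerplate template and the gene/disease fill-ins are entirely invented for illustration:

```python
# Toy illustration of filling a "Mad Libs" meta-analysis template;
# the template text and slot values are invented, not taken from
# any real paper.
from string import Template

boilerplate = Template(
    "We performed a meta-analysis of studies examining the association "
    "between the $gene polymorphism and $disease. No statistically "
    "significant association was found ($method, p > 0.05)."
)

for gene, disease in [("XRCC1", "lung cancer"), ("MTHFR", "stroke")]:
    print(boilerplate.substitute(gene=gene, disease=disease,
                                 method="funnel plot"))
```

Swap in a new gene/disease pair and you get a superficially plausible new "finding", which is exactly the pattern the article describes.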


"Scientists, for whom published articles are the route to promotion or tenure or support via grants"

This is the problem. It's not about the quality of the science being produced, it's about how many times your name shows up.

I'll tell you how one of my former bosses did it: he would do his work at a discounted rate in exchange for being added onto the paper. He did good work, but many of the papers he was added to were of questionable scientific quality. He didn't care, though, because he could claim authorship on over 1,000 papers!

The other issue, besides the quality of the science, is the blocking of information. I hate hearing about a new research paper on X only to find it hidden behind a paywall. For the journals it has become very much about the almighty dollar, and restricting access is part of that. Papers should be free to read, in my opinion.


The idea that quantity strictly beats quality is not uncontroversial, however. There is plenty of evidence of people doing great work (usually also coming from a small-ish set of "superstar" labs), and still doing just fine. See, for instance, this list: http://www.thespectroscope.com/read/its-not-publish-or-peris...

I think the fact that all the paper mill instances cited in the OP are from Chinese groups suggests that this is a problem driven by a few broken systems (primarily in the developing world), and not the modern scientific enterprise as a whole.


I am sympathetic to your argument, but I would say that the quality dimension is also heavily corrupted, especially in basic science where people rarely retrace experiments, and there is no requirement for a "working product".

The term "superstar" labs is itself loaded. Scientific discovery has always been heavily influenced by chance, and the fact that certain labs regularly publish in high-profile journals has more to do with political connections than with the intrinsic quality or impact of their work.


I know a section editor of a top medical journal. I sent him this article, and he was absolutely shocked. He knew about plagiarised content (although I had to explain "Mad Libs" to him) but he had utterly no idea about authorship being sold.

I asked him if changes to authorship are common, and he said it sometimes happens for legitimate reasons -- for instance, the journal may send back the paper to the original authors and say "you need to perform additional analysis on XYZ before we can publish." If someone else needs to be consulted for this to happen, it's not unusual to add that person's name on the paper at a later date.


I only skimmed the article but does that mean that the research itself is fake as well?


The "Your Name Here" is a requirement for medical doctors. Medical doctors are highly skilled craftsmen, but not scientists. Still, they need to write a doctoral thesis, which fuels authoring services of the kind that were common in the earliest universities, ages ago. The other evil side is that the research is sponsored by big pharma, which has a vital interest in its snake oil performing well on paper.


M.D. programs don't in general have a thesis/dissertation requirement (it's a professional rather than a research degree), though there are M.D. programs with thesis requirements, and they tend to carry more prestige (either as more research-oriented options at some institutions, or for all M.D.s at certain institutions).


Sounds like journal editors should run everything through TurnItIn (or other similar service).


Is that not already done?



