Fake Journals and Fake Paper Publication:
Scholars are not the only ones producing fake papers and getting them published in supposedly reputed journals; many journals and their review mechanisms are themselves fake, so one can easily publish almost any ‘nonsense paper’ in them.
What are the real impacts of Fake Journals on Research:
- First of all, if a genuine researcher unfortunately gets trapped by a fake journal and publishes his state-of-the-art work there, he will certainly be affected by it, because the university or the scholar's guide may later refuse to count it as a reputed peer-reviewed journal publication. After such an unfortunate publication, the scholar cannot even republish the work in another reputed journal, since it would be treated as duplicate work, or at least as work with a great deal of self-plagiarism. So, in this case, the progress of the research, or even the scholar's entire research, is affected.
- On the other hand, if a scholar intentionally publishes a paper with a colourful fake theory and cooked-up fake results in such a fake journal, that publication will also harm genuine research carried out by other genuine researchers. Let me explain this with a classification problem in data mining. Assume a typical ‘cancer’ dataset in which 80% of the records are clean, i.e., within that 80% the malignant records are definitely distinguishable from the normal records. Assume the remaining 20% of the records are indistinguishable from one another: from the attributes of those records, we cannot say for certain whether a record belongs to the normal or the malignant category. In the very first published work on that dataset, a genuine researcher [1] may achieve 72% classification accuracy and publish his work based on that figure. Some time later, another researcher [2] may work hard on the same classification problem, achieve 74.5% accuracy, and publish those findings. A scholar building his research on the works of [1] and [2] will certainly try to achieve higher accuracy. But if, as a poor scholar, he does not realize that the theoretical limit of reliably achievable accuracy is 80%, he may claim that his classification model achieves 85% accuracy, construct a fake theoretical model, prepare a paper reporting 85% accuracy, and try to publish it. If, by chance, that journal happens to be a fake journal, the paper will be published with those hypothetical 85% results. He may even try another fake model and publish a second paper claiming 90% accuracy in the same journal. (If a researcher claims that his algorithm can identify or classify that dataset with 85% or 90% accuracy, the claim is necessarily fake: whenever the algorithm ‘identifies’ one of the indistinguishable records in that 20% as normal or malignant, the algorithm is not really working at all; it is simply classifying at random. A short sketch after this example illustrates the point.)
Now, if a genuine scholar does genuine research and, after much hard work, actually achieves 78.5% accuracy on this classification problem, that real work will never be appreciated or recognized by anyone, because of the previously published fake papers claiming 85% and 90% accuracy. The attained 78.5% accuracy may well be a real scientific accomplishment, but it will mistakenly be dismissed because of those earlier fake publications.
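To make the ‘random classification on indistinguishable records’ argument concrete, here is a minimal, hypothetical sketch in Python. The dataset, the single feature, and the threshold classifier are all invented for illustration and are not taken from the cited works [1] or [2]; the sketch only shows that on records whose attributes carry no class information, a classifier performs at chance level.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    y = rng.integers(0, 2, size=n)        # 0 = normal, 1 = malignant
    ambiguous = rng.random(n) < 0.20      # ~20% indistinguishable records

    # Clean records: the feature tracks the true class (separable).
    # Ambiguous records: the feature is the same noise for both classes.
    x = np.where(ambiguous,
                 rng.normal(0.5, 1.0, n),
                 y + rng.normal(0.0, 0.1, n))

    pred = (x > 0.5).astype(int)          # a simple threshold "classifier"

    clean_acc = (pred[~ambiguous] == y[~ambiguous]).mean()
    amb_acc = (pred[ambiguous] == y[ambiguous]).mean()
    print(f"accuracy on the clean 80% of records:     {clean_acc:.3f}")  # close to 1.0
    print(f"accuracy on the ambiguous 20% of records: {amb_acc:.3f}")    # close to 0.5 (chance)

Under these toy assumptions, whatever accuracy a model appears to gain on the ambiguous 20% of records comes from coin-flip luck rather than from the model itself, which is why a consistent claim of 85% or 90% on such a dataset deserves suspicion.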
The Historical “Fake Journal Detection” Experiments:
Here we present some of the experiments that clever people have carried out to identify and expose the real face of these fake journals.
Example 1: Experience of Mark Shrime, Harvard:
As a medical researcher at Harvard, Mark Shrime gets a very special kind of spam in his inbox: every day, he receives at least one request from an open-access medical journal promising to publish his research if he would only pay $500. Shrime decided to see how easy it would be to publish an article, so he made one up. He literally made one up, using www.randomtextgenerator.com. The article is entitled “Cuckoo for Cocoa Puffs?” and its authors are the venerable Pinkerton A. LeBrain and Orson Welles. The subtitle reads: “The surgical and neoplastic role of cacao extract in breakfast cereals.” Shrime submitted it to 37 journals over two weeks and, so far, 17 of them have accepted it. (They have not “published” it, but say they will as soon as Shrime pays the $500, often referred to as a “processing fee.” Shrime has no plans to pay them.)
Example 2: The Mazières and Kohler “Mailing List” Paper:
In 2005, computer scientists David Mazières and Eddie Kohler created a highly profane ten-page paper as a joke, to send in reply to unwanted conference invitations. It literally just contains a single seven-word phrase (a blunt demand to be taken off the sender's mailing list) repeated over and over, along with a nice flow chart and a scatter-plot graph built from the same seven-word phrase.
An Australian computer scientist named Peter Vamplew sent it to the International Journal of Advanced Computer Technology in response to spam from the journal. Apparently, he thought the editors might simply open and read it.
Instead, they automatically accepted the paper — with an anonymous reviewer rating it as “excellent” — and requested a fee of $150.
Example 3: The Maggie Simpson Paper:
A scientific study by Maggie Simpson, Edna Krabappel, and Kim Jong Fun has been accepted by two journals.
Of course, none of these fictional characters actually wrote the paper, titled “Fuzzy, Homogeneous Configurations.” Rather, it is nonsensical text submitted by engineer Alex Smolyanitsky in an effort to expose a pair of scientific journals: the Journal of Computational Intelligence and Electronic Systems and the Comic Sans-loving Aperito Journal of NanoScience Technology.
These outlets both belong to a world of predatory journals that spam thousands of scientists, offering to publish their work, whatever it is, for a fee, without actually conducting peer review. When Smolyanitsky was contacted by them, he submitted the paper, whose totally incoherent, science-sounding text was generated by SCIgen, a random text generator.