Journals are the essence of scholarly
communication. They not only serve to disseminate the latest scientific
advances but also provide a platform for archiving scholarly
information for future reference, and allow researchers to establish their
scientific mettle. Selecting the most suitable journal to showcase one's
scholarly work is no mean feat. With more than 43,000 biomedical
journals listed with PubMed [1], the database maintained by the United
States National Library of Medicine (NLM), this exercise can easily
flummox an inexperienced researcher. The substantial risk of rejection of a
paper from a journal that is not the right fit, and a widening web of
dubious and predatory journals that publish almost anything sent to
them, make this task particularly daunting.
Where to Publish?
The aim of clinical research is to bring about a
positive change in practice and policy so that mankind benefits
from the advances of medical science [2]. Therefore, unless the research
work gets published and reaches its target audience, the entire exercise
can be futile. Failure to choose the appropriate journal results in
rejection, wastage of precious time, and slow career progress for the
researcher. To facilitate the process of selecting the most appropriate
journal, we need to consider the following variables (Box 1):
BOX 1. Variables to Consider for Choosing
a Journal
• Focus/Scope
• Indexing status
• Impact factor
• Peer review
• Affiliation to scientific
societies
• Publication frequency
• Publication fees
• Accessibility
• Time to publication/Early online version
Focus: Every journal targets a certain audience
and has a certain focus. On the basis of their focus, journals can be
categorized as: broad-specialty vs specialty journals, pure
research vs applied science journals, qualitative research vs
quantitative research journals, veterinary (animal) science vs
human science journals, etc. Likewise, some journals may have a
more local and regional appeal, while others may have a more global
readership. Specialized journals, even with a potentially smaller
readership, may disseminate your work more efficiently to your desired
audience than a broad-specialty journal. It is important to remember
that we should not only be interested in getting our work published, but
also aim to get it noticed by the right audience. Therefore, it makes
sense to publish data pertaining to a regional community in a local
journal where it may influence the practice and policy rather than
publishing in a ‘reputed’ international journal that is seldom referred
to by the end users. For example, a research work evaluating the
predictors of mortality in children suffering from dengue fever in an
urban belt in India would be better appreciated and read in a journal
popular (and published) in India than in a foreign journal with very
limited circulation in the region where the work originated.
The focus of the journal is usually stated on the
journal’s home page under the heading ‘scope of the journal’ or in the
instructions to authors. A look at the recent issues of the journal will
also give you an idea of the journal’s area of focus. It is important to
ascertain the harmony between the theme of the manuscript and the focus
of the journal before submission, as a mismatch between the two is one
of the leading causes of outright rejection of a manuscript.
Indexing status: Indexing of a journal in a
citation database is the property by virtue of which articles
published in it become searchable in that database [3]. The content
published in the journal is indexed at the article level by assigning
keywords, and then making them searchable in the database. Other
bibliographic elements of journal articles, including authors’ names,
title of the article, journal name, and date of publication, are also
used for indexing.
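As a purely illustrative sketch of how article-level indexing makes content searchable (this is not the implementation of any particular database), the short Python example below builds a keyword index over two hypothetical article records; all titles, authors and keywords are invented:

from collections import defaultdict

# Hypothetical article records with assigned keywords and bibliographic elements.
articles = [
    {"id": 1, "title": "Dengue in urban India", "authors": ["Rao A"],
     "keywords": ["dengue", "mortality", "children"]},
    {"id": 2, "title": "Neonatal sepsis outcomes", "authors": ["Singh B"],
     "keywords": ["sepsis", "neonate", "mortality"]},
]

# Build an inverted index: each keyword or author name points to the IDs of the
# articles that carry it, which is what makes those articles retrievable by search.
index = defaultdict(set)
for article in articles:
    for term in article["keywords"] + article["authors"]:
        index[term.lower()].add(article["id"])

print(sorted(index["mortality"]))  # [1, 2] -> both articles are retrieved
print(sorted(index["dengue"]))     # [1]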
Index Medicus was the most widely accepted and
comprehensive database of biomedical journals from 1879 until 2004. With
the rapidly increasing number of journals, the printed publication Index
Medicus was replaced by its online version ‘Medline’ in 2005.
Among the major databases for biomedical journals, indexing by Medline
is considered a benchmark of high quality for a journal. Over the
years, other databases like Embase, Scopus, Science Citation Index,
Directory of Open Access Journals, and many regional databases have
emerged. Remember that Google, Google Scholar and Sherpa-Romeo are not
citation databases!
However, indexing of a journal comes with its own
problems. Inclusion of a journal in a reputed indexing database depends
on its scientific merit and rigorous publication policy and ethics, and
therefore not all journals get indexed. Several regional and national
journals, published in native languages, fail in their attempt to be
indexed in the international databases. We must remember that not all
research is relevant globally, and some may only be suited for
publication in a regional or national journal that may not be indexed.
Therefore, although important, indexing should not be used as the sole
criterion for choosing the journal.
Despite the limitations of indexing, it continues to be
a major tool for assessing the merit of scientific publications. The
recent Medical Council of India (MCI) guidelines recommend that
publications indexed in Scopus, PubMed, Medline, Embase/Excerpta Medica,
Index Medicus and Index Copernicus should be considered for promotions
of teaching faculty in medical colleges [4]. This has generated
considerable debate amongst the medical fraternity, as indexing databases
like Science Citation Index and IndMed have been overlooked, whereas a
database with questionable integrity – Index Copernicus – has been
included [4]. These MCI recommendations raise some important questions:
'Should an indexed journal be preferred over a non-indexed journal with
a high potential of influencing change in practice and policy?', 'Which
indexing databases are valid?', and 'Should publications be evaluated
for scientific merit by the indexing status of the journal rather than
by peer review?'
Impact Factor: Another parameter – the impact
factor (IF) – is often used as a proxy for the relative importance of a
journal within its field, and is frequently over-rated. The IF of a journal
is an annual measure of the extent to which articles published in that
journal are cited. It is assigned to journals indexed in the Science
Citation Index and is published annually in the Thomson Reuters Journal
Citation Reports [5]. However, the IF must be interpreted with caution,
as its calculation is prone to manipulation [6]. Editorial policies
such as preferential publication of review articles and articles dealing
with newer diagnostics and therapeutics, a short publication lag, and
excessive self-citation can magnify the IF. English-language journals
and basic science journals tend to have higher impact factors. Abuse of
the IF and the dominance of the prominent journals are a threat to the
smaller and non-English-language journals, and are akin to the 'Matthew
effect', whereby the rich get richer and the poor get poorer.
Interestingly, the IF is not available for all indexed journals, as not
all journals indexed in Medline are indexed in the Science Citation
Index [7]. Moreover, the IF only indicates the merit of the journal as a
whole, and not that of a particular article published in it.
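The standard two-year calculation can be written as follows (the citation and article counts here are hypothetical, for illustration only):

\[
\text{IF}_{2016} \;=\; \frac{\text{Citations received in 2016 to items published in 2014 and 2015}}{\text{Citable items published in 2014 and 2015}} \;=\; \frac{250}{125} \;=\; 2.0
\]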
BOX 2. Bibliometric Indices to Assess the
Importance of a Journal
Impact factor. The number of citations per
paper received by a journal in a particular year to the papers
published in that journal during the two preceding years.
Immediacy index. The average number of times
an article gets cited in the year it is published, and hence
indicates how quickly articles in a journal are cited.
Cited half-life of a journal. The median age
of the articles in the journal that were cited by other journals
during the year. It therefore reveals whether articles published
long ago in that journal are still being cited.
Eigenfactor score. This score evaluates
journals according to the number of incoming citations over the
preceding five years, with citations from highly ranked journals
weighted to make a larger contribution to the Eigenfactor score than
those from poorly ranked journals (Page rank algorithm).
SCImago Journal Rank (SJR). It is
a measure of scientific influence of scholarly journals that
accounts for both the number of citations received by a journal and
the importance or prestige of the journals where such citations come
from. Calculation of the SJR indicator is very similar to that of the
Eigenfactor score, with the former being based on the Scopus database
and the latter on the Web of Science database.
Altmetrics. A broad group of non-traditional metrics
covering other aspects of the impact of a work, such as how many
data and knowledge bases refer to it or cite it, article views (PDF or
HTML), downloads, and mentions in the media (journal comments,
science blogs, Wikipedia, Twitter, Facebook and other social media).
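A minimal Python sketch of how two of the metrics in Box 2 could be computed from per-article citation records; the records are hypothetical and the calculation is a simplified illustration, not the official Journal Citation Reports procedure:

from statistics import median

# Hypothetical records for one journal: the year each article was published and
# the number of citations it received during the reporting year (2016).
articles = [
    {"year": 2016, "citations_in_2016": 3},
    {"year": 2016, "citations_in_2016": 1},
    {"year": 2015, "citations_in_2016": 7},
    {"year": 2012, "citations_in_2016": 4},
    {"year": 2008, "citations_in_2016": 2},
]
REPORT_YEAR = 2016

# Immediacy index: average number of citations in the reporting year to the
# articles published in that same year.
same_year = [a for a in articles if a["year"] == REPORT_YEAR]
immediacy_index = sum(a["citations_in_2016"] for a in same_year) / len(same_year)

# Cited half-life (simplified): median age of the cited articles, counting each
# article once per citation it received during the reporting year.
ages = []
for a in articles:
    ages.extend([REPORT_YEAR - a["year"]] * a["citations_in_2016"])
cited_half_life = median(ages)

print(f"Immediacy index: {immediacy_index:.2f}")    # 2.00 for this toy data
print(f"Cited half-life: {cited_half_life} years")  # 1 year for this toy data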
Considering the potential problems in calculation of
IF, it may be advisable to explore certain other bibliometric indices (Box
2) like Immediacy index, Cited half-life, SCImago journal rank and
Eigenfactor score to compare journals [8]. Likewise, it is important to
remember that for evaluating a researcher’s academic merit, h-index,
i10-index, and citations (Box 3) are more relevant
indices than the above.
BOX 3. Bibliometric Indices to Assess the
Academic Contribution of a Researcher
h-index. The largest number h such
that an author’s ‘h’ publications have at least ‘h’ citations.
i10-index. It is the number of publications
with at least 10 citations.
Citations. The total number of citations to all
articles authored by the researcher.
Five-year citations. The number of citations received by an
author/article in the last five years.
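A minimal Python sketch of how the h-index and i10-index defined in Box 3 can be computed, assuming the author's citation counts are available as a simple list (the numbers are hypothetical):

from typing import List

def h_index(citations: List[int]) -> int:
    # Largest h such that the author has at least h papers with >= h citations each.
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations: List[int]) -> int:
    # Number of publications with at least 10 citations.
    return sum(1 for count in citations if count >= 10)

# Hypothetical citation counts for one author's publications.
cites = [25, 18, 12, 9, 6, 6, 3, 1, 0]
print(h_index(cites))    # 6 -> six papers have at least 6 citations each
print(i10_index(cites))  # 3 -> three papers have at least 10 citations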
Affiliation of the journal to prestigious
organizations: Membership of a journal's publisher in the Committee
on Publication Ethics (COPE) indicates that the journal follows the
essential norms of publication ethics. COPE is a platform for editors
and publishers of peer-reviewed journals to discuss and seek advice on
the ethical issues of publishing. Another indicator of journal
quality is its affiliation to the International Committee of Medical
Journal Editors (ICMJE), which also indicates that the journal
abides by the publication recommendations issued by that body. Listing
in the Directory of Open Access Journals (DOAJ) and membership of the
Open Access Scholarly Publishers Association also signify the credibility
of open access journals. Journals owned by reputed scientific societies
(academies) are perceived to be superior.
Peer review: The peer review process is a service
rendered by reviewers who provide honest and constructive criticism of
research work to assess its worthiness for publication in a journal.
Hence, the peer review process is a vital element of scholarly publishing,
and peer-reviewed journals are considered more reputable [9].
Reputation among colleagues: A simpler way to
assess the reputation of a journal is to ask your peers or mentors
about their choice of journals.
Accessibility: Journals which have both print and
online versions have easier accessibility and hence may be preferred.
Journals providing free online content are more accessible, especially
to readers from underprivileged settings. In addition, regional or
national journals with English-translated versions may be globally more
acceptable.
Time-to-print: Many journals, including Indian
Pediatrics, declare the date of initial submission and the date of
final acceptance at the time of final publication. Journals offering a
reasonable time frame for publication should be preferred, lest the
research becomes outdated. However, with rampant unethical publishing
practices, authors need to be cautious while choosing to publish in
journals offering fast-track publication as many of these may actually
be predatory (discussed later).
Format: To avoid outright rejection, one must
check whether the journal has a policy of accepting articles of the form
you are writing. This can be ascertained by reading the ‘instructions to
authors’ of the journal, as well as looking at the past issues of the
journal. It is important to ascertain whether the manuscript structure
(order of sections), reference style, figure formats and image
specifications match the journal's style before submission. In
case you are not clear on this aspect, you can verify it with the
editorial team by sending them an email (pre-submission enquiry). This
may be particularly relevant for review articles, as some journals
may solicit them from experts in the field.
Publication charges: In the traditional
publishing model, the access to published research work was controlled
by the publishers who charged libraries, institutions and individuals a
subscription fee and also a per-article fee. This subscription-based
model, in which free access is generally limited to copies self-archived
by authors, is referred to as the "green road" of publishing. It led to
frustration amongst individual researchers, as they could not afford the
hefty journal subscriptions, which witnessed a steady annual rise of
8-10%. However, 2002 saw the rise of the Open access (OA) movement in
scholarly publishing,
wherein users could download and read journal articles on the internet
without having to pay for it [10]. The publishers of the OA journals
recovered costs by charging the authors a publication fee. This model is
referred to as the "gold road" of publication. Since most health care
research globally is publicly funded, the OA model does seem fair in
allowing all researchers free access to research and in enabling
accelerated discovery and advancement in biomedical research. The OA model
also allows researchers to get more citations for their research.
However, the economic viability of this model is debatable. The
"author-pays" model is a major obstacle for researchers from developing
countries, who already struggle to secure institutional budgetary
allocations for research, and who therefore opt out of publishing in OA
journals unless granted a waiver of the author fees, which is usually
difficult to obtain. The publishers of OA journals also feel that author
fees alone cannot sustain the high publication costs. While most OA
journals do try to run a fair peer-review process that maintains high
academic standards, several OA journals have lately emerged that
compromise on the peer review process and the quality of papers, with the
aim of publishing more articles to generate more revenue from authors.
Author fees, lack of journal prestige, ethical concerns, and loss of
author copyright control are some of the major drawbacks of the OA
model, and force many authors to tread the green road of publication.
Where not to Publish?
Beware: Predatory journals on the prowl!
For sustained academic growth, ethical publishing is
a pre-requisite. With the plethora of biomedical journals to choose
from, authors need to be discerning more than ever before. While the OA
model in publishing fostered easy and free access to innovative
high-quality scholarly research, there was also a flip side to it [11].
Several poor-quality journals emerged on the internet that exploited the
OA model by offering fast-track publication of substandard research
without any peer review, in return for a nominal article processing fee
[12]. While some of these journals state the publication fee upfront,
most notify the author of the fee only after his/her
manuscript is accepted for publication. By then, much time and effort
has already gone into the review and revision of the article, and the author
has little option other than to pay the fee. These publishers lure naïve
researchers – mostly from the developing world – by sending them spam and
phishing emails inviting them to publish research work in their
journals. Their emails often display the phrase "CALL FOR PAPERS" in
capital and large fonts. Some of them even strategize to entice authors
by sending them personalized emails praising their recent scholarly work
and inviting them to submit similar research work. In order to promote
the credibility of their journals, these publishers request researchers
to join their editorial teams as members and editors. Their emails often
contain several spelling and grammatical errors.
This unabashed and unethical soliciting of authors by publishers was
first pointed out by Jeffrey Beall, an academic librarian from Colorado,
USA, who christened them 'predatory publishers' in his blog in 2010. In
2011, Beall published a list containing the names of 18 predatory
publishers [13], now known as Beall's list. In 2012, Beall shifted his
blog to a WordPress platform and named it Scholarly Open Access (found at
http://scholarlyoa.com). In his blog, he updated the lists of
'Predatory Publishers' and 'Predatory Journals' annually to caution
inexperienced researchers and authors. Beall noticed that these
publishers had certain common traits, and postulated criteria in his
blog to help identify them. Such publishers usually did not state the
location of their headquarters and their websites were of poor quality,
replete with typographical and grammatical errors. Papers in the
publisher’s journals were not only of inferior academic standards, but
were also poorly copy-edited. These publishers had a large portfolio of
journal titles, most of them launched recently and carrying scant
content. The publishers' emails used freely available domains like
Gmail, Hotmail, Yahoo mail, etc. Beall also noticed that these
journals had little geographical diversity among editorial board members
as well as authors, and the editorial board member list exhibited a male
preponderance. The PDF files of papers published in these journals were
locked to prevent them from being vetted for authenticity, and the
publisher deliberately prevented the content from being indexed in
academic indices. In addition, most of these journals adopted a
nomenclature to closely mimic a reputed journal in the field, to beguile
inexperienced authors. He cautioned against journals with ambitious
titles containing terms like "Innovative", "World", "International",
"Global", "European", and "Euro-Asian". Predatory Publishers have been
known to make bogus claims about their indexing status and high impact
factors of their journals [14,15]. They also seek the assistance of
companies which claim to provide valid scholarly metrics such as
CiteFactor, Advanced Science Index, General Impact Factors, Global
Impact Factors and Science Impact Factor. These bibliometric indices
have been described by Beall as ‘Misleading metrics’ as their
calculations are non-transparent, unscientific and manipulated. These
companies charge the publishers and assign them indices which increase
with time, in an attempt to mesmerize naïve researchers. Since 2011,
there has been a deluge of predatory publishers, with 923 predatory
publishers (publishing thousands of predatory journals) and 882
stand-alone predatory journals being listed in 2016 [16].
Beall also described the concept of 'hijacked journals',
wherein a publisher creates a website that falsely
claims to be the website of an authentic scholarly journal and even
provides links to the bibliographic content of the authentic journal.
Such publishers then invite manuscript submissions for the hijacked
version of the journal and make away with the article submission and
processing fees. The 2016 Beall's list mentions 101 hijacked journals,
compared with 30 such journals listed in 2015.
We advise authors to ascertain the credibility of
journals by checking their credentials, indexing status, open access and
archiving options, affiliation to scientific societies, the reputation
of the publisher, and Beall's list. The last is particularly
important, as any association with predatory journals is now viewed
negatively by the scientific world.
With the groundwork for publication done, from the next
issue we will move on to the 'how' of writing a paper for a scientific
journal.
References
1. U.S. National Library of Medicine. List of all
journals cited in PubMed®. Available from:
https://www.nlm.nih.gov/bsd/serfile_addedinfo.html. Accessed January
9, 2016.
2. Sachdev HPS, Ramji S. Why do we write? Indian
Pediatr. 2016;53:45-6.
3. Ng KH, Peh WC. Getting to know journal
bibliographic databases. Singapore Med J. 2010;51:757-60; quiz 761.
4. Aggarwal R, Gogtay N, Kumar R, Sahni P, for the
Indian Association of Medical Journal Editors. The revised guidelines of
the Medical Council of India for academic promotions: Need for a
rethink. Indian Pediatr. 2016;53:23-6.
5. Garfield E. The history and meaning of the journal
impact factor. JAMA. 2006;295:90-3.
6. Lippi G, Favaloro EJ, Simundic AM. Biomedical
research platforms and their influence on article submissions and
journal rankings: an update. Biochem Med (Zagreb). 2012;22:7-14.
7. Elsaie ML, Kammer J. Impactitis: The impact factor
myth syndrome. Indian J Dermatol. 2009;54:83-5.
8. Brown T. Journal quality metrics: Options to
consider other than impact factors. Am J Occup Ther. 2011;65:346-50.
9. Hojat M, Gonnella JS, Caelleigh AS. Impartial
judgment by the "gatekeepers" of science: fallibility and accountability
in the peer review process. Adv Health Sci Educ Theory Pract.
2003;8:75-96.
10. Albert KM. Open access: implications for
scholarly publishing and medical libraries. J Med Libr Assoc.
2006;94:253-62.
11. Bowman JD. Predatory publishing, questionable
peer review, and fraudulent conferences. Am J Pharm Educ. 2014;78:176.
12. Beall J. Predatory publishers are corrupting open
access. Nature. 2012;489:179.
13. Beall J. Medical publishing triage - chronicling
predatory open access publishers. Ann Med Surg (Lond). 2013;2:47-9.
14. Gasparyan AY, Yessirkepov M, Diyanova SN, Kitas
GD. Publishing ethics and predatory practices: A dilemma for all
stakeholders of science communication. J Korean Med Sci. 2015;30:1010-6.
15. Butler D. Investigating journals: The dark side
of publishing. Nature. 2013;495:433-5.
16. Beall J. Beall’s List of Predatory Publishers
2016. Available from:
http://scholarlyoa.com/2016/01/05/bealls-list-of-predatory-publishers-2016/.
Accessed January 9, 2016.