EDITORIAL
Year : 2019  |  Volume : 7  |  Issue : 3  |  Page : 85-86

How to assess quality of journals and researches (Part II)


Barun K Nayak
Department of Ophthalmology, P D Hinduja National Hospital and Medical Research Centre, Mumbai, Maharashtra, India

Date of Web Publication: 11-Dec-2019

Correspondence Address:
Barun K Nayak
Department of Ophthalmology, P D Hinduja National Hospital and Medical Research Centre, Mumbai, Maharashtra
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jcor.jcor_98_19


How to cite this article:
Nayak BK. How to assess quality of journals and researches (Part II). J Clin Ophthalmol Res 2019;7:85-6




I discussed the impact factor (IF) in the previous issue as Part I of this editorial.[1] In the present editorial, I discuss some of the alternative metrics available for judging the quality of journals and researchers. IF is based only on data from Web of Science; hence, a large chunk of data is left out, such as unpublished research, other data sets, presentations, hyperlinks, and web pages. In the United States of America, only 15%–20% of authors have published articles that were referenced by others.[2] The skewness of citation data is apparent from the fact that only 20% of the articles published in 2013 and 2014 in the Journal of Informetrics contributed 55% of the citations counted toward its 2015 IF.[3] IF probably reflects impact mainly on the research community, not on clinicians, government funding agencies, and the general public. In developing countries like India, this is compounded by the fact that full-time researchers are sparse, and the majority are clinicians performing research as a secondary responsibility. Researchers from these countries do not easily find a place in journals with a high IF and do not get due recognition. Hence, there is a need to explore other methods of evaluating the value of publications.

Elsevier has its database in Scopus.[4] Some of its indices are impact per publication (IPP), source normalized impact per paper (SNIP), and SCImago Journal Rank (SJR).[5] The IPP of a journal is the average number of citations received in a year by all the articles published in that journal in the three preceding years. SNIP is the ratio of the IPP (numerator) to the citation potential of the database (denominator). SJR is essentially similar to IF, but each citation is weighted by the value of the citing journal. Google Scholar is another huge database; it calculates the h-index, which can be journal based or individual researcher based.[6] It is calculated from the total number of papers published and the number of citations each paper receives: a researcher has an h-index of h if h of his or her papers have been cited at least h times each. To remove bias due to excessive self-citation, h-index calculations are available both with all citations included and after removing all self-citations. All these metrics are citation based and carry limitations similar to those of IF, in varying degrees.
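As a concrete illustration, here is a minimal Python sketch of two of these calculations. The citation counts, paper counts, and the citation-potential value used for SNIP are all invented for the example; they are not drawn from Scopus or Google Scholar.

```python
# Minimal sketch of two citation-based metrics described above,
# using made-up numbers rather than any real database.

def h_index(citations):
    """h-index: the largest h such that at least h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def ipp(citations_this_year, papers_previous_three_years):
    """Impact per publication: citations received this year to items
    published in the three preceding years, divided by the item count."""
    return citations_this_year / papers_previous_three_years

# Example: a researcher with six papers cited 10, 8, 5, 4, 2, and 1 times
print(h_index([10, 8, 5, 4, 2, 1]))   # -> 4

# SNIP is IPP divided by the database citation potential of the field;
# 1.25 here is a hypothetical citation potential for illustration only.
snip = ipp(300, 120) / 1.25
print(round(snip, 2))                  # -> 2.0
```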

It is now realized that the research spectrum has expanded tremendously in the last couple of years, but the measurement of impact has remained stationary. Researchers, research administrators, funding agencies, corporate R&D segments, and publishers have different requirements for assessing research, but there is no common source from which all of them can extract what is relevant to their needs. Activity on social media is increasing in the research community as well. ResearchGate is becoming quite popular owing to its ease of navigation and simplicity, and it does not require review or fees. It is influential in promoting innovation in developing countries and helps connect scientists in those countries with their peers. In one study of 160 individuals at Delhi University who were using social media as a research tool, ResearchGate was found to be the most popular.[7] The ResearchGate score of a researcher measures scientific reputation based on how the work is received by peers, taking into consideration all possible sources such as published articles, unpublished articles, projects, and the questions and answers discussed by that researcher.[8]

Academia.edu is a useful site that provides a platform for academics to share their research papers.[9] It helps in the social networking of academics, with a mission to accelerate the world's research. It also provides analytics about the impact of the researcher. "Project Cupcake" is another emerging concept that provides not just a single metric but much of the information a researcher would like to know.[10] It can provide additional information about how a journal handles articles after submission: How many rounds does it take to get a decision? What is the acceptance rate? What is the time to rejection? It can also provide technical information, such as the quality of typesetting.
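By way of illustration only, the following Python sketch computes two such process metrics, acceptance rate and days to decision, from an invented submission log; the data and layout are hypothetical and are not drawn from Project Cupcake itself.

```python
# Hypothetical submission log: (submitted, decided, accepted?)
from datetime import date
from statistics import median

submissions = [
    (date(2019, 1, 5),  date(2019, 3, 1),  True),
    (date(2019, 1, 20), date(2019, 2, 2),  False),
    (date(2019, 2, 10), date(2019, 5, 15), True),
    (date(2019, 3, 3),  date(2019, 3, 20), False),
]

# Acceptance rate: accepted manuscripts over all decided manuscripts.
acceptance_rate = sum(acc for *_, acc in submissions) / len(submissions)

# Turnaround times in days, overall and for rejections only.
days_to_decision = [(dec - sub).days for sub, dec, _ in submissions]
days_to_rejection = [(dec - sub).days for sub, dec, acc in submissions if not acc]

print(f"acceptance rate: {acceptance_rate:.0%}")               # -> 50%
print(f"median days to decision: {median(days_to_decision)}")  # -> 36.0
print(f"median days to rejection: {median(days_to_rejection)}")  # -> 15.0
```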

One should also understand that a biomedical database can contain a bibliographic database, a citation database, and full text, either all together or in varying combinations. There are many alternatives that go beyond citations; "CiteScore," "Dimensions," and "Altmetrics" are some of the important examples. CiteScore indicates the average citations received per document published in the three previous years in a title, which can be a peer-reviewed journal, conference proceedings, a book series, or a trade journal.[11] It is transparent, comprehensive, and free, and it draws its data from Scopus, the largest database of peer-reviewed literature. Suppose 'x' citations were received in the year 2018 by all the articles published in 2015, 2016, and 2017, and the total number of articles published in 2015, 2016, and 2017 was 'y'; the CiteScore for 2018 would then be x/y.
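The same arithmetic, restated as a short Python sketch with hypothetical citation and document counts (the real figures come from Scopus):

```python
# CiteScore 2018 = citations received in 2018 to documents published
# in 2015-2017, divided by the number of those documents.
# All counts below are invented for illustration.

citations_2018 = {2015: 410, 2016: 520, 2017: 270}   # citations received in 2018
documents = {2015: 150, 2016: 160, 2017: 140}        # documents published per year

x = sum(citations_2018.values())   # all 2018 citations to 2015-2017 documents
y = sum(documents.values())        # all documents published in 2015-2017
cite_score_2018 = x / y
print(f"CiteScore 2018 = {x}/{y} = {cite_score_2018:.2f}")   # -> 1200/450 = 2.67
```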

Digital Science has realized the need for a comprehensive and flexible dataset for everyone concerned in the field of research. Six Digital Science portfolio companies decided to create a common pool named "Dimensions": ReadCube, Altmetric, Figshare, Symplectic, DS Consultancy, and ÜberResearch. This new platform is easy to use and intuitive, and it combines data that are invaluable for all groups of people involved in research. Dimensions can also map connections between clinical trials, publications, grants, policy documents, and patents. It forms a state-of-the-art platform built around the needs of research organizations, researchers, funders, and publishers.[12] It also removes barriers to access, even to siloed data, at a low cost, and it provides avenues for developing new metrics.

Altmetrics is another alternative, providing qualitative data about publications; it is not a replacement for, but complementary to, citation-based metrics.[13] Altmetrics incorporate data from multiple sources, such as peer reviews on Faculty of 1000, research blogs, discussions, media coverage, citations in public policy documents, citations on Wikipedia, mentions on Twitter and other social networks, and bookmarks on reference managers like Mendeley. Altmetrics are not a single class of indicator; they include a record of attention as well as a measure of dissemination, and they also indicate influence and impact. Altmetrics have several advantages over citation-based metrics: they accumulate more quickly and capture more diverse impact, and they are not limited to journal articles and books but also cover activity in other areas such as social media, discussions, comments, and policy documents. Altmetrics providers have also taken measures to prevent the gaming that is possible with IF.
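For readers who wish to experiment, Altmetric offers a free details-page API keyed by DOI. The Python sketch below queries it for one article; the exact field names used (score, cited_by_tweeters_count, and so on) are assumptions based on the API's public documentation and should be verified against the current version before use.

```python
# Hedged sketch: fetch one article's altmetric record from the free
# Altmetric details-page API (https://api.altmetric.com/).
import json
import urllib.error
import urllib.request

def altmetric_summary(doi):
    """Return a small summary dict for a DOI, or None if untracked."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError:   # 404 means no attention recorded yet
        return None
    return {
        # Field names are assumptions from the published API docs:
        "score": data.get("score"),                     # weighted attention score
        "tweets": data.get("cited_by_tweeters_count", 0),
        "news": data.get("cited_by_msm_count", 0),      # mainstream media mentions
        "mendeley_readers": data.get("readers", {}).get("mendeley", 0),
    }

print(altmetric_summary("10.1038/nature12373"))   # substitute any DOI of interest
```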

It is clear from the foregoing discussion that there is no "one size fits all," and some of these metrics are complementary to each other. A fuller description is beyond the scope of this editorial; readers who want detailed information, a body of knowledge that is going to expand exponentially in the future, are requested to consult suitable sources. The purpose of this editorial is to introduce the concept of alternative metrics beyond IF. With this information, researchers can choose the appropriate metrics based on their requirements and resources.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Nayak BK. How to assess quality of journals and researches (Part I). J Clin Ophthalmol Res 2019;7:35-6.
2. Waltman L, Traag VA. Use of the journal impact factor for assessing individual articles need not be wrong. In: Computer Science: Digital Libraries.
3. Priem J, Piwowar HA, Hemminger BM. Altmetrics in the wild: Using social media to explore scholarly impact. In: Computer Science: Digital Libraries.
4. Scopus: Content Coverage Guide. Available from: https://www.elsevier.com/__data/assets/pdf_file/0007/69451/0597-Scopus-Content-Coverage-Guide-US-LETTER-v4-HI-singles-no-ticks.pdf. [Last accessed on 2019 Nov 20].
5. Scholarly Publishing Resources for Faculty: Scopus Metrics (CiteScore, SNIP & SJR, h-Index). Available from: https://liu.cwp.libguides.com/c.php?g=45770&P=4417804. [Last accessed on 2019 Nov 20].
6. Google Scholar. Available from: https://library.ucar.edu/finding-your-h-index-hirsch-index-google-scholar. [Last accessed on 2019 Nov 20].
7. Margam M. Use of social networking sites by research scholars of the University of Delhi: A study. doi: 10.1080/10572317.2012.10762919.
8. ResearchGate: RG Score H-Index. Available from: https://explore.researchgate.net/display/support/RG+Score. [Last accessed on 2019 Nov 20].
9. Szkolar D. Social Networking for Academics and Scholars; 21 June, 2012. Available from: https://ischool.syr.edu/infospace/2012/06/21/social-networking-for-academics-and-scholars. [Last accessed on 2019 Nov 20].
10. Project Cupcake: Designing a New Type of Journal Metric. Available from: https://scholarlykitchen.sspnet.org/2017/05/22/project-cupcake-designing-new-type-journal-metric/. [Last accessed on 2019 Nov 20].
11. CiteScore: A New Metric to Help you Track Journal Performance and Make Decisions. Available from: https://www.elsevier.com/editors-update/story/journal-metrics/citescore-a-new-metric-to-help-you-choose-the-right-journal. [Last accessed on 2019 Nov 20].
12. Dimensions: Re-Imagining Discovery and Access to Research. Available from: https://www.digital-science.com/products/dimensions/. [Last accessed on 2019 Nov 20].
13. Measuring Research Impact: Altmetrics. Available from: https://guides.lib.berkeley.edu/researchimpact/altmetrics/. [Last accessed on 2019 Nov 20].




 
