Research Analysis & Evaluation

International Double-Blind Peer-Reviewed, Refereed, Multilingual, Multidisciplinary & Indexed Monthly Research Journal

  • ISSN (E): 2320-5482
  • RNI: RAJBIL2009/30097
  • Impact Factor: 6.376 (SJIF)

A-215, Moti Nagar, Street No. 7, Queens Road, Jaipur-302021, Rajasthan.

About this Journal

Practical Assessment, Research, and Evaluation (PARE) is an online journal providing access to refereed articles that can have a positive impact on assessment, research, and evaluation. Manuscripts published in PARE are scholarly syntheses of research and ideas about methodological issues and practices, and offer unique contributions to the literature. PARE is intended to help members of the community keep up-to-date with effective methods, trends, and research developments from a variety of settings.

PARE is also committed to highlighting research that focuses on equity in assessment and measurement. To that end, we encourage the submission of manuscripts that address issues of equity and fairness – particularly articles that focus on the assessment experiences and outcomes for historically marginalized populations.

Volume 29 • 2024

Clearer Analysis, Interpretation, and Communication in Organizational Research: A Bayesian Guide

Karyssa A. Courey, Frederick L. Oswald and Steven A. Culpepper

2024-01-18 Volume 29 • 2024 • Article 1

Using Cliff’s Delta as a Non-Parametric Effect Size Measure: An Accessible Web App and R Tutorial

Kane Meissel and Esther S. Yao

2024-01-22 Volume 29 • 2024 • Article 2

Are Exams Authentic Assessment? The Case of Economics

Petar Stankov

2024-01-29 Volume 29 • 2024 • Article 3

Statistical Analyses with Sampling Weights in Large-Scale Assessment and Survey

2024-02-27 Volume 29 • 2024 • Article 4

Impacts of Differences in Group Abilities and Anchor Test Features on Three Non-IRT Test Equating Methods

Inga Laukaityte and Marie Wiberg

2024-03-08 Volume 29 • 2024 • Article 5

A Practical Comparison of Decision Consistency Estimates

Amanda Wolkowitz and Russell Smith

2024-03-12 Volume 29 • 2024 • Article 6

Transforming Assessments of Clinician Knowledge: A Randomized Controlled Trial Comparing Traditional Standardized and Longitudinal Assessment Modalities

Shahid A. Choudhry, Timothy J. Muckle, Christopher J. Gill, Rajat Chadha, Magnus Urosev, Matt Ferris and John C. Preston

2024-03-26 Volume 29 • 2024 • Article 7

Frequentist and Bayesian Factorial Invariance using R

Teck Kiang Tan

2024-04-02 Volume 29 • 2024 • Article 8

A Framework of Caring Assessments for Diverse Learners

Blair Lehman, Jesse R. Sparks, Diego Zapata-Rivera, Jonathan Steinberg and Carol Forsyth

2024-06-14 Volume 29 • 2024 • Article 9

Assessing Students’ Application Skills Through Contextualized Tasks: Toward a More Comprehensive Framework for Embedding Test Questions in Context

Filio Constantinou

2024-06-19 Volume 29 • 2024 • Article 10

Discovering Educational Data Mining: An Introduction

Zachary Collier, Joshua Sukumar and Roghayeh Barmaki

2024-07-23 Volume 29 • 2024 • Article 11

What's in a School Grade? Examining How School Demographics Predict School Letter Grades

Margarita Pivovarova, Audrey Amrein-Beardsley and Tray Geiger

2024-08-20 Volume 29 • 2024 • Article 12

Response Process Evidence for Academic Assessments of Students with Significant Cognitive Disabilities

Meagan Karvonen, Russell Swinburne Romine and Amy Clark

2024-08-26 Volume 29 • 2024 • Article 13

From Investigating the Alignment of A Priori Item Characteristics Based on the CTT and Four-Parameter Logistic (4-PL) IRT Models to Further Exploring the Comparability of the Two Models

Agus Santoso, Heri Retnawati, Timbul Pardede, Ezi Apino, Ibnu Rafi, Munaya Rosyada, Gulzhaina Kassymova and Xu Wenxin

2024-11-11 Volume 29 • 2024 • Article 14

The Circle of Methods for Evaluating Latent Variable Measurement Models: EFA, CFA, and ESEM

Kelvin Tosin Afolabi and Timothy Konold

2024-11-13 Volume 29 • 2024 • Article 15

Most Popular Articles

A Guide to Writing the Dissertation Literature Review

Justus Randolph

Best Practices in Exploratory Factor Analysis: Four Recommendations for Getting the Most from Your Analysis

Anna B. Costello and Jason Osborne

Defining and Measuring Academic Success

Travis T. York, Charles Gibson and Susan Rankin

Evaluating Research Impact: A Comprehensive Overview of Metrics and Online Databases

  • Conference paper
  • First Online: 20 December 2023

Seema Ukidve, Ramsagar Yadav, Mukhdeep Singh Manshahia & Jasleen Randhawa

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 855)

Included in the following conference series:

  • International Conference on Intelligent Computing & Optimization

Abstract

The purpose of this research paper is to analyze and compare the various research metrics and online databases used to evaluate the impact and quality of scientific publications. The study focuses on the most widely used research metrics, such as the h-index, the Impact Factor (IF), and the number of citations. Additionally, the paper explores various online databases, such as Web of Science, Scopus, and Google Scholar, that are utilized to access and analyze research metrics. The study found that the h-index and IF are the most commonly used metrics for evaluating the impact of a publication. However, it was also found that these metrics have limitations and cannot be used as the sole criteria for evaluating the quality of research. The study also highlights the need for a comprehensive and holistic approach to research evaluation that takes into account multiple factors such as collaboration, interdisciplinary work, and societal impact. The analysis of online databases showed that while Web of Science and Scopus are considered to be the most reliable sources of research metrics, they may not cover all relevant publications, particularly those in less well-established or interdisciplinary fields. Google Scholar, on the other hand, is more inclusive but may not have the same level of accuracy and reliability as the other databases.
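The two headline metrics discussed in the abstract have simple operational definitions: Hirsch's h-index is the largest h such that h of an author's papers have at least h citations each, and the two-year Journal Impact Factor is the number of citations received in a year to items published in the previous two years, divided by the number of citable items from those years. As a minimal illustration only (not code from the paper; the function names and example figures below are hypothetical), a Python sketch:

    # Minimal sketch of the two metrics discussed in the abstract.
    # Function names and example numbers are illustrative, not from the paper.

    def h_index(citation_counts):
        # Largest h such that h papers each have at least h citations (Hirsch, 2005).
        ranked = sorted(citation_counts, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    def two_year_impact_factor(citations_this_year, citable_items_prev_two_years):
        # Citations in year Y to items published in Y-1 and Y-2, divided by the item count.
        return citations_this_year / citable_items_prev_two_years

    # Hypothetical example: one author with seven papers, one small journal.
    print(h_index([25, 8, 5, 3, 3, 1, 0]))            # -> 3
    print(round(two_year_impact_factor(312, 49), 3))  # -> 6.367

Both quantities are easy to compute from citation data, which is part of their appeal; the paper's point is that neither should serve as the sole criterion of research quality.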

References

Hirsch, J.E.: An index to quantify an individual's scientific research output. Proc. Natl. Acad. Sci. USA 102(46), 16569–16572 (2005)

Batista, P.D., Campiteli, M.G., Kinouchi, O.: Is it possible to compare researchers with different scientific interests? An analysis of the h-index. Scientometrics 68(3), 179–189 (2006)

Egghe, L.: Theory and practise of the g-index. Scientometrics 69(1), 131–152 (2006)

Radicchi, F., Fortunato, S., Castellano, C.: Universality of citation distributions: toward an objective measure of scientific impact. Proc. Natl. Acad. Sci. USA 105(45), 17268–17272 (2008)

Leydesdorff, L.: Mapping the global development of science by means of publication indicators: a study of Science Citation Index Expanded and Social Sciences Citation Index. J. Am. Soc. Inf. Sci. Technol. 61(7), 1386–1403 (2010)

Scopus (n.d.). https://www.elsevier.com/solutions/scopus

Web of Science (n.d.). https://clarivate.com/webofsciencegroup/

Google Scholar (n.d.). https://scholar.google.com/

Mendeley (n.d.). https://www.mendeley.com/

arXiv (n.d.). https://arxiv.org/

Bollen, J., Van de Sompel, H., Hagberg, A., Chute, R.: A principal component analysis of 39 scientific impact measures. PLoS ONE 4(6), e6022 (2009)

Bornmann, L., Leydesdorff, L.: What do citation counts measure? A review of studies on citing behavior. J. Document. 64(1), 45–80 (2008)

Garfield, E.: Citation analysis as a tool in journal evaluation. Science 214(4520), 671–681 (1979)

Hirsch, J.E.: Does the Hirsch index have predictive power? arXiv preprint arXiv:0707.3168 (2007)

Garfield, E.: Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. Wiley, New York (1995)

Ioannidis, J.P.: Why most published research findings are false. PLoS Med. 2(8), e124 (2005)

Radicchi, F., Fortunato, S., Castellano, C.: Diffusion of scientific credits and the ranking of scientists. Phys. Rev. E 80(5), 056103 (2009)

Schreiber, M., Glassey, J.: A critical examination of the h-index in comparison with traditional indices and with peer judgement. Scientometrics 71(2), 317–331 (2007)

Van Raan, A.F.J.: Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. J. Am. Soc. Inform. Sci. Technol. 57(8), 1060–1071 (2006)

Waltman, L., van Eck, N.J.: A comparative study of four citation-based indices for ranking journals. J. Inform. 4(2), 146–157 (2010)

Acknowledgements

The authors are grateful to Punjabi University, Patiala, for providing library and internet facilities.

Author information

Authors and Affiliations

Department of Mathematics, L. S. Raheja College of Arts and Commerce, Santacruz (W), Maharashtra, India

Seema Ukidve

Department of Mathematics, Punjabi University Patiala, Patiala, Punjab, India

Ramsagar Yadav & Mukhdeep Singh Manshahia

Panjab University Chandigarh, Chandigarh, India

Jasleen Randhawa

Corresponding author

Correspondence to Ramsagar Yadav.

Editor information

Editors and Affiliations

Modeling Evolutionary Algorithms Simulation and Artificial Intelligence, Faculty of Electrical and Electronics Engineering, Ton Duc Thang University, Ho Chi Minh City, Vietnam

Pandian Vasant

Department of Computer Science, Chittagong University of Engineering & Technology, Chittagong, Bangladesh

Mohammad Shamsul Arefin

Laboratory of Non-traditional Energy Systems, Federal Scientific Agroengineering Center VIM; Department of Theoretical and Applied Mechanics, Russian University of Transport, 127994 Moscow, Russia

Vladimir Panchenko

Department of Computer Science, UOW Malaysia KDU Penang University College, George Town, Malaysia

J. Joshua Thomas

Northwest University, Mmabatho, South Africa

Elias Munapo

Faculty of Engineering Management, Poznań University of Technology, Poznan, Poland

Gerhard-Wilhelm Weber

Facultad de Ciencias Económicas y Empresariales, Universidad Panamericana, Mexico City, Mexico

Roman Rodriguez-Aguilar

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ukidve, S., Yadav, R., Manshahia, M.S., Randhawa, J. (2023). Evaluating Research Impact: A Comprehensive Overview of Metrics and Online Databases. In: Vasant, P., et al. Intelligent Computing and Optimization. ICO 2023. Lecture Notes in Networks and Systems, vol 855. Springer, Cham. https://doi.org/10.1007/978-3-031-50158-6_24

DOI: https://doi.org/10.1007/978-3-031-50158-6_24

Published: 20 December 2023

Publisher Name: Springer, Cham

Print ISBN: 978-3-031-50157-9

Online ISBN: 978-3-031-50158-6

eBook Packages: Intelligent Technologies and Robotics; Intelligent Technologies and Robotics (R0)
