
Evaluating publication activities - bibliometrics

Research activity is constantly being evaluated, by both outside and in-house experts. An individual researcher or research group may evaluate the quality of their own findings against the research within their discipline, and will be interested, for example, in the level of research produced by peers. The focus of an evaluation can be a single researcher, a research group or institution, or the research activity of a whole country. In Norway, the volume of publications forms part of the basis for allocating basic funding to the universities.


Bibliometrics is the quantitative analysis of written documents. It is frequently used to analyze scientific and scholarly publications. Bibliometric analyses require a bibliometric data source. The two most commonly used bibliometric data sources are Web of Science, produced by Clarivate Analytics, and Scopus, produced by Elsevier. Google Scholar is also used quite frequently.

Bibliometrics offers valuable information about the publishing activity. However, this information should be used responsibly:

  • Bibliometric indicators provide approximate information. They do not offer an exact measurement of a concept of interest.
  • Bibliometric information can be seen as one element within a broader range of information sources available to support research management and research evaluation.
  • Be aware of the limited coverage of data sources such as Scopus and Web of Science.
  • In a research evaluation context, the units being evaluated should be able to verify the data and analyses.
  • Bibliometric analyses reflect only some aspects of a research unit's activity.
  • Take into account the differences between scientific fields in publishing, authorship and citation practices.
  • Bibliometric analyses require a careful balance between transparency (ensuring that the analysis is interpreted correctly) and analytical refinement (providing insight that is difficult to obtain by means of simpler bibliometric approaches).
  • Large-scale analyses at OsloMet can be based on Scopus and performed using SciVal.

A bibliometric analysis can yield different types of information. Important types of information include:

  • Scientific output. Information about the number of publications produced by a research unit.
  • Scientific impact. Information about the number of citations that publications have received.
  • Scientific collaboration. Information about co-authored publications, focusing for instance on national and international collaboration or on university-industry collaboration.
  • Mobility. Information about researchers who change their affiliation.
  • Interdisciplinarity. Information about the interdisciplinarity of publications, usually based on the fields that are cited by a publication.
  • Gender. Information about the gender of the authors of publications.
  • Open access publishing. Information about the open access status of publications, distinguishing for instance between gold open access, green open access, and no open access.

SciVal

SciVal is a bibliometric tool that can be used to analyze and visualize the research positioning of individuals, groups and institutions. As a researcher, you can use SciVal, for example, to create evaluation reports, develop a publication strategy, and find new collaborators.

SciVal uses the Scopus database and provides publication information and metrics for over 10,000 research institutions, as well as for countries and individual researchers.

You can define your own publication sets, groups or research areas, for example by using keywords or by importing publications from Scopus. You can also collect and group the names of colleagues or collaborators to analyze group performance.

Read SciVal at a glance for a quick overview of how it works and access SciVal to explore the tool.

[Image: model with five divisions]

Evaluation Process

SCOPE is a five-step process that enables good and responsible evaluations. The main elements of the five steps are:

START with what you value
  • Not with what others value (external drivers)
  • Not with available data sources (the ‘Streetlight Effect’)
CONTEXT considerations
  • WHO are you evaluating? (Entity size)
  • WHY are you evaluating?
  • Do you need to evaluate at all?
OPTIONS for evaluating
  • Consider both quantitative and qualitative options
  • Be careful when using quantities to indicate qualities
  • Evaluate with the evaluated
PROBE deeply
  • WHO might your evaluation approach discriminate against?
  • HOW might your evaluation approach be gamed?
  • WHAT might the unintended consequences be?
  • Does the cost outweigh the benefit?
EVALUATE your evaluation
  • Did your evaluation achieve its aims?
  • Was it formative as well as summative?
  • Keep your approach under review

See more about SCOPE on the website for The INORMS Research Evaluation Working Group.

The Declaration on Research Assessment (DORA) 

OsloMet has signed The Declaration on Research Assessment (DORA). DORA recognizes the need to improve the ways in which the outputs of scholarly research are evaluated. Its objectives are:

  • to call attention to new tools and processes in research assessment and the responsible use of metrics that align with core academic values and promote consistency and transparency in decision-making
  • to aid development of new policies and practices for hiring, promotion, and funding decisions
  • to call for broader representation of researchers in the design of research assessment practices that directly address the structural inequalities in academia

Researchers

Identifying key performance parameters for active researchers has always been problematic. Evaluating and comparing researchers working in a given field is nevertheless necessary, since they compete for the same limited resources, promotions, awards and scholarships in scientific academies. Whatever method is used to assess the value of a researcher's individual contributions, it should be simple, fair and transparent.

Citation Indices
  • Citation Mapping via Web of Science
  • Co-Author Visualizer via Scopus
  • Google Scholar
  • H-index
  • Alternative indices
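Among the indices above, the h-index has a particularly simple definition: a researcher has index h if h of their publications have received at least h citations each. A minimal sketch in Python (the citation counts are invented for illustration):

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Five papers with 10, 8, 5, 4 and 3 citations: four papers have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note that the h-index depends entirely on which data source supplies the citation counts, so the same researcher will typically have different values in Web of Science, Scopus and Google Scholar.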

Journals

Journal metrics measure the influence of academic journals. Common to all of these metrics is that they aim to rank journals and give insight into their importance based on citation analysis. Different measurement systems use different methods and data sources, so several perspectives may be offered on the same academic publishing landscape.

Web of Science
  • Journal Citation Reports
  • Impact Factors
  • 5-year Impact Factor
  • Cited Half-Life
  • Article Influence Score
  • Eigenfactor Score (uses data from WoS)
Scopus
  • Scopus Journal Analyzer
  • SCImagoJR
  • SNIP and SJR
  • Altmetrics
The Norwegian register of publication channels
  • Subject-specific lists
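For orientation, the classic two-year Journal Impact Factor listed above is the number of citations a journal receives in year Y to items it published in the two preceding years, divided by the number of citable items it published in those two years. A minimal sketch, with invented numbers:

```python
def two_year_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """
    Two-year Journal Impact Factor for year Y:
    citations received in Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2.
    """
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 300 citations in 2024 to its 2022-2023 output,
# which comprised 120 citable items.
print(two_year_impact_factor(300, 120))  # 2.5
```

The same division underlies the 5-year variant, which simply widens the publication window; metrics such as SNIP and SJR instead normalize for field-specific citation behaviour.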

Use in applications

In applications for research funding, applicants may be required to supplement the bibliographic information with publication metrics. Examples:

  • Number of citations (Scopus, Web of Science and Google Scholar)
  • H-index
  • Highly cited articles
  • Journals with high rankings

Useful links

  • Documents
  • Citations
  • Altmetrics
  • Visualization Tools
  • Ratings
  • Indicators
  • Others

Getting Started

If you want to know more about bibliometrics and how it can be used in your unit, you can contact your local research advisor or Tanja Strøm at the Department of Research and Development.
