Developing world-class research for societal impact

By Benjamin Wan-Sang Wah and Michael Ming-Yuen Chang, The Chinese University of Hong Kong

Research assessment | 2016

Publicly funded research institutions are under increasing pressure to demonstrate the relevance of their research to funding agencies and the general public. As a non-profit higher education institution, The Chinese University of Hong Kong is naturally interested in understanding the impact of its research on society. The purpose of this article is to share the approaches we have considered so far.

Measuring societal impact is still an inexact science. Metrics commonly used by ranking agencies, such as reputation surveys and bibliometrics, have their shortcomings. Ranking agencies typically conduct reputation surveys by sending questionnaires to academics and asking them to pick the 10 to 15 top universities in certain academic disciplines. They then normalize the results across regions and disciplines. A main drawback of this approach is that while universities with top academic disciplines are usually universally recognized, measuring the reputation of universities ranked below the top 50 is difficult and may yield highly unreliable and volatile results.(1) The survey data can be noisy and marred by regional and disciplinary bias.

Bibliometrics is a less controversial method of measuring impact; it considers a range of citation-based metrics, such as the number of citations, citations per faculty member, citations per paper, the Impact Factor, normalization factors across disciplines, and region-specific factors. Yet the bibliometric method also has deficiencies, particularly in measuring societal impact.

First, citations evaluate only one type of scientific output, namely peer-reviewed articles, whose readership may be restricted and exclude the general public and practitioners. Second, it usually takes a few years for citations to accumulate into a meaningful signal. Third, a low citation count may not indicate poor-quality work, but instead result from the narrowness of a particular field. Finally, citation patterns differ among disciplines (including interdisciplinary subjects), which may require normalization of citation data, as sketched below.
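To make the normalization point concrete, here is a minimal sketch in Python of how a field-normalized citation score might be computed, assuming we know each paper's raw citation count and the average citations per paper for its field and year. The baseline figures and the function name are illustrative assumptions, not real data or an established tool.

    # Minimal sketch of field normalization. The baseline averages below are
    # invented for illustration; real baselines would come from a bibliometric
    # database covering the same field, year and document type.
    field_baseline = {
        ("materials science", 2013): 15.2,  # assumed mean citations per paper
        ("mathematics", 2013): 4.1,
    }

    def normalized_impact(citations: int, field: str, year: int) -> float:
        """Citations relative to the field/year average (1.0 = world average)."""
        return citations / field_baseline[(field, year)]

    # Eight citations beat the (assumed) mathematics average but fall short of
    # the materials science average, so raw counts are not directly comparable.
    print(round(normalized_impact(8, "mathematics", 2013), 2))        # 1.95
    print(round(normalized_impact(8, "materials science", 2013), 2))  # 0.53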

A further problem with simple citation counting is that not all citations are equal, suggesting that systematic and automated analysis of the depth of citations may be more meaningful than the current dependence on the number of citations as a proxy for the quality of a paper.(2)


International initiatives

Governments around the world have launched initiatives to measure the societal impact of publicly funded research. In the US, the ongoing STAR METRICS project, a collaboration between federal agencies and research institutions, aims to create a repository of data and tools for assessing the impact of federal R&D investments; its first phase focused solely on measuring the impact on job creation. In the current phase, researchers are developing tools and an open, automated data infrastructure linked to existing databases, such as patent and financial records, to enable a more holistic analysis of the impact of federal investment.

From 2003 to 2007, the Australian government developed a Research Quality Framework for assessing research quality and impact. The framework was never implemented, owing to difficulties in agreeing on definitions and evaluation methodology, and was replaced by Excellence in Research for Australia (ERA), which does not use “impact” as a measure of research quality. Rather, selected panel members from industry and academia assess case studies for reach and significance. (See more on the ERA in Nicol, Harvey and Byrne’s article in this issue.)

Assessors used similar criteria in the UK’s Research Excellence Framework (REF) 2014, which turned to case studies to evaluate research impact.(3) The results of this project have significant funding implications. The REF broadly defined impact as “an effect on, change or benefit to the economy, society, public policy or services, health, the environment or quality of life, beyond academia.” The exercise involved 6,679 impact case studies, which were diverse and wide-ranging. Over 80 percent of these studies included multidisciplinary research.(4)

The REF attracted a fair amount of criticism. First, at a cost of around £55 million, it proved an expensive and time-consuming process. Second, it failed to address the time lag between research and its resulting impact, which may vary across disciplines. Third, the evaluation depended on panelists’ subjective judgement, and some panelists may not have had knowledge of all domains. Moreover, the REF lacked robust normalization across disciplines. Finally, the scale of research impact was very hard to measure because of the difficulty of quantifying contributions and attributing impact along the chain from input and activity to output and outcome.


Figure 1: Benchmarking with altmetrics and bibliometrics

This article, co-authored by CUHK researchers and published in Nature Photonics in 2013, has been cited 78 times, putting it in the 99th percentile of all 2013 publications in materials science by citation count. It has also been saved by 109 Mendeley readers and referenced 13 times in the mass media. The new Article Metrics Module in Scopus provides a dashboard to track the research performance of individual articles using both traditional bibliometric indicators (citation count, field-weighted citation impact, citation percentile relative to other publications in the same subject area) and altmetrics (Mendeley readership, mass media coverage, Twitter and Facebook posts, and more). For more details, see http://blog.scopus.com/posts/new-scopus-article-metrics-a-better-way-to-benchmark-articles
Source: Scopus



Altmetrics

In its search for a more efficient alternative, the Higher Education Funding Council for England in 2014 called for evidence on the use of metrics, including bibliometrics, webometrics and altmetrics, in research assessment. Altmetrics stands for alternative metrics.(5) Its goal is to provide a broader, faster and less costly measure of impact, using data extracted from social media that citation counts cannot capture. These metrics can draw on tweets, Facebook posts, blogs, the Web generally, gray literature, online syllabuses, online presentations, discussion forums, mainstream media, and library holdings of books, as well as counts in Mendeley, SlideShare, figshare and similar services. Open-access journals can provide not just citations, but also metrics such as the number of views and downloads.

Altmetrics has the potential to complement citation-based metrics. Some limitations of citations, such as the time required to accumulate them, and the fact that people may not always cite a paper even when it influenced their thinking, could be addressed to some extent by altmetrics. Difficulty arises, however, in linking a reference back to its source, especially where the linkage is diluted by social dialogue and comments on the original reference. Furthermore, people cannot be expected to cite original research formally in social media commentaries. A possible solution is to rely on text mining and data analytics, analyzing articles with natural language processing: coincidence of terms, for example, could be used to infer a relationship with an original article even when the link to it is broken, as sketched below. This remains an area that could be fruitfully explored.
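As an illustration of the term-coincidence idea, the following Python sketch scores how strongly a social media post matches candidate articles using TF-IDF cosine similarity from scikit-learn. The sample texts and the 0.2 threshold are assumptions for demonstration; a real system would need richer features and tuning on labelled data.

    # Hedged sketch: infer a likely article-post relationship from shared terms,
    # even when no explicit link survives in the post.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    articles = {
        "article-1": "Photonic crystal fibres enable low-loss light transmission "
                     "for optical communication.",
        "article-2": "Citation analysis of peer-reviewed articles in oncology "
                     "and clinical practice.",
    }
    post = "Fascinating work on photonic crystal fibres and low-loss light transmission!"

    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit the articles and the post together so they share one vocabulary.
    matrix = vectorizer.fit_transform(list(articles.values()) + [post])
    scores = cosine_similarity(matrix[-1:], matrix[:-1]).ravel()

    THRESHOLD = 0.2  # assumed cut-off; would be tuned on labelled examples
    for article_id, score in zip(articles, scores):
        if score >= THRESHOLD:
            print(f"{article_id}: likely reference (similarity {score:.2f})")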


Final remarks

The measurement of societal impact remains fraught with difficulties, with no flawless solution yet in sight. The discussion about which metrics are most useful for assessing research, and how they should be used, remains open. Nevertheless, the necessary conditions for world-class research with societal impact are indisputable: excellence in research staff, the research environment and the ecosystem for research results. Creating these conditions requires long-term investment, detailed planning, supportive government policies, and a vibrant and collaborative research culture.




1.   http://www.eua.be/Libraries/publications-homepage-list/EUA_Global_University_Rankings_and_Their_Impact_-_Report_II

2.   We examined a random sample of 20 papers from Scopus, all of document type “article”, and subjectively classified the citations in each as reflecting shallow or deep analysis. We found that, of the 546 citations analyzed, as many as 463 were shallow. Although this was a very small-scale experiment, it nevertheless suggests that systematic and automated analysis of the depth of citations may be more meaningful than the current dependence on the number of citations as a proxy for the quality of a paper. (A minimal heuristic sketch follows the examples below.)

Papers were taken from the following journals: The Lancet, The New England Journal of Medicine, Science, Nature, Software: Practice and Experience, CA: A Cancer Journal for Clinicians, Business Horizons, Journal of Finance, SpringerPlus, International Journal of Psychology, Journal of Contemporary Brachytherapy, Alzheimer’s & Dementia.

Example of citations with shallow analysis: “Learners who apply this approach assume learning is personal commitment, which means that they seek the knowledge with interest and curiosity, related the content to previous knowledge and experience, abstract thinker (Arteche et al., 2009; Biggs and Moore, 1993; Entwistle, 1987), with high academic expectation (Rodriguez, 2009) and altruistic life goals (Wilding and Andrews, 2006).”

Example of citations with deep analysis: “Leung et al. (2004) also characterized these students as either high need-achievers or low need-achievers both aim for getting a better grade. The cluster analyses have revealed a small proportion of students with the combination of achieving motive and surface strategy (SS-AM). According to Leung et al. (2004), this group of students has the tendency to avoid failure because …”
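To make the shallow/deep distinction above concrete, here is a hypothetical Python heuristic. The cue phrases and the two-sentence rule are illustrative assumptions; the experiment described in this note relied on subjective human judgement, not on this code.

    import re

    # Assumed cue phrases that tend to appear when a citing passage genuinely
    # engages with the cited work rather than merely listing it.
    DEEP_CUES = ("characterized", "according to", "demonstrated", "argued",
                 "revealed", "in contrast", "extended")

    def classify_citation(context: str) -> str:
        """Label a citing passage 'deep' if it discusses the cited work across
        more than one (naively split) sentence using a cue phrase."""
        sentences = re.split(r"(?<=[.!?])\s+", context.strip())
        engages = any(cue in context.lower() for cue in DEEP_CUES)
        return "deep" if engages and len(sentences) > 1 else "shallow"

    shallow = "Learners seek knowledge with interest and curiosity (Arteche et al., 2009)."
    deep = ("Leung et al. (2004) characterized these students as high or low "
            "need-achievers. According to Leung et al. (2004), this group tends "
            "to avoid failure.")
    print(classify_citation(shallow))  # -> shallow
    print(classify_citation(deep))     # -> deep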

3.    http://impact.ref.ac.uk/CaseStudies/

4.    http://www.hefce.ac.uk/pubs/rereports/Year/2015/analysisREFimpact/

5.    Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact. http://arxiv.org/html/1203.4745v1
