Realizing the Impact Chain – From Knowledge Mobilization to Impact

Conference Day: 
Day 3 - November 3rd 2017

Organized by: Eddy Nason, Ontario SPOR SUPPORT Unit

Speakers: Jane Barratt, Secretary General, International Federation on Ageing; David Budtz Pedersen, Professor and Co-Director of the Humanomics Research Centre, Aalborg University; David Phipps, Executive Director, Research & Innovation Services, Division of Vice-President Research & Innovation / Office of Research Services, York University; Rémi Quirion, Chief Scientist of Quebec

Moderator: Eddy Nason, Assistant Director, Ontario SPOR SUPPORT Unit

Takeaways and recommendations: 
  • Several organizations in Canada have been successful in operationalizing impact assessments (e.g. Alberta Innovates, Canadian Institutes of Health Research, Canadian Health Services and Policy Research Alliance), but it is not evenly applied across Canada. While the grassroots community is strong, a high-level policy or mandate is needed to link the various efforts.

  • The new Canada Research Coordinating Committee, which will coordinate activities among the three federal granting agencies, should be mandated to develop standardized impact assessment tools (as is done in the UK).

  • Canada has shown leadership in building capacity in this field by teaming up with the UK and Spain to launch the International School on Research Impact Assessment.

  • Governments around the world continue to make decisions in the absence of good evidence. (e.g. Canada lacks adequate surveillance systems and registries to measure the health impact of vaccines.)

Measure impact through the eyes of the end user

  • Researchers need to consider the ultimate impact their work will have on ordinary people. “Try looking at the potential impact of your research through the lens of the end user.”

  • The road to impact starts with engagement with stakeholders (e.g. policy/practice partner/end users) to identify the research challenge and goals, design the project, disseminate and implement the results, and evaluate the impacts.

  • What matters to an academic (e.g. papers, conferences, successive grants) doesn’t matter to the end user. Disseminate research results in ways that end users can use. (e.g. the Social ABCs manual for parents and caregivers resulted in a 45% improvement in the responsiveness of children with autism. The program is now scaling to other cities.)

  • Think about impact in a way that tells a coherent story that is convincing for a policymaker, not just to get the next grant.

  • Academia, industry and government should work together to create solutions that translate evidence into action.

  • An example of data that matters to society but isn’t captured by research is in the area of “population aging”: there are very few data disaggregated beyond age 75. That gap contributes to age discrimination and age-stereotypical behaviour because we don’t have the data necessary to develop informed policy. (e.g. to implement the World Health Organization’s Global Strategy and Action Plan on ageing and health, of which Canada is a signatory.)

UK’s approach to measuring impact

  • One delegate described the UK’s Research Excellence Framework (REF) as “the most operational impact tool in the world”. 20% of the REF score is based on “impact”.

  • The REF defines impact as “an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”.

  • Impacts are submitted to the REF as narrative case studies and assessed by a panel of experts.

  • A review of the 2014 REF found that most case studies drew on interdisciplinary research; it also identified over 3,700 unique pathways from research to impact.

  • All granting councils and most research funders in the UK use Researchfish®, a research impact assessment platform that measures impact and accountability; researchers are required to complete this assessment at the end of their project, and again 2-5 years later.

  • A portfolio of research is more usefully assessed than individual projects.

  • There are still challenges to overcome:

    • How to measure and assess interdisciplinary research.

    • Understanding how impacts came about from retrospective analyses – “you don’t get involved in discussions as they happen”.

Quebec’s approach to measuring impact

  • Quebec’s efforts in this area are “still a work in progress”.

  • When it comes to outputs and outcomes, elected officials are as interested in qualitative data as quantitative data.

  • The most important metric for governments is jobs.

  • Quebec’s three research funding organizations are committed to knowledge mobilization (KM), and KM will be a major priority in future plans and activities aimed at elected officials, high-level officials and civil society.

  • Peer review and promotions/tenure in academia need to recognize and reward KM, which includes interdisciplinary research.

  • Fonds de Recherche du Québec (FRQ) recently launched AUDACE, a funding opportunity for intersectoral, innovative and high-risk research that includes a KM requirement.

  • KM should include efforts to increase critical thinking and science literacy. (e.g. FRQ holds science breakfasts for elected officials, and supports Détecteur de rumeurs, a fact-checking website developed by the Agence Science-Presse.)

Europe’s approach to measuring impact

  • There is no single definition of impact.

  • The impact “landscape” in Europe is divided primarily by an interest in academic impact and use of traditional metrics (e.g. bibliometrics, citation levels). But there is emerging interest in social impacts and a growing need for data on these impacts.

  • A literature review revealed a strong distinction between “ex-ante” drivers of impact (the impacts anticipated in a grant application) and “ex-post” drivers (impacts evaluated after the research is complete).

  • Increasingly grant applications to funders like the European Research Council are being evaluated on criteria related to knowledge mobilization, knowledge transfer and broader societal impact, in addition to traditional metrics.

  • While several reports have examined responsible research, social scientific responsibility and responsible metrics, a discussion is needed on responsible impact.

  • There are no off-the-shelf metrics that can capture the full impact of research. Instead, we have to develop indicators “in the wild”, drawing on multiple pathways and data sources (e.g. policy, media, cultural, health interactions).

  • The “Responsible Impact” project is examining the types of impacts being commissioned by granting councils and foundations at the ex-ante level, and how research impact is measured at the ex-post level. (Principal investigator: David Budtz Pedersen.)