KNAER-RECRAE

  • Knowledge Mobilization Connection:

    Have We Practiced What We've Preached?

    BY KATINA POLLOCK, CAROL CAMPBELL, KELLY BAIROS, AND SHASTA CARR-HARRIS

     
    In July 2016, Sandra Nutley wrote a blog for the KNAER entitled Using research to shape knowledge mobilisation practice. In it, Nutley observed that while there is a growing number of knowledge mobilization initiatives dedicated to facilitating and enhancing research use, there is an irony: “many of these initiatives struggle to demonstrate that their own knowledge mobilisation practices are themselves research-informed and in line with the best available research on how to enhance research use”. Nutley was referring to several tenets that emerged from a 2016 literature review she conducted with Huw Davies and Alison Powell, which explored the latest thinking and empirical evidence on best practices. Even though Nutley indicated that KNAER was an exception to this pattern, we, the KNAER Secretariat, reflected on this irony and asked ourselves: can we demonstrate that we followed the tenets from Nutley and colleagues’ literature review? Can we demonstrate that our decisions surrounding the original KNAER were evidence-informed, and have we continued in that vein for the renewed KNAER? Essentially, have we “practiced what we’ve preached” about using research to inform practice?
     

    Tenet #1: Bringing Researchers and Research Users Together

    The original KNAER was active from 2010 to 2014 and supported 44 knowledge mobilization projects. Each project was selected through an adjudication process that began with a call for proposals with specific application guidelines. Within these guidelines, each project was expected to include several partners or partnerships, and each was expected to collaborate with at least one academic researcher. Analysis of our final reports indicated that the majority of projects were connected to researchers through universities, colleges, health organizations, and/or research departments at school boards. These projects were also expected to connect to practitioners: many demonstrated this by working with school boards and teacher and principal associations as brokers to educators in the field.

     

    [Image: KNAER Phase I Partnerships]

    Tenet #2: Acknowledging the Importance of Context

    We understood context in this case to mean the context of the Ontario public education sector. The KNAER acknowledged the importance of the Ontario public education context in a number of ways. The original KNAER initiative selected knowledge mobilization projects that focused on the (then) four ministry priority areas: teaching and learning, transitioning, equity, and engagement. This was one of the specific criteria used in adjudicating the more than 100 proposal submissions. As is evident in Tenet One, all knowledge mobilization projects included partnerships with some combination of provincial and local intermediaries, such as the Learning Disabilities Association of Ontario and the Peel District School Board. These intermediaries work closely with educators in the field and are aware of current educational trends and challenges in their particular areas. Lastly, even though all projects fell within at least one of the four priority areas, the actual educational issues being addressed varied considerably. This diverse range of topics (see below) was driven by local concerns and needs.

     

    • Aboriginal Education
    • Arts Education
    • Classroom Management
    • Early Childhood Education
    • Education in the North
    • English Language Learners
    • Equity and Inclusion
    • French-language Education
    • Knowledge Mobilization
    • Leadership
    • Mathematics Education
    • Mental Health
    • Multi-Modal Learning
    • Physical Health
    • Science Education
    • Special Education
    • Stakeholder Engagement
    • Student Identity

     

    Tenet #3: Being Aware of the Needs of Research Users

    During the first two years of KNAER (2010-2012), further consultation was done with educators, researchers, intermediary groups, and parents. It became clear that while the successful project leads were well-versed in knowledge mobilization, knowledge mobilization overall was not well understood throughout the education sector. For this reason, the KNAER began supporting professional learning around knowledge mobilization. Specifically, we created resources about how to create an effective knowledge mobilization plan, such as tips for knowledge mobilization planning and a guide on how to write a short research summary for your primary audience. We also repurposed the KNAER website and created a toolkit that compiled the resources generated from the KNAER. This toolkit mainly concentrated on resources applicable to the content focus of the various knowledge mobilization projects. For example, a mathematics-focused project by Shelley Yearley, Trish Steele, and Cathy Bruce and their partners, entitled Exploring Learning and Differentiated Instruction for the Difficult to Learn Topic of Grade 6 Fractions Using Teacher-Coach-Research-Developer Networking, has several useful resources in the toolkit, including a literature review, an information sheet, and their own toolkit.

     

    Another example is the Our Kids Network: Taking Research to Practice project, which included a university partner (Charles Sturt University), a provincial network (Our Kids Network), a school board (Halton District School Board), several health agencies (Halton Region Children’s Services and Department of Health, ErinOakKids Centre for Treatment and Development), the Halton Police, and several community partners (Halton Children’s Aid Society, ROCK Reach Out Centre for Kids, Halton Multicultural Council). The OKN project focused on building capacity to utilize research, strengthening cross-sector partnering in taking research to practice, sustaining engagement with internal and external stakeholders, and responding more effectively to issues facing children, youth, and their families in the Halton region; it used the strengths, resources, and knowledge of the network and its partners to accomplish this. Their resources in the KNAER toolkit include a report, an online video, a toolkit and virtual community of practice, and a website.

     

    Tenet #4: Drawing on a Range of Types of Knowledge, Not Just Research-Based Knowledge

    A number of different types of knowledge can be found within the 44 KNAER projects. Because the majority of project topics were driven by local practitioners and communities connected with the education sector, practitioner and community knowledge was used substantially in efforts to inform practice. An example of a community-based project is the Kimaaciihtoomin E-Anishinaabe-Kikinoo’amaageyak (Beginning To Teach In An Indigenous Way) project by Jean-Paul Restoule and his partners, including the Toronto District School Board and its Aboriginal Education Centre. This project focused on integrating Aboriginal perspectives into the classroom and has many resources available in the toolkit, including several presentations, articles, a toolkit, online videos, and a website.
     

    Tenet #5: Testing and Evaluating Interventions

    We interpret this tenet, testing and evaluating interventions, to mean asking whether our efforts produced any of our intended outcomes. It was difficult to conduct any systematic evaluation of the full KNAER initiative because each KNAER knowledge mobilization project focused on a different educational issue, approached that issue in a different manner, and included different partners. However, each project did attempt to report its impact and degree of influence and to record outputs and outcomes, and we tried to provide a transparent narrative of the full initiative in the final report. An external evaluator evaluated the full initiative and concluded that KNAER was a “trailblazing” initiative (McGuire, Zorie, & Frank, 2014, p. 9). In addition, we conducted a review of the utility of KNAER, a literature review, and interviews with knowledge mobilization experts, and held planning sessions with various stakeholders to create our final report. The KNAER final report is our culminating internal evaluation and lessons learned about KNAER (2010-2014). This final report was (and is) the basis for the renewed KNAER initiative.
     

    Tenet #6: Feeding Knowledge from Evaluation Back into Future Practice

    Last, but certainly not least, Nutley stated that another effective practice reported in the literature is to utilize knowledge gleaned from evaluations to inform future practice. As mentioned in Tenet Five, we produced an extensive final report on the original KNAER (2010-2014) in which we considered the findings from the external evaluation, our own literature review, expert interviews, virtual discussions, and planning sessions to develop a systems approach model that was proposed to the Ontario Ministry of Education for consideration in redesigning KNAER. The Ontario Ministry of Education accepted this model and has utilized it as the framework for the renewed KNAER that was launched in fall 2016. To read more about our lessons learned and recommendations for the renewed KNAER, please see our final report.

    [Image: KNAER II Model, 2016-11-15]

     

    Let’s return to the question we asked ourselves earlier in this blog: Did we practice what we preached? As demonstrated here, we do think that, to varying degrees, we have practiced what we’ve preached. Perhaps the more relevant question is: how will we continue to take our lessons learned and apply them to new challenges that arise with the revised systems model as we move forward? How will we know the outcomes of applying this new knowledge and understanding?

  • KNAER-RECRAE Highlight:

    Read The Hamilton Spectator article on launch of KNAER Well-being Network

    Mac, school board to lead student well-being networks

    Hamilton will be leading the effort to equip Ontario schools with new resources to improve student well-being.

    The Ministry of Education has announced that the Hamilton public school board and McMaster University's Offord Centre for Child Studies have been chosen to spur the building of "knowledge networks" addressing in an integrated way...[continue reading article]

  • Knowledge Mobilization Connection:

     

    CREATING PARTNERSHIPS: LEARNING NEW WAYS TO CONNECT

    BY VIVIAN TSENG, JOHN EASTON AND LAUREN SUPPLEE 

    Over the last two decades there has been a steady growth of high-quality research in education and human services. Yet we have heard researchers express frustration that policymakers and practitioners don't use (and sometimes misuse) research findings. Conversely, policymakers and practitioners suggest that research is frequently not relevant to their work, or that it's not easily accessible or understood. This is not surprising, as evidence-based practice and policy has traditionally been about producing evidence and disseminating research to users—an approach that Vivian Tseng has characterized as a one-way street.

    But just producing rigorous research and bringing evidence to practitioners and policymakers won't get it used. Users need ongoing engagement around research. They need opportunities to talk about the research and interact with others to apply findings.  Researchers need to focus less on dissemination and more on dialogue. Enter research-practice partnerships.

    Long-term collaborations between researchers and decision makers—research-practice partnerships—shift the dynamic between research producers and users by creating two-way streets. Instead of asking how researchers can produce better work for practitioners, partnerships ask how researchers and practitioners can jointly define research questions. Rather than asking how researchers can better disseminate research to practitioners, partnerships strive for mutual understanding and shared commitments from the beginning. Successful partnerships enable researchers to develop stronger knowledge of practitioners' challenges, their contexts, and the opportunities and limitations for using research. And they allow practitioners to develop greater trust in the research and deeper investment in its production and use.

     

    The Principles Behind Successful Partnerships

    While there are many different kinds of research-practice partnerships (RPPs), they are guided by key principles that make them different from other types of collaborations.

    Mutualism brings researchers and practitioners to the table to develop an agenda that is mutually beneficial.  By collaborating throughout the process, both researchers and practitioners get their needs met.

    Commitment to long-term collaboration means that the partnership can foster iterative work to understand and address key problems of practice. There is an ongoing cycle of learning and doing.

    Trusting Relationships are the key to effective partnerships. To sustain a long-term collaboration, partners need to believe that they can rely on each other to come through on agreements and to understand and even anticipate each other's needs and interests. Trust enables partners to weather bad news, disagreements, unfulfilled expectations, and changes in leadership.

     

    How do Partnerships Work?

    Building an RPP is hard work. They are complex organisms, with structures, processes, and roles that evolve as partnerships mature and adapt. However they form, we have observed five elements that seem to come together in successful partnerships.
    (See Diagram 1)


    1. All RPPs must determine their structure. While not all RPPs have formal documents, many develop charters, MOUs, and operating principles to record their shared goals, clarify stakeholder representation and roles, and spell out governance and operational issues.

    2. RPPs must develop a shared commitment, which includes defining the research agenda. The research agenda is a focal point for activities within a research–practice partnership. In RPPs, research agendas are shaped around problems of practice, policy, and implementation, and are co-shaped by partners to fit practice priorities.

    3. Partnerships need to develop their processes, routines, and “ground rules” for producing and using research evidence.  Many RPPs have a “no surprise” rule, wherein the agency partners have an opportunity to review a research report before it is released to the public.

    4. Capacity building may involve one or all partners in the collaboration. It can focus on building the agency's capacity to use research, bolstering researchers' capacity to conduct and communicate useful research, or supporting the capacity of the partnership itself through staffing.

    5. It's critical that the funding portfolio for RPPs covers partnership infrastructure as well as projects.


    As all of these elements come together, a partnership identity often emerges where research and practice partners develop a shared sense of what their partnership is and what it does.

    The University of Chicago Consortium on School Research's (CCSR) 25-year relationship with Chicago Public Schools illustrates how a school system can learn from research evidence to drive improvements over time.  In the late 1990s, CCSR researchers sought to measure freshman progress toward graduation and developed an "on-track indicator" to help schools identify freshmen who were unlikely to graduate in four years.  Their research challenged prevailing assumptions and stimulated new ideas for combating dropout.  But the work didn't stop there.  The district and schools put systems in place to use the indicator data to intervene with students while their research partners experimented with user-friendly ways to present the data.  As schools tried out different strategies for bringing students on track, CCSR systematically studied those strategies to determine which ones worked and why. By 2013 that sustained work had yielded an on-time graduation rate of 82 percent, up from 57 percent just six years earlier. Educators, using data and research, learned how to better support student success.

    Let's turn to a relatively new partnership between the research and program offices within one agency: the Office of Family Assistance's (OFA) healthy marriage programs and the Office of Planning, Research and Evaluation (OPRE) within the U.S. Department of Health and Human Services, Administration for Children and Families (ACF). In the early 2000s, OPRE launched long-term studies to evaluate whether marriage and relationship education programs might improve outcomes for children in low-income families. By the time the evaluation results were available, about ten years later, the context within ACF had changed and the program office had new questions, not envisioned when the long-term studies were planned. A misalignment had emerged between what research could offer and what current policy needed. To make the research more relevant, closer collaboration between the research and program offices needed to be established.

    That's exactly what happened when, in 2010, a partnership was structured between OPRE and OFA's healthy marriage programs. Lauren Supplee, formerly of OPRE, and her OFA counterparts sat down and discussed the needs of the program. Since they had a history of working together, they already had a trusting relationship. Subsequently, they were able to structure a partnership that would be responsive to OFA staff. One way that this took shape was that OPRE researchers invested time in building relationships with OFA across all levels of staff. By working together, ideas for new research projects are generated throughout the year. To build the research agenda, the OPRE and OFA staffs have ongoing conversations about program operations as well as specific structured meetings around annual research planning. The overall goal for these discussions is to raise questions from the field that research may be able to support. The research portfolio now includes foundational projects to describe services and participants, impact studies to determine the efficacy of components or whole programs, and research capacity building for the field, allowing important questions to be addressed simultaneously. Through shared goals, shared commitment, open communication, capacity building, and infrastructure, an effective RPP was born.

     

    How Can We Ensure the Next Generation of Partnerships?

    RPPs represent a sea change in the way we think about research, practice, and the uses of evidence. They break down walls between researchers and practitioners by creating two-way streets of engagement. Researchers and practitioners, alike, must remain open to learning about and adapting to different perspectives. Taking the long view on research and practice improvement reaps tremendous benefits for youth and families.

    There are important questions that researchers need to ask themselves before entering into an RPP. Can they adopt new ways of working in order to produce more timely and useful research? Are they open to taking on different approaches to research? Finally, are they willing to acquire skills that are not traditional for researchers, including communicating with broader audiences and imagining research agendas from a practice perspective?

    Practitioners, too, will need to work in new ways. They may need to ask themselves: are their organizations flexible enough to foster the conditions that will allow for more effective and systematic use of research evidence? At the same time, policymakers will need to address the bureaucratic barriers to research-program collaborations within agencies and between public agencies and external research partners.

    Private and public funders have a role to play as well in preserving the progress that partnerships have made. While funding for projects is easier to come by, some foundations are not as willing to invest in operating costs and infrastructure development. But RPPs need resources for outreach, communications, and relationship building if they are to be sustained for the long haul. What would be ideal is a reliable combination of funds: government support for operating costs and specific large-scale research studies, private support for infrastructure maintenance and behind-the-scenes internal "R&D" activities, local foundation support for context-specific research and evaluation studies and infrastructure, and university support for faculty time and partnership space.

    Research-practice partnerships are not the path for the faint of heart. But acknowledging and addressing the challenges inherent in the work can bring about opportunities to close the notorious gaps between research and practice—to build two-way streets that improve work on both sides of the divide. RPPs allow researchers and practitioners to build joint, strategic research agendas, to embed data and research in ongoing work, to build knowledge from one project to the next, and to integrate lessons learned into practice and policy. When mutual trust forges confidence in the research, we can collectively bring about more effective services and enhance outcomes for children and youth.


    [Diagram 1: Change Churn (image not shown)]

     

    About the authors

    Vivian Tseng is the Vice President, Programs at the William T. Grant Foundation. She leads the Foundation’s grantmaking programs and its initiatives to connect research, policy, and practice to improve child and youth outcomes. In 2009, she launched the Foundation’s initiative on the use of research evidence in policy and practice. She also designed the Foundation’s support for research-practice partnerships, including a learning community of research-practice partnerships across the country. Tseng has longstanding interests in mentoring young researchers and strengthening the career pipeline for scholars of color. Under her leadership, the William T. Grant Scholars Program has deepened its support for early-career researchers and established a grants program to support mentoring for junior researchers of color. She serves on the Boards of the Forum for Youth Investment, Asian Americans and Pacific Islanders in Philanthropy, and Evidence and Policy. She was previously on the faculty in Psychology and Asian American Studies at CSUN. Her studies of racial, cultural, and immigration influences on child development have been published in Child Development, and her research on improving social settings and promoting social change has appeared in the American Journal of Community Psychology. She received her Ph.D. from NYU and her B.A. from UCLA.

    John Q. Easton is Vice President, Programs at the Spencer Foundation in Chicago. At Spencer, he developed and leads a new grant program for Researcher-Practitioner Partnerships. From June 2009 through August 2014 he was director of the Institute of Education Sciences in the U.S. Department of Education. Prior to his government service, Easton was executive director of the University of Chicago Consortium on School Research. He had been affiliated with the consortium since its inception in 1990, becoming its deputy director in 1997 and executive director in 2002. Easton served a term on the National Assessment Governing Board, which sets policies for the National Assessment of Educational Progress (NAEP). He is a member of the Illinois Employment Security Advisory Board, the Illinois Longitudinal Data System Technical Advisory Committee, and the Chicago Public Schools’ School Quality Report Card Steering Committee.

    Lauren H. Supplee is a program area director for early childhood research at Child Trends. Dr. Supplee has devoted her professional career to research and evaluation with the goal of applying the resulting knowledge to policy and practice. She is committed to conducting research and evaluation that can contribute to program improvement and improved outcomes for children and families. Her research has focused on evidence-based policy, social-emotional development in early childhood, parenting, prevention and intervention programs for at-risk children, and implementation research. Prior to joining Child Trends, Lauren worked for the federal Administration for Children and Families in the Office of Planning, Research, and Evaluation for ten years, the last four of those as the director of the Division of Family Strengthening. She began her career as a research associate at the University of Pittsburgh. Lauren received her Ph.D. from Indiana University in educational psychology with a specialization in family-focused early intervention services.

  • KNAER-RECRAE Highlight:

     

    The Consortium for the Study of Leadership and Ethics in Education (CSLEE) annual conference is upon us!

    This year’s conference is being hosted by one of our partner institutions – Western University – and KNAER is a very active participant.

    KNAER will be present at CSLEE in a number of capacities:

    • Our social media coordinators will be live tweeting and providing social media updates throughout the conference - Follow us (@KNAER_RECRAE) on Twitter and like us (KnaerRecrae) on Facebook! And don't forget to follow the full CSLEE Twitter conversation at #CSLEE2016

    • We will have a poster on display during the conference about knowledge mobilization in Ontario (check it out for some helpful tips; there will be a handy takeaway sheet as well)

    • We will be hosting a networking session that lets you create connections in a snap and get some useful advice on building lasting connections – learn the tricks of speed networking with us! It takes place on Thursday, October 20 from 3:30-4:30pm in the Juniper room (we’ll have tasty treats for you to enjoy). Don't forget to bring business cards!

    • Join us at our hands-on workshop on social media on Friday, October 21 from 1:30-3:00pm in the Cherry room, and learn how to utilize these digital tools to mobilize your research and expand your network (bring your laptop or tablet or phone or any other digital device that lets you connect – you’ll be glad you did!)

    • We confront the idea of values in knowledge mobilization on Saturday, October 22 from 10:15-11:30am in the Juniper room as part of a three-paper panel. Come hear our co-directors speak about this important (and not often addressed) issue.

    • Come meet the KNAER team at any of our presentations or stop by our KNAER table for some swag, resources, and good conversation!

    The full conference program and more information can be found on the CSLEE conference website at: http://www.edu.uwo.ca/cslee2016/index.html

    Hope to see you there!

    The KNAER Team

  • Knowledge Mobilization Connection:

     

    Systems Thinking as Part of a Knowledge Translation (KT) Approach:

    How Our NeuroDevNet Team Used Systems Thinking to Improve Our Production of Research Summaries

    by Anneliese Poetz

    Anneliese Poetz completed her PhD in Social Science at McMaster University; her doctoral research generated a systems-based model for Knowledge Translation. Anneliese has experience writing plain language research summaries for policymakers, parents, and teachers at the Canadian Language and Literacy Research Network, and in her most recent work for the National Collaborating Centre for Infectious Diseases, she facilitated national stakeholder consultations and developed stakeholder- and evidence-informed products to improve public health practice. For more on Anneliese and her work click here.

     

    I recently had the pleasure of presenting at the Canadian Knowledge Mobilization Forum (#CKF16) conference in Toronto, Ontario. It was a 7-minute presentation entitled “Systems and Processes for Knowledge Translation” and focused on one example of how I use systems thinking to inform my work in Knowledge Translation (KT).

     

    Several years ago, I met a business analyst who informed me that what I was doing in my job in the field of KT was essentially what a business analyst does: use stakeholder input to inform the design (and/or re-design) of products and processes. When you think about it, everything we do in KT is either a product or a process. The products I worked on included evidence-informed tools for health care practitioners to apply to their work, and guides for researchers to help them “do” KT. One of the processes that we needed to improve was for our production of clear language summaries called ResearchSnapshots.

     

    Wondering what the difference is between Knowledge Mobilization, Knowledge Translation, and other like terms? Visit Gary Myers' KMbeing blog post and join the conversation on KMb: Definitions & Terminology.  

     

    If you look at slide #6 in the above presentation, you will see a framework that outlines the key concepts in business analysis, according to the Business Analysis Body of Knowledge (BABOK v.3). These are: 1) need, 2) change, 3) stakeholder, 4) solution, 5) value, and 6) context.

    Each of these is of equal importance, and all must be represented. One of the things we did wrong with our initial process for the ResearchSnapshots was that we transferred the existing process (and writing staff) used by York’s KMb Unit without consideration of the differences in context within NeuroDevNet.

    One of the ‘tools’ within the field of Business Analysis is a methodology called root cause analysis. We conducted a root cause analysis in order to pinpoint the root of the problem and create a targeted solution. We discovered that the writers used by York’s KMb Unit, although highly skilled in clear language writing, had social science expertise but were being asked to summarize research papers grounded in basic and clinical science. The researchers complained that they had to rewrite most if not all of the content, which was a lot of work for them. Addressing this root cause is what ultimately earned us customer satisfaction (buy-in) among the researchers for the new process.

    What we did to improve the process was to first identify all the stakeholders directly and indirectly affected by the process. Then we gathered information about their needs (often in the form of the complaints we’d received from researchers) with respect to the process, which were then transformed into ‘requirements’. These requirements informed the re-design of the new process.

    The process had to be easy for researchers and create value for the Network. Since the projects within the Network were so diverse and often specialized, it would have been too difficult (and maybe impossible) to find writers who were content experts. So, the new process begins with the researcher nominating a paper that was produced as a result of one of their NeuroDevNet-funded projects, along with one of their trainees (students) who is expert in the content area. Then, we provide training and support toward the production of a clear language summary of their paper that is ready for final review and sign-off by the researcher. In this way, it is easy for researchers because they only have to make minimal edits to the draft, and it creates value for the Network not only because of the clear language summary that is produced but also because of the transferable skills that the trainee acquires.

    Let’s break down how this method reflects ‘systems thinking’:

     

    1) A system is composed of parts. The first thing we did was map out the stakeholders and where they were situated within the system (see slide #9).

    2) All the parts of a system must be related (directly or indirectly). We mapped out the stakeholders as related, directly or indirectly, to the customer service issue (or ‘incident’).

    3) A system has a boundary, and the boundary of a system is a decision made by an observer or a group of observers. The ‘system’ was what facilitated the execution of the process for creating clear language summaries (ResearchSnapshots). In other words, the boundary of the system was the affiliation of researchers as part of NeuroDevNet, and research papers to be summarized were those produced as part of NeuroDevNet funded research projects.

    4) A system can be nested within another system, a system can overlap with another system. The ‘system’ for producing ResearchSnapshots within the KT Core with one researcher is nested within the larger ‘system’ of the NeuroDevNet pan-Canadian Network of researchers and projects.

    5) A system is bounded in time, but may be intermittently operational. A system is bounded in space, though the parts are not necessarily co-located. We engage with researchers to co-create ResearchSnapshots at the time that we receive a service request, usually after a researcher has published a new peer-reviewed paper. These requests are sporadic, depending on the frequency and pace of publications arising from NeuroDevNet’s pan-Canadian funded projects.

    6) A system receives input from, and sends output into, the wider environment. We receive requests, but we will also offer services if we see an opportunity. Once the ResearchSnapshots are finalized, they are made available on the NeuroDevNet website.

    7) A system consists of processes that transform inputs into outputs. The process for clear language writing of ResearchSnapshots is one of the processes that exist within the KT Core; it transforms inputs (peer-reviewed publications, clear language summary drafts in Word) into outputs (a finalized clear language summary, formatted onto the ResearchSnapshot .pdf template and formatted for accessibility).

    8) A system is autonomous in fulfilling its purpose. A car is not a system. A car with a driver is a system. Similarly, the KT Core as a department within NeuroDevNet is not a system. The KT Core with a Lead, Manager and Assistant, is a system.

     

    As a systems thinker, remember that a system is dynamic and complex, and that information flows among the different elements that compose a system. For example, information flows among the KT Core Lead, Manager and Assistant. A system is a community situated within an environment. For example, the KT Core is a system situated within NeuroDevNet, and as a result, information also flows more broadly between the KT Core and NeuroDevNet’s community of researchers. Information flows from and to the surrounding environment, for example, the KT Core posts its finalized ResearchSnapshots publicly on the NeuroDevNet website.

    The field of Business Analysis has identified (and published in BABOK) a common-sense framework and practical methodologies, which I believe can advance the field of KT towards more meaningful and useful products and processes that are responsive to the systems in which they are intended to be used.

  • Knowledge Mobilization Connection:

     

    EVALUATING RESEARCH IMPACT: BE PERSISTENT AND PLAN AHEAD

    By Anne Bergen* & Elizabeth Shantz**

    *Director, Knowledge to Action Consulting;
    **Knowledge Mobilization and Training Manager, Canadian Water Network

    This blog post gives an overview of helpful practices in research impact evaluation, with case study examples from the Canadian Water Network’s recent evaluation work.

    KEY MESSAGES

    • Understand why you are evaluating: your evaluation goals will impact how and what you evaluate.
    • Map out your evaluation criteria: inputs, outputs, outcomes, and assumptions. A logic model can be a helpful tool for this.
    • Build relationships with end-users and plan for evaluation at the beginning of research projects or programmes.
    • Be flexible and pragmatic. Revisit and adapt your evaluation plan.

     

    Research impact evaluation is an exercise in understanding how a project or body of work has contributed to change: in people, places, policy, and practice. That means that research impact evaluation goes beyond counting the outputs of research (e.g., publications, reports, presentations) to tracking and assessing uptake, outcomes, and impact.

    The question of what should be considered “research impact” is complex. Counting outputs is relatively easy, but outputs aren’t evidence of impact. It’s much more difficult to draw a link between research activities and changes in individuals, organizations, or systems.

    As research impact tends to be indirect, it’s helpful to think about research activities as contributing to changes, rather than as a direct cause. That is, individual changes in knowledge, attitudes, and understanding may influence decisions and behaviour. However, these decision and behavioural changes occur due to uptake of multiple pieces of information, assimilation with existing knowledge and experience, and the larger context. Similarly, changes in “policy and practice” involve uptake and use of multiple pieces of evidence and external pressures. 

     

    Case Study Example: Canadian Water Network

    At Canadian Water Network (CWN), our goal has been to fund research that addresses real-world water challenges affecting public health, the environment, and our economy. However, demonstrating links between research and decisions that affect these factors has been difficult, as policy and practice decisions take a long time to change, and changes generally result from a number of influences rather than from a single project.

    In 2012, CWN launched an evaluation to better understand the contribution of the last ten years of CWN research to these impacts. We used a 3-stage process:

    1. Analysis of existing reports and documents
    2. Interviews with project researchers
    3. Interviews with research users to learn about the impact of each project in their organization.

     

    STEP 1: DEFINE YOUR EVALUATION GOALS AND CREATE A LOGIC MODEL OR THEORY OF CHANGE

    Keep a close eye on your evaluation goals

    When planning to evaluate research impact, questions about the scope of the evaluation should be answered as early in the process as possible. Revisit these goals often.

    What kind of impact do you expect? In what timeframe? For whom? Under what conditions? Based on what evidence?

    What aspects of your research do you want to evaluate? What are you going to do with the resulting information?

     

    Case Study Example: Canadian Water Network

    We had a number of goals for CWN’s evaluation project:
    1. Identify and share success stories
    2. Identify the elements of impactful research to inform future projects 
    3. Determine whether user-driven programs led to more uptake and impact than researcher-driven programs 
    4. Establish a comprehensive database summarizing inputs, outputs and outcomes for all projects

     

    Build a logic model to explain your research impact pathways

    Creating a logic model or theory of change defines the links between research activities, outputs, and desired outcomes, as well as the expected end-users or target populations for change. A logic model makes it easier to map contributions of research activities and outputs to uptake, use, and impacts.

     

    Logic Model Key Components

    End-users/ target populations

    Who are the audiences/targets of change of your research activities and outputs? These are the people and groups who can tell you about research uptake, use, and impact.

     

    Activities & outputs

    What are your research and knowledge translation activities that might impact your end-users? Think about in-person activities like meetings, workshops, and collaborations, as well as written and visual outputs.

    Outcomes & impacts

    What are the expected outcomes and impacts of your research and knowledge translation activities? What are the changes in people’s knowledge, attitudes, skills, and actions? What are the community- and systems-level impacts?

     

    Assumptions & contexts

    What needs to be in place for activities to lead to desired outcomes? What context is necessary for the activities to have impact? When do you expect these changes to occur?
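
    For readers who want to operationalize these components (for example, in the kind of project database CWN set out to build), the key components above map naturally onto a simple structured record. Below is a minimal sketch in Python; the field names and example values are illustrative assumptions, not CWN’s actual schema.

        # A minimal sketch of a logic-model record for an evaluation database.
        # Field names mirror the key components above; values are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class LogicModelRecord:
            project: str
            end_users: list[str] = field(default_factory=list)    # audiences / targets of change
            activities: list[str] = field(default_factory=list)   # meetings, workshops, collaborations
            outputs: list[str] = field(default_factory=list)      # reports, tools, publications
            outcomes: list[str] = field(default_factory=list)     # changes in knowledge, attitudes, skills
            impacts: list[str] = field(default_factory=list)      # community- and system-level change
            assumptions: list[str] = field(default_factory=list)  # conditions needed for impact

        # One hypothetical project record, for illustration only.
        record = LogicModelRecord(
            project="Watershed monitoring pilot",
            end_users=["municipal water managers"],
            activities=["quarterly workshops with end-users"],
            outputs=["plain-language summary", "monitoring protocol"],
            outcomes=["managers aware of the new protocol"],
            assumptions=["managers have capacity to adopt new tools"],
        )
        print(record.project, "->", record.outputs)

    Keeping every project in the same structure is what makes it possible to compare, filter, and report on inputs, outputs, and outcomes across a whole portfolio.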

    [Image: CWN Logic Model, adapted from the University of Wisconsin-Extension evaluation logic model template]

    Case Study Example: Canadian Water Network

    CWN’s logic model, adapted from the University of Wisconsin-Extension logic model, outlines the expected pathways between research inputs, activities, outputs or products, and short-, medium- and long-term outcomes. These categories formed the framework within which we gathered and interpreted data.

     

    CWN’s logic model, which runs from inputs to outputs to impacts, is a helpful starting example for mapping out your research impact pathways. Other examples include Morton’s (2015) matrix of outcomes and indicators for research impact and Phipps et al.’s (2016) logic model for knowledge mobilization, which spells out benefits for researchers and end-users (http://jces.ua.edu/the-co-produced-pathway-to-impact-describes-knowledge-mobilization-processes/).

     

    STEP 2: BUILD RELATIONSHIPS TO IDENTIFY AND AMPLIFY IMPACT

    Ideally, you’ll start planning for research impact evaluation by defining goals and creating a logic model at the beginning of your research project. Research impact may take years or decades, but trying to reach out to end-users long after the fact to learn about uptake and use is unlikely to be helpful.

    Building relationships and involving end-users throughout the research process will make it easier to follow up over time, and will build more opportunities for impact into the research process. In addition, end-user involvement and collaboration will amplify your research impact.

    Research impact takes time. Don’t expect to see changes in policy or practice right away. Consider also that although it is likely easier to identify contributions from broader programmes of research than from single projects, programme-level impacts will take relatively longer to emerge. Since it’s rarely possible to wait decades to evaluate impact, focus on identifying shorter-term indicators of potential future impacts.

     

    Case Study Example: Canadian Water Network

    CWN gathers evaluation information throughout a project, including information on outputs, short-term outcomes that have occurred, and medium- or long-term outcomes that are expected to occur later. This process helped us to forecast potential future impacts and highlighted areas where we could follow up during this evaluation to learn whether expected impacts actually occurred.

    Building strong relationships with researchers and end users is critical, and we believe relationships were a key factor in the high response rates to our interview requests (74% of researchers and 68% of end users) – even for projects that were 10+ years old!

     

    STEP 3: BE ATTENTIVE TO UNEXPECTED OUTCOMES: REVISE AND REVISIT YOUR EVALUATION PLAN

    Once you have evaluation goals and a logic model in place, keep revisiting and revising the goals and logic model. The outcomes you expected in Year One may not be the outcomes that are emerging in Year Five. In particular, your logic model for research impact should be updated about annually. The goals of your evaluation may shift, depending on changing pressures from funders and other stakeholders.

    Case Study Example: Canadian Water Network

    As our evaluation proceeded, we updated and changed some of the outcome categories in the logic model due to emerging impacts. Our priorities also shifted from identifying the elements of impactful projects and programs to identifying and sharing success stories. Having a large number of goals for the evaluation may have been a complicating factor – the more evaluation goals you have, the more difficult it is to achieve them all!

     

    STEP 4: BE PRAGMATIC. DON’T LET EVALUATION OVERWHELM YOUR ACTUAL WORK. LEARN FROM THE EVALUATION

    Remember that an evaluation cannot capture every single outcome or impact, and that research impacts cannot necessarily be directly compared across projects or programmes. Finding a balance between doing research and evaluating research can be challenging. Those seeking to understand research impact should be mindful that evaluation has costs, as well as benefits, for researchers and end-users. Working to minimize the burdens of data collection and reporting will make ongoing research impact evaluation more sustainable.

    Carefully designed and implemented evaluation can help you learn how to better mobilize knowledge and create research impact. Build in time and space to learn from the evaluation results, and make evaluation part of a regular reflective practice.

    Case Study Example: Canadian Water Network

    We originally intended to complete CWN’s evaluation in-house within 6-12 months, but found that it took almost 2 years, 2 interns, and the assistance of Knowledge to Action Consulting to gather all of the comprehensive data we were interested in. Database development was the most time-consuming aspect of the evaluation.

    Looking ahead, we plan to continue successive evaluation work. By strategically collecting data, building relationships and scaling back our evaluation plan, we hope to benefit from a more streamlined process.

     

    Note: This blog post was adapted from the authors’ recent previous work, including presentations to Alberta SPOR unit, OMAFRA, and CKF16.

     

    About the authors

    Anne is a consultant who helps people and organizations transform knowledge into action (https://knowledgetoaction.ca). She gained mixed methods research expertise through her PhD training in Applied Social Psychology, and believes that a common understanding of problems and solutions can be built through engaged research and collaborative action. Anne uses approaches grounded in social science theory to develop evaluation frameworks for collecting rigorous, meaningful, and actionable data. She has worked with diverse stakeholders (academics, community groups, non-profits, government) to create knowledge mobilization activities and strategies for projects, programmes, and organizations.


    Elizabeth has been a knowledge mobilizer at Canadian Water Network since 2010. Her background in industrial/organizational psychology and experience as a community-engaged researcher inform her work at CWN; she is responsible for supporting the development and implementation of knowledge mobilization strategies in CWN programs to effectively bring together researchers and decision makers. She also evaluates the impact of CWN research on policy and practice, develops plain language products to share the results of CWN research, and designs tools and training programs to build knowledge mobilization capacity.

     

    RESOURCES

    Better Evaluation http://betterevaluation.org

    Economic and Social Research Council (2011). Branching Out: New Directions in Impact Evaluation from the ESRC’s Evaluation Committee. Appendix 1 – Conceptual Framework for Impact Evaluation. Retrieved from: http://www.esrc.ac.uk/files/research/evaluation-and-impact/branching-out-new-directions-in-impact-evaluation-from-the-esrc-s-evaluation-committee/

    Lavis, J. N., Robertson, D., Woodside, J. M., McLeod, C. B., & Abelson, J. (2003). How can research organizations more effectively transfer research knowledge to decision makers? Milbank Quarterly, 81, 221-248.

    Morris, Z. S., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104, 510-520. Retrieved from http://jrs.sagepub.com/content/104/12/510.full

    Morton, S. (2015a). Progressing research impact assessment: A ‘contributions’ approach. Research Evaluation, 24, 405-419. Retrieved from http://rev.oxfordjournals.org/content/24/4/405

    Morton, S. (2015b). Creating research impact: The roles of research users in interactive research mobilisation. Evidence & Policy: A Journal of Research, Debate and Practice, 11, 35-55.

    National Collaborating Centre for Methods and Tools (2012). Evaluating knowledge translation interventions: A systematic review. Hamilton, ON: McMaster University. Retrieved from http://www.nccmt.ca/resources/search/114.

    Phipps, D. J., Cummings, J., Pepler, D., Craig, W., & Cardinal, S. (2016). The Co-Produced Pathway to Impact describes Knowledge Mobilization Processes. Journal of Community Engagement and Scholarship, 9(1), 31-40.

  • Knowledge Mobilization Connection:

    Tangled Up in Networks

    By Heather Bullock

    We live in an increasingly networked world. Advances in technology over the past 15 years have enabled collaboration and networked activities in ways we couldn’t imagine just a few short years ago. Despite this, there has been relatively little research on networks, how they function, and their potential. Personally, I find networks both fascinating and mystifying. The “network” seems like a nebulous concept, but also a very real one, especially when I think about my “work network” or my “network of friends and family”. For me, in the past, networks were just things that existed; I gave little thought to how they are structured or function or how you might use networks to achieve goals, outside of the odd suggestion like “use your network to find a job after your graduate work”. But I gradually became more interested in networks. I would love to say curiosity alone drove me to find out more about networks, but in reality, about 10 years ago, people in my workplace began discussing using networks for certain purposes. So, I started to do some digging and here are some things I found out about networks:

    There is a whole language for describing and understanding networks, including interesting terms like “actors”, “nodes” and “structural holes” (see the sketch after this list)

    Networks are about the connections among ‘actors’ within a specific type of knowledge and/or practice

    Actors can be individuals, groups or organizations

    Networks are constantly changing and have life cycles, similar to living things

    Networks can be social, work-related or interest-driven

    You can make maps of networks (and sometimes, they are quite beautiful)
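
    To make some of these terms concrete, here is a minimal sketch in Python using the networkx library; the actors and connections are invented for illustration. Actors become nodes, connections become edges, and Burt’s “constraint” measure flags actors who bridge structural holes.

        # A tiny network of hypothetical actors: two clusters joined by a broker.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("researcher_A", "broker"), ("researcher_B", "broker"),
            ("researcher_A", "researcher_B"),       # one tightly knit cluster...
            ("broker", "policymaker_1"), ("broker", "policymaker_2"),
            ("policymaker_1", "policymaker_2"),     # ...linked to another
        ])

        # Betweenness centrality: how often an actor sits on the shortest
        # paths between other actors (high values suggest a brokerage role).
        print(nx.betweenness_centrality(G))

        # Burt's constraint: a low score means an actor spans a "structural
        # hole" between otherwise weakly connected parts of the network.
        constraint = nx.constraint(G)
        print(min(constraint, key=constraint.get), "spans the biggest structural hole")

    Drawing the graph (networkx’s draw function, with matplotlib installed) produces exactly the kind of network map mentioned above.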

     

    This told me something about WHAT networks are, but little about WHY they exist and do what they do, so more digging was needed. It turns out there are many advantages to networks. Some advantages that I find most interesting include:

    Building connections beyond one’s individual or organizational experience

    Allowing greater ease of movement beyond professional, disciplinary, and organizational boundaries

    Encouraging shared learning, rapid diffusion of new knowledge, cross-fertilization of diverse ideas, efficient problem solving and enhanced group ownership

    Now I started to get even more interested. Aren’t these advantages aligned with some of the goals of knowledge mobilization? If networks are a structure that can achieve these goals, how might I use them in my work? Intentionally creating a network was a novel idea for me. But where to start? As far as I was aware, there was no course on network building I could take. Back to the literature I went.


    It turns out that a few researchers have studied networks in detail over the years, and many of the earlier helpful contributions come from two groups of people. One group consists of Provan and Milward, out of the University of Arizona. Along with some colleagues, they have concluded that there are several factors to consider when building networks and setting them up for success. Some of the factors that predict effectiveness are: the level of trust amongst members, the number of participants or “actors”, the level of consensus around the goals of the network, and the level of skill and experience with other networks that the actors bring. These also predict what model of network governance you should consider. The second group includes Robeson, Wenger, Garcia & Dorohovic, Huerta, and others, whose thinking has helped shape the following keys to network success:

    Establish clear purpose and goals

    Address hierarchy of needs

    Include a culture of trust in stated core values

    Fulfill specific role functions such as effective leadership, sponsorship, knowledge brokerage and community membership

    Maintain a flexible infrastructure

    Establish supportive processes

    Balance homogeneity and heterogeneity

    Secure adequate resources

    Demonstrate value

    The big question for KMb is: how do we maximize the value of networks to support moving evidence to action? Evidence Exchange Network (EENet) provides a case example. EENet is a knowledge exchange network that brings together diverse stakeholders, including researchers, policymakers, service providers, system planners, persons with lived experience, and family members. Our goal is to make Ontario’s mental health and addictions system more evidence-informed—no mean feat! But that’s why we take multiple approaches. We translate evidence – which we interpret broadly – into usable and accessible forms: Research Snapshots, for example. But we don’t simply disseminate findings through our network; we also help shape the knowledge that’s being created. Our Creating Together initiative brought together provincial partners to help set research priorities. Our Persons with Lived Experience (PWLE) and Family Members Advisory Panel – a mouthful, I know! – has informed projects under the Drug Treatment Funding Program.

    We are especially excited about our Communities of Interest (CoI) initiative. We view CoIs as forums for knowledge exchange and collaborative knowledge creation on a topic related to mental health and addictions. This format allows for progress toward more specific knowledge mobilization goals with the support of the network’s knowledge brokers. This year, nine CoIs are busily working toward their goals.

    All of these efforts – Creating Together, the PWLE and Family Members Advisory Panel, the CoIs – are almost like mini-networks within EENet. But then, that’s been our conception for a while: we are a network of networks. Maybe that’s confusing, but our aim is not to duplicate existing networks but, rather, to link up to them and help increase the spread of evidence.

    One challenge we continue to face is how to evaluate a network approach to knowledge mobilization, especially given the multi-faceted nature of the activities taking place within the structure. Our answer has been to develop a theory of change and use a multi-pronged approach that allows us to examine the network as a whole as well as specific activities within the network. One of the most interesting evaluative approaches we have employed to date is a social network analysis of where Ontario mental health and addictions stakeholders go for their evidence and how EENet fits within that landscape. Hopefully we will be publishing that soon.

    Some questions to leave you with:

    What networks do you belong to?

    Are there networks out there that can help you fulfil a KM goal? (It is always easier to plug into an existing structure than to build from scratch like we did)

    How can technology help networks? Is the technological platform the network? Or is it a tool for the network?

    What are some of the downsides of networks?

    In the meantime, here are some resources about networks that I have found useful:

    A new(er) paper on inter-organizational networks: http://health-leadership-research.royalroads.ca/sites/default/files/Inter-organizational%20networks%20A%20critical%20review%20of%20the%20literature%20to%20inform%20practice.pdf

    One of my “go to” resources for networks in a KMb context: http://www.nccmt.ca/pubs/NetworkingPaperApr09EN_WEB.pdf 

    To learn more about Evidence Exchange Network: http://eenet.ca

    Note: An earlier version of this blog originally appeared as part of a course on “Knowledge Mobilization and Evidence-based Practice” at Renison College, University of Waterloo in 2014.

    About me:

    Heather Bullock, MSc, is a PhD candidate in the Health Policy program at McMaster University in Ontario, Canada, and is part of the McMaster Health Forum's Impact Lab. Heather is a 2016 Trudeau Scholar. She has an extensive background in health care policy and knowledge mobilization, having held progressively senior leadership positions. She is on leave from her position as Director of Knowledge Exchange at CAMH, Canada’s largest mental health and addictions teaching hospital. In this role, she developed and led an innovative knowledge mobilization initiative, Evidence Exchange Network, which aims to make Ontario’s mental health and addictions system more evidence-informed. She also helped build a program that supports implementation efforts in Ontario’s mental health and addictions system. Heather has also worked for the Government of Ontario as a Research Transfer Advisor, where she connected people within government with the best available research evidence to support evidence-informed policy-making. She instructs several courses and modules on knowledge exchange, including a graduate-level course at McMaster University and, previously, at the University of Toronto.

    Heather’s research interests lie in how large jurisdictions implement evidence-informed policy directions in mental health systems. Her dissertation explores how developed countries structure their implementation efforts, as well as the process of policy implementation in Ontario’s mental health and addictions system. Heather serves in an advisory capacity for several provincial, national, and international initiatives, such as the International Knowledge Exchange Network for Mental Health and the Provincial Centre of Excellence for Child and Youth Mental Health. Her expertise is sought globally, and she has conducted training in knowledge mobilization and implementation for the Government of Saskatchewan, the University of Toronto, the University of Ottawa, the Government of Ireland, and the Swedish Government, among others. She has master’s-level training in behavioural ecology and evolutionary psychology from Queen’s University.

  • Knowledge Mobilization Connection:

    Five tips for getting knowledge into action 

    by Sarah Morton

    Sarah Morton, University of Edinburgh, works at the interface between social research, policy, and practice in a range of leadership roles. She is Co-Director at the Centre for Research on Families and Relationships (www.crfr.ac.uk), where she leads a Knowledge Exchange team. She is a Director of What Works Scotland (www.whatworksscotland.ac.uk), leading the evidence-to-action stream, which aims to increase the ways that local authorities can use evidence to develop public services. She is also the Director (Knowledge Exchange and Research Impact) for the Usher Institute of Population Health and Informatics (http://www.ed.ac.uk/usher). She has worked as an Impact Analyst within the University, for the ESRC (http://www.esrc.ac.uk/), and with wider projects. Sarah’s research has investigated the process of assessing the impact of research on policy and practice (http://crfrblog.blogspot.co.uk/2014/04/how-does-research-impact-happen.html). She is an Associate Editor of the journal Evidence and Policy (http://www.ingentaconnect.com/content/tpp/ep). Here Sarah draws on her experience and knowledge of the literature to offer five tips for getting knowledge into action.


    I think we are at a good point in the knowledge mobilisation field. We have built a body of research that explains a lot about what helps and hinders the use of knowledge to aid decision making in policy and practice (see Oliver 2014). We are developing rewards and incentives to help academics get research out of the academy (e.g. http://www.rcuk.ac.uk/innovation/policies/), and we are at least paying lip service to the idea that policy and practice should be informed by the latest evidence (KNAER is a good example).

    Despite this, there are still many pitfalls along the way, and it is often easier to identify where things went wrong than to point to success stories. The five tips below will not surprise many readers of this blog, and yet they address the pitfalls I most often see in practice. I’d really welcome comments from readers about whether these are issues you see in your own practice, whether I have missed something major, or whether you disagree completely!
    Five tips for getting knowledge into action:
    Plan ahead
    Any good project or process involves careful planning, but how often is evidence use included in the plan? For researchers who want their research to have impact, well-planned user engagement and KMb strategies have been shown to be effective (http://www.esrc.ac.uk/files/research/evaluation-and-impact/taking-stock-a-summary-of-esrc-s-work-to-evaluate-the-impact-of-research-on-policy-and-practice/). On the policy or practice side, valuing evidence, showing leadership, and embedding evidence into organisational practices are all key.

    So what would a planned evidence-use process look like? For those from policy or practice, it might consider how evidence will frame any project or development (http://www.crfr.ac.uk/assets/CRFR_ESS_IS_Evidence_base_briefing.pdf), how it will be considered and built on, what will be done when people don’t agree on what the evidence says, and how evidence will be accessed, analysed, and interpreted. For research teams and partners, it would consider who will be engaged and involved, what methods are best for engaging stakeholders, and how the research might contribute to change. This needs to move beyond simple ideas of making research accessible, into more complex and process-focussed projects. (I have written about this much more extensively here: http://rev.oxfordjournals.org/content/24/4/405.abstract)
    Get the right people round the table
    In our evidence-to-action projects About Families (http://www.crfr.ac.uk/projects/current-projects/about-families/) and the Evidence Request Bank (http://www.crfr.ac.uk/projects/current-projects/evidencebank), we learnt a lot about who is involved in evidence-use processes. Like others taking a systems-thinking approach (e.g. Best and Holmes 2010), we believe that it is essential to include a range of key actors in any knowledge mobilisation process. This would include considering the skills mix of any team in accessing, interpreting, and animating evidence of different types. Any systems change also needs to include the perspectives of all key players within the system. Depending on the size of the change project, these views might be represented in person or through consultation of various kinds.
    Have the conversation
    Often the starting point for evidence-use projects is the evidence itself, but a variety of discussions and framings are essential for moving evidence to action. What is the evidence needed for? What kinds of evidence might be useful? How will they be interpreted? How will evidence inform change processes? Who needs to be involved? Research doesn’t speak for itself, so relationships are key to moving evidence to action. Effective facilitation of knowledge mobilisation needs an ongoing sense of open dialogue, regular revisiting of planned aims, interrogation of context, and keeping the conversation going about the usefulness and relevance of evidence. We worked with Research Impact (http://researchimpact.ca/) and NCCPE (https://www.publicengagement.ac.uk/) to develop a manifesto for partnership research that can help frame some of this conversation: http://www.crfr.ac.uk/manifesto-for-partnership-research/
    Focus on the process
    Using evidence is not a one-off event, but an ongoing process. If people feel they have ‘done’ knowledge mobilisation, then they are missing a trick. Using and reusing evidence, checking as programmes develop, and building up more evidence as events unfold are all essential parts of successful knowledge mobilisation. An ongoing focus on the process can open up new opportunities, ensure ground is not lost, help address conflict and tension, and allow for assessment of changing contexts and their implications for KMb. Overall, a focus on process helps to ensure knowledge mobilisation continues to be as effective and relevant as it can be.
    Learn, evaluate, review
    I said at the opening of this blog that we are in a good place in terms of understanding the barriers and facilitators to knowledge mobilisation (although a recent review is opening up this conversation: http://www.nesta.org.uk/publications/using-evidence-what-works). We are in a less clear place about which strategies and methods are most effective in which circumstances (http://www.ncbi.nlm.nih.gov/books/NBK299400/). As a community of knowledge mobilisers, we need to develop evaluation methods, reflect more deeply, and write up what we find out. My own approach to this has been published here: http://rev.oxfordjournals.org/content/24/4/405.abstract. Every project needs a learning, review, and evaluation process, even if on a simple team scale. As the field matures, this will be essential in honing the craft, creating training programmes, and developing the most effective strategies.


    So those are my five tips for getting knowledge into action. How do they resonate with your own experience? What might you add? What resources do you use? I look forward to continuing the conversation.


    ----
    Best, A. and B. Holmes (2010). "Systems thinking, knowledge and action: towards better models and methods." Evidence & Policy: A Journal of Research, Debate and Practice 6: 145-159.
  • Knowledge Mobilization Connection:

    USING RESEARCH TO SHAPE KNOWLEDGE MOBILISATION PRACTICE

    by SANDRA NUTLEY

    DIRECTOR OF THE RESEARCH UNIT FOR RESEARCH UTILISATION (RURU), UNIVERSITY OF ST ANDREWS, UK

    As a long-time advocate of the benefits of using research to inform the development and implementation of public policy and service delivery practices, I am heartened by the growing number of knowledge mobilisation initiatives (operating in many sectors and countries) dedicated to facilitating and enhancing research use. However, I am struck by the irony that many of these initiatives struggle to demonstrate that their own knowledge mobilisation practices are themselves research-informed and in line with the best available research on how to enhance research use. I do not for a moment think that this is due to knowledge mobilisers wilfully disregarding research findings when it comes to their own practices, so what is going on and what questions might this raise for KNAER-RECRAE?  

    Huw Davies and Alison Powell and I have sought to shed some light on this conundrum in a recent project which considered how key research agencies (funders, producers and intermediaries) working in the fields of health, social care and education designed and implemented their knowledge mobilisation strategies. We reviewed the knowledge mobilisation literature to distil the latest thinking and empirical evidence on best practice. We then studied the agencies’ strategies and practices by reviewing the websites of over 200 agencies, obtaining online survey responses from over 100 agencies and conducting in-depth interviews with 51 agencies.

    The key tenets that emerged from the literature review included the importance of using relational approaches that bring researchers and research users together; acknowledging the importance of context; being aware of the needs of research users; drawing on a range of types of knowledge, not just research-based knowledge; and testing and evaluating interventions and feeding that knowledge back into future practice.

    We found from the survey that relatively few agencies were fully embodying these insights in their own approaches to knowledge mobilisation. There was a marked focus on producing knowledge products and on traditional ways of disseminating (‘pushing’) these products to policy makers and practitioners. Few agencies were conducting robust evaluations of their own knowledge mobilisation activities. Indeed, overall, the field of knowledge mobilisation seemed somewhat detached from its own knowledge base with activities being developed and implemented without reference to existing theory or empirical evidence, and without robust evaluations that could contribute to developing the knowledge base in this area.

    The survey offered some insights into the reasons for this disconnection between the knowledge mobilisation literature and knowledge mobilisation practice. Part of the explanation resides in agencies’ frustration with the literature due to its complex concepts, growing jargon, and still limited empirical evidence on the effectiveness of different knowledge mobilisation strategies and activities. 

    This is not the whole story, however, because the majority of survey respondents expressed agreement with statements derived from the key tenets emerging from the literature. Their views about the features of effective knowledge mobilisation were in essence in line with the literature but they nevertheless struggled to reflect these understandings in their agency’s knowledge mobilisation practices. They expressed some frustration that the knowledge mobilisation literature did not offer much guidance on how to translate conceptual principles and models into practical action. 

    This translation process and the difficulties of moving beyond a traditional ‘push’ approach to knowledge mobilisation are bound up with the contextual and capacity factors that played a significant role in shaping the knowledge mobilisation practices of our agencies. These included short-term funding regimes for mobilisation activities, and the performance measurement and accountability frameworks for researchers and agencies. The limited capacity of potential research-user organisations to engage in knowledge mobilisation activities also hindered the development of ongoing relationships and interactions.

    Do not despair: our study’s picture of the connection between knowledge mobilisation research and practice is not wholly gloomy. There are promising developments and it is in these that we see reasons for optimism and avenues for further development. 

    While the survey shows that few agencies fully embody the key tenets from the literature, many are seeking to move in this direction and some agencies have developed strategies and activities from which other agencies could learn.  Our survey indicated that very few agencies were already learning from each other and we would encourage the development of stronger cross-agency fora to enable this.  Such fora would facilitate learning from those agencies that have implemented successful relational approaches, are experimenting with innovative approaches and technologies, and are evaluating their knowledge mobilisation activities in ways that are adding to the existing knowledge base. 

    With regard to knowledge mobilisation research, there are already researchers who are seeking to develop better connections between their research and knowledge mobilisation practices through collaborative projects, including action research and the co-production of knowledge mobilisation research and practice. This should be encouraged, as it promises a situation where research and practice both inform and are informed by each other. 

    Members of KNAER-RECRAE have probably already identified points for self-reflection. These are likely to include reflecting on the knowledge base that underpins the network’s activities, whether the network is adding to and helping to refine that knowledge base, and how it is seeking to address the contextual and capacity factors that are likely to be limiting what it can do and achieve. I also hope that members of the network will see merit in further developing and contributing to fora that enable better cross-fertilisation of ideas and learning across knowledge mobilisation agencies, although such fora already seem more developed in Canada than in many other countries.

    Note: The findings referred to in this commentary are further explored in the final project report and in a paper published in Evidence & Policy.
  • Knowledge Mobilization Connection:

    Knowledge Mobilization and Human Rights: Making social justice evident

    By Peter Norman Levesque, KSJ
    President, Institute for Knowledge Mobilization

    This year marks fifteen years of my work in Knowledge Mobilization. When people ask me why I do this work, I have a simple statement that is grounded in a complex framework. It is a fight for human rights.

    As a Canadian, my family and I have benefitted profoundly from the framework provided by the Universal Declaration of Human Rights (UDHR), adopted by the United Nations General Assembly on 10 December 1948 in Paris. I am deeply grateful for the fortune to live in a beautiful and peaceful country. However, in the words of Rev. Dr. Martin Luther King, Jr., “no one is free until we are all free.”

    Arising from the atrocities of the Second World War, the declaration represents the first global expression of fundamental human rights to be universally protected. Some articles of the declaration are generally well known - the Right to Equality; Freedom from Discrimination; the Right to Life, Liberty, and Personal Security; Freedom from Slavery. Others are not quite as well known - the Right to a Nationality and the Freedom to Change It; the Right to Marriage and Family; the Right to Own Property.


    It was one of these lesser-known articles that triggered my understanding of knowledge mobilization as part of the fight for human rights. The conditions that support and sustain these fundamental rights are built. They can be built with the use of evidence. Evidence to demonstrate injustice. Evidence to demonstrate barriers to attaining each of the articles. Evidence to make evident systemic denial of human rights.


    Article 27 states: (1) Everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits. (2) Everyone has the right to the protection of the moral and material interests resulting from any scientific, literary or artistic production of which he is the author (emphasis is mine).


    My entry to the profession of knowledge mobilization began officially in 2001 with the creation of the Office of Knowledge Products and Mobilization at the Social Sciences and Humanities Research Council of Canada. Following the successful piloting and evaluation of the Community University Research Alliances (CURA) programme, knowledge mobilization was coined as a term to describe the umbrella of activities and practices that supported civil society organization participation in research and the conditions that enabled enhanced uptake of findings into policy, programs, and practice. It was my privilege to help write the CURA programme and then act as its Programme Officer. It followed that I was a suitable candidate to help pilot the programme of activities in the new office dedicated to knowledge mobilization.


    Much of the early work was an exercise in equal parts framing and frustration – what is this new “thing” we have created? What to include or exclude? What can we support or not? How does this fit with community-based research, technology transfer, action research, or lobbying? Some of these questions are still part of the conversation.


    It was one such conversation, with Dr. Henk Mulder, a chemist and Science Shop (Wetenschapswinkels) leader at the University of Groningen, in the Netherlands, that introduced me to the human rights framework as a way of thinking about knowledge mobilization.


    The participation of the community in the research process is connected to Article 27. The exchange and translation processes and actions of knowledge mobilization connect to the “shared benefits” statement in Article 27. When you plunge in, the work of knowledge mobilization professionals connects to all the Articles – research on law and governance (Article 6 – Everyone has the right to recognition everywhere as a person before the law); research on housing (Article 25); research on access to education (Article 26); research on health (Article 3); and onwards. Research is not enough. We need to mobilize this research to help inform the construction of the conditions for human rights.


    This realization changed my life. It changed the narrative I used when talking about knowledge mobilization. It changed the way I related to institutions, governments, disciplines, and other organizing systems we use to define our connections and activities.

    I remain interested in research. I remain interested in methods and techniques of analysis. What interests me most is how we make the world a better place. My interest in your research is how it helps build the conditions where I am free – because you are free. A world where you have housing, food, love, community, education, water, security, and health – supported by evidence of what works and what does not work.

    This is the spirit in which the Canadian Knowledge Mobilization Forum was created. A space where we come together, across a broad diversity of expertise, issues, and methods, to learn from each other and co-create the world in which we all live with the rights that are inalienably ours. The full realization of the promise of the Universal Declaration of Human Rights is a complex and emergent process. A process that is only possible if we use the best of what we know – together we are mobilizers of this better world.

  • KNAER-RECRAE Highlight:

    The buzzing sound of enthusiastic chatter, the wafting smell of a gourmet breakfast, and some friendly faces greeted our KNAER team as we ascended the escalator at 89 Chestnut Street to join the 2016 diTHINK conference.

    Our team was on a mission. Drawing on Peter Levesque’s strategy of homing in on a few ideas, people, and actions to follow up on post-conference, our team was on the lookout for ideas, people, or actions that could further inspire us in our quest to mobilize knowledge in the Ontario education system. Here is what stood out for us…

    [Sketch note 1.1: SketchNotes by Shasta Carr-Harris]


    The panel discussion with Mark Daley, Compute Ontario Board Chair; Diane Findlay, from Compass for Success; Dan Mathieson, Mayor of Stratford; Jutta Treviranus, Director of the Inclusive Design Research Centre at OCADU; and moderator Richard Garner, VP of Media and Communications for The Stronach Group, raised a number of issues that are critical to KNAER’s work. Panelists discussed the importance of diversity and of using technology to link diverse communities and stakeholders to exchange ideas and resources. Jutta Treviranus raised the negative impact of disparity and the double-edged role technology can play in either reinforcing or disrupting systemic disparity: it all depends on how we use it. If modern technology is made accessible to only a privileged few, it has the potential to entrench present-day hierarchies. If it is accessible to all, it has the potential to counter disparity by linking marginal and mainstream groups and ensuring that a diversity of perspectives, ideas, knowledge, and skills is shared and reaches the public domain.

    In breakout sessions, engaging conversations ensued around the role of technology in collaboration, as a tool for data processing, and in supporting transparency in the public and private sectors. For example, the open data movement has placed emphasis on data sharing among groups, as well as on the analysis of publicly available large data sets for the public good. Data processing and analysis techniques (including algorithms for processing large amounts of data) were discussed, as was the critical role technology plays in turning raw data and information into concrete knowledge, which can be used to support improved practice and policy making.

    [diTHINK panel 1.1]

    In the diLEARN breakout session, important points were raised by the panel about technology and education. We were particularly interested in the connections Rhonda McEwan made about how technology can be used to support student learning. Rhonda felt that while many people assume that technology is primarily a teaching tool for students, her research shows that students tend to use devices to practice what they have previously been taught. This distinction gave us something to think about!

    Earlier in the day, a video presentation on the use of Minecraft in education reminded us of the importance of making learning (and training) fun and the role that games – both low- and high-tech – can play in turning learning from a chore into an enjoyable experience. Connected to this is the valuable role that instant feedback (often provided in tech games) can play in supporting rapid learning and skill development.

    On the lookout for leaders who use technology for knowledge mobilization in education, our team identified @dianefindlay as someone we would like to connect with. Diane Findlay has helped Compass for Success (a not-for-profit organization that focuses on the use of data by educators to improve student outcomes) support over 40 school boards and First Nation education organizations across the province of Ontario. We hope to learn more about Diane’s work and the specific technology she has used to mobilize knowledge to and among diverse stakeholders in Ontario education.

    All in all, a memorable day with many takeaways for the KNAER team. Thanks diTHINK 2016!
