
21. Qualitative research dissemination

Chapter outline.

  • Ethical responsibility and cultural respectfulness (8 minute read time)
  • Critical considerations (5 minute read time)
  • Informing your dissemination plan (11 minute read time)
  • Final product taking shape (10 minute read time)

Content warning: Examples in this chapter contain references to research as a potential tool to stigmatize or oppress vulnerable groups, mistreatment and inequalities experienced by Native American tribes, sibling relationships, caregiving, child welfare, criminal justice and recidivism, first generation college students, Covid-19, school culture and race, health (in)equity, physical and sensory abilities, and transgender youth.

Your sweat and hard work have paid off! You’ve planned your study, collected your data, and completed your analysis. But alas, no rest for the weary student researcher. Now you need to share your findings. As researchers, we generally have some idea of where and with whom we want to share our findings, but these plans may evolve and change during the research process. Communicating our findings to a broader audience is a critical step in the research process, so make sure not to treat this like an afterthought. Remember, research is about making a contribution to collective knowledge-building in the area of study that you are interested in. Indeed, research is of no value if there is no audience to receive it. You worked hard…get those findings out there!

In planning for this phase of research, we can consider a variety of methods for sharing our study findings. Among other options, we may choose to write our findings up as an article in a professional journal, provide a report to an organization, give testimony to a legislative group, or create a presentation for a community event. We will explore these options in a bit more detail below in section 21.4 where we talk more about different types of qualitative research products. We also want to think about our intended audience.

For your research, answer these two key questions as you are planning for dissemination:

  • Who is the target audience for your findings? In other words, who needs to hear the results of your study?
  • What do you hope your audience will take away after learning about your study?


21.1 Ethical responsibility and cultural respectfulness

Learning Objectives

Learners will be able to…

  • Identify key ethical considerations in developing their qualitative research dissemination plan
  • Conceptualize how research dissemination may impact diverse groups, both presently and into the future

Have you ever been misrepresented or portrayed in a negative light? It doesn’t feel good. It especially doesn’t feel good when the person portraying us has power, control and influence.  While you might not feel powerful, research can be a powerful tool, and can be used and abused for many ends. Once research is out in the world, it is largely out of our control, so we need to approach dissemination with care. Be thoughtful about how you represent your work and take time to think through the potential implications it may have, both intended and unintended, for the people it represents.

As alluded to in the paragraph above, research comes with hefty responsibilities. You aren’t off the hook if you are conducting quantitative research.  While quantitative research deals with numbers, these numbers still represent people and their relationships to social problems. However, with qualitative research, we are often dealing with a smaller sample and trying to learn more from them. As such, our job often carries additional weight as we think about how we will represent our findings and the people they reflect. Furthermore, we probably hope that our research has an impact; that in some way, it leads to change around some issue. This is especially true as social work researchers. Our research often deals with oppressed groups, social problems, and inequality. However, it’s hard to predict the implications that our research may have. This suggests that we need to be especially thoughtful about how we present our research to others.

Two of the core values of social work are respecting the inherent dignity and worth of each person, and practicing with integrity by behaving in an honest and trustworthy manner [1]. As social work researchers, to uphold these values, we need to consider how we are representing the people we are researching. Our work needs to honestly and accurately reflect our findings, but it also needs to be sensitive and respectful to the people it represents. In Chapter 8 we discussed research ethics and introduced the concept of beneficence, or the idea that research needs to support the welfare of participants. Beneficence is particularly important as we think about our findings becoming public and how the public will receive, interpret, and use this information. Thus, both as social workers and researchers, we need to be conscientious of how dissemination of our findings takes place.

As you think about the people in your sample and the communities or groups to which they belong, consider some of these questions:

  • How are participants being portrayed in my research?
  • What characteristics or findings are being shared or highlighted in my research that may directly or indirectly be associated with participants?
  • Have the groups that I am researching been stigmatized, stereotyped, and/or misrepresented in the past? If so, how does my research potentially reinforce or challenge these representations?
  • How might my research be perceived or interpreted by members of the community or group it represents?
  • In what ways does my research honor the dignity and worth of participants?


Qualitative research often has a voyeuristic quality to it, as we are seeking a window into participants’ lives by exploring their experiences, beliefs, and values. As qualitative researchers, we have a role as stewards or caretakers of data. We need to be mindful of how data are gathered, maintained, and most germane to our conversation here, how data are used. We need to craft research products that honor and respect individual participants (micro), our collective sample as a whole (meso), and the communities that our research may represent (macro).

As we prepare to disseminate our findings, our ethical responsibilities as researchers also involve honoring the commitments we have made during the research process. We need to think back to our early phases of the research process, including our initial conversations with research partners and other stakeholders who helped us to coordinate our research activities. If we made any promises along the way about how the findings would be presented or used, we need to uphold them here. Additionally, we need to abide by what we committed to in our informed consent. Part of our informed consent involves letting participants know how findings may be used. We need to present our findings according to these commitments. We of course also have a commitment to represent our research honestly.

As an extension of our ethical responsibilities as researchers, we need to consider the impact that our findings may have, as well as our obligation to be socially conscientious researchers. As scouts, we were taught to leave our campsite in a better state than when we arrived. I think it is helpful to think of research in these terms. Think about the group(s) that may be represented by your research; what impact might your findings have for the lives of members of this group? Will it leave their lives in a better state than before you conducted your research? As a responsible researcher, you need to be thoughtful, aware, and realistic about how your research findings might be interpreted and used by others. As social workers, while we hope that findings will be used to improve the lives of our clients, we can’t ignore that findings can also be used to further oppress or stigmatize vulnerable groups; research is not apolitical, and we should not be naive about this.

It is worth mentioning the concept of sustainable research here. Sustainable research involves conducting research projects that have a long-term, sustainable impact for the social groups we work with. As researchers, this means that we need to actively plan for how our research will continue to benefit the communities we work with into the future. This can be supported by staying involved with these communities, routinely checking in and seeking input from community members, and making sure to share our findings in ways that community members can access, understand, and utilize. Nate Olson provides a very inspiring TED Talk about the importance of building resilient communities. As you consider your research project, think about it in these terms.

Key Takeaways

  • As you think about how best to share your qualitative findings, remember that these findings represent people. As such, we have a responsibility as social work researchers to ensure that our findings are presented in honest, respectful, and culturally sensitive ways.
  • Since this phase of research deals with how we are going to share our findings with the public, we need to actively consider the potential implications of our research and how it may be interpreted and used.

Is your work, in some way, helping to contribute to a resilient and sustainable community? It may not be a big tangible project as described in Olson’s TED Talk, but is it providing a resource for change and growth to a group of people, either directly or indirectly? Does it promote sustainability amongst the social networks that might be impacted by the research you are conducting?

21.2 Critical considerations

Learning Objectives

Learners will be able to…

  • Identify how issues of power and control are present in the dissemination of qualitative research findings
  • Begin to examine and account for their own role in the qualitative research process, and address this in their findings

This is the part of our research that is shared with the public and because of this, issues like reciprocity, ownership, and transparency are relevant.  We need to think about who will have access to the tangible products of our research and how that research will get used. As researchers, we likely benefit directly from research products; perhaps it helps us to advance our career, obtain a good grade, or secure funding.  Our research participants often benefit indirectly by advancing knowledge about a topic that may be relevant or important to them, but often don’t experience the same direct tangible benefits that we do. However, a participatory perspective challenges us to involve community members from the outset in discussions about what changes would be most meaningful to their communities and what research products would be most helpful in accomplishing those changes. This is especially important as it relates to the role of research as a tool to support empowerment.

Ownership of research products is also important as an issue of power and control. We will discuss a range of venues for presenting your qualitative research, some of which are more amenable to shared ownership than others. For instance, if you are publishing your findings in an academic journal, you will need to sign an agreement with that publisher about how the information in that article can be used and who has access to it. Similarly, if you are presenting findings at a national conference, travel and other conference-related expenses and requirements may make access to these research products prohibitive. In these instances, the researcher and the organization(s) they negotiate with (e.g. the publishing company, the conference organizing body) share control. However, disseminating qualitative findings in a public space, public record, or community-owned resource means that more equitable ownership might be negotiated. An equitable or reciprocal arrangement might not always be able to be reached, however. Transparency about who owns the products of research is important if you are working with community partners. To support this, establishing a Memorandum of Understanding (MOU) or Memorandum of Agreement (MOA) early in the research process is important. This document should clearly articulate roles, responsibilities, and a number of other details, such as ownership of research products, between the researcher and the partnering group(s).

Resources for learning more about MOUs and MOAs

Center for Community Health and Development, University of Kansas. (n.d.). Community toolbox: Section 9. Understanding and writing contracts and memoranda of agreement [Webpage]. https://ctb.ku.edu/en/table-of-contents/structure/organizational-structure/understanding-writing-contracts-memoranda-agreement/main

Collaborative Center for Health Equity, University of Wisconsin-Madison. (n.d.). Standard agreement for research with community organizations [Template]. https://d1uqjtzsuwlnsf.cloudfront.net/wp-content/uploads/sites/163/2018/08/CCHE-UW-MOU-sample.pdf

Office of Research, UC Davis. (n.d.). Research MOUs [Webpage].  https://research.ucdavis.edu/proposals-grants-contracts/international-agreements/memorandum-understanding/

Office of Research, The University of Texas at Dallas. (n.d.). Types of agreements [Webpage]. https://research.utdallas.edu/researchers/contracts/types-of-agreements

In our discussion about qualitative research, we have also frequently identified the need for the qualitative researcher to account for their role throughout the research process. Part of this accounting can specifically apply to qualitative research products. This is our opportunity to demonstrate to our audience that we have been reflective throughout the course of the study and to show how this has influenced the work we did. Some qualitative research studies include a positionality statement within the final product. This is often placed toward the beginning of the report or the presentation and includes information about the researcher(s)’s identity and worldview, particularly details relevant to the topic being studied. This can include why you are invested in the study, what experiences have shaped how you have come to think about the topic, and any positions or assumptions you hold with respect to the topic. This is another way to encourage transparency. It can also be a means of relinquishing, or at least acknowledging, some of our power in the research process, as it provides one modest way for us, as the researcher, to be a bit more exposed or vulnerable, although this is a far cry from making the risks of research equitable between the researcher and the researched.

The positionality statement can also be a place to integrate our identities: who we are as an individual, a researcher, and a social work practitioner. Granted, for some of us that might be volumes, but we need to condense this down to a brief but informative statement – don’t let it eclipse the research! It should be just enough to inform the audience and allow them to draw their own conclusions about who is telling the story of this research and how well they can be trusted. Reviewing your reflexive journal (discussed in Chapter 20 as a tool to enhance qualitative rigor) can help in identifying underlying assumptions and positions grounded in your reactions throughout the research process; these insights can be integrated into your positionality statement. Please take a few minutes to watch this informative video, in which a student explains what a positionality statement is and discusses the statement she developed for her own study.

Key Takeaways

  • The products of qualitative research often benefit the researcher disproportionately when compared to research participants or the communities they represent. Whenever possible, we can seek out ways to disseminate research that address this imbalance and support more tangible and direct benefits to community members.
  • Openly positioning ourselves in our dissemination plans can be an important way for qualitative researchers to be transparent and account for our role.

21.3 Informing your dissemination plan

Learning Objectives

Learners will be able to…

  • Appraise important dimensions of planning that will inform their research dissemination plan, including: audience, purpose, context and content
  • Apply this appraisal to key decisions they will need to make when designing their qualitative research product(s)

This section will offer you a general overview of points to consider as you form the dissemination plan for your research. We will start with considerations regarding your audience, then turn our attention to the purpose of your research, and finally consider the importance of attending to both content and context as you plan for your final research product(s).

Perhaps the most important consideration you have as you plan how to present your work is your audience. Research is a product that is meant to be consumed, and because of this, we need to be conscious of our consumers. We will speak more extensively about knowing your audience in Chapter 24, which is devoted to both sharing and consuming research. Regardless of who your audience is (e.g., community members, classmates, research colleagues, practicing social workers, state legislators), there will be common elements that will be important to convey. While the way you present them will vary greatly according to who is listening, Table 21.1 offers a brief review of the elements that you will want your audience to leave with.

Table 21.1 Elements to consider when planning for your audience

  • Aim: What did my research aim to accomplish, and why is it important?
  • Process: How did I go about conducting my research, and what did I do to ensure quality?
  • Concepts: What are the main ideas someone needs to know to make sense of my topic, and how are they integrated into my research plan?
  • Findings: What were my results, and under what circumstances are they valid?
  • Connection: How are my findings connected to what else we know about this topic, and why are they important?

Once we determine who our audience is, we can further tailor our dissemination plan to that specific group.  Of course, we may be presenting our findings in more than one venue, and in that case, we will have multiple plans that will meet the needs of each specific audience.

It’s a good idea to pitch your plan first. However you plan to present your findings, you will want someone to preview your work before you share it with a wider audience. Ideally, whoever previews it will be a person from your target audience, or at least someone who knows that audience well. Getting feedback can go a long way toward improving the clarity with which we convey our ideas and the impact they have on our audience. This might involve giving a practice speech, having someone review your article or report, or practicing discussing your research one-on-one, as you would with a poster presentation. Let’s talk about some specific audiences that you may be targeting and their unique needs or expectations.

Below I will go through some brief considerations for each of these different audiences. I have tried to focus this discussion on elements that are specifically relevant to qualitative studies, since we revisit this topic in Chapter 24.


Research community

When presenting your findings to an academic audience or other research-related community, it is probably safe to make a few assumptions. This audience is likely to have a general understanding of the research process and what it entails. For this reason, you will have to do less explaining of research-related terms and concepts. However, compared to other audiences, you will probably have to provide a bit more detail about the steps you took in your research process, especially as they relate to qualitative rigor, because this group will want to know how your research was carried out and how you arrived at your decisions throughout the research process. Additionally, you will want to make a clear connection between the qualitative design you chose and your research question; a methodological justification. Researchers will also want to have a good idea of how your study fits within the wider body of scientific knowledge it is related to and what future studies you feel are needed based on your findings. You are likely to encounter this audience if you are disseminating through a peer-reviewed journal article, presenting at a research conference, or giving an invited talk in an academic setting.

Professional community

We often find ourselves presenting our research to other professionals, such as social workers in the field. While this group may have some working knowledge of research, they are likely to be much more focused on how your research is related to the work they do and the clients they serve. While you will need to convey your design accurately, this audience is most likely to be invested in what you learned and what it means (especially for practice). You will want to set the stage for the discussion by doing a good job expressing your connection to and passion for the topic (a positionality statement might be particularly helpful here), what we know about the issue, and why it is important to their professional lives. You will want to give good contextual information for your qualitative findings so that practitioners can know whether these findings might apply to the people they work with. Also, since social work practitioners generally place emphasis on person-centered practice, hearing the direct words of participants (quotes) whenever possible is likely to be impactful as we present qualitative results. Where academics and researchers will want to know about implications for future research, professionals will want to know how this information could help transform services in the future or help them understand the clients they serve.

Lay community

The lay community consists of people who don’t necessarily have specialized training or knowledge of the subject, but may be interested or invested for some other reason; perhaps the issue you are studying affects them or a loved one. Since this is the general public, you should expect to spend the most time explaining scientific knowledge, research processes, and terminology in accessible terms. Furthermore, you will want to invest some time establishing a personal connection to the topic (as I discussed for the professional community). They will likely want to know why you are interested and why you are a credible source for this information. While this group may not be experts on research, as potential members of the group(s) that you may be researching, you do want to remember that they are experts in their own community. As such, you will want to be especially mindful of approaching how you present findings with a sense of cultural humility (although hopefully you have this in mind across all audiences). It will be good to discuss what steps you took to ensure that your findings accurately reflect what participants shared with you (rigor). You will want to be most clear with this group about what they should take away, without overstating your findings.

Regardless of who your audience is, remember that you are an ambassador.  You may represent a topic, a population, an organization, or the whole institution of research, or any combination of these.  Make sure to present your findings honestly, ethically, and clearly.  Furthermore, I’m assuming that the research you are conducting is important because you have spent a lot of time and energy to arrive at your findings. Make sure that this importance comes through in your dissemination.  Tell a compelling story with your research!  

Who needs to hear the message of your qualitative research?

  • Example. If you are presenting your research about caregiver fatigue to a caregiver support group, you won’t need to spend time describing the role of caregivers because your audience will have lived experience.
  • Example. If you are presenting your research findings to a group of academics, you wouldn’t have to explain what a sampling frame is, but if you are sharing it with a group of community members from a local housing coalition, you will need to help them understand what this is (or maybe use a phrase that is more meaningful to them).
  • Example. If you are speaking to a group of child welfare workers about your study examining trauma-informed communication strategies, they are probably going to want to know how these strategies might impact the work that they do.
  • Example. If you are sharing your findings at a meeting with a council member, it may be especially meaningful to share direct quotes from constituents.

Being clear about the purpose of your research from the outset is immeasurably helpful.  What are you hoping to accomplish with your study?  We can certainly look to the overarching purpose of qualitative research, that being to develop/expand/challenge/explore understanding of some topic.  But, what are you specifically attempting to accomplish with your study? Two of the main reasons we conduct research are to raise awareness about a topic and to create change around some issue. Let’s say you are conducting a study to better understand the experience of recidivism in the criminal justice system. This is an example of a study whose main purpose is to better understand and raise awareness around a particular social phenomenon (recidivism). On the other hand, you could also conduct a study that examines the use of strengths-based strategies by probation officers to reduce recidivism. This would fall into the category of research promoting a specific change (the use of strengths-based strategies among probation officers). I would wager that your research topic falls into one of these two very broad categories. If this is the case, how would you answer the corresponding questions below?

Are you seeking to raise awareness of a particular issue with your research? If so,

  • Whose awareness needs raising? 
  • What will “speak” most effectively to this group? 
  • How can you frame your research so that it has the most impact?

Are you seeking to create a specific change with your research? If so,

  • What will that change look like? 
  • How can your research best support that change occurring? 
  • Who has the power to create that change and what will be most compelling in reaching them? 

How you answer these questions will help to inform your dissemination plan.  For instance, your dissemination plan will likely look very different if you are trying to persuade a group of legislators to pass a bill versus trying to share a new model or theory with academic colleagues. Considering your purposes will help you to convey the message of your research most effectively and efficiently. We invest a lot of ourselves in our research, so make sure to keep your sights focused on what you hope to accomplish with it!

Content and context

As a reminder, qualitative research often has a dual responsibility for conveying both content and context. You can think of content as the actual data that is shared with us or that we obtain, while context is the circumstances under which that data sharing occurs. Content conveys the message and context provides us the clues with which we can decode and make sense of that message.

While quantitative research may provide some contextual information, especially in regard to describing its sample, context rarely receives as much attention or detail as it does in qualitative studies. Because of this, you will want to plan for how you will attend to both the content and the context of your study as you prepare for dissemination.

Key Takeaways

  • Research is an intentional act; you are trying to accomplish something with it. To be successful, you need to approach dissemination deliberately and with a plan.
  • Planning the most effective way of sharing our qualitative findings requires looking beyond what is convenient or even conventional. It requires us to consider a number of factors, including our audience, the purpose or intent of our research, and the nature of both the content and the context that we are trying to convey.

21.4 Final product taking shape

Learning Objectives

Learners will be able to…

  • Evaluate the various means of disseminating research and consider their applicability for their research project
  • Determine appropriate building blocks for designing their qualitative research product

As we have discussed, qualitative research takes many forms. It should then come as no surprise that qualitative research products also come in many different packages. To help guide you as the final products of your research take shape, we will discuss some of the building blocks or elements that you are likely to include as tools in sharing your qualitative findings.  These are the elements that will allow you to flesh out the details of your dissemination plan.

Building blocks

There are many building blocks that are at our disposal as we formulate our qualitative research product(s). Quantitative researchers have charts, graphs, tables, and narrative descriptions of numerical output.  These tools allow the quantitative researcher to tell the story of their research with numbers. As qualitative researchers, we are tasked with telling the story of our research findings as well, but our tools look different.  While this isn’t an exhaustive list of tools that are at our disposal as qualitative researchers, a number of commonly used elements in sharing qualitative findings are discussed here.  Depending on your study design and the type of data you are working with, you may use one or some combination of the building blocks discussed below.

Themes

Themes are a very common element when presenting qualitative research findings. They may be called themes, but they may also go by other names: categories, dimensions, main ideas, etc. Themes offer the qualitative researcher a way to share ideas that emerged from the analysis and that were shared by multiple participants or across multiple sources of data. They help us to distill the large amounts of qualitative data that we might be working with into more concise and manageable pieces of information that are more consumable for our audience. When integrating themes into your qualitative research product, you will want to offer your audience: the title of the theme (try to make this as specific and meaningful as possible), a brief description or definition of the theme, any accompanying dimensions or sub-themes that may be relevant, and examples (when appropriate).

Quotes

Quotes offer you the opportunity to share participants’ exact words with your audience. Of course, we can’t rely only on quotes, because we need to knit the information that is shared into one cohesive description of our findings, and an endless list of quotes is unlikely to support this. Because of this, you will want to be judicious in selecting your quotes. Choose quotes that can stand on their own, best reflect the sentiment that is being captured by the theme or category of findings that you are discussing, and are likely to speak to and be understood by your audience. Quotes are a great way to help your findings come alive or to give them greater depth and significance. If you are using quotes, be sure to do so in a balanced manner – don’t use them only in some sections but not others, or use a large number to support one theme and only one or two for another. Finally, we often provide some brief demographic information in a parenthetical reference following a quote so our reader knows a little bit about the person who shared the information. This helps to provide some context for the quote.

Kohli and Pizarro (2016) [2] provide a good example of a qualitative study using quotes to exemplify their themes. In their study, they gathered data through short-answer questionnaires and in-depth interviews from racial-justice oriented teachers of Color. Their study explored the experiences and motivations of these teachers and the environments in which they worked. As you might guess, the words of the teacher-participants were especially powerful and the quotes provided in the results section were very informative and important in helping to fulfill the aim of the research study. Take a few minutes to review this article.  Note how the authors provide a good amount of detail as to what each of the themes meant and how they used the quotes to demonstrate and support each theme. The quotes help bring the themes to life and anchor the results in the actual words of the participants (suggesting greater trustworthiness in the findings).   

Figure 21.1 below offers a more extensive example of a theme being reported along with supporting quotes from a study conducted by Karabanow, Gurman, and Naylor (2012) [3]. This study focused on the role of work activities in the lives of “Guatemalan street youth”. One of the important themes had to do with the intersection of work and identity for this group. In this example, brief quotes are used within the body of the description of the theme, and longer quotes (full sentences) are also used to demonstrate important aspects of the description.

Figure 21.1. Example theme from the study Karabanow, J., Gurman, E., & Naylor, T. (2012). Street youth labor as an expression of survival and self-worth. Critical Social Work, 13(2).

Work, be it formal or informal, is beneficial for street youth not only for its financial benefits; but it helps youth to develop and rebuild their sense of self, to break away from destructive patterns, and ultimately contributes to any goals of exiting street life. Although many of the participants were aware that society viewed them as “lazy” or “useless,” they tended to see themselves as contributing members of society earning a valid and honest living. One participant said, “Well, a lot of people say, right? ‘The kid doesn’t want to do anything. Lazy kid’ right? And I wouldn’t like for people to say that about me, I’d rather do something so that they don’t say that I’m lazy. I want to be someone important in life.” This youth makes an interesting and important connection in this statement: he intrinsically associates “being someone” with “doing something” – he accepts the work-based identity that characterizes much of contemporary capitalist society. Many of the interviews subtly enforced this idea that in the informal economy, as in the formal economy, “who one is” is largely dependent on “what one does.” This demonstrates two important ideas: that street youth working in the informal sector are surprisingly ‘mainstream’ in their underlying beliefs and ambitions, and that work – be it formal or informal – plays a crucial role in allowing street youth, who have often dealt with trauma, isolation and low self-esteem, to rebuild a sense of self-worth.

Many of the youth involved in this study dream of futures that echo traditional ideals: “to have my family all together…to have a home, or rather to have a nice house…to have a good job.” Several explained that this future is unattainable without hard work; many viewed those who “do nothing” as people who “waste their time” and think that “your life isn’t important to you.” On the other hand, those who value their lives and wish to attain a future of “peace and tranquility” must “look for work, that’s what needs to be done to have that future because if God allows it, in the future maybe you can find a partner, form a family and live peacefully.” For these youth, working – be it in the formal or informal sector – is essential to a feeling of “moving forward ( ).” This movement forward begins with self-esteem. Although the focus of this study was not the troubled pasts of the participants, many alluded to the difficulties they have faced and the various traumas that forced them onto the streets. Several of the youth noted that working was a catalyst in rebuilding positive feelings about oneself: one explained, “[When I’m working,] I feel happy, powerful…Sometimes when I go out to sell, I feel happy.” Another said:

For me, when I’m working I feel free because I know that I’m earning my money in an honest way, not stealing right. Because when you’re stealing, you don’t feel free, right? Now when you’re working, you’re free, they can’t arrest you or anything because you’re selling. Now if you’re stealing and everything, you don’t feel free. But when you’re selling you feel free, out of danger.

This feeling of being “free” or “powerful” rests on the idea that money is “earned” and not stolen; being able to earn money is associated with being “someone,” with being a valid and contributing member of society.

In addition, work helps street youth to break away from destructive patterns. One participant spoke of her experience working full time at a café:

For me, working means to be busy, to not just be there….It helps us meet other people, like new people and not to be always in the same scene. Because if you’re not busy, you feel really bored and you might want to, I don’t know, go back to the same thing you were in before…you even forget your problems because you’re keeping busy, you’re talking to other people, people who don’t know you.

For this participant, a formal job was beneficial in that it supplied her with a daily routine and allowed her to interact with non-street people – these factors helped to separate her from the destructive lifestyle of the street, and helped her to “move forward.” Although these benefits are indeed most obvious with formal employment, many participants spoke of the positive effects of informal work as well, although to varying degrees. In Guatemala, since the informal economy accounts for over half of the country’s GNP, there is a wide range of under-the-table informal work available. These jobs frequently bring youth out of the street context and, therefore, provide similar benefits to a formal job, as described by the above participant. As to informal work that takes place on the street, such as hawking or car watching, the benefits of work are present, although to a different degree. Even hawking, for example, gives young workers a routine and a chance to interact with non-street people. As one young man continuously emphasized throughout his interview, “work helps you to keep your mind busy, to be in another mind-set, right? To not be thinking the same thing all the time: ‘Oh, drugs, drugs, drugs…’” As explained earlier, the code of the hawking world dictates that vendors cannot sell while high – just like a formal job, hawking helps to distance youth workers from some of their destructive street habits. However, as one participant thoughtfully noted, it is difficult to break these habits when one is still highly embroiled in street culture; “it depended on who was around me because if they were in the same problems as I was, I stopped working and I started doing the same as they did. And if I was surrounded by serious people, then I got my act together.” While certain types of informal work, like cleaning or waitressing, can help youth to distance themselves from destructive patterns, others, such as car watching and selling, may not do enough to separate youth from their peers. While the routine and activity do have positive effects, they often are not sufficient.

Among some of the participants, there was the sentiment that informal work could function as a transition stage towards exiting the street; it could “change your life.” One participant said “there are lots of vendors who’ve gotten off the streets, if you make an effort, you go out to sell, you can get off the street. Like myself, when I was selling, I mean working, I got off the street, I went home and I managed to stay there quite a long time.” One might credit this success to several factors: first, the money the seller may have been able to save and accumulate; second, the routine of selling may have helped the seller to break from destructive patterns, such as drug use, and also prepared the seller for the demands of formal sector employment; and, thirdly, selling may have enabled the seller to develop the necessary confidence and sense of self to attempt exiting the street.

Pictures or videos

If our data collection involves the use of photographs, drawings, videos or other artistic expression of participants or collection of artifacts, we may very well include selections of these in our dissemination of qualitative findings. In fact, if we failed to include these, it would seem a bit inauthentic. For the same reason we include quotes as direct representations of participants’ contributions, it is a good idea to provide direct reference to other visual forms of data that support or demonstrate our findings. We might incorporate narrative descriptions of these elements or quotes from participants that help to interpret their meaning. Integrating pictures and quotes is especially common if we are conducting a study using a Photovoice approach, as we discussed in Chapter 17, where a main goal of the research technique is to bring together participant-generated visuals with collaborative interpretation.

Take some time to explore the website linked here. It is the webpage for The Philadelphia Collaborative for Health Equity’s PhotoVoice Exhibit Gallery and offers a good demonstration of research that brings together pictures and text.

Graphic or figure

Qualitative researchers will often create a graphic or figure to visually reflect how the various pieces of their findings come together or relate to each other. Using a visual representation can be especially compelling for people who are visual learners. When you are using a visual representation, you will want to: label all elements clearly; include all the components or themes that are part of your findings; pay close attention to where you place and how you orient each element (as their spatial arrangement carries meaning); and finally, offer a brief but informative explanation that helps your reader to interpret your representation. A special subcategory of visual representation is the process diagram or model. These are especially helpful for laying out a sequential relationship within your findings or a model that has emerged out of your analysis. A process or model will show the ‘flow’ of ideas or knowledge in our findings, the logic of how one concept proceeds to the next, and what each step of the model entails.

Noonan and colleagues (2004) [4] conducted a qualitative study that examined the career development of high-achieving women with physical and sensory disabilities. Through the analysis of their interviews, they built a model of career development based on these women’s experiences, along with a figure that helps to conceptually illustrate the model. They place the ‘dynamic self’ in the center, surrounded by a dotted (permeable) line, with a number of influences outside the line (i.e., family influences, disability impact, career attitudes and behaviors, sociopolitical context, developmental opportunities, and social support) and arrows directed inward and outward between each influence and the dynamic self to demonstrate mutual influence and exchange between them. The image is included in the results section of their study; it brings together “core categories” and demonstrates how they work together in the emergent theory and how they relate to each other. Because so many of our findings are dynamic, showing interaction and exchange between ideas, figures like Noonan and colleagues’ can be especially helpful in conveying this as we share our results.

Titled "restructuring at work". There are a series of boxes in a row with arrows leading from one to another. The first states "unresolved work-related conflicts". The second box states, "shaming process" with two bullets stating "interpersonal shaming and "intrapersonal shaming". The 3rd box states "making efforts to please" and has 3 bullets labeled "increased work intensity", "overtime", and "sickness presenteeism". The 4th box is labeled "mental overload" and contains 3 bullets, labeled "chronic tiredness and fatigue", "social withdrawal", and "estrangement from self and others". The fifth and final box is labeled "sick leave".

Composite

Going one step further than the graphic or figure discussed above, qualitative researchers may decide to combine and synthesize findings into one integrated representation, or composite. In the case of the graphic or figure, the individual elements still maintain their distinctiveness but are brought together to reflect how they are related. In a composite, however, rather than just showing that they are related (static), the audience actually gets to ‘see’ the elements interacting (dynamic). The integrated and interactive findings of a composite can take many forms. It might be a written narrative, such as a fictionalized case study that reflects or highlights the many aspects that emerged during analysis. It could be a poem, dance, painting, or any other performance or medium. Ultimately, a composite offers an audience a meaningful and comprehensive expression of our findings. If you are choosing to utilize a composite, there is an underlying assumption that is conveyed: you are suggesting that the findings of your study are best understood holistically. By discussing each finding individually, they lose some of their potency or significance, so a composite is required to bring them together. As an example of a composite, consider that you are conducting research with a number of First Nations Peoples in Canada. After consulting with a number of Elders and learning about the importance of oral traditions and the significance of storytelling, you collaboratively determine that the best way to disseminate your findings will be to create and share a story as a means of presenting your research findings. The use of composites also assumes that the ‘truths’ revealed in our data can take many forms. The Transgender Youth Project, hosted by the Mandala Center for Change, is an example of legislative theatre combining research, artistic expression, and political advocacy, and a good example of action-oriented research.

Counts

While you haven’t heard much about numbers in our qualitative chapters, I’m going to break with tradition and speak briefly about them here. For many qualitative projects we do include some numeric information in our final product(s), mostly in the way of counts. Counts usually show up as frequencies of demographic characteristics of our sample, or characteristics of our artifacts if our sample isn’t made up of people. These may be included as a table or integrated into the narrative we provide, but in either case, our goal in including this information is to help the reader better understand who or what our sample represents. The other time we sometimes include count information is with respect to the frequency and coverage of the themes or categories that are represented in our data. Frequency information about a theme can help the reader to know how often an idea came up in our analysis, while coverage can help them to know how widely dispersed this idea was (e.g., did nearly everyone mention this, or was it a small group of participants?).
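For those who like to see the arithmetic, here is a minimal, hypothetical sketch in Python of how theme frequency and coverage could be tallied from coded data. The participant IDs, theme names, and counts are invented for illustration; they are not drawn from any study discussed in this chapter.

```python
# Hypothetical coded data: for each participant, the number of coded excerpts per theme.
# Participant IDs, theme names, and counts are invented for illustration only.
coded_data = {
    "P01": {"work and identity": 4, "breaking destructive patterns": 1},
    "P02": {"work and identity": 2},
    "P03": {"breaking destructive patterns": 3, "work and identity": 1},
    "P04": {"hopes for the future": 2},
}

def theme_frequency(data):
    """How often each theme came up: total coded excerpts per theme across all participants."""
    totals = {}
    for themes in data.values():
        for theme, count in themes.items():
            totals[theme] = totals.get(theme, 0) + count
    return totals

def theme_coverage(data):
    """How widely dispersed each theme was: number and share of participants who mentioned it."""
    n_participants = len(data)
    mentions = {}
    for themes in data.values():
        for theme in themes:
            mentions[theme] = mentions.get(theme, 0) + 1
    return {theme: (count, round(count / n_participants, 2))
            for theme, count in mentions.items()}

print(theme_frequency(coded_data))
# {'work and identity': 7, 'breaking destructive patterns': 4, 'hopes for the future': 2}
print(theme_coverage(coded_data))
# {'work and identity': (3, 0.75), 'breaking destructive patterns': (2, 0.5), 'hopes for the future': (1, 0.25)}
```

In a write-up, that might be summarized as, for example, a theme appearing in 7 coded excerpts and being mentioned by 3 of the 4 participants.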

Key Takeaways

  • There are a wide variety of means by which you can deliver your qualitative research to the public. Choose one that takes into account the various considerations that we have discussed above and also honors the ethical commitments that we outlined early in this chapter.
  • Presenting qualitative research requires some amount of creativity. Utilize the building blocks discussed in this chapter to help you consider how to most authentically and effectively convey your message to a wider audience.

What means of delivery will you be choosing for your dissemination plan?

What building blocks will best convey your qualitative results to your audience?

  • National Association of Social Workers. (2017). NASW code of ethics. Retrieved from https://www.socialworkers.org/About/Ethics/Code-of-Ethics/Code-of-Ethics-English
  • Kohli, R., & Pizarro, M. (2016). Fighting to educate our own: Teachers of Color, relational accountability, and the struggle for racial justice. Equity & Excellence in Education, 49(1), 72-84.
  • Karabanow, J., Gurman, E., & Naylor, T. (2012). Street youth labor as an expression of survival and self-worth. Critical Social Work, 13(2).
  • Noonan, B. M., Gallor, S. M., Hensler-McGinnis, N. F., Fassinger, R. E., Wang, S., & Goodman, J. (2004). Challenge and success: A qualitative study of the career development of highly achieving women with physical and sensory disabilities. Journal of Counseling Psychology, 51(1), 68.
  • Ede, L., & Starrin, B. (2014). Unresolved conflicts and shaming processes: Risk factors for long-term sick leave for mental-health reasons. Nordic Journal of Social Research, 5, 39-54.

Glossary

  • Dissemination plan: how you plan to share your research findings.
  • Beneficence: one of the three values indicated in the Belmont Report; an obligation to protect people from harm by maximizing benefits and minimizing risks.
  • Memorandum of understanding (MOU) / memorandum of agreement (MOA): a written agreement between parties that want to participate in a collaborative project.
  • Reflexive journal: a research journal that helps the researcher to reflect on and consider their thoughts and reactions to the research process and how it may be shaping the study.
  • Context: the circumstances surrounding an artifact, event, or experience.
  • Rigor: the process through which we demonstrate, to the best of our ability, that our research is empirically sound and reflects a scientific approach to knowledge building.
  • Content: the substance of the artifact (e.g., the words, picture, scene); it is what can actually be observed.

Graduate research methods in social work Copyright © 2020 by Matthew DeCarlo, Cory Cummings, Kate Agnelli is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


How to disseminate your research


Published: 01 January 2019

Version 1.0 - January 2019

This guide is for researchers who are applying for funding or have research in progress. It is designed to help you to plan your dissemination and give your research every chance of being utilised.

What does NIHR mean by dissemination?

Effective dissemination is simply about getting the findings of your research to the people who can make use of them, to maximise the benefit of the research without delay.

Research is of no use unless it gets to the people who need to use it

Professor Chris Whitty, Chief Scientific Adviser for the Department of Health

Principles of good dissemination

Stakeholder engagement: Work out who your primary audience is; engage with them early and keep in touch throughout the project, ideally involving them from the planning of the study to the dissemination of findings. This should create ‘pull’ for your research i.e. a waiting audience for your outputs. You may also have secondary audiences and others who emerge during the study, to consider and engage.

Format: Produce targeted outputs that are in an appropriate format for the user. Consider a range of tailored outputs for decision makers, patients, researchers, clinicians, and the public at national, regional, and/or local levels as appropriate. Use plain English which is accessible to all audiences.

Utilise opportunities: Build partnerships with established networks; use existing conferences and events to exchange knowledge and raise awareness of your work.

Context: Understand the service context of your research, and get influential opinion leaders on board to act as champions.

Timing: Dissemination should not be limited to the end of a study. Consider whether any findings can be shared earlier.

Remember to contact your funding programme for guidance on reporting outputs.

Your dissemination plan: things to consider

What do you want to achieve, for example, raise awareness and understanding, or change practice? How will you know if you are successful and made an impact? Be realistic and pragmatic. 

Identify your audience(s) so that you know who you will need to influence to maximise the uptake of your research e.g. commissioners, patients, clinicians and charities. Think who might benefit from using your findings. Understand how and where your audience looks for/receives information. Gain an insight into what motivates your audience and the barriers they may face.

Remember to feed back study findings to participants, such as patients and clinicians; they may also wish to participate in the dissemination of the research and can provide a powerful voice.

When will dissemination activity occur? Identify and plan critical time points, consider external influences, and utilise existing opportunities, such as upcoming conferences. Build momentum throughout the entire project life-cycle; for example, consider timings for sharing findings.

Think about the expertise you have in your team and whether you need additional help with dissemination. Consider whether your dissemination plan would benefit from liaising with others, for example, NIHR Communications team, your institution’s press office, PPI members. What funds will you need to deliver your planned dissemination activity? Include this in your application (or talk to your funding programme).

Partners / Influencers: think about who you will engage with to amplify your message. Involve stakeholders in research planning from an early stage to ensure that the evidence produced is grounded, relevant, accessible and useful.

Messaging: consider the main message of your research findings. How can you frame this so it will resonate with your target audience? Use the right language and focus on the possible impact of your research on their practice or daily life.

Channels: use the most effective ways to communicate your message to your target audience(s) e.g. social media, websites, conferences, traditional media, journals. Identify and connect with influencers in your audience who can champion your findings.

Coverage and frequency: how many people are you trying to reach? How often do you want to communicate with them to achieve the required impact?

Potential risks and sensitivities: be aware of the relevant current cultural and political climate. Consider how your dissemination might be perceived by different groups.

Think about what the risks are to your dissemination plan e.g. intellectual property issues. Contact your funding programme for advice.

More advice on dissemination

We want to ensure that the research we fund has the maximum benefit for patients, the public and the NHS. Generating meaningful research impact requires engaging with the right people from the very beginning of planning your research idea.

More advice from the NIHR on knowledge mobilisation and dissemination.

What you need to know about research dissemination

Last updated: 5 March 2024

In this article, we'll tell you what you need to know about research dissemination.

Understanding research dissemination

Research that never gets shared has limited benefits. Research dissemination involves sharing research findings with the relevant audiences so the research’s impact and utility can reach its full potential.

When done effectively, dissemination gets the research into the hands of those it can most positively impact. This may include:

  • Politicians
  • Industry professionals
  • The general public

What it takes to effectively disseminate research will depend greatly on the audience the research is intended for. When planning for research dissemination, it pays to understand some guiding principles and best practices so the right audience can be targeted in the most effective way.

Core principles of effective dissemination

Effective dissemination of research findings requires careful planning. Before planning can begin, researchers must think about the core principles of research dissemination and how their research and its goals fit into those constructs.

Research dissemination principles can best be described using the three Ps of research dissemination: purpose, process, and people.

The first P, purpose, is about clarifying the objective. What is the goal of disseminating the information? Is the research meant to:

Persuade policymakers?

Influence public opinion?

Support strategic business decisions?

Contribute to academic discourse? 

Knowing the purpose of sharing the information makes it easier to target the message accurately and to align the language used with the target audience.

The second P, process, covers the methods that will be used and the steps taken when it comes time to disseminate the findings. This includes the channels by which the information will be shared, the format it will be shared in, and the timing of the dissemination.

By planning out the process and taking the time to understand it, researchers will be better prepared and more flexible should changes arise.

The third P, people, refers to the target audience: whom the research is aimed at. Because different audiences require different approaches and language styles, identifying the correct audience is a huge factor in the successful dissemination of findings.

By tailoring the research dissemination to the needs and preferences of a specific audience, researchers increase the chances of the information being received, understood, and used.

  • Types of research dissemination

There are many options for researchers to get their findings out to the world. The type of desired dissemination plays a big role in choosing the medium and the tone to take when sharing the information.

Some common types include:

Academic dissemination: Sharing research findings in academic journals, which typically involves a peer-review process.

Policy-oriented dissemination: Creating documents that summarize research findings in a way that's understandable to policymakers.

Public dissemination: Using television and other media outlets to communicate research findings to the public.

Educational dissemination: Developing curricula for education settings that incorporate research findings.

Digital and online dissemination: Using digital platforms to present research findings to a global audience.

Strategic business presentation: Creating a presentation for a business group to use research insights to shape business strategy.

  • Major components of information dissemination

While the three Ps provide a convenient overview of what needs to be considered when planning research dissemination, they are not a complete picture.

Here’s a more comprehensive list of what goes into the dissemination of research results:

Audience analysis: Identifying the target audience and researching their needs, preferences, and knowledge level so content can be tailored to them.

Content development: Creating the content in a way that accurately reflects the findings and presents them in a way that is relevant to the target audience.

Channel selection: Choosing the channel or channels through which the research will be disseminated and ensuring they align with the preferences and needs of the target audience.

Timing and scheduling: Evaluating factors such as current events, publication schedules, and project milestones to develop a timeline for the dissemination of the findings.

Resource allocation: With the basics mapped out, financial, human, and technological resources can be set aside for the project to facilitate the dissemination process.

Impact assessment and feedback: During the dissemination, methods should be in place to measure how successful the strategy has been in disseminating the information.

Ethical considerations and compliance: Research findings often include sensitive or confidential information. Any legal and ethical guidelines should be followed.
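
If it helps to keep these components in view while planning, they can also be captured as a simple structured checklist. The sketch below is only an illustration of that idea; the field names and example values are hypothetical and are not drawn from any of the sources quoted here.

```python
from dataclasses import dataclass, field

@dataclass
class DisseminationPlan:
    """A lightweight, reviewable record of the components listed above."""
    audience: str                                        # who the findings are for (audience analysis)
    key_message: str                                     # content developed for that audience
    channels: list[str] = field(default_factory=list)    # channel selection
    milestones: dict[str, str] = field(default_factory=dict)  # timing and scheduling
    resources: list[str] = field(default_factory=list)   # people, funds, and tools set aside
    impact_measures: list[str] = field(default_factory=list)  # how success will be judged
    ethics_notes: str = ""                               # confidentiality and compliance reminders

# Hypothetical example of one planned dissemination activity.
plan = DisseminationPlan(
    audience="Local policymakers",
    key_message="Plain-language summary of the study's main finding",
    channels=["policy brief", "in-person briefing"],
    milestones={"draft brief": "2025-03", "briefing meeting": "2025-05"},
    resources=["communications officer time", "design budget"],
    impact_measures=["briefings held", "citations in policy documents"],
    ethics_notes="No participant-identifying details in public materials",
)

print(f"Reaching {plan.audience} via {', '.join(plan.channels)}")
```

Whether you track this in code, a spreadsheet, or a project plan matters far less than making each component explicit and revisiting it as the project evolves.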

  • Crafting a dissemination blueprint

With the three Ps providing a foundation and the components outlined above giving structure to the dissemination, researchers can then dive deeper into the important steps in crafting an impactful and informative presentation.

Let’s take a look at the core steps.

1. Identify your audience

To identify the right audience for research dissemination, researchers must gather as much detail as possible about the different target audience segments.

By gathering detailed information about the preferences, personalities, and information-consumption habits of the target audience, researchers can craft messages that resonate effectively.

As a simple example, academic findings might be highly detailed for scholarly journals and simplified for the general public. Further refinements can be made based on the cultural, educational, and professional background of the target audience.

2. Create the content

Creating compelling content is at the heart of effective research dissemination. Researchers must distill complex findings into a format that's engaging and easy to understand. In addition to the format of the presentation and the language used, content includes the visual or interactive elements that will make up the supporting materials.

Depending on the target audience, this may include complex technical jargon and charts or a more narrative approach with approachable infographics. For non-specialist audiences, the challenge is to provide the required information in a way that's engaging for the layperson.

3. Take a strategic approach to dissemination

There's no single best solution for all research dissemination needs. What’s more, technology and how target audiences interact with it are constantly changing. Developing a strategic approach to sharing research findings requires exploring the various methods and channels that align with the audience's preferences.

Each channel has a unique reach and impact, and a particular set of best practices to get the most out of it. Researchers looking to have the biggest impact should carefully weigh up the strengths and weaknesses of the channels they've decided upon and craft a strategy that best uses that knowledge.

4. Manage the timeline and resources

Time constraints are an inevitable part of research dissemination. Deadlines for publications can be months apart, conferences may only happen once a year, and so on. Any avenue used to disseminate the research must be planned for carefully to avoid missed opportunities.

In addition to properly planning and allocating time, there are other resources to consider. The appropriate number of people must be assigned to work on the project, and they must be given adequate financial and technological resources. To best manage these resources, regular reviews and adjustments should be made.

  • Tailoring communication of research findings

We’ve already mentioned the importance of tailoring a message to a specific audience. Here are some examples of how to reach some of the most common target audiences of research dissemination.

Making formal presentations

Content should always be professional, well-structured, and supported by data and visuals when making formal presentations. The depth of information provided should match the expertise of the audience, explaining key findings and implications in a way they'll understand. To be persuasive, a clear narrative and confident delivery are required.

Communication with stakeholders

Stakeholders often don't have the same level of expertise that more direct peers do. The content should strike a balance between providing technical accuracy and being accessible enough for everyone. Time should be taken to understand the interests and concerns of the stakeholders and align the message accordingly.

Engaging with the public

Members of the public will have the lowest level of expertise. Not everyone in the public will have a technical enough background to understand the finer points of your message. Try to minimize confusion by using relatable examples and avoiding any jargon. Visual aids are important, as they can help the audience to better understand a topic.

  • 10 commandments for impactful research dissemination

In addition to the details above, there are a few tips that researchers can keep in mind to boost the effectiveness of dissemination:

Master the three Ps to ensure clarity, focus, and coherence in your presentation.

Establish and maintain a public profile for all the researchers involved.

When possible, encourage active participation and feedback from the audience.

Use real-time platforms to enable communication and feedback from viewers.

Leverage open-access platforms to reach as many people as possible.

Make use of visual aids and infographics to share information effectively.

Take into account the cultural diversity of your audience.

Rather than considering only one dissemination medium, consider the best tool for a particular job, given the audience and research to be delivered.

Continually assess and refine your dissemination strategies as you gain more experience.



Doing Research in Counselling and Psychotherapy

Student resources: Disseminating the findings of your research study

It is very important to find appropriate ways to disseminate the findings of your research – projects that sit on office or library shelves and are seldom or never read represent a tragic loss to the profession.

A key dimension of research dissemination is to be actively involved with potential audiences for your work, and help them to understand what it means to them. These dialogues also represent invaluable learning experiences for researchers, in terms of developing new ideas and appreciating the methodological limitations of their work. An inspiring example of how to do this can be found in:

Granek, L., & Nakash, O. (2016). The impact of qualitative research on the “real world”: Knowledge translation as education, policy, clinical training, and clinical practice. Journal of Humanistic Psychology, 56(4), 414–435.

A further key dimension of research dissemination lies in the act of writing. There are a number of challenges associated with writing counselling and psychotherapy research papers, such as the need to adhere to journal formats, and the need (sometimes) to weave personal reflective writing into a predominantly third-person standard academic style. The items in the following sections explore these challenges from a variety of perspectives.

Suggestions for becoming a more effective academic writer

Sources of advice on how to ease the pain of writing:

Gioia, D. (2019). Gioia’s rules of the game.  Journal of Management Inquiry , 28(1), 113 – 115. 

Greenhalgh, T. (2019). Twitter women’s tips on academic writing: a female response to Gioia’s rules of the game. Journal of Management Inquiry , 28(4), 484 – 487.

Roulston, K. (2019). Learning how to write successfully from academic writers. The Qualitative Report, 24(7), 1778 – 1781. 

Writing tips from the student centre, University of California, Berkeley


The transition from being a therapist to being a researcher

Finlay, L. (2020). How to write a journal article: Top tips for the novice writer.  European Journal for Qualitative Research in Psychotherapy , 10, 28 – 40.

McBeath, A., Bager-Charleson, S., & Abarbanel, A. (2019). Therapists and academic writing: “Once upon a time psychotherapy practitioners and researchers were the same people”.  European Journal for Qualitative Research in Psychotherapy , 9, 103 – 116. 

McPherson, A. (2020). Dissertation to published article: A journey from shame to sharing.  European Journal for Qualitative Research in Psychotherapy , 10, 41 – 52.

Journal article style requirements of the American Psychological Association (including a section on writing quantitative papers)

Writing qualitative reports

Jonsen, K., Fendt, J., & Point, S. (2018). Convincing qualitative research: What constitutes persuasive writing?  Organizational Research Methods , 21(1), 30 – 67.

Ponterotto, J.G. & Grieger, I. (2007). Effectively communicating qualitative research.  The Counseling Psychologist , 35, 404 – 430.

Smith, L., Rosenzweig, L. & Schmidt, M. (2010). Best practices in the reporting of participatory action research: embracing both the forest and the trees.  The Counseling Psychologist, 38, 1115 – 1138.

Staller, K.M. & Krumer-Nevo, M. (2013).  Successful qualitative articles: A tentative list of cautionary advice. Qualitative Social Work, 12, 247 – 253. 

Clark, A.M. & Thompson, D.R. (2016). Five tips for writing qualitative research in high-impact journals: Moving from #BMJnoQual. International Journal of Qualitative Methods, 15, 1–3.

Gustafson, D. L., Parsons, J. E., & Gillingham, B. (2019). Writing to transgress: Knowledge production in feminist participatory action research. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research, 20 . DOI:  10.17169/fqs-20.2.3164

Caulley, D.N. (2008). Making qualitative reports less boring: the techniques of writing creative nonfiction.  Qualitative Inquiry, 14, 424 – 449.


Chapter 15: Sharing Your Research

15.3 Disseminating Findings

Presenting your work, as discussed in Section “Presenting Your Research”, is one way of disseminating your research findings. In this section, we will focus on disseminating the written results of your research. Dissemination refers to “a planned process that involves consideration of target audiences and the settings in which research findings are to be received and, where appropriate, communicating and interacting with wider policy and…service audiences in ways that will facilitate research uptake in decision-making processes and practice” (Wilson, Petticrew, Calnan, & Nazareth, 2010, p. 93). In other words, dissemination of research findings involves careful planning, thought, consideration of target audiences, and communication with those audiences. Writing up results from your research and having others take notice are two entirely different propositions. In fact, the general rule of thumb is that people will not take notice unless you help and encourage them to do so. To paraphrase the classic line from the film Field of Dreams, just because you build it does not mean they will come.

Disseminating your findings successfully requires determining who your audience is, where they are, and how to reach them. When considering who your audience is, think about who is likely to take interest in your work. Your audience might include those who do not express enthusiastic interest but might nevertheless benefit from an awareness of your research. Your research participants and those who share some characteristics in common with your participants are likely to have some interest in what you’ve discovered in the course of your research. Other scholars who study similar topics are another obvious audience for your work. Perhaps there are policy makers who should take note of your work. Organizations that do work in an area related to the topic of your research are another possibility. Finally, any and all inquisitive and engaged members of the public represent a possible audience for your work.

The location of your audience should be fairly obvious once you have determined who you would like your audience to be. You know where your research participants are because you have studied them. You can find interested scholars on your campus (e.g., perhaps you could offer to present your findings at some campus event), at professional conferences, and via publications such as professional organizations’ newsletters (an often-overlooked source for sharing findings in brief form), and scholarly journals. Policymakers include your state and federal representatives, who, at least in theory, should be available to hear a constituent speak on matters of policy interest. Perhaps you are already aware of organizations that work in an area related to your research topic, but if not, a simple web search should help you identify possible organizational audiences for your work. Disseminating your findings to the public more generally could take any number of forms, including a letter to the editor of the local newspaper, or a blog.

Finally, determining how to reach your audiences will vary according to which audience you wish to reach. Your strategy should be determined by the norms of the audience. For example, scholarly journals provide author submission instructions that clearly define requirements for anyone wishing to disseminate their work via a particular journal. The same is true for newspaper editorials; check your newspaper’s website for details about how to format and submit letters to the editor. If you wish to reach out to your political representatives, a call to their offices or, again, a simple web search should tell you how to do that.

Whether or not you act on all these suggestions is ultimately your decision. But if you have conducted high-quality research, and you have findings that are likely to be of interest to any constituents besides yourself, it is your duty as a scholar and a sociologist to share those findings. Disseminating findings involves the following three steps:

  • Determine who your audience is.
  • Identify where your audience is.
  • Discover how best to reach them.

Research Methods for the Social Sciences: An Introduction Copyright © 2020 by Valerie Sheppard is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


9. Dissemination of Qualitative Findings

This chapter discusses strategies for disseminating the results of qualitative analysis. It begins with a discussion of the documentation of qualitative methods through the construction of a natural history of the methodology. The chapter then turns to a discussion of approaches to the presentation of qualitative data in written reports. They include the narrative style of presenting data, the construction of descriptive tables, and the presentation of data in graphic formats. This section of the chapter is followed by a discussion of how qualitative results fit into various modes of data dissemination, such as professional reports, public presentations, and web-based applications. Before closing the discussion of data presentation, the adoption of writing styles that make qualitative research interesting and memorable to audiences is discussed.


Qualitative Research: An Introduction to Methods and Designs


Dissemination of Findings

Biography and life story research is scientifically and practically useful for dissemination. Scholars employ several formats to write a biography and life story research report, making this research methodology beneficial for multiple types of audiences. These formats include journal articles, books written as biographies for both the general reader and the scholar, and book chapters in theory and methods handbooks. This range allows for the general reader to gain insight into the psychology of lives, and for a professional to gain theoretical knowledge as well as some technical insight into the practice of therapy, healing, and liberation. Disseminated biography and life story findings are also useful as case materials ...



How To Write The Results/Findings Chapter

For qualitative studies (dissertations & theses).

By: Jenna Crossley (PhD). Expert Reviewed By: Dr. Eunice Rautenbach | August 2021

So, you’ve collected and analysed your qualitative data, and it’s time to write up your results chapter. But where do you start? In this post, we’ll guide you through the qualitative results chapter (also called the findings chapter), step by step. 

Overview: Qualitative Results Chapter

  • What (exactly) the qualitative results chapter is
  • What to include in your results chapter
  • How to write up your results chapter
  • A few tips and tricks to help you along the way
  • Free results chapter template

What exactly is the results chapter?

The results chapter in a dissertation or thesis (or any formal academic research piece) is where you objectively and neutrally present the findings of your qualitative analysis (or analyses if you used multiple qualitative analysis methods). This chapter can sometimes be combined with the discussion chapter (where you interpret the data and discuss its meaning), depending on your university’s preference. We’ll treat the two chapters as separate, as that’s the most common approach.

In contrast to a quantitative results chapter that presents numbers and statistics, a qualitative results chapter presents data primarily in the form of words. But this doesn’t mean that a qualitative study can’t have quantitative elements – you could, for example, present the number of times a theme or topic pops up in your data, depending on the analysis method(s) you adopt.

Adding a quantitative element to your study can add some rigour, which strengthens your results by providing more evidence for your claims. This is particularly common when using qualitative content analysis. Keep in mind though that qualitative research aims to achieve depth and richness and to identify nuances, so don’t get tunnel vision by focusing on the numbers. They’re just cream on top in a qualitative analysis.
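
As a minimal illustration of that kind of quantitative element (a sketch only; the participants and theme labels below are hypothetical, and most researchers would do this in their analysis software rather than by hand), you could tally how many coded excerpts each theme appears in:

```python
from collections import Counter

# Hypothetical coded excerpts: each excerpt has already been tagged with
# one or more themes during the qualitative analysis.
coded_excerpts = [
    {"participant": "P01", "themes": ["isolation", "coping strategies"]},
    {"participant": "P02", "themes": ["coping strategies"]},
    {"participant": "P03", "themes": ["isolation", "stigma", "coping strategies"]},
    {"participant": "P04", "themes": ["stigma"]},
]

# Count how many excerpts each theme was coded against.
theme_counts = Counter(
    theme for excerpt in coded_excerpts for theme in excerpt["themes"]
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: coded in {count} excerpt(s)")
```

A small table built from counts like these can support your claims, but, as noted above, it supplements rather than replaces the thematic narrative.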

So, to recap, the results chapter is where you objectively present the findings of your analysis, without interpreting them (you’ll save that for the discussion chapter). With that out of the way, let’s take a look at what you should include in your results chapter.


What should you include in the results chapter?

As we’ve mentioned, your qualitative results chapter should purely present and describe your results, not interpret them in relation to the existing literature or your research questions. Any speculations or discussion about the implications of your findings should be reserved for your discussion chapter.

In your results chapter, you’ll want to talk about your analysis findings and whether or not they support your hypotheses (if you have any). Naturally, the exact contents of your results chapter will depend on which qualitative analysis method (or methods) you use. For example, if you were to use thematic analysis, you’d detail the themes identified in your analysis, using extracts from the transcripts or text to support your claims.

While you do need to present your analysis findings in some detail, you should avoid dumping large amounts of raw data in this chapter. Instead, focus on presenting the key findings and using a handful of select quotes or text extracts to support each finding. The reams of data and analysis can be relegated to your appendices.

While it’s tempting to include every last detail you found in your qualitative analysis, it is important to make sure that you report only that which is relevant to your research aims, objectives and research questions. Always keep these three components, as well as your hypotheses (if you have any), front of mind when writing the chapter and use them as a filter to decide what’s relevant and what’s not.


How do I write the results chapter?

Now that we’ve covered the basics, it’s time to look at how to structure your chapter. Broadly speaking, the results chapter needs to contain three core components – the introduction, the body and the concluding summary. Let’s take a look at each of these.

Section 1: Introduction

The first step is to craft a brief introduction to the chapter. This intro is vital as it provides some context for your findings. In your introduction, you should begin by reiterating your problem statement and research questions and highlighting the purpose of your research. Make sure that you spell this out for the reader so that the rest of your chapter is well contextualised.

The next step is to briefly outline the structure of your results chapter. In other words, explain what’s included in the chapter and what the reader can expect. In the results chapter, you want to tell a story that is coherent, flows logically, and is easy to follow, so make sure that you plan your structure out well and convey that structure (at a high level), so that your reader is well oriented.

The introduction section shouldn’t be lengthy. Two or three short paragraphs should be more than adequate. It is merely an introduction and overview, not a summary of the chapter.

Pro Tip – To help you structure your chapter, it can be useful to set up an initial draft with (sub)section headings so that you’re able to easily (re)arrange parts of your chapter. This will also help your reader to follow your results and give your chapter some coherence.  Be sure to use level-based heading styles (e.g. Heading 1, 2, 3 styles) to help the reader differentiate between levels visually. You can find these options in Word (example below).

[Image: Heading styles in the results chapter]

Section 2: Body

Before we get started on what to include in the body of your chapter, it’s vital to remember that a results section should be completely objective and descriptive, not interpretive. So, be careful not to use words such as “suggests” or “implies”, as these usually accompany some form of interpretation – that’s reserved for your discussion chapter.

The structure of your body section is very important, so make sure that you plan it out well. When planning out your qualitative results chapter, create sections and subsections so that you can maintain the flow of the story you’re trying to tell. Be sure to systematically and consistently describe each portion of results. Try to adopt a standardised structure for each portion so that you achieve a high level of consistency throughout the chapter.

For qualitative studies, results chapters tend to be structured according to themes, which makes it easier for readers to follow. However, keep in mind that not all results chapters have to be structured in this manner. For example, if you’re conducting a longitudinal study, you may want to structure your chapter chronologically. Similarly, you might structure this chapter based on your theoretical framework. The exact structure of your chapter will depend on the nature of your study, especially your research questions.

As you work through the body of your chapter, make sure that you use quotes to substantiate every one of your claims. You can present these quotes in italics to differentiate them from your own words. A general rule of thumb is to use at least two pieces of evidence per claim, and these should be linked directly to your data. Also, remember that you need to include all relevant results, not just the ones that support your assumptions or initial leanings.

In addition to including quotes, you can also link your claims to the data by using appendices, which you should reference throughout your text. When you reference, make sure that you include both the name/number of the appendix, as well as the line(s) from which you drew your data.

As referencing styles can vary greatly, be sure to look up the appendix referencing conventions of your university’s prescribed style (e.g. APA , Harvard, etc) and keep this consistent throughout your chapter.

Section 3: Concluding summary

The concluding summary is very important because it summarises your key findings and lays the foundation for the discussion chapter. Keep in mind that some readers may skip directly to this section (from the introduction section), so make sure that it can be read and understood well in isolation.

In this section, you need to remind the reader of the key findings. That is, the results that directly relate to your research questions and that you will build upon in your discussion chapter. Remember, your reader has digested a lot of information in this chapter, so you need to use this section to remind them of the most important takeaways.

Importantly, the concluding summary should not present any new information and should only describe what you’ve already presented in your chapter. Keep it concise – you’re not summarising the whole chapter, just the essentials.

Tips for writing an A-grade results chapter

Now that you’ve got a clear picture of what the qualitative results chapter is all about, here are some quick tips and reminders to help you craft a high-quality chapter:

  • Your results chapter should be written in the past tense. You’ve done the work already, so you want to tell the reader what you found, not what you are currently finding.
  • Make sure that you review your work multiple times and check that every claim is adequately backed up by evidence. Aim for at least two examples per claim, and make use of an appendix to reference these.
  • When writing up your results, make sure that you stick to only what is relevant. Don’t waste time on data that are not relevant to your research objectives and research questions.
  • Use headings and subheadings to create an intuitive, easy-to-follow piece of writing. Make use of Microsoft Word’s “heading styles” and be sure to use them consistently.
  • When referring to numerical data, tables and figures can provide a useful visual aid. When using these, make sure that they can be read and understood independent of your body text (i.e. that they can stand alone). To this end, use clear, concise labels for each of your tables or figures and make use of colours to indicate differences or hierarchy.
  • Similarly, when you’re writing up your chapter, it can be useful to highlight topics and themes in different colours. This can help you to differentiate between your data if you get a bit overwhelmed and will also help you to ensure that your results flow logically and coherently.

If you have any questions, leave a comment below and we’ll do our best to help.


Comments


Aditi

Hi, thanks for the great research support platform created by the gradcoach team!

I wanted to ask- While “suggests” or “implies” are interpretive terms, what terms could we use for the results chapter? Could you share some examples of descriptive terms?

TcherEva

I think that instead of saying, ‘The data suggested, or The data implied,’ you can say, ‘The Data showed or revealed, or illustrated or outlined’…If interview data, you may say Jane Doe illuminated or elaborated, or Jane Doe described… or Jane Doe expressed or stated.


Oliwia

What if I have 3 different interviewees answering the same interview questions? Should I then present the results in the form of a table divided by the 3 perspectives, or rather give the results in the form of text and highlight who said what?

Rea

I think this tabular representation of results is a great idea. I am doing it too along with the text. Thanks



Responsible dissemination of health and medical research: some guidance points

  • Raffaella Ravinetto 1 (http://orcid.org/0000-0001-7765-2443)
  • Jerome Amir Singh 2,3 (http://orcid.org/0000-0002-6275-6853)
  • 1 Public Health Department, Institute of Tropical Medicine, Antwerpen, Belgium
  • 2 Howard College School of Law, University of KwaZulu-Natal, Durban, South Africa
  • 3 Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada
  • Correspondence to Dr Raffaella Ravinetto, Public Health Department, Institute of Tropical Medicine, Antwerpen, Belgium; rravinetto{at}itg.be

https://doi.org/10.1136/bmjebm-2022-111967


  • PUBLIC HEALTH
  • Global Health

Ravinetto and Singh argue that better practices can be implemented when disseminating research findings through abstracts, preprints, peer-reviewed publications, press releases and social media

Dissemination has been defined as ‘the targeted distribution of information and intervention materials to a specific public health or clinical practice audience’, 1 and as being ‘simply about getting the findings of your research to the people who can make use of them, to maximise the benefit of the research without delay’. 2 Ethics guidelines concur that research stakeholders have ethical obligations to disseminate positive, inconclusive or negative results, 3 in an accurate, comprehensive and transparent way 4 —even more so during public health emergencies. 5

Summary of research dissemination

What —Dissemination of health and medical research entails communicating the findings of research to stakeholders in ways that can facilitate understanding and use.

Why —Any positive, inconclusive or negative research findings should be disseminated to maximise the social value of the research and to accurately inform medical policies and practices.

When —Dissemination of health and medical research should occur as soon as possible after completion of interim and final analysis, particularly during public health emergencies.

Who —Researchers, research institutions, sponsors, developers, publishers and editors must ensure the timely and accurate dissemination of research findings. Similarly, the scientific community should critically appraise research findings; policymakers and clinicians should weigh the implications of research findings for policy and clinical practice; while mainstream media should communicate the implications of research findings to the general public in a manner that facilitates understanding.

How —Research findings are primarily disseminated via press releases, preprints, abstracts and peer-reviewed publications. To ensure timely, comprehensive, accurate, unbiased, unambiguous and transparent dissemination, all research stakeholders should integrate ethics and integrity principles in their institutional dissemination policies and personal belief systems.

Peer-reviewed publications

Publication in peer-reviewed journals remains the benchmark dissemination modality. Independent peer-review aims to assure the quality, accuracy and credibility of reports, but does not always prevent the publication of poorly written, dubious or even fraudulent manuscripts, 9 particularly if there is a dearth of qualified reviewers, and/or findings are hastily published to gain competitive advantage and visibility. 10 Furthermore, researchers who are inexperienced or subject to an institutional ethos of ‘publish or perish’ may choose to publish in predatory journals with highly questionable marketing and peer-review practices. 11 While target audiences may be unable to access findings if journal content is not freely accessible on the Internet, some researchers, particularly those in resource-constrained settings 12 may be unable to publish their research due to resource constraints (eg, publication fees may be prohibitively high). 13 Some may be poorly motivated to publish inconclusive or negative data. 14 Because of such shortcomings, commentators such as Horby warn that ‘clinicians should not rely solely on peer review to assess the validity and meaningfulness of research findings’. 15

For peer-reviewed publications to remain a key dissemination modality, editors should follow the Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals of the International Committee of Medical Journal Editors, and comply with the core practices of the Committee on Publication Ethics (eg, data and reproducibility, ethical oversight, authorship and contributorship, etc.). This entails going beyond a ‘checklist approach’ and subjecting manuscripts to rigorous screening and assessment. Journals should strive to select qualified independent reviewers and prioritise open-access policies. Research institutions should distance themselves from a ‘publish or perish’ culture which, together with the willingness to hide ‘unfavourable’ results, remains a major driver of unethical publication practices—which, in turn, translates to ill-informed policies and practices. 16

Abstracts

Scientific conferences are valuable venues for sharing research results with peers and getting prepublication critical feedback. Abstracts often appear in the supplement of a scientific journal, which reaches a broader audience. However, even if attendance costs are not prohibitive, the selection of abstracts may be highly competitive. As a result, not all research findings—even of topical interest—are selected. Furthermore, even if selection is conducted by independent experts, the limited information contained in an abstract may mask scientific and/or ethical shortcomings in the work.

Communication via abstracts is laudable, but should be rapidly followed by peer-reviewed publications, which allow the findings to be comprehensively reviewed by experts. When abstracts remain the sole source of information, the findings’ significance might be misunderstood, overestimated or wrongly used to guide behaviours, policies and practices.

Preprints

Preprints, that is, preliminary reports of work not yet peer-reviewed, are uploaded in dedicated free-access servers, such as https://www.medrxiv.org/ . Preprints are increasingly being used by health researchers, thanks to the evolving policies of major journals that now accept manuscripts previously posted as preprints. 17 Theoretically, preprints possess high value as they allow for rapid, open-access dissemination, and immediate yet informal peer-appraisal in the comments section. However, preprints also hold implicit risks. For instance, rapidity may detract from quality and accuracy; most peers will not be able to systematically invest time for the expected high-quality feedback; rushed or inexperienced readers may miss the (sometimes, small print) cautioning that preprints should not be considered established information, nor become the basis for informing policy or medical guidelines; and findings from preprints that may later be substantially revised or rejected after undergoing peer-review processes, could continue to be relied on and disseminated if, for example, they were included in scoping or systematic reviews before peer-review (the same applies to retracted peer-reviewed manuscripts).

To mitigate such risks, researchers should submit preprint manuscripts to a peer-reviewed journal as soon as reasonably possible, and transparently communicate on negative peer-review outcomes, or justify why the preprint is not being timeously submitted to a peer-review journal. Once accepted or published, researchers could remove their preprint from preprint servers or link to the final published version. The media have a duty to communicate preprint findings as unreviewed and subject to change. The scientific community should reach agreement on ‘Good Preprint Practices’ and ascribe less ambiguous terminology to preprints (eg, ‘Not peer-reviewed’ or ‘Peer-review pending’). 18

Press releases, media coverage and social media

Since 2021, the dissemination of clinical trial findings by corporate press release has almost become synonymous with announcements of COVID-19 scientific breakthroughs. Therefore, it seems important to briefly contextualise the strategy underpinning such dissemination. Corporate press releases are often preceded by stock repurchasing or ‘buybacks’, that is, companies buy back part of their own stock held by executives. This increases demand for the stock and enhances earnings per share. 19 Pharmaceutical or biotechnology companies typically engage in strategically timed buybacks, before press releases announcing significant research findings. Furthermore, corporates in the USA and elsewhere may employ press releases to comply with the legal requirements to disclose information that impact on their market values, and changes in their ‘financial conditions and operations’. 20 Press releases are typically drafted by marketing experts and they are often first aimed at the market, and driven by corporate interests rather than social value.

For researchers, the potential to amplify scientific visibility through mass media may act as a powerful incentive to indulge in flattering but inaccurate language. Nonetheless, they have a moral responsibility to review press releases for accuracy, and to immediately make key information, including the protocol, analysis plan and detailed results, publicly available. For instance, the media briefing that announced on 16 June 2020 the life-saving benefit of dexamethasone in severe COVID-19 was followed on 26 June by a preprint with full trial results. 15 In the absence of such good practices, press releases can contain inaccuracies or overhype findings, 7 with major damaging downstream effects. 16

The media have an equally significant impact on science dissemination: peer-reviewed publications that receive more attention from the lay press are more likely to be cited in scientific literature. 21 Perceived media credibility also impacts on dissemination: once individuals trust a media source, 22 they often let down their guard on evaluating the credibility of that source. This speaks to the importance of discerning media dissemination (box 2). Journalists who cover early press releases should critically appraise them considering their limitations and potential conflicts of interest.

Recommendations for journalists

Recommendations for journalists who cover (early) press releases.

A. Always be conscious of the power of the media to shape the views, fears and beliefs of the public, in the short term, medium term and long term.

B. Weigh the tone and the extent of coverage afforded to press releases, based, among other factors, on:

A critical appraisal of whether the press release was preceded by stock buybacks and/or aimed at influencing corporate share values.

A critical appraisal of the science underpinning the press release, such as the sample size, study population representativeness (for instance, age, sex, ethnicity), research questions that are not addressed yet, and any omissions of potential harms.

A recourse to the views of independent scientists, paying attention to any declared or undeclared conflicts of interest that may bias their opinions.

C. Critically appraise the accuracy and possible biases of (independent) scientists’ opinions on press releases, when shared on personal social media feeds, before deciding whether to afford coverage to such views.

D. Afford the same coverage given to the initial press release (or more, if necessary) to any significant follow-up information related thereto.

A call for good dissemination practices

The scientific community, health system policy-makers and regulators are the primary audience of peer-reviewed manuscripts, abstracts and preprints. These constituents should be, or become, ‘sufficiently skilled in critical thinking and scientific methods that they can make sensible decisions, regardless of whether an article is peer reviewed or not’ 15 ; understand that the nature of scientific knowledge is incremental and cumulative (one study seldom changes practice on its own); and also critically assess other sources, for example, pharmacovigilance, etc. Conversely, corporate press releases are aimed at influencing the market, and society as a whole—and not suited for scientific appraisal.

Irrespective of dissemination modalities, upstream information is cascaded to mainstream and social media, spreading knowledge but also risking misunderstanding or overemphasis. Risks are only partially mitigated by independent quality control on the upstream information (relatively stringent in peer-review, weaker in preprints and abstracts, and virtually absent for press releases). In table 1, we summarise recommendations for good dissemination practices, aimed at researchers, research institutions, developers, medical journal editors, media, journalists, social media actors, medical opinion leaders, policy-makers, regulators and the scientific community. All these stakeholders should integrate ethics and integrity in their policies and behaviours, to ensure timely, comprehensive, accurate, unbiased, unambiguous and transparent dissemination of research findings.

[Table 1: Summary of the recommendations for good dissemination practices]




The Role of Dissemination as a Fundamental Part of a Research Project

Affiliations

  • 1 Agència de Salut Pública de Barcelona.
  • 2 CIBER Epidemiología y Salud Pública (CIBERESP), Madrid, Spain.
  • 3 Institut d'Investigació Biomèdica (IIB Sant Pau), Barcelona, Spain.
  • 4 Universitat Pompeu Fabra, Barcelona, Spain.
  • PMID: 27799595
  • DOI: 10.1177/0020731416676227

Dissemination and communication of research should be considered an integral part of any research project. Both help in increasing the visibility of research outputs, public engagement in science and innovation, and the confidence of society in research. Effective dissemination and communication are vital to ensure that the conducted research has a social, political, or economic impact. They draw the attention of governments and stakeholders to research results and conclusions, enhancing their visibility, comprehension, and implementation. In the European project SOPHIE (Evaluating the Impact of Structural Policies on Health Inequalities and Their Social Determinants and Fostering Change), dissemination was an essential component of the project in order to achieve the purpose of fostering policy change based on research findings. Here we provide our experience and make some recommendations based on our learning. A strong use of online communication (website, Twitter, and Slideshare accounts), the production of informative videos, the research partnership with civil society organizations, and the organization of final concluding scientific events, among other instruments, helped to reach a large public within the scientific community, civil society, and the policy making arena and to influence the public view on the impact on health and equity of certain policies.

Keywords: communication; health inequalities; knowledge transfer; public policies; research dissemination; social media.

Publication types

  • Search in MeSH

LinkOut - more resources

Full text sources, other literature sources.

  • scite Smart Citations
  • MedlinePlus Health Information

full text provider logo

  • Citation Manager

NCBI Literature Resources

MeSH PMC Bookshelf Disclaimer

The PubMed wordmark and PubMed logo are registered trademarks of the U.S. Department of Health and Human Services (HHS). Unauthorized use of these marks is strictly prohibited.

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List
  • Implement Sci

Logo of implemsci

Strategies for effective dissemination of research to United States policymakers: a systematic review

Laura Ellen Ashcraft

1 University of Pittsburgh School of Social Work, 2117 Cathedral of Learning, 4200 Fifth Avenue, Pittsburgh, PA 15260 USA

Deirdre A. Quinn

2 Center for Health Equity Research and Promotion (CHERP), VA Pittsburgh Healthcare System, University Drive C, Building 30, Pittsburgh, PA 15240 USA

Ross C. Brownson

3 Prevention Research Center, Brown School, Washington University in St. Louis, One Brookings Drive, Campus Box 1196, St. Louis, MO 63130 USA

4 Department of Surgery, Division of Public Health Sciences, and Alvin J. Siteman Cancer Center, Washington University School of Medicine, 660 South Euclid Avenue, Saint Louis, MO 63110 USA

Associated Data

Raw search results, citations, and abstracts available upon request.

Background

Research has the potential to influence US social policy; however, existing research in this area lacks a coherent message. The Model for Dissemination of Research provides a framework through which to synthesize lessons learned from research to date on the process of translating research to US policymakers.

Methods

The peer-reviewed and grey literature was systematically reviewed to understand common strategies for disseminating social policy research to policymakers in the United States. We searched Academic Search Premier, PolicyFile, SocINDEX, Social Work Abstracts, and Web of Science from January 1980 through December 2019. Articles were independently reviewed and thematically analyzed by two investigators and organized using the Model for Dissemination of Research.

Results

The search resulted in 5225 titles and abstracts for inclusion consideration. A total of 303 full-text articles were reviewed, with 27 meeting inclusion criteria. Common sources of research dissemination included government, academic researchers, the peer-reviewed literature, and independent organizations. The most frequently disseminated research topics were health-related, and legislators and executive branch administrators were the most common target audiences. Print materials and personal communication were the most common channels for disseminating research to policymakers. There was variation in dissemination channels by level of government (e.g., a more formal legislative process at the federal level compared with other levels). Findings from this work suggest that dissemination is most effective when it starts early, galvanizes support, uses champions and brokers, considers contextual factors, is timely, relevant, and accessible, and knows the players and process.

Conclusions

Effective dissemination of research to US policymakers exists; yet, rigorous quantitative evaluation is rare. A number of cross-cutting strategies appear to enhance the translation of research evidence into policy.

Registration

Not registered.

Contributions to the literature

  • This is one of the first systematic reviews to synthesize how social policy research evidence is disseminated to US policymakers.
  • Print materials and personal communications were the most commonly used channels to disseminate social policy research to policymakers.
  • Several cross-cutting strategies (e.g., start early, use evidence “champions,” make research products more timely, relevant, and accessible) were identified that are likely to lead to more effective translation of research evidence into the policymaking process in the United States.

In recent years, social scientists have sought to understand how research may influence policy [ 1 , 2 ]. Interest in this area of investigation has grown with the increased availability of funding for policy-specific research (e.g., dissemination and implementation research) [ 3 ]. However, because of variation in the content of public policy, this emerging area of scholarship lacks a coherent message that specifically addresses social policy in the United States (US). While other studies have examined the use of evidence in policymaking globally [ 4 – 7 ], the current review focuses on US social policy; for the purposes of this study, social policy includes policies which focus on antipoverty, economic security, health, education, and social services [ 8 – 10 ].

Significant international research exists on barriers and facilitators to the dissemination and use of research evidence by policymakers [ 4 , 5 ]. Common themes include the importance of personal relationships, the timeliness of evidence, and resource availability [ 4 , 5 ]. Previous work demonstrates the importance of understanding policymakers’ perceptions and how evidence is disseminated. The current review builds on this existing knowledge to examine how research evidence reaches policymakers and to understand what strategies are likely to be effective in overcoming identified barriers.

Theoretical frameworks offer a necessary foundation to identify and assess strategies for disseminating research to policymakers. The Model for Dissemination of Research integrates Diffusion of Innovations Theory and Social Marketing Theory with the Mathematical Theory of Communication [ 11 , 12 ] and the Matrix of Persuasive Communication [ 13 , 14 ] to address the translation gap between research and policy. The purpose of the Model for Dissemination of Research is to highlight the gaps between research and target audiences (e.g., policymakers) and to improve dissemination through the use of a theoretical foundation and a review of the literature [ 15 ]. Diffusion of Innovations Theory describes the spread and adoption of novel interventions through an “s-curve,” an ordered process, and characteristics of the message and audience [ 16 ]. Additional theoretical contributions for dissemination research come from Social Marketing Theory, which applies commercial marketing strategies summarized by the four P's (product, price, place, and promotion) and the understanding that communication of the message alone will not change behavior [ 17 ].

The Model for Dissemination of Research includes the four key components described by Shannon and Weaver [ 11 , 12 ] and later McGuire [ 13 , 14 ] of the research translation process: the source, message, audience, and channel (Fig. 1). The source includes researchers who generate evidence. The message includes relevant information sent by the source on a policy topic. The audience includes those receiving the message via the channel [ 15 ]. The channel is how the message gets from the source to the audience [ 15 ].

Fig. 1. The Model for Dissemination of Research. The model integrates Diffusion of Innovations Theory, the Mathematical Theory of Communication, and Social Marketing Theory to develop a framework for conceptualizing how information moves from source to audience. Originally published by Brownson et al. in the Journal of Public Health Management and Practice in 2018.
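To make the four components concrete, the sketch below expresses a single dissemination effort as a small data structure. This is an illustrative Python sketch only; the class and field names (e.g., DisseminationEffort) are our own shorthand for the model's components, not part of the published framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DisseminationEffort:
    """Illustrative container for the four components of the
    Model for Dissemination of Research: source, message, audience, channel.
    Field names are hypothetical and chosen for readability."""
    source: str                # who generates the evidence (e.g., academic researchers)
    message: str               # the policy-relevant information being shared
    audience: List[str]        # who receives the message (e.g., state legislators)
    channels: List[str] = field(default_factory=list)  # how the message travels

# Example: one study from Table 1 expressed in this structure.
brownson_2011 = DisseminationEffort(
    source="Survey of state-level policymakers",
    message="Information about mammography screening",
    audience=["State-level policymakers"],
    channels=["Policy briefs (data/state, data/local, story/state, story/local)"],
)
print(brownson_2011)
```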

While the Model for Dissemination of Research and its origins (i.e., the Mathematical Theory of Communication and Diffusion of Innovations Theory) appear linear in their presentation, Shannon and Weaver [ 11 , 12 ] and Rogers [ 16 ] clearly acknowledge that the dissemination of information is not a linear process and is affected by the environment within which it occurs. This approach aligns with the systems model, or knowledge-to-action approach, proposed by Best and Holmes [ 18 ]. The systems model accounts for the influence of the environment on a process and for the complexity of the system [ 18 ]. Therefore, while some theoretical depictions appear linear in their presentation, it is important to acknowledge the critical role of systems thinking.

To date, lessons learned from dissemination and implementation science about the ways in which research influences policy are scattered across diverse disciplines and bodies of literature. These disparate lessons highlight the critical need to integrate knowledge across disciplines. The current study aims to make sense of and distill these lessons by conducting a systematic review of scientific literature on the role of research in shaping social policy in the United States. The results of this systematic review are synthesized in a preliminary conceptual model (organized around the Model for Dissemination of Research) with the goal of improving dissemination strategies for the translation of scientific research to policymakers and guiding future research in this area.

This systematic review aims to synthesize existing evidence about how research has been used to influence social policy and is guided by the following research questions:

  • What are common strategies for using research to influence social policy in the United States?
  • What is the effectiveness of these strategies?

We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA-P) model [ 19 , 20 ] to examine and distill existing studies on strategies for using research evidence to influence social policy.

Eligibility criteria

Studies were eligible for this review if they met the following inclusion criteria: (1) occurred in the United States; (2) reported in English; (3) systematically evaluated the impact of research on social policy (this typically excluded studies focusing on policymaker dissemination preferences); (4) discussed domestic social policy (as defined above); and (5) were published in the peer reviewed literature or the grey literature (e.g., think tank research briefs, foundation research publications).

We chose to focus our review on the United States to capture the strengths and challenges of its unique, multi-level policy and political environment. The de-centralized structure of government in the United States allows significant decision-making authority at the state and local levels, with wide variation in capacity and the availability of resources across the country [ 21 ]. For example, some states have full-time legislatures while other states have part-time legislatures. In total, these factors create a fitting and complex environment to examine the dissemination of research to policymakers. The influence of lobbying in the United States also differs from other western countries. In the United States, there is more likely to be a “winner-take-all” process where some advocates (often corporations and trade associations) have disproportionate influence [ 22 ]. In addition, the role of evidence differs in the US compared with other countries, where the US tends to take a narrower focus on intervention impact with less emphasis on system-level issues (e.g., implementation, cost) [ 23 ].

Studies were excluded if they were not in English or occurred outside of the United States. We also excluded non-research sources, such as editorials, opinion pieces, and narrative stories that contain descriptions of dissemination strategies without systematic evaluation. Further, studies were excluded if the results focused on practitioners (e.g., case managers, local health department workers) and/or if results for practitioners could not be parsed from results for policymakers.

To identify studies that systematically evaluated the impact of research on social policy, we reviewed the research questions and results of each study to determine whether they examined how research evidence reaches policymakers (as opposed to policymaker preferences for disseminated research). For example, we would not include a study that only describes different types of policy briefs without also evaluating how the briefs are used by policymakers to inform policy decisions. We used the Model for Dissemination of Research, as defined above, to see if and how the studies describe and test the channels of dissemination. We built on the Model for Dissemination of Research by also considering passive forms of knowledge, such as peer-reviewed literature or research briefs, as potential sources of knowledge and not just as channels in and of themselves.

Information sources

We took a three-pronged approach to develop a comprehensive understanding of existing knowledge in this area. First, we searched the peer reviewed literature using the following databases: Academic Search Premier, PolicyFile, SocINDEX, Social Work Abstracts, and Web of Science. We expanded the inquiry for evidence by searching the grey literature through PolicyFile, and included recommendations from experts in the field of dissemination of research evidence to policymakers resulting in 137 recommended publications.

Search strategy

Our search strategy included the following terms: [research OR study OR studies OR knowledge] AND [policy OR policies OR law OR laws OR legislation] AND [use OR utilization OR utilisation] OR [disseminate OR dissemination OR disseminating] OR [implementation OR implementing OR implement] OR [translate OR translation OR translating]. Our search was limited to studies in the United States between 1980 and 2019. We selected this timeframe based on historical context: the 1950s through the 1970s saw the development of the modern welfare state, which was (relatively) complete by 1980. However, shifting political agendas in the 1980s saw the demand for evidence increase to provide support for social programs [ 24 ]; we hoped to capture this increase in evidence use in policy.
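For readers who want to adapt the search, the sketch below assembles the Boolean string above programmatically. It is a minimal illustration; each database has its own syntax for field tags, date limits, and geographic filters, so the string as executed on each platform would differ.

```python
# Assemble the Boolean search string described above.
# Platform-specific limits (date range 1980-2019, English, United States)
# would be added per database.
concept_groups = [
    ["research", "study", "studies", "knowledge"],
    ["policy", "policies", "law", "laws", "legislation"],
]
use_groups = [
    ["use", "utilization", "utilisation"],
    ["disseminate", "dissemination", "disseminating"],
    ["implementation", "implementing", "implement"],
    ["translate", "translation", "translating"],
]

def or_block(terms):
    """Join synonyms into a bracketed OR block, e.g. [research OR study ...]."""
    return "[" + " OR ".join(terms) + "]"

query = (
    " AND ".join(or_block(group) for group in concept_groups)
    + " AND "
    + " OR ".join(or_block(group) for group in use_groups)
)
print(query)
```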

Selection process

All titles and abstracts were screened by the principal investigator (LEA) with 20% reviewed at random by a co-investigator (DAQ) with total agreement post-training. Studies remaining after abstract screening moved to full text review. The full text of each study was considered for inclusion (LEA and DAQ) with conflicts resolved by consensus. The data abstraction form was developed by the principal investigator (LEA) based on previous research [ 25 , 26 ] and with feedback from co-authors. Data were independently abstracted from each reference in duplicate with conflicts resolved by consensus (LEA and DAQ). We completed reliability checks on 20% of the final studies, selected at random, to ensure accurate data abstraction.
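The reliability checks described here amount to comparing two screeners' include/exclude decisions on a random subsample. The sketch below shows one way such a check could be computed; the screening decisions in it are placeholders, not the study's actual data.

```python
import random

def percent_agreement(primary, secondary):
    """Share of records on which two screeners made the same include/exclude call."""
    assert len(primary) == len(secondary) and primary
    matches = sum(p == s for p, s in zip(primary, secondary))
    return matches / len(primary)

# Hypothetical example: a second reviewer re-screens a random 20% of records.
random.seed(1)
n_records = 5225
sample_ids = random.sample(range(n_records), k=round(0.2 * n_records))

# Placeholder decisions ("include"/"exclude") for the sampled records only.
primary_calls = ["exclude"] * len(sample_ids)
secondary_calls = ["exclude"] * len(sample_ids)

print(f"Agreement on {len(sample_ids)} double-screened records: "
      f"{percent_agreement(primary_calls, secondary_calls):.0%}")
```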

Data synthesis

Abstracted data was qualitatively analyzed using thematic analysis (LEA and DAQ) and guided by the Model for Dissemination of Research. The goal of the preliminary conceptual model was to synthesize components of dissemination for studies that evaluate the dissemination of social policy to policymakers.

Descriptive results

The search of the literature resulted in 5675 articles, plus 137 articles recommended by content experts, with 5225 titles and abstracts screened after duplicates were removed. Of those, 4922 were excluded for not meeting inclusion criteria. A further 303 full-text articles were reviewed, with 276 excluded for not meeting inclusion criteria. Twenty-seven articles met inclusion criteria (see Fig. 2 for the PRISMA flow diagram).

Fig. 2. PRISMA flowchart. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram reports included and excluded articles in the systematic review.
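As a quick arithmetic check, the counts at each stage of the flow can be reconciled from the figures reported above; the short sketch below re-derives each stage (the number of duplicates removed is inferred rather than reported directly).

```python
# Reconcile the PRISMA flow counts reported in the text.
identified = 5675 + 137            # database hits plus expert recommendations
screened = 5225                    # titles/abstracts screened after duplicates removed
duplicates_removed = identified - screened   # inferred: 587
full_text_reviewed = screened - 4922         # 4922 excluded at screening -> 303
included = full_text_reviewed - 276          # 276 full-text exclusions -> 27

assert full_text_reviewed == 303
assert included == 27
print(duplicates_removed, full_text_reviewed, included)
```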

Included studies are listed in Table 1. Of the 27 included studies, 6 used quantitative methods, 18 employed qualitative methods, and 3 used a mixed-methods approach. The qualitative studies mostly employed interviews ( n = 10), while others used case studies ( n = 6) or focus groups ( n = 3). Most studies examined state-level policy ( n = 18) and nine studies examined federal-level policy, with some studies looking at multiple levels of government. Included studies focused on the executive and legislative branches, with no studies examining the judicial branch.

Table 1. Included studies. Each entry lists: lead author and year; theory/framework; method and sample size; level and branch of government; source (where/who the information comes from); message (what information is shared); channel (how the information reaches the target audience); and audience (who the information goes to).

  • Allen, 2015. Framework: the operationalization framework. Method: qualitative, 29 interviews. Level/branch: local/municipal. Source: policymakers and advocates from three cities. Message: information about needle exchanges. Channel: public education campaign about needle exchange; public pressure put on policymakers. Audience: policymakers (broadly defined).
  • Austin, 2017. Framework: the 'triggers-to-action' framework. Method: qualitative (case study). Level/branch: state; legislative. Source: an academic-community-government partnership. Message: harm caused to consumers by dietary supplements for weight loss and muscle building. Channel: peer-reviewed recommendations; media outreach campaign via websites, email listservs, and social media; fact sheet, talking points, summary of research, and the original legal research article. Audience: state legislators.
  • Brim, 1983. Framework: none. Method: qualitative (case study). Level/branch: other; legislative. Source: foundation reports on child development. Message: current state and past trends for children and families; budget analysis; national survey; federal expenditures. Channel: reports; books; collaboration between foundations and the public sector; grants to advocacy organizations; compilation of data. Audience: Senate Subcommittee on Children and Youth hearings.
  • Brownson, 2011. Framework: none. Method: quantitative, n = 291. Level/branch: state; executive, legislative. Source: state-level policymakers. Message: information about mammography screening. Channel: policy briefs (data/state; data/local; story/state; story/local). Audience: state-level policymakers.
  • Bumbarger, 2012. Framework: Interactive Systems Framework. Method: qualitative (case study). Level/branch: state; executive. Source: Penn State Prevention Research Center program staff and researchers. Message: ongoing information about evidence-based practices and policies. Channel: one-page fact sheets, PowerPoint presentations, research briefs, short YouTube videos, and infographics. Audience: administrative/executive representatives from related offices.
  • Coffman, 2009. Framework: none. Method: qualitative (case study). Level/branch: state; legislative. Source: CHBRP faculty and staff, who review and synthesize existing literature. Message: medical effectiveness and costs of interventions. Channel: CHBRP medical effectiveness reports. Audience: state legislators.
  • Crowley, 2018. Framework: Research-to-Policy Collaboration (RPC) model. Method: mixed methods; 10 legislative offices and 22 prevention experts trained. Level/branch: federal; legislative. Source: Rapid Response Researcher Network. Message: research information based on legislative inquiry. Channel: in-person meetings and web conferencing. Audience: federal legislative offices; legislators and legislative staff.
  • Friese, 2009. Framework: Two Communities Theory and Community Dissonance Theory. Method: qualitative, 14 interviews. Level/branch: federal, state; executive, legislative. Source: researchers who presented at Family Impact Seminars between 1993 and 2002. Message: child support, early childhood education, helping poor children succeed, long-term care, moving families out of poverty, parenting, prescription drugs, and welfare reform. Channel: presenting or testifying before Congress, legislatures, and committees; responding to individual questions from policymakers and staff via phone and email; serving on committees, advisory panels, and task forces; writing briefs, memoranda, and contract research reports. Audience: legislators, legislative aides, governor's office staff, legislative service agency staff, and agency representatives.
  • Garcia, 2016. Framework: Consolidated Framework (Damschroder). Method: quantitative, n = 96. Level/branch: county; executive. Source: research evidence (undefined). Message: child welfare practices, mental health, juvenile justice, and other social service areas. Channel: academic journals, training manuals, presentations, consultants, intervention developers, and web-based clearinghouses. Audience: county-level executive policymakers.
  • Hopkins, 2018. Framework: Social Network Theory. Method: quantitative, 56 CSSS members. Level/branch: state; executive. Source: CSSS leaders and members. Message: research related to science education policy. Channel: person-to-person exchanges. Audience: state education agency leaders.
  • Jabbar, 2015. Framework: Advocacy Coalition Framework and operationalization framework. Method: qualitative, 53 interviews. Level/branch: state, local/municipal; executive, legislative. Source: intermediary organizations, researchers, journalists. Message: incentive-based reforms for public school systems. Channel: conversations, news stories, press releases, targeted reports, informational cards, blog posts, Twitter. Audience: local, state, and federal policymakers.
  • Jamieson, 1999. Framework: none. Method: qualitative (case study). Level/branch: state; executive. Source: analysis of Washington state child welfare data. Message: foster care, guardianship, and racial differences in the foster care system. Channel: publications; presentations to a wide array of audiences; repetition of focused messages via presentations, newsletters, news media, public forums, articles, participation in work groups, meetings with key individuals, and broad distribution of reports; placing the data in a local context. Audience: DSHS administration, social workers, legislators, judges, the attorney general, tribal councils and American Indian communities, African American communities, child advocates, court-appointed special advocates, and private agencies.
  • Lane, 2011. Framework: knowledge-value mapping (KVM). Method: other (comparative effectiveness study design); (spokespeople for) 6 national organizations. Level/branch: federal; executive. Source: academic journals; training programs and conference proceedings; relevant websites; white papers and internal reports from other sources; experts; assistive technology agencies. Message: research on assistive technology devices and services. Channel: electronic media (email, listservs, websites); conferences, proceedings, presentations, and workshops; trainings/certificate programs; white papers and position papers; small-group meetings with policymakers and staff members in government agencies. Audience: public policy agencies.
  • Massell, 2012. Framework: none. Method: mixed methods (interviews and survey); 49 interviews and 300+ surveys. Level/branch: state; executive. Source: research; program evaluation; research-based guidance; data on schools in the state; advice from colleagues within the SEA and external practitioners. Message: research on improving low-performing schools (broad and generic). Channel: advice from colleagues; school data; published original research, research syntheses or summaries, and meta-analyses; results of program evaluations (least often cited). Audience: state education agency leaders.
  • McBride, 2008. Framework: none. Method: qualitative (focus groups); 3 groups (17, 20+, 20+). Level/branch: federal; legislative. Source: researchers. Message: health services research (broadly). Channel: brochures, fact sheets, working papers/reports, policy briefs, PowerPoint presentation slides, press releases, websites, and other products provided by researchers. Audience: national organizations and congressional staffers.
  • McGinty, 2019. Framework: Advocacy Coalition Framework. Method: qualitative (interviews, case study); n = 25. Level/branch: federal, state. Source: researchers and stakeholders who worked as collaborators in each of the cases. Message: relevant research for gun policy, opioid policy, and drug control policy. Channel: coalition convening, working groups, and a sign-on process to endorse recommendations; public release of recommendations; policy dissemination and education via state forums; legislation development; formal policy advocacy; policy implementation support. Audience: policymakers (general).
  • McVay, 2016. Framework: none. Method: quantitative, n = 266. Level/branch: state, local/municipal; executive. Source: public health researchers (general). Message: generalized research findings (no specific topic). Channel: face-to-face meetings, academic journals, press releases, policy briefs, and media interviews. Audience: local, state, and/or federal public health departments.
  • Meisel, 2019. Framework: none. Method: qualitative, 18 interviews. Level/branch: federal, state, other; branch not specified. Source: economic research. Message: treatment and economic research in substance use disorder. Channel: informal networks with researchers; person-to-person communication; conferences and webinars; literature reviews and summary reports. Audience: policymakers (general).
  • Mosley, 2012. Framework: none. Method: qualitative (interviews, observations); 38 interviews. Level/branch: state; executive, legislative. Source: co-sponsors of a bill (9 organizations). Message: cost effectiveness and the experiences of youth who age out of foster care. Channel: press releases, press conferences, one-page summaries, and other communications; testimonial evidence. Audience: state policymakers.
  • Nelson, 2009. Framework: Weiss typology. Method: qualitative (interviews, focus groups); 10 interviews and 55 participants in 5 focus groups. Level/branch: federal, state, local/municipal; executive, legislative. Source: evidence (peer-reviewed and commissioned studies); organizations and individuals; publications and conferences; electronic sources. Message: educational information influential to No Child Left Behind. Channel: newspapers; media reports; constituent feedback; data (state and local databases, evaluation data from previous initiatives, data collected from multiple databases); personal experience and the experience of others from similar schools, districts, and states; empirical research evidence. Audience: education policymakers, staff, and advocates.
  • Purtle, 2019. Framework: Weiss typology. Method: qualitative (interviews, content analysis of newspaper articles); 44 interviews, unknown number of articles. Level/branch: county; executive. Source: published data (County Health Rankings) from the University of Wisconsin Population Health Institute. Message: county-level health information. Channel: County Health Rankings report. Audience: county health department officials.
  • Sorian, 2002. Framework: none. Method: qualitative and quantitative; 292 surveys. Level/branch: state; legislative, executive. Source: publications and conferences; electronic sources; organizations (public and private) and individuals. Message: general health policy issues. Channel: electronic and hard-copy print materials. Audience: state legislative policymakers.
  • Valentine, 2014. Framework: none. Method: qualitative (interviews, focus groups). Level/branch: state; executive. Source: senior advisors to policymakers from state departments of mental health in management positions, and a policy director from a national nonprofit organization. Message: state-level mental health disparities. Channel: mental health care disparity report cards. Audience: state mental health executives (the audience for the report cards would be state policymakers).
  • Weiss, 2008. Framework: Weiss typology ("imposed use"). Method: qualitative, 16 interviews. Level/branch: federal, state, local/municipal; executive. Source: US Department of Education expert panel. Message: recommendations for substance abuse prevention programs for implementation in primary and secondary educational settings. Channel: List of Exemplary and Promising Prevention Programs. Audience: school district stakeholders.
  • Weissman, 2015. Framework: none. Method: quantitative; 46 states, 60 (out of 100) directors. Level/branch: state; executive. Source: published research; experts; own data; patients. Message: application of CER coverage decisions. Channel: RCTs; consensus statements; systematic reviews; expert opinion; observational studies (own data, other data); patient experience/consumer advocacy. Audience: state Medicaid directors and pharmacy directors.
  • Yanovitsky, 2019. Framework: Message Framing Theory; Weiss typology. Method: qualitative; thematic and content analysis of 786 documents. Level/branch: federal; legislative. Source: government, academic, and think-tank research, anecdotal evidence, and 'generic' research. Message: research evidence on childhood obesity. Channel: congressional committee hearings and bills. Audience: federal legislators and their staff.
  • Zelizer, 2018. Framework: none. Method: quantitative; 74 legislators (1,216 legislator-bill dyads). Level/branch: state; legislative. Source: a legislative staffer who worked for the Veterans Caucus. Message: information on proposed legislation. Channel: in-person meeting with one Veterans Caucus staffer. Audience: state senators or representatives.

We examined dissemination based on geographic regions and/or political boundaries (i.e., regions or states). Sixteen of the 27 studies (about 59%) used national samples or multiple states and did not provide geographic-specific results [ 27 – 42 ]. Two studies (about 7%) did not specify the geographic region or state in which the study took place [ 43 , 44 ]. Of the remaining studies, four examined policymaking in the Northeastern United States [ 45 – 48 ], four in the Western US [ 49 – 52 ], and one in the South [ 53 ]. The geographic regional groups used similar channels to disseminate evidence to policymakers, including publications and presentations.

We also analyzed whether dissemination at different levels of government (i.e., local, state, and federal) used unique channels. Six of the included studies (about 22%) examined multiple levels of government and did not separate results by level of government [ 27 – 31 , 53 ]. One study did not specifically identify the level of government examined [ 46 ]. While there is considerable overlap in dissemination channels used at each level of government, there are some unique characteristics.

Five studies (about 18.5%) examined dissemination at the federal level [ 32 – 36 ]. At the federal level, dissemination channels tended to be more formal, such as congressional committee hearings [ 36 ] and legislative development [ 35 ]. Twelve studies (about 44%) evaluated dissemination at the state level [ 38 – 44 , 47 , 48 , 50 – 52 ]. State-level dissemination relied heavily on printed materials, including mental health care disparity report cards [ 41 ], policy briefs [ 38 ], and effectiveness reports [ 50 ]. Another common channel was in-person communication, such as one-on-one meetings [ 44 ] and presentations to stakeholders [ 51 ]. Three studies (about 11%) focused on local-level government. Dissemination channels at the local level had little consistency across the three studies, with channels including public education [ 45 ], reports [ 37 ], and print materials [ 49 ].

Roughly half of the studies were atheoretical ( n = 13). Four studies used the Weiss typology [ 29 , 36 , 54 , 55 ], two studies used the operationalization framework [ 45 , 53 ], and two studies used the Advocacy Coalition Framework [ 53 , 56 ].

Model for dissemination of research

We used the Model for Dissemination of Research to summarize the findings from the included studies into the themes of source, message, audience, and channel (i.e., strategies). We integrated themes from the studies into the Model (see Fig. 3).

Fig. 3. A conceptual model for dissemination of research to policymakers. The populated conceptual model builds on the Model for Dissemination of Research by organizing findings from the current systematic review to build an understanding of how research is disseminated to policymakers in the United States.

The sources of knowledge varied across studies, with some studies including multiple sources of social policy information. The most common sources of knowledge were research in the form of peer-reviewed literature ( n = 7) [ 30 , 33 , 38 , 42 , 43 , 49 , 54 ], researchers ( n = 5) [ 27 , 31 , 32 , 34 , 56 ], and research broadly defined ( n = 5) [ 36 , 39 , 47 , 48 , 55 ]; the government ( n = 11) [ 29 , 36 , 41 – 44 , 47 , 50 , 54 , 56 , 57 ]; and organizations ( n = 7) [ 33 , 36 , 46 , 52 – 54 , 56 ].

The majority of studies focused on health topics ( n = 12) [ 29 , 30 , 33 , 34 , 38 , 41 , 42 , 45 , 47 , 55 , 56 , 58 ] and child and family well-being ( n = 6) [ 27 , 36 , 46 , 49 , 52 , 57 ]. The remaining studies covered the topics of education ( n = 4) [ 39 , 43 , 53 , 54 ], guns [ 56 ], veterans [ 44 ], and general social research ( n = 3) [ 31 , 32 , 48 ]. Multiple studies offered specific recommendations for message framing, suggesting that the packaging of information is as critical as the information itself [ 27 ]. One study piloted multiple styles of policy briefs and found staffers preferred to use and share narrative or story-based briefs while legislators were more likely to use and share statistical, data-based briefs [ 38 ]. This finding was mirrored in two studies that found testimonial or descriptive evidence to be as effective as data-driven research [ 34 , 52 ], particularly in the context of sympathetic populations [ 52 ]. Three studies highlighted the reliance of effective message delivery on the message’s ability to capture audience interest (e.g., what the research means to the policymaker, specifically and if possible, personally) [ 27 , 34 , 41 ]. Finally, two studies emphasized creating a sense of urgency or even shock-value within the message in order to capture policymakers’ interest [ 36 , 57 ].

Within the executive branch, the audience included policymakers [ 49 ], administrators ( n = 9) [ 27 , 31 , 38 , 39 , 41 , 43 , 53 , 55 , 57 ], and staff [ 42 ]. Studies that focused on the legislative branch examined legislators ( n = 12) [ 27 , 32 , 36 , 38 , 44 – 47 , 50 , 52 , 53 , 58 ] and staff ( n = 3) [ 32 , 34 , 36 ]. Three studies examined broadly defined policymakers [ 33 , 54 , 56 ] and generalized staff [ 54 ] without indicating a specific branch of government.

Included studies examined a variety of channels, with many including multiple channels. Print materials were the most commonly used channel, including reports ( n = 10) [ 27 , 30 , 33 , 41 , 46 , 50 , 53 , 55 , 57 , 58 ] and policy briefs ( n = 3) [ 31 , 34 , 38 ]. Researchers examined in-person meetings and communications as a channel to disseminate research ( n = 9) [ 30 , 32 , 33 , 39 , 44 , 48 , 53 , 56 , 57 ]. Research and research summaries were also studied ( n = 7) [ 30 , 31 , 42 , 47 , 49 , 52 , 54 ]. Both traditional ( n = 6) [ 31 , 33 , 47 , 52 – 54 ] and social media ( n = 2) [ 47 , 53 ] were examined as channels to disseminate research to policymakers. Other channels included conferences and presentations ( n = 4) [ 33 , 34 , 49 , 57 ], electronic communication ( n = 2) [ 27 , 57 ], online resources ( n = 3) [ 34 , 49 , 58 ], and personal testimony ( n = 2) [ 42 , 52 ].
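The channel counts in this section amount to tallying, across the abstracted studies, which channels each one examined. The sketch below illustrates that kind of tally with a few hypothetical study records standing in for the abstraction data.

```python
from collections import Counter

# Hypothetical per-study channel lists (stand-ins for the abstracted data).
study_channels = {
    "Brownson 2011": ["policy briefs"],
    "Crowley 2018": ["in-person meetings", "web conferencing"],
    "Jabbar 2015": ["reports", "traditional media", "social media"],
    "Sorian 2002": ["print materials", "electronic materials"],
}

# Count how many studies examined each channel.
channel_counts = Counter(
    channel for channels in study_channels.values() for channel in channels
)
for channel, n in channel_counts.most_common():
    print(f"{channel}: examined in {n} study/studies")
```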

Effectiveness and lessons learned

The majority of studies employed qualitative research methods (e.g., interviews, case studies, focus groups) to evaluate the impact of scientific research on domestic social policy. Our review of the literature also identified nine quantitative and mixed-methods studies [ 31 , 32 , 38 , 39 , 42 – 44 , 49 , 58 ]. We identified a series of cross-cutting dissemination strategies for engaging policymakers, including recommendations for and barriers to research-to-policy translation (see Table 2).

Table 2. Strategies for engaging policymakers, with recommendations for and barriers to research-policy translation.

  • Start early. Recommendations: engage policymakers when planning research; be strategic about research and audience; take the initiative to contact policymakers.
  • Drum up support. Recommendations: involve a broad pool of experts; cultivate a broad coalition of supporters. Barriers: policymakers may appear not to value research.
  • Use research evidence 'champions' or 'brokers'. Recommendations: research use 'champions' engage with community stakeholders and policymakers; intermediary organizations connect “research supply” to “research demand”; external brokers play a role both in connecting policymakers to research and in conceptualizing and developing policy. Barriers: intermediary individuals or organizations may select or spin research to make their point; policymakers may have a list of preferred evidence brokers; basing policy on evidence requires identifying the 'best evidence', which may reflect bias and favoritism.
  • Context matters. Recommendations: integrate research evidence into the broader sociopolitical context; research must be locally and contextually relevant; specify which government office(s) are responsible. Barriers: federally imposed policies (e.g., education) often override local expertise around context and population; ideology, whether personal or regional, may create a barrier between researchers and policymakers.
  • Make research products timely, relevant, and accessible. Recommendations: tailor the design of products to meet diverse end-user needs; present research in commonly used formats (e.g., briefs, talking points, videos); research must be timely and geared to policymakers' concerns; use clear, careful language; formalize the organizational/individual process of translating research to policy. Barriers: complexity of research; disconnect between the goals and language of policymakers and researchers; concerns about data/research evidence quality.
  • Know the players and the process. Recommendations: familiarize yourself with the policymaking process; show respect for policymakers' knowledge and experiences; learn about and build relationships with the target policymaking audience; expand contact and working relationships with end users. Barriers: lack of familiarity with effective dissemination strategies; lack of financial and institutional support for dissemination.
  • Miscellaneous. Recommendation: approach policy work as an educator rather than as an advocate.

Start early

Four studies highlighted the importance of early and ongoing engagement with policymakers throughout the research process in order to maximize interest and applicability. Researchers are encouraged to take the initiative to contact policymakers as early as possible in the research process. Many policymakers may be interested in accessing and using research but uncertain about whom to contact or how to make connections in the academic or research community [ 27 ]. Involving policymakers when designing projects and framing initial research questions increases the likelihood that key policy stakeholders will remain invested in the work by allowing their individual research interests to shine [ 34 , 41 ]. Early engagement also ensures that research products (e.g., reports, policy briefs, factsheets) will have strategic usefulness for policymakers [ 30 ].

Drum up support

In addition to early policymaker engagement, three studies highlighted the need for researchers to garner outside support for their work, ideally involving a broad pool of experts and cultivating a broader coalition of supporters than typical academic endeavors [ 47 ]. Often, policymakers appear unwilling or uninterested in considering the application of evidence to their work [ 45 , 53 ]; when researchers can demonstrate the value and relevance of their work [ 58 ], policymakers may be more likely to engage.

Use research evidence “champions” or “brokers”

A common strategy for garnering support (as recommended above) is the use of evidence champions or brokers ; these are intermediary individuals or organizations who connect research suppliers (e.g., individual researchers, academic institutions) to research demand (e.g., policymakers) [ 53 ]. These champions can broker important connections; however, researchers and policymakers alike must remember that these intermediaries are not neutral carriers of information, and may spin research in support of personal agendas [ 45 , 52 , 53 ]. Individual biases may also present a barrier in research-to-policy translation, as individuals or organizations are empowered to select the “best” research evidence to share with policymakers [ 29 ]. One study found that nearly half of state policymakers named professional associations as trusted sources for research information, specifically because the organization is perceived not to have a stake in the final policy outcome [ 58 ].

Two studies specifically addressed the role of intermediary organizations or brokers in the translation of research evidence to policy. Hopkins et al. [ 39 ] explored the exchange of research evidence among state education agency (SEA) leaders, while Massell et al. [ 43 ] examined more broadly the origins of research evidence use in three SEAs. Both studies found that external brokers played a role in connecting SEA policymakers to relevant research, as well as in the conceptualization and development of policy.

Focus on context

Multiple studies stressed the importance of research evidence being contextually relevant to the specific policy audience [ 29 , 54 , 55 , 57 ]. For some policymakers, the needs and interests of local constituents will drive the use of research and the specifics of the policy agenda; for others, discussions that integrate research evidence into the broader sociopolitical context will be more effective [ 45 ]. For state- and local-level policymakers, policies may be most effective when based on the evidence-based understanding of local stakeholders, rather than imposed from the federal level without local contextual details [ 29 ].

The ideology of external advisors and brokers (as discussed above), policymakers' own personal beliefs and experiences [ 54 ], and the prevailing political ideology of a particular geographic region [ 55 ] are critical components of context. Ideological beliefs, often deeply held and personal, may create a barrier between researchers and policymakers [ 41 ], though differentiating ideology from other factors that affect individual position-taking is difficult in most situations [ 44 ]. McGinty et al. [ 56 ] suggest that in polarized contexts involving strong ideological beliefs, research may add legitimacy to a particular viewpoint, though as with brokers, that research is likely to be carefully curated to support the desired message. Purtle et al. [ 55 ] concur, reporting that some county health officials were wary of the potential to spin research findings to make a case for certain programs over others and noted the need to avoid distorting evidence. Two studies recommend positional neutrality as a researcher's best approach to handling potential ideological differences, suggesting that presenting research findings as simple fact, rather than making specific recommendations for action, may help avoid conflict and also help researchers gain credibility across the ideological spectrum [ 27 , 50 ].

Make research products timely, relevant, and accessible

As with all research endeavors, timeliness and relevance are paramount. However, the typical timeline for academic research (years) is often too long for policymakers whose window for championing a policy action is much shorter (weeks or months) [ 27 , 52 ]. A frequently reported barrier in research-to-policy translation is the complexity of research and concerns about the quality of research evidence [ 29 , 41 , 56 ]; one strategy for combating this concern is the use of clear, careful language [ 27 ], and tailored, audience-specific products that meet the needs of a diverse population of end users [ 27 , 34 , 58 ]. Research that is presented in commonly used, accessible formats (e.g., briefs, factsheets, videos) [ 48 ] may also be more effective, though one study found that use of these formats was dependent on job type, with legislators and staffers preferring different formats [ 58 ].

Multiple studies engaged with policymakers in an effort to determine how they receive research evidence and what strategies or formats are most desirable or effective [ 38 ]. After piloting four different styles of policy briefs (on the same research topic) with state-level policymakers, Brownson et al. [ 38 ] found that while all styles of brief were considered understandable and credible, opinions on the usefulness of the brief varied by the style of the brief and by the level of policymaker (e.g., legislative staff, legislators, and executive branch administrators). These findings suggest that targeted, audience-specific research evidence materials may be more likely to be used by policymakers than generic research evidence. One study explored the usefulness of electronic vs. printed research material and again found differences by type of policymaker—legislators were more likely to read hard copy printed material, while staffers gave higher ratings to online content. Not surprisingly, the age of the policymaker also played a role in the choice to access electronic or printed material, with younger policymakers much more likely to read electronic copy than were their older peers [ 58 ].

A study on state policymakers’ perceptions of comparative effectiveness research (CER) found that the most useful research is that which is consistent and specific to the needs of the policymakers [ 42 ]. The same study identified related barriers to the use of CER in policy decision-making, citing a lack of relevant high quality or conclusive research [ 42 ].

Finally, two studies described pilot projects focused on the delivery of research evidence directly to policymakers. The first cultivated researchers’ capacity to accelerate the translation of research evidence into useable knowledge for policymakers through a rapid response researcher network [ 32 ]. This model was shown to be effective for both researchers (in mobilizing) and policymakers (in eliciting requests for research evidence to bolster a policy conversation or debate) [ 32 ]. The second implementation study reported on a field experiment in which state legislators randomly received relevant research about pending policy proposals [ 44 ]. Findings from this study suggest that having relevant research information increases policymakers’ co-sponsorship of proposals by 60% and highlights the importance of research access in the policy process [ 44 ].

Know the players and the process

Policymakers are as much experts in their arena as researchers are in their academic fields. In order to build lasting working relationships with a target policymaking audience and maximize the relevance of research products for policy work, researchers must first understand the policy process [ 27 , 30 , 34 ]. One study examined the role of researchers themselves in disseminating findings to policymakers and identified individual- and organizational-level facilitators and barriers to the process [ 31 ]. Researchers’ familiarity with the policy process, the relevance of policy dissemination to individual programs of research, and the expectation of dissemination (from higher institutional or funding bodies) facilitated the research-to-policy exchange, while lack of familiarity with effective dissemination strategies and lack of financial and institutional support for dissemination emerged as primary barriers in the research-to-policy exchange [ 31 ].

Public policy, whether legislative, executive, or judicial, affects all areas of daily life in both obvious and subtle ways. The policy process (i.e., the steps from an idea to policy enactment) does not exist in a vacuum; it is influenced by many factors, including public opinion [ 59 , 60 ], special interest groups [ 61 ], personal narratives [ 62 ], expressed needs of constituents [ 1 ], the media [ 63 – 65 ], and corporations [ 66 , 67 ]. Research may also play a role in shaping policy and has the potential to add objectivity and evidence to these other forces [ 1 , 2 , 68 ]. The current study synthesizes existing knowledge to understand dissemination strategies of social policy research to policymakers in the United States.

Many channels exist to disseminate evidence to policymakers, with the most common being print materials (i.e., reports and policy briefs). This finding is surprising in our current digital age, as print materials are necessarily time-bound and rapidly evolving technology has created more channels (e.g., social media, videos) which may be preferred by policymakers. This shift creates an opportunity to optimize the content of print materials to disseminate in new mediums; it also offers a chance for authors to improve the accessibility of their work for broader audiences (e.g., via more visual presentation formats) [ 15 , 69 – 71 ].

Our review found that strategies to increase the effectiveness of research dissemination to policymakers include starting early, drumming up support, using champions and brokers, understanding the context, ensuring the timeliness, relevance, and accessibility of research products, and knowing the players and the process. These themes align with existing knowledge about policymaker preferences, including face-to-face engagement [ 72 , 73 ], contextual considerations (e.g., timeliness and budget) [ 2 , 72 ], and existing barriers and facilitators to research evidence use [ 4 , 5 ]. Our study adds to what we already know about policymakers' desire for research evidence and their varying preferences as to the context and form of that knowledge [ 2 , 72 , 74 ] and supports existing efforts to bridge the gap between researchers and policymakers.

Many of the barriers and facilitators to research dissemination that we identified in this review mirror those cited by policymakers as barriers and facilitators to evidence use; this overlap suggests that efforts to improve research dissemination may also improve evidence use. Particularly relevant lessons from the evidence use literature that also emerged from our review include the benefit of building personal relationships between researchers and policymakers [ 5 , 75 , 76 ], narrowing the perceived gap between the two groups [ 77 , 78 ], and changing the culture of decision making to increase appreciation for the value of research in policy development [ 5 , 75 – 77 ]. Considering the multiple pathways through which research evidence is used in policy, from providing direct evidence of a program's effectiveness to informing or orienting policymakers about relevant issues [ 23 ], these shared lessons around barriers and facilitators may better inform researchers, policymakers, and staff as to best practices for future communication and collaboration.

Our findings also highlight several unique elements of the US policy landscape, wherein significant power is withheld from the federal level and afforded to state-level government. In some states, this power is further distributed to county and local governments. This system creates major variation across the country in both policy decisions and resource availability for social policy implementation. Despite this relatively unique government structure, however, many of the effective strategies for dissemination we identified mirror strategies found in other countries [ 79 , 80 ].

Studies that focused on a specific level of government had some unique characteristics such as formality and reliance on print materials. For example, federal dissemination relied more heavily on formal legislative testimony while state level material relied on written policy materials (e.g., policy briefs, report cards). However, these results are limited by small sample sizes and limited evidence about effectiveness.

A wide range of contextual variables may influence policy dissemination in the US at different levels of government. In the federal legislative context alone, multiple committees and subcommittees of both the U.S. House of Representatives and the U.S. Senate may exercise some control over programs and policies related to a single social policy issue (e.g., child and family services) [ 81 ]. At the federal level, the Congressional Research Service (CRS) provides non-partisan research support to legislators in multiple formats including reports on major policy issues, expert testimony, and responses to individual inquiries; the Domestic Social Policy Division offers Congress interdisciplinary research and analysis on social policy issues [ 82 ]. While there may be fewer decision-makers for each issue on the state level, policymaking is further complicated by the extensive rules and reporting requirements attached to state use of federal funding as well as competing priorities or needs at the local level within each state [ 83 , 84 ]. Another dissemination influence may include geographic proximity; for example, geographical proximity may increase the likelihood of university-industry partnerships [ 85 ].

Infrastructure differences may also represent important differences between the US social policy context and that of other developed nations. Each country has a distinct and perhaps unique policy context given available resources, political rules and regulations, and priorities. While models for infrastructure and dissemination interventions may be shared across policy contexts, it may be difficult to directly compare dissemination strategies in one country with dissemination strategies in another country.

Several examples across western countries contribute to a stronger nexus between research evidence and the policy-making process. In the United States, the Wisconsin Family Impact Seminars ( www.wisfamilyimpact.org ) are an example of long-standing initiatives that provide the opportunity for researchers and policymakers to come together to discuss unbiased policy-relevant evidence [ 86 ]. As exemplified by Friese and Bogenschneider [ 27 ], these forums continue to be perceived as objective, relevant, and useful by policymakers and have succeeded at bringing attention to social policy [ 86 ]. Researchers and policymakers in Canada have sought to bridge the research-to-policy gap. For example, the Canadian Foundation for Healthcare Improvement (formerly the Canadian Health Services Research Foundation), funded by the Canadian federal government, brings together researchers and policymakers early and throughout the research development process to discuss, prioritize, and evaluate opportunities for research and dissemination [ 79 ]. In the UK, infrastructure at the national level includes the National Institute for Health Research Policy Research Programme, which funds health research with the explicit goal of informing national policy decisions in health and social care [ 87 ]. These efforts include open calls for research proposals as well as 15 dedicated Policy Research Units located at leading academic institutions around the country. Another resource is the EPPI-Centre at University College London, which provides policymakers support for finding and using research to inform policy decisions through its Research Advisory Service. This allows researchers to work alongside policymakers to reach their goals in addressing educational needs with evidence-informed policy [ 80 ].

Limitations

The current study has several limitations, which also illustrate opportunities for future research. First, we attempted to cast a wide net when searching for studies that examined the influence of research on social policy by including a broad search of the peer-reviewed literature, think tanks, and content experts. However, it is possible we missed some studies that examine how research influences policy. Second, we provided a rationale for focusing on US studies, and our findings may not be generalizable to other countries. Third, we were unable to assess the risk of bias for individual studies, as current standards note difficulties in assessing quality and bias in qualitative research [ 88 ]. Fourth, many studies examined multiple channels or strategies for how research influences policy, so the parsing of singular strategies (e.g., policy brief, in-person meeting) as an effective approach should be interpreted with caution. Additional investigation is needed to explore and test causal pathways in how these channels can best influence social policy. Fifth, the majority of studies did not use any theory or framework as a foundation or guide for exploration. This gap may indicate a space to use frameworks such as the Model for Dissemination of Research to guide future research. Finally, the dearth of mixed-methods studies that systematically evaluate the impact of research evidence on domestic social policy (this review identified only 3) presents an opportunity for future work in this field to integrate quantitative and qualitative methodologies.

One significant challenge to increasing the rigor of dissemination research studies is the difficulty of choosing and then measuring an outcome. Many of the studies included in this review are either case studies or descriptive, making it difficult to determine what, if any, impact the given research had on policy. Bogenschneider and Corbett discuss this at length as one of the primary challenges to furthering this research [ 72 ], imploring researchers not to focus solely on the outcome of whether or not a piece of legislation passes but rather to examine whether research influenced one of the proposed policy options [ 72 ]. However, this information can be difficult both to operationalize and to collect. That said, some researchers have already begun to think beyond the passage of legislation, as evidenced by Zelizer [ 44 ], who examined bill co-sponsorship rather than passage. A recent review of health policy implementation measurement found that validated quantitative measures are underutilized and recommends further development and testing of such measures [ 89 ]. Difficulties in identifying robust outcomes and high-quality scales to operationalize them present opportunities for additional exploration in this area.

Dissemination and implementation are often described together; not surprisingly, there is overlap in effective strategies for each. The current review identified six dissemination strategies and described their reported effectiveness, while the Expert Recommendations for Implementing Change (ERIC) Project identified 73 implementation strategies [ 90 ]. One similarity is obvious: the dissemination strategy of using champions and brokers mirrors the ERIC implementation strategy of identifying and preparing champions. The difference between the number of implementation strategies and dissemination strategies is striking and highlights the gap in research. Future work should further explore the degree to which dissemination strategies and implementation strategies overlap or are distinct.

Finally, the dissemination of research to policymakers may raise certain ethical issues. It is imperative for researchers to critically assess when and how to disseminate research findings to policymakers, keeping in mind that promoting a specific policy agenda may result in a perceived or real loss of objectivity [ 91 ]. Syntheses of policy-relevant evidence can be useful, particularly when researchers work in partnership with non-governmental organizations to inform the policy process.

We summarize strategies and illuminate potential barriers to the research-to-policy dissemination process. Key findings are drawn from multiple disciplines and suggest that lessons learned may cut across both research topics and levels of government. The most frequently referenced channel for dissemination to policymakers was print materials, with personal communication (including both in-person and electronic meetings and individual communications) a close second. Corresponding strategies for effective dissemination to policymakers included starting early, drumming up support, using champions and brokers, understanding the context, ensuring the timeliness, relevance, and accessibility of research products, and knowing the players and the process. A shared feature of these strategies is the distillation of complex research findings into accessible pieces of relevant information that can then be delivered via multiple avenues.

Interdisciplinary collaboration is a common practice in scientific research [ 92 ]. Our findings provide leads on how to engage more effectively with policymakers, increasing the likelihood of translating research evidence into policy action. Engaging policymakers early as contributing members of the research team, maintaining communication during the research process, and presenting relevant findings in a clear, concise manner may empower both researchers and policymakers to further apply scientific evidence to improve social policy in the United States.

Supplementary information

Acknowledgements.

The views expressed herein are those of the authors and do not reflect those of the Department of Veterans Affairs, the Centers for Disease Control and Prevention, or the National Institutes of Health.

Abbreviations

US: United States
PRISMA-P: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
CER: Comparative Effectiveness Research
ERIC: Expert Recommendations for Implementing Change

Authors’ contributions

Review methodology: LEA, DAQ, RCB; eligibility criteria: LEA, DAQ, RCB; search strings and terms: LEA, DAQ; abstract screening: LEA, DAQ; full text screening: LEA, DAQ; pilot extraction: LEA, DAQ; data extraction: LEA, DAQ; data aggregation: LEA, DAQ; writing: LEA, DAQ; editing: LEA, DAQ, RCB. The author(s) read and approved the final manuscript.

LEA is supported by a pre-doctoral Clinical and Translational Science Fellowship (NIH TL1 TR001858 (PI: Kraemer)). DAQ is supported by a postdoctoral fellowship through the Department of Veterans Affairs (VA) Office of Academic Affiliations and the Center for Health Equity Research and Promotion at the VA Pittsburgh Healthcare System. RCB is supported by the National Cancer Institute (P50CA244431) and the Centers for Disease Control and Prevention (U48DP006395). The funding entities had no role in the development, data collection, analysis, reporting, or publication of this work. Article processing charges for this article were fully paid by the University Library System, University of Pittsburgh.

Availability of data and materials

Ethics approval and consent to participate.

Not applicable.

Consent for publication

Competing interests.

The authors declare they have no conflicting interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Laura Ellen Ashcraft, Email: lauraellenashcraft@pitt.edu.

Deirdre A. Quinn, Email: [email protected] .

Ross C. Brownson, Email: rbrownson@wustl.edu.

Supplementary information accompanies this paper at 10.1186/s13012-020-01046-3.

Systematic review • Open access • Published: 07 August 2024

Models and frameworks for assessing the implementation of clinical practice guidelines: a systematic review

Nicole Freitas de Mello (ORCID: 0000-0002-5228-6691), Sarah Nascimento Silva (ORCID: 0000-0002-1087-9819), Dalila Fernandes Gomes (ORCID: 0000-0002-2864-0806), Juliana da Motta Girardi (ORCID: 0000-0002-7547-7722) & Jorge Otávio Maia Barreto (ORCID: 0000-0002-7648-0472)

Implementation Science, volume 19, Article number: 59 (2024)


The implementation of clinical practice guidelines (CPGs) is a cyclical process in which the evaluation stage can facilitate continuous improvement. Implementation science has utilized theoretical approaches, such as models and frameworks, to understand and address this process. This article aims to provide a comprehensive overview of the models and frameworks used to assess the implementation of CPGs.

A systematic review was conducted following the Cochrane methodology, with adaptations to the "selection process" due to the unique nature of this review. The findings were reported following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) reporting guidelines. Electronic databases were searched from their inception until May 15, 2023. A predetermined strategy and manual searches were conducted to identify relevant documents from health institutions worldwide. Eligible studies presented models and frameworks for assessing the implementation of CPGs. Information on the characteristics of the documents, the context in which the models were used (specific objectives, level of use, type of health service, target group), and the characteristics of each model or framework (name, domain evaluated, and model limitations) were extracted. The domains of the models were analyzed according to the key constructs: strategies, context, outcomes, fidelity, adaptation, sustainability, process, and intervention. A subgroup analysis was performed grouping models and frameworks according to their levels of use (clinical, organizational, and policy) and type of health service (community, ambulatorial, hospital, institutional). The JBI’s critical appraisal tools were utilized by two independent researchers to assess the trustworthiness, relevance, and results of the included studies.

Database searches yielded 14,395 studies, of which 80 full texts were reviewed. Eight studies were included in the data analysis and four methodological guidelines were additionally included from the manual search. The risk of bias in the studies was considered non-critical for the results of this systematic review. A total of ten models/frameworks for assessing the implementation of CPGs were found. The level of use was mainly policy, the most common type of health service was institutional, and the major target group was professionals directly involved in clinical practice. The evaluated domains differed between the models and there were also differences in their conceptualization. All the models addressed the domain "Context", especially at the micro level (8/12), followed by the multilevel (7/12). The domains "Outcome" (9/12), "Intervention" (8/12), "Strategies" (7/12), and "Process" (5/12) were frequently addressed, while "Sustainability" was found only in one study, and "Fidelity/Adaptation" was not observed.

Conclusions

The use of models and frameworks for assessing the implementation of CPGs is still incipient. This systematic review may help stakeholders choose or adapt the most appropriate model or framework to assess CPGs implementation based on their specific health context.

Trial registration

PROSPERO (International Prospective Register of Systematic Reviews) registration number: CRD42022335884. Registered on June 7, 2022.


Contributions to the literature

Although the number of theoretical approaches has grown in recent years, there are still important gaps to be explored in the use of models and frameworks to assess the implementation of clinical practice guidelines (CPGs). This systematic review aims to contribute knowledge to overcome these gaps.

Despite the great advances in implementation science, evaluating the implementation of CPGs remains a challenge, and models and frameworks could support improvements in this field.

This study demonstrates that the available models and frameworks do not cover all characteristics and domains necessary for a complete evaluation of CPGs implementation.

The presented findings contribute to the field of implementation science, encouraging debate on choices and adaptations of models and frameworks for implementation research and evaluation.

Substantial investments have been made in clinical research and development in recent decades, increasing the medical knowledge base and the availability of health technologies [ 1 ]. The use of clinical practice guidelines (CPGs) has increased worldwide to guide best health practices and to maximize healthcare investments. A CPG can be defined as "any formal statements systematically developed to assist practitioner and patient decisions about appropriate health care for specific clinical circumstances" [ 2 ] and has the potential to improve patient care by promoting interventions of proven benefit and discouraging ineffective interventions. Furthermore, they can promote efficiency in resource allocation and provide support for managers and health professionals in decision-making [ 3 , 4 ].

However, having a quality CPG does not guarantee that the expected health benefits will be obtained. In fact, putting these guidelines to use still presents a challenge for most health services across distinct levels of government. In addition to being developed with high methodological rigor, recommendations need to be made available to their users (the diffusion and dissemination stages) and then used in clinical practice (implemented), which usually requires behavioral changes and appropriate resources and infrastructure. All these stages involve an iterative and complex process called implementation, which is defined as the process of putting new practices within a setting into use [ 5 , 6 ].

Implementation is a cyclical process, and the evaluation is one of its key stages, which allows continuous improvement of CPGs development and implementation strategies. It consists of verifying whether clinical practice is being performed as recommended (process evaluation or formative evaluation) and whether the expected results and impact are being reached (summative evaluation) [ 7 , 8 , 9 ]. Although the importance of the implementation evaluation stage has been recognized, research on how these guidelines are implemented is scarce [ 10 ]. This paper focused on the process of assessing CPGs implementation.

To understand and improve this complex process, implementation science provides a systematic set of principles and methods to integrate research findings and other evidence-based practices into routine practice and improve the quality and effectiveness of health services and care [ 11 ]. The field of implementation science uses theoretical approaches that have varying degrees of specificity based on the current state of knowledge and are structured based on theories, models, and frameworks [ 5 , 12 , 13 ]. A "Model" is defined as "a simplified depiction of a more complex world with relatively precise assumptions about cause and effect", and a "framework" is defined as "a broad set of constructs that organize concepts and data descriptively without specifying causal relationships" [ 9 ]. Although these concepts are distinct, in this paper, their use will be interchangeable, as they are typically like checklists of factors relevant to various aspects of implementation.

There are a variety of theoretical approaches available in implementation science [ 5 , 14 ], which can make choosing the most appropriate challenging [ 5 ]. Some models and frameworks have been categorized as "evaluation models" by providing a structure for evaluating implementation endeavors [ 15 ], even though theoretical approaches from other categories can also be applied for evaluation purposes because they specify concepts and constructs that may be operationalized and measured [ 13 ]. Two frameworks that can specify implementation aspects that should be evaluated as part of intervention studies are RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance) [ 16 ] and PRECEDE-PROCEED (Predisposing, Reinforcing and Enabling Constructs in Educational Diagnosis and Evaluation-Policy, Regulatory, and Organizational Constructs in Educational and Environmental Development) [ 17 ]. Although the number of theoretical approaches has grown in recent years, the use of models and frameworks to evaluate the implementation of guidelines still seems to be a challenge.

This article aims to provide a complete map of the models and frameworks applied to assess the implementation of CPGs. It also aims to inform debate and choices about models and frameworks for researching and evaluating the implementation processes of CPGs, thereby facilitating the continued development of the field of implementation science and contributing to healthcare policy and practice.

A systematic review was conducted following the Cochrane methodology [ 18 ], with adaptations to the "selection process" due to the unique nature of this review (details can be found in the respective section). The review protocol was registered in PROSPERO (registration number: CRD42022335884) on June 7, 2022. This report adhered to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [ 19 ] and a completed checklist is provided in Additional File 1.

Eligibility criteria

The SDMO approach (Types of Studies, Types of Data, Types of Methods, Outcomes) [ 20 ] was utilized in this systematic review, outlined as follows:

Types of studies

All types of studies were considered for inclusion, as the assessment of CPG implementation can benefit from a diverse range of study designs, including randomized clinical trials/experimental studies, scale/tool development, systematic reviews, opinion pieces, qualitative studies, peer-reviewed articles, books, reports, and unpublished theses.

Studies were categorized based on their methodological designs, which guided the synthesis, risk of bias assessment, and presentation of results.

Study protocols and conference abstracts were excluded due to insufficient information for this review.

Types of data

Studies that evaluated the implementation of CPGs either independently or as part of a multifaceted intervention.

Guidelines for evaluating CPG implementation.

Inclusion of CPGs related to any context, clinical area, intervention, and patient characteristics.

No restrictions were placed on publication date or language.

Exclusion criteria

General guidelines were excluded, as this review focused on 'models for evaluating clinical practice guidelines implementation' rather than the guidelines themselves.

Studies that focused solely on implementation determinants as barriers and enablers were excluded, as this review aimed to explore comprehensive models/frameworks.

Studies evaluating programs and policies were excluded.

Studies that only assessed implementation strategies (isolated actions) rather than the implementation process itself were excluded.

Studies that focused solely on the impact or results of implementation (summative evaluation) were excluded.

Types of methods

Not applicable.

Outcomes

All potential models or frameworks for assessing the implementation of CPGs (evaluation models/frameworks), as well as their characteristics: name; specific objectives; levels of use (clinical, organizational, and policy); health system (public, private, or both); type of health service (community, ambulatorial, hospital, institutional, homecare); domains or outcomes evaluated; type of recommendation evaluated; context; limitations of the model.

Model was defined as "a deliberated simplification of a phenomenon on a specific aspect" [ 21 ].

Framework was defined as "structure, overview outline, system, or plan consisting of various descriptive categories" [ 21 ].

Models or frameworks used solely for the CPG development, dissemination, or implementation phase.

Models/frameworks used solely for assessment processes other than implementation, such as for the development or dissemination phase.

Data sources and literature search

The systematic search was conducted on July 31, 2022 (and updated on May 15, 2023) in the following electronic databases: MEDLINE/PubMed, Centre for Reviews and Dissemination (CRD), the Cochrane Library, Cumulative Index to Nursing and Allied Health Literature (CINAHL), EMBASE, Epistemonikos, Global Health, Health Systems Evidence, PDQ-Evidence, PsycINFO, Rx for Change (Canadian Agency for Drugs and Technologies in Health, CADTH), Scopus, Web of Science and Virtual Health Library (VHL). The Google Scholar database was used for the manual selection of studies (first 10 pages).

Additionally, hand searches were performed on the lists of references included in the systematic reviews and citations of the included studies, as well as on the websites of institutions working on CPGs development and implementation: Guidelines International Networks (GIN), National Institute for Health and Care Excellence (NICE; United Kingdom), World Health Organization (WHO), Centers for Disease Control and Prevention (CDC; USA), Institute of Medicine (IOM; USA), Australian Department of Health and Aged Care (ADH), Healthcare Improvement Scotland (SIGN), National Health and Medical Research Council (NHMRC; Australia), Queensland Health, The Joanna Briggs Institute (JBI), Ministry of Health and Social Policy of Spain, Ministry of Health of Brazil and Capes Theses and Dissertations Catalog.

The search strategy combined terms related to "clinical practice guidelines" (practice guidelines, practice guidelines as topic, clinical protocols), "implementation", "assessment" (assessment, evaluation), and "models, framework". The free term "monitoring" was not used because it was regularly related to clinical monitoring and not to implementation monitoring. The search strategies adapted for the electronic databases are presented in an additional file (see Additional file 2).
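To make the structure of such a strategy concrete, the sketch below assembles the four concept blocks described above into a single boolean query. It is only an illustration: the exact terms, truncation symbols, and field tags used by the authors are in Additional file 2, and the block contents here are assumptions.

```python
# Minimal sketch of combining the concept blocks described above into one boolean
# query. The specific terms below are illustrative assumptions, not the authors'
# actual search strings (those are in Additional file 2).

concept_blocks = {
    "guideline": ["practice guideline*", "practice guidelines as topic", "clinical protocol*"],
    "implementation": ["implement*"],
    "assessment": ["assess*", "evaluat*"],
    "theoretical approach": ["model*", "framework*"],
}

def or_block(terms):
    """Join the synonyms for one concept with OR; quote multi-word phrases."""
    return "(" + " OR ".join(f'"{t}"' if " " in t else t for t in terms) + ")"

# Concept blocks are combined with AND; the free term "monitoring" is deliberately
# omitted, as explained in the text above.
query = " AND ".join(or_block(terms) for terms in concept_blocks.values())
print(query)
```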

Study selection process

The results of the literature search from scientific databases, excluding the CRD database, were imported into Mendeley Reference Management software to remove duplicates. They were then transferred to the Rayyan platform ( https://rayyan.qcri.org ) [ 22 ] for the screening process. Initially, studies related to the "assessment of implementation of the CPG" were selected. The titles were first screened independently by two pairs of reviewers (first selection: four reviewers, NM, JB, SS, and JG; update: a pair of reviewers, NM and DG). The title screening was broad, including all potentially relevant studies on CPG and the implementation process. Following that, the abstracts were independently screened by the same group of reviewers. The abstract screening was more focused, specifically selecting studies that addressed CPG and the evaluation of the implementation process. In the next step, full-text articles were reviewed independently by a pair of reviewers (NM, DG) to identify those that explicitly presented "models" or "frameworks" for assessing the implementation of the CPG. Disagreements regarding the eligibility of studies were resolved through discussion and consensus, and by a third reviewer (JB) when necessary. One reviewer (NM) conducted manual searches, and the inclusion of documents was discussed with the other reviewers.
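As an illustration of the first step of this workflow, the sketch below shows the kind of duplicate removal performed before screening. It is a stand-in for what Mendeley does automatically, not the authors' procedure; the record fields and matching rule are assumptions.

```python
# Illustrative stand-in for the deduplication step performed in Mendeley before
# records were uploaded to Rayyan for screening; fields and records are hypothetical.
import re

def normalize_title(title: str) -> str:
    """Lowercase, drop punctuation, and collapse whitespace so near-identical titles match."""
    stripped = re.sub(r"[^\w\s]", "", title.lower())
    return re.sub(r"\s+", " ", stripped).strip()

def deduplicate(records):
    """Keep the first record for each (normalized title, year) pair."""
    seen, unique = set(), []
    for rec in records:
        key = (normalize_title(rec["title"]), rec.get("year"))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"title": "Assessing CPG implementation: a framework.", "year": 2020, "source": "MEDLINE/PubMed"},
    {"title": "Assessing CPG Implementation: A Framework", "year": 2020, "source": "EMBASE"},
]
print(len(deduplicate(records)))  # -> 1; the second record is treated as a duplicate
```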

Risk of bias assessment of studies

The selected studies were independently classified and evaluated according to their methodological designs by two investigators (NM and JG). This review employed JBI’s critical appraisal tools to assess the trustworthiness, relevance and results of the included studies [ 23 ] and these tools are presented in additional files (see Additional file 3 and Additional file 4). Disagreements were resolved by consensus or consultation with the other reviewers. Methodological guidelines and noncomparative and before–after studies were not evaluated because JBI does not have specific tools for assessing these types of documents. Although the studies were assessed for quality, they were not excluded on this basis.
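A minimal sketch of how such dual independent appraisal can be recorded and disagreements flagged for consensus is shown below; the checklist items and answers are hypothetical placeholders, and the actual JBI tools are in Additional files 3 and 4.

```python
# Hypothetical record of two reviewers' answers to JBI-style appraisal items,
# flagging items that need consensus (or a third reviewer). The real checklist
# items are in Additional files 3 and 4.
appraisal = {
    "Were the criteria for inclusion in the sample clearly defined?": {"NM": "yes", "JG": "yes"},
    "Were confounding factors identified?": {"NM": "unclear", "JG": "no"},
    "Were objective, standard criteria used for measurement?": {"NM": "yes", "JG": "yes"},
}

needs_consensus = [item for item, answers in appraisal.items()
                   if len(set(answers.values())) > 1]

for item in needs_consensus:
    print("Resolve by consensus:", item)
```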

Data extraction

The data was independently extracted by two reviewers (NM, DG) using a Microsoft Excel spreadsheet. Discrepancies were discussed and resolved by consensus. The following information was extracted:

Document characteristics: author; year of publication; title; study design; instrument of evaluation; country; guideline context;

Usage context of the models: specific objectives; level of use (clinical, organizational, and policy); type of health service (community, ambulatorial, hospital, institutional); target group (guideline developers; clinicians; health professionals; health-policy decision-makers; health-care organizations; service managers);

Model and framework characteristics: name, domain evaluated, and model limitations.

The set of information to be extracted, shown in the systematic review protocol, was adjusted to improve the organization of the analysis.

The "level of use" refers to the scope of the model used. "Clinical" was considered when the evaluation focused on individual practices, "organizational" when practices were within a health service institution, and "policy" when the evaluation was more systemic and covered different health services or institutions.

The "type of health service" indicated the category of health service where the model/framework was used (or can be used) to assess the implementation of the CPG, related to the complexity of healthcare. "Community" is related to primary health care; "ambulatorial" is related to secondary health care; "hospital" is related to tertiary health care; and "institutional" represented models/frameworks not specific to a particular type of health service.

The "target group" included stakeholders related to the use of the model/framework for evaluating the implementation of the CPG, such as clinicians, health professionals, guideline developers, health policy-makers, health organizations, and service managers.

The category "health system" (public, private, or both) mentioned in the systematic review protocol was not found in the literature obtained and was removed as an extraction variable. Similarly, the variables "type of recommendation evaluated" and "context" were grouped because the same information was included in the "guideline context" section of the study.

Some of the selected documents presented models or frameworks recognized in the scientific field, including some that have been validated. However, some studies adapted the model to their context. Therefore, the domain analysis covered all model or framework domains evaluated by (or suggested for evaluation by) each document analyzed.
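To illustrate how the extraction fields above fit together, the sketch below defines one hypothetical row of the extraction spreadsheet. The field names follow the variables described in this section; the categorical value sets and the validation step are assumptions about how such a record might be checked, not the authors' actual spreadsheet.

```python
# Hypothetical structure for one row of the extraction spreadsheet described above;
# field names mirror the text, and the categories come from the definitions of
# "level of use" and "type of health service".
from dataclasses import dataclass, field
from typing import List

LEVELS_OF_USE = {"clinical", "organizational", "policy"}
HEALTH_SERVICES = {"community", "ambulatorial", "hospital", "institutional", "not defined"}

@dataclass
class ExtractionRecord:
    author: str
    year: int
    title: str
    study_design: str
    instrument: str
    country: str
    guideline_context: str
    specific_objectives: str
    level_of_use: str
    type_of_health_service: str
    target_group: List[str] = field(default_factory=list)
    model_name: str = ""
    domains_evaluated: List[str] = field(default_factory=list)
    model_limitations: str = ""

    def __post_init__(self):
        # Simple consistency checks on the categorical fields.
        assert self.level_of_use in LEVELS_OF_USE
        assert self.type_of_health_service in HEALTH_SERVICES
```

A filled-in record for, say, a hospital-level PARiHS study could then feed directly into the frequency counts reported in the Results.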

Data analysis and synthesis

The results were tabulated using narrative synthesis with an aggregative approach, without meta-analysis, aiming to summarize the documents descriptively for the organization, description, interpretation and explanation of the study findings [ 24 , 25 ].

The model/framework domains evaluated in each document were studied according to Nilsen et al.’s constructs: "strategies", "context", "outcomes", "fidelity", "adaptation" and "sustainability". For this study, "strategies" were described as structured and planned initiatives used to enhance the implementation of clinical practice [ 26 ].

The definition of "context" varies in the literature. Despite that, this review considered it as the set of circumstances or factors surrounding a particular implementation effort, such as organizational support, financial resources, social relations and support, leadership, and organizational culture [ 26 , 27 ]. The domain "context" was subdivided according to the level of health care into "micro" (individual perspective), "meso" (organizational perspective), "macro" (systemic perspective), and "multiple" (when there is an issue involving more than one level of health care).

The "outcomes" domain was related to the results of the implementation process (unlike clinical outcomes) and was stratified according to the following constructs: acceptability, appropriateness, feasibility, adoption, cost, and penetration. All these concepts align with the definitions of Proctor et al. (2011), although we decided to separate "fidelity" and "sustainability" as independent domains similar to Nilsen [ 26 , 28 ].

"Fidelity" and "adaptation" were considered the same domain, as they are complementary pieces of the same issue. In this study, implementation fidelity refers to how closely guidelines are followed as intended by their developers or designers. On the other hand, adaptation involves making changes to the content or delivery of a guideline to better fit the needs of a specific context. The "sustainability" domain was defined as evaluations about the continuation or permanence over time of the CPG implementation.

Additionally, the domain "process" was utilized to address issues related to the implementation process itself, rather than focusing solely on the outcomes of the implementation process, as done by Wang et al. [ 14 ]. Furthermore, the "intervention" domain was introduced to distinguish aspects related to the CPG characteristics that can impact its implementation, such as the complexity of the recommendation.

A subgroup analysis was performed with models and frameworks categorized based on their levels of use (clinical, organizational, and policy) and the type of health service (community, ambulatorial, hospital, institutional) associated with the CPG. The goal is to assist stakeholders (politicians, clinicians, researchers, or others) in selecting the most suitable model for evaluating CPG implementation based on their specific health context.
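The sketch below shows, with made-up records, the kind of tallying that produces statements such as "context (micro): 8/12" and the subgroup cross-tabulation of level of use by type of health service; it illustrates the aggregative synthesis and is not the authors' code.

```python
# Sketch of the tallies behind statements such as "context (micro): 8/12" and the
# subgroup table (level of use x type of health service); the records are hypothetical.
from collections import Counter
from itertools import chain

documents = [
    {"model": "CFIR", "level": "organizational", "service": "hospital",
     "domains": ["context:micro", "context:multiple", "process", "intervention"]},
    {"model": "RE-AIM", "level": "clinical", "service": "ambulatorial",
     "domains": ["context:micro", "outcome:adoption", "outcome:penetration", "sustainability"]},
    {"model": "PARiHS", "level": "organizational", "service": "hospital",
     "domains": ["context:multiple", "strategies", "outcome:acceptability"]},
]

n = len(documents)
domain_counts = Counter(chain.from_iterable(d["domains"] for d in documents))
for domain, count in domain_counts.most_common():
    print(f"{domain}: {count}/{n}")

subgroups = Counter((d["level"], d["service"]) for d in documents)
for (level, service), count in subgroups.items():
    print(f"{level} x {service}: {count} document(s)")
```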

Search results

Database searches yielded 26,011 studies, of which 107 full texts were reviewed. During the full-text review, 99 articles were excluded: 41 studies did not mention a model or framework for assessing the implementation of the CPG, 31 studies evaluated only implementation strategies (isolated actions) rather than the implementation process itself, and 27 articles were not related to the implementation assessment. Therefore, eight studies were included in the data analysis. The updated search did not reveal additional relevant studies. The main reason for study exclusion was that they did not use models or frameworks to assess CPG implementation. Additionally, four methodological guidelines were included from the manual search (Fig.  1 ).
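As a quick arithmetic check, the counts reported above are internally consistent: the three exclusion reasons account for all 99 excluded full texts, and the eight database studies plus the four manual additions give the 12 documents analyzed.

```python
# Sanity check on the study flow counts reported in this paragraph.
full_texts_reviewed = 107
excluded = {"no model/framework": 41, "strategies only": 31, "not implementation assessment": 27}
included_from_databases = 8
included_from_manual_search = 4

assert sum(excluded.values()) == 99
assert full_texts_reviewed - sum(excluded.values()) == included_from_databases
assert included_from_databases + included_from_manual_search == 12
print("Flow counts are consistent.")
```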

Fig. 1 PRISMA diagram. Acronyms: ADH—Australian Department of Health, CINAHL—Cumulative Index to Nursing and Allied Health Literature, CDC—Centers for Disease Control and Prevention, CRD—Centre for Reviews and Dissemination, GIN—Guidelines International Networks, HSE—Health Systems Evidence, IOM—Institute of Medicine, JBI—The Joanna Briggs Institute, MHB—Ministry of Health of Brazil, NICE—National Institute for Health and Care Excellence, NHMRC—National Health and Medical Research Council, MSPS—Ministerio de Sanidad Y Política Social (Spain), SIGN—Scottish Intercollegiate Guidelines Network, VHL—Virtual Health Library, WHO—World Health Organization. Legend: Reason A – The study evaluated only implementation strategies (isolated actions) rather than the implementation process itself. Reason B – The study did not mention a model or framework for assessing the implementation of the intervention. Reason C – The study was not related to the implementation assessment. Adapted from Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 2021;372:n71. https://doi.org/10.1136/bmj.n71 .

According to the JBI’s critical appraisal tools, the overall assessment of the studies indicates their acceptance for the systematic review.

The cross-sectional studies lacked clear information regarding "confounding factors" or "strategies to address confounding factors". This was understandable given the nature of the study, where such details are not typically included. However, the reviewers did not find this lack of information to be critical, allowing the studies to be included in the review. The results of this methodological quality assessment can be found in an additional file (see Additional file 5).

In the qualitative studies, there was some ambiguity regarding the questions: "Is there a statement locating the researcher culturally or theoretically?" and "Is the influence of the researcher on the research, and vice versa, addressed?". However, the reviewers decided to include the studies and deemed the methodological quality sufficient for the analysis in this article, based on the other information analyzed. The results of this methodological quality assessment can be found in an additional file (see Additional file 6).

Document characteristics (Table 1)

The documents originated from several continents: Australia/Oceania (4/12) [ 31 , 33 , 36 , 37 ], North America (4/12) [ 30 , 32 , 38 , 39 ], Europe (2/12) [ 29 , 35 ] and Asia (2/12) [ 34 , 40 ]. The types of documents were classified as cross-sectional studies (4/12) [ 29 , 32 , 34 , 38 ], methodological guidelines (4/12) [ 33 , 35 , 36 , 37 ], mixed methods studies (3/12) [ 30 , 31 , 39 ] or noncomparative studies (1/12) [ 40 ]. In terms of the instrument of evaluation, most of the documents used a survey/questionnaire (6/12) [ 29 , 30 , 31 , 32 , 34 , 38 ], while three (3/12) used qualitative instruments (interviews, group discussions) [ 30 , 31 , 39 ], one used a checklist [ 37 ], one used an audit [ 33 ], and three (3/12) did not define a specific measurement instrument [ 35 , 36 , 40 ].

Considering the clinical areas covered, most studies evaluated the implementation of nonspecific (general) clinical areas [ 29 , 33 , 35 , 36 , 37 , 40 ]. However, some studies focused on specific clinical contexts, such as mental health [ 32 , 38 ], oncology [ 39 ], fall prevention [ 31 ], spinal cord injury [ 30 ], and sexually transmitted infections [ 34 ].

Usage context of the models (Table  1 )

Specific objectives.

All the studies highlighted the purpose of guiding the process of evaluating the implementation of CPGs, even if they evaluated CPGs from generic or different clinical areas.

Levels of use

The most common level of use of the models/frameworks identified to assess the implementation of CPGs was policy (6/12) [ 33 , 35 , 36 , 37 , 39 , 40 ]. At this level, the model is used in a systematic way to evaluate all the processes involved in CPGs implementation and is primarily related to methodological guidelines. This was followed by the organizational level of use (5/12) [ 30 , 31 , 32 , 38 , 39 ], where the model is used to evaluate the implementation of CPGs in a specific institution, considering its specific environment. Finally, the clinical level of use (2/12) [ 29 , 34 ] focuses on individual practice and the factors that can influence the implementation of CPGs by professionals.

Type of health service

Institutional services were predominant (5/12) [ 33 , 35 , 36 , 37 , 40 ] and included methodological guidelines and a study of model development and validation. Hospitals were the second most common type of health service (4/12) [ 29 , 30 , 31 , 34 ], followed by ambulatorial (2/12) [ 32 , 34 ] and community health services (1/12) [ 32 ]. Two studies did not specify which type of health service the assessment addressed [ 38 , 39 ].

Target group

The target group was most often professionals directly involved in clinical practice (6/12) [ 29 , 31 , 32 , 34 , 38 , 40 ], namely, health professionals and clinicians. Less frequently targeted stakeholders included guideline developers (2/12) [ 39 , 40 ], health policy decision makers (1/12) [ 39 ], and healthcare organizations (1/12) [ 39 ]. The target group was not defined in the methodological guidelines, although all the mentioned stakeholders could be related to these documents.

Model and framework characteristics

Models and frameworks for assessing the implementation of CPGs

The Consolidated Framework for Implementation Research (CFIR) [ 31 , 38 ] and the Promoting Action on Research Implementation in Health Systems (PARiHS) framework [ 29 , 30 ] were the most commonly employed frameworks within the selected documents. The other models mentioned were: Goal commitment and implementation of practice guidelines framework [ 32 ]; Guideline to identify key indicators [ 35 ]; Guideline implementation checklist [ 37 ]; Guideline implementation evaluation tool [ 40 ]; JBI Implementation Framework [ 33 ]; Reach, effectiveness, adoption, implementation and maintenance (RE-AIM) framework [ 34 ]; The Guideline Implementability Framework [ 39 ] and an unnamed model [ 36 ].

Domains evaluated

The number of domains evaluated (or suggested for evaluation) by the documents varied between three and five, with the majority focusing on three domains. All the models addressed the domain "context", with a particular emphasis on the micro level of the health care context (8/12) [ 29 , 31 , 34 , 35 , 36 , 37 , 38 , 39 ], followed by the multilevel (7/12) [ 29 , 31 , 32 , 33 , 38 , 39 , 40 ], meso level (4/12) [ 30 , 35 , 39 , 40 ] and macro level (2/12) [ 37 , 39 ]. The "Outcome" domain was evaluated in nine models. Within this domain, the most frequently evaluated subdomain was "adoption" (6/12) [ 29 , 32 , 34 , 35 , 36 , 37 ], followed by "acceptability" (4/12) [ 30 , 32 , 35 , 39 ], "appropriateness" (3/12) [ 32 , 34 , 36 ], "feasibility" (3/12) [ 29 , 32 , 36 ], "cost" (1/12) [ 35 ] and "penetration" (1/12) [ 34 ]. Regarding the other domains, "Intervention" (8/12) [ 29 , 31 , 34 , 35 , 36 , 38 , 39 , 40 ], "Strategies" (7/12) [ 29 , 30 , 33 , 35 , 36 , 37 , 40 ] and "Process" (5/12) [ 29 , 31 , 32 , 33 , 38 ] were frequently addressed in the models, while "Sustainability" (1/12) [ 34 ] was only found in one model, and "Fidelity/Adaptation" was not observed. The domains presented by the models and frameworks and evaluated in the documents are shown in Table  2 .

Limitations of the models

Only two documents mentioned limitations in the use of the models or frameworks. Both reported limitations in the use of CFIR: it "is complex and cumbersome and requires tailoring of the key variables to the specific context", and "this framework should be supplemented with other important factors and local features to achieve a sound basis for the planning and realization of an ongoing project" [ 31 , 38 ]. Limitations in the use of the other models or frameworks were not reported.

Subgroup analysis

Following the subgroup analysis (Table  3 ), five different models/frameworks were utilized at the policy level by institutional health services. These included the Guideline Implementation Evaluation Tool [ 40 ], the NHMRC tool (model name not defined) [ 36 ], the JBI Implementation Framework + GRiP [ 33 ], Guideline to identify key indicators [ 35 ], and the Guideline implementation checklist [ 37 ]. Additionally, the "Guideline Implementability Framework" [ 39 ] was implemented at the policy level without restrictions based on the type of health service. Regarding the organizational level, the models used varied depending on the type of service. The "Goal commitment and implementation of practice guidelines framework" [ 32 ] was applied in community and ambulatory health services, while "PARiHS" [ 29 , 30 ] and "CFIR" [ 31 , 38 ] were utilized in hospitals. In contexts where the type of health service was not defined, "CFIR" [ 31 , 38 ] and "The Guideline Implementability Framework" [ 39 ] were employed. Lastly, at the clinical level, "RE-AIM" [ 34 ] was utilized in ambulatory and hospital services, and PARiHS [ 29 , 30 ] was specifically used in hospital services.

Key findings

This systematic review identified 10 models/frameworks used to assess the implementation of CPGs in various health system contexts. These documents shared similar objectives in utilizing models and frameworks for assessment. The primary level of use was policy, the most common type of health service was institutional, and the main target group of the documents was professionals directly involved in clinical practice. The models and frameworks presented varied analytical domains, with sometimes divergent concepts used in these domains. This study is innovative in its emphasis on the evaluation stage of CPG implementation and in summarizing aspects and domains aimed at the practical application of these models.

The small number of documents contrasts with studies that present an extensive range of models and frameworks available in implementation science. The findings suggest that the use of models and frameworks to evaluate the implementation of CPGs is still in its early stages. Among the selected documents, there was a predominance of cross-sectional studies and methodological guidelines, which strongly influenced how the implementation evaluation was conducted. This was primarily done through surveys/questionnaires, qualitative methods (interviews, group discussions), and non-specific measurement instruments. Regarding the subject areas evaluated, most studies focused on a general clinical area, while others explored different clinical areas. This suggests that the evaluation of CPG implementation has been carried out in various contexts.

The models were chosen independently of the categories proposed in the literature, with their usage categorized for purposes other than implementation evaluation, as is the case with CFIR and PARiHS. This practice was described by Nilsen et al. who suggested that models and frameworks from other categories can also be applied for evaluation purposes because they specify concepts and constructs that may be operationalized and measured [ 14 , 15 , 42 , 43 ].

The results highlight the increased use of models and frameworks in evaluation processes at the policy level and institutional environments, followed by the organizational level in hospital settings. This finding contradicts a review that reported the policy level as an area that was not as well studied [ 44 ]. The use of different models at the institutional level is also emphasized in the subgroup analysis. This may suggest that the greater the impact (social, financial/economic, and organizational) of implementing CPGs, the greater the interest and need to establish well-defined and robust processes. In this context, the evaluation stage stands out as crucial, and the investment of resources and efforts to structure this stage becomes even more advantageous [ 10 , 45 ]. Two studies (16.7%) evaluated the implementation of CPGs at the individual level (clinical level). These studies stand out for their potential to analyze variations in clinical practice in greater depth.

In contrast to the level of use and type of health service most strongly indicated in the documents, with systemic approaches, the target group most observed was professionals directly involved in clinical practice. This suggests an emphasis on evaluating individual behaviors. This same emphasis is observed in the analysis of the models, in which there is a predominance of evaluating the micro level of the health context and the "adoption" subdomain, in contrast with the sub-use of domains such as "cost" and "process". Cassetti et al. observed the same phenomenon in their review, in which studies evaluating the implementation of CPGs mainly adopted a behavioral change approach to tackle those issues, without considering the influence of wider social determinants of health [ 10 ]. However, the literature widely reiterates that multiple factors impact the implementation of CPGs, and different actions are required to make them effective [ 6 , 46 , 47 ]. As a result, there is enormous potential for the development and adaptation of models and frameworks aimed at more systemic evaluation processes that consider institutional and organizational aspects.

In analyzing the model domains, most models focused on evaluating only some aspects of implementation (three domains). All models evaluated the "context", highlighting its significant influence on implementation [ 9 , 26 ]. Context is an essential effect modifier for providing research evidence to guide decisions on implementation strategies [ 48 ]. Contextualizing a guideline involves integrating research or other evidence into a specific circumstance [ 49 ]. The analysis of this domain was adjusted to include all possible contextual aspects, even if they were initially allocated to other domains. Some contextual aspects presented by the models vary in comprehensiveness, such as the assessment of the "timing and nature of stakeholder engagement" [ 39 ], which includes individual engagement by healthcare professionals and organizational involvement in CPG implementation. While the importance of context is universally recognized, its conceptualization and interpretation differ across studies and models. This divergence is also evident in other domains, consistent with existing literature [ 14 ]. Efforts to address this conceptual divergence in implementation science are ongoing, but further research and development are needed in this field [ 26 ].

The main subdomain evaluated was "adoption" within the outcome domain. This may be attributed to the ease of accessing information on the adoption of the CPG, whether through computerized system records, patient records, or self-reports from healthcare professionals or patients themselves. The "acceptability" subdomain pertains to the perception among implementation stakeholders that a particular CPG is agreeable, palatable or satisfactory. On the other hand, "appropriateness" encompasses the perceived fit, relevance or compatibility of the CPG for a specific practice setting, provider, or consumer, or its perceived fit to address a particular issue or problem [ 26 ]. Both subdomains are subjective and rely on stakeholders' interpretations and perceptions of the issue being analyzed, making them susceptible to reporting biases. Moreover, obtaining this information requires direct consultation with stakeholders, which can be challenging for some evaluation processes, particularly in institutional contexts.

The evaluation of the subdomains "feasibility" (the extent to which a CPG can be successfully used or carried out within a given agency or setting), "cost" (the cost impact of an implementation effort), and "penetration" (the extent to which an intervention or treatment is integrated within a service setting and its subsystems) [ 26 ] was rarely observed in the documents. This may be related to the greater complexity of obtaining information on these aspects, as they involve cross-cutting and multifactorial issues. In other words, it would be difficult to gather this information during evaluations with health practitioners as the target group. This highlights the need for evaluation processes of CPGs implementation involving multiple stakeholders, even if the evaluation is adjusted for each of these groups.

Although the models do not establish the "intervention" domain, we thought it pertinent in this study to delimit the issues that are intrinsic to CPGs, such as methodological quality or clarity in establishing recommendations. These issues were quite common in the models evaluated but were considered in other domains (e.g., in "context"). Studies have reported the importance of evaluating these issues intrinsic to CPGs [ 47 , 50 ] and their influence on the implementation process [ 51 ].

The models explicitly present the "strategies" domain, and its evaluation was usually included in the assessments. This is likely due to the expansion of scientific and practical studies in implementation science that involve theoretical approaches to the development and application of interventions to improve the implementation of evidence-based practices. However, these interventions themselves are not guaranteed to be effective, as reported in a previous review that showed unclear results indicating that the strategies had affected successful implementation [ 52 ]. Furthermore, model domains end up not covering all the complexity surrounding the strategies and their development and implementation process. For example, the ‘Guideline implementation evaluation tool’ evaluates whether guideline developers have designed and provided auxiliary tools to promote the implementation of guidelines [ 40 ], but this does not mean that these tools would work as expected.

The "process" domain was identified in the CFIR [ 31 , 38 ], JBI/GRiP [ 33 ], and PARiHS [ 29 ] frameworks. While it may be included in other domains of analysis, its distinct separation is crucial for defining operational issues when assessing the implementation process, such as determining if and how the use of the mentioned CPG was evaluated [ 3 ]. Despite its presence in multiple models, there is still limited detail in the evaluation guidelines, which makes it difficult to operationalize the concept. Further research is needed to better define the "process" domain and its connections and boundaries with other domains.

The domain of "sustainability" was only observed in the RE-AIM framework, which is categorized as an evaluation framework [ 34 ]. In its acronym, the letter M stands for "maintenance" and corresponds to the assessment of whether the user maintains use, typically longer than 6 months. The presence of this domain highlights the need for continuous evaluation of CPGs implementation in the short, medium, and long term. Although the RE-AIM framework includes this domain, it was not used in the questionnaire developed in the study. One probable reason is that the evaluation of CPGs implementation is still conducted on a one-off basis and not as a continuous improvement process. Considering that changes in clinical practices are inherent over time, evaluating and monitoring changes throughout the duration of the CPG could be an important strategy for ensuring its implementation. This is an emerging field that requires additional investment and research.

The "Fidelity/Adaptation" domain was not observed in the models. These emerging concepts involve the extent to which a CPG is being conducted exactly as planned or whether it is undergoing adjustments and adaptations. Whether or not there is fidelity or adaptation in the implementation of CPGs does not presuppose greater or lesser effectiveness; after all, some adaptations may be necessary to implement general CPGs in specific contexts. The absence of this domain in all the models and frameworks may suggest that they are not relevant aspects for evaluating implementation or that there is a lack of knowledge of these complex concepts. This may suggest difficulty in expressing concepts in specific evaluative questions. However, further studies are warranted to determine the comprehensiveness of these concepts.

It is important to note that the domains of analysis were customized: some domains presented in the models were not evaluated in the studies, while others were added as complements. This can be seen in Jeong et al. [ 34 ], where the "intervention" domain was added to the evaluation with the RE-AIM framework, reinforcing that theoretical approaches are meant to guide the process rather than determine norms. Despite this, few limitations were reported for the models, suggesting that the use of models in these studies reflects the application of these models to defined contexts without a deep critical analysis of their domains.

Limitations

This review has several limitations. First, only a few studies and methodological guidelines that explicitly present models and frameworks for assessing the implementation of CPGs have been found. This means that few alternative models could be analyzed and presented in this review. Second, this review adopted multiple analytical categories (e.g., level of use, health service, target group, and domains evaluated), whose terminology has varied enormously in the studies and documents selected, especially for the "domains evaluated" category. This difficulty in harmonizing the taxonomy used in the area has already been reported [ 26 ] and has significant potential to confuse. For this reason, studies and initiatives are needed to align understandings between concepts and, as far as possible, standardize them. Third, in some studies/documents, the information extracted was not clear about the analytical category. This required an in-depth interpretative process of the studies, which was conducted in pairs to avoid inappropriate interpretations.

Implications

This study contributes to the literature and clinical practice management by describing models and frameworks specifically used to assess the implementation of CPGs based on their level of use, type of health service, target group related to the CPG, and the evaluated domains. While there are existing reviews on the theories, frameworks, and models used in implementation science, this review addresses aspects not previously covered in the literature. This valuable information can assist stakeholders (such as politicians, clinicians, researchers, etc.) in selecting or adapting the most appropriate model to assess CPG implementation based on their health context. Furthermore, this study is expected to guide future research on developing or adapting models to assess the implementation of CPGs in various contexts.

The use of models and frameworks to evaluate the implementation remains a challenge. Studies should clearly state the level of model use, the type of health service evaluated, and the target group. The domains evaluated in these models may need adaptation to specific contexts. Nevertheless, utilizing models to assess CPGs implementation is crucial as they can guide a more thorough and systematic evaluation process, aiding in the continuous improvement of CPGs implementation. The findings of this systematic review offer valuable insights for stakeholders in selecting or adjusting models and frameworks for CPGs evaluation, supporting future theoretical advancements and research.

Availability of data and materials

Abbreviations.

ADH: Australian Department of Health and Aged Care
CADTH: Canadian Agency for Drugs and Technologies in Health
CDC: Centers for Disease Control and Prevention
CFIR: Consolidated Framework for Implementation Research
CINAHL: Cumulative Index to Nursing and Allied Health Literature
CPG: Clinical practice guideline
CRD: Centre for Reviews and Dissemination
GIN: Guidelines International Networks
GRiP: Getting Research into Practice
HSE: Health Systems Evidence
IOM: Institute of Medicine
JBI: The Joanna Briggs Institute
MHB: Ministry of Health of Brazil
MSPS: Ministerio de Sanidad y Política Social
NHMRC: National Health and Medical Research Council
NICE: National Institute for Health and Care Excellence
PARiHS: Promoting action on research implementation in health systems framework
PRECEDE-PROCEED: Predisposing, Reinforcing and Enabling Constructs in Educational Diagnosis and Evaluation-Policy, Regulatory, and Organizational Constructs in Educational and Environmental Development
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PROSPERO: International Prospective Register of Systematic Reviews
RE-AIM: Reach, effectiveness, adoption, implementation, and maintenance framework
SIGN: Healthcare Improvement Scotland
USA: United States of America
VHL: Virtual Health Library
WHO: World Health Organization

Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. 2001. Available from: http://www.nap.edu/catalog/10027 . Cited 2022 Sep 29.

Field MJ, Lohr KN. Clinical Practice Guidelines: Directions for a New Program. Washington DC: National Academy Press. 1990. Available from: https://www.nap.edu/read/1626/chapter/8 Cited 2020 Sep 2.

Dawson A, Henriksen B, Cortvriend P. Guideline Implementation in Standardized Office Workflows and Exam Types. J Prim Care Community Heal. 2019;10. Available from: https://pubmed.ncbi.nlm.nih.gov/30900500/ . Cited 2020 Jul 15.

Unverzagt S, Oemler M, Braun K, Klement A. Strategies for guideline implementation in primary care focusing on patients with cardiovascular disease: a systematic review. Fam Pract. 2014;31(3):247–66. Available from: https://academic.oup.com/fampra/article/31/3/247/608680 . Cited 2020 Nov 5.


Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):1–13. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-015-0242-0 . Cited 2022 May 1.

Article   Google Scholar  

Mangana F, Massaquoi LD, Moudachirou R, Harrison R, Kaluangila T, Mucinya G, et al. Impact of the implementation of new guidelines on the management of patients with HIV infection at an advanced HIV clinic in Kinshasa, Democratic Republic of Congo (DRC). BMC Infect Dis. 2020;20(1):N.PAG-N.PAG. Available from: https://search.ebscohost.com/login.aspx?direct=true&db=c8h&AN=146325052&amp .

Browman GP, Levine MN, Mohide EA, Hayward RSA, Pritchard KI, Gafni A, et al. The practice guidelines development cycle: a conceptual tool for practice guidelines development and implementation. 2016;13(2):502–12. https://doi.org/10.1200/JCO.1995.13.2.502 .

Killeen SL, Donnellan N, O’Reilly SL, Hanson MA, Rosser ML, Medina VP, et al. Using FIGO Nutrition Checklist counselling in pregnancy: A review to support healthcare professionals. Int J Gynecol Obstet. 2023;160(S1):10–21. Available from: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85146194829&doi=10.1002%2Fijgo.14539&partnerID=40&md5=d0f14e1f6d77d53e719986e6f434498f .

Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):1–12. Available from: https://bmcpsychology.biomedcentral.com/articles/10.1186/s40359-015-0089-9 . Cited 2020 Nov 5.

Cassetti V, M VLR, Pola-Garcia M, AM G, J JPC, L APDT, et al. An integrative review of the implementation of public health guidelines. Prev Med reports. 2022;29:101867. Available from: http://www.epistemonikos.org/documents/7ad499d8f0eecb964fc1e2c86b11450cbe792a39 .

Eccles MP, Mittman BS. Welcome to implementation science. Implementation Science BioMed Central. 2006. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-1-1 .

Damschroder LJ. Clarity out of chaos: Use of theory in implementation research. Psychiatry Res. 2020;1(283):112461.

Handley MA, Gorukanti A, Cattamanchi A. Strategies for implementing implementation science: a methodological overview. Emerg Med J. 2016;33(9):660–4. Available from: https://pubmed.ncbi.nlm.nih.gov/26893401/ . Cited 2022 Mar 7.

Wang Y, Wong ELY, Nilsen P, Chung VC ho, Tian Y, Yeoh EK. A scoping review of implementation science theories, models, and frameworks — an appraisal of purpose, characteristics, usability, applicability, and testability. Implement Sci. 2023;18(1):1–15. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-023-01296-x . Cited 2024 Jan 22.

Moullin JC, Dickson KS, Stadnick NA, Albers B, Nilsen P, Broder-Fingert S, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1(1):1–12. Available from: https://implementationsciencecomms.biomedcentral.com/articles/10.1186/s43058-020-00023-7 . Cited 2022 May 20.

Glasgow RE, Vogt TM, Boles SM. *Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322. Available from: /pmc/articles/PMC1508772/?report=abstract. Cited 2022 May 22.

Article   CAS   PubMed   PubMed Central   Google Scholar  

Asada Y, Lin S, Siegel L, Kong A. Facilitators and Barriers to Implementation and Sustainability of Nutrition and Physical Activity Interventions in Early Childcare Settings: a Systematic Review. Prev Sci. 2023;24(1):64–83. Available from: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85139519721&doi=10.1007%2Fs11121-022-01436-7&partnerID=40&md5=b3c395fdd2b8235182eee518542ebf2b .

Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, et al., editors. Cochrane Handbook for Systematic Reviews of Interventions. version 6. Cochrane; 2022. Available from: https://training.cochrane.org/handbook. Cited 2022 May 23.

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372. Available from: https://www.bmj.com/content/372/bmj.n71 . Cited 2021 Nov 18.

M C, AD O, E P, JP H, S G. Appendix A: Guide to the contents of a Cochrane Methodology protocol and review. Higgins JP, Green S, eds Cochrane Handb Syst Rev Interv. 2011;Version 5.

Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implement Sci. 2019;14(1):1–8. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-019-0957-4 . Cited 2024 Jan 22.

Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):1–10. Available from: https://systematicreviewsjournal.biomedcentral.com/articles/10.1186/s13643-016-0384-4 . Cited 2022 May 20.

JBI. JBI’s Tools Assess Trust, Relevance & Results of Published Papers: Enhancing Evidence Synthesis. Available from: https://jbi.global/critical-appraisal-tools . Cited 2023 Jun 13.

Drisko JW. Qualitative research synthesis: An appreciative and critical introduction. Qual Soc Work. 2020;19(4):736–53.

Pope C, Mays N, Popay J. Synthesising qualitative and quantitative health evidence: A guide to methods. 2007. Available from: https://books.google.com.br/books?hl=pt-PT&lr=&id=L3fbE6oio8kC&oi=fnd&pg=PR6&dq=synthesizing+qualitative+and+quantitative+health+evidence&ots=sfELNUoZGq&sig=bQt5wt7sPKkf7hwKUvxq2Ek-p2Q#v=onepage&q=synthesizing=qualitative=and=quantitative=health=evidence& . Cited 2022 May 22.

Nilsen P, Birken SA, Edward Elgar Publishing. Handbook on implementation science. 542. Available from: https://www.e-elgar.com/shop/gbp/handbook-on-implementation-science-9781788975988.html . Cited 2023 Apr 15.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):1–15. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-4-50 . Cited 2023 Jun 13.

Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. Available from: https://pubmed.ncbi.nlm.nih.gov/20957426/ . Cited 2023 Jun 11.

Bahtsevani C, Willman A, Khalaf A, Östman M, Ostman M. Developing an instrument for evaluating implementation of clinical practice guidelines: a test-retest study. J Eval Clin Pract. 2008;14(5):839–46. Available from: https://search.ebscohost.com/login.aspx?direct=true&db=c8h&AN=105569473&amp . Cited 2023 Jan 18.

Balbale SN, Hill JN, Guihan M, Hogan TP, Cameron KA, Goldstein B, et al. Evaluating implementation of methicillin-resistant Staphylococcus aureus (MRSA) prevention guidelines in spinal cord injury centers using the PARIHS framework: a mixed methods study. Implement Sci. 2015;10(1):130. Available from: https://pubmed.ncbi.nlm.nih.gov/26353798/ . Cited 2023 Apr 3.

Article   PubMed   PubMed Central   Google Scholar  

Breimaier HE, Heckemann B, Halfens RJGG, Lohrmann C. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice. BMC Nurs. 2015;14(1):43. Available from: https://search.ebscohost.com/login.aspx?direct=true&db=c8h&AN=109221169&amp . Cited 2023 Apr 3.

Chou AF, Vaughn TE, McCoy KD, Doebbeling BN. Implementation of evidence-based practices: Applying a goal commitment framework. Health Care Manage Rev. 2011;36(1):4–17. Available from: https://pubmed.ncbi.nlm.nih.gov/21157225/ . Cited 2023 Apr 30.

Porritt K, McArthur A, Lockwood C, Munn Z. JBI Manual for Evidence Implementation. JBI Handbook for Evidence Implementation. JBI; 2020. Available from: https://jbi-global-wiki.refined.site/space/JHEI . Cited 2023 Apr 3.

Jeong HJJ, Jo HSS, Oh MKK, Oh HWW. Applying the RE-AIM Framework to Evaluate the Dissemination and Implementation of Clinical Practice Guidelines for Sexually Transmitted Infections. J Korean Med Sci. 2015;30(7):847–52. Available from: https://pubmed.ncbi.nlm.nih.gov/26130944/ . Cited 2023 Apr 3.

GPC G de trabajo sobre implementación de. Implementación de Guías de Práctica Clínica en el Sistema Nacional de Salud. Manual Metodológico. 2009. Available from: https://portal.guiasalud.es/wp-content/uploads/2019/01/manual_implementacion.pdf . Cited 2023 Apr 3.

Australia C of. A guide to the development, implementation and evaluation of clinical practice guidelines. National Health and Medical Research Council; 1998. Available from: https://www.health.qld.gov.au/__data/assets/pdf_file/0029/143696/nhmrc_clinprgde.pdf .

Health Q. Guideline implementation checklist Translating evidence into best clinical practice. 2022.

Google Scholar  

Quittner AL, Abbott J, Hussain S, Ong T, Uluer A, Hempstead S, et al. Integration of mental health screening and treatment into cystic fibrosis clinics: Evaluation of initial implementation in 84 programs across the United States. Pediatr Pulmonol. 2020;55(11):2995–3004. Available from: https://www.embase.com/search/results?subaction=viewrecord&id=L2005630887&from=export . Cited 2023 Apr 3.

Urquhart R, Woodside H, Kendell C, Porter GA. Examining the implementation of clinical practice guidelines for the management of adult cancers: A mixed methods study. J Eval Clin Pract. 2019;25(4):656–63. Available from: https://search.ebscohost.com/login.aspx?direct=true&db=c8h&AN=137375535&amp . Cited 2023 Apr 3.

Yinghui J, Zhihui Z, Canran H, Flute Y, Yunyun W, Siyu Y, et al. Development and validation for evaluation of an evaluation tool for guideline implementation. Chinese J Evidence-Based Med. 2022;22(1):111–9. Available from: https://www.embase.com/search/results?subaction=viewrecord&id=L2016924877&from=export .

Breimaier HE, Halfens RJG, Lohrmann C. Effectiveness of multifaceted and tailored strategies to implement a fall-prevention guideline into acute care nursing practice: a before-and-after, mixed-method study using a participatory action research approach. BMC Nurs. 2015;14(1):18. Available from: https://search.ebscohost.com/login.aspx?direct=true&db=c8h&AN=103220991&amp .

Lai J, Maher L, Li C, Zhou C, Alelayan H, Fu J, et al. Translation and cross-cultural adaptation of the National Health Service Sustainability Model to the Chinese healthcare context. BMC Nurs. 2023;22(1). Available from: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85153237164&doi=10.1186%2Fs12912-023-01293-x&partnerID=40&md5=0857c3163d25ce85e01363fc3a668654 .

Zhao J, Li X, Yan L, Yu Y, Hu J, Li SA, et al. The use of theories, frameworks, or models in knowledge translation studies in healthcare settings in China: a scoping review protocol. Syst Rev. 2021;10(1):13. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7792291 .

Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50. Available from: https://pubmed.ncbi.nlm.nih.gov/22898128/ . Cited 2023 Apr 4.

Phulkerd S, Lawrence M, Vandevijvere S, Sacks G, Worsley A, Tangcharoensathien V. A review of methods and tools to assess the implementation of government policies to create healthy food environments for preventing obesity and diet-related non-communicable diseases. Implement Sci. 2016;11(1):1–13. Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-016-0379-5 . Cited 2022 May 1.

Buss PM, Pellegrini FA. A Saúde e seus Determinantes Sociais. PHYSIS Rev Saúde Coletiva. 2007;17(1):77–93.

Pereira VC, Silva SN, Carvalho VKSS, Zanghelini F, Barreto JOMM. Strategies for the implementation of clinical practice guidelines in public health: an overview of systematic reviews. Heal Res Policy Syst. 2022;20(1):13. Available from: https://health-policy-systems.biomedcentral.com/articles/10.1186/s12961-022-00815-4 . Cited 2022 Feb 21.

Grimshaw J, Eccles M, Tetroe J. Implementing clinical guidelines: current evidence and future implications. J Contin Educ Health Prof. 2004;24 Suppl 1:S31-7. Available from: https://pubmed.ncbi.nlm.nih.gov/15712775/ . Cited 2021 Nov 9.

Lotfi T, Stevens A, Akl EA, Falavigna M, Kredo T, Mathew JL, et al. Getting trustworthy guidelines into the hands of decision-makers and supporting their consideration of contextual factors for implementation globally: recommendation mapping of COVID-19 guidelines. J Clin Epidemiol. 2021;135:182–6. Available from: https://pubmed.ncbi.nlm.nih.gov/33836255/ . Cited 2024 Jan 25.

Lenzer J. Why we can’t trust clinical guidelines. BMJ. 2013;346(7913). Available from: https://pubmed.ncbi.nlm.nih.gov/23771225/ . Cited 2024 Jan 25.

Molino C de GRC, Ribeiro E, Romano-Lieber NS, Stein AT, de Melo DO. Methodological quality and transparency of clinical practice guidelines for the pharmacological treatment of non-communicable diseases using the AGREE II instrument: A systematic review protocol. Syst Rev. 2017;6(1):1–6. Available from: https://systematicreviewsjournal.biomedcentral.com/articles/10.1186/s13643-017-0621-5 . Cited 2024 Jan 25.

Albers B, Mildon R, Lyon AR, Shlonsky A. Implementation frameworks in child, youth and family services – Results from a scoping review. Child Youth Serv Rev. 2017;1(81):101–16.


Acknowledgements

Not applicable

Funding

This study is supported by the Fundação de Apoio à Pesquisa do Distrito Federal (FAPDF). FAPDF Award Term (TOA) nº 44/2024—FAPDF/SUCTI/COOBE (SEI/GDF – Process 00193–00000404/2024–22). The content in this article is solely the responsibility of the authors and does not necessarily represent the official views of the FAPDF.

Author information

Authors and affiliations

Department of Management and Incorporation of Health Technologies, Ministry of Health of Brazil, Brasília, Federal District, 70058-900, Brazil

Nicole Freitas de Mello & Dalila Fernandes Gomes

Postgraduate Program in Public Health, FS, University of Brasília (UnB), Brasília, Federal District, 70910-900, Brazil

Nicole Freitas de Mello, Dalila Fernandes Gomes & Jorge Otávio Maia Barreto

René Rachou Institute, Oswaldo Cruz Foundation, Belo Horizonte, Minas Gerais, 30190-002, Brazil

Sarah Nascimento Silva

Oswaldo Cruz Foundation - Brasília, Brasília, Federal District, 70904-130, Brazil

Juliana da Motta Girardi & Jorge Otávio Maia Barreto


Contributions

NFM and JOMB conceived the idea and the protocol for this study. NFM conducted the literature search. NFM, SNS, JMG and JOMB conducted the data collection with advice and consensus gathering from JOMB. NFM and JMG assessed the quality of the studies. NFM and DFG conducted the data extraction. NFM performed the analysis and synthesis of the results with advice and consensus gathering from JOMB. NFM drafted the manuscript. JOMB critically revised the first version of the manuscript. All the authors revised and approved the submitted version.

Corresponding author

Correspondence to Nicole Freitas de Mello.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: PRISMA checklist. Description of data: Completed PRISMA checklist used for reporting the results of this systematic review.

Additional file 2: Literature search. Description of data: The search strategies adapted for the electronic databases.

Additional file 3: JBI’s critical appraisal tools for cross-sectional studies. Description of data: JBI’s critical appraisal tools to assess the trustworthiness, relevance, and results of the included studies. This is specific for cross-sectional studies.

Additional file 4: JBI’s critical appraisal tools for qualitative studies. Description of data: JBI’s critical appraisal tools to assess the trustworthiness, relevance, and results of the included studies. This is specific for qualitative studies.

Additional file 5: Methodological quality assessment results for cross-sectional studies. Description of data: Methodological quality assessment results for cross-sectional studies using JBI’s critical appraisal tools.

Additional file 6: Methodological quality assessment results for the qualitative studies. Description of data: Methodological quality assessment results for qualitative studies using JBI’s critical appraisal tools.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .


About this article

Cite this article

Freitas de Mello, N., Nascimento Silva, S., Gomes, D.F. et al. Models and frameworks for assessing the implementation of clinical practice guidelines: a systematic review. Implementation Sci 19, 59 (2024). https://doi.org/10.1186/s13012-024-01389-1


Received: 06 February 2024

Accepted: 01 August 2024

Published: 07 August 2024

DOI: https://doi.org/10.1186/s13012-024-01389-1


Keywords

  • Implementation
  • Practice guideline
  • Evidence-Based Practice
  • Implementation science



Research Assistant I - Public Health/Health Sciences, IHP/DEI

How to apply

To be considered for this position, you must attach a cover letter as the first page of your resume. The cover letter should address your specific interest in the position and outline the skills and experience that directly relate to the qualifications of this position.

Job Summary

The University of Michigan-Flint is seeking a Research Assistant I (Temporary).  This research is part of the DEI Centers Project: Exploring the History, Role of Student Activism, Administrative Leadership, and Fiscal Decisions in the Founding and Maintenance of Center for Global Engagement at the University of  Michigan-Flint, led by Dr. Lisa Lapeyrouse and funded by the University of Michigan's Inclusive History Project.

Three objectives of this work are:

  • Research and document the history and role of student activism in the establishment of the Center for Global Engagement on the UM-Flint campus
  • Research and document how changes in leadership and funding have shaped and reshaped the Center for Global Engagement's infrastructure, capacity, priorities, and programming over time
  • Research and document the relationship between students' sense of belonging and the Center for Global Engagement

This position runs from August 2024 through December 31, 2024, at approximately 6 hours per week. The work schedule will be determined after hire, taking a student's existing schedule into account where applicable.

Responsibilities*

  • Interviewing, collecting and organizing historical documents and materials
  • Preparing historical documents and materials for preservation
  • Instrument design
  • Qualitative and quantitative data collection and analysis
  • Presentation/dissemination of research findings

Required Qualifications*

  • Strong oral and written communication skills
  • Ability to work independently
  • Research experience 
  • Ability to work at the UM-Flint Campus (not a remote position)
  • Available at least 6 hours per week

Desired Qualifications*

  • Prior research experience 
  • Demonstrated ability to respect confidentiality
  • Knowledge of the Center for Global Engagement's mission, programming and patrons/clientele
  • Undergraduate student in good standing with the University of Michigan-Flint 

Modes of Work

Onsite: The work requires, or the supervisor approves, a fully onsite presence. Onsite is defined as a designated U-M owned or leased work location within or outside of the State of Michigan.

Additional Information

University of Michigan-Flint - Plan for Diversity, Equity and Inclusion

The University of Michigan-Flint's DEI plan can be found at: https://www.umflint.edu/dei/?  

The University of Michigan-Flint exhibits its commitment to diversity, equity, and inclusion through enacting fair practices, policies, and procedures particularly in support of the equitable participation of the historically underserved. UM-Flint recognizes the value of diversity in our efforts to provide equitable access and opportunities to all regardless of individual identities in support of a climate where everyone feels a sense of belonging, community, and agency.

Diversity is a core value at University of Michigan-Flint. We are passionate about building and sustaining an inclusive and equitable working and learning environment for all students, staff, and faculty. The University of Michigan-Flint seeks to recruit and retain a diverse workforce as a reflection of our commitment to serve the diverse people of Michigan, to maintain the excellence of the University, and to offer our students richly varied disciplines, perspectives, and ways of knowing and learning for the purpose of becoming global citizens in a connected world.

Background Screening

The University of Michigan conducts background checks on all job candidates upon acceptance of a contingent offer and may use a third party administrator to conduct background checks.  Background checks are performed in compliance with the Fair Credit Reporting Act.

Application Deadline

Temporary job openings are posted for a minimum of three calendar days.  The review and selection process may begin as early as the fourth day after posting. This opening may be removed from posting boards and filled anytime after the minimum posting period has ended.

U-M EEO/AA Statement

The University of Michigan is an equal opportunity/affirmative action employer.
