Qualitative program evaluation methods

J. Mitch Vaterlaus, M.S.
Graduate Extension Assistant
Utah State University

Brian J. Higginbotham, Ph.D.
Associate Professor and Family Life Extension Specialist
Utah State University

Abstract

Evaluation is an important component of refining programs and documenting impacts. Evaluation aids the profession as a whole and assists Extension faculty in meeting promotion requirements. Qualitative methods are commonly used in evaluations in order to explore specific facets of programs and to give voice to participants’ experiences. These methods provide in-depth information that can assist Extension faculty in enhancing the quality of their programs. This review highlights differences between quantitative and qualitative evaluation methods. The elements, processes, and limitations of qualitative evaluation methodology are detailed. In addition, specific guidelines are provided for increasing the trustworthiness of qualitative evaluations.

Keywords

Evaluation, methods, qualitative

Introduction

Extension professionals may not feel they have the time, resources, or expertise to conduct advanced statistical analyses (Higginbotham, Henderson, and Adler-Baeder 2007). There may also be concern that quantitative methodologies will not provide the practical, in-depth information often needed for program improvement. Extension faculty with these concerns should consider the possibilities of qualitative research.

“Qualitative research” is a term that represents a broad family of methods (Bamberger, Rugh, and Mabry 2006; Bogdan and Biklen 2007). It has been defined as the process of “making sense” of data gathered from interviews, on-site observations, documents, etc., and then “responsibly presenting what the data reveal” (Caudle 2004, 417). The major difference between qualitative and quantitative approaches lies in their epistemological foundations (Bamberger et al. 2006). In other words, the approaches differ in what constitutes knowledge, how knowledge is acquired, and how it is used. Ragin (1994, 93) explains, “Most quantitative data techniques are data condensers. They condense data in order to see the big picture. Qualitative methods, by contrast, are best understood as data enhancers. When data are enhanced, it is possible to see key aspects of cases more clearly.”

The underlying assumptions of qualitative methods are closely related to Cooperative Extension’s mission of understanding and meeting people’s needs at the local level (U.S. Department of Agriculture 2010). For Extension administrators and faculty, qualitative program evaluations can enhance understanding of their participants’ experiences (Bamberger et al. 2006). This is done through techniques that give voice to participants and articulate their perspectives (Bogdan and Biklen 2007). Qualitative analyses are often used in large-scale, rigorous, and formal program evaluations. However, they can also be used in the pilot studies, small-budget projects, and ad hoc, quick-turnaround endeavors that many Extension faculty undertake (Bamberger et al. 2006; Caudle 2004). This review highlights the following issues for Extension faculty who may be interested in using qualitative methods in program evaluation:

  • The research question
  • Qualitative data collection
  • Qualitative data analysis
  • Quality in qualitative evaluation
  • Challenges and considerations in qualitative evaluation

The research question

Research questions differ in quantitative and qualitative methodologies (Corbin and Strauss 2008). Qualitative research questions seek understanding of phenomena that are not yet fully understood, whereas quantitative methods are used to test hypotheses. In qualitative research, the research question leads the evaluator into the data, where the issue can be explored. Qualitative research questions are broader than quantitative research questions but should be specific enough to tell the reader what is being investigated. For example: “What do male participants say about their marital relationships after completing a marriage enrichment course?” The question identifies the topic (marital relationships), the point in time (after program completion), and the perspective of interest (men who participated in a marriage enrichment course). With qualitative research, the perspective of interest can be individuals, families, groups, or organizations (Corbin and Strauss 2008).

Qualitative data collection

Once a research question has been formulated, data can be collected from appropriate sources. A particular strength of qualitative research is the variety of data sources that can be used, including face-to-face interviews, phone interviews, focus groups, videos, observation, diaries, or historical documents (Corbin and Strauss 2008). Interviews are commonly used in qualitative program evaluations (Bamberger et al. 2006).

Qualitative interviewing is typically semi-structured: the interviewer has a focus but is also afforded flexibility (Bamberger et al. 2006). In semi-structured interviews the interviewer generally has a list of questions and discussion prompts, but the order in which they are asked can vary from interview to interview. The interviewer may also ask additional questions and probe beyond the questions on the list (Berg 1998). Some things to consider in collecting data through interviews include the following:

  • Confidentiality. Just as in other types of research, participants may expect their answers to be confidential. Depending on the requirements of the researchers’ Human Subjects Review Board, confidentiality may even be required. If data are published, real names of people should be replaced with pseudonyms (Corbin and Strauss 2008).
  • Interviewing, not intervening. Interviews can bring up sensitive topics. Depending on the state, interviewers may have to follow reporting laws (e.g., abuse reports). It can be helpful to have a list of resources on hand during an interview in case a referral is needed (e.g., local therapists, women’s shelters, etc.). However, interviewers should remember they are acting in the role of “researcher” and not as a “therapist” or “detective.”
  • Reciprocal process. Interviewing makes the interviewer an active part of the research process (Corbin and Strauss 2008). An interviewer should be aware of his or her biases, paradigms, and belief systems. The interviewer should not lead participants to desired or preconceived conclusions nor use non-verbal language to reinforce or discourage certain responses (e.g., nodding, rolling eyes, etc.).
  • Recording. Audio recording is often used in interviewing (Creswell 2007). The audio recording can then be transcribed. This allows for the inclusion of direct quotes in final reports, which can support themes and results from the overall study. Appropriate permission must be obtained from the participants in order to record or videotape the interview.
  • Questions. The questions used in the interview should be open-ended questions or conversational prompts (Kaplan and Saccuzzo 2009). For example, “Tell me about your experience participating in this program.” Open-ended questions keep the interaction flowing; closed-ended questions halt the interview. An example of a closed-ended question would be, “Did you like participating in this program?”
  • Cultural competence. The language and culture of the person being interviewed should be taken into consideration (Bamberger et al. 2006). If possible, participants should be interviewed in their own language. Careful attention should always be given to interpretations (Caudle 2004). Teaming with a representative of the culture may assist in making culturally competent translations (Bamberger et al. 2006).
  • Sampling. Purposive sampling is often used in qualitative methodology because the focus is more on understanding than it is on generalizability (Creswell 2007). Quota sampling is one technique that can lessen the effects of sampling bias (Bamberger et al. 2006). For example, five members who attended the entire program and five members who attended only part of the same program could be interviewed. The type of sampling procedure largely depends on the perspective of interest in the research question (e.g., anyone who participated in the program vs. only those who completed the program). This procedure can also be used to gain understanding from different genders, ethnicities, ages, etc.

Qualitative data analysis

Generally, qualitative findings are generated through inductive processes, moving from detailed information to general themes (Bamberger et al. 2006). The most common qualitative analytic technique is thematic analysis. Thematic analysis involves:

  • Viewing the data several times as a whole (e.g., reading and re-reading the transcripts).
  • Identifying patterns and themes (e.g., finding common statements or ideas that appear repeatedly).
  • Reorganizing the data (e.g., coding the data according to the themes identified).

This type of data analysis requires attention to detail while simultaneously considering the data as a whole. Depending on the number and length of interviews, this process can be very time consuming. There are several variations of thematic approaches (Bogdan and Biklen 2007). There are also other analysis techniques that can be used depending on the type of data collected (see Berg 1998; Corbin and Strauss 2008; Creswell 2007).

Quality in qualitative evaluation

The quality of qualitative research rests on how the data are gathered and analyzed (Tracy 2010). “Trustworthiness” is a common term in qualitative research and is closely related to the term “validity” in quantitative research (Marshall and Rossman 2011). This term refers to the credibility, transferability, dependability, and objectivity of the research (Marshall and Rossman 2011; Schwandt 2007). Increasing the trustworthiness of the study increases the likelihood that evaluation results will warrant publication. A few suggestions for increasing trustworthiness include the following:

  • Triangulation. This concept refers to cross-checking the data (Schwandt 2007). Triangulation reduces the potential systematic bias that can occur when only one data source, method, or procedure is used (Maxwell 2009). Triangulation can be done through the use of multiple data sources (e.g., facilitators, participants, and observations), multiple methods of data collection (e.g., individual interviews, focus groups, and diaries), multiple data collectors (e.g., more than one interviewer), multiple data collection points (e.g., the same person interviewed several times over a defined period), multiple theories (e.g., theories from multiple disciplines), and a mixed-methods approach (e.g., collaborating with a quantitative researcher on the evaluation) (Bamberger et al. 2006; Creswell 2007; Tracy 2010).
  • Theory. Theory may emerge from qualitative inquiry, although this is generally not the primary purpose (Bamberger et al. 2006). Qualitative results are not generally used for confirmation of existing theories but can provide additional support for them. Existing theory can be used to guide qualitative research (Malterud 2001). Published qualitative studies often use theoretical frameworks to provide justification for the methodologies that are used (Corbin and Strauss 2008). Theoretical frameworks can also provide explanations and deeper understanding when interpreting the qualitative results.
  • Validation. This is the process, also called ‘member checking,’ of checking with participants on the accuracy of the data and interpretations (Creswell 2007; Tracy 2010). Prior to dissemination, selected representatives of the sample are given opportunities to review copies of the transcripts (with confidentiality protections applied) and the results section (e.g., the themes drawn from the interviews).

Challenges and considerations in qualitative evaluation

Qualitative evaluation does not come without challenges. The beginning qualitative researcher may feel overwhelmed by the time and expertise required to complete qualitative evaluations (Corbin and Strauss 2008). Many of the procedures and terminologies used within qualitative inquiry are very different from those of quantitative research (Malterud 2001).

As with any evaluation, Extension faculty must plan carefully to complete the evaluation in light of their other responsibilities and time constraints. Organization and documentation are particularly important when working with large data sets (e.g., transcripts, recordings, field notes) (Bogdan and Biklen 2007; Caudle 2004). Research procedures should be documented, and accepted best practices should be followed to ensure quality and trustworthiness. Planning the entire process from the outset can also increase the coherence of the design and procedures (Maxwell 2009). The plan should include realistic time frames for conducting interviews, transcribing, coding, and writing.

Participants may feel uncomfortable with the less-structured nature of qualitative interviews. Consideration should be given in the procedures to building rapport and ensuring participants’ confidentiality. Extension faculty should also identify areas of qualitative inquiry they need to read more about, or seek mentorship from a more experienced qualitative researcher.

When data are collected and analyzed, researchers should use caution in discussing implications and generalizing findings. The foundational purposes of qualitative research are different from those of quantitative research. Malterud (2001, 486) explained, “The findings from a qualitative study are not thought of as facts that are applicable to the population at large, but rather as descriptions, notions, or theories applicable within a specified setting.” The sampling technique and rigor of the data collection influence the scope of the generalizability or transferability of the findings. The results from qualitative studies provide rich, in-depth information that can lead to new hypotheses, theory, and directions in programming. Before presenting or submitting an article based on qualitative data, Extension faculty should consider the scope and purpose of the research to make sure the evaluation will make a meaningful impact on the field (Tracy 2010).

Publishing qualitative results is one way to contribute to the progression of Extension. The trustworthiness of the data is critical because academic journals seek to publish rigorous findings. Some academic journals do not publish qualitative research, while others publish qualitative research exclusively (e.g., http://qrj.sagepub.com/). The Forum for Family and Consumer Issues and the Journal of Extension regularly publish articles that use qualitative methods. Lists of journals that are receptive to qualitative methods can be found online (see http://www.slu.edu/organizations/qrc/QRjournals.html). Reviewing qualitative articles from these journals can lead to a greater understanding of qualitative procedures and terminologies.

Conclusion

Extension faculty are generally required to publish articles in order to meet promotion and tenure requirements (Schwab 2003). They are also expected to provide quality research-based programming (U.S. Department of Agriculture 2010). It is possible for Extension faculty to accomplish both of these purposes through the evaluation of their programs. Qualitative evaluation may serve as a less intimidating way to contribute to the professional literature and meet promotion requirements. It does not require advanced knowledge of statistics and can be done at a scale and scope to match each agent’s budget, interests, and needs. Furthermore, steps can be taken to ensure the quality of the results and to enhance the trustworthiness of the process. When done well, qualitative research can provide valuable insights that can be used to improve programs locally while also influencing related programming efforts more broadly (see Higginbotham, Henderson, and Adler-Baeder 2007).

 

References

Bamberger, M., J. Rugh, and L. Mabry. 2006. RealWorld evaluation: Working under budget, time, data, and political constraints. Thousand Oaks, CA: Sage.

Berg, B. 1998. Qualitative research methods for the social sciences. Boston: Allyn and Bacon.

Bogdan, R., and S. Biklen. 2007. Qualitative research for education: An introduction to theories and methods. Boston: Pearson.

Caudle, S.L. 2004. “Qualitative data analysis.” In J.S. Wholey, H.P. Hatry, and K.E. Newcomer (eds.), Handbook of practical program evaluation, 417-438. San Francisco: Jossey-Bass.

Corbin, J., and A. Strauss. 2008. Basics of qualitative research. Thousand Oaks, CA: Sage.

Creswell, J. W. 2007. Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks, CA: Sage.

Higginbotham, B., K. Henderson, and F. Adler-Baeder. 2007. “Using research in marriage and relationship education programming.” Forum for Family and Consumer Issues 12(1). http://ncsu.edu/ffci/publications/2007/v12-n1-2007-spring/higginbotham/fa-4-higginbotham.php

Kaplan, R., and D. Saccuzzo. 2009. Psychological testing: Principles, applications, and issues. Belmont, CA: Wadsworth.

Malterud, K. 2001. “Qualitative research: Standards, challenges, and guidelines.” The Lancet 358(9280): 483-488. doi: 10.1016/S0140-6736(01)05627-6

Marshall, C., and G. Rossman. 2011. Designing qualitative research. Thousand Oaks, CA: Sage.

Maxwell, J.A. 2009. “Designing a qualitative study.” In L. Bickman and D.J. Rog (eds.) Applied Social Research Methods. Thousand Oaks, CA: Sage. 214-253.

Ragin, C. 1994. Constructing social research. Thousand Oaks, CA: Pine Forge Press.

Schwab, C. 2003. “Editor’s Corner: The scholarship of extension and engagement: What does it mean in the promotion and tenure process?” The Forum for Family and Consumer Issues 8(2). http://ncsu.edu/ffci/publications/2003/v8-n2-2003-may/editors-corner.php

Schwandt, T.A. 2007. “Judging interpretations.” New Directions for Evaluation 114:11-25.

Tracy, S.J. 2010. “Qualitative quality: Eight ‘big-tent’ criteria for excellent qualitative research.” Qualitative Inquiry 16:837-851.

United States Department of Agriculture. 2010. Extension. http://www.csrees.usda.gov/qlinks/extension.html

 

 
