Evaluation that Goes Beyond the Data

Volume 6, No. 2, Spring 2001

Karen DeBord and Lynne Borden

Abstract

Program evaluation is a process that documents the successes and efforts of outreach programs by describing their educational or behavioral impacts. Educators who work with youth and family programs are increasingly asked to document program outcomes, and these results have become important to both educators and funders. However, with limited accessible information pertaining to program evaluation, educators are often challenged and frustrated by the need to design and conduct sound evaluations. This article describes two accessible, interactive Web sites designed to support educators in planning program evaluations. The first site assists educators in evaluating parenting education programs in particular; the second assists educators in designing effective evaluations for any program. Together, these Web sites offer educators a self-study opportunity to gain new skills and knowledge to support their program evaluation efforts.

Increasingly, educators working with youth and family programs are asked to document their successes, describe educational or behavioral impacts, and be accountable for their program efforts. Determining how to appropriately evaluate programming efforts can be challenging and frustrating. For some, evaluation is a negative word.

Decker and Yerka (1990) indicate that evaluation is an integral part of the program development process and can inform planning decisions, implementation decisions, and decisions about continuing, modifying, or terminating a program. However, research has shown that evaluation plans and instruments that are not appropriate for their audience often yield unusable results (Cronbach 1982). Many educators are not trained as evaluators and need assistance in designing program evaluations. Radhakrishna and Martin (1999), who assessed Extension training needs in South Carolina, identified training in program evaluation, program accountability, and research methods as a primary need.

Often, educators do not design evaluation plans for their programs. Without such planning, they are left with a post-program evaluation model, and the data they collect are frequently not useful for making future programming decisions or describing program success. With unusable data and no plan for disseminating the information, results often remain on a shelf or in a file cabinet.

Evaluation is more than collecting answers or completing required reporting forms. It is a process that helps educators know whether they were effective and allows them to redesign programs based on what was learned.

How can active educators become effective evaluators?

With the growth of technology for rapid dissemination of information and Web access to a wide variety of educational materials, self-paced, readily accessible learning experiences for educators are evolving (Elliott 1999). The use of e-mail, listservs, Web pages, and on-line courses is increasing. Combining the need to learn about evaluation with this rise in educational technology, two interactive evaluation Web sites were designed to support educators who find it difficult to act as evaluators and to use the data they collect to present educational successes to others, particularly funding sources and key decision makers.

Two interactive learning sites

The first site was conceptualized during a two-day retreat funded by the National Network for Family Resiliency (NNFR) and involving five Extension specialists from Washington, Alabama, Colorado, Kentucky, and North Carolina. The original goal of the retreat was to design a method for evaluating parenting programs nationwide. What resulted, however, was not a “silver bullet” that would capture all work in parenting education, but a conceptual paper outlining the steps necessary for a complete educational program.

The group agreed that the Web site should teach Extension educators about the process of developing and evaluating programs, and it discussed the need to make the information interactive rather than leave it a “flat” medium that might not be used. The paper was placed on the NNFR Web site in HTML format. A few months later, it guided a computer programmer and a member of the team in designing, reviewing, and piloting the interactive Web site. The site uses parenting education as its subject matter, taking educators through a ten-step interactive workbook for designing an educational program, from conducting a needs assessment through reporting results.

This site can be found at http://www.ces.ncsu.edu/depts/fcs/nnfr/. The steps in the plan include

  • exploring your personal perspectives about various parent audiences
  • examining methods of needs assessments
  • defining the presenting problem or educational issue
  • identifying stakeholders concerned with this issue
  • writing overall goals
  • writing educational objectives
  • understanding and selecting an evaluation method
  • selecting a teaching strategy or intervention
  • writing a success story or a report for stakeholders and others
  • learning from the program and redesigning the program for new audiences

During the piloting phase, users commented

  • “This is a great site for planning parenting programs.”
  • “There is so much good information and ideas in the Planning Evaluation section of the workbook.”
  • “This site is a good self-study for educators just learning how to plan programs.”

The original team agreed that the time invested in conceptualizing and outlining the steps in the final paper was key to transferring the components into a useful site. The difficult part was devising interactive ways to involve learners and draw them through the steps in the site; this was accomplished with quizzes, fill-in-the-blank exercises, and choice lists. Users take approximately 30 to 45 minutes to complete the steps in the interactive workbook. Once finished, they can print their unique plan for a parenting education program. Although the site is written with parenting education as its point of reference, others have found it useful for understanding the steps to consider in program design.
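For readers curious about the mechanics, a workbook of this kind can be represented quite simply. The sketch below, in TypeScript, is a hypothetical illustration of the step-collect-print flow, not the site's actual implementation; the step titles paraphrase the list above, and all identifiers are invented for this example.

```typescript
// Hypothetical sketch of the ten-step workbook flow: each step collects a
// response, and the completed plan can be printed at the end.
interface Step {
  title: string;
  response?: string; // filled in as the user works through the workbook
}

const workbook: Step[] = [
  { title: "Personal perspectives about parent audiences" },
  { title: "Needs assessment methods" },
  { title: "Presenting problem or educational issue" },
  { title: "Stakeholders" },
  { title: "Overall goals" },
  { title: "Educational objectives" },
  { title: "Evaluation method" },
  { title: "Teaching strategy or intervention" },
  { title: "Success story or report" },
  { title: "Redesign for new audiences" },
];

// Record a user's entry for one step of the workbook.
function complete(step: Step, response: string): void {
  step.response = response;
}

// Assemble the unique plan a user can print once the steps are done.
function printPlan(steps: Step[]): string {
  return steps
    .map((s, i) => `${i + 1}. ${s.title}: ${s.response ?? "(not yet completed)"}`)
    .join("\n");
}

complete(workbook[4], "Parents will use positive guidance techniques.");
console.log(printPlan(workbook));
```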

Building on the success of this interactive educational site, a second site was developed to move the educator/evaluator from conducting evaluations to understanding what to do once evaluation data are collected. This second interactive Web site was built through a collaboration between the Extension National Network for Child Care (http://www.nncc.org) and the Extension National Network for Collaboration (http://crs.uvm.edu/nnco/collab/).

Found at http://www.ces.ncsu.edu/depts/fcs/beyonddata, the Beyond Data site provides a wealth of information for educators. It was conceptualized by one Extension evaluator from each network along with a computer programmer. During a day-and-a-half planning session, the site took shape through flow charts and drawings on easel paper. Original ideas evolved into educational concepts for a combined reference and teaching site with a quiz-like interactive section, so users could challenge themselves to move from simple evaluation results to implications for future educational efforts.

This interactive learning site takes users BEYOND the DATA, helping them design effective evaluations and recognize how to present data in a meaningful way. There are three sections: an information data bank, sample program data, and presentations.

The information data bank contains definitions, examples, and the advantages and disadvantages of eight data collection methods: case studies, use of existing information such as census data, interviews (personal and group), surveys (written and telephone), observation, mass media, public hearings, and public forums.

The second section uses the same eight data collection methods as the first; here, however, sample data are provided, and users work through a series of questions that ask them to interpret the data and determine what generalizations can be made or implications drawn. Hypothetical examples are presented, each followed by an interactive true/false quiz to learn whether a stated implication is on target; a pop-up help screen supplies the correct responses.
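To make this interaction concrete, the sketch below models a single true/false quiz item in TypeScript. The article does not describe the site's actual code, so the field names, the example implication, and the feedback wording are all illustrative assumptions.

```typescript
// Hypothetical model of one quiz item; all names and text are illustrative.
interface QuizItem {
  implication: string; // an implication a user might draw from the sample data
  onTarget: boolean;   // whether the data actually support that implication
  help: string;        // explanation shown in the pop-up help screen
}

const item: QuizItem = {
  implication:
    "High satisfaction ratings on the written survey show that the program changed parenting behavior.",
  onTarget: false,
  help:
    "Satisfaction ratings alone do not demonstrate behavior change; follow-up observation or interviews would be needed.",
};

// Compare the user's true/false answer with the item and return the feedback
// that a pop-up help screen might display.
function answer(item: QuizItem, userSaysOnTarget: boolean): string {
  const verdict = userSaysOnTarget === item.onTarget ? "Correct." : "Not quite.";
  return `${verdict} ${item.help}`;
}

console.log(answer(item, false)); // the stated implication is not on target
```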

The third section presents multiple reporting methods one can use to share findings in written and oral form, each accompanied by tips and sample graphics. The reporting methods described include abstracts and briefings, annual reports, brochures, exhibits, fact sheets, news releases, newsletters, posters, slides, and overheads. The types of data graphics include tables, line graphs, bar graphs, pie charts, and diagrams. There are also links to power words to use in writing, on-line dictionaries, and style manuals.

Educators from around the nation who have piloted the site have found it extremely useful. A WebTrends statistical report recorded more than 6,500 hits to the site between January 2000 and May 2000. Many users have remarked that the site is especially helpful for new Extension educators and other community educators as they design programs. Their comments include

  • “I wish I had known about this site before!”
  • “This is a nice overview, probably helpful to someone starting from scratch.”
  • “It presents needed information in a very usable format. I liked the progression from info to sample data to presentations.”
  • “I think that all of the suggestions in the site are very helpful and would be useable at the (Extension) county level.”

The planners of this site included the computer programmer from the beginning, which proved beneficial to both the planners and the programmer. The planners saved time because they knew up front what the technology made possible. The programmer not only learned the content from the beginning but shared the vision and was able to begin work on the site as the designers generated text files. Submitting text files and discussing the interactivity desired for each step became the working protocol for completing the site.

Summary

Both sites have been used in evaluation in-service training and by college professors for student homework and activities. Additionally, they serve as a foundation for orienting new Extension faculty and staff to the design and evaluation of programs. Users are consistently amazed that evaluation can be learned in such a user-friendly way, in the comfort of their own office or at their home computer. As technology advances, these sites will need to be updated; however, the information is sound and will be useful to community educators for years to come.

Designing interactive Web-based learning sites takes time in the conceptual stages, but with a talented computer programmer, the content can be accessible on-line soon afterward. It is critical to have reviewers and users read through the site and test the interactive buttons and paths before it is made public. The era of technology is dictating what learners need and the environments in which they will access information. For educators willing to use Web-based learning, interactive sites offer useful information for self-study, provide an on-line orientation for community educators, and serve as an easy reference for return users.

References

Cronbach, L. 1982. Designing evaluations of educational and social programs. San Francisco: Jossey-Bass.

Decker, D. J., and B. L. Yerka. 1990. Organizational philosophy for program evaluation. Journal of Extension 28(2). Available at http://www.joe.org/joe/1990summer/f1.html

Elliott, M. 1999. Classifying family life education on the World Wide Web. Family Relations 48:7-13.

Radhakrishna, R., and M. Martin. 1999. Program evaluation and accountability training needs of Extension agents. Journal of Extension 37(3).

Authors

Karen DeBord, Ph.D., CFLE, State Specialist, Child Development, NC State University, karen_debord@ncsu.edu; and
Lynne Borden, Ph.D., State Leader for 4-H Youth Development, Children, Youth and Family Programs, Michigan State University Extension, bordenl@msu.edu.

Cite this article:

DeBord, Karen, and Lynne Borden. 2001. Evaluation that goes beyond the data. The Forum for Family and Consumer Issues 6(2).
