Online Webinars as Tools for Building Extension Evaluation Capacity


Benjamin Silliman
Department of 4-H Youth Development
and Family and Consumer Sciences
North Carolina State University

Abstract

Extension professionals who engaged in monthly interactive webinars on program evaluation skills reported gains in knowledge and skills and intentions to apply knowledge, but offered mixed reviews of online delivery. Participant comments affirmed previous research on professionals' needs for evaluation training, on preferences for online learning, and on the technical and collegial qualities of online learning communities. The webinar series revealed both the limitations of and opportunities for web-based delivery of professional skills and networking. Implications for practice, research, and policy are discussed.

Keywords: Web-based learning, Online learning, Community of practice, Evaluation capacity-building

Online Webinars as Tools for Building Extension Evaluation Capacity

Program evaluation skills are increasingly important to outreach professionals for accountability and program improvement. Many community- and campus-based professionals possess limited skills, resources, or time to master the wide-ranging skill-set for evaluating programs (Jayaratne, 2009; Rennekamp and Engle, 2008). To support evaluation capacity-building among employees and partners, the eXtension online learning network of the Cooperative Extension System (eXtension, 2009) developed an Evaluation Community of Practice (E-CoP) (Evaluation Community of Practice, 2011). A priority project of the E-CoP was a series of monthly webinars on evaluation topics designed to increase awareness and knowledge of evaluation and to promote dialogue and resource exchange. This paper describes the first year of the webinar pilot, discusses participant feedback on content, delivery, and applications, and reviews implications for practice, research, and policy.

Evaluation Capacity Building

As program quality and accountability become more important for Cooperative Extension and other publicly-funded organizations (Taylor-Powell and Boyd, 2008; Rennekamp and Engle, 2008), abilities to plan, evaluate, and report results become foundational competencies for staff at all levels (Scheer, Ferrari, Earnest, and Connors, 2006; Stone and Coppernoll, 2004). Extension specialists and agents (Guion, Boyd, and Rennekamp, 2007; National Association of Extension 4-H Agents, 2006; Radhakrishna, 2001; Radhakrishna and Martin, 1999) indicate that skills in needs assessment and program evaluation are high priorities for training. Two capacity-building initiatives found that most Extension staff initially identified themselves as "novices" or "beginners" in evaluation (Arnold, 2006; Douglah, Boyd, and Gundermann, 2003). Stand-alone workshops are typically inadequate to meet these needs (Kelsey, 2008), but extended mentoring (Arnold, 2006) or work in teams (Taylor-Powell and Boyd, 2008) can be effective in building skills. Unfortunately, few state or county Extension systems possess the human capital to sustain such "high touch" approaches, and evaluation skills represent only one among many staff needs (Conklin, Hook, Kelbaugh, and Nieto, 2002). In fact, many state evaluators must rely primarily on self-directed learning to upgrade their own skills (Guion, Boyd, and Rennekamp, 2007). A few programs such as Children, Youth, and Families at Risk (CYFAR, 2011), Successful Assessment Methods and Measurement in Evaluation (SAMMIE) (Archer, Bruns, and Heaney, 2007), and American Evaluation Association brief webinars (AEA, 2011) offer more extensive training in evaluation, targeted primarily to program collaborators or professional evaluators. The University of Wisconsin Cooperative Extension (2011) offers an array of program development and evaluation resources in a self-directed format. Although these resources are accessible, research-based, and high quality, they offer relatively few live, interactive learning opportunities. Moreover, state Extension systems face increasing needs for evaluation capacity building with declining resources to support it (Taylor-Powell and Boyd, 2008).

Internet Training for Professional Development

The internet is the fastest-growing source for formal and informal education. Internet-based professional development enables practitioners to access a wide range of experts and colleagues conveniently and inexpensively via educational courses and webinars, resource sites, and communities of practice (Allen and Seaman, 2007). Web-based communities of practice enable professionals to share expertise and experiences and work toward timely solutions to shared challenges (Sobrero, 2008). Consistent with this trend, Extension professionals prefer web-based training over other venues, perhaps because it is convenient and requires relatively little time commitment or cost (Conklin et al., 2002; Senyurekli, Dworkin, and Dickinson, 2006). The eXtension online network has spawned some 60 communities of practice on diverse topics (eXtension, 2011).

Research about online education has focused primarily on formal education but is beginning to address non-formal training. Online training using the Elluminate Live! (E!) platform was rated positively by tutors (Ng, 2007) and nursing students (Battin-Little, Passmore, and Schullo, 2006). Online professional education was effective in teaching crisis response skills to rural mental health workers (Hills, Robinson, Kelly, and Heathcote, 2010) and improving knowledge in environmental education (Dillard, 2006; Kreis and Wilke, 2007). Addressing faculty development, Brooks (2010) recommended a blend of online and face-to-face training to accommodate constraints (e.g., time, fiscal support, accessibility) and opportunities (e.g., information, skill-building, collegial learning and support). Sellers et al. (2009) recommended that communities of practice (CoPs) use internet websites and webinars for networking and for processing or application of content knowledge rather than for basic instruction, which can be addressed more effectively by structured tools such as Moodle.

Qualities of Effective Online Learning

Research on best practices for academic online learning found Elluminate Live! most effective when interactive elements such as dialogue, organizational flexibility, and learner autonomy were balanced (McBrien, Jones, and Cheng, 2009) and when students were trained to maximize use of technology tools. Shi and Morrow (2006) found that many e-conferencing tools on undergraduate distance education platforms (e.g., presentation forums, polling, text and audio interfaces, web browsers, application sharing) support critical elements of student learning, including active learning, mutual engagement, and accommodation of diverse learning styles, but that current competencies of instructors and students do not reflect optimal use of these applications. LaPointe and Reisetter (2008) found that students' views of peer networks in online courses were influenced by values, learning styles, lifestyles, and subject matter mastery as well as technical competence.

These observations seem equally valid for professional collaboration and development (Brooks, 2010), especially for new professionals "learning the ropes" and implementing new practices (Schlager, Fusco, and Schank, 2002).

Research on communities of practice (CoP) (Wenger, McDermott, and Snyder, 2002) identified several factors conducive to initiating or sustaining professional collaboration: 1) open-ended design, allowing for iterative growth; 2) opportunities for dialogue within and beyond the community; 3) flexibility in level and type of participation; 4) public and private community spaces; 5) focus on the value of the community; 6) a balance of predictable routine and novelty; and 7) a regular rhythm for community activities. Sobrero (2008) added: 1) strong leadership from a selfless facilitator; 2) accessible, reliable, flexible technology for interactive and archival applications; 3) well-focused, mutually-beneficial goals; 4) passion for engagement at both technical and social levels to sustain online networks and off-line projects and mentoring; and 5) organic (vs. prescribed) growth in numbers and diversity of audience and applications. Gannon-Leary and Fontainha (2007) emphasized the roles of trust and expertise with multi-cultural or multi-discipline groups.

Evaluation Community of Practice Webinar

The eXtension Evaluation Community of Practice (E-CoP) webinar was initiated early in 2009 to increase interest and capacity in program evaluation among Cooperative Extension professionals and their community partners. Webinars were designed to present and illustrate concepts and practices, engage participants in dialogue and question-and-answer, and foster self-directed learning through a related website. E-CoP members recommended topics, facilitated the monthly two-hour sessions, and recruited participants from their own professional networks, supported by state and national leaders across all Extension disciplines who encouraged participation by their staff. The sponsoring institution provided the Elluminate Live! (E!) platform, which featured PowerPoint delivery; access to documents, video, and websites; interactive audio and chat; polling; and recording applications.

Although the webinar design was informed by research on evaluation, online learning, and communities of practice, its content and format were necessarily experimental and its development iterative. For instance, initial sessions were more highly structured, with more emphasis on presentation and engagement via polls and planned responses. As regular participants became accustomed to the technology and interactive format, presentations were punctuated with more flexible and extended give-and-take. To facilitate webinar quality, presenters were oriented and rehearsed beforehand. The web portal was opened 15 minutes before each presentation, and participants were oriented to the E! platform before and during the webcast.

Webinar schedules included a 5-minute orientation, a 45-minute team presentation, 15 minutes of participant questions, 20 minutes of expert commentary, 15 minutes for discussion, 15 minutes for evaluation, and 5 minutes of announcements, filling the full two-hour (120-minute) session. Topics included Creating a Culture of Evaluation, Effective Communities of Practice, Applying Theory to Practice, Evaluating Science and Technology, Participatory Evaluation, Distinguishing Evaluation and Research, Evaluation Across the Program Cycle, Reporting Outcomes, and Using Photolanguage Methods. Webcast archives and additional resources were posted on a related Wikispaces web site (http://nc4-heval.wikispaces.com), enabling asynchronous and repeated use. This site typically received 30-40 "hits" in the 1-2 days following a webinar and 5-10 visits on other days.

Methods

Over the first ten sessions, the webinar engaged an average of 40 participants per session from over 30 states across 5 time zones. Participation ranged from 23 to 62 at peak (the highest number online at any one time); cumulative attendance typically ran 10-20 percent above peak, and 50-60 percent of participants stayed for the full duration. Participants did not have to register and thus were distinguished only by self-identification via text box. Approximately 50 percent of the audience participated in multiple webcasts, with 25-30 percent attending most sessions. Participants were primarily Cooperative Extension faculty, with university-based specialists generally outnumbering community-based practitioners. Approximately 70 percent were female. By years of experience, participants with 0-5, 6-10, 11-20, and more than 20 years were about equally represented. Sixty to seventy percent of participants held a graduate degree. The largest contingent by discipline served 4-H Youth Development, followed by Family and Consumer Sciences and Agriculture.

Each webinar concluded with a brief evaluation in which participants were asked about their interest in the webinar topic and in the community of practice and were asked to rate the relevance, delivery, and overall quality of the webinar. Data were analyzed using Microsoft Excel, with an emphasis on descriptive statistics given the limited sample and pilot status of the study. Qualitative comments on learning, planned applications, and webinar improvements were organized on an Excel spreadsheet and prioritized by frequency of occurrence.
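
Although the analysis was done in Excel, the same two steps translate directly to a short script. The following Python sketch is a hypothetical illustration, not the study's actual workbook: the file name and column names (session_evaluations.csv, interest_topic, comment_theme, and so on) are assumptions for demonstration only.

import csv
from collections import Counter

RATING_ITEMS = ["interest_topic", "interest_cop", "relevance", "delivery", "overall"]
FAVORABLE = {"high", "very high"}  # rating levels counted as favorable, per Table 1

def summarize(path):
    # Load one session's evaluation responses from a (hypothetical) CSV export
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return
    n = len(rows)
    # Descriptive statistics: percent rating each item high or very high
    for item in RATING_ITEMS:
        favorable = sum(1 for r in rows if (r.get(item) or "").strip().lower() in FAVORABLE)
        print(f"{item}: {100 * favorable / n:.0f}% high/very high (n={n})")
    # Qualitative comments, pre-coded into themes, prioritized by frequency of occurrence
    themes = Counter(r["comment_theme"].strip().lower()
                     for r in rows if r.get("comment_theme"))
    for theme, count in themes.most_common():
        print(f"{theme}: {count}")

summarize("session_evaluations.csv")  # hypothetical file name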

Results

Presentation teams from multiple sites in nine different states had little difficulty moderating sessions on the Elluminate platform. Those who used a practice opportunity affirmed its value for minimizing technology problems and maximizing use of applications. Participation ranged from 23 to 62 persons, averaging about 40. Most participants had relatively little difficulty connecting through a live web link, and few experienced difficulty navigating the site. Most participated for 60 to 90 minutes, with increasing numbers sharing in polling, text comments, and session evaluations with each session. Fewer participants used the microphone, perhaps for technical (e.g., no headset, poor audio), personal (e.g., self-consciousness, a passive orientation), or professional reasons (e.g., feeling their expertise was insufficient to ask questions). Many preferred to be observers, at least during their first session.

Webinar Evaluation

Each webinar concluded with a brief evaluation, at which point 60 percent of peak participation, on average, was still engaged. The share of participants completing the end-of-session evaluation ranged from 28 percent to 71 percent, exceeding 55 percent in 8 of 10 sessions. Responses thus may over-represent the most interested or least busy members. Trends for each feedback category for each webinar are presented in Table 1. A summary of open-ended comments follows.

Table 1: End-of-webinar participant feedback, by percentage of respondents reporting high or very high levels of interest in the topic, interest in participating in the community of practice, webinar relevance, quality of delivery, and overall quality, for each session in the E-CoP webinar series.

Session Topic | Eval N/Total N | Interest in Topic | Interest in CoP | Relevance of Webinar | Quality of Delivery | Overall Quality
Importance of Evaluation | 15/53 | 93 | 80 | 87 | 73 | 80
Methods of Evaluating | 37/56 | 73 | 73 | 65 | 86 | 88
Communities of Practice | 16/36 | 75 | 76 | 90 | 32 | 56
Theory to Practice | 19/31 | 94 | 79 | 87 | 27 | 62
Evaluating Science Learning | 14/23 | 91 | 95 | 91 | 27 | 70
Participatory Evaluation | 23/49 | 79 | 82 | 58 | 72 | 75
Research and Evaluation | 39/62 | 77 | 45 | 82 | 85 | 89
Systems Evaluation | 20/30 | 74 | 66 | 84 | 58 | 85
Communicating Results | 30/45 | 84 | 52 | 93 | 59 | 81
Photolanguage | 31/45 | 67 | 63 | 68 | 42 | 56

Table 1 indicates consistently high levels of interest in topics and in the community of practice, as well as positive views of the webinar's relevance and overall quality. Delivery quality was generally rated high, although ratings tended lower in sessions marked by technical difficulties such as limited bandwidth, slow transmission, or poor audio clarity.

Open-ended Responses

Open-ended questions addressed knowledge and skill gains, plans to use information, and suggestions for improvements in the webcast. Participants offered the fewest comments on knowledge gain, although those comments were generally consistent with session objectives. Many emphasized increased understanding of how and why the topic or skill was important. Participants expressed appreciation for guidance with webinar technology and resources and interest in learning more about the webinar topic. Plans to use information included emulating recommended practices (e.g., building a community of practice, trying a tool or method), studying the topic, applying skills to a new audience, and sharing information with colleagues, partners, administrators, decision-makers, and other stakeholders. Suggestions for improvement of the webcast addressed both content topics (e.g., specific methods and models, examples from all Extension disciplines) and process factors (e.g., better preparation of speakers, help with managing technical tools). A few participants commented that the two-hour time frame was too long, although no one suggested reducing or eliminating any component of the program. Frequently participants offered "no suggestions" or "the system works great!" Several E-CoP member participants noted that the quality of content and delivery effectiveness improved with each session.

Discussion

This evaluation sought to determine whether and how a monthly online webinar, sponsored by a community of practice, might increase awareness and skills in program evaluation. Steady participation averaging 40 per two-hour session, including many "regulars," indicates interest in evaluation and online delivery consistent with earlier research (e.g., Conklin et al., 2002; Radhakrishna, 2001). The webinar engaged many more than the 8-10 E-CoP leaders, but could reach a much larger potential audience of several thousand Extension staff and partners. As the audience expanded beyond E-CoP leaders and their local colleagues, interest in joining the E-CoP long-term declined slightly. Through the year, a growing minority of participants used audio or text chat features and shared in end-of-session evaluation, illustrating a gradual accommodation to the technology, to evaluation topics, and to the culture of an interactive learning community.

For all topics, interest, relevance, and quality were consistently rated high. Surprisingly, the lowest ratings for relevance occurred for webinars on methods, which was among the most requested topics in open-ended comments. This discrepancy may indicate that participants did not plan to use the evaluation methods presented immediately, were interested in other methods, or required more detailed training to implement those methods. Given audience diversity, each of these explanations may be viable depending on participant need. Participants particularly appreciated practical examples and discussion opportunities that accompanied quality instruction. The most common open-ended comments about aspirations indicated intentions to learn more, try webinar ideas, or share information with peers, suggesting that additional skill-building would be welcomed. These findings support the assumption that monthly webinars provide a forum for building awareness and for networking but cannot provide stand-alone skill training. Overall, consistently high ratings on relevance and quality, together with aspirations for application and continued learning, provide evidence of immediate and potential long-term benefit.

Evaluation results also provide insight on online delivery of professional development. Ratings on delivery may reflect the developmental and circumstantial elements of the online webinar. Anecdotal observations suggested that regular participants became more comfortable with both topics and technology as the series proceeded. Criteria for evaluating delivery were not specified, but prior research on CoPs suggests participants value a clear, varied, and coherent style as well as compatible and manageable technology (Sobrero, 2008). In the author's experience, webinar speakers who prepared most carefully tended to enjoy higher ratings, even when presentations originated from multiple sites and used multiple technology tools (as was the case for the Culture of Evaluation, Methods, Participatory Evaluation, and Research sessions). Nevertheless, unanticipated breakdowns in audio links or transmission of web links can increase frustration and result in lower ratings on delivery, as was apparently the case for the STEM, Systems, and Communicating Results sessions. Delivery appeared to be the most sensitive item in survey feedback and was most frequently cited as an area for improvement in open-ended comments, perhaps because most participants were inexperienced with, and anxious about, technology problems. Yet even where ratings on delivery were lower, ratings on relevance and quality remained fairly high, suggesting that participants do not generalize from technical difficulties.

Participant comments from later sessions tended to reflect higher expectations for capable and creative delivery and greater disappointment when expectations were not met. Thus presenters face a dilemma in that poor or problematic delivery may result in loss of audience before they can recover or improve, yet improvements in delivery tend to raise expectations for performance. In this situation, the support and coaching of a committed community of practice can help sustain participation and improve webinar delivery.

Limitations

This report describes a pilot project involving several innovative technologies for which there are few established protocols for programming or evaluation. A brief end-of-session survey captured only immediate and general impressions, although open-ended items generated more extensive feedback. Observations from E-CoP leaders and participants provided insight on "what works," but the open-ended nature of comments may have missed some key points about webinar content, delivery, or application. Audience attrition and non-response resulted in response rates ranging from roughly 30 to 70 percent; respondents may have differed from non-respondents in level of commitment or confidence, or simply by having fewer schedule conflicts. Although participants were a self-selected group of Extension professionals interested in program evaluation, their preferences and perceptions seem consistent with earlier research on professional development and online learning. Webinar assessment and programming may not rival academic or professional development training courses, but they do provide a low-cost alternative that links practitioners with peers and resident experts.

Recommendations for Practice, Research, and Policy

The E-CoP webinar "experiment" yielded many useful insights that might be applied to practice, research, and policy. Perhaps most elementary is the recognition that innovation, as well as best practice and program fidelity, is critical to advancing programming. Practices related to webinar success include 1) a consistent schedule of announcements and meetings; 2) speakers who are well-prepared on the technology as well as the topic; 3) expert instruction linked to practical examples; 4) engagement via interactive polls, question-and-answer, discussion, and website archives; and 5) a facilitator and technology troubleshooter who helps presenters and participants master technology applications. The once-a-month format seems ideal to introduce topics and methods and facilitate networking, but more intensive training requires hands-on mentoring and practice between sessions. A two-hour format is likely too long, but shorter sessions might create scheduling difficulties. Diverse audiences can complement as well as inhibit learning. A community of practice leadership team is critical to recruitment, training, online support, and improvement of webinars. Archiving webinars increases access to knowledge but cannot overcome the primary barrier to participation: available time.

More research is needed to understand how webinars foster awareness, knowledge, skill-building, and behavioral application across diverse audiences and settings, both alone and in combination with face-to-face mentoring or project learning. Research should also examine how technology can be used to foster active learning and productive networking.

The initial success of the webinar was due to enabling policies of the eXtension network and state-level partners. Where state organizations face shrinking resources, strategic and collaborative investments in human and infrastructure resources can facilitate a level of capacity-building and impact not possible for any single unit. To achieve such impacts, organizations must be forward-thinking, collaborative, and nimble, viewing new technology as a quality enhancement rather than as a cost-effective alternative to human systems.

References

Allen, I.E., and J. Seaman. 2007. Online nation: Five years of growth in online learning. Needham, MA: Sloan Consortium. Accessed April 22, 2011. http://sloanconsortium.org/publications/survey/pdf/online_nation.pdf

Arnold, M. E. 2006. Developing evaluation capacity in Extension 4-H field faculty: A framework for success. American Journal of Evaluation 27: 257-269.

Arnold, M.E., M. Calvert, M.E. Cater, W. Evans, S. LeMenestrel, B. Silliman, and J.S. Walahoski. 2008. Evaluating for Impact: Educational Content for Professional Development. Washington, DC: National 4-H Learning Priorities Project, Cooperative State Research, Education, and Extension Service, USDA. http://www.national4-hheadquarters.gov/library/Indicators_4H_MM.pdf

Archer, T.M., K. Bruns, and C.A. Heaney. 2007. SAMMIE: Using technology for a one-stop program evaluation resource. Journal of Extension 45(5): Article 5TOT1. http://www.joe.org/joe/2007october/tt1.php

Battin-Little, B., D. Passmore, and S. Schullo. 2006. Using synchronous software in web-based nursing courses. CIN: Computers, Informatics, Nursing 24(6): 317-325.

Brooks, C.F. 2010. Toward ‘hybridised’ faculty development for the twenty-first century: Blending online communities of practice and face-to-face meetings in instructional and professional support programmes. Innovations in Education and Teaching International 47(3): 261-270.

Conklin, N.L., L.L. Hook, B.J. Kelbaugh, and R.D. Nieto. 2002. Examining a professional development system: A comprehensive needs assessment approach. Journal of Extension 40(5): Article 5FEA1. http://www.joe.org/joe/2010august/a3.php

Dillard, J. 2006. Environmental educators learn applied evaluation skills online. Paper presented at the annual meeting of the North American Association for Environmental Education, St. Paul, MN. Accessed April 13, 2011. http://www.allacademic.com/meta/p124592_index.html

Douglah, M., H. Boyd, and D. Gundermann. 2003. Nurturing the development of an evaluation culture in public educational agencies. Paper presented at the annual conference of the American Evaluation Association, Reno, NV.

eXtension. 2011. Communities and institutions. http://www.extension.org/people/communities

Gannon-Leary, P.M., and E. Fontainha. 2007. Communities of practice and virtual learning communities: Benefits, barriers, and success factors. eLearning Papers 5. http://www.elearningpapers.eu/en/article/Communities-of-Practice-and-virtual-learning-communities%3A-benefits,-barriers-and-success-factors

Guion, L., H.H. Boyd, and R.A. Rennekamp. 2007. An exploratory profile of Extension evaluation professionals. Journal of Extension 45(4): Article 4FEA5. http://www.joe.org/joe/2007august/a5.shtml

Hills, D.J., T. Robinson, B. Kelly, and S. Heathcote. 2010. Outcomes from the trial implementation of a multidisciplinary online learning program in rural mental health emergency care. Education for Health: Change in Learning and Practice 23(1): 1-12.

Jayaratne, K. 2009. Faculty perceptions about outreach scholarship: Strategies for fostering outreach scholarship. Presentation at the Tenth Annual Outreach Scholarship Conference, Athens, GA, September 29, 2009.

Kelsey, K. 2008. Do workshops work for building evaluation capacity for Cooperative Extension faculty? Journal of Extension 46(6): Article 6RIB4. Accessed April 3, 2011. http://www.joe.org/joe/2008december/rb4p.shtml

Kreis, R., and R. Wilke. 2007. EE program impacts from online EE program evaluation training. Paper presented at the annual meeting of the North American Association for Environmental Education, Virginia Beach, VA. Accessed July 25, 2010. http://convention2.allacademic.com/one/www/www/index.php?cmd=www_search&offset=0&limit=5&multi_search_search_mode=publication&multi_search_publication_fulltext_mod=fulltext&textfield_submit=true&search_module=multi_search&search=Search&search_field=titleh

LaPointe, L., and M. Reisetter. 2008. Belonging online: Students' perceptions of the value and efficacy of an online learning community. International Journal on E-Learning 7(4): 641-665.

McBrien, J.L., P. Jones, and R. Cheng. 2009. Virtual spaces: Employing a synchronous online classroom to facilitate student engagement in online learning. The International Review of Research in Open and Distance Learning 10(3). Accessed April 16, 2011. http://www.irrodl.org/index.php/irrodl/article/view/605/1298

National Association of Extension 4-H Agents. 2006. NAE4-HA membership survey results. Public Relations and Information and Research, Evaluation and Programs Committees. United States Department of Agriculture. (For a copy of the report, contact Dr. Susan LeMenestrel, National Program Leader, Youth Development Research, email: slemenestrel@csrees.usda.gov.)

National Institute for Food and Agriculture (NIFA). 2009. State and national partners. http://www.csrees.usda.gov/qlinks/partners/state_partners.html

Ng, K.C. 2007. Replacing face-to-face tutorials by synchronous online technologies: Challenges and pedagogical implications. International Review of Research in Open and Distance Learning 8(1). http://eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/70.pdf

Radhakrishna, R.B. 2001. Professional development needs of state extension specialists. Journal of Extension 39(5): Article 5RIB4. http://www.joe.org/joe/2010february/a2.php

Radhakrishna, R.B., and M. Martin. 1999. Program evaluation and accountability needs of extension agents. Journal of Extension 37(3): Article 3RIB1. http://www.joe.org/joe/1999june/rb1.php

Rennekamp, R.A., and M. Engle. 2008. A case study in organizational change: Evaluation in Cooperative Extension. In Program evaluation in complex organizational systems: Lessons from Cooperative Extension, New Directions for Evaluation. Edited by M. Braverman, M. Engle, M. Arnold, and R.A. Rennekamp. New York: Wiley, 15-26.

Scheer, S.D., T.M. Ferrari, G.W. Earnest, and J.J. Connors. 2006. Preparing extension educators: The Ohio State University's model of extension education. Journal of Extension 44(4): Article 4FEA1. http://www.joe.org/joe/2006august/a1p.shtml

Schlager, M.S., J. Fusco, and P. Schank. 2002. Evolution of an online education community of practice. In Building virtual communities: Learning and change in cyberspace. Edited by K.A. Renninger and W. Shumar. New York: Cambridge University Press, 129-158.

Sellers, D.M., A.B. Crocker, A. Nichols, S.A. Kirby, and M. Brintnall-Peterson. 2009. Creating the extension family caregiving community of practice. Journal of Extension 47(5): Article 5FEA1. http://www.joe.org/joe/2009october/index.php

Senyurekli, A.R., J. Dworkin, and J. Dickinson. 2006. On-line professional development for Extension educators. Journal of Extension 44(3): Article 3RIB1. http://www.joe.org/joe/2006june/rb1.php

Shi, S., and B.V. Morrow. 2006. E-conferencing for instruction: What works? Educause Quarterly 4: 42-49.

Sobrero, P.M. 2008. Essential components for successful virtual learning communities. Journal of Extension 46(4): Article 4FEA1. http://www.joe.org/joe/2008august/a1.php

Stone, B., and S. Coppernoll. 2004. You, Extension, and success: A competency-based professional development system. Journal of Extension 42(2): Article 2IAW1. http://www.joe.org/joe/2004april/iw1.php

University of Kansas. 2009. Community Toolbox of the Workgroup for Community Health and Development. http://ctb.ku.edu/en/tableofcontents/chapter_1036.htm; http://ctb.ku.edu/en/default.aspx

University of Wisconsin Cooperative Extension. 2011. Program development and evaluation. http://www.uwex.edu/ces/pdande/evaluation/index.html

Wenger, E., R. McDermott, and W.M. Snyder. 2002. Cultivating communities of practice. Boston: Harvard Business School Press.
