Final report for NEVT17-001
Project Information
Twenty-nine Vermont agricultural service providers participated in professional development to learn evaluation concepts and apply techniques to improve their sustainable agriculture programs in ways that increase impact at the farm level. A core cohort of 22 participants completed multiple training sessions, gaining both a conceptual foundation in evaluation approaches and practical skills related to planning and implementing evaluation activities. Of the core cohort, 18 demonstrated changes in knowledge, attitude/awareness and skills related to planning and implementing evaluation, and 14 integrated at least one aspect of what they learned to modify how they evaluate their water quality protection and/or farmer development programs. At the conclusion of the project, 8 participants reported using their evaluation results to improve programs and services conducted with approximately 1290 farmers in Vermont and other states. These participants reported using evaluation to improve: grant programs that provide financial assistance to farmers to implement water quality protection on their farms; grant programs that provide funds for farm business development education and technical assistance; direct farmer education on business start-up; and business development, profitability, and nutrient management education on livestock farms.
20 Extension educators and non-profit personnel will use increased outcome evaluation knowledge and skills to improve programs designed to help 150 beginning farmers launch and grow farm enterprises that meet their business, stewardship and lifestyle goals, and help 150 established producers adopt nutrient management, cover crop, and other production practices that support Vermont’s new water quality goals.
The goal of this plan is to increase the ability of agricultural service providers in Vermont to apply evaluation concepts and techniques to improve their sustainable agriculture programs in ways that increase impact at the farm level.
The project will focus on evaluation in the context of two transdisciplinary areas of high relevance to NESARE’s outcome statement: programs that a) help new farmers establish and grow sustainable farm businesses and b) those that help farmers implement practices that reduce agricultural nonpoint water pollution. As such, it will engage a diverse group of agricultural service providers from Extension, nonprofit and government agencies who work across a variety of crop and livestock production, marketing and business development content areas. Participants will gain both a conceptual foundation and practical skills that they will use to strengthen their nutrient management, cover cropping, grazing management, enterprise planning, business planning, marketing, financial management, apprenticeship, and farmer mentoring programs.
Through the application of what they learn, participants will: gather more meaningful needs assessment data; develop more powerful learning objectives for their programs; keep better track of participant learning and behavior change; increase their understanding of factors that support or inhibit farm level adoption; use data generated through internal reviews and client feedback to improve service planning and delivery; and more effectively communicate the value of their sustainable agriculture programs to agricultural audiences, funders, and the general public.
In Vermont, the majority of our agricultural educators in Extension—from our agronomy and pasture teams that are helping farmers improve soil health through cover cropping, reduced tillage, management intensive grazing, etc. to our farm business viability team that assists farm owners with financial management and business and transition planning—currently provide education and technical assistance that contributes to the goals of the SARE outcome statement.
While the majority of this programming is well received and well attended by the farming community, a review of needs assessments of Extension educators conducted annually at the Extension Professional Improvement Conference revealed that program evaluation and reporting has been a top professional development need in Vermont for at least the past four years. Similarly, NGO and agency partners have indicated interest in professional development on evaluation, and have expressed challenges accurately assessing impacts of their programs.
Extension educators and our NGO partners are committed to evaluating their farmer programming, but many struggle to develop effective and efficient evaluation that both meets funders' requirements and helps us identify the approaches and components of our programs that are most effective at supporting on-farm change. UVM Extension has not had an evaluation faculty specialist or staff coordinator since 2005, leaving evaluation planning and implementation as the responsibility of the faculty and staff delivering programs. While some larger states have Extension evaluation and curriculum development specialists with whom educators and service providers can consult when developing an evaluation plan, developing survey questions, or analyzing data, Vermont agricultural service providers are generally undertaking these activities on their own, sometimes looking to each other for feedback, guidance and suggestions.
Therefore, this project seeks to train a cadre of Extension educators and non-profit organization staff to apply evaluation concepts and approaches in their work with farmers to help us all identify and amplify components that are effective at supporting tangible changes on the farm.
We will draw on Northeast SARE’s outcomes funding framework (Williams et al., 1996) and the logic model approach used in other USDA programs. Penna and Philips (2005) suggest that these outcome evaluation frameworks allow educators to move beyond project outputs to measure meaningful outcomes. The project will also draw upon “Theory of Change,” a methodology used for planning, participation, and evaluation to promote social change (Taplin and Rasic, 2012).
Advisors/Cooperators
Educational Approach
Through an online application, the project administered a competitive process to recruit 14 Extension educators and 6 non-profit and government agency staff to participate in the three-year project. We selected participants based on supervisor support, commitment to participate, and the relevance of their area of expertise/programming to water quality and/or beginning farmer topics.
The project, which we called Evaluation Works, used a variety of educational delivery methods, including workshops, webinars, and individual technical assistance and mentoring. It supplemented in-person and distance trainings with small (2- to 4-person) team peer support/learning circle-style check-ins. The check-ins included time for progress updates and planning next steps. We used an email listserv and a website to support ongoing dialogue and resource exchange. Participants selected one of their sustainable agriculture education projects to use for their evaluation work throughout the project. In several cases, multiple people from a program team, organization or agency enrolled in the project, and worked collaboratively on the same or aligned programs/projects.
Evaluation Works provided modest incentives for participation, including evaluation publications, mileage reimbursement to trainings and follow-up activities, and access to online evaluation and reporting tools. Through the creation of a project lending library and purchase of reprints, participants had access to both scholarly and practical articles, books and guides on evaluation approaches and techniques.
All group learning sessions (whether in-person or online) included action planning components, during which service provider participants discussed ideas for implementing evaluation activities in their programs and identified at least one action step for doing so. Action plans included steps such as planning (and conducting) needs assessment interviews; identifying key indicators; developing registration forms and survey instruments; developing interview and focus group protocols; analyzing data from document reviews, interviews, focus groups and/or surveys; and creating reader-friendly reports and data visualizations.
Milestones
1. 12 Extension, nonprofit and public agency key informants participate in interviews and focus groups to identify and prioritize their learning needs and priorities related to evaluation of beginning farmer development and water quality programming practices.
12
1
11
January 31, 2018
Completed
May 15, 2018
Eleven agricultural service providers and one farmer participated in interviews about their learning needs related to beginning farmer development and water quality programs. The individuals represented Extension, nonprofit organizations, state government agencies and private consultants. Half of the individuals were people whose responsibilities include program leadership and coordination of multi-organizational projects, so they were able to provide insights into needs across the organizations they represent. The information they shared confirmed that building evaluation capacity and expertise is a need across organizations working on water quality and new farmer development. The needs assessment also gave us a new appreciation for the amount of customization that will be needed to meet the variety of needs, and to support implementation within their teams and projects. For that reason, we revised our target for the number of participants from 24 to 12-15.
2. 120 agricultural service providers receive an announcement describing the new SARE evaluation education project; the announcement includes an invitation to enroll as a participant and complete an anonymous baseline assessment of their knowledge and skills.
120
80
July 31, 2018
Completed
November 30, 2018
We conducted outreach to 80 agricultural educators and service providers across Vermont. Outreach was conducted by direct, individual email to individuals who work for UVM Extension, the Vermont Agency of Agriculture, Vermont conservation districts, nonprofit organizations and USDA FSA and NRCS. Within those groups, we targeted our outreach to people whose work has a significant focus either on helping new producers grow sustainable farm businesses or helping farmers implement practices that reduce agricultural non-point pollution. Emails pointed interested people to the new VT-SARE website, which provided an overview of the project. We also offered an interactive introductory webinar to give potential participants a taste of the kinds of activities they would engage in and opportunities to ask questions. A recording of the webinar was posted on the project website so people who could not attend "live" could view it. Eighteen people attended or viewed the webinar.
3. 24 agricultural service providers complete the self-assessment and online application, and commit to participating in the online, in-person and peer-to-peer components of the project. The project team uses the information from registration forms to match participants with a “peer learning partner” from the group and notifies all participants of the match.
24
15
August 31, 2018
Completed
December 20, 2018
Seventeen individuals completed an online application and were accepted into the program. Subsequently, three individuals withdrew: two because of changes in their jobs, and the third because the program was not a good match for that individual's priorities at this point in their career.
The individuals currently in the program are employed by Extension programs (6), nonprofit organizations (4), local or state government agencies (3) and a land grant university. They work across a variety of programs involved in nutrient management, cover cropping, grazing management, specialty crops, enterprise planning, business planning, apprenticeship and farmer mentoring. As expected, the cohort is diverse in terms of their prior knowledge and experience with evaluation topics, and in the specific kinds of information and skill building they hope to gain through the project. Participants indicated they anticipate using what they learn to improve current programming (13 people), to plan new programs (12 people), to build their networks (12 people) and to make decisions about whether to continue programs (9 people). We decided to wait until the cohort has more interaction with each other to assign peer learning partners.
4. 14 service providers complete an online form verifying that they have completed a series of pre-workshop assignments. These assignments include reviewing informational materials about theory of change, developing learning goals, and methods for assessing program performance. They also share the indicators their programs and/or projects are currently using to measure progress toward outcome goals. (September 2018)
24
14
September 30, 2018
Completed
December 20, 2018
All participants completed an in-depth self-assessment about their current evaluation practices, providing us with information about the approaches/frameworks and measures they are using in their current programming.
Participants have also completed worksheets outlining their own program's theory of change and assumptions, and 12 of the 14 have completed and shared with the project coordinator their personal action and learning plan through the summer of 2019.
5. 20 agricultural service providers attend a full-day workshop to increase their knowledge and skills in the targeted evaluation topic areas. The focus of this session will be on framing evaluation questions and outcome indicators and approaches for collecting reliable, valid data. During the session, service providers will engage in individual reflections and small group discussions to generate ideas for implementing what they learn in their work with farmers, and planning at least one action step to improve programming related to new farmer development or water quality protection. They also obtain an orientation to the peer learning and expert consulting components of the project. (October 2018)
20
14
October 31, 2018
Completed
April 12, 2019
We held two sessions of the in-person workshop in April 2019. The same content was covered in both sessions. Two sessions were needed because schedules did not align for a single session.
6. 12 agricultural service providers use an online application form to request individual technical assistance from one of the consulting expert evaluators. (By the end of October 2018)
20
12
October 31, 2019
Completed
May 31, 2019
Requests for technical assistance were incorporated into the action planning document. Two individuals will be completing theirs in May 2019. The project coordinator is currently working to identify the consulting evaluators to provide specific technical assistance/information individuals have requested.
Because most of the participants indicated an interest in more in-depth training in the methods of Results Based Accountability, we are exploring offering a multi-session training for the group as a whole.
1. 14 agricultural service providers receive monthly email updates about the project and invitations to participate in bi-monthly distance meetings (via Zoom). (Starting December 2018 and continuing through the second and third year of the project.) Each distance meeting will combine a presentation by an expert on a technical topic with opportunities for participants to ask more general questions and share challenges, successes and progress toward action steps. Emails will provide links to related resources. While we will fine-tune the topics based on the priorities of the cohort, anticipated year 2 distance session topics include:
• writing strong survey questions and reporting survey results;
• technology to automate, streamline and strengthen data collection in farmer education settings;
• beyond the survey: observation, documents, interviews, and other data collection methods;
• uses and benefits of qualitative data.
14
20
September 30, 2019
Completed
December 20, 2019
In year 2 of the project, 21 agricultural educators and service providers engaged with the program, attending a combination of online and in-person educational workshops.
In the first half of the project year we held 3 distance meetings (via Zoom, and recorded), and two in-person workshops.
The topics covered during the three sessions were:
- Orientation to the Evaluation Works program
- Learning Organizations and Theory of Change
- Change Management
Supplementary resources and follow-up activities were provided for the second and third sessions.
Two workshops (covering the same content) were necessary because it was not possible to bring all the participants together for a single day session. Participants at both sessions indicated strong satisfaction with the opportunity to learn in a highly interactive, small group session. Content covered during the sessions included:
- Evaluation Frameworks: Participants learned about three approaches to evaluation frameworks -- logic models, rubrics and the Results Based Accountability framework -- with an emphasis on understanding the similarities and differences, and the uses and limitations, of each framework. Through individual and small group work, participants applied these concepts to determining an evaluation framework for their work, and began to consider which model(s) make the most sense for their programs and/or projects.
- Learning Goals & Evaluation: Participants learned about and practiced writing learning goals that articulate changes in client behavior in specific measurable ways. Participants worked individually and in pairs or trios giving each other feedback and working together to develop learning goals that are both meaningful and practical.
- Data Collection & Survey Design: After a discussion of different methods of collecting evaluation data and of survey and survey question design, participants worked individually to apply the concepts discussed to their work. We asked all participants to bring the theory of change sheet they developed earlier in the year and a recent (or planned) survey or other data collection tool (such as an application form) from their project. Each participant then marked up their survey/tool, coding it to identify the kinds of performance measures it collects and which components of their theory of change it addresses. Participants were asked to consider whether they are collecting the right data, enough data, or perhaps too much data given the goals of their program and the resources/capacity they have. Each participant also made a "to do" list of priority changes they want to pursue. Then, in small group discussion, participants summarized what they discovered about their own instrument/theory of change and provided feedback to each other. Based on that feedback, participants updated their to-do lists.
- Introduction to the Learning Circles and Technical Assistance Components of the Project: Participants learned about how the learning circles and technical assistance components would be structured.
- Action Planning: All participants completed an action plan with three next steps that they want to pursue. We took photos of their action plans and are using them to guide upcoming content for webinars and individual consulting.
- Reflections and take-aways: We concluded the session by getting feedback from participants about what worked well for them during the session and what they would do differently if we offered it again.
In the second half of the project year, the project director conducted an in-person, 75-minute workshop on Using Technology to Streamline Reporting at a UVM Extension in-service training. The session was attended by 13 Extension employees, both individuals who are part of the Evaluation Works cohort (4) and additional Extension faculty and program staff (9). The session provided information about ways technological tools can help agricultural service providers reduce the time they spend entering, managing, and reporting data, and can make it easier to analyze what the data means for program performance and impact. At the end of the session, participants identified at least one approach to streamlining data collection that they would pursue in the coming months.
During the second half of the year the project also supported five people to attend a full-day Stakeholder Engagement Training with Curtis Ogden of the Interaction Institute for Social Change. The training took participants through the entire stakeholder engagement process, familiarizing them with a variety of helpful frameworks and assessment tools and, most importantly, providing hands-on opportunities for them to apply the frameworks and tools to their own programs and needs assessment priorities, and to give and receive feedback from other attendees. Participants found it valuable to spend an entire day "thinking deeply about why, how, when, and with whom we should be engaging in our work.”
2. 14 agricultural service providers launch a peer learning circles process. Learning circle groups adopt a meeting and recording template (provided by the team) to outline the format, ground rules and expectations for their collaborative work and to document progress, accomplishments and challenges they encounter. (April 2018).
14
14
14
May 02, 2019
Completed
May 30, 2019
Participants have been assigned to five learning circles. The project director decided the membership of the learning circles based on the interest areas of the individuals (outlined in their self-assessments). All of the learning circles are using the same meeting and recording template to outline the format, ground rules and expectations for their collaborative work. The template also serves as a way to document progress, accomplishments and challenges they encounter, and to communicate with the project director about their progress and needs.
3. 20 agricultural service providers engage in ongoing discussion and program development with at least one peer (matches made by the program coordinator) and a consulting evaluator as they integrate new knowledge and skills into their programming. (Starting October 2018 and continuing through September 2020.) The consulting evaluator documents the questions/challenges participants are addressing and shares that information with the project coordinator.
14
14
14
September 02, 2020
Completed
December 20, 2019
Two of the learning circles have been meeting consistently and independently in the 8 months since they launched, with each member moving forward with their goals. The other three learning circles met for the first two or three months after they were launched, but encountered scheduling conflicts during the growing season and did not start back up independently. For the third year of the project, the project director is reorganizing the learning circles into pairs/triads to try to reduce scheduling conflicts while still incorporating a peer-to-peer learning and support aspect into the project.
The project director and consulting evaluator have provided assistance and support to Evaluation Works participants based on their requests. Seven individuals have requested assistance, some on more than one topic. Assistance has been provided on the following topics:
- survey design (2 people)
- survey questions (3 people)
- analyzing and interpreting survey data (1 person)
- developing an evaluation framework for their program (2 people)
- communicating findings to stakeholders and the general public (1 person).
Because a significant portion of the cohort was interested in additional training on the Results Based Accountability framework and process, the project director and consulting evaluator developed a hybrid (in-person and online) six-session "short course" for the cohort. RBA is a process for evaluating and improving public sector programs that has been adopted by Vermont state government. We invited Evaluation Works participants to invite colleagues from their organizations to participate in the training. As a result, six additional individuals are now receiving training through the project. We are now halfway through the short course and will report in full on outcomes and impacts in year 3. The project director has developed a section of the Evaluation Works website for the RBA course, which includes recordings of sessions, supplementary materials, and assignments.
4. 20 agricultural service providers complete a follow-up self-assessment survey from the Project Director and report on actions taken thus far to use evaluation skills and knowledge from this project in their water quality and/or beginning farmer programs, and give input about future training needs. (by May 2019)
14
14
May 31, 2019
Completed
January 14, 2020
Of the 21 individuals who have participated in Evaluation Works education and training to date, 14 have provided feedback about changes in their knowledge, skillset, attitude and/or behavior via surveys, self assessments, learning circle notes and/or individual interviews. All 14 report they are in the process of integrating new knowledge and skills in their work.
Significantly, 10 participants report working collaboratively with other Evaluation Works participants to modify, improve and align their measures and data collection approaches across project, program, and organizations/agencies.
Additionally, four have used feedback from the project director, their learning circle partners and/or the consulting evaluator to make changes to surveys and other evaluation instruments.
For example, an Extension employee who routinely collaborates with a nonprofit on a variety of farmer education initiatives reported that over the last year she and the executive director of the nonprofit (also an Evaluation Works participant) have had multiple conversations between themselves and with advisors and the nonprofit's board about aligning data collection in the two organizations. Their goal: to reduce survey fatigue among clients while focusing more on key measures and collecting "really substantive information to communicate to funders and policy makers." As a result, they have dropped some survey questions that they had been asking for a number of years, and modified or added others so that their instruments align.
Following are additional quotes from participants about the impact of participation thus far.
"I am really finding the information useful, and it has enabled our team to think more broadly about our program evaluation."
"The elements of RBA are helping me to consider new ways of measuring success."
"I'm gaining new understanding of the major areas of measurement, engaging with colleagues, and understanding how to lead my team through the process of developing these metrics."
"I am gaining tools to help facilitate discussions with a large team around finding commonalities for multi-programatic alignment."
"This course has really opened my eyes and provided me with a new area of excitement in my work."
"I keep thinking there is a magic formula to this -- like there is going to be an absolute right. It seems like there are wrongs but not a perfect right. So I'd like to get more comfortable with that."
"Its so great to have this toolkit because it is so relevant to the work I'm doing . . . I'll be having a metrics meeting with [my board] to really look at how can we expedite data collection and at the same time make sure we are collecting the information that can really help us make our program better. And I don't think it would have happened without this program."
The project director will follow-up in year 3 with all Evaluation Works participants and report on their progress incorporating new knowledge and skills in their program evaluation work.
5. Through the survey, 6 agricultural service providers share tools and templates that they developed/used in their work, which the project team shares on the Vermont SARE website. (July 2019)
6
5
July 31, 2019
Completed
March 31, 2020
Sharing of tools and templates (between participants) was pushed back to year 3.
Five individuals shared draft and/or final registration forms, survey instruments, or reporting forms that they created and used in their programming. One of the five individuals also shared a report about program accomplishments.
Because these instruments and tools were so specific to the projects they were created to support, the project director instead identified and shared on the website a set of more broadly applicable and adaptable tools, templates and resources relevant to the cohort and other Vermont service providers.
6. 20 agricultural service providers attend the project’s Year 2 Workshop, which builds on the skill base developed through the first year’s professional development activities. The focus of the year 2 workshop will shift from designing their evaluation plan and collecting data to analyzing their data and implementing changes in their programming. While we plan to fine tune content based on the priorities identified in the May 2019 survey, anticipated topics will include practices for assessing program performance, using that performance data to modify programming and communicating results to stakeholders. As with the first workshop, the session will be highly interactive, with opportunities for both group discussion and individual reflection. By the end of the day, all participants will plan (and share with the project director) at least one new action step to improve programming related to new farmer development or water quality protection. (September 2019).
20
12
September 30, 2019
Completed
February 28, 2020
Responding to participants' priorities, we substituted a six-session "Results Based Accountability (RBA) Short Course" for the single-day workshop.
Twelve people completed the short course. Sessions included presentations, full- and small-group discussion, and individual and small group work. Topics included:
- Overview of the RBA approach and how it works
- The difference between population and performance accountability measures
- Choosing indicators
- Practices for assessing program performance at the output level
- Practices for assessing program performance at the outcome level
- Practices for assessing program performance at the impact level
- Communicating results to funders, other stakeholders and program participants
Each of the participants was provided with copies of Trying Hard Is Not Good Enough and Turning Curves: An Accountability Companion Reader, two books by Mark Friedman about the RBA process.
At the close of the series, participants presented to the group the work they had done to either plan or begin implementing an RBA-influenced evaluation effort. Of the twelve participants, six focused their Evaluation Works activities on water quality programming, four primarily focused on farm business development programs and two people were considering organization-wide evaluation activities that incorporated both water quality and farm business development initiatives.
14 agricultural service providers engage in ongoing discussion and program development with at least one peer (matches made by the program coordinator) and, as requested, a consulting evaluator as they integrate new knowledge and skills into their programming. (Starting May 2018 and continuing through August 2020.) The consulting evaluator documents the questions/challenges participants are addressing and shares that information with the project coordinator.
14
7
August 31, 2020
Incomplete
Some ongoing discussion among participants occurred in early 2020, but due to COVID-19, participants prioritized direct response to client needs (including pandemic-related needs) through 2020, and the majority did not seek the assistance of the consulting evaluator.
14 agricultural service providers identify at least one new action step to improve programming related to new farmer development or water quality protection and incorporate it into their programming.
14
14
14
February 28, 2020
Completed
September 30, 2021
All 12 individuals who completed the RBA short course identified next action steps as part of the closing session of the course.
14 agricultural service providers continue to receive monthly email updates from the project and invitations to participate in bi-monthly distance meetings (via Zoom). (Through September 2020.)
14
6
August 31, 2020
Incomplete
Given all the other demands being made on project participants through 2020, it was not possible to schedule this kind of group learning during 2020. Instead, the project director has provided support/coaching to participants on an individual or small group level. For example, the project coordinator assisted participants in developing a needs assessment process and survey that the group used to help determine how to transition an annual statewide grazing conference to an online event.
14 agricultural service providers complete a follow-up survey from the Project Director and report on actions taken thus far to use evaluation skills and knowledge from this project in their water quality and/or beginning farmer programs. (August 2020)
14
8
August 31, 2020
Completed
September 15, 2021
Eight individuals completed the end-of-project survey. Results are discussed in detail in the Educational and Performance Target outcome sections below.
6 agricultural service providers share in-depth information about how their expanded evaluation capacity improved their programs and impacts which the project team uses to write case study examples. These participants are selected by the project coordinator, based on the information they provide in the August 2020 survey. The case studies are published on the Vermont SARE website. (September 2020).
6
2
September 30, 2020
In Progress
Three individuals provided detailed information about their expanded evaluation capacity, and case studies of two of the three have been completed and added to the Evaluation Works website.
30 agricultural service providers (core group of participants joined by a broader group of agricultural service providers from the organizations and agencies they represent) attend a final in-person meeting to share information about how they have used evaluation to improve programs and to map out next steps at the individual, organizational and inter-organizational levels. The budget for this summit would be in the 2020-2021 state plan.
30
June 30, 2021
Incomplete
Due to ongoing pandemic-related restrictions on in-person meetings and staff resignations, it was not possible to schedule this kind of group learning.
"We are having so much trouble getting our grant deliverable work with farmers done that it's not possible to allow the time for staff to attend," the supervisor of three Evaluation Works participants said, explaining why her staff members had had limited engagement with the project since March 2020 and why her team could not participate in even an abbreviated virtual session. "We are short-staffed and overcommitted. That's the reality at the moment and I don't see it ending."
Milestone Activities and Participation Summary
- New Vt-SARE website with a section focusing on the Evaluation Works project
- Application form and self-assessment instrument to collect baseline information about project participants
Participation Summary:
Learning Outcomes
The project coordinator verified changes in knowledge, attitudes, skills and/or awareness among 18 of the 22 service providers in the core cohort of participants. The project used a variety of tools to verify these changes, including review of participants' work products (for example, evaluation planning documents, data collection tools and reports that participants shared with the project team) as well as interviews and surveys. Verification activities were conducted within and immediately after trainings, and at the end of the project.
Verification within and immediately after trainings:
Based on work product shared with the project team and post-training surveys, 18 of the participants provided evidence that their knowledge, skill and/or confidence level increased as a result of participating in one or more project activities. This data was collected in 2018, 2019 and early 2020 and is reflected in the reporting in the milestone sections of the final report.
Final Evaluation Survey:
Eight people responded to the final evaluation survey. All eight are represented within the larger group above. Because the survey was conducted months after the last training sessions, it provides insight into how increased knowledge, skill and confidence are being incorporated into their work over the long term.
All respondents reported increased knowledge, skill and confidence related to using evaluation to improve programming. Significantly, all eight of the respondents reported that their participation in Evaluation Works changed the way that they think about evaluation and its role in their work with farmers.
The survey instrument asked respondents to rate changes in their knowledge, confidence and skills on a five-point scale from “No change” to “Significant increase.” The greatest change reported related to increases in confidence in their ability to undertake evaluation activities and to use evaluation results to help improve programs for farmers. All eight respondents reported moderate (3) or significant (5) improvements in confidence. Seven reported moderate (5) or significant (2) increases in knowledge, and five reported moderate (4) or significant (1) improvements in their evaluation skill levels.
In open-ended questions respondents reflected on the changes to their knowledge, skill, confidence and attitudes, and the benefits these changes are having on their programs.
- The homework and actual program/work integration (rather than example) were very helpful in building confidence and practicing skills learned.
- Collecting qualitative impacts are [sic] easier to capture than I thought. Some of the data is actually more compelling.
- Shifting to service-oriented thinking, incorporating customer perspectives into process development.
- I was given access to training that will prove very beneficial to the long-term ways that I address collecting data and storytelling.
- I am very grateful for learning about Results Based Accountability, and have found endless application in the work that I do.
Performance Target Outcomes
Performance Target Outcomes - Service Providers
Target #1
20 Extension educators and non-profit personnel will use increased outcome evaluation knowledge and skills to improve programs designed to help 150 beginning farmers launch and grow farm enterprises that meet their business, stewardship and lifestyle goals, and help 150 established producers adopt nutrient management, cover crop, and other production practices that support Vermont’s new water quality goals.
Year 1 | Year 2 | Year 3 |
---|---|---|
18 | 8 |
Improving evaluation practices and using evaluation results to improve design and delivery of farmer programs related to water quality protection and business development. Of the eight people who completed the final evaluation survey, three reported using evaluation to make improvements to water quality programs, one reported changes to a farm business development program, and four reported making changes to both farmer development and water quality education programs.
Actions taken by all eight included changes to: evaluation planning; assessment of outcomes for program participants; data collection and analysis practices; and the ways they communicate with funders and stakeholders about program accomplishments.
Year 1 | Year 2 | Year 3 |
---|---|---|
1290 |
Activity | Year 1 | Year 2 | Year 3 | Total |
---|---|---|---|---|
Workshops and field days | 2 | 2 | ||
Stakeholder assessment and feedback data from farmers and aspiring farmers via intake forms, surveys and feedback forms (10); Farmer feedback and assessment via interviews and focus groups (2); modifications to grant application, review and administrative processes (3); online conference planning and decision-making (2); communications about program opportunities, activities and accomplishments via written reports (3). | 8 | 12 | 20 |
This section focuses on outcomes the project coordinator verified through both year 2 and end-of-project interviews and surveys.
Covid & Cohort changes:
In discussing achievement of the performance target, it is important to acknowledge how the Covid-19 pandemic and the "great resignation" affected this cohort of learners. Of the total cohort of 22 participants, six people withdrew from Evaluation Works prior to the project's end due to changes in their employment. All but one of these individuals have left the organizations they were employed at when they initially enrolled in the project; the sixth individual has a new position and different responsibilities. Additionally, due to covid-related changes in work plans, staffing shortages and pressure to address farmer needs and project deliverables, the supervisor of two other participants restricted them from participating in the final project year.
The project coordinator sent a final project evaluation survey to 14 of the individuals in the project’s core cohort. Eight people responded. The survey asked a number of questions about changes they made as a result of participating in Evaluation Works. The questions addressed adoption of new practices related to:
- The use of evaluation in program decision-making and delivery
- Participants’ approach to evaluation planning
- The way participants collect and analyze data
- The way participants communicate with funders and stakeholders about program activities and accomplishments
Use of Evaluation in Program Design & Delivery
All respondents said that their participation in Evaluation Works helped them use evaluation practices to improve program design and delivery. Respondents indicated that they have applied what they learned in Evaluation Works to:
- Assessing outcomes for participants (8)
- Improving existing programs/services (7)
- Assessing how satisfied participants are with programs/services (6)
- Planning new programs/services (5)
- Making decisions about whether to continue programs/services (1)
Changes to Evaluation Planning
All participants said their participation in Evaluation Works has influenced their approach to evaluation planning. Specific changes include:
- The kinds of data they are now collecting (8)
- Identifying performance measures (7)
- How much data they collect (6)
- Stakeholder engagement in developing program goals, activities and evaluations (6)
Changes to Data Collection and Analysis Practices
All participants said their participation in the project influenced the ways they are collecting and analyzing data about their programs. Changes include:
- Wording of questions on registration forms and/or evaluation surveys (8)
- Overall design of registration forms and/or evaluation surveys (4)
- Data analysis practices (4)
- Use of in-session polling (3)
- Questions asked in focus groups/interviews (3)
- Use of observation and/or assignment completion (3)
- Other – Addressing data visualization differently (2)
Changes to Communication with Funders and Stakeholders
All respondents said that as a result of participating in Evaluation Works they have changed the way they communicate with funders and stakeholders about program activities. Changes included:
- The kind of information provided in written reports (8)
- Data/evidence provided in grant proposals (8)
- Information shared with partners and collaborators (6)
- The quality of the information provided in written reports (6)
In aggregate, over 1290 farmers participated in the programs in which respondents applied learning from the Evaluation Works project. The minimum number of farmer program participants was 20 and the maximum was 500.
Six of the eight respondents reported sharing learning from Evaluation Works with colleagues at their organization or with collaborators/partners from other organizations and agencies. In total, Evaluation Works participants reported sharing learning with 65 other agricultural service providers and educators.
Examples:
- I worked on this with a team mate in the organization -- this was very valuable. We supported one another when we worked with the rest of the organization (the board) - it was a very positive experience. The board saw the value in our participation and the changes we made related to our program evaluations.
- As a result of this course, I started to collaborate on a measurement plan that expanded past my own program to include aligning metrics across the [division].
- We have pivoted from measures . . . focused on how much toward considering how well and is anyone better off even though those are harder data points to gather, they offer the critical feedback to truly evaluate what we do.
- We have changed the questions we ask on our evaluations and in our grant reporting. We know why we are asking the question and what the results will help us learn and communicate.
Seven of the eight survey respondents reported that the pandemic affected their ability to apply Evaluation Works learning in their work. Six of the seven rated the impact as moderate or significant. Several indicated that it was more difficult to gather feedback following virtual education. "In the same way that COVID has impacted much of the work we do, it has impacted in-person events and the ability to have focus group discussions with participants," one respondent explained. "We still held our major event but we would have done more programs and therefore would have used our new skills more often," another said.
Additional Project Outcomes
Year 1 | Year 2 | Year 3 | Total |
---|---|---|---|
2 | 2 | 2 | 6 |
Unanticipated outcomes:
Pandemic Response
Six participants reported that Evaluation Works helped them adapt their farmer programming during the pandemic. "Evaluation Works actually helped us through the pandemic and virtual collaboration space," one respondent said. "It provided a framework and structure to tackle big program discussions and/or season debriefs." Another respondent credited the knowledge gained through the program with providing "heightened awareness of critical information to capture during an unprecedented time." Said another, "We used our improved survey technique to find out our membership's willingness to participate in programs during the pandemic."
Ripple Effects Across Organizations and Across New England States
Several participants reported that they are using what they learned in Evaluation Works to improve and align data collection and reporting across multiple programs at their organizations/agencies.
- One Evaluation Works participant, who is responsible for supporting planning and evaluation activities throughout her organization of 130 employees, reported that participation improved her ability to help colleagues engage in more meaningful evaluation. "I feel like I'm doing a better job supporting people to think about how they will track and name, show change," she said, explaining that she has a better sense of how to help them develop an evaluation plan. "What data exists? What do you plan on collecting and tracking? And how does that relate to the goal that you're aiming for? The support that I'm providing in those spaces has become more refined. More helpful."
- Another participant shared that, as a result of participating in Evaluation Works, she started to collaborate on an initiative that will align metrics across multiple programs in her agency and provide opportunities for the programs to aggregate outcome and impact data, both to better understand how their work is collectively affecting Vermont's farm and forest economies and to more effectively communicate that information to funders, policy makers, and program participants.
Two other participants report that they are using what they learned in Evaluation Works to help align data collection and reporting in a six-state, multi-organizational New England livestock project.
---
In reflecting on their experience, several Evaluation Works participants expressed appreciation for the project's focus on each participant's own program and needs in an environment of collaborative inquiry and problem solving.
"I appreciated this program . . . because it promoted bringing what we are already working on into the learning space. Because of the integration of what we are already working on, the homework didn't feel like extra it felt like enhancements to the program and provided multiple benefits; personal experience and training AND program improvements," one participant said.
"I appreciated the very practical nature of this program," another wrote. "We worked on our own material basically the whole time."
Said another, "Thank you . . . for helping to impart the importance of thoughtful evaluation in our work. Thank you for the coaching, the information, and most of all for the collaborative space to become better at what we all do."
Participants also noted that meaningful program evaluation is hard work, which requires significant and sustained engagement to successfully implement and lead to program improvements. "I put a lot of time and intent into this and wish credits towards a Masters program were relevant," said one participant.
SARE Outreach
In 2020 and 2021, the Vermont SARE Program conducted outreach about SARE grant opportunities and the results of SARE research in the following ways:
- Bringing the Northeast SARE exhibit and materials to four major Vermont agricultural conferences:
- Northeast Organic Farming Association of Vermont - Winter Conference (2 days, February 2019);
- Vermont Grass Farmer's Association Conference - (2 days, January 2020)
- Vermont Industrial Hemp Conference (1 day, February 2020)
- Vermont Organic Dairy Conference (1 day, March 2020)
- Maintaining and expanding the website for the Vermont SARE state program and the Evaluation Works professional development project (http://blog.uvm.edu/vtsare/).
- Continuing to build the social media presence of SARE projects on Twitter, Facebook, and partner blogs.
- Responding to individual telephone and email inquiries (approximately 20 in each of the two years) from farmers and service providers about SARE funding opportunities.
Due to covid restrictions, no in-person exhibiting was conducted between April 2020 and the end of the project.
In 2019, the Vermont SARE Program conducted outreach about SARE grant opportunities and results of SARE research in the following ways:
- Bringing the Northeast SARE exhibit and materials to five major Vermont agricultural conferences:
- The Vermont Farm Show (3 days, January 2019)
- Northeast Organic Farming Association of Vt Winter Conference (2 days, February 2019);
- Vermont Industrial Hop Conference (1 day, February 2019)
- Vermont Grain Conference (1 day, March 2019)
- Northwest Crops and Soils Field day (August 2019)
- Coordinating and delivering a presentation at a 75-minute information session on SARE's graduate student grant program (April 2019).
- Maintaining and expanding the website for the Vermont SARE state program and the Evaluation Works professional development project (http://blog.uvm.edu/vtsare/).
- Continuing to build the social media presence of SARE projects on Twitter, Facebook, and partner blogs.
- Responding to individual telephone and email inquiries (approximately 20 over the year) from farmers and service providers about SARE funding opportunities.
In 2018, the Vermont SARE Program conducted outreach about SARE grant opportunities and results of SARE research in the following ways:
- Bringing the Northeast SARE exhibit and materials to five major Vermont agricultural conferences:
- The Vermont Farm Show (3 days, January 2018)
- Northeast Organic Farming Association of Vt Winter Conference (2 days, February 2018);
- Vermont No-Till and Cover Crop Conference (1 day, February 2018)
- Vermont Hop Conference (1 day, February 2018)
- Vermont Grain Conference (1 day, March 2018)
- Coordinating and delivering a presentation at a 75-minute information session on SARE's graduate student grant program (April 2019).
- Building a new website for the Vermont State SARE Program and state professional development project (http://blog.uvm.edu/vtsare/).
- Continuing to build the social media presence of SARE projects on Twitter, Facebook, and partner blogs.
Received information about SARE grant programs and information resources:
Audience | Year 1 | Year 2 | Year 3 | Total |
---|---|---|---|---|
Service providers | 250 | 250 | 250 | 750 |
Farmers | 1800 | 1800 | 1800 | 5400 |