You are viewing archived content
of the Inter-American Foundation website as it appeared on June 1, 2018.

Content in this archive site is NOT UPDATED.
Links and dynamic content may not function, and downloads may not be available.
External links to other Internet sites should not be construed as an endorsement of the views contained therein.
Go to the current website
for up-to-date information about community-led development in Latin America and the Caribbean.

Evaluation Process

The IAF has a longstanding commitment to evaluation and learning. Evaluation involves a systematic comparison of actual change against planned goals and objectives, which helps both the IAF and its grantee partners assess the results of their work. The primary tool used is the Grassroots Development Framework (GDF), a system of 41 indicators developed by the IAF in the mid-1990s to measure the tangible and intangible results of its investments. Application of the GDF generates basic information that is reported to the IAF’s board of directors and to the Office of Management and Budget and Congress as required by the Government Performance and Results Act (GPRA).

Staff and partners are actively engaged in exploring the dynamics of citizen-led community
development. The entire active portfolio and our relationships with grantee partners inform our
work and add to the body of knowledge about community development. Because effective
development is a long-term endeavor, the IAF conducts yearly ex-post assessments of
selected projects four or five years after their IAF funding ceases.

Process of Evaluation and Learning
The IAF requires grantee partners to report regularly on results of their work so that
progress toward objectives can be assessed. Results are tracked by applying the GDF’s
menu of tangible and intangible measures of results at the level of individual or household,
the grantee organization and, more broadly, the relevant community or society. The process helps grantee partners set goals, clarify achievements and challenges, and learn from them.

1. Selecting Best Indicators for the Project
During their initial orientation meeting, the grantee partner, the IAF representative
and the professional contracted in-country (the data verifier) choose
indicators from the GDF that coincide with the desired objectives, or short-term
results, of a development project.

Examples follow:
  • the knowledge and skills that beneficiaries acquired and applied;
  • managerial skills developed by the grantee organization, resources leveraged and relations forged with the private and public sector toward expanding the impact of the project;
  • changes in programs and policies that benefit disadvantaged people in the community and dissemination of good practices and lessons.
2. Baseline Data
The grantee partner and the IAF’s Office of Evaluation document baseline
conditions before the IAF’s initial disbursement of funds so that conditions can be
compared at the outset, during the course of the project and at the end. If the project
is assessed five years after IAF funding terminates, baseline data allow for a longer-term comparison.

3. Reporting
Grantee partners report regularly on progress toward their objectives and goals. The
Office of Evaluation verifies and aggregates results for the annual Grant Results
Report, which responds to GPRA. Reports prepared by individual grantee partners,
as well as the Grant Results Report, are used by the Office of Programs to track progress.

4. Project History
At the end of the project, the grantee partner compiles a narrative detailing its project’s design, implementation, results, goals met, expected sustainability and impact. The narrative identifies what worked, what did not and why, and includes key lessons and comments. Data verifiers review the project history and include their own assessments as to the extent that projects were successful in achieving development objectives. The Office of Evaluation and Office of Programs summarize best practices relevant to each project. Project histories can be clustered by common themes or desired outcomes, which helps the IAF draw more robust conclusions.

5. Ex-post Assessments
Once a year, the IAF’s Office of Evaluation and the Office of Programs select for
assessment by thematic cluster or individually a subset of between five and ten
projects whose IAF funding ended several years earlier. Data verifiers visit each
project site; glean relevant data from interviews, observations and other sources; and
draft a report for review by IAF staff. The ex-post assessment compares the baseline
data collected before the initial disbursement of IAF funding with the data on the
same indicators collected later. The Office of Evaluation summarizes conclusions to
be posted on the IAF website or published in Grassroots Development or other IAF publications.

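At its core, the ex-post assessment is a comparison of values on the same indicators at two points in time: before the initial disbursement and four to five years after funding ends. A minimal sketch of that comparison, using invented indicator names and figures purely for illustration:

```python
# Hypothetical ex-post comparison: baseline values recorded before the
# first disbursement vs. values on the same indicators collected four to
# five years after IAF funding ended. All data below are illustrative.

baseline = {"hectares under cultivation": 12.0, "cooperative members": 35}
ex_post  = {"hectares under cultivation": 30.0, "cooperative members": 80}

def compare(before: dict, after: dict) -> dict:
    """Absolute and percent change for indicators measured at both points."""
    out = {}
    for name in before:
        if name in after and before[name]:  # skip unmeasured or zero baselines
            delta = after[name] - before[name]
            out[name] = {"change": delta, "pct": 100 * delta / before[name]}
    return out

for name, stats in compare(baseline, ex_post).items():
    print(f"{name}: {stats['change']:+} ({stats['pct']:.0f}%)")
```

Restricting the comparison to indicators measured at both points is what makes the baseline documentation in step 2 essential: without it, change several years after close-out cannot be attributed or even quantified.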
6. Meta-Analysis
The Office of Evaluation and the Office of Programs may contract external
evaluators to conduct meta-analyses of a cluster of ex-post assessments. This
involves examining the assessments for commonalities and trends traceable to IAF funding.
Lessons extracted are applied to the grant-making process and disseminated to a
broader audience.

Learning Potential
To improve its effectiveness as an organization, the IAF constantly evaluates its approach to community development. Reviewing the entire portfolio helps determine the validity of its core work principles and approaches. Systematic evaluation of and reporting on results achieved along the way also help grantee organizations manage their respective projects, report clearly and demonstrate agency and accountability.

Table: The Evaluation Process and Responsibilities

Grantee Partner
  • chooses GDF indicators (at start of project)
  • documents baseline conditions (at start of project)
  • reports results as measured by chosen indicators (twice a year)
  • drafts project history (at end of project)

Data Verifier
  • confirms baseline conditions (at start of project)
  • validates grantee reports and checks whether goals and benchmarks are met (regularly)
  • reviews project history and adds commentary (at end of project)
  • conducts ex-post assessments (four to five years after project ends)

Foundation Representative
  • reviews project history (at end of project)
  • reviews ex-post assessments (regularly)

Office of Evaluation
  • aggregates data in results report (annually)
  • chooses topic for ex-post assessments (annually)
  • reviews and summarizes ex-post assessments (annually)
  • reviews and summarizes meta-analysis (periodically)

Office of Programs
  • reviews ex-post assessments (annually)
  • reviews meta-analysis (periodically)