Organizational Effectiveness Sharing Learning 2016
Organizational Effectiveness Overview
The David and Lucile Packard Foundation’s (Packard Foundation) Organizational Effectiveness (OE) team provides funding to non-profit organizations, networks, and individual leaders to build core strengths in key areas of infrastructure, such as strategic and business planning, financial management, and board and executive leadership.
To more effectively support grantees in becoming stronger and more resilient, OE conducts ongoing measurement, learning, and evaluation activities for all grants. The 2016 Sharing Learning report presents grantee-reported data and findings on the extent to which OE grants result in greater organizational capacity and amplify the impact of Foundation programs and strategies, as well as the extent to which the OE team provides good service to grantees.
Sharing Learning Evaluation Approach
This report is one piece of OE’s evaluative activities, specifically examining the results of OE grants at the end of the grant term. The Sharing Learning evaluation explores grantee objective completion, capacity increases, and program impacts reported by the grantee.
ORS Impact reviewed and qualitatively coded 55 grant reports. Grants included in the sample ended between July 1, 2015, and April 30, 2016, and had their grant reports submitted and approved by June 30, 2016.
Grant reports included in the analysis were distributed across program areas, with the greatest numbers in Local Grantmaking and Conservation & Science.
ORS Impact also considered a number of relevant grant characteristics in the analysis—including grant size, geography, and project focus—to understand how these factors might shape objective completion, organizational capacity changes, and program impact.
Over half of grants were awarded for less than $30,000, and only 13% for more than $45,000. The average award size was $30,044.
Two-thirds of grants were domestic, and one-third were global.
The greatest number of grants were in strategic planning and executive leadership. However, even within a single focus area, grant activities and objectives could look different or be defined differently by grantees.
Grants included between one and three focus areas. Grants categorized as 'Other' included objectives such as 'conduct an audit of the external marketplace,' 'conduct a feasibility study for developing a training center,' and 'design an internship.'
Grant objectives were defined by the grantee organization with the assistance of their OE consultant and varied widely in specificity and number (between one and seven).
Nearly all grantees completed objectives, with some variability by program area and project focus.
82% of grantees met all objectives. This was lower than the proportion of grantees (95%) that met all objectives in 2015.
All Population and Reproductive Health grantees met all objectives, while Local Grantmaking had the greatest proportion of grantees not meeting one or more objectives.
Grantees that included Board Development and/or Strategic Planning as a focus area were less likely to meet all objectives. That said, the objective completion rate for these focus areas still exceeded 70%.
Grantees with small OE grants were less likely to meet all objectives. However, more grantees received small OE grants than medium or large ones.
Additionally, grantees that were domestic in scope were less likely to meet all objectives. However, more grantees were domestic in scope than global.
Capacity change was coded by assessing reported organizational capacity outcome(s) against project focus area(s). ORS Impact coded grant reports using the following scheme:
- Grantees reporting no organizational capacity outcomes were coded as "capacity did not increase."
- Grantees reporting capacity outcomes within the project focus area were coded as "capacity increased."
- Grantees reporting capacity outcomes within and beyond their project focus area(s) were coded as "capacity increased beyond focus."
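The three-category scheme above amounts to a simple classification rule. A minimal sketch in Python, assuming focus areas and reported outcome areas can each be represented as a set of labels (function and argument names here are illustrative, not ORS Impact's own terminology):

```python
def code_capacity_change(focus_areas, reported_outcomes):
    """Classify a grant report under the three-category coding scheme.

    focus_areas: set of the grant's project focus area labels
    reported_outcomes: set of areas in which the grantee reported
        capacity outcomes (empty if none were reported)

    Illustrative only; assumes any outcome outside the focus area(s)
    places the report in the 'beyond focus' category.
    """
    if not reported_outcomes:
        return "capacity did not increase"
    if reported_outcomes - focus_areas:  # outcomes outside the focus area(s)
        return "capacity increased beyond focus"
    return "capacity increased"
```

For example, a strategic planning grant whose report also describes stronger board engagement would fall into the 'beyond focus' category.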
Nearly all grantees built organizational capacity—and did so in a number of areas—as a result of OE investments.
91% of grantees reported an increase in capacity in or beyond the project focus area. Capacity increases were usually aligned with the project focus.
Strategic planning and executive leadership projects resulted in greater proportions of grantees that experienced capacity increases beyond the focus area. For example, one Executive Leadership grant resulted in a systemic shift in the grantee organization's structure and culture.
Note: ORS Impact used a different definition of 'increase in organizational capacity' and a different coding scheme for 'type of capacity outcome' than previous evaluators. Therefore, we were unable to compare capacity data and results between 2015 and 2016.
Organizations experienced capacity outcomes in a number of areas. Capacity outcomes tended to be logical and aligned with the focus area(s). The table to the left showcases the types of capacity outcomes reported by grantees.
The capacity outcomes that grantees most frequently reported were:
- stronger or more efficient infrastructure or operations, including new or improved systems (e.g., databases, advisory groups) or processes (e.g., data use, onboarding);
- improved long-term planning or greater intentionality in strategic decisions and/or actions; and
- increased engagement of board members.
Program impact can be understood as positive changes for the people or places that are the target of the grantee and/or foundation's programmatic work. While program impacts are beyond OE's direct sphere of influence, the assumption is that over time, new or strengthened organizational capacities contribute to program impacts. Given the timing of the Sharing Learning evaluation (i.e., end-of-grant), OE does not expect grantees to report program impact.
While most grantees experienced organizational capacity outcomes, few reported specific program impacts. At the end of the grant term, four grantees described changes in program quality, four in program design, and one in target population.
These findings are consistent with those reported in 2015. Evaluators of the 2015 grantees found that 10 of 77 grantees (13%) reported direct program impacts, compared to eight of 55 in 2016 (15%). This could be attributed to the timing of the Sharing Learning evaluation; data collection after grant closure may not fully capture program impact.
Grantees are satisfied with the service and supports provided by OE staff. When asked what the OE team could do better to support capacity building, 46% of grantees reported that OE staff were doing a good job. When remarking on OE staff performance, grantees described OE staff's flexibility, support, insight, and transparency.
Recommendations for Improvement
In terms of recommendations for improvement, grantees most frequently asked for help finding consultants and for OE to fund grantee staff.
The table to the left showcases the types of advice reported by grantees.
Nonprofit Developmental Stages
The graphic to the left suggests that nonprofit organizations move through a number of stages in a developmental trajectory as they grow.
In future evaluative efforts, this framework may be useful for understanding the transformative impact of OE grants on non-profit capacity-building relative to where grantees are in their lifecycle.
The David and Lucile Packard Foundation's Organizational Effectiveness grants have now been evaluated over multiple years and by different evaluators using different methods. Across years and methodological approaches, data have consistently shown high levels of objective completion that lead to clear capacity outcomes. Given the consistency of these results, the OE team may wish to consider what else they would like to learn that could strengthen their grantmaking or inform the sector more broadly.
Considering the changes in capacity outcomes that OE grantees experienced, the OE team (or others investing in organizational effectiveness) might contemplate the following questions:
- Do the Sharing Learning data/findings align with your expectations?
- Are the capacity outcomes commonly reported by OE grantees ones you would have expected to see as a result of the OE grants? Are there other outcomes that you had hoped your grantees would experience but that did not arise?
- Is there anything else you've observed as a result of OE investments that is not reflected here? What other data might be useful/actionable to capture in future evaluative efforts?
- Are there takeaways from this evaluation that would be worth sharing with others? Would it be helpful for grantees to learn from each other about their experiences? How else might we maximize and share learning?
Lasting Change 2016
ORS Impact also conducted the 2016 Lasting Change evaluation, which looks at the longer-term impact of OE grants on grantee organizations.