Our Piece of the Pie: From Data to Decision-Making


Author(s): Alex Cortez, Liana Rao

Published Date: January 01, 2008

When nonprofit leaders try to use data to make better decisions, their experiences frequently seem like a 21st century equivalent of Goldilocks and the Three Bears. Many have too little data. Others have too much and are paralyzed by its quantity, not knowing which data will yield the insights they need. Very few have access to just the right data, at the right time, and in the right format. Even fewer have the internal capabilities to use that “just right” data to inform important strategic decisions about how to increase their organization’s impact.

Our Piece of the Pie® (OPP®) is a youth-serving nonprofit organization based in Hartford, Connecticut. In 2006, its management team undertook an internal effort to define “just right” data for the organization and put it to use. This case study follows OPP’s management team members as they determined what key decisions they wanted data to inform, identified what data was required, developed a method for collecting that data, and ultimately shifted their organization’s culture to better use data in driving decisions.

The Context

OPP, originally Southend Community Services, was founded in 1974 to serve the residents of one of Hartford’s most disadvantaged neighborhoods. By 2004, the organization had expanded across the city, serving individuals of various ages through a wide range of services. But with its biggest contract due to expire in 2005 (a five-year Youth Opportunities grant from the U.S. Department of Labor), Bob Rath, OPP’s president and CEO, and his management team decided to re-evaluate the organization’s strategic direction. Their goal was to ensure that OPP could make the biggest possible difference in Hartford going forward. As a result of that planning effort, OPP repositioned itself as a youth-serving organization, and exited other service areas, including elder care and early child daycare.[1]

By the fall of 2005, the repositioning was complete. OPP leaders had created a new core program called “Pathways to Success” (Pathways) to replace and augment the suite of services previously offered under the auspices of the Youth Opportunities grant. Pathways was serving more than 500 Hartford youth ages 14 to 24. In addition, OPP was operating a youth-focused employment program funded through the Workforce Investment Act (WIA) and the Jobs First Employment Services (JFES).

Managing the WIA/JFES program was relatively straightforward because it utilized an established government model. OPP was a contractor, providing a consistent set of services and tracking performance as directed by an externally developed and hosted system.

Pathways, in contrast, was complicated. Each youth participant worked with a case manager and received a customized set of education and/or employment-oriented services based on his or her age, circumstances, and goals. (The sidebar entitled “OPP’s Pathways” provides more detail.) And since Pathways was home-grown, having evolved whole-cloth out of OPP’s experience with the Youth Opportunities grant, OPP’s leaders needed to develop their own tools and methods for managing its delivery and assessing performance.

The organization was collecting some good data. For example, its financial reporting was solid. And in May 2006, OPP had implemented a system custom-designed by a third-party software provider to track Pathways’ beneficiary data, including enrollment and attendance as well as outcomes achieved. This system had strong capabilities for capturing individual beneficiary data, but OPP’s leaders were finding that it came with a steep learning curve. Rath and his team were concerned that the data being generated was not 100 percent accurate, but it was difficult to identify if and where mistakes might be occurring.

In addition, the system was designed to track the progress of an individual youth. Aggregating the performance of the total population (or of segments, such as the youth in each Pathway designation) was complex and labor-intensive.

OPP’s Pathways

Pathways provides a combination of case management and direct services to youth ages 14 to 24 in the Hartford area. Upon entering the program, participants are assigned to one of the following Pathways, based on their individual situation as assessed by a case manager (a “Youth Development Specialist”):

  • Pathway 1: Out of school, no diploma
  • Pathway 2: In school, more than two grades behind
  • Pathway 3: In school, on track (two or fewer grades behind)
  • Pathway 4a: Out of school, with diploma, working towards higher education
  • Pathway 4b: Out of school, with diploma, working towards employment

Each Pathway consists of a set of education and/or employment services provided by OPP. The services vary not only by Pathway but also by the age of the specific youth. For example, programming for younger participants in Pathway 1 is focused on getting them engaged and back in school. In contrast, programming for older youth in this same Pathway is focused on helping them attain a GED and long-term, full-time employment. Services (and Pathway designations) are also further customized at times to meet individual long-term educational and/or employment aspirations that the Youth Development Specialists define through their work with each youth.

OPP works with and provides support to youth for multiple years to help get and keep them on track to enter post-secondary education and/or obtain full-time, unsubsidized employment. Youth “graduate” into new Pathways as their circumstances change. The model’s flexibility and personalization allow OPP to serve a wide range of Hartford’s youth, but make it complex to administer and manage.

Finally, none of the beneficiary data could be integrated with either OPP’s financial or operational data, which were in turn separate from each other. OPP tracked financial information in a standard accounting system. It stored operational data in several places, with the data collection method varying by department. Some departments tracked “slots” (the estimated number of youth that could be served for any given service); others did not. Staff recorded their time in timesheets that mapped to neither slot information nor financial reporting.

As a result, OPP’s management team members were regularly frustrated in their efforts to use data to guide their work. They could not easily determine if they were making the best use of their resources to help as many youth as thoroughly as possible through the Pathways program. They could not quantify the cost of each type of service and youth served to ensure financial sustainability. And they could not predict with complete confidence the resources they would need to respond effectively to new growth opportunities. Data was meant to be a source of strategic value, but its limitations were constraining the success of the organization.

This frustration also existed because OPP’s managers had become accustomed to using data to drive decisions when they administered the Department of Labor’s Youth Opportunities grant. They had begun to build an organizational culture that relied on data—and they had started to develop discipline around asking what kinds of information they needed to gather. This history provided a good foundation for the hard work that lay ahead.

Key Questions

With the support of the Edna McConnell Clark Foundation, OPP engaged with the Bridgespan Group over a period of 10 weeks to address what Rath and his team saw as both an obstacle and a significant opportunity: unlocking the value of data in managing the Pathways program.

To begin the process, Rath assembled a working team consisting of all of his senior management, including program, HR, finance, and fundraising staff. Together with Bridgespan, this group identified four key questions that would frame their work:

  • What decisions did OPP management want data to better inform?
  • Which pieces of data were required to support these decisions?
  • What was the most effective and efficient process to supply this data in a timely manner?
  • How would OPP’s culture need to change in order to make using data a consistent and fundamental part of decision-making?

What Decisions Did OPP Want Data to Better Inform?

In several group sessions and in individual interviews, the OPP working team and Bridgespan staff identified places where data limitations were constraining decision-making. It quickly became clear that while each management team member had unique decisions to make, all were suffering from a deficit of data.

For example:

  • Hector Rivera, director of Youth Development Services, knew that the youth served by Pathways were achieving positive short-term and long-term educational and employment outcomes. But he did not know how well these youth were adhering to their prescribed set of services. Was OPP assigning them to the right Pathway? Were youth being enrolled and then attending the services aligned with their goals? Was Pathways succeeding because its design was being implemented well or for other reasons? Could OPP further improve performance? Rivera wanted a way to assess program fidelity quickly and regularly, without having to audit each youth’s case file.
  • Educational Coordinator Lucy Carmona wanted to know if Pathways staff members were allocated optimally, so that OPP was offering the maximum number of slots for each service provided. She also wanted to be able to re-allocate staff in a timely manner—from general tutoring to SAT preparation, for example—as the needs of the youth population shifted. However, she did not have clear and timely information on the aggregate demand for each type of service. How well were resources being allocated to meet youth needs? Was OPP allocating resources to best respond to changing needs?
  • Vice President of Administration Lisa Mottola needed better data to help her track the costs associated with Pathways. Lacking the ability to integrate the program’s financials with its beneficiary data, she was unable to tell if certain groups of beneficiaries were more expensive to serve than others. As a result, it was hard for her to predict how changes in the organization’s mix of participants (for example, shifting towards higher-need youth) would affect its financial position. Was the program model remaining within target cost levels? How were changes in the composition of its youth beneficiaries altering financial needs over time?
  • Marie McFadden, manager of Development and Sustainability, was concerned about whether OPP was accurately pricing its services. McFadden was responsible for seeking new contracts and responding to interest from government agencies and other entities. However, it was difficult to pursue growth without a clear understanding of the cost required to serve different youth segments (based on Pathway and/or age). Were the proposals OPP was making and responding to structured to cover the true costs of serving different populations? When should OPP accept or decline a proposal for growth based on financial sustainability?

Based on these observations, OPP’s management team ultimately identified that a successful project would allow the organization to better use data to inform decisions (and actions) in three areas:

  • Program fidelity and performance;
  • Efficient resource allocation;
  • Clarity in costs to promote financial sustainability and to better pursue growth.

Rivera explained, “It was clear to us that we were doing good work and having positive outcomes for youth. But we needed to be able to prove it—to ourselves and to our funders.”

What Data Did OPP Need to Support Decisions in These Areas?

The project team next turned to the question of what data would actually be required. The goal was to sort through OPP’s existing data and determine what they could use, what was missing, and how they could combine the data into an informative and user-friendly set of metrics.

They recognized that in some ways, this task was far more difficult than identifying the decisions. They already had a lot of data; collecting additional data, as well as combining pieces of data into meaningful metrics, could easily become overwhelming in terms of staff time. As a result, the team decided to be extremely disciplined in identifying only the data that would be most meaningful. Less time collecting and processing data, with a focus on what was most important, would mean more time for interpretation and discussion.

The team also recognized that each decision would require a different set of metrics created from the data pool. Consider the issue of funding. Vice President of Administration Mottola needed to know that OPP was raising sufficient funds to meet the needs of its beneficiaries. To compile this metric, she would need to collect and integrate data from multiple sources:

Beneficiary data inputs:
  • Which individual youth were in each Pathway
  • Which specific services each youth was receiving, and in what dosage
  • The total number of youth in each Pathway and their aggregate consumption of each service

Operational/staff data inputs:
  • Staffing allocations by timesheet to provide each service
  • Number of slots provided for each type of service

Financial data inputs:
  • Staff salaries
  • Non-staff direct costs for each service
  • Allocation of general & administrative costs to each service

Manager of Development and Sustainability McFadden also needed to draw from these data sources—to price out OPP’s services for government requests for proposals (RFPs) accurately and to respond to queries from other potential funders—but some of her required metrics were different. She needed to calculate the cost of providing specific services (such as case management or youth employment services) and combine it with various intermediate and long-term outcomes to quantify the cost of achieving success for youth with different needs.

While developing these metrics, the team was acutely aware of the effort it would take to gather the data. Based on their goal to keep things as streamlined as possible, they eliminated several metrics deemed desirable but not necessary. At the same time, they weighed the cost of not having some core pieces of information. To continue with the earlier example, OPP certainly could save itself considerable effort by simply looking at the average cost per youth served across all Pathways. However, the team knew (but could not quantify) that the cost to serve youth in different Pathways varied significantly. If, as an example, the organization responded to a request to work with youth coming out of the juvenile justice system, most of whom would be in the intensive Pathway 1, and based its per-youth cost on the overall average, OPP would be under-funding its services significantly.
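
To make the pricing risk concrete, here is a small worked calculation. The per-Pathway costs and enrollment counts below are invented for the example, not OPP figures; the point is simply how pricing a Pathway 1-heavy contract at the blended average leaves a substantial funding gap.

```python
# Illustrative only: per-Pathway costs and enrollment counts are invented, not OPP data.
pathway_cost = {           # annual cost to serve one youth, by Pathway
    "Pathway 1": 9_000,    # out of school, no diploma -- the most intensive services
    "Pathway 2": 5_000,
    "Pathway 3": 3_500,
    "Pathway 4": 4_000,
}
enrollment = {"Pathway 1": 100, "Pathway 2": 150, "Pathway 3": 200, "Pathway 4": 50}

total_cost = sum(pathway_cost[p] * n for p, n in enrollment.items())
blended_average = total_cost / sum(enrollment.values())

# A hypothetical contract for 40 juvenile-justice youth, nearly all entering Pathway 1:
priced_at_average = 40 * blended_average
true_cost = 40 * pathway_cost["Pathway 1"]

print(f"Blended average cost per youth:  ${blended_average:,.0f}")
print(f"Contract priced at the average:  ${priced_at_average:,.0f}")
print(f"True cost at the Pathway 1 rate: ${true_cost:,.0f}")
print(f"Funding shortfall:               ${true_cost - priced_at_average:,.0f}")
```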

How to Collect and Share the Data: Making Apples Talk to Oranges

The metrics now clarified, the team examined where to find the necessary data. They quickly realized that most of it was already available. Individual beneficiary data (e.g., which youth were assigned to which Pathways, and what services they were receiving) was in the Pathways beneficiary database. Basic financial data was in OPP’s accounting system. How staff were allocating their time by service resided in timesheets. There was only one new piece of data to collect: a consistent estimate of available slots for each service.

The big challenge, then, was developing a structure in which the data could be joined and made to “speak together” efficiently. None of the existing systems could be adapted to take data from the others, yet each provided vital pieces of the puzzle. The team needed a separate system that could serve as a Rosetta Stone, translating all of the data from these disparate sources into a single meaningful set of metrics. The system would need to make it easy for staff to collect and analyze the data. It would also have to allow for reporting the data in an accessible and user-friendly manner. After some research, the team determined that this new system, while complicated, could be built and hosted in Microsoft Excel.
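
OPP built its translation layer in Excel, but the underlying join logic is straightforward. The sketch below illustrates it in Python under simplified assumptions; the record structures, field names, and figures are invented for the example rather than drawn from OPP’s actual systems.

```python
# A minimal sketch of the "Rosetta Stone" join; OPP's actual model lived in Excel,
# and the field names and figures here are invented for illustration.
from collections import defaultdict

# From the beneficiary system: which youth are in which Pathway, using which services.
enrollments = [
    {"youth_id": 1, "pathway": "Pathway 1", "service": "GED prep"},
    {"youth_id": 2, "pathway": "Pathway 3", "service": "Tutoring"},
    {"youth_id": 3, "pathway": "Pathway 1", "service": "Case management"},
]

# From timesheets: staff hours charged to each service.
hours_by_service = {"GED prep": 120, "Tutoring": 80, "Case management": 200}

# From the accounting system: loaded hourly staff cost and non-staff direct costs.
hourly_staff_cost = 35.0
direct_costs = {"GED prep": 1_500, "Tutoring": 600, "Case management": 900}

# Total cost of delivering each service.
service_cost = {
    svc: hours * hourly_staff_cost + direct_costs[svc]
    for svc, hours in hours_by_service.items()
}

# Allocate each service's cost evenly across the youth using it, rolled up by Pathway.
cost_by_pathway = defaultdict(float)
youth_by_pathway = defaultdict(set)
for row in enrollments:
    users = sum(1 for r in enrollments if r["service"] == row["service"])
    cost_by_pathway[row["pathway"]] += service_cost[row["service"]] / users
    youth_by_pathway[row["pathway"]].add(row["youth_id"])

for pathway in sorted(cost_by_pathway):
    per_youth = cost_by_pathway[pathway] / len(youth_by_pathway[pathway])
    print(f"{pathway}: total ${cost_by_pathway[pathway]:,.0f}, per youth ${per_youth:,.0f}")
```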

At this point, the project took on a more tactical focus. The first step towards developing the new system was to create a set of audience-specific “dashboards.” Each dashboard would be populated with a custom subset of the metrics—just the ones that were relevant to its audience’s particular decision-making needs—to avoid overwhelming individuals with unnecessary information.

Complementing and unifying all of the dashboards would be a summary version that the entire management team could review. This dashboard would provide a common high-level summary of the entire Pathways program, as well as OPP’s WIA/JFES contract and several other smaller programs. (See Appendix A for an illustrative version of the Pathways summary dashboard.)

Even with a separate system in place to combine all the data, the team members knew that they would still have to make some adjustments within existing systems. For example, based on how they wanted to segment and display information on services, staff, and different types of youth, they would need to adjust the general ledger coding in the accounting system and the categories staff used to fill in their timesheets so that they were consistent with each other and with the way youth beneficiaries and their program participation were tracked. The project team also worked with the vendor of the Pathways beneficiary tracking system to design custom reports that fed easily into the dashboards.
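
The alignment the team needed amounts to a shared category list: timesheet codes, general ledger accounts, and beneficiary service records all have to map onto the same set of services before they can be joined. The sketch below illustrates that idea with invented codes; it is not OPP’s actual chart of accounts or timesheet scheme.

```python
# Hypothetical alignment of category codes across systems: each system's codes must
# map onto one canonical service list so the dashboard model can join them.
CANONICAL_SERVICES = {"Case management", "GED prep", "Tutoring", "Job training"}

timesheet_codes = {        # timesheet category -> canonical service (invented codes)
    "CM": "Case management",
    "GED": "GED prep",
    "TUT": "Tutoring",
    "JOBTR": "Job training",
}
general_ledger_codes = {   # GL account -> canonical service (invented codes)
    "6100": "Case management",
    "6200": "GED prep",
    "6300": "Tutoring",
}

def unmapped(mapping):
    """Services with no code in a given system -- gaps that would break the join."""
    return CANONICAL_SERVICES - set(mapping.values())

for name, mapping in [("timesheets", timesheet_codes), ("general ledger", general_ledger_codes)]:
    missing = unmapped(mapping)
    print(f"{name}: " + (f"no code for {sorted(missing)}" if missing else "all services mapped"))
```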

Finally, they had to determine how often OPP staff should collect the data. Collecting too frequently would mean more time spent gathering data than using it, with diminishing returns; too infrequently, and the data would not be timely. They decided to collect most data on a monthly basis, a schedule that would also complement OPP’s preexisting monthly management meetings. A smaller group of key metrics related to youth educational outcomes (such as grade advancement, graduation, and college enrollment) would be collected annually, since these conditions only changed once per year.

To make the new system work, team members knew that they would have to reinforce in the minds of all staff the need to report data in an accurate and timely manner. They also determined that, to complement this collective reporting effort, one person would need to be responsible (and accountable) for (a) inputting the data into the new Excel model, (b) identifying any obvious data gaps or input mistakes from the raw data, and (c) providing the dashboard reports to the management team ahead of the monthly management meetings. In July 2007, they tapped Kimberly Williams-Rivera as their quality assurance director to handle this task.
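
As an illustration of step (b), a quality assurance reviewer can codify simple consistency rules to flag obvious input mistakes before the dashboards are populated, for example an older youth recorded in an in-school Pathway. The sketch below is hypothetical; the rule, threshold, and sample records are assumptions rather than OPP’s actual checks.

```python
# Hypothetical consistency check: flag youth whose recorded age conflicts with their Pathway.
# Pathways 2 and 3 are in-school designations, so an older youth recorded there most
# likely signals a data-entry mistake worth investigating.
IN_SCHOOL_PATHWAYS = {"Pathway 2", "Pathway 3"}
MAX_IN_SCHOOL_AGE = 19   # assumed cutoff for this illustration

def flag_suspect_records(records):
    """Return records to investigate before the dashboards are populated."""
    return [
        r for r in records
        if r["pathway"] in IN_SCHOOL_PATHWAYS and r["age"] > MAX_IN_SCHOOL_AGE
    ]

sample_records = [
    {"youth_id": 101, "age": 16, "pathway": "Pathway 2"},
    {"youth_id": 102, "age": 22, "pathway": "Pathway 3"},   # likely an input error
    {"youth_id": 103, "age": 19, "pathway": "Pathway 4b"},
]

for record in flag_suspect_records(sample_records):
    print(f"Check youth {record['youth_id']}: age {record['age']} in {record['pathway']}")
```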

Changing the Culture: Using Data to Inform Decisions

Having data is, of course, not the same as using data. Just as OPP staff were learning to “feed” the new dashboard system, OPP’s management would need to adapt to using the new level of information.

The management team recognized that greater transparency would be an adjustment. The dashboards would surface both positive and negative insights, and using them well would require an environment of trust and candor.

As Williams-Rivera explained, “Data by itself does not solve problems. It fosters discussions around possible solutions. Data provides focused discussions versus general undocumented assumptions that could be way off base. Our goal was not to single out people whose results were lower than others, but to learn from those with the best results and replicate their methodology throughout the group.”

They also understood that having good data would not necessarily always (or even frequently) translate directly into obvious decisions. The monthly dashboards could reveal where issues were arising, but management would need to dig deeper to fully understand why and what to do. However, the OPP team came to recognize that for many key issues, there were “likely suspects” to look at more deeply, depending on whether the performance being reported was good or bad. As a consequence, after going through the first full set of populated dashboards, the team added a set of prompting questions to each dashboard to encourage meaningful analysis and discussion, and to provide greater guidance on how to “peel the onion” in reaction to a dashboard’s content.

To help with the integration process, the team explicitly dedicated a sizable portion of each monthly management meeting to the dashboards. They would review the dashboards collectively and discuss the implications for each management team member’s area of operations.

A Progress Report: How Better Data Has Led to Better Decisions, and Lessons Learned

After four months of using the dashboards regularly in management meetings, Rath and his team reported seeing gains in five areas:

  1. Compliance and proficiency in inputting accurate data
    The dashboard system had greatly increased management’s ability to foster staff compliance and proficiency with the beneficiary management system. Errors had become so obvious that they could be identified, investigated, and corrected, benefiting data integrity as well as helping to pinpoint which staff needed additional training and reinforcement. As Director of Youth Development Services Rivera explained, “When you see three 22-year-old youth designated in Pathways 3, meaning that they are in high school, you can tell that there is either a big problem in assessing youth needs, or, more likely, that the staff has made a mistake in inputting data. Either way, the dashboard system allows us to identify and correct the issue.”

    The dashboard helped Rivera make better use of his time with front-line staff, and helped front-line staff to see clearly the connection between the accuracy of the data and OPP’s ability to serve clients appropriately. (See Appendix B for an illustrative version of the Pathways summary dashboard on youth enrollment.)
  2. Programmatic changes

    There had been multiple instances of dashboard insights resulting in decisions to make programmatic changes, large and small. Two of the biggest were:
    • OPP management discovered that Youth Development Specialists (YDSs) were spending more time with youth who were in school (Pathways 2 and 3) versus those who were out of school without a diploma (Pathway 1). They wanted to see just the opposite, given that out-of-school youth require greater support. With this insight in hand, they opened a dialogue with the YDSs to better understand the cause (it was in part because in-school youth are easier to reach). That dialogue reinforced management’s expectation of how staff should allocate their time, and also prompted OPP to reallocate case loads to balance the distribution of high- and low-need youth each YDS serves. As a result of these efforts, OPP saw a significant shift in hours towards high-need youth.
    • OPP management was surprised to find that very few of the older Pathway 1 youth were availing themselves of the educational services geared to help them reengage with school or achieve a GED. Once aware of this issue, they investigated and identified the main cause: Many of the participants were holding down jobs, and the times when the educational services were offered conflicted with their work schedules. Armed with this knowledge, OPP changed its service offering to accommodate their schedules.
  3. Allocation of resources based on cost

    By being able to combine staff timesheets, accounting data, and the number of youth in each program, OPP management had identified significant variation in the cost of serving youth in two different youth-employment programs. OPP looked deeper into program completion and skill attainment and concluded that the additional cost was not warranted. As a result, OPP has begun to transition resources to the more cost-efficient of the two, allowing OPP to serve more youth within the same budget. OPP also has used this insight to guide future growth plans, deciding to emphasize growth of the more cost-efficient program.
  4. Budgeting

    The budgeting process had become more detailed and more explicitly connected to the nature of the services OPP was providing. Team members had previously based their budgets on the general costs incurred by various departments. Once the group began to track key service categories, however, it was possible to carry that focus through to budgeting. For example, instead of recording all costs associated with employment services into one general employment cost category, OPP can now differentiate the costs to provide job training for older youth from the costs to provide internship experiences for younger youth. As Vice President of Administration Mottola noted, “This will allow us to forecast youth needs more accurately, as individuals in the program progress toward long-term goals. Budgeting in this way adds another level of complexity to financial reporting and tracking, but our budget now provides better information on how we do business and where we may need to concentrate more dollars in the future depending on the number of youth expected to receive services within each pipeline each year.”
  5. Growth

    The dashboard reporting had become a competitive advantage in securing funding and winning contracts. Being able to demonstrate rigorous management of its program, finances, and outcomes had helped the organization build credibility with funders. And the ability to isolate the cost of delivering each program and serving specific youth segments had allowed OPP to engage deeply with funders interested in supporting specific types of services (such as just education or just employment) or specific youth segments. The increased clarity also had allowed OPP to decline new sources of funding that did not cover the full cost of the commitment. As Manager of Development and Sustainability McFadden noted, “The City of Hartford and the state agencies have increasingly required ‘results-based accountability’ to win and maintain service contracts. Foundations and private donors are also better informed and want more information about the effectiveness of their investments. OPP is making significant strides in providing results-based accountability to its funders and is positioned for growth based on an accurate understanding of the needs of Hartford’s youth and the resources it takes to achieve those results.”

In reflecting on this process overall, Bob Rath and his team report that they have learned a lot about how to implement this type of system. Among their key insights on how to manage such an undertaking:

Data integration takes time

Combining data from three separate systems took longer than expected. The process of meshing the internal financial chart of accounts and staff timesheets with the new dashboard system took several months, and OPP’s beneficiary tracking system vendor required as much time to automate the customized reporting. Mottola reflected that, “Understandably when approaching a project of this magnitude, time was going to play a major factor. It was clear that this was not something that could be accomplished within a few weeks considering other responsibilities that took priority on occasion. It is also extremely important to take as much time needed on the front end to make sure you are capturing and reporting exactly what you want, to ultimately save you time in the long run. Now that the [dashboard] model has been created, tested, and refined using all the required data from the various systems, it takes less than 15 minutes to populate and to generate the monthly reports.”

A dashboard is only as good as its data

Getting staff to adopt a new information management system also took time. Initially, data-entry compliance was low in some areas. Some staff were uncomfortable with the new methods, and that discomfort made reporting a lower priority. Once staff did begin to adopt the new system, learning to use it well required even more time. Quality Assurance Director Williams-Rivera estimates that staff require six to 12 months to take ownership of a new system, and another six to 12 months to gain true proficiency. Mastery takes even longer.

It’s important to establish explicit ownership of the dashboards

Having one person (OPP’s quality assurance director) directly responsible for both overseeing staff’s faithful data entry and pulling the data into the dashboards was critical to making the new system work. As Rath noted, “If we hadn’t put a quality assurance director in place, this wouldn’t have happened. You can’t implement this type of system without a staff resource.”

Taking stock of the entire experience, Rath said, “We are strengthening our capacity and building a data-driven learning organization. This is a significant step forward for us, and we would encourage other organizations to do the same. Our message for them, in a nutshell, would be this: A simple flip of the switch solution is not possible. Only a journey focused on clear-headed results with relentless open-minded attention to detail and the human beings driving results will work.”

Appendix A: Overview of OPP's monthly enrollment and financial status


Questions to ask when reviewing this dashboard:

Is "people served" above/below reforecast?

If above:

  1. Were drop-outs/completions lower than expected?
  2. Did additional budget/funds allow for more people served?

If below:

  1. Was intake/recruitment low? Are there people on the wait list?
  2. Was turnover higher than expected?

Is total cost above/below reforecast?

If above:

  1. Was consumption of services higher than expected?
  2. Did OPP serve more people than expected?
  3. Were there other new costs?


If below:

  1. Did OPP serve fewer people than expected?
  2. Were there unexpected staff departures and/or other cost reductions?

Is cost per person served above or below reforecast?

If above:

  1. Was consumption of services higher than expected?
  2. Did OPP serve fewer people than expected?
  3. Were there other new costs?

If below:

  1. Did OPP serve more people than expected?
  2. Were there unexpected staff departures and/or other cost reductions?
  3. Were expected services not provided?

Overlap with Pathways

  1. What is OPP's expected (or target) overlap between Pathways and other programs?
  2. Is OPP reaching those levels?
  3. If not, are there open slots that can be used to drive additional overlap?

Appendix B: Overview of OPP's youth population by pathway


Questions to ask when reviewing this dashboard:

Is there a surprising concentration in one Pathway, age, grade, gender, or ethnicity?

Are there any youth in the wrong Pathways based on their age or grade?

Sources Used For This Article:


[1] For additional information on this strategic decision, please see www.bridgespan.org for the Bridgespan case study, “Our Piece of the Pie: Making the Biggest Difference in Hartford,” published in April 2006.



This work by The Bridgespan Group is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Permissions beyond the scope of this license may be available at Bridgespan's Terms of Use page.
