Data Optimization

Administrative Data in Health and Human Services – Episode 2

September 2019

We continue our podcast series inspired by the Pew Charitable Trusts report about how states use data to inform decisions. We talk again with Amber Ivy, one of the researchers who contributed to the Pew report, as well as with Lily Alpert, Senior Researcher at Chapin Hall at the University of Chicago, and Britany Binkowski, who served at the Tennessee Department of Children's Services and is now with Youth Villages.

Episode 2

Listen on Spotify    Listen on iHeartRadio

Click here to listen to the full interview on SoundCloud.


Edited excerpts of the interview are available in the transcript below.

Released in February 2018, the Pew Charitable Trusts report, How States Use Data to Inform Decisions, was the first of its kind; no comprehensive report had previously focused on how states use the data they gather. The research project team interviewed more than 350 people across public policy areas, including health and human services leaders. Amber Ivy, one of the researchers, shares, "We discovered strategic ways that states used the data they collected—crafting policy responses to complex problems, improving service delivery, managing existing resources, and examining policy and program effectiveness. We also discovered challenges for states, including staffing, training, data accessibility, lack of permissions, data quality, having the wrong data, and willingness to share data." The report identified five key actions for using data in more strategic ways: 1) planning and setting up guiding structures and goals; 2) building stakeholders' capacity to use data, including through partnerships with universities and the private sector; 3) ensuring quality data is accessible to stakeholders; 4) analyzing data to create meaningful information and learn new things; and 5) sustaining support for data efforts over time. Ivy says, "Beyond just collecting data, you also need to take that data and make it into a story. For example, instead of just showing the number of deaths in the opioid crisis, share what is happening in the community. Finally, we need to look at how we sustain the support for continued efforts. We need to connect to leaders on the ground, look at crafting legislation, and consider how to sustain our efforts from administration to administration."

------
We are joined by Lily Alpert, Senior Researcher at Chapin Hall at the University of Chicago.

You have a curriculum related to this at the University of Chicago. Can you tell us about that? What are a couple of the things you teach in your classes?

The University of Chicago curriculum is called EDGE, which stands for "Evidence Driven Growth and Excellence." The goal of the curriculum is to build skills in evidence-based decision making and, specifically, in best practices in performance measurement and in how to transform administrative data into evidence about performance. Alpert shares, "The curriculum revolves around a core set of priority concepts. First, the process of improvement starts with a question. We begin by addressing what kinds of questions fuel the CQI [continuous quality improvement] process, then how to use best practices in measurement to answer those questions correctly. We consider how to become a savvy generator, interpreter, and applier of evidence about system performance. Answering questions correctly means that your method is reliable, valid, and representative." EDGE builds skills so people know how to generate answers to performance questions, how to avoid traps that lead to biased information, and how to consume analytic results. When answered correctly, actionable performance questions yield the evidence that fuels the CQI process.

EDGE relies on a four-part template for change: 1) observe that there is a problem that needs improvement; 2) determine that it stems from a specific cause; 3) plan to do something different, creating an intervention that represents a change; and 4) look for improvement in the desired outcome. Alpert says, "EDGE teaches that when claims and observations are supported by evidence, you have a defensible and testable series of changes and an improvement plan with legs. Students work hands-on with relevant data from their own foster care system. The course involved four months of learning hands-on skills and four months of project work on local foster care challenges, where students focused on the performance of the foster care system and improving outcomes."

------
We are now joined by Britany Binkowski who served at the Tennessee Department of Children's Services, most recently as Special Assistant to the Commissioner, and is now the Director of New Allies at Youth Villages.

The Tennessee Department of Children's Services reached out to Chapin Hall for the development of the EDGE system. They already had significant amounts of data of varying quality and felt it was important to develop the internal capacity to make good judgments about which of their information was the most reliable. Binkowski shares, "It was important to have people equipped to interpret the data we had and to develop well-designed strategies to respond to what we learned. The program requires a commitment from many levels of leadership, and we saw this as an investment in our leaders and management. We made it a priority to find the right people for the class. We started before implementation by meeting with regional leaders and discussing what it would take to succeed. We reviewed applications carefully to ensure we had the right students before they were admitted. We wanted students who would be powerful decision makers at the regional level and be able to influence the change process. Attending was a sacrifice and was hard for both students and leadership, so it was important that they had support from their local teams. As we became established and had more EDGE graduates, the process became easier."

How can child welfare agencies put evidence-use training into action in the field (i.e., how can they use what they have learned)? How do you incentivize evidence-based decision making in an agency juggling many other priorities?

Lily Alpert relays that one of the most powerful lessons is the importance of strong agency leadership. "We can teach skills and provide an opportunity to practice, yet unless an agency calls on its staff to execute their new skills and creates an environment that supports that work -- there's a good chance people will stick with their old habits. Evidence generated according to best practices will sometimes contradict your working knowledge. It may fly in the face of what you think is true about the performance of your own organization. Leaders have to be willing to introduce new knowledge -- even when it may counter investments that have already been made."

Britany Binkowski shares that Tennessee's commitment to integrating the principles of EDGE went beyond supporting students; it extended to changes in management activities that reinforced the value of using evidence to make decisions. "One thing we did was shift the metrics we regularly measured regional leadership on, setting strategic priorities for each administrative region. Instead of the standard approach of asking every region of the state to improve 5%, we used the valid and reliable measurements that EDGE had taught us to tailor expectations for each region, responding to the evidence of where each had the most room to improve. Regional leadership and staff saw us basing our expectations of them on evidence about what they were doing, where they had strong practice, and where they had opportunity to improve."

EDGE also helped shape potential investments. Previously, investments had not been reviewed to determine whether they responded to an identified need in the system. A strategic alignment review team was assigned to review all proposals for new initiatives against evidence about agency performance, which modeled expectations about how decisions should be made. Staff were also required to generate evidence about their proposals, which reinforced their skills. Thoughtful evaluation plans were a powerful reminder to staff that a complete CQI process requires testing, monitoring results, and making adjustments. "When it's done well, evidence use doesn't become just another thing we do; rather, it becomes a foundational element of how we do what we do. It requires champions and advocates who are relentless about demanding that evidence be brought to the table. Then the need becomes less explicit -- we find that people begin to expect evidence on the table every time they are committing resources and developing strategies to improve practice at the agency level."

This episode and more are available on our podcast page. Go to Episode 1. Go to Episode 3.
