Data Optimization

Administrative Data in Health and Human Services – Episode 3

September 2019

Our third installment of the podcast completes this series inspired by the Pew Charitable Trusts report on how states use data to inform decisions. We talk again with Amber Ivy, one of the researchers who contributed to the Pew report, and we are also joined by Susan Smith, who has worked with the Connecticut Department of Children and Families for almost 24 years and currently serves as the agency's Director of the Office for Research and Evaluation.

Episode 3

Listen on Spotify    Listen on iHeartRadio

Click here to listen to the full interview on SoundCloud.


Edited excerpts of the interview are available in the transcript below.

The Pew Charitable Trusts commissioned a report on how states use data after finding that, while states traditionally collected data for compliance purposes, there wasn't a clear understanding of what else they were doing with it. Pew assembled a team that interviewed over 350 government leaders and leaders in the health and human services field across all 50 states and the District of Columbia. Researcher Amber Ivy shares, "What we found was that outside of reporting for annual spending, performance measurement, and transparency, states are using data in a few key areas. This includes crafting policy responses to complex problems and improving service delivery. For example, Missouri used data to identify individuals who frequently use the emergency room and helped them get services and coordinated care. Another example is managing existing resources. Delaware used GPS devices to understand how people were using their cars, and by looking at its fleet and managing those resources it was able to reduce miles driven and save $1 million over a four-year period. Examining policy and program effectiveness is another use. The Washington, DC lab that runs randomized controlled trials for District projects used data to look at participation in summer youth programs and other area activities."

These states were going beyond just the metrics and using data in innovative ways. Pew researchers wanted to understand what actions helped them use the data more effectively. One of those actions, which we will examine today, is building the state's capacity to use data.

------
We are joined by Susan Smith, the Connecticut Department of Children and Families' Director of the Office for Research and Evaluation.

Connecticut has a long history of building capacity in analytics – what are some of your lessons learned and how have you kept people engaged over time?

Building capacity is an iterative process in which teams build on the agency's body of work and try to stay aligned with its mission and priorities. Smith shares, "We try to find interesting and diverse ways to 'hide the vegetables.' By this we mean making analytics relevant to staff at all levels. We transmit in ways that are conversational, hands-on, and digestible. We often hear from our team members, 'I'm a social worker because I don't do data.' We try to tie the data to the critical inquiries they have during an investigation and diagnosis, finding correlates and parallels so it is accessible and folks feel less intimidated by it."

Smith's team used learning forums where they convened staff at different levels and discussed information gleaned from qualitative reviews. They held fun events like "Pi Day," where they used the mathematical constant to celebrate data. They integrated staff into the work and shared some of the things they had accomplished. Smith was thoughtful about the communication tools chosen. The team used brief one-page documents called QTips to provide an overview of a data analysis they had done and apply it to the practical work the staff was involved in. With "data flashes," they announced new data reports with brief information, such as updates to a dashboard, an explanation of what changed, and contact details. "As a countdown to Pi Day, we released data tips, mocked up to look like the recipe cards your grandmother used to use. We included small items like how to run something in Excel, for example how to figure out a child's age from a date of birth or how to calculate a percentage change. We try to make it fun but also relevant to the staff and use it to drive outcomes."
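To give a sense of the kind of calculations those data tips covered, here is a minimal sketch in Python; the agency's tips were written for Excel, and the dates and counts below are purely hypothetical examples, not agency data.

# A minimal sketch of the calculations mentioned in the data tips
# (the agency's tips were for Excel; this is an equivalent in Python).
# The dates and counts below are hypothetical.
from datetime import date

def age_in_years(date_of_birth: date, as_of: date) -> int:
    """Whole years between a date of birth and a reference date."""
    years = as_of.year - date_of_birth.year
    # Subtract one year if the birthday has not yet occurred this year.
    if (as_of.month, as_of.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

def percent_change(old: float, new: float) -> float:
    """Percentage change from an old value to a new value."""
    return (new - old) / old * 100

print(age_in_years(date(2012, 6, 15), date.today()))   # a child's age today
print(percent_change(240, 210))                        # e.g., -12.5% change in a caseload count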

What kind of training do you provide for the Department of Children and Families (how is it structured and what are people able to do after the training)? How do you structure training to use quality assurance and data to promote racial justice?

Data is infused throughout all of Connecticut's training. Through a variety of training activities, new employees become familiar with the agency's vernacular and the outcomes it strives to achieve. Some of the training is straightforward, simply ensuring new staff know where to access the data, the key places to find items, and the tools and reports available to them. Smith says, "We do a lot of ad hoc training, trying to make sure our outreach is hands-on and fun. We create custom training that can be a mix of concrete and soft skills, like how to create formulas or graphs. We examine how you look at data and interpret it, and how you represent your data in a results-based accountability framework." Smith's Connecticut team has also participated in sustained professional development opportunities, longer exposures over several months, including a Data Leadership Academy. All training ties to change initiatives with analytic components, and Smith's team is thoughtful about conveyances like the QTips, using them to elevate access to and understanding of data.

Smith says, "One of the recent data tips we used was to have a statistician build a four groups comparison calculator. As we start to talk about disparity and disproportionality we also want to look at statistical significance. The calculator introduces that. If you want to know whether children of color have a higher or lower permanency rate you can use the calculator to determine that. Racial justice is a real commitment and priority for us and we have imbedded it in all of our data work." When users access the Connecticut data boards, race and ethnicity is a standard element. When information and datasets are presented to the public, race and ethnicity is a component people can easily filter and view. They have a suite of racial justice dashboards and created a racial justice pathway data chart which is a series of stacked tables that allow users to look at disparity and disproportionality across child protection decision making. "That is one of our sentinel reports used frequently and it is a grounding report. We find that results-based accountability work always includes the question "is anyone better off?' We can look at whether our services are also equitable. Are children of color faring as well as other populations? We know children of color are often overrepresented. As analytics become more prominent, we are thoughtful about the data ethics and dignity. We want to use policies that make us pump the brakes, to make sure we are not inadvertently contributing to any disproportionality or disparity."

I've frequently heard from our state and county membership that one of the problems their agencies face when trying to use analytics is that IT and business folks don't always know how to communicate. But in Connecticut, the Quality and Planning Division works closely with your Information Systems (IS) Division. How do business and IT communicate, and can you tell us a little bit more about the dashboards those teams have created for staff?

Smith says, "We are fortunate to have a great relationship with our IS folks. We have a great deal of mutuality and are in the midst of updating our child welfare information system. They have brought in staff at all levels to help inform that and we are in constant contact with IS folks. It's been a tremendously collaborative process." The agency has myriad data products, with the dashboard users being able to determine foster care info, numbers of kids in placement, permanency; whether supervision is occurring within policy expectations, caseload sizes, teaming activities, risk management, arrests or fatalities. The mechanism used to set them up works so that people can easily access information that is of interest to them. They can filter by region, manager level, staff level, supervisor level and time frames.

"It boils down to neither side being territorial. We all work for the benefit of the kids and families we serve and the staff and agencies. Neither of us treat the analytics or reports as a hot potato -- we put on our oven mitts and try to cook together! Our IS team has really have done a great job of trying to understand what we do on a given day, thinking about how that translates into what the interface needs to be to better capture that information."

This episode and more are available on our podcast page. Go to Episode 1. Go to Episode 2.


