Tell Us About It: Victim Research Convos

In this CVR podcast series, we talk with those doing research and serving victims and learn about the work they've done together.

Tell Us About It, Episode 21: Measuring Outcomes in Children’s Advocacy

A convo with Kaitlin Lounsbury · Sep 06 · Time: 25:58

Ways to Listen:

  • Listen on Apple Podcasts
  • Listen on Soundcloud
  • Listen on Spotify

On this episode of Tell Us About It, we talk with Kaitlin Lounsbury of the National Children’s Alliance (NCA), the membership association and accrediting body for Children’s Advocacy Centers (CACs). She talks about the development and implementation of NCA’s Outcome Measurement System (OMS), including the challenges of bringing the nation’s 900+ CACs into the system and the value of OMS at the local, state, and national levels.

Kaitlin Lounsbury is the Program Evaluation Coordinator at the National Children’s Alliance, overseeing programs to measure the impact of Children’s Advocacy Centers, including the Outcome Measurement System (OMS) to collect essential feedback from families and team members serving child victims of abuse.

Transcript:

Susan Howley: Welcome to Tell Us About It: Victim Research Convos, a podcast from the Center for Victim Research with support from the Office for Victims of Crime. On each episode of Tell Us About It, we talk to researchers and practitioners about their work, the tools being built for use in the field, and how we can work together to build an evidence base for crime victim services. I’m Susan Howley and today we’re talking with Kaitlin Lounsbury of the National Children’s Alliance. Kaitlin, welcome and can you tell us briefly about the NCA and your role there?

Kaitlin Lounsbury: Hi Susan, thanks for having me. So the National Children’s Alliance is the membership association and accrediting body for Children’s Advocacy Centers. CACs are places where child victims of abuse and their families receive help from a multidisciplinary team of investigative and treatment professionals that all work together to investigate the abuse, help the children and families heal, and hold offenders accountable. And we have almost 900 CAC members working in their local communities across all 50 states now. My role as Program Evaluation Coordinator is to work with our members and help them measure this work that they’re doing and then NCA, in turn, uses this national data to track trends in our field and determine the types of support we can offer that will have the biggest impact.

Susan Howley: So Kaitlin, why did NCA decide to tackle the issue of outcome measurement for CACs?

Kaitlin Lounsbury: Well, every five years, we update our strategic plan based on feedback from our members and our partners. And in 2010, the CACs overwhelmingly told us that they wanted a way to measure outcomes directly from the people they serve, specifically the families and team members. They all wanted to measure the same outcomes so they could benchmark their performance at state and national levels. At the time, many Centers were trying to measure outcomes at a local level, but since those measures weren't research-based or standardized, they couldn't really tell how they were doing compared to the larger field. The state of Texas had recognized that in particular and developed the first outcome measurement system, or OMS, for Children's Advocacy Centers in 2009, and at that time OMS replaced more than 50 tools that were being used in that one state alone. So you can imagine how many different outcomes CACs were measuring across the country. Seeing the success of OMS in Texas and the demand from members in our strategic plan, NCA entered into an agreement to purchase OMS from Texas, and additional states began offering the surveys in 2012.

Susan Howley: So Kaitlin, I know a lot of people are interested in outcome measurement and as you said this demand came from the field. But once you started pulling together or using the Texas system as a basis and having a defined set that was going to apply to all CACs, did you have any challenges or any pushback as you tried to implement that?

Kaitlin Lounsbury: Well, like you said, because it came out of our strategic plan, we had support from the field, and we feel like that's a huge strength of doing our strategic planning in this way with our members, because we can base our work on what the field needs and wants, rather than just telling them what they should be doing. That being said, sometimes we did at that time, and still do, encounter CAC staff who are reluctant to approach families in particular for participation in the surveys. They may feel like it's just one more thing to burden the family at an already stressful time, or they might worry how it comes across to families. So to work on that, we focus a lot of our training, especially when we're implementing this program, on changing those perceptions – helping staff feel confident in why we're doing this and incorporating those surveys as part of standard practice, just like any other part of the CAC intervention. Although it's true that many Centers do use OMS results for external purposes, like giving data to funders or meeting requirements of NCA accreditation, the real goal is to help families. So we emphasize the real importance of asking families what's working well and what we could improve, and that OMS is really giving them a voice in this process, and that's going to help all future families coming to the CAC. So when staff understand that, they're usually pretty enthusiastic about measuring these outcomes. That being said, turnover in the CAC field, like any other human service organization, means that we're always training new people about the benefits of collecting feedback. It wasn't just a matter of convincing everyone once and being done; it's really an ongoing process.

Susan Howley: So with this ongoing process, how do you make sure that everyone is still interpreting the questions the same way? How do you connect with the people who are doing the actual data collection to be sure that what you’re collecting is high quality and uniform?

Kaitlin Lounsbury: Well, part of that is having a standardized system like OMS, where we've set the exact surveys and the questions they contain. Anytime we make revisions, we do it nationally and we pilot test those revisions for reliability, checking back against the original development process, which we'll talk about in a moment. But ultimately, by providing this tool as a national association, we are setting the standard for what types of questions should be asked. So even though there's a lot of flexibility in when during that process an individual Center can offer the survey or how they present it to a family member, the questions being asked are standardized across the Centers, and that way we know that they're all measuring the same outcomes, which is really what they wanted. They want the flexibility of being able to ask questions of their own, which we do allow in OMS, but there's that core set of questions that every family that walks into a CAC has an opportunity to give feedback on.

Susan Howley: How did you come up with that core set of questions? Did you just take what Texas had already developed, and if so do you know how they came up with a core set of questions?

Kaitlin Lounsbury: I can give some information about how Texas originally developed this, and then over the years we've made revisions and improvements to the tool as well, so I'm happy to talk about both. So as I mentioned, the Texas CACs did originally develop the first version of the OMS program, and they did that in collaboration with researchers at the RGK Center at the University of Texas at Austin. To do this, the researchers started by doing CAC site visits and focus groups with CAC directors so they could see in the real world what was most needed by these Centers and what their day-to-day experience was like. They asked the directors about current indicators they were already tracking, expectations they had from funders or partners, and any remaining areas they really wished they could track but maybe hadn't found a good way to do yet. The researchers also did a literature review on key aspects of the CAC model itself, as well as general research on outcome measurement systems in other fields, so they could anticipate the key components they'd need to include and the hurdles they might encounter along the way. This information from the CACs and that literature review helped to determine the topic areas that were most essential. And then to figure out the best way to ask questions, they also reviewed those existing tools I mentioned earlier that the Texas CACs were using, as well as national examples. And in fact, the National Institute of Justice had come out with a special report in 2004 called "A Resource for Evaluating Child Advocacy Centers," which was pretty timely given that this work to develop OMS ran from about 2006 to 2009. And so all of that came together to create a set of surveys for caregivers and team members based on the issues that were of most importance to the Centers and their stakeholders.
And then a diverse group of CACs across Texas piloted those surveys, and the results were tested for reliability. So most Centers in Texas were using OMS by the time of our 2010 strategic planning process, and we were already hearing a lot of positive feedback about the program. So when NCA started this process of adopting the OMS program, originally we used the survey tools as written, as they had already been developed, and for that first couple of years, we really saw it as a national pilot. We were offering the surveys entirely on paper, and results were being entered into spreadsheets that would then be aggregated at state and national levels. However, we quickly learned that there were some adaptations we needed to make, both in terms of the content of the questions and adapting to new technology. So in 2014, we did our first big round of revisions, and that was in collaboration with the Crimes Against Children Research Center at the University of New Hampshire. And then – again, listening to the needs of the field and adapting to current technology and the evolving needs of CACs as a whole – we revised the surveys yet again in 2018, which is the current version. The UNH team was again involved in that process, plus there was collaboration with a researcher from the University of Illinois at Urbana-Champaign as well. Every time we've made improvements to these surveys, we've focused on their reading ease. So right now the OMS surveys are at a fifth-grade reading level, which makes them actually quite a bit easier to read than a lot of standard government documents. And we've also made sure to adapt to those evolving needs. So if Centers are saying, we really need to ask about this, or this question has caused confusion, we look at how to make those revisions. We get together a large group of Centers to pilot them, retest for reliability, and so forth.
And so that's allowed us to adapt the surveys to meet those evolving needs of the field while keeping the surveys research-based. And ultimately, these questions all focus on aspects of the CAC process that are helping children and families heal, such as whether their questions were answered, whether they had access to resources, and any additional resources they feel the CAC should be offering. There are also questions for team members on those surveys, and those focus on whether they can collaborate as efficiently as possible on cases, such as whether they understand each other's roles and are really actively working together to share information.

Susan Howley: Oh that’s great. So you have both outcome measures for the end users and also the team members so that double layer is wonderful. Now you talked earlier about having a core set of questions and then allowing CACs to add some additional questions. And I know this is an issue with others who are trying to measure outcomes, that you don’t want to be overly burdensome but you want to allow some flexibility. So how did you decide what was going to be a core question and how hard was it to try to get that down as narrow as possible and still be useful?

Kaitlin Lounsbury: Right. Well, it's definitely a tricky balance. So we wanted to make a tool with a core set of items that everyone was measuring, knowing that some Centers would need to add additional questions because of a particular funder in their area, or because there's a program that's unique to their Center that they need to collect feedback on. We wanted OMS to be a tool where they could collect feedback on various aspects of their program, both what's central to every CAC and what might be unique to their needs. To do this, though, we needed to develop a tool that could work in the real world of CACs. This can be one of the most difficult times in a child's or caregiver's life, and team members are also stretched really thin with all their responsibilities on these cases. So we needed a tool that could collect as much information as possible without burdening families and teams. A big part of that was sorting out what types of measures could actually come from other sources. So maybe there were items where we thought, it would be great to know about this, and then we realized, that's a better question for an intake form, or this is really something that needs to be tracked for a specific family or team member, so it needs to be done in more of a conversational way rather than a standardized survey. So once we eliminated things that could come from other sources – like intake forms, case management systems, interviews, and so forth – we narrowed it down to just the information that needed to come directly from families and team members themselves on the surveys. And as a result, the surveys focus less on what was done in the course of a specific case and more on perceptions about how various parts of the CAC process impacted the families and team members. Were their questions answered? Did they feel supported? What remaining resources were needed?
And we also wanted to make sure that those surveys were anonymous so that families and team members could be as honest as possible, which meant that the questions couldn’t be too specific in a way that could identify a particular person, family, or case, which also helped us to narrow this down.

Susan Howley: Well, it sounds like this was a dream project. What were the challenges that came up as you tried to roll this out across 900 CACs?

Kaitlin Lounsbury: Well at the time, luckily, we were at around 600, so it was a little bit less, but it was still hundreds and hundreds of Centers all over the country. So when we first tried to offer this, we trained our state chapters – in all 50 states, there's a state chapter that helps to coordinate training and technical assistance at that level. And so we trained our state chapter leaders to provide the majority of training and technical assistance for the program at that time, which was back in around 2012. While there were definitely strengths to that approach, especially in terms of having a local support person, and we've kept aspects of it still, the chapters really did struggle to take on that level of responsibility. Some states just didn't have the resources to do this, so participation was limited to states that actually had that capacity. So if a Center was in a state where the chapter was still getting up and running and maybe only had a part-time person, they didn't necessarily have the opportunity to participate in the program. And given the frequent turnover at CACs, this also meant that training had to be repeated regularly. The chapter couldn't just do it once and say, OK, now I'm done. There was always new staff reaching out for support at existing Centers, never mind new Centers wishing to join. Plus, when OMS was first offered, as I mentioned earlier, it was entirely paper surveys with results recorded in spreadsheets. That was really difficult to scale up to hundreds of Centers, and the Centers wanted to use modern technology to collect and store the data electronically, so developing a system like that required a national leader. So to address all of those needs, we realized that we needed one point of contact at the national level who could provide consistent, dedicated support for the OMS program. So NCA created the OMS Coordinator position in 2014.
I had been interested based on my previous work on the 2014 revisions, when I worked in academic research at the University of New Hampshire, and I applied for that position. And given the overlap between OMS and so many other areas of program evaluation at CACs, the position has continued to evolve and take on additional programs over the past several years. But it really is still a core part of the work that I do every day to support our CACs and those state chapters, who are my backup in providing that local support.

Susan Howley: I want to go back briefly, because you reminded us that this started as a paper survey and moved to technology. What kind of technology do you use with the families or caregivers? Are they encouraged to sit down at a computer that's there at the CAC, or is it an app? What do you use?

Kaitlin Lounsbury: Sure. Well, there's actually a lot of variety, which I think is a big strength of the program and has really helped us to expand it. We now have over 800 of our 900 members using this program, and I think the flexibility of the technology is a huge part of why Centers are able to take this on. So they still have the option to collect surveys on paper, but they also have the option to have the survey open on a computer or tablet at the Center. They also have the option to send the survey home with a family member – giving a handout with the survey link, even a scannable QR code for someone to do on their cell phone, or sending the link by email or text message – and they can determine with the family, while they're on site, whether the family would like to do the survey on site or use one of those after-visit options. And then we also have a caregiver follow-up survey that can either be done on site, if they're coming back to the Center in that timeframe, or be sent out through one of those various options. So the way that we do this – we started off with one online system; that survey system actually was bought out by a competitor, so we needed to find a replacement, which could be a whole podcast in and of itself, navigating from one system to another. But the current system we use is Qualtrics, and it's a pretty popular survey system with a lot of flexibility in how we do this. And the way that we are able to make this happen is that NCA has one central account, and we create a unique link to each OMS survey type for each Center. So when the Center is using it, they're using the same link for everyone, which makes it anonymous at the person level, but we're able to trace it back to a specific Center. And embedded in that link is information about that Center's location – what state and region they belong to.
And that means that the moment a survey is completed, the results automatically go to an aggregated dashboard for the Center, their state chapter, their regional CAC, and NCA to be able to see those results in real time.
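The per-Center link scheme described above can be sketched in a few lines of Python. The base URL and field names below are hypothetical illustrations of the pattern, not NCA's actual Qualtrics configuration: every respondent at a Center shares one link, so responses stay anonymous at the person level while still rolling up by Center.

```python
from urllib.parse import urlencode, urlparse, parse_qs
from collections import Counter

# Hypothetical base survey URL; the real OMS links in Qualtrics differ.
BASE_URL = "https://example.qualtrics.com/jfe/form/SV_caregiverOMS"

def center_link(center_id: str, state: str, region: str) -> str:
    """Build a shared, per-Center survey link. Everyone at a Center uses
    the same link, so no individual is identified, but each completed
    survey still carries the Center, state, and region."""
    return BASE_URL + "?" + urlencode(
        {"center": center_id, "state": state, "region": region}
    )

def aggregate(response_urls: list[str]) -> Counter:
    """Tally completed surveys per Center, the way a dashboard might
    roll results up as each survey comes in."""
    counts = Counter()
    for url in response_urls:
        fields = parse_qs(urlparse(url).query)
        counts[fields["center"][0]] += 1
    return counts
```

For example, `aggregate([center_link("tx-001", "TX", "South")])` counts one response for the Center `tx-001`, and the same embedded fields would let a dashboard filter by state or region in real time.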

Susan Howley: That is so cool. When you are talking about the various forms of access that parents or caregivers can have, what’s your participation rate, or do you know the average participation rate for the clients?

Kaitlin Lounsbury: Well, it does vary quite a bit. And we have to remember that not every child brought into a Center is accompanied by a long-term or stable caregiver. Sometimes the children may be in crisis and are brought in because they're in state custody at the time. So we never expect that we'll have 100 percent participation. And it does vary so, so much. We have some Centers that actually have close to 90 percent participation, and they really serve as the model for our Centers; we're constantly looking for those examples of best practices to share back with the CACs. Other Centers do struggle to get caregivers to participate. Often it's because it's a new process, it's not part of the standard practice yet, so they're trying to figure out the best way to offer it, the best timing, the best language to use when presenting it to a family. So on average, we find that the national participation rate is about 20 percent, so about one in five families. It is a voluntary survey, so we would never force someone to take it. Even if it's presented as a standard part, it's still up to the family whether they want to participate. And certainly we do have much better response rates if it's given right there at the Center, if there's a set-aside place and time in their visit to fill out the survey. Any time it comes to sending something like an email or a take-home option, the response rate does drop, like with any other survey out there.

Susan Howley: So now that you have this wonderful system in place, how are NCA and the local programs or the state chapters using the information?

Kaitlin Lounsbury: Well, this is a huge part of our work, since we know that this information isn't going to do anyone any good if it's just sitting there and no one's looking at it. So when Centers are first getting started with the program, we encourage them to focus first on getting comfortable with their data – accessing that reporting dashboard on a regular basis and reading the results. And that should continue throughout anyone's participation; we want to make sure they're checking the results regularly. Then they start by using that for internal purposes: figuring out what's working and what might need to be improved, conveying that information to staff and team members, and making real changes to benefit the people they're serving. And similarly, we use the information at state, regional, and national levels to determine the types of training and technical assistance that are most needed at our member CACs. When Centers have a good handle on the data, we encourage them to go further with it and use the information to promote their work. The results of the surveys really are impressively, overwhelmingly positive. Families and team members really do appreciate the work of the CACs. So we strongly urge our Centers to share these successes, whether by supporting a funding application – most of our members are nonprofits, so this is helpful to convey the success and impact of the program – or by providing context for public awareness campaigns, or anything else that puts this valuable information to good use at the Center and in their community. And recently NCA has been looking for ways to use the results to provide targeted interventions in areas where we might be lagging behind, according to our national results. We're actually about to wrap up our first year of a three-year project with the University of Oklahoma Health Sciences Center that's focused on family engagement and mental health services.
The need for that came directly from trends in our OMS caregiver follow-up survey, which showed that although caregivers were receiving information about services, they weren't always going on to use those services. And the biggest barriers, according to families, aren't the availability of convenient services or transportation to those locations, as we normally think. Instead, caregivers are telling us that the most common reason they're not using the services is that they don't see why those services are important or how they're actually going to help their children. And so based on this, we're working with the University of Oklahoma to create a training curriculum for victim advocates that educates those essential CAC staff members in using screening tools, knowing about the benefits of different treatment models, and using motivational interviewing skills to convey that information to families. And what really strikes me, as someone who came from an academic research background, is that we have so much information available from OMS. We have over 350,000 surveys collected nationally just since NCA took over the program in 2012. So we really need to partner with researchers to use this valuable information, and I'm always looking for opportunities to do this.

Susan Howley: Wow, that was a great illustration with the need for more education or awareness building around mental health, because it shows that research can sometimes reveal something different from what everyone's gut instinct is telling them. You had well-trained professionals who just assumed that they knew the answers, and here's something that hadn't occurred to them. That's fascinating. Where can listeners find information about your outcome measurement system and any of the results that you're able to share on a national basis?

Kaitlin Lounsbury: Sure, so it's on our website, NationalChildrensAlliance.org. When we have new reports available, we always put those in the banner at the top. But also, under the About section, we do have a section about our outcomes, and there we post a brief version that's available to the public as a whole. We combine OMS data with other statistics that we collect, like information about the service areas of CACs, into a brief report that makes it really digestible. And then we develop a member version as well, for our Centers to take a really deep dive into these trends, and we often share that with practitioners who might be interested in working with us on some of this great data that we're collecting.

Susan Howley: What’s next for NCA and your members in the area of outcome measurement?

Kaitlin Lounsbury: Well, you asked previously about having to narrow down our focus with these surveys and what else we would have liked to include. And we knew from the beginning that children and adolescents served at CACs need to have a voice in the CAC process. But when OMS was first developed, there was concern about whether it was feasible to ask the kids for feedback directly. Now that CACs are comfortable collecting feedback from team members and getting more comfortable asking caregivers for feedback, we really feel that it's time to develop tools for youth to share their voices. And again, we're a member-focused organization, so we polled our members earlier this year and found the vast majority of CACs are ready and willing to use a youth feedback survey. So we have plans to start developing that tool and hope to launch it in 2020.

Susan Howley: Well Kaitlin, I can hardly wait to see how that comes out. I really applaud you all for looking to include youth voices, but I know that it can be challenging.

Kaitlin Lounsbury: Absolutely.

Susan Howley: Well this has been fascinating and inspiring. I am really ready to go out and spread the good word about outcome measurement to other areas of our field. You all have really led the way with this.

Kaitlin Lounsbury: Thank you so much. We are very happy to have this tool and again, we are always looking to partner with other organizations that might be looking to do something similar or are willing to partner on projects with us. We’re always looking to go out there and make these partnerships.

Susan Howley: Thank you so much.

Kaitlin Lounsbury: Thank you.