Tell Us About It: Victim Research Convos

Podcasts

In this CVR podcast series, we talk with those doing research and serving victims and learn about the work they've done together.

Tell Us About It, Episode 20 – Promoting Safe Youth Relationships with Expect Respect

A convo with Barri Rosenbluth and Barbara Ball – Aug 20 – Time: 33:37

  • Ways to Listen
  • Listen on Apple Podcasts
  • Listen on Soundcloud
  • Listen on Spotify

On this episode of Tell Us About It, we take a look at Expect Respect, a program of The SAFE Alliance in Austin, TX, promoting safe and healthy relationships for youth and preventing dating and sexual violence. Barri Rosenbluth and Barbara Bell join us to discuss the genesis of the project, the different programs associated with it, and the value of bringing an evaluation aspect into the project.

Barri Rosenbluth is the Senior Director of the Expect Respect program at The SAFE Alliance. She has been at The SAFE Alliance for nearly 30 years, beginning as a counselor in their emergency shelter and non-residential center.

Barbara Ball is the former Director for Evaluation and Training for the Expect Respect program at The SAFE Alliance. She began as a facilitator of support groups for girls in middle and high schools. Barbara currently serves as Senior Associate at the Texas Institute for Child & Family Wellbeing at the Steve Hicks School of Social Work, University of Texas at Austin.


Transcript:

Susan Howley: Welcome to Tell Us About It: Victim Research Convos, a podcast from the Center for Victim Research with support from the Office for Victims of Crime. On each episode of Tell Us About It, we talk to researchers and practitioners about their work, the tools being built for use in the field, and how we can work together to build an evidence base for victim services. I’m Susan Howley and today we’re talking with Barri Rosenbluth and Barbara Ball about their work on Expect Respect at SAFE in Austin, Texas. Barri, can you tell us about your position and your history at SAFE?

Barri Rosenbluth: Hi yes, I am the Senior Director of the Expect Respect program at the Safe Alliance. I’m beginning my 30th year here at the agency. When I first started, I was a counselor working with women in our emergency shelter and non-residential center and discovered early on that so much of the abuse they were experiencing had begun in adolescence.

Susan Howley: Thank you. And Barbara, can you tell us about your role at the Safe Alliance and with Expect Respect?

Barbara Ball: Sure. So currently I work as the Director for Evaluation and Training but I didn’t start out that way. So 15 years ago, I started facilitating support groups for girls in middle and high schools and then worked with elementary students as well. And then had the opportunity to get more engaged with the research, with a CDC study that we’ll soon describe more, and grew into the role of being the evaluator for the program.

Susan Howley: Thank you. Barri, could you give us a quick overview of Expect Respect?

Barri Rosenbluth: Sure. Expect Respect is a program of the Safe Alliance. We’re a nonprofit working to stop abuse for everyone. The program works in schools to promote safe and healthy relationships for youth and to prevent dating and sexual violence. It’s a comprehensive trauma-informed program with multiple components. We are working with young people who are at increased risk for victimization and perpetration of violence due to prior violence exposure or because they’re in current abusive relationships with peers or dating partners. And in schools we offer support groups for those students throughout the school year, curriculum-based sessions that teach skills for healthy relationships. In addition to our support group intervention, we offer youth leadership programs at the campuses, as well as through a Summer Youth Leadership Academy, which is an exciting partnership with our city summer youth employment program. So students are paid for a summer job to become prevention activists at the Safe Alliance. We also partner with theater organizations in town to provide educational theatre performances and productions that are youth-led and address current or relevant issues in teens’ lives. So that’s kind of in a nutshell the scope of the work we do. We also train school personnel on responding effectively to incidents of bullying and sexual harassment, dating abuse or sexual assault amongst students. We like to speak with parent groups about their role in supporting healthy teen relationships and we have a lot of collaborative projects with other partners in the community bringing the topic of healthy safe relationships into other settings.

Susan Howley: Wow, so many great aspects to this program. I mean, I’ve been hearing about Expect Respect for many years and every time I learn something new about it. This theater program sounds terrific. What did Expect Respect look like in the early days, and when did it get started, Barri?

Barri Rosenbluth: In the late 1980s, some of the counselors here at the organization, which at that time was called the Center for Battered Women, were providing presentations in high school classrooms and sometimes students, after the presentations, would come up and say “What if this is happening to me? Where can I go? Is help available?” We began to work with school counselors at some of our local high schools who were interested in having support groups on campus for girls. We modeled this after the work we were doing at the time with adult women, which was peer support. And we found that this was equally effective with teens. They came forward in greater numbers. We were able to obtain some local grants to expand that program and to begin writing lesson plans and partnering with our University of Texas researchers to look at the effects of those groups on the students.

Susan Howley: So you mentioned partnering with the University of Texas researchers. Was evaluation part of your program from the get-go or when did you start bringing evaluation into your work?

Barri Rosenbluth: It really was part of the program from the get-go because we knew, for one, that we needed to demonstrate the effects of this program if we wanted to expand it. Of course, the counselors had no shortage of anecdotal information that students were feeling empowered – they were speaking up for themselves, they were seeking help, they were getting out of abusive relationships. But we needed to demonstrate quantitatively what the impact of the services was.

Susan Howley: So how did you do that? What did the first evaluations look like?

Barri Rosenbluth: Very simple pre- and post-tests. We looked at measures at the time that we thought would be changing for the girls in the groups. I remember, this was a long time ago, but we looked at self-esteem. We looked at social support. We looked at measures of well-being. And I’m not sure we had really any conclusive results at the time, but we knew that the process of evaluation was important for our program development and sustainability.

Susan Howley: With those first evaluations, did you use internal or external evaluators?

Barri Rosenbluth: It was really the team of internal staff and our partnerships with researchers at the University of Texas and also with our city health department that were interested in helping us evaluate our program – not only our support groups but educational presentations that we continued to provide in the classroom. We worked as a team. We were all interested in knowing what was changing as a result of our program.

Susan Howley: Barbara, your program has been evaluated by the CDC. How did that come about and how did that help you all at Expect Respect?

Barbara Ball: So we had multiple CDC grants, actually, that helped us on the way to evaluation. One was as early as 1997, where we looked at elementary school programs that prevent sexual harassment and then thinking in the future that by doing so, we would also prevent later intimate partner violence. So that was a very early grant. I think the big breakthrough for us was an empowerment evaluation that the CDC designed and released in 2003. And what they did was they looked in the communities for programs that had capacity to engage further in evaluation. So what Barri described just before that she had always worked with evaluators, basically from the get-go of the program, put us in the position to be one of the four programs chosen for that empowerment evaluation. And so over multiple years this entailed really intensive work and training for us to look at qualitative and quantitative ways of measuring prevention programming. And I think what was really, really important for us was a qualitative component of this project. So we interviewed our support group participants at the end of the program. And it was a pretty big undertaking. We were working in 20+ schools, had about 40 groups a year on average. So we interviewed a good part of those groups, tape recorded, transcribed, coded interviews with the support of the researchers and we learned a lot from it. What we learned was that the boys and girls participating in our groups – and these are, as Barri mentioned before, these are young people who have been exposed to violence in their lives, they are at really high risk and they’re struggling in their relationships, and they’re participating in our 24-week support group program. So what these students told us is that two things counted the most for them. One was to learn how to make their relationships better. They really wanted to learn the skills. And number two was the group experience itself. So it was not about something didactic. 
One of them would say, “Well I knew that all before, I just never bothered to think about it.” So what this did was confirm for us that we needed to engage them in an experience. Having a supportive group was critical for these students that we were trying to reach, and we needed to really focus on skills. Up until then, a lot of the work was based on worksheets and was very educational, more didactic in some ways. And we still do those components, but that’s not what made the change for them. So from there we started our first big curriculum project, and that was about 10 years ago. We published it, I think, in 2008. So that’s a curriculum that has been out now and currently we’re working on a new version of it. And then we also continued to do more quantitative evaluation. And I think because we were running so many groups and have so many participants every year, we were actually able, with support from the CDC, to design rather complex pre- and post-tests testing out different scales. And also looking at what kind of questions resonate with the young people? Where do we see changes from pre- to post-? And we tried a number of different things. We looked at their attitudes about gender norms. We looked at justification of violence. We looked at general levels of aggression. We included scales for measuring healthy relationship skills. We looked at their perpetration and victimization of teen dating violence. So a lot of different factors, and we were able, in a pilot study, to show some positive results. So that was an uncontrolled evaluation, just part of our regular programming providing pre- and post-tests, and we were able to show some positive outcomes. So the CDC then picked up and basically offered us the opportunity for a rigorous and controlled outcome evaluation. We started in 2011. It was a four- or five-year process. One year was a pilot for our project and then three years’ worth of data collection.

Susan Howley: That’s great. So Barri and Barbara, what excites you the most about what you are learning from this evaluation? Is it some of the early information you’re learning about outcomes or is it about effective strategies?

Barbara Ball: Well, there are a lot of components. It was very important to hear from our participants what mattered to them. I think that is probably the piece of information that is really, really critical for any program development and for informing how we implement a program. We also did interviews with our facilitators. That was part of the CDC outcome study. We learned a lot from that about what groundwork we need to do in every school to effectively support the programs so that our support groups run well. We need to get referrals. We need some support from the school. What are we giving back to the schools? There are lots of questions, lots of things we learned through that. And then, of course, the outcome study asked hard questions – are we measuring changes? Is the program working as intended? And we are always trying to refine our measures to get a better sense of what is happening in the groups.

Susan Howley: What kind of outcomes are you finding?

Barbara Ball: Well, in the CDC-funded study, which is certainly the most robust and rigorous piece of the evaluation we have done so far, we found that for all participants there was a significant reduction in general aggression toward peers. We measured that through what’s called reactive and proactive aggression. So in other words, it’s aggression in the moment towards a peer, but also aggression that would probably fall under the label of bullying. So anything where aggression is used to control and manipulate another person. So there was a significant reduction in that, which I think is very important.

Susan Howley: So am I hearing correctly that the outcomes you were seeing went beyond the dating violence that was the initial focus of this project?

Barbara Ball: Yes it did. And we did of course also look at teen dating violence victimization and perpetration, and that includes emotional, physical and sexual violence. And there we saw that for the boys in our program, there were incremental declines, which I think is very, very important. We did not find the same outcomes for the girls on those measures, which has been puzzling, since the program started for girls. We actually would have thought that we would have better outcomes with girls than boys. So we were thrilled to see that we can really make that difference in the boys’ lives, and we’re continuously trying to figure out what we can do to support the girls in different ways.

Susan Howley: I wonder what sorts of evaluative questions are occurring to you now to explore girls’ experience and whether their lack of a better outcome might actually be a difference in willingness to report, for example?

Barbara Ball: Yes, there are always questions like that. You always need to reflect on the measures, on how boys or girls respond differently to measures. There might also be relationship dynamics that are different, because we did see, for boys and girls, that their general aggressiveness in relationships decreased. So we’re asking ourselves what’s different for dating relationships. Why don’t we see the same as in their peer relationships? I don’t have a conclusive answer for you at this point because it’s so complicated what all goes into students’ responses to any of your questions. We may need to do some more qualitative work to dig a little deeper and have a better understanding of the meaning of it.

Barri Rosenbluth: Well that’s just the challenge with prevention evaluation, I think. There are so many questions about how do they even understand the question at the baseline versus after a 24-week program? So we’re struggling with those challenges as well. But I’m also excited about the new opportunity to look at school-level outcomes. We have a data sharing agreement with our local school district and we will this coming year be able to look at rates of disciplinary offenses, academic performance, absences for the students participating in our program and compare that to non-participants. So I’m excited to see whether we’re having outcomes in that area.

Susan Howley: This is exciting. And to me, negative or unexpected results are not bad because it always gives you a step forward. I always say with science, even bad news is good news because you learn something that’s like, “oh okay, now we know this, or now we know what direction we should take our question in the future, or we’ve opened up all of these new areas to look at.”

Barri Rosenbluth: We are in total agreement with you on that.

Barbara Ball: So for quite a while, through the CDC evaluation and after, we focused very strongly on gender norms and on gender differences in attitudes toward violence and justification of violence. More recently we’ve retooled a little bit and, while I don’t have results yet, I’m really curious to think more about the emotional and mental health aspects. Mental health and violence prevention have evolved so separately, yet in the lives of the young people that we’re working with – because they have been exposed to violence and adversity and stress, and they are experiencing trauma all along – we are really merging the two fields. We have slowly been doing this over the years, and we’re strongly addressing some of the mental health components without making this a mental health or counseling program, really thinking: how can we help our participants regulate their emotions better? How can we help them cope better? How can we educate them on trauma and how that may play out in their everyday reactions to situations? So there are so many layers to this, and maybe that will actually support the girls more, so that they are in a space where they can make decisions and choices and not be as reactive in that trauma space that they’re usually living in. So we’re working hard on that and just designed a new baseline survey, which I think was relevant to the students based on the responses I’ve seen. You can tell right away when you look at how students respond to a survey. Have they actually even considered the questions or are they just glossing over it? So I’m hopeful that we’re maybe tapping into something relevant that perhaps we haven’t looked at as much, and that that might also tell us more about what’s going on differentially for boys and girls.

Susan Howley: That’s great. Now there are not all that many programs that have the luxury of having an in-house evaluator like you all do. Barb, that’s just a terrific role that you play there. Can you tell us a little bit about the benefits of having an in-house evaluator? How do the two of you work together?

Barri Rosenbluth: We work very closely together. The program director has to value evaluation and has to see it as part of their job. For me, I need to hold the integrity of the evaluation as a priority and know that all results are good results because the purpose of the evaluation is to help us understand the impact we’re having and how we can improve. And that should be the role of the program director.

Barbara Ball: We’re working together really closely because I’m also intimately involved with program development. So a lot of what we find through the evaluation, and the questions that are being raised that we just discussed, flow back into how we can actually improve the program. So I think the great part about having an evaluator in-house is that you can do that feedback cycle really directly. So at the end of the year when we have our pre- and post-data and I’m able to pull a report together, I’ll present to the team and we’ll talk about why we have those findings or why we’re successful in one area but not in the other. When I’m looking at survey instruments and I’m trying to shift some things or try something new, I’ll vet it with the team. So it’s like a constant conversation. And the whole team is assisting with implementing surveys or sometimes entering survey data. So it’s not an entirely separate role, and that’s certainly very different from an outside evaluator.

Barri Rosenbluth: It’s a huge asset to the program and I am very protective of having this position in my department because somebody like Barbara with these skills is in high demand in any organization. And I don’t want her to be spread too thin because I know how much time and effort it really takes to do the level of evaluation and program development she is doing in our program. So I caution other people who have this resource on their team to really protect that because people can get called into doing lots of different things, particularly when they have high level skills that are needed in so many ways.

Susan Howley: That’s wonderful. Barri, what’s next for Expect Respect?

Barri Rosenbluth: What’s next is our program manual being published. We’re excited to launch that. We offer training around the country for other communities that would like to replicate some of the strategies we’ve developed here. And we can help them customize the program to their needs, to who their partners are and what setting they’re in. We enjoy doing that and I look forward to doing more of it with the new program manual. We also need to continue to fund the programming here locally, and as our program grows that gets harder and harder to do. So we do a lot of grant writing. We need to partner with a lot of other organizations so that we can stay current with all of the new influences and directions in the field.

Susan Howley: Barbara, what’s next for the evaluation focus? And a thought occurs: as people pick up this program and implement it around the country, will you be guiding their evaluation or will their evaluations have any connection to you all in Austin?

Barbara Ball: That’s a really good question. I always make our instruments available to anybody who e-mails or calls and that happens quite frequently. We don’t have currently any formal agreements with other sites to more rigorously evaluate what they are doing and maybe put the data together. That can be very challenging because implementation can vary so much from one site to another. That’s one of the reasons we’re not doing that currently, but we’re definitely sharing the instruments that we have. What we’re doing and we are really, really excited about is that we are working with Dr. Liz Miller, who got a grant through the CDC to replicate and evaluate Expect Respect with middle school students in the Pittsburgh area. So we’re consulting on that grant. And we’re really, really excited to see the program being used in another geographic area and with middle school students, and we’re really excited to see that she’s evaluating the program. On another note, as I mentioned earlier, we are really curious about looking into the connection between mental health promotion, emotional well-being, and violence prevention. There’s so much conversation about mental health in relationship to school violence, so I think there is a lot of work to be done that’s very needed. And so hopefully we can contribute to that area a little bit more.

Susan Howley: That’s all fantastic. Barri, there’s a lot of pressure from funders and others about outcomes measurement. What lessons have you learned at Expect Respect?

Barri Rosenbluth: As a Program Director, I know that there are so many requirements from different funding sources for different outcomes and indicators, and this can make the job of the program director very complicated, as well as the jobs of the implementers of the program, who need to collect so many different pieces of data in different ways for different funders. And I just think it’s really important to determine what outcomes you think are changing and to try to write those same indicators into grants. I think we spend a lot of time unnecessarily trying to track a lot of different outcomes and indicators that may not be what we really need to know or what we think are changing, but that we are required to track for various funders. It helps tremendously to have a core set of variables that you measure consistently, just to consolidate some of the evaluation and administrative tasks that are involved in running a program.

Barbara Ball: I could add to that. So we’ve talked a lot about the evaluation of Expect Respect, and I just want to clarify that we’ve talked about one program component of Expect Respect – that’s the support groups. And I think the reason that we focused our evaluation on that program component is that it was the first one to be manualized. We have a 24-session curriculum that has existed for a long time, prior to even the 2008 publication of a program manual, and it has existed consistently over time and of course has grown and developed and become much richer and deeper. But it is a program component that is hard but possible to evaluate because it has a set framework. And also because we consistently have a high number of schools participating and, on average, between 200 and 300 students every year complete a baseline survey. So we have a high number of participants in that program, so we actually can evaluate it and it makes sense to put resources and evaluation behind it. We have other program components that are also very, very important and very dear to us, but they are not that easy to evaluate because, for instance, the youth leadership programs have to be flexible and creative. They vary in length, they’re youth determined, and they have a different form and shape every year – some work with educational theater, others create media projects. Some are three hours long and some meet for 100 hours. There’s so much variability. It’s much harder to have a rigorous evaluation of a program that by nature needs to be flexible. Now that’s not to say that we don’t value those programs, but we do it in a different way. Sometimes we may decide an audience feedback or participant feedback form is really all that’s warranted, because it does not make sense to do a pre- and post-test under certain circumstances.
For other leadership programs that are more structured and more consistent over time, we’re also doing pre- and post-tests. I guess what I’m saying is you really need to look at what’s the nature of my program, what makes sense in terms of the resources I put in, and what are the questions that are really important to ask. And then, as Barri said, we have multiple different program components, so we tried to streamline the questions that we are asking. We have a core set of questions we ask of everyone, and those are the simplest feedback questions. So then our grant reporting has become easier, because we can include those same questions and we can compare outcomes across different program components. So that was, I think, a really important step: aligning evaluation across different program components.

Susan Howley: Barri, what would you say that your school partners are excited about right now with Expect Respect?

Barri Rosenbluth: Right now, they’re excited about seeing our educational theater performances. This is a different component of the Expect Respect program. We offer theater productions that are developed by youth, for youth audiences at the middle school level and at the high school level. We work with college level actors to engage students in dialogue in the classroom using a theater-for-dialogue approach. And these are very well received. They’re very interactive. They stay very fresh and current with the issues that teens are dealing with in their lives, online, with their friends, with their phones. And I think that’s probably what the schools are most excited about right now. We’re also launching a consent campaign with stickers and posters and social media contests and that’s generating some buzz as well.

Susan Howley: It is so good to hear about a program that’s been demonstrated to make a difference in reducing violence in the lives of young people. And great to see how evaluation and measurement has been part of the development of Expect Respect. Barri and Barbara, thank you both so much for sharing your work with us today.

Barri Rosenbluth: Thank you.

Barbara Ball: Thank you.