S2, Ep. 4 - Data-Driven Social Change

April 17, 2024

Listen to this episode

00:13 JAMES GAMAGE, HOST:

Welcome to Responsible Disruption. My name is James Gamage, Director of Innovation at the Social Impact Lab. Today, we welcome Jeff Couillard, Co-CEO and Director of Learning and Development at the Ally Co., a leadership and culture consulting company that specializes in helping teams to deepen connection, alignment, and impact through the conscious use of power. Jeff is also an advanced trainer and chairperson of the Board of Directors for the Right Use of Power Institute, an organization dedicated to promoting ethical and positive uses of power globally. Welcome to the show, Jeff.

00:51 JEFF COUILLARD, GUEST:

Thanks, James, for having me. It's a real pleasure to be here.

00:54 JAMES: Cool. Good. So I'm going to start with an easy question, but one that helps to position your career for our listeners. You've talked about life-changing moments in the past, in your TEDx talk, and about your decision to pursue entrepreneurship. Can you tell listeners a little bit about your journey through your career?

01:16 JEFF: Sure. Yeah. Like most of our careers, probably some twists and turns, some unexpected, some things we thought we'd do, and then where we actually ended up is maybe a bit different in hindsight. I'd say about half of my career, the first dozen or so years, was in addiction and mental health, or working with people in vulnerable positions. So people with disabilities, or kids with cancer, those types of therapeutic healing spaces. And I didn't intend to go there; it was just one of those things. My undergraduate degree is in ecotourism and outdoor leadership, so I thought I'd be a mountain guide, and I was doing some of that seasonal work out in the Canadian Rockies, that starving-guide thing. Then I accidentally got hired as a youth worker at an addiction treatment program because of some of those guiding skills, and I fell in love with the healing power of nature and the transformative impact it was having on clients. So what I thought I'd do for a year turned into a big chunk of my career so far. Then at a certain point, we kind of zoomed out a little and started to look at the systems in play that impact something like people needing to access addiction treatment, and there was a strong impulse for me to go beyond the confines of working in the social impact space or the nonprofit space into where I could have impact more broadly in the system. And so that kicked off the journey into entrepreneurship, maybe 10 years ago or so. So yeah, that's the high-level reel, if there's anything in there you want to dig into.

02:42 JAMES: For sure, I’m interested in how those experiences have shaped your understanding of social change.

02:50 JEFF: That's a great question. You started with an easy one, and now let's dive right into what social change is and how we make it happen. I remember a quote from Paul Born, who was the CEO of the Tamarack Institute for a long time. I was at a conference and he said, "It doesn't matter what the problem is. Community is the answer." That stuck with me for a long time, because living in community with our clients in addiction treatment, looking at the change process they were going through and trying to tease out what's actually driving that change, it's really hard to decouple an individual's change process from the community and the environment they're part of. So I think social change requires structure; it requires systematic approaches. It's not a one-by-one approach, which unfortunately is how we often approach social change issues, at a kind of individual level. There's a big driver in society for us to get individualized with a lot of our approaches. So yeah, at a high level, I think it's zooming out to see the systems and structures and the stories that inform how we think about social change issues, and addressing them at that level wherever we can.

03:52 JAMES: I think that's great insight. Thank you. So talk to me about why you wanted to found an organization like the Ally Co., and what came easily to you and what you needed help on when creating the organization.

04:07 JEFF:

You know, it's still being created. It's being co-created as we speak, both internally with our team and also with our clients and the communities that we are part of. I'll do a quick correction first: it's actually "Co." with the period, not Ally Company, which is totally fine, because I would say Ally Company too, but the "Co" is really deliberate. It's a reminder to us to be in community, to collaborate, to communicate, all the great "co" words. The Ally Co. was named as much for us as practitioners as it is an outward-facing thing, because of how we show up with our clients: as allies in an ideal world, sometimes as activists or as advocates or whatever is needed from us. And "Co," like I said, is really a driver for us as co-creation. We take an approach that we want to do our work with people, not to people. When you talk about social change, a lot of change management, a lot of change, is driven at people, and we try to do it to them. When we shift gears and start to do it with people, or co-create that change with them, we have much better and more sustainable results. So the impetus behind creating the Ally Co. was a recognition that the status quo is not working when it comes to learning and development and consulting, in lots of different spaces. We've been at that model for decades now, and if we look at the big indicators to ask, "Is this working? Is something like leadership development actually working?", things like engagement rates, attrition, and burnout, a lot of them are moving in the wrong direction. So the approach that we've got isn't working. The Ally Co. is about asking: what's the better approach? How can we do this in a more meaningful way?
I found some like-minded practitioners, including my partner Pablo, a human-centered practitioner, coming to the table to rethink and redesign how we do that.

05:54 JAMES: That's great. And what about that journey into entrepreneurship for you? How did that go? Were there easy parts? Where did you find difficulties, or where are you finding them now?

06:11 JEFF:

Is any of this easy? I don't know if I'd use the word easy to describe the journey. Meaningful, though. At a certain point in the journey, you get to be the person in the room holding up the mirror in some capacity and saying, "Hey, this is what people are saying, this is what you were saying in this room, and it's different than this other thing over here, and we need to make sense of that." So helping clients make sense of their experiences and then take action on it, that's the most meaningful thing. A lot of organizations, a lot of teams, a lot of programs are stuck in this status quo where we keep bumping up against the edges of what we know and what we think is on the table. So entrepreneurship for me is being able to be the person in the room who says, "Actually, lots of things are possible, but we need to get to the same starting point." A lot of our work with clients is getting to a shared understanding, that this is the thing we're trying to accomplish here, and making sure everyone is aligned with that. I think one of the things I appreciate most about the entrepreneurship journey is being able to come alongside clients and help them figure out where they're at and get to where they want to go. And it's a different kind of power dynamic, right? When you're that squeaky wheel at the leadership table, sometimes it's easy to just be dismissed because you're internal and you've got all of that relational friction, all of that history with each other. Being the consultant, the facilitator who comes in, you can say things that other people are thinking, or have told you, but that haven't seen the light of day. So there's a piece of being a change agent in the system, or whatever systems we're part of, where unfortunately, in a lot of ways, it's easier as an external person to come in and have that type of impact.

07:49 JAMES: Yeah. Interesting. So you mentioned power there, and as part of your work, you help teams succeed through the conscious use of power. What does that mean to you?

07:59 JEFF: A lot of conversations with leadership teams start with, "What is power? Do we even have a shared understanding of what power is in the first place?" Because a lot of us carry around experiences with power and our own definitions, a kind of local definition: power to me is this, and it oftentimes has a positive or negative association. I think it's fair to say that in the social impact space, most of the leaders I end up talking with have a negative association with the word power right out of the gates, because they're on the helping end of lots of misuses of power in society. Right? And that's certainly my journey. I had to rehabilitate my own perspective on power, from this negative force in the world to what it truly is, which is actually just a neutral force. Power is the ability to effect change. And there are different types of power, different sources of power we can access: things like our personal power, which you're experiencing some of today in this conversation, Jeff as a person and everything I bring to the table. We have role power. Even in this dynamic, we have you in the role of podcast host and me as a guest. There's a power dynamic there. You get to ask the questions and I get to choose whether to answer them or not, but you still have a little bit more power, potentially, in the role that you sit in. Understanding what that actually means and how it shows up in a relationship is a critical component of getting to the root cause of lots of the dynamics we see in organizations or within programs. Actually, the turning point for me with power was being introduced to it while working in addiction treatment and seeing my entire team lean back at the table like, "Oh no."
All of these things we'd been talking about, what we thought was lack of engagement in treatment, or motivation to be in treatment, or treatment readiness: when you start to look at it through a power lens, it's like, "Oh, who's actually responsible for creating the conditions for engagement?" We put a lot of the burden and the blame on people in positions of less power than us to engage in a meaningful way, when actually the power is in our hands to control a lot of that experience and to create the outcomes that we get. So there was a shift in our programming and in our approach that centered power consciousness in addiction treatment. I can point to it as a turning point when you talk about turning points in a career: that moment when that team really leaned into what it means to hold power in this relationship, and how we might be misusing it, even accidentally or unintentionally. If we're misusing it, we shouldn't be surprised if we get a little bit of resistance or pushback, and it's really easy when you're in a position of power to blame the other person for their lack of engagement when you actually set that up, maybe unconsciously. So the conscious use of power is really about power awareness as a fundamental starting point: do I understand how much power I have in this relationship, and what that impact might be on the other person?

10:32 JAMES: Excellent. Without wanting to dive down a rabbit hole, how does that classically work when you're engaging with a client or an organization? Is it holding a mirror up to them, unpacking what they already understand? How does that work?

10:54 JEFF: It depends on the client and their readiness for that conversation. Some clients hear the word power and are resistant to having that conversation, and that's a readiness check. We won't force the conversation, but we ask, "What are you ready to talk about?" We take a design thinking approach, which you're obviously familiar with, being in the Social Impact Lab. We try to embrace the design process, where we do a lot of discovery on the front end, because the problem that leaders come to us with is rarely the actual problem that the people on the ground are experiencing. It's about getting that experience, those multiple perspectives, on the table so that we can see where we are aligned with each other and where we're misaligned. Let's start with where we're aligned and then ask, "Now what? What does this actually mean?" Let's get to a shared understanding of the opportunity or the problem we're trying to solve for. Because when you think about defining the problem, whoever gets to define the problem is usually the person in the position of power. And if the person in the position of power is defining the problem right out of the gates, then any solutioning that happens, any efforts to fix that problem, are already at risk of being misaligned with the needs of the employees in this case, or the participants in the program, or whatever that happens to be. So that's one example of how power shows up: who gets to define the problem, and therefore the solutions that follow, is an act of power. A lot of our work is around distributing power more equitably, and that's about creating the space for those voices to be heard. Hence the mirror analogy: we'll help gather all that, then hold it up and say, "Hey, this is what's really happening. What do you think? Let's make sense of it."

12:30 JAMES: Sure. Thinking about the role of leadership in that, obviously that's central when we're thinking about power. When you work with leaders in the social change space, talk to me about leadership in that context. How can leadership be a crucial catalyst for social change, or change within the community?

12:58 JEFF: Another big question. Well, maybe I'll give you our definition of leadership, because a while back I went looking for a good definition of leadership that resonated with me, and I just didn't find any. So we sat down to write our own, and it's open to adjustment too. If you want to wordsmith it with me, we can do that, or if someone else listening has a better way to say it, I'm all ears. We look at leadership as the conscious use of power to create the conditions for meaningful connection and positive action. So the role of leadership is to create the conditions for meaningful connection and positive action, and it's in that order because it's connection before direction. That's something I heard as a youth worker right out of the gates, and it has stuck with me: you have to connect with someone before you direct them somewhere, right? You have to meet someone where they're at before you move them to where you want them to be. The danger of being in power is that you can clearly see, "Hey, I want you over here," and so I'm going to force this conversation, I'm going to force you to comply with this thing to get you over here. But the danger of doing that is obviously resistance and misalignment and all kinds of things that happen in that journey. So the role of leaders, in the social impact space primarily, is about connection, and it's about meaningful connection to the people being served. Too often, and I was guilty of this as a leader, we accidentally center the needs of the funder, or the donors, or the executive, or the board, instead of the needs of the clients. And as soon as you do that, you lose that connection. And that's part of the job, right? This is probably going to lead us into that data-driven space and some of my thoughts on how we use data, because collecting data and using it is an act of power, and who are we centering in that conversation?
So long story short, I think as a leader, if you're truly centering the needs of clients, or the people being served, or your employees, ahead of people in other positions of power above you, that's a better approach, or maybe a more meaningful and sustainable approach, than saying, "What does the funder want?" and then just doing that regardless of the impact it might have.

15:05 JAMES: Yeah. Obviously, given the work that we do in the lab, I buy into that wholeheartedly. But one of the challenges that we face, and I guess you would face in your context, is the time that it takes when we're all under time pressure, and there might be a funder involved who is driving towards a solution. Creating the space for that kind of interaction and that kind of learning is difficult. How do you foster the right kind of environment to be able to do that?

15:43 JEFF: Do we want to just dive into the data-driven approach? That's where we can build off some of your previous guests, because there was some great content for sure around developmental evaluation and some of those other pieces. There's an ingredient in that mix that made all the difference to us in addiction treatment, and I've since seen it make all the difference in lots of other spaces, but it's still pretty niche; I don't think it's as mainstream as it should be. I guess it comes down to a mindset first. If we think about the mindset around, say, outcomes and evaluation, what's the mindset in the sector, or the mindset of your frontline staff? I've been that frontline staff, looking at the pre- and post-surveys that I'm supposed to administer to my clients and asking, how is this meaningful? What do we use this data for? I think one of your other guests alluded to it, maybe it was Joy, who said, you know, we're not looking at data for a year. What are we doing? Why are we collecting that data if all it does is get rolled up into the annual report and fed to a funder? It's a miss on making that meaningful for clients. So that first mindset shift is: if we're truly client-centered, which lots of leaders try to convince me they are, then the outcome and evaluation system has to also be client-centered. And we have to ask the question: is this meaningful to clients? Not just clients hypothetically and theoretically, but the client that I'm sitting with, the client who is in my program right now, who I'm asking to give us this information or this data about their experience. Is it meaningful to them, or is this an exercise in extracting information and not using it in a meaningful way with them?
So the first shift is flipping the model from "we do evaluation to prove to funders that our work works" to "we do evaluation to improve our practice." And not just in the long term, but in this moment, in this journey that this client is taking with us. That shift is an essential mindset shift, because the truth is, if you do really meaningful, client-centered evaluation, it's going to be useful for funders. You're going to have rich stories to tell, and more data to analyze your effectiveness with, if you center your clients' needs and client outcomes. And I think that's one of the key ingredients a lot of programs don't have: a clear picture of what we're trying to do here and what outcomes we're after. We end up never really pinning that down. I'll use addiction treatment and mental health as a great example. We didn't have a clear sense of what success looked like other than finishing treatment. So what we would do is report to the funder how many kids finished treatment this year, or what the average length of stay was. At that point, we're just taking attendance, right? Imagine a school where all they did was take attendance and they never marked anything. They never had grades, they never did quizzes or tests, they didn't do any evaluations on assignments. Imagine a world where the only thing that system reported on was attendance rates. That's fundamentally what's happening in a lot of these sectors; addiction and mental health are prime examples. So we looked at that and said, "OK, that's not actually what we're here to do. We're not here to just have kids in bunk beds out in the wilderness for three months. We're here to meaningfully impact their lives."
We'll use addiction as an example. If we look at that as the problem, then we start to think less drug use would be the outcome we're after, right? Or no drug use, abstinence, or a reduction in drug use. But when you get a little closer to it, you realize that addiction in this case is a solution to another set of problems. Problems of childhood trauma. Problems of conflict at home. Problems of emotional regulation. Problems of communication. All of these skills are absent, and addiction is now a solution, a coping mechanism, for that young person. So maybe success isn't actually the elimination of addiction, but the pro-social side of that, which is the growth of these skills and these competencies. When you shift your lens, an actual outcome would be someone who leaves treatment able to emotionally regulate without the use of drugs and alcohol. Suddenly we can start to measure emotional regulation, or get client feedback on their ability to emotionally regulate, right? Or their conflict skills. And so that shifted the lens to actually centering the outcomes you're after that are important to clients, because clients don't care if they're in treatment for 82 days or 95.

They just don't. That's not a meaningful number to them. But they do care: is my relationship with my mother improving? Am I back on track in school? Or whatever the other indicators are. So making sure that we're client-centered in the outcomes we're actually looking for is the starting point. And then the unlock here, and this is where I'm building on, I think it was James, your guest, who said we shouldn't burden frontline youth workers with all of this data collection, that we should have somebody else in the system do it. I agree with that. In the current system, if we're asking youth workers or frontline people to collect data and make sense of it and make meaning of it, and it's not meaningful to clients, that's just extra burden, right? That's administrative burden that is not helpful for anyone. But as soon as we ask how this might be useful for the client I'm sitting with right now, to make their treatment visible to them, to make their change journey actually visible to them and to me as the frontline youth worker, suddenly that becomes a core piece of my job: helping people see change as it occurs, week to week or month to month, depending on the length of your program. We look at progress monitoring, or routine outcome monitoring, or feedback-informed treatment; there's that cluster of three frameworks that are all quite similar. It's moving from a world where we do a pre-survey and a post-survey and roll it up on an annual basis to a baseline assessment when you come in the door and then, depending again on the length of your treatment, maybe weekly, maybe monthly, pulse checks on how you're actually doing. Are you making progress on these outcomes or not, so that we can see whether treatment is on track or off track? All of a sudden, that becomes a really useful tool for the youth workers or psychologists, whoever's doing that heavy lifting on the frontline. It becomes part of their practice.
And you get way more data and a way better understanding of how impactful your program actually is. That was a monologue from me.

21:41 JAMES: That's good That's really great stuff in there. And I came across the concept the other day of good hearts law which is when a measure becomes a target, it's no longer a useful measure, and that really resonated me with me when you were talking about passing through the end of an addiction treatment program, it's you know that as a target is useless. There's so many different ways that you can achieve that target by turfing people out of the program without being properly...

22:11 JEFF: That's the downside. These would be called proximate indicators: they're indicators of something, but they're not quite the thing we're aiming at. All of a sudden we start to game the system to get closer to that target, and maybe we take a few extra intakes this month, or maybe we slow down to try and get that number up a little bit.

22:31 JAMES: So it promotes the wrong type of behaviour, is the summary. The interesting thing that sprang to mind when you were talking there is... Absolutely, that is the right way to evaluate and report. But how do you take funders with you on that journey? The way that you're evaluating might not be obvious at the outset when you're getting the funding, and you're under pressure to say, "Well, with this funding we will achieve X target." How do you get that alignment around what meaningful evaluation is with the funder, or whoever is providing the money?

23:15 JEFF: I don't have a clean answer for that, because that honestly was one of the biggest struggles that we had. We ended up getting some external funding from Health Canada to do a multi-year study and to fund a lot of the outcome evaluation work that we wished was baked into the treatment program. There are some examples out there, but I think the number is about 10%, the last time I saw it: 10% of budget should be going to meaningful outcome evaluation. If you're a $1,000,000-a-year program, $100,000 a year on outcome evaluation is a good target, and I think we're nowhere near that. Without that funding, it's tough to design and implement a meaningful system, and so we're stuck with that pre- and post-survey type of thing. Getting funders on board, I think... I mean, I would love to have that conversation with you or someone at United Way, because one of the things that we don't do well enough in the sector, and I know I've been guilty of this, is being as curious with the funder as we are with our clients. I've been the manager who makes a bunch of assumptions about the funder, what they need, and maybe how disconnected they are from the actual work on the ground. So I've come to the table without the level of curiosity that might help us together co-create a shared understanding of what this program is meant to accomplish. I think a lot of times we get a little adversarial or combative, or there's just a little bit of disconnect between the people doing the work on the ground and the funders, and middle management and leadership are stuck in that space of trying to do more with less. Right? You advocate on a yearly basis for the grant, you don't want to push too hard for anything because that money might migrate somewhere else in the sector, and so there are some of those power dynamics at play that make it tricky...
I think your guest alluded to that; it was either Bethany or James in that conversation talking about that power dynamic with the funder, right? So how do we enter that as people in positions of relative down-power, dependent on that funding? One of the biggest things I've been guilty of, and this is the "don't do this," the lessons learned from Jeff, is that if I could rewind the tape, I'd be a little less combative. I was the guy sitting in the boardroom pointing at the values on the wall and saying, "Hey, help me understand how this is reflective of client-centered care." A little pushier, probably, than I needed to be, and that was probably towards the end of the journey, where it's like, come on guys, let's get aligned on how important this is. But especially if you're designing a new program or building something from scratch, that's a great opportunity to have those conversations. Going back to the funders, get really clear: what difference do you want this to make in the world? Let's get really concrete about the impact, not the high-level statements of impact that usually accompany funding or sit inside a logic model, but really granular: what does that actually look like for the client as they're accessing our services? Let's get clear and aligned on that. And then let's go shop that with some clients and see: does this fit for you? That's a missing piece too, as we're designing evaluations. I can tell you stories: we changed questions on our questionnaires because clients would give them back saying, "I don't know what that word means." Well, you should know what that word means, because we're asking you the question; if you don't know what it means, then anything you respond with is not that meaningful. I think oftentimes we get stuck thinking we need to solve it ourselves.
And so we don't reach up to the funder to get some support, to say, "Hey, what do you need out of this? You've asked us for this traditionally, but we think this is actually a better outcome indicator. This is what our clients are reporting is really happening for them when they change, and we think that's more meaningful than how long they stayed in treatment. Would you be okay if we shifted gears, or layered something a bit more meaningful onto these existing attendance records?" So I think it's finding that shared place of meaning. I'm just brainstorming out loud at this point, James. I don't know the answer.

27:01 JAMES: Yeah. Well, I think you're talking about involving the funder a bit more in that learning and iteration around the process. When you're innovating a solution, you've got to be open to learning and iteration, and it feels like that should be translated through, even through to the evaluation. And to your point, if the model is that 10% of your time or budget should be spent on evaluation, that's what it should be spent on, because in the kinds of things that I've been involved in, we haven't really spent nearly that kind of amount.

27:47 JEFF: No, and oftentimes it's legacy.

27:47 JAMES: It is part of the process; it's the process that actually takes that time.

27:53 JEFF: Yeah, that's a great summary of my very long-winded non-answer to your question, but there's a nugget in there around learning which I think is really important. It goes back to that idea of doing evaluation to prove that what we're doing is effective versus doing it to improve. If you shift the mindset to say, "Hey, we're going to do evaluation to improve our services as the primary goal," then you have to be okay with feedback from clients that says this thing is not working. And you have to be okay with being transparent about that, because clients are going to see that data, your staff are going to see that data, funders are going to see that data. We had components of the program that consistently got negative feedback: this is not doing the thing you think it is. At that point, you have to decide: do we go back to the funder and say, "Hey, we're changing some of our programming, we're changing this model up a little bit based on feedback," or are we going to pretend that it's still working because we don't want to be vulnerable? And there's a vulnerability there, which is again inside of a power dynamic. When you're on the down-power side of a power dynamic, there's more risk to being vulnerable. If we come back to a funder and say, "Hey, guess what, we just did a year's worth of evaluation and this program is not nearly as effective as we think it should be or could be," how are they going to respond to that? So there's a lot of trust and vulnerability involved in that relationship.

29:04 JAMES: Absolutely. Are there any other kind of principles that you would apply to that client person centered evaluation process?

29:12 JEFF: Yeah. I mentioned a few different frameworks, but I'd point to the work of Michael Quinn Patton. He's got some great stuff on utilization-focused evaluation, and principles-focused evaluation is, I think, the more recent, updated version of that. We took a utilization-focused evaluation approach, which asks: how is this evaluation itself useful in the treatment journey that people are going through? Not just, let's extract data and use it downstream at some point, but is this meaningful and useful? So I think that set of principles is really important for making sure we're truly centering the client. I'd start there. And then I would layer in what we call progress monitoring or routine outcome monitoring, which moves us from a pre- and post-survey world to ongoing feedback, because then we can see in real time what's happening with individual clients. Again, we're making change visible. Our evaluation system should do that: it should make change visible to the clients themselves and to the people doing the work, whether that's a youth worker or a psychologist or a social worker, whoever is on the delivery side, because then they can navigate it together and negotiate it. How powerful is it to be two weeks into treatment and see that your stress scores have gone through the roof and you're worse off than when you first came? Our self-reported data showed that about 15% of our clients would get worse in treatment in the first two weeks. And we could dig a little bit deeper and figure out what was going on there. It tended to be young men more than young women, because young men would underreport when they first came: no, everything's fine, I'm doing fine. Two weeks later: oh, things are not fine.
I'm more in touch now, and so maybe that's what's happening. But maybe some people shouldn't be in that treatment program. Our data showed over a couple of years that about 15% of our clients needed to exit early for their own wellness, for their own healing journey, where before they would probably have just been discharged for behavioral issues, or they would have self-selected out and taken off at some opportunity. But this allowed us to see that treatment journey more visibly and then have a discussion: does this make sense to you as the client? Is this reflective of your experience? Yes? Great, what should we do next? You become the co-pilot on the treatment journey, and it puts the steering wheel more firmly into the client's hands, which, again, if we're going to be client-centered, I think is the approach. So those are a handful. Routine outcome monitoring, getting away from pre and post and into ongoing feedback at whatever interval makes sense, is a huge unlock for lots of programs. And if we make it useful, if we pull it into case planning or case consultation or the other mechanisms we have for making treatment decisions, then all of a sudden those become more client-centered too. I can't tell you how many meetings I sat in talking about kids without kids in the room and without any meaningful data from them, hypothesizing, making a bunch of assumptions, everyone sharing their opinions, as opposed to actually knowing: this is what's happening from their perspective. It might be different than our perspective, but at least we have their perspective in the room.

32:12 JAMES: Yeah, OK. And maybe this is an obvious question, and it's my naivety, but it feels like a lot of these evaluations, or the way you might collect the data, might lend themselves more to a qualitative rather than a quantitative view. How does that play out? To what extent are these measures and metrics, as opposed to a more qualitative view that might be richer but might not be able to be aggregated in a way that's meaningful to funders?

32:54 JEFF: Great question. Can I give you a couple of different examples? It's a mix of both. In the treatment program, we took a very quantitative approach: we used a measure called the Youth Outcome Questionnaire, which is a self-reported scale, 64 questions long, and they did it weekly. So we're administering a 64-question Likert scale weekly, not on levels of distress but on symptoms of distress: how is stress showing up in these areas? Intrapersonally, anxiety and depression; interpersonally, conflict or lack of communication; somatically, in my body, not sleeping well, headaches, those types of things. Basically, how is stress manifesting in these different domains? We did that on a weekly basis with a tool that I think was published in the late 90s or early 2000s, so there's tons of data behind it and it's a validated tool, accurate to what we wanted to measure, which was: how is stress impacting these young people, and through our treatment, are we giving them the skills to cope with that stress in a healthier way? Because that was the fundamental belief we had: addiction is a response to stress, it's a coping mechanism. If we can equip youth with these skills and pull these levers, then addiction doesn't make sense anymore in that young person's life. It's not the go-to coping mechanism, because there are lots of downsides, obviously. So we took a very quantitative approach, and we got to a point where they'd do it on a tablet, and we would pull up their report, see how they were last week, and see the change. It either goes improvement, flat line, or worse; those are kind of your three options. And honestly, James, it did not matter whether it was worse, better, or the same, because, as we would say, it's grist for the mill. What does this mean?
You haven't made any progress since last week? But actually you did, because you made progress on this one. But you got worse on that one, so let's dig into that. Why did this go up? Well, I had a terrible conversation with my mom, I went on a home visit and I relapsed. But I had a really great backpacking trip with the team, and now I'm feeling better about this area. So you get the nuance and context that you get from the qualitative, but you have a shared starting point that is client-centered. It's client-reported data. It's not me coming to the table and saying, hey, I think you're doing better in treatment and this is why, and centering my perspective. So we had a lot of quantitative tools, and within them we would have places for context, but we would capture a lot of that context in the discussion that we would then use to plan for the next week or month of treatment, or whatever that time interval looks like. That's one example of using that in more of a client-centered space, and that's on stress scores. I was also working with a homeless-serving organization in Calgary, and we wanted to measure hope and levels of hopefulness. It turns out there are a couple of great hope scales that measure state hope and trait hope: how much hope do I have generally as a person, and how much hope do I have in this situation I'm currently in? We would measure that on a monthly basis when people came in for services, and we would see change and use that as a big indicator, because there's a lot of great research around hope and hopefulness and how important it is for people as they go through a change process, whether that's addiction or homelessness or those types of things. So for some of those things that seem a little bit fuzzy to measure, there's actually some great research and some great tools that have been built to measure them.
And then of course, the context and the qualitative are important to tell the story. I think that's actually a missing piece. A lot of the reporting we do in the social impact space is qualitative in nature: it's the stories, it's the pictures, it's the feel-good, emotional things. But when you can layer that on top of a quantitative journey and say, hey, this person came in at this level of distress and they left here, and these are all the points in the journey and how it went for them and what they said about it, that's a much richer story, both to improve your program and to prove that it works to funders or donors or interested parties.

36:40 JAMES: Excellent. Thank you. So looking ahead, do you have any thoughts, or what is the sector thinking, about the future of evaluation and data? When you were talking about that example, I was thinking about the use of technology. For example, every young person has a mobile phone that they're, rightly or wrongly, constantly attached to, and I'm thinking micro-evaluations or that kind of concept. That was just one idea I had, but any thoughts about the future of evaluation?

37:22 JEFF: Yeah. We started with pen and paper and hand graphing, and then at some point in the evaluation shifted to tablets and laptops, being able to email links to parents to fill out after the home visit. With all of those data collection tools, there's really no excuse now not to collect data in a meaningful and robust way. We don't have to mail out the post-survey and hope they send it back with the prepaid stamp. We can get access to people, and there's a huge opportunity there. We were on the cusp of building out a post-treatment support app that wasn't just about how are you doing today and answering micro-surveys on a daily or weekly basis as people transitioned from treatment back into life, but also a place for resources and contacts, and for people to build out their post-treatment plan. So there is some movement in that space, and that's addiction and mental health, but I think we're going to see app-based support throughout, post-treatment or post-intervention, with the opportunity to collect more meaningful data as well. Because the most common question I would get asked when I told someone I worked in addiction and mental health was, two seconds later, what's your success rate? And what's my answer to that question, James? Well, it depends. How do you define success? And I would squirm my way out of having to say the truth, which is we don't actually know. We have no idea. I can tell you that the average length of stay is 82 days, but I can't tell you how these young people are doing six months or a year later. So that longitudinal connection and the ability to collect actually meaningful post-treatment data is a huge opportunity. It's a big gap if we're honest about effectiveness. Something else, since the question was about the future of evaluation:
I think it's pairing the outcome evaluation, getting clear on what our outcomes are and getting good at measuring those, with the variables we have control of. We call them process variables; that's the technical term for, what are we actually doing in treatment or in this program, and are we getting feedback on that thing? So that we can pair up that feedback with the impact it's making. The beautiful thing that happened for us is that we would collect the outcome data weekly, and then we would ask them questions about their experience. At some point your data gets big enough that you can lay those two on top of each other, and we could say, hey, interesting effect: at week three of treatment, across the board, people get better by a significant margin. What's happening in week three that might be driving that? Or vice versa: hey, towards the end of treatment, scores start to tick up and people are more distressed in the last two weeks. What are we doing about that? That's improving your practice, because just collecting outcome data isn't enough; we're still making assumptions about what's driving that change. So it's getting client feedback on our practice, and that's the pinchy part. That's the part where we have to be willing to say, oh, that thing we think is really effective is not doing the job.

40:13 JAMES: Yeah, interesting. Good. OK, well, this has been a fantastic conversation, but when we end these conversations we ask our guests for closing advice. So maybe that's something you haven't talked about yet and want to talk about. Or maybe it's a question of where a leader in an organization should start if they're going to incorporate more client- and person-centered evaluation methods into their practices. What would you like to leave our listeners with?

40:53 JEFF: Can I quickly hit both of those? One interesting shift in my work and the work of the Ally Co has been to take some of these principles, that client-centered approach, and apply them more broadly as human-centered.

41:11 JEFF: When considering evaluation within organizations, whether it's with our own employees or teams, we're often just as guilty of conducting the annual Employee Engagement Survey, asking questions that don't lead to meaningful change. When I hear about teams feeling disconnected from their employee engagement surveys or feeling unheard, it's because we haven't figured out how to gather more meaningful feedback and make it relevant for everyone. So, that same approach of making change visible and feedback meaningful can be applied to the work we do with our teams as leaders. In many ways, that's a better starting point than focusing solely on clients. After all, if the people responsible for driving change for clients are disconnected or facing challenges like burnout and high turnover, addressing those issues should be a priority before adding another layer of evaluation to their workload, which can feel burdensome. Therefore, I would encourage most leaders to examine whether they're practicing what they preach—are they making feedback meaningful for themselves? Are they asking questions about their leadership and how they support their teams? This human-centered approach extends beyond client work to encompass how we interact with our own employees and teams. Here's a question for listeners to ponder: How do you know when you're doing great work? To answer that question, you need to define what great work looks like and get clear on the outcomes you're aiming for. Then, you need a system in place to track and measure those outcomes. Only then can you confidently say, "I know we do great work." It all comes back to that common question, "What's your success rate?" The answer depends on having a clear understanding of what success means to you and having the tools to measure it. This question became a pivotal point for us when an evaluator challenged us to define how we do great work. 
The truth is, we didn't have a clear answer at the time, but we were eager to find out. So, perhaps that's a good starting point for anyone looking to improve their practice.

43:26 JAMES: That's sage advice and a great question to leave our listeners with. So Jeff, thank you so much. A very insightful conversation; you clearly have a wealth of experience and wisdom in the realm of data-driven social change and leadership. Your journey and your insight are very inspiring. Thank you so much for your time. Until next time, goodbye.

[Outro music]

That's all for today's episode of Responsible Disruption. Thank you for tuning in and we hope you found the conversation valuable. If you did, don't forget to follow, rate, and share wherever you get your podcasts. To stay up to date on future episodes and show notes, visit our website at thesocialimpactlab.com or follow us on social and until next time, keep on designing a better world.