Ep. 11 - Assessing the Impact

June 28, 2023

In this episode hosted by James Gamage, the spotlight shines on evaluation in practice. Joining the conversation are two distinguished guests, Janis Galloway, Community Impact Officer at Alberta Ballet, and Joy Bowen-Eyre, Chief Executive Officer at The Alex Community Health Centre, who bring their unique perspectives to the table. James, Janis, and Joy engage in a thought-provoking discussion, exploring the nuances of evaluation methodologies, best practices, and the importance of collaboration and stakeholder involvement. You will gain a deeper understanding of how evaluation impacts artistic endeavours and community initiatives and how it can drive meaningful improvements in practice.

Listen to this episode



Welcome to this episode of the Responsible Disruption Podcast. My name is James Gamage, Director of Innovation at the Social Impact Lab with the United Way of Calgary and Area. Today's episode is a continuation of our last discussion about the theory around evaluating social innovation projects and initiatives. Don't worry, you don't need to have listened to the last episode to pull useful nuggets out of this conversation, because today we're moving from the conceptual to the practical, addressing how it works on the ground. So with me today are two inspiring local leaders, Joy Bowen-Eyre of The Alex Community Health Centre, and Janis Galloway from Alberta Ballet. So let's introduce our guests. Joy Bowen-Eyre is the Chief Executive Officer at The Alex Community Health Centre and a recognised leader in Calgary's social services community. Joy has a passion for community health, education and social work, and an intimate knowledge of Calgary's non-profit sector. Her work is a testament to her commitment to public service and to collaboration in solving complex social issues. She is passionate about advocating for the things that matter most to her, including access to healthcare, a strong public education system, and a strong social sector that is focused on the wellness of all. She has spent her career challenging the status quo and loves connecting and working with others who feel the same way.

Janis Galloway is the Community Impact Officer at Alberta Ballet, where she works to understand how dance can contribute to building healthy communities and develops programming that brings the benefits of dance to more Albertans. She has worked as a PR and communications professional for 15 years, with a focus on supporting organizations that prioritize corporate responsibility and social impact in a variety of sectors, including the arts, fashion, food and more. So welcome to you both.


Thank you.


Thank you, James.

02:11 JAMES: So as a bit of background to this conversation: we met with James Stauch from the Institute for Community Prosperity at MRU and Bethany White from the United Way in the last episode, and Bethany and James were able to illuminate our understanding of some of the theory of social impact evaluation, as well as some of the issues associated with it. We talked a lot about evaluation at a systems level, the value of working with stakeholders to identify what's important to them, how you report on things that just can't be measured, theory of transformation, developmental evaluation, and the role of the evaluator in the project team, which we'll discuss more later as well. So what I'm hoping from you today is to bring your practical experience of evaluation to the conversation, as you're both at the coal face, and we're really interested in how this theory might play out in practice. I'm gonna start with a two-parter question just to get the conversation moving, and moving from the conceptual to the practical and tactical. Joy, if I can ask this of you first: why is evaluation important in the context of social innovation projects and initiatives in general? And what is the usefulness of evaluation specifically for your organization? I will ask the same of you in a moment, Janis. But Joy, if you can kick us off on that one.

03:41 JOY: Absolutely, James. So I'll probably give kind of a long-winded example. I've been in the not-for-profit sector for close to 30 years now. What I can say is that when I first started out doing frontline work, we were doing innovative things, but we weren't capturing those innovative ideas and projects, and we certainly weren't evaluating. So when you come full circle to the position I'm in right now, I think a lot has shifted, and it's shifted for many different reasons. There's a lot of competition for funding, and there's a lot of competition within the not-for-profit sector. There are lots of people doing great work, but how do you set yourselves apart, and how do you demonstrate that the work you're actually doing is having impact? I think this is where the role of evaluation really started to take root, probably about a decade ago. Historically, a funder, whether that be the United Way, government, or somebody with philanthropic dollars, would say: I really like Joy. She's a lovely person. I know her team is doing great stuff. I'm just going to give this gift. But there was no measurement of whether the money that was given was actually creating impact or shifting how we responded to people. At The Alex, we're always talking about how we improve quality of life. Well, that's a big, hairy, audacious goal, improving quality of life for those that we serve, and everybody's quality of life looks different. Mine looks different than James's. Mine looks different than Janis's. And we serve about 16,000 individuals at The Alex every year. So when we talk about evaluation and the why: the money given to us comes from a myriad of different sources, and we want to be able to demonstrate back to those funders that we are actually making that impact, that we are actually improving that quality of life.
And so the usefulness of the evaluation metrics and the tools that have been agreed upon between the funder and the organization is instrumental in moving the work forward.

For example, today we have a community meal program happening at The Alex Community Health Centre. We know that we're impacting quality of life because folks are coming in the door who haven't had access to food, and by giving food to folks, we can engage in a conversation and start entering into a different type of relationship, one where we can start to talk with individuals about their other needs and how we can improve quality of life in the big picture. So for me, that's why I see the importance of evaluation. It's not just about the numbers; it's also about the impact.

06:35 JAMES: Yeah, and we will go on and talk in more depth about that, making sure that the numbers align with the impact and with what we're trying to achieve as organisations. But Janis, I'm interested in your perspective on that.

06:50 JANIS: I'd echo everything that Joy just said. I think funders and donors, and just the public in general, the patrons of what we do, are becoming more invested in the actual outcome or impact of their dollars and of the organizations they support. There's a more sophisticated understanding and a thirst for people to understand everything they do, whether it's the grocery store they go to, the arts performance they see, or where they're giving their philanthropic dollars; they want to understand that impact on a more tangible level. So it's very important for Alberta Ballet for the same reasons Joy said. To elaborate on that, I think it also gives us learnings, right? For me personally, as a professional, you don't go into social impact work just to keep yourself busy. You do it because you really want to see an actual change or impact in the world. And so it's very motivating for me to conduct really thorough evaluation to understand: are we actually making the impact that we're trying to make? Let's see the evidence to show it. And if we're not, let's pivot to make sure we make changes. I find that very valuable from a personal perspective. And I think it also creates accountability, transparency, and credibility for an organization to say we've done the work to make sure we understand what our impact is, and here's the evidence to prove it.

08:20 JAMES: I'm thinking about that range of stakeholders. I think you both mentioned it, and you in particular, Janis, mentioned funders, donors, and patrons. Going back to the start of the process, how do you understand what they're interested in, in particular? Is that from conversations? How does that process work?

08:38 JANIS: You ask them. To me, it's that simple. You simply ask them. We have folks who work on the development side, the fundraising side, who are out having those conversations all the time with funders and donors to understand what is important to them and feeding that back to us. But then also, for the actual beneficiaries of the programming we do, again, we ask them. And our partners too. For example, one of the programs that I oversee at Alberta Ballet is our welcoming newcomers program, and we work very closely with immigrant-serving agencies to deliver that program. They are such an important stakeholder in making sure that we are creating a communication channel with the immigrants we're inviting into the theatre and into our dance studios, to understand: is this what you want? Is this beneficial? What was your experience in this program? And so they work with us to help us collect that information. So yeah, I guess it's just: ask. Ask what's important to them and they'll tell you.

09:40 JAMES: Yeah, and I guess the same for you, Joy, with a range of stakeholders. Is there a risk that you end up with a range of measurements, and how do you prevent that from becoming unwieldy?

09:54 JOY: Similar to what Janis said, for us it's about asking the donor, the funder: what problem are you trying to solve? These are the services that we offer. So we try to marry a donor with a particular program, interest, or project that they're passionate about. For example, we have a good relationship with Pembina Pipelines, and the two interest areas from their strategic priorities are supporting Indigenous peoples and providing access to food. So when they approached us and said they'd like to support The Alex, the most logical step was to support our community kitchen, because we know that we serve a great deal of Indigenous people. We had them on site here last week, and what they wanted to know was: what was the impact of the dollars they are giving to The Alex? And it wasn't just about the number of people we were serving, or the number of Indigenous people we were serving, or the number of meals. What they wanted to know was: when you're having those connections with folks, what else are you able to offer? What other services can you provide to them that would increase their quality of life? Many of the folks who access The Alex are unstably housed. They are homeless. They're living well below the poverty line, and so the meal is the entryway into our organization, which provides that other robust buffet of services that can actually help people on a day-to-day basis. And we have found that if we try to take funding from somebody whose strategic priorities do not align with ours, or whose desired outcome or problem does not align with ours, then it's an absolute misfit. So having those conversations early, up front, to align what you're both trying to achieve is absolutely critical.

12:01 JAMES: And does it sometimes happen that you move away from a funder because their expectations differ from the outcomes that you're able to provide? Does that happen?

12:13 JOY: It does, and it happens for very different reasons. Sometimes the funder has an idea of what they'd like to achieve, but doesn't understand who within the ecosystem is best positioned to provide that level of service. And so sometimes they'll come at it very specifically. For example, we're working with a donor right now and it's quite challenging, because what they're hoping to achieve is very hard to measure. We work with individuals who we don't see every day, so sometimes it's really difficult to understand whether the food we are providing, the two nutritious meals that we provide every day, will actually contribute to somebody's health and well-being and improve mental health. And so we are having conversations with the funder to say: look, generally people eat three times a day plus snacks. If we're only seeing people twice a day, is the data that we are getting from these folks accurate? Is it reasonable to assume that their mental health is improving based on two meals a day?

13:29 JANIS: Yeah, there are similar challenges on our end at Alberta Ballet. One of the programs we've been piloting over the spring is a new series of programming called Alberta Ballet Outreach classes. These are free beginner ballet classes that we're doing in communities around the province, but also for immigrant youth, and among the outcomes we were hoping to achieve are supporting mental health and building self-confidence, all of these things that are very beneficial when you're maybe navigating a challenging time in your life. To measure that over a six- or eight-week program is obviously very challenging, and something like mental health can also change from day to day. You might measure it one day and they might self-report that their mental health is here, but what is it the week following? So again, it's challenging with capacity issues, or with your access to that participant in the program: if it's only for a short period of time, those outcomes can be very, very hard to measure.

14:34 JAMES: Yeah, it feels like the importance of transparent conversation up front and ongoing dialogue is paramount. Otherwise, I guess you could chase the funding with your programming, and that's not the right way around to think about it, is it?

14:51 JANIS: No, and I think the magic for me has happened when that funding partner wants to be involved a little bit more in the process, or wants to understand. When it's collaborative in that way and there's an exchange of education and learning happening, that's the sweet spot in those partnerships: you're growing together and figuring out what you can measure and whether it's beneficial for both parties.

15:20 JOY: I know for us we've done long-term evaluations and we've done quick-cycle evaluations, looping back to the funder and saying: hey, this is what we've learned, this is what we've discovered. Are you still with us? Do you still want to participate? Are you still interested? Even though you came in at this point, these are the new learnings we've got, and now we need to pivot over here, and it's only been six weeks. So we'll do a six-week quick evaluation and loop back. For example, we're involved in the community mobile crisis response project, a social innovation project between The Alex, the City of Calgary, and the Calgary Police Service, and we've got quick evaluation turnarounds, so we're re-evaluating every month to find out: are we responding to the right types of calls? If we're not, why not? Are we getting the right types of calls? If not, why not? And so we've continuously pivoted and shifted that project, and we only started in the middle of February. Having a partner with whom that ability to pivot exists is critical, as opposed to a partner who comes in and says: no, this is the only way we would like to see the project evolve, and you're only going to get the money if you do this.

16:49 JAMES: That speaks to some of the work we do in the Social Impact Lab, Joy, because if we're looking at an innovative project, what you're testing is whether something works in the short term. We're trying to test and prove assumptions with an experiment or a short-term piece of work, which might then allow us to pivot and move in a different direction. So that's interesting, that you have that relationship with some funders, to be able to work with them and take them on that learning journey with you through the innovative projects. Any thoughts on that from you, Janis?

17:36 JANIS: I think it's a shift that's exciting to see within the social and community investment sector. Even the corporate funders, as they grow capacity in their organizations for these kinds of partnerships, have more resources to develop these kinds of relationships and come at it with the approach of: yes, we might be doing an experiment, we're testing a hypothesis, and it's OK if it doesn't turn out exactly how we want, rather than just printing our pretty report and saying look what we've done. That shift, I can feel it happening in the conversations that I'm having with funders, and it's really exciting, because that's where the real work happens, the work that makes you feel good about what you're doing, knowing that you're approaching it in a sincere way.

18:32 JAMES: And just moving to how those relationships might develop, and thinking about qualitative versus quantitative measures or outcomes. It's a bit like art versus science. When you're evaluating very complex things, there are some outcomes that might not be capturable through quantitative measures. So how do you handle that?

19:00 JOY: So I'll kick-start us off. Several years ago we identified that we really needed to bring an Indigenous health strategy to the work that we do. We see many folks who identify as Indigenous, and we have listened to and learned from our Indigenous colleagues, who said it's not all about the numbers. It needs to be about the story, and the story helps frame what the numbers can do. So we have put our evaluators through an Indigenous evaluation course so that we can relay our stories better to folks and capture that blend of what needs to happen across traditional Western and Indigenous approaches. I think it's been really eye-opening for us to be able to do that. We find that when we're working specifically with government, they're very focused on the science part of it, whereas when we're working with Indigenous organizations or Indigenous staff, and from a cultural perspective, it's the art part of it. For us, we're still on what I would call a journey to be able to report back to some of our funders and donors and blend those Western ways with traditional ways, because one way is not always the right way. It's that blend.

20:33 JAMES: Would you do that storytelling in circle with Indigenous elders?

20:40 JOY: We have an Elder Advisory Council and we have Indigenous gathering circles, and to be able to tell that story, we've invited funders into that Indigenous gathering circle. Then we've been able to capture some of those stories specifically. For example, we have Elders on site; we are funded through different sources to provide an Elder on site to offer cultural support, cultural healing, and cultural practices to Indigenous clients. Realistically, it's not possible for us to ask an Elder to provide a report to us. So what we are now doing, to be able to capture that impact and be transparent with the funders about what we're doing, is capturing the stories of the Elders: having one of our staff meet with and interview the Elder and capture their story, for us then to be able to report back in more of a traditional Western approach.

21:41 JAMES: Do you have that situation where outcomes just can't be measured?

21:48 JANIS: Yeah, speaking to the quantitative and the qualitative, to me they're equally important. Just like Joy said, typically some of the government funders want those clear data-point metrics, which are absolutely important. But to me the qualitative is the why. It's the context to the quantitative that helps you understand what those numbers actually mean. And again, I think we're seeing a movement where there is a little more prioritization of, or respect for, the qualitative, which is very exciting to me. Again, I come from a PR background. Humans relate to story; it's the easiest way for us to communicate impact, and that qualitative data can be personal narrative. An example: we recently piloted immigrant youth dance classes at our studios in both Edmonton and Calgary, and I went to the first class and the last class. In that first class there was a young girl from Ukraine who was clearly very overwhelmed at first with the language barrier. She did not have a lot of English, but she was able to follow the teacher and do the moves, and by the end of the class she was visibly happier and in a better mood. By the last class, eight weeks later, her English had come on leaps and bounds and she was having a blast. And I know that wasn't just from coming to ballet class each week; there are all the other things going on in her life that are contributing to that. But just the self-confidence that I saw, and that progress in her English over eight weeks. So that observational qualitative data is also very valuable. But going back to what you can and can't measure: that's not a controlled experiment where we can say it was exactly because she came to ballet class each week that we saw this progress. There are lots of things going on that are impacting that outcome.
So yeah, it is challenging, but I think the relationships, and the evaluator and the project team being more involved in the project, are really helpful for gathering that observational qualitative data that helps you understand the nuances and complexity that don't show up in just "OK, we had this many people participate."

24:21 JAMES: Yeah, that's a lovely story, and actually a very good example. We can all relate to that, even if you can't necessarily tie the ballet class to the outcome. But you did talk there about evaluators working with the project team, so I want to ask you both about that as well. How important is it for them to be almost embedded within the project team, Joy?

24:45 JOY: When we have a project, we embed the evaluator right into the project team. When the idea starts to come together, we already start to think about evaluation and how we're going to measure the impact. We have a team here; we have a manager of research and evaluation, and that individual works directly with each one of our program leads. She gave a beautiful example last week in our leads meeting: look, if you're collecting data and you haven't done anything with it in over a year, you've never looked at it, never measured it, never analyzed it, why are you doing that? What is the purpose of that? She's newer to our team, and it was this aha moment for all the leaders in our organization: right, so why are we capturing that information? If you're not doing anything with it, stop doing it. Let's talk about the data that we need to collect to figure out what inches impact forward. And so we're in a reset mode here, because I think traditionally in healthcare the funder, such as Alberta Health Services, has told us what they want to measure. Whereas we're the ones on the ground, the boots on the ground; we're having the direct impact, we're seeing the clients each and every day. So we're starting to turn the tables, some of that social disruption, to say: I know this is what you want, but this is actually what you need to see, and this is the data that we're collecting. We're in that social disruption phase, trying to tell a different story within health, and I think it's kind of exciting.

26:29 JAMES: That's a lovely story as well. It also speaks to something else I want to probe: it feels like unless you're really careful, and unless you're critical about what needs to be evaluated, the evaluation itself could become incredibly overbearing to projects and work. Is that the key? To embed an evaluator and have them constantly challenging whether and why you're collecting data, and what the key data to collect is? Is that the way to prevent it becoming overbearing to the work, would you say, Janis?

27:06 JANIS: Well, I'm interested in what Joy has to say about this, to be honest, because for Alberta Ballet, I think we're at the beginning of a new journey in evaluation. We don't have a manager of research and evaluation, which I would love to have. We've collected a lot of data over the years, a lot of quantitative data I would say, and we do use a lot of it. So I'm proud to say that we don't have too many of what I like to call the dusty binders on the shelf; we don't have a lot of data that gets thrown in a closet where no one looks at it. For the programs that I'm working on right now, I'm both operationalizing them and in charge of the evaluation. So I am deeply embedded, which I like, because it's helping me understand those complexities and nuances: how can we measure this? Is what I'm asking the project team to measure even possible? I'm deeply involved in that collaboration to make sure we're setting ourselves up for success in the evaluation process as early as possible. But I'm interested in what Joy has to say, because I feel like she's got a sophisticated team over there.

28:27 JOY: We have a great team, which is true, but we have gone down certain paths. We work with physicians, and sometimes a physician or a program manager or somebody on the evaluation team has a specific interest in researching and evaluating something that, as an organization, we have not agreed to, or that the funder hasn't indicated they wanted to pursue. So we've had to be super careful about ensuring that whoever is assigned to that project team does the evaluation within scope. Because when you've got somebody with a brilliant mind, and somebody who has access to vulnerable people, it needs to be ethical: are you actually capturing the data from the clients based on the project and what you're trying to achieve? There have been times when conversations have been crunchy internally, where a physician has wanted to take the organization down a certain path, really wanted to allocate resources that are not funded, but thought that the scope of the project could be tweaked a little to investigate something over here. So we really had to focus our efforts and look at what's ethical, what we are resourced for, and what outcome we're trying to achieve.

30:12 JAMES: In your world, Janis, with the ethics behind data collection and evaluation, do you find the same kinds of pressures sometimes that Joy has experienced?

30:25 JANIS: Yeah, I'm happy that Joy brought that up. We haven't run into anything too crunchy, as Joy has, yet, but it's definitely something we're quite conscious of. As you can imagine, when it comes to looking for funding and wanting to make programs attractive to funders, you have to be very clear about the path that you're on and what your north star is, because a funder can come in with a priority and say, we're really interested in this, and you start to maybe veer off. Sometimes it might make sense; they might expand what you think the program could be. But sometimes it's either going to stretch capacity and resources that you don't have and dilute that project, or it can take you off path pretty quickly. In terms of the ethics of collecting the data, what I'm really conscious of is working with the partners and stakeholders that we have. Going back to the example of our welcoming newcomers program, I refer and speak a lot to the immigrant-serving agencies we work with, to ask: what would be the best method of collecting this data? Do you think this is appropriate? How would this be perceived? Is this too much of a burden on the participant, or even on your organization, if we were to ask for this, to collect this data? So again, it goes back to asking your partners what they perceive as appropriate as well, and that collaborative approach has been really supportive for me and the work that I'm doing.

32:04 JAMES: That's good, thank you. You've both talked about the fact that evaluation has changed and matured in the last few years, possibly driven by the competition Joy mentioned at the start of the conversation. The discipline itself has matured. Are there any directions it's taking that you're really excited about? Are there new initiatives, if I can call them that, in evaluation which excite you?

32:32 JOY: I've got a really cool example from the pandemic. As the pandemic was rolling in, I'll never forget the day, it was March 8th of 2020, we were approached by government to say: hey, can you folks stand up an isolation facility for people who are homeless who may get this COVID thing that's going around? And I was like, well, we've never done that, and they said, well, can you just do it and we'll figure it out? So we took this massive leap of faith as an organization. We stood up a hotel in the city of Calgary and we supported people with COVID-19 throughout the entire pandemic; the facility was open for 26 months. Right up front, we figured we had better be capturing data so we could tell this story, because this is a once-in-our-lifetime, hopefully, pandemic that we're all going to endure. So we started to take in data from clients, and of course we've got really health-specific information on clients; it's highly confidential. What we ended up discovering through our robust evaluation was a lot of unintended positive consequences of the program we had set up and the project we had created. This has allowed us to have conversations at the provincial level about the need for what's best described as a step-down hospital for people experiencing homelessness in the city of Calgary. If anybody listening has gone into any of the emergency rooms and seen somebody who is homeless there, being at the Foothills Hospital emergency department is probably not the best place for that individual. You need somebody who's highly skilled, highly trained, and who has expertise in working with somebody who is intoxicated, dysregulated, unmedicated, or very mentally unwell. And so we are now in conversations to have, not a facility of sorts, but a specialized project for people experiencing homelessness when they need to go into hospital.
Or, if they've been in hospital for any specific reason, they then need discharge planning. If you're homeless and you're discharged, and you've just had significant surgery or a severe infection, going back onto the streets is actually not good for you, because you're more likely to come back into the hospital, readmitted for even more serious things. So the unintended consequences of putting together a project to solve one problem are now helping us as a society, as a system, solve another issue. For me, that's the exciting part of what evaluation can offer, when you don't just whip together a project and see how it goes. When you're methodical, when you've got a plan in place, when you're collecting data for a certain reason, and then you're open to the new possibilities, I think that's really the exciting part.

35:44 JANICE: I love that example. That's amazing.

35:47 JAMES: Isn't it? And in your world, Janice?

35:49 JANICE: Yeah, I don't know if a specific example will come to mind, but I'm just excited that the arts in general are having this deeper conversation about how do we actually prove and show our impact? I've been connected to the arts for many years, and in the past there was sort of this sentiment of, we know the arts are good for you, like take your medicine. We know that they're good, and I think on a personal level the arts have a good societal impact, but I don't know if there's ever been, especially in Canada, as much good research and evaluation happening as there is right now to show what the actual evidence says and what those outcomes are when people have the opportunity to participate in the arts. So that conversation is really, really exciting to me, and I think there will be lots of learnings from that that will point us in different directions. What's personally exciting in my work at Alberta Ballet is we did a project last year where we worked with one of our newcomer agencies to do a thorough audit of what the experience is like for a newcomer when they come to the theater to see an Alberta Ballet performance, and there were a lot of interesting insights about what they're seeking, and it's deeper connection with the organization. So I'm going to be spending some time over the summer looking at how do we deepen that relationship, how can we create more of a community connection point for immigrants at our performances when we're hosting them. Those projects are really exciting to me.

37:38 JAMES: Thank you. Thanks very much. That's great. So we're coming to the end and I just have one last question, which is a round up question. Is there anything we've missed? Is there anything we haven't discussed about evaluation? Anything that you wanted to say when you came on this podcast, but we haven't got around to? Any thoughts like that?

38:00 JOY: For me, I think I'm excited to know that evaluation is actually tied to much of the funding, as opposed to the old-style way of doing things. I mean, yes, all of us may have benefited from those strong relationships, but if we're in a time of competition and scarce resources, I would prefer that an organization that is making a deep impact, and is able to prove it, is the one that's actually receiving the funding. So I'm actually very supportive of best practices and an evaluation component, and I think it helps, especially in the not-for-profit world, it helps professionalize our work. It helps validate our work, and it helps with transparency in terms of where those dollars go. So I'm very supportive of evaluation.

38:54 JANICE: I agree. For me, I think what I would want to emphasize to anybody listening who is in an organization that doesn't have a lot of resources or a lot of capacity for evaluation, I would really encourage you to not feel overwhelmed. It doesn't have to be rocket science. Just starting small and building a culture of evaluation is the best place to start, and don't feel like you have to have all this amazing software and tools and very sophisticated evaluation methods. Just start somewhere. And that's been the learning lesson for me over the last two years: there are a lot of great resources out there to help you get on this journey and keep improving and working on how you evaluate and how you collect data. So don't shy away from it. If you think, oh well, we just don't have capacity, you probably can start somewhere. A great course that I was able to do over the last year was with an agency called Good Roots Consulting. They're actually based in Ontario, but they did a fabulous, I think it was a four-week, webinar on evaluation fundamentals for the arts specifically. And it was so advantageous. It was really tangible, breaking down what a logic model is, starting right at the basics and giving you the tools to do it from the ground up, a little bit of DIY evaluation. And there are lots of resources out there; we live in such a great time of access to these tools. I could attend a webinar every single day on evaluation. So I would just encourage people to start that journey if they haven't.

40:40 JAMES: That's a great way to end. So thank you very much, Joy and Janice. Thanks for your time today. It's been really great talking about this, really insightful, and I hope it's been useful to our listeners, so I really appreciate you sharing your time and expertise with us today. And for our listeners, thank you very much for joining us. Keep an eye out for our next episode of Responsible Disruption, where we will be discussing systems thinking and how this theory is used to address issues in complex structures. So until next time, goodbye.

[Outro music]

That's all for today's episode of Responsible Disruption. Thank you for tuning in and we hope you found the conversation valuable. If you did, don't forget to follow, rate, and share wherever you get your podcasts. To stay up to date on future episodes and show notes, visit our website at socialimpactlab.com. Or follow us on social and until next time, keep on designing a better world.