Community IT Innovators Nonprofit Technology Topics

Nonprofit AI: Canvas Hack, Candid Advice, AI for Nonprofits Book

Community IT Innovators Season 7 Episode 36


Carolyn Woodard covers three topics this week: what the Canvas ransomware hack reveals about vendor risk for nonprofits, why roughly 80% of nonprofits still have no AI use policy and what you can do about it today, and why the new book AI for Nonprofits belongs on your leadership team's reading list even as AI tools continue to evolve rapidly.

The Canvas story is at its core a governance story. When 30 million users depend on a single platform, a breach affects everyone who trusts that vendor. Nonprofits can't out-analyze the cybersecurity of major vendors, but you can make sure you have cybersecurity insurance that covers third-party breaches, a communications plan ready before a crisis hits, and solid data backups separated from your main systems.

The bigger takeaway for AI governance: most nonprofits are already using AI tools without any organizational guardrails in place. You don't need a full formal policy to get started. A one-page declaration of principles, a commitment to paid enterprise tools over free versions, and a habit of documenting what's working can give your organization a meaningful foundation.

And finally, AI for Nonprofits by Cheryl Contee and Darian Rodriguez Heyman is worth your leadership team's time, especially for its strategic framework and breadth of expert voices weighing in on AI uses across fundraising, communications, and program evaluation.

Resources Mentioned:

New every Tuesday.

_______________________________
Start a conversation :)

Thanks for listening. 


Carolyn Woodard

Good morning and welcome to the Community IT Innovators Midweek Nonprofit AI check-in. My name is Carolyn Woodard. I'm your host.

Carolyn Woodard

I am not an AI expert. I am interested in AI and have worked for nonprofits. I work for a technology company that serves nonprofits only. So I'm really interested in just exploring this new world of AI and helping nonprofits understand the resources that are out there, news stories, answering questions. You can get questions to me via Reddit or on our website. And just helping us all understand these tools a little bit better and grow our AI literacy, so we're making informed decisions about these tools when we use them at our organizations and with the communities we care about.

Carolyn Woodard

Today I wanted to touch on the Canvas hack from last week. If you have a kid in school or college, you probably got an alert about this big problem with Canvas. For a day or two, people couldn't log into it, and alerts were going out about student data being stolen. Just a quick note that on Monday, May 10th, the Utah-based firm Instructure, the parent company of Canvas, did announce that it had reached an agreement with the hackers to pay the ransom to avoid the leak of that student data. So if you are following this story, I will also include a link to the story about Shiny Hunters getting the ransom that they had demanded.

Carolyn Woodard

So this story is about education, and of course it's mainly colleges and schools that are using Canvas. So if you're a nonprofit working in education, you've heard about this and you're probably interested in it.

Carolyn Woodard

I think for all of us, it has some layers that we can unpack beyond just the education part of the story. Of course, it impacts all of us who are parents, or who worry about our students' credit, our teenagers and college kids.

Carolyn Woodard

But it's also about how all of us are using widely adopted tools, trusting our vendors to keep our data safe. It's about what questions you can ask up front, what you can understand about what your vendors are doing and what you're covered for in your contract with those vendors, and how the concentration of the market in a few industry leaders means that when one of those leaders gets hacked, it has very widespread implications.

Carolyn Woodard

That connects directly to conversations that we can have about AI governance at our own organizations.

Carolyn Woodard

I will share some news stories and resources about this Canvas hack in the show notes. But basically, Canvas is classroom management software with 30 million users, used by roughly half of all higher education institutions in North America, plus many K through 12 districts. It is particularly used for posting homework online, class assignments, communicating with students, getting extra help, getting one-on-one time, and of course, finals.

Carolyn Woodard

This week, when a lot of people are having finals, a ransomware group called Shiny Hunters hacked it during exams. The data that was at risk was names, email addresses, student IDs, grades, coursework, and private messages between students and teachers. I heard one news story where a professor said the hack went down in the middle of the final. So what are those students who were halfway through taking that final going to do? Are they gonna be able to get their work back? Are they gonna have to reschedule the final entirely? How is that gonna work?

Carolyn Woodard

It turns out that what they believe at the moment is that the entry point for the hackers was the free for teachers accounts, a free tier that was connected to the enterprise level.

Carolyn Woodard

So, as we say often on this podcast, make sure you're logging in with your work email, and make sure you're getting a license, if that's at all possible in your budget, for the paid version, the enterprise version, over the quote unquote freemium versions. This is kind of a case in point for that, although it didn't protect the institutions that had these enterprise accounts.

Carolyn Woodard

Of course, the data is valuable to the hackers in terms of credit fraud. So if you have a teenager, a child, or a college student, it's important to be monitoring their credit so that someone can't set up a credit card or a mortgage in their name using that identity. This is the largest single-point breach of student data on record. But this student data is very valuable to hackers, so it's not going to be the last one, that's for sure.

Carolyn Woodard

I want to take a step back. For years, Community IT has recommended standard, widely adopted tech stacks using reputable tools: industry standards, best of breed. One of the reasons we do that is because way back in the day, when I was first in nonprofit technology, 20 years ago or more, there were a lot of customized solutions. If you were working in nonprofit IT then, you remember there were customized databases, customized emailing solutions, customized equivalents of what would now be a CRM. People built very customized websites that ran off of customized databases.

Carolyn Woodard

And the problem with that was that it was hard to find someone else who could work on your tech, right? So you would often get locked in with a single vendor, a single consultant who knew how your database worked. And they could basically charge you anything, because you couldn't operate without that database operating.

Carolyn Woodard

So, in general, for 20 years, we've recommended that our clients move to very standardized tech stacks where there are a lot of consultants. There's lots of opportunity to change. If something isn't going right with your consultant or your vendor, you can find another consultant who can work with Salesforce or Microsoft or Google Workspace. So we generally recommend those industry leaders as the safest bet for your investment in IT. As I said, you'll have lots of people who can help you going forward, and they're very stable platforms in general.

Carolyn Woodard

But the honest tension, which surfaced in this Canvas hack, is that standard tech can be the biggest target. The data prize for a ransomware group is largest when they can get into an industry-standard or industry-leading vendor. Of course, an industry-leading vendor, one hopes, has a lot of cybersecurity built into their platform and into their contracts with you as their client.

Carolyn Woodard

This Canvas hack really surfaced that vulnerability. When there are only one or two vendors that have this huge market share, it makes sense for you to go with them because they're standard. But if they suffer a hack, everyone who uses them is going to be impacted.

Carolyn Woodard

So, what can nonprofits do? You can do what you can with vendor vetting, you can review the terms of service, you can ask general questions. But it's not in general within the purview or the capacity of most small and mid-sized nonprofits, or even larger nonprofits, to out-analyze the cybersecurity threats of major vendors. And you shouldn't have to. That's why you signed up with a large vendor.

Carolyn Woodard

Make sure that you have cybersecurity insurance that covers third-party vendor breaches, that you understand those contracts, and that you have looked at the terms and conditions. It's true that large vendors have these problems.

Carolyn Woodard

Smaller vendors also, of course, have these problems, and you have different difficulties with a very small niche vendor. If they go out of business, is there anyone who can replace them? A small niche vendor may also not have the same level of cybersecurity in place, so they are a target for hacks and attacks too.

Carolyn Woodard

But just make sure that you are mitigating risk where you can. This is something that your board can help with and something that your leadership team should be involved in, beyond your IT director. Your leadership team and your board are really central to that risk mitigation: building up those layers so that you have insurance, you understand the contracts, and you're using a reputable company that has its own cybersecurity in place, as much as possible. But you're never going to be able to 100% prevent cybersecurity fraud with all of your vendors, right?

Carolyn Woodard

Of course, if you're holding student data, young people's data, any sensitive data about the people you serve and the community you care about, I would recommend having a communication plan ready now, before something happens with a third-party vendor or to your own systems, right?

Carolyn Woodard

When a breach occurs, your community is gonna look to you. Constituents, the people in the communities that you care about, when they hear about some massive hack that they associate with you and your services, they're gonna ask you: how does this impact us? Is there something we need to worry about? How can I secure the credit applications for my student? All of those things.

Carolyn Woodard

You can't anticipate every single hack or breach or security issue that's gonna impact you and your nonprofit. But having a clear line of who does the communication, and a generalized communication plan, in place before something happens is always better than scrambling around trying to create it in the midst of a hack. So if you have an incident response plan, that communications plan should be part of it.

Carolyn Woodard

If you don't have an incident response plan yet, and I emphasize yet, make sure that you put one in place. We have a ton of resources on our website for how to build up that incident response plan and what it should include. I will include those in our show notes too.

Carolyn Woodard

The ransomware aspect of the story I also thought was so interesting. If you missed our April webinar on the cybersecurity incident report, drawn from data on our own end users, we showed that wire fraud has been much more dominant over the past four or five years. We still have not really seen ransomware attacks on our clients. That doesn't mean they aren't occurring; they do exist, and clearly this is an example.

Carolyn Woodard

In general, we were theorizing that bad actors prefer wire fraud: they get the money right away, rather than holding something for ransom and getting you to transfer cryptocurrency, which you might not even know how to do. It sets them up for more vulnerability in what they're trying to do. So, in general, we've seen a lot of wire fraud and phishing and spoofing over ransomware.

Carolyn Woodard

But we did see a jump in malware and viruses last year that we caught and were able to mitigate. We think it's related to vibe coding with AI. AI-assisted development makes it a lot easier and faster for bad actors to build and deploy new viruses. So I think that's one of the things that's happening.

Carolyn Woodard

Be aware of ransomware, of course. The biggest way to avoid being a victim of ransomware is to have those backups. And those backups need to be completely separated from your systems so that you have something to restore from if your whole system is locked out.

Carolyn Woodard

Again, just revisit your cybersecurity plan and your incident response plan, make sure you have those backups, and be prepared for ransomware, malware, viruses, wire fraud, all of those things.

Carolyn Woodard

So the Canvas story is basically a governance story and a cybersecurity story. The schools that were exposed were relying on a vendor for their security. The same dynamic can play out with AI tools, and most nonprofits, we know from the research and from surveys, are using AI without policies in place.

Carolyn Woodard

I wanted to share another resource with you from Candid. Candid has a practical guide called What If My Organization Has No AI Use Policy. It was published in February, so a couple of months ago, but it's still really valuable.

Carolyn Woodard

They found that roughly 80% of nonprofits lack an AI use policy. If that speaks to you, we do have a template on our website that you can download, and you can work with an AI tool to help you build an AI policy. You need to customize whatever you use to your organization and your situation: what you do, what your values are, what you're using AI for already. Have a good conversation with your staff.

Carolyn Woodard

Even if making a whole official policy is super intimidating, and it seems like it's going to take a long time to get all of the partners and people who should be involved involved, even just having a one-page declaration of principles that you can share with all staff and get staff to weigh in on is a lot better than zero, nothing, not having talked about it. So if you can do that, I would recommend doing that.

Carolyn Woodard

You know, this week, the Canvas hack, the other hacks that we're seeing now, and stories about AI issues, especially with agentic AI, make it really important to spend some time on this sooner rather than later.

Carolyn Woodard

The guardrails include never using the free versions of AI tools at work; they don't carry the same data protections. The Canvas hack wasn't about an AI tool, but it did involve a free tier version that had these vulnerabilities. The same applies across the board: if you're using a free AI tool, they are using your data. That's why it's free.

Carolyn Woodard

So upgrade to a paid version. A lot of the paid versions have different tiers; maybe you can afford the budget for the lowest tier, or just a license for the one person at your organization who needs to use that tool. There are lots of budget-friendly ways to do this. But having that agreement with the AI tool vendor gives you the additional business protections that you're gonna want to have for your data: for copyright issues, for how they store your data, if they store your data, if your data is going back into their model, all of those issues. So make sure you have that agreement.

Carolyn Woodard

Never input sensitive data into AI tools. Check that training is turned off; there are ways to check that. And even with de-identification, like with student data, there's a lot that can be found, even if you think you've scrubbed all of the personally identifying information out of it. There still may be metadata that tells where you are, what institution you're at, something identifying about that student.

Carolyn Woodard

So in general, we would say be very, very careful about sensitive information that you're uploading, especially to those quote unquote free freemium tools.

Carolyn Woodard

Another good recommendation that comes out of this Candid article is documenting your experimenting: what's working for you with the AI tools you're using, what isn't working, what you've learned. When you share this with your organization, even if the organization doesn't have an official AI tool policy yet, your own kind of mini tool guide, of what your ethics are and how you're using these AI tools in the most secure way that you can, could be very helpful as more people come on board. They could adopt what you're doing.

Carolyn Woodard

You could build a more enterprise-wide, organization-wide policy from your initial notes and learnings, your ethics and values, and the practices you're putting in place for the ways that you use those tools.

Carolyn Woodard

So this is just a really good article; I'm gonna share it in the show notes. We also have some more information on our website about AI tools, AI policies, and the safe way to use those AI tools. We did a webinar on that back in February, so I will share those links as well.

Carolyn Woodard

And the last thing I wanted to say quickly is, if you listened to the Friday podcast, it was just a wonderful conversation with Cheryl Contee, who co-wrote the book AI for Nonprofits. I was lucky enough to see her at a conference earlier this year, and I asked her to be on our podcast, and she was so generous to share her time and her expertise with us. So if you missed that, you can go back and listen to it. It's the one from last Friday.

Carolyn Woodard

I wanted to clarify something about this book. I have dipped into it; it is amazing. And I had questions myself, and you may have questions, about how valuable a book is when AI tools are moving so quickly. Wouldn't it be more valuable to find some online resources that are really recent? I would say yes, for the actual how-to with the tools themselves.

Carolyn Woodard

Where this book is so valuable is that it talks a lot about strategy and process and the sorts of things that you can use AI tools for, with some examples, rather than how to use this exact tool to do this exact thing.

Carolyn Woodard

All of the chapters are written by guest authors, each a leading thought leader in their aspect of using AI, for marketing or comms or fundraising or proposal writing or program evaluation. So you get all of this great wisdom and lots of perspectives in one book. It doesn't just have one voice, but then Cheryl and her co-author tie it all together into this strategic framework for thinking about AI at nonprofits. A shout-out to both authors, Cheryl Contee and her co-author Darian Rodriguez Heyman.

Carolyn Woodard

They provide this overarching structure and pull these chapters together into what it all means. And it's very accessible. You can embrace it if you have readers at your nonprofit, maybe in leadership, who are more comfortable with a book or with excerpts than they would be if you gave them a website to go to to learn about AI. I really recommend this book. I've been using it a lot already, and I'm sure I'm going to be dipping into it a lot as we go forward.

Carolyn Woodard

So, you know, just add that into AI governance: thinking about your organizational values, your leadership, your philosophy, and the constituents that you care about, the communities that you're working in. All of those threads need to be drawn together in the beautiful tapestry of your AI policy, with the understanding that it is an evolving, changing, multifaceted policy going forward.

Carolyn Woodard

I want to close with a thought: the organizations that are navigating this new era well aren't necessarily the ones with the most AI tools, or that adopted AI fastest or earliest, or that are using AI for their mission right away with a lot of grant money coming in to immediately change what they do because they can use AI to do it.

Carolyn Woodard

I think the organizations that are gonna be successful, and I want you all to be on this list of nonprofits that are going to be successful using AI and continuing to be successful in their missions in this new era, are the organizations that ask the right questions, that are cautious and informed about adopting new tools and working with new vendors, and that look for the opportunities and want to be fully informed as they jump into those opportunities.

Carolyn Woodard

You want to be asking those questions and thinking about your relationship with your vendors, with data, with your constituents, your communities. And all of that needs to be wrapped up in this envelope of what you care about, your values, and what you stand for.

Carolyn Woodard

You know, a hack can happen to anyone. For people listening to this, it has probably happened to many of us. I don't want to victim blame; it's not your fault. There are bad guys out there who are trying to steal from you, get money from you, get data from you on vulnerable populations, data that they can use for financial gain, basically.

Carolyn Woodard

I don't want people to be too scared about it. But staying informed, moving forward, and using the tools that are available to you to be as secure as you can: I think all of those are going to be so important going forward.

Carolyn Woodard

We know that nonprofits are learning organizations for the most part, and we're curious. We ask questions all the time about the challenges that our constituents are facing and the root causes of the issues that we're working on and trying to make better. We're very well positioned to be asking those questions of all of these tools as we go forward. But don't get stuck in paralysis, question paralysis, or caution paralysis.

Carolyn Woodard

AI tools really represent a huge opportunity. So be informed, use these resources, and learn in the way that you learn best, whether it's a book, a website, a podcast, or a webinar. We're so happy that you're here learning with us. You can hear me on Fridays and again on Tuesdays with the Nonprofit AI check-in.

Carolyn Woodard

I just want to leave you with these thoughts and encouragement: you're doing a good job. Stay informed, and we'll talk to you next week. Take care.