Community IT Innovators Nonprofit Technology Topics
Community IT offers free webinars monthly to promote learning within our nonprofit technology community. Our podcast is appropriate for listeners at varied levels of technology expertise. Community IT is vendor-agnostic and our webinars cover a range of topics and discussions. Something on your mind you don’t see covered here? Contact us to suggest a topic! http://www.communityit.com
Data Governance for Nonprofits with Jeff Gibson
Best practices on improving data security, making the case for creating the policies, and the impact of AI on data security.
Carolyn speaks with guest Jeff Gibson from Build Consulting to answer questions about why data is important to nonprofits, how your data can become widely distributed across different tools and departments, and why it is important to do an audit and assessment to know what data you have.
Do you know where your data risks lie? Do you know whether you are in compliance with laws and with your internal policies? Do you have a data policy? What do you need to know about data governance for nonprofits?
Jeff has over 25 years in nonprofit IT, including as a CIO (Chief Information Officer), but recognizes that many staff at nonprofits work with data or use tools that create databases – but don’t have a data background or expertise with database management. That can create holes in your cybersecurity that cause real risks for your organization.
But it can be really difficult to see a need for specific data policies. It can be difficult to convince leadership to prioritize the time to do a data assessment and create data policies. And at small to medium-sized nonprofits without resources for a compliance team, it can be hard to monitor compliance with your own policies and with legal requirements.
Jeff shared his experiences and insights into these difficulties and offered some tips on making the case for the danger of unmanaged data. If you are feeling anxious about your data security policies, this discussion on data governance for nonprofits will give you ideas on how to move forward.
_______________________________
Start a conversation :)
- Register to attend a webinar in real time, and find all past transcripts at https://communityit.com/webinars/
- email Carolyn at cwoodard@communityit.com
- on LinkedIn
Thanks for listening.
Carolyn Woodard: Welcome everyone to the Community IT Innovators Podcast. My name is Carolyn Woodard, I’m the host. And today, I’m really excited to be talking to my friend, Jeff Gibson from Build Consulting about data governance policies. Jeff, would you like to introduce yourself?
Jeff Gibson: Hi there. I’m Jeff Gibson. I’m with Build Consulting. We’re a nonprofit consulting group out of DC, and I have about 25 or 30 years in IT operations, implementations, analysis, etc.
Thanks for having me.
Carolyn Woodard: Oh, thank you for coming. Thanks for making the time to talk about this topic. We have gotten some questions from some of our clients around not just creating policies for their IT governance but specifically around data governance.
Why Have Data Governance Policies?
And I know that you have a lot of expertise in that area. Can you tell us why you need a policy specific to data governance?
Jeff Gibson: There’s a litany of reasons, many of which come down to self-preservation for an organization. It’s not only compliance reasons, which depend on what your industry is. I have some background in the education industry, where there are HIPAA, FERPA, and GDPR considerations, plus the general right to privacy.
The ease with which people can access malicious tool sets means the bar has been lowered considerably in the last 5 to 10 years. So policies are needed not only for hardening the organization against external threats, but also internally.
You have a lot of people that are dealing with data that don’t have a data-specific background, that are using data as part of other tool sets. It’s important, especially from a policy perspective, that your users understand the potential risk that they’re incurring by how they use data, where they store data, who they share data with.
Vendors and Data Governance Policies
One of the things that gets short shrift all the time is vetting vendors, especially vendors that are service providers you’re sending your data to, and making sure you have a full understanding of what they’re doing with the data. If they combine it with other data and aggregate it to determine their product outputs, what’s happening with their vendors? Those kinds of things just get forgotten about all the time. You want policies that cover not only acceptable use for internal users, but external vendors as well.
And then within your own IT team, how are they storing that data?
Where are the backups going? Are backups encrypted? Are your cold backups encrypted?
Cybersecurity Insurance and Data Governance Policies
The other thing that’s really pushed a lot of this is cybersecurity insurance, or cyber risk insurance. A lot of those coverage criteria are, in a nice way, outlined for you in the policy documents. But that also really raises the bar for a lot of organizations that hadn’t previously had a lot of compliance controls in place, or training for that matter.
AI and Data Governance Policies
So, there’s a litany of policies. The most interesting one recently, and a lot of people don’t think of it as a data concern, but it’s probably one of the bigger risks that comes up, is AI. AI has just sort of inserted itself into organizations, generally brought in the back door by users who are like, hey, what’s this nifty thing and how can it help me in my job?
Less often, the question is: what’s the formal organizational AI policy? Those policies are all very much in their infancy right now, and AI is ever changing. But it’s important from a training and education perspective that users who are not traditionally in the IT area, or data stewards, are trained now to understand the potential risk of sharing data externally through the generative aspects of AI.
That data gets populated in the global library of information. If you have institutionally specific data, secret-sauce information that makes your organization special, or information about your users in there, that becomes part of a global library, unless you intentionally tell the tool not to use it, or you have an internal instance of a Copilot or something. So I think that one is probably the biggest one recently.
And people don’t really think about it; it’s just a nifty tool we can use, and it’ll help me. But what are you sharing? It’s important that everyone understands the implications of that.
And then you sort of have to retrofit the organization, because organizations and IT departments are being led by the nose, to an extent, by the sheer popularity of AI, the wash of AI sweeping through all industries, really, if you think about it.
Carolyn Woodard: Yeah, no, a lot of tools, like you get the update, and it’s like, here’s your AI assistant. You’re like, wait, I didn’t know I was getting that. I didn’t ask for that.
I want to put in a plug, just a quick plug, that we do have a template for an AI acceptable use policy for non-profits on our website, communityit.com, if you’re looking for something to write a policy specifically around AI in many different uses.
Blanket Data Security Policy vs A Data Policy for Every Tool and Platform?
But that leads me to a different question, which is that you could have data that would be valuable to your organization in lots of the different IT tools or platforms that you’re using. In your experience, does it make sense for a nonprofit to try to create some kind of blanket data policy?
Or does it really need to be that you do an assessment, figure out which tools have which data, and then write a data policy for when you’re using your CRM, or when you’re using AI for something else, or for your spreadsheets, or your list of volunteers, or your list of clients who are receiving your services?
Do non-profits do those kinds of policies piecemeal for the specific tool set, or do they do a broad one?
Jeff Gibson: I think yes and yes. It depends on where your risk lies. I mean, where are you at in this space and time?
Do you have a ton of users who are sharing data externally or do you have a ton of users who are adding new data? And is that data getting into your overall data repository or your disparate data repositories? And what kind of checks are being done on that data before it’s introduced into the wider ecosystem?
You want to have that blanket data security policy that covers a lot of your bases. It helps initially to get users to understand what they’re really dealing with. A lot of people don’t have much data-specific training or an understanding of where that data can go.
But also, you may have particular risks: again, say you’re a nonprofit, a hospital, or a college, or any organization that has European contacts or constituents, or Californian ones, for that matter. Depending on where they are, you have to take GDPR and similar privacy considerations into account.
What do I do with this data? How do I delete this data? If called upon, how would I provide a receipt of sorts that this data has been destroyed, removed, or aggregated to anonymity?
The one thing to add is that the data use policy also has to be developed, and trained on, in tandem with whatever data retention policy your organization might have to have because of insurance, or just because of what your operations are.
Most document retention policies are seven years. They’re shorter or longer depending on what the business is and how they justify that retention. But if you make a data retention policy that’s 15 or 20 years out because of the low cost of storage or whatever, that might not comply with the acceptable use policy in your organization. You could be retaining data in cold storage that another part of the organization is certifying to compliance management organizations or external auditors has been gotten rid of completely.
Those kinds of things have to be considered alongside other policies that are in place. That’s really why you want, and I don’t mean to dovetail off into an entirely different conversation, but a governance group, some kind of governance team, to look at the organization as a whole and make sure that one policy isn’t contradicting or in violation of another. Maybe the IT department is responsible just for data, but your document retention policy might sit with three other parts of the organization that have nothing to do with IT.
A governance team can oversee all of those disparate pieces at an organizational level.
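To picture the retention conflict Jeff describes, here is a minimal sketch, purely illustrative and not from the interview, of flagging records held past a stated retention period. The seven-year period and the record structure are assumptions.

```python
# Illustrative retention check: flag records held longer than the stated
# retention period. The seven-year period and the record fields are assumptions.
from datetime import date, timedelta

RETENTION_YEARS = 7  # assumed retention period; many document policies use seven years

records = [
    {"id": 101, "created": date(2015, 3, 10)},
    {"id": 102, "created": date(2022, 6, 1)},
]

cutoff = date.today() - timedelta(days=365 * RETENTION_YEARS)
overdue = [r["id"] for r in records if r["created"] < cutoff]
print(f"Records past the {RETENTION_YEARS}-year retention period: {overdue}")
```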
Carolyn Woodard: And I’ll put in a plug that we did a great webinar in May about creating governance, IT governance in general, and you had some really wonderful ideas and tips that were part of that Making IT Governance Work for Your Nonprofit webinar. I’ll let people go check that out themselves.
The Value of Data
But I think it leads me to ask you about the data itself. I know that most regular people, consumers, individuals, have heard a lot about the types of data breaches where credit card numbers or social security numbers got out, because of Target or some other company that had a database hackers were clearly very interested in.
For a lot of nonprofits, if they use a credit card processor, that’s kept completely separate from the data they hold on, as we talked about, constituents, students, volunteers, and other individuals who are in their database.
Can you talk a little bit about, like do you do a data assessment to know what data you have where and what is covered under these different compliance laws? Why should people be worried about data if it doesn’t include things like credit card numbers or social security numbers?
Why is data so important to your organization?
Jeff Gibson: I’ll flip the order of those questions if you don’t mind. With the addition of AI, you can take disparate pieces of data, assemble them, have a pretty good notion of what that persona is, and use that data for general malicious acts or for taking over someone’s identity. That’s probably one of the best ways to sell this to teams reluctant to take training and things like that: your information is in our database too.
It’s in our HRIS system. Your emails are in there. So, assembling disparate pieces of information is much easier than it was previously.
But also, you can assemble entirely different personas from people. One of the things that made student data in particular appealing was that, even if you only have social security numbers or some anecdotal pieces of data, most students aren’t checking their credit all that often. And a malicious actor on the dark web knows that they’re probably not going to be checking it for three to four years.
So that data itself is much more valuable on the dark web, because it says: once you validate this, you have three to four years of runway to do with it as you please before these folks are really going to be aware that they need to rein in their data.
I was a CIO of an organization, and one of our vendors had a data breach. It was nothing on our side, but five to seven people had second mortgages taken out in their names. So, it’s not little things, it’s big stuff. That’s another thing to consider too: your vendors’ notification policies.
I’ve had third-party vendors who have had breaches, but because of the legalese associated with what counts as a breach versus unauthorized access, they can argue that for six to seven months before they need to comply with their own notification policies. And the people who are affected have had seven months of malicious actors holding their information and had no idea. You really want to vet the vendors: vet the notification policies, the timelines, and what they’re legally including as a violation, or sorry, a breach.
When we’re training people, when we’re talking to people about security, we’re bringing it home to them, not just from the organizational perspective, but from the perspective of you’re part of this organization. You’re being a steward of your own data to some extent. Ninety percent of people are going to care about the organization they work for and the data and protecting it.
Data Assessments
On the data assessment side: one of the things I do is come into less mature organizations or smaller teams, where it hasn’t been a priority, or they don’t have the bodies to do the work, or they don’t have the best-practice people. I think the basic cost of entry anymore is: where’s my data? Where is it?
Even if it’s from disparate sources, it’s six of one, half a dozen of the other. If you have disparate sources, it’s much harder to manage across an organization. But if you’ve done the work and you have a centralized data repository, either way the data needs to be hardened, managed, backed up, and secured as part of your overall security planning and policy.
My top three: the first is the data itself. Where’s my data? Map your data. Map how it’s used. Map when it was last used. Is it still relevant anymore? Are we incurring unnecessary risks because we don’t use this data all the time? Can it be purged?
The second would be getting at least those basic policies and training in place for your end users. It’s just rudimentary stuff. They don’t have to know the why or the how, but the why might help them internalize it and act on it readily, instead of treating it as some memo from IT that’s just filler.
The third thing is, at least every couple of years if not once a year, reviewing your cybersecurity plan or having an external penetration test. It’s almost emotional, because as an IT leader you’re sitting there going: I don’t have the people for this. I don’t have the time. I have a thousand front-burner items.
At least knowing where your weaknesses are will help you incrementally bolster those areas, or conversely, centralize that data to make management easier. But if you don’t know, then it’s your best guess. The things that keep you up at night are the things you don’t know.
An external penetration test, you can get one for $3,000, or you can get them for $50,000. It just depends on how involved you want it to be. But at least at a rudimentary level, I think any organization should know where its weak points are. Whether it’s your data or your infrastructure, they’re one and the same.
I mean, for a smaller nonprofit organization, those would probably be my top must-have items, especially given all the change around data.
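To make the data-mapping step concrete, here is a minimal sketch, not part of Jeff’s remarks, of a simple data inventory. The field names and example entries are illustrative assumptions, not a prescribed standard.

```python
# A minimal data-inventory sketch. Field names and example entries are
# illustrative assumptions, not a prescribed standard.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataAsset:
    name: str            # e.g., "Donor CRM"
    owner: str           # department or role responsible for the data
    location: str        # system or vendor where the data lives
    contains_pii: bool   # does it hold personal information?
    last_used: date      # when was it last actively used?

inventory = [
    DataAsset("Donor CRM", "Development", "Hosted CRM vendor", True, date(2024, 5, 1)),
    DataAsset("Volunteer spreadsheet", "Programs", "Shared drive", True, date(2021, 8, 15)),
]

# Flag stale assets that may be incurring unnecessary risk and could be purged.
cutoff = date(2023, 1, 1)
for asset in inventory:
    if asset.last_used < cutoff:
        print(f"Review for purge or archive: {asset.name} (owner: {asset.owner})")
```

Even a lightweight inventory like this answers the "where is my data, and is it still used?" questions raised above.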
Getting Started with a Data Governance Policy
Carolyn Woodard: I think one of the things we run into with smaller to medium-sized nonprofits is that they aren’t very centralized, because often they’re very entrepreneurial. So, you have someone in a department who sees a need, a database they want to build up for a program they’re running, or for marketing, or major donors, whoever it is, and they’ll be using that tool, often very independently from the rest of the organization, just getting their job done, without a lot of coordination.
But certainly in smaller organizations, and especially once you grow to a certain size, I like your idea of having a governance committee that could do that assessment, or hire someone external to come help with one: really map out which department is building which databases, whether they have the training they need on why security is important, and whether they’re getting those vendor agreements and keeping them updated, so that everybody knows, and all of those policy documents can be centralized in one location.
Do you have any parting words from us on how to get started doing this?
If you don’t have a data policy now, what would be the first couple of steps that you would take?
Jeff Gibson: I think the acceptable use policy is an obvious first step. I mean, there are so many templates out there; Community IT has one. The acceptable use policy is probably the first thing you reach for as soon as you walk in the door of IT management.
Same thing with the vendor policy. Take the easy pieces first. There are templates out there for vendor vetting, and templates out there, obviously, to control acceptable use.
There are already templates out there for AI policies, and one-sheeters that you can hand out at brown bag sessions, or maybe you have an IT newsletter or a senior leadership meeting. Just start that conversation.
The other thing is to take those policies and get your non-IT leadership talking about them. Obviously, have the conversation in the background: what’s the organization’s posture on AI, its posture on data, those kinds of conversations. But in parallel, even if the cake is not fully baked, get out there and have leadership be consistent and persistent in their messaging.
Like, this is a thing. This is how we can use it. It can be a tremendous benefit, but there’s also risks involved.
There’s institutional knowledge that can get out there very easily. Even just explaining the rudimentary differences between sharing your query data with the world versus keeping it in house, or how to anonymize data just to get an answer out of an AI tool, goes a long way.
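As a rough illustration of anonymizing data before asking a generative AI tool a question, here is a minimal sketch. The patterns are illustrative assumptions, and real de-identification requires more care than this (names, for example, are not handled here).

```python
# Minimal redaction sketch: strip obvious identifiers from text before
# pasting it into an external generative AI tool. Patterns are illustrative
# and far from exhaustive; names still need manual review.
import re

def redact(text: str) -> str:
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", text)  # email addresses
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", text)          # US SSN format
    text = re.sub(r"\b(?:\d[ -]?){13,16}\b", "[CARD]", text)        # card-like numbers
    return text

prompt = "Summarize this note: Jane Doe (jane@example.org, SSN 123-45-6789) pledged $500."
print(redact(prompt))
```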
And I think the other one is making sure that anyone who’s able to sign a vendor contract, or who is involved with data or migrating data, that team or department lead, whoever is managing that data, understands what they’re doing and the potential risks involved, so that they can really justify what they’re sharing too.
You need to share what you need to share, but it’s always easier just to dump a whole file over to a vendor, and you’re probably giving them 80 percent more information than they need. So really, what are you trying to get out of this? What does that vendor need?
What’s the vendor going to do with the information?
If you can get those three things in line, you’re pretty well on your way, while also internally always being aware of how to minimize risk to data from both external and internal sources.
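As a hedged illustration of sharing only what a vendor actually needs, here is a minimal sketch of trimming an export to an agreed set of fields before sending it. The field names and example row are assumptions.

```python
# Data-minimization sketch: keep only the fields a vendor actually needs
# before exporting. Field names and the example row are illustrative.
import csv

FIELDS_VENDOR_NEEDS = ["first_name", "email", "membership_level"]

rows = [
    {"first_name": "Sam", "last_name": "Lee", "email": "sam@example.org",
     "home_address": "123 Main St", "birth_date": "1990-01-01",
     "membership_level": "supporter"},
]

with open("vendor_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS_VENDOR_NEEDS, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)  # extra fields like home_address are dropped
```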
There’s very little money involved in anything I just mentioned; the biggest currency is time. But this should be viewed as a priority in a way it hasn’t been in past decades, or even five years ago.
Carolyn Woodard: And I get a little bit nervous, because I feel like I hear people in the nonprofit sector getting really excited about AI, hoping that AI is going to fix a lot of their poor data management problems. Even if you haven’t been really on top of your data, maybe you’ve been keeping old databases, or data on your major donors or what have you, and there are going to be tools that will help you sort out what’s in your database. But if you haven’t gone through and done that assessment, and you don’t know what’s in your database to begin with, and who has permissions to access it, and you don’t have those vendor agreements in place when you bring in an AI tool to analyze some data you have, you need to do all of that homework, or prep work I guess I would say, before expecting an AI tool to just slice your bread and butter it as well.
Jeff Gibson: Yeah. At this stage, and it could be completely different tomorrow, I look at AI as a labor supplement, not an intellect supplement.
You’re still driving the bus, you just have more people helping you maintain it, or however far I want to extend that analogy.
But I just don’t think we can give it up and go, AI is right, I’m going to do whatever it tells me, because it really only is as good as its inputs. And people need to control how much of their data is being used as an input.
Carolyn Woodard: Those are all such great suggestions for us, Jeff. I appreciate your time today so much. Absolutely.
I hope that this is helpful to our audience to understand more about why data is so important to nonprofits and how to begin making those data policies if you don’t already have them. Thank you again for joining me today.
Jeff Gibson: Thanks for having me.