Community IT Innovators Nonprofit Technology Topics
Community IT offers free webinars monthly to promote learning within our nonprofit technology community. Our podcast is appropriate for a variety of levels of technology expertise. Community IT is vendor-agnostic and our webinars cover a range of topics and discussions. Something on your mind you don’t see covered here? Contact us to suggest a topic! http://www.communityit.com
Year-End Cybersecurity Tips with Matt Eshleman
It won’t be a surprise that AI is on everyone’s mind. Matt also shared thoughts on year-end cybersecurity tips and maintenance to ensure your cybersecurity practices – and permissions – are up to date.
Nonprofit cybersecurity expert and Community IT CTO Matt Eshleman offered these year-end cybersecurity tips.
- Do a permissions audit at the enterprise level – what are your defaults for sharing, and how are you training your staff to log in to various tools and subscriptions that can access your data?
- At a personal level, what are the safest ways to log in to your accounts? How should you be storing passwords? How are you setting your personal sharing and login permissions?
- Reviewing these policies and practices – and actually going into your systems and checking – is a good end-of-year cybersecurity task.
- Consumer Reports has good tools for people who want to reduce their data footprint.
Matt also reported back from the NGO ISAC Conference on the trends that the community is seeing. For more information on the NGO ISAC cybersecurity community for NGOs and nonprofits, use this link.
- AI companies are not spending nearly enough on safety as they rush to market their products. That puts it on consumers and organizations to know how to protect themselves.
- AI is increasing hackers’ capabilities, and in the arms race with defenders, the corresponding use of AI to block phishing and other attacks is playing catch-up.
- Nonprofits are going to need to understand our new environment, where not just emails will seem realistic but where voice and video fakes will be put to use. Finding old-fashioned ways to verify that the person you are interacting with is a person will become more important as AI powers grow.
_______________________________
Start a conversation :)
- Register to attend a webinar in real time, and find all past transcripts at https://communityit.com/webinars/
- email Carolyn at cwoodard@communityit.com
- on LinkedIn
Thanks for listening.
Carolyn Woodard: Welcome everyone to the Community IT Innovators Technology Topics Podcast. I’m Carolyn Woodard, your host, and I’m so excited to be here today with my friend Matt, who’s our Chief Technology Officer at Community IT, who’s going to share some information about cybersecurity for the end of the year and going forward. So, Matt, would you like to introduce yourself?
Matt Eshleman: Sure. Thanks, Carolyn. It’s great to be here. My name is Matt Eshleman, and I’m the Chief Technology Officer at Community IT. In my role, I get to oversee and supervise our back-end cybersecurity team that helps keep all of our client endpoints safe and secure, and then also get to work with clients on their technology strategy and cybersecurity protection initiatives. So, happy to chat today.
Year End Advice: Clean Up Permissions
Carolyn Woodard: Is there anything cybersecurity-wise? There are particular scams that come out because it’s the holidays, like porch pirates or whatever, but is there something like that for cybersecurity? Or is there anything that you would particularly want to do at this time of year, or is that just not even a smart question? There’s just stuff that you have to do year-round, so it doesn’t matter what time of year it is?
Matt Eshleman: I think that there’s probably a couple of things that would be good to do here at the end of the year, as we kind of do the cleanup.
A couple of things I’m thinking of would be, first, take a digital inventory of all the systems that you have, that you use, and that have your data. I think in our cybersecurity playbook, we had a couple of things about looking at the third-party apps that have access to your account through Google or Facebook or Microsoft, because with a lot of those systems, you say, “oh, I want to sign in with Google,” or, “yes, Facebook can use this. This app can access my information.”
It’s probably a good idea to go back and look at those systems to make sure that they are, in fact, still active and they still need access to your data. You still want to share that information.
That’s probably a good year-end cleanup piece to talk about, a little system inventory, what apps and systems have access to my data and do they still need it. So that would be a good thing to do.
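For Google Workspace organizations that want to run this inventory systematically, here is a minimal sketch using the Admin SDK Directory API’s tokens list method. It is an illustration only, not Community IT’s tooling; the credentials file and account names are hypothetical, and it assumes an admin with domain-wide delegation and the google-api-python-client and google-auth libraries installed.

```python
# Minimal sketch: list the third-party apps that have OAuth access to a
# Google Workspace user's account. Illustration only; file names and
# addresses below are hypothetical placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user.security"]
KEY_FILE = "service-account.json"      # hypothetical service account key
ADMIN_USER = "admin@example.org"       # hypothetical admin to impersonate
AUDIT_USER = "staffer@example.org"     # hypothetical account to audit

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
).with_subject(ADMIN_USER)
directory = build("admin", "directory_v1", credentials=creds)

# Each token entry is an app the user granted access to via "Sign in with
# Google" or an OAuth consent screen; review and revoke anything stale.
tokens = directory.tokens().list(userKey=AUDIT_USER).execute()
for token in tokens.get("items", []):
    print(token.get("displayText"), token.get("scopes"))
```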
Carolyn Woodard: It sounds like you’re saying permissions, going back through, not only which of your staff have permissions to which things they’re supposed to have permissions to, but also for people individually, maybe you want to make sure to go through all of your apps and reconfirm the things that should have permissions.
Matt Eshleman: Yeah, I think so. There’s probably an organizational dimension of this, for example making sure that you’re using groups to assign permissions to folders in SharePoint or Google Drive.
Make sure that the sharing settings are still accurate. I know whenever I’m doing security assessments, one of the things that I look at is the default sharing setting. And I think a lot of folks set up these systems a long time ago, and the default was, “we’re going to share and let everybody share anything with anyone.”
And that’s not the best practice.
And so being a little bit more intentional about how we’re sharing data externally is important. It is so easy to do with data in the cloud, but that may not always be the best idea.
So, have a data cleanup for both your organization, in terms of group and folder access and external sharing, and then take an audit of your personal data sharing as well with some of those personal apps, social media apps and that kind of thing, especially those that allow access to other data brokers.
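As an illustration of the sharing audit Matt describes, here is a minimal sketch that flags Google Drive permissions granting access to “anyone with the link” or to addresses outside your domain, using the Drive API. The function name, domain, and file ID are hypothetical, and credentials are assumed to be set up separately.

```python
# Minimal sketch: flag broad or external sharing on one Google Drive item.
# Illustration only; org_domain and file IDs are hypothetical placeholders.
from googleapiclient.discovery import build  # credentials assumed elsewhere

def flag_broad_sharing(drive_service, file_id, org_domain="example.org"):
    """Print permissions on a Drive item that look externally shared."""
    result = drive_service.permissions().list(
        fileId=file_id,
        fields="permissions(id,type,role,emailAddress,domain)",
    ).execute()
    for perm in result.get("permissions", []):
        if perm["type"] == "anyone":
            print(f"{file_id}: shared with ANYONE with the link ({perm['role']})")
        elif perm.get("emailAddress") and not perm["emailAddress"].endswith("@" + org_domain):
            print(f"{file_id}: shared externally with {perm['emailAddress']} ({perm['role']})")

# Usage sketch:
# drive = build("drive", "v3", credentials=creds)
# flag_broad_sharing(drive, "FILE_ID_TO_CHECK")
```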
Sign In with Google or other Apps OK?
Carolyn Woodard: What do you recommend with Google? When you’re using Chrome, it’ll prompt you all the time, “Do you want to save this password here?”
Is that something that is okay to do?
I always feel suspicious when any site is asking me to save something there because I just think, well, why wouldn’t I save it in my own keychain or my own password manager? What’s your advice on that?
Matt Eshleman: So, for me personally, I think my hierarchy of how I like to access systems is, number one, I really do like using the sign in with Google option. If you’re accessing some new web service, subscribing to a newspaper or whatever, I really like using the option to sign in with Google as opposed to creating another username and password, because it makes it a lot easier to access.
I think I have a little bit more control (when I use the Google Sign In) – I don’t need to save or remember another username and password. And I think the Google security controls are pretty good in terms of notifying when that happens. And so that whole process, I think works really well.
And so that’s the number one thing I like to do for my personal access is sign in with my Gmail account and just use the Google account. So that’s probably number one on my list.
Number two, if that isn’t an option, I do prefer to save it in my own password management tool.
I use 1Password. There are a lot of other similarly good password manager tools out there. And I like that because then I can access it on many different devices and many different platforms. I’m not just tied to logging in through a specific web browser. So, it’s handy to be able to access that on my phone.
And then the third option is, I do still save some stuff in my Google Account, but I think you do need to be a little bit more intentional about that, because again, if you sign in to your Google account in a browser on some other computer, that password, all that stuff follows you, which on the one hand can be really great and convenient if you’re the only person who has access to that computer.
But if you sign in to a system on your kid’s computer and they’re poking around and it autofills some passwords, that may not be a great situation. And specifically for us as an IT services provider, having that work/personal divide be really strong is important; we want to make sure we keep passwords in work systems when they need to be and in personal systems when they need to be.
So that’s kind of my hierarchy and preference of password and access methods and tools.
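As a side note on what “Sign in with Google” does under the hood: the web service receives a signed ID token from Google and verifies it, rather than storing another username and password for you. Here is a minimal sketch of that verification step, assuming the google-auth Python library and a hypothetical OAuth client ID.

```python
# Minimal sketch: how a web service verifies a "Sign in with Google" token.
# Illustration only; the client ID below is a hypothetical placeholder.
from google.oauth2 import id_token
from google.auth.transport import requests

CLIENT_ID = "1234567890-example.apps.googleusercontent.com"  # hypothetical

def verify_google_sign_in(token_from_browser: str) -> str:
    """Verify a Google ID token and return the signed-in user's email."""
    claims = id_token.verify_oauth2_token(
        token_from_browser, requests.Request(), CLIENT_ID
    )
    # The token is cryptographically signed by Google; the service trusts
    # these claims instead of keeping its own username/password pair.
    return claims["email"]
```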
Google Password Manager
Carolyn Woodard: If I can ask a follow-on question, a lot of our clients use Google Workspace. So, if a nonprofit is using Google Workspace as their platform, it sounds like you would recommend they use the Google Password Manager.
Matt Eshleman: Yeah, I think the Google Password Manager within your organization’s Google Workspace account is probably a good option. It’s certainly better than nothing. It may not be as flexible or as convenient or as handy as a third-party password manager, but it’s way better than trying to remember all those different passwords.
Writing them down, honestly, is not bad. I think we like to ridicule that idea, but the physical security of a book in your hand is pretty good.
The worst option is just using the same password over and over again for all these different sites.
But if you’re a Google Workspace customer, and you don’t really have the budget to do a password manager tool, saving those in your work Google account, that’s a good start.
Carolyn Woodard: I think we often recommend something that’s built in. If you’re already using that platform, then you should use all the tools that are associated with it because they will automatically integrate with each other better.
Matt Eshleman: Yeah. And Google in particular, I think, has added some nice new features into that password management tool.
It’ll tell you if you’ve reused passwords or if passwords have been detected in a breach. And so, yeah, there are a lot of really nice little features that are in those browser-based password manager tools.
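As an illustration of how breached-password detection can work (not necessarily how Google implements its feature), here is a minimal sketch using the public Pwned Passwords range API. Only the first five characters of the password’s SHA-1 hash ever leave your machine, which is the k-anonymity idea behind these checks.

```python
# Minimal sketch: check whether a password appears in known breach corpora
# via the Pwned Passwords k-anonymity range API. Illustration only.
import hashlib
import urllib.request

def times_seen_in_breaches(password: str) -> int:
    """Return how many times this password appears in known breaches."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # Only the 5-character hash prefix is sent; matching happens locally.
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate_suffix, count = line.split(":")
        if candidate_suffix == suffix:
            return int(count)
    return 0

# Example: a famously weak password shows up millions of times.
# print(times_seen_in_breaches("password123"))
```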
NGO ISAC Conference Reflections
The other thing I wanted to talk about is a reflection from the NGO ISAC Conference that I went to.
That was really helpful, seeing the cybersecurity threats from the NGO space, which are really significant because that space is really targeted. It’s asset-rich. It’s got a lot of intellectual property that state-sponsored actors in particular are after.
It’s a unique space that has resources to some extent, but probably needs a lot more (in cybersecurity investments) to come up to the level of the threats that they’re facing.
Carolyn Woodard: Can you just introduce what conference is that? What does that acronym stand for and what is it about?
Matt Eshleman: ISAC stands for Information Sharing and Analysis Center. So essentially, it’s a place to share intelligence about different cyber threats that organizations are facing. I’m a part of the NGO ISAC, the Non-Governmental Organization Information Sharing and Analysis Center, which is really focused on the unique threats that face think tanks, larger nonprofits, and foundations.
It’s a little bit of a bigger tent than the small to mid-sized nonprofit organizations that we work with primarily. I think there is some overlap, particularly for the organizations that we support, in the policy and think tank world, overlap with foundations and typically organizations that do have more in-house IT capacity.
I was there at the conference as a representative of Community IT, supporting a whole bunch of these smaller think tanks that don’t have internal IT capacity. But the majority of the attendees at the conference were in-house cybersecurity people at bigger nonprofits and NGOs and their partner organizations. There were lots of third-party cybersecurity vendors that were there that do a lot of work to support these bigger organizations and have a lot of capacity.
It’s a really good community to be a part of. I think the thing I’ve appreciated about the cybersecurity world in general is that it’s very open and people share lots of information. It doesn’t matter kind of where you are or where you’re at, there’s just a big interest in sharing (these defenses against common threats). And so that’s been appreciated.
Top Cybersecurity Threats in 2025
Carolyn Woodard: Do you have some takeaways for us? What are your reflections on, for example, the top things that people are thinking about or the top threats they’re worried about?
Targeted Phishing Attacks
Matt Eshleman: In terms of the top threats, certainly it’s things that we see with our small to mid-sized organizations as well. There’s lots of targeted phishing – attacks on digital identity.
I think in the NGO and think tank space, that attack surface really goes beyond your work account because typically those attacks are targeted. And so, these threat actors know that your work account is probably pretty well protected, but maybe your personal account is not, or maybe your spouse’s account is not, or maybe your kid’s account is not.
It really goes way beyond, I think, what we typically do for most organizations. We say “hey, let’s make sure we protect your work account, and that work and personal are really separate there.” I think the threat surface area is a lot bigger because you’re getting targeted in multiple ways.
Be a Learning Organization
So, in terms of the takeaways from the conference, I also think about the importance of defining and having a formal process for managing your cybersecurity program and that feedback loop.
We have lots and lots of near misses that people report. As a cybersecurity resource in your organization, make sure you always take time to have some meetings and look at what happened, how we caught it, what we can do better, what we can do differently, and then update. That learning organization piece is really important because things change quickly. And in cybersecurity, you’re playing defense all the time; you have to be perfect or near perfect, or those weaknesses get exploited.
And so just having an ongoing process of learning and sharing and always evolving is really important. See Is Your Nonprofit a Learning Organization? to learn more.
Cybersecurity Professionals Are Under Stress
I think the other takeaway I took is just that cybersecurity is stressful in general. I mean, even more so than the help desk, you know. I used to joke when I was doing more direct support, right, that the only time anybody calls into IT is when they have a problem.
You’re just dealing with problems all day long.
I think in cybersecurity, in addition to those being problems, it often amps up the stress because there could be a huge financial dimension to it, if you have an account that’s compromised or there’s a ransomware incident underway.
Being able to invest in the mental health and capacity of your cybersecurity people is really important.
And taking advantage of some of your work programs through health insurance or other employee benefits is really important. There were a couple of people who shared that they didn’t quite realize it originally, but then, when they were doing their benefits renewal, they found out that there are counseling sessions or mindfulness resources available. You should take advantage of them. (Community IT also provided some resources in De-Stress! Self-Care in Nonprofit IT Roles.)
And so that was one takeaway for me, for our team: we just went through a benefits transition, right? So what resources are available to de-stress, for mental health, for dealing with those super stressful situations that are just a part of our jobs? Because, you know, you can be intense for a while, but that’s not sustainable.
Carolyn Woodard: That is so interesting, because I think you’re right. (Stress for cybersecurity professionals) combines things that are elements of stress like urgency, right? You have some incident happening. You have to respond right away on the weekend. Whenever it happens, you have to leap into action. And so, you’re constantly primed to be able to do that, which is very stressful itself.
And then often you’re right (about the financial implications adding stress.) You’re looking at some existential threats, right? If someone takes all of your money from your accounts or finds the information for your three major donors and they no longer feel comfortable donating to you, that is just going to really make a big difference to your nonprofit.
And I think we found, too, that when our clients suffer a breach or an incident or a phishing attack, even if it was not successful and the hackers didn’t get what they wanted, the fact that you’ve had an attack is so stressful to all of your staff. There just immediately becomes this new atmosphere of being really, really careful, kind of looking at everybody and trying to make sure everybody is trained not to click on those links.
And it can be a very stressful period for any staff. And then if you’ve lost money, it’s even more stressful. I never really had thought about using those mental health resources through your benefits, but that’s a great tip.
AI and Cybersecurity
Matt Eshleman: You know, you can’t have a conference without talking about AI. Right now, I think with a lot of new technologies, right, AI is just another new technology, and it’s just a race to get people to use your platform. And so, the governance and the safety controls are really being dampened because nobody wants to slow down. Because if you slow down, somebody else is going to get there first, and then they’re going to be the winner. And in this world, you kind of have to be first, and if you’re second, then you lose. It really subverts some of the safety and good governance controls, because you just want to get there as fast as possible, and you don’t want to worry about safety.
Right now, we’re just kind of in a race to have a bigger technology to shoot yourself in the foot with.
In nuclear energy, for example, 90% of the cost of developing and implementing and supporting nuclear energy is safety controls, right? So 90% is safety expense and 10% is actually creating the nuclear energy itself. But it’s all about safety.
In AI, some of the big firms are aspiring to maybe have 10% be invested in safety controls and kind of these guardrails. But that’s not even happening now. And so we’ve got this really powerful ungoverned resource that is just kind of innovating, innovating, innovating. And there’s just very little being spent on the safety of that tool.
Carolyn Woodard: I feel like we’re going to have some kind of AI Three Mile Island or Chernobyl. And then that’s going to be what gives people the impulse to put the 90% safety in after there’s been some big, colossal AI problem. And everybody has all our data anyway.
I also feel, as you said, it’s kind of like an arms race. And even though AI itself is expensive, right, there are four big companies that are running all of the AI basically. But because they made some of these features free to gain market share, it feels like the hackers, the bad guys, have a lot of incentives to monetize the AI that they have access to, to get at us.
And so I’m wondering, it feels a little bit from my perspective, like the companies are playing a little bit of catch up. Where is the AI that’s going to super protect our inbox? We know that the AI is out there with the phishing, emails getting better and better. But I think we’re kind of waiting for some of those AI tools that will really zap them before they get to us.
Matt Eshleman: Yeah. I think a lot of the technology tools that are out there are being developed for good and for defense. But it does seem like they should be better than they are.
You know, these hackers or criminals, you know, cybercrime networks, are using the same tools that everybody else is, to write compelling emails, to trick us into sending money and buying gift cards and updating our wire payment information. These tools are out there and available for everyone to use.
In cybersecurity, a lot of the things that we would have relied on in the past, the poorly worded messages, the misspellings, that’s kind of all out the window because it’s just easy to run all this through a free text generation platform and get a really, really, really well-written email.
AI Deep Fake Frauds
Carolyn Woodard: Or voice generation.
Matt Eshleman: Yeah, voice generation. There were a couple of those examples shared, fraud schemes that were initiated by voice, and a couple of scenarios where video was also part of that. This person also raised the question: how do you validate the identity of who you’re talking to?
Are there personal code words to provide validation that you are who you say you are, not a recording?
A case was shared, I think it was at Ferrari. There was a really sophisticated video fraud against the CFO or somebody in the financial department. The attackers had invested a lot in the look and feel of this digital twin. But the person targeted had a couple of suspicions, so they asked about a book that the real person had asked them to read as part of some corporate exercise. And the digital twin didn’t know that answer. So, while the attackers knew all the financial stuff and some of the other things that maybe had been part of an email breach, they didn’t know about this book.
So, it is interesting. Having some of these offline safe words, proof of life, proof of humanity, is important.
Even for things as mundane as resetting passwords from our help desk, we’re investing in tools to do that verification through an automated way that’s trusted so that we know who we’re talking to, because it’s really easy to clone somebody’s voice and have that whole process occur.
Carolyn Woodard: It really, it’s like out of a movie. And I think it’s so interesting that you went to this conference that is built around the nonprofit space and foundation space, because as we always say, I think a lot of nonprofits think that they’re too small to be a target, or they’re going to kind of be under the radar, or maybe they’re a target, but only in this mass email, mass phishing way.
But really, when you do think about it, there are those donor lists and the assets that you have; if you’re working in advocacy, it’s something someone else would be interested in knowing about; and you may have volunteer lists or children in your database, all of that sort of thing.
And it’s so easy to make a voice clone, that is something that I think we’re all going to have to step up our game for sure.
Matt Eshleman: Yeah.
Carolyn Woodard: Well, do you have any other bits of advice for us going into 2025? What should we be paying attention to? AI?
Matt Eshleman: Yeah, AI. Apparently Consumer Reports has a really well-done guide to data cleaning and removing your digital identity from those data broker lists, so they have a really good resource. The director of that consumer protection piece was there and talked about the resources that they have.
Carolyn Woodard: We’ll put that in the transcript.
Matt Eshleman: Yeah. There are a couple of services, I think, around that data cleanup that are consolidating a little bit, but there are some resources to help minimize your digital footprint.
Carolyn Woodard: Sounds good. Thank you so much, Matt.
Matt Eshleman: Yeah!