This article is contributed. See the original author and article here.

Host:  Raman Kalyan – Director, Microsoft


Host:  Talhah Mir –   Principal Program Manager, Microsoft


Guest:  Dawn Cappelli – VP of Global Security & CISO, Rockwell Automation


 


The following conversation is adapted from transcripts of Episode 4 of the Uncovering Hidden Risks podcast.  It has been lightly edited to make it easier for readers to follow along.  You can view the full transcripts of this episode at:  https://aka.ms/uncoveringhiddenrisks


 


In this podcast we discover the history of the practice of insider threat management; the role of technology, psychology, people, and cross-organizational collaboration to drive an effective insider risk program today; and things to consider as we look ahead and across an ever-changing risk landscape.


 


RAMAN:  Hi, I’m Raman Kalyan, and I’m with the Microsoft 365 Product Marketing Team.


 


TALHAH:  And I’m Talhah Mir, Principal Program Manager on the Security Compliance Team.


 


RAMAN:  Talhah, we’re gonna talk about putting insider risk management into practice.


 


TALHAH:  That’s right, with Dawn Cappelli, somebody who’s been a personal inspiration for me, especially as I undertook the effort to build the insider risk program at Microsoft.


 


RAMAN:  Thank you, Dawn, for being on our podcast.  It would be great to get your background: what is it that you do now, how did you get into insider threats, all that sort of stuff?


 


DAWN:  Okay, so I am the VP of Global Security and the Chief Information Security Officer for Rockwell Automation. We make industrial control system products. I came to Rockwell in 2013 as the Insider Risk Director, to build our Insider Risk Program.  At that time not many companies in the private sector had insider risk programs. The financial sector did, and the defense sector of course, since they had counterintelligence, but not many other companies did. I came here from the CERT program at Carnegie Mellon, which, for those that don’t know, was the very first cyber security organization in the world. It was formed in 1988 when the first internet worm hit, no one knew what it was or what to do about it, and Carnegie Mellon helped the Department of Defense respond. So, going back, I actually started my career as a software engineer, programming nuclear power plants for Westinghouse.


 


From there, I went to Carnegie Mellon, again as a software engineer, but I became interested in security, and CERT was right there at Carnegie Mellon, so I tried to get a job there. Fortunately, they hired me. I didn’t know anything about security, but I got a job there as a technical project manager so that I could get my foot in the door and learn security. I was hired by CERT, which is part of a federally funded research and development center, so its work is primarily government funded. They had funding from the United States Secret Service to help them figure out how to incorporate cyber into their protective mission. At that point, I started on August 1st, 2001, the Secret Service’s protective mission was gates, guards, guns. It was physical, and they knew they needed to incorporate cyber. My job was to run this program, and the first thing that we had to do was protect the Salt Lake City Olympics, which were in February 2002.


 


So I thought, “How cool is this? I get to work with the Secret Service, protecting the Olympics, and I know nothing about security. How did I ever get this job?” And it was very cool. I thought this is the greatest thing. “I can’t believe they’re paying me for this.” But then a month later, September 11th happened, and suddenly everyone thought the Olympics would be the next terrorist target. So that cool, fun job became a very real, very scary job. When we first went to Salt Lake City to talk to the Olympic Committee about how a terrorist could bring down the network or harm attendees, the security experts were looking at network diagrams and trying to figure this out.


 


Someone just happened to say, “So have any network administrators or system administrators left on bad terms?” And they gave us a list of 20 people. So, we’re like, “Oh my gosh, these 20 people could get right into this network. They know what all the vulnerabilities are.” We decided we needed an insider threat team and an external threat team. I was intrigued by the insider threat team. You have people and you have technology, and that really intrigued me.  I said I would take that team, and look at where it led me. So yeah, that’s how I got started.


 


TALHAH:  Dawn, one of the things that Raman and I talk about quite a bit is how influential your work at CERT and the book that you wrote have been.  They not only helped me get acclimated to insider risk and insider threat management, but also shaped what we did at Microsoft in terms of building a program and the solution that we’re building. One of the things that was big for me, coming from a traditional security background, is that you have this tendency to think you can pretty much contain and manage the risk from a SOC perspective. When it comes to insider risk it’s important to consider other business partners like HR and legal. I’d love to get your take on that. I know that’s one of the things that was big in terms of my learning: how you came to that realization, and what your journey has been building those partnerships at Rockwell?


 


DAWN:  When I created that insider threat team, we thought, “Okay, so what is an insider threat? How could they attack the Olympics? What could they do?” And what we decided to do was ask the Secret Service to collect every real case they could find; let’s look at what insiders have done in the past and learn from that. So, they did. They collected 150 cases for us for the first batch, and we looked at who does it, why they do it, what they do, when, how, where, and how you can stop them. And what we came to realize was that this is very different from an external adversary, who could be anywhere in the world while you have no idea who they are.


 


These insiders are in your company, and they come to work every day. The interesting thing about it was we actually partnered with the psychologists, the behavioral team at the Secret Service, I can’t remember exactly what the team was called at the time, and we teamed with them on this effort because we realized this is very different. We need to look at security issues, technical issues, but we also need to look at the people issues, because they are people, and we see them every day. So, we teamed with them, we looked through those cases, and we created a big database of all of the different attributes of every single case that we wanted to catalog: behavioral aspects of the case, organizational aspects, and technical aspects. So, there were really three components.


 


And because we teamed with psychologists from the very start, by the time I left CERT, we had over 800 cases collected. We started looking at the cases for patterns, because the attitude back then was: these are insiders, they have access, they come in to work every day, you’ll never stop them. They do what they do every day; they just do something bad. And fortunately, because the patterns were so distinct in these cases, we realized, “Yeah, you can. You can mitigate these by looking at social behaviors, as well as technical behaviors.”


 


And that’s where HR and legal come in, because I really realized it when I came to Rockwell and tried to put all of our theory into practice. I thought, “Well okay, someone who’s going to commit insider cyber sabotage, we know that in almost every single case we had, and we have like 169 of them, there were behavioral indicators.” Happy people do not commit sabotage. People that commit sabotage are angry; they’re upset about something, and their behaviors get worse and worse over time. So here I am in a company like Rockwell, where we’re in over a hundred countries around the world. How can I possibly train every manager in the company on what to look for? I came to realize that what I really need is HR. We used to say at CERT that an employee who’s going to commit sabotage ends up on the HR radar, meaning their behaviors get bad enough that they come to the attention of HR.


 


And so there it is: HR, they’re my eyes and ears all around the world. If we can train HR as part of their normal training, then we can rely on them to be the ones that notify us when there’s a potential insider threat, and it works. It works amazingly well. Our HR department, they get it, they know when to contact us. But that’s where legal comes in, because there’s subjectivity when you’re talking about someone’s human social behaviors. You can’t just initiate an investigation because someone says, “Hey, Dawn’s been acting really crappy lately.” So legal is a very important part of that, to make sure that we really substantiate what we’re being told, that we have multiple people who can attest to the behaviors, and that we aren’t violating any privacy laws in that local part of the world, because they’re different everywhere. So that’s how the human side and the HR and legal partnership came to be.


 


RAMAN:  It makes a lot of sense. Even here at Microsoft, as we were looking at the insider risk solution broadly, we really wanted to bring HR and legal into that conversation so that organizations would have the ability to collaborate with those two teams, not only to help ensure that they were meeting their regulatory requirements and were compliant with employment laws and privacy laws, et cetera, but also because we ended up realizing that there’s another side to the coin. We’ve talked a lot here about malicious threats, but then there are the inadvertent risks as well: people being distracted, especially in this time with COVID and everything, and doing things that maybe they didn’t mean to. What we’ve heard from HR and legal is, “Hey, how could we maybe use some of this insight to help support a stronger company culture, to help people do the right thing and feel like, I’m not going to always get slapped on the back of my hand because I did something wrong.”


 


DAWN:  Most of the insider risk cases that we have are unintentional. It’s people who are doing something they shouldn’t: they’re putting information somewhere they shouldn’t or downloading software that they shouldn’t be, but they don’t have malicious intent. That has changed a lot over the years, though; when we first started, we did catch people who appeared to be trying to be malicious. When I started the program at Rockwell, and I always tell companies this, you can start very quickly. You don’t have to go out and invest in technology. We started with nothing. That was one lesson learned: when you take a job in the private sector you should ask, “Am I going to have people? Am I going to have a budget?” I took the job and then found out that I didn’t have any people or a budget, but I built the program with nothing, just me, working with IT, and we approached theft of IP first.


 


Theft of IP is much easier than sabotage, and it’s very different from sabotage, by the way. With theft of IP, we have very nice, happy people that try to steal intellectual property. They’re not disgruntled. They’re ambitious. They’re going on to their next job and they feel like, “What I created is mine. I’m going to take it with me.” Well, sometimes they are disgruntled, but the key in theft of IP is that they’re leaving the company. Most people that steal intellectual property do it within 90 days of resignation. I knew that going into Rockwell. The executive sponsor of our program is the Senior Vice President of HR, so I talked to her and I said, “Hey, can I just pick one team in the company that has access to the crown jewels, the most critical information we have, and just use them as a pilot?”


 


“So, I’ll work with their HR person. She knows to be on the lookout and let me know when someone’s leaving the company.” Then I worked with IT and I said, “Hey, what kind of audits do you have? What kind of logs do you have? I need USB logs, cloud activity, email logs. Can I get access to them? If I have a person’s name, can I go in and just look up their activity?” And they said, “Sure, yeah, we can give you that access.” Then I went to legal, and I said, “Okay, here’s what I want to do. One team, six months, I’m going to do a pilot. And here’s how it’ll work. HR will tell me someone’s leaving. I’ll go to IT; I’ll look them up in the logs. And if we find something, then we’ll investigate, and I’ll pull you in.”


 


And they said, sure, you can do that. I was two weeks into my pilot, and at that time I was educating HR globally about insider risk. Two weeks in, I get a phone call from an HR person in a totally different part of the company, and she said, “I know I’m not part of your pilot, but we just had four engineers all quit at the same time from the same team. They now have a competing company and they’re starting to try to take our customers away, and there is no way they could have built this capability in two weeks. We’ve invested millions of dollars in this capability over years, and they just now had to have taken it. There’s no other way they could be competing with us. So, can you do an audit?” I went to legal and I said, “Well, they’re not part of my pilot, but can I do an audit?”


 


And they said, “Sure, go ahead and do an audit.” So, I did, and found that sure enough, they had taken all of the intellectual property that that team had created over years and were starting to try to take our customers. We contacted law enforcement and took legal action against them. We ended up collecting royalties from them for about five years, every time they went to a customer that we already had. So it ended up, I wouldn’t say a good news story, but it certainly got the Insider Risk Program off the ground, because my six-month pilot was over after two weeks. They said, “You need to roll this out. We don’t need a pilot, just roll it out,” and here I am, one person and no money, but you can prove your value really quickly. I remember talking to you about that, Talhah, about how to just take a backward look at people that have left your company over the past 90 days and see what you see, and when companies do that, they’re always shocked.
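The backward-looking audit Dawn describes can be sketched in a few lines of code. The example below is only a minimal illustration, not Rockwell’s or Microsoft’s actual tooling: it assumes hypothetical log records with user, timestamp, and action fields, plus an HR feed of resignation dates, and it simply pulls each recent leaver’s activity for an analyst to review.

```python
# Minimal sketch of a 90-day "backward look" audit for departing employees.
# Assumptions (not from the podcast): log records are dicts with "user",
# "timestamp", and "action" fields, and HR provides resignation dates.
# Field names and the lookback window are illustrative.
from datetime import datetime, timedelta

LOOKBACK = timedelta(days=90)  # most IP theft happens within 90 days of resignation

def audit_departing_employees(departures, activity_logs, now=None):
    """Return activity worth reviewing for anyone who resigned in the last 90 days.

    departures    -- list of {"user": str, "resignation_date": datetime}
    activity_logs -- list of {"user": str, "timestamp": datetime, "action": str}
                     drawn from USB, cloud, and email logs.
    """
    now = now or datetime.utcnow()
    recent_leavers = {
        d["user"]: d["resignation_date"]
        for d in departures
        if now - d["resignation_date"] <= LOOKBACK
    }

    findings = {}
    for event in activity_logs:
        resigned = recent_leavers.get(event["user"])
        if resigned is None:
            continue
        # Focus on activity in the window leading up to (and after) resignation.
        if event["timestamp"] >= resigned - LOOKBACK:
            findings.setdefault(event["user"], []).append(event)
    return findings

# Example: flag a leaver who copied files to USB shortly before resigning.
if __name__ == "__main__":
    departures = [{"user": "jdoe", "resignation_date": datetime(2020, 11, 2)}]
    logs = [
        {"user": "jdoe", "timestamp": datetime(2020, 10, 28), "action": "usb_copy: design_specs.zip"},
        {"user": "asmith", "timestamp": datetime(2020, 10, 28), "action": "email_send: status.docx"},
    ]
    print(audit_departing_employees(departures, logs, now=datetime(2020, 11, 15)))
```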


 


TALHAH:  And I’m grateful for that. We followed a very similar approach, where we didn’t try to boil the ocean, Dawn. It was a very focused approach: decide what kind of scenarios we care about, what kind of risks we care about. Departing employee data theft, for example, is one of the key ones that you educated us on. Just go look back, see what the data is showing you, and go from there. And this really was a big inspiration for us in terms of how we develop the solution at Microsoft now, where we’re trying to take these key scenarios that customers care about, that we see a lot of data in the field around, and figure out how to build detections to identify those things.


 


You’ve been a great storyteller, which really inspired us and a lot of the folks at Microsoft as well. I’d love to hear some of your other stories that really got you focused on different parts of insider risk, something you came across from working with your peers or in your own experience, other stories like this.


 


DAWN:  Well, in November of 2014, we had an employee who resigned, and he was one of our senior firmware engineers.  He had access to the crown jewels, and we ran an audit, of course. Back then I actually had one or two people on my team, but it was still very manual. So, this engineer was leaving the company, had access to all of our source code, and we did an audit and found, oh my gosh, I’ll never forget it. By then I did have a team, and they went to HR; HR called me and said, “Dawn, this is the big one.” I was getting on a plane to go to Milwaukee, to our headquarters, and I said, “Okay, I’ll be there in an hour,” and it was the big one.


 


It was an employee who had taken all of our source code. He was from China but was working in Milwaukee. He just took the information on a USB drive and was leaving the company, so we met with him, just like we always do. We had a really good process down, and we caught this pretty quickly. We met with him and said, “Look, we have logs. We know that you took this information. We just want to get it back.”


 


That was always our attitude. We don’t want people leaving the company feeling like we distrust them or like they’re a criminal.  We always just say, “Look, we know you took it. We just need to get it back,” and he wouldn’t give it back. He was very, very resistant about giving it back, so we contacted the FBI. You figure this is a lot of our source code, and this ended up going to federal court, so I’m not saying anything that isn’t public knowledge or available in the public record. The interesting thing is that he was found not guilty.


 


It’s funny because our company lawyers were really upset about that. They were like, “We’ve never lost a case. I can’t believe we lost the case.” And I said, “No, we did not lose the case,” because our goal is to protect our intellectual property. We caught him fast enough that after a year and a half of forensics by the FBI, they found no evidence that he had given that information to anyone else. We caught it fast, took action, then law enforcement went in and we got the information back. From my perspective that was a success story for the Insider Risk Program, but yeah, the lawyers didn’t really see it that way.


 


RAMAN:  One of the things you just mentioned was that at the time your process was a little bit manual.  You’ve been in this space for so long: how has the technology evolved from your perspective, and how is it really helping you not only be more efficient, but catch things that you may not have caught before?


 


DAWN:  Well, it’s funny because when I was at CERT, and we even said this in the book, what we really need is a technology that will let us pull in whatever logs we have. Every company is different; every company has different data sources. We need a tool that will let us bring in all of our logs, correlate them together, and create custom risk scoring algorithms based on the data in those logs. And the logs have to include HR data, because termination date, that’s the key. That’s the key trigger for theft of IP. So, we kept saying that at CERT: “Why can’t somebody do this? Why can’t somebody do this?” That case happened in November of 2014, and leading up to it, there started to be some products on the market that did exactly that.
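The tooling Dawn describes, pulling in whatever logs you have, correlating them, and scoring risk with HR data such as termination date as the key trigger, can be illustrated with a small sketch. The weights, field names, and departure multiplier below are assumptions for illustration only, not any vendor’s actual algorithm; the point is simply that departure status amplifies otherwise ordinary signals, and that the output can be ranked so analysts start with the highest-risk users, as Dawn describes later.

```python
# Minimal sketch of a custom risk-scoring approach: weight signals from several
# log sources and boost the score when HR data shows an upcoming termination date.
# All weights and thresholds here are illustrative assumptions.
from datetime import date

# Hypothetical per-signal weights; tune these to your own data.
WEIGHTS = {"usb_copies": 3.0, "cloud_uploads": 2.0, "email_attachments": 1.0}
DEPARTURE_MULTIPLIER = 5.0     # activity matters far more near a termination date
DEPARTURE_WINDOW_DAYS = 90

def risk_score(signals, termination_date=None, today=None):
    """Combine signal counts into one score; boost it if the user is departing."""
    today = today or date.today()
    base = sum(WEIGHTS.get(name, 0.0) * count for name, count in signals.items())
    if termination_date and 0 <= (termination_date - today).days <= DEPARTURE_WINDOW_DAYS:
        base *= DEPARTURE_MULTIPLIER
    return base

def ranked_dashboard(users, today=None):
    """Sort users by descending risk score so analysts start with the riskiest."""
    scored = [
        (u["user"], risk_score(u["signals"], u.get("termination_date"), today))
        for u in users
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    users = [
        {"user": "jdoe", "signals": {"usb_copies": 4, "cloud_uploads": 2},
         "termination_date": date(2020, 12, 1)},
        {"user": "asmith", "signals": {"email_attachments": 5}},
    ]
    for user, score in ranked_dashboard(users, today=date(2020, 11, 1)):
        print(f"{user}: {score:.1f}")
```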


 


And I had been saying, “Hey, there are these products so we can actually automate what we’re doing and wouldn’t have this manual process, because it’s really not scaling. It’s getting too big,” and they said, “Well no, we don’t have the money for that.” Well, the day that we caught that engineer, I get a phone call that night. The Senior Vice President of his business called our CSO at the time and said, “What’s that technology Dawn wants? I’ll pay for it. We need to get that in here.” It was fantastic, except I learned being an early adopter can be very painful. You figure this was like early 2015, so that was five and a half years ago. Boy, that’s a long time in technology terms, and it was a bumpy road. But it’s exciting to me that we had the idea at CERT, and then I came to Rockwell and was able to be one of those early adopters of that technology. It’s a nice little road to go down.


 


RAMAN:  That’s awesome, because we’ve heard a lot from customers like, “Hey, I want to get started, but I don’t know where to start. I have all these different sources and I just need something that I can quickly scope.” To your point, scope the team you want to start looking at and see what happens. If you make it easy like that and allow people to say, “Hey, you don’t have to spend a lot of time configuring things, bringing in a bunch of logs and scripts and things like that,” if you can get started by saying, “I’m interested. I’m concerned about IP theft; I’m concerned about this particular group, and I want to just go,” that’s probably half the battle right there.


 


DAWN:  I actually just put together a graph last night showing the number of manual audits that we have done from 2015 through 2020, just a bar graph looking at how the program has improved from an efficiency perspective. And it’s just incredible, because all of those technologies have matured so much now. We are totally automated, well, I shouldn’t say totally; we still do some manual audits. If someone walks in and says, “I am leaving and I’m going to the competitor,” HR picks up the phone and says, “I need an audit. We can’t wait until tomorrow when the analytics run the risk algorithm and say, ‘Oh hey, that person is gone.’” But for like 90% of the cases our insider risk analysts just come in, sit down at the dashboard, start at the top of the list with the highest-risk users, and work their way down.


 


And there are times, if there’s a reduction in force, when you have a lot of people showing up on that dashboard. But the beauty of using analytics with these risk scoring algorithms is that they combine all of the factors and tell you where to start. If you work your way down, what I tell the team is, “When it’s time to go home, it’s time to go home. You don’t have to work through the entire list, because as you get further down, those risk scores are lower. It’s okay if you can’t get to everybody today. You got to the most important cases.”


 


And that way, people don’t even show up on the dashboard unless they have some activity that warrants being on the dashboard. So, where we used to audit every single person leaving the company, you had a bar like this; now we’re just looking at the people that are leaving the company and have suspicious activity. That’s a much smaller bar, and it enables us to focus on other things, like the more technical sabotage kinds of cases and the serious security violations, which weren’t even in our scope before.


 




To learn more about this episode of the Uncovering Hidden Risks podcast, visit https://aka.ms/uncoveringhiddenrisks.


For more on Microsoft Compliance and Risk Management solutions, click here.


To follow Microsoft’s Insider Risk blog, click here.


To subscribe to the Microsoft Security YouTube channel, click here.


Follow Microsoft Security on Twitter and LinkedIn.


 


Keep in touch with Raman on LinkedIn.


Keep in touch with Talhah on LinkedIn.


Keep in touch with Dawn on LinkedIn.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.