
CyberInsecurity News™

INTERVIEW: KRIS LOVEJOY / EY
THE SURPRISING RISE OF A CYBERSECURITY STAR
She didn’t have the usual background or aspirations, but somehow that didn’t stop her.
Many paths have led to careers in cybersecurity. This isn’t surprising since the field is new, it draws from multiple disciplines, and it’s changing rapidly. But the path that Kris Lovejoy took must be one of the more improbable journeys so far.
     Lovejoy is now the Global Cybersecurity Leader for EY, based in Washington, D.C. She hasn’t been there very long, but she had a distinguished career at IBM, where she worked for eight years, including a stint as chief information security officer and as general manager of its Security Services Division. More recently she was CEO of BluVector, a cybersecurity startup that was just acquired by Comcast. She had much to say about how lawyers and security professionals can communicate more effectively, why the key is sometimes the effective use of analogies, and why the question from lawyers she most abhors is, “Have we been breached?” She also explained how she came to experience bias in a new way only after she became a CEO.
     The way she got her start in computer science is where the surprises began. It wasn’t in college. She was an English major. It wasn’t in grad school. She didn’t go. It all started when her husband was in the U.S. Marine Corps. He was stationed in Jacksonville, North Carolina, “which effectively killed all career aspirations that I had had to that point,” she said. She wasn’t aiming to be a CISO or a CEO. She was hoping to be editor of The New York Times Book Review. She resigned herself to lots of volunteer work in Jacksonville. One of her projects was to help Marine spouses when their husbands were out in boats on deployment. It was the mid-1990s, and there was no way for the couples to communicate during their separations, which was a serious hardship for many wives who couldn’t draw funds they desperately needed without their husbands’ input. That’s when she had a brainstorm.

CyberInsecurity News: How did the Marine Corps give you a start in this field?  
Kris Lovejoy: I realized this financial problem could be fixed if the spouses had a way to communicate with their husbands. At that time, the only way to communicate was if the husband got off the boat and called. And sometimes these women couldn’t even afford phones, so how were they supposed to get in touch with these guys? I read a book on the internet, and I said, “What if we could get some computers and set them up at the USO [the military’s United Service Organization] and give the spouses access so they can contact their husbands?” I raised the money, got the computers, put them in place. Then I said, “Hey, guys, how am I supposed to connect to somebody else?” So I bought a book on networking, and I figured out how to do that.
     It was a disaster and never worked. But that’s when I became one of the only people in Jacksonville, North Carolina, who knew how to do networking. I started getting hired as a consultant by the investment companies from which I raised money. I helped them by connecting their offices to their back offices in Raleigh. Voilà, I became a network engineer. When I moved to Washington, D.C., in 1997, I got a job as a consultant in a company called New Horizons, which happened to put me at both a U.S. intelligence agency and the Department of Defense as a contractor. I was laying the lines for the internet, and I just happened to get sucked into the cybersecurity world because I was really interested in the subject. So that’s how I got started in cyber—working with one of the intel agencies and deploying servers. And here I am! [laughs]

CIN: When you look back, were there any clues, at least in retrospect, that you had this latent talent and interest in tech?
KL: Yes. In fact, if you were to ask me what was the subject I enjoyed the most in school, it was math. Second was science. You know how you take those tests when you’re in high school to figure out whether you’re going to be an engineer? I was off the charts in engineering. The reason I didn’t go into the field—first of all, there was really no such thing as STEM. There was math. And honestly the mathematics faculty scared me to death. [laughs] I was thinking, “I don’t want to have to work with people like that.” Whereas Ms. Gunn, the English teacher, I loved her. So I decided, “I’m going with English.”

Not Afraid of Math—Afraid of the Teacher
CIN: Can you define “scared me to death”?
KL: The head of the math department, I really liked the guy, but he had this way of throwing [chalkboard] erasers at people. Not at me, but I thought to myself, “Oh gee, all of these math guys are kind of mean.” It’s funny that my sister, who was also an English major, was the First Brigade commander at the Naval Academy. She’s in IT now. She ran the public sector program on cognitive computing in Watson, at IBM, and was just appointed as the CIO for the CIA. She ended up getting a couple of master’s degrees, one of them in applied encryption and the other in missile defense systems. So she is a hard-core engineer as well.

CIN: How has your English major training, and the skills you developed along the way, helped you in your career in a very different field?
KL: That’s easy. Communication. The communication gap between the legal community and the chief security officer is oftentimes not just written communication. It’s how to construct the story in a way that the other party understands the punch line. What tends to happen in my field is that the cybersecurity folks tend to be very engineering-oriented, and they’ll come to you with the answer. “The answer is five.” And you’ll say, “Why is it five?” And they’ll say, “Because it’s five.” They don’t know how to craft the business value of that answer. They just haven’t been trained in how to tell a story. The English major, the liberal arts background, gives you an ability to take a step back, summarize and be able to provide a recommendation to the audience that they can understand and act on.

CIN: Your job at EY is all about global cybersecurity.
KL: Correct. I’ll give you a practical example. We have customers who oftentimes don’t know how much money to spend on security. Our customers will say: “Look at my program. Is it effective? Where do I have areas of opportunity to improve? Where should I be spending money to get the most bang for my buck?” We help our customers quantify cybersecurity risk and the potential financial impact if it’s left unresolved, and then help them prioritize risk remediation activities.
     The problem is that technology is advancing so rapidly today that every time you add something, a new risk is introduced. And the way the security industry approaches security is kind of interesting. Because most security officers do not own application development, they are responsible for trying to secure that technology or tech-enabled service only after it has been built and delivered. It’s like trying to put a seatbelt into a car that’s already on the road. A lot of our strategic focus is on trying to get customers to change their approach to managing cybersecurity risk—to design trust inside.

Why She Hates Being Asked About a Breach
CIN: You told me that one of your least favorite questions from lawyers is, “Has there been a breach?” Why?
KL: Because often you don’t have the answer. It’s not just that you don’t have an answer; you will never have an answer. And that’s hard for people to understand. A breach denotes that there has been something illicit occurring within the network or within the system. It’s like saying, “OK, I think someone came into the 7-Eleven, and I think that there’s a Twix bar that’s missing. But we don’t have any video. Nobody was in the store—it was locked up. And we don’t have a real good inventory process to understand how many Twix bars we had. So, I don’t really know.” And that’s how it works in security. Most people would think, “Well, it’s cybersecurity. It’s all zeros and ones, and they’re flowing through a network.” But the fact is that in order for you to identify a breach, you have to have the tapes. And if the tapes don’t exist, there’s no way that you can answer that question. That’s why one of the hardest conversations to have with the legal community is “Has there been a breach?” Because the answer is: “Maybe.” And then they want to ask, “What are the indicators that led you to believe there was a breach?” Outside a third party calling you and saying they found your data—the Twix bar—there often aren’t any. That’s when you have to go through these long, tortured discussions and technical explanations, which become very difficult and sometimes emotional.

CIN: What if the next question you get goes something like this: Well, our company happens to have a division in London, and this data that may have been breached would seem to be European or have a connection to European customers who are covered by the GDPR. And we don’t have a lot of time here to report a breach. And you’re telling me that we don’t know, and may never know. So what am I supposed to do about reporting?
KL: That’s a good question. What I’ve found is that different legal communities have different interpretations of what is a breach. One interpretation I’ve heard is, “A breach is where there is evidence that there has been some sort of unauthorized disclosure, modification or interruption of availability of protected information.” If the confidentiality, integrity, availability is violated in some way, and you can prove that it was violated, then there’s a breach. Other communities I’ve come into contact with say, “No, that’s not it. You don’t have to prove that there was access, modification or interruption. You just have to prove that there was some sort of illicit activity in the network or on the system that could have led to violation.” It’s like saying, “Somebody walked into the 7-Eleven with the intention to steal a Twix bar. We don’t know whether they did, because we don’t have an inventory system. We have no tapes. We have nothing.” So do you report the fact that somebody was in there who shouldn’t have been in there or not?

The Problem With Definitions
CIN: How do you advise the lawyers?
KL: First, you have to get their definition, based on company policy. I would never interpret on behalf of the legal community how they want to define “breach.” I would always start with a conversation: “How do you define it? Potentiality? Or is it proof?” Based on what they conclude, you go into the forensic process, and you do the best you can to provide evidence to go along with the statement that they’re going to make. When in doubt, I have always been of the opinion that it is best to be aboveboard with your clients or other potentially impacted individuals, even if it’s “I don’t know.” My personal belief—and this is just me, Kris Lovejoy, not speaking on behalf of EY or anybody else—is that it’s much better to disclose even the fact that you’re not sure whether to disclose. Because in the event that something does happen and proof arises at some point, at least you’ve got that information out there.

CIN: There’s almost a Heisenberg uncertainty principle here. The nature of the phenomenon involves uncertainty for an indeterminate amount of time. That lawyer assumed that you would have an answer. And if not now, then in 15 minutes, or two days, or however long you estimate it will take to nail down. But you’re saying that some uncertainty isn’t just an aberration; it’s common, expected. Is that right?
KL: Exactly.  

CIN: In the law there are terms of art. There are various words and phrases that are defined and have a special significance. And they are well established and agreed upon. And you’re saying that there are really crucial words—when you’re talking about cybersecurity, what’s more important than the word “breach”?—that are actually not agreed upon.
KL: That is absolutely 100 percent correct. We need terms we can agree on. And taking it further, what is a “disclosable event”? Most organizations will say, “That would be a material or potentially significant breach.” Well, what does that mean? How do you define that? Based on what? Is it based on potential loss of revenue? How do you even begin to think about that?

CIN: What organization should be trying to consolidate definitions, or float ones that the community can respond to and maybe eventually agree on?
KL: Really, it’s the risk community. If you’re thinking of who should define it on behalf of any given organization, it really should be the chief risk officer. From the industry perspective, it’s going to be a collective. You’re going to be talking about risk officers, the legal community and to some extent the NACD [the National Association of Corporate Directors]. You’re going to want corporate directors to be involved. They’ll have a good sense, from a board perspective, of what they would consider to be materially significant and what kind of breaches they would want to be disclosed.

CIN: Should any organizations be involved? A government entity or nonprofit? NIST? Should some entity take the lead in attempting to come up with definitions that other people can then weigh in on?
KL: In 2017, the American Institute of Certified Public Accountants released a cybersecurity reporting and attestation framework for evaluating the effectiveness of cybersecurity risk management programs and completing an attestation report. This was an attempt to create a common language that can be understood by all stakeholders—including the board. While this was a great step, undergoing an examination under the framework is purely voluntary. At the end of the day, I would imagine the Securities and Exchange Commission will find itself obligated to issue binding guidance. Once it becomes obligatory, I’m sure we’ll see answers to thorny questions on definition.

CIN: Let’s talk about general counsel and chief information security officers. What are the challenges in the relationship?
KL: The biggest challenge is finding a mechanism for communication that works for both parties, because they come from such different mindsets, different understandings of what the same word means.

It Takes a Team to Build Security
CIN: So how do you bridge the gap?
KL: Let’s go to the specific relationship between Legal and the security community, and make sure that the combination of those two is not toxic. How do you do that? Legal and security organizations need to sit down and develop that lingua franca. They have to understand common definitions and agree to the process. The best way to figure this out is through role playing and tabletop exercises. Pretend that something bad has happened—for example, a third party has notified you that “confidential information” has appeared on Pastebin. Now what? Force yourself to detail your response, capturing the steps you take, the people you call, the point at which this becomes a disclosable event. What if the data wasn’t under your control but processed by a third-party vendor with whom you work? What should you do then? I would argue that it’s critical to invest in creating a very strong and tight partnership between the legal community and the security community from Day 1. You need to ensure that they are working hand in hand under these circumstances—as allies, not enemies.

CIN: Who makes the first move? Should CEOs formally invite general counsel and CISOs to sit down to get started?
KL: There’s an effective way of doing this that a lot of organizations haven’t yet deployed. They need to create a cybersecurity disclosure committee. Think about it this way: It’s a group of key stakeholders, including the chief legal officer, the chief information officer, the chief marketing officer and the chief risk officer. It also includes the heads of the lines of business, with the security officer as facilitator and lead presenter.
     What I would do with that committee is say, “You are my go-to decision-makers. If there’s a breach—something big that impacts the organization—you are going to make the determination as to whether we disclose publicly or not. And how we disclose. You’re going to be the funnel from this bad thing that happened up to the board and to the public. You guys and gals are going to make this determination, not me. My role as security officer is to inform you when there is something that meets the definition of ‘boy, this is bad and should be reported.’” I would advise that this committee not meet only when there’s something bad. It should also act as an ongoing advisory committee responsible for meeting on a routine basis and reviewing any incidents under investigation that are potentially significant and material—in other words, that may one day meet the “oh boy, this is bad and has to be disclosed” criteria. Over time, the group becomes a forum for talking about patterns that arise. For example, maybe every bad incident includes one vendor, or perhaps all the troubling incidents seem to involve one group of people in one department. With these insights, business leaders can help prioritize and champion remediation activities, like big policy changes, which may be unpopular but necessary.

CIN: How does this compare to the incident response team?
KL: They are two different things. There’s the day-to-day incident response team. They are working 24/7. When there’s an issue that may need to be disclosed, or may result in an HR or procurement action, that’s when you bring in the lawyer. From that point forward, the lawyer is on every single call, because these investigations are often carried out under privilege. But here’s the thing: They’re operating in the trenches. The Disclosure Committee I’m talking about is more of a governance body that enables the business leaders to come together and act as a strategic decision-making body with ultimate authority for determining whether something should be disclosed.

Why Analogies Are Important in This Field
CIN: You’re a big believer in analogies as a form of communication. Why do you find them useful in communicating in a business environment?  
KL: Some of these topics are really hard to understand. What I’ve found is that when speaking with business leaders, you can’t use the language that one would use when talking to another practitioner. We practitioners have our own language formed from knowledge that others don’t have. So somehow you have to communicate it in a way that others can wrap their heads around. For example, when talking about the importance of designing security inside everything we do, I use the seatbelt and safety glass analogies. Once upon a time, in the early days of automobiles, cars were traveling dusty roads at high speeds, and people were getting bugs in their mouths and dust in their eyes, all while trying to avoid varmints in their path. So, they put in brakes and put up glass so that people could drive faster. When you explain that you want to put security controls into an application so that you can drive faster, that makes sense to people. They don’t think of security as a bad thing—as something that’s slowing them down. They think of it as an enabler. And by virtue of using an analogy like that, it becomes easier for you to ask for money—because it’s not for something that’s going to stop things. It’s for something that’s going to help.
     It’s important to realize that the challenge in security is not technical; it’s a human challenge. In most security incidents, what you’ll find is a human being somewhere in the chain who did something dumb, misconfigured something or shut something off—ultimately allowing the bad guy to get inside. The easiest way to reduce your risk is to go to the people who have the propensity to enable the bad guys and help them understand the importance of their actions—decreasing the likelihood of an incident by decreasing the probability that they will let the bad guys inside. Take USB sticks. A lot of bad things result from people picking them up at conferences and sticking them in their computers, only to have malware uploaded. A good analogy would be: Hey, would you pick up a toothbrush on a street corner and stick it in your mouth? You shouldn’t put a USB stick in your computer either! People remember it. I’ve had people come up to me years later and say, “I still don’t use a USB stick unless I know where it came from.”

What Lawyers Need to Understand About Security
CIN: What are the most important lessons the legal team needs to understand about cybersecurity today?
KL: If you look at any given organization monitoring for cyberattacks, it’s important to realize that today most organizations can only see attacks that have been seen by somebody else before. In other words, there must be some sort of pattern or signature that’s been written and deployed in technology. When the monitoring tool sees that pattern or signature blowing through the network, it’s able to recognize and alert on it. The problem is that in today’s world, there are maybe 200,000 or 300,000 variants of new malicious code that are being introduced on a daily basis. There aren’t enough human beings to keep up with that volume of attacks and write the signatures needed to find them. We estimate that we can only detect about 50 to 60 percent of the attacks that are out there. That’s a big problem.
     The legal department needs to understand—yup, another analogy—that if my enterprise is like an organic being, I’m getting bombarded by bacteria and viruses all day long, most of which were just invented today. There is no way for me to understand what I’m seeing, because I’m patient zero and the diagnostic hasn’t been invented yet. The other thing to realize is that many security organizations lack the infrastructure to tape-record what’s happening. Going back to investigate the attack, and then create the diagnostic, is impossible. High-definition tapes simply don’t exist. Even if the enterprise does make recordings, what is recorded simply isn’t enough. Think about the traffic on a network being like a series of letters. The top of the envelope has the address and the addressee; the important stuff is what’s inside the envelope. What typically happens in most organizations is that they only record the top of the envelope. They don’t capture the content of what was sent. Without that, you can’t reverse-engineer the attack and devise the diagnostic. You’re in the dark.
     Where the legal profession can really help is by being advocates on behalf of the security organization. There are lots of reasons why we can’t answer their questions. Most often the root cause is that we don’t have the resources—the money, people or technology—to create the evidence required to say whether there’s been a breach. That blindness is the key problem. I would hope that what everybody reading this article will take away is that the legal profession can be of the most assistance by acting as an ally in developing the people, the tools and the infrastructure that enable us to get to the answer.

______________________________________


A LESSON ON DIVERSITY FROM THE CEO’S PERCH

CyberInsecurity News: Tell us a little about BluVector and how you came to be CEO.
Kris Lovejoy: I came into contact with BluVector technology several years back, when I was at IBM as the general manager of the security services division. A good part of our business was the detection and response to advanced security threats. And it was in that context that I “met” the technology. At the time BluVector was not really a product. It was more of a technology that Northrop Grumman was considering bringing to market. They thought it was important, but they were not sure it was commercially viable. My team was going through the process of evaluating it. As it happened, Northrop was looking for an executive to come in and spin the technology out, and I volunteered to go there to get the technology ready and to spin it out into a stand-alone entity.

CIN: What was it like to achieve the status of a CEO? And how did it compare to your previous work experience?
KL: One of the things I learned is that you certainly find out who your friends are when you’re the CEO of a startup. When you’re a general manager at IBM, you’re part of a giant company with lots of people who want you to help them. When you become the CEO of a cybersecurity startup, you have a small team with one goal—get your voice heard. It’s incredibly difficult—particularly when you don’t have a lot of resources to back you on the marketing side. As a private equity-backed organization, we simply didn’t have a lot of funding for marketing. We had to use our personal connections. I found out who my friends were because they were the ones who were willing to take my calls.

CIN: So there was no regal aspect to this “elevation.”
KL: Absolutely not. The drama really increases the smaller the company is, because you are closer to the team. When you’re the GM at IBM, you have a phalanx of support staff. You have talent managers and HR people and legal support. When you’re the CEO of a startup, it’s you. You are getting on the phone with the health insurance agencies to find out why employee No. 5’s wife wasn’t covered for an accident. You’re moving boxes, doing network troubleshooting, and buying office supplies and candy bars for your employees.

CIN: It actually sounds a little like your networking work in the very beginning, when you were trying to help the spouses connect with their husbands on the boats. You were the CEO of a startup then, too.
KL: Exactly. It’s pure innovation, unrestrained bootstrapping and focus.

CIN: After two years BluVector was bought by Comcast, a deal that closed very recently. And you moved on to EY. But you remain an adviser and consultant to BluVector and Comcast. Looking back, how do you feel about the experience?
KL: I’m very happy I did it. I learned a ton. I scratched an itch. It was good for me to have that experience and to bring it with me here. But there’s one thing I want to add about women and diversity that became apparent to me at BluVector in ways I couldn’t have imagined. While I can’t say I was completely immune to issues associated with diversity, I wasn’t as cognizant of the gap until I had taken that role. It was the first time I really felt there was a ceiling that I might not be able to crack. Prior to that, I had been lucky enough to be surrounded by institutions and people who were supportive. One of the big things I learned is: There is a serious problem. And we’ve got to fix it. The second thing I learned is that the cybersecurity market is oversaturated. There are simply too many vendors. There’s too much money. And we are doing our customers a disservice. In order for us as vendors to get our wares out in the marketplace, we are forced to make claims that are simply untrue in many cases. I’m not saying that about BluVector at all. But there are a number of companies out there that will sell you everything but the kitchen sink. And they can’t possibly deliver.

CIN: How did your experience as a woman in this field change when you were a CEO?
KL: I think what changed was my awareness of the chasm that I was going to have to cross. And for the first time, I really worried I couldn’t cross it. It’s hard to put in words. It may be easier to give you an example of what I’m talking about. I was at a conference hosted by the venture capital community specifically for CEOs from startups. Not all were cybersecurity vendors, but all were startups with the same revenue profile. There were about 60 or 70 of us. It was a two-day conference. I was one of three women and the only one who was a CEO. The other two were an HR executive and the organizer of the conference. On the second day there was a discussion about diversity. Somebody posed the question: “Why are there not more women on our executive teams?” One guy stood up, looked at the audience, and said, “I think we have to stop asking ourselves, ‘What can we do?’ It’s not us; it’s them. Look at this room. All these CEOs, and there’s not one woman CEO.” And he went on and on. Meanwhile, I had been participating for the past two days. I was 10 feet from the guy. I’m waving my hands above my head. The guy sitting next to him pulls on his jacket and says, “Hey, she’s a CEO over there.” And he looks at me and says, “Oh, sorry. I thought you were just one of those HR people.”
     This is unconscious bias. He didn’t just not hear me, he couldn’t see me. I don’t know how many VC, PE interviews I did while I was at Northrop Grumman trying to find a backer. I probably met with 100 people. And there was one woman, and she was junior. It’s very hard for women—particularly CEOs—to succeed in the marketplace, because I do believe there is unconscious bias that is keeping us from raising the capital and managing the companies the way our male peers are. It’s something that we have to grapple with. Because it’s not us!