CyberInsecurity News

INTERVIEW: MICHAEL YAEGER / CARLTON FIELDS
A LITIGATOR’S-EYE VIEW OF CYBERSECURITY
A former federal prosecutor advises companies with the benefit of what he learned.

JANUARY 1, 2019
During six years as an assistant U.S. attorney for the Eastern District of New York, Michael Yaeger practiced in the Business and Securities Fraud Section of the Criminal Division. Before he left, he was the computer hacking and intellectual property crimes coordinator. He followed with four years as special counsel at Schulte Roth & Zabel, where he developed and led the data breach response practice. Then, late last year, he became a shareholder at Carlton Fields, where he joined a larger team working mostly on cybersecurity and government investigations, particularly in the financial services and health care industries. What his clients get now is a litigator’s perspective, developed during a decade that happened to dovetail with the explosion of activity in cybersecurity. That experience served him well, as evidenced by the fluency with which he discussed managing data at BYOD companies, cooperating with law enforcement, seizing bogus domains and working with CISOs.

CyberInsecurity News: You were a federal prosecutor from 2008 to 2014. What was a big takeaway from your experience that in-house lawyers should know?
Michael Yaeger: Companies that have been victimized can obtain aid from the government, especially when they are dealing with law enforcement as opposed to their primary regulator. To take one example: People in the investment adviser space are going to, appropriately, have a different view of the Securities and Exchange Commission than they would of the FBI. The SEC is their regulator, which does exams and looks at a wide range of issues. Contacting the FBI or the Secret Service is a different proposition, because those agencies don’t have a general mission of monitoring and regulating. And when people deliberately turn to law enforcement in a cybersecurity crisis, it can help them manage their relationship with their regulator. The Justice Department and state prosecutors also have legal powers that no private plaintiff has—like grand jury subpoenas, which they can send out before they’ve brought a lawsuit. A private company has no way to do that. And the government can also do search warrants or undercover operations.
     Those are the more dramatic examples. But one that isn’t talked about a lot is forfeiture. At my old office, the Eastern District of New York, great forfeiture attorneys would help with ex parte seizures of domain names. So if someone were impersonating an investment adviser in a scheme, the U.S. Attorney’s Office could seize that domain name—estop it—fast.
     It doesn’t always make sense to run to the government. And contacting the government is not a magic act. It doesn’t mean that attackers will be caught, much less caught quickly. It really does depend on the particular facts and situation. But increasingly, if you are in a situation where you believe, either because of legal obligations or because of business risk, that it makes sense to notify your customers and/or state attorneys general and consumer affairs bureaus, you may then say, “Well, I can turn to someone in the government.” And they can partner with you. There are times when it makes sense for a company to show its customers that it is proactively doing what it can.

CN: When you were in the government, did companies often notify you when they’d suffered a breach?  
MY: No. It ranged widely. There were some companies where we were the ones telling them that they had been breached. We had discovered it before they had. It depends on the industry and the type of company. My own experience in private practice is that people are doing this more. There is more proactive involvement of the government and coordination with the government by companies.

CN: Did you find that large institutions move slower, and are not as quick to contact the government?
MY: No. The opposite. Large institutions are interacting with the government all the time, and they tend to be extremely regulated. And they’re probably going to have to notify people regardless. They also have more compliance staff. In general, one of the difficulties today is that companies start having a large geographic footprint before they have a lot of employees or compliance structure, because of an increasingly globalized economy. You have startups that are now across the country and sometimes across the globe very fast, without a large in-house legal department.

As an outside lawyer, sometimes you’re leading the charge, sometimes you’re a role player. But there is always a team.

CN: So the companies that already have relationships with government agencies—there’s a comfort level there.
MY: Absolutely. And there are good sharing organizations—the ISACs [Information Sharing and Analysis Centers], the different organizations where people share information with other people in the same industry. But there are also organizations where people have regular contact with investigators, like the National Cyber-Forensics and Training Alliance. So when people have relationships and personal trust, this stuff happens. And it’s good, because it can help protect all of us. It’s not always easy to do. The government always wants to do more of it, and it doesn’t always make sense. But in general, this is not a situation where competitors are hurting each other. They’re helping each other if they can shut down bad actors.

CN: During your time in private practice, what has been the most common way that you’ve gotten involved in cybersecurity cases—if there is such a thing?
MY: There’s been a mix. I have had a hand in drafting information security plans and incident response plans, and then, of course, I’ve been called in for breaches by companies I’ve never spoken to before. And then there’s related counseling, and sometimes litigation that springs out of cyber events. So it is a range. Frankly, for a lot of institutions, certainly smaller ones, there is an employment aspect to cybersecurity—employees doing things that they shouldn’t, either accidentally or on purpose. Then there are attackers, ransomware schemes, attempts to steal data or trade secrets, straight-up attempts at social engineering to get people to transfer money.   

CN: As an outside litigator, when you’re called in, you’re talking to in-house lawyers, who don’t know nearly as much as you do about cybersecurity and the law, and you’re dealing with technologists, who know more about the technology than you do but aren’t lawyers. Are there language barriers you need to overcome to communicate with these people?
MY: There are always issues of translation. This is always the lawyer’s job. When I was in government, and I was the computer hacking and intellectual property coordinator for my office, I was sometimes explaining new technology and new forensic techniques to judges. And training people. At the same time, of course, there were cyber agents and technologists who were teaching me.
     But this really isn’t so different from being a lawyer who works on complicated accounting cases and gets an expert to assist there. And then has to go explain it to a jury. But also in counseling, it is the translation between a few different realms. It is multidisciplinary. A classic example of the need for people to work together as a team, and to understand their own strengths and limitations, is just a standard SEC exam that an investment adviser would have. The SEC staff will be coming in, and they’ll be talking about cybersecurity, and they may want to speak to the technical staff about what they do. And so it’s important that the lawyers know what the technical staff does when they write the policies, so that the policies will match up to what the staff will say that they do. Another example is an internal investigation, where it can get especially sharp. There are technical people who are essential, but often they are not the decision makers in an enterprise. The technologists may not be the people who have the authority to make the ultimate call. As an outside lawyer, sometimes you’re leading the charge, sometimes you’re a role player. But there is always a team.

CN: Do you find that in-house lawyers and their chief information security officers are communicating effectively these days?
MY: Many are. It varies. But people understand the importance of the area and are very much approaching this in good faith. They’re not experts in each other’s realms, but they’re trying, and they care. Usually you can get people rowing in the same direction because they understand the mission. They understand what the desired outcome should be. As there are more and more compliance needs, people have increasingly had to talk to each other about these problems. Certainly, if they’ve done tabletop exercises, they’ve also done a kind of simulation of what they would do in a crisis. And while no plan is actually a substitute for the experience of having gone through a real crisis, it is helpful.

CN: What are some of the essential lessons you want your clients to learn about cybersecurity?
MY: There was a time when information technology was purchased by the company and was thought of as something that you had at your job. Maybe you had a personal computer or a laptop at home. Now people are used to buying software themselves all the time, in the form of an app for their Android phone or iPhone. People are used to finding new productivity tools that they get themselves. This can deeply complicate e-discovery or internal investigations. When people have their own devices, when they’re in a BYOD [bring your own device] company and they are storing data in different places, companies have to give a lot of thought to how they manage this. Because in a BYOD company, where employees own the mobile device they use for work, company information is on a device that the company doesn’t own. And so there are a number of tools that many companies use that allow them to segregate some of the company information and to wipe it remotely if a phone is lost, or if and when someone is released [from employment]. Some of this, however, depends on the policies you have. Employees need to know that they do not have a reasonable expectation of privacy in certain areas. And companies have to obtain consent for inspection of devices upon termination of employment.
     One of the basic principles in cybersecurity is understanding what assets you have and where they are. What’s the most valuable stuff? Where does it sit? Managing the software that your employees use, and how they communicate, is part of this. You certainly wouldn’t want people doing work on some messaging app like Signal that you didn’t even know about. You have to give some thought to how your employees will actually act, and whether you have made things inconvenient or unrealistic. And this is why you have to have a graceful way for people to work remotely in different ways. You have to think about how they act when they travel. Employee training has to be designed with these practical needs in mind. It can’t just be where someone comes in and shouts at people about how they need to follow all these rules. There has to be a concrete and realistic appraisal of how people are going to act. That’s how you get real security, as opposed to mere paper compliance or CYA.

Selling an app made for tourists walking in Rome makes a company subject to the GDPR, even if it has no presence itself in Europe.

CN: The EU’s General Data Protection Regulation is getting a lot of attention, provoking lots of anxiety. And it’s only been in effect since May. How is that affecting your practice?
MY: There have been a lot of clients who have had to update their policies and the consents they obtain from customers. There have been a number of clients who realized that they now have what counts as a presence in the European Union, and they didn’t think they did. The recent guidance by the European Data Protection Board speaks to this. They issued some guidelines that are not binding, but have a certain influence. I suspect that this will be at least as important as, say, a statement by the American Law Institute (ALI) on American law. They go through some of what it means under Article 3 to have an establishment in the European Union or to target the European Union. One example they give is a company that is based in America and is not marketing to EU residents. It offers an English-language app designed for tourists to use while they’re walking the streets of Paris or Berlin. So you’d have a customer who downloaded the app in America. There was no advertising directed overseas, and the company has no physical presence overseas, but the app is targeted at behavior that would occur in the European Union. And the guidelines give that as an example of a company that would fall under the GDPR. So, what we’re going to see is continual attempts to understand and clarify this.
     I see this as analogous to what happened to personal jurisdiction in the 20th century with the creation of the automobile. We were getting situations that just didn’t happen with great frequency before. A resident of one state and a resident of another state have an accident in a third state. People from New York and New Jersey having a car accident in Connecticut. What happens next? Where do they get sued? These are the kinds of determinations that will be made to gauge the extraterritorial scope of the GDPR. Exactly how far does the long arm of the law reach? The guidelines help. But there will be other situations where we’re just going to have to see how it plays out in the courts.

CN: What are some of the other new challenges that companies are confronting that are likely to pose big risks in 2019?
MY: Another thing that’s happening is lawsuits that spring from cybersecurity issues but are not brought under privacy or cyber statutes. Like 10b-5 suits for a stock drop, where the material event is a cybersecurity event. We’ve already seen shareholder derivative suits, where shareholders are suing on behalf of the company because of a cybersecurity-related issue. It happened with Target and Wyndham and Home Depot. Here, using these statutes, there’s no standing problem for the suit in the way that there often is for some of the privacy suits or a suit under the Computer Fraud and Abuse Act. The plaintiff wouldn’t have to show a classwide loss or have a detailed complaint pleaded with particularity. Instead, a derivative suit can just begin with a letter to the board demanding that the board take action against executives. Let’s say we have a Caremark suit that springs from a failure of oversight. There’s a claim that the directors have not honored their duty of loyalty and good faith. The company cannot insulate directors from damages for breaches of the duty of loyalty. On the other hand, there’s a pretty high standard of having to show that the directors utterly failed to implement controls or consciously failed to oversee controls. So some of these derivative suits have failed. But over time, as the courts get a sharper idea of what the oversight obligations are in cybersecurity, we may see more of these—and more successful suits.
     One example is patching software. This is an absolutely basic and important cybersecurity control. When a company like Microsoft, for example, issues a patch for a security flaw, you have to implement it. You’re on notice. But sometimes this is easier said than done. Large enterprises have a lot of different software programs. And it’s not as if patching is just a five-second thing. But it’s one example where it seems incredibly simple. “Why didn’t you patch your software?” A plaintiffs’ suit could make that kind of allegation. I think what we will see is people in the plaintiffs’ bar taking broad general statutes and using cybersecurity events as the basis for claims under them. This has been going on for some time, but I think it will accelerate.

CN: You’ve written about the different agencies that oversee cybersecurity cases. Recently the Federal Trade Commission (FTC) made a pitch to reaffirm its jurisdiction, claiming that it is the lead agency and looking to Congress for legislative confirmation. What do you expect in 2019? Will the FTC consolidate its oversight?
MY: I expect that a lot of regulators will continue pursuing this. One key source of regulation here, frankly, is the state AGs. They have general statutes that apply across a wide range of industries. I’m not saying that the FTC won’t be active. I’m expecting that they will be. But cybersecurity is, to a certain extent, a reflection of the American system as a whole. We have sector-specific regulation, and we have a federalist system. And so we’ll have multiple regulators looking at a single space.
     At the same time that we have federal securities laws, there’s also a role for state attorneys general in securities enforcement. God knows, I’ve made, in different legal realms, pre-emption arguments. But there is an overlapping quality to American regulation, and we have not created a centralized regulator for all cybersecurity. We have grafted it onto our existing regulatory regime. And so many of the existing regulators have added cybersecurity pieces. The SEC has things to say about cybersecurity, as does DHS [the Department of Homeland Security]. So do the self-regulatory organizations: FINRA [Financial Industry Regulatory Authority], the National Futures Association. The CFTC [Commodity Futures Trading Commission], too. In New York, the Department of Financial Services has made pronouncements, as have the national banking regulators, such as Treasury, the OCC [Office of the Comptroller of the Currency] and FinCEN [Financial Crimes Enforcement Network]. A lot of this is: The people you’ve been dealing with forever now have additional regs—cyber regs. The old players have a new game.

CN: While we’re talking about the states, people are concerned that there are separate data breach laws in all 50. And now the states are getting involved in creating privacy laws. And there’s the question of whether Congress will finally pass its own privacy law and pre-empt state laws. Is it a big problem for companies to sort through the demands of all these different agencies? 
MY: It can be. Sometimes they coordinate well. There is some virtue to the gradual working out of these rules, and some regulators have been more cautious and tried to use the standards put out by NIST [the National Institute of Standards and Technology] as a guide. So they’re trying to speak to each other. But coordination problems are very real. I should add that they’re not just government-created. The mere act of having to perform due diligence on your vendors can be extremely time-consuming, and a lot of this is enforced by contract. It’s just a fact that a large bank may have a large number of vendors who all have different due diligence questionnaires. Heck, a law firm will get a large number of due diligence questionnaires from clients that are not uniform. There have been various attempts to solve that particular problem. Different trade organizations have tried to put out standardized questionnaires. It may be that a federal law standardizing breach notifications would solve some of these problems. In the short term, I think it would be quite welcome. But it also might create problems if it’s not done well. And this is such a fast-moving area that passing overarching legislation could create its own problems five years after it’s passed. Coordination problems are real, whether it’s the public sector or the private sector, but centralization can be another issue.
Michael Yaeger