Data-privacy expert Cate: Cybersecurity will continue to be ‘growth industry’

March 22, 2019
Fred Cate is one of the nation’s foremost experts on cybersecurity and has testified before numerous congressional committees about security issues. (Photo courtesy of Indiana University)

Fred Cate, one of the nation’s top cybersecurity experts, says data and privacy issues in the United States will always be difficult because an open society means people weigh their independence against the inconvenience of security.

But Cate, vice president for research at Indiana University, who specializes in information privacy and security law issues, said people can take some simple steps to protect their personal data, while experts continually work on ways to secure it.

A senior fellow of the Center for Applied Cybersecurity Research and distinguished professor at the Indiana University Maurer School of Law, the 56-year-old Cate has testified before numerous congressional committees and speaks frequently before professional, industry and government groups.

He spoke to IBJ about how the cyberspace landscape has changed over the last decade and the challenges the industry and the nation face.

Every day, there seems to be a new revelation about the privacy of Americans being breached in some way. Is that going to be the reality for decades to come, or do you think we’re going to figure out a more effective way or ways to protect data?

I think both are true.

It is absolutely going to be the reality for decades to come. In the future, we won’t be able to protect everything, but hopefully, we’ll learn to protect some things, so that it’s possible to say, with confidence, particularly about information that really matters, “This is secure from being lost or stolen.”

Are there key advancements that you see coming on the horizon?

We have a sense of what those are going to be already. The question is, really, whether the public’s willing to put up with it.

So, for example, multi-factor authentication, where you can’t log in using just one type of thing, like a password; you also have to use another type of thing, something you own, like a cell phone or a key that gives you a one-time login code. [With] these simple, widely available technologies, you can use multi-factor authentication right now to log into Google or your iPhone or things like this. But people don’t do it because it just slows them down a little bit.

An industry doesn’t make people do it because nobody wants to drive away a customer. So the only places that we really see it are either where it’s required by law, like financial institutions use it, or where employers require it of their employees, because most of us won’t quit our jobs over this. But I think, in the future, we’ll get more used to it, and we’ll say, “OK, these are really effective tools. Let’s use them more widely for Amazon accounts or something like that.”
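The one-time login codes Cate describes are typically generated with the TOTP algorithm standardized in RFC 6238: the server and your device share a secret, and each independently derives a short code from the current 30-second time window, so a stolen code is useless moments later. A minimal sketch in Python (the `hotp`/`totp` function names are ours; the test secret is the one published in the RFCs):

```python
import hashlib
import hmac
import struct
import time


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                 # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret: bytes, at=None, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a 30-second time counter."""
    now = time.time() if at is None else at
    return hotp(secret, int(now // step))
```

Because the code depends on a secret the attacker doesn’t have, stealing the password alone is no longer enough to log in.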

It almost seems like there are two camps. Many Americans are worried about their personal information being compromised, but there are other people who throw caution to the wind. Does the body of research show that the majority of Americans are willing to sacrifice privacy for the convenience of using smartphone apps and other technology?

Yes. We’ve always known the world’s broken down into people who we might call privacy fundamentalists, who are really worried about their privacy—and that’s a very small group—like, 10 or 15 percent. And then there’s a group of privacy pragmatists, who are willing to trade privacy for convenience or benefits or lower costs or something else. That’s a really large group.

And then there’s a tiny group of people who just don’t care at all. They don’t pull their blinds at night; they don’t lock their doors; they really just don’t care about this stuff.

I do think more people are going to start caring more. We’re not just talking privacy here. Computers control the brakes on your car, they control the airplanes, and they control the diagnostics at the hospital. So having good security that protects not just data but also the integrity of those systems—that is something, I think, people are going to care more and more about.

Are younger people generally less concerned about privacy?

That’s always been true. Fifty years ago, younger people were less concerned about privacy. Part of the reason they’re less interested in keeping things private is that they are still defining themselves to the world. But even for younger people, the first time you go apply for a job and discover that 94 percent of employers ask for your Facebook account, suddenly you start to care about privacy.

Are the attitudes of Americans a lot different than the attitudes of, say, Europeans?

Not really. There are some specific issues over which we have pretty different attitudes. Like, we tolerated telemarketing for decades, whereas in Europe, it was always against the law and it was seen as a huge invasion to call somebody at home to try to sell them something. But on most issues, we see surveys conducted across 20 or 30 countries, and the results are almost always the same: people worry about their privacy, but they don’t actually do a great deal about it.

Do you ever envision this country adopting a policy as restrictive as the European Union’s General Data Protection Regulation?

I don’t. Europe has adopted one of the most—not really restrictive but kind of bureaucratic—approaches to privacy, and it has a couple of failings that I really hope we never replicate. It doesn’t apply to a lot of activities. It doesn’t apply to most government uses for law enforcement or national security and things like that. So in the areas where, in the United States, we’ve been most concerned, it doesn’t apply at all.

The second is that it imposes some serious transactional burdens for activities that may not be worth it. So today, every single time we log into a new website, we get this little pop-up saying, “We use cookies. We’re just telling you. Is that OK?” Well, that’s because of the European law. But cookies are kind of a non-issue, and so Europe has imposed this multimillion-dollar solution for a problem that doesn’t really exist.

How is the availability of massive amounts of data reshaping criminal justice? For instance, if the police suspect me of a crime, under what circumstances would they have a right to search my text messages or the GPS function of my phone to determine where I was at different times?

This is a complicated area because, first of all, there’s infinitely more data available today than there was a decade ago, and certainly than there was 30 years ago. So there’s more data recorded for law enforcement to use. So previously, I might have had a conversation to plan a crime. Today, I might use a text message. Now, there’s something to go find.

Or previously, if I were a child pornographer, I would have a single image of child pornography; it would be a physical image. But today that might be on the internet so that it can be accessed from anywhere. That means it can be accessed by other child pornographers, but it also means law enforcement can access it, and therefore it can be easier to find.

And so, I think the starting point of thinking about this is just how much more recorded data, digital-accessible data there is today.

Maybe the best example is thinking about your phone. Your phone records your location, everywhere you go. You could look at your phone and see where you were yesterday. Ten years ago, you wouldn’t have had that. If law enforcement wanted to trail you, they would have to have agents on the street with cars, and it would be expensive and time-consuming, and they would have to know in advance that they wanted to follow you. Today, they can decide they want to know where you are, and they go to the phone company and get that information.

It’s almost impossible to imagine how different the world is for law enforcement today than it was 10 or 20 years ago. But not all of that data is accessible and not all of that data we have clear rules about. … Slowly now, we’re starting—through some legislative efforts and from courts—we’re starting to get some rules around this.

So, for example, for a century, the police have been able to seize something when they arrested you. They could look for weapons, they could look for drugs, they could look for something that might be evidence of a crime. Does that include your cell phone? Well, for decades, the police did seize cell phones. And then the Supreme Court said … “Nope, you can’t do that. You’ve got to go get a warrant if you want to search a cell phone.”

Let me ask you about the internet of things. What are the privacy implications as more everyday devices become internet-connected and thus are able to send and receive data?

There are lots of privacy implications. There are also lots of practical implications, like it’s consuming a lot of bandwidth because these are all using wireless or cellular, Bluetooth and various transmission technologies.

My TV is connected; my car is connected in all sorts of ways. More and more household appliances are connected. I’ve got a wireless alarm system around my house; you can use a wireless doorbell. Amazon will now sell you something to let you open your front door so that the delivery man can set a package inside rather than run the risk of it being stolen outside.

In all of those cases, the devices are collecting data. I set or un-set my alarm—that will tell you when I’m at home and when I’m not at home. And then you’re now increasingly sharing that data with some third party. Your smart TV or your Amazon speaker at home is listening to hear you say the thing you want it to do. That data is being captured and then, to some extent, certainly, after you’ve said, “Alexa,” it’s being shared with the cloud in order to provide you what you’ve asked for.

Eavesdropping is going on from all of these sensors. That data is being collected and in many cases shared. The question is, what happens if the third party does something else with it or if they lose it or they lose control of it?

There are some common-sense precautions we can take. Part of it is just thinking about, what is the risk? What is the sensitivity around this bit of data? You may remember four years ago, there was a scandal about nude celebrity photos being stolen from iPhones. It turned out those celebrities had left their iPhones set to upload any picture they took to the cloud, and then they had used weak or non-existent passwords for their cloud accounts.

There may be times you can’t avoid disclosing sensitive information in ways that might make you vulnerable, but at least make sure you know that you’ve made a choice as opposed to a mistake.

What will be the biggest business opportunities in the cybersecurity sector going forward?

There are going to be so many. Cybersecurity is not just about preventing bad things from happening. It’s about making sure you can recover quickly. … It’s not just about developing the technologies; it’s about putting them in workable places.

Right now, we have a gigantic shortage in the workforce of people that understand cyber issues and are trained to deal with cyber threats. I think this is going to be a growth industry for quite some time.

Why is there such a shortage in cybersecurity workers?

Ten years ago, nobody thought of it as a problem. I taught the first cybersecurity law class in [any U.S.] law school … and that was 14 years ago. Today, there’s not a law firm in the country that wouldn’t want people trained in cyber to know how to deal with breaches, to do forensic investigations, just to be able to talk intelligently with the experts they’re going to have to work with.

So I think there’s been a pretty fast evolution and the supply of workers hasn’t yet caught up to the fast-growing demand.

I’ve been told that government agencies have difficulty hiring cybersecurity workers because they need American citizens to fill those positions. Is there anything to that?

No question about it. You look at enrollment in computer science programs across the country and 80 percent [of enrollees] are non-Americans and from countries where we have difficulty doing the necessary background checks to grant those people clearance.

Another issue is, we have all sorts of rules around employing people. So we’ll say, “You have to have a college degree.” It turns out a lot of people great in cybersecurity may not have college degrees.

Or we say, “You’ve got to work an 8-to-5 day.” Well, I’ve got to tell you, there aren’t a lot of computer scientists who work 8-to-5 days. They’re not good at saluting. They’re not good at wearing a tie.

So, increasingly, we’re having to recognize that the sort of abilities and skills and experiences we need may not fit well in a kind of buttoned-down environment. Even militaries recognize this. They increasingly let the cyber people work in ways that they might not let other people work.

How is the U.S. doing with respect to cybersecurity? Are we up to speed with the rest of the world?

We’re still pretty much the best. Our challenge is, as I think all the discussions about Russian interference in the election suggest, we are an incredibly open society. And so, no matter how good we are at our cyber offense or cyber defense, we are always going to be an attractive target because of how open we are.

And that’s not true in Russia, or in many cases, in China. They are also excellent in cyber offense and defense, but they’re not as open, so it’s a little harder to target them. Therefore, they may have fewer vulnerabilities or fewer breaches.•


Recent Articles by Anthony Schoettle
