FUTURE OF CYBERSECURITY - WITH BRUCE SCHNEIER

 

How can we build a more secure digital future?

 
 

Cyber-attacks and data breaches are escalating, attackers are employing ever more sophisticated tools, and AI is transforming the ‘arms race’ between attackers and defenders. We see the headlines, but is this a future we have to accept? What are the pathways to a more secure digital future?

I asked Bruce Schneier, the biggest name in cybersecurity.

Bruce is a DEEP thinker. He’s been researching and writing prolifically on cybersecurity since 1998, has authored more than 12 books, is a Fellow and lecturer at Harvard’s Kennedy School and a board member of the Electronic Frontier Foundation, and is also Chief of Security Architecture at Inrupt, a data-security venture co-founded by Sir Tim Berners-Lee, the inventor of the World Wide Web.

What I like best about him is this: he thinks about cybersecurity in all its holistic glory, as the product of social, political, economic AND technical forces. He's a realist. Added to which, he has a wonderfully sharp turn of phrase that cuts through the nonsense to get right to the heart of an issue.

We discussed why government intervention is crucial for correcting market failures, IoT laws, the benefits of data decentralization and compartmentalization, disrupting cryptocurrencies to disrupt ransomware, treating superfluous data as a toxic asset, AI, thing-to-thing authentication, the unknowns of the post-quantum world, crypto-agility, the benefits of zero trust authentication and more.

Enjoy the podcast, and scroll down for some of my personal takeaways, a bunch more links to Bruce’s blog posts so you can take a deeper dive, and finally - for those who prefer reading over listening - a full transcript of our conversation.

And don’t forget you can listen to all my future-focused interviews with the world’s best thinkers and scientists by subscribing to FutureBites on Spotify, Apple, or your favorite podcast platform.

 
 

CHANGE THE INCENTIVES

We started with Bruce’s most important and (based on my readings of his essays) most oft-repeated message: government intervention is crucial for correcting market failures. Market forces have not, and will not, sufficiently incentivize cybersecurity. As he points out, we made automobiles safer, and planes, and pharmaceuticals, and appliances, and indeed everything where inadequate design choices could harm or kill us, the same way: we regulated. When data breaches can ruin us financially, and digital failures in hospitals and electricity networks and transportation can kill us just as dead, why should cybersecurity be any different?

Hard to argue against that, isn’t it?

He pointed to the General Data Protection Regulation (GDPR), which came into force in the EU in 2018, as an example of solid progress, but did not shy away from the difficulties of progressing regulation in the US, where cybersecurity is not yet the stuff of political debates, where the tech giants wield considerable power, and where the standard corporate reflex is to kick against regulation. Most of my Silicon Valley friends roll their eyes at the very mention of the word.

Bruce’s observations about incentives and corporate behaviours tally with my own. Some years back I had occasion to spend time with the digital security team at a large bank and was surprised to discover their approach was ‘probabilistic,’ by which I mean they were prepared to tolerate paying back defrauded customers so-and-so many millions of dollars in losses each year, because this was cheaper than investing in top-notch systems. This “good enough” approach on a risk-versus-cost basis is troubling for two reasons: (1) It doesn’t factor in the anxiety, pain, time and trouble experienced by impacted customers, reimbursed or otherwise, and (2) As the wonderful Professor Atif Ahmad recently pointed out to me, in a world where a single cyberbreach has the potential to freeze or cripple an organization, the only sensible approach is ‘possibilistic,’ not ‘probabilistic.’

I think the public appetite for intervention is growing. Facebook & Friends have, through their bad behaviours, raised awareness of the perils of data profligacy. Every mega-breach of customer data adds to that awareness. AI-driven deepfake attacks, catnip to the media, will certainly push anxieties to new heights. Lawmakers will surely follow.

DECENTRALIZING DATA

I’m excited by Inrupt, and the Solid Protocol that underpins it, especially in its application to healthcare, of which more here and here.

Bruce said that the Solid Protocol has the potential to be more transformative than the Web because it separates data from applications and authentication in a way that’s valuable to the user. He was realistic about how long it could take to drive large-scale change, and agreed that governments were a good starting point, because there’s real benefit in giving citizens a pod through which they can interact with all of government. Interacting with government is annoying: typing the same data into lots of different forms for different agencies makes no sense.

Couldn’t agree more. In 2007 I wrote, with Greg Stone and Simon Edwards of Microsoft, a primer for public servants on securing citizen information, and it was ALL about compartmentalization, decentralization, using only the information necessary, and giving citizens control of their own data, not because it was prescient, but because those principles are fundamental and have been around forever! In 2005 some scientists at CSIRO briefed me on a technology they called Privacy Preserving Analytics, designed to help science and medical R&D organisations bring patient datasets together for research without compromising the security or identity of patients. It hasn’t been widely adopted yet, but the concept lives on in various forms because it is right.
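The CSIRO technology was never described to me in implementation terms, but one simple ingredient of privacy-preserving record linkage can be sketched in a few lines: replace real identifiers with keyed hashes, so datasets join correctly for research without exposing who the patients are. Everything below (the key, the datasets, the field names) is invented for illustration, and real privacy-preserving analytics goes much further (secure computation, differential privacy and so on):

```python
import hashlib
import hmac

# Hypothetical per-project secret held by a trusted linkage authority,
# never shared with the researchers who receive the joined data.
LINKAGE_KEY = b"project-secret-key"

def pseudonym(patient_id: str) -> str:
    """Replace a real identifier with a keyed hash (HMAC-SHA256).

    The same patient gets the same pseudonym in every dataset, so
    records can be joined, but without the key the pseudonym cannot
    be reversed to recover the original identity.
    """
    return hmac.new(LINKAGE_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

# Two hypothetical datasets from different providers
hospital = {"patient-123": {"diagnosis": "T2 diabetes"}}
pharmacy = {"patient-123": {"dispensed": "metformin"}}

# Join on pseudonyms instead of real identifiers
joined: dict[str, dict] = {}
for pid, record in hospital.items():
    joined.setdefault(pseudonym(pid), {}).update(record)
for pid, record in pharmacy.items():
    joined.setdefault(pseudonym(pid), {}).update(record)

for pseud, record in joined.items():
    print(pseud[:12], record)
```

The keyed hash matters: a plain unkeyed hash of an identifier can be reversed by brute force over the space of plausible IDs, whereas here an attacker also needs the linkage key.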

So, decentralizing data using technologies like Solid might be slow, and it might be incremental, and early adopters will most likely continue to be the public sector, where public interest trumps data-avarice (see below), but it represents a truly important pathway to a more secure digital future.
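The pod model is easiest to see in miniature: data lives in a store the user owns, and applications receive revocable, scoped grants instead of holding their own copies. This is a toy model with invented class and method names, not the Solid Protocol itself (which specifies this through web standards such as WebID and access-control lists):

```python
class Pod:
    """Toy user-controlled data store with per-application read grants."""

    def __init__(self, owner: str):
        self.owner = owner
        self._data: dict[str, dict] = {}
        self._grants: dict[str, set[str]] = {}  # app -> readable keys

    def write(self, key: str, value: dict) -> None:
        self._data[key] = value              # only the owner writes

    def grant(self, app: str, key: str) -> None:
        self._grants.setdefault(app, set()).add(key)

    def revoke(self, app: str) -> None:
        self._grants.pop(app, None)          # the user can cut access off

    def read(self, app: str, key: str) -> dict:
        # Applications see only what they have been granted; the data
        # itself never leaves the user's control.
        if key not in self._grants.get(app, set()):
            raise PermissionError(f"{app} has no grant for {key}")
        return self._data[key]

pod = Pod("alice")
pod.write("health/steps", {"2024-05-01": 9200})
pod.grant("fitness-app", "health/steps")
print(pod.read("fitness-app", "health/steps"))  # allowed while granted
pod.revoke("fitness-app")
# pod.read("fitness-app", "health/steps") would now raise PermissionError
```

The security property Bruce describes falls out of the structure: hacking one application no longer yields everybody’s data, because the application never held it.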

DATA AS A TOXIC ASSET

Right now, every large corporation seems to be hoarding as much data as possible, for analytics and for AI. AI is the new driver: business leaders who struggle to articulate all the ways they might apply AI in future are motivated by a general sense that any and all of their data assets might one day be useful feedstock for machine learning. I love how Bruce reframed this in terms of incentives and again used this to show the need to change the incentives via regulation:

Saving it wasn't free. It didn't cost you in storage, but it cost you in risk. If someone hacks your system, they now can steal so much more data and now you have annoyed so many more of your customers. Now we don't have liabilities in a way that makes that really hurt, but if we did, everyone would decide that data was a toxic asset … But watch what goes on. The company saves it, and it's a risk to you and I. Until the company feels our pain, they're not going to change their behavior.

Again, Bruce is realistic about the challenge, but his goal isn’t some kind of 180-degree reversal; it’s more like creating an environment where managers finally have to think about their ‘keep everything’ decisions and ask: do I really want to keep that data, when losing it makes me liable?

DISRUPTING RANSOMWARE

Without cryptocurrencies, there'd be no way for victims to pay the ransom. Start with that and at least one future pathway becomes clearer!

Bruce advocated disrupting the exchanges, interdicting transfers, and targeting criminals based on patterns of use — If you are, over the long term, a seller, you are either a ransomware criminal or a cryptocurrency miner [and] We know who the miners are — and overall, because of the challenges of pressuring players in a system designed to be independent, the more we bring cryptocurrency under the normal banking rules, the better it'll be for normal people that use it and the worse it'll be for criminals.

Enough said.

Worth reading this longer essay by Bruce Schneier and Nicholas Weaver on cutting down on ransomware attacks without banning Bitcoin.

And Bruce and I are 100% aligned on the redeeming values of cryptocurrencies: there aren’t any.

THING-TO-THING AUTHENTICATION

When I asked about the future of authentication, Bruce zeroed in on the Internet of Things (IoT) and the big and as-yet-unresolved challenge of fast and reliable ‘thing-to-thing’ authentication (as opposed to human-to-thing authentication). While his main point was about ensuring the trustworthiness of computers that affect the physical world, he really got me thinking about the speed aspect. Car-to-car communication is fundamental to the future of transportation. When multiple proximate vehicles are sharing location, trajectory, and intentionality data, and when the price of delay is collision, the authentication layer is going to have to be very quick and very efficient!
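To make the speed point concrete: a symmetric message authentication code can be computed and checked in microseconds, which is one reason schemes in this space lean on MACs rather than per-message public-key signatures. The sketch below uses an invented session key and message format, and says nothing about the genuinely hard, unresolved part Bruce points at: how two strangers’ vehicles establish and trust that key in the first place. It only shows the basic authenticate-then-act pattern:

```python
import hashlib
import hmac
import json

# Hypothetical pairwise session key, established out-of-band
SESSION_KEY = b"vehicle-pair-session-key"

def sign(message: dict) -> tuple[bytes, bytes]:
    """Serialize a message and compute its authentication tag."""
    payload = json.dumps(message, sort_keys=True).encode()
    tag = hmac.new(SESSION_KEY, payload, hashlib.sha256).digest()
    return payload, tag

def verify(payload: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(SESSION_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Invented car-to-car safety message
msg = {"vehicle": "car-42", "lat": -33.86, "lon": 151.21,
       "heading": 270, "braking": True, "ts": 1700000000}

payload, tag = sign(msg)
assert verify(payload, tag)             # authentic message accepted
assert not verify(payload + b"x", tag)  # tampered message rejected
```

A receiving vehicle would only act on the braking flag after `verify` succeeds; the design question is whether that check, plus key lookup, fits inside a collision-avoidance latency budget.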

The California IoT law that Bruce mentioned is here and a plain-language discussion of its weaknesses can be found in this article in Forbes. Bruce’s call for stronger cybersecurity laws for the Internet of Things is articulated in this essay.

THE A.I. DIMENSION

I was fascinated by the way Bruce brought my question about AI back to IoT and computers having agency to affect the world in a direct physical manner. As he put it:

It's not really an AI story, it is more an Internet of Things story. It's cars, it's thermostats, it's drones, it's weapons systems, it's power plants, refrigerators. All of these things are computers, and all of these things affect the world in a way your laptop doesn't. And that's the change. AI is a layer of autonomous decision-making on top of it.

Bruce described this connection more fully in a recent essay. You’ll find more of my thoughts on how agency relates to AI threats here. A whole new dimension to this risk will open up as AIs become indispensable personal agents that are richly and emotionally engaging and insert themselves in all kinds of interpersonal and financial transactions, as I wrote about in The Real Threat From AI. Any hack that compromises the agent will be insidious indeed.

Two AI/cybersecurity topics we did not have time to explore were:

(1) AI-enabled voice and video deep fakes for spoofing and phishing attacks (I’ve witnessed demonstrations, and they are insanely good and getting better fast, and of course infinitely scalable and mass-customisable at negligible cost), but so much has been written about this elsewhere that there is, perhaps, not much to be added here.

(2) The looming AI attack-defense arms race in terms of ‘identifying and exploiting’ software vulnerabilities versus ‘identifying and patching’ them, which Bruce has written about here. Given the asymmetric nature of cybercrime, and the relatively long timeframes it takes for corporations and governments to react to new forms of attack, the balance seems likely to shift in favour of attackers over defenders through the next 5-10 years, but no one really knows yet.

CRYPTO-AGILITY

Bruce changed some of my views on the transition to post-quantum encryption. While we agreed on the unknown timeframes of quantum computers that can break classical cryptography, he downplayed the scale of the ‘harvest now, decrypt later’ threat, and he framed the more important challenge for decision-makers not as transitioning to the post-quantum standards developed by NIST, but as achieving crypto-agility (when an algorithm is broken, being able to quickly switch to another) in the face of ongoing uncertainty. He anticipated that this future pathway would mostly be delivered embedded in cell phones, operating systems and other tech products, and that the best immediate action for decision-makers was to pressure their suppliers to support it. Crypto-agility was not on my radar before this conversation.
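Crypto-agility is as much a software-engineering pattern as a cryptographic one: tag every protected object with an algorithm identifier, dispatch through a registry, and make ‘switch algorithms’ a one-line configuration change rather than a rewrite. A minimal sketch, using hashes as a stand-in for whatever primitive needs replacing (all names here are illustrative):

```python
import hashlib

# Registry of algorithms keyed by an identifier that is stored
# alongside every digest, so old records remain checkable.
ALGORITHMS = {
    "sha256": lambda data: hashlib.sha256(data).hexdigest(),
    "sha3_256": lambda data: hashlib.sha3_256(data).hexdigest(),
}
CURRENT = "sha256"  # the one line to change when an algorithm falls

def protect(data: bytes) -> dict:
    """Record the digest together with the algorithm that produced it."""
    return {"alg": CURRENT, "digest": ALGORITHMS[CURRENT](data)}

def check(data: bytes, record: dict) -> bool:
    # Dispatch on the recorded identifier, so records written under
    # the old default stay verifiable after the switch.
    return ALGORITHMS[record["alg"]](data) == record["digest"]

old = protect(b"document")   # written while sha256 was current
CURRENT = "sha3_256"         # simulated break response: switch defaults
new = protect(b"document")

assert check(b"document", old) and check(b"document", new)
assert old["alg"] != new["alg"]
```

The point of the exercise: systems that hard-code one algorithm into wire formats, databases and APIs cannot make that one-line switch, which is why Bruce’s advice is to demand agility from suppliers now, before any particular algorithm is broken.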

ZERO TRUST SYSTEMS

Bruce agreed that we’ll see more systems employing continuous or repeated authentication through the course of a transaction. To me, this is another positive addition to the cybersecurity ledger, and one with a great deal of innovation to come, especially in designing more automated and unobtrusive layers of monitoring that preserve convenience.
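A toy sketch of what continuous authentication can look like from the system’s side: every new risk signal re-scores the live session, and the response escalates from allow, to step-up authentication, to termination. The signals, weights and thresholds below are all invented for illustration; real deployments tune against device posture, geolocation, behavioural biometrics and more:

```python
from dataclasses import dataclass, field

# Illustrative signal weights and thresholds (invented values)
RISK_WEIGHTS = {"new_device": 40, "new_location": 30,
                "impossible_travel": 80, "typing_anomaly": 25}
STEP_UP = 50      # ask for another factor above this score
TERMINATE = 90    # kill the session above this score

@dataclass
class Session:
    user: str
    risk: int = 0
    events: list = field(default_factory=list)

    def observe(self, signal: str) -> str:
        """Re-evaluate trust every time a risk signal arrives,
        rather than trusting a single login forever."""
        self.risk += RISK_WEIGHTS.get(signal, 0)
        self.events.append(signal)
        if self.risk >= TERMINATE:
            return "terminate"
        if self.risk >= STEP_UP:
            return "step_up_auth"
        return "allow"

s = Session("alice")
print(s.observe("new_location"))       # allow (risk 30)
print(s.observe("typing_anomaly"))     # step_up_auth (risk 55)
print(s.observe("impossible_travel"))  # terminate (risk 135)
```

The ‘unobtrusive’ part I’m hoping for is in the middle tier: good signal design means most legitimate sessions never cross the step-up threshold at all.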

OTHER CYBERSECURITY PATHWAYS

The future of cybersecurity is a BIG topic, and 45 minutes is hardly enough time to cover all of it. I encourage you to dive deeper into Bruce’s blog. Doubly so because the vast majority of books, blogs and websites out there are today-focused or, if they attempt to discuss the future, are heavy on looming problems and positively anaemic on future solutions.

What additional pathways exist to make big step-change improvements in the future of cybersecurity? What did we miss that all business and government leaders should be thinking about? Drop me a line and let me know!

 

INTERVIEW TRANSCRIPT

Please note, my transcripts are AI-generated and lightly edited for clarity, and will contain minor errors. The true record of the interview is always the audio version.

BRUCE MCCABE: Well, hello and welcome to FutureBites, where we explore pathways to a better future, and my guest today is BRUCE SCHNEIER. A very special guest because we're going to explore pathways to a better and more secure digital future. Welcome, Bruce. Welcome to the podcast.

BRUCE SCHNEIER: Thank you for having me.

BRUCE MCCABE: Thanks for making time. A brief introduction so people know just how important your voice is on this subject. I've been reading your stuff for more than 20 years. I would say you're probably the most prolific and the deepest thinker when it comes to writing on the subject of cybersecurity and where we're going. Twelve books, I believe. You're with the Harvard Kennedy School now, and also on the board, I believe, of the Electronic Frontier Foundation, the EFF, and you're also Chief of Security Architecture at Inrupt, which I hope we get to talk about a tiny bit on the pod. But in particular, you've been writing so prolifically on this subject, and I remember personally reading, just after 9/11, an article you wrote about the big jump in security. The biggest jump was that passengers would be prepared to jump up and intervene next time there was a hijacking. And it spoke to the way you look at things: not just technically but socially and politically and all of the holistic aspects of security, which I wanted to get into today. So it's an honor to have you on the pod, and thank you again.

BRUCE SCHNEIER: Yeah, no, I'm good.

BRUCE MCCABE: It's always the embarrassing part, but I really want people to understand just the depth of your background before we get into this. The question I wanted to explore with you, Bruce, today was: what are the big things we can be doing? The big positive things we can be doing and should be doing, even if we're not doing them, but the things we could do to make our digital future more secure? And I don't know if you wanted to uh kick off with some of the things that are top of mind for you, but you have so much to say particularly on regulation and incentives, and it seems to be one of the most important areas in your writing - getting back to changing the economics of security - so that might be a good place to start.

BRUCE SCHNEIER: Happy to start there. I mean the other question is who's the ‘we’ in your question, yes and who has the power and what do they have the power to do? I do agree that a lot of our security problems are systemic. I mean, it's not going to be choose a good password and you're better. It's really deep economic incentives, which makes it a very hard thing to change and a very complicated way to change them. And so we are getting the security that our systems give us, and if we want it to be different, we need to change the rules. I mean, that's simply the way things work in our society. So I do look at regulations, which is a shorthand for saying government involvement. If we don't like a thing, government is the vehicle by which we control how it happens, whether that thing is fraud or murder, or selling pajamas that catch on fire, or selling software that's insecure, whatever it is. These aren't market problems to fix. These are societal problems to fix. So yes, I tend to look at government as a solution to market failures, and then markets working on top of government systems to provide for the things that we in society think are valuable.

BRUCE MCCABE: Yeah, yeah, Because I think you put it, the market just doesn't reward more security at the moment, or it doesn't punish insecurity sufficiently perhaps.

BRUCE SCHNEIER: But it's so many things, right? I mean there's a lot of complaints about how social media is harming democracy. Helping democracy is not something that a social media company is concerned about. It's not part of their bottom line, it's not something that is relevant to them, and whether it's automobile safety or internet security or, I don't know, the selling of human organs, these are things that we don't leave to markets, and if we do, they are to our detriment.

BRUCE MCCABE: Yeah indeed. So if we're pushing for more regulation, or … Should the big banks, for example, out there, and the credit card providers and all of the guys that have invested a stake in this from a provider point of view, from a corporate point of view, it's in their best interest to push for more regulation, isn't it?

BRUCE SCHNEIER: It's complicated. So the bigger you are, the more you tend to like regulation because it keeps the small guys out. Now, in the United States, in banking, there's a really interesting thing we do: regulations are keyed to your size. The bigger you are, the more onerous the regulations on you are, and the thought is that you could do more damage. So that is a way that regulation regulates the big players without hurting the startups, and that is a way to think about it. Facebook has been very pro-regulation the past two years. I mean, several things are going on. One is they know it's not going to happen, so it's easy to be pro something that doesn't exist. And two, if it does happen, they know it will entrench their monopoly position because it'll make it harder for anybody else to come up behind them. But it is still, you know, a little fanciful. A lot of companies talk big about pro-regulation because they're pretty sure they'll be able to get around any regulation or neuter it in rulemaking as things move forward. But generally companies are anti-regulation.

That feels short-sighted to me, but it is kind of a truism in free-market, libertarian thinking that government involvement is bad. It makes no sense, it doesn't survive any scrutiny, but it is a common belief. Rather, government regulation gives us all the playing field on which we can compete and, by creating that playing field, allows us to compete, right? You know, if it wasn't for contract law you wouldn't have corporations. I mean, they wouldn't work. You need government to enforce agreements between parties. That's the very basic one.

But you know, something like an environmental regulation affects everybody equally. It doesn't give you more or less advantage against your competitors. And it's happening in the United States - and again it's far from the field of security - but a few years ago the auto manufacturers were very much against regulations that incentivized electric vehicles, and what's changed between then and now is that electric vehicles are something like 10% of the market and a growing part of the market, and something the auto manufacturers have invested in. So now they very much like those regulations, because those regulations keep their investment profitable. And so, things change. But that's sort of an example of the government stepping in and saying: here are the social outcomes we want. We're going to put our finger on the scales and design a competitive landscape, and then you companies compete in that.

In cybersecurity, we really haven't done that. There is no incentive in the market to provide good security. I mean, they're not things that buyers understand, so they're not aspects where sellers compete, just like automobile safety isn't something where sellers compete. So the incentive is to be as insecure as you could possibly be and get away with it. And that's the kind of market failure that a government can step in and correct.

BRUCE MCCABE: So why haven't governments stepped in and actually jumped in with both boots and said, let's fix some of this, and force companies to comply with much stronger security provisions and data protection provisions? Why haven't they jumped into the breach?

BRUCE SCHNEIER: Well, some have, right? The European Union, very much the regulatory superpower on the planet, has some pretty strong regulations on privacy, and there's an AI regulation. We're seeing a Digital Markets Act on competition. So we are seeing regulation there. In the United States, there are a couple of reasons. We have a lot of trouble passing real legislation. There is this libertarian belief that regulation is uniformly bad, and in the early years of computers and the internet it was an enormous cash cow. The tax revenue was huge and there was this feeling: don't mess with it, don't do anything. I think that's changing a little bit with the Internet of Things, which places computers out into the real world in a way that traditional computers are not. But it's been a confluence of reasons why in the United States you haven't seen any real regulation. Republican presidents are another reason for that, and Democratic presidents don't want to antagonize Republican voters. But that's going to have to change. It's becoming too dangerous out there for this industry to be completely unregulated, and I think lawmakers are sensing that, at least in Europe. Really, for regulation right now, look to Europe, not to the United States.

BRUCE MCCABE: Okay, and it feels to me like the zeitgeist has changed, at least with an understanding of the dangers of social media. You know, we're finally really caught up globally, at a population level, at a citizen level. This is terrible, it's toxic. It's done so much to undermine the way our children are feeling, the sort of depression rates, right through to democracy itself.

BRUCE SCHNEIER: Yeah, I'd be careful of that. A lot of that's really sloppy research. I mean, there's a lot of moral panic here, but certainly the fact that these companies are spying on everything we do and using that information to manipulate us is a cause for concern even without all of those other things. But it's sort of interesting to watch. You see, in the United States, what happens is Congress gets very upset. They hold the CEOs in for a hearing and yell at them and then … nothing happens. So ‘the zeitgeist is changing’ is very different from ‘the legislative landscape is changing’, especially in the United States. You tend not to get laws that the money doesn't want, so it doesn't actually matter how much the people want something. What matters is how much the money wants it. Right now the tech companies are very powerful and very wealthy. So it'll take a lot more change before we get laws.

BRUCE MCCABE: And you've said that the … I think we can all see the urgency is growing with what's happening in AI, and all the exponential change that's going to bring to how things are done, how insecure things could be, to attacks, all that sort of stuff. So the need is growing really rapidly at the moment, isn't it?

BRUCE SCHNEIER: Yeah, in AI, I think the need is similar to the Internet of Things: the fact that the computers are affecting the world in a direct physical manner. It's not really an AI story, it is more an Internet of Things story. It's cars, it's thermostats, it's drones, it's weapons systems, it's power plants, refrigerators. All of these things are computers, and all of these things affect the world - in a way your laptop doesn't. And that's the change. AI is a layer of autonomous decision-making on top of it. But I mean, the real risk is the non-AI internet-connected car, and the AI internet-connected car is a slightly greater risk, but it doesn't change that risk. As soon as the computer got into a car, it got four wheels and an engine, and now it goes places and can kill people. That's the change.

BRUCE MCCABE: Yeah.

BRUCE SCHNEIER: And that's kind of why I see government getting involved. They're already heavily involved in the safety of automobiles, because if you get it wrong, people die. The same thing with airplanes or pharmaceuticals or, you know, building safety, fire codes, lots of places, restaurants. As those things become computerized, computers will naturally be more regulated, because they're going to be responsible for ‘get it wrong and people die.’

BRUCE MCCABE: Is there more we should be doing to capture the attention of the regulators, because if they're beholden to larger tech companies and the money involved, what do we do to break through that? I mean, even antitrust law doesn't seem to be applied very aggressively in the face of such power.

BRUCE SCHNEIER: I'm going to say it again: who do you mean by we? You mean you and I?

BRUCE MCCABE: Yes, you and I.

BRUCE SCHNEIER: You and I, we're doing a podcast. I mean this doesn't become a thing until it becomes a political issue. I want to see a US presidential debate where this is a question, right? Because that means the voters care about it, and it's not just lobbying and special interests. So it's very hard for us as people to do anything when it is not an issue that makes it to legislators.

Again, different in the US than Europe. Europe is regulating here. Regulators get all of this. They are working on this. California has an IoT security law. I mean, it's not very good, it was a first attempt, but it's an attempt. Several states have passed data privacy laws similar to Europe's GDPR, I think about five or six states now. So now there's talk of a national law. It's not great talk, because the talk of the national law is something lousy that preempts the state laws. But you know, it's still something. But it is very hard for these issues to become real populist issues. They're specialized, they're technical, so it is likely that the interests are going to steer whatever happens.

BRUCE MCCABE: Can we talk about compartmentalization and decentralization of data?

BRUCE SCHNEIER: This is exciting!

BRUCE MCCABE: Yeah, philosophically it's always been a thing. I mean, you can reduce risk by not putting everything in one place, reducing your exposure. There will be security breaches, so why don't we at least limit the damage, right? But there's a bigger picture in what you're working on: consumers, or individuals, who can curate their own data, pulling control of that data outside of the corporation, and the idea that this is a net good for corporations and for individuals. Could you step me through a little bit of what that could be?

BRUCE SCHNEIER: So this is again difficult, right, Because our systems move towards centralized control and you know we have decentralization, then consolidation and then things get broken up and consolidated. And if you think about, you know, social networks, you know they're very few and you know they're run by very large corporations. In the early days of networking, there were dozens and dozens of them. And we have two phone operating systems and two or three, depending on how you count, computer operating systems, and a couple of cloud providers and a few hardware makers and chip makers. These are all very centralized systems and, yes, we would be better off with more decentralization.

It is the ‘don't put all your eggs in one basket’ way of thinking, but it is hard. And you mentioned antitrust. That is something that is being revived in the United States. We really didn't have antitrust since the Reagan era, it just went away, which is why you have massive consolidation and monopolies or near-monopolies in so many industries in the United States. There is a reviving of antitrust, but again it's sort of not a fair fight. You know, one of these big tech companies has more attorneys working on antitrust than the FTC has attorneys total. So it is a hard, hard road.

I think it's important. I mean, our entire economic system is based on competition. The whole idea is that sellers compete for buyers, and that competition spurs them to better their products and reduce their prices. Right, it is a race to the bottom. That's the system. And when it works, it works great. If you think about an open-air market, it works fantastically. If you're selling your oranges higher than everybody else's, no one's going to buy them, right? If your oranges are better than everybody else's, everyone's going to buy yours.

But it works terribly with cell phone plans or cell phones or social media, where you have very few players, where there are network effects that keep you locked in, where, uh, buyers have trouble making decisions because prices are obscured. I mean you go figure out how much you're going to pay for your cell phone plan before you buy it. It is actually impossible, and that's by design. I mean, nobody wants it to be like a gas station where the price is on a sign as you drive by and you just go to the cheapest. Companies hate that. They like to obscure prices.

We are starting to see legislation in the United States - maybe it'll pass - trying to prevent some of this piling on of fees. That happens with airplane tickets. When you're comparing prices, you see one thing; by the time you check out, the price is wildly different. It's too late, you buy it.

So I do see decentralization is really important. We were much safer when there were thousands of email providers instead of just two or three. You've either got a Google email address or an Apple email address. Maybe you've got Microsoft or one or two others. Probably in your country there's another couple of players, but not many more, and it would be better if we had distributed data ownership where, instead of all of our – I don’t know, Fitbit data being at Fitbit, all of my data about me is someplace I can control it. That gives me more flexibility, means if someone hacks Fitbit, we don't lose everybody's data, and it's generative. I mean, the neat thing about email is that there are lots of different plugins. If you use Gmail, there are dozens of different plugins you can use, some for sale, some are free to help you manage your email. It's all yours, you can decide. There are zero plugins to help you manage your Facebook. It is only what Facebook decides you should have. So you get a lot more generativity when there's distributed ownership of the thing, whether it's your files or your data, or your posts, or your emails or your texts, whatever it is.

BRUCE MCCABE: And things like healthcare records. Perhaps we could get into Inrupt, if I pronounced that correctly, this venture you've been working on with Sir Tim Berners-Lee. What are the mechanics of that? Because I believe you've actually convinced some - or got some people along to work with that model in Europe, in Belgium, that sort of thing. We're seeing things in healthcare, where we're starting to compartmentalise and decentralise health records and give them that custodianship.

BRUCE SCHNEIER: So Inrupt is a company that commercializes SOLID. SOLID is an open internet standard invented by Tim Berners-Lee, who invented the Internet - sorry, who invented the Web, I have to be precise there - and it is a standard for distributed data ownership. The idea, basically, is that you, me, everybody has something called a pod, which is kind of like a cloud storage, but much more complex, where you can keep all of your stuff, all of your data, not just your files, but all sorts of data. You mentioned health data, your medical records, the data you deal with the government, your data from your interconnected thermostat and car and phone, and Fitbit and everything else, and it is distributed. So your pod is your pod. So, instead of these companies keeping everybody's data in one place, it’s distributed in everybody's pod, and that increases security, increases flexibility, increases reliability and, again, it is generative. You can imagine someone writing a new application that uses my Fitbit data and my refrigerator data and my location data, my phone - just making this up right now - to do something different.

It is slowly being adopted. You mentioned governments in Europe and, yes, the government of Flanders is giving everybody a pod. There are other governments looking at it. There are some corporations looking at it, corporations that have a lot of kind of random data about their customers. That's kind of a mess, and the nice thing about Solid is it handles a mess, right? That's the way it's designed, and I think it's a really interesting way of thinking about data. It could be more transformative than the Web, because it really separates data from applications, from authentication. Right now your data is tied to the application in a way that you can't move it when you change - it's really, really hard to move it - but this breaks that connection in a way that's valuable to the user.

Now, of course, you can imagine a company like Facebook saying no, absolutely not - we like controlling our users' data, we don't want to give them control. And in fact they make it very hard for you to download your data or use it or move it someplace else, right? I mean, this is part of their strategy to maintain their monopoly position. But distributed is better for society. So the hope is that this will become a bottom-up thing, kind of like the Web was, right? The big companies weren't the first ones to have websites. It was the little guys who had websites first. Big companies were last, if anyone remembers back that early, like in the 90s.

BRUCE MCCABE: Yeah, well, Solid seems like such a big structural improvement, if we can get more awareness of it out there.

BRUCE SCHNEIER: It is a chicken-and-egg problem, right, just like the Web. You're going to want a pod if there are companies and applications that use pods, and there will be companies and applications that use pods if enough people have pods. So it has to start somehow.

BRUCE MCCABE: It has to start with—

BRUCE SCHNEIER: I mean, you'd go on the Web as long as there were sites to visit, and there'd be sites to visit if there were people there. But that was early, that was the 90s. It was kind of easy to jumpstart that. It's a lot more complicated to jumpstart something now.

BRUCE MCCABE: Of course, but if we can highlight the governments that are doing it and the experiments that are running, and just raise awareness - well, that's something I can do, anyway: bring more awareness to them and at least spread the word that it's possible. It's possible to do this!

BRUCE SCHNEIER: It is possible, and it just takes doing. But yes, governments are, I think, a good starting spot, because there's real benefit in giving citizens a pod where you can interact with all of government, because interacting with government is annoying. Typing the same data into lots of different forms for different agencies makes no sense. So having that be more seamless - that's the kind of thing pods do, right? If you want my address, I give you permission to go to my pod and get it. If I want to change my address, I change it in one place, and then everybody who gets my address has my new address. Much easier than changing your address everywhere. I moved a couple of years ago, and it was super annoying to change my address everywhere. And every once in a while there's a site I missed that pops up.
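For the technically minded, the "change it in one place, everyone gets the update" mechanics can be sketched in a few lines of Python. This is an invented toy, not the real Solid API; the `Pod` class and its permission model are assumptions made purely for illustration:

```python
# Toy sketch of the pod idea: data lives in one place the owner
# controls, and consumers read it with permission instead of keeping
# their own copies. This is NOT the real Solid API.

class Pod:
    def __init__(self):
        self._data = {}
        self._grants = set()  # (consumer, key) pairs allowed to read

    def put(self, key, value):
        self._data[key] = value            # owner updates one place

    def grant(self, consumer, key):
        self._grants.add((consumer, key))  # owner decides who may read

    def read(self, consumer, key):
        if (consumer, key) not in self._grants:
            raise PermissionError(f"{consumer} may not read {key}")
        return self._data[key]

pod = Pod()
pod.grant("tax-office", "address")
pod.grant("post-office", "address")

pod.put("address", "34 New Street")        # one change...
print(pod.read("tax-office", "address"))   # ...seen by everyone granted access
print(pod.read("post-office", "address"))
```

Because consumers fetch from the pod rather than storing copies, the stale-address problem disappears by construction.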

BRUCE MCCABE: So another thing you talk about on data is deleting it. We're keeping far too much of it, and that inherently increases our risk as corporations. Now I'm talking about the business decision maker who's in this mindset of 'keep everything,' especially in this AI universe where they're all thinking, oh my God, machine learning …

BRUCE SCHNEIER: That was a big data promise of a decade ago. Save everything, because the cost to save it is so cheap. Right, data storage is free. Save it all, and we'll figure out how to make use of it later. I think AI doubles down on that story. Right, save everything. Machine learning will learn on it all.

BRUCE MCCABE: Yep.

BRUCE SCHNEIER: But it was kind of a fiction even from the beginning, because saving it wasn't free. It didn't cost you in storage, but it cost you in risk. If someone hacks your system, they can now steal so much more data, and now you have annoyed so many more of your customers. Now, we don't have liabilities in a way that makes that really hurt, but if we did, everyone would decide that data was a toxic asset, that saving it is something you really have to think about. Do I really want to keep that data? Because if I lose it, then I'm liable. That'd be a great world. But yes, I think we do save too much. We save data we don't need, and that's inherently a risk. But watch what goes on: the company saves it, and it's a risk to you and me. Until the company feels our pain, they're not going to change their behavior. And that gets us back to regulation - whether it's liabilities or some kind of statutory damages - some way that we, as users, can make sure the company feels our pain when they lose our data.
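The "toxic asset" framing translates into a concrete engineering habit: give every record a purpose and an expiry, and delete by default once the window passes. A minimal Python sketch, with the record fields invented for illustration:

```python
from datetime import datetime, timedelta

def sweep(records, now):
    """Keep only records still inside their retention window;
    everything else is treated as a liability and dropped."""
    return [r for r in records if now - r["collected"] <= r["retain_for"]]

now = datetime(2024, 1, 1)
records = [
    {"id": 1, "collected": now - timedelta(days=20),
     "retain_for": timedelta(days=30)},    # still needed
    {"id": 2, "collected": now - timedelta(days=400),
     "retain_for": timedelta(days=90)},    # stale "just in case" data
]
live = sweep(records, now)
print([r["id"] for r in live])
```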

BRUCE MCCABE: Yeah, to raise the cost of insecur—

BRUCE SCHNEIER: That's the thing the market doesn't fix.

BRUCE MCCABE: Now, ransomware. One of the most interesting things I've read recently, going through your blogs, is how disrupting the scourge of ransomware is linked to disrupting cryptocurrencies. I'd love to get into that, because it's such an interesting area.

BRUCE SCHNEIER: So ransomware works for two reasons. The first is a realization on the criminals' part, maybe a decade and a half ago, that the person to whom any piece of data is most valuable is you. Stealing and selling data is a waste of time. Stealing it and selling it back to you is great business, so that sets up the ransom. The second reason it works is cryptocurrencies. Without cryptocurrencies, there'd be no way for victims to pay the ransom. The normal financial system blocks criminal users regularly, so you can't pay with Visa - Visa won't allow it. And suitcases full of $100 bills are really, really heavy, so you can't pay in person. I mean, that right there is the point in a kidnapping where kidnappers get caught, right? Trying to collect the ransom. The only reason ransomware works is because there is this way of paying that's outside the banking system, and that's cryptocurrency. If you want to disrupt ransomware, you need to disrupt cryptocurrencies, and that is easier said than done, because cryptocurrency is designed to be outside the banking system and can exist wholly outside the banking system. But you can do things. When we talk about making it illegal to pay the ransom, that is a way of disrupting the payment. I don't think it's a good idea, but that is a mechanism.

BRUCE MCCABE: They’ve done that in Japan, interestingly [with payments of extortion money to the yakuza].

BRUCE SCHNEIER: Another is the exchanges, the ways that cryptocurrency is converted into real money. It is no good for a ransomware gang to have too much cryptocurrency; they need to convert it into dollars or euros or Swiss francs or whatever currency is useful to them. So, disrupting that exchange process, disrupting some of the cryptocurrency exchanges themselves. There's been a lot of success in clawing back ransoms, and that isn't breaking the cryptocurrency security - that's going to the real-world company that is maintaining the accounts and saying to the humans: when you see money go into this account, freeze the account. Lock everybody out of it, don't allow anybody to log in. Because cryptocurrencies are not very anonymous - it's all public, it's on the blockchain - so we tend to know where the criminal wallets are. So we're having more success at interdicting the transfer of money. But still, it's an uphill battle, because these systems are designed not to have these points of leverage. But the more we bring cryptocurrency under the normal banking rules, the better it'll be for normal people that use it and the worse it'll be for criminals.

If you think about it, if you are a normal user of cryptocurrency - like you're an investor or speculator …

BRUCE MCCABE: Speculator is a good word [laughs]

BRUCE SCHNEIER: Right. You're a buyer and seller, in equal measure, basically. If you are a net buyer - like over the long term you only buy - you are likely a ransomware victim. If you are, over the long term, a seller, you are either a ransomware criminal or a cryptocurrency miner. We know who the miners are. Everybody else is a criminal. Nobody else is a net seller, because you can't use cryptocurrency in normal transactions. You're not going to go buy a car with Bitcoin.
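His net-flow heuristic is simple enough to write down directly. The labels below are invented for illustration; the logic is just what he describes:

```python
def classify(bought, sold, is_known_miner=False):
    """Toy restatement of the net-flow heuristic (illustrative only)."""
    if is_known_miner:
        return "miner"
    if sold > bought:
        return "net seller: ransomware criminal or other illicit income"
    if bought > sold:
        return "net buyer: likely ransomware victim"
    return "speculator: buys and sells in equal measure"

print(classify(bought=10, sold=2))
print(classify(bought=5, sold=5))
```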

BRUCE MCCABE: Yeah, absolutely, I love that. And yet another reason - I mean it's such a speculative instrument anyway, and it's not doing a hell of a lot of good on the energy front - there's a whole lot of reasons for cryptocurrency to—

BRUCE SCHNEIER: Oh, it's a disaster on many fronts. It has absolutely no redeeming values whatsoever. It's kind of amazing, but it's here to stay, because the math works. It could become something that, oh, we used to do - something that fell out of favor - but it'll always exist, and I guess the hope is it falls out of favor pretty quickly. Right now, it does seem like the users are people who are buying illegal goods and running ransomware.

BRUCE MCCABE: Yeah, the actual users, as opposed to the speculators. Yeah, absolutely.

BRUCE SCHNEIER: Which is something else to add, right? If you are a net seller, you could also be an illegal-goods provider.

BRUCE MCCABE: Two more things I wanted to cover, as I'm conscious of your time. One thing I want to get into is post-quantum encryption. No one really knows the timeframe for quantum computers - there's an awful lot of unknowns here - but we do know that, theoretically, once we get that capability, everything that's currently encrypted is potentially quite open …

BRUCE SCHNEIER: No no, no, no, no, it's not nearly that bad.

BRUCE MCCABE: Oh, okay.

BRUCE SCHNEIER: There are a lot of doomsayers here. But you have to sort of peel apart what works and what doesn't.

First thing is, you're right, we don't know 'when,' and we actually don't even know 'if.' I mean, the engineering challenges to making a quantum computer work at any reasonable size are considerable, and it might be impractical. It might be impossible using current technologies. We don't know. My guess is we will get it working someday. If that happens, we have some theoretical algorithms that break some of our encryption standards. We don't know if those algorithms are feasible. Again, let's pretend they are. So now some existing cryptographic standards are broken - not all of them. Lots of cryptography still works, lots of key sizes still work, lots of systems still work. And right now NIST, which is the US standards body in this area, is running a competition to create post-quantum encryption standards - algorithms that are resistant to quantum computers. That is going swimmingly. We have some candidate standards, so right now the math is well ahead of the physics.

The real issue is crypto agility: when an algorithm is broken, whether by a quantum computer or just a smart person with a regular computer, can we switch to a replacement algorithm fast enough? The answer is largely no, and we need to be better at that. Now, I think a lot of companies are looking at crypto agility, and really it's not user companies - it's Microsoft, it's the companies that make the stuff that all the other companies use. It's the encryption in your cell phone, the encryption in your operating system. These are not choices that we as users have. These are choices that suppliers have, or even the suppliers to the suppliers. Google has to be crypto agile, and they are working on it. They do understand the issue. We went through this in the 90s, when we tried to update a bunch of old cryptography standards - it took over a decade in some cases, and it was kind of embarrassing. So we're trying to do better.
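In code, crypto agility mostly means never hard-coding one algorithm: tag every stored artifact with the algorithm that produced it and dispatch through a registry, so a broken algorithm can be retired by changing one default. A minimal sketch using Python's standard hashlib (the version tags and registry are invented for illustration):

```python
import hashlib

# Versioned registry: retiring a broken algorithm means editing this
# table and the default, not hunting down every call site.
HASHES = {
    "v1": hashlib.sha256,
    "v2": hashlib.sha3_256,  # hypothetical successor algorithm
}
CURRENT = "v2"

def digest(data: bytes, version: str = CURRENT) -> str:
    return version + ":" + HASHES[version](data).hexdigest()

def verify(data: bytes, tagged: str) -> bool:
    version, _, hexdigest = tagged.partition(":")
    return HASHES[version](data).hexdigest() == hexdigest

old = digest(b"hello", "v1")  # artifact created under the old scheme
new = digest(b"hello")        # new artifacts pick up the new default
print(verify(b"hello", old), verify(b"hello", new))
```

Old artifacts stay verifiable while new ones automatically use the current algorithm; that is the property the slow 90s migrations lacked.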

Quantum computers could potentially break a lot of things, but it really is potentially. We don't know, and it's going to take, I think, a couple of decades working with a working quantum computer to fully understand what it's capable of, just like it took decades working with conventional computers to fully understand what they're capable of. So it's going to be interesting. But it is not that crypto apocalypse that the popular press likes to talk about. I mean, we do worry about existing things, and places like the NSA are saving a lot of stuff just in case the quantum computer shows up, and it is valuable, but it decays in value. I mean, how important are 10-year-old Chinese intercepts to the US government? Probably not very. What tends to be important is stuff that's current. Most of our secrets don't have a long shelf life. Corporate secrets this year are products on the shelf next year. I think of automobile designs.

BRUCE MCCABE: Yeah, that's true.

BRUCE SCHNEIER: They're secret for a year. Mergers and acquisitions are super secret for what, four months? And real long-term secrets, like the formula to Coca-Cola - that kind of stuff is not on a computer, because they don't trust it.

BRUCE MCCABE: If crypto agility is something for the Googles and Microsofts to worry about, are there any preparatory actions that your average corporation should be taking now? Any things they should be doing rather than just waiting?

BRUCE SCHNEIER: Not much, beyond pushing for crypto agility. I mean asking for it, demanding it. If you're going to buy a product that has encryption, ask about agility. Make that part of your buying decision. You make what the suppliers are doing part of what you're paying attention to. It's not the whole thing, but it is something. A system that's not very agile is less good. It's more vulnerable - well, more fragile is the way to put it.

BRUCE MCCABE: Yeah, and again, I guess there's a time-based element to that, in that the systems you're using now you'll probably be replacing anyway in five years.

BRUCE SCHNEIER: I mean, you have to assess. We say that, but there are systems in place that are decades old - banking and the phone system and airline reservations, all of these industries that started using computers decades and decades ago. The amount of legacy software out there would stagger you. Governments and corporations don't change their stuff the way we consumers do. We get a new phone every three years, and apps change, we update them and we go. But the systems running our nuclear missiles are literally from the 60s.

BRUCE MCCABE: Yeah. There's a couple of silos you can go down and have a look at in North America, and they're very sobering …

BRUCE SCHNEIER: Using 8-inch floppy drives?

BRUCE MCCABE: [laughter] For a bunch of reasons they're sobering, and that's one of them! Okay - authentication. I just want to examine where we can go with that, or should go with that, because there's a whole lot of biometrics available. I mean, we can keep layering on liveness detection, more behavioral biometrics. Is that where things are going? We're going to be doing more multi-layered? …

BRUCE SCHNEIER: There's a lot of places it could go, because authentication is hard. If you think about us as humans, we tend to authenticate through biometrics. I recognize your face, I recognize your voice, right? I look across the room and, just from the way you hold your body, I recognize you. So biometrics are very human-to-human. They work less well with computers. Originally, computer systems had such low connectivity that the best we could do was a string of characters - a secret word that I knew and you knew, kind of the secret-handshake way of thinking about authentication. That only just barely worked, and we know all the problems: people forget their passwords, they choose lousy ones, all the ways they fail. Biometrics are an alternative in some situations. For an object that you're interacting with, biometrics are great. They work on your phone. I have an iPhone and used to use the fingerprint; now it uses the face. Google's phone is equally biometric-agile. That works less well for a remote service. I can't use a biometric to authenticate to Dropbox, because it's all the way over there. So I'm still stuck with usernames and passwords. Sometimes I'll use two-factor, where generally now it's some kind of one-time code I get on my phone. So the first factor is the password; the second factor is control of the object, the phone. That works marginally well. It's good, but people don't like it - it's an extra step. You see some people doing away with passwords and just using their email. I mean, how many people do you know who will always click 'I forgot my password'? They get the email, they click on the link, and they never even remember their password, because they're relying on the security of their email as a proxy for everything else - which works great until someone hacks your email, in which case you lose access to everything.

But the real problem is going to be the rise of thing-to-thing authentication. Right now, when we authenticate, it's either me authenticating to an object, like turning on my phone, or authenticating to a remote service where I check my email - and the way that actually works right now is my phone stores my email password and automatically sends it up. Thing-to-thing authentication is like your car authenticating to your phone, or your car authenticating to another car as they drive past each other and exchange information about who's braking and who's turning. Now we can do that a little bit. When your phone authenticates to your car, that works - it works seamlessly. But you set that up: you used Bluetooth and paired your car with your phone, a manual process. That works great, but you only do it 10 or 20 times. You're not going to do it with everything you own, and it doesn't work ad hoc. It doesn't work when the drone is landing and needs to authenticate with the doorbell - they've never met each other, and they'll never meet each other again. The two cars are coming at each other, and you have to authenticate, because they're exchanging braking information, and you have to do that really fast or someone's going to hit someone else.
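The pairing model - one-time manual setup that leaves both devices holding a shared secret, which then authenticates everything afterwards - can be sketched with Python's standard hmac module. The device roles are invented for illustration, and real pairing protocols (Bluetooth included) are far more involved:

```python
import hashlib, hmac, secrets

# One-time pairing: phone and car both end up storing the same key.
pairing_key = secrets.token_bytes(32)

def sign(key: bytes, message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def check(key: bytes, message: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), tag)

msg = b"unlock doors"
tag = sign(pairing_key, msg)            # computed on the phone
print(check(pairing_key, msg, tag))     # the paired car accepts it

# A device that never paired holds no shared key and cannot forge a tag.
stranger_tag = sign(secrets.token_bytes(32), msg)
print(check(pairing_key, msg, stranger_tag))
```

The limitation is exactly the one in the conversation: this only works between devices that have already met. Two cars passing each other share no key, which is why ad hoc thing-to-thing authentication remains an open problem.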

BRUCE MCCABE: Yep, that’s hard.

BRUCE SCHNEIER: So authentication is something we are going to see causing increasing problems, and it's going to be the rise of the Internet of Things, these thing-to-thing authentications. And we don't have a good answer here. But for a lot of the stuff we deal with, we've kind of patched together an authentication system, right? We use our phones, we use biometrics, we use two-factor, we use the 'I forgot my password' email - to the point where some sites don't even have a password. All they do is send you either an email or a one-time code to your phone as a text. That's the authentication system.

BRUCE MCCABE: I guess everything's a trade-off and for high, really high value transactions, a lot of people talk about zero trust systems, which repeatedly—

BRUCE SCHNEIER: We’re going to see some more of that! And zero trust is a terrible name for it, because it's not zero trust, it's continuous authentication.

So think about the way it works today - the bad way: you authenticate to your bank, you get into your account, you do whatever you want. A better way is you authenticate to your bank, you get in, you can do anything normal, but once you do something weird, like send $50,000 to some unknown account in Eastern Europe, it's going to flag it, it's going to demand extra authentication. As a corporate user, you do your job and you authenticate, and that's fine. If you start downloading lots of files, or downloading files that are not normally part of your job, it might ask for extra authentication. So zero trust is a way of continuously monitoring what the user is doing and demanding extra authentication to do something unusual. You see a little bit of that in something like Facebook, where if you log on from a different location or a different computer, it'll ask for more authentication, because you're doing something weird.
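That loop - score each action against the user's normal behavior, and demand step-up authentication when the score is high - can be sketched in a few lines. The signals and thresholds below are invented; real systems use far richer models:

```python
# Toy continuous authentication: every action is risk-scored against a
# stored profile of normal behavior; unusual actions trigger step-up.

NORMAL = {"country": "AU", "avg_transfer": 500}

def risk(action):
    score = 0
    if action.get("country") != NORMAL["country"]:
        score += 2                                  # unusual location
    if action.get("amount", 0) > 10 * NORMAL["avg_transfer"]:
        score += 3                                  # unusually large
    if action.get("new_payee"):
        score += 1                                  # unknown recipient
    return score

def decide(action, threshold=3):
    return "step-up-auth" if risk(action) >= threshold else "allow"

print(decide({"country": "AU", "amount": 200}))
print(decide({"country": "EE", "amount": 50_000, "new_payee": True}))
```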

BRUCE MCCABE: Are there any other messages that you wish you could amplify more, bring more awareness to about how we make a more secure future? Anything we haven't covered that you'd like to bring more attention to, particularly from corporate and government decision makers?

BRUCE SCHNEIER: It comes back to incentives. It is the job of government to set the incentives properly, and to a real extent government has abdicated that responsibility on the internet. They do it in consumer goods. They do it in workplace safety. They don't do it on the internet. We're now in a world where the internet is critical, where failures cost lives and property and have enormous economic impact. So we cannot look at the internet as this sort of hands-off area where companies can do what they want. Those days are over. And I get that it's going to be less fun, and I get that all those companies are going to be mad at you for doing it, but we as a society have to. There's no other choice.

BRUCE MCCABE: Well, that's a great way to end it. Bruce Schneier, thank you so much for making time to be on the pod. It's been an absolute privilege to have you on and to hear some of these insights. There's so much more to cover, I know, but they're great messages to amplify, so thank you.

BRUCE SCHNEIER: Thank you for having me.

 