The Lifecycle of a Revolution
Jennifer Granick; Jeff Moss; Philippe Courtot
Posted 21 January 2016 by Alkibiades
In 2015, Jennifer Granick was the keynote speaker at Black Hat, the annual conference of the global InfoSec community held in Las Vegas (NV). In her talk, she argued that 20 years from now, the internet might complete its shift from liberator to oppressor. According to her, centralization, regulation, and an increasingly divided community of users have slowly subverted the dream of a free and open internet. These developments will continue to shape the future of communication and information, and transform the internet into a slick, controlled, and closed thing. While it might still be possible to prevent this from happening, Granick believes that in the next 20 years we will need to get ready to smash the Internet apart and build something new and better.
Jennifer Granick is the Director of Civil Liberties at the Center for Internet and Society at Stanford Law School. Outside of academia, she is best known as the attorney who defended some of the most notorious criminal hackers around, including Kevin Poulsen, Aaron Swartz, Jerome Heckenkamp and the hackers in the Diebold Election Systems case.
The video of the speech as well as a revised written version are also available at https://medium.com/backchannel/the-end-of-the-internet-dream-ba060b17da6…
- Date of recording: Mon, 2015-08-10
- Language(s) spoken: English
00:01 About Black Hat
Jeff Moss: Okay, so welcome to the largest Black Hat ever. I say that every year, I’m just going to take a recording of myself. And it starts to make me really wonder: Where does this all end? Because I really feel that we are all employed for life. I mean, we really can keep doing this forever, because, as far as I see, I see problems and challenges. And so, on one hand, I’m really excited, on the other hand, I just want to sleep sometimes, you know?
So, this is our 18th Black Hat, and the second year at the Mandalay Bay, and every year, I do a little bit of talking about statistics of how many people show up, how many countries are represented. I’m just going to get that out of the way before I move into some of my remarks, and so: 102 countries represented, that’s a pretty good proportion of the planet here around you. And 20 of those countries have only sent a single person. And you can guess who the biggest countries are – United States and… Come on, who is the second country?
[Unintelligible voice from the audience]
China? No. I would have guessed Canada, but it turns out to be the UK. Ok, so they did their own count. So we have a lot of representation, and we also have some new things we have been doing this year. Last year, we announced our academic scholarship program, where we are trying to get more students involved from computer science programs around the country, and, so, to get a crack at a free admission to Black Hat, you write a white paper about a security topic, and you get it submitted, and they all get reviewed. And we don’t really have a cap on how many we accept right now, and, so, this year, we accepted 153 that we thought were interesting enough. Yeah. So if you are in the audience, and you are in here because of that program, let’s hear it! Come on, guys! So, now this is your time to make the most of it. Right? Take advantage of any opportunity you have, because next year we’re probably not going to accept your writing. It might be a one-time offer [laughs].
We also, for the second time, are doing a donation to the EFF. And the EFF has really been involved in a lot of legal battles over the years, and sometimes you agree with them, sometimes you don’t agree with them, but what you can agree on is that they’re dedicated, and they’re motivated, and they get results, and they understand our community. And there’s not that many other legal groups that really understand us and go out of their way to reach out to us and to involve us in their process. And so we are doing another $50,000 donation, for the work they are doing not just on behalf of our speakers but also on behalf of our community. So I want to thank EFF for that, and their good work this year.
So, I have a couple of remarks, you can see how well they have been laid out [shows notes scribbled on a sheet of paper]. And so for the theme this year, I have been trying to think of: “How have I been feeling last year about the direction not just of the United States, but of the direction of information security in general?” And it’s really feeling to me like a pendulum, a pendulum has been swinging and it’s accelerating. It started maybe a couple of years ago, and what’s happening, I think, is… When I started off in information security, when I was a hacker, it was not illegal to pirate software. How many people were alive back then? Or: took advantage of that loophole? Non-commercial copyright infringement was not illegal. You could not make a profit off of somebody else’s work. But you could enjoy it yourself, for no profit. How far have we come, right?
Now, they are talking about mandatory minimum sentencing in the UK for software piracy. So, in the course of a short… of about 20 years… Well, actually, in the United States, non-commercial piracy changed: the cases started essentially in about 1994, and by 1997, with the NET Act, it became illegal. So, less than 20 years ago, it was legal. So [gestures from left to right] on one side of the pendulum, and here we are today. And if you take that metaphor, and you start looking around, you can see it in many areas [repeats gesture]: very little legislation then, lots of legislation today.
And so, when I was making notes to talk about, I realized that everything that is happening now, we are in the middle of. We are sort of the gatekeepers on this. We are going to be the trusted advisors. When we want to understand the implications of the Internet of Things – liability, what is possible – we are the people building it, or we are the people securing it, so we are going to have to be the ones who are advising on it. Is it possible to remotely update 100,000,000 Internet of Things devices? We are going to be the ones sort of in the middle of that. There are security implications.
So you might not think that you are going to be in this battle, but you’re going to be pretty central to this fight over the next five to ten years. And, make no mistake, what we decide happens in the next five to ten years, we’re going to live with for the next 30. And so you really have to start thinking a little bit more broadly about what you are working on, where you are donating your time, and what kind of world you want this to be.
You can see this just with the Crypto Wars starting all over again: mandatory backdoors, golden keys. These are not new concepts to us, but they are new concepts to the new players. And we have to bring some perspective and some history to let them know why it wasn’t a good idea then, and why it’s not a good idea now – an even worse idea now.
So, if you think about it, we have insurance coming on the scene; cyber insurance is going to suck up maybe a quarter to a third of your budget in the next, say, five to ten years. That’s not going to make you any more secure, that’s not going to patch any of your routers, but when your routers do fail or get hacked, you are going to have some insurance, right: a classic risk-avoidance strategy. But how are you going to still do your job if some of your budget is moved to insurance? You are going to have to play in a world where there is more regulation, there is more insurance, there is more involvement of the legal system. So, for example, I ask people this, and I have not got a good answer. So, if you can tell me an answer to this question, I’ll put it in my brain, I’ll think about it, and I’ll write something.
And why, when your company is under attack, do you not employ all the tools that are at your disposal? Right? You deploy your technical tools, but how many of your companies have actually gone out and sued someone for attacking you? It’s really rare. But if you think about it, in any other instance, you would be deploying lawyers: intellectual property theft, disgruntled employees, competitors stealing. But in certain areas, when you are being hacked, we don’t deploy our lawyers. Why is that? They are sort of like our army. Governments have armies, they have state departments; as a company, you don’t have any of that. You don’t have the lawful ability to use force; all you have got is civil law. Why are you not using civil law? I think this is going to become a bigger component in your toolbox in the future. You are going to start having to reach into that box and use it. So you should start to get familiar with what tools to use.
Another one I am thinking about is liability. This was not a concern of ours 20 years ago. It’s going to be a big concern for us going forward. Do you think that we can solve all of the problems we are facing in security, and the ones we will face in the future, if there is no software liability? I hate saying this, but I do not see a way forward without software liability. Without it, it is going to be more of the same, as far as you can see – turtles all the way down. The liability could be a dollar, it could be ten dollars – I’m not saying it has to be punitive. I’m just saying that if normal legal functions are not injected into the process, we are going to have more of the same.
Can you imagine Boeing, or Airbus, or Tesla – they are essentially moving data centers, flying through the sky. Yeah, they are data centers with wings, or wheels, but they are data centers. And they operate under liability. Right? They have some pretty strict software liability around their moving data centers. But an Oracle data center without wheels? No liability. So do you think someone like, I don’t know, Elon Musk, or someone at Boeing, they are going to feel that they are in a great competitive situation when everything they do requires so much more engineering and liability protection, but over there, the Microsofts and Oracles of the world – no protection? No, I think they are going to want a level playing field at some point. And it’s going to be competitive. And so I think even if we do nothing, market forces are going to start driving us toward liability. But what is that going to mean? For you, and for the way you run your business?
09:40 Introduction of Jennifer Granick
And so, just more and more of these issues are becoming ripe, and, so, for that reason, when we were looking for a keynote this year, I was trying to think of someone who encompasses the big perspective, the long view, someone who is familiar maybe a little bit more with the legal side, and maybe a little bit less with the technical or the geopolitical side, to really kind of help us understand what these big shifts in social and legal direction mean. And so for that we have Jennifer Granick as our keynote.
And I first met Jennifer at DefCon 2, I believe, and we have been talking about it back there… Besides being a rare occurrence of a woman at a hacking conference 20 years ago… I can’t imagine: was I hitting on her? How did I meet her? No, it turns out she was signing in, and she put her name down, and she put “lawyer.” And I was so excited that there was a lawyer there, and of course everybody wants free advice from a lawyer, so I started talking to her, and we stayed in touch. And over the years, Jennifer became the go-to person. I mean, if anybody in the scene anywhere got into trouble, Jennifer was there to help. She was a defense attorney.
And I remember at one point, she… If you remember Peter Shipley, who invented wardriving: he was being accosted by the hotel because he was doing something, and she busted into hotel security, pulled out her little bar association ID, and she was like: “I’m his attorney, let him go!” And she is not afraid of controversy. She has helped, over the years, Max Butler – she was involved in his case – Mike Lynn, whom she was involved with ten years ago, the MBTA CharlieCard hackers, Kevin Poulsen, she helped weev, she helped Aaron Swartz, she has even helped me several times. I mean she has really helped everybody. So that’s why I’m so proud to have her finally on stage.
11:48 Thanks to Qualys and their CEO
And to introduce her, I am equally proud to reintroduce you to Philippe Courtot, chairman and CEO of Qualys. And Qualys has been a sponsor for Black Hat since about, probably, 1999. They have had a long view. And back then, there was no cloud, there was SaaS. I think SaaS had just been invented. So they were offering their services as SaaS, and then the cloud came along, so they had to rebrand. So now they are offering their security in the cloud. But, you know how the technology is, it’s essentially the same stuff. But what’s different, I think, maybe, with what I see from Qualys is that they are making their stuff more accessible, and they are offering more free services, so you can sort of play around with their stuff.
And I asked him what his long-term goal is, I mean, what’s after cloud? If there’s SaaS, cloud, mobile, mobile SaaS, mobile SaaS cloud, mobile [gibberish]… Those acronyms don’t work. But really, what it is, is that they are trying to basically build security, simple security, into the fabric of the cloud, sort of as a background. And if it was coming from somebody else, who doesn’t have an almost 15-year track record with us, I would not really give him a lot of credit, but for Philippe, I’ll give him a lot of credit for the long-term visionary view. So, with that, I want to see how Philippe makes his entrance on the left here. Thank you very much.
13:28 Laudation
Philippe Courtot: Good afternoon, ladies and gentlemen, and thank you, Jeff, for these very nice words. And it’s an honour and a pleasure for me to have the opportunity of introducing our keynote speaker. And this is because Black Hat is very special, it’s very special to me, because I think it brings the best minds and the best hearts in our industry together so we can continue moving forward. And I think this is very important, in fact, especially now, in 2015. 2015, which I believe we may end up calling the Year of the Megabreach, in reference to the Office of Personnel Management.
But also, I believe, it’s a pivotal year for all of us, because in fact, I think, there are two – not one, but two – mounting challenges in front of us, which are, as we well know, on one hand cybersecurity and on the other hand privacy. And when you think it through, it’s not possible to dissociate the two. And this is because, in fact, they represent our digital freedom and the digital freedom of our children. And with this in mind, I’m very honored and pleased to introduce our speaker, a lady and a lawyer who has in fact dedicated her career to our digital freedom. And also, as Jeff mentioned, a lady and a lawyer who, since the very beginning of our industry, became known as the person, the first person, that hackers call. So with that I’m very happy to introduce Jennifer Granick, who is the Director of Civil Liberties at the Stanford Center for Internet and Society. Thank you very much, Jennifer.
16:00 Jennifer Granick’s personal background within the hackers’ community
Jennifer Granick: Hello, ladies and gentlemen. I’m really excited to be here. I think I have come to almost every Black Hat since the show started and it’s a real honor to be invited to be the keynote.
So, it was twenty years ago that I started coming to these events and that I went to DefCon. And I was interested in going to DefCon because I believed in the dream of a free and open Internet. And I believe that we want a world where information is freely accessible. And I believe in the freedom to tinker, the hands-on imperative: that people should be able to study, manipulate, and reverse-engineer the devices and the software that define the world around us, and that that’s what it means to engage with and to understand our world. And I went to DefCon because I wanted to be part of making these dreams come true, and as an attorney, I thought I could use my services to protect hackers, the people who were making this world happen, from the predations of law. But today, that Dream of Internet Freedom that brought me to DefCon twenty years ago is dying.
17:26 General trends
And it’s dying because – nobody is murdering the dream – but it’s dying because, for better or for worse, we’ve started to prioritize other things, we have put other values ahead of openness and freedom. We are looking at security, online civility, improving the user interface, protecting intellectual property interests, and we are valuing these above freedom and openness. And so, through neglect and other evolutionary trends, what we are seeing is an Internet that is less open and more centralized. We’re seeing an internet that is more regulated. We used to not have very much regulation, now we do. And in terms of where these rules are coming from, we’re seeing an Internet where the United States’ dominance over the network is fading and other countries are getting in on the regulatory business. And this is really important, because the next billion internet users are going to come from countries that don’t have a Bill of Rights, that don’t have a First Amendment. And it’s going to be those governments that are getting in on the business of regulating the internet.
So, these trends are accelerating. We can see them now, but they are accelerating. And what I think this means is that the dream, today, is in danger, but we can kind of see forward into the future and what the future will look like twenty years from now. And twenty years from now, you won’t necessarily know anything about the decisions that are made that affect your life and your rights. The software is going to compute on data, and it’s going to decide whether you get a loan, whether you get a job, whether a car runs over you, or drives off a bridge. And these things are going to happen, and you’re not going to know why, and maybe the people who design the software are not going to know why either.
Now, when the public learns about these things, how are we going to feel? Well, it’s going to work well enough. It’s going to work well enough. And what is going to happen is that there will be a lot of mistakes, but the mistakes are going to be on the edge cases, and as long as those mistakes disproportionately affect edge cases and minorities, people are going to accept this state of affairs. But that’s not ok. Because it’s the edge cases and the minorities that are the innovators, that are the early adopters, that are the first movers, that are the ones that evolve our society forward.
The Internet is going to become more like TV instead of this global conversation that we envisioned 20 years ago and that attracted many of us to computers. And rather than technology being revolutionary, and overturning existing power structures, we are seeing that technology is being used to reinforce existing power structures. This is particularly true in security. People want and need a certain level of safety online. But what we have learned is that people are not able to protect themselves. And so the idea of security has been centralized: companies and devices need to provide security to the public. Well, that’s not working well enough, but that is the enterprise that everybody here is engaged in trying to provide. But when we centralize in order to achieve that goal of security, what happens is that we create these choke points where regulation can happen. And we are seeing that regulation in things like the push for crypto backdoors. The regulation is going to be done by governments that have domestic, local concerns, not global concerns. It’s going to be influenced by elites, and that is people with money and companies with money. And so, powerful groups are going to get to decide who gets security, and who doesn’t.
And then, finally, we see that… finally we see that the way the internet is currently operating – for technological and for business reasons – instead of routing around censorship, is actually facilitating surveillance, censorship, and control. It doesn’t have to be this way. But if we are going to change things, we need to start doing it now. And it needs to start with us making… asking some difficult questions and making some hard decisions.
What is it going to mean when computers know everything about us, and computer algorithms make life and death decisions? Should we be worrying more about another terrorist attack in New York, or about the ability of journalists and human rights workers around the world to do their jobs? How do we value and weigh those two things? How much free speech does a free society really need? We see that technology has created this golden age of surveillance. Can we also use technology to readjust the balance of power between people and governments so that we can have some privacy back? Given that decisions by private companies are going to be determining individual rights, how can the public interest be communicated into that process? How can we democratically control what these important platforms and private companies do, without squelching innovation? Who is responsible for digital security? Is it the US government, is it governments’ role generally, or is it the private responsibility of corporations? And what is going to become of the Dream of Internet Freedom?
23:50 The Dream of Internet Freedom in the 1980s and 1990s
For me, the Dream of Internet Freedom began in 1984, when I read Steven Levy’s book “Hackers: Heroes of the Computer Revolution,” in which he talks about the computer scientists and engineers who built the early internet. And these people had a value called the Hacker Ethic. And the Hacker Ethic was that information should be freely accessible. The Hacker Ethic was the hands-on imperative: that people should be free to manipulate, change, modify, study, and reverse-engineer the technology around them. And the Hacker Ethic was built into the technology itself. Decentralization was a design principle that had a political impact. And the important thing about decentralization is that it empowered people to make their own decisions about what was right and wrong.
Decentralization was built into the very DNA of the early Internet, where there would be dumb pipes and smart edges, so innovation could take place at the edges and the network would just carry it. And the idea was that we would have this global network that would allow us to communicate with anyone, anywhere, anytime, and that would bring us all of the hopes and dreams and glories that the human mind and heart could dream up. I wanted to live in that world.
And that idea, that we could be in charge of our intellectual destinies, and that technology would help, carried with me when I went to college. And my college was New College, a liberal arts school in Sarasota, Florida. Do we have any New Collegians here? Oh, too bad. Yeah.
So, New College is a place where the motto of the school is that “Everybody in the final analysis is responsible for his or her own education.” And it was around that time that I read the Hacker Manifesto, in Phrack magazine, written by The Mentor. And I learned from that that hackers were a lot like my fellow academic nerds at New College. We were tired of being fed intellectual baby food. We wanted to take responsibility for our own lives. We thought that information should be freely available, and that we should be able to communicate and think freely. We wanted to… We mistrusted authority, and we wanted to change the world. And we wanted to live in a place where, in The Mentor’s own words, we would exist, without skin color, without nationality, without religious bias, just based on the quality of our thoughts. So, this is what I was into when I was starting to use the internet in 1991.
And I remember the day that everything clicked for me. We had a small ISP called hallownet and I needed some help, I didn’t know what I was doing, so I asked a question of the system administrator. And he started to respond to my question and I could see him typing, his letters, one by one, appearing on the screen I was looking at, and this, just, connection over the technology. And it made me realize, viscerally, for the first time that this idea that we could talk to anyone or everyone, in real time, could be a reality. So, twenty years ago, when I became a criminal defense attorney, I had this love of technology.
And I learned that hackers were getting in trouble for doing things that I thought were actually pretty cool tricks. One of the instances that really affected me was – I was a legal aide for the San Francisco Sheriff’s Department, working in the jail and giving people legal advice and representation. And one of the jail inmates was looking at having all of his jail credit time taken away because he had been basically hooting into the pay phone and getting himself, and all of his pod mates, free phone calls home. And I was like, you’re going to get time taken away for that? That’s pretty cool. And as I was investigating the case, that’s where I learned that hackers were getting in trouble for all kinds of things. And there were all these laws out there that were impacting the Hacker Ethic, and I thought as a lawyer I should get involved and maybe I could help.
28:40 First attempts at regulating the internet in the US
So, this was also the same year that the Internet Civil Liberty Wars really started, at least from a legal perspective. That year, 1995, a guy named Marty Rimm wrote a “study” saying that the internet was running rampant with pornography. A law journal published the story, and then Time Magazine did a cover story on it, and the cyberporn scare was off and running. And there’s nothing that gets Congress more excited about doing something than the scourge of pornography, so Congress quickly passed this law, the Communications Decency Act of 1996, the CDA. Ok, so this was an attempt to regulate online pornography.
So for those of you who are pornography fans out there this is already a bummer. But it was actually worse than that, because in order to regulate pornography, the government had to argue that the internet wasn’t going to be fully protected by the First Amendment. So the internet was going to be more like TV and less like a library, ok. And it was worse than that, because we had bigger hopes for the internet than that it would be a library. The internet was better than a library, because on the internet, everyone can be a creator, too. On the internet, it’s global. And on the internet, unlike in a library or a bookstore, everything is always on the shelves. So this idea that we will take this promise and basically cut its legs off, so early on, was anathema. We were very upset.
So, what happened was… at that point, a lot of people became activated, and enter John Perry Barlow: lyricist for the Grateful Dead, rancher, founder of the Electronic Frontier Foundation, a great man who has lived a wonderful life. Barlow has been in the hospital these past couple of weeks, so I send a shout-out to him and I hope people will keep him in their thoughts.
What Barlow wrote was, as could be expected, lyrical. And what he wrote was revolutionary, it was really a revolutionary document: the Declaration of the Independence of Cyberspace. And in it, Barlow wrote: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”
Now, Barlow was reacting to the CDA and the assertion that the Internet should be less free than books and magazines. But he was also expressing a weariness – and I think a weariness that a lot of us shared – with business as usual. He was expressing a hope that the Internet would be able to place our reading, our friendships, our very thoughts beyond government control. And it was maybe naïve, and maybe a little bit radical, but the core idea, I think, attracted a lot of people, and it definitely attracted me.
So, as it turned out, Marty Rimm and the Communications Decency Act didn’t kill the internet. In fact, it’s a little bit ironic, because what ended up happening was that in the lawsuit challenging the CDA, the Supreme Court struck down almost every part of the law. The Supreme Court said the First Amendment applies fully and completely to the internet. But there was one part of the CDA that the court did not strike down, and that part, read alone, seems to be sort of the opposite of what Congress’s goal was in passing the CDA, because it said that online service providers don’t have to police the content on their services and can’t get in trouble for content except in certain, specific categories. So the idea there is that, you know, the provider doesn’t have to be a policeman.
And together, the Hacker Ethic, the Hacker Manifesto, the Declaration of the Independence of Cyberspace, ACLU vs. Reno – the Supreme Court case, and this part of the CDA describe what’s a more or less radical dream, depending on who you are. But it’s one that many, if not most of us in this room share, or shared, and maybe even have spent our lives working for. So, the dream is that we overcome age, race, class, and gender. The dream is that we can communicate with anyone, everywhere at any time. This enhanced individual liberty. The dream is that you have free access to information. And the dream is the hands-on imperative, the freedom to tinker, that we will be able to study, know, and ultimately understand the devices and software around us. The dream, in sum, was that computers were going to make our lives more free and better.
34:28 Inequality in the digital world
But I’m here to tell you today that this dream of internet freedom is dying. And if you look from here at the trends, and look twenty years on, it doesn’t just look like the internet could be a lot less revolutionary than we had hoped. It looks like in a lot of ways it might be a lot worse. Today, we have seen that race, gender, and class are more than resilient enough to thrive in the digital world. We have seen that our ability to communicate with anyone anywhere is being limited by both government control and corporate policies about what speech is acceptable online. We have seen that free access to information is also limited, particularly our ability to study the software and hardware around us. So many laws now interfere with what computer hackers do, and with reverse engineering. So, the question that’s left is: will computers liberate us? Is that dream still possible?
Now, I want to talk first about equality. Race, gender, and class discrimination are proving remarkably resistant to change. Now, I have to say, this has not been my experience: being here, at DefCon, at Black Hat, being part of the world of computer security, I have always felt respected and I have always felt welcome. But there’s too much evidence that other people’s experiences are not the same, and I want just to illustrate this with one simple set of statistics. At Google, 30 percent of the workforce is female, but only 17 percent of the people in tech jobs are. At Facebook, that number is 15 percent, and at Twitter, it’s 10 percent. So we are very far away from equality.
My other piece of evidence is anecdotal: This field in particular has a reputation for being overwhelmingly male and overwhelmingly white. Now, I have never understood why that’s true. Because, from what I have seen, the hacker community is unbelievably great at recognizing talent and skill in unconventional candidates. I mean, we have people who are unbelievably successful who never finished college, never mind high school. We have people all over the autism spectrum who are doing incredibly well. Age is irrelevant. I mean, Aaron Swartz, when he was 15 years old, hung out with Doug Engelbart, the creator of the mouse. Inclusion is at the very heart of the hacker ethic and community. And I think we have a choice: we can kind of persist the way that we have been going, or, I think, this field could really take leadership in evolving a more equal society, starting with security. But I think we have to consciously try to do so, and try to cultivate that talent.
37:52 Defending the freedom to tinker
Next, I want to talk about the freedom to tinker. And when I say “Freedom to tinker,” it sounds a little bit like a hobby, but I want to impress on people how important this is. Because it’s not the ability to, like, go putter around in your garage. It’s the ability to study, modify, and ultimately to understand the technology around us. And as technology is more and more important, that understanding is necessary for a democratic society. And there are two things that are limiting our ability to… ah, the freedom to tinker. One of them is law, and one is just our natural human capacity to understand things. Now, there are many, many examples of how the law has interfered with the freedom to tinker, but I am just going to give you two.
It was exactly ten years ago that Mike Lynn, who worked for ISS, was scheduled to give a talk about a new class of vulnerabilities in routers. And Lynn’s employer ISS and Cisco, the router manufacturer he had studied, decided at some point that they did not want him to give the talk. And so they pressed Black Hat with threats of copyright lawsuits to actually rip the pages with Mike Lynn’s slides out of the conference books and reprint and re-do all of the CDs. I mean, there’s nothing that looks more like censorship than people actually ripping pages out of books. So the next morning, Mike got on stage, put on a white baseball cap – literally a white hat – quit his job and gave the original talk anyway.
Now, I was Mike’s lawyer in this case, and we successfully fought back the civil lawsuit for copyright infringement, and we were able to fight back against the criminal investigation that the companies had instigated against him. But the message was loud and clear, and not just to Mike. The message was: This is our software, not yours. This is our router, not yours. You’re just a licensee and we’ll tell you what you are allowed to do, and you’ll just do that and no more. You can’t decompile this, you can’t study it, and you can’t tell anyone what you find.
The other case I want to mention is the criminal prosecution under the Computer Fraud and Abuse Act against my friend Aaron Swartz. Aaron got in trouble for writing a script that automated the download of academic journal articles. And he was authorized to access these articles as a student at Harvard, but he was doing it really, really fast. Aaron was a hacker, and he challenged the system in all kinds of ways, and they went after him with a vengeance and charged him under the CFAA on multiple counts, and he was looking at a lot of years. And the stress of the case ultimately contributed to his killing himself, which is an unbelievable tragedy. But again, the idea that anything Aaron did was unauthorized is, to a computer scientist, crazy. But yet, here too, the message was clear: You need our permission to operate in this world. If you step over the line, we will come for you. If you automate, if you download too fast, if you type something weird in the URL bar on your browser – if we don’t like what you do, or if we don’t like you – then this law is vague enough that we can come and get you.
41:44 Political measures and cyber security
Now, the question is, in the future, are we going to have the freedom to tinker? What would it take for us to change the path that we are on now?
Well, first, Congress would have to stop this tough-on-cybercrime hand waving, and actually do something for real about cyber security instead of saying “Oh, we’re going to have bigger penalties under criminal law.” No one cares about bigger penalties under criminal law. If you look at any of the big breaches we have had over the past two or three years, there have been no criminal prosecutions in any of them. And, you know, China or North Korea, or whoever is behind these breaches… we are not putting China or North Korea in prison. So what’s happening is that these heavy sentences are chilling the good guys and are not protecting people online.
We’d also have to declare that users and people who buy software have the right to modify that software, and that laws like the Digital Millennium Copyright Act can’t get in the way of that. So this is really important, because in the next 20 years, we are going to have all these network devices. There is software in everything. It’s true now, and it’s only going to get more true. And if we are not allowed to study that, basically what it means is that we are going to just be surrounded by black boxes that do things that we can’t understand. So we need to get rid of the CFAA, and we need to get rid of the DMCA, we need to get rid of this idea that license agreements can limit what we do. There is a public interest in the freedom to tinker that needs to be protected.
43:32 Technological advances and software liability
The other problem is the idea of… is just our natural intellectual limitation on understanding the world. So in the next 20 years, we are going to see these amazing advances in machine learning and artificial intelligence. And one of the things that is going to happen is that software is going to do stuff and we are not going to be able to really understand why. Some of that opacity will be because we don’t write the software, but increasingly it may be that the very people who actually wrote the software don’t know either.
And a law professor, Frank Pasquale, wrote a book about this, called The Black Box Society. And the idea of the book is that algorithms are going to be making these life and death decisions about us, and we are not going to be able to understand them. Really, the first step to doing anything about something is to understand it – transparency. And transparency is actually going to become increasingly hard. So, you know, you take secrecy and you take profit motive, you add a couple hundred million pieces of data about all of us, you shake, a result comes out, and that’s what we live with. We need to think very hard about how we take advantage of AI and machine learning without ending up in that kind of terrifying world. And part of that is talking about who is responsible when software fails. Whose job is it? And so far, we have had almost no regulation of software. There have been very few cases, mostly ones where the vendor has misrepresented to the customers, you know, what the software does.
But people who, you know, are not big into regulation are sick and tired of crappy software, and they are not going to take it anymore. And that feeling is going to be accelerated by the Internet of Things. Because now we have industries that are very used to product liability that are also software vendors. Autonomous cars that crash? Somebody is going to sue. When your networked toaster catches on fire, somebody is going to sue. And there is going to be software liability. We have already had Chrysler recall 1.4 million cars based on research that Charlie and Chris are going to talk about later today.
So, what is going to happen then, when we have software liability? I think software liability is inevitable. I also think that it’s necessary. But without question, it’s going to make coding more expensive. And it’s going to make coding more conservative. I think that we will do a very crappy job of imposing software liability for a very long time. And I think that the people who are going to suffer are going to be the innovators and the start-ups, and not the incumbents. So we have to pay a lot of attention to that and be wary of it. But it is going to happen, because it’s a very short step from suing Tesla to suing Oracle, with all the good and bad that will come of that.
46:54 The evolution of public interest in relation to the use of the internet
So, next I want to talk about privacy, and security, and free speech, but we need to take a step back and talk a little bit about how we got here. So I mentioned that when I was reading Steven Levy’s book, I was learning about the concept of the decentralized internet, and the end-to-end principle. And the end-to-end principle was intentional: innovation happens on the edges. And what that meant was that the internet would not just enable communication – the phone network did that – but that it would do it in a democratized, decentralized, radical way. Power to the people, and not to the governments or companies that own the pipes. That model has evolved. It has evolved for business reasons, and it has evolved for technological reasons.
Today, broadband providers want to build smart pipes to enable quality of service, to do malware and spam filtering, and to have new business models where they can make more money off the fact that they control the underlying network. Today, hundreds of millions of people conduct their social interactions over highly centralized platforms like Tencent or Facebook. And so what does this mean for the public interest, this evolution away from end-to-end?
If you have not read Professor Tim Wu’s book “The Master Switch”, then you should. And in this book, Tim takes a look at the other great communication technologies of our lifetime: phones, radio, television, and movies. And what Tim says from studying the history of these technologies is that there is a cycle, and this cycle is this: History shows a progression of information technologies, from somebody’s hobby to somebody’s industry; from a jury-rigged contraption to a slick production marvel. I’m thinking BBS to web here. From a freely accessible channel to one controlled strictly by a single corporation or cartel – from an open to a closed system. Eventually, entrepreneurs or regulators smash apart the closed system, and the cycle begins anew. And Tim traces the cycle through these technologies, and then he asks the question I’m asking you guys here today: Is the Internet going to follow this cycle? Is the internet going to become centralized, strictly controlled, and closed?
If we don’t do things differently, the Internet will end up like TV. And, as I said, some of that is because we have neglected the goals of freedom and openness in favor of other values. But I think that we have to recognize that some of it is because people have lost their allegiance to the dream of internet freedom. Some people will say, maybe even people in this audience, that the internet is not the utopia that I have made it out to be.
Rather, the Dream of Internet Freedom has clashed head on with the ugly reality that other people can suck. Nasty comments, 4chan, /b/tards, revenge porn, jihadists, Nazis. These things are so affecting the public sensibility about whether the internet is a nice place to be or not that increasingly I even hear law professors, experts in the First Amendment who are supposed to know about the chilling effect and the doctrine of overbreadth, talk about what’s the best way to legislate this stuff they don’t like out of existence.
Second, there are the trends I told you about that are affecting the network. And these are centralization, regulation, and globalization. Centralization is a problem because it is a cheap and easy point for regulation, control, and surveillance. Regulation is on the rise. It is the exercise of government power in favor of local or domestic interests and private entities with economic power. That’s just the reality of our system, and it’s even more so in other places. And globalization means that other governments are going to get into the mix: other governments who are not constrained by the First Amendment, who don’t have a Bill of Rights, who maybe don’t even have due process or the rule of law.
51:50 Users’ preferences and the golden age of surveillance
Now, when I say that corporate control is a problem, it may sound like I’m blaming corporations. And when I say that the government … the internet is becoming more closed because governments are censoring the internet, it may sound like I’m blaming governments; and I am. But I’m also blaming you, and I’m blaming me. Because it’s the things that we want that are driving these trends. So, just as an example: Who here ever had a blog? [Raises left hand] Did anybody ever have a blog? Okay, a couple of bloggers out there. Who here still blogs regularly? I don’t; I post my updates on Facebook, the centralized server.
Who… Probably a lot of people in this room run their own e-mail servers, but almost nobody else I know does. They all use Gmail; and they like Gmail because they like the user interface, they like the spam filtering, and they like the malware detection. I’m no different. When I had an iPhone, I didn’t jailbreak it. I downloaded the pre-approved, you know, a-ok apps from the App Store, and I trusted Apple’s judgement about what was secure and what was, you know, available. And I download apps now, because I don’t like the mobile interface in my mobile browser. And when they ask me to say “Yes” to the permissions, I click “Yes” because I want it to do what it’s going to do, and so I give it access to all kinds of information about me, and I love it. I love when I’m at the store and my phone buzzes and reminds me that I need to get milk. I’m thrilled that it’s ubiquitously tracking my location. I mean, because otherwise – no milk, and that would be really bad.
So, my point is that we want lots of cool products in the cloud. But the cloud is such a terrible metaphor, because a cloud is billions of little droplets of water, and the internet cloud is not like that at all. The internet cloud is actually a finite and knowable number of companies that together have control over almost all of the internet that we use. And it’s Level 3 for fiber optics, or Amazon for servers, or Google for the search engine and for Android; and the fact that there are these chokepoints, these particular companies that are subject to government regulation, whether US or other, means that this more centralized cloud is an opportunity for control, for surveillance, and for regulation. And this is not looking like it is going to change.
So, as things keep going in this direction, what does it mean for privacy, security, and freedom of expression? Well, privacy is central to liberty, and that means that without privacy the future will be less free. This is the golden age of surveillance. People know how much information technology today collects about you. But what you might not realize is this:
Here’s a quiz. What do e-mail, buddy lists, drive backups, social networking posts, your web browsing history, your bank records, your medical data, your fingerprints, your face prints, and your shed DNA have in common? The answer is: The Department of Justice doesn’t think any of these are private. Right? The Department of Justice’s view is that these are all things that either happen in public, or that you voluntarily reveal to service providers, and so there’s no expectation of privacy, and the Fourth Amendment doesn’t apply. Okay. And what that means is that as technology has proliferated all these data, the law has not stepped in to protect it. The law has utterly fallen down on the job.
In fact, quite the opposite: The law is enabling surveillance in all kinds of ways. We have these national security surveillance laws that are supposed to apply to foreigners and particular categories of information, but we have learned through secret interpretations of law that our government in the United States is actually using them to spy on us. We have provider assistance provisions that the government is using not just to say “Well, you have this data, I’d like to get a hold of it”, but to try to force companies to do things like turn over their encryption keys. We have lots of laws, and more are being proposed, that will give corporate immunity for helping out the government by giving your data over, even when there are other, narrower laws that would say “No, actually this information is private.” And increasingly, particularly in other countries, but we are going to see it here too, there are data retention obligations, where companies are going to be basically commissioned to be police officers and spies for governments.
Now, you might think, like, “But there’s got to be some law on this, right?” We have had the internet for a while, e-mail has been around for a long time. But really surprisingly, there is actually only one case that has ever been decided on this, from, you know, a regular or public court. And it was in the Sixth Circuit, which is Kentucky, Tennessee, Missouri, and one other state – sorry for the other state in the Sixth Circuit that I forgot – oh, Kentucky, Tennessee, Michigan, and Ohio; sorry Ohio. But, as a result… But basically, in this case, the Sixth Circuit said “Ok, e-mail is a communication, communications are like phone calls, it’s protected by the Fourth Amendment.” And this case has been really important, but the Department of Justice, in public and in private, continues to say that it is wrongly decided and needs to be overturned.
Now, I want to also take a moment to impress upon people, because you might not really get this (not being lawyers), what a warrant means and how important it is. Now, a warrant means that a judge has to authorize the search. And basically it’s a guard against arbitrary government action: The police cannot just come in and run rampant through your house, or just, you know, investigate you for no reason. And that’s important. But a warrant is also important because it requires you to specifically describe the place that is being searched and things to be seized. So a warrant is also a guard against mass surveillance.
When there is no warrant requirement, it means that searches can be arbitrary and massive, turned against everybody. So, this is really important, but all this data is not being protected.
And, you know, globalization really only makes things worse. And it is going to get worse as we see the Internet of Things and networked devices. So, you know, we have got the centralization problem, there’s all this regulation, and countries are getting in on it. I mean, particularly now that other countries know how excessive the United States’ surveillance is, they want to have the same stuff.
59:14 Security vs. openness
Next, I want to talk about security. We often talk about security as the opposite of privacy, but we know that that’s not true. You can help security without invading privacy, like by locking cockpit doors. Sometimes, to protect security, you need to protect privacy: A homosexual person in India or a human rights worker in Syria is safer because of privacy.
One thing we don’t talk about that much is the relationship of security to openness. You know, as you lock down your network, as you make things more secure with sign-ons and there’s no open wi-fi anymore, and all of that, security has a tension with openness. But it’s also true at the same time that if the network is not secure and safe, people are not going to use it. What good is an open network that is too dangerous to use? Those of us who try to check our e-mail at DefCon already know that.
So the idea should have been that people can choose security when it’s appropriate and choose openness at other times. Right? The fact that we need to secure the electrical grid or the data systems that control water doesn’t mean we have to have closed wi-fi, or that the government has to sit on the domestic network and spy on our e-mail. But that’s what we are seeing. We are seeing that, in the name of security, we are having this greater exercise of power, particularly by the US government, over our use of the network.
So, instead of having a global view, that security on the internet should be a rising tide that will float all boats, what we are seeing is this very provincial idea that security is “cyber.” And cyber means, in my usage, what General Hayden, former head of the NSA and the CIA, said it means, which is that the US has the ability to use the network whenever we want, and we have the ability to deny that use to our adversaries. That does not sound like an open and reliable internet to me. That’s not my internet. And so what that means is that, you know, instead of us protecting the security of everybody, the government wants to have crypto backdoors, sit on the network and do surveillance, and be able to black out the internet in North Korea or wherever. And that means that there are going to be security haves and security have-nots. I think the better analogy for understanding security is that increasingly, we are seeing security becoming about power, where people in power want security for themselves and want to deny security to others. So, if security is a power relationship, then people are going to lose. And the people who are going to lose are the vulnerable communities, and the minorities, and the religious minorities, that actually need security most.
And, you know, here in the United States, people don’t care enough. Because we think “Oh, we have the Bill of Rights, and we have all of these laws that protect against discrimination.” I think a lot of people know that those laws, like our privacy laws, do not work well enough, but certainly in other parts of the world people don’t have those safeguards, people don’t have that security. And if we are not going to be a leader in providing it to them, they are going to lose out on the democratic benefits and on the human rights benefits of providing security to everybody. But I don’t hear that being the model from our government.
1:02:58 Freedom of speech is being undermined
Finally, I want to talk about freedom of expression. And, just briefly, you know, for all kinds of reasons, we have seen censorship on the internet, whether it’s for copyright or that sort of thing. But now that the physical architecture is so centralized, it’s easier to control. So, here is an example: Our government and the UN have started to ask platforms to police their networks for political speech. And right now it’s radical speech, or terrorist speech, ISIS videos, or jihadist things, but they are even starting to ask platforms to watch out for people who are becoming radicalized.
So, I can tell you, if you look at what the FBI thinks of as signs of radicalization, we have no idea. The FBI doesn’t know, or psychologists, nobody knows what makes somebody a radical and what makes somebody curious; what makes somebody have legitimate, you know, non-violent political viewpoints, and what makes somebody who is going to be dangerous and violent.
But people are not rebelling against this. I don’t see people booing when Google says “Ok, I’m going to, you know, I’m going to take ISIS videos off of YouTube.” People are not upset about that. So it’s ISIS videos, then it’s revenge porn, but we have to understand that these censorship decisions are inherently political, because we don’t see the same calls to take down racist speech, or pictures of the Confederate flag, or that kind of thing. And in the United States, we are not even seeing laws that we can protest against. What we see is that the government, or interest groups, put pressure on the companies, and the companies make these decisions because they want to have a service that appeals to the majority of their users – not to the edges, not to the fringe, not to the radicals. And so they censor, and most people don’t care.
But the end result of that is that people, particularly the new adopters around the world… These new adopters around the world don’t really have a sense, necessarily, of the broader internet. There was a poll of internet users in Indonesia, and a lower percentage of people said that they use the internet than said they use Facebook. They don’t think about Facebook as “using the internet.” In a lot of the news stories I saw, the reaction was like, “Funny, they don’t know that Facebook is the internet.” And I actually had the opposite reaction, which was: Facebook is not the full internet. Facebook is a community, a narrower community that allows you to do particular things, shows you particular information, based upon what Facebook thinks you are going to like. But it doesn’t give us that global conversation. It doesn’t give us that radical freedom. It doesn’t have everything on the shelf.
1:05:48 Conclusions for the future
So what does this mean for the next 20 years? You know, in the next 20 years, things will happen, and no one will really know why. You will be more ignorant about the world around you. In the next 20 years, you will mostly feel ok about it. People will mostly accept it because it is going to affect minorities and edge cases, and it is going to work ok. It is going to work ok, and the internet is still cool, so we are still going to have enough good stuff.
The internet is going to become a lot more like TV. We are going to be watching videos, and consuming, and we are not going to be able to reach that global audience. I mean, even if you have a blog now, to reach an audience, you need search engine optimization, and CDNs, and all of that stuff. It’s not the level playing field that we once thought it would be. And existing power structures are going to be continued, replicated, and strengthened, whether that’s in the field of security, in the field of surveillance, or in the field of censorship.
Or, we have an alternative. We could think globally instead of locally and nationally about what we would want. Yes, we need to guard against more terrorist attacks in New York, but we cannot ignore the impact that something like crypto backdoors would have on journalists and human rights workers around the world.
We can start thinking about technology as something where we want to build decentralization back in, where possible. Give that power back to the people where we can. And part of restoring that balance of power is end-to-end encryption. We need end-to-end encryption, so that when the government needs your data, instead of secretly going to Level 3, or to Google, or to Microsoft, or to Apple, they have to come to us for it.
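[Editor’s note, not part of the talk: the short Python sketch below illustrates the point Granick is making about end-to-end encryption. It uses the PyNaCl library as an assumed, illustrative choice; any NaCl/libsodium binding would do. The message is encrypted and decrypted only at the two endpoints, so an intermediary relaying the ciphertext, whether an ISP or a cloud platform, has nothing readable to hand over, and anyone who wants the plaintext has to come to the people holding the keys.]

# Minimal end-to-end encryption sketch using PyNaCl (an assumed library choice).
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair and never shares the private half.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sender_box = Box(alice_private, bob_private.public_key)
ciphertext = sender_box.encrypt(b"meet me at the usual place")

# A relay in the middle (ISP, platform, cloud provider) sees only this ciphertext.
# Only Bob, holding his private key, can recover the plaintext.
receiver_box = Box(bob_private, alice_private.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet me at the usual place"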
We need the government to keep its hands off private technology development. It’s not the government’s business to tell us to design networks to be surveillance-friendly; it’s our business to try to create technology that will give people the tools they need to have a better life and a freer life.
We need to start to be afraid of the right things. Humans are really bad at understanding risk. People are way more afraid of sharks than they are of cows, but cows kill something like 8 times more people a year than sharks do. It’s true, look it up. The most dangerous thing we do every day is get in our car – by far the most dangerous thing we do every day. So we need to start being afraid of the right things. We need to learn what to accept, and we need to address the right problems.
We need to modify our laws to be better. We need to get rid of the computer crime law as it is written, we need to modify the DMCA so that it doesn’t interfere with security research, we need to look at provisions of the Patriot Act and revise those as well as other foreign intelligence surveillance laws, and we need to do away with secret law. We have secret law in this country, and it is an abomination in the face of a democracy to have that.
At the same time, privacy isn’t dead. We may have all this technology collecting information, but we can use law to provide safeguards where technology can’t. But we won’t do it. Why don’t we amend the Electronic Communications Privacy Act to protect your e-mail fully? Why don’t we amend the Electronic Communications Privacy Act to protect our geolocation data? These are very simple, basic proposals, but Congress won’t do it. We have to get behind it, and we have to push.
Now, there is a possibility that these provisions – and these are just a few ideas – are not going to work, that these ideas are not going to work. And what that means is that in the next 20 years, instead of seeing the dream of internet freedom come true, we are going to see it getting sicker, and sicker, and sicker, until it finally dies. And then the internet is going to be this slick, stiff, controlled, closed thing. It will be good, it will be better than TV and radio, but that is what it will be. And if that’s true, then what we need to do in the next 20 years is: We need to get ready, and we need to get ready to smash it apart and make something new and better.
Thank you.
Metadata
video source: https://www.youtube.com/watch?v=Tjvw5fz_GuA
image source: https://medium.com/backchannel/the-end-of-the-internet-dream-ba060b17da6…