Program Notes

https://www.patreon.com/lorenzohagerty

Guest speaker: Cory Doctorow

Cory Doctorow delivering his Palenque Norte Lecture at the 2017 Burning Man Festival

Date this lecture was recorded: August 2017

Today’s podcast features the 2017 Palenque Norte Lecture by Cory Doctorow. In this talk, which was given during the Burning Man Festival, Cory walks us through some of the current issues affecting our ability to exchange our thoughts and words without the threat of Big Brother listening in. As he explains, this isn’t a case of winning or losing; it is about the ongoing effort we all must make to keep the Internet free, and not let it simply become the tool of big corporations and governments to spread their stories at the expense of We the People.

Previous Episode

031 - The Sacred Cactus

Next Episode

032 - Ben Stewart’s Wild Ride


Transcript

00:00:00

Greetings from Cyberdelic Space.

00:00:19

This is Lorenzo, and I’m your host here in the Psychedelic Salon.

00:00:23

And to begin with, I thought that I’d better remove some confusion that I seem to have created here in the salon.

00:00:30

Since launching the Salon 2 track, you have heard both myself and Lex Pelger talk about Patreon,

00:00:37

which is a site that allows people to directly support artists, writers, musicians, and podcasters, among others.

00:00:43

Both Lex and I have personal Patreon

00:00:46

accounts where a few people send us whatever amount they want to each month. The funds from

00:00:50

those accounts go directly to each of us personally. But to support the web presence of the Psychedelic

00:00:56

Salon, after the first few years of supporting it out of my own pocket, I began to accept donations

00:01:02

due to the fact that, well, our numbers had grown to a

00:01:05

point where it was getting too expensive for me to do on my own. And so I began accepting donations

00:01:11

to help offset some of those expenses. Now, over the past 12 and a half years, there have been many

00:01:17

millions of downloads made by well over a million people. And out of that number, and over the past decade, there have been somewhere

00:01:25

around 300 to 350 people who have made donations to keep us going. So, in a way, you can say that

00:01:34

the wonderful people who make direct donations to the salon are one in a million. They are the

00:01:40

heart and soul of these podcasts from the Psychedelic Salon, and without them, well,

00:01:44

we wouldn’t be sharing this time together right now. Hopefully that gives you a little better

00:01:49

idea about the difference between our Patreon followers who are supporting our personal work,

00:01:54

and our wonderful fellow salonners who make direct donations to keep the salon going.

00:01:59

And during the past three weeks, the salon has received direct donations from Douglas H., Daryl H., and Joel M., all of whom I would like to thank from the bottom of my heart.

00:02:11

You have provided the fuel, and now it’s my turn to light the fire.

00:02:17

So today I’m going to begin podcasting the Palenque Norte lectures that were held at this year’s Burning Man Festival.

00:02:24

And thanks to Frank Nuccio,

00:02:26

we have some excellent recordings of all the talks, something that I was never able to pull

00:02:31

off myself when I first began that lecture series. It’s not so easy to do out in the playa.

00:02:37

So Frank, along with the rest of our fellow salonners, I send my sincere thanks for all of

00:02:43

your hard work out there. And I also want

00:02:45

to thank everyone at Camp Soft Landing, and particularly those who work to produce these

00:02:50

lectures. I know how tough that can be out there on the playa. It’s a difficult task and is done

00:02:56

by a core of volunteers, and we owe them all a big debt of gratitude. And I should also add that Frank has offered to buy me a ticket to Burning Man in

00:03:06
2022. Why so far in the future, you ask? Well, my friend Bruce Damer let it slip to Frank that
00:03:14

I told him that if all went well, I would come back to the burn that year to celebrate my 80th

00:03:19

birthday. And now that Frank has offered to remove one of the big obstacles to my making another burn

00:03:25

I guess that I’m going to have to begin getting a little more serious about actually doing that

00:03:30

so thanks for that as well, Frank. Now, my previous podcast from here in the Psychedelic Salon 1.0

00:03:38

featured a talk by Gore Vidal, along with a few highly political comments by myself,

00:03:43

and to be honest, I was a

00:03:46

little surprised at some of the responses that I received about that program. Many of them were

00:03:51

negative. And here’s part of one of my favorite negative comments, and I’m quoting,

00:03:57

Can you please just stick to posting psychedelic research and McKenna talks, and leave the spreading of leftist political ideology to professionals like Salon or CNN?

00:04:09

I used to come to this podcast to educate myself and learn about interesting, non-politically charged subjects.

00:04:16

Or if there were politics, people like McKenna would at least take an unbiased stance, if anything.

00:04:22

But now I should listen to your rabid leftist rants because

00:04:25

somebody in the White House doesn’t like sodomites, terrorists, and illegal criminals?

00:04:30

End quote. It goes on, but I think you get the gist. So I thought that today I should carry on

00:04:38

with yet another political discussion. You see, I learned a long time ago that if I’m not pissing somebody off, then, well, I’m not doing a very good job.

00:04:49

Now, don’t get me wrong, I’m not doing this just to make people mad.

00:04:52

The reason that I’m doing another politically flavored talk is in the hope that it will help you to open your mind a bit more

00:04:59

and give you yet another view of what is meant by psychedelic thinking.

00:05:03

You see, the word psychedelic does not of itself imply that drugs are involved.

00:05:09

The word was originally coined to mean mind manifesting.

00:05:13

And, for me at least, one of today’s writers has for many years now helped me

00:05:18

to think about things in ways that I hadn’t conceived of before.

00:05:22

His name is Cory Doctorow, and this will be the third time

00:05:26

that I’ve had the pleasure of featuring him here in the salon. Hopefully this talk won’t offend too

00:05:31

many people, but we would have a rather narrow-minded group of fellow salonners if it doesn’t ruffle at

00:05:37

least a few feathers here and there. But hey, we’re big kids now, and so I hope that we can all get

00:05:42

something positive from this very interesting Palenque Norte lecture by Cory Doctorow. Our next speaker is Cory Doctorow. Trumpism knocks us back.

00:05:58

Thanks, everyone. Thank you for coming. It’s very nice to be back at Palenque Norte for, I think, the fourth, fifth time, maybe.

00:06:06

Nice to see some of you again.

00:06:08

So this is a talk I gave this year at DEF CON, which is a big hacker conference in Las Vegas.

00:06:14

And I’m going to reprise it for you folks.

00:06:17

I’ve changed it a little because I’m assuming that you’re not all cryptographers and security experts.

00:06:20

But it kind of relates to something that’s happened in my own work, which is, you know, I work in part for the Electronic Frontier Foundation.

00:06:28

We work on privacy and free speech as they relate to the internet.

00:06:32

And obviously, the election of Donald Trump was kind of a sad moment for us for not just

00:06:40

the obvious reasons that the whole country shared, but in specific, the issues we work

00:06:44

on and some of the policy fights that we’d made some progress on

00:06:47

suddenly started to roll back.

00:06:49

And I got to thinking about how to understand policy setbacks

00:06:54

and how to understand policy advances

00:06:57

and what we really mean when we say we’re winning or we’re losing these big fights.

00:07:02

So you may remember during the Obama administration, Obama

00:07:08

appointed this guy, Tom Wheeler, to run the FCC. And at the time, I think a lot of us were really

00:07:13

worried because Tom Wheeler is a former telecoms lobbyist. John Oliver famously called him a dingo

00:07:21

babysitter, right? You know, you take this guy who’d worked for the big cable operators

00:07:25

and you put him in charge of regulating the cable operators.

00:07:28

And there was this push to get Tom Wheeler

00:07:31

to require that ISPs treat the internet as a kind of dumb pipe

00:07:39

and that we have a neutral internet by kind of regulation

00:07:43

as well as by custom.

00:07:45

And this was called Title II regulation.

00:07:49

And lots and lots of people phoned in, lots of people emailed,

00:07:53

lots of people went in person.

00:07:54

And then in February of 2015, February 26, 2015,

00:07:59

Tom Wheeler, the dingo babysitter, actually gave us a network neutrality regulation.

00:08:04

He gave us what

00:08:05

we’d been asking for. And it was a day where it felt like we’d won. He even went further. You know,

00:08:13

a lot of the states have rules, state rules, that ban cities from providing broadband to the people

00:08:20

who live there, even if no cable operator or phone company wants to give broadband

00:08:27

to the people who live in the state or the city. And he said that those rules exceeded the states’

00:08:33

jurisdiction, that only the FCC could make rules about who could and couldn’t be an ISP, that the

00:08:38

states couldn’t do it. So he was opening the door for cities to serve their own citizens as well.

00:08:42

So it was an amazing day. And then like a year later, January 20th, Donald Trump is sworn in and he immediately appoints this guy named

00:08:52

Ajit Pai to run the FCC. And Ajit Pai is also pretty thoroughly a dingo babysitter.

00:09:00

He hates net neutrality. He has been a staunch opponent of net neutrality through his whole career, much of which he spent working as a lawyer and lobbyist for the telecoms.

00:09:11

And he promised that his first order of business would be to dismantle the Title II regulations and get rid of net neutrality rules in America.

00:09:26

And so on July 12th, there was this day of action. And now there is this day of reckoning that’s going to come where we find out what Ajit Pai does as a result of all of these

00:09:31

people who wrote in and called. And it begs the question, you know, are we winning or losing the

00:09:35

fight for net neutrality? Are we winning because Ajit Pai got millions of phone calls? Are we

00:09:41

losing because Ajit Pai has promised to steam ahead? And I think the thing

00:09:46

is that that’s the wrong sort of question.

00:09:47

Because political change is a process

00:09:49

and not a product.

00:09:52

And it’s a process governed by four forces.

00:09:54

Forces that Lawrence Lessig set out

00:09:56

in his book

00:09:57

Code and Other Laws of Cyberspace

00:09:59

in 1999. So Lessig says

00:10:02

that the world is regulated

00:10:03

by the confluence of code, markets,

00:10:07

laws, and norms. That what’s technologically possible, what’s legal, what’s considered moral

00:10:14

or right, and what’s profitable determines what happens. And I think that this is a framework

00:10:21

we can use to understand whether or not we’re winning or losing, and how to

00:10:26

win more and lose less. So let’s look at where network neutrality stands now. On the code side,

00:10:33

well, this Lessig protégé, a guy named Tim Wu, who had served in the FTC and then at Columbia and

00:10:43

then went to work for the Attorney General of New York,

00:10:46

he got some open source measurement tools and he gave them to New Yorkers

00:10:50

to measure their internet speed and find out what actually happens

00:10:55

when you buy internet service from one of the few companies legally allowed

00:10:59

to give you internet service in New York.

00:11:01

And in June 2016, he sent an open letter to Time Warner Cable that said,

00:11:07

in advertisement after advertisement, Time Warner Cable promised a, quote, blazing fast,

00:11:12

quote, super reliable internet connection, yet it appears the company has been failing to take

00:11:16

adequate or necessary steps to keep pace with the demand of Time Warner Cable customers,

00:11:21

at times letting connections with key internet content providers become so congested

00:11:25

that large volumes of internet data were regularly lost or discarded. So we now have technology

00:11:31

that we can use to actually show that the network access that we’re being promised and that we’re

00:11:36

paying for is not what we’re getting. So that’s actually a big thing. Prior to that,

00:11:42

when we would say to the cable operators or the ISPs,

00:11:46

your internet sucks, they would say, no, it doesn’t. Everyone hates their internet.

00:11:51

No one likes it when there’s the occasional slowdown. But our internet is perfectly fine.

00:11:56

We are the only ones qualified to tell you whether our internet is perfectly fine because we run an

00:12:00

ISP and you don’t. Never mind that they’re the only ones legally allowed to.

00:12:06

Bits per second.

00:12:08

Well, yeah.

00:12:11

And so

00:12:11

now we can

00:12:14

actually quantify

00:12:15

how bad does the internet suck?

00:12:18

We can also quantify

00:12:19

did it just get better or worse?

00:12:21

We try a thing, we can tell you whether

00:12:23

or not the thing worked.

00:12:31

So we now have a new code element in our arsenal to use in the fight over network neutrality.
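To make that concrete, here is a minimal sketch, in Python, of the kind of throughput check those open-source measurement tools perform. The test URL and the advertised speed are hypothetical placeholders, not details of the actual tools distributed in New York.

```python
# Minimal sketch of a throughput measurement, in the spirit of the
# open-source tools described above. TEST_URL and ADVERTISED_MBPS are
# hypothetical placeholders, not details of the real tools.
import time
import urllib.request

TEST_URL = "https://example.com/100MB.bin"  # hypothetical large test file
ADVERTISED_MBPS = 300.0                     # what the ISP's ad promised

def measure_mbps(url: str) -> float:
    """Download the test file and return observed megabits per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        nbytes = len(resp.read())
    elapsed = time.monotonic() - start
    return (nbytes * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    observed = measure_mbps(TEST_URL)
    print(f"observed {observed:.1f} Mbps vs advertised {ADVERTISED_MBPS:.1f} Mbps")
```

Repeated measurements like this, run against a known test file, are what let you say that the promised service is not the delivered service.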

00:12:32

How about the law?

00:12:36

Well, we have Title II that’s in law, right?

00:12:39

There’s a rule now that says ISPs have to be neutral.

00:12:41

And we have a guy who says he’s going to get rid of it.

00:12:44

And we have a fight looming. The composition of

00:12:46

the FCC is fixed by statute. So maybe if Congress wanted to, they could pass a law that would change

00:12:52

the composition of the FCC. But it seems unlikely that we would get any legal reform in the

00:12:58

composition of the FCC, not necessarily because there aren’t people in Congress who’d like to do

00:13:02

that for good reasons and bad, but because Congress can’t even fucking pass a budget.

00:13:08

In general, for the last 10 years, counting on legislation at the federal level being the thing that changes what goes on has been kind of a mug’s game, just because the legislative agenda has been a fucking basket case for a couple of administrations.

00:13:23

How about markets?

00:13:24

Well, markets suck. Telcos

00:13:27

are the original highly concentrated industry. They’re what economists call naturally occurring

00:13:33

monopolies. There’s only one right of way, and the carriers usually have an exclusive lease to it.

00:13:40

It often doesn’t make economic sense to run a second set of wires once the first set of wires is in there. And moreover, the traditional hedge against the telcos in markets has been the online

00:13:51

companies. The online companies make money from a neutral internet; the telcos make money from a

00:13:56

discriminatory internet, because they can extract revenue from the online companies. But now the

00:14:01

telcos are buying the online companies, right? So now Verizon owns

00:14:05

Tumblr. Tumblr was one of the great forces for mobilizing people for Title II in 2015.

00:14:11

Tumblr was pretty missing in action in 2017. And so markets are not our friends. And the other

00:14:18

thing that’s happened is that the winner-take-all economics of networks in general, and our economic system more particularly at this

00:14:26

moment, has turned a lot of last year’s pirates into this year’s admirals. So, like, we had

00:14:34

Netflix as one of the great advocates for a neutral internet, because they

00:14:40

correctly perceived that one of the things that really was at risk in a discriminatory

00:14:45

network environment was that cable operators in particular, who viewed Netflix as a direct

00:14:50

competitor and were worried about cord-cutting, would block Netflix and then charge extra

00:14:57

for Netflix access and use the fact that videos were big files to go after Netflix and say,

00:15:04

well, we’re not blocking Netflix. We’re just blocking

00:15:06

certain kinds of big files that are latency sensitive because they interfere with our

00:15:10

network management. Now, Netflix, since the 2015 fight, has become so interwoven into our lives

00:15:18

that now in an investor call, their CEO said, you know, we still think network neutrality is important, but it’s not an

00:15:25

existential risk. Because we as Netflix are pretty sure that any ISP or any cable operator that

00:15:34

blocked us would face such howls of outrage from their customers that we would be able to come back.

00:15:41

And so we feel okay about our future with this. And we’ll fight

00:15:46

for it, but it’s not a fight to the death. So that’s bad news. So it’s bad news about markets.

00:15:52

It’s not great news about law. Pretty good news about technology. But where we have amazing news

00:15:57

is in norms. People care about network neutrality. Decrying network neutrality right now,

00:16:04

like saying, I don’t like network neutrality,

00:16:05

makes you sound like a colossal asshole.

00:16:08

Network neutrality is counted as a victory that we have won.

00:16:13

People believe that they have the right to net neutrality.

00:16:17

Millions of people wrote to their lawmakers on July the 12th.

00:16:20

Like more than 10 million wrote and called lawmakers on July the 12th. There’s

00:16:25

evidence in hand from the telcos that Title II didn’t affect their investment in network

00:16:33

infrastructure. One of the things they said is if you regulate us and make us have a neutral

00:16:36

internet, we won’t spend any money on network infrastructure. Then when they started having

00:16:40

to make investor calls where like if they were caught lying, they would be personally criminally liable for telling a lie.

00:16:48

They said, actually, although we said that we would change our investment as a result of net neutrality, we’re not going to change our investment as a result of net neutrality.

00:16:56

And so norms are huge.

00:16:59

And we have millions of Americans who today care about telecoms policy.

00:17:04

I mean, telecoms policy.

00:17:06

I find it difficult to remain interested in telecoms.

00:17:11

I mean, it is the most boring subject imaginable,

00:17:15

and yet we have arrived at this bizarre moment in 2017

00:17:19

where millions of people who are not network engineers,

00:17:23

have no money invested in phone

00:17:25

companies, are not in any way specialized in telecoms policy, really have deeply held beliefs

00:17:33

in telecoms policy. It is almost impossible to overstate how fucking weird it is that people

00:17:40

care about telecoms policy. And that’s an amazing thing, right? That is an asset that the fight over network neutrality

00:17:47

has notably lacked for its entire history, right?

00:17:52

The biggest deficit network neutrality had

00:17:54

was that nobody even cared about

00:17:57

the whole domain of telecoms policy,

00:17:59

let alone this one weird corner of it.

00:18:02

So are we winning or losing on telecoms policy?

00:18:05

Well, it’s both, right?

00:18:06

Last year, we leveraged our norms to win a legal battle.

00:18:08

This year, the legal side took a huge hit from the market side

00:18:11

because the government got bought out by big telco.

00:18:14

But this year, we got bigger norms, right?

00:18:16

Our normative power keeps growing

00:18:18

because even the politicians who got bought,

00:18:22

they got elected because they were backed by net roots who really care about an open Internet on both sides.

00:18:30

Any net roots insurgent candidate really, really has a base that cares about network neutrality because they all view their political power as emanating from the ability to have open and free networks where they can talk to one another and organize and rally.

00:18:47

And so that includes the Trumpist side of the Republican Party as well.

00:18:52

The politicians themselves might favor a more controlled lockdown network

00:18:57

because now that they’ve been elected, they’re not all that interested in insurgencies.

00:19:00

They would rather have some stability.

00:19:03

But their base really, really understands

00:19:05

that this is important. So are we winning or losing? Never mind. Winning or losing is the

00:19:12

wrong sort of question. The right sort of question is, how do we win more than we are right now,

00:19:16

and how do we make our adversaries lose more? This is like a security problem. When you’re

00:19:21

red-teaming an adversary, when you’re trying to figure out what weaknesses your adversary has, you look for their weak points and you attack them with your

00:19:28

strengths. And our strength is in norms. You know, net neutrality, it’s not regulating the internet.

00:19:35

Telecoms is regulation, right? There is no telecoms without regulation. You know, like a

00:19:41

phone company without regulation is an impossible undertaking because unless you can get a government to give you the right to put poles on every corner and dig into everyone’s basement and dig trenches along the long rights of way that go from New York to Los Angeles and to every other major point in the country, then you will spend trillions and trillions of dollars and still end up with like the holdout problems where you know

00:20:05

you want to buy that last house that has that last quarter mile’s worth of a place where you

00:20:10

need to run your wire, and there’s no other way around it, and those people raise the price to

00:20:15

your entire marginal operating profit for the next 10 years, because they know they can, because you’ve

00:20:20

got all these sunk costs. So without a government to impose a regulation that gives you

00:20:25

a subsidy in the form of these rights of way, there’s no ISP, there’s no telecoms, there’s no

00:20:29

phone companies. Phone companies are just regulations. So we’re not arguing about whether

00:20:34

there should be regulation. We’re arguing about whether the regulation will be in the service of

00:20:38

the people or the incumbent telecoms giants. And will we win this time? That’s the wrong question.

00:20:43

It would rock to get Ajit Pai

00:20:46

to back off, to win an important victory against this dingo babysitter who showed up promising to

00:20:52

kill net neutrality and make him back down. That would win us a huge legal and normative fight,

00:20:58

right? Because we would be emboldened. Millions more people would enter into this weird thing of having a strong point

00:21:05

of view about obscure technical subjects of telecoms policy. And it would also rock to kick

00:21:11

his ass in a court of law, right? To have him pass this rule and then show in a court that he was

00:21:15

fucked up and wrong. Either one would be a huge morale booster and either one would create a lot

00:21:20

more net neutrality stakeholders who were pirates still and not yet admirals.

00:21:29

When net neutrality rules are in place, new businesses can start.

00:21:35

And when net neutrality rules are nuked, small businesses can’t start and big businesses consolidate their gains.

00:21:42

So, you know, telecoms advocates in the 1980s, they thought that they’d won when the DOJ broke up AT&T. But then the carriers came roaring back because, of course, they did.

00:21:46

Our victory didn’t cause all those greedy,

00:21:48

anti-competitive telecoms executives

00:21:50

to, like, convert to Buddhism and move to an ashram.

00:21:54

They hadn’t lost.

00:21:56

They just had this, like, business, legal,

00:21:58

and normative setback.

00:21:59

They worked at the margins, like Voldemort,

00:22:02

recovering his powers on the back of some poor asshole’s head,

00:22:05

nursing themselves back to power, merging and coalescing, working in Congress.

00:22:10

And it helped that at the same time, you know, finance capital was devouring the world

00:22:13

and like favoring firms that got bigger and bigger and gutting the competition rules and so on.

00:22:22

And maybe, you know, we can outspend them.

00:22:25

Maybe we can win on the market side finally.

00:22:28

The total value of the ecosystem that they’re strangling

00:22:30

with network discrimination is much larger

00:22:33

than the petty profits that they get to trouser by discriminating.

00:22:37

But every pirate wants to be an admiral.

00:22:39

And so the insurgents who bankroll net neutrality today

00:22:42

will be tomorrow’s telecom chums,

00:22:44

happy to slip big ISP a few bucks for premium carriage

00:22:47

if it means their upstart competitors can be walled off from their customers.

00:22:51

It’s like that final line in Animal Farm.

00:22:53

The creatures look from pig to man and man to pig,

00:22:56

and it’s impossible to tell which is which.

00:22:59

So just like it’s useless to ask if we’re winning or losing,

00:23:02

it’s useless to ask whether companies are on our side or not on our side. Companies aren’t on our side. The right question is, how can we improve

00:23:11

our situation and which companies can we enlist to do so right now while they have a confluence

00:23:17

of interest with us? So I want to apply this to some other issues. One is backdoors. So EFF, we made our bones with fighting backdoors in crypto.

00:23:28

In 1992, we represented this mathematician, computer scientist named Daniel J. Bernstein,

00:23:33

DJB, who was arguing that he had the First Amendment free speech right to write code that

00:23:39

embodied strong cryptography and put it on the internet, even though the NSA said that was a munition.

00:23:46

And we won.

00:23:50

And it was a triumph of law over norms, markets, and code because we had tried to win with code.

00:23:54

We tried to just make strong crypto available,

00:23:58

and that hadn’t been sufficient.

00:24:00

No one was starting businesses to do it.

00:24:01

No one was bringing it to people.

00:24:03

People weren’t sure if they could export it or use it in their products.

00:24:07

We tried showing that the cipher that the NSA said was sufficient for civilian use was insufficient.

00:24:16

So John, who’s sitting over there, built a computer called the DES Cracker

00:24:20

that for about a quarter million dollars in about two hours could exhaust the entire key

00:24:25

space of this cipher that the NSA wanted us all to use to protect our medical records, our banking

00:24:31

records, our military secrets, everything. And he showed that all of that could be taken over for a

00:24:37

quarter million dollars in a couple of hours. It didn’t work.
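The arithmetic behind that demonstration is easy to check; here is a quick sketch using the figures cited in the talk:

```python
# Back-of-the-envelope math for the DES Cracker: DES uses a 56-bit key,
# so a brute-force search faces at most 2**56 candidate keys.
keyspace = 2 ** 56                          # 72,057,594,037,927,936 keys
hours = 2                                   # the rough figure cited above
keys_per_second = keyspace / (hours * 3600)

print(f"{keyspace:,} keys in the full DES keyspace")
print(f"~{keys_per_second:.1e} keys/sec to exhaust it in {hours} hours")
# On the order of 10**13 keys per second, which is what a roughly
# $250,000 purpose-built machine showed was practical.
```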

00:24:42

We tried to get the finance industry to show up and say, our banks need better security.

00:24:46

That didn’t work.

00:24:47

No one understood crypto.

00:24:49

That didn’t work.

00:24:51

We tried all of that stuff, but what eventually carried the day was this legal strength that we had, which is that the First Amendment protects expressive speech.

00:25:01

And the way programmers express themselves is by writing code.

00:25:04

And so code is a

00:25:05

form of expressive speech. And the Ninth Circuit, and then the Ninth Circuit appellate division,

00:25:08

excuse me, upheld our argument on behalf of DJB that code was speech and that

00:25:15

crypto should be given to civilians. And we won this fight. Now, this law is very U.S.-specific.

00:25:23

No other countries have an exact analog to the First Amendment,

00:25:26

and even countries with strong free speech protections

00:25:28

fall short of the kind of absolute free speech protection

00:25:31

that was invoked in Bernstein.

00:25:33

So, for example, Canada has the Charter of Rights and Freedoms,

00:25:36

and there’s the Human Rights Directive in the EU,

00:25:40

but under both of those, it’s probably okay to ban strong crypto on speech grounds,

00:25:46

whereas in the US, it’s not. And so the US became a guardian of the world’s working crypto.

00:25:52

Whenever anyone in one of these other countries that had weaker free speech laws

00:25:56

proposed backdoors for crypto that would allow cops or governments to spy on people,

00:26:04

the response was, well, we could pass that shitty, stupid law,

00:26:07

but the US is going to just keep on exporting strong working crypto to our people, and then

00:26:13

that’s just not going to work very well. This legislation will not achieve its stated goal.

00:26:17

So the US fights, they’re very important. That’s why everyone was watching so closely after the

00:26:22

San Bernardino shootings when Apple was being inveigled by the FBI to introduce backdoors into iOS

00:26:29

to break into the iPhone that the San Bernardino shooters had used.


00:26:36

So what happens when we apply Lessig’s four factors to backdoors?

00:26:40

Well, on the norm side, I think we’re starting to win here, right?

00:26:43

We’ve had a couple of giant leaks of hoarded vulnerabilities that the spy services had either discovered or deliberately

00:26:52

introduced that turned out to be like really bad news. So the CIA had all these leaks called the

00:26:57

Vault 7 leaks. These were backdoors or vulnerabilities in commonly used computers and systems that the CIA had discovered, that they’d kept a secret,

00:27:10

and that they had learned how to exploit

00:27:11

so that they could attack their adversaries.

00:27:14

And when they leaked onto the Internet,

00:27:16

they were taken over by petty criminals and by foreign governments

00:27:20

and used to attack Americans and other people.

00:27:23

And all of a sudden, people who never really understood this kind of backdoor slash zero-day hoarding issue started to pay

00:27:29

attention. We hit a kind of tipping point. Then the Shadow Brokers happened as well. The Shadow

00:27:35

Brokers was an NSA leak, much the same. And then the Shadow Brokers leaks got integrated into an

00:27:41

old piece of ransomware and turned into a new piece of ransomware called WannaCry, which shut down hospitals, financial institutions, mass transit systems, aviation

00:27:50

systems all over the world. And the amazing thing about WannaCry and these other ransomware attacks

00:27:57

is that although historically the argument has been we need to retain these cyber weapons so we can attack, like, evil supervillains.

00:28:06

The actual use of these cyber weapons when they appeared in the field was they were wielded by dum-dums, right?

00:28:14

So, like, does anyone know what the ransom was for WannaCry?

00:28:18

$350.

00:28:19

$350.

00:28:20

You had to give the anonymous dum-dums who operated WannaCry $350 to unlock your hospital.

00:28:28

Now, in theory, they could have asked for more than $350.

00:28:32

Why didn’t they ask for more than $350? They had no idea what they had.

00:28:35

We worried about supervillains, and it turned out that we got this in the hands of dum-dums.

00:28:41

Meager villains.

00:28:47

Well, the problem is that we kind of feel like we can game out the motivations of supervillains. Like, you know, Kim Jong-il is going to do

00:28:53

this or that based on, you know, these vast geopolitical forces and what’s going on in

00:28:58

other parts of the world. But, like, unpredictable, unstable people who don’t really understand what

00:29:03

they have, they’re like people who smash your car window for the quarter you left on your dashboard.

00:29:09

You can park your car under a streetlight and make sure that you’ve got it insured and so on.

00:29:16

And generally that protects you against car thieves, but it doesn’t protect you against the person who’s so drunk that they smash your car window for a quarter.

00:29:24

And we have put massive, super-powerful cyber weapons in the hands of the kinds of people who smash your car window to get

00:29:30

the quarter. So, on top of that, this has really pushed the normative discussion. The

00:29:38

other thing that’s really pushed the normative discussion against backdoors is that there is

00:29:42

and remains no evidence that supervillains are going dark.

00:29:45

So the spy agencies keep saying, well, someday all the bad guys will go dark and we won’t be able to

00:29:51

spy on them, or they’ll all use encryption and we won’t be able

00:29:56

to figure out what they’re doing, and millions of people will die, and it’ll be your fault because

00:30:00

you invented this cool cryptography without understanding that bad guys would use it.

00:30:05

And although eventually bad guys

00:30:07

are going to figure out better OPSEC,

00:30:09

generally speaking,

00:30:10

the terrorists who attacked Paris,

00:30:14

they left a laptop in a garbage can at the scene

00:30:17

with an unencrypted folder on the desktop

00:30:19

that was called secret plans.

00:30:22

There’s just no evidence

00:30:23

that the bad guys are going dark.

00:30:26

And there’s an increasing appreciation of the fact that the spy agencies

00:30:31

can do a shit ton with metadata without having to actually read your email.

00:30:35

Just by knowing, you know, Alice sends a message to Bob,

00:30:38

and Bob sends a message to Carol, and then Carol does a thing,

00:30:43

that is itself a very useful piece of knowledge that the spy

00:30:47

agencies were never able to get hold of before, and now they have access to it in huge numbers.

00:30:53

And so they actually have a much better view into what people are doing and what they have done

00:30:58

retrospectively than they ever had. In contrast to the story of going dark, what’s really happening is they’re getting superb telemetry on everyone on Earth.

00:31:08

And their claims of helplessness in the face of computerization just ring hollow.
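A tiny sketch of why bare metadata is that revealing: given only sender and recipient records, with no message bodies at all, a contact graph falls right out. The names and records below are invented for illustration.

```python
# Toy contact-graph analysis from metadata alone: (sender, recipient,
# timestamp) tuples, no message contents. All records are invented.
from collections import defaultdict

metadata = [
    ("alice", "bob",   "2017-08-30T10:00"),
    ("bob",   "carol", "2017-08-30T10:05"),
    ("carol", "dave",  "2017-08-30T10:20"),
]

contacts = defaultdict(set)
for sender, recipient, _ts in metadata:
    contacts[sender].add(recipient)

def reachable(start: str) -> set:
    """Everyone the starting person can reach through the message graph."""
    seen, stack = set(), [start]
    while stack:
        for nxt in contacts.get(stack.pop(), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(reachable("alice"))  # {'bob', 'carol', 'dave'}, order may vary
```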

00:31:16

So there’s also a counterforce to this on the norm side.

00:31:21

There’s a lot of hysteria about the dark net.

00:31:25

There are a lot of negative uses of cryptocurrency, like real, actual, non-hysterical ones. Like, cryptocurrencies are now

00:31:31

like the go-to way to get paid for a kidnapping, for example. But, you know, the norm story,

00:31:39

it’s a little of this, a little of that. On the market side, all good news.

00:31:45

Full disk encryption is now standard

00:31:47

in almost every operating system, mobile, desktop.

00:31:50

It’s just become like the kind of default.

00:31:54

Nobody would try and sell a commercial OS

00:31:56

that didn’t have full disk encryption.

00:31:58

It’s hilarious to watch the spy agencies

00:32:00

try to figure out how to talk about this

00:32:02

because, you know, like five, ten years ago,

00:32:04

the argument went,

00:32:05

you guys are making full-disk encryption tools,

00:32:10

but your customers don’t want them.

00:32:12

You are out of step with the rest of the world.

00:32:14

You should stop making full-disk encryption.

00:32:17

And now they’re saying,

00:32:18

you guys have all these customers

00:32:21

who want full-disk encryption,

00:32:22

but you know that full-disk encryption

00:32:24

is a force for evil.

00:32:27

And so even though you’re in step with the world,

00:32:30

you should stop making full-disk encryption, right?

00:32:32

It’s kind of having their cake and eating it too.

00:32:35

We have WhatsApp on the market side.

00:32:39

We have this tool fielded by Facebook

00:32:43

that’s used by billions of people that now uses super shit-kicking end-to-end encryption.

00:32:49

They’ve made some compromises on the way that make it slightly weaker than it could have been.

00:32:54

But it’s still very strong and very good.

00:32:57

And they’re acclimating millions of people to the idea that they should have end-to-end cryptography in their messaging. And it would be completely unlikely for someone to enter the market,

00:33:07

in the West anyways today,

00:33:08

with a messaging tool that didn’t have end-to-end crypto.
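For a feel of what end-to-end means here, the sketch below shows the core idea only: the two endpoints derive a shared key that the relaying server never sees. This is emphatically not the Signal protocol WhatsApp actually uses (no ratcheting, no identity verification), and it assumes the pyca/cryptography package.

```python
# Toy end-to-end encryption: key agreement plus authenticated encryption.
# Not WhatsApp's actual protocol; just the core idea, for illustration.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(shared_secret: bytes) -> bytes:
    """Stretch the raw shared secret into a 32-byte AES key."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-e2e-demo").derive(shared_secret)

alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key and
# arrives at the same secret; a server relaying ciphertext never sees it.
alice_key = derive_key(alice_priv.exchange(bob_priv.public_key()))
bob_key = derive_key(bob_priv.exchange(alice_priv.public_key()))
assert alice_key == bob_key

nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at center camp", None)
print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))
```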

00:33:11

Now, the bad news on markets is that AT&T made tens of millions of dollars

00:33:18

selling backdoors into their network.

00:33:22

This was a thing revealed

00:33:24

in the Snowden leaks, and then reported

00:33:26

in the New York Times by Henrik Moltke

00:33:28

and Laura Poitras.

00:33:29

The zero-day trade is alive and well

00:33:31

and countries around the world are

00:33:34

able to buy their way into a surveillance

00:33:36

state by

00:33:37

just buying in tools from

00:33:39

Western countries

00:33:40

and the companies that are headquartered there.

00:33:44

So we have a client at EFF,

00:33:46

Mr. Kidane, who’s an Ethiopian dissident in exile who lives in Washington, D.C., who left Ethiopia

00:33:54

under threat of his life. And in D.C., the Ethiopian government hacked him using a tool

00:34:00

from an Italian company called Hacking Team, and broke into his Skype, figured out who

00:34:06

his friends were in Ethiopia, and then tracked them down and arrested them and tortured them.

00:34:11

And so the new existence of turnkey surveillance states, again, this is kind of the dum-dum

00:34:17

problem again. States that have no internal IT capacity are becoming super sophisticated surveillance states.

00:34:26

And generally, I think it’s understood or at least believed that the Western states that sell them

00:34:32

or the technologically developed states that sell them these surveillance tools backdoor the surveillance tools.

00:34:37

They perform what’s called third or fourth party collection,

00:34:40

where they spy on another country’s spy agency to collect all of their data,

00:34:44

as well. That’s another Snowden revelation.


00:34:49

So on the law side,

00:34:52

well, we still have the First Amendment and Bernstein.

00:34:55

The First Amendment is intact.

00:34:56

Bernstein is intact.

00:34:58

But hard cases make bad law.

00:35:01

Terrorist attacks are a thing.

00:35:03

Eventually, terrorists will use crypto. We

00:35:05

will have cases that emerge out of that where siding against a backdoor

00:35:12

in crypto will be, at least in many people’s eyes, siding with the

00:35:17

terrorists. There’s a version of this playing out right now. There’s a

00:35:21

free-speech version, which is: should ISPs and domain

00:35:25

registrars and top-level domain systems, should they be in charge of policing content? And there

00:35:33

are a lot of people who, if they thought about it at all, probably would have said no until it

00:35:37

became a super convenient way to make the Daily Stormer disappear. And the Daily Stormer is odious

00:35:42

and terrible, and all of us hate it, I suppose. I

00:35:45

certainly do. But generally speaking, when we are trying to get those companies not to police

00:35:52

content, we’re trying to get them not to police the content of people you agree with, because the

00:35:56

people you agree with tend to have less power and money than, you know, rich, powerful people

00:36:01

who are generally on the same side as the Daily Stormer,

00:36:05

or at least parts of the Daily Stormer’s agenda, if not the whole agenda.

00:36:08

It is rare for an organization like the Daily Stormer to be the one in these crosshairs.

00:36:13

And if we make it much easier to get rid of websites like the Daily Stormer,

00:36:17

mostly we’re going to be using those rules to get rid of websites like Black Lives Matter.

00:36:20

And so it’s become really hard for people who advocate that those private entities who

00:36:27

operate these choke points on the internet should not be content regulators to talk about why the

00:36:35

Daily Stormer, even if we don’t want it on the internet, the right way to get rid of it is not

00:36:40

to target those people. And, you know, that’s going to be, think about that in the context of terrorism.

00:36:47

There’s going to be a lot more people

00:36:48

who feel very deeply about why terrorism is wrong

00:36:50

than are upset about the Daily Stormer.

00:36:52

And imagine what it’s going to be like

00:36:54

arguing against backdoors then.

00:36:58

So we already see that emerging.

00:37:00

In Australia, the prime minister

00:37:04

and their security minister have both

00:37:06

advocated for all of the Five Eyes countries

00:37:08

that’s Australia, New Zealand, Canada,

00:37:10

the US, and the UK

00:37:12

mandating backdoors.

00:37:14

The Prime Minister of Australia

00:37:16

gave a press conference where he said

00:37:17

well, the laws of mathematics are all

00:37:20

well and good, but I assure you that in Australia

00:37:22

the laws of Australia are the laws

00:37:24

of Australia.

00:37:30

And that, you know, the laws of mathematics can go hang because we have a legislature.

00:37:36

Presumably he doesn’t feel this way about the law of gravity or any of the other physical laws, just the laws of mathematics.

00:37:41

In China, they’ve banned VPNs unless they have a backdoor.

00:37:46

That ban is actually rather old, but they started enforcing it against mobile apps,

00:37:50

and Apple and the Android stores have gone along with them.

00:37:55

Now, so the law space, not so good, but the code space is in pretty good shape,

00:37:58

because crypto works.

00:38:01

If there’s a problem with crypto right now,

00:38:04

it’s not the crypto,

00:38:06

it’s that we are increasingly engaged in computing models that treat their owners or users as their adversaries, where we

00:38:12

have devices whose role is to figure out how to stop you from doing something, not help you do

00:38:18

something. And the crypto only works if it’s performing its job faithfully. If it’s betraying you, if it’s badly done, if it has a backdoor, or if it’s in some other way not fit for purpose, then crypto stops working.

00:38:33

It only works when it works.

00:38:34

And designing computers to be adversaries of their users sets the stage for computers that betray their users at the end point.

00:38:43

So the math is intact.

00:38:45

Crypto totally works.

00:38:46

It’s just not performed well

00:38:47

because at the end point in your laptop,

00:38:50

in your mobile device,

00:38:51

and your other IoT embedded systems device,

00:38:55

the manufacturer has decided to lock you in

00:38:58

using a system that deliberately hides itself

00:39:01

from your inspection

00:39:01

and invokes laws and other systems

00:39:04

to prevent you from modifying it, understanding it, or plumbing its depths.

00:39:09

I call that the war on general purpose computing.

00:39:13

In the war on general purpose computing, we have these devices.

00:39:16

They started as games consoles and entertainment devices and DVD players and inkjet printers

00:39:23

that tried to be computers that could run all the programs

00:39:27

except for one or two that pissed off the manufacturer.

00:39:31

So, like, a DVD player was the computer

00:39:33

that could run all the programs

00:39:34

except for the one that let you play a disc from out of region.

00:39:37

And a Nintendo was a games console,

00:39:42

was a computer that could run all the programs

00:39:43

except for the ones that hadn’t been blessed by Nintendo.

00:39:46

And an inkjet printer

00:39:49

was a computer that could run all the programs

00:39:52

except for the program that allowed you

00:39:55

to bypass the handshake between the cartridge and the printer

00:39:57

so that you could use third-party ink.

00:40:00

And this is generically called DRM

00:40:04

or digital rights management.

00:40:06

And it’s a stupid idea.

00:40:08

You can’t hide secrets in equipment that you give to your adversaries.

00:40:14

So if you expect that no one is going to figure out where the cryptographic keys are that stop you running these unauthorized programs in a device that you hand over to your adversary, you’re going to be disappointed.

00:40:26

For the same reason that even if you design a really good bank safe and put it in the bank robber’s living room,

00:40:29

you’re going to be disappointed.

00:40:30

That your adversaries, if they can take the equipment

00:40:33

you’ve hidden the secrets in to their own premises,

00:40:36

where they have access to things like electron tunneling microscopes

00:40:39

and decapping tools and protocol analyzers,

00:40:43

they will eventually figure out what secret you hid in the hardware.

00:40:46

And then once they find it, if they tell everyone else about it,

00:40:49

then they’ll know what that secret is too,

00:40:50

and then everyone can run every program they want on their computer.
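A toy illustration of that point follows, with an invented one-byte "key": if the secret ships inside the device, the device's owner can simply read it out.

```python
# Toy DRM scheme illustrating "you can't hide secrets in equipment you
# give to your adversary". The key and cipher are deliberately silly;
# real schemes are fancier but face the same fate.
SECRET_KEY = 0x42  # "hidden" inside every shipped player

def drm_transform(blob: bytes) -> bytes:
    # Single-byte XOR stands in for whatever cipher the vendor picked;
    # the same function locks and unlocks.
    return bytes(b ^ SECRET_KEY for b in blob)

locked_movie = drm_transform(b"feature presentation")

# The attacker owns the device, so they can read the key out of it:
# the equivalent of decapping the chip or dumping the firmware.
extracted_key = SECRET_KEY
print(bytes(b ^ extracted_key for b in locked_movie))  # b'feature presentation'
```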

00:40:53

And since we don’t know how to make an actual computer

00:40:55

that runs all the programs except for the ones we dislike,

00:40:58

and all we can do to approximate it is a computer

00:41:01

that runs all the programs but has some kind of master program

00:41:04

that watches to see if you’re running a program that hasn’t been authorized and tries to shut it

00:41:08

down, then as soon as someone gets the secret that allows them to run any program, all bets are off. So

00:41:16

DRM is stupid, but it’s not harmless. Because of its very fragility, DRM advocates aren’t really

00:41:23

interested in the DRM per se, that is, in making

00:41:26

proprietary software that hides decryption

00:41:28

keys from the users.

00:41:29

What they really want is the law that

00:41:32

goes with DRM. And in

00:41:34

the US, that’s a law called DMCA

00:41:35

1201, Section 1201 of the

00:41:38

1998 Digital Millennium Copyright Act.

00:41:40

And under Section 1201

00:41:42

of the DMCA, bypassing

00:41:44

or removing or weakening one of these locks

00:41:46

that tries to control which software you run on your own computer

00:41:49

is a felony punishable by up to five years in prison

00:41:52

and a $500,000 fine for a first offense

00:41:55

And DMCA 1201 has come to mean that corporations

00:41:59

can force you to use your property

00:42:01

only in ways that make them the most money

00:42:04

on pain of criminal

00:42:05

prosecution. So if a phone vendor makes more money when you buy apps from its app store,

00:42:11

instead of from the person who wrote the app directly or a third-party app store,

00:42:15

then all they need to do is design their device so that changing which app store it uses involves

00:42:20

bypassing DRM. And since bypassing DRM is a felony, then changing which apps you buy

00:42:26

and who you buy them from is also a felony.

00:42:29

So conceptually, there’s no difference

00:42:31

between this and designing a toaster

00:42:34

that uses a vision system

00:42:35

that forces you to use official bread

00:42:38

or a dishwasher that uses RFIDs

00:42:41

to make sure you only wash authorized dishes.

00:42:43

You can even make all the same arguments, right?

00:42:45

Well, what if you were to put a bagel in your toaster

00:42:47

and set your house on fire?

00:42:48

Or we only have dishes that are optimized

00:42:50

to prevent foodborne illnesses.

00:42:52

This is a safety thing.

00:42:54

Plus, think of the warranty expense we would have to assume

00:42:56

if you could toast anyone’s bread in your toaster.

00:42:59

Think of the maintenance headaches that we would get into.

00:43:02

And, you know, the people who design the dishes

00:43:03

for our dishwasher, well, they have an intellectual property interest in those dishes.

00:43:07

And if we allow you to put third-party dishes in your dishwasher,

00:43:10

then pirates could pirate those dishes

00:43:12

and deprive our hard-working independent dish vendors

00:43:16

of their rightful royalties on their dishes.

00:43:20

So there’s not really any reason not to invoke this

00:43:24

from a firm’s perspective if they can.

00:43:27

And as soon as you have software in your device, you can invoke these rules, right?

00:43:30

These rules become available to you.

00:43:32

And it becomes a kind of license to force your customers to arrange their affairs to the benefit of your shareholders instead of them.

00:43:42

So if it’s your property,

00:43:45

generally speaking,

00:43:47

we say you have the right to decide what to do with it.

00:43:48

If having software and DRM in a device

00:43:51

means you don’t get to decide

00:43:52

how to use your property,

00:43:53

it’s not your property anymore.

00:43:55

In that Blackstonian sense of property

00:43:57

being the thing that you have

00:43:58

the sole and despotic dominion over

00:44:00

to the exclusion of all other people,

00:44:03

it’s only your property

00:44:04

if you

00:44:05

can decide how you want to use it, whether you want to toast third-party bread, whether you want

00:44:09

to run third-party software. So that’s a benefit and not a problem for the firms that are invoking

00:44:17

it. They would like to have you be a tenant of your property forever because tenants pay rent

00:44:22

every month, whereas owners, they sever their

00:44:27

economic relationship with the firm and only rekindle it if the firm can come up with an offer

00:44:32

that they choose to take up because it’s the best offer they can get. If a firm can exploit your

00:44:38

sunk costs in their technology and their tools to continue to sell you aftermarket parts and services, then why wouldn’t they?

00:44:46

So a bonus side effect of this is that because revealing a defect in a DRM system, in a copy

00:44:53

protection system, could potentially weaken it, it too has become a thing that people

00:44:59

invoke this law over.

00:45:01

And so reporting on a defect in a system that has this copyright stuff in it

00:45:05

can become a felony. And that means that telling someone about a flaw in a device that belongs to

00:45:12

this increasing constellation of diverse tools that embody things like pacemakers and cars and

00:45:21

tractors, that telling them about mistakes that the programmers made, defects

00:45:25

that the manufacturers shipped, that is also potentially a felony, also potentially a source

00:45:31

of civil liability, also potentially punishable by a five-year prison sentence and a $500,000

00:45:36

fine for first offense.

00:45:38

So the security research community went to the copyright office about this in 2015 when

00:45:44

they had hearings on it.

00:45:45

They’re about to open another set of hearings now.

00:45:47

And they said that they’re sitting on vulnerabilities they’ve never disclosed in devices as diverse as farm equipment, medical implants, voting machines, and key infrastructure.

00:45:59

Because they feel that if they were to reveal these defects to the people who rely on these systems, they might face copyright prosecution for revealing these defects.

00:46:09

So this is bonkers, right?

00:46:11

On the one hand, you have this incentive to use this stuff to lock up your devices,

00:46:17

and on the other hand, the more devices there are that have it,

00:46:19

the harder it is to secure those devices because security researchers can’t come forward.

00:46:25

So where do we stand on DRM?

00:46:27

Code.

00:46:28

It’s not hard to break DRM.

00:46:30

It’s a fool’s errand, and this is not going to change.

00:46:32

Markets.

00:46:33

Well, there’s more DRM than ever, and it’s making bank for companies,

00:46:37

and that’s not going to change until the law changes.

00:46:40

GM, for example, started putting DRM in the telemetry from its engines,

00:46:44

and now they charge

00:46:45

$70,000 for the diagnostic tool to read out the telemetry from the engine before you fix

00:46:50

it.

00:46:51

That tool has a cost of goods of like $100, right?

00:46:55

But the official tool costs $69,900 more just as a kind of GM digital rights management

00:47:04

tax that you have to pay. Why wouldn’t GM go on

00:47:07

using that if they could? This is a huge drag on the economy, especially on small and medium

00:47:15

enterprises, especially on repair companies, who end up paying this tax to become official repair entities for the firms. So SMEs that do repair generate 200 jobs

00:47:30

per kiloton of e-waste, whereas SMEs that do recycling of e-waste generate 15 jobs per kiloton.

00:47:37

So that is a huge difference. Again, that’s 200 jobs to repair 1,000 tons of electronic waste and 15 jobs to recycle it. And that

00:47:46

is all onshore jobs, right? Nobody sends their phone to India or China to get it

00:47:51

fixed. You take it down to the corner shop, where people in your neighborhood

00:47:55

who have bought a couple of manuals from iFixit and a toolkit and who’ve done

00:48:00

some online training, figure out how to fix your phone, your device, and get it

00:48:04

working for you.

00:48:05

So this anti-repair, anti-service stuff,

00:48:09

it’s been a huge drag.

00:48:10

It’s one of the sources of our legal power

00:48:12

because right-to-repair bills,

00:48:14

as opposed to anti-anti-piracy bills,

00:48:18

right-to-repair bills are now making progress

00:48:20

in a bunch of legislatures

00:48:21

because repair is a really easy thing

00:48:23

for people to understand.

00:48:25

We helped a set of researchers at the University of Glasgow do a study on the potential size of the market

00:48:32

for devices that break DRM locks, that break these copyright locks.

00:48:37

They did a small study where they scraped Amazon for DVD players,

00:48:41

and they compared DVD players that had a circumvention feature,

00:48:45

the ability to play downloaded movies, which is illegal in DVD certified tools, with ones

00:48:52

that didn’t. So this is a pure software change. There’s no new hardware in it. There’s no

00:48:57

marginal cost extra to produce this more capable DVD player. And the DVD players that had the

00:49:03

circumvention feature controlling for number

00:49:05

of reviews and date of manufacture and overall review, those devices sold for 50% more than

00:49:12

the commodity DVD players that followed all the rules. Now, the traditional margin on

00:49:17

a commodity DVD player is 2%. So they were able to go from 2% to 50% by selling these devices that broke the circumvention rules,

00:49:29

that broke the DRM rules. So on the one hand, DRM is holding back these markets. On the other hand,

00:49:38

there are a bunch of people who stand to make a lot of money

00:49:41

by raiding these giant margins, right? Like, yeah, GM gets

00:49:45

$70,000 for its repair tool. If you sold a $500 repair tool, you would have every

00:49:51

mechanic in the world beating a path to your door to buy that tool from you, and you would

00:49:56

have a mere 400% margin on your commodity hardware or pure software play that ran on a laptop that

00:50:02

you plugged into a car engine. All we need to do is get rid of this law and we’ll have this constituency to help us.
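The margin arithmetic in that example is worth making explicit. The tool price and cost of goods are the figures cited earlier in the talk; the $500 competitor price is the hypothetical that yields the "mere 400% margin":

```python
# Margin arithmetic for the repair-tool example above.
gm_tool_price = 70_000   # GM's official diagnostic tool, as cited
cost_of_goods = 100      # rough cost of goods, as cited
print(f"GM markup: ${gm_tool_price - cost_of_goods:,}")    # $69,900 "DRM tax"

competitor_price = 500   # hypothetical price for an unofficial tool
margin_on_cost = (competitor_price - cost_of_goods) / cost_of_goods
print(f"competitor margin on cost: {margin_on_cost:.0%}")  # 400%
```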

00:50:09

So on the side of norms, well, we’re definitely at the point where the number of people who

00:50:16

are indifferent to this issue is only going down. More and more people are realizing that

00:50:19

this is a problem. They’re aware that they get ripped off when they buy inkjet cartridges. They’re aware when they buy ovens that only cook food that came from a specific manufacturer.

00:50:30

This is a thing. Juicers that only juice fruit that came from a specific manufacturer. That’s

00:50:34

also a thing. We see cochlear implants that are locked to certain kinds of software, insulin

00:50:41

pumps that are locked to certain kinds of recyclables and consumables. People are increasingly aware that they are being ripped off when they buy

00:50:50

these tools, and they’re fighting back against it. And they’re also aware that there’s a security

00:50:54

dimension, that their Internet of Things devices are insecure, that they’re being used to attack

00:51:00

them and other people. And when they find out that security researchers can

00:51:05

investigate these tools, they become outraged because they know that these tools are very

00:51:09

intimately bound up with their lives. And then on the law side, things are starting to look up.

00:51:14

In 2015, we brought a lawsuit against the U.S. government representing Matthew Green and Bunnie

00:51:21

Huang, who are two well-known, well-liked security researchers. Bunnie is here.

00:51:26

He camps with the Institute on the Esplanade. And we feel like we have a pretty good chance

00:51:33

of winning this lawsuit, although it’s going to take years, and invalidating this law.

00:51:38

So I’m just going to… No, you know what? How much time do I have?

00:51:45

Seven minutes. Seven minutes.

00:51:46

All right.

00:51:47

So I’m going to skip the stuff about the W3C

00:51:49

because I’ve talked about that here before

00:51:51

and it’s been a long haul.

00:51:55

But I guess what I’m going to say to close here

00:51:59

is that it doesn’t make you an extremist

00:52:02

to say that DRM is bullshit.

00:52:05

It just makes you someone who isn’t self-serving.

00:52:09

DRM doesn’t actually work.

00:52:11

It can’t work.

00:52:13

That doesn’t mean that it wouldn’t be great to solve some of our problems

00:52:16

by making computers that only can run certain programs,

00:52:19

but we don’t know how to make that computer.

00:52:21

No one knows how to make that computer.

00:52:22

There’s no theoretical basis for that computer.

00:52:29

Saying DRM is bullshit doesn’t make you a purist or a fanatic. It just makes you not delusional. Just because DRM is bullshit, it doesn’t mean that it’s harmless bullshit.

00:52:34

The more we rely on DRM, the more dangerous it becomes to reveal to people what their computers

00:52:39

are doing and what defects lurk in them. And so it’s a catastrophically bad idea to redesign computers

00:52:46

to treat their owners as adversaries. And it’s also a catastrophically stupid idea, which is why

00:52:52

the companies that are engaged in it, they’re on the wrong side of history. That’s why we should

00:52:57

feel like there is a good chance that we will eventually win, because the future isn’t full

00:53:02

of computers that are almost Turing complete, that almost run all the programs except for the ones that giant firms don’t like.

00:53:10

Even though some quack convinced someone's pointy-haired boss that DRM is a thing, that doesn't make

00:53:15

DRM a thing. We won’t win this fight because it’s not the kind of fight that you win. It’s the kind

00:53:20

of fight you fight forever because so long as computers are important to people, someone will always want to use computers to take away our rights and not to help us

00:53:28

exercise them. The principle of computers to enable and improve the lives of the people who

00:53:33

use them will always need defenders. And that’s where we all come in, every one of us. Thank you.

00:53:41

So we have time for maybe one or two quick questions. I like to call alternately on people who identify as women or non-binary

00:53:47

and people who identify as male or non-binary.

00:53:49

Can we start with someone who identifies as a woman or non-binary, please?

00:53:55

Greatest surprise this year in terms of policy initiatives?

00:53:57

I think we got more numbers on the Net Neutrality Day of Action

00:54:00

than anyone thought we would, like an order of magnitude more.

00:54:03

We went from like 2 million to 12 million.

00:54:05

It was fucked up.

00:54:06

It was amazing.

00:54:08

Right?

00:54:08

It put SOPA in the shade.

00:54:10

Like, that was crazy.

00:54:13

Yeah.

00:54:14

Why do you think that happened?

00:54:15

I just think that we tipped.

00:54:17

I think we hit a critical mass.

00:54:18

I think that procedural shenanigans

00:54:24

are often easier to understand than substantive ones.

00:54:28

I was talking with Jen the tea pourer in there, who also works on this anti-corruption thing where they had some procedural shenanigans this year.

00:54:42

EFF at WIPO at the UN, we were

00:54:44

fighting this really weird policy fight

00:54:46

that would take forever to

00:54:48

describe and is super boring, so I'm not going to get into it.

00:54:51

But

00:54:52

we would do these handouts that we’d get

00:54:54

indie media to translate overnight.

00:54:56

I’d get up at five in the morning and get them copied and bring them

00:54:58

to the Palais des Nations and hand them out.

00:55:00

And someone started stealing our handouts

00:55:02

and putting them in the toilets.

00:55:09

Like hiding them in the bathrooms, right? And, like, all of a sudden people cared, right? They still didn't really understand the substantive issue. A few of them probably, like,

00:55:13

said, oh, well, I should probably pay attention to this because someone’s throwing these papers

00:55:17

in the toilet. But more to the point, like, people were like, you don't throw your adversaries'

00:55:22

papers in the toilet if you’re on the side of

00:55:25

righteousness, right? You know, this is like SOPA or like TPP, right? Like the fact that the meetings

00:55:33

were all secret was itself super useful to us because like everybody understood that you don’t

00:55:38

hold trade negotiations in secret because you want all the people to be pleasantly surprised,

00:55:43

right? Like, you know, there’s really only one reason

00:55:46

to play those shenanigans.

00:55:48

Are there any people who identify as male or non-binary

00:55:50

who’d like to ask a question?

00:55:54

Corey, it’s always an inspiration

00:55:55

to hear you preach and speak.

00:55:58

Thank you.

00:55:59

It’s important.

00:56:00

Like, I don’t feel well today, and I had to be here.

00:56:03

Oh, thank you.

00:56:05

Last year, you talked about something that, you know, you really educated me

00:56:09

and I’ve been carrying it with me for a year and talking to other people about it.

00:56:14

And it was, and I’m not technical so please, you know, help me or give me allowance on that.

00:56:21

But you talked about the issue of code and not knowing,

00:56:26

for instance, like an Internet of Things device or a John Deere tractor. When we’re buying the

00:56:31

device, we’re buying the hardware, we’re only licensing the software, and we are naive to,

00:56:38

A, how that device gathers data and uses data. We are naive to the laws that make it actually a felony

00:56:46

for us to simply audit and peek at that code.

00:56:50

We are naive to the fact that we can’t edit that code

00:56:54

if we so choose to change a function.

00:56:57

We are naive to our inability to optimize that code

00:57:01

if we have creative applications or we want to innovate.

00:57:04

To build on your theme, what’s the progress that we’ve seen on that?

00:57:09

So I do think, I call it like peak indifference. Like we are at the point where the number of

00:57:13

people who give a shit is only going to go up because the number of people who’ve like been

00:57:18

negatively affected is only going to go up, right?

00:57:24

We’ve acquired so much policy debt in the form of bad technology and bad policy

00:57:27

as a result of this lack of understanding

00:57:30

and shitty policy from the regulatory side

00:57:35

that there are a whole bunch of ruptures

00:57:38

that are in our future

00:57:39

that it’s probably too late to stop

00:57:43

and there are so many that have already happened now.

00:57:46

And so really what’s changing is we’re moving from the part of the job

00:57:52

where we try to get people to care about this stuff

00:57:54

and into the part of the job where people show up

00:57:57

having had their lives destroyed by this stuff and say,

00:57:59

what do I do now?

00:58:00

We say, well, here’s EFF Surveillance Self-Defense Kit,

00:58:04

and here’s the free software

00:58:05

Foundation, you can, you know, join them to, like, fight for free and open-source software, or

00:58:09

the Software Freedom Law Center or whatever. And here's, like, some pitchforks and torches and the

00:58:13

home address and phone number of the people whose depraved indifference caused all of your shit to

00:58:17

be ruptured all over the internet forever, right? And, you know, speaking

00:58:23

of someone who’s spent 15 20 years trying to get people to give a shit and is now more in the business of trying to get people who already give a shit to do something, I’m feeling good.

00:58:33

Like, I feel like that’s, that’s better, right?

00:58:36

I mean, it sucks that we have this technology debt and that we're going to be paying on it for a long time, and we are in a race to see if we can get, like, enough people to

00:58:45

do something to effect a change before we reach a tipping point beyond which our technology debt

00:58:51

has, like, unimaginable consequences. And it's like climate in that regard, right? Like, the

00:58:56

carbon that’s in the atmosphere is in the atmosphere it seems unlikely that we’ll be able

00:59:00

to do much to get it out of the atmosphere. Maybe at some very speculative moment in the future we'll do it.

00:59:05

But mostly what that carbon is going to do, it’s going to do.

00:59:08

So now it’s a race to see whether the manifest effects of the carbon in the atmosphere convince us to do something to not put more carbon in the atmosphere

00:59:16

before we reach the point that it’s too late to stop putting carbon in the atmosphere.

00:59:23

So we have to decarbonize the surveillance economy, right?

00:59:26

Like we have to kind of get to the point where we are exerting sort of market forces, norms, laws, and code

00:59:33

to make our devices more obedient, more secure, devices that empower us instead of taking away our power, before we reach the point where these devices

00:59:46

are so widespread, so poorly secured,

00:59:51

have gathered so much data about us

00:59:53

that is in so many giant, badly secured silos

00:59:55

that the potential for mischief is infinite.

01:00:01

Because right now it’s just mind-boggling.

01:00:03

But infinite, I don’t know how we recover from that. But mind-boggling. But infinite, I don’t know how

01:00:05

we recover from that. But mind-boggling, maybe we can, like, you know, when you look at the

01:00:10

kind of the best predictions of the carbon debt that we're under, that's mind-boggling.

01:00:16

But it’s not infinite, right? Like, you know, there is a way that we can kind of weather that

01:00:20

storm. Maybe we can weather the storm. Time?

01:00:26

I’m good.

01:00:29

One more question? Yeah.

01:00:36

I’m gonna sit down.

01:00:39

So it’s always been the case that

01:00:42

the most hackable part of a system is the human part, right?

01:00:46

And it seems like we’re approaching this threshold right now with conversational synthesis

01:00:53

and using artificial intelligence to forge video and audio and documents.

01:01:00

And it seems like we're on the cusp of an arms race about using cryptography to sort of verify documents.

01:01:07

And so I’m curious in terms of like getting people to even agree on the facts when it’s getting easier and easier for a hoax to outrace our ability to like debunk it.

01:01:19

How do you see this whole policy conversation interfacing with that sort of breakdown in consensus reality?

01:01:26

So I break policy questions into two sides.

01:01:30

So one is making things work well, and one is making them fail gracefully.

01:01:35

And so graceful failure is making sure that if you have an idea for how to detect forgeries,

01:01:40

that it’s not illegal to talk about it.

01:01:43

Succeeding well is, like, coming up with great

01:01:45

ideas to detect forgeries. I am in general skeptical of people who claim to be able to make perfect

01:01:51

forgeries, only because they tend to be working from a corpus that was not designed to thwart them,

01:01:57

and without recourse to what a corpus that was designed to thwart them would look like. So

01:02:02

there’s this thing called adversarial stylometry, right, where you have an anonymous

01:02:05

text, and you want to know who

01:02:08

wrote it. And so you

01:02:10

analyze

01:02:12

the text of a bunch of potential

01:02:14

candidates, like, you know, I know it’s one

01:02:16

of these 20 people,

01:02:17

because only these people

01:02:20

were privy to this thing that the whistleblower said

01:02:22

to the journalist. And you pull

01:02:24

it out, and whichever person’s speech is most like the speech in the anonymous block of text,

01:02:30

you then have a confidence rating that that person is your author, right?
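To make that attribution step concrete, here is a toy sketch. The handful of function words and the cosine-similarity scoring are deliberately crude stand-ins for the much richer feature sets real stylometry tools use, and the helper names are hypothetical:

```python
from collections import Counter
import math

# A deliberately tiny feature set; real tools measure far more signals
# (word lengths, punctuation habits, syntax, and so on).
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that",
                  "is", "was", "it", "for", "on", "with", "as"]

def profile(text):
    """Rate of each function word, normalized by text length."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def attribute(anon_text, candidates):
    """Rank candidate authors by similarity to the anonymous text.

    candidates maps each suspect's name to a corpus of their known
    writing; the score on the top-ranked name is the "confidence
    rating" described above, not proof of authorship.
    """
    anon = profile(anon_text)
    scores = {name: cosine(anon, profile(text))
              for name, text in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```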

01:02:34

Well, that’s great, but then someone came up with, like, adversarial stylometry countermeasures,

01:02:40

which is, like, I take my candidate text that I want to release anonymously,

01:02:48

and I compare it to my own text, the text that I, you know,

01:02:50

all of the things that were published by me on the Internet,

01:02:55

and the countermeasure tool tells me how to change it so that it has none of the tells that make it look like my text, right?
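And the countermeasure pass, in the same toy setup, might look something like this, reusing profile() and FUNCTION_WORDS from the sketch above; flag_tells and the background corpus are likewise hypothetical. The idea: find features where the draft tracks your known habits rather than typical writing, then reword around them.

```python
def flag_tells(draft, my_corpus, background_corpus, top_n=5):
    """Rank the function words that most mark the draft as mine.

    A "tell" is a feature where my habit differs from the general norm
    and the draft follows my habit rather than the norm. Reuses
    profile() and FUNCTION_WORDS from the attribution sketch above.
    """
    d = profile(draft)
    mine = profile(my_corpus)
    bg = profile(background_corpus)
    tells = []
    for word, d_rate, my_rate, bg_rate in zip(FUNCTION_WORDS, d, mine, bg):
        if abs(d_rate - my_rate) < abs(d_rate - bg_rate):
            tells.append((abs(my_rate - bg_rate), word))
    # Biggest habit-versus-norm gaps first: reword sentences that lean
    # on these words until the draft's profile drifts toward the norm.
    return [word for _, word in sorted(tells, reverse=True)[:top_n]]
```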

01:03:00

And so now there may be a counter-countermeasure, right?

01:03:04

You know, like the stylometry tool is only measuring like 14 things,

01:03:09

and there are more than 14 things in the way that you communicate.

01:03:13

So maybe the next generation of tools will use seven more things

01:03:17

that are not currently analyzed,

01:03:19

and that your countermeasure tool wouldn’t cause you to vary.

01:03:23

But at a certain point,

01:03:27

in this measure-countermeasure dynamic,

01:03:33

there are moments where it feels like

01:03:35

one side is comprehensively winning.

01:03:38

And it usually turns out that they’re not winning

01:03:39

the way that we think they are,

01:03:41

that there are countermeasures available.

01:03:43

And that, especially when you’re at the start of something where someone is like, I can

01:03:47

use... I mean, I think I know the stuff you're talking about.

01:03:51

I heard these Radiolab episodes about it.

01:03:52

These guys who’ve got software-based video forgery thing.

01:03:57

They did a thing with Obama where they got synthetic Obama to lip sync a speech that

01:04:01

real Obama had made.

01:04:03

But no one’s got a tool to like,

01:04:06

detect whether those forgeries exist. And it would be amazing if they did, because the tool to make

01:04:12

that forgery hasn’t even been released yet. And the fact that no one has that tool yet doesn’t

01:04:17

mean that it’ll never exist. And it may be that like, we just make that tool. And then all those

01:04:22

forgeries look really, you know, dumb and cack-handed and whatever.

01:04:27

It’s hard to say.

01:04:29

I mean, you know, a breakdown of consensus reality.

01:04:32

I don’t know that we’ve I think that our likes consensus reality is probably overstated anyways.

01:04:36

We I think we don’t have a huge consensus to begin with.

01:04:39

There’s a lot of people who live in different worlds and operate from different assumptions.

01:04:43


01:04:48

Is that time?

01:04:48

Yep.

01:04:49

All right.

01:04:50

Well, thank you all very much.

01:04:51

It was great to see you all. Thank you all.

01:04:57

You’re listening to The Psychedelic Salon,

01:04:59

where people are changing their lives one thought at a time.

01:05:05

Well, I hope that wasn’t too political for the snowflakes among us.

01:05:09

And don’t worry, I have no plans to shift to a permanent series of politically tinged talks.

01:05:16

In fact, in the following weeks, I plan on podcasting a new Palenque Norte talk each week.

01:05:22

Although I haven’t listened to them all myself yet,

01:05:25

I would be quite surprised if there is very much more to come of a political nature.

01:05:30

That said, I want to close here today by reminding you about something that Cory said in the talk that we just listened to.

01:05:37

He said that these political struggles to keep our communications channels free and open

01:05:42

isn’t about winning and losing.

01:05:46

It’s about staying alert to each new threat to our freedom of speech that comes along. And so I feel that I should

01:05:51

point out the fact that just two days ago, the number two person in the Justice Department at

01:05:57

the United States gave a speech urging Congress to pass a law that would force companies to install

01:06:03

a backdoor in any crypto software they sell.

01:06:06

In other words, he wants to roll back the progress that has been made to prevent just such a thing from happening.

01:06:13

This is a never-ending story, but it’s one that we need to remain aware of.

01:06:18

And talking about free speech, let’s look at the thing that seems to be the most important thing on the mind of this nation’s president,

01:06:26

and that is the ongoing protest by some brave NFL athletes

01:06:30

who choose to take a knee when the national anthem is played.

01:06:34

First of all, as a Catholic schoolboy,

01:06:37

I was taught that the highest form of respect we could pay to someone or something

01:06:41

is to kneel before it.

01:06:44

So the lawyer in me could argue that these

01:06:47

players are actually giving the anthem even more respect than those who are just standing.

01:06:52

It’s a weak argument, of course, but since I’m a lawyer, I just couldn’t resist pointing that out.

01:06:59

However, in all of the hullabaloo about the athletes taking a knee,

01:07:04

the thing that seems to be lost on most people is why they are doing that.

01:07:09

Their protest is about police brutality.

01:07:12

Don’t forget that the next time you see a player who is making a statement by not standing for the national anthem.

01:07:18

They are demonstrating against police brutality.

01:07:23

It isn’t a protest about playing the national anthem at football games.

01:07:27

And if you don’t think that death by police is a real problem here in the States,

01:07:32

then think about this.

01:07:34

From the beginning of just last year, 2016, until today,

01:07:38

there have been more people killed by U.S. policemen

01:07:42

than have been killed by all of the terrorist attacks in this country

01:07:46

during the past 40 years combined.

01:07:50

So, is the biggest threat to citizens in this country coming from terrorists?

01:07:55

Or is it coming from our highly militarized police forces?

01:08:00

I’ll let you decide that for yourself. I know where I stand.

01:08:04

I’ve been thinking about many other things along these lines that I’d like to say,

01:08:08

but I think that for today we’ve already given you more than enough to think about yourself.

01:08:13

I don’t expect all of our fellow salonners to completely agree with me or with Corey.

01:08:19

In fact, that would be a sorry state of affairs if we all agreed 100% on everything.

01:08:26

But I do hope that,

01:08:30

at least here in the salon, we can keep our discussions about our differences of opinion more civil. So I'm going to begin by doing my part and apologizing for calling some of our fellow

01:08:38

salonners Trump trolls, when in fact they are simply Trump supporters. And I also hope that I can be forgiven for siding with the majority of eligible voters in this country

01:08:49

who voted for none of the above by simply not voting at all.

01:08:54

As I said in my last podcast, I believe that this is a failed state.

01:08:59

And what better evidence of that could there be than the fact that over half of the people in this country

01:09:09

couldn’t bring themselves to vote for either of the two realistic choices we were given.

01:09:15

For most of my life, I wound up voting for the lesser of two evils. But when I finally figured out that either way I was still supporting evil, I decided to become more local and let the empire

01:09:22

destroy itself, without the tacit approval that continuing

01:09:26

to vote would signify.

01:09:28

So for now, this is Lorenzo signing off from cyberdelic space.

01:09:33

Be well, my friends. Thank you.