Program Notes

Guest speaker: Cory Doctorow

cory (at) eff (dot) org
https://www.eff.org/

Update! October 27, 2015

Victory for Users: Librarian of Congress Renews and Expands Protections for Fair Uses
 

Today’s podcast features a 2015 Palenque Norte Lecture given by Cory Doctorow at this year’s Burning Man Festival. Cory is a well-known science fiction writer and lecturer who warns us about how much of our freedom has already been taken by rich corporations through their manipulation of the law and technology. Whether you realize it or not, your personal freedom has already been significantly compromised in the digital domain, and unless you become aware of what is taking place, and then do something about it, Orwell’s world of “1984” will continue to encroach on your life.

“We have not reached peak surveillance … but we have reached peak awareness of surveillance.” -Cory Doctorow

Electronic Frontier Foundation (EFF.org)

Previous Episode

473 - Outcasts and Future People

Next Episode

475 - The Path of a Medicine Woman

Transcript

00:00:00

Greetings from cyberdelic space.

00:00:19

This is Lorenzo, and I’m your host here in the Psychedelic Salon.

00:00:31

And I’m a day late in posting today’s podcast because, well, it wasn’t the one that I had originally intended to produce this week.

00:00:37

In fact, I spent the past weekend working on what I thought you’d be listening to right now.

00:00:43

But then I checked my email yesterday morning, and there was a message from Chris Pezza.

00:00:48

As you know, Pezza has taken over the lead for the Palenque Norte lectures at Burning Man, and he sent me the first batch of this year’s recordings with this brief message.

00:00:55

Quote, you should have access to STE002 within the hour,

00:01:00

containing Cory Doctorow’s talk, which I would highly recommend starting with.

00:01:04

It was an incredible performance that is highly relevant and I think people will go wild for it.

00:01:10

End quote.

00:01:11

Now, one of the reasons that I handed over the Palenque Norte lectures to Pez

00:01:16

is because I trust his judgment implicitly,

00:01:19

and even though I’m no longer an integral part of Palenque Norte, it’s, well, it’s still my baby.

00:01:24

So, taking Pez at his word, I immediately put aside my weekend’s work and began listening to the talk that you and I are about to hear right now.

00:01:33

And I discovered that Pez was correct.

00:01:36

This is a truly important talk and well worth your time to take it in.

00:01:41

Now, I’m sure that most of our fellow salonners know who Cory Doctorow is, because,

00:01:46

well, it’s been almost impossible to miss his name attached to books and articles all over the place.

00:01:52

I’ve also read several of his novels, and just now I thought that I’d check to see how many he has

00:01:57

written. Well, was I ever surprised when I learned that he has over 40 books listed on Amazon.

00:02:10

My God, Cory, I’ve only written five non-technical books in my 73 years,

00:02:14

and I’m worn out just thinking about how much work those few books took.

00:02:18

For what it’s worth, when I was working as a technical writer,

00:02:21

I produced quite a few tech manuals.

00:02:24

But that was a job, you know, with deadlines and a paycheck each month. But to

00:02:25

write novels in the hopes that one will eventually get paid for all of that work takes significantly

00:02:31

more intestinal fortitude than any non-writer can even imagine. Actually, if I remember correctly,

00:02:38

Pez tried to arrange for Cory to speak at the 2014 Palenque Norte lectures, but their schedules didn’t permit it then.

00:02:46

Fortunately, this year, John Gilmore, who is one of the foundation stones for Camp Soft Landing,

00:02:52

where these talks are hosted each year, well, he was able to help Pez book this year’s talk. So,

00:02:58

thank you to John and Pez for this wonderful talk as well. Now, as you know, I’ve been a geek for most of my life, and I’m

00:03:06

still licensed to practice law in Texas, so I thought that I had a pretty good handle on both

00:03:11

law and tech. But when I began listening to this talk, and after Cory had been talking only for

00:03:17

about 15 minutes, I found myself standing up at my desk and shouting, those dirty bastards!

00:03:24

And if you don’t become as incensed about the information that Cory passes along in this talk,

00:03:30

then you had better listen to it again,

00:03:32

because your personal freedom is being challenged

00:03:34

by the massive invasion that our now-necessary computer devices have brought about.

00:03:40

This is serious business, and it directly affects you.

00:04:05

Hey, y’all. So we’re going to get started without further delay. Cory Doctorow, talking about the Internet of Things, poisoned at birth by the inkjet printer business model.

00:04:06

Here’s Cory.

00:04:13

Hi there. Sorry, I got stuck in the whiteout there, hence my being late.

00:04:16

My wife, who left after me, got here before me. That’s how bad the whiteout was.

00:04:20

So we live in a world that’s made out of computers,

00:04:24

and I don’t mean in the sense of the Internet of Things videos where you see people walking into houses and waving their arms

00:04:28

and all the lights turn on and then they say,

00:04:31

tea, black, hot, Earl Grey, and magically the kitchen wakes up

00:04:34

and starts making them tea.

00:04:36

I don’t mean in that sense, although I want you to think for a moment

00:04:39

about how creepy that Internet of Things vision is

00:04:43

if you don’t trust the computers running your house.

00:04:46

Because a house where, wherever you are,

00:04:48

you can gesture at it and talk to it,

00:04:50

it will respond to you,

00:04:51

is a house where, wherever you are,

00:04:53

there’s a camera watching you

00:04:55

and a microphone listening to you.

00:04:57

And the only basis you have for believing

00:05:00

that that camera’s not sharing views of you

00:05:03

throughout your home with someone else is whether or not you trust the computers. I don’t mean that we live in that

00:05:09

world of the future where everybody dresses like an extra out of Tron. I mean that today we are

00:05:14

living in a world that is largely made out of computers. So you may have seen in the New York

00:05:21

Times this year there was an ad or an article about the subprime car lending industry.

00:05:27

So now that the subprime housing bubble has popped,

00:05:30

Wall Street needs a new thing to financialize from poor people,

00:05:33

and that’s cars that they can’t afford.

00:05:35

So they take people who are poor credit risks,

00:05:38

and they loan them money to buy cars,

00:05:40

and then they make bonds out of those loans, right?

00:05:43

So every time you pay your loan,

00:05:45

an investor gets some money paid back on her bond and they want to make those bonds valuable.

00:05:49

So they want to make sure that you keep your payments up and that you don’t run away with

00:05:52

the car and that they can’t repo it. So there’s about a million of these cars on America’s streets

00:05:57

today and they’re all fitted with location-aware, networked ignition override switches. So if you’re

00:06:04

a day late on your payments, they have their own independent sound system. And if you’re

00:06:06

a day late on your payments, your car shouts at

00:06:10

you, you are late on your payments, you are late on your payments, you are late on your payments.

00:06:15

But if you stop making payments, or if you violate the terms of your rental, of your loan rather,

00:06:19

like it may say you’re not allowed to leave a certain area, then your car won’t start anymore. It has an ignition kill switch.

00:06:26

So the most salient fact about a car these days is not its transmission or whether it runs on solar or petrol or diesel.

00:06:35

It’s what kind of informatics are in the engine.

00:06:38

Because we’ve seen already people attack those cars by getting into the dealer’s servers and immobilizing every car that

00:06:45

dealer ever sold, right? So the most salient fact about a contemporary vehicle is its computer.

00:06:51

Your car is a computer that you ride down the highway at 60 miles an hour or five miles an

00:06:55

hour on the gate road. So a 747 is a flying Sun Solaris workstation in a very fancy aluminum case connected to some scarily badly secured SCADA controllers.

00:07:08

A plane is a computer you fly in.

00:07:11

So all of these things in the Internet of Things,

00:07:13

all of these things that are computers in fancy cases,

00:07:16

are being born with the worst of technology’s business models.

00:07:20

It’s the inkjet printer business model,

00:07:23

where they sell you something either at cost or

00:07:26

at a loss or maybe even at a modest profit, but they anticipate that most of the revenue from

00:07:30

that thing will come from you buying aftermarket consumables for it forever. So, you know, they

00:07:35

sold you the toaster. They want to sell you the bread as well, and they don’t want you to use

00:07:39

third-party bread. They want to be able to mark up the bread as much as they can. That’s the inkjet

00:07:43

business model. That’s why printer ink costs more than vintage champagne. That’s why 3D printer

00:07:48

nylon costs more than filet mignon. It’s not because it’s intrinsically valuable. It’s because

00:07:53

it’s locked so that you can’t use third-party components in it. And there are a lot of reasons

00:07:59

why companies want the inkjet printer business model. So they want to control the market in

00:08:03

consumables. They also want to control the market in parts. So they want to make

00:08:08

sure that if it breaks, that you only buy parts from them so they can charge arbitrary

00:08:12

markups on those parts. They want to make sure that if you want to add a new part to

00:08:16

it, if you want to add a new connector or an attachment, that you only buy their attachments.

00:08:20

You know, wouldn’t it be great for car companies if instead of plugging any appliance into your cigarette lighter to charge it

00:08:26

the car company could issue licenses

00:08:28

to certain companies to charge their

00:08:30

devices through their lighter and use some kind

00:08:32

of computer magic to figure out whether or not you were using

00:08:34

a licensed charger or an unlicensed charger

00:08:36

they could have a new business stream. So a lot of these

00:08:38

Internet of Things companies are

00:08:40

anticipating that they can control the parts

00:08:42

they also want to control the

00:08:44

apps.

00:08:45

So many of you probably have fruit-flavored devices where there’s only one store you’re

00:08:49

allowed to legally buy apps from. And so those fruit-flavored devices, they charge the software

00:08:57

vendors who make the programs that you run on your computer, they charge them 30% of what you

00:09:02

pay to Apple. That 30% goes in Apple’s pocket and the other 70% is handed off to the vendor for the service of having taken a file and put it on a server and collected a payment for it.

00:09:14

30% is a pretty high markup.

00:09:16

And normally you’d expect there to be another business that would come along and say, well, we’re going to take 20% or 15%, or we’re not going to take anything because we’re a cooperative of software vendors. But if you can control the app store, then you can charge software vendors whatever

00:09:29

you want for the privilege of selling to customers. The customers become a product, right? You

00:09:34

are a product for the app store, for the companies that make the app stores of the things that

00:09:39

you own because they can sell the fact that you own their device to software vendors and then extract rents from those software vendors for reaching you.

00:09:47

They become gatekeepers to you.

00:09:50

And then there’s also the business of being able to make covenants.

00:09:53

So I want to sell a phone,

00:09:56

and I know that the easiest way to sell a phone

00:09:57

is to have the customer not pay for it at all.

00:09:59

Instead, they go to a carrier, and the carrier buys the phone,

00:10:03

and then they quote unquote give it to the

00:10:05

customer in exchange for being locked into a long terrible relationship with that carrier

00:10:09

and the carriers want to be able to control functionality because they want to sell you

00:10:13

the features of their network onesie-twosie, right? They want to be able to say, well, this tethers this

00:10:18

device but not that device. And so they want to be able to control what kind of software you can load

00:10:23

on the phone. Because if they can say to a carrier, if you sell our phones to your customers,

00:10:29

we’ll promise you that they can never install software that will let them tether devices

00:10:33

unless they paid you for the privilege of it. Well, that’s a covenant that they can make more

00:10:37

money out of, right? That gives them a new return on their investment in the engineering for that

00:10:40

phone. So everybody wants this business model. They want this inkjet printer business model. Now, normally, if you have a market for stuff, this doesn’t work, right? If

00:10:50

you say, well, I’ve got this phone and it only has one software market and I have unreasonable

00:10:54

conditions for the people who want to sell into it, someone else starts another marketplace.

00:10:58

Markets don’t solve all of our problems, but they solve that one pretty well, right? If it’s like,

00:11:02

in order to drink in my cafe, you have to allow me to shower you with abuse, someone starts a cafe next door where they don’t,

00:11:08

you get the coffee and not the abuse, right? That’s what we expect normal markets to do.

00:11:12

But in America, and increasingly around the world, we have a law that prohibits doing this.

00:11:18

In 1998, Congress passed a law called the Digital Millennium Copyright Act. DMCA, excuse me. The DMCA is a long, complicated,

00:11:27

really technical copyright law

00:11:28

that was supposed to make America’s judicial system

00:11:30

or legal system ready for the 21st century.

00:11:33

You’re probably familiar with the DMCA

00:11:35

because it’s the author of all your favorite videos on YouTube.

00:11:38

This video has been taken down due to a claim under the DMCA.

00:11:41

But there’s another part of the DMCA,

00:11:42

not the takedown part,

00:11:44

a part called Section 1201. And what Section 1201 does is it prohibits removing a lock that controls access to a

00:11:51

copyrighted work. So if there’s a copyrighted work and there’s a lock that prevents you from

00:11:55

accessing it, the DMCA makes it a felony punishable by up to five years in prison and a $500,000 fine

00:12:02

for a first offense to remove that lock. And that is without regard

00:12:06

to whether or not you’re committing an act of piracy, right? Without regard to whether you’re

00:12:10

allowed to access the work behind that. I write science fiction novels. If I had a Kindle and my

00:12:15

novel was locked in that Kindle, bypassing the lock to access the work that I hold the copyright

00:12:19

to is still a felony, right? It’s against the law to remove the lock regardless of whether you were doing something

00:12:26

illegal once you removed the lock.

00:12:29

And the thing about the DMCA,

00:12:31

about this part of the DMCA, 1201,

00:12:33

is that it has almost no litigation history.

00:12:35

Almost no one has ever stood up

00:12:38

to a claim under the DMCA,

00:12:39

so there’s been almost no judges

00:12:40

who ever ruled on the DMCA.

00:12:42

And as a result, we don’t really know

00:12:44

whether or not judges would hold

00:12:46

that it passes constitutional muster,

00:12:48

whether or not prosecuting someone under the DMCA

00:12:51

would be successful.

00:12:52

So what happens is,

00:12:53

if you’re thinking of doing something

00:12:54

that violates the DMCA because you’re a researcher

00:12:57

or you’re an entrepreneur starting a company

00:12:59

or you’re an archivist who needs to break a lock

00:13:03

to get access to a work to archive it,

00:13:05

you go to your general counsel and you say, am I allowed to do this? And they say, to be honest,

00:13:10

we don’t really know. There’s not much litigation history. What litigation history there is isn’t

00:13:15

very promising. And the penalty for getting it wrong is really, really bad. So you probably

00:13:19

shouldn’t do it. So the DMCA actually goes beyond what it probably says because nobody even wants to find

00:13:25

out what its contours are, right? It’s this kind of enormous, what they call a minotaur-y presence,

00:13:31

like the minotaur at the front of the labyrinth. We don’t know how badass the minotaur is and no

00:13:36

one wants to find out, right? So everybody just steers clear of violating the DMCA and talking

00:13:41

about it. Now, why hasn’t there been much litigation history? Well, it’s

00:13:45

because the people who rely on the DMCA get to choose who they sue. So generally speaking, if

00:13:50

you’re someone who’s violating the DMCA in a way that courts are probably going to be favorable to,

00:13:55

they don’t bring an action against you. But if you’re violating the DMCA in a way that a court

00:13:59

might look down on, they do bring an action against you. And if you’re dumb enough to stand

00:14:04

up to them, you lose. And then all of our litigation history goes the wrong way. So there are a couple

00:14:10

of examples of this that relate to the work of the Electronic Frontier Foundation, for whom I’m

00:14:14

working again, I’ll get to that later, with whom we’ve been involved. So the first one is the 2600

00:14:20

case. Do any of you know 2600, the hacker quarterly? It’s an amazing little magazine.

00:14:26

Terrible defendant.

00:14:29

I love 2600.

00:14:30

I’ve written for 2600.

00:14:32

I have a subscription to 2600.

00:14:34

And in around 2000,

00:14:36

2600 published source code to break DRM,

00:14:38

to break the locks on DVDs,

00:14:39

the DeCSS program.

00:14:41

And this is pretty classic

00:14:42

First Amendment stuff.

00:14:43

A printed publication that you

00:14:45

can buy in a bookstore publishes some math. And Congress has passed a law that says that

00:14:51

publication is illegal. That’s classic First Amendment. Generally speaking, you go to a judge

00:14:56

and you say to the judge, math is one of the ways that human beings express themselves. It’s a form

00:15:02

of expressive speech. Congress can’t prohibit me from doing this.

00:15:05

That’s what the First Amendment says.

00:15:06

And the judge says, off you go.

00:15:08

But there’s a problem with 2600 being your test case.

00:15:12

Actually, there’s a bunch of problems.

00:15:13

The first one is 2600 is in New York,

00:15:16

and New York judges are not the most tech-savvy judges,

00:15:18

especially in 2000.

00:15:19

New York judges were still talking about the information superhighway.

00:15:24

The second problem is that 2600 calls itself the hacker quarterly.

00:15:31

And the third problem is that what they were doing

00:15:33

was not breaking digital rights management,

00:15:36

not breaking a software lock in order to do something

00:15:39

that everyone could see should be lawful.

00:15:41

They were breaking the lock in a way that there were some lawful uses

00:15:44

and some unlawful uses. It probably would have helped people pirate DVDs. And so when we

00:15:48

defended 2600, the judge said, this has nothing to do with free speech. This is about whether

00:15:54

people should be allowed to protect their investments or whether you should be allowed

00:15:57

to steal anything you want if you can break the lock off of it. We had our asses handed to us.

00:16:01

Now, four years later, a guy named Ed Felten was also threatened under the DMCA. Now,

00:16:06

Ed is the right defendant. He’s a Princeton computer scientist who’s just been made deputy

00:16:11

CTO of the White House. And he was the inaugural CTO of the FTC in the last administration.

00:16:17

And Ed was part of a team who broke a digital lock that was going to be used to restrict access to

00:16:22

music. And they wrote a paper about it

00:16:25

that they were going to present

00:16:26

at the 10th annual USENIX Security Conference,

00:16:28

which is a learned academic conference.

00:16:30

And we really wanted a judge

00:16:32

to hear a case from the record industry

00:16:34

in which they said,

00:16:36

Princeton mathematicians should not be allowed

00:16:38

to talk about statistics at learned conferences

00:16:40

if record executives say that they shouldn’t.

00:16:43

Because we think that that’s the case

00:16:44

that a judge would say, you know what, this law is a really bad idea. So we stepped up to defend Ed,

00:16:50

and the record industry dropped its suit. They even offered us a covenant saying they would

00:16:54

never come after Ed for breaking this particular thing. So we couldn’t get what’s called

00:16:58

a declaratory judgment. When you have a threat against you, even if they withdraw the threat,

00:17:02

you can ask a judge. You can say to the judge, look, we have reason to believe this threat may reoccur.

00:17:07

Tell us how we would come out.

00:17:08

We have the right to know.

00:17:09

We have standing to know how this case would proceed.

00:17:13

But they were smart enough that they didn’t do it.

00:17:15

So we’ve never been able to choose our battlefield in the DMCA.

00:17:18

And as a result, it’s kind of crept along year after year, long past its sell-by date.

00:17:23

This ridiculous law from 1998

00:17:25

is still on the books in 2015. Now, in 2004, it looked like we might get a shot at it.

00:17:31

In 2004, two companies brought actions under the DMCA. One is Lexmark, which is a division of IBM.

00:17:37

They make printers. They went after someone who was jailbreaking printer cartridges to refill them.

00:17:41

The other one was Skylink. They make garage door openers. And they went after someone who jailbroke their garage door openers to make cheap remotes

00:17:48

for them because they charged a lot of money for the remotes.

00:17:50

And we went to the Federal Circuit and the Federal Circuit judge, Federal Circuit judges

00:17:54

are usually idiots about this stuff, but the Federal Circuit judge was amazing.

00:17:57

Federal Circuit judge said, there are no copyrighted works in your garage door opener except for

00:18:02

the digital lock.

00:18:03

The only thing that the digital lock protects access to is the digital lock. So removing the digital lock does not violate copyright.

00:18:10

This is purely anti-competitive. Same with Lexmark. We thought we would get it, but a funny thing

00:18:15

happened between 2004 and 2015, which is that now there are copyrighted works inside of everything,

00:18:22

right? Your light bulb has a full-on operating

00:18:25

system and a TCP/IP stack that comes on a chip for 60 cents. And so no one is ever going to be

00:18:31

able to argue again that the digital lock is protecting nothing except itself because right

00:18:37

now it’s so cheap to put a copyrighted work in every single piece of electronics, no matter how

00:18:41

trivial, no matter how functional, that everything is now falling under the purview of the DMCA. And so we are increasingly owning devices

00:18:52

that are designed to be sort of computers that we inhabit. I talked about that a little

00:18:58

before, but I want you to get an appreciation of how significant this is. So if you live

00:19:03

in a modern house,

00:19:10

whether it’s in a cold climate or a hot climate, chances are it’s very heavily insulated and extremely well sealed. And it’s got a computer-controlled respiration system, an HVAC

00:19:14

system that controls the movement of air and moisture in and out of that building.

00:19:20

That building is a computer that you live in. And the reason that you can tell is if you take the

00:19:24

computer out of that building, it starts to fill up with black mold. This is what

00:19:27

we found out in Florida, right? We turned off the power to all those subprime houses in Florida after the

00:19:32

2008 crisis. A couple of years later, they had to scrape them down to the foundation slabs,

00:19:36

because the most salient fact about those houses is their computers. Take the computer out of those

00:19:41

houses, and they are permanently uninhabitable. They’re scrap.

00:19:50

787s, I told you about 747s, 787s, the new Boeing Dreamliner,

00:19:53

have to be rebooted every 248 days or they crash.

00:19:55

They literally crash.
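
(A quick back-of-the-envelope aside from the salon, not from Cory’s talk: the widely reported explanation for that 248-day figure is a signed 32-bit counter of hundredths of a second overflowing, and the arithmetic does line up.)

    # Hedged sketch: check the 248-day claim against a signed 32-bit
    # counter that ticks 100 times per second (the commonly reported cause).
    max_count = 2**31            # overflow threshold for a signed 32-bit counter
    seconds = max_count / 100    # counter advances 100 times per second
    days = seconds / 86400       # seconds per day
    print(f"{days:.2f} days")    # prints 248.55 days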

00:20:04

In 2013, a sadly departed security researcher in Australia named Barnaby Jack presented his work on implanted defibrillators. These are an amazing technology. If your heart is

00:20:09

prone to stopping, you can, excuse me, you can go to the doctor and the doctor will slice you open

00:20:15

and she’ll stick a computer attached to a powerful battery in your chest cavity and that computer

00:20:20

will listen to your heartbeat and if your heart stops beating, it will shock you back to life,

00:20:25

and you will go on living.

00:20:27

Now, doctors want to get telemetry off of your implanted defibrillator,

00:20:30

and they want to update the firmware on your implanted defibrillator.

00:20:34

And it’s hard to attach a USB cable to a computer that’s inside your chest cavity.

00:20:38

So these things have wireless interfaces,

00:20:40

because everything has a wireless interface,

00:20:42

because it’s so cheap that you might as well put a wireless interface in it. We’re basically living in microwave ovens. And so

00:20:48

this thing has a wireless interface and it’s a copyrighted work. So it’s against the law to look

00:20:52

at it too closely or pay attention to it or jailbreak it or add new firmware to it. So Barnaby

00:20:58

Jack showed that because of this funny state that these implanted defibrillators live in where

00:21:02

third parties aren’t allowed to audit them and tell you what they find,

00:21:06

that he could reprogram them from 30 feet away

00:21:08

and cause them to seek out other defibrillators,

00:21:11

like when you went to the defibrillator clinic

00:21:13

and everybody else who’d had one implanted was there,

00:21:15

seek them out and reprogram them,

00:21:17

and then at a set date in the future

00:21:18

to administer lethal shocks to everyone who’d been affected.

00:21:22

When Dick Cheney had his defibrillator implanted,

00:21:25

he had the wireless interface turned off.

00:21:27

All of his upgrades involve a scalpel,

00:21:29

but no one can give him a heart attack from 30 feet.

00:21:33

So this rip-off stuff, this is bad,

00:21:36

and that’s the intended consequence

00:21:38

of using the DMCA in this way.

00:21:40

This is what the inkjet printer business model is for.

00:21:43

It’s to return higher

00:21:45

revenues on engineering investments by locking customers in to covenants, to consumables,

00:21:51

to code, to features, and so on. That’s what it’s for. That’s the intended consequence.

00:21:56

But the really bad news here is not the fact that you’re going to be a kind of feudal tenant

00:22:00

in the IT fields for the rest of the future. The really bad news here is how it interacts with

00:22:06

security, and that’s the unintended consequence. So the DMCA not only makes it a felony to jailbreak

00:22:13

a device, but it makes it a felony to give people information they could use to jailbreak a device.

00:22:18

So if you discover a mistake the programmer made, if there’s a flaw, what they call a vulnerability in the code,

00:22:25

then if you know about that, that can be the thing that you use to insert your own software

00:22:30

into the device and jailbreak it and add the features the manufacturer doesn’t want you to add.

00:22:35

And so it’s a felony to tell people about flaws in these devices that they live in,

00:22:39

that they have inside their bodies, that literally have the power of life and death

00:22:42

over them and whole populations. It’s a felony punishable by up to five years in prison. Now, this matters a lot because we have

00:22:49

one experimental methodology for discovering whether or not security works, and that’s

00:22:54

disclosure. Because anyone can design a security system that works so well that they themselves

00:22:59

can’t think of a way of breaking it, but all that means is that you’ve designed a security system

00:23:03

that works on people who are stupider than you.

00:23:06

To understand why this experimental methodology is the only one we have, you only need to

00:23:12

cast your mind back to how we got to where we are today.

00:23:15

So before we had science, we had a thing that looked a lot like science called alchemy.

00:23:20

And alchemists did what scientists do.

00:23:21

They observed phenomena in the natural world.

00:23:29

They formulated hypotheses about the causal relationships between these phenomena.

00:23:31

This causes that.

00:23:34

They made up experiments that tested their hypotheses.

00:23:36

So, so far, that’s science, right?

00:23:40

But then they didn’t tell anyone what they discovered or what they thought they discovered.

00:23:44

And there is a bottomless human capacity for self-deception.

00:23:46

If you think that you were probably right going into your experiment, you have a tendency to look at results that prove

00:23:52

your hypothesis and to ignore or downplay results that disprove your hypothesis. And this is why

00:23:58

every alchemist discovered for himself in the hardest way possible that drinking mercury was a

00:24:02

bad idea. Right?

00:24:06

And then after 500 years of this,

00:24:09

alchemists actually did manage to convert something base into something precious because they started publishing.

00:24:13

They started telling other people what they thought they knew,

00:24:15

and they converted alchemy into science.

00:24:18

We call that moment the Enlightenment.

00:24:20

Right?

00:24:20

So we have one experimental methodology

00:24:22

for discovering whether your security works,

00:24:25

and that’s to tell people what you think you know, how you think it works. And we’ve short-circuited

00:24:30

this with the DMCA for the last 17 years in a field in which increasingly everything is colonized

00:24:37

by it. And so the security dimensions of that are historically underreported and becoming more

00:24:43

grossly underreported as every day goes by.

00:24:46

So what are we going to do about this?

00:24:48

Well, the problem has been so far that the other side has always been able to pick the battles.

00:24:53

They get to choose who they sue, so they get to choose only the cases that will keep the DMCA intact,

00:24:58

and they can just pretend they don’t see the cases that a judge would likely look on with disfavor.

00:25:03

Now, the Electronic Frontier Foundation,

00:25:05

which is a civil liberties group that turned 25 this year, for whom I’m back working again,

00:25:10

EFF loves to change the law, not just by lobbying Congress, which is hard and slow and expensive,

00:25:16

but by appealing to the judiciary through something called impact litigation. And the way that it

00:25:20

works is if you’re lucky enough to live in a country with a strong constitutional tradition,

00:25:24

rather than convincing a majority of lawmakers to go against the people who funded their election campaigns

00:25:30

and vote down a law that they’d previously passed,

00:25:32

you can instead go to a judge, or a few judges, or in the Supreme Court, five judges,

00:25:36

and convince five out of nine judges that Congress made a law that violates the Constitution.

00:25:42

And no matter what law that is, and no matter how committed Congress is to it, that law is nullified by the Supreme Court. We’ve seen some pretty amazing

00:25:51

Supreme Court cases and some pretty terrible ones in the last couple of years. We’ve seen how that

00:25:55

can work. So in 1992, EFF had its first amazing impact litigation victory. So in 1992, it was illegal for normal people, people who weren’t

00:26:07

in the military, to use strong cryptography, cryptography that worked, that could keep your

00:26:12

secrets. Because the NSA thought that if you could keep secrets from them, that you might get up to

00:26:17

stuff that they would want to know about, and they wouldn’t be able to find out about it. So they

00:26:20

prohibited civilian access to strong crypto. And there were a lot of arguments made about why this was a bad idea.

00:26:26

The kind of arguments you’d think would woo Congress.

00:26:29

Like the banking industry went to Congress and said,

00:26:32

we really need to be able to protect the authentication mechanisms

00:26:35

between our servers and with our clients and to our head offices

00:26:38

with strong crypto because Chinese spies and Russian spies and the mafia

00:26:43

don’t follow your laws,

00:26:45

and so if they can break into our stuff,

00:26:48

you know, we’re really vulnerable.

00:26:50

And Congress went to the NSA, and they said,

00:26:52

is this true?

00:26:52

And the NSA said, no, we gave them a really good cipher.

00:26:55

They can use it.

00:26:55

It’ll protect them from everyone.

00:26:57

So John Gilmore, who, I don’t know, is he here today?

00:26:59

He booked me into this talk.

00:27:01

John Gilmore is one of EFF’s founders,

00:27:03

designed a computer for a quarter million dollars that could exhaust all the possible keys for the cipher the NSA said

00:27:09

we could use in two and a half hours. Right? So we took that to Congress and they said

00:27:14

we don’t care. Right? But then we found a mathematician, Daniel J. Bernstein, well known

00:27:19

as DJB, an eminent cryptographer today who was a grad student at UC Berkeley then, who, on Usenet, which was like the web but angrier, was publishing source code for strong

00:27:33

crypto.

00:27:34

He was publishing the math behind strong crypto.

00:27:37

And in the Ninth Circuit, we argued that he had the right to do this because the First

00:27:42

Amendment protected source code as a form of expressive speech.

00:27:48

And the Ninth Circuit upheld us, and then the Appellate Division upheld us,

00:27:52

and then the ban on strong crypto was struck down. So this is an amazing template, right? This is ninja policymaking because you find the one weak spot in the other side’s otherwise invulnerable

00:27:59

wall, and you attack them there, and the whole wall comes down. It doesn’t matter how many

00:28:04

Congress critters they bought off. If you can convince a judge that it’s unconstitutional,

00:28:09

away goes the law. So in January, I joined EFF again after a 10-year hiatus. I used to be their

00:28:15

European director. I took 10 years off to write novels, and I got more and more alarmed about

00:28:19

this stuff. And I went back in January to help work on a project called the Apollo 1201 Project,

00:28:25

which is a project to abolish all the DRM in the world within a decade, right? We do this not

00:28:29

because it’s easy, but because it’s hard. Thank you. So how are we going to do this? Well,

00:28:38

people like you have ideas, right? You have things that you would like to make that probably

00:28:42

violate the DMCA. In fact, if any of you are researchers working in computers and information security,

00:28:47

you’re probably already violating the DMCA.

00:28:49

Lots of people do.

00:28:50

They just don’t talk about it.

00:28:51

They publish papers in mobile phone security where they say,

00:28:55

I tested a bunch of apps to see what their security model was,

00:28:58

but they never say how they got the apps.

00:29:00

Because they got the apps by breaking the law.

00:29:02

They jailbroke a phone, and then they extracted the apps from memory on the phone and then they subjected them to analysis. So

00:29:08

nobody ever talks about this stuff. Some of you have found vulnerabilities that you haven’t talked

00:29:14

to anyone about. We’d like you to come and talk to us. We’d like you to come and talk to us about

00:29:19

how the ideas that you have, how the research that you’ve done can be expressed in scholarly circles in ways

00:29:25

that are litigation hardened and likely to survive a legal challenge.

00:29:29

Because if the other side sues you and you have the right facts, we can get rid of this

00:29:34

law.

00:29:34

And if the other side doesn’t sue you and you publish and you are public about it, then

00:29:39

we can embolden other people who’ve made discoveries like yours to come forward as well.

00:29:43

And one way or the other, we get to get rid of the DMCA, right? Either they become so scared of suing us because they

00:29:49

know that the minute they sue someone who’s consulted with us and litigation hardened their

00:29:53

strategy, they’re in danger of losing the DMCA entirely. And so they never sue anyone or they

00:29:59

lose patience and they sue someone and we get rid of the law. So one way or the other, we’d like you

00:30:02

to come talk to us if you’re a security researcher, if you’re a programmer, if you’re working in these spaces.

00:30:08

We have a target-rich environment today. For the last 15 years, the only people who could sue us

00:30:13

under the DMCA were entertainment companies because they were the only ones using it.

00:30:16

But today it’s coffee pod makers, and it’s automakers, and it’s tractor makers, and it’s

00:30:21

implanted medical device makers, and it’s tons of companies that have bet the farm,

00:30:25

in John Deere’s case literally,

00:30:27

on being able to control how you use your devices.

00:30:29

And we just need one of them to be dumb enough to sue us

00:30:32

and we can make this law go away.

00:30:35

And once 1201 goes,

00:30:36

once the prohibition on breaking digital locks goes,

00:30:39

digital locks go too.

00:30:40

Because digital locks don’t work.

00:30:43

For a digital lock to work, I have to scramble a

00:30:46

message, and I have to give it to you. And then I have to give you a device in which I hide the

00:30:51

keys to unscramble the message. And then I have to trust that device to never tell you what the keys

00:30:56

are. So like Netflix gives you a browser plugin that decrypts the videos while you’re watching

00:31:01

Netflix, but doesn’t have a save button. And to stop you from making a save button, they have to make sure that you never figure out where

00:31:08

in that plugin they hid the keys. Well, giving your adversary a thing that you’ve hidden keys in

00:31:13

and hoping that that adversary never finds those keys has a technical term in security research.

00:31:19

It’s called wishful thinking, right? We don’t give adversaries the keys to our crypto for the

00:31:24

same reason we don’t

00:31:25

hide safes in bank robbers’ living rooms, right? Because it doesn’t matter how great the safe is,

00:31:30

if it’s in the bank robber’s living room where she has an electron tunneling microscope,

00:31:35

that safe is eventually going to open up, right? And the same is true with all digital locks. They

00:31:40

only work when it’s against the law to tell people about their flaws. They don’t work

00:31:45

if people are allowed to subject them to analysis. And nobody wants digital locks, right? Nobody woke

00:31:52

up this morning and said, do you know what I want? I want to read a book, but I want to do less with

00:31:55

it, right? Do you know what I really want? I want to listen to some music, but I want music with

00:31:59

fewer features. I wish my coffee pot took fewer vendors’ coffee. I wish my insulin pump had fewer

00:32:06

companies that made software that I could use to interpret its data. That’s what I really need,

00:32:10

is less choice in the devices that have the power of life and death over me. Nobody wants this.
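
(To make the “wishful thinking” point concrete, here’s a deliberately toy sketch from the salon, not Cory’s, and nothing like any real vendor’s scheme; the key, the cipher, and the player are all invented for illustration. Once the descrambling routine and its hidden key ship inside a player that the adversary runs on her own machine, the “missing” save button is one line of code.)

    from itertools import cycle

    SECRET_KEY = b"hidden-in-the-binary"  # ships inside the player itself

    def descramble(blob: bytes) -> bytes:
        # Toy XOR stand-in for a real cipher; a real one changes nothing
        # about the argument, since the key still has to be in the player.
        return bytes(b ^ k for b, k in zip(blob, cycle(SECRET_KEY)))

    def play(blob: bytes) -> None:
        print(descramble(blob).decode())  # renders the work; offers no save button

    def save(blob: bytes, path: str) -> None:
        # The adversary controls the machine the player runs on, so she can
        # call the same descrambling routine and write the output to disk.
        with open(path, "wb") as f:
            f.write(descramble(blob))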

00:32:16

In markets where people roll out digital locks and you’re allowed to compete,

00:32:20

those digital locks disappear immediately. Like with coffee pod makers, Keurig put coffee

00:32:24

locks on their coffee pods.

00:32:26

They’re not very good pods,

00:32:27

but some people use them, lots of people use them.

00:32:29

A year later, their share price had fallen by 25%,

00:32:33

because in a market where people can choose,

00:32:35

people don’t choose the digital locks.

00:32:36

It’s only when Congress says,

00:32:38

we will spend an unlimited number of tax dollars

00:32:40

defending your dumb business model,

00:32:42

that it makes sense to put digital locks

00:32:44

in your business model. So we’re looking for hackers. We’re looking for academics. We’re looking for

00:32:49

security researchers. Some of you work for tech companies that might be World Wide Web Consortium,

00:32:53

W3C members. So the W3C makes standards for the web. Last year, the W3C started to fold digital

00:33:00

locks into standards for web browsers, making every web browser into a reservoir of long-lived digital pathogens

00:33:06

that can fuck us in every conceivable way

00:33:08

from asshole to appetite.

00:33:09

And we’re working on getting them to change that.

00:33:11

If your company is a W3C member, talk to me.

00:33:14

I want to talk to you about how we can work with you on this.

00:33:16

If you’re a publisher or a writer involved in digital music

00:33:20

and you sell your works in the digital marketplaces,

00:33:22

we’re working to get the FTC to force the vendors,

00:33:25

the platforms, Amazon and Google and Apple,

00:33:28

to start labeling those things

00:33:29

as to whether they have digital rights management or not,

00:33:31

as to whether they have locks or not.

00:33:33

Because right now, if you want to buy a book

00:33:35

that doesn’t have digital locks on it,

00:33:36

Amazon won’t tell you whether or not

00:33:38

the book has a digital lock on it or not.

00:33:40

We want them to start labeling that stuff.

00:33:42

So if you’re involved in that,

00:33:43

there’s a letter you can sign on to.

00:33:44

We’re going to get the FTC to

00:33:45

go to those companies and start twisting their arms.

00:33:48

And then if you’re a UX designer or

00:33:49

UI designer or product designer, talk to me.

00:33:52

I’m working on a project called the Catalog

00:33:53

of Missing Devices, which is a catalog

00:33:56

of all the things we could have if it hadn’t

00:33:58

been for this law on the books for the last 17

00:34:00

years. Because we want

00:34:02

people to start realizing what they’re missing.

00:34:04

So my email address is cory, C-O-R-Y, at EFF.org. So if you’re involved in any of those things, talk to me.

00:34:11

So digital rights management, and Section 1201 of the DMCA that makes it possible, have

00:34:17

been a precancerous mole on the information society festering for 17 years. We are making

00:34:22

a surgical strike against it that you can help us with.

00:34:25

Not because the internet

00:34:27

is the most important fight that we have.

00:34:29

There are many fights

00:34:30

that are more important than the internet.

00:34:31

There’s fights about gender equality.

00:34:33

There’s fights about racial equality.

00:34:35

There’s fights about climate change.

00:34:37

There’s fights about income inequality.

00:34:38

And all of those fights

00:34:39

are more important than the destiny of the internet.

00:34:41

Except that the internet

00:34:43

is the battlefield

00:34:44

on which all of those fights will be fought.

00:34:46

There is no, excuse me, I’ve got so much dust in me.

00:34:49

There is no turf that is more foundational than that fight.

00:34:54

If we’re going to have all those other fights,

00:34:55

we have to keep the internet free and fair and open.

00:34:58

So I’m going to close by talking to you

00:35:01

about how you can help,

00:35:02

even if you’re not any of those kinds of people,

00:35:04

what you can do to help out.

00:35:07

So there are a lot of people who feel helpless

00:35:10

because they know that every day

00:35:11

they send a check to the cable company

00:35:13

whose mission is to destroy the Internet as we know it

00:35:16

and make network discrimination a matter of fact.

00:35:19

Or they know that they like their fruit-flavored devices

00:35:21

that are locked in ways

00:35:22

that encourage this inkjet printer business model.

00:35:25

Or they know that, you know, me, I put Linux on my computers, but my computers are Lenovo computers.

00:35:31

And Lenovo has become like the alpha and omega of spyware, right?

00:35:35

Or you have to go to work at a company like Cisco that’s helping spy on the whole world.

00:35:39

Or you go to work at Oracle that’s trying to patent or copyright APIs.

00:35:44

Whatever it is, you may feel like there’s nothing you can do

00:35:46

because you’re not living pure, right?

00:35:48

And this is the crisis of vegetarianism, right?

00:35:51

Every vegetarian eventually meets a vegan,

00:35:54

and every vegan eventually meets a fair trade vegan

00:35:57

who eventually meets a fruitarian who eventually meets a breatharian, right?

00:36:01

And no one can be pure, and I’m not going to ask you to be pure, right?

00:36:04

I’m not going to ask you to be pure, right? I’m not going to ask you to like live in a cave, build a computer out of like FPGAs that you write the

00:36:10

firmware for, you know, only use Telnet and ask other people to like proxy data to you and,

00:36:19

you know, reject all cookies. Like I’m not going to ask you to do that, because you can’t.

00:36:25

And if you try to live that way, you won’t be able to do all the things that you need to do

00:36:28

to struggle on in all these fights that are important.

00:36:31

What I’m going to ask you to do instead is hedge, right?

00:36:33

Hedging is not just for Wall Street.

00:36:35

If every day you’re opening your wallet and spending money with companies

00:36:38

whose mission is to destroy the future we want to bequeath to our children,

00:36:41

sit down one day and figure out how much money you’re spending

00:36:44

on companies who are destroying the future we want to live in, and figure

00:36:48

out what percentage of that you’re going to give to a group that’s fighting to keep the

00:36:52

internet free and fair and open. And not just EFF, although EFF is amazing, and you should

00:36:56

be members of EFF, and you should join EFF. I’m a member and a donor to EFF, as well as

00:37:00

someone who’s become a staffer again this year. I was a staffer before. It’s fantastic.

00:37:04

I’ve worked for a lot of nonprofits.

00:37:05

I’ve never seen one as well run as EFF.

00:37:07

But there are lots of other organizations.

00:37:08

There’s the Free Software Foundation

00:37:10

and Public Knowledge and Creative Commons.

00:37:12

And for those of you in Europe,

00:37:13

there’s the EDRI,

00:37:14

the European Digital Rights Initiative.

00:37:16

In the Netherlands, there’s Bits of Freedom.

00:37:18

In the UK, there’s the Open Rights Group.

00:37:20

In Australia, there’s Electronic Frontiers Australia.

00:37:23

In Finland, there’s Electronic Frontiers Finland.

00:37:25

In France, there’s La Quadrature du Net.

00:37:27

In every territory, and this is an amazing thing

00:37:30

because 10 years ago, this wasn’t true,

00:37:32

but in every territory now,

00:37:33

there are activist groups working on this.

00:37:35

Pick some, pick all, and make a hedge

00:37:37

and help us build the future we want to live in.

00:37:39

Thank you.

00:37:47

All right.

00:37:48

So how are we doing for time?

00:37:54

We have time for some questions.

00:37:56

So my Q&As tend to be a bit of a sausage fest,

00:37:58

and so I like to alternate between people who identify as women or non-binary

00:38:01

and people who identify as men.

00:38:03

And I remind you that long rambling statement

00:38:06

followed by what do you think of that

00:38:07

is technically a question but not a good one.

00:38:10

Go.

00:38:13

So who could have been the plaintiff at the record company?

00:38:15

Couldn’t someone from the record industry

00:38:17

have come and sued us?

00:38:18

Like, couldn’t we have cooked up a lawsuit

00:38:19

where we have a researcher do something,

00:38:21

we have a record company sue them?

00:38:22

That’s actually a thing in American jurisprudence.

00:38:24

It’s called a collusive lawsuit.

00:38:26

And it’s legal. It is legal.

00:38:29

And the way that it has to work, you have to have something called case or controversy, right?

00:38:32

So there needs to be a real legal issue at hand.

00:38:35

And what needs to happen is one party, the plaintiff,

00:38:40

needs to want there to be a determinative outcome.

00:38:43

They want to know what the law is, but they don’t care which way it goes.

00:38:46

They just want to know which thing they should do,

00:38:48

and then they can have standing to bring a collusive lawsuit.

00:38:50

Dred Scott was a collusive lawsuit.

00:38:53

But judges fucking hate them, right?

00:38:56

And judges, you bring up a case in front of a judge,

00:38:58

and they go, this feels like shenanigans, right?

00:39:00

This feels like you guys are using me

00:39:02

to do an end run around Congress.

00:39:04

So it was a thing that various people suggested. When I started this project and I called around all the legal scholars,

00:39:10

a couple of them, especially in the academic world, were like, you need a collusive lawsuit.

00:39:14

And then I talked to our legal staff and they were like, we shouldn’t do a collusive lawsuit.

00:39:18

Any people who identify as male or non-binary? Yes, sir.

00:39:22

What’s the legal status of using long PGP keys? That is absolutely legal.

00:39:25

You’re thinking of that Bernstein case. So before Bernstein, more than 50 bits was illegal,

00:39:31

and now it’s legal. Now you can have arbitrarily long PGP keys. The response to that, if you look

00:39:35

in the Snowden docs, you’ll see that the response was to start attacking the cipher systems and to

00:39:40

start attacking the endpoints, so inserting malware in the computers,

00:39:45

because the crypto is intact.

00:39:47

I mean, this is an amazing thing about crypto

00:39:48

that I think is underappreciated,

00:39:51

which is that for the first time

00:39:52

in the history of the human race,

00:39:54

normal humans can make messages so secret

00:39:57

that nobody in the world,

00:39:59

without physically coercing them or tricking them,

00:40:02

can find out what those messages say,

00:40:04

and they can send them to other

00:40:05

people. You can take the distraction rectangle in your pocket and you can, like, take a break from

00:40:12

throwing pigs at birds and you can use it to scramble a message so thoroughly that if you

00:40:16

turned every hydrogen atom in the universe into a computer and you set it to doing nothing

00:40:21

but guessing keys until we ran out of universe, you would run out of time before you ran out of keys.
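
(A rough sense of the scale Cory is gesturing at, with our numbers rather than his: exhaustive key search grows as 2 to the number of key bits, so each added bit doubles the work.)

    # Hedged back-of-the-envelope sketch: brute-force time in universe-ages
    # at a wildly optimistic guess rate. All figures are ours, not the talk's.
    AGE_OF_UNIVERSE_S = 4.35e17   # ~13.8 billion years, in seconds
    RATE = 1e18                   # a billion billion key guesses per second

    for bits in (56, 128, 256):
        seconds = 2**bits / RATE
        print(f"{bits}-bit key: {seconds / AGE_OF_UNIVERSE_S:.2e} universe-ages")
    # 56-bit keys fall in a fraction of a second at this rate; 128-bit keys
    # need roughly 800 universe-ages; 256-bit keys need about 2.7e41 of them.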

00:40:26

This is an amazing thing.

00:40:28

So the internet won’t solve all of our problems

00:40:31

and it’s not foreordained whether it’s going to be

00:40:32

a tool for surveillance or a tool for privacy,

00:40:35

but it has the potential to make privacy happen

00:40:38

in ways that have never existed on this earth before.

00:40:41

So anyone who says, well, we’ve had other technological revolutions

00:40:44

that didn’t come out the way we thought, it’s true, we have. But they weren’t like this one.

00:40:47

There is a different thing going on here. Are there any women or people who identify as non-binary

00:40:51

who’d like to ask the next question? Yeah. Right. So the question is about fair dealing and other

00:40:58

exceptions in copyright. And then how that applies to the physical world, right? So you have the right to make fair uses of works.

00:41:09

You’ve probably heard this term, fair use.

00:41:10

So you’re allowed to quote works for the purposes of criticism.

00:41:13

You’re allowed to make duplicates of works for archival purposes.

00:41:16

You’re allowed to do lots of things.

00:41:18

You’re allowed to transcode things that are not even part of fair use, right?

00:41:20

You’re allowed to transcode works into assistive formats for people with visual disabilities.

00:41:26

And what happens when a digital lock stops you? Are you allowed to circumvent the digital lock? So far, no, right? I mean, so far the

00:41:31

jurisprudence on this has been the other way. And in other jurisdictions where they’ve passed

00:41:35

their own versions of the DMCA, where they’ve made exceptions for this, the exceptions have

00:41:41

been a joke. So in Norway, when they adopted the EUCD, which like they’re not even in the EU and they adopted the EUCD because they make dumb decisions

00:41:49

in Norway. Norway said you have the right to turn an e-book into a book that’s in an assistive

00:41:56

format for people who are blind, but only if you’re blind and you’re not allowed to share your tool

00:42:00

with anyone else. Right? So blind people are each individually allowed to circumvent.

00:42:05

So the question is,

00:42:06

are you allowed to circumvent in other contexts

00:42:08

when there’s a copyrighted,

00:42:09

when you have an expectation

00:42:12

under some other body of law,

00:42:14

like perhaps there’s a consumer rights reason

00:42:17

to believe that you have the right to do it,

00:42:18

or contract law,

00:42:19

there are limits to contract law,

00:42:20

there’s something called

00:42:21

the doctrine of unconscionability,

00:42:23

there’s proportionality,

00:42:24

you know, I can’t, like the fact that I have a sign that says, you know, at the very back of my store

00:42:29

that says by entering the door I’m allowed to hit you in the head, and as soon as you come through

00:42:31

the door I hit you in the head. I say, but it was a contract. That doesn’t make it a binding valid

00:42:35

contract, right? So what about those limits, you know, to contract law? Well, where we’ve heard

00:42:40

those cases argued, they’ve always said, actually, the DMCA really doesn’t allow you to circumvent an access control to a copyrighted work, even if you don’t do anything with that copyrighted work.

00:42:55

So a good example, a couple of examples of this would be in the most recent triennial hearing.

00:42:59

So every three years, the Copyright Office hears petitions for exemptions to this rule, and we just had one.

00:43:05

And there were a couple of really interesting petitions.

00:43:08

One was about the right to jailbreak cars.

00:43:11

So GM intervened and they said,

00:43:14

you shouldn’t be allowed to change the firmware in your car.

00:43:17

We want to make sure that mechanics can only service your car

00:43:21

if they’ve signed a contract with us.

00:43:23

And then the contract with GM says, you

00:43:26

will only buy GM parts for your car. So obviously, like, there’s a lot of competition law and practice

00:43:31

that says it’s not right for a manufacturer to be able to control the market and spares and, you

00:43:36

know, fenders and windshield wipers and whatever. But GM believes that the DMCA, because in order to

00:43:43

find out what’s wrong with the car and which part you need to replace,

00:43:46

you have to circumvent an access control to the copyrighted firmware for the car,

00:43:49

that even though the fender is not copyrighted

00:43:53

or the fuel injection system is not copyrighted

00:43:55

or the pistons are not copyrighted,

00:43:57

that the fact that to find out which of those you need to replace,

00:43:59

you need to circumvent an access control, means that GM controls all of it.

00:44:02

You know, GM used to have this advert,

00:44:04

“That’s not your father’s Oldsmobile.”

00:44:06

It turned out that they weren’t speaking metaphorically.

00:44:08

They literally meant, your father doesn’t own that Oldsmobile.

00:44:10

That’s still our Oldsmobile, even though we sold it to him.

00:44:13

And then, you know, John Deere makes tractors,

00:44:16

and their tractors have firmware locks on them.

00:44:19

And again, like, tractors are not copyrighted works,

00:44:21

and the thing that John Deere wants to protect

00:44:24

has nothing

00:44:25

to do with copyright. So they have torque sensors on the wheels and they have GPSs. So they make

00:44:30

centimeter accurate surveys of the soil conditions of your field when you plow your field. And they

00:44:35

lock that data into the device and then they transmit it back to John Deere without you being

00:44:40

allowed to look at it, you the farmer being allowed to look at it. And what they do with that data is

00:44:44

they sell it to seed companies.
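[For illustration: the survey described here is, at bottom, just bucketing geotagged sensor readings into grid cells. A toy sketch with an invented grid size and field layout; John Deere's actual pipeline is not public.]

from collections import defaultdict

GRID_DEG = 0.00001  # roughly one meter of latitude; illustrative only

def soil_survey(samples):
    # samples: iterable of (lat, lon, wheel_torque) readings taken while plowing.
    # Harder soil resists the wheels more, so the average torque in a cell
    # is a crude proxy for soil density there.
    cells = defaultdict(list)
    for lat, lon, torque in samples:
        key = (round(lat / GRID_DEG), round(lon / GRID_DEG))
        cells[key].append(torque)
    return {cell: sum(vals) / len(vals) for cell, vals in cells.items()}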

00:44:46

So if you want to have your tractor automatically

00:44:48

disperse your seed in a way that’s optimized for the soil density

00:44:51

in your field, you have to buy your seed from Monsanto

00:44:54

because only Monsanto has the exclusive right to it.

00:44:57

And then they also, because they have insight into soil conditions

00:45:00

across whole regions, they play the futures market.

00:45:04

They can make predictions about grain yields

00:45:06

ahead of the game.

00:45:08

And none of this is protected by copyright, obviously.

00:45:11

But no one will argue that the firmware

00:45:13

in a John Deere tractor is not copyrighted.

00:45:16

And so because in order to achieve these lawful outcomes,

00:45:20

like all the other lawful outcomes

00:45:22

that the Copyright Office has said,

00:45:23

no, you can’t circumvent for, like archiving or whatever,

00:45:26

you have to circumvent an access control,

00:45:29

the Copyright Office has said those rights that you have are trumped

00:45:34

by the absolute prohibition on circumvention.

00:45:38

Now, I think you’re right.

00:45:40

I think that if we got the right case in front of the right judge,

00:45:42

that a judge would say what you just said.

00:45:44

This makes a nonsense of the law.

00:45:49

Congress did not intend for a law that was supposed to let the entertainment industry

00:45:54

region code its movies to go on to allow fridge manufacturers to region code their butter.

00:46:01

And you could imagine that that might be something that a judge would say.

00:46:05

The problem is that everyone who relies on the DMCA to restrict additional features in their products

00:46:11

or functionality in their products gets to choose who they sue.

00:46:15

And so they only sue the people from whom they think they’re going to get a bad outcome,

00:46:18

which is why we need a much wider pool of people advertising the fact that they’re violating the DMCA

00:46:24

after having first

00:46:26

designed their product or project or research so that it is optimized for surviving a legal

00:46:33

challenge under the DMCA so that we can goad some of these people into suing. And the thing that I

00:46:39

think has changed is this target-rich environment. There’s a lot more people out there who have

00:46:43

standing now.

00:46:49

And it’s kind of the inverse of the copyright troll problem. So for a long time, the movie industry didn’t sue everybody who was in a torrent swarm downloading a movie because

00:46:54

they understood that that would just make them look like jerks. And then you got enough

00:46:58

companies that had standing to sue over BitTorrent downloading and do speculative invoicing where

00:47:04

they send you an email or

00:47:05

a letter that says, we saw your IP address in the swarm. If you pay us $1,000, we won’t

00:47:10

sue you otherwise. And that’s less than it would cost you to ask a lawyer whether you

00:47:15

should give them $1,000. And so they collect this money on these speculative invoices.

00:47:19

There are so many companies that have standing to sue whose works are being transmitted over

00:47:24

BitTorrent and who

00:47:25

have nothing else to lose because their movies did really badly. Like there was an Adam Sandler

00:47:29

movie where they just started sending out speculative invoices in Australia that closed

00:47:33

at the box office in like four days, right? And it was just, you know, they found people who were

00:47:37

torrenting it and that’s the only business model they have left for this limited liability company

00:47:42

they incorporated to make this one movie. And so they don’t give a shit

00:47:45

if it brings the studios into disrepute

00:47:47

because they just want to maximize

00:47:48

their return on this one movie.

00:47:50

We now have this target-rich environment.

00:47:52

There are lots of companies

00:47:53

that don’t care about the long-term

00:47:55

because they have no long-term

00:47:56

unless 1201 is intact.

00:47:57

And if we can figure out how to gore one of their oxen,

00:48:01

then they will come and sue us,

00:48:02

even if the facts are not in their favor

00:48:04

because it’s the

00:48:05

only option they have. And then maybe we can get the right facts in front of the right judge. Are

00:48:09

there any more women or people who identify as non-binary? Yes, in the back. How can you break

00:48:15

the Inchev model? So the beauty of anticipating that 1201 will not survive, that the DMCA will

00:48:21

go away, is that it turns what used to be a bug into a feature. So right now, it’s really sad that everybody you know, and probably you, are buying things that

00:48:29

have digital locks on them and that are locked up in ways that are destroying the future we want to

00:48:35

live in, right? But as soon as it’s legal to unlock that stuff as a third party, every one of those

00:48:41

people is a customer for your product. So in other words, the more Netflix customers there are out there,

00:48:46

the more customers there are for a Netflix PVR that defeats their digital lock.

00:48:50

The more Keurig customers with DRM’d K-Cups there are,

00:48:55

the more customers there are for third-party K-Cups that break their locks.

00:48:59

The more GM customers there are, as soon as it’s legal to break those locks.

00:49:02

So design that product.

00:49:04

Come and talk to us about how that product

00:49:06

can be designed in a way that it’s most likely to survive

00:49:08

a 1201 challenge.

00:49:10

If you’re doing a startup, you only have two

00:49:12

outcomes, right? Either you succeed or you die.

00:49:14

Either you become huge or you die.

00:49:16

So come talk to us about how

00:49:17

if you are commercially successful, you are most

00:49:19

likely to be legally successful. We’ll

00:49:22

give you the best advice we can.

00:49:24

Can’t guarantee it,

00:49:25

but we’ll talk to you about how you can be someone who has the best chance of surviving a

00:49:30

1201 challenge, and then you can treat every one of those companies’ margins as your opportunity.

00:49:35

As Jeff Bezos, in a rare moment of candor, once told the publishers, your margin is my

00:49:39

opportunity. The only reason you need to use digital locks is to maintain extraordinary margins

00:49:44

that marketplaces would otherwise erode.

00:49:46

Every one of those companies has a margin that’s your opportunity, right?

00:49:50

Whether that’s making a dongle that auto jailbreaks an iPhone and subscribes it to another software store that you’ve created

00:49:56

where you’ve gone and cherry-picked the thousand top apps out of the app store

00:49:59

and offered them a 15% commission instead of a 30% commission, right?

00:50:03

Or something like jailbreaking cars so that you can get third-party parts,

00:50:08

or jailbreaking tractors, or PVRs for Apple TV,

00:50:13

or any of those other things that are lawful to do, right?

00:50:18

Like, it is unequivocally not piracy for me to make a piece of software,

00:50:22

give it to you to sell to her.

00:50:24

That is not piracy,

00:50:25

right? It does break the DMCA, but it’s the kind of question we want judges to answer. Is it piracy

00:50:31

when someone who owns a copyright authorizes a second party to sell it to a third party?

00:50:37

Is that piracy? What topsy-turvy universe is it in which creators who hold copyrights authorizing vendors

00:50:45

to sell it to users is an act of piracy?

00:50:48

That’s exactly the

00:50:50

Humpty Dumpty question we want to put

00:50:51

in front of a judge.

00:50:54

Yeah?

00:50:56

Right.

00:50:58

Sorry, the question is about implanted

00:50:59

medical devices, and this is a particularly

00:51:01

rich field because it’s very visceral. I mean,

00:51:03

literally visceral, but also emotionally visceral. And if we could get a collusive lawsuit…

00:51:09

Talk about a death panel, right?

00:51:10

Yeah, talk about a death panel, exactly. I think a collusive lawsuit is still the wrong idea,

00:51:14

not because it’s not an urgent issue, but because collusive lawsuits are unlikely to get a legal

00:51:18

victory. That’s the problem with collusive lawsuits. It’s not that the issue isn’t serious,

00:51:21

it’s that the judge just won’t give you the judgment you want. But there are a lot of medical devices that have serious problems.

00:51:28

In the 1201 triennial at the copyright office,

00:51:31

there was a researcher who was a type 1 diabetic who doesn’t use a pump,

00:51:34

even though self-injecting is unquestionably shortening his life.

00:51:37

I mean, that’s really clear, right?

00:51:39

Humans are shitty lab techs.

00:51:41

So, like, if you rely on yourself to assay your blood sugar levels

00:51:46

and then measure out your insulin dose and then squirt yourself,

00:51:49

you will never do as well as a $15 microprocessor

00:51:54

in a little box that’s attached to you.
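[A toy sketch of why the microprocessor wins: it runs the same sample-compute-dose loop every few minutes without fatigue or guesswork. Every constant below is invented for illustration and is not medical guidance.]

def insulin_correction(glucose_mg_dl, target=110.0, sensitivity=50.0, max_units=2.0):
    # One pass of the feedback loop: compare the measured level to the
    # target, convert the excess into a dose, and clamp it for safety.
    excess = glucose_mg_dl - target
    if excess <= 0:
        return 0.0  # never dose at or below target
    return min(excess / sensitivity, max_units)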

00:51:56

But he still pricks his finger because he’s a medical researcher

00:51:58

and he’s looked at these things.

00:51:59

And he’s like, these are unsafe at any speed.

00:52:02

How do you raise people’s awareness?

00:52:04

Right, this is a really good wide issue, right?

00:52:07

It cuts across a lot of political boundaries.

00:52:09

People care about medical implants because, as you say,

00:52:11

we all know someone who’s alive today because of their medical implant.

00:52:14

And we can viscerally see that when those medical implants are compromised,

00:52:19

that the consequences are literally grave, right?

00:52:22

I think that in some ways, and very tragically, this is self-correcting,

00:52:26

in that there will be more and more horrible outcomes involving medical devices. And when

00:52:33

those horrible outcomes arise, people will become more alarmed about it. I’m kind of looking at,

00:52:38

this is a very Canadian metaphor, skating where the puck is going. I think that there will be a

00:52:42

steady drumbeat of infosec problems involving

00:52:45

things like implanted medical devices. I want to be the person who’s standing there going,

00:52:49

are you worried about last month’s horrific story about implanted medical devices?

00:52:53

I’ve got a political program that will help us solve this issue. Maybe. I don’t know. I don’t

00:52:59

know. But I think that you’re right. And this relates to privacy overall.

00:53:07

I think that we have not reached peak surveillance.

00:53:13

Our devices will add more surveillance and control to our lives as time goes by.

00:53:15

That’s clearly going to happen.

00:53:18

But I think what we have done is we’ve reached peak indifference to surveillance.

00:53:22

There will never be a time in which fewer people give a shit about this stuff, because from now on more and more people will be directly affected by it. Right? Ashley Madison, the Office of Personnel

00:53:29

Management, those were not, like, the zenith of leaks,

00:53:35

right? This was, like, the pre-game show of leaks, right? The Facebook leak, the Gmail leak, the,

00:53:43

you know, Hertz Rent-a-Car leak,

00:53:46

those are going to be the big shows,

00:53:47

and there will be people who really urgently care about this.

00:53:50

I think that in some ways drumming up anxiety about this problem is not the issue.

00:53:56

It’s being ready to capitalize on the anxiety when it arises.

00:54:00

And it’s true also of privacy technology overall.

00:54:03

I think that for a long time, getting people to give a shit

00:54:06

about privacy was the hard problem.

00:54:07

And then secondarily, it was making privacy technology

00:54:10

that normal humans could use.

00:54:12

Now the only problem is making privacy technology

00:54:14

that normal humans can use. Because the

00:54:16

pool of people who care about privacy

00:54:17

is only ever going to go up.

00:54:19

And so that’s all we need to do. How are we for time?

00:54:22

How do I get my fair share

00:54:24

of revenue as an artist, as a copyright maker?

00:54:27

So I’m, as you say, I made my living for the last 10 years as a novelist.

00:54:31

It’s how I pay my bills.

00:54:33

And I do worry about making as much money as possible.

00:54:37

But of all the ways that I fail to make money,

00:54:40

the most significant relate to negotiating positions between me and my publisher

00:54:44

and my publisher and the platforms that retail my products. The deadweight losses, as far as anyone

00:54:50

can quantify them, from piracy are pretty small, and I’m so far down the food chain

00:54:58

in terms of the actual royalty that clears to me out of every dollar spent on Amazon on my products, that even modest

00:55:08

piracy has almost no effect on my bottom line. What I think we need to make as our policy

00:55:14

priorities in terms of protecting artists is not making sure that a certain business model where

00:55:19

you get to control how your works are distributed or who distributes them or whatever is successful in the future.

00:55:25

I think what we need to do is design policy so that no matter what business model it is

00:55:30

that succeeds in the future, because the business models change, the technology changes,

00:55:34

that the first in line to get paid are artists.

00:55:37

And the second in line to get paid are the people who invest in our products,

00:55:39

my publisher or record label.

00:55:41

And the third are the people who format a file and put it on an

00:55:45

e-commerce system like Amazon, right? Because frankly, that should be a commodity role.
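[A worked example of that food chain, with purely illustrative percentages; real contracts vary widely. It shows how little of each retail dollar clears to the author, which is why modest piracy barely moves the author's bottom line.]

def clearing_royalty(sale_price, platform_cut=0.30, publisher_share=0.75):
    # Hypothetical split: the platform takes its cut off the top,
    # the publisher takes most of the remainder, the author gets the rest.
    platform = sale_price * platform_cut
    publisher = (sale_price - platform) * publisher_share
    author = sale_price - platform - publisher
    return platform, publisher, author

print(clearing_royalty(9.99))  # roughly 3.00 platform, 5.24 publisher, 1.75 author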

00:55:51

And right now, because of the DMCA, it goes the other way. So every time one of my publishers

00:55:56

sell, well, thankfully, none of my products are sold with DRM. But if I were with a publisher

00:56:01

that used DRM, every time Amazon sold one of my products locked with their DRM,

00:56:07

because only Amazon can authorize that to be unlocked,

00:56:10

if my publisher or I had a dispute with Amazon, like Hachette did, one of the big five publishers a couple of years ago,

00:56:16

and said to Amazon, we are going to advise all of our customers to start buying their books everywhere but Amazon.

00:56:21

We’re going to give them a tool to convert their existing libraries to run on everyone else’s platforms. Amazon could say, no, I’m sorry. Only we get

00:56:28

to authorize that. And we’re not going to authorize that. And what we saw with Hachette

00:56:32

is that they lost. And Hachette is not just a giant publisher. They’re actually a giant

00:56:37

arms dealer that owns a publisher. So you’d think that they would have read their Art

00:56:40

of War. But apparently they missed the fact that when you let Amazon alienate your customers

00:56:45

from you and arrogate your customers to themselves, that they’d get to control the commercial

00:56:49

relationship with the customer. So if we want to increase the share of income on every dollar

00:56:54

spent on an e-commerce platform to the creator and the publisher, we need to get rid of this

00:57:00

regime in which the platform vendor gets to control whether or not the lock can be unlocked.

00:57:05

So that’s step one. But step two is the DMCA’s takedown regime, where right now the DMCA

00:57:12

has basically increased the amount of liability that intermediaries have to assume in order to

00:57:21

make works available to the public. So when YouTube started, all you needed was a pile of hard drives and three people in

00:57:28

a garage and an unhealthy interest in video.

00:57:30

And now you need all of that plus $100 million worth of compliance software to do automated

00:57:35

copyright takedown, which has become the de facto standard for running a video hosting

00:57:39

platform.
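[A sense of what that compliance software does, reduced to a toy: fingerprint an upload and compare it against a reference index. Real systems use perceptual hashes that survive re-encoding and cropping; this exact-match sketch only catches byte-identical chunks.]

import hashlib

WINDOW = 4096  # bytes per fingerprint window; illustrative only

def fingerprints(data):
    # Hash fixed-size windows of the file into a set of fingerprints.
    return {hashlib.sha256(data[i:i + WINDOW]).hexdigest()
            for i in range(0, max(len(data) - WINDOW + 1, 1), WINDOW)}

def flag_for_takedown(upload, reference, threshold=0.5):
    # Flag the upload if enough of its chunks match the reference work.
    up, ref = fingerprints(upload), fingerprints(reference)
    return bool(up) and len(up & ref) / len(up) >= threshold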

00:57:40

What this means is that there’s not a lot of YouTubes around.

00:57:43

We’ve kind of reached maturity in the, like, “I’ll host your video in a really robust way for free” market, and the

00:57:50

only new entrants into the market are owned by other big companies that have the same

00:57:53

victory conditions like Microsoft. And what that means is that if my publisher says, here’s

00:57:58

the best deal I’m going to offer you, I can’t say, fuck you, I’m going to go do it on my

00:58:02

own out here with the independents, because the independents have become indistinguishable from the big five or the big four in records

00:58:09

or the big five in movie studios.

00:58:13

The new boss becomes the old boss when there’s not competition in that regime.

00:58:17

And by increasing the cost of entry, you decrease the competition.

00:58:20

And this has literally happened in the case of YouTube,

00:58:22

where YouTube started a competitor to Spotify and Pandora to stream music.

00:58:27

And they gathered the big four record labels into a room with them,

00:58:30

and they negotiated the terms on which they would license the big four labels’ music.

00:58:34

And then they went to the indies, the small labels,

00:58:37

and the independent artists who market their material through YouTube,

00:58:40

and they said, you will take the terms set by the big four,

00:58:44

or you can no longer

00:58:45

use YouTube at all to promote your products, right? The new boss has become indistinguishable

00:58:49

from the old boss. What we need, if we’re going to have consolidation in the publishing sectors

00:58:55

and kind of the investors in creative works, we need to have lots of independent options that

00:59:01

represent a competitor of last resort. The worst deal that the publishers

00:59:06

can offer me has to be better than the best deal that I rationally expect I can get for myself by

00:59:10

self-publishing. And when we increase the liability for all the services that enable self-publishing,

00:59:16

we decrease the quality of the deal that I get when I self-publish. I have to give more money

00:59:21

to all the intermediaries that handle my payments, my marketing, my formatting, my sales, and my fulfillment.

00:59:26

And so I want to make sure that we get the most money

00:59:29

into the pockets of the artist,

00:59:31

no matter which artists are successful in any given moment.

00:59:33

Remember that being a successful artist is a Six Sigma event.

00:59:36

Like most people who set out to make a living in the arts fail.

00:59:39

We should be interested in whoever succeeds

00:59:41

making as much money as possible.

00:59:42

And the way that you do that

00:59:43

is by decreasing intermediary liability and making it such that you can port your material from one

00:59:50

vendor to another without any liability or lock-in. And that means that my negotiating position is

00:59:55

always better. And so if my works are commercially successful, I can clear more money from every

01:00:01

sale. That’s, I think, the way that you do this. I don’t think you can make a reliable business model for artists because Six Sigma events don’t have

01:00:08

reliable business models. One last question from a woman. Yes, go ahead. So copyright,

01:00:14

the question is, will the DMCA fall at once or will it be in different domains, medical,

01:00:17

automotive, tractors, or whatever? So copyright, generally speaking, is what lawyers call

01:00:22

fact-specific. So copyright rulings tend to be very narrow and relate to a very specific set of circumstances.

01:00:30

But if we broadly found that source code, even when it circumvents, was protected under the First Amendment,

01:00:37

then we would have a really broad exemption.

01:00:41

It would probably only cover people who publish source code as well, as part of their product strategy, which incidentally means that people who make free and open source

01:00:47

software that can be audited by third parties without having to decompile it would have a

01:00:52

commercial advantage. Because if you made the Mozilla phone, you’d be able to add Netflix

01:00:56

support without Netflix’s permission because you publish your source code. Whereas if you made the

01:01:00

Apple phone, you wouldn’t because you don’t publish any of your source code. So that would

01:01:04

be a pretty interesting outcome generically. There would still probably be corner cases that were

01:01:08

intact in 1201. But one of the interesting things about this is that everywhere in the world,

01:01:13

there are 1201 analogs because the U.S. trade representative has made 1201 equivalents a major

01:01:19

priority in trade deals with the U.S., whether that’s the Australia-U.S. Free Trade Agreement or in Canada, where I’m from, Bill C-11 was negotiated

01:01:28

or was passed after exclusive consultation

01:01:32

with the U.S. trade representative

01:01:33

and American entertainment companies

01:01:35

and no consultation with Canadian artists

01:01:37

or Canadian entertainment companies.

01:01:39

And it also has similar prohibitions.

01:01:42

But all of these countries have essentially said,

01:01:44

we will not

01:01:45

allow our businesses to get into these profitable lines of work because America has covenanted that

01:01:53

they will not get into these profitable lines of work either. So it’s a suicide pact. But suicide

01:01:58

pacts are mutual. And when the U.S. stops enforcing its side of the bargain, once we start eroding it

01:02:04

through jurisprudence here,

01:02:06

then all of those other countries are vulnerable

01:02:07

to having their 1201 equivalents struck off the books

01:02:10

because you have this confluence of the interests of activists

01:02:13

who care about this stuff because they care about information policy

01:02:16

and entrepreneurs and industry who just see a profit opportunity in it.

01:02:20

And as we saw with the fight over SOPA

01:02:23

and as we saw with the net neutrality fight,

01:02:26

usually you can handicap the outcome of a policy debate based on who’s spending the

01:02:31

most money. But as soon as you have the interests of activists aligned with one industry body,

01:02:35

it becomes indeterminate. You don’t know how it’s going to come down until the coin lands

01:02:40

because activists are a wild card. So I think that that’s going to be, that is a thing

01:02:46

where we can make a change everywhere in the world if we can get rid of it here. Thank you very much.

01:02:58

You’re listening to The Psychedelic Salon, where people are changing their lives one thought at a time.

01:03:11

Over the course of the past few years, I’ve actually read about many of the things that Cory talked about here. But by putting them all together as he has done in this talk,

01:03:16

well, he just blew me away. You know, it’s as if I’ve been asleep for the past 10 years.

01:03:21

This is a really important talk, not just for you, but for all

01:03:25

of your friends as well. So be sure to make a copy of this podcast and pass it around. You know,

01:03:31

it’s free. There’s no digital lock on it, and the Creative Commons copyright agreement that’s posted

01:03:37

with it allows you to make all of the copies that you want. So I urge you to do just that.

01:03:41

Take a copy to your teachers if you’re still in school, and give copies to your children and grandchildren if you have any.

01:03:47

The longer we go on as we have been, just whistling past the graveyard more or less,

01:03:53

well, the more difficult it’s going to be for any creative work to progress in the future

01:03:57

that isn’t owned and controlled by a rich corporation.

01:04:01

You actually need to do something about this yourself.

01:04:04

And you can begin by going to EFF.org, clicking their Take Action link, and becoming involved. This isn’t

01:04:27

something that you can put off until a better season. The threat to your freedom is real,

01:04:32

and you can’t count on anybody else to protect you. This is a do-it-yourself situation,

01:04:37

so go do something. And to begin a discussion among us salonners, I’ve started a new forum

01:04:43

on our Find the Other site, which you can get to

01:04:45

from our salon’s main site,

01:04:48

which is psychedelicsalon.com.

01:04:50

The new forum is called

01:04:52

Digital Freedom.

01:04:53

And, as you know, we are now in our final

01:04:55

week of free lifetime charter

01:04:57

subscriptions to that site.

01:04:59

And beginning November 1st, for everybody

01:05:02

except students and former Pledge Drive

01:05:04

contributors, there’s going to be a $12

01:05:06

a year charge to participate

01:05:08

in the forums. And hopefully there’s

01:05:10

going to be enough people who join each year that

01:05:12

I won’t have to do any more of those

01:05:14

annual Pledge Drives to cover

01:05:15

the expenses of producing and publishing

01:05:18

these podcasts. Actually,

01:05:20

there have already been hundreds of salonners

01:05:22

who have joined us online, and I

01:05:24

hope that you’re going to do so as well.

01:05:26

For now, this is Lorenzo signing off from Cyberdelic Space.

01:05:30

Be careful out there, my friends. Thank you.