Program Notes

Follow Lorenzo on Patreon.com
https://www.patreon.com/lorenzohagerty
Guest speakers: John Perry Barlow, John Gilmore, Cory Doctorow

John Perry Barlow (photo source: EFF.org, https://www.eff.org/)

Date this lecture was recorded: November 17, 2006

Today we pay tribute to the late John Perry Barlow, lyricist for the Grateful Dead, co-founder of the Electronic Frontier Foundation, and an important leader in the struggle to keep the Internet free. In addition to John Perry’s remarks, we will also hear John Gilmore, another of the co-founders of EFF.org. One of the most important essays written by Barlow is his Declaration of the Independence of Cyberspace, and at the end of this podcast you will hear John Perry Barlow reading that declaration himself.

“If information is power, then the public needs to have more of it than the government, or the public will not be able to control the government.”
-John Gilmore

“I think the city-state is going to have the biggest Renaissance since the Renaissance.”
-John Perry Barlow

 
John Perry Barlow Library
A Declaration of the Independence of Cyberspace
by John Perry Barlow

Electronic Frontier Foundation

Download a free copy of
Lorenzo’s latest book
The Chronicles of Lorenzo - Volume 1

https://lorenzohagerty.com/freebooks/


Transcript

00:00:00

Greetings from cyberdelic space.

00:00:20

This is Lorenzo, and I’m your host here in the Psychedelic Salon.


00:00:35

And, as I’m sure that you already know, the worldwide psychedelic and tech communities have lost a man who was a giant among us, John Perry Barlow.

00:00:42

Although our longtime fellow salonners are quite aware of all that John Perry has done for us,

00:00:45

if you are new to the tech or the psychedelic scenes,

00:00:48

well, don’t worry, because by the end of this podcast,

00:00:52

you’re going to know a great deal about the life and work of this dear man.

00:00:56

Now, I don’t mean to imply that I knew him really well.

00:00:59

The fact is, we only met on a few occasions.

00:01:03

However, I clearly remember that first time that we met.

00:01:10

It was in 2003 at Burning Man, and he attended the talk given by Allyson and Alex Grey,

00:01:14

and it was the first of the Palenque Norte lectures that year.

00:01:21

And after the Greys’ talk, John Perry stopped by our camp and spent several hours visiting with some of us under our shade structure.

00:01:25

And to be honest, well, I was awed just to be able to talk with him, because in the circles that I moved in, John Perry Barlow was already a legend.

00:01:32

For one thing, he first came to my attention as a lyricist for the Grateful Dead. And while I

00:01:38

wasn’t a dedicated deadhead myself, well, I did like their music and owned several of their CDs.

00:01:44

But how John Perry first

00:01:46

captured my attention was when he published his Declaration of the Independence of Cyberspace.

00:01:52

And if you’ve never read that essay, well, you should. It’s really that important. And to be

00:01:57

sure that you don’t miss it, at the end of today’s podcast, I’m going to play a recording of John

00:02:02

Perry reading it himself.

00:02:11

Now, at the time he published his declaration, the Internet was still almost unknown to most people.

00:02:19

Back then, even people who had access to the net spent less than one hour per month surfing the web.

00:02:24

And my job back then was to travel to conferences across the U.S. and Europe and speak about what a truly marvelous thing the Internet was going to be.

00:02:29

In essence, I was the Internet evangelist for the company now known as Verizon.

00:02:34

And in my talks, I read excerpts from the Declaration, which begins,

00:02:40

“Governments of the industrial world, you weary giants of flesh and steel, I come from cyberspace, the new home of mind.” And as you may guess,

00:03:03

well, that wasn’t really well received by those in the upper reaches of management.

00:03:09

But those on my level understood.

00:03:12

And many of those young women and men that I met at conferences are now leaders in various tech fields.

00:03:18

And hopefully a few of them still remember those rousing words of John Perry’s.

00:03:22

And they are continuing the struggle to keep

00:03:25

the internet free.

00:03:26

But I digress.

00:03:28

When I first learned of John Perry’s death last week, I began looking for some of his

00:03:33

talks that I could possibly use here in the salon.

00:03:36

While he had been in the audience of the first series of the Palenque Norte lectures at Burning

00:03:40

Man, and he was one of us, still I never found a way to work in some of

00:03:46

the more tech-oriented and political talks that he had given, but this time my search was more

00:03:52

than fortuitous. I found the talk that we are about to listen to on archive.org. It was given

00:03:58

on November 17, 2006, at the University of Southern California, and there were only about 50 people in the room.

00:04:07

Additionally, there was a second speaker, John Gilmore, who you’ve heard from several times here

00:04:12

in the salon, and on top of that, this two-man panel was introduced by Cory Doctorow, another

00:04:18

Palenque Norte speaker, whom you’ve heard here in the salon several times as well. So, how could I resist playing this talk?

00:04:27

Obviously, I couldn’t.

00:04:29

Now, if you’re under 30 years old,

00:04:32

much of what you’re going to hear about the early history of cyberspace

00:04:35

that John Perry begins with

00:04:37

is probably going to be new information to you.

00:04:41

But if you’re a dusty old fart like me,

00:04:43

you’ll remember these stories about the early government raids on hackers as something that you had to live through yourself.

00:04:50

In my recent book, The Chronicles of Lorenzo, one of my stories mentions the fact that even if we aren’t fully aware of it,

00:04:58

the news from our towns, our countries, and the rest of the world seeps into our minds like elevator music.


00:05:10

And in some cases, it can even provoke subliminal suggestions that involve a wide range of emotions.

00:05:16

My phrase for this is, history is the background music of our lives.

00:05:22

So now let’s listen to a little of what was once background music to many of us as Cory Doctorow opens the evening’s discussion.


00:05:34

We do have two fantastic speakers tonight, John Perry Barlow and John Gilmore, two of the Electronic Frontier Foundation’s co-founders and two real giants of technology, cyber liberties,

00:05:40

the internet, the personal computer, software, as well as numerous causes now.

00:05:49

Gilmore couldn’t fly down here because he’s embroiled in a legal action

00:05:53

that’s headed, we hope, towards the Supreme Court

00:05:56

over the right to fly anonymously and the requirement for the TSA

00:06:00

to show the content of the rules that govern whether you can or can’t fly anonymously.

00:06:07

So one of my students, Andy, and I flew out to San Francisco yesterday and drove down.

00:06:12

He’s taking the night bus home tonight.

00:06:16

Barlow, in addition to co-founding EFF and writing some of the Grateful Dead’s best-loved songs

00:06:21

and the Declaration of the Independence of Cyberspace, also co-founded Earth First and Greenpeace.

00:06:30

I didn’t co-found Greenpeace.

00:06:31

You didn’t co-found Greenpeace?

00:06:32

No, no.

00:06:32

Co-found Earth First. I beg your pardon.

00:06:34

I had a lot to do with it.

00:06:35

Had a lot to do with it.

00:06:36

And Gilmore helped found one of the first ISPs

00:06:40

and helped write probably the most widely used compiler

00:06:44

and funds

00:06:45

a lot of drug activism, the right

00:06:48

to control your own state of mind

00:06:50

and therefore choose

00:06:52

what you’re going to ingest and what it does

00:06:54

to you, among many numerous other causes.

00:06:56

And between the two of them, they’re two of the most

00:06:57

inspiring and interesting speakers I’ve ever heard.

00:07:00

And you came here to hear them

00:07:02

and not me, so I’m going to sit down now

00:07:04

and let them take it.

00:07:05

Thank you very much for coming.

00:07:13

Well, this will be a strange thing to say in this particular context,

00:07:18

but one of my objectives in life is to eliminate broadcast media.

00:07:25

So I’m not terribly interested in being one,

00:07:28

and I’m not sure that John is either.

00:07:30

So to the extent that we can make this a conversation,

00:07:33

and I see it’s a fairly large group to have a conversation with,

00:07:37

this is what we would aim to do.

00:07:41

There are, I mean, over the course of the time

00:07:43

that he and I have been working together,

00:07:45

which is 16 years,

00:07:49

a lot has happened in cyberspace.

00:07:53

I mean, actually, it hadn’t even been named when we first started dealing with it.

00:08:00

In fact, people didn’t think of it as a place of any sort or an environment of any sort at that time.

00:08:07

So even though 1988 seems like a reasonably recent time in normal human terms,

00:08:15

it’s like the Pleistocene as far as the Internet is concerned.

00:08:21

So over the course of that time, there have been a great many issues that we’ve dealt with. And we can go through some of the early history of the EFF

00:08:32

for those of you who don’t know about it, you know, what the issues were that caused

00:08:36

us to form it in the first place. Just to set the parameters here, I will tell you just as an executive summary that the EFF exists so that

00:08:50

your descendants will have the right to know regardless of where they are on this planet.

00:08:58

And that if one of them wants to say something, he will be able to, or she will be able to.

00:09:03

And if somebody else wants to hear what that person said,

00:09:07

they will be able to. It’s as simple as that. I mean, we are right at the precipice of a golden age

00:09:15

in human history, where everything can be known by anybody who’s interested, thanks to the Internet.

00:09:20

And we believe that. And EFF exists to try to make certain that the underlying architecture

00:09:27

of this great room in which all humanity is gathering

00:09:33

will go on being an open room

00:09:36

and not one that is filled with chambers and hierarchies and powers that make it difficult for the know-nots to go on knowing not.

00:09:52

Too much power has been distributed over the course of time by the ability to control information.

00:10:01

Liberty has been seized

00:10:03

against the notion that there were

00:10:07

certain things that were dangerous

00:10:09

to know.

00:10:11

We don’t believe that anything is dangerous

00:10:13

to know. There are dangerous things to do.

00:10:17

But we believe that nothing

00:10:19

is inherently dangerous to know.

00:10:23

Nothing.

00:10:27

We didn’t have such a clear notion

00:10:29

back when we started EFF.

00:10:31

I mean, in the very beginning,

00:10:33

I don’t think we were thinking about an organization at all.

00:10:38

I was a cattle rancher in Wyoming.

00:10:40

John at least had the advantage

00:10:41

of having been on the Internet since, what?

00:10:47

I was a, you know,
00:10:50

a Johnny-come-lately.

00:10:51

I was raising cows

00:10:53

and not thinking about these things

00:10:55

until ’85.

00:10:56

I wasn’t allowed to be on the internet.

00:10:58

No, but you were there.

00:11:01

Actually, I wasn’t allowed to be on the internet

00:11:03

when I got on either.

00:11:04

But, you know, it was easy enough to do.

00:11:09

You didn’t necessarily have to be a defense contractor, though they preferred that.

00:11:16

But the first time I got online, it was a place that had probably 200,000 people on it, if that.

00:11:29

But I knew, and I think John had known well before that,

00:11:33

that this was going to expand as exponentially as it has

00:11:39

to include every man, woman, and child on the planet sooner or later.

00:11:43

And we would all have to be in there together

00:11:46

without any of the usual defining constraints of sovereignty,

00:11:53

borders, legal systems of one sort or another,

00:12:00

languages, religions, political systems, and that there was going to be an

00:12:08

inevitable amount of friction. Furthermore, you could see from the very beginning that

00:12:15

if this thing was going to do what we thought it was going to do, that it would cause a

00:12:19

fundamental renegotiation of all the existing power relationships on the planet,

00:12:25

which it is now in the process of doing.

00:12:29

And people have various longstanding agreements with themselves and the rest of their elites

00:12:38

that they have some right to these authorities that they may not inherently have

00:12:43

when the world becomes a meritocracy

00:12:47

of thought, a collective organism of the human genius.

00:12:53

But it all started out much simpler than that.

00:12:57

Back in about 1980, well, I first met John, actually, at a hackers conference, which was not hackers in the present sense of the term.

00:13:08

It was people like Steve Wozniak and the folks who actually created all this technology originally.

00:13:16

And I was just suddenly, I was getting out of the cattle business.

00:13:21

I was suddenly interested in this other thing that was happening that I’d seen online.

00:13:26

And I wanted to find out who these wizards were who were making it.

00:13:30

So I became interested in that fashion.

00:13:33

I was interested in the deadheads, oddly enough.

00:13:38

I mean, the people who followed the Grateful Dead.

00:13:40

I wanted to understand.

00:13:41

I lived in a little agricultural town in Wyoming.

00:13:44

And I wanted to know what was going to happen to community in America after all those little

00:13:49

towns went away. And I was looking at other forms of emergent community. And I thought,

00:13:55

well, the deadheads might be one way of looking at it. But because I wrote songs for the dead,

00:14:01

it was difficult for me to study them without sort of the Heisenberg sociology effect of,

00:14:06

you know, as soon as I came around,

00:14:07

that I wasn’t getting a straight read.

00:14:10

And somebody suggested, well, you could get online.

00:14:12

There are a lot of them online.

00:14:14

I didn’t know what that meant.

00:14:17

There were Usenet newsgroups already of deadheads.

00:14:22

So I got an account, which was, in those, trying to get online was a, you know,

00:14:28

you had a 300-baud modem, and, you know, it was heavy lifting, a lot of Hayes command codes.

00:14:33

An exercise in futility.

00:14:37

Oftentimes. And, um, but as soon as I got there, I thought, well, yes, there is a community of

00:14:43

these people, and this is really interesting.

00:14:47

But there’s a community of a lot of other folks,

00:14:51

and they’re just mostly talking about bits and bytes and how to make the system work.

00:14:53

They’re not talking about the politics.

00:14:55

They’re not talking about the economics.

00:14:57

Or even about music.

00:15:00

Or even about music that much, no.

00:15:07

Or certainly none of the issues that now preoccupy us with the film industry and copyright and all these issues.

00:15:10

That hadn’t really come up yet.

00:15:12

Yeah, it’s actually a little surprising that it took like 20 years of the ARPANET

00:15:19

and Internet’s existence and 10 years of the public Internet

00:15:22

before it occurred to very many people to start moving music on it.

00:15:26

Right. Well, there wasn’t much bandwidth. I mean, how much music are you going to move

00:15:31

on a 300-bot modem? I mean, unless you…

00:15:34

Overnight?

00:15:35

Yeah, over the course of a week, you know what I mean?
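That “over the course of a week” remark holds up to a quick back-of-the-envelope check. A sketch in Python, assuming a 300-baud line moving roughly 30 bytes per second and a hypothetical 30 MB file for a concert-length recording (both figures are illustrative, not from the talk):

```python
# Rough transfer-time estimate for a concert tape over a 300-baud modem.
# Assumptions (illustrative): 300 baud ~ 300 bits/s on the line, and
# ~10 line bits per delivered byte (8 data bits plus start/stop framing).

BAUD = 300                   # line rate, bits per second
LINE_BITS_PER_BYTE = 10      # 8 data bits + start/stop bits
FILE_BYTES = 30_000_000      # hypothetical 30 MB concert-length recording

bytes_per_second = BAUD / LINE_BITS_PER_BYTE   # 30 bytes/s
days = FILE_BYTES / bytes_per_second / 86_400  # seconds -> days

print(f"about {days:.0f} days")  # about 12 days
```

At 30 bytes per second, a 30 MB tape works out to roughly 12 days of continuous transfer, so “over the course of a week” was, if anything, optimistic.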

00:15:38

You mean just like dead tapes, right?

00:15:40

Yeah.

00:15:41

Or were there other kinds of music?

00:15:43

Well, there were all kinds actually in those days

00:15:47

people coordinated the trading of dead tapes

00:15:49

through the mail

00:15:50

by sending email to each other

00:15:51

that was what they were mostly talking about

00:15:53

You got an Omaha ’73, man? I’ll send you one for a Rochester, you know, ’87.

00:16:13

So I thought, look, you know, I don’t know anything about this stuff,

00:16:16

but nobody else is writing about it, so I’ll start writing about it,

00:16:18

just because it’s interesting to me.

00:16:22

And I’d established myself somewhat as a writer on this subject, and then Harper’s Magazine did something that was incredibly prescient. They decided to have a forum about hacking and

00:16:31

cracking and privacy and freedom of expression online in 1989. And they included me and Mitch Kapor, my colleague at EFF, and several other folks, including

00:16:48

a couple of computer crackers named Phiber Optik and Acid Phreak. Phiber Optik and Acid

00:16:55

Phreak seemed like the worst desperados I’d ever seen, or hadn’t seen, because they were

00:17:01

invisible to me. But they were coming on awful strong, and, you know, they just seemed like a couple of godless little nihilists. I mean, I was having

00:17:09

sort of an old hippie response to them. And, um, at one point I irritated them in this forum by saying,

00:17:18

you know, if somebody took away your modems and gave you skateboards, that wouldn’t make a nickel’s worth of difference. And this insulted them because it was

00:17:26

partly true.

00:17:28

So they

00:17:28

downloaded my entire credit record

00:17:31

into the conference.

00:17:35

And

00:17:35

suggested, erroneously

00:17:37

as it turned out, that they could change it

00:17:39

permanently to my

00:17:41

everlasting disadvantage. I mean, I’d be standing

00:17:44

in money order queues at the post office for the rest of my life.

00:17:48

That’s scary, you know.

00:17:50

I mean, I’ve been in police custody on acid, and I wasn’t as scared as I was of those two guys.

00:17:56

And so I emailed them, and I said, look, you know, we’ve exceeded the bandwidth at this medium, I think,

00:18:03

and I’d like to have a phone conversation with you,

00:18:06

and I won’t insult your intelligence by giving you my phone number,

00:18:09

because I knew that they would hack it out of the system immediately

00:18:12

just to prove that they could.

00:18:15

And surely in 15 minutes I was talking to them.

00:18:19

Now, their voices hadn’t even changed yet.

00:18:22

They were a couple of little pencil neck geeks

00:18:25

who were basically trying to violate the forbidden.

00:18:28

I mean, they actually wanted to violate another forbidden,

00:18:30

but they hadn’t quite gotten up to that yet.

00:18:34

And they were also, in fairness,

00:18:38

trying to create their own internet

00:18:41

using the telephone network.

00:18:43

They were hacking into the phone system

00:18:44

and creating their own people’s internet

00:18:47

since they didn’t have access to the real one.

00:18:50

And they had something called the Legion of Doom.

00:18:54

And shortly enough, I found myself like the scoutmaster

00:18:57

to the Legion of Doom, which is, you know,

00:18:59

life takes odd turns.

00:19:02

But I liked these kids.

00:19:09

And then a series of events that I became familiar with through them and through other sources started to occur. There was a games outfit, a role-playing

00:19:17

game outfit in Austin, Texas named Steve Jackson Games. And one day the Secret Service came

00:19:23

in and took everything in the office, all

00:19:26

the computers, everything, because they were producing a role-playing game called Cyberpunk,

00:19:32

which the Secret Service thought was a handbook for computer crime.

00:19:37

Just to clarify here, games in those days came on paper.

00:19:40

Yeah, it was just a book. But they didn’t know that.

00:19:45

And maybe a game board.

00:19:46

The Secret Service was completely clueless about this.

00:19:52

And that disturbed me.

00:19:54

It sounded like it was kind of an overbroad search.

00:19:58

And then there was a kid named Craig Neidorf in Illinois

00:20:01

who had suddenly been hit with heavy felony charges for having

00:20:07

published in his online magazine a document called the 911 document, which was an extremely

00:20:15

abstruse description of the way the 911 system worked in the Bell system.

00:20:22

You know, I mean, this was better than chloroform in print.

00:20:25

I mean, if you could read this whole document without going to sleep,

00:20:27

you’d have to work for AT&T.

00:20:31

But furthermore, it was readily available from Bellcore.

00:20:36

But somebody had hacked into a system in Georgia

00:20:40

and taken it out as a trophy, you know,

00:20:42

just a coonskin to nail to his hacker door.

00:20:46

And Craig Neidorf had republished it, and he was suddenly being charged with, you know,

00:20:53

serious crimes and publishing, you know, secrets vital to the safety of the United States.

00:21:01

I still wasn’t paying much attention to it.

00:21:04

Then I get a phone call from Acid Phreak,

00:21:07

and I find out that he’s come home

00:21:08

and found his 12-year-old sister being held at gunpoint

00:21:13

by several large men from the Secret Service.

00:21:18

While they remove every electronic item in the house,

00:21:22

including clock radios,

00:21:24

and every bit of magnetic media, including his Metallica tapes.

00:21:30

And he’s scared.

00:21:32

And I think, well, now wait a second.

00:21:34

You know, maybe these kids are worse than I think.

00:21:36

I mean, maybe they’re doing something really bad.

00:21:39

Seemed unlikely.

00:21:41

But nevertheless, it seemed like they were coming down with such maximum law that it was possible.

00:21:47

So I got more concerned.

00:21:51

And then there was a break point where I got a phone call from Special Agent Richard Baxter in Rock Springs, Wyoming, from the FBI.

00:22:01

And I knew Agent Baxter.

00:22:02

I mean, when I was in the cattle business, he’d helped me get some cattle back

00:22:06

that had been stolen.

00:22:07

He was a good hand with cattle rustling, actually.

00:22:09

There’s probably only one FBI guy in Wyoming,

00:22:11

in western Wyoming.

00:22:13

He was it.

00:22:14

And, you know, I’d always gotten along with him.

00:22:18

He calls me up, and he’s nervous as hell,

00:22:21

and that’s always a bad sign

00:22:23

in a friend.

00:22:24

And then he says, I need to talk to you, and I can’t talk to you over the phone.

00:22:29

I really don’t like that.

00:22:31

So he’s going to drive 100 miles north to talk to me in person.

00:22:36

So, you know, and I’m still heavily affiliated with the Grateful Dead where, you know, crimes have taken place.

00:22:44

I don’t consider them crimes, but there are certain

00:22:47

members of the federal government who do. So I’m nervous about what it’s going to be. And he comes

00:22:52

up and he’s just, he’s quaking with anxiety about this. And I finally get him calmed down.

00:23:01

And he tells me that he’s investigating something called the New Prosthesis League.

00:23:05

Well, actually it called itself the New Prometheus League,

00:23:07

but this was only the beginning of Agent Baxter’s misinformation about this thing.

00:23:13

The New Prometheus League had taken some of the source code from the Apple Macintosh ROMs

00:23:18

and was shipping it around on floppy disks.

00:23:21

Tells you how long ago this was.

00:23:24

As a protest against Apple’s

00:23:26

closed and proprietary architecture.

00:23:31

But what Agent Baxter believed was that what was really happening was that they were shipping

00:23:36

around the recipe to the Apple secret sauce, and that if this got shipped around

00:23:42

in general, this is what Apple had got him to believe,

00:23:46

if this got shipped around in general,

00:23:49

then the Taiwanese would be turning out Macintoshes and it would be the end of the great American industry.

00:23:51

He was there to protect our industrial interests.

00:23:55

But I had to spend two hours explaining to him

00:23:57

what source code was, what a ROM chip was,

00:24:00

what the crime was, if indeed there was a crime,

00:24:03

before I could even start to tell him

00:24:04

why I was not likely the person who had committed it.

00:24:09

And I thought, this is not a good sign.

00:24:12

Because every time I see some well-armed, insecure guy

00:24:16

wandering around a place he does not understand,

00:24:20

with the likely possibility that there are more where he came from,

00:24:23

I think it’s something you kind of have to start thinking about and dealing with.

00:24:27

And it also put what was happening to my young friends in the Legion of Doom

00:24:32

into another perspective.

00:24:36

So I wrote something about it, and I put it on a bulletin board called The Well,

00:24:39

which was kind of a salon for writers and early digerati,

00:24:45

where it was read by Mitch Kapor, who had founded Lotus.

00:24:51

Now, Mitch, as it happened, had been visited by the FBI

00:24:54

in connection with the same thing and had been fingerprinted.

00:24:58

Now, he’s president of a fairly large company,

00:25:01

and suddenly the FBI is in his office fingerprinting him.

00:25:04

He’s not used to this stuff, and he’s a little nervous anyway.

00:25:07

So he doesn’t even tell his wife about it.

00:25:10

This happened to my girlfriend, too.

00:25:12

She worked at Apple at the time, and they fingerprinted her

00:25:14

to see if she had, if any of her fingerprints had showed up on any of those floppies,

00:25:20

like maybe she was the one who stole it.

00:25:22

Right.

00:25:22

And she was really nervous about this,

00:25:24

because she worked in a lab at Apple that just had a stack of spare floppies. You know, anybody could

00:25:31

pick them up, use them, put them back, whatever. So her fingerprints could have been on all those

00:25:35

floppies. So anyway, I wrote something about this, and I put it on The Well, and Mitch reads it

00:25:42

and says, aha, suddenly I have a support group. I mean, this weird thing

00:25:47

has happened to me that is very disturbing to me. And now I know there’s somebody else out there

00:25:51

that has had the same weird thing happen. So the next day he happens to be flying his biz jet

00:25:57

across the country. And he calls me up from over North Dakota and asks if he can land a Canadair Challenger at Pinedale Airport, which, as it happens, one can.

00:26:10

And I said, sure.

00:26:11

And he just literally dropped out of the sky.

00:26:16

And we sat down.

00:26:18

We spent the entire afternoon going over the things that I knew about Craig Neidorf, about Steve Jackson Games, about the Legion of Doom.

00:26:27

And he got increasingly upset. And toward the end of the afternoon, we called up Rabinowitz

00:26:34

Boudin, which was a, you know, hotshot First Amendment firm in New York that had successfully handled the Pentagon Papers case.

00:26:49

And they thought that there was maybe something fishy going on. I mean, they could see that there

00:26:53

was prior restraint going on, that there was overbroad search going on, that, you know,

00:26:57

that if the Constitution applied to this environment, to these kinds of media,

00:27:07

which it presumably ought to, then there were problems.

00:27:14

So at that point, Mitch and I thought, well, all we’re going to have to do is bring a few cases here,

00:27:19

you know, slap them around a bit, get everybody to understand that, you know,

00:27:25

the Constitution does apply to digital media, you know, and dust our hands off in satisfaction and go on.

00:27:30

Not realizing, for example, that, you know, in cyberspace, the Constitution is a set of local ordinances.

00:27:37

You know, it doesn’t actually apply anywhere in this environment.

00:27:39

Not realizing a lot of things yet.

00:27:45

But we did start to initiate legal action and started to make some fuss.

00:27:49

And at this point, John, who I knew from the hackers conference,

00:27:55

whom I had not thought of as a man of any particular resources, since the only time I’d ever seen any of his resources,

00:27:58

it was like the worst motorcycle I’d ever taken a ride on.

00:28:03

You know, it had been in the rain,

00:28:06

and the cover was off the back of the seat.

00:28:08

Oh, yeah, yeah.

00:28:10

So you sit on the back,

00:28:12

and you’re wet for a while.

00:28:18

But I get this email from John.

00:28:20

He says, well, I may not have the same resources as Mitch,

00:28:23

but would $100,000

00:28:25

help? And I wrote back and I said, yes, because I could see that these cases were actually

00:28:33

going to get sort of complicated, which they did. And at that point, we realized that we

00:28:38

actually had to have an organization, that it was not just going to be Mitch and John

00:28:41

go off to cyberspace, you know, like some kind of frontier duo,

00:28:46

like the Lone Ranger and Tonto.

00:28:49

I guess I would have been Tonto.

00:28:52

And, you know, establish peace, justice, and the American way.

00:28:55

It was going to be trickier than that.

00:28:58

So that was kind of how we got started.

00:29:01

And at first, you know, we didn’t really know what we were doing

00:29:06

except for the fact that we wanted to make sure that this thing started out free

00:29:12

and stayed free as long as we could make it.

00:29:15

Because it had been made to be free.

00:29:18

The people who designed the Internet understood that its inherent characteristic

00:29:23

was a kind of anarchy.

00:29:28

I mean, I once had a conversation with a guy named Paul Baran

00:29:30

who invented packet-switched networking,

00:29:33

which is the basis for the Internet.

00:29:37

Well, he was working at the RAND Corporation,

00:29:39

and it was widely thought that he had done this

00:29:41

because he was supposed to be coming up with a command and control system

00:29:45

that could not be decapitated by nuclear attack.

00:29:48

And I asked him one time, I said,

00:29:50

were you trying to come up with something that simply couldn’t be decapitated by nuclear attack?

00:29:54

And he said, no, I was trying to come up with something that didn’t have a head.

00:30:00

And he gave me that sly look, as, you know, secret anarchists do,

00:30:04

when they, you know, they think about how cool it would be not to have a head.

00:30:12

And so all those guys had had some sense of this, the people who’d been designing it originally.

00:30:25

But we could see that now that it had been discovered by the powers that had been,

00:30:30

that there were going to be Agent Baxters up the gazitch.

00:30:31

And there were.

00:30:35

I mean, immediately, the Secret Service was all over everything.

00:30:36

The FBI was all over everything. Fortunately, they were stumbling around with such massive incompetence

00:30:39

that we were generally spared their despotism by that.


00:30:45

But… There was also a turf war, where the Secret Service,

00:30:51

which was only responsible for protecting the president

00:30:53

and dealing with counterfeiting, decided,

00:30:55

well, here’s a new area we could expand into, the Internet.

00:30:58

Right. Big new budget.

00:30:59

The FBI, you know, can’t do that. That’s our job.

00:31:03

So they had to make a big splash by running around and arresting a lot of people.

00:31:06

Right, seeing who could arrest the most people on the most spurious counts.

00:31:11

Well, and get the biggest news headlines.

00:31:13

And get the biggest news headlines and scare the hell out of the American people

00:31:16

who were suddenly getting really agitated about this stuff.

00:31:21

And we won just about everything we took on in the beginning.

00:31:25

In fact, we did win everything we took on in the beginning.

00:31:29

But we could see that there was a whole lot more to come

00:31:32

because down in D.C.,

00:31:36

all the traditional media companies,

00:31:39

Time Warner and, you know, not traditional media companies like AT&T.

00:31:43

You mean Time and Warner.

00:31:44

Yeah, Time and Warner. Yeah, at that time, Time and Warner.

00:31:48

were cooking up something that they called the information superhighway,

00:31:53

which was actually interactive television,

00:31:55

which would be about as interactive as having a buy button on your channel clicker.

00:32:01

But this would supplant the Internet.

00:32:03

This would be better. And, you know, the Internet

00:32:05

was basically regarded as being like ham radio. It was not to be taken seriously. And we,

00:32:12

so we moved our offices to D.C. to try to start fighting the regulatory processes that

00:32:19

were going to, you know, enable the information superhighway to kill the Internet.

00:32:34

At a certain point in there, we did realize that the Constitution was a local ordinance and that all these regulations were also somewhat local in nature.

00:32:41

And that the real issue was, I mean, Mitch just casually dropped this line at one point

00:32:47

that was our watchword and has been ever since.

00:32:50

Architecture is politics, he said,

00:32:53

by which he meant that if you wanted to assure rights in cyberspace,

00:32:59

you couldn’t count on the law to do it anymore.

00:33:01

You had to count on the architecture of the network and keeping

00:33:06

that open to all manner of transmissions and keeping people, or more often institutions,

00:33:14

from dividing it up into zones where some things were permissible and some things weren’t

00:33:21

because we knew intuitively that if you can control any part of the internet in any fashion

00:33:26

you can control it all

00:33:28

I would say that’s still

00:33:32

an accurate statement.

00:33:33

No, it’s like the Chinese

00:33:36

can’t control the whole Internet, though they control

00:33:38

part of the one in their

00:33:40

region. But the idea here is,

00:33:41

if you get legal

00:33:44

leverage over the Internet,

00:33:45

then that starts getting, you know, you give them an inch and they take a mile. So we wanted

00:33:51

to stop that first inch as long as possible. They could start controlling the technology.

00:33:56

They could start controlling the way in which the architecture worked, the way in which

00:34:00

routers worked. They could impose technological considerations that would, you know.

00:34:10

So, you know, there are many things that have happened since then.

00:34:12

You know, it’s gotten more and more complex.

00:34:17

And, you know, part of the reason I want to have this conversation with you is because there are so many things that we can talk about,

00:34:20

whether it’s federal wiretapping, which we are intimately engaged in,

00:34:27

cryptography, which John spearheaded our work in

00:34:35

by getting us to take on the Bernstein case,

00:34:39

which proved that cryptography algorithms were a form of speech

00:34:42

and trying to stop strong cryptography

00:34:45

was tantamount to prior restraint on free speech,

00:34:50

which enabled, by the way,

00:34:54

the formation of real business on the net

00:34:56

because without strong cryptography,

00:34:58

nobody was going to trust financial transactions there.

00:35:02

Mitch and Jerry Berman and Dave Farber and I started something called

00:35:08

the Commercial Internet Exchange, which lobbied the National Science Foundation, which had

00:35:15

taken over the Internet from the Army, or from the Department of Defense, and was running

00:35:20

it on the condition that no private traffic, no commercial traffic could pass over, which was poppycock, of course.

00:35:28

But we felt like it was necessary for there to be commercial traffic and for there to be private carriers and private ISPs.

00:35:38

And so we banded together with several budding private ISPs, including UUNet, PSINet, and several others,

00:35:45

and got the National Science Foundation to privatize the Internet.

00:35:51

I mean, there have been times since when I wondered if that was such a great idea,

00:35:55

but I know that it was, I mean, in essence.

00:35:58

Because, I mean, if it’s going to reflect what human beings do in all their dimensions,

00:36:03

it has to have commerce in it.


00:36:09

We have taken on… God, I mean, it’s such a lengthy list,

00:36:13

but it was quite a while before we decided

00:36:18

that issues relating to copyright

00:36:22

were going to be hugely important.

00:36:25

And in fact…

00:36:26

Well, that’s not quite true.

00:36:28

We went through sort of the search and seizure phase in the beginning.

00:36:31

Yeah.

00:36:31

And we kind of cleaned that up more or less.

00:36:33

And then we went through a whole censorship phase

00:36:35

where the Communications Decency Act was trying to protect children from…

00:36:41

Actually, you know, trying to prevent adults from seeing things

00:36:43

that other people didn’t want them to see.

00:36:46

And we went through a whole cryptography and privacy set of years.

00:36:53

But actually fairly early in that process, we realized, John Perry in the lead,

00:36:59

that copyright was going to become a huge issue.

00:37:01

Yeah, I wrote a piece for Wired, not ex cathedra EFF.

00:37:07

I mean, I was not speaking on behalf of EFF.

00:37:10

But I just had this insight

00:37:13

that copyright was going to take a terrible beating

00:37:18

in a time when you could suddenly reproduce

00:37:21

anything that human beings could think infinitely

00:37:24

and distribute it infinitely

00:37:26

at zero cost. And since I know that, you know, outside of sex and hunger and shelter,

00:37:33

you know, the thing that, that human beings are most attached to is sharing information

00:37:38

that they find relevant. It was going to be very difficult to contain this stuff henceforth.

00:37:46

And that all these industries like the recording industry, the movie industry, the publishing industry,

00:37:51

that had based their authority on their ability to create a scarcity of information,

00:37:58

were going to take a terrible beating.

00:38:01

Now, I tried to convince my colleagues at EFF that this was going to be a big problem.

00:38:07

And at first, I mean, John was the only one that agreed with me.

00:38:10

Oh, no, I think we all agreed it would be a big problem.

00:38:13

And really the question was, what could we do about it?

00:38:16

Well, no, but also, I mean, frankly,

00:38:18

that we were getting a huge amount of money from outfits like Microsoft.

00:38:23

Yeah, that didn’t really enter into my calculations.

00:38:26

Well, it sort of entered into Jerry Berman’s, who was our executive director in those days.

00:38:30

And in fact, when I published this piece, even though it was not published in my role

00:38:34

at EFF, Bill Gates withdrew all of his personal support, which had been considerable, and

00:38:41

all of his corporate support, which had been considerable, and actually told his employees that they were not to support EFF any longer.

00:38:48

So that was a warning sign.

00:38:53

So it took us a little while to sort of come to grips with the fact

00:38:56

that we were going to lose about half of our funding if we took this up as an issue.

00:39:00

But the reality was, and continues to be,

00:39:04

that the principal obstacle to free speech in cyberspace is the idea that you can own speech.

00:39:13

You cannot own free speech.

00:39:16

There are a lot of ways to monetize the relationship between the creator, whether musician or whomever,

00:39:28

and the audience that do not involve property. But property is just not going to work there.

00:39:35

It’s the wrong model. And furthermore, it’s the wrong model as a practical matter because information may have

00:39:46

completely different economic characteristics

00:39:47

than physical goods.

00:39:50

Well, indeed it does, and we know it does.

00:39:52

I mean, in the physical world,

00:39:53

there’s a clearly coupled correlation

00:39:58

between scarcity and value.

00:40:01

I mean, Adam Smith pointed this out

00:40:03

a long time ago, and he was right.

00:40:08

And there is an assumption that the same applies with regard to information.

00:40:15

Especially, you know, if the information is encapsulated in books and CDs and physical objects that are manufactured.

00:40:22

It looks like any other physical thing.


00:40:27

But once you take away those containers,

00:40:31

which, interestingly enough, is about the same time they started calling it content,

00:40:32

when the containers went away.

00:40:40

Once you take away those containers, it becomes liquid and leaks like crazy.

00:40:49

And, you know, it was just going to be difficult to hang on to it. But furthermore, there might be good reason to think that one wouldn’t want to. I mean, when I was writing for

00:40:52

the Grateful Dead, we decided at a certain early point that we were going to let people tape our

00:40:59

concerts, mostly because we felt bad about kicking out Deadheads. I mean, it’s not good for your karma to be mean to a Deadhead.

00:41:08

I mean, they’re hapless folks.

00:41:11

You know, and mean as we were,

00:41:14

there was something about the baleful glances that these kids would cast

00:41:17

as we kicked them out of the concert with their tape recorder that got to us.

00:41:21

And we said, finally, we said, well, we’re not in this for the money anyway,

00:41:26

which was easy to say because we weren’t making any.

00:41:30

So we said, all right, tape them, not realizing that what we were doing was creating viral

00:41:34

marketing.

00:41:36

You know, we had a marketing technique that suddenly was much better than anything that

00:41:39

Warner Brothers had come up with for us, because those tapes spread and became, you know, an

00:41:45

article of currency in a community that eventually became so large that we

00:41:51

could fill any stadium in the country anytime we wanted to, three nights in a row. Yeah, mostly by

00:41:58

hauling our audience around with us. But, you know, we could. So I’m curious, actually,

00:42:06

how many of you have heard the Grateful Dead?

00:42:08

That many?

00:42:10

How many of you have been to a Grateful Dead concert?

00:42:13

How many of you heard it on recordings

00:42:17

just passed around?

00:42:18

Right.

00:42:21

But don’t listen to our records.

00:42:23

They’re uniformly bad.

00:42:24

They’re terrible.

00:42:24

But the tapes are sometimes pretty hot.

00:42:27

But we saw

00:42:30

that, you know, actually there was a

00:42:32

relationship between familiarity and value.

00:42:34

It was exactly the obverse:

00:42:36

you know, the less scarce we made

00:42:38

our product,

00:42:39

the more valuable it became.

00:42:41

Yep. And I had

00:42:44

the same experience at my last business.

00:42:46

Exactly.

00:42:48

Turns out I made a whole pile of money

00:42:51

starting a company that wrote free software

00:42:54

and gave it away.

00:42:59

Very freely.

00:43:01

We wrote and maintained the GNU compiler tools,

00:43:07

which are still probably the most used programmer tools on the Internet,

00:43:14

and gave them away not just for free,

00:43:19

but with full rights for you to take them and give them to all your friends,

00:43:24

to sell them to as many

00:43:25

people as you could sell them to, and to go in and modify them and make them better or

00:43:30

worse, and sell that, or spread it around, or use it yourself. It’s the so-called free

00:43:36

software that Richard Stallman had envisioned and a large bunch of volunteers had created.

00:43:43

Well, we said, there’s so many people using this stuff,

00:43:46

there’s probably a business in helping them.

00:43:49

It was very hard, actually.

00:43:52

Yeah, well, there aren’t that many people who can maintain a compiler.

00:43:59

It’s a painfully big, complicated piece of software.

00:44:04

And so we hired a few people to do that and

00:44:08

we found some customers, initially big companies that were using the software already, and

00:44:14

were having problems here and there with it and they paid us to fix up those problems.

00:44:19

And then we found a whole category of people who were building chips. And they wanted this compiler

00:44:25

to work with their chips. Because as they went to sell their chips to people who were

00:44:30

building them into networking equipment or toys or whatever, they kept hearing requests

00:44:37

like, well, does it work with the GNU compilers? And so they couldn’t land a big order from

00:44:43

some big companies unless they made it work with our compilers.

00:44:46

And so they’d pay us to do that.

00:44:48

And they’d pay us hundreds of thousands of dollars to do that.

00:44:54

Giving it away was a good thing?

00:44:57

Yeah.

00:44:57

And the most interesting part of the whole experience is every company we went into with sales guys to try to sell them on the idea that

00:45:06

they should use our tools and pay us and we’d support them and all of that, they already had

00:45:11

a copy of our tools. They already were using them, right? It made it a much easier sales job

00:45:18

to have the tools precede us. And that created value for our company.

00:45:29

And ultimately, after 10 years of hard work,

00:45:31

we sold the company to Red Hat for $600 million.

00:45:35

Something for nothing.

00:45:36

And they’re still making money with it.

00:45:42

So whereas if we had started a company that sold its software in the usual way,

00:45:45

we would never have been worth that much money.

00:45:50

Unfortunately, there are a lot of people in this town

00:45:52

who still do not understand this principle.

00:45:56

The record industry has actually, you know,

00:45:58

had so much of the snot beaten out of them at this point

00:46:00

that they’re starting to get it; you know, they can’t help it now.

00:46:05

The movie industry is next.

00:46:08

And I think we’ll suffer similar results.

00:46:12

The advertising industry is after that.

00:46:15

You know, there are going to be a lot of folks that are making a lot of money

00:46:18

and they’re suddenly not going to be making money the same way anymore.

00:46:22

And they’re going to be fighting hammer and tongs to maintain their old

00:46:26

business models.

00:46:28

Right. So part of the reason I

00:46:30

actually started this business was

00:46:32

because I had been following

00:46:34

the nanotechnology crowd for

00:46:36

10 or 15 years.

00:46:38

And they were telling us that someday,

00:46:40

they couldn’t say exactly when, but someday

00:46:41

in the foreseeable future, we were going

00:46:44

to be able to reproduce physical objects the same way that we can reproduce bits today.

00:46:49

And you could basically take a design of a laptop or a pen or a house or a table and have some kind of tiny molecular machines build it up for you the way a tree grows out of a seed.


00:47:07

And that was going to make it really hard to make a living by handcrafting tables and laptops and pens and things,

00:47:12

because people would just grow it for themselves.

00:47:15

Not to mention the fact that you’ve got people out there owning genetic code,

00:47:19

or claiming to own genetic code.

00:47:21

Yeah.

00:47:22

But so on the off chance that these people were right about nanotech,

00:47:28

I decided to see if I could start a company

00:47:31

that could actually give away all of its intellectual property

00:47:34

and make money.

00:47:37

And it turned out it wasn’t that hard.

00:47:41

It just wasn’t that hard.

00:47:44

It also surprised me that almost nobody wanted to compete with us.

00:47:49

Oh, that’s crazy.

00:47:50

That’ll never work.

00:47:52

Exactly.

00:47:53

Because I had made a little money from an earlier job,

00:47:56

people kind of assumed that I was just supporting the company.

00:48:00

You know, it was not profitable,

00:48:02

and it was just going to burn up all its money and go away.

00:48:05

It’s like we started that company on $15,000,

00:48:09

and it was profitable from three months in.

00:48:12

And we ran it on revenues from customers throughout the rest of its time.

00:48:18

But the only competition we ever found

00:48:20

was this one little consulting company in Switzerland.

00:48:24

We called our company

00:48:25

Cygnus, like Cygnus the Swan, and they called their company Cygnum and offered the same

00:48:35

kind of services. But they were only two guys and they didn’t really know much about what

00:48:39

they were doing. So, you know, we grew, before we sold the company, we had about 120 employees, just gradually growing over the years.

00:48:48

But the idea that you could make money this way was just so against the grain of everything they taught you in business school

00:48:58

and everything they taught you in lawyer school that anyone who thought about competing with us

00:49:05

just instantly rejected the idea, which was great for us.

00:49:11

But, you know, these are just some of the issues that we’ve had to deal with.

00:49:14

You know, those of you who have computers open,

00:49:16

if you go to the EFF homepage and just read the list of the things

00:49:23

that we are presently engaged in.

00:49:29

You know, we are suing, we’ve got a class action suit going against AT&T for abrogating their customer agreements by allowing the NSA to gain access to their systems,

00:49:36

and thereby your information and your phone calls.

00:49:40

We are dealing with electronic voting in a significant way.


00:49:52

We’re dealing with online spamming and slander.

00:49:58

We’re dealing with bloggers’ rights. We feel bloggers who are journalists should have the same protection as any other journalist

00:50:03

from having their notes confiscated under warrant from the government.

00:50:10

We’re suing for more information on electronic surveillance systems.

00:50:18

We’re defending TiVo.

00:50:20

We’re trying to preserve anonymity for online embroidery fans

00:50:25

it gets complicated

00:50:26

we actually

00:50:27

we had to sue Barney at one point

00:50:30

which I was

00:50:32

really in favor of

00:50:34

well so

00:50:37

they can read about that

00:50:38

What do you guys want to hear about,

00:50:40

exactly?

00:50:41

Since you’ve got this smorgasbord of things

00:50:44

that we can… Ask us some questions.

00:50:45

That’s the end of the broadcast.

00:50:47

I want to say thank you very much.

00:50:48

Thank you.

00:50:54

Join EFF.

00:50:56

And we don’t promise that the answer to one of these questions

00:50:59

may not take another 20 minutes.

00:51:02

Yeah.

00:51:03

Can you talk about what happened with you and the TSA?

00:51:05

What happened with me and the TSA?

00:51:07

Sure.

00:51:08

That’s not much of an EFF thing, but I’m happy to talk about it.

00:51:15

I’ve been working on identity issues.

00:51:19

And in particular, I’ve been working on whether or not we can continue to exercise all of our rights and responsibilities in society

00:51:28

without the government requiring us to get an ID card and show it on demand.

00:51:36

It’s the sort of national ID, your papers, please, kind of thing that when I went to elementary school,

00:51:43

they told us that that was how totalitarian

00:51:45

societies worked.

00:51:47

Yeah, I remember being a kid and they’d say, you know, why in the Soviet Union you got

00:51:50

to have a passport just to travel around inside the country?

00:51:53

I thought, God, that must be awful.

00:51:55

Yep.

00:51:58

Well, so…

00:51:58

But that’s where we are.

00:52:01

So I let my driver’s license expire and stopped driving.

00:52:08

I’d love to have the Angelenos in the room.

00:52:10

I’ll go.

00:52:13

I live in San Francisco, so we actually have transit that works.

00:52:19

And he’s very patient with the transit.

00:52:22

Following him around on his own terms is really just a terrible pain.

00:52:28

But what I discovered is that we don’t have the right to move around our own country without ID.

00:52:37

And that besides trying to convince people of that, that the government had actually done this in the airports with

00:52:45

kind of a subterfuge. In the train stations, they are totally upfront about it. It’s like,

00:52:51

you show that ID or you use a credit card that will tie your identity into the system

00:52:56

or you cannot buy a ticket. But the airlines, you know, they were a little bit more subtle.

00:53:04

They put signs up all over the airports that say, you know, must show ID.

00:53:08

But they never actually passed a law that said you had to.

00:53:12

And they never published a regulation that said you had to.

00:53:19

They just kind of, you know, put up signs and had the guards start enforcing it.

00:53:25

And so this made it a little cumbersome to sue them over it because we couldn’t read

00:53:30

the text of the rule.

00:53:32

Right.

00:53:33

And so indeed, when I did sue them over it, I tried to fly to Washington, D.C. and not

00:53:40

have an ID, and they said, no, you can’t do that.

00:53:45

And I sued them.

00:53:47

And the judge in the district court said,

00:53:52

actually asked the lawyer for the government,

00:53:57

so what’s the law here?

00:54:00

And he says, oh, the law is you can’t bring a weapon onto an airplane,

00:54:04

and you can’t make threats, and this sort of thing.

00:54:06

And she said, yeah, yeah, I heard all that.

00:54:08

So what’s the law about ID?

00:54:10

And he said, well, he alleges that we have a rule about it,

00:54:16

so you have to assume for purposes of the case that there is such a rule.

00:54:22

Well, and in fact, they have now, I mean, since the passage of the Patriot Act,

00:54:25

there are a great many rules that you don’t have access to.

00:54:28

I mean, I got myself into a problem with the TSA.

00:54:33

There was a pretty good case a couple of years ago where a guy got busted by a cop,

00:54:39

and there was a ruling that you do have to show your ID upon demand from a law enforcement officer.

00:54:46

No, that was an interesting case, and our lawyers got involved in it.

00:54:52

Can you repeat the question?

00:54:54

Sure.

00:54:54

Wasn’t there a case a few years ago where this guy got busted for failure to show an ID,

00:54:59

and the Supreme Court said, yeah, you do have to?

00:55:01

The guy’s name is Dudley Hiibel.

00:55:04

He lives in Winnemucca, Nevada.

00:55:07

And he’s a cussed man.

00:55:08

He’s a rancher along the lines of a John Perry Barlow.

00:55:12

It’s kind of a smaller spread, but similar attitude.

00:55:15

He got pulled over.

00:55:17

His son was drunk driving or something.

00:55:19

He didn’t have to be in the car.

00:55:20

He actually had pulled over,

00:55:22

and he was having an argument with his daughter.

00:55:24

Well, actually, she had been driving.

00:55:28

No, no, no, I heard of this stuff before.

00:55:30

No, no, that’s right, precisely, she’d been driving.

00:55:33

She was driving, they were arguing about her boyfriend.

00:55:38

These things happen, I can tell you.

00:55:40

And they pulled over, kind of in a huff,

00:55:43

and he got out to have a cigarette and calm things down.

00:55:47

But some nosy neighbor had seen them arguing in the car and phoned up the cops and said,

00:55:56

you know, there’s somebody beating on their kid or something.

00:56:00

And so the cop came screaming up.

00:56:03

And we actually have the video from the cop car.

00:56:07

They had a video on the dashboard of the cop car.

00:56:11

It took a little extra processing to be able to hear the voices out of it,

00:56:14

but you can see the whole thing pretty well.

00:56:17

And the cop came up.

00:56:20

The guy’s just standing around outside his truck smoking a cigarette,

00:56:27

came up and basically said,

00:56:29

I need to see some ID.

00:56:35

And Hiibel said, you know, what did I do?

00:56:36

What’s the charge?

00:56:37

He says, I need to see some ID.

00:56:39

And they go around four or five times on this.

00:56:44

And Hiibel eventually, you know, says, well, no, you know, I know my rights.

00:56:46

I don’t have to show you any ID.

00:56:49

And the cop arrests him,

00:56:51

throws him in the back of the truck,

00:56:52

and then goes after the daughter

00:56:54

and drags her out of the truck

00:56:56

and throws her on the ground and stuff like that.

00:56:58

And then the video cuts off.

00:57:02

And they prosecuted him.

00:57:04

He had a public defender

00:57:05

and I think they lost

00:57:09

in the lowest courts,

00:57:13

and they worked their way up to

00:57:14

the Nevada Supreme Court.

00:57:16

And so the law had said,

00:57:19

the Supreme Court had ruled earlier,

00:57:24

that you didn’t have to show any kind of ID.

00:57:31

This was a case about a guy who had been loitering.

00:57:34

You know, cops love to sort of run around to people who they claim are loitering and say, you know, we need to see who you are and you need to account for yourself.

00:57:43

And this was a black guy with hair down

00:57:45

to his ass who liked to wear these white ice cream suits and walk around in white neighborhoods.

00:57:51

And the CHP busted him like 25 times over two years and he eventually sued them and he won

00:57:57

in the Supreme Court. He said, for walking around on the street, you don’t need an ID,

00:58:03

you don’t need nothing, they can’t harass you this way.

00:58:07

So Larry Hiibel, Dudley Hiibel, thought that was the rule.

00:58:11

But the Nevada legislature had passed a law that pushed the envelope a little bit.

00:58:16

It said, well, you don’t need to show an ID except unless a cop suspects you of a crime,

00:58:25

and you are being detained for that.

00:58:29

And in that case, you have to identify yourself, is what they said.

00:58:34

And the Nevada Supreme Court eventually looked at that in this case,

00:58:38

and they interpreted that law to mean not that you had to show an ID card,

00:58:43

but that you just had to give your name.


00:58:52

So the case ended up getting appealed from the Nevada Supreme Court to the U.S. Supreme Court,

00:58:54

and when the U.S. Supreme Court took it,

00:58:56

we discovered the case,

00:58:58

and a bunch of people discovered the case,

00:59:00

and all kinds of people came in on both sides,

00:59:02

cops coming in saying, you know,

00:59:04

we need to

00:59:05

check up on people and civil liberties people coming in and saying, you can’t compel someone

00:59:11

to speak, you know, to incriminate themselves. And you can’t search them without suspicion of a crime

00:59:18

and all this other stuff. And so we got involved. Ultimately what the Supreme Court decided was that if you are suspected of a crime, if you’re in what they call a Terry stop,

00:59:34

if a cop has a reasonable suspicion that you’ve committed or are about to commit a crime, they can detain you briefly. They can’t move you, except, like, out of the traffic,

00:59:45

but they can detain you in one place,

00:59:47

and they can ask you questions, which you aren’t obligated to answer.

00:59:50

But in a state

00:59:55

that has required you to give your name,

00:59:58

they can make you give your name.

00:59:59

Now, as far as I know, only one state has done that,

01:00:03

which is Nevada.


01:00:12

And the curious wrinkle in all of this is the cop never asked for his name.

01:00:14

No, he just asked for ID.

01:00:15

He asked for a document.

01:00:18

He asked for a document, a document, a document, which he never had the right to get.

01:00:24

But, of course, the way this was reported was that cops can make you show an ID.

01:00:29

But this is almost not an EFF case.

01:00:33

I mean, we used some of our resources on it.

01:00:38

Why don’t you talk about the two cases that are most interesting to you right now that you think of as well?

01:00:43

Well, obviously our biggest case, the one that is drawing the most attention both among us and elsewhere is the AT&T case.

01:00:54

I mean, we realized that given the current legal climate, suing the NSA was going to be kind of an uphill battle.

01:01:02

And besides, ACLU was doing that.

01:01:04

Right. Well, you know,

01:01:07

we figured we had more leverage elsewhere. We decided to open a new front. Right. And so

01:01:14

the law that makes it a crime for NSA to be wiretapping people in the United States without

01:01:22

any warrants also makes it a crime for a telephone company to wiretap people,

01:01:27

its own customers perhaps.

01:01:29

In fact, it makes it a crime for you to wiretap people in the United States without a warrant.

01:01:35

And so we went after AT&T for building this infrastructure to wiretap their people

01:01:42

and handing off the, actually, what we particularly are complaining

01:01:46

about is

01:01:47

that

01:01:49

they built an infrastructure,

01:01:52

they built a room in their

01:01:53

central offices

01:01:55

and just let the NSA put any kind of

01:01:58

equipment in there that they wanted

01:01:59

and then they fed,

01:02:02

they put splitters on their fiber

01:02:04

optic lines and fed those lines into that room.

01:02:08

That, we believe, is the crime.

01:02:11

Right?

01:02:11

It’s irrelevant what the government did after that.

01:02:14

The government didn’t even have to listen to any of it.

01:02:16

The crime is in delivering your phone calls to the government.

01:02:33

And fortunately, we had the luck of running across the fellow in San Francisco who’d been running that room, who had just retired from AT&T and had never been comfortable with this arrangement. He hadn’t been running it, actually. He was a fiber optic installer.

01:02:37

And so he had watched this whole thing happen without actually doing it. And he had talked to other installers in other cities.

01:02:46

And so he knew that this was happening at least by hearsay in like six or eight other cities.

01:02:51

But he had seen it happen in San Francisco.

01:02:54

And as we were reading the New York Times article about the wiretapping

01:02:59

and puzzling what we should do over this,

01:03:01

he literally walked into our offices and said,

01:03:04

you know, I’ve got some

01:03:05

information and some documents about this wiretapping. He did the whistleblower thing.

01:03:11

That was my question. Do you engage in some of your own investigative work or

01:03:15

did you kind of, were you inspired by the New York Times or they approached you?

01:03:22

We do a lot of investigation, but it’s actually of a much nerdier sort.

01:03:28

I mean, this is why we had Cory, and we’ve got a whole set of uber-nerds.

01:03:36

Well, EFF has done some very good original investigation that used EFF supporters to help with it,

01:03:41

like the laser printer dots, where John had heard a… well, John, do you want to tell the story?

01:03:47

Sure.

01:03:48

I mean, I had heard years ago that Xerox had done something funny to the color Xerox machines,

01:03:54

that they would print something in every page that showed who had printed it and when.

01:04:02

And, you know, I just sort of heard that. There was no way

01:04:06

to confirm it, and so I filed it away. And I heard this later about color laser printers,

01:04:15

that not only Xerox but Canon did it, once they started making color print engines, and everybody

01:04:20

put those print engines into their printers, whatever brand they were.

01:04:29

And I tried to stir up the tech people at EFF to kind of,

01:04:31

okay, let’s reverse engineer this, let’s figure this out.

01:04:35

We filed FOIA requests asking about it and we got nothing back.

01:04:40

We sort of trolled around with Google and found as much info as there was on the net.

01:04:45

And we did find a couple of places where Xerox had admitted to doing this.

01:04:49

And they gave the reason as deterring counterfeiting.

01:04:50

Right.

01:04:57

Making it because, you know, it’s a reasonable concern on the part of the government that by the time you’ve got copiers that you can just stick a $20 bill in,

01:05:10

that, you know, they might be concerned with retaining the value of the currency.

01:05:14

But unfortunately, it’s the kind of technology that can be used for a lot of things.

01:05:16

Right. If you’ve got, you know, an extremely zealous government, which we seem to be,

01:05:23

well, fortunately, I think we may be backing away from it now,

01:05:26

but it looked to me like we were developing the kind of government in this country

01:05:28

that would be perfectly capable of taking a look at pieces of paper

01:05:32

and finding out who had printed them if they didn’t like what was printed.

01:05:39

So we have an uber-nerd on staff.

01:05:43

Seth Schoen.

01:05:44

Seth Schoen.

01:05:45

Who is our speaker next week.

01:05:47

And he’s actually quite capable of speaking, you know, intelligible English.

01:05:55

Right.

01:05:55

A lot of the time he just talks in, you know, things that are difficult for anybody.

01:05:58

So to compress this a little bit, he came up with a test file that you could download and print out on your color laser printer and then mail to us.

01:06:10

And a couple of hundred people did this, printed about six pages or whatever.

01:06:14

And we started looking at the output, looking to see if we could notice patterns in it.

01:06:29

And what we discovered was that there were these really faint yellow dots being printed in patterns across the page. And that if you took a blue light and

01:06:38

shined it on the page, the yellow dots would look black and you could see them a whole lot better.

01:06:45

You’re kind of raising the contrast,

01:06:47

so you kind of shine the thing,

01:06:49

and you’d start picking up these patterns.
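[Editor’s note: the blue-light trick described here has a digital analog. Yellow ink absorbs blue light, so in a scanned image the dots show up as dark pixels if you look only at the blue channel. This is a minimal sketch, not EFF’s actual tooling; the pixel data and the threshold value are made-up stand-ins for a real scan.]

```python
def find_yellow_dots(pixels, threshold=40):
    """Return (x, y) coordinates of faint yellow dots.

    `pixels` is a list of rows of (r, g, b) tuples.  On white paper a
    yellow dot has high red/green but reduced blue, so we flag pixels
    whose blue channel falls noticeably below the red and green ones --
    the digital equivalent of viewing the page under blue light.
    """
    dots = []
    for y, row in enumerate(pixels):
        for x, (r, g, b) in enumerate(row):
            if r - b >= threshold and g - b >= threshold:
                dots.append((x, y))
    return dots

# Tiny synthetic "scan": white paper with two faint yellow dots.
WHITE, YELLOW = (250, 250, 248), (250, 248, 190)
page = [[WHITE] * 4 for _ in range(3)]
page[0][1] = YELLOW
page[2][3] = YELLOW
print(find_yellow_dots(page))  # → [(1, 0), (3, 2)]
```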

01:06:52

And by comparing the serial numbers of the printers

01:06:57

that people had sent in and the dates and times and things,

01:07:00

we sort of decoded for at least one branded printer

01:07:02

what they were printing,

01:07:05

and we discovered this rumor is really true.

01:07:08

And we’ve seen lots of dots on other models that we haven’t decoded yet.

01:07:12

We’re still interested in people who want to help with that.
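[Editor’s note: decoding the dots, once you can see them, amounts to treating the grid as numbers. The real per-vendor formats vary by manufacturer and include things like parity rows; the column-as-binary scheme below is purely illustrative of the general shape of the problem, not any actual printer’s encoding.]

```python
def decode_columns(dot_rows, n_cols):
    """Read each column of a dot grid as a binary number (MSB = top row).

    `dot_rows` is a list of (row, col) positions where dots were seen.
    Hypothetical layout: each column encodes one field (serial digit,
    date part, etc.) as a vertical bit pattern.
    """
    values = [0] * n_cols
    n_rows = max(r for r, _ in dot_rows) + 1  # grid height from deepest dot
    for r, c in dot_rows:
        values[c] |= 1 << (n_rows - 1 - r)
    return values

# Dots at (row, col): column 0 has bits in rows 0 and 2, column 1 in row 1.
print(decode_columns([(0, 0), (2, 0), (1, 1)], 2))  # → [5, 2]
```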

01:07:15

But we’ve published the results just to warn people,

01:07:19

like if you’re going to print out political flyers,

01:07:21

you had better know that those can be tracked back to the serial number

01:07:25

of your printer. And if you’ve gone down to like Office Depot to buy a printer, you know

01:07:29

they would really prefer that you buy it with a credit card. They will take cash if you

01:07:36

ask nicely, you know, but they really prefer to get all that information. And it’s all

01:07:42

correlated right on the sales slip with the serial number.


01:07:49

But there is a long, long list of these kinds of things that are really kind of unlikely to surface in public awareness.

01:07:56

And in fact, even after they surface in our awareness,

01:07:59

trying to get the public to know what we’re talking about is very difficult.

01:08:03

When we were fighting over encryption,

01:08:05

which I think was enormously important,

01:08:10

it just… it’s quite a conversation stopper

01:08:13

if you try to sit down and have a conversation with somebody

01:08:16

about computer encryption and why that’s important.

01:08:19

I mean, they would… you know, they’ll have to go get a drink or something.

01:08:23

Yeah. So this was an example, and we see it so many, so many times.

01:08:31

The Secret Service reaction to counterfeiting with laser printers

01:08:35

was to just embed some information in every output of every printer

01:08:41

that would be useful for them in doing their jobs

01:08:44

without ever

01:08:45

looking at the impact on the rest of society.

01:08:47

Right.

01:08:48

Right.

01:08:49

Here we have another example.

01:08:52

It’s currently in process.

01:08:54

It’s the RFID passport.

01:08:58

The State Department has decided that they would really prefer that when you’re standing in line at the airport, you know, waiting to go through immigration, that they be able to rapidly read your passport while you’re in line and sort of look you up in the database.

01:09:14

And then by the time you get up to the front, you know, they’ll already have…

01:09:18

They’ve got the handcuffs out and everything.

01:09:19

Yeah.

01:09:21

Or not.

01:09:21

That’s cool.

01:09:22

They’ll process you right on through.


01:09:27

So they designed this whole scheme to put a chip into the passport cover

01:09:30

and pass this around.

01:09:34

Whose passport is that?

01:09:36

Well, that one is…

01:09:37

This is a mock-up.

01:09:38

It’s a manufacturer’s sample

01:09:40

from a company that makes these passports.

01:09:43

But you’ll get your own.

01:09:45

Yeah, they are actually issuing them now.

01:09:48

If you renew your U.S. passport,

01:09:51

Ed Hasbrouck has been tracking

01:09:54

this. He

01:09:55

writes things under the name

01:09:59

The Practical Nomad, and he has some

01:10:01

advice online about

01:10:03

if you want to renew your passport

01:10:05

and get a non-chipped one,

01:10:07

how to do it. Because

01:10:09

some of the passport centers are issuing

01:10:11

chipped passports and some of them aren’t at this

01:10:14

point in the United States.

01:10:15

Is this the same in Britain?

01:10:17

Some countries have already

01:10:21

issued passports like this

01:10:23

and some European countries have.

01:10:25

I don’t know if Britain is one of them yet.

01:10:28

But, again, it’s an agency that was trying to solve their little problem

01:10:32

of checking you when you’re in their particular line,

01:10:36

but they gave you a document that you have to use all over the world

01:10:41

for all sorts of things, check into every hotel you go to,

01:10:45

to show to airlines and everybody else.

01:10:48

And worse than that…

01:10:50

Great way to track you.

01:10:52

This chip…

01:10:53

Everywhere.

01:10:53

…is readable by people feet away from you,

01:10:57

without you ever knowing it.

01:11:00

And what they will get is all the info that’s on your passport.

01:11:03

It’s like the name, you know, the birth date, the passport number, your country, your picture.

01:11:10

All of that stuff is encoded on this chip.

01:11:14

And at least the way they had initially announced they were going to roll it out,

01:11:18

all that stuff was in clear text.

01:11:23

Sounds like a really great way to figure out who to kidnap

01:11:26

overseas. Exactly.

01:11:28

I mean, it’s even a great way to figure out who to

01:11:29

overcharge in a taxi overseas.

01:11:33

It’s really… But again, yeah,

01:11:36

go ahead. Is that like the same sort of

01:11:38

um, is it a similar

01:11:39

issue with the credit card now?

01:11:42

Like, scanless credit cards

01:11:43

that a bunch of credit card companies are starting to release? Like, I read about that a few weeks ago. There’s this whole breed of credit cards that major credit card companies are

01:12:08

now selling that are, like, scanless, so that you don’t have to swipe them and they’re faster

01:12:15

allegedly, but anyone can read them, can put together something and read them if they’re

01:12:22

being sent to people in the mail and you get all your information.

01:12:26

Right.

01:12:27

Well, the passport thing is just a subset of the problems with RFID.

01:12:32

It turns out that major consumer product makers and a couple of major distribution chains

01:12:40

like Walmart have been trying to mark consumer products with RFIDs.

01:12:47

Libraries have been putting them in their books, putting an RFID chip in every book

01:12:56

such that it’s real easy to check you in and out because they just sort of scan it.

01:13:03

We discovered in San Francisco it was virtually impossible

01:13:05

to adopt a pet

01:13:07

that didn’t have an RFID chip

01:13:09

embedded under its skin.

01:13:11

We ended up having to find

01:13:13

some Mexican family

01:13:15

that had just had a litter of cats.

01:13:18

You know, that hadn’t even, you know…

01:13:20

Considered putting in a chip.

01:13:22

Yeah, yeah.

01:13:22

Luckily, they’re not born

01:13:23

with chips in them yet.

01:13:26

They’re working on that.

01:13:29

But it is hard to adopt a pet here in California now for the same reason.

01:13:34

Can you say a little bit about whether or not you’re anonymous when you have these chips,

01:13:45

what is it about having a library book and a pet and so on that have these chips

01:13:49

and then that compromises your anonymity?

01:13:52

Well, let’s go with the, I mean, start with the passport.

01:13:57

If you’re carrying the passport around, then people can identify you.

01:14:01

And more than people, machines can identify you.

01:14:06

So they can put up a video camera

01:14:07

that’s taking pictures of everyone that goes by

01:14:10

and also scanning their passports

01:14:12

and tying that back to who you are.

01:14:15

If they put the chips in your clothing,

01:14:22

then they might not know who you are, but they’ll know that the

01:14:27

same guy passed by.

01:14:28

Right.

01:14:29

I mean, increasingly we’re setting up this matrix for a broad variety of generally practical

01:14:36

and not necessarily bad reasons one at a time.

01:14:39

For surveillance.

01:14:40

Where we are just in the course of leading contemporary lives,

01:14:45

going around leaving this digital slime trail

01:14:48

that can be rolled up into a perfect simulacrum of us

01:14:52

by any number of institutions or entities,

01:14:57

some of whom may not be benign.

01:15:01

And there’s, you know,

01:15:02

certainly many of these entities don’t need warrants.

01:15:06

In fact, the government doesn’t seem to need warrants anymore either

01:15:09

to use that information in ways that you may find make it harder for you to operate.

01:15:16

So here’s a court case thing EFF has been doing.

01:15:20

We’ve now gone through more than half a dozen cases

01:15:23

where the federal government was applying to a magistrate judge,

01:15:28

kind of a lower-level judge in federal court,

01:15:31

to get an order to wiretap someone’s cell phone location.

01:15:39

In other words, they’re not going to listen in on the calls.

01:15:42

They’re just going to be told by the phone company in real time

01:15:45

where that person is and everywhere they go.

01:15:49

And the government has been arguing,

01:15:51

okay, so first thing to know is

01:15:53

the government never applies for these in open court.

01:15:58

They always apply for them in secret

01:16:00

so they don’t tip off the guy who they’re trying to track.

01:16:04

Second thing to know is that

01:16:07

the judges issue their opinions in secret. Either they grant the thing or they don’t, but the public

01:16:12

never finds out, at least not until way later in a court case, if the guy eventually gets arrested.

01:16:18

Well, one judge, about a year, year and a half ago, issued a published opinion in which he turned them down

01:16:25

for such an order. And he published his opinion and said, you know, I’ve issued a bunch of these

01:16:32

orders in the past, but I started to read up on this and I’m concerned that they’re asking for

01:16:37

something that’s illegal. And he decided to stir up some controversy with it by publishing it to see,

01:16:46

and he kept out the name of the person they were trying to track and stuff,

01:16:50

but he tried to stir up the legal community to think about this.

01:16:53

And we thought about it.

01:16:55

And we wrote him a letter that said,

01:17:00

we’re very interested in this topic.

01:17:02

We helped to craft the last law about wiretapping.

01:17:07

And we think that you’re right, that it is,

01:17:10

that specifically you need a warrant to get somebody’s location out of their cell phone.

01:17:16

You can’t just go around to a magistrate and say,

01:17:20

I’m interested in this person, therefore order the phone company to tell me where he is.

01:17:25

But part of the problem is that under the Patriot Act,

01:17:30

any database, whether it’s a hospital database, a library database, a credit card database,

01:17:37

can now be taken by the federal government with what amounts to the same kind of rubber stamp warrant.

01:17:46

Non-warrant.

01:17:47

Or non-warrant.

01:17:48

Yeah, or order.

01:17:49

Yeah.

01:17:49

Can now be taken from any institution.

01:17:52

And in fact, if the person who hands that database over informs anyone that this has

01:17:58

taken place, he is guilty of a felony for having revealed it, which, you know which gives the government the right to get all kinds of…

01:18:08

I used to feel fairly sanguine about the amount of information

01:18:12

that I was tossing toward commercial enterprises.

01:18:16

I thought it made my life more convenient in some respects.

01:18:19

And I’m in favor of companies knowing as much as possible about consumers

01:18:24

so they can make better products.

01:18:26

But now it’s a different matter

01:18:27

because now that’s all government data ipso facto.

01:18:32

And that’s troublesome.

01:18:34

Is it known if cell phone companies are encouraging this data

01:18:37

or are they storing it indefinitely?

01:18:39

It is not known.

01:18:41

Do you have a contract with your cell phone company

01:18:43

and what does it say about this

01:18:45

I bet it doesn’t say anything

01:18:46

and in the absence of an agreement with you

01:18:50

they can do what they want

01:18:51

I have a lot of friends that have considered

01:18:57

joining the new program

01:19:01

in San Jose

01:19:01

and this will be done in Boston Airport as well

01:19:04

where you can sign up

01:19:06

for the rapid… Trusted traveler.

01:19:08

The trusted traveler.

01:19:10

So

01:19:10

it sounds like a scary idea to me but

01:19:13

it would be interesting for me to hear from you

01:19:15

what you feel the risks are

01:19:17

in that path, for me and my colleagues.

01:19:21

Well,

01:19:21

um…

01:19:23

He doesn’t get on airplanes.

01:19:25

So…

01:19:25

Basically, the Trusted Traveler Program,

01:19:32

the idea is you can skip some of the security in airports

01:19:36

if they have vetted you in advance,

01:19:39

and then if you use some biometric to prove who you are,

01:19:43

like a fingerprint or retina prints or something like that.

01:19:48

They haven’t been very popular yet because they don’t actually bypass very much of security.

01:19:55

So they don’t really gain you that much and it’s X amount of hassle.

01:20:10

But I think the biggest danger in these programs is if they become mandatory.

01:20:17

And so the idea is, oh, you start them off as an optional thing that the business travelers will use,

01:20:23

and then as more and more people divert into there,

01:20:25

you reduce the number of lines that other ordinary people can go through

01:20:27

until eventually if you can’t

01:20:29

get one of these cards you can’t travel at all

01:20:31

personally I can’t wait to become one

01:20:33

to tell you the truth

01:20:34

I flew 270,000 miles last year

01:20:36

I’m gonna


01:20:41

but I don’t personally care about my privacy

01:20:43

presumably somebody who really wants to do major harm might have the resources to

01:20:48

get one of these as well. So it’s not, I don’t even think it really is, makes me feel more

01:20:52

secure. I mean, it’s not good from a security standpoint, really.

01:20:55

It’s not, it’s not any good from a security standpoint as far as I can tell. But what

01:21:01

it’s good for is, is it’s good from an authoritarian standpoint because the way it works is you have no right to any of these kind of cards,

01:21:10

the way you have a right to travel,

01:21:13

a right to move around in your own country.

01:21:16

So the way it works is you apply for the privilege of having the card,

01:21:21

and to get it you have to send them six forms of ID

01:21:24

and fill out your life history,

01:21:26

and they’ll do a criminal records check on you

01:21:28

and fingerprints and eye scans and all of that kind of stuff.

01:21:32

And then they’ll sit on it for six weeks or two months or whatever,

01:21:35

and then they’ll decide.

01:21:37

And if they give you one, then you’ll have one.

01:21:40

And if they don’t, you have no recourse.

01:21:43

And they won’t tell you why, and there’s nothing you can fix.

01:21:46

It’s pretty likely that Tim McVeigh could have gotten one.

01:21:50

Right.

01:21:50

I would say.

01:21:51

You don’t think there’s actually much of a risk to the flyer

01:21:53

if he goes through all that background screening,

01:21:56

because they could probably do that anyway.

01:21:59

Well, it depends on whether or not you’re a terrorist, I guess.

01:22:03

I don’t know.

01:22:04

See, I mean, the problem is that we don’t know that that’s all they’re checking for.

01:22:08

I mean, you know, there’s also the Patriot Act has given them a great opportunity to start routinely checking bags for, you know, instruments of terror.

01:22:19

But, you know, it’s a funny thing.

01:22:21

They find other stuff, you know.

01:22:23

And once they find that other stuff, you know, well, geez, you know,

01:22:27

they’re not just going to close the bag.

01:22:31

You know, it’s this beautiful net that they’ve created

01:22:34

that they can, you know, they can filter the entire flying public through now.

01:22:39

For all for safety and security.

01:22:41

Right.

01:22:41

But somehow, if you have some contraband with you that provides no danger of safety and security,

01:22:49

then somehow they’ll charge you with it anyway.

01:22:53

And somehow the courts think it’s okay that they searched you without suspicion

01:22:57

because you were in an airport and what did you expect?

01:23:00

And you cannot find out whether or not there’s a directive that actually is…

01:23:05

Orders them to search for drugs.

01:23:06

Orders them to search for drugs and other things,

01:23:08

because all of those procedures…

01:23:11

Are secret.

01:23:12

Are completely secret.

01:23:17

Yeah.

01:23:18

So, personally, I don’t believe the government has the right or the ability

01:23:23

to demand that I identify myself before I can go through an airport.

01:23:29

And so I wouldn’t buy into a system that purports to have the government decide

01:23:37

who’s allowed to go through the airport by giving or not giving you a card.

01:23:42

That’s a philosophical point of view as opposed to a risk assessment point of view.

01:23:46

Well, if you want to think of it in risk assessment,

01:23:49

how much do you trust your government?

01:23:52

That’s why I’m suspicious.

01:23:54

Well, if you trust your government,

01:23:56

be happy about giving them as much information about you as you like.

01:23:59

If you don’t trust your government, think twice.

01:24:01

I think there actually is reason to trust the government,

01:24:04

mostly on this account.

01:24:05

I mean, I do consulting for some of the, you know, spookier parts of the government,

01:24:11

and they are completely incompetent in there.

01:24:15

It’s astonishing.

01:24:17

I mean, they are so good at gathering data,

01:24:20

and the more data they gather, the less capable they are of turning them into information.

01:24:25

But this was the problem with the Bush administration.

01:24:27

You got some people who were capable instead of incompetent in there

01:24:30

and look at what they did.

01:24:33

Yeah, that’s true.

01:24:35

Do you gentlemen have any information on what causes you to be put on a no-fly list?

01:24:43

There is no such thing as a no-fly list. And the second part to that was…

01:24:47

But there’s records.

01:24:48

I mean, there’s cases where people walk up

01:24:50

and they’re told that they can’t fly.

01:24:51

Like Teddy Kennedy, for one.

01:24:53

Or Mohammed, whatever he is.

01:24:56

No, I was just kidding.

01:24:58

There is a no-fly list,

01:24:59

but you’re not allowed to know about it.

01:25:00

Yeah, exactly.

01:25:01

There is not an official no-fly list.

01:25:04

It has not been published.

01:25:05

Its criteria have not been published.

01:25:07

It’s all secret.

01:25:08

Well, and think about this.

01:25:11

Bruce Schneier actually sat on a government review committee

01:25:15

that looked at the secrecy of this stuff,

01:25:17

and he said, this is a list of people so dangerous

01:25:21

that we can’t allow them to be just an ordinary passenger

01:25:25

on an airplane,

01:25:25

even after we’ve searched them,

01:25:27

but so innocent

01:25:28

that we can’t arrest them

01:25:30

for anything

01:25:30

because they haven’t

01:25:32

committed any crime.

01:25:33

Well, unless you count

01:25:34

Teddy Kennedy.

01:25:39

There’s a disconnect

01:25:40

there somewhere.

01:25:43

Yeah.

01:25:44

You can say that the government

01:25:45

can’t write a law that doesn’t allow you to get on an airplane

01:25:48

without showing ID, but couldn’t

01:25:50

the airlines require you to show ID?

01:25:51

And, you know, you say

01:25:53

commercial enterprise wants to know about their customer.

01:25:55

If I’m an airline, I can say maybe I’m not going to let you on my airplane

01:25:58

unless you prove who you are.

01:25:59

Isn’t that about the same thing?

01:26:01

Well, that’s kind of the dodge that the airlines,

01:26:03

you know, the government is getting the airlines to take.

01:26:05

They have been trying, you know, to do that pass-the-buck thing.

01:26:10

But several airlines actually did try to impose ID requirements back in the early 90s

01:26:16

because they were mad that people were, if they weren’t going to use a plane ticket they had bought,

01:26:23

they would resell it to somebody else.


01:26:30

And they said, well, we’ll just make people show ID,

01:26:32

and then we’ll catch those people.

01:26:37

And what they discovered was that travelers switched to other airlines when they imposed an ID requirement.

01:26:39

The public did not like this.

01:26:43

Of course. So the airlines really couldn’t do it.

01:26:47

They tried several times until they found a way to get the government to order every airline to do it,

01:26:52

and then you had nowhere to go.

01:26:54

And the airlines love it because now they can charge you $100,

01:26:59

or that’s up to $150 now, I don’t know, to change your ticket,

01:27:04

when otherwise you just could have sold it to somebody.

01:27:07

So is the next step that I want to swap chips with my cat?

01:27:13

Not a bad idea.

01:27:14

Oh, you mean you put his chip in you?

01:27:19

Yeah.

01:27:21

This could be an incredible shell game.

01:27:25

I like that idea.

01:27:30

I think it’s a great idea. Try it out.

01:27:33

Your door will open when you come up to it.

01:27:39

I mean, rather than get the passport from the wrong place,

01:27:42

which might not let me through, I might just want…

01:27:46

I don’t know what we do.

01:27:47

You know, we can enter…

01:27:48

You cannot fly.

01:27:49

We can decide to use all cash.

01:27:51

I mean, it seems like there’s a sort of mischief

01:27:53

on another level that…

01:27:55

There are lots of opportunities here

01:27:57

for hacking the system and seeing how it responds.

01:28:02

Indeed…

01:28:02

Or raising awareness.

01:28:04

Yeah.

01:28:04

Well, ultimately, I think it’s the raising awareness.

01:28:07

I mean, we are going into this state,

01:28:11

and I personally am of the conviction

01:28:14

that we are headed toward a society

01:28:16

which is going to be completely transparent.

01:28:19

And I’m not sure that’s such a bad thing.

01:28:21

I mean, I was raised in a small town in Wyoming

01:28:23

where everybody knew everything anyway.

01:28:26

But we had the absolution of familiarity

01:28:29

and mutually assured destruction.

01:28:32

I mean, somebody wanted to rattle my skeletons.

01:28:35

I knew where their bodies were buried.

01:28:36

And by this means we tend to leave each other alone

01:28:39

in spite of knowing everything.

01:28:41

It’s different when you’ve got large secretive institutions

01:28:44

that know everything about you,

01:28:46

and you don’t know anything about them.

01:28:49

It’s also different when you have people

01:28:52

that you’ve never met and never will meet

01:28:53

who know everything about you.

01:28:56

And are making decisions about you

01:28:57

that you can’t even find out about.

01:28:59

And are making moral decisions about your behavior

01:29:01

based on their own personal system and morality

01:29:04

that may be much more rigid than yours.

01:29:07

And your incorrect database entries, which you can’t correct because you can’t even see them.

01:29:12

But, I mean, eventually, you know, I assume that we’re going to find ourselves in a society

01:29:17

that is adapted to this by becoming, you know, first of all, a lot more transparent on all levels,

01:29:26

both institutionally and personally,

01:29:28

and also more tolerant because we’re not going to be able to handle it

01:29:32

if we don’t become more tolerant.

01:29:34

So, Farrell, I want to push back a little on this

01:29:36

because we’ve been talking about this off and on today,

01:29:39

and I’m skeptical.

01:29:41

I think that you’re talking about two different kinds of transparency,

01:29:45

transparency for institutions and powerful individuals who wield a lot of power,

01:29:51

and transparency for the powerless,

01:29:53

and you speak as though the two necessarily accompany one another.

01:29:56

But I think it’s perfectly conceivable that we could have one without the other.

01:29:59

Oh, absolutely.

01:30:00

And I think the brass ring would be to shoot for transparency for the powerful,

01:30:06

privacy for the powerless.

01:30:11

Well, that’s one of the reasons why we were so adamant about cryptography,

01:30:18

because we felt like in order for there to be a parity, we had to start out with a parity.

01:30:26

We had to have the same level of privacy for the individual that the government had for itself.

01:30:30

And, you know, then you’re in a better bargaining position.

01:30:32

Then you can start to de-escalate.

01:30:36

You know, it’s kind of like the SALT talks during the Cold War.

01:30:38

You can start to de-escalate on both sides. So this is kind of like in LA I’ve kind of been enjoying

01:30:46

watching the cops get on

01:30:47

YouTube while they’re beating somebody up

01:30:50

and I guess at the same level

01:30:52

I recognize that there’s a certain

01:30:54

that transparency is there

01:30:55

but there’s a decoupling of the power to have a result

01:30:58

Do you feel as though

01:31:00

everybody should be walking with video cameras?

01:31:03

I mean

01:31:03

Well, at the moment everybody is.

01:31:06

I mean, really bad video cameras

01:31:08

and cell phones, but they’re going to get

01:31:10

much better.

01:31:10

The funny thing is that if more of us work for the government,

01:31:12

I mean, like if I spend time in Scandinavia,

01:31:15

a lot of people say, well, as a government worker,

01:31:16

I shouldn’t. Those are also

01:31:18

private individuals. So that

01:31:20

transparent society gets a little weird.

01:31:22

I don’t know.

01:31:24

There’s the problem of little brother.

01:31:27

And little brother watching you all the time.

01:31:31

I was just shown a great example of little brother here at USC.

01:31:36

There’s a project that Adam Powell showed me today.

01:31:42

You got to see that.

01:31:42

And he was very excited about it without a trace of concern, right?

01:31:48

No, he had some concern.

01:31:50

Actually, he said they’ve gathered a year’s worth of data on all the video cameras.

01:31:55

He was in the room.

01:31:56

He had to go, yeah.

01:31:59

A year’s worth of data of all the video cameras on campus.

01:32:03

It’s all sitting on hard drives somewhere

01:32:05

that they can map into satellite pictures

01:32:09

and show you who was walking around,

01:32:12

who was driving through campus.

01:32:14

He said they put this info up very briefly

01:32:19

after demoing this thing,

01:32:21

and then they took it back down again.

01:32:23

And they’re trying to figure out the social and privacy implications of this.

01:32:27

He said he’s gotten requests from campus security to, like, you know,

01:32:31

show me what was going on on this street at 4 a.m. on this day,

01:32:36

and he’s turned them down.

01:32:38

But if somebody came with a warrant, you know, he’d have to show it.

01:32:43

You know, if you believe in David Brin’s theory on this,

01:32:48

the transparent society, the thing to do would be to put that information and all that

01:32:52

software up for free public access.

01:32:55

That if the elite is going to be able to spy on all of us, then we

01:33:00

all ought to be able to spy on all of us. Yeah, it’d be good to know where all the campus

01:33:03

cops were at any given moment.

01:33:06

Yeah.

01:33:07

And, you know, if the Secretary of Defense can know where I’m going

01:33:12

as I go out and visit my friend and go out to a bar and have a drink and wander on home,

01:33:17

I should be able to know where he’s going.

01:33:20

Seems fair.

01:33:23

Yeah, because the only way you equalize that power is by equalizing it.

01:33:29

If information is power, then the public needs to have more of it than the government,

01:33:34

or the public will not be able to control the government.

01:33:43

We’ve had packet switching and anarchists mentioned in the same breath.

01:33:46

Now we’re talking about the government and not being able to control it.

01:33:50

We’ll talk about China, which is one part of the internet and not all of it.

01:33:53

In this digital future, is there kind of an inevitability

01:33:57

of the undermining of the current concept of the nation-state?

01:34:02

I think the nation-state is actually damn near gone in many respects.

01:34:11

I mean, the United States of America is the nation state

01:34:16

that seems to be most determined to preserve that notion,

01:34:18

but it’s also determined to preserve the notion

01:34:21

that everything in the world is part of the United States of America.

01:34:26

I mean, clear back in the Clinton administration when we were arguing about, you know,

01:34:30

the functional borders of copyright and cryptography,

01:34:38

I was in the White House, and I said, where exactly do you think the boundaries of this country now lie?

01:34:44

And the staffer, who at that point was the president’s chief of staff,

01:34:49

said, we don’t find that to be a very convenient question to ask around here.

01:34:55

So, I mean, but the nation state is,

01:34:58

I think the nation state’s been on decline for a while.

01:35:01

I mean, it’s something that rose up to support the industrial period.

01:35:06

I mean, it was about defining an economic zone

01:35:08

that was big enough to support, you know,

01:35:11

standards of industrial production

01:35:14

and, you know, common monetary standards

01:35:16

and railroad track standards and that kind of thing.

01:35:18

And it…

01:35:19

What comes next?

01:35:21

Well, I actually look to see a major comeback

01:35:23

for the city-state.

01:35:26

You know, I mean, I think the city-state is going to have the biggest renaissance since the Renaissance.

01:35:30

Well, the United States does show signs of being too big a country

01:35:37

in the sense that we don’t know the people in the government, and they don’t know us.

01:35:44

Right?

01:35:49

And most countries are a lot smaller, either in population or in physical size,

01:35:53

to the point where if you want to talk to the minister of the interior about X,

01:35:57

you can actually schedule an appointment and go over and talk to him.

01:35:57

Right.

01:36:11

And this country has just gotten, you know, over 200 years physically so big and so complicated and encrusted that nobody wants to talk to you.

01:36:12

You’re just a citizen.

01:36:15

You know, you’re supposed to take orders.

01:36:15

You’re a citizen journalist.

01:36:17

Oh, yeah. I mean, the trend, and this, you know, has been commented on by a lot of folks, but, I mean, the trend seems to be what they call glocalization,

01:36:27

you know, heading out toward the global and the local at the same time. You know, and

01:36:34

emptying out the nation state in the middle. Because, you know, part of what nation states

01:36:38

were for was conducting war. Well… And we don’t do that anymore. Not exactly. I mean, we have these SWAT teams the size of

01:36:55

armies that go and try to produce police actions in other countries. But war, to my way of thinking,

01:37:01

is where you really get at it, you know, like World War II or something like

01:37:06

that. Well, with the invention of nuclear weapons, that’s no longer practical. That’s not a possibility.

01:37:13

You can’t do that anymore. So, I mean, war is obsolete in that sense.

01:37:20

And I think it’s probably going to dawn on us fairly soon that it’s obsolete, you know,

01:37:24

even when we decide we’re going to go up and beat up on some pipsqueak nation.

01:37:28

Because it turns out that they will kick our asses sooner or later.

01:37:31

They have the last several times we tried it.

01:37:36

I mean, I think nation states are evolving.

01:37:40

It’s been really interesting for me to see more of the world in my life

01:37:46

because growing up in the United States,

01:37:50

well, I hear growing up in every country they tell you it’s the best country on earth.

01:37:53

You know, we’ve got the best mountains and we’ve got the best cities

01:37:56

and we’ve got the best people.

01:37:58

But we mean it here.

01:38:00

Right.

01:38:01

Right.

01:38:03

So just getting yourself out of that reality distortion field

01:38:09

can do a lot for figuring out what’s really happening.

01:38:14

One of the things that’s really happening is, for example,

01:38:16

that border controls are generally going down.

01:38:19

Yeah.

01:38:19

Right?

01:38:21

You can take trains all over Europe,

01:38:24

and they don’t even look at your passports.

01:38:26

Yeah, you travel in Europe,

01:38:27

you don’t know when you’ve crossed the border anywhere.

01:38:28

I mean, it still feels like, you know,

01:38:30

getting in or out of a prison camp

01:38:31

to go in and out of the United States,

01:38:33

and that’s been getting worse.

01:38:34

But that’s because, you know, we are becoming…

01:38:38

We looked into the Soviet abyss for too long

01:38:41

and we have become it.

01:38:44

Exactly.

01:38:48

Yeah.

01:38:50

We talked about a lot of potential future outcomes

01:38:54

and some of them are pretty scary.

01:38:57

Where do you see that we are?

01:38:59

Are we standing at the precipice of this horrible future

01:39:04

where we created the infrastructure for a government to become totalitarian?

01:39:09

Are the EFF and other groups like it able to provide a counterweight?

01:39:13

Where do you see this situation?

01:39:15

The cool thing about the Internet, and the reason that the EFF exists, is that the Internet has contained within it

01:39:26

the capacity to be the most liberating thing

01:39:29

that’s ever happened to humanity

01:39:30

and at the same time the world’s greatest surveillance tool.

01:39:34

I think the most liberating thing was the pill, actually.

01:39:38

That didn’t last that long.

01:39:41

They decided they didn’t like that.

01:39:44

I’m not just talking about sex, but population control, too.

01:39:50

This is probably, you know, I’ve been called a pronoid by people,

01:39:55

and I’m pleased to be called one.

01:39:57

Somebody who thinks the universe is a conspiracy on your behalf.

01:40:02

And I’m reasonably sunny about the future

01:40:05

in spite of all the gloomy things that one can say

01:40:09

in this context or numerous others

01:40:11

because I think that if you get all of humanity

01:40:15

working in some kind of much more informed condition,

01:40:19

working together and creating a nervous system for the planet

01:40:22

so that we actually know what’s going on everywhere collectively,

01:40:27

we could make the leap to nirvana.

01:40:29

Kind of like it.

01:40:30

But, I mean, we might actually become a slightly more conscious species.

01:40:35

I mean, I feel that that’s very likely true.

01:40:38

But, you know, we’ve got a lot of shoals and narrows to pass through on the way.

01:40:46

I guess I kind of feel similarly.

01:40:49

The Internet has been a disruptive technology in the classic sense.

01:40:55

It’s gone in and messed up a bunch of assumptions that people had.

01:40:59

And the result is things can get better or things can get worse.

01:41:04

EFF is about 25 people.

01:41:06

I mean, how can 25 people really change the world?

01:41:10

Well, when there’s a lot of flux, when you’re at a cusp,

01:41:14

when you can actually see that a decision made here

01:41:20

can make things go this way or that way or that way.

01:41:24

If you’re at the beginning of a journey, you can help to pick a direction.

01:41:29

And the 25 people at EFF or the 50 people in this room could end up making some of the

01:41:37

choices that make the world better or worse.

01:41:39

We’re still…

01:41:40

So we’re trying.

01:41:40

We’re just, like, five butterfly wing beats past, you know, the butterfly

01:41:46

that starts the thunderstorm. I mean, this is really early still. You know,

01:41:52

everybody here is a pioneer still in this regard, and, you know, do not think that this is

01:42:03

all over by any means, because just the stuff that John and I have found ourselves dealing with in the last two or three years,

01:42:10

you know, decisions that get made that we can have some influence over,

01:42:17

that are important decisions in terms of how things go from this point forward.

01:42:21

I mean, we’ve had a long dispute with Intel and Microsoft

01:42:26

and actually a number of different groups

01:42:30

in a consortium over what is called trusted computing.

01:42:35

How many people in here know about trusted computing?

01:42:39

Wonderful thing.

01:42:40

How many of you are in favor of it?

01:42:44

Opposed?

01:42:46

Yeah.

01:42:47

Don’t know?

01:42:49

How about ambivalent?

01:42:50

In favor of parts of it and not the rest?

01:42:52

Right.

01:42:53

And that’s the problem, because there could be some good things about it.

01:42:57

But there could be some truly awful things about it,

01:42:59

not the least of which is that if deployed the way in which it was intended to be deployed before we got involved,

01:43:08

it would have made it so that any document that passed through your computer would be marked as having gone through there.

01:43:17

Trackable back to you.

01:43:17

So it would be trackable back to you, you know, to solve copyright violations, mostly to make sure on Microsoft’s behalf that…

01:43:26

That you’re not stealing their software.

01:43:28

You’re not stealing their software.

01:43:30

There’s only 6 billion copies of it in the world.

01:43:33

If everything that passes through every computer

01:43:36

is marked with the identity of the person who had that computer,

01:43:40

I mean, it makes it very easy to create a dreadful surveillance system.

01:43:46

And this didn’t surface on most people’s radar because it was too goddamn complicated. I mean,

01:43:51

just trying to figure out the specs of the… Right, the 150-page turgid spec of what it was

01:43:59

supposed to do. Right, which was constantly changing and in an environment that was not

01:44:03

necessarily open to public discourse.

01:44:06

Now, this is really important stuff, and it happens all the time.

01:44:10

And, you know, these are the kinds of things that I recommend you get involved in.

01:44:14

You see these things surface, and they’re everywhere around you.

01:44:17

And ask yourself, is this going to make for a more open society or a more closed society?

01:44:24

Is this going to make the Internet more open or is it going to make it more closed?

01:44:28

Is it going to connect people or is it going to separate them?

01:44:32

Right.

01:44:32

Is it going to give people more control over their lives?

01:44:35

More or less.

01:44:35

Or give other people more control over your life?

01:44:39

Could you come in on the one laptop per child?

01:44:42

Because just go.

01:44:44

You know what it is.

01:44:45

Sure, yeah.

01:44:46

I think it’s a great thing, actually.

01:44:49

This is a project that came out of the MIT Media Lab.

01:44:53

They were shooting for $100 laptops.

01:44:55

Their first rev is $140.

01:44:58

And they have no hard drives, a really slow processor,

01:45:01

an amazing new screen that costs a quarter of what laptop

01:45:08

screens cost and is readable in sunlight, and very, very low power such that if you

01:45:14

can charge it overnight, then a kid can use it all day.

01:45:20

Yeah, you can crank the power by hand.

01:45:22

In places with no electricity, you could actually use human or animal power to recharge it.

01:45:27

It’s almost indestructible.

01:45:29

That’s the theory.

01:45:31

Well, I mean, the last one I saw looked like you could throw it all the way across the room against a brick wall and it would take it.

01:45:38

Yeah.

01:45:38

Well, and so they’re selling these.

01:45:43

The idea here is to sell these to education bureaucracies in the third world,

01:45:49

using the money that they would normally spend on textbooks.

01:45:52

If they’d spend $30 a year per kid on textbooks,

01:45:56

then have them use four or five years’ worth of money and buy that kid a laptop,

01:46:01

and they can download the textbooks into the laptop and read them.


01:46:05

They also have a cool system which is…

01:46:06

And meanwhile they can do lots of other things.

01:46:08

They have a mesh network.

01:46:10

You know, if you get a bunch of these together there’s a mesh network that automatically

01:46:13

creates a local area network without trying to configure it at all.

01:46:19

And they are also musical instruments.

01:46:21

I mean they have a lot of very cool musical software in them.

01:46:24

So you could, you know,

01:46:26

I can imagine the 25 Laptop Band easily.

01:46:29

Yeah.

01:46:30

On a camera and a video.

01:46:31

What’s that?

01:46:31

Yes, they’ve…

01:46:32

On a camera and a video.

01:46:33

Yeah.

01:46:33

They’ve added mic and camera to it.

01:46:35

Right.

01:46:35

But I guess my question is,

01:46:38

there’s the utopian view,

01:46:39

which I subscribe to,

01:46:41

of giving more access to knowledge,

01:46:45

making it harder to control.

01:46:48

If you have 5 million kids in Liberia on the net,

01:46:51

it’s going to be possibly…

01:46:53

Libya.

01:46:54

Libya.

01:46:55

It was Gaddafi that went for this.

01:46:57

The great thing about a dictator is you can just…

01:46:59

But then there are people who have taken the other…

01:47:04

One of our students wrote a long and effective piece saying, well, this is really cynical.

01:47:09

The people who have invested in it want more people for eBay, more customers, more games.

01:47:13

More customers, and local culture will go away.

01:47:17

I don’t really subscribe to this, but I’m thinking it’s worth a conversation.

01:47:20

Well, first of all, it’s a good idea to know the people who are involved. I mean, I’ve been

01:47:26

peripherally involved and

01:47:27

the people who are involved are…

01:47:29

I’ve been more closely involved.

01:47:31

They couldn’t care less about that.

01:47:35

Yeah, quite the opposite.

01:47:36

There’s in no way an incentive.

01:47:39

There are 14 people

01:47:40

in the One Laptop per Child organization.

01:47:44

Okay?

01:47:44

And, you know, they’re doing this all with leverage through partners.

01:47:50

I think the person who wrote this, and I agree with you.

01:47:53

Okay.

01:47:54

Clearly, I’m not agreeing with him.

01:47:55

I’m just bringing it up as a…

01:47:57

Well, I’m glad to talk about it.

01:47:58

I’m bringing it up as just a point of conversation

01:48:00

in terms of what is this idea of this huge globalization and lose…

01:48:06

How do you use this to hold

01:48:08

culture as opposed to lose local culture?

01:48:10

Coming from the small town, I think you can

01:48:12

make the point that it’s more…

01:48:13

The mesh network reinforces the small town.

01:48:16

It reinforces community.

01:48:17

You build your Wikipedia and your…

01:48:19

There are languages that were dying.

01:48:22

There are cultures that were dying

01:48:23

because they had no way of preserving themselves.

01:48:26

They didn’t have enough of a commercial base

01:48:29

to produce a publishing industry.

01:48:32

I mean, all over the planet,

01:48:35

languages even as big as Catalan were in tough shape.

01:48:40

Two-thirds of the languages

01:48:41

that the laptop is going to be used in in the first two years

01:48:45

were not supported by Linux before they started.

01:48:48

Right.

01:48:48

Right?

01:48:49

And they’re putting that support in.

01:48:51

So that in Libya, they’ll be using the thing in Arabic.

01:48:55

And in Thailand, they’ll be using it in Thai.

01:48:57

I just think it’s an example of a big idea that was actually at this juncture,

01:49:01

which really can’t be changed.

01:49:02

And also, you know, there’s another great, to me, an important part of this,

01:49:06

which is that I’m a big believer in open source.

01:49:12

And the whole thing has been done with open source.

01:49:14

This thing is open source.

01:49:15

So that the kids can share, they can examine it to see how it’s built,

01:49:20

they can take it apart physically and put it back together,

01:49:23

which means they can also learn to repair it.

01:49:26

They can update the software in it.

01:49:29

And anything they build in it, they can share with all the other millions of kids that have them.

01:49:36

So listening to Mary Lou Jepsen, who is CTO of the project, she gave a talk about this a couple of days ago that I attended and she said well you know

01:49:45

we don’t really believe too hard in this idea

01:49:49

that

01:49:50

the kids are going to just read all their textbooks on this

01:49:55

what’s really going to happen

01:49:57

is that the kids are going to explore this on their own

01:50:00

and they’re going to learn how to learn

01:50:01

and they’re going to learn how to teach each other

01:50:02

and they’re going to do things with this

01:50:04

that we never even thought of.

01:50:06

And that is the best kind of education we can give.

01:50:09

Absolutely.

01:50:10

And also, I mean, if you suddenly have this flood of Linux machines,

01:50:14

or basically Linux machines out there in that age group,

01:50:21

you’re able to start dealing with creating another software culture

01:50:25

that doesn’t automatically enslave itself to Windows.

01:50:29

And they worked very hard to not put any kind of DRM,

01:50:35

not put any kind of reporting back on what the users are doing,

01:50:39

access controls, all of that kind of stuff is all missing from this deliberately.


01:50:50

So that it’s an empowering tool rather than a spy tool.

01:50:55

But they’ve got microphones and cameras in them, right?

01:51:00

There’s going to be a million of them with approximately the same software in it by the end of next year.

01:51:06

If somebody can figure out a way to write a virus that can propagate through that, it can start spying on those kids and their families, right? It’s an especially gentle breeze that

01:51:12

blows no ill. I mean, I cannot imagine that, you know, absolutely no harm will come of this,

01:51:18

especially given the really pure good intentions of everybody who’s involved in creating it.


01:51:27

I mean, this is a dead setup for some kind of disaster.

01:51:30

If I were a totalitarian government,

01:51:33

I might not want to buy 20 billion of those for my country.

01:51:37

It totally depends. I’m interested in whether China will actually buy it, for example.

01:51:40

Well, I think they will.

01:51:41

And in Brazil, for example.

01:51:43

You realize these are being built.

01:51:45

It’s going to take a lot of them.

01:51:46

Brazil is going to buy Brazilians.

01:51:50

Brazil?

01:51:50

Brazilians.

01:51:51

Brazil is a country that two years ago

01:51:54

spent more on software licensing

01:51:57

than it did on fighting hunger inside Brazil.

01:52:02

You know, that’s just, that’s beyond wrong.

01:52:06

The government or everybody in the country?

01:52:08

The government.

01:52:09

You know, I mean,

01:52:10

so, you know, something has to happen there.

01:52:14

And creating a,

01:52:16

you know, a culture of open source

01:52:18

kids, I think, is a

01:52:20

good start.

01:52:22

So, I think it’s going to be another

01:52:24

disruptive technology,

01:52:25

but it will be disruptive in those countries,

01:52:30

and it will be disruptive in a way

01:52:32

that produces a differently educated generation.

01:52:36

And whether that will come out positive, negative,

01:52:40

both in every which way, nobody knows yet.

01:52:42

Probably turn to crap, but there’s a huge opportunity.

01:52:46

There’s a guy back here who’s had his hand up for a long time.

01:52:48

Two things.

01:52:49

First, to address your point about the totalitarian government

01:52:53

getting their hands on a million of them

01:52:54

and not wanting the students to have a free operating system on it,

01:52:59

hardware specs are open.

01:53:00

They can replace them.

01:53:02

They can blow away Linux and Sugar and everything else on there and replace what they want on every single OLPC.

01:53:05

Right.

01:53:24

And I think it’s actually, you know, spend time in China and come back and tell me that that’s a totalitarian government at this point in the usual sense of the word.

01:53:30

I mean, there are a lot of, there are a lot of different things going on in China.

01:53:34

There are a lot of different factions.

01:53:35

There are a lot of people who are determined.

01:53:39

Actually, I would say more zealous about creating genuine liberty in China than there are in the United States.

01:53:46

Well, they have more people.

01:53:48

Well, as a percentage.

01:53:51

But they also have recent experience of…

01:53:52

People in power in China.

01:53:54

People in power who care about this.

01:53:56

A large fraction of their populace being killed off

01:53:58

by their own social programs.

01:54:00

Right.

01:54:01

I mean, they’re wary.

01:54:02

They’re, I think, understandably wary

01:54:04

of these big cyclonic

01:54:06

madnesses that happen in China periodically, like the Great Cultural Revolution. And they,

01:54:11

you know, they don’t want to allow that sort of thing to easily form. So they try to keep some

01:54:16

kind of, you know, some kind of resistance in the loop. But, you know, I think most people in China,

01:54:25

officials that I talk to,

01:54:27

they don’t think of themselves as being a totalitarian state at all.

01:54:31

Well, I think most of the people in the government here

01:54:34

would make the same answer.

01:54:37

But less credibly.

01:54:41

One other thing on technical.

01:54:43

The mesh networking aspects are proprietary. It’s firmware from Marvell. Yeah, it’s true.

01:54:51

there’s two pieces of firmware in there

01:54:54

both the Marvell and the

01:54:56

embedded controller that controls the power

01:54:58

and the keyboard.

01:54:59

It’s true.

01:55:05

The plan is actually to rewrite it.

01:55:07

It was?

01:55:08

Yeah.

01:55:09

And it remains so.

01:55:11

The speed of implementation

01:55:12

for the OLPC was to just go through.

01:55:14

Well, I think they’re also still trying to negotiate

01:55:16

to get that opened up.

01:55:18

Yeah, I spoke with a number of the OLPC tech leads

01:55:21

and they couldn’t give me a solid answer

01:55:22

whether or not they were in the program.

01:55:24

This all came together incredibly fast.

01:55:26

Oh, I know.

01:55:26

It was the one thing that I’ve heard it touted repeatedly

01:55:30

that mesh networking is a wonderful thing.

01:55:31

It just worries me slightly.

01:55:32

And even a company as decent as Marvell,

01:55:36

it’s still in that unusual position

01:55:37

where you have two really nasty, actually,

01:55:41

fields in the middle of something

01:55:42

that’s touted as wonderfully as that.

01:55:43

Well, the other piece there is that the mesh networking implements an IEEE standard

01:55:49

that’s fairly new.

01:55:50

It’s 802.11s, I think.

01:55:53

So you can get the specs for it and reimplement it yourself, in theory.

01:55:59

We’ll see how closely the theory matches.

01:56:02

Yeah, next.

01:56:04

This perhaps sounds like a stupid question, but I don’t mean it that way,

01:56:07

but I’d love my kids here in California to have $140 laptops.

01:56:12

Is there any barrier to the same design happening?

01:56:18

Is there a groundswell for that happening?

01:56:19

Christmas sales.

01:56:30

Actually, the barrier to its happening is that their model for getting these out into the world is to do big deals for at least a million laptops.

01:56:36

Nothing says they can’t do one with Walmart.

01:56:38

Well, most countries can do that.

01:56:41

It turns out in the United States, the education bureaucracies are not centralized.

01:56:48

They’re not even centralized at a state level. It’s by individual school districts.

01:56:50

So it would require someone to

01:56:51

put together a consortium of school districts

01:56:54

that could buy a million laptops

01:56:56

at a time from them and do the distribution

01:56:58

and stuff.

01:56:59

And they are looking at…

01:57:02

It doesn’t have to be that way.

01:57:04

This is still fluid.

01:57:05

I mean, this is not a bad idea.

01:57:08

I mean, I think, you know,

01:57:08

we can have a conversation with Nicholas

01:57:10

and make it happen.

01:57:12

Well, you know, you can go on their website

01:57:13

and you can see which governors of which states

01:57:16

have actually called up Walter and Nicholas

01:57:17

and talked to them about it.

01:57:19

And there’s maybe about eight states.

01:57:21

I think California’s one of them.

01:57:23

At this point, that’s not where they’re concentrated.

01:57:26

Their intention is to do third world countries first,

01:57:29

and if that’s successful, I’m sure they will.

01:57:31

Yeah.

01:57:32

According to Mary Lou,

01:57:33

this will be the fastest ramp-up of production of a product ever done.

01:57:39

The Xbox was the previous fastest,

01:57:42

and they’re trying to do it a little faster.

01:57:45

So once they’ve satisfied the demand in the third world,

01:57:48

they’ll be able to do millions more for here.

01:57:50

But you’ll probably want the second generation one,

01:57:53

which will be the $50 laptop.

01:57:55

The one that has the hard disk.

01:57:57

Right now.

01:57:58

I’m waiting for the developing world to get theirs first.

01:58:01

Yeah.

01:58:05

Yes?

01:58:06

I don’t mean to

01:58:07

rain on everybody’s

01:58:08

parade in terms of

01:58:09

the $100 laptop,

01:58:10

but what about

01:58:11

in instances of

01:58:12

inner cities where

01:58:12

computers have been

01:58:13

given and they just

01:58:14

end up sitting there,

01:58:16

not being used

01:58:16

because they don’t

01:58:17

know how to use them

01:58:17

or there are more

01:58:18

pressing needs?

01:58:19

And I think that

01:58:20

kind of skepticism

01:58:21

has been largely

01:58:22

ignored in terms of

01:58:22

the $100 laptop

01:58:23

because it’s kind of

01:58:24

like, well, how appropriate really is a computer in these kinds of environments?

01:58:29

In countries like Malawi, for example, or in Africa.

01:58:32

The problem is, you know, at least in cities in the United States,

01:58:36

is they give them to people who actually don’t like them.

01:58:39

I mean, they put them in the control of adults, for starters.

01:58:42

And they put them under the control of adults

01:58:46

that are suspicious of computers because they’re the wrong generation anyway. And they deal

01:58:53

with the computers as they’re something that can be easily broken. I mean, if you go to

01:58:56

Brazil where my pal who’s the Minister of Culture has managed to put up almost 600 computer centers over Brazil in the last two years.

01:59:09

All of those computer centers are in the hands of the kids that are using them.

01:59:14

They find the toughest kid that they can find,

01:59:17

and they have him in charge of taking care of the computers.

01:59:20

These are in favelas.

01:59:22

And the computers belong to the kids, and they get used all the time.

01:59:27

You go into one of those computer centers, and they’re constantly being used.

01:59:30

It’s just American education is so screwed up, you know.

01:59:38

I was a public school teacher for many years and a literacy specialist,

01:59:41

and, you know, this whole notion of kids will teach other kids, that has

01:59:46

a role once people are already literate

01:59:48

and you can’t

01:59:50

replace a human with a computer.

01:59:52

How did you learn how to use a computer?

01:59:53

Did you learn it from a teacher?

01:59:55

I sure didn’t.

01:59:57

What?

01:59:59

You were a nerd before?

02:00:01

I was literate.

02:00:02

So I mean, another thing is you have to have

02:00:04

a more knowledgeable other there to…

02:00:07

How many people in this room learned how to use a computer from some academic environment?

02:00:13

I think she’s saying that if you can’t read, you can’t teach yourself how to use a computer.

02:00:18

That’s not so.

02:00:19

Oh, yeah.

02:00:20

Not at all.

02:00:21

Why would that be?

02:00:22

Well, I…

02:00:23

I mean, I…

02:00:29

Your daughters learned to use computers before they could read. Exactly.

02:00:29

All my kids were using a computer pretty proficiently before they could read.

02:00:33

They just knew where things were placed on the screen.

02:00:36

It’s the same way that if you find yourself using a computer in a different language,

02:00:40

if you’re kind of familiar with the computer, you can still get it to do stuff.

02:00:44

You don’t have to read it.

02:00:45

But I think in that situation, what you’ve described is kind of a cultural

02:00:48

environment that really supports the ability

02:00:50

for your children to be able to go in

02:00:52

and engage with the computer

02:00:53

I mean your presence and that kind of familiarity

02:00:55

but I think in a situation where it’s really a foreign object

02:00:58

being kind of put into this environment

02:00:59

I don’t know

02:01:01

I’m just skeptical

02:01:02

I spent two, three years, on and off,

02:01:06

going around Africa

02:01:08

getting countries hooked up to the internet

02:01:11

and watching

02:01:13

what happened in these places

02:01:14

we have all these arrogant notions

02:01:17

about these poor

02:01:19

savages

02:01:19

and how they will never understand our

02:01:22

mighty technology

02:01:23

it floors me.

02:01:25

I mean, you go now to the places where I was in the beginning,

02:01:30

and you see computer systems in every beauty parlor.

02:01:33

I mean, the beauty parlors are the big places they got them.

02:01:36

I mean, they’ve got them in the most random sorts of places,

02:01:38

and ordinary people are using them.

02:01:41

They’re not scared of them at all.

02:01:44

We have a powerful hand in the bag.

02:01:46

I’d like to say that while that happens,

02:01:49

and it certainly did happen earlier when there were rollouts

02:01:53

and putting computers in with no backup and that sort of thing,

02:01:56

it’s kind of a misguided rep to leave with inner cities

02:02:00

because what’s much more common is community technology centers

02:02:03

and other programs going on where they’re doing great things

02:02:06

all those programs

02:02:08

got gutted by the Bush administration

02:02:10

but still there’s really a lot of

02:02:12

great inspiring stuff happening

02:02:13

in the inner cities so I would hate to leave it

02:02:16

that that’s what’s going on

02:02:17

there’s Project Mouse in New York

02:02:19

which does amazing stuff

02:02:21

if I could just add a little bit.

02:02:26

My girlfriend was in Burkina Faso.

02:02:27

She’s a journalist there.

02:02:28

She had a laptop with her.

02:02:30

They’ve got an 18% literacy rate.

02:02:31

She sat down, and everybody was looking at it

02:02:34

because they’re like, they’re pictures.

02:02:36

And so it was a visual literacy of like,

02:02:37

oh, the icon means an idea,

02:02:39

and I don’t have to know what these characters are.

02:02:41

They have a dress that you’re saying.

02:02:43

And what they were finding was that

02:02:44

a lot of the problems they were having

02:02:45

in the educational system was because they were

02:02:47

speaking French to the kids who didn’t speak

02:02:50

French back, and they didn’t have

02:02:51

a system where they could ask questions.

02:02:53

So it wasn’t interactive.

02:02:55

So A, you had a visual literacy

02:02:57

interactive system with a lot of

02:02:59

interest in it, as opposed to the antiquated

02:03:01

education system where

02:03:03

nobody was passing it. So that’s the other side of it, maybe.

02:03:07

I think another aspect of that is that the user interface is, for the most part, still

02:03:12

very text-based, it’s very word-based. If you think about pretty much anything, all

02:03:17

the menus come down in the beginning, little character streams, whereas more and more of

02:03:22

what is passing in and out of the computer is very visual.

02:03:27

It’s video.

02:03:28

And so that’s probably one of the paradigms that’s going to shift.

02:03:32

Is that the interface really?

02:03:34

I mean, you look at a browser, you’ve got a couple of arrows and a couple of pictures,

02:03:37

and that pretty much defines your experience, again, unless you go into the…

02:03:41

The thing that’s happening in these Brazilian computer centers

02:03:44

is that most of them are using them for musical purposes,

02:03:48

and they don’t have to read to get them to work for that purpose.

02:03:52

And that’s what sucks them in.

02:03:55

So, folks, we’re at time now.

02:03:57

I think we could go on for several more hours,

02:04:00

but in deference to our speakers who’ve come a long way,

02:04:03

I’m going to wrap things up here.

02:04:05

Thanks.

02:04:06

And I want to thank you all for your great questions

02:04:09

and for your interest.

02:04:10

Thank you.

02:04:11

Thanks.

02:04:11

Thank you.

02:04:21

You’re listening to The Psychedelic Salon,

02:04:24

where people are changing their lives one thought at a time.

02:04:28

I would be remiss if I didn’t point out how deeply I hoped John Perry was correct when he said that he didn’t think nuclear war was possible anymore.

02:04:37

Keep in mind the fact that this was said in the year 2006, and I agreed with him back then.

02:04:47

But today, with that mentally impaired child who is the commander-in-chief of the most powerful military in history, well, I’m now back to where

02:04:53

I was in grade school during the 1950s when we did duck and cover drills. Sadly, we are being

02:04:59

led by a confederacy of dunces, and no one has a clue as to what major world events lie ahead.

02:05:06

My only advice is to not let your supply of cannabis get too low. As Terence McKenna often

02:05:12

said, keep the old faith and stay high. Actually, I thought about giving you an update on the

02:05:18

various EFF projects that they talked about back in 2006. But since this podcast is already quite long,

02:05:26

I decided that if you’re really interested in one or more of these topics,

02:05:29

that you’ll go to EFF.org yourself and read about them,

02:05:33

which is why the Internet is here, after all.

02:05:37

I’m going to close now by first playing a recording of John Perry Barlow

02:05:41

reading his Declaration of the Independence of Cyberspace.

02:05:45

Then I’ll follow that by playing a song that on more than one occasion

02:05:49

inspired me to get back into the game of life as a player,

02:05:52

rather than just as another spectator.

02:05:55

The song that I’m going to play is from the Grateful Dead, of course,

02:05:58

and it’s titled We Can Run.

02:06:00

It was written by John Perry and the Grateful Dead’s keyboard player,

02:06:04

Brent Mydland, who also sings it.

02:06:07

Hopefully, some of the suggestions that Barlow and Gilmore made in this podcast today will spark an idea or two in you that will convince you to stop running after these elusive consumer dreams, if that’s what you’re doing, and stop hiding from what, deep down, you know to be your destiny. More than once I’ve had

02:06:27

to decide to stand up and be counted yet again, and it’s always turned out to be worth the risk.

02:06:33

No one needs to tell you what to do. You’re the best judge of that for yourself.

02:06:38

We’re all in this together, you know, and your fellow salonners are counting on you to do your best, just as John Perry Barlow has inspired us all to do.

02:06:47

So press on.

02:06:49

By the way, did you know that one of the words that I use

02:06:52

from the first and last lines of every podcast from the Psychedelic Salon,

02:06:56

the word cyberdelic,

02:06:58

did you know that that word was coined by none other than John Perry Barlow?

02:07:03

So each week when you listen to these podcasts

02:07:05

from the Psychedelic Salon

02:07:07

and you hear me say the word cyberdelic,

02:07:10

well, I hope that it’ll bring the memory

02:07:11

of John Perry Barlow back to your mind

02:07:13

and in that way we can help

02:07:15

to keep his wonderful spirit alive.

02:07:18

And for now, this is Lorenzo

02:07:20

signing off from cyberdelic space.

02:07:24

Be well, my friends.

02:07:27

A Declaration of the Independence of Cyberspace, as written by John Perry Barlow at the World

02:07:36

Economic Forum in Davos, Switzerland, on February 8th, 1996.

02:07:52

Governments of the industrial world, you weary giants of flesh and steel,

02:07:56

I come from cyberspace, the new home of mind.

02:08:05

On behalf of the future, I ask you of the past to leave us alone.

02:08:09

You are not welcome among us.

02:08:14

You have no sovereignty where we gather.

02:08:20

We have no elected government, nor are we likely to have one,

02:08:27

so I address you with no greater authority than that with which liberty itself always speaks.

02:08:39

I declare the global social space we are building to be naturally independent of the tyrannies you seek to impose on us.

02:08:50

You have no moral right to rule us, nor do you possess any methods of enforcement we have true reason to fear.

02:08:57

Governments derive their just powers from the consent of the governed.

02:09:02

You have neither solicited nor received ours.

02:09:04

We did not invite you.

02:09:10

You do not know us, nor do you know our world.

02:09:14

Cyberspace does not lie within your borders.

02:09:17

Do not think that you can build it as though it were a public works project.

02:09:21

You cannot.

02:09:23

It is an act of nature, and it grows itself through our collective actions.

02:09:33

You have not engaged in our great and gathering conversation, nor did you create the wealth of

02:09:40

our marketplaces. You do not know our culture, our ethics, or the unwritten codes

02:09:47

that already provide our society more order than could be obtained by any of your external

02:09:55

impositions. You claim there are problems among us that you need to solve.

02:10:05

You use this claim as an excuse to invade our precincts.

02:10:11

Many of these problems don’t exist.

02:10:15

Where there are real conflicts, where there are wrongs, we will identify them and address

02:10:20

them by our means.

02:10:24

We are forming our own social contract. This governance

02:10:29

will arise according to the conditions of our world, not yours. Our world is different.

02:10:39

Cyberspace consists of transactions, relationships, and thought itself, arrayed like a standing

02:10:48

wave in the web of our communications.

02:10:52

Ours is a world that is both everywhere and nowhere, but it is not where bodies live. We are creating a world that all may enter

02:11:07

without privilege or prejudice

02:11:09

accorded by race, economic power,

02:11:14

military force, or station of birth.

02:11:18

We are creating a world where anyone, anywhere,

02:11:22

may express his or her beliefs,

02:11:26

no matter how singular,

02:11:32

without fear of being coerced into silence or conformity.

02:11:43

Your legal concepts of property, expression, identity, movement, and context do not apply to us.

02:11:44

They are based on matter.

02:11:47

There is no matter here.

02:11:50

Our identities have no bodies,

02:11:53

so unlike you, we cannot obtain order by physical coercion.

02:11:59

We believe that from ethics, enlightened self-interest,

02:12:04

and the commonweal, our governance will emerge.

02:12:09

Our identities may be distributed across many of your jurisdictions. The only law that all of our

02:12:18

constituent cultures would generally recognize is the golden rule.

02:12:30

We hope we will be able to build our particular solutions on that basis.

02:12:36

But we cannot accept the solutions you are attempting to impose.

02:12:41

In the United States, you have today created a law,

02:12:44

the Telecommunications Reform Act,

02:12:46

which repudiates your own Constitution and insults the dreams of Jefferson, Washington, Mill, Madison,

02:12:53

de Tocqueville, and Brandeis.

02:12:56

These dreams must now be born anew in us.

02:13:08

You are terrified of your own children, since they are natives,

02:13:17

in a world where you will always be immigrants. Because you fear them, you entrust your bureaucracies with the parental responsibilities you are too cowardly to confront yourselves.

02:13:28

In our world, all the sentiments and expressions of humanity, from the debasing to the angelic, are parts of a seamless whole,

02:13:36

the global conversation of bits. We cannot separate the air that chokes from the air upon which wings beat.

02:13:50

In China, Germany, France, Russia, Singapore, Italy, and the United States,

02:13:57

you are trying to ward off the virus of liberty by erecting guard posts at the frontiers of cyberspace.

02:14:06

These may keep out the contagion for a small time, but they will not work in a world that

02:14:15

will soon be blanketed with bit-bearing media.

02:14:21

Your increasingly obsolete information industries would perpetuate themselves by proposing laws in America and elsewhere that claim to own speech itself throughout the world.

02:14:38

These laws would declare ideas to be another industrial product no more noble than pig iron.

02:14:49

In our world, whatever the human mind may create can be reproduced and distributed

02:14:57

infinitely at no cost. The global conveyance of thought no longer requires your factories to accomplish.

02:15:10

These increasingly hostile and colonial measures place us in the same position as those previous lovers of freedom and self-determination who had to reject the authorities of distant, uninformed powers.

02:15:27

We must declare our virtual selves immune to your sovereignty, even as we continue to consent

02:15:36

to your rule over our bodies. We will spread ourselves across the planet so that no one can arrest our thoughts.

02:15:47

We will create a civilization of the mind in cyberspace.

02:15:53

May it be more humane and fair than the world your governments have made before.

02:16:01

Davos, Switzerland, February 8, 1996

02:16:05

and read in New York City, July 30, 2013.

We don’t own this place, though we act as if we did

02:16:34

It belongs to the children of our children’s kids

02:16:37

The actual owners haven’t even been born yet

02:16:43

But we never tend the garden and we rarely pay the rent

02:16:48

Most of it is broken and the rest of it is bent

02:16:51

Put it on the plastic and I wonder where we’ll be when the bill’s in

02:16:56

We can run

02:16:59

But we can’t hide from it

02:17:04

Of all possible worlds

02:17:07

We only got one, we got a ride on it

02:17:11

Whatever we’ve done

02:17:13

We’ll never get far from what we’ve left behind

02:17:19

Baby, we can run, run, run

02:17:22

But we can’t hide

Well, I’m dumping my trash in your backyard

02:17:30

Making certain you don’t notice really isn’t so hard

02:17:33

You’re so busy with your guns

02:17:35

And all of your excuses to use them

02:17:38

Well, it’s oil for the rich and babies for the poor

02:17:44

We got everyone believing that more is more

02:17:48

If a reckoning comes, maybe we’ll know what to do then

02:17:53

We can run, but we can’t hide from it

02:18:00

Of all possible worlds, We only got one

02:18:05

We gotta ride on it

02:18:07

Whatever we’ve done

02:18:10

We’ll never get far

02:18:12

From what we leave behind

02:18:15

Baby, we can run, run, run

02:18:18

But we can’t hide

02:18:21

Oh no, we can’t hide


02:18:32

All these complications seem to leave no choice.

02:18:40

I heard the tongues of billions speak with just one voice.

02:18:43

Saying, just leave all the rest to me.

02:18:47

I need it worse than you, you see

Today I went out walking in the amber wind

02:19:15

There’s a hole in the sky where the light pours in

02:19:19

I remember the days when I wasn’t afraid of the sunshine

02:19:24

Now it beats down on the asphalt land

02:19:29

Like a hammering blow from God’s left hand

02:19:33

What little still grows

02:19:35

Cringes in the shade till the nighttime

02:19:38

We can run

02:19:41

But we can’t hide from it.

02:19:47

Of all possible worlds, we only got one.

02:19:50

We got a ride on it.

02:19:53

Whatever we’ve done, we’ll never get far from what we leave behind.

02:20:00

Baby, we can run, run, run, but we can’t hide. Oh no, we can’t hide from it.

02:20:17

We can’t hide from it.

02:20:18

Of all possible worlds, we only got one.

02:20:22

We gotta run.

02:20:23

We gotta run.

02:20:24

We gotta run. Whatever we’ve done. We only got one. We can run. We can run. We can run.

02:20:46

But we can’t hide from it.

02:20:49

That’s something we can’t do.

02:20:50

Of all possible worlds, we only got one.

02:20:54

We gotta run.

02:20:54

We gotta run.

02:20:55

I don’t admit it.

02:20:57

Whatever we’ve done.

02:20:59

Whatever we’ve done.

02:20:59

We’ll never get far.

02:21:01

We’ll never get far.

02:21:02

We’ll never get far.

02:21:04

Maybe we can run, run, run. We can run. We can run. We only got one.