Program Notes

https://www.patreon.com/lorenzohagerty

Guest speaker: John Gilmore

https://www.theguardian.com/science/2016/dec/20/francis-huxley-obituary - Francis Huxley, 1923-2016 - Photo: The Guardian

Date this lecture was recorded: August 2016

[NOTE: All quotations are by John Gilmore.]

“Don’t buy Apple products when they lock you into only using software that Apple approves of. It’s really straightforward. It’s like, don’t buy food that poisons you. Don’t buy from companies that try to control you.”

[In response to whether one can get their information back from Facebook.] “I don’t think there will be a way if you voluntarily hand over your data to a huge corporation that does not have your interest at heart. For you to get the data back, no, I don’t think there will be a way.”

EFF (Electronic Frontier Foundation)
HTTPS Everywhere
NoScript add-on
Privacy Badger

Francis Huxley obituary: Anthropologist fascinated by shamanism, myths and religious rites who strove to protect indigenous peoples.

Previous Episode

528 - History Ends In Green – Part 5

Next Episode

530 - A Psychedelic Moment In History

Transcript

00:00:00

Greetings from cyberdelic space, this is Lorenzo and I’m your host here in the psychedelic

00:00:23

salon.

00:00:24

And you’re probably wondering where I’ve been these past couple of weeks.

00:00:29

Well, as I recently told fellow salonner Groove, well, I’ve just been goofing off.

00:00:36

You see, once the end of the year holiday season came around, I thought that, well, I’d just take a week off and just do some reading and maybe binge watch a few shows.

00:00:46

But you know what they say about it only taking 30 days to form a new habit?

00:00:51

Well, I think that only applies to starting a new habit that isn’t a lot of fun.

00:00:56

However, I found that it only takes about three days to form a habit of goofing off.

00:01:02

And so I just got lazy.

00:01:05

But when a fellow septuagenarian like Groove wonders about where I’ve been,

00:01:10

well, I figured that it may be time for me to resurface so that you don’t think I’ve kicked the bucket.

00:01:15

And I don’t foresee that happening any time for at least another decade or so, so I’m still here.

00:01:22

And I still have a lot of things that I want to do, and now I’ve just wasted a few weeks by not doing any of them.

00:01:28

So it’s time for me to get back to the business of podcasting and writing and a few other things I want to do.

00:01:35

But first of all, I want to thank fellow salonners,

00:01:39

Light Spirit Essentials,

00:01:40

Victor B.,

00:01:42

Ryan Q.,

00:01:43

Andrew D.,

00:01:44

and Angel R., all of whom made donations that are going to be used to offset some of our expenses this year.

00:01:52

And I thank you all very much.

00:01:55

Now, before I introduce today’s program, I want to let you know that at the end of this podcast, I’ll tell you about a few steps that I’ve taken to get the Salon 2.0

00:02:05

rolling along sooner than I originally planned. And the headline is that in addition to new

00:02:12

programs coming out under the Salon 2 name, I’ll also continue to do these podcasts as I’ve been

00:02:18

doing for almost 12 years now. But with the addition of the Salon 2 programming, there should be at least one new

00:02:25

program every week. But I’ll be sharing the workload of doing them with some of our fellow

00:02:30

salonners. And there will also be a way for you to become involved if that’s something that you

00:02:36

want to do. But first, let’s get on with what will be the first podcast from the salon in the year 2017, which should prove to

00:02:46

be a very interesting year. Interesting, that is, in the sense that train wrecks are also interesting.

00:02:55

Anyway, for our first talk of the year, it seemed to me that it would be appropriate for us all to

00:03:01

get back into the groove by reminding ourselves about the importance of

00:03:05

keeping our private communications and web surfing just that, private. And in my opinion,

00:03:12

there are very few people in the world that are better able to answer our questions about online

00:03:17

security than John Gilmore. Now, if you do a search on John, you’re going to find that he is a man of many interests and has accomplished more than most of us even dream of.

00:03:28

But today he comes to the salon as one of the founders of the Electronic Frontier Foundation, EFF,

00:03:34

which has been working to secure our digital rights longer than any other group that I know of.

00:03:41

I should also mention that John is an integral member of Camp Soft Landing,

00:03:45

where the Palenque Norte lectures are hosted at Burning Man each year, and he’s been a friend

00:03:50

of the Salon’s podcast ever since they began. Now, just a heads up, but about 55 minutes from now,

00:03:57

John is going to be talking about the spyware that tracks you through any email lists that you are on.

00:04:04

Even if you are a security techie guru and don’t want to listen to this entire talk,

00:04:08

I highly recommend that you listen to what he has to say about companies

00:04:12

that send out mailing lists for people.

00:04:15

Maybe I’ve been out of the tech world for too long now,

00:04:18

but this was really news to me, and I think it’s important news.

00:04:22

So now let’s join John Gilmore and some friends on the Burning Man

00:04:27

Playa at Black Rock City last August as he answers

00:04:32

their most burning questions.

00:04:35

Next up is John Gilmore, co-founder of the Electronic Frontier Foundation.

00:04:41

Hello folks.

00:04:44

Hi.

00:04:46

Crowd on up.

00:04:49

Yeah, right.

00:04:52

So I’m really here basically to answer your questions.

00:04:57

So if you don’t have any questions, we can all just go back to having conversations.

00:05:01

But I’ve worked on drug policy for 16 years.

00:05:07

I worked in free software for like 25 years.

00:05:12

Co-founded the Electronic Frontier Foundation.

00:05:14

Been to Burning Man for 18 or 19 years.

00:05:18

You know, so I got a bunch of things

00:05:19

you could ask me about,

00:05:21

but I don’t know what you actually want to hear about.

00:05:24

So raise your hand.

00:05:26

Are your emails from 2003 accessible to some grand jury? Is that kind of the question?

00:05:33

Well, it really depends who you sent them to and where you and the other recipients keep them.

00:05:41

Like if you keep them in Gmail, they probably keep them forever. They were sent to

00:05:46

somebody else, so they would have to go to the somebody else to get them. Well, in the UK,

00:05:52

things are a bit different, where the GCHQ has actually tried like tapping every packet that

00:05:58

went through the international cables just to see if they could and what they could dig out of it,

00:06:04

and they declined

00:06:05

to say how much of it they keep or how long they keep it. So basically, the NSA can only collect

00:06:11

information if it is relevant to a couple of broad categories. And one of them is the foreign policy

00:06:18

of the United States, which means anything you say that affects anything that the U.S. does with any other country

00:06:25

or any other citizen of any other country,

00:06:28

then, you know, it’s kind of fair game if they want to claim it affects the foreign policy.

00:06:34

Ordering drugs from somewhere.

00:06:36

Hello, come on in.

00:06:38

Yeah, if you order drugs across an international border using email,

00:06:43

somebody’s probably reading that email.

00:06:46

If you were to recommend a means of digital communication

00:06:49

that is more secure,

00:06:54

what means of digital communication would you recommend?

00:07:00

Well, it depends, you know, it depends is the answer.

00:07:18

When smuggling contraband into Burma, what people have been doing is filling up USB sticks and walking them across the river from another country.

00:07:20

Sneaker net.

00:07:30

Yeah, sneakernet. And this is how all sorts of forbidden Western video and things like that are getting into that closed society.

00:07:35

Because you can carry a lot on like a 32 gig USB stick these days.

00:07:41

But if you’re talking about like how do I send email or text messages to somebody,

00:07:49

there’s an app called Signal that a lot of people seem to depend on. I haven’t used it myself, but I know a lot of people who do. In general, one of the things

00:07:56

that Edward Snowden taught us is that encryption actually tends to work. Like, really, you know,

00:08:07

academically vetted, properly encrypted,

00:08:10

more or less bug-free encryption

00:08:12

will actually keep the NSA from reading your stuff.

00:08:17

But it’s easy to make mistakes,

00:08:20

and that’s usually how they get you.

00:08:24

But it keeps your stuff out of the sort of vacuum cleaner trawl

00:08:29

where they’re just looking for anything interesting.

00:08:33

If you come to their attention

00:08:36

and they actually go to the trouble to break into your computer,

00:08:40

then they can watch you as you type it

00:08:42

before you ever encrypt it, for example.

00:08:44

But they’re not doing that to everybody,

00:08:46

because it would be too easy to catch them at it.

00:08:50

Do I have any insight into how pervasive parallel construction is?

00:08:57

I think it’s fairly pervasive.

00:09:01

This is the method where the cops use some illegal means

00:09:07

to get the information that they want about you,

00:09:10

and then they go back later and find another legal means

00:09:15

and prosecute you with that.

00:09:18

We found a bunch of FBI memos that said,

00:09:21

well,

00:09:24

here’s

00:09:26

a way of finding

00:09:28

people using

00:09:30

what they called stingrays,

00:09:32

these

00:09:34

fake cell sites that

00:09:36

pretend to offer cell phone service

00:09:38

that has a stronger signal

00:09:40

than the main cell phone towers

00:09:41

and then your phone will latch on to them

00:09:44

and all your

00:09:46

calls will go through them. And also your phone will identify itself to them, saying, this is Trip’s

00:09:51

phone; anytime somebody calls Trip, it should come here. The FBI was teaching all these local

00:10:01

police departments how to do this, but they also told them, you can’t

00:10:06

let any of this information get into a court case, because that will jeopardize this law enforcement

00:10:13

technique. So instead, you have to do parallel construction, that once you found somebody

00:10:19

through this, or you heard there was going to be a drug deal or whatever, then you have to plant a cop on the corner there

00:10:26

who just happens to notice somebody drive up and do a drug deal.

00:10:30

And then you can bust them for that.

00:10:32

And you never tell them that you actually heard it

00:10:35

over this cell phone interceptor.

00:10:40

I mean, we actually got documents that said,

00:10:44

lie to the courts about where you got this information.

00:10:49

So also, just from what I’ve learned about drug policy in the last 16 years: the practice of police lying on the stand in order to secure a conviction is so common

00:11:06

and so popular that it has its own

00:11:08

word. It’s called

00:11:09

testilying.

00:11:13

And I think it’s

00:11:14

pretty common.

00:11:15

When I myself was busted years ago

00:11:18

in an illegal search,

00:11:20

I told my lawyer,

00:11:22

well, this is an illegal search.

00:11:24

They can’t get me for this.

00:11:25

And he said, well, kid, the cop will just lie,

00:11:32

and the judge will believe him, not you.

00:11:36

And that’s still how it is, as far as I can tell.

00:11:41

Anybody else?

00:11:44

No questions? Am I going to have to ask you guys questions?

00:11:49

I actually have 36 questions, but I was going to let other people ask them first.

00:11:55

My next question is, what are the main goals of the EFF in the coming year? What are you aiming at trying to accomplish?

00:12:02

The main goals of EFF. Okay. Well, EFF used to be a small organization, and it was easier to keep track of its main goals.

00:12:18

There are now between 70 and 80 people working at EFF, all funded by you folks, the folks among you who are members. And they’re

00:12:29

working on so many things, I even find it hard to keep track of it. But fundamentally,

00:12:34

they are trying to ease the integration of electronic communications and computing into society,

00:12:50

trying to cause as few expectations to be broken,

00:12:57

trying to cause as little damage as possible to long-standing principles.

00:13:09

Things like privacy and free expression and all of that

00:13:15

are all changed as we all move our communications online.

00:13:20

And sometimes those changes are beneficial and sometimes not so much.

00:13:26

And so our goals are to explain those changes to people so they understand what’s going on and to try to both steer the society and steer the technology in directions that are

00:13:35

positive and beneficial, to ameliorate the privacy problems that come when computers

00:13:42

are involved in every interaction you make

00:13:45

and they’re leaving detailed audit trails of everything you do.

00:13:49

And at the same time, trying to make real the opportunities that come from that,

00:13:58

like the opportunity to communicate in privacy and in security with your friends

00:14:04

no matter where they are on the earth.

00:14:09

So when it comes down to doing that, there’s a lot of detail.

00:14:15

Our first five years were mostly spent on censorship issues.

00:14:20

In the U.S., the Communications Decency Act and things like that,

00:14:50

which we ultimately beat in the Supreme Court, we got this great decision that said that a message posted over the Internet should have the same protection under the law as the lonely pamphleteer who was around in the days of the American Revolution. The next period of time, a lot of that was spent on wiretapping and surveillance and

00:14:57

things like that. And then we spent another five or ten years on copyright.

00:15:10

And once Hollywood figured out that their revenue streams were threatened by the Internet because they were no longer the monopoly or oligopoly providers of entertainment,

00:15:16

then there was a lot of trouble with them trying to shut down anybody who made it possible

00:15:22

to get any kind of music or movies

00:15:25

or whatever that you wanted that didn’t involve going through them.

00:15:31

Then we got involved post-9-11 again in wiretapping and encryption and things like that.

00:15:39

Somewhere along the line, patents became a big deal. And we have a long-standing effort now that is…

00:15:52

We nominate a stupid patent every month.

00:15:56

And we try to get it canceled.

00:16:00

Let’s see.

00:16:01

You’d have to look on the blog to know what the exact last one was.

00:16:14

But a lot of these are along the lines of: do something that people have already been doing for 100 years.

00:16:21

Good. There’s an endless parade of people trying to manipulate the system when the system involves having the government give you a 20-year monopoly

00:16:26

on something. Because then you can extract money from people for 20 years if you can manage to

00:16:31

manipulate it that way. And that’s what a patent is. New methods of surveillance that we should be... Let’s see.

00:16:45

Well, have you heard of Edward Snowden?

00:16:48

he was a contractor at the National Security Agency

00:16:55

and he decided that they had gone beyond the pale

00:16:59

that instead of spying on foreigners

00:17:02

for the benefit of the American government,

00:17:05

they had turned to spying on Americans for their own benefit.

00:17:11

And he brought with him a huge cache of documents about what they were doing,

00:17:15

which they had been keeping secret from us for decades.

00:17:19

And he gave them to various people in the press

00:17:22

who had been reading through them and publishing them and writing stories about them.

00:17:28

What we learned is that they are wiretapping pretty much anything they want to wiretap domestically or around the world.

00:17:39

And they’re filtering through that stuff with a system that looks for who’s talking to who and saving that data for a long, long time.

00:17:52

The idea is if 10 years or 20 years from now you do something interesting to them, they want to be able to go back and look at

00:18:05

every person you ever communicated with for your entire life. Where did you get these

00:18:10

ideas? Who are you collaborating with? How can we stop you from doing the thing that

00:18:16

now threatens them, whether it’s good for society or bad?

00:18:21

And so they’re collecting not the content of your communications necessarily, but who’s talking to who.

00:18:29

And they’ll use that not for your benefit, of course.

00:18:34

So that’s one form of communication.

00:18:38

Yeah, go ahead.

00:18:40

Besides government surveillance, what other court cases currently in progress are the highest priority of EFF?

00:18:50

Okay.

00:18:51

Well, yeah, we are suing the NSA over government surveillance, and that’s been going on for about a decade.

00:18:59

And so far, we still have not gotten the courts to give us a straightforward answer, is what they’re doing illegal or not.

00:19:08

But we have probably 30 or 40 other cases going on at any given time.

00:19:14

About half of them are freedom of information cases where we have asked the government for documents about stingrays or documents about patent policies or whatever,

00:19:27

and they haven’t answered.

00:19:28

Or they sent us minimal info, and we know they have more, and we have to sue them over it.

00:19:33

So those are the sort of less interesting cases until if and when they turn up with real documents that show what’s going on.

00:19:46

But the more interesting cases are where we’re involved in trying to

00:19:53

litigate a particular policy issue that affects a lot of people.

00:20:07

So one example, we call it the dancing baby case. And it’s about a mom who shot a little short video, like less than a minute,

00:20:18

of her toddler dancing in the kitchen with a Prince song in the background that the baby is dancing to.

00:20:24

She put that up on YouTube so that the baby’s grandma could see it, and Prince’s lawyers had YouTube

00:20:30

take it down. And the reason is because it had 15 seconds of a Prince song in it. Well,

00:20:38

we thought this was insane, and we took this through the courts and it’s been a slow case

00:20:47

but we got a good ruling out of it

00:20:53

that basically said

00:20:54

if you’re building a censorship system

00:20:59

for copyrighted material

00:21:01

which is what this takedown process is

00:21:03

you have to go to some effort to not

00:21:09

censor things that are allowed to be there. So if you put up a complete copy of a Prince song and

00:21:17

you go, here, download this Prince song, they can take it down. And that’s valid under the law.

00:21:29

But if the Prince song is incidental to what you’re really doing,

00:21:33

and it doesn’t actually impact the market for Prince songs,

00:21:37

then you have a right under the Fair Use Doctrine to make use of that song in accomplishing your own communication.

00:21:43

And the record companies

00:21:46

had instead built a system

00:21:49

that just looked for any snippet

00:21:51

of any song that they owned

00:21:52

and say, take it down,

00:21:54

without ever even having a human look at it

00:21:57

or listen to it,

00:21:58

to say, were they making a fair use of it?

00:22:01

Was this transformative?

00:22:03

Was this educational?

00:22:05

Was this incidental?

00:22:06

Did this meet any of the criteria for legality?

00:22:10

And we got a court ruling that said they have to do that.

00:22:15

If they’re going to use this mechanism that says you send a letter to the website and it has to take it down,

00:22:26

you have to certify that it really violates your copyright. And it doesn’t violate your

00:22:31

copyright if it’s a fair use. So the record companies, of course, are fighting this, because

00:22:39

it means rather than using a robot to take down everything, they actually have to have

00:22:43

someone watch each of these things

00:22:45

and make a judgment call as to whether it’s worth it.

00:22:48

But it’s more important to have an uncensored communications medium

00:22:54

than to save the record company’s money.

00:22:57

Yes?

00:23:00

Hello.

00:23:02

Flow artists are currently being attacked on Facebook and their videos are being destroyed because of this, because of copyright issues.

00:23:12

My curiosity lies around that kind of gray area of how much of that song, so let’s say maybe if it’s three minutes versus the full three minute and 15 seconds,

00:23:25

where do you really draw that line where it no longer becomes okay for, let’s say, an individual user to have that playing in the

00:23:29

background of their video versus it actually infringing on their copyright? Right. Well,

00:23:36

it turns out there’s an answer to this, but you won’t like it much. Seven notes. Seven notes.

00:23:46

now

00:23:47

this is the law

00:23:50

from an older era

00:23:52

that if you wrote a song

00:23:54

and then somebody else

00:23:56

wrote a song and it’s similar

00:23:58

if you copied seven notes

00:24:01

from the first song

00:24:02

then his copyright controls it.

00:24:06

Because, you know, think of how many ways

00:24:08

you could generate seven notes.

00:24:10

There are a lot of different ways.

00:24:13

Well, to build on that,

00:24:14

the focus isn’t primarily the music.

00:24:18

It’s primarily the art that’s being performed.

00:24:21

And so does that still tie into the seven-note, I guess, rule?

00:24:26

Yes, it does. It turns out it used to be hard to transform other people’s material

00:24:32

into new artistic forms that way but technology has brought us the ability to just grab a chunk

00:24:41

of music from here and some video from there and some imagery and some computer-generated stuff

00:24:47

and merge it all together into a new art form.

00:24:50

Even just the process of sampling other people’s music

00:24:54

and incorporating it into a song

00:24:57

has generated a whole lot of controversy

00:25:00

as far as what does the law say now and what should the law say to enable

00:25:07

both people to make a living at music and also for there to be freedom of artistic expression.

00:25:15

That balance has not yet been worked out. And so a lot of people are going to fall

00:25:23

into those cracks and be hurt by being on

00:25:26

one side or the other either by

00:25:28

deciding oh I can’t do

00:25:30

flow acts because

00:25:32

it’s all going to get me in trouble

00:25:34

and so I just won’t be creative that way

00:25:37

versus oh I’m just

00:25:38

going to do it and then I’m going to get caught

00:25:40

and I’m going to get sued and my house

00:25:42

will get taken away.

00:25:50

Or you find the alternative in between and you find independent artists that can then upload their music and allow it for free distribution, which is now happening.

00:25:55

Right.

00:25:55

Well, so one of the classic ways around this problem is for artists who could have locked

00:26:02

down their music to freely choose to let other people reuse it.

00:26:08

And that’s called the open content movement.

00:26:12

And it started actually in the free software movement

00:26:16

with software.

00:26:18

And then a really smart guy named Larry Lessig,

00:26:21

who was studying the free software movement,

00:26:30

realized, oh, we could make easy ways for people to do this with things other than software.

00:26:35

And he started the Creative Commons organization and wrote the original Creative Commons licenses.

00:26:39

And by now, hundreds of millions of works have been licensed by their creators under these licenses.

00:26:47

And they’ve made five or six of them that give you different bits of control.

00:26:52

So you can say, for example, anyone can do anything they want with my song or my drawing or my words,

00:27:01

as long as they just attribute it to me. Let people know that I wrote it and

00:27:07

otherwise do what you want. That’s the CC BY license. You just have to tell them who it’s by.

00:27:13

There’s also a non-commercial version that says you can do anything you want as long as you’re

00:27:19

not making money from it. There’s a share-alike version that says,

00:27:25

you can do anything you want with my material if you give all the recipients of your material

00:27:31

the same right. So you can build on mine if you let other

00:27:36

people build on yours. And these ways of easily marking your material make it possible for people who want to build new material to find collaborators who they don’t have to negotiate with, who they can just go, oh, he put that song up under a CC BY license.

00:27:57

It means I can use it and he’s not going to sue me.

00:28:01

My question was, how do they mark it?

00:28:03

But I think you just answered that.

00:28:05

It’s the license, a certain type of license that they have.

00:28:10

So the way you mark it is first you say copyright by you on such and such a year.

00:28:16

And then you say CC licensed under this particular license.

00:28:24

And they have little logos and things like that

00:28:26

you can put in for graphics or you can use the words.

00:28:29

And it has a link out to the main legal text of the license

00:28:33

and also to an explanation of what it means.
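
[Editor’s note: As a concrete illustration of the marking John describes, a notice along these lines can be attached to a work. The name and year below are hypothetical examples, not from the talk; the license name and link are the standard Creative Commons Attribution 4.0 deed.]

    Copyright 2016 by Jane Example.
    This work is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license: https://creativecommons.org/licenses/by/4.0/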

00:28:36

But fundamentally, once you’ve gone through this a few times

00:28:40

and you’ve clicked through the link and looked at it,

00:28:42

then you realize, oh, anytime I see this CC in a circle, it means Creative Commons and it means I’ve got the right, instead of all

00:28:50

rights reserved under copyright, it means some rights reserved and the public can do other things.

00:28:57

Is this widely used now? It is very widely used.

00:29:08

Cory Doctorow, who’s going to speak here tomorrow,

00:29:12

releases all of his science fiction books under the Creative Commons license.

00:29:16

There are millions of pictures people have uploaded to Flickr and other places like that

00:29:18

that are specifically marked with Creative Commons licenses.

00:29:23

Everything you find in Wikipedia

00:29:26

is licensed under a Creative Commons license.

00:29:30

And so you’re free to extract that

00:29:32

and use it in other works.

00:29:34

In the back.

00:29:37

What do you think is going to happen

00:29:40

with these recent attempts

00:29:43

to ban end-to-end encryption, like in the UK the Snoopers’

00:29:48

Charter? Are any of these governments actually going to try to do that, and if so, how would it

00:29:56

play out? Like, is a company like Apple going to just pull out and not do business, or

00:30:01

are they going to rewrite their system to comply? How do you see this

00:30:06

kind of issue playing out? It’s an interesting question.

00:30:12

It’s not a new issue. It’s actually been an ongoing issue for a while.

00:30:19

Ten years ago, if you had a BlackBerry cell phone, the texts that went from one BlackBerry to another were encrypted and were not readable without the help of the BlackBerry company.

00:30:39

Now, it turns out part of the reason for this was they were marketing these to a bunch of

00:30:46

companies and if the

00:30:48

company put a BlackBerry server into

00:30:50

their own data center then

00:30:52

the messages could only

00:30:54

be

00:30:55

intercepted

00:30:58

by going through the company’s server

00:31:00

so even BlackBerry

00:31:02

couldn’t get at them

00:31:03

and this was a big deal for places like stock brokerages, financial institutions,

00:31:15

places where leakage of information can cause large amounts of money to slosh around in bad ways.

00:31:23

India got upset by this.

00:31:27

And they eventually went to BlackBerry and said,

00:31:30

you can’t sell these phones in our country because we can’t wiretap them.

00:31:34

Cops could go to BlackBerry with a subpoena and try to get this stuff,

00:31:38

but the problem is a lot of governments are monitoring things

00:31:42

that they don’t want you to know they’re monitoring.

00:31:44

They don’t want companies to know they’re monitoring. They don’t want companies to know they’re monitoring. And also, BlackBerry was based

00:31:49

in Canada, which is kind of inconvenient for people, for cops in the United States who

00:31:54

didn’t want to have to go through a whole international hoorah just to wiretap the guy

00:32:00

who they’re suspecting of selling marijuana or something.

00:32:06

Back to privacy.

00:32:13

360 video is now coming out big, rather expensive still, but coming out.

00:32:14

Can’t quite hear you.

00:32:23

360 video, the spherical video, where, unlike just usual videos,

00:32:26

you have control over where the camera’s looking.

00:32:31

In a 360 system, it’s taking pictures all the way around you at the same time.

00:32:39

So for documentaries and so forth, there’s a whole question about how you direct people to look here and not at the guy running around doing something else over there.

00:32:44

Have the EFF looked at the 360 privacy question at all?

00:32:49

No, that one hasn’t come up for us.

00:32:52

It’s still sufficiently small of a niche,

00:32:55

and it also sounds like it’s,

00:32:59

in some ways, it’s sort of a cinematography thing.

00:33:03

Like, if you’re actually trying to capture something in 360,

00:33:07

but you want to direct people’s attention here,

00:33:09

what techniques do you use to do that?

00:33:12

Not really our issue at this point.

00:33:15

But the guy doing something else behind it that he wanted kept private,

00:33:20

people like him here, if they were doing 360 video,

00:33:23

not wanting to be in the video.

00:33:25

Right. I think we have a 360 video camera in the room.

00:33:30

Well, if you’re recording video of people in all directions,

00:33:36

especially if you’re actually doing it for public release,

00:33:39

you should be getting permission from those people.

00:33:42

And if the camera can see more people, you

00:33:45

need more people to give you permission. You may need to black them out if you can’t get

00:33:49

their permission.

00:33:51

I was wondering about net neutrality as it applies to Internet of Things. For example,

00:33:58

the content self-driving cars are collecting and how do we ensure that as our roads are used by corporations and self-driving cars,

00:34:08

how do we maintain sovereignty over the commons?

00:34:12

Because let’s say an exec needs to get to work, he pushes a button and all the cars pull over for him.

00:34:17

So I want to know if EFF was thinking about that.

00:34:21

And then also about video capture and live blogging. I know people

00:34:26

that do live blogging and they’re always recording everything and they go to events like this

00:34:31

and Ephemeral and other things. And I was wondering, what can we do about that and how

00:34:35

can we defend privacy in instances like that?

00:34:38

Right. Let me take the second half first, like the live blogging thing.

00:34:45

Do you remember a product called Google Glass?

00:34:50

Why did it die?

00:34:53

It died because people thought it was creepy.

00:34:57

And when people saw you wearing Google Glass, they walked the other way.

00:35:04

And I think there’s sort of a social process that goes on around these

00:35:09

things where currently the expectation is, if you walk up and talk to somebody, you’re not being

00:35:15

recorded, you’re not being live-streamed to the internet, etc. And if you’re going to

00:35:21

do those things, you need to put people on notice and let them choose whether they want to participate in that.

00:35:32

What was the first half?

00:35:34

It was about Internet of Things and self-driving cars.

00:35:38

Yeah, the roads and all that.

00:35:43

Your question sort of wraps up a whole bunch of different aspects.

00:35:47

One of them is the detailed monitoring of ordinary activities,

00:35:54

like Tesla recording everything that your car is doing,

00:35:58

every bump it hits, and every time you push the gas pedal down

00:36:02

or let it off or turn the steering wheel,

00:36:04

and uploading that

00:36:06

for later analysis of how they can improve how they make their cars.

00:36:12

There’s a lot of that going on. Most of it’s going on on the web.

00:36:17

Some of it’s going on in apps and phones. And so far, it hasn’t really escaped beyond that

00:36:23

very far. But people keep dreaming of this Internet of Things idea,

00:36:27

not so much that it would help you,

00:36:30

but that it would let them capture all sorts of stuff about you.

00:36:36

Personally, I think we actually have to care about this,

00:36:41

and the way I deal with it is I decline to use products that spy on me that way.

00:36:48

So I don’t do searches at Google.

00:36:54

I have plugins in my browser

00:36:57

that don’t let Google Analytics run in my browser.

00:37:02

There are like 15 different free services that Google offers to individuals

00:37:09

and to webmasters and things like that. Google Fonts and Captchas and the analytics and free

00:37:18

search and all of that that you can drop into your site. Oh, the like buttons. There’s another big one. All of those were

00:37:28

not provided out of the generosity of their hearts. All of those were provided because

00:37:36

they provided another data stream about what each person on the web is doing. And I take pains to turn as many of them off as I can.

00:37:50

And often I do that just by declining to use the services that come with that price.

00:37:57

Now, yeah, I’ll get to you in a minute,

00:38:00

but the self-driving cars and that sort of thing,

00:38:17

there’s a societal process of getting used to the idea of having robots drive us around.

00:38:24

And drive around packages and things like that that aren’t us,

00:38:27

and fly through the skies without human control and all of that.

00:38:39

The response of the traditional social and governmental systems to this has been to try to regulate it using government regulators,

00:38:44

saying you can’t sell those cars

00:38:47

here unless they meet our criteria. You can’t drive that on the roads of California because

00:38:53

we didn’t pass a law that lets you, etc. I’m concerned that, like with many things in the

00:39:02

government, the people who regulate that will be captured

00:39:06

by the people who are doing the work and they won’t really end up protecting the public.

00:39:13

So I’m more interested in actually us creating a diversity of different ways of doing self-driving vehicles.

00:39:26

So that you can’t just buy it from Tesla or from Ford or whatever,

00:39:32

but that there can be university projects that make self-driving cars,

00:39:39

and you could actually run that code in your car if you wanted to.

00:39:47

In other words,

00:39:53

there’d be a whole ecosystem of these and they would not only compete on how well they drive your car, but also on how much privacy they offer you. So that you could choose to

00:40:00

have one that wasn’t monitoring you everywhere you went, that wasn’t telling Google every

00:40:05

place you go, and it would give you the opportunity to make that choice. We’re still a long way

00:40:14

from there, and it’ll take some smart tech work as well as some other activism and publicity and marketing and things like that to make that ecosystem happen.

00:40:27

If you haven’t already covered this, do you see a way out of closed app stores and ecosystems

00:40:35

and back into the realm of open standards? I’m thinking of things like the Apple App Store,

00:40:41

where there’s one company with control over what code we can and can’t execute?

00:40:45

Oh, yeah.

00:40:48

There’s an obvious way out of them.

00:40:50

Don’t use them.

00:40:57

Don’t buy Apple products when they lock you in to only using software that Apple approves of.

00:40:58

It’s really straightforward.

00:41:04

It’s like, you know, don’t buy food that poisons you, you know, don’t buy from companies that try to

00:41:08

control you. Yeah, on a similar note, do you think it’s likely that there will be any sort of

00:41:15

regulations passed in the US that are similar to those in the EU, like the right to be forgotten,

00:41:22

that would enable users who in their youth

00:41:27

might have been ignorant and used Facebook

00:41:29

and Facebook had captured all of their information

00:41:31

and you might stop using a service like Facebook

00:41:35

but they keep your data forever.

00:41:39

Do you think there’s any way that consumers

00:41:42

will ever be able to liberate themselves from Facebook and its caching of their data?

00:41:49

The right to be forgotten is not

00:41:52

one of my favorite things, actually, because it’s a censorship mechanism.

00:41:57

And so far it has mostly been used

00:41:59

by rich people who don’t want the public to find out what they’ve been doing

00:42:03

using particular laws in the UK.

00:42:09

In general, I think this issue is going to be solved by social change,

00:42:17

not by legal change, for the most part.

00:42:22

There’s a significant degree of tolerance in our society

00:42:26

for youthful indiscretions

00:42:30

for kids who steal from the candy store down the block

00:42:35

and who later figure out that this is probably not a good idea

00:42:38

they learn they get forgiven

00:42:42

and nobody holds it against them when they’re 40.

00:42:55

And so the silly things that you posted on Facebook when you were 12, they’ll still be out there probably, but they won’t really socially come back to bite you.

00:42:57

Right? Because nobody will care.

00:43:07

So do you think there will ever be a way to at least have Facebook not keep the data forever?

00:43:11

No, I don’t think there will be a way.

00:43:17

If you voluntarily hand over your data to a huge corporation that does not have your interest at heart,

00:43:20

for you to get the data back, no, I don’t think there will be a way.

00:43:26

If there’s any assurance, it’s that their policies say that if you delete your account, they delete all of your data within some number of days.

00:43:29

And they can change that

00:43:30

policy anytime they want.

00:43:31

With new consent language, yeah.

00:43:34

So recently there was a case of

00:43:36

a photographer who had donated

00:43:38

thousands of her images to the

00:43:40

public domain and

00:43:41

she got a cease and desist

00:43:43

letter saying that she was, I guess,

00:43:49

distributing images that were owned by the Getty Institute.

00:43:53

Are you familiar with the case?

00:43:55

I’m not familiar with that case.

00:43:57

Okay.

00:43:57

Well, it was recent, and it turns out that the Getty had acquired this, her

00:44:02

archive of public images, and then one of the

00:44:05

Getty’s third-party

00:44:07

companies was just

00:44:09

like these patent trolls kind of thing,

00:44:12

just sending it out.

00:44:14

I didn’t know if the EFF had

00:44:15

been involved in that.

00:44:17

It’s kind of like patent trolling, but this is a unique

00:44:20

case because she had donated it to the

00:44:22

public domain, and then they came back to her

00:44:23

with a threatening letter, and so she’s suing them back to say, you know, they’ve done this to

00:44:29

other people. Yeah. Lots of smaller foundations as well, some family foundations, saying, oh, your

00:44:37

foundation is interested in protecting the First Amendment, and here are our projects that work on

00:44:43

the First Amendment. Would you like to fund one of them?

00:44:47

Our original breakthrough in that realm of foundation funding

00:44:52

came through around electronic voting

00:44:55

because there were a bunch of foundations

00:44:58

back in, I think, the Bush versus Gore days

00:45:03

that wanted to protect the integrity of the election system.

00:45:08

And they saw us, both our ability to see the policy issues

00:45:13

and the technological issues in that

00:45:15

as key to protecting the election from being stolen

00:45:21

or invalidated by people breaking into the voting machines

00:45:25

and changing all the votes.

00:45:28

But interestingly,

00:45:31

foundations tend as a species

00:45:36

to act like lemmings.

00:45:38

If they’ve never heard of you,

00:45:40

they’ll never send you money.

00:45:42

You can apply over and over

00:45:43

and they won’t send you money.

00:45:48

But once they send you money for something,

00:45:50

then it’s like, oh, now we know these guys,

00:45:52

we can send them more money.

00:45:54

And so after we broke through on one issue,

00:45:57

then they started going, oh, and look,

00:45:59

you’re working on these other issues we care about too.

00:46:03

And so that became a third pillar of our funding, which

00:46:07

for a good many years were about three equal pillars. So we had rich techies and individual

00:46:16

members and foundations, each paying about a third of what it costs to run EFF. Well, then it turns out an odd thing happened.

00:46:27

Corporate sponsorship.

00:46:31

Now, we’ve been accused by a few people

00:46:35

of being like stooges for Google and things like that,

00:46:39

which is pretty far from the truth.

00:46:43

It turns out you would never guess

00:46:46

the corporation that provided

00:46:49

like 80% of the corporate donations to EFF.

00:46:53

I mean, I could invite you to try if you want to try,

00:46:57

but I don’t see any hands.

00:47:00

It’s called Humble Bundle Incorporated.

00:47:07

It was started by two college students in a dorm room

00:47:10

because they knew a bunch of other students

00:47:13

who had written video games

00:47:15

that they couldn’t get distribution of.

00:47:18

And they said, well, we’re studying marketing.

00:47:21

We’ve figured out a way to market these.

00:47:23

We’ll put up a website.

00:47:25

We’ll let people pay whatever price they want for this bundle of like six video games.

00:47:31

But it comes with a deadline.

00:47:34

You can only buy it for the next three weeks.

00:47:37

After that, we’ll shut it off.

00:47:39

So you better buy it if you want it.

00:47:41

If you want even one of those games, you could send us five bucks and you get all six.

00:47:46

And they were successful.

00:47:48

Their first round of doing that

00:47:50

raised more than a million dollars.

00:47:54

And they, actually, they made a pledge

00:47:58

that if they raised more than a million dollars,

00:48:01

they would free up the source code of all those games, too.

00:48:04

And they did.

00:48:11

But part of their marketing model was, besides getting to set whatever price you wanted to buy the bundle, you could also allocate a fraction of it to two different charities.

00:48:18

One of them was called Child’s Play, which buys video game consoles and gives them to kids in hospitals.

00:48:27

And the other was EFF.

00:48:31

And on the sliders,

00:48:34

they just put up sliders where you would buy the thing,

00:48:36

and they had it set by default

00:48:39

that 10% of the money would go to the charities.

00:48:43

But you could change that slider

00:48:45

so 100% of the money went to the charity.

00:48:48

Or zero.

00:48:48

It was all up to you.

00:48:50

And then these charities co-marketed

00:48:53

the video games.

00:48:55

So we put out a blog post that says

00:48:57

hey, if you want a bunch of video games

00:48:59

at whatever price you want to pay

00:49:00

they’re over here.

00:49:02

And oh, by the way,

00:49:03

you’ll be supporting EFF if you buy them.

00:49:06

That dorm room effort became a company and marketed hundreds and hundreds of video games

00:49:13

and books and other things. And EFF became one of the charities that was in many of their bundles.

00:49:24

And the result was they donated multiple millions of dollars to EFF.

00:49:30

That distribution model is now declining.

00:49:35

I’m not sure exactly why.

00:49:38

But the result is their donations

00:49:40

have been going down over time.

00:49:42

But for about a five or six year period,

00:49:45

they were giving us two or three million dollars a year.

00:49:49

And our budget nowadays is about $10 million,

00:49:52

so that was roughly a quarter of it.

00:49:56

So we had four independent pillars,

00:50:00

the individual donors, rich techies, foundations, and a few silly corporations.

00:50:10

And the result of that was that EFF has been independent of all of them. None of them can

00:50:17

control us. None of them can tell us what to do. And so we’re free to follow our own principles and our own hearts in trying to do the

00:50:27

best job we can for society. And that’s where our funding comes from. I’m curious, in your opinion,

00:50:35

what organizational models or tactics are best for activists outside of the EFF that want to

00:50:42

self-organize for Internet activism?

00:50:45

For what sort of activism are you thinking of?

00:50:48

Maybe like students on a college campus.

00:50:50

Would you support people getting involved as individuals or forming chapters

00:50:54

or like in the general public?

00:50:57

Do you think that in-person meetings still have a place in Internet-based activism?

00:51:02

Or do you think that this could largely take place online?

00:51:06

Oh, I think both are true. I think there’s no substitute for meeting the other people around

00:51:13

you who care about your issue, and not only does that help you educate yourself about the corners

00:51:21

of it that you haven’t thought of, but it also spreads the word to the people around you.

00:51:26

It makes a social occasion.

00:51:29

It draws people in.

00:51:32

But of course there’s a huge place

00:51:35

for electronic communication about these things.

00:51:41

And so EFF does both.

00:51:43

We have what we call speakeasies that we hold around the Bay Area and out in other cities

00:51:51

when an EFF staff person or board person is going to be in New York City or Vancouver or Rio de Janeiro or whatever.

00:52:00

We’ll hold a speakeasy there.

00:52:03

How do we make it important to people to support EFF-like things?

00:52:08

It’s become less of an issue in the last 20 years.

00:52:12

We used to be seen as a niche,

00:52:15

as these sort of crazy techies who got involved in policy somehow.

00:52:20

And what’s happened is our issues have mostly gone mainstream.

00:52:31

So it’s far more common nowadays that when I mention I’m involved with the Electronic Frontier Foundation,

00:52:35

people say, oh, those guys, I love them, thank you,

00:52:39

as opposed to, who the hell is that?

00:52:43

It also helps that we do a lot of interacting with the press.

00:52:49

One of our first roles, actually, when we were founded,

00:52:52

was trying to explain technical issues to the press

00:52:56

so that they would write coherent, intelligent stories about them.

00:53:01

One thing that I learned from living with a journalist for a long time is

00:53:06

being a journalist is like being a politician. You’re expected to be an expert in whatever

00:53:13

the news of the day is, and tomorrow you’re expected to be an expert in tomorrow’s thing

00:53:18

and forget what it was yesterday. For journalists to be able to pull that off, they need sources who they

00:53:25

can trust to tell them what’s really going on.

00:53:29

And a lot of journalists would call us up when some computer got hacked somewhere or

00:53:37

somebody’s privacy got violated or somebody got censored or sued.

00:53:41

They’d call us up and say, what are the real issues in this? Whose ox is being gored here

00:53:47

and why do we care about this? And we would try to explain it to them. And the result was

00:53:54

we got written up a lot and we get quoted a lot in the press. We now have two people who work full-time just handling press inquiries.

00:54:07

And most of those people, you never even see their names in the press

00:54:11

because they’re not being quoted.

00:54:13

They’re just passing the reporter on to,

00:54:16

oh, you called in about this lawsuit about the public domain photographs?

00:54:22

Here, let me get you to the lawyer who’s working on those

00:54:26

copyright issues and he’ll tell you

00:54:28

what the real story is.

00:54:30

It’s just been

00:54:32

a real

00:54:33

flood of communication

00:54:35

through

00:54:37

the press with

00:54:40

us and the result

00:54:41

is it’s given us a very broad

00:54:43

reach around the world that

00:54:45

when we have a point of view on these

00:54:47

issues

00:54:48

the press is often willing to print it

00:54:51

at least as one of the ways they talk

00:54:54

about the issues.

00:54:54

Do you have any thoughts

00:54:58

about

00:54:58

it seems like anytime you bring

00:55:02

up the idea of privacy

00:55:04

you’re almost always talking about policy in that we need these human structures in order to make sure that privacy gets retained.

00:55:15

I’ve always thought that privacy would be enhanced if we could build it into the infrastructure itself so that it wasn’t an option to have privacy.

00:55:23

It was built in so that when the technology gets used, the privacy comes cooked in.

00:55:28

Do you have any ideas about where that might be going in the future?

00:55:31

Or is there anyone thinking about that sort of thing?

00:55:33

Yes.

00:55:34

Actually, we think about that a lot.

00:55:38

One of the early sayings of one of our founders is architecture is politics. In technology,

00:55:49

if you architect the system to make the right political choices, then those political choices

00:55:55

don’t have to be argued about later. So if you architect the system so that it doesn’t collect

00:56:01

everybody’s information, you don’t have to argue about who gets to look at it later

00:56:06

because the information was never collected.

00:56:11

If you architect a system so that there’s no central point of control,

00:56:16

then you don’t have to argue about who should exercise that control.

00:56:21

And we actively try to work with technologists to look at the social implications of the systems they’re designing

00:56:32

and nudge them in the direction of better social outcomes, of wider distribution of power, of less opportunity to intercept or monitor,

00:56:48

of less opportunity to have security issues.

00:56:55

We actively work with companies and with academic researchers and people like that to try to make the infrastructure more,

00:57:04

give us fewer things to argue about in the future. But also, we have a tech

00:57:10

department that looks at opportunities where nobody’s building that kind of technology and

00:57:18

where we could step in and do that. So one thing that we did that took about three or four years between negotiation

00:57:28

and finding partners and programming and debugging and rolling it out was a system for setting

00:57:37

up encrypted websites automatically. It turns out, you know, we already have ways to make your web traffic private

00:57:46

so it can’t be monitored by a third party

00:57:49

like the NSA or even a third party

00:57:51

like your ISP

00:57:53

or the guy next to you in the coffee shop

00:57:57

but

00:58:00

those technologies, HTTPS

00:58:04

the secure web technologies were only being used by big companies because they were too painful to deploy.

00:58:28

would enable a wide variety of smaller websites run by individuals, companies, blogs, things like that,

00:58:36

to automatically let people connect to them privately so that nobody can tell which things you’re reading and what things you’re posting.

00:58:47

And we rolled that out about a year ago, and it’s been a remarkable success.

00:58:53

Just general discomfort with knowing that the company has your information. So I wanted to ask if anonymizing the data makes any difference to you.

00:59:00

So let’s say if there was some way to enforce that all data that was collected was anonymized

00:59:04

such that they could never determine that the data was yours,

00:59:07

but they could use the aggregate data to make, say, better products or, you know.

00:59:11

Because I think most of the time they probably are concerned mostly with, well, actually I don’t know,

00:59:15

but a lot of times, some of the data is used probably just in aggregate

00:59:18

and some maybe just like personal, you know, like search ads,

00:59:23

like when they, you know, target you from your search preferences

00:59:25

Unfortunately,

00:59:33

the simplest way to collect aggregate data

00:59:38

is to collect all the details and then crunch through it later.

00:59:41

It’s actually harder to design a system that works

00:59:46

that doesn’t keep the individual details.

00:59:49

And so most people don’t bother to go to that effort.
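
[Editor’s note: A minimal sketch in Python of the contrast John is drawing between the two designs. Nothing here is from the talk; the event fields and function names are invented for illustration.]

    from collections import Counter

    # Design 1: "collect all the details and crunch through it later."
    # Every individual event (who viewed what) is kept indefinitely.
    detailed_log = []

    def record_detailed(user_id, page):
        detailed_log.append({"user": user_id, "page": page})

    # Design 2: aggregate at the moment of collection.
    # Only totals survive; nothing about any individual is ever stored.
    page_counts = Counter()

    def record_aggregate(user_id, page):
        page_counts[page] += 1  # the user_id is deliberately discarded

    if __name__ == "__main__":
        for user, page in [("alice", "/home"), ("bob", "/home"), ("alice", "/faq")]:
            record_detailed(user, page)
            record_aggregate(user, page)
        print("detailed log keeps individuals:", detailed_log)
        print("aggregate design keeps only totals:", dict(page_counts))

The second version has nothing about individuals to hand over or leak later; the first is easier to build, which is the trade-off being described.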

00:59:53

A classic example that I’ve currently been fighting a lot

00:59:57

is email tracking by companies.

01:00:03

If you run a mailing list,

01:00:06

it’s probably being run through

01:00:09

somebody’s company that either

01:00:11

offers free mailing lists or like

01:00:13

$5 a month for mailing lists.

01:00:15

Places like MailChimp.

01:00:18

Those companies

01:00:20

don’t just

01:00:21

post. If you send a message

01:00:23

to that list, they don’t just repost your message.

01:00:27

They go in and change the message

01:00:29

to insert tracking technologies

01:00:33

so that they’re basically abusing

01:00:38

the standards for sending multimedia mail

01:00:42

by generating a unique URL, not just for every message sent,

01:00:48

but for each recipient of that message. So if you have a mailing list with a thousand people on it,

01:00:54

they will generate a thousand different URLs and send each one individually to each recipient.

01:01:01

And then anytime somebody accesses that URL, they know exactly which person

01:01:06

was reading which message

01:01:09

and clicked which link in it.

01:01:12

And they store all that information.
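[Editor’s note: to make the mechanism concrete, here is a rough sketch of the per-recipient tracking being described. The domain, helper names, and data are invented for illustration, not taken from any real mailing-list provider.]

```python
import uuid

subscribers = ["alice@example.org", "bob@example.org", "carol@example.org"]
tracking_db = {}   # token -> (recipient, message_id), retained by the list provider

def personalize(message_id: str, body_html: str, recipient: str) -> str:
    """Return this recipient's copy of the message, with its own unique tracking URL."""
    token = uuid.uuid4().hex
    tracking_db[token] = (recipient, message_id)
    pixel = f'<img src="https://tracker.example.net/open/{token}.gif" width="1" height="1">'
    return body_html + pixel

# One message to a list of three becomes three different messages.
copies = {r: personalize("newsletter-42", "<p>Hello!</p>", r) for r in subscribers}

# When a request for /open/<token>.gif later arrives, the provider learns
# exactly which recipient opened which message, and when.
def handle_open(token: str):
    return tracking_db.get(token)
```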

01:01:15

And they think of this

01:01:17

as a standardized marketing technique,

01:01:20

and I think of it as spyware.

01:01:22

And I refuse to click those links,

01:01:24

and I refuse to

01:01:25

support organizations that do that

01:01:27

the technique

01:01:29

was developed by spammers

01:01:31

to figure out

01:01:33

whether you were reading their spams

01:01:36

and

01:01:38

the data it provides is

01:01:39

unreliable because

01:01:41

there is no internet protocol

01:01:43

for telling the sender, oh, the

01:01:49

recipient just looked at this message. So instead, they’re abusing existing protocols to try to get

01:01:56

that effect, but the result is it’s not accurate. If you read that message on your phone, it doesn’t necessarily download the image that would tell them that you read it.

01:02:11

And if you read it in this mail reader, it’ll look like you never saw it.

01:02:16

Whereas if you read it in that mail reader, they’ll think you opened it.

01:02:21

So not only is this marketing data inaccurate, but it’s intrusive and almost none

01:02:29

of the organizations that are doing it know that they’re doing it. I regularly interact with,

01:02:38

you know, more than a hundred non-profits because I’m a donor to a bunch of non-profits,

01:02:45

and I get on their mailing lists.

01:02:47

And they send me these messages that are full of spyware,

01:02:50

and they’re horrified when I send back,

01:02:53

like, I didn’t open your message because it was full of spyware,

01:02:57

and I won’t send you any more donations unless you fix this,

01:03:01

because I don’t support places that spy on their members.


01:03:14

Do you think that the way forward is regulation requiring that companies don’t store personal information, or at least make the consent language more clear and mutable? Or do you think it’s

01:03:19

just the masses not using products that do things like this?


01:03:30

Well, the European model has been to try to regulate these things.

01:03:33

And my impression from… I haven’t talked directly to European regulators about this.

01:03:37

I’ve talked to people who interact with them,

01:03:41

and I’ve talked to people who are regulated by them.

01:03:44

And all of those people

01:03:46

pretty much agree that the regulation is an ineffective and intrusive joke, that it doesn’t

01:03:53

actually… Like, for example, there’s a guy who invoked the European Data Protection Directive

01:03:59

to say, okay, this says I have the right to get all the information a company has about me.

01:04:08

And he sent a demand to Facebook,

01:04:10

which has an Irish subsidiary that serves Europe,

01:04:13

and said, send me all the information you have about me.

01:04:17

And they sent him, like, four boxes of printouts,

01:04:21

like, this much stuff.


01:04:31

And he’s suing them in Europe for collecting information about him

01:04:34

that he didn’t authorize.

01:04:37

And it’s going nowhere, of course,

01:04:39

and Facebook is still collecting all that information

01:04:41

about all the other 500 million people in Europe.


01:04:48

So I don’t think regulation is particularly effective at this.

01:04:55

I’m not sure what will be.

01:04:59

There are some precedents for this.

01:05:05

In the 1930s, when automated telephone systems were getting designed and built and deployed,

01:05:15

they collected detail records about every phone call, who called who and how long they talked.

01:05:23

And they used this for billing.


01:05:27

And this

01:05:28

is still the way it is in the United States.

01:05:30

When you get your monthly phone

01:05:31

bill, if you get a monthly phone bill,

01:05:34

it has a list of every call

01:05:36

you made.

01:05:37

In France, this was not true

01:05:40

for a long time after World War II.

01:05:43

Because

01:05:43

in France,

01:05:45

when the Germans took over their country in the 1940s,

01:05:49

they used that information to round up the underground.

01:05:53

And when the French got their country back

01:05:55

at the end of World War II,

01:05:57

they said, we’re not going to build a system

01:05:59

that leaves that information lying around for next time.

01:06:03

Unfortunately, I think it’s going to take

01:06:06

some

01:06:07

mass privacy

01:06:10

atrocities before

01:06:12

people will pay attention

01:06:13

to how much information they’re giving

01:06:16

to people who do not have their best

01:06:17

interests at heart.

01:06:20

And the current

01:06:21

generation of people who are

01:06:23

using cloud services and smartphones and all of that stuff have not yet been seriously burned by that.

01:06:31

And they will only change that when they get seriously burned.

01:06:36

I wish I knew a way to prevent that.

01:06:39

I don’t.

01:06:44

You’ve heard of Operation Brandeis, the DARPA project.

01:06:48

Operation what?

01:06:50

Brandeis, the DARPA project.

01:06:52

It’s the government’s response to Facebook and Google.

01:06:55

They’re trying to build new technologies to allow Internet sovereignty.

01:07:01

I was wondering if you’d heard about it yet.

01:07:04

No, I’m not sure what it is.

01:07:06

Okay.

01:07:08

I mean,

01:07:09

talking about sort of internet sovereignty,

01:07:13

I do…

01:07:15

The internet

01:07:15

was designed as a widely distributed

01:07:18

system that has no center.

01:07:22

Anybody

01:07:22

can make a little internet

01:07:24

that just connects up the people who they

01:07:26

know with physical wires or with radio signals or whatever. We do this here at Burning Man.

01:07:34

Different camps have radios that talk to an antenna in center camp, and we can push packets

01:07:40

around to each other. And when we want to connect that to the other networks,

01:07:46

we don’t need to go to some central authority.

01:07:48

We just need to go to one other guy

01:07:50

who we want to connect to

01:07:52

and get his agreement

01:07:54

and we can start swapping packets with him.
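[Editor’s note: a toy illustration of that point, with two sockets on one machine standing in for two camps. They exchange a packet directly, consulting nothing but the peer’s agreed-upon address; no central authority is involved. The addresses and ports are arbitrary.]

```python
import socket

# Two "camps" on the same machine, each with its own UDP socket.
camp_a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
camp_a.bind(("127.0.0.1", 40001))

camp_b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
camp_b.bind(("127.0.0.1", 40002))

# Camp A pushes a packet straight to camp B at the address they agreed on.
camp_a.sendto(b"hello from camp A", ("127.0.0.1", 40002))
data, sender = camp_b.recvfrom(1024)
print(data, "from", sender)

camp_a.close()
camp_b.close()
```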

01:07:58

And the result has been a very robust system

01:08:01

that’s also very hard to censor.

01:08:03

It’s hard to take it down, either

01:08:06

intentionally or accidentally. Similarly, the web was designed as a widely distributed

01:08:14

system. You didn’t need permission from any given place to publish a website. If you stuck

01:08:21

a server on the internet, you could put up a website. You could put up 10 websites.

01:08:26

You could host your friends’ websites.

01:08:27

Nobody needed permission from anybody to do that.

01:08:31

That widely distributed system has been slowly and carefully surrounded

01:08:37

by a small number of companies, Google being the primary one,

01:08:50

that found an opportunity to make money out of tracking what people were doing. But if your web access went from your laptop on the

01:08:59

playa direct to somebody’s server in New York City, how is Google going to find out that you went to that website?

01:09:08

Well, yeah, if there’s Google ads in that website,

01:09:13

Google gets notified whenever you access that page.

01:09:17

If that page uses the free Google fonts,

01:09:21

they get notified by your browser when you download the font into the page.

01:09:26

They have put up a whole series of free services that are designed to make a centralized flow of information about who’s talking to who on the web. And my guess at this point is probably more than 50% of web accesses are

01:09:48

now reported to Google. That’s pretty frightening, considering that in the beginning the number

01:09:55

was zero. Now, a couple of other companies have seen that opportunity, and they’ve tried to jump on that bandwagon, such as Facebook.

01:10:10

And they all have these slimy privacy policies that basically say, well, you really think of us as having one main service, like Facebook’s service being the social media thing, and Google’s

01:10:26

service being the search thing. But actually, we have a whole variety of services, and we

01:10:32

can freely aggregate any information we collect about you from any of these. So the Google Like button service, the G+ button, such as it is, if it still exists, is a service that they provide that you don’t think of as a Google service. But every time you load a page that has that G+, or that F,

01:11:06

or that LinkedIn, or that

01:11:08

Pinterest, or whatever

01:11:10

button, each of those companies

01:11:12

is getting

01:11:14

a packet that comes back from your

01:11:16

browser that says, he loaded

01:11:18

that page at this time from that IP

01:11:20

address.
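[Editor’s note: this sketch reproduces, by hand, the kind of request a browser makes when a page embeds a third-party button or font. Both URLs are placeholders, not real tracking endpoints; the point is what the third party learns from the request.]

```python
import urllib.request

# A page at small-blog.example.org embeds a script from a third party.
# Your browser fetches it and, by default, names the page it came from
# in the Referer header; the connection itself reveals your IP address.
req = urllib.request.Request(
    "https://buttons.example.net/like.js",
    headers={"Referer": "https://small-blog.example.org/some-article"},
)
# The third party now learns which article was loaded, when, and from where,
# even though the reader never clicked anything.
# urllib.request.urlopen(req)   # left commented out: the placeholder host does not exist
```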

01:11:22

And they

01:11:23

have succeeded in surrounding a widely distributed system with a logging

01:11:30

infrastructure that not only serves them, but also serves the governments that are trying

01:11:35

to monitor their own populations. And that’s been done with the active participation

01:11:46

of all of us

01:11:48

what plug-in do I use for

01:11:52

For web access, you mean?

01:11:56

well I use Firefox

01:11:58

and I use NoScript

01:12:01

and I also use a cookie manager

01:12:04

of which there are dozens.

01:12:07

And EFF has put out a similar cookie-ish manager called Privacy Badger,

01:12:12

which I think about 2 million people are using now,

01:12:17

that turns on a web feature that says,

01:12:20

I want your web server not to track me.

01:12:24

And then if it sends back cookies,

01:12:27

and the cookies are more than just totally trivial,

01:12:29

like what language you speak or whatever,

01:12:32

then it will block those websites.

01:12:35

It will refuse, it will tell your browser,

01:12:39

don’t talk to those websites.

01:12:42

And so rather than having a blacklist of places that are good and places that are bad,

01:12:47

it will actually measure what the web services are doing. And when they do things that are

01:12:52

intrusive to your privacy, it will block them. You’re listening to the psychedelic salon,

01:12:59

where people are changing their lives one thought at a time.
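[Editor’s note: the snippet below is not Privacy Badger’s actual code, only a rough sketch in the same spirit: announce the Do Not Track signal, then look at whether the server still sets cookies that are more than trivial preferences. The cookie names treated as trivial here are an assumption for illustration.]

```python
import urllib.request

TRIVIAL_COOKIES = {"lang", "language"}   # e.g. remembering which language you speak

def sets_tracking_cookies(url: str) -> bool:
    """Send a Do Not Track header and report whether the site still sets
    cookies that look like more than trivial preferences."""
    req = urllib.request.Request(url, headers={"DNT": "1"})
    with urllib.request.urlopen(req) as resp:
        cookies = resp.headers.get_all("Set-Cookie") or []
    names = {c.split("=", 1)[0].strip().lower() for c in cookies}
    return bool(names - TRIVIAL_COOKIES)

# A tool built on this idea would then refuse to let the browser talk to
# third-party domains for which this keeps coming back True.
print(sets_tracking_cookies("https://example.org/"))
```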


01:13:08

Now, as a complement to this talk that we just listened to,

01:13:12

I highly recommend that you go back to podcast number 522,

01:13:17

Surveillance Capitalism and the Internet of Things, with Cory Doctorow.

01:13:24

As you know, this was Cory’s talk that was also given at the August 2016 Palenque Norte lectures. Now let me pass along what’s taking place with what I’ve been calling the Psychedelic Salon 2.0, or simply Salon 2.

01:13:48

Rather than go into all of the reasoning behind this right now and the long-term vision, let me just give you the headlines.

01:13:57

To get things started, I’ve now met with the Symposia group, along with Shauna Holm and Bruce Dahmer.

01:14:03

And here’s what we’re going to do.

01:14:06

I’ll call that group the Salon 2 Curators right now,

01:14:09

but please keep in mind that this is only the organizational phase

01:14:13

and that eventually there are going to be many ways to become involved in Salon 2

01:14:17

for you and for the rest of our fellow salonners

01:14:20

who would like to become involved as we move forward.

01:14:24

And it’s not going to be just

01:14:25

podcasts, by the way. Anyway, the curators are putting together some of their material and

01:14:31

are going to create complete podcasts that will still be introduced briefly by me. And we’ll be

01:14:37

numbering these podcasts as Salon 2-001, Salon 2-002, etc. At the same time, and possibly even in the same week occasionally,

01:14:48

as a Salon 2 podcast is released, I’ll continue podcasting as I always have been as long as my

01:14:55

backlog of new material lasts. And since I’m still receiving more new material each month,

01:15:01

well, it looks like I’ll still be doing the current form of the podcast as well

01:15:05

for quite some time. Now, before long, I’ll be posting both on the forums and on our Slack site

01:15:12

some requests for audio help that the curators are going to need. I know that several of our

01:15:18

fellow salonners have told me that they have audio expertise and are willing to help. And now we finally will have some very

01:15:26

specific requirements for you. And as we go forward each week or so, I’ll bring you up to

01:15:32

date as to how we’re doing on getting the Salon 2 podcast going. Oh, and don’t worry about you

01:15:38

having to do anything to get both podcast streams, because I’m going to be including the Salon 2 podcast in the same RSS

01:15:46

feed as you’re now getting for this podcast. And hopefully I’ve explained this well enough that

01:15:51

for right now you get the idea that this is going to be a great year for podcasts from the Salon.

01:15:58

Now, next week, after the inauguration of a new U.S. president on the 20th,

01:16:03

I’ll post a podcast that I promised you on the day after the November elections here in the States.

01:16:09

And that is to give you my take on how I see the chaos in the world unfolding over the next few years,

01:16:16

along with a few ideas that I have about how we can not only make the best of the immediate future,

01:16:21

but how we can also have some enjoyment along the way.


01:16:33

Now, in closing, I want to mention the passing, last October 29th, of one of our most esteemed elders. I’m speaking about Francis Huxley, who was Aldous’s nephew, and was one of the last of

01:16:40

his generation of intellectuals to leave us. The Guardian in the UK began its

01:16:46

obituary of him saying, Francis Huxley was an anthropologist fascinated by shamanism,

01:16:53

myths, and religious rites who strove to protect indigenous peoples. In the early 1950s, the

01:16:59

anthropologist Francis Huxley, who has died at age 93, undertook pioneering fieldwork among the Urubu people of the Amazon basin.

01:17:09

The resulting book, Affable Savages, from 1956, adopted a new reflexive approach to the study of culture in which the author’s encounters with the other are reflected as much in personal reactions as in objective descriptions. Francis

01:17:26

was a pioneer of this form of anthropological writing, a style that much suited his lifelong

01:17:32

interest in shamanism and the altered states of consciousness often experienced by religious

01:17:38

healers. While this novelesque way of writing was largely shunned by his contemporaries,


01:17:47

eventually it became commonplace.

01:17:52

I won’t read the rest of the obituary here, but I’ll link to it in today’s program notes,

01:17:55

which you can find at psychedelicsalon.com.

01:18:00

However, I do have a few personal observations that I’d like to make right now.

01:18:05

First of all, I really like the name of the people who were the subject of Huxley’s first book, the Urubu. You see, just previous to where I now am living, I lived on Urubu Street,

01:18:13

and I love telling people how to spell it. You see, Urubu is spelled U-R-U-B-U. You are you.

01:18:31

Be you. But I digress. You may find this strange, but whenever the subject of raw oysters comes up, the first thing that comes to my mind is my last encounter with Francis Huxley.

01:18:39

Now, this could be a really long story, but I’m just going to give you the highlights for now.

01:18:45

Probably forever.

01:18:52

First of all, during the time that I was living in Houston, Texas, there were many weekends when I would drive down to the docks in Seabrook and buy a bushel of just harvested raw oysters.

01:18:57

Then some friends would stop by my house and we would shuck and eat them in record time.

01:19:02

I could easily eat several dozen myself in a single sitting,

01:19:06

and I was and still am a devotee of raw oysters.

01:19:11

I just don’t eat them anymore.

01:19:13

Now, fast forward a few decades from my Houston days as a lawyer

01:19:17

to Cortez Island up in Canada.

01:19:20

It was September of 2000, and there were about 80 of us attending an invitation-only

01:19:26

conference titled Entheogenic Evolution. Fortunately for me, my wife was invited,

01:19:33

and so I got to tag along. It was a rather eclectic crowd with psychedelic writers,

01:19:40

researchers, activists, therapists, and several prominent leaders of the worldwide Santo Daime

01:19:45

movement, all together at Hollyhock for a week. And for me at least, this was the most memorable

01:19:51

conference that I’ve ever been to. And someday I’ll have to tell you about the drama that centered

01:19:57

around the Daime group and the followers that they drew to the island, but that’s a really long story

01:20:02

that’s going to have to wait for another day.

01:20:10

Now, one of the most fortunate things that happened to me that week was that the room that my wife and I were assigned was in between the room with June and Duncan Blewett on one side, and the room with

01:20:17

Jean and Myron Stolaroff on the other side. So getting to spend a lot of time with those four

01:20:23

great elders, people who made such significant contributions to our current psychedelic renaissance

01:20:29

well it was one of the most memorable times of my life

01:20:32

and there were also so many other interesting people at the conference as well

01:20:37

in fact I think that that conference is where I first got to know Richard Glen Boire

01:20:41

and his wife Wrye Sententia

01:20:43

both of whom have been featured speakers here in the salon, as have the Stolaroffs as well.

01:20:49

But, you say, wasn’t this story supposed to be about Francis Huxley?

01:20:54

Well, it is, and in fact, it actually was the most memorable moment for me of the entire week.

01:21:01

Now, you have to readjust your thinking about me here,

01:21:04

because back then I hadn’t even

01:21:06

become Lorenzo yet. I was still little Larry who was awestruck at being able to spend time with so

01:21:12

many people that I’d only read about before. And being a big fan of the Huxley clan, I naturally

01:21:19

acted like a psychedelic groupie and made it a point to spend some time talking with this last legend of the

01:21:25

Huxley family. And so it was on one of our last evenings there that the wonderful staff at

01:21:32

Hollyhock put on a huge outdoor barbecue that featured, as you by now have guessed,

01:21:38

oysters, both barbecued and raw. Now for quite a few years before this, I’d sworn off raw oysters because, well, it’s very

01:21:46

difficult to be sure that they’re safe to eat. Things were different when I could drive down to

01:21:51

the Texas docks and talk with the boat crew that just came in with a fresh load, but eating raw

01:21:56

oysters without knowing where they came from and who caught them was, well, it’s just something that

01:22:01

I wouldn’t do anymore. At least, I always resisted until that evening,

01:22:06

when, by the water, Francis Huxley came up to me with a plate full of raw oysters and offered them to me.

01:22:12

Naturally, little Larry tried to impress him with how safety-conscious I was about eating raw oysters,

01:22:18

but he only laughed at me and said,

01:22:20

Not only did these oysters just now come out of these waters right around the

01:22:25

point, they are without a doubt the very best oysters that I’ve ever had in my life, and I’m

01:22:30

not going to let you miss out on them. Well, I’m not saying that I gave in to peer pressure because

01:22:37

I am far, far from being a peer to a man like Francis Huxley who, well, he’s accomplished so

01:22:42

much in his life, but I did give in to his great smile as I swallowed what would be the last few raw oysters that I’ve ever eaten.

01:22:51

Sure, I’ve been tempted to eat another raw oyster from time to time,

01:22:55

but if I ever do eat another raw oyster, it’s going to ruin this story that I’ve been waiting to tell you for,

01:23:01

well, ever since I started these crazy podcasts.

01:23:04

And yes, I probably should read

01:23:06

you a list of some of the great things that Francis Huxley accomplished in his life, but to me they,

01:23:11

well, they don’t mean nearly as much as my memories of that evening by the water at Hollyhock

01:23:16

eating raw oysters with Francis Huxley. I’ve had a few perfect moments in my life, but that one

01:23:22

there is right up near the top. And I guess that I should

01:23:26

add one more thing here, and that is to mention the fact that for the last decade or so of his life,

01:23:32

Francis Huxley was cared for by a woman who was his ex-wife, and by her current husband. Now,

01:23:38

I’ve only met her briefly one time, but her current husband played a very significant role

01:23:43

in an experience that was

01:23:45

a real turning point in my own life. And while he most likely doesn’t remember it as clearly as I do,

01:23:51

I will never forget what he did for me. Which, at long last, brings me to the point that I’ve

01:23:57

been trying to make here. In this life, for me at least, it seems like it has been the little

01:24:03

things that people have done for me

01:24:05

that have made the most lasting impact. Those two men did what I’m sure they consider to be

01:24:10

trivial things, like talking me into eating what my memory tells me was the most delicious raw

01:24:16

oyster that I ever ate. It was an insignificant moment for him, but for me it was obviously very

01:24:23

significant. So, whenever you have an opportunity

01:24:26

to do a kindness for somebody, don’t pass it up. You may have a much more profound impact on them

01:24:32

than you can know. And that something can be as simple as smiling at the next stranger that you

01:24:38

see. Because most likely, he or she is having a difficult day too, just like you are. And your little smile for

01:24:46

a stranger lets them know that, yes, life is often a struggle, but you know how they feel.

01:24:53

We’re all in this together, you know, and sometimes your smile can make a much bigger

01:24:57

difference than you think. And for now, this is Lorenzo signing off from Cyberdelic Space.

01:25:04

Be well, my friends. Thank you.