Program Notes

Support Lorenzo on Patreon.com
https://www.patreon.com/lorenzohagerty
Guest speaker: Cory Doctorow

Cory Doctorow – https://craphound.com/

Date this lecture was recorded: August 2019

[NOTE: All quotations are by Cory Doctorow.]

“What like-buttons mainly do is collect data about you.”

“If you know something about people, you can lie to them better.”

“In a walled-garden that’s what it’s all about. It’s not about spying on you. It’s about locking you in and charging you extra.”

“Amazon is basically pursuing a strategy to turn themselves into a part of the state, into an arm of law enforcement [by providing doorbell camera footage to the police].”

“We’ll never be able to make Big Tech behave. The only way to make Big Tech behave is to make Big Tech small.”

“Tech is the tool we use to fix the other problems.”

Podcast 565 – “John Perry Barlow Tribute”

Download free copies of Lorenzo’s latest books

Previous Episode

645 - Shamanism & Cognitive Liberty

Next Episode

647 - Why Burning Man

Similar Episodes

Transcript

00:00:00

Greetings from cyberdelic space.

00:00:19

This is Lorenzo and I’m your host here in the Psychedelic Salon.

00:00:28

And I’d like to begin today by thanking fellow salonner EMB from London for his very generous gift of support for the salon.

00:00:32

And speaking of London, last Monday night’s live salon was held on London time.

00:00:38

And for the foreseeable future, I’ve decided to continue doing live salons on London time

00:00:43

on the first Monday of each month.

00:00:46

Not every other Monday, but on the first Monday we’ll do a live salon on London time for a while here.

00:00:50

They’ve been really fun.

00:00:52

And while we ended up talking about ways in which we can help our family members

00:00:56

better understand where we’re coming from when we talk about psychedelics,

00:01:01

well, we began by talking about whether to expect cancellations of music

00:01:05

festivals this spring due to the spread of the coronavirus. And it was really interesting to

00:01:11

get a live report from one of our fellow salonners who lives in Italy, which currently has the

00:01:17

largest outbreak of the virus outside of China. You know, it’s one thing to read about this in

00:01:22

news reports, but this news makes a much deeper impression on me

00:01:26

when I’m talking with someone who is right in the middle of things.

00:01:30

I guess that I should point out also that every Monday evening

00:01:33

I host a live version of the Psychedelic Salon

00:01:36

where my supporters on Patreon can join in or just lurk if they want to.

00:01:41

And these live salons are open to anybody who pledges just one dollar a month.

00:01:46

And quite frankly, it’s these donations from my supporters on Patreon that I’m using to

00:01:50

support these podcasts, and myself for that matter. For the past two years now, it’s been

00:01:56

these donors who have provided the main financial support that I’ve needed to keep on podcasting.

00:02:02

And these fellow salonners are very near and dear

00:02:05

to me indeed. For over a year now, the number of new people signing on to support the salon on

00:02:11

Patreon each month has almost exactly equaled the number of people who leave after having made a few

00:02:17

months donations. Basically, the total number of active supporters on Patreon has remained around

00:02:23

400 for two years now.

00:02:29

So to reward those stalwarts who have been providing the backbone of our support,

00:02:34

I’ve now begun doing some private podcasts for the $5 and higher level supporters,

00:02:39

who are the fellow salonners who actually provide about 80% of our funding.

00:02:46

Now these private podcasts are going to be recordings that I’m simply not going to be able to squeeze in here in the main salon,

00:02:49

because, well, there’s so much new material coming our way.

00:02:54

But I also want to preserve as many of these historical talks as I can.

00:03:00

So, eventually, all of last year’s Palenque Norte lectures will be saved for posterity,

00:03:03

some here at psychedelicsalon.com and some on Patreon.

00:03:09

But today, I’m going to play another of last year’s Burning Man talks here in the main salon.

00:03:15

And again, I want to thank Frank Nussio for making these recordings,

00:03:20

and then, after rescuing the laptop they were on from the rubble of a tornado,

00:03:21

well, he sent them to me.

00:03:24

Like all of the Palenque Norte lectures, you’re going to hear the occasional intrusion of loud music and other noise as various art cars drive by.

00:03:32

But it certainly does give you a better feel of what it was like to be there.

00:03:36

So right now, we’re going to hear from one of my favorite writers, Cory Doctorow, whose books and byline you’ve seen many times.

00:03:46

As you know, Cory is also affiliated with EFF, the Electronic Frontier Foundation, for which

00:03:50

he is a spokesperson, and this was a preview of a new presentation that he is

00:03:55

now giving in various places around the country. So now, here is Cory Doctorow’s

00:04:00

Palenque Norte lecture, which took place at the 2019 Burning Man Festival.

00:04:09

Thank you very much.

00:04:11

Thank you all for coming.

00:04:12

I’m sorry if any of you came last year when there was a mix-up

00:04:15

and I didn’t show up with Palenque Norte.

00:04:17

I’m glad to see you this year.

00:04:18

And if any of you were at my center camp talk,

00:04:19

there’s going to be a little overlap, but it’s a different talk.

00:04:23

So the talk is about surveillance capitalism, but not just surveillance capitalism.

00:04:27

It’s about tech exceptionalism, which is a subject a lot of people have a lot of feelings about, right?

00:04:32

There’s a time when people said, well, you don’t want to regulate tech because you think that tech is different from everything else and can’t be regulated.

00:04:39

You’re just a tech exceptionalist.

00:04:41

And then we had people who said, well, tech should be regulated because it’s so important.

00:04:48

And we said, oh, you’re a tech exceptionalist.

00:04:49

I want to speak today against tech exceptionalism and talk about how surveillance capitalism fits in

00:04:56

because at its core, mostly, tech is just another industry.

00:05:00

And like most industries, most of the things that the people who run the big

00:05:05

firms and who speak on their behalf say is a lie, right? When they tell

00:05:11

you how they handle your private data, they’re usually lying, right? When they

00:05:14

tell you about whether or not they pay their fair share of taxes, they’re

00:05:17

usually lying. When they tell you, like, what country they’re booking their

00:05:21

transactions in, it’s generally a lie. When they tell you about their labor conditions, it’s usually a lie.

00:05:29

If they recruit you and they tell you you’re going to have good coffee,

00:05:31

they’re often telling the truth.

00:05:32

But for the most part, they tell you lies, right?

00:05:36

So one of the lies that big tech tells us is that they built mind control rays.

00:05:42

They don’t put it that way.

00:05:42

What they say is, like, we can sell anything, right?

00:05:45

Buy some ads on our platform

00:05:46

and we will sell the thing that you’re buying ads for.

00:05:50

We figured out how to use machine learning

00:05:52

to, like, A/B split our way to that one thing

00:05:55

that if you just say it to the person,

00:05:57

they will buy what you’re selling.

00:06:00

And I am worried about big tech

00:06:02

and I think there are lots of things

00:06:03

to worry about with big tech.

00:06:05

But the thing that mystifies me is when we observe big tech, when we observe that everything they say is a lie,

00:06:11

why would we think that the only thing they weren’t lying about was their sales literature, right?

00:06:15

Like, surely the least credible thing that any company says is its marketing promises that it uses to sell its stuff.

00:06:27

So rather than talking about mind control rays,

00:06:31

I think it’s worth talking about the other ways that tech can influence and persuade us.

00:06:35

So one of the big ones is with segmenting.

00:06:41

So when my mom tells this story: I was born at Women’s College Hospital in Toronto.

00:06:46

And after she was discharged from the hospital, there was a guy handing out baskets of baby stuff, right? It was like diapers and baby powder and wipes and all this stuff,

00:06:53

because they understood that people leaving a maternity ward would have a higher than average

00:06:59

level of need for familiarity with diapers and baby wipes and so on, right? Not everyone you

00:07:04

hand the basket to has just had a baby.

00:07:06

Sometimes you might waste a few baskets on people who didn’t just have a baby.

00:07:09

Some of those people, it might be their sixth kid and they’re like, I got this.

00:07:12

I still got stuff left over from the last one.

00:07:14

But as compared to like just standing on a street corner and handing out baby baskets,

00:07:20

you’re going to score a lot more conversions, right?

00:07:22

You’ll turn a lot more people who are not a customer into a customer.

00:07:28

And so big tech has figured out how to do that on steroids, right?

00:07:30

They compile these non-consensual, deeply detailed dossiers on all of us,

00:07:35

and they can do things like say, well, who’s going to buy a refrigerator?

00:07:40

Buying a refrigerator, selling refrigerators is a really hard problem.

00:07:43

The median person buys one or fewer refrigerators in their life.

00:07:47

Apart from home shows and kitchen remodeling shows, there aren’t a lot of places where

00:07:52

potential refrigerator buyers gather.

00:07:54

And when you look at refrigerator ads, they tend to be these super low specificity ads.

00:08:01

They’re just on highways because everyone is on the highway eventually, and some of those people want to buy a refrigerator, right? So big tech compiles

00:08:09

these big dossiers on us and they can do stuff like say, who’s recently bought a home? Who

00:08:14

recently bought a stove? Who recently shopped for a refrigerator? Who recently like went to

00:08:18

Consumer Reports website and looked at refrigerator reviews? And they can go like, let’s just show

00:08:23

those people ads for refrigerators. And they might increase the efficacy of a refrigerator ad by like three orders of

00:08:30

magnitude. And that sounds really exciting, right? And I think if you sell refrigerators,

00:08:34

you’re probably really happy to have achieved a three order of magnitude like uplift in your

00:08:37

sales techniques. But when you actually look at the absolute number of conversions, what they’ve done is they’ve gone from 0.0000001%

00:08:47

conversion to 0.0001% conversion. And so in absolute terms, they haven’t done much,

00:08:54

but they’ve done something. And so people buy refrigerator ads. There’s a lot of people who

00:08:58

are buying ads on these platforms and they’re doing it not because there’s a mind control ray

00:09:03

where you pointed at someone, you go, you need a fridge. And then the next thing that happens, they go and they buy the fridge.

00:09:09

Instead, it just lets you eke out these marginal gains in these otherwise difficult practices.
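
[NOTE: A rough back-of-the-envelope sketch of the point above, using the two conversion rates Doctorow quotes; the impression count is an invented illustration, not a figure from the talk.]

```python
# Why a "three orders of magnitude" targeting uplift can still mean almost
# nothing in absolute terms.

impressions = 100_000_000        # hypothetical number of ad impressions served

baseline_rate = 0.0000001 / 100  # 0.0000001% conversion, as a fraction
targeted_rate = 0.0001 / 100     # 0.0001% conversion after targeting

print(f"relative uplift: {targeted_rate / baseline_rate:,.0f}x")       # 1,000x
print(f"fridges sold, untargeted: {impressions * baseline_rate:.2f}")  # about 0.1
print(f"fridges sold, targeted:   {impressions * targeted_rate:.2f}")  # about 100
```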

00:09:15

Now, this kind of segmenting can be creepy. They can say things like, people who have

00:09:22

been recently foreclosed on, let’s try and sell them shitty predatory loans.

00:09:26

That’s like legit creepy, but it’s not a mind control ray.

00:09:30

It’s just following you around and making guesses that are better than random chance or the proxies we had before we had these non-consensual dossiers.

00:09:41

And it’s a thing that we should worry about, but it’s not the end of our free will.

00:09:44

Now, there’s another way that big tech can change the way that we think, and that’s by

00:09:50

lying to us, right? And so sometimes that’s intentional, sometimes it’s not intentional.

00:09:55

But like, if you don’t know how long the Brooklyn Bridge is, and you type into Google, how long is

00:10:00

the Brooklyn Bridge? And it says the Brooklyn Bridge, I forgot to make a note about how long

00:10:03

it is. But let’s say it says the Brooklyn Bridge is 900 feet long, and you believe that it’s 900

00:10:08

feet long because you’ve got no reason to disbelieve it, and the Brooklyn Bridge is actually

00:10:12

only 800 feet long, you have been effectively deceived, right? Now, it’s, again, not a mind

00:10:17

control ray, right? And if you don’t really know if, like, vaccines are safe or if the earth is flat or

00:10:22

any of those other things, and you type

00:10:25

it into Google or you go to some other service, if you get funneled into an affinity group

00:10:31

on Facebook or whatever, and everyone’s saying that this is true, they’re not brainwashing

00:10:36

you.

00:10:36

They’re not removing your free will.

00:10:38

They’re filling a vacuum, right?

00:10:40

The vacuum of your lack of knowledge with a thing that sounds plausible on its face.

00:10:44

And you’re like, oh, okay, the Brooklyn Bridge is 900 feet long, the earth is flat,

00:10:49

I’m not going to vaccinate my kids, right? And so again, there are like real problems with this,

00:10:54

not so much in the Brooklyn Bridge, unless you’re trying to drive it with your eyes closed.

00:10:58

But in all of these other domains, like people can get into really big trouble. Like,

00:11:02

you know, are the Protocols of the Elders of Zion true?

00:11:08

And are Jews secretly running the world? If you don’t know the answer, and Google comes back and says, yes, that could be a really serious problem. But again, it’s different

00:11:13

from taking someone who already has domain experience and who has a knowledge of the

00:11:17

subject and telling them something and flipping their beliefs. That’s a very different thing to merely deceiving someone. And then there’s

00:11:26

domination, right? So in many cases, we have these concentrated tech platforms, and it means that

00:11:33

when you want to find out an answer or find a group or discover a thing in the world,

00:11:40

there’s only one place you can ask the question. And so they get to determine what the answer is,

00:11:45

right? Like the first 10 answers to every question anyone asks are on the first page of the Google results,

00:11:50

and nobody clicks the second page.

00:11:53

And so what that means is that if Google has a bad idea or makes a mistake

00:11:57

or is in some other way not doing the right thing,

00:12:03

they’ve sold a bad ad to someone who, someone types,

00:12:06

are Jews evil? And the answer is, read about how Jews are evil. Those mistakes, that domination

00:12:11

means that they get to tell us what to think. Again, not by brainwashing us, but just by

00:12:18

controlling the first page of results and being the only place that people go to find

00:12:23

the results. Now, there is another thing that big tech does sometimes do,

00:12:28

which is that they sometimes bypass our rational faculties.

00:12:31

There are occasions in which someone will luck into a technique

00:12:36

for getting people to change their behavior en masse

00:12:39

in ways that they may not like after the fact.

00:12:43

But these tend to be very short-lived effects. There’s an idea

00:12:47

called regression to the mean, where when you observe something that’s very extreme in the world,

00:12:52

you know, when there’s a big dust storm, after a while there’s not a big dust storm, it regresses

00:12:56

to the mean and you get to sort of a few breezes and picturesque dust zephyrs out there in the

00:13:01

open playa. And we see this over and over again. So, you know, we worry a lot about casinos,

00:13:06

and there’s good reason to worry about casinos,

00:13:07

but if you actually look at the usage pattern

00:13:09

of casinos and gambling, which is kind of the big

00:13:12

example of this bypassing of rational faculties

00:13:14

and inducing

00:13:15

addictive behaviors, what you see is

00:13:18

that, like, the median person goes in,

00:13:20

puts a quarter in a slot machine, spins it,

00:13:22

and says, this is weirdly compelling,

00:13:24

and puts another quarter in and does it again, and puts another quarter in and does it again,

00:13:27

and then a couple hours go by and they go, what the fuck was I doing?

00:13:31

And they don’t play. They don’t go back.

00:13:33

Or they, you know, sigma scratch wins and whatever.

00:13:35

Now, it is true that in the, like, great sweep of human behavior,

00:13:40

there are people out here in the fourth and fifth and sixth sigmas

00:13:42

who are really, really vulnerable to slot machines and to mechanics like it.

00:13:47

And those people, they like cash in their kids’ college funds, buy adult diapers, and stand in front of the machine until they drop dead.

00:13:56

Right?

00:13:56

And that is like a legit problem.

00:13:58

Now, we see the same thing with addictive mechanisms in big tech.

00:14:01

So do you remember the great Farmville epidemic?

00:14:04

Right? There was a time when everyone was clicking everyone’s cows. And there are

00:14:10

still people who play Farmville and there’s still people who find that

00:14:13

mechanic really, really, really compelling. But the difference between slot machines

00:14:19

and Farmville is Farmville was making fractional pennies from clicking cows, and

00:14:24

slot machines made dollars. And so what we’ve discovered with Farmville is that although Zynga made

00:14:29

tons of money, they didn’t have, what they don’t have is enough money to just keep Farmville

00:14:36

alive as their flagship product, right? They needed to find another Farmville pretty quickly

00:14:41

because the fractional pennies they’re bringing in from cow clicks just don’t sustain the kind of marketing and whatever that it took to get everyone to play Farmville.

00:14:49

Now, Farmville had all the money in the world at one point, right? Zynga made a lot of money

00:14:53

from Farmville. And they made a Farmville 2, but no one played it. Because it turns out that the

00:14:58

reason that they figured out how to get us to click on cows is not because they had amazing

00:15:03

insight into the human psyche. It’s because if enough people throw enough darts, someone gets a bullseye.

00:15:08

And it doesn’t mean that they’re going to get another bullseye straight away right after.

00:15:12

See also Niantic and Pokemon Go.

00:15:15

Everyone’s seen the viral memes of the dude with nine phones who rides around on a bicycle playing Pokemon Go.

00:15:20

That person, if it were a slot machine, would be wearing Depends and cashing in their kid’s

00:15:25

education fund.

00:15:28

But Niantic,

00:15:29

they’re part of Alphabet. They’re one of the

00:15:32

largest, I think they’re currently the most

00:15:33

highly valued company in the world.

00:15:36

They have a lot of money to come up with another

00:15:38

Pokemon Go. Where is it?

00:15:39

Maybe next week it’ll happen and I’ll look

00:15:42

like an asshole, but as far as I can

00:15:44

tell, those guys just got lucky. They made another game before that they used for their data. And there’s

00:15:48

a lot of stuff about how the bias and that data influenced Pokemon Go. But that game was not a

00:15:52

smash success. That game was just a thing that like Google capitalizes like a skunk works that

00:15:58

eventually they were able to sell and then they have not done it again. I was going to say they

00:16:01

never did it again. Maybe they’ll do it again because they can afford to throw a lot of darts because they’ve got a lot of money because they’re

00:16:06

a giant monopoly, but not because they have astounding insights into the human condition

00:16:13

that allow them to break us of our free will. So one of the reasons that these mechanisms always

00:16:20

fail is that the thing that keeps us playing them is a thing from behavioral psychology

00:16:26

called intermittent reinforcement, right? So if you give a rat a pellet dispensing lever,

00:16:33

and every time they press the lever, they get a pellet, they’ll press the lever when they’re

00:16:37

hungry. But if you give a rat a pellet dispensing lever that only dispenses pellets on a random

00:16:43

schedule, the rat will keep pressing the lever trying to figure out what the schedule is.

00:16:47

Like, what’s the trick that gets me the pellet, right?

00:16:51

And, you know, Skinner, who kind of developed this, he created these, like, superstitious pigeons who you could,

00:16:56

they, like, developed a false association between one thing or another.

00:17:00

And you’re familiar with this with Pavlov and so on.

00:17:02

And the thing that keeps it from regressing to the mean is that it’s intermittent.

00:17:05

It doesn’t happen all the time, right?
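
[NOTE: A toy simulation of the intermittent-reinforcement idea described above; the “give up after ten unrewarded presses” rule and the one-in-eight payout odds are invented for illustration, not taken from Skinner or from the talk.]

```python
import random

random.seed(0)

GIVE_UP_AFTER = 10  # toy rule: quit after 10 unrewarded presses in a row

def presses_before_quitting(reward):
    """Count lever presses until the streak of unrewarded presses hits the limit."""
    presses, dry_streak = 0, 0
    while dry_streak < GIVE_UP_AFTER:
        presses += 1
        if reward():
            dry_streak = 0   # a pellet resets the streak
        else:
            dry_streak += 1
    return presses

# A schedule that has simply stopped paying out: the "rat" quits right away.
cut_off = lambda: False
# An intermittent (variable-ratio) schedule: pays out about 1 press in 8, at random.
intermittent = lambda: random.random() < 1 / 8

print("cut-off schedule:", presses_before_quitting(cut_off))
trials = [presses_before_quitting(intermittent) for _ in range(1000)]
print("intermittent schedule, average presses:", sum(trials) / len(trials))
```

The unpredictable schedule keeps the toy rat pressing roughly twice as long on average before it gives up, which is the point: the not-knowing is what sustains the behavior.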

00:17:07

So the reason that Farmville was compelling is because you couldn’t always click a cow.

00:17:12

You had to keep going back to find out whether you could click a cow.

00:17:15

But one of the first things that happened after Farmville became really successful is other people made other cow clickers.

00:17:20

And then you could go and click a cow all the time, right?

00:17:24

You didn’t have to wait for

00:17:25

Farmville to draw up a random cow clicking opportunity for you. You could just go click

00:17:30

a cow whenever the mood took you. And the more you did it, the more you regressed to

00:17:33

the mean and eventually you were like, I need a break from cow clicking. And then after

00:17:38

you take the break, you’re like, what was I doing clicking all of those cows? So market concentration increases the persuasive power of

00:17:49

big tech and big data, right? So the more concentrated companies are, the more they can

00:17:55

gather data from us, right? So if we let companies like corner whole verticals in the market, so

00:18:00

they’re providing email and calendaring and messaging and search, they can collect a lot of data on us, and they can use those to do better targeting and segmenting. Not better mind control,

00:18:10

but like just find like even weaker signals that might correlate with buying a refrigerator

00:18:15

right? And again, like, when you’ve got a really low base rate, increasing it a lot doesn’t take

00:18:23

a whole lot, right? Making a very small number twice as big

00:18:28

takes a lot less effort than making a very big number

00:18:30

twice as big.

00:18:31

So if refrigerators were Big Macs,

00:18:33

doubling the efficacy of Big Mac sales

00:18:35

would be really, really hard.

00:18:36

Because refrigerators are so hard to sell to begin with,

00:18:39

anything that makes an improvement is a big deal.

00:18:41

And so when you have monopolies,

00:18:43

they can gather lots of data that makes it easier to locate people who have hard-to-find traits.

00:18:50

Now, one of the ways that that was really widely expressed was when Facebook became

00:18:53

the portal to traffic on the internet, one of the major ways that people got traffic

00:18:58

on the internet. And so if you wanted someone to come to your website, you needed to integrate

00:19:02

with Facebook. And the main form that that integration took was a like button. Now, like buttons are nominally a way to gather explicit

00:19:11

data about users and what they like. So you click the like button, you log into Facebook, it goes,

00:19:15

you like that article, I’ll show you more articles like this. Well, what like buttons mostly do is

00:19:19

they passively gather data. Because every time you land on that website, whether or not you have

00:19:24

a Facebook account, the like button is loaded by your browser which means that

00:19:28

Facebook gets to see what browser you have, what IP address you have, what

00:19:30

cookies you set, and so on and so on and so on. So every web publisher on the

00:19:35

internet put a like button on every page they had and even if no one clicked it

00:19:39

Facebook could increase the size of its non-consensual dossiers on us but again

00:19:44

not because Facebook

00:19:45

reached into the minds of web publishing executives and said, you know, you are getting very sleepy.

00:19:51

There will be like buttons on all of your pages. But because they locked every internet user behind

00:19:57

a walled garden and then used an algorithm that they and they alone controlled and that no one

00:20:01

had any insight into to decide who would see what. And so publishers were beholden to Facebook to try and figure out how to do it.
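
[NOTE: A minimal sketch of the passive data collection described above. This is generic HTTP behavior, not Facebook’s actual code; the server, port, and like-button.js URL below are invented. The server stands in for any third party whose widget a publisher embeds: every browser that merely loads the page sends it a request carrying the visitor’s IP, browser, referring page, and any cookies, whether or not anyone ever clicks.]

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Everything below arrives automatically with the request for the widget.
        print("visitor IP:  ", self.client_address[0])
        print("page visited:", self.headers.get("Referer"))     # which article you were reading
        print("browser:     ", self.headers.get("User-Agent"))
        print("cookies:     ", self.headers.get("Cookie"))      # ties the visit to a profile
        self.send_response(200)
        self.send_header("Content-Type", "application/javascript")
        self.end_headers()
        self.wfile.write(b"/* pretend like-button code */")

if __name__ == "__main__":
    # Any page that embeds <script src="http://localhost:8000/like-button.js">
    # reports its readers here just by being opened.
    HTTPServer(("", 8000), BeaconHandler).serve_forever()
```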

00:20:08

So more spying also gives you the power to pitch back, right?

00:20:13

So a lot of the times if you’re trying to trick someone, it helps to know what they’re

00:20:17

deeply knowledgeable about so that you don’t bullshit them on that axis.

00:20:21

Historically, con men, in the golden age of con men, if you’ve ever seen

00:20:25

The Sting or read the book it’s based on, The Big Con, con men had really two big pitches.

00:20:30

One was a stock scam and the other one was a horse racing scam. And they would have these

00:20:34

ropers who would meet rich people on trains, back when trains were the main way of getting

00:20:39

around America, and they would identify marks. And the only question they wanted to know

00:20:43

is, does this person know more about horse racing or stocks?

00:20:46

And if they knew more about stocks, they would try to scam them with horse racing.

00:20:49

And if they knew more about horse racing, they’d try to scam them with stocks.

00:20:52

So if you want to convince people to become white nationalist eugenicists

00:20:55

and they know a lot about DNA and they know that race is not a meaningful,

00:21:01

you know, it’s not a meaningful biological idea, it’s a socially constructed idea. You might, rather than making dumb eugenic arguments about whose DNA does what,

00:21:10

you may instead, uh, focus on culture, right? You may say, well, people are socialized differently,

00:21:15

and if you know a lot about biology but not a lot about sociology, be like, all right, yeah, they’re

00:21:20

barbarians from these other countries, they’re going to overrun us, I believe you. Yeah, they’re

00:21:23

genetically identical, but, you know, nature, nurture, they were raised wrong.

00:21:26

We have to keep them out of our country. Right. And so if you know about people, you can lie to

00:21:32

them better. Now it’s not still not brainwashing, right? It’s just identifying the places where

00:21:37

they have voids in their knowledge and then filling those voids with facts that are wrong.

00:21:42

Now there’s another way in which big tech is a little

00:21:45

bit different to historic cons, and another way in which their persuasion is supercharged, which

00:21:50

is that it’s secret. So, you know, Klansmen are hard to locate, right? Like, racism, casual, like,

00:21:57

a little bit of latent racism is a widely distributed trait in our population, but virulent

00:22:02

toxic racism is thankfully pretty rare. And so if you want to reach

00:22:06

Klansmen and say, you know all that

00:22:08

stuff that Trump is dog-whistling about?

00:22:09

He’s your man. You can’t just

00:22:12

put giant billboards up on the side of the

00:22:14

highway that say, are you in the Klan?

00:22:16

Vote Trump.

00:22:17

Because even

00:22:19

if the billboard company is

00:22:22

with you, so many people will

00:22:24

go, why are you buying?

00:22:26

Why are you selling the Klan a billboard?

00:22:28

Right.

00:22:28

You’ll face social disapprobation.

00:22:30

You won’t be able to get tables in restaurants.

00:22:31

People will follow you around with their mobile phone and say, why did you sell an ad to the Klan?

00:22:35

But if you want to sell an ad to the Klan on Facebook, you can use the non-consensual dossier to make sure that the Klan ad is only shown to people who have a reasonably high likelihood of being in the Klan and therefore not likely to go and narc you out and get you in

00:22:49

trouble. And especially if you have a self-serve platform where you can, like, say, my hands are

00:22:53

clean, this was all done by robots and humans who don’t work for me, uh, and therefore, uh, I have

00:23:00

nothing to do with the Klan, people can actually kind of buy it. They’re like, oh yeah, it’s a

00:23:03

self-serve platform. It’s, like, algorithmically placed.

00:23:05

This isn’t here because Facebook likes the Klan.

00:23:07

It’s here because you like the Klan.

00:23:09

The problem is with you and not Facebook.

00:23:11

Therefore, let’s give Facebook a pass on this.

00:23:14

So this is one of the things that big tech has

00:23:16

that historic other means of persuasion have not had

00:23:18

is the ability to persuade in secret at scale.

00:23:22

Because usually secrets in scale are hard to balance.

00:23:24

You know, two may keep a secret if one of them is dead.

00:23:25

So, uh, you know, this is genuinely a new thing on

00:23:30

the earth that’s changing our, um, our ability to, uh, hold discourse among ourselves. So Facebook

00:23:36

isn’t a mind control ray. Now, what is Facebook? Well, the first thing to know is

00:23:42

that facebook is the worst of the tech giants. All the tech giants are terrible, but they’re terrible in different ways.

00:23:47

So Apple has a monetization strategy, which is that we lock you into our walled garden.

00:23:52

We don’t spy on you while you’re using it, right?

00:23:55

But you can’t get out of the walled garden, and everything in the walled garden costs extra, right?

00:24:00

So, like, if you want to get your phone fixed by an independent repair service,

00:24:04

Apple has put what are called technical protection measures or copyright locks in them.

00:24:10

And because of a 1998 law, bypassing those is a potential felony.

00:24:14

So to swap a donor screen from an iPhone X that has a working screen but a broken phone into a phone that is working but has a broken screen, you need an unlock code.

00:24:26

And unlocking and typing in that unlock code, if you’re not an Apple service technician,

00:24:30

is bypassing an effective means of access control to a copyrighted work, which is a

00:24:34

potential felony punishable by a five-year prison sentence and a $500,000 fine.

00:24:38

We tried to reform this with 20 right-to-repair bills in 20 states last year.

00:24:42

Apple funded the defeat of every single one of them.

00:24:44

The only one that passed was a ballot initiative in Massachusetts. So, you know, this

00:24:48

is how lock-in can be used to control your behavior. And this is an actually

00:24:52

like pretty significant form of behavior control: deciding when your phone is broken and when it’s

00:24:57

not, when it can be fixed and when it’s not, where you go to get your software, which software is

00:25:01

available for it, whose parts you can use. That is, like, a really extreme form of behavior control that has nothing to do with persuasion. And in a

00:25:07

walled garden, that’s what it’s all about. It’s not about spying on you. It’s about locking

00:25:11

you in and charging you extra.

00:25:12

Now, Google has the opposite strategy. Google’s figured out how to spy on you wherever you

00:25:16

are on the internet, so they don’t need to control you. They can monetize you by serving

00:25:21

you ads, gathering data on you and serving you ads. So Google’s over here, right?

00:25:26

And they’re like, we’ve telemetrized the whole internet.

00:25:29

Everyone’s loading Google Fonts, which are also surveillance beacons.

00:25:32

Everyone’s loading Google JavaScript, which is also surveillance beacon.

00:25:36

Everyone’s loading Google Analytics, which is also surveillance beacon.

00:25:38

Everyone’s got Google Ads embedded in their page, also surveillance beacon.

00:25:41

We can build big non-consensual dossiers of you.

00:25:44

And it doesn’t matter which OS you’re using,

00:25:46

it doesn’t matter which browser you’re using,

00:25:47

it doesn’t matter where you are,

00:25:48

it doesn’t matter whose software you install.

00:25:50

So, Apple locks you in, Google spies on you,

00:25:54

Facebook locks you in and spies on you.

00:25:58

Because they’re like the Lakota of shitty business models.

00:26:01

They use the whole consumer.

00:26:12

So to understand why Facebook wants to lock you in and spy on you, you have to understand that Facebook

00:26:17

is actually two different uneasy companions in the same bed, right? So it’s two businesses.

00:26:23

The first business Facebook has is people-finding. Want to sell a refrigerator? Facebook will find you the people. Do you have a rare disease?

00:26:29

Facebook will find you the people. Did you go to high school a long time ago and don’t know where those people are now?

00:26:35

Facebook will find you those people.

00:26:39

Did you wake up one day and realize that your gender identity isn’t the one you were assigned at birth?

00:26:43

You don’t understand what it means. You want to find people who are talking about it?

00:26:47

Facebook will find you those people.

00:26:48

Do you want to start Black Lives Matter?

00:26:51

Facebook will find you those people.

00:26:53

Do you want to convince a bunch of Civil War larpers to march through the streets of Charlottesville

00:26:57

carrying tiki torches, chanting Jews will not replace us?

00:27:00

Facebook will also find you those people.

00:27:02

So Facebook is a people-finding machine.

00:27:04

The reason people use Facebook is because it finds people for them.

00:27:07

The reason advertisers like Facebook is because it finds people for them.

00:27:11

But Facebook has another thing that it does, which is it lets people who found each other talk to each other.

00:27:16

And the problem is that widely dispersed traits in the population are generally traits that have low-intensity discussions associated with them.

00:27:24

So if you have a rare disease, like almost by definition, there’s not much to say about it,

00:27:28

right? Like, I had a bad day. I’m sorry, right? I read this article. It looks promising,

00:27:34

but it’s a long way away. Oh, maybe it’s good, right? Like the actual frequency is pretty low.

00:27:39

Like the reason you’re not talking to the friends you went to high school with is because you no

00:27:42

longer have anything in common with them, right? The intensity is pretty low. And the thing is, if you’ve got, like, a really bad

00:27:49

or not a bad ad targeting, uh, tool, but if your ad targeting tool needs to serve a lot of ads before

00:27:55

it gets a hit, because it’s, uh, trying to make, like, one-in-a-million conversions instead,

00:27:59

and that’s an improvement on the old one-in-a-billion conversions, you need people to, like,

00:28:04

make a million clicks

00:28:06

before you serve them the thing that they’ll actually click on.

00:28:09

So you need to goose the amount of engagement

00:28:11

that you have on Facebook.

00:28:13

And the way that you goose engagement

00:28:15

in low-intensity discussions

00:28:17

is by non-consensually eyeball-fucking people

00:28:19

with Trump headlines.

00:28:21

You just feed controversial material

00:28:24

into the discussion

00:28:26

by surfacing posts that are controversial,

00:28:28

showing ads that are controversial, and so on.

00:28:30

Then people just argue with each other.

00:28:32

And if they hang out and argue with each other long enough,

00:28:35

eventually they will have made enough page impressions

00:28:37

that you will have shown them an ad that gets them to click

00:28:40

that gets Facebook a payday, right?

00:28:42

And so you can see this actually in the contours of the tools.

00:28:46

If you’ve ever used Facebook’s ad targeting tool

00:28:48

like on the buy side,

00:28:50

that shit is like from the 25th century.

00:28:52

It is totally amazing, right?

00:28:54

Like really sophisticated, really smart,

00:28:57

really intuitive, really easy to use,

00:28:59

really easy to customize.

00:29:01

But we’ve all used Facebook’s messaging tools.

00:29:03

It’s like LiveJournal 2003,

00:29:05

right? Because if you had sophisticated messaging tools, you could filter out the noise, right? You

00:29:11

could keep the people you are loosely affiliated with from drawing you into pointless arguments

00:29:17

that have been goosed by Facebook saying, let’s you and he fight, let’s you and he fight, let’s you and he fight. So Facebook is these two different groups.

00:29:29

So

00:29:29

because Facebook is a

00:29:31

walled garden, if you decide

00:29:33

that you don’t want to be subjected

00:29:36

to this bad discourse environment

00:29:37

anymore, you’re faced with this

00:29:39

horrible collective action problem, which

00:29:41

is that if you want to leave Facebook

00:29:43

and go somewhere else, you have to convince everyone else to leave Facebook and go there with you.

00:29:48

Otherwise, you’re all on your own in a much better messaging environment with no one to

00:29:52

talk to.

00:29:54

And that’s an unsatisfying experience.

00:29:59

And so this is addictive, right?

00:30:01

You are addicted to talking to your friends because it matters, right?

00:30:04

You’re addicted to talking to your campies 11 months of the year

00:30:07

because otherwise you won’t be able to plan your camp when you get here.

00:30:09

But you’re not addicted in the same way that people who are addicted to oxy are addicted, right?

00:30:15

This is a very different kind of addiction because what they’ve done

00:30:18

is they’ve just stuck all your friends behind a reg wall

00:30:21

and you can’t talk to them unless you go behind the reg wall too.

00:30:26

And once you’re inside the reg wall, they can just bombard you with garbage.

00:30:31

So why do we let Facebook get away with this, right? How is it that Facebook has been able to

00:30:37

do this toxic thing and do it to what they claim is 2.3 billion users worldwide? Well, the answer

00:30:44

isn’t that Facebook built the best possible messaging

00:30:47

tool. I mean, that seems pretty obvious on the face of it. But what you do see when you look at

00:30:51

what Facebook has done since its inception is it bought or crushed every single company that

00:30:56

might have competed with it. So in 2018, 15 million Americans between 13 and 34 left Facebook.

00:31:06

It was the largest American exodus from Facebook in the history of the company,

00:31:10

but the vast majority of them ended up on Instagram, and Instagram is a Facebook property.

00:31:16

So how did we let Facebook buy Instagram?

00:31:20

Why did we think that Facebook buying Instagram would promote a

00:31:25

competitive, healthy online environment? And to understand that you have to go way, way back

00:31:30

to the Reagan era. So before the Reagan era, we had a practice of antitrust that grew out of the

00:31:36

trust busting in the Gilded Age, when we had these companies that owned like all of the railroads,

00:31:41

or all of the electricity grid or all of the oil.

00:31:49

And the trust busters, they went in and they did the very, very hard, slow work of breaking up those companies.

00:31:50

That’s really, really hard.

00:31:52

But then they realized that an ounce of prevention is a kind of cure.

00:31:56

And they implemented three main rules about what firms couldn’t do that would stop trust

00:32:03

from growing.

00:32:04

And those three rules were, if you have a nascent company that might grow to compete with you, you can’t buy them.

00:32:10

The second rule was, if you’re a big player in an industry, you can’t merge with another big player in the industry.

00:32:16

And the third rule is structural separation.

00:32:18

You can’t have a platform and be on the platform.

00:32:21

So if you’re a railroad and you ship freight, you can’t have a freight company

00:32:25

that competes with your own customers.

00:32:27

If you’re a bank and you loan money,

00:32:28

you can’t invest in companies

00:32:30

that compete with the companies you’re loaning money to.

00:32:32

Because you can see how you could use preferential treatment

00:32:35

to effectively corner the whole market.

00:32:38

But in…

00:32:40

So, okay.

00:32:42

So that was how things went right up to the Reagan years.

00:32:45

And then there was this cabal of rich sociopaths

00:32:48

who started to promote the fringe ideas of a guy named Robert Bork.

00:32:52

And if you know the name Robert Bork,

00:32:54

it’s probably because Robert Bork was that one colossal asshole

00:32:58

that the Senate said couldn’t sit on the Supreme Court

00:33:00

because he’d been such a dick when he was in the Nixon administration.

00:33:04

And Robert Bork was a lawyer who liked to pretend to be an economist.

00:33:07

He actually won a Nobel Prize in economics, but you need to understand that the Nobel

00:33:11

Prize in economics is not a Nobel Prize.

00:33:15

Economists were jealous that people who do number stuff that actually has a correspondence

00:33:20

to the actual world, like physicists, were getting Nobel Prizes.

00:33:22

They’re like, we got numbers too.

00:33:24

And the Nobel Committee was like, you’re not a science.

00:33:26

And they said, great, we’ll find someone else called Nobel and give it a…

00:33:30

Seriously, we’ll find someone else called Nobel

00:33:33

and give it a Nobel Prize in economics every year.

00:33:35

Like, if they made it out of chocolate, it couldn’t be faker.

00:33:40

So Bork was a lawyer who liked to pretend to be an economist.

00:33:44

And he was an alternate historian.

00:33:46

He was, like, one of those people who, like, makes up alternate histories where, like, you know, the American…

00:33:53

The Nazis won World War II.

00:33:54

Yeah, the Nazis won World War II. There you go. Philip K. Dick, right?

00:33:57

So he had an alternate history of the Sherman Act, which is the main antitrust act. He said that if you squint really hard at the debates

00:34:05

around the passage of the Sherman Act, what you find is that the legislators weren’t actually

00:34:09

worried about monopolies, that they were just worried that when monopolies were formed,

00:34:13

that they would create consumer harm. And consumer harm narrowly construed as raising

00:34:19

prices on consumers in the short term. And so under Bork’s orthodoxy, which started under Reagan,

00:34:25

but which was expanded

00:34:27

by every presidential administration since,

00:34:31

we stopped stopping companies

00:34:33

from buying nascent competitors.

00:34:34

We stopped stopping companies

00:34:35

from merging with major competitors.

00:34:37

And we stopped enforcing

00:34:39

structural separation.

00:34:40

And the only harm we looked to

00:34:41

was whether prices were rising

00:34:43

on the consumer side.

00:34:44

So you may have heard

00:34:44

that there’s finally an antitrust suit going forward against Apple over the App Store.

00:34:49

And that’s because they say that Apple raised prices by charging a commission on the App Store by being the only store that you can buy apps from.

00:34:57

And so that’s the short-term consumer harm because apps cost more than they would have otherwise.

00:35:02

But the fact that they’re like squeezing their software vendors and forcing you to do a bunch of stuff

00:35:06

that’s totally at their discretion

00:35:08

because they have a walled garden,

00:35:09

none of that is of any interest

00:35:11

to contemporary antitrust enforcement.

00:35:13

It’s the same reason that Apple

00:35:15

and the main five publishers,

00:35:17

which were then six,

00:35:19

we’re concentrating all over industries,

00:35:21

the big six publishers and Apple

00:35:23

all had an antitrust action for price fixing

00:35:25

to make e-books cost more.

00:35:27

Because again, the only thing that antitrust cares about

00:35:29

is, are consumers paying higher prices?

00:35:32

And so this has led to concentration,

00:35:36

not just in tech, but in everything.

00:35:38

So if you’re wearing glasses,

00:35:40

either sunglasses or prescription lenses,

00:35:42

take off your glasses and just see if you can,

00:35:44

I know it’s hard if you’re not wearing them, but see if you can read the

00:35:46

maker’s mark on your glasses.

00:35:48

And if that maker’s mark is Armani or Brooks Brothers or Burberry or Chanel or Coach or

00:35:54

DKNY or Dolce & Gabbana or Michael Kors or Oakley or Oliver Peoples or Persol or Polo

00:36:00

Ralph Lauren or Ray-Ban or Tiffany or Valentino, or Vogue, or Versace,

00:36:06

they were made by one company, right?

00:36:08

That company is called Luxottica.

00:36:10

They were a small Italian eyewear company.

00:36:11

They took a bunch of private equity.

00:36:13

They started buying eyewear companies.

00:36:16

But maybe you’ve got, like, cool indie glasses, right?

00:36:21

But if you’ve got your cool indie glasses from Sunglass Hut, LensCrafters,

00:36:24

Pearle Vision, Sears Optical, Target Optical,

00:36:25

you bought them from Luxottica. They own those retailers.

00:36:27

So maybe you went to your cool indie optometrist and bought your glasses there.

00:36:31

If the lenses were made by Essilor, the largest optical lens manufacturer in the world, also

00:36:36

made by Luxottica, but maybe you’ve got artisanal lenses crafted by a junior in a leather apron

00:36:41

in Portland.

00:36:42

If the insurer that paid out is the largest

00:36:46

eyewear insurer in the world, EyeMed,

00:36:48

that’s also Luxottica, right?

00:36:50

And they used

00:36:51

this lack of structural separation where they could

00:36:54

own the retailer and the manufacturer and

00:36:56

the insurer to corner the market.

00:36:58

So, which

00:37:00

one was it? It was, I want to say Coach,

00:37:02

but it wasn’t. It was Oakley.

00:37:04

Wouldn’t sell to them.

00:37:05

And so they just wouldn’t sell Oakley in any of those retailers. And a year later, Oakley had been

00:37:10

driven to its knees. It was at the brink of bankruptcy, and they bought Oakley for pennies.

00:37:15

And they went around, they did this over and over and over and over again. So Luxottica, they got

00:37:21

big by buying companies, right? Not by like network effects or first mover advantages or big data,

00:37:28

just by doing stuff that was illegal until about 40 years ago.

00:37:31

So that’s exactly what Facebook did, right?

00:37:34

Facebook bought Instagram and WhatsApp.

00:37:37

And then, you know, there are companies that try to compete with Facebook.

00:37:40

There’s not many.

00:37:41

There’s a famous article in an investment newsletter called The Killing Zone or The Kill Zone.

00:37:47

And it’s what investors have started to call the businesses that Google, Facebook, and the other big platforms are in.

00:37:52

And these are businesses that are posting year-on-year double-digit growth, and no one wants to invest in their competitors.

00:37:58

Like, that’s weird, right?

00:37:59

You’d think that if you were making billions of dollars and posting year-on-year double-digit growth,

00:38:03

remember, it’s hard to grow big numbers.

00:38:05

And they’re growing big numbers.

00:38:06

That there’d be investors going, if you think you can get 1% of that, that’s a big payday for me.

00:38:10

I’ll back you.

00:38:11

There aren’t many of those.

00:38:12

But there’s one company that actually did try it.

00:38:14

And they tried competing with Facebook on the one axis that they knew Facebook would never be able to push back on, on privacy.

00:38:20

That company is called Snap, right?

00:38:23

Snap is Facebook, but they delete your messages after 24 hours, right? And Snap was doing really well. But one of Facebook’s acquisitions was a company

00:38:31

called Onavo. Onavo was a fake battery monitor. It did monitor your battery, but the permissions

00:38:38

that it sought allowed it to monitor the telemetry on your whole phone, everything you did. And so

00:38:43

Facebook used Onavo to detect that Snap was being installed by people who were leaving Facebook.

00:38:48

And they used that to drive their acquisition of Instagram.

00:38:51

And then to refine the features of Instagram so that it competed head-to-head with the features of Snap that its users wanted.

00:38:57

Now, again, if we had structural separation, if we stopped companies from buying their nascent competitors,

00:39:02

none of this stuff would have been allowed.

00:39:03

And you don’t need to have a mind control ray to get people to use Instagram if they’re

00:39:07

already using Snap, if you can spy on everything they’re doing and you’ve got a deeper war

00:39:10

chest to advertise.

00:39:12

And it’s not just Facebook.

00:39:14

All of big tech got big primarily through acquisitions and through dirty tricks and

00:39:18

vertical integration.

00:39:20

So Google is a company that makes a lot of products, but they only developed two of them

00:39:25

really in-house.

00:39:26

They made a really good search engine and a pretty good Hotmail clone.

00:39:29

And everything else they made, they made by buying them.

00:39:32

They made by doing things that pre-Reagan they wouldn’t have been allowed to do.

00:39:37

Apple bought 50 companies in January and February this year.

00:39:41

Google bought 200 companies last year.

00:39:43

Apple buys companies more often than I

00:39:45

buy groceries. So this is an important part of the story, right? How big tech got big.

00:39:52

Because there are a lot of other versions of this story about where big tech got big. And these are

00:39:56

tech exceptionalism versions that say that we can’t ever hope to make tech small again.

00:40:02

They say that tech has these natural monopolies,

00:40:05

or it has network effects, or it has first mover advantage, or that surveillance capitalism is a

00:40:11

rogue capitalism, and by controlling our minds, they can stop us from ever wanting to leave

00:40:15

their platform. They can’t stop people from leaving their platform. They can just buy all

00:40:19

the other platforms that you might leave them for. And so these explanations, although they’re reasonably

00:40:26

well theorized in the literature, it’s very hard to take one of the big platforms and point at them

00:40:32

and go, there’s the way that network effects grew this company. And it certainly probably

00:40:36

played a role there. But what you can do is you can point to these companies and go, that acquisition,

00:40:41

that dirty trick, that acquisition, that merger is how these companies got big.

00:40:46

You know, if first mover effects and network effects were all we needed,

00:40:48

We would all be searching AltaVista on our great supercomputers with every hour that

00:40:52

God sent, right?

00:40:53

It’s clearly not enough to have first mover advantage.

00:40:57

It’s clearly not enough to have network effects on your side.

00:40:59

But dirty tricks get you a long way.

00:41:02

So monopolies are incredibly profitable.

00:41:04

And the profits that

00:41:05

monopolies generate allow monopolies to become self-perpetuating and endlessly expanding, right?

00:41:11

If you are collecting monopoly rents, if you’re getting extra money because no one can compete

00:41:16

with you, then you can peel off some of the money that you’ve been making. You can keep your

00:41:20

shareholders happy. You can pay big bonuses to hire the smartest people. And you’ve still got money left over that you can use to pay lobbyists.

00:41:27

And you can also, as you get more concentrated, the number of players in the industry gets smaller and smaller.

00:41:35

You don’t actually have to all gather in a smoke-filled room and come up with a scheme to dominate the world.

00:41:40

Although it actually turns out that a bunch of them did this when they fixed hiring, you remember this? There was this thing where they all

00:41:47

got together and agreed not to poach each other’s engineers because it was costing so

00:41:51

much. They sometimes do that, but for the most part they don’t even need to. But if

00:41:55

you remember that photo of all the big tech leaders around a table in Trump Tower after

00:41:59

the inauguration, like on the one hand it’s just kind of gross to watch them all kissing that, you know, low-rent, you know, low-rent Walmart Hitler with a, you know, with their own spray tan, like,

00:42:11

kissing his ass. Like, that was terrible. But the other terrible thing to note is that, like,

00:42:14

everyone who makes decisions about tech in the Western world fit around one not very big table,

00:42:21

right? And if you all fit around that one not very big table, it’s not hard

00:42:25

for you to converge on a set of policies, right? Like if you’re trying to get your whole camp to

00:42:31

come out to the burn together so you can all sit together, that’s hard. But if you camp with three

00:42:36

friends or three of you say, we’re just going to go on our own, that’s easy, right? You have a

00:42:40

smaller collective action threshold to overcome to get to an end.

00:42:46

And moreover, when these firms are very big, there’s not a lot of places to go and not a lot of places to hire from.

00:42:52

So when you look at the executive suites in these firms, they all have worked at each other’s companies.

00:42:58

Sheryl Sandberg, she’s like the Zelig of big tech.

00:43:01

She’s been in all of them.

00:43:04

And when that happens,

00:43:06

you don’t need to have a conspiracy. You just need to be like the godparent of someone who

00:43:11

works in the other company and you see them at, like the kid of someone who works in the other

00:43:16

company and you see them at birthday parties and you just like over casual chit chat start to

00:43:21

converge on a set of terms. You just get very cozy, right, when you’re

00:43:25

very, very concentrated. And so that coziness means that you can not only spend money on

00:43:32

lobbying, but that you’re not spending money on lobbying for one thing while the other

00:43:36

big tech is spending lobbying dollars to undermine it. You’re all pushing in the same direction to get laws passed that make

00:43:45

you more powerful.

00:43:48

One of the ways in which big tech has converged on a set of lobbying outcomes that has made

00:43:53

them more powerful is by fencing off something that I call adversarial interoperability.

00:43:58

So you’re all familiar with interoperability, right?

00:44:00

You’ve got a bolt, you’ve got a nut.

00:44:02

You screw the bolt on the nut.

00:44:03

It doesn’t matter if they’re from the same manufacturer, right? You’ve got a cigarette, you’ve got a nut. You screw the bolt on the nut. It doesn’t matter if they’re from the same manufacturer, right?

00:44:05

You’ve got a cigarette lighter in your car.

00:44:06

You go to the gas station.

00:44:07

There’s a fishbowl with 50-cent USB chargers that fit your cigarette lighter.

00:44:11

You put it in.

00:44:11

That’s interoperability.

00:44:13

And there’s a lot of it, and some of it is even cooperative interoperability, right?

00:44:16

Like if you want to make an app for Facebook or the iOS platform or Google Play, they’ll give you, like they publish an API.

00:44:23

They tell you how to do it.

00:44:24

So that’s like kind of willing interoperability, adhering to standards. But really the secret

00:44:29

sauce is adversarial interoperability. That’s when I figured out how to plug something into

00:44:34

something you made without your permission against your wishes. Like when the first cable

00:44:39

TV came along, it was TV salesmen in rural Pennsylvania who couldn’t sell TVs because

00:44:44

no one could receive

00:44:45

signals. So they clubbed together and put up a big antenna and ran wires to all their customers’

00:44:49

houses and just stole the broadcast signals from Philly. That’s adversarial interoperability.

00:44:55

And adversarial interoperability is deep in the DNA of all of today’s tech giants. And like every

00:45:00

pirate that has ascended to the top, they’ve now declared themselves admirals and then kicked the ladder away.

00:45:06

So the tactics that made them big were legitimate competitive tactics.

00:45:09

Those tactics deployed against them are illegitimate forms of intellectual property theft.

00:45:14

So a good example of this would be what Apple did around 2002 to 2006.

00:45:20

You remember the Switch ads?

00:45:23

Switch to Apple.

00:45:24

It’s easier than you think. It doesn’t

00:45:26

matter that all of your documents are siloed

00:45:28

in Microsoft proprietary apps.

00:45:29

Apple can help you switch. And then they made the iWork

00:45:32

suite where they did clean room

00:45:33

re-implementations of every app

00:45:35

that Microsoft was using to lock in their customers.

00:45:38

They made Pages that read Word.

00:45:39

They made Numbers that read Excel.

00:45:41

They made Keynote that read PowerPoint.

00:45:43

They did that without permission, and they fielded it, and they stole customers from Microsoft,

00:45:48

and they allowed people who were using Apple products to coexist in an office environment with people who were using Windows.

00:45:56

So the fact that everyone in your office used Windows didn’t mean that you also had to use Windows,

00:46:00

which meant that you could start to erode the monopoly power of Microsoft and push back against it. That is Apple’s whole origin story.

00:46:08

Today, anyone who tried to make a tool like that

00:46:11

for, say, iTunes, that reads and writes the proprietary iTunes files

00:46:16

that can spoof all of the special magic

00:46:19

URLs they use to pull down podcasts and so on, so that you could use

00:46:23

all of the functionality

00:46:25

of iTunes with all the devices that iTunes works with without using an Apple product

00:46:29

and buying your stuff from third parties and not just from Apple.

00:46:32

If you did that, Apple would hit you with software patent suits.

00:46:35

They would say that violating the terms of service violates the Computer Fraud and Abuse

00:46:38

Act.

00:46:39

They would say that the digital rights management in it is a felony to bypass.

00:46:43

It’s an effective means of access control

00:46:45

to a copyrighted work. And so trafficking in that tool would be a felony punishable by a five-year

00:46:49

prison sentence and a $500,000 fine. They would say that you tortiously interfered with their

00:46:54

contractual relationships with their suppliers and their customers. And they would basically

00:46:57

just nuke you from orbit. And there would be nothing left but a crater by the time you were

00:47:02

done. Because when they did it, it was legitimate. And when you do it, it’s theft, right? Or Facebook. When Facebook started, they had a really bad

00:47:10

problem, which is that everyone who might use Facebook was already a MySpace user. And remember

00:47:15

the collective action problem. Facebook might have had a better product, but it’s not better

00:47:18

if all the people you want to use it with are still using MySpace. So they figured out a way

00:47:22

for you to use Facebook with MySpace.

00:47:29

They made a bot. And that bot, you would give it your login credentials. It would pretend to be you and show up on MySpace and say, here I am. What messages are waiting for me? All right, here we

00:47:33

go. And it would flow them into your Facebook inbox. And then you could answer them from Facebook.

00:47:38

So you could be in a MySpace group on Facebook. And then it would send them back to MySpace.

00:47:43

And there’d be a little footer that said, I sent this from Facebook. Why the fuck are you still using MySpace? Right? And so

00:47:48

they were able to bring customers over. They didn’t have to solve the collective action problem.

00:47:51

They could just use adversarial interoperability. Now, around 2011, a company called Power Ventures

00:47:57

comes along and they're like, we're going to unify

00:48:01

all of your social telephones in one inbox, right. You’re not going to answer your LinkedIn phone and your Facebook phone and your whatever phone.

00:48:07

We will write little bots that log into all those services, scrape your messages and flow them into this thing.

00:48:15

Facebook sued them. Right. They sued them and they advanced this fairly radical theory that a law passed in 1986 in a moral panic over the movie War Games called the Computer Fraud and Abuse Act

00:48:26

that defines hacking as exceeding your authorization on a remote computer

00:48:31

means that any time you violate terms of service, you commit a potential felony.

00:48:36

And they shut down Power Ventures because when they did it, it was progress,

00:48:40

and when someone does it to them, it’s theft.
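Neither Facebook's MySpace bot nor Power Ventures' code was ever published, but the general pattern being described here, a bot that logs into another service with the user's own credentials, pulls their waiting messages, and posts replies back, can be sketched roughly as follows. This is only a hedged illustration: the service URL, endpoints, and JSON fields are invented placeholders, not any real API.

# A minimal, hypothetical sketch of the adversarial-interoperability bot
# pattern described above. The URLs and field names are placeholders.

import requests

OTHER_SERVICE = "https://other-social-network.example"  # hypothetical

def _login(username: str, password: str) -> requests.Session:
    # Log in the way a browser would, using the user's own credentials.
    session = requests.Session()
    session.post(f"{OTHER_SERVICE}/login",
                 data={"user": username, "pass": password})
    return session

def fetch_waiting_messages(username: str, password: str) -> list:
    # "Here I am, what messages are waiting for me?" -- scrape the inbox
    # so the new service can show the messages in its own interface.
    session = _login(username, password)
    resp = session.get(f"{OTHER_SERVICE}/inbox")
    return resp.json().get("messages", [])

def send_reply(username: str, password: str, to: str, body: str) -> None:
    # Push the user's reply back to the old service, with a footer telling
    # their friends where the reply really came from.
    session = _login(username, password)
    footer = "\n-- sent from the new service; why are you still over there?"
    session.post(f"{OTHER_SERVICE}/send",
                 data={"to": to, "body": body + footer})

The point of the sketch is only that nothing technically exotic is involved; what the talk is describing is how the legal exposure around doing it changed.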

00:48:43

So as these companies have gotten more money, they’ve been able to buy these legal and policy outcomes that make it easier for them to be lobbyists.

00:48:52

You know, California has been fighting really hard on privacy laws.

00:48:57

And the big tech firms who swing a very big stick in California have been throwing everything they have at fighting those meaningful privacy laws.

00:49:04

Or Apple, as I said, shut down 20 right-to-repair bills in 20 states last year.

00:49:09

And it’s not just that monopolies can lobby.

00:49:13

They can also buy their regulators.

00:49:15

So there’s a European former political leader of a large, powerful nation.

00:49:20

I can’t say anything because this was told to me in confidence,

00:49:22

who now fronts for one of the big tech companies as their major public face, who is on a four million euro a year salary.

00:49:29

Right. Four million euros a year buys you a lot of motivated reasoning.

00:49:34

Like, well, you know, if only bad people lobby for these big tech companies, only bad things will happen.

00:49:40

I’m a good person. And so I can take the four million euros a year and I can fix it from the inside.

00:49:44

You know, you cannot fix something from the inside. It shouldn’t exist in the first place

00:49:49

so

00:49:50

monopolies'

00:49:52

dominance of regulation

00:49:53

does not need to be total in order for it to be effective.

00:49:57

In fact, sometimes monopolies lose regulatory fights and snatch victory from the jaws of defeat

00:50:02

So the GDPR is pretty good. I'd say like 70% of it is a really well-designed privacy

00:50:09

regulation.

00:50:09

There’s tons of parts of it.

00:50:10

And I should say I'm not speaking on behalf of the EFF here.

00:50:12

We have a more nuanced view of it.

00:50:15

But the GDPR, the European Privacy Rule, is pretty good, right?

00:50:18

But one of the things about it is that it’s very expensive to comply with.

00:50:22

Well, there aren’t many companies that have a lot of money.

00:50:24

Most of them are giant American tech companies.

00:50:26

And what we’ve seen is that the European ad tech sector

00:50:29

has been flushed down the toilet

00:50:30

and has been replaced by American big tech operators.

00:50:33

So even though we’ve brought them to heel a little,

00:50:36

what we’ve also done is we’ve made being in the sector

00:50:40

so expensive that you have to be a monopolist to be there.

00:50:43

And so one of the implications of this

00:50:44

is that anything that we ever did in the future

00:50:46

to make them too poor to honor the GDPR would subvert this privacy rule that we’ve come up with.

00:50:52

Right?

00:50:52

And it’s not just privacy.

00:50:53

Last year, there was a terrible, awful fight in the European Union about the copyright directive.

00:50:58

So every, you know, 15 to 20 years, whether they need it or not, Europe updates its copyright rules.

00:51:03

And this one had been a pretty, like pretty normal one that had been going along.

00:51:07

They’d had expert input, and they’d had some loony ideas, and they’d thrown them away.

00:51:11

They’d had some good ideas, and they’d refined them, and everything was going along.

00:51:14

But because of the way the presidency rotates and because of the way that the parliament rotates,

00:51:18

the guy in charge of it, who's called the rapporteur, changed over to this German MEP called Axel Voss.

00:51:24

And Axel Voss revived this completely bananas idea where what they would say is

00:51:31

that instead of being expected to police copyright on a platform by being told that someone had

00:51:36

violated copyright and then taking it down, instead platforms would have to know before

00:51:41

something went live whether or not it infringed copyright. And so you would somehow have to look at every tweet and Facebook update and code check-in

00:51:50

and Wikipedia update and new audio file and video clip and photo, and you would have to

00:51:57

somehow know whether or not that user was entitled to post it, whether it was their

00:52:00

copyright or in the public domain or fair dealing or not.

00:52:04

Now like, if everyone who was ever trained or capable of being trained as a copyright lawyer in the history of the world

00:52:09

were put to work on this task, we would run out of lawyer hours before we made even the smallest dent in this.

00:52:15

And so what was obvious from the start was that this was going to require automated filters.

00:52:19

And we see some of those out there in the field already.

00:52:22

Google has one for YouTube called Content ID that cost about $100 million to build and deploy.

00:52:28

And Content ID is pretty terrible.

00:52:30

It’s really easy to spoof.

00:52:31

You know, the people who wanted the filter in the first place, the big rightsholder organizations, say Content ID is wildly inadequate.

00:52:37

And not only that, but it’s also really hard to get legitimate stuff through it.

00:52:41

It's forever misidentifying things.

00:52:43

You know, for a long time it identified any silence as belonging to Philip

00:52:47

Glass.

00:52:48

It identified birdsong as belonging to a sound effects company called Rumblefish, any birdsong.

00:52:56

It repeatedly and routinely identifies people’s own recordings of the performances of Brahms

00:53:01

and Bach as belonging to Sony Music, because for an algorithm, the

00:53:06

professionally prepared version of this and your version in your living room are not different

00:53:12

enough. The algorithm is kind of dialed over to err on the side of caution, because what they

00:53:17

don’t want is for you to be able to fuzz a recording enough that it sounds a little amateurish

00:53:21

and then get past the algorithm. And, like, they shoot first and ask questions later.
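Content ID's actual matching system is proprietary, but the "err on the side of caution" behavior being described can be sketched in the abstract: fingerprint the upload, compare it against every registered reference, and block anything whose similarity clears a deliberately low threshold, so a slightly fuzzed or amateur recording still trips the match. This is only a toy illustration of that logic, with made-up fingerprints and a made-up threshold, not how Google's system actually works.

def fingerprint(data: bytes, shingle: int = 8) -> set:
    # Stand-in fingerprint: hash overlapping chunks of the content into a
    # set of features. Real systems use robust audio/video fingerprints.
    return {hash(data[i:i + shingle]) for i in range(max(1, len(data) - shingle))}

def similarity(a: set, b: set) -> float:
    # Jaccard similarity between two fingerprints, from 0.0 to 1.0.
    return len(a & b) / len(a | b) if (a or b) else 0.0

BLOCK_THRESHOLD = 0.3  # deliberately low: near-matches get blocked too

def claimed_owner(upload: bytes, references: dict):
    # Return the claimant whose reference the upload resembles, or None.
    # Anything at or above the threshold is blocked first, questioned later.
    fp = fingerprint(upload)
    for owner, ref_fp in references.items():
        if similarity(fp, ref_fp) >= BLOCK_THRESHOLD:
            return owner
    return None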

00:53:31

But even so, Content ID is a much smaller version of this. For one thing, it only accepts claims from trusted rights holders, right? To be someone who can send something to

00:53:35

Content ID and say, never let anyone upload this, you have to be someone that they've

00:53:39

vetted, right? And they can kick you out if you're a dick about it, right? If you start uploading

00:53:43

stuff that doesn't belong to you, they can go, right, you're

00:53:46

out of it. From now on, you just have to monitor

00:53:47

YouTube, and if you find something that infringes, you have to tell

00:53:49

us about it, and we’ll take it down. We’re not going to do proactive

00:53:52

takedowns for you. But the

00:53:53

copyright directive, it has no exception

00:53:55

for this. Even if

00:53:57

you, a million times in a row,

00:53:59

upload the alphabet and the works of Shakespeare and say

00:54:01

these are mine, and the people operating

00:54:03

the filter have to go in and remove that a million times. They can’t strike you off and say, sorry,

00:54:08

you don’t get to use, you don’t get to add stuff to the filter anymore. Instead, what they have to

00:54:13

do is accept all of your input on pain of, if you do have a copyright that’s eventually infringed

00:54:19

and they didn’t let you identify it in the first place, they can sue you and take you for very

00:54:24

large sums of money.

00:54:25

And so this is just kind of like a charter for the operation of trolls and crooks

00:54:30

and dirty cops who want to suppress videos of their beatings of their victims

00:54:34

and anyone else who just wants to make things disappear from the Internet

00:54:36

just by saying it’s copyright and without having to prove it.

00:54:39

You don’t even have to like, there’s no requirement in the directive

00:54:42

that you formally even identify yourself.

00:54:44

There’s no rigor in the identification process

00:54:46

for who you actually are. You can say, like, Paul McCartney here, I'd like you to

00:54:50

know, all that stuff belongs to me. Now, Paul McCartney supported this. It's hard

00:54:53

to fight things that Paul McCartney supports. And we lost. We lost it by a

00:54:58

whisker. So, five million people signed the petition against it. It was the

00:55:02

largest petition in European history. 200,000 people marched in 50 cities on one day.

00:55:09

It was the most controversial directive ever introduced.

00:55:14

It went to a vote that would have been like the dispositive vote about whether or not it would proceed.

00:55:19

And we lost by five votes.

00:55:21

And afterwards, 10 MEPs said that they were confused about what they were voting on and pressed the wrong button. Right? So big tech fought this. They actually did fight this to their credit.

00:55:32

They spent a bunch of money on it. In fact, I was working on it and everyone kept accusing me of

00:55:36

shilling for Google, which is awesome. They do let me search the internet for free. I'll put my hand on my heart and admit that.

00:55:48

But they’re not so sad about losing.

00:55:54

In fact, the CEO of YouTube wrote an op-ed where she said,

00:55:57

there’s nothing wrong with filters per se.

00:55:59

We want to be able to decide which filters we have.

00:56:01

Well, we can afford filters.

00:56:02

Well, shit, yeah, they can afford filters.

00:56:03

They’ve got monopoly rights.

00:56:06

There’s five Western companies in the world that can afford filters. Well, shit, yeah, they can afford filters. They’ve got monopoly rights. There’s five Western companies in the world that can afford filters.

00:56:08

None of them are European, right?

00:56:12

All of those European companies had their death warrant signed last March when this regulation was introduced.

00:56:14

And if you think Google is hard to negotiate with in 2019 when there are small European

00:56:18

competitors that people actually use, and there are, like there’s a Bulgarian search

00:56:22

tool that people like, and a Croatian

00:56:25

photo-sharing tool that people like, and so on. If you think they're hard to negotiate with now,

00:56:29

give them 10 years with no competition and see how you like them. So this means that we’re,

00:56:36

at this point, we’re making these choices about whether to deputize big tech firms

00:56:42

as arms of the state or to try and cut them down to size, right?

00:56:46

We’re at this point where we’re trying to decide whether we’re going to fix big tech or fix the Internet.

00:56:50

Because once you make companies so small that they don't have hundreds of millions of dollars to buy filters,

00:56:55

then you ensure that the filters won’t run.

00:56:57

And if you’ve already decided that filters are the only way to solve this problem,

00:57:00

then you have already foreclosed on the possibility of making the company small

00:57:06

again, right? Like, you can either have an internet that consists of five giant websites filled

00:57:10

with screenshots from the other four, or we can have an internet that’s pluralistic, where lots

00:57:15

of people offer lots of services to lots of other people in lots of ways. And we can see that big

00:57:21

tech is increasingly becoming an arm of the state with the effect that we won’t make them small again.

00:57:27

So many of you probably follow the scandal about Amazon Ring.

00:57:30

That’s the surveillance doorbells they make that have a little camera built into them.

00:57:34

And Ring did secret deals with at least 225 U.S. police departments where the cops would go around and buzz market Ring devices and subsidize the purchase of Ring devices in their cities

00:57:46

and be given free Ring devices to hold raffles for.

00:57:49

And then they would encourage people to install the Neighbors app, which creates surveillance

00:57:52

grids of all those doorbells that all watch the street together.

00:57:56

And then Amazon at first admitted that they were giving the cops the ability to ask a

00:58:02

citizen, can I see this footage from your doorbell camera because something bad happened there?

00:58:06

But it actually turned out as we got deeper and deeper into the freedom of information requests,

00:58:10

if the citizen says no, then Amazon says to the police, just ask us for it.

00:58:15

And once you’ve made a formal request, it doesn’t matter if the citizen says yes or no.

00:58:19

So Amazon is basically pursuing a strategy to turn themselves into part of the state, right,

00:58:24

into an arm of the state, right?

00:58:27

Into an arm of law enforcement.

00:58:31

And if you said, well, we’re going to make a new rule that regulates big tech and its privacy practices to stop them from doing dirty shit, 225 police departments across

00:58:37

America will say, don’t make that rule.

00:58:39

If you make that rule, we can’t fight crime anymore because our crime fighting capacity

00:58:42

is completely bound up with these big tech companies that we've deputized to be arms of the state.

00:58:49

We cannot make big tech behave. We'll never make big tech behave. The only way to make big tech

00:58:54

behave is to make big tech small, right? The reason Facebook is a dumpster fire isn't merely that

00:58:59

Mark Zuckerberg is a sociopath and not suited to make decisions about the social lives of 2.3 billion people,

00:59:06

it’s that there’s no one on earth who is suited and wise enough to make the social decisions

00:59:13

about the lives of 2.3 billion people.

00:59:17

So if you’ve got 2.3 billion people, that means that every day you have to solve 2,300

00:59:24

one-in-a-million use cases, right?
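The arithmetic behind that figure is just the user count times the odds; a quick sanity check:

users = 2_300_000_000   # 2.3 billion users
odds = 1 / 1_000_000    # a one-in-a-million situation
print(users * odds)     # 2300.0 such corner cases, every single day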

00:59:27

No company is going to build a single product that handles 2,300 new one-in-a-million corner cases

00:59:34

every single day. No service can do it, but lots of services can, right? Services that respond to

00:59:41

the idiosyncratic needs of their users. A pluralistic internet is

00:59:45

one in which people have more technological self-determination, first because they can shop

00:59:49

around, second because they can build stuff for themselves, and third because they can have the

00:59:53

policy space to rein in the worst actors because the bad actors won’t have those monopoly rents

00:59:59

that they can use to pay former European leaders four million euros a year to lobby on their behalf.

01:00:03

So we need to weaken the power of big tech, and we need to break up big tech, and then

01:00:09

we need to stop big tech from getting big again.

01:00:11

So we need to, one way we can do this is we could create a blanket immunity for interoperability.

01:00:17

We could say that anyone who makes a tool that allows users to increase their technological

01:00:21

self-determination is immunized from claims under patent, copyright,

01:00:25

tortious interference, anti-hacking laws like the Computer Fraud and Abuse Act.

01:00:29

And we wouldn’t immunize them if they were doing things that were actually enumerated

01:00:32

crimes, like stealing people’s data, trafficking in obscenity, making actual death threats,

01:00:37

and using these adversarial interoperability tools to do it.

01:00:41

And we wouldn’t have to mandate that the big tech companies tolerate it.

01:00:44

They can fight it, right? So a good example of this is how we used to regulate AT&T. When AT&T was

01:00:50

the bell system, they had an absolute monopoly on everything you could plug into the phone network.

01:00:55

And they used that to ensure that no one could buy phones. You could only rent phones. And so

01:00:59

people would buy their phone a thousand times over, right, by paying, by renting it every month.

01:01:04

It was a really good racket for them. And part of their argument was, we need to be able to maintain

01:01:08

the integrity of the bell system because we have been deputized as an arm of the law, right, an arm

01:01:12

of the state. Like, when there’s an emergency, our network is how we coordinate it, right? When

01:01:16

there’s a public safety issue, our network is how we respond to it. And so if we can’t control it,

01:01:21

how will we ever be able to do all of that? So you have to stop people.

01:01:25

But of course, what they mostly used it for was to pad their balance sheet.

01:01:29

So there was a company called Hush-A-Phone that made a device that went over your phone receiver like this.

01:01:37

And when it was over your phone receiver, people couldn't read your lips and your voice was kind of muffled.

01:01:41

So it would be hard to eavesdrop on you.


01:01:48

And AT&T argued that mechanically coupling this third-party device to the bell system endangered the integrity of the bell system, right?

01:01:51

And so it went to court, and it was a claim that was so absurd that finally a judge said,

01:01:56

uh-uh, you can no longer monopolize all the things that you can mechanically connect to the bell system.

01:02:02

Now, AT&T still had a remedy.

01:02:04

The judge didn’t say you have to keep your phone receivers in the same shape.

01:02:09

AT&T, if they wanted to, if they wanted to piss off their customers,

01:02:12

they could have replaced all of the phones in America, which, remember, they owned.

01:02:15

They could have done it every six months.

01:02:17

It would have been really expensive,

01:02:19

but the state wouldn’t have been subsidizing their business model anymore.

01:02:22

But they had a perfectly good countermeasure at their disposal to stop people from connecting the Hush-A-Phone, but they were like,

01:02:28

eh, it's not worth it, right? Then it happened with electronic coupling. There's a company called

01:02:32

Carterfone that made, like, a walkie-talkie-to-phone bridge for, like, ranch hands. They sued over

01:02:36

that. They lost. They lost the ability to control electrical coupling too. They could fight it,

01:02:41

right? And they could stop you from electrically coupling things that were actually illegal, right?

01:02:45

If you were Steve Jobs and Steve Wozniak going door-to-door in your dorm selling blue boxes that let you make free long-distance calls, which is how they bankrolled Apple, they could sue you, right?

01:02:55

And because that’s a disservice, it’s an actual enumerated crime, couldn’t stop you from just making a tone generator that lets you automatically place calls without having to press the buttons.

01:03:08

Or any one of another million things that could be either electrically or mechanically coupled.

01:03:12

So we could create an absolute interoperability defense.

01:03:14

We wouldn’t have to tell the big tech companies how to run their business.

01:03:16

We wouldn’t have to tell them which companies they could fight.

01:03:19

All we would say is that you can no longer use the state as a cheap way to turn your commercial preferences

01:03:25

into felony contempt of business model.

01:03:29

So, thank you.

01:03:33

And then we could just reverse Bork, right?

01:03:35

We could say, you can’t buy nascent competitors anymore.

01:03:37

You can’t merge anymore.

01:03:39

You have to have structural separation, right?

01:03:41

We can say you can’t go into new lines of business

01:03:43

that would make it harder to compete with you. And that, in some ways, would be pretty straightforward

01:03:47

because it would just take executive action. You wouldn't need new law. Bork was, like,

01:03:52

wrong about what the Sherman Act says. The Sherman Act really clearly says all that shit

01:03:55

is illegal. And so the administration could just do it. Now the judges have weird theories

01:03:59

about this, and we might need to pass a new law to get them to, you know, like, for avoidance

01:04:02

of doubt, cut that shit out. But, you know, like, it would be a good start, right? So that would be preventative, prophylactic to

01:04:07

stop them from doing it. The last thing we can do is we can start breaking them up. There are

01:04:11

lots of lines we can fracture them on: take Instagram out of Facebook, take DoubleClick

01:04:15

out of Google, take YouTube out of Google, whatever. And that will take a long-ass time.

01:04:21

It’s like a 10 to 15 year process. If you look at the AT&T breakup or the attempt at

01:04:26

the Microsoft breakup or the attempt at the IBM breakup, they were really slow. And in most cases,

01:04:31

they were unsuccessful: Microsoft, IBM. But the thing that happened, even though they were slow

01:04:37

and unsuccessful, is they were monumentally unpleasant for the firms involved. And they

01:04:42

sent a very clear message that shenanigans would make

01:04:45

life very hard for you. And there are a lot of Microsoft insiders who tell a credible story

01:04:50

about how Google got started, which is that every time there was a boardroom in which someone said,

01:04:54

let’s just go crush those fuckers, someone else in that room said, wait a second, didn’t you see

01:04:59

what happened when they put Bill Gates on the stand? He was, like, rocking and stimming, and he became a laughingstock.

01:05:06

Don’t make them put Bill back on the stand.

01:05:09

And so they left Google alone, right?

01:05:12

IBM had been like under antitrust investigation

01:05:14

for their mainframe business for like a decade.

01:05:17

And then they decided to make the IBM PC

01:05:19

and they made it all with commodity components,

01:05:23

which is how we got IBM PC clones.

01:05:25

And the reason that this company

01:05:26

that had always made every component

01:05:28

down to the fucking machine screws

01:05:30

decided to use commodity components

01:05:31

is a bit of a mystery.

01:05:33

But one very plausible explanation

01:05:35

is that there was a boardroom where someone said,

01:05:37

if we do this really gross anticompetitive thing,

01:05:40

that thing that we’re now winding down,

01:05:42

because at that point the DOJ was winding down

01:05:43

the antitrust over mainframes

01:05:44

because mainframes weren’t a thing anymore um that thing might

01:05:48

kick off again right and so we just got done with like a decade and change of like every utterance

01:05:54

we made having to be run through council in case it destroyed our business with the deal because

01:05:58

of the doj antitrust investigation so even if these things don’t work they work right they put

01:06:04

everyone on notice.

01:06:05

You know, sometimes you have to execute an admiral to encourage the others, right?

01:06:10

So this is the last platform.

01:06:14

Now, if this sounds like a giant battle to you, you’re right.

01:06:18

But I think it’s one that we have a plausible chance of winning.

01:06:21

And it’s because this is not a big tech problem, because tech exceptionalism is bunk,

01:06:26

right? Because this is a problem with big eyewear as well as big iPhones. This is a problem with

01:06:31

people who are worried about the fact that 25 years ago, there were 30 pro wrestling leagues,

01:06:36

and now there’s one. And the guy who owns it’s a billionaire Trump donor. And because he’s got

01:06:40

the only wrestling league, he reclassified all of his wrestlers as contractors and took away

01:06:44

their medical insurance.

01:06:46

And GoFundMe is full of 50-year-old dying pro wrestlers begging for money to stay alive,

01:06:50

to pay their medical bills.

01:06:52

So there are wrestling fans who care about this shit.

01:06:54

And there are eyewear fans who care about this shit.

01:06:56

And there are Hollywood screenwriters who care about this shit.

01:07:00

The Hollywood screenwriters fired all of their agents this year

01:07:03

because three hedge funds bought the last three talent agencies and then started doing deals where

01:07:09

they were screwing their own writers. They were saying, we’ll take less money for the

01:07:12

writer we represent if you will cut us in for a bigger piece of the action for the firm.

01:07:17

So this is like horrible conflict of interest. And the Writers Guild said, cut that shit

01:07:21

out. And they said, nope, there’s only three of us and our private equity masters expect a return on their investment.

01:07:27

And so the Writers Guild said, all right, you’re all fired, right?

01:07:31

So the Writers Guild wants to help us with this.

01:07:33

As do people who care about the fact that there are only a couple of tool makers.

01:07:37

As do people who care about the fact that there’s only a couple of gas companies.

01:07:40

As do people who care about the fact that there’s only a couple of music studios.

01:07:44

And so on and so on and so on.

01:07:45

Right?

01:07:46

Like Fox just got bought out by Disney.

01:07:48

Right?

01:07:49

There’s an entire group of people who are about to be out of a job who care about the

01:07:53

fact that tech has consolidated, as has every other industry.

01:07:57

Now, before the term ecology was coined, there were people who cared about a lot of different

01:08:03

issues.

01:08:04

Some people cared about fresh air. Some people cared about fresh water. Some people cared about endangered

01:08:08

species. Some people wanted to make sure the whales were safe. Some people worried about the

01:08:12

ozone layer. They were all different fights. And the word ecology changed all of those fights into

01:08:18

one fight. That word crystallized a bunch of disparate fights into a single movement where

01:08:23

people began to have each other’s back. and we are at an ecology moment for pluralism

01:08:28

and monopoly where people actually are starting to realize that we’re all

01:08:32

looking at facets of the same fight, the same problem. And it's not a problem of

01:08:38

mind-control rays, it's a problem of market concentration. So remember, so

01:08:44

remember that big tech got big through monopoly tactics, not through mind-control

01:08:49

rays.

01:08:50

And in that regard, big tech is not exceptional, and tech exceptionalism is bullshit.

01:08:53

But there’s one way in which big tech is really important.

01:08:58

where tech is really important and different from the other industries.

01:09:01

And it’s not that it’s like the most important fight.

01:09:03

We have way more important

01:09:06

fights related to gender and racial justice, income inequality, stopping the planet from

01:09:12

cooking us all in our own pudding, and averting a future in which we all have to dig through rubble

01:09:16

for canned goods and drink our own urine. Those are way more important fights than which search

01:09:22

engine we use. But the thing is that every single one of those fights

01:09:26

will be won or lost on the internet.

01:09:29

The internet is how we find people with hard-to-find traits

01:09:32

in the world.

01:09:33

If you want to find people who want to work with you

01:09:36

on these issues from the corner that you’re working on,

01:09:38

we need the internet to do it.

01:09:40

We need the internet to organize.

01:09:41

And so although the internet and tech

01:09:43

is not the most important fight,

01:09:46

and although tech is not exceptional in most regards,

01:09:47

in this one it is.

01:09:51

Tech is the tool that we use to fix the other problems.

01:09:53

And unless we have a free, fair, and open internet,

01:09:56

we are fighting with both hands tied behind our back.

01:09:58

So, thank you.

01:10:04

So, I’m going to look at the end, and then I’ll take some questions.

01:10:11

The thing I want to say at the end here is that if you’re going to put up your hand and say,

01:10:13

what can I do personally to solve this?

01:10:14

The answer is effectively nothing.

01:10:16

No one of us can personally solve this.

01:10:21

And one of the things that 40 years of neoliberal doctrine has tried to inculcate in us is the belief that all problems are individual problems with individual solutions.

01:10:25

But I’m here to tell you that you cannot personally recycle your way out of climate change.

01:10:33

Even Elon Musk doesn’t get to dig his own subway.

01:10:36

So these are collective fights that need collective action to solve them.

01:10:42

And there are lots and lots of collective

01:10:45

bodies out there that are working on this, and I work for one of them, the

01:10:49

Electronic Frontier Foundation. But I will ask you, you know, we're a 501(c)(3)

01:10:59

nonprofit, and we make a dollar go farther than any other nonprofit I've

01:11:04

ever seen, and we do

01:11:05

important work. We've been doing it for a long time. One of our founders, who died last

01:11:09

year, Barlow, excuse me, well, where did that come from?

01:11:12

Thank you.

01:11:22

So Barlow was one of our founders.

01:11:25

He died last year.

01:11:28

He, uh, oh.

01:11:30

I’m Ashkenazi and I’m underslept.

01:11:31

Excuse me.

01:11:36

So Barlow, like, his legacy is contested now.

01:11:37

He wrote these amazing

01:11:40

documents, like the Declaration of the Independence of Cyberspace,

01:11:41

which is, you know, read widely today.

01:11:42

People argue about it.

01:11:43

They should.

01:11:43

He wasn’t a saint.

01:11:48

He was a man just like anyone else. And he got some stuff right and he got some stuff wrong.

01:11:52

But the one thing that people say about Barlow that’s completely wrong, and I can tell you because I know it and I was there, is that Barlow thought that the internet would just take care of itself

01:11:57

and that it could do no wrong, right? And that the reason he fought over regulation that seemed

01:12:02

like it would distort or break the internet was

01:12:05

because he thought that the internet should never be regulated, shouldn’t have any rules,

01:12:08

and could never be turned into a harmful enterprise. You don’t start an organization

01:12:14

like EFF if you think that the internet is just going to take care of itself, right? You start

01:12:20

an organization like the EFF because on the one hand you’re very excited about what

01:12:25

a digitally enabled world could look like

01:12:27

and on the other hand

01:12:28

you’re fucking terrified about how terrible

01:12:31

it could be, right? If they ever write something on my

01:12:33

tombstone, well it’ll probably be

01:12:34

my wife and I have a pact, it’ll be if a man lies

01:12:37

in the ground in boulders and decomposes

01:12:39

and his wife isn’t there to tell him he’s doing it wrong

01:12:41

is he still wrong and her tombstone is going to say yes?

01:12:46

Failing that,

01:12:48

I want my tombstone to say, this will

01:12:50

all be so great if you don’t screw it up.

01:12:52

That was what Barlow was there for.

01:12:54

That’s what EFF was there for. And it’s not just

01:12:56

them. But one of the

01:12:57

things that EFF has done in the last couple of years

01:13:00

is started this network of affinity groups

01:13:02

all around America and now increasingly

01:13:04

around the world called the Electronic Frontiers Alliance.

01:13:06

The guy who runs that is a burner called Shahid Buttar.

01:13:08

He’s out here on the play.

01:13:09

He primaried Nancy Pelosi last year.

01:13:12

He’s a fucking amazing dude.

01:13:14

And you can get involved with local affinity groups and do local projects that we work with you on and support.

01:13:21

So this isn’t just about writing a check and having something happen at the national policy level.

01:13:25

This is how the Oakland Privacy Group

01:13:27

managed to pass a rule that says that

01:13:29

Oakland cannot procure any new surveillance equipment

01:13:32

without public consultation.

01:13:34

So no more of these secret deals with them.

01:13:37

So that all said,

01:13:39

I do want you to write us a check, right, if you can.

01:13:42

I know it’s expensive to live in this world,

01:13:45

and we have monopoly capitalism,

01:13:47

and they’re driving down wages.

01:13:48

We have increasing income inequality.

01:13:50

But it’s a collective enterprise,

01:13:51

and one of the things that we need are hubs

01:13:53

that people organize around,

01:13:56

and that’s what EFF uses funding for.

01:13:58

So that was my little pitch for EFF at the end of this talk.

01:14:00

I really want to thank you all for being so patient

01:14:02

and listening to it.

01:14:03

This is a new kind of pitch for me, this Monopoly thing,

01:14:05

and I haven’t given this talk very often,

01:14:07

and so if it seemed a little rough around the edges,

01:14:09

I appreciate your patience.

01:14:12

You’re listening to The Psychedelic Salon,

01:14:14

where people are changing their lives one thought at a time.

01:14:19

If you’ve been paying attention to the tech news

01:14:22

coming out of Washington lately,

01:14:24

you probably know that there’s some talk about establishing somebody as the tech czar, similar, I guess, to the drug czar.

01:14:31

Well, my suggestion is to ditch the Russian connotation of the word czar, and instead appoint Cory Doctorow as our tech conscience.

01:14:41

It seems to me that he is exactly the right person for a job like that.

01:14:46

By the way, as we just heard, Google Analytics, Google Ads, and all of the other Google products

01:14:52

are simply beacons that gather information on anyone who comes to a page with a link to one

01:14:57

of those services. Well, a few years ago, John Gilmore, whose Palenque Norte talks you've heard here,

01:15:05

helped me clear away all of the connections like that from the salon’s website.

01:15:09

John, as you know, is also one of the co-founders of EFF.

01:15:14

So, with the exception of a few pages that have links to a YouTube video,

01:15:19

Google has no hooks into the salon’s website.

01:15:21

In fact, I’ve never used Google Ads or Analytics on the psychedelicsalon.com website,

01:15:27

and that is also why you don’t see like buttons

01:15:30

and similar hooks like that on the site either.

01:15:34

So I’m doing my best to prevent you from being tracked

01:15:37

when you come to the psychedelicsalon.com website.

01:15:41

In closing today, I want to repeat Cory's tribute

01:15:44

to John Perry Barlow. As you know, the Electronic

01:15:48

Frontier Foundation is the world's leading nonprofit organization that is defending

01:15:53

civil liberties here in the digital world. Now, EFF was founded in 1990 by John Gilmore,

01:16:00

Mitch Kapor, and John Perry Barlow. Now, for what it's worth, the opening and closing of every one

01:16:06

of these podcasts from the Psychedelic Salon include a word that John Perry coined, and that

01:16:12

word is cyberdelic. And in my book, The Spirit of the Internet, I also mentioned his first published

01:16:19

use of this word back in 1990 when he wrote, and I quote, The closest analog to virtual reality in my experience is psychedelic,

01:16:29

and, in fact, cyberspace is already crawling with acid heads.

01:16:34

The cyberdelic experience isn’t like tripping,

01:16:37

but it is as challenging to describe to the uninitiated,

01:16:41

and it does force some of the same questions,

01:16:44

most of them having to do with the

01:16:45

fixity of reality itself. End of quote. And keep in mind, that quote came out two years before the

01:16:53

World Wide Web was even released, so that was a really forward-looking thought that he had there.

01:16:59

Now, Barlow’s death two years ago was really a big blow to the entire psychedelic community,

01:17:05

including us here in the salon.

01:17:08

In fact, the first time that I ever met John Perry was at Burning Man,

01:17:12

and he came up and introduced himself to me after attending what was one of the very first Palenque Norte lectures,

01:17:18

the talk by Allyson and Alex Grey.

01:17:21

I only saw him a few times after that, but I’ll never forget that merry twinkle in his eyes.

01:17:26

He somehow made everybody feel like they were his new best friend. And if you go to the Salon’s

01:17:32

podcast number 565, you’ll hear our tribute to Barlow, which includes not only words from John

01:17:39

Perry himself, but also tributes by John Gilmore and Cory Doctorow as well. It’s really worth listening

01:17:47

to again, and I’ll link to it in the program notes for this podcast, which you’ll find at

01:17:51

psychedelicsalon.com. And for now, this is Lorenzo signing off from cyberdelic space. Namaste, my friends. Thank you.