Program Notes
https://www.patreon.com/lorenzohagerty
Guest speaker: Cory Doctorow
[NOTE: All quotations are by Cory Doctorow.]
“This world of computers exists in a principle-free environment. The Internet of Things is the Internet of absolute, self-serving bullshit.”
“The Internet of Things needs principles.”
“The real struggle here, it’s not making computers free, it’s making people free. The reason we want to save computers is not because computers are more important than racial justice, or gender equity, or getting rid of homophobia and transphobia, or the climate. The reason we want to make computers free and open is because we cannot win those fights without a free and open information infrastructure.”
“Considered atomically, one thing at a time, all the things computers can do, we live in an age of unparalleled wonders. But all civilizations fall, and we have the shared responsibility to the civilizations that come after us to build the infrastructure that will lead to a future in which technology exists to serve its users, not destroy their lives in service to surveillance capital, and the global war on terror, and self-serving bullshit so noxious that you can see it from orbit.”
“No one puff [of a cigarette] is going to give you cancer, but statistically, given enough puffs, you’re getting a tumor. If that tumor erupted with that drag, there would be no second drag. The reason people smoke is because the tumors happen years later. And the reason people give up their private information is because the privacy stuff that bites them in the ass almost always happens years and years later too.”
Books by Cory Doctorow
Electronic Frontier Foundation (EFF.org)
Previous Episode
521 - Risk Reduction – How You Can Help
Similar Episodes
- 474 - Important Podcast – This Affects YOU! - score: 0.84204
- 548 - The Politics of Electronics - score: 0.80459
- 529 - Privacy & Free Speech in 2017 - score: 0.80176
- 565 - John Perry Barlow Tribute - score: 0.79448
- 646 - Big Tech Needs A Conscience - score: 0.74068
- 312 - Occupy the Internet - score: 0.72477
- 371 - Civil Rights In Cyberspace - score: 0.71623
- 397 - Art and Other Disruptive Technologies - score: 0.69102
- 425 - Drug Policy, Technology and Everything Else - score: 0.68305
- 401 - Surveillance and Revolution - score: 0.65131
Transcript
00:00:00 ►
Greetings from Cyberdelic Space, this is Lorenzo and I’m your host here in the Psychedelic Salon. And today we have fellow saloners Kate M,
00:00:27 ►
Trapeze High LLC, John P, Stephen J, Oliver B, and Duncan S to thank for their donations which
00:00:37 ►
will be used to of course keep these podcasts coming your way. And I thank you one and all
00:00:43 ►
from the very bottom of my heart.
00:00:51 ►
Now, for today’s podcast, I’m going to play another of this year’s Palenque Norte lectures, and this one is by Cory Doctorow, who you’ll remember from last year’s lectures,
00:00:57 ►
as well as from reading his articles in Wired, BuzzFeed, and other publications.
00:01:02 ►
But in addition to his writing pursuits,
00:01:05 ►
Cory is also a special advisor to EFF, the Electronic Frontier Foundation,
00:01:11 ►
and it is in this capacity that he brings some important information to us today.
00:01:17 ►
Interestingly, I previewed this talk last Thursday and decided that after another round
00:01:22 ►
of Terence McKenna talks I’d play this talk by
00:01:25 ►
Cory. However, the very next day, as you already know, there was a large denial of service attack
00:01:33 ►
that took down a significant portion of the internet here in the States. The reason that I
00:01:38 ►
found this so interesting is that this DNS attack was orchestrated through a botnet running on the IoT,
00:01:45 ►
the Internet of Things,
00:01:47 ►
which is exactly what Cory was warning people about
00:01:50 ►
when he delivered this talk two months ago at Burning Man.
00:01:54 ►
And while this talk isn’t about psychoactive substances,
00:01:58 ►
it actually may be every bit as important to you
00:02:00 ►
as was last week’s risk reduction talk by Annie Oak.
00:02:04 ►
As Cory says, keeping the Internet free and open affects us all. And thanks to fellow salonner Frank Nuccio, who recorded this talk for us, we are now going to get to hear what the burners
00:02:28 ►
in the Camp Soft Landing lecture tent recently learned.
00:02:31 ►
We have, next to me, Cory Doctorow.
00:02:36 ►
Can we get a round of applause, please?
00:02:39 ►
Thank you. And Cory is a science fiction
00:02:43 ►
author, activist, journalist, and blogger.
00:02:47 ►
He is the MIT Media Lab’s activist in residence,
00:02:51 ►
working for the Electronic Frontier Foundation
00:02:54 ►
and co-founded the UK Open Rights Group.
00:03:00 ►
He also wrote two novels about the burn,
00:03:03 ►
which are definitely going to be worthwhile to check out. I’m sure you can find them online. And he’s currently residing
00:03:08 ►
in Los Angeles. So the way we’re going to go about this is he’s going to give us his
00:03:15 ►
lecture and then we’re going to do an open Q&A. So hold your questions till the end.
00:03:21 ►
And thanks for being here.
00:03:23 ►
Thanks. Thanks for having me. Thanks for coming, folks.
00:03:27 ►
So I’m going to give a little talk about infrastructure
00:03:30 ►
and what it means and what kind of infrastructure we’re building.
00:03:33 ►
So did anyone here help build the city?
00:03:35 ►
Anyone got one of these?
00:03:36 ►
So you know what infrastructure is like and what it means.
00:03:38 ►
Infrastructure is what we build everything else on.
00:03:40 ►
It’s the, you know, architecture is politics, right?
00:03:42 ►
You build a city like this and you get certain outcomes.
00:03:44 ►
You build a city of a different shape, you get a really different outcome.
00:03:48 ►
So there’s a really cool semi-apocryphal infrastructural story that historians and
00:03:54 ►
technologists like to tell about the Roman metallurgy in the era of the first chariots.
00:04:00 ►
So the state of Roman metallurgy in the era of the first chariots determined the maximum
00:04:04 ►
length of the axle. The maximum length of the axle determined the wheelbase of the chariot,
00:04:08 ►
which determined the width of the roads that the Romans built all across Europe.
00:04:11 ►
After Roman civilization collapsed, because all civilizations collapse, everything collapses.
00:04:16 ►
After it collapsed, hundreds of years later, we got the first modern roads. And those first modern
00:04:23 ►
roads were built to the width of the Roman chariots
00:04:25 ►
because they were built on top of the Roman roads,
00:04:28 ►
which was determined by the state of metallurgy
00:04:31 ►
when the first chariots were being made.
00:04:33 ►
So the modern road determined how wide a car’s wheelbase could be
00:04:37 ►
because you couldn’t build cars that were wider than the roads we already had.
00:04:41 ►
So the car determined how wide a truck’s wheelbase could be because we
00:04:45 ►
weren’t going to build special roads for trucks. You have to be able to offload and unload containers
00:04:49 ►
from trucks to trains. So that determined the maximum width of a container on a train.
00:04:55 ►
Now, the space shuttle’s reusable fuel canisters were transported by rail. And the design parameter
00:05:02 ►
of that fuel canister was prefigured and determined
00:05:06 ►
in some important way by the state of Roman metallurgy thousands of years before, right?
00:05:11 ►
So the infrastructure that we built, even after we’re gone and even after what we built has failed,
00:05:17 ►
redounds through the ages in ways that are significant and determine what kind of lives
00:05:22 ►
the people who come after us will live. So I want to talk about two different infrastructure
00:05:27 ►
projects, one of which ended really well so far, one of which is in a little bit of trouble.
00:05:31 ►
The one that went really well is the free and open source software movement. Everyone
00:05:34 ►
familiar with free and open source software? Okay, so, the free and open source software movement:
00:05:38 ►
open-licensed code, people contribute for lots of different reasons. Some people want to
00:05:42 ►
scratch their own itches. Some people feel it’s unethical to keep code a secret. Some people want to disrupt their neighbor’s business. You
00:05:51 ►
know, if you want to fuck with Microsoft, make Word free. But there’s lots of different reasons
00:05:55 ►
people build open source software. And open source and free software has been so successful that it’s
00:05:59 ►
now basically what we use. I mean, yeah, people carry MacBooks and Winbooks, but they use them as dumb terminals to talk to Linux in the cloud, right? Twenty years since Linux
00:06:08 ►
was founded, and that’s where all the action is. Your iPhone is a Unix box, your, you know,
00:06:15 ►
open source Unix box, your Android phone is a free and open source software Unix box.
00:06:19 ►
It’s just, it’s open source software all the way down. It won, right? It won in a way that
00:06:24 ►
was, like, inconceivable 15 years ago.
00:06:27 ►
And then there’s the free and open web.
00:06:29 ►
Free and open web also built for lots of different reasons.
00:06:32 ►
Some people wanted to create a more democratic world
00:06:35 ►
that was really decentralized.
00:06:36 ►
Some people wanted to get unbelievably fucking rich.
00:06:39 ►
Some people wanted to do both.
00:06:40 ►
Some people wanted to find a better way
00:06:43 ►
to talk to each other about subjects that
00:06:45 ►
they weren’t allowed to talk about in the real world. You know, there’s this story that, like,
00:06:48 ►
pornographers are hardcore technologists. It’s not that pornographers are hardcore technologists.
00:06:54 ►
It’s that, like, if you’re not allowed to say things that are important to you using the
00:06:58 ►
existing channels, then it’s worth your while investing time figuring out how to use these
00:07:02 ►
new janky channels where you can talk, because you go from, like, not being able to talk to being able to talk. Whereas for everybody else, it’s
00:07:08 ►
like, why would you bother learning how to use the dumb text-only internet if you can use all the
00:07:13 ►
other media to talk to each other, right? So people built it for porn, and they built it to talk about
00:07:17 ►
drugs, and they built it to talk about terrible things and wonderful things and to meet people and
00:07:21 ►
to do all of that stuff. And we, many of us, believed that we
00:07:25 ►
could build something by building a free and open web that would be genuinely transformative,
00:07:31 ►
that would be infrastructure on which we could build more free, more open societies.
00:07:36 ►
But a funny thing happened on the way to the 21st century. If you haven’t noticed,
00:07:40 ►
the web is not very free and open anymore. It’s become unbelievably centralized, right?
00:07:48 ►
The web and finance capital have become welded at the hip in a way that is a kind of monopolist wet dream
00:07:51 ►
and makes the biggest surveillance dreams of the Stasi
00:07:55 ►
look unambitious by comparison.
00:07:58 ►
Now, how is it that the free and open source software movement
00:08:01 ►
ended up continuing to make free and open source software, while on the free
00:08:06 ►
and open web, kind of like a lot of those people either cashed out or gave up or just found
00:08:11 ►
themselves shouting into a storm that completely overpowered them? So I have a theory about this,
00:08:18 ►
and it revolves around something that economists call a Ulysses pact. You know the story of Ulysses,
00:08:22 ►
right? Ulysses wanted to sail into the
00:08:25 ►
siren-infested waters where the sirens sang from the sea. And if you heard their song, you would
00:08:31 ►
jump into the sea and drown and they’d eat you. And the normal kind of protocol for dealing with
00:08:36 ►
sirens, the ISO-approved siren protocol, was to fill your ears with wax. But Ulysses was a hacker.
00:08:42 ►
He wanted to hear the sirens but not jump into the sea.
00:08:51 ►
And so before he got to the siren waters, he had his sailors lash him to the mast so that he couldn’t jump into the sea. So Ulysses knew that there would be a moment in the future when he
00:08:55 ►
would be tempted by weakness. And he knew that he had a moment now where he was strong. And he used
00:09:00 ►
a promise that he himself couldn’t break to bind himself when he was strong so that he couldn’t give in when he was weak.
00:09:08 ►
And the free and open source software movement has an amazing Ulysses pact.
00:09:12 ►
It’s the open source licenses.
00:09:14 ►
And one thing that all of those licenses share is that they are irrevocable.
00:09:18 ►
Once you make your source code open, you’re not allowed to close it again.
00:09:22 ►
And human beings have lots of moments of weakness, right? There are lots of times when people might be tempted to close their code, just
00:09:28 ►
like there were lots of times when people were tempted to close the web. But they weren’t
00:09:32 ►
able to. They weren’t able to because at the moment when they were idealistic and starting
00:09:36 ►
out and fresh and full of beans, they made a promise to the rest of the world that they
00:09:41 ►
were not allowed to break, that they would never close their code. So, like, if you started a company and got 30 of your best friends to quit their jobs
00:09:47 ►
and put their mortgage on the line to make free and open source software with you, and
00:09:51 ►
your company was almost out of money and your investor said, I’m going to put all of your
00:09:54 ►
friends on the breadline and their kids are not going to have a college fund unless you
00:09:58 ►
take all your software and close it, you literally couldn’t close it. It didn’t matter, right?
00:10:03 ►
Gun to your head, you couldn’t close the code.
00:10:07 ►
So the free and open source software movement had a Ulysses pact, and they hung on to it.
00:10:10 ►
But the web didn’t.
00:10:12 ►
The web didn’t have any way for those of us
00:10:15 ►
who were tempted or in a moment of weakness
00:10:17 ►
or in a moment of foolishness
00:10:20 ►
or in a moment of extremis
00:10:22 ►
or in a moment of greed
00:10:24 ►
to stop our weak selves
00:10:27 ►
from trumping the dreams of our strong selves. And as a consequence, 20 years later, we moved
00:10:33 ►
inch by incremental inch from “don’t be evil” to surveillance capitalism on the web.
00:10:40 ►
Now, this matters because the web has metastasized, right? The most salient fact
00:11:05 ►
about all the technology that you use all day long, the things that are in your body, the things that touch your body, the things that you put your body inside of, the supercomputers in your pockets that you use to throw birds at pigs but that know all the people you know and all the places you go with them and all the things you talk to them about and how to get into your bank account and what your lawyer emailed you last week,
00:11:08 ►
all of those things,
00:11:09 ►
the most salient fact about them
00:11:11 ►
is that they have a networked computer inside of them.
00:11:14 ►
This is the era of the networked computer
00:11:16 ►
and the networked computer is a theory-free zone
00:11:19 ►
with no principles that we have all agreed on
00:11:23 ►
that networked computers should have
00:11:26 ►
to make sure that we build a free and open web
00:11:28 ►
and not the terrible, closed dumpster fire of a web.
00:11:33 ►
So let’s get a sense of just how much stuff there is
00:11:37 ►
that is a networked computer in a box
00:11:40 ►
that we think of as not a computer but something else.
00:11:43 ►
A car.
00:11:43 ►
A car is a computer you put your body into that whips you down the road at 60 miles an hour, and you hope that the
00:11:48 ►
software does what it’s supposed to. A baby monitor is a camera and a microphone connected
00:11:52 ►
to the internet in your baby’s room that you hope only you can see. And if you remember
00:11:59 ►
last January, the San Francisco papers were full of stories about a mom whose three-year-old
00:12:03 ►
said, mommy, mommy,
00:12:05 ►
the phone in my room, called it the phone. The phone in my room keeps talking to me at night,
00:12:09 ►
and it scares me. And she walked by one night, and some rando was talking out of the baby monitor
00:12:14 ►
and swearing at her baby. And she walked into the room, and the baby monitor’s steerable camera
00:12:20 ►
turned around to look at her. And this rando’s strange voice said, uh-oh, mommy’s in the
00:12:25 ►
room, right? So a baby monitor is a computer you put your baby into, right? A hearing aid, right?
00:12:31 ►
Like, a hearing aid is not a beige retro hipster analog transistorized gadget. It’s a general
00:12:37 ►
purpose computer in your head that knows what you hear and can make you hear things that aren’t
00:12:41 ►
there or stop you from hearing things that are there, depending on how it’s configured.
00:12:46 ►
A voting machine is a computer we put a democracy inside of.
00:12:49 ►
Hospitals are computers we put sick people inside of.
00:12:54 ►
Seismic dampers, right?
00:12:56 ►
You’ve been to the major cities of the world
00:12:58 ►
that have been colonized by the finance industry.
00:13:01 ►
You’ll see that one of the apex predator signifiers
00:13:04 ►
of the finance industry
00:13:05 ►
is that when you’ve really made it,
00:13:06 ►
you build a skyscraper
00:13:08 ►
that looks like Dr. Seuss designed it.
00:13:09 ►
It’s unbelievably tall and willowy
00:13:11 ►
and you can’t imagine
00:13:12 ►
how it manages to stay up.
00:13:13 ►
The way that it stays up
00:13:14 ►
is they put a seismic damper.
00:13:15 ►
It’s a huge mass of concrete,
00:13:17 ►
or sometimes water,
00:13:19 ►
attached to a computer
00:13:20 ►
and it helps the building
00:13:21 ►
lean back into the wind
00:13:22 ►
and into seismic stresses.
00:13:25 ►
Like a skyscraper is a computer we put bankers inside of.
00:13:31 ►
And so, as I say,
00:13:33 ►
this world of computers exists in a principle-free environment.
00:13:37 ►
The Internet of Things
00:13:39 ►
is the Internet of absolute self-serving bullshit.
00:13:43 ►
And self-serving bullshit is the thing that we use
00:13:45 ►
Ulysses pacts to guard ourselves against. Because when you are in an environment in which someone
00:13:50 ►
says, look, it’s just a tiny compromise that you need to make to your principles to allow all those
00:13:57 ►
people to pay their mortgages, or to allow your company to be sold so that you can go on to do
00:14:01 ►
something else that’s even better, or to allow
00:14:05 ►
just a little bit of centralization, how bad could it possibly be? Or it’s probably okay to design a
00:14:12 ►
service like Dropbox in which we ask everyone to put all of the information that is valuable to
00:14:16 ►
anyone anywhere in the world and could compromise them, and we probably won’t ever leak 65 million
00:14:21 ►
passwords and 32 million hashes like we did last week, right?
00:14:30 ►
Self-serving bullshit is the thing that we use Ulysses pacts to guard ourselves against,
00:14:36 ►
and the Internet of Things is made of self-serving bullshit. So the self-serving bullshit starts with the Internet of Things business model, right? So the Internet of Things business model is I’m
00:14:40 ►
going to make some hardware. Hardware has a 2% margin. That margin goes to 0% if you become
00:14:45 ►
successful because your Pacific Rim contractor starts running a third shift on your factory,
00:14:50 ►
producing your same product and selling it out the back door at a 0% margin or even less than
00:14:56 ►
you’re paying for it. And so you make no money on the hardware, right? And so what’s the IoT
00:15:00 ►
business model? How do you get a financier, a VC, to give you enough money to make baby
00:15:06 ►
monitors and cars and hospitals and all the rest of the IoT things? The way you get them
00:15:11 ►
to do that is you tell them that you’re going to sell something else around your Internet
00:15:16 ►
of Things thing. You’re going to sell services. So you’re going to say, only we can approve
00:15:21 ►
the software that runs on this, and we’re going to charge people money for the software,
00:15:24 ►
and we’ll charge the software vendors,
00:15:25 ►
like an app store, right?
00:15:26 ►
I’m going to make a thermostat,
00:15:28 ►
and if you want to add a security camera to the thermostat,
00:15:31 ►
you’re going to have to go through me,
00:15:34 ►
and I’m going to charge you a license fee,
00:15:35 ►
and then I’m going to charge the customers for it,
00:15:37 ►
and I’m going to get them coming and going.
00:15:38 ►
Or you say, and/or you say,
00:15:41 ►
I’m going to collect a lot of data on my customers,
00:15:43 ►
and somewhere down the line, we’re going to figure out a way to turn data into money, right?
00:15:48 ►
Somewhere out there is a way to collect all this data and monetize it.
00:15:51 ►
And so they collect a lot of data.
00:15:53 ►
Now, most of these businesses have a six-month to one-year runway.
00:15:57 ►
And they’re collecting tons of data.
00:16:00 ►
And they’re locking up their devices so that the users can’t reconfigure them. And every dollar that they spend on securing those devices,
00:16:08 ►
beyond the minimum viable security that keeps them from actually bursting into flame
00:16:12 ►
when you take them out of the box,
00:16:14 ►
is a dollar they don’t have to spend on runway to keep the doors open
00:16:17 ►
while they’re waiting to either be acquired or sometimes go public
00:16:21 ►
or maybe become a viable business.
00:16:23 ►
And so none of those companies spend one penny more than they need
00:16:27 ►
to make the security stuff happen.
00:16:29 ►
So now we have these proliferating devices in our bodies, around our bodies,
00:16:33 ►
everywhere we look, that are designed with security last.
00:16:36 ►
You know, safety third, security last, as their overwhelming principle,
00:16:42 ►
designed and built by people whose plan is that in six months
00:16:45 ►
they’ll either be in a different line of business
00:16:47 ►
or any problems that emerge
00:16:48 ►
from using this smart light bulb
00:16:50 ►
with a Wi-Fi access point on your LAN
00:16:52 ►
that can talk to your phones, computers, and cameras,
00:16:56 ►
any of those problems are either going to be
00:16:57 ►
the problems of the people who bought their company
00:16:59 ►
and not theirs,
00:17:00 ►
or they’re going to be no one’s problem
00:17:02 ►
because they’ll be out of business.
00:17:04 ►
So the Internet of Things needs principles.
00:17:08 ►
But it gets worse because not only is the Internet of Things a place in which self-serving
00:17:13 ►
bullshit rules and we have this toxic business model that causes people to build insecure
00:17:17 ►
devices that harvest huge amounts of your data, but it exists in a legal environment
00:17:22 ►
that is absolutely dysfunctional in terms of how we think of computers
00:17:26 ►
and regulate them.
00:17:27 ►
So for example, there’s a law
00:17:28 ►
passed in the United States
00:17:29 ►
called the Digital Millennium Copyright Act, DMCA.
00:17:32 ►
You may have heard of it.
00:17:33 ►
It’s the author of all your favorite YouTube videos.
00:17:35 ►
This video has been removed
00:17:36 ►
thanks to a claim by the DMCA.
00:17:38 ►
So the DMCA is a big, gnarly hairball of a law.
00:17:41 ►
It has lots of clauses.
00:17:42 ►
It has one clause that’s really important,
00:17:44 ►
Section 1201. Section 1201 of the DMCA says that if you have a copyrighted work in a device
00:17:50 ►
and you design the device so that it protects the copyrighted work, it stops the user from
00:17:57 ►
accessing it, that tampering with that device to allow the user or the owner of that device to access the copyrighted work is illegal,
00:18:06 ►
even if the person who’s bypassing it does so for a legal reason.
00:18:11 ►
I know that’s super complicated.
00:18:12 ►
I’m going to dig into it here.
00:18:14 ►
What it means is that if you design, say, Netflix, Netflix serves video.
00:18:23 ►
We know in America you have the right to record the video that comes into your house.
00:18:27 ►
You don’t have the right to, like, burn DVDs of it and sell them on the street on a blanket,
00:18:30 ►
but you have the right to record videos.
00:18:31 ►
But Netflix adds this kind of layer of protection around their video.
00:18:35 ►
And to record the video without Netflix permission, you have to remove that layer of protection.
00:18:40 ►
So it’s legal to record the video, but it’s not legal to remove the layer of protection to record the video.
00:18:45 ►
So then it’s illegal to record the video.
00:18:47 ►
So companies can take their commercial preferences, the thing that they would rather you not do, because they can make more money.
00:18:53 ►
Maybe Netflix can promise no home recording to the studios that give them videos, and they get more video, and that’s good for them.
00:18:59 ►
And so they can take that business preference,
00:19:01 ►
“we wish that people weren’t allowed to record our videos,” and they can convert it into a legal obligation.
00:19:08 ►
You may not record videos, even though you may.
00:19:11 ►
You may not because you have to break this lock in order to do it.
00:19:14 ►
And it’s a super hardcore fine and criminal charge for removing that lock.
00:19:26 ►
The maximum penalty for a first offense for reconfiguring your own computer to do something that you’re allowed to do is a five-year prison sentence and a $500,000
00:19:33 ►
fine, right? So this means that the Internet of Things business model where, sorry, I forgot to
00:19:40 ►
take my scarf off here. The Internet of Things business model where we’re trying desperately
00:19:44 ►
to figure out how to monetize this low-margin hardware,
00:19:47 ►
has a kind of gift from the U.S. government
00:19:49 ►
in which the U.S. government says,
00:19:52 ►
if you merely design your product
00:19:55 ►
so that in order to do something
00:19:58 ►
that makes it less profitable,
00:19:59 ►
you have to tamper with a lock,
00:20:01 ►
the government will make sure no one can do that.
00:20:03 ►
So if you come up with,
00:20:05 ►
like the inkjet printer business model, where adding something that your device uses up periodically
00:20:10 ►
is a thing that you make a lot of money on because you charge really high margins for that
00:20:17 ►
consumable product, for the paper, for the light bulb and the light socket, for the inkjet ink,
00:20:21 ►
for the insulin and the insulin pump. If you design that device so that using unofficial product in it requires tampering with a digital lock,
00:20:32 ►
then you can ask the government to enforce your right to charge people up the ass for consumables
00:20:40 ►
for your products, right? So there have always been people who try to make expensive spares and parts
00:20:46 ►
for their stuff, right?
00:20:47 ►
You know, the car whose windshield wiper blade
00:20:49 ►
turns out to cost, like,
00:20:50 ►
more than a whole new windscreen,
00:20:52 ►
and it’s this funny shape.
00:20:54 ►
But usually, like, markets don’t solve all our problems.
00:20:56 ►
Usually markets solve that problem.
00:20:57 ►
If someone out there is getting a, you know,
00:20:59 ►
one bajillion percent margin,
00:21:01 ►
you would expect someone else to go out there
00:21:02 ►
and, like, make a competing product
00:21:04 ►
with a half bajillion percent margin. And you get a race to the bottom. Eventually,
00:21:08 ►
we’re all paying marginal cost for things. But if you can outsource the enforcement of
00:21:13 ►
your business model to the government, then you can create these super high margin,
00:21:18 ►
kind of abusive monopolistic marketplaces. And you can command a lot more money than you would.
00:21:25 ►
So if you’re in the Internet of Things business and worried that you’re going to run out of runway in six months
00:21:28 ►
and living on 2% hardware margins, this is your opportunity to maximize your revenue.
00:21:34 ►
And if you don’t take up the U.S. government on that offer, you’re a sucker, right?
00:21:58 ►
So, the, sorry, I can’t even read my notes. Oh, yeah. So, there’s a form of self-serving bullshit that is kind of latent in this business model that I call denialism, like climate denialism. Denialism usually has this one characteristic,
00:22:06 ►
which is there’s a thing that I really want to be able to do,
00:22:08 ►
like drive my car,
00:22:09 ►
and it would be better for me if reality was different
00:22:13 ►
and driving my car didn’t have any bad consequences.
00:22:16 ►
Therefore, I will deny that there are any problems with driving my car.
00:22:19 ►
So early denialism, we had cancer denialism.
00:22:22 ►
So Camel used to advertise that their cigarettes
00:22:25 ►
were the best cigarettes for track and field athletes, right?
00:22:29 ►
And today we have a different kind of denialism
00:22:32 ►
in the kind of Internet of Things, surveillance capitalism, dumpster fire,
00:22:35 ►
which is privacy denialism, right?
00:22:37 ►
You have Mark Zuckerberg saying privacy is not part of our contemporary norms, right?
00:22:42 ►
Well, he says this even as he’s buying the four houses
00:22:45 ►
on either side of his home in Palo Alto
00:22:47 ►
so that no one can stick a long lens out his window
00:22:50 ►
or out their window and take a picture of him and his family,
00:22:53 ►
even as he’s buying the hundred acres
00:22:55 ►
around his beach house in Hawaii
00:22:57 ►
so that no one can get close enough to see what he’s doing.
00:23:00 ►
What Mark Zuckerberg means when he says privacy is dead
00:23:03 ►
is if your privacy was
00:23:05 ►
dead, I would have more money, right? But privacy denial allows him to say something that is
00:23:14 ►
self-serving and make it sound like he’s saying something that’s true.
00:23:20 ►
So one of the elements of Section 1201 of the DMCA that may not be obvious at first glance,
00:23:27 ►
this thing that lets you put a lock on something and then prohibit people from removing the lock,
00:23:31 ►
is that in order to figure out whether things that are locked up are secure,
00:23:35 ►
a lot of the times you have to take the lock off, right?
00:23:37 ►
So if you’ve got a gadget in your house, like a Dropcam, or you’ve got a gadget in your pocket like a phone,
00:23:44 ►
and you want to know whether all of that data that potentially is horribly compromising to you is being adequately
00:23:49 ►
protected and encrypted before it’s sent to the company and lots of other things, then you often
00:23:54 ►
have to remove that lock or bypass that lock to get at the device. Well, security researchers
00:24:00 ►
are significantly endangered by Section 1201 of the DMCA. In America, we actually
00:24:05 ►
once put a security researcher in jail for revealing flaws in Adobe’s e-book protection
00:24:11 ►
system. And security researchers last summer went to the copyright office during a regular hearing
00:24:17 ►
about this dumb law to explain all the different ways in which this dumb law gets in the way of
00:24:23 ►
them telling you about stuff that they know
00:24:25 ►
about the devices that you rely on, because if they reveal these defects they found,
00:24:30 ►
then they could potentially face criminal and civil liability. So like they said,
00:24:34 ►
oh, we found showstopper bugs in insulin pumps and heart monitors and pacemakers and
00:24:38 ►
implanted defibrillators and baby monitors and cars and tractors and voting machines and medical instruments
00:24:46 ►
and on and on and on. And we know these things are broken and you shouldn’t use them. And we
00:24:52 ►
can’t tell you about it. Because if we do, we could face significant civil and criminal liability.
00:24:59 ►
So we are designing a civilization built around computerizing and networking all of the things that when it finally goes, like Rome went, like every other civilization has gone, is going to leave behind a kind of infinitely pluripotent terribleness of secrecy, monopolism, abuse, and spying.
00:25:24 ►
Systems that are designed to do this.
00:25:25 ►
Whatever we build on top of this,
00:25:27 ►
if this is what’s left behind,
00:25:29 ►
when we’re done and the next thing comes along,
00:25:31 ►
that is going to be designed for and pushed towards
00:25:35 ►
just the worst, most dystopian future you can imagine.
00:25:38 ►
I speak in my professional capacity
00:25:40 ►
as a dystopian science fiction writer.
00:25:43 ►
So we need some principles.
00:25:45 ►
We need a Ulysses pact to defend the web that we’re going to make
00:25:49 ►
because we’re going to try to re-decentralize the web.
00:25:51 ►
That’s what everyone’s trying to do now.
00:25:53 ►
All the people who care about this stuff woke up one day and said,
00:25:56 ►
this is not the web we wanted.
00:25:57 ►
Let’s figure out how to make the web more like the web we wanted.
00:26:00 ►
And we need principles.
00:26:01 ►
We need principles like the free software movement had principles
00:26:03 ►
in order to lock that web open, in order to make sure that the infrastructure that we build today remains intact tomorrow when the pirates that we are today become admirals who change their minds about what it was that was good and bad and decide that when we were disrupting things and making them more open and free, that was legitimate progress.
00:26:26 ►
But now that we’re running things, anyone who does the same to us is just a crook.
00:26:31 ►
So you’re familiar with these kinds of principles.
00:26:35 ►
They occur in lots of places.
00:26:36 ►
The U.S. Constitution is a set of these principles.
00:26:39 ►
But so are the ten principles of Burning Man.
00:26:41 ►
And I have two principles.
00:26:43 ►
Two, I think, is a good, easy-to-remember number.
00:26:46 ►
Maybe we’ll get a three someday. That’s the rule of threes. But I have two principles I’m going
00:26:49 ►
to propose to you for the web that we are going to build and lock open. The thing that we need
00:26:54 ►
to make sure is core to this project that we all want to work on to remediate this bad stuff before
00:27:01 ►
it gets too bad. The first principle is that if you own a computer and that computer gets an instruction from someone else and that contradicts an instruction
00:27:10 ►
that you gave to that computer, the computer should always obey you without exception.
00:27:16 ►
Right? Thank you. The second principle is that if there is a fact, a true fact about
00:27:24 ►
a computer that you rely on
00:27:25 ►
that suggests that it’s not as reliable as you think it is,
00:27:30 ►
then it should always be legal under every circumstance
00:27:32 ►
to disclose that fact.
00:27:34 ►
We cannot afford to put corporations
00:27:36 ►
in charge of deciding who gets to embarrass them
00:27:39 ►
by revealing the dumb mistakes they made
00:27:41 ►
when they designed their stuff.
00:27:44 ►
Thank you.
00:27:45 ►
So those are the two principles, right?
00:27:47 ►
Principle one, user always gets to decide what the computer does.
00:27:51 ►
Principle two, always legal to know what your computer is doing.
00:27:54 ►
And I charge you today to be hardliners for those principles,
00:27:58 ►
to be fanatical on those principles.
00:28:00 ►
If they’re not calling you an unrealistic zealot about those principles,
00:28:04 ►
you are not trying hard enough. So as you heard, I work for Electronic Frontier Foundation.
00:28:11 ►
There’s some EFF people here. EFF is like the free software movement, like Burning Man,
00:28:17 ►
about the same vintage, 30 years old. And EFF 30 years ago was founded on this bizarre
00:28:22 ►
idea that civil liberties would have some nexus with technology, which at the time just sounded weird and dumb to a lot of people.
00:28:30 ►
And it’s a testament to what’s happened in the last 30 years that it seems axiomatic today.
00:28:35 ►
How could you possibly talk about a civil liberties fight without talking about technology? Whether it’s, like, investigating mass graves in Syria
00:28:45 ►
or figuring out how to deal with police shootings in America,
00:28:50 ►
every one of these are intensely technological activities,
00:28:53 ►
and civil liberties and technology are bound up together.
00:28:56 ►
And at EFF, I helped start a project called Apollo 1201,
00:29:01 ►
1201, as in Section 1201 of the DMCA.
00:29:03 ►
And the idea of Apollo 1201 is to get rid of all digital locks
00:29:08 ►
that restrict how users can use their own technology
00:29:11 ►
within a decade all over the world.
00:29:14 ►
So we’re doing that.
00:29:15 ►
Excuse me, I need some water.
00:29:17 ►
Where’d my water go?
00:29:18 ►
Oh, that chair.
00:29:22 ►
We’re doing that.
00:29:23 ►
So we are making targeted interventions
00:29:27 ►
in lots of different places
00:29:28 ►
in which digital locks are being strengthened
00:29:31 ►
to weaken them,
00:29:32 ►
to make it harder for those locks
00:29:34 ►
to have the power of law
00:29:35 ►
and to make it easier for people
00:29:36 ►
to change those locks out.
00:29:39 ►
So for example,
00:29:39 ►
there’s an organization also about 25 years old
00:29:42 ►
called the World Wide Web Consortium, W3C. It makes the open
00:29:46 ►
standards that runs the web. If you’re making a browser, what you do is you download the W3C
00:29:50 ►
specifications and make sure your browser conforms to them. And the W3C, for decades, was the guardian
00:29:57 ►
of the open web. Everything they did had in mind this idea that anyone should be able to make the web and that the user should be
00:30:05 ►
in charge of their web browser. But about three years ago, for a whole variety
00:30:10 ►
of reasons that would take me forever to get into, the W3C felt desperate.
00:30:15 ►
They felt like they were being sidelined and marginalized and they needed to bring in more
00:30:19 ►
members and they needed to do more work. And so they decided that they would standardize digital locks
00:30:25 ►
as part of the open web
00:30:26 ►
so that if you implemented a full W3C spec browser,
00:30:31 ►
it would become a potential felony
00:30:32 ►
to disclose vulnerabilities in it or add features to it.
00:30:36 ►
And some of those features might just be innovation,
00:30:38 ►
like, you know, a VCR for your Netflix.
00:30:43 ►
And some of them might be assistive innovations,
00:30:46 ►
like here’s a machine learning system
00:30:48 ►
that can watch a video and add a narrative track
00:30:51 ►
for people who are visually impaired.
00:30:53 ►
And so while you’re watching the video,
00:30:54 ►
a machine voice tells you what’s going on in it.
00:30:57 ►
And it’s against the law
00:30:59 ►
to bypass the W3C’s digital locks
00:31:02 ►
in order to engage in these activities.
00:31:06 ►
And so we went to the W3C because we’re members and we said, all right, we think you shouldn’t
00:31:10 ►
do this. And they said, well, we’re going to do it.
00:31:13 ►
And we said, okay, then, what you need to do is make sure you protect the open web.
00:31:16 ►
You need to make sure that everyone who joins the organization has to promise that they
00:31:21 ►
will never use laws like the DMCA to attack people who unlock browsers,
00:31:27 ►
to report security vulnerabilities,
00:31:29 ►
to add new legal features,
00:31:31 ►
and to extend accessibility.
00:31:33 ►
And the W3C has a vote coming up on this.
00:31:36 ►
We have about 20 members
00:31:38 ►
who have committed to voting on our side for this,
00:31:41 ►
to stop all further progress on this
00:31:43 ►
until the W3C finds a way to protect people.
00:31:47 ►
And I think we’re going to win.
00:31:48 ►
It could happen any day.
00:31:49 ►
I’ve been checking my email here on the playa
00:31:51 ►
just because we don’t know when it’s going to happen,
00:31:53 ►
but it could happen any day.
00:31:54 ►
And that group of people includes, like, blockchain startups
00:31:57 ►
and also metadata consortia and universities,
00:32:01 ►
like Oxford University
00:32:02 ►
and the Royal National Institute for the Blind.
00:32:04 ►
It’s a really big church that we’ve gathered
00:32:07 ►
that think that this is the right thing to do.
00:32:09 ►
So we’re going to keep digital locks out of the open web.
00:32:12 ►
But we’re going further than that.
00:32:14 ►
You know, wait, there’s more.
00:32:16 ►
We last month, excuse me,
00:32:18 ►
or I guess two months ago now, is it September yet?
00:32:22 ►
Two months ago, we sued the U.S. government. And we sued the U.S. government to
00:32:26 ►
invalidate Section 1201 of the DMCA and make it legal for
00:32:33 ►
people to reconfigure the computers that they own so they do what they want them to do. And we’re
00:32:37 ►
representing two awesome clients. One of them is a burner who’s here on the playa, Bunnie Huang,
00:32:41 ►
who’s at the Phage camp in the Institute. If you’ve seen the brain car going around on the playa,
00:32:46 ►
the people on the brain,
00:32:48 ►
they have these awesome open source hardware blinky badges
00:32:51 ►
that they can use to summon the brain.
00:32:53 ►
Bunnie designed those badges.
00:32:55 ►
He is like the hardware hacker’s hardware hacker.
00:32:57 ►
And we’ve represented him before.
00:32:58 ►
He’s the guy who broke the DRM digital locks on the Xbox
00:33:01 ►
and let you install Linux on your Xbox back in the day.
00:33:05 ►
And he’s still at it. He’s super hardcore for this stuff. And we’re representing him. He wants to
00:33:10 ►
make tools that allow people to do more with digital video, even if they have to bypass a
00:33:14 ►
digital lock. And we’re also representing a total security ninja, hardcore security professional at
00:33:21 ►
Johns Hopkins University called Matthew Green. Matthew Green has a National
00:33:25 ►
Science Foundation grant to study things like voting machines and the black boxes that are
00:33:29 ►
used to process payments at payment centers and medical systems. And we are suing the U.S.
00:33:34 ►
government on behalf of both of these researchers because both of them, or both of these clients,
00:33:38 ►
because both of them face criminal liability if they go ahead with these activities that would
00:33:42 ►
otherwise be legal, because they have to remove locks. And when we win, the DMCA will cease to be enforceable. And some of
00:33:48 ►
you, are any of you not American? Okay, so those of you who aren’t American might be saying,
00:33:53 ►
ha ha, look at those dumbass Americans with their shitty internet law. I have news for you. For like
00:33:59 ►
the last 15 years, the U.S. trade representative has been patient zero in a global epidemic of shitty internet law.
00:34:06 ►
And every country that trades
00:34:08 ►
with the U.S. in a significant way,
00:34:09 ►
with a couple of exceptions, has adopted their
00:34:11 ►
own version of this law at the behest
00:34:14 ►
of the U.S. So if you’re an Australian,
00:34:16 ►
that’s the U.S.-Australian Free Trade Agreement.
00:34:17 ►
And if you’re in the EU, that’s laws that
00:34:19 ►
implement Article 6 of the EUCD.
00:34:22 ►
Or if you’re in Canada, where I’m from,
00:34:23 ►
embarrassingly enough, we passed a law saying that we would do this in 2011. And it’s one thing to get it wrong in 1998,
00:34:30 ►
because we had a lot of dumb ideas about technology in 1998. You remember tiled backgrounds,
00:34:37 ►
tiled moving backgrounds for web pages in 1998? But if you’re still having dumb ideas about
00:34:41 ►
technology in 2011, you are seriously not paying attention, right?
00:34:46 ►
That is like felony stupid.
00:34:48 ►
But all of those countries, they have those laws that amount to a kind of suicide pact.
00:34:52 ►
We promise in Canada and Hungary and Australia and New Zealand, all through the EU,
00:34:57 ►
we promise that we will stop our businesses from going into these profitable lines of work
00:35:02 ►
in which we remove locks to allow people to get more out of their technology
00:35:05 ►
if America promises to do the same.
00:35:07 ►
Well, when America stops doing it,
00:35:09 ►
why would those other countries do it?
00:35:11 ►
If you’re an activist in one of those countries
00:35:13 ►
and there are activists in all of those countries
00:35:15 ►
who think that this stuff is terrible,
00:35:17 ►
then when the U.S. stops enforcing this law,
00:35:20 ►
you have just been given the tool
00:35:22 ►
that you need to dismantle the law
00:35:23 ►
in your own territory.
00:35:27 ►
Because there’s one true fact about suicide pacts: suicide pacts are mutual. If the U.S. doesn’t jump off the cliff, neither
00:35:32 ►
should your country. So that’s what we’re going to do. We’re going
00:35:36 ►
to kill these digital locks everywhere in the world.
00:35:39 ►
And when I say this to people, they ask me if I’m pessimistic or if I’m optimistic about
00:35:44 ►
the future of technology.
00:35:45 ►
Do you think you’re going to win
00:35:45 ►
or do you think you’re going to lose?
00:35:47 ►
And, excuse me.
00:35:49 ►
And, like I said,
00:35:51 ►
I’m a science fiction writer.
00:35:53 ►
And science fiction writers
00:35:54 ►
are in the business
00:35:55 ►
of very badly predicting the future.
00:35:57 ►
If you add up all of the so-called predictions
00:36:00 ►
science fiction writers have made
00:36:01 ►
about the future,
00:36:02 ►
there are a few that came true
00:36:04 ►
in the same way that
00:36:04 ►
if you throw enough darts with your eyes closed,
00:36:06 ►
there are a few that will hit the board.
00:36:08 ►
But the success rate of science fiction as a predictive literature is somewhere between statistically zero and zero.
00:36:14 ►
And so I’m not in any position to predict the future,
00:36:18 ►
and optimism and pessimism are about predicting the future.
00:36:21 ►
What will the future be like?
00:36:23 ►
But it doesn’t matter, right?
00:36:24 ►
Because if you were optimistic,
00:36:25 ►
say you thought that we were going to win
00:36:28 ►
and that the open web would come back
00:36:31 ►
and be locked open and be the web
00:36:33 ►
that when our civilization crumbles
00:36:35 ►
and the next one comes after us
00:36:37 ►
would be the template for what came after us,
00:36:41 ►
then because those stakes are so high,
00:36:43 ►
every morning you should get up
00:36:45 ►
and do everything you can to make sure that comes true.
00:36:46 ►
And if you were pessimistic,
00:36:48 ►
if you thought
00:36:49 ►
that we were going to lose
00:36:52 ►
and that we were going to end up using
00:36:54 ►
entertainment technology as the camel’s nose
00:36:56 ►
under the tent by which we
00:36:58 ►
ended up with a surveillance society
00:37:00 ►
that made Kafka look like
00:37:03 ►
a My Little Pony,
00:37:04 ►
if we were going to get Huxley into the
00:37:06 ►
full Orwell, you should get up every morning and do everything you can to make sure that doesn’t
00:37:12 ►
happen. So rather than optimism or pessimism, I’m going to take you back to the 2008 U.S.
00:37:20 ►
presidential race and ask you to brief for hope. So hope is the necessary but insufficient
00:37:27 ►
precondition for making things better when you don’t know how to make them better, when you don’t
00:37:32 ►
know the overall solution. Hope is why when your ship sinks in the open sea, you tread water.
00:37:37 ►
Not because you have any chance of being picked up in any realistic way, but because everyone who
00:37:41 ►
has ever been picked up treaded water until rescue arrived.
00:37:45 ►
Hope is finding one thing you can do to make stuff better
00:37:49 ►
and doing that thing and seeing if having done that,
00:37:53 ►
you can think of the next thing to do.
00:37:54 ►
Because the first casualty of any battle is the plan of attack.
00:37:57 ►
We don’t need to know how to get from A to Z.
00:38:00 ►
We just need to know how to get from A to B
00:38:02 ►
and which way to look to find C when we get there.
00:38:05 ►
And so I’m going to ask you to have hope.
00:38:07 ►
And hope has ways that you can implement it, right?
00:38:12 ►
Hope isn’t just a feeling.
00:38:13 ►
It’s a thing you do.
00:38:15 ►
I think we live in a moment of peak indifference to this stuff.
00:38:19 ►
I think that, like, for the last 25, 30 years that EFF was around,
00:38:24 ►
most of that time we spent trying to
00:38:25 ►
convince people that they needed to care about this. Now, I think that that problem is somewhat
00:38:31 ►
self-solving in as much as every couple of weeks from now on, a couple of million people are going
00:38:37 ►
to have their lives destroyed because something bad like this happened. Just think about that
00:38:41 ►
horrible Dropbox breach that happened last week. All of that confidential information.
00:38:45 ►
If you’ve been on the playa, Dropbox got horribly, horribly breached.
00:38:49 ►
So all of those nude photos and confidential documents and the deeds to your house
00:38:53 ►
and everything else you keep in Dropbox potentially compromised right now.
00:38:57 ►
Every couple of weeks from now on, there’s going to be a couple of million people
00:39:01 ►
who are going to show up and say, Ashley Madison just happened to me,
00:39:04 ►
or Office of Personnel Management just happened to me, or Dropbox just happened to me. What do
00:39:08 ►
we do about this? So in an era of peak indifference, from now on, the number of people who care about
00:39:14 ►
this stuff only goes up, right? We still don’t have a majority who care about it, but that number
00:39:19 ►
is only going to grow from now on. And it would be great to get those people who aren’t yet
00:39:26 ►
technologically savvy, because of course technological literacy just increases over time. It would
00:39:30 ►
be great to get those people to care about it because they outnumber the technologically
00:39:33 ►
savvy. But there’s a lot of technologically savvy people who don’t really get this stuff.
00:39:38 ►
EFF’s got tens of thousands of members. Reddit’s got millions of users. So there’s a lot of
00:39:43 ►
low-hanging fruit out there.
00:39:45 ►
There are a lot of people in your life right now who you know who are deep nerds, who think about
00:39:52 ►
technology all day long and don’t think about this stuff. And I’ve got a project for you. I want you
00:39:57 ►
right now to take a minute and think of two deep nerds that you know who care about technology but
00:40:03 ►
don’t know about this stuff, don’t think about this stuff.
00:40:06 ►
Think about those two people and commit to
00:40:08 ►
yourself in the next week to
00:40:09 ►
sit down with those two people, just two.
00:40:12 ►
This is your only project
00:40:13 ►
that you need to do to try and
00:40:16 ►
make a change right now. Sit down with
00:40:18 ►
those two people and have this conversation
00:40:19 ►
with them. Tell them about the two principles.
00:40:21 ►
Computers should obey their owners.
00:40:23 ►
If there’s a true fact about the security of a computer that you use, you have the right to know about
00:40:28 ►
it. Sit down and have that conversation with them. And then ask them to have that conversation.
00:40:33 ►
And if your success rate is 50%, one in two, we will start a cascade that will maybe get
00:40:39 ►
us from 1% of all the deep nerds to 10%. And at 10%, we start to reach a tipping point.
00:40:45 ►
So that’s your project.
00:40:47 ►
Now, you may feel like
00:40:52 ►
every day you go out there and use technology
00:40:56 ►
in a way that is not great for you
00:40:58 ►
and the people around you.
00:41:00 ►
Maybe that’s like you feel guilty
00:41:01 ►
about the carbon emissions of your car,
00:41:03 ►
or maybe you’ve got one of those new electric cars
00:41:04 ►
that just runs on smug.
00:41:07 ►
But maybe you feel like, I use Facebook all the time, and I am part of the problem here.
00:41:15 ►
Or maybe you feel like, my fruit-flavored devices, they’re the alpha and omega of digital locks,
00:41:21 ►
and I throw money at them every week.
00:41:24 ►
Or maybe you feel like my search engine is part of the problem and not part of the solution.
00:41:28 ►
And it may be tempting to feel like you are not pure enough to even start.
00:41:33 ►
You first have to get yourself in order before you can help anyone else.
00:41:36 ►
But you never will, right?
00:41:38 ►
There is never a way that you are going to be able to just use technology in a way that reflects the principles of the world that you want to see until that world is there. And if you wait until that technological option is open to you,
00:41:50 ►
you’ll be waiting forever because we’ll never build that world without it.
00:41:53 ►
You know, there’s this thing, right, where like every vegetarian eventually meets a vegan,
00:41:57 ►
right? And every vegan eventually meets one of those Jains who sweeps the ground in front of
00:42:01 ►
them so they don’t step on bugs and wears a mask so they don’t inhale them, right? No matter how pure you are, you are not pure enough. There’s still room
00:42:09 ►
to be more pure. And if you allow yourself to be paralyzed by the feeling that you’re not doing
00:42:14 ►
enough in your personal life, you’ll never get anywhere. So I have an alternative to that paralysis
00:42:19 ►
and it’s called hedging. So I got this from Danese Cooper. Some of you who are involved in free and open
00:42:24 ►
source software will know Danese.
00:42:26 ►
Danese says she does this thing.
00:42:27 ►
Every month she sits down and she adds up how much money she spent that month on companies that are destroying the future she wants to live in.
00:42:34 ►
How much money she gave a cable operator that briefs against net neutrality or a company that uses digital locks as their beginning and their end.
00:42:44 ►
She adds up all that money and she takes that amount of money,
00:42:47 ►
and she gives it to activist groups fighting to make the world better.
00:42:51 ►
Now, I have a favorite activist group.
00:42:53 ►
I work for EFF. I’m a contractor there.
00:42:55 ►
They don’t actually pay me anything.
00:42:57 ►
My wages are paid, my contracting fees are paid through a grant from MIT.
00:43:01 ►
I’ve worked on and off for EFF for about 15 years now, I think.
00:43:05 ►
Is that right, Cindy? Yeah.
00:43:07 ►
And I’ve never seen an organization squeeze a dollar until it hollers more.
00:43:11 ►
They spend money incredibly well.
00:43:13 ►
But 15 years ago, when I joined EFF,
00:43:15 ►
there weren’t a lot of other organizations doing what we do.
00:43:18 ►
There was the Free Software Foundation,
00:43:20 ►
there was Public Knowledge, one or two more.
00:43:21 ►
In the last 15 years, because computers have metastasized into the world,
00:43:27 ►
every organization that has a brief to make the world better
00:43:30 ►
also has a component that works on digital stuff.
00:43:33 ►
And so the ACLU has become deeply technological,
00:43:35 ►
and so has Black Lives Matter,
00:43:37 ►
and so have many other organizations that are involved in this struggle in the round.
00:43:41 ►
And, you know, the real struggle here, it’s not making computers free,
00:43:45 ►
it’s making people free.
00:43:47 ►
The reason we want to save computers
00:43:48 ►
is not because computers are more important
00:43:50 ►
than racial justice or gender equity
00:43:52 ►
or getting rid of homophobia and transphobia
00:43:55 ►
or the climate.
00:43:56 ►
The reason we want to make computers free and open
00:43:58 ►
is because we cannot win those fights
00:44:00 ►
without a free and open information infrastructure.
00:44:03 ►
That’s how we get to a better world. And so find those organizations. I do think EFF should be one of them. I’ve been
00:44:09 ►
an EFF donor for longer than I’ve worked for EFF. Give them a hedge. Part of the money
00:44:16 ►
that you are spending every month on companies that are destroying the future you want to
00:44:20 ►
live in, give that to organizations who are trying to make the world better.
00:44:26 ►
Because as a lawyer who wrote an amazing white paper about the future of copyright law and
00:44:32 ►
3D printing once said, this is going to be great unless we fuck it up.
00:44:37 ►
Because for all the scary things about computers, if you’ve wandered out of the dust for my
00:44:42 ►
talk, you are probably someone who really likes
00:44:45 ►
computers and has had some really great experiences with computers. And considered atomically, one
00:44:52 ►
thing at a time, all the things computers can do, we live in an age of unparalleled wonders. But all
00:44:58 ►
civilizations fall. And we have the shared responsibility to all of us, all the people, all the burners, all the people in the world,
00:45:06 ►
but also to the civilizations that come after us to build the infrastructure
00:45:10 ►
that will lead to a future in which technology exists to serve its users,
00:45:17 ►
not destroy their lives in service to surveillance capital and the global war on terror
00:45:23 ►
and self-serving bullshit so noxious that you can see it from orbit.
00:45:27 ►
Thank you.
00:45:33 ►
So we have a little time for questions.
00:45:35 ►
How much time do we have?
00:45:38 ►
45 minutes.
00:45:39 ►
Really? That’s awesome.
00:45:41 ►
Okay, so I like to, because tech things often turn into a bit of a sausage fest in the Q&A,
00:45:48 ►
I like to call alternately on people who identify as female or non-binary and then male or non-binary.
00:45:52 ►
So we’ll start with someone.
00:45:53 ►
If there’s someone female or non-binary who’d like to start, thank you.
00:45:57 ►
Thank you, Cory.
00:45:59 ►
Thank you.
00:46:00 ►
So you gave us two fundamental principles there, and I want to talk about the first one.
00:46:07 ►
Computers, respecting and honoring the relationship with, I don’t use users, but owners.
00:46:13 ►
Yeah, sure.
00:46:14 ►
So quick story.
00:46:16 ►
I had a giant tumor discovered last year.
00:46:19 ►
I’m sorry.
00:46:19 ►
I finally found a surgeon, a doctor, who was in Hollywood to deal with this situation.
00:46:25 ►
Time comes for surgery.
00:46:26 ►
I had a very hard time getting
00:46:28 ►
it done. Couldn’t figure out why. I get
00:46:30 ►
out of surgery. There are
00:46:32 ►
no computers working in the
00:46:34 ►
hospital for days.
00:46:36 ►
And they couldn’t tell us why.
00:46:39 ►
Come to find
00:46:40 ►
out this is a… North Hollywood Methodist.
00:46:41 ►
This was the Bitcoin ransomware
00:46:44 ►
hack. Right.
00:46:46 ►
So my doctors ninja’d my surgery
00:46:47 ►
and made it happen.
00:46:49 ►
But what I saw
00:46:50 ►
were dozens of professionals
00:46:52 ►
who didn’t trust themselves
00:46:53 ►
because they did not realize
00:46:56 ►
that they were capable
00:46:57 ►
of doing their jobs
00:46:57 ►
without computers.
00:46:59 ►
Everyone in that hospital
00:47:00 ►
did their job.
00:47:02 ►
But because they had come to
00:47:03 ►
give so much of themselves through
00:47:05 ►
computers, they didn’t recognize that they were capable and that they were strong and
00:47:10 ►
able to collaborate effectively without them. So how do we help people get to that place
00:47:15 ►
so that the relationship is less codependent and fucked up?
00:47:20 ►
I hear what you’re saying. I think that there’s a goldilocks spot,
00:47:25 ►
there’s a sweet spot in how we rely on computers.
00:47:27 ►
Like, do you remember phone numbers anymore?
00:47:30 ►
Like, being good at remembering phone numbers
00:47:32 ►
took up mental capacity that you could use
00:47:34 ►
to remember other stuff, right?
00:47:36 ►
And, like, it’s true that when you don’t have access
00:47:38 ►
to your address book, you’re in trouble.
00:47:42 ►
But it’s not like you took all that time and brainpower that you used to spend
00:47:46 ►
remembering phone numbers and devoted it to watching sitcoms. You took that mental overhead
00:47:54 ►
space and you turned it into something cool. And the reason doctors and nurses and med techs use
00:48:00 ►
all this cool stuff that’s in their hospital is because it lets them focus on being doctors and med techs and stuff.
00:48:06 ►
It lets them, like, in an ideal world,
00:48:08 ►
having a computer that beeps
00:48:10 ►
when something’s going wrong
00:48:12 ►
with your circulation or your breathing
00:48:14 ►
frees you up to pay attention to the patient
00:48:17 ►
and what they’re doing
00:48:18 ►
instead of doing the mechanical business.
00:48:22 ►
And also, like, people suck at repetitive tasks, right?
00:48:25 ►
Computers are good at doing the same thing all day long.
00:48:28 ►
And so if you want to make sure
00:48:29 ►
that you don’t accidentally give lethal doses of meds to people,
00:48:34 ►
yeah, humans can do that.
00:48:36 ►
But can humans do it monotonically 10 million times in a row
00:48:38 ►
without ever screwing up?
00:48:40 ►
And your doctors were able to perform your surgery,
00:48:42 ►
but it took forever.
00:48:43 ►
So how many surgeries didn’t get done at the hospital that day because they lacked automation, right?
00:48:48 ►
So I think that, like, yes, it’s great to have fallback plans.
00:48:52 ►
We totally need fallback plans.
00:48:54 ►
Pilots need to know how to land planes if the computers stop working.
00:48:57 ►
And you should know how to operate your car if the thing that keeps your braking distance steady stops working, and all of that other stuff.
00:49:05 ►
We definitely need to have that fallback thing. But, you know, we have a lot of people on Earth
00:49:10 ►
doing a lot of stuff, and we use automation, a thing that we’ve done for thousands of years.
00:49:17 ►
We use automation to free us up to do more abstract, cool things. And we lose something when
00:49:23 ►
we automate. Like, there is a dimension that goes away when you automate. And it’s sometimes something important. If you read
00:49:28 ►
back to the dawn of the written word, the people who memorized literature said that we’re losing
00:49:34 ►
something really important because the experience of a book you’ve memorized is different from the
00:49:38 ►
experience of a book you’ve read. That’s totally true. But we traded that experience of the five
00:49:44 ►
books you could memorize in your entire lifetime
00:49:46 ►
for the ability to read five books a week for the rest of your life.
00:49:51 ►
And so I hear what you’re saying, right?
00:49:55 ►
But I think that it would be, we can’t turn back, right?
00:50:00 ►
We’re not going to degrow to the level where we can turn out, you know,
00:50:06 ►
Portland-grade artisanal brain surgeons who can do the surgeries with the tools that were
00:50:11 ►
developed by the pioneers when they crossed the Donner Pass, right? Like, they’re going to have
00:50:15 ►
to, like, we’re not going to fix everyone’s tumors with awesome computers. I think it’s about learning
00:50:20 ►
to trust ourselves and our collective capacity the same way we learn to trust the decentralized collective.
00:50:27 ►
Wow.
00:50:28 ►
I hear what you’re saying.
00:50:29 ►
I just – and I think that, like, yes, having that trust in yourself is – we should do drills, right?
00:50:34 ►
We should have plan Bs.
00:50:35 ►
We should have fallbacks.
00:50:36 ►
Power goes out.
00:50:37 ►
Stuff happens.
00:50:38 ►
That should all be there.
00:50:40 ►
But I don’t think we get there by, like, saying, okay, let’s just figure out how to just do this stuff by hand all the time.
00:50:46 ►
That should just be our fallback.
00:50:47 ►
Really, we should just fix the terrible IT problems with hospitals, which are deep and structural.
00:50:53 ►
Some of it is hospitals.
00:50:55 ►
If you were a hospital and you said, I’m going to build a new wing, and the firm of engineers showed up and they said, we’re going to build a giant atrium like this.
00:51:04 ►
And we’re going to put a piece of reinforced steel joist along the ceiling,
00:51:09 ►
and we’re going to calculate the load stresses so that your hospital doesn’t fall down.
00:51:12 ►
But we’re not going to tell you the math that we use to calculate the load stresses,
00:51:16 ►
because that’s our secret sauce.
00:51:17 ►
You shouldn’t buy a hospital from those people.
00:51:19 ►
But if you build a hospital, which is a computer you put sick people into,
00:51:22 ►
and you hire a software contractor to design the stuff without which your hospital is just a building we put people in to die,
00:51:29 ►
that software contractor routinely says, we’re not going to tell you how any of the software works.
00:51:34 ►
That’s our secret sauce. In this situation, they put the toxic skull and crossbones of poison
00:51:40 ►
as a photo on every computer. So you can imagine what that did
00:51:45 ►
to the collective consciousness of that experience.
00:51:47 ►
Yeah, it was pretty terrible.
00:51:48 ►
And there were like nine hospitals in a row, right?
00:51:51 ►
It wasn’t, Hollywood Methodist was the first,
00:51:52 ►
but there were a bajillion.
00:51:54 ►
Are there any male-identified people or non-binary people?
00:51:57 ►
Yeah, go ahead.
00:51:59 ►
Just had a question to follow up on your point
00:52:01 ►
about privacy, actually, and whether it’s dead.
00:52:06 ►
I was wondering about how you balance off privacy with ways to deal with resource scarcity.
00:52:12 ►
So something like ride sharing or Uber or anything broader than that,
00:52:15 ►
or even if you go to the electricity grid,
00:52:17 ►
to be able to kind of stave off the collapse that you were talking about.
00:52:20 ►
Is there a sweet spot between privacy and resource conservation that we can find,
00:52:25 ►
or is it just a fool’s errand? I don’t know. I mean, privacy, remember, is not secrecy. Privacy
00:52:30 ►
is like deciding, you get to decide who gets to know your stuff, right? Privacy isn’t like no one
00:52:35 ►
knows your stuff, right? And it’s not even that like people may not know the general shape of
00:52:43 ►
your stuff. I know what you do when you go to the Port-a-San, right?
00:52:46 ►
But it takes a special person to leave the door open.
00:52:49 ►
Privacy is like you
00:52:50 ►
just getting to have that agency
00:52:52 ►
about who gets to know your stuff.
00:52:54 ►
And so there may be
00:52:56 ►
situations like, you know, if you’re
00:52:58 ►
camping with a bunch of other people
00:52:59 ►
at Burning Man, you have to tell
00:53:02 ►
them when you’re arriving and they have to tell
00:53:04 ►
you where the camp is
00:53:05 ►
because that’s a thing that you all do for your own purposes.
00:53:08 ►
But I think that the way that privacy has evolved
00:53:11 ►
in the kind of surveillance marketplace that we have now
00:53:14 ►
is that rather than having these kind of quid pro quos
00:53:17 ►
where it’s a kind of harm reduction,
00:53:21 ►
least amount of information we need
00:53:23 ►
that people freely give and get, and that is
00:53:26 ►
stored as little as possible and destroyed as quickly as possible. We instead have a kind of
00:53:31 ►
casino where we design these kind of slot machine interfaces that give you a periodic, intermittent
00:53:40 ►
reward for disclosing your private information that are really designed to get you to disclose more information than you would if you were like thinking well about it and moreover
00:53:49 ►
that we then have the tools we use like our browsers and our phones that are also designed
00:53:54 ►
to not let you decide which information you disclose. Like, by default, all of those tools
00:54:01 ►
leak tons of information, and they leak it at very low levels, like
00:54:06 ►
the unique identifier of your phone that’s picked up by Stingrays, which are these fake
00:54:09 ►
cell towers that were prototyped at Burning Man for ten years, according to one of the
00:54:13 ►
Snowden leaks. This is where the DHS did all their kind of, what would it look like if
00:54:17 ►
we had to surveil a whole bunch of people who were in the desert with no serious organization?
00:54:23 ►
The reason Stingrays work is that your phone is on the phone company’s side,
00:54:28 ►
not your side.
00:54:29 ►
And so the unique identifier
00:54:30 ►
that your phone beacons to the tower
00:54:32 ►
to let them know who to bill
00:54:35 ►
for the call you just made,
00:54:37 ►
that unique identifier
00:54:38 ►
is not supposed to be user changeable
00:54:40 ►
because then you could bill your calls to other people.
00:54:42 ►
And so when you then build a tower
00:54:44 ►
that says,
00:54:45 ►
hi, I’m a cell tower, what’s your unique identifier?
00:54:47 ►
The phone beacons that information back
00:54:50 ►
and you can’t spoof it by default.
00:54:52 ►
I mean, now that we know about these stingrays,
00:54:55 ►
people are trying to find ways to do it.
00:54:57 ►
But that’s going to be a serious arms race
00:54:59 ►
because for reasons that are of economic survival,
00:55:03 ►
there are a lot of phone carriers who would be really upset
00:55:05 ►
if every time you logged onto the network,
00:55:07 ►
your phone had a different unique identifier, right?
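A toy Python sketch of the dynamic described above; it models nothing of real cellular signaling (IMSI, GSM/LTE, authentication), only the one property at issue: a device that answers every identity request with a fixed, non-user-changeable identifier can be catalogued by any tower that asks:

```python
# Toy model only; real phones and real IMSI catchers are far more complicated.
class Phone:
    def __init__(self, identifier: str):
        # Fixed at manufacture; the user can't change it, because the carrier
        # bills calls against this identifier.
        self._identifier = identifier

    def respond(self, tower_name: str) -> str:
        # The phone has no way here to tell a real tower from a fake one,
        # so it answers every identity request the same way.
        return self._identifier


class FakeTower:
    """Stand-in for a Stingray-style device: it just asks every phone in range who it is."""
    def __init__(self):
        self.captured = []

    def sweep(self, phones):
        for phone in phones:
            self.captured.append(phone.respond("totally-a-real-tower"))


phones = [Phone(f"00101-{n:05d}") for n in range(5)]
catcher = FakeTower()
catcher.sweep(phones)
print(catcher.captured)  # every fixed identifier in range, unspoofed by default
```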
00:55:10 ►
So there are these deep problems with this kind of disclosure,
00:55:14 ►
and one of them is that we have devices that, by design,
00:55:17 ►
don’t let us choose which information goes out.
00:55:19 ►
We have this other problem, which is that we have systems
00:55:21 ►
that are designed to convince us to give up more information than we want.
00:55:24 ►
And then we have the third problem, which is that privacy is very hard
00:55:28 ►
to do iterated improvement with. So iterated improvement is how we get better at stuff.
00:55:33 ►
Back in the old days of film cameras, the average American family was taking two rolls of film a
00:55:38 ►
year, one at Christmas and one on the family holiday. And you get the film back from the lab
00:55:43 ►
and you go, that’s terrible, that’s terrible.
00:55:45 ►
Oh, that’s good, that’s terrible, that’s terrible.
00:55:47 ►
But no one took notes about what made the good ones.
00:55:50 ►
It was just random chance that made the pictures good.
00:55:53 ►
And now you’ve got the distraction rectangle
00:55:55 ►
and when you take a picture with the distraction rectangle,
00:55:58 ►
you get a two-second look at what just happened.
00:56:01 ►
And so you get better.
00:56:02 ►
We got so much better, so fast at
00:56:05 ►
taking pictures that you can start a billion-dollar business whose goal is to make the
00:56:09 ►
picture look like it was not taken at a Sears studio, but rather taken by someone who hadn’t
00:56:15 ►
gotten good at it because it looks inauthentic. Like our photos are so good by default now,
00:56:20 ►
right? So privacy, you make a privacy disclosure and like sometime in the future that will come back and bite you in the ass.
00:56:27 ►
Not everyone. But if you make enough of them, one of them will eventually.
00:56:31 ►
Like if you smoke cigarettes long enough, no one puff is going to give you cancer.
00:56:35 ►
But statistically, given enough puffs, you’re getting a tumor. Right.
00:56:39 ►
If that tumor erupted with the drag, there would be no second drag, right? The reason people smoke is
00:56:47 ►
because the tumors happen years later. And the reason people give up their private information
00:56:51 ►
is because the privacy stuff that bites them in the ass almost always happens years and years
00:56:56 ►
later too. So these three structural problems with privacy, right? One is we can’t iterate.
00:57:02 ►
One is our tools are crappy, and one is companies try to trick us.
00:57:05 ►
And between those three, it’s really hard to imagine what a utopian
00:57:11 ►
ride-share, resource-sharing system would look like that was respectful
00:57:17 ►
of privacy. And we can kind of control the first two: we can make businesses better and we
00:57:21 ►
can make tools better. I don’t really know what we do about the third one. That kind of public health problem where when you do a thing,
00:57:29 ►
the consequences are really far down the road and then you don’t learn,
00:57:35 ►
that’s a problem that we really struggle with.
00:57:37 ►
We struggle with it in the context of nutrition and smoking and STIs
00:57:42 ►
and long-term personal health
00:57:46 ►
and lots of other problems.
00:57:47 ►
Those are really hard to solve,
00:57:48 ►
but maybe if we solve the first two,
00:57:50 ►
the third one won’t be quite so important.
00:57:52 ►
So if that’s a thing you’re worried about,
00:57:54 ►
work on solving those first two technological problems.
00:57:58 ►
Are there any female-identified or non-binary people
00:58:00 ►
who would like to ask the next question?
00:58:01 ►
Yeah, go ahead.
00:58:05 ►
Hi.
00:58:06 ►
So in Makers, you talked about 3D printers, right,
00:58:11 ►
that had copyrighted, you know, 3D printer goop
00:58:15 ►
that makes the things.
00:58:17 ►
And, you know, in my experience,
00:58:20 ►
one of the best things about 3D printers
00:58:22 ►
is for all those little plastic bullshit things that you get, like randomly from Ikea or for some device that are specially shaped and that, you know, it takes, it’s a pain in the ass to order from the internet, right?
00:58:43 ►
does it cover things like modifying your 3D printer
00:58:44 ►
to take arbitrary
00:58:47 ►
3D printer goop
00:58:48 ►
and
00:58:49 ►
do you worry about
00:58:53 ►
things like the weird little
00:58:54 ►
shaped plastic thing that goes in your shelf
00:58:57 ►
being copyrighted
00:58:59 ►
by large corporations
00:59:00 ►
and
00:59:01 ►
do you worry about things like overreach there causing us not to be
00:59:08 ►
able to print those things as well?
00:59:10 ►
Sure. Those are really good questions. I’m going to break it apart into a couple of different
00:59:15 ►
things. So first of all, gizmos, if they have any copyright, it’s a very thin copyright,
00:59:22 ►
and usually they don’t, right? But remember, if you have to bypass a digital lock,
00:59:28 ►
the DMCA says that you’re potentially breaking the law.
00:59:32 ►
And so even though you can, in theory,
00:59:37 ►
print out every gizmo,
00:59:39 ►
because there’s not a copyright there,
00:59:41 ►
there might be a patent issue or a trademark issue,
00:59:43 ►
but there’s almost certainly never a copyright issue there.
00:59:48 ►
Even though you have that right,
00:59:49 ►
just like you have the right to make a VCR for Netflix,
00:59:51 ►
because you have to bypass a lock to do it,
00:59:54 ►
that right can be taken away from you.
00:59:55 ►
So I do worry that that’s a potential thing here.
00:59:58 ►
In terms of whether or not the goop is protected by law,
01:00:01 ►
our theory, our legal theory,
01:00:03 ►
the theory under which we’re suing the U.S. government,
01:00:05 ►
is that although to date the industry
01:00:08 ►
and some of the courts have treated it that way,
01:00:10 ►
is that they’re wrong,
01:00:11 ►
that this is inconsistent with the Constitution.
01:00:14 ►
So, like, I don’t want to say on behalf of EFF
01:00:17 ►
that once you put a lock on a printer,
01:00:21 ►
it’s against the law to break the lock,
01:00:22 ►
but I will say that once you put the lock on the printer, there are people who believe it’s against the law to break
01:00:30 ►
the lock. It’ll be hard to raise capital. You might get legal threats that could destroy you,
01:00:35 ►
even if you could eventually prevail if you had an unlimited litigation budget.
01:00:39 ►
You might face criminal liability and have to go to court and face long prison sentences that, you know, 97% of the people who get indicted in America plead guilty because the potential prison sentence is so long that most people plead guilty to a lesser charge.
01:00:55 ►
And, like, it’s inconceivable that DAs pick the guilty party 97% of the time, right?
01:01:01 ►
Like, they would have to be psychic to do it.
01:01:03 ►
So, like, we have kind of solid statistical evidence that there’s some shenanigans there. And when you create
01:01:09 ►
giant, terrible legal penalties for activities that are legitimate, you make it possible
01:01:14 ►
to shut down those activities just by the threat of invoking that law, even if that
01:01:19 ►
law might not pass muster over the course of a full litigated investigation. So it’s not as simple as saying
01:01:28 ►
the law says you can’t unlock your printer and put new goop in. But it is the case that the law makes
01:01:34 ►
doing that extremely fraught, and we need clarity on that, and
01:01:41 ►
clarity on things like whether or not your printer is doing what it says it does. Because if you’re making structural elements and not
01:01:46 ►
shower rings, you want to make sure that your printer does what it says it does.
01:01:51 ►
You just want to know that your printer, which is a network computer inside your LAN, you
01:01:55 ►
just want to know it’s right. Like Ang Cui, this security researcher, a few years ago,
01:02:01 ►
he did this work on HP ink jets. There’s 100 million HP ink jets in the field. A printer is a computer that you put ink into. HP ink jets, you can
01:02:11 ►
reprogram them by putting a new firmware blob, a new operating system, in the postscript
01:02:16 ►
of a text, of a Word file with no authentication step. And he showed that he could send your HR department a file called
01:02:25 ►
resume.doc, and when they printed it, he would then own the printer, could open a reverse
01:02:31 ►
shell to his laptop, crawl your LAN from the inside looking for computers vulnerable to
01:02:37 ►
zero days or known vulnerabilities, take them over and exfiltrate all the data on your LAN.
01:02:42 ►
Right? So, like, I don’t care if it’s a smart light bulb or a printer or a 3D printer,
01:02:48 ►
if it’s in your home and it’s on your LAN, you should be able to know how it works.
01:02:53 ►
Leaving aside the goop problem, just at the level of the network stack,
01:02:57 ►
the network interface, and the chip, you should be able to know what it’s doing.
01:03:01 ►
And if you don’t like what it’s doing, you should be able to change what it’s doing. That just feels totally axiomatic to me. We had a person over there.
01:03:11 ►
I’ll get to you in a bit. This gentleman, I think, was next.
01:03:19 ►
Two years ago, I saw a fascinating conversation in a live theater in New York City between Stephen Colbert and the CEO of Google, Eric Schmidt. And it started with, at one point, Eric told
01:03:32 ►
Stephen that the motto, the philosophy of their development design systems at Google
01:03:40 ►
is do no harm. That’s their internal motto. In that same conversation,
01:03:45 ►
shortly thereafter, Stephen asked Eric, what is Google working on in the future? And Eric said,
01:03:52 ►
we’ve developed algorithms that will essentially make all of your decisions for you. Now, for me,
01:04:00 ►
that represents, you know, when you think about a five-year-old growing up and never developing the critical analysis muscle inside them of making their own decisions, that seems to me to be pretty potentially harmful.
01:04:12 ►
And my question is, are these large corporations, when they do something like that, when there’s this seeming disconnect, when Facebook or Google do something like that, are they lying to themselves?
01:04:23 ►
Are they lying to them and
01:04:25 ►
to us? How can there be this huge disconnect that moves them in a direction of funneling
01:04:32 ►
things away from the greater potential that you’re talking about and say with a straight
01:04:36 ►
face what they do to the public?
01:04:38 ►
So I think it’s a little of all of the above, right? Like having a computer make some of
01:04:41 ►
your decisions for you makes awesome sense, right? Like, the decision of which of all the web pages on the web is the one that I should be reading right now if I’m looking for information on this search term, that’s a decision I’m happy to hand to a computer so that we can dispense with it
01:05:06 ►
and get into higher-level abstract stuff,
01:05:09 ►
also really good.
01:05:12 ►
But, so I think that there’s probably people
01:05:13 ►
in the organization who are like,
01:05:15 ►
this is a chewy, awesome problem,
01:05:17 ►
and let’s get on it and see what we can do.
01:05:19 ►
There’s probably some people who are like,
01:05:21 ►
this feels like a way that we could monetize
01:05:23 ►
in some really creepy ways. And organizations, they’re not monolithic, right? They have
01:05:31 ►
lots of different parts of it. There’s probably people who are like, we have an earnings call
01:05:36 ►
at the end of the quarter, and we’re going to have to explain to our major investors what we’re
01:05:40 ►
doing. And this would be great on our earnings call. All of that stuff is happening in the organization.
01:05:47 ►
And I think that the difference between
01:05:52 ►
I’ve designed a computer program that helps you make decisions,
01:05:55 ►
and that’s horrible and nightmarish,
01:05:57 ►
and I’ve designed a computer program that helps you make decisions
01:05:59 ►
and that lets you get a lot more done,
01:06:01 ►
is all the stuff that we use to make everything work better.
01:06:04 ►
Peer review, the ability of people to turn it off, the ability of people to investigate
01:06:13 ►
how it works independently and validate its claims.
01:06:16 ►
So there’s another law like the DMCA called the Computer Fraud and Abuse Act, CFAA.
01:06:20 ►
And it was passed in the 80s when America didn’t really have any anti-hacking statutes
01:06:24 ►
to speak of.
01:06:26 ►
There would be this embarrassing stuff where someone
01:06:27 ►
would break into a database and take a bunch of
01:06:29 ►
sensitive information. They would go,
01:06:31 ►
do we charge them with the theft of a microwatt
01:06:33 ►
of electricity because they didn’t have a statute?
01:06:36 ►
Rather than try and find something that
01:06:37 ►
expressed what conduct on
01:06:39 ►
a computer was illegal, because that would
01:06:41 ►
be a super technical challenge.
01:06:43 ►
As soon as you were done, you’d have to start over again, because computers change really fast. They came up with the broadest
01:06:49 ►
possible definition, really, which is if you exceed your authorization on a computer that you don’t
01:06:55 ►
own, you can potentially commit a felony. Well, in the contemporary world, your authorization to
01:07:02 ►
use a computer, according to the companies that make
01:07:05 ►
online services, is their terms of service, right? It’s like the 28,000 words that no one has ever
01:07:12 ►
read that we know says something like, by being dumb enough to use this service, you agree that
01:07:19 ►
we’re allowed to come over to your house and punch your grandmother and wear your underwear and make
01:07:22 ►
long-distance calls and eat all the food in your fridge. And violating those terms has been treated as a felony by prosecutors
01:07:29 ►
and treated as a crime by companies. So the American Civil Liberties Union has launched a
01:07:34 ►
targeted attack on the Computer Fraud and Abuse Act. It could be wider, but their initial attack
01:07:41 ►
here is that there are academics who want to do things like create a whole bunch of profiles for online services,
01:07:48 ►
some of which look like they’re poor black people, some look like rich white people, some look like everything in between,
01:07:53 ►
and see if they get offered different financial products in breach of federal lending laws.
01:07:58 ►
And to do so, they have to violate the terms of service, and they want a court to tell them whether or not that would violate the Computer Fraud and Abuse Act, and whether the First Amendment then trumps
01:08:08 ►
the Computer Fraud and Abuse Act. And so this part of it, right, if you want to make a thing
01:08:16 ►
that lets you make decisions, helps you make decisions, which we’re doing a lot of, right,
01:08:20 ►
that’s what machine learning is, it’s like, tell me of all these things, which is the thing that’s
01:08:23 ►
most like the thing that I want. If you’re going to do that, you have to have the
01:08:28 ►
ability to independently audit what’s going on, not just because people might be evil, right?
01:08:34 ►
But because sometimes people are wrong, right? Sometimes people make dumb decisions. Like,
01:08:39 ►
alchemy was like science, except they never told each other what they were learning because the
01:08:43 ►
first person to figure out how to turn lead into gold had the victory condition.
01:08:47 ►
So every alchemist discovered for himself in the hardest way possible that you shouldn’t drink mercury.
01:08:51 ►
And, like, it took 500 years for alchemists to turn something base into something precious.
01:08:55 ►
They turned superstition into science by telling other people what they thought they knew
01:09:00 ►
and, like, letting their enemies tell them that they were idiots for getting it wrong. And so if we can’t do that in those online systems, then yeah, they are
01:09:09 ►
going to turn people into uncritical people, and they are going to magnify all of our social
01:09:13 ►
problems. Machine learning takes all of our social problems and gives them a kind of
01:09:19 ►
empirical face wash that makes all of the social problems that we have today that are problems of bias and systemic problems look like problems of math that are just woven into the fabric of the universe.
01:09:31 ►
So Patrick Ball, who’s camping around here, does a lot of work on this.
01:09:35 ►
And, you know, he did some work on predictive policing.
01:09:37 ►
Well, how do you do predictive policing?
01:09:39 ►
You take all the crime that’s been reported historically and you ask a machine learning system where the next crimes are going to be committed. But you only find crime where you look for
01:09:47 ►
it. So if you go to a neighborhood where everyone is black and you make each and every one of
01:09:51 ►
them turn out their pockets, once a week or so, you will find all the weed in that neighborhood.
01:09:56 ►
So the computer will then decide that all of the weed that there is to be found is in
01:10:01 ►
that neighborhood, right? And the computer will send the cops to go look for weed in that neighborhood.
01:10:07 ►
The cops will only find weed where they look for it in that neighborhood.
01:10:10 ►
And the computer will then say, I was right, and look at how much more weed we just found.
01:10:16 ►
Let’s do it some more, right?
01:10:17 ►
When you, you know, inhaling your own flatulence, right?
01:10:19 ►
And so unless we can examine the training data that goes into the system,
01:10:26 ►
and unless we can interrogate it and have a critical discourse about whether the training data is complete and representative,
01:10:34 ►
then all we do is we make decision-making systems that take all the bad decisions that we used to make out of, like, atavistic, you know, bias,
01:10:43 ►
out of like atavistic bias,
01:10:46 ►
and we turn them into things that computers tell us to do and that are even harder to refute or refuse
01:10:50 ►
because they feel like they live in some empirical realm.
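A small Python simulation of the feedback loop described above, with entirely invented numbers: two neighborhoods with identical underlying rates, but patrols allocated in proportion to where past finds were recorded, so the data keeps confirming itself:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME true rate of weed possession.
TRUE_RATE = {"A": 0.05, "B": 0.05}

# Historical bias: nearly all past searches happened in A, so A holds the recorded finds.
recorded_finds = {"A": 50, "B": 1}

def allocate_patrols(finds, total_patrols=100):
    # "Predictive" allocation: patrols go in proportion to where crime was recorded before.
    total = sum(finds.values())
    return {hood: round(total_patrols * n / total) for hood, n in finds.items()}

for week in range(5):
    patrols = allocate_patrols(recorded_finds)
    for hood, n_patrols in patrols.items():
        # Each patrol searches ten people and finds weed at the true underlying rate,
        # but only where it actually looks.
        stops = n_patrols * 10
        recorded_finds[hood] += sum(1 for _ in range(stops) if random.random() < TRUE_RATE[hood])
    print(f"week {week}: patrols={patrols}, finds so far={recorded_finds}")

# The allocation stays locked on neighborhood A even though B is identical,
# because the system only measures where it already searches.
```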
01:10:55 ►
Are there any female-identified people or non-binary people
01:10:58 ►
who would like to ask the next question?
01:11:00 ►
All right, are there any male-identified people?
01:11:02 ►
Okay, you were next in the back there.
01:11:04 ►
Or were you next? You were next. All right. Go ahead. You could look at DMCA laws and things like Netflix prohibiting your ability to record as
01:11:26 ►
a structure for Netflix to make more money,
01:11:29 ►
but also a structure for directors and screenwriters to make a living.
01:11:34 ►
Sure.
01:11:34 ►
I think ideally both of those needs should be balanced,
01:11:38 ►
and it shouldn’t entirely be on the computer owner, and I’m curious how you feel.
01:11:42 ►
I was with you right until that last part, right?
01:11:45 ►
Okay, so we all agree, I think.
01:11:48 ►
Like, I can give you examples of things
01:11:49 ►
that I’m sure you think that the person who made the content
01:11:53 ►
doesn’t get to tell you what to do,
01:11:55 ►
even if they could think of a way of doing it.
01:11:56 ►
So I’ll give you an example.
01:11:57 ►
I used to go to these standards body meetings
01:11:59 ►
for the Digital Video Broadcasters Forum,
01:12:01 ►
which makes digital television standards for Europe,
01:12:03 ►
parts of Asia and South America, DVB.
01:12:06 ►
And we were trying to make a standard for digital locks on new high-def digital TVs.
01:12:11 ►
And one of the people who was there representing the motion picture industry said,
01:12:16 ►
I want to be able, on a show-by-show basis, to flag the show to say whether or not you’re allowed to watch that show in a room other than the
01:12:26 ►
room where your receiver is. Like if you’re allowed to run a cable into another room or
01:12:29 ►
use a wireless retransmitter. So like if the show is being received by a receiver in your
01:12:33 ►
living room but you want to watch it before bed, I want to be able to turn that on and
01:12:37 ►
off. And I said like, which copyright system do you think it is that gives authors, rights holders, the right to decide
01:12:45 ►
what room you consume their material in? And he said, this has value, right? The right to watch
01:12:54 ►
a show in a room other than the one you live in has value. And if it has value, we have the right
01:12:59 ►
to capture it, right? Siva Vaidhyanathan calls that the if-value-then-right theory of copyright. But it’s wrong. There
01:13:06 ►
are lots of values that are in a copyrighted work that rest with the audience and not with
01:13:12 ►
the author. The right to remember it, the right to read it out of order or reread a
01:13:17 ►
favorite passage, all of those things might be valuable. I could totally see a thing,
01:13:22 ►
a machine learning system in your Kindle that figured out which passages you like the best and charged you extra to read them again.
01:13:29 ►
We know a priori that that’s a terrible idea.
01:13:33 ►
So as the Winston Churchill joke goes, we’ve agreed on the general principle.
01:13:39 ►
Now we’re arguing about specifics.
01:13:40 ►
Some things rights holders should not be allowed to specify.
01:13:45 ►
I shouldn’t be allowed to say my painting can only be watched, can only be looked at by white people. Or like Samuel Beckett’s
01:13:51 ►
family did, my grandparents’ plays can only be acted in by men. That was one of their demands.
01:13:58 ►
Those are not things that rights holders get to decide. End of story. So now we’re just trying to figure out where in the stack it goes. And it cannot start with when you use a computer that has someone else’s
01:14:12 ►
copyrighted work on it, you don’t get to decide what that computer does. Because that means that
01:14:18 ►
your computer is designed to treat you as its adversary to stop you from reconfiguring it.
01:14:23 ►
We don’t know how to make a computer that can run every program
01:14:26 ►
except for the one that upsets someone who provides content to Netflix.
01:14:29 ►
The only computer design we have in production that we know how to make
01:14:34 ►
is the Turing-complete von Neumann machine
01:14:36 ►
that can run all the programs that can be expressed in symbolic logic.
01:14:40 ►
So when we try to design the computer that runs all the programs
01:14:42 ►
except for the one that makes the baby Jesus cry,
01:14:46 ►
what we end up doing is we design a computer that has spyware on it out of the box
01:14:50 ►
that tries to see if you’re running the wrong program, the program that you’re not allowed to run,
01:14:55 ►
and prevents you from running it.
01:14:56 ►
And for that to work, there have to be processes and systems in your computer that you can’t see,
01:15:02 ►
that you can’t interrogate, that you can’t override.
01:15:04 ►
And we’re back to alchemy because now we put secrets in the computer that publishing
01:15:09 ►
information about them, true facts about the security of that computer, become a matter
01:15:15 ►
of potential criminal liability. And that means that this computer, which is not an
01:15:19 ►
entertainment device, but rather a pluripotent universal Turing machine that does everything
01:15:24 ►
and knows all your secrets and can destroy you comprehensively from asshole to appetite,
01:15:29 ►
that computer becomes off limits to security research because we’ve pursued a fool’s technological
01:15:36 ►
errand, the errand of designing a computer that runs all the programs except for a program,
01:15:41 ►
right? And so we can’t put it there. Now, there may be other ways of getting people
01:15:46 ►
to solve their computer problems
01:15:49 ►
and their rights holder problems,
01:15:51 ►
but historically, the way that this has worked
01:15:53 ►
is like we’ve had kind of two answers
01:15:55 ►
to the new technology has made my living unsustainable, right?
01:16:00 ►
And those two ones were like either,
01:16:02 ►
okay, we’ll figure out a way
01:16:04 ►
to kind of make this into a collective license or just figure it out, suck it up.
01:16:08 ►
Some of you are going to succeed.
01:16:10 ►
Some of you are going to fail.
01:16:10 ►
Remember, the arts are like a statistically implausible industry.
01:16:15 ►
Like, yes, the arts are a big industry, but all the people who ever set out to be in the arts, like almost all of them not only fail to make money, most of them lose money.
01:16:25 ►
Right? So, you know, this saying, okay, well, how do we make sure Netflix monetizes as much as possible, doesn’t change
01:16:30 ►
the amount of money in the pocket of the majority of people who ever wanted to make a movie,
01:16:34 ►
right? Like 99 and 59% of everyone who ever dreamed someday of expressing themselves on the
01:16:39 ►
screen. We’re arguing about how lottery winners can maximize their earnings
01:16:45 ►
it’s the wrong question
01:16:46 ►
because a robust good cultural policy
01:16:48 ►
is not about making sure that lottery winners
01:16:51 ►
keep winning the lottery
01:16:52 ►
it’s about making sure that as many people as possible
01:16:55 ►
can make as much diverse content
01:16:57 ►
as much diverse creative work
01:16:59 ►
as possible for the largest number
01:17:01 ►
of diverse audiences that suit them
01:17:03 ►
and express them as well as possible.
01:17:05 ►
And so if we’re going to make industrial policy
01:17:08 ►
to make computers and the arts more compatible,
01:17:12 ►
we could make a couple of things
01:17:13 ►
that would make a huge difference, right?
01:17:15 ►
Like one is we could repeal Section 1201 of the DMCA
01:17:18 ►
because practically speaking,
01:17:19 ►
the way that Section 1201 works
01:17:21 ►
is if Amazon sells your book,
01:17:28 ►
you the publisher or you the author don’t get the right to tell Amazon’s customer that they’re allowed to take the digital lock off of it
01:17:32 ►
and move it to a rival platform like Kobo. So that means that when Hachette, one of the big
01:17:38 ►
five publishers who are generally thought of as like super on top of things, like really like
01:17:43 ►
on fleek, when Hachette hit the end
01:17:46 ►
of their 10-year deal with Amazon and tried to renegotiate, Amazon said, we’re not giving you
01:17:51 ►
anything you want. And Hachette said, well, good luck because we have the books. And Amazon said,
01:17:56 ►
ha, but we’ve got the customers. And people are not going to keep half of their libraries on a
01:18:00 ►
Kindle and the other half on a rival device. And they took all of Hachette’s books out of their catalog for a year, Harry Potter, all of that stuff. And at the end of the
01:18:08 ►
year, Hachette capitulated. And at the end of that year, Random House’s deal came up. Bertelsmann,
01:18:13 ►
who owned Random House, are the largest publisher in the world, huge diversified conglomerates.
01:18:17 ►
They make cluster bombs. They make books. And Random House caved immediately, right?
01:18:22 ►
So how do you make sure that as between the dollar that’s spent at Amazon,
01:18:27 ►
as much of it goes into the pocket of the publisher or the writer
01:18:30 ►
and not into Mr. Bezos’ discount house of everything,
01:18:34 ►
you make sure that the leverage doesn’t accrue to that guy, right?
01:18:38 ►
If you want to make sure that authors get the best deal from their publishers,
01:18:42 ►
the Authors Guild, who are a group I’ve had a lot of disputes with,
01:18:45 ►
the Authors Guild right now have an amazing campaign,
01:18:47 ►
50% royalty on e-books, right,
01:18:49 ►
as opposed to 12% on hardcovers, 25% on e-books.
01:18:52 ►
50% royalty on e-books.
01:18:54 ►
Double the amount of money in the pocket
01:18:56 ►
of everyone who sells a book, right?
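A back-of-the-envelope check in Python of the royalty figures quoted just above, using hypothetical $10 e-book and $30 hardcover list prices purely for illustration:

```python
# Hypothetical prices; the percentages are the ones quoted in the talk.
ebook_price = 10.00
hardcover_price = 30.00

current_ebook_cut = 0.25 * ebook_price    # 25% e-book royalty today
proposed_ebook_cut = 0.50 * ebook_price   # the Authors Guild's proposed 50%
hardcover_cut = 0.12 * hardcover_price    # 12% hardcover royalty, for scale

print(f"Author's cut today per e-book:  ${current_ebook_cut:.2f}")
print(f"Author's cut at a 50% royalty:  ${proposed_ebook_cut:.2f}")   # roughly double
print(f"Author's cut per hardcover:     ${hardcover_cut:.2f}")
```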
01:18:58 ►
The publishers are absolutely not going to let them do it,
01:19:00 ►
and there’s only five big publishers left in the world.
01:19:04 ►
And so if you don’t like that deal,
01:19:06 ►
it’s like Lily Tomlin in the old Saturday Night Live skits,
01:19:08 ►
like we’re the phone company, we don’t have to care.
01:19:10 ►
What are you going to do?
01:19:11 ►
Get two cans and a string, right?
01:19:13 ►
If you want to make sure that there are lots of people
01:19:18 ►
in a seller’s market for books that we write,
01:19:23 ►
you have to make sure that it’s as easy as possible
01:19:25 ►
to be a publisher who might offer someone money. Now, the last 20 years of copyright enforcement
01:19:31 ►
in the quest for balance has monotonically increased year on year the cost of being
01:19:38 ►
someone involved in distributing copyrighted works on the internet. We’ve added, you know,
01:19:42 ►
look at YouTube, right? When YouTube started, all it took to be YouTube was a giant pile of hard drives, an unhealthy interest in video,
01:19:49 ►
and an internet connection, right? Now YouTube has a couple hundred million dollars worth of
01:19:53 ►
content ID, which is a system that’s used to enforce copyrights automatically. And anyone
01:19:59 ►
who wants to compete with YouTube feels, I think probably correctly, that they would be in for a world of legal hurt
01:20:05 ►
if they didn’t have another content ID.
01:20:09 ►
Well, that means that we don’t really get other YouTubes anymore.
01:20:12 ►
When YouTube started, there were a lot of video companies.
01:20:14 ►
Now there’s this one giant one and a bunch of also-rans.
01:20:17 ►
Well, what happens when you have a buyer’s market for video material?
01:20:21 ►
The terms get worse.
01:20:22 ►
So, like, YouTube decided to launch a Spotify competitor.
01:20:26 ►
They sat down with the big four record labels because there’s five publishers and there’s four
01:20:29 ►
record labels left in the world and five movie studios, right? They sat down with the big four
01:20:34 ►
record labels and they said, let’s deal equal to equal. And once they had that deal, they went to
01:20:40 ►
every indie and every small label in the world. And they said, if you want to still use YouTube,
01:20:44 ►
you will take the deal that the record industry negotiated on your behalf. Now, the record
01:20:49 ►
industry, the four companies, historically has not made good deals with its artists, right? You probably
01:20:54 ►
know about the African-American artists who were ripped off in the age of R&B, but even today,
01:20:58 ►
if you have a standard record deal with one of the big four, every royalty statement you get
01:21:04 ►
has a line item
01:21:05 ►
deduction for something called breakage. Breakage is the statistically calculated percentage
01:21:10 ►
of your vinyl record albums that will break on the truck between the factory and the store.
01:21:16 ►
And it’s deducted from your MP3 royalties, right? The accountancy basis for this is fuck
01:21:22 ►
you, right? Like that’s the accountancy logic of this.
01:21:26 ►
There’s four of us.
01:21:27 ►
What are you going to do?
01:21:29 ►
When we reduce the number of people who can buy our stuff and take it to our audience,
01:21:33 ►
because the Venn diagram of everyone who has a song to sing and everyone who can make a YouTube
01:21:38 ►
has a very small overlap.
01:21:40 ►
It’s more of a sphincter than a sphere.
01:21:42 ►
When you make it harder for someone to
01:21:45 ►
assist us to get our music into the hands of people who might pay us for it by adding copyright
01:21:51 ►
liability and compliance stuff, you just reduce the number of people who are bidding for things,
01:21:56 ►
and you reduce the amount of each dollar that’s raised that lands in the pocket of an artist.
01:22:01 ►
Those are policies, unlike the magical computer that sometimes doesn’t run
01:22:05 ►
programs that we don’t like. That computer doesn’t exist and it’s not on the drawing board.
01:22:09 ►
The policy where more competition means that artists get more money or where people whose
01:22:14 ►
sole contribution to a book is taking a text file and running a script on it don’t get more of a say
01:22:20 ►
in how that book is used after the fact than the person who wrote it or the person who paid them
01:22:23 ►
for it. Those are things that we could do tomorrow.
01:22:27 ►
And so if we’re going to balance the interests of creators and industry,
01:22:30 ►
that’s the right place to start because that pays you.
01:22:33 ►
You can’t eat outrage.
01:22:35 ►
The fact that you’re pissed that someone’s listening to your music the wrong way
01:22:38 ►
is not going to put braces on your kids’ teeth.
01:22:40 ►
But if every dollar that comes in, an extra 10 cents goes into your pocket,
01:22:44 ►
then maybe you can put your kids through college.
01:22:48 ►
Slight but associated change of subject.
01:22:50 ►
You write a lot on things of software, obviously, with regards to open source and that.
01:22:55 ►
A lot of people…
01:22:56 ►
Can you speak a little closer to the mic?
01:22:57 ►
I can’t quite hear you.
01:22:59 ►
So a lot of people associate open source with software, obviously.
01:23:04 ►
Less people than that for open source hardware,
01:23:07 ►
like Arduino and all that sort of stuff.
01:23:10 ►
What do you know about and what would you say
01:23:13 ►
about applying open source principles
01:23:15 ►
to pretty much everything,
01:23:17 ►
mainly like low-tech,
01:23:19 ►
alternative decentralized infrastructure,
01:23:21 ►
that kind of thing?
01:23:23 ►
And do you know of any people
01:23:24 ►
who are really leading in that kind of field?
01:23:26 ►
Yeah, I mean, the Creative Commons licenses
01:23:28 ►
have been applied to a real diversity of materials,
01:23:30 ►
like architectural material or building plans
01:23:33 ►
or design blueprints,
01:23:36 ►
things that are not in the realm of traditional open source
01:23:40 ►
or even open source hardware stuff.
01:23:42 ►
But, you know, the reason that free and
01:23:45 ►
open source licenses work is because software is copyrighted and copyrightable. But being
01:23:50 ►
copyrightable is the exception and not the rule in our world. Like most things are not.
01:23:55 ►
And it’s great to have fair use and it’s great to have creative commons licenses. But the
01:24:00 ►
reality is that like the utilitarian basis for copyright spelled out in the U.S. Constitution to promote the useful arts and sciences is that we give copyrights in places where without those copyrights we wouldn’t get enough stuff.
01:24:15 ►
But, like, fashion, the most competitive, high-speed industry in the world, has little or no copyright at all, depending on which part of fashion you’re looking at.
01:24:25 ►
We don’t need open source fashion.
01:24:28 ►
Fashion is born open source.
01:24:30 ►
Like, the way fashion works, and the reason fashion is so light speed, so intense,
01:24:35 ►
changes so quickly, is so creative, is that any time you see something cool,
01:24:40 ►
you can take it, clone it, and improve it.
01:24:42 ►
You don’t need anyone’s permission.
01:24:44 ►
You don’t need a license.
01:24:45 ►
You don’t need a lawyer. You don’t need a lawyer.
01:24:45 ►
Because remember, what an open source license is,
01:24:47 ►
is an invitation to hire a lawyer to tell you what you can do.
01:24:52 ►
Let’s not get lost in the license stuff,
01:24:55 ►
because it’s fun writing licenses.
01:24:57 ►
The creative license, and the WTF license,
01:24:59 ►
and the GNU licenses, and all the other licenses.
01:25:02 ►
The artistic license.
01:25:04 ►
Those are fun,
01:25:05 ►
but like let’s not promote this norm
01:25:07 ►
where all the domains of things,
01:25:09 ►
like plots, right?
01:25:11 ►
Plots are not copyrightable, right?
01:25:13 ►
Poe invented the detective story.
01:25:16 ►
It didn’t come down off a mountain on two tablets, right?
01:25:18 ►
A dude whose name we know invented a genre
01:25:21 ►
and he didn’t get to tell other people what to do.
01:25:23 ►
We don’t need Creative Commons licenses
01:25:25 ►
for genres. They are
01:25:27 ►
born open.
01:25:29 ►
Totally, we should take all the things
01:25:31 ►
that are closed by default,
01:25:34 ►
figure out where it makes sense to open them, and open
01:25:35 ►
them up, but we must never lose sight of the
01:25:37 ►
fact that most things
01:25:39 ►
are open, and when we
01:25:41 ►
add licenses to them, we take some
01:25:43 ►
plurality of them and make them closed
01:25:45 ►
because they’re not licensed.
01:25:46 ►
I think that’s it for time.
01:25:48 ►
Is that right?
01:25:48 ►
We’re at half past?
01:25:50 ►
That’s what I had in my schedule.
01:25:51 ►
My campies are expecting me for dinner in any event.
01:25:56 ►
Thank you all very much for coming.
01:25:58 ►
Thank you.
01:25:59 ►
Thank you.
01:26:02 ►
You’re listening to The Psychedelic Salon,
01:26:04 ►
where people are changing their lives one thought at a time.
01:26:09 ►
And I hope that you take up Cory’s challenge
01:26:11 ►
and get two deep nerds that you know
01:26:13 ►
who don’t really care too much about cybersecurity right now
01:26:17 ►
and get them to listen to this talk.
01:26:20 ►
And should you happen to be involved in coding various
01:26:23 ►
and different things at your job,
01:26:26 ►
I hope that you take to heart some of what Cory said
01:26:28 ►
and do what you can in your coding and your team meetings
01:26:32 ►
to ensure that the privacy of your customers is protected.
01:26:36 ►
Sixteen years ago, I wrote and published The Spirit of the Internet.
01:26:40 ►
At the time I was writing it, I was still deeply enmeshed in the technical world of
01:26:45 ►
the internet, and so I’d like to read this one paragraph from that book. The kind of world we
01:26:52 ►
are about to bring into existence is being shaped each day by thousands of little decisions being
01:26:58 ►
made in companies all around the globe. This is why it is so important for all of us to become more involved
01:27:05 ►
in discussions about how this powerful technology is to be deployed. Many of the people participating
01:27:12 ►
in these online debates are the same ones who go to work each day and make these important
01:27:17 ►
decisions. Of prime importance in all of these decisions is the issue of privacy. If we do not clearly establish one’s personal privacy
01:27:26 ►
as an absolute and inalienable human right,
01:27:31 ►
our grandchildren may never know what it is like to have a private moment.
01:27:36 ►
Well, I published that book in the year 2000,
01:27:38 ►
and to be honest, my worst fears have not only been met,
01:27:43 ►
they’ve been exceeded by many orders of magnitude.
01:27:46 ►
In that book, I wrote about things that weren’t yet in the market, but were being developed by
01:27:51 ►
people that I knew. And almost everything that I foretold there has come to pass. Much to my
01:27:57 ►
surprise, I should add, because, well, I never thought that our technology would evolve as fast as it has, and as fast as it continues to
01:28:05 ►
evolve. In fact, I hope that you aren’t using any of those IoT voice controllers for your home right
01:28:12 ►
now. Just think about it. Do you really want Amazon or Google knowing every single question
01:28:18 ►
or request that takes place in your home? And you can bet that all of that information that they gather on you and your
01:28:25 ►
family will never go away. You know, disk space is now next to free these days when compared to
01:28:31 ►
the amount of information that they can store for just a penny. Just like you, I can, well,
01:28:37 ►
I can see a lot of promise for the Internet of Things. In a way, it’s bringing into existence
01:28:43 ►
all of the wildest technological dreams of my youth.
01:28:47 ►
But now that we have begun to move into this new arena, it becomes more obvious each day that
01:28:52 ►
until the privacy and security issues of the IoT are solved, we had better move very cautiously,
01:29:00 ►
very cautiously indeed. If you live in the U.S. right now, you no doubt are well aware of the massive dump of e-mails
01:29:07 ►
from various political campaigns that was done by WikiLeaks.
01:29:12 ►
And you can tell how seriously damaging these e-mails are to that Clinton woman’s presidential campaign
01:29:18 ►
because they are completely ignoring the content of those e-mails
01:29:22 ►
and instead they’re doing their very best to cover them up
01:29:25 ►
by claiming that these email accounts were hacked by Russia.
01:29:29 ►
Now, first of all, there’s been no solid evidence as to whether or not that’s true.
01:29:34 ►
But don’t you find it interesting that these politicians have very successfully shut off discussion
01:29:40 ►
about what the democratic political operatives have been saying in private
01:29:44 ►
by
01:29:45 ►
redirecting the conversation to the possibility that Russia is involved, and the horror, the
01:29:51 ►
horror that another government would try to influence an American election.
01:29:56 ►
Where did they ever get such an idea to meddle like that?
01:30:01 ►
If such is the case, then, well, maybe they are simply taking their clues from something called Radio Free Europe, which is an official U.S. government propaganda radio network that has been attempting to influence elections behind the Iron Curtain since 1949.
01:30:18 ►
So, let’s not get all holier than thou here.
01:30:26 ►
And as to the recent Internet of Things botnet attack last week, isn’t it interesting that it stopped really soon after WikiLeaks sent out a tweet
01:30:32 ►
acknowledging the fact that the takedown was in response to the U.S. cutting off Julian
01:30:38 ►
Assange’s internet connection. So instead of blaming Russia for this particular incident,
01:30:44 ►
the power elite are now pointing to Anonymous, which they still think of as an organized group of some kind.
01:30:51 ►
And that’s really quite laughable.
01:30:54 ►
You see, well, in my mind at least, Anonymous isn’t a group of people. It’s an idea.
01:31:01 ►
This recent attack was obviously the work of some very unsophisticated hackers
01:31:05 ►
that were using publicly available scripts. It most certainly wasn’t an attack by a government
01:31:12 ►
agency, only the politicians think that. However, if I was working for the FBI or the NSA or
01:31:19 ►
Homeland Security, I would be considerably more worried about this attack than if it had been
01:31:25 ►
done by another government. Because now it has become quite obvious that just a few script kids
01:31:31 ►
are capable of causing widespread disruption of the net, should a few of them get pissed off at
01:31:37 ►
something like, well, like what happened when PayPal refused to accept donations for WikiLeaks.
01:31:43 ►
And the reason that I think of Anonymous as an idea,
01:31:47 ►
rather than as a group of hackers,
01:31:49 ►
is that we’ve already seen actions promoted by Anonymous
01:31:52 ►
that brought out a lot more people
01:31:54 ►
than can be found only in the hacker community.
01:31:57 ►
Remember when Anonymous took on the Church of Scientology?
01:32:01 ►
One day they called for people to picket the offices
01:32:04 ►
of this so-called church,
01:32:06 ►
hoping that a few dozen people would show up. But what actually happened is that hundreds of people
01:32:11 ►
showed up in cities all over the world. These weren’t all just computer hackers. These demonstrators
01:32:17 ►
were people just like you and me, and they showed up to display their support for an idea.
01:32:23 ►
For me, well, another example of how big
01:32:26 ►
Anonymous is was the Occupy Wall Street movement. Remember that? In cities all over the world,
01:32:33 ►
thousands of people demonstrated against Wall Street’s brand of capitalism. The way I see
01:32:38 ►
it, there is this huge wave of thought throughout the world that is getting more and more fed
01:32:43 ►
up with the way the one percenters are running things. This underlying current of thought, it seems to me, is like the
01:32:51 ►
mycelium under the forest floor. It is what feeds the trees and plants and actually holds the forest
01:32:57 ►
together. For most of the time it is just beneath the surface doing its best to keep things from
01:33:03 ►
falling apart. But every once in a while a mushroom fruit will pop up out of the mycelium.
01:33:09 ►
And during the first phase of the Occupy movement,
01:33:13 ►
people in the form of those fruiting bodies appeared in Occupy camps
01:33:16 ►
in every major and in many smaller communities around the world.
01:33:21 ►
We are anonymous.
01:33:23 ►
You and me, each and every one of us. And one day our time will come.
01:33:28 ►
We aren’t only computer hackers. We are the ones who are hacking human consciousness. We,
01:33:34 ►
you and I, are anonymous. We are Legion. We do not forgive. We do not forget. Expect us.
01:33:44 ►
And for now, this is Lorenzo
01:33:45 ►
signing off from Cyberdelic Space.
01:33:48 ►
Be well, my friends.