In 2011, Tristan Harris’s company, Apture, was acquired by Google. Inside Google, he became unnerved by how the company worked. There was all this energy going into making the products better, more addictive, more delightful. But what if all that made users’ lives worse, busier, more distracted?
Harris wrote up his worries in a slide deck manifesto. “A Call to Minimize Distraction & Respect Users’ Attention” went viral within Google and led to Harris being named the company’s “design ethicist.” But he soon realized that he couldn’t change enough from the inside. The business model wasn’t built to give users back their time. It was built to take ever more of it.
Harris, who recently co-founded the Center for Humane Technology, has become the most influential critic of how Silicon Valley designs products to addict us. His terms, like “Time Well Spent,” have been adopted (or perhaps co-opted) by, among others, Facebook CEO Mark Zuckerberg. His critique is becoming almost mainstream, featured on 60 Minutes and echoed by CEOs, politicians, and even Silicon Valley investors.
I interviewed Harris recently for my podcast. We talked about how the 2016 election threw Silicon Valley into crisis, why negative emotions dominate online, where Silicon Valley’s model of human decision-making went wrong, whether he buys Zuckerberg’s change of heart, and what it means to take control of your time. This transcript has been edited for length and clarity. For the full conversation, which includes the story of what happened when Harris brought legendary meditation teacher Thich Nhat Hanh to Google, listen or subscribe to The Ezra Klein Show.
Ezra Klein
Why do you think this topic has blown up so much in recent months?
Tristan Harris
I think the election woke a lot of people up to the power of social media, and I don’t mean just the outcome. This is really important. I think through 2016, there was a sense that social media was just becoming toxic. It’s just outrage everywhere. And that amplifies the addiction that was already there. I think people are really feeling like they’re losing agency, and they realize how much time they’re spending on their phones. And then those issues exploded with the Russian manipulation of social media through the election.
Ezra Klein
I had Jaron Lanier on this podcast a couple months ago, and he said something I’ve been thinking about since then. He said that the key to a lot of social media is [that] negative emotions engage more powerfully than positive emotions. Do you think he’s right about that?
Tristan Harris
Oh, absolutely. Outrage just spreads faster than something that’s not outrage.
When you open up the blue Facebook icon, you’re activating the AI, which tries to figure out the perfect thing it can show you that’ll engage you. It doesn’t have any intelligence, except figuring out what gets the most clicks. The outrage stuff gets the most clicks, so it puts that at the top.
Then it learns there’s this magic keyword: if any article has this keyword and you put it at the top of the feed, it always gets a click or a share. And that keyword is “Trump.” If you’re just a naive computer, you put this keyword in front of people and they always click it. That reinforces that this is the thing that should go to the top of the feed.
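[To make the loop Harris is describing concrete, here is a minimal sketch in Python of a click-maximizing feed ranker. Every name in it is invented for illustration; this is not Facebook’s actual system, just the bare logic of ranking by observed clicks and reinforcing whatever got clicked.]

```python
from collections import defaultdict

class ToyFeedRanker:
    """A toy ranker with no notion of quality or truth; its only
    signal is which keywords have historically gotten clicks."""

    def __init__(self):
        self.clicks = defaultdict(int)       # clicks observed per keyword
        self.impressions = defaultdict(int)  # times each keyword was shown

    def score(self, keywords):
        # Predicted engagement: the best historical click-through rate
        # among the item's keywords.
        return max(
            (self.clicks[k] / self.impressions[k]
             for k in keywords if self.impressions[k] > 0),
            default=0.0,
        )

    def rank(self, items):
        # The feed: highest predicted-click items go to the top.
        return sorted(items, key=lambda i: self.score(i["keywords"]), reverse=True)

    def observe(self, keywords, clicked):
        # The feedback loop: whatever got clicked rises next time.
        for k in keywords:
            self.impressions[k] += 1
            if clicked:
                self.clicks[k] += 1

ranker = ToyFeedRanker()
ranker.observe(["trump"], clicked=True)       # the "magic keyword" gets a click
ranker.observe(["gardening"], clicked=False)  # the calm story does not
feed = ranker.rank([
    {"title": "Calm story", "keywords": ["gardening"]},
    {"title": "Outrage story", "keywords": ["trump"]},
])
print([item["title"] for item in feed])  # ['Outrage story', 'Calm story']
```

Nothing in the ranker rewards accuracy or well-being; its only input is the click, which is exactly why the outrage stuff ends up on top.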
I look at technology through the lens of persuasion and how it persuades the human animal. What does seeing a repeated set of things that make you outraged do to you? You can feel it when it happens. I think of it as civilization mind control. It’s not that there’s someone who’s deliberately trying to make us all outraged. It’s that 2 billion people, from the moment they wake up in the morning, are basically jacked into an environment where, if you’re a teenager, the first thing you see is photo after photo of your friends having fun without you. That does something to all those human animals. If the first thing you do when your eyes open is see Twitter and there’s a bunch of stuff to be outraged about, that’s going to do something to you on an animal level.
I think what we have to reckon with is how this is affecting us on a deeper level.
Ezra Klein
I’ve been thinking about this a bit in my own life. I’m pretty competitive. It’s not my best trait. And as the feeds have become more algorithmic, it’s brought that out in me even more. It used to be that your Twitter feed was just whatever had just happened. Now I turn on Twitter and the first thing I see is everybody whose tweets are performing better than mine. That is the way my animal brain absorbs it. Here are all the people like me, in the same business as me, and here are their best-performing tweets, which are currently doing better than my tweets. Or I go on Facebook and you see the work from competitors that is going viral.
The reason I bring that up, aside from noting that I’m a bad person with bad impulses, is that I think it’s helpful to talk about us as animals. I think one way your work has been interpreted in the past few years is as being about smartphones and design, but a lot of it, it seems to me, is about a particular understanding you have of the human as animal.
Tristan Harris
This is the key to everything. Technology feels disempowering because we haven’t built it around an honest view of human nature. The reason we called our new project the Center for Humane Technology is that it starts with a view of ourselves.
Silicon Valley is reckoning with having had a bad philosophical operating system. People in tech will say, “You told me, when I asked you what you wanted, that you wanted to go to the gym. That’s what you said. But then I handed you a box of doughnuts and you went for the doughnuts, so that must be what you really wanted.” The Facebook folks, that’s literally what they think. We offer people this other stuff, but then they always go for the outrage, or the autoplaying video, and that must be people’s most true preference.
If you ask someone, “What’s your dream?” that’s not a meaningless signal. A psychotherapist going through an interview process with someone is accessing parts of them that screens never do. I think the [traffic] metrics have created this whole illusion that what people are doing is what people want, when it’s really just what works in the moment, in that situation.
Where are our choices coming from?
Ezra Klein
If we had had this conversation a couple of years ago, I think the thing somebody would’ve said is, “You’re telling me that rather than listening to the choices I make, you want Facebook to decide what is better for me? You want Google to decide what is better for me?”
There was this ideal of neutral platforms. And while the platforms were never truly neutral, part of the reason I think that ideal has been alluring is that we assume that while the consumer — and in this case, we are being treated as a consumer — may not make a perfect decision, they’re going to be a hell of a lot better at making decisions than Mark Zuckerberg or Larry Page or Jack Dorsey.
There’s a desire to not be paternalistic. A desire to leave the choice as close to the person as possible. There is safety in saying the consumer is going to make the decision. Yes, there is an algorithm here, but the algorithm is responsive to the consumer. The algorithm is just saying, “What do you want? Okay, we’re going to give you more of that.”
There are very few areas where we regulate against people’s desires. Really, the main one is drugs. We are willing to say there is something in a cigarette that we think short-circuits the way people naturally make decisions. But even there, the big opening for regulation came through the idea of secondhand smoke. We had to be able to say, at least at first, “Oh, your choice is creating an externality for someone else.”
Tristan Harris
I think we have to ask where our choice is coming from. Imagine seeing a picture of a human being and asking: when this thing called a choice happened, what actually happened? Did the person take a deep breath, think for five seconds, and then act? Or was there a quick circuit between the lizard brain and right back out?
Imagine you had an input cable. You’re trying to jack it into a human being. Do you want to jack it into their reptilian brain, or do you want to jack it into their more reflective self? The problem is we don’t have language in the West for making these distinctions, but there’s a very big difference between being asked to pause and take a deep breath for 10 seconds and then decide, and being pushed to make an immediate choice.
I brought Thich Nhat Hanh, who is a famous mindfulness teacher, to Google. He’s 91 years old. When he came to Google, he came because he was worried that this thing in our pocket was making it easier to run away from ourselves. I think we don’t have language for that. We say people check their phones 150 times a day, and we don’t make the distinction between 150 calm, deep-breath, conscious choices and 150 anxiety-driven reactions. There’s a difference.
Can Silicon Valley be fixed from the inside?
Ezra Klein
There is a lot of focus on meditation in Silicon Valley. Burning Man is a big part of the culture, with its heavy emphasis on presence, on putting down your phone. There’s a lot of experimentation with psychedelics. For all the searching that people are doing inside Silicon Valley, for all the feeling people have that it’s too busy, that something is not quite right, the products they make keep pushing us further toward distraction, toward busyness, toward being constantly on call. Can this be changed from inside?
Tristan Harris
No, it cannot be. I tried for two and a half years inside of Google to change it. There is no way to change these things from the inside.
But I want to go back really quickly to address some of the things you brought up about Silicon Valley culture. There’s a conference coming up at the end of this month called Wisdom 2.0, and I’ll actually be there and speaking ...
Ezra Klein
Wisdom 2.0.
Tristan Harris
Wisdom 2.0.
Ezra Klein
Sorry, I’m just holding on that for one moment. What was wrong with Wisdom 1.0?
Tristan Harris
Oh, it’s just old. We’ve got the 2.0 version now. This is actually a fantastic conference. The meditation community, the spirituality community, and the business community talk about mindfulness and business. I say this because some of the principal sponsors of this conference are always the tech companies.
There are all these groups at Facebook, at Google, at LinkedIn, and they talk about their wisdom programs and their mindfulness programs. They go onstage and push it out there to the world, and it’s so hypocritical, because on the one hand, they’re talking about how they find balance in the workplace, how not to get stressed out by the overwhelm, the notifications, the stress. Obviously the elephant in the room is that the product being exported by these technology companies is possibly the largest counterforce to all the things they’re talking about.
This is the thing that needs to change. This is why I was working on this for so long. We actually have to change the thing that we are exporting to the world, which is distraction, outrage, slot machine-style rewards, constant stimulation, social validation, making it harder for people to tell what’s true.
We’re distorting every single aspect of the human stack, and meanwhile, we’re talking about meditation and how good it’s been for us and the cool programs that are helping Googlers and Facebook people meditate more. I think that’s the thing that needs to be reconciled.
Ezra Klein
There is a way in which it all makes you a little bit Burkean. It makes you wonder whether we’re changing some of this too fast.
Tristan Harris
This is an interesting thing, too, about changing too fast. There are these dimensions to being human, and one dimension, per your point about speed, is clock rate. If we breathe at a slower rate, speak at a slower rate, are really here with each other, that’s very different than if I dial all of that way up to 10x. Things start to fall off the rails when you’re going really fast.
This is one of the things that I’m kind of worried about: human animals, when dialed up past certain boundaries of speed, make poor choices. Basically the entire game now in high-frequency trading is to blow up the mountains so you can lay a cable so you can execute a financial transaction a microsecond faster than the other guys. We’re competing to go as fast as possible in domains where, given the impact, we ought to be going as slow as possible.
Ezra Klein
Mark Zuckerberg recently made a bunch of changes to Facebook and said he wanted time on the platform to be “time well spent,” which is the concept you’ve been behind. My sense is you don’t believe that’s going to be a fundamental change in Facebook’s operating approach.
Tristan Harris
Now we’re getting into more of the practicality of what the system is, what the problem is, and how we fix it. The advertising business model is the thing that forces the technology companies to maximize attention. Zuckerberg said on his earnings call that people were spending one or two minutes less on Facebook a day, and that was 50 million hours less per day. They can only do that to a certain extent. They can’t halve the amount of time that people spend on Facebook. That would be way too much. Their stock price is hinged too tightly on a certain amount of usage. How do we decouple the stock price from how much attention is extracted? This is the thing that I’m actually most alarmed about in the current system.
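[A quick back-of-the-envelope check on those earnings-call numbers, with the caveat that the daily-active-user count below is our assumption, roughly Facebook’s reported figure in early 2018, and not something said in the interview:]

```python
# 50 million fewer hours per day, spread across roughly 1.4 billion daily
# users, comes out to about two minutes per user per day. The user count
# is an assumption for illustration, not a number from the interview.
hours_less_per_day = 50_000_000
daily_active_users = 1_400_000_000
minutes_per_user = hours_less_per_day * 60 / daily_active_users
print(f"{minutes_per_user:.1f} fewer minutes per user per day")  # 2.1
```

That squares with the “one or two minutes” Zuckerberg cited.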
Ezra Klein
We’re in Washington, so let’s talk about regulation as it relates to this. I can imagine regulatory approaches where you’re dealing with some stuff on the edges. But for the government to come in and say, “We know better than you how you should be spending your time,” is not going to work. It’s hard for me to think of a regulatory approach to this kind of attention hijacking where the cure would not sound worse, and potentially be worse, than the disease.
When you think about it, what are the kinds of rules that you imagine?
Tristan Harris
This week, we launched this big campaign with Common Sense Media called the Truth About Tech campaign, because it’s one thing when we talk about adults, it’s another thing when we talk about children.
I think everybody looking at how kids are interacting with this stuff senses that something is wrong. The main thing is to say it’s not happening by accident. It’s happening by design. Think of it like the Truth campaign with cigarettes. If you remember the 1990s TV ads, they were not saying, “Hey, smoking is going to have these bad health consequences for you.” That feels like someone telling you what to do. The Truth campaign was about telling you the truth about how the companies deliberately designed cigarettes to be addictive.
The simplest example I always use is Snapchat, which is the No. 1 way that teenagers in the US communicate. It shows the number of days in a row that you’ve sent a message back and forth with a friend. That’s a persuasive and manipulative technique called “the streak.” You have kids who have a streak with one of their best friends going for 360 days. But if they don’t send a message back and forth every 24 hours, the streak goes away. It’s like putting two kids on treadmills, tying their legs together with a string, and then hitting start on both treadmills at the same time: they both have to keep running, or they both fall down. You have 30 of these streaks [going at once]. This is so persuasive that kids give their passwords to their parents, or to their friends, to keep their streaks going if they have to disconnect. It’s that bad.
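[For readers who want the mechanic spelled out, here is a minimal sketch of the streak rule Harris describes, in Python. The names and the exact reset logic are invented for illustration, not Snapchat’s actual implementation:]

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)  # both friends must message within this rolling window

def update_streak(streak_days, last_msg_from_you, last_msg_from_friend, now):
    # If either side has let 24 hours lapse, the streak resets to zero;
    # otherwise it keeps counting up. Hypothetical logic, for illustration.
    if now - last_msg_from_you > WINDOW or now - last_msg_from_friend > WINDOW:
        return 0  # one treadmill stops, and both kids fall
    return streak_days + 1

now = datetime(2018, 2, 19, 9, 0)
print(update_streak(360, now - timedelta(hours=3), now - timedelta(hours=5), now))   # 361
print(update_streak(360, now - timedelta(hours=25), now - timedelta(hours=5), now))  # 0
```

The design choice doing the work is the hard reset: a single missed 24-hour window erases a year of accumulated effort, which is what makes disconnecting feel so costly.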
I say this because addiction, developmentally, is not good for teens. When you talk about regulation, or how we’re going to get out of this, the specific things you do are another question. I just want to say that we know there’s a huge public health problem here. We have got to do something, because what’s happening now is not working.