
RHR: The Impact Technology Has on Ourselves, Our Families, and Our Future, with Tim Kendall


In this episode of Revolution Health Radio, I talk with Tim Kendall, CEO of Moment, an app that helps adults and children use their phones in healthier ways. Tim is the former President of Pinterest, where he led product development, engineering, marketing, and sales. Prior to Pinterest, Tim was Facebook’s Director of Monetization, where he led the development of Facebook’s advertising business. You may have seen Tim featured in the Netflix documentary, The Social Dilemma, which explores the wide-ranging impacts of social media and related technologies.


In this episode, we discuss:

  • Tim’s background
  • How media affects our sense of shared reality
  • How technology is evolving for the worse
  • Ways to implement behavior change
  • How social media has impacted children’s development and how we can protect our kids from these technologies

Show notes:

Hey, everybody, Chris Kresser here. Welcome to another episode of Revolution Health Radio. This week, I’m really excited to bring you Tim Kendall as our guest. Tim is the CEO of Moment, an app that helps adults and children use their phones in healthier ways.

Tim is the former president of Pinterest. At various points, he led product development, engineering, marketing, and sales. Prior to Pinterest, Tim was Facebook’s director of monetization, where he led the development of Facebook’s advertising business. Tim serves on the board of UCSF Benioff Children’s Hospital, where he’s recently focused on their mental health strategy. He earned his engineering degree and MBA from Stanford University.

You may have seen Tim featured in the Netflix documentary, The Social Dilemma, which explores the wide-ranging impacts of social media and related technologies, and that’s what we’re going to be discussing on this show. So, without further delay, I bring you Tim Kendall.

Chris Kresser:  Tim, welcome to the show. I’ve really been looking forward to this.

Tim Kendall:  Thank you. Yeah, I have, too.

Chris Kresser:  So I always like to start with a little bit of background. Why don’t you tell us how you came to be the CEO of Moment, in a nutshell, and then we’re going to touch on a lot of different topics. But you had a number of different positions at Pinterest, and also before that played a significant role in developing Facebook’s monetization and advertising program. So tell us where this all started for you.

Tim Kendall:  Yeah, I guess it all starts, really, growing up as a kid. I was just fascinated with technology and fascinated with the underlying science of technology. Like, what were the things that made it so that technology could do magic things in our lives? Even when I was young, when we had an IBM PCjr, there were magical things you could do relative to what you could do a few years prior. So I was always fascinated by technology and the goodness that it could create for people.

I grew up in Colorado, so I moved out here [to California] to pursue that. In the spirit of being succinct: I meandered a little bit and then ended up at Facebook. I helped them figure out what the business model for the company was going to be, and then, fast forward five years, I was the president of Pinterest. About halfway through my time there, I was there about six years, I started to just notice how much my phone was taking over my life. I started to notice that I didn’t really have the kind of control over my phone that [I did] back when it was just a cell phone or a Blackberry. With the dawn of the computer going from the desktop to being in your pocket, the computer becoming so much more powerful, and then, of course, push notifications no matter what kind of phone you have, and the app ecosystem, I started to really notice just how much it had taken over my life and how I didn’t have control. And that coincided with my wife and me having our first child six years ago. I realized that even though in my heart of hearts, I wanted to be this super engaged, super present dad, my phone was pulling me in the other direction.

So I’d find myself in a closet or in our pantry hiding away, looking at Instagram or videos on YouTube, at frivolous stuff. “Stuff” that wasn’t at all consistent with the kind of life I wanted to build. And it was clearly taking away from my relationships and my ability to generally be present with myself and with others. So that was the beginning of a journey that included talking quite a bit about this issue at Pinterest. I actually keynoted at a global advertising conference about three years ago, where I talked about this problem. And as I was preparing to give that talk, I thought this is such an important issue for me, and I need to think about going off and working on it full time. And so I had an idea for software on the phone that would really help people understand how much they were using it, and coach them to use it less. And as I was researching the area, I found Moment, which was a company that a guy named Kevin Holesh had founded and created, and they had a couple million people already using their product.

And so, as I was leaving Pinterest, I gave [Kevin] a call, and we started talking and we hit it off, and I decided to buy the company from him. He has continued to stay involved, we talk very often, and he’s still leading [our] engineering efforts. We moved the company out here to California and have been focusing on the screen time efforts of the company. But we’re also incubating some other things, that at some point we may want to talk about, that really go beyond how you control the time you spend on your phone, to whether there are other ways we might imagine social media: from this extractive ad business model that just fragments our attention, to something that really serves the user, and serves their social needs. So that’s the answer; hopefully, that wasn’t too long-winded.

Chris Kresser:  No, no, perfect. There’s so much to unpack there. And we’re going to touch on a lot of different topics. But I wanted to at least begin by framing the dance I think we’re going to do throughout this podcast: a back-and-forth between the personal impacts, and the personal steps that we can take to mitigate the effects of technology, and the larger societal impacts, as well as the societal and technical influences that determine our personal usage. Because I think one of the mistakes that is often made when people are talking about these technologies and their impact is the idea that our addiction is just a personal shortcoming or a personal failure. We neglect to see the influences behind it that are driving that addiction: how pervasive they are, how powerful they are, and how, while we’re not entirely helpless, they exploit our basic human vulnerabilities.

And of course, this is something that The Social Dilemma, I think, pulled the veil back on really well and talked a lot about how the business model of these technologies is designed to exploit our attention. That our attention is the commodity that’s being bought and sold. And I think it’s really helpful for people to understand that because it’s empowering; it takes the sense of, “I’m a failure” out of it. Instead of, “there’s something wrong with me,” to, “oh wait, I’m being actively exploited by very powerful forces.” So, with that in mind, I want to rewind a little bit and go back to your time at Facebook.

Tim Kendall:  Yeah.

Chris Kresser:  So you were starting to develop the business model for Facebook, which was early; I think, at that time, there wasn’t really a lot [of social media]. And so when you were developing the business model for social media, did a lot of people follow Facebook’s lead in that regard? Or would you say there was something that came first that was influential in how you were thinking about it at the time?

Tim Kendall: The way that I thought about it was, and we didn’t say this explicitly, but we were a media company. Not unlike [how] Yahoo was a media company, not unlike [how] TV was a form of media, not unlike [how] radio was a form of media, not unlike [how] newspapers were a form of media. And by the way, the whole history of media has been broadly supported by advertising.

Chris Kresser:  Of course. Yeah.

Tim Kendall:  And we looked at Google, which was this hybrid of media and technology, completely advertising supported, and very successful. The same with Yahoo. And we looked at other avenues, as well, but it just seemed really clear and straightforward that this [advertising business model] is a nice, clean way for us to make money, and it’s how media has built a business for the past hundred-plus years. What could go wrong?

Chris Kresser:  Right, because it’s not like social media companies were the first to try to get people’s attention and profit from attention. If you take any kind of copywriting course, or look at what newspapers have done for centuries, or how headlines are written, they’re all designed to get people’s attention and to get people to read the story. However, because of the nature of the technology involved, social media companies have just been far, far more successful at doing that.

Tim Kendall:  Correct. And, I haven’t talked about this extensively, but an interesting case is TV and radio: they’re basically human curated, right? It’s human beings making decisions about programming for humans. There isn’t any omnipotent, all-knowing force that’s like, “Oh, we know what people are really going to like. The math is super clear.” But what’s interesting, if you look at Netflix, for instance, is there are two forces at play now that are starting to make their programming pretty addictive. It’s not quite like social media, because the programming itself isn’t personalized to Chris. But not only are there algorithms, the algorithms are absolutely personalized. So when you turn on your TV and you look at Netflix, it’s showing you a menu of things that are recommended to you based on what you’ve watched previously. And that can prey on you, for sure, and extract your attention in ways that maybe you don’t understand, or that aren’t necessarily aligned with your best interest. But the other thing [Netflix] is doing is looking at viewership data on past shows, and using that to inform the development of new shows. And that hasn’t been done before at the level of detail and fidelity at which they’re doing it.

And everyone’s like, “God, there’s this renaissance of TV. TV is so good and riveting,” and I think that’s in part because they figured out ways to harness the creativity of human beings. I sadly also think it’s because Netflix and others are starting to really understand that this is actually the story arc that will really suck someone in and get them to binge-watch for a long period of time. That was more of a creative art and speculative exercise 20 years ago, but now it’s just becoming hard science.

Chris Kresser:  Right. And eventually, [artificial intelligence] (AI) will just do the whole thing, right?

Tim Kendall:  Precisely.

Chris Kresser:  Including creating the programming, the digital images, and video, and you won’t even need any people.

Tim Kendall:  And it’ll probably be personalized in a way that … which is just …

Chris Kresser:  We might watch the same show, but it has slightly different endings based on our preferences.

Tim Kendall:  Sure.

Chris Kresser:  Why not? If it’s AI that’s making it and you don’t have actors, you can make essentially an unlimited number of plot twists and endings based on each person’s preference.

Tim Kendall:  Yeah, which is just crazy scary. But, to quickly go back to Facebook, what we didn’t understand, [which] I really clearly understand now, is this: you have a media company with a business model that’s predicated on getting more and more of the user’s attention, because advertising allows you to take user attention and turn it into dollars. If that gets intersected with an all-knowing algorithm that’s personalized to every single individual, an algorithm that in many ways knows you better than you know yourself, the natural extension is that it will eventually take over our lives and take over our society.

I was looking at the data, the usage data for social media. Ten years ago, we spent about 12 and a half minutes a day on social media. Those are usage numbers.

Chris Kresser:  How quaint.

Tim Kendall:  Quaint. That has gone up tenfold. We now spend two and a half hours a day, so 150 minutes a day on average, on social media. And that’s just getting more and more and more potent. We talk about the opiate crisis and, don’t get me wrong, I think the opiate crisis is really scary. But you’re not talking about a drug that’s getting substantially more addictive each year.

Chris Kresser:  And it’s completely unlicensed, unregulated, and, until recently, not acknowledged as addictive, like opiates.

Tim Kendall:  Yes, yes.

Chris Kresser:  Yeah, and opiates are not actively marketed to teenage girls, and boys, and kids as young as six years old, right?

Tim Kendall:  That’s right.

Chris Kresser:  Yes. All right, so we’re going to come back to all that because we’re both parents, and I know we both have strong feelings about how these technologies impact kids and teenagers. But so, you’re at Facebook, you’re figuring out the model—advertising, I can certainly see why you would have arrived at that conclusion. As you were sitting around the table talking about it, I’m imagining you had absolutely no idea where this would lead, or where we’d be today. It seems like it would have been hard to envision the way that this would unfold, just given that we didn’t have a reference point for how potent these technologies could become. And then, I think also, at that time, the algorithms were far less sophisticated, right? Or there weren’t really even algorithms that were making the kinds of decisions that are being made now by algorithms. I think that’s another, to me, X factor that really informs this conversation.

Tim Kendall:  I mean, there was a newsfeed that people looked at. And it actually was, sometimes this gets misreported, algorithmic from the very beginning. It was not deterministic; it was literally like, “Okay, Chris, we’ve got several thousand things we can show you. Let’s put them in order of relevance, not sort them by time.” And that was part of Mark Zuckerberg’s genius, actually. He was the one that sort of figured out, like, “oh, this is actually more interesting if I can curate this in a way that’s personally relevant to Chris.” So, no, it wasn’t. Hindsight is 20/20, or whatever the expression is. It’s so easy now. I think what we didn’t understand, well, I’ll say what I did not understand, is just the pace at which the algorithm was going to improve.

And I think we also didn’t understand, at the time, it was truly a social network in that the content that we were looking at to show you, Chris, was pictures of your friends and family, and status updates of your friends and family. And don’t get me wrong, we were still starting to prey on some dimensions of human weakness, right? We were preying on comparison, our vulnerability to comparison, our vulnerability to popularity. And so that’s why it was working, right?

Chris Kresser:  Sure.

Tim Kendall:  We were adding more and more people. But news wasn’t on the service in any real way. There weren’t groups. You were basically on there with 50 to 100 friends, and you were in your own universe. And this idea of societal-level causes, issues, conspiracy theories, and all this other stuff, I mean, none of that was on the horizon at all.

And so, maybe if I’m just trying to be totally objective: if we had really sat down—and this is what all companies really should do, and we didn’t—and said, let’s game-theory this thing out. What could go wrong for us, obviously, but also what could go wrong for our users? I think it would have been really hard to predict the societal stuff. But I think we could have predicted the individual harms, potentially, right? If we [had] just sat down and said, okay, shoot, some people are going to start learning about parties that they weren’t invited to. Like, we knew that was starting to happen, right? People are going to start to see what others are up to. And because those people know that what they’re publicizing is getting pushed out [to their network], that might lead them to distort the quality of what they’re up to, the experience. And this could lead to a little bit of a comparison arms race, where it’s like, “my vacation was better than yours. And my kids are happier than yours. And my marriage is blissful. It’s more blissful than yours!” None of that’s explicit, right? But you could start to see that happening because of the nature of the service. And I think, if I’m really objective, that’s what we probably could have realized. But the polarization, the tribalism, the misinformation, the hate speech, all of that would have been really tough to forecast.

Chris Kresser:  And that’s, of course, become especially amplified over the last couple of years in particular, and I do want to come back to that, as well.


How media affects our sense of shared reality

Tim Kendall:  Well, it’s up to you whether you want to touch on it. I think the misinformation and lack of shared truth that we’re experiencing in the last 24, 48 hours …

Chris Kresser:  Oh, yeah. So this to me is the fundamental theme of The Social Dilemma, and the reason that I’ve become more concerned about this than virtually any other societal problem that we face, whether you’re talking about climate change, or other environmental issues, or social justice, or any of the existential, societal threats that we face. The reason is that, in order to adequately address those threats, we need to have a shared reality as the basis of the conversations that we have around those topics. If we have no shared reality, discussion and meaningful conversations become impossible. And there’s never been a time in human history where we have not had a shared reality.

So, for me, I had already read and thought a lot about that. But The Social Dilemma really drove that home in a powerful and, frankly, terrifying and alarming way. And we’re seeing the impacts of that with the election, and leading up to the election, where one side of the political spectrum literally has a completely different reality than the other side. And that makes a conversation where we could find common ground, empathize with each other, and work from that shared reality to find some type of meeting place in the middle, impossible.

So how do we navigate this? I don’t think anyone has the answer. But at least raising the question seems to be the most important first step that we can take in at least communicating to people that this is the problem that we’re facing.

Tim Kendall:  Yeah.

Chris Kresser:  Do we even have a shared reality that there’s no shared reality? Not to get too epistemic here.

Tim Kendall:  No, I think that’s a very important question. And I think that was part of the aim of this film, and in some ways I’m hopeful that, at least temporarily, that’s what the film achieved. Almost 40 million households saw this film.

Chris Kresser:  That’s encouraging. I heard that stat, and I was really happy.

Tim Kendall:  Which is terrific. So the hope, at least, was like, “Okay, wow, we’re all getting effed with. And so all of our realities have distortion fields. Can we each step outside of those and at least see that all of our attention and reality is getting tweaked and manipulated by these algorithms and services?” There does seem to be some of that. But I will say, and I don’t think this is necessarily a function of the film, I think it’s a function of a lot of forces. I am really heartened, and I’m an optimistic person—and sometimes I get criticized for being overly optimistic—so I have to caveat what I’m about to say with that point. I think Facebook has been incredibly negligent in this whole thing, but I am encouraged. I don’t think they’re doing enough, but they [have been] doing a lot more in the last six weeks, in terms of moving from their positions, than they’ve done in the past several years. And what I mean by that is they’re now saying Holocaust denial will be taken down or labeled. They basically didn’t want that, and weren’t going to touch that type of information for years. And Mark defended it under the guise of free speech. And he’s moved on that. And that is, I think, a movement toward shared reality.

The other thing is, it’s a pretty big concession, I think, certainly relative to their original position, for them to post at the top of their newsfeed, “there is no winner to the election.” And [there are] these groups propagating on Facebook about the election being stolen by Democrats with fraudulent counting. The distribution of those groups is being capped. And from a policy standpoint, that just wasn’t happening on these kinds of issues even two months ago at Facebook.

Chris Kresser:  Right. So I’m going to play devil’s advocate here, because I can see both sides of this issue. On the one hand, I can appreciate that controlling, or censoring, the presence of obvious false information could contribute to building a shared reality. My problem with that, or my fear and concern around that, is who decides what is fraudulent and what is not fraudulent? So let’s just take an example in my field. What if Mark Zuckerberg or someone decides that saturated fat is actually the boogeyman that we’ve always been told that it is, and that low-carb and ketogenic diets are fake news or false information? And so [Facebook] should only allow sharing content that is in alignment with the American [Diabetes] Association and American Heart Association’s perspective on diet? And this isn’t actually a hypothetical example; I have colleagues.

Tim Kendall:  Probably in holistic medicine, this is a big issue.

Chris Kresser:  No, I have colleagues in the space, whose names I won’t mention, who had a very prominent presence on Facebook and with Facebook groups related to ketogenic diets, who have basically been shadow banned and can no longer advertise or reach their audience because someone at Facebook has decided that ketogenic diets are harmful, or dangerous, or not in alignment with the conventional view.

So I have a real problem with the idea that some 20-something, 30-something white males in Silicon Valley are going to be making those decisions.

Tim Kendall:  The irony is that they’re probably intermittent fasting and on keto diets.

Chris Kresser:  Absolutely, that’s what’s crazy about it. So we can all get behind the, well, I shouldn’t say this because I actually don’t get behind it. But it’s easier to get behind censorship when it’s on your side.

Tim Kendall:  I agree.

Chris Kresser:  If we’re censoring something that I don’t believe in, that’s great. But when it’s not, to your point earlier, if you think forward and project down the line where this could end up, it’s scary. And I’m curious what you think about this. People have argued that Facebook and Twitter should be more like public utilities at this point. And because of that, we need to have broader conversations about how these decisions are going to be made, because leaving them to be made just by a handful of tech entrepreneurs is maybe not the best option at this point, given the pervasive influence that they have not only in the [United States], but worldwide.

Tim Kendall:  Yeah. I think you bring up a really good point. Let’s leave aside the shadow banning on Facebook, because it’s really a problem. But at least from their stated policy, Facebook has up until recently said, basically, “we’re going to allow anyone to say anything, and by virtue of just allowing more and more speech on a diverse spectrum, this thing will sort itself out.” And I think there’s some real merit to that. And certainly the premise of free speech, and particularly of political speech, at least in the origin of what was intended with that right, was that our leaders or potential leaders could say whatever they wanted.

That premise was created in a world in which you had a small number of discrete institutions [that] were doing the following. If you said, “the Holocaust isn’t real,” and you were a leader or prospective leader, and I said, “No, the Holocaust is real.” Or maybe we should pick something a little less charged. But let’s just stick with it.

Chris Kresser:  It’s hard. Everything is charged now.

How technology is evolving for the worse

Tim Kendall:  Everything’s charged now. Here’s what’s different today from what was the case when these sorts of laws and rights were put into place. What we could count on back then was that these institutions, by and large, would write an article, contextualize it, put in a quote from you, put in a quote from me, and explain the context of my view: “What is it about? What are the varying reasons why someone might deny the Holocaust?” And we’d be left with a set of reasons why someone might come to this view, even though the view isn’t factually substantiated. And then a similar treatment of the alternative view: the Holocaust is real, and here are the data that support it. And that is what got distributed out to people.

Chris Kresser:  Right. But here’s the other problem in my mind: at that time, there were no profit-maximizing algorithms.

Tim Kendall:  Exactly.

Chris Kresser:  So what was made clear in The Social Dilemma, and in many other venues, is that part of the reason America, and the world, have become so polarized over the past few years is these algorithms. They had no original intent to do this; their only mission was to maximize the number of clicks, because by doing that, they’re going to increase the profit of the advertisers.

Tim Kendall:  Yep.

Chris Kresser:  And the algorithms quickly learned that people clicked more on articles and stories that were more polarized, that were further to the left or further to the right.

Tim Kendall:  Yes.

Chris Kresser:  And so even if you have this environment that permits free speech, people are not seeing the full range of that speech.

Tim Kendall:  They’re seeing the poles.

Chris Kresser:  They’re seeing the poles. And I think the appreciation of nuance, and seeing different sides of an argument, has decreased to probably an all-time low in human history because of that. It seems like everything now is either black or white, up or down; there’s no capacity for understanding nuance or seeing different sides of an issue, because people have been conditioned, almost like Pavlovian dogs. And I think there’s also an evolutionary reason there.

Tim Kendall:  Absolutely.

Chris Kresser:  The arguments that are black and white, I’m forgetting the exact psychological term, but it’s basically conserving psychological resources. If we have to spend less mental energy to think about something or figure it out, from an evolutionary perspective, that would have probably benefited us in terms of survival. So we’re drawn to arguments that are simple and straightforward, even if they’re wrong.

Tim Kendall:  So Chris, let me ask you, and I would definitely like to talk about the business model, because the film asserts, and I am generally aligned with the view, that the business model is probably at the root of a lot of this. But let’s assume that we can solve that, just for a second, because I’m curious. Let’s take the example of the ketogenic diet: what would be best, in terms of a post that advocated the ketogenic diet? Should it be allowed with qualifiers that the American Heart Association, or these organizations, do not endorse this, but that these groups have traditionally been many years behind the leading edge of science? Is that the nuance that you want to apply?

Chris Kresser:  That’s a good question. I think I would have to think about it further. My knee-jerk response is, I don’t know that we even need that level of qualification because nobody put those organizations in charge other than themselves, and they were not elected.

Tim Kendall:  Fair.

Chris Kresser: Nobody decided that those organizations, which take enormous amounts of money from pharmaceutical companies and other companies that have a vested interest in producing certain outcomes, should be in charge.

So maybe I’m getting a little too granular in your particular example, but something I try to do as a writer, when I write articles, is to present other points of view. And since you [have followed] my work for a while, you probably know this about me, but I’m not super dogmatic. I’m not a gung ho keto guy, or a low-carb guy, or [a] fasting guy, or any guy. Because I’m a clinician and I’ve worked with patients for over a decade, for every person that the ketogenic diet helps, I see firsthand when it doesn’t work. And so I have that experience that keeps it real for me, where I’m like, “Hey look, this can really help with weight loss for some people with blood sugar issues; it can be a life-changing thing for kids with epilepsy.” But guess what? If you’re a woman who’s burning the candle at both ends and under a lot of stress, it actually might take you in the wrong direction.

So I don’t know, Tim. In that situation, should every article have a sort of box that says “this doesn’t work for everybody”? I can see that being laborious. And who puts that there? Who decides what that box says? It’s tricky. It’s really tricky. And I’ll just say, too, I don’t have the answers. I’ve thought about this a lot and I have opinions. And I imagine you would say the same, but I think we’ve got a ways to go to figure these things out. I just want us to be having these conversations. I think that’s vital.

Tim Kendall:  I agree. Just going to the topic of news for a moment, because I think these same challenges apply to political news, or just general news in terms of bias and leaning. And there are actually some startups that have emerged in the past six to 12 months that are really trying to go after how we can have services that provide neutral and unbiased news, or at least provide the context on various articles.

In fact, there’s a browser plugin that just came out yesterday, and I will send you the name of it in case you want to put it in the show notes.

Chris Kresser:  Please.

Tim Kendall:  That basically, as you go across the web, and you look at various articles, it shows the bias. So it’ll show “this is actually relative to center,” or, “this is heavily right leaning,” or, “this is slightly right leaning.” And then some of the other bias dimensions that I’ve seen—because right and left is an important spectrum to understand—is establishment versus anti-establishment.

Chris Kresser:  Yeah.

Tim Kendall:  And where does it sit on that spectrum?

Chris Kresser:  Well, I mean, how evidence-based, too?

Tim Kendall:  What did you say?

Chris Kresser:  How evidence-based? That would be another spectrum that would be interesting.

Tim Kendall:  That’s what I was thinking for the keto thing: what’s the evidence? What’s the peer review? Was there a double-blind controlled trial? And then, if there are institutions named, is there a disclosure of lobbying dollars? All of that is knowable information. It exists out there, and these are objective facts. I don’t think technology can solve everything, but if I had a truth teller that could follow me around the internet and create transparency around institutional claims, fact assertions, etc., that would be pretty useful.

Chris Kresser:  I agree. As we’re having this conversation, I’m thinking more and more about the business model as a potential solution here. Because, even with what you’re talking about, the danger is that there’s then no room for new ideas that challenge the status quo and the dominant paradigm. Those ideas quickly get squashed and receive very little attention.

Tim Kendall:  Yeah.

Chris Kresser:  In medicine, I can point to so many examples of where the status quo was fervently defended for so many years and then it turned out to be wrong.

Tim Kendall:  Right, right.

Chris Kresser:  And one of the best examples is the idea that ulcers were exclusively caused by stress. There were these two physicians in Australia who challenged that idea at a gastroenterology meeting, arguing that ulcers were actually caused by an infectious organism, Helicobacter pylori. They were laughed off the stage, literally, completely excommunicated from their field, and not taken seriously for years. It wasn’t until one of them actually swallowed a vial of this bacteria, infected himself, developed an ulcer, and then took antibiotics, which treated the infection and healed the ulcer, that the idea got traction. And even then, it was still years until that became widely accepted as the pathogenesis of an ulcer.

So, in that scenario, you can imagine, if they wrote an article about this theory and then it was fact checked, and there were no studies to support it yet, or [the fact checkers] said, “this is way on the anti-establishment end of the spectrum and there’s not enough science to support it,” it doesn’t get any attention and it just dies there. I think you and I are really on the same page here. I’m just playing devil’s advocate, not because I don’t think that’s a valid thing to explore, but to point out the other side of it and how that could go wrong depending on who’s in control of that information.

Tim Kendall:  Yeah. I think it’s an interesting discussion. The technology that I’m imagining would have to have an affordance for outlier emergent views. Right? So like, I know about this ulcer issue because there’s a similar debate, and you’re familiar with it, around back pain. Is lower back pain caused by what’s actually going on in the spine or the disc? Which is what conventional medicine believes is the case 99.9 percent of the time. Or is there a psychosomatic contributor there? There’s a lot of compelling data that suggests that a lot of our back pain and back issues are emotional and somatic issues.

Chris Kresser:  Right. Or at least not structural. It could be some other inflammation or something happening. But you’re referring to the study at UCLA, where they took people with back pain and without back pain, gave them [magnetic resonance imaging], and found lots of slipped discs and structural problems, but there was no correlation between pain and the structural problems.

Tim Kendall:  Yeah.

Chris Kresser: So let me turn this back around. One of the arguments or proposals that I’ve seen is shifting to a system that’s not advertising-supported, where people are using very small amounts of cryptocurrency, or something similar, to pay for content that they find valuable. Do you think that is a viable solution? Or, with algorithms as they are, would that still just continue us along this echo chamber path, where algorithms are presenting the same kind of content that people would already be inclined to see, and people are just paying for it in a different way? Or is there a way that the algorithms could be removed from the equation in that business model, where people are paying for content instead of advertisers paying for it?

Tim Kendall:  I certainly think it’s possible. One of the solutions to the advertising-based business model, which I think there’s increasing alignment around being problematic, is likely some combination of subscription joined up with, and this gets out of my depth, a cryptocurrency model. That would enable users to be compensated for information about themselves, potentially, exchanging it on a one-to-one basis with someone who wanted to reach them as a marketer.

Chris Kresser:  Right. And that’s entirely by choice.

Tim Kendall:  And that’s entirely by choice, and it’s opt in, not opt out.

Chris Kresser:  Right.

Tim Kendall:  I don’t have the clear answer. I think the raw materials of the alternative business model that we need to go to are cryptocurrency, some notion of wisdom of crowds, and some notion of a subscription, right? Some notion of the user paying. And I think, importantly, one of the principles there is that the service has to be aligned with my best interests as a human being, which is that I want to live a happy, healthy life of liberty. And if the service is not aligned with providing that, then we have a problem. And the issue with the advertising-based business model is that that is just fundamentally misaligned. It’s funny, this was on a Netflix earnings call; it didn’t show up in the Netflix film The Social Dilemma, but one of the executives was asked about their competition. They were expecting him to refer to Disney+ or HBO, and he said, “Oh, we don’t worry about that. Our competitors are sleep and your relationships.”

Chris Kresser:  That’s right. And it’s amazing that he would just come out and say it that brazenly and transparently.

Tim Kendall:  I know. It was a couple [of] years ago. I don’t think he would have said it today. But that is the starkest illustration of the misalignment of incentives, right? A misalignment of interests. They’re like, “my interest is for you to sleep less and have fewer social relationships, because that means more money for me.” That is antithetical to what is in your best interest in the medium and long term.

Chris Kresser:  Yep.

Tim Kendall:  And so that’s the best test, I think, that you need to run on any business model, in terms of understanding, okay, is this going to end up in a bad place? Because if there’s a misalignment of interests, we ultimately get in trouble. And that’s true of cigarettes.

Chris Kresser:  Medicine. That’s a huge issue in my field.

Tim Kendall:  It’s a huge issue in pharmaceuticals.

Chris Kresser:  Yeah.

Tim Kendall:  So yeah, I mean, we’ve got to figure that out. What I imagine could be a way that we get there, practically, is you look at fossil fuels, and you look at the auto manufacturers, and you look at the awareness that crept in over time with films like An Inconvenient Truth, and so forth, that led us to some alignment—not complete alignment—that, look, fossil fuels were extractive and were not in our medium- and long-term best interest. And so it was somewhat adversarial, but the point is that the government put incentives and penalties in place to get auto manufacturers to very slowly move toward electric, to segue from fossil fuels to clean energy.

I think we’re going to have to do this in much shorter order than the segue from fossil fuels to clean energy. Ideally, what would happen is Mark Zuckerberg, Jack Dorsey, and Susan Wojcicki, who runs YouTube, would get together with heads of government and consumer advocates for some sort of conference, convention, or summit to talk about what’s going on, look at the facts, and try to get aligned on [the] facts. Because if we can get aligned on [the] facts, I would like to believe that maybe there’s a way that a path could be co-created out of where we are. And it can be joint responsibility, because it is joint responsibility.

We are here because consumers have allowed us to get here, as individuals and collectively. Governments have done a poor job of regulating across a number of dimensions. Companies have operated, I think, with some real negligence. So there’s shared responsibility, and ideally there would be some reckoning like that. What if we could figure out a way, through incentives and penalties, carrot and stick, to segue into a model that is the clean energy version of the business for these social media companies? And why would we do this? Because it’s existential. We need a Paris Climate Summit and then a Paris Accord.

Chris Kresser:  That’s right. It is existential. And I think very few people have recognized that, prior to the release of The Social Dilemma. And I do think that that’s really changed. Just speaking to some people in my life, I know it had a huge impact and really opened their eyes to the extent of the problem, and the scale of the problem, and how far beyond just personal use of these devices it goes. And it is helpful to think about other industries or fields. Like food is an interesting one to consider. Very close parallels between how the manufacturers of processed foods understand very well how to exploit our inherent human vulnerabilities, right? We have this almost insatiable craving for highly rewarding calorie-dense foods, because that actually protected our survival in a natural environment where food scarcity was a much bigger problem than food abundance, right? And so if we sought out sweet, salty, fatty, calorie-dense foods, then we were more likely to survive periods of famine or food scarcity. But then when you’ve got Costco and 7-Eleven, and Walmart everywhere, that totally backfires on us.

And we know that these processed food companies employ neuroscientists who study things like mouthfeel, and who then inform the development of those processed food products so that they’re maximally addictive. You and I are probably old enough to remember the [Lay’s] slogan, “I bet you can’t just have one.” Yeah. They’ll win that bet. And they know they’ll win that bet because they have a lot of science behind it. And so, some steps have been taken in the food industry, and this has been controversial politically. One is the soda tax, which has passed in some places. And the idea [is], look, we know that this is not good for us, and we also know that many people either don’t know that or can’t stop themselves from consuming these products. So the government is stepping in to play a role in protecting human life, just as it did with seat belts in cars. And, of course, you have opinions that span the political spectrum about what the role of government is, and whether that should be the role of government.

And then I recently heard that the city of Berkeley was the first municipality in the entire [United States] to regulate the foods that can be displayed at the checkout counter. I guess they had done studies and found that when you get up to the front of the checkout counter, what do you see? You see candy, chocolate, all kinds of stuff, and people just grab it and throw it in their cart almost unconsciously. And so those are two examples of, in this case, more penalties or regulation, not so much incentives. But penalties that at least those local governments decided were in the protective interest of humanity.

Tim Kendall:  And I’ve certainly heard the paternalistic argument about government imposing, or stepping in. But look, it’s ultimately the government and the taxpayer that are going to have to pay for the hundred million people in the United States who are pre-diabetic or diabetic.

Chris Kresser:  That’s right. That’s right.

Tim Kendall:  And this issue isn’t dissimilar, from the standpoint that there clearly are causal links between social media usage and anxiety and depression. And you know, anxiety and depression lead to heart disease.

Chris Kresser:  Yeah, let’s use that as a segue to talk a little bit more about the personal impacts, and also what steps people can take personally. Because I think it’s pretty clear that the solution has to be multifaceted. No individual is going to overcome this problem on their own. And so it’s probably going to need solutions at the individual level, the family level, the community level, all the way up to maybe even a global consortium, like you’re saying. Because this is not a problem that recognizes any borders, and these companies have a multinational influence. And so any solution that’s viable is going to have to span all of those geographical boundaries. And we’re not going to solve that today as much as we’d like to.

And so you obviously recognize the value of individual action; that’s really what Moment is about. And you came to your own personal realization of how it was impacting you as a person and as a father, and I did, as well. And so let’s talk first about what we know about how this impacts people, their mental, physical, and I would even say spiritual health. And then the steps you’re taking on a personal level, and that you take with your family. I can share some of what I’m doing and have found to be helpful, as well. Because we can’t just wait for solutions to be presented. We’ve got to take this into our own hands.

Tim Kendall:  Absolutely. Well, there are two studies that stick out to me the most. One shows the correspondence between increased social media usage in children and the impact on the prefrontal cortex. It actually shows thinning in the prefrontal cortex corresponding with the volume of social media overuse. That’s obviously scary.

Chris Kresser:  The prefrontal cortex is basically what makes us human and just differentiates us from other animals on the planet.

Tim Kendall:  Yeah, yeah. And this is where I’m clearly out of my depth, but at least intuitively, negative brain changes when you’re a kid are significantly scarier than when they happen later on. You certainly see that in some of the football data about kids who play contact football between the ages of 10 and 15; it’s a way worse situation than if [the damage] comes later.

Chris Kresser:  Right, because the brain is still developing and it’s very plastic, and for lack of a better term, impressionable at that point.

Tim Kendall:  Yeah. And then the second study shows, again corresponding with the magnitude of social media usage, that the greater the usage over a certain threshold, the more thinning in the gray matter around the amygdala. Not unlike the thinning that you see in cocaine addicts.

Chris Kresser:  Right.

Tim Kendall:  And those are both, as I understand it, peer-reviewed credible studies. I’ll let you be the judge of that, Chris. So, look, the data is clear. These services, if left to their own devices and without us putting in reasonable limits, make us sick.

Chris Kresser:  Yeah, I would say the deterioration of sleep duration and quality is probably another one of the most alarming health impacts. Something like 70 or even 80 percent of millennials sleep with their phone, either on their pillow or on their bedside table. And it’s not their phone in airplane mode, used as an alarm clock. It’s their phone fully on, and they’re waking up and responding to texts. I had Dr. Matthew Walker from Berkeley, [who’s] probably one of the foremost sleep scientists in the world, on the show. Folks who are listening, if you haven’t listened to that episode, please do. Yes, you’ve heard me talk about the importance of sleep, but listening to him talk about it really drives it home. Sleep deprivation is probably one of the most significant health epidemics that we’re facing right now. And as you mentioned, it’s even more serious in kids and teens because their brains and bodies are still developing. And the studies on how much sleep kids need, versus how much they’re actually getting, are horrifying. And that’s largely due to technology.

One thing that often gets overlooked is just basic physical safety, right? Texting while driving, I think, is the number one cause of death among teenagers in a certain age group, or at least number two or three. And when something is the number one or even number two cause of death in a particular age group, you’d expect to see a lot of discussion, maybe even regulation or legislation. Think of how much you see about heart disease and cancer and some of these causes of death in adults. So yeah, it’s a massive problem.

Tim Kendall:  Yeah. The other piece of data that I think is really telling, and is in the film, is that the suicide rate among 10- to 14-year-old girls in the United States has gone up three-fold in the last 10 years. Prior to 2010, I think it had been on the decline for decades. And hospitalizations for self-harm in that same group went up four- or five-fold over the same period.

Chris Kresser:  And I would say this is harder to quantify, though I know some studies have tried: just self-confidence, self-worth, feeling comfortable in your own skin. If you’re a teenage girl and you’re posting a photo on Instagram, and you get only a handful of likes, what does that do for your self-esteem? This was really well done in the film. Then you’re spending all of your time thinking about how you can change your appearance; in the case of the film, the girl is changing her expressions and doing the sort of pouty lips. It completely affects young people, and everybody, and their sense of who they are. To me, that’s one of the most dangerous and scariest parts of this, both as a human, and also as a parent of a now nine-year-old girl.

So let’s talk a little bit about when you had your wake-up, personally. How did you shift your use of these technologies? What kind of backstops did you put in place? And now, as a parent, how are you thinking about it? Because I have a presentation that I give on technology addiction, and I always start it by saying, “What’s the dirty little secret of Silicon Valley?” All of those men and women send their kids to Waldorf schools, which don’t allow the use of technology. And when we were in the Bay Area, our daughter was either homeschooled or went to a Waldorf school. So I very much relate to that. What do they know that everybody else doesn’t? And that leads to an interesting question: how can we protect our kids from these technologies?

Ways to implement behavior change

Tim Kendall:  Yeah. Well, I’ll say that, individually, the honest answer is I still struggle. I think about this problem all the time, and I’m working on it all day long. And I still have challenges keeping my usage to a level that I’m okay with and that doesn’t have a negative cognitive or psychological effect.

The things that have been the most helpful for me, I would say, I classify in kind of the “hard limits” bucket. My issue with the setup of Screen Time [on iOS] and the equivalent Android service is that they don’t allow you to set hard limits. They allow you to dismiss anything you’ve imposed on yourself at will. And that is problematic. I use the metaphor of someone who’s addicted to cigarettes and saying, “Hey, here’s some Scotch tape. Why don’t you just put a piece of tape on that pack of cigarettes? That’ll help you not smoke that next cigarette.”

So what does that get to? And you’re familiar with this example. I battle sugar addiction. If you look at the research, the solution that’s tried and true is you’ve got to get [sugar] out of your house or your apartment. You’ve got to create friction between the impulse to eat the sugar, and then your ability to actually get it, put it in your hand, and put it in your mouth. Getting it out of the house is a tried-and-true, and very effective strategy that often gets bucketed as pre-commitment, right? How do you pre-commit to something and then kind of throw away the key? So this example is actually in the film, and the reason it’s in the film is because I told the filmmaker about my experience with the kitchen safe. What is the kitchen safe? It’s a plastic safe with a timer on it that has no back door. And the reason it’s called the kitchen safe is [that] it was originally designed for dieting, for people who wanted to put muffins or sweets in something, and the timer can go as long as several days, if you’re trying to really create discipline. So that kitchen safe was added into the fictional story in the film, which I think was the right thing to do.

And so I basically have used that, because I can come home from work and put my phone in this safe and say, “You know what? I don’t want to touch this thing for three hours, no matter what.” So I set the safe to three hours, I press “start,” and I can’t take it out. And I do think that, in terms of psychological well-being, it’s fundamental to “fast,” if you will, right? Take sabbaticals, or fast from your phone.

A more recent hack that I love is that you can actually set this up with Screen Time on iOS, and there’s a similar setting on Android. It’s slightly convoluted, but I can explain it in 30 seconds. You go into Screen Time in Settings, and you select the apps that you want to have on your “fasting phone,” right? In my case, that’s text messaging, phone, weather, and calendar.

Chris Kresser:  Yep.

Tim Kendall:  And then you set Downtime to the window of time when you want your phone to be less functional, except for those apps. Mine is set from 3:00 in the afternoon until 7:00 the next morning. That’s an aggressive window, but it’s what I wanted to try to do. And then you reset your Screen Time passcode: you go to change the Screen Time passcode, you hand the phone to your partner, your wife, or your kid, and they set the passcode so you don’t know it. So that is a way of using Screen Time to set hard limits, and you set hard limits by throwing away the key.

If you’re really serious, and you really want to take time away from your phone, that is one of the best ways. The third way, and full disclosure, I’m an investor in this company, is to carry a second phone. Some people use their Apple Watch in the same way, right? They’ll just walk around with the watch and leave their phone at home, if they have cellular connectivity on the watch. There’s also a company called The Light Phone, [and] they make a credit card-sized phone. It’s just a phone, along with text messaging and podcasts, and they’re going to add ride sharing. It really can operate as your auxiliary phone. Some people have it as their primary phone. But you can toggle back and forth. And that’s also a really nice way of throwing away the key, in a sense, because you can really leave your phone at home, or leave it in a drawer, or put it in the kitchen safe, and then just have your Light Phone.

Chris Kresser:  Those are all fantastic strategies; I use some version of each of them. I think another one that’s been really helpful for me is doing regular digital detoxes, where I have a dedicated period when I’m not with my phone at all. Sundays are an entirely tech-free day for us.

Tim Kendall:  Awesome.

Chris Kresser:  We do one every week. And then I do regular, longer periods, ranging from two to three days to two weeks, throughout the year. And for me, those are really powerful. Many of us have a hard time even remembering what human experience is like unmediated by a smartphone.

Tim Kendall:  What do you notice at the end of a two- to three-day or two-week fast, or sabbatical?

Chris Kresser:  Yeah.

Tim Kendall:  Like, when you get done with it, what’s happening?

Chris Kresser:  The most common experience is a sense of dread about coming back, honestly. People often say, “Isn’t that really hard?” And I’m like, “Yeah, it’s really hard to come back to the kind of relationship that I have with my phone.” Even though I, like you, sometimes still struggle, on the spectrum of phone addiction and phone use, my use is pretty minimal. I’m pretty controlled and pretty tightly regulated.

Tim Kendall:  Yeah.

Chris Kresser:  But still, even with that, I still resent the ways that it affects me, and I still struggle with it. I’m actually doing a tune-up right now, where I periodically revisit my protocols, and I’m doing Catherine Price’s workbook, How to Break Up With Your Phone.

Tim Kendall:  Great workbook.

Chris Kresser:  Great course and great workbook. It’s a 30-day program, so I’m going through the workbook, journaling, and making observations about everything. But for me, because I’ve done those detoxes so many times, usually there’s only a couple hours’ adjustment period, or maybe a half day, where I’m, like, twitching because I don’t have my phone. And then after that, I really settle in and remember and appreciate what it’s like just to be in the world without a phone. And often on those detoxes, though not always, I’m outside of my normal environment. I might be on vacation, or [in] a retreat-like setting, or somewhere that makes it a little bit easier. And I recommend that people do that, especially for your first detox. I know that there are actually planned trips now, and companies that are advertising these digital detoxes.

Tim Kendall:  Yeah, they take your phones.

Chris Kresser:  Exactly. You check your phone; they put it in a lockbox, and they don’t let you have it at any point during the trip. I think that’s the right direction. And some people are happy to pay for that because they know themselves: if they have access to the phone while they’re on vacation, they’ll pull it out and use it. So that’s been a big deal for me. Another one, which you’ve kind of already solved if you put your phone in minimal mode, but which matters if you’re not going to do that, is adjusting your notifications. Of course.

Tim Kendall:  Of course.

Chris Kresser:  I have very similar settings to your kind of minimal mode. I have phone and text message notifications on, and I think even that’s [a] gray area. I’m old enough that most of the texts I get are pretty important, like, “Hey, I’ll meet you here at this time.” Most of my friends are not texting on these long threads endlessly, all day long. Then I have Calendar and Reminders notifications for meetings, or events, or things like that. And that’s basically it for notifications. I find that I don’t want to be constantly interrupted. I sometimes refer to the phone as the interruption device. If I’m working, or I’m out having fun with my daughter, mountain biking, skiing, whatever it is, do I really want to be interrupted by something that somebody posted on Instagram or Twitter, or by an email from Amazon confirming something that I bought? Or by anybody, really? So that’s been really helpful for me, also, on a personal level.

How social media has impacted children’s development and how we can protect our kids from these technologies

Tim Kendall:  Yeah. I’ll share another strategy, which transitions us a little bit to what I do with my kids and my family. Pre-pandemic, we hosted a bunch of family members in this house for Christmas. And we just set up a norm that in the common space, where we eat, cook, and hang out, there would be no devices the whole time. If you want to go use your device, no problem, but go to your bedroom or a different part of the house. And I said, “We’re not going to turn off the Wi-Fi; we’re not going to turn on a cell phone jammer.” I actually do have a jammer that I ordered and have sometimes threatened to turn on, but no, we didn’t do that. In the common space that we share, there are no devices. And that is a pretty material change in terms of tenor and behavior. It’s very contagious: when one person picks up their phone and starts looking, everybody else gets twitchy and grabs their phones.

Chris Kresser:  Yeah, like, “what am I missing?”

Tim Kendall:  And actually, what prompted us to set up this norm is I would find myself and my family sitting around the table, all on our phones.

Chris Kresser:  Yeah.

Tim Kendall:  And that’s just a sad state.

Chris Kresser:  I like that; we do that, too. This is definitely in the realm of family boundaries. So we don’t have phones at the meal table.

Tim Kendall:  That’s great.

Chris Kresser:  And we eat most of our meals at home, together. Also, no phones in the bedroom for my wife and me. I like the idea of extending that to the living room and common areas. I’ve seen some people advocate for keeping the phone in a “parking spot,” a dedicated charging station somewhere in the home.

Tim Kendall:  Yeah.

Chris Kresser:  And then you have to willingly, and consciously go over and use your phone. A lot of people will think this is draconian, and maybe just totally over the top, but we decided that our daughter, who’s nine now, won’t have a phone, or access to a phone, and certainly not one that has social media or anything, until she’s 15 or 16. And she knows that. We’ve talked to her a lot about it, and about the reasons behind that. She doesn’t like it, like any other kid who sees these devices that beep and flash. And of course, she has friends now that even have their own phones, or an iPad, or something like that. And, like every other human or kid, she wants what she can’t have. And yet at the same time, when people ask her why she doesn’t have a phone, it’s really interesting to hear her articulate the response. She really does get it. And she gets it even through this small experience of screen time, which for her is one hour a day and that’s really just for listening to audiobooks.

Tim Kendall:  That’s not really screen time, yeah.

Chris Kresser:  It’s not really screen time. And then we have a family movie night on Saturday nights, and we watch a movie together that she gets to pick. So, as you can imagine, I’ve seen pretty much every Disney and other kids’ movie possible, and some other movies, as well. But that felt good for us, because it’s a family experience. It’s a time that we get to enjoy together, and she knows when to expect it. She gets ice cream that night, which is not something that typically happens in our house on any other night. And so it’s become a bit of a ritual, and something that she looks forward to.

So yeah, that’s where we landed. And again, I know that that’s pretty draconian for some folks, and they’re not going to vibe with it. I’m curious what you’ve decided in your family and how you came to that decision?

Tim Kendall:  Well, my girls are young; they’re four and six. So I haven’t hit the hard times at all. I will share that even at that age, my wife and I did a little bit of trial and error with time on the iPad on longer drives. And what we concluded was that, obviously, if we gave them both iPads to watch shows when we were driving for an hour or two, the drive was much more pleasant.

Chris Kresser:  Right.

Tim Kendall:  And a lot less work.

Chris Kresser:  Yeah.

Tim Kendall:  But then when we got to wherever we were going, the kids were monsters.

Chris Kresser:  Yeah, you’ve got to deal with the fallout.

Tim Kendall:  You’ve got to deal with the fallout. So through trial and error, we decided it was better and easier to establish a new norm, which is that we weren’t going to do that. There’s a podcast called Reading Bug that makes up amazing stories, and we listen to those. They’re educational and useful, and the kids like them. It doesn’t feel fundamentally different to them, but it’s very interesting cognitively, because they’re still having to use their imagination. Behaviorally and cognitively, when we get out of the car, they’re fine. They’re acting normal. So that’s one thing we’ve learned over time, and we try to maintain it as a norm. Not that we’re flying anywhere lately, but pre-pandemic, on airplanes, we would just have them color or draw, for the same reason. It’s just easier over the span of a 12- or 24-hour period.

Chris Kresser:  Yeah, absolutely.

Tim Kendall:  And then, I think what we will try to do is this “Wait Until 8th” pledge, a nationwide framework where you try to get the parents of your kids’ classmates to all agree, “we’re not going to give our kids smartphones.”

Chris Kresser:  Right.

Tim Kendall:  That pledge probably needs to be revised, because really, what I think is most harmful is just social media, and it doesn’t really matter whether it’s on desktop, tablet, or phone. It’s like, “Let’s not give our kids these accounts until this date. You want to give your kid an iPhone without that account? Have at it.” So that’s what I would like to do to try to get alignment. This is a really tough issue. You’re already struggling with it with your nine-year-old, and by the time she’s 13, 14, 15, it’s going to be really painful, because kids have these virtual social lives that live and breathe on these services that they’re excluded from.

Chris Kresser:  Yeah, I wanted to talk about that.

Tim Kendall:  Somebody’s got to solve that. One potential solution is for someone to create a parallel service that’s super private, with norms and rules, that the school has administrative control over, that parents have some control over, [and] that there’s complete visibility into. I think something like that needs to get created.

Chris Kresser:  I agree.

Tim Kendall:  Because otherwise, you get a bifurcated situation. Often, parents who have four or five kids are just like, with their youngest, “Look, we give up. They can do whatever they want.” Then your daughter’s in class with that youngest kid, who’s going nuts [on these services], and you as parents can’t totally get aligned. There’s got to be some middle ground that gets created, I think, to help solve this.

Chris Kresser:  I think for most people, that’s going to be a big issue. For us, it’s a little bit less so, because we’ve educated our daughter non-traditionally. She’s mostly been homeschooled; she went to Waldorf for a couple of years. And now she’s doing a very non-traditional program that is loosely Montessori-based, but has horses and skiing involved.

Tim Kendall:  That’s amazing for her. It’s just not totally practical for everyone, but great for her!

Chris Kresser:  Yeah, it’s not practical for most people to be able to do that. And I recognize that; it’s not like I’m sitting here saying, “Oh, just do this with your kid.” That’s obviously not going to work for most people.

And it does make it easier that the people who are drawn to those kinds of things, and the kids that she’s hanging out with, tend to be from families that have similar values around this stuff. So, for example, we went on a trip and one of her best friends came with us. We were down in Moab at Arches and Canyonlands, and her friend’s parents feel exactly the same way that we do about technology. So we don’t have to deal with the dynamic where her friend is bringing her phone, and Silvie (my daughter) is looking at her friend using her phone the whole time. That would just not work at all. But I think what you said is really true. This is the way that we’ve approached food with her, as well. We don’t want to be so restrictive that she doesn’t learn to develop her own boundaries around these things. At some point, she’s going to have to go out into the world, she’s going to get her own phone, and she’s going to have to know how to deal with these things. And so, we’re talking about that and how we can approach that as she gets older.

My feeling is still that at her age, that’s not a fair fight. It’s not even a burden that I want to put on her at this point. But certainly, as she gets to be older, we’ll continue to explore how we can help her to develop her own healthy relationship with technology that just comes from her own understanding and experience of how it affects her. But we kind of feel like the stronger foundation and development she has without [technology], the easier it will be for her to set those limits herself, when she gets to that point.

Tim Kendall:  I think that’s absolutely right. You can’t over-shelter. So it’s tricky. It’s such a tricky balance for parents. But ultimately, I do think that one of the best things that we can do, especially early on, is protect their cognitive development. So then they can exercise good judgment later.

Chris Kresser:  Yeah.

Tim Kendall:  And so I don’t know exactly what those age ranges are. But I do think that being a little bit draconian, especially early on, pays dividends in terms of them being able to exercise their own judgment.

Chris Kresser:  Absolutely. And even from a kind of spiritual sense, and this isn’t necessarily spiritual in the way that people would use the term, just being able to be in your own skin without continuous input or distraction coming from a device is a fundamental skill or capacity that all humans in the history of time have developed. Our generation largely still developed it. But subsequent generations are the first that could ever possibly go through their entire childhood, and even adult life, without developing that capacity. And so, for me, creating lots of time and space for her to just experience being herself in her body, without the interference of those devices, is probably the most important thing that we’re trying to give to her as a child.

Tim Kendall:  So important.

Chris Kresser:  All right. I’m so appreciative of your time. I know we’re going to wrap it up here quickly. I’m going to pivot again and ask you a few more, perhaps philosophical, questions.

Tim Kendall:  Okay.

Chris Kresser:  You may not have the answers. But what do you wish you knew five or 10 years ago that’s now very clear to you? And with this question, I’m aiming at folks who are listening who haven’t seen The Social Dilemma or who haven’t thought about this stuff quite as much. What is something that fundamentally changed your outlook on all of this that you weren’t aware of five or 10 years ago?

Tim Kendall:  Well, why don’t I, just for a moment, zoom out beyond even big tech. I think what I didn’t understand 10, 15 years ago was that if you look at the consumer products or services that have been the most wildly successful and profitable over the last several decades, they prey on human weakness.

Chris Kresser:  Right.

Tim Kendall:  Right? That is what Coca-Cola is. That’s what McDonald’s is. That’s what tobacco is. And those are really good businesses, because in a normal business, one that doesn’t prey on human weakness, a big part of my cost is acquiring you as a customer, and another big part is paying to keep you around. When I prey on human weakness, it’s pretty easy to get you to come over and be my customer, and it’s pretty easy to get you to stick around. And what that translates into are these wildly profitable businesses. So I think you always need to be skeptical of any business that’s just wildly profitable. I’m not trying to be a total anti-capitalist.

Chris Kresser:  Yeah.

Tim Kendall:  But I do think that wildly profitable businesses often have distorted something in terms of their approach to the customer, or how they’re delivering the service.

Chris Kresser:  Yeah, it’s an addiction-based business model.

Tim Kendall:  It’s an addiction-based business model. Marc Benioff, the founder and CEO of Salesforce, is someone I really admire, and he has begun to talk more and more about a slightly different version of capitalism that companies need to start to practice, which he calls “stakeholder capitalism.” [That] is when there are multiple stakeholders in the equation. In the case of Facebook or other social media services, there’s the advertiser; they’re a stakeholder, [and] they pay us. There’s the company; that’s us, the employees and owners, if you will. And then there are the users, the consumers. We have a responsibility to all of these stakeholders. We can’t make our stakeholders sick; we can’t make our stakeholders’ lives worse after using our product.

Chris Kresser:  I like that a lot. I just want to interject here, like, that’s a plain-language way of saying eudaimonics. Have you heard that word before?

Tim Kendall:  I haven’t.

Chris Kresser:  This concept of eudaimonia, which is a Greek word that means “human flourishing or prosperity.” An economist named Umair Haque, and I’m not sure how to pronounce his name, created a framework called eudaimonics which, unlike traditional capitalism, doesn’t just focus on the single goal of profitability. It optimizes for five different ends. The first is the ratio of an organization’s well-being to its income. The second is how much real human wealth an organization is creating; not just financial wealth, but actual real human wealth, which also considers what he calls illth, the opposite of wealth, subtracting things like deterioration of capital, air pollution, poisoned water, etc., right?

Tim Kendall:  Great.

Chris Kresser:  The third is how much human possibility an organization is realizing; the ratio of realized well-being to the highest possible well-being that you could get to. The fourth is the organization’s net effect on well-being. And the fifth is the organization’s equality of well-being: who has access to what that organization is producing. So it’s a very similar framework, and one that we’ve tried to adopt.

Tim Kendall:  This is an interesting sidebar: we’re actually spending most of our time at Moment developing a new set of services. And they’re really aimed at the following premise. We’ve never been more connected in the history of humanity. And yet, the research suggests that we’ve probably never felt more lonely. That’s pretty upside down.

And so we’ve prototyped five different services, and we’re kind of playing with all of them right now, thinking, “What if the objective function at the beginning of social media weren’t how do I get Chris connected to as many people as possible, spending as many hours as possible on the service, but instead, how do I look after Chris’s social well-being and social health? What would I build then? You know what we think you’d build? A service that would connect you to the five or 10 people that matter most, and then facilitate exchange and self-disclosure with those people to create and sustain closeness.” We don’t know exactly what the shape and size of that service looks like. But look, loneliness is becoming epidemic.

And so we just fundamentally believe that in three to five years, there’s going to be a service that you use, and probably pay for, that looks after and supports your social health. It’s interesting: middle- and upper-class Americans spend all this time thinking about these individual pursuits. “Oh, I’ve got to sleep more.” Definitely important. “[I’ve] got to be mindful; I’ve got to go meditate by myself. I’ve got to go get on my Peloton by myself. I’ve got to do all these things in the name of health.” And you’re familiar with this. There’s great research showing that probably the biggest lever for health, well-being, and longevity is the quality of our relationships.

Chris Kresser:  Yeah.

Tim Kendall:  But we’re the least deliberate about that part of our lives.

Chris Kresser:  Yeah.

Tim Kendall:  So how do you square that? And I think that’s an interesting opportunity, one that we’re spending a bunch of time on, because we think the world needs it.

Chris Kresser:  Well, we’ll have you back on the show when you’re ready to talk more about that.

Tim Kendall:  Yeah.

Chris Kresser:  So, next question: what keeps you up at night? What are you most concerned about, whether it’s the existential impact of these technologies, or your family, your daughters, and how it’s going to affect [the future]? What’s most concerning to you about all of this?

Tim Kendall:  I think there’s absolutely a family orientation. What world will my daughters come of age and grow up in? And what does that mean for the life that they’re going to be able to lead? I was sincere in the film when they asked me what I was worried about and I said, “Well, I’m worried about civil war.”

Chris Kresser:  Yeah.

Tim Kendall:  That is what happens when there’s not a shared truth with a capital “T.” And we are in that reality. As far as I can tell, it’s getting worse, not better. So that’s what I worry about. And I try to think about ways, and you and I brainstormed about it a little bit, too: what is the situation, what are the instruments and services, that can get us back to some shared sense of truth, capital “T”? The erosion of that is incredibly concerning to me.

And as we talked about at the beginning, I just think this issue of unchecked AI, which is really at the root of this social media issue, is existential. I worry a little bit that the world doesn’t quite understand how existential it is, relative to something like climate change, which is also existential and also very serious. But you can’t ignore [technology]. We’ve got to go after both.

Chris Kresser:  Yeah. Because, as we said, if you don’t have a common shared reality, you can’t even have conversations about these problems. And that’s where we are now. It feels like everything is stalled, because the two sides of an issue each think reality is something very different.

Tim Kendall:  It does. Yes.

Chris Kresser:  So it doesn’t really open up any opportunity for compromise, or for meeting in places that would lead to real solutions. Yeah, I share that concern. And I think that if we don’t get a handle on it quickly, it could rapidly accelerate because of new technologies, like deepfakes, that are coming in our lifetime, and probably in the next decade. Imagine a world where you can watch a video, listen to an audio recording, or look at a picture and have no idea whether it’s authentic. Anyone can be made to say and do anything on video that’s made by AI. That’s just such a short circuit to our human brains; we are not equipped to live in a world like that. It reminds me of Stuart Russell, the AI pioneer at UC Berkeley who literally wrote the textbook on AI. He likes to say that most people, when they think of the danger of AI, think of the Terminator, or cyborgs, or androids that run amok or take over society.

But the real risk, and a much more likely risk, is actually the one that’s already here.

Tim Kendall:  We’re in it.

Chris Kresser:  Where you have AI that has influenced the outcomes of elections and that has completely fragmented our society. If we think fake news is a problem now, what happens when fake videos are indistinguishable from real ones? So, all right, we’re in agreement on that. That’s definitely one of the things that keeps me up at night. What is one thing that gives you hope? You alluded to that earlier, but in the face of all of these threats and challenges that we’re dealing with as a society and as individuals, what’s something that has given you hope?

Tim Kendall:  Well, I alluded to this earlier, and I don’t want to let Facebook off the hook; they’re 100 percent responsible for a lot of this mess. But their movement on a bunch of issues in the last several weeks has given me a little bit of hope. They really were not previously willing to acknowledge hate groups. If [a Facebook group] was just about hating a certain group of people, but wasn’t explicitly calling for violence, they were allowing it. And now they’ve really acknowledged, which is so effing obvious, that [a] hate group in and of itself incites violence, and so those need to be looked at and dealt with in a way where the objective function is minimizing violence, hate crimes, and all the downstream effects. So that gives me a little bit of hope.

I think the success of the film gives me some hope, too. There is an appetite to learn about this and come to a shared understanding about what is happening underneath our feet. So that gives me a little bit. I was very surprised; I thought the film would have a decent audience, but look, it came out right before the election, when we had a lot of other things to think about besides this. And I think what happened is, people realized that, oh my gosh, some of this stuff we’re dealing with, this divergence of facts on COVID[-19], racial division, the root cause of some of these, or at least the accelerant, is grounded in AI and social media. So I’m hopeful that that initial message seems to have [been] taken.

Chris Kresser:  Yeah. And seems to span the political spectrum.

Tim Kendall:  It does. It does seem bipartisan.

Chris Kresser:  Yeah. So it’s encouraging to me that there’s a growing conversation around these issues. And I think this has been true for a long time: if you ask most people, “Has the quality of your life, your friendships, your conversations with other people, has all of that improved over the last few years or gotten worse?” I think most would answer that it’s gotten worse; that they’re dissatisfied with the quality of public discussion around the social, political, [and] economic challenges we face. They realize, in some way, that we’re moving backward as a society, as a species.

And to me, the human ingenuity that got us into this mess in the first place will hopefully be what gets us out of it, as well, once our attention shifts in that direction. And that’s really what I think it’s about.

Tim Kendall:  I agree.

Chris Kresser:  It depends on what we’re paying attention to. And if we start paying more attention to the harms that these technologies are causing, then I have pretty solid trust and belief that we’ll come up with some good solutions. But the first step is awareness. And I’m grateful for folks like you, Tristan Harris, and all of the others involved with the film for really raising the alarm here and helping to seed more attention and awareness of this issue. I really do believe it’s one of the fundamental challenges of our time, if not the fundamental challenge that we’re facing as a species.

Tim Kendall:  Well, thank you. I noticed a few years ago that you were vocal on this topic, and I think that’s important. It’s terrific that people are, hopefully, factoring their phone usage and social media usage into their assessment of their overall health and well-being.

Chris Kresser:  That goes without saying for listeners now. But the reason I’m talking about it more is that you could have your diet totally dialed in, you could be doing intermittent fasting and high-intensity strength training, and you could be checking every box there, but if you’re addicted to your phone and have a really unhealthy relationship with technology, you’re not going to be healthy in the way that I understand health. It quickly became the elephant in the room in a lot of the conversations I was having with patients, and in the work I was doing with folks. So, Tim, thank you so much for spending all this time with us.

Tim Kendall:  Thank you, Chris.

Chris Kresser:  I would love to have you back on. That sounded really interesting [when] you were talking about some of the stuff that you’re working on.

Tim Kendall:  I’ll keep you posted for sure. I’d love to come back on and talk about it.

Chris Kresser:  Great.

Tim Kendall:  Okay.

Chris Kresser:  Thanks, everybody, for listening. Hope you enjoyed this episode. Please keep sending your questions in to ChrisKresser.com/podcastquestion and we’ll talk to you next time.