Building H #79: Limited Engagement

Under pressure as Congress debates various interventions, TikTok recently announced that it will limit teens to 60 minutes of use per day. As reported in The Verge, the limit can be overridden by entering a passcode (children under 13 will need a parental code), but, at a minimum, the requirement will add some friction to the user experience and, in theory, prompt some reflection on the urge to continue. TikTok's move sparked more discussion about the broader issue of screen time and the vagueness, or bluntness, of the concept. Not all screen time is created equal, of course, and it is, to be fair, a vital part of any teen's (or adult's) life. Nor is it a simple explanation for all that ails America's youth. In his WIRED piece, "TikTok's Screen Time Limits Are the Real Distraction," Phillip Maciak argues that the "moral panic" about screen time blinds us to what teens actually see and experience on those screens, which is a reflection of the real societal challenges on which we should be focused.

The issue of screen time is certainly more nuanced than counting minutes per day, but the minutes do matter – especially in the context of the activities they crowd out. TechCrunch reported last summer that in 2021 US kids and teens were spending an average of 99 minutes per day on TikTok, eclipsing YouTube (a mere 61 minutes/day) as the leading source of screen time. These numbers aren't necessarily troubling on their own (who among us hasn't enjoyed a two-hour movie, after all?), but when you consider that fewer than a quarter of US high schoolers get an hour a day of physical activity, 44% of 18- to 24-year-olds spend less than 30 minutes a day outdoors, and a third of Americans are getting dangerously low levels of sleep, the concern grows.

Ironically, as Shira Ovide reports in her newsletter for the Washington Post, TikTok's limits might backfire. In a study of TikTok's time-limiting tools, users who set limits actually wound up spending more time on TikTok than those who didn't use the tools. (People are complex.) From our perspective, setting usage limits focuses on the wrong leverage point. Clearly TikTok is an engaging product – so entertaining that there's an outcry to stop people from overusing it. That success is no accident: the combination of behavioral scientists, machine learning algorithms that learn what most engages people, and a business model that depends on engagement is a potent mix, and TikTok delivers the results for which it is optimized. If you actually want people to spend less time on the platform (that's a big if), maybe dial down the level of engagement it produces? (Thanks to our collaborator Alec McMorris for a helpful discussion on this topic.)

Which brings us back to the business model problem. Congress can ban TikTok, but the essential ingredients will emerge in other products (Instagram and apparently now even Spotify have already copied much of its formula). An industry creating products that people tend to overuse, to their detriment, followed by messy attempts to limit consumption, is an old story that has largely resisted easy resolution. The challenge, as Thomas said yesterday in his NextMedHealth talk, is to get businesses to "own the outcome," so that they're accountable for the health impacts their products produce. Doing so might require a regulatory environment in which a business model based on unhealthy overconsumption is no longer profitable – or at least less profitable than a healthier alternative. Easier said than done, of course, but better than facing the same problem, with different actors, over and over again.

We'd love to hear your thoughts on this topic – comments are open.

Read the full newsletter.

Steve Downs