Attention isn’t just personal grit; it’s a public good. Treat short‑video feeds like a safety‑critical system: labels, defaults, audits and data access across jurisdictions.
Modern economies leak attention like a faulty pipe. Average on‑screen focus now lasts ~47 seconds before a switch; daily reading‑for‑pleasure has fallen by ~40% over two decades. Meanwhile, nearly half of US teens say they are online “almost constantly”. None of this proves simple causation, but it does establish a base rate that warrants action. In public‑health terms the markers are present: high prevalence, meaningful severity (especially for adolescents), clear externalities (lost learning, error‑prone work), modifiable risks (autoplay, infinite scroll, push alerts) and feasible interventions.
The market failure
Short‑video platforms optimise for watch‑time using designs that mimic variable‑ratio reinforcement (unpredictable rewards on the slot‑machine schedule). Infinite scroll, autoplay and push alerts externalise interruption costs onto households, classrooms and workplaces. Crucially, when researchers removed only mobile internet for two weeks (texts and desktop access intact), participants’ sustained attention and well‑being improved materially. That is a policy‑scale hint: change defaults, change outcomes.
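To make the mechanism concrete, here is a toy simulation of a variable‑ratio schedule: each swipe “pays off” with some fixed probability, so the number of swipes between rewards is unpredictable. The reward probability and swipe count are illustrative assumptions, not any platform’s actual ranking logic.

```python
import random

def simulate_feed(num_swipes: int, p_reward: float, seed: int = 0) -> list[int]:
    """Count how many swipes separate successive 'rewarding' videos when each
    swipe pays off with probability p_reward (a variable-ratio schedule)."""
    rng = random.Random(seed)
    gaps, since_last = [], 0
    for _ in range(num_swipes):
        since_last += 1
        if rng.random() < p_reward:   # unpredictable hit, like a slot machine
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = simulate_feed(num_swipes=500, p_reward=0.15)
print(f"{len(gaps)} rewards in 500 swipes; gaps ranged from {min(gaps)} "
      f"to {max(gaps)} swipes (mean {sum(gaps) / len(gaps):.1f})")
```

The unpredictable gaps, sometimes one swipe, sometimes twenty, are what make the schedule so resistant to extinction; a reward delivered on every nth post would be far easier to walk away from.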
The risk is dynamic, not static. A new real‑user study of 1,100 TikTok accounts found that even light users roughly doubled their daily watch time over six months. The curve is endogenous; usage begets usage. Regulation must therefore target system behaviour, not individual virtue.
From self‑help to systems: the attention diet as regulation
Think of an “attention diet” not as self‑denial but as market design: labels, budgets and defaults at the product layer, plus audits and data access at the system layer.
- Labels (calories for cognition). Warning labels for minors, as urged by the US Surgeon General, frame the baseline risk. But add attention‑nutrition panels (notifications/day; autoplay status; average time/session), visible in app stores and settings; a sketch of such a panel follows this list. Like food labels, they don’t ban; they disclose.
- Budgets (age‑appropriate ceilings). For under‑16s: autoplay off and finite scroll by default; time‑of‑day curfews that respect sleep and school hours; counters that reset only after offline intervals. Australia’s new minimum‑age law for social media provides the legal handle; make the defaults align.
- Defaults (friction beats willpower). Notification batching (e.g., three drops/day) reduces stress without FoMO spikes. Creator‑first defaults (draft before feed) and For‑Later queues convert roulette into reading lists. These are small levers with measured effects.
- Audits and access (accountability, not vibes). The EU’s DSA (the bloc’s online‑platforms law) already mandates systemic‑risk assessments for VLOPs (very large online platforms), including risks to public health and minors’ well‑being, and opens Article 40 data access for vetted researchers. Export that architecture: audited risk logs, independent experiments and public dashboards.
- Evidence at scale (policy RCTs). Run city‑ or district‑wide trials that toggle autoplay, scroll limits or labels, with pre‑registered outcomes (sleep, anxiety screens, reading minutes, exam scores); a rough sample‑size sketch also follows this list. The two‑week mobile‑internet RCT shows behavioural gains are available; governments should fund the follow‑through.
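As a sketch of what the attention‑nutrition panel proposed above could contain, here is a minimal disclosure schema in Python. The field names and the example app are hypothetical assumptions, not an existing standard.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AttentionPanel:
    """Hypothetical 'attention-nutrition' disclosure for an app-store listing."""
    app_name: str
    notifications_per_day: float        # median push alerts per active user
    autoplay_default: bool              # is autoplay on out of the box?
    infinite_scroll_default: bool       # is the feed unbounded by default?
    avg_minutes_per_session: float      # mean session length over the last 90 days
    chronological_feed_available: bool  # can users opt out of the ranked feed?

panel = AttentionPanel(
    app_name="ExampleFeed",             # illustrative, not a real product
    notifications_per_day=14.0,
    autoplay_default=True,
    infinite_scroll_default=True,
    avg_minutes_per_session=11.5,
    chronological_feed_available=False,
)
print(json.dumps(asdict(panel), indent=2))  # what a store label or settings page could render
```

Like a food label, the panel bans nothing; it simply makes the design legible enough to compare and to audit.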
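And as a back‑of‑envelope illustration of the evidence‑at‑scale point, the sketch below estimates the sample needed for a two‑arm, school‑clustered trial of an autoplay toggle, using the standard normal approximation inflated by a design effect. The effect size, cluster size and intraclass correlation are assumptions a real pre‑registration would have to justify.

```python
from statistics import NormalDist

def pupils_per_arm(effect_size_sd: float, alpha: float = 0.05, power: float = 0.80,
                   cluster_size: int = 30, icc: float = 0.02) -> int:
    """Rough sample size per arm for a two-arm cluster RCT (e.g. schools randomised
    to autoplay-off), using the normal approximation plus a design-effect penalty."""
    z = NormalDist()
    n_individual = 2 * (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) ** 2 / effect_size_sd ** 2
    design_effect = 1 + (cluster_size - 1) * icc  # cost of randomising whole schools
    return int(n_individual * design_effect) + 1

# To detect a 0.15-SD gain in, say, weekly reading minutes with ~30 pupils per school:
print(pupils_per_arm(effect_size_sd=0.15))  # on these assumptions, roughly 1,100 pupils per arm
```

The point is not the exact figure but that trials of this size are feasible at district scale.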
A cross‑national playbook
United States. Pair warning labels and public‑health messaging with a federal baseline for age‑appropriate defaults. Reintroduce KOSA (the Kids Online Safety Act, the US minors’ online‑safety bill) with precise design mandates (autoplay off, robust age checks, independent audits) and safe‑harbour protections for privacy‑preserving research access. Use federal procurement to require attention‑friendly defaults in official apps.
European Union. Enforce DSA risk assessments and mitigations with a named “attention harm” class and require VLOPs to offer a chronological feed and to publish shrink tests (impact of disabling engagement features on minors’ outcomes). Use the new researcher‑access delegated act to open multi‑country panels.
United Kingdom. Under the Online Safety Act, Ofcom’s draft children’s codes already push algorithmic taming and age checks. Add attention metrics to compliance (notifications/day, autoplay status) and align school‑day phone rules with clear guidance on storage, exceptions and enforcement.
Australia. With an under‑16s minimum‑age regime taking effect by December 2025, set pragmatic definitions of “reasonable steps” (high‑accuracy age assurance; family‑device attestation) and require public accuracy audits of age checks to avoid false positives.
Schools as the high‑yield lever
School‑day smartphone prohibitions are becoming the norm (England’s guidance; France’s tightening practice). The best evidence suggests test‑score gains concentrate among lower achievers when phone bans are enforced, but bans alone don’t fix sleep or anxiety. Marry phone‑free days with print‑reading minutes and digital‑hygiene lessons.
Productivity is a public‑health externality
Interrupted workers “go faster” to catch up, raising stress and error risk. The mere presence of a smartphone can sap cognitive capacity; replication is mixed, but the prudent policy is phones out of sight in safety‑critical zones and batched alerts elsewhere. Governments can lead by setting attention standards for the public sector and contractors.
How to measure success
- Exposure: autoplay minutes per minor per day; notifications/day; share of time in chronological feeds.
- Outcomes: reading‑for‑pleasure minutes (time‑use diaries), sleep duration, YRBS (CDC teen health survey) indicators, and school exam variance.
- Process: share of VLOPs with published risk logs, independent audits, and researcher APIs.
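A minimal sketch of how the exposure indicators above could be computed, assuming platforms exposed per‑user daily usage logs; the log format and numbers here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class DailyUsage:
    """One minor's usage log for one day (all fields hypothetical)."""
    autoplay_minutes: float
    notifications_received: int
    feed_minutes_total: float
    feed_minutes_chronological: float

def exposure_summary(logs: list[DailyUsage]) -> dict[str, float]:
    """Aggregate the exposure indicators across a sample of minor-days."""
    n = len(logs)
    return {
        "autoplay_minutes_per_minor_day": sum(d.autoplay_minutes for d in logs) / n,
        "notifications_per_day": sum(d.notifications_received for d in logs) / n,
        "share_time_chronological": (
            sum(d.feed_minutes_chronological for d in logs)
            / sum(d.feed_minutes_total for d in logs)
        ),
    }

sample = [DailyUsage(34.0, 18, 96.0, 12.0), DailyUsage(12.0, 9, 41.0, 20.0)]
print(exposure_summary(sample))
```

Publishing aggregates like these, rather than raw logs, is the kind of output an Article 40‑style researcher API or a public dashboard could standardise.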
Objections—and replies
- “This is parental, not public, responsibility.” Seatbelts and food labels did not replace personal responsibility; they raised the baseline. So will safer defaults.
- “It will harm innovation.” Transparency, interoperable controls and audits are pro‑innovation; they make quality legible and reduce arms‑race externalities.
- “Evidence is mixed.” True for causal links to all harms, less so for design exposures (autoplay, infinite scroll) and usage intensity. Base‑rate trends and RCT signals justify low‑regret guardrails.
The kicker
Treat attention like clean air for cognition: invisible, essential, taken for granted until it degrades. Set labels, budgets and defaults; demand audits and data. The feeds won’t police themselves.
By the numbers
- ~47 seconds: average on‑screen focus before switching. (gloriamark.com)
- −40%: decline in US reading‑for‑pleasure, 2003–2023. (Cell)
- ~50%: US teens online “almost constantly.” (Pew Research Center)
- RCT: 2 weeks without mobile internet → better attention & well‑being. (OUP Academic)
- Escalation: even light TikTok users double daily watch time in 6 months. (The Washington Post)
- Policy shift: Australia’s under‑16 minimum‑age law effective Dec 2025. (Infrastructure and Transport Dept)
Sources (selected)
- US Surgeon General advisory on social media & youth mental health. (HHS.gov)
- PNAS Nexus (2025) RCT: blocking mobile internet improves attention and well‑being. (OUP Academic)
- iScience (2025): 20‑year decline in US reading for pleasure (ATUS). (Cell)
- Pew Research (2025) Teens & social media fact sheet—“almost constantly” online. (Pew Research Center)
- EU DSA researcher‑access (Article 40) explainer. (algorithmic-transparency.ec.europa.eu)

