It takes a particular kind of genius to convince an entire civilization that the systematic harvesting of human consciousness is not an extractive industry. This week, as Meta announced its quarterly earnings — $40.1 billion in revenue, 3.24 billion daily active users — one was reminded of the coal barons who once argued that miners freely chose to descend into shafts, and that black lung was merely an unfortunate side effect of economic progress.
The comparison is not, of course, without precedent. For roughly a century — from the 1880s through the 1980s — democratic societies operated on a principle so obvious it barely required articulation: industries that systematically consumed finite human capacity required regulation. Coal mining extracted labour and lungs. Textile mills extracted childhood. Meatpacking plants extracted fingers. We passed laws.
Then, sometime around 1995, we agreed to pretend that human attention was different. Infinite, renewable, not really a resource at all. Just a thing people gave away freely to whoever built the cleverest mousetrap. The fact that the mousetraps were designed by neuroscientists, optimized by machine learning, and refined through billions of behavioral experiments somehow failed to register as relevant.
The Precedent We Forgot
In 1938, the United States passed the Fair Labor Standards Act, which, among other things, capped the workweek at 44 hours (falling to 40 by 1940). The logic was simple: human capacity to labour is finite. A body worked 80 hours weekly is a body destroyed. The law did not ask whether workers consented to 80-hour weeks. It did not conduct surveys on whether workers enjoyed their jobs. It recognized a biological fact and regulated accordingly.
In 1970, Congress passed the Occupational Safety and Health Act, creating OSHA, the Occupational Safety and Health Administration, with a mandate to prevent workplace hazards that were invisible but measurable: airborne toxins, repetitive strain, excessive noise. The principle remained consistent: if an industry profits by consuming human biological capacity, society may regulate how much of that capacity it consumes.
THE ATTENTION EXTRACTION RATE
Americans now spend an average of 7 hours and 4 minutes daily on screens, according to DataReportal's Digital 2025: Global Overview Report. For adults aged 18-34, the figure rises to 8 hours and 17 minutes. This represents a 34% increase since 2019, driven almost entirely by social media and video platforms optimized for engagement maximization.
Source: DataReportal, Digital 2025: Global Overview Report, January 2025

The brain, it turns out, is also biological. Attention is finite. The cognitive resources required to resist algorithmically optimized persuasion are exhaustible. The mental fatigue induced by six hours of TikTok is as measurable as the lung damage induced by six hours of coal dust. We have simply chosen not to measure it, or, having measured it, not to regulate it.
The Industry That Isn't One
The technology industry has executed a semantic maneuver of considerable brilliance: it has convinced regulators that it is not an extractive industry at all. It makes software. It connects people. It provides services. The fact that its revenue model depends entirely on the systematic harvesting of human attention — and that this harvesting is conducted at industrial scale, using industrial methods, with industrial efficiency — has been rendered somehow immaterial.
Consider the architecture. A social media platform is not merely a bulletin board where people choose to linger. It is a system engineered to maximize what the industry calls 'engagement' — a term that sounds volitional but describes something closer to captivity. Every feature is tested. Every colour, every notification cadence, every algorithmic tweak is refined through A/B testing on millions of users to identify which variant most effectively prevents people from leaving.
This is not advertising. Advertising is a magazine spread you can skip. This is the magazine sending you a neurologically optimized electric shock every time your eyes drift toward the next page. The fact that the shock is delivered through dopamine rather than voltage does not alter the underlying mechanism.
THE ENGAGEMENT OPTIMIZATION MACHINE
Internal documents from Meta, released during the 2023 Senate hearings, revealed that Instagram's recommendation algorithm had been refined through 47,000 separate experiments between 2019 and 2022. The stated goal, according to internal metrics, was to increase 'time in app' by reducing what engineers called 'stopping cues' — interface elements that might remind users they intended to do something else.
Source: U.S. Senate Committee on the Judiciary, Subcommittee on Privacy, Technology and the Law, Hearing Record, November 2023
The Argument They Always Make
The counterargument is well-rehearsed. People choose to use these platforms. No one is forced to open TikTok. Users derive value — connection, entertainment, information. To regulate engagement is to infringe on liberty, to treat adults as children who cannot manage their own time.
This argument has a pedigree. It is the same argument deployed against the eight-hour workday (workers freely contract for longer hours), against child labour laws (families need the income), against seatbelt mandates (drivers can assess their own risk). It failed then, and it fails now, because it misunderstands the nature of choice under asymmetric power.
When one party has spent billions of dollars and employed thousands of PhDs to engineer a system that exploits known vulnerabilities in human neurology, and the other party is a teenager with a developing prefrontal cortex, we are not describing a transaction between equals. We are describing extraction.
Meta's $40.1 billion in quarterly revenue, divided across its 3.24 billion daily users, works out to roughly $12 per user per quarter, or about $50 a year, with North American users worth several times the global average. That figure has risen sharply since 2020, driven not by more users but by more efficient attention extraction from existing ones.
What Regulation Would Actually Look Like
Regulating the attention economy does not require banning social media or mandating that algorithms be dull. It requires applying the same principles we apply to every other industry that profits by consuming human biological capacity.
First, disclosure. Coal companies must report workplace injury rates. Pharmaceutical companies must report adverse effects. Social media companies should report attention consumption and cognitive impact metrics with the same granularity they currently report engagement metrics to advertisers. If a platform can tell Procter & Gamble exactly how long users watched a shampoo ad, it can tell the public exactly how long children spent scrolling at 2 a.m.
Second, limits. We do not allow factories to work employees 18 hours daily, even if employees consent. We should not allow platforms to deploy unlimited psychological manipulation, even if users click 'agree.' The European Union's Digital Services Act gestures toward this principle but lacks enforcement teeth. A serious regime would set maximum 'engagement optimization' thresholds, measured by independent audit.
THE DEVELOPMENTAL COST
Adolescents who spend more than three hours daily on social media show double the risk of depression and anxiety symptoms, according to a 2024 longitudinal study tracking 6,500 teenagers across 12 countries. The effect size increased with algorithmic recommendation intensity, not with social media use per se — suggesting that engagement optimization, not connection, drives harm.
Source: Lancet Child & Adolescent Health, International Adolescent Mental Health Study, March 2024

Third, liability. When a coal mine collapses, we do not shrug and say miners chose to work there. We investigate whether safety standards were met. When a drug causes unforeseen harm, we do not say patients consented. We examine whether the manufacturer knew or should have known. The same standard should apply to platforms. If internal research shows that an algorithmic change increases self-harm content exposure among vulnerable users, and the company deploys it anyway, that is not innovation. That is negligence.
The Question We Are Avoiding
The deeper question is why we abandoned, so swiftly and completely, a century of precedent. Why extractive industries that consumed lungs and limbs required regulation, but extractive industries that consume attention and cognition do not.
Part of the answer is technological mystification. Coal is simple. Algorithms are complex. Regulators who confidently banned child labour in textile mills grow timid when asked to evaluate neural networks. But complexity is not a moral shield. The fact that we do not fully understand how a system works does not mean we cannot observe what it does.
Part of the answer is ideological. For three decades, we have operated under the assumption that digital technology is inherently democratizing, liberating, immune to the pathologies of industrial capitalism. This was always wishful thinking. An industry that extracts value by consuming finite human resources is an industrial enterprise, whether it smelts iron or monetizes dopamine.
And part of the answer, one suspects, is that we are the mine. It is easier to regulate a coal shaft in Appalachia than to regulate the device in your pocket, because regulating the device means acknowledging that you, personally, are being mined. That the thing you thought was a tool is actually a method. That your attention is not something you give but something that is taken, systematically and at scale, by entities that have invested billions in the taking.
The Choice That Isn't
We are told, endlessly, that this is about choice. That regulation would infringe on our freedom to scroll, to watch, to engage. But freedom is not the absence of constraint. It is the presence of meaningful alternatives. A miner who must choose between the shaft and starvation is not free. A user who must choose between algorithmic manipulation and social exclusion is not free either.
The attention economy is an extractive industry. It consumes a finite resource. It externalizes costs onto individuals and society. It concentrates profits among a small number of firms. It operates through asymmetric power. A century ago, we knew what to do with industries like that. Then we forgot.
Perhaps it is time to remember. Your attention is not oil waiting to be drilled or data waiting to be harvested. It is part of your mind. The systematic extraction of human consciousness, optimized for profit, is not innovation. It is mining. And we have always known how to regulate mines.
