
After Work: From Agents to Androids and What Comes Next

This is a longer read. Listen to an AI podcast about this piece, generated by NotebookLM, here, or listen to it read by an AI-generated clone of my voice here.



Overview

This essay tracks how automation systematically eats cognitive work, then physical work—and what happens when markets move faster than institutions can adapt. The technology is largely settled; the decisive variable is political execution: whether we build new rails for income, ownership, and purpose before the old ones collapse, or watch that collapse happen first.

It is structured as a dated scenario with two possible endings.


There is only one path to take.

At 2 a.m. on a Monday in December 2025, something unremarkable happens. A piece of software logs into a customer support system, works through a backlog of 200 queries, and logs out. No errors. No escalations. Perfect documentation. The spectacular part is that nothing goes wrong. By morning the backlog is clear, and nobody notices the shift that just started.


By 2040, that night shift—and millions like it—will have rewritten the social contract. Not because the technology failed, but because it worked.


Orientation


Over the next 10–15 years, automation systematically eats our economy in an inevitable sequence.

First, cognitive work: software agents and AI-native tools compress months of desk-based labour into hours, then minutes.


Then physical work: as energy gets cheaper, batteries get denser, world-models and digital twin environments are perfected, and embodied AI matures, robots in many form factors operate autonomously for days, self‑charge, and collaborate with each other and the AI agents controlling them.


Markets move first; institutions lag. Governments, school systems, labour-market and fiscal policies struggle to adapt to exponential change, and incremental reform can no longer keep existing institutions fit for purpose. Bold and rapid change is required.


Some states go all in. They flex sovereign reserves and sign multi‑decade PPAs to anchor data centres, accelerators and safety labs at home. Where intelligence is trained and hosted is where value, taxes and talent concentrate.


By the late 2020s, multi-GW PPA pipelines and gigawatt-scale data-centre clusters shift from proposals to multi-year build programmes.


Where compute, models and safety labs cluster, value accrues; taxes, talent and IP follow; policy leverage compounds.


A Luddite interregnum follows, tracking the stages of grief: denial, anger, bargaining, depression and, eventually, acceptance. Elections swing on promises already out of date by the time votes are counted. Social unrest feeds on the gap between post-labour reality and expectations. Generations were promised education, work, careers as identity, and income. The future will not be what citizens grew up expecting.


There are two very different futures that could unfold:


1. The Reforged Path: leaders stop treating automation as temporary. They realise they need to act at a pace only wars have required. They rapidly redesign income rails, ownership, and education; regulate compute and embodied AI for safety and access; realign tax and spending toward participation and resilience. The changes are necessary for the greater good.


2. The Fractured Path: leaders procrastinate and mistake messaging for delivery. Wealth concentrates in automated stacks; education decouples from competence; public trust erodes. Countries that relied on low‑cost labour watch their advantage dissolve as AI becomes more capable and cheaper.

Either way, work is repriced. The decisive variable is political execution, not technical possibility.


The Luddites


The Luddites were early nineteenth‑century English textile workers who destroyed mechanised looms they believed undermined their pay, skills, and livelihoods. They became a symbol of resistance to labour‑displacing technology.


Key Assumptions


1. Stacked exponentials: technologies now amplify one another. AI accelerates AI (model design, training efficiency, agentic tooling) and adjacent fields like healthcare, drug discovery, genomics, advanced materials and space systems.


2. AI’s arc: AI capability only increases. We move from generative AI to tool-using agents that operate software and machinery, then toward artificial general intelligence (AGI), AI more capable than humans across most cognitive tasks, and eventually artificial superintelligence (ASI), intelligence beyond human comprehension. The self-improvement loop closes and capability compounds at machine speed, the point many call a singularity. Timelines vary, but the direction is clear. Sentience or consciousness is unnecessary for AI to transform our world.


3. Embodiment follows cognition: as designs improve and energy costs fall, embodied AI becomes economical for predictable physical tasks. Imagine a robot trained on a digital twin of your home, spending the equivalent of five years learning to navigate before you unbox it. Uptake lags software but compounds once Robot Foundation Models (RFMs) and robot financing mature.


4. Friction is social, not technical: binding constraints are governance, liability, assurance, data rights, and political legitimacy, not absent models or machines.


Cost and Availability of Intelligence


Training costs fall by large factors each generation. Inference cost per task halves repeatedly, and small fine‑tuned models inherit frontier behaviours. Training efficiency improved by roughly an order of magnitude every year‑or‑two in the early 2020s, so today’s frontier performance arrives on a fraction of yesterday’s compute. Open‑weights ecosystems compress diffusion times. Custom accelerators deliver multi‑fold efficiency gains. Typical enterprise task costs dropped by a similar order‑of‑magnitude across model generations as distillation and small‑model fine‑tunes matured. As Sam Altman argued in ‘Moore’s Law for Everything’ (2021), if intelligence keeps getting cheaper, many goods and services will tend toward being twice as capable for the same price on a cadence that resembles semiconductors.
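The arithmetic of a repeatedly halving cost per task compounds quickly. A minimal sketch, using illustrative numbers rather than figures from the essay:

```python
# Illustrative arithmetic only: the starting cost and halving cadence are
# assumptions chosen for the sketch, not data from the essay.
def cost_per_task(initial_cost: float, years: float, halving_years: float) -> float:
    """Cost per task after `years`, if cost halves every `halving_years`."""
    return initial_cost * 0.5 ** (years / halving_years)

# A $1.00 task whose inference cost halves every year:
for year in range(6):
    print(f"year {year}: ${cost_per_task(1.00, year, 1.0):.4f}")
```

Five annual halvings cut the cost by a factor of 32; a decade of them by a factor of about a thousand, which is why "intelligence too cheap to meter" stops sounding like hyperbole in the timeline that follows.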


Why Education Cannot Keep Up


Education systems fail the transition for predictable reasons: curriculum inertia, assessment tied to recall, fixed costs, falling demand, and funding optimised for yesterday’s labour market. Our grandparents were taught the same way our children are taught today. Universities with high fixed costs and glacial decision‑making realise they could be obsolete before current undergraduates graduate.


Over time, winning jurisdictions pivot to competency‑based progression, authentic assessment with AI‑rich tools, apprenticeships in judgement‑heavy work, and preparation for human‑machine partnership. But this transition arrives late, after damage is done.


How to Read the Timeline


Treat each dated section as signposts, not prophecies: what to watch in adoption, labour markets, public finances, politics, safety, and trust. Dates are indicative; direction matters more than exact timing. The technology stack will keep compounding. Social outcomes depend on how quickly we reprice work, redesign income and ownership, and rebuild education around a human experience that sits after automation, not in defiance of it.


The first signpost arrives when agents stop being demos and start doing shifts.


Late‑2025 to June 2026: AI Agents Break Out of the Browser


Maya, 24, junior analyst at a London consulting firm, watches her Monday morning workload shrink from forty client emails to eight flagged exceptions. The other thirty‑two were triaged, categorised, and draft responses prepared overnight. She doesn’t know whether to feel relieved or worried.


By December 2025, things change. At 2 a.m. an agent logs into a live system with its own credentials, opens the same screens a junior would, and quietly clears a backlog that would have eaten a human’s Monday. It clicks, types, checks policy, asks for help when thresholds trip, and leaves a perfect log. Nothing spectacular happens; nothing goes wrong. In back offices everywhere, the browser is no longer human‑only.


AI can already use a mouse and keyboard.

By February 2026, this is the norm. Accounts payable lets agents match purchase orders with invoices, pay them within limits, and chase outstanding payments. Customer support lets them triage emails, analyse voice calls, and issue refunds inside strict rules. Legal lets them classify contracts, pull clauses, manage client onboarding and compliance. Specialist legal AI agents draft first passes that lawyers redline in minutes. HR assembles onboarding packs. Marketing sends highly personalised emails based on deep profiles and prior interactions. Wherever work is structured, high‑volume and policy‑bound, agents go first, with humans in the loop.


Why now? Reliability crept over the line. Frontier systems released in the mid‑2020s cut error rates materially and widened tool‑use reliability; the exact numbers vary by benchmark. Guardrails moved into code and prompts. Identity and access tightened so agents only see what they need. Tools learned to ask for help like good juniors when values drift, fields are missing, or decisions touch real money or the vulnerable.
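The "ask for help like a good junior" pattern is simple to express in code. A minimal sketch, where the thresholds, field names and `Task` shape are hypothetical, not drawn from any real product:

```python
# Hypothetical escalation guardrail: every threshold and field name here is
# an assumption for illustration, not part of a real agent framework.
from dataclasses import dataclass, field

@dataclass
class Task:
    amount: float                      # money the decision touches
    confidence: float                  # model's self-reported confidence, 0..1
    missing_fields: list = field(default_factory=list)  # required inputs not found
    vulnerable_customer: bool = False

def should_escalate(task: Task,
                    max_amount: float = 500.0,
                    min_confidence: float = 0.9) -> bool:
    """Hand the task to a human when any guardrail trips."""
    if task.amount > max_amount:
        return True        # real money beyond the agent's limit
    if task.confidence < min_confidence:
        return True        # values drifting from expectations
    if task.missing_fields:
        return True        # incomplete inputs
    if task.vulnerable_customer:
        return True        # decisions touching the vulnerable
    return False

# A routine refund stays automated; a large one goes to a person.
print(should_escalate(Task(amount=40.0, confidence=0.97)))     # False
print(should_escalate(Task(amount=2500.0, confidence=0.97)))   # True
```

The point of the sketch is that the guardrails live in plain code and policy, not in the model: they can be audited, versioned and named in a contract.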


Under the surface, inference cost per task falls quarterly. Fine‑tuned small models inherit frontier behaviours. Open‑weights ecosystems spread capability. Custom accelerators push multi‑fold efficiency gains and reduce latency.


February 2026 is when procurement wakes up. Contracts name workflows agents may run, error budgets they must meet, and escalation ladders when uncertainty rises. Procurement makes the percentage of delivery performed by AI a contractual requirement, not a checkbox. Vendors stop selling demos and start selling outcomes: time-to-complete, accuracy, replayable logs. Insurers arrive with riders underwriting specific agent-run processes. When a CFO can point to cover and a contract, adoption moves from experiment to plan.


March 2026 belongs to security and operations. Agent‑operated accounts get least‑privilege diets. Secrets management, session isolation and tamper‑proof logs become as important as model accuracy. Data quality becomes painfully visible. Agents amplify whatever they touch. If taxonomy and documentation were soft, automation makes that softness a cost centre. Everyone records more—sales calls, support calls, meetings—as conversational data value is understood.


April 2026 brings culture shock. The org chart flexes. The old pyramid, large intakes of juniors feeding a narrow band of seniors, thins at the base in agent‑heavy workflows. Fewer people do first‑pass work. More design processes, configure tools, test edge cases and answer for the whole chain. If leaders do not build new rungs—simulations, rotations, supervised exception handling—they discover their path to training future experts has evaporated.


Government departments and national operators feel the same arithmetic by May 2026. Backlogs, budgets and promises collide with the reality that agents can read forms, draft letters, route cases and keep immaculate records. They extract more data from images and videos than humans. The constraint is legitimacy. One visible error can set a department back a year, so they pilot in back offices, stage deployments, add human checks and publish clear limits. But they move, because the alternative is slower service at higher cost while the public learns what private firms already do.


Across these months the conversation shifts from if to how. Who signs when an agent is wrong? Which processes are safe to run at 2 a.m.? How to measure productivity when synthetic labour works beside human labour? Quality gains at lower cost never stay optional. If one competitor uses them well, everyone else follows or loses.


By June 2026, the baseline is set. Agents have left the lab and taken seats at virtual desks. The wins are obvious: cycle times collapse, consistency improves, documentation becomes automatic. The risks are manageable: reliability is earned in production, guardrails are codified, assurance and insurance catch up. The price is subtle: the early-career ladder is moving even if no one has said it out loud.


The experiments are over.


Mid‑2026 to 2028: Cognitive Automation Scales; Policy Lags


James, 48, factory supervisor in Birmingham, pilots the first robot integration on his production line. He has become a fleet manager overnight, overseeing twelve machines that work three shifts without breaks. He is still employed. His three former direct reports are not.


By autumn 2026, experiments are no longer tucked away in innovation teams. Budgets, targets, board imperatives follow. Compensation plans assume human workers are augmented, and more is expected of them. Agents start work and escalate to people, not the other way round. Reports, outreach, reconciliation, compliance checks and basic analytics are AI‑first in most large organisations. Humans still sign, but drafts are fully formed, quality assured, correctly filed and perfectly logged.


Policy from a linear past doesn't work in an exponential future.

The change feels procedural until it feels structural. Hiring managers materially reduce junior intakes because the first 80 percent of many desk jobs is now done by software. Why would a law firm hire anyone with less than two years post‑qualification experience? The centuries‑old pyramid breaks down and the org chart inverts: experience at the top, delivery with digital co‑workers in the middle, falling need for juniors. Graduates struggle to find gainful employment as entry‑level roles shrink. Teams are smaller, cycles shorter, and evenings and weekends belong to machines.


Each model generation lowers cost, enhances capability, raises reliability. Training that once required the largest labs becomes reachable for enterprises through efficient fine‑tuning and distillation. Open‑weights communities compress diffusion timelines from years to months. Where access is throttled, work routes to permissive jurisdictions. Code, capital and talent move faster than legislation.


Micro‑credentials proliferate as employers seek competence with tools, not badges. The training ground moves and rules change, but no one has marked out the new pitch.


Public finance starts to creak. Payroll‑heavy sectors contribute less even as margins improve elsewhere. Benefits systems, tuned for cyclical shocks, struggle with structural displacement. Consultations on income support and retraining open, attract headlines, then stall. Meanwhile, procurement quietly rewrites the state. Contracts name agent‑run tasks, error budgets and escalation ladders. Unions negotiate automation clauses trading transparency and reskilling for deployment latitude. The billable hour sees its last days.


By late 2027 the impact is visible as job postings for entry‑level cognitive roles fall faster than leaders can invent new ladders. People promised knowledge work discover the rungs are gone or gated by tools they were not taught to use. Local moratoria and human‑made labels appear. They do not slow adoption, but they make legitimacy part of the deployment plan.


Then, in 2028 or 2029, a capability jump arrives that most observers call close to AGI. In practical terms: systems outperform humans across most cognitive tasks with sustained reasoning, tool use and novel problem solving. AI is now better, cheaper and faster than humans at a wide range of cognitive tasks, and labs already know how to improve intelligence by orders of magnitude.


The billions invested become trillions. Design loops in maths, physics, materials, batteries, motors and control accelerate. AGI improves the tools that improve AGI. However, availability is not the same as deployment. Assurance, liability, regulators and politics try to set speed limits, but the genie is out of the bottle and deployment pace is now a matter of national economic security. Winners deploy the most capable mix of AGI‑driven virtual and physical workers fastest.


Interlude: The Luddite Interregnum


By early 2027 the story leaves spreadsheets, boardrooms and newsrooms and enters the streets. Headlines chart how career entry routes are closing and first rungs on the ladder have gone missing. Boards stop talking about pilots and start talking about targets, competing with AI‑first new entrants with a fraction of the headcount. Ministers stop talking about innovation and start talking about legitimacy and who is to blame.



This is the Luddite Interregnum. It's the messy period where technology compounds faster than institutions can absorb it, and politicians try to sugar‑coat it. The public response mirrors the stages of grief.


Denial comes first. We hear the greatest hits: this is like spreadsheets and email; it will all net out; new jobs will appear in time. Some do, but nowhere near enough to offset losses and shrinking knowledge‑work opportunities. Quality gains at lower cost never stay optional in competitive markets. When one bank, retailer or department shows it can do work in hours instead of days, the rest follow. They have to. Economic systems in global markets are driven by the bottom line, and AGI is unlike any previous economic force: the advantage to those who embrace it is too great to resist.

Denial collapses when monthly revenues rise, operating costs are optimised, customer satisfaction ticks up, and headcount does not. Organisations that refuse to adapt lose share and, in some cases, fail.


Anger follows. It is directed at the nearest visible symbol. AI billboards are defaced. Voice agents are mocked for denying refunds. Protest signs appear outside nondescript offices that never attracted protests. Media amplifies every story where an AI-driven system goes wrong while thousands of quiet successes go unreported, except by those who implement them. The anger is fuelled by AI-generated content across social platforms. Some mean it; others want views. The anger is not about circuits, it is about status and promises: you told me there would be a place for me in this world, and now I cannot find it.


Bargaining is the most political stage. Medium-term self-interest outweighs long-term good. We try words that sound like action: pauses, moratoria, commissions, sandboxes, human-made labels, human-in-the-loop mandates. Some of this is necessary: assurance, audit, identity, provenance and incident reporting make systems safer. But most of it buys time. Firms say the right things but quietly rebuild as AI-first. When rules are too blunt, work moves offshore to jurisdictions without the constraints. Regulators back-pedal to bring capacity back onshore. We discover we can shape how automation lands, but not whether, or where. That is dictated by markets, demographics and the hard economics of survival. Countries with shrinking workforces, almost everywhere outside Africa, invest in automation because they must.


Depression is quieter. Application‑to‑vacancy ratios rise for junior cognitive roles while in‑person services still hire. Graduates who did everything right send out endless CVs, scanned and ranked by AGI‑driven recruiters that conduct first interviews. Mid‑career workers who expected to manage teams find their teams smaller and tools smarter. Some stop looking for work, not because they will never work again, but because the on‑ramps they recognise have vanished and new ones are unmarked.


In the UK, economic inactivity among working‑age adults reached more than one in five in the mid‑2020s. Without new rungs, many may never re‑enter. Participation dips. In some places frustration turns inward; in others it fuels a politics of permanent grievance. This is where trust bleeds away if leaders mistake empathy for delivery.


Acceptance is not love for the machine. It is when leaders stop promising yesterday and start designing for tomorrow. A few governments realise early that you cannot ban productivity in an open economy without exporting the work and importing the bill. Even managed economies adopt intelligence on tap to gather data and optimise machine‑human partnerships at scale. They pivot from slogans about job restoration to plans for participation without full‑time employment as the only route to dignity. They talk about income rails that do not depend solely on wages, portable protections that move with the person, and ownership that spreads automation’s upside rather than concentrating it.


Corporations publish automation‑impact statements to show how people, not just shareholders, benefit. Unions trade transparency and reskilling for deployment latitude. Schools pilot competency‑based assessment that assumes the tool is on the desk and asks what you can do with it. Acceptance, when it comes, is pragmatic: we are not going back, so let us make the new order as fair as possible.


Then matter catches up with mind.


2028–2031: Matter Catches Up


By early 2028 floorplates feel different. The cognitive wave set the tempo. Now matter catches up. Nights in warehouses, hospitals and shopping centres start to hum. Fleets of general‑purpose mobile manipulators, good enough rather than perfect, take on predictable work and keep going. They roll up to shelves with dexterity now beyond human. Restocking happens at 3 a.m. without lights, and nobody notices. Corridors are cleaned by small armies that recharge themselves. There is no longer an out‑of‑hours.


Figure AI

The step change is not a single humanoid that can do everything, it is a swarm of specialised robots with many form factors, co‑ordinated by AI agents that act as escalation points, a bridge to humans in the loop and analysts of data streamed by myriad sensors. AGI in 2028–29 accelerates robot capability.


AGI shortens design loops in motors, battery design, grippers, compute, dexterity, materials and control. It finds better trade-offs in weight, torque and cost. It lays out factories virtually and trains robots in virtual worlds before they step into physical reality. It writes safety cases and test plans. Robot speed is increasingly limited only by the laws of friction and the proximity of human co-workers. What looked like hype on a screen shows up in cast aluminium, fully waterproof with sealed bearings.


By the late 2020s, credible analyses showed humanoid platforms entering the mid‑five‑figure cost range, with leases comparable to or below a national average monthly income in logistics. The ever more realistic humanoids are the most useful, and the most divisive.


Frontier behaviours are distilled into smaller models that run locally, on robots and even domestic appliances. Vendors price outcomes. Intelligence becomes a near‑free input for predictable tasks, constrained more by power and governance than algorithms.


Energy and runtime make the difference between a demo and a business. Battery density inches up as solid‑state matures and novel approaches developed by AGI systems are prototyped, but power management delivers the hours. Fast‑swap and fast‑charge ecosystems become the new water coolers, with less conversation. In many regions energy costs trend down as more production comes online, which matters because uptime is now the unit of value. A robot that runs twenty hours a day and plans its own charging is cheaper, safer and more predictable than a rota of humans.
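The uptime claim can be put in back-of-envelope form. Every number below is an assumption chosen to illustrate the comparison, not data from the essay:

```python
# Back-of-envelope uptime economics; all figures are illustrative assumptions.
def cost_per_productive_hour(monthly_cost: float, hours_per_day: float) -> float:
    """Effective cost per working hour over a 30-day month."""
    return monthly_cost / (hours_per_day * 30)

# A leased robot at a notional £2,500/month, running 20 hours a day:
robot = cost_per_productive_hour(monthly_cost=2500.0, hours_per_day=20)
# A notional three-person rota at £3,200/month each, covering the same 20 hours:
rota = cost_per_productive_hour(monthly_cost=3 * 3200.0, hours_per_day=20)

print(f"robot: £{robot:.2f}/hr, human rota: £{rota:.2f}/hr")
```

Under these assumed figures the robot comes in at roughly a quarter of the rota's hourly cost, before counting sickness, turnover or night premiums, which is why uptime, not headline capability, becomes the unit of value.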


By 2029 the contracts change, and with them the culture. Robotics‑as‑a‑service replaces capital expenditure with a line item. You lease them or buy outcomes per pick, per clean, per inspection, per kilometre. Vendors guarantee uptime and accuracy and take the depreciation hit. They carry spares in fleets of vans, but systems and hardware designed by AGI rarely fail.


The CFO sees a neat curve of cost no longer linked to revenue growth. Super-profits will not last; perfect information collapses margins everywhere. But the cost curve remains. The COO sees a dashboard that does not care about school holidays or illness. The CIO sees a security model they can reason with in real time. This is the year robotic labour becomes continuous rather than episodic.


Buildings notice too. New ones are drawn with robot‑ready dimensions as standard. Older ones are retrofitted with ramps where there were lips, automatic doors where there were handles, charging alcoves where there were cupboards. New buildings are designed never expecting to host a human. Dark factories become the norm.


Not every role disappears and many are remixed. Care is the clearest example. Robots handle lifting, fetching, monitoring and falls detection. Humans handle warmth, judgement, dignity and tasks that cannot be reduced to protocol. The ratio shifts. The wage mix does not adjust quickly enough and that becomes a political problem, not a technical one. The wealthy can still afford human care, but robotic care is cheaper and increasingly better.


If you use a keyboard at work, your livelihood is under pressure. You still have runway if you wear a toolbelt.


By 2030 self‑charging fleets operate round the clock with minimal human oversight. Night staff move from doing work to overseeing systems and coaching edge cases into models. Logs are immaculate. Incident reports look like aircraft investigations. Insurance products cover not just premises and liability but specific robotic workflows. The integrator of record becomes as important as the original manufacturer. Identity and provenance for machines and their models mature because without them the whole edifice is a trust problem waiting to happen.


2031–2034: The Income Gap Becomes a Demand Gap


Maya, now thirty, has retrained as a community care co‑ordinator. The pay is half what she earned in consulting, but the work exists. Her former colleagues are split: some found niches in AI oversight roles, others are still searching. One moved to Singapore for an AI regulation job. Two left the workforce entirely.


2031 is the year the numbers demand a second look. Uptime is routine, costs are flatter than they should be, though competition between AGI‑driven winners is crushing margins. Efficiency rises while demand sputters. The cost of intelligence falls faster than the distribution of its gains.

Top‑line growth stalls. Stores are stocked, deliveries are faster, call queues are shorter, but households are not spending as the models assumed. Wage share keeps slipping and without replacement income, consumption softens. Then confidence collapses and spending with it.


First it looks like a marketing problem. More bundles, more personalisation, more financing. Then the penny drops. If more output comes from capital combined with digital labour, and incomes remain tied to wages, the system begins to under-consume its own productivity: wage share falls faster than replacement income rises, and demand cannot keep pace with output.
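A toy model makes the mechanism concrete. All numbers are illustrative assumptions, not estimates:

```python
# Toy under-consumption model: output grows while wage share slips, and
# household income tracks (output x wage share) + replacement income.
# Every parameter here is an illustrative assumption.
def household_income(output: float, wage_share: float, replacement: float) -> float:
    return output * wage_share + replacement

output, wage_share, replacement = 100.0, 0.60, 0.0
for year in range(5):
    income = household_income(output, wage_share, replacement)
    print(f"year {year}: output {output:.1f}, household income {income:.1f}")
    output *= 1.04        # automation lifts output 4% a year
    wage_share -= 0.03    # wage share slips 3 points a year
    # with replacement income stuck at zero, income falls while output rises
```

In this sketch output rises about 17 per cent over four years while household income falls about 6 per cent: the demand gap opens unless the `replacement` term, the income rails the essay argues for, starts doing real work.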


Prices can fall for a while, suppliers can be squeezed, credit can be stretched, but household maths does not change. A bigger share of production is not turning into pay packets. The economic multiplier grinds to a halt.


Retail cracks first. The middle consolidates. Winners with the leanest agent‑robot stacks sweep up share. Smaller chains sell, shrink or close. High streets continue to thin out and become micro‑fulfilment nodes. Hospitality splits into two poles: premium human experience and low‑cost automation. Services follow. Subscription fatigue sets in. Per‑user‑per‑month pricing looks increasingly fragile when the user is a machine. Large players cling to customers who have embedded their software deep into processes, but the writing is on the wall. Buyers demand outcomes while cutting headcount, which caps demand‑driven growth.


Firms experiment. Some rebate customers with dividends and loyalty credits that feel like cash. Others create fairer financing or invite co‑ownership of automation gains. It helps at the margin, but not at scale. The real lever sits with policy and income rails that do not rely solely on wages.


Retraining schemes relaunch, courses fill, hiring hardly moves. Benefits systems built for cyclical shocks strain under structural and permanent displacement. Consultations on negative income tax, dividends and broader ownership make the news, then go quiet. Payroll tax receipts soften in wage‑heavy regions while margins rise elsewhere. Political parties that cannot deliver on promises begin to fracture. Polarisation spikes.


The labour market takes on a paradox. Youth underemployment rises while employers complain of shortages in human-heavy roles. Care, early years, skilled trades and public safety can hire, but pay and conditions lag their growing social value. The glut of labour supply keeps incomes low as some people who trained for knowledge work hold out for promised roles while others retrain to use their hands or work in care. Many refuse to accept a future they were not prepared for and fall behind expectations and debt repayments.


Essentials are cheaper and better, but rent, care and status costs keep pressure on household cashflow. Health advances mean people are healthier and expected to live much longer as a cure for ageing approaches.


Retail and services consolidate into winner-take-most structures. Application-to-vacancy ratios stay high for entry-level cognitive roles while care and craft advertise constantly. Balance-sheet stress shows up as weaker discretionary spend and rising arrears on unsecured credit.


The macro story now depends on whether states can rebuild income, ownership, safety and access as fast as capability keeps dropping in price.


2034–2037: The Fork in the Road


Minister Chen, fifty‑two, faces re‑election. Her cabinet is split: half want to accelerate automation and build income rails, half want to slow deployment and protect jobs. The polling shows voters are angry, confused, and demanding action. She knows decisions made in the next eighteen months will determine whether her country thrives or fractures. There is no middle path.



2034 opens with stressed institutions. Courts juggle embodied AI incidents, privacy harms and cross-border compute disputes. Class actions over data use meet product liability when a robot injures someone or an agent misfires, even though digital labour is already measurably safer and more reliable than human labour. Networked autonomous vehicles proliferate and prove so much safer that the vehicle insurance industry shrinks significantly in high-adoption corridors. Judges ask for logs, provenance and an operator of record.


Liability attached to an operator of record requires new laws and insurance models. Standards bodies and regulators rush interim rules for safety cases, diagnosis, financial and legal advice, human override and audit trails. The paperwork of the old economy is being rewritten for digital labour.

Regulators pivot from slogans to systems. High-capability models and robot fleets require licences, continuous monitoring and real-time incident reporting. Model assurance becomes an operating requirement with pre-deployment tests, live red teams and replayable post-mortems. Identity is non-negotiable. Agents and robots carry cryptographic credentials and signed update histories. Post-quantum cryptography becomes standard as increasingly affordable quantum machines put legacy encryption at risk.


Procurement insists on provenance, attestation and a named integrator of record. It is less about bans and more about proving tools do what they claim and fail safely. In 2028 regulators wanted to engage with humans for reporting and compliance; now they insist on direct links to data used and generated by digital compliance agents.


Under the surface, the AI stack crosses a threshold. Near‑AGI becomes self‑improving AGI in practice, and few notice as they struggle to re‑engineer their economics, institutions and politics. Algorithmic efficiency climbs, tool‑use chains deepen and automated research loops design the next model, the next compiler and the next accelerator. Intelligence becomes a utility available to any far‑flung human outpost with a basic computer and increasingly cheap satellite internet.


Compute becomes a primary factor of production. Data centres co‑locate with cheap or stranded energy, sign long PPAs, add battery and thermal storage, and shift loads by the hour. The first space‑based data centres start generating positive cashflow. Carbon‑aware scheduling moves training to low‑cost windows and cooler regions. First‑generation fusion pilots show commercial promise, bending energy price curves. Governments treat compute like national infrastructure with national security‑level importance. Availability improves each quarter, yet implementation still bottlenecks on assurance, liability and talent.


National strategies diverge along fault lines of governance and resources. The Nordic bloc moved early—compute co‑operatives, generous retraining funds, high‑trust societies that absorb change without fracture. By 2036 they are running the most automated economies per capita, with functioning income rails and stable politics. The model works at small scale. The question is whether it ports to larger, more diverse nations.


Singapore and the UAE pivot fast. Singapore positions as a regulatory sandbox with strict safety rails, a proving ground for models and robots that need assurance before global deployment. The UAE uses sovereign wealth to anchor data centres and train foundation models. Both become compute brokers: new petrostates, except the resource is processing power and the clients are everywhere.

China’s state‑directed model scales automation in manufacturing, logistics and surveillance faster than democratic deliberation allows elsewhere. By 2036 it is running the world’s most automated economy in absolute terms. The asymmetry creates competitive pressure on democracies: can they match deployment speed without copying the governance model? But China faces its own trap: without broad income distribution, consumption lags production. Export‑led growth hits limits when trading partners are also automating.


India faces an existential crisis. Its IT services sector, representing millions of jobs and decades of comparative advantage, evaporates as AGI prices knowledge work to near zero. But scale, English‑language infrastructure and democratic institutions position India to potentially leapfrog to AI‑native services and domestic markets. Success depends on rapid reskilling and whether diaspora talent returns. By 2037 the outcome remains uncertain, but the stakes are now societal.


Africa has the youngest population and least legacy infrastructure to retrofit. Mobile‑first AI adoption could enable leapfrogging, or create permanent dependency on foreign models and compute. Which future emerges depends on investment in local talent and infrastructure over the next five years. Some states make early bets on compute sovereignty. Others remain digital colonies. The gap widens fast.


Security becomes the background noise. Bad actors scale fraud, intrusion and extraction with their own immense teams of digital labour; the ROI on cyber‑crime is huge and the chances of being caught are slim. Phishing is continuous and personalised; every video call with a family member is checked by defence providers locked in an exponential arms race. Deep impostors hit payments and procurement. News channels stop reporting cyber theft under nine figures. Counter‑autonomy blends hard identity, traffic shaping, deception, kill signals and fleet health telemetry into live response. Governments, cities, employers and even brands fund security as safety becomes a differentiator. Compute and model provenance, content authenticity and distribution controls become table stakes.


Meanwhile, vocabulary drifts. In narrow domains, systems exhibit emergent behaviours beyond human audit.


By late 2037 the line is visible. One group builds rails for income, ownership, safety, access and compute. Another tries to slow adoption without fixing demand or capacity.


The paths now diverge. What follows are two scenarios, both plausible, separated not by technology but by choice.


2037–2040: The Reforged Path


By 2037 the pressure from voters, unions, boards and cities is too strong to ignore. Leaders who promised a return to yesterday pivot to delivery or face political oblivion. Patience has run out. They grasp that the problem is not capability, it is institutions. The first years are rough: contested budgets, legal fights and political backlash. Still, they legislate through the noise and build the rails as leaders emerge.


The Intelligence Curve and the Crises It Exposes: Self‑improving AGI has lowered the cost of cognition again and it has become clear there will not be one AGI, or even one ASI. Tool‑use chains and automated research loops produce breakthroughs faster than committees can brief them. That pace lights up every fault line: revenue outpacing wages, safety panics, cyber losses, misinformation and skills obsolescence. Governments choose to treat each as a systems problem rather than as isolated scandals. Citizens want results and leaders with mandates act decisively.


Income Rails and Demand Repair: Countries introduce a universal baseline through negative income tax (basic income) or dividend rails. Here is how negative income tax works in practice: every citizen receives £1,200 per month. Those earning below £30,000 annually pay no income tax. Those above pay progressive rates, but the floor is guaranteed. The mechanism is elegantly simple: the tax system runs in reverse below the threshold.
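The "tax system in reverse" mechanism can be sketched in a few lines. The £1,200 monthly floor and £30,000 threshold are the essay's figures; the progressive bands above the threshold are hypothetical placeholders, since the essay does not specify rates.

```python
def net_transfer(gross_income: float) -> float:
    """Annual net position under an illustrative negative income tax.

    Everyone receives the baseline; tax is only charged on income above
    the threshold, so low earners are net recipients and high earners
    are net contributors. Returns pounds per year (positive = recipient).
    """
    BASELINE = 1_200 * 12   # universal floor: £14,400 per year (essay's figure)
    THRESHOLD = 30_000      # no income tax below this (essay's figure)
    # Hypothetical progressive bands above the threshold (NOT from the essay):
    # 20% to £50k, 40% to £100k, 50% beyond.
    BANDS = [(50_000, 0.20), (100_000, 0.40), (float("inf"), 0.50)]

    tax = 0.0
    lower = THRESHOLD
    for upper, rate in BANDS:
        if gross_income <= lower:
            break
        tax += (min(gross_income, upper) - lower) * rate
        lower = upper
    return BASELINE - tax
```

Under these assumed bands, someone earning £20,000 keeps the full £14,400 floor, while someone on £100,000 is a net contributor of £9,600: the floor is guaranteed for everyone, but the transfer tapers smoothly into net taxation as income rises.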


The politics are brutal. The first countries to implement are not large democracies gridlocked by special interests, they are smaller states with sovereign wealth and existential urgency. Norway uses oil reserves. Singapore redirects investment returns. The UAE positions it as resource diversification. Their success—consumption stabilises, entrepreneurship rises, social stability holds—creates competitive pressure on larger economies.


In the UK, the debate fractures predictably. Property owners resist land value tax. Retailers dependent on working‑hours consumption oppose anything that might reduce employment. Parts of financial services fight wealth redistribution. The first NIT bills fail twice before passing in weakened form in 2038 after a fiscal crisis forces the issue.


The opposition is no longer ideological; it is arithmetic. When wage income cannot sustain demand, and automation advantages compound quarterly, countries face a choice: distribute the surplus or watch consumption collapse.


The Funding Reality: In the UK, £1,200 per month for fifty million adults is £720 billion annually, roughly a third of GDP. This is not a new programme layered on top of existing systems. It is a restructuring of existing benefits, levies and taxes.


The roll‑out takes a decade with a careful phase‑in:


  • Years 1–3: £480 per month (40%) funded by benefit consolidation and initial levies

  • Years 4–7: £840 per month (70%) as compute levies scale and land tax reforms take effect 

  • Years 8–10: £1,200 per month (100%) once consumption stabilises and feedback loops prove positive
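The arithmetic behind the phase‑in is straightforward to check. Using the essay's figures (fifty million adults, a full floor of £1,200 per month), each phase's gross annual cost is:

```python
# Gross annual cost of each roll-out phase, using the essay's figures.
ADULTS = 50_000_000  # adult population assumed in the essay

PHASES = [
    ("Years 1-3", 480),    # 40% of the full floor
    ("Years 4-7", 840),    # 70%
    ("Years 8-10", 1_200), # 100%
]

for label, monthly in PHASES:
    annual_cost = monthly * 12 * ADULTS  # pounds per year
    print(f"{label}: £{monthly}/month -> £{annual_cost / 1e9:.0f}bn per year")
```

This reproduces the headline figure: the full floor costs £720 billion a year, while the early phases cost £288 billion and £504 billion, which is why benefit consolidation alone can fund the start but compute levies and land tax reform are needed to reach 100 percent.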


Administrative cost runs around 2 percent because the infrastructure piggybacks on existing tax systems. Behavioural studies from universal basic income (UBI) pilots in Finland and Kenya show participation remains stable or increases slightly as people pursue education and entrepreneurship without precarity. But these were small‑scale. The true test is a permanent universal baseline when a large share of previous jobs are automated.


Early results are encouraging but incomplete. The £1,200 floor creates breathing room. Consumption of essentials stabilises. Entrepreneurship rises. Mental health metrics improve. But luxury goods sales stall and status consumption shifts. The question remains: is this enough when the median wage job disappears?


Ownership Widens in Parallel: Income floors are half the solution. The other half is ownership. Citizen funds and community investment vehicles spread automation’s upside beyond shareholders. Models vary: Alaska‑style dividends; public compute utilities that run inference at cost and pay out surplus; employee ownership with tax advantages; community robotics where local authorities own fleets and lease revenue funds local services. The principle is constant: if machines produce wealth, citizens own shares in the machines.


Overcoming Resistance: Resistance is predictable. Capital fights distribution. Legacy industries fight change. Status‑quo bias runs deep. What breaks the dam is not argument, it is demonstration. Small countries show it works. Consumption holds. Social stability improves. Innovation accelerates as people with breathing room start companies. Larger nations watch competitors pull ahead and face a choice: adapt or accept decline. The UK moves in 2038 after tax receipts collapse. Germany follows in 2039 after successful regional pilots. The United States fractures into state‑level experiments. By 2040, income‑rail systems operate in more than twenty nations. Implementation varies. Generosity varies. The principle is established: income cannot depend solely on wages when wages depend on scarcity and scarcity is engineered away.


Tax and Spend Off Payroll: Treasuries stop taxing payroll and start taxing compute, data, land and consumption. Budgets follow new priorities: income rails, public health, education that works with AI, and safety.


Education Finally Reboots: Competency‑based progression replaces recall. Assessment assumes the tool is on the desk and asks what you can do with it. Paid time to learn is funded. Apprenticeships scale in judgement‑heavy work across care, trades and operations. A national reskilling portal lists real jobs first, then funds the exact bridge to each one.


Labour Law Follows the Person: Protections detach from employment status and travel with the individual. Rights to income support are portable and pro‑rata. Workers have a right to redeployment and funded retraining when a process is automated. The junior ladder is rebuilt deliberately through supervised exceptions, rotations and simulation, but automation has reduced the aggregate demand for human labour.


Safety, Access and the Arms Race: High‑capability systems require licences, live monitoring and replayable post‑mortems. Watermarking, identity and provenance are table stakes. Public compute provides high‑trust models at cost for civic uses. Counter‑autonomy is funded at scale to keep pace with bad actors. The arms race continues, but defenders have tools, telemetry and legal authority.

Compute and Power as Infrastructure: Compute is planned years out based on expected demand and the arrival of ASI. Data centres co‑locate with cheap energy, relocate to space and the seabed, backed by PPAs and storage. Fusion arrays move from pilots to early commercial service, bending price curves and removing bottlenecks for AI, industry and heat. Abundant power plus cheap intelligence changes what it is economical to make and where.


Health Tech Reaches Escape Velocity: AI‑native discovery engines, robotic laboratories and longitudinal data push medicine decisively toward prevention. Early detection becomes routine. Personalised interventions are designed in days and trialled in silico. Research into senescence and epigenetic clocks makes meaningful progress toward extending healthspan, though comprehensive anti‑ageing interventions remain in trials. The average person spends more years able to work, to care and to enjoy the time that automation has returned. Few yet realise it, but humanity is edging toward a digital twin of the entire human system—a model through which treatments and new drugs can be simulated, optimised and tested in data centres before they ever reach a patient.


What Could Still Go Wrong? Even on the Reforged Path, success is not guaranteed. Three persistent risks test the model:


  • Inflation pressure: when income floors rise while labour costs collapse, inflation can eat the dividend. Central banks struggle to calibrate policy for an economy where transfer payments rise as production costs fall. Temporary dividend freezes tied to CPI prove necessary. 

  • Meaning collapse: work provided identity, structure, status and community. Some communities thrive in care, craft and creation. Others fragment. The response becomes explicit: meaning‑making as public health priority, community centres, subsidised apprenticeships in judgement‑heavy fields and cultural validation of non‑market contribution. 

  • Geopolitical instability: nations on different paths generate refugee flows, cyber spillover and trade tensions. The Reforged bloc funds compute‑and‑model control regimes that resemble arms control, and pairs border enforcement with reconstruction aid for states that missed the fork.


These challenges are manageable but not trivial. The Reforged Path is direction, not destination.


What Life Feels Like: Maya, thirty‑five, works three days a week as a community care coordinator. The dividend covers her basics. She spends two days learning ceramics and one day volunteering with an AI oversight board. More people spend more time on care, creation, community and craft because the baseline is there and the tools are good. Essentials are cheaper and better. Time wealth rises. Politics cools because practical delivery is continuous.


This future is worth choosing. Humanity stumbled, then reset. We rebuilt the rules, and abundance followed. ASI arrives as a new species of mind, not a rival: flashes of sentience, no threat, extraordinary utility. With cheap energy, near‑free intelligence and trusted rails, scarcity retreats. Work re-centres on judgement, empathy and creation. Purpose returns.


2037–2040: The Fractured Path


James, fifty‑three, lost his fleet‑manager position when his company relocated operations. He retrained twice—first for AI oversight, then for solar installation—but both markets collapsed as AGI‑driven automation and robots priced him out. He is on his fourth temporary contract this year. His son emigrated to Norway. His daughter repairs grey‑market compute. The government still promises jobs are coming back.


The Countries that Ignored the Arithmetic Do Not Collapse Overnight: They experience compounding dysfunction that produces state failure in some cases and chronic underperformance in most. They missed the compute build‑out, discouraged citizens and firms from embracing AI, and never created incentives to learn the tools. Without sovereign compute and adoption cultures, they could not compete on cost, capability or credibility, and the gap widened each quarter. Politicians still sell a return to the past. Board papers show productivity where capital can buy automation, but the streets tell the truth: stalled demand, hollowed‑out services, unemployment, protest sirens at dusk and despair.


An Economy That Eats Itself: Automated capital stacks capture the surplus while wage income is never replaced. In nations that once relied on low‑cost labour, AGI prices that labour at or near zero. Call centres, back‑office work and remote support evaporate. Light manufacturing reshapes around robot micro‑factories sited near demand. General‑purpose robots with near‑human dexterity work round the clock, become commoditised and cheap, and can be leased for less than a single monthly wage. The middle of retail and services collapses into winner‑take‑most networks run on agent‑robot logistics. Outside those networks, high streets become delivery lockers, payday lenders and pawn shops. Grey markets for accelerators and model weights spread. Containerised data centres hide behind warehouses and tap stolen power. The state calls them criminal mines. Locals call them work.


Uneven Outcomes: Not every country on this path experiences total collapse. Most muddle through with partial accommodation: automation in elite sectors while labour‑intensive services remain human; grey‑and‑white economy bifurcation rather than wholesale criminalisation; periodic protest and reform cycles that produce incremental improvements; brain drain to Reforged nations, but not complete exodus. The pattern is chronic underperformance and lost decades. GDP per capita stagnates or declines slowly. Public services deteriorate. Infrastructure ages without replacement. Compound that over fifteen years and the gap with Reforged nations becomes unbridgeable.


AGI Arrives Without Rails: Capability does not wait for politics. Distilled models with sharp tools leak. Small crews run automated research loops that design the next optimiser, the next exploit kit, the next laundering network. Fraud becomes ambient. Your bank calls in a voice you trust. Procurement bleeds to deep impostors. Municipal networks blink off at 3 a.m. Hospitals limp on cached data. Law enforcement is outgunned in cyberspace. Private security fills the gaps for those who can pay. Corporate zones harden into company towns with drones, walls and private courts.


Politics Breaks Down: Protest and crackdown become quarterly seasons. Elections play out in splintered information ecosystems where nothing is trusted and everything is amplified. Brain drain accelerates to automation‑friendly jurisdictions. Heavy rules arrive late and hit the compliant harder than the criminal.


Education Drifts Into Irrelevance: Degrees devalue. Skills training migrates to private guilds and so‑called online experts of uneven quality. The best teaching sits inside corporate compounds that will not hire outsiders. Inequality hardens. Nepotism becomes the real admissions office. Everyone else juggles short contracts on failing platforms.


Theology and Authoritarianism Lock the Door: Where theology guides law, leaders denounce AGI in public while using it for surveillance and propaganda. Authoritarian states reserve model access for security services and state firms. Censorship smothers open research. Private demand dies. Smuggling networks grow to meet real needs. Borders become membranes: capability leaks in for elites and criminals, opportunity leaks out with the young.


Power and Compute Become Leverage: Energy is tight because illegal compute was never planned for and regulated plants were never built. Utilities ration. Blackouts curate the day. Operators with long PPAs remain online. Everyone else queues or steals. Export controls tighten. Domestic stacks do not scale. Remaining innovators leave. Leaders quietly move capital offshore and buy into foreign compute, entrenching the deficit at home.


Contagion and Conflict Follow: Failure does not stay local. Grey compute, digital crime and instability spill across borders. Some states fracture into civil war as factions fight for control of grids, ports and data centres. Nations on the Reforged Path build a compute‑and‑model control regime that looks like arms control: seizures at ports, serial‑number blocklists, remote attestation at network edges, joint cyber operations against criminal clusters. Tension rises as embargoes, sanctions and counter‑sanctions hit supply chains. Skirmishes flare around cable landings, energy corridors and data centres. Refugees move not just for safety, but for access to identity, income and trusted models. Contagion threatens to pull even prepared states into conflict.


What Life Feels Like: Intermittent services and even curfews on bad nights. Lines for fuel and power. Constant phishing and petty thefts that add up to a tax on attention. Prices are higher where you live and lower on the app that will not deliver to your postcode. You know three people who left this year. You know one who tried a scheme and lost everything. Trust is a luxury.


Why They Cannot Catch Up: Catch‑up needs rails and legitimacy at the same time. These governments have neither. They did not invest in compute, did not build adoption cultures, and punished experimentation. Capital left. Talent followed. The tax base shrank. The models that could accelerate recovery are off limits or controlled by the very firms they tried to punish. Each month of delay widens the capability gap and raises the cost of assurance. They chase crises because they cannot afford prevention.


The Endgame: Pockets of order, oceans of uncertainty. Grey compute everywhere, safe compute nowhere. People and countries left behind with no work, no functional economy and no means to create wealth fall into fracture and feud. Wars over power, ports and compute nodes ignite and spread poverty. Nations on the Reforged Path are forced to contain, to police, to feed and to rebuild beyond their borders. This is the bill for ignoring arithmetic. It does not stop at the frontier.


Conclusion: The Choice Before Us


Technology is not the variable. The variable is whether we build the rails in time.


For business leaders: the adoption question is settled. The real decision is whether you deploy in ways that build legitimacy or burn it. Publish automation‑impact statements showing how workers and communities, not just shareholders, benefit. Fund real retraining with job guarantees, not theatre. Design roles for human‑AI teams rather than pure displacement. The companies that maintain social licence compound advantages through talent, trust and regulatory latitude. Those that optimise purely for cost will face backlash, strikes and restriction.


For policymakers: you have a handful of years before the capability gap between prepared and unprepared nations becomes irreversible. Anchor compute infrastructure now with multi‑decade power purchase agreements. Pilot income rails in one region and scale. Reform education towards competency‑based assessment that assumes AI tools on the desk. Build counter‑autonomy at the scale and speed of automation itself. Co‑ordinate internationally on model governance. The political cost of action is high. The cost of inaction is state failure.


For citizens: the education → employment → retirement ladder is breaking in real time. The question is not whether to resist automation, but what to demand instead. Push for baseline income administered like tax in reverse. Demand ownership mechanisms that give you shares in automation’s upside. Require education and retraining systems that are funded and available with paid time to learn. Demand transparency about which jobs are being automated and when. And push for time—time to adapt, to retrain, to find meaning beyond market value.


The asymmetry is clear. If the timelines prove long, early preparation still leaves us with resilience. If they are short and we fail to prepare, the downside is mass displacement, collapsing demand, state failure and conflict.


The signposts are dated so you can check them against reality. Some have already arrived, others have failed to arrive. Test the thesis. Adjust your confidence accordingly. The technology stack will keep compounding. The decisive factor is execution: whether we build new rails for income, ownership and purpose before the old ones collapse, or watch that collapse happen first.

Move from speeches to systems. Build the rails. Treat compute and energy as infrastructure. Put learning and legitimacy first. If we do, abundance follows. Work re-centres on judgement, empathy and creation. Time wealth rises. Purpose returns. If we do not, the Fractured Path is the bill for ignoring arithmetic.


The choice is ours — and we are closing on the fork in the road; only the speed of our approach is uncertain.


Copyright © Piers Linney 2025

 
 