Wallace & Gromit: Vengeance Most Fowl models on display in Bristol. This file is licensed under the Creative Commons Attribution-Share Alike 4.0 International license.

I have been watching Daniel Susskind’s lectures on AI and the future of work this week: Automation Anxiety was delivered in September and The Economics of Work and Technology earlier this week. The next in the series, entitled Economics and Artificial Intelligence, is scheduled for 13 January. They are all free and I highly recommend them for the great range of source material they present.

In my view the most telling graph, which featured in both lectures, was this one:

Original Source: Daniel Susskind A World Without Work

Susskind extended the usual concept of the ratio of average college and university graduate salaries to those of school leavers by including the equivalent ratio of craftsmen’s wages to labourers’, which gives us data back to 1220. There are two big collapses in this ratio in the data: one following the Black Death (1346–1353), which may have killed 50% of Europe’s 14th-century population, and one during the Industrial Revolution (a slow singularity that started around 1760 and then took us through the horrors of the First World War and the Great Depression before the graph finally picks up post Bretton Woods).

As Susskind shows, the profits from the Industrial Revolution were not going to workers:

Source: The Technology Trap, Carl Benedikt Frey

So how does the AI Rush compare? Well, Susskind shared another graph:

Source: David Autor Work of the Past, Work of the future

This, from 2019, introduced the idea that the picture is now more complex than high-skilled and low-skilled workers: there is now a middle. And, as Autor has set out more recently, the middle is getting squeezed:

Key dynamics at play include:

  • Labor Share Decline: OECD data reveal a 3–5 percentage point drop in labor’s share of income in sectors most exposed to AI, a trend likely to accelerate as automation deepens.
  • Wage Polarization: The labor market is bifurcating. On one end, high-complexity “sense-making” roles; on the other, low-skill service jobs. The middle is squeezed, amplifying both political risk and regulatory scrutiny.
  • Productivity Paradox 2.0: Despite the promise of AI-driven efficiency, productivity gains remain elusive. The real challenge is not layering chatbots atop legacy processes, but re-architecting workflows from the ground up—a costly and complex endeavor.

For enterprise leaders, the implications are profound. AI is best understood not as a job destroyer, but as a “skill-lowering” platform. It enables internal labor arbitrage, shifting work toward judgment-intensive, context-rich tasks while automating the rest. The risk is not just technological—it is deeply human. Skill depreciation now sits alongside cyber and climate risk on the board agenda, demanding rigorous workforce-reskilling strategies and a keen eye on brand equity as a form of social license.

So, even if the overall number of jobs may not be reduced, the case being made is that the average skill level required to carry them out will be. As Susskind said, the Luddites may have been wrong about the spinning jenny replacing jobs, but it did replace and transform tasks, and its impact on workers was to reduce their pay, quality of work, status as craftsmen and economic power. This looks like the threat being made by employers once again, with real UK wages still only at the level they were in 2008:

However this is where I part company with Susskind’s presentation, which has an implicit inevitability to it. The message is that these are economic forces we can’t fight against. When he discusses whether the substituting force (where AI replaces you) or the complementing force (where AI helps you to be more productive and increases the demand for your work) will be greater, it is almost as if we have no part to play in this. There is some cognitive dissonance when he quotes Blake, Engels, Marx and Ruskin about the horrors of living through such times, but on the whole it is presented as just a natural historical process that the whole of the profits from the massive increases in productivity of the Industrial Revolution should have ended up in the pockets of the fat guys in waistcoats:

Richard Arkwright, Sir Robert Peel, John Wilkinson and Josiah Wedgwood

I was recently at Cragside in Northumberland, where the arms inventor and dealer William Armstrong used the immense amount of money he made from selling big guns (as well as big cranes and the hydraulic mechanism which powers Tower Bridge) to deck out his house and grounds with the five artificial lakes required to power the world’s first hydro-electric lighting system. His 300 staff ran around, like good reverse-centaurs, trying to keep his various inventions, from passenger lifts to an automated spit roast, from breaking down, so that he could impress the long list of guests and potential clients he brought to Cragside, from the Shah of Persia to the King of Siam and two future Prime Ministers of Japan. He made sure his staff were kept running around with a series of clock chimes throughout the day:

However, with some poetic irony, the “estate regulator” is what has since brought the entire mechanism crashing to a halt:

Which brings me to Wallace and Gromit. Wallace is the inventor, heedless of the impact of his inventions on those around him and especially on his closest friend Gromit, whom he regularly dumps whenever he becomes inconvenient to his plans. Gromit just tries to keep everything working.

Wallace is a cheese-eating monster who cannot be assessed purely on the basis of his inventions. And neither can Armstrong, Arkwright, Peel, Wilkinson or Wedgwood. We are in the process of allowing a similar domination of our affairs by our new monsters:

Meta CEO Mark Zuckerberg beside Amazon CEO Jeff Bezos and his fiancée (now wife) Lauren, Google CEO Sundar Pichai and Elon Musk at President Trump’s 2nd Inauguration.

Around half an hour into his second lecture, Daniel Susskind started talking about pies. This is the GDP pie (Susskind has also written a recent book, Growth: A Reckoning, which argues that GDP growth can go on forever; my view would be closer to the critique here from Steve Keen) which, as Susskind says, increased by a factor of 113 in the UK between 1700 and 2000. But, as Steve Keen says:

The statistics strongly support Jevons’ perspective that energy—and specifically, energy from coal—caused rising living standards in the UK (see Figure 2). Coal, and not a hypothesised change in culture, propelled the rise in living standards that Susskind attributes to intangible ideas.

Source: https://www.themintmagazine.com/growth-some-inconvenient-truths/

Susskind talks about the productivity effect, he talks about the bigger pie effect and then he talks about the changing pie effect (ie changes to the types of work we do – think of the changes in the CPI basket of goods and services) as ways in which jobs are created by technological change. However he has nothing to say about just giving less of the pie to the monsters. Instead, for Susskind, the AI Rush is all about clever people throwing 10 times the amount of money at AI as was directed at the Manhattan Project, and the heads of OpenAI, Anthropic and Google DeepMind stating that AI will replace humans in all economically useful tasks within 10 years, a claim which he says we should take seriously. Cory Doctorow, amongst others, disagrees. In his latest piece, When AI prophecy fails, he has this to say about why companies have reduced recruitment despite the underperformance of AI systems to date:

All this can feel improbable. Would bosses really fire workers on the promise of eventual AI replacements, leaving themselves with big bills for AI and falling revenues as the absence of those workers is felt?

The answer is a resounding yes. The AI industry has done such a good job of convincing bosses that AI can do their workers’ jobs that each boss for whom AI fails assumes that they’ve done something wrong. This is a familiar dynamic in con-jobs.

The Industrial Revolution had a distribution problem which gave birth to Chartism, Marxism, the Trades Union movement and the Labour Party in the UK alone. And all of that activity only very slowly chipped away at the wealth share of the top 10%:

Source: https://equalitytrust.org.uk/scale-economic-inequality-uk/

However the monsters of the Industrial Revolution did at least have solid proof that they could deliver what they promised. You don’t get a more concrete proof of concept than this, after all:

View on the Thames and the opening Tower Bridge, London, from the terraces at Wapping High Street, at sunset in July 2013, Bert Seghers. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.

The AI Rush has a similar distribution problem, but it is also the first industrial revolution since the global finance industry decoupled from the global real economy. So the wealth share of the Top 10% isn’t going back up fast enough? No problem. Just redistribute the money at the top even further up:

What the monsters of the AI Rush lack is anything tangible to support their increasingly ambitious assertions. Wallace may be full of shit. The rest of us can either play a Gromit-like support role until we find out one way or the other, or concentrate on what builds resilient communities instead.

Whether you think the claims for the potential of AI are exaggerated; or that the giant bet on it that the US stock market has made will end in an enormous depression; or that the energy demands of this developing technology will ultimately be its constraining force; or that we are all just making the world a colder place by prioritising systems, however capable, over people: take your pick as a reason to push back against the AI Rush. But my bet would be on the next 10 years not being dominated by breathless commentary on the exploits of Tech Bros.

The warehouse at the end of Raiders of the Lost Ark

In the year when I was born, Malvina Reynolds recorded a song called Little Boxes when she was a year younger than I am now. If you haven’t heard it before, you can listen to it here. You might want to listen to it while you read the rest of this.

I remember the first time I felt panic during the pandemic. It was a couple of months in, and we had been working very hard: putting our teaching processes online, consulting widely about appropriate remote assessments and getting agreement from the Institute and Faculty of Actuaries (IFoA) for our suggested approach at Leicester, checking in with our students, some of whom had become very isolated as a result of lockdowns, and a million other things. I was just sitting at my kitchen table and suddenly I felt tears welling up and I was unable to speak without my voice breaking down. It happened at intervals after that, usually during a quiet moment when I, consciously or unconsciously, had a moment to reflect on the enormity of what was going on. I could never point to anything specific that triggered it, but I do know that it has been a permanent change in me, and that my emotions have been very much closer to the surface ever since. I felt something similar again this morning.

What is going on? Well, I haven’t been able to answer that satisfactorily until now, but recently I read an article by David Runciman in the LRB from nine years ago, when Donald Trump was first elected POTUS. I am not sure that everything in the article has withstood the test of time, but in it Runciman makes the case for Trump being the result of the people wanting “Trump to shake up a system that they also expected to shield them from the recklessness of a man like Trump”. And this part looks prophetic:

[Trump is]…the bluntest of instruments, indiscriminately shaking the foundations with nothing to offer by way of support. Under these conditions, the likeliest response is for the grown-ups in the room to hunker down, waiting for the storm to pass. While they do, politics atrophies and necessary change is put off by the overriding imperative of avoiding systemic collapse. The understandable desire to keep the tanks off the streets and the cashpoints open gets in the way of tackling the long-term threats we face. Fake disruption followed by institutional paralysis, and all the while the real dangers continue to mount. Ultimately, that is how democracy ends.

And it suddenly hit me that this was something I had indeed taken for granted my whole life until the pandemic came along. The only thing that had ever looked like toppling society itself was the prospect of a nuclear war. Otherwise it seemed that our political system was hard to change and impossible to kill.

And then the pandemic came along and we saw governments, national and local, digging mass graves and then filling them in again, and setting aside vast arenas for people to die in before quietly closing them again. Rationing of food and other essentials was left to the supermarkets to administer, as were the massive snaking socially-distanced queues around their car parks. Seemingly arbitrary sets of rules suddenly started appearing at intervals about how and when we were allowed to leave the house and what we were allowed to do when out, and also how many people we could have in our houses and where they were allowed to come from. Most businesses were shut and their employees put on the government’s payroll. We learned which of us were key workers and spent a lot of time worrying about how we could protect the NHS, for whom we clapped every Thursday. It was hard to maintain the illusion that society still provided solid ground under our feet, particularly if we didn’t have jobs which could be moved online. Whoever you were, you had to look down at some point, and I think now that I was having my Wile E. Coyote moment.

The trouble is, once you have looked down, it is hard to put that back in a box. At least I thought so, although there seems to have been a lot of putting things in boxes going on over the last few years. The UK Covid-19 Inquiry has made itself available online via a YouTube channel, but you might have thought that a Today at the Inquiry slot on terrestrial TV would have been more appropriate, rather than just covering it when famous people attend. What we do know is that Patrick Vallance, Chief Scientific Advisor throughout the pandemic, has said that another pandemic is “absolutely inevitable” and that “we are not ready yet” for such an eventuality. Instead we have been busily shutting that particular box.

The biggest box of course is climate change. We have created a really big box for that called the IPCC. As the climate conferences migrate to ever more unapologetic petro-states, protestors are criminalised and imprisoned and emissions continue to rise, the box for this is doing a lot of work.

And then there are all the NHS boxes. As Roy Lilley notes:

If inquiries worked, we’d have the safest healthcare system in the world. Instead, we have a system addicted to investigating itself and forgetting the answers.

But perhaps the days of the box are numbered. The box Keir Starmer constructed to contain the anger about grooming gangs, which the previous seven-year-long box had been unable to completely envelop, also now appears to be on the edge of collapse. And the Prime Minister himself was the one expressing outrage when a perfectly normal British box, versions of which have been giving authority to policing decisions since at least the Local Government (Review of Decisions) Act 2015 (although the original push to develop such systems stemmed from the Hillsborough and Heysel disasters of 1989 and 1985 respectively), suddenly didn’t make the decision he was obviously expecting. That box now appears to be heading for recycling if Reform UK come to power, which is, of course, rather difficult to do in Birmingham at the moment.

But what is the alternative to the boxes? At the moment it does not look like it involves confronting our problems any more directly. As Runciman reflected on the second Trump inauguration:

Poor Obama had to sit there on Monday and witness the mistaking of absolutism for principle and spectacle for politics. I don’t think Trump mistakes them – he doesn’t care enough to mind what passes for what. But the people in the audience who got up and applauded throughout his speech – as Biden and Harris and the Clintons and the Bushes remained glumly in their seats – have mistaken them. They think they will reap the rewards of what follows. But they will also pay the price.

David Allen Green’s recent post on BlueSky appears to summarise our position relative to that of the United States very well:

To Generation Z: a message of support from a Boomer

So you’ve worked your way through school and now university, developing the skills you were told would always be in high demand, credentialising yourself as a protection against the vagaries of the global economy. You may have serious doubts about ever being able to afford a house of your own, particularly if your area of work is very concentrated in London…

…and you resent the additional tax that your generation pays to support higher education:

Source: https://taxpolicy.org.uk/2023/09/24/70percent/

But you still had belief in being able to operate successfully within the graduate market.

A rational, functional graduate job market should be assessing your skills and competencies against the desired attributes of those currently performing the role and making selections accordingly. That is a system both companies and graduates can plan for.

It is very different from a Rush. The first phenomenon known as a Rush was the Californian Gold Rush of 1848–55, although the capitalist phenomenon of transforming an area to facilitate intensive production probably dates from sugar production in Madeira in the 15th century. There have been many since, all neatly described by this Punch cartoon from 1849:

A Rush is a big deal. The Californian Gold Rush resulted in the creation of California, now the 5th largest economy in the world. But when it comes to employment, a Rush is not like an orderly jobs market. As Carlo Iacono describes, in an excellent article on the characteristics of the current AI Rush:

The railway mania of the 1840s bankrupted thousands of investors and destroyed hundreds of companies. It also left Britain with a national rail network that powered a century of industrial dominance. The fibre-optic boom of the late 1990s wiped out about $5 trillion in market value across the broader dot-com crash. It also wired the world for the internet age.

A Rush is a difficult and unpredictable place to build a career, with a lot riding on dumb luck as much as any personal characteristics you might have. There is very little you can count on in a Rush. This one is even less predictable because as Carlo also points out:

When the railway bubble burst in the 1840s, the steel tracks remained. When the fibre-optic bubble burst in 2001, the “dark fibre” buried in the ground was still there, ready to carry traffic for decades. These crashes were painful, but they left behind durable infrastructure that society could repurpose.

Whereas the 40–60% of US real GDP growth in the first half of 2025 explained by investment in AI infrastructure isn’t like that:

The core assets are GPUs with short economic half-lives: in practice, they’re depreciated over ~3–5 years, and architectures are turning over faster (Hopper to Blackwell in roughly two years). Data centres filled with current-generation chips aren’t valuable, salvageable infrastructure when the bubble bursts. They’re warehouses full of rapidly depreciating silicon.
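Taking the quoted figures at face value, the arithmetic of those short half-lives is stark. A minimal sketch in Python, with a purely illustrative fleet cost and a hypothetical book_value helper (neither is from the article):

```python
# Straight-line depreciation of a GPU fleet to zero over its useful life.
# The four-year life and $1bn fleet cost are illustrative assumptions only,
# chosen to sit inside the ~3-5 year range quoted above.
def book_value(cost, years_elapsed, useful_life=4):
    """Remaining book value after straight-line depreciation."""
    remaining_fraction = max(0.0, 1 - years_elapsed / useful_life)
    return cost * remaining_fraction

fleet_cost = 1_000_000_000  # $1bn of current-generation chips
for year in range(5):
    print(f"Year {year}: ${book_value(fleet_cost, year):,.0f}")
```

On these assumptions, half the value is gone in two years, roughly the gap between the Hopper and Blackwell generations, and all of it in four.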

So today’s graduates are certainly going to need resilience, but that’s just what their future employers are requiring of them. They also need to build their own support structures, which are going to see them through the massive disruption which is coming whether or not the enormous bet on AI is successful. The battle to be centaurs, rather than reverse-centaurs, as I set out in my last post (or as Carlo Iacono describes beautifully in his discussion of the legacy of the Luddites here), requires these alliances: to stop thinking of yourselves as being in competition with each other and start thinking of yourselves as being in competition for resources with my generation.

I remember when I first realised my generation (late Boomer, just before Generation X) was now making the weather. I had just sat a 304 Pensions and Other Benefits actuarial exam in London (now SP4 – unsuccessfully as it turned out), and nipped in to a matinee of Sam Mendes’ American Beauty and watched the plastic bag scene. I was 37 at the time.

My feeling is that our generation is now deservedly losing power, and that, despite our increasingly strident efforts to retain it, our last-ditch attempt to remain in control is to make reverse-centaurs of your generation. It is like the scene in another movie, Triangle of Sadness, where the elite are swept onto a desert island and expect the servant, who is the only one with survival skills in such an environment, to carry on being their servant.

Don’t fall for it. My advice to young professionals is pretty much the same as it was to actuarial students last year on the launch of chartered actuary status:

If you are planning to join a profession to make a positive difference in the world, and that is in my view the best reason to do so, then you are going to have to shake a few things up along the way.

Perhaps there is a type of business you think the world is crying out for but it doesn’t know it yet because it doesn’t exist. Start one.

Perhaps there is an obvious skill set to run alongside your professional one which most of your fellow professionals haven’t realised would turbo-charge the effectiveness of both. Acquire it.

Perhaps your company has a client no one has taken the time to put themselves in the shoes of and communicate with in a way they will properly understand and value. Be that person.

Or perhaps there are existing businesses who are struggling to manage their way in changing markets and need someone who can make sense of the data which is telling them this. Be that person.

All while remaining grounded in whichever community you have chosen for yourself. Be the member of your organisation or community who makes it better by being there.

None of these are reverse centaur positions. Don’t settle for anything less. This is your time.

In 2017, I was rather excitedly reporting about ideas which were new to me at the time regarding how technology or, as Richard and Daniel Susskind referred to it in The Future of the Professions, “increasingly capable machines” were going to affect professional work. I concluded that piece as follows:

The actuarial profession and the higher education sector therefore need each other. We need to develop actuaries of the future coming into your firms to have:

  • great team working skills
  • highly developed presentation skills, both in writing and in speech
  • strong IT skills
  • clarity about why they are there and the desire to use their skills to solve problems

All within a system which is possible to regulate in a meaningful way. Developing such people for the actuarial profession will need to be a priority in the next few years.

While all of those things are clearly still needed, it is becoming increasingly clear to me now that they will not be enough to secure a job as industry leaders double down.

Source: https://www.ft.com/content/99b6acb7-a079-4f57-a7bd-8317c1fbb728

And perhaps even worse than the threat of not getting a job immediately following graduation is the threat of becoming a reverse-centaur. As Cory Doctorow explains the term:

A centaur is a human being who is assisted by a machine that does some onerous task (like transcribing 40 hours of podcasts). A reverse-centaur is a machine that is assisted by a human being, who is expected to work at the machine’s pace.

We have known about reverse-centaurs since at least Charlie Chaplin’s Modern Times in 1936.

By Charlie Chaplin – YouTube, Public Domain, https://commons.wikimedia.org/w/index.php?curid=68516472

Think Amazon driver or worker in a fulfilment centre, sure, but now also think of highly competitive and well-paid but still ultimately human-in-the-loop roles with responsibility for AI systems designed to produce output whose errors are hard to spot and therefore to stop. In the latter role you are the human scapegoat: in the phrasing of Dan Davies an “accountability sink”, or in that of Madeleine Clare Elish a “moral crumple zone”, all rolled into one. This is not where you want to be as an early career professional.

So how to avoid this outcome? Well obviously if you have other options to roles where a reverse-centaur situation is unavoidable you should take them. Questions to ask at interview to identify whether the role is irretrievably reverse-centauresque would be of the following sort:

  1. How big a team would I be working in? (This might not identify a reverse-centaur role on its own: you might be one of a bank of reverse-centaurs all working in parallel and identified “as a team” while in reality having little interaction with each other).
  2. What would a typical day be in the role? This should smoke it out unless the smokescreen they put up obscures it. If you don’t understand the first answer, follow up to get specifics.
  3. Who would I report to? Get to meet them if possible. Establish whether they are a technical expert in the field you will be working in. If they aren’t, that means you are!
  4. Speak to someone who has previously held the role if possible. Although bear in mind that, if it is a true reverse-centaur role and their progress to an actual centaur role is contingent on you taking this one, they may not be completely forthcoming about all of the details.

If you have been successful in a highly competitive recruitment process, you may have a little bit of leverage before you sign the contract, so if there are aspects which you think still need clarifying, then that is the time to do so. If you recognise some reverse-centauresque elements from your questioning above, but you think the company may be amenable, then negotiate. Once you are in, you will understand a lot more about the nature of the role of course, but without threatening to leave (which is as damaging to you as an early career professional as it is to them) you may have limited negotiation options at that stage.

In order to do this successfully, self knowledge will be key. It is that point from 2017:

  • clarity about why they are there and the desire to use their skills to solve problems

To that word skills I would now add “capabilities” in the sense used in a wonderful essay on this subject by Carlo Iacono called Teach Judgement, Not Prompts.

You still need the skills. So, for example, if you are going into roles where AI systems are producing code, you need sufficiently good coding skills yourself to write a programme to check the code written by the AI system. If the AI system is producing communications, your own communication skills need to go beyond producing work that communicates to an audience effectively, to the next level where you understand what it is about your own communication that achieves that: what is necessary, what is unnecessary, and what gets in the way of effective communication, ie all of the things that the AI system is likely to get wrong. Then you have a template against which to assess the output from an AI system, and for designing better prompts.
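To make the coding case concrete: checking AI-written code need not mean reading every line; you can write a harness that tests it against a trusted reference or a known property. A minimal sketch in Python, in which ai_sort is a hypothetical stand-in for whatever routine the AI system produced:

```python
import random

def ai_sort(items):
    # Stand-in for AI-generated code; in practice this would be
    # whatever routine the AI system handed you.
    return sorted(items)

def check_ai_sort(trials=200):
    """Compare the AI routine against Python's trusted built-in sorted()
    on randomly generated inputs, returning any counterexample found."""
    for _ in range(trials):
        data = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        if ai_sort(list(data)) != sorted(data):
            return False, data  # counterexample: the AI code is wrong here
    return True, None

ok, counterexample = check_ai_sort()
print("all checks passed" if ok else f"failed on {counterexample}")
```

The same pattern, comparing output against an independent oracle or invariant, scales to far less trivial code than sorting; the point is that writing the checker is itself a skill you have to possess.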

However specific skills and tools come and go, so you need to develop something more durable alongside them. Carlo has set out four “capabilities” as follows:

  1. Epistemic rigour, which is being very disciplined about challenging what we actually know in any given situation. You need to be able to spot when AI output is over-confident given the evidence, or when a correlation is presented as causation. What my tutors used to refer to as “hand waving”.
  2. Synthesis is about integrating different perspectives into an overall understanding. Making connections between seemingly unrelated areas is something AI systems are generally less good at than analysis.
  3. Judgement is knowing what to do in a new situation, beyond obvious precedent. You get to develop judgement by making decisions under uncertainty, receiving feedback, and refining your internal models.
  4. Cognitive sovereignty is all about maintaining your independence of thought when considering AI-generated content. Knowing when to accept AI outputs and when not to.

All of these capabilities can be developed with reflective practice, getting feedback and refining your approach. As Carlo says:

These capabilities don’t just help someone work with AI. They make someone worth augmenting in the first place.

In other words, if you can demonstrate these capabilities, companies who themselves are dealing with huge uncertainty about how much value they are getting from their AI systems and what they can safely be used for will find you an attractive and reassuring hire. Then you will be the centaur, using the increasingly capable systems to improve your own and their productivity while remaining in overall control of the process, rather than a reverse-centaur for which none of that is true.

One sure sign that you are straying into reverse-centaur territory is when a disproportionate amount of your time is spent on pattern recognition (eg basing an email, piece of coding or valuation report on an earlier one dealing with a similar problem). That approach was always predicated on being able to interact, at some peer review stage, with a more experienced human who understood what was involved in the task. But it falls apart when there is no human to discuss the earlier piece of work with, because the human no longer works there or because a human didn’t produce the earlier piece of work. The fake-it-until-you-make-it approach is not going to work in environments like these, where you are more likely to fake it until you break it. And pattern recognition is something an AI system will always be able to do much better and faster than you.

Instead, question everything using the capabilities you have developed. If you are going to be put into potentially compromising situations in terms of the responsibilities you are implicitly taking on, the decisions needing to be made and the limitations of the available knowledge and assumptions on which those decisions will need to be based, then this needs to be made explicit, to yourself and the people you are working with. Clarity will help the company which is trying to use these new tools in a responsible way as much as it helps you. Learning is going to be happening for them as much as it is for you here in this new landscape.

And if the company doesn’t want to have these discussions or allow you to hamper the “efficiency” of their processes by trying to regulate them effectively? Then you should leave as soon as you possibly can professionally and certainly before you become their moral crumple zone. No job is worth the loss of your professional reputation at the start of your career – these are the risks companies used to protect their senior people of the future from, and companies that are not doing this are clearly not thinking about the future at all. Which is likely to mean that they won’t have one.

To return to Cory Doctorow:

Science fiction’s superpower isn’t thinking up new technologies – it’s thinking up new social arrangements for technology. What the gadget does is nowhere near as important as who the gadget does it for and who it does it to.

You are going to have to be the generation who works these things out first for these new AI tools. And you will be reshaping the industrial landscape for future generations by doing so.

And the job of the university and further education sectors will increasingly be to equip you with both the skills and the capabilities to manage this process, whatever your course title.

What comes next in the following sequence: 650, 400, 300, …? More on this in a minute.

I decided to make a little table with the help of the Oxford English Dictionary to summarise the usage of most of the words Eric Hobsbawm listed at the beginning of his Age of Revolution, 1789-1848. I have highlighted all of the meanings not in use until at least 1800 below:

**Industry**: Since 1500 it has had a meaning of productive work, trade, or manufacture. In later use esp.: manufacturing and production carried out on a commercial basis, typically organized on a large scale and requiring the investment of capital. **Since 1801**, manufacturing or production, and those involved in it, regarded as an entity, esp. owners or managers of companies, factories, etc., regarded as influential figures, esp. with regard to investment in an economy.

**Industrialist**: **Since 1839**, to denote a person engaged in or connected with industry.

**Factory**: Since 1618, a location or premises in which a product is manufactured; esp. a building or range of buildings with plant for the manufacture or assembly of goods or for the processing of substances or materials.

**Middle Class**: Since 1654, a class of society or social grouping between an upper and a lower (or working) class, usually regarded as including professional and business people and their families; (in singular and plural) the members of such a class. However only **since 1836**, of, relating to, or designating the middle class. And only **since 1846**, characteristic of the middle class; having the characteristics of the middle classes. Esp. in middle-class morality. Frequently derogatory.

**Working Class**: Since 1757, a class of society or social grouping consisting of people who are employed for wages, esp. in unskilled or semi-skilled manual or industrial work, and their families, and which is typically considered the lowest class in terms of economic level and social status; (with the, in singular and plural) the members of such a class. However only **since 1833**, of, belonging to, or characteristic of the working class.

**Capitalist**: Since 1774, a person who possesses capital assets, esp. one who invests these, esp. for profit, in financial and business enterprises. Also: an advocate of capitalism or of an economic system based on capitalism.

**Capitalism**: **Since 1833**, the practices or principles of capitalists; the dominance of capitalists in financial and business enterprises; esp. an economic system based on wage labour in which the means of production is controlled by private or corporate interests for the purpose of profit, with prices determined largely by competition in a free market.

**Socialism**: **Since 1833**, frequently with capital initial. A theory or system of social organization based on state or collective ownership and regulation of the means of production, distribution, and exchange for the common benefit of all members of society; advocacy or practice of such a system, esp. as a political movement. Now also: any of various systems of liberal social democracy which retain a commitment to social justice and social reform, or feature some degree of state intervention in the running of the economy.

**Marxism**: **Since 1883**, the ideas, theories, and methods of Karl Marx; esp. the political and economic theories propounded by Marx together with Friedrich Engels, later developed by their followers to form the basis for the theory and practice of communism.

**Aristocracy**: Since 1561 it has had a meaning, in the literal sense of the Greek, of the government of a state by its best citizens. Since 1651, the class to which such a ruling body belongs, a patrician order; the collective body of those who form a privileged class with regard to the government of their country; the nobles. The term is popularly extended to include all those who by birth or fortune occupy a position distinctly above the rest of the community, and is also used figuratively of those who are superior in other respects.

**Railway**: Since 1681, a roadway laid with rails (originally of wood, later also of iron or steel) along which the wheels of wagons or trucks may run, in order to facilitate the transport of heavy loads, originally and chiefly from a colliery; a wagonway. **Since 1822** (despite the first railway not being opened until 1825), a line or track typically consisting of a pair of iron or steel rails, along which carriages, wagons, or trucks conveying passengers or goods are moved by a locomotive engine or other powered unit. Also: a network or organization of such lines; a company which owns, manages, or operates such a line or network; this form of transportation.

**Nationality**: Since 1763, national origin or identity; (Law) the status of being a citizen or subject of a particular state; the legal relationship between a citizen and his or her state, usually involving obligations of support and protection; a particular national identity. Also: the legal relationship between a ship, aircraft, company, etc., and the state in which it is registered. **Since 1832**, a group of persons belonging to a particular nation; a nation; an ethnic or racial group.

**Scientist**: **Since 1834**, a person who conducts scientific research or investigation; an expert in or student of science, esp. one or more of the natural or physical sciences.

**Engineer**: Since 1500, originally: a person who designs or builds engines or other machinery. Subsequently more generally: a person who uses specialized knowledge or skills to design, build, and maintain complicated equipment, systems, processes, etc.; an expert in or student of engineering. Frequently with distinguishing word. From the later 18th cent. onwards mainly with reference to mechanical, chemical, electrical, and similar processes; later (chiefly with distinguishing word) also with reference to biological or technological systems. Since 1606, a person whose profession is the designing and constructing of works of public utility, such as bridges, roads, canals, railways, harbours, drainage works, etc.

**Proletariat**: **Since 1847**, wage earners collectively, esp. those who have no capital and who depend for subsistence on their daily labour; the working classes. Esp. with reference to Marxist theory, in which the proletariat are seen as engaged in permanent class struggle with the bourgeoisie, or with those who own the means of production.

**Crisis**: Since 1588, originally: a state of affairs in which a decisive change for better or worse is imminent; a turning point. Now usually: a situation or period characterized by intense difficulty, insecurity, or danger, either in the public sphere or in one’s personal life; a sudden emergency situation. Also as a mass noun, esp. in in crisis.

**Utilitarian**: **Since 1802**, of philosophy, principles, etc.: consisting in or based upon utility; spec. that regards the greatest good or happiness of the greatest number as the chief consideration or rule of morality. **Since 1830**, of or pertaining to utility; relating to mere material interests. **Since 1847**, in quasi-depreciative use: having regard to mere utility rather than beauty, amenity, etc.

**Statistics**: **Since 1839**, the systematic collection and arrangement of numerical facts or data of any kind; (also) the branch of science or mathematics concerned with the analysis and interpretation of numerical data and appropriate ways of gathering such data.

**Sociology**: **Since 1842**, the study of the development, structure, and functioning of human society. **Since 1865**, the sociological aspects of a subject or discipline; a particular sociological system.

**Journalism**: **Since 1833**, the occupation or profession of a journalist; journalistic writing; newspapers and periodicals collectively.

**Ideology**: By 1796, (a) the study of ideas; that branch of philosophy or psychology which deals with the origin and nature of ideas; (b) spec. the system introduced by the French philosopher Étienne Condillac (1715–80), according to which all ideas are derived from sensations. **By 1896**, a systematic scheme of ideas, usually relating to politics, economics, or society and forming the basis of action or policy; a set of beliefs governing conduct. Also: the forming or holding of such a scheme of ideas.

**Strike**: **Since 1810**, a concerted cessation of work on the part of a body of workers, for the purpose of obtaining some concession from the employer or employers. Formerly sometimes more explicitly strike of work. Cf. strike v. IV.24, IV.24b. Phrases: on strike, also (U.S.) on a strike. Frequently with preceding qualifying word, as general strike, outlaw strike, selective strike, sit-down strike, stay-away strike, stay-down strike, stay-in strike, sympathetic strike, wildcat strike: see under the first elements. Also figurative. **Since 1889**, a concerted abstention from a particular economic, physical, or social activity on the part of persons who are attempting to obtain a concession from an authority or to register a protest; esp. in hunger strike, rent strike.

**Pauperism**: Since 1792, the condition of being a pauper; extreme poverty; = pauperdom n. **Since 1807**, the existence of a pauper class; poverty, with dependence on public relief or charity, as an established fact or phenomenon in a society. Now chiefly historical.
Source: https://www.oed.com/dictionary/ (subscription needed for full access)

Now try and imagine having a conversation about politics, economics, your job, the news, or even what you watched last night on TV without using any of these words. Try and imagine any of our politicians getting through an interview of any length without resorting to *industry*, *ideology*, *statistics*, *nationality* or *crisis*. Let’s call us now Lemmy (Late Modern) and us then Emily (Early Modern):

Lemmy: We need to send back people who arrive here illegally if they are a different nationality.

Emily: What’s a nationality?

Lemmy: Failing to do so is based on woke ideology.

Emily: What’s an ideology? And what has my state of wakefulness got to do with it?

Lemmy: This is a crisis.

Emily: Is that a good crisis or a bad crisis?

Lemmy: All crises are bad.

You get the idea.

The 1700s are divided from us by a political and economic language which would have been almost unrecognisable to the people who lived then.

However the other thing that occurs to me is that 1800 is quite a while ago now. The approximate date boundaries of the various iterations of English are often presented as follows:

Source: https://www.myenglishlanguage.com/history-of-english/

Back to my sequence. We are up to 225 years now since the last major shift. So why do we still base our political and economic discussions on the language of the early 1800s?
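(For anyone still puzzling over it: the sequence is just the length, in years, of each successive period of English. A quick sketch, using the approximate boundary dates commonly given; the exact years vary by scholar, and the end date of Late Modern English is simply today.)

```python
# Approximate boundary dates for the periods of English (assumed round
# figures of the kind given by histories of the language; scholars differ).
periods = {
    "Old English": (450, 1100),
    "Middle English": (1100, 1500),
    "Early Modern English": (1500, 1800),
    "Late Modern English": (1800, 2025),  # still running
}

# Length of each period so far, in years.
durations = [end - start for start, end in periods.values()]
print(durations)  # [650, 400, 300, 225]
```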

Well perhaps only our politicians and the people who volunteer to be in the Question Time audience do. As Carlo Iacono puts it brilliantly here in response to James Marriott’s essay The dawn of the post-literate society:

The future Marriott fears, where we’re all reduced to emotional, reactive creatures of the feed, is certainly one possibility. But it’s not inevitable. The teenagers I see who code while listening to philosophy podcasts, who annotate videos with critical commentary, who create elaborate multimedia presentations synthesising dozens of sources: they’re not the degraded shadows of their literate ancestors. They’re developing new forms of intellectual engagement that we’re only beginning to understand.

In the spirit of the slow singularity, perhaps the transition is already happening, but will only be recorded on a timeline when it is more established. Take podcasts, for instance. Ofcom’s latest Media Nations report from 2024 says this:

After a dip in the past couple of years, it seems that 15-24-year-olds are getting back into podcasts, while 35-44s are turning away. Podcasts are still most popular among adults aged 25-34, with weekly reach increasing to 27.9% in the last year. The over-54s remain less likely than average to listen to podcasts, but in contrast to the fluctuation in younger age groups, reach has been steadily increasing among over-54s in the past five years.

It may be that the important language of the next century is already developing out of sight of most politicians and political commentators.

And the people developing it are likely to have just as hard a time holding a conversation with our current rulers as Lemmy is with Emily.