I recently finished reading an excellent book about how to read Russian short stories: A Swim in a Pond in the Rain by George Saunders. Of course it is about much more than this, drawing on George’s 20 years of experience teaching a creative writing course at Syracuse University and his own writing experience (primarily a short story writer, he won the Booker Prize for his first novel, Lincoln in the Bardo, in 2017). It has caused me to think more deeply about my own teaching (I teach some mathematics, economics, professional skills and communication skills). The two pages where George talks about finding his literary “voice” are, for me, worth the price of the book on their own – I never really understood how critical this was, nor why no one I had read had talked about it in very clear terms before. I can also, at long last, see the point of literary criticism. This book is all about the fight for meaning, and a bare-knuckled fight it is at times.

I think finding your own voice can apply in any field, not just the creative ones. George describes with huge power realising that he did not belong on Hemingway Mountain, and the process of finally accepting his own “Shit Hill”. But at least as a writer you know you are supposed to be finding your own distinct way of writing. I sometimes think that, in many professional careers, this is not widely encouraged.

However it is, in my view, massively important. Finding your voice in a professional career is about discovering what you are good at and what you are interested in, and trying to bridge the gap between the two. It is about being prepared to learn from those around you, although not necessarily the thing they think they are trying to teach you. It is about being prepared to spend time, sometimes considerable time, on mastering things which are important to you, even if they seem to have no importance to anyone else. In this way you will develop an independent professional career where you have something interesting to say in your chosen field.

This may sound very utopian to some, particularly those in the early years of a career, where you may have little control over your workload or the structure of your working day. However that will not be the case forever unless you choose it to be and, provided you do not lose the habit of finding your own voice in the meantime, the opportunities to do so will only grow.

What you may have gathered from this is that I see finding your voice, not as some quick process that takes place over a short period at the start of your career (at least not in the pursuits I have been involved in so far), but as a lifetime’s search. Mine didn’t really start until I was 40 and, health willing, will carry on for many years to come (I still have no real idea what my voice as a writer is yet).

Let’s all wish each other good luck on the quest!

Last week, the news from The Actuary magazine was that climate change could slash global GDP by 18%. This was based on a Swiss Re report, The Economics of Climate Change, from which the analysis above is taken.

According to the report, “The current trajectory of temperature increases, assuming action with respect to climate change mitigation pledges, points to global warming of 2.0–2.6°C by mid-century.” It was unclear why they had decided to stop at 2050, when current commitments continue to push temperatures up until 2100, and the scenarios from the IPCC’s AR5 Synthesis Report (see below) show that the path we are currently on diverges far more considerably from the Paris agreements after 2050. Climate effects are very long-term, and many of the impacts of 2–3°C warming would be irreversible, ensuring continuing losses at similar or greater levels for decades to come. And that is before we even consider the much higher probabilities of feedback effects: from the melting of the permafrost, additional methane releases, loss of Amazonian carbon and the loss of the albedo reflectivity of Arctic ice. The Swiss Re report makes clear that it has not considered these.

You might notice that there is a separate column to the left, in a different colour, with the title “Well-below 2°C increases” and the sub-title “Paris target”. This target is actually an agreement which 189 countries, including the UK, have signed up to. As the Paris Agreement says (Article 2, Point 1):

This Agreement, in enhancing the implementation of the Convention, including its objective, aims to strengthen the global response to the threat of climate change, in the context of sustainable development and efforts to eradicate poverty, including by:
(a) Holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change;

There has been some debate over whether the Agreement is aiming for 1.5°C warming with a 50% chance of staying below it, or for “well below” 1.5°C similar to the 2°C goal with a 66% chance of avoiding more than 1.5°C warming, but the modelling used for the next IPCC report has adopted the latter definition. Either way, I cannot see why Swiss Re has decided to put the Paris Agreement targets in a different column from what it calls the “likely range of temperature gains” as if those we have committed to are no longer feasible to aim at.

In saying this, I do not underestimate the massive challenge of keeping to the Paris target. As Mark Lynas says in Our Final Warning, at the end of 2018 over 1,000 GW of additional fossil-fuelled electrical power generation capacity was planned, permitted or already under construction around the world, equivalent to adding a further 188 Gt of CO2 to the atmosphere on top of the 658 Gt already baked in from existing infrastructure – a total of 846 Gt of CO2, not including impacts from deforestation, agriculture and future land-use change. This compares to a remaining carbon budget, as estimated at the end of 2018 by the IPCC (although estimates of this vary considerably), of 420 Gt of CO2 (or 1,170 Gt of CO2 for 2°C warming). So an extraordinary change of direction is required, and we should be very cautious of getting anywhere near these limits when we do not know precisely where they are.
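A quick back-of-envelope check of the figures above – this is my own sketch, using only the numbers quoted from Lynas and the IPCC:

```python
# Carbon budget arithmetic from the paragraph above (all figures Gt CO2).
planned = 188          # emissions implied by planned/permitted fossil capacity
existing = 658         # already "baked in" from existing infrastructure
committed = planned + existing

budget_1p5 = 420       # IPCC end-2018 remaining budget for 1.5°C warming
budget_2p0 = 1170      # remaining budget for 2°C warming

print(f"Committed emissions: {committed} Gt")                      # 846 Gt
print(f"Overshoot of 1.5°C budget: {committed - budget_1p5} Gt")   # 426 Gt
print(f"Share of 2°C budget consumed: {committed / budget_2p0:.0%}")
```

On these numbers the committed emissions alone overshoot the 1.5°C budget by more than the size of the budget itself, and consume around 72% of the 2°C budget before deforestation and land-use change are even counted.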

Which brings me onto the modelling of economic impacts. The first thing to say is that modelling in terms of impact on GDP, while guaranteed to get the attention of the financial community, is perhaps not the best way of communicating the devastation of runaway climate change.

In the summary of Mark Lynas’ excellent book Six Degrees: Our Future on A Hotter Planet, which summarised the scientific consensus already arrived at by 2007, the three-degree increase for which damages are being estimated is expected to lead to Africa […] split between the north which will see a recovery of rainfall and the south which becomes drier […] beyond human adaptation. Indian monsoon rains will fail. The Himalayan glaciers providing the waters of the Indus, Ganges and Brahmaputra, the Mekong, Yangtze and Yellow rivers [will decrease] by up to 90%. The [IPCC] in its 2007 report concluded that all major planetary granaries will require adaptive measures at 2.5° temperature rise regardless of precipitation rates [and] food prices [will] soar. Population transfers will be bigger than anything ever seen in the history of mankind. [The feedback effects from the] Amazon rain forests dry[ing] out and wild fires develop[ing] [will lead] to those fires [releasing] more CO2, global warming [intensifying] as a result, vegetation and soil begin[ning] to release CO2 rather than absorb[ing] it, all of which could push the 3° scenario to a 4°–5.5° [one]. The recent update to this, Our Final Warning, describes how “entering the three-degree world means we are now living in a hotter climate than any experienced on Earth throughout the entire history of our species”. These impacts, which are likely to pose existential risks for many, appear totally inconsistent with the economic loss modelling shown above.

In his 2020 paper, The appallingly bad neoclassical economics of climate change (apologies, Journal access required), Steve Keen says in the abstract:

Forecasts by economists of the economic damage from climate change have been notably sanguine, compared to warnings by scientists about damage to the biosphere. This is because economists made their own predictions of damages, using three spurious methods: assuming that about 90% of GDP will be unaffected by climate change, because it happens indoors; using the relationship between temperature and GDP today as a proxy for the impact of global warming over time; and using surveys that diluted extreme warnings from scientists with optimistic expectations from economists. Nordhaus has misrepresented the scientific literature to justify using a smooth function to describe the damage to GDP from climate change. Correcting for these errors makes it feasible that the economic damages from climate change are at least an order of magnitude worse than forecast by economists, and may be so great as to threaten the survival of human civilization.

There follows a demolition of the methodologies employed by Nordhaus and others in this field. To be fair to the Swiss Re report, some of the criticisms in Keen’s paper appear to have been borne in mind when constructing their model, eg:

A shortcoming of our model build so far is that some economic impacts are linearly estimated: non-linearities are not adequately captured. We use multiplicative factors of 5 and 10 to simulate the increasing severity of outcomes from nonlinearities… Importantly, the framework does not consider tipping points, events such as the partial disintegration of ice sheets, biosphere collapses or permafrost loss, that pose a threat of abrupt and irreversible climate change. This is because it is thought that tipping points will materialise well after our model horizon of mid-century only.
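As a crude illustration of what such multiplicative severity factors do – the 5 and 10 are from the quote, but the baseline damage figure here is invented purely for illustration:

```python
# Hypothetical linearly-estimated GDP loss, scaled by the severity
# factors Swiss Re describes, as a crude proxy for non-linear outcomes.
linear_loss_pct = 2.0              # invented baseline loss, for illustration only
for factor in (1, 5, 10):          # 5x and 10x are the report's stated factors
    print(f"severity factor {factor:>2}: {linear_loss_pct * factor:.1f}% GDP loss")
```

The point being that the non-linearity is not modelled at all – a linear answer is simply multiplied up, which is why the tipping-point caveat that follows matters so much.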

And as the Swiss Re report also acknowledges:

It is likely that the estimated impacts of GDP damages from climate change will rise as existing modelling develops to incorporate economic linkages in trade, migration and other channels, and to generalise the results to multiple countries.

And they are getting criticism from the usual suspects of climate denial, eg Bjorn Lomborg on Twitter here, arguing that even their attempts to date to quantify the uncertainties caused by non-linearity are a step too far.

And yet there remains a problem with these analyses in that they fail to capture existential risk. One of the things Steve Keen points out in his paper is the different attitude Nordhaus found towards estimating damages from climate change in natural scientists as opposed to economists. Natural scientists typically estimated the damage at 20-30 times higher than economists and some refused to cooperate with the exercise at all:

I must tell you that I marvel that economists are willing to make quantitative estimates of economic consequences of climate change where the only measures available are estimates of global surface average increases in temperature. As [one] who has spent his career worrying about the vagaries of the dynamics of the atmosphere, I marvel that they can translate a single global number, an extremely poor surrogate for a description of the climatic conditions, into quantitative estimates of impacts of global economic conditions. 

But how do you calibrate what is clearly a complicated model that Swiss Re and Moody’s have constructed for this analysis? Obviously we all have a very recent GDP fall in our minds at the moment – here is a summary from the UK Commons Library of Economic Indicators as at 30 April 2021 (themselves sourced from OECDstat and Eurostat):

This shows an almost identical GDP fall of 10.5% year on year in Q2 2020 for the OECD as predicted in the event of a 3.2°C warming, although it has bounced back pretty quickly since. For a longer term view of the global data, Our World In Data have an Annual growth in GDP per capita graph which runs from 1961 to 2017 (see below).

One very large GDP fall which stands out in the data here is the 26.5% fall in China in 1961. This was towards the end of China’s Great Famine, in which tens of millions of people died of starvation over a three-year period. This certainly qualifies as an existential event, and Swiss Re’s modelling suggests something of similar proportions in Asia and Africa at 3.2°C warming.

The biggest danger in all of this is that rich countries will look at a 10.6% reduction in GDP (at 3.2°C warming) and think this is something their populations can live with and adapt to. After all, Simon Wren-Lewis calculates that the austerity policies between 2010 and 2018 in the UK reduced GDP by nearly half of this amount every year for at least the second half of this period, compared to where it would have been without these policies, with an estimated cumulative loss of 15.9% of GDP. An 18.1% overall world average loss, however, effectively means more than a 25% loss for the rest of the world outside the OECD, as the OECD accounts for around half of the world’s total GDP. Even before we allow for the acknowledged likelihood that these are underestimates, that is in the Chinese Famine category of disaster: neither liveable with nor adaptable to.
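The rest-of-world figure can be derived directly from the two quoted numbers; a minimal sketch, assuming the OECD accounts for exactly half of world GDP as stated above:

```python
# Solve world_loss = s*oecd_loss + (1-s)*rest_loss for rest_loss.
oecd_share = 0.5     # assumed OECD share of world GDP
oecd_loss = 10.6     # % GDP loss for the OECD at 3.2°C (Swiss Re)
world_loss = 18.1    # % GDP loss, world average, at 3.2°C (Swiss Re)

rest_loss = (world_loss - oecd_share * oecd_loss) / (1 - oecd_share)
print(f"Implied rest-of-world GDP loss: {rest_loss:.1f}%")  # 25.6%
```

So the comfortable-looking OECD average conceals a loss for everyone else of Chinese Famine proportions.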

We are already seeing vaccine nationalism carve up the world between rich and poor countries, with up until last month only 0.3% of the vaccines administered around the world having gone to people in low-income countries. This is likely to reduce the ability of poorer countries to be represented properly at this year’s COP26 when it frames a global response to the climate change which will affect them so disproportionately. And the losses if we do not act will be measured in far more frequent floods and sea level rise, extreme storms and heatwaves, crop failures, water and food shortages and mass migration on a scale we have never seen before, not GDP.

Could climate change slash global GDP by 18%? It’s much worse than that.

 

There is a particular variety of We Know Zero graphs that look like this one – showing an experience of a steady increase in something (usually bad, but not always) up until now, followed by a projection of that thing falling in the future. My wife Marsha suggested I call them Hope-over-Experience graphs, which seems to suit them very well.

Such diagrams are often very comforting for those who want to maintain the status quo. Let’s look at three such curves in particular (the excellent Doughnut Economics by Kate Raworth has alerted me to the first two of these).

The Kuznets Curve

There is a considerable body of evidence, most notably from Kate Pickett and Richard Wilkinson, that inequality adversely affects most health and social problems, to the detriment of all socio-economic groups – but what is to be done about it? Enter our first Hope-over-Experience graph. In this case the x-axis is actually income per capita, but to the extent that this is something expected to increase with time I don’t think this matters too much. The y-axis is inequality. The curve was originally proposed by Simon Kuznets (the inventor of GDP) in his 1955 paper Economic Growth and Income Inequality (my apologies, but you will need journal access to read this), based on data from England, Germany and the United States from 1875 onwards, and the belief that economic growth will automatically deal with inequality has been a powerful influence on economic policy at the World Bank and elsewhere since.

However, more recent data has shown that the patterns suggested by this limited original data set are no longer correct, if indeed they ever were. Thomas Piketty and Emmanuel Saez, in their 2001 paper Income Inequality in the United States 1913-1998, state:

In particular, the evidence presented in this paper, together with the evidence on France by Piketty (2001a, 2001b) and the U.K. by Atkinson (2001), strongly suggest that there was no such thing as a “spontaneous”, Kuznets-like decline of inequality in developed countries during the first half of the 20th century. The inequality decline was to a large extent accidental (depression, inflation, wars) and amplified by political factors (progressive taxation). This does not mean that the current rise of inequality will not be followed by a mechanical downturn during the first few decades of the 21st century: this is simply saying that such a mechanical downturn apparently never occurred in the past.

Their data suggests a curve which looks like this instead:

The Environmental Kuznets Curve

This was first proposed by Gene Grossman and Alan Krueger in 1994 in their working paper Economic Growth and the Environment, which suggested that there was an eventual inverse relationship between pollution and income per capita, with a turning point mooted at around $8,000. Most of their graphs are not quite as neatly inverted-U-shaped as the Kuznets Curve, but this nonetheless has come to be known as the Environmental Kuznets Curve.

However, in 2016, the international industrial ecology research community and United Nations Environment agreed on a comprehensive data set for global material extraction and trade covering 40 years of global economic activity and natural resource use, which led to several papers including the UNEP Global Material Flows and Resource Productivity: A Report of the International Resource Panel (again apologies but journal access needed). Their graph of material extraction instead looked like this:

The Human Development Index (HDI) is the geometric average of 3 indices: Gross National Income, Health and Education. An optimum score of 1 is achieved where life expectancy is 85 or more years, adult literacy is 100%, school enrolment is 100% and Gross National Income is US$40,000 or more per person per year in purchasing power parity terms. So again, this is not very supportive of a reduction in material footprint with increased wealth.
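The geometric-mean construction described above can be sketched as follows. This is a simplified, hypothetical version: the goalposts (20–85 years of life expectancy, $100–$40,000 GNI) and the logarithmic treatment of income mirror the UNDP approach in spirit, but the exact published methodology differs in detail:

```python
import math

def dimension_index(value, minimum, maximum):
    """Scale a raw value to a 0-1 index between set goalposts."""
    return (value - minimum) / (maximum - minimum)

def hdi(life_expectancy, education_index, gni_per_capita):
    """Simplified HDI: geometric mean of health, education and income indices."""
    health = dimension_index(life_expectancy, 20, 85)
    # Income enters logarithmically, reflecting diminishing returns to wealth
    income = dimension_index(math.log(gni_per_capita),
                             math.log(100), math.log(40_000))
    return (health * education_index * income) ** (1 / 3)

print(round(hdi(85, 1.0, 40_000), 3))  # → 1.0: a country at all the caps
```

The geometric mean matters here: because the three indices are multiplied, a country cannot fully compensate for a very low score on one dimension with high scores on the others.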

Which brings us to the third graph, often cited as an argument that one of the most obvious ways to reduce inequality (rather than just focusing on average income per capita), ie making taxation more progressive, is pointless.

The Laffer Curve

The story of the Laffer Curve, dating from the 1970s, is recounted by Arthur Laffer himself here. It plots tax rates against tax revenues to indicate that there is a tax rate beyond which tax revenues actually reduce. As he says:

The Laffer Curve itself does not say whether a tax cut will raise or lower revenues. Revenue responses to a tax rate change will depend upon the tax system in place, the time period being considered, the ease of movement into underground activities, the level of tax rates already in place, the prevalence of legal and accounting-driven tax loopholes, and the proclivities of the productive factors. If the existing tax rate is too high…then a tax-rate cut would result in increased tax revenues. The economic effect of the tax cut would outweigh the arithmetic effect of the tax cut.

However, returning to Piketty, this time in the 2011 paper Optimal Taxation of Top Labor Incomes: A Tale of Three Elasticities by Piketty, Saez and Stefanie Stantcheva, we find that the evidence underpinning this curve is again highly questionable. As they point out in the abstract (bold type added by me):

This paper presents a model of optimal labor income taxation where top incomes respond to marginal tax rates through three channels: (1) standard labor supply, (2) tax avoidance, (3) compensation bargaining…The macro-evidence from 18 OECD countries shows that there is a strong negative correlation between top tax rates and top 1% income shares since 1960, implying that the overall elasticity is large. However, top income share increases have not translated into higher economic growth. US CEO pay evidence shows that pay for luck is quantitatively more important when top tax rates are low. International CEO pay evidence shows that CEO pay is strongly negatively correlated with top tax rates even controlling for firm characteristics and performance, and this correlation is stronger in firms with poor governance. All those results suggest that bargaining effects play a role in the link between top incomes and top tax rates implying that optimal top tax rates could be higher than commonly assumed.

There are a number of charts which could be used from this paper, but I have chosen the plot of economic growth against changes in top marginal tax rate to illustrate most clearly the problems with the Laffer Curve idea:

This graph should show an inverse relationship if the Laffer Curve were true.

Why do I feel the need to debunk these simple so-called economic laws, which are nothing of the sort? Because you will always prioritise economic growth over everything else if you believe that:

  • Growth will fix inequality;
  • Growth will fix pollution;
  • Trying to fix inequality through the tax system is counter-productive.

And these beliefs will then also have policy implications when faced with a different sort of curve.

This was an explainer from Grant Sanderson at 3Blue1Brown about COVID-19 from March 2020 setting out quite simply how it was likely to spread, and how different case numbers in different countries (eg between Italy and the UK) were as likely to be due to being at different time points since the start of the pandemic as reflecting the relative success of their containment policies. We now know the UK Government locked down too late, at least partly because they prioritised economic growth over containment policies in the first few weeks:
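The core point of the explainer – that two countries can look very different today while sitting on the same curve – is simple to sketch. All of the numbers here are illustrative, not taken from the video:

```python
import math

# On a pure exponential, a 10x difference in case counts is just a time lag.
growth_rate = 0.2        # assumed ~20% daily growth (illustrative)
cases_a = 10_000         # hypothetical "ahead" country
cases_b = 1_000          # hypothetical "behind" country

# cases_a = cases_b * exp(growth_rate * lag)  =>  solve for the lag:
lag_days = math.log(cases_a / cases_b) / growth_rate
print(f"Country B is about {lag_days:.0f} days behind Country A")
```

A tenfold difference in reported cases, on these assumptions, is under a fortnight of head start – which is why comparing raw case counts across countries said so little about the success of their containment policies.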

Those attitudes changed and we have had an incredibly successful vaccine rollout in the UK, but this has been at the expense of any idea of international cooperation in vaccine supply. Wealthy countries such as the UK have bought enough vaccinations to cover our populations almost three times over, while Covax, the global vaccine procurement scheme, only aims to vaccinate 20% of the populations of recipient countries this year.

This is very short-sighted if we think there might be an international issue even more threatening to life than COVID-19 which can only be combatted by unprecedented levels of international cooperation. And of course this is exactly what we have in the form of the climate emergency and our final graph (from the National Oceanic and Atmospheric Administration (NOAA) in the US), showing the relentless rise in the level of carbon dioxide in the atmosphere as global emissions continue to increase:

 

Living in Hope-over-Experience may be very comfortable for some people for a limited time, but if it stops us engaging with the more implacable curves of the world we actually live in then none of us will be safe.

Diagram 1

We are currently behaving like this is the world we live in – because if you are a finance person it is. The Dasgupta Review on the Economics of Biodiversity does nothing substantive to challenge this, despite a foreword from David Attenborough admitting “We are totally dependent upon the natural world”, other than putting a bigger number on the Sustainability portion (Natural Capital). John Kay mentioned in his talk, as part of the Dr Patrick Poon Presidential Speaker series on Finance in the Public Interest for the Institute and Faculty of Actuaries, the habit of actuaries in particular of often “attaching meaningless numbers to data”. There would seem to be great potential for doing precisely this in putting a number on Natural Capital.

But it is worse than that. As the September 2020 InfluenceMap report on sustainability finance policy engagement makes clear, most financial institutions (bottom-right quadrant, in blue, below) have shown caution and, despite having made some high-level supportive comments, have tended not to engage in a detailed or intensive manner. A small number of financial institutions (top-right quadrant, blue) have been actively engaged in promoting sustainable finance policy. A few financial institutions (centre-left of the diagram, blue) appear to be opposed to sustainable finance policy.

This chart plots the results of InfluenceMap’s analysis for the financial institutions and industry associations included in the analysis. Engagement Intensity refers to how actively the entity is engaging, while Organization Score measures the degree of support/opposition to policy.

Diagram 2

In the meantime, the IFRS Foundation is proposing to set up a Sustainability Standards Board with its own reporting standards. This is what Richard Murphy (who got me thinking about this in Venn diagram terms originally) is rightly complaining about as it would lead to this:

His sustainability cost accounting idea offers a plausible alternative approach in my view. As the introduction says: 

…accounting has to change because we need a clear, audited, enforced and unambiguous indicator of the process of change that business must go through to support continued human life on this planet. Sustainable cost accounting can do that by indicating who can, and cannot, use capital to best effect in this changed environment. That is precisely why it is needed, however uncomfortable the consequences might be.

What is actually needed therefore is clearly an approach rooted in this:

Diagram 3

This is the long term position most working in sustainability would, I believe, like to see. However there are differences of opinion in how to get there.

Kate Raworth argues that you may need to talk within Diagram 1 to start with in order to engage the finance professionals, which of course includes the central bankers and treasury officials who might limit the speed at which we could move to Diagram 3. Others disagree, saying that once you start talking to finance professionals in their own language, you are condemned to a solution in Diagram 1.

What seems clear to me is that, if our arguments are between Diagram 1 and Diagram 3, perhaps we can dispense with Diagram 2.

Source: Wikimedia Commons: Shattered right-hand side mirror on a 5-series BMW in Durham, North Carolina by Ildar Sagdejev. Cropped by Nick Foster

It starts in 2025 with a description of a horrific heatwave in India which will stay with me for a very long time. As well it should, as, in the book, it kills 20 million people. In response, India sends thousands of aircraft up to 60,000 feet to spray aerosol particulates of sulphur dioxide into the stratosphere, in defiance of the international conventions banning such activities, to deflect some of the solar radiation with the aim of reducing the probability of future heatwaves for a period. By how much, or for how long, or with what other consequences, is unknown.

As we build up to COP26 in Glasgow in November this year, in the book we start with the results of COP29 in Bogota, where the organisation which would come to be known as The Ministry for the Future (and the title of the book by Kim Stanley Robinson) was set up “to advocate for the world’s future generations of citizens, whose rights, as defined in the Universal Declaration of Human Rights, are as valid as our own. This new Subsidiary Body is furthermore charged with defending all living creatures present and future who cannot speak for themselves, by promoting their legal standing and physical protection.”

The Indian crisis happens a few months later. The new head of this body, Mary Murphy, is briefly held captive in her own flat in Zurich by Frank, one of the survivors of the heatwave (the book also feels like a love letter to Zurich), and challenged to do more:

It’s not enough. Your efforts aren’t slowing the damage fast enough. They aren’t creating fixes fast enough. You can see that, because everyone can see it. Things don’t change, we’re still on track for a mass extinction event, we’re in the extinctions already. That’s what I mean by not enough. So why don’t you do something more?

This has a profound impact on Mary, who keeps in touch with Frank and his troubled, suffering life throughout the book. It also leans her towards effectively endorsing the involvement of her No 2 in “black” operations to ensure certain people are “scared away from burning carbon”.

Indeed the book is suffused with eco-terrorism. Technological progress has partly displaced the state monopoly of violence, with drone technology in particular meaning that no aircraft or ship or surface navy is safe from a well-enough organised group by the end of the book. People stop flying when aircraft start being shot down regularly, and those that still do fly use carbon-negative airships, where solar panels generate more power than the ships use. Davos attendees get taken hostage and given a compulsory seminar at one point. Tax havens become obsolete when all money becomes digital and tracked.

Mary’s interactions with central bankers are probably the closest this book ever comes to comedy. In the first, she tries to argue for a “carbon coin”, a digital currency which would be paid out to organisations and people who could prove they had removed carbon from the environment. This would be the incentive to work alongside the carbon taxes. The contemptuous response from the Federal Reserve and others at first is “not our purview”; by the end, however, they are on board with this and many of the other ideas developed along the way.

There are so many ideas in this book, far too many to cover them all here: some of them familiar to me from economics (carbon quantitative easing, Jevons’ Paradox, Modern Monetary Theory, the Gini Coefficient – each of these gets a short chapter, among many other ideas interspersed with riddles) and others not so. The Indian techno-fix is the first of many: some successful, like sucking out the meltwater under glaciers to slow them sliding into the ocean, and others not, like the billionaire wanting to refreeze the oceans. Russia dyes parts of the Arctic yellow to reflect more sunlight back. Huge areas of land are rewilded.

What strikes me most is that the arguments we tend to have here and now about which course to take (Freud’s phrase is quoted in the book – “the narcissism of small differences”) seem largely moot in this imagined near-future: all of them are tried there – it’s not techno-fixes or de-carbonisation of transport and heating, it’s both. It’s not carbon QE or re-wilding, it’s both. If something doesn’t work, it’s abandoned. By far the most important determinant of which of the IPCC future scenarios we end up on seems to be how quickly we start. Economists come in for particular ridicule there – whatever course of action is planned, they can find one group which thinks it will have one effect, one which thinks it will have the opposite effect and one which thinks it will make no difference at all. The difference is that the economists are no longer guiding policy there, but facilitating and post hoc rationalising it.

There is a wartime feel to the book throughout, with people doing what they feel needs to be done in desperate circumstances. The choices are all different levels of bad, but bad is almost incalculably better than worst. And the overall impression is of a world changing rapidly, with one of its herd animals belatedly getting into better balance with the others. Even at 560-odd pages the impressions are inevitably just that – one chapter is just a list of different organisations working on aspects of the climate emergency in different countries, described as about 1% of the total number active. It is like the shards of a smashed wing mirror picking out details from the vanishing world behind. I have never wanted to apply the word polymesmeric (which I first saw on the cover of Catch-22 by Joseph Heller) to a book as much as I have to this one.

The hoped-for outcome of all of this? In one conversation this is described as a “success made of failures” or a “cobbling-together from less-than-satisfactory parts”, which I think sums it up nicely.

And I definitely want to visit Zurich one day. Probably by airship.


There are many papers about model risk and the dangers of blindly relying on algorithms or metrics without allowing for human judgement at any point in the subsequent analysis – in effect “baking in” whatever analysis was done when the computer model or algorithm was constructed as the final word. However, these papers can often descend into the same level of technical impenetrability as the programmes they are attempting to critique.

I watched the film Sully: Miracle on the Hudson for the first time this week, on the anniversary of the landing on the Hudson. In the final scenes there is a hearing (spoiler alert!). The evidence presented up until that point, based on computer simulations with and without pilots involved, was leading to the unanimous conclusion that Sully and Skiles could have turned back to La Guardia or Teterboro airports rather than landing on the Hudson River in January. However, Sully had appealed to have the video recordings of the pilot simulations shown to the hearing. These revealed the pilots responding to the catastrophic bird strikes which had taken out both engines (something later confirmed when the actual engines were recovered, but which the simulations themselves did not accept because of the instrument readings from one of the engines) by calmly and immediately setting course for La Guardia or Teterboro, with no decision, response or recovery time at all. When a 35-second allowance for this was inserted into the simulations, the results were fatal crashes in both cases.

What struck me was how invisible this deficiency in the programming of the simulation would have been without a cockpit recording of the simulations. In many of the programmes we use to automate judgement-heavy processes, such as recruitment, many of the capital allocation decisions in financial institutions or even A-level grades, we do not have anything equivalent to a cockpit recording available to us. Perhaps we wait until either events prove us wrong (bad) or those on the receiving end of our automated decisions start to complain in sufficient numbers for us to reconsider (worse). What if quite a large proportion of the cost savings from automating these processes is in fact illusory as a result of our not putting enough time and attention into the original programming and/or not setting aside enough budget for maintaining it and challenging its decisions with parallel processes which do allow for human judgement? How much bigger is this problem going to become in the era of machine learning, where the programmes we are running are themselves several steps of abstraction away from those originally written by humans?
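As a minimal illustrative sketch (none of this is from the film or any real system – all names are hypothetical), here is what the equivalent of a “cockpit recording” for an automated decision process might look like in Python: every decision is logged with its inputs and rationale, so the trail can be replayed and challenged by a human later.

```python
import json
import time
from dataclasses import dataclass, field

@dataclass
class DecisionRecorder:
    """A 'cockpit recording' for automated decisions: an append-only
    trail of inputs, outputs and rationale for later human review."""
    log: list = field(default_factory=list)

    def record(self, inputs, decision, note=""):
        self.log.append({
            "timestamp": time.time(),
            "inputs": inputs,
            "decision": decision,
            "note": note,
        })

    def replay(self):
        # Return the full decision trail in a human-readable form.
        return json.dumps(self.log, indent=2)

def screen_applicant(recorder, applicant):
    # Hypothetical automated sieve: pass if the score meets a cutoff.
    decision = "pass" if applicant["score"] >= 60 else "reject"
    recorder.record(applicant, decision, note="score cutoff 60")
    return decision

recorder = DecisionRecorder()
screen_applicant(recorder, {"name": "A", "score": 72})
screen_applicant(recorder, {"name": "B", "score": 55})
print(recorder.replay())
```

The point of the sketch is not the screening rule itself but the recorder: without something like it, a flawed cutoff is as invisible as the missing reaction time in the Sully simulations.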

Our ability to programme machines to carry out billions of calculations in seconds would have been regarded as miraculous only a few decades ago and is still pretty astonishing to us now. We need to start thinking a lot more about how we can live alongside these ever more capable machines amicably over the long term. And it can’t be only programmers who get to see what the machines are doing – whatever the technical problems of allowing the equivalent of a cockpit recording to be made which can be understood by any of us, they need to be solved with as much urgency as the process automation itself. All of our decision-making processes need to be understandable and challengeable by the society in whose name they are carried out. It’s time to get serious now about our miracles.

I have written about school qualifications once before here, in 2014, when I criticised the move to add an A* grade at GCSE and the consequent narrowing of the grade boundaries to mimic the A-level ones. We have of course since moved to a numerical grade system for GCSEs which is even narrower. However, if the exam grade system was a bad way to assess students, the algorithm which replaced it in the summer (explained here and critiqued here) was clearly worse still.

So, against a background of steadily less reliable grade information at both GCSE and A-level, it was interesting to look at the Institute and Faculty of Actuaries’ (IFoA’s) employer directory and note that, of the 25 separate adverts for graduate roles, 11 of them have an A-level or UCAS points requirement in addition to the university degree requirement. My question is why?

I understand that employers, particularly this year, are likely to have very large numbers of applicants and need some way of reducing the number they review in detail, but there are many much better sieves than A-levels these days. Psychometric tests can assess how rusty students’ numeracy is. Application forms can be digital and given a computerised first pass on any number of criteria and, if the questions are constructed thoughtfully, will give companies a smaller set of applicants much more closely aligned to their goals than a mathematics A-level grade would.

Even if you accept the grades as representative, relying on them to exclude a large number of candidates at the outset clearly raises issues around social mobility and widening participation – issues highlighted when an algorithm attempted to reproduce results based on subject studied and school attended. The news today that this will not be tried again this summer is encouraging, but even if mark allocations are fairer, many problems with A-levels remain.

I have felt that this has been a growing issue for some time – it has always seemed ridiculous to me that a student on my programme (the BSc Mathematics and Actuarial Science at the University of Leicester – a qualification accredited by the IFoA), doing well and on track for all 6 of the core principles exemptions available as a result, still feels the need to retake an A-level taken before they had discovered their motivation for actuarial work, in order to have a chance with many of the top employers. Are those employers so lacking in confidence in the integrity of their own profession’s qualification system that they need the security backstop of an A-level pass?

It is likely to be a tough environment for young people attempting to start their careers this year, whatever their skill set. I hope employers will review their current approach to recruitment and check they are not inadvertently pulling up the ladder before seeing all of the talent available.

NASA, ESA, and the Hubble Heritage Team (STScI/AURA), Public domain, via Wikimedia Commons

Fiscal space is defined as the difference between a nation’s sovereign debt-to-GDP ratio and the limit beyond which the nation will default unless policymakers take fiscal steps that are outside of anything they have done historically. That limit is sometimes referred to as the fiscal cliff, just to ram home the imagery of fixed physical limits beyond which disaster beckons.
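On that definition, fiscal space is simple arithmetic: the headroom between the current debt-to-GDP ratio and the estimated default limit. A minimal sketch in Python, with purely illustrative numbers (these are not Moody’s estimates):

```python
def fiscal_space(debt_to_gdp: float, default_limit: float) -> float:
    """Fiscal space as defined above: the gap between a nation's current
    sovereign debt-to-GDP ratio and the estimated limit beyond which it
    would default, both expressed as percentages of GDP."""
    return default_limit - debt_to_gdp

# Illustrative only: a debt ratio of 85% of GDP against an assumed
# default limit of 260% gives fiscal space of 175% of GDP.
print(fiscal_space(85.0, 260.0))
```

All of the modelling difficulty, of course, hides in estimating the default limit – which is exactly where assumptions like “dynamic” interest rates come in.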

How much fiscal space does the UK have? Moody’s have an answer, which depends most heavily on when you ask the question. In September 2019 it was as follows:

This shows the UK with a fiscal space of around 175% of GDP (the “dynamic” means they assume interest rates increase as borrowing does, due to the “crowding out” arguments – ie government borrowing pushing up the price of borrowing for everyone – so beloved of most economists), with this then projected to fall over the following 5 years as rates “normalized”. While the cost of borrowing is allowed to be dynamic in these calculations, the borrowing itself is not – it is assumed simply to add to debt without increasing the revenue components of the primary balance.

Well of course then we had 2020, at which point (June 2020) Moody’s appear to have stopped talking about fiscal space and instead are now focusing on something called “debt affordability”. What happened to dynamism and crowding out? Not explained:

However despite this triumph of debt affordability, they then produce another graph to indicate that governments still need to be bearing down on debt to GDP ratios:

As they say in the document “rating implications will depend on governments’ ability to reverse debt trajectories ahead of potential future shocks”. Remember this was in June 2020. Let’s also remind ourselves of another graph:

Requiring governments to reverse debt trajectories in this environment is insane and likely to result in more deaths if not ignored. However as recently as last month in their issuer comment for the UK they said:

However, compared to the government’s March budget (that was quickly overtaken by events), there are some initial signs that fiscal policy outside of investment is likely to be less expansive than previously announced. What remains unclear is whether this ambition will be able to withstand the political pressures that seem to be inevitable given the government’s previous commitments. Even before the Spending Review, longer-term spending commitments for health, education, and defence had already been announced. Together, these three areas account for around 60% of total expenditure.

I have been hard on Moody’s in this piece, but they are most certainly not alone. This attempt to divorce sovereign debt levels from what is actually going on in countries needs to stop, as does the constant discounting of the value of any government spending at all. Political pressures to spend more on health and education are not always things that governments need to “withstand” in order to look good in a Moody’s graph. There are far more important things at stake.


Blaise Pascal, mathematician and philosopher, once said:

All of humanity’s problems stem from man’s inability to sit quietly in a room alone.

This seems to have a particular relevance at the moment, when many of us are being asked to do precisely that. I also agree that this is definitely a problem we have. However I prefer to think of it as just one consequence of our inability to think about change in any rational way. We fear change, which is why we yearn so much to go back to “normal” at the moment, even if normal life was pretty unsatisfactory for many of us before the pandemic struck. We fight against change if we think what we have is threatened from outside the room we might otherwise sit quietly in, whether that is the loss of our income, our influence in the world or our “sovereignty”.

The only way in which we can contemplate change is in the context of some utopian ideal of improved productivity making one aspect of our lives much better while not requiring us to change any other part of them. Hence so much resistance to any idea of redistributing what we already have in favour of “Pareto improvements” to the economy, ie those which benefit some people without making anyone else worse off, and the obsession amongst economists with the “productivity puzzle” in the UK in particular:

So we look for ways to achieve this miraculous productivity improvement while leaving everything else essentially unchanged and the magic word which promises this more than anything else is innovation. Innovation will enable us to do more with less (or, more usually, make us do a lot more much cheaper, therefore encouraging us to use even more in the process). Innovation will have spin offs in lots of other areas we have not even imagined yet, but they will all be good ones! Innovation will solve the productivity puzzle.

In The Innovation Delusion, Lee Vinsel and Andrew Russell challenge this. As we have become more and more desperate for all change to look like innovation, we have made actual innovation harder to achieve, while saddling ourselves with higher and higher maintenance costs of new “innovative” infrastructure which is increasingly unsustainable to finance, rather than maintaining what we already have better.

I therefore prefer the quote that they use, from Kurt Vonnegut:

Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance.

Innovation-speak, as they call it, is not innovation at all, but presenting ideas as innovative when they are not. As they say:

It plays on our worry that we will be left behind: our nation will not be able to compete in the global economy; our businesses will be disrupted; our children will fail to find good jobs because they don’t know how to code…Innovation-speak is a dialect of perpetual worry.

No wonder we are unable to sit quietly in a room alone.

And in the coming years when we will need to make substantial changes that work well enough for all of us to be able to continue living on this planet together, this approach will not work. We need for our thinking not to be magical, but grounded in realism. We need to make new things that we can afford to maintain sustainably. Innovation-speak will not get us there.

I previously wrote a blog in 2013 based around the Office for National Statistics (ONS) statistical bulletin entitled Estimates of the Very Old (including Centenarians), 2002-2011, England and Wales, which summarises how the proportions living to 90 years old and above have changed since 1981. It showed us a population living within a population: Nonagenarian (ie the over 90s) England and Wales (NEW) within the full population of England and Wales. I thought it might be time for an update, based on the latest ONS bulletin from September 2020, which now covers the period 2002-2019.

There have been quite a few changes. There are still more women than men in NEW, although the overall ratio has reduced from 2.7:1 in 2011 to around 2:1 in 2019 (see below). The NEW population, which was somewhere between the sizes of Malta’s and Cape Verde’s full population in 2011, has now just passed that of Western Sahara and has its sights firmly set on passing Luxembourg’s population next.

The population of NEW is still growing far more quickly than that of England and Wales, or indeed the UK, with a 25% increase between 2011 and 2019. However, to understand the NEW population you need to look beyond improvements in public health and medical advances to the times at which its members were born. For instance, the number of people alive at almost every age from 90 years and above was higher in 2019 than in 2018, but with by far the largest increase at age 99 years (62.2%). This was caused by a big increase in births in the second half of 1919, compared to the previous year, as a result of the end of World War 1!
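The cohort effect is simple arithmetic: anyone aged 99 in 2019 was born in 1919-20, so the post-WW1 birth surge resurfaces a century later as a spike at age 99. A small sketch of the calculation (the cohort counts here are made up for illustration; only the 62.2% figure comes from the bulletin):

```python
def percent_increase(prev: float, curr: float) -> float:
    """Year-on-year percentage increase, as quoted in the ONS bulletin."""
    return (curr - prev) / prev * 100.0

def birth_year(age: int, year: int = 2019) -> int:
    """Approximate birth year of someone of a given age in a given year."""
    return year - age

# Someone aged 99 in 2019 was born in 1920 (or late 1919) -
# right in the post-war birth surge.
print(birth_year(99))

# Illustrative counts only: 1,000 people at age 99 in 2018 rising to
# 1,622 in 2019 would give the 62.2% increase the bulletin reports.
print(round(percent_increase(1000, 1622), 1))
```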

The bulletin ends with a sombre reminder that, although we would normally expect the large increase in those aged 99 years in 2019 to translate into a record number of centenarians in 2020, other factors, particularly the COVID-19 pandemic, are likely to have had a significant impact. COVID-19 deaths are highest for the 85 years and over age group. Public Health England have calculated excess deaths in the over 85 population at 11,656 between 21 March and 18 December 2020 (with 13,844 categorised as COVID deaths, suggesting a drop in excess deaths from other causes). This compares with the 2019 NEW population of 605,181, an increase of 21,157 on 2018.