Die, Die, Die!

I received a set of nontransitive dice in the post this week. Transitive is an interesting word. As we all know, in grammar it refers to verbs which do things to something. What I didn’t learn at school was that if they do things to one thing they are called monotransitive, and ditransitive if they have both a direct and an indirect object. A verb like to trade is categorised as tritransitive. If a verb does not play with others it is called intransitive: an example appropriate to this story is to die. If a verb swings both ways it is called ambitransitive.

In the mathematical world transitive is a description of a relation on a set. For example, if A = B and B = C, then A = C. So = is transitive. Similarly, if A > B and B > C, then A > C.

Or does it? Let’s return to the dice (singular die: cemented in my memory on the occasion a teacher responded to a boy coming into his class and asking to borrow a dice by shouting “die, die, die!” at the startled youngster). Mathematicians do not use the word intransitive, preferring perhaps to avoid the ambiguity of words like flammable and inflammable, but instead use nontransitive. Nontransitive dice have the property that if die A tends to beat die B on average, and die B tends to beat die C on average, then rather counter-intuitively die C tends to beat die A on average. How does this work?

There are many different arrangements of the numbers on the faces of the dice which would achieve this effect. My red die has 4 on all its faces except one, which has a 6. My blue die has half its faces with 2s and the other half with 6s. My green die has 5 on all its faces except one, which is unnumbered (or, in fact, undotted).

If we take the average number we expect to get when throwing each die (the expected value, also known as the mean, a concept first introduced by Blaise Pascal of triangle fame, and the first thing that tends to get calculated in any statistical analysis), then red gives us 4⅓, blue gives us 4 and green 4⅙. So we would expect from that to see red beat blue, green beat blue and red beat green.
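These expected values are quick to check; a minimal sketch in Python, with the faces taken from the description above (the blank face counted as 0):

```python
from fractions import Fraction

# Faces as described above: red is five 4s and a 6, blue is half 2s and
# half 6s, green is five 5s and one blank face (counted as 0).
red = [4, 4, 4, 4, 4, 6]
blue = [2, 2, 2, 6, 6, 6]
green = [5, 5, 5, 5, 5, 0]

def mean(die):
    """Expected value of a single throw, kept as an exact fraction."""
    return Fraction(sum(die), len(die))

print(mean(red), mean(blue), mean(green))  # 13/3 4 25/6, ie 4⅓, 4 and 4⅙
```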

When we pitch red against blue, if we throw a 2 with the blue die (probability ½), then we will always lose to red, since all of its faces are greater than 2. If we throw a 6 with blue, we have a 5/6 chance of beating red (since 5 of its 6 faces are 4s) and a 1/6 chance of drawing. So we have for blue a probability of ½ of losing, a probability of ½ x 5/6 = 5/12 of winning and a probability of ½ x 1/6 = 1/12 of drawing. So, in the long run, red beats blue on average, as we would expect it to.

When we pitch blue against green, blue will always win if we throw a 6 with it, with probability ½. If we throw a 2, also with probability ½, we have a 1/6 chance of winning against green (if green’s single blank face comes up); otherwise we will lose to a 5. So we have for blue a probability of losing of ½ x 5/6 = 5/12, and a probability of winning (since no draws are possible this time) of 1 – 5/12 = 7/12. So, in the long run, blue beats green, exactly the opposite of what we would expect just going on the expected values.

Finally, when we pitch red against green, the only time green will beat red is when red has a 4 (with probability 5/6) and green has a 5 (also with probability 5/6). So we have a probability of green beating red of 5/6 x 5/6 = 25/36, and the probability of winning as red (since again no draws are possible, as the two dice have no numbers in common) is therefore 1 – 25/36 = 11/36. So, in the long run (when, as Keynes once helpfully pointed out, we are all dead), green beats red, again exactly the opposite of what we would expect just going on the expected values.
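All three of these contests can be verified by brute force, enumerating the 36 equally likely pairs of faces in each matchup; a short Python sketch:

```python
from fractions import Fraction
from itertools import product

# Dice faces as described above
red = [4, 4, 4, 4, 4, 6]
blue = [2, 2, 2, 6, 6, 6]
green = [5, 5, 5, 5, 5, 0]

def p_win(a, b):
    """Probability that die a shows a higher number than die b,
    counting over all equally likely pairs of faces."""
    wins = sum(1 for x, y in product(a, b) if x > y)
    return Fraction(wins, len(a) * len(b))

print(p_win(red, blue))    # 1/2, against blue's 5/12 (plus 1/12 of draws)
print(p_win(blue, green))  # 7/12
print(p_win(green, red))   # 25/36
```

So each die beats the next around the cycle, which is exactly the nontransitive property.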

We only had to mess around a little with the 6 faces of the dice to get this counter-intuitive result. Nearly all financial instruments and products are obviously much more complicated than this: the probabilities of particular outcomes are largely unknown, even more so when instruments are combined with each other, so counter-intuitive results turn up almost too frequently to be called counter-intuitive any more. In fact the habit of trying to treat financial markets as if they were games obeying rules as fixed and obvious as those you can play with dice is what Nassim Nicholas Taleb refers to as the Ludic Fallacy.

If we double them up we get another surprise. Red still has the highest expected value (8⅔), followed by green again (8⅓) and then blue (8). This time draws are possible whenever red is involved, since two reds share the totals 8 and 12 with two blues and the total 10 with two greens. Red and green both beat blue, as the expected values would suggest, but then green unexpectedly beats red.
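The same enumeration, applied to the totals of two throws, confirms the reversal:

```python
from fractions import Fraction
from itertools import product

# Dice faces as described above
red = [4, 4, 4, 4, 4, 6]
blue = [2, 2, 2, 6, 6, 6]
green = [5, 5, 5, 5, 5, 0]

def pair_sums(die):
    """All 36 equally likely totals of two throws of the same die."""
    return [x + y for x, y in product(die, die)]

def p_win(a, b):
    """Probability that a random outcome from a exceeds one from b."""
    wins = sum(1 for x, y in product(a, b) if x > y)
    return Fraction(wins, len(a) * len(b))

# Single dice: blue beats green on average...
print(p_win(blue, green) > p_win(green, blue))  # True
# ...but doubled, the order flips and green beats blue
print(p_win(pair_sums(green), pair_sums(blue))
      > p_win(pair_sums(blue), pair_sums(green)))  # True
# Doubled, green also beats red, despite red's higher expected value
print(p_win(pair_sums(green), pair_sums(red))
      > p_win(pair_sums(red), pair_sums(green)))  # True
```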

This kind of behaviour is called nonlinearity, when adding quantities of things together does not just increase their effects, but instead changes them. Nonlinearity in this case means that blue beats green when we use one die each, but that green beats blue when we use two. Nonlinearity is also the single biggest threat to the financial system.

Anyone for darts instead?

Dollars and Census

The latest revelations from Edward Snowden, that the US and UK agreed in 2007 to relax the rules governing the mobile phone and fax numbers, emails and IP addresses that the US National Security Agency (NSA) could hold onto (extending the net to people who were not the original targets of their surveillance), have increased the pressure on the Government to tighten controls on the activities of the security services. This extension apparently allowed the NSA to venture up to three “hops” away from a person of interest, eg a friend of a friend of a friend on Facebook.

I have an issue with the Guardian analysis here. They say that three hops from a typical Facebook user would rope in 5 million people. However, the actual ratios from the network in their source (43 friends leading to 3,975 friends of friends and 1,328,361 friends of friends of friends), scaled by the median of 99 friends from the original study, lead to a number closer to 3 million. Still, that is clearly altogether too many people to be treated as guilty by association.
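As a back-of-envelope check (my own arithmetic, assuming the three-hop count scales in proportion to the starting number of friends):

```python
# Figures quoted above: a user with 43 friends reaches 3,975 people at two
# hops and 1,328,361 at three hops; the median Facebook user has 99 friends.
three_hops_from_43 = 1_328_361
median_friends = 99

# Scale the three-hop figure in proportion to the number of friends.
estimate = three_hops_from_43 * median_friends / 43
print(f"{estimate:,.0f}")  # just over 3 million, well short of 5 million
```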

So it might seem like a strange time for me to be advocating that we give the Government more of our data.

The Office for National Statistics (ONS) is currently consulting on the form of the next census and the future of population statistics generally. The two options they have come down to are:

1. Keep the 2021 census pretty much as it was for 2011, although perhaps with slight changes to the questions and a greater push for people to complete them online; or
2. Use administrative data already held by the Government in its various departments to produce an annual estimate of the population in local areas. In addition there would be separate compulsory surveys of 1% and 4% of the population, for checking the overall population figures and some of the sub-groupings respectively, plus surveys of the residents of “communal establishments” such as university halls of residence and military bases, who are difficult to reach by other means.

In my response to the survey, I suggested that they do both: increase the compulsory surveys to 10% of the population each year, and reduce the time between full censuses to 5 years. This is why.

First of all, everybody needs this data to be available. If the Government does not provide it, someone else will: not by asking you overt questions, but by buying information about your purchasing preferences or search engine activity or any number of other transactions, without your informed consent (ticking agreement to terms and conditions on a website does not count) and often without your knowledge. I would prefer to give my data to the ONS.

The ONS is part of the UK Statistics Authority, an independent body at arm’s length from government. It reports directly to Parliament rather than to Government Ministers and has a strong track record of challenging the Government’s misuse of statistics. With the exception of requests for personal information (which are filtered off to become Subject Access Requests under the Data Protection Act), it has published on its website copies of all information disclosed under the Freedom of Information Act. In my view the ONS has demonstrated that it is a safe custodian of our data. It is everything the NSA is not: overt, apolitical and committed to the appropriate use of statistics.

But there are problems with the current data, which brings me to my second point. Ten years is too long to wait for updated information. As the ONS points out in its consultation document, because of the ten year gap between censuses, the population growth resulting from the expansion of the European Union in 2004 was not fully understood until 2012. There were other problems with the population data everyone had been working with before 2011: there were, for instance, 30,000 fewer people in their 90s than expected, which had serious implications for all involved in services to the elderly, and for those constructing mortality tables too.

So we do need more frequent census information. Five years seems about right to me, provided the annual updates can be made more rigorous. I think the ONS are right to suggest that they need to be compulsory to achieve this, but 5% of the population does not seem to me a large enough sample to give that confidence. I would prefer to see 10% completing annual surveys. This would allow 50% of the population to be covered over every 5 year census period, or 40% if the requirement were dropped in census years. There are many recent examples (see Schonberger and Cukier below) to suggest that the gains in accuracy due to increased coverage would be far greater than the losses due to the ‘messiness’ of incomplete responses.

There is a lot in the consultation document about the relative costs of the different options, but nothing about the commercial value of the data being collected. Indeed the reduction of the consultation to these two, to my mind, inadequate options seems to be very greatly influenced by the question of costs and the current cuts in budgets seen throughout the public sector. This seems to me to be very short-sighted.

It also displays a failure of imagination. According to Viktor Mayer-Schonberger and Kenneth Cukier in their book Big Data, data is set to be the greatest source of wealth and economic growth in the years ahead. Many others agree. By taking a fully accountable and carefully controlled approach to licensing the data in its care, the ONS should be able to finance its own activities, even at the level I am suggesting, at the very least.

The ONS is very nervous about becoming more intrusive in its collection methods, citing the 35% increase in the cost of achieving the same level of response in the 2011 census. It also refers to the response rates to its voluntary surveys, which have dropped from around 80% 30 years ago to around 60% today. The main reason for this, in my view, is survey fatigue: the incessant requests from companies’ marketing departments masquerading as surveys on everything from phone usage to our views on banking, and the relentless demands for feedback on every online purchase. This makes it all the more necessary that an organisation which is not trying to sell you anything, and which is scrupulous about the protection of your data, should be increasing its scope and maintaining its position as the go-to place for statistical data rather than falling behind its commercial rivals.

So let’s not fall into the trap of conflating all official data with the mountains of bitty fragments collected by our intelligence agencies from their shady sources. That has nothing to do with the proper, accountable collection of information to allow government and governed alike access to what they need to make better decisions.

So take part in the consultation, it matters. And when the time comes give the ONS your data. You know it makes census.

Why TTIPing may be bad for us

Germany has surprised the European Commission (EC) by suddenly insisting that stiffer data protection controls be incorporated into the negotiations for the Transatlantic Trade and Investment Partnership (TTIP), which began earlier this year and whose second round started this week. For those of you who have not heard of it before (understandable, as the negotiations so far have had a deliberately low profile), the purpose of the TTIP is to create a single transatlantic market in which all regulatory differences between the United States (US) and the European Union (EU) are gradually removed. The EC calls it “the biggest trade deal in the world”.

As the EC goes on to say:

On top of cutting tariffs across all sectors, the EU and the US want to tackle barriers behind the customs border – such as differences in technical regulations, standards and approval procedures. The TTIP negotiations will also look at opening both markets for services, investment, and public procurement. They could also shape global rules on trade.

Concerns have started to emerge about the massive transfer of power from governments to corporations that the final deal might allow. However, Germany’s intervention on data protection is just the latest in a list of reasons that have been advanced for why the TTIP talks are unlikely to go anywhere. From the legislative sclerosis of the US, to protectionist instincts on both sides recently strengthened by austerity, to French paternalism towards their film industry, to European fears about an influx of GM foods, the TTIPing point will never be reached, they say. So nothing to worry about then.

Or is there? A document published last year by the US Chamber of Commerce and BusinessEurope explains how the TTIP could be used to overturn existing legislation which got in the way of business. And if the long, tortuous progress of Solvency 2’s implementation date, the bureaucratic equivalent of the man with the “end is nigh” sandwich board on his back, has taught us anything, it is that unimplemented regulatory frameworks can still have massive impacts. Just this month it was revealed that the best funded pension schemes in the FTSE 100 are those of insurers, precisely because of the impact of those schemes on insurers’ solvency capital requirements under Pillar 1 of Solvency 2. And the clear rebuff EIOPA received over exporting these requirements to occupational pension schemes has not prevented the work to develop a framework for imposing them from continuing.

So what would TTIP mean for defined benefit (DB) pension schemes? Well, at first sight, not very much. US DB schemes tend to have funding targets equivalent to FRS17 levels, which would be seen as at the weak end of UK funding targets. However, as we have seen with the process of market harmonisation in the EU, horse trading may lead to the US being stuck with stiffer requirements imported from the EU on pensions in order to maintain subsidies for US farmers, say.

And there are two features of the US DB landscape which would be an issue for many UK DB schemes.

The first is the recovery plan length, which typically does not exceed 7 years in the US. This is possibly not too onerous in many cases if coupled with an FRS17-type funding target, but the EIOPA caravan has surely travelled too far for any dilution of funding targets to be allowed at this stage. A 7 year recovery plan would, however, represent a considerable increase in contribution requirements for many schemes within the UK’s current funding environment.

The second is the restrictions placed on US pension schemes which fall below prescribed funding levels. If the funding level falls below 80%, no scheme amendments are allowed which would increase benefits until the funding level has first been restored to 80% or above, and certain types of benefit payments are restricted. These restrictions become much more stringent below 60% funding, when benefit accrual must cease and the range of benefits which cannot be paid out is extended to cover “unpredictable” contingent events.
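As a rough sketch, the tiers described above could be summarised as follows (the function name and wording are my own paraphrase, not the statutory language):

```python
def us_benefit_restrictions(funding_level: float) -> str:
    """Illustrative summary of the US restriction tiers described above.

    funding_level is the ratio of assets to liabilities, eg 0.75 for 75%.
    """
    if funding_level < 0.60:
        return ("benefit accrual must cease; restrictions extend to "
                "'unpredictable' contingent event benefits")
    if funding_level < 0.80:
        return ("no benefit-increasing scheme amendments; certain "
                "benefit payments restricted")
    return "no funding-based restrictions"

print(us_benefit_restrictions(0.75))
```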

We may not be out of the woods of Solvency 2 yet as far as DB pension schemes are concerned. But even if we do manage to break out of EIOPA’s grip, it may be only to find ourselves surrounded by a larger forest.

Illustration by Emma J Hardy

Papers and Pensions

Now that the Great and Good of the actuarial profession and pensions industry have launched their joint consultation with the DWP on defined ambition (DA) options, it is interesting to look at the initial response in the print media.

The first thing to note is how little of it there is. The Daily Mail, Daily Express and Daily Telegraph have it on the front page. The Financial Times, Guardian and Times do not. Nor do the red tops. All three headlines sit alongside photographs of the Duchess of Cambridge.

And the response varies. The Express has written what looks like a positive piece (“Bigger Better Pensions For All”) until you discover it has decided to present the launch of the consultation as an “industry shake-up” which will “spell the end of annuities”. I was a little puzzled by this at first, as the consultation is not really about annuities at all, until I realised that Steve Webb had made a speech the previous day and mentioned the FCA review of annuities. This clearly fed into the default Express editorial line better than the actual topic of the consultation. It became clearer still on page 4, with the headline “’Poor value’ annuity payouts are axed in pensions shake-up” next to a big picture of a smiling Ros Altmann. There appears to be only one story possible in the Express on pensions, whatever the actual news event.

The Mail does at least focus on things that are in the consultation, concentrating on the proposals to allow final salary pensions to drop some currently guaranteed elements of benefits such as indexation and spouses’ pensions. “The Death Knell for Widows’ Pensions” is their headline, but the article beneath is fairly balanced on flexible defined benefit (DB), quoting both those highlighting the reductions to benefits the proposal would allow on the one hand, and the danger that all the remaining horses would bolt from the DB stable if changes were not made on the other.

Finally, the Telegraph. “Pensions face new blow from ministers” is their headline. The article is similarly balanced, and is the only one to make the important point that benefits already accrued would be unaffected.

The coverage of the alternatives put up for consultation is patchy. Strangely, the Express does best here: despite its desperation to make this a story about the death of the annuity, it does mention collective defined contribution (DC) and guaranteed DC in passing. Otherwise the focus in both the Mail and Telegraph is exclusively on flexible DB, and on what members currently accruing non-flexible DB might lose as a result. The comparison with public sector pensions is made several times, with the Telegraph pointing out that the recent settlement on public sector pensions, which would not remove the requirement to provide indexation and spouses’ pensions, was promised by ministers to be the last for 25 years.

So what kind of start does this represent for engaging the UK public in the debate on the future of pension provision? Mixed, I think. There will clearly be much more scrutiny of any legislative easing of current benefit guarantees than of any addition of guarantees to pensions which currently have none. Perhaps this is to be expected. I do worry that cash balance may get squashed out as an option between the two camps of flexible DB and guaranteed DC – it is barely mentioned in the consultation, and can work well when coupled with a strong commitment to employee education, as Morrisons have attempted.

But these are early days and the first thing everybody needs to do is respond to the consultation. Most pensions actuaries and many others will have strong views on many elements of it. So don’t leave it to your firm to do it on your behalf. The deadline is 19 December.