Risk, bias and big decisions

How financial appraisals are not necessarily as objective as you might think.

I have been wanting to write about this for some time. There has been growing awareness recently that bias exists in business decision making. A close relative of mine works in the tech industry, as it happens, and has personal experience of the unpleasant biases of software developers showing up in the end product that the customer could see. The tech industry is starting to become aware that its algorithms suck. See the picture of Barack Obama below, put through a de-pixelator, if you don’t agree!*

De-pixelator working on Barack Obama — courtesy of Forbes

In this article, I’m going to wonder whether things closer to home might suck as well. We will consider how financial appraisals are also vulnerable to the beliefs and biases of the techie people involved, and what we can do about it.

A quick recap — a financial model is used to decide whether or not to invest in something. It is an equation, or a set of equations, which predicts income and expenditure, usually over a period of time. To allow for the fact that income in the future is less valuable than income today (the time value of money), we apply a discount to future income and expenditure. The further into the future, the bigger the discount. So the discount rate (usually an interest rate) is an incredibly important variable to be thinking about.

This allows you to come up with a single figure — if it is positive then invest, if it is negative then don’t. This approach (known as Net Present Value, or NPV) is scalable — it can be applied to single projects, to divisions, and in effect to whole organisations.
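To make that concrete, here is a minimal NPV sketch in Python. The cash flows and the 5% discount rate are invented for illustration only, not figures from any real appraisal.

```python
# Minimal NPV sketch - all numbers are invented for illustration
def npv(rate, cash_flows):
    """Discount a list of yearly net cash flows (year 0 first) back to today's value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Year 0 is the upfront investment; years 1-5 are predicted net income
cash_flows = [-100_000, 25_000, 25_000, 25_000, 25_000, 25_000]

result = npv(0.05, cash_flows)      # 5% discount rate, purely illustrative
print(f"NPV at 5%: {result:,.0f}")  # positive means invest, negative means don't
```

Even in that tiny sketch, a lot hangs on the 0.05: nudge the discount rate up to around 8% and the figure turns negative, and the decision flips with it.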

In preparing these models we use some variables to estimate what future income and expenditure might be. In a scenario test, we switch out the variables and see what happens to the results of the model. This will tell us how sensitive the model / decision is to movements in that variable.
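In code terms, a scenario test is just the same model re-run with a different set of assumptions. Here is a rough sketch of that switching; the scenario names and numbers are invented purely to show the mechanics.

```python
# Scenario test sketch: the same simple model re-run with swapped assumptions
# (all names and numbers are invented for illustration)

def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def project_cash_flows(upfront, annual_income, annual_cost, years):
    """A deliberately simple projection: fixed income and cost each year after the upfront spend."""
    return [-upfront] + [annual_income - annual_cost] * years

scenarios = {
    "base case":  {"annual_income": 40_000, "annual_cost": 15_000, "rate": 0.05},
    "cost shock": {"annual_income": 40_000, "annual_cost": 22_000, "rate": 0.05},
    "rates rise": {"annual_income": 40_000, "annual_cost": 15_000, "rate": 0.09},
}

for name, s in scenarios.items():
    flows = project_cash_flows(100_000, s["annual_income"], s["annual_cost"], years=5)
    print(f"{name}: NPV = {npv(s['rate'], flows):,.0f}")
```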

Variables and their limits

This is where the first bias comes in. Where those variables are set will be down to someone’s judgement or interpretation, and it is easy for bias to creep in. For example, inflation is invariably an important variable, and estimates of future inflation rates can easily be affected by political beliefs or world views. It depends on which consultant you ask! On a variable like inflation, which is a mainstay of such models, the entire model can be highly sensitive to even small moves in the numbers. So this bias can be a critical one. When setting the upper and lower limits of a variable, it's important to encompass a variety of viewpoints and to benchmark independently wherever you can.
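One practical guard is to sweep a variable like inflation across the whole range of viewpoints rather than anchoring on one person's figure. The sketch below does that with invented numbers, and shows how quickly the answer can change sign within a plausible range.

```python
# Sweep an inflation assumption across a range of viewpoints (invented numbers)
def npv_with_inflation(rate, inflation, upfront, income, cost, years):
    """Costs grow with inflation each year; income is assumed flat for simplicity."""
    flows = [-upfront] + [income - cost * (1 + inflation) ** y for y in range(1, years + 1)]
    return sum(cf / (1 + rate) ** y for y, cf in enumerate(flows))

for inflation in (0.02, 0.04, 0.06, 0.08):   # optimistic through to pessimistic views
    value = npv_with_inflation(0.05, inflation, 100_000, 40_000, 15_000, years=5)
    print(f"inflation {inflation:.0%}: NPV = {value:,.0f}")
```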

Which variables to pick?

Here is the next bias. When modelling income, say, how do you estimate what that looks like? How much detail do you need? A common mistake is to put in either way too much or far too little detail. For example, what is future income going to look like? Is that predicted by a very complicated formula using many variables, or a very simple one?

Go too simple and you have missed modelling what may be a key relationship. Add too much detail and you are introducing false or inaccurate relationships: the signal gets lost in the noise. It's also worth bearing in mind that relationships can change over time, particularly in a sector which is seeing a channel shift to working online, or, as in mine, one subject to potential regulatory whims. Any model is going to have some simplification in it by definition: it's a map, not the real world. I think the sweet spot we all need to aim for in models is just enough detail to get by.
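To make that trade-off concrete, here is a hedged sketch of the two extremes for a single income line. Every parameter is invented; the point is simply that each extra term in the detailed version is another guess where a bias can hide.

```python
# Two ways to project income for a future year (all parameters are illustrative guesses)

def income_simple(base, growth, year):
    """One assumption: income grows at a flat rate."""
    return base * (1 + growth) ** year

def income_detailed(base, growth, churn, uptake, price_drift, regulatory_hit, year):
    """Five assumptions: each extra term is another place a guess (or a bias) can hide."""
    customers = (1 - churn + uptake) ** year
    price = (1 + growth + price_drift) ** year
    return base * customers * price * (1 - regulatory_hit)

print(f"simple:   {income_simple(500_000, 0.03, year=3):,.0f}")
print(f"detailed: {income_detailed(500_000, 0.03, churn=0.05, uptake=0.06, price_drift=0.01, regulatory_hit=0.02, year=3):,.0f}")
```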

Bias in scenarios

There are many more potential scenarios than you can ever imagine, let alone actively model, so there will be a process of thinning down to pick what we think are the most useful scenarios to model. This is another place where techie bias can creep in. If you believe that Brexit is very bad for the economy (an obvious political belief, of course) then you are more likely to explore that area, for example. Selecting the most useful scenarios to explore can require acute powers of prediction, which is ripe for bias. And many techie people are more comfortable being historically focussed rather than prediction focused, which only adds to poor selection of scenarios. They can all too easily miss the mark, either finding nothing or just creating noise (problems which are truly unlikely to happen). Techie people preparing scenarios need to make sure their scenarios model the things which keep them up at night, and the things which keep their colleagues up at night. Map scenarios back to existing risk matrices or risk assessments as closely as you can.
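One practical way to do that mapping is to tie each modelled scenario to a named entry on the risk register, and then check for risks with no scenario at all. A rough sketch follows, with made-up scenario names and risk references.

```python
# Map modelled scenarios back to the risk register (names and references are made up)
scenarios = {
    "interest rates rise 3%":    {"risk_ref": "FIN-02", "keeps_me_up_at_night": True},
    "major regulatory change":   {"risk_ref": "REG-01", "keeps_me_up_at_night": True},
    "10% drop in rental income": {"risk_ref": "FIN-05", "keeps_me_up_at_night": False},
}

# Anything on the risk register with no matching scenario is a gap worth querying
risk_register = {"FIN-02", "FIN-05", "REG-01", "OPS-03"}
covered = {s["risk_ref"] for s in scenarios.values()}
print("Risks with no scenario modelled:", risk_register - covered)
```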

The derelict lime works at East Aberthaw, South Wales, built in the late 19th century and closed in the 1920s. Possibly its NPV didn’t run for 100 years. How many derelict projects are out there? (my photo)

Transparency

It is worth thinking about how your models can be made as transparent as possible, bearing in mind that not all users of the service will be prepared to interrogate formulas. But there should be an open approach to the variables used — most users can relate to those. Share that list! Share as many of the selected variables as possible, and encourage a debate around them. Insight should follow.
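A minimal sketch of what sharing that list could look like follows; the assumptions and the sources against them are entirely made up.

```python
# Publish the assumptions alongside the results (values and sources are made up)
assumptions = [
    ("discount rate", "5.0%",  "external guidance, benchmarked"),
    ("CPI inflation", "3.0%",  "independent forecaster"),
    ("rent increase", "CPI+1", "current regulatory settlement"),
    ("void rate",     "2.0%",  "own judgement - please challenge"),
]

print(f"{'Assumption':<15}{'Value':<8}{'Source'}")
for name, value, source in assumptions:
    print(f"{name:<15}{value:<8}{source}")
```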

A final sanity check

When building the next model, it is worth thinking about what previous exercises have told you. I would argue it should be a cumulative exercise. How did they measure up to subsequent reality? If you don’t know, it is really worth checking! To guard against cherry-picking, it's worth encouraging consistency in variables wherever it's safe to do so.

For example, you may be able to safely rule out certain variables or scenarios by thinking about what previous models are telling you. And how much space for pessimism do you need in your models? Previous models will help you assess those points. Take the learning from last time, to avoid building the next derelict project.
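That back-check does not need to be elaborate. Comparing what the last model predicted with what actually happened, year by year, is enough to start, as in this sketch with invented figures.

```python
# Compare the last model's predictions with what actually happened (invented figures)
predicted = {"2021": 1_250_000, "2022": 1_310_000, "2023": 1_380_000}
actual    = {"2021": 1_190_000, "2022": 1_305_000, "2023": 1_295_000}

for year in predicted:
    error = (actual[year] - predicted[year]) / predicted[year]
    print(f"{year}: predicted {predicted[year]:,}, actual {actual[year]:,}, error {error:+.1%}")

# A consistent miss in one direction is a prompt to revisit the optimism in the next model
```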

* = https://www.businessinsider.com/depixelator-turned-obama-white-illustrates-racial-bias-in-ai-2020-6?r=US&IR=T

Experienced Chief Finance Officer -track record in Welsh social housing and third sector. Chartered Accountant (FCA BFP). Views my own - my space for blogging.
