Long Finance Spring Conference - Why a Primer?

THE LONG FINANCE SPRING CONFERENCE

Discounted Cash Flow (DCF) and Net Present Value (NPV) analyses have long been part of the financial analyst's toolbox. By deciding on a discount rate and using that discount rate in some exponential equations it can be argued that future generations will be richer than us, so we can spend wantonly now. Or that something expensive is really quite cheap if we make a very small change in the discount rate.

This conference will examine the implications of discount rates on society's long-term decisions such as health, education, infrastructure, and the environment.

Supported by the Z/Yen Group

01 March 2016

"Why a Primer?"

Nick Goddard


Good afternoon. As Michael noted, I am a member of the Long Finance 'Kitchen Cabinet', and when it was my turn to write something I opted for a pamphlet on the uses and abuses of discount rates. I chose this topic not because I am deeply expert at creating discounted cash flows, but because over many years I have been a user of the output from people who are. During this time, I have seen models which were very helpful for supporting business decisions, models which seemed to be just window dressing for an obvious decision – a reverse-engineering of common sense – and models which struck me as disingenuous, serving only the interests of those presenting them. So the simple answer to the question in the title of this talk – 'Why a Primer?' – is that I wrote it to help other people who get presented with DCF models to understand when they are useful, when they are not so useful, and when they can be downright dangerous.

Having trained as an engineer, my first job was at the BP Research Centre in the 1980s. BP was going through one of its periodic attempts to diversify and had concluded that advanced aerospace materials were an exciting market. Our research centre was tasked with inventing some exotic materials for BP to commercialise. DCF modelling would have been an excellent starting point for that research. In essence, a DCF model simply time-shifts a set of financial costs and benefits which occur at different times, so that they can be compared side by side. For each advanced material, BP first needed to incur a string of costs, of which inventing the material was only the start. This would be followed by characterising a huge matrix of different performance parameters, developing a high-volume manufacturing route, finding customers wanting to use the material, waiting for them to secure orders for the aircraft on which it would fly, then building a plant to manufacture it when those aircraft got built. Only then would profits start to flow.
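
As a minimal sketch in Python, with entirely hypothetical figures (say £20m a year of development and plant costs for fifteen years, followed by £60m a year of profit for ten), that time-shifting looks like the following. Note how the verdict flips between a 10% and a 5% discount rate:

    def npv(cash_flows, discount_rate):
        """Net present value of cash flows indexed by year (year 0 = today)."""
        return sum(cf / (1 + discount_rate) ** year for year, cf in cash_flows)

    # Hypothetical profile: heavy costs first, profits much later.
    cash_flows = [(year, -20e6) for year in range(0, 15)]    # development and plant
    cash_flows += [(year, 60e6) for year in range(15, 25)]   # eventual profits

    print(f"NPV at 10%: £{npv(cash_flows, 0.10) / 1e6:,.0f}m")   # comes out negative
    print(f"NPV at  5%: £{npv(cash_flows, 0.05) / 1e6:,.0f}m")   # comes out positive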

I don't know if BP did, in fact, carry out a DCF analysis of its proposed foray into advanced materials. If it did, the results were certainly not cascaded down to us lab rats who were free to study a huge range of materials so long as they were novel and had amazing properties. Pretty soon BP had a stable of about a dozen new materials ready to start their journey to commercialisation. I began to worry about the cost of taking so many materials over so long a journey. I did some research and concluded it might cost £500m spread over 25 years from the invention of an entirely new material to its mass adoption. I reckoned that each material in the stable might contribute £1bn to BP's market cap. I was not familiar with the concept of DCF analysis at that stage in my career, but my gut feel was that this was not a good deal. It is interesting that, since then, to the best of my knowledge, there have been no entirely new civil aerospace materials commercialised. Instead, we have seen increases in the market penetration of materials such as carbon fibre which were developed during the cold war on a 'money no object' basis.

In the event, BP's resolve was tested after much less than 25 years. Diversification went out of fashion and the advanced materials programmes were all abandoned by the mid-1990s. I left and started a new job evaluating power stations which had reached the end of their design life. These 'plant life extension' studies end in one of three recommendations: run, repair or retire. 'Run' means that, despite reaching the end of its design life, the station is actually in better condition than might have been expected and can be switched back on for a few more years. 'Repair' means that it will be cost effective to repair its most worn out parts and then squeeze out a bit more life. 'Retire' means that it is not worth repairing.

You make exactly the same decisions regarding your car. It gets to ten years old and is probably worth £1,000. If it passes its MOT you might run it for another year. If it fails, you might spend £500 on new brakes and put it back on the road. But if it needed £2,000 of repairs, then obviously you would retire it. These sorts of decisions are also usefully guided by DCF models, but now you have to introduce probabilised inputs. If I definitely pay £500 for the new brakes, I also have to include the most probable net present value of further repairs which I may have to make later on. It is hard to be certain about such things, which is why it was interesting to read recently that it suddenly makes sense to extend the life of four ageing nuclear power stations just when there could be further delay in building their replacement.
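
A sketch of that probabilised comparison, with the £500 brakes from above and an assumed 40% chance of an £800 clutch failure a year later, might look like this:

    # All figures except the £500 brakes are assumptions for illustration.
    discount_rate = 0.05
    brakes_now = 500                       # certain cost today
    clutch_cost, clutch_prob = 800, 0.40   # possible repair a year from now
    car_value = 1000                       # rough value of keeping the car going

    expected_cost = brakes_now + clutch_prob * clutch_cost / (1 + discount_rate)
    print(f"Expected present cost of repairing: £{expected_cost:,.0f}")
    print("Repair" if expected_cost < car_value else "Retire")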

After three years working on plant life extension I moved to the Ministry of Defence research labs where I worked for Michael. In 1997 he seconded me to an investment bank. I promptly left the civil service, joined the bank full time and worked in the tech sector during the dotcom bubble. Here I encountered some of the more dubious uses of DCF analysis. You will recall that in essence a DCF model is simply a way to time-shift a series of debits and credits so that you can add them together and create a single number for the Net Present Value. The scope for doing DCFs badly therefore basically comes down to using the wrong discount rate, or assuming the wrong value and/or date for the debit and credit items. In my experience, the vast majority of bad DCF analyses are bad because they use poor assumptions about the debit and credit items rather than the wrong discount rates. The dotcom companies which were the darlings of the market during my time in the City provide excellent examples of this. They often had no revenues, let alone profits, and therefore had to be valued on the basis of their NPV under a DCF model.

Typically these models started with some top down revenue projections. This involved buying a market report which said "this sector will grow from $100m to $20bn within 15 years" (a compound annual growth rate of over 40%). The modeller would then assume that the company would win 10% of this market (religiously noting that this was conservative because it was, in fact, one of the three most prominent players in the space). Having spent about an hour working out the revenue line, the modeller would then spend the next few days creating an incredibly detailed model of the company's costs, at the end of which they would have the NPV.
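
A sketch of that top-down logic, using the figures quoted above and an assumed valuation point ten years out, shows just how little work sits behind the revenue line:

    start_market, end_market, years = 100e6, 20e9, 15
    cagr = (end_market / start_market) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {cagr:.1%}")   # a little over 40%

    market_share = 0.10            # the 'conservative' assumption
    year_10_market = start_market * (1 + cagr) ** 10
    print(f"Assumed year-10 revenue: ${year_10_market * market_share / 1e6:,.0f}m")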

The key point is that since these models were dominated by the revenue lines, most of the analyst's time should have been spent testing those assumptions. How accurate were the market reports that claimed explosive growth? Looking at historical reports it was clear there was huge uncertainty – markets typically being smaller or slower to develop than predicted. During the tech boom you could charge several thousand pounds for a bullish market report and could probably sell 100 copies to investment banks alone. Did the publishers simply tell their City paymasters what they wanted to hear? Was a 10% global market share really conservative? Were there really only 10 credible companies worldwide competing for this market, or was it more like 100? Would the market take off five years from now, or 15?

In the simple example of making a run, repair or retire decision for your dilapidated car, I pointed out that you would need to consider the probability that, having replaced the brakes, you would find the clutch needed replacing a month later. In order to improve their accuracy and relevance, DCF models cry out for a probabilised approach. What I mean by probabilistic modelling would be to say: there are two competing technologies, only one of which will secure this $10bn market. For the following technical reasons, the one you are backing has a 30% chance of success, so we value the opportunity at $3bn. What I do not mean is 'this is a $10bn market so we have modelled it at $9bn and $11bn to create upside and downside scenarios'.
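
Expressed in code, with the probability treated as an assumption to be argued over rather than a number to be hidden, the contrast is stark:

    market_size = 10e9
    p_win = 0.30    # assumed probability of winning, from the technical arguments

    print(f"Probabilised value: ${p_win * market_size / 1e9:.1f}bn")

    # The token alternative: wiggle the market size and call it scenario analysis.
    for label, size in (("downside", 9e9), ("base", 10e9), ("upside", 11e9)):
        print(f"{label:>8}: ${size / 1e9:.0f}bn")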

Why do analysts spend so much time playing with the linkages between their cells and so little thinking about the appropriate value of the dominant inputs? In the Long Finance report we noted that in any analysis, there is always a risk of 'Garbage In – Garbage Out' – the 'GIGO' principle. The nature of mathematical exponential functions is that what you get with DCF models is actually 'Garbage In – Amplified Garbage Out' – or 'GIAGO'. The charitable explanation is that everyone tends to focus on what they are good at. Investment bank analysts almost never have any practical industrial experience. They accept at face value input numbers on what will happen in industry because they have no way to sanity check them. They devote hours to playing with spreadsheets because that is their peculiar skill: if the only tool you have is a hammer then every problem looks like a nail.
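
The amplification is easy to demonstrate with assumed numbers: growth inputs that all look equally defensible at the time produce fifteen-year revenue figures that differ by a factor of several, before the discount rate has done any work at all:

    revenue_today = 100e6
    years = 15

    for growth in (0.30, 0.40, 0.50):      # all plausible-looking inputs
        future_revenue = revenue_today * (1 + growth) ** years
        print(f"Growth {growth:.0%}: year-{years} revenue ${future_revenue / 1e9:.1f}bn")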

There is of course a more sinister explanation. Tricksters know that if you can get someone to focus on what you are doing with one hand then it makes it easier to pick their pocket with the other. If you present a complex looking DCF spreadsheet and spend an hour discussing the various financial assumptions and linkages, then people are distracted from questioning the input values, their timings and growth rates, which actually dominate the NPV calculation.

I said that poor DCF analysis generally comes down to bad input assumptions rather than inappropriate discount rates. However, I have seen some important examples of models where discount rates have been used either inappropriately or disingenuously. After leaving banking I have worked mainly in the renewable energy sector and so have spent a lot of time looking at the use of DCF analysis to estimate the so-called 'Levelised' cost of energy generated in different ways. In order to look fair, these tend to use the same discount rate for all the different options – often 10%. Now this is an arbitrary hurdle rate – it is certainly not the rate of interest being charged or earned in different industrial sectors at the moment. Installing solar panels will incur clearly defined up-front costs which will not be discounted at all, and then minimal maintenance costs over the next 25 years, during which time electricity will be generated almost for free. The solar farm can probably borrow money at around 5%, so the 10% discount rate is not reflective of the actual project economics.
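
A sketch of that sensitivity, with assumed capital cost and output figures for a solar farm, shows how much the levelised cost depends on the chosen rate rather than on anything physical:

    capex = 50e6               # assumed up-front cost, paid today
    annual_output = 60_000     # assumed MWh generated per year
    lifetime = 25              # years of near-free generation

    def levelised_cost(rate):
        """Levelised cost per MWh: up-front cost over discounted lifetime output."""
        discounted_output = sum(annual_output / (1 + rate) ** y
                                for y in range(1, lifetime + 1))
        return capex / discounted_output

    for rate in (0.05, 0.10):
        print(f"Discount rate {rate:.0%}: £{levelised_cost(rate):,.0f} per MWh")

On these assumed figures the farm needs to sell at around £60 per MWh at a 5% rate, but over £90 per MWh at 10%, for exactly the same hardware.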

A nuclear power station has up-front costs which may escalate over a multi-year build period, fuel costs over 40 years which are difficult to predict and a decommissioning cost incurred, say, 50 years from now which may be huge. That's OK, because at a 10% discount rate each billion pounds of decommissioning cost will be valued at just £8.5m today. Except that there is nowhere you can invest that £8.5m to obtain a 10% return after inflation for a period of 50 years. So the 10% discount rate is kicking the can down the road for future generations. Once again, it is not reflective of the actual economics. A much better approach would be to work out the price at which solar farms and nuclear power stations would have to sell energy in order to cover all of their costs at the interest rates actually likely to be charged or received.
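
The arithmetic behind that £8.5m figure is worth seeing alongside a rate closer to what could plausibly be earned over 50 years (2% real is my assumption here):

    decommissioning_cost = 1e9
    years = 50

    for rate in (0.10, 0.02):    # 10% as used in the models; 2% real as an assumption
        present_value = decommissioning_cost / (1 + rate) ** years
        print(f"At {rate:.0%}: £{present_value / 1e6:,.1f}m set aside today")

At 2% the sum that would have to be set aside today is closer to £370m than £8.5m.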

To summarise: I am not knocking DCF – it is an essential item in a business analyst's toolbox. However, its accuracy is almost always dominated by the accuracy of the input assumptions rather than the computational linkages between those assumptions. Effort should therefore be concentrated on dealing with that uncertainty, not on ever more elaborate DCF analysis. And over-elaboration of the models is not just a waste of time – an opportunity cost – it is actually dangerous because of the GIAGO effect. To misquote the Bible – look not at the speck in your analyst's DCF spreadsheet when there is a log(arithmic) error in your own underlying assumptions.

© Nick Goddard, 2016

Dr Nick Goddard

Dr Goddard is the Author of "Uses and Abuses Of Discount Rates: A Primer For The Wary". The title was released under Long Finance and...
