Expect the unexpected: Risk assessment using Monte Carlo simulations
Modern technology provides managers and accountants with the means to effectively and efficiently perform increasingly complex quantitative analyses related to decision-making and financial reporting. With software such as Microsoft Excel, CPAs can perform statistical “Monte Carlo” simulations to assess business decisions and accounting estimates, evaluating not only their expected values but also their potential upside and downside risks. This article provides a brief description of the Monte Carlo technique, demonstrates how it can be performed using Excel’s features, and illustrates its use in a common business application.
The Monte Carlo simulation technique, named for the famous Monaco gambling resort, originated during World War II as a way to model potential outcomes from a random chain of events. It is particularly useful when an outcome is the product of multiple random variables (i.e., sources of uncertainty) and is readily adaptable to model any situation that involves uncertainty. The simulation includes a mathematical formula that reflects the evaluated outcome based on random variables known to influence the outcome, places a value on each variable from its identified range of variation, and then computes an outcome.
Today, limited only by computing power and software constraints, CPAs can run this basic calculation and repeat it thousands, tens of thousands, or even millions of times, with each computation using an alternative set of randomly generated values for the determinant variables. The resulting output creates a range of possible outcomes from which one can assess the likelihood of a specific outcome, or for the application described in this article, the reasonableness of an accounting estimate based upon its modeled frequency of occurrence.
The example below demonstrates the use of a Monte Carlo simulation in Microsoft Excel for two decisions involving a loan covenant—a scenario familiar to financial managers, accountants, and auditors. The technique, however, could apply to myriad situations where a variety of outcomes are possible and the objective is to assess the likelihood of a specific outcome or the reasonableness of an accounting estimate.
Example Co. is privately held and has financing from a bank loan. As a condition of the loan, the lender requires that Example Co. maintain an interest coverage ratio, defined as earnings before interest and taxes (EBIT) divided by interest expense, of 1.5. While Example Co. is financially sound and expects to be profitable, its profits face two major sources of uncertainty. First, revenues fluctuate due to variations in product mix and sales volume. Second, the firm’s costs present an element of uncertainty due to factors such as variations in the sales mix, efficiencies in operations, changes to input prices, etc.
To estimate EBIT and compute the interest coverage ratio, the CFO analyzes the company’s cost structure. Historically, variable costs average 35% of revenues, but this percentage randomly fluctuates from one period to the next. The firm also has fixed operating costs of $5 million. The interest charged by the lender is also fixed with an annual amount of $500,000, as the net borrowings on the loan are not expected to change.
Based on prior years (e.g., a couple of years of sales data) and current market information, revenues for the current period are projected to be $10 million. The expected EBIT is $1.5 million, computed as ($10,000,000 × (1 − 0.35) − $5,000,000), and the expected interest coverage ratio is 3.0 ($1,500,000 ÷ $500,000). Based only on these estimates, the firm appears likely to meet the lender’s covenant. Rather than relying on the relatively large margin of safety reflected in this single calculation, the CFO uses Excel to further quantify the likelihood of a covenant violation by performing a Monte Carlo simulation of EBIT and the resulting interest coverage ratio.
First, the CFO must quantify the variation around the ratio’s two primary sources of uncertainty—revenues and variable costs. For a stable business, the standard deviation of revenues over a historical period may be appropriate. Alternatively, the CFO could ask for multiple revenue projections from the sales department, each based on different assumptions of factors such as customer growth, sales mix, and economic conditions, and calculate the standard deviation of these projections. Several other advanced statistical techniques, such as regression analysis of prior-period results, could also be employed on historical data to identify trends in variation. In this scenario, the CFO reviews projections prepared by the sales department and determines an appropriate standard deviation for revenue is $500,000. The standard deviation of variable costs can be similarly measured. Here, the CFO reviews the past 24 months of cost data and finds an appropriate standard deviation of 2% for use in the analysis.
The CFO proceeds to build the spreadsheet by labeling each fixed and random variable with its expected value and, if applicable, expected standard deviation. These values are referenced in the spreadsheet to model the thousands of replications of the EBIT and coverage ratio calculations based on “random draws” of sales and variable costs. The resulting output is a range of the possible EBITs and coverage ratios.
To perform a “random draw” for a variable, the CFO uses the following Excel formula: =NORMINV(RAND(), mean value, standard deviation). (Current versions of Excel also offer the equivalent NORM.INV function.) Then, the CFO references the “drawn” variables in the formulas for EBIT and the interest coverage ratio. Finally, the CFO copies these formulas down any number of rows, with each row representing a separate observation. In this example, the CFO repeats the formulas in 100,000 rows to compute an EBIT and interest coverage ratio for 100,000 potential outcomes. The CFO knows that a smaller number of rows, perhaps only 10,000, would likely be sufficient, but that the analysis will be more stable across repeated runs with the larger sample size. Excel limits the number of rows on a worksheet to just over 1 million, but by also using the worksheet’s maximum of more than 16,000 columns, the number of possible calculations increases into the billions. As such, time and the computer’s memory and processing power are usually the only true limitations.
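The draw-and-repeat logic can also be sketched outside Excel. The following Python script is an illustration rather than part of the article’s Excel workflow; the variable names and the fixed seed are our own, while the parameter values come from the example above. Each pass through the loop plays the role of one spreadsheet row:

```python
import random
import statistics

random.seed(42)  # fixed seed (our choice) so the run is reproducible

N = 100_000                      # one "row" per simulated outcome
SALES_MEAN, SALES_SD = 10_000_000, 500_000
VC_MEAN, VC_SD = 0.35, 0.02      # variable costs as a fraction of sales
FIXED_COSTS = 5_000_000
INTEREST = 500_000

ebits, ratios = [], []
for _ in range(N):
    # random.gauss plays the role of =NORMINV(RAND(), mean, standard deviation)
    sales = random.gauss(SALES_MEAN, SALES_SD)
    vc_pct = random.gauss(VC_MEAN, VC_SD)
    ebit = sales * (1 - vc_pct) - FIXED_COSTS
    ebits.append(ebit)
    ratios.append(ebit / INTEREST)

print(f"mean EBIT : {statistics.mean(ebits):,.0f}")    # ≈ 1,500,000
print(f"EBIT sd   : {statistics.stdev(ebits):,.0f}")   # ≈ 380,000
print(f"mean ratio: {statistics.mean(ratios):.2f}")    # ≈ 3.00
```

With this many draws, the summary statistics should land very close to the expected values derived above.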
See the table “Spreadsheet Template” for a snapshot of the template and the table “Excel Formulas” for details on the necessary formulas. Only the first three rows are illustrated, but the formulas remain the same for each of the 100,000 observations. One expects that a greater number of observations would increase the predictive power of the simulation, but there are diminishing returns as observations are added. For example, the model will increase in utility far more when going from 10,000 to 20,000 observations than when going from 100,000 to 200,000 observations.
Spreadsheet template
Excel formulas
Note that for each observation, the Excel functions randomly assign a different amount for sales and variable costs, and the unique combination results in a different EBIT and interest coverage ratio. Excel reassigns random values any time a change is made to the spreadsheet, so it is easy to see how stable the model is by observing how much the outcomes change. When the outcomes change dramatically between recalculations, additional observations (rows) should be added to improve the model’s stability. The purpose of repeating the formula in each row is to explore the impact of the variables’ variability on an outcome. Because each variable can fall either above or below its expected value, the computed outcome can vary substantially from row to row. In each of the three observations illustrated, the interest coverage ratio covenant is satisfied, but the margin by which it is met varies by more than 40% between observations 1 and 2 alone.
The benefit of using Excel to repeat the formulas across a multitude of observations is that extreme outcomes are less likely to influence the expected average based on all the observations. Statistically speaking, employing this exercise over many random draws, or a large sample size, allows the law of large numbers to work upon the data; that is, the sample mean value converges to the expected population mean as the sample size increases. Once the desired level of stability is reached, it may be desirable to copy and paste the values, as opposed to the formulas, to a new sheet to prevent further randomization.
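The law-of-large-numbers behavior can be seen directly in a small sketch (our own Python illustration, with our own helper name and seed): the simulated mean EBIT settles toward $1.5 million as observations are added, with the noise shrinking roughly with the square root of the sample size.

```python
import random

random.seed(3)  # fixed seed (our choice) for reproducibility

def mean_ebit(n):
    """Average EBIT over n random draws of sales and the variable-cost %."""
    total = 0.0
    for _ in range(n):
        sales = random.gauss(10_000_000, 500_000)
        total += sales * (1 - random.gauss(0.35, 0.02)) - 5_000_000
    return total / n

# The standard error of the mean falls with the square root of n, so each
# tenfold increase in observations cuts the typical wobble by about 3x.
for n in (1_000, 10_000, 100_000):
    print(f"n = {n:>7,}: mean EBIT ≈ {mean_ebit(n):>12,.0f}")
```

This is also why doubling an already large sample buys less stability than doubling a small one.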
The charts “Frequency and Range of Outcomes for EBIT” and “Frequency and Range of Outcomes for Interest Coverage Ratios” show histograms that illustrate the frequency and range of outcomes for the EBIT and interest coverage ratios based on the 100,000 replications of the calculation. The average value for EBIT is $1,499,770—very close to the expected value of $1,500,000. The average interest coverage ratio is also the expected value of 3.00. The CFO now has new insight into the risk associated with these estimates. The EBIT’s standard deviation is $380,165, and the standard deviation for the interest coverage ratio is 0.76.
Frequency and range of outcomes for EBIT
Frequency and range of outcomes for interest coverage ratios
The “Frequency and Range of Outcomes for Interest Coverage Ratios” chart shows that the interest coverage ratio drops below the covenant threshold in only a relatively small number of observations, and the CFO can quantify the likelihood of this occurrence precisely. Using the COUNTIF function in Excel to count all interest coverage values (column G) that fall below the threshold of 1.5, the CFO compares this total to the total number of observations (100,000). Here, the covenant is violated in only 2.18% of observations; that is, the model predicts a 2.18% chance of failing to meet the lender’s requirement.
A covenant violation appears to be unlikely, consistent with the CFO’s initial belief. But the CFO’s qualitative estimate is now quantified into an actual probability, useful both in evaluating possibilities and justifying decisions. Further, the cost of a potential default can be quantified by multiplying this estimated probability of default by the monetary cost of a default.
DECISION 1
The CFO’s spreadsheet can also be used to perform a sensitivity analysis. Suppose the CFO is presented with an opportunity to restructure some vendor contracts, which would result in fixed expenses increasing from $5 million to $5.75 million to reduce variable costs from 35% to 27% of sales. However, under this new cost structure, variable costs will experience greater variation, and the standard deviation is expected to rise from 2% to 5%.
Substituting in the new variables, one can easily calculate the revised expectations for EBIT and the interest coverage ratio. As illustrated in the table “Effect of Change in Standard Deviation,” the CFO finds that on average the opportunity provides a net positive payoff, as expected EBIT is $1,550,000 ($10,000,000 × (1 − 0.27) − $5,750,000) and the expected interest coverage ratio climbs to 3.1 ($1,550,000 ÷ $500,000). However, sound decision-making weighs this potential financial gain against any associated risks and other qualitative factors. Notice, for example, that the standard deviation in EBIT also rises dramatically, to $620,092, and the standard deviation in the interest coverage ratio rises to 1.24.
Effect of change in standard deviation
Effect of lower interest rate
The increased volatility in the coverage ratio leads to a greater probability of a covenant violation. The percentage of outcomes that fall below the required interest coverage ratio threshold increases to 9.54%. While the new cost structure on average brings potential gains, the higher variances increase the likelihood of outcomes that differ from the expected value and increase the overall risk of a covenant violation. With a quantitative evaluation of the possible outcomes, the CFO now must weigh the expected increase in profitability against the expected increase in risk. The essential question is whether the $50,000 expected increase in profitability is worth the increased downside risk of a covenant violation.
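Decision 1 can be replayed in the same kind of Python sketch used earlier (again our own illustration; the `violation_rate` helper and the seed are ours, while the parameter values come from the article):

```python
import random

def violation_rate(vc_mean, vc_sd, fixed_costs, n=100_000,
                   sales_mean=10_000_000, sales_sd=500_000,
                   interest=500_000, threshold=1.5):
    """Share of simulated periods whose coverage ratio breaches the covenant."""
    violations = 0
    for _ in range(n):
        sales = random.gauss(sales_mean, sales_sd)
        ebit = sales * (1 - random.gauss(vc_mean, vc_sd)) - fixed_costs
        violations += (ebit / interest) < threshold
    return violations / n

random.seed(1)  # fixed seed (our choice) for reproducibility
current = violation_rate(vc_mean=0.35, vc_sd=0.02, fixed_costs=5_000_000)
restructured = violation_rate(vc_mean=0.27, vc_sd=0.05, fixed_costs=5_750_000)
print(f"current cost structure : {current:.2%}")       # roughly 2.5%
print(f"restructured contracts : {restructured:.2%}")  # roughly 10%
```

Parameterizing the simulation this way makes the sensitivity analysis a one-line change, mirroring how the CFO swaps values into the spreadsheet template.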
DECISION 2
Returning to the original analysis, the CFO is in negotiations with the bank. The bank manager offers to lower the effective interest rate by 10% (e.g., lowering the rate from 5% to 4.5%), reducing the expected interest expense by a proportional amount. In exchange, the lender wants the required interest coverage ratio increased to 1.75. The CFO knows that meeting the covenant will simultaneously become easier (due to lower interest expense in the denominator) and more difficult (due to the higher threshold of 1.75), and must decide whether to accept this offer.
The CFO updates the original template, reducing interest expense by 10% to $450,000 and increasing the threshold for the interest coverage ratio to 1.75. The table “Effect of Lower Interest Rate” reports the results of the revised simulation. The expected interest coverage ratio climbs to 3.33, but the probability of a covenant violation actually increases from 2.18% to 2.98%. Again, the CFO must weigh the potential increase in profitability against the increased likelihood of violation and any costs it brings.
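The direction of this result is easy to see analytically (our own normal-approximation illustration, not part of the article’s workflow): the EBIT “floor” needed to satisfy the covenant actually rises, from 1.5 × $500,000 = $750,000 to 1.75 × $450,000 = $787,500, so a larger slice of the EBIT distribution falls below it.

```python
from math import erf, sqrt

MEAN_EBIT = 1_500_000
# First-order standard deviation of EBIT, ~$381,600 (approximate, as before).
SD_EBIT = sqrt((0.65 * 500_000) ** 2 + (10_000_000 * 0.02) ** 2)

def p_ebit_below(x):
    """Normal approximation of P(EBIT < x)."""
    return 0.5 * (1 + erf((x - MEAN_EBIT) / (SD_EBIT * sqrt(2))))

old_floor = 1.5 * 500_000    # EBIT needed under the current terms: $750,000
new_floor = 1.75 * 450_000   # EBIT needed under the proposed terms: $787,500

print(f"current : floor ${old_floor:,.0f} -> P(violation) ≈ {p_ebit_below(old_floor):.2%}")
print(f"proposed: floor ${new_floor:,.0f} -> P(violation) ≈ {p_ebit_below(new_floor):.2%}")
```

The lower interest expense shrinks the denominator, but the higher threshold more than offsets it in this example.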
CONCLUSION
The examples presented above demonstrate how a Monte Carlo simulation is useful when assessing risk in business and accounting decisions. The loan covenant setting provides a straightforward context for illustration, applicable to a wide variety of professionals, but the modeling can easily be scaled up for more complicated business decisions. For example, more variables (e.g., sales mix, more detailed expense accounts, etc.) may be included for more complex situations. (See “A Word of Caution on Monte Carlo Simulations,” below, for a discussion of how advanced tools can be used to perform more complex analyses.)
Risk is often expressed in qualitative terms, such as remote or probable. The ability to quantify expected outcomes and their probabilities under uncertainty introduces a degree of objectivity that such qualitative labels lack. To the extent that risk can be quantified, these techniques add an element of control to the process. Modern technology supplies the tools to measure risk and incorporate its effects into decision-making, and Monte Carlo simulations provide one opportunity for financial professionals to leverage those tools.
A word of caution on Monte Carlo simulations
The examples in this article assume that the random variables in the model are independent, meaning that no meaningful statistical correlation exists between them as they vary across outcomes and that there is no causality between the variables. However, if a moderate-to-strong correlation or causal link between the variables is believed to be present, Monte Carlo simulations are readily adapted using additional advanced techniques to incorporate this co-variation. Such adjustments are easier to make in advanced simulation software (e.g., Crystal Ball or @Risk) than in Excel alone, although they are possible using Excel’s advanced functions. These Excel add-ins offer other benefits as well, including enhanced visualization options, greater computational power, less-restrictive data limits, and easier use of alternatives to the normal distribution in the underlying data.
These examples also assume an underlying normal distribution in the data, but changing this assumption to another distributional pattern is also possible in the base Excel package. While the normal distribution applies to many types of real-world phenomena, other situations may follow other distributions.
For example, an insurance company attempting to model how many car accidents an individual may be involved in faces exactly this situation. Mathematically, the average person likely has slightly more than zero accidents in a year, but accidents are discrete events (e.g., 0, 1, 2) and are not continuous, as no driver can have a fraction of an accident. Furthermore, a person cannot have negative accidents. This type of random variable likely follows a Poisson distribution. It could be modeled by computing the overall historical probability for each discrete event (e.g., 0 has a 95% chance of happening; 1, a 4% chance; 2, a 0.5% chance; etc.) and then using the RAND() function in Excel to pull a random number between 0 and 1 from a uniform distribution.
If the random draw fell between 0 and 0.95, the observation would receive a 0; if it was larger than 0.95 and less than or equal to 0.99, it would receive a 1; and so on. Even non-normal or skewed continuous distributions (e.g., historical stock market returns) can be modeled and approximated in such a manner. Such advanced analyses, however, are more easily performed with dedicated statistical analysis software packages or Excel add-ins.
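The lookup logic just described can be sketched in Python (our own illustration; the fourth outcome and its 0.5% probability are an assumption added so the probabilities sum to 1):

```python
import random
from bisect import bisect_left

outcomes = [0, 1, 2, 3]
probs = [0.95, 0.04, 0.005, 0.005]  # last entry assumed so the total is 1.0
# Running totals, i.e., the cutoffs described above: [0.95, 0.99, 0.995, 1.0]
cutoffs = [sum(probs[: i + 1]) for i in range(len(probs))]
cutoffs[-1] = 1.0  # guard against floating-point rounding in the last sum

def draw_accidents():
    u = random.random()  # like Excel's RAND(): uniform on [0, 1)
    # bisect_left maps u <= 0.95 to index 0, 0.95 < u <= 0.99 to index 1, etc.
    return outcomes[bisect_left(cutoffs, u)]

random.seed(7)  # fixed seed (our choice) for reproducibility
sample = [draw_accidents() for _ in range(100_000)]
print(f"share with 0 accidents: {sample.count(0) / len(sample):.3f}")  # ≈ 0.950
```

The same inverse-lookup idea extends to skewed continuous distributions by building the cutoffs from a historical empirical distribution.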
About the authors
James Schmutte (jschmutt@bsu.edu) is a professor of accounting, and Douglas Ayres (drayres@bsu.edu) and Jason Stanfield (jwstanfield@bsu.edu) are assistant professors of accounting, all at Ball State University in Muncie, Ind.
To comment on this article or to suggest an idea for another article, contact Jeff Drew, senior editor, at Jeff.Drew@aicpa-cima.com or 919-402-4056.
AICPA resources
Articles
Conference
For more information or to register, go to aicpastore.com or call the Institute at 888-777-7077.