Calculating risk is a large part of a portfolio manager’s job. While there are many ways to do so, Value at Risk (VAR) is one of the most widely accepted and useful measures of market risk. VAR helps investors, owners, and fund management groups compare risk across asset classes and estimate the probability of loss in a portfolio so they can make informed predictions about future performance.
In this article, we explore the concept of calculating VAR so you can gain a deeper understanding of this commonly used metric.
What Is Value at Risk (VAR)?
VAR is a financial statistic that quantifies the potential loss within an investment, a portfolio, or an entity such as a corporation over a specific time frame, along with the probability of that loss occurring. It is most commonly used by investment companies and commercial banks to gauge their exposure to potential losses.
VAR comprises three main elements: a loss amount, the time period of the risk assessment, and the probability of meeting or exceeding that loss. For example, a 5% VAR of $1,000 over the next month means there is a 5% chance that the portfolio will lose $1,000 or more within the next month. Stated another way, there is a 95% chance that the loss will not exceed $1,000 in the next month.
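Stated more formally (a minimal formalization; the notation is an addition, not part of the original definition), the example above amounts to

$$
P(\text{loss over the next month} \geq \$1{,}000) = 5\%,
$$

or equivalently, a 95% chance that the monthly loss stays below $1,000.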
When Is VAR Used?
Private equity firms, venture capital firms, holding companies, investment banks, hedge funds, mutual funds, and brokers that deal with many investments use VAR to mitigate and control risk. VAR is traditionally used to assess company-wide risk exposure, and investors employ it to predict losses to portfolios.
How Do You Calculate VAR?
There are three main methods used to calculate VAR:
- Variance-Covariance Method: This method uses the expected return and standard deviation of an asset's returns and assumes that gains and losses are normally distributed. Potential losses are framed in terms of standard deviations from the mean (see the first sketch after this list).
- Historical VAR: This method collects historical return data for an asset and draws inferences directly from it. Order the historical returns in ascending order and choose the observation at the percentile that matches the desired confidence level; the periodicity of the returns defines the time period of the VAR (see the second sketch after this list).
- Monte Carlo VAR: This method uses software to generate a distribution of returns for a portfolio. An analyst inputs the historical returns and standard deviation, and the computational model simulates projected returns and losses over hundreds or thousands of iterations (see the third sketch after this list).
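Below is a minimal Python sketch of the variance-covariance approach, assuming normally distributed daily returns. The portfolio value, mean return, standard deviation, and confidence level are hypothetical placeholders rather than figures from this article.

```python
# Variance-covariance (parametric) VAR sketch: assumes returns are normally
# distributed and frames the potential loss as a move of z standard deviations
# from the mean. All inputs below are hypothetical placeholders.
from scipy.stats import norm

portfolio_value = 1_000_000   # hypothetical portfolio value in dollars
mu = 0.0005                   # hypothetical mean daily return
sigma = 0.012                 # hypothetical daily standard deviation of returns
confidence = 0.95             # 95% confidence level (a 5% VAR)

z = norm.ppf(1 - confidence)  # about -1.645 for a 95% confidence level

# Worst expected daily return at this confidence level, converted to dollars
var_return = mu + z * sigma
var_dollars = -var_return * portfolio_value

print(f"1-day 95% variance-covariance VAR: ${var_dollars:,.0f}")
```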
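The historical method can be sketched in the same way. Here the return series is randomly generated as a stand-in for real historical data, and the portfolio value and confidence level are again hypothetical.

```python
# Historical VAR sketch: sort observed returns in ascending order and take the
# observation at the percentile implied by the confidence level. The returns
# below are randomly generated placeholders for real historical data.
import numpy as np

rng = np.random.default_rng(seed=42)
daily_returns = rng.normal(loc=0.0005, scale=0.012, size=500)  # placeholder history

confidence = 0.95
portfolio_value = 1_000_000   # hypothetical portfolio value in dollars

sorted_returns = np.sort(daily_returns)               # ascending order
index = int((1 - confidence) * len(sorted_returns))   # observation at the 5th percentile
cutoff_return = sorted_returns[index]

# Daily returns were used, so this is a 1-day VAR; the periodicity of the
# return data sets the VAR time period.
var_dollars = -cutoff_return * portfolio_value
print(f"1-day 95% historical VAR: ${var_dollars:,.0f}")
```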
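Finally, a minimal Monte Carlo sketch: it draws many simulated returns from an estimated mean and standard deviation and reads the loss at the chosen percentile of the simulated distribution. The model and all inputs are hypothetical; real implementations typically simulate richer return dynamics.

```python
# Monte Carlo VAR sketch: simulate many possible returns from the estimated
# mean and standard deviation, then take the loss at the chosen percentile of
# the simulated distribution. All inputs below are hypothetical placeholders.
import numpy as np

mu = 0.0005                   # hypothetical mean daily return
sigma = 0.012                 # hypothetical daily standard deviation of returns
confidence = 0.95
portfolio_value = 1_000_000   # hypothetical portfolio value in dollars
n_simulations = 100_000

rng = np.random.default_rng(seed=7)
simulated_returns = rng.normal(loc=mu, scale=sigma, size=n_simulations)

cutoff_return = np.percentile(simulated_returns, (1 - confidence) * 100)
var_dollars = -cutoff_return * portfolio_value

print(f"1-day 95% Monte Carlo VAR: ${var_dollars:,.0f}")
```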
What Are the Advantages of VAR?
The main advantages of using VAR to calculate risk are that it is easy to understand, applicable across many types of assets, and universally accepted. Some important points about VAR to keep in mind:
- VAR is measured in currency or as a percentage, so analysts can easily incorporate it into their analysis.
- VAR is a versatile metric that can be used to measure the risk of many different types of assets.
- VAR is universally accepted among regulatory authorities and can be confidently used in filings.
Are There Drawbacks to VAR?
No metric of risk is perfect, and there are some drawbacks to using VAR. First, VAR says nothing about the size of a loss once it exceeds the threshold: it does not estimate how large the loss could become.
Second, the outputs are only as good as the inputs, and the necessary inputs can be difficult to estimate. Return data pulled from a period of unusually high volatility may overstate the potential risk, while data from an unusually calm period may understate it.
Finally, the VAR assessment of potential loss represents the minimum loss in a range of possible outcomes, not the worst case. For example, if a 30-day VAR indicates with 97% confidence that losses will not exceed 15%, there is still a 3% chance of losing at least 15% in any given 30-day period, which works out to roughly one period in every 33 on average. Because that threshold is a floor rather than a ceiling, a loss of 50% in such a period would still be consistent with the risk assessment.