Category Archives: Risk Reporting

Using The R Script Visual in Power BI To Build Charts With Precise Layouts

Financial regulators are unreasonable people – at least that’s the impression those of us who report a bank’s risk numbers to them would have you believe. Not only do they demand daily reporting but they also require the charts that we provide to have a defined layout and format. This raises a problem for Power BI: while it has very good default visual practices, these are not exactly those of the regulator. However, we can use the R visual to create a chart in exactly the fashion that a regulator demands.

One very important chart for regulators is the market risk back test chart. It is really two charts in one: a bar chart and an area chart. It compares the profit and loss (P&L) that the bank, or part of the bank such as a legal entity or desk, makes against a margin of safety, known as value-at-risk or VaR, within which we promise that the P&L will stay most of the time. On those days when the P&L exceeds the VaR, that’s a breach. If we have too many breaches, the regulator will take action.

The regulator wants to see the VaR as an area chart with VaR as a mirror image both above and below the x (date) axis since a breach occurs if the bank makes either a profit or a loss outside this VaR envelope. Regulators want this VaR envelope to be partly transparent so that we can superimpose the P&L on it. The daily P&L must be shown as blue bars with breaches shown in red.

With a native Power BI chart we can get close to these requirements – close, but no cigar.

Backtest built in native PBI

There are three problems:

    • The VaR envelope is shown as two boundary lines rather than an area chart. The Power BI combo chart visual combines a bar chart (used for the P&L) with a line chart, but we can’t combine it with any other chart type.
    • The P&L bars are all blue; there is no means of changing the colour of individual bars based on a condition.
    • The snapshot above shows synchronized axes for P&L and VaR, but if the ranges of these two measures are sufficiently different, the chart will automatically apply different scales to P&L and VaR, which is not what we want and is potentially misleading.

However, we can meet all requirements if we use the R visual. We can use the fine tuning allowed in the plotting package to define our chart elements precisely. The snapshot below shows the R chart.

Backtest built in PBI R Script Visual

The R visual creates a script editor in a lower pane. It provides the first two lines of R code, which create an R dataframe (R’s equivalent of a table) containing all unique values of the columns in the values well. From here we can add a few lines of R code, first to determine whether there is a breach on a given day and then to plot the chart. The R code uses two very popular R packages: the dplyr package enables us to manipulate dataframes, and the ggplot2 package gives us the ability to plot exactly the chart we require. The ggplot call first creates the P&L bar chart, selecting the colour of the P&L bars based on the breach condition. It then overlays this with partially transparent area charts – one for the upper VaR bound, then one for the lower (negative) VaR bound.
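As a sketch only – the column names trade_date, pnl and var are assumptions, so rename them to match the fields in your own values well (Power BI supplies the input dataframe under the name dataset) – the script body looks something like this:

```r
library(dplyr)
library(ggplot2)

# Flag a breach on any day the P&L falls outside the +/- VaR envelope
plot_df <- dataset %>%
  mutate(breach = abs(pnl) > var)

ggplot(plot_df, aes(x = trade_date)) +
  geom_col(aes(y = pnl, fill = breach)) +              # P&L bars, coloured by condition
  scale_fill_manual(values = c("FALSE" = "steelblue",  # normal days in blue
                               "TRUE"  = "red"),       # breaches in red
                    guide = "none") +
  geom_area(aes(y = var),  fill = "grey50", alpha = 0.3) +  # upper VaR bound
  geom_area(aes(y = -var), fill = "grey50", alpha = 0.3) +  # lower (negative) VaR bound
  labs(x = NULL, y = "P&L / VaR")
```

The two geom_area layers form the VaR envelope; the alpha argument gives the partial transparency the regulator asks for.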

PBI R Script Editor Window

Even with a small amount of R code like this, it is best to build and test the code within RStudio, the typical development environment for R, then copy it into the visual’s code window once it is fully working.

The ggplot2 package allows fine tuning of a chart’s appearance – enough to satisfy the most unreasonable regulator.

This article first appeared on the Microsoft MVP Technical Tuesdays blog here.
The source data is here.

A layman’s guide to BCBS239

This is a draft for a talk I’m giving in a few months – comments welcome

In January 2013, the Basel Committee on Banking Supervision (BCBS) issued a document titled “Principles for effective risk data aggregation and risk reporting”. It is generally known as “BCBS 239”. The document sets out guidelines for effective reporting of risk to the board and senior management of banks.

So does BCBS 239 matter? Well, if a bank is a SIFI (systemically important financial institution) – big enough that its failure could cause instability in the financial system – then it certainly does, since such banks need to comply with the guidelines, some as early as 2016. But I would argue these regulations are important even if a firm is not a SIFI, simply because BCBS 239 describes best practice on this important topic, and does so concisely and clearly. The document says that its objectives include enhanced risk management and decision making, and improved strategic planning – for the ultimate good of the firm and its shareholders.

The BCBS has the interests of taxpayers rather than shareholders in mind. Its motivation for issuing this document arises from the crisis of 2007, when some risk departments could not cope and it was impossible to get good risk information for proper decision making. In any future crisis, the regulators want good, timely risk information so that they can wind up the bank or perhaps organise a recovery.

The document defines 14 principles and groups them into four areas: governance, aggregation, reporting and supervisory review. The principles are summarised below.

Governance and Infrastructure
    1. Strong governance “in the round”
    2. Risk IT must be able to cope in a crisis

Aggregation
    3. Achieve integrity and accuracy via automation
    4. Completeness at group level, with drill-down to entity
    5. Timeliness – especially in a crisis
    6. Adaptability – for changing circumstances

Reporting
    7. Accuracy through reconciliation and validation
    8. Comprehensiveness – include all material risks
    9. Clarity – to facilitate informed decision making
    10. Frequency of reporting – appropriate to the nature of the risk
    11. Distribution to the right people (and only them)

Supervisory Review
    12. Periodic review and spot tests
    13. Supervisor’s power to require remedial action
    14. Supervisor home/host co-operation

Governance and management controls and processes are the overriding preoccupation; these include controls, roles, ownership of data and independent validation. The document draws a rather arbitrary distinction between aggregation and reporting. Aggregation is the gathering of all the data together, a pre-requisite for reporting – providing useful, relevant summaries to inform better decisions. Supervisory review describes the supervisors’ powers to carry out periodic assessments (including spot tests) and to require banks to perform remedial action if they fail these tests.

There are several themes that resonate through the document. There are many references to ensuring that risk reporting works well not only in normal circumstances but also in a crisis – clearly this reflects the experience of the 2007 crisis.

The board and senior management must be aware and in control of all aspects of risk reporting. They must define the information they need and must be aware of the weaknesses and omissions in the reports they receive. This holds true even if they delegate parts of the process to third parties. The buck stops with the board.

The document states the ‘what’ but not the ‘how’. There are very few hard and fast rules on how the guidelines should be implemented. The only occasions when the guidelines are explicit are to state that banks must be able to report certain risks intra-day in a crisis and to stipulate the minimum content of a risk report. For the rest, it depends on what is appropriate to the firm’s situation.

The BCBS prefers automation of risk data processes. It has a strong suspicion of any manual process – these should be the temporary (and well documented and independently validated) exception rather than the rule.

“Forward looking” reports are encouraged. These include stress test and scenario-based reports.

The entire BCBS 239 document can be read from start to finish in an hour – it is less than 30 pages. For those in a hurry, I would suggest starting with Annex 1, which very clearly defines the terms – so you appreciate the BCBS’s distinction between key terms, e.g. the difference between accuracy and precision, or between completeness and comprehensiveness. Annex 2 is a summary of the 14 principles. For most people, this will be enough, but for more detail, the initial sections cover the motivation behind these regulations, the context and then more background to the principles.

BCBS 239 reflects a traditional reporting situation – it assumes that senior management receives its risk data as canned reports, hence the distinction between aggregation and reporting. It assumes a world where summary risk data is selected and communicated to the board by risk management and risk IT. It does not consider the emerging world of self-reliant, self-service reporting, where risk managers have good tools to visually explore and analyse the data. This is probably a reflection of the current status quo in most banks.

In summary, BCBS 239 is a very useful document, and not only for those firms to which the regulations apply. It provides well thought-out, well drafted guidelines with implications for all the processes involved in ensuring that the board of a firm can take well informed decisions about the management of its financial risk.

Moving towards real-time risk reports using in-memory column-based structures

Risk management reporting is moving from overnight processing to delivery of reports and results in near real-time. The results need to be available within a few minutes rather than a few hours of the data becoming available to the risk data warehouse. For example, a market risk middle office (MRMO) team may be under “T+1” deadline pressure to produce daily results and reports within a few minutes once the final adjustments for the day have been loaded into the database.

One step on the way to achieving this is the use of in-memory column-based tables and indexes. Most BI vendors are now incorporating this technology. My tests used Microsoft’s implementation – the columnstore index in the latest version of its SQL Server database. This is designed to make the typical SQL queries used in analytics run much, much faster.

The tests were designed to see whether these techniques can work on typical risk data sets and, if so, what performance improvements are possible. I set up a database with the typical structure of a market risk data warehouse and a reasonable amount of data: 70 million rows. This typically represents 2-3 days of data for an asset class, so on a data set of this size you can run reports comparing results from the latest business day with those from the day before.

I then created a set of SQL queries typically used in risk reports. I ran these against the data with and without the columnstore index applied. The results were impressive – over a suite of tests, the performance improvements were between 11 and 77 times. In the better cases, this translates into a 15-minute query becoming a 15-second one.
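As a rough sketch of what the second run involved – the table and column names below are hypothetical, not those of my test database – applying a nonclustered columnstore index (SQL Server 2012 syntax) and running a typical aggregation query looks like this:

```sql
-- Hypothetical fact table; the index covers the columns the report queries touch
CREATE NONCLUSTERED COLUMNSTORE INDEX ix_RiskResults_cs
ON dbo.RiskResults (BusinessDate, Desk, Underlier, RiskFactor, Value);

-- A typical day-on-day aggregation of the kind used in the tests
SELECT BusinessDate, Desk, SUM(Value) AS TotalValue
FROM dbo.RiskResults
WHERE BusinessDate IN ('2012-06-01', '2012-06-04')
GROUP BY BusinessDate, Desk;
```

One design point to note: in this version of SQL Server, a nonclustered columnstore index makes the table read-only, so it has to be dropped and rebuilt around the daily load.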

These techniques also avoid the step of “reprocessing the cube”, saving valuable minutes there too. Cubes are useful for rich exploratory analysis of the data, but we no longer need them for generating the basic results.

EBA Stress Results One Page Summary

Is it possible to summarise the EBA stress results on a single page? The data contains a lot of detail; for each of the 90 participating banks, the EBA produced a 10-page report. That’s 900 pages in total, so distilling that down to one page is a challenge!

Figure 1 below shows my attempt to cram the essentials onto a single page of A4.
European Banking Authority EU Stress Test Results Dashboard Snapshot
Figure 1

This is a snapshot of the first few rows. You can see the full report in PDF format in all its glory here.

The report is split into three sections: capital adequacy, exposures-at-default (EADs) and sovereign exposure. The capital adequacy section contains the core tier 1 ratios (CT1R) and the risk weighted assets (RWA) as at Dec 2010, then shows the CT1R projected forward to Dec 2011 and Dec 2012 under two economic scenarios: baseline (how the economy is expected to perform) and adverse (simulating a recession – e.g. a drop in EU GDP of 4%). The sovereign exposure section shows the total amount and maturity profile. The EADs section shows the total amount and the breakdown by sector.

Since the EBA at that time considered a CT1R under 5% to be unacceptable and between 5% and 6% to be of concern, the CT1R column highlights these with red and yellow icons respectively.

The RWA column is useful since RWA is a measure of size, and therefore of the impact if the bank failed – or of the amount of funds needed to bail it out. I’ve used data-bars to visualize the RWA – it’s interesting that there are many tiddlers and a few giants.

The sparkline charts show how the CT1R and RWA change under the two scenarios (baseline and adverse). Although each chart has only three points, it does raise some concerns. For example, under the baseline scenario, most banks are increasing their CT1R as they build up their capital, but for a few banks the CT1R seems to be decreasing.

The exposures-at-default (EADs) section has two columns – the data bars show the total value and the stacked bar chart shows the composition across the four main sectors: institutions (in blue), corporate (in green), retail (in red) and commercial real estate (in yellow). We can see differences in the profiles. For example, GRUPO BBK has a higher percentage of its EADs to retail than most, whereas DekaBank Deutsche Girozentrale has a correspondingly high percentage to institutions.

In the sovereign exposure section we can see very large differences between banks in the amount of exposure. The maturity column shows a bar chart with a bar for each tenor (ranging from 3 months to 15 years). Even though each bar chart fits into a tiny space, it yields a lot of information. For example, we can see that HSBC has a maturity profile concentrated at the short end, whereas HSH Nordbank is concentrated at the long end.

The page does not contain a country breakdown of either the sovereign exposures or the EADs data. I could not think of a way of showing this succinctly. The EADs composition stacked bar works well because there are only four sectors, but it would not work with 21 countries. So perhaps a second page is necessary after all.

This dashboard was built in Excel using the XLCubed Excel add-in. XLCubed has two great strengths used to good effect here. Firstly, it has formula reporting, which allows really precise positioning and means that we can use the best of Excel together with the best of XLCubed. Secondly, it has a great suite of micro-charts. The EADs composition stacked bar, the sovereign maturity column chart and the CT1R and RWA line charts are examples of these.

Alternatives to ‘Top N’ concentration tables in risk management reports

Market risk management reports often contain many ‘Top N’ tables. Figure 1 shows a typical example. Each of the four tables shows the top 5 underliers (companies) for a different risk factor – in this example delta, gamma, vega and time-weighted vega.
Figure 1
Figure 1

Figure 1 actually underestimates the prevalence of these ‘Top N’ tables. Often there may be 9-10 ‘Top 10’ tables, and even more if underliers are split between indices and individual stocks.

It is not easy for readers to gain insight from these tables quickly. Firstly, they suffer from all the usual problems of tables of text and numbers. Secondly, it’s quite hard to follow the details for a particular underlier across all risk factors. Say we are interested in ‘Underlier F’ since it has the highest delta. We have to scan the other three tables to find it, and in fact we don’t find it in the gamma and vega tables since it is not in the top 5 for these risk factors. Thirdly, the tables don’t provide that much data – there are only 20 data points (4 tables of 5 numbers each).

There are a few alternatives that may improve on this. We can add data-bars that provide some visual cues about the relative values, as in Figure 2.
Figure 2
Figure 2

However, it would be much better to see all the data for each and every underlier that appears in any of the tables. In our example, that will be somewhere between 5 and 20 underliers, since we can expect the same underlier to be present in several tables. Figure 3 shows the data organised with each underlier on a separate row. Now it is easy to see all the values for a particular underlier. Although it is now harder to determine the biggest value since we are no longer ordering by value, the data-bars help us quickly home in on the important values.
Figure 3
Figure 3
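For what it’s worth, the reorganisation behind Figure 3 can be sketched in a few lines of R – the input data frames (delta_top5 and so on) and their columns, underlier and value, are assumptions standing in for however the ‘Top 5’ tables arrive:

```r
library(dplyr)
library(tidyr)

# One data frame per 'Top 5' table, each with columns underlier and value
top_tables <- list(delta   = delta_top5,
                   gamma   = gamma_top5,
                   vega    = vega_top5,
                   tw_vega = tw_vega_top5)

# Stack them with a risk_factor label, then spread to one row per underlier;
# NA marks underliers outside the Top 5 for that risk factor
combined <- bind_rows(top_tables, .id = "risk_factor") %>%
  pivot_wider(names_from = risk_factor, values_from = value)
```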

The data points represent a snapshot of the position as of the close of the previous business day, and it is very useful to know the recent history – how this value has been arrived at. In Figure 4, sparklines were added to show a micro-chart of each value over the last 10 days.
Figure 4
Figure 4

This report was implemented in Excel and the sparklines were added using a very useful third-party Excel add-in, XLCubed.

Figure 4 has made good use of precious space and increased the density of data points. As mentioned, Figure 1 represents 20 data points. Figure 4, however, manages to represent twenty times as many (400 data points: 10 underliers by 4 risk factors by 10 days) while taking up only about twice the screen or page real-estate.