Cascading Balanced Scorecards: Using Strategic Maps to make Performance Relevant to RAF Stations

Introduction

Performance Management plays a key role in the Royal Air Force (RAF), part of the 'front line' of the UK Ministry of Defence. The Performance Management approach within the RAF consists of a series of interdependent hierarchies of Performance Indicators that inform senior commanders of the current and forecast readiness of their forces to meet the range of war scenarios agreed with Government. The Performance Indicators are reported in a balanced scorecard designed to cover 4 main perspectives of management, namely: Resources Supplied, Processes Undertaken, Outputs Delivered and Enhancements for the Future (for more information please see the API case study "Measuring and Managing Performance in the Royal Air Force"). This management case study outlines how the overall performance scorecard was cascaded to deliver a relevant performance management system to RAF stations.

About the Royal Air Force

The RAF has 50,000 Service and civilian personnel and more than 500 aircraft. The RAF supports operations in the Gulf region, Kosovo and Afghanistan as well as maintaining an RAF presence in Cyprus, Gibraltar, Ascension Island, and the Falkland Islands. Its key peacetime responsibility is to maintain the required readiness levels of its forces (e.g. the Harrier, Globemaster, and Sentinel aircraft and their crews) in support of the requirement to operate as an expeditionary air force. To sustain this activity the RAF is organised into 4 layers of management: Service, Command, Group and Station. The aircraft and their crews are based at some 30 stations, from which the aircrew, supported by ground staff, train and operate. Thus the best knowledge of current and forecast readiness resides at station level. For this reason stations form the backbone of the Performance Management reporting process, supplying the raw data supplemented by the local commander's judgement on the situation. This information is vital to commanders higher in the command chain, informing them of the situation on the ground so that they can provide the most effective guidance and direction while deploying the available resources to best effect.

The Problem: Lack of Local Relevance 

While the requirement to report to higher management is accepted, the task of collecting and reporting performance indicators is often done grudgingly by station staffs. What has to be reported is often at such a low level of granularity that it rarely provides useful information for the management of stations. The underlying problem is that stations cannot make the connection between the corporate reporting they have to do and the local strategy they are following. This is what some of the station executives had to say about the existing approach:
• "The current system doesn't provide us with the relevant information"
• "Measurement is for reporting only - we don't use any of the data we capture"
• "The current scorecard is not very useful; I would like to use it as a management tool, but it doesn't capture much data that is relevant to me"
• "We only measure what is easy to measure, not what really matters"
• "Data is not providing us with the necessary insights - we need more subjective assessments"
It was recognised that there was a significant risk of local strategies falling out of alignment with higher-level goals; greater connectivity was needed between local and corporate strategies. A number of front-line stations welcomed the idea of introducing a more localised performance management and measurement system, subsuming the higher-level reporting requirement while reflecting their local strategy. To achieve goal alignment as well as local relevance, RAF stations embarked on a journey of cascading the RAF scorecard into local performance management systems consisting of strategic maps, key performance questions and key performance indicators. This process was facilitated by the Advanced Performance Institute in close collaboration with the RAF.

Cascading the Scorecard and Creating Strategic Maps 

The cascade of the scorecard into the RAF stations was achieved in a 5-step approach (see Figure 1). The project needed to be agreed and scoped for each participating station, and data had to be collected and analysed before a strategic map could be drafted and agreed upon. Once this was completed, a set of Key Performance Questions and Key Performance Indicators could be designed, and a process put in place to ensure the performance management information was communicated and used to improve decision making and learning. Finally, the scorecards needed to be reviewed regularly so that they stayed current and relevant. The first step in the process was to gain buy-in from the local senior commanders of a station. Generally stations have an executive board formed of senior officers, who take their lead from the station commander, whose personal support is essential. Individual meetings with the station commanders or a small group of station executives were used to achieve top-level buy-in and the go-ahead for the project. A trained and experienced consultant then interviewed each executive on a one-to-one basis in their own office. The aim of this exercise was for the interviewer to gain an overview of how the station 'ticked'. Key questions were:
• Why does this RAF Station exist?
• What does it need to deliver / what are its outputs?
• What does the RAF Station have to be good at?
• What does your wing / squadron have to be good at to succeed in its defined purpose?
• How are you kept informed about your wing / squadron's 'health'?
• What contribution do you make to the management process at station level?
• Etc.
There were significant benefits from seeing the executives in their own offices.
On a busy front-line station a number of executives would have their offices close to the taxiways and operational activities, and there was often useful information to be gained from seeing the executive operate 'in situ'. Having interviewed the key executives, and thus had the opportunity to look around the station noting key installations and operations en route, a broad understanding was achieved of how the station worked. An important skill was to be able to identify the key strategic elements for the station as a whole, and their interdependencies, without getting drawn into the myriad of objectives and relationships that exist between independent sections within the station organization. Based on the information collected from the interviews, as well as observations and the review of relevant documents, a picture of the station emerged. The essential resources on which the station relied (e.g. people, equipment, runways and buildings) were largely evident. There were also several obvious processes, such as flight training, servicing of aircraft and administrative support, which needed little thought. However, the importance of maintaining fighting spirit and cohesion across the unit meant that there were a number of intangible, but nonetheless essential, things that the station needed to be competent at. The emerging picture was translated into a value creation map, charting how resources flow into the essential processes and on to the delivered outputs that achieve the overall mission.

The goal was always to represent the essence of a station on a single A4 page. For the RAF context the resultant diagram was termed the Strategic Map. The Strategic Map, initially developed by the expert consultant, was then subjected to rigorous review during a presentation given to the station commander and his executives. While there were differing views on the key interdependencies and the relative importance of competencies, agreement was achieved on the essential components of the map. Once the Map had been agreed in principle, an associated table was generated containing an explanation of the intended scope of each element of the Map. This was necessary to ensure a common understanding of the elements of the Map, and proved particularly important as a number of the Map's elements cut across organizational boundaries and conventional processes.

Example Strategic Map for RAF Waddington

Figure 2 outlines the Strategic Map for RAF Waddington. Overall, the station exists to generate world-class Expeditionary Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) capabilities. The station has three core competencies, or things the station has to deliver well: to contribute successfully to operations and other tasks, today and in the future; to provide and develop sufficient capable and prepared people; and to maintain, sustain and develop sufficient combat-ready equipment. The station agreed on eight drivers of performance which would enable it to continue to deliver these objectives: to enhance and maintain competencies, training and personal development; to develop excellent motivation, fighting spirit, morale and ethos; to maintain and enhance equipment; to direct and coordinate output to ensure optimal use of resources; to foster a culture of innovation and continuous improvement; to communicate and engage proactively and openly; to enhance health, fitness and well-being; and to cultivate a positive image and reputation. Key resources were identified as money, people, equipment and stock, external services and infrastructure.
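As a rough illustration of how such a map might be captured digitally, the Waddington layers described above can be encoded as a simple data structure. The wording is abbreviated and the structure and field names are purely illustrative; they are not taken from the RAF's actual tooling.

```python
# Illustrative encoding of the Strategic Map layers described above.
# Field names and structure are hypothetical, not the RAF's system.
strategic_map = {
    "mission": "Generate world-class Expeditionary ISTAR capabilities",
    "outputs": [
        "Contribute to operations and other tasks, today and in the future",
        "Provide and develop sufficient capable and prepared people",
        "Maintain, sustain and develop sufficient combat-ready equipment",
    ],
    "performance_drivers": [
        "Competencies, training and personal development",
        "Motivation, fighting spirit, morale and ethos",
        "Equipment maintenance and enhancement",
        "Direction and coordination of output",
        "Innovation and continuous improvement",
        "Proactive, open communication and engagement",
        "Health, fitness and well-being",
        "Positive image and reputation",
    ],
    "resources": [
        "Money", "People", "Equipment and stock",
        "External services", "Infrastructure",
    ],
}
```

Even this toy representation reflects the case's key constraint: three outputs, eight performance drivers and five resource categories fit comfortably on a single page.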

Creating Key Performance Questions and Key Performance Indicators

Using the agreed Strategic Map, the next stage was the identification of Key Performance Questions (KPQs). A KPQ is a management question that captures exactly what managers want to know when it comes to reviewing each of their strategic elements and objectives. The rationale for KPQs is that they focus attention on what actually needs to be discussed when performance is reviewed and, most importantly, they provide guidance for collecting meaningful performance indicators. This proved a more demanding phase for the executives, but one in which they contributed with some enthusiasm via workshops. The purpose of this phase was to identify what the executive board of the station needed to know in order to manage the station. The emphasis, which occasionally had to be reinforced, was the need to keep questions open and simple. Thus 'Have we trained enough pilots?' was eschewed in favour of 'To what extent are we meeting our pilot training target?'. It was accepted that the phrase 'now and in the future' was implicit in all of the questions, and that as a management board they should focus more on managing future outcomes than solely on addressing past shortfalls. The development of Key Performance Questions was structured around the elements of the Strategic Map (see also Figure 3). The guidance used was that each element would be likely to generate one, or perhaps two, Key Performance Questions. If no question could be found, doubt was raised about whether the element justified its presence on the Map. Likewise, if an element generated more than two questions, this indicated either that the element might benefit from being split into more than one, or that the executives wanted to micro-manage. It was important to emphasise to the executives the need to think at the board level rather than at the level of managing their own wing or squadron (i.e. department within the station).

Once the Strategic Map and Key Performance Questions had been generated, the focus of the work moved to a more detailed level: the identification and definition of Key Performance Indicators (KPIs). This work was facilitated by the provision of a template spreadsheet, whose columns drove the definition of the following data for each KPI:
• What was being measured (e.g. headcount)
• Data source (e.g. HR database)
• Data ownership
• Target (e.g. 100% manning)
• Target Authority (i.e. who has set the target)
• Trigger Rules (e.g. <70% = RED; >=70% and <80% = AMBER; >=80% and <90% = YELLOW; >=90% = GREEN)
• Rules Authority
• Consolidation rule (where appropriate)
• Weighting (when part of a weighted average set of KPIs)
• Frequency Measured (i.e. how often the data will be collected)
• Security Classification
• Assessment of likely data quality
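To make the template concrete, the sketch below shows how one KPI definition and its traffic-light trigger rules might be expressed in code. All class, field and function names here are hypothetical; the actual system was an RAF-specific, bespoke Oracle-based application. Note also that the trigger-rule example as originally published appears to contain a typo (">70% = RED"); the sketch assumes RED means a score below 70%.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the columns of the KPI template spreadsheet.
# Only a subset of columns is shown.
@dataclass
class KPIDefinition:
    measure: str        # what is being measured, e.g. headcount
    data_source: str    # e.g. HR database
    data_owner: str
    target: str         # e.g. "100% manning"
    frequency: str      # how often the data is collected
    classification: str # security classification

def rag_status(value_pct: float) -> str:
    """Map a percentage score to a traffic-light status, following the
    example trigger rules in the template (RED threshold assumed)."""
    if value_pct < 70:
        return "RED"
    elif value_pct < 80:
        return "AMBER"
    elif value_pct < 90:
        return "YELLOW"
    return "GREEN"

manning = KPIDefinition(
    measure="headcount", data_source="HR database", data_owner="OC PMS",
    target="100% manning", frequency="monthly", classification="RESTRICTED",
)
```

A station scoring 75% manning would therefore show AMBER against this KPI, prompting discussion at the management board without yet triggering a RED alert.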

The structure of the spreadsheet was designed to illustrate how the various KPIs consolidated to answer the KPQs. Use of the MS Excel grouping function proved a useful aid in demonstrating how drilling down the resultant KPQ/KPI hierarchy would look on the live system. Nonetheless, defining the KPIs proved the most problematic aspect of the process. The difficulty lay in the fact that knowledge about what information was available to answer Key Performance Questions was held by more junior staff, who had not been party to the process of defining the Strategic Map and Key Performance Questions. These staffs were dispersed geographically across large RAF stations. Initial attempts to ask these staff to define the KPIs relevant to their area at their own desks, without further guidance, demonstrated through their failure the need for a more controlled approach. The most successful method of defining KPIs required the provision of a member of the central team who was fully familiar with the aims of the exercise and who had been involved in the mapping and KPQ design exercise. This person interviewed the relevant staff, accompanied by a member of the station team who would eventually have ownership of the resultant work. During the interviews the optimum KPIs were established through patient questioning of the specialist staff. This exercise helped the staff to think beyond the practical issues of data gathering. They were also reminded that the purpose of the exercise was to inform the station management board, and that without their expertise the station board would be less well informed. These interviews also reinforced why the earlier uncontrolled approach would never have delivered the desired results: the staffs were often very unfamiliar with the KPI concept and did not have the analytical skills needed to populate the KPI information in the spreadsheet.
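The consolidation the spreadsheet grouping demonstrated, several KPIs rolling up to answer one KPQ, can be sketched as follows. The KPQ, the KPI names, scores and weights are invented for illustration, and weighted averaging is only one of several consolidation rules the template allowed.

```python
# Hypothetical KPQ with two weighted KPIs rolling up to a single score.
kpq = {
    "question": "To what extent are we meeting our pilot training target?",
    "kpis": [
        {"name": "pilots trained vs plan",   "score": 85, "weight": 0.6},
        {"name": "instructor availability",  "score": 70, "weight": 0.4},
    ],
}

def consolidate(kpq: dict) -> float:
    """Weighted-average consolidation rule: combine KPI scores into one
    answer for the parent KPQ (one possible rule among several)."""
    total_weight = sum(k["weight"] for k in kpq["kpis"])
    return sum(k["score"] * k["weight"] for k in kpq["kpis"]) / total_weight
```

Drilling down in the live system would work in the opposite direction: a board member seeing the consolidated figure could expand the KPQ to inspect the individual KPIs behind it.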
In defining the KPIs many of those interviewed expressed a concern that the raw quantitative data might mislead the management board because of errors in data capture, lack of timely updates of data sources, incompleteness of data, and factors not reflected in the data (often of a qualitative nature). Their fears were generally allayed by the reassurance that the performance management approach, and the software provided (an RAF-specific, bespoke, Oracle-based application), allowed local commanders to record their 'military judgements' alongside the raw data, with an explanation of why their view differed from that indicated by the raw data. This is an important feature as it goes beyond mechanistic data capture and ensures individuals are engaged in the process of assessing and managing performance, as opposed to just reporting numbers. A further concern shared by most was the number of KPIs that were generated. A simple Strategic Map might have some 18 elements. With each element generating an average of 2 KPQs, it was not unusual to have up to 40 KPQs overall. Each KPQ then had to be answered by a number of KPIs. Some KPQs, often those dealing with intangibles such as reputation, might generate few KPIs; however, there were three drivers which increased the KPI count significantly:
1. Standard reporting lines to allow matrix reporting. The Ministry of Defence has identified eight standard headings against which all performance reporting should be undertaken, termed the Defence Lines of Development (DLODs): Manpower, Training, Equipment, Logistics, Infrastructure, Information, Organization, and Doctrine and Concepts. Applying the policy of providing a subordinate analysis by DLOD for each output generated an 8-fold increase in KPI numbers.
2. Organizational structure. In some areas it was considered very helpful to analyse the information by the organizational sub-units (termed wings and flying squadrons) on a station. By this means, weaknesses focused on just one sub-unit could be identified and addressed more specifically. It was also recognised that the competitive element between sub-units could thereby be brought to bear to raise standards across the station. With a minimum of 6 sub-units, and often many more, on a station, this led to at least a 6-fold increase in KPIs for those areas analysed by organizational unit.
3. Data structure. A particular area that generated scope for a significant number of KPIs was the measurement of manpower, as there was an understandable desire to analyse it, in addition to the organizational dimension, by specialism (e.g. pilot, navigator, engineer), by rank (e.g. Flt. Lt., Sgt.) and by competency (weapons, radar, etc.). The lack of source data at this detailed level from the MOD's centralised human resources system for service manpower added to the frustration.
The potential for hundreds of KPIs was therefore high; this had to be counteracted by constantly challenging the value of each KPI against the 'cost' in man-hours involved in collecting the data. Again, facilitation and guidance were necessary at this point to ensure a meaningful number of KPIs were being designed and collected. Figure 4 illustrates some examples of KPQs and KPIs linked to specific strategic objectives or elements on the strategic map for an RAF station.
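The multiplication effect described above can be shown with back-of-envelope arithmetic. The figures for map elements, DLODs and sub-units come from the case; the average number of base KPIs per KPQ is an assumption for illustration.

```python
# Back-of-envelope arithmetic for the KPI explosion described in the case.
elements = 18           # elements on a simple Strategic Map (from the case)
kpqs = elements * 2     # ~2 KPQs per element; the case cites "up to 40"
dlods = 8               # Defence Lines of Development breakdown per output
sub_units = 6           # minimum wings/squadrons per station

base_kpis_per_kpq = 3   # assumed average for illustration
base_kpis = kpqs * base_kpis_per_kpq

# Worst case if every KPI were analysed by both DLOD and sub-unit:
worst_case = base_kpis * dlods * sub_units
```

With these assumptions the 36 KPQs imply roughly 108 base KPIs, and a full DLOD-by-sub-unit breakdown would push the count into the thousands of data points, which is exactly why each KPI's value had to be challenged against its collection cost.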

Using Management Information to Improve Performance

Once the maps, KPQs and KPIs were in place, it was important to ensure that the resulting management information was communicated and used to inform decision making and performance improvement. Stations hold regular station management board meetings in which the station executives get together to discuss and review performance. The data was made accessible to commanders through the performance management software application, as well as through regular hard-copy reports to inform commanders prior to and during the station management board meetings. Whereas some stations preferred to deal with items on a by-exception-only basis, i.e. items are reviewed and discussed once the indicators show that there is a problem in a certain area, other stations put a rotating schedule in place to ensure all items are discussed on a regular basis.
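The by-exception review style can be sketched as a simple filter over reported statuses: only items whose indicator is off-track are surfaced to the board. The item names and statuses below are invented for illustration.

```python
# Sketch of "by exception" review: surface only off-track items.
# Item names and statuses are illustrative, not real RAF data.
items = [
    {"name": "Pilot manning",  "status": "GREEN"},
    {"name": "Engine spares",  "status": "RED"},
    {"name": "Fitness tests",  "status": "AMBER"},
]

exceptions = [item for item in items if item["status"] != "GREEN"]
```

A rotating-schedule station would instead iterate over all items across successive board meetings, trading meeting time for the assurance that nothing is left unexamined.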

Next Steps - Reviewing the System 

Stations are aware that the design of the strategic map with its KPQs and KPIs is not a one-off exercise and needs to be revisited at regular intervals. This will ensure that station strategies and station management information remain relevant and in line with any changing priorities set out in the overall RAF management plan. It is currently envisaged that the performance management system for the stations will be revised on an annual basis.

What were the Key Challenges?

Delivering worthwhile results from the work faced two key challenges.
The first was persuading senior management that they would benefit greatly from strategic mapping. The lack of value gained from the existing process was recognised; however, the project faced frequent scepticism about the benefit potential of yet another management initiative built from management-speak. This had to be regularly countered by briefings, training and challenge. Employing an external management expert with a proven track record and experience in both government and private sectors proved a pivotal critical success factor.
The second was keeping the project on track in terms of timescales. The majority of the work had to be undertaken by staff at stations who were not familiar with the process. The work was also not their primary duty, so it had to be completed alongside other duties. The project was further challenged by the posting process in the RAF, under which officers are posted to new jobs approximately every two years. This constant turnover meant that key project staff changed and new staff had to be won over and trained. Two strategies were employed to maintain momentum:
• The work was overseen by a central RAF team responsible for delivering the benefits of the supporting management information system. Much of the external implementation expertise initially provided by the Advanced Performance Institute was transferred to this central team, which could therefore take on the role of facilitator. An important learning point here is the need for close facilitation throughout the implementation process.
• Station commanders were encouraged to persevere with the process in order to benefit from the efforts they had made so far. The commanders were very engaged during the design of the strategic map and the creation of the KPQs. However, they were less involved in the design of the indicators, which was passed on to more junior staff on the station. A learning point here is to ensure closer involvement of senior managers in the design and review of indicators, with clear deadlines and project plans.

 

What were the Main Benefits?

Despite several challenges, the cascade of the balanced scorecard and the creation of local performance management systems have delivered a number of key benefits to the RAF stations involved:
• Local Buy-in to Strategic Performance Management. In contrast with certain previous management initiatives, the close involvement of local management in defining the strategic map led to greater buy-in to the concept of strategic performance management. Commanders at stations were encouraged to look beyond the immediate priorities to see how their efforts aligned with the strategic goals of the station and the RAF as a whole.
• Strategic Goals Represented in an 'Easy to Understand' Format. The generation of a single-page, agreed Strategic Map, representing the work of the station or Force, provided an easily understood mechanism for management to present their strategic goals. This enabled the work of the individual units on the stations to be aligned with the station and RAF strategy. The strategic map has the potential to become the primary strategy communication tool on the participating RAF stations.
• Potential for Cascade of Strategic Goals. Although not yet taken up, some areas within stations saw the benefit of taking the principles of strategic performance management and applying them to their own areas, which would achieve a coherent alignment of effort down the command chain. Similarly, the opportunity to apply the principles higher in the command chain was apparent, although this was outside the scope of the particular work stream.
• Performance Reporting becomes Performance Management. The final benefit was the most important. The previous obligatory approach of reporting upwards was transformed into a process relevant to local management, providing regular assessments in answer to the question: 'How well are we doing at the things we should be good at?' Areas with shortcomings were identified with a traditional 'traffic light' mechanism. This reporting was then extended by linkage to the advanced risk management tool, which enabled the management actions being applied to address shortfalls in achievement to be tracked to delivery.
A senior commander at one of the participating RAF stations put the benefits into the following words: "The Advanced Performance Institute has helped us to get a grip on our strategy. Before, we were struggling to agree on one strategy and were measuring everything that was easy to measure. At the same time we were not getting any value from collecting all that data. With the approach introduced and facilitated by the API we were able to clarify and map our strategy on one piece of paper, identify the key performance questions we as executives want to answer on a regular basis, and design meaningful indicators that will help us answer those questions. For me, we now have performance management Nirvana!"

 

Endnotes, References & Further Reading 

Marr, Bernard (2006), Strategic Performance Management, Butterworth-Heinemann, Oxford.
Marr, Bernard (2008), "What are Key Performance Questions?", Management White Paper, The Advanced Performance Institute.
Marr, Bernard (2008), "What is a Balanced Scorecard?", Management White Paper, The Advanced Performance Institute.

Bernard Marr is a globally recognized big data and analytics expert. He is a best-selling business author, keynote speaker and consultant in strategy, performance management, analytics, KPIs and big data. He helps companies to better manage, measure, report and analyse performance.