New data and enhanced data capabilities offer opportunities for central banks across a wide range of functions, including monetary policy, financial stability, macro- and microprudential policy, financial inclusion and payments systems. But many central banks are bogged down by legacy systems and a lack of internal data management strategies, as OMFIF’s latest report, ‘Central banks in the digital age: Bringing data into focus’, finds.
To explore the findings of the report, as well as broader developments in central banks’ approaches to data, OMFIF convened a panel of experts from the central banking world to discuss modernisation.
Despite operating in quite different economies, central banks face similar data challenges
Representatives from the central banks of Chile, Lithuania and Italy were broadly in agreement on their data challenges and emerging strategies. Their sentiments were echoed by Bruno Tissot, head of statistics and research support at the Bank for International Settlements, drawing on his experience of engaging with central banks around the world on statistical and data capabilities.
During the discussion, central bank officials cited the variety of data sources, the presence of data silos and the lack of a coherent data governance strategy as challenges. In addition, all panellists highlighted cultural change as a key barrier to faster technological upgrading and institutional transformation. On infrastructure, all the panellists stated an interest in moving to data lakes for at least some databases.
Central banks prepared to use unconventional ‘statistical buffers’ in extraordinary circumstances
All panellists reported the use of high-frequency indicators or other unconventional metrics of economic activity in addition to traditional data. This trend was catalysed by the Covid-19 pandemic when central banks were forced to look beyond traditional statistics for measures of economic activity. Most commonly, these included vehicular mobility and telecommunications data.
Since the pandemic, the use of ‘unconventional’ data sources has greatly expanded. News and social media data are being used to capture expectations and sentiment and are now considered by some institutions as a ‘complementary indicator’ to traditional consumer surveys. At the Bank of Lithuania, these are being used in an ‘anecdotal format’ for now, explained Edita Lukaševičiūtė, head of the bank’s data governance division, but they are also exploring the incorporation of social media sentiment for ‘analytical purposes’.
Wariness towards private data providers
Both the costs and the quality of data acquired from private providers were mentioned as potential risks. Gloria Peña, director of the statistics division of the Banco Central de Chile, noted that while experimental data is more timely, traditional data is often of higher quality. ‘We need to verify and validate [these] sources, we need to make sure there are no errors, even if they are experimental… they are subject to more revision.’ The trade-off between timeliness and quality was also identified as a challenge.
Peña explained further that the bank ‘cannot incorporate something when we are not sure we are going to have that source in the future’. This sentiment was echoed by Giuseppe Bruno, head of the IT division of the Banca d’Italia’s economics and statistics directorate, who warned that central banks must ‘be careful with the private data providers’. He noted several risks: costs which can ‘multiply by five from one period to another’, the format of data which can change without notice and the potential lack of a consistent flow of data over a longer period.
Full migration to cloud services raises questions of sovereignty and security for central banks
‘We’re headed in the direction to look at the cloud seriously as a community of central bankers,’ noted Benjie Fraser, head of the asset owner segment of EMEA at State Street Bank & Trust, ‘but it raises [questions] for practitioners around sensitive information.’
All central bankers on the panel reported using the cloud, but only with public data for now. The panellists acknowledged the advantages of cloud infrastructure, including cost, resilience and enhanced capabilities. But for sensitive or private data, the panellists expressed reservations. Data encryption could offer one solution, while data localisation may be seen as necessary by other institutions. It is still too early to tell how this will evolve.
Central banks give measured assessments of their data capabilities
When asked how satisfied they were with their data capabilities, the panellists’ answers ranged from four to seven out of 10 – a rather lacklustre evaluation. However, the panellists expressed excitement about several initiatives both within and between central banks.
Peña emphasised the usefulness of a data warehouse for sharing code across departments at the Banco Central de Chile. This way, they ‘don’t duplicate or waste time if someone already did the same thing’. Bruno echoed the importance of sharing code and applications, pointing to the work that the BIS and the International Financial Corporation are conducting with central banks globally: ‘They have set up a training platform so that we meet every year and discuss a variety of issues related to platforms for sharing code and data’ and providing examples.
Along with technology and processes, cultural change is another obstacle that central banks are working to tackle. ‘People’ is the final pillar of the data governance framework at the Bank of Lithuania. As Lukaševičiūtė noted, ‘We need to educate people, if we have new technologies, we have to ensure that the people who work with these technologies know how to work.’
Taylor Pearce is Senior Economist at OMFIF.