Official Statistics in the Wake of the Covid-19 Pandemic: A Central Banking Perspective

The impact of the Covid-19 pandemic (CV19) on 
official statistics has been large and multiform. As regards central banks, 
their statisticians responded proactively to the related data disruptions, and 
new ways were found to address policy information needs. But the pandemic also 
triggered a general review of their statistical functions with a view to reorganising statistical production chains. In that sense, the pandemic proved
to be a wake-up call for official statistics, underscoring the need for a new, 
enhanced global framework to improve existing core statistics and address new 
data needs. The completion of the G20 Data Gaps Initiative at the end of 2021 
will provide a key opportunity to support this endeavour and enhance the global 
statistical infrastructure.


Introduction
Published in Theoretical Economics Letters (DOI: 10.4236/tel.2021.114047).

The impact of the Covid-19 pandemic (CV19) on official statistics has been a particularly relevant issue for central banks, as both producers and users of data.
As producers, they were confronted with newly arising statistical data gaps and were also involved in methodological interventions to address the challenges caused by the pandemic. As users of statistics, central banks needed information to pursue their monetary and financial stability policy objectives in the face of the sharp disruptions caused by the pandemic. The experience of statisticians in various central banks weathering this storm highlighted three main lessons.
A first, somewhat reassuring lesson is that the occurrence of CV19 underscored the importance of the international efforts made since the 2007-09 Great Financial Crisis (GFC), especially in the context of the Data Gaps Initiative (DGI) endorsed by the G20 [1], to build better-quality, more comprehensive, flexible where relevant, and integrated statistics (FSB & IMF, 2009). As the pandemic struck, policy makers had at their disposal a wealth of statistics that would barely have been available a few years earlier. In particular, central banks and financial supervisors realised the potential benefits of accessing very granular data on financial instruments such as loans and debt securities, as well as on the balance sheets of key financial institutions. These deeper, more granular "chests" of statistics compiled from input data enabled them to better understand the consequences of the crisis as well as the potential impact of their actions on the economy, such as increased liquidity measures, adjusted financing mechanisms for commercial banks, and other relief measures.
The second lesson is that, despite recent progress, many data gaps remain, and they have been exacerbated by the crisis (FSB, 2021a). This is particularly the case for financial accounts: the ongoing crisis is likely to lead to unprecedented public and private debt levels, calling for a better understanding of financial interlinkages within economies, as well as of the distribution of revenues and assets among economic agents. The associated financial risks remain hard to assess in many places, a difficulty that has been reinforced by the impact of financial innovation (e.g. the emergence of new financial intermediaries and risk transfer mechanisms, including through derivatives and cross-border operations). These issues were clearly underlined by the turmoil in financial markets observed in March 2020 when CV19 escalated (FSB, 2020b). Addressing them will require important follow-up work to comprehensively implement the data collections launched since the GFC (in the context of the DGI as well as among non-G20 jurisdictions), especially with regard to repo transactions, cross-border exposures and derivatives.
Thirdly, the pandemic also highlighted the need to go beyond the "standard" offering of official statistics, especially in crisis times. A key requirement in the first year of CV19 was to have more timely (almost real-time), more frequent (weekly or even daily), and well-documented indicators to guide policy. Addressing these requirements called for enhanced data sharing among official statistics producers as well as for considering alternative, "big data" sources as a complement to official statistics.

[1] The efforts made by G20 countries were accompanied by significant improvements in other economies too, as some DGI recommendations were also relevant for non-G20 countries participating in the related data collections (for instance by BIS member central banks) and also because of the global recognition of the benefits realised.
The rest of this paper is organised into four parts. It first reviews the impact of CV19 on the compilation and use of official statistics in light of the experience of central banks. Second, it looks at the implications of these developments for the design of their statistical functions. Third, it discusses why the pandemic can represent a useful wake-up call for official statistics. And fourth, it reviews possible ways to enhance the global statistical framework in response to the crisis.

Disruptions in the Compilation of Reliable Official Statistics
The occurrence of the pandemic highlighted the need for reliable data, for instance to assess how badly the economy was hurt or to monitor the subsequent recovery and impact of policy responses. Yet policy makers were confronted with a sudden disruption in official statistics as CV19 escalated.
1) The first aspect was, almost literally, statistical darkness. Many economic activities (e.g. air transportation) just stopped, at least temporarily: there was simply nothing to record any more. Alternatively, the statistical apparatus was unable to measure properly the new activities that had quickly replaced others.
A case in point was the composition of household spending and the related measurement of consumer price inflation (CPI), as spending on specific items fell to almost zero due to CV19-related lockdowns and social distancing (e.g. no restaurant dinners, closed theatres), while it surged for other items (e.g. take-away food and online services). The difficulty for statisticians was to adapt quickly to these changing patterns, given that the related weights (e.g. in the CPI basket) are usually adjusted only progressively, often once a year.
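To make the weighting problem above concrete, the following sketch (with invented weights and price changes) shows how a CPI computed on a fixed, pre-pandemic basket can diverge from the inflation actually experienced once spending shifts under lockdown:

```python
# Illustrative sketch with invented weights and price changes: a basket fixed
# before the pandemic can misstate the inflation households actually face.

def basket_inflation(weights, price_relatives):
    """Weighted average of item price relatives, minus 1 (weights sum to 1)."""
    return sum(w * r for w, r in zip(weights, price_relatives)) - 1.0

# Two stylised items: in-person services (restaurants) and take-away/online.
pre_covid_weights = [0.6, 0.4]   # official basket, updated only once a year
lockdown_weights = [0.1, 0.9]    # actual spending shares under lockdown
price_relatives = [1.00, 1.05]   # in-person prices flat, take-away prices +5%

official = basket_inflation(pre_covid_weights, price_relatives)     # 0.020
experienced = basket_inflation(lockdown_weights, price_relatives)   # 0.045
```

With identical item-level price changes, the stale weights understate the inflation relevant to lockdown-era consumption, which is precisely why statisticians had to consider more frequent weight updates.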
Another important example was the impact that CV19 had on external statistics (BoP/IIP; IMF, 2020a; UN ECLAC, 2020b). This reflected, for instance, the drying up of cross-border activities, e.g. for non-financial transactions, due to the sudden stop in tourism and trade in goods (Berthou & Stumpner, 2021), and the spillover effects of external developments on the resident economy. In particular, the related large movements in global financial market prices and exchange rates had a notable impact on corporates in key emerging market economies as CV19 escalated. And the consequences in terms of import and export volumes and values led to distortions in the weights retained for compiling nominal and real effective exchange rates (and thereby for supporting competitiveness analysis).
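The weight distortion mentioned above can be illustrated with a stylised sketch (all figures invented): an effective exchange rate computed as a trade-weighted geometric average changes when the trade weights themselves are disrupted, even if bilateral rates are unchanged.

```python
# Stylised sketch of a nominal effective exchange rate (NEER) as a
# trade-weighted geometric average of bilateral exchange rate indices.
# Weights and rates are hypothetical, for illustration only.
import math

def neer(bilateral_indices, weights):
    """Geometric weighted average of bilateral exchange rate indices."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return math.exp(sum(w * math.log(e) for w, e in zip(weights, bilateral_indices)))

bilateral = [1.10, 0.95, 1.00]        # vs partner A (+10%), B (-5%), C (flat)
pre_covid_weights = [0.5, 0.3, 0.2]   # pre-pandemic trade shares
covid_weights = [0.3, 0.5, 0.2]       # shares after a collapse in trade with A

neer_old = neer(bilateral, pre_covid_weights)
neer_new = neer(bilateral, covid_weights)
# Same bilateral rates, different weights: the measured effective
# appreciation differs, muddying competitiveness analysis.
```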
2) Second, official statistics became more difficult to assemble properly. Compilation work was hindered by the impact of CV19 on respondent resources through the disruption of normal day-to-day activities, reflecting social distancing and lockdown measures. Price statisticians around the world had to adapt quickly to these compilation difficulties. A first response was to use the wide range of statistical tools available to deal with missing information, in particular so-called imputation techniques. One option was simply to re-use past observations, for instance by carrying forward the past value of missing prices when measuring CPI; but this generally led to a downward bias in inflation and an upward bias in real consumption (Diewert & Fox, 2020). Reporting agents were also affected, calling for some delays in collecting information; as a result, some flexibility was offered in terms of reporting obligations to "reduce the operational burden on banks (…) and enable them to report with an adequate level of data quality" (ECB, 2020b). More globally, the Basel Committee on Banking Supervision decided in spring 2020 to reduce the collection of data related to its global systemically important bank (G-SIB) assessment exercise and to postpone the implementation of the revised G-SIB framework, to provide additional operational capacity for banks and supervisors (BCBS, 2020).
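The carry-forward imputation discussed above, and the downward bias it can create, can be sketched as follows (prices invented for illustration):

```python
# Sketch of carry-forward imputation: re-using the last observed price for
# items that could not be surveyed keeps measured inflation flat even when
# "true" (unobservable) prices are rising. Figures are hypothetical.

def carry_forward(series):
    """Replace None (missing) observations with the last observed value."""
    out, last = [], None
    for p in series:
        if p is not None:
            last = p
        out.append(last)
    return out

true_prices = [100, 102, 104, 106]   # what actually happened to prices
observed = [100, None, None, 106]    # collection impossible mid-lockdown
imputed = carry_forward(observed)    # [100, 100, 100, 106]

measured = imputed[2] / imputed[0] - 1        # 0.0: imputation flattens inflation
actual = true_prices[2] / true_prices[0] - 1  # 0.04: the unobserved increase
```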

Challenges for Data Users
One of the main repercussions of CV19 has been the challenges it posed for users of statistics, especially in central banks, for three key reasons: first, increased concerns about the availability and/or accuracy of the indicators being generated by statisticians; second, delays in the availability of official statistics caused by the pandemic; and third, the need to deal with larger uncertainty and data revisions.
1) A first difficulty related to the accuracy of survey-based indicators. Survey samples could be biased towards a few larger firms dominating industries at the national level and smaller firms operating at the regional level; this proved a key issue as non-response sampling problems rose with CV19. For instance, given the greater difficulty of processing adequate stratification and methodological adjustments, even a small decrease in the response rate of a survey could cause large changes in average outcomes. Alternatively, biases could be triggered by non-responses from enterprises that had ceased to exist due to CV19 but were still included in the sample survey population (SARB, 2020). Another challenge was that aggregate information was not sufficient to properly assess the underlying distributional shifts in the population of interest. Such issues are particularly relevant for central banks in view of their price stability mandate. As noted above, the impact of CV19 on inflation was volatile [4] and differed across household income groups (Cavallo, 2020), as well as between the types of price indexes considered (Mead et al., 2020). There were also important challenges for central banks' financial stability functions: the CV19 crisis required complementing the assessment of its central-tendency impact with an analysis of its consequences on the tails of the distribution, in order to form a comprehensive view of the pockets of risk prevalent in the entire economy.
This need for more granular information was clearly evident for authorities seeking to understand the causes of the CV19-induced stress in financial markets observed in early 2020, which broadly related to three main categories: short-term funding stress; market structure/liquidity-driven stress; and long-term credit stress (Kothari et al., 2020). In particular, government bond markets experienced heightened turbulence in March 2020, reflecting multiple factors. On the one hand, CV19 raised investors' risk aversion, with flight-to-safety investment into US Treasuries. At the same time, price differences between financial instruments led a number of leveraged investors, such as hedge funds, to take aggressive arbitrage positions. Yet the pick-up in volatility and the deterioration of liquidity in specific market segments, such as US futures markets, led to forced selling, creating a perverse spiral of illiquidity, price dislocations and tighter margin requirements.
These tensions quickly affected other markets and the real economy, given the importance of the US government bond market as a benchmark for financial prices (Schrimpf et al., 2020).

[4] Cf. Cecchetti & Schoenholtz (2020), who argue that the volatility in CPI caused by CV19 could be exacerbated by measurement issues (e.g. chain-weighting) and that central banks should look at ways to filter out such high-frequency noise, for instance by using trimmed-mean price indices.

Examples of difficulties faced by central bank users of official statistics

CV19-related general increase in data uncertainty:
• Unobserved indicators: uncertainty created by mitigation techniques (e.g. for imputed price changes in the CPI)
• Changing economic patterns: e.g. in the consumption basket due to supply constraints and lockdown rules, resulting in bias in price deflators/real indicators
• Postponement of publication dates
• Large and more frequent data revisions
• Alteration of conventional data collection methods
• More difficult adjustment for seasonal patterns in time series
• Uncertainty related to the extent of inherent biases (e.g. time variation, impact of the changing economy and policy actions)

Specific issues related to sample surveying:
• Sample representativeness: for instance, sample survey frames are typically drawn from Statistical Business Registers (SBRs), which in turn often rely on taxation information that was distorted by the CV19-related recession and disrupted tax payments
• Outdated profiling information: for instance, enterprises listed in the South African SBR are generally deemed to be active for approximately 18 months after their last tax payment; hence, those that had ceased operations due to CV19 could still be included in the sample frame for a significant period of time
• Changes in time series trends: risk of increasing bias and over-estimation
• Survey accuracy: reduced importance for respondents of complying with survey requirements because of other, CV19-related priorities
• Data sharing constraints: the supply of auxiliary information (e.g. financial statements) could be distorted because of resource issues/statistical delays resulting from CV19

These difficulties highlighted the importance of having an encompassing view of the financial system, especially by type of market segment, financial instrument and investor. They were also a stark reminder of the important data gaps that remain despite the progress achieved under the DGI, in particular as regards the functioning of the repo market (especially for bilateral repos), the balance sheets of non-bank financial entities such as hedge funds, and the interconnections between the various players in terms of liquidity provision (cf. Section 5).
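The trimmed-mean price indices mentioned in footnote 4 can be sketched with invented item-level price changes: dropping the most extreme observations filters out CV19-style outliers (collapsing air fares, spiking sanitiser prices) before averaging.

```python
# Sketch of a symmetric, unweighted trimmed-mean inflation measure.
# Item-level price changes are hypothetical, for illustration only.

def trimmed_mean(changes, trim_share=0.2):
    """Average after dropping the top and bottom trim_share of observations."""
    s = sorted(changes)
    k = int(len(s) * trim_share)
    core = s[k:len(s) - k] if k > 0 else s
    return sum(core) / len(core)

item_changes = [-0.30, 0.01, 0.02, 0.02, 0.50]    # monthly changes by item

headline = sum(item_changes) / len(item_changes)  # 0.05: driven by outliers
core = trimmed_mean(item_changes)                 # ~0.017: underlying trend
```

Actual trimmed-mean CPI measures trim expenditure-weighted distributions; the unweighted version here only conveys the idea.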
2) A second problem faced by users during the pandemic has been the delayed release of various statistics due to compilation difficulties. Policy makers are traditionally concerned by the fact that, while they need a real-time readout of the economy to take best-informed decisions, official statistical releases are quite lagged, especially in developing economies. They thus have to make decisions with limited information, a challenge that is particularly important for central banks, which need to respond rapidly to financial stability risks to avoid sub-optimal equilibrium adjustments in the economy. The delayed availability of statistics was reinforced in the wake of the CV19 crisis, as many countries imposed lockdown restrictions, which negatively affected data compilation and dissemination tasks within statistical value chains. This clearly exacerbated the information challenges faced by policy makers, hampering their assessment of economic indicators and their ability to analyse the situation and take appropriate actions. To try to avoid making "decisions in the dark", users have therefore been leading the call to generate alternative statistics in response to the pandemic. Indeed, as stated by the IMF in its analysis of the impact of CV19 on statistics, "many traditional official statistics-even those with monthly frequency-are just not sufficiently up to date to be useful at this time. Higher-frequency alternative data are needed to complement official statistics. While the important role of alternative indicators was already recognized before the pandemic, the disruptions of traditional statistics caused by the current crisis have made it more urgent" (Ducharme et al., 2020).
Yet one issue is the high volatility of such sources, making it difficult to obtain data that are informative but not too noisy, together with their potential estimation bias (especially when the economy is changing rapidly, as was the case during the pandemic).
Another challenge is that accessing alternative data sources could, at least initially, entail a significant "red tape" burden to adhere to the related rules and formalities, something that proved a key obstacle when CV19 struck and raised acute data needs. Hence, significant attention should be devoted to exploring how to improve access to these new data sources effectively (Biancotti et al., 2021).
Furthermore, a key consideration is that statisticians are better equipped for such exercises: they better understand the methodological and quality issues involved, and they have the requisite skills to mine alternative sources (structured as well as unstructured, like textual information) and to compile statistics that can meet the same methodological and quality tests as official statistics, which are collected, managed, compiled, disseminated and analysed according to well-defined principles (United Nations, 2013). For instance, the UK Government Statistical Service (GSS), a community for all civil servants working in the collection, production and communication of official statistics, has developed top tips for statisticians who need additional ad hoc data with a quick turnaround but a sufficient degree of quality assurance, especially when time and resources are limited (UK GSS, 2020). It is in any case essential to convince users not to access alternative data sources themselves outside the realm of official statistics, as this could lead to unintended consequences and costly mistakes.
3) Third, by impacting both the quality and the timeliness of statistics, the pandemic further complicated users' life because of increased data uncertainty and revisions. To say the least, CV19 brought a high level of uncertainty to the statistical world. The shock to the economy, unprecedented since WWII, raised the question of how to "deal with this type of structural shift and the uncertainty it brings to conventional policy parameters" (CCSA, 2020). In addition, uncertainties were amplified by the responses of economic agents to CV19-induced disruptions, which are still hard to gauge at the present juncture. For instance, the international financial statistics compiled by the BIS showed that the pandemic-induced turbulence in global financial markets in early 2020 was associated with a halt in the provision of banking credit to emerging market and developing economies. Obviously, an important consequence of this uncertainty has been that official statistics are more likely to be revised significantly as time evolves. Certainly, this is not a new challenge for policy makers, who have always had to take decisions knowing that the statistics they can look at are subject to future revision. But this difficulty was clearly reinforced by CV19, compared to what would have been the case in the past. The revision policy of official statistics will need to be adapted to address these challenges and preserve statistical quality and integrity, which are both of great importance for users.

Implications for the Statistical Function at Central Banks
By making statistical compilation work more difficult and bringing various challenges to data users, the pandemic triggered a general review of the statistical function in central banks. Two areas of interest were, first, the identification of users' new data needs brought about by the crisis and, second, the adaptation of existing statistical frameworks to ensure the continuous provision of reliable statistics to support policy-making.

Users' New Data Needs
In view of the impact of CV19 on official statistics, most central banks felt the need to have more, not less information. Compared to the situation prevailing before CV19, attention focused on three major points: timeliness, frequency, and the need to address the new issues raised by the crisis. A key consequence was to spur interest in alternative data sources.
1) First, the need for more timely data reflected the fact that the speed of the pandemic and the size of the economic disruptions called for having more rapid statistics at hand to quickly assess the economic situation.
Many central banks simply felt they could not wait for a few weeks before having basic data on say, unemployment, industrial production or inflation. They needed "high-speed" data, almost on a real time basis (Hinge, 2020). The actions taken were multiform. One was to advance the compilation process, sometimes at the price of reduced quality in the aggregates measured. Another was to look for other, more timely indicators that could shed light on specific economic areas.
In a large number of countries, this was done by using "traditional" but secondary data sources, for instance to calculate advance estimates of industrial production based on partial indicators available earlier, such as electricity consumption and/or specific supply indicators collected by industry organisations. In Canada, for instance, retail payments data were used to nowcast macroeconomic indicators during the pandemic (Chapman & Desai, 2021). In Ireland, the central bank similarly found value in analysing the data at its disposal on credit and debit cards and on ATM withdrawals (Hopkins & Sherman, 2020). Turning to Chile, electronic invoicing data were used to elaborate high-frequency activity indicators with a detailed breakdown by sector to assess the impact of CV19 on firms' sales and analyse related funding needs (Central Bank of Chile, 2020). Newer types of data sources were also considered, for instance to assess the impact of the pandemic on Japanese household habits based on mobility trends derived from smartphone location data, in order to monitor access to workplaces or activity in recreation areas (Bank of Japan, 2020a). A last example has been the Bank of England's use of high-frequency measures of formal restrictions on activity imposed by governments to contain the spread of Covid-19 (compiled by Oxford University) and of household mobility (based on Apple Mobility Trends Reports) to assess the impact on GDP (Davies et al., 2020). Yet these new approaches raised specific practical issues too. For instance, a number of the new indicators developed incorporated information on electricity consumption, which needs to be properly adjusted not only for seasonal effects (like many traditional economic indicators) but also for weather conditions, because of the impact of heating and air conditioning on this variable (Castelletti et al., 2020).
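The weather adjustment mentioned above can be sketched with a minimal regression on invented data: consumption is regressed on temperature deviations from seasonal norms, and the estimated weather effect is removed to isolate the underlying activity signal.

```python
# Minimal sketch (invented data) of weather-adjusting an electricity
# consumption index: estimate the temperature sensitivity by OLS, then
# strip it out. Real exercises use richer models (seasonality, holidays).

def ols(x, y):
    """Ordinary least squares slope and intercept for a single regressor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return beta, my - beta * mx

temp_dev = [-5.0, -2.0, 0.0, 3.0, 6.0]           # deviation from norm, in C
consumption = [112.0, 106.0, 101.0, 96.0, 90.0]  # cold spells raise heating use

beta, alpha = ols(temp_dev, consumption)
adjusted = [c - beta * t for c, t in zip(consumption, temp_dev)]
# 'adjusted' varies far less than raw consumption: most of the swings
# here were weather, not underlying economic activity.
```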
2) The second and related focus point was frequency. Traditional macro statistics are usually available on a quarterly basis for the national accounts, and at best monthly for key indicators such as industrial production. During the pandemic, several central banks therefore developed weekly economic activity indices, combining higher frequency (compared with monthly industrial output and quarterly GDP) and greater timeliness. In addition, an important reported feature of these indices is their robustness to changes in the way they are constructed: the approach followed is often based on principal component analysis, allowing for the extraction of "common signals" that are relatively independent of the specific underlying series considered. Moreover, these indices do not rely on pre-defined patterns, unlike traditional nowcasting exercises that are often based on the identification of a stable relationship between well-defined variables, such as between economic confidence indicators and industrial output. This can be quite valuable in unprecedented crisis times.
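The principal-component approach described above can be sketched on simulated data: several noisy weekly series all load on one latent activity factor, and the first principal component recovers that "common signal" without any pre-defined relationship between the series.

```python
# Sketch of extracting a "common signal" via PCA. Data are simulated, not
# real: one latent factor drives six weekly series, plus idiosyncratic noise.
import numpy as np

rng = np.random.default_rng(0)
weeks, n_series = 120, 6
factor = np.cumsum(rng.normal(size=weeks))       # latent activity cycle
loadings = rng.uniform(0.5, 1.5, n_series)       # each series tracks it
X = np.outer(factor, loadings) + rng.normal(scale=0.5, size=(weeks, n_series))

Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardise each series
U, S, Vt = np.linalg.svd(Z, full_matrices=False) # PCA via SVD
pc1 = Z @ Vt[0]                                  # first principal component

# Up to sign and scale, pc1 should track the latent factor closely.
corr = abs(np.corrcoef(pc1, factor)[0, 1])
```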
3) The third focus point was to obtain information on the new issues raised by the crisis that were not properly covered by the "traditional" statistical apparatus. As noted above, many central banks realised during CV19 the potential of micro data, in recognition of the efforts undertaken in recent years to gain better, more granular access to monitor risks in the financial and non-financial sectors as well as to analyse interconnectedness and cross-border spillovers. In particular, many granular datasets had been set up since the GFC to collect loan-level data or security-by-security information (IFC, 2021b). The compilation of off-balance sheet information (e.g. derivatives statistics) has also been enhanced, allowing for a better understanding of how risks can be transferred between economic agents. All this information proved particularly useful when the economy was disrupted by CV19, not only to assess pressure points but also to facilitate the implementation of targeted policy measures. In contrast, when this information was not already available, authorities in a number of countries quickly decided to fill the gap. In Chile, for instance, the central bank introduced new information requirements (in terms of the granularity and frequency of banks' loan data) to monitor the effectiveness of the measures adopted by the authorities to directly support business credit. Another area of attention has been assessing sentiment among economic agents.
One example was the use of text-mining techniques to build a media-based sentiment indicator for Spain, produced daily and updated in real time; this proved particularly useful to enhance nowcasting exercises during CV19, especially compared with more traditional survey-based confidence indicators, which are usually available only monthly (Aguilar et al., 2020). Reflecting central banks' increased focus on the points above, a key consequence was the renewed push for alternative data sources and tools to complement official statistics. In particular, the CV19 crisis underlined two main areas of interest. One was to take greater advantage of the new digital information provided by the "data revolution", namely web-based indicators and other "organic" data available as a by-product of the services provided by a wide range of sensors, devices, satellites etc. Recognising that much of these data, as well as the expertise to analyse them, are concentrated outside the public sector, the Swedish Riksbank decided to open up a collaborative public channel through which academics and the private sector can contribute directly in real time (Hull, 2020). This allowed the exploration of various types of indicators, such as daily retail prices and volumes, restaurant bookings, social media content or company press releases. A second focus area was to make better use of the information contained in large administrative datasets (e.g. public registers, tax files, supervisory reports etc) that have been collected by authorities for many years without being duly exploited for statistical purposes. While there had already been a gradual recognition of the value of such administrative data since the GFC (Bean, 2016), the pandemic clearly reinforced this trend.
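A toy sketch conveys the spirit of the media-based sentiment indicator described above; real exercises use much richer text-mining pipelines, and the word lists and headlines here are invented.

```python
# Toy dictionary-based news sentiment indicator. Word lists and headlines
# are hypothetical; production systems use far larger lexicons or models.

POSITIVE = {"recovery", "growth", "rebound", "improves"}
NEGATIVE = {"lockdown", "losses", "recession", "collapse"}

def sentiment(text):
    """Net positive share among matched words, in [-1, 1]; 0 if no match."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

day1 = "recession fears deepen as lockdown extended and retail losses mount"
day2 = "economy improves as rebound in services points to recovery"

daily_index = [sentiment(day1), sentiment(day2)]  # [-1.0, 1.0]
```

Scoring each day's news flow this way yields a daily series that can be fed into nowcasting models well before monthly survey-based confidence indicators arrive.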

Revisiting Statistical Frameworks to Better Support Policy Making
Faced with such increased uncertainty, central banks as producers of statistics had to re-assess their production functions and reorganise themselves where needed and appropriate to supply data addressing users' needs. In general terms, they focussed on three areas.
1) First, the importance of the economic shock and its uncertain repercussions throughout the economy reinforced the need to develop a comprehensive overview of the entire economy, its components and the way they interact, noting that a key point of interest for central banks is to monitor the risks associated with such interconnections. For the many countries in the world that do not yet actively compile institutional sector accounts, a key priority is thus to advance this work so that counterparty positions can be assessed in a consistent and methodologically sound way. Even more advanced national statistical systems face similar tasks, for instance to ensure the comprehensive recording of the various policy actions taken in response to the crisis. One example is when a government provides financial support to companies affected by CV19, which could be considered either as a financial investment or simply as a subsidy with no return expected. Depending on the recording of those actions in the macroeconomic accounts, one could have a different picture of the situation in terms of fiscal deficit and public debt (IMF, 2020b) [9]. Moreover, "inconsistent approaches to recording government support to people and businesses" could blur the comparability of fiscal positions across countries.

[9] For a concrete application of the IMF guidance in the case of Australia, see the spotlight on classifying CV19 policy interventions in macroeconomic statistics published by the Australian Bureau of Statistics.

2) Second, statistical frameworks have to become more flexible to address evolving users' needs and the sheer uncertainty created by the crisis. This called for statisticians to re-assess the relevance and agility of their tools and methods to deliver the required statistics. Key was the ability to meet the (new types of) users' data demands as structural changes were quickly happening. They had in particular to "think the unthinkable" in order to support policy makers despite the high data uncertainty brought on by CV19.
They also had to be innovative, for instance to deal with the methodological issues induced by the new policy measures taken in response. In fact, the situation caused by CV19 in terms of data needs was similar to that faced during big catastrophes, such as the Great East Japan Earthquake of 2011. These types of crises, because of their magnitude and suddenness, and also because of non-linearities (which disrupt historically observed econometric relationships between economic variables), lessen the predictive power of the usual indicators. More generally, while alternative data may provide only limited information in "normal" periods of the economic cycle, they can help to better understand the situation in periods of crisis, at least in the very short term (INSEE, 2020).
3) Third, statisticians had to figure out how complementary data sources and information could be brought into their mainstream statistical frameworks. One way was to integrate alternative input sources within the conventional methodological process used to generate official statistics. Another was to use these additional sources to obtain supporting and benchmarking data that can act as an "information buffer" at times when conventional official statistics dry up or lag significantly. For instance, to address the halt in the direct questioning of foreign visitors (the "main source") during the initial CV19 period, the Bank of France defined a list of "auxiliary" data sources, such as payment cards, mirror data provided by European countries and mobile networks, that could be accessed to measure international travel as a second-best solution (Le Gallo & Schmitt, 2020).
In any case, a key concern is that the new data sources being considered, such as private web-based data, additional regulatory input data, etc., do not present the same guarantees as official statistics in terms of uniform definitions, time consistency, etc. Statisticians can also face confidentiality and ethical issues when using these data, as well as potential unintended consequences that could emerge from combining various, possibly inconsistent data sources (Tissot, 2019).

A Wake-Up Call for Official Statistics
The effects of the CV19 pandemic on economic statistics have been far-reaching and have required statistical agencies, including central banks, to adapt to these challenges in an effort to continue to best inform policy makers under difficult circumstances.

Making Better Use of Existing Data
Given its dynamic nature, measuring economic activity can be a never-ending process: there will always be new structural developments that are not yet measured but should be. And, indeed, CV19 was a stark reminder that new challenges can constantly emerge that require more statistical information to be properly addressed. However, before deciding whether additional or other types of data should be collected, one should realise that existing datasets might not be fully exploited, sometimes simply because potential users are not aware of their existence, or because of the training costs entailed. One example is the use already made of commercial data sources such as Bloomberg to monitor credit flows. These examples suggest that a key task for central banks is to make operational the large and structured financial datasets that are already available, arguably a more pressing priority than exploring "new" alternative indicators.
Another example is the need to identify linkages between economic agents, especially through common exposures that could favour the contagion of shocks.
Detailed counterparty information could be helpful in this regard, calling for enhancing the resilience of the underlying market infrastructure, especially by supporting the further development of global identifiers, including parent relationships (LEIROC, 2016), so as to assess interconnections across the various parts of the financial system (FSB, 2020a). Instrument-level data could also shed light on the linkages between borrowers, creditors and specific financial intermediaries, in turn facilitating the identification of the distribution of risks and of systemic effects. Such effects can be hidden by aggregated statistics, for instance when assets and liabilities are "matched" at the level of one sector but not at the level of a single but large institution, whose failure could put the entire system at risk.
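The point that sector-level aggregates can conceal institution-level risks can be made concrete with a toy example. All figures below are hypothetical and purely illustrative:

```python
# Illustrative sketch: sector-level aggregation can hide an institution-level
# mismatch that only granular, institution-level data reveals.

# Hypothetical balance sheets (assets, liabilities) for banks in one sector.
banks = {
    "Bank A": {"assets": 40, "liabilities": 10},
    "Bank B": {"assets": 10, "liabilities": 40},
    "Bank C": {"assets": 25, "liabilities": 25},
}

sector_assets = sum(b["assets"] for b in banks.values())
sector_liabilities = sum(b["liabilities"] for b in banks.values())

# At the sector level, assets and liabilities appear perfectly "matched" ...
print(f"Sector: assets={sector_assets}, liabilities={sector_liabilities}")

# ... yet institution-level data reveal large individual mismatches
# (here, Bank B's sizeable net liability position).
for name, b in banks.items():
    net = b["assets"] - b["liabilities"]
    print(f"{name}: net position = {net:+d}")
```

In this stylised case the sector balance nets to zero while one institution carries a large uncovered position, which is precisely the situation aggregated statistics cannot distinguish.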
Turning to so-called "alternative" or "big data" sources, they currently mostly fall outside the official statistics boundary. They can be quite diverse, comprising for instance third-party information, anecdotal evidence collected from economic units, data scraped from the internet as well as listed firms' financial results. As noted above, they could provide timelier, more frequent and complimentary insights to traditional, survey-based statistics. One point highlighted during CV19 was the potential value of "economic intelligence", e.g. more anecdotal information collected from various sources such as industry information, the press or even social media. This information (

Revamping Macroeconomic Statistical Frameworks by Leveraging Technological Innovation
Realising the full benefit of supplementary data to support official statistics calls for a reassessment of our macroeconomic statistical frameworks, in a way similar to what was undertaken in response to the GFC, especially with the launch of the DGI endorsed by the G20. Such a review should focus first on ensuring the adequate integration into "mainstream" measurement frameworks of the wide range of supplementary data sources available (e.g. the internet of things, micro-level data sets). It also calls for making further progress on the necessary underlying IT infrastructure, to enhance both the management of the increasing amount and variety of statistics compiled and the governance of the associated processes (IFC, 2021c).
However, such a fundamental review may take time and energy. One solution would be to start by enriching the financial accounts framework that is already available in several jurisdictions. This framework can provide a wealth of potentially useful information across the main institutional sectors, including a full view of income, assets and liabilities across an economy (Çakmak et al., 2020). The addition of supplementary data sources would enhance this information base, helping to effectively inform monetary and financial stability policy decisions and to facilitate the assessment of their implementation. It would also facilitate the compilation of distributional financial accounts (ECB Expert Group on Linking macro and micro data for the household sector, 2020).
Another approach is to build on recent technological developments to develop datasets that cover the entire target population, or at least a much larger sample thereof. The scale of the CV19-related economic fallout suggests that this may be a good investment to augment conventional data compilation exercises (which have traditionally been based on representative samples drawn randomly from a broader population). In particular, a well-defined strategy for developing input sources could help to mitigate the uncertainty and risk associated with the disruptions observed in the compilation of official statistics in crisis times.
However, developing such additional data sources is a long-term endeavour that can be costly and compete for the limited resources of national statistical systems (NSSs). It would also require statisticians to refine the tools at their disposal to mine, analyse and process these data. The drive towards alternative data sources is therefore not necessarily a saving grace, even for those developing economies that often face very specific problems with traditional statistics compilation processes.
Statisticians can also expand the borders of their conventional framework in other innovative ways, for instance by using online surveys to increase response rates. They can also re-purpose some official statistics in case of disruptions; an example is the use by several economies of various indicators (e.g. electricity consumption, monthly industrial production, business confidence) to infer a proxy of GDP during CV19. Another way would be to expand bi- or multivariate time series analyses in order to develop official statistics based in parallel on conventional inputs and on alternative data sources, for instance to identify possible relationships between such complementary approaches and to develop alternative processes supporting data compilation, with the caveat that those relationships may not be stable over time, depending for instance on the state of the economy or on policy actions.
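The proxy approach described above can be sketched in a few lines: regress past GDP growth on an auxiliary indicator such as electricity consumption, then apply the fitted relationship when the official GDP release is delayed. All figures below are synthetic and illustrative, and the caveat about unstable relationships applies in full:

```python
# Minimal sketch of nowcasting GDP growth from a proxy indicator
# (illustrative synthetic data; not an operational model).

def ols_fit(x, y):
    """Simple one-regressor OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return my - slope * mx, slope

# Historical quarters: electricity consumption growth (%) vs GDP growth (%).
electricity = [1.0, 2.0, -1.0, 0.5, 3.0]
gdp = [0.8, 1.6, -0.9, 0.4, 2.5]

intercept, slope = ols_fit(electricity, gdp)

# Latest quarter: only the electricity indicator is available so far.
latest_electricity = -4.0  # a sharp CV19-style contraction
nowcast = intercept + slope * latest_electricity
print(f"GDP growth nowcast: {nowcast:.1f}%")
```

In practice such relationships would be re-estimated regularly and combined across several indicators, precisely because the link between proxy and target can shift with the state of the economy or with policy actions.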
Lastly, digital innovation could more generally help to accelerate the production of official statistics. A more automated linking between the granular (e.g. institution- and instrument-level) datasets available in the economy and the macro statistical frameworks could help to compress compilation times, provide more frequent indicators, and reduce revisions. As analysed above, this would be of great interest for users in central banks.
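The automated linking described above can be illustrated with a stylised aggregation step: instrument-level records (here, hypothetical loan data) are rolled up into the sector-level, from-whom-to-whom totals used in financial accounts:

```python
# Stylised sketch: aggregating granular, instrument-level records into
# sector-level totals of the kind used in macro statistical frameworks.
# All records below are hypothetical.
from collections import defaultdict

# Instrument-level records: (creditor sector, debtor sector, amount).
loans = [
    ("banks", "households", 120.0),
    ("banks", "nonfin_corps", 200.0),
    ("banks", "households", 80.0),
    ("insurers", "nonfin_corps", 50.0),
]

# Aggregate into a from-whom-to-whom matrix, as in financial accounts.
matrix = defaultdict(float)
for creditor, debtor, amount in loans:
    matrix[(creditor, debtor)] += amount

for (creditor, debtor), total in sorted(matrix.items()):
    print(f"{creditor} -> {debtor}: {total}")
```

Because the aggregation is fully mechanical, it can be re-run as often as the granular inputs are refreshed, which is how such pipelines compress compilation times and reduce revisions.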

Supporting Users' Experience with Statistics by Promoting Data Sharing and Cooperation
CV19 has forced statisticians to revisit the services they provide to their stakeholders, for instance by increasing the frequency and/or timeliness of their estimates to better support policy decisions. It was also an opportunity to revisit users' general experience with official statistics. Several issues deserve to be considered for this purpose. For instance, are users exploiting the published statistics to their full potential? Are they made sufficiently aware of the opportunities offered by specific data sets? Where should users turn to get statistical information, and how should the official statistical framework be set up to facilitate this process?
The starting point is that thinking the unthinkable has become a key priority for statisticians supporting policy makers in the aftermath of CV19. The reason is that CV19 has triggered structural shifts in the economy, destabilising existing relationships and leading to unprecedented policy decisions. Proactive action in the face of such uncertainty is critical: without appropriate data, policy makers will have at their disposal irrelevant historical estimates, or will simply have to rely on anecdotal evidence, which would clearly be sub-optimal given the issues at stake.
One avenue for answering these questions is cooperation. CV19 highlighted the importance of ex ante coordination among public authorities, so that adequate processes are already in place to allow for an effective exchange of information when a crisis occurs. This also requires developing data sharing within and between agencies producing statistics, so that databases can be linked in a coherent way, which also helps to limit the reporting burden on the economy (IFC, 2015). The South African Reserve Bank, for example, is currently developing a framework for a consumer credit register serving multiple stakeholders, with the intention of delivering granular information on consumer loans. Moreover, the need for statisticians to deal with new, alternative sources and to be prepared for the unthinkable puts a premium on the sharing of experience, not just to access data but also to get a better sense of their actual content and limitations. In particular, statisticians would benefit from knowing about the initiatives that have worked in other fields, to the extent that these can be at least partially emulated in their own areas.
Another avenue is to develop a "central marketplace" to increase the accessibility of official statistics. This is already being implemented in a number of countries, and may well be expanded at the global level. At that level, the main improvements achieved with the DGI have so far been related to the regulated banking sector, especially with the recommendations to improve the collection of international banking statistics and the set-up of the BIS-hosted international data hub for global systemic institutions (Bese Goksu & Tissot, 2018).

Looking Forward: A New, Enhanced Global Framework for Official Statistics?
Obviously, a key question posed by the pandemic looking forward is how best to address policy needs, specifically in the monetary and financial sectors. One solution is to build upon the infrastructure already put in place by the G20 DGI so as to take advantage of its three key success factors. First, its approach of structured collaboration between international organisations and NSSs, which ensures effective coordination and helps avoid the risk of duplicating other global statistical initiatives, such as those related to updating international statistical standards. Second, its close connection to authorities' current priorities, with effective reporting to policymakers (as was the case with the G20 Finance Ministers and Central Bank Governors under the DGI). And third, an effective peer pressure mechanism for spurring the active involvement of national jurisdictions. A central marketplace could, for instance, build on the US experience of the Federal Reserve Economic Data (FRED) service, which is updated daily and allows access to over 500,000 financial and economic data series from more than 100 public and proprietary sources. At the international level, the Principal Global Indicators initiative aims to facilitate the monitoring of economic and financial developments for G20 jurisdictions and is a joint undertaking of the Inter-Agency Group on Economic and Financial Statistics (IAG). In addition, as part of its new strategy on data and statistics, the IMF is working to establish "global data commons", i.e. an integrated cloud-based network of country websites publishing data essential for surveillance (IMF, 2018).

The way forward would thus be to enhance and expand the current DGI, due to be completed at end-2021, so as to set up a more permanent and comprehensive framework to support the global production and use of official statistics. The focus should be on the actual reporting of relevant data, complementing other existing international statistical work streams that are more devoted to methodological issues, such as the updates of the 2008 System of National Accounts (2008 SNA) and of the Balance of Payments and International Investment Position Manual, sixth edition (BPM6), both launched in 2020. Such a revised international framework for cooperation could be instrumental in 1) enhancing existing core official statistics, especially as regards timeliness, frequency and international comparability; 2) addressing newly emerging data needs; and 3) strengthening the global statistical infrastructure.

Enhancing Existing Core Official Statistics
The CV19 pandemic has highlighted the urgency of pursuing the statistical exercises started after the GFC to compile better macroeconomic statistical aggregates, for instance the DGI recommendations of publishing general government data consistent with the SNA (an issue that has clearly gained importance in view of the surge in public spending reflecting authorities' response to the pandemic), of furthering the development of financial accounts (including detailed breakdowns of securities issues and holdings; Çakmak et al., 2020), and of developing fintech statistics (IFC, 2020a), as argued above. It also underlines the need to collect more granular financial information to better understand episodes of stress in financial markets.
From this perspective, the important financial data collections initiated since the GFC, especially in the context of the DGI, should be finalised, notably in the following closely inter-related areas. 1) Credit flows. The economic disruptions caused by CV19 highlighted the need for more granular information on firms' funding needs, especially on the size of their cash shortfalls, their ability to finance them, and the way of doing so, e.g. through credit lines (Banerjee et al., 2020). 2) Currency composition. The covered interest parity basis spread (Borio et al., 2016) should in normal times be close to zero, assuming perfect arbitrage. Yet with the start of the CV19 pandemic it widened again vis-à-vis the US dollar across major currencies, and central banks had to expand their operations in terms of swap lines and temporary US dollar liquidity arrangements to mitigate market stress. These developments clearly underscored the need to improve the measurement of the currency composition of financial balance sheets for important sectors. 3) Derivatives. Certainly, many initiatives have been in train since the GFC to address the information gaps related to derivatives, not least the decision to collect granular transaction data through trade repositories (TRs). Yet challenges remain, especially for smaller jurisdictions where data are scarcer and access for central banks is more difficult. There is therefore a clear need for greater coordination at both the domestic and international levels to enhance the quality of TR data, develop their global aggregation, and foster their use for policymaking (IFC, 2018). Moreover, particular attention should be paid to FX derivatives, which generally require the actual payment of the notional amount at maturity, making them a form of debt, unlike many other derivatives (cf. Borio et al., 2017). One issue is that the amounts involved are recorded off balance sheet, while they can have significant implications for on-balance sheet cash positions.
Moreover, such FX derivatives can be used as hedging tools to close on-balance sheet currency mismatches; see Aldasoro et al. (2020) for an analysis of these issues and of the related dollar funding needs of commercial banks. Hence, more data should be collected to monitor conditions in global funding markets, in particular data to assess the direction and amounts of FX trades, cross-classified by currency, maturity, instrument type and counterparty sector/region. This would greatly complement existing information on countries' total (i.e. on- and off-) balance sheets, and would significantly enhance existing measures of both external debt and foreign currency debt.
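The basis spread referred to above is commonly measured as the gap between the direct dollar interest rate and the dollar rate implied by funding via an FX swap. A worked sketch, using illustrative (not observed) quotes:

```python
# Worked sketch of a covered interest parity (CIP) basis calculation.
# All quotes are illustrative; sign conventions differ across studies.

spot = 1.10      # USD per EUR (illustrative)
forward = 1.105  # 1-year forward, USD per EUR (illustrative)
r_usd = 0.020    # 1-year dollar interest rate
r_eur = 0.010    # 1-year euro interest rate

# Dollar return implied by swapping into euros for one year and selling
# the proceeds forward: (F/S) * (1 + r_eur) - 1.
implied_usd = (forward / spot) * (1 + r_eur) - 1

# CIP basis: direct dollar rate minus the FX-swap-implied dollar rate.
basis = r_usd - implied_usd
print(f"implied dollar rate: {implied_usd:.4f}, basis: {basis:+.4f}")
```

Under these illustrative numbers the basis is roughly 54 basis points; perfect arbitrage would drive it towards zero, and a persistent non-zero value signals the kind of dollar funding strains that central bank swap lines were deployed to ease.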

Addressing Newly Emerging Data Needs
A new framework for global official statistics should also address the lessons underlined by the pandemic. The first one relates to how to better tap into big data, e.g. private data sources as well as administrative registers, so that they can be brought into mainstream statistical frameworks and used to deliver more timely, frequent and comprehensive information to policymakers. This calls in particular for developing partnerships allowing access to private data for official statistical purposes (G20 Italian Presidency, 2021a). For instance, better cooperation among the different operators of administrative, statistical and commercial business registers would help to create a more agile NSS (Lane, 2021). Another important issue is to better understand the potential of emerging trends in data science, data engineering and information technologies (IFC, 2020b, 2021a). Central banks are in particular increasingly interested in adopting data analytics and business intelligence techniques along with data transformation and big data ecosystems in their organisations, especially in finding appropriate sources, developing new methodological concepts and techniques, compiling policy-relevant indicators and making use of them, and taking advantage of rapid improvements in technology (Wibisono et al., 2019). As regards the FX trade data discussed above, these elements are not currently captured by the global derivatives statistics published by the BIS, which encompass the trading of foreign exchange instruments in spot and OTC derivatives markets as well as OTC interest rate derivatives (cf. Wooldridge, 2019; see also https://www.bis.org/statistics/about_derivatives_stats.htm?m=6%7C32); they were under consideration not least in the context of the recommendations of the second phase of the DGI related to cross-border exposures (FSB & IMF, 2015).
Progress looking forward will depend on fostering exchange, collaboration and understanding of the related interdisciplinary practices, use cases, and technologies, including on important issues such as data governance and protection (IFC, 2021c). Regional specificities, especially between advanced and developing economies, are also important factors to be carefully considered. As regards access to alternative private data in developing countries, a key challenge is that these jurisdictions often have particularly tight resources and rarely possess the expertise to extract information from raw unofficial data (Robin et al., 2015). Yet, on the other hand, the big data revolution offers underdeveloped NSSs an opportunity to avoid organising costly data collection exercises. For instance, quick inflation estimates can be made by directly scraping prices displayed on the web, instead of setting up specific surveys that can be quite time- and resource-intensive; see for instance the Billion Prices Project (Cavallo & Rigobon, 2016).

A second focus area is to better measure the new topics underscored by CV19 that are not properly covered by the "traditional" statistical apparatus, especially environmental topics and socioeconomic factors (e.g. distributional aspects, inequalities). In particular, financial authorities have an increasing interest in developing proper statistics on sustainable finance and in addressing the related analytical needs. Policy makers have indeed underlined the importance of these issues: the G20 Finance Ministers and Central Bank Governors have stressed "that improving data availability and provision, including on environmental issues, and harnessing the wealth of data produced by digitalisation, while ensuring compliance with legal frameworks on data protection and privacy, will be critical to better inform our decisions" and invited the main international financial organisations to reflect on a possible new Data Gaps Initiative (G20 Italian Presidency, 2021b). The key point is to draw the relevant lessons from the impact of the pandemic as regards future developments in greenhouse gas emissions, investments in sustainable technologies, and ways to strengthen the sustainability and resilience of today's economies. This requires taking stock of the related statistical data needs of users in policy-making financial institutions, especially as regards the use of sustainable finance data in areas such as microprudential supervision, financial stability and macroeconomic analysis, risk and reserve management, etc. Another objective is to review existing indicators, ad hoc surveys, and analytical datasets developed or under development at national, regional, or industry levels, as well as the operational ways of bringing together data supply and demand (e.g. the development of statistical hubs). Lastly, ways should be found to close related data gaps, in both the official and the private sector.
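The web-scraping idea behind such quick inflation estimates can be sketched as follows. The prices are illustrative and no actual scraping is shown; the index is a Jevons-type geometric mean of price relatives for matched products, a form commonly used for elementary aggregates:

```python
# Minimal sketch of a quick price index from (illustratively) scraped prices:
# a Jevons index, i.e. the geometric mean of price relatives for products
# observed on both dates. All figures are hypothetical.
import math

prices_before = {"milk": 1.00, "bread": 2.00, "rice": 1.50}
prices_after = {"milk": 1.05, "bread": 2.10, "rice": 1.50}

# Price relatives for matched products.
relatives = [prices_after[p] / prices_before[p] for p in prices_before]

# Geometric mean of the relatives (1.0 = no change).
index = math.exp(sum(math.log(r) for r in relatives) / len(relatives))
print(f"price index: {index:.4f}")
```

Under these illustrative prices the index comes out at roughly 1.033, i.e. about 3.3% inflation between the two observation dates; a production version would of course need careful product matching and quality adjustment.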

Strengthening the Global Statistical Infrastructure
At the global level, the underlying financial statistical infrastructure is still incomplete, reflecting the slow development of global identifiers, standards for exchanging information, and data sharing arrangements. Certainly, there have been notable achievements since the GFC as regards the Legal Entity Identifier (LEI), the actual international sharing of granular information on financial institutions, and the Statistical Data and Metadata eXchange (SDMX; see IFC, 2016), the international standard for describing and exchanging data and metadata between organisations. In particular, the new SDMX 3.0 standard launched in 2021 will facilitate the handling of both micro data sets and certain types of unstructured information (e.g. geospatial data). Moreover, public authorities and the private industry have been working to promote a "regtech" approach to the reporting of financial data, which basically refers to the provision of methodology, technology and processes to financial institutions to support regulatory monitoring, reporting and compliance. Nevertheless, further progress is clearly needed to foster more effective data-sharing possibilities and the use of global identifiers (IAG, 2017).
Promoting global initiatives and the international exchange of national experiences could also be instrumental in enhancing the timely production of official statistics, by leveraging information technology to support data collection, compilation and dissemination processes. It would also provide an opportunity to highlight existing best practices and potential opportunities, especially to support policymaking, as well as to take stock of the challenges to be addressed as a priority.
Addressing these issues could help to significantly enhance NSSs' preparedness in the face of unexpected events such as CV19 and their role as providers of timely and reliable information to central banks as well as to other authorities and the public in general. This will, however, require careful and effective prioritisation of related implications for official statistics, tailored to actual policy needs.