Data Science for Financial Services

Financial Analytics and Research

Why do(n’t) retail clients invest in ESG products?

Sustainable investments have become immensely important for financial markets in recent years. From 2012 to 2019, ESG assets in European equity funds rose from 0.651 to 1.663 trillion euros, and they are expected to grow to around 5.5 to 7.6 trillion euros by 2025, making up around half of all European fund assets. This contrasts with the very limited evidence on private investors and their inclination towards sustainable investments.

In our envisioned project, we cooperate with a large German retail bank and combine administrative bank data with survey data to elicit private investors’ motivations for and obstacles to investing in ESG products. These motivations and obstacles include return seeking, impact seeking, and social norms, but also confusion and warm glow.

The online survey will allow us to confront the survey participants with one of three information treatments in order to gain causal insights into the various motivations and obstacles vis-à-vis a non-treated control group. Post-treatment questions and the administrative data, which include investment and consumption records, allow us to control for individual characteristics and to observe any effects the information treatments might have on intended and actual investment and consumption choices. This careful empirical setup allows us to produce rigorous evidence on private investors’ motivations for and obstacles to ESG investing.
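To illustrate the randomization and estimation logic behind such an information-treatment design, the following Python sketch assigns hypothetical participants to a control group or one of three treatment arms and estimates treatment effects by OLS. All variable and column names are illustrative assumptions, not the project’s actual data or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1_000

# Hypothetical survey sample with a pre-treatment outcome measure
clients = pd.DataFrame({
    "esg_share_pre": rng.uniform(0, 1, n),  # ESG share of portfolio before treatment
})

# Randomly assign each participant to the control group or one of
# three information treatments (arm names are purely illustrative)
arms = ["control", "returns_info", "impact_info", "norms_info"]
clients["arm"] = rng.choice(arms, size=n)

# Placeholder post-treatment outcome; in the project this would come from
# post-treatment questions and administrative investment records
clients["esg_share_post"] = clients["esg_share_pre"] + rng.normal(0, 0.05, n)

# The treatment-arm coefficients estimate each information treatment's
# causal effect relative to the non-treated control group
model = smf.ols("esg_share_post ~ C(arm, Treatment('control')) + esg_share_pre",
                data=clients).fit()
print(model.summary())
```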

Understanding why people invest sustainably is important not only to academics, but also to institutional investors, who often invest on behalf of individuals.

Principal Investigator: Prof. Dr. Andreas Hackethal

Inflation, Net Nominal Positions, and Consumption

We aim to explore how investors respond to changes in their inflation expectations using randomized controlled trials (RCTs) with clients of a large German bank. We plan to run an online survey to elicit clients’ inflation expectations and beliefs about asset returns during periods of surging inflation. In multiple treatment arms, we then provide professional inflation forecasts and/or information about actual asset returns during past inflationary times to exogenously shift bank clients’ beliefs about inflation and/or asset returns. Post-treatment questions, administrative data, and follow-up surveys allow us to study the effects of the information treatments on intended and actual investment choices, as well as the underlying mechanisms. This careful empirical setup allows us to produce rigorous evidence on how inflation expectations drive household financial-portfolio allocations.

These tests matter for asset managers and financial institutions trying to understand the consequences of heightened inflation expectations for portfolio reallocation and asset returns; for policy makers interested in how inflation affects capital markets and their stability; and for central banks, which need to understand the effects of changes in inflation expectations. In line with the focus of efl, our project utilizes vast administrative bank data to study the effects of an RCT.

Principal Investigator: Prof. Dr. Andreas Hackethal

Project Members: Dr. Philip Schnorpfeil

Measuring Lead-Lag Structures in Ultra-High-Frequency Trading

Lead-lag effects in financial markets describe price discovery situations in which some financial instruments lead and provide price signals to other instruments that lag behind. Lead-lag correlations can arise as a consequence of cross-asset trading and of the mutual influence between the price adjustment processes of different assets. Against this background, we aim to develop a price and liquidity discovery network at the most granular level of market and trading data possible. Current research either uses trading data of only a few instruments, analyzes short observation periods, or investigates coarse sampling frequencies. The goal of this research project is therefore to address these limitations through a deep and broad analysis of lead-lag structures across various assets and asset classes using ultra-high-frequency data on a nanosecond time scale. Cross-asset effects exploited by HFTs are short-lived, so scrutiny of sub-second trading is crucial to measure the speed of price adjustments in a trading world of ever-increasing speed. Covering asset classes like stocks, futures, and options could shed light on previously unknown and fundamentally unsuspected relationships between assets. Relevant market variables for the analyses include the midpoint, spread, depth, and order book imbalance; volatility measures will also be assessed.
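These market variables can be computed directly from best bid/ask data. A minimal Python sketch, assuming hypothetical arrays of best quotes from an order book feed:

```python
import numpy as np

# Hypothetical best bid/ask prices and quantities at two points in time
bid_px = np.array([99.98, 99.99])
ask_px = np.array([100.02, 100.01])
bid_qty = np.array([500, 300])
ask_qty = np.array([200, 400])

midpoint = (bid_px + ask_px) / 2                       # reference price
spread = ask_px - bid_px                               # quoted spread
depth = bid_qty + ask_qty                              # volume at the best quotes
imbalance = (bid_qty - ask_qty) / (bid_qty + ask_qty)  # order book imbalance in [-1, 1]
```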

The main part of the analyses will be guided by cross-correlation estimators, such as the Hayashi-Yoshida (HY) covariance estimator for handling non-synchronous data and its extension allowing for leads and lags through the introduction of the so-called contrast function. There is well-established evidence that both volatilities and correlations exhibit strong intraday variation. Therefore, the stability as well as the strength of lead-lag structures within and across different time periods will be examined. Moreover, several studies have shown that the interdependence of stock markets increases during periods of market distress and high volatility, such as the global financial crisis. The research project shall also contribute to this knowledge and test lead-lag structures for their volatility dependence. The temporal relationships of the instruments' returns will be of decisive importance in underscoring the contribution of this project for academics and practitioners alike. Non-contemporaneous relationships, for instance, could open up opportunities for exploitation by statistical and spot-futures arbitrage strategies of algorithmic and high-frequency traders.
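The following Python sketch illustrates the core idea behind these estimators, assuming NumPy arrays of non-synchronous trade timestamps and prices for two instruments: the HY estimator sums cross products of returns over overlapping observation intervals, and the contrast function re-evaluates it under candidate time shifts. It is a simplified, quadratic-time illustration, not a production implementation.

```python
import numpy as np

def hy_cross_covariance(t_x, p_x, t_y, p_y, lag=0.0):
    """Hayashi-Yoshida cross-covariance of two non-synchronously observed
    price series, with the Y series shifted back by `lag` seconds
    (a positive lag tests whether X leads Y)."""
    r_x = np.diff(np.log(p_x))
    r_y = np.diff(np.log(p_y))
    ix = np.column_stack([t_x[:-1], t_x[1:]])        # intervals spanned by X returns
    iy = np.column_stack([t_y[:-1], t_y[1:]]) - lag  # shifted Y intervals

    cov = 0.0
    for (a, b), rx in zip(ix, r_x):
        # A pair of returns contributes only if its observation intervals overlap
        overlap = (iy[:, 0] < b) & (iy[:, 1] > a)
        cov += rx * r_y[overlap].sum()
    return cov

def estimate_lead_lag(t_x, p_x, t_y, p_y, lags):
    """Evaluate the contrast function on a grid of candidate lags; its
    maximizer in absolute value is the lead-lag estimate."""
    contrast = [abs(hy_cross_covariance(t_x, p_x, t_y, p_y, lag)) for lag in lags]
    return lags[int(np.argmax(contrast))]
```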

Principal Investigator: Prof. Dr. Peter Gomber

Project Members: Micha Bender, Tino Cestonaro, Julian Schmidt

The Impact of Zero-Commission Brokers on Stock Market Quality

A new generation of retail brokers has recently emerged that allows securities trading at zero cost, thereby enabling a new era of stock market participation for retail investors. Starting in the U.S., these so-called zero-commission brokers are experiencing massive growth worldwide. This development and the increased activity of retail traders were accelerated by the COVID-19 pandemic, the associated high market volatility, and the current low interest rate environment. Zero-commission brokers, also referred to as neobrokers, do not charge order commissions. They are based on completely new business models and generate their revenues mainly through so-called "payment for order flow" (PFOF) arrangements: zero-commission brokers forward their clients’ orders to certain market makers or high-frequency trading firms (HFTs), which pay for uninformed retail orders because they can profitably trade against them and take advantage of the additional liquidity. PFOF is controversially debated because it creates conflicts of interest and may lead brokers to violate their duty to achieve best execution for their clients.

Research needs to provide a deeper understanding of this new phenomenon. In particular, the potential negative effect of zero-commission brokerage on overall market quality and efficiency needs to be investigated. PFOF can harm stock market quality when a substantial fraction of uninformed (retail) order flow does not arrive at the limit order books of the reference market (i.e., the exchange where price discovery takes place) but is, instead, executed bilaterally outside the reference market against market makers and HFTs. When uninformed order flow is taken away from the main exchanges, execution risk increases due to the higher relative share of informed order flow and higher adverse selection costs. This, in turn, can lead to higher bid-ask spreads on the reference market to compensate traders for this risk, as well as to higher volatility, and might thus harm stock market quality. Therefore, this project aims to close this research gap by analyzing whether shifts in retail trading volumes away from main exchanges affect stock market quality, i.e., liquidity, volatility, and price efficiency.
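To fix ideas, the following Python sketch computes simple proxies for the three market quality dimensions from hypothetical best bid/ask quote snapshots; the specific measures and column names are illustrative assumptions, not the project’s final methodology.

```python
import numpy as np
import pandas as pd

def market_quality(quotes: pd.DataFrame) -> pd.Series:
    """Simple proxies for liquidity, volatility, and price efficiency
    from a DataFrame of best bid/ask snapshots (columns 'bid', 'ask')."""
    mid = (quotes["bid"] + quotes["ask"]) / 2
    returns = np.log(mid).diff().dropna()

    rel_spread = ((quotes["ask"] - quotes["bid"]) / mid).mean()  # liquidity
    volatility = returns.std()                                   # midpoint return volatility
    # Price efficiency: an efficient midpoint should move like a random
    # walk, i.e., its returns should show little autocorrelation
    inefficiency = abs(returns.autocorr(lag=1))

    return pd.Series({"rel_spread": rel_spread,
                      "volatility": volatility,
                      "inefficiency": inefficiency})

# Hypothetical quote snapshots for one stock-day
quotes = pd.DataFrame({
    "bid": [99.98, 99.97, 99.99, 99.98, 100.00, 99.99],
    "ask": [100.02, 100.03, 100.01, 100.02, 100.04, 100.03],
})
print(market_quality(quotes))
```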

Principal Investigator: Prof. Dr. Peter Gomber

Project Members: Dr. Benjamin Clapham, Dr. Jascha-Alexander Koch, Julian Schmidt

What Matters in Limit Order Books? Order Book Events and Their Impact on Security Prices

When professional investors need to buy (sell) large quantities of securities, they face the risk of substantially moving securities’ prices because of the demand (supply) pressure they create in the market. However, not only completed transactions but also other trading activities, e.g., order submissions, contain information that can affect the price of a security. Understanding the relationship between a trading activity and the subsequent market reaction is valuable for different market participants.

Only a few studies in the literature examine the price impact of trading activities other than trades. In our project, we aim to extend the current literature by examining the determinants of the price impact of a single order book event with respect to all its characteristics. The in-depth analysis of order book event characteristics enables us to identify events that significantly contribute to price formation on the one hand and events that do not represent relevant new information on the other. As a key contribution, the results will enable researchers to identify which part of the data universe of an electronic financial market is relevant for security price formation and should therefore be incorporated in their data samples.

We address this research goal by calculating the price impact of billions of events in the trading process using message data from Deutsche Börse. We investigate the price impact of single order book events for different equities, exchange-traded products, and derivatives traded on Xetra and Eurex. Finally, the prices serve as input for different vector autoregressive models to account for endogeneity and to evaluate how price impacts depend on individual order book events and market characteristics.
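A minimal sketch of the vector autoregression step, assuming statsmodels and synthetic stand-ins for the event-time series that would be built from the message data; all series names are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-ins for event-time series: midpoint returns plus
# signed order-flow measures derived from order book events
rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "mid_return": rng.normal(0, 1e-4, n),
    "signed_trades": rng.normal(0, 1, n),
    "signed_submissions": rng.normal(0, 1, n),
})

# Fit a VAR; the lag order is selected by the Akaike information criterion
model = VAR(df)
results = model.fit(maxlags=10, ic="aic")

# Impulse-response functions trace how midpoint returns react over time
# to a shock in each event series, i.e., the event's price impact
irf = results.irf(periods=20)
print(irf.irfs[:, 0, 1])  # response of mid_return to a signed_trades shock
```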

Principal Investigator: Prof. Dr. Peter Gomber

Project Members: Micha Bender, Tino Cestonaro

Don’t Stop Me Now! Clustering and Classification of Useful and Unnecessary Volatility Interruptions

Volatility interruptions automatically interrupt continuous trading if pre-determined price thresholds are exceeded. While exchange operators and regulators advocate the use of volatility interruptions, the research community has a mixed perception of their effectiveness and has discussed potential negative side effects of these mechanisms. The underlying mechanism does not distinguish between plausible, meaningful price changes (in the following: fundamental price shocks) on the one hand and unsubstantiated price jumps on the other, e.g., due to mis-specified trades, erroneous trading algorithms, or fake news (in the following: error-induced price shocks). To mitigate this problem, the mechanism would need to understand the particular market circumstances and trigger a volatility interruption only in those situations when it is actually needed.

Using a data science and machine learning approach to analyze and identify the causes of historical volatility interruptions, this project aims to lay the foundation for an important milestone regarding the safety and integrity of today’s financial markets. Based on high-resolution message data provided by Deutsche Börse, we plan to (1) employ a clustering approach to identify the circumstances under which volatility interruptions have historically been triggered on Xetra, and then (2) use those clusters as labels in a classification model that predicts these circumstances from pre-event order book information.
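A minimal sketch of this two-step pipeline, assuming scikit-learn; the feature matrices are random stand-ins for order book features that would be engineered from the message data, and the cluster count is illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X_event = rng.normal(size=(2_000, 12))  # features describing each historical interruption
X_pre = rng.normal(size=(2_000, 25))    # order book features observed before the event

# Step 1: cluster historical volatility interruptions by the
# circumstances under which they were triggered
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X_event))

# Step 2: predict the cluster (e.g., error-induced vs. fundamental shock)
# from information available *before* the interruption is triggered
X_tr, X_te, y_tr, y_te = train_test_split(X_pre, labels, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"out-of-sample accuracy: {clf.score(X_te, y_te):.2f}")
```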

Such a model could ultimately be implemented by exchange operators to discriminate between the above-mentioned cases and thereby restrict volatility interruptions to those instances in which they are actually needed and meaningful, that is, in the case of error-induced price shocks but not in the case of fundamental price shocks. This approach bears the potential to significantly increase the effectiveness of market safeguards and would help to solve issues that researchers have criticized and markets have not yet resolved.

Principal Investigator: Prof. Dr. Peter Gomber

Project Members: Dr. Benjamin Clapham

Understanding Consumers’ Information Needs for the "Digital Euro"

Digital currencies are on the rise. Once merely a vehicle for speculation, digital currencies such as Bitcoin, Litecoin, or Ethereum have reached mainstream consumers who use them for trading, payments, or private transactions. In response, centralized institutions such as central banks have developed plans for their own digital currencies. One example of such a central bank digital currency is the “Digital Euro”. The Digital Euro could offer consumers an additional choice for making transactions: a fast and easy means of sending and receiving money while protecting consumers’ privacy. Because the Digital Euro would be accessible to a broad audience, it would contribute to the inclusion of the population in the digital currency space. The Digital Euro would likely also foster financial innovation and improve the efficiency of the payment system.

Despite its promises, the Digital Euro’s success hinges upon its adoption by consumers. Its issuers (i.e., central and national banks) must convince consumers to use it so that it can fully deliver on its promises to society. This project aims to identify consumers’ information needs related to the Digital Euro. By providing financial institutions (such as central or consumer banks) with knowledge about these information needs, we enable them to provide more targeted and consumer-centric information that defuses and ultimately overcomes adoption barriers.

Monitoring how consumers search the web and analyzing their search queries reveals their information needs, which in turn can be satisfied through corresponding information campaigns. However, obtaining such data on how consumers search online is not easy: only search engines themselves have detailed records of consumers’ searches. We overcame this challenge with the help of a novel Search Simulator that we developed. We then use modern data science tools to identify consumers’ information needs. Specifically, we rely on state-of-the-art natural language processing, such as transformer-based embedding models, to measure the semantic similarity of search queries. Based on the identified similarities, we can leverage unsupervised machine learning to consolidate the queries into meaningful topics. Finally, these topics can inform advertising and information campaigns to overcome barriers to adopting the Digital Euro.
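A minimal sketch of the embedding-and-clustering step, assuming the sentence-transformers and scikit-learn libraries; the example queries, model name, and cluster count are illustrative assumptions, not the project’s actual choices.

```python
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

# Hypothetical consumer search queries
queries = [
    "is the digital euro safe",
    "digital euro vs bitcoin",
    "how do I pay with the digital euro",
    "digital euro privacy",
]

# Transformer-based embeddings place semantically similar queries close together
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(queries, normalize_embeddings=True)

# Unsupervised clustering consolidates similar queries into candidate topics
topics = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
for query, topic in zip(queries, topics):
    print(topic, query)
```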

Principal Investigator: Prof. Dr. Bernd Skiera

Project Members: Maximilian Matthe, Daniel M. Ringel

Sponsors

The following sponsors support efl - the Data Science Institute Frankfurt