Building an Enterprise-Wide Data Fabric for Capital Markets

The ongoing digital transformation in capital markets requires firms to gain control of the vast amounts of data at their disposal to establish a single view of the truth, and to underpin analytics, reporting, and regulatory processes. Firms can enjoy the benefits of insights into customer needs and behaviors, a better understanding of market dynamics, and the ability to respond to emerging opportunities, all while meeting their regulatory obligations.

But many organizations have legacy data systems that need to be modernized to meet their digital ambitions. Data modernization is not just about upgrading legacy platforms one-for-one. Emerging technologies can allow organizations to build toward a coherent data fabric that unlocks new opportunities and allows them to differentiate their offerings.

The opportunities and challenges involved with putting in place a modern data fabric are discussed in detail in a new whitepaper published by A-Team Group’s Data Management Insight and commissioned by Yellowbrick. You can download the paper free of charge by clicking here: Creating an Enterprise-Wide Data Fabric to Underpin Digital Transformation in Capital Markets

The need for a common data fabric is underpinned by several internal and external factors. Financial institutions across the board – from hedge funds and asset managers, through brokers and investment banks, to custodians and service providers – face business pressures that demand more effective use of resources: driving down operating costs, rapidly designing and deploying new products, and improving client outcomes, all while increasing profitability.

Sell-side firms are under pressure to increase equity value, which has dropped steadily since the Credit Crisis. Many have responded by attacking costs. But without modernizing IT infrastructures, cost-cutting risks reducing firms’ ability to deliver and their flexibility to respond to opportunities, and ultimately falls short.

Buy-side firms are under pressure to demonstrate value – against benchmarks, new entrants, and their peers. To excel, they need outstanding analytical capabilities to help them better understand their clients and identify new opportunities. But many firms lack the resources required and remain hindered by legacy systems that limit their ability to differentiate.

Streamlining Data Operations

Capital markets firms are being held back by legacy systems and siloed organizational structures. These make firms slow to launch new products and capabilities, and slow to onboard new clients. The infrastructure becomes an obstacle to better client outcomes and profitability.

As they address business pressures, capital markets participants are witnessing an explosion in the volumes of data they are required to deal with. Across all industries, digitization is creating huge datasets that require sophisticated systems to support analytics and complex calculations, whether for front-office trading models or enterprise functions like credit risk or customer behavioral analysis.

Firms are realizing that artificial intelligence techniques like machine learning (ML) and robotic process automation (RPA) can help them handle these growing volumes of data. Automation streamlines operations for both analytical applications and business workflows: examples include front-to-back-office straight-through processing (STP) of trades, as well as regulatory activities such as trade/transaction reporting and trade surveillance for potential market abuse.
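To make the surveillance example concrete, here is a minimal sketch, in Python, of the kind of automation involved: an off-the-shelf anomaly detector applied to a small, hypothetical table of trade records. The column names, values, and contamination setting are illustrative assumptions, not taken from the whitepaper or any specific surveillance product.

```python
# Illustrative only: flag unusual trades for surveillance review.
# Assumes a hypothetical table of executed trades with these columns.
import pandas as pd
from sklearn.ensemble import IsolationForest

trades = pd.DataFrame({
    "notional": [1e5, 2e5, 1.5e5, 9e6, 1.2e5],     # trade size in USD
    "price_deviation_bps": [2, 3, 1, 45, 2],       # distance from mid at execution
    "seconds_to_cancel": [300, 250, 400, 1, 320],  # very short-lived orders can signal spoofing
})

# Fit an unsupervised anomaly detector; -1 marks outliers worth human review.
model = IsolationForest(contamination=0.2, random_state=0)
trades["flag"] = model.fit_predict(trades)
print(trades[trades["flag"] == -1])
```

In a production surveillance workflow, a flag like this would feed a case-management queue for compliance analysts rather than end the process on its own.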

Analytics for Growth

For those firms seeking to grow, improved analytics are essential to their ability to take advantage of emerging opportunities, and better understand client needs and preferences. By applying analysis to client, counterparty, and market risk information, infrastructure activity and performance data, and other types of information, practitioners can gain insights into client behaviors, emerging market opportunities and threats/risk, and potentially damaging operational issues that may be affecting performance.

As firms recognize that improved analytics can boost profitability, the emphasis has fallen on data infrastructures able to support the processing of large data sets in the development and operation of firms’ analytical applications. There is growing demand from hedge funds, non-bank liquidity providers (NLPs), electronic market-makers, and other trading firms for consistent data for the development, testing, and deployment of trading models. NLPs and market-makers are basing their business models on the ability to match huge numbers of retail trades generated by established electronic brokers, earning wafer-thin margins on each trade. Their ability to do this at scale is often underpinned by a data fabric that enables the rapid and robust turnaround of orders.

In the front-office area, there is specialist demand for analysis of trading infrastructure latency and throughput performance, and client fill rates, which can be used to generate execution performance assessments and client behavioral analytics.
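As a rough illustration of that kind of execution analytics, the sketch below computes per-client fill rates and latency percentiles from a hypothetical order-level dataset; the field names and figures are made up for the example.

```python
# Illustrative only: per-client fill rates and latency percentiles
# from a hypothetical order-level dataset.
import pandas as pd

orders = pd.DataFrame({
    "client":     ["A", "A", "B", "B", "B"],
    "filled_qty": [100, 0, 250, 250, 100],
    "order_qty":  [100, 100, 250, 500, 100],
    "latency_us": [180, 220, 95, 110, 400],  # gateway-to-ack round trip, microseconds
})

g = orders.groupby("client")
summary = pd.DataFrame({
    "fill_rate":      g["filled_qty"].sum() / g["order_qty"].sum(),
    "latency_p50_us": g["latency_us"].quantile(0.5),
    "latency_p99_us": g["latency_us"].quantile(0.99),
})
print(summary)
```

In practice these metrics would be computed over millions of order events inside the data platform itself, but the shape of the calculation is the same.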

Firms are looking to analyze very large datasets to better understand client investment preferences, identify possible fraudulent activities among customers and staff, and build a clear picture of counterparty and issuer risk, increasingly with respect to ESG (environmental, social, governance) investing.

The Culture Bunker

A necessary first step in enjoying the capabilities and benefits of modernizing IT infrastructures is a cultural shift toward a data-driven approach to business. One early step is to implement data quality and data governance disciplines throughout the organization. These disciplines are seen as especially important for ESG data, given the need to understand data linkages and “supply chains.”

But old habits die hard. It’s hard to change how things are done when data teams adopt a bunker mentality, rejecting change out of hand. As institutions adopt modern technology and methodologies for digital transformation, they must “unlearn” widely accepted approaches to managing and using data. Acceptance and direction of a new data culture need to be driven from the top, and the emergence of the Chief Data Officer in many financial institutions is a significant positive step.

Barriers to Progress

Time and again, disconnected legacy siloed datasets and outdated, incompatible data technologies and infrastructure restrict firms’ ability to respond to business drivers and capitalize on these opportunities. For many firms, this creates a need for data infrastructure modernization.

Perhaps the biggest single issue involves the multiple data repositories across lines of business that are characteristic of many financial firms’ organizational structure, often due to years and decades of M&A activity. These data siloes – an analytics professional at a tier 1 investment bank recently complained of operating 25,000 of them – prevent firms from establishing a holistic, single view of the truth.

Consolidating data from these siloes can be difficult, not only because there are so many, but also because individual business units have different operating models and views of data. Many of these repositories are old – with client/server and even mainframe technologies still handling key processes. Given the huge volume and sustained velocity of many datasets used by financial services firms, these systems are often not capable of meeting today’s Big Data demands.

The result is slow response times, multiple data copies, gaps in data coverage, inconsistent and poor-quality data, and ultimately service disruption. Any of these can undermine a firm’s reputation among customers and counterparties, and draw regulatory censure and financial penalties. As a result, these performance issues represent a key driver of data infrastructure modernization programs.

Toward a Solution

Despite early hesitance, financial firms have started to recognize that cloud technologies can provide the flexibility and scale their data operations need to realize the benefits of a data-driven approach. Although many firms now see cloud as an integral part of their data-driven initiatives, adoption strategies are not standardized. Firms are taking different approaches, with some implementing single-cloud solutions while others look at multi-cloud, and some deploying private cloud while others use public.

At the same time, it is recognized that some key processes may not (yet) be suitable for offsite operation, due to latency performance, regulatory considerations, and other issues.

For instance, high-performance, low-latency trading applications such as co-location, exchange connectivity, real-time market data, and pre-trade risk are not seen as appropriate for cloud hosting and are likely to remain on-premises or in a purpose-built data center. Some regulatory jurisdictions – notably the Monetary Authority of Singapore (MAS) – prohibit the hosting of client data outside the territory they cover.

On a purely technical level, some legacy technologies may be too difficult and/or important to switch off or risk disrupting. And certain data sets may be too sensitive for off-prem hosting.

This all adds up to an acceptance among financial institutions that the hybrid on-premises/cloud model will be with us for some time. Hybrid brings a whole set of challenges around standardization, orchestration, and latency, to ensure data sets hosted in different environments map correctly. A multi-cloud hybrid approach complicates the situation further.
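As a loose sketch of what mapping data correctly across environments can involve, the snippet below encodes a toy placement policy that routes datasets to on-premises or cloud infrastructure based on residency, sensitivity, and latency tags. The tags and rules are hypothetical and not a feature of any particular platform.

```python
# Illustrative only: a toy placement policy for a hybrid on-prem/cloud estate.
# The dataset tags and rules are hypothetical assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    jurisdiction: str        # e.g. "SG" for data covered by MAS residency rules
    contains_client_pii: bool
    latency_sensitive: bool  # e.g. feeds a pre-trade risk check

def placement(ds: Dataset) -> str:
    """Decide where a dataset should live under simple residency/latency rules."""
    if ds.jurisdiction == "SG" and ds.contains_client_pii:
        return "on-prem (in-territory)"   # regulatory residency constraint
    if ds.latency_sensitive:
        return "on-prem / co-located"     # keep close to the trading path
    return "cloud"                        # default for analytics workloads

print(placement(Dataset("sg_client_accounts", "SG", True, False)))
print(placement(Dataset("eu_market_ticks", "EU", False, True)))
print(placement(Dataset("global_risk_aggregates", "US", False, False)))
```

In an estate of any real size, rules like these would live in governance tooling and metadata catalogs rather than application code, but they illustrate how residency and latency constraints shape a hybrid layout.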

Some organizations have attempted to resolve these issues using first-generation distributed data warehousing solutions, which emerged over the past decade or so. But given financial services’ unique and complex requirements, many have found that these may not deliver on performance and cost criteria and are unable to support the unification of data parameters required for successful digital transformation programs.

These first-generation solutions are often optimized for single-operator cloud implementations, which raise concentration-risk concerns and the potential for regulatory rebuke or penalty. Early solutions also suffer performance issues: they are slow and struggle to keep up with exploding financial data rates as volumes, velocity, and volatility all rise.

There are cost and security issues too. Many early solutions have usage-based commercial models that are difficult to police internally, resulting in possible nasty surprises for CFOs as practitioners lose track of how much processing power they are consuming. And since the modern data fabric necessarily condenses many data siloes into a single data framework, there is a growing recognition of the need for robust security measures to protect the emerging single data fabric from cyber-attack and data breaches.
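As an informal sketch of the internal guardrails that usage-based pricing calls for, the snippet below aggregates hypothetical per-team consumption records against monthly budgets and flags overruns; the credit figures and field names are invented for illustration.

```python
# Illustrative only: flag teams whose usage-based consumption exceeds budget.
# The consumption records and budgets are hypothetical.
from collections import defaultdict

usage_records = [
    {"team": "equities-research", "credits": 1200.0},
    {"team": "equities-research", "credits": 900.0},
    {"team": "risk-analytics",    "credits": 300.0},
]
monthly_budget = {"equities-research": 1500.0, "risk-analytics": 2000.0}

# Sum consumption per team, then compare against the agreed budget.
spend = defaultdict(float)
for rec in usage_records:
    spend[rec["team"]] += rec["credits"]

for team, total in spend.items():
    budget = monthly_budget.get(team, 0.0)
    status = "OVER BUDGET" if total > budget else "ok"
    print(f"{team}: {total:.0f} / {budget:.0f} credits ({status})")
```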

How to Get it Right

There is no question that obstacles remain for those seeking to put in place a data-driven approach to their business operations. But by taking into account a few considerations, practitioners can avoid major pain and make serious progress in transforming the business:

  1. Cloud is not a panacea. Don’t expect costs to drop. Cloud adoption requires culture change, new skill sets, and re-thinking of IT processes.
  2. Involve the business. Without business buy-in, modernization projects will find it difficult to secure support and funding. It’s essential to identify specific problems to solve and to involve business, data, technology, analytics, regulatory and operational departments in building support.
  3. Plan for hybrid: Hybrid is proving to be the new normal with some data needing to reside on-premises for the foreseeable future. Governance is key.
  4. Prepare for an explosion in data platform utilization as the business catches on to the new capabilities.
  5. Plan for real-time: Real-time presents real opportunities for firms to identify and respond to emerging opportunities.
  6. Don’t forget to build a data fabric, not just a one-for-one replacement. Modernize your thinking as you modernize.

Managing different data platforms on-premises and in the cloud can add complexity that firms simply cannot afford. The data fabric is an emerging architectural approach that seeks to address these challenges. Yellowbrick’s “Your Data Anywhere” approach can help firms as they navigate their journey to becoming data-driven organizations.

The opportunities and challenges involved with putting in place a modern data fabric are discussed in detail in a new whitepaper published by A-Team Group’s Data Management Insight and commissioned by Yellowbrick. You can download the paper free of charge by clicking here: Creating an Enterprise-Wide Data Fabric to Underpin Digital Transformation in Capital Markets
