February 7, 2023

Key principles of cloud data governance for banks

Banks – and their data volumes – are at the epicenter of the world’s digital transformation. 

The pace of change mirrors the velocity, volume, and variety of data within the industry, where new products, new markets, and new touchpoints mean new – often cloud-based – ways to do business in financial services. 

Of course, in highly regulated industries, innovation can equal risk. That’s why continued progress depends on banking institutions establishing robust data governance principles. 

After all, wherever sensitive, personally identifiable information is stored, malicious actors will be found. And external attackers aren’t the only concern – there are also unintentional threats from internal employees. Phishing remains the most common attack vector in finance, and a leading entry point for ransomware. 

The path to successful bank transformation

Mitigating these threats without stifling innovation is an ever-present challenge for organizations. However, evidence clearly shows what happens when it’s done successfully. 

Industry incumbents can partner with fintechs to disrupt long-established banking models. Customer contact strategies are free to evolve from static statements to personalized interactions. Closed, proprietary models give way to open banking ecosystems. 

Naturally, delivering these on-demand services often means storing and accessing information held in the cloud. In this digital-first banking model, data often arrives unstructured, sometimes through insecure omnichannel interactions or unpatched devices. 

This means data volumes aren’t just getting bigger; they’re also becoming wider – much like banks’ attack surfaces and the locations of their often-remote workforces. 

Borderless business: Meeting cloud data governance challenges

The resulting pressure is magnified by the financial and reputational impact of data breaches. 

Flagstar experienced one of the biggest breaches of 2022, with 1.5 million customers impacted. The US bank also paid an out-of-court settlement of $5.9 million. 

Also in 2022 came the conviction of the hacker behind the 2019 Capital One data breach – a breach that resulted in a $190 million class-action settlement.

Financial organizations already experience one of the highest average costs from data breaches: $5.97 million, with only the healthcare industry higher at an average of $10.10 million.

For international banks, increased risks come from the multiple regulations governing how data is moved and managed across borders. For example, operating in the US requires a state-by-state approach to data governance: five states (California, Colorado, Connecticut, Utah, Virginia) are set to enforce new statutes in 2023.

“Approximately 78% of financial institutions regard the rapid change in reporting requirements and ad-hoc, non-standardized reporting requests as significantly impacting compliance costs.”

EY

The concept of data location is further complicated by partnerships with fintechs and other third parties. 

From data storage to API access, banks require full transparency and visibility from any cloud data governance strategy. They need this for their internal workflows, to demonstrate compliance to regulators, and – from a brand perspective – to reassure and satisfy their customers. 

Once these foundations are in place, organizations can focus on maximizing value with a cloud data governance framework. The process starts with analyzing where data access has been in the past, and then understanding where it’s going in the future.

“Once a defined governance and access control structure is in place, banks can “democratize” their data by allowing each business unit to access the data mesh and take greater ownership of the quality and value of the data relevant to their domain.”

Accenture


Novelty to necessity: How data needs have evolved

Traditional data access operated on a relatively open-to-all basis, when concepts of data privacy and security tended to be limited to compliance and governance teams. 

After the 2007/8 financial crisis, regulatory reforms started having an impact, with the Basel Committee on Banking Supervision issuing guidance for banks around identifying and acting on risks. 

Enter the ‘need to know’ approach

The reforms helped pave the way for a “need-to-know” approach, where data access became more controlled and permissions were granted based on roles or attributes. 
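
To make the idea concrete, here’s a minimal sketch of a need-to-know check in Python. Everything in it – the roles, attributes, and dataset names – is hypothetical, invented purely for illustration:

```python
# Hypothetical sketch of a "need-to-know" check: access is granted only
# when the requester's role and attributes match a static policy entry.
from dataclasses import dataclass, field

@dataclass
class Requester:
    user_id: str
    role: str
    attributes: set = field(default_factory=set)

# Static policy table: dataset -> (allowed role, required attributes).
POLICY = {
    "customer_accounts": ("analyst", {"emea_region", "pii_trained"}),
    "loan_book": ("risk_officer", set()),
}

def can_access(requester: Requester, dataset: str) -> bool:
    """Grant access only if the role matches and all required attributes are held."""
    entry = POLICY.get(dataset)
    if entry is None:
        return False  # default deny: unknown datasets are never shared
    allowed_role, required_attrs = entry
    return requester.role == allowed_role and required_attrs <= requester.attributes

analyst = Requester("u123", "analyst", {"emea_region", "pii_trained"})
print(can_access(analyst, "customer_accounts"))  # True
print(can_access(analyst, "loan_book"))          # False
```

Note the default-deny stance and the static policy table: every new dataset, role, or attribute means another manual entry, which is exactly where the bottlenecks described below originate.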

For now, some aspects of this remain manual, with IT teams having to play a gatekeeping role and field requests from the business. The process can also be relatively inefficient and time-consuming: significant internal resources are often required to keep track of requests, reconcile them, and generate reports for the relevant teams. 

As a result, bottlenecks are more likely to appear – and to impact the bottom line across different scenarios. A lack of discoverable data leads to less-informed decision-making, while a slow approval process means data becomes out of date by the time permissions are granted.

The ‘need to share’ approach

That’s why it’s imperative that banks look beyond the “need to know” approach and start incorporating the “need to share”, where data is governed on the basis of sharing it with the right people at the right time. 

While this approach still fulfills regulatory requirements, it also offers a competitive advantage by putting information at the fingertips of the people who need it. The focus is on making sure insights are discoverable, rather than locked away in silos. 

Access is granted in seconds, helping organizations democratize data. Achieving this – especially at scale – relies on a mix of technology and human-in-the-loop expertise. 

How the human-in-the-loop helps a hybrid cloud data management strategy

The approach brings systems and humans together: a hybrid model that reflects the nature of many banks’ infrastructures. 

After all, banking remains one of the industries where legacy infrastructure still runs many core processes. But there’s another form of legacy that works in established banks’ favor: the one built over many years of managing and authenticating customer accounts, building trust and reputation by protecting people’s savings and investments, and generating loyalty by giving access to mortgages and financial advice. 

It’s a legacy spanning authentication, protection, and access – in other words, a ready-made foundation for moving toward a “need to share” approach. Banks can get even further ahead by putting some key principles in place.

Principle 1: Implement a Data Security Operations (DataSecOps) approach 

DataSecOps is the natural evolution of how organizations manage data. It mirrors the evolution of software development, where security (Sec) became part of the application development and operations (DevOps) cycle rather than an add-on. This reduced silos and security gaps, and opened the door to a more continuous and agile approach: DevSecOps.

In practice, DataSecOps means policy decisions are automated so that access is granted as close to real-time as possible, rather than following traditional approaches where a potential access threat notifies an admin who then has to carry out manual remediation. 

DataSecOps runs from a single, integrated platform with a consistent and dynamic inventory. Data governance can therefore be run – and scaled – from a central location, reducing delays. Sensitive data can be masked at query run-time and aligned to the relevant regulatory requirements. Self-service sits at the heart of DataSecOps, enabling data democratization within the business.
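
As a rough sketch of what masking at query run-time can look like – the purposes, column names, and rules below are all hypothetical – results can be transformed per column as they’re returned, based on the requester’s declared purpose:

```python
# Hypothetical query run-time masking: rows are transformed as they are
# returned, according to rules keyed by the requester's declared purpose.

# purpose -> {column: transformation}
MASKING_RULES = {
    "marketing_analytics": {
        "ssn": lambda v: "***-**-" + v[-4:],           # partial mask
        "email": lambda v: "****@" + v.split("@")[1],  # hide the local part
    },
    "fraud_investigation": {},  # this purpose may see full values
}

def apply_masking(rows, purpose):
    """Yield rows with sensitive columns transformed per the purpose's rules."""
    rules = MASKING_RULES.get(purpose)
    if rules is None:
        raise PermissionError(f"no policy defined for purpose: {purpose}")
    for row in rows:
        yield {col: rules.get(col, lambda v: v)(val) for col, val in row.items()}

rows = [{"name": "A. Customer", "ssn": "123-45-6789", "email": "a@example.com"}]
print(list(apply_masking(rows, "marketing_analytics")))
# [{'name': 'A. Customer', 'ssn': '***-**-6789', 'email': '****@example.com'}]
```

Because the policy decision happens inside the query path, no admin intervention is needed and unmasked values never leave the platform.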

Principle 2: Ensure continuous data discovery, classification and security

Banking’s fast-moving and highly regulated environment means access policies are complex by default. Policies also need to be dynamic and scalable to match growing data repositories, alongside identifying when, where, how, and for what purpose data is accessed. This relies on data catalogs enriched with cleansed and unified metadata, which makes it possible to maintain a standardized taxonomy for automated aggregation. 
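
For illustration, a minimal catalog entry might look like the sketch below, where each column carries a tag from a standardized taxonomy (all names here are invented). Automated aggregation then becomes a query over tags rather than a hand-maintained list of columns:

```python
# Hypothetical catalog: each column is tagged from a standardized taxonomy,
# so policy can target a taxonomy branch instead of individual columns.
CATALOG = {
    "payments.transactions": {
        "owner": "payments-team",
        "columns": {
            "account_iban": "PII.FINANCIAL",
            "amount": "CONFIDENTIAL",
            "merchant_category": "INTERNAL",
            "created_at": "PUBLIC",
        },
    },
}

def columns_tagged(prefix: str):
    """Aggregate automatically: find every (table, column) under a taxonomy branch."""
    return [
        (table, column)
        for table, entry in CATALOG.items()
        for column, tag in entry["columns"].items()
        if tag.startswith(prefix)
    ]

print(columns_tagged("PII"))  # [('payments.transactions', 'account_iban')]
```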

Centralizing data access can remove much of the manual effort associated with data governance: one integrated dashboard provides an overview of activity, where self-service requests can be activated and monitored.  

The result: sensitive and restricted information is identified, while data remains discoverable. At other times, anonymization and obfuscation are the desired outcomes – that’s when it’s time to deploy techniques such as masking and partial masking, hashing, bucketing, and row-level filtering.  
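
Here’s a hedged sketch of what each of those techniques can look like, using invented data and simplified logic:

```python
# Hypothetical sketches of the anonymization techniques named above.
import hashlib

def partial_mask(card_number: str) -> str:
    """Partial masking: keep only the last four digits visible."""
    return "*" * (len(card_number) - 4) + card_number[-4:]

def hash_value(value: str, salt: str = "per-tenant-salt") -> str:
    """Hashing: replace the value with a salted, irreversible digest."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def bucket_age(age: int) -> str:
    """Bucketing: reduce a precise value to a coarse range."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def filter_rows(rows, allowed_regions):
    """Row-level filtering: drop rows the requester may not see at all."""
    return [r for r in rows if r["region"] in allowed_regions]

print(partial_mask("4111111111111111"))  # ************1111
print(hash_value("customer-42"))         # stable token: same input -> same output
print(bucket_age(37))                    # 30-39
print(filter_rows([{"region": "EU"}, {"region": "US"}], {"EU"}))  # [{'region': 'EU'}]
```

In practice, the transformation is selected per column by policy: hashing where joins must still work, bucketing where aggregate analytics suffice, and filtering where whole records are out of scope.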

Principle 3: Develop a data governance strategy for the data-driven banking era

For banks, being data-driven isn’t just about uncovering insights. It also means governing sensitive information at scale – in a way that protects privacy and corporate reputations – while ensuring access remains available so that those same customers receive the levels of service they expect and deserve. 

It’s a form of exchange that benefits all parties – when managed correctly. The challenge is maintaining the right balance, particularly when new and complex regulations impact the way data should be accessed, or when advances in technology recalibrate customer expectations and force banks to adjust their offerings. 

To stay ahead of the changes, banks must adopt increasingly agile approaches to data governance and access, so that information can be shared with control at a more granular level. This calls for dynamic cloud data governance tools, with self-service for managing data lifecycles. The increased freedom can be supported by data lineage that maps, tracks, and audits permission statuses; with that visibility, permissions can be granted and revoked based on use case and policy.
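
One way to picture this – purely as a sketch, with hypothetical field names – is an append-only log of grant and revoke events: the current status of any permission is derived by replaying the log, and the trail itself doubles as the audit record:

```python
# Hypothetical audit trail: permission events are appended, never edited,
# so the current grant status can always be reconstructed and audited.
from datetime import datetime, timezone

AUDIT_LOG = []  # append-only

def record(event: str, user: str, dataset: str, reason: str):
    """Log a grant or revoke event with a timestamp and its use case / reason."""
    AUDIT_LOG.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "event": event,  # "grant" or "revoke"
        "user": user,
        "dataset": dataset,
        "reason": reason,
    })

def is_granted(user: str, dataset: str) -> bool:
    """Replay the log: the most recent event for this user/dataset pair wins."""
    status = False
    for e in AUDIT_LOG:
        if e["user"] == user and e["dataset"] == dataset:
            status = (e["event"] == "grant")
    return status

record("grant", "u123", "loan_book", "credit_risk_review")
record("revoke", "u123", "loan_book", "policy_change")
print(is_granted("u123", "loan_book"))  # False, and the full history remains auditable
```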

How to add a partner to the principles 

Banks need to start implementing “need to share” access controls as soon as possible. The principles above are crucial to realizing the competitive advantage that’s currently available. Add the Velotix platform to the process, and banks also gain an AI-based policy engine that learns from previous requests, becoming more accurate and providing more granular recommendations even as regulations change or grow in complexity. 

Banks gain real-time access based on the “need to share”: data flows are automated from the point of request, while security and compliance are enforced throughout. 

Ready to experience Velotix’s automated data security platform? Contact us to explore how we meet the needs of banking institutions, however complex the requirements.