Key Trends in Data Management

June 24, 2021

Data Reusability in the Financial Services Sector

The concept of Data Reusability lies at the foundation of nearly all the current priorities for the financial services industry, including operational resilience, digital transformation, and Environmental, Social and Corporate Governance (ESG) transformation. It rests on the premise that data extracted from source systems is supported by capabilities such as metadata collection, data quality, data lineage, and master and reference data management. It also requires that the data be stored centrally and made accessible to both human and non-human actors, so that the same data can be used repeatedly for different purposes. So, what role does data reusability play in the success of the key trends in the financial services sector? And how can organizations implement a successful data reusability framework?
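To make the idea concrete, the sketch below shows what a minimal central catalog supporting reuse might look like. It is an illustration only: the DatasetRecord and DataCatalog classes, the counterparty_master dataset, and the quality checks are all hypothetical and do not refer to any particular product.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DatasetRecord:
    """Catalog entry capturing the metadata that makes a dataset reusable."""
    name: str
    source_system: str
    owner: str
    lineage: list[str]               # upstream datasets this one was derived from
    quality_checks: dict[str, bool]  # e.g. {"completeness": True, "freshness": True}
    last_refreshed: date

class DataCatalog:
    """A central registry: one governed dataset can serve many consumers."""
    def __init__(self) -> None:
        self._records: dict[str, DatasetRecord] = {}

    def register(self, record: DatasetRecord) -> None:
        self._records[record.name] = record

    def get(self, name: str) -> DatasetRecord:
        record = self._records[name]
        # Reuse is only safe if the dataset passed all of its quality checks.
        if not all(record.quality_checks.values()):
            raise ValueError(f"{name} failed quality checks; do not reuse")
        return record

catalog = DataCatalog()
catalog.register(DatasetRecord(
    name="counterparty_master",
    source_system="core_banking",
    owner="data-office@example.com",
    lineage=["raw_counterparty_feed"],
    quality_checks={"completeness": True, "freshness": True},
    last_refreshed=date(2021, 6, 24),
))

# The same record can now back an operational-resilience report,
# a digital-transformation API, or an ESG aggregation.
print(catalog.get("counterparty_master").lineage)
```

The point of the sketch is that once metadata, lineage, and quality status travel with the data, any consumer, human or programmatic, can reuse the same governed record with confidence.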

Lakehouse

A data lakehouse is an environment that combines the data structure and data management capabilities of a data warehouse with the low-cost storage of a data lake. It matches the functionality and performance of a cloud-based data warehouse, while its open architecture enables BI and machine learning on all of an organization's data. Databricks and AWS offer data lakehouses, and other vendors provide similar functionality without using the term “lakehouse”. In this article, we discuss Databricks’ Lakehouse and some of the platform’s other new capabilities.

Learn more here.
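As a rough illustration of the pattern, the PySpark sketch below writes data once to low-cost storage in the open Delta format, the storage layer underpinning the Databricks Lakehouse, and then serves both a SQL/BI query and an ML-style DataFrame read from the same files. It assumes a Spark environment with the delta-spark package available; the table name, columns, and paths are illustrative.

```python
from pyspark.sql import SparkSession

# Assumes delta-spark is on the classpath; these configs enable the Delta format.
spark = (
    SparkSession.builder
    .appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Land raw trades in cheap object storage, but with warehouse-style
# ACID guarantees provided by the Delta format.
trades = spark.createDataFrame(
    [("ACME", 105.0), ("GLOBEX", 98.5)], ["ticker", "price"]
)
trades.write.format("delta").mode("overwrite").save("/tmp/lakehouse/trades")

# BI workload: query the same files with SQL...
spark.sql(
    "CREATE TABLE IF NOT EXISTS trades USING delta LOCATION '/tmp/lakehouse/trades'"
)
spark.sql(
    "SELECT ticker, avg(price) AS avg_price FROM trades GROUP BY ticker"
).show()

# ...and ML workload: read the identical data as a DataFrame for feature prep.
features = spark.read.format("delta").load("/tmp/lakehouse/trades")
```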

Cloud Cost Governance

The cloud offers greatly increased speed-to-market compared with on-premises infrastructure, readily available security and disaster recovery tooling, and relative ease of portability. One of its most attractive promises, however, is cost savings. Yet, as many organizations have discovered, there is a significant difference between cloud costs in the application space and in data and analytics. For applications, the workload, demand on the system, and growth projections are deterministic and largely linear, so modeling costs over time is straightforward. For data and analytics, workloads are highly variable and difficult to predict, which makes cloud costs harder to model, estimate, and control. While all the major cloud platforms offer out-of-the-box cost tracking and management tools, effective cloud cost governance requires a deeper look into service configurations, query sizes, and run times. Increasingly, organizations are setting up Cloud Cost Governance Offices to respond to cost incidents and to establish processes and best practices that ensure efficient use of consumption-based cloud resources.
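As one concrete example of such a control, the sketch below uses BigQuery's dry-run facility to estimate how many bytes a query would scan before it is allowed to run, and blocks it if the estimate exceeds a budget. The dataset and table names, the budget threshold, and the per-TiB rate are illustrative assumptions, not prescriptions.

```python
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT trade_id, notional
    FROM analytics.trades          -- hypothetical dataset and table
    WHERE trade_date = '2021-06-24'
"""

# A dry run validates the query and reports bytes scanned without charging.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry_run = client.query(sql, job_config=job_config)

tib_scanned = dry_run.total_bytes_processed / 2**40
estimated_cost = tib_scanned * 5.00  # illustrative on-demand rate per TiB

BUDGET_USD = 1.00                    # illustrative per-query budget
if estimated_cost > BUDGET_USD:
    raise RuntimeError(f"Query would cost ~${estimated_cost:.2f}; needs approval")
print(f"Estimated cost: ${estimated_cost:.4f} - OK to run")
```

A Cloud Cost Governance Office might embed a check like this in a CI pipeline or query gateway, so that cost review happens before spend rather than after the monthly invoice.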

Scalable, Multi-Cloud Insights

The cloud has undoubtedly opened new avenues for extracting business insights from data, and with increased adoption, more businesses are opting for a multi-cloud approach to make the most of the services offered by different platforms. Extracting insights across clouds, however, remains challenging for many organizations. On Google Cloud, one of the big three platforms, the BigQuery data warehouse and the Looker business intelligence tool integrate seamlessly, letting users break down data silos and extract actionable insights regardless of where the data is stored. For multi-cloud analytics, BigQuery Omni brings the power of BigQuery to third-party cloud platforms, eliminating the need for data to be moved between clouds and saving organizations time and data transfer costs as they extract value from their data.
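As a sketch of how this looks from the analyst's side, the snippet below runs an ordinary BigQuery query against an external table that an administrator is assumed to have already defined over data in another cloud via a BigQuery Omni connection; only the aggregated results cross cloud boundaries. The project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical project; authentication via application default credentials.
client = bigquery.Client(project="my-analytics-project")

# `aws_dataset.s3_orders` stands in for an Omni-backed external table
# defined over files residing in another cloud's object storage.
sql = """
    SELECT region, SUM(order_value) AS total_value
    FROM `my-analytics-project.aws_dataset.s3_orders`
    GROUP BY region
"""
for row in client.query(sql).result():
    print(row.region, row.total_value)
```

From the consumer's point of view, the query is indistinguishable from one against native BigQuery storage, which is precisely what makes the multi-cloud approach tractable.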
