Recently, Tristan Spaulding, Acceldata's Head of Product, met with Paige Bartley, a Senior Research Analyst at 451 Research, for a session titled "Prevention vs. Cure: Modern Data Management Calls for Data Observability." It was a wide-ranging conversation on the increasing demand and critical need for data observability solutions among modern data teams. Bartley and Spaulding assessed the rise of data observability as a complementary effort as enterprises rapidly migrate their data operations to the cloud.
Data environments are becoming more complex as all elements of operations — data integration, metadata management, data quality, and everything in between — touch and interact with data, affecting its quality and performance. Data teams also face pressure to optimize the economic investment in the tools they use to manage their data efforts.
In the discussion, Bartley and Spaulding look at the trend toward cloud migration, which they see as necessary but complex and operationally cumbersome. While cloud migration is clearly not just lift and shift, it doesn't have to be unwieldy. Modern data teams put far more thought and strategy into how they adopt the cloud and migrate their workloads. Bartley pointed out several key themes driving cloud migration strategies:
- A transition away from the lift-and-shift philosophy
- Multicloud and hybrid cloud as common deployment patterns
- A frequent preference for private cloud for data platforms
- Relational databases holding firm for new operational workloads, with cloud deployment preferred
- A tendency to choose cloud environments for new analytic workloads
Data consumers demand high availability of their data, and it must be reliable and of high integrity. The growing number of enterprise data consumers has increased pressure on data teams to integrate, cleanse, analyze, stage, and manage the lifecycle of their data. The problem, as Spaulding and Bartley see it, is that data teams now encounter increased pain points and challenges, including:
- More formal responsibilities for data engineering functions
- Need for automation and tooling to support scale
- Need for repeatability of workflows and processes
- Need for auditability, security, and governance
- Emphasis on speed and responsiveness of data delivery
- Multicloud and hybrid compatibility as key considerations
As data engineering functions evolve, data observability becomes an essential tool for maintaining effective data operational management. The two discussed how, as data systems grow more complex and interconnected, there is pressure to improve overall visibility and to push data anomaly detection further upstream, where potential issues can be caught early. Data observability focuses specifically on the data layer, and Bartley and Spaulding explain in detail how monitoring and alerting on data system output to infer data system health improves the derived value and overall ROI of data systems.
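To make the idea of upstream anomaly detection concrete, here is a minimal, hypothetical sketch of the kind of check observability tooling automates: comparing the output volume of a pipeline run against its history and alerting on sharp deviations. The function name and threshold are illustrative assumptions, not Acceldata's implementation.

```python
import statistics

def detect_volume_anomaly(history, latest, threshold=3.0):
    """Flag a pipeline run whose row count deviates sharply from history.

    history: row counts from prior successful runs.
    latest: row count of the run being checked.
    Returns True when `latest` lies more than `threshold` standard
    deviations from the historical mean (a simple z-score test).
    """
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# A typical daily load passes; a near-empty load is caught before it
# propagates downstream to dashboards and consumers.
history = [10_250, 9_980, 10_130, 10_400, 10_050]
print(detect_volume_anomaly(history, 10_200))  # → False
print(detect_volume_anomaly(history, 1_200))   # → True
```

Checks like this run against data system output (row counts, freshness, null rates) rather than infrastructure metrics, which is what distinguishes data observability from conventional application monitoring.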
We encourage you to watch a replay of the session and learn more about why data observability is essential for modern data management.