Introducing Snowflake Table and Column Usage Analytics
It’s easier than ever to ETL data, but harder than ever to understand who or what is actually using it. Metaplane now monitors table- and column-level usage analytics for Snowflake customers so you can better understand how critical data is used, what should be tested, and how to prioritize data quality issues.
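For context, Snowflake exposes the raw material for this kind of analysis in its `ACCESS_HISTORY` account-usage view, which records which tables and columns each query touched. The sketch below is purely illustrative, not Metaplane’s actual implementation: it builds a query that counts reads per table over a recent window, flattening the `base_objects_accessed` array.

```python
# Illustrative sketch: derive table-level read counts from Snowflake's
# SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY view. Execute the resulting
# SQL with your own Snowflake connector; this is not Metaplane's query.

def table_usage_query(days: int = 30) -> str:
    """Build SQL counting reads per table over the last `days` days."""
    return f"""
        SELECT obj.value:objectName::STRING AS table_name,
               COUNT(*)                     AS read_count
        FROM snowflake.account_usage.access_history,
             LATERAL FLATTEN(input => base_objects_accessed) AS obj
        WHERE query_start_time >= DATEADD('day', -{days}, CURRENT_TIMESTAMP())
        GROUP BY 1
        ORDER BY read_count DESC
    """

print(table_usage_query(7))
```

A column-level variant would flatten one level deeper into each object’s `columns` array.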
Keeping Data Quality in Plain View
This week, Kevin had the opportunity to talk with Kelly on the Hashmap on Tap podcast about how Metaplane helps data teams.
Lineage Visualization Is Here!
Metaplane now visualizes your data lineage from warehouse to business intelligence tools. Finally understand how your data is used across your stack, easily find data assets that could be refactored or removed, and, with a few clicks, dig into how specific data assets are connected.
Data Quality Begins and Ends Outside of the Analytics Team
Sarah Krasnik, Data Engineer at Perpay, writes about why monitoring matters at every layer of the stack.
Redshift and dbt Improvements
We now intelligently draw on different Redshift system tables for metadata, choosing sources that keep our metadata accurate without adding load to your warehouse. Additionally, we added more resiliency when making requests to the dbt API.
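As background, Redshift offers several metadata sources with different costs: system views such as `SVV_TABLE_INFO` return row counts and size estimates from catalog statistics rather than scanning user tables. A minimal, hypothetical sketch of building such a query (again, not Metaplane’s actual implementation):

```python
# Illustrative sketch: fetch table metadata from Redshift's
# SVV_TABLE_INFO system view, which reads catalog statistics
# instead of scanning user tables, keeping warehouse load low.

def table_info_query(schema: str) -> str:
    """Build SQL listing tables in `schema` with estimated
    row counts (tbl_rows) and size in 1 MB blocks (size)."""
    # "table" and "schema" are reserved-ish names, hence the quoting.
    return f"""
        SELECT "table", tbl_rows, size
        FROM svv_table_info
        WHERE "schema" = '{schema}'
        ORDER BY size DESC
    """

print(table_info_query("analytics"))
```

Note that `tbl_rows` is an estimate refreshed by `ANALYZE`, so it can lag slightly behind the true count; that tradeoff is what makes it cheap to read.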
Data Observability on the Cloudcast Podcast
On the Cloudcast Podcast, Kevin talks about the concepts behind data observability and the challenges facing data engineers.
Email Alerts With Instant Model Feedback
Metaplane now supports flexible email destinations for alerting. You can send alerts for specific databases, schemas, tables, and columns to multiple email...
Metaplane can now route certain alerts to PagerDuty so you can ensure your team is alerted when critical parts of your data stack are broken.
We’re excited to share our new rules-based alerting system so every team can receive the most relevant alerts where they live.
Sometimes data changes: upstream sources can produce new data that may look anomalous but actually represents a new normal.
With one click, you can now reset test start dates so a test only incorporates data from a specific time period onward. This is helpful for min/max tests, or for tests on entities that undergo large changes such as a data backfill.