Hi DataOps.live User,
Welcome to our February Product Newsletter.
Not creating new Data Products in 10 minutes yet? Join us on March 14 to see how!
On March 14, you will see how to create new Data Products with speed and simplicity. Discover how the new DataOps.live Create helps you capture business requirements. Then learn how to use our new Assist AI copilot in the DataOps Development Environment (DDE) to rapidly create new datasets and Data Products.
Use all the data you already have in Snowflake to build out a new dataset, or import your existing dbt Core project and refine it. Continuously assess the quality of your data sources and shared datasets, and establish trust with your business stakeholders.
And that’s not all. If you are new to DataOps, don’t miss the launch of our DataOps.live Professional Edition, designed especially for teams of up to 10 data engineers and providing all the foundational features to turn your dbt Core projects into trusted Data Products.
Start Using dbt Core 1.7
Get ready for an early look at what’s to come! Support for dbt Core 1.7 is just a taste of the exciting features we have planned for our upcoming Orchestrator release in early March.
We are introducing dbt Core version 1.7 support across our transformation orchestrator MATE, the MATE packages, and the DDE.
This enhancement takes your DataOps experience to a new level. Using dbt 1.7 in your DataOps projects unlocks a range of exciting updates and performance enhancements, including:
- Support for Snowflake dynamic tables (see the example below)
- Source freshness determined via DBMS metadata for supported adapters
- Detection of breaking changes to enforced constraints
- ... and many more
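To give you a feel for the dynamic tables support, here is a minimal sketch of a dbt model materialized as a Snowflake dynamic table. The model, source, lag, and warehouse names are hypothetical placeholders; adapt them to your project:

    -- models/orders_summary.sql (hypothetical model)
    -- Materialize as a Snowflake dynamic table. target_lag sets how stale
    -- the results may get before Snowflake refreshes them;
    -- snowflake_warehouse names the warehouse that runs the refresh.
    {{
      config(
        materialized='dynamic_table',
        target_lag='5 minutes',
        snowflake_warehouse='TRANSFORM_WH'
      )
    }}

    select
        order_date,
        count(*) as order_count
    from {{ ref('stg_orders') }}
    group by order_date

Once deployed, Snowflake keeps the table up to date automatically within the configured lag, with no separate incremental run needed.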
Explore the MATE Orchestrator, MATE Packages, and DataOps Development Environment documentation to learn more.
Monitor your DataOps pipelines by name
Starting this week, you can filter pipelines by their pipeline name, typically “full-ci.yml”. This exciting enhancement lets you monitor, analyze, and act on pipeline runs for a specific pipeline filename rather than relying on just the last commit message on the branch.
Plus, if you want even more control, you can set a custom pipeline name to tailor your search to your needs. The only limitation is that filtering covers pipelines executed after February 24, 2024.
Learn more in the Monitoring Pipeline Executions documentation.
Community
Did you hear that Snowflake has standardized on DataOps.live? On last week's #TrueDataOps podcast, Vernon Tan, Senior Manager, frostbyte Industry Solutions at Snowflake, and Robert Guglietti, Solution Development Manager, Industry & Technical Innovation at Snowflake, fascinated the crowd with how Snowflake has licensed the DataOps.live platform to prepare and deliver technical sales demonstrations for its global customers. Read the press release and find the link to the replay below.
Did you know...
... that creating Functions and Procedures just got more flexible? You’re no longer limited to inputting raw SQL/JavaScript/Java/Python code directly. You can also read one or more files with a Python handler using imports. The “handler” parameter specifies which function or procedure to call, while the “imports” parameter makes the module available.
This helps keep your code organized while giving you more flexibility in your development process. Check out the details in the Function and Procedure documentation.
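These “handler” and “imports” parameters mirror Snowflake’s own HANDLER and IMPORTS clauses. As a minimal sketch in raw Snowflake SQL, assuming a hypothetical stage @my_stage holding a Python module etl_utils.py that defines a run() function:

    -- Stage path, module, and procedure names below are hypothetical.
    CREATE OR REPLACE PROCEDURE process_orders(batch_date DATE)
      RETURNS STRING
      LANGUAGE PYTHON
      RUNTIME_VERSION = '3.10'
      PACKAGES = ('snowflake-snowpark-python')
      -- Read the handler code from a staged file instead of inlining it.
      IMPORTS = ('@my_stage/etl_utils.py')
      -- module.function to call; Snowflake passes a Snowpark Session as
      -- the first argument, followed by the procedure's own arguments.
      HANDLER = 'etl_utils.run';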
That’s all for now. Until next time!
- the DataOps.live Product Team