
Product Updates

See what's new in our product – check out the updates below


DataOps.live Product Newsletter | August 2024

Hi DataOps.live User,

We're glad to be back after a summer break in July! Enjoy the DataOps.live Runner fully deployed in Snowflake, check out upcoming webinars and podcasts, and leverage new orchestrators for Informatica Data Governance and Catalog as well as dbt Cloud. Read on...

Join our next product webinar "Building AI-powered Data Products with DataOps" on Sep 26

🚀 DataOps.live Runner for Snowpark Container Services in preview

DataOps.live already supports deploying a Runner with Docker as a single instance and with Kubernetes as part of an auto-scaling group. With the new version of the DataOps.live Runner, you can also deploy it on Snowpark Container Services as a service native to Snowflake. No data leaves Snowflake, and you don't need an external service as a dependency.

Keep everything inside Snowflake – even credentials
No additional deployment is needed at AWS/Azure/Google

This is the first step toward making the entire data product platform available as a Snowflake native app.

Learn how to install and use the runner

#TrueDataOps season 3 kicks off September 11!

After the summer break, we are restarting the #TrueDataOps podcast with your host Kent 'The Data Warrior' Graziano! We go live every other Wednesday at 8 AM PST | 4 PM GMT.

Our first guest of the season will be Mike Ferguson, a speaker you don't want to miss. Register for Season 3!

If you are new to the podcast, binge-watch past episodes on the YouTube podcast channel or enjoy Mike's previous insights from episode 8.

🥁 New Informatica Cloud Data Governance and Catalog (CDGC) orchestrator

The new Informatica Cloud Data Governance and Catalog (CDGC) orchestrator enables Snowflake customers using Informatica and DataOps.live to deliver tested, trusted, and backward-compatible data products in record time without compromising governance.
This comprehensive and collaborative solution includes the following features:

Active push of new metadata and data lineage for near-real-time feedback loops
Coverage of the entire development lifecycle from dev to prod
Complete integration with the entire data ecosystem powered by Informatica and Snowflake

The Informatica CDGC orchestrator complements the existing Informatica Cloud taskflow orchestrator.

Learn more

📢 New dbt Cloud orchestrator

We proudly announce our dbt Cloud orchestrator today. If you need dbt Explorer, dbt Mesh, the dbt Semantic Layer, or column-level lineage, you will be happy to hear that with DataOps you can now orchestrate dbt Cloud along with all its advanced capabilities. Our dbt Cloud orchestrator triggers jobs as part of a DataOps pipeline, running your predefined jobs within dbt Cloud. We support up to three environments – production (PROD), staging (STAGE), and development (DEV) – all mapped seamlessly to and from DataOps.live environments.

Learn more

DataOps.live Featured in the Gartner® Market Guide for DataOps Tools

That's two for two! The market is moving fast: eight of the vendors in the original guide have already been replaced. If you want to understand why and get more insight into how the market is changing, Sanjeev Mohan, Principal of SanjMo and former Gartner Research VP, Data and Analytics, will join us for a webinar on September 5, Unpacking the Market Guide to DataOps Tools.

Learn more and register now

Did you know...

... that we published a new Knowledge Base article on "How to Mix and Match Database Roles with Account Roles".

Learn more

Enjoy the rest of August and until next time
- the DataOps.live Product Team
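The dbt Cloud orchestrator above runs as a regular job inside a DataOps pipeline. As a rough sketch only – the job name, image tag, and variable names below are illustrative assumptions rather than the documented interface – a trigger job could look like:

```yaml
# Hypothetical DataOps pipeline job that triggers a predefined dbt Cloud job.
# Image and variable names are assumptions; see the dbt Cloud orchestrator
# documentation for the real keys.
"Trigger dbt Cloud Job":
  extends:
    - .agent_tag
  stage: "Data Transformation"
  image: dataopslive/dataops-dbtcloud-orchestrator:5-stable   # assumed image name
  variables:
    DBT_CLOUD_ACCOUNT_ID: "12345"         # your dbt Cloud account (placeholder)
    DBT_CLOUD_JOB_ID: "67890"             # the predefined dbt Cloud job to run (placeholder)
    DBT_CLOUD_ENVIRONMENT: "PROD"         # assumed key for the PROD/STAGE/DEV mapping
  script:
    - /dataops
```

The PROD/STAGE/DEV mapping mentioned above would then follow from which DataOps.live environment the pipeline runs in.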


DataOps.live Product Newsletter | June 2024

Hi DataOps.live user,

Welcome to our June newsletter. What an exciting month of announcements! Snowflake Summit in San Francisco ended recently, and, of course, we are making many of the announcements immediately available to you. Enjoy.

Join the DataOps Summer Launch Webinar July 11

DataOps.live named Snowflake AI Data Cloud Product Growth Partner of the Year

DataOps.live was recognized by Snowflake for helping joint customers automate, orchestrate, observe, and deploy data products on the Snowflake AI Data Cloud to supercharge their data engineering team productivity by 10X while achieving up to a 60% reduction in cost.

Learn more...

❄️ Snowflake Summit 24 for DataOps users

Many exciting announcements were made at the Snowflake Summit in June 2024. Let's group them into five primary areas: enterprise AI, enterprise ML, native apps, catalog and data lake, and data governance. Many are now generally available or in public preview – ready for you to use during your data operations projects.

Let's start with enterprise AI. Centered around Snowflake Cortex, these capabilities enable you to use generative AI with large language models. Fully hosted in Snowflake, your valuable data never leaves Snowflake. Whether you want to build a chat application with the SQL COMPLETE function, create a fine-tuned model to reduce cost and improve model quality, or summarize and describe unstructured data – you can use them in DataOps immediately. Leverage them when you use MATE to build your tables, or call them in Streamlit from Python as a UDF.

Let's continue with enterprise ML. Snowflake offers forecasting and anomaly detection on time series data in its set of ML functions. Leverage them in your standard SQL as alternative ways to create insights from your harmonized data. New with Summit is the Snowpark pandas API. Since it lets you run your existing pandas code with only a few changes, you can now use it with the Snowpark orchestrator and run it natively in a DataOps pipeline.
Extremely exciting for all builders of Snowflake Native Apps was the announcement that they are now fully integrated with Snowpark Container Services. If you want to learn more about how to build your own, we invite you to watch our recent webinar again.

📺 Building a Cortex-Powered Snowflake Native App in 10 Minutes (47min)

🖥️ By Engineers for Engineers - DataOps How-to videos

In this month's video, let's review our best practices for configuring your DataOps Runner effectively to run reliable and efficient data pipelines. We have written a couple of articles on this topic, which you can find on our community portal:

Why are my DataOps.live pipelines running slowly?
Troubleshooting and Preventive Measures for DataOps Runner Errors
More community articles with the [Runner] tag

✨ DataOps.live Assist is accessible to all!

Our copilot has matured and now supports a wide variety of use cases. To celebrate this, we are now making Assist available to all users by default.

Enjoy summaries and descriptions of your merge requests and accelerate your development tasks in Develop. Have it generate documentation, dbt SQL models, Snowpark Python code, or even a Streamlit application. If you're not quite ready for this feature yet, contact our support team to opt out.

ICYMI - Frank Bell - #TrueDataOps Podcast Ep 36

Frank Bell, a Snowflake Data Superhero and Evangelist, talks about his journey from the Air Force to becoming a top Snowflake partner, sharing lessons from his extensive experience in data operations and his take on Snowflake's transformative impact on the industry.

Frank and host Kent Graziano discuss:

The critical role of DataOps in managing the rapid pace of innovation, especially with the rise of AI and machine learning
The concept of data products and DataOps' impact on delivering reliable, scalable solutions
The need for automated monitoring and testing to maintain data quality
And more!
📺 Watch Episode 36 here
🎧 Listen to Episode 36 here

Snowflake Object Lifecycle Engine (SOLE) for data products is generally available

Starting with the May 2024 monthly release, we are proud to make SOLE for data products generally available.

To start, decide on the granularity of grouping for each Snowflake object. You can group objects per schema, group objects based on business logic, or keep it simple and put each object into a single file.

Once you have defined how you want to manage the granularity of your objects:

Introduce a SOLE project file dataops_config.yml
Split your large database.template.yml into smaller files

Learn more...

It's that time of year! The Summer 24 Product Launch webinar is coming up soon. We'll share all the new features and show you how to use DataOps.live Assist to create a Streamlit application. Click the button below to learn more and save your spot.

Join the DataOps Summer Launch Webinar July 11

That's all for now and until next time.
- the DataOps.live Product Team
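To make the SOLE-for-data-products migration steps above concrete, here is a hypothetical sketch of splitting one large database.template.yml into per-schema files. The file layout, object names, and keys are illustrative assumptions; the SOLE for data products documentation defines the real structure.

```yaml
# Before: everything in one large file.
#   dataops/snowflake/database.template.yml
#
# After (illustrative layout – exact paths may differ):
#   dataops/snowflake/dataops_config.yml      <- new SOLE project file
#   dataops/snowflake/analytics_schema.yml    <- objects grouped per schema
#   dataops/snowflake/staging_schema.yml

# analytics_schema.yml – one schema and its child objects in their own file
databases:
  "{{ env.DATAOPS_DATABASE }}":
    schemas:
      ANALYTICS:
        tables:
          DAILY_REVENUE:          # hypothetical table
            columns:
              DAY:
                type: DATE
              REVENUE:
                type: NUMBER(38,2)
```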


DataOps.live Product Newsletter | May 2024

Hi DataOps.live user,

Welcome to our May newsletter. Don't miss the chance to meet with us at Snowflake Summit - booth #1332 - and chat about all things DataOps!

By Engineers for Engineers - DataOps.live How-to Videos

In this month's video, we demonstrate why DataOps Develop is such a great developer environment. The DataOps development environment is a ready-to-code environment that follows the basic principles of continuous development. It gives you a highly optimized data development experience for key DataOps use cases. The video below will show you how to connect to your Snowflake environment, develop models, and quickly test your changes. These are only a few of the many applications for DataOps Develop! If you want to learn more about DataOps Develop, please get in touch! Click on the image to see the video:

For more information on Develop, please visit our documentation page!

Massively better data.world orchestrator

We updated our data.world Catalog orchestrator. Enjoy more complete metadata capture and much-improved data and system lineage, requiring less manual stitching in the catalog and correctly allocating lineage to the performing system.

Learn more...

Build an AI Native App in 10 Minutes (Really!)

May 28 | Live session

Join two Data Superheroes to learn how to build and launch a Cortex-powered Snowflake native app in less than ten minutes. And just to spice things up, we'll include a Streamlit app. And some Snowpark. And Snowpark Container Services and some UDFs for good measure. And we'll do it in 10 minutes or less - seriously.

Most importantly, this won't be a 'quick hack' resulting in technical debt. It will be a quality native app built from scratch and deployed in a highly governed, secure, and repeatable process.

#TrueDataOps Podcast

Frank Bell & Stewart Bryson - #TrueDataOps Podcast Ep.36

Welcome to another episode of the #TrueDataOps Podcast! On May 29th, we're excited to host Frank Bell and Stewart Bryson.
Discover what happens when the Data Warrior takes on two Data Superheroes.

Visit us at Snowflake Data Cloud Summit 2024

DataOps.live is a proud Green Circle partner for Snowflake Data Cloud Summit 2024, happening in San Francisco, June 3-6. Stop by our booth #1332 to chat about everything DataOps. Book a meeting.

Did you know...

... that we simplified writing large statements for Snowflake functions or procedures, including full editor support in Develop, using the new !dataops.include directive.

Learn more: for procedures | for functions

That's all for now and until next time
- the DataOps.live product team
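As a rough sketch of the !dataops.include directive mentioned in the May newsletter above – the object keys, names, and file path here are illustrative assumptions, so check the linked documentation for the exact configuration:

```yaml
# Hypothetical SOLE configuration for a stored procedure whose long body
# lives in its own .sql file instead of being inlined in the YAML.
procedures:
  UPDATE_CUSTOMER_METRICS:        # hypothetical procedure name
    language: SQL
    returns: VARCHAR
    execute_as: OWNER
    # The directive pulls in the file contents, so Develop can give you
    # full SQL editor support while you write the statement.
    statement: !dataops.include sql/update_customer_metrics.sql
```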


DataOps.live Product Newsletter | April 2024

Hi DataOps.live User,

Welcome to our April newsletter. Explore our AI-powered copilot, DataOps.live Assist, which will transform your DataOps experience. Plus, don't miss the chance to connect with us at Snowflake Data Cloud Summit 2024, booth #1332, and chat about all things DataOps! Book some time with our Snowflake Data Products experts here.

Unlock the full potential of DataOps.live Assist

Looking to simplify your SAP data workflows? Look no further than our latest blog posts and videos that show the power of DataOps.live Assist in simplifying your SAP data processes.

Building SAP Data Tables in Snowflake Made Simple with DataOps Assist: Learn how DataOps.live Assist takes the complexity out of creating SAP data tables in Snowflake, making it easier than ever to set up your data environment.

Generate SAP Insights on Snowflake Instantly with DataOps.live Assist: Discover how DataOps.live Assist empowers you to generate SAP insights on Snowflake in seconds, combining artificial intelligence and human intuition to bridge the gap between your data and comprehension.

Read these insightful articles and watch the included videos to unlock the full potential of your SAP data workflows with DataOps.live Assist.

We're also thrilled to introduce a new technical video series, by data engineers for data engineers! Please watch the first in the series, showcasing how to use the DataOps.live Assist Chat feature.

🚀 NEW SERIES 🚀 DataOps.live How-to Short Videos

Have questions? Add your questions and comments on our community forum.

Community forum

Try out SOLE for Data Products, now in public preview

Our new framework, SOLE for Data Products, is ready for you to test and see the difference it makes in your data journey, especially with the recent support of hybrid tables and virtual columns. Experience a more fluid approach to structuring your objects with SOLE for Data Products.
With its new structure, child objects need not be confined to the same file as the parent's configuration. Instead, each child object references its parent object's specifics, allowing for separate file configurations. This flexibility enables hierarchical object configurations in distinct files to match your organization's needs. Give it a go in the public preview and unlock the full power of your data journey. Check out the SOLE for Data Products documentation for more information.

Multi-line SQL in Snowflake Orchestrator and SOLE Hooks

Now, with our Snowflake Orchestrator, you can write and run longer SQL queries across multiple lines by simply giving "dataops-snowsql" a variable that points to the .sql file path. Or configure SOLE (Snowflake Object Lifecycle Engine) to execute all the hooks from a group in a shared Snowflake session by setting the "auto_commit" property to "False".

Community

ICYMI - #TrueDataOps Podcast - Santona Tuli

In this episode of the #TrueDataOps podcast, host Kent Graziano talks with Santona Tuli, a physicist turned data scientist. Santona discusses her transition from analyzing nuclear collisions at CERN to her role in the tech industry, emphasizing the challenges and innovations in managing large-scale data. She explores the concept of data products and the importance of a product mindset in data operations, stressing the need to understand user needs and maintain service level agreements. The episode also covers the role of AI in automating data processes and the necessity of continuous monitoring and testing to ensure operational efficiency and reliability.

📺 Watch here
🎧 Listen here

In this episode of the #TrueDataOps podcast, host Kent Graziano talks with Carsten Bange, CEO of BARC, about the role of continuous processes in data operations and the importance of data products. Carsten emphasizes the challenges in operationalizing data projects and the need for automation to manage data complexities.
He also discusses the significance of a solid data culture and ethical data management, particularly in light of advances in generative AI. The conversation highlights how these elements are crucial for effective data practices in today's rapidly evolving tech landscape.

📺 Watch here
🎧 Listen here

Upcoming! Koen Verheyen - #TrueDataOps Podcast Ep.34

Welcome to another episode of the #TrueDataOps Podcast! On May 1st, we're excited to host Koen Verheyen, a seasoned expert with a remarkable 20-year journey across various pivotal roles in the IT and data management sectors. Koen has evolved from development to technical architecture and solution architecture, now stepping into a crucial role as VP of Quality Assurance at VaultSpeed. Prior to his current position, Koen was a BI/DWH consultant involved in all phases of large business intelligence projects, from scoping and design to deployment. His expertise spans several industries, including telecommunications, retail, direct marketing, and banking.

RSVP Now

Visit us at Snowflake Data Cloud Summit 2024

DataOps.live is a proud Green Circle partner for Snowflake Data Cloud Summit 2024, happening in San Francisco, June 3-6. Stop by our booth #1332 to chat about everything DataOps. Book a meeting.

Did you know...

... that we've got some handy troubleshooting tips in our latest Knowledge Base articles? Learn how to fix common Git errors like missing permissions for specific operations and issues while trying to commit changes to the repository in DataOps.live Develop. These resources will help you overcome obstacles and keep your development process running smoothly.

Troubleshooting Git Error in Develop: Missing "read_repository" Permission
Troubleshooting Git Commit Issue in DataOps.live Develop

That's all for now and until next time.
- the DataOps.live Product Team
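A quick sketch of the April items above on multi-line SQL and SOLE hooks. The job name, image tag, variable name, and hook keys are illustrative assumptions; consult the Snowflake orchestrator and SOLE hooks documentation for the supported configuration.

```yaml
# Hypothetical pipeline job: point the Snowflake orchestrator at a .sql
# file instead of inlining a long multi-line query.
"Run Long Query":
  extends:
    - .agent_tag
  image: dataopslive/dataops-snowflake-orchestrator:5-stable  # assumed image name
  variables:
    DATAOPS_SNOWSQL_FILE: $CI_PROJECT_DIR/sql/long_query.sql  # assumed variable name
  script:
    - /dataops

# Hypothetical SOLE hook group: run all commands in one shared Snowflake
# session by disabling auto-commit.
database_level_hooks:
  post_hooks:
    - auto_commit: "False"
      commands:
        - "INSERT INTO AUDIT_LOG VALUES (CURRENT_TIMESTAMP, 'deploy complete')"
        - "CALL REFRESH_METRICS()"
```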


DataOps.live Product Newsletter | March 2024

Hi DataOps.live User,

Welcome to our March newsletter – discover a better way of managing data products with the power of Gen AI.

Build Data Products with DataOps.live Create and DataOps.live Assist

We're excited to launch DataOps.live Create! Join our public preview to be among the first to explore this new app and share your valuable feedback. With DataOps.live Create, you can:

Build Data Products in minutes with a few clicks, requiring no prior setup
Use the power of Gen AI to instantly iterate on and refine your datasets
Create value for your organization by ensuring reliable, scalable, secure, accessible, and discoverable Data Products

Accelerate and simplify your Data Product development cycle with automation and collaboration. Try DataOps.live Create now! Not ready yet? Learn more...

Run, Debug, and Test DataOps Projects in DataOps.live Develop

On April 6th, DataOps.live changes the default choice for working with your projects from the WebIDE file editor to the fully-fledged development environment DataOps.live Develop. Develop is a powerful tool that automates and simplifies your development tasks. With Develop, you can do everything you did with WebIDE and more. It's designed to make building Data Products smoother and more efficient. Learn more about these tools in the DataOps.live Develop and DataOps WebIDE documentation.

Community

Join us for our next #TrueDataOps podcast episode with Santona Tuli, Ph.D., a distinguished data science and machine learning expert. In this episode, we will explore the frontiers of data operations, machine learning, and the technology shaping the future of data-driven decision-making.

RSVP now

Sequence Ordering in SOLE

If you are using sequences in Snowflake to generate unique numbers, our latest enhancement to the Snowflake Object Lifecycle Engine (SOLE) will interest you. We've introduced a new ordering parameter for the Sequence object in the latest SOLE update.
This addition lets you specify the order in which new Sequence values are generated, offering greater control and flexibility in your data operations. Curious to learn more? Read the updated Sequence object documentation.

Data Universe April 10-11, NYC

Will you be attending Data Universe in NYC, April 10 & 11? If so, we would love to have you join us at our after-party event at Chez Zou from 6-8pm, co-hosted by Satori.

RSVP now

Did you know...

... that we've revamped our Getting Started Guide, showcasing the robust capabilities of DataOps.live to help you build Data Products with simplicity and speed using DataOps.live Create and DataOps.live Develop. Follow the Getting Started documentation to build your first Data Product in minutes.

That's all for now and until next time.
- the DataOps.live Product Team
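The new ordering parameter described in the March newsletter above might look like this in a SOLE configuration – treat the keys and values as illustrative assumptions and confirm them against the Sequence object documentation:

```yaml
# Hypothetical SOLE sequence configuration using the new ordering parameter.
sequences:
  ORDER_ID_SEQ:            # hypothetical sequence name
    comment: "Generates unique order IDs"
    initial_value: 1
    increment: 1
    ordering: ORDER        # assumed values ORDER | NOORDER, mirroring Snowflake's CREATE SEQUENCE
```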


DataOps.live Product Newsletter | February 2024

Hi DataOps.live User,

Welcome to our February Product Newsletter.

Not creating new Data Products in 10 minutes yet? Join us March 14 to see how!

On March 14, you will experience how you can create new Data Products with simplicity and speed. Discover how the new DataOps.live Create helps you capture business requirements. Then learn how to use our new Assist AI copilot in the DataOps Development Environment (DDE) to rapidly create new datasets and Data Products. Use all your data already in Snowflake and build out a new dataset. Import your existing dbt Core project and refine it. Always assess your data quality for data sources and shared datasets. Establish trust with your business stakeholders.

And that's not all. If you are new to DataOps, don't miss the launch of our DataOps.live Professional Edition, designed especially for teams of up to 10 data engineers and providing all the foundational features to turn your dbt Core projects into trusted Data Products.

Register Now

Start Using dbt Core 1.7

Get ready for an early look at what's to come! Support for dbt 1.7 is just a taste of the exciting features we have planned for our upcoming Orchestrator release in early March. We are introducing dbt Core version 1.7 support across our transformation orchestrator MATE, the MATE packages, and the DDE. This enhancement takes your DataOps experience to a new level. Using dbt 1.7 in your DataOps projects unlocks a range of exciting updates and performance enhancements, including:

Support for Snowflake's dynamic tables
Allowing freshness to be determined via DBMS metadata for supported adapters
Detecting breaking changes to enforced constraints
... and many more

Explore the MATE Orchestrator, MATE Packages, and DataOps Development Environment documentation to learn more.
Monitor your DataOps pipelines by name

Starting this week, you can filter pipelines by their pipeline name, typically "full-ci.yml". This exciting enhancement lets you monitor, analyze, and act on pipeline runs for a specific pipeline filename rather than just the last commit message on the branch. Plus, if you want even more control, you can set a custom pipeline name to personalize your search to meet your needs. The only limitation is that we can only cover pipelines executed after February 24, 2024. Learn more at the Monitoring Pipeline Executions documentation.

Watch the replay

Community

Did you hear that Snowflake has standardized on DataOps.live? On last week's #TrueDataOps podcast, Vernon Tan, Senior Manager, frostbyte Industry Solutions, Snowflake, and Robert Guglietti, Solution Development Manager, Industry & Technical Innovation, Snowflake, fascinated the crowd with how Snowflake has licensed the DataOps.live platform to prepare and deliver technical sales demonstrations for its global customers. Read the press release and find the link to the replay below.

Watch episode

Did you know...

... that creating Functions and Procedures just got more flexible. You're not limited to inputting raw SQL/JavaScript/Java/Python code directly. You can also read one or more files with a Python handler using imports. The "handler" parameter specifies which function or procedure to call, while the "imports" parameter imports the module. This helps keep your code organized while enjoying enhanced flexibility in your development process. Check out the details in the Function and Procedure documentation.

That's all for now and until next time.
- the DataOps.live Product Team
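A hypothetical sketch of the handler/imports combination described above – the object name, stage path, and exact keys are assumptions made to illustrate the idea; the Function and Procedure documentation has the authoritative configuration:

```yaml
# Hypothetical SOLE configuration: a Python function whose code lives in a
# staged file rather than being inlined in the YAML.
functions:
  CLEAN_EMAIL:                       # hypothetical function name
    language: python
    runtime_version: "3.10"
    returns: VARCHAR
    arguments:
      RAW_EMAIL:
        type: VARCHAR
    handler: email_utils.clean       # module.function to call
    imports:
      - "@MY_STAGE/email_utils.py"   # hypothetical stage file holding the module
```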


DataOps.live Product Newsletter | January 2024

Hi DataOps.live User,

Explore our January newsletter to uncover exciting new features designed to improve your development workflow.

Snowflake Snowpark Container Services

We're excited to announce support for the latest Snowflake feature, Snowpark Container Services, which is now available in public preview and enabled by default for all accounts! This new addition significantly enhances DataOps.live in the Snowflake ecosystem, expanding its capabilities beyond traditional data warehousing to arbitrary workloads. You can deploy a wide range of applications, including AI model training and inference powered by Nvidia GPUs. Use DataOps to build your application images and register and deploy them to the Snowpark Container Registry runtime. Once completed, the new container jobs, functions, or services are available to your DataOps pipeline for Snowflake. This integration brings increased flexibility and efficiency in managing diverse workloads, ranging from model training to data applications.

Learn more at Using Snowpark Container Services with DataOps.live

AI Assist in DataOps.live Create

In our last newsletter, we shared exciting news about DataOps.live Assist, your AI-powered copilot for our data product platform, and DataOps.live Develop, the environment transforming development workflows for a smoother experience with automation and collaboration benefits. Now, as we start 2024, more exciting news is on the horizon!

With the upcoming launch of DataOps.live Create, we are introducing the new Assist chat feature with real-time help. Get instant responses to your queries while creating data products, troubleshoot challenges, and maximize your DataOps productivity. Assist integrates into the Data Product creation workflow, guiding users in building models by suggesting code based on natural language prompts. It understands business problem descriptions, offering transformational model suggestions.
Users can ask for more suggestions until satisfied, and then accept and tweak the code as needed before committing it to the git repository. Stay tuned for updates leading up to the launch!

AI Assist Chat Feature Demo

Community

We are happy to announce that the winter season of the #TrueDataOps Podcast series is back on the air! Join us for the next episode with a hands-on DataOps implementor and architect, Ronny Steelman, CEO of the consultancy Quadrabtye.

RSVP Now

ICYMI

For a true masterclass on DataOps, please watch the latest edition of the "It Depends" podcast with host Sanjeev Mohan and our Co-founder, Guy Adams.

Watch the Episode

Did you know...

... that SOLE, our powerful engine for managing your Snowflake ecosystem, and SOLE for Data Products now fully support running Python stored procedures. Developers can write and run stored procedures using Python syntax and leverage third-party libraries, enabling them to use Python's capabilities for data manipulation, analysis, and other tasks. Learn more at the Snowflake stored procedures documentation.

Have you tried Spendview?

Take control of your Snowflake spend: get complete transparency into your Snowflake Data Cloud usage and spending, and reduce costs by 15% or more.

Get it FREE (forever)

That's all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | December 2023

As the year ends, we'd like to express our gratitude for your continued support. Wishing you a joyful holiday season filled with warmth, laughter, and cherished moments with your loved ones!

Hi DataOps.live User,

In this December newsletter, discover our latest updates and exciting new features.

Transform your workflow with DataOps.live | Assist

Elevate your data journey with DataOps.live Assist – your AI-powered copilot for your data products. Assist generates insightful merge request (MR) summaries, enhancing reviews and ensuring quick decisions. It describes complex change requests, helping data product owners quickly understand and approve MRs. Assist goes beyond MR review, supporting engineers in model development and analyzing data pipeline failures. It simplifies tasks and boosts collaboration for every team member.

Join our private preview by contacting us at privatepreview@dataops.live and make your data work smarter with Assist. Discover more at the DataOps.live Assist documentation.

DataOps.live | Develop

If you haven't taken your data development experience to the next level with DataOps.live Develop, now's the perfect time to do so! The DataOps development environment changes the development workflow to remove friction and improve your experience with automation and collaboration benefits. DevReady, the cloud-based deployment model of the development environment, streamlines the development process and provides you with the flexibility and scalability of cloud computing resources.

To let DevReady automatically work with your DataOps project, open your project in our platform and select DataOps.live | Develop to switch from the code editor to the development environment. DevReady has a built-in tool, DataOps SOLE Validate, that checks your SOLE configuration locally before starting a SOLE pipeline. This proactive approach helps you catch potential configuration issues early, saving you time during development and pipeline execution.
To use this tool, simply open your project in DevReady, click the DataOps.live icon in the left sidebar, and select DataOps SOLE Validate. The tool will automatically validate your resource configurations and show the results on the console.

Community

#TrueDataOps Podcast is on holiday break! But in case you missed some of our recent episodes - including Juan Sequeda, Principal Scientist & Head of AI Lab, data.world, and Cindi Howson, Chief Data Strategy Officer, ThoughtSpot (just to name a few) - make sure to catch up and subscribe to stay up to date on our latest guests and topics.

Upcoming! Sonny Rivera - #TrueDataOps Podcast Ep.28

Join our next episode on Jan. 24, 2024 with Snowflake Superhero Sonny Rivera! Sonny is a technology executive with deep experience in using internet technologies to improve business processes and enhance the business value chain. He is currently focused on migrating an on-premises data warehouse to the Snowflake Cloud Data Warehouse and building advanced analytics. In his own words, "Over the last few years, my team and I have designed and developed innovative Data as a Service products that generate millions of dollars in revenue annually. We have also integrated our data assets into additional products and services that Randall-Reilly offers."

RSVP now

Did you know...

... that SOLE, our powerful engine for managing your Snowflake ecosystem, now supports loading data with Snowpipe to a table using the COPY INTO select query of the Pipe object.

Learn how to use it

And lastly, hear your fellow DataOps.live customer DigiKey describe their cloud migration journey, including real-world examples and use cases for their Data Product goals.

Download now

That's all for now. Until next time,
- the DataOps.live Product Team
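A sketch of the Snowpipe support mentioned in the December newsletter above – the object names and YAML keys are illustrative assumptions; the linked SOLE documentation defines the real schema:

```yaml
# Hypothetical SOLE Pipe object that continuously loads staged files
# into a table via Snowpipe using a COPY INTO ... FROM (SELECT ...) query.
pipes:
  RAW_EVENTS_PIPE:                  # hypothetical pipe name
    auto_ingest: true               # load files as they land on the stage
    copy_statement: |
      COPY INTO RAW_EVENTS
      FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @RAW_EVENTS_STAGE)
      FILE_FORMAT = (TYPE = 'JSON')
```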


DataOps.live Product Newsletter | November 2023

Hi DataOps.live User,

Welcome to the November newsletter! Explore our latest updates and exciting new features.

Sign Up for DataOps.live | Assist

After announcing DataOps.live Assist in October, we are happy to invite a select group of users to join our private preview. This allows you to be the first to experience the new app and share valuable feedback that will directly influence its development. DataOps.live Assist:

Simplifies the creation of SQL or Python data models with AI suggestions
Increases your productivity with suggested data transformations
Suggests initial project documentation for review
Offers explanations for current queries
Automatically summarizes and reviews merge requests for correctness
Offers additional reassurance for Data Product Owners to approve and promote changes to production safely
And much more...

Join our exclusive preview by reaching out to us at privatepreview@dataops.live and experience how to make the daily lives of data engineers easier.

Fully Released Orchestrators: VaultSpeed and Python 3.11

The VaultSpeed and Python 3.11 orchestrators are now generally available and accessible to all customers. Use the former to deploy the data vault you have designed in VaultSpeed into a Snowflake database and the latter to run Python 3.11 scripts in your DataOps pipelines. Discover more at VaultSpeed Orchestrator and Python Orchestrator.

Observability User Management

With the new Spendview User Management app, you can easily share metrics on Snowflake accounts' spending with people in your organization. This app offers streamlined user control with role-based access, an intuitive interface, and enhanced security. Experience the convenience of the Spendview User Management app now by accessing it directly from your Spendview dashboard. Discover more at Managing Spendview Users.

DataOps CLI

We now support the Snowflake object types "task" and "row access policy" in the SOLE Generator.
We've also added new commands for working inside the DataOps development environment. The new commands let you create and validate data product manifests and run SOLE and MATE helpers. Discover more at SOLE Generator and CLI Commands.

Community

Join us with our next guest, Matt Aslett, VP and Research Director at Ventana Research. Matt is a highly experienced tech sector analyst and researcher with particular expertise in data and analytics and a focus on databases, data warehousing, data streaming, data integration, BI, data science, and big data, as well as data culture, data literacy, and hybrid cloud data processing. He was shortlisted for IIAR Analyst of the Year 2022. RSVP today!

Did you know...

...that we've introduced a new App switcher for the data product platform? You can seamlessly toggle between apps and learning systems from DataOps.live. Here's what you can do:

- Switch between apps and systems with a single click for effortless navigation
- Jump between functionalities seamlessly for enhanced productivity
- Enjoy a unified platform for all your needs for an integrated experience

That's all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | October 2023

Hi DataOps.live User,

Welcome to the October newsletter! Explore our latest updates and exciting new features.

Announcing DataOps.live | Assist

We're happy to introduce a powerful capability to the DataOps.live product platform – our AI-powered Copilot, DataOps.live Assist. This solution brings advanced intelligence and automation to elevate your DataOps experience and boost productivity in your day-to-day tasks. Register now for our launch webinar on Nov 2nd, where we'll share all the details! And learn more from our press release HERE.

MATE with dbt 1.5

Get ready for an exciting upgrade! We're rolling out dbt version 1.5 as the default for our Transformation (MATE) orchestrator and MATE packages, taking your DataOps experience to a new level. Using dbt 1.5 in your DataOps projects unlocks a range of exciting updates and enhanced performance. Discover more about the MATE Orchestrator and MATE Packages.

Python 3.11 Orchestrator

We've introduced the Python 3.11 Orchestrator, currently available in private preview. You now have the option to choose between running on Python 3.8 or 3.11. No matter the version, the flexibility provided for your project stays the same. You can run a single script or fully fledged applications that pull in third-party dependencies. The choice is yours. Stay tuned for the official release in one of the next stable releases. Check out the details of Python Orchestrators.

Web IDE Revamp: A Fresh New Experience

The new Web IDE interface arrives on November 11! We're rebuilding our current Web IDE on top of Visual Studio Code. But rest assured, there will be no changes to the underlying functionality itself. Your familiar workflows and procedures stay unchanged. You can continue to use the Web IDE in the same way as before: to write, edit, and manage code and files directly in a web browser. The only difference is a refreshed interface for improved usability.
If you're looking to extend your development beyond code writing and file editing, look no further! Our ready-to-code environment DataOps.live | Develop offers a set of tools and features to support every stage of the development lifecycle. See the following section for more information.

DataOps.live | Develop

Our DataOps Development Environment has been rebranded to DataOps.live | Develop and is making great progress in its private preview phase. We'll transition it to a public preview soon. This means even more users will have the chance to explore its capabilities and provide us with feedback. With the public preview, you'll experience a seamless, accelerated development process that puts automation and collaboration first, enhancing your experience across key development use cases. Explore all details at DataOps Development Environment.

Community

Cindi Howson - #TrueDataOps Podcast Ep.23

Join us for our next #TrueDataOps Podcast with Cindi Howson, Chief Data Strategy Officer at ThoughtSpot, host of The Data Chief podcast, and Women in Data's Data Leader of the Year 2021. Cindi is a BI, data, and analytics expert. Formerly VP Data and Analytics at Gartner, she also founded BI Scorecard, is the author of several books, and serves on the board of the Rutgers University Big Data Certificate program. RSVP today!

Did you know...

...that we'll empower your Observability experience with user management and access control? This ensures seamless team collaboration when monitoring and analyzing operational metadata across your entire ecosystem. Existing users of Spendview for Snowflake will need to reset their passwords. We're committed to guiding you through every step. Expect follow-up communication with all the necessary steps to take. Stay tuned for the imminent release!

That's all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | September 2023

Hi DataOps.live User,

Welcome to the September newsletter! Find out how our latest developments enhance your DataOps experience. Streamline Snowflake setup and save time managing environments. Test new features with reduced risk. And more. Read on.

Snowflake Environment Management

In DataOps.live, creating pipelines involves combining various jobs, like those in the DataOps Reference Project. Each job has a specific role, and our powerful Snowflake Object Lifecycle Engine (SOLE) ensures everything works smoothly. We've added more optional jobs to the Reference Project for managing Snowflake development environments, making it much easier to set up and remove such environments. We've also introduced execution rules to handle Snowflake in different setups. And no worries, all your existing jobs still fit right in. Want to know more? Check out the details of these new Reference Project Jobs.

Setup Wizard for Development Environment

Start an even smoother DataOps experience! We've launched a new setup wizard to help you configure your Snowflake connection for the web-based DevReady and desktop-based DevPod development environments. This step-by-step walkthrough gives clear instructions on selecting the dbt version for your workspace and setting up your Snowflake credentials. Explore the details of this handy new feature for DevReady and for DevPod.

Bundled Feature Flag

Bundled feature flags offer an efficient method for integrating and testing new features with reduced risk. We've introduced a DataOps.live bundle that includes various behavioral refinements in your development process. These modifications are integrated into an optional bundled feature flag, giving you the freedom to choose and customize your experience. Check out the details of this Bundled Feature Flag.

Enhanced DataOps Template Project

Our latest template project, DataOps Standard Template V2, kickstarts your project with the optimal structure and default settings.
Projects built on this template lay the foundation for heightened developer productivity in DataOps.live. Check out Creating a Project from a Template for effortless project initialization and a seamless launch.

Community

Season 2 of the #TrueDataOps podcast is here! Make sure to sign up for our newsletter to stay up to date: https://www.dataops.live/truedataops-podcast

Pipeline Resilience Using Multiple Pull Policies

On the 28th of September, Docker experienced degraded performance for some of their components, which may have affected your pipelines. You can find a write-up on how to mitigate such issues here: Enhancing DataOps Pipeline Resilience with Multiple Pull Policies. Please also subscribe to our Status Page for updates on the performance of our platform: https://trust.dataops.live/

Did You Know...

...that we've made some big changes to our Documentation site? These updates aim to improve navigation and functionality. Here's what's new:

- We've restructured the topics in the navigator based on the core principles of DataOps.live, using section separators for clarity
- We've also made some cosmetic tweaks to improve the overall look and feel

Hope you find these changes helpful!

That's all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | August 2023

Hi DataOps.live User,

Welcome to the August newsletter! Find out how our latest developments enhance your DataOps experience. Save time investigating Snowflake spend. Make the most of DDE. Enhance reliability and fault tolerance. And more. Read on.

Enhancements in Spendview for Snowflake

Compute is often the top cost area for optimization related to a Snowflake warehouse, where the most common problems are query performance and idle warehouses. Spendview for Snowflake gives business stakeholders and data team leads awareness of their current resource usage. Spendview provides big saving opportunities in a matter of clicks, as it empowers your data team with self-service Snowflake consumption and spending metrics. Sign up now for our live webinar on optimizing your Snowflake spend and join the DataOps.live and XponentL Data discussion on how you can perfect your budget by following best practices and cost-saving strategies.

Optimize the Potential of DataOps Development with dbt

Make the most of the DataOps Development Environment (DDE) by using the latest dbt versions for better production efficiency. DDE is most effective when used with recent dbt versions, which significantly boost development speed. In DDE, specify the version by setting the DATAOPS_FEATURE_MATE_DBT_VERSION environment variable, for example, DATAOPS_FEATURE_MATE_DBT_VERSION=1.4. Read more... In the DataOps pipeline, set the variable in your project's variables.yml file. Sign up for the DDE private preview by reaching out to our Support team.

Enhancements in SOLE Generator

Managing your existing Snowflake databases through the DataOps.live platform just got a bit easier. Besides supporting database, function, procedure, schema, table, and task, we now also support sequence, tag, stream, and file format. Start using the SOLE Generator CLI (command-line interface) tool that helps you generate a YAML configuration for SOLE objects.
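For pipelines, the dbt version is picked up from the project's variables.yml, as noted above. A minimal sketch follows; only the DATAOPS_FEATURE_MATE_DBT_VERSION variable and value come from this newsletter, while the surrounding file structure is an assumption and may differ in your project:

```yaml
# variables.yml — hypothetical minimal sketch
variables:
  # Pin MATE to dbt 1.4 for all jobs in this project's pipelines
  DATAOPS_FEATURE_MATE_DBT_VERSION: "1.4"
```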
Community

#TrueDataOps Podcast Season 2 is here! Join us with our special guest, Bob Muglia, former Chief Executive Officer of Snowflake. We will cover his new book, The Datapreneurs (https://www.thedatapreneurs.com/), the new age of AI, and all things data! Make sure to add it to your calendar. You do not want to miss this episode! RSVP Here

Academy

The latest course - Fundamentals of DataOps.live - just went live at our Academy. In this course, we go step by step through the process of ingesting a new data source using our Stage Ingestion Orchestrator. Topics covered include creating multiple objects with SOLE, autogenerating sources from ingestion tables, creating MATE jobs to run only specified models, creating a new pipeline -ci.yml file, and more. Enroll today!

Did You Know?

...that you can incorporate a fallback job that retries your main job with a different container image or alternative configuration adjustments? By incorporating a fallback job into a pipeline, developers and data engineers can enhance the reliability, robustness, and fault tolerance of the system. Read more at Create Fallback Jobs.

That's all for now. Until next time,
- the DataOps.live Product Team
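The fallback-job idea above can be sketched in GitLab-CI-style pipeline syntax, which DataOps pipelines use. The job names, stages, images, and scripts here are illustrative assumptions, not the documented Create Fallback Jobs configuration:

```yaml
# Hypothetical fallback-job sketch; all names and images are assumptions.
stages:
  - ingestion
  - ingestion-recovery

Ingest Data:
  stage: ingestion
  image: primary-runner-image:stable
  script:
    - run-ingestion.sh            # placeholder for the main job's logic

Ingest Data Fallback:
  stage: ingestion-recovery       # a later stage, so it can react to the failure
  image: alternative-runner-image:previous   # retry with a different container image
  when: on_failure                # runs only if a job in an earlier stage failed
  script:
    - run-ingestion.sh
```

Placing the fallback in a later stage is what lets `when: on_failure` trigger it only after the main job fails.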


DataOps.live Product Newsletter | July 2023

Hi DataOps.live User,

Welcome to the July newsletter! Gain insights into how the important developments announced at Snowflake Summit 2023 may greatly benefit DataOps.live projects and operations. Enjoy collaborating with your DataOps peers on our Community platform. Read on.

Snowflake Summit Announcements and DataOps

Our powerful Snowflake Object Lifecycle Engine (SOLE) and Modelling and Transformation Engine (MATE) help data engineers automate and streamline the steps in the development lifecycle, allowing them to focus on Snowflake's functional capabilities. With the exciting announcements of Snowflake Summit 2023, we are assessing how the latest technical developments by Snowflake further enhance DataOps.live. Potential advancements include:

- Enhancing SOLE to support Snowpark Container Services and libraries management
- Further simplifying building user interfaces using Streamlit and enhancing the management of Snowpark libraries
- Updating SOLE to ease building, testing, distributing, and monetizing your Snowflake data apps on Snowflake Marketplace, and ensuring MATE is compatible
- Creating Iceberg and Dynamic tables within SOLE and MATE
- Allowing the use of geospatial and geometric data
- Making costs predictable and transparent through fine-tuned cost and budget management

You can already use our DataOps Development Environment (DDE) to develop data science workloads with Snowpark and build data applications with Streamlit. With the planned enhancements, things will become even easier going forward. Further leverage DataOps.live Spendview for Snowflake to visualize and investigate your Snowflake accounts' spend drivers.

Join Our Community Today

We are excited to announce the launch of our NEW community platform!
Designed with customers, partners, and DataOps professionals in mind, the DataOps.live community is the place to:

- Share your expertise and ask questions
- Access a wealth of knowledge-base articles to gain a deeper understanding of the DataOps.live platform
- Access our training and documentation materials free of charge
- Interact with other customers and get support from the DataOps.live team, who will be actively participating in the forum

Sign up and add your avatar to our new community platform by Aug 1st to earn an "Early Adopter Badge". Join today!

Did You Know?

...that there is a dedicated Spendview for Snowflake space on our new Community platform where you can access valuable insights on best practices, share your expertise, and engage in discussions? If you haven't signed up yet, join and share your experience with us.

That's all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | June 2023

Hi DataOps.live User,

Welcome to the June newsletter. Discover how our free observability tool and new features enhance your DataOps experience. Save time investigating Snowflake spend. Streamline development, reduce complexity, and automate data product building and deployment on the Snowflake Data Cloud.

Spendview for Snowflake

Spendview for Snowflake is a module of DataOps.live Observability that we now make available for free. It offers a centralized view of your Snowflake compute, usage, and storage costs and consumption, and it is accessible to anyone with a Snowflake account. Leveraging Spendview for Snowflake saves time investigating Snowflake spend anomalies and user adoption. It offers root cause analysis that pinpoints potential cost drivers and thus helps you monitor your budget and optimize your Snowflake spending and consumption. Learn more...

MATE with dbt 1.5

You can now use the Modelling and Transformation Engine (MATE) with dbt-core 1.5 and benefit from all the significant additions of this new version. You can enable dbt-core 1.5 by simply setting the variable DATAOPS_FEATURE_MATE_DBT_VERSION to 1.5. Learn more...

Improvements to DDE (DataOps Development Environment)

Experience the latest enhancements to our fully integrated development environment, available in private preview. Our DDE, accessible through any browser and without any setup, provides a seamless, optimized development experience for various DataOps use cases. With DDE, you can now switch between different dbt versions and use a pre-commit tool that helps identify issues before committing changes to your project. Learn more...

Community

In episode 17 of the #TrueDataOps Podcast, we are joined by our guest Sanjeev Mohan! Sanjeev, Principal at SanjMo and a former Gartner Research VP for Big Data and Advanced Analytics, covers all things data, including analytics, data governance, data management, and data observability. Watch the recording...
Did you know...

...that we have introduced Data Products as a new framework in DataOps.live? You can now easily build an ecosystem of simple and composite data products on the Snowflake Data Cloud and benefit from greater accuracy and accessibility of data. Data Products is currently available as a private preview, allowing early access to selected users. Email us and sign up for the Data Product Private Preview to get access. Read more at Data Products.

That's all for now. Until next time,
- the DataOps.live Product Team


Product Newsletter | May 2023

DataOps emphasizes automation and integration, enabling seamless and efficient data workflows. The DataOps CLI now offers easier Snowflake database management. And new third-party tool integrations help you collaborate better with your team members on your projects.

Default Parameter Values in SOLE Generator

The SOLE Generator is our command-line interface that generates the Snowflake configuration from incoming DDL. Now you can create a template for a SOLE configuration holding your desired DataOps parameter values. When "dataops solegen" runs, it merges the templates with the incoming DDL. Learn more...

More Third-party Integrations with Our Platform

You can now benefit from the integration of more third-party tools with the DataOps platform. You can now use Google Hangouts Chat, Email on Push, Slack, Microsoft Teams, and Cisco Webex in all your DataOps projects. Learn more...

Community

Join our #TrueDataOps Podcast Newsletter to stay up to date on the latest! Sign up now - https://www.dataops.live/truedataops-podcast - and enjoy earlier episodes.

ICYMI – Keith Belanger - #TrueDataOps Podcast Ep #16

In episode 16 of the #TrueDataOps Podcast, we are joined by our guest Keith Belanger! Keith is passionate about data, technology, learning, mentoring, designing, and leading data solutions and teams. With over 25 years in data architecture and information management, Keith is highly experienced at assembling and directing high-performing, data-focused teams and solutions. Combining a deep technical and data background with a business-oriented mindset and excellent communication skills, he consistently builds and delivers innovative data platforms and solutions to everyday business problems. Watch the recording...

🚨 Up Next! 🚨 Sanjeev Mohan - #TrueDataOps Podcast Ep #17

Wed, May 31, 2023, 8:00 AM PT

Join our next guest, Sanjeev Mohan, Principal at SanjMo and a former Gartner Research VP for Big Data and Advanced Analytics.
With expertise covering all things data, including analytics, data governance, data management, and data observability, he has published many white papers. He is a prolific speaker at conferences and events worldwide. RSVP here.

Did you know...

...that you can easily set up Git pre-commit hooks for your DataOps project to check the code being committed for syntax errors, style violations, or other issues? You can write these hooks in any programming language and perform any checks necessary for your project. Learn more...

That's all for now. Until next time,
The DataOps.live Product Team
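As the note above says, pre-commit hooks can be written in any language. As one concrete sketch, here is a Python hook; the specific check (rejecting staged .sql files that still contain a TODO marker) is an invented example for illustration, not a DataOps.live-provided hook:

```python
#!/usr/bin/env python3
"""Hypothetical .git/hooks/pre-commit script: block commits whose staged
.sql files still contain a TODO marker. The check itself is illustrative."""
import subprocess
import sys


def find_markers(text, marker="TODO"):
    """Return the 1-based line numbers in `text` that contain `marker`."""
    return [i for i, line in enumerate(text.splitlines(), start=1) if marker in line]


def main():
    try:
        staged = subprocess.run(
            ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
            capture_output=True, text=True,
        )
    except OSError:
        return 0  # git unavailable; skip the check
    if staged.returncode != 0:
        return 0  # not inside a git repository; nothing to check
    failed = False
    for path in staged.stdout.split():
        if not path.endswith(".sql"):
            continue
        # Inspect the staged version of the file, not the working tree
        shown = subprocess.run(["git", "show", f":{path}"],
                               capture_output=True, text=True)
        if shown.returncode != 0:
            continue
        for lineno in find_markers(shown.stdout):
            print(f"{path}:{lineno}: staged SQL still contains a TODO marker",
                  file=sys.stderr)
            failed = True
    return 1 if failed else 0


if __name__ == "__main__":
    # Git runs this file on `git commit`; a non-zero exit aborts the commit.
    sys.exit(main())
```

Save it as .git/hooks/pre-commit and mark it executable (chmod +x); Git then runs it before every commit, and a non-zero exit blocks the commit.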


Product Newsletter | April 2023

Get ready to supercharge your data operations with our latest + greatest updates - packed with powerful new features, enhancements, and expert insights to help you unlock the full potential of your data operations!

More Enhancements in SOLE

We are continuously enhancing our Snowflake Object Lifecycle Engine with more capabilities to help you fully manage your Snowflake ecosystem. You can now:

- Plan modifications to any Snowflake database before applying them by using the plan and apply lifecycle jobs. Learn more
- Review all the modifications planned for a Snowflake database in a human-readable version, generated as part of the plan lifecycle job. Learn more
- Use the SNOWPARK-OPTIMIZED warehouse that provides 16x memory per node compared to a standard Snowflake virtual warehouse. Learn more
- Prevent SOLE from implicitly granting USAGE privileges on the parent and grandparent objects by simply setting a value to a variable. Learn more

MATE with Different dbt Versions

Our Modeling and Transformation Engine (MATE) now supports different dbt versions and provides the variable DATAOPS_FEATURE_MATE_DBT_VERSION to decide which version to use. Learn more

MATE now also supports colored lineage graphs by annotating your logical layers in your dbt project. Learn more

Did You Know...

...that our template rendering engine now supports using Python's datetime and time modules in dbt models and Jinja templates? Learn more

Community

Dan Linstedt | #TrueDataOps Podcast Ep.13

"In the data world, there are lots of advances in cloud technology that should not go unnoticed or ignored… I think the future is cloud – there's no question about it."

Live streaming on LinkedIn and available on demand, the thirteenth episode of the #TrueDataOps Podcast brought together Kent Graziano, aka 'The Data Warrior', with his longtime friend Dan Linstedt, inventor of the data vault system of business intelligence.
"We're on a collision course to re-engage business analytics alongside operational systems, at least in the cloud space. It's because data is growing at an unprecedented rate, and it's about to increase again… Data is simply getting too large to move, and this is why I definitely think the cloud is the future..."

Hear the rest of what Dan had to say in the link above!

Barr Moses | #TrueDataOps Podcast Ep.14

"The problem of data trust is ubiquitous. The number one thing people told us was 'the data is wrong and that leads us to a place where we can't trust it, and we can't use it…'"

Episode 14 of the #TrueDataOps podcast welcomed Barr Moses, Data Observability pioneer, CEO, and co-founder of Monte Carlo, who joined host Kent 'The Data Warrior' Graziano to discuss all things data and DataOps.

"We can't afford to ignore the implications of bad data - and this is where data observability comes in. Today, you not only have a lot more data, you also have a lot more people using the data…"

🚨 Up Next! 🚨 Neil Strange | #TrueDataOps Podcast Ep.15

Wednesday, May 3rd | 4pm GMT | 11am EDT

Join us for our next episode with Neil Strange, CEO and founder of Datavault. Neil is a professional management consultant specializing in all aspects of business intelligence. A director of Business Thinking (trading as Datavault), he is a thought leader in Data Vault 2.0 data warehousing, BI, analytics, AI, deep learning, and information governance. Neil is a guest presenter and visitor at the Royal Holloway (University of London) School of Management. RSVP on LinkedIn!

That's all for now. Until next time,
The DataOps.live Product Team
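On the April note above about Python's datetime and time modules in dbt models and Jinja templates: in dbt's Jinja context these modules are exposed under `modules.datetime`, and the sketch below assumes the DataOps.live template rendering engine offers the same surface. The table and column names are illustrative assumptions:

```sql
-- Hypothetical dbt model sketch: filter to the last 7 days using Python's
-- datetime module from Jinja. stg_orders and its columns are assumptions.
{% set cutoff = (modules.datetime.date.today()
                 - modules.datetime.timedelta(days=7)).isoformat() %}

select order_id, order_date, amount
from {{ ref('stg_orders') }}
where order_date >= '{{ cutoff }}'
```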