Product Updates

See what’s new in our product and check out the updates below.

DataOps.live Product Newsletter | January 2024

Hi DataOps.live User,

Explore our January newsletter to uncover exciting new features designed to improve your development workflow.

Snowflake Snowpark Container Services

We’re excited to announce support for the latest Snowflake feature, Snowpark Container Services, which is now available in public preview and enabled by default for all accounts!

This new addition significantly enhances DataOps.live in the Snowflake ecosystem, expanding its capabilities beyond traditional data warehousing to arbitrary workloads. You can deploy a wide range of applications, including AI model training and inference powered by Nvidia GPUs. Use DataOps to build your application images and register and deploy them to the Snowpark Container Registry runtime. Once completed, the new container jobs, functions, or services are available to your DataOps pipeline for Snowflake.

This integration brings increased flexibility and efficiency in managing diverse workloads, ranging from model training to data applications. Learn more at Using Snowpark Container Services with DataOps.live.

AI Assist in DataOps.live Create

In our last newsletter, we shared exciting news about DataOps.live Assist, your AI-powered copilot for our data product platform, and DataOps.live Develop, the environment transforming development workflows for a smoother experience with automation and collaboration benefits. Now, as we start 2024, more exciting news is on the horizon!

With the upcoming launch of DataOps.live Create, we are introducing the new Assist chat feature with real-time help. Get instant responses to your queries while creating data products, troubleshoot challenges, and maximize your DataOps productivity. Assist integrates into the data product creation workflow, guiding users in building models by suggesting code based on natural language prompts. It understands business problem descriptions, offering transformational model suggestions.
Users can ask for more suggestions until satisfied, and then accept and tweak the code as needed before committing it to the git repository. Stay tuned for updates leading up to the launch next month/quarter!

AI Assist Chat Feature Demo

Community

We are happy to announce that the winter season of the #TrueDataOps Podcast series is back on the air! Join us for the next episode with a hands-on DataOps implementer and architect, Ronny Steelman, CEO of the consultancy Quadrabyte. RSVP Now

ICYMI

For a true masterclass on DataOps, watch the latest edition of the “It Depends” podcast with host Sanjeev Mohan and our co-founder, Guy Adams. Watch the Episode

Did you know...

…that SOLE, our powerful engine for managing your Snowflake ecosystem, and SOLE for Data Products now fully support running Python stored procedures? Developers can write and run stored procedures using Python syntax and leverage third-party libraries, enabling them to use Python’s capabilities for data manipulation, analysis, and other tasks. Learn more at the Snowflake stored procedures documentation.

Have you tried Spendview?

Take control of your Snowflake spend: get complete transparency into your Snowflake Data Cloud usage and spending, and reduce costs by 15% or more. Get it FREE (forever)

That’s all for now. Until next time,
- the DataOps.live Product Team
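As a hedged illustration of the Python stored procedure support mentioned above: in Snowflake, a Python stored procedure wraps a handler function that is registered through your configuration and receives a Snowpark `Session` as its first argument. The sketch below keeps the handler’s transformation logic session-free so it can be unit-tested anywhere; the function and column names are invented for this example, not taken from the product.

```python
# Hypothetical handler logic for a Python stored procedure. In Snowflake,
# a wrapper would receive a snowpark Session and fetch/write rows; here we
# keep the pure-Python part separate so it runs and tests standalone.
from datetime import date


def normalize_status(raw: str) -> str:
    """Map free-text statuses to a canonical snake_case form."""
    return raw.strip().lower().replace(" ", "_")


def clean_orders(rows: list[dict]) -> list[dict]:
    """Transformation a stored procedure handler might apply to fetched rows."""
    return [
        {**row,
         "status": normalize_status(row["status"]),
         "loaded_on": date.today().isoformat()}  # stamp the load date
        for row in rows
    ]
```

In a real deployment, a thin handler such as `def run(session): ...` would call `clean_orders` on rows read via Snowpark and write the result back; that registration step is what SOLE manages for you.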

DataOps.live Product Newsletter | December 2023

As the year ends, we’d like to express our gratitude for your continued support. Wishing you a joyful holiday season filled with warmth, laughter, and cherished moments with your loved ones!

Hi DataOps.live User,

In this December newsletter, discover our latest updates and exciting new features.

Transform your workflow with DataOps.live | Assist

Elevate your data journey with DataOps.live Assist, your AI-powered copilot for your data products. Assist generates insightful merge request (MR) summaries, enhancing reviews and ensuring quick decisions. It describes complex change requests, helping data product owners quickly understand and approve MRs.

Assist goes beyond MR review, supporting engineers in model development and analyzing data pipeline failures. It simplifies tasks and boosts collaboration for every team member.

Join our private preview by contacting us at privatepreview@dataops.live and make your data work smarter with Assist. Discover more in the DataOps.live Assist documentation.

DataOps.live | Develop

If you haven’t taken your data development experience to the next level with DataOps.live Develop, now’s the perfect time to do so! The DataOps development environment changes the development workflow to remove friction and improve your experience with automation and collaboration benefits. DevReady, the cloud-based deployment model of the development environment, streamlines the development process and provides you with the flexibility and scalability of cloud computing resources.

To let DevReady automatically work with your DataOps project, open your project in our platform and select DataOps.live | Develop to switch from the code editor to the development environment.

DevReady has a built-in tool, DataOps SOLE Validate, that checks your SOLE configuration locally before starting a SOLE pipeline. This proactive approach helps you catch potential configuration issues early, saving you time during development and pipeline execution.
To use this tool, simply open your project in DevReady, click the DataOps.live icon in the left sidebar, and select DataOps SOLE Validate. The tool automatically validates your resource configurations and shows the results on the console.

Community

#TrueDataOps Podcast is on holiday break! But in case you missed some of our recent episodes, including Juan Sequeda, Principal Scientist & Head of AI Lab at data.world, and Cindi Howson, Chief Data Strategy Officer at ThoughtSpot (just to name a few), make sure to catch up and subscribe to stay up to date on our latest guests and topics.

Upcoming! Sonny Rivera - #TrueDataOps Podcast Ep.28

Join our next episode on Jan. 24, 2024 with Snowflake Superhero Sonny Rivera! Sonny is a technology executive with deep experience in using internet technologies to improve business processes and enhance the business value chain. He is currently focused on migrating an on-premises data warehouse to the Snowflake Cloud Data Warehouse and building advanced analytics. In his own words, “Over the last few years, my team and I have designed and developed innovative Data as a Service products that generate millions of dollars in revenue annually. We have also integrated our data assets into additional products and services that Randall-Reilly offers.” RSVP now

Did you know...

…that SOLE, our powerful engine for managing your Snowflake ecosystem, now supports loading data with Snowpipe into a table using the COPY INTO select query of the Pipe object? Learn how to use it.

And lastly, hear your fellow DataOps.live customer DigiKey describe their cloud migration journey, including real-world examples and use cases for their data product goals. Download now

That’s all for now. Until next time,
- the DataOps.live Product Team
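To picture the Snowpipe feature above: a Pipe object’s COPY INTO statement can select and rename positional stage columns as it loads. The sketch below builds such a statement in Python purely as an illustration; in practice the statement lives in your SOLE YAML configuration, and the table, stage, and column names here are invented.

```python
# Build the kind of "COPY INTO ... FROM (SELECT ...)" statement a Snowpipe
# pipe runs on each new file. All names below are illustrative.
def pipe_copy_statement(table: str, stage: str, columns: list[str]) -> str:
    # Map positional stage columns ($1, $2, ...) onto named target columns.
    select_list = ", ".join(
        f"${i} as {name}" for i, name in enumerate(columns, start=1)
    )
    return f"copy into {table} from (select {select_list} from @{stage})"
```

For example, `pipe_copy_statement("raw_orders", "orders_stage", ["order_id", "status"])` renders a select-style COPY INTO over the hypothetical `orders_stage` stage.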

DataOps.live Product Newsletter | November 2023

Hi DataOps.live User,

Welcome to the November newsletter! Explore our latest updates and exciting new features.

Sign Up for DataOps.live | Assist

After announcing DataOps.live Assist in October, we are happy to invite a select group of users to join our private preview. This allows you to be the first to experience the new app and share valuable feedback that will directly influence its development. DataOps.live Assist:

Simplifies the creation of SQL or Python data models with AI suggestions
Increases your productivity with suggested data transformations
Suggests initial project documentation for review
Offers explanations for current queries
Automatically summarizes and reviews merge requests for correctness
Offers additional reassurance for data product owners to approve and promote changes to production safely
And much more...

Join our exclusive preview by reaching out to us at privatepreview@dataops.live and experience how to make the daily lives of data engineers easier.

Fully Released Orchestrators: VaultSpeed and Python 3.11

The VaultSpeed and Python 3.11 orchestrators are now generally available and accessible to all customers. Use the former to deploy the data vault you have designed in VaultSpeed into a Snowflake database and the latter to run Python 3.11 scripts in your DataOps pipelines. Discover more at VaultSpeed Orchestrator and Python Orchestrator.

Observability User Management

With the new Spendview User Management app, you can easily share metrics on Snowflake accounts’ spending with people in your organization. This app offers streamlined user control with role-based access, an intuitive interface, and enhanced security. Experience the convenience of the Spendview User Management app now by accessing it directly from your Spendview dashboard. Discover more at Managing Spendview Users.

DataOps CLI

We now support the Snowflake object types “task” and “row access policy” in the SOLE Generator.
We’ve also added new commands for working inside the DataOps development environment. The new commands let you create and validate data product manifests and run SOLE and MATE helpers. Discover more at SOLE Generator and CLI Commands.

Community

Join us with our next guest, Matt Aslett, VP and Research Director at Ventana Research. Matt is a highly experienced tech-sector analyst and researcher with particular expertise in data and analytics and a focus on databases, data warehousing, data streaming, data integration, BI, data science, and big data, as well as data culture, data literacy, and hybrid cloud data processing. He was shortlisted for IIAR Analyst of the Year 2022. RSVP today

Did you know...

…that we’ve introduced a new app switcher for the data product platform? You can seamlessly toggle between apps and learning systems from DataOps.live. Here’s what you can do:

Switch between apps and systems with a single click for effortless navigation
Jump between functionalities seamlessly for enhanced productivity
Enjoy a unified platform for all your needs for an integrated experience

That’s all for now. Until next time,
- the DataOps.live Product Team

DataOps.live Product Newsletter | October 2023

Hi DataOps.live User,

Welcome to the October newsletter! Explore our latest updates and exciting new features.

Announcing DataOps.live | Assist

We’re happy to introduce a powerful capability to the DataOps.live product platform: our AI-powered copilot, DataOps.live Assist. This solution brings advanced intelligence and automation to elevate your DataOps experience and boost productivity in your day-to-day tasks. Register now for our launch webinar on Nov 2nd, where we’ll share all the details! And learn more from our press release HERE.

MATE with dbt 1.5

Get ready for an exciting upgrade! We’re rolling out dbt version 1.5 as the default for our Transformation (MATE) orchestrator and MATE packages, taking your DataOps experience to a new level. Using dbt 1.5 in your DataOps projects unlocks a range of exciting updates and enhanced performance. Discover more about the MATE Orchestrator and MATE Packages.

Python 3.11 Orchestrator

We’ve introduced the Python 3.11 Orchestrator, currently available in private preview. You now have the option to choose between running on Python 3.8 or 3.11. No matter the version, the flexibility provided for your project stays the same. You can run a single script or fully fledged applications pulling in third-party dependencies. The choice is yours. Stay tuned for the official release in one of the next stable releases. Check out the details of the Python Orchestrators.

Web IDE Revamp: A Fresh New Experience

A new Web IDE interface arrives on November 11! We’re rebuilding our current Web IDE on top of Visual Studio Code. But rest assured, there will be no changes to the underlying functionality itself. Your familiar workflows and procedures stay unchanged. You can continue to use the Web IDE in the same way as before: to write, edit, and manage code and files directly in a web browser. The only difference is a refreshed interface for improved usability.
If you’re looking to extend your development beyond just code writing and file editing, look no further! Our ready-to-code environment DataOps.live | Develop offers a set of tools and features to support every stage of the development lifecycle. See the following section for more information.

DataOps.live | Develop

Our DataOps development environment has been rebranded to DataOps.live | Develop and is making great progress in its private preview phase. We’ll transition it to a public preview soon. This means even more users will have the chance to explore its capabilities and provide us with feedback. With the public preview, you’ll experience a seamless, accelerated development process that puts automation and collaboration first, enhancing your experience across key development use cases. Explore all details at DataOps Development Environment.

Community

Cindi Howson - #TrueDataOps Podcast Ep.23

Join us for our next #TrueDataOps Podcast with Cindi Howson, Chief Data Strategy Officer at ThoughtSpot, host of The Data Chief podcast, and Women in Data’s Data Leader of the Year 2021. Cindi is a BI, data, and analytics expert. Formerly VP Data and Analytics at Gartner, she also founded BI Scorecard, is the author of several books, and serves on the board of the Rutgers University Big Data Certificate program. RSVP today!

Did you know...

…that we’ll empower your Observability experience with user management and access control? This ensures seamless team collaboration when monitoring and analyzing operational metadata across your entire ecosystem. Existing users of Spendview for Snowflake will need to reset their passwords. We’re committed to guiding you through every step. Expect follow-up communication with all the necessary steps to take. Stay tuned for the imminent release!

That’s all for now. Until next time,
- the DataOps.live Product Team

DataOps.live Product Newsletter | September 2023

Hi DataOps.live User,

Welcome to the September newsletter! Find out how our latest developments enhance your DataOps experience. Streamline Snowflake setup and save time managing environments. Test new features with reduced risk. And more. Read on.

Snowflake Environment Management

In DataOps.live, creating pipelines involves combining various jobs, like those in the DataOps Reference Project. Each job has a specific role, and our powerful Snowflake Object Lifecycle Engine (SOLE) ensures everything works smoothly. We’ve added more optional jobs to the Reference Project for managing Snowflake development environments, making it much easier to set up and remove such environments. We’ve also introduced execution rules to handle Snowflake in different setups. And no worries, all your existing jobs still fit right in. Want to know more? Check out the details of these new Reference Project Jobs.

Setup Wizard for Development Environment

Start an even smoother DataOps experience! We’ve launched a new setup wizard to help you configure your Snowflake connection for the web-based DevReady and desktop-based DevPod development environments. This step-by-step walkthrough gives clear instructions on selecting the dbt version for your workspace and setting up your Snowflake credentials. Explore the details of this handy new feature for DevReady and for DevPod.

Bundled Feature Flag

Bundled feature flags offer an efficient method for integrating and testing new features with reduced risk. We’ve introduced a DataOps.live bundle that includes various behavioral refinements in your development process. These modifications are integrated into an optional bundled feature flag, giving you the freedom to choose and customize your experience. Check out the details of this Bundled Feature Flag.

Enhanced DataOps Template Project

Our latest template project, DataOps Standard Template V2, kickstarts your project with the optimal structure and default settings.
Projects built on this template lay the foundation for heightened developer productivity in DataOps.live. Check out Creating a Project from a Template for effortless project initialization and a seamless launch.

Community

Season 2 of the #TrueDataOps podcast is here! Make sure to sign up for our newsletter to stay up to date! https://www.dataops.live/truedataops-podcast

Pipeline Resilience Using Multiple Pull Policies

On the 28th of September, Docker experienced degraded performance for some of their components, which may have affected your pipelines. You can find a write-up on how to mitigate such issues here: Enhancing DataOps Pipeline Resilience with Multiple Pull Policies. Please also subscribe to our Status Page for any updates on the performance of our platform: https://trust.dataops.live/

Did You Know...

…that we’ve made some big changes to our documentation site? These updates aim to improve navigation and functionality. Here’s what’s new:

We’ve restructured the topics in the navigator based on the core principles of DataOps.live, using section separators for clarity
We’ve also made some cosmetic tweaks to improve the overall look and feel

Hope you find these changes helpful!

That’s all for now. Until next time,
- the DataOps.live Product Team


DataOps.live Product Newsletter | August 2023

Hi DataOps.live User,

Welcome to the August newsletter! Find out how our latest developments enhance your DataOps experience. Save time investigating Snowflake spend. Make the most of DDE. Enhance reliability and fault tolerance. And more. Read on.

Enhancements in Spendview for Snowflake

Compute is often the top cost-optimization area for a Snowflake warehouse, where the most common problems are query performance and idle warehouses. Spendview for Snowflake helps business stakeholders and data team leads understand their current resource usage. Spendview provides big saving opportunities in a matter of clicks, as it empowers your data team with self-service Snowflake consumption and spending metrics.

Sign up now for our live webinar on optimizing your Snowflake spend and join the DataOps.live and XponentL Data discussion on how you can perfect your budget by following best practices and cost-saving strategies.

Optimize the Potential of DataOps Development with dbt

Make the most of the DataOps Development Environment (DDE) by using the latest dbt versions for better production efficiency. DDE is most effective when used with recent dbt versions, which significantly boost development speed. In DDE, specify the version by setting the DATAOPS_FEATURE_MATE_DBT_VERSION environment variable, for example, DATAOPS_FEATURE_MATE_DBT_VERSION=1.4. Read more... In the DataOps pipeline, set the variable in your project’s variables.yml file. Sign up for the DDE private preview by reaching out to our Support team.

Enhancements in SOLE Generator

Managing your existing Snowflake databases through the DataOps.live platform just got a bit easier. Besides supporting database, function, procedure, schema, table, and task objects, we now also support sequence, tag, stream, and file format. Start using the SOLE Generator CLI (command-line interface) tool that helps you generate a YAML configuration for SOLE objects.
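As a sketch of how the DATAOPS_FEATURE_MATE_DBT_VERSION switch described above works from a tool’s point of view: the environment variable selects the dbt version the orchestrator runs with. The default and the set of accepted values below are assumptions for illustration only; check the documentation for what your orchestrator actually supports.

```python
import os

# Assumed set of selectable versions, for illustration only.
KNOWN_VERSIONS = {"1.4", "1.5"}


def selected_dbt_version(env=os.environ, default="1.4"):
    """Return the dbt version requested via DATAOPS_FEATURE_MATE_DBT_VERSION."""
    version = env.get("DATAOPS_FEATURE_MATE_DBT_VERSION", default)
    if version not in KNOWN_VERSIONS:
        raise ValueError(f"unsupported dbt version: {version}")
    return version
```

In a pipeline you would set the variable once in `variables.yml` rather than per shell, and every MATE job would pick it up the same way.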
Community

#TrueDataOps Podcast Season 2 is here! Join us with our special guest, Bob Muglia, former Chief Executive Officer of Snowflake. We will cover his new book, The Datapreneurs (https://www.thedatapreneurs.com/), the new age of AI, and all things data! Make sure to add it to your calendar. You do not want to miss this episode! RSVP Here

Academy

The latest course, Fundamentals of DataOps.live, just went live at our Academy. In this course, we go step by step through the process of ingesting a new data source using our Stage Ingestion Orchestrator. Topics covered include creating multiple objects with SOLE, autogenerating sources from ingestion tables, creating MATE jobs to run only specified models, creating a new -ci.yml pipeline file, and more. Enroll today!

Did You Know?

…that you can incorporate a fallback job that retries your main job with a different container image or alternative configuration adjustments? By incorporating a fallback job into a pipeline, developers and data engineers can enhance the reliability, robustness, and fault tolerance of the system. Read more at Create Fallback Jobs.

That’s all for now. Until next time,
- the DataOps.live Product Team
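Conceptually, the fallback job mentioned in this issue’s Did You Know behaves like the sketch below: if the primary attempt fails, the same work is retried under an alternative configuration. In DataOps.live this is declared in your pipeline YAML rather than written in code; the function here only mirrors the idea, and all names are hypothetical.

```python
# Conceptual sketch of a fallback job: retry the same work with a
# different configuration (e.g. an alternative container image) when
# the primary attempt raises.
def run_with_fallback(job, primary_config, fallback_config):
    try:
        return job(primary_config)
    except Exception:
        # The fallback run might swap the image or relax settings.
        return job(fallback_config)
```

The value of declaring this at the pipeline level is that the retry policy lives next to the job definition, so every run gets the same fault tolerance without per-script error handling.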

DataOps.live Product Newsletter | July 2023

Hi DataOps.live User,

Welcome to the July newsletter! Gain insights on how the important developments announced at Snowflake Summit 2023 may greatly benefit DataOps.live projects and operations. Enjoy collaborating with your DataOps peers on our Community platform. Read on.

Snowflake Summit Announcements and DataOps

Our powerful Snowflake Object Lifecycle Engine (SOLE) and Modelling and Transformation Engine (MATE) help data engineers automate and streamline the steps in the development lifecycle, allowing them to focus on Snowflake’s functional capabilities. With the exciting announcements of Snowflake Summit 2023, we are assessing how the latest technical developments by Snowflake further enhance DataOps.live. Potential advancements include:

Enhance SOLE to support Snowpark Container Services and libraries management
Further simplify building user interfaces using Streamlit and enhance the management of Snowpark libraries
Update SOLE to ease building, testing, distributing, and monetizing your Snowflake data apps on Snowflake Marketplace, and ensure MATE is compatible
Create Iceberg and Dynamic tables within SOLE and MATE
Allow using geospatial and geometric data
Make costs predictable and transparent through fine-tuned cost and budget management

You can already use our DataOps Development Environment (DDE) to develop data science workloads with Snowpark and build data applications with Streamlit. With the planned enhancements, things will become even easier going forward. Further leverage DataOps.live Spendview for Snowflake to visualize and investigate your Snowflake accounts’ spend drivers.

Join Our Community Today

Join the DataOps.live community today! We are excited to announce the launch of our NEW community platform!
Designed with customers, partners, and DataOps professionals in mind, the DataOps.live community is the place to:

Share your expertise and ask questions
Access a wealth of knowledge base articles to gain a deeper understanding of the DataOps.live platform
Access our training and documentation materials free of charge
Interact with other customers and get support from the DataOps.live team, who will be actively participating in the forum

Sign up and add your avatar to our new community platform by Aug 1st to earn an “Early Adopter Badge”. Join today!

Did You Know?

…that there is a dedicated Spendview for Snowflake space on our new Community platform where you can access valuable insights on best practices, share your expertise, and engage in discussions? If you haven’t signed up yet, join and share your experience with us.

That’s all for now. Until next time,
- the DataOps.live Product Team

DataOps.live Product Newsletter | June 2023

Hi DataOps.live User,

Welcome to the June newsletter. Discover how our free observability tool and new features enhance your DataOps experience. Save time investigating Snowflake spend. Streamline development, reduce complexity, and automate data product building and deployment on the Snowflake Data Cloud.

Spendview for Snowflake

Spendview for Snowflake is a module of DataOps.live Observability that we now make available for free. It offers a centralized view of your Snowflake compute, usage, and storage costs and consumption, and it is accessible to anyone with a Snowflake account. Leveraging Spendview for Snowflake saves time when investigating Snowflake spend anomalies and user adoption. It offers root cause analysis that pinpoints potential cost drivers and thus helps you monitor your budget and optimize your Snowflake spending and consumption. Learn more…

MATE with dbt 1.5

You can now use the Modelling and Transformation Engine (MATE) with dbt-core 1.5 and benefit from all the significant additions of this new version. You can enable dbt-core 1.5 by simply setting the variable DATAOPS_FEATURE_MATE_DBT_VERSION to 1.5. Learn more…

Improvements to DDE (DataOps Development Environment)

Experience the latest enhancements to our fully integrated development environment, available in private preview. Our DDE, accessible through any browser and without any setup, provides a seamless, optimized development experience for various DataOps use cases. With DDE, you can now switch between different dbt versions and use a pre-commit tool that helps identify issues before committing changes to your project. Learn more...

Community

In episode 17 of the #TrueDataOps Podcast, we are joined by our guest Sanjeev Mohan! Sanjeev, Principal at SanjMo and a former Gartner Research VP for Big Data and Advanced Analytics, covers all things data, including analytics, data governance, data management, and data observability. Watch the recording...
Did you know…

…that we have introduced Data Products as a new framework in DataOps.live? You can now easily build an ecosystem of simple and composite data products on the Snowflake Data Cloud and benefit from greater accuracy and accessibility of data. Data Products is currently available as a private preview, allowing early access to selected users. Email us and sign up for the Data Product private preview to get access. Read more at Data Products.

That’s all for now. Until next time,
- the DataOps.live Product Team


Product Newsletter | May 2023

DataOps emphasizes automation and integration, enabling seamless and efficient data workflows. The DataOps CLI now offers easier Snowflake database management, and new third-party tool integrations help you collaborate better with your team members on your projects.

Default Parameter Values in SOLE Generator

The SOLE Generator is our command-line interface for generating the Snowflake configuration from incoming DDL. Now you can create a template for a SOLE configuration holding your desired DataOps parameter values. When “dataops solegen” runs, it merges the templates with the incoming DDL. Learn more

More Third-party Integrations with Our Platform

You can now benefit from the integration of more third-party tools with the DataOps platform. You can now use Google Hangouts Chat, Email on Push, Slack, Microsoft Teams, and Cisco Webex in all your DataOps projects. Learn more

Community

Join our #TrueDataOps Podcast Newsletter to stay up to date on the latest! Sign up now at https://www.dataops.live/truedataops-podcast and enjoy earlier episodes.

ICYMI – Keith Belanger - #TrueDataOps Podcast Ep #16

In episode 16 of the #TrueDataOps Podcast, we are joined by our guest Keith Belanger! Keith is passionate about data, technology, learning, mentoring, designing, and leading data solutions and teams. With over 25 years in data architecture and information management, Keith is highly experienced at assembling and directing high-performing, data-focused teams and solutions, combining a deep technical and data background with a business-oriented mindset and excellent communication skills. He consistently builds and delivers innovative data platforms and solutions to everyday business problems. Watch the recording...

🚨 Up Next! 🚨 Sanjeev Mohan - #TrueDataOps Podcast Ep #17

Wed, May 31, 2023, 8:00 AM PT

Join our next guest, Sanjeev Mohan, Principal at SanjMo and a former Gartner Research VP for Big Data and Advanced Analytics.
With expertise covering all things data, including analytics, data governance, data management, and data observability, he has published many white papers and is a prolific speaker at conferences and events worldwide. RSVP here

Did you Know…

…that you can easily set up Git pre-commit hooks for your DataOps project to check the code being committed for syntax errors, style violations, or other issues? You can write these hooks in any programming language and perform any checks necessary for your project. Learn more…

That’s all for now. Until next time,
The DataOps.live Product Team
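The template-merge behavior of “dataops solegen” described above can be pictured with a small sketch. The merge semantics here — template values fill gaps while DDL-derived values win on conflict — are an assumption for illustration, not the tool’s documented rules, and the keys in the example are invented.

```python
# Hypothetical illustration of merging a user-supplied SOLE template with
# configuration generated from incoming DDL. Which side wins on conflict
# is an assumption; consult the solegen docs for the real behavior.
def merge_config(template: dict, generated: dict) -> dict:
    merged = dict(template)
    for key, value in generated.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)  # recurse into nested maps
        else:
            merged[key] = value  # DDL-derived values take precedence here
    return merged
```

The design point is that defaults you care about (comments, grants, environment parameters) live once in the template, while everything structural keeps flowing in from the DDL.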
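As a minimal sketch of the pre-commit hooks mentioned above, assuming a Python-based hook that byte-compiles files before they are committed: the check itself is only a few lines. File discovery is deliberately left out; a real `.git/hooks/pre-commit` script would collect staged paths (for example from `git diff --cached --name-only`) and exit non-zero to abort the commit.

```python
# Sketch of a pre-commit syntax check: reject commits containing Python
# files that fail to byte-compile. Staged-file discovery is omitted.
import py_compile
import sys


def syntax_error(path: str):
    """Return an error message if `path` fails to byte-compile, else None."""
    try:
        py_compile.compile(path, doraise=True)
        return None
    except py_compile.PyCompileError as exc:
        return str(exc)


def main(paths) -> int:
    errors = [msg for msg in (syntax_error(p) for p in paths) if msg]
    for msg in errors:
        print(msg, file=sys.stderr)
    return 1 if errors else 0  # a non-zero exit aborts the commit
    # A real hook would end with: sys.exit(main(staged_paths))
```

Because Git runs the hook before the commit is recorded, a failing check stops broken code from ever entering the repository, which is exactly the early feedback the newsletter describes.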

Product Newsletter | April 2023

Get ready to supercharge your data operations with our latest and greatest updates, packed with powerful new features, enhancements, and expert insights to help you unlock the full potential of your data operations!

More Enhancements in SOLE

We are continuously enhancing our Snowflake Object Lifecycle Engine with more capabilities to help you fully manage your Snowflake ecosystem. You can now:

Plan modifications to any Snowflake database before applying them by using the plan and apply lifecycle jobs. Learn more
Review, in a human-readable format generated as part of the plan lifecycle job, all the modifications planned for a Snowflake database. Learn more
Use the SNOWPARK-OPTIMIZED warehouse that provides 16x the memory per node compared to a standard Snowflake virtual warehouse. Learn more
Prevent SOLE from implicitly granting USAGE privileges on the parent and grandparent objects by simply setting a variable. Learn more

MATE with Different dbt Versions

Our Modeling and Transformation Engine (MATE) now supports different dbt versions and provides you with the variable DATAOPS_FEATURE_MATE_DBT_VERSION to decide which version to use. Learn more

MATE now also supports colored lineage graphs by annotating your logical layers in your dbt project. Learn more

Did You Know…

…that our template rendering engine now supports using Python's datetime and time modules in dbt models and Jinja templates? Learn more

Community

Dan Linstedt | #TrueDataOps Podcast Ep.13

“In the data world, there are lots of advances in cloud technology that should not go unnoticed or ignored… I think the future is cloud – there’s no question about it.”

Live-streamed on LinkedIn and available on demand, the thirteenth episode of the #TrueDataOps Podcast brought together Kent Graziano, aka 'The Data Warrior', with his longtime friend Dan Linstedt, inventor of the data vault system of business intelligence.
“We’re on a collision course to re-engage business analytics alongside operational systems, at least in the cloud space. It’s because data is growing at an unprecedented rate, and it’s about to increase again… Data is simply getting too large to move, and this is why I definitely think the cloud is the future...” Hear the rest of what Dan had to say in the link above!

Barr Moses | #TrueDataOps Podcast Ep.14

“The problem of data trust is ubiquitous. The number one thing people told us was ‘the data is wrong, and that leads us to a place where we can’t trust it, and we can’t use it…’” Episode 14 of the #TrueDataOps podcast welcomed Barr Moses, data observability pioneer, CEO, and co-founder of Monte Carlo, who joined host Kent ‘The Data Warrior’ Graziano to discuss all things data and DataOps. “We can’t afford to ignore the implications of bad data, and this is where data observability comes in. Today, you not only have a lot more data, you also have a lot more people using the data…”

🚨 Up Next! 🚨 Neil Strange | #TrueDataOps Podcast Ep.15

Wednesday, May 3rd | 4pm GMT / 11am EDT

Join us for our next episode with Neil Strange, CEO and founder of Datavault. Neil is a professional management consultant specializing in all aspects of business intelligence. A director of Business Thinking (trading as Datavault), he is a thought leader in Data Vault 2.0 data warehousing, BI, analytics, AI, deep learning, and information governance. Neil is a guest presenter and visitor at the Royal Holloway (University of London) School of Management. RSVP on LinkedIn

That’s all for now. Until next time,
The DataOps.live Product Team
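As a hedged illustration of the datetime support mentioned in the April “Did You Know”: a dbt model might compute a rolling cutoff at render time, so the SQL shipped to Snowflake contains a literal date. The snippet below uses plain Python to stand in for the Jinja template context; the column name and seven-day window are invented for the example.

```python
# What a Jinja expression along the lines of
#   where loaded_at >= '{{ (ts - modules.datetime.timedelta(days=7)).strftime("%Y-%m-%d") }}'
# would evaluate to; plain Python stands in for the template engine here.
from datetime import datetime, timedelta


def rendered_where_clause(run_ts: datetime, days: int = 7) -> str:
    cutoff = (run_ts - timedelta(days=days)).strftime("%Y-%m-%d")
    return f"where loaded_at >= '{cutoff}'"
```

Computing the cutoff at render time keeps the model SQL free of per-warehouse date functions, which is one reason exposing the datetime and time modules in templates is handy.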