Question

Pipeline not running

  • 21 December 2023
  • 19 replies
  • 116 views


Hi Team,

I am doing the Fundamentals of DataOps.live training and, as part of it, set up the runner mentioned in the course.

However, when I run the pipeline, it gives the message below.

Please help me resolve this issue so I can proceed with the course.

My config is updated as below:


This is the message I get when trying to run the pipeline:

19 replies

When I tried using the shared runner, it also failed.


Hi @Sushil ,

Thank you for reaching out. We have noticed that the runner you have selected in your agent_tag.yml is not active:

To create your own runner you can follow the guide below: https://docs.dataops.live/docs/administration/docker-runner/installation/
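For context, the agent_tag.yml file normally does nothing more than pin the runner tag the pipeline should use. A minimal sketch, assuming a hypothetical tag name (replace it with the tag of an active runner registered for your project):

```yaml
variables:
  # The tag must match an active runner; an inactive tag produces a stuck pipeline
  AGENT_TAG: my-dataops-runner
```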

As for the shared runner - it has certain limitations that may restrict the usage of some orchestrator images provided. In light of this, we recommend creating your own runner to avoid any potential limitations associated with the shared runner. By setting up a custom runner, you gain more control over the environment and can tailor it to meet the specific requirements of your project.

If you need assistance in creating a custom runner or have any questions about the process, feel free to reach out, and we'll be happy to guide you through the setup.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Thanks for your reply.

I tried running the pipeline with the partnerconnect-k8s-runner, so now it starts but then fails with this new error.

Please suggest how to fix this error.



Dear @Sushil ,

As you have deduced, the inclusion of the variable 'SECRETS_SELECTION: $SECRET_ARN' successfully resolved the issue. To provide context, our partner connect runner faces limitations in permissions, specifically being restricted to retrieving secrets exclusively from a designated location. Notably, it lacked the 'ListSecrets' permission, leading to the initial failure.
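For readers following along: a pipeline-level variable like this is typically declared in a variables file; the path pipelines/includes/config/variables.yml below is an assumption based on the standard project layout, and only the variable name and value come from this thread:

```yaml
variables:
  # Restrict secret retrieval to the single ARN the runner is permitted to read
  SECRETS_SELECTION: $SECRET_ARN
```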

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer


Dear @Sushil ,

We have noticed that you are experiencing the following error:

snowflake_database.ACADEMY_FUNDAMENTALS_PROD: Creating...

│ Error: error creating database ACADEMY_FUNDAMENTALS_PROD: 003001 (42501): SQL access control error:
│ Insufficient privileges to operate on account 'BV31501'

To give some context, our PC_DATAOPS_ROLE inherits privileges from the PUBLIC role. It's important to note that the PUBLIC role lacks the necessary permissions to create databases. Consequently, we recommend manually assigning the required permissions in Snowflake to be able to proceed.

To give you a rough idea on what needs to be granted you can check this POC script here:

https://docs.dataops.live/docs/get-started/set-up-snowflake/#configuring-a-new-snowflake-account
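As a rough sketch of the kind of grants involved (illustrative only; the linked script is authoritative, and these statements assume an ACCOUNTADMIN session):

```sql
USE ROLE ACCOUNTADMIN;

-- Allow the Partner Connect role to create the objects the pipeline needs
GRANT CREATE DATABASE ON ACCOUNT TO ROLE PC_DATAOPS_ROLE;
GRANT CREATE WAREHOUSE ON ACCOUNT TO ROLE PC_DATAOPS_ROLE;
```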

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Dear @Traycho 

Thanks a lot for your suggestion. It helped, so I will now go ahead with the next sessions of the Fundamentals of DataOps.live training and then modify the pipeline accordingly.

Thanks again for all your help so far :-)

Dear @traychomilev 

I need help with one more thing.

I am getting an error while running the job Test all Sources.

Before running this job, I manually ran the command below in Snowflake:

GRANT IMPORTED PRIVILEGES ON DATABASE snowflake_sample_data TO ROLE PC_DATAOPS_ROLE;

I also made the change below in the test_all_source.yml file and verified that the role PC_DATAOPS_ROLE has the required privileges.

I am still getting this error. I hope you can guide and help me.

Thanks !


Hi @Sushil ,

Thank you for bringing this to our attention. We've observed that you're encountering the following error:

[ 05:12:15.782 ] - [ 20-render-templates ] - [ INFO ] - Rendering templates in /tmp/local_config using dataops-render render-dir
/tmp/local_config/profiles.yml [not rendered: 'dict object' has no attribute 'TRANSFORM']
Exiting due to strict mode

This error signals a problem during the rendering process of a templated YAML file. Specifically, the file in question pertains to the default profile we supply for our MATE jobs. This file expects several keys to be present in your vault:

  • SNOWFLAKE.TRANSFORM.USERNAME
  • SNOWFLAKE.TRANSFORM.PASSWORD
  • SNOWFLAKE.TRANSFORM.ROLE
  • SNOWFLAKE.TRANSFORM.WAREHOUSE
  • SNOWFLAKE.TRANSFORM.THREADS

The vault.template.yml file automatically populates the specified values in the vault, mapping secrets from the secret manager to the designated keys mentioned above.
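In a well-formed vault, those keys correspond to a nested YAML structure along these lines (the values shown are illustrative placeholders; the real values are mapped in automatically from the secret manager):

```yaml
SNOWFLAKE:
  TRANSFORM:
    USERNAME: my_user        # placeholder
    PASSWORD: my_password    # placeholder
    ROLE: PC_DATAOPS_ROLE
    WAREHOUSE: my_warehouse  # placeholder
    THREADS: 8               # placeholder
```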

To understand the cause of the failure, it is crucial to review the log message from the preceding job, "Initialize Pipeline." While this job may have been marked as successful, it contains a critical log message that provides insights into the root cause of the issue. Please refer to the log details to gain a comprehensive understanding of the situation.

https://app.dataops.live/BV31501.aws-ap-southeast-1/academy-fundamentals/-/jobs/22888988#L132


dataops-vault WARNING Unable to parse secrets file /builds/BV31501.aws-ap-southeast-1/academy-fundamentals/vault-content/vault.yml as YAML: while parsing a block mapping
in "/builds/BV31501.aws-ap-southeast-1/academy-fundamentals/vault-content/vault.yml", line 9, column 3
expected <block end>, but found '<block mapping start>'
in "/builds/BV31501.aws-ap-southeast-1/academy-fundamentals/vault-content/vault.yml", line 28, column 4

It has come to our attention that the vault.template.yml file has been modified, resulting in an invalid YAML structure and rendering issues. We kindly request that you restore the file to its original contents. It is important to note that any alterations to the file are not expected, as the system is designed to function seamlessly without the need for manual adjustments. Reverting to the original configuration should resolve any issues, allowing the system to operate as intended out of the box.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Dear @traychomilev 

Thanks a lot for all the help to resolve this issue.

Its resolved now :-)

Happy Holidays !

Dear @traychomilev 

Could you please help me resolve this error? I tried to debug it a few times but had no luck. It was working fine earlier.


I am getting a similar error for my new job.

For this, I skipped the test all sources job and wanted to see whether my new job, build curation, runs fine.

Please help!


Hi @Sushil ,

The reason behind your error lies in the syntax of the column_width test you have introduced here:

https://app.dataops.live/BV31501.aws-ap-southeast-1/academy-fundamentals/-/blob/e1f03676c48c4f6165f8b0338ba21763375023fd/dataops/modelling/macros/column_width.sql

Ensure that every test introduced to the project concludes with the tag {% endtest %}.

Otherwise, during compilation, dbt will fail with:

Compilation Error
Reached EOF without finding a close tag for test (searched from line 1)

Example syntax from the dbt-utils package: https://github.com/dbt-labs/dbt-utils/blob/main/macros/generic_tests/recency.sql#L3C14-L3C14

Similarly, macros added to a project have to end with {% endmacro %}.
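As an illustration of the required pairing, a generic test named column_width might be structured like this (the body is a hypothetical sketch, not your actual macro):

```sql
{% test column_width(model, column_name, max_width) %}

-- Fails for any row whose value is wider than the allowed width
select *
from {{ model }}
where length({{ column_name }}) > {{ max_width }}

{% endtest %}
```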

I trust this proves beneficial.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Hi @traychomilev 

Thanks a lot for helping to figure it out. It was a miss on my end, and I have now corrected it.

However, the test all sources job failed with the error below.

I followed all the steps mentioned in this guide.

https://academy.dataops.live/courses/take/fundamentals-of-dataops-live/texts/46739997-step-by-step-autogenerating-sources-from-ingestion-tables

Could you please assist me in debugging it?


Hi @Sushil ,

Thank you for bringing this matter to our attention. It has come to our notice that the pipeline is currently running on dbt version 1.0.1, which is considered outdated. This observation suggests that the version of the orchestrator's image being utilised is not up-to-date. We suspect that the root cause of this issue lies in the configuration of the runner, specifically in its imagePullPolicy.

We acknowledge this concern and assure you that we will be actively working on updating the configuration of our runner to address this issue. In the interim, we recommend that you create your own runner. This will allow you to take advantage of the latest image versions, avoiding restrictions related to secrets placement, image pulling capabilities, and other potential limitations.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Thanks a lot, @traychomilev. I will explore creating my own runner and see how it goes.

So far it was great experience while exploring DataOps.live :-)

Would be great if partnerconnect-k8s-runner gets updated with all the required latest images so that it can be used by folks like me who are experimenting with your great product !

One query, though:

How come this partner connect job is running with the same partner connect runner image and with dbt version 1.0.1?

Any idea?



Hi @Sushil ,

Thank you for your observation! We can confirm that the current default version of dbt that we use is dbt 1.5. Therefore, we will proceed with updating the configuration of our partnerconnect-k8s-runner and will notify you once done.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Hi @traychomilev

Thanks a lot for your reply and for confirming the upgrade of the partner connect runner to the latest dbt version.

I will wait for the update, and after that I will verify my pipeline to confirm all looks good at my end.

Thanks again for your prompt response today :-)


Hello @Sushil ,

I hope this message finds you well. I wanted to inform you that the Partner Connect Runner has been successfully updated, and it is now operating on the latest 5-stable versions of the orchestrators as per your request. Please proceed at your convenience to verify your pipeline. 

If you encounter any issues or require further assistance, please don't hesitate to reach out. Thank you for your collaboration and swift response throughout this process.

Kind regards,
Traycho Milev
CloudOps & DataOps Engineer

Hi @traychomilev 

Thanks a lot for your help to resolve this issue on priority. Really appreciate !!

Now I can run pipeline without any issues 🙂

Happy New Year!!!

Thanks

Sushil Joshi
