We want to use DataOps to load data from Azure Storage on a regular schedule (every 30 minutes). The data is stored in blob storage under a datetimestamp/entity/file.csv layout, e.g.:
/2024-07-08T09.14.53Z/exchangerate/2024.csv
/2024-07-08T09.30.04Z/exchangerate/2024.csv
/2024-07-30T13.29.53Z/exchangerate/2024.csv
Using the PATTERN method takes too long, so I was wondering if it's possible to use a before-script to get a list of valid files (e.g. using Snowflake's LIST @stage command), store the result as a variable, and then substitute that into the path (e.g. path: <variable>) — but I can't find any documentation or examples online that show this.
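As a sketch of one possible approach (not a confirmed DataOps feature): a before-script could run LIST @stage, filter the returned paths to only those batches newer than the last successful load, and build an explicit FILES = (...) clause for COPY INTO, which Snowflake does support and which avoids a stage-wide PATTERN scan. The helper names below (`parse_batch_ts`, `files_clause`) and the watermark idea are hypothetical illustrations, not part of any documented API; only the path-parsing and clause-building logic is shown, with the actual Snowflake connection left out.

```python
from datetime import datetime, timezone

def parse_batch_ts(path: str) -> datetime:
    # Paths look like /2024-07-08T09.14.53Z/exchangerate/2024.csv;
    # the first segment is an ISO-like timestamp with dots instead
    # of colons in the time portion.
    first_segment = path.strip("/").split("/")[0]
    return datetime.strptime(first_segment, "%Y-%m-%dT%H.%M.%SZ").replace(
        tzinfo=timezone.utc
    )

def files_clause(listing: list[str], watermark: datetime) -> str:
    # Keep only paths from batches newer than the last load, then
    # render them as a Snowflake COPY INTO ... FILES = (...) clause.
    new_paths = [p for p in listing if parse_batch_ts(p) > watermark]
    quoted = ", ".join(f"'{p.lstrip('/')}'" for p in sorted(new_paths))
    return f"FILES = ({quoted})"
```

In the before-script, the `listing` input would come from executing LIST @stage (for example via snowflake-connector-python) and collecting the name column of the result; the generated clause string could then be injected into the COPY INTO statement, or exported as a pipeline variable if the orchestrator supports that. Whether a particular DataOps tool allows substituting a variable into `path:` is exactly the open question here, so this is only a sketch of the surrounding logic.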