I have an Azure Synapse pipeline that is triggered by a storage event. In the pipeline trigger we have defined two values: Container name, whose value is flatData, and Blob path begins with, whose value is inputFiles/.
What it should do: when files (e.g. market_orientation.csv) are received under flatData/inputFiles/, it should read them and move them to another location. What it actually does: once a file is received under flatData/inputFiles/, the pipeline is triggered, but while reading the file it tries to read it from flatData rather than flatData/inputFiles/. Somehow it is ignoring Blob path begins with while reading the file.
edited Apr 5 at 0:17 by qkfang; asked Mar 13 at 10:15 by GD_Java
1 Comment: It sounds like your trigger is working, but something in your pipeline setup may be wrong. However, you've provided no screenshots or details of how your pipeline is set up. Usually with a blob-creation trigger you add two pipeline parameters, 'Folder' and 'File', and the trigger passes @triggerBody().folderPath and @triggerBody().fileName to those parameters. You then use those parameters in any pipeline activities. Are you doing that? – Celador, Mar 14 at 9:35
1 Answer
It seems you have opted for a wildcard file path in your Copy activity, which may be why it is copying all the files in the given container.
The trigger only fires the pipeline when the uploaded or modified file matches the given filters; which file is actually read and copied depends on the activity's configuration.
To read and copy the same file that fired the trigger, you need to use the trigger parameters @triggerBody().folderPath and @triggerBody().fileName in your pipeline.
First, create two dataset parameters, folderpath and filename, of string type in your dataset, and use them in the dataset's folder path and file name fields.
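As a sketch, the dataset JSON would look roughly like the following (the dataset name, linked service name, and format type here are assumptions; substitute your own):

```json
{
  "name": "TriggeredFileDataset",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "folderpath": { "type": "string" },
      "filename": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "folderPath": { "value": "@dataset().folderpath", "type": "Expression" },
        "fileName": { "value": "@dataset().filename", "type": "Expression" }
      }
    }
  }
}
```

The key point is that the location fields are expressions over the dataset parameters rather than hard-coded paths, so each run reads exactly the path the trigger supplies.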
Similarly, create the same two parameters in the pipeline, without any default values. While creating the trigger, it will ask whether you want to provide values for these pipeline parameters; there, map the trigger parameters to the pipeline parameters.
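That trigger-to-pipeline mapping ends up in the trigger definition roughly like this (pipeline and parameter names are assumptions matching the sketch above):

```json
{
  "pipelines": [
    {
      "pipelineReference": {
        "referenceName": "MoveTriggeredFilePipeline",
        "type": "PipelineReference"
      },
      "parameters": {
        "folderpath": "@triggerBody().folderPath",
        "filename": "@triggerBody().fileName"
      }
    }
  ]
}
```

Note that @triggerBody().folderPath includes the full folder path of the blob that fired the trigger (e.g. flatData/inputFiles), which is what stops the activity from falling back to the container root.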
Now, in the activity, pass the pipeline parameters to the dataset parameters. For example, I have used a Lookup activity here; you can do the same with a Copy activity or a dataflow as well.
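Wiring the pipeline parameters through to the dataset in a Lookup activity looks roughly like this (activity and dataset names are assumptions; a Copy activity would reference the dataset the same way in its inputs):

```json
{
  "name": "LookupTriggeredFile",
  "type": "Lookup",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "dataset": {
      "referenceName": "TriggeredFileDataset",
      "type": "DatasetReference",
      "parameters": {
        "folderpath": "@pipeline().parameters.folderpath",
        "filename": "@pipeline().parameters.filename"
      }
    }
  }
}
```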
Upon upload or modification of a file, these parameters receive the folder path and file name of that file, and those values propagate to the dataset and the activity. You can confirm this from the pipeline run and the activity's input details.