I would like to hear from you whether my approach is the best available, or whether there is an out-of-the-box or better solution in terms of convenience and security. I have Azure Data Factory serving as the ingestion tool and Synapse Analytics as the transformation layer (my solution is mostly in Synapse, but a Self-Hosted Integration Runtime linked service cannot be shared among Synapse workspaces, so this would require more VMs with SHIR linked services, which was denied because of the cost).

Currently the only way I see is a Web Activity within the ADF pipeline making a REST API call to trigger the Synapse pipeline. I guess it's a pretty decent solution, as I've come across it before, but maybe there is something better? Any recommendations and ideas would be highly appreciated.

Thank you in advance
Asked Nov 22, 2024 at 11:15 by lifeofthenoobie
  • Are you limited to using only these two resources? If you are allowed to use a storage account, you can copy a sample file to a storage account container with the ADF pipeline, and then use a storage event trigger on the Synapse pipeline. With this approach, you avoid any possibility of hardcoding secrets or Bearer tokens in the Web Activity. If needed, I can provide this as an answer. – Rakesh Govindula Commented Nov 22, 2024 at 11:41
  • Yes, I use storage accounts with hierarchical namespace enabled, so a data lake. So it is possible to use a built-in trigger in the Synapse workspace once the pipeline in ADF completes its run and all files have landed in storage? What about if I enable CDC in the near future? – lifeofthenoobie Commented Nov 22, 2024 at 12:35
  • CDC won't trigger any pipeline. But whenever you start a pipeline with a dataflow after its first run, it will pick up the latest modified files' data; on its first run, it takes a full load. To avoid extra triggers, you need a dedicated container and a sample dummy file used only for this triggering purpose. – Rakesh Govindula Commented Nov 22, 2024 at 12:40
  • I don't use dataflows, only the Copy activity in ADF. So I guess I am forced to use the REST API call and to secure the Bearer token in the output of the activity that retrieves it. – lifeofthenoobie Commented Nov 22, 2024 at 14:09
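For reference, a Synapse storage event trigger along the lines discussed in these comments might look like the definition below. All names, the subscription ID, and the paths are placeholders, not values from this thread; the trigger fires whenever a blob lands under the dedicated trigger container:

```json
{
  "name": "TriggerFromAdfDummyFile",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/trigger-container/blobs/",
      "blobPathEndsWith": ".txt",
      "ignoreEmptyBlobs": true,
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
      "events": ["Microsoft.Storage.BlobCreated"]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "type": "PipelineReference",
          "referenceName": "SynapseTransformPipeline"
        }
      }
    ]
  }
}
```

Scoping `blobPathBeginsWith` to a container used for nothing else is what prevents ordinary data files from firing the trigger.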

1 Answer


Currently, there is no specific activity in ADF to trigger a Synapse pipeline. As mentioned, you can either use the REST API to trigger the Synapse pipeline or use a storage event trigger.

  • The storage event trigger approach requires a temporary storage location to which nothing other than the ADF pipeline uploads files. Create a storage event trigger on the Synapse pipeline pointing to this location. In the ADF pipeline, after all other activities, use a Copy activity to copy a sample file to this location. Every time you want to trigger the Synapse pipeline, run the ADF pipeline: it copies the same sample file to this storage location, which fires the trigger. The sample file can be of any type, as it is used only for triggering. With this approach, you avoid hardcoding secrets and Bearer tokens.
  • The REST API approach requires handling secrets and Bearer tokens, but you can use the option to secure the input and output of the Web Activity. Additionally, with this approach you can pass parameters from the Web Activity body, and if the Synapse pipeline returns any values, you can capture them after the pipeline run. Go through this SO answer to learn more about it.
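As a sketch of the REST API approach: the Web Activity calls the Synapse `createRun` endpoint for the target pipeline. The Python helper below only builds that request (the workspace and pipeline names are placeholders); in ADF you would paste the resulting URL into the Web Activity, set the method to POST, and use system-assigned managed identity authentication against the resource `https://dev.azuresynapse.net`, so that no Bearer token has to be hardcoded:

```python
# Sketch: build the Synapse REST call that an ADF Web Activity would make.
# The workspace and pipeline names below are placeholders, not real resources.
def build_create_run_request(workspace: str, pipeline: str,
                             api_version: str = "2020-12-01"):
    """Return (method, url) for the Synapse pipeline createRun REST call."""
    url = (f"https://{workspace}.dev.azuresynapse.net"
           f"/pipelines/{pipeline}/createRun?api-version={api_version}")
    return "POST", url

method, url = build_create_run_request("myworkspace", "TransformPipeline")
print(method, url)
# POST https://myworkspace.dev.azuresynapse.net/pipelines/TransformPipeline/createRun?api-version=2020-12-01
```

Any pipeline parameters go as a JSON object in the POST body, which maps directly to the Web Activity's body field. The ADF managed identity needs a Synapse RBAC role (such as Synapse Credential User or higher) on the target workspace for the call to succeed.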

Tags: azure · Best approach to trigger Synapse from ADF · Stack Overflow