I have a use case where I need to execute the same Databricks workflow with different parameters, independently of one another. Please advise with some guidance.
I have a Python script (script.py) that reads from a CSV file and loads the data into an output table. The CSV file location (file_path1) and output table (table1) are parameterized in a param file. I can also have another CSV file location (file_path2) and output table (table3) parameterized in the param file, reusing the same script.py to do the load. (It is not necessary that I have all the files at the same time; I just need to kick off the relevant instance of the Databricks workflow with the right parameters.) For context, script.py looks roughly like the sketch below (the argument names are illustrative).
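A minimal sketch of script.py, assuming it runs as a Databricks Python task and receives the file path and table name as command-line arguments (the `--file_path` and `--table_name` names are my own, not anything Databricks-specific):

```python
# script.py -- minimal sketch: read a CSV and append it to a target table.
import argparse
from pyspark.sql import SparkSession

parser = argparse.ArgumentParser()
parser.add_argument("--file_path", required=True)   # e.g. file_path1 or file_path2
parser.add_argument("--table_name", required=True)  # e.g. table1 or table3
args = parser.parse_args()

spark = SparkSession.builder.getOrCreate()

# Read the CSV from the parameterized location and load it into the output table.
df = spark.read.option("header", "true").csv(args.file_path)
df.write.mode("append").saveAsTable(args.table_name)
```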
I derive the parameters from the trigger file name, looping through the param files to fetch the matching param values.
Please advise how to dynamically pass the parameters and kick off the relevant instance of the Databricks workflow.
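Something along these lines is what I'm imagining, using the Jobs `run-now` REST endpoint (API 2.1) to start a run of the job that wraps script.py, but I'm not sure whether this is the right approach. The host, token, job ID, and parameter values are placeholders; in reality the params would come from looking up the param file matched to the trigger file name:

```python
# Hypothetical trigger script: look up the params for the incoming trigger file,
# then start a run of the Databricks job via POST /api/2.1/jobs/run-now.
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                                 # placeholder
JOB_ID = 123  # placeholder: the job that runs script.py

# Placeholder values; these would be fetched from the param file
# that corresponds to the trigger file name.
params = {"file_path": "/mnt/raw/file_path1.csv", "table_name": "table1"}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_ID, "job_parameters": params},
)
resp.raise_for_status()
print("Started run:", resp.json()["run_id"])
```

Since each call to `run-now` starts an independent run, I'm hoping I can fire this once per trigger file and have the loads for (file_path1, table1) and (file_path2, table3) run independently.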