I have a use case where I need to run the same Databricks workflow multiple times, independently, with different parameters. Please advise with some guidance.

I have a Python script (script.py) that reads a CSV file and loads it into an output table. The CSV file location (file_path1) and output table (table1) are parameterized in a param file. Similarly, another param file can hold a CSV file location (file_path2) and output table (table3), and the same script.py performs that load. (It is not necessary that all files arrive at the same time; I just need to kick off the relevant Databricks run with the right parameters.)
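One simple way to keep the per-dataset settings uniform is one small JSON param file per dataset, all sharing the same keys. A minimal sketch, assuming hypothetical file names (`params_dataset1.json`) and keys (`file_path`, `output_table`) that are not anything Databricks requires:

```python
import json

# Hypothetical layout: one JSON param file per dataset, e.g.
#   params_dataset1.json -> {"file_path": "/mnt/raw/file1.csv", "output_table": "table1"}

def load_params(param_file: str) -> dict:
    """Read one param file and validate the keys script.py expects."""
    with open(param_file) as f:
        params = json.load(f)
    missing = {"file_path", "output_table"} - params.keys()
    if missing:
        raise ValueError(f"{param_file} is missing keys: {missing}")
    return params
```

Validating the keys up front means a malformed param file fails fast, before a Databricks run is launched.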

I derive the parameters from the trigger file name: I use it to loop through the param files and fetch the matching parameter values.
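The trigger-file-to-param-file lookup can be a pure naming convention rather than a loop. A sketch, assuming a hypothetical convention where trigger file `dataset1.trg` maps to `params_dataset1.json`:

```python
def param_file_for(trigger_name: str) -> str:
    """Map a trigger file name to its param file (assumed naming convention)."""
    stem = trigger_name.rsplit(".", 1)[0]  # strip the extension, e.g. ".trg"
    return f"params_{stem}.json"
```

With a direct mapping like this, each arriving trigger file resolves to exactly one param file, so concurrent triggers for different datasets cannot interfere with each other.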

Please advise me how to dynamically pass the parameters and kick off the relevant run of the Databricks workflow.
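One option is to trigger the job from your file-arrival process via the Databricks Jobs REST API: `POST /api/2.1/jobs/run-now` starts a new run of an existing job, and for a Python-script task the `python_params` field is passed to the script as command-line arguments. A sketch, where the workspace host, token, and `job_id` are assumptions you would supply from your own environment:

```python
import json
import urllib.request

def build_run_now_payload(job_id: int, params: dict) -> dict:
    """Build the run-now request body; python_params reach script.py as sys.argv."""
    return {
        "job_id": job_id,
        "python_params": [params["file_path"], params["output_table"]],
    }

def trigger_run(host: str, token: str, payload: dict) -> dict:
    """POST to the Jobs 2.1 run-now endpoint; returns the API response (run_id)."""
    # host is your workspace URL (e.g. https://<workspace>.cloud.databricks.com)
    # and token a PAT or service-principal token -- both assumptions here.
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because each `run-now` call starts an independent run of the same job, you can fire one call per trigger file as it arrives, each with its own parameters, without waiting for the other datasets.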

Tags: dynamically executing the Databricks workflow based on parameters · Stack Overflow