I am building an ML pipeline in Azure, but it fails when I try to invoke the endpoint with my model. The error reads:
Error Code: ScriptExecution.StreamAccess.NotFound
Native Error: error in streaming from input data sources
StreamError(NotFound)
=> stream not found
NotFound
Error Message: The requested stream was not found. Please make sure the request uri is correct.
The first stage of the pipeline trains the model and works fine. Three parameters are defined in the pipeline YAML and passed to the Python script (sketched below):

- --input_folder (uri_folder): blob storage location of the dataset
- --config_file: parameters used in the Python script, including the name of the dataset file
- --output_folder (uri_folder): blob storage location where artifacts are saved

Again, there are no problems training the model and uploading it to the registry.
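For reference, a minimal sketch of how these three parameters could be wired up in the v2 pipeline YAML (names, paths, and the training script are placeholders, not my exact definition):

```yaml
# Hypothetical excerpt of the pipeline YAML (v2 pipelineJob schema);
# names and paths are placeholders, not the actual definition.
inputs:
  input_folder:
    type: uri_folder
    path: azureml://datastores/my_datastore/paths/my_folder/
  config_file:
    type: uri_file
    path: azureml:config_file@latest
outputs:
  output_folder:
    type: uri_folder
    path: azureml://datastores/my_store/paths/output/
jobs:
  train_model:
    type: command
    # code: and environment: omitted for brevity
    command: >-
      python train.py
      --input_folder ${{inputs.input_folder}}
      --config_file ${{inputs.config_file}}
      --output_folder ${{outputs.output_folder}}
    inputs:
      input_folder: ${{parent.inputs.input_folder}}
      config_file: ${{parent.inputs.config_file}}
    outputs:
      output_folder: ${{parent.outputs.output_folder}}
```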
However, things are a little fuzzy when it comes to passing the parameters while invoking the model for testing.
The error suggests a possible permission issue, but I verified the UAMI has Storage Blob Data Contributor role for each storage account.
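By "verified" I mean a check along these lines (the UAMI client ID and storage account resource ID below are placeholders):

```bash
# Hypothetical check: list the UAMI's role assignments on each storage account.
# <uami-client-id>, <sub-id>, <rg>, and <storage-account> are placeholders.
az role assignment list \
  --assignee <uami-client-id> \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account> \
  --output table
```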
The CLI command takes --input as an argument, and I pass a JSON file containing the three parameters needed for the script:
{
    "input_data": {
        "uri_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/"
    },
    "parameters": {
        "input_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/",
        "output_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_store/paths/output/",
        "config_file": "azureml:config_file@latest"
    }
}
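For context, the invocation looks roughly like this. I am assuming a batch endpoint here, since `az ml batch-endpoint invoke` is the v2 CLI command that accepts --input; the endpoint and deployment names are placeholders:

```bash
# Hypothetical invocation; endpoint, deployment, resource group, and
# workspace names are placeholders.
az ml batch-endpoint invoke \
  --name my-endpoint \
  --deployment-name my-deployment \
  --input params.json \
  --resource-group <rg> \
  --workspace-name <ws>
```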