I'm currently working on transferring files from SFTP to Google Cloud Storage using the SFTPToGCSOperator in Apache Airflow. I've set up a connection in the Airflow UI, and the credentials work perfectly when I log in manually. However, I keep encountering the error below when the task executes.
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 768, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 734, in _execute_callable
    return ExecutionCallableRunner(
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/operator_helpers.py", line 252, in run
    return self.func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/models/baseoperator.py", line 424, in wrapper
    return func(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/google/cloud/transfers/sftp_to_gcs.py", line 119, in execute
    sftp_hook = SFTPHook(self.sftp_conn_id)
  File "/home/airflow/.local/lib/python3.11/site-packages/airflow/providers/sftp/hooks/sftp.py", line 93, in __init__
    super().__init__(*args, **kwargs)
TypeError: SSHHook.__init__() got an unexpected keyword argument 'host_proxy_cmd'
I'm developing this functionality using Docker on my local machine. Below is the code snippet that fails:
from airflow.providers.google.cloud.transfers.sftp_to_gcs import SFTPToGCSOperator

transfer_to_gcs = SFTPToGCSOperator(
    task_id=f'transfer_{table}',
    sftp_conn_id="sftp_connection",   # replace with your SFTP connection ID
    source_path=sftp_path,
    destination_bucket=gcs_bucket,    # replace with your GCS bucket name
    destination_path=file_name,
    dag=dag,
)
Thanks in advance for your help.
1 Answer
I recently had this issue and fixed it by upgrading the apache-airflow-providers-ssh package to 4.0.0. That release added the host_proxy_cmd parameter to SSHHook, which recent versions of the SFTP provider's SFTPHook pass through when the hook is constructed; with an older SSH provider installed, the keyword is rejected with exactly this TypeError.
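For reference, a minimal sketch of applying that fix, assuming a standard pip-based Airflow Docker image (your image or constraints setup may differ):

pip install "apache-airflow-providers-ssh>=4.0.0"

After upgrading, restart the Airflow containers so the scheduler and workers pick up the new provider. You can confirm what is installed with pip show apache-airflow-providers-ssh, or list all provider versions with airflow providers list.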