How can I gain access to an AWS SQS queue message using argo events in Kubernetes?
I have a k3s cluster set up with the Argo Events framework, and an AWS SQS queue.
The event source and sensor are configured correctly: every time I send a message to the queue manually through the AWS dashboard, the sensor fires (I verified this by checking the pods the sensor creates). The trigger works, but the sensor is unable to get the body of the SQS message into the pod.
Here is the sensor:
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: s3-sensor
spec:
  template:
    serviceAccountName: argo-events-sa
  dependencies:
    - name: s3-file-upload
      eventSourceName: s3-source
      eventName: s3-file-upload
  triggers:
    - template:
        name: log-trigger
        k8s:
          group: ""
          version: v1
          resource: pods
          operation: create
          source:
            resource:
              apiVersion: v1
              kind: Pod
              metadata:
                generateName: log-pod-
                labels:
                  app: log-pod
              spec:
                containers:
                  - name: log-container
                    image: alpine
                    command: ["echo"]
                    args: ["S3 file upload event received:\n", ""]
                restartPolicy: Never
          parameters:
            - src:
                dependencyName: s3-file-upload
                dataKey: body
              dest: spec.containers.0.args.1
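For reference, my understanding is that with jsonBody: true the SQS event source JSON-decodes the message and publishes event data shaped roughly like this (a sketch; the field values are invented, and the exact contents of body depend on what S3 actually puts on the queue):

{
  "messageId": "<sqs-message-id>",
  "messageAttributes": {},
  "body": {
    "Records": []
  }
}

so I would expect dataKey: body to resolve to the decoded message body.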
Here is my event source file:
apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: s3-source
  namespace: argo-events
spec:
  sqs:
    s3-file-upload:
      jsonBody: true
      accessKey:
        key: AWS_ACCESS_KEY_ID
        name: aws-credentials
      secretKey:
        key: AWS_SECRET_ACCESS_KEY
        name: aws-credentials
      region: us-east-1
      queue: sqsqueue
      waitTimeSeconds: 5
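So far I have been sending test messages from the AWS console; the CLI equivalent would be roughly this (the account ID in the queue URL and the message body are placeholders):

aws sqs send-message \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/sqsqueue \
  --message-body '{"test": "hello"}'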
When I view the logs for the pod created by the trigger:
sudo kubectl --namespace argo-events logs log-pod-6kxd5
I get:
S3 file upload event received:
The event body from the SQS queue is not showing.
I've tried changing dataKey to 'data' and using busybox instead of alpine to echo the upload event.
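Two things I plan to try next. First, pulling the sensor's own logs by label to see how the parameter is being resolved:

sudo kubectl --namespace argo-events logs -l sensor-name=s3-sensor

Second, the trigger parameter docs also mention a dataTemplate field; if dataKey is the problem, a variant like this might work (a sketch; I haven't confirmed the template context):

parameters:
  - src:
      dependencyName: s3-file-upload
      dataTemplate: "{{ .Input.body }}"
    dest: spec.containers.0.args.1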