We receive a high volume of Accounts & Contacts through an external service (approx. 2M per day). We need to load them into Salesforce with a fire-and-forget type of integration. Please share your thoughts on the architecture below.
- A REST API in Salesforce will be called by the external service; it will accept the request, publish a platform event, and return an immediate response to the external service (like 202).
- In the platform event trigger, we'll deserialize the payload into a wrapper class and send it to a Queueable class for processing.
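A minimal sketch of step 1, assuming a hypothetical platform event named `Contact_Payload__e` with a `Payload__c` long text area field (all names are illustrative, not from the original post):

```apex
// Hypothetical names: Contact_Payload__e platform event, Payload__c long text area field.
@RestResource(urlMapping='/contactload/*')
global with sharing class ContactLoadApi {
    @HttpPost
    global static void receive() {
        RestRequest req = RestContext.request;
        // Publish the raw payload as a platform event; actual processing happens later.
        Database.SaveResult sr = EventBus.publish(
            new Contact_Payload__e(Payload__c = req.requestBody.toString())
        );
        // Fire-and-forget: acknowledge receipt immediately, 202 Accepted on success.
        RestContext.response.statusCode = sr.isSuccess() ? 202 : 500;
    }
}
```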
Limitation: the external system can send only one request (with one payload) at a time. Having the platform event trigger call the Queueable class for every record is not useful, as we will reach the daily limit of asynchronous executions in Salesforce (max 250,000 in a 24-hour period).
Another workaround we are considering is to load the payload requests into a custom object from the platform event trigger and then bulk-process them using a batch job or Queueable. What's the recommended way to load 2M records per day?
We tried steps 1 and 2 above, but a Queueable being called for every record will not work for the high volume of records received daily.
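For the staging-object workaround, the platform event trigger can collect events and do one bulk DML per batch. A sketch, assuming a hypothetical `Inbound_Payload__c` staging object (names are illustrative):

```apex
// Hypothetical names: Contact_Payload__e event, Inbound_Payload__c staging object.
trigger ContactPayloadTrigger on Contact_Payload__e (after insert) {
    List<Inbound_Payload__c> staged = new List<Inbound_Payload__c>();
    for (Contact_Payload__e ev : Trigger.new) {
        staged.add(new Inbound_Payload__c(Payload__c = ev.Payload__c));
    }
    // One bulk insert per event batch (up to 2000 events), instead of
    // enqueueing one Queueable per record.
    insert staged;
}
```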
asked Feb 24 at 22:05 by Shaila · 1 Answer
Not a proper answer, but too long for a comment:
What's the delay between the REST calls? Enough for platform events to be processed as single records? Or did you configure https://developer.salesforce.com/docs/atlas.en-us.platform_events.meta/platform_events/platform_events_trigger_config.htm and change the trigger batch size from the default 2000 to 1 at a time? If you did, then "well, here's your problem" — the docs say: "We don't recommend setting the batch size to a small number or to 1."
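For reference, the trigger batch size referred to above is set by deploying a PlatformEventSubscriberConfig metadata file. A sketch of what such a file might look like (the label, trigger name, and batch size are illustrative; the answer's point is that a tiny batch size like 1 should normally not be deployed):

```
<?xml version="1.0" encoding="UTF-8"?>
<PlatformEventSubscriberConfig xmlns="http://soap.sforce.com/2006/04/metadata">
    <batchSize>200</batchSize>
    <masterLabel>ContactPayloadTriggerConfig</masterLabel>
    <platformEventConsumer>ContactPayloadTrigger</platformEventConsumer>
</PlatformEventSubscriberConfig>
```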
Why is there a Queueable at play? REST API -> fire platform event -> that event trigger can be "normal", not asynchronous. Were you hitting some limits, and that caused you to inject the Queueable into the whole thing? If you need async to send a callout back to the external system, do it from a batch job scheduled every hour or so. If you're reusing an old Queueable, refactor it a bit, or call it synchronously:
new MyQueueable(wrapper).execute(null);
You're stashing the whole serialized payload in the platform event, right? In a long text area field? Will the payload ever be over 131,072 characters?
How do you handle errors in processing the JSON? Is it OK that the platform event is "poof, gone", or would you benefit from some staging table? Either a normal object or a "big object". Normal can be expensive storage-wise: 2M * 2 KB = 4 GB permanently allocated for a staging table... But having the records in a table means you can call a batch job to consume them at your own pace and control the chunk size with normal batch tricks, rather than deploying a secret metadata file... With a big object you could do the main inserts in execute() and delete from the big object in finish().
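A sketch of the batch job that drains such a staging table at its own pace (assuming the same hypothetical `Inbound_Payload__c` object with a `Payload__c` field, and a simplified JSON-to-Contact mapping):

```apex
// Hypothetical Inbound_Payload__c staging object with a Payload__c long text area field.
global class InboundPayloadBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Payload__c FROM Inbound_Payload__c');
    }
    global void execute(Database.BatchableContext bc, List<Inbound_Payload__c> scope) {
        List<Contact> toUpsert = new List<Contact>();
        for (Inbound_Payload__c row : scope) {
            // Deserialize and map to Account/Contact as needed (greatly simplified here).
            Map<String, Object> data =
                (Map<String, Object>) JSON.deserializeUntyped(row.Payload__c);
            toUpsert.add(new Contact(LastName = (String) data.get('lastName')));
        }
        upsert toUpsert;
        delete scope; // drain the staging table as rows are consumed
    }
    global void finish(Database.BatchableContext bc) {}
}
// Usage: Database.executeBatch(new InboundPayloadBatch(), 2000);
```

The chunk size passed to Database.executeBatch is the "normal batch trick" for controlling throughput, instead of tuning the platform event trigger's batch size via metadata.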
Have you considered putting a proper ESB in the middle? BizTalk, MuleSoft... something to queue up the messages, with built-in error reporting, replaying of messages...
Source: rest - Bulk data processing for high volume records using platform events and Queueable class in Salesforce - Stack Overflow