 

Salesforce-to-salesforce round trip field update issue

Tags:

salesforce

It's been a couple of releases since I've had to do an S2S integration, and I've run into an unexpected issue that hopefully someone can solve more effectively than I have.

I have two orgs, sharing contacts over S2S.

Contacts in each org have an identical schema: standard fields plus custom fields. I've reproduced a base case with just two custom fields: checkbox field A and Number(18,0) field B.

Org 1 publishes field A, and subscribes to field B.

Org 2 subscribes to field A, and publishes field B.

Org 1 initiates all S2S workflow by sharing contacts to Org 2 over S2S. Org 2 has auto-accept on.

Org 2 has a Contact Before Insert trigger that simply uses field A to calculate the value of field B, e.g. if field A is checked, populate field B with 2; if unchecked, with 0. (This is of course a drastic over-simplification of what I really need to do, but it's the base reproducible case.)
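For reference, here's a minimal sketch of that trigger logic (the API names Field_A__c and Field_B__c are placeholders, not my real field names):

    trigger ContactBeforeInsert on Contact (before insert) {
        for (Contact c : Trigger.new) {
            // Derive the shared number field from the shared checkbox field.
            c.Field_B__c = (c.Field_A__c == true) ? 2 : 0;
        }
    }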

That all works fine in Org 2 - contacts come across fine with field A, and I see the field results get calculated into field B.

The problem is that the result - field B - does not get auto-shared back to Org 1 until the next contact update. It can be as simple as me editing a non-shared field on that same contact, like "Description", in Org 2, and then I instantly see the previously calculated value of field B get pushed back to Org 1.

I'm assuming this is because the calculation of field B happens within a Before Insert, so the S2S connection treats the change as part of the inbound sync it just performed (I can see how this logic would make sense to prevent infinite S2S update loops).

I first tried creating a workflow field update that forcibly updated a (new, dummy) shared field when field B changed, but that still did not cause the update to flow back, presumably because it runs in the same execution context, which Salesforce deems exempt from re-sharing. I also tried a workflow rule that forwarded the contact back to the connection queue when the field changed, and that didn't work either.

I then tried a re-update statement in an AfterUpdate trigger - if the shared field is updated, reload and re-update the shared object. That also didn't work.

I did find a solution: a Future method, called by the AfterUpdate trigger, that reloads and touches any record whose shared field was changed by the BeforeUpdate trigger. This does cause the field results to show up in near-real-time in the originating organization.
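Here's roughly what that workaround looks like; the class name and field API names are placeholders for illustration:

    trigger ContactAfterUpdate on Contact (after update) {
        Set<Id> changedIds = new Set<Id>();
        for (Contact c : Trigger.new) {
            // Collect records whose shared field was changed by the before-trigger logic.
            if (c.Field_B__c != Trigger.oldMap.get(c.Id).Field_B__c) {
                changedIds.add(c.Id);
            }
        }
        // Guard against calling a future method from a future context.
        if (!changedIds.isEmpty() && !System.isFuture()) {
            ContactS2SRetouch.retouch(changedIds);
        }
    }

    public class ContactS2SRetouch {
        @future
        public static void retouch(Set<Id> contactIds) {
            // Re-save in a separate transaction so S2S treats it as a genuine
            // update and pushes the calculated field back to the originating org.
            update [SELECT Id FROM Contact WHERE Id IN :contactIds];
        }
    }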

This solution works for me for now, but I feel like I MUST be missing something. It causes way more Future calls and DML to be executed than should be necessary.

Does anyone have a more elegant solution for this?

asked Nov 14 '22 by jkraybill

1 Answer

Had the same problem and an amazing Salesforce support rep unearthed this documentation, which covers Salesforce's specific guidance here: https://web.archive.org/web/20210603004222/https://help.salesforce.com/articleView?id=sf.business_network_workflows.htm&type=5

Sometimes it makes sense to use an Apex trigger instead of a workflow. Suppose that you have a workflow rule that updates a secondary field, field B, when field A is updated. Even if your Salesforce to Salesforce partner subscribed to fields A and B, updates to field B that are triggered by your workflow rule aren’t sent to your partner’s organization. This prevents a loop of updates.

If you want such secondary field updates to be sent to your Salesforce to Salesforce partners, replace the workflow with an Apex trigger that uses post-commit logic to update the secondary field.

In bi-directional connections, Salesforce to Salesforce updates are triggered back only on “after” triggers (for example, “after insert” or “after update”), not on “before” triggers.

This is what OP ended up doing, but this documentation from Salesforce at least clears up the assumptions and guesses that were made here as part of the discussion. It also helpfully points out that it's not best practice to use "before" triggers in these circumstances, for future reference.

answered Dec 25 '22 by Walter E