
Here’s a requirement you don’t get every day: there is a cloud system (Salesforce in this case, though it could be any) where users will create records that reference source data that exists in SAP. Nothing special so far. Because of the fluidity of the data, there will be times when some of the information is known by the user in the cloud before it is entered in SAP. That is less common, but bi-directional integrations happen all the time, as they should, and one would have made this somewhat easier. It wasn’t an option here.

OK, well, then we just have a business process that says if the data isn’t available yet, come back and finish it later. Nope, that won’t work either, because the whole purpose of moving these processes from another cloud system is that business users do not want to do it that way. OK … I will save you the tedium of all of the other scenarios that were floated and shot down if you will trust me that the implementation team went through them all, leaving us with the following constraints:

• The source-of-truth (SoT) is SAP.

• The Salesforce application must receive updates from SAP.

• The data may not exist in SAP at the time of initial entry in Salesforce.

• The integration must be in batch mode.

• The external identifier in Salesforce will be a combination of two fields in SAP.

• The Salesforce record name for the value is non-unique and consists of a single field.

Again, I will spare you the long story of the various solutions that were discussed and discarded as insufficient. The solution consists of three parts (two if you are a data purist, which I am not).

The solution

The first part is, how can we allow users to create new records in Salesforce when the SoT is SAP? Because the external identifier is a combination of two fields and the record name consists of a single field (when selecting existing records, users are required to review the matches before selecting), the first step requires two pieces:

1. If a matching record cannot be found, the user is allowed to create one. However, the record form consists of only the single Name field.

2. To prevent duplicate and orphan data, a Workflow Rule is created that triggers a Field Update combining the two fields that make up the external identifier field (UniqueMaterialIdNDC) whenever the record is updated. The external identifier field has a unique constraint, so if a user attempts to create more than one new record with a Name that has not yet been paired with the second part of the unique value (MaterialID), they are given an error message advising them that the value already exists.

This solves the human side of the equation.
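The actual build uses the declarative Workflow Rule and Field Update described above rather than code, but for readers who think in Apex, the concatenation amounts to something like the sketch below. The object and field API names (Material__c, MaterialID__c, UniqueMaterialIdNDC__c) and the hyphen separator are illustrative assumptions, not the real configuration.

```apex
// Hypothetical Apex equivalent of the declarative Field Update (illustrative names only).
trigger SetUniqueMaterialIdNDC on Material__c (before insert, before update) {
    for (Material__c mat : Trigger.new) {
        // Combine the two component values into the external identifier.
        // The unique constraint on UniqueMaterialIdNDC__c rejects a second record
        // with the same combination, which is what surfaces the error to the user.
        String materialId = mat.MaterialID__c == null ? '' : mat.MaterialID__c;
        mat.UniqueMaterialIdNDC__c = materialId + '-' + mat.Name;
    }
}
```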

The other part of the solution has to do with how we determine which data to synchronize with new values created by business users. This is similar to the “4 gallons from a 3-gallon jug and a 5-gallon jug” problem, except without the math (the internet seems to think that puzzle originated with Die Hard 3, even though I learned it more than a decade before Die Hard 1). The problem is one of perception and the desire to avoid inelegant solutions.

The challenge is that a full update comes from SAP every day, and it is important not to create new records where partial records have already been created in Salesforce, while still adding truly new records and updating existing ones. Here is how to do that:

1. The incoming batch from SAP is loaded to a holding object (SAP_Material_Master).

2. Get a list of all Salesforce Records that do not have a MaterialID value (new SFDC Records).

3. Get a list of all Salesforce Records whose Name matches a Name from the previous list (SFDC Potential Records).

4. Get a list of all SAP_Material_Master records whose Name matches a Name from the first list (new SFDC Records); call these Potential SAP Matches.

5. Remove from Potential SAP Matches every record that has a matching unique ID in SFDC Potential Records.

6. For every remaining Potential SAP Matches record that has a matching Name in new SFDC Records, update the corresponding Salesforce Record.

(and now comes the hard part …)

7. Upsert all of the SAP_Material_Master records to Salesforce Records.

Why was that last one hard? Because upserting everything goes against all of our training to avoid redundant work. But here is the reality: despite the simplicity of the solution above, it took a few days to design and then a couple more to develop and debug. Ensuring that records were not updated redundantly would have taken another one to two days of effort and saved 0.010 seconds in a daily batch routine. In this case, an inelegant solution is truly the best solution when the requirements and needs of the business are taken into account.
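To make the seven steps above concrete, here is a rough Apex sketch of the nightly job. It reuses the same illustrative API names as the trigger sketch earlier (Material__c, MaterialID__c, UniqueMaterialIdNDC__c) and assumes a SAP_Material_Master__c holding object with Name__c and MaterialID__c fields; it glosses over governor limits, batching, and error handling, so treat it as a sketch of the matching logic rather than the production code.

```apex
// Illustrative sketch only; object and field names are assumptions, not the real org.
public with sharing class MaterialSyncSketch {

    // Runs after the SAP extract has been loaded into SAP_Material_Master__c (step 1).
    public static void syncFromHoldingObject() {
        // Step 2: user-created Salesforce records that do not have a MaterialID yet.
        List<Material__c> newSfdcRecords =
            [SELECT Id, Name FROM Material__c WHERE MaterialID__c = null];
        Set<String> newNames = new Set<String>();
        for (Material__c m : newSfdcRecords) {
            newNames.add(m.Name);
        }

        // Step 3: existing (complete) Salesforce records sharing a Name with those new records.
        Set<String> existingUniqueIds = new Set<String>();
        for (Material__c m : [SELECT UniqueMaterialIdNDC__c FROM Material__c
                              WHERE Name IN :newNames AND MaterialID__c != null]) {
            existingUniqueIds.add(m.UniqueMaterialIdNDC__c);
        }

        // Step 4: holding-object rows that could belong to the new Salesforce records.
        List<SAP_Material_Master__c> potentialSapMatches =
            [SELECT Name__c, MaterialID__c, UniqueMaterialIdNDC__c
             FROM SAP_Material_Master__c WHERE Name__c IN :newNames];

        // Step 5: drop rows already represented by a complete Salesforce record.
        Map<String, SAP_Material_Master__c> sapByName = new Map<String, SAP_Material_Master__c>();
        for (SAP_Material_Master__c sap : potentialSapMatches) {
            if (!existingUniqueIds.contains(sap.UniqueMaterialIdNDC__c)) {
                sapByName.put(sap.Name__c, sap);
            }
        }

        // Step 6: fill in the MaterialID on the user-created records that remain,
        // so the Field Update recalculates their external identifier to match SAP.
        List<Material__c> toUpdate = new List<Material__c>();
        for (Material__c m : newSfdcRecords) {
            if (sapByName.containsKey(m.Name)) {
                m.MaterialID__c = sapByName.get(m.Name).MaterialID__c;
                toUpdate.add(m);
            }
        }
        update toUpdate;

        // Step 7: upsert everything from the holding object, keyed on the external identifier.
        // Redundant for unchanged rows, but simple, and cheap in a daily batch.
        List<Material__c> allMaterials = new List<Material__c>();
        for (SAP_Material_Master__c sap : [SELECT Name__c, MaterialID__c, UniqueMaterialIdNDC__c
                                           FROM SAP_Material_Master__c]) {
            allMaterials.add(new Material__c(
                Name = sap.Name__c,
                MaterialID__c = sap.MaterialID__c,
                UniqueMaterialIdNDC__c = sap.UniqueMaterialIdNDC__c));
        }
        Database.upsert(allMaterials, Material__c.UniqueMaterialIdNDC__c, false);
    }
}
```

Note that the sketch makes no attempt to skip rows that are already up to date in step 7; that is the deliberately inelegant part described above.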

This is why this was the Integration from Heck: because the various business rules eliminated the “best practice” approaches, the solution had to support the business needs while still being reliable enough to be maintainable.

(If the title had you picturing someone with horns and a pointy tail, holding a large spoon, you are my people. If not, I still like you anyway :-))


Author

Senior Technical Architect Scott S. Nelson has over 20 years’ experience consulting on robust and secure solutions in areas such as multi- and hybrid-cloud, system integrations, business process automation, human workflow management, and quality assurance innovation.