Integration architecture
The following diagram shows the integration architecture we have adopted for experimenting with the Google Prediction API and Salesforce data:
The data is loaded into a Google Cloud Storage bucket and trained manually through the console using the Prediction API, which produces a model. We then query the trained model by triggering an HTTP request to the Google Prediction API from Salesforce. The architecture is purposefully kept very simple to help readers grasp the fundamentals before we approach the Prediction API in detail.
The following are the steps required to train the dataset via the Google Prediction API. Note that we will use prediction.trainedmodels.insert to train the model:
- Load the Sample Dataset: At this point, the assumption is that you have extracted data from the Salesforce opportunity object and loaded it into Google Cloud Storage; the steps are covered in the prerequisites section. A hypothetical excerpt of the file is shown below.
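The Prediction API expects a training CSV with no header row and the target value in the first column. Assuming the target is the opportunity probability and the features are the Type and Amount fields (the same features the trigger sends later), SFOpportunity.csv might look like the following; the rows here are purely illustrative:
90,Existing Customer - Upgrade,150000
15,New Customer,25000
60,Existing Customer - Replacement,78000
Your actual export will, of course, contain your own opportunity records and values.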
- Training the Dataset: Let's use the Google APIs Explorer to train the sample data using the API. The APIs Explorer is located at https://developers.google.com/apis-explorer/#s/prediction/v1.6/.
The following screenshot shows how one can train the model:
The request and response JSON are shown as follows. Note that learned-maker-155103 is the project ID; replace it with your own project ID.
Request Payload:
POST https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels
{
  "modelType": "regression",
  "id": "opportunity-predictor",
  "storageDataLocation": "salesforceeinstein/SFOpportunity.csv"
}
Note carefully that storageDataLocation points to the location in Cloud Storage where our data resides. Once the request succeeds, we get a response from the API, which is shown as follows:
Response Payload:
200
{
  "kind": "prediction#training",
  "id": "opportunity-predictor",
  "selfLink": "https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor",
  "storageDataLocation": "salesforceeinstein/SFOpportunity.csv"
}
We can monitor the status of the training via the prediction.trainedmodels.get API. The request to execute in the console is as follows:
Request Payload:
GET https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor
Response Payload:
200
{
  "kind": "prediction#training",
  "id": "opportunity-predictor",
  "selfLink": "https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor",
  "created": "2017-01-18T19:10:27.752Z",
  "trainingComplete": "2017-01-18T19:10:48.584Z",
  "modelInfo": {
    "numberInstances": "18",
    "modelType": "regression",
    "meanSquaredError": "79.61"
  },
  "trainingStatus": "DONE"
}
The trainingStatus field in the response shows the training status, which in this case is DONE. Note that if your data does not have any meaningful correlation, you will see a very high meanSquaredError value.
- Using the Trained Dataset to Predict the Outcome: For this, we will create a simple trigger on the opportunity object in Salesforce that makes an asynchronous callout to the Google Prediction API to predict the probability of opportunity closure.
Before we add the trigger to Salesforce, make sure that a Predicted_Probability field (API name Predicted_Probability__c, as referenced in the Apex code) is created on the Opportunity object. To create the field in Salesforce, follow these steps:
- Navigate to SETUP | Object Manager | Opportunity in the Lightning experience, or Setup | Customize | Opportunities | Fields | New in the Classic experience.
- Select the Field type as Number (with a length of 5 and 2 decimal places), accept the remaining defaults, and save:
The trigger code uses Apex, a Java-like language provided by the Salesforce platform for writing business logic. For the purposes of demonstration, we will keep the code very simple:
//Trigger makes an API call to the Google Prediction API to predict opportunity probability
//Please note that this trigger is written for demonstration purposes only and is not bulkified or batched
trigger opportunityPredictor on Opportunity (after insert) {
    if (Trigger.isInsert && Trigger.isAfter) {
        OpportunityTriggerHelper.predictProbability(Trigger.new[0].Id);
    }
}
If you are using Salesforce Classic, the navigation path to add the trigger on the Opportunity object is SETUP | Customize | Opportunities | Triggers. In the Lightning experience, create the trigger using the Developer Console. Also note that since the trigger depends on an Apex class, save the dependent Apex classes before saving the trigger.
The Apex class that is invoked from the trigger is as follows. Create it under SETUP | Develop | Apex Classes:
//Apex Class to make a callout to the Google Prediction API
public with sharing class OpportunityTriggerHelper {
    @future(callout=true)
    public static void predictProbability(Id opportunityId) {
        Opportunity oppData = [SELECT Id, Amount, Type, Predicted_Probability__c
                               FROM Opportunity WHERE Id = :opportunityId];
        HttpRequest req = new HttpRequest();
        //The named credential Google_Auth holds the endpoint and authentication details
        req.setEndpoint('callout:Google_Auth');
        req.setMethod('POST');
        req.setHeader('content-type', 'application/json');
        //Form the body: csvInstance must list the features in the same order as the training data
        PredictionAPIInput apiInput = new PredictionAPIInput();
        PredictionAPIInput.csvData csvData = new PredictionAPIInput.csvData();
        csvData.csvInstance = new List<String>{oppData.Type, String.valueOf(oppData.Amount)};
        apiInput.input = csvData;
        req.setBody(JSON.serialize(apiInput));
        Http http = new Http();
        HTTPResponse res = http.send(req);
        System.debug(res.getBody());
        if (res.getStatusCode() == 200) {
            //Parse the response and store the predicted value on the opportunity
            Map<String, Object> result = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
            oppData.Predicted_Probability__c = Decimal.valueOf((String) result.get('outputValue'));
            update oppData;
        }
    }
}
The Apex class that models the JSON request body (and is serialized in the callout above) is as follows:
public class PredictionAPIInput {
    public csvData input;
    public class csvData {
        public String[] csvInstance;
    }
}
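To illustrate what goes over the wire, the following is a hypothetical example of the request body that JSON.serialize(apiInput) produces and the kind of response the code expects. The predict URL and the values shown are assumptions; the named credential Google_Auth is assumed to resolve to the prediction.trainedmodels.predict endpoint for our trained model:
Request Payload:
POST https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor/predict
{
  "input": {
    "csvInstance": ["New Customer", "25000"]
  }
}
Response Payload:
200
{
  "kind": "prediction#output",
  "id": "opportunity-predictor",
  "outputValue": "22.97"
}
The Apex class reads the outputValue field from this response and stores it in the Predicted_Probability__c field on the opportunity.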
Salesforce Apex provides HTTP classes that let us make calls to an external API. We are leveraging those, along with the configuration in the named credential, to make an HTTP request to the Google Prediction API.
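If you want to verify the callout before relying on the trigger, a minimal sketch is to run the helper directly from the Developer Console (Execute Anonymous), assuming the Google_Auth named credential is already configured and at least one opportunity exists in the org:
//Hypothetical quick test from Execute Anonymous: pick any existing opportunity,
//invoke the future method, then check the debug log and the record afterwards
Opportunity opp = [SELECT Id FROM Opportunity LIMIT 1];
OpportunityTriggerHelper.predictProbability(opp.Id);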