
Using Salesforce Lightning Connect to integrate with OData providers



Lightning Connect is a great offering from Salesforce that allows us to integrate our Salesforce org with an external data source that exposes OData REST APIs. The beauty of this integration is that it can be set up with just point-and-click steps in Salesforce.

The main requirement is that the external data source should expose OData services. I will explain what OData is below, and then we can talk about the advantages of using this form of integration in Salesforce.

OData
According to odata.org, OData is the best way to do REST API integration (there may be others who disagree, but we won't go there :) ). As you already know, REST APIs expose data from a source, typically in JSON format, and are accessed via unique URI endpoints. OData enables the creation and consumption of REST APIs that are identified using URLs and defined in a data model, using simple HTTP messages.

It is also sometimes called "ODBC for the web". Following this protocol, it is possible to set up services that expose REST API URLs serving data out of a database. I will write about how I did this for a PostgreSQL database table in a future post.
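The "ODBC for the web" comparison makes sense once you see OData's system query options, which let you filter, sort, and page over plain HTTP much like a SQL query. Here is a minimal Python sketch of my own (not part of the Salesforce setup); the entity set URL is the one used later in this post and may no longer be live, and the Name field is an assumption about that service's schema.

```python
import requests

ENTITY_SET = "https://whispering-woodland-8378.herokuapp.com/odata/account.svc/Account"

# Roughly: SELECT * FROM Account WHERE Name LIKE 'A%' ORDER BY Name LIMIT 5
response = requests.get(
    ENTITY_SET,
    params={
        "$filter": "startswith(Name, 'A') eq true",  # WHERE clause
        "$orderby": "Name",                          # ORDER BY
        "$top": "5",                                 # LIMIT
    },
)
print(response.status_code)
print(response.text[:500])  # the default response is an Atom XML feed
```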

Now, an OData service exposes a metadata URL endpoint that describes the structure/schema of the data it exposes. As an example, see the metadata endpoint below and its XML output for a service I created on Heroku -


If you examine the XML carefully, it mentions a couple of things. At the very top, it says m:DataServiceVersion="2.0". This indicates that the protocol version here is OData 2.0. Salesforce supports both the OData 2.0 and OData 4.0 standards.
Next, you will see that it exposes a schema with two entities - Account and Servicehistory. These are the two tables whose data is exposed via the service.
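If you want to poke at a metadata document yourself, outside of Salesforce, here is a minimal Python sketch (my own addition). It assumes the service follows the standard OData convention of serving its metadata at $metadata under the service root; the Heroku service shown in this post may no longer be online.

```python
import requests
import xml.etree.ElementTree as ET

SERVICE_ROOT = "https://whispering-woodland-8378.herokuapp.com/odata/account.svc"

response = requests.get(f"{SERVICE_ROOT}/$metadata")
response.raise_for_status()
root = ET.fromstring(response.content)

# Match on local tag names so we don't have to hard-code the exact
# EDM namespace version the service happens to use.
for element in root.iter():
    tag = element.tag.split("}")[-1]
    if tag == "DataServices":
        # m:DataServiceVersion tells you which protocol version the service speaks
        for attribute, value in element.attrib.items():
            if attribute.endswith("DataServiceVersion"):
                print("Protocol version:", value)
    elif tag == "EntityType":
        # Each EntityType is a table that can become an External Object in Salesforce
        print("Entity:", element.get("Name"))
```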

Though the metadata URL returns data in XML format, the actual endpoint for accessing the data can return it in different formats, such as JSON.
For example: https://whispering-woodland-8378.herokuapp.com/odata/account.svc/Account
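As a quick illustration (again my own sketch, assuming the service above is still reachable), you can ask an OData 2.0 endpoint for JSON either with an Accept: application/json header or with the $format=json query option. Note that V2 services wrap the payload in a top-level "d" object.

```python
import requests

ENTITY_SET = "https://whispering-woodland-8378.herokuapp.com/odata/account.svc/Account"

response = requests.get(ENTITY_SET, params={"$format": "json"})
response.raise_for_status()

payload = response.json()
# OData 2.0 JSON responses come back as {"d": {"results": [...]}} or {"d": [...]}
# depending on the server implementation, so handle both shapes.
body = payload.get("d", payload)
records = body.get("results", body) if isinstance(body, dict) else body
for record in records:
    print(record)
```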

Now why is it so special in Salesforce?
  1. We can set up an integration with the OData service using just point-and-click steps.
  2. The entities I mentioned above while discussing the metadata service get exposed in Salesforce as External Objects. They are just like custom objects: we can configure Salesforce features like object CRUD settings and page layouts, set up relationships with other objects, enable Chatter feeds, etc. on an external object. Remember, we are setting all of this up on externally integrated data. That is powerful!
  3. External objects can be searched from the global search bar as well.
Consider building all of that for an external integration source using Apex or Visualforce!

Here is a screenshot of how my OData service sends data back -



Now let's figure out how to set up the external data source and an external object in Salesforce.

Now, an important note - I did not configure any authentication for my OData service, so this post will not cover authentication setup.

Steps to set up an external data source -
  • In your Salesforce org, go to Setup -> External Data Sources. Click the New External Data Source button.
  • Set a name for the external data source. Since our data service is 2.0, we will choose the Type as Lightning Connect: OData 2.0. Set the URL to the endpoint of the OData service.
  • If you want to allow editing of the data in the external system, check "Allow Create, Edit and Delete". I am going to leave it unchecked as I want the integration to be read-only.
  • I am also setting the Identity Type and Authentication Protocol as No Authentication, since my OData service is set up without authentication for now (something I need to work on!). This is how my setup looks -

  • Now click the Validate and Sync button. This will validate the connection to the service. It will also read the metadata endpoint and show the possible entities (external objects) it can create.
  • On selecting the tables and proceeding, it will automatically create the external object, its fields based on the columns exposed by the service, its page layout, etc. It's practically everything you need for exposing this data to your users :). Here is a screenshot of my external object that got created -
You will notice that the object's API name ends with __x instead of __c.
  • Now you can create a Tab and link it to the external object you just created.
  • Expose this tab to required profiles. 
That's it! Now test your integration by accessing the tab you just created. If you set everything up correctly, the data should show up just as it is in the external data source -
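If you also want to verify the integration programmatically, external objects can be queried through the Salesforce REST API just like custom objects. Here is a rough sketch of my own; the instance URL, access token, API version, and the Account__x object and field names are all assumptions for illustration.

```python
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # assumption: your org's instance URL
ACCESS_TOKEN = "<session id or OAuth access token>"       # assumption: obtained separately

query = "SELECT Id, ExternalId FROM Account__x LIMIT 5"   # Account__x / ExternalId assumed
response = requests.get(
    f"{INSTANCE_URL}/services/data/v52.0/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": query},
)
response.raise_for_status()

for record in response.json()["records"]:
    print(record)
```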

Here is my external data source and its data - 
and here is how the data will appear in Salesforce -

Cool, isn't it? I hope you find this post useful.
