
DBAmp for Salesforce - Salesforce integration for SQL Server DBAs





Recently I got the opportunity to explore a tool called DBAmp for integration with Salesforce. I found it to be a very useful tool for any data manipulation requirements in Salesforce. Following are my learnings from the exercise. Hope it helps any of you who may need to work with this tool -

DBAmp is a SQL Server package that can be used to integrate with Salesforce. The software is available at http://www.forceamp.com/

Overview:

It essentially installs on top of an existing SQL Server installation and provides an OLE DB provider that can be used to connect to Salesforce.

Behind the scenes, it executes API calls against Salesforce for any SQL Server command it receives. Thus we can create a connection in SQL Server to a Salesforce org and pull data into tables in the database just as if we were querying Salesforce tables directly.
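For example, with a linked server named SALESFORCE (created during installation, as described below), both of the following return Account rows. This is a minimal sketch; the linked server name is whatever you chose when configuring DBAmp:

-- Four-part name: SQL Server parses the query and DBAmp fetches the rows via the API
SELECT Id, Name FROM SALESFORCE...Account

-- OPENQUERY: the query text is passed through to the provider and run against Salesforce
SELECT * FROM OPENQUERY(SALESFORCE, 'SELECT Id, Name FROM Account')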

Use cases for DBAmp + Salesforce:


Many use cases exist for using this tool against Salesforce. Primary among them are -      

  • DBAmp can be used to replicate data from Salesforce to an on-premises or cloud-hosted SQL Server for database backup. This could be done for a few tables or all tables, based on the use case.
  • DBAmp can be used to pull Salesforce data into an on-premises or cloud-hosted SQL Server. The data can then be transformed and calculations performed by executing SQL commands or stored procedures in the database. After the data updates, a load table can be created with the data to be loaded back to Salesforce, and the updated data can be pushed back to Salesforce.

Using DBAmp’s bulk ops procedure to pull and push data from Salesforce is recommended to keep API usage under control.
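For example, a simple backup run might replicate a handful of objects into the local database. This is a minimal sketch, assuming the linked server and the “salesforce backups” database described in the installation steps below are already in place:

USE [salesforce backups]
-- Refresh local copies of a few key objects; each call drops and recreates the table
EXEC SF_Replicate 'SALESFORCE', 'Account'
EXEC SF_Replicate 'SALESFORCE', 'Contact'
-- To back up every object in the org instead (heavier on API usage):
-- EXEC SF_ReplicateAll 'SALESFORCE'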

SQL Server and DBAmp Installation:

Prerequisites for installation:


  • Preferably install the latest version of SQL Server. If a SQL Server license is not available, use the Express edition, which is a free edition provided by Microsoft
  • DBAmp does not support versions older than SQL Server 2005
  • A 64-bit Windows server is required for DBAmp to work
  • DBAmp requires .NET Framework 4.5 or higher
  • The Windows OS version should be Windows Server 2008 or higher
  • The DBAmp software has to be purchased or obtained via a trial. The trial lasts 30 days only
  • The SQL Server service should not run under the Network Service account

Installation steps:


  • Install SQL Server and ensure it is working fine (for example: create a database, create a table, insert a few records, etc.).
  • Install SQL Server Management Studio if not already installed. This is used to create the linked server, among other things.
  • Stop the SQL Server service.
  • Install DBAmp
  • Please check the DBAmp installation guide for details of the steps after this -
    • In SQL Server Management Studio, open the Providers tree and configure the DBAmp provider properties.
    • Connect DBAmp to SQL Server by creating a linked server. While doing this, choose the location URL based on whether you are using a Production/Developer or a Sandbox Salesforce org. This step needs a Salesforce login (with password and security token); the user being used here should be able to execute API calls to pull data from Salesforce. A scripted version of this step is sketched after this list.
    • Once the connection is established and tested, create a database to hold the Salesforce data. Call it “salesforce backups”.
    • In “salesforce backups”, run the script “Create DBAmp SPROCS.sql”, found in the DBAmp installation folder, to create the stored procedures used to pull and push data to Salesforce.
    • There is an application called DBAmp Configuration that is installed as part of the DBAmp installation. Run it and set up the Work Directory, proxy details, etc.
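For reference, the linked-server step above can also be scripted in T-SQL along these lines. This is a hedged sketch: the provider ProgID ('DBAmp.DBAmp') and the exact parameter mapping should be verified against the DBAmp installation guide, and the login shown is hypothetical.

-- Create the linked server using the DBAmp OLE DB provider
EXEC sp_addlinkedserver
    @server = 'SALESFORCE',
    @srvproduct = 'DBAmp',
    @provider = 'DBAmp.DBAmp',                   -- assumed ProgID; verify in the guide
    @location = 'https://login.salesforce.com'   -- use https://test.salesforce.com for a Sandbox

-- Map logins to a Salesforce user (append the security token to the password)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'SALESFORCE',
    @useself = 'false',
    @rmtuser = 'integration.user@example.com',   -- hypothetical Salesforce login
    @rmtpassword = 'passwordANDsecuritytoken'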

Using the Salesforce connection:

Once the connection is established, queries can be executed against Salesforce objects as if they were simple SQL Server tables.
For example:
SELECT Id, Name FROM SALESFORCE...Contact WHERE LastName = 'Ramachandran'

To pull data from a Salesforce object into a local table in the “salesforce backups” database, execute the stored procedure SF_Replicate. When executed with an object name, it creates a table in the database with a name matching the object and columns matching the fields.
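For example (a minimal sketch using the Account object; the local table is created by the procedure itself):

-- Creates or refreshes dbo.Account with one column per Account field
EXEC SF_Replicate 'SALESFORCE', 'Account'
SELECT COUNT(*) AS AccountRows FROM dbo.Account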

To push data back to Salesforce, the best option is -

  • Create a load table with a name such as <<Object name>>_Load, for example: Opportunity_Load
  • The table's column names should match the API names of the corresponding fields in Salesforce
  • Additionally, an Error column (text type, 255 characters long) is mandatory
  • Update this load table with the data that needs to be pushed back to Salesforce
  • If you are doing an update, add an Id column so it can be used as the key for the update.
  • Use the stored procedure SF_BulkOps to push the update or insert into Salesforce. This uses the Bulk API and so is optimized for performance.

Below is an example SQL script that downloads the contact data from Salesforce, modifies the email field, and then pushes the data back to Salesforce -

Use "salesforce backups"
Exec SF_Replicate 'SALESFORCE', 'Contact'
select lastname,firstname,email from dbo.Contact
insert into dbo.Contact_Load(Id,email)(select id,'testrun1@test.com' from dbo.Contact)

Exec SF_Bulkops 'Update','SALESFORCE','Contact_Load'

After creating your own stored procedures, you can execute them for your data manipulation interfaces with Salesforce. Such jobs can also be scheduled to execute at planned intervals, for example via SQL Server Agent, as sketched below.
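Here is a minimal sketch of such a schedule using the standard SQL Server Agent procedures in msdb; the job name and schedule are illustrative:

USE msdb
EXEC sp_add_job @job_name = 'Nightly Salesforce Contact refresh'
EXEC sp_add_jobstep @job_name = 'Nightly Salesforce Contact refresh',
    @step_name = 'Replicate Contact',
    @subsystem = 'TSQL',
    @database_name = 'salesforce backups',
    @command = 'EXEC SF_Replicate ''SALESFORCE'', ''Contact'''
-- Run daily at 2:00 AM
EXEC sp_add_jobschedule @job_name = 'Nightly Salesforce Contact refresh',
    @name = 'Daily 2am', @freq_type = 4, @freq_interval = 1,
    @active_start_time = 020000
EXEC sp_add_jobserver @job_name = 'Nightly Salesforce Contact refresh'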

