Salesforce

Salesforce started as a CRM and has since expanded into marketing and customer service solutions. The Salesforce integration built by Kondado for your Data Warehouse or Data Lake gives you access to every component of the platform, including custom fields and objects, in your analytical environment.

Adding the data source

To automate Salesforce Sales Cloud and Service Cloud ETL with Kondado, follow the steps below:

1) Log in to your Salesforce account, click your company logo in the upper right panel, and then click "Settings";

2) On the settings page, look for "Reset My Security Token" and, on that page, click "Reset Security Token";

3) You will receive the new token by email. Copy it, as it will be used later to add the data source on the Kondado platform;

4) On the Kondado platform, go to the add new sources page and select the Salesforce data source;

5) Name your data source, fill in the email and password of the user who generated the token and, in "Security Token", paste the token from step (3);

Now you can save the data source and start replicating your Salesforce Service Cloud or Sales Cloud objects into your Data Warehouse or Data Lake!
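As background on the credentials in step (5): when a client logs in to the Salesforce API with a username and password, the security token is appended directly to the password. Kondado handles this internally; the sketch below (with a hypothetical helper name, not part of Kondado or Salesforce) just illustrates how the values combine.

```python
# Sketch: Salesforce username/password API logins expect the security
# token concatenated after the password. `api_credentials` is a
# hypothetical helper, shown only for illustration.
def api_credentials(username: str, password: str, security_token: str) -> dict:
    """Build the credential payload used by username/password API logins."""
    return {
        "username": username,
        # The security token goes directly after the password.
        "password": password + security_token,
    }

creds = api_credentials("analyst@example.com", "s3cret", "AbC123")
print(creds["password"])  # s3cretAbC123
```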

Pipelines

Relationship chart

Object

1. With our integration, you can replicate standard and custom objects

1.1 Only objects that exist in your organization and that the user who created the data source has access to will be available

1.2 Returned records can be filtered according to the access of the user who created the data source

2. You can create an incremental integration using, for example, the LastModifiedDate column; note, however, that deleted records will not be synchronized
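The incremental pattern in (2) can be sketched as a SOQL query that only fetches records changed since the last sync. This is a minimal illustration under assumptions (the function name and field list are examples, not Kondado's implementation), and, as noted above, it will not pick up deleted records.

```python
from datetime import datetime, timezone

def incremental_soql(sobject: str, fields: list[str], last_sync: datetime) -> str:
    """Build a SOQL query fetching records modified since the last sync.

    Deleted records are never returned by this query, matching the
    limitation noted above.
    """
    # SOQL expects ISO 8601 dateTime literals in UTC, unquoted.
    cursor = last_sync.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (
        f"SELECT {', '.join(fields)} FROM {sobject} "
        f"WHERE LastModifiedDate > {cursor} "
        f"ORDER BY LastModifiedDate"
    )

query = incremental_soql("Account", ["Id", "Name"], datetime(2024, 1, 1, tzinfo=timezone.utc))
# SELECT Id, Name FROM Account WHERE LastModifiedDate > 2024-01-01T00:00:00Z ORDER BY LastModifiedDate
```

Persisting the greatest LastModifiedDate seen in each run gives you the cursor for the next one.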

The created table will have a format similar to the one below:

Field    Type
id       text
col_x    text
col_y    text
col_z    text
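Since every destination column is text, an ingestion step effectively casts each field to a string. A minimal sketch of that cast, assuming records arrive as Python dicts from the Salesforce REST API (which attaches an "attributes" metadata key to each record); the helper name is an example, not Kondado's code:

```python
def to_text_row(record: dict) -> dict:
    """Cast every field of a Salesforce record to text, mirroring the
    all-text destination table sketched above."""
    return {
        field: "" if value is None else str(value)
        for field, value in record.items()
        if field != "attributes"  # API metadata, not a data column
    }

row = to_text_row({"Id": "001xx", "col_x": 42, "col_y": None, "attributes": {"type": "Account"}})
# {'Id': '001xx', 'col_x': '42', 'col_y': ''}
```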


Frequently asked questions

What Salesforce components can I access through Kondado?
The Kondado integration gives you access to all components of the Salesforce platform, including custom fields and custom objects, so your analytical cloud contains complete data from Sales Cloud and Service Cloud.

Do I need a security token to connect Salesforce to Kondado?
Yes. You must generate a security token in your Salesforce settings by clicking "Reset My Security Token". The token is emailed to you and required when adding the data source on the Kondado platform.

Which Salesforce objects are available for replication?
Standard and custom objects are available, but only those the data source creator has access to. Returned records are also filtered according to that user's access permissions.

Can I set up incremental sync for Salesforce data?
Yes, you can create an incremental integration using columns like LastModifiedDate. Note, however, that deleted records will not be synchronized.

What format do the replicated Salesforce tables have in the destination?
Tables follow a simple text-based format with an id field and additional columns (col_x, col_y, col_z, etc.), all stored as text and ready for transformation in your pipeline.

Published 2023-06-16 · Updated 2026-04-25