FTP

FTP is a protocol for transferring files over the internet. Kondado's FTP data pipeline for the Data Warehouse gives you access to your files' data in your analytical cloud.

Adding the data source

To automate FTP ETL with Kondado for your database or spreadsheet, follow the steps below:

1) Have your FTP service address, port, username and password handy

2) Allow the Kondado IPs on your FTP server

3) On the Kondado platform, go to the add data sources page and select the FTP data source

4) Name your data source and enter the information from step (1)

When filling in the "Address" parameter, use only the host, without including "ftp://" or a trailing "/"

Now just save the connector and start integrating your FTP data into the Data Lake or Data Warehouse.
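Before saving the connector, it can help to confirm that the details from step (1) actually work. The sketch below uses Python's standard ftplib; the host, port, and credentials are placeholders, not real values.

```python
from ftplib import FTP

# Placeholder connection details -- replace with the values from step (1).
HOST = "ftp.example.com"   # host only: no "ftp://" prefix, no trailing "/"
PORT = 21
USER = "your-username"
PASSWORD = "your-password"

def check_ftp_connection(host, port, user, password):
    """Connect, log in, and list the root directory as a connectivity check."""
    ftp = FTP()
    ftp.connect(host, port, timeout=30)
    ftp.login(user, password)
    entries = ftp.nlst()  # names of files/directories at the root
    ftp.quit()
    return entries
```

If this call succeeds from a machine on an allowed IP, the same credentials should work on the Kondado platform.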

Pipelines

Relationship Chart

CSV

You can specify an exact file name, or just the beginning of a file name (a prefix), and we will integrate every file that matches it.

After each run, the pipeline saves the latest modification date among the files it read and, on the next run, only looks for files with a later modification date.
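The incremental logic described above can be sketched with Python's standard ftplib, using the FTP MDTM command to read each file's modification time. Here `prefix` and `last_run` are hypothetical names for the configured file-name prefix and the stored watermark; this is an illustration, not Kondado's implementation.

```python
from datetime import datetime
from ftplib import FTP

def parse_mdtm(resp: str) -> datetime:
    """Parse an FTP MDTM reply such as '213 20230719120000'."""
    return datetime.strptime(resp.split()[-1][:14], "%Y%m%d%H%M%S")

def newer_files(ftp: FTP, prefix: str, last_run: datetime):
    """Return (name, mtime) pairs for files matching `prefix` that were
    modified after `last_run`; max(mtime) becomes the next watermark."""
    selected = []
    for name in ftp.nlst():
        if not name.startswith(prefix):
            continue
        mtime = parse_mdtm(ftp.sendcmd(f"MDTM {name}"))
        if mtime > last_run:
            selected.append((name, mtime))
    return selected
```

Persisting the maximum modification time seen and passing it back as `last_run` on the next run reproduces the incremental behavior described above.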

To accommodate files with different columns, the data is pivoted at the destination and follows this pattern:

Field               Type
row_number          int
column_number       int
first_column_value  text
value               text
__file_basename     text
__file_path         text
__file_name         text
__kdd_insert_time   timestamp
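A minimal sketch of the pivot described above: each CSV cell becomes one record with the fields in the table. How Kondado derives __file_basename versus __file_name is an assumption here (basename with and without the extension); treat this as an illustration of the shape, not the platform's code.

```python
import csv
import io
from datetime import datetime, timezone

def pivot_csv(text: str, file_path: str):
    """Unpivot CSV text into one record per cell, following the
    row_number / column_number / first_column_value / value pattern."""
    basename = file_path.rsplit("/", 1)[-1]          # e.g. "sales.csv"
    name = basename.rsplit(".", 1)[0]                # assumed: "sales"
    insert_time = datetime.now(timezone.utc).isoformat()
    records = []
    for row_number, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        first = row[0] if row else ""
        for column_number, value in enumerate(row, start=1):
            records.append({
                "row_number": row_number,
                "column_number": column_number,
                "first_column_value": first,
                "value": value,
                "__file_basename": basename,
                "__file_path": file_path,
                "__file_name": name,
                "__kdd_insert_time": insert_time,
            })
    return records
```

Because every cell carries its row and column position, files with differing headers can land in the same destination table and be re-assembled downstream.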


Frequently asked questions

What information do I need before adding the FTP data source?
You need your FTP service address, port, username, and password. Make sure to have these credentials ready, and remember to allow Kondado IPs on your FTP server for secure access.
How should I format the FTP address when configuring the connector?
Use only the host in the 'Address' field — do not include 'ftp://' or a trailing '/'. For example, enter something like 'ftp.example.com' rather than a full URL.
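The host-only rule can be enforced with a small normalization helper before pasting the value into the connector form. This is an illustrative sketch, not part of the Kondado platform.

```python
def normalize_host(address: str) -> str:
    """Strip an 'ftp://' scheme and trailing slashes so only the host remains."""
    host = address.strip()
    if host.lower().startswith("ftp://"):
        host = host[len("ftp://"):]
    return host.rstrip("/")
```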
What happens when Kondado reads multiple CSV files from my FTP server?
You can specify a file name or prefix to match multiple files. The pipeline tracks the latest modification date and only processes newer files on subsequent runs, with data pivoted into a standardized schema including fields like row_number, column_number, value, and file metadata.
Can I send FTP data directly to a BI tool or spreadsheet?
Yes — after landing the data in your Data Warehouse or Data Lake, you can connect it to dashboards and BI tools, or use spreadsheet destinations for analysis.
Where can I see examples of FTP pipelines with specific tools?
You can explore ready-made visualization combinations such as Airtable with FTP, Alteryx with FTP, or Amazon QuickSight with FTP in the Kondado dataviz gallery.

Written by · Published 2023-07-19 · Updated 2026-04-25