Logs

Log objects store data about the execution of Pipelines and models. With this endpoint, you can check when your Pipelines and models were last executed and then trigger other processes in your analytics stack.

For example, if you use Kondado to create a feature table that feeds a machine learning model, you can check whether new data has been received that allows a new training cycle. Another example is when Kondado is used to populate a staging database that feeds a production database: you can query Pipeline logs to verify whether new data is available for consumption.

This endpoint uses a scrolling (cursor-style) pagination model: data is returned in batches of 50 records in ascending order. You must provide a parameter indicating the ID from which to start reading.

Of course, you can also obtain log data directly using our free logs source, which consumes data from this API.

List Pipeline Logs

Description

Lists the logs of Pipelines that the user has access to. Only the last 90 days of data are available.

Method

GET

Endpoint

/pipeline_logs

URL Parameters

client_id

  • Description: The account ID whose logs should be retrieved. If not provided, logs from all accounts the token has access to are returned.
  • Required: no
  • Format: integer number

start_element_id

  • Description: The ID from which to start reading logs; the response contains the records that follow this ID.
  • Required: yes
  • Format: integer number

Example Response

{
    "success": true,
    "data": {
        "success": true,
        "data": [
            {
                "id": 5181910,
                "client_id": 1,
                "pipeline_id": 702,
                "execution_time_seconds": "0.486988",
                "has_deltas": false,
                "pipeline_name": "Kondado - Blog - wp_options",
                "row_count": "256",
                "createdAt": "2020-09-17T00:03:48.108Z",
                "updatedAt": "2020-09-17T00:03:48.108Z",
                "initial_savepoint": null,
                "new_savepoint": null,
                "external_ip": "3.220.239.218",
                "is_billable": true,
                "raw_row_count": "123",
                "mb_estimate": "0.00231"
            }
        ],
        "pagination": {
            "next_start_element_id": 5181910
        }
    }
}

Response Description

The data object contains an array of up to 50 log records following the provided start_element_id.

The pagination object returns the next_start_element_id parameter, which can be used in a subsequent request to retrieve the next page of results. If it is null, there are no more log records after the provided start_element_id (end of pagination).
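The scrolling loop described above can be sketched as a small generator. It is agnostic about how the HTTP call is made: fetch_page is any callable (your own, hypothetical) that takes a start_element_id and returns the parsed JSON body in the shape shown in the example response.

```python
from typing import Callable, Iterator

def iter_log_pages(fetch_page: Callable[[int], dict],
                   start_element_id: int = 0) -> Iterator[dict]:
    """Yield every log record, following next_start_element_id until it is null.

    fetch_page(start_element_id) must return the parsed response body:
    an outer "data" object holding a "data" list and a "pagination" object.
    """
    next_id = start_element_id
    while next_id is not None:
        body = fetch_page(next_id)["data"]
        yield from body["data"]
        # null (None) signals the end of pagination.
        next_id = body["pagination"]["next_start_element_id"]
```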

The JSON in the data object contains the following fields:

id

Log record ID

createdAt

Log creation timestamp (UTC) – indicates when the Pipeline finished executing.

updatedAt

Log update timestamp (UTC). Usually the same as createdAt, but may differ if the log is modified after creation.

pipeline_id

ID of the Pipeline to which the log belongs.

pipeline_name

Name of the Pipeline at the time the log was created.

client_id

ID of the client to which the Pipeline belongs.

has_deltas

Boolean indicating whether the Pipeline stored deltas when the log was created.

execution_time_seconds

Time in seconds the Pipeline took to execute for this log entry – not including the initial setup time (which involves tasks such as reading metadata about the Pipeline, source, and destination).

row_count

Number of records processed by the Pipeline.

is_billable

Boolean indicating whether the rows counted in row_count were billed as part of your plan or were free.

initial_savepoint

If the Pipeline is incremental, indicates the savepoint from which data was read. NULL for full Pipelines.

new_savepoint

If the Pipeline is incremental, indicates the new savepoint generated after execution. NULL for full Pipelines.

external_ip

IP address of the Kondado server that executed the Pipeline.

raw_row_count

Number of rows processed. This field is populated starting on June 12, 2025; for earlier dates, row_count is equivalent.

mb_estimate

Estimated size, in MB, of the data inserted by the execution.
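Note that in the example response, row_count, raw_row_count, and mb_estimate are returned as strings, so they need conversion before arithmetic. The aggregation helper below is a hypothetical sketch of how billable usage could be totalled from a batch of Pipeline log records.

```python
def summarize_pipeline_logs(records: list[dict]) -> dict:
    """Total billable rows and estimated MB across a batch of Pipeline logs.

    row_count and mb_estimate arrive as strings in the example response,
    so they are converted to int/float here before summing.
    """
    billable_rows = sum(int(r["row_count"]) for r in records if r["is_billable"])
    total_mb = sum(float(r["mb_estimate"]) for r in records)
    return {"billable_rows": billable_rows, "total_mb": round(total_mb, 5)}
```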

List Model Logs

Description

Lists the logs of models that the user has access to.

Method

GET

Endpoint

/model_logs

URL Parameters

client_id

  • Description: The account ID whose logs should be retrieved. If not provided, logs from all accounts the token has access to are returned.
  • Required: no
  • Format: integer number

start_element_id

  • Description: The ID from which to start reading logs; the response contains the records that follow this ID.
  • Required: yes
  • Format: integer number

Example Response

{
    "success": true,
    "data": {
        "success": true,
        "data": [
            {
                "id": 5219430,
                "client_id": 1,
                "model_id": 162,
                "execution_time_seconds": 0.201923,
                "model_name": "Kondado - Leads",
                "row_count": 4915,
                "createdAt": "2020-09-17T00:05:00.892Z",
                "updatedAt": "2020-09-17T00:05:00.892Z"
            }
        ],
        "pagination": {
            "next_start_element_id": 5219430
        }
    }
}

Response Description

The data object contains an array of up to 50 log records following the provided start_element_id.

The pagination object returns the next_start_element_id parameter, which can be used in a subsequent request to retrieve the next page of results. If it is null, there are no more log records after the provided start_element_id (end of pagination).

The JSON in the data object contains the following fields:

id

Log record ID

createdAt

Log creation timestamp (UTC) – indicates when the model finished executing.

updatedAt

Log update timestamp (UTC). Usually the same as createdAt, but may differ if the log is modified after creation.

model_id

ID of the model to which the log belongs.

model_name

Name of the model at the time the log was created.

client_id

ID of the client to which the model belongs.

execution_time_seconds

Time in seconds the model took to execute for this log entry – not including the initial setup time (which involves tasks such as reading metadata about the model, source, and destination).

row_count

Number of rows present in the table generated by the model.

Retrieve and paginate execution logs via API

Use the Kondado API to fetch Pipeline and model execution logs with scrolling pagination, then integrate log data into your analytics workflows.

1
Authenticate and prepare your API request

Ensure your API token has access to the accounts whose logs you need. Decide whether to filter by client_id or retrieve logs across all accessible accounts.

2
Request the first batch of Pipeline logs

Send a GET request to /pipeline_logs with start_element_id set to 0 (or another starting ID). The endpoint returns 50 records in ascending order, including fields like execution_time_seconds, row_count, and has_deltas.

3
Paginate through results using next_start_element_id

Extract next_start_element_id from the pagination object. If it is not null, use it as the new start_element_id in your next request. Repeat until next_start_element_id is null, indicating the end of available logs.

4
Repeat the process for model logs

Apply the same scrolling pattern to the /model_logs endpoint. Model logs include model_id, model_name, and row_count for tables generated by data transformations.

5
Automate downstream processes based on log data

Use log timestamps and has_deltas flags to trigger events in your stack—for example, initiating a machine-learning training cycle when new data arrives, or confirming that a staging database is ready for production consumption.
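The downstream-trigger idea in step 5 can be sketched as follows: given a batch of Pipeline log records, collect the IDs of Pipelines that stored deltas after a given checkpoint, so your orchestrator can kick off training or promotion jobs only when fresh data actually arrived. The function name is an assumption for illustration.

```python
from datetime import datetime, timezone

def pipelines_with_new_data(records: list[dict], since: datetime) -> set[int]:
    """Return IDs of Pipelines whose logs show deltas stored after `since` (UTC)."""
    fresh = set()
    for r in records:
        # createdAt marks when the Pipeline finished executing (UTC, ISO 8601).
        finished = datetime.fromisoformat(r["createdAt"].replace("Z", "+00:00"))
        if r["has_deltas"] and finished > since:
            fresh.add(r["pipeline_id"])
    return fresh
```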

Frequently asked questions

What is the maximum number of log records returned per API request?
Each request returns a batch of 50 log records. This applies to both the /pipeline_logs and /model_logs endpoints.
How do I know when I have reached the end of the log pagination?
Check the pagination.next_start_element_id field in the response. If it is null, there are no more records after the current start_element_id and you have reached the end of the available logs.
What does the has_deltas field indicate in Pipeline logs?
The has_deltas boolean indicates whether the Pipeline stored deltas (incremental changes) when the log was created. This is useful for determining if new or changed data was processed during that execution.
How far back can I retrieve Pipeline logs via the API?
Only the last 90 days of Pipeline log data are available through the API. The documentation does not state a retention limit for model logs.
Can I consume log data without writing API calls myself?
Yes. Kondado offers a free logs source that consumes data from this API directly, allowing you to access log information without building custom integrations.
What is the difference between row_count and raw_row_count in Pipeline logs?
row_count is the number of records processed by the Pipeline, while raw_row_count reports the raw number of rows processed. raw_row_count is only populated starting June 12, 2025; for earlier dates, row_count is equivalent. Note that both fields are returned as strings in the response.

Written by·Published 2026-01-12·Updated 2026-04-25