Inputs
Input types in the Management API
IdOrUniqueName
The ID or unique name input.
If both ID and unique name are provided, the ID will take precedence.
The unique identifier of the object.
The unique name of the object.
CreateEnvironmentInput
The fields for creating an Environment.
The Environment’s unique name.
The Environment’s description.
ModifyEnvironmentInput
The fields for modifying an Environment.
The Environment’s unique name.
The Environment’s description.
CreateApplicationInput
The fields for creating an Application.
The Application’s unique name. If not specified, Propel will set the ID as the unique name.
The Application’s description.
The Application’s Propeller. If no Propeller is provided, Propel will set the Propeller to P1_X_SMALL.
The Application’s API authorization scopes. If specified, at least one scope must be provided; otherwise, all scopes will be granted to the Application by default.
ModifyApplicationInput
The fields for modifying an Application.
The ID or unique name of the Application to modify.
The Application’s new unique name.
The Application’s new description.
The Application’s new Propeller.
The Application’s new API authorization scopes.
CreateSnowflakeDataSourceInput
The fields for creating a Snowflake Data Source.
The Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
The Data Source’s description.
The Data Source’s connection settings.
ModifySnowflakeDataSourceInput
The fields for modifying a Snowflake Data Source.
The ID or unique name of the Data Source to modify.
The Data Source’s new unique name.
The Data Source’s new description.
The Data Source’s new connection settings.
SnowflakeConnectionSettingsInput
The fields for creating a Snowflake Data Source’s connection settings.
The Snowflake account. Only include the part before the “snowflakecomputing.com” part of your Snowflake URL (make sure you are in classic console, not Snowsight). For AWS-based accounts, this looks like “znXXXXX.us-east-2.aws”. For Google Cloud-based accounts, this looks like “ffXXXXX.us-central1.gcp”.
The Snowflake database name.
The Snowflake warehouse name. It should be “PROPELLING” if you used the default name in the setup script.
The Snowflake schema.
The Snowflake username. It should be “PROPEL” if you used the default name in the setup script.
The Snowflake password.
The Snowflake role. It should be “PROPELLER” if you used the default name in the setup script.
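For orientation, a full set of connection settings using the default names from the setup script might look like the sketch below. The field names (account, database, warehouse, schema, username, password, role) are assumptions inferred from the descriptions above, and the database, schema, and password values are placeholders.

```graphql
# Hypothetical SnowflakeConnectionSettingsInput value. Field names are inferred
# from the descriptions above; database, schema, and password are placeholders.
{
  account: "znXXXXX.us-east-2.aws"
  database: "MY_DATABASE"
  warehouse: "PROPELLING"
  schema: "PUBLIC"
  username: "PROPEL"
  password: "<your-snowflake-password>"
  role: "PROPELLER"
}
```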
PartialSnowflakeConnectionSettingsInput
The fields for modifying a Snowflake Data Source’s connection settings.
The Snowflake account. Only include the part before the “snowflakecomputing.com” part of your Snowflake URL (make sure you are in classic console, not Snowsight). For AWS-based accounts, this looks like “znXXXXX.us-east-2.aws”. For Google Cloud-based accounts, this looks like “ffXXXXX.us-central1.gcp”. If not provided this property will not be modified.
The Snowflake database name. If not provided this property will not be modified.
The Snowflake warehouse name. It should be “PROPELLING” if you used the default name in the setup script. If not provided this property will not be modified.
The Snowflake schema. If not provided this property will not be modified.
The Snowflake username. It should be “PROPEL” if you used the default name in the setup script. If not provided this property will not be modified.
The Snowflake password. If not provided this property will not be modified.
The Snowflake role. It should be “PROPELLER” if you used the default name in the setup script. If not provided this property will not be modified.
HttpBasicAuthInput
The fields for specifying an HTTP Data Source’s Basic authentication settings.
The username for HTTP Basic authentication that must be included in the Authorization header when uploading new data.
The password for HTTP Basic authentication that must be included in the Authorization header when uploading new data.
HttpDataSourceTableInput
The fields for specifying an HTTP Data Source’s table.
The name of the table
All the columns present in the table
HttpDataSourceColumnInput
The fields for specifying a column in an HTTP Data Source’s table.
The column name. It has to be unique within a Table.
The column type.
The ClickHouse type to use when type is set to CLICKHOUSE.
Whether the column’s type is nullable or not.
S3DataSourceTableInput
The fields for specifying an S3 Data Source’s table.
The name of the table
The path to the table’s files in S3.
All the columns present in the table
S3DataSourceColumnInput
The fields for specifying a column in an S3 Data Source’s table.
The column name. It has to be unique within a Table.
The column type.
Whether the column’s type is nullable or not.
WebhookDataSourceColumnInput
The fields for specifying a column in a Webhook Data Source’s table.
The column name. It has to be unique within a Table.
The JSON property that the column will be derived from. For example, if you POST a JSON event like this:
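```json
{
  "greeting": {
    "message": "hello, world"
  }
}
```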
Then you can use the JSON property “greeting.message” to extract “hello, world” to a column.
The column type.
Whether the column’s type is nullable or not.
TableSettingsInput
A Data Pool’s table settings.
These describe how the Data Pool’s table is created in ClickHouse.
The ClickHouse table engine for the Data Pool’s table.
This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The PARTITION BY clause for the Data Pool’s table.
This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The PRIMARY KEY clause for the Data Pool’s table.
This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The ORDER BY clause for the Data Pool’s table.
This field is optional. A default will be chosen based on the Data Pool’s timestamp and uniqueId values, if specified.
The TTL clause for the Data Pool’s table.
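As a rough sketch, custom table settings might be expressed as below. The field names (engine, partitionBy, primaryKey, orderBy, ttl) and the list-of-strings shape are assumptions inferred from the clause descriptions above; only the ver parameter and the engine type values are documented here.

```graphql
# Hypothetical TableSettingsInput value. Field names and shapes are assumptions
# inferred from the clause descriptions above.
{
  engine: { replacingMergeTree: { type: REPLACING_MERGE_TREE, ver: "updated_at" } }
  partitionBy: ["toYYYYMM(timestamp)"]
  orderBy: ["customer_id", "timestamp"]
  ttl: "timestamp + INTERVAL 90 DAY"
}
```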
TableEngineInput
A Data Pool’s table engine.
Field for specifying the MergeTree table engine.
Field for specifying the ReplacingMergeTree table engine.
Field for specifying the SummingMergeTree table engine.
Field for specifying the AggregatingMergeTree table engine.
Field for specifying the PostgreSQL table engine.
MergeTreeTableEngineInput
Parameters for the MergeTree table engine.
The type is always MERGE_TREE.
ReplacingMergeTreeTableEngineInput
Parameters for the ReplacingMergeTree table engine.
The type is always REPLACING_MERGE_TREE.
The ver parameter to the ReplacingMergeTree engine.
SummingMergeTreeTableEngineInput
Parameters for the SummingMergeTree table engine.
The type is always SUMMING_MERGE_TREE.
The columns argument for the SummingMergeTree table engine.
AggregatingMergeTreeTableEngineInput
Parameters for the AggregatingMergeTree table engine.
The type is always AGGREGATING_MERGE_TREE.
PostgreSqlTableEngineInput
Parameters for the PostgreSQL table engine.
The type is always POSTGRESQL.
BackfillOptionsInput
Whether historical data should be backfilled or not
CreateMaterializedViewInput
The fields for creating a Materialized View.
The Materialized View’s unique name. If not specified, Propel will set the ID as the unique name.
The Materialized View’s description.
The SQL that the Materialized View will execute.
By default, a destination Data Pool with default settings will be created for the Materialized View; however, you can customize the destination Data Pool (or point to an existing Data Pool) by setting this field. Use this to target an existing Data Pool or to customize the engine settings of a new Data Pool.
By default, a Materialized View only applies to records added after its creation. This option allows you to backfill all the data that was present before the Materialized View was created.
ModifyMaterializedViewInput
The fields for modifying a Materialized View.
The ID of the Materialized View to modify.
The Materialized View’s new unique name.
The Materialized View’s new description.
CreateMaterializedViewDestinationInput
The fields for targeting an existing Data Pool or a new Data Pool.
If specified, the Materialized View will target an existing Data Pool. Ensure the Data Pool’s schema is compatible with your Materialized View’s SQL statement.
If specified, the Materialized View will create and target a new Data Pool. You can further customize the new Data Pool’s engine settings.
CreateMaterializedViewDestinationNewDataPoolInput
The fields for customizing a new Data Pool that a Materialized View will target.
The Data Pool’s unique name.
The Data Pool’s description.
Optionally specify the Data Pool’s primary timestamp. This will influence the Data Pool’s engine settings.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc.
DimensionInput
The fields for creating or modifying a Dimension.
The name of the column to create the Dimension from.
TimestampInput
The fields to specify the Data Pool’s primary timestamp column. Propel uses the primary timestamp to order and partition your data in Data Pools. It will serve as the time dimension for your Metrics.
The name of the column that represents the primary timestamp.
TenantInput
The fields to specify the Data Pool’s tenant ID column. The tenant ID column is used to control access to your data with access policies.
The name of the column that represents the tenant ID.
UniqueIdInput
The fields to specify the Data Pool’s unique ID column. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated within the Data Pool.
The name of the column that represents the unique ID.
DataPoolColumnInput
The name of the Data Source column that this Data Pool column derives from.
The Data Pool column’s type. This may differ from the corresponding Data Source column’s type.
The ClickHouse type to use when type is set to CLICKHOUSE.
Whether the column is nullable, meaning whether it accepts a null value.
CreateDataPoolInputV2
The fields for creating a Data Pool.
The Data Source that will be used to create the Data Pool.
The table that the Data Pool will sync from.
The table’s primary timestamp column.
Propel uses the primary timestamp to order and partition your data in Data Pools. It’s part of what makes Propel fast for larger data sets. It will also serve as the time dimension for your Metrics.
If you do not provide a primary timestamp column, you will need to supply an alternate timestamp when querying your Data Pool or its Metrics using the TimeRangeInput.
The Data Pool’s unique name. If not specified, Propel will set the ID as the unique name.
The Data Pool’s description.
The list of columns.
The Data Pool’s syncing settings.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc.
ModifyDataPoolInput
The fields for modifying a Data Pool.
The ID or unique name of the Data Pool to modify.
The Data Pool’s new unique name.
The Data Pool’s new description.
The Data Pool’s new data retention in days.
The Data Pool’s new syncing settings.
The table’s primary timestamp column.
Propel uses the primary timestamp to order and partition your data in Data Pools. It’s part of what makes Propel fast for larger data sets. It will also serve as the time dimension for your Metrics.
If you do not provide a primary timestamp column, you will need to supply an alternate timestamp when querying your Data Pool or its Metrics using the TimeRangeInput.
Enables or disables access control for the Data Pool.
If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
DataPoolSyncingInput
The fields for modifying the Data Pool syncing.
DataGridInput
The fields for querying Data Grid records.
The Data Pool to be queried.
The time range for retrieving the records.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The columns to retrieve.
The index of the column to order the table by. The index is 1-based. If not provided, records will be ordered by their timestamp by default.
The sort order of the rows. It can be ascending (ASC) or descending (DESC) order. Defaults to descending (DESC) order when not provided.
The filters to apply to the records, in the form of SQL. You may only filter on columns included in the columns array input.
The number of rows to be returned when paging forward. It can be a number between 1 and 1,000.
The cursor to use when paging forward.
The number of rows to be returned when paging backward. It can be a number between 1 and 1,000.
The cursor to use when paging backward.
MetricReportInput
The fields for querying a Metric Report.
A Metric Report is a table whose columns include dimensions and Metric values, calculated over a given time range.
The time range for calculating the Metric Report.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
One or many dimensions to group the Metric values by. Typically, dimensions in a report are what you want to compare and rank.
One or more Metrics to include in the Metric Report. These will be broken down by dimensions.
The Query Filters to apply when building the Metric Report, in the form of SQL. These can be used to filter out rows.
The index of the column to order the Metric Report by. The index is 1-based and defaults to the first Metric column. In other words, by default, reports are ordered by the first Metric; however, you can order by the second Metric, third Metric, etc., by overriding the orderByColumn input. You can also order by dimensions this way.
The number of rows to be returned when paging forward. It can be a number between 1 and 1,000.
The cursor to use when paging forward.
The number of rows to be returned when paging backward. It can be a number between 1 and 1,000.
The cursor to use when paging backward.
MetricReportDimensionInput
The fields for specifying a dimension to include in a Metric Report.
The column name of the dimension to include in a Metric Report. This must match the name of a Data Pool column.
The name to display in the headers array when displaying the report. This defaults to the column name if unspecified.
The sort order for the dimension. It can be ascending (ASC) or descending (DESC) order. Defaults to ascending (ASC) order when not provided.
MetricReportMetricInput
The fields for specifying a Metric to include in a Metric Report.
The Metric to query. You can query a pre-configured Metric by ID or name, or you can query an ad hoc Metric that you define inline.
The name to display in the headers array when displaying the report. This defaults to the Metric’s unique name if unspecified.
The Query Filters to apply when calculating the Metric, in the form of SQL.
The sort order for the Metric. It can be ascending (ASC) or descending (DESC) order. Defaults to descending (DESC) order when not provided.
CreateCountMetricInput
The fields for creating a new Count Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
CreateSumMetricInput
The fields for creating a new Sum Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
The column to be summed.
CreateAverageMetricInput
The fields for creating a new Average Metric.
The Data Pool that powers this Metric.
The Metric’s unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
The column to be averaged.
CreateMinMetricInput
The fields for creating a new Minimum (Min) Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
CreateMaxMetricInput
The fields for creating a new Maximum (Max) Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
The column to calculate the maximum from.
CreateCountDistinctMetricInput
The fields for creating a new Count Distinct Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
The Dimension over which the count distinct operation is going to be performed.
CreateCustomMetricInput
The fields for creating a new Custom Metric.
The Data Pool that powers this Metric.
The Metric’s unique name. If not specified, Propel will set the ID as the unique name.
The Metric’s description.
The Metric’s Filters, in the form of SQL. Metric Filters allow defining a Metric with a subset of records from the given Data Pool. If no Filters are present, all records will be included.
The Metric’s Dimensions. Dimensions define the columns that will be available to filter the Metric at query time.
The expression that defines the aggregation function for this Metric.
ModifyMetricInput
The fields for modifying a Metric.
The ID of the Metric to modify.
The Metric’s new unique name.
The Metric’s new description.
The Metric’s new Dimensions. Used to add or remove Dimensions.
The Metric’s new Filters, in the form of SQL. Used to add or remove Metric Filters.
Enables or disables access control for the Metric.
MigrateMetricInput
The fields for migrating a Metric’s Data Pool.
The Metric that is going to be migrated.
The Data Pool to which the Metric is going to be migrated.
DataPoolInput
The ID of the Data Pool.
The name of the Data Pool.
CustomMetricQueryInput
The Data Pool to which this Metric belongs.
Custom expression for defining the Metric.
CountMetricQueryInput
The Data Pool to which this Metric belongs.
SumMetricQueryInput
The Data Pool to which this Metric belongs.
The column to be summed.
AverageMetricQueryInput
The Data Pool to which this Metric belongs.
The column to be averaged.
MinMetricQueryInput
The Data Pool to which this Metric belongs.
The column to calculate the minimum from.
MaxMetricQueryInput
The Data Pool to which this Metric belongs.
The column to calculate the maximum from.
CountDistinctMetricQueryInput
The Data Pool to which this Metric belongs.
The column to count distinct values from.
MetricInput
The ID of a pre-configured Metric.
The name of a pre-configured Metric.
An ad hoc Custom Metric.
An ad hoc Count Metric.
An ad hoc Sum Metric.
An ad hoc Average Metric.
An ad hoc Min Metric.
An ad hoc Max Metric.
An ad hoc Count Distinct Metric.
CounterInput
The fields for querying a Metric in counter format.
A Metric’s counter query returns a single value over a given time range.
The Metric to query. You can query a pre-configured Metric by ID or name, or you can query an ad hoc Metric that you define inline.
The time range for calculating the counter.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The Query Filters to apply before retrieving the counter data, in the form of SQL. If no Query Filters are provided, all data is included.
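For illustration, a counter query might be issued as sketched below. The top-level counter field, the input field names (metric, timeRange, timeZone), the value selection, and the “revenue” Metric are assumptions, not confirmed by this reference.

```graphql
# Hypothetical counter query; the query field, input field names, and the
# "revenue" Metric are illustrative assumptions.
query {
  counter(
    input: {
      metric: { name: "revenue" }
      timeRange: { relative: LAST_N_DAYS, n: 7 }
      timeZone: "America/Los_Angeles"
    }
  ) {
    value
  }
}
```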
TimeSeriesInput
The fields for querying a Metric in time series format.
A Metric’s time series query returns the values over a given time range aggregated by a given time granularity; day, month, or year, for example.
The Metric to query. It can be a pre-created one or it can be inlined here.
The time range for calculating the time series.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The time granularity (hour, day, month, etc.) to aggregate the Metric values by.
The Query Filters to apply before retrieving the time series data, in the form of SQL. If no Query Filters are provided, all data is included.
Columns to group by.
LeaderboardInput
The fields for querying a Metric in leaderboard format.
A Metric’s leaderboard query returns an ordered table of Dimension and Metric values over a given time range.
The Metric to query. You can query a pre-configured Metric by ID or name, or you can query an ad hoc Metric that you define inline.
The time range for calculating the leaderboard.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
One or many Dimensions to group the Metric values by. Typically, Dimensions in a leaderboard are what you want to compare and rank.
The sort order of the rows. It can be ascending (ASC) or descending (DESC) order. Defaults to descending (DESC) order when not provided.
The number of rows to be returned. It can be a number between 1 and 1,000.
The Query Filters to apply before retrieving the leaderboard data, in the form of SQL. If no Query Filters are provided, all data is included.
TimeRangeInput
The fields required to specify the time range for a time series, counter, or leaderboard Metric query.
If no relative or absolute time ranges are provided, Propel defaults to an absolute time range beginning with the earliest record in the Metric’s Data Pool and ending with the latest record.
If both relative and absolute time ranges are provided, the relative time range will take precedence.
If a LAST_N relative time period is selected, an n ≥ 1 must be provided. If no n is provided or n < 1, a BAD_REQUEST error will be returned.
The timestamp field to use when querying. Defaults to the timestamp configured on the Data Pool or Metric, if any. Set this to filter on an alternative timestamp field.
The relative time period.
The number of time units for the LAST_N relative periods.
The optional start timestamp (inclusive). Defaults to the timestamp of the earliest record in the Data Pool.
The optional end timestamp (exclusive). Defaults to the timestamp of the latest record in the Data Pool.
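As a brief sketch, the two styles of time range might look like the following. The relative, start, and stop field names and the LAST_N_DAYS enum value are assumptions inferred from the descriptions above; only n is documented here.

```graphql
# Hypothetical TimeRangeInput values; field and enum names are assumptions.
# Relative: the last 30 days (LAST_N periods require n >= 1).
{ relative: LAST_N_DAYS, n: 30 }

# Absolute: an explicit start (inclusive) and end (exclusive).
{ start: "2024-01-01T00:00:00Z", stop: "2024-02-01T00:00:00Z" }
```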
FilterInput
The fields of a filter.
You can construct more complex filters using and and or; note that and takes precedence over or. A sketch of a combined filter follows the field list below.
The name of the column to filter on.
The operation to perform when comparing the column and filter values.
The value to compare the column to.
Additional filters to AND with this one. AND takes precedence over OR.
Additional filters to OR with this one. AND takes precedence over OR.
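As a sketch, a filter equivalent to (value > 0 AND value <= 100) OR status = 'confirmed' could be written roughly as below. The and and or fields are documented above; the column, operator, and value field names and the operator enum values are assumptions.

```graphql
# Hypothetical FilterInput value equivalent to
# (value > 0 AND value <= 100) OR status = 'confirmed'.
# `and` binds tighter than `or`; field and enum names other than `and`/`or`
# are assumptions.
{
  column: "value"
  operator: GREATER_THAN
  value: "0"
  and: [{ column: "value", operator: LESS_THAN_OR_EQUAL_TO, value: "100" }]
  or: [{ column: "status", operator: EQUALS, value: "confirmed" }]
}
```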
UpdateDataPoolRecordsJobSetColumnInput
The fields for specifying a column to update and its new value in an Update Data Pool Records Job.
The name of the column to update.
The value to which the column will be updated. Once evaluated, it should be of the same data type as the column.
CreateBoosterInput
The fields for creating a new Booster.
Boosters can be understood as an aggregating index. The index is formed from left to right as follows:
- The Data Pool’s Tenant ID column (if present)
- Metric Filter columns (if present)
- Query Filter Dimensions (see dimensions)
- The Data Pool’s timestamp column
The Booster’s Metric.
Dimensions to include in the Booster.
Follow these guidelines when specifying Dimensions:
- Specify Dimensions in descending order of importance for filtering and in ascending order of cardinality.
- Take into consideration hierarchical relationships as well (for example, a “country” Dimension should appear before a “state” Dimension).
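For instance, following these guidelines, a Booster on a Data Pool with country, state, and city columns might list its Dimensions as sketched below; the columnName field name is an assumption.

```graphql
# Hypothetical `dimensions` value, ordered from lowest to highest cardinality
# and respecting the country -> state -> city hierarchy; columnName is an
# assumed field name.
[
  { columnName: "country" }
  { columnName: "state" }
  { columnName: "city" }
]
```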
CreatePolicyInput
The fields for creating a Policy.
The Metric to which the Policy will be applied.
The type of Policy to create.
The Application that will be granted access to the Metric.
ModifyPolicyInput
The fields for modifying a Policy.
The Policy’s unique identifier.
The type of Policy.
DeletionRequestInput
The fields for creating a Deletion Job.
The Data Pool that is going to get its data deleted.
The filters that will be used for deleting data, in the form of SQL. Data matching these filters will be deleted.
CreateDeletionJobInput
The fields for creating a Deletion Job.
The Data Pool that is going to get its data deleted.
The filters that will be used for deleting data, in the form of SQL. Data matching these filters will be deleted.
CreateAddColumnToDataPoolJobInput
The fields for creating an Add Column Job.
The Data Pool to which the column will be added.
Name of the new column.
Type of the new column.
JSON property to which the new column corresponds.
CreateUpdateDataPoolRecordsJobInput
The fields for creating an Update Data Pool Records Job.
The Data Pool that is going to get its records updated.
The filters that will be used for updating records, in the form of SQL. Records matching these filters will be updated.
Describes how the job will update the records.
CreateDataPoolAccessPolicyInput
The Data Pool Access Policy’s unique name. If not specified, Propel will set the ID as the unique name.
The Data Pool Access Policy’s description.
The Data Pool to which the Access Policy belongs.
Columns that the Access Policy makes available for querying.
If set to ["*"], all columns will be available for querying.
Row-level filters that the Access Policy applies before executing queries, in the form of SQL.
ModifyDataPoolAccessPolicyInput
The Data Pool Access Policy’s new unique name.
The Data Pool Access Policy’s new description.
Columns that the Access Policy makes available for querying. If not provided this property will not be modified.
Row-level filters that the Access Policy applies before executing queries, in the form of SQL. If not provided this property will not be modified.
SqlV1Input
Input to the SqlV1 API.
The SQL query.
The SQL dialect to use. If not provided, the query is parsed on a best-effort basis.
DescribeSqlV1Input
Input for describing SqlV1 inputs.
The SQL query.
The SQL dialect to use. If not provided, the query is parsed on a best-effort basis.
RecordsByUniqueIdInput
The fields for querying records by unique ID.
The Data Pool to be queried. A Data Pool ID or unique name can be provided.
The columns to retrieve.
The unique IDs of the records to retrieve.
TopValuesInput
The fields for querying the top values in a given column.
The Data Pool to be queried. A Data Pool ID or unique name can be provided.
The column to fetch the unique values from.
The time range for calculating the top values.
The time zone to use. Dates and times are always returned in UTC, but setting the time zone influences relative time ranges and granularities.
You can set this to “America/Los_Angeles”, “Europe/Berlin”, or any other value in the IANA time zone database. Defaults to “UTC”.
The maximum number of values to return. It can be a number between 1 and 1,000. If the parameter is omitted, a default value of 10 is used.
ClickHouseConnectionSettingsInput
The ClickHouse Data Source connection settings.
Which database to connect to
The password for the provided user
The URL where the ClickHouse host is listening to HTTP[S] connections
The user for authenticating against the ClickHouse host
CreateClickHouseDataSourceInput
The ClickHouse Data Source’s connection settings
The ClickHouse Data Source’s description.
The ClickHouse Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
ModifyClickHouseDataSourceInput
The ClickHouse Data Source’s new connection settings. If not provided this property will not be modified.
The ClickHouse Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the ClickHouse Data Source to modify.
The ClickHouse Data Source’s new unique name. If not provided this property will not be modified.
PartialClickHouseConnectionSettingsInput
The ClickHouse Data Source connection settings.
Which database to connect to. If not provided this property will not be modified.
The password for the provided user. If not provided this property will not be modified.
The URL where the ClickHouse host is listening to HTTP[S] connections. If not provided this property will not be modified.
The user for authenticating against the ClickHouse host. If not provided this property will not be modified.
CreateHttpDataSourceInput
The HTTP Data Source’s connection settings
The HTTP Data Source’s description.
The HTTP Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
HttpConnectionSettingsInput
The HTTP Data Source connection settings.
The HTTP Basic authentication settings for uploading new data.
If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it.
The HTTP Data Source’s tables.
ModifyHttpDataSourceInput
The HTTP Data Source’s new connection settings. If not provided this property will not be modified.
The HTTP Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the HTTP Data Source to modify.
The HTTP Data Source’s new unique name. If not provided this property will not be modified.
PartialHttpConnectionSettingsInput
The HTTP Data Source connection settings.
The HTTP Basic authentication settings for uploading new data.
If this parameter is not provided, anyone with the URL to your tables will be able to upload data. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. If not provided this property will not be modified.
Set this to false to disable HTTP Basic authentication. Any previously stored HTTP Basic authentication settings will be cleared out. If not provided this property will not be modified.
The HTTP Data Source’s tables. If not provided this property will not be modified.
CreateKafkaDataSourceInput
The Kafka Data Source’s connection settings
The Kafka Data Source’s description.
The Kafka Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
KafkaConnectionSettingsInput
The Kafka Data Source connection settings.
The type of authentication to use. Can be SCRAM-SHA-256, SCRAM-SHA-512, PLAIN or NONE
The bootstrap server(s) to connect to
The password for the provided user
Whether the connection to the Kafka servers is encrypted or not
The user for authenticating against the Kafka servers
ModifyKafkaDataSourceInput
The Kafka Data Source’s new connection settings. If not provided this property will not be modified.
The Kafka Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the Kafka Data Source to modify.
The Kafka Data Source’s new unique name. If not provided this property will not be modified.
PartialKafkaConnectionSettingsInput
The Kafka Data Source connection settings.
The type of authentication to use. Can be SCRAM-SHA-256, SCRAM-SHA-512, PLAIN or NONE. If not provided this property will not be modified.
The bootstrap server(s) to connect to. If not provided this property will not be modified.
The password for the provided user. If not provided this property will not be modified.
Whether the connection to the Kafka servers is encrypted or not. If not provided this property will not be modified.
The user for authenticating against the Kafka servers. If not provided this property will not be modified.
CreatePostgreSqlDataSourceInput
The PostgreSQL Data Source’s connection settings
The PostgreSQL Data Source’s description.
The PostgreSQL Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
ModifyPostgreSqlDataSourceInput
The PostgreSQL Data Source’s new connection settings. If not provided this property will not be modified.
The PostgreSQL Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the PostgreSQL Data Source to modify.
The PostgreSQL Data Source’s new unique name. If not provided this property will not be modified.
PartialPostgreSqlConnectionSettingsInput
The PostgreSQL Data Source connection settings.
Which database to connect to. If not provided this property will not be modified.
The host where PostgreSQL is listening. If not provided this property will not be modified.
The password for the provided user. If not provided this property will not be modified.
The port where PostgreSQL is listening (usually 5432). If not provided this property will not be modified.
Which schema to use. If not provided this property will not be modified.
The user for authenticating against PostgreSQL. If not provided this property will not be modified.
PostgreSqlConnectionSettingsInput
The PostgreSQL Data Source connection settings.
Which database to connect to
The host where PostgreSQL is listening
The password for the provided user
The port where PostgreSQL is listening (usually 5432)
Which schema to use
The user for authenticating against PostgreSQL
CreateS3DataSourceInput
The S3 Data Source’s connection settings
The S3 Data Source’s description.
The S3 Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
ModifyS3DataSourceInput
The S3 Data Source’s new connection settings. If not provided this property will not be modified.
The S3 Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the S3 Data Source to modify.
The S3 Data Source’s new unique name. If not provided this property will not be modified.
PartialS3ConnectionSettingsInput
The connection settings for an S3 Data Source. These include the S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
The AWS access key ID for an IAM user with sufficient access to the S3 bucket. If not provided this property will not be modified.
The AWS secret access key for an IAM user with sufficient access to the S3 bucket. If not provided this property will not be modified.
The name of the S3 bucket. If not provided this property will not be modified.
The S3 Data Source’s tables. If not provided this property will not be modified.
S3ConnectionSettingsInput
The connection settings for an S3 Data Source. These include the S3 bucket name, the AWS access key ID, and the tables (along with their paths). We do not allow fetching the AWS secret access key after it has been set.
The AWS access key ID for an IAM user with sufficient access to the S3 bucket.
The AWS secret access key for an IAM user with sufficient access to the S3 bucket.
The name of the S3 bucket.
The S3 Data Source’s tables.
CreateWebhookDataSourceInput
The Webhook Data Source’s connection settings
The Webhook Data Source’s description.
The Webhook Data Source’s unique name. If not specified, Propel will set the ID as the unique name.
ModifyWebhookDataSourceInput
The Webhook Data Source’s new connection settings. If not provided this property will not be modified.
The Webhook Data Source’s new description. If not provided this property will not be modified.
The ID or unique name of the Webhook Data Source to modify.
The Webhook Data Source’s new unique name. If not provided this property will not be modified.
PartialWebhookConnectionSettingsInput
The Webhook Data Source connection settings.
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it. If not provided this property will not be modified.
Set this to false to disable HTTP Basic authentication. Any previously stored HTTP Basic authentication settings will be cleared out. If not provided this property will not be modified.
WebhookConnectionSettingsInput
The Webhook Data Source connection settings.
Enables or disables access control for the Data Pool. If the Data Pool has access control enabled, Applications must be assigned Data Pool Access Policies in order to query the Data Pool and its Metrics.
The HTTP basic authentication settings for the Webhook Data Source URL. If this parameter is not provided, anyone with the webhook URL will be able to send events. While it’s OK to test without HTTP Basic authentication, we recommend enabling it.
The additional columns for the Webhook Data Source table.
Override the Data Pool’s table settings. These describe how the Data Pool’s table is created in ClickHouse, and a default will be chosen based on the Data Pool’s timestamp and uniqueId values, if any. You can override these defaults in order to specify a custom table engine, custom ORDER BY, etc.
The tenant ID column, if any.
The primary timestamp column, if any.
The unique ID column, if any. Propel uses the primary timestamp and a unique ID to compose a primary key for determining whether records should be inserted, deleted, or updated.