Phoenix ETL Source
See the Connector Marketplace topic. To use this connector, ask your administrator to start a trial or subscribe to the Premium Phoenix connector.
In Gathr, Phoenix can be added as a channel to fetch customers’ and prospects’ data and transform it as needed before storing it in the desired data warehouse for further analytics.
Schema Type
See the topic Provide Schema for ETL Source → to learn how schema details can be provided for data sources.
After providing schema type details, the next step is to configure the data source.
Data Source Configuration
Configure the data source parameters as explained below.
Connection Name
Connections are the service identifiers. A connection name can be selected from the list if you have created and saved connection details for Phoenix earlier, or you can create a new connection as explained in the topic Phoenix Connection →
Use the Test Connection option to ensure that the connection with the Phoenix channel is established successfully.
A success message confirms that the connection is available. If the test connection fails, edit the connection to resolve the issue before proceeding further.
Entity
Tables in Phoenix are statically defined to model Phoenix entities.
If you selected the Fetch From Source method to design the application, the Entities will list as per the configured connection. Select the entity to be read from Phoenix.
If you selected the Upload Data File method to design the application, the exact name of the entity should be provided to read the data from Phoenix.
Fields
If you selected the Fetch From Source method to design the application, the Fields will list as per the Entity chosen in the previous configuration parameter. Select the fields or provide a custom query to read the desired records from Phoenix.
The conditions to fetch source data from a Phoenix table can be specified using this option.
Select Fields: Select the column(s) of the entity that should be read.
Custom Query: Provide an SQL query specifying the read conditions for the source data.
Example: SELECT "Id" FROM Companies
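A custom query can also include filter conditions; for instance (the column name "Name" used below is illustrative and depends on your Phoenix schema):
Example: SELECT "Id", "Name" FROM Companies WHERE "Name" LIKE 'A%'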
Query
If you selected the Upload Data File method to design the application, provide a custom query to fetch records from the Phoenix entity specified in the previous configuration.
The conditions to fetch source data from a Phoenix table can be specified using this option.
Provide an SQL query specifying the read conditions for the source data.
Example: SELECT "Id" FROM Companies
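The query can likewise restrict, order, or limit the records that are read; for instance (the column names below are illustrative):
Example: SELECT "Id", "Name" FROM Companies ORDER BY "Name" LIMIT 1000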
Read Options
This section contains additional configuration parameters.
Page Size
The number of rows that will be returned for each frame request.
Allow Prepared Statement
Prepares a query statement before execution. If this property is set to false, statements are parsed each time they are executed, which can be useful if you are executing many different queries only once.
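As an illustration of how prepared statements work in general (the parameterized form shown here is only an example, not a required syntax), a query such as SELECT "Id" FROM Companies WHERE "Name" = ? can be prepared once and then re-executed with different parameter values, which reduces parsing overhead when the same statement runs many times.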
Query Pass through
This option passes the query to the Apache Phoenix server as is.
Partitioning
This section contains partitioning-related configuration parameters.
Enable Partitioning
This enables parallel reading of the data from the entity.
Partitioning is disabled by default.
If enabled, an additional option will appear to configure the partitioning conditions.
Column
The selected column will be used to partition the data.
Max Rows per Partition: Enter the maximum number of rows to be read in a single request.
Example: 10,000
This means that at most 10,000 rows will be read in one partition.
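As an illustrative sketch only (the exact split logic is internal to Gathr), partitioning on a numeric column such as "Id" with a maximum of 10,000 rows per partition roughly corresponds to parallel range reads of the form:
Example: SELECT * FROM Companies WHERE "Id" >= 0 AND "Id" < 10000
Example: SELECT * FROM Companies WHERE "Id" >= 10000 AND "Id" < 20000
Under this assumption, a table with 45,000 rows would be read in about five parallel partitions of at most 10,000 rows each.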
Add Configuration: Additional properties can be added using this option as key-value pairs.
Detect Schema
Check the populated schema details. For more details, see Schema Preview →
Pre Action
To understand how to provide SQL queries or stored procedures that will be executed during the pipeline run, see Pre-Actions →.
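For example, a pre-action could clear a staging table before the source starts reading (the table name Staging_Companies below is hypothetical and only illustrates the idea):
Example: DELETE FROM Staging_Companies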
Notes
Optionally, enter notes in the Notes → tab and save the configuration.