Pipeline Introduction
Gathr supports ETL/ELT, Data Ingestion, CDC, Stream Processing, Data Preparation, ML, and Data Science functions.
A data pipeline is a sequence of actions that moves data from a source component to a target destination. A pipeline may involve filtering, cleaning, enriching, and even analyzing data in flight.
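To make the concept concrete, here is a minimal, tool-agnostic sketch of the source → transform → target flow described above. The function and field names are illustrative only; they are not Gathr APIs.

```python
# Generic illustration of a data pipeline: records flow from a source,
# through cleaning and enrichment steps, to a target.
# All names here are hypothetical, not part of Gathr.

records = [
    {"id": 1, "amount": 250, "country": "us"},
    {"id": 2, "amount": -10, "country": "de"},  # invalid: negative amount
    {"id": 3, "amount": 75, "country": "in"},
]

def clean(rows):
    # Filtering/cleaning step: drop records with invalid amounts.
    return [r for r in rows if r["amount"] >= 0]

def enrich(rows):
    # Enrichment step: normalize country codes to upper case.
    return [{**r, "country": r["country"].upper()} for r in rows]

# The "pipeline" is simply the composition of the steps.
target = enrich(clean(records))
print(target)
```

In a real pipeline tool such steps would be configured components (sources, processors, emitters) rather than hand-written functions, but the flow of data through ordered transformations is the same idea.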
In Gathr, data can be moved using either batch processing or stream processing.
Under the Pipeline section of Gathr, data pipelines can be created easily with a set of tools (Data Sources, Processors, ML algorithms, and Emitters) to automate the movement and transformation of data between a source system and a target repository.
Within a Project, navigate to the Pipeline page from the menu.
If you have any feedback on Gathr documentation, please email us!