Gathr can integrate multiple pipelines and workflows. You can orchestrate the sequence and the conditions that govern the execution of these pipelines. This is achieved through the Workflow interface.
On a Workflow page, you can perform the following actions:
Go to the Workflow option in your Project and click the Create New Workflow button. You will be redirected to the Workflow Definition page.
A Workflow tile has the following fields:
This page has nodes to create and execute the workflow. They are explained below:
To define a workflow, four nodes are available:
Add a Control node with one or multiple pipelines, with actions applied to it, and save your workflow. Once workflows are saved, you can also concatenate them from the Workflow tab.
Two types of control nodes are available:
Batch Pipelines created in the workspace are reflected here.
Action nodes provide the functionality of the following actions to a workflow. The actions available under the Action node are listed below.
Each action node is explained below:
This operator is used to assign a workflow-level variable and its value.
You can define multiple variables and their values by clicking the + ADD VARIABLE button. Make sure the variable values are valid Python data types.
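For illustration, a minimal sketch of variable values expressed as Python data types (the variable names below are hypothetical):

batch_date = "2024-01-01"          # string
max_records = 5000                 # integer
regions = ["us-east", "eu-west"]   # list
is_full_load = False               # boolean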
This node is used to execute a bash script, a command, or a set of commands. The configuration properties of the Bash Operator include the command, set of commands, or reference to the bash script that is to be executed, and the number of times the workflow retries this task in case of failure.
This node is used to send email. The details provided are routed through the mail server details given in the Airflow configuration during installation.
Set the following configurations for this node.
Note: You can choose multiple recipients for To, CC and BCC.
The HDFS Sensor is used to check whether a given location (file or folder path) has landed on HDFS. If the sensor finds the given location on HDFS within the given time interval, it is considered successful; otherwise, it fails.
Following are the configurations for this node.
In the above configuration, once the HDFS Sensor is triggered, it keeps checking the location (/sample/file/path) every 20 seconds for a duration of 300 seconds.
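Conceptually, the poke interval and time-out interact as in the minimal Python sketch below (check_hdfs_path is a hypothetical stand-in for the actual HDFS lookup):

import time

def wait_for_path(check_hdfs_path, path, poke_interval=20, timeout=300):
    # Returns True if the path appears within `timeout` seconds, else False.
    deadline = time.time() + timeout
    while time.time() < deadline:
        if check_hdfs_path(path):    # one "poke" of the sensor
            return True              # sensor is considered successful
        time.sleep(poke_interval)    # wait before the next poke
    return False                     # time-out reached: sensor fails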
This operator is used to hit an endpoint over an HTTP system.
Following are the properties under the configuration:
This operator is used to execute SQL statements on a Microsoft SQL Server database. Select the connection ID, defined in Airflow, from the drop-down to connect to the required MSSQL database. You can also set the number of times the workflow retries this task in case of failure.
This operator is used to execute SQL statements on a MySQL database. Select the connection ID, defined in Airflow, from the drop-down to connect to the required MySQL server. You can also set the number of times the workflow retries this task in case of failure.
The Pause Operator is used to pause the current workflow. Provide the configuration details below to configure the Pause Operator:
The Pipeline operator is used to run a selected pipeline. You can select the pipelines that need to run. This operator functions the same as a Pipeline node.
You have to set the following configurations for this operator.
Write the custom Python code to be executed by the workflow here. You can also set the number of times the workflow retries this task in case of failure.
This operator is used to execute SQL statements on a PostgreSQL database. Select the connection ID, defined in Airflow, from the drop-down to connect to the required PostgreSQL database. You can also set the number of times the workflow retries this task in case of failure.
This operator allows you to execute custom logic/code in a workflow. You can write custom code in Python, and it will be executed by the workflow.
Write the custom code in a Python method and provide the method name that should be invoked by the workflow. In addition, you can get, set, or update workflow variables in your custom logic using the methods described below (see the sketch after the argument descriptions).
get_dag_variable(variable_name, variable_type=None):
This method is used to get a workflow variable. Arguments are:
variable_name: provide the variable name
variable_type (optional): provide the variable type
set_dag_variable(variable_name, variable_value):
This method is used to set/update a workflow variable. Arguments are:
variable_name: provide the variable name
variable_value: provide the variable value
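A minimal sketch of a custom Python method that the workflow could invoke, using the two helpers above (the method name, the variable name, and the assumption that variable_type accepts a Python type such as int are all illustrative):

def update_batch_counter():
    # Read an existing workflow variable (variable_type is optional).
    count = get_dag_variable("batch_count", variable_type=int)
    # Set/update the workflow variable so that downstream tasks see the new value.
    set_dag_variable("batch_count", count + 1)

Provide update_batch_counter as the method name to be invoked by the workflow.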
This operator is used to transfer files from a remote host to local, or vice versa. Select the connection ID, defined in Airflow, from the drop-down to connect to the required SFTP server. You can also set the number of times the workflow retries this task in case of failure.
The SFTP Sensor is used to check whether a given location (file or folder path) has landed on SFTP. If the sensor finds the given location on SFTP within the given interval, it is considered successful; otherwise, it fails.
The SQL Sensor keeps running the SQL statement while the first cell of the result is in (0, '0', '').
It runs the SQL statement after each poke interval until the time-out interval is reached.
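In other words, the success check applied to the first cell can be sketched in Python as follows (a conceptual illustration, not the exact implementation; the handling of None is an assumption):

def first_cell_indicates_success(first_cell):
    # Keep poking while the first cell is None, 0, '0', or an empty string;
    # anything else means the sensor succeeds.
    return first_cell is not None and first_cell not in (0, '0', '')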
Waits for a given amount of time before succeeding. You need to provide the configurations.
This operator is used to execute SQL statements on a Vertica database. Select the connection ID, defined in Airflow, from the drop-down to connect to the required Vertica database. You can also set the number of times the workflow retries this task in case of failure.
The Kafka Alert operator is used to send an alert/message to a Kafka topic.
The Time Window operator is used to check whether the current execution falls within a given time window. It also checks whether the current execution date falls on a calendar holiday. If the current execution time is not within the given time window, or the current execution date is a calendar holiday, the operator returns False and is considered failed. Select True to check whether the current execution date is a calendar holiday. When you select True, an additional configuration is populated: select a calendar for holidays. These are the calendars created from the Register Entities > Calendar section. You can also set the number of times the workflow retries this task in case of failure.
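A simplified Python sketch of the check this operator performs (the window bounds and the holiday list are hypothetical placeholders):

from datetime import datetime, date, time

def in_time_window(now, start=time(9, 0), end=time(18, 0), holidays=()):
    # Fail if the execution date is a calendar holiday.
    if now.date() in holidays:
        return False
    # Otherwise succeed only if the execution time is inside the window.
    return start <= now.time() <= end

# Example: returns False because 2024-05-01 is listed as a holiday.
print(in_time_window(datetime(2024, 5, 1, 10, 30), holidays=(date(2024, 5, 1),)))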
The SSH operator is used to run commands on a remote machine. For that, you need to provide the following.
All the workflows created in Gathr for the logged-in user are listed on the Workflow home page.
You can add a workflow as an operator inside another workflow (similar to a pipeline). The added workflow acts as a sub-workflow: a separate instance of the workflow that is executed as a sub-workflow.
Once a workflow is defined, provide a workflow name and click the Save Workflow button to create a workflow using another workflow (Workflow Tile).
Once you have defined and created a workflow, you can perform the following operations on it:
You can edit a workflow, but not while it is active.
A workflow can contain another workflow. This enables the parent/child markers on the workflows.
A parent workflow is shown with a Parent Workflow marker icon, and similarly a child workflow is shown with a Child Workflow marker icon.
Every tile shows a count of the pipelines and sub-workflows, up to one level.
This means that if a workflow has two sub-workflows, you will be able to see their names on the tile. However, if those contain further sub-workflows or pipelines, you will have to keep clicking and expanding them on the Workflow Definition page. For example, you can view the three sub-workflows under t2, but to check t3, you need to click on it and view the edit mode of the sub-workflow.
When you click the pipeline/workflow count, the pipeline names and creation dates appear in a popup. Clicking an individual pipeline/workflow redirects you to its edit view.
In the edit view, you can drill down into every workflow to its last level. Double-click any workflow to view the window shown below:
Note: Workflow components other than pipeline and workflow nodes, including individual operators, are not available for drill down.
Once the workflow is created, deploy it by clicking the SCHEDULE button. Set your workflow scheduling configuration as defined below:
• Schedule Interval: Runs the workflow after this interval. You can select one of the predefined intervals. If you opt for the None option for the schedule interval, you can trigger the workflow manually.
• Select True to check whether the current execution date of the workflow falls on a given calendar holiday. If the current execution date is a calendar holiday, that execution of the workflow does not happen. If you select True, the following additional configurations are populated: Calendar and Include Weekend.
• Calendar: Select the calendar that you created in the Calendar section.
• Include Weekend: Select True or False, depending on whether you want to include the weekend as a holiday.
• Number of times Airflow restarts the workflow in case of failure. If the number of retries is 1 or more, provide a value for Retry Delay (in seconds), which makes the workflow wait for that delay before trying to restart.
• Email on Retry: You can send an email when a workflow retry is attempted by setting this value to True, or suppress it with False.
• Email on Failure: An email is sent on failure of the workflow if this value is set to True. The default value is False.
• Provide mail IDs. If Email on Retry or Email on Failure is set to True, an email is sent to the given mail IDs. If the workflow fails, Airflow retries the workflow and the mail is sent accordingly.
• Set to True if the current run of the workflow depends upon its last run. The default is False. If True, a scheduled run is executed only if the previously scheduled downstream task is complete; if that task has failed, the scheduled task waits until it completes. If False, the scheduled task runs irrespective of the downstream task.
• Option to specify the number of instances that can run concurrently for this workflow.
• You can use this property as an environment variable in the Bash operator, i.e., $RFC.
• You can use this property as an environment variable in the Bash operator, i.e., $CheckpointID.
When you select the Schedule Interval as None, the workflow can be triggered manually, as shown above.
To remove a workflow from a schedule, click on the SCHEDULING option. This option is available on the Workflow tile under the menu button.
A new window will pop up and you can un-schedule the workflow.
You can always Reschedule the workflow, using the same steps.
After scheduling the workflow, its status changes to SCHEDULED. You can now start the workflow by clicking the START button. If the workflow starts successfully, the status changes to ACTIVE.
Workflow monitoring feature allows you to:
1. View the workflow run history.
2. View the component-level status and its log details.
3. Re-trigger all the failed tasks with a single click.
The View Summary page takes you to the workflow monitoring dashboard, where you can see the last 5 runs and their details.
Upon clicking the Run ID, you can view the workflow summary. If the workflow fails, the Monitor window helps you resume and run the workflow from the failed stage. Shown below is a Monitor window with a failed task. Right-click the pipeline to view the logs and to re-trigger a failed task, as shown in the images below.
As shown above, within the workflow summary DAG window you can perform the following functions:
Once the workflow is in ACTIVE state, you can PAUSE the workflow.
You can RESUME a paused workflow.
Click the Delete button to delete the workflow.
You can test the workflow before it is scheduled.
Once you click the Test button as shown below, a pop-up with the testing configuration appears. Details of each field are explained below:
Here, the workflow considers the current date/time (of the given time zone) as the start date. Once you click the Test button, the workflow is deployed and instantly scheduled to start.
During testing, the workflow runs only once. You will not be able to perform other operations such as Start/Stop/Schedule/Resume. Once workflow testing is complete, the workflow will be ready to be scheduled.
Once testing is done, the test status is visible on the workflow tile. It shows whether the workflow succeeded or failed.
You can create connections in Airflow from Gathr. After clicking the + sign on the workflow creation page (Create Connection icon), a pop-up appears (Create Connection). Provide the information required to create a connection in this tile.
Note: If Airflow is configured with AWS MWAA in Gathr configurations, then you will need to create the connection for the required operator in the Airflow UI and provide the connection ID while configuring the respective operator in Gathr.
Following types of connection can be created:
You can create a workflow that has multiple executions of the same flow with different configurations.
You can convert any configuration property into a placeholder in each operator. This allows you to provide different configurations for different executions. Once all operators are configured and the flow is defined, you will be able to provide different values for each placeholder, either by entering data in the data grid or by uploading a CSV file.
You will have following options:
You can convert any configuration property into a placeholder by clicking it. Once clicked, its color changes and '$$' is appended as a prefix. The value provided for this configuration is considered the placeholder key.
If at least one property is selected as a placeholder in any of the operators, the workflow is converted into a template workflow and the data grid option is enabled for providing configurations.
A grid icon is placed in the top-right corner of the pipeline canvas. This is the template icon. Once you click it, the template window opens beneath the pipeline.
In the data grid, all placeholder keys are listed as headers, and you can provide their values for each execution. Each row is considered one execution. A 'unique_id' column is appended by default to identify each execution.
The Template window configuration properties are explained below:
• Configuration properties of Pipeline, Control, and Workflow nodes cannot be converted into placeholders.
• Make sure the placeholder name and the property name are not the same; they must be different. For example, if your property is Connection_ID, make sure the placeholder name is not also Connection_ID.
• Provide a Python list to use the variable name (string) and value in the Assignment Operator as a placeholder.
Certain terminologies used in the workflow are explained below:
The ID used to create a connection in Airflow. The connection ID provided in the workflow component should be the same as the ID given to the connection in Airflow.
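For reference, a connection with a matching ID can also be created programmatically on the Airflow side; a minimal sketch assuming direct access to the Airflow metadata database (all connection details below are hypothetical):

from airflow import settings
from airflow.models import Connection

# The conn_id must match the Connection ID entered in the Gathr workflow component.
conn = Connection(
    conn_id="gathr_mssql_conn",
    conn_type="mssql",
    host="sqlserver.example.com",
    login="gathr_user",
    password="secret",
    port=1433,
)
session = settings.Session()
session.add(conn)
session.commit()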
It is the rule by which the current node or task is triggered. The trigger rule is based on the current state of all the parent nodes/upstream tasks. The default trigger rule is All Success, which means the current node/task triggers only if all the parent nodes/upstream tasks have run successfully. The trigger rule can take the following values.
Service Level Agreement is the time by which a task should have succeeded.
If the task does not succeed within the given SLA, an alert email with the task details is sent to the configured mail ID. SLA time has two properties: Duration and Value.
A message will be displayed on the workflow homepage for following scenarios:
• If Airflow is not configured properly or the Airflow webserver is down.
• If the Airflow scheduler is down.