Streaming Emitter
The Streaming Emitter is an action based on Spark Structured Streaming. It executes a user-supplied implementation that processes the incoming streaming data.
Streaming Emitter Configuration
To add a Custom Streaming Emitter to your pipeline, drag the Custom Streaming Emitter onto the canvas, connect it to a Data Source or processor, and right-click on it to configure:
Field | Description
---|---
Implementation Class | Fully qualified name of the Foreach implementation class to which control is passed to process the incoming data flow.
Output Mode | Output mode used when writing the data to the streaming sink. Select one of the three options: Append: only the new rows in the streaming data are written to the sink. Complete Mode: all the rows in the streaming data are written to the sink every time there are updates. Update Mode: only the rows that were updated in the streaming data are written to the sink every time there are updates.
Enable Trigger | A trigger defines how frequently the streaming query is executed.
Processing Time | Appears only when the Enable Trigger checkbox is selected. Processing Time is the trigger interval, in minutes or seconds.
ADD CONFIGURATION | Enables adding additional custom properties.
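The Implementation Class field above corresponds to Spark's `ForeachWriter` contract. A minimal sketch in Scala is shown below; the class name `ConsoleRowWriter` is illustrative, not part of Gathr, and its fully qualified name (e.g. `com.example.ConsoleRowWriter`) is what would be entered in the Implementation Class field:

```scala
import org.apache.spark.sql.{ForeachWriter, Row}

// Hypothetical Foreach implementation class; the "Implementation
// Class" field would hold its fully qualified name.
class ConsoleRowWriter extends ForeachWriter[Row] {

  // Called once per partition per epoch; open connections here.
  // Return true to process the rows of this partition.
  override def open(partitionId: Long, epochId: Long): Boolean = true

  // Called once for every row in the micro-batch.
  override def process(row: Row): Unit =
    println(row.mkString(", "))

  // Called when processing ends; release resources here.
  override def close(errorOrNull: Throwable): Unit =
    if (errorOrNull != null) errorOrNull.printStackTrace()
}
```

Gathr passes each incoming row to `process`, so side effects such as writes to an external system belong there, with connection setup and teardown in `open` and `close`.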
Click the Next button and enter any notes in the space provided.
Click the Done button to save the configuration.
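Under the hood, the configured options map onto a Structured Streaming query roughly as sketched below. This is an assumed illustration, not Gathr's actual code: `inputDf` stands for the streaming DataFrame produced by the connected Data Source, and `MyRowWriter` is a hypothetical Foreach implementation class:

```scala
import org.apache.spark.sql.streaming.Trigger

val query = inputDf.writeStream
  .outputMode("append")                          // Output Mode: append / complete / update
  .trigger(Trigger.ProcessingTime("10 seconds")) // Enable Trigger + Processing Time
  .foreach(new MyRowWriter())                    // Implementation Class (hypothetical name)
  .start()

query.awaitTermination()
```

With the trigger enabled, Spark starts a micro-batch every 10 seconds; without it, a new micro-batch starts as soon as the previous one completes.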