OpenJMS Emitter
OpenJMS is used to send and receive messages between applications. The OpenJMS emitter writes data to JMS queues or topics; all applications that have subscribed to those queues or topics can read that data.
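For context, OpenJMS is an implementation of the standard JMS API, so data written to a queue or topic can be consumed by any JMS client. The snippet below is a minimal sketch of a standalone Java client publishing a message to an OpenJMS queue; the JNDI provider URL, connection factory name, and queue name are illustrative assumptions and will differ per deployment.

```java
import java.util.Hashtable;
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.Context;
import javax.naming.InitialContext;

public class OpenJmsQueuePublisher {
    public static void main(String[] args) throws Exception {
        // JNDI settings for an OpenJMS broker; the factory class, URL,
        // connection factory name, and queue name are illustrative assumptions.
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "org.exolab.jms.jndi.InitialContextFactory");
        env.put(Context.PROVIDER_URL, "tcp://localhost:3035/");
        Context ctx = new InitialContext(env);

        // Look up the connection factory and the target queue registered in JNDI.
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("ConnectionFactory");
        Destination queue = (Destination) ctx.lookup("queue1");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);

            // Any application subscribed to "queue1" can now read this message.
            TextMessage message = session.createTextMessage("{\"id\": 1, \"status\": \"ok\"}");
            producer.send(message);
        } finally {
            connection.close();
            ctx.close();
        }
    }
}
```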
OpenJMS Emitter Configuration
To add an OpenJms emitter to your pipeline, drag the emitter onto the canvas, connect it to a Data Source or processor, and right-click on it to configure it.
Field | Description |
---|---|
Connection Name | All OpenJMS connections are listed here. Select the connection to use for connecting to OpenJMS. |
Queue Name | Name of the queue to which messages are published. |
Output Format | Select the data format in which OpenJMS is configured to write the data. |
Output Fields | Select the fields which should be a part of the output data. |
Checkpoint Storage Location | Select the checkpointing storage location. Available options are HDFS, S3, and EFS. |
Checkpoint Connections | Select the connection. Connections are listed corresponding to the selected storage location. |
Checkpoint Directory | The path where the Spark application stores checkpointing data. For HDFS and EFS, enter a relative path such as /user/hadoop/checkpointingDir; the system will add a suitable prefix by itself. For S3, enter an absolute path such as S3://BucketName/checkpointingDir. |
Time-Based Check Point | Select the checkbox to enable a time-based checkpoint on each pipeline run, i.e. on every run the checkpoint location provided above is appended with the current time in milliseconds. |
Output Mode | Output mode to be used while writing the data to the data sink; a minimal Spark sketch of these settings follows the table. Select one of the three options. Append: only the new rows in the streaming data are written to the sink. Complete: all the rows in the streaming data are written to the sink every time there are updates. Update: only the rows that were updated in the streaming data are written to the sink every time there are updates. |
Enable Trigger | Trigger defines how frequently a streaming query will be executed. |
Processing Time | Appears only when the Enable Trigger checkbox is selected. Processing Time is the trigger time interval, in minutes or seconds. |
ADD CONFIGURATION | Enables you to configure additional properties. |
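The checkpointing, output mode, and trigger fields correspond to standard Spark Structured Streaming writer options. The snippet below is a minimal, hedged sketch of how an equivalent plain Spark streaming writer would apply a checkpoint directory, the Append output mode, and a processing-time trigger; it is not Gathr's internal implementation, and the rate source, console sink, checkpoint path, and interval are assumptions chosen for illustration.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.Trigger;

public class CheckpointedStreamSketch {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("openjms-emitter-sketch")
                .getOrCreate();

        // Hypothetical streaming source; in a pipeline this would be the
        // upstream Data Source or processor feeding the OpenJMS emitter.
        Dataset<Row> stream = spark.readStream()
                .format("rate")     // built-in test source, stands in for real data
                .load();

        StreamingQuery query = stream.writeStream()
                .format("console")  // stand-in sink; the emitter writes to JMS instead
                .outputMode("append")                                           // Output Mode: Append
                .option("checkpointLocation", "/user/hadoop/checkpointingDir")  // Checkpoint Directory
                .trigger(Trigger.ProcessingTime("30 seconds"))                  // Processing Time
                .start();

        query.awaitTermination();
    }
}
```

With the Time-Based Check Point option enabled, the checkpoint location above would additionally be appended with the current time in milliseconds on each pipeline run.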
Click the Next button and enter notes in the space provided.
Click SAVE to save the configuration details.