Dependencies

Before setting up the plugin, ensure you have the following dependencies:
  • A MongoDB database for storing notification templates and records
  • Access to the Kafka instance used by the FlowX.AI Engine
  • A Redis instance for caching notification templates
  • An S3-compatible file storage solution (for example MinIO) if you need to attach documents to notifications

Authorization configuration

Set these variables to connect to your identity management platform:
Environment Variable | Description | Default Value
SECURITY_TYPE | Security type | oauth2
SECURITY_OAUTH2_BASESERVERURL | Base URL of the OAuth2/OIDC server | -
SECURITY_OAUTH2_REALM | OAuth2 realm name | -
SECURITY_OAUTH2_CLIENT_CLIENT_ID | Client ID for token introspection | -
SECURITY_OAUTH2_CLIENT_CLIENT_SECRET | Client secret for token introspection | -
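As a sketch, these variables might look like the following in a deployment environment file. The server URL, realm, and credentials below are placeholders, not real values:

```shell
# Placeholder identity-provider settings -- substitute your own
SECURITY_TYPE=oauth2
SECURITY_OAUTH2_BASESERVERURL=https://idp.example.com/auth
SECURITY_OAUTH2_REALM=flowx
SECURITY_OAUTH2_CLIENT_CLIENT_ID=notification-plugin
SECURITY_OAUTH2_CLIENT_CLIENT_SECRET=change-me
```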

MongoDB configuration

Only the database access details need to be configured; the plugin handles the rest.
Environment Variable | Description | Default Value
SPRING_DATA_MONGODB_URI | MongoDB connection URI | mongodb://${DB_USERNAME}:${DB_PASSWORD}@mongodb-0.mongodb-headless,mongodb-1.mongodb-headless,mongodb-arbiter-0.mongodb-arbiter-headless:27017/notification-plugin
DB_USERNAME | Username for runtime MongoDB connection | notification-plugin
DB_PASSWORD | Password for runtime MongoDB connection | password
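For illustration, a minimal environment block using the defaults above might look like this. The hostnames assume the default Helm-style MongoDB replica set; adjust them to your deployment, and keep credentials in a secrets manager:

```shell
# Credentials are placeholders -- use your own secrets management
DB_USERNAME=notification-plugin
DB_PASSWORD=change-me
SPRING_DATA_MONGODB_URI="mongodb://${DB_USERNAME}:${DB_PASSWORD}@mongodb-0.mongodb-headless,mongodb-1.mongodb-headless,mongodb-arbiter-0.mongodb-arbiter-headless:27017/notification-plugin"
```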

Redis configuration

The Notification Plugin uses Redis for caching. Configure the Redis connection using the standard Redis environment variables. Quick reference:
Environment Variable | Description | Example Value | Status
SPRING_DATA_REDIS_HOST | Redis server hostname | localhost | Recommended
SPRING_DATA_REDIS_PORT | Redis server port | 6379 | Recommended
SPRING_DATA_REDIS_PASSWORD | Redis authentication password | - | Recommended
REDIS_TTL | Cache TTL in milliseconds | 5000000 | Optional
Both SPRING_DATA_REDIS_* and SPRING_REDIS_* variable prefixes are supported. The SPRING_DATA_REDIS_* prefix is the modern Spring Boot standard and is recommended for new deployments.
For advanced Redis deployment modes (Sentinel, Cluster) and SSL/TLS setup, see the Redis Configuration guide. Note that Sentinel and Cluster modes are only supported by the Events Gateway service.
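A minimal single-instance setup could look like the following sketch; the host name and password are placeholders:

```shell
# Single-instance Redis cache; host and password are placeholders
SPRING_DATA_REDIS_HOST=redis-master
SPRING_DATA_REDIS_PORT=6379
SPRING_DATA_REDIS_PASSWORD=change-me
REDIS_TTL=5000000   # cache entries expire after 5,000 seconds (~83 minutes)
```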

Kafka configuration

Core Kafka settings

Environment Variable | Description | Default Value
SPRING_KAFKA_BOOTSTRAPSERVERS | Address of the Kafka server(s) | localhost:9092
SPRING_KAFKA_SECURITY_PROTOCOL | Security protocol for Kafka connections | PLAINTEXT
SPRING_KAFKA_CONSUMER_GROUPID | Consumer group identifier | notification-plugin-consumer
KAFKA_MESSAGE_MAX_BYTES | Maximum message size (bytes) | 52428800 (50 MB)
KAFKA_AUTHEXCEPTIONRETRYINTERVAL | Retry interval after authorization exceptions (seconds) | 10
KAFKA_CONSUMER_THREADS | Number of consumer threads | 1
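For example, a plain (unauthenticated) Kafka connection with two consumer threads might be configured as follows; the broker address is a placeholder:

```shell
SPRING_KAFKA_BOOTSTRAPSERVERS=kafka-0.kafka-headless:9092   # placeholder broker address
SPRING_KAFKA_SECURITY_PROTOCOL=PLAINTEXT
SPRING_KAFKA_CONSUMER_GROUPID=notification-plugin-consumer
KAFKA_MESSAGE_MAX_BYTES=52428800   # 50 MB
KAFKA_CONSUMER_THREADS=2
```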

Consumer error handling

Environment Variable | Description | Default Value
KAFKA_CONSUMER_ERRORHANDLING_ENABLED | Enable consumer error handling | false
KAFKA_CONSUMER_ERRORHANDLING_RETRIES | Number of retry attempts for failed messages | 0
KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL | Interval between retries (milliseconds) | 1000
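For example, to retry each failed message up to 5 times with 5 seconds between attempts:

```shell
KAFKA_CONSUMER_ERRORHANDLING_ENABLED=true
KAFKA_CONSUMER_ERRORHANDLING_RETRIES=5
KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL=5000   # milliseconds
```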

OAuth authentication (when using SASL_PLAINTEXT)

Environment Variable | Description | Default Value
KAFKA_OAUTH_CLIENT_ID | OAuth client ID | kafka
KAFKA_OAUTH_CLIENT_SECRET | OAuth client secret | kafka-secret
KAFKA_OAUTH_TOKEN_ENDPOINT_URI | OAuth token endpoint | kafka.auth.localhost
When using the kafka-auth profile, the security protocol will automatically be set to SASL_PLAINTEXT and the SASL mechanism will be set to OAUTHBEARER.

Topic naming configuration

Environment Variable | Description | Default Value
KAFKA_TOPIC_NAMING_PACKAGE | Package prefix for topic names | ai.flowx.
KAFKA_TOPIC_NAMING_ENVIRONMENT | Environment segment for topic names | -
KAFKA_TOPIC_NAMING_VERSION | Version suffix for topic names | .v1
KAFKA_TOPIC_NAMING_SEPARATOR | Primary separator for topic names | .
KAFKA_TOPIC_NAMING_SEPARATOR2 | Secondary separator for topic names | -
KAFKA_TOPIC_NAMING_ENGINERECEIVEPATTERN | Engine receive pattern | engine.receive.

Topic configurations

Each action in the service corresponds to a Kafka event on a specific topic. Configure the following topics:

OTP topics

Environment Variable | Description | Default Value
KAFKA_TOPIC_OTP_GENERATE_IN | Topic for incoming OTP generation requests | ai.flowx.plugin.notification.trigger.generate.otp.v1
KAFKA_TOPIC_OTP_GENERATE_OUT | Topic for OTP generation results | ai.flowx.engine.receive.plugin.notification.generate.otp.results.v1
KAFKA_TOPIC_OTP_VALIDATE_IN | Topic for incoming OTP validation requests | ai.flowx.plugin.notification.trigger.validate.otp.v1
KAFKA_TOPIC_OTP_VALIDATE_OUT | Topic for OTP validation results | ai.flowx.engine.receive.plugin.notification.validate.otp.results.v1

Notification topics

Environment Variable | Description | Default Value
KAFKA_TOPIC_NOTIFICATION_INTERNAL_IN | Topic for incoming notification requests from the process engine (used by both Kafka-forwarded notifications and the Send Notification action) | ai.flowx.plugin.notification.trigger.send.notification.v1
KAFKA_TOPIC_NOTIFICATION_INTERNAL_OUT | Topic for notification delivery confirmations | ai.flowx.engine.receive.plugin.notification.confirm.send.notification.v1
KAFKA_TOPIC_NOTIFICATION_EXTERNAL_OUT | Topic for forwarding notifications to external systems | ai.flowx.plugin.notification.trigger.forward.notification.v1

Audit topic

Environment Variable | Description | Default Value
KAFKA_TOPIC_AUDIT_OUT | Topic for sending audit logs | ai.flowx.core.trigger.save.audit.v1

Resource usages topics

New in v5.5.0
Environment Variable | Description | Default Value
KAFKA_TOPIC_RESOURCESUSAGES_REFRESH | Topic for resource usages refresh events | ai.flowx.application-version.resources-usages.refresh.v1
KAFKA_TOPIC_APPLICATION_RESOURCE_RESELEMUSAGEVALIDATION_RESPONSE | Topic for sub-resource validation responses | ai.flowx.application-version.resources-usages.sub-res-validation.response.v1
KAFKA_TOPIC_APPLICATION_RESOURCE_USAGES_OUT | Topic for bulk resource usage operations | ai.flowx.application-version.resources-usages.operations.bulk.v1

File storage configuration

Depending on your use case, you can store files directly on a file system or in an S3-compatible cloud storage solution (for example, MinIO). Configure the file storage solution using the following environment variables:
Environment Variable | Description | Default Value
APPLICATION_FILESTORAGE_TYPE | Storage type to use (s3 or fileSystem) | s3
APPLICATION_FILESTORAGE_DISKDIRECTORY | Directory for file storage when using the file system | MS_SVC_NOTIFICATION
APPLICATION_FILESTORAGE_S3_ENABLED | Enable S3-compatible storage | true
APPLICATION_FILESTORAGE_S3_SERVERURL | URL of the MinIO or S3-compatible server | http://minio-service:9000
APPLICATION_FILESTORAGE_S3_ENCRYPTIONENABLED | Enable server-side encryption | false
APPLICATION_FILESTORAGE_S3_ACCESSKEY | Access key for S3 | minio
APPLICATION_FILESTORAGE_S3_SECRETKEY | Secret key for S3 | secret
APPLICATION_FILESTORAGE_S3_BUCKETPREFIX | Prefix for bucket names | qdevlocal-preview-paperflow
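A sketch of an S3 (MinIO) configuration; the server URL, keys, and bucket prefix below are placeholders to replace with your own values:

```shell
APPLICATION_FILESTORAGE_TYPE=s3
APPLICATION_FILESTORAGE_S3_ENABLED=true
APPLICATION_FILESTORAGE_S3_SERVERURL=http://minio-service:9000
APPLICATION_FILESTORAGE_S3_ENCRYPTIONENABLED=false
APPLICATION_FILESTORAGE_S3_ACCESSKEY=minio                     # placeholder
APPLICATION_FILESTORAGE_S3_SECRETKEY=change-me                 # placeholder
APPLICATION_FILESTORAGE_S3_BUCKETPREFIX=myenv-notifications    # placeholder
```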
When using S3-compatible storage for notifications with attachments, the S3 user configured through APPLICATION_FILESTORAGE_S3_ACCESSKEY and APPLICATION_FILESTORAGE_S3_SECRETKEY must have read access to multiple buckets beyond its own.
Required bucket access:
  • Own bucket - defined by APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
  • Documents Plugin bucket - defined in the Documents Plugin configuration via APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
  • CMS Core public bucket - defined in the CMS Core configuration via APPLICATION_FILESTORAGE_S3_BUCKETNAME
  • Integration Designer bucket - defined in the Integration Designer configuration via APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
Ensure your S3 user has appropriate read permissions to all relevant buckets to avoid attachment failures.

SMTP setup

Configure SMTP settings for sending email notifications:
Environment Variable | Description | Default Value
SIMPLEJAVAMAIL_SMTP_HOST | SMTP server hostname | smtp.gmail.com
SIMPLEJAVAMAIL_SMTP_PORT | SMTP server port | 587
SIMPLEJAVAMAIL_SMTP_USERNAME | SMTP server username | notification.test@flowx.ai
SIMPLEJAVAMAIL_SMTP_PASSWORD | SMTP server password | password
SIMPLEJAVAMAIL_TRANSPORTSTRATEGY | Email transport strategy (e.g., SMTP, EXTERNAL_FORWARD) | SMTP
APPLICATION_MAIL_FROM_EMAIL | Default sender email address | notification.test@flowx.ai
APPLICATION_MAIL_FROM_NAME | Default sender name | Notification Test
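A hypothetical SMTP setup for a provider at smtp.example.com; all values are placeholders for your own mail provider's settings:

```shell
# Placeholder SMTP credentials -- use your own mail provider
SIMPLEJAVAMAIL_SMTP_HOST=smtp.example.com
SIMPLEJAVAMAIL_SMTP_PORT=587
SIMPLEJAVAMAIL_SMTP_USERNAME=notifications@example.com
SIMPLEJAVAMAIL_SMTP_PASSWORD=change-me
SIMPLEJAVAMAIL_TRANSPORTSTRATEGY=SMTP
APPLICATION_MAIL_FROM_EMAIL=notifications@example.com
APPLICATION_MAIL_FROM_NAME="Example Notifications"
```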

Email attachments configuration

Configure handling of email attachments:
Environment Variable | Description | Default Value
SPRING_HTTP_MULTIPART_MAXFILESIZE | Maximum file size for attachments | 15MB
SPRING_HTTP_MULTIPART_MAXREQUESTSIZE | Maximum request size for multipart uploads | 15MB

OTP configuration

Configure One-Time Password generation and validation:
Environment Variable | Description | Default Value
FLOWX_OTP_LENGTH | Number of characters in generated OTPs | 4
FLOWX_OTP_EXPIRETIMEINSECONDS | Expiry time for OTPs (seconds) | 6000 (100 minutes)
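For example, to issue 6-character OTPs that expire after 5 minutes (300 seconds):

```shell
FLOWX_OTP_LENGTH=6
FLOWX_OTP_EXPIRETIMEINSECONDS=300   # 5 minutes
```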

Logging configuration

Control logging levels for different components:
Environment Variable | Description | Default Value
LOGGING_LEVEL_ROOT | Root logging level | -
LOGGING_LEVEL_APP | Application-specific log level | INFO
LOGGING_LEVEL_MONGO_DRIVER | MongoDB driver log level | INFO
LOGGING_LEVEL_THYMELEAF | Thymeleaf template engine log level | INFO
LOGGING_LEVEL_FCM_CLIENT | Firebase Cloud Messaging client log level | OFF
LOGGING_LEVEL_REDIS | Redis/Lettuce client log level | OFF

CAS lib configuration

Environment Variable | Description | Default Value
FLOWX_SPICEDB_HOST | SpiceDB server hostname | spicedb
FLOWX_SPICEDB_PORT | SpiceDB server port | 50051
FLOWX_SPICEDB_TOKEN | SpiceDB authentication token | -

Usage notes

Topic naming convention

Topics follow a standardized naming convention:
  • Structure: {package}{environment}.{component}.{action}.{subject}.{version}
  • Example: ai.flowx.plugin.notification.trigger.generate.otp.v1
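The convention can be sketched in shell, showing how the KAFKA_TOPIC_NAMING_* defaults combine into a full topic name. This is an illustration of the pattern, not the plugin's actual code:

```shell
PACKAGE="ai.flowx."                                # KAFKA_TOPIC_NAMING_PACKAGE
ENVIRONMENT=""                                     # KAFKA_TOPIC_NAMING_ENVIRONMENT (empty by default)
SEPARATOR="."                                      # KAFKA_TOPIC_NAMING_SEPARATOR
VERSION=".v1"                                      # KAFKA_TOPIC_NAMING_VERSION
BODY="plugin.notification.trigger.generate.otp"    # component.action.subject

# Insert the environment segment only when it is non-empty
TOPIC="${PACKAGE}${ENVIRONMENT:+${ENVIRONMENT}${SEPARATOR}}${BODY}${VERSION}"
echo "$TOPIC"   # ai.flowx.plugin.notification.trigger.generate.otp.v1
```

With ENVIRONMENT set to, say, dev, the same snippet would yield ai.flowx.dev.plugin.notification.trigger.generate.otp.v1.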

Consumer error handling

When KAFKA_CONSUMER_ERRORHANDLING_ENABLED is set to true:
  • The application will retry processing failed messages according to KAFKA_CONSUMER_ERRORHANDLING_RETRIES
  • Between retries, the application will wait for the duration specified by KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL
For example, if KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL is set to 5000 (5 seconds) and KAFKA_CONSUMER_ERRORHANDLING_RETRIES is set to 5, the consumer application will make up to 5 attempts, waiting 5 seconds between each attempt.

Message size configuration

The KAFKA_MESSAGE_MAX_BYTES setting affects multiple Kafka properties:
  • spring.kafka.producer.properties.message.max.bytes
  • spring.kafka.producer.properties.max.request.size
  • spring.kafka.consumer.properties.max.partition.fetch.bytes
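As a sanity check on the default, 52428800 bytes is exactly 50 MB:

```shell
# 50 MB expressed in bytes
KAFKA_MESSAGE_MAX_BYTES=$((50 * 1024 * 1024))
echo "$KAFKA_MESSAGE_MAX_BYTES"   # 52428800
```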

OAuth authentication

When using the kafka-auth profile, the security protocol changes to SASL_PLAINTEXT and requires OAuth configuration via the KAFKA_OAUTH_* variables.

Troubleshooting

Common issues

Symptoms: Notification requests are accepted but emails or messages are never delivered.
Solutions:
  1. Verify Kafka topics are correctly configured and the notification plugin is consuming from KAFKA_TOPIC_NOTIFICATION_INTERNAL_IN
  2. Check SMTP configuration (SIMPLEJAVAMAIL_SMTP_HOST, port, credentials) and ensure the SMTP server is reachable from the pod
  3. If using the Email Gateway for forwarding, verify KAFKA_TOPIC_NOTIFICATION_EXTERNAL_OUT is configured and the Email Gateway is running
  4. Check application logs for delivery errors by setting LOGGING_LEVEL_APP to DEBUG
Symptoms: OTP requests return errors or no OTP is generated.
Solutions:
  1. Verify OTP configuration values (FLOWX_OTP_LENGTH, FLOWX_OTP_EXPIRETIMEINSECONDS) are set correctly
  2. Check that Kafka topics KAFKA_TOPIC_OTP_GENERATE_IN and KAFKA_TOPIC_OTP_GENERATE_OUT are created and accessible
  3. Ensure MongoDB is reachable and the notification plugin database has write permissions
  4. Verify Kafka consumer group SPRING_KAFKA_CONSUMER_GROUPID is not conflicting with another instance
Symptoms: Notifications are sent but contain raw template syntax or missing values.
Solutions:
  1. Verify substitution tags in the template match the keys provided in the notification request payload
  2. Check that the Thymeleaf engine is functioning by reviewing logs at LOGGING_LEVEL_THYMELEAF set to DEBUG
  3. Ensure the template exists in MongoDB and is in the correct format
  4. Verify Redis cache is not serving stale templates — clear the cache or restart the plugin if templates were recently updated
Symptoms: Push notifications are not delivered to mobile devices.
Solutions:
  1. Verify Firebase Cloud Messaging credentials are correctly configured
  2. Check that the target device has a valid FCM registration token
  3. Enable FCM client logging by setting LOGGING_LEVEL_FCM_CLIENT to DEBUG to inspect request/response details
  4. Ensure network policies allow outbound HTTPS traffic to Firebase servers (fcm.googleapis.com)

Sending Notifications

Learn how to configure and send notifications from your processes

Email Gateway Setup

Configure the Email Gateway for inbound and outbound email processing

Redis Configuration

Complete Redis setup including Sentinel and Cluster modes
Last modified on March 25, 2026