Dependencies
Before setting up the plugin, ensure you have the following dependencies:

- A MongoDB database for storing notification templates and records
- Access to the Kafka instance used by the FlowX.AI Engine
- A Redis instance for caching notification templates
- An S3-compatible file storage solution (for example MinIO) if you need to attach documents to notifications
Authorization configuration
Set these variables to connect to your identity management platform:

| Environment Variable | Description | Default Value |
|---|---|---|
SECURITY_TYPE | Security type | oauth2 |
SECURITY_OAUTH2_BASESERVERURL | Base URL of the OAuth2/OIDC server | |
SECURITY_OAUTH2_REALM | OAuth2 realm name | |
SECURITY_OAUTH2_CLIENT_CLIENT_ID | Client ID for token introspection | |
SECURITY_OAUTH2_CLIENT_CLIENT_SECRET | Client secret for token introspection | |
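Taken together, a minimal OAuth2 setup might look like the following fragment. The server URL, realm, and client credentials are placeholder values for illustration, not defaults:

```shell
# Example OAuth2 settings — all values below are placeholders; substitute
# the details of your own identity provider
export SECURITY_TYPE=oauth2
export SECURITY_OAUTH2_BASESERVERURL=https://idp.example.com/auth
export SECURITY_OAUTH2_REALM=flowx
export SECURITY_OAUTH2_CLIENT_CLIENT_ID=notification-plugin
export SECURITY_OAUTH2_CLIENT_CLIENT_SECRET=change-me
```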
MongoDB configuration
Only the database access details need to be configured; the plugin handles the rest.

| Environment Variable | Description | Default Value |
|---|---|---|
SPRING_DATA_MONGODB_URI | MongoDB connection URI | mongodb://${DB_USERNAME}:${DB_PASSWORD}@mongodb-0.mongodb-headless,mongodb-1.mongodb-headless,mongodb-arbiter-0.mongodb-arbiter-headless:27017/notification-plugin |
DB_USERNAME | Username for runtime MongoDB connection | notification-plugin |
DB_PASSWORD | Password for runtime MongoDB connection | password |
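As a sketch, the credentials are interpolated into the connection URI, so a single-node setup might be configured like this (the hostname is illustrative and the password is a placeholder):

```shell
# Example MongoDB settings — hostname and password are placeholders.
# DB_USERNAME and DB_PASSWORD are interpolated into the URI.
export DB_USERNAME=notification-plugin
export DB_PASSWORD='a-strong-password'
export SPRING_DATA_MONGODB_URI="mongodb://${DB_USERNAME}:${DB_PASSWORD}@mongodb-0.mongodb-headless:27017/notification-plugin"
```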
Redis configuration
The Notification plugin uses Redis for caching. Configure the Redis connection using the standard Redis environment variables. Quick reference:

| Environment Variable | Description | Example Value | Status |
|---|---|---|---|
SPRING_DATA_REDIS_HOST | Redis server hostname | localhost | Recommended |
SPRING_DATA_REDIS_PORT | Redis server port | 6379 | Recommended |
SPRING_DATA_REDIS_PASSWORD | Redis authentication password | - | Recommended |
REDIS_TTL | Cache TTL in milliseconds | 5000000 | Optional |
Both SPRING_DATA_REDIS_* and SPRING_REDIS_* variable prefixes are supported. The SPRING_DATA_REDIS_* prefix is the modern Spring Boot standard and is recommended for new deployments.

For advanced Redis deployment modes (Sentinel, Cluster) and SSL/TLS setup, see the Redis Configuration guide. Note that Sentinel and Cluster modes are only supported by the Events Gateway service.
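A typical standalone Redis configuration using the recommended prefix might look like this (hostname and password are placeholders):

```shell
# Example Redis settings using the modern SPRING_DATA_REDIS_* prefix —
# hostname and password are placeholders
export SPRING_DATA_REDIS_HOST=redis-master
export SPRING_DATA_REDIS_PORT=6379
export SPRING_DATA_REDIS_PASSWORD='redis-password'
export REDIS_TTL=5000000   # cache TTL in milliseconds (~83 minutes)
```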
Kafka configuration
Core Kafka settings
| Environment Variable | Description | Default Value |
|---|---|---|
SPRING_KAFKA_BOOTSTRAPSERVERS | Address of the Kafka server(s) | localhost:9092 |
SPRING_KAFKA_SECURITY_PROTOCOL | Security protocol for Kafka connections | PLAINTEXT |
SPRING_KAFKA_CONSUMER_GROUPID | Consumer group identifier | notification-plugin-consumer |
KAFKA_MESSAGE_MAX_BYTES | Maximum message size (bytes) | 52428800 (50 MB) |
KAFKA_AUTHEXCEPTIONRETRYINTERVAL | Retry interval after authorization exceptions (seconds) | 10 |
KAFKA_CONSUMER_THREADS | Number of consumer threads | 1 |
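For orientation, a plain (unauthenticated) Kafka connection could be configured as follows. The broker address is illustrative:

```shell
# Example core Kafka settings — the broker address is a placeholder
export SPRING_KAFKA_BOOTSTRAPSERVERS=kafka-broker-0:9092
export SPRING_KAFKA_SECURITY_PROTOCOL=PLAINTEXT
export SPRING_KAFKA_CONSUMER_GROUPID=notification-plugin-consumer
export KAFKA_CONSUMER_THREADS=1
```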
Consumer error handling
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_CONSUMER_ERRORHANDLING_ENABLED | Enable consumer error handling | false |
KAFKA_CONSUMER_ERRORHANDLING_RETRIES | Number of retry attempts for failed messages | 0 |
KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL | Interval between retries (milliseconds) | 1000 |
OAuth authentication (when using SASL_PLAINTEXT)
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_OAUTH_CLIENT_ID | OAuth client ID | kafka |
KAFKA_OAUTH_CLIENT_SECRET | OAuth client secret | kafka-secret |
KAFKA_OAUTH_TOKEN_ENDPOINT_URI | OAuth token endpoint | kafka.auth.localhost |
When using the kafka-auth profile, the security protocol will automatically be set to SASL_PLAINTEXT and the SASL mechanism to OAUTHBEARER.

Topic naming configuration
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_TOPIC_NAMING_PACKAGE | Package prefix for topic names | ai.flowx. |
KAFKA_TOPIC_NAMING_ENVIRONMENT | Environment segment for topic names | |
KAFKA_TOPIC_NAMING_VERSION | Version suffix for topic names | .v1 |
KAFKA_TOPIC_NAMING_SEPARATOR | Primary separator for topic names | . |
KAFKA_TOPIC_NAMING_SEPARATOR2 | Secondary separator for topic names | - |
KAFKA_TOPIC_NAMING_ENGINERECEIVEPATTERN | Engine receive pattern | engine.receive. |
Topic configurations
Each action in the service corresponds to a Kafka event on a specific topic. Configure the following topics:

OTP topics
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_TOPIC_OTP_GENERATE_IN | Topic for incoming OTP generation requests | ai.flowx.plugin.notification.trigger.generate.otp.v1 |
KAFKA_TOPIC_OTP_GENERATE_OUT | Topic for OTP generation results | ai.flowx.engine.receive.plugin.notification.generate.otp.results.v1 |
KAFKA_TOPIC_OTP_VALIDATE_IN | Topic for incoming OTP validation requests | ai.flowx.plugin.notification.trigger.validate.otp.v1 |
KAFKA_TOPIC_OTP_VALIDATE_OUT | Topic for OTP validation results | ai.flowx.engine.receive.plugin.notification.validate.otp.results.v1 |
Notification topics
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_TOPIC_NOTIFICATION_INTERNAL_IN | Topic for incoming notification requests from the process engine (used by both Kafka-forwarded notifications and the Send Notification action) | ai.flowx.plugin.notification.trigger.send.notification.v1 |
KAFKA_TOPIC_NOTIFICATION_INTERNAL_OUT | Topic for notification delivery confirmations | ai.flowx.engine.receive.plugin.notification.confirm.send.notification.v1 |
KAFKA_TOPIC_NOTIFICATION_EXTERNAL_OUT | Topic for forwarding notifications to external systems | ai.flowx.plugin.notification.trigger.forward.notification.v1 |
Audit topic
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_TOPIC_AUDIT_OUT | Topic for sending audit logs | ai.flowx.core.trigger.save.audit.v1 |
Resource usages topics
New in v5.5.0
| Environment Variable | Description | Default Value |
|---|---|---|
KAFKA_TOPIC_RESOURCESUSAGES_REFRESH | Topic for resource usages refresh events | ai.flowx.application-version.resources-usages.refresh.v1 |
KAFKA_TOPIC_APPLICATION_RESOURCE_RESELEMUSAGEVALIDATION_RESPONSE | Topic for sub-resource validation responses | ai.flowx.application-version.resources-usages.sub-res-validation.response.v1 |
KAFKA_TOPIC_APPLICATION_RESOURCE_USAGES_OUT | Topic for bulk resource usage operations | ai.flowx.application-version.resources-usages.operations.bulk.v1 |
File storage configuration
Depending on your use case, you can use either a local file system or an S3-compatible cloud storage solution (for example, MinIO). Configure the file storage solution using the following environment variables:

| Environment Variable | Description | Default Value |
|---|---|---|
APPLICATION_FILESTORAGE_TYPE | Storage type to use (s3 or fileSystem) | s3 |
APPLICATION_FILESTORAGE_DISKDIRECTORY | Directory for file storage when using filesystem | MS_SVC_NOTIFICATION |
APPLICATION_FILESTORAGE_S3_ENABLED | Enable S3-compatible storage | true |
APPLICATION_FILESTORAGE_S3_SERVERURL | URL of MinIO or S3-compatible server | http://minio-service:9000 |
APPLICATION_FILESTORAGE_S3_ENCRYPTIONENABLED | Enable server-side encryption | false |
APPLICATION_FILESTORAGE_S3_ACCESSKEY | Access key for S3 | minio |
APPLICATION_FILESTORAGE_S3_SECRETKEY | Secret key for S3 | secret |
APPLICATION_FILESTORAGE_S3_BUCKETPREFIX | Prefix for bucket names | qdevlocal-preview-paperflow |
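An S3-backed setup might be configured as follows. The endpoint, credentials, and bucket prefix are placeholders:

```shell
# Example S3/MinIO settings — endpoint, credentials, and bucket prefix
# are placeholders
export APPLICATION_FILESTORAGE_TYPE=s3
export APPLICATION_FILESTORAGE_S3_ENABLED=true
export APPLICATION_FILESTORAGE_S3_SERVERURL=http://minio-service:9000
export APPLICATION_FILESTORAGE_S3_ACCESSKEY=minio
export APPLICATION_FILESTORAGE_S3_SECRETKEY='change-me'
export APPLICATION_FILESTORAGE_S3_BUCKETPREFIX=my-env-notifications
```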
When using S3-compatible storage for notifications with attachments, the S3 user configured through APPLICATION_FILESTORAGE_S3_ACCESSKEY and APPLICATION_FILESTORAGE_S3_SECRETKEY must have read access to multiple buckets beyond its own.

Required bucket access:

- Own bucket: defined by APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
- Documents Plugin bucket: defined in the Documents Plugin configuration via APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
- CMS Core public bucket: defined in the CMS Core configuration via APPLICATION_FILESTORAGE_S3_BUCKETNAME
- Integration Designer bucket: defined in the Integration Designer configuration via APPLICATION_FILESTORAGE_S3_BUCKETPREFIX
SMTP setup
Configure SMTP settings for sending email notifications:

| Environment Variable | Description | Default Value |
|---|---|---|
SIMPLEJAVAMAIL_SMTP_HOST | SMTP server hostname | smtp.gmail.com |
SIMPLEJAVAMAIL_SMTP_PORT | SMTP server port | 587 |
SIMPLEJAVAMAIL_SMTP_USERNAME | SMTP server username | notification.test@flowx.ai |
SIMPLEJAVAMAIL_SMTP_PASSWORD | SMTP server password | password |
SIMPLEJAVAMAIL_TRANSPORTSTRATEGY | Email transport strategy (e.g., SMTP, EXTERNAL_FORWARD) | SMTP |
APPLICATION_MAIL_FROM_EMAIL | Default sender email address | notification.test@flowx.ai |
APPLICATION_MAIL_FROM_NAME | Default sender name | Notification Test |
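A working SMTP configuration might look like this. The host, credentials, and sender identity are placeholders:

```shell
# Example SMTP settings — host, credentials, and sender are placeholders
export SIMPLEJAVAMAIL_SMTP_HOST=smtp.example.com
export SIMPLEJAVAMAIL_SMTP_PORT=587
export SIMPLEJAVAMAIL_SMTP_USERNAME=notifications@example.com
export SIMPLEJAVAMAIL_SMTP_PASSWORD='smtp-password'
export SIMPLEJAVAMAIL_TRANSPORTSTRATEGY=SMTP
export APPLICATION_MAIL_FROM_EMAIL=notifications@example.com
export APPLICATION_MAIL_FROM_NAME='My App Notifications'
```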
Email attachments configuration
Configure handling of email attachments:

| Environment Variable | Description | Default Value |
|---|---|---|
SPRING_HTTP_MULTIPART_MAXFILESIZE | Maximum file size for attachments | 15MB |
SPRING_HTTP_MULTIPART_MAXREQUESTSIZE | Maximum request size for multipart uploads | 15MB |
OTP configuration
Configure One-Time Password generation and validation:

| Environment Variable | Description | Default Value |
|---|---|---|
FLOWX_OTP_LENGTH | Number of characters in generated OTPs | 4 |
FLOWX_OTP_EXPIRETIMEINSECONDS | Expiry time for OTPs (seconds) | 6000 (100 minutes) |
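As an illustration only (the plugin's actual generator is internal to the service), producing a numeric code of FLOWX_OTP_LENGTH characters can be sketched like this:

```shell
# Hypothetical sketch of numeric OTP generation — NOT the plugin's
# implementation; shown only to illustrate what FLOWX_OTP_LENGTH controls
FLOWX_OTP_LENGTH=4
OTP=""
i=0
while [ "$i" -lt "$FLOWX_OTP_LENGTH" ]; do
  # append one random decimal digit per iteration (bash $RANDOM)
  OTP="${OTP}$(( RANDOM % 10 ))"
  i=$(( i + 1 ))
done
echo "$OTP"
```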
Logging configuration
Control logging levels for different components:

| Environment Variable | Description | Default Value |
|---|---|---|
LOGGING_LEVEL_ROOT | Root logging level | - |
LOGGING_LEVEL_APP | Application-specific log level | INFO |
LOGGING_LEVEL_MONGO_DRIVER | MongoDB driver log level | INFO |
LOGGING_LEVEL_THYMELEAF | Thymeleaf template engine log level | INFO |
LOGGING_LEVEL_FCM_CLIENT | Firebase Cloud Messaging client log level | OFF |
LOGGING_LEVEL_REDIS | Redis/Lettuce client log level | OFF |
CAS lib configuration
| Environment Variable | Description | Default Value |
|---|---|---|
FLOWX_SPICEDB_HOST | SpiceDB server hostname | spicedb |
FLOWX_SPICEDB_PORT | SpiceDB server port | 50051 |
FLOWX_SPICEDB_TOKEN | SpiceDB authentication token | - |
Usage notes
Topic naming convention
Topics follow a standardized naming convention:

- Example: ai.flowx.plugin.notification.trigger.generate.otp.v1
- Structure: {package}{environment}.{component}.{action}.{subject}.{version}
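The convention can be illustrated by assembling the default OTP generation topic from its parts. The split into component/action/subject below is illustrative; the segment values come from the naming defaults above:

```shell
# Assemble a topic name from the default naming segments
# (segment boundaries are illustrative)
PACKAGE="ai.flowx."
ENVIRONMENT=""                     # empty in the default setup
COMPONENT="plugin.notification"
ACTION="trigger.generate"
SUBJECT="otp"
VERSION=".v1"

TOPIC="${PACKAGE}${ENVIRONMENT}${COMPONENT}.${ACTION}.${SUBJECT}${VERSION}"
echo "$TOPIC"   # ai.flowx.plugin.notification.trigger.generate.otp.v1
```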
Consumer error handling
When KAFKA_CONSUMER_ERRORHANDLING_ENABLED is set to true:

- The application will retry processing failed messages according to KAFKA_CONSUMER_ERRORHANDLING_RETRIES
- Between retries, the application will wait for the duration specified by KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL

For example, if KAFKA_CONSUMER_ERRORHANDLING_RETRYINTERVAL is set to 5000 (5 seconds) and KAFKA_CONSUMER_ERRORHANDLING_RETRIES is set to 5, the consumer will make up to 5 retry attempts, waiting 5 seconds between each attempt.
Message size configuration
The KAFKA_MESSAGE_MAX_BYTES setting affects multiple Kafka properties:

- spring.kafka.producer.properties.message.max.bytes
- spring.kafka.producer.properties.max.request.size
- spring.kafka.consumer.properties.max.partition.fetch.bytes
OAuth authentication
When using the kafka-auth profile, the security protocol changes to SASL_PLAINTEXT and requires OAuth configuration via the KAFKA_OAUTH_* variables.
Troubleshooting
Common issues
Notifications not being sent
Symptoms: Notification requests are accepted but emails or messages are never delivered.

Solutions:

- Verify Kafka topics are correctly configured and the notification plugin is consuming from KAFKA_TOPIC_NOTIFICATION_INTERNAL_IN
- Check SMTP configuration (SIMPLEJAVAMAIL_SMTP_HOST, port, credentials) and ensure the SMTP server is reachable from the pod
- If using the Email Gateway for forwarding, verify KAFKA_TOPIC_NOTIFICATION_EXTERNAL_OUT is configured and the Email Gateway is running
- Check application logs for delivery errors by setting LOGGING_LEVEL_APP to DEBUG
OTP generation fails
Symptoms: OTP requests return errors or no OTP is generated.

Solutions:

- Verify OTP configuration values (FLOWX_OTP_LENGTH, FLOWX_OTP_EXPIRETIMEINSECONDS) are set correctly
- Check that Kafka topics KAFKA_TOPIC_OTP_GENERATE_IN and KAFKA_TOPIC_OTP_GENERATE_OUT are created and accessible
- Ensure MongoDB is reachable and the notification plugin database has write permissions
- Verify the Kafka consumer group SPRING_KAFKA_CONSUMER_GROUPID is not conflicting with another instance
Notification templates not rendering
Symptoms: Notifications are sent but contain raw template syntax or missing values.

Solutions:

- Verify substitution tags in the template match the keys provided in the notification request payload
- Check that the Thymeleaf engine is functioning by reviewing logs with LOGGING_LEVEL_THYMELEAF set to DEBUG
- Ensure the template exists in MongoDB and is in the correct format
- Verify the Redis cache is not serving stale templates: clear the cache or restart the plugin if templates were recently updated
Firebase push notifications not working
Symptoms: Push notifications are not delivered to mobile devices.

Solutions:

- Verify Firebase Cloud Messaging credentials are correctly configured
- Check that the target device has a valid FCM registration token
- Enable FCM client logging by setting LOGGING_LEVEL_FCM_CLIENT to DEBUG to inspect request/response details
- Ensure network policies allow outbound HTTPS traffic to Firebase servers (fcm.googleapis.com)
Related resources
Sending Notifications
Learn how to configure and send notifications from your processes
Email Gateway Setup
Configure the Email Gateway for inbound and outbound email processing
Redis Configuration
Complete Redis setup including Sentinel and Cluster modes