# Configuration
## Salesforce Connector Service
The environment variables used by the service (* = optional) are:

- `HTTP_PORT`: port, defaulted to 8080
- `KAFKA_BROKERS`: comma-separated list of broker addresses
- `KAFKA_CLIENT_ID`: client id of the producer that will write on the projection topics
- `KAFKA_SASL_MECHANISM`*: authentication mechanism; use `scram-sha-256` to enable SCRAM authentication, defaulted to `plain`
- `KAFKA_SASL_USERNAME`*: SCRAM username
- `KAFKA_SASL_PASSWORD`*: SCRAM password
- `SALESFORCE_HTTP_BASE_PATH`: base path for Salesforce access token generation (must not end with `/`)
- `SALESFORCE_CLIENT_ID`: client id of your Salesforce account
- `SALESFORCE_CLIENT_SECRET`: secret of your Salesforce account
- `SALESFORCE_USERNAME`: username of your Salesforce account
- `SALESFORCE_PASSWORD`: concatenation of the password and secret token of your Salesforce account
- `CRUD_URL`: URL of your CRUD service
- `KAFKA_DLQ_TOPIC`: Kafka topic used to implement the DLQ
- `CONFIG_FILE_PATH`: path to the yaml configuration file
- `PARALLELISM`*: size of the coroutine pool to which events are dispatched. It should be a prime number to reduce collisions and distribute the workload evenly. The default value is 97
- `CKP_UPSERT_PERIOD_MS`*: period in milliseconds specifying how frequently checkpoints are saved, default 5000
- `DEFAULT_REPLAY_POLICY`*: replay id policy used when a checkpoint is invalid for a topic connection. Use `REPLAY_FROM_TIP` to start processing from the current instant, or `REPLAY_FROM_EARLIER` to process all events received during the last 24 hours (this might put the service under heavy load, but may help avoid losing changes)
- `RESTART_TIMES`*: comma-separated list of `HH:mm` formatted times at which the service will restart
- `RESTART_TIME_ZONE`*: zone id of the time zone of reference for restart times, defaulted to `Etc/UTC`
- `INACTIVITY_TIMEOUT_SECONDS`*: duration of the timeout that resets a topic connection after a period of inactivity, i.e. a period in which no updates are received
The default value of `PARALLELISM` should be fine for most cases; however, the higher it gets, the more CPU resources will be required. Feel free to tune this parameter if needed, using only prime numbers.
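As an illustration, a minimal environment for the connector service might look like the sketch below. All values are placeholders (broker addresses, topic names, credentials, paths are not prescribed by the service); it is written as a YAML map of environment variables, as it could appear in a container manifest, and must be adapted to your own Kafka cluster and Salesforce org.

```yaml
# Illustrative values only – adapt to your own setup
HTTP_PORT: "8080"
KAFKA_BROKERS: "broker-1:9092,broker-2:9092"
KAFKA_CLIENT_ID: "salesforce-connector"
SALESFORCE_HTTP_BASE_PATH: "https://login.salesforce.com"   # must not end with /
SALESFORCE_CLIENT_ID: "<client-id>"
SALESFORCE_CLIENT_SECRET: "<client-secret>"
SALESFORCE_USERNAME: "integration.user@example.com"
SALESFORCE_PASSWORD: "<password concatenated with secret token>"
CRUD_URL: "http://crud-service"
CONFIG_FILE_PATH: "/home/app/config.yaml"
KAFKA_DLQ_TOPIC: "fd-p-sforce-dlq"
PARALLELISM: "97"                    # a prime number, as recommended above
CKP_UPSERT_PERIOD_MS: "5000"
DEFAULT_REPLAY_POLICY: "REPLAY_FROM_TIP"
RESTART_TIMES: "03:00,15:00"
RESTART_TIME_ZONE: "Etc/UTC"
```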
The yaml configuration file must have the following structure:
```yaml
- salesforceTableName: Table1
  salesforceTopic: /data/Table1ChangeEvent
  projectionTopic: fd-p-sforce-table1-json
  idPropertyName: CustomId
- salesforceTableName: Table2
  salesforceTopic: /topic/Table2
  projectionTopic: fd-p-sforce-table2-json
  idPropertyName: Id
- ...
```
- `salesforceTableName`: name of the table created on the Salesforce DB
- `salesforceTopic`: name of the StreamingAPI topic
- `projectionTopic`: name of the Kafka topic of the projection
- `idPropertyName`: optional name of the property representing the id of a record, defaulted to `"Id"`
Every item of the list corresponds to a new connection to a StreamingAPI topic. Make sure these data are correct; otherwise your service might get stuck in a restart loop, as its health checks will fail.
## Salesforce Connector DLQ Service
The environment variables used by the service (* = optional) are:

- `HTTP_PORT`: port, defaulted to 8080
- `KAFKA_BROKERS`: comma-separated list of broker addresses
- `KAFKA_CLIENT_ID`: client id of the producer that will write on the projection topics
- `KAFKA_GROUP_ID`: group id of the DLQ consumer
- `KAFKA_SASL_MECHANISM`*: authentication mechanism; use `scram-sha-256` to enable SCRAM authentication, defaulted to `plain`
- `KAFKA_SASL_USERNAME`*: SCRAM username
- `KAFKA_SASL_PASSWORD`*: SCRAM password
- `SALESFORCE_HTTP_BASE_PATH`: base path for Salesforce access token generation (must not end with `/`)
- `SALESFORCE_CLIENT_ID`: client id of your Salesforce account
- `SALESFORCE_CLIENT_SECRET`: secret of your Salesforce account
- `SALESFORCE_USERNAME`: username of your Salesforce account
- `SALESFORCE_PASSWORD`: concatenation of the password and secret token of your Salesforce account
- `KAFKA_DLQ_TOPIC`: DLQ Kafka topic
- `FIELD_MAPPING_FILE_PATH`*: path to the yaml configuration file
- `SALESFORCE_ACCESS_TOKEN_RETRY_ATTEMPTS`*: number of retry attempts that will be made before declaring the service unhealthy, defaulted to 5
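The Kafka and Salesforce variables follow the same pattern as for the connector service above; the sketch below (placeholder values only) highlights the variables specific to this service.

```yaml
# Illustrative values only – adapt to your own setup
HTTP_PORT: "8080"
KAFKA_BROKERS: "broker-1:9092,broker-2:9092"
KAFKA_CLIENT_ID: "salesforce-connector-dlq"
KAFKA_GROUP_ID: "salesforce-connector-dlq-consumer"
KAFKA_DLQ_TOPIC: "fd-p-sforce-dlq"
FIELD_MAPPING_FILE_PATH: "/home/app/field-mapping.yaml"
SALESFORCE_ACCESS_TOKEN_RETRY_ATTEMPTS: "5"
```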
The yaml configuration file must have the following structure:
```yaml
Table1:
  Id: CustomId
  Field: CustomField
Table2:
  Field: CustomField
...
```
In the example above, every `"Id"` field of `Table1` records is renamed to `"CustomId"` (and every `"Field"` to `"CustomField"`) before the records are sent to the projection.
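To make the renaming concrete, the sketch below (illustrative field values, written in YAML for readability) shows how the fields of a `Table1` record would be transformed by the mapping above before the record reaches the projection.

```yaml
# Table1 record as read from the DLQ (illustrative values)
before:
  Id: "0015g00000XxYyZz"
  Field: "some value"

# Same record after applying the Table1 mapping
after:
  CustomId: "0015g00000XxYyZz"
  CustomField: "some value"
```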