...
AWS Secrets Manager needs to be configured with the following secret key/value pairs. The suggested name for the secret is ‘bplm-credentials’:
storage-access-key
- Optional, AWS Access Key used for accessing Amazon DynamoDB and Amazon SQS by the telemetry agent
storage-secret-key
- Optional, AWS Secret Key used for accessing Amazon DynamoDB and Amazon SQS by the telemetry agent
Note: DynamoDB is the telemetry data store. Access from the LHM services or the telemetry agents in Databricks workspaces can be enabled either with the access key/secret key pair above or via IAM Roles/Credentials and Instance Profiles, in which case the key pair is optional.
service-account-username
- Databricks service account username
service-account-password
- Databricks service account password
Note: the Databricks service account is required for access to the Billable Usage Logs of the Databricks Accounts API. These logs fuel the consumption reports in the tool at the Databricks account, workspace, job, job run, task run, nested notebook, cluster, notebook, DLT pipeline, DLT update, etc. level, and they help prioritize optimization efforts. The service account is a Databricks account user with a username and password, since username/password is the only authentication supported by the Databricks Accounts API.
mssql-password
- SQL Login password for the SQL Database
application-encryption-secret
- Encryption key used to encrypt the PATs (Personal Access Tokens) and the Databricks Accounts credentials (billable usage logs) stored in the LHM SQL database
msft-provider-auth-secret
- Optional, client secret value from the Azure app registration
Note: the service principal client secret is needed if you want LHM configured with Azure Active Directory for login and SSO. If you choose to use Databricks authentication only, it is not needed and the secret can be omitted.
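For reference, below is a minimal sketch of provisioning this secret with boto3. The secret name ‘bplm-credentials’ follows the suggestion above; the region and all values are placeholders you would replace with your own, and the optional keys can simply be left out of the payload if not needed.

```python
import json
import boto3

# Sketch: create the suggested 'bplm-credentials' secret in AWS Secrets Manager.
# All values are placeholders; optional keys (storage-access-key,
# storage-secret-key, msft-provider-auth-secret) may be omitted.
secretsmanager = boto3.client("secretsmanager", region_name="us-east-1")  # assumed region

secret_payload = {
    # Optional: only needed when DynamoDB/SQS access is not granted via
    # IAM Roles/Credentials and Instance Profiles.
    "storage-access-key": "<aws-access-key-id>",
    "storage-secret-key": "<aws-secret-access-key>",
    # Databricks Accounts API credentials (billable usage logs).
    "service-account-username": "<databricks-account-username>",
    "service-account-password": "<databricks-account-password>",
    # SQL Login password for the LHM SQL Database.
    "mssql-password": "<sql-login-password>",
    # Key used to encrypt PATs and Databricks Accounts credentials at rest.
    "application-encryption-secret": "<random-encryption-key>",
    # Optional: Azure app registration client secret (AAD login/SSO only).
    "msft-provider-auth-secret": "<azure-app-client-secret>",
}

secretsmanager.create_secret(
    Name="bplm-credentials",
    Description="Credentials for the LHM deployment",
    SecretString=json.dumps(secret_payload),
)
```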
...