Step 1) Required Resources

Lakehouse Monitor requires the following resources to already be created:

...

  • AWS Secrets Manager needs to be configured with the following key/value pairs in a single secret. The suggested name for the secret is ‘bplm-credentials’ (see the CLI sketch after this list):

    • storage-access-key - Optional. AWS access key used by the telemetry agent to access Amazon DynamoDB and Amazon SQS

    • storage-secret-key - Optional. AWS secret key used by the telemetry agent to access Amazon DynamoDB and Amazon SQS

      • Note: DynamoDB is the telemetry data store. Access from the LHM services or from the telemetry agents in Databricks workspaces can be enabled either with the access key/secret key pair above or via IAM roles/credentials and instance profiles, in which case the key pair is optional.

    • service-account-username - Databricks service account username

    • service-account-password - Databricks service account password

      • Note: the Databricks service account is required for access to the Billable Usage Logs endpoint of the Databricks Accounts API. These logs fuel all the consumption reports in the tool (at the Databricks account, workspace, job, job run, task run, nested notebook, cluster, notebook, DLT pipeline, DLT update, etc. level) and help prioritize optimization efforts.

      • This must be a Databricks account user with a username and password, since the only authentication supported by the Databricks Accounts API is username/password (see the request sketch after this list).

    • mssql-password - SQL Login password for the SQL Database

    • application-encryption-secret - encryption key for storing PATs (Personal Access Tokens) and the Databricks Accounts credentials (billable usage logs) in the LHM SQL database

    • msft-provider-auth-secret - Optional, client secret value from the Azure app registration

      • Note: the service principal client secret is needed in case you want LHM configured with Azure Active Directory for login and SSO. If you choose to use Databricks authentication only, this is not needed and the secret can be omitted.
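
The secret can be created up front with the AWS CLI. The sketch below is only illustrative: it assumes the suggested secret name ‘bplm-credentials’, the us-west-1 region, and placeholder values, and the openssl command is just one way to generate the application encryption secret. Omit the optional keys you do not need (for example the storage keys when using instance profiles, or msft-provider-auth-secret when not using Azure Active Directory).

Code Block
# Generate a random application encryption secret (any strong random string works)
APP_ENCRYPTION_SECRET=$(openssl rand -base64 32)

# Create the secret holding the key/value pairs listed above (placeholder values shown)
aws secretsmanager create-secret \
  --name bplm-credentials \
  --region us-west-1 \
  --secret-string "$(cat <<EOF
{
  "storage-access-key": "<aws-access-key>",
  "storage-secret-key": "<aws-secret-key>",
  "service-account-username": "<databricks-account-username>",
  "service-account-password": "<databricks-account-password>",
  "mssql-password": "<sql-login-password>",
  "application-encryption-secret": "$APP_ENCRYPTION_SECRET",
  "msft-provider-auth-secret": "<azure-app-registration-client-secret>"
}
EOF
)"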

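For reference, the billable usage logs come from the usage download endpoint of the Databricks Accounts API, which is why the account username and password are needed. A minimal sketch of the kind of request involved (the month range and personal_data flag are illustrative):

Code Block
# Download billable usage logs (CSV) for the Databricks account, using basic auth with
# the service account credentials stored in the secret above
curl --user "<service-account-username>:<service-account-password>" \
  "https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/usage/download?start_month=2024-01&end_month=2024-03&personal_data=false"
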
...

Code Block
  .env
  docker-compose.yml
  setup.sh
  start.sh
  
  1. Before you start setup, you need to fill out the .env file with the required information. Open the file in your editor of choice and fill in the values.

    1. Please find a brief explanation of the .env values below

...

Code Block
LOG_LEVEL=info
LOG_LEVEL_APP=info
LOG_LEVEL_HTTP_HEADERS=error

APPSERVICE_URL=<eg:https://demo.aws-bplm.com>

SQL_DATABASE=master
SQL_SERVER_HOST=<eg:192.168.4.10 or endpoint DNS name>
SQL_USER=<eg:sql_admin>

STORAGE_AWS_REGION=<eg:us-west-1>
STORAGE_AWS_TABLE_PREFIX=bplm

AWS_SECRETS_MANAGER_ENABLED=true
AWS_SECRETS_MANAGER_REGION=<eg:us-west-1>
BPLM_SECRET_NAME=<name of the secrets manager secret>
SERVER_SSL_ENABLED=true
SERVER_SSL_KEY-STORE=/keystore/bplm.p12
SERVER_SSL_KEY-STORE-PASSWORD=
SERVER_SSL_KEY-STORE-TYPE=PKCS12
SERVER_SSL_KEY-ALIAS=bplm
SERVER_SSL_KEY-PASSWORD=

SERVICE_PRINCIPAL_CLIENTID=<eg: 925accb1-8506-4ec4-a90b-b1b0e6d8a5eb>
SERVICE_PRINCIPAL_TENANTID=<eg: 03786a4c-412b-4fac-a981-b4c5bcbc55b7>
#SERVICE_PRINCIPAL_CLIENTSECRET=${msft-provider-auth-secret}

DATABRICKS_ACCOUNT_ID=<eg: 56293882-89e7-4ecd-a5f7-cb61e68a54f0>
DATABRICKS_SERVICE_PRINCIPAL=<eg: 48de6ad6-ff14-403d-b842-d4ce5da4662f>
ACTIVE-DIRECTORY_HOST=https://login.microsoftonline.com
ACTIVE-DIRECTORY_TOKEN-ENDPOINT=/oauth2/v2.0/token
ACTIVE-DIRECTORY_AUTHORIZE-ENDPOINT=/oauth2/v2.0/authorize
ACTIVE-DIRECTORY_JWK-ENDPOINT=/discovery/keys
ACTIVE-DIRECTORY_USER-INFO-URI=https://graph.microsoft.com/oidc/userinfo

CLOUD_PROVIDER=AWS
AUTHENTICATION_PROVIDER=databricks-account,active-directory
SPRING_PROFILES_ACTIVE=production-aws
SERVER_SERVLET_SESSION_PERSISTENT=true
SERVER_SERVLET_SESSION_STORE_DIR=/home/ubuntu/spring-session/session
ADMIN_APP_ROLE=bplm-admin
METRIC_PROCESSOR_ENABLED=true
STORAGE_THROUGH_IAM_CREDENTIALS=true
#metric.queueMonitoring.compactionTimeout=PT25M
APPLICATION_NOTIFICATION_JOBNOTIFICATIONQUEUENAME=<prefix for sqs names>

Note: due to the Docker version provided by CentOS, the SERVICE_PRINCIPAL_CLIENTSECRET cannot be pulled from Secrets Manager.
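
If Azure Active Directory authentication is used, one workaround is to read the value out of the secret and set it directly in the .env file. A minimal sketch, assuming the suggested secret name ‘bplm-credentials’ and that jq is available on the host:

Code Block
# Fetch the client secret stored under msft-provider-auth-secret and append it to .env
CLIENT_SECRET=$(aws secretsmanager get-secret-value \
  --secret-id bplm-credentials \
  --region us-west-1 \
  --query SecretString \
  --output text | jq -r '.["msft-provider-auth-secret"]')

echo "SERVICE_PRINCIPAL_CLIENTSECRET=${CLIENT_SECRET}" >> .env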

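If SSL is enabled, the PKCS12 keystore referenced by SERVER_SSL_KEY-STORE above also has to exist on the host. A minimal sketch of building it with openssl, assuming you already have a certificate and private key in PEM format (cert.pem and key.pem are placeholder names):

Code Block
# Build the PKCS12 keystore referenced by SERVER_SSL_KEY-STORE, using the alias "bplm";
# the chosen password goes into SERVER_SSL_KEY-STORE-PASSWORD in the .env file
openssl pkcs12 -export \
  -in cert.pem \
  -inkey key.pem \
  -name bplm \
  -out bplm.p12 \
  -passout pass:<keystore-password>
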
...