...

The LHO Agent has access to the configuration of a Databricks entity (e.g. a Workflow or Job), and that configuration identifies the Secret Scope where credentials are stored. Any entity has access to the Databricks Secrets service.

...

What permissions are required for the telemetry agent to read/write data from/to Azure Tables? (7)

The LHO Agent stores telemetry data in the Azure Tables of the configured Azure Storage account and sends events (e.g. Spark job completion events) to an Azure Queue configured on the same Storage Account that stores the telemetry data.

The LHO App dequeues events from this Queue and triggers the analysis when the Databricks job or DLT update is complete.
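The agent-to-app handoff above can be sketched as a Base64-encoded JSON message, the common text encoding for Azure Queue messages. This is an illustrative sketch only: the field names (`job_id`, `status`, etc.) are assumptions, not the actual LHO event schema.

```python
import base64
import json


def encode_event(event: dict) -> str:
    """Agent side: serialize a job-completion event as JSON, then
    Base64-encode it the way the Azure Queue text policy would."""
    return base64.b64encode(json.dumps(event).encode("utf-8")).decode("ascii")


def decode_event(message_text: str) -> dict:
    """App side: recover the event dict from a dequeued message body."""
    return json.loads(base64.b64decode(message_text).decode("utf-8"))
```

A round trip preserves the event, e.g. `decode_event(encode_event({"job_id": 123, "status": "completed"}))` returns the original dict.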

Access to cloud storage via Access Key can be disabled and the LHO App configured to use a Service Principal instead. The LHO Service Principal requires the Storage Queue Data Contributor and Storage Table Data Contributor roles at the level of the Storage Account used by the LHO Agent. This allows the LHO App to read data from the Storage Account's Queue and the LHO Agent to write data to this queue. (8)

The Storage Queue Data Contributor and Storage Table Data Contributor roles must be granted manually to the LHO Service Principal on the Storage Account used by the LHO Agent.
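"At the Storage Account level" means the role assignment's scope is the storage account's ARM resource ID. A minimal sketch of composing that scope string (subscription, resource group, and account names are placeholders, not actual LHO resources):

```python
def storage_account_scope(sub_id: str, resource_group: str, account: str) -> str:
    """Build the ARM resource ID of a storage account, used as the
    scope when granting data-plane roles at the account level."""
    return (
        f"/subscriptions/{sub_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
    )
```

The resulting string is what an administrator would pass as the `--scope` of `az role assignment create` when granting each of the two roles to the Service Principal.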

The telemetry collector agent uses Databricks Secrets to retrieve the client secret of the Service Principal used to access cloud storage.
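On Databricks, such a lookup goes through `dbutils.secrets.get`. A hedged sketch of the pattern, with a fallback to an environment variable so it also runs outside a Databricks notebook; the scope/key names and the env-var convention are assumptions for illustration:

```python
import os


def get_client_secret(scope: str, key: str) -> str:
    """Fetch the Service Principal client secret from the configured
    Secret Scope; fall back to an environment variable when not
    running on Databricks (e.g. in local tests)."""
    try:
        # dbutils is injected into Databricks notebooks/jobs at runtime.
        return dbutils.secrets.get(scope=scope, key=key)  # noqa: F821
    except NameError:
        # Hypothetical local convention: LHO_SCOPE_SP_CLIENT_SECRET etc.
        return os.environ[f"{scope}_{key}".upper().replace("-", "_")]
```

For example, `get_client_secret("lho-scope", "sp-client-secret")` reads the secret on Databricks, or `LHO_SCOPE_SP_CLIENT_SECRET` locally.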

How does the telemetry agent communicate with the LHO App for real-time telemetry data analysis? (8)

The LHO Agent stores telemetry data in the cloud storage and sends events (e.g. job finished events) to an Azure Queue configured in the same Storage Account used for saving the telemetry data.

The LHO App dequeues events from this Queue and triggers the analysis when the Databricks job or DLT update is complete.
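The dequeue-and-trigger loop can be sketched as below. The function is duck-typed against `azure.storage.queue.QueueClient` (its `receive_messages()` / `delete_message()` methods) so it can be exercised with a fake client; the event field names and the `on_complete` callback are assumptions, not the actual LHO implementation.

```python
import json
from typing import Callable


def drain_events(queue_client, on_complete: Callable[[dict], None]) -> int:
    """Dequeue pending events and trigger analysis for completed runs.

    `queue_client` is anything shaped like azure.storage.queue.QueueClient:
    receive_messages() yields messages with a JSON `.content`, and
    delete_message(msg) removes a handled message from the queue.
    """
    handled = 0
    for msg in queue_client.receive_messages():
        event = json.loads(msg.content)
        if event.get("status") == "completed":
            on_complete(event)  # e.g. kick off the telemetry analysis
        queue_client.delete_message(msg)  # ack so it is not redelivered
        handled += 1
    return handled
```

Deleting the message only after handling it is the usual queue-consumer pattern: a crash mid-loop leaves unacknowledged messages visible again for a retry.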

The LHO Service Principal requires the Storage Queue Data Contributor role at the queue level.

...

📍 Public Workspaces

How do I expose Subscriptions and Workspaces to users from other AD tenants?

...