Configure Public Subscriptions and Workspaces

What Subscriptions and Workspaces Does a Signed-in User See?

A signed-in user sees all the Subscriptions and Workspaces that the user can access in the Azure Portal and in Databricks, plus the workspaces configured in the publicSubscriptionMetadata.csv file.

If a signed-in Lakehouse Optimizer (LHO) user does not have access to any Azure Subscription or Databricks Workspace, but the LHO admin still wants to grant read rights to that particular user (or group), the admin can use the publicSubscriptionMetadata.csv LHO configuration file to make subscriptions and workspaces accessible in LHO.

As an application-wide setting, the publicSubscriptionMetadata.csv file provides read-only access to Subscriptions and Workspaces for any user who is able to log in to the LHO app.

Public Subscriptions and Workspaces

The LHO configuration file publicSubscriptionMetadata.csv exposes subscriptions and workspaces to all users of the LHO app, regardless of the access rights set for those users in Azure or Databricks.

This file must be created manually in the storage account, in a directory named after the subscription ID, as a file named publicSubscriptionMetadata.csv. The file is read only by the LHO app.
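Putting the pieces together, the expected layout in the storage account looks like the sketch below; the two CSV file names are fixed, while the container path and subscription-id placeholder stand in for your own values:

  <cloud-storage-path>/
    publicSubscriptions.csv                    (one row per public subscription)
    <subscription-id>/
      publicSubscriptionMetadata.csv           (one row per public workspace)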

This file allows signed-in users to list workspaces and see reports in LHO even if those users do not have rights configured on that tenant.

For example, a user signs in to LHO with an external AD account registered with a tenant from another Microsoft AD subscription; the signed-in user therefore lacks sufficient rights to read workspaces from the subscription that manages the Databricks workspaces. To allow this user to read the workspaces that the admin has deemed “public”, LHO uses publicSubscriptionMetadata.csv to expose those workspaces to the external user.

Manual Public Subscriptions and Workspaces Configuration

Step 1 – Define Public Subscriptions manually in cloud storage

  • use the cloud storage path from the LHO Settings page

  • in that blob storage location, define a file named publicSubscriptions.csv

  • one row for each subscription

  • example:

displayName,subscriptionId,tenantId
Blueprint Data Engine,a63c1e51-40ae-4a34-b230-bf80e132c05c,12e2dd65-5024-44c2-83b5-3ca21c04ef0e
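LHO only reads the finished file; how you produce it is up to you. As a minimal sketch, the file could be generated locally with Python's csv module and then uploaded to the blob storage location above (the subscription values are the example values from this guide):

```python
import csv

# One dict per subscription to expose; values below are the example
# values from this guide, not real ones.
subscriptions = [
    {
        "displayName": "Blueprint Data Engine",
        "subscriptionId": "a63c1e51-40ae-4a34-b230-bf80e132c05c",
        "tenantId": "12e2dd65-5024-44c2-83b5-3ca21c04ef0e",
    },
]

# Write the header row, then one row per subscription, as Step 1 requires.
with open("publicSubscriptions.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["displayName", "subscriptionId", "tenantId"]
    )
    writer.writeheader()
    writer.writerows(subscriptions)
```

The generated publicSubscriptions.csv must still be uploaded manually to the cloud storage path from the LHO Settings page.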

Step 2 – Create a folder with subscription-id as name in cloud storage

This folder contains two files:

  • (1) publicSubscriptionMetadata.csv

    • create the file manually

    • add metadata for workspaces that you want to consider “public”

    • example:

    displayName,isPremium,workspaceHost,workspaceId,workspaceResourceGroupId,workspaceResourceId
    ca-adb-test-workspace,false,adb-511420607229897.17.azuredatabricks.net,511420607229897,/subscriptions/a63c1e51-40ae-4a34-b230-bf80e132c05c/resourceGroups/databricks-rg-ca-adb-test-workspace-loj4oc72jjfum,/subscriptions/a63c1e51-40ae-4a34-b230-bf80e132c05c/resourceGroups/cost-analyzer-resources/providers/Microsoft.Databricks/workspaces/ca-adb-test-workspace
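As with Step 1, the metadata file can be produced however you like; a minimal local sketch using Python's csv module follows, with the folder named after the subscription ID as Step 2 requires (all values are the example values from this guide):

```python
import csv
import os

# Folder named after the subscription ID, as required by Step 2.
subscription_id = "a63c1e51-40ae-4a34-b230-bf80e132c05c"
os.makedirs(subscription_id, exist_ok=True)

fields = [
    "displayName", "isPremium", "workspaceHost", "workspaceId",
    "workspaceResourceGroupId", "workspaceResourceId",
]

# One dict per workspace to mark "public"; example values from this guide.
workspaces = [{
    "displayName": "ca-adb-test-workspace",
    "isPremium": "false",
    "workspaceHost": "adb-511420607229897.17.azuredatabricks.net",
    "workspaceId": "511420607229897",
    "workspaceResourceGroupId": (
        "/subscriptions/a63c1e51-40ae-4a34-b230-bf80e132c05c"
        "/resourceGroups/databricks-rg-ca-adb-test-workspace-loj4oc72jjfum"
    ),
    "workspaceResourceId": (
        "/subscriptions/a63c1e51-40ae-4a34-b230-bf80e132c05c"
        "/resourceGroups/cost-analyzer-resources"
        "/providers/Microsoft.Databricks/workspaces/ca-adb-test-workspace"
    ),
}]

# Write the header row, then one row per public workspace.
path = os.path.join(subscription_id, "publicSubscriptionMetadata.csv")
with open(path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(workspaces)
```

The resulting folder and file must still be uploaded manually to the storage account so that LHO can read them.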