Product Features (Azure)

Blueprint Lakehouse Optimizer for Databricks features real-time and predictive analysis of what your organization has spent by Subscription, Workspace, Resource, and User, allocating usage to specific business functions, reports, and applications. This gives you full visibility into your telemetry and cost data and lets you monitor your cloud spend. The complete solution includes starter kits with Power BI dashboard templates for custom reporting and Databricks ingestion notebooks for advanced cost and performance analysis. Continue reading to learn more about how Lakehouse Optimizer can help your organization.



Comprehensive Insights

Lakehouse Optimizer offers a comprehensive overview of where and when costs are incurred. Users can easily customize data views by date range, cumulative or granular costs, Workspaces, and Azure versus Databricks costs. Monthly and daily changes are surfaced in widgets so you can compare current costs against previous spending.

Homepage with Executive Summaries

 

  • Provides a Budget Estimator widget to keep costs aligned with planned expenses.

  • Shows a breakdown of granular costs between the cloud provider and Databricks (VMs, VM disks, network, storage, pools) for individual workspaces and subscriptions; a query sketch follows this list.
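
The kind of query behind this breakdown can be sketched in a few lines of PySpark. The table and column names below are illustrative assumptions (loosely following the Azure usage-details schema), not the Lakehouse Optimizer schema:

    # Hypothetical table of ingested Azure usage details; column names are
    # assumptions for illustration only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    usage = spark.table("lho_demo.azure_usage_details")

    breakdown = (
        usage
        # DBU charges come from the Microsoft.Databricks service; everything
        # else (VMs, disks, network, storage) is underlying cloud cost.
        .withColumn(
            "cost_bucket",
            F.when(F.col("consumedService") == "Microsoft.Databricks", F.lit("Databricks (DBUs)"))
             .otherwise(F.lit("Cloud provider (VMs, disks, network, storage)")),
        )
        .groupBy("subscriptionId", "resourceGroup", "cost_bucket")
        .agg(F.round(F.sum("costInBillingCurrency"), 2).alias("total_cost"))
        .orderBy("subscriptionId", "resourceGroup", "cost_bucket")
    )

    breakdown.show(truncate=False)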

Detailed Cost and Performance Reporting

 

  • Cost and telemetry metrics are backed by a database to facilitate lightning-fast analytics and reporting.

  • Tracks past runs of jobs, including deleted jobs and background processes, for true auditability of the environment; a sketch of pulling this history follows.
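
As a rough illustration of where that history comes from (not the product's implementation), a collector can page through the Databricks Jobs API and persist each run record to its own database, so history stays queryable even after job definitions are deleted in the workspace. Host and token below are placeholders, and pagination is omitted:

    # Minimal sketch: pull recent job runs from the Databricks Jobs API.
    import requests

    DATABRICKS_HOST = "https://<workspace-host>.azuredatabricks.net"  # placeholder
    TOKEN = "<personal-access-token>"                                 # placeholder

    resp = requests.get(
        f"{DATABRICKS_HOST}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"limit": 25, "expand_tasks": "true"},
        timeout=30,
    )
    resp.raise_for_status()

    for run in resp.json().get("runs", []):
        state = run.get("state", {})
        print(run["run_id"], run.get("run_name"), state.get("result_state"), run.get("start_time"))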

All-Purpose Clusters

  • Provides extensive reporting on all Jobs and Notebooks that are run on All-Purpose Clusters.

User Activity Reporting

Provides extensive reporting with a focus on the activity of users and usage patterns of computing resources.

  • Presents per-cluster statistics on user activity

  • Shows the idleness distribution per user (one way to derive this from command telemetry is sketched after this list)

  • Shows the activity distribution per user within a cluster
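
One way such an idleness distribution can be derived is from command-level telemetry: the gap between consecutive commands of a user on a cluster is idle time. The table and schema below are assumptions for illustration, not the product's internals:

    # Hypothetical telemetry table: one row per executed command, with
    # user_name, cluster_id, start_time, and end_time (timestamps).
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.getOrCreate()
    commands = spark.table("lho_demo.command_telemetry")

    w = Window.partitionBy("user_name", "cluster_id").orderBy("start_time")

    idleness = (
        commands
        .withColumn("prev_end", F.lag("end_time").over(w))
        # Idle time = gap between the end of the previous command and the
        # start of the next one (clamped at zero for overlapping commands).
        .withColumn(
            "idle_seconds",
            F.greatest(
                F.lit(0),
                F.col("start_time").cast("long") - F.col("prev_end").cast("long"),
            ),
        )
        .groupBy("user_name", "cluster_id")
        .agg(
            F.sum("idle_seconds").alias("idle_seconds"),
            F.sum(F.col("end_time").cast("long") - F.col("start_time").cast("long")).alias("busy_seconds"),
        )
        .withColumn(
            "idle_ratio",
            F.round(F.col("idle_seconds") / (F.col("idle_seconds") + F.col("busy_seconds")), 3),
        )
    )

    idleness.show(truncate=False)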

Delta Live Tables Monitoring

Enables monitoring and reporting with cost and telemetry information for your Delta Live Tables (DLT) pipelines.
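
Databricks exposes pipeline telemetry through the event_log() table-valued function, which is the natural raw source for this kind of report. A hedged sketch that runs in a Databricks notebook (the pipeline ID is a placeholder, and joining these events with cost data is not shown):

    # Read a DLT pipeline's event log and summarize events by type.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    pipeline_id = "<dlt-pipeline-id>"  # placeholder

    events = spark.sql(f"SELECT * FROM event_log('{pipeline_id}')")

    (events
     .groupBy("event_type")
     .agg(F.count("*").alias("events"),
          F.max("timestamp").alias("latest_event"))
     .orderBy(F.desc("events"))
     .show(truncate=False))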

 


Intelligent Recommendations

https://youtu.be/2zlTsy1jwvI?si=eW3dt_l9VSP49lSE

Non-Optimal Configuration Warnings

  • Warns about possible misuse of all-purpose clusters when job clusters could be used instead (a simplified version of this check is sketched below)

    • See the Clusters configuration panel when enabling or disabling the monitoring agent
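
A simplified version of this check, shown here as an assumption-laden sketch rather than the product's detection logic, flags job runs whose tasks ran on an existing all-purpose cluster (host and token are placeholders):

    # Flag recent job runs that used an existing all-purpose cluster instead of
    # an ephemeral job cluster. Running jobs on all-purpose compute is billed at
    # a higher DBU rate, which is why the warning exists.
    import requests

    DATABRICKS_HOST = "https://<workspace-host>.azuredatabricks.net"  # placeholder
    TOKEN = "<personal-access-token>"                                 # placeholder

    resp = requests.get(
        f"{DATABRICKS_HOST}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"limit": 25, "expand_tasks": "true"},
        timeout=30,
    )
    resp.raise_for_status()

    for run in resp.json().get("runs", []):
        for task in run.get("tasks", []):
            if task.get("existing_cluster_id"):
                print(f"Run {run['run_id']}: task '{task.get('task_key')}' ran on "
                      f"all-purpose cluster {task['existing_cluster_id']}")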


Advanced Cost Analysis

Lakehouse Optimizer analyzes what you actually pay rather than list prices. It supports both pay-as-you-go and prepaid (reservation) pricing, including Databricks Commit Unit (DBCU) reservations on Azure, and it loads consumption data from Azure or AWS through Databricks notebooks for detailed cost reporting.

Azure Costs

  • Supports Databricks DBCU reservations

    • Databricks Commit Unit (DBCU) reservations

      • A company purchases DBUs in advance (prepaid) and receives a discount on the prepaid amount

    • The cost-per-job widget on the reporting page supports discounted prices

      • Uses the amortized cost provided by Azure for scenarios with reserved (prepaid) DBCUs (a sketch of an amortized-cost query follows this list)

  • Supports both pay-as-you-go scenarios and prepaid resources (reservations)

  • Supports consumption data loading for Azure via a Databricks notebook using the bpcs-consumption module

    • Loads consumption data and stores it in Azure Blob Storage
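
The idea behind amortized-cost support can be sketched with the Azure Consumption usage-details API, whose metric parameter spreads reservation purchases across the usage that consumed them. The subscription ID is a placeholder and this is not the bpcs-consumption module's implementation; property names follow the legacy usage-details schema and may vary by account type:

    # Query Azure usage details with the amortized-cost metric so prepaid
    # (reserved) charges are attributed to the periods that consumed them.
    import requests
    from azure.identity import DefaultAzureCredential

    subscription_id = "<subscription-id>"  # placeholder
    token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

    resp = requests.get(
        f"https://management.azure.com/subscriptions/{subscription_id}"
        "/providers/Microsoft.Consumption/usageDetails",
        headers={"Authorization": f"Bearer {token}"},
        params={"api-version": "2021-10-01", "metric": "amortizedcost"},
        timeout=60,
    )
    resp.raise_for_status()

    for record in resp.json().get("value", []):
        props = record.get("properties", {})
        print(props.get("date"), props.get("consumedService"), props.get("cost"))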

AWS Costs

  • Supports consumption data loading for AWS via a Databricks notebook using the bpcs-consumption module

    • Loads consumption data and stores it in S3


Databricks Pools Support

Pool costs are tracked as part of the granular cost breakdown for workspaces and subscriptions (see Homepage with Executive Summaries above).


Security

  • Active Directory integration

  • Supports Azure managed identity

  • Uses private endpoints across VMs to access cost and telemetry data saved in Azure Storage (see the sketch below)
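
In that setup, application code stays free of secrets: DefaultAzureCredential resolves to the VM's managed identity at runtime, and the private endpoint only changes how the storage hostname resolves, so the client code is unchanged. A minimal sketch with placeholder account and container names:

    # List cost/telemetry blobs using the VM's managed identity; no keys or
    # connection strings are stored in configuration.
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    account_url = "https://<storage-account>.blob.core.windows.net"  # placeholder
    service = BlobServiceClient(account_url=account_url, credential=DefaultAzureCredential())

    container = service.get_container_client("cost-telemetry")  # placeholder container
    for blob in container.list_blobs(name_starts_with="consumption/"):
        print(blob.name, blob.size)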


Modular Deployment

  • Supports Databricks on AWS

  • Supports Databricks on Azure

  • Can be run in a VM with Active Directory integration

  • Distributed design

  • Large amounts of data can be analyzed by distributing the workload across the deployed modules

