
Billable usage log schema (legacy)

important

This documentation has been retired and might not be updated. The products, services, or technologies mentioned in this content are no longer supported. To view current admin documentation, see Manage your Databricks account.

note

This article includes details about the legacy usage logs, which do not record usage for all products. Databricks recommends using the billable usage system table to access and query complete usage data.
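The system table can be queried directly from a notebook. The following is a minimal sketch; it assumes system tables are enabled in your account and uses the documented usage_date, sku_name, and usage_quantity columns of system.billing.usage:

Python
# Aggregate complete usage per day and SKU from the billable usage system table.
spark.sql("""
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, sku_name
    ORDER BY usage_date
""").show()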

This article explains how to read and analyze the usage log data downloaded from the account console.

You can view and download billable usage directly in the account console, or retrieve it programmatically by using the Account API.
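For programmatic retrieval, the legacy Account API exposes a usage download endpoint. The sketch below assumes an AWS account hosted at accounts.cloud.databricks.com and an already-acquired token; the account ID, token, and month range are placeholders, and the authentication method varies by account type:

Python
import requests

account_id = "<account-id>"  # placeholder
token = "<access-token>"     # placeholder

# Download billable usage as CSV for a range of months.
resp = requests.get(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/usage/download",
    headers={"Authorization": f"Bearer {token}"},
    params={"start_month": "2023-01", "end_month": "2023-03", "personal_data": "false"},
)
resp.raise_for_status()

with open("usage_data.csv", "w") as f:
    f.write(resp.text)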

CSV file schema

| Column | Type | Description | Example |
| --- | --- | --- | --- |
| workspaceId | string | ID of the workspace. | 1234567890123456 |
| timestamp | datetime | End of the hour for the provided usage. | 2019-02-22T09:59:59.999Z |
| clusterId | string | ID of the cluster (for a cluster) or of the warehouse (for a SQL warehouse). | Cluster: 0406-020048-brawl507; SQL warehouse: 8e00f0c8b392983e |
| clusterName | string | User-provided name for the cluster/warehouse. | Shared Autoscaling |
| clusterNodeType | string | Instance type of the cluster/warehouse. | Cluster: m4.16xlarge; SQL warehouse: db.xlarge |
| clusterOwnerUserId | string | ID of the user who created the cluster/warehouse. | 12345678901234 |
| clusterCustomTags | string ("-escaped JSON) | Custom tags associated with the cluster/warehouse during this hour. | "{""dept"":""mktg"",""op_phase"":""dev""}" |
| sku | string | Billing SKU. See the Billing SKUs list for values. | STANDARD_ALL_PURPOSE_COMPUTE |
| dbus | double | Number of DBUs used by the user during this hour. | 1.2345 |
| machineHours | double | Total number of machine hours used by all containers in the cluster/warehouse. | 12.345 |
| clusterOwnerUserName | string | Username (email) of the user who created the cluster/warehouse. | user@yourcompany.com |
| tags | string ("-escaped JSON) | Default and custom cluster/warehouse tags, and default and custom instance pool tags (if applicable) associated with the cluster during this hour. See Cluster tags, Warehouse tags, and Pool tags. This is a superset of the clusterCustomTags column. | "{""dept"":""mktg"",""op_phase"":""dev"",""Vendor"":""Databricks"",""ClusterId"":""0405-020048-brawl507"",""Creator"":""user@yourcompany.com""}" |

Billing SKUs

  • AWS_ENHANCED_SECURITY_AND_COMPLIANCE
  • ENTERPRISE_ALL_PURPOSE_COMPUTE
  • ENTERPRISE_ALL_PURPOSE_COMPUTE_(PHOTON)
  • ENTERPRISE_DLT_CORE_COMPUTE
  • ENTERPRISE_DLT_CORE_COMPUTE_(PHOTON)
  • ENTERPRISE_DLT_PRO_COMPUTE
  • ENTERPRISE_DLT_PRO_COMPUTE_(PHOTON)
  • ENTERPRISE_DLT_ADVANCED_COMPUTE
  • ENTERPRISE_DLT_ADVANCED_COMPUTE_(PHOTON)
  • ENTERPRISE_JOBS_COMPUTE
  • ENTERPRISE_JOBS_COMPUTE_(PHOTON)
  • ENTERPRISE_JOBS_LIGHT_COMPUTE
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_US_EAST_N_VIRGINIA
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_US_EAST_OHIO
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_US_WEST_OREGON
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_CANADA
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_EUROPE_IRELAND
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_EUROPE_FRANKFURT
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_AP_SINGAPORE
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_AP_SYDNEY
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_EAST_N_VIRGINIA
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_EAST_OHIO
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_WEST_OREGON
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_CANADA
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_EUROPE_IRELAND
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_EUROPE_FRANKFURT
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_AP_SINGAPORE
  • ENTERPRISE_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_AP_SYDNEY
  • ENTERPRISE_SERVERLESS_SQL_COMPUTE_US_EAST_N_VIRGINIA
  • ENTERPRISE_SERVERLESS_SQL_COMPUTE_US_WEST_OREGON
  • ENTERPRISE_SERVERLESS_SQL_COMPUTE_EUROPE_IRELAND
  • ENTERPRISE_SERVERLESS_SQL_COMPUTE_AP_SYDNEY
  • ENTERPRISE_SQL_COMPUTE
  • ENTERPRISE_SQL_PRO_COMPUTE_US_EAST_N_VIRGINIA
  • ENTERPRISE_SQL_PRO_COMPUTE_US_EAST_OHIO
  • ENTERPRISE_SQL_PRO_COMPUTE_US_WEST_OREGON
  • ENTERPRISE_SQL_PRO_COMPUTE_US_WEST_CALIFORNIA
  • ENTERPRISE_SQL_PRO_COMPUTE_CANADA
  • ENTERPRISE_SQL_PRO_COMPUTE_SA_BRAZIL
  • ENTERPRISE_SQL_PRO_COMPUTE_EUROPE_IRELAND
  • ENTERPRISE_SQL_PRO_COMPUTE_EUROPE_FRANKFURT
  • ENTERPRISE_SQL_PRO_COMPUTE_EUROPE_LONDON
  • ENTERPRISE_SQL_PRO_COMPUTE_EUROPE_FRANCE
  • ENTERPRISE_SQL_PRO_COMPUTE_AP_SYDNEY
  • ENTERPRISE_SQL_PRO_COMPUTE_AP_MUMBAI
  • ENTERPRISE_SQL_PRO_COMPUTE_AP_SINGAPORE
  • ENTERPRISE_SQL_PRO_COMPUTE_AP_TOKYO
  • ENTERPRISE_SQL_PRO_COMPUTE_AP_SEOUL
  • PREMIUM_ALL_PURPOSE_COMPUTE
  • PREMIUM_ALL_PURPOSE_COMPUTE_(PHOTON)
  • PREMIUM_DLT_CORE_COMPUTE
  • PREMIUM_DLT_CORE_COMPUTE_(PHOTON)
  • PREMIUM_DLT_PRO_COMPUTE
  • PREMIUM_DLT_PRO_COMPUTE_(PHOTON)
  • PREMIUM_DLT_ADVANCED_COMPUTE
  • PREMIUM_DLT_ADVANCED_COMPUTE_(PHOTON)
  • PREMIUM_JOBS_COMPUTE
  • PREMIUM_JOBS_COMPUTE_(PHOTON)
  • PREMIUM_JOBS_LIGHT_COMPUTE
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_US_EAST_N_VIRGINIA
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_US_EAST_OHIO
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_US_WEST_OREGON
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_CANADA
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_EUROPE_IRELAND
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_EUROPE_FRANKFURT
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_AP_SINGAPORE
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_AP_SYDNEY
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_EAST_N_VIRGINIA
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_EAST_OHIO
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_US_WEST_OREGON
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_CANADA
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_EUROPE_IRELAND
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_EUROPE_FRANKFURT
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_AP_SINGAPORE
  • PREMIUM_SERVERLESS_REAL_TIME_INFERENCE_LAUNCH_AP_SYDNEY
  • PREMIUM_SERVERLESS_SQL_COMPUTE_US_EAST_N_VIRGINIA
  • PREMIUM_SERVERLESS_SQL_COMPUTE_US_WEST_OREGON
  • PREMIUM_SERVERLESS_SQL_COMPUTE_EUROPE_FRANKFURT
  • PREMIUM_SERVERLESS_SQL_COMPUTE_EUROPE_IRELAND
  • PREMIUM_SERVERLESS_SQL_COMPUTE_AP_SYDNEY
  • PREMIUM_SQL_COMPUTE
  • PREMIUM_SQL_PRO_COMPUTE_US_EAST_N_VIRGINIA
  • PREMIUM_SQL_PRO_COMPUTE_US_EAST_OHIO
  • PREMIUM_SQL_PRO_COMPUTE_US_WEST_OREGON
  • PREMIUM_SQL_PRO_COMPUTE_US_WEST_CALIFORNIA
  • PREMIUM_SQL_PRO_COMPUTE_CANADA
  • PREMIUM_SQL_PRO_COMPUTE_SA_BRAZIL
  • PREMIUM_SQL_PRO_COMPUTE_EUROPE_IRELAND
  • PREMIUM_SQL_PRO_COMPUTE_EUROPE_FRANKFURT
  • PREMIUM_SQL_PRO_COMPUTE_EUROPE_LONDON
  • PREMIUM_SQL_PRO_COMPUTE_EUROPE_FRANCE
  • PREMIUM_SQL_PRO_COMPUTE_AP_SYDNEY
  • PREMIUM_SQL_PRO_COMPUTE_AP_MUMBAI
  • PREMIUM_SQL_PRO_COMPUTE_AP_SINGAPORE
  • PREMIUM_SQL_PRO_COMPUTE_AP_TOKYO
  • PREMIUM_SQL_PRO_COMPUTE_AP_SEOUL
  • STANDARD_ALL_PURPOSE_COMPUTE
  • STANDARD_ALL_PURPOSE_COMPUTE_(PHOTON)
  • STANDARD_DLT_CORE_COMPUTE
  • STANDARD_DLT_CORE_COMPUTE_(PHOTON)
  • STANDARD_DLT_PRO_COMPUTE
  • STANDARD_DLT_PRO_COMPUTE_(PHOTON)
  • STANDARD_DLT_ADVANCED_COMPUTE
  • STANDARD_DLT_ADVANCED_COMPUTE_(PHOTON)
  • STANDARD_JOBS_COMPUTE
  • STANDARD_JOBS_COMPUTE_(PHOTON)
  • STANDARD_JOBS_LIGHT_COMPUTE

Deprecated SKUs

The following SKUs have been deprecated:

| Deprecated SKU Name | Deprecation Date | Replacement SKUs |
| --- | --- | --- |
| LIGHT_AUTOMATED_NON_OPSEC, LIGHT_AUTOMATED_OPSEC | March 2020 | STANDARD_JOBS_LIGHT_COMPUTE, PREMIUM_JOBS_LIGHT_COMPUTE, ENTERPRISE_JOBS_LIGHT_COMPUTE |
| STANDARD_AUTOMATED_NON_OPSEC, STANDARD_AUTOMATED_OPSEC | March 2020 | STANDARD_JOBS_COMPUTE, PREMIUM_JOBS_COMPUTE, ENTERPRISE_JOBS_COMPUTE |
| STANDARD_INTERACTIVE_NON_OPSEC, STANDARD_INTERACTIVE_OPSEC | March 2020 | STANDARD_ALL_PURPOSE_COMPUTE, PREMIUM_ALL_PURPOSE_COMPUTE, ENTERPRISE_ALL_PURPOSE_COMPUTE |
| ENTERPRISE_ALL_PURPOSE_COMPUTE_(DLT), PREMIUM_ALL_PURPOSE_COMPUTE_(DLT), STANDARD_ALL_PURPOSE_COMPUTE_(DLT) | April 2022 | ENTERPRISE_DLT_CORE_COMPUTE, PREMIUM_DLT_CORE_COMPUTE, STANDARD_DLT_CORE_COMPUTE |
| ENTERPRISE_SERVERLESS_SQL_COMPUTE, PREMIUM_SERVERLESS_SQL_COMPUTE, STANDARD_SERVERLESS_SQL_COMPUTE | June 2022 | ENTERPRISE_SERVERLESS_SQL_COMPUTE_US_EAST_N_VIRGINIA, ENTERPRISE_SERVERLESS_SQL_COMPUTE_US_WEST_OREGON, ENTERPRISE_SERVERLESS_SQL_COMPUTE_EUROPE_IRELAND, ENTERPRISE_SERVERLESS_SQL_COMPUTE_AP_SYDNEY, PREMIUM_SERVERLESS_SQL_COMPUTE_US_EAST_N_VIRGINIA, PREMIUM_SERVERLESS_SQL_COMPUTE_US_WEST_OREGON, PREMIUM_SERVERLESS_SQL_COMPUTE_EUROPE_IRELAND, PREMIUM_SERVERLESS_SQL_COMPUTE_AP_SYDNEY |

Analyze usage data in Databricks

This section describes how to make the data in the billable usage CSV file available to Databricks for analysis.

The CSV file uses a format that is standard for commercial spreadsheet applications, but Apache Spark needs one extra option to read it correctly: because the tag columns contain quote-escaped JSON, you must use option("escape", "\"") when you create the usage table in Databricks.

Total DBUs are the sum of the dbus column.

Import the log using the Create Table UI

You can use the file upload UI (Upload files to Databricks) to import the CSV file into Databricks for analysis.

Create a Spark DataFrame

You can also use the following code to create the usage table from a path to the CSV file:

Python
# The escape option is required because the tag columns contain "-escaped JSON.
df = (spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .option("escape", "\"")
    .csv("/FileStore/tables/usage_data.csv"))

df.createOrReplaceTempView("usage")

If the file is stored in an S3 bucket, for example when it is used with log delivery, the code will look like the following. You can specify a file path or a directory. If you pass a directory, all files are imported. The following example specifies a file.

Python
# Use the csv reader here too; a bare load() would default to the Parquet format.
df = (spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .option("escape", "\"")
    .csv("s3://<bucketname>/<pathprefix>/billable-usage/csv/workspaceId=<workspace-id>-usageMonth=<month>.csv"))

df.createOrReplaceTempView("usage")

The following example imports a directory of billable usage files:

Python
df = (spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .option("escape", "\"")
    .csv("s3://<bucketname>/<pathprefix>/billable-usage/csv/"))

df.createOrReplaceTempView("usage")

Create a Delta table

To create a Delta table from the DataFrame (df) in the previous example, use the following code:

Python
(df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("database_name.table_name"))

warning

The saved Delta table is not updated automatically when you add new CSV files or replace existing ones. If you need the latest data, re-run these commands before you use the Delta table.
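
Once saved, the table reads back like any other Delta table. For example, using the table name from the previous snippet:

Python
# Reload the saved table and compute DBU totals per workspace and SKU.
usage_delta = spark.table("database_name.table_name")
usage_delta.groupBy("workspaceId", "sku").sum("dbus").show()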