BigQuery Lens

DoiT Cloud Navigator: Enhanced and above

Introduction

Google Cloud BigQuery is a serverless, multi-cloud data analytics platform. BigQuery pricing mainly consists of two parts: compute pricing, the cost to process queries, and storage pricing, the cost to store data.

DoiT BigQuery Lens (BQ Lens) analyzes BigQuery usage by setting up an audit log sink for BigQuery activities. It generates actionable recommendations for both compute and storage, helping you to maximize the efficiency of your cloud spend, improve query performance, and ensure sustainable growth as your data scales.

Required permissions

  • To access BigQuery Lens, your DoiT account needs the Cloud Analytics permission.

  • BigQuery Lens needs specific permissions at the organization level. See BigQuery Lens permissions for details.

Set up BigQuery Lens

  1. Connect your Google Cloud Organization to the DoiT Platform, if you haven't done so.

  2. Update your service account to grant the permissions required by BigQuery Lens, BigQuery Lens Editions, and optionally BigQuery Lens Advanced.

Once set up, BigQuery Lens backfills historical data for the last 30 days and starts gathering information about your Google Cloud BigQuery usage patterns. It can take up to 24 hours to fully populate the BigQuery Lens dashboard with statistics and recommendations.

BigQuery Lens Advanced

BigQuery Lens Advanced provides clustering recommendations based on the following assumptions:

  • You'll continue to execute similar queries as you did in the previous 30 days, and
  • You'll reference fields in the order that we recommend.
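
As a hypothetical illustration of the second assumption: if the recommended Cluster By fields for a table were customer_id followed by event_date (placeholder names, not from an actual recommendation), queries benefit most when their filters reference those fields in that order, for example:

    # Hypothetical query whose filters follow the recommended clustering
    # order: customer_id first, then event_date.
    bq query --use_legacy_sql=false '
      SELECT COUNT(*)
      FROM `my-project.my_dataset.my_table`
      WHERE customer_id = "C-1001"
        AND event_date >= DATE "2024-01-01"'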

To set up BigQuery Lens Advanced, you also need to enable the Cloud Resource Manager API, which manages metadata for Google Cloud Platform resource containers:

  1. Sign in to the Google Cloud console and go to the Cloud Resource Manager API page.

  2. Select the project where you created your service account.

  3. Select Enable.
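
Alternatively, you can enable the API from the command line. This is a minimal sketch; replace the placeholder with the ID of the project where you created your service account:

    # Enable the Cloud Resource Manager API on the service account's project.
    gcloud services enable cloudresourcemanager.googleapis.com \
      --project YOUR_PROJECT_ID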

Update permissions

Beginning December 18, 2023, BigQuery Lens queries the INFORMATION_SCHEMA.TABLES view, which requires two additional organization-level permissions: bigquery.routines.get and bigquery.routines.list.

If you activated BigQuery Lens before December 18, 2023, run the following gcloud CLI commands to continue using it:

  1. Get your Google IAM organization resource ID.

    ORG_ID=$(gcloud organizations list --format 'value(ID)')
  2. Update the IAM custom role doit_cmp_role.

    gcloud iam roles update doit_cmp_role --organization $ORG_ID \
    --add-permissions bigquery.routines.get,bigquery.routines.list
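
To confirm that the permissions were added, you can inspect the custom role; this is an optional check, not part of the required steps:

    # List the permissions currently included in the doit_cmp_role custom role.
    gcloud iam roles describe doit_cmp_role --organization $ORG_ID \
      --format 'value(includedPermissions)'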

BigQuery Lens dashboard

The BigQuery Lens dashboard consists of widgets that highlight different aspects of your BigQuery usage.

The Google Cloud BigQuery Lens dashboard

General settings:

  • Time range: Adjusts the time frame of the data you're looking at. Options: Last day, Last 7 days, and Last 30 days.

  • Usage type: Filters queries by pricing model. Options correspond to the usage types shown in the Explorer metrics table below: On-Demand, Edition (pay as you go), Edition (one year and three year commitments), and Flat-Rate Legacy.

Widgets

Note

Except for the BigQuery Spend by SKU widget, all widgets on the BigQuery Lens dashboard are updated daily at 01:00 AM UTC.

Recommendations

The Recommendations widget shows actionable insights into BigQuery optimizations relevant to your usage. You can select each individual recommendation to see its details.

Compute recommendations

  • Backup and remove unused tables: This recommendation applies to all usage types. It lists the unused tables that you can back up and then remove from BigQuery to reduce storage costs.

    Below is an example:

    Google Cloud BigQuery On-demand Recommendation

    • The Project column shows the billing project. See Project and Billing project.

    • Select a table name in the Table column to open it in your Google Cloud BigQuery console.

    • If the table has multiple partitions, select the number in the Partition(s) to Remove column to see which partitions should be removed.

    • You can also download the full list as a CSV file.
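
One way to act on the backup-and-remove recommendation is with the bq CLI: export the table to Cloud Storage, verify the export, then delete the table. This is a minimal sketch; the project, dataset, table, and bucket names are placeholders:

    # Back up the unused table to Cloud Storage before removing it.
    bq extract --destination_format AVRO \
      'my-project:my_dataset.unused_table' \
      'gs://my-backup-bucket/unused_table/*.avro'

    # Remove the table only after verifying the backup.
    bq rm -t 'my-project:my_dataset.unused_table'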

On-demand pricing model recommendations
  • Cluster your tables optimally: Cluster tables using the suggested Cluster By fields. This helps to eliminate unnecessary scans; see the sketch after this list.

  • Enforce Partition Fields: Use the suggested Partition Field in queries indicated by the Query ID.

  • Limit query jobs: Reduce the execution frequency of the jobs listed in the Query ID column by the percentage you choose with the slider, and view the resulting Savings by Reducing Jobs.

  • Partition your tables: Use the suggested Partition Fields to convert non-partitioned tables to partitioned ones. BigQuery Lens does not give recommendations for tables that are already partitioned.
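
As a sketch of how you might apply the partitioning and clustering recommendations above, you can create a partitioned, clustered copy of an existing table and point your queries at it. The project, dataset, table, and field names below are placeholders; use the Partition Field and Cluster By fields suggested in your recommendations:

    # Create a partitioned and clustered copy of an existing table.
    bq query --use_legacy_sql=false '
      CREATE TABLE `my-project.my_dataset.events_partitioned`
      PARTITION BY DATE(event_timestamp)
      CLUSTER BY customer_id, event_type
      AS SELECT * FROM `my-project.my_dataset.events`'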

Capacity pricing model recommendations

If you use BigQuery editions with 1-yr or 3-yr slot commitments, or if you have Flat-Rate Legacy usage, you will see the following recommendation when applicable:

  • Move repeating queries to a different time slot: Lower your minimum average slots requirements by moving the recurring queries to a less-busy hour of the day. Review the Slots Consumption widget with the interval set to Hour to identify suitable hours. Troughs in your daily slots consumption indicate optimal hours to run recurring queries.
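
If you want to identify those troughs yourself, one option is to query the BigQuery INFORMATION_SCHEMA jobs timeline. This is a sketch that assumes the region-us region and organization-level access; adjust the region and view to match your setup:

    # Approximate average slot usage per hour over the last 7 days.
    bq query --use_legacy_sql=false '
      SELECT
        TIMESTAMP_TRUNC(period_start, HOUR) AS hour,
        SUM(period_slot_ms) / (1000 * 60 * 60) AS avg_slots
      FROM `region-us`.INFORMATION_SCHEMA.JOBS_TIMELINE_BY_ORGANIZATION
      WHERE period_start >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      GROUP BY hour
      ORDER BY hour'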

Storage recommendations

  • Change dataset to Physical Storage: Change a dataset's storage billing model to pay for the compressed physical storage used by tables in the dataset.

    You can open the recommendation link to view the detailed information on datasets that could benefit from using the physical storage billing model, including the project, table, logical storage size, physical storage size, logical storage cost, physical storage cost, and potential savings.

    Google Cloud BigQuery Physical Storage Recommendation

    • The Project column shows the billing project. See Project and Billing project.

    • Select a table name in the Table column to open it in your Google Cloud BigQuery console.

    • You can also download the full list as a CSV file.

    Note

    Make sure to review Google's physical storage documentation for eligibility criteria. The BigQuery Lens recommendation doesn't perform an eligibility check.
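
If a dataset is eligible, one way to apply this recommendation is with a DDL statement such as the following sketch; the project and dataset names are placeholders:

    # Switch a dataset to the physical (compressed) storage billing model.
    bq query --use_legacy_sql=false '
      ALTER SCHEMA `my-project.my_dataset`
      SET OPTIONS (storage_billing_model = "PHYSICAL")'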

Explorer

Use the Explorer widget to get granular information about your BigQuery usage. You can select individual entries to drill down into the data.

Google Cloud BigQuery Lens Explorer

The metrics available vary with the scope and usage type.

| Usage type | Scope | Metric |
| --- | --- | --- |
| On-Demand, Edition (pay as you go) | Billing Project, User | Scan Price, Scan TB |
| On-Demand, Edition (pay as you go) | Project, Dataset, Table | Scan Price, Scan TB, Storage Price, Storage TB |
| Edition (one year and three year commitments), Flat-Rate Legacy | Billing Project, User | Average Slots |
| Edition (one year and three year commitments), Flat-Rate Legacy | Project, Dataset, Table | Storage Price, Storage TB |

The example below shows the scan price of a chosen user. You can also select the individual queries run by the user to view the price.

BigQuery scan price of a user

Project and Billing project

  • The Project contains the data that is queried during the job. It is called data project in the BigQuery audit logs overview.

  • The Billing project is the project that runs the query. It is the project where you create your service account.

    The example below shows where you can find the billing project in the Google Cloud BigQuery console.

    Google Cloud BigQuery billing project
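
With the bq CLI, the distinction looks like the following sketch: the global --project_id flag sets the billing project that runs (and is billed for) the job, while the project referenced inside the query holds the data. All names below are placeholders:

    # The job is billed to billing-project; the data lives in data-project.
    bq --project_id billing-project query --use_legacy_sql=false \
      'SELECT COUNT(*) FROM `data-project.my_dataset.my_table`'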

"Not yet allocated" items

BigQuery Lens may not be able to detect which table is being scanned for reasons such as:

  • Queries like SELECT 1 that don't reference any table
  • Queries we are unable to parse due to complexity (for example, a parser timeout)
  • Queries that use a function or statement that we do not support yet

In such cases, you'll see Not yet allocated items in the BigQuery Lens Explorer widget.

Slots consumption

Shows the number of slots consumed by your organization in terms of Peak usage and Average usage per Hour or Day.

The example below shows the slots consumption when the usage type is one of the BigQuery editions.

Google Cloud BigQuery Slots Consumption

Scans by table type

This widget shows the volume of data you are scanning from various types of tables, broken down between unpartitioned tables and multiple types of partitioned tables, along with external sources (e.g., Google Sheets).

You can select a table type to see the most frequently scanned tables of that type.

Google Cloud BigQuery scans by table type

BigQuery spend by SKU

This widget shows a breakdown of BigQuery costs per SKU.

Google Cloud BigQuery spend by SKU

You can select the three-dots menu in the upper-right corner of the widget for two additional options:

  • Refresh: By default, the BigQuery Spend by SKU widget is updated twice per day. Use the Refresh option to fetch the latest billing data from Google.

    Note

    The timestamp shown in the widget indicates when the DoiT Platform retrieved the billing data, while the freshness of the data itself depends on how frequently Google Cloud loads billing data.

  • Open full report: Opens the report that generates the visualization in this widget.

FAQ

What datasets and sinks do you create?

We create the following dataset and sink in your billing project:

  • A dataset named doitintl-cmp-bq.

  • An audit log sink named <Billing_Project_ID>.doitintl-cmp-bq.cloudaudit_googleapis_com_data_access for all your BigQuery jobs.

The dataset doitintl-cmp-bq is created when you update your service account with the required permissions and removed when you remove the service account.
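
You can verify both resources from the command line; this is an optional check, and the billing project ID is a placeholder:

    # Confirm the audit log sink exists in the billing project.
    gcloud logging sinks list --project BILLING_PROJECT_ID

    # Confirm the BigQuery Lens dataset exists in the billing project.
    bq ls --project_id BILLING_PROJECT_ID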

What queries do you run?

BigQuery Lens has two main processes: a table discovery process to collect table metadata, and a dashboard update process to run queries.

  • The table discovery process runs on a schedule and writes the table metadata it collects into a table named tables in the doitintl-cmp-bq dataset.

  • The dashboard update process executes the following queries:

    • A daily run of cost simulation queries that provide cost savings information and potential savings.
    • A daily run of lens calculation queries that provide detailed BigQuery usage data.
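
If you want to inspect these jobs yourself, one option is to filter your billing project's job history by the service account that runs them. This is a sketch assuming the region-us region; the service account email is a placeholder:

    # Recent jobs run in the billing project by the DoiT service account.
    bq query --use_legacy_sql=false '
      SELECT creation_time, job_type, statement_type, total_bytes_processed
      FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
      WHERE user_email = "doit-service-account@example-project.iam.gserviceaccount.com"
        AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      ORDER BY creation_time DESC'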

How much does BigQuery Lens cost?

BigQuery Lens creates and maintains the doitintl-cmp-bq dataset on your behalf and runs queries to provide you with analytics data. The cost of running these queries is usually around $10 per month.

Note that the actual cost depends on the volume of your logs. If your organization uses BigQuery heavily, especially under the on-demand analysis pricing model, the cost could be considerably higher.

Caution

If you have a large number of historical jobs, you may see an increased cost for BigQuery Streaming Inserts when we backfill the last 30 days of jobs during your service account setup.

What's next