Sensitive Data Protection pricing

This page provides pricing information for Sensitive Data Protection. Prices on this page are listed in US dollars (USD).

Sensitive Data Protection requires billing information for all accounts before you can start using the service. To sign up for billing, go to your project's billing page in the Google Cloud console.

Sensitive Data Protection charges for usage based on the following price sheet. At the end of each billing cycle, a bill is generated that lists the usage and charges for that cycle.

Prematurely canceling an ongoing operation still incurs costs for the portion of the operation that was completed.

Overview of Sensitive Data Protection pricing

Sensitive Data Protection pricing has three main components:

  • Inspection and transformation pricing
  • Discovery pricing
  • Risk analysis

Inspection and transformation pricing

Sensitive Data Protection provides a set of features for inspecting and transforming data. Across these scenarios, you pay only for what you use, with no upfront commitments.

Inspection and transformation: supported features

Sensitive Data Protection supports inspection and transformation features summarized in the following table:

Feature Description
Inspection with built-in infoType detectors Each built-in classifier detects a different data element such as names, phone numbers, email addresses, Social Security numbers, and more.
Inspection with custom infoType detectors Lets you define custom dictionaries to classify new data elements or to augment the predefined infoTypes.
Image redaction Extracts text from images, classifies the text, and generates a new image with rectangular boxes that mask any findings.
De-identification Transforms tabular or free-text data to mask, redact, or obfuscate by column, record, or infoType finding.

Inspection and transformation of data in Google Cloud storage systems

The projects.dlpJobs.create method lets you create an inspection job that inspects for sensitive data in certain Google Cloud storage systems. You are billed according to the storage inspection job pricing. If the inspection job is also configured to de-identify the findings, then you are also billed according to the storage transformation job pricing.
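
As a minimal, illustrative sketch (not a verbatim example from this page), the following Python snippet uses the google-cloud-dlp client library to create a storage inspection job that scans a Cloud Storage bucket and saves findings to a BigQuery table. The project, bucket, dataset, and table names are placeholders.

```python
# Sketch: create a storage inspection job (projects.dlpJobs.create) that scans
# a Cloud Storage bucket and saves findings to BigQuery. Resource names are
# placeholders; adjust infoTypes and limits to your own needs.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
project_id = "my-project"  # placeholder
parent = f"projects/{project_id}/locations/global"

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {
            "file_set": {"url": "gs://my-bucket/**"}  # placeholder bucket
        }
    },
    "inspect_config": {
        "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
        "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
    },
    "actions": [
        {
            # Findings are written to this table; BigQuery insert charges and
            # quota apply to the project that contains the destination table.
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": project_id,
                        "dataset_id": "dlp_results",     # placeholder
                        "table_id": "storage_findings",  # placeholder
                    }
                }
            }
        }
    ],
}

job = client.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print(f"Created job: {job.name}")
```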

Storage inspection job pricing

Sensitive Data Protection storage jobs are billed based on bytes inspected according to the following schedule:

Storage data inspected per month Price per gigabyte (GB)
Up to 1 GB Free
1 GB to 50 terabytes (TB) US$1.00
50 TB to 500 TB US$0.75
Over 500 TB US$0.60

If you configure an inspection job to save findings to a BigQuery table, the billing and quota usage for the tabledata.insertAll operation are applied to the project that contains the destination table.

For more information about inspecting content stored in Google Cloud, see Inspecting storage and databases for sensitive data.

Storage transformation job pricing

Sensitive Data Protection storage jobs are billed based on bytes transformed according to the following schedule:

Storage data transformed per month Price per gigabyte (GB)
Up to 1 GB Free
1 GB to 50 terabytes (TB) US$1.00
50 TB to 500 TB US$0.75
Over 500 TB US$0.60

If you choose to store the transformation details in a BigQuery table, the billing and quota usage for the tabledata.insertAll operation are applied to the project that contains the destination table.

For more information about de-identifying content stored in Google Cloud, see De-identification of sensitive data in storage.
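
For illustration only, a storage inspection job can also carry a de-identify action so that findings are transformed and written to an output bucket; in that case both the inspection and transformation rates above apply. The template and bucket names in this sketch are placeholders.

```python
# Sketch: an inspection job that also de-identifies findings in Cloud Storage.
# Both storage inspection and storage transformation charges apply to such a job.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {"file_set": {"url": "gs://source-bucket/**"}}
    },
    "inspect_config": {"info_types": [{"name": "PERSON_NAME"}]},
    "actions": [
        {
            "deidentify": {
                # Placeholder de-identify template describing the transformations.
                "transformation_config": {
                    "deidentify_template": (
                        "projects/my-project/locations/global/"
                        "deidentifyTemplates/my-template"
                    )
                },
                # De-identified copies of the files are written here.
                "cloud_storage_output": "gs://deidentified-bucket/",
            }
        }
    ],
}

job = client.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print(job.name)
```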

Inspection of data from any source

The projects.dlpJobs.create method lets you create a hybrid job that inspects for sensitive data from any source, including sources outside Google Cloud. You are billed based on bytes inspected according to the following schedule:

Hybrid data inspected per month Price per GB
Up to 1 GB Free
1 GB to 1 TB US$3.00
Over 1 TB US$2.00

If you pay in a currency other than USD, the prices listed in your currency on Cloud Platform SKUs apply.

A minimum of 1 KB is billed per hybrid inspection request.

If you configure a hybrid inspection job to save findings to a BigQuery table, the billing and quota usage for the tabledata.insertAll operation are applied to the project that contains the destination table.

For more information about inspecting data from any source, see Hybrid jobs and job triggers.
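
The following sketch, with placeholder names, shows the general shape of a hybrid workflow: create a hybrid job, then send externally sourced data to it. Bytes sent this way are billed at the hybrid inspection rates, subject to the 1 KB per-request minimum.

```python
# Sketch: create a hybrid job, then stream externally sourced text to it for
# inspection. Names and the sample value are placeholders.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

# A hybrid job defines how submitted data is inspected; it waits for
# hybridInspect requests instead of scanning a storage system itself.
inspect_job = {
    "storage_config": {
        "hybrid_options": {"description": "Rows from an on-premises database"}
    },
    "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
}
job = client.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})

# Send one piece of external data to the running hybrid job.
client.hybrid_inspect_dlp_job(
    request={
        "name": job.name,
        "hybrid_item": {"item": {"value": "Customer card: 4111 1111 1111 1111"}},
    }
)
```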

Inspection and transformation through content methods

The content methods are listed in the following table, along with notations of the types of charges each method may be billed for:

API method Content inspection Content transformation
projects.image.redact Yes No
projects.content.inspect Yes No
projects.content.deidentify Yes Yes
projects.content.reidentify Yes Yes
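
As a hedged illustration of the content methods, the following sketch calls projects.content.inspect through the Python client library with placeholder resource names. Only the bytes in the request payload count toward content inspection pricing, and a request this small is billed at the 1 KB minimum described later in this section.

```python
# Sketch: inspect a small piece of text with the synchronous content.inspect
# method. Names and the sample text are placeholders.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

response = client.inspect_content(
    request={
        "parent": parent,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "item": {"value": "Contact me at alex@example.com"},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```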

Content inspection method pricing

Sensitive Data Protection content method pricing is billed based on bytes inspected according to the following schedule:

Content data inspected per month Price per GB
Up to 1 GB Free
1 GB to 1 TB US$3.00
Over 1 TB US$2.00

Content transformation method pricing

Sensitive Data Protection content method pricing is billed based on bytes transformed according to the following schedule:

Content data transformed per month Price per GB
Up to 1 GB Free
1 GB to 1 TB US$2.00
Over 1 TB US$1.00

If you pay in a currency other than USD, the prices listed in your currency on Cloud Platform SKUs apply.

A minimum of 1 KB is billed per content inspect or transform request.

Inspection and transformation: other charges and no-charge features

In addition to the billed charges directly incurred by Sensitive Data Protection, requests that are configured to invoke other Google Cloud products may result in their own billed charges. For example, the projects.content.inspect method may incur Cloud Storage charges if directed to inspect Cloud Storage objects.

Some methods can result in billed charges for inspection, transformation, or both, depending on how they are configured. For example, the projects.content.deidentify and projects.content.reidentify methods incur only transformation charges when transformation is configured without inspection, and only inspection charges when inspection is configured without transformation.

Simple redaction, which includes the RedactConfig and ReplaceWithInfoTypeConfig transformations, is not counted against the number of bytes transformed when infoType inspection is also configured.
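
To illustrate that note, here is a hedged sketch (placeholder names, sample text) of a projects.content.deidentify request that uses simple redaction (ReplaceWithInfoTypeConfig) together with infoType inspection. Under the rule above, such a request is billed for inspection, but the replaced bytes aren't counted as transformed.

```python
# Sketch: simple redaction via content.deidentify. Matched values are replaced
# with the name of the detected infoType; because infoType inspection is
# configured, this transformation isn't counted against bytes transformed.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

response = client.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {"primitive_transformation": {"replace_with_info_type_config": {}}}
                ]
            }
        },
        "item": {"value": "Please write to alex@example.com for details."},
    }
)
print(response.item.value)  # matched addresses are replaced with the infoType name
```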

Inspection and transformation pricing examples

This section contains several example inspection and transformation usage scenarios, along with pricing calculations for each.

Scenario 1: Data inspection and transformation using content methods

Suppose you have just over 10 GB of structured (tabular) data. You stream it to the DLP API, instructing Sensitive Data Protection in the request to inspect for 50 different built-in infoType detectors, and to de-identify any matches it finds by using cryptographic tokenization transformation. After performing the de-identification operation, you note that Sensitive Data Protection has matched on and transformed around 20% of the data, or around 2 GB.

Pricing:

  • Inspection: 10 GB of data × US$3.00 per GB = US$30.00
  • Transformation: 2 GB × US$2.00 per GB = US$4.00
  • Total: US$34.00

Scenario 2: Structured data transformation only using content methods

Suppose you have a 10 GB table and want to transform three columns (user_id, email, phone_number) using cryptographic tokenization transformation. The three columns represent about 30% of the table. Because you're specifying entire columns to transform, no inspection is necessary.

Pricing:

  • Inspection: 0 GB of data = US$0.00
  • Transformation: 3 GB of data × US$2.00 per GB = US$6.00
  • Total: US$6.00
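
The following sketch shows roughly how Scenario 2 might look as a content.deidentify request: the transformation is applied to named columns through record transformations, so no inspect_config is needed and only transformation charges apply. The column names, key material, and surrogate infoType are placeholders.

```python
# Sketch: transform specific columns of a table without inspection, using
# deterministic cryptographic tokenization. Only transformation charges apply.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

crypto_transform = {
    "crypto_deterministic_config": {
        "crypto_key": {
            "kms_wrapped": {
                "wrapped_key": b"...",  # placeholder: key wrapped by Cloud KMS
                "crypto_key_name": "projects/my-project/locations/global/keyRings/r/cryptoKeys/k",
            }
        },
        "surrogate_info_type": {"name": "TOKEN"},
    }
}

deidentify_config = {
    "record_transformations": {
        "field_transformations": [
            {
                # Entire columns are tokenized, so no inspect_config is required.
                "fields": [{"name": "user_id"}, {"name": "email"}, {"name": "phone_number"}],
                "primitive_transformation": crypto_transform,
            }
        ]
    }
}

table_item = {
    "table": {
        "headers": [{"name": "user_id"}, {"name": "email"}, {"name": "phone_number"}],
        "rows": [
            {
                "values": [
                    {"string_value": "u-123"},
                    {"string_value": "alex@example.com"},
                    {"string_value": "+1 555 0100"},
                ]
            }
        ],
    }
}

response = client.deidentify_content(
    request={"parent": parent, "deidentify_config": deidentify_config, "item": table_item}
)
print(response.item.table)
```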

Scenario 3: Unstructured data inspection and transformation with content methods

Suppose you have 10 GB of unstructured chat logs. You want to inspect and de-identify any infoType findings. To do so, you need to inspect the entire payload and then transform the findings. Findings make up 20% of all the text.

Pricing:

  • Inspection: 10 GB of data × US$3.00 per GB = US$30.00
  • Transformation: 2 GB of data × US$2.00 per GB = US$4.00
  • Total: US$34.00

Scenario 4: Storage repository inspection using storage jobs

Suppose you have 1,000 BigQuery tables that you want to inspect. Each table is around 1 GB, making the total size of the data 1 TB. Not wanting or needing to scan the entirety of every table, you've turned on sampling so that just 1,000 rows of each table are scanned. Each row is roughly 10 KB.

Pricing:

  • Data to inspect: 1,000 tables × 1,000 rows per table × 10 KB per row = 10 GB total scanned
  • Total: 10 GB × US$1.00 per GB = US$10.00
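
The sampling described in Scenario 4 corresponds roughly to the BigQuery options on a storage inspection job. The fragment below is an illustrative sketch with placeholder table names; it limits each job to a sample of rows so that only the sampled bytes are billed.

```python
from google.cloud import dlp_v2

# Sketch: limit a BigQuery inspection job to a sample of rows, as in Scenario 4.
# This storage_config would be placed on an inspect job, one table per job,
# as in the earlier storage inspection sketch.
storage_config = {
    "big_query_options": {
        "table_reference": {
            "project_id": "my-project",  # placeholder names
            "dataset_id": "my_dataset",
            "table_id": "my_table",
        },
        "rows_limit": 1000,  # scan at most 1,000 rows of the table
        "sample_method": dlp_v2.BigQueryOptions.SampleMethod.RANDOM_START,
    }
}
```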

Scenario 5: Storage repository inspection and transformation using storage jobs

Suppose you have 5 GB of structured (tabular) and unstructured (freeform) text data in a Cloud Storage bucket. You create an inspection job that instructs Sensitive Data Protection to inspect for 25 different built-in infoType detectors and to de-identify any matches it finds by using cryptographic tokenization transformation. After performing the de-identification operation, you note that Sensitive Data Protection has matched on and transformed 25% of the data, or 1.25 GB.

Pricing:

  • Inspection: 5 GB of data × US$1.00 per GB = US$5.00
  • Transformation: 1.25 GB × US$1.00 per GB = US$1.25
  • Total: US$6.25

Discovery pricing

This section describes the cost to generate data profiles. Data profiles are high-level metrics and insights about your data. For information about the types of data that the discovery service can profile, see Supported resources.

Sensitive Data Protection offers a choice of two pricing modes for the discovery service:

  • Consumption pricing. In consumption mode, projects or organizations are subject to per-GB pricing based on the size of the profiled data.
  • Fixed-rate subscription pricing. In subscription mode, you explicitly choose how much compute time (capacity) to reserve for profiling. Your profiles are generated within that capacity, and you pay for that capacity continuously every second it's deployed. You have this capacity until you cancel your subscription. There is no charge for bytes profiled when using this pricing mode.

    The subscription pricing mode offers predictable and consistent costs, regardless of your data growth.

By default, you are billed according to the consumption pricing mode.
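
As a hedged sketch only (the field names follow the DLP API's discovery configuration resource, and all project, dataset, and table names are placeholders), a project-level discovery scan configuration that profiles all BigQuery tables and saves profile copies to BigQuery might look like the following. Profiling charges then accrue in whichever pricing mode is in effect for the project or organization.

```python
# Sketch: a project-level discovery scan configuration that profiles all
# BigQuery tables in the project and exports profile copies to BigQuery.
# The pricing mode (consumption or subscription) is chosen separately,
# not in this configuration.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

discovery_config = {
    "org_config": {"project_id": "my-project"},  # project-level scope
    "targets": [{"big_query_target": {"filter": {"other_tables": {}}}}],
    "actions": [
        {
            "export_data": {
                "profile_table": {
                    "project_id": "my-project",
                    "dataset_id": "profiles",      # placeholder
                    "table_id": "data_profiles",   # placeholder
                }
            }
        }
    ],
    "status": dlp_v2.DiscoveryConfig.Status.RUNNING,
}

config = client.create_discovery_config(
    request={"parent": parent, "discovery_config": discovery_config}
)
print(config.name)
```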

Discovery pricing comparison table

Pricing for the discovery service is as follows:

Pricing mode Pricing details
Consumption mode
  • Discovery for BigQuery
    • You are charged US$0.03 per GB of BigQuery data profiled. The billable bytes per table are equal to the table's size or 3 TB, whichever is lower.
    • Each BigLake table profiled is billed as a 300 GB table.
  • Discovery for Cloud SQL: you are charged US$0.03 per GB of Cloud SQL data profiled, with a minimum of US$0.01 for each table. The billable bytes per table are equal to the table's size or 3 TB, whichever is lower.
  • Discovery for Cloud Storage and Amazon S3: you are charged US$0.03 per GB of scanned files. Charges per bucket are capped at 3 TB worth of data. For example, if two buckets are profiled, the charges are capped at 6 TB worth of data.

    You aren't charged for files that Sensitive Data Protection failed to scan, such as corrupt files and files that are password protected. You are charged US$0.03 for each bucket that is empty or that has no supported file types.

Subscription mode
  • You are charged US$2,500 per subscription unit.
  • Each subscription unit profiles approximately 10,000 standard tables¹ or 2,000 BigLake tables or 500 file stores² or a mix of these data assets³.

    The scope of a subscription is either an organization or a project. An organization-level subscription doesn't apply to a project-level scan configuration.

Table notes:

  1. BigQuery or Cloud SQL tables.

  2. Sensitive Data Protection uses the term file store to refer to a file storage bucket. Buckets that are empty or that have no supported file types still consume capacity. For such a bucket, you consume the equivalent capacity of 0.05 buckets.

  3. Actual throughput varies depending on the complexity of the data and the resource types. For more information, see Discovery: subscription mode pricing.

When you profile Cloud Storage data, Cloud Storage charges apply regardless of your pricing mode. For more information, see Discovery for Cloud Storage on this page.

When you profile Amazon S3 data, AWS charges apply regardless of your pricing mode. For more information, see Sensitive data discovery for Amazon S3.

Discovery: estimate your profiling costs

Before you choose a pricing mode for the discovery service, we recommend that you run an estimation. An estimation helps you understand how much data you have and how much it might cost to profile that data in subscription mode and in consumption mode.
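
As a rough, illustrative comparison only (using the rates listed on this page and made-up inventory numbers), the two modes can be weighed like this before you run a real estimation:

```python
# Back-of-the-envelope comparison of the two discovery pricing modes, using
# the rates on this page and a hypothetical inventory. Run a real estimation
# to get actual numbers for your data.
CONSUMPTION_RATE_PER_GB = 0.03   # US$ per GB profiled
SUBSCRIPTION_UNIT_COST = 2500    # US$ per subscription unit per month
TABLES_PER_UNIT = 10_000         # approximate standard tables per unit per month

total_profiled_gb = 12_000       # hypothetical: ~12 TB profiled this month
standard_table_count = 8_000     # hypothetical table count

consumption_cost = total_profiled_gb * CONSUMPTION_RATE_PER_GB
units_needed = -(-standard_table_count // TABLES_PER_UNIT)  # ceiling division
subscription_cost = units_needed * SUBSCRIPTION_UNIT_COST

print(f"Consumption mode:  US${consumption_cost:,.2f}")   # US$360.00
print(f"Subscription mode: US${subscription_cost:,.2f}")  # US$2,500.00 (1 unit)
```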

Discovery: consumption mode pricing

When you configure data profiling, all data assets in the scope of your discovery scan configuration are profiled. You incur costs at the rate listed. For example, a 10 GB table costs US$0.30 to profile.

Consumption mode pricing examples

This section contains example usage scenarios related to data profiling, along with pricing calculations.

These examples are based on the default profiling frequency.

Scenario 1: Organization-wide data profiling

Suppose you have 10 TB of data across your entire organization. Each month, you add the following:

  • 1 TB of data in new tables.
  • 1 TB of data in new columns in existing tables. This amounts to 5 TB of data representing tables with schema changes.

Month 1: Profiles are created for all your data

  • Starting data: 10 TB of data is profiled. Price: 10,000 GB x US$0.03 = US$300.00
  • 1 TB of data is added as new tables (picked up daily) over the month. Profiling is triggered shortly after the new tables are added. Price: 1,000 GB x US$0.03 = US$30.00
  • 5 TB of data representing tables with schema changes. Reprofiling is scheduled for the next month. Price: US$0
  • Total: US$330.00

Month 2: Tables with schema changes are reprofiled

  • Starting data: 12 TB total. 5 TB of data is set for reprofiling due to tables with schema changes last month. When a table is set for reprofiling, the entire table is reprofiled, and charges are based on the total table size. Price: 5,000 GB x US$0.03 = US$150.00
  • 1 TB of data is added as new tables (picked up daily) over the month. Profiling is triggered shortly after the new tables are added. Price: 1,000 GB x US$0.03 = US$30.00
  • 5 TB of data representing tables with schema changes. Reprofiling is scheduled for the next month. Price: US$0
  • Total: US$180.00

Scenario 2: Organization-wide data profiling with static data schema

Suppose you have 5 TB of data across your entire organization. Each month, you add 1 TB of new data in new tables. Existing tables have no schema changes (no new columns), but do have additional rows.

Month 1: Profiles are created for all your data

  • Starting data: 5 TB of data is profiled. Price: 5,000 GB x US$0.03 = US$150.00
  • 1 TB of data is added as new tables (picked up daily) over the month. Profiling is triggered shortly after the new tables are added. Price: 1,000 GB x US$0.03 = US$30.00
  • Total: US$180.00

Month 2: Only new tables are profiled

  • Static data: 6 TB. Because existing tables remain unchanged, a new scan is not triggered. Price: US$0
  • 1 TB of data is added as new tables (picked up daily) over the month. Price: 1,000 GB x US$0.03 = US$30.00
  • Total: US$30.00

Discovery: subscription mode pricing

A subscription unit is a reservation of compute time (capacity) that Sensitive Data Protection uses to generate a profile.

Data assets profiled per subscription unit

The throughput of profile generation depends on the complexity and type of the data to be profiled. Determining factors include the following:

  • Presence of large custom dictionaries (not recommended for profiling).
  • Table type. A BigLake table uses five times the capacity of a non-BigLake table.

The following table shows example throughputs per subscription unit.

Subscription unit count Cost per month Approximate number of profiles per month
1 unit $2,500 10,000 standard tables¹ or 2,000 BigLake tables or 500 file stores² or a mix of these data assets
2 units $5,000 20,000 standard tables or 4,000 BigLake tables or 1,000 file stores or a mix of these data assets
4 units $10,000 40,000 standard tables or 8,000 BigLake tables or 2,000 file stores or a mix of these data assets
20 units $50,000 200,000 standard tables or 40,000 BigLake tables or 10,000 file stores or a mix of these data assets

¹ BigQuery or Cloud SQL tables.

² Sensitive Data Protection uses the term file store to refer to a file storage bucket. Buckets that are empty or that have no supported file types still consume capacity. For such a bucket, you consume the equivalent capacity of 0.05 buckets.
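
As an illustrative calculation only (treating each asset type as a fraction of one unit's approximate monthly capacity from the table above, and using a made-up inventory), you can estimate how many units a mixed inventory needs; actual throughput varies with data complexity, as noted.

```python
# Illustrative only: estimate subscription units for a mixed inventory, using
# the approximate per-unit capacities from the table above (10,000 standard
# tables, 2,000 BigLake tables, or 500 file stores per unit per month).
import math

standard_tables = 12_000  # hypothetical inventory
biglake_tables = 1_000
file_stores = 200

units_consumed = (
    standard_tables / 10_000
    + biglake_tables / 2_000
    + file_stores / 500
)
units_to_buy = math.ceil(units_consumed)
print(f"Capacity needed: {units_consumed:.2f} units -> purchase {units_to_buy} units")
# Capacity needed: 2.10 units -> purchase 3 units
```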

Subscription scope

The scope of a subscription is either an organization or a project. An organization-level subscription doesn't apply to a project-level scan configuration.

Subscription term

The first month of the subscription is a fixed one-month term. After the initial month, you are billed on a monthly basis, and you can cancel or edit the subscription at any time.

  • You can't delete or reduce units of a monthly subscription during the first month.
  • After the first month, you can delete or change subscription units at any time, and you will be charged only for the minutes your subscription was active.
  • If you don't cancel the subscription, you continue to be charged.

Example

Suppose you purchased a subscription unit at 6:00:00 on October 5. The following apply:

  • You start being charged at that moment.
  • You can't cancel or reduce your subscription until 6:00:00 on November 5.
  • If you cancel at 7:10:10 on November 6, you are charged for the month plus one day, one hour, ten minutes, and ten seconds (from 6:00:00 on October 5 to 7:10:10 on November 6).

Expiration of a subscription term

At the conclusion of the subscription's initial term, billing will continue from month to month and the subscription will remain in place.

Purchase a subscription

  1. In the Google Cloud console, go to the Subscriptions page.

    Go to Subscriptions

  2. Select the project or organization that you want to purchase a subscription for.

    If you purchase an organization-level subscription, that subscription doesn't apply when you create a project-level scan configuration. Similarly, if you purchase a project-level subscription, the subscription applies only to the project.

  3. Next to Pricing mode, click Switch to subscriptions.

  4. Follow the prompts to complete the purchase.

Under-provisioned capacity

If profiling a project, folder, or organization requires more capacity than currently available, Sensitive Data Protection queues up work and waits for capacity to become available. As progress on profiling is made, and as capacity frees up, the queued-up work gets picked up for execution.

If your profiling demand exceeds the capacity you subscribed to, you aren't charged for the overage. That is, you aren't billed for additional capacity, and the excess isn't billed at the consumption rate. The resources to be profiled are added to an internal queue and are profiled when capacity becomes available.

Therefore, when deciding how many subscription units to purchase, you can choose to under-provision your capacity if you're willing to wait for your profiles to be created. For example, you can purchase one subscription unit even if your total current table count exceeds 10,000.

Monitoring utilization

On the API/Service Details page for Sensitive Data Protection, you can view how much of your subscription capacity you have used, including the capacity used for profiling per day.

For example, a customer who has purchased a subscription of size 1 has the capacity to profile approximately 333 standard tables or 66 BigLake tables or 16 file stores a day. These numbers aren't hard per-day limits; as capacity becomes available throughout the subscription month, you may see some fluctuation in daily usage.

Error handling

In some cases, profiles might be generated with errors and still consume capacity. The following are a few scenarios where this issue can occur; this isn't an exhaustive list.

  • The data assets to be profiled are within VPC Service Controls boundaries.
  • The service agent is missing Identity and Access Management permissions.
  • Configuration changes were made to the discovery scan configuration or inspection templates.

These errors can still consume your capacity because the system still performs work to attempt to generate profiles. You will get a partial profile with information about why Sensitive Data Protection could not generate the full profile.

Discovery for Cloud Storage

This section applies to Cloud Storage data profiling in both the consumption and subscription pricing modes. For a comparison of the two pricing modes, see Discovery pricing comparison table on this page.

Class A and Class B operations

You are charged for the Class A and Class B operations that Sensitive Data Protection performs in the process of profiling your buckets. Sensitive Data Protection uses the following operations:

  • Class A: storage.objects.list
  • Class B: storage.buckets.get and storage.buckets.getIamPolicy

For information about how much Cloud Storage charges for Class A and Class B operations, see Operation charges in the Cloud Storage documentation.

Retrieval fees

For objects that have a non-Standard storage class, you are charged retrieval fees. For information about how much Cloud Storage charges for data retrieval, see Retrieval fees in the Cloud Storage documentation.

Discovery: profiling schedule

The default profiling schedule is described in Default frequency of data profile generation. You can adjust the schedule in your scan configuration. This is true for both consumption and subscription modes.

In subscription mode, if you under-provision your capacity, profiling can run less frequently than you requested. Capacity is distributed evenly among the resources to be profiled within the project or organization. As capacity becomes available, Sensitive Data Protection picks up resources to be profiled from the queue in a way that maximizes throughput.

Discovery: BigQuery billing and quota consumption

The process of profiling BigQuery data doesn't incur BigQuery charges or consume BigQuery quota. However, standard BigQuery charges and quotas apply when you export the data profiles to a BigQuery table.

Discovery: pricing for exporting data profiles

The following table shows billing and quota consumption for your usage of other Google Cloud services when you export data profiles to those services. You configure exporting by turning on certain actions in your discovery scan configuration.

Action Quota consumption Charges
Publish to Google Security Operations Not applicable Depending on your contract, Google SecOps may charge for data ingestion or storage. Contact your Google Cloud account manager for more information.
Publish to Security Command Center Not applicable Security Command Center charges may apply, depending on your service tier.¹
Save data profile copies to BigQuery Consumes BigQuery quota in the service agent container² or the project to be profiled³. Standard BigQuery charges apply. The charges are applied to the service agent container² or the project to be profiled³.
Publish to Pub/Sub Consumes Pub/Sub quota in the service agent container² or the project to be profiled³. Standard Pub/Sub charges apply. The charges are applied to the service agent container² or the project to be profiled³.
Send to Dataplex as tags Not applicable Dataplex metadata storage charges and API charges apply.

¹ Sensitive Data Protection works with Security Command Center in all service tiers.

² When you profile data at the organization or folder level, charges and quota consumption are applied to the service agent container.

³ When you profile data at the project level, charges and quota consumption are applied to the project to be profiled.

Risk analysis

Risk analysis uses resources in BigQuery and charges appear as BigQuery usage. Sensitive Data Protection does not add any additional charges for risk analysis.

Risk analysis jobs are created by using the projects.dlpJobs.create method with a risk analysis job configuration (RiskAnalysisJobConfig).
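
For illustration, the following hedged sketch creates a k-anonymity risk analysis job over a BigQuery table; the project, table, and quasi-identifier column names are placeholders, and the computation itself is billed as BigQuery usage.

```python
# Sketch: a k-anonymity risk analysis job. Sensitive Data Protection adds no
# charge of its own; the analysis runs in BigQuery and appears as BigQuery usage.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder

risk_job = {
    "privacy_metric": {
        "k_anonymity_config": {
            # Placeholder quasi-identifier columns.
            "quasi_ids": [{"name": "zip_code"}, {"name": "birth_year"}]
        }
    },
    "source_table": {
        "project_id": "my-project",  # placeholder table to analyze
        "dataset_id": "my_dataset",
        "table_id": "patients",
    },
}

job = client.create_dlp_job(request={"parent": parent, "risk_job": risk_job})
print(job.name)
```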

Controlling costs

Depending on the quantity of information that you instruct Sensitive Data Protection to scan, it is possible for costs to become prohibitively high. To learn several methods that you can use to keep costs down while also ensuring that you're scanning exactly the data that you intend to, see Keeping Sensitive Data Protection costs under control.
