Prerequisites

The following prerequisites are required to connect your Google billing data to Flexera One:

1. Enabling and Configuring Detailed Billing Export to BigQuery in Google Cloud Console 
2. Locating Your Google Service Account 
3. Creating a Sub-Folder in Google Cloud Storage Bucket 

Enabling and Configuring Detailed Billing Export to BigQuery in Google Cloud Console

Note: If you have already configured billing export to BigQuery, ensure you have the Dataset ID and the project that contains the dataset, and proceed to the next step.

Important: To enable and configure the export of Google Cloud Billing detailed usage cost data to a BigQuery dataset, you require the following permissions:

The Billing Account Administrator role or the Billing Account Costs Manager role on the target Cloud Billing account. For more information, see the Google Cloud documentation topic, Cloud Billing access control and permissions.
The BigQuery User role for the Google Cloud project that contains the BigQuery dataset to be used to store the Cloud Billing data. For more information, see the Google Cloud documentation topic, Control access to resources with IAM.

You might also need project-level permissions. For example, if you are a member of a Google Cloud Organization and want to create a new project, you need the Project Creator role on the Organization or Folder. For more information, see the Google Cloud documentation topic, IAM roles and permissions index.

Flexera One consumes Google billing data via the BigQuery export method in Google. For more information, see the Google Cloud documentation topic, Set up Cloud Billing data export to BigQuery.

To enable and configure detailed billing export to BigQuery in Google Cloud Console:

1. In the Google Cloud Console, choose a Google Cloud project to contain your BigQuery dataset that stores the exported billing data.
2. In the left pane, click Cost management > Billing export. The Billing export page opens.
3. In the Detailed Usage Cost section, select the Enable export option.
4. Select the project and create a new BigQuery dataset with the following settings:
a. Enter a Dataset ID.
b. For Data location, select Multi region.
c. Clear the Enable table expiration checkbox.
d. For the Encryption setting, choose Google-owned and Google-managed encryption key.
5. Click Create Dataset.
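
If you prefer to script this step, the following minimal sketch creates an equivalent dataset with the BigQuery Python client library. The project ID (my-billing-project), dataset ID (billing_export), and the US multi-region location are placeholder assumptions; substitute your own values.

# A minimal sketch of the dataset settings above, using the
# google-cloud-bigquery client library. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-billing-project")

dataset = bigquery.Dataset("my-billing-project.billing_export")
dataset.location = "US"                     # a multi-region data location
dataset.default_table_expiration_ms = None  # table expiration disabled

client.create_dataset(dataset, exists_ok=True)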

Note: Consider the following:

After you enable Cloud Billing export to BigQuery, Google automatically creates a service account as an owner of the BigQuery dataset that you specify for the export. For example, billing-export-bigquery@system.gserviceaccount.com. This service account is only used by Google to write billing data into your BigQuery dataset.
Do not delete this service account. Deleting it stops data from being exported into your BigQuery dataset, and prevents backfilling the missing data.
To view a breakdown of Google Kubernetes Engine (GKE) cluster costs in a detailed data export, you must also enable cost allocation for GKE. For more information, see the Google Cloud documentation topic, Get key spending insights for your GKE resource allocation and cluster costs.
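
After the export is enabled, it can take some time for the first data to arrive. One way to confirm the export is flowing is to list the tables in the dataset; detailed usage cost exports are written to a table named gcp_billing_export_resource_v1_<BILLING_ACCOUNT_ID>. A minimal sketch, assuming the google-cloud-bigquery library and placeholder project and dataset IDs:

# Sketch: list the tables in the export dataset to confirm the detailed
# export table (gcp_billing_export_resource_v1_...) has been created.
# "my-billing-project" and "billing_export" are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-billing-project")
for table in client.list_tables("billing_export"):
    print(table.table_id)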

Locating Your Google Service Account

Once your billing data is exported to BigQuery, Flexera One needs read access to the BigQuery dataset. The Flexera One platform uses a Google service account to access the dataset.

Create a new service account or use an existing one to access the BigQuery dataset. You must have the JSON private key for the service account as well as the service account ID. For information on creating a service account, see the Google Cloud documentation topic, Create service accounts.

To locate your Google service account:

1. In the Google Cloud Console, go to the Service Accounts page for your project.
2. Do one of the following:
If you do not already have a service account, create your own service account. For information on creating a service account, see the Google Cloud documentation topic, Create service accounts.
If you have an existing service account, from the list of service accounts, select your service account.
3. Click the Permissions tab and run the policy checker to check if your service account has access to the BigQuery dataset and table. For more information, see Roles Required for the Service Account.
4. Click the Keys tab to open the Keys page.

Note: Before you create a service account key, make sure that the iam.disableServiceAccountKeyCreation organization policy constraint isn't enforced for your project. If this constraint is enforced, you cannot create service account keys in that project.

5. Click the Add key dropdown list and select Create new key. For more information about permissions to create the service account keys, see Roles and Permissions Required to Create the Service Account Keys.
6. Select JSON as the Key type and click Create to create the service account key.

The downloaded key has the following format, where PRIVATE_KEY is the private portion of the public/private key pair:

{
    "type": "service_account",
    "project_id": "PROJECT_ID",
    "private_key_id": "KEY_ID",
    "private_key": "-----BEGIN PRIVATE KEY-----\nPRIVATE_KEY\n-----END PRIVATE KEY-----\n",
    "client_email": "SERVICE_ACCOUNT_EMAIL",
    "client_id": "CLIENT_ID",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/SERVICE_ACCOUNT_EMAIL"
}

Note: The Project ID identifies the project associated with the detailed export table. If you keep all service accounts in a centralized project, change the Project ID to the project containing the export before entering the details.
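
To confirm the downloaded key works end to end, you can build a BigQuery client from the key file and run a trivial query. A minimal sketch, assuming the google-cloud-bigquery library; the key path and project ID are placeholders:

# Sketch: authenticate with the downloaded JSON key and run a smoke-test
# query. "service-account-key.json" and "my-billing-project" are placeholders.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json(
    "service-account-key.json",
    project="my-billing-project",  # the project that contains the export dataset
)
print(list(client.query("SELECT 1 AS ok").result()))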

Roles and Permissions Required to Create the Service Account Keys

You require the following roles and permissions to create the service account keys in a project:

Organization Policy Administrator (roles/orgpolicy.policyAdmin)
Organization Viewer (roles/resourcemanager.organizationViewer)
Tag Administrator (roles/resourcemanager.tagAdmin)

Note: For more information about permissions, see the Google Cloud documentation topic, IAM roles and permissions index.

Roles Required for the Service Account

In Google Cloud Platform, grant project-level IAM roles to the service account within the project that contains the billing BigQuery dataset. The following roles are required:

BigQuery Data Viewer—To read data from the specified dataset.
BigQuery Job User—To create query jobs, allowing access to the results generated from those queries.
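
As a sanity check that both roles are in place, the service account should be able to create a query job against the export table and read the results. A minimal sketch, assuming the google-cloud-bigquery library; the key path, project, dataset, and billing account IDs are placeholders:

# Sketch: a query that exercises both roles. BigQuery Job User allows the
# service account to create the query job; BigQuery Data Viewer allows it
# to read the export table. All identifiers below are placeholders.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("service-account-key.json")
query = """
    SELECT invoice.month AS month, ROUND(SUM(cost), 2) AS total_cost
    FROM `my-billing-project.billing_export.gcp_billing_export_resource_v1_XXXXXX_XXXXXX_XXXXXX`
    GROUP BY month
    ORDER BY month
"""
for row in client.query(query).result():
    print(row.month, row.total_cost)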

Creating a Sub-Folder in Google Cloud Storage Bucket

Flexera One consumes Google billing data via the BigQuery export method, temporarily transferring all cloud billing data to your Google Cloud Storage (GCS) bucket.

To create a sub-folder in a GCS bucket:

1. In the Google Cloud Console, go to the GCS bucket.
2. Create a folder or sub-folder within your GCS bucket.

Note: For information on creating a bucket and a sub-folder, see the Google Cloud documentation topics, Create a bucket and Create and manage folders.

3. Select the folder you want to manage, click the three-dot icon, and select Edit access. The Enable folder management? dialog box appears.
4. Click Enable to manage folder-level access via IAM.
5. In the Add Principal section, enter the service account that has access to BigQuery.
6. In the Assign roles section, select the Storage Object User role and click Save.

Note: The Storage Object User role is required to access a specific folder within the Google Cloud Storage bucket.

7. To copy the path of the folder, click Objects in the upper-left corner of the page, select the sub-folder you created, and click the Copy button. Add the gs:// prefix to form the complete GCS export URI path:

gs://{bucket_name}/{folder_name}

This URI is used by Flexera One to locate and ingest the exported billing data.
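
You can verify the folder-level grant by writing and deleting a small test object as the service account. A minimal sketch, assuming the google-cloud-storage library; the bucket name, folder name, and key path are placeholders:

# Sketch: confirm the Storage Object User grant on the folder. In GCS,
# "folders" are object-name prefixes, so writing an object under the
# prefix is the real test. All names below are placeholders.
from google.cloud import storage

client = storage.Client.from_service_account_json("service-account-key.json")
bucket = client.bucket("my-export-bucket")

blob = bucket.blob("flexera-exports/connectivity-check.txt")
blob.upload_from_string("ok")  # requires object create permission on the folder
blob.delete()                  # clean up the test object
print("Export URI: gs://my-export-bucket/flexera-exports/")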

Note: Over time, the accumulated export data can become excessive and unnecessary. You can set a lifecycle policy to automatically clean up files created more than 24 hours ago. For information on cleaning up files, see the Google Cloud Community blog, GCS Lifecycle Policies with Prefix & Suffix.
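
A minimal sketch of such a cleanup rule, assuming the google-cloud-storage library (version 2.x or later for matches_prefix support); the bucket name and folder prefix are placeholders:

# Sketch: lifecycle rule that deletes exported files after one day, the
# smallest age granularity GCS lifecycle rules support. Assumes
# google-cloud-storage 2.x+ for matches_prefix; names are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-export-bucket")
bucket.add_lifecycle_delete_rule(age=1, matches_prefix=["flexera-exports/"])
bucket.patch()  # persist the updated lifecycle configuration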