Export GCP billing data to BigQuery using DuploCloud
By exporting your Google Cloud Platform (GCP) billing data to BigQuery, you can leverage DuploCloud's dashboard to monitor and analyze your GCP billing effectively.
To export billing data to BigQuery, you must have:
A Google Cloud Platform account with billing enabled.
Permission to access the Google Cloud Billing API and BigQuery.
Billing Account Administrator permissions.
BigQuery Admin permissions.
Navigate to the BigQuery Console in your Google Cloud Platform account.
In GCP, select the Project where you want to create the dataset.
Click Create Dataset.
In the Create dataset window, configure your dataset with the following parameters:
Dataset ID: Enter a unique name for your dataset.
Location Type: Select Multi-Region.
Default table expiration: Select Enable table expiration and set a default expiration time for tables in this dataset, such as 60 days. Tables will be automatically deleted after this period.
Click Create Dataset.
Once the dataset is created, it appears in the BigQuery Console under your project. Select the dataset to view details.
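The same dataset can also be created with the bq command-line tool. The project ID (my-billing-project) and dataset name (billing_export) below are placeholders; the 60-day expiration mirrors the example above.

```shell
# Create a multi-region dataset for billing export.
# --default_table_expiration is in seconds: 60 days = 60 * 24 * 3600.
bq mk \
  --dataset \
  --location=US \
  --default_table_expiration=$((60 * 24 * 3600)) \
  my-billing-project:billing_export
```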
In GCP, open the Google Cloud Console.
Select Billing from the main menu or visit Google Cloud Billing.
Select the billing account for which you want to enable the billing export.
In the Billing Account Details page, select Billing Export from the left navigation pane.
In the Billing Export page, in the Detailed usage cost area, click Edit Settings.
In the BigQuery Export tab, configure Detailed usage cost.
Select the Project: Choose the project where you created the BigQuery dataset.
Select the Dataset: Choose the dataset you created for billing data.
Click Save.
Contact DuploCloud Support to complete additional steps to enable the billing dashboard.
The exported billing data includes detailed information about your GCP usage and charges. Regularly monitor and analyze this data to keep track of your cloud spending.
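Once the export begins populating the dataset, you can query it directly. The sketch below totals the current month's cost per service; it assumes Google's standard detailed-export table naming (gcp_billing_export_resource_v1_ followed by your billing account ID with dashes replaced by underscores), and the project, dataset, and account ID shown are placeholders.

```shell
# Example: total cost per service for the current invoice month.
bq query --use_legacy_sql=false '
SELECT
  service.description AS service,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-billing-project.billing_export.gcp_billing_export_resource_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE invoice.month = FORMAT_DATE("%Y%m", CURRENT_DATE())
GROUP BY service
ORDER BY total_cost DESC'
```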
Manage costs for resources
Usage costs for resources can be viewed and managed in the DuploCloud Portal, by month or week, and by Tenant. You can also explore historical resource costs.
To view the Billing page for GCP in the DuploCloud Portal, click Administrator -> Billing.
You can view usage by:
Time
Select the Spend by Month tab and click More Details to display monthly and weekly spending options.
Tenant
Select the Spend by Tenant tab.
In Google Cloud Platform (GCP), billing data can be exported to a BigQuery dataset in only one project. However, when deploying instances of an application across multiple projects (e.g., dev, qa, stg, prod), it is necessary to replicate the billing dataset to enable billing monitoring on all DuploCloud dashboards in these projects. This documentation outlines the steps to configure automated replication of a BigQuery dataset from a source project to a destination project.
NOTE: This documentation is an extension of Export Billing to BigQuery.
Two GCP projects: a source project where the original billing dataset resides, and a destination project where the dataset will be replicated.
Appropriate permissions to create datasets and data transfer jobs in BigQuery.
Google Cloud SDK installed and initialized.
Enable the BigQuery Data Transfer API from APIs & Services in the destination GCP project.
Source Project: The GCP project where the original billing dataset resides with billing export enabled.
Destination Project: The new GCP project where duplo-master is running and where the replicated dataset needs to be created.
Open the BigQuery console in the destination project.
Click on CREATE DATASET.
Enter the dataset ID, choose a data location, and set the other options to match the source billing dataset.
Click Create dataset.
For the replication to work, you need to grant specific roles on the source project's dataset to the duplo-master GCP service account of the destination project. The following roles are needed:
BigQuery Admin
BigQuery Data Viewer
BigQuery Data Editor
BigQuery User
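As an alternative to sharing the dataset in the console, the roles can be granted at the project level with gcloud (a broader scope than dataset-level sharing). The service account email and source project ID below are placeholders for your actual duplo-master account and project.

```shell
# Grant the destination project's duplo-master service account the
# required BigQuery roles on the SOURCE project.
SA="duplo-master@dest-project-id.iam.gserviceaccount.com"
for ROLE in roles/bigquery.admin roles/bigquery.dataViewer \
            roles/bigquery.dataEditor roles/bigquery.user; do
  gcloud projects add-iam-policy-binding source-project-id \
    --member="serviceAccount:${SA}" \
    --role="${ROLE}"
done
```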
Open the BigQuery console in the destination project.
In the left-hand menu, click on Data Transfers.
Click on CREATE TRANSFER.
Set the Source type to Dataset Copy.
Schedule options: Choose Start now. Set the frequency option to every 12 hours.
Under the Destination Settings:
Dataset: select the destination project dataset.
Source Dataset: enter the source project dataset.
Source Project: enter the source project ID.
Select the Overwrite destination table checkbox.
Under Service Account, select the destination project's duplo-master service account (which has permission to access the source project dataset).
Click SAVE.
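The same transfer can be created from the CLI with bq mk --transfer_config. The project IDs, dataset names, display name, and service account below are placeholders matching the console steps above.

```shell
# Create the dataset-copy transfer, run from the destination project.
bq mk --transfer_config \
  --project_id=dest-project-id \
  --data_source=cross_region_copy \
  --target_dataset=billing_export \
  --display_name="billing-dataset-copy" \
  --schedule="every 12 hours" \
  --service_account_name=duplo-master@dest-project-id.iam.gserviceaccount.com \
  --params='{
    "source_project_id": "source-project-id",
    "source_dataset_id": "billing_export",
    "overwrite_destination_table": "true"
  }'
```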
In the BigQuery console of the destination project, go to the Transfers tab.
You should see your transfer job listed. You can click on it to view details and monitor its progress.
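Transfer configurations can also be listed from the CLI; the project ID is a placeholder, and the transfer location must match the dataset location (US in this sketch).

```shell
# List transfer configurations in the destination project.
bq ls --transfer_config \
  --project_id=dest-project-id \
  --transfer_location=US
```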
By following these steps, you can set up automated replication of a BigQuery dataset from one GCP project to another, enabling billing monitoring on all DuploCloud dashboards across multiple projects. Monitor the transfer job periodically to confirm it is running as expected.