Azure Usage Detail Export Function

A simple function to pull usage details out of an Azure enrollment and put them somewhere more useful.

Prerequisites

  1. Google Cloud Project (BigQuery, Secrets Manager, Cloud Scheduler, Cloud Functions)
  2. Azure AD Service Principal/App Registration
  3. Enterprise Enrollment Access

Setup


Assign the Enrollment Reader role in your enrollment to the service principal; steps to complete this can be found in the Azure documentation. The GUID representing the role remains consistent across enrollments; only the billing profile ID, a.k.a. the enrollment number, will change. To successfully complete the role assignment, use the Object ID of the App Registration, not the Application ID.
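Once the role is assigned, the function can query the Usage Details API for the enrollment with a bearer token acquired for the service principal. The sketch below only builds the request URL and headers; the endpoint path and api-version are assumptions based on the Azure Consumption API and may differ from what the function actually uses.

```python
# Hypothetical sketch of the Usage Details request for an EA enrollment.
# The endpoint shape and api-version are assumptions; check the Azure
# Consumption API documentation for current values.
AZURE_MGMT = "https://management.azure.com"

def usage_details_url(enrollment: str, api_version: str = "2021-10-01") -> str:
    """Build the Usage Details URL for an EA billing account (enrollment)."""
    return (
        f"{AZURE_MGMT}/providers/Microsoft.Billing/billingAccounts/{enrollment}"
        f"/providers/Microsoft.Consumption/usageDetails"
        f"?api-version={api_version}"
    )

def auth_headers(token: str) -> dict:
    """Authorization header for a token obtained for the service principal
    (e.g. via MSAL's client-credentials flow)."""
    return {"Authorization": f"Bearer {token}"}

print(usage_details_url("12345678"))
```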

A helper script is included to assist with creating the BigQuery table that will receive the data. The table is date partitioned to reduce query sizes and improve query performance. A view is also provided to simplify queries related to general billing questions, e.g. how much was spent on reserved instances this year. The view provides much the same information as the EA portal, but in a much more queryable fashion. Clone this repository to a cloud shell, and install the required Python modules.

pip install -r requirements.txt

Rename config.sample to config.py and edit the values to match your App ID/Client ID and Tenant. Optionally, you may change the Dataset and Table names to more appropriate values.
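A config.py along these lines is what the step above produces. The key names below are illustrative only; config.sample in the repository defines the authoritative keys.

```python
# Hypothetical config.py layout -- the actual key names come from
# config.sample in this repository; these are illustrative placeholders.
CLIENT_ID = "00000000-0000-0000-0000-000000000000"  # App ID / Client ID of the App Registration
TENANT_ID = "contoso.onmicrosoft.com"               # Azure AD tenant
DATASET = "azure_billing"                           # BigQuery dataset name (optional to change)
TABLE = "usage_details"                             # BigQuery table name (optional to change)
```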
Use the gcloud command line to confirm or change the current project for your session.

gcloud config get-value project

If you need to change projects:

gcloud config set project [projectId]

Run the table creation script. The script is idempotent; re-running it causes no issues.

python table.py

Create Service Account

gcloud iam service-accounts create [service-account-name] \
  --display-name="Azure Usage Detail Export Function Service Account"

Store the Azure client secret in Secret Manager and grant the Secret Accessor role to the service account

printf '[ClientSecretValue]' | gcloud secrets create [ClientSecretName] --data-file=-

gcloud secrets add-iam-policy-binding [ClientSecretName] \
 --member='serviceAccount:[service-account-name]@[projectId].iam.gserviceaccount.com' \
 --role='roles/secretmanager.secretAccessor'

Edit cloudbuild.yaml to set the secret reference and the service account

- --set-secrets=CLIENT_SECRET_KEY=[ClientSecretName]:latest

- --service-account=[service-account-name]@[projectId].iam.gserviceaccount.com

Deploy the function using Cloud Build

gcloud builds submit --config=cloudbuild.yaml .

Grant the service account the BigQuery Data Editor role on the table

bq add-iam-policy-binding \
  --member='serviceAccount:[service-account-name]@[projectId].iam.gserviceaccount.com' \
  --role='roles/bigquery.dataEditor' \
  [projectId]:[dataset].[table]

Define the schedule for execution. The job can be repeated for each enrollment you have, changing the enrollment value in the message body. The example schedule here runs at 08:00 on the fifth day of each month.

gcloud scheduler jobs create http [name] --location=us-central1 --schedule="0 8 5 * *" --message-body='{"enrollment":[EnrollmentNumber], "load_type":"monthly"}' --uri=[httpsTrigger:url]
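A "monthly" load triggered on the fifth implies computing the previous month's billing period at run time. The helper name below is hypothetical, not taken from the repository, but the date arithmetic sketches the idea:

```python
from datetime import date, timedelta

def previous_month_window(run_date: date) -> tuple[date, date]:
    """Return (start, end) of the calendar month before run_date.
    Running on the 5th, per the schedule above, yields the full prior month."""
    first_of_this_month = run_date.replace(day=1)
    end = first_of_this_month - timedelta(days=1)  # last day of previous month
    start = end.replace(day=1)                     # first day of previous month
    return start, end

print(previous_month_window(date(2021, 12, 5)))
# (datetime.date(2021, 11, 1), datetime.date(2021, 11, 30))
```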

Optional Next Steps

  1. Commit changes to git and push to a new repository. Configure a Cloud Build trigger and deployment pipeline.
  2. Repeat the scheduled execution for additional enrollments.

Known Limitations

Azure data accuracy

In testing, the option to load data on a daily or weekly basis proved unstable. Data was brought in, but it was either duplicated or missing entries compared to the EA portal or Cost Management in the standard Azure portal, even up to a few days after the end of the month. Data retrieved on the fifth day of the month for the previous month has been found to be accurate.

Tag data

Tags are currently dropped from the data set. While it is feasible to include them in the BigQuery data set, they would need to be sanitized to ensure the keys and values are allowable.
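BigQuery column names must consist of letters, digits, and underscores and must not start with a digit, while Azure tag keys can contain almost anything. One possible sanitizer (not part of this repository) could look like:

```python
import re

def sanitize_tag_key(key: str) -> str:
    """Map an arbitrary Azure tag key to a legal BigQuery column name:
    letters, digits, and underscores only, not starting with a digit."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", key)
    if not cleaned or cleaned[0].isdigit():
        cleaned = "_" + cleaned
    return cleaned

print(sanitize_tag_key("cost-center"))  # cost_center
print(sanitize_tag_key("1env"))         # _1env
```

Tag values have no such naming constraint and could be stored as-is, though deduplicating keys that collide after sanitization would still need handling.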

Contributing

Contributions are welcome, especially for known issues, bugs, or documentation improvements. Please understand that this code is sufficient for our needs; that does not mean it is perfect, the correct way, or the only way to collect this information. Sending an email will not guarantee a response; please use the issues functionality if you have problems.
