From 4de7583881ce1d443f8eba293e0a273397cae632 Mon Sep 17 00:00:00 2001
From: "n_young@uncg.edu" <n_young@uncg.edu>
Date: Wed, 21 Oct 2020 10:30:26 -0400
Subject: [PATCH] more readme formatting and expanding on explanations

---
 README.md | 95 +++++++++++++++++++++++++++++++------------------------
 1 file changed, 53 insertions(+), 42 deletions(-)

diff --git a/README.md b/README.md
index 217af62..a4933da 100644
--- a/README.md
+++ b/README.md
@@ -2,28 +2,25 @@
 This repo will let you audit all the IAM settings on projects in the GCP organization.
 
-KUDOS
-- This approach was built on top of the excellent work of colleagues at other universities, including UNC Charlotte and Indiana University.
+## KUDOS ##
+This approach was built on top of the excellent work of colleagues at other universities, including UNC Charlotte and Indiana University.
 
-## SETUP ##
+# INSTRUCTIONS #
 
-### Google Compute Environment ###
+## CREATE VM ON GCE ##
 
-Create the a Virtual Machine (VM) in the Google Compute Engine dashboard.
+Create a Virtual Machine (VM) in the Google Compute Engine dashboard.
 
-**NOTE:** You are allowed [1 free F1-micro instance per month](https://cloud.google.com/free/) in your Google environment.
-
-If you don't see the ability to create an F1-micro instance from the dashboard, you can use the following example command in Cloud Shell to create one:
+**NOTE:** You are allowed [1 free F1-micro instance per month](https://cloud.google.com/free/) in your Google environment. If you don't see the ability to create an F1-micro instance from the dashboard, you can use the following example command in Cloud Shell to create one (substituting your own instance name):
 
 ```gcloud compute instances create INSTANCE_NAME --machine-type=f1-micro --zone=us-east1-b```
 
 Once the VM instance has been created, stop the VM instance and change the following setting:
 
-Cloud API access scopes
-Allow full access to all Cloud APIs
-
-*Hint: copy your service account information for later use*
+    Cloud API access scopes - Allow full access to all Cloud APIs
+    *Hint: copy your service account information for later use*
 
-Start the VM instance back up, and enter into the SSH terminal for the machine. Run the following commands to prepare the environment for the repo:
+Start the VM instance and SSH into it. Run the following commands to prepare the environment for the repo:
 
 Install git:
 ```sudo apt-get install git```
@@ -34,57 +31,71 @@ Install pip3:
 Install pandas:
 ```sudo pip3 install pandas```
 
+## CONFIGURE IAM ROLES ##
+The service account running the VM will need rights to query the organization, folders, and projects for their IAM policies.
+Create a new role under the main organization (at the root level) with the following permissions:
 
-### IAM Role ###
+    orgpolicy.policy.get
+    resourcemanager.folders.get
+    resourcemanager.folders.getIamPolicy
+    resourcemanager.folders.list
+    resourcemanager.projects.get
+    resourcemanager.projects.getIamPolicy
+    resourcemanager.projects.list
 
-The service account running the machine will need to have rights to query the organization, folders, and projects for the IAM policies.
+Once the role has been created, assign it to the VM instance's service account.
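+
+As a command-line alternative, the custom role can also be created and bound from Cloud Shell. The commands below are only a sketch: the role ID (iam_audit_viewer) is a made-up example, and you will need to substitute your own organization ID and the VM's service account email.
+
+```gcloud iam roles create iam_audit_viewer --organization=YOUR_ORGANIZATION_ID --permissions=orgpolicy.policy.get,resourcemanager.folders.get,resourcemanager.folders.getIamPolicy,resourcemanager.folders.list,resourcemanager.projects.get,resourcemanager.projects.getIamPolicy,resourcemanager.projects.list```
+
+```gcloud organizations add-iam-policy-binding YOUR_ORGANIZATION_ID --member="serviceAccount:YOUR_VM_SERVICE_ACCOUNT" --role="organizations/YOUR_ORGANIZATION_ID/roles/iam_audit_viewer"```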
-Create a role under the main organization (at the root level) with the following permissions:
-orgpolicy.policy.get
-resourcemanager.folders.get
-resourcemanager.folders.getIamPolicy
-resourcemanager.folders.list
-resourcemanager.projects.get
-resourcemanager.projects.getIamPolicy
-resourcemanager.projects.list
+
+## CREATE BIGQUERY TABLE ##
 
-Once the role has been created, add the VM instance's service account to the role.
+Create a new BigQuery table for this process to write its results to.
+Note: If you are using separate projects for BigQuery and Compute Engine, you may need to grant the service account permission to create jobs and insert data into the table. If you prefer the command line, see the appendix at the end of this README for example bq and gcloud commands.
 
-### Create BigQuery Table ###
+## COPY SCRIPTS TO GCE VM ##
 
-Create a new BigQuery table for this process to dump information to.
+SSH into your VM instance and clone the repo to the machine:
 
-Note: If you are using separate projects for BigQuery and Compute Engine, you may need to allow the service account permissions to create jobs and insert data into the table.
+```git clone https://github.internet2.edu/nyoung/gcp-gce-project-audit-bq.git```
 
-### Install & Configure ###
+Copy settings.default to settings.py and edit the file using your editor of choice (if using Compute Engine, vi / vim / nano come preinstalled on some machines). Example:
 
-Using the SSH terminal to the VM instance, clone the repo to the machine.
+```cd path/to/repo && cp settings.default settings.py && vi settings.py```
 
-Copy settings.default to settings.py and edit the file using your editor of choice (if using Compute Engine, vi / vim / nano come preinstalled on some machines).
+Enter your organization ID, Apps Script folder ID, and any project IDs you want to exclude. Example:
 
-Enter your organization ID, app script folder id, and any project IDs you may want to exclude.
+
+```
+ORGANIZATION_ID = '188811122222'
+APPS_SCRIPT_FOLDER_ID = '555274895555'
+EXCLUDED_PROJECTS = ['my-special-project-id', 'other-special-project-id']
+```
 
 Next, edit run_audit.sh and set the TABLE variable to your BigQuery table URI.
 
+```vi run_audit.sh```
+
+## TIME TO AUTOMATE! ##
+
+Ensure that run_audit.sh is executable. Example:
+
+```chmod +x run_audit.sh```
 
-### Schedule The Audit ###
+Use crontab (or your favorite scheduler) to run the script on your desired schedule. For example, open your crontab:
 
-You will need to change the execution permission on the run_audit.sh script to allowed (chmod +x run_audit.sh).
+```crontab -e```
 
-Use crontab (or your favorite scheduler) to execute the script on your desired schedule.
+Then add an entry like the following to run the audit weekly (at midnight every Sunday):
+
+```0 0 * * 0 /path-to-repo/run_audit.sh```
 
-### Who do I talk to? ###
+## Who do I talk to? ##
 
-* Nick Young
-* Enterprise Analytics Architect
-* University of North Carolina at Greensboro
-* nickyoung@uncg.edu
+Nick Young
+Enterprise Analytics Architect
+University of North Carolina at Greensboro
+nickyoung@uncg.edu
 
-* Tim Watts
-* Integrations Specialist
-* University of North Carolina at Greensboro
-* timwatts@uncg.edu
\ No newline at end of file
+Tim Watts
+Integrations Specialist
+University of North Carolina at Greensboro
+timwatts@uncg.edu
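+
+## APPENDIX: EXAMPLE BIGQUERY SETUP COMMANDS ##
+
+The commands below are only a sketch of how the BigQuery side could be set up from Cloud Shell. The dataset name (iam_audit), table name (project_iam), and the YOUR_* placeholders are illustrative; substitute your own project ID and the VM's service account email, and adjust the table schema to match whatever run_audit.sh loads.
+
+```bq mk --dataset YOUR_BQ_PROJECT_ID:iam_audit```
+
+```bq mk --table YOUR_BQ_PROJECT_ID:iam_audit.project_iam```
+
+If BigQuery lives in a different project than the VM, grant the VM's service account the ability to insert data and run jobs in that project:
+
+```gcloud projects add-iam-policy-binding YOUR_BQ_PROJECT_ID --member="serviceAccount:YOUR_VM_SERVICE_ACCOUNT" --role="roles/bigquery.dataEditor"```
+
+```gcloud projects add-iam-policy-binding YOUR_BQ_PROJECT_ID --member="serviceAccount:YOUR_VM_SERVICE_ACCOUNT" --role="roles/bigquery.jobUser"```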