Storing GCP IAM policy bindings from many projects in a single data store


In my previous blog post, I shared details about the current inventory of projects we manage as part of owning a data platform on GCP. As I explained there, managing the resources deployed across so many projects is a humongous task for us.

One of my team’s responsibilities as the data platform owner is to provision access to the various resources deployed in the GCP projects. We use Terraform to provision the IAM policies on these resources.

In one of my earlier blog posts, I explained the process I developed for proactive monitoring of IAM policies provisioned through Terraform. There are still other challenges in managing the IAM policy bindings:

1. Review the policies when needed

To check the access granted to a particular user ID, we have to manually go to that project in the GCP console; a per-project CLI check is sketched after this list, and it is just as tedious at scale.

With 80+ projects (and growing), going to the GCP console every time is quite cumbersome.

2. Check historical information for the IAM policy bindings

It is quite difficult to query GCP logs to retrieve the historical state of the IAM policy bindings when troubleshooting access issues.
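To make the first pain point concrete, the command below is roughly what it takes to see one user's roles in one project from the CLI. The project ID and user email are placeholders, not values from our environment, and the command still has to be repeated for every one of the 80+ projects.

```sh
# Hypothetical per-project check: list the roles granted to a single user
# in a single project (project ID and email are placeholders).
gcloud projects get-iam-policy my-sample-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:jane.doe@example.com" \
  --format="table(bindings.role)"
```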

To address the challenges listed above, I built a custom process to capture the IAM policy bindings provisioned across all the GCP projects we manage and store those details in a data store (a BigQuery dataset).

The shell script does these three things (a minimal sketch follows the list):

1. Run the gcloud asset search-all-iam-policies command at the folder level.

Note: You can run the same command at the project level as well.

2. Load the output of the gcloud command into a BigQuery staging table.

3. Run a SQL statement to flatten the JSON data and load it into a curated table.
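Below is a minimal sketch of such a script, assuming the gcloud and bq CLIs are installed and authenticated, jq is available, and the Cloud Asset API is enabled. The folder ID, dataset, and table names are placeholders, and the column names in the flattening SQL (project, resource, policy.bindings.role, policy.bindings.members) assume the default shape of the search-all-iam-policies output loaded with schema auto-detection, so adjust them to whatever your staging table actually contains.

```sh
#!/usr/bin/env bash
# Sketch only: the folder ID, dataset, and table names below are placeholders.
set -euo pipefail

FOLDER_ID="123456789012"
STAGING_TABLE="iam_inventory.iam_policies_staging"
CURATED_TABLE="iam_inventory.iam_policy_bindings"
SNAPSHOT_TS="$(date -u +%Y-%m-%dT%H:%M:%SZ)"

# 1. Export IAM policies for everything under the folder.
#    (Use --scope=projects/PROJECT_ID to run the same search at project level.)
gcloud asset search-all-iam-policies \
  --scope="folders/${FOLDER_ID}" \
  --format=json > iam_policies.json

# gcloud emits a JSON array; convert it to newline-delimited JSON for bq load.
jq -c '.[]' iam_policies.json > iam_policies.ndjson

# 2. Load the raw output into the staging table, replacing the previous run.
bq load --replace --autodetect \
  --source_format=NEWLINE_DELIMITED_JSON \
  "${STAGING_TABLE}" iam_policies.ndjson

# 3. Flatten the nested bindings and append them to the curated table,
#    stamped with the snapshot time so history can be queried later.
bq query --use_legacy_sql=false \
  --destination_table="${CURATED_TABLE}" --append_table \
  "SELECT
     TIMESTAMP('${SNAPSHOT_TS}') AS snapshot_ts,
     s.project,
     s.resource,
     binding.role AS role,
     member
   FROM ${STAGING_TABLE} AS s,
        UNNEST(s.policy.bindings) AS binding,
        UNNEST(binding.members) AS member"
```

Once the curated table is populated on a schedule, both challenges above become simple queries against BigQuery instead of a project-by-project walk through the console or the logs. For example, to see where a given user has (or had) access across every project, using the same placeholder names:

```sh
# Hypothetical usage: where does this user have access, and since when?
bq query --use_legacy_sql=false \
  "SELECT snapshot_ts, project, resource, role
   FROM iam_inventory.iam_policy_bindings
   WHERE member = 'user:jane.doe@example.com'
   ORDER BY snapshot_ts DESC"
```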

Disclaimer: The posts here represent my personal views and not those of my employer or any specific vendor. Any technical advice or instructions are based on my own personal knowledge and experience.
