1. Installation Instructions & Folder Setup
If you want to use Docker to run the tool, you can use the existing Dockerfile to build a container with the tool and all dependencies installed. On startup it will drop you into a venv inside the container, from which you can run "python3 main.py". Note that because it is Docker, your data will be wiped when you exit the container unless you mount volumes with -v (see the example after the build commands below).
# From gcpwn base directory
docker build -t gcpwn .
docker run -it gcpwn
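If you want enumeration data to survive the container, mount a host directory over the tool's output location with -v. The in-container path below is only illustrative (it is not taken from the Dockerfile), so adjust it to wherever the image actually places the repo:
# Keep GatheredData on the host so it survives container exits (in-container path is a placeholder)
docker run -it -v "$(pwd)/gcpwn_data:/opt/gcpwn/GatheredData" gcpwn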
Note that I cannot guarantee support for other OS types or deviations from the instructions below, but feel free to file issues if any major problems arise.
Supported OS: Kali Linux 6.6.9
Python Version: Python 3.11.8
Installation Instructions:
- Set up a virtual environment as shown below. This is not required, but I ran into fewer dependency issues this way
- Clone the code from the official NetSPI GitHub organization; maybe check out some other cool repositories while you're there ;)
- Run the setup script. If you would rather do the same steps manually, you just need to install the gcloud CLI tool and pip install the libraries in the requirements.txt file.
- Start the tool via python3 main.py. If this is your first time, the tool will ask you to create a workspace. A workspace is purely a logical container; you can pass in whatever name you want. See the subsequent wiki sections on adding authentication/running modules.
# Setup a virtual environment
python3 -m venv ./myenv
source myenv/bin/activate
# Clone the tool
git clone https://github.com/NetSPI/gcpwn.git
# Run setup.sh; this installs the gcloud CLI tool and runs pip3 install -r requirements.txt (you can do those steps manually if you prefer)
chmod +x setup.sh; ./setup.sh
# Launch the tool after all items installed & create first workspace
python3 main.py
[*] No workspaces were detected.
New workspace name: my_workspace
[*] Workspace 'my_workspace' created.
Welcome to your workspace! Type 'help' or '?' to see available commands.
[*] Listing existing credentials...
Submit the name or index of an existing credential from above, or add NEW credentials via Application Default
Credentials (adc - google.auth.default()), a file pointing to adc credentials, a standalone OAuth2 Token,
or Service credentials. See wiki for details on each. To proceed with no credentials just hit ENTER and submit
an empty string.
[1] *adc <credential_name> [tokeninfo] (ex. adc mydefaultcreds [tokeninfo])
[2] *adc-file <credential_name> <filepath> [tokeninfo] (ex. adc-file mydefaultcreds /tmp/name2.json)
[3] *oauth2 <credential_name> <token_value> [tokeninfo] (ex. oauth2 mydefaultcreds ya[TRUNCATED]i3jJK)
[4] service <credential_name> <filepath_to_service_creds> (ex. service mydefaultcreds /tmp/name2.json)
*To get scope and/or email info for Oauth2 tokens (options 1-3) include a third argument of
"tokeninfo" to send the tokens to Google's official oauth2 endpoint to get back scope.
tokeninfo will set the credential name for oauth2, otherwise credential name will be used.
Advised for best results. See https://cloud.google.com/docs/authentication/token-types#access-contents.
Using tokeninfo will add scope/email to your references if not auto-picked up.
Input:
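As background on the tokeninfo flag shown in the prompt above: it sends the token to Google's public tokeninfo endpoint to recover the token's scopes (and email, when an email scope is present). A rough manual equivalent, using a placeholder token value, looks like this:
# Ask Google's tokeninfo endpoint what an OAuth2 access token can do (token value is a placeholder)
curl "https://oauth2.googleapis.com/tokeninfo?access_token=ya29.PLACEHOLDER_TOKEN"
# The JSON response includes fields such as scope and expires_in; see the Google docs link above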
Two folders, "GatheredData" and "LoggedActions", are auto-created and populated as you run the tool:
- GatheredData/[workspace_id]/* - This folder contains all downloaded data from modules along with any IAM analysis reports for permissions/roles. For example, running modules run enum_buckets --download will try to download blobs to this folder, and running modules run process_iam_bindings will write its summary reports here if --csv or --txt is specified (see the examples after this list).
- LoggedActions/[workspace_id]/* - This folder timestamps when modules start and end so you can better compare the actions taken against your own logs during blue team exercises
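For reference, the module invocations mentioned above are entered at the gcpwn prompt; the flags shown control what gets written under GatheredData/[workspace_id]/:
# Entered at the gcpwn prompt after selecting credentials.
# Downloaded blobs land under GatheredData/[workspace_id]/:
modules run enum_buckets --download
# IAM summary report written to the same folder (use --txt instead for a plain-text report):
modules run process_iam_bindings --csv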
Internal databases store the tool's information. You don't need to know the details below, but if you're interested:
- databases/* - 3 databases are created in the databases folder:
- workspaces.db - Smallest database, just contains the workspace name + the integer ID assigned to it
- session.db - Contains your session credentials in a JSON serialized form. Includes individual permission actions in a JSON data structure for the given credname
- service_info.db - Contains all the object attributes as you enumerate data. When querying data from the tool, this is the database you are interacting with. If you want to get resource information manually via sqlite3, this is the database to point to.
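If you do want to poke at the data by hand, the sqlite3 command-line client works fine; the table names aren't documented here, so start by listing the schema (paths are relative to the gcpwn base directory):
# List the tables the enumeration modules have created
sqlite3 databases/service_info.db ".tables"
# Dump each table's columns so you know what to SELECT
sqlite3 databases/service_info.db ".schema"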