AWS Credentials Exposed in Python Configuration Files

Critical Risk · Secrets Exposure
aws, python, cloud-credentials, access-keys, boto3, configuration-files, iam, s3

What it is

A critical security vulnerability where AWS access keys, secret access keys, session tokens, and other cloud credentials are hardcoded in Python configuration files, source code, or accidentally exposed through version control systems. This exposure allows unauthorized access to AWS resources, potential data breaches, and can lead to significant financial costs from resource abuse.

# VULNERABLE: Django settings.py with hardcoded AWS credentials
import boto3

# settings.py
# VULNERABLE: Hardcoded AWS credentials in settings
AWS_ACCESS_KEY_ID = 'AKIAIOSFODNN7EXAMPLE'
AWS_SECRET_ACCESS_KEY = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
AWS_STORAGE_BUCKET_NAME = 'my-django-app-files'
AWS_S3_REGION_NAME = 'us-east-1'

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

# VULNERABLE: Hardcoded credentials in boto3 client
def upload_to_s3(file_obj):
    s3_client = boto3.client(
        's3',
        aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
        aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
        region_name='us-east-1'
    )
    
    s3_client.upload_fileobj(
        file_obj,
        'my-django-app-files',
        file_obj.name
    )
    return {'status': 'success'}

# VULNERABLE: Class with hardcoded credentials
class AWSUtils:
    def __init__(self):
        self.s3_client = boto3.client(
            's3',
            aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
            aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
        )
        
        self.dynamodb = boto3.resource(
            'dynamodb',
            aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
            aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
        )

# SECURE: Django settings using environment variables
import os
import boto3

# settings.py
# SECURE: AWS configuration from environment variables
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_S3_REGION_NAME = os.environ.get('AWS_S3_REGION_NAME', 'us-east-1')

# boto3 automatically discovers credentials, in order, from:
# 1. Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
# 2. The shared credentials file (~/.aws/credentials)
# 3. IAM roles for EC2/ECS/Lambda

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

# SECURE: boto3 uses default credential chain
def upload_to_s3(file_obj):
    # No hardcoded credentials - uses environment/IAM role
    s3_client = boto3.client('s3')
    
    s3_client.upload_fileobj(
        file_obj,
        os.environ['AWS_STORAGE_BUCKET_NAME'],
        file_obj.name
    )
    return {'status': 'success'}

# SECURE: Class using default credential chain
class AWSUtils:
    def __init__(self):
        # SECURE: No hardcoded credentials
        self.s3_client = boto3.client('s3')
        self.dynamodb = boto3.resource('dynamodb')

# Usage:
# export AWS_ACCESS_KEY_ID=your_key_id
# export AWS_SECRET_ACCESS_KEY=your_secret_key
# export AWS_STORAGE_BUCKET_NAME=my-bucket

# Or use IAM roles on EC2/ECS/Lambda (no keys needed)

💡 Why This Fix Works

The vulnerable Django application hardcodes AWS credentials in its settings file and in view/utility code, so the keys end up in version control and can surface in logs, stack traces, and error pages. The secure version relies on boto3's default credential chain and environment variables, keeping secrets out of source code entirely; pairing it with a logging filter that redacts anything shaped like an AWS key, and with a centralized service manager like the one in Fix 3, further limits accidental exposure.
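
As a sketch of what such a logging filter might look like, the snippet below masks strings matching the AWS access key ID format before a record reaches any handler. The class name, logger name ('myapp'), and regex are illustrative assumptions, not part of Django, boto3, or the fix itself.

import logging
import re

class AWSKeyRedactionFilter(logging.Filter):
    """Redact anything shaped like an AWS access key ID from log output."""

    # Access key IDs are 20 characters: AKIA (long-term) or ASIA (temporary) plus 16 more
    _KEY_RE = re.compile(r'\b(?:AKIA|ASIA)[A-Z0-9]{16}\b')

    def filter(self, record):
        # Format the message first so values interpolated via args are scrubbed too
        record.msg = self._KEY_RE.sub('[REDACTED-AWS-KEY]', record.getMessage())
        record.args = ()
        return True

# 'myapp' is a placeholder; records logged through this logger are scrubbed
# before they reach any handler
logging.getLogger('myapp').addFilter(AWSKeyRedactionFilter())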

Why it happens

Hardcoding AWS keys is usually a shortcut taken during development: pasting an access key and secret key directly into a settings module or a boto3 client call is faster than configuring the credential chain. Once committed, the keys persist in version control history, spread to every developer and CI system with repository access, and remain exploitable even after the offending line is later removed.
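
Because access key IDs follow a predictable format, even a tiny script can flag likely leaks before they are committed. The sketch below is a minimal, hypothetical example; a dedicated secret scanner running in CI or as a pre-commit hook is the more robust option.

# scan_for_aws_keys.py - minimal, illustrative scan for strings shaped like
# AWS access key IDs in Python files; not a substitute for a real secret scanner
import pathlib
import re
import sys

KEY_ID_RE = re.compile(r'\b(?:AKIA|ASIA)[A-Z0-9]{16}\b')

def scan(root='.'):
    findings = []
    for path in pathlib.Path(root).rglob('*.py'):
        for lineno, line in enumerate(path.read_text(errors='ignore').splitlines(), start=1):
            if KEY_ID_RE.search(line):
                findings.append(f'{path}:{lineno}: possible AWS access key ID')
    return findings

if __name__ == '__main__':
    hits = scan()
    print('\n'.join(hits) if hits else 'No AWS access key IDs found')
    sys.exit(1 if hits else 0)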

Root causes

Hardcoded AWS Credentials in Configuration Files

Developers commonly hardcode AWS access keys and secret keys in Python configuration files, settings modules, or constants files. This often occurs when setting up boto3 clients, configuring Django/Flask applications for S3 storage, or initializing AWS services. These credentials become visible to anyone with access to the codebase or deployed applications.

Preview example – PYTHON
# config/aws_settings.py - VULNERABLE configuration
AWS_CONFIG = {
    'region_name': 'us-east-1',
    'aws_access_key_id': 'AKIAIOSFODNN7EXAMPLE',
    'aws_secret_access_key': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    'aws_session_token': 'AQoEXAMPLEH4aoAH0gNCAPyJxz4BlCFFxWNE1OPTgk5TthT+FvwqnKwRcOIfrRh3c/L...',
}

# Usage in application
import boto3
from config.aws_settings import AWS_CONFIG

# VULNERABLE: Using hardcoded credentials
s3_client = boto3.client('s3', **AWS_CONFIG)

Django/Flask Settings with Embedded AWS Keys

Web frameworks like Django and Flask often contain AWS credentials in their settings files for services like S3 file storage, SES email sending, or CloudWatch logging. These settings files are typically committed to version control, making AWS credentials accessible through the repository history and to all developers with access.

Preview example – PYTHON
# Django settings.py - VULNERABLE AWS configuration
import os

# VULNERABLE: Hardcoded AWS credentials for S3 storage
AWS_ACCESS_KEY_ID = 'AKIAI44QH8DHBEXAMPLE'
AWS_SECRET_ACCESS_KEY = 'je7MtGbClwBF/2Zp9Utk/h3yCo8nvbEXAMPLEKEY'
AWS_STORAGE_BUCKET_NAME = 'my-django-app-media'
AWS_S3_REGION_NAME = 'us-west-2'

# SES configuration with hardcoded credentials
EMAIL_BACKEND = 'django_ses.SESBackend'
AWS_SES_REGION_NAME = 'us-east-1'
AWS_SES_REGION_ENDPOINT = 'email.us-east-1.amazonaws.com'

# CloudWatch logging with exposed credentials
LOGGING = {
    'handlers': {
        'cloudwatch': {
            'class': 'watchtower.CloudWatchLogsHandler',
            'log_group': 'django-app-logs',
            'aws_access_key_id': 'AKIAI44QH8DHBEXAMPLE',  # Hardcoded!
            'aws_secret_access_key': 'je7MtGbClwBF/2Zp9Utk/h3yCo8nvbEXAMPLEKEY'
        }
    }
}

Boto3 Client Initialization with Hardcoded Credentials

Python applications using boto3 for AWS integration often hardcode credentials directly in client initialization code. This is particularly common in data processing scripts, Lambda functions, or automation tools where developers embed AWS keys for convenience. The credentials become permanently visible in source code and version history.

Preview example – PYTHON
# data_processor.py - VULNERABLE boto3 usage
import boto3
import pandas as pd
from datetime import datetime

class DataProcessor:
    def __init__(self):
        # VULNERABLE: Hardcoded AWS credentials in constructor
        self.s3_client = boto3.client(
            's3',
            aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
            aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
            region_name='us-east-1'
        )
        
        self.dynamodb = boto3.resource(
            'dynamodb',
            aws_access_key_id='AKIAIOSFODNN7EXAMPLE',  # Same credentials reused
            aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
            region_name='us-east-1'
        )
        
        # More AWS services with hardcoded credentials
        self.ses_client = boto3.client(
            'ses',
            aws_access_key_id='AKIAI44QH8DHBEXAMPLE',  # Different key pair
            aws_secret_access_key='je7MtGbClwBF/2Zp9Utk/h3yCo8nvbEXAMPLEKEY',
            region_name='us-west-2'
        )
    
    def process_data(self, bucket_name, file_key):
        # Processing logic using the vulnerable clients
        response = self.s3_client.get_object(Bucket=bucket_name, Key=file_key)
        # ... rest of processing

Fixes

1

Use Environment Variables and AWS Configuration Files

Replace hardcoded AWS credentials with environment variables or AWS configuration files. Use boto3's default credential chain that automatically looks for credentials in environment variables, AWS config files, IAM roles, and other secure sources. This keeps credentials out of source code and allows different credentials for different environments.

View implementation – PYTHON
# SECURE: AWS configuration using environment variables
import boto3
import os
from botocore.exceptions import NoCredentialsError, PartialCredentialsError

class SecureAWSConfig:
    def __init__(self):
        # Validate AWS configuration at startup
        self.validate_aws_config()
        
    def validate_aws_config(self):
        """Validate that AWS credentials are properly configured"""
        required_vars = ['AWS_DEFAULT_REGION']
        missing_vars = [var for var in required_vars if not os.getenv(var)]
        
        if missing_vars:
            raise ValueError(f"Missing AWS environment variables: {missing_vars}")
        
        # Test credentials by making a simple AWS call
        try:
            sts_client = boto3.client('sts')
            sts_client.get_caller_identity()
            print("AWS credentials validated successfully")
        except (NoCredentialsError, PartialCredentialsError) as e:
            raise ValueError(f"Invalid AWS credentials: {e}")
    
    def get_s3_client(self):
        """Get S3 client using default credential chain"""
        return boto3.client('s3')  # Uses environment variables or IAM role
    
    def get_dynamodb_resource(self):
        """Get DynamoDB resource using default credential chain"""
        return boto3.resource('dynamodb')
    
    def get_ses_client(self, region_name=None):
        """Get SES client with optional region override"""
        if region_name:
            return boto3.client('ses', region_name=region_name)
        return boto3.client('ses')

# Usage with environment variables:
# export AWS_ACCESS_KEY_ID=your_access_key
# export AWS_SECRET_ACCESS_KEY=your_secret_key
# export AWS_DEFAULT_REGION=us-east-1

# Or use AWS config file (~/.aws/credentials):
# [default]
# aws_access_key_id = your_access_key
# aws_secret_access_key = your_secret_key
# region = us-east-1

2

Implement IAM Roles and Temporary Credentials

Use IAM roles for EC2 instances, ECS tasks, and Lambda functions instead of hardcoded access keys. Implement AWS STS for temporary credentials when cross-account access is needed. This provides automatic credential rotation, better security boundaries, and eliminates the need to manage long-term access keys in application code.

View implementation – PYTHON
# SECURE: Using IAM roles and temporary credentials
import boto3
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
from datetime import datetime, timezone

class SecureAWSCredentialManager:
    def __init__(self, role_arn=None, external_id=None):
        self.role_arn = role_arn
        self.external_id = external_id
        self.session = None
        
        if role_arn:
            # Use cross-account role with temporary credentials
            self._setup_assume_role_session()
        else:
            # Use default credential chain (IAM role, environment, etc.)
            self.session = boto3.Session()
    
    def _setup_assume_role_session(self):
        """Setup session with assume role for cross-account access"""
        def refresh_credentials():
            sts_client = boto3.client('sts')
            
            assume_role_kwargs = {
                'RoleArn': self.role_arn,
                'RoleSessionName': f'boto3-session-{int(datetime.now().timestamp())}',
                'DurationSeconds': 3600  # 1 hour
            }
            
            if self.external_id:
                assume_role_kwargs['ExternalId'] = self.external_id
            
            response = sts_client.assume_role(**assume_role_kwargs)
            credentials = response['Credentials']
            
            return {
                'access_key': credentials['AccessKeyId'],
                'secret_key': credentials['SecretAccessKey'],
                'token': credentials['SessionToken'],
                'expiry_time': credentials['Expiration']
            }
        
        # Create refreshable credentials
        refreshable_credentials = RefreshableCredentials.create_from_metadata(
            metadata=refresh_credentials(),
            refresh_using=refresh_credentials,
            method='assume-role'
        )
        
        # Create a session around the refreshable credentials. botocore exposes
        # no public setter, so this assigns to the session's private
        # _credentials attribute (a widely used but unofficial pattern).
        session = get_session()
        session._credentials = refreshable_credentials
        self.session = boto3.Session(botocore_session=session)
    
    def get_client(self, service_name, **kwargs):
        """Get AWS service client with secure credentials"""
        return self.session.client(service_name, **kwargs)
    
    def get_resource(self, service_name, **kwargs):
        """Get AWS service resource with secure credentials"""
        return self.session.resource(service_name, **kwargs)
    
    def get_caller_identity(self):
        """Get information about the current AWS credentials"""
        sts_client = self.get_client('sts')
        return sts_client.get_caller_identity()

# Usage examples:
# For EC2/ECS with IAM role (no explicit credentials needed)
cred_manager = SecureAWSCredentialManager()
s3_client = cred_manager.get_client('s3')

# For cross-account access with assume role
cred_manager = SecureAWSCredentialManager(
    role_arn='arn:aws:iam::123456789012:role/CrossAccountRole',
    external_id='unique-external-id'
)
ec2_client = cred_manager.get_client('ec2', region_name='us-west-2')

3

Use AWS Secrets Manager and Parameter Store

Store AWS credentials and other sensitive configuration in AWS Secrets Manager or Systems Manager Parameter Store. These services provide encrypted storage, access logging, automatic rotation, and fine-grained access controls. Retrieve secrets at runtime rather than embedding them in configuration files.

View implementation – PYTHON
# SECURE: Using AWS Secrets Manager for credential management
import boto3
import json
from botocore.exceptions import ClientError
from functools import lru_cache
import os

class AWSSecretsManager:
    def __init__(self, region_name=None):
        self.region_name = region_name or os.getenv('AWS_DEFAULT_REGION', 'us-east-1')
        self.secrets_client = boto3.client('secretsmanager', region_name=self.region_name)
        self.ssm_client = boto3.client('ssm', region_name=self.region_name)
    
    @lru_cache(maxsize=32)
    def get_secret(self, secret_name):
        """Retrieve secret from AWS Secrets Manager with caching"""
        try:
            response = self.secrets_client.get_secret_value(SecretId=secret_name)
            return json.loads(response['SecretString'])
        except ClientError as e:
            if e.response['Error']['Code'] == 'ResourceNotFoundException':
                raise ValueError(f"Secret {secret_name} not found")
            elif e.response['Error']['Code'] == 'InvalidRequestException':
                raise ValueError(f"Invalid request for secret {secret_name}")
            else:
                raise ValueError(f"Error retrieving secret {secret_name}: {e}")
    
    @lru_cache(maxsize=32)
    def get_parameter(self, parameter_name, decrypt=True):
        """Retrieve parameter from SSM Parameter Store with caching"""
        try:
            response = self.ssm_client.get_parameter(
                Name=parameter_name,
                WithDecryption=decrypt
            )
            return response['Parameter']['Value']
        except ClientError as e:
            if e.response['Error']['Code'] == 'ParameterNotFound':
                raise ValueError(f"Parameter {parameter_name} not found")
            else:
                raise ValueError(f"Error retrieving parameter {parameter_name}: {e}")
    
    def get_database_credentials(self, secret_name):
        """Get database credentials from Secrets Manager"""
        secret = self.get_secret(secret_name)
        required_keys = ['username', 'password', 'host', 'port', 'dbname']
        
        missing_keys = [key for key in required_keys if key not in secret]
        if missing_keys:
            raise ValueError(f"Missing database credential keys: {missing_keys}")
        
        return secret
    
    def get_api_credentials(self, secret_name):
        """Get API credentials from Secrets Manager"""
        secret = self.get_secret(secret_name)
        
        if 'api_key' not in secret:
            raise ValueError("API key not found in secret")
        
        return secret

class SecureAWSServiceManager:
    def __init__(self, secrets_manager=None):
        self.secrets_manager = secrets_manager or AWSSecretsManager()
        self._clients = {}
    
    def get_s3_client_with_credentials(self, credentials_secret_name):
        """Get S3 client using credentials from Secrets Manager"""
        if credentials_secret_name not in self._clients:
            credentials = self.secrets_manager.get_secret(credentials_secret_name)
            
            required_keys = ['access_key_id', 'secret_access_key']
            missing_keys = [key for key in required_keys if key not in credentials]
            if missing_keys:
                raise ValueError(f"Missing AWS credential keys: {missing_keys}")
            
            client = boto3.client(
                's3',
                aws_access_key_id=credentials['access_key_id'],
                aws_secret_access_key=credentials['secret_access_key'],
                aws_session_token=credentials.get('session_token'),
                region_name=credentials.get('region', 'us-east-1')
            )
            
            self._clients[credentials_secret_name] = client
        
        return self._clients[credentials_secret_name]
    
    def get_rds_connection_string(self, db_secret_name):
        """Get RDS connection string from Secrets Manager"""
        db_creds = self.secrets_manager.get_database_credentials(db_secret_name)
        
        return (
            f"postgresql://{db_creds['username']}:{db_creds['password']}"
            f"@{db_creds['host']}:{db_creds['port']}/{db_creds['dbname']}"
        )
    
    def get_external_api_client(self, api_secret_name):
        """Get external API client with credentials from Secrets Manager"""
        api_creds = self.secrets_manager.get_api_credentials(api_secret_name)
        
        # Example: return a configured client. ExternalAPIClient is a
        # placeholder for your own API client class, not a real library.
        return ExternalAPIClient(
            api_key=api_creds['api_key'],
            base_url=api_creds.get('base_url'),
            timeout=api_creds.get('timeout', 30)
        )

# Usage example:
# First, store credentials in AWS Secrets Manager:
# aws secretsmanager create-secret \
#   --name "prod/app/aws-credentials" \
#   --description "AWS credentials for production app" \
#   --secret-string '{"access_key_id":"AKIA...","secret_access_key":"...","region":"us-east-1"}'

# Then use in application:
secrets_manager = AWSSecretsManager()
service_manager = SecureAWSServiceManager(secrets_manager)

# Get S3 client using credentials from Secrets Manager
s3_client = service_manager.get_s3_client_with_credentials('prod/app/aws-credentials')

# Get database connection string
db_url = service_manager.get_rds_connection_string('prod/app/database-credentials')
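
To connect this back to the Django example at the top of the page, non-secret settings can also be resolved from Parameter Store when the application starts, instead of being hardcoded. A minimal sketch follows; the parameter names ('/prod/app/s3-bucket', '/prod/app/s3-region') are illustrative assumptions, and the credentials used to call SSM still come from the default chain or an IAM role.

# settings.py - sketch: resolve Django storage settings from SSM Parameter Store
import boto3

_ssm = boto3.client('ssm')  # credentials resolved via the default chain / IAM role

def _param(name, decrypt=False):
    """Fetch one parameter value; the names used below are illustrative."""
    return _ssm.get_parameter(Name=name, WithDecryption=decrypt)['Parameter']['Value']

AWS_STORAGE_BUCKET_NAME = _param('/prod/app/s3-bucket')
AWS_S3_REGION_NAME = _param('/prod/app/s3-region')
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'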

Detect This Vulnerability in Your Code

Sourcery automatically identifies AWS credentials exposed in Python configuration files and many other security issues in your codebase.