AWS S3 Client

An asynchronous, thread-safe singleton client for interacting with AWS S3 storage, implementing the StorageClient interface. Built on top of aioboto3 for efficient async operations.

Note

This client is designed to be used as a singleton within your application. All methods are asynchronous and must be awaited.

Configuration

Configure the client using environment variables:

Environment Variable       Type    Required  Description
S3_AWS_REGION              string  Yes       AWS region (e.g., 'us-east-1')
S3_AWS_ACCESS_KEY_ID       string  Yes       AWS access key ID with S3 permissions
S3_AWS_SECRET_ACCESS_KEY   string  Yes       AWS secret access key
S3_BUCKET_NAME             string  No        Default bucket name (can be overridden per operation)

Example .env file:

# Required
S3_AWS_REGION=us-east-1
S3_AWS_ACCESS_KEY_ID=your-access-key-id
S3_AWS_SECRET_ACCESS_KEY=your-secret-access-key

# Optional
S3_BUCKET_NAME=my-default-bucket
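Since the client reads its configuration from the environment, it can help to fail fast when a required variable is missing. A minimal sketch (the helper name require_env is illustrative, not part of the library):

```python
import os

REQUIRED_VARS = (
    "S3_AWS_REGION",
    "S3_AWS_ACCESS_KEY_ID",
    "S3_AWS_SECRET_ACCESS_KEY",
)

def require_env(names=REQUIRED_VARS):
    """Return a dict of the named variables, raising if any are unset or empty."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {name: os.environ[name] for name in names}
```

Calling require_env() before constructing the client turns a confusing mid-request failure into an immediate, descriptive startup error.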

Basic Usage

Initialization

The S3Client is configured using environment variables. Make sure these are set before initializing the client:

from prs_commons.aws.s3_client import S3Client
import asyncio

async def main():
    # Initialize the client (uses environment variables)
    s3 = S3Client()

    # Example: Upload a file
    try:
        result = await s3.upload_file(
            file_path="local_file.txt",
            bucket="my-bucket",
            key="path/in/s3/file.txt"
        )
        print(f"Upload successful: {result}")
    except Exception as e:
        print(f"Error: {e}")

# Run the async function
asyncio.run(main())
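Because every method is a coroutine, independent operations can run concurrently. A minimal sketch using asyncio.gather (the commented upload calls assume the s3 client from the example above):

```python
import asyncio

async def gather_all(*awaitables):
    """Run independent awaitables concurrently; results come back in call order."""
    return await asyncio.gather(*awaitables)

# Usage sketch with the client above:
# results = await gather_all(
#     s3.upload_file(file_path="a.txt", bucket="my-bucket", key="a.txt"),
#     s3.upload_file(file_path="b.txt", bucket="my-bucket", key="b.txt"),
# )
```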

File Operations

Uploading Files

# Basic upload with default settings
result = await s3.upload_file(
    file_path="local_file.txt",
    bucket="my-bucket",
    key="path/in/s3/file.txt"
)

# Upload with metadata and content type
result = await s3.upload_file(
    file_path="image.jpg",
    bucket="my-bucket",
    key="images/profile.jpg",
    ExtraArgs={
        'ContentType': 'image/jpeg',
        'Metadata': {'uploaded-by': 'user123'}
    }
)

Downloading Files

# Basic download
try:
    success = await s3.download_file(
        bucket="my-bucket",
        key="path/in/s3/file.txt",
        file_path="local_file.txt"
    )
    if success:
        print("File downloaded successfully")
except FileNotFoundError as e:
    print(f"File not found: {e}")
except Exception as e:
    print(f"Error: {e}")

# Download as base64-encoded string (for small files)
try:
    base64_data = await s3.download_as_base64(
        bucket="my-bucket",
        key="images/photo.jpg"
    )
    print(f"Downloaded {len(base64_data)} characters of base64 data")
except FileNotFoundError as e:
    print(f"File not found: {e}")
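The value returned by download_as_base64 is a base64 string, so decode it before treating the payload as raw bytes:

```python
import base64

def base64_to_bytes(base64_data: str) -> bytes:
    """Decode the base64 string returned by download_as_base64 into raw bytes."""
    return base64.b64decode(base64_data)

# Round-trip illustration:
encoded = base64.b64encode(b"hello").decode("ascii")
assert base64_to_bytes(encoded) == b"hello"
```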

Deleting Files

try:
    result = await s3.delete_object(
        bucket="my-bucket",
        key="path/to/delete/file.txt"
    )
    print(f"Delete result: {result}")
except Exception as e:
    print(f"Error deleting file: {e}")

Checking File Existence

exists = await s3.file_exists(
    bucket="my-bucket",
    key="path/to/check/file.txt"
)
print(f"File exists: {exists}")

Pre-signed URLs

Generate URL for Upload

# Generate a pre-signed URL for uploading a file
upload_url = await s3.generate_upload_url(
    bucket="my-bucket",
    key="uploads/new_file.txt",
    ContentType="text/plain",
    expiration=3600  # URL expires in 1 hour
)
print(f"Upload URL: {upload_url}")

Generate URL for Download

# Generate a pre-signed URL for downloading a file
download_url = await s3.generate_download_url(
    bucket="my-bucket",
    key="documents/report.pdf",
    expiration=1800  # URL expires in 30 minutes
)
print(f"Download URL: {download_url}")

Error Handling

The S3 client raises the following exceptions:

  • FileNotFoundError: When the specified file doesn’t exist

  • ClientError: For AWS S3 specific errors

  • NoCredentialsError: When AWS credentials are missing or invalid

  • ValueError: For invalid input parameters

Example with error handling:

from botocore.exceptions import ClientError, NoCredentialsError

try:
    result = await s3.upload_file(
        file_path="nonexistent.txt",
        bucket="my-bucket",
        key="test.txt"
    )
except FileNotFoundError as e:
    print(f"Local file not found: {e}")
except NoCredentialsError as e:
    print("AWS credentials not found or invalid")
except ClientError as e:
    error_code = e.response.get('Error', {}).get('Code')
    print(f"S3 error ({error_code}): {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
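Throttling and timeout errors from S3 are often transient and worth retrying. A generic exponential-backoff helper, shown as a sketch rather than a library feature (which exceptions are safe to retry depends on your workload):

```python
import asyncio

async def retry_async(operation, *, retries=3, base_delay=0.5, retry_on=(Exception,)):
    """Run an async callable, retrying with exponential backoff on the given exceptions."""
    for attempt in range(retries + 1):
        try:
            return await operation()
        except retry_on:
            if attempt == retries:
                raise  # out of attempts; propagate the last error
            await asyncio.sleep(base_delay * (2 ** attempt))

# Usage sketch (the upload call assumes the client from the examples above):
# result = await retry_async(
#     lambda: s3.upload_file(file_path="local_file.txt", bucket="my-bucket", key="file.txt"),
#     retries=3,
# )
```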

Pre-signed URLs

Generate Upload URL

async generate_upload_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for uploading a file to S3.

This is a convenience wrapper around generate_presigned_url() for uploads. The content type will be automatically detected from the file extension if not provided.

Parameters:
  • bucket (str) – S3 bucket name

  • key (str) – S3 object key where the file will be stored

  • expiration (int) – Time in seconds until the URL expires (default: 3600)

  • **kwargs – Additional parameters to pass to the S3 put_object operation

Returns:

Pre-signed URL as a string, or None if credentials are invalid

Return type:

str | None

Common Parameters

  • ContentType (str, optional): The content type of the file (e.g., 'image/jpeg'). If not provided, it will be automatically detected from the file extension.

  • ACL (str, optional): Access control for the file. Defaults to the bucket's ACL. Common values: 'private', 'public-read', 'public-read-write', 'authenticated-read'

  • Metadata (dict, optional): Dictionary of metadata to store with the object. Keys will be prefixed with 'x-amz-meta-' when stored in S3.

  • Other parameters supported by boto3's generate_presigned_url for the 'put_object' operation

Example

# Generate URL for uploading a text file with metadata
url = await s3.generate_upload_url(
    bucket='my-bucket',
    key='documents/report.txt',
    ContentType='text/plain',
    Metadata={
        'author': 'user@example.com',
        'description': 'Quarterly report Q2 2023'
    },
    expiration=7200  # 2 hours
)

# Use the URL to upload a file with a PUT request
# import requests
# with open('file.txt', 'rb') as f:
#     response = requests.put(url, data=f)
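The automatic content-type detection mentioned above is based on the file extension; you can make the same guess explicitly with the standard library when you want to pass ContentType yourself (a sketch; the application/octet-stream fallback is a common convention, not something the client mandates):

```python
import mimetypes

def guess_content_type(key: str, fallback: str = "application/octet-stream") -> str:
    """Guess a MIME type from the object key's extension, with a safe fallback."""
    content_type, _ = mimetypes.guess_type(key)
    return content_type or fallback

# guess_content_type("documents/report.txt") → "text/plain"
# guess_content_type("archive.unknown")     → "application/octet-stream"
```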

Generate Download URL

# Generate a pre-signed URL for file download
download_url = await s3.generate_download_url(
    bucket="my-bucket",
    key="downloads/file.txt",
    ResponseContentType="application/octet-stream",
    expiration=3600  # URL expires in 1 hour (default)
)
print(f"Download URL: {download_url}")

Error Handling

The S3 client raises the following exceptions:

  • ClientError: For AWS service errors

  • NoCredentialsError: When AWS credentials are not found

  • RuntimeError: For client initialization errors

  • FileNotFoundError: When the requested file doesn’t exist

  • PermissionError: When there are permission issues

Example error handling:

from botocore.exceptions import ClientError, NoCredentialsError

try:
    # Your S3 operations here
    pass
except FileNotFoundError as e:
    print(f"File not found: {e}")
except PermissionError as e:
    print(f"Permission denied: {e}")
except NoCredentialsError:
    print("AWS credentials not found")
except ClientError as e:
    error_code = e.response.get('Error', {}).get('Code')
    if error_code == 'NoSuchBucket':
        print("Bucket does not exist")
    else:
        print(f"AWS error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

API Reference

class prs_commons.aws.s3_client.S3Client

Bases: StorageClient

Thread-safe singleton client for AWS S3 operations.

This client provides an async interface to interact with AWS S3 using aioboto3. It implements the singleton pattern for efficient resource usage.

__init__() → None

Initialize the S3 client with configuration from environment variables.

resource() → AsyncGenerator[Any, None]

Async context manager for S3 resource access.

Yields:

An aioboto3 S3 resource instance for direct S3 operations.

Example

async with s3.resource() as resource:
    bucket = await resource.Bucket('my-bucket')
    # Perform operations with the bucket

property client: Any

Get the S3 client instance.

Returns:

aioboto3 S3 client instance for low-level operations.

Note

Prefer using the higher-level methods when possible.

async upload_file(file_path: str, bucket: str, key: str, **kwargs: Any) → Dict[str, Any]

Upload a file to an S3 bucket.

Parameters:
  • file_path – Path to local file to upload

  • bucket – Target S3 bucket name

  • key – S3 object key/path

  • **kwargs – Additional args for boto3 upload_file:
      - ExtraArgs: Dict of additional args (e.g., ContentType, ACL)
      - Callback: Progress callback function
      - Config: boto3.s3.transfer.TransferConfig

Returns:

Dict with status, bucket, and key

Raises:
  • NoCredentialsError – If AWS credentials are invalid

  • ClientError – For S3-specific errors

  • FileNotFoundError – If local file doesn’t exist

async download_file(bucket: str, key: str, file_path: str, **kwargs: Any) → bool

Download a file from S3 to local filesystem.

Parameters:
  • bucket – Source S3 bucket name

  • key – S3 object key/path

  • file_path – Local path to save file (must include filename)

  • **kwargs – Additional args for boto3 download_file:
      - ExtraArgs: Additional arguments for download
      - Callback: Progress callback function
      - Config: boto3.s3.transfer.TransferConfig

Returns:

True if download was successful

Return type:

bool

Raises:
  • FileNotFoundError – If file doesn’t exist in S3 or local path is invalid

  • PermissionError – If permission issues with S3 or local filesystem

  • ClientError – For S3-specific errors

  • IOError – If issues writing to local filesystem

  • ValueError – If bucket name or key is not provided

async delete_object(bucket: str, key: str) → Dict[str, Any]

Delete an object from S3.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key to delete

Returns:

Dict with operation status and details

Raises:

ClientError – If deletion fails

async generate_presigned_url(bucket: str, key: str, operation: str = 'get_object', expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for an S3 object.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key

  • operation – S3 operation ('get_object', 'put_object', etc.)

  • expiration – URL expiration time in seconds (default: 1 hour)

  • **kwargs – Additional parameters for the S3 operation

Returns:

Pre-signed URL, or None if credentials are invalid

Return type:

str | None

Example

# Generate upload URL
url = await s3.generate_presigned_url(
    bucket='my-bucket',
    key='uploads/file.txt',
    operation='put_object',
    ContentType='text/plain'
)

async generate_upload_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for uploading a file to S3.

This is a convenience wrapper around generate_presigned_url for uploads.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key where the file will be stored

  • expiration – Time in seconds until the URL expires (default: 1 hour)

  • **kwargs – Additional parameters to pass to the S3 put_object operation. Common parameters:
      - ContentType: The content type of the file (e.g., 'image/jpeg')
      - ACL: Access control for the file (e.g., 'private', 'public-read')
      - Metadata: Dictionary of metadata to store with the object

Returns:

The pre-signed URL as a string, or None if credentials are invalid

Return type:

Optional[str]

Example

>>> upload_url = await s3.generate_upload_url(
...     bucket='my-bucket',
...     key='uploads/file.jpg',
...     ContentType='image/jpeg',
...     ACL='private',
...     Metadata={
...         'custom': 'value'
...     }
... )

async generate_download_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for downloading a file from S3.

This is a convenience wrapper around generate_presigned_url for downloads.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key of the file to download

  • expiration – Time in seconds until the URL expires (default: 1 hour)

  • **kwargs – Additional parameters to pass to the S3 get_object operation

Returns:

The pre-signed URL as a string, or None if credentials are invalid

Return type:

Optional[str]

Example

>>> download_url = await s3.generate_download_url(
...     bucket='my-bucket',
...     key='downloads/file.txt',
...     ResponseContentType='application/pdf',
...     ResponseContentDisposition='attachment; filename=report.pdf'
... )

async download_as_base64(bucket: str, key: str, check_exists: bool = True, **kwargs: Any) → str

Download file from S3 as base64-encoded string. This method is useful when you need to work with the file contents directly in memory without saving to disk, such as when sending files in API responses or processing file contents in memory.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key

  • check_exists – If True, verify file exists first

  • **kwargs – Additional args for boto3 get_object:
      - VersionId: Version ID of the object
      - SSECustomerAlgorithm: Server-side encryption algorithm
      - SSECustomerKey: Server-side encryption key

Returns:

Base64-encoded file contents

Return type:

str

Raises:
  • FileNotFoundError – If file doesn’t exist and check_exists is True

  • ClientError – For S3-specific errors

Note

Loads entire file into memory - not suitable for very large files.

async file_exists(bucket: str, key: str) → bool

Check if a file exists in S3.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key to check

Returns:

True if file exists, False otherwise

Return type:

bool

Raises:

ClientError – If error occurs during check


Error Handling

upload_file and delete_object return a dictionary with the following structure (other methods return bool, str, or str | None, as documented in the API reference above):

{
    "status": "success" | "error",
    "bucket": "bucket-name",
    "key": "object-key",
    # Only present if status is "error"
    "error": "error-message",
    # Additional fields may be present depending on the operation
}
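A small helper for acting on that result shape might look like this (a sketch; fields beyond status vary by operation):

```python
def raise_on_error(result: dict) -> dict:
    """Raise if an operation result reports an error; otherwise return it unchanged."""
    if result.get("status") == "error":
        raise RuntimeError(
            f"S3 operation failed: {result.get('error', 'unknown error')}"
        )
    return result

# raise_on_error({"status": "success", "bucket": "b", "key": "k"}) returns the dict
```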

Thread Safety

The S3Client is implemented as a thread-safe singleton. Multiple threads can safely use the same instance.
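Thread-safe singletons of this kind are typically built with double-checked locking; a generic sketch of the pattern (not the library's actual implementation):

```python
import threading

class SingletonMeta(type):
    """Metaclass that creates at most one instance per class, safely across threads."""
    _instances = {}
    _lock = threading.Lock()

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:          # fast path, no lock taken
            with cls._lock:                    # double-checked locking
                if cls not in cls._instances:
                    cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Client(metaclass=SingletonMeta):
    pass

# Client() is Client() → True: every call returns the same instance
```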

Dependencies

  • aioboto3 (async S3 support)

  • boto3 >= 1.28.0

  • botocore >= 1.31.0