AWS S3 Client

A thread-safe singleton client for interacting with AWS S3 storage, implementing the StorageClient interface.

Note

This client is designed to be used as a singleton within your application: constructing S3Client multiple times returns the same underlying instance.
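
For example, the following sketch illustrates the expected singleton behaviour:

from prs_commons import S3Client

# Both calls return the same underlying singleton instance
first = S3Client()
second = S3Client()
assert first is second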

Configuration

The S3Client is configured using environment variables. All variables are required except BUCKET_NAME, which is optional.

Variable                   Type    Default  Description
REGION                     string  None     AWS region name (e.g., ‘us-east-1’)
BUCKET_ACCESS_KEY_ID       string  None     AWS access key ID with S3 permissions
BUCKET_SECRET_ACCESS_KEY   string  None     AWS secret access key
BUCKET_NAME                string  None     Default bucket name (optional, can be passed to methods)

Example .env file:

REGION=us-east-1
BUCKET_ACCESS_KEY_ID=your-access-key-id
BUCKET_SECRET_ACCESS_KEY=your-secret-access-key
BUCKET_NAME=my-default-bucket
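
The variables must be present in the process environment before the client is created. A minimal sketch using python-dotenv to load the .env file (an optional third-party helper, not a dependency of this package, and only needed if nothing else loads the file):

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

from prs_commons import S3Client

load_dotenv()  # reads .env into os.environ
assert os.getenv("REGION") and os.getenv("BUCKET_ACCESS_KEY_ID")

s3 = S3Client()  # picks up the variables set above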

Basic Usage

Initialization

The S3Client is configured using environment variables. Make sure these are set before initializing the client:

from prs_commons import S3Client

# Initialize with environment variables
s3 = S3Client()

# The client is now ready to use

File Operations

Uploading Files

# Basic upload
result = s3.upload_file(
    file_path="local_file.txt",
    bucket="my-bucket",
    key="path/in/s3/file.txt"
)

# Upload with additional S3 parameters
result = s3.upload_file(
    file_path="local_file.txt",
    bucket="my-bucket",
    key="path/in/s3/file.txt",
    ExtraArgs={
        'ContentType': 'text/plain',
        'Metadata': {'author': 'user'}
    }
)

Downloading Files

# Basic download
try:
    success = s3.download_file(
        bucket="my-bucket",
        key="path/in/s3/file.txt",
        file_path="local_file.txt"
    )
    if success:
        print("File downloaded successfully")
except FileNotFoundError as e:
    print(f"File not found: {e}")
except PermissionError as e:
    print(f"Permission denied: {e}")
except ClientError as e:
    print(f"S3 error: {e}")

# Download as base64-encoded string
try:
    file_data = s3.download_as_base64(
        bucket="my-bucket",
        key="documents/report.pdf"
    )
    print(f"Downloaded file (base64, {len(file_data)} bytes)")
except FileNotFoundError as e:
    print(f"File not found: {e}")
except PermissionError as e:
    print(f"Permission denied: {e}")
except ClientError as e:
    print(f"S3 error: {e}")

Deleting Files

result = s3.delete_object(
    bucket="my-bucket",
    key="path/in/s3/file.txt"
)

Pre-signed URLs

Generate Upload URL

generate_upload_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for uploading a file to S3.

This is a convenience wrapper around generate_presigned_url() for uploads. The content type will be automatically detected from the file extension if not provided.

Parameters:
  • bucket (str) – S3 bucket name

  • key (str) – S3 object key where the file will be stored

  • expiration (int) – Time in seconds until the URL expires (default: 3600)

  • **kwargs – Additional parameters to pass to the S3 put_object operation

Returns:

Pre-signed URL as a string, or None if credentials are invalid

Return type:

str | None

Common Parameters

  • ContentType (str, optional): The content type of the file (e.g., ‘image/jpeg’). If not provided, it will be automatically detected from the file extension.

  • ACL (str, optional): Access control for the file. Defaults to the bucket’s ACL. Common values: ‘private’, ‘public-read’, ‘public-read-write’, ‘authenticated-read’

  • Metadata (dict, optional): Dictionary of metadata to store with the object. Keys will be prefixed with ‘x-amz-meta-’ when stored in S3.

  • Other parameters supported by boto3’s generate_presigned_url for ‘put_object’ operation

Example

# Generate URL for uploading a text file with metadata
url = s3.generate_upload_url(
    bucket='my-bucket',
    key='documents/report.txt',
    ContentType='text/plain',
    Metadata={
        'author': 'user@example.com',
        'description': 'Quarterly report Q2 2023'
    },
    expiration=7200  # 2 hours
)

# Use the URL to upload a file with a PUT request:
# import requests
# with open('file.txt', 'rb') as f:
#     response = requests.put(url, data=f)

Generate Download URL

# Generate a pre-signed URL for file download
download_url = s3.generate_download_url(
    bucket="my-bucket",
    key="downloads/file.txt",
    ResponseContentType="application/octet-stream",
    expiration=3600  # URL expires in 1 hour (default)
)
print(f"Download URL: {download_url}")

Error Handling

The S3 client raises the following exceptions:

  • ClientError: For AWS service errors

  • NoCredentialsError: When AWS credentials are not found

  • RuntimeError: For client initialization errors

  • FileNotFoundError: When the requested file doesn’t exist

  • PermissionError: When there are permission issues

Example error handling:

from botocore.exceptions import ClientError, NoCredentialsError

try:
    # Your S3 operations here
    pass
except FileNotFoundError as e:
    print(f"File not found: {e}")
except PermissionError as e:
    print(f"Permission denied: {e}")
except NoCredentialsError:
    print("AWS credentials not found")
except ClientError as e:
    error_code = e.response.get('Error', {}).get('Code')
    if error_code == 'NoSuchBucket':
        print("Bucket does not exist")
    else:
        print(f"AWS error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

A more specific example for upload_file:

from botocore.exceptions import ClientError, NoCredentialsError

try:
    s3.upload_file("nonexistent.txt", "my-bucket", "file.txt")
except FileNotFoundError as e:
    print(f"Local file not found: {e}")
except NoCredentialsError:
    print("AWS credentials not found")
except ClientError as e:
    print(f"AWS error: {e.response['Error']['Message']}")
except Exception as e:
    print(f"Unexpected error: {e}")

API Reference

class prs_commons.aws.s3_client.S3Client

Bases: StorageClient

Thread-safe singleton client for AWS S3 operations.

This client provides a simple interface to interact with AWS S3 using boto3. It implements the singleton pattern to ensure only one instance exists.

__init__() → None

Initialize the S3 client with configuration from environment variables.

property client: BaseClient

Get the boto3 S3 client instance.

Returns:

The boto3 S3 client instance.

Raises:

RuntimeError – If the S3 client cannot be initialized.

upload_file(file_path: str, bucket: str, key: str, **kwargs: Any) → Dict[str, Any]

Upload a file to an S3 bucket.

Parameters:
  • file_path – Path to the file to upload

  • bucket – Target S3 bucket name

  • key – S3 object key (path in the bucket)

  • **kwargs – Additional arguments to pass to boto3 upload_file

Returns:

Dictionary containing status and operation details

Return type:

S3Response

Raises:

ClientError – If the upload fails
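
Example

A minimal sketch (the fields of the returned dictionary beyond "status", "bucket", and "key" may vary):

>>> s3 = S3Client()
>>> result = s3.upload_file(
...     file_path='/local/path/report.pdf',
...     bucket='my-bucket',
...     key='documents/report.pdf'
... )
>>> if result.get('status') == 'success':
...     print("Upload complete")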

download_file(bucket: str, key: str, file_path: str, **kwargs: Any) → bool

Download a file from an S3 bucket to the local filesystem.

Parameters:
  • bucket – Source S3 bucket name

  • key – S3 object key (path in the bucket)

  • file_path – Local filesystem path where the file will be saved. Must include the target filename and extension. Example: ‘/path/to/destination/filename.ext’ The parent directory must exist and be writable.

  • **kwargs – Additional arguments to pass to boto3 download_file

Returns:

True if download was successful

Return type:

bool

Raises:
  • FileNotFoundError – If the file doesn’t exist in S3 or local path is invalid

  • PermissionError – If there are permission issues with S3 or local filesystem

  • botocore.exceptions.ClientError – For other S3-specific errors

  • IOError – If there are issues writing to the local filesystem

Example

>>> s3 = S3Client()
>>> success = s3.download_file(
...     bucket='my-bucket',
...     key='folder/file.txt',
...     file_path='/local/path/to/save/file.txt'
... )
>>> if success:
...     print("File downloaded successfully")

delete_object(bucket: str, key: str) → Dict[str, Any]

Delete an object from an S3 bucket.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key to delete

Returns:

Dictionary containing status and operation details

Return type:

S3Response
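
Example

A minimal sketch; the returned dictionary includes a "status" field indicating success or error:

>>> result = s3.delete_object(
...     bucket='my-bucket',
...     key='path/in/s3/file.txt'
... )
>>> print(result.get('status'))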

generate_presigned_url(bucket: str, key: str, operation: str = 'get_object', expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for an S3 object.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key (path in the bucket)

  • operation – The S3 operation to allow with this URL. Common values: ‘get_object’, ‘put_object’, ‘delete_object’

  • expiration – Time in seconds until the URL expires (default: 1 hour)

  • **kwargs – Additional parameters to pass to the S3 operation

Returns:

The pre-signed URL as a string, or None if credentials are invalid

Return type:

Optional[str]

Example

# Generate a URL to upload a file
>>> upload_url = s3.generate_presigned_url(
...     bucket='my-bucket',
...     key='uploads/file.txt',
...     operation='put_object',
...     ContentType='text/plain'
... )

# Generate a URL to download a file
>>> download_url = s3.generate_presigned_url(
...     bucket='my-bucket',
...     key='downloads/file.txt',
...     operation='get_object',
...     ResponseContentType='application/octet-stream'
... )

generate_upload_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for uploading a file to S3.

This is a convenience wrapper around generate_presigned_url for uploads.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key where the file will be stored

  • expiration – Time in seconds until the URL expires (default: 1 hour)

  • **kwargs – Additional parameters to pass to the S3 put_object operation. Common parameters:

    - ContentType: The content type of the file (e.g., ‘image/jpeg’)

    - ACL: Access control for the file (e.g., ‘private’, ‘public-read’)

    - Metadata: Dictionary of metadata to store with the object

Returns:

The pre-signed URL as a string, or None if credentials are invalid

Return type:

Optional[str]

Example

>>> upload_url = s3.generate_upload_url(
...     bucket='my-bucket',
...     key='uploads/file.jpg',
...     ContentType='image/jpeg',
...     ACL='private',
...     Metadata={
...         'custom': 'value'
...     }
... )

generate_download_url(bucket: str, key: str, expiration: int = 3600, **kwargs: Any) → str | None

Generate a pre-signed URL for downloading a file from S3.

This is a convenience wrapper around generate_presigned_url for downloads.

Parameters:
  • bucket – S3 bucket name

  • key – S3 object key of the file to download

  • expiration – Time in seconds until the URL expires (default: 1 hour)

  • **kwargs – Additional parameters to pass to the S3 get_object operation

Returns:

The pre-signed URL as a string, or None if credentials are invalid

Return type:

Optional[str]

Example

>>> download_url = s3.generate_download_url(
...     bucket='my-bucket',
...     key='downloads/file.txt',
...     ResponseContentType='application/pdf',
...     ResponseContentDisposition='attachment; filename=report.pdf'
... )

download_as_base64(bucket: str, key: str, check_exists: bool = True, **kwargs: Any) → str | None

Download a file from S3 and return its contents as a base64-encoded string.

This method is useful when you need to work with the file contents directly in memory without saving to disk, such as when sending files in API responses or processing file contents in memory.

Parameters:
  • bucket – Name of the S3 bucket

  • key – S3 object key (path in the bucket)

  • check_exists – If True, verify the file exists before downloading

  • **kwargs – Additional arguments to pass to boto3 download_fileobj

Returns:

Base64-encoded string of the file contents

Raises:
  • FileNotFoundError – If the file doesn’t exist

  • PermissionError – If there are permission issues

  • botocore.exceptions.ClientError – For S3-specific errors

  • Exception – For any other unexpected errors

Example

>>> # Download a file as base64
>>> file_data = s3.download_as_base64(
...     bucket='my-bucket',
...     key='documents/report.pdf',
... )
>>> if file_data:
...     # Use the base64 data (e.g., embed in HTML, send in API response)
...     print(f"File size: {len(file_data)} bytes")

Response Format

Methods that return an S3Response dictionary (upload_file() and delete_object()) use the following structure:

{
    "status": "success" | "error",
    "bucket": "bucket-name",
    "key": "object-key",
    # Only present if status is "error"
    "error": "error-message",
    # Additional fields may be present depending on the operation
}
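
For example, a caller can branch on the "status" field (a sketch; the field names follow the structure above):

result = s3.delete_object(bucket="my-bucket", key="old/file.txt")
if result["status"] == "error":
    print(f"Delete failed: {result.get('error')}")
else:
    print(f"Deleted s3://{result['bucket']}/{result['key']}")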

Thread Safety

The S3Client is implemented as a thread-safe singleton. Multiple threads can safely use the same instance.
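
A sketch of concurrent use with the standard library's ThreadPoolExecutor (bucket name and keys are illustrative):

from concurrent.futures import ThreadPoolExecutor

from prs_commons import S3Client

s3 = S3Client()  # the same instance is shared by every thread

keys = ["reports/a.pdf", "reports/b.pdf", "reports/c.pdf"]

def fetch(key: str) -> bool:
    # download_file can be called concurrently on the shared client
    return s3.download_file(
        bucket="my-bucket",
        key=key,
        file_path=f"/tmp/{key.rsplit('/', 1)[-1]}",
    )

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(fetch, keys))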

Dependencies

  • boto3 >= 1.28.0

  • botocore >= 1.31.0