Amazon S3 is a cloud-based storage service that provides virtually unlimited storage space for businesses of all sizes. Boto3 is a Python software development kit that makes it easy for developers to interact with AWS services, including Amazon S3. Using Amazon S3 with Boto3 provides a simple and powerful way to store and manage files in the cloud.
In this lesson, we'll show you how to upload files to Amazon S3 using Boto3. We'll cover everything you need to know, from writing the code to upload files to unit testing it. Whether you're a beginner or an experienced developer, this lesson will help you get started with uploading files to Amazon S3 with Boto3.
Upload regular files to S3 bucket
Let's take a look at how to upload a regular file to an Amazon S3 bucket:
Upload file to S3 bucket
```python
import io
import logging

import boto3
import botocore

logger = logging.getLogger()
logging.getLogger("boto3").setLevel(logging.WARNING)
logging.getLogger("botocore").setLevel(logging.WARNING)
logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s: %(levelname)s: %(message)s')

S3_CLIENT = boto3.client('s3')


def upload_file_to_s3(file_name, bucket, object_name=None, args=None):
    """
    Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :param args: Additional arguments to S3 client
    :return: True if file was uploaded, else False
    """
    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    try:
        S3_CLIENT.upload_file(file_name, bucket, object_name, ExtraArgs=args)
    except botocore.exceptions.ClientError as error:
        logger.error(error)
        return False
    return True
```
In the code above we import several dependencies:
- io - we'll need this module in the next example to generate a file object in memory
- logging - standard Python logging module to log error messages
- boto3 - AWS SDK for Python
- botocore - this module is required to catch potential exceptions during upload operations
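The logging setup at the top of the module deserves a closer look: boto3 and botocore log quite verbosely, so the example raises their loggers to WARNING while keeping the application's own logger at INFO. A minimal, standalone sketch of that pattern:

```python
import logging

# Reduce log noise from the AWS SDK: boto3 and botocore emit a lot of
# DEBUG/INFO output, so raising their loggers to WARNING keeps logs readable.
logging.getLogger("boto3").setLevel(logging.WARNING)
logging.getLogger("botocore").setLevel(logging.WARNING)
logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s: %(levelname)s: %(message)s')

# Our own messages at INFO and above still come through
logging.info("application logging is active")

# The SDK loggers now filter out anything below WARNING
print(logging.getLogger("boto3").getEffectiveLevel() == logging.WARNING)
```

This way, upload errors logged by our code are easy to spot without being buried in SDK internals.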
The upload_file_to_s3() method takes four parameters:
- file_name: The name of the file to upload to S3.
- bucket: The name of the S3 bucket to upload the file to.
- object_name (optional): The name to give the object in S3. If not specified, the file_name will be used.
- args (optional): Additional optional arguments for the Boto3 S3 client
The method starts by checking whether an object name was specified; if not, it uses the file_name as the object name. It then calls upload_file on the module-level S3 client to upload the file to the specified bucket under that object name. If the upload succeeds, the method returns True, otherwise it returns False.
By wrapping the code in a method like this, you can easily upload files to S3 from any Python script by simply calling upload_file_to_s3(). You can customize the method as needed, for example by adding extra parameters or error handling.
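The args parameter is passed straight through to upload_file as ExtraArgs, a plain dict of upload options. A hedged sketch of what such a dict might look like (the bucket name and file name in the commented-out call are placeholders, and running it would require real AWS credentials):

```python
# ExtraArgs is an ordinary dict of S3 upload options; ContentType and
# Metadata are among the upload arguments boto3's transfer manager accepts.
extra_args = {
    'ContentType': 'text/plain',          # served as text/plain on download
    'Metadata': {'uploaded-by': 'demo'},  # stored as x-amz-meta-* headers
}

# Hypothetical call -- 'my-bucket' and 'notes.txt' are placeholders, and the
# call needs valid AWS credentials, so it is shown here but not executed:
# upload_file_to_s3('notes.txt', 'my-bucket', args=extra_args)

print(extra_args['ContentType'])  # → text/plain
```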
Upload in-memory generated file object to S3 bucket
Now, let's take a look at how to generate a file in memory and upload it to an S3 bucket afterward:
```python
def upload_generated_file_object_to_s3(bucket, object_name, args=None):
    """
    Upload a file object to an S3 bucket

    :param bucket: Bucket to upload to
    :param object_name: S3 object name
    :param args: Additional arguments to S3 client
    :return: True if file was uploaded, else False
    """
    with io.BytesIO() as f:
        # Create a text file object in memory
        f.write(b'First line.\n')
        f.write(b'Second line.\n')
        f.seek(0)
        try:
            S3_CLIENT.upload_fileobj(f, bucket, object_name, ExtraArgs=args)
        except botocore.exceptions.ClientError as error:
            logger.error(error)
            return False
    return True
```
The code above is similar to the previous example, with one difference: we use io.BytesIO to generate the file content in memory and then upload it to the S3 bucket with the upload_fileobj method.
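To see why the f.seek(0) call matters, here is the io.BytesIO pattern on its own, without any S3 involved. upload_fileobj reads from the buffer's current position, so without the rewind it would see no data:

```python
import io

with io.BytesIO() as f:
    f.write(b'First line.\n')
    f.write(b'Second line.\n')

    # After the writes, the position is at the end of the buffer; a read()
    # here (or an upload_fileobj call) would get nothing back.
    assert f.read() == b''

    # Rewind to the start so the whole buffer is readable again
    f.seek(0)
    content = f.read()

print(content)  # → b'First line.\nSecond line.\n'
```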
Unit testing S3 upload operations
Next, let's test the methods above using the Moto library, which mocks AWS services locally:
```python
import os
import unittest

import boto3
from moto import mock_s3


@mock_s3
class TestS3Upload(unittest.TestCase):

    def setUp(self):
        self.s3_resource = boto3.resource('s3')
        # Create a test bucket
        self.s3_resource.create_bucket(Bucket='test-bucket')

    def test_upload_file_to_s3(self):
        from upload_file import upload_file_to_s3

        # Upload a test file to the test bucket
        file_name = 'test_file.txt'
        with open(file_name, 'w') as f:
            f.write('Test file contents')
        result = upload_file_to_s3(file_name, 'test-bucket',
                                   object_name='test_file.txt')

        # Verify that the file was uploaded successfully
        bucket = self.s3_resource.Bucket('test-bucket')
        objs = list(bucket.objects.filter(Prefix='test_file.txt'))
        self.assertEqual(len(objs), 1)
        self.assertEqual(objs[0].key, 'test_file.txt')
        self.assertEqual(result, True)

        # Delete test file
        os.unlink(file_name)

    def test_upload_generated_file_object_to_s3(self):
        from upload_file import upload_generated_file_object_to_s3

        result = upload_generated_file_object_to_s3(
            'test-bucket', object_name='test_file.txt')

        # Verify that the file was uploaded successfully
        bucket = self.s3_resource.Bucket('test-bucket')
        objs = list(bucket.objects.filter(Prefix='test_file.txt'))
        self.assertEqual(len(objs), 1)
        self.assertEqual(objs[0].key, 'test_file.txt')
        self.assertEqual(result, True)


if __name__ == '__main__':
    unittest.main()
```
As usual, to test S3 file upload operations, we'll create two almost identical methods in our unit test class:
- test_upload_file_to_s3 - this test creates a file on the file system, uploads it to the S3 bucket, and checks that the file is available in the bucket
- test_upload_generated_file_object_to_s3 - this test calls the upload_generated_file_object_to_s3 method and ensures that the generated file is available in the S3 bucket
You may extend this unit test by adding additional logic for checking file content if required.
Uploading files to Amazon S3 with Boto3 is a simple and powerful way to store and manage files in the cloud. By using examples from this lesson, you can quickly write the code to upload files to S3 and unit test them using Python.
Using Amazon S3 with Boto3 provides many benefits, such as scalability, durability, cost-effectiveness, and ease of use. Boto3 provides a simple and intuitive way to interact with AWS services, making it easy to write Python code that interacts with Amazon S3.
By wrapping the code to upload files to S3 in a method, you can easily upload files to S3 from any Python script. You can customize the method as needed, such as by adding additional parameters or error handling.
To make sure that your code is working as expected, you can use the Moto library to test your code in a controlled environment. This provides a simple way to test your code without needing to upload files to a live S3 bucket.
In conclusion, by following the steps outlined in this lesson and customizing the code to meet your specific needs, you can take advantage of the many benefits of Amazon S3 and Boto3 to streamline your file management tasks.