Steps to Read Files or Folders Using Boto3

Step 1: Import all the necessary libraries. We use dotenv to load credentials from environment variables.

import os

import boto3
from dotenv import load_dotenv

Step 2: Load the environment variables with load_dotenv(), then create an S3 client, which provides all the methods needed to work with the S3 bucket. The access key and secret access key are read with os.getenv().

# Load environment variables from .env
load_dotenv()

# Create S3 client
s3 = boto3.client(
    "s3",
    aws_access_key_id=os.getenv("ac_key"),
    aws_secret_access_key=os.getenv("sac_key"),
)
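The environment variable names ac_key and sac_key are simply the names this tutorial uses; with python-dotenv they can be defined in a .env file placed next to the script, along these lines (placeholder values, not real credentials):

```
# .env -- keep this file out of version control
ac_key=YOUR_ACCESS_KEY_ID
sac_key=YOUR_SECRET_ACCESS_KEY
```

load_dotenv() reads this file and makes the values available through os.getenv().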

Step 3: Store the bucket name in a variable.

# Store bucket name
bucket_name = "gfg-s3-test-bucket"

Step 4: List all objects in the bucket using the list_objects_v2() method and extract their metadata from the 'Contents' key of the response.

# Store contents of bucket
objects_list = s3.list_objects_v2(Bucket=bucket_name).get("Contents")
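To make the shape of that response concrete, here is a trimmed, mocked sketch of what list_objects_v2() returns (no AWS call; the Size values are made up for illustration). Note that when the bucket is empty the "Contents" key is missing entirely, so .get("Contents") returns None; guarding with `or []` keeps the later loop from raising a TypeError:

```python
# Mocked list_objects_v2 response -- each entry under "Contents"
# carries the object's key, size, and other metadata.
mock_response = {
    "KeyCount": 2,
    "Contents": [
        {"Key": "Test.txt", "Size": 32},
        {"Key": "Test1.txt", "Size": 58},
    ],
}

# Collect the object keys, tolerating an empty bucket
keys = [obj["Key"] for obj in mock_response.get("Contents") or []]
print(keys)  # → ['Test.txt', 'Test1.txt']
```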

Step 5: Iterate over a list of objects.

# Iterate over every object in bucket
for obj in objects_list:

Step 6: Store the object name using the ‘Key’ attribute in the object contents.

    # Store object name
    obj_name = obj["Key"]

Step 7: Fetch an object's contents using get_object(), which takes the bucket name and the object's key and returns a dictionary of metadata and data.

    # Read an object from the bucket
    response = s3.get_object(Bucket=bucket_name, Key=obj_name)

Step 8: Read the object's data from the 'Body' field of the response and decode it from bytes to a UTF-8 string.

    # Read the object's content as text
    object_content = response["Body"].read().decode("utf-8")
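One caveat: response["Body"] is a stream of raw bytes, and .decode("utf-8") only succeeds for text objects; a binary object (an image, for example) would raise UnicodeDecodeError. If the bucket may hold mixed content, it is worth guarding the decode. A minimal sketch on plain bytes, no AWS call needed:

```python
# Raw bytes as returned by response["Body"].read()
raw = b"Test.txt is running\nGFG Test"

try:
    text = raw.decode("utf-8")
except UnicodeDecodeError:
    text = None  # not a text file; handle the bytes directly instead

print(text)
```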

Step 9: Finally, print the contents of each file.

    # Print all the contents
    print(f"Contents of {obj_name}\n--------------")
    print(object_content, end="\n\n")

Here is the complete code to read file contents from an S3 bucket with Boto3.

This Python script uses the Boto3 library to interact with AWS S3. It first loads AWS credentials from environment variables using the dotenv module, then creates an S3 client with these credentials. The script lists all objects in a specific S3 bucket, retrieves each object's content, decodes it from bytes to a readable string using UTF-8 encoding, and prints it to the console.

import os

import boto3
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Create S3 client
s3 = boto3.client(
    "s3",
    aws_access_key_id=os.getenv("ac_key"),
    aws_secret_access_key=os.getenv("sac_key"),
)

# Store bucket name
bucket_name = "gfg-s3-test-bucket"

# Store contents of bucket
objects_list = s3.list_objects_v2(Bucket=bucket_name).get("Contents")

# Iterate over every object in bucket
for obj in objects_list:
    # Store object name
    obj_name = obj["Key"]

    # Read an object from the bucket
    response = s3.get_object(Bucket=bucket_name, Key=obj_name)

    # Read the object's content as text
    object_content = response["Body"].read().decode("utf-8")

    # Print all the contents
    print(f"Contents of {obj_name}\n--------------")
    print(object_content, end="\n\n")

Output:

Contents of Test.txt
--------------
Test.txt is running
GFG Test

Contents of Test1.txt
--------------
Test1.txt is running
Reading contents from file using boto3
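One limitation of the listing approach above: list_objects_v2() returns at most 1,000 keys per call. For larger buckets, Boto3's paginator (s3.get_paginator("list_objects_v2")) follows continuation tokens automatically and yields pages shaped like the response used earlier. Below is a sketch of collecting keys across pages, with the pages mocked so it runs without AWS:

```python
# With a real client this would be:
#   paginator = s3.get_paginator("list_objects_v2")
#   pages = paginator.paginate(Bucket=bucket_name)
# The pages are mocked here to keep the sketch runnable offline.
pages = [
    {"Contents": [{"Key": "a.txt"}, {"Key": "b.txt"}]},
    {"Contents": [{"Key": "c.txt"}]},
    {},  # an empty page has no "Contents" key at all
]

# Flatten the object keys across all pages
all_keys = []
for page in pages:
    for obj in page.get("Contents", []):
        all_keys.append(obj["Key"])

print(all_keys)  # → ['a.txt', 'b.txt', 'c.txt']
```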

How to Read File Content from S3 Bucket with Boto3?

AWS S3 (Simple Storage Service), a scalable and secure object storage service, is often the go-to solution for storing and retrieving any amount of data, at any time, from anywhere. Boto3 is the AWS Software Development Kit (SDK) for Python, which provides an object-oriented API for AWS infrastructure services. It allows Python developers to build applications on top of Amazon services.
