AWS
You will learn how to:
1. Create a new Bucket
2. Get the list of Buckets
   - Get via RESOURCE
   - Get via CLIENT
3. Upload a file to a Bucket
   - Check uploaded files
   - Delete a file from the Bucket
   - Delete the Bucket
4. Check Access Permission
Before using Boto3, you need to set up authentication credentials for your AWS account using either the IAM Console or the AWS CLI. You can either choose an existing user or create a new one.
For instructions about how to create a user using the IAM Console, see Creating IAM users. Once the user has been created, see Managing access keys to learn how to create and retrieve the keys used to authenticate the user.
Configure AWS CLI
pip install awscli
aws configure
AWS Access Key ID [None]: ****************VQNB
AWS Secret Access Key [None]: ****************AMAm
Default region name [None]: eu-central-1
Default output format [None]: json
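aws configure stores these values in ~/.aws/credentials and ~/.aws/config. To confirm that Boto3 actually picks them up, a quick sanity check with the STS GetCallerIdentity call (it needs no extra permissions) looks like this:
import boto3

# Prints the account ID and user ARN for the configured credentials;
# a credentials error here means the setup above did not work.
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print(identity['Account'], identity['Arn'])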
Amazon S3 examples
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-examples.html
1. Create a new Bucket
Simple code:
import boto3
bucket_name = "new-storage4"
region = "eu-central-1"
s3 = boto3.client('s3', region_name=region)
location = {'LocationConstraint': region}
s3.create_bucket(Bucket=bucket_name, CreateBucketConfiguration=location)
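Note: the CreateBucketConfiguration block is needed for every region except the default us-east-1, where the LocationConstraint must be omitted; this is why the error-handling version below treats region=None as a separate case.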
Code with exception handling:
import logging
import boto3
from botocore.exceptions import ClientError
def create_bucket(bucket_name, region=None):
    """
    Create an S3 bucket in a specified region

    :param bucket_name: Bucket name
    :param region: Region to create the bucket in, e.g. us-west-2
    :return: True if bucket created, else False
    """
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3', region_name=region)
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as error:
        logging.error(error)
        return False
    return True
create_bucket("new-storage3", "eu-central-1")
2. Get the list of Buckets
2.1. Get via RESOURCE:
import boto3
s3 = boto3.resource('s3')
print('Existing buckets:')
for bucket in s3.buckets.all():
    print(f"\t{bucket.name}")
Existing buckets:
new-storage3
ns-lab-web
ns-lab.open-storage
2.2. Get via CLIENT:
import boto3
s3 = boto3.client('s3')
response = s3.list_buckets()
print('Existing buckets:')
for bucket in response['Buckets']:
    print(f'\t\t{bucket["Name"]}')
Existing buckets:
new-storage3
ns-lab-web
ns-lab.open-storage
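Both approaches return the same buckets: resource() is Boto3's higher-level, object-oriented interface, while client() is the low-level interface that maps directly onto the S3 API and returns plain dictionaries.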
3. Upload a file to a Bucket
Upload via "RESOURCE" and "PUT_OBJECT"
import boto3
bucket_name = 'new-storage3'
filepath = '/home/salavat/Pictures/picture.jpg'
filename = 'photo-6.jpg'
s3 = boto3.resource('s3')
bucket = s3.Bucket(bucket_name)

# Open the file in binary mode and let the context manager close it
with open(filepath, 'rb') as file:
    bucket.put_object(Key=filename, Body=file)
Upload via "CLIENT" and "UPLOAD_FILE"
import boto3
bucket_name = 'new-storage3'
filepath = '/home/salavat/Pictures/picture.jpg'
filename = 'photo-7.jpg'
s3 = boto3.client('s3')
s3.upload_file(filepath, bucket_name, filename)
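The outline above also mentions checking the uploaded files, deleting a file, and deleting the bucket. A minimal sketch of those three steps, reusing bucket_name and filename from the upload examples (list_objects_v2, delete_object and delete_bucket are standard client calls; a bucket has to be empty before it can be deleted):
import boto3

bucket_name = 'new-storage3'
filename = 'photo-7.jpg'

s3 = boto3.client('s3')

# Check uploaded files: list the objects currently stored in the bucket
response = s3.list_objects_v2(Bucket=bucket_name)
for obj in response.get('Contents', []):
    print(f"\t{obj['Key']}\t{obj['Size']} bytes")

# Delete a single file from the bucket
s3.delete_object(Bucket=bucket_name, Key=filename)

# Delete the bucket itself (only works once it is empty)
s3.delete_bucket(Bucket=bucket_name)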
4. Check Access Permission
import boto3
# Retrieve a bucket's ACL
s3 = boto3.client('s3')
result = s3.get_bucket_acl(Bucket='ns-lab-web')
print(result)
print(f"Permission:\t{result['Grants'][0]['Permission']}")
{'ResponseMetadata': {'RequestId': '0KH7CJF83GB8TE4W', 'HostId': '+mqxBWKk4+INjELAEg+pICxfT8cDEU0/CpZPAbYeDUWGo3t6XfkoCb/RiL4Tb6GH90vk+x/bNaM=', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': '+mqxBWKk4+INjELAEg+pICxfT8cDEU0/CpZPAbYeDUWGo3t6XfkoCb/RiL4Tb6GH90vk+x/bNaM=', 'x-amz-request-id': '0KH7CJF83GB8TE4W', 'date': 'Wed, 15 Feb 2023 21:11:46 GMT', 'content-type': 'application/xml', 'transfer-encoding': 'chunked', 'server': 'AmazonS3'}, 'RetryAttempts': 1}, 'Owner': {'ID': 'a832a8b8a4e6cf3a0ca798daa02b087110234b6db836b61904789188f9615fb0'}, 'Grants': [{'Grantee': {'ID': 'a832a8b8a4e6cf3a0ca798daa02b087110234b6db836b61904789188f9615fb0', 'Type': 'CanonicalUser'}, 'Permission': 'FULL_CONTROL'}]}
Permission: FULL_CONTROL
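The example above prints only the first grant, but a bucket ACL can hold several. A small variation on the same call that walks every grant and shows the grantee type next to the permission:
import boto3

s3 = boto3.client('s3')
acl = s3.get_bucket_acl(Bucket='ns-lab-web')

# Each grant pairs a grantee (CanonicalUser, Group, ...) with a permission
# such as FULL_CONTROL, READ or WRITE
for grant in acl['Grants']:
    print(f"\t{grant['Grantee']['Type']}\t{grant['Permission']}")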