AWS - Day 5 - S3 - Simple Storage Service



John Rankin @ August 22, 2018

Amazon S3 (Simple Storage Service) is AWS's way to provide storage through web services using REST, SOAP, and BitTorrent APIs.

S3 uses an object-storage architecture, which differs from a filesystem architecture.

The key difference in object storage is that each object:

  • Contains the data itself
  • Contains metadata
  • Has a globally unique identifier

Let's not get too wrapped up in what that means yet. For now, just think of S3 as a storage repository, because that's what it is anyway.

So let's get started... First off, we need to create a bucket. A bucket, in filesystem terms, is just a directory to hold our stuff.

So let's create one using the AWS CLI. I am going to stick with the aws s3api commands because I like JSON output. Another key point about s3api is that it allows for queries.

root@damrkul2:~/aws/shell# aws s3api create-bucket --bucket rekous-bucket --region us-east-1
{
    "Location": "/rekous-bucket"
}


If I wanted to use another region OTHER THAN us-east-1, such as us-west-2, I would need to add a LocationConstraint. I would do that with the following command instead:

aws s3api create-bucket --bucket rekous-bucket --region us-west-2 --create-bucket-configuration LocationConstraint=us-west-2

Anyways... we've created our bucket. We can call "aws s3api list-buckets" to list our buckets, and we will see rekous-bucket in there.
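If you just want the bucket names, you can pass a --query filter to s3api. Something like this should do it (the output below is just illustrative):

root@damrkul2:~/aws# aws s3api list-buckets --query "Buckets[].Name"
[
    "rekous-bucket"
]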

Transferring Up Our First File

We have a few methods to do this. First I'll start with the simplest command, using aws s3 cp:

root@damrkul2:~/aws# aws s3 cp john.jpg s3://rekous-bucket/
upload: ./john.jpg to s3://rekous-bucket/john.jpg

 

The other way is using the s3api... but this time I'm going to take john.jpg and copy it up to the bucket, storing it as john2.jpg.

root@damrkul2:~/aws# aws s3api put-object --bucket=rekous-bucket --body=john.jpg --key="john2.jpg"
{
    "ETag": "\"33d28b819cb040922f5b5620ee7a936f\""
}

From the CLI, we can see the contents of our current bucket:

root@damrkul2:~/aws# aws s3api list-objects --bucket=rekous-bucket
{
    "Contents": [
        {
            "LastModified": "2018-08-22T17:46:51.000Z",
            "ETag": "\"33d28b819cb040922f5b5620ee7a936f\"",
            "StorageClass": "STANDARD",
            "Key": "john.jpg",
            "Owner": {
                "DisplayName": "rankintoday",
                "ID": "4a24d5f097f7469d6f43549c9da2ad1f21d6f0942d5e18030cebcb7d72ee9ace"
            },
            "Size": 117446
        },
        {
            "LastModified": "2018-08-22T17:48:39.000Z",
            "ETag": "\"33d28b819cb040922f5b5620ee7a936f\"",
            "StorageClass": "STANDARD",
            "Key": "john2.jpg",
            "Owner": {
                "DisplayName": "rankintoday",
                "ID": "4a24d5f097f7469d6f43549c9da2ad1f21d6f0942d5e18030cebcb7d72ee9ace"
            },
            "Size": 117446
        }
    ]
}

So you can see both files are now inside the bucket.
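Pulling an object back down is just as easy with s3api get-object. A quick sketch (the local filename john_copy.jpg is made up, and the output is trimmed and illustrative):

root@damrkul2:~/aws# aws s3api get-object --bucket=rekous-bucket --key=john.jpg john_copy.jpg
{
    "ContentType": "image/jpeg",
    "ContentLength": 117446,
    "ETag": "\"33d28b819cb040922f5b5620ee7a936f\""
}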

My bucket objects can be accessed via web browser at either https://rekous-bucket.s3.amazonaws.com/john.jpg or https://s3.amazonaws.com/rekous-bucket/john.jpg

Now, at this point in the tutorial, if we went out and clicked on these links, we would get an Access Denied error.

This is because the bucket permissions are private, and we need to change the bucket policy to make them public.

So let's change this... I've found an AWS Policy Generator page, located here: http://awspolicygen.s3.amazonaws.com/policygen.html. When we fill this out, we need to select the s3:GetObject action, and then for our resource, arn:aws:s3:::rekous-bucket/*. We are making this available to anonymous users, so we can just put a wildcard for the Principal. Let's take this JSON and save it to a file on our workstation.

rekousbucket_policy.json

{
  "Version":"2012-10-17",
  "Statement":[
    {
      "Sid":"AddPerm",
      "Effect":"Allow",
      "Principal": "*",
      "Action":["s3:GetObject"],
      "Resource":["arn:aws:s3:::rekous-bucket/*"]
    }
  ]
}

Let's now call the put-bucket-policy command and change the permissions of our bucket so the Internet can view the files inside.

root@damrkul2:~/aws# aws s3api put-bucket-policy --bucket=rekous-bucket --policy=file://rekousbucket_policy.json
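If you want to double-check that the policy actually landed, you can read it back with get-bucket-policy. The policy comes back as an escaped JSON string, roughly like this (illustrative):

root@damrkul2:~/aws# aws s3api get-bucket-policy --bucket=rekous-bucket
{
    "Policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Sid\":\"AddPerm\",\"Effect\":\"Allow\",\"Principal\":\"*\",\"Action\":[\"s3:GetObject\"],\"Resource\":[\"arn:aws:s3:::rekous-bucket/*\"]}]}"
}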

We can now go to https://rekous-bucket.s3.amazonaws.com/john.jpg and the image loads.

Eh it works.  
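You can also sanity-check it from the command line. A quick HEAD request with cURL should come back with a 200 now (headers trimmed, illustrative):

root@damrkul2:~/aws# curl -I https://rekous-bucket.s3.amazonaws.com/john.jpg
HTTP/1.1 200 OK
Content-Type: image/jpeg
Content-Length: 117446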

 

There are numerous other ways to upload files as well. For example, you could use cURL to upload an image, but you would need to follow the instructions on how to set up authentication in your cURL request. I would rather recommend using one of the AWS SDKs for file uploads. This tutorial will not dive into those methods; I will leave that up to you.

 

Tagging objects: I will just quickly touch on tagging objects. Tags help you distinguish what types of files your objects are, i.e. metadata. This can be beneficial for managing all your objects, since you can apply filters to find files more easily. Example: I might have a file named "asdf33fe.jpg", but the picture itself is of a dog. I tag the file with a "dog" tag. Then I can filter out all the dog images by using a tag filter. A quick sketch of doing this from the CLI is below.
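For example, here is a sketch of tagging john.jpg and reading the tags back (the tag key and value are just made up for illustration, and the output is illustrative):

root@damrkul2:~/aws# aws s3api put-object-tagging --bucket=rekous-bucket --key=john.jpg --tagging 'TagSet=[{Key=animal,Value=dog}]'
root@damrkul2:~/aws# aws s3api get-object-tagging --bucket=rekous-bucket --key=john.jpg
{
    "TagSet": [
        {
            "Key": "animal",
            "Value": "dog"
        }
    ]
}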
