Amazon S3 storage classes and optimization

Blake Howe
Nov 29, 2020

If your business runs a cloud-based information management system, chances are you have heard of Amazon Web Services and, more specifically, Amazon Simple Storage Service (S3). It is one of the most popular storage services on the market and is used for a wide variety of applications: blogs, static websites, personal media storage, data analytics, staging sites, and enterprise backup storage.

S3 can also be one of your largest cloud cost drivers: storage is charged per GB per month, and requests and data transfer out incur additional fees.

Here is a brief explanation of the different storage classes and a few ways to make sure your costs are optimized.

The S3 Storage Classes

Amazon S3 Standard

S3 Standard is the most commonly used storage class, offering high durability, availability, and performance for frequently accessed object data. It is appropriate for use cases including dynamic websites, cloud applications, content distribution, gaming, mobile applications, and big data analytics. S3 Standard offers low latency and high throughput, is designed for 99.999999999% (eleven nines) durability of objects across multiple Availability Zones and 99.99% availability over a given year, and supports encryption for data at rest and SSL for data in transit.
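As a concrete illustration, here is a minimal boto3 sketch (the bucket name, key, and file path are placeholders) that uploads an object to S3 Standard with server-side encryption at rest:

```python
import boto3

s3 = boto3.client("s3")

# Upload a file to the S3 Standard storage class with SSE-S3 encryption.
# STANDARD is the default class; it is spelled out here for clarity.
with open("summary.csv", "rb") as f:
    s3.put_object(
        Bucket="my-example-bucket",
        Key="reports/2020/summary.csv",
        Body=f,
        StorageClass="STANDARD",
        ServerSideEncryption="AES256",
    )
```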

Amazon S3 Intelligent-Tiering

Amazon S3 Intelligent-Tiering is perhaps the best option for data with unknown or changing access requirements. It optimizes cost automatically by moving data between four access tiers as access patterns change: two low-latency tiers optimized for frequent and infrequent access, and two opt-in archive tiers designed for asynchronous access to rarely used data. It is designed for 99.999999999% durability of objects across multiple Availability Zones and 99.9% availability over a given year. There is a small monthly monitoring and auto-tiering fee, but it is small compared with what you would otherwise pay to track access patterns and move objects between storage classes yourself.
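The frequent and infrequent access tiers are automatic once an object is stored in the class; the archive tiers must be opted into per bucket. A boto3 sketch, where the bucket name, configuration Id, and day counts are illustrative:

```python
import boto3

s3 = boto3.client("s3")

# Store an object in Intelligent-Tiering, then opt the bucket in to the
# two archive tiers for objects untouched for 90 and 180 days.
s3.put_object(
    Bucket="my-example-bucket",
    Key="datasets/clickstream.parquet",
    Body=b"...",  # placeholder payload
    StorageClass="INTELLIGENT_TIERING",
)

s3.put_bucket_intelligent_tiering_configuration(
    Bucket="my-example-bucket",
    Id="archive-cold-data",
    IntelligentTieringConfiguration={
        "Id": "archive-cold-data",
        "Status": "Enabled",
        "Tierings": [
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)
```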

Amazon S3 Standard-Infrequent Access

S3 Standard-IA is used for data that is accessed less frequently but requires rapid retrieval when it is needed. It matches S3 Standard in performance but offers a much lower per-GB storage price; in exchange, you pay a per-GB fee each time you retrieve data.

Amazon S3 One Zone-Infrequent Access

S3 One Zone-IA targets the same access frequency and retrieval speed as S3 Standard-IA. The difference is resilience: while other S3 storage classes store data across a minimum of three Availability Zones, S3 One Zone-IA stores it in a single Availability Zone. It offers the same low latency and high throughput as S3 Standard but is designed for 99.5% availability over a given year.

Amazon S3 Glacier

Amazon S3 Glacier is a low-cost storage class for archiving data for extended periods, and it is more secure, durable, and cheaper than typical on-premises archive solutions. To cater to varying needs, S3 Glacier offers three retrieval options at different price points, with retrieval times ranging from minutes to hours. You can either store data directly in the Glacier class or use a lifecycle rule to transition data into it.
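Retrieving an archived object is an asynchronous restore request. A boto3 sketch (bucket and key names are placeholders); the tier names trade cost against speed, with Bulk being the cheapest and slowest:

```python
import boto3

s3 = boto3.client("s3")

# Ask Glacier to restore a temporary, readable copy of an archived object.
# Tiers: "Expedited" (minutes), "Standard" (hours), "Bulk" (cheapest, slowest).
s3.restore_object(
    Bucket="my-example-bucket",
    Key="archives/2018/audit-log.gz",
    RestoreRequest={
        "Days": 7,  # how long the restored copy remains available
        "GlacierJobParameters": {"Tier": "Bulk"},
    },
)
```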

Amazon S3 Glacier Deep Archive

Amazon S3 Glacier Deep Archive is the storage class for customers in highly regulated industries such as healthcare, the public sector, and financial services, where data must be retained for extended periods, say 7 to 10 years, to meet compliance requirements. It is the cheapest of all the storage classes but is designed for data that will be accessed at most once or twice a year, with a retrieval time of about 12 hours. Glacier Deep Archive is an ideal alternative to magnetic tape libraries.

Six Ways to Optimize Your Amazon Simple Storage Service (S3) Costs

1. Using the Appropriate Storage Class

Amazon Simple Storage Service offers the storage classes described above, and each one suits a different kind of workload, so before you start using S3 it is important to understand when and why to use each class. In every class, cost depends on the amount of data stored, the number of HTTP GET and PUT requests, and the volume of data transferred out. The most commonly chosen classes are as follows:

Amazon S3 Standard: for general-purpose storage. The AWS Free Tier includes 5 GB of S3 Standard storage, 20,000 GET requests, 2,000 PUT requests, and 15 GB of data transfer out.

Amazon S3 Standard-Infrequent Access (IA): for data that is used less frequently; the same performance as S3 Standard but with lower storage pricing, plus a retrieval fee of $0.01 per GB.

Amazon S3 One Zone-Infrequent Access (IA): even cheaper than S3 Standard-IA; data is stored in a single Availability Zone with less resilience, making it ideal for secondary backups.

Amazon S3 Glacier: for data that needs to be retained for more than 90 days and takes longer to retrieve.

You can implement object lifecycle management, which automatically transitions your data from one storage class to another as its importance changes through each phase of your project, cutting out excess cost along the way. A minimal sketch follows.
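Here is what such a rule might look like with boto3; the bucket name, prefix, and day counts are illustrative:

```python
import boto3

s3 = boto3.client("s3")

# Tier objects down as they age: Standard -> Standard-IA at 30 days,
# then to Glacier at 90 days.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-with-age",
                "Status": "Enabled",
                "Filter": {"Prefix": "reports/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```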

2. Storing Data in Compressed Format

While Amazon S3 does not charge for transferring data into a bucket, it does charge for storage and for PUT, GET, and LIST requests, so storing your data in a compressed format helps you avoid paying for bytes you do not need.
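A small sketch of the idea, compressing a log file with gzip before uploading (file and bucket names are placeholders):

```python
import gzip

import boto3

s3 = boto3.client("s3")

# Compress locally, then upload; storage is billed on the compressed size.
with open("access.log", "rb") as src:
    compressed = gzip.compress(src.read())

s3.put_object(
    Bucket="my-example-bucket",
    Key="logs/access.log.gz",
    Body=compressed,
    ContentEncoding="gzip",  # so clients know to decompress on download
)
```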

3. Evenly Distribute Objects

When storing files in S3, distribute objects under a sensible key-prefix structure (virtual folders). This lets you scope LIST requests to a single prefix rather than scanning the whole bucket, reducing the number of operations needed to find a file and, with it, the extra GET and LIST request charges you would otherwise incur.
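For instance, with keys organized by date, a prefixed listing touches only the relevant slice of the bucket. A boto3 sketch, where the bucket name and prefix are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# List only the keys under one "virtual folder" instead of the whole bucket.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(
    Bucket="my-example-bucket", Prefix="logs/2020/11/"
):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])
```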

4. Enable Lifecycle Feature

One of the most important steps you can take to optimize your S3 costs is to avoid storing unnecessary data. There are a few points to keep in mind when evaluating whether a file is still necessary or can be deleted to minimize storage costs.

Log collection: If your deployment uses S3 for log collection, there is usually a period after which a log is no longer required. Set up automatic deletion by enabling the lifecycle feature, which removes a file once its retention period ends.

Files that can be recreated: Another thing to look out for is files that can easily be regenerated. If something is not difficult to recreate, why keep paying to store an old copy?

Incomplete uploads: It may happen that you start an upload, something comes up, and it never finishes; later you upload the complete file again. S3 keeps the parts from the incomplete multipart upload, so to be on the safe side, clean them up automatically after 7 days, as the sketch below shows.
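A boto3 sketch combining both lifecycle ideas, expiring old logs and aborting stale multipart uploads (names and day counts are illustrative):

```python
import boto3

s3 = boto3.client("s3")

# Two lifecycle rules: delete logs after 30 days, and abort any multipart
# upload still incomplete 7 days after it was started.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-logs",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                "Expiration": {"Days": 30},
            },
            {
                "ID": "abort-stale-multipart-uploads",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # applies to the whole bucket
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            },
        ]
    },
)
```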

5. Optimizing S3 buckets

Amazon S3 buckets are essentially the cloud equivalent of top-level file folders: containers for the objects that hold your data along with its descriptive metadata. Optimizing your buckets can also bring S3 costs down. Firstly, tag buckets appropriately so that spending can be traced to its owner and the misuse of S3 resources can be spotted in the event of a data compromise.

Secondly, monitor S3 object access patterns and move objects into the appropriate storage class accordingly. Finally, remove unused S3 buckets rather than leaving empty buckets sitting around.
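Tagging a bucket is one call with boto3 (the tag keys and values here are placeholders); the tags then show up in AWS cost reports for allocation:

```python
import boto3

s3 = boto3.client("s3")

# Tag a bucket so its cost can be attributed to a team and project.
s3.put_bucket_tagging(
    Bucket="my-example-bucket",
    Tagging={
        "TagSet": [
            {"Key": "team", "Value": "analytics"},
            {"Key": "project", "Value": "site-logs"},
        ]
    },
)
```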

6. Use a Cloud Cost Optimization Tool

We understand that carrying out all of the above tasks can be tedious, and not everyone has the time and energy for a detailed revamp of their data. You always have the option of using a cloud cost optimization tool, and there are plenty available, including AWS's own cost management tools. These tools monitor the factors mentioned above for you and carry out a number of the tasks themselves; for the rest, they give you tips on what you can do to optimize costs even further.
