Developing, managing, and operating your applications requires a wide variety of technology services. Customers often ask us what constitutes a fully functional, flexible technology infrastructure platform. Below, we outline requirements for a modern, robust, industry-leading technology infrastructure platform with all the benefits that the cloud brings. We also provide information about how AWS delivers against these requirements and why you might need each of these capabilities.

AWS began offering its technology infrastructure platform in 2006. At this point, we have over a million active customers using AWS in every imaginable way, and have developed considerable experience operating at scale. We’ve also innovated and delivered at a very rapid pace (delivering 159 significant features and services in 2012, 280 in 2013, 516 in 2014, 722 in 2015, and 1,017 in 2016). Expect this focus on rapidly delivering what customers want to continue.

Download a spreadsheet listing these requirements that you can use in your cloud platform evaluation activities.

 




Recognized independent third-party attestations, reports and certifications

Third-party attestations and certifications can give you confidence about a cloud operator’s policies and procedures and help to enable the deployment of business critical applications in a cloud environment.

AWS engages with independent third-party auditors and certifying bodies to provide customers with considerable information regarding the policies, processes, and controls we establish and operate. The relevant attestations, reports and certifications include:

  • SOC 1 / ISAE 3402
  • SOC 2
  • SOC 3
  • PCI DSS Level 1
  • ISO 27001
  • IRAP
  • FIPS 140-2
  • MPAA
  • HIPAA
  • FedRAMP (SM)
  • DoD CSM Levels 1-2, 3-5
  • DIACAP and FISMA
  • MTCS Tier 3 Certification
  • ITAR
  • CSA
  • ISO 9001

Learn More »

Control access to your cloud resources at a granular level

Creating users and groups, and using sophisticated policy language to control access to your cloud resources at granular levels (e.g. user, resource, time of day, source IP address) means that you can deploy applications more securely and implement your security policies more easily in the cloud.

AWS IAM allows you to create and manage users and groups, as well as use permissions to control access to AWS resources such as Amazon S3 storage buckets, Amazon EBS snapshots, or Amazon DynamoDB tables. Learn More »
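
To make this concrete, here is a minimal sketch using the boto3 Python SDK; the user name, bucket name, and source IP range are placeholders, and credentials are assumed to be configured. It creates a user and attaches an inline policy that grants read-only access to a single bucket, restricted to one address range:

```python
import json
import boto3  # AWS SDK for Python; assumed installed and configured with credentials

iam = boto3.client("iam")

# Hypothetical policy: read-only access to one bucket, only from a given CIDR range.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",
            "arn:aws:s3:::example-reports-bucket/*",
        ],
        "Condition": {"IpAddress": {"aws:SourceIp": "203.0.113.0/24"}},
    }],
}

iam.create_user(UserName="report-reader")      # create the IAM user
iam.put_user_policy(                           # attach the inline policy to that user
    UserName="report-reader",
    PolicyName="ReadReportsFromOffice",
    PolicyDocument=json.dumps(policy_document),
)
```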

Integrate with your existing identity and access management systems

Integrating with your existing identity and access management systems means you do not have to go through the process of creating parallel sets of identities in the cloud: you can use identities in your existing systems to grant access to your resources in the cloud.

AWS Directory Service is a managed service that allows you to connect your AWS resources with an existing on-premises Microsoft Active Directory or to set up a new, stand-alone directory in the AWS Cloud. Learn More »

Dedicated, hardware-based key management

For applications and data subject to rigorous contractual or regulatory requirements for managing cryptographic keys, additional protection is sometimes necessary. You should be able to use a service that enables you to protect your encryption keys within Hardware Security Modules (HSMs) designed and validated to government standards for secure key management deployed in the cloud.

The AWS CloudHSM service provides access to dedicated HSM appliances within the AWS cloud. Learn More »


Deploy applications close to your customers

It is important to have a choice of locations when deploying applications, so that you can place them close to your users or customers, ensuring the lowest possible latency and the best user experience. Learn More »

Maintain and ensure data locality

Many customers have regulatory or policy requirements that govern where their data must reside. Maintaining compliance with these regulations or policies requires that you know that data locality will be maintained.

AWS offers a choice of many different geographically-isolated Regions located all over the world. You decide where to place your data: it is not replicated to other regions and doesn’t move unless you choose to move it. Learn More »

Faster downloads, and lower latency connections for your customers

It is often important to be able to deliver a low-latency, high-performance application to your end users or customers, even if they are not located near the source of the application, whether that is static content or pre-recorded or live video.

Amazon CloudFront distributes content to end users via a network of edge locations across the globe. Learn More »

Ensure your applications remain reachable, with low latency, even during site outages

Every external-facing application relies on the Domain Name System (DNS) to ensure that inbound requests reach healthy infrastructure. A DNS service should be latency-aware, ensuring that users have a fast experience when using applications.

Amazon Route 53 is a highly available and scalable DNS service that includes DNS failover and latency-based routing, enabling you to deliver a highly available and fast performing application. Learn More »


Cloud applications have a wide range of requirements for compute, memory, and network resources.

Amazon EC2 provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers and system administrators. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Learn More »

Containers are lighter in weight and have less memory and computational overhead than virtual machines, and make it easy to support applications that consist of hundreds or thousands of small, isolated "moving parts." A properly containerized application is easy to scale and maintain, and makes efficient use of available system resources.

Amazon EC2 Container Service is a highly scalable, high performance container management service that supports Docker containers and allows you to easily run distributed applications on a managed cluster of Amazon EC2 instances. Learn More »

AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information. AWS Lambda starts running your code within milliseconds of an event such as an image upload, in-app activity, website click, or output from a connected device.

Lambda runs your code on high-availability compute infrastructure and performs all the administration of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, code and security patch deployment, and code monitoring and logging. All you need to do is supply the code. Learn More »
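
To illustrate the event-driven model, here is a minimal handler sketch for the Python runtime; the event shape shown assumes an S3 "object created" notification, and all names are illustrative rather than taken from any particular application:

```python
# Minimal AWS Lambda handler sketch (Python runtime assumed).
# The event structure below matches an S3 object-created notification.

def handler(event, context):
    # Each record describes one uploaded object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object uploaded: s3://{bucket}/{key}")
    return {"status": "ok"}
```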

General-purpose compute instances

General-purpose compute instances provide a balance of compute, memory and network resources and are a good choice for many applications such as small and mid-sized databases, data processing tasks, caching fleets and backend servers for various applications.

Amazon EC2 features T2 and M3 general-purpose instance types. T2 instances are Burstable Performance Instances that provide a baseline level of CPU performance with the ability to burst above the baseline. M3 instances provide a balance of compute, memory, and network resources and are a good choice for many applications. Learn More »

Compute-optimized instances

Compute-optimized instances are optimized for applications that benefit from high compute power. Compute-optimized instance types are recommended for running CPU-bound applications such as high-traffic front-end fleets, on-demand batch processing, distributed analytics, web servers, and high-performance science and engineering applications.

Amazon EC2 features C4 and C3 compute-optimized instance types that have a higher ratio of vCPUs to memory than other instance types. C4 instances are the latest generation of Compute-optimized instances, featuring the highest performing processors and the lowest price/compute performance in EC2. Learn More »

Memory-optimized instances

Memory-optimized instances are optimized for memory-intensive applications, and are recommended for applications such as databases, memcached and other distributed caches and larger deployments of enterprise applications.

Amazon EC2 features R3 memory-optimized instance types that have the lowest cost per GiB of RAM among Amazon EC2 instances. Learn More »

Storage-optimized instances

Storage-optimized instances are optimized for applications with specific disk I/O and storage capacity requirements and are recommended for applications such as NoSQL databases (for example, Cassandra and MongoDB), scale-out transactional databases, data warehousing, Hadoop, and cluster file systems.

Amazon EC2 features I2 and HS1 storage-optimized instance types. I2 instances provide very fast SSD-backed instance storage optimized for very high random I/O performance, delivering high IOPS at a low cost. HS1 instances provide very high storage density and high sequential read and write performance per instance, offering the highest storage density and the lowest cost per GB of storage among EC2 instances. Learn More »

GPU instances

GPU instances allow you to take advantage of GPU performance capabilities for applications such as computational chemistry, rendering, financial modeling and engineering design.

Amazon EC2 features the G2 instance type intended for graphics and general purpose GPU compute applications. Learn More »

Resize instances at any time

Since the requirements of your applications may change over time, or during the development of your application, you should be able to change the size of an instance easily at any time to ensure that you are only paying for the capacity that you need.

Amazon EC2 makes it easy for you to resize your compute instances at any time. You can move to a larger or smaller instance, depending on the needs of your application, with a few clicks of a mouse. Learn More »
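
The same resize can be scripted. The sketch below assumes the boto3 Python SDK, a placeholder instance ID, and a target instance type chosen for illustration; an instance must be stopped before its type can be changed:

```python
import boto3  # assumed configured with credentials and a default region

ec2 = boto3.client("ec2")
instance_id = "i-0123456789abcdef0"  # placeholder instance ID

# Stop the instance and wait until it is fully stopped.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Switch to a larger (or smaller) instance type, then start the instance again.
ec2.modify_instance_attribute(InstanceId=instance_id, InstanceType={"Value": "m3.xlarge"})
ec2.start_instances(InstanceIds=[instance_id])
```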

Pay-as-you-go pricing

For applications with short-term, spiky, or otherwise unpredictable usage, a pay-as-you-go pricing model, where you pay only for what you use, and with no upfront commitment, may be the most appropriate option.

Amazon EC2 On-Demand Instances let you pay for compute capacity by the hour with no long-term commitments. Learn More »

Significant discount (up to 75%) over On-Demand Instance pricing 

Reserved Instances provide you with a significant discount (up to 75%) compared to On-Demand Instance pricing. You are assured that your Reserved Instance will always be available for the operating system (e.g. Linux/UNIX or Windows) and Availability Zone in which you purchased it. 

For applications that have steady state needs, Reserved Instances can provide significant savings compared to using On-Demand Instances. Learn More »

Market-based pricing for significant discounts on excess capacity

For applications with flexible starts and stops (i.e. able to be interrupted), or for applications that are economical only at very low compute prices, market-based pricing can enable you to obtain significant discounts to standard on-demand pricing at a bid price that you specify.

Amazon EC2 Spot Instances allow you to specify the maximum price you are willing to pay for AWS excess capacity, which AWS makes available to customers via the Spot market. Whenever your maximum bid exceeds the current Spot price, your request is fulfilled and your instances run, and you pay the Spot price in effect while they run. Learn More »
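
As a sketch of how such a request might be placed with the boto3 Python SDK (the AMI ID, key pair name, and maximum price below are placeholders):

```python
import boto3

ec2 = boto3.client("ec2")

# Bid on spare capacity: the request is fulfilled whenever the Spot price is at or
# below the maximum price specified here.
response = ec2.request_spot_instances(
    SpotPrice="0.05",           # maximum price per instance-hour you are willing to pay
    InstanceCount=2,
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",
        "InstanceType": "c3.large",
        "KeyName": "my-key-pair",
    },
)
for request in response["SpotInstanceRequests"]:
    print(request["SpotInstanceRequestId"], request["State"])
```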

Sell your reserved capacity on the open market

If you have compute resources that you have committed to for a period of time, and then your plans change, the ability to sell that capacity to other users can give you flexibility and let you recoup investments.

The Reserved Instance Marketplace allows you to sell your Amazon EC2 Reserved Instances to other businesses and organizations if your needs change. You can also browse the Reserved Instance Marketplace to find an even wider selection of Reserved Instance term lengths and pricing options sold by other AWS customers. Learn More »

Achieve further discounts when buying in bulk

A pricing approach that lowers your unit cost as you reserve more capacity lets you achieve additional savings as your usage grows.

In addition to the lower hourly rate, larger purchases of reserved capacity are eligible for further, tiered capacity discounts. The more you use, the lower the price. Learn More »

The freedom to access 100,000s of cores, only when you need them

The availability of large-scale, utility computing removes the constraints of traditional infrastructure: capacity is no longer a barrier to delivering answers to complex questions in a short time frame.

Provision the compute capacity you need, when you need it. Cycle Computing recently provisioned 50,000 cores, significantly accelerating a drug discovery pipeline.

Easily assign static public IP addresses to your instances

Being able to assign static, public IP addresses easily to any instance running in the cloud means that you can move IP addresses from one instance to another, without complex configuration work or reliance on datacenter staff.

Amazon EC2 instances support Elastic IP addresses that allow you to easily assign and reassign public, static IP addresses to any of your instances. Learn More »

Control your IP networking configuration

When you deploy applications in the cloud, you should have complete control over your IP addressing configuration. This means it is easy to continue using your established IP addressing schemes and to connect to your existing IP networks easily.

Amazon Virtual Private Cloud provides you with complete control over a logically isolated virtual network you define. Learn More »

Create multiple private and public subnets

Being able to create multiple IP subnets, network interfaces, and control routing tables gives you fine-grained control over your application’s network communications. For example, you can create a public-facing subnet for web servers with Internet access, and a private subnet without Internet access for back-end servers. This helps make your applications more secure.

Amazon Virtual Private Cloud provides you with complete control over a logically isolated virtual network you define. Learn More »
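
As a minimal sketch of the public/private subnet pattern described above (using the boto3 Python SDK; the CIDR ranges are examples, not recommendations):

```python
import boto3

ec2 = boto3.client("ec2")

# Create a VPC with one public and one private subnet.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]
public_subnet = ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.1.0/24")["Subnet"]
private_subnet = ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.2.0/24")["Subnet"]

# Only the public subnet gets a route to an Internet gateway.
igw = ec2.create_internet_gateway()["InternetGateway"]
ec2.attach_internet_gateway(InternetGatewayId=igw["InternetGatewayId"], VpcId=vpc["VpcId"])

route_table = ec2.create_route_table(VpcId=vpc["VpcId"])["RouteTable"]
ec2.create_route(
    RouteTableId=route_table["RouteTableId"],
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId=igw["InternetGatewayId"],
)
ec2.associate_route_table(
    RouteTableId=route_table["RouteTableId"],
    SubnetId=public_subnet["SubnetId"],
)
```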

Attach multiple network interfaces to compute resources

For tasks such as creating a management network, using network and security appliances, or other scenarios that call for dual-homed instances, the ability to attach multiple network interfaces to compute instances is required.

Amazon Virtual Private Cloud lets you create and attach multiple elastic network interfaces to Amazon EC2 instances in a logically isolated virtual network you define. Learn More »

Control traffic to and from your compute resources at a granular level

Being able to control traffic flowing to and from your compute instances is an important part of being able to implement security policies and procedures in a cloud environment.

Amazon Virtual Private Cloud lets you use security groups and network access control lists (ACLs) so that you have fine-grained control of the network traffic flowing to and from your Amazon EC2 instances in a logically isolated virtual network you define. Learn More »
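
For illustration, the sketch below (boto3 Python SDK assumed; the VPC ID and CIDR ranges are placeholders) creates a security group that accepts HTTPS from anywhere and SSH only from a corporate address range:

```python
import boto3

ec2 = boto3.client("ec2")
vpc_id = "vpc-0123456789abcdef0"  # placeholder VPC ID

# Create the security group, then add two ingress rules.
sg = ec2.create_security_group(
    GroupName="web-servers",
    Description="Web tier security group",
    VpcId=vpc_id,
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},          # HTTPS from anywhere
        {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
         "IpRanges": [{"CidrIp": "203.0.113.0/24"}]},      # SSH from a corporate range
    ],
)
```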

Hardware-based virtual private networking connection to cloud resources

Using a hardware appliance of your choice to securely extend your network to the cloud with a VPN can provide easy and seamless access between your existing network infrastructure and your cloud resources.

Amazon Virtual Private Cloud allows you to use a hardware-based VPN to connect your network to a logically isolated virtual network you define. Learn More »

High speed, low latency, private, dedicated connectivity between on-premises and cloud infrastructure

If you have security or connectivity requirements that cannot be met by standard Internet connections, connecting your network directly to the cloud from a variety of locations using a private 1 Gbps or 10 Gbps connection helps you to meet those requirements.

AWS Direct Connect makes it easy to establish a dedicated, private network connection from your premises to AWS. Learn More »

Automatically scale up or down to meet customer demand

Your cloud infrastructure should automatically scale up or down, adding or removing capacity based on the policies and metrics that you define. This helps you meet the demands of your customers, while paying only for what you need and use.

Auto Scaling allows you to scale your Amazon EC2 capacity up or down automatically according to triggers you define. Learn More »
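
A minimal sketch of such a trigger, assuming the boto3 Python SDK and a hypothetical Auto Scaling group named "web-asg": the scaling policy adds two instances, and a CloudWatch alarm fires it when average CPU stays above 70% for ten minutes.

```python
import boto3

autoscaling = boto3.client("autoscaling")
cloudwatch = boto3.client("cloudwatch")

# Policy: add two instances to the group when triggered.
policy = autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="scale-out-on-high-cpu",
    ScalingAdjustment=2,
    AdjustmentType="ChangeInCapacity",
)

# Alarm: average CPU above 70% for two consecutive 5-minute periods invokes the policy.
cloudwatch.put_metric_alarm(
    AlarmName="web-asg-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-asg"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=70.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[policy["PolicyARN"]],
)
```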

Deploy applications in physically separate locations

To meet your application’s requirements for high availability, it should be easy for you to deploy your application across multiple redundant, physically separate locations. This helps ensure that an outage in one facility won’t interrupt the availability of your application.

Each AWS region offers multiple, redundant Availability Zones to deploy applications in physically separate locations, close enough for synchronous data replication. Learn More »

Automatically balance variable request loads

To ensure that demand for your applications is evenly balanced across your cloud infrastructure, you should be able to take advantage of a load balancing service that automatically scales and manages itself, freeing you from having to deploy and manage a separate service. The load balancing service should also check the health of your application so that failures do not impact your users.

Elastic Load Balancing automatically distributes incoming application traffic across multiple Amazon EC2 instances. It seamlessly provides the amount of load balancing capacity needed in response to incoming application traffic. Elastic Load Balancing can be used with Amazon VPC to provide internal and external load balancing. Learn More »

An operating system designed for the cloud

An operating system designed, built and optimized for the cloud can drive better stability, security, and performance, while enabling new levels of automation.

The Amazon Linux AMI is provided at no extra cost to Amazon EC2 users. It is designed to provide a stable, secure, and high-performance execution environment for applications running on Amazon EC2. The AMI includes numerous tools, libraries and utilities to integrate with AWS. Amazon Web Services provides ongoing security and maintenance updates to all instances running the Amazon Linux AMI. Learn More »

Red Hat Enterprise Linux

If you are currently using Red Hat Enterprise Linux, or have an application that requires it, being able to choose the operating system you need in the cloud makes it easy to deploy the applications your business requires.

Amazon EC2 running Red Hat Enterprise Linux provides a dependable cloud platform to deploy a broad range of applications and is available for all instance types including Cluster Compute instances.
Learn More »

SUSE Linux Enterprise Server

If you are currently using SUSE Linux Enterprise Server, or have an application that requires it, being able to choose the operating system you need in the cloud makes it easy to deploy the applications your business requires.

Amazon EC2 running SUSE Linux Enterprise Server is a proven cloud platform for development, test, and production workloads, with more than 10,000 certified applications from over 1,600 independent software vendors. SUSE Linux Enterprise Server is available for all instance types including Cluster Compute instances. Learn More »

Ubuntu Server

If you are currently using Ubuntu Server or have an application that requires it, being able to choose the operating system that you need in the cloud makes it easy to deploy the applications that your business requires.

Ubuntu Server can be deployed on Amazon EC2 instances from the AWS Marketplace in just a few clicks. Learn More »

Microsoft Windows Server

For applications that run on Microsoft Windows, being able to choose the right operating system and version in the cloud means you can easily deploy the Windows applications your business requires.

Amazon EC2 running Microsoft Windows Server is a fast and dependable environment for deploying applications using the Microsoft Web Platform, ASP.NET and Internet Information Server (IIS). Learn More »

Microsoft SQL Server

For applications that use Microsoft SQL server, being able to choose a pre-configured compute instance running Windows and SQL Server means you can spend more time focusing on your application instead of deploying software.

Amazon EC2 running Windows Server with SQL Server offers you the flexibility to run a database server for as much or as little time as you need. We offer Amazon EC2 with several versions of Microsoft SQL Server. Learn More »

Oracle Databases and middleware

Having easy access to Oracle solutions that you are familiar with in a cloud environment means that you can spend time focusing on building your application in the cloud instead of sourcing and deploying software.

AWS and Oracle have worked together to offer customers convenient options for deploying enterprise applications in the cloud. You can launch entire enterprise software stacks from Oracle on Amazon EC2. In addition, Amazon Relational Database Service provides a fully managed database service offering a choice of engines including Oracle. Learn More »

Microsoft business applications

Taking the Microsoft Enterprise applications you use in your business and deploying them in a cloud environment allows you to use the familiar applications you need and benefit from the availability, low cost and flexibility provided by the cloud.

You can run Microsoft applications such as SharePoint, Exchange, SQL Server, Lync, System Center, and Dynamics on the low-cost, high performance infrastructure of Amazon EC2. Learn More »

Microsoft License Mobility

If you have already acquired licenses for Microsoft software, that license should follow you to the cloud, preserving your investment.

Microsoft License Mobility through Software Assurance allows Microsoft customers to use Microsoft Server application licenses that they already own on AWS without any additional software license fees. Learn More »

SAP solutions

The flexibility to deploy your SAP solutions on a scalable, pay-as-you-go platform – without making long-term commitments or costly capital expenditures for their underlying infrastructure – can result in lower costs, increased efficiencies, and faster time to market.

AWS and SAP are dedicated to creating innovative solutions for businesses of all sizes, and delivering maximum customer value. New and existing SAP customers can deploy their solutions on SAP-certified Amazon EC2 instances in production environments, knowing that SAP and AWS have tested and verified the performance of the underlying AWS resources and certified them against standards applying to servers and virtual platforms. Learn More »

IBM solutions and developer tools

Having the flexibility to deploy IBM solutions and developer tools on a scalable, pay-as-you-go platform, without making long-term commitments or costly capital expenditures for underlying infrastructure, can provide you with access to the tools and solutions you need with the availability, low cost and flexibility provided by the cloud.

On Amazon EC2 you can run many of the proven IBM platform technologies with which you're already familiar, including IBM DB2, IBM Domino, IBM Informix, IBM Web Content Manager, IBM WebSphere Application Server, IBM WebSphere sMash, IBM WebSphere Portal Server, and InfoSphere DataStage/QualityStage with its corresponding Windows client. Learn More »

Powerful multi-core processors

High Performance Computing (HPC) workloads often require multiple, high speed cores. The availability of these processors in an on-demand utility cloud computing platform puts supercomputer class power in the hands of every developer.

Amazon EC2 Cluster Compute Instances feature the latest Intel Xeon processor E5 family processors, with advanced vector extensions, NUMA, turbo mode and hardware virtualization to provide an extremely high performance environment for your codes. Learn More »

High speed interconnects

Many HPC codes exchange information between nodes of a cluster over the network. A fast, interconnected network ensures low latency delivery of this exchange, and can significantly accelerate large-scale computational workloads.

Amazon EC2 Cluster Compute instances are deployed on a high-performance, low-latency, full-bisection-bandwidth 10 Gigabit Ethernet network. Learn More »

Physical proximity between instances

Placing instances on underlying hardware that is physically close together reduces communication latency between those instances, improving computational performance.

Amazon EC2 cluster placement groups ensure that applications benefit from full-bisection bandwidth and low-latency network performance. Learn More »

Obtain information about the health of your instances

It is important to know the status of instances that you run in the cloud, so that you can be confident your application is operating as designed. Built-in status checks can provide you with information about the availability of your instances.

Amazon EC2 instances have built-in Instance Status Checks to provide you with information about the health and availability of your instances. Learn More »

Deploy to hardware dedicated only to you

Customers often have compliance or policy requirements mandating that their compute infrastructure run on hardware dedicated solely for their use.

Amazon EC2 Dedicated Instances run on hardware dedicated to a single customer while allowing you to take full advantage of the benefits of the AWS Cloud. Learn More »

Provision cloud-based desktops for your end users

Being able to easily provision high-performance, cloud-based virtual desktops that end users can access from a range of client devices can help you keep your data secure and meet the needs of a diverse set of users.

Amazon WorkSpaces is a fully managed desktop computing service in the cloud that allows you to easily provision cloud-based desktops so end users can access the documents, applications, and resources they need from the device of their choice, including laptops, iPads, Kindle Fire tablets, and Android tablets. Learn More »


Highly durable storage for all types of data

For applications requiring high scale, anytime access to data, and high durability, you should be able to choose the geographic location to store your data without a minimum commitment or up-front fees, and be able to take advantage of other capabilities such as encryption at rest.

Amazon S3 allows the storage and access of any amount of data at any time, from anywhere on the web, and is designed for durability of 99.999999999%. Learn More »

Amazon S3 also offers the Standard-Infrequent Access tier, which has the same durability but at a reduced cost with a 30-day storage minimum. This is ideal for less active data that needs immediate access. Learn More »

Archival storage for infrequently accessed data

For data that you access infrequently, and for which retrieval times of several hours are suitable, you should choose a service with very high durability and availability at a very low cost.

Amazon Glacier is a low-cost service that provides high-durability storage for archiving and backup for as little as $0.01 per gigabyte per month. Glacier is designed for durability of 99.999999999%. Learn More »

Shared file storage

Amazon Elastic File System (EFS) is a shared file system for Amazon EC2 instances. You can create and configure file systems quickly and easily for home directories, software development, or content repositories, and each file system grows and shrinks automatically so you don't need to provision capacity in advance. Learn More »

Versioning

The ability to apply versioning to objects in cloud storage makes it easier for you to archive older versions of your objects, and helps prevent objects from being deleted or overwritten by mistake.

Amazon S3 provides versioning of objects, so that you can preserve, retrieve and restore every version of every object you store in S3. Learn More »
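
For illustration, enabling versioning on a bucket and listing the preserved revisions might look like the following sketch (boto3 Python SDK assumed; the bucket name is a placeholder):

```python
import boto3

s3 = boto3.client("s3")
bucket = "example-documents-bucket"  # placeholder bucket name

# Turn on versioning so that every overwrite or delete preserves the prior version.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# Listing object versions shows every preserved revision of each key.
versions = s3.list_object_versions(Bucket=bucket)
for version in versions.get("Versions", []):
    print(version["Key"], version["VersionId"], version["IsLatest"])
```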

Multi-factor delete

Requiring the use of a multi-factor authentication device before objects are deleted from the cloud helps ensure maximum protection of preserved versions of your objects.

Amazon S3 provides MFA Delete. When enabled, this feature requires the use of a multi-factor authentication device to delete objects stored in S3. Learn More »

Encryption

Organizational policies, or industry or government regulations, might require the use of encryption at rest or in transit. Using a cloud storage solution with built-in encryption can make it easier to ensure data security and compliance with policies or regulation.

Amazon S3 uses SSL to encrypt data in transit and has built-in functionality to encrypt data stored at rest. Learn More »

Flexible Access Control Mechanisms

When storing data in the cloud, having a flexible set of access control mechanisms makes it easier to comply with your security policies and helps ensure only authorized access to your data.

Amazon S3 supports bucket policies and access control lists (ACLs) to control access either to storage buckets or at the individual object level. Learn More »

Time-limited access to objects

Being able to provide access to objects by using a URL that is valid only for a defined period of time can be useful for scenarios such as software downloads or other applications where you want to restrict the length of time users have access to an object.

Amazon S3 supports query string authentication, which allows you to provide a URL which is valid only for a length of time that you define. Learn More »
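
A minimal sketch of generating such a time-limited URL with the boto3 Python SDK (bucket and key names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Generate a download URL that stops working after 15 minutes.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-downloads-bucket", "Key": "installer.zip"},
    ExpiresIn=900,  # validity period, in seconds
)
print(url)
```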

Audit logs

Being able to obtain logs that show all requests made against your cloud storage can be useful for monitoring applications or for audit purposes.

Amazon S3 supports logging of requests made against your Amazon S3 resources. These server access logs capture all requests made against a bucket or the objects in it and can be used for auditing purposes. Learn More »

Define policies to delete old data, or move to archival storage

Your policies might specify deleting or moving data to archival storage when it reaches a certain age. Storage lifecycle management lets you set policies and have the appropriate action taken automatically, without your intervention.

Amazon S3 provides object lifecycle management, allowing you to set policies. For example, data reaching a certain age can either be deleted or moved to Amazon Glacier and retained for archiving for as little as $0.01 per gigabyte per month. Learn More »
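
A lifecycle rule of that kind could be expressed as in the sketch below (boto3 Python SDK assumed; the bucket name, prefix, and retention periods are placeholders chosen for illustration):

```python
import boto3

s3 = boto3.client("s3")

# Archive objects under the "logs/" prefix to Amazon Glacier after 90 days,
# and delete them entirely after one year.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-logs-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-then-expire-logs",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }],
    },
)
```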

Cost Control

When you store data in the cloud, ensuring that costs can be allocated across different business groups, applications, or cost centers makes it easier for you to control expenditures. Configurable alerts that notify you when charges exceed your chosen threshold provide even more cost control.

Amazon S3 provides the ability to tag buckets so you can allocate costs across dimensions such as cost centers, applications, or business owners. Integration with Amazon CloudWatch (the monitoring service for AWS) allows billing alerts to be sent when your estimated charges exceed a threshold you set. Learn More »

Event Notification

Amazon S3 event notifications can be sent when objects are uploaded to Amazon S3. Event notifications can be delivered using Amazon SQS, Amazon SNS, or sent directly to AWS Lambda, enabling you to trigger workflows, alerts, or other processing. Learn More »
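
For illustration, the sketch below (boto3 Python SDK assumed) configures a bucket to send a notification to an SQS queue whenever an object is created; the bucket name and queue ARN are placeholders, and the queue's access policy is assumed to already allow Amazon S3 to send messages to it:

```python
import boto3

s3 = boto3.client("s3")

# Deliver a notification to an SQS queue for every object-created event in the bucket.
s3.put_bucket_notification_configuration(
    Bucket="example-uploads-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [{
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:uploads-queue",
            "Events": ["s3:ObjectCreated:*"],
        }],
    },
)
```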


Back up your data to the cloud automatically

Deploying an on-premises appliance that automatically transfers your data to a highly available and highly durable cloud storage platform means that you can easily deliver solutions for disaster recovery with minimum cost and effort.

AWS Storage Gateway connects your on-premises infrastructure to the AWS cloud. When using Gateway-Stored volumes, your on-premises gateway stores your primary data locally, and asynchronously backs up point-in-time snapshots of your data to Amazon S3. This provides you with durable and inexpensive off-site backups that you can recover locally or, in the case of a disaster, use with Amazon EC2 instances. Learn More »

Corporate file sharing integrated with the cloud

Typically, managing on-premises storage for departmental file shares and home directories can result in high capital and maintenance costs. Using an on-premises appliance that is connected to the cloud means you can focus on delivering what your users need, rather than on acquiring and provisioning storage infrastructure.

AWS Storage Gateway connects your on-premises infrastructure to the AWS cloud. When using Gateway-Cached volumes, you store your primary data in Amazon S3, and retain your frequently accessed data locally. Gateway-Cached volumes provide substantial cost savings on primary storage, minimize the need to scale your storage on-premises, and provide low-latency access to your frequently accessed data. Learn More »

Direct attached, ephemeral storage

Some applications, such as Hadoop or certain NoSQL databases, benefit from directly attached, ephemeral storage since persistence of this data beyond the lifetime of an instance is not required. Cloud compute instances should provide ephemeral storage for scenarios like these.

Amazon EC2 instances provide between 4 GB and 6400 GB of direct attached, ephemeral storage at no additional cost. Learn More »

Persistent storage

Certain applications, such as databases, require persistent storage that can be attached to compute instances of your choice. Replication of volumes should be provided to increase the durability of data.

Amazon EBS provides block-level persistent storage volumes for Amazon EC2 instances, sized between 1 GB and 1 TB. Amazon EBS provides three volume types: General Purpose (SSD), Provisioned IOPS (SSD), and Magnetic. The three volume types differ in performance characteristics and cost, so you can choose the right storage performance and price for the needs of your applications. Learn More »

Persistent storage, with provisioned I/O performance

Certain applications have specific I/O requirements that must be satisfied for them to perform at an acceptable level. Being able to specify required I/O performance helps ensure that your application delivers the level of performance its users require.

Amazon EBS offers a Provisioned IOPS feature, allowing you to specify the amount of I/O performance you require, up to 4000 IOPS. Learn More »

Durable snapshots

For long-term data durability, or to use existing volumes as the basis for new ones, the ability to take multiple point-in-time snapshots and have them stored with high durability can help you protect your data and make creation of new volumes simple.

Amazon EBS provides the ability to take multiple point-in-time snapshots, which are durably stored in Amazon S3 to help protect your data and make creation of new volumes easier. Learn More »
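
As a sketch of that workflow with the boto3 Python SDK (the volume ID and Availability Zone are placeholders), a snapshot is taken and a new volume is later created from it:

```python
import boto3

ec2 = boto3.client("ec2")

# Take a point-in-time snapshot of a volume; snapshots are stored durably in Amazon S3.
snapshot = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",          # placeholder volume ID
    Description="Nightly backup of database volume",
)
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snapshot["SnapshotId"]])

# A new volume can later be created from that snapshot, in the Availability Zone you choose.
ec2.create_volume(
    SnapshotId=snapshot["SnapshotId"],
    AvailabilityZone="us-east-1a",
)
```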


Deliver your website content to users around the world with low latency

A content distribution network that provides low-latency access helps users have the best experience, even though your application might not be deployed near all of your users.

Amazon CloudFront can deliver your entire website, both static and dynamic content, to users around the world via a global network of 54 points of presence (POPs) located near your users. Learn More »

Deliver software or large files to end users

When you need to deliver software updates or other large files, a content distribution network that works seamlessly with a cloud storage solution can help you provide a great experience with low latency for users and low cost to you.

Amazon CloudFront integrates with Amazon S3 to provide a low-latency option to deliver software and other large files to your end users, wherever they are located, via 52 POPs located around the world. Learn More »

Deliver streaming of pre-recorded media, progressive download or events to end users

If you have pre-recorded or progressive download media, or have live events you would like to stream to end users, a content distribution network can ensure your users have a great experience interacting with your content at a lower cost to you.

Amazon CloudFront supports distribution of streaming pre-recorded media, progressive download media, and live HTTP streaming to different devices, including Flash-based and Apple iOS devices. Learn More »

Control which users can access content

When using a content distribution network to provide a low latency experience to end users, it is important to be able to control access to content such as digital assets, training materials, personalized documents, or media files.

The Amazon CloudFront private content feature allows you to control who is able to download your content from Amazon CloudFront distributions. Learn More »


Managed MySQL, Oracle, Microsoft SQL Server, PostgreSQL, and MariaDB

A managed relational database service can provide you with access to the database engines with which you’re already familiar, but without common and time-consuming administrative tasks, freeing you to focus on your applications and business.

Amazon Relational Database Service provides a managed database service offering a choice of MySQL, Oracle, Microsoft SQL Server, or PostgreSQL engines. Learn More »

Speed and reliability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases

Amazon Aurora is a relational database engine that combines the speed and reliability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. It delivers up to five times the throughput of standard MySQL running on the same hardware. Amazon Aurora is designed to be compatible with MySQL 5.6, so that existing MySQL applications and tools can run without requiring modification.

Amazon Aurora joins MySQL, Oracle, Microsoft SQL Server, and PostgreSQL as the fifth database engine available to customers through Amazon RDS. Learn More »

Provisioned I/O throughput

Certain database workloads, such as OLTP, require a specific level of I/O performance to function acceptably. Being able to specify the level of I/O performance required without complex configuration or deployment processes makes it easier for you to focus on your application rather than the management of underlying infrastructure.

Amazon Relational Database Service supports Provisioned IOPS storage that allows you to specify the level of I/O performance you require, up to 30,000 IOPS, depending on engine size. Learn More »

Easy scaling

During the lifetime of an application, the size of a database is likely to grow, and peak demand might make it necessary to scale up the database infrastructure. Using simple scaling options, where you just select the size (larger or smaller) of database instance you require, makes it easy for you to keep up with the needs of your application.

Amazon Relational Database Service offers the ability to easily scale up or down to a larger or smaller managed database instance with a few mouse clicks. Learn More »

High availability

Traditionally, managing high availability for relational databases involves time-consuming setup and high ongoing costs. With a managed database service that allows you to easily configure high availability, you can avoid undifferentiated heavy lifting and focus on your application’s capabilities.

Amazon Relational Database Service offers a deployment option called Multi-AZ that provides high availability across multiple, geographically separate Availability Zones and can be enabled with a single check box. Learn More »

Read replicas

Read replicas provide the capability to scale out beyond the capacity constraints of a single DB Instance for read-heavy database workloads. Configuring a read replica for a relational database traditionally requires overhead for set-up and management. A managed database service can enable the provisioning of a read replica without these time-consuming setup tasks.

Amazon Relational Database Service offers the ability to create read replicas with a few mouse clicks, and takes care of all underlying configuration tasks. Learn More »

Bring your own licensing

If you have already acquired licenses for on-premises database software, you should be able to bring your licenses with you when you begin using a managed database service in the cloud, and preserve your existing licensing investment.

Amazon Relational Database Service allows you to ‘bring your own license’ when using Oracle or Microsoft SQL Server engines, preserving your existing licensing investment. Learn More »

Flexible pricing options

Depending on your requirements, a managed database offering should offer multiple pricing options that fit your preferred usage patterns and purchasing methods, including the option of pay-as-you-go and significant discounts via low up-front payment.

Amazon Relational Database Service offers commitment-free, on-demand, pay-as-you-go pricing, as well as Reserved Instances that offer a significant discount to the hourly rate in return for a low up-front payment. Learn More »

Security and Compliance

A managed database service that has numerous built-in security features can make it easy for you to comply with your security or compliance requirements without having to deploy and manage additional software.

Amazon Relational Database Service supports multiple levels of firewalls and integrates with Amazon VPC to provide network isolation for your database instances. Amazon RDS for MySQL & SQL Server offers SSL to secure data in transit. Amazon RDS for Oracle supports transparent data encryption to secure data at rest. Amazon RDS is SOC1 and SOC2 compliant, and also integrates with AWS Identity and Access Management (IAM) for fine-grained access control for users within your organization. Learn More »

Managed non-relational database

A managed non-relational database can provide you with access to a high-performance database that meets the needs of your application while freeing you up from the time-consuming administrative tasks of deploying, managing, scaling, and tuning a database.

Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed database and supports both document and key-value data models. Learn More »

Document and Key-value Data Model support

Amazon DynamoDB supports storing, querying, and updating documents. Using the AWS SDK you can write applications that store JSON documents directly into Amazon DynamoDB tables. Learn More »

Amazon DynamoDB supports key-value data structures. Each item (row) is a key-value pair where the primary key is the only required attribute for items in a table and uniquely identifies each item. Learn More »

Throughput based automatic provisioning

Traditional databases require time-consuming and complex capacity planning, configuration, and deployments to meet the performance needs of your application. A database that lets you define the throughput you require and performs all necessary provisioning tasks lets you focus on your application’s features rather than on managing database infrastructure.

Amazon DynamoDB allows you to define the throughput you require and takes care of all underlying infrastructure provisioning tasks to deliver the consistent high performance your application requires. Learn More »
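
As an illustration, creating a table and declaring its throughput might look like the sketch below (boto3 Python SDK assumed; table, attribute names, and capacity figures are examples only):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Declare the table schema and the read/write throughput it should sustain;
# DynamoDB provisions the underlying infrastructure to deliver it.
dynamodb.create_table(
    TableName="GameScores",
    AttributeDefinitions=[
        {"AttributeName": "PlayerId", "AttributeType": "S"},
        {"AttributeName": "GameTitle", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "PlayerId", "KeyType": "HASH"},   # partition key
        {"AttributeName": "GameTitle", "KeyType": "RANGE"}, # sort key
    ],
    ProvisionedThroughput={"ReadCapacityUnits": 100, "WriteCapacityUnits": 50},
)
```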

Built-in high availability

Achieving high-availability typically requires complex configuration and management tasks, both at set-up and on an ongoing basis. A managed database system, with high-availability built-in, lets you focus on your application rather than managing database infrastructure.

Amazon DynamoDB has built-in reliability, replicating your data across three geographically separated facilities in a region to ensure that your application can be highly available. Learn More »

Security and Compliance

A managed database service that has numerous built-in security features can make it easy for you to comply with your security or compliance requirements without having to deploy and manage additional software.

Amazon DynamoDB integrates with AWS Identity and Access Management (IAM) for fine-grained access control for users within your organization. You can assign unique security credentials to each user and control each user's access to services and resources. Learn More »

Managed in-memory cache

An in-memory cache can help improve the performance of your application. Using a managed in-memory cache means you can offload the management, monitoring, and operation of the in-memory cache and focus on your application.

Amazon ElastiCache provides a fully managed, Memcached-compliant, in-memory cache to help improve the performance of your application. Learn More »

Petabyte-scale managed data warehouse service

Traditional data warehousing solutions can be extremely costly and complex to deploy and manage. Using a managed, petabyte-scale data warehouse service means that most of the common administration tasks associated with provisioning, configuring, monitoring, backing-up and securing a data warehouse are taken care of.

Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service, costing less than $1,000 per terabyte per year, a tenth the cost of most traditional solutions. Learn More »

Easily resize your data warehouse

A data warehousing service should make it easy for you to re-size your data warehouse by choosing the size of the clusters you need and making infrastructure changes for you, allowing you to focus on your data rather than managing and deploying infrastructure.

Amazon Redshift lets you start with a single 2 TB node and scale up to one hundred 16 TB nodes, for 1.6 PB of compressed customer data, by using the Amazon Redshift API or the AWS Management Console. Your cluster remains available for queries during the resize process. Learn More »

Fast query performance regardless of the size of your data set

Traditional data warehouses can require expert tuning or expensive hardware to deliver fast query performance across a range of data set sizes. A data warehouse service should maintain fast query performance regardless of the size of your dataset so that you can focus on your data and not tuning or managing infrastructure.

Amazon Redshift uses columnar technology and parallelizes and distributes queries across multiple nodes to deliver fast query performance on datasets ranging in size from hundreds of gigabytes to a petabyte and more. Learn More »

Security and Compliance

A managed database service that has numerous built-in security features can make it easy for you to comply with your security or compliance requirements without having to deploy and manage additional software.

Amazon Redshift integrates with Amazon VPC to provide network isolation for your data warehouse clusters.

Data in transit can be secured using SSL. Data at rest, including all blocks, temporary results and backups can be secured using hardware-accelerated AES-256 encryption.

Amazon Redshift is SOC1 and SOC2 compliant and integrates with AWS Identity and Access Management (IAM) for fine-grained access control for users within your organization.

Use the tools you are already familiar with

Being able to use familiar tools with a data warehouse solution means that you don't have to learn new technologies to work with your data. Having a broad ecosystem of easily accessible data integration and Business Intelligence (BI) tools makes it easy for you to focus on working with your data instead of learning or acquiring tools.

Amazon Redshift has a strong ecosystem of software and consulting companies that enable you to use familiar tools and get help with implementations. You can use data integration tools like Informatica and Attunity, BI tools like MicroStrategy, Jaspersoft, and Tableau, and work with systems integrators such as Capgemini and Cognizant when deploying Amazon Redshift. Many of these tools can be easily deployed from the AWS Marketplace. Learn More »


Managed search service

If you have an application requiring search capability, being able to easily add a managed search service, without deploying and managing additional infrastructure, can free up your time to focus on your application.

Amazon CloudSearch is a fully managed search service in the cloud that allows easy integration of search functionality into applications. Learn More »

Managed queuing service

If you have an application that uses queuing, using a managed queuing service means that you don’t have to deploy and manage additional infrastructure, enabling you to spend your time developing your application.

Amazon Simple Queue Service (Amazon SQS) offers a reliable, highly scalable, hosted queue for storing messages as they travel between computers. By using Amazon SQS, developers can simply move data between distributed components of their applications that perform different tasks, without losing messages or requiring each component to be always available. Learn More »
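
For illustration, passing a message between two decoupled components might look like the following sketch (boto3 Python SDK assumed; queue and message names are placeholders):

```python
import boto3

sqs = boto3.client("sqs")

# Create a queue, send a job message, then receive and process it elsewhere.
queue_url = sqs.create_queue(QueueName="image-processing-jobs")["QueueUrl"]

sqs.send_message(QueueUrl=queue_url, MessageBody="resize photo-42.jpg")

messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for message in messages.get("Messages", []):
    print("processing:", message["Body"])
    # Delete the message once it has been handled so it is not delivered again.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```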

Managed notification service

If you have an application that uses notifications, using a managed notifications service means that you don’t have to deploy and manage additional infrastructure, enabling you to spend your time developing your application.

Amazon Simple Notification Service (Amazon SNS) is a web service that makes it easy to set up, operate, and send notifications from the cloud. It provides developers with a highly scalable, flexible, and cost-effective capability to deliver messages from an application to subscribers or other applications using various protocols (e.g. email, HTTP, SMS). Learn More »
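
A minimal sketch of the publish/subscribe flow, assuming the boto3 Python SDK and placeholder topic and email values:

```python
import boto3

sns = boto3.client("sns")

# Create a topic, subscribe an email address, and fan out one message to all subscribers.
topic_arn = sns.create_topic(Name="deployment-alerts")["TopicArn"]
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="ops-team@example.com")
sns.publish(
    TopicArn=topic_arn,
    Subject="Deployment finished",
    Message="Version 1.4.2 is now live in production.",
)
```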

Managed workflow service

If you have an application that uses workflows, using a managed workflow service means that you don’t have to deploy and manage additional infrastructure, enabling you to spend your time developing your application.

Amazon Simple Workflow Service (Amazon SWF) is a workflow service for building scalable, resilient applications. Whether automating business processes for finance or insurance applications, building sophisticated data analytics applications, or managing cloud infrastructure services, Amazon SWF reliably coordinates all of the processing steps within an application. Learn More »

Bulk email delivery

Sending bulk email, whether for application registration or marketing purposes, can be difficult and time-consuming. Challenges include IP address reputation management and monitoring deliverability. Using a managed email service lets you remain focused on your application and the emails you want to send, rather than managing separate email infrastructure.

Amazon Simple Email Service (Amazon SES) is a scalable and cost-effective bulk and transactional email-sending service that meets rigorous ISP requirements for email content. Learn More »

Media transcoding

If your application requires video transcoding, using a managed transcoding service allows you to spend your time on your application, rather than deploying, setting-up, and maintaining separate transcoding infrastructure.

Amazon Elastic Transcoder is a scalable and cost-effective service that transcodes video files to ensure playback on multiple devices, such as smartphones, tablets, and PCs. Learn More »

Managed Application Streaming

Running your application in the cloud and streaming it to a wide variety of mass-market devices means that you don't have to make trade-offs: limiting your audience by requiring high-end hardware, or delivering a lower-fidelity experience in order to support a broader range of devices.

Amazon AppStream is a flexible, low-latency service that lets you stream resource intensive applications and games from the cloud. Amazon AppStream deploys and renders your application on AWS infrastructure and streams the output to mass-market devices, such as personal computers, tablets, and mobile phones. Learn More »


Deploy infrastructure with templates

Using a template-based service makes deployments simpler, more orderly, and predictable instead of deploying each element of an application (e.g. security groups, instances, database servers, and load balancers) separately and by hand. Whether you are provisioning, updating or deprovisioning infrastructure, or deploying your application to other locations, templates make the process simpler and more predictable.

AWS CloudFormation allows easy creation and management of stacks of AWS resources via templates. Learn More »
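
To show the shape of a template-driven deployment, the sketch below (boto3 Python SDK assumed) creates a stack from a deliberately tiny template containing a single security group; real templates typically describe instances, load balancers, databases, and their relationships in the same declarative way. The stack and resource names are placeholders.

```python
import json
import boto3

cloudformation = boto3.client("cloudformation")

# A minimal template: one security group allowing inbound HTTP.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebSecurityGroup": {
            "Type": "AWS::EC2::SecurityGroup",
            "Properties": {
                "GroupDescription": "Allow inbound HTTP",
                "SecurityGroupIngress": [
                    {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80, "CidrIp": "0.0.0.0/0"}
                ],
            },
        }
    },
}

# Provision every resource described in the template as a single stack.
cloudformation.create_stack(StackName="example-web-stack", TemplateBody=json.dumps(template))
```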

Container-based deployment

Leveraging container-based deployments eliminates the need to worry about underlying infrastructure. You can simply upload an application and the container service handles the details of capacity provisioning, load balancing, auto scaling, and application health monitoring while still giving you full control of underlying resources.

AWS Elastic Beanstalk lets you simply upload an application and takes care of capacity provisioning, load balancing, auto scaling, and health monitoring while giving you full control of the underlying infrastructure. Learn More »

Application lifecycle management

Using a DevOps based solution for managing application lifecycle means that you can focus on the functionality of your application. You can let the service focus on resource provisioning, configuration management, application deployment, software updates, monitoring, and access control rather than deploying infrastructure manually, and then performing configurations for multiple tiers of your application by hand or by deploying your own management infrastructure.

AWS OpsWorks is a DevOps solution for managing applications of any scale or complexity in the AWS cloud. Learn More »

Track AWS Resource Configuration

In the cloud, resources can be created, attached, configured, used, detached, and destroyed in a matter of minutes. With all of this change happening, organizations of all sizes face new challenges in asset tracking, inventory management, change management, and governance.

AWS Config is a fully managed service that provides you with an AWS resource inventory, configuration history, and configuration change notifications to enable security and governance. Learn More »

 


 

Key Management Service

To date we have provided our customers with multiple options including client-side and server-side encryption for Amazon Simple Storage Service (S3), along with server-side encryption for Amazon Elastic Block Store (EBS), Amazon Redshift, Amazon RDS for Oracle, and Amazon RDS for SQL Server. Up until now, the server-side encryption support provided by these services has made use of "master keys" that are generated, stored, and managed within AWS.

AWS Key Management Service (KMS) is a managed service that makes it easy for you to create and control the encryption keys used to encrypt your data, and uses Hardware Security Modules (HSMs) to protect the security of your keys. AWS Key Management Service is integrated with other AWS services including Amazon EBS, Amazon S3, and Amazon Redshift. Learn More »


Built-in monitoring tools with alerts

Monitoring tools provide you information and alerts about the status of your cloud resources. Rather than deploying, administering, and maintaining your own monitoring solution, using built-in monitoring tools allows you to focus on your application.

Amazon CloudWatch provides monitoring and alerts for AWS cloud resources and applications. Metrics are provided at one-minute intervals, and you can define your own custom metrics. Learn More »
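
For example, the sketch below publishes a custom metric and creates an alarm on it with the AWS SDK for Python (boto3); the namespace, metric name, threshold, and SNS topic ARN are hypothetical.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Publish a custom metric data point (names are hypothetical)
cloudwatch.put_metric_data(
    Namespace="MyApp",
    MetricData=[{"MetricName": "QueueDepth", "Value": 42, "Unit": "Count"}],
)

# Alarm when the metric stays high for five consecutive one-minute periods
cloudwatch.put_metric_alarm(
    AlarmName="MyApp-QueueDepth-High",
    Namespace="MyApp",
    MetricName="QueueDepth",
    Statistic="Average",
    Period=60,
    EvaluationPeriods=5,
    Threshold=100,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical SNS topic
)
```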

Consistent management tools

Management tools provide solutions for controlling your cloud resources. A consistent set of management tools makes for an easier management experience and lets you control cloud resources from wherever you are, whether in front of a browser or on a mobile device.

The web-based AWS Management Console and companion mobile applications provide a simple and intuitive way of managing AWS from a browser or on the go. Learn More »


Tools for Code Management and Deployment

The following AWS tools are designed to help individual developers, teams of developers, and system administrators store, integrate, and deploy their code on the cloud.

AWS CodeDeploy - This service efficiently deploys your released code to a "fleet" of EC2 instances while taking care to leave as much of the fleet online as possible. It can accommodate fleets that range in size from one instance all the way up to tens of thousands of instances. Learn More »

AWS CodeCommit - This is a managed revision control service that hosts Git repositories and works with all Git-based tools. You no longer need to worry about hosting, scaling, or maintaining your own source code control infrastructure.

AWS CodePipeline - This service will help you to model and automate your software release process. You can design a development workflow that fits your organization's needs and your working style and use it to shepherd your code through the staging, testing, and release process. CodePipeline works with third-party tools but is also a complete, self-contained end-to-end solution. 
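
As a rough sketch of how a CodeDeploy rollout can be triggered programmatically, the example below uses the AWS SDK for Python (boto3) to start a deployment from a revision stored in S3. The application, deployment group, bucket, and bundle names are hypothetical and assume you have already registered the application and its fleet.

```python
import boto3

codedeploy = boto3.client("codedeploy")

# Start a deployment of a packaged release stored in S3 (all names hypothetical)
response = codedeploy.create_deployment(
    applicationName="MyWebApp",
    deploymentGroupName="Production",
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "my-release-artifacts",
            "key": "mywebapp-1.2.3.zip",
            "bundleType": "zip",
        },
    },
    description="Release 1.2.3",
)
print("Started deployment:", response["deploymentId"])
```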

 

IDE integration

Integration with common IDE environments means that it’s easy for developers to provision and manage cloud resources directly from tools they already use, rather than having to learn new interfaces.

AWS provides IDE toolkits for Visual Studio and Eclipse so that developers can access and administer their AWS infrastructure as they build .NET or Java applications running on AWS. Learn More »

SDKs for the languages you use

Developers can easily manipulate cloud resources by using SDKs for the languages they already know, rather than having to learn new languages.

AWS provides SDKs for Android, iOS, Java, .NET, Node.js, Python, PHP and Ruby. Learn More »
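
As a small illustration of the SDK experience, the sketch below uses the AWS SDK for Python (boto3) to create a bucket, upload a file, and list its contents. The bucket and file names are hypothetical, and bucket creation outside us-east-1 would also need a location constraint.

```python
import boto3

s3 = boto3.client("s3")

# Create a bucket, upload a local file, and list what is stored (names hypothetical)
s3.create_bucket(Bucket="my-example-bucket-1234")
s3.upload_file("report.csv", "my-example-bucket-1234", "reports/report.csv")

for obj in s3.list_objects_v2(Bucket="my-example-bucket-1234").get("Contents", []):
    print(obj["Key"], obj["Size"])
```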


Round the clock support options

It is not possible to predict when support services might be required, so it is important to ensure that support is available when you need it.

AWS provides basic support to all customers at no additional charge. Learn More »

Direct, one to one technical support and rapid case review

For critical applications, one-on-one support and access to technical account managers, with rapid case review and direct routing to senior engineers, are important components in ensuring problems are addressed quickly.

AWS Support provides a choice of four support tiers, including features such as 15-minute response times, direct routing to senior engineers, and access to a Technical Account Manager. Learn More »

Regular, actionable guidance on how to optimize cloud infrastructure and reduce costs

Receiving proactive guidance from a tool that incorporates best practices, aggregated from experience in serving large numbers of customers, can be useful to you. These programmatic insights can improve the security and availability of your applications and help you make better use of under-utilized resources. They should also proactively identify areas of cost savings.

AWS Trusted Advisor uses hundreds of automated checks to inspect your AWS environment and proactively make recommendations on opportunities to save money, improve performance, or close security gaps. Learn More »

Direct channels of communication and support for launches and major events

During a high-profile event, such as a marketing campaign or product launch when your applications may experience significant demand, you may benefit from direct, high-touch engagement with senior engineers who can provide 24/7 support for the duration of the event.

The AWS Infrastructure Event Management program provides critical support from senior engineers to help ensure success for high profile events. Learn More »


Control your IP networking configuration

When you deploy applications on cloud computing platforms, you should have complete control over your IP addressing configuration. This means it is easy to continue using your established IP addressing schemes and easily connect to your existing IP networks.

Amazon Virtual Private Cloud provides you with complete control over a logically isolated virtual network you define. Learn More »

Create multiple private and public subnets

Being able to create multiple IP subnets and network interfaces, and to control routing tables, gives you fine-grained control over your application’s network communications. For example, you can create a public-facing subnet for web servers that have Internet access, while placing back-end servers in a private subnet without Internet access. This helps make your applications more secure.

Amazon Virtual Private Cloud provides you with complete control over a logically isolated virtual network you define. Learn More »
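
A rough sketch of the public/private subnet pattern described above, using the AWS SDK for Python (boto3); the CIDR ranges are hypothetical examples of an addressing scheme you control.

```python
import boto3

ec2 = boto3.client("ec2")

# Create the VPC and one public plus one private subnet (CIDRs are hypothetical)
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24")["Subnet"]["SubnetId"]

# Give only the public subnet a route to the Internet
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

public_rt = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=public_rt, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=public_rt, SubnetId=public_subnet)

# The private subnet keeps the VPC's default route table, which has no Internet route
```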

Highly durable automatic data replication and recovery

By deploying a virtual appliance in your infrastructure, you can use secure, industry-standard protocols to easily back up your data asynchronously over a connection to the cloud. Data should be encrypted at rest and stored in a durable storage system so that you can access it when you need to.

The AWS Storage Gateway service makes it easy to back up your data to the cloud; it securely connects an on-premises software appliance to the AWS cloud. Learn More »

Move large amounts of data to and from the cloud cost-effectively with portable storage devices

Moving large amounts of data to and from the cloud over the Internet can be expensive and time-consuming. Using portable storage devices to transport large amounts of data to the cloud can be faster, as well as more cost-effective, than upgrading your connectivity.

AWS Import/Export Snowball accelerates moving large amounts of data to and from the cloud using portable storage appliances as transport. Learn More »

Easily import virtual machines into the cloud

Rather than re-creating on-premises virtual machines that you have already built, being able to easily import or export them to or from the cloud lets you leverage your existing investments, making it easier to deploy workloads across your IT infrastructure.

VM Import/Export enables you to easily import virtual machines from your existing environment as Amazon EC2 instances and export them back to your on-premises environment, and is available at no additional charge. Learn More »

Hardware-based virtual private networking connection to cloud resources

Using a hardware appliance of your choice to extend your network securely to the cloud with a VPN can provide easy, seamless access between your existing network infrastructure and cloud resources.

Amazon Virtual Private Cloud allows you to use a hardware-based VPN to connect your network to a logically isolated virtual network in the AWS cloud that you define. Learn More »
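
As an illustration, the sketch below uses the AWS SDK for Python (boto3) to create the customer gateway, virtual private gateway, and VPN connection that make up a hardware VPN. The appliance's public IP, BGP ASN, and VPC ID are hypothetical placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Describe your on-premises VPN appliance (public IP and ASN are hypothetical)
cgw = ec2.create_customer_gateway(Type="ipsec.1", PublicIp="203.0.113.12", BgpAsn=65000)

# Create a virtual private gateway and attach it to an existing VPC (ID hypothetical)
vgw = ec2.create_vpn_gateway(Type="ipsec.1")
ec2.attach_vpn_gateway(VpnGatewayId=vgw["VpnGateway"]["VpnGatewayId"], VpcId="vpc-0abc1234")

# Create the VPN connection between the two gateways
vpn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=cgw["CustomerGateway"]["CustomerGatewayId"],
    VpnGatewayId=vgw["VpnGateway"]["VpnGatewayId"],
    Options={"StaticRoutesOnly": True},
)
print(vpn["VpnConnection"]["VpnConnectionId"])
```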

High speed, low latency, private, dedicated connectivity between on-premises and cloud infrastructure

If you have security or connectivity requirements that cannot be met by standard Internet connections, connecting your network directly to the cloud from a variety of locations, using a private 1 Gbps or 10 Gbps connection, helps you meet those requirements.

AWS Direct Connect makes it easy to establish a dedicated, private network connection from your premises to AWS. Learn More »


Managed Hadoop workloads

The Hadoop framework has been adopted rapidly for data-intensive workloads because of its ability to work with complex data easily.

Amazon Elastic MapReduce (EMR) makes it easy to provision Hadoop clusters for data processing at any scale in just a few clicks. Learn More »
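
For example, a provisioning sketch with the AWS SDK for Python (boto3): the cluster name, key pair, log bucket, release label, and instance types are hypothetical choices, and the default EMR IAM roles are assumed to already exist in the account.

```python
import boto3

emr = boto3.client("emr")

# Launch a small Hadoop/Hive cluster (all names and sizes are hypothetical)
response = emr.run_job_flow(
    Name="analytics-cluster",
    ReleaseLabel="emr-5.12.0",
    Applications=[{"Name": "Hadoop"}, {"Name": "Hive"}],
    Instances={
        "MasterInstanceType": "m4.large",
        "SlaveInstanceType": "m4.large",
        "InstanceCount": 5,
        "Ec2KeyName": "my-keypair",
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    LogUri="s3://my-emr-logs/",
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster id:", response["JobFlowId"])
```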

Dynamic resizing for Hadoop clusters

Adjusting the size of your Hadoop clusters means you can quickly and flexibly conform to the size and scope requirements of important analytics projects.

Amazon EMR supports dynamic resizing of clusters, so you can add or remove capacity and pay only for what you use. Learn More »
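
A minimal sketch of resizing a running cluster with the AWS SDK for Python (boto3); the cluster ID and target size are hypothetical.

```python
import boto3

emr = boto3.client("emr")

# Find the core instance group of a running cluster (cluster ID is hypothetical)
groups = emr.list_instance_groups(ClusterId="j-2AXXXXXXGAPLF")["InstanceGroups"]
core = next(g for g in groups if g["InstanceGroupType"] == "CORE")

# Grow the core group to ten instances; shrink it the same way when the job is done
emr.modify_instance_groups(
    InstanceGroups=[{"InstanceGroupId": core["Id"], "InstanceCount": 10}]
)
```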

A choice of Hadoop distributions

Just like operating systems, different analytics challenges can benefit from different distributions of Hadoop.

Amazon EMR lets you choose the right Hadoop distribution for your workload every time, and makes it easy to experiment and benchmark your queries. Learn More »

Analytics in your choice of language

Take advantage of the skills and expertise in your organization by using existing tools and code bases for large-scale analytics projects.

Amazon EMR lets you work with the tools you’re used to by running analytics in virtually any language, from Ruby and Python to Java and C++.

Established and growing analytics ecosystem

The Hadoop ecosystem of data management tools is growing quickly. Using these tools gives you a fast track to advanced analytics, machine learning, and data storage.

Amazon EMR lets you take advantage of a growing collection of Hadoop data management and analytics tools, training, and partners. Learn More »

Managed non-relational database

A managed non-relational database can provide unlimited data storage and the ability to scale seamlessly from hundreds to hundreds of thousands of reads and writes per second.

Amazon DynamoDB is a fast, fully managed NoSQL database service that can store any amount of data and serve any level of request traffic. Average latencies are in the single-digit milliseconds, regardless of scale. The service runs on SSDs, delivering fast performance and high throughput, with built-in high availability.

Amazon DynamoDB integrates easily with Amazon EMR and Amazon Redshift to give you multiple options for analyzing your data. Learn More »
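
As a small sketch of the developer experience, the example below creates a table and reads and writes an item with the AWS SDK for Python (boto3). The table, key, and attribute names are hypothetical, and the provisioned throughput values are illustrative only.

```python
import boto3

dynamodb = boto3.resource("dynamodb")

# Create a simple table keyed on user_id (names and throughput are hypothetical)
table = dynamodb.create_table(
    TableName="UserSessions",
    KeySchema=[{"AttributeName": "user_id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "user_id", "AttributeType": "S"}],
    ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
)
table.wait_until_exists()

# Write and read a single item
table.put_item(Item={"user_id": "u-1001", "last_page": "/checkout"})
print(table.get_item(Key={"user_id": "u-1001"})["Item"])
```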

Petabyte-scale managed data warehouse service

A managed data warehousing service allows you to use SQL to analyze hundreds of gigabytes to a petabyte and more of data. You can use the same SQL-based tools you use today for dashboards, reporting, and ad hoc querying.

Amazon Redshift is a fast, fully managed petabyte-scale data warehouse service, costing less than $1,000 per terabyte per year, a tenth the cost of most traditional solutions. You can provision an Amazon Redshift cluster in minutes and scale easily from a single 2 TB node to 100 16 TB nodes for 1.6 PB of storage.

Amazon Redshift uses columnar technology and distributes and parallelizes queries across multiple nodes to deliver fast query performance for datasets of all sizes.

Amazon Redshift integrates with Amazon DynamoDB and Amazon S3 enabling you to integrate and analyze data from multiple different sources using SQL. Learn More »
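
For illustration, a sketch of provisioning a cluster with the AWS SDK for Python (boto3); the cluster identifier, node type, node count, and credentials are hypothetical choices.

```python
import boto3

redshift = boto3.client("redshift")

# Provision a small multi-node cluster (identifiers and credentials are hypothetical)
redshift.create_cluster(
    ClusterIdentifier="analytics-dw",
    NodeType="dc1.large",
    ClusterType="multi-node",
    NumberOfNodes=4,
    DBName="analytics",
    MasterUsername="admin",
    MasterUserPassword="Str0ngPassw0rd!",
)

# Wait until the cluster is available, then read its SQL endpoint
redshift.get_waiter("cluster_available").wait(ClusterIdentifier="analytics-dw")
endpoint = redshift.describe_clusters(ClusterIdentifier="analytics-dw")["Clusters"][0]["Endpoint"]
print(endpoint["Address"], endpoint["Port"])
```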

Durable storage for large data volumes

Your data is a valuable resource. It requires a durable, reliable, and redundant storage platform that's cost effective.

Amazon S3 supports any type of data, from XML to Excel and from 1 byte to 5 TB per object, redundantly stored across multiple data centers.

Query all your data

Integrating queries across multiple storage volumes, locations, and formats can lead to more accurate and actionable information.

Amazon EMR can access objects on S3 as if they were on native disk, which means less time moving data around, and more integration between datasets.

Amazon Redshift enables you to store all your data in a central location for reporting, ad hoc querying and trend analysis.

Integration across your data formats

Business data is rarely packaged for analytics, which makes tools that load and transform such data very attractive.

Amazon EMR is pre-loaded with tools to read virtually any enterprise file format, via HParser from Informatica. Learn More »

Unlimited scale NoSQL data stores

The cost of data generation is falling, suggesting that more and more data will be created in less time. Services to manage that data must be simple to provision and provide access at virtually unlimited scale.

Amazon DynamoDB lets you store an unlimited amount of data, and access it at virtually unlimited scale, without provisioning a single server.

High performance from solid state drives

Low latency access to data at scale is challenging, especially on traditional spinning media. Solid state drives provide much faster, more predictable response times.

Amazon DynamoDB uses solid state drives under the hood to ensure single-digit millisecond latency, irrespective of the volume of queries, or scale of data.

Arbitrarily complex data processing workflows

Data processing usually involves multiple steps, simultaneously executed within a workflow. These workflows need to be reliable, durable, and able to detect failures or missing data efficiently.

The AWS Data Pipeline helps you build repeatable, scalable processing workflows across multiple data sources, with drag and drop ease. Learn More »

Easy integration across multiple big data services

Real world analytics involves different types of data that need to be processed, integrated, and analyzed.

AWS provides best-of-breed services for Hadoop, NoSQL, and SQL data analytics with nearly infinite scalability for compute and storage.

Amazon EMR and Amazon Redshift integrate directly with Amazon S3 and Amazon DynamoDB to enable easy analysis of structured and unstructured data.

By providing integrated, specialized services, AWS lets customers choose the best tool for the job at hand.
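
As one concrete example of that integration, the sketch below loads CSV files from S3 into a Redshift table with the COPY command, using the psycopg2 PostgreSQL driver against the cluster's SQL endpoint. The endpoint, credentials, table, bucket path, and IAM role ARN are all hypothetical.

```python
import psycopg2

# Connect to the cluster's SQL endpoint (all connection details are hypothetical)
conn = psycopg2.connect(
    host="analytics-dw.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="admin",
    password="Str0ngPassw0rd!",
)

# Load compressed CSV files from S3 in parallel using COPY
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY clickstream
        FROM 's3://my-raw-events/2017/01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS CSV
        GZIP;
    """)
```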

Low cost, high performance data warehousing

Data warehousing and business intelligence allows organizations to integrate data into a single source of truth, for better insight into all aspects of their business and customers, such as operations, supply chain and sales activities.

Amazon Redshift is a fully managed data warehouse service that costs less than $1,000/TB/year and delivers fast query performance for datasets ranging in size from hundreds of gigabytes to a petabyte and more. Learn More »

Easy integration with the ETL and business intelligence software you already use

Using the ETL and business intelligence tools with which you are already familiar can make it easy to move data from on-premises systems into the cloud and start analyzing your data right away, without having to learn how to use a different set of tools.

Amazon Redshift has over 15 ETL and BI partners who have integrated their tools with the service. This makes it easy for you to use Informatica, Attunity, MicroStrategy, Jaspersoft, Tableau and many other tools with Amazon Redshift clusters. Learn More »

Low latency, high bandwidth networking

Distributed workloads require fast, low latency networking for efficient communication of information between processing components.

Amazon EC2 supports low latency, full bisection bandwidth, 10 Gigabit Ethernet networking to ensure fast, efficient movement of data.
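
One common way to take advantage of that networking is to launch instances into a cluster placement group. The sketch below (AWS SDK for Python, boto3) shows the general shape; the group name and AMI ID are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# Cluster placement keeps instances on a low latency, high bandwidth network
# segment within a single Availability Zone (group name is hypothetical)
ec2.create_placement_group(GroupName="hpc-group", Strategy="cluster")

# Launch a 10 Gigabit-capable instance type into the group (AMI ID is hypothetical)
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c4.8xlarge",
    MinCount=4,
    MaxCount=4,
    Placement={"GroupName": "hpc-group"},
)
```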

Public Data sets in the cloud

The availability of large datasets next to utility computing resources allows you to quickly experiment and innovate to create new products, services, and insights.

The Public Data Sets on AWS program makes valuable data sets available, such as those created by the Common Crawl and the 1000 Genomes projects, at no cost to the community.
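
As an illustration, public data set buckets can typically be read without AWS credentials. The sketch below uses the AWS SDK for Python (boto3) with unsigned requests to list a few objects from the 1000 Genomes bucket; the exact bucket layout may differ from what is shown.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous (unsigned) S3 client, since public data sets do not require credentials
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List a few objects from the 1000 Genomes public data set
listing = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
for obj in listing.get("Contents", []):
    print(obj["Key"], obj["Size"])
```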

Fully managed real-time processing of streaming data

Being able to easily write applications that process information in real-time, from sources such as web site click-streams, marketing and financial information, manufacturing instrumentation and social media, and operational logs and metering data means you can make decisions more quickly and at the most relevant time.

Amazon Kinesis is a fully managed service for real-time processing of streaming data at massive scale. With Amazon Kinesis applications, you can build real-time dashboards, capture exceptions and generate alerts, drive recommendations, and make other real-time business or operational decisions. Learn More »
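
For example, a minimal producer-and-consumer sketch with the AWS SDK for Python (boto3); the stream name and payload are hypothetical, and the stream is assumed to already exist.

```python
import boto3

kinesis = boto3.client("kinesis")

# Producer: write one record to an existing stream (name and payload hypothetical)
kinesis.put_record(
    StreamName="clickstream",
    Data=b'{"page": "/home", "user": "u-1001"}',
    PartitionKey="u-1001",
)

# Consumer: read the most recent records from the first shard
shard_id = kinesis.describe_stream(StreamName="clickstream")["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName="clickstream", ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

for rec in kinesis.get_records(ShardIterator=iterator)["Records"]:
    print(rec["Data"])
```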


A Marketplace with a choice of ready-to-go solutions

Rather than manually deploying applications onto compute instances in the cloud, the ability to choose from a marketplace environment, with a range of different solutions across numerous categories that can be deployed with a few clicks, means you can focus on your business and applications rather than deploying software.

The AWS Marketplace features a wide selection of over 1900 listings of commercial and free software, from well-known vendors, designed to run on Amazon EC2 instances with a few mouse clicks. Learn More »

Choose from a broad range of Independent Software Vendors offering solutions designed for the cloud

You might require a specific application or management product that is ready to run in the cloud. Easily finding an Independent Software Vendor (ISV) with such a solution can give you a higher level of confidence for successful deployment.

The AWS Partner Network is a global partner program that helps you choose an Independent Software Vendor, from our ecosystem of thousands of diverse partners (including organizations such as Adobe, ESRI, SAP, Microsoft, and Oracle), that has built applications designed to work well in the AWS cloud. Learn More »

Choose from a wide range of systems integrators with deep cloud platform experience

When deploying a cloud-based solution, it’s important to find a systems integrator (SI) with the right level of qualification and skill to provide you with any assistance you need to better ensure the successful deployment and operation of your solution.

The AWS Partner Network is a global partner program that enables you to choose a Systems Integrator from our ecosystem of thousands of diverse partners (including organizations such as Booz Allen Hamilton, Capgemini, Cognizant, and Infosys) that has the skills to help you architect, deploy, and operate a solution and has deep experience with the AWS cloud. Learn More »

Role-based certifications to demonstrate cloud proficiency

Identifying and selecting individuals by recognized credentials and certifications can give you confidence that those involved with architecting, deploying, or operating cloud-based solutions have the right level of technical skills to help ensure success.

AWS Certifications recognize IT professionals who possess the skills and technical knowledge necessary for designing, deploying, and operating applications and infrastructure on AWS. Earning certification helps you gain visibility and credibility for your proven experience working with AWS, and contributes to your organization’s proficiency with AWS-based applications. Learn More »

Access to role-based cloud specific training offerings

Access to a wide range of role-based training courses and material helps ensure that you and your teams have the resources to successfully architect, deploy, and operate cloud-based solutions.

AWS provides a number of publicly available, role-based training courses that you or your staff can access to learn how to architect, deploy, and operate applications that you run in the AWS cloud. Learn More »