Overcome problems with public cloud storage providers – TechTarget

If you have a new app or use case requiring scalable, on-demand or pay-as-you-go storage, one or more public cloud storage services will probably make your short list. It's likely your development team has at least dabbled with cloud storage, and you may be using cloud storage today to support secondary uses such as backup, archiving or analytics.


While cloud storage has come a long way, its use for production apps remains relatively limited. Taneja Group surveyed enterprises and midsize businesses in 2014 and again in 2016, asking whether they were running any business-critical workloads (e.g., ERP, customer relationship management [CRM] or other line-of-business apps) in a public cloud (see "Deployments on the rise"). Fewer than half were running one or more critical apps in the cloud in 2014, and that percentage grew to just over 60% in 2016. Though cloud adoption for critical apps has increased significantly, many IT managers remain hesitant about committing production apps and data to public cloud storage providers.

Concerns about security and compliance are big obstacles to public cloud storage adoption, as IT managers balk at having critical data move and reside outside data center walls. Poor application performance, often stemming from unpredictable spikes in network latency, is another top-of-mind issue. And then there's the cost and difficulty of moving large volumes of data in and out of the cloud or within the cloud itself, say, when pursuing a multicloud approach or switching providers. Another challenge is the need to reliably and efficiently back up cloud-based data, something traditionally not well supported by most public cloud storage providers.

How can you overcome these kinds of issues and ensure your public cloud storage deployment will be successful, including for production workloads? We suggest using a three-step process to assess, compare and contrast providers' key capabilities, service-level agreements (SLAs) and track records so you can make a better-informed decision (see: "Three-step approach to cloud storage adoption").

Let's examine specific security, compliance and performance capabilities as well as SLA commitments you should look for when evaluating public cloud storage providers.

Maintaining cloud data storage security is generally understood to operate under a shared responsibility model: The provider is responsible for security of the underlying infrastructure, and you are responsible for the data you place in the cloud as well as any devices or data you connect to it.

All three major cloud storage infrastructure-as-a-service providers (Amazon Web Services [AWS], Microsoft Azure and Google Cloud) have made significant investments to protect their physical data center facilities and cloud infrastructure, placing a particular emphasis on securing their networks from attacks, intrusions and the like. Smaller and regional players also tend to focus on securing their cloud infrastructure. Still, take the time to review technical white papers and best practices to fully understand available security provisions.

Though you will be responsible for securing the data you connect or move to the cloud, public cloud storage providers offer tools and capabilities to assist. These generally fall into one of three categories of protection: data access, data in transit or data at rest.

Data access: Overall, providers allow you to protect and control access to user accounts, compute instances, APIs and data, just as you would in your own data center. This is accomplished through authentication credentials such as passwords, cryptographic keys, certificates or digital signatures. Specific data access capabilities and policies let you restrict and regulate access to particular storage buckets, objects or files. For example, within Amazon Simple Storage Service (S3), you can use Access Control Lists (ACLs) to grant groups of AWS users read or write access to specific buckets or objects and employ Bucket Policies to enable or disable permissions across some or all of the objects in a given bucket. Check each provider's credentials and policies to verify they satisfy your internal requirements. Though most make multifactor authentication optional, we recommend enabling it for account logins.
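To make the bucket policy mechanism concrete, here is a minimal sketch of granting read-only access via an S3 bucket policy using Python's boto3 SDK. The bucket name, account ID and user are hypothetical placeholders; a real policy would be tailored to your own principals and resources:

```python
import json

import boto3

BUCKET = "example-reports-bucket"  # hypothetical bucket name

# A minimal bucket policy granting a single (hypothetical) IAM user
# read-only access to every object in the bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadOnly",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:user/analyst"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```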

Data in transit: To protect data in transit, public cloud storage providers offer one or more forms of transport-level or client-side encryption. For example, Microsoft recommends using HTTPS to ensure secure transmission of data over the public internet to and from Azure Storage, and offers client-side encryption to encrypt data before it's transferred to Azure Storage. Similarly, Amazon provides SSL-encrypted endpoints to enable secure uploading and downloading of data between S3 and client endpoints, whether they reside within or outside of AWS. Verify that the encryption approach in each provider's service is rigorous enough to comply with relevant security or industry-level standards.
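As an illustration of client-side encryption, the following Python sketch encrypts data with the third-party cryptography library's Fernet scheme before uploading it with boto3, which connects to S3 over HTTPS by default. The bucket and object names are hypothetical, and in practice the key would live in a key management system rather than in the script:

```python
import boto3
from cryptography.fernet import Fernet  # pip install cryptography

BUCKET = "example-reports-bucket"  # hypothetical bucket name

# Generate and retain a symmetric key; if the key is lost, the data
# is unrecoverable, so store it securely outside this script.
key = Fernet.generate_key()
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"contents of quarterly-results.csv")

# boto3 talks to S3 over HTTPS by default, so the already-encrypted
# payload is additionally protected by TLS in transit.
s3 = boto3.client("s3")
s3.put_object(Bucket=BUCKET, Key="quarterly-results.csv.enc", Body=ciphertext)
```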

Data at rest: To secure data at rest, some public cloud storage providers automatically encrypt data when it's stored, while others offer a choice of having them encrypt the data or doing it yourself. Google Cloud Platform services, for instance, always encrypt customer content stored at rest. Google encrypts new data stored in persistent disks using the 256-bit Advanced Encryption Standard (AES-256) and offers you the choice of having Google supply and manage the encryption keys or doing it yourself. Microsoft Azure, on the other hand, enables you to encrypt data using client-side encryption (protecting it both in transit and at rest) or to rely on Storage Service Encryption (SSE) to automatically encrypt data as it is written to Azure Storage. Amazon's offering for encrypting data at rest in S3 is nearly identical to Microsoft Azure's.
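For instance, a minimal boto3 sketch of requesting Amazon's server-side encryption (SSE-S3) on upload looks like this; the bucket and object names are illustrative only:

```python
import boto3

BUCKET = "example-reports-bucket"  # hypothetical bucket name

s3 = boto3.client("s3")

# Ask S3 to encrypt the object at rest with AES-256 under
# Amazon-managed keys (SSE-S3); pass "aws:kms" instead to have
# the keys managed through AWS KMS.
s3.put_object(
    Bucket=BUCKET,
    Key="customer-records.json",
    Body=b'{"id": 42}',
    ServerSideEncryption="AES256",
)
```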

Also, check for data access logging -- to enable a record of access requests to specific buckets or objects -- and data disposal (wiping) provisions, to ensure data is fully destroyed if you decide to move to a new provider's service.
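In S3, for example, server access logging can be switched on programmatically. The sketch below uses hypothetical bucket names, and the target log bucket must already be configured to accept log delivery:

```python
import boto3

s3 = boto3.client("s3")

# Record access requests against one bucket into a separate,
# pre-provisioned log bucket (both names are hypothetical).
s3.put_bucket_logging(
    Bucket="example-reports-bucket",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "example-access-logs",
            "TargetPrefix": "reports/",
        }
    },
)
```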

Your provider should offer resources and controls that allow you to comply with key security standards and industry regulations. For example, depending on your industry, business focus and IT requirements, you may look for help in complying with the Health Insurance Portability and Accountability Act (HIPAA), Service Organization Controls 1 (SOC 1) financial reporting, the Payment Card Industry Data Security Standard (PCI DSS) or FedRAMP security controls for information stored and processed in the cloud. So be sure to check out the list of supported compliance standards, including third-party certifications and accreditations.

Unlike security and compliance, for which you can make an objective assessment, application performance is highly dependent on your IT environment, including cloud infrastructure configuration, network connection speeds and the additional traffic running over that connection. If you're achieving an I/O latency of 5 to 10 milliseconds with traditional storage on premises, or even better than that with flash storage, you will want to prequalify application performance before committing to a cloud provider. It's difficult to anticipate how well a latency-sensitive application will perform in a public cloud environment without actually testing it under the kinds of conditions you expect to see in production.
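One practical way to prequalify is to measure round-trip latency from the location where your workload will actually run. The following illustrative Python sketch times repeated GETs of a small probe object in S3 (the bucket and key are placeholders for objects you'd create beforehand):

```python
import statistics
import time

import boto3

BUCKET = "example-reports-bucket"  # hypothetical test bucket
KEY = "latency-probe.bin"          # small object uploaded beforehand

s3 = boto3.client("s3")
samples_ms = []
for _ in range(100):
    start = time.perf_counter()
    s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
    samples_ms.append((time.perf_counter() - start) * 1000)

print(f"median: {statistics.median(samples_ms):.1f} ms")
print(f"p99:    {statistics.quantiles(samples_ms, n=100)[98]:.1f} ms")
```

Comparing the median and tail figures against the 5 to 10 milliseconds you see on premises gives a rough sense of whether a latency-sensitive app will tolerate the move.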

Speed of access is based, in part, on data location, meaning you can expect better performance if you colocate apps with their data in the cloud. If you're planning to store primary data in the cloud but keep production workloads running on premises, evaluate the use of an on-premises cloud storage gateway -- such as Azure StorSimple or AWS Storage Gateway -- to cache frequently accessed data locally and (likely) compress or deduplicate it before it's sent to the cloud.

To further address the performance needs of I/O-intensive use cases and applications, major public cloud storage providers offer premium storage capabilities, along with instances that are optimized for such workloads. For example, Microsoft Azure offers Premium Storage, allowing virtual machine disks to store data on SSDs. This helps solve the latency issue by enabling I/O-hungry enterprise workloads such as CRM, messaging and other database apps to be moved to the cloud. As you might expect, these premium storage services come with a higher price tag than conventional cloud storage.

Bottom line on application performance: Try before you buy.

A cloud storage service-level agreement spells out guarantees for minimum uptime during monthly billing periods, along with the recourse you're entitled to if those commitments aren't met. Contrary to many customers' wishes, SLAs do not include objectives or commitments for other important aspects of the storage service, such as maximum latency, minimum I/O performance or worst-case data durability.

In the case of the "big three" providers' services, the monthly uptime percentage is calculated by subtracting from 100% the average percentage of service requests not fulfilled due to "errors," with the percentages calculated every five minutes (or one hour in the case of Microsoft Azure Storage) and averaged over the course of the month.

Typically, when the uptime percentage for a provider's single-region, standard storage service falls below 99.9% during the month, you will be entitled to a service credit. (Though it's not calculated this way for SLA purposes, 99.9% availability implies no more than 43 minutes of downtime in a 30-day month.) The provider will typically credit 10% of the current monthly charges for uptime levels between 99% and 99.9%, and 25% for uptime levels below 99% (Google Cloud Storage credits up to 50% if uptime falls below 95%). Microsoft Azure Storage counts storage transactions as failures if they exceed a maximum processing time (based on request type), while Amazon S3 and Google Cloud Storage rely on internally generated error codes to measure failed storage requests. Note that the burden is on you as the customer to request a service credit in a timely manner if a monthly uptime guarantee isn't met.
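To make the arithmetic concrete, here is an illustrative Python calculation of a monthly uptime percentage from per-window error rates, mapped to the common 10%/25% credit tiers described above. Each provider's SLA defines its own exact thresholds and measurement windows, so treat this as a sketch:

```python
def monthly_uptime(error_rates):
    """error_rates: fraction of failed requests in each measurement
    window (e.g., one entry per five-minute window in the month)."""
    return 100.0 - 100.0 * sum(error_rates) / len(error_rates)

def service_credit(uptime_pct):
    """Common single-region credit tiers; check each provider's SLA."""
    if uptime_pct >= 99.9:
        return 0.0
    if uptime_pct >= 99.0:
        return 0.10
    return 0.25

# A 30-day month has 8,640 five-minute windows; suppose 20 of them
# saw a 50% error rate and the rest were error-free.
windows = [0.5] * 20 + [0.0] * (8640 - 20)
uptime = monthly_uptime(windows)
print(f"uptime: {uptime:.4f}%  credit: {service_credit(uptime):.0%}")
# uptime: 99.8843%  credit: 10%
```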

Also, carefully evaluate the SLAs to determine whether they satisfy your availability requirements for both data and workloads. If a single-region service isn't likely to meet your needs, it may make sense to pay the premium for a multi-region service, in which copies of data are dispersed across multiple geographies. This approach increases data availability, but it won't protect you from instances of data corruption or accidental deletions, which are simply propagated across regions as data is replicated.

With these guidelines and caveats in mind, you can better assess whether public cloud storage makes sense for your particular use cases, data and applications. If public cloud storage providers' service-level commitments and capabilities fall short of meeting your requirements, consider developing a private cloud or taking advantage of managed cloud services.

Though public cloud storage may not be an ideal fit for your production data and workloads, you may find it fits the bill for some of your less demanding use cases.

