Google reportedly set a goal of being a top-two cloud player by 2023 – CNBC

Google CEO Sundar Pichai (Getty Images)

In early 2018, top executives at Alphabet debated whether the company should leave the public cloud business, but eventually set a goal of becoming a top-two player by 2023, according to a report from The Information on Tuesday.

If the company fails to achieve this goal, some staffers reportedly believe that Alphabet could withdraw from the market completely.

After the report was published, a Google spokesperson told CNBC the article was "not accurate," and disputed that the company debated leaving the cloud market in 2018.

While Alphabet subsidiary Google is dominant in web search and advertising, the company is still a small player in cloud computing, which involves renting out computing and storage resources to other companies, schools and governments. In 2018 the company lagged Amazon, Microsoft and Alibaba in that market, according to industry research firm Gartner.

Alphabet doesn't break out revenue for the Google cloud business but said in July that it had reached $8 billion in annualized revenue. Amazon Web Services, the market leader, generated $9 billion in revenue during the third quarter alone. Microsoft doesn't specify revenue from its Azure cloud, but Griffin Securities analyst Jay Vleeschhouwer estimated that Azure delivered $4.3 billion in revenue in the third quarter.

Google co-founder Larry Page, who was Alphabet's CEO at the time, reportedly thought being a distant third-place in cloud was not acceptable. But eventually he, CFO Ruth Porat, and then-Google CEO Sundar Pichai decided that Alphabet should remain in the cloud business, according to the report. The company set a five-year budget for capital expenditures of $20 billion, in part to reach that cloud goal.

The company replaced VMware co-founder Diane Greene, who had been leading the cloud business, with Oracle executive Thomas Kurian at the start of 2019. Pichai replaced Page as Alphabet's CEO earlier this month.

Read the full Information article here.

Update: This article has been updated to reflect that on Tuesday afternoon, a Google spokesperson told CNBC the report was "not accurate" and denied the company debated leaving the cloud market in 2018.



Didi Chuxing Teams with NVIDIA for Autonomous Driving and Cloud Computing – HPCwire

Dec. 17, 2019: NVIDIA and Didi Chuxing (DiDi), the world's leading mobile transportation platform, today announced that DiDi will leverage NVIDIA GPUs and AI technology to develop autonomous driving and cloud computing solutions.

DiDi will use NVIDIA GPUs in the data center for training machine learning algorithms and NVIDIA DRIVE for inference on its Level 4 autonomous driving vehicles. In August, DiDi upgraded its autonomous driving unit into an independent company and began a wide range of collaborations with industry partners.

As part of the centralized AI processing of DiDi's autonomous vehicles, NVIDIA DRIVE enables data to be fused from all types of sensors (cameras, lidar, radar, etc.) using numerous deep neural networks (DNNs) to understand the 360-degree environment surrounding the car and plan a safe path forward.
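To make the fusion idea above concrete, here is a deliberately tiny sketch of *late* fusion, where each per-sensor detector votes on objects and the fused view keeps what enough sensors agree on. Real DRIVE-style stacks fuse raw or feature-level data with deep networks; the function, sensor names, and object IDs here are all illustrative inventions, not NVIDIA's pipeline.

```python
# Toy late-fusion sketch: each (hypothetical) per-sensor detector reports
# a set of detected object IDs; the fused view keeps objects confirmed
# by at least `min_votes` sensors.

def fuse_detections(per_sensor, min_votes=2):
    """per_sensor: dict mapping sensor name -> set of detected object IDs."""
    votes = {}
    for detections in per_sensor.values():
        for obj in detections:
            votes[obj] = votes.get(obj, 0) + 1
    return {obj for obj, n in votes.items() if n >= min_votes}

readings = {
    "camera": {"car_1", "pedestrian_7"},
    "lidar":  {"car_1", "pedestrian_7", "cone_3"},
    "radar":  {"car_1"},
}
# car_1 and pedestrian_7 pass the two-vote threshold; the lone lidar
# return (cone_3) is dropped as unconfirmed.
print(fuse_detections(readings))
```

In a production stack the "votes" would be learned confidence scores and the fusion itself would be a network, but the cross-checking intuition is the same.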

"Developing safe autonomous vehicles requires end-to-end AI, in the cloud and in the car," said Rishi Dhall, vice president of Autonomous Vehicles at NVIDIA. "NVIDIA AI will enable DiDi to develop safer, more efficient transportation systems and deliver a broad range of cloud services."

To train these DNNs, DiDi will use NVIDIA GPU data center servers. For cloud computing, DiDi will also build an AI infrastructure and launch virtual GPU (vGPU) cloud servers for computing, rendering and gaming.

DiDi Cloud will adopt a new vGPU license mode to provide users with better experiences, richer application scenarios and more efficient, flexible GPU cloud computing services. Currently, DiDi Cloud is collaborating with industry partners including NVIDIA to provide services in transportation, AI, graphics rendering, video games and education.

Delivering 10 billion passenger trips per year, DiDi is working toward the safe, large-scale application of autonomous driving technology, leveraging its own technology capacities, data resources and open collaboration with tech leaders and OEM partners.

About Didi Chuxing

Didi Chuxing (DiDi) is the world's leading mobile transportation platform. The company offers a full range of app-based transportation services for 550 million users across Asia, Latin America and Australia, including Taxi, Express, Premier, Luxe, Bus, Designated Driving, Enterprise Solutions, Bike Sharing, E-bike Sharing, Automobile Solutions and Food Delivery. Tens of millions of drivers who find flexible work opportunities on the DiDi platform provide 10 billion passenger trips a year.

About NVIDIA

NVIDIA's invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined modern computer graphics and revolutionized parallel computing. More recently, GPU deep learning ignited modern AI, the next era of computing, with the GPU acting as the brain of computers, robots and self-driving cars that can perceive and understand the world. More information at http://nvidianews.nvidia.com/.

Source: NVIDIA


National Science Foundation Awards Grant to Develop Next-Generation Cloud Computing Testbed Powered by Red Hat – Business Wire

RALEIGH, N.C.--(BUSINESS WIRE)--Red Hat, Inc., the world's leading provider of open source solutions, today announced that the National Science Foundation (NSF) Division of Computer and Network Systems has awarded a grant to a research team from Boston University, Northeastern University and the University of Massachusetts Amherst (UMass) to help fund the development of a national cloud testbed for research and development of new cloud computing platforms.

The testbed, known as the Open Cloud Testbed, will integrate capabilities previously developed for the CloudLab testbed into the Massachusetts Open Cloud (MOC), a production cloud developed collaboratively by academia, government, and industry through a partnership anchored at Boston University's Hariri Institute for Computing. As a founding industry partner and long-time collaborator on the MOC project, Red Hat will work with Northeastern University and UMass, as well as other government and industry collaborators, to build the national testbed on Red Hat's open hybrid cloud technologies.

Testbeds such as the one being constructed by the research team are critical for enabling new cloud technologies and for making the services they provide more efficient and accessible to a wider range of scientists focusing on research in computer systems and other sciences.

By combining open source technologies and a production cloud enhanced with programmable hardware through field-programmable gate arrays (FPGAs), the project aims to close a gap in computing capabilities currently available to researchers. As a result, the testbed is expected to help accelerate innovation by enabling greater scale and increased collaboration between research teams and open source communities. Red Hat researchers plan to contribute to active research in the testbed, including a wide range of projects on FPGA hardware tools, middleware, operating systems and security.

Beyond this, the project also aims to identify, attract, educate and retain the next generation of researchers in this field and accelerate technology transfer from academic research to practical use via collaboration with industry partners such as Red Hat.

Since the MOC's launch in 2014, Red Hat has served as a core partner of the project, which brings together talent and technologies from various academic, government, non-profit, and industry organizations to collaboratively create an open, production-grade public cloud suitable for cutting-edge research and development. The MOC's open cloud stack is based on Red Hat Enterprise Linux, Red Hat OpenStack Platform and Red Hat OpenShift.

Beyond creating the national testbed, the grant will also extend Red Hat's collaboration with Boston University researchers to develop self-service capabilities for the MOC's cloud resources. For example, via contributions to the OpenStack bare metal provisioning program (Ironic), the collaboration aims to produce production-quality Elastic Secure Infrastructure (ESI) software, a key piece in enabling more flexible and secure resource sharing between different datacenter clusters. And by sharing new developments that enable moving resources between bare metal machines and Red Hat OpenStack or Kubernetes clusters in open source communities such as Ironic or Ansible, Red Hat and the MOC's researchers are helping to advance technology well beyond the Open Cloud Testbed.

Supporting Quotes

Michael Zink, associate professor, Electrical and Computer Engineering (ECE), University of Massachusetts Amherst: "This testbed will help accelerate innovation in cloud technologies, technologies affecting almost all of computing today. By providing capabilities that currently are only available to researchers within a few large commercial cloud providers, the new testbed will allow diverse communities to exploit these technologies, thus democratizing cloud-computing research and allowing increased collaboration between the research and open-source communities. We look forward to continuing the collaboration in MOC to see what we can accomplish with the testbed."

Orran Krieger, professor of Electrical and Computer Engineering, Boston University; co-director, Red Hat Collaboratory; PI, Massachusetts Open Cloud: "An important part of the MOC has always been to enable cloud computing research by the academic community. This project dramatically expands our ability to support researchers, both by providing much richer capabilities and by expanding from a regional to a national community of researchers."

Chris Wright, senior vice president and chief technology officer, Red Hat: "This grant and the work being done by the MOC show how open source solutions can positively impact real-world challenges outside of enterprise data centers. Red Hat is no stranger to pioneering new ways in which open source software can be used for innovative research, and we are pleased to help drive this initiative in bringing open cloud technologies to a wider range of disciplines, from social sciences to physics, while also continuing our commitment to the next generation of open source practitioners."


About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Certain statements contained in this press release may constitute "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements provide current expectations of future events based on certain assumptions and include any statement that does not directly relate to any historical or current fact. Actual results may differ materially from those indicated by such forward-looking statements as a result of various important factors, including: risks related to the ability of the Company to compete effectively; the ability to deliver and stimulate demand for new products and technological innovations on a timely basis; delays or reductions in information technology spending; the integration of acquisitions and the ability to market successfully acquired technologies and products; risks related to errors or defects in our offerings and third-party products upon which our offerings depend; risks related to the security of our offerings and other data security vulnerabilities; fluctuations in exchange rates; changes in and a dependence on key personnel; the effects of industry consolidation; uncertainty and adverse results in litigation and related settlements; the inability to adequately protect Company intellectual property and the potential for infringement or breach of license claims of or relating to third party intellectual property; the ability to meet financial and operational challenges encountered in our international operations; and ineffective management of, and control over, the Company's growth and international operations, as well as other factors. 
In addition to these factors, actual future performance, outcomes, and results may differ materially because of more general factors including (without limitation) general industry and market conditions and growth rates, economic and political conditions, governmental and public policy changes and the impact of natural disasters such as earthquakes and floods. The forward-looking statements included in this press release represent the Company's views as of the date of this press release and these views could change. However, while the Company may elect to update these forward-looking statements at some point in the future, the Company specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing the Company's views as of any date subsequent to the date of this press release.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo, and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries. The OpenStack Word Mark is either a registered trademark/service mark or trademark/service mark of the OpenStack Foundation, in the United States and other countries, and is used with the OpenStack Foundation's permission. Red Hat is not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.


Amazon used Bluetooth beacons to track attendees at its massive AWS cloud computing conference – Business Insider UK

Amazon CEO Jeff Bezos is an engineer at heart and built a company that gathers data on everything it can. "People think of Amazon as very data-oriented and I always tell them, look, if you can make the decision with data, make the decision with data," Bezos said in 2018.

So it should not be surprising that when 60,000 people descended on Las Vegas earlier this month to attend Amazon Web Services re:Invent, the cloud platform's biggest annual customer conference, Amazon wanted data on what these attendees did and where they went during the show.

Specifically, the lanyards for the conference badges that Amazon issued this year included a Bluetooth beacon from a company called TurnoutNow, AWS confirmed to Motherboard's Joseph Cox.

Not everyone was aware that the badge included such a tracker, sources told Motherboard. But many were: on Twitter, people discussed signs that AWS had placed around the registration area alerting attendees to it.

Per posts on social media, the signs said:

"Notice that object on your lanyard? It's an anonymous beacon that lets us count the number of attendees at certain event locations, so we can facilitate foot traffic, improve transportation, and help plan future events. Your beacon is not associated with your name or any other personal information about you, and it will only send anonymous data to us about our meeting space and other central gathering locations for this event. If you'd rather have a lanyard without the beacon, please visit the help desk."

Amazon also told Business Insider that attendees were free to opt out, and that accepting the lanyard was not a requirement for admittance.

Even so, given this was a conference for IT professionals, many of whom are computer security experts, a few people complained about the tracking device on Twitter, including security pro Jerry Gamblin.

Gamblin pointed out in a tweet that if a hacker figured out how to match the lanyard tracker with the name on the badge, the wearer would no longer be anonymous. "The beacon has a unique ID so it is not anonymous and at best is pseudonymous. It would only take matching of name to the unique ID for full tracking," he tweeted.
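Gamblin's distinction between "anonymous" and "pseudonymous" can be shown in a few lines: the beacon feed carries no names, but because each ID is stable, a single join against any table that maps beacon IDs to badge names re-identifies every sighting. All of the IDs, names, and locations below are made up for illustration.

```python
# Sighting log as a venue tracker might record it: (beacon ID, location).
# No names appear anywhere in this data.
beacon_sightings = [
    ("beacon-4f2a", "keynote hall"),
    ("beacon-4f2a", "expo floor"),
    ("beacon-91c0", "keynote hall"),
]

# One leaked or scraped mapping of beacon ID -> badge name is enough
# to defeat the "anonymity" of the stable unique ID.
badge_registry = {"beacon-4f2a": "Alice Example"}

# Join the two: every movement of beacon-4f2a is now tied to a name.
identified = [
    (badge_registry[bid], place)
    for bid, place in beacon_sightings
    if bid in badge_registry
]
print(identified)
```

This is exactly why privacy engineers prefer rotating or ephemeral identifiers over stable ones: a rotating ID breaks the join across sightings even if one mapping leaks.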

And attendee Rachel Dines dissected hers and posted a picture on Twitter. "Did anyone else take apart the beacon on their #reinvent lanyard? I see a battery, an antennae and some unknown chips. Also, as soon as you remove the beacon from your lanyard, the QR code starts to fade #creepy"

The beacon was the subject of other jokes and banter, from opting out with a hammer to swapping lanyards with other attendees.

Interestingly, such conference attendee tracking devices are hardly a new idea. They've been used by conference organizers for well over a decade.

And it's a bit ironic that Amazon, of all companies, wouldn't use one of the cutting edge technologies it develops and sells to solve this problem, like machine learning/AI, computer vision, or even something more mundane, like the location data from its conference mobile app.

As one attendee on Twitter put it, "2008 wants their conference badges back."

Are you an insider with insight to share? Contact Julie Bort on encrypted chat app Signal at (970) 430-6112 using a non-work phone (no PR inquiries, please), or email at jbort@businessinsider.com. Open DMs on Twitter @Julie188.


Prime Leverage: How Amazon Wields Power in the Technology World – The Indian Express

By: New York Times | Seattle | Published: December 18, 2019, 4:13 pm

By Daisuke Wakabayashi

Elastic, a software startup in Amsterdam, was rapidly building its business and had grown to 100 employees. Then Amazon came along.

In October 2015, Amazon's cloud computing arm announced it was copying Elastic's free software tool, which people use to search and analyze data, and would sell it as a paid service. Amazon went ahead even though Elastic's product, called Elasticsearch, was already available on Amazon.
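For readers unfamiliar with the product at the center of the dispute: Elasticsearch exposes a JSON query language over HTTP, and clients send bodies like the one below to a `/<index>/_search` endpoint. This sketch only builds such a request body; the index name `app-logs` and the field `message` are made-up examples, not from Elastic's documentation.

```python
import json

# A minimal full-text search request of the kind Elasticsearch serves:
# find up to 10 documents whose "message" field matches the given text.
query = {
    "query": {
        "match": {"message": "connection timeout"}
    },
    "size": 10,
}

# The client would POST this JSON body to an endpoint such as
# /app-logs/_search (index name is a hypothetical example).
body = json.dumps(query)
print(body)
```

The simplicity of this interface is a large part of why the tool became popular, and why a hosted version of it was commercially attractive to AWS.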

Within a year, Amazon was generating more money from what Elastic had built than the startup was, by making it easy for people to use the tool with its other offerings. So Elastic added premium features last year and limited what companies like Amazon could do with them. Amazon duplicated many of those features anyway and provided them free.

In September, Elastic fired back. It sued Amazon in federal court in California for violating its trademark because Amazon had called its product by the exact same name: Elasticsearch. "Amazon misleads consumers," the startup said in its complaint. Amazon denied it had done anything wrong. The case is pending.

Not since the mid-1990s, when Microsoft dominated the personal computer industry with Windows, has a technology platform instilled such fear in competitors as Amazon is now doing with its cloud computing arm.

While cloud computing may appear obscure, it has grown into one of the technology industry's largest and most lucrative businesses, offering computing power and software to companies. And Amazon is its single-biggest provider.

Amazon has used its cloud computing arm, called Amazon Web Services, or AWS, to copy and integrate software that other tech companies pioneered. It has given an edge to its own services by making them more convenient to use, burying rival offerings and bundling discounts to make its products less expensive. The moves drive customers toward Amazon, while those responsible for the software may not see a cent.

Even so, smaller rivals said they have little choice but to work with Amazon. Given the company's broad reach with customers, startups often agree to its restrictions on promoting their own products and voluntarily share client and product information with it. For the privilege of selling through AWS, startups pay a cut of their sales back to Amazon.

Some of the companies have a phrase for what Amazon is doing: "strip-mining" software. By lifting other people's innovations, trying to poach their engineers and profiting off what they made, Amazon is choking off the growth of would-be competitors and forcing them to reorient how they do business, the companies said.

All of this has fueled scrutiny of Amazon and whether it is abusing its market dominance and engaging in anti-competitive behavior. The company's tactics have led several rivals to discuss bringing antitrust complaints against it. And regulators and lawmakers are examining its clout in the industry.

AWS is just one prong of Amazon's push to dominate large swaths of US industry. The company has transformed retailing, logistics, book publishing and Hollywood.

But what Amazon is doing through AWS is arguably more consequential. The company is the unquestioned market leader, triple the size of its nearest competitor, Microsoft, in the seismic shift to cloud computing. Millions of people unknowingly interact with AWS every day when they stream movies on Netflix or store photos on Apple's iCloud, services that run off Amazon's machines.

Jeff Bezos, Amazon's chief executive, once called AWS an idea no one asked for. The service began in the early 2000s when the retailer struggled to assemble computer systems to start new projects and features. Once it built a common computer infrastructure, Amazon realized other companies needed similar capabilities.

Now companies like Airbnb and General Electric essentially rent computing from Amazon, otherwise known as using the cloud, instead of buying and running their own systems. Businesses can then store their information on Amazon's machines, pluck data from them and analyze it.

For Amazon itself, AWS has become crucial. The division generated $25 billion in sales last year and is Amazon's most profitable business.

But in interviews with more than 40 current and former employees of Amazon and of its rivals, many said the costs of what the company was doing with AWS were hidden. They said it was hard to measure how much business they had lost to Amazon or how the threat of Amazon had turned off would-be investors. Many spoke on the condition of anonymity for fear of angering the company.

Now regulators are approaching some of Amazon's software rivals. The House Judiciary Committee, which is investigating the big tech companies, asked Amazon in a September letter about its AWS practices. The Federal Trade Commission, which is also investigating Amazon, has questioned AWS competitors, according to officials.

When Amazon Web Services began last decade, Amazon was struggling to turn a consistent profit.

Startups embraced AWS. They saved money because they did not need to buy their own computing equipment, spending only on what they used. Soon more companies flocked to Amazon for computing infrastructure and, eventually, the software that ran on its machines.

In 2009, Amazon established a template for accelerating AWS growth. That year, it introduced a service for managing a database, which is critical software to help companies organize information.

The AWS database service, an instant hit with customers, did not run software that Amazon created. Instead, the company plucked from a freely shared option known as open source.

Technologists initially paid little attention to what Amazon had done with database software. Then in 2015, Amazon copied Elasticsearch and offered its competing service.

This time, heads turned.

"There was a company that built a business around an open-source product that people like using, and suddenly they have a competitor using their own stuff against them," said Todd Persen, who started a non-open-source software company this year so there was zero chance that Amazon could lift his creations.

Again and again, the open-source software industry became a well that Amazon turned to. When it copied and integrated that software into AWS, it did not need permission or have to pay the startups for their work.

That left little recourse for many of these companies, which could not suddenly start charging money for what was free software. Some instead changed the rules around how their wares could be used, restricting Amazon and others who want to turn what they have created into a paid service.

Last year, MongoDB, a popular technology for organizing data in documents, announced that it would require any company that manages its software as a web service to freely share the underlying technology. The move was widely viewed as a hedge against AWS, which does not openly share its technology for creating new services.

AWS soon introduced its own technology with the look and feel of MongoDB's older software, which did not fall under the new requirements.

By the time AWS held its first developer conference in 2012, Amazon was no longer the only big player in cloud computing. Microsoft and Google had introduced competing platforms. So Amazon unveiled more software services to make AWS indispensable.

Amazon has since added AWS services at a blistering pace, going from 30 in 2014 to about 175 as of December. It also built in a home-field advantage: simplicity and convenience.

Customers can add new AWS services with one click and use the same system to manage them. The new service is added to the same bill, while using a non-Amazon service on AWS is more complicated.

Saket Saurabh, chief executive of the startup Nexla, said he signed his startup to work with Amazon in September. The reason? Amazon's giant sales teams can give his data-processing and monitoring service access to a vast audience.

"What choice do we have?" he said.



20 Experts Share Predictions for Cloud in 2020 and Beyond – Solutions Review

As part of our Cloud Insight Jam, we got in touch with several experts and asked for their predictions on cloud computing in 2020. These experts represent the top cloud vendors, cloud solutions providers, and IT software companies, and have decades of combined experience operating solutions in the cloud. We've compiled 26 quotes from 20 experts on where they see the field of cloud in 2020 and beyond.

Thanks to all of these experts for submitting their quotes and advice, and be sure to follow us on Twitter all day for insights, advice, and best practices on cloud computing during our #CloudInsightJam!

We've been hearing people talk about the hybrid cloud for the past three years now. And for the most part, that's all it's been: talk. 2020 is the year it gets real. But first, what does hybrid cloud actually mean? Red Hat defines hybrid cloud as a combination of two or more cloud environments, public or private. We are seeing large enterprises refusing to add on-prem capacity to their Hadoop deployments and instead investing in the public cloud. But they are still not willing to move their core enterprise data to the cloud. Data will stay on-prem and compute will burst to the cloud, particularly for peak demands and unpredictable workloads. Technologies that provide optimal approaches to achieve this will drive the rise of the hybrid cloud.
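The "data stays on-prem, compute bursts out" split described above can be sketched as a trivial placement policy: jobs fill on-prem capacity first, and only the overflow is sent to a public cloud. The scheduler, job names, and capacities below are illustrative inventions, not any real product's behavior.

```python
# Toy burst-to-cloud placement: fill on-prem capacity first, send
# overflow (peak / unpredictable demand) to the public cloud.

def place_jobs(jobs, on_prem_capacity):
    """jobs: list of (name, cores). Returns (on_prem, burst_to_cloud)."""
    on_prem, cloud, used = [], [], 0
    for name, cores in jobs:
        if used + cores <= on_prem_capacity:
            on_prem.append(name)
            used += cores
        else:
            cloud.append(name)  # overflow bursts to the public cloud
    return on_prem, cloud

jobs = [("etl", 16), ("report", 8), ("ml-training", 32)]
# With 24 on-prem cores, etl (16) and report (8) fit locally and the
# 32-core training job bursts out.
print(place_jobs(jobs, on_prem_capacity=24))
```

A real hybrid scheduler also weighs data gravity, egress cost, and latency, which is exactly why the datasets themselves tend to stay on-prem while stateless compute moves.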

In 2019, major brands like Target, British Airways, Facebook and Twitter all experienced major IT outages. 2020 will be no different as we expect to see some high-profile project failures and/or outages. Digital transformation opportunities are encouraging companies to take risks, but the stakes are high as the technology is being stretched to its limits. The deployment and use of multi-cloud/hybrid service architectures are creating fragile environments where many accidents are just waiting to happen.

Work has changed. We've seen the decomposition of software applications into tens of thousands of best-of-breed applications, complicating the very nature of work [...] thankfully, we're rapidly moving into a new era of pre-built integrations and powerful new tools that even non-developers can use to quickly and easily integrate their ecosystem of apps. This is unleashing a new kind of innovation, one where enterprises and software companies alike can quickly create entirely new business processes harmonized across multiple applications. Whether it's connecting up the many disparate apps required to manage the workforce in a big enterprise, or it's fintechs and traditional finserv players vying to create the next killer open banking application, 2020 will be the biggest year yet for productized integrations.

2020 will mark a notable shift in enterprise IT as the dawn of a new era of edge computing arises. The first-generation model of centralized cloud computing and storage has now run its course, and most of the new opportunities for enterprise data management reside at the edge. [...] Such data growth outside the datacenter is the new reality, and it's creating a need for enterprises to deploy computing power and storage capabilities at the network edge, aka edge computing. Enterprises are already investing in edge computing to move faster, to have data continuously available, and to improve data security. As edge computing goes mainstream in enterprise IT in 2020, edge-to-cloud architectures that manage data centrally while making it instantly available to users at the edge will be a key enabler for business success.

In 2020, I believe we'll see the rise of the "Omni Cloud," or what multi-cloud will become: basically, the abstraction above the physical public clouds, providing common ways to access storage, processing, databases, compute, and HPC. This will likely be more of an idea than an actual thing in 2020, but it will be game-changing in terms of how we deal with complex heterogeneous cloud deployments.
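The "abstraction above the physical public clouds" idea can be sketched as a single storage interface with per-provider adapters behind it: application code targets the interface, and each cloud's SDK is hidden in an adapter. The classes below are stand-ins, not real cloud SDK clients.

```python
# Sketch of an omni-cloud storage abstraction: one interface, many
# provider adapters. InMemoryStore stands in for an S3/GCS/Azure-Blob
# adapter; no real SDK is involved.

class ObjectStore:
    """Common interface an omni-cloud layer might expose."""
    def put(self, key, data):
        raise NotImplementedError
    def get(self, key):
        raise NotImplementedError

class InMemoryStore(ObjectStore):
    """Stand-in adapter; a real one would wrap a provider SDK."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive(store, key, data):
    # Application code sees only the abstract interface, so swapping
    # providers means swapping the adapter, not the caller.
    store.put(key, data)
    return store.get(key)

print(archive(InMemoryStore(), "backup/2020-01.tar", b"...bytes..."))
```

The trade-off, as the later prediction about lock-in notes, is that the abstraction layer itself can become the thing you are locked into.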

In 2020, I also believe we'll see public cloud providers finally accept that they will most often be deployed as part of a multi-cloud architecture. Thus, we'll see native public cloud tools focused on managing, securing, and governing several cloud brands, all from a single cloud brand. At the end of the day, this can give public cloud providers a key advantage that they can exploit to grow the use of public cloud in general. Who will be first?

Each of the big clouds is looking to capture more enterprise data by moving in this direction, but there are also other players who look to unify all environments. This isn't software-defined storage exactly; the model is more about stretching cloud services into the data center, as opposed to deploying storage on commodity hardware to reduce cost. The idea is that users should have a consistent experience with the services they run in the cloud, but across their infrastructure. This cloudification of data is a necessary step in seeing hybrid cloud fulfill its potential, but there is a risk that customers may trade one set of vendor lock-ins for another. They should ask themselves: "Am I gaining the freedom to use my data across all sorts of clouds and services? Or am I locking myself into just another ecosystem?"

A growing number of users who gravitated to Microsoft Azure, Amazon AWS, or other one-stop commodity providers are on the verge of turning negative on the cloud. They're looking for vendors to help them do what they originally sought from cloud computing: save on everything from engineering to equipment to support through a truly turnkey offering. This disaffection is tied less to cloud computing as a concept or a platform than to issues around execution; today's commodity cloud simply doesn't have everything baked in, as the early adopters experienced.

2020 may well be defined by the question: Who's an MSP? In the year ahead, both sides of the industry will become more like the other. Where users once stressed about the hybrid cloud (What is it? Is it for me? Who will decipher it for my organization?) or some other permutation of IaaS, the market will continue to evolve in the users' direction, thanks largely to the managed services option. When an organization can deploy Azure or AWS but entrust day-to-day management to a more responsive (local) provider, everyone wins.

I think that in 2020, enterprise tech customers will finally realize that pursuing a multi-cloud strategy is proving to be worthless. It takes enormous effort and adds a lot of complexity to build systems that can switch between different public clouds, for the relatively meager benefit of hedging against outages and vendor lock-in. Of course, technology vendors must continue to build solutions that work across all major public clouds in order to satisfy a diverse base of customers that each choose cloud providers based on their specific needs. But for tech customers, the goal of hedging against failures is just not meaningful when prolonged outages among major cloud providers, the kind that would require a company to shift operations to another cloud, have been practically non-existent. As for avoiding vendor lock-in, it ends up being more expensive for end users to build the same system in multiple clouds than to build for a single cloud of choice and then possibly move to another cloud if the terms or functionality deteriorate.

The coming year will show that cloud-based applications are the winning monetization strategy for open-source software companies. While the first generation of companies that developed open-source software tried to use a paid customer support and training model, the last couple of years have shown that to be unsustainable. In 2020, pay-per-use, cloud-based services will take hold as the leading business model for open-source companies. As enterprises increasingly focus on agility and time-to-value, cloud-based services can deliver speed and scalability for customers that are willing to pay, and by extension they offer a source of revenue for software companies that want to develop and monetize open-source technology.

The war over cloud compute dominance is a real one between Amazon, Microsoft, and Google. Recent announcements from these vendors (Google in particular) suggest they are backing away from polarizing and consolidating where computing is done, in favor of transportability of assets between clouds. Kubernetes and the vendor ecosystem surrounding it will be a very specific sector to watch.

When it comes to DevOps, companies are looking for more than just tools. They are looking for platforms that play nice with their current tool offerings while also adding value with artificial intelligence (AI) and analytics. Given 2019's public cloud adoption rate of 94% and private cloud adoption rate of 72%, I predict a strong shift in 2020 toward flexible platforms that can be adapted to the unique needs of the company, rather than tools that offer out-of-the-box solutions.

Kubernetes is exciting to application developers because it's vendor-agnostic and provides automation for deployment, operations, and scaling of microservices-based applications. While Kubernetes can simplify development, operations, and scaling, it's also quite a complex beast, which can make it challenging to operate. Early adopters of Kubernetes discovered that while they were seeing the benefits in their applications, they were spending an increasing amount of time and effort managing Kubernetes itself. AWS, Microsoft, and Google now all offer fully-managed Kubernetes services, freeing the next wave of Kubernetes adopters to reap the benefits without taking on the complex task of managing Kubernetes itself. As a result, an increasing number of workloads will make the leap to Kubernetes in 2020, and the ecosystem of tools surrounding Kubernetes will continue to explode.
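The automation described above, converging the actual state of a system toward a declared desired state, is the core idea behind Kubernetes' control loops. A toy, self-contained Python sketch of one reconciliation pass (this is an illustration of the concept, not the real Kubernetes API; the pod names are made up):

```python
def reconcile(desired_replicas: int, running: list) -> list:
    """One pass of a control loop: converge actual state toward desired state.

    A Kubernetes Deployment controller does conceptually the same thing:
    compare the declared replica count to the pods that exist, then
    schedule or terminate pods until the two match.
    """
    running = list(running)  # don't mutate the caller's view of actual state
    while len(running) < desired_replicas:
        running.append(f"pod-{len(running)}")  # schedule a missing pod
    while len(running) > desired_replicas:
        running.pop()  # terminate a surplus pod
    return running
```

The value for operators is that the loop runs continuously: declare the desired state once, and the system keeps repairing drift (crashed pods, scaled-down counts) without manual intervention.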

AWS introduced Lambda in 2014 to much fanfare, pioneering serverless computing. Since then, AWS has rapidly evolved Lambda, making it more flexible, powerful, and usable. But these days, serverless is about much more than just functions like AWS Lambda. AWS has introduced an entire suite of fully-managed services that enable customers to build and run serverless applications, covering compute, storage, orchestration, databases, analytics, and more. By leveraging these services, such as AWS Lambda, AWS Fargate, Amazon S3, DynamoDB, Aurora Serverless, and Amazon API Gateway, application developers can create sophisticated, powerful, and cost-efficient applications that require no management or maintenance of servers or the underlying execution environment. In 2020, I expect AWS to introduce dozens of new services and features that continue to drive serverless adoption, while the other hyperscalers race to follow its lead.
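A serverless function in this model is just a handler that the platform invokes on demand. Below is a minimal Python sketch in the shape of an AWS Lambda handler for an API Gateway proxy event; the `event`/`context` signature and the proxy response fields follow Lambda's documented Python programming model, while the greeting logic itself is purely illustrative:

```python
import json


def handler(event, context=None):
    """Minimal Lambda-style handler for an API Gateway proxy integration.

    The platform passes the HTTP request as an event dict; the handler
    returns a dict describing the HTTP response. No server processes,
    ports, or fleets for the developer to manage.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Everything outside the handler (provisioning, scaling to zero and back, patching the runtime) is the platform's job, which is the point of the model.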

Public cloud represents the largest growth market in the history of computing, and the race for market share has been heating up over the past few years. AWS is the clear market leader, but both Microsoft and Google have been pushing to win a slice of the market. Google has made interesting technical advances but has struggled to gain a foothold selling into the enterprise. Microsoft, on the other hand, has made a huge investment in leveraging its existing enterprise install base and enterprise sellers to push its Azure cloud service. In late October 2019, Microsoft surprised many by winning the $10B JEDI contract from the U.S. Department of Defense. AWS had been widely considered the clear leader for the contract, so much so that Oracle sued, alleging that the requirements were stacked in Amazon's favor. The victory is a huge boost for Microsoft, and a blow to AWS.

Cloud migrations introduce an opportunity to evaluate and switch to other databases, such as Amazon Aurora or Azure SQL MI, since the process takes the same amount of effort regardless of whether a company is sticking with its existing vendor or selecting a new one. This trend is disrupting the enterprise database market, most notably pitting Oracle against Amazon, with the latter rapidly gaining ground. [...] Regardless of which vendor takes the top spot, organizations will be the true winners of this race, taking advantage of the myriad benefits the cloud has to offer while gaining access to more providers and services than ever before.

Recurring cloud and managed services are the way MSPs will have to evolve their business. No longer will it be possible to survive on one-off services or technology sales. Many MSPs aren't equipped with a business model that lets them make money in a cloud world, so to stay competitive in 2020 they will need to adopt a business model based on recurring revenue.

Public cloud adoption will continue to rise as MSPs seek a more secure IT environment; security is currently a top concern causing them to rethink the continued feasibility of their practice and offerings. Public cloud is a huge comfort here, because if MSPs store their data and apps with someone like Microsoft, they get the work and knowledge of thousands of people who are dedicated solely to security.

Microsoft Azure (Azure Arc, Azure Stack Hub, Azure Stack Edge), Amazon Web Services (AWS Outposts, VMware Cloud on AWS), and Google (Anthos, Google Kubernetes Engine) are all investing heavily not only in solutions that connect on-premises infrastructure to their own public clouds, but also in cross-cloud interoperability and management. This blurring of lines between vendors and technologies is an excellent development for enterprises looking not to be locked into a single vendor, but for the best technology to solve specific business problems.

Enterprises will seek to take full advantage of the cloud's agility by re-architecting their application/technology stacks to optimize them specifically for the cloud environment. IT departments regularly use a lift-and-shift approach to migrating applications to the cloud, but the effort still requires some changes to meet desired service levels, owing to differences between private and public infrastructures. After the initial wave of migration to the cloud is optimized, DevOps will drive the re-architecting of application/technology stacks to cloud-native implementations that take further advantage of the cloud's greater efficiency, reliability, scalability, and affordability.

As the migration of enterprise applications to the cloud accelerates and matures, the need to ensure mission-critical high availability (HA) will create opportunities for resellers and system integrators. This window of opportunity is forming as enterprises seek more robust HA solutions that have yet to be fully integrated into the application and system software. Some system integrators may have the expertise and resources needed to leverage open-source software in their Linux offerings. But an increasing percentage will choose to integrate solutions purpose-built to provide HA and disaster recovery protections, as these have proven more dependable for the customer while also being just as (if not more) profitable for the integrator.

Rather than consolidation, organizations will look to best-of-breed cloud technologies to provide the services they need for their enterprise deployments. The agility provided through the flow and exchange of data across companies will become more critical as organizations look to optimize their operations and maximize ROI. We will see a trend of the top cloud providers joining forces to house their applications in each other's environments.

Open-source technologies enable a common environment across different clouds because they are cloud-agnostic and easy to run. The ability to accommodate an array of applications and host them in any cloud or in open-source containers will help spur multi-cloud adoption. Public cloud providers, through offerings like Azure Arc, Google Anthos, and AWS Outposts, will leverage multi-cloud deployments powered by their stacks. While the use case is still rare today, the movement will pick up in 2020 as more enterprises realize the capabilities of open-source technologies in cloud environments.

Multi-cloud deployments are becoming the norm in today's enterprise. In 2020, this trend will continue to accelerate. A multi-cloud approach is critically important for organizations that run on-premises, since they need to stay in a hybrid mode when moving microservices to the cloud. As a result, we expect to see enterprises widely embrace distributed SQL databases to ensure agility without the availability constraints of traditional monolithic databases, like Oracle.

In the decade ahead, organizations of all sizes will continue to eschew Oracle for cloud-native databases, shrinking the user base of the monolithic legacy provider. Two major trends driving this activity are the ongoing microservices-based redesign of applications and the rise of cloud-native deployments running in public clouds or on Kubernetes. Overall, Oracle has lost market share every year since 2013, and legacy relational database players have dropped about five percentage points per year. Major organizations, such as Amazon and Salesforce, have already figured out that it's to their benefit to use less Oracle, not more, and this is a trend that will accelerate in the year ahead.

Looking for more info on managed service providers for your cloud solutions? Our MSP Buyers Guide contains profiles on the top cloud managed service providers for AWS, Azure, and Google Cloud, as well as questions you should ask vendors and yourself before buying. We also offer an MSP Vendor Map that outlines those vendors in a Venn diagram to make it easy for you to select potential providers.

Check us out on Twitter for the latest in Enterprise Cloud news and developments!

Dan is a tech writer who writes about Enterprise Cloud Strategy and Network Monitoring for Solutions Review. He graduated from Fitchburg State University with a Bachelor's in Professional Writing. You can reach him at dhein@solutionsreview.com


Continue reading here:

20 Experts Share Predictions for Cloud in 2020 and Beyond - Solutions Review

Global Continuous Integration Tools Market Analysis, Trends, and Forecasts, 2019-2025: Rise of Cloud Computing Opens up Lucrative Growth Opportunities…

DUBLIN--(BUSINESS WIRE)--The "Continuous Integration Tools - Market Analysis, Trends, and Forecasts" report has been added to ResearchAndMarkets.com's offering.

The Continuous Integration Tools market worldwide is projected to grow by US$1.1 Billion, driven by a compounded annual growth rate of 19.2%.

On-Premises, one of the segments analyzed and sized in this study, displays the potential to grow at over 17.2%. The shifting dynamics supporting this growth make it critical for businesses in this space to keep abreast of the changing pulse of the market. Poised to reach over US$924.9 Million by the year 2025, On-Premises will bring in healthy gains, adding significant momentum to global growth.

Representing the developed world, the United States will maintain a 20.8% growth momentum. Within Europe, which continues to remain an important element in the world economy, Germany will add over US$43.7 Million to the region's size and clout in the next 5 to 6 years. Over US$54.1 Million worth of projected demand in the region will come from the rest of the European markets. In Japan, On-Premises will reach a market size of US$66.5 Million by the close of the analysis period.

As the world's second largest economy and the new game changer in global markets, China exhibits the potential to grow at 18.9% over the next couple of years and add approximately US$199.6 Million in addressable opportunity for aspiring businesses and their astute leaders.
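Compounded-growth figures like the ones above follow the standard CAGR projection, value × (1 + rate)^years. A quick sketch of the arithmetic (the base value in the example comment is hypothetical, since the excerpt does not state one):

```python
def project(value_musd: float, cagr: float, years: int) -> float:
    """Project a market size forward at a compounded annual growth rate (CAGR).

    value_musd: starting size in millions of US dollars
    cagr:       annual growth rate as a fraction, e.g. 0.192 for 19.2%
    """
    return value_musd * (1.0 + cagr) ** years

# Illustrative only: a hypothetical US$300M segment growing at the report's
# 19.2% CAGR for six years would nearly triple in size.
```

The same formula, run in reverse on two endpoint values, is how a report's implied CAGR can be sanity-checked against its stated one.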

These and many more need-to-know quantitative data points, important for ensuring the quality of strategy decisions, whether entering new markets or allocating resources within a portfolio, are presented in visually rich graphics.

Several macroeconomic factors and internal market forces will shape growth and development of demand patterns in emerging countries in Asia-Pacific. All research viewpoints presented are based on validated engagements from influencers in the market, whose opinions supersede all other research methodologies.

Competitors identified in this market include:

Key Topics Covered:

1. MARKET OVERVIEW

2. FOCUS ON SELECT PLAYERS

3. MARKET TRENDS & DRIVERS

4. GLOBAL MARKET PERSPECTIVE

For more information about this report visit https://www.researchandmarkets.com/r/syvspv

Read the rest here:

Global Continuous Integration Tools Market Analysis, Trends, and Forecasts, 2019-2025: Rise of Cloud Computing Opens up Lucrative Growth Opportunities...

Will Amazon Make a Dent in Cisco’s Revenue With This New Project? – Motley Fool

Every time Amazon (NASDAQ:AMZN) ventures into a new business, the targeted industry should worry about the risks of competing against such a disruptive player. For instance, traditional retailers have been struggling against Amazon's e-commerce dominance. And the giant retailer has become a leading public cloud computing company with Amazon Web Services (AWS).

Legacy computer network vendors such as Cisco Systems (NASDAQ:CSCO) may be concerned by Amazon's latest move. Last week, the Linux Foundation announced the DENT project, which aims at "the creation of network operating system for disaggregated network switches in campus and remote enterprise locations," and Amazon is one of the six premier members of the initiative. But this apparently disruptive technology may actually have a limited impact.

Before cloud computing emerged a few years ago, computer networking had followed the same model for a couple of decades. Network vendors sold monolithic, proprietary solutions that integrated network devices with the software to run those boxes. This was great for customers that wanted a network that just works and would not distract them from their core business.

Giant cloud providers such as Amazon with AWS and Microsoft with Azure required more scale and flexibility at low cost, though. As a result, solutions that disaggregated software from network devices emerged. For instance, Microsoft developed its network operating system SONiC, which can run on any compatible network device. The advantage of this technology is that cloud titans can run their tailor-made software on top of whatever hardware best fits their needs.

And this technology has been having a significant impact on the networking industry. For instance, Arista Networks grew its revenue from $361 million in 2013 to $2.45 billion over the last 12 months thanks to its network solutions that address cloud titans' requirements.

In contrast, Cisco was late to adapt. Its market share in the high-speed data center network segment dropped from 74.4% in 2013 to 46.6% during the first half of this year. And Cisco announced only last week the disaggregation of its software from its hardware for its new data center networking solution.


The idea of the DENT project is to apply to smaller and remote networks the same disruptive technology that consisted of separating software and hardware in cloud data center networks. As an illustration, the project's first use case targets the retail industry, which involves many small locations.

Cisco doesn't disclose its revenue from its enterprise and campus business, but the company's largest segment, "infrastructure platforms" (which includes network devices), represented 57.3% of its revenue during the most recent quarter. Besides, a study indicates Cisco controlled 59% of the enterprise and campus markets in 2018.

Thus, with its disruptive technology on the campus and enterprise network, the DENT project represents a real threat to Cisco, but its impact may stay limited.

First, disaggregating network software and hardware makes sense for giant cloud providers, since their scale allows them to develop tailor-made technology economically. But smaller companies don't necessarily want to deal with integrating networking software and hardware. They may still just want an integrated, trouble-free solution that connects their remote locations to their networks. The same concept exists with computer operating systems: Linux is available for free and can run on any personal computer, but many companies prefer to pay for Microsoft's operating system because it won't distract them from their core business.

Second, the DENT project is initially targeting the retail industry. However, some retailers may be reluctant to choose a solution provided by their biggest competitor (Amazon), as we saw when they preferred Microsoft's cloud for their data centers.

Third, companies may worry about potential future integration of DENT with AWS, making their whole network -- remote locations and data centers -- dependent on Amazon's infrastructure and software.

Finally, even if this disruptive technology expands, Cisco can now quickly adapt since it developed a similar disaggregated solution for data centers.

Thus, investors should not expect any meaningful impact from the DENT project for either Amazon or Cisco. This project may actually signal Amazon is preparing to expand its physical footprint with a tailor-made networking solution that would lower its costs to deploy its remote locations.

And even if the DENT project gains traction, Cisco remains an attractive tech stock. The company will quickly adapt because it recently embraced the concept of disaggregating software and hardware in the data center segment.

Read more from the original source:

Will Amazon Make a Dent in Cisco's Revenue With This New Project? - Motley Fool

Raleigh analytics firm Cymatic named most promising startup finalist in The Cloud Awards – WRAL Tech Wire

RALEIGH – Cymatic, a Raleigh-based provider of user and entity behavior analytics, today announced that it has been named a finalist for the 2019-2020 Most Promising Startup in the international Cloud Computing Awards program, The Cloud Awards.

"To be shortlisted for our revolutionary work in web application defense is not only an honor, but a clear recognition of our early success in leading secure cloud technologies," said Cymatic Founder and Chief Executive Jason Hollander, in a statement. "We offer the only unified WAF that deploys at the client through a simple line of JavaScript, without agents or proxies, to deliver first-look, first-strike capability that is earliest in the kill chain, a fundamental shift from traditional approaches to security."

Founded in 2011, Cymatic has developed a next-generation all-in-one web application defense platform that moves protection from the network side to the client to defend against today's most sophisticated client- and browser-based attacks.

The startup says it is the only UEBA platform to provide web applications with deep visibility into, and proactive remediation of, the threats from human and non-human attacks, as well as the vulnerabilities users bring with them on their devices. Instead of just protecting against network-based threats like traditional WAFs, Cymatic said it uses sophisticated artificial intelligence and machine-learning algorithms to identify page mutations and user anomalies. The platform protects against user-derived and device-based threats such as poor credential hygiene, dark web vulnerabilities, and potentially risky devices. It is invisible and frictionless to users, deploys in mere minutes, and has immediate time-to-value.

"Web applications continue to be highly vulnerable to human-derived threats such as poor credential and device hygiene. Unfortunately, today's security solutions fail at really understanding how a user's security hygiene directly affects the cyberhealth of the companies they interact with," said Hollander.

Back in August, the firm raised $4.5 million in seed funding from prominent private angel investors.

Cymatic was selected from hundreds of organizations that entered from the Americas, Australia, Europe and the Middle East. Its advanced cloud-based microservices and real-time message-bus architecture offer unparalleled scale to provide the resilience necessary to validate and process the millions of users and transactions that touch web properties every second. The platform is engineered to work across all web applications regardless of operating system, browser or device. It eliminates the OWASP top 10, bots, CAPTCHAs, dark web threats, forced MFA, shared accounts, IP threats, device vulnerabilities and other cloud-based threats with no erosion of the user experience.

The Cloud Awards is an international program which recognizes and honors industry leaders, innovators and organizational transformation in cloud computing. The awards are open to large, small, established and start-up organizations from across the entire globe, with an aim to find and celebrate the pioneers who will shape the future of the Cloud as we move into 2020 and beyond. The Cloud Awards currently offers two awards programs, the Cloud Computing Awards and the Software-as-a-Service Awards.


See the rest here:

Raleigh analytics firm Cymatic named most promising startup finalist in The Cloud Awards - WRAL Tech Wire

12 Highlights from the First Annual Cloud Insight Jam – Solutions Review

Yesterday, Solutions Review hosted our first-ever Cloud Insight Jam, and it was a huge success! We received more participation from cloud vendors, thought leaders, and IT experts than we could have hoped for, all sharing their thoughts and insights on cloud computing and how they expect the market to change in 2020. In case you missed the event, we'd like to share the key highlights from the Insight Jam!

First, we'd like to share our video compilation containing clips of advice and best practices for deploying cloud solutions. We compiled advice from 12 experts in the field of cloud from companies across the globe.

We also pulled several Tweets containing valuable cloud insights and predictions that vendors and individuals shared during the event. The backbone of the Cloud Insight Jam was to generate discussion and give experts a forum to share their thoughts. It was great seeing this concept come to life: throughout the day, several cloud solution providers and thought leaders Tweeted their perspectives!


Read the original:

12 Highlights from the First Annual Cloud Insight Jam - Solutions Review

Cloud computing IaaS in Life Science Market Growth Rate by 2026 Top Key Vendors, Trend, Segmentation, Drivers, Challenges and Forecast – Market…

Cloud computing IaaS in Life Science Market Overview:

The report titled Cloud computing IaaS in Life Science Market is one of the most comprehensive and important additions to the Verified Market Research archive of market research studies. It offers detailed research and analysis of key aspects of the Cloud computing IaaS in Life Science market. The market analysts authoring this report have provided in-depth information on leading growth drivers, restraints, challenges, trends, and opportunities to offer a complete analysis of the market. Market participants can use the analysis of market dynamics to plan effective growth strategies and prepare for future challenges beforehand. Each trend of the Cloud computing IaaS in Life Science market is carefully analyzed and researched by the market analysts.

Global Cloud computing IaaS in Life Science market was valued at USD 946.1 million in 2017 and is projected to reach USD 5,245.31 million by 2025, growing at a CAGR of 32.7% from 2018 to 2025.

The report includes detailed analysis of the vendor landscape and thorough company profiling of leading players of the Cloud computing IaaS in Life Science market. The researchers have considered almost all important parameters for company profiling, including market share, recent developments, gross margin, future development plans, product portfolio, production, and revenue.

Get | Download Sample Copy @ https://www.verifiedmarketresearch.com/download-sample/?rid=4625&utm_source=MRS&utm_medium=002

Leading players covered in the Cloud computing IaaS in Life Science market report:

Insight into Competitive Landscape:

Market players need a complete picture of the competitive landscape of the Cloud computing IaaS in Life Science market, as it forms an essential tool for planning their future strategies accordingly. The report puts forth the key sustainability strategies taken up by the companies and the impact they are likely to have on competition in the Cloud computing IaaS in Life Science market. The report helps competitors to capitalize on opportunities in the market and cope with the existing competition. This will eventually help them to make sound business decisions and generate maximum revenue.

Market Segment Analysis:

The report offers a comprehensive study of product type and application segments of the Cloud computing IaaS in Life Science market. The segmental analysis provided in the report is based on significant factors such as market share, market size, consumption, production, and growth rate of the market segments studied. Readers of the report are also provided with exhaustive geographical analysis to provide a clear understanding of the regional growth of the Cloud computing IaaS in Life Science market. Developed as well as developing regional markets for Cloud computing IaaS in Life Science have been deeply studied to help market players identify profit-making opportunities in different regions and countries.

North America (United States, Canada and Mexico)

Asia Pacific (China, Japan, South Korea, India, Australia, Indonesia, Thailand, Malaysia, Philippines and Vietnam)

Middle East and Africa (Turkey, GCC Countries, Egypt and South Africa)

South America (Brazil and others)

Ask for Discount @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=4625&utm_source=MRS&utm_medium=002

Table of Content

1 Introduction of Cloud computing IaaS in Life Science Market

1.1 Overview of the Market
1.2 Scope of Report
1.3 Assumptions

2 Executive Summary

3 Research Methodology of Verified Market Research

3.1 Data Mining
3.2 Validation
3.3 Primary Interviews
3.4 List of Data Sources

4 Cloud computing IaaS in Life Science Market Outlook

4.1 Overview
4.2 Market Dynamics
4.2.1 Drivers
4.2.2 Restraints
4.2.3 Opportunities
4.3 Porter's Five Force Model
4.4 Value Chain Analysis

5 Cloud computing IaaS in Life Science Market, By Deployment Model

5.1 Overview

6 Cloud computing IaaS in Life Science Market, By Solution

6.1 Overview

7 Cloud computing IaaS in Life Science Market, By Vertical

7.1 Overview

8 Cloud computing IaaS in Life Science Market, By Geography

8.1 Overview
8.2 North America
8.2.1 U.S.
8.2.2 Canada
8.2.3 Mexico
8.3 Europe
8.3.1 Germany
8.3.2 U.K.
8.3.3 France
8.3.4 Rest of Europe
8.4 Asia Pacific
8.4.1 China
8.4.2 Japan
8.4.3 India
8.4.4 Rest of Asia Pacific
8.5 Rest of the World
8.5.1 Latin America
8.5.2 Middle East

9 Cloud computing IaaS in Life Science Market Competitive Landscape

9.1 Overview
9.2 Company Market Ranking
9.3 Key Development Strategies

10 Company Profiles

10.1.1 Overview
10.1.2 Financial Performance
10.1.3 Product Outlook
10.1.4 Key Developments

11 Appendix

11.1 Related Research

Complete Report is Available @ https://www.verifiedmarketresearch.com/product/global-cloud-computing-iaas-in-life-science-market-size-and-forecast-to-2025/?utm_source=MRS&utm_medium=002

We also offer customization of reports based on specific client requirements:

1- Free country level analysis for any 5 countries of your choice.

2- Free Competitive analysis of any market players.

3- 40 free analyst hours to cover any other data points

About Us:

Verified Market Research partners with clients to provide insight into strategic and growth analytics: data that helps achieve business goals and targets. Our core values include trust, integrity, and authenticity for our clients.

Analysts with high expertise in data gathering and governance utilize industry techniques to collate and examine data at all stages. Our analysts are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research reports.

Contact Us:

Mr. Edwyne Fernandes
Call: +1 (650) 781 4080
Email: [emailprotected]

This post was originally published on Market Research Sheets


Cloud computing IaaS in Life Science Market Global Size, Share, Outlook and Growth Opportunities 2019-2026 – Market Research Sheets

The global Cloud computing IaaS in Life Science Market is broadly studied in the report, with a strong focus on market competition, segmentation, geographical expansion, and other important aspects. The analysts who prepared the report are highly experienced in market research and possess deep knowledge of the market. The report includes analysis of the microeconomic and macroeconomic factors impacting the growth of the global Cloud computing IaaS in Life Science Market, as well as of production, sales, and consumption growth. The research studies provided in the report familiarize readers with the market's key dynamics, including drivers, restraints, and opportunities.

The trend analysis offered in the report will help players operating in the global Cloud computing IaaS in Life Science Market capitalize on lucrative business opportunities. The regional analysis included in the report will help players explore untapped markets and increase their presence in key regions. Most importantly, the report offers crucial market information and data that will prepare players to strategize effectively and gain significant profits. On the whole, it is a tool that players can use to gain a competitive edge in the global Cloud computing IaaS in Life Science Market.

Global Cloud computing IaaS in Life Science market was valued at USD 946.1 million in 2017 and is projected to reach USD 5,245.31 million by 2025, growing at a CAGR of 32.7% from 2018 to 2025.
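For readers who want to sanity-check figures like these, the relationship between a base-year value, a projected value, and the CAGR is straightforward to compute. The sketch below is illustrative only: the function names are ours, and the number of compounding periods (here, the 7 years from 2018 through 2025) is our assumption, to which the implied rate is sensitive.

```python
def cagr(begin_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end_value / begin_value) ** (1 / years) - 1

def project(begin_value: float, rate: float, years: int) -> float:
    """Value after compounding begin_value at rate for the given years."""
    return begin_value * (1 + rate) ** years

# Endpoints quoted in the report: USD 946.1M (2017) -> USD 5,245.31M (2025).
implied = cagr(946.1, 5245.31, 7)  # assumes 7 compounding periods, 2018-2025
print(f"implied CAGR: {implied:.1%}")
```

Running the same endpoints with a different period count changes the implied rate, which is worth keeping in mind when comparing CAGR claims across reports.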

https://www.verifiedmarketresearch.com/download-sample/?rid=4625&utm_source=MRS&utm_medium=001

Leading Key Players in This Report:

Cleardata Networks, Dell Global Net Access (GNAX), CareCloud Corporation, VMware, Carestream Health, IBM Corporation, Iron Mountain, Athenahealth, and Oracle Corporation

As part of primary research, our analysts interviewed a number of primary sources from the demand and supply sides of the global Cloud computing IaaS in Life Science Market. This helped them obtain both quantitative and qualitative data and information. On the demand side of the market are end users, whereas on the supply side are distributors, vendors, and manufacturers.

During our secondary research, we collected information from different sources such as databases, regulatory bodies, gold and silver-standard websites, articles by recognized authors, certified publications, white papers, investor presentations and press releases of companies, and annual reports.

The research report includes segmentation of the global Cloud computing IaaS in Life Science Market on the basis of application, technology, end users, and region. Each segment gives a microscopic view of the market, delving deeper into the changing political scenario and the environmental concerns that are likely to shape the market's future. Furthermore, the segments include graphs to give readers a bird's-eye view.

Last but not least, the research report on the global Cloud computing IaaS in Life Science Market profiles some of the leading companies. It mentions their strategic initiatives and briefly describes their structure. Analysts have also covered the research and development status of these companies and provided complete information about their existing products and those in the pipeline.

Based on regions, the market is classified into North America, Europe, Asia Pacific, Middle East & Africa and Latin America. The study will provide detailed qualitative and quantitative information on the above mentioned segments for every region and country covered under the scope of the study.

Finally, the Cloud computing IaaS in Life Science Market report gives you details about the market research findings and conclusions, which help you develop profitable market strategies to gain a competitive advantage. Supported by comprehensive primary as well as secondary research, the report is then verified using expert advice, quality checks and a final review. The market data was analyzed and forecasted using market dynamics and consistent models.

Verified Market Research has been providing research reports, with up-to-date information and in-depth analysis, for several years to individuals and companies looking for accurate research data. Our aim is to save your time and resources by providing the required research data, so you can concentrate on progress and growth. Our data includes research from various industries, along with all necessary statistics, such as market trends and forecasts, from reliable sources.

Mr. Edwyne Fernandes

Call: +1 (650) 781 4080

Email:[emailprotected]



Global Cloud Computing Industry Emerging Trends and Prospects by leading Players Yahoo Inc. CISCO Systems, IBM Co., Hewlett Packard, Dell Inc. -…

The latest research report published by Global Reports Store covers the Cloud Computing industry forecast for 2019-2025. The report contains 120 pages and 80 figures and tables, with a detailed description of the past, present, and future of the Cloud Computing industry, along with detailed profile analyses of 15 companies. The global Cloud Computing industry, valued at approximately USD 209.9 billion in 2016, is anticipated to grow at a healthy rate of more than 17.93% over the forecast period 2019-2025.

Ask For Sample Of This Report: https://www.globalreportsstore.com/request-sample/13633

The major driver for this industry is cost-effectiveness: cloud computing helps organizations save up to one-third of their annual operations costs. The rising number of SMEs will also bolster the use of cloud services. The objective of the study is to define industry sizes of different segments and countries in previous years and to forecast the values for the next eight years. The report incorporates both qualitative and quantitative aspects of the industry for each of the regions and countries involved in the study. Furthermore, it provides detailed information about crucial aspects such as the drivers and restraining factors that will define the industry's future growth, the opportunities available in micro industries for stakeholders to invest in, a detailed analysis of the competitive landscape, and the product offerings of key players.

Market Players in the Cloud Computing Industry: Yahoo Inc., Cisco Systems, IBM Co., Hewlett Packard, Dell Inc., Akamai Technologies, VMware, Microsoft Corporation, Amazon Web Services

Market Segmentation:

By Service: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS)

By Deployment Model: Public Cloud, Private Cloud, Hybrid Cloud

By Organization Size: Small & Medium Size Enterprises (SMEs), Large Enterprises

By End-User: Telecommunications & IT (ICT), Healthcare, Retail, Public Sector, Media & Entertainment, Banking, Financial Services and Insurance (BFSI)

By Region: North America (USA, Canada), Europe (Germany, U.K., France, Italy, Rest of Europe), APAC (China, India, Japan, Rest of Asia-Pacific), RoW (Latin America, Middle East & Africa)

Buy This Research Report at USD 2,900 Only (Report Will Be Delivered Within 2 Days): https://www.globalreportsstore.com/checkout/13633

Highlights from This Research Report:

1. Business Overview: An exhaustive description of the company's operations and business divisions.
2. Corporate Strategy: Analysts' summary of the company's business strategy.
3. SWOT Analysis: A detailed analysis of the company's strengths, weaknesses, opportunities, and threats.
4. Company History: Progression of key events associated with the company.
5. Major Products and Services: A list of major products, services, and brands of the company.
6. Key Competitors: A list of key competitors to the company.
7. Important Locations and Subsidiaries: A list and contact details of key locations and subsidiaries of the company.

Table of Contents (Major Key Points)

Chapter 1: Executive Summary
Chapter 2: Scope of the Report
Chapter 3: Market Research Methodology
Chapter 4: Introduction
Chapter 5: Market Landscape
Chapter 6: Market Segmentation by Product
Chapter 7: Key Leading Countries
Chapter 8: Market Drivers
Chapter 9: Impact of Drivers

Contact Us:
Manager [Business Development]
Global Reports Store
USA: +1-618-310-3972
IND: +91-739-102-4425
Email: [emailprotected]



National Science Foundation Awards Grant to Develop Next-Generation Cloud Computing Testbed Powered by Red Hat – Yahoo Finance

Grant to fund creation of national cloud testbed aimed at accelerating innovation in advanced infrastructure technologies

Red Hat, Inc., the world's leading provider of open source solutions, today announced that the National Science Foundation (NSF) Division of Computer and Network Systems has awarded a grant to a research team from Boston University, Northeastern University and the University of Massachusetts Amherst (UMass) to help fund the development of a national cloud testbed for research and development of new cloud computing platforms.

The testbed, known as the Open Cloud Testbed, will integrate capabilities previously developed for the CloudLab testbed into the Massachusetts Open Cloud (MOC), a production cloud developed collaboratively by academia, government, and industry through a partnership anchored at Boston University's Hariri Institute for Computing. As a founding industry partner and long-time collaborator on the MOC project, Red Hat will work with Northeastern University and UMass, as well as other government and industry collaborators, to build the national testbed on Red Hat's open hybrid cloud technologies.

Testbeds such as the one being constructed by the research team are critical for enabling new cloud technologies and making the services they provide more efficient and accessible to a wider range of scientists focusing on research in computer systems and other sciences.

By combining open source technologies and a production cloud enhanced with programmable hardware through field-programmable gate arrays (FPGAs), the project aims to close a gap in computing capabilities currently available to researchers. As a result, the testbed is expected to help accelerate innovation by enabling greater scale and increased collaboration between research teams and open source communities. Red Hat researchers plan to contribute to active research in the testbed, including a wide range of projects on FPGA hardware tools, middleware, operating systems and security.

Beyond this, the project also aims to identify, attract, educate and retain the next generation of researchers in this field and accelerate technology transfer from academic research to practical use via collaboration with industry partners such as Red Hat.

Since the MOC's launch in 2014, Red Hat has served as a core partner of the project, which brings together talent and technologies from various academic, government, non-profit, and industry organizations to collaboratively create an open, production-grade public cloud suitable for cutting-edge research and development. The MOC's open cloud stack is based on Red Hat Enterprise Linux, Red Hat OpenStack Platform and Red Hat OpenShift.

Beyond creating the national testbed, the grant will also extend Red Hat's collaboration with Boston University researchers to develop self-service capabilities for the MOC's cloud resources. For example, via contributions to the OpenStack bare-metal provisioning program (Ironic), the collaboration aims to produce production-quality Elastic Secure Infrastructure (ESI) software, a key piece in enabling more flexible and secure resource sharing between different datacenter clusters. And by sharing, in open source communities such as Ironic and Ansible, new developments that enable moving resources between bare metal machines and Red Hat OpenStack or Kubernetes clusters, Red Hat and the MOC's researchers are helping to advance technology well beyond the Open Cloud Testbed.

Supporting Quotes

Michael Zink, associate professor, Electrical and Computer Engineering (ECE), University of Massachusetts Amherst"This testbed will help accelerate innovation in cloud technologies, technologies affecting almost all of computing today. By providing capabilities that currently are only available to researchers within a few large commercial cloud providers, the new testbed will allow diverse communities to exploit these technologies, thus democratizing cloud-computing research and allowing increased collaboration between the research and open-source communities. We look forward to continuing the collaboration in MOC to see what we can accomplish with the testbed."

Orran Krieger, professor of Electrical and Computer Engineering, Boston University; co-director, Red Hat Collaboratory; PI, Massachusetts Open Cloud"An important part of the MOC has always been to enable cloud computing research by the academic community. This project dramatically expands our ability to support researchers both by providing much richer capabilities and by expanding from a regional to a national community of researchers."


Chris Wright, senior vice president and chief technology officer, Red Hat"This grant and the work being done by the MOC show how open source solutions can positively impact real-world challenges outside of enterprise data centers. Red Hat is no stranger to pioneering new ways in which open source software can be used for innovative research, and we are pleased to help drive this initiative in bringing open cloud technologies to a wider range of disciplines, from social sciences to physics, while also continuing our commitment to the next generation of open source practitioners."

Additional Resource

Connect with Red Hat

About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Certain statements contained in this press release may constitute "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements provide current expectations of future events based on certain assumptions and include any statement that does not directly relate to any historical or current fact. Actual results may differ materially from those indicated by such forward-looking statements as a result of various important factors, including: risks related to the ability of the Company to compete effectively; the ability to deliver and stimulate demand for new products and technological innovations on a timely basis; delays or reductions in information technology spending; the integration of acquisitions and the ability to market successfully acquired technologies and products; risks related to errors or defects in our offerings and third-party products upon which our offerings depend; risks related to the security of our offerings and other data security vulnerabilities; fluctuations in exchange rates; changes in and a dependence on key personnel; the effects of industry consolidation; uncertainty and adverse results in litigation and related settlements; the inability to adequately protect Company intellectual property and the potential for infringement or breach of license claims of or relating to third party intellectual property; the ability to meet financial and operational challenges encountered in our international operations; and ineffective management of, and control over, the Company's growth and international operations, as well as other factors. 
In addition to these factors, actual future performance, outcomes, and results may differ materially because of more general factors including (without limitation) general industry and market conditions and growth rates, economic and political conditions, governmental and public policy changes and the impact of natural disasters such as earthquakes and floods. The forward-looking statements included in this press release represent the Company's views as of the date of this press release and these views could change. However, while the Company may elect to update these forward-looking statements at some point in the future, the Company specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing the Company's views as of any date subsequent to the date of this press release.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo, and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries. The OpenStack Word Mark is either a registered trademark/service mark or trademark/service mark of the OpenStack Foundation, in the United States and other countries, and is used with the OpenStack Foundation's permission. Red Hat is not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.

View source version on businesswire.com: https://www.businesswire.com/news/home/20191218005067/en/

Contacts

Media Contact: Gaby Berkman, Red Hat, Inc., +1 978-392-2495, gberkman@redhat.com


The rise of cloud computing is having an impact on data center efficiency and it’s not great – Utility Dive

Dive Brief:

Uptime Institute's 2019 survey found data centers averaged a PUE of 1.67, versus a PUE of 1.8 in 2011, meaning more of the energy that data centers consume is used for computing processes. That's a significant improvement, but the industry may be losing ground.
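PUE (power usage effectiveness) is total facility energy divided by the energy delivered to IT equipment, so lower is better and 1.0 is the theoretical ideal; its reciprocal gives the share of energy doing productive computing. A minimal sketch of the calculation (helper names are ours), using the survey averages above:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

def it_share(pue_value: float) -> float:
    """Fraction of facility energy that actually reaches IT equipment."""
    return 1.0 / pue_value

# Survey averages cited above: 1.8 in 2011 vs. 1.67 in 2019.
print(f"2011: {it_share(1.80):.0%} of energy reached IT equipment")
print(f"2019: {it_share(1.67):.0%} of energy reached IT equipment")
```

In these terms, the drop from 1.8 to 1.67 means roughly 56 percent versus 60 percent of facility energy reaching the servers themselves, with the remainder going to cooling and other overhead.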

The rise of cloud computing is pushing data center PUE in the wrong direction, as fewer of the computers in traditional data centers are now doing the actual work, according to the Uptime Institute, an advisory group focused on business infrastructure. This follows years of improvement.

"Improvements in data center facility energy efficiency have flattened out and even deteriorated slightly in the past two years," according to the firm's annual survey, conducted online in March and April with 1,600 respondents.

Larger and more efficient data centers that power the cloud are doing more of the work now, with efficiency improvements slowing at smaller centers as the computing migrates.

"It's not overall efficiency that is stalled out; it's infrastructure efficiency," Matt Stansberry, the institute's vice president for North American operations, told Utility Dive. "For a long time, data centers were highly inefficient so, for every unit of energy used to run IT equipment doing productive work, there was a giant amount of overhead."

The PUE ratio "went down over the last 10 years as people started paying attention and making improvements," Stansberry said. Primarily, those improvements were to cooling systems. But with the rise of cloud computing, companies are using fewer computers and instead are relying on cloud-based systems.

"But [data center] buildings don't shift in real-time" to adjust the infrastructure supporting those computers, Stansberry said.

As more computing is done via the cloud, fewer computers mean these data centers may be over-built and less efficient. Utilities likely won't see significant load changes in the near term, though older data centers could close and growth in new additions could slow.

Further improvements to data center efficiency "will require significant investment and effort, with increasingly diminishing returns," Uptime Institute concluded. "While managers and operators should remain vigilant and seek to maintain high facility efficiency levels, higher gains may be found by focusing on IT efficiency."

That doesn't mean there isn't room for more traditional efficiency, said Jim Kozlowski, vice president of global capacity planning and data center operations at Ensono, an IT services provider.

"Utilities help drive data center efficiency," Kozlowski told Utility Dive in an email. "By driving incentives or better economics, data center users will install more energy efficient infrastructure in the long term."

Many companies are in the process of modernizing their data centers, Kozlowski said. "So as they upgrade equipment and building management systems to meet certain standards, energy efficiency is improving."

The federal government has been keeping an eye on the sector as well. In 2016, the U.S. Department of Energy's Advanced Research Projects Agency-Energy offered $25 million for projects and technologies focused on increasing the energy efficiency of data centers.


Cloud computing in 2020: views of the industry – Techerati

We asked six industry experts to weigh in on what's now and what's next in cloud computing.

When we published our selection of cloud predictions last year, most predicted container orchestrator Kubernetes to consolidate its stranglehold over the container space and, correspondingly, modern cloud infrastructure.

Last November, one of the most extensive customer surveys bore this prediction out. In its study of thousands of companies, cloud and infrastructure monitoring company Datadog found 45 percent of its customers were using Kubernetes. And if that isn't evidence enough, just reflect on VMware's announcement in March that it plans to transition its enterprise virtualisation platform to a system that runs (and runs on) Kubernetes.

But in reality, Kubernetes' centrality to cloud was put beyond doubt weeks before we published last year's roundup. In January, IBM steamrollered into 2019 fresh off the back of its $34 billion acquisition of Red Hat. This year IBM confirmed it would integrate Red Hat's popular Kubernetes implementation, OpenShift, into a new multi-cloud business focus.

It is in this context that most of this year's experts consulted their cloud crystal balls. Rackspace's Lee James predicts 2020 to be a year of stiff competition between enterprise IT giants jostling to deliver a Kubernetes solution that unlocks multi-cloud for their customers. On the other hand, Stephan Fabel of Canonical says end-users will start to understand the limitations of Kubernetes and, accordingly, utilise it more strategically. Lastly, Pivotal's Michael Cote expects companies to use this new-found savoir-faire to establish a singular, overall Kubernetes strategy.

Read the predictions in their entirety below.

Hybrid becomes the new multi-cloud, again

While the popularity of multi-cloud is undisputed, with 81 per cent of companies using cloud technologies in some way, many firms are still investing in their private cloud solutions. This is for a number of reasons, such as security posture, ongoing data centre leasing, or simply because it is the best platform for the application or, in some cases, the business. Indeed, even the UK Government plans to revise its "cloud first" policy to "cloud right" (or something similar) early next year, acknowledging that public cloud isn't right for everyone or every use case.

Reflecting this trend, weve seen the cloud giants respond with private cloud solutions that link directly into their public cloud solutions, such as Azure Arc, Google Cloud Anthos and AWS Outposts.

In 2020, there's going to be significant competition between the three biggest cloud hyperscalers and VMware as they all explore and deliver on how Kubernetes will unlock their potential to be the multi- and hybrid-cloud provider of choice for customers. For customers, it's ultimately going to come down to which fits and works best, as well as what gives the best bang for their buck. But this sets us up for an exciting year of new product and service announcements as each of the major cloud companies tries to establish itself as the cloud broker of choice.

Unicorn start-ups will begin repatriating workloads from the cloud

There has been a lot said about cloud repatriation of late. While this won't be a mass exodus from the cloud (in fact quite the opposite, with public cloud growth expected to increase), 2020 will see cloud-native organisations leveraging a hybrid environment to enjoy greater cost savings.

For businesses starting out or working with limited budgets that need an environment for experimenting with the latest technology, public cloud is the perfect place to start. With the public cloud, you are your own limit and get immediate reward for innovation. But as these costs begin mounting, it's prudent to consider how to regain control of cloud economics.

Repatriating workloads to on-premise infrastructure is certainly a viable option, but it doesn't mean we will start to see the decline of cloud. As organisations get past each new milestone in the development process, repatriation becomes more and more of a challenge. What we will likely see is public cloud providers reaching into the data centre to support this hybrid demand, so that they can capitalise on the trend.

Kubernetes has become an integral part of modern cloud infrastructure and serves as a gateway to building and experimenting with new technology. It's little surprise that many companies we observe are doubling down on it and reorienting their DevOps teams around it to explore new things, such as enabling serverless applications and automating data orchestration. We think this trend will continue at strength in 2020.

On a more cautious note, we may also see some companies questioning whether Kubernetes is really the correct tool for their purposes. While the technology can provide tremendous value, in some cases it can be complex to manage and requires specialist skills.

As Kubernetes is now commonly being used for production at scale, it becomes increasingly likely that users encounter issues around security and downtime. As a result of these challenges, we can expect the community to mature and, in some cases, come to the view that Kubernetes might not be right for every application, or to see an increased need to bring in outsourced vendors for specialised expertise.

Organisations should try to find one standard Kubernetes approach

Kubernetes has already emerged as the leading choice for running containerised or cloud-native applications. Organisations will now spend the time to create a Kubernetes strategy, choosing the distro or services they'll use, and then run a few applications on it. Having multiple initiatives here would be a huge waste and delay the overall strategy. Instead, organisations should try to find one standard Kubernetes approach. In 2020, though, the bulk of the work will be finding and modernising the apps that will run on that platform.

Most large organisations are doing just that and will spend 2020 modernising how they build and run software. To start, they need to find a handful of small, high value applications and the services those applications use. Then, run a working Proof of Concept (POC) to validate the platform choice by launching actual applications on the platform.

If it goes well, organisations can then put more apps on it. If it doesn't go well, they can try to find out why and try again, maybe with a new platform. It's important to look at the full end-to-end process, from development to running in production to re-deploying; companies need to judge the success of the platform choice across that whole cycle.

All applications will become mission-critical

The number of applications that businesses classify as mission-critical will rise during 2020, paving the way to a landscape in which every app is considered high-priority. Previously, organizations were prepared to distinguish between mission-critical and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, the ability to make this distinction becomes very difficult.

On average, IT decision-makers surveyed for the 2019 Veeam Cloud Data Management report said their business can tolerate a maximum of two hours of downtime for mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for that long? Application downtime costs organisations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.

Businesses will continue to pick and choose the storage technologies and hardware that work best for their organisation, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past, and Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. By allowing businesses to create a blueprint of what infrastructure should do and then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites.

Software-defined approaches such as IaC and cloud-native (a strategy that natively utilises services and infrastructure from cloud computing providers) are not all about cost, though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability, enabling organisations to deploy applications with speed and ease. With over three-quarters of organisations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.

Companies will take a step down from the cloud

The pattern goes something like this: lift and shift infrastructure VMs to the cloud; see costs actually go up; move some workloads back on-prem; then application lifecycle drivers push new apps (or new features for old apps) to be built in the cloud using PaaS/DBaaS technologies with a more favourable cost model; then retire old IaaS apps. The key takeaway is that this dynamic is one of the drivers for hybrid approaches.

We saw many companies lift and shift to the cloud this past year. Now, in 2020, I expect we'll see companies take a step back and reevaluate their all-in approach. After feeling the effects of the full shift to cloud, including the high associated costs and lack of flexibility, many companies will likely move to a multi-cloud or hybrid cloud approach next year.

Taking a hybrid cloud approach enables organizations to leverage the cloud (using AWS, Azure, or GCP) for some applications and computing needs while still keeping mission-critical or sensitive data closer to home. With a multi-cloud strategy, organizations can reduce costs and, instead of being constrained to one cloud, departments have the flexibility to select the service that works best for their individual needs (using a mix of AWS, Azure, or GCP).

Customers and organisations will really begin to look for more management layers on top of their solutions

One of the things I believe we'll see in 2020 is the true adoption of hybrid and multi-cloud solutions, but the difference in the upcoming year will be that customers and organisations will really begin to look for more management layers on top of those solutions.

A lot of companies already have things like backup-in-the-cloud and DRaaS somewhere else, so what they're now looking for is a uniform management layer on top of that to give visibility into cost, as well as knowledge of where all their data is located. It's important to know where data lives, whether workloads are protected, and whether workloads need to move between different clouds if and when requirements change.

Read the original post:

Cloud computing in 2020: views of the industry - Techerati

Cloud Computing Security: Agencies Increased Their Use of the Federal Authorization Program, but Improved Oversight and Implementation Are Needed -…

What GAO Found

The 24 federal agencies GAO surveyed reported using the Federal Risk and Authorization Management Program (FedRAMP) for authorizing cloud services. From June 2017 to July 2019, the number of authorizations granted through FedRAMP by the 24 agencies increased from 390 to 926, a 137 percent increase. However, 15 agencies reported that they did not always use the program for authorizing cloud services. For example, one agency reported that it used 90 cloud services that were not authorized through FedRAMP and the other 14 agencies reported using a total of 157 cloud services that were not authorized through the program. In addition, 31 of 47 cloud service providers reported that during fiscal year 2017, agencies used providers' cloud services that had not been authorized through FedRAMP. Although the Office of Management and Budget (OMB) required agencies to use the program, it did not effectively monitor agencies' compliance with this requirement. Consequently, OMB may have less assurance that cloud services used by agencies meet federal security requirements.

Four selected agencies did not consistently address key elements of the FedRAMP authorization process (see table). Officials at the agencies attributed some of these shortcomings to a lack of clarity in the FedRAMP guidance.

Agency Implementation of Key Elements of the FedRAMP Authorization Process

Agencies assessed: HHS, GSA, EPA, USAID

Elements assessed:

Control implementation summaries identified security control responsibilities

Security plans addressed required information on control implementation

Security assessment reports summarized results of control tests

Remedial action plans addressed required information

Cloud service authorizations prepared and provided to FedRAMP Program Office

Legend: fully addressed the element; partially addressed the element (per-agency ratings appear in the original table)

FedRAMP = Federal Risk and Authorization Management Program; HHS = Department of Health and Human Services; GSA = General Services Administration; EPA = Environmental Protection Agency; USAID = U.S. Agency for International Development

Source: GAO analysis of agency documentation | GAO-20-126

Program participants identified several benefits, but also noted challenges with implementing FedRAMP. For example, almost half of the 24 agencies reported that the program had improved the security of their data. However, participants reported ongoing challenges with the resources needed to comply with the program. GSA took steps to improve the program, but its FedRAMP guidance on requirements and responsibilities was not always clear, and the program's process for monitoring the status of security controls over cloud services was limited. Until GSA addresses these challenges, agency implementation of the program's requirements will likely remain inconsistent.

Federal agencies use internet-based (cloud) services to fulfill their missions. GSA manages FedRAMP, which provides a standardized approach to ensure that cloud services meet federal security requirements. OMB requires agencies to use FedRAMP to authorize the use of cloud services.

GAO was asked to review FedRAMP. The objectives were to determine the extent to which 1) federal agencies used FedRAMP to authorize cloud services, 2) selected agencies addressed key elements of the program's authorization process, and 3) program participants identified FedRAMP benefits and challenges. GAO analyzed survey responses from 24 federal agencies and 47 cloud service providers. GAO also reviewed policies, plans, procedures, and authorization packages for cloud services at four selected federal agencies and interviewed officials from federal agencies, the FedRAMP program office, and OMB.

GAO is making one recommendation to OMB to enhance oversight, two to GSA to improve guidance and monitoring, and 22 to the selected agencies, including GSA. GSA and HHS agreed with the recommendations, USAID generally agreed, EPA generally disagreed, and OMB neither agreed nor disagreed. GAO revised four recommendations and withdrew one based on new information provided; it maintains that the remaining recommendations are warranted.

For more information, contact Gregory C. Wilshusen at (202) 512-6244 or wilshuseng@gao.gov.

Originally posted here:

Cloud Computing Security: Agencies Increased Their Use of the Federal Authorization Program, but Improved Oversight and Implementation Are Needed -...

The Top 13 Cloud Computing Conferences to Attend in 2020 – Solutions Review

What are the top cloud computing conferences to visit in 2020? Tech conferences are some of the best places to learn about what's happening in a specific technology field. Not only can you hear top tech experts speak, but you can network with professionals who have an abundance of combined technical and operational knowledge.

As 2020 is almost upon us, we're looking to the future to list 13 can't-miss conferences centered around cloud computing. These conferences cover every corner of the globe and span the entire year. We've also included links to each conference's website so you can find more information or book your ticket!

WHEN: Numerous dates WHERE: Numerous locations

Microsoft Ignite The Tour brings the very best of Microsoft Ignite to a city near you. The tour provides technical training led by Microsoft experts and your community. You'll learn new ways to build solutions, migrate and manage infrastructure, and connect with local industry leaders and peers.

You can find out more info here.

WHEN: Numerous dates WHERE: Numerous locations

Corporate infrastructure will enter a whole new dimension towards 2030. Existing ways of thinking will no longer suffice, and a people-centric New World will fundamentally change business itself. At the Gartner IT Infrastructure, Operations & Cloud Strategies Conference 2020, we will present strategies to adopt, actions to take, and advice on how to achieve them, based on future trends, for future-oriented business and technology leaders.

You can find out more info here.

WHEN: March 11th–12th WHERE: London, UK

Technology-enabled change is on the boardroom agenda for businesses of all types and sizes. Cloud Expo Europe is the UK's leading event for connecting technologists, business leaders and senior business managers with experts, solutions and services to help accelerate digital transformation plans. Whether you are cloud-first, scaling up, refining, or just getting started, Cloud Expo Europe is an unrivaled opportunity to meet with leading technology innovators and service providers. Network with your peers. Access a wealth of knowledge and advice, including emerging trends, tech deep dives, lessons learned and market forecasts.

You can find out more info here.

WHEN: March 14th–19th WHERE: Europa-Park, Germany

CloudFest 2020 is the second in a three-year theme arc. We began with hyperscale enablement: go big or go home! Now we will explore how AI helps you maximize the potential that hypervisor partnership offers. The Intelligent Cloud allows AI to manage and distribute complex workloads, with smart tools that make interoperability and scale more cost-effective and efficient. It's a tech paradigm that is coming up quickly, and CloudFest will help pave the way so you can be in the driver's seat. CloudFest will explore how the cloud industry is preparing for the AI evolution in terms of technology, oversight, economics, and morality.

You can find out more info here.

WHEN: March 30th–April 2nd WHERE: Zuidas, Amsterdam, The Netherlands

The Cloud Native Computing Foundation's flagship conference gathers adopters and technologists from leading open source and cloud native communities in Amsterdam, The Netherlands, from March 30 to April 2, 2020. Join Kubernetes, Prometheus, Envoy, CoreDNS, containerd, Fluentd, Jaeger, Vitess, OpenTracing, gRPC, CNI, Notary, TUF, NATS, Linkerd, Helm, Rook, Harbor, etcd, Open Policy Agent, CRI-O, TiKV, and CloudEvents as the community gathers for four days to further the education and advancement of cloud native computing.

You can find out more info here.

WHEN: April 6th–8th WHERE: San Francisco, California

We're bringing together some of the brightest minds in tech for three days of networking, learning, and collaboration. Experience the magic of Google Cloud Next alongside IT professionals, developers, executives, and Google experts.

You can find out more info here.

WHEN: May 12th–14th WHERE: Miami Beach, Florida

2020 is bringing you a Cloud Summit unlike any you've ever experienced, with more of everything you need to succeed. With 50+ cutting-edge breakout sessions, world-class keynote speakers and the hands-on Cloud Showcase, you'll get exclusive access to unlimited industry solutions and the latest go-to-market innovations that are shaping the future of tech. And you'll get it all Miami Beach-style.

You can find out more info here.

WHEN: May 19th–21st WHERE: San Antonio, Texas

Join us in the scenic Texas hill country for the first North America Micro Focus Universe! San Antonio is one of the oldest and largest cities in Texas and is home to thousands of longhorn cattle, six Fortune 500 companies, and a growing number of next-generation companies that are moving in. This unique venue provides a spectacular backdrop to network and learn from Micro Focus product experts, fellow customers, and partners.

You can find out more info here. There will also be an event in the Netherlands in March.

WHEN: June 22nd–26th WHERE: Honolulu, Hawaii

The International Conference on Cloud Computing (CLOUD) has been a prime international forum for both researchers and industry practitioners to exchange the latest fundamental advances in the state of the art and practice of cloud computing, identify emerging research topics, and define the future of cloud computing. All topics regarding cloud computing align with the theme of CLOUD. The 2020 gathering will strive to advance the largest international professional forum on cloud computing.

You can find out more info here.

WHEN: August 30th–September 3rd WHERE: San Francisco, California

VMworld captures the momentum of today's rapidly changing IT environment and puts it within your grasp so you can accelerate your cloud journey to support your business. Transform networking and security for speed and flexibility. Deliver digital workspaces for amazing mobile experiences. Whatever you need to know, you'll find the best information, tools, and partnerships to take IT, and your power to shape it, to the next level.

You can find out more info here.

WHEN: October 5th–7th WHERE: Las Vegas, Nevada

HCTS is the premier forum for executives in the hosting, cloud, datacenter and managed services sectors. The agenda is carefully crafted by 451 Research analysts and industry experts to provide timely, actionable insight into the competitive dynamics of innovation. In addition to analyst and executive sessions, the three-day schedule includes a plethora of networking opportunities that will maximize your time spent in Las Vegas and foster the industry-changing relationships HCTS is known for igniting.

You can find out more info here.

WHEN: November 4th–5th WHERE: Santa Clara, California

The Cyber Security & Cloud Expo North America 2020 will return to the Santa Clara Convention Center in the heart of Silicon Valley on November 4-5 to host its second North American event. It will bring together key industries from across the globe for two days of top-level content and discussion across 5 co-located events covering IoT, 5G, Cyber Security, Cloud, Blockchain, AI and Big Data, energy, financial services, healthcare and more.

You can find out more info here.

WHEN: November 30th–December 4th (tentative) WHERE: Las Vegas, Nevada (tentative)

AWS re:Invent is a learning conference hosted by Amazon Web Services for the global cloud computing community. The event will feature keynote announcements, training and certification opportunities, access to more than 2,500 technical sessions, a partner expo, after-hours events, and so much more.

You can find out more info here.

Looking for information on managed service providers for your cloud solutions? Our MSP Buyer's Guide contains profiles on the top cloud managed service providers for AWS, Azure, and Google Cloud, as well as questions you should ask vendors and yourself before buying. We also offer an MSP Vendor Map that outlines those vendors in a Venn diagram to make it easy for you to select potential providers.

Check us out on Twitter for the latest in Enterprise Cloud news and developments!

Dan is a tech writer who writes about Enterprise Cloud Strategy and Network Monitoring for Solutions Review. He graduated from Fitchburg State University with a Bachelor's in Professional Writing. You can reach him at dhein@solutionsreview.com

Related

Read the rest here:

The Top 13 Cloud Computing Conferences to Attend in 2020 - Solutions Review

Cloud Security in 2020 Starts With Protecting Data Wherever It Resides – Security Intelligence

2019 saw massive growth in the cloud market. The worldwide public cloud services market was projected to grow 17.5 percent in 2019 to a total of $214.3 billion, up from $182.4 billion in 2018, according to Gartner.

Why has there been such a surge in cloud growth? Because as organizations move toward cloud computing, they are benefiting from capital expenditure cost savings and leveraging the flexibility of software-as-a-service (SaaS) solutions. However, as cloud adoption continues, organizations need to ensure they maintain a robust cloud security posture.

To dig deeper into this, and to ask where cloud security is heading in 2020, I spoke with subject matter expert Chris Collard, IBM Security Program Director for QRadar Cloud, SaaS and MSSP. Chris is an information security professional with over 15 years of experience managing information systems and services; he is a Certified Information Systems Security Professional (CISSP) and holds a Certificate of Cloud Security Knowledge (CCSK) from the Cloud Security Alliance.

Question: As we near the end of the year, what are your key takeaways from the 2019 cloud security market?

Collard: We continue to see a growing number of clients solidify and execute on their cloud-first strategies as well as make inroads into migrating applications, data and workloads to the cloud. Increasingly, cloud is the platform of choice for building new applications as well as acquiring new software services.

For those organizations that have already made the transition, often the focus turns squarely to the challenges of effectively monitoring these environments and building orchestration and augmented intelligence into operations and response capabilities.

Nevertheless, even with the significant momentum that has developed throughout 2019, we are still a fair way off from realizing all the benefits of pure cloud deployments. Our IBM Cloud team estimates that approximately 80 percent of production workloads are still not yet migrated to the cloud. This means that a significant set of the opportunities and discussions about cloud security are still in front of many of us.

As organizations prepare for and design their path to the cloud, they have both the opportunity to reimagine business processes and an imperative to protect and secure their data at each stage in the journey, including the destination. When we look at the security product and solutions market, we continue to see a fragmented one, leaving many organizations to manage a patchwork of point solutions on-premises and in the cloud.

What organizations ultimately need is a cohesive and well-structured set of solutions able to continuously monitor the compliance of multi-cloud and hybrid environments. When teams can better connect their environments and data, they are better positioned to gain security insights and to take action and respond quickly when required. With a unified approach to security, organizations can gain immediate benefits and be better prepared for security in a hybrid, multi-cloud world.

What is the state of the SIEM-as-a-service market as we near the end of 2019?

Collard: Cloud is increasingly the future of security information and event management (SIEM). The cloud as the platform for SIEM allows organizations to scale better and more flexibly to align with, and meet, the present demands of their business.

Consuming capabilities as a service typically comes with the added benefit of helping free organizations from the responsibility of staffing the range of specialists required to deploy and maintain complex technology stacks. Managing threats is hard enough without having to also manage and maintain on-premises software deployments. When organizations are freed up from nonessential activities, such as managing hardware and software related life cycles, they can re-invest this found time and further focus on more important activities, such as protecting and defending critical corporate data and other important assets.

As this market continues to expand, we expect to see further adoption of open standards for data and applications. The increased adoption of STIX, TAXII and other open standards points to a future built on interoperability and the ability to protect data everywhere it exists. By not adopting open standards, you run the risk of losing visibility into the breadth of your data over time, or of limiting your ability to analyze your data in the future.
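To illustrate why these open standards aid interoperability, here is a minimal sketch of a STIX 2.1-style indicator object built with only the Python standard library (the official stix2 package provides validated classes for this). The indicator name and IP address are hypothetical example values; any STIX-aware tool exchanging such JSON over TAXII can parse it without vendor-specific adapters.

```python
import json
import uuid
from datetime import datetime, timezone

# STIX 2.1 objects are plain JSON with a small set of required
# properties, which is what makes them portable across tools.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",             # UUID-based identifier
    "created": now,
    "modified": now,
    "name": "Known C2 address",                      # illustrative value
    "pattern": "[ipv4-addr:value = '198.51.100.7']", # documentation-range IP
    "pattern_type": "stix",
    "valid_from": now,
}
print(json.dumps(indicator, indent=2))
```

Because the object is self-describing JSON rather than a proprietary record, the same payload can flow between a SIEM, a threat intelligence platform and a TAXII feed unchanged.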

What would you say should be the No. 1 priority for organizations moving to cloud security in 2020?

Collard: The short answer is protect your data wherever it resides. The longer answer ultimately depends upon where clients are in their journey, whether they are just embarking on their journey to the cloud or they have already fully adopted the cloud as their deployment platform of choice. Protecting data from loss or leakage is the ultimate goal. To get there, organizations should embrace the opportunity to refresh their overall deployment strategy, from the ground up if necessary, and ensure that this strategy has cloud considerations integrated throughout.

After protecting your data no matter where it lives, what other aspects of cloud security should organizations focus on in 2020?

Collard: While outlining a modernized strategy, you should also take the opportunity to rebuild your security policies. Applying best practices, including a zero-trust security model, can help protect not only your data but also your networks, users, workloads and devices. This strategy should include the definition of microperimeters based on the end-to-end flow of data as well as the employment of microsegmentation, wherein identities and access can be strictly controlled to a granular degree and not just at the level of an entire server or subnet.
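The microsegmentation idea above can be sketched in a few lines: access is evaluated per identity-and-workload pair and denied by default, rather than granted to a whole server or subnet. This is a toy illustration, with hypothetical service names, not any product's policy engine.

```python
# Micro-policies: each entry grants one identity specific operations
# on one workload. Anything not explicitly listed is denied.
POLICIES = [
    # (identity, workload, allowed operations)
    ("payments-svc",  "orders-db", {"read", "write"}),
    ("reporting-svc", "orders-db", {"read"}),
]

def is_allowed(identity, workload, operation):
    """Zero-trust style check: deny by default, allow only on an
    exact (identity, workload, operation) policy match."""
    for ident, wl, ops in POLICIES:
        if ident == identity and wl == workload and operation in ops:
            return True
    return False

print(is_allowed("payments-svc", "orders-db", "write"))   # explicitly granted
print(is_allowed("reporting-svc", "orders-db", "write"))  # denied by default
```

Contrast this with a subnet-level rule, where any service on the "trusted" network segment could write to the database; here the granularity is the individual identity and operation.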

Where possible, organizations should look to leverage available cloud-native security controls. These controls can unlock additional visibility into your environments and can be used to further feed SIEM detection capabilities.

Achieving the goal of protecting your data across multi-cloud and hybrid environments also requires a strong DevOps or DevSecOps organization that can help automate, apply and manage your security at the critical intersections within your business. DevOps can play an important policy enablement role within your organization. Through DevOps, you should expect to see your policies and rules enacted with greater speed and precision.

If you have instrumented your environment correctly, have built the right monitors and have the right processes in place, you should then be effectively positioned to continuously monitor your environments for compliance. Having defined what needs to be filtered out versus kept and what needs to be analyzed versus maintained for posterity, organizations are best positioned to deliver orchestrated incident response.

No matter the size, organizations understand the benefits of migrating data and applications to cloud environments as they see the necessity to leverage cloud infrastructure to elastically scale up, store data in a cost-effective manner and reach a global customer base.

Learn more about securing the cloud

Read more from the original source:

Cloud Security in 2020 Starts With Protecting Data Wherever It Resides - Security Intelligence

10 Toolbox Stories You Loved in 2019 – Toolbox

This is the whole point of technology. It creates an appetite for immortality on the one hand. It threatens universal extinction on the other. Technology is lust removed from nature. - Don DeLillo, American Novelist

With so much going on in the artificial intelligence (AI) and cloud computing world, it's hard to find articles and resources that not only inform but also educate: pieces that explain shifts in technologies, offer career-building tips and accelerate your progress in these fields. At Toolbox, we help you stay current with our coverage of current and emerging technologies and by keeping a pulse on shifting technology trends. Needless to say, you ended up loving the content we curated. Here's a look at your favorite picks of 2019 that have staying power.

Table of Contents

1. 5 AI Programming Languages for Beginners in 2019

2. 10 Industries AI Will Disrupt the Most by 2030

3. The Top 5 Cloud Computing Books to Read in 2019

4. How to Build a Career in Artificial Intelligence and Machine Learning

5. 10 Most Common Myths About AI

6. What Is Deep Learning: Definition, Framework, and Neural Networks

7. What Is the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning?

8. What Is Cloud Computing Architecture: Front-End & Back-End Explained

9. Top 5 Private Cloud Service Providers in 2019: HPE, VMware, Dell, Oracle, and IBM

10. 10 Experts on the Future of Artificial Intelligence

Artificial intelligence, machine learning (ML), and deep learning (DL) are being used to simplify various business processes today. With cloud-based AI solutions, it has become even easier for companies to deploy AI to the average user, thereby paving the way for exciting possibilities.

Note: With so much buzz on AI, how does one get started with it? Check out this ready list of programming languages for AI and ML.

This article explores the fate of the industries that are most likely to be impacted by the wide adoption of artificial intelligence.

Note: Quick snapshot on industries poised for disruption by AI and emerging technologies.

Every aspiring cloud professional should aim to build a good foundation in basic cloud computing concepts. The article provides a compilation of the top 5 books on cloud computing that aims to help readers master the basics and understand the current landscape of cloud.

Note: Planning to learn cloud computing? Don't miss out on this carefully crafted book list.

With growing competition, it is getting difficult to land a job as a machine learning professional. This article discusses the skills and certifications that are required to get hired as a machine learning engineer, AI developer, or research scientist. It shares detailed information about the five sought-after AI and machine learning roles one can choose from.

Note: Short and concise guide on upcoming jobs roles in AI/ML and learning resources.

This article lists some of the most popular myths about artificial intelligence and digs deeper into them. It aims to investigate the real-world implications and possibilities of these theories and makes an attempt to separate hype from reality. Quite an interesting read, this one!

Note: Get real about artificial intelligence with this piece that busts common misconceptions and more.

This article educates readers about the basics of Deep Learning and neural networks, as well as the frameworks that are used to create them. It also lists some examples of neural network algorithms.

Note: This article is your go-to guide on Deep Learning algorithms.

However, due to the growing buzz around artificial intelligence, machine learning, and deep learning, the distinctions between these terms have become unclear, and many new entrants find the concepts confusing. This article aims to help readers differentiate between various artificial intelligence-related terminologies with the help of definitions and helpful examples. Definitely a must-read!

Note: Stay on top of your concepts and understand the conceptual differences with this article.

This article explains front-end and back-end cloud computing architecture in detail. Let's take a look!

Note: Don't miss out on this ready explainer on cloud computing, a good read for beginners and techies alike!

Note: Find out who's on top of the stack in the cloud computing world.

Various companies that have been early adopters of AI solutions have been able to see significant gains in the optimization of their business processes. Artificial intelligence is seeing wide adoption in the enterprise sector. However, professionals who have not yet adopted AI are clueless about how the technology actually works and what its future will be like.

Therefore, the only way to gauge the effect of AI on various industries is to talk to professionals who work with this technology on a daily basis. Toolbox reached out to industry experts to understand their views about the future of artificial intelligence. Take a look at the compilation.

Note: Check out the top 10 predictions on the future of AI by industry experts and what are the next steps in industry adoption.

What would you like us to write about in 2020? Comment below or let us know on LinkedIn, Twitter, or Facebook. We'd love to hear from you.

See original here:

10 Toolbox Stories You Loved in 2019 - Toolbox