What is Cloud Computing Technology?: Cloud Definition …

More and more, we are seeing technology moving to the cloud. It's not just a fad; the shift from traditional software models to the Internet has steadily gained momentum over the last 10 years. Looking ahead, the next decade of cloud computing promises new ways to collaborate everywhere, through mobile devices.

So what is cloud computing? Essentially, cloud computing is a kind of outsourcing of computer programs. Using cloud computing, users can access software and applications from wherever they are, while the software is hosted by an outside party "in the cloud." This means they do not have to worry about things such as storage and power; they can simply enjoy the end result.

There are three types of cloud computing:

– Infrastructure as a Service (IaaS)

A third party hosts elements of infrastructure, such as hardware, software, servers, and storage, also providing backup, security, and maintenance.

– Software as a Service (SaaS)

Software is hosted in the cloud and delivered over the internet, typically through a web browser, so users can work with applications without installing or maintaining them locally.

– Platform as a Service (PaaS)

The branch of cloud computing that allows users to develop, run, and manage applications without having to build and maintain the underlying infrastructure, servers, and storage.

There are several types of PaaS. Every PaaS option is either public, private, or a hybrid mix of the two. Public PaaS is hosted in the cloud and its infrastructure is managed by the provider. Private PaaS, on the other hand, is housed in on-site servers or private networks, and is maintained by the user. Hybrid PaaS uses elements from both public and private, and is capable of executing applications from multiple cloud infrastructures.

PaaS can be further categorized depending on whether it is open or closed source, whether it is mobile compatible (mPaaS), and what business types it caters to.

When choosing a PaaS solution, the most important considerations beyond how it is hosted are how well it integrates with existing information systems, which programming languages it supports, what application-building tools it offers, how customizable or configurable it is, and how effectively it is supported by the provider.

As digital technologies grow ever more powerful and available, apps and cloud-based platforms are becoming almost universally widespread. Businesses are taking advantage of new PaaS capabilities to further outsource tasks that would have otherwise relied on local solutions. This is all made possible through advances in cloud computing.

Traditional business applications have always been very complicated and expensive. The amount and variety of hardware and software required to run them are daunting. You need a whole team of experts to install, configure, test, run, secure, and update them.

When you multiply this effort across dozens or hundreds of apps, it's easy to see why even the biggest companies with the best IT departments aren't getting the apps they need. Small and mid-sized businesses don't stand a chance. The affordability of cloud-hosted data makes it an essential tool for these situations. Here are some other benefits of cloud computing.

– Adaptable

Cloud computing allows for adaptable, customizable programs and applications, while still giving owners control over the core code.

– Multi-tenancy

Cloud software makes it possible to deliver personalized applications and portals to a number of customers, or tenants, from a single shared instance.

– Reliable

Because it is hosted by a third party, businesses and other users have greater assurance of reliability, and when there are problems, easy access to customer support.

– Scalability

With the Internet of Things, it is essential that software functions across every device and integrates with other applications. Cloud applications can provide this.

– Secure

Cloud computing can also offer a more secure environment, thanks to increased resources for security and centralization of data.

With cloud computing, you eliminate the headaches that come with storing your own data, because you're not managing hardware and software; that becomes the responsibility of an experienced vendor like salesforce.com. The shared infrastructure means it works like a utility: you only pay for what you need, upgrades are automatic, and scaling up or down is easy.

Cloud-based apps can be up and running in days or weeks, and they cost less. With a cloud app, you just open a browser, log in, customize the app, and start using it.

Businesses are running all kinds of apps in the cloud, like customer relationship management (CRM), HR, accounting, and much more. Some of the world's largest companies moved their applications to the cloud with salesforce.com after rigorously testing the security and reliability of our infrastructure.

As cloud computing grows in popularity, thousands of companies are simply rebranding their non-cloud products and services as cloud computing. Always dig deeper when evaluating cloud offerings, and keep in mind that if you have to buy and manage hardware and software, what you're looking at isn't really cloud computing but a false cloud.

Salesforce can provide a comprehensive solution to all of your cloud computing needs. Using a wide array of tools and services, Salesforce can be a one-stop shop for businesses looking to manage customer relationships, sales, marketing, and application development. Using artificial intelligence, Salesforce applications can also provide predictive analytics that allow for all-around better decision making.

The best way to get to know Sales Cloud is to get your hands on the actual product. The free trial also comes with unlimited access to our Sales Community, a group of sales thought leaders sharing ideas about selling.

Excerpt from:

What is Cloud Computing Technology?: Cloud Definition …

The Cloud Computing Era Could Be Nearing Its End | WIRED – WIRED

Fasten your harnesses, because the era of cloud computing's giant data centers is about to be rear-ended by the age of self-driving cars. Here's the problem: when a self-driving car has to make snap decisions, it needs answers fast. Even slight delays in updating road and weather conditions could mean longer travel times or dangerous errors. But those smart vehicles of the near future don't quite have the huge computing power to process the data necessary to avoid collisions, chat with nearby vehicles about optimizing traffic flow, and find the best routes around gridlocked or washed-out roads. The logical source of that power lies in the massive server farms where hundreds of thousands of processors can churn out solutions. But that won't work if the vehicles have to wait the 100 milliseconds or so it usually takes for information to travel each way to and from distant data centers. Cars, after all, move fast.

Jeremy Hsu is a science and tech journalist based in New York.

That problem from the frontier of technology is why many tech leaders foresee the need for a new edge computing network, one that turns the logic of today's cloud inside out. Today the $247 billion cloud computing industry funnels everything through massive centralized data centers operated by giants like Amazon, Microsoft, and Google. That's been a smart model for scaling up web search and social networks, as well as streaming media to billions of users. But it's not so smart for latency-intolerant applications like autonomous cars or mobile mixed reality.

"It's a foregone conclusion that giant, centralized server farms that take up 19 city blocks of power are just not going to work everywhere," says Zachary Smith, a double-bass player and Juilliard School graduate who is the CEO and cofounder of a New York City startup called Packet. Smith is among those who believe that the solution lies in seeding the landscape with smaller server outposts, those edge networks, that would widely distribute processing power in order to speed its results to client devices, like those cars, that can't tolerate delay.

Packet's scattered micro data centers are nothing like the sprawling facilities operated by Amazon and Google, which can contain tens of thousands of servers and squat outside major cities in suburbs, small towns, or rural areas, thanks to their huge physical footprints and energy appetites. Packet's centers often contain just a few server racks, but the company promises customers in major cities speedy access to raw computing power, with average delays of just 10 to 15 milliseconds (an improvement of roughly a factor of ten). That kind of speed is on the must-have lists of companies and developers hoping to stream virtual reality and augmented reality experiences to smartphones, for example. Such experiences rely upon a neurological process, the vestibulo-ocular reflex, that coordinates eye and head movements. It occurs within seven milliseconds, and if your device takes 10 times that long to hear back from a server, forget about suspension of disbelief.
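To put those figures side by side, here is a tiny back-of-the-envelope sketch in Python; the 7 ms reflex window and the latency numbers come from the paragraphs above, and everything else is plain arithmetic:

```python
# Latency figures quoted in the article, in milliseconds.
VOR_WINDOW_MS = 7.0              # vestibulo-ocular reflex completes within ~7 ms
CLOUD_RTT_MS = 100.0             # typical round trip to a distant data center
EDGE_RTT_MS = (10.0 + 15.0) / 2  # midpoint of Packet's 10-15 ms average delay

def within_reflex_window(rtt_ms: float) -> bool:
    """Does a server round trip complete inside the reflex window?"""
    return rtt_ms <= VOR_WINDOW_MS

speedup = CLOUD_RTT_MS / EDGE_RTT_MS
print(round(speedup, 1))                   # 8.0 -> "roughly a factor of ten"
print(within_reflex_window(CLOUD_RTT_MS))  # False: ~14x over the 7 ms budget
print(within_reflex_window(EDGE_RTT_MS))   # False: still ~2x over, but far closer
```

Even the edge round trip misses the strict reflex window, which is why such applications also lean on client-side tricks; the point of the sketch is only the relative gap.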

Immersive experiences are just the start of this new kind of need for speed. Everywhere you look, our autonomously driving, drone-clogged, robot-operated future needs to shave more milliseconds off its network round-trip clock. For smart vehicles alone, Toyota noted that the amount of data flowing between vehicles and cloud computing services is estimated to reach 10 exabytes per month by 2025.

Cloud computing giants haven't ignored the lag problem. In May, Microsoft announced the testing of its new Azure IoT Edge service, intended to push some cloud computing functions onto developers' own devices. Barely a month later, Amazon Web Services opened up general access to AWS Greengrass software that similarly extends some cloud-style services to devices running on local networks. Still, these services require customers to operate hardware on their own. Customers who are used to handing that whole business off to a cloud provider may view that as a step backward.

US telecom companies also see their build-out of new 5G networks, which should eventually support faster mobile data speeds, as a chance to cut down on lag time. As the service providers expand their networks of cell towers and base stations, they could seize the opportunity to add server power to the new locations. In July, AT&T announced plans to build a mobile edge computing network based on 5G, with the goal of reaching single-digit millisecond latency. Theoretically, data would only need to travel a few miles between customers and the nearest cell tower or central office, instead of hundreds of miles to reach a cloud data center.

"Our network consists of over 5,000 central offices, over 65,000 cell towers, and even several hundred thousand distribution points beyond that, reaching into all the neighborhoods we serve," says Andre Fuetsch, CTO at AT&T. "All of a sudden, all those physical locations become candidates for compute."

AT&T claims it has a head start on rival telecoms because of its network virtualization initiative, which includes the software capability to automatically juggle workloads and make good use of idle resources in the mobile network, according to Fuetsch. It's similar to how big data centers use virtualization to spread out a customer's data processing workload across multiple computer servers.
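As a loose illustration of that workload juggling (the site names and load numbers here are invented), a scheduler that always hands the next job to the currently least-loaded machine can be sketched in a few lines:

```python
import heapq

def place_jobs(nodes, jobs):
    """Greedily assign each job to the currently least-loaded node.

    `nodes` maps site name -> current load; `jobs` is a list of job costs.
    Returns the final load per site. This mirrors, in miniature, how a
    virtualized network spreads work across otherwise idle resources.
    """
    heap = [(load, name) for name, load in nodes.items()]
    heapq.heapify(heap)
    for cost in jobs:
        load, name = heapq.heappop(heap)   # least-loaded site right now
        heapq.heappush(heap, (load + cost, name))
    return {name: load for load, name in heap}

loads = place_jobs({"tower-a": 0.0, "tower-b": 0.0, "office-1": 0.0},
                   [5, 3, 2, 2, 1])
print(loads)  # the 13 units of work end up roughly balanced across the sites
```

Real virtualization platforms weigh far more than a single load number, but the greedy heap captures the core idea of soaking up idle capacity wherever it sits.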

Meanwhile, companies such as Packet might be able to piggyback their own machines onto the new facilities, too. "I think we're at this time where a huge amount of investment is going into mobile networks over the next two to three years," Packet's Smith says. "So it's a good time to say, 'Why not tack on some compute?'" (Packet's own funding comes in part from the giant Japanese telecom and internet conglomerate SoftBank, which invested $9.4 million in 2016.) In July 2017, Packet announced its expansion to Ashburn, Atlanta, Chicago, Dallas, Los Angeles, and Seattle, along with new international locations in Frankfurt, Toronto, Hong Kong, Singapore, and Sydney.

Packet is far from the only startup making claims on the edge. Austin-based Vapor IO has already begun building its own micro data centers alongside existing cell towers. In June, the startup announced its Project Volutus initiative, which includes a partnership with Crown Castle, the largest US provider of shared wireless infrastructure (and a Vapor IO investor). That enables Vapor IO to take advantage of Crown Castle's existing network of 40,000 cell towers and 60,000 miles of fiber-optic lines in metropolitan areas. The startup has been developing automated software to remotely operate and monitor micro data centers to ensure that customers don't experience interruptions in service if some computer servers go down, says Cole Crawford, Vapor IO's founder and CEO.

Don't look for the edge to shut down all those data centers in Oregon, North Carolina, and other rural outposts: our era's digital cathedrals are not vanishing anytime soon. "Edge computing's vision of having thousands of small, regional and micro-regional data centers that are integrated into the last-mile networks is actually a natural extension of today's centralized cloud," Crawford says. In fact, the cloud computing industry has already extended its tentacles toward the edge with content delivery networks such as Akamai, Cloudflare, and Amazon CloudFront, which use edge locations to speed up delivery of music and video streaming.

Nonetheless, the remote computing industry stands on the cusp of a back-to-the-future moment, according to Peter Levine, general partner at the venture capital firm Andreessen Horowitz. In a 2016 video presentation, Levine highlighted how the pre-2000 internet once relied upon a decentralized network of PCs and client servers. Next, the centralized network of the modern cloud computing industry really took off, starting around 2005. Now, demand for edge computing is pushing development of decentralized networks once again (even as the public cloud computing industry's growth is expected to peak at 18 percent this year, before starting to taper off).

That kind of abstract shift is already showing up, unlocking experiences that could only exist with help from the edge. Hatch, a spinoff company from Angry Birds developer Rovio, has begun rolling out a subscription game streaming service that allows smartphone customers to instantly begin playing without waiting on downloads. The service offers low-latency multiplayer and social gaming features such as sharing gameplay via Twitch-style live-streaming. Hatch has been cagey about the technology it developed to slash the number of data-processing steps in streaming games, other than saying it eliminates the need for video compression and can do mobile game streaming at 60 frames per second. But when it came to figuring out how to transmit and receive all that data without latency wrecking the experience, Hatch teamed up with (guess who) Packet.

"We are one of the first consumer-facing use cases for edge computing," says Juhani Honkala, founder and CEO of Hatch. "But I believe there will be other use cases that can benefit from low latency, such as AR/VR, self-driving cars, and robotics."

Of course, most Hatch customers will not know or care how those micro data centers allow them to instantly play games with friends. The same blissful ignorance will likely surround most people who stream augmented-reality experiences on their smartphones while riding in self-driving cars 10 years from now. All of us will gradually come to expect new computer-driven experiences to be made available anywhere, instantly, as if by magic. But in this case, magic is just another name for putting the right computer in the right place at the right time.

"There is so much more that people can do," says Packet's Smith, "than stare at their smartphones and wait for downloads to happen. We want our computation now. And the edge is the way we'll get it."

More:

The Cloud Computing Era Could Be Nearing Its End | WIRED – WIRED

Marketo decides to go all-in on cloud computing, and picks Google as its home – GeekWire

Diane Greene, senior vice president for Google Cloud, speaks at Google Cloud Next this morning. (Google Photo)

One of the bigger marketing software companies, Marketo, has decided it's ready to ditch its servers and move into the cloud, and Google is getting the business.

The two companies announced a multiyear collaboration Thursday that will see Marketo move its business onto Google Cloud Platform over the next couple of years, while Google will do some work to integrate Marketo's products into G Suite. Forbes noted that Google provided migration incentives to sweeten the deal, which will further the notion that many of Google's major customer wins have come at the cost of steep discounts for its services.

Still, the multiyear agreement provides Google with another long-term customer that could help it woo others, especially other marketing companies. Marketo told Forbes that one of the main reasons it chose Google was its in-house marketing savvy as one of the biggest advertising brokers in the world, and that might be an interesting niche for Google to pursue as other software-as-a-service marketing companies plot out cloud strategies.

Marketo's software is used by many companies to manage their marketing operations, from lead generation to campaign measurement. It may have decided it needed some IT assistance earlier this year, when it somehow forgot to renew its domain name registration and went down for several hours until it could fix the problem.

Google has been making slow but steady progress in its cloud efforts as it tries to shed a reputation for lacking the enterprise sales touch that Amazon Web Services and Microsoft enjoy. It has stepped up its support of hybrid cloud strategies through deals with Nutanix, and just this week it lowered prices on networking for customers that don't require all the performance that Google's fiber network provides.

Here is the original post:

Marketo decides to go all-in on cloud computing, and picks Google as its home – GeekWire

Biz Cloud Computing – Four States Homepage

JOPLIN, Mo. – “It’s just like I’m there at the office,” says Wendy Brunner-Lewis.

She says it’s hard to imagine not being able to tap into the Cloud. “How many times have you woken up and your kids are sick and you think, ‘Oh gosh, all my stuff’s at the office.’ You know it’s nice that you don’t have to try to remember everything you need the night before and bring home,” Brunner-Lewis says.

A study by IDG Enterprise says almost seven out of ten offices are doing at least part of their work remotely, or on the Cloud, and it predicts that figure will reach 100% within three years.

John Motazedi with Joplin IT company SNC Squared says that's likely due to advantages like reduced maintenance. "Most of those things are already done by the vendor. So you don't spend time backing it up, you don't spend time patching it or doing updates," he says.

He also points to the flexibility: you aren't limited by the size of your hardware on site, kind of like electricity. "Most people don't have a generator in their house. They use electricity whenever they need it; they have wires that come to their house," Motazedi says.

He adds security is a high priority, so new users should check out the Cloud service before signing up. “There is a difference in data centers and how secure access to those data centers are,” Motazedi says.

SNC Squared is holding a seminar on Cloud computing next month.

See the original post here:

Biz Cloud Computing – Four States Homepage

The Benefits of Multi-Cloud Computing Architectures for MSPs – MSPmentor

Multi-cloud computing architectures are the next step up from cloud computing.

If you’re an MSP, it may no longer be enough to have just one cloud.

Here’s why a multi-cloud strategy can helped managed services providers.

As the term implies, multi-cloud computing refers to the use of more than one cloud.

A multi-cloud architecture could involve multiple public clouds — such as AWS, Azure and Google Cloud Platform.

Multi-cloud could also take the form of a mixture of different types of clouds — a public cloud, a private cloud and a managed cloud, for example.

In the latter sense, there is some overlap between multi-cloud architectures and hybrid architectures, which mix public and private clouds together.

Think of hybrid cloud as one form of multi-cloud computing.

Multi-cloud is a broader category, because it involves mixing clouds of many different types.

What do businesses — and MSPs in particular — have to gain from a multi-cloud strategy?

Consider the following advantages of a multi-cloud architecture:

Here is the original post:

The Benefits of Multi-Cloud Computing Architectures for MSPs – MSPmentor

Why 2017 Is The Year To Understand Cloud Computing – Nasdaq

The Cloud has become a major buzzword in business for very good reason. Small businesses and large enterprises alike can take advantage of cloud computing to build and expand the computing infrastructure behind the scenes. Follow this guide to better understand what cloud computing is, how it works, and how you can take advantage.

In the old world of web servers and internet infrastructure, websites and other online assets were typically limited to one main server, or a few linked servers using tools called load balancers, to process and send data, whether for a customer-facing website or an internal-facing application. The advent of content delivery networks (CDNs) powered up those servers to host and serve data from the edge of the network for faster serving and sometimes lower costs.

As computing demand exploded with the rise of the smartphone and high-speed internet, consumer and business needs downstream of those servers continue to creep upward. Cloud computing has emerged as the best option to handle an array of computing needs for startups and small businesses, thanks to the ability to start at a low cost and scale, almost infinitely, as demand grows. Advances in cloud technology at Amazon, Google, Microsoft, IBM, Oracle, and other major cloud providers are making cloud computing more desirable for all businesses.

When cloud computing first emerged, large enterprises were the only businesses able to afford the cost of elastic, flexible computing power. Now, however, those costs are more likely a drop in the bucket for small businesses.

For example, I use the cloud to store and serve videos for Denver Flash Mob, a side-hustle business I run with my wife. Our monthly bill is typically around a dollar or two, and heavy months lead to a bill around five bucks. No big deal! My lending startup Money Mola is also cloud based, with the costs of running both a development server and a public-facing server at around $30 per month.

The first time I logged into Amazon Web Services (AWS), it seemed like I needed a computer science degree to use it! I had a hard time doing even basic tasks beyond uploading and sharing videos. Thankfully, Amazon has since made AWS much easier to use, though it is not without its challenges.

I'm a pretty techy guy, so my skill set is a bit more advanced than the average computer user's. I have set up AWS to send outgoing transactional emails, automatically back up websites, and more on my own. If you are willing and able to hire a cloud expert, the possibilities of the cloud are endless. Anything from web hosting to artificial intelligence and big data analysis can run in the cloud.

The most basic way to get started with cloud computing is website and computer backups. If you use WordPress for your website, setting up cloud backups is simple with one of a handful of plugins like UpdraftPlus. If you can use the WordPress dashboard, you can set up cloud backups with UpdraftPlus. It is quick and easy, with out-of-the-box support for services from AWS, Dropbox, Google Drive, Rackspace Cloud, and others. The paid plugin version adds access to Microsoft OneDrive and Azure, Google Cloud Storage, and other options.

I run several backups of both my laptop and my web-based assets. If my home were to be burglarized or burned down, the cloud has me covered. If my laptop is stolen, I have a backup at home and in the cloud. Redundant backups are not optional; they are a must in 2017.
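No plugin is required to understand the core move, though: archive the files, stamp the copy, and ship it somewhere else. Here is a minimal, hypothetical sketch in Python (the directory names are invented, and uploading the resulting archive to S3, Dropbox, or another remote store is left to whatever client you already use):

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def backup_site(site_dir: str, dest_dir: str) -> Path:
    """Create a timestamped .tar.gz archive of site_dir inside dest_dir.

    Shipping the archive off-site is the step backup plugins automate;
    this sketch only produces the redundant local copy.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    base = Path(dest_dir) / f"site-backup-{stamp}"
    archive = shutil.make_archive(str(base), "gztar", root_dir=site_dir)
    return Path(archive)

# Demo against a throwaway directory:
with tempfile.TemporaryDirectory() as tmp:
    site = Path(tmp) / "site"
    site.mkdir()
    (site / "index.html").write_text("<h1>hello</h1>")
    out = backup_site(str(site), tmp)
    print(out.name)  # e.g. site-backup-20170815-120000.tar.gz
```

Run on a schedule (cron, Task Scheduler) and paired with an upload step, this is the whole redundant-backup habit in miniature.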

In addition to safe, secure backups, the cloud can reach far corners of the planet. Utilizing cloud-based CDNs, you know your customers will get every video and web page they want at near-instant speeds.

Let's say your business has a popular video you want to share around the world. With a cloud CDN, you upload your video once to the web. Then the CDN takes over and creates copies of that video file in data centers around the world. Whenever a customer clicks to view that video, they are served a copy from the data center closest to their location.

Thanks to the power of a CDN, you don't have to send viewers in Australia, London, Bangkok, and Buenos Aires a video from your web server in Texas. Each one gets a local copy, so they get their video even faster, offering a better customer experience. App-based businesses can even run multiple versions of their app in data centers around the world to ensure every user has the same great experience.
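That "closest data center" routing can be sketched as a simple great-circle lookup. The edge cities below echo the examples in the text, but the coordinates and the idea that a CDN picks purely by distance are simplifications for illustration:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical edge locations: name -> (latitude, longitude)
EDGES = {
    "texas":        (31.0, -100.0),
    "london":       (51.5,    0.0),
    "sydney":       (-33.9,  151.2),
    "buenos-aires": (-34.6,  -58.4),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def nearest_edge(user):
    """Pick the edge copy a viewer should be served from."""
    return min(EDGES, key=lambda name: haversine_km(user, EDGES[name]))

print(nearest_edge((48.8, 2.3)))     # a viewer in Paris -> "london"
print(nearest_edge((-36.8, 174.8)))  # a viewer in Auckland -> "sydney"
```

Real CDNs route on network topology and load, not raw geography, but distance is the intuition the article is describing.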

It doesn't matter what your business does; there is some way the cloud can help you achieve better results. The cloud is only going to grow and become more prominent in business. Older computing methods will go the way of the fax machine. If you want serious computing success with scalability and flexibility, the cloud is your best option.

This article was originally published on Due.com.

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

Read more from the original source:

Why 2017 Is The Year To Understand Cloud Computing – Nasdaq

Google Unveils Custom Hardware Chip for Cloud – Investopedia


Alphabet subsidiary Google is planning to unveil technical details of the Titan chip, its custom chip for hardware security for its cloud computing division, today. The Mountain View, California-based company had already announced the chip at its Google Next …

Read more from the original source:

Google Unveils Custom Hardware Chip for Cloud – Investopedia

Cloud Computing Confirmed for Travers | TDN | Thoroughbred Daily … – Thoroughbred Daily News

Cloud Computing at Saratoga | Sarah K. Andrew

After some deliberation by trainer Chad Brown, Klaravich Stables and William Lawrence's GI Preakness S. winner Cloud Computing (Maclean's Music) will compete in Saturday's GI Travers S. at Saratoga, the Eclipse Award-winning conditioner confirmed Monday morning. The latest addition to the expected full field augments an already competitive race that is expected to also draw GI Kentucky Derby winner Always Dreaming (Bodemeister) and GI Belmont S. hero Tapwrit (Tapit) from the Todd Pletcher barn.

Second in the Mar. 4 GIII Gotham S. and third in the Apr. 8 GII Wood Memorial S. prior to his win in the Preakness May 20, Cloud Computing returned from a two-month layoff with a disappointing last-of-five finish as the 6-5 second choice in the July 29 GII Jim Dandy S., the traditional prep for the Travers. He was asked to stay closer to a decidedly moderate pace that day and came up empty in the stretch, despite being beaten just 4 3/4 lengths by Good Samaritan (Harlan's Holiday). The colt has worked twice since then, most recently posting a five-furlong move in 1:01.65 Saturday.

"He couldn't have worked any better," said Brown. "I was very happy with the work and Javier was pleased, and he came out of his work well."

Brown said recently inducted Hall of Fame jockey Javier Castellano will ride Cloud Computing in the Travers, a race he has won a record five times.

Not a subscriber? Click here to sign up for the daily PDF or alerts.

Read more:

Cloud Computing Confirmed for Travers | TDN | Thoroughbred Daily … – Thoroughbred Daily News

VMware to surge more than 20 percent because the Amazon cloud … – CNBC

Wall Street rarely talks about its mistakes, but Deutsche Bank admitted it overestimated the Amazon Web Services threat to VMware’s business.

The firm raised its rating for VMware shares on Monday to buy from hold, saying the company’s server virtualization software can continue to thrive in a cloud-computing world.

“We’ve spent much of the last two years worried about VMware’s on-premise core server business given its maturity and the threat from AWS/Cloud adoption [Amazon Web Services],” analyst Karl Keirstead wrote in a note to clients entitled “Overcoming our AWS fears.”

“This upgrade should be seen in the context of growing evidence that large enterprises are embracing a hybrid model, materially lowering the out-year risk profile of VMware shares.”

The hybrid model is defined by companies using both local servers on-site and cloud-computing servers off-site. Keirstead said he realized the staying power of VMware's on-site server market was more "durable" than he originally forecast.

“We believe that large enterprises are migrating IT workloads to the public cloud model at a slower-than-expected pace and are electing to ramp spending to modernize their on-premise IT infrastructures,” he wrote. “Our recent checks agree that VMware technology is proving to be more durable than they would have thought 12-18 months ago.”

As a result, Keirstead increased his VMware price target to $120, which is 24 percent higher than Monday’s close. His previous price target was $110.

VMware shares are outperforming the market this year. Shares have risen 23.2 percent year to date through Monday compared with the S&P 500’s 8.5 percent gain.

The analyst said he is also cautiously optimistic about the VMware and Amazon AWS strategic partnership announced in October, which enables access to AWS computing power for the company’s customers.

“We are positive on the deal for both parties. It is hard to imagine how this could end up being a net negative for either party,” he wrote. “We conclude that the stock can still work even if the initial lift from VMware Cloud on AWS is modest.”

VMware will report second-quarter earnings on Thursday after the market close. Its stock traded up 1.8 percent shortly after Tuesday's market open.

CNBC’s Michael Bloom contributed to this story.

Read more from the original source:

VMware to surge more than 20 percent because the Amazon cloud … – CNBC

Microsoft reportedly set to lay off thousands as part of massive sales reorganization – GeekWire

Thousands of layoffs are planned to hit Microsoft's global workforce this week, part of a shakeup of the company's sales organization, according to a report Sunday in TechCrunch.

As part of the layoffs, Microsoft plans to merge parts of its enterprise customer business with its small-and-medium-enterprise business unit, according to TechCrunch, citing a source with knowledge of the downsizing.

We've reached out to Microsoft for comment, and we'll update this post as we learn more.

On Friday, Bloomberg reported that Microsoft was planning a reorganization of its sales group to focus more intently on its growing cloud computing business. Bloomberg did not report the number of potential layoffs, but noted that the changes were likely to be announced this week.

The changes in the sales operation appear to be putting even more weight behind Microsoft's Azure business, which is growing rapidly. Microsoft's commercial cloud run rate hit $15.2 billion during the second quarter, up from $14 billion in the previous quarter. Meanwhile, revenue in the company's intelligent cloud group grew by 93 percent to $6.8 billion in the second quarter.

It is unclear whether the layoffs will be accompanied by new cloud-focused job openings.

Microsoft's fiscal year ended on June 30, so the timing of the cutbacks could be tied to the company's move into a new fiscal year, which started on July 1.

Last July, Microsoft cut 2,850 people from its smartphone and sales teams.

As of March 31, Microsoft employed 121,567 people worldwide, including 45,535 in Washington state.


Five podcasts to catch up on the latest trends in cloud computing – TechTarget

Cloud is a dynamic technology, and enterprises need to be flexible to keep up.

But before they successfully adopt the latest trends in cloud computing — ranging from containers to continuous monitoring — enterprises first face a number of challenges. David Linthicum, a TechTarget contributor and SVP of Cloud Technology Partners, a cloud consultancy company in Boston, explores top cloud trends, their effect on enterprise IT teams and more in these five podcasts with cloud experts. Read on and tune in to know what to expect.

With the proliferation of cloud services, enterprises want to take advantage of new offerings and better prices at any time. Unfortunately, the more dependent an enterprise becomes on a particular cloud provider and its native services, the harder it is to move applications.

Lock-in risks are high in cloud — not just with vendors but also with models. With a private cloud, enterprises can be locked into their own design, and in public cloud, they can become dependent on add-on services. Any type of lock-in will result in high prices, according to Marten Mickos, CEO at HackerOne, a provider of vulnerability tracking software.

Many believe one of the latest trends in cloud computing — containers — could reduce these lock-in risks through the promise of portability. Containers continue to rise in popularity because they can make it easier to move applications from one cloud platform to another. But there’s a catch: Many of the cloud providers’ container management services, such as Azure Container Service, Google Container Engine and Amazon Elastic Compute Cloud Container Service, pose lock-in risks of their own.

“Docker and container management and orchestration solutions have made portability vastly easier, but as soon as you start availing yourself to the special services of whatever platform you’re on, you’re hooked,” Linthicum says.
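The portability Linthicum describes comes from packaging an application and its dependencies into a single image that any provider's container service can run unchanged. A minimal, hypothetical Dockerfile sketch (the application file name is an illustrative placeholder, not from the article):

```dockerfile
# The same image runs on Azure Container Service, Google Container
# Engine, or Amazon's container service -- lock-in only creeps in once
# the application starts calling a provider's proprietary services.
FROM python:3-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]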

As public cloud adoption continues to rise, some enterprises question whether private cloud is dead. Others, however, believe that private cloud is alive and well, as certain compliance, cost and security requirements still fuel deployments.

Compliance is tricky, and certain requirements and standards restrict some enterprises to a private cloud. Others are reluctant to migrate to public cloud because of potentially higher costs and previous investments in an on-premises data center. In addition, there can be high costs associated with training and hiring staff to maintain a public cloud deployment.

“There is no magic button on the side of the server that you press that makes it suddenly cloud-capable. It’s going to require software infrastructure, hardware infrastructure [and] operational skills,” says Bernard Golden, CEO of Navica, a cloud consulting firm.

Before you make your final decision about migration — either public or private — review what applications you currently run and what you want to run in the future. If compliance is still an issue, consider hybrid or multicloud models.

Hacking is a growing threat and large businesses, such as Target and Home Depot, have been victims of malicious attacks. It is time for enterprises to go on the offensive and adopt ongoing monitoring and testing practices to ensure their data is secure.

“Hacking is a business now,” says Zohar Alon, CEO and co-founder of Dome9 Security Ltd., a provider of cloud management as a service. “When the other side can benefit from it financially, [and] quite easily now with bitcoin, it’s not surprising to see those [hacking businesses] emerge [and see] ransomware all over the place.”

In addition, one of the latest trends in cloud computing is serverless architectures, which bring new security risks. Because of a serverless application's design, enterprises can't secure it with the same encryption or identity and access management practices they are used to. To reduce risk, they should also ensure serverless functions don't have more permissions than they need, Alon says.
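Alon's least-privilege advice typically translates into a narrowly scoped execution role for each function. As a hedged illustration (the table name, account ID, and region are made-up placeholders, not anything from the article), an AWS IAM policy for a hypothetical function that only reads one DynamoDB table might look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:Query"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/ExampleOrders"
    }
  ]
}
```

Granting only the two read actions on a single table means a compromised function cannot write data or touch other resources.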

Once an enterprise runs applications in the cloud, it generally wants those apps to keep running, which is where backup and recovery come in. But some of the latest trends in cloud computing, such as the internet of things, increase the amount of data floating around. This has some IT teams rethinking their backup and recovery strategies to maintain availability in case of an outage.

For example, some enterprises have replaced strongly consistent databases, such as MySQL, with eventually consistent databases, such as Apache Cassandra, says Tarun Thakur, co-founder and CEO at Datos IO, a data protection software provider. These databases are more distributed in nature and can offer more scale.
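Thakur's distinction between strongly and eventually consistent databases can be made concrete with the Dynamo-style quorum rule that stores like Cassandra use: with N replicas, a read of R replicas is guaranteed to overlap the latest write acknowledged by W replicas only when R + W > N. A small illustrative sketch of that rule (not tied to any real driver API):

```python
def is_strongly_consistent(n_replicas: int, read_quorum: int, write_quorum: int) -> bool:
    """Dynamo-style rule: reads see the latest write iff R + W > N."""
    return read_quorum + write_quorum > n_replicas

# With 3 replicas, QUORUM writes (2) plus QUORUM reads (2) overlap: strong.
print(is_strongly_consistent(3, 2, 2))  # True
# Writing and reading at ONE (1) may miss the latest value: eventual.
print(is_strongly_consistent(3, 1, 1))  # False
```

Running at lower quorums trades that read-your-writes guarantee for the scale and availability the article mentions.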

Airlines such as Delta, JetBlue and United have experienced data center outages that affected operations. It is important to have a backup and recovery plan in place to prevent major disruptions as one system fails over to another. IT teams should learn from these high-profile outages — namely, that, even as they adopt cloud and other new services, they shouldn’t “compromise what is needed to keep … applications and business running all the time,” Thakur says.

Some of the latest trends in cloud computing — such as hybrid and multicloud models — have forced vendors to reevaluate their services and ask whether they meet enterprise needs. In many cases, the easiest way for them to fill out their portfolios is to partner with or acquire other companies.

A notable example of this is Amazon Web Services (AWS) and VMware, whose partnership enables VMware’s software-defined data center software to run on AWS. Enterprises benefit from these deals because of simplified integration, but often in these situations, one vendor makes out better than the other.

“AWS gets to sit back and watch the meter go higher and lock in to more VMware install base and, perhaps, put in an advantage over what Microsoft can offer,” says Dana Gardner, president and principal analyst at Interarbor Solutions.

In 2013, IBM acquired SoftLayer to strengthen its cloud platform. To differentiate itself from the top public cloud providers, IBM continues to focus its efforts on hybrid cloud, as well as machine learning, artificial intelligence and other higher-level services. With more cloud models and technologies, Linthicum and Gardner agree that we will continue to see more partnerships and acquisitions in the future.

Artificial intelligence services expand in public cloud

Consider Azure Functions for your serverless needs

Are enterprises ready for machine learning in cloud?


Cloud computing challenges today: Planning, process and people – TechTarget

With its promises of lowering costs and fostering a more agile IT, cloud computing holds an almost magical allure for many companies today: They think moving an application or two to the cloud will solve all their problems, said Ed Featherston, vice president and principal architect at Cloud Technology Partners.

“Then they get the bill at the end of the month and go, ‘Oh my God, what happened?'” Featherston said at the recent Cloud Expo in New York. “The magic doesn’t happen by itself. You actually need to plan.”

In this SearchCIO video, filmed on the concourse of the Jacob K. Javits Convention Center, Featherston and two other cloud watchers discuss the biggest cloud computing challenges IT execs are dealing with today.

Sumit Sarkar sees another type of struggle. He’s a data evangelist at Progress, a vendor of data integration and data interoperability services. As organizations build cloud architecture, plugging in new technologies and services, “data is getting somewhat more abstracted,” or not immediately and easily available to analytics professionals, who need to slice and dice it.

“If I build these microservices, my data is behind these different APIs. What if I have a data science practice team? How do I make sure they have access to data to really bring business value?” Sarkar says. IT executives and their business counterparts have to keep that in mind as they’re “rearchitecting and refactoring all of their systems.”

Greg Bledsoe, managing consultant at Accenture, says companies moving to the cloud need to adopt new ways of working. Bledsoe helps companies make the transition to DevOps, the software development process that emphasizes frequent interaction and communication between development and operations teams.

Cloud computing, he says, makes experimenting cheap. “If it doesn’t work, throw it away. But companies are still managing their cloud infrastructure as if it were physical infrastructure.”

So someone on the business side might request a new tech project from IT, get it, try it out, and then find it's not quite right.

“Throw something back over the wall and have somebody hoist it back over the wall to you. This makes no sense for cloud,” Bledsoe says. “It’s totally a legacy artifact of our past management strategies that is completely unnecessary.”

What cloud computing challenges plague IT execs today?

Ed Featherston: The biggest struggle I’ve seen with clients in cloud is fully understanding what it is and what it isn’t for them — and what it’s going to provide them. The classic of, ‘If I go to the cloud, it’s going to solve all my problems.’

One of my favorite mantras is ‘No technology negates the need for good design and planning.’ Cloud is no exception. And the biggest challenge I see people having with cloud is if they don’t do that first. If they just say, ‘Oh, I’m just going to take this workload, I’m going to drop it in AWS [Amazon Web Services], I just log into the console, fire up a couple instances — boom, I’m off to the races.’ Then they get the bill at the end of the month and go, ‘Oh my God, what happened?’

The magic doesn’t happen by itself. You actually need to plan. You need to understand ‘What am I going to get out of it? Am I going to the cloud for cost savings?’ Then you better look really closely at it when you do that.

I was talking with somebody earlier about the fact that — you move your first application over and you say, ‘OK, why am I not saving any money?’ Well, because the servers that application was on still have five other applications on them. I still have to maintain them. I still have to pay for them. So, I’m paying for those servers still. Plus, now I’m paying Amazon or [Microsoft] Azure for that cloud instance that I just created, so I’m actually spending more money. You actually have to think that out if you’re going for the agility and being able to move faster. Do your development processes and operations processes support that capability? Yes, cloud can make you very agile — if you have the processes in place to do it. If you’re still a standard, Waterfall development type of shop that has no concept of what development and operations are and tying them together, the cloud’s not going to make it go faster for you. If anything, it’s probably going to make it go slower if you’re not ready for that.

So those are the kinds of challenges I see clients having out there. It's getting those expectations set. It's part of why I enjoy being where I am at our company, because it's one of the things we push really hard with the clients: No, we're not going to start with 'Go to the cloud.' We're going to start with 'What do you want from it?' Let's look at what you've got, let's understand how we're going to get there and what the stumbling blocks are; then, we start moving to the cloud.

Sumit Sarkar: What we’re seeing is there’s a lot of people who start — I don’t know if they’re buzzwords, but at the show in the morning we had a kickoff about cloud-native architectures, and there’s 12 attributes of them. And then there’s something — I think I heard the term cloud washed: You take an application, you stick it in the cloud, and you rebrand it as cloud.

But the thing is, in between, there’s a big mix of different levels. I think with the innovation that’s happening is that between a cloud-native architecture and something that’s cloud-washed, for example, there’s a whole lot of things happening in innovation. So I’ve heard different people who are taking maybe some NoSQL technologies to supplement an ERP system. Some people are building out some microservices on top of existing databases if you have a distributed data architecture.

So, what’s happening is data is getting somewhat more abstracted as we have this spectrum of cloud-native and, let’s say, untraditional or the monoliths. So they decompose these things, data gets moved around, make it scalable. So, what’s happening is it’s causing a challenge for the analytics professionals. So, if you think about the people doing operations intelligence, that is the last thing sometimes people think about. I encourage folks, the CxO people, to think about analytics as they’re rearchitecting and refactoring all of their systems. Think about it: If I build these microservices, my data is behind these different APIs. What if I have a data science practice team? How do I make sure they have access to data to really bring business value? Or I have this data engineering team who can really build these nice data repositories to get 360-degree intelligence. How do I make it easy for them to get the data?

So that’s what we’re seeing in the connectivity space is, How do you provide connectivity for those professionals to still access data as you’re refactoring these things in that big spectrum? So the CxO folks, they have these initiatives, and that’s something to really think about is, Don’t forget the data integration for analytics.

Greg Bledsoe: Because we’ve come from this legacy of managing physical infrastructure, and we’re used to managing physical infrastructure in a very tightly controlled way to protect our investment and control cost, we bring that same mindset to cloud.

This mindset does not really apply to cloud. You don’t have to pay for things when you’re not using them. The whole power of cloud and the reason that cloud empowers DevOps is because you’ve cheapened experimentation. It has become dirt-cheap to try something, and if it doesn’t work, throw it away.

But companies are still managing their cloud infrastructure as if it were physical infrastructure. And you have a DevOps team or someone that sets up cloud infrastructure for you, you put in a request, you put in ticket, and someone builds something, and a few days later you get a response with some things that are built. And then you start trying to use those and it’s not quite right; you do this again. Throw something back over the wall and have somebody hoist it back over the wall to you. This makes no sense for cloud. It’s totally a legacy artifact of our past management strategies that is completely unnecessary. So there are mechanisms that you can use to protect your cost and investment without having to centrally manage these architectures — which is essentially the exact opposite of DevOps. It’s a DevOps team that’s another silo that doesn’t really collaborate to solve the problem. It just becomes another source of wait time, another source of wheel spinning and another source of invisibility to the other teams.

This is exactly how to do DevOps wrong. And a lot of people are trying to implement it this way because it fits in with what they understand. It fits in with what they know. Because they haven’t really understood yet that DevOps is a completely different way to manage everything from the infrastructure to the people.

Mekhala Roy filmed this video.


US action on Microsoft email case could devastate cloud computing – Irish Times

The Microsoft case has been less headline-grabbing than Google's news-dominating mega-fine this week, but it is the far more important case of the two. Photograph: Brian Snyder/Reuters

A week may be a long time in politics, but in business, just five days has been time enough for two developments that will worry many tech multinationals with European Union operations.

First came the US department of justice (DOJ) decision late last week to request the US supreme court hear an appeal in the internationally significant Microsoft Dublin email case.

Then, early this week, the European Commission smacked an extraordinary €2.4 billion fine on Google, having determined, after a seven-year investigation, that it had violated EU anti-trust laws by using a dominant position in the search market to favour its own shopping listings service.

The two cases are different in scope and implication, but both will fray nerves in boardrooms and executive suites worldwide.

The Microsoft case has been less headline-grabbing than Google's news-dominating mega-fine this week, but it is the far more important and potentially devastating case of the two.

That's because while the Google decision may restrict how some internet-based businesses operate across the EU, the Microsoft case, if overturned by the US supreme court, would devastate one of the fastest-growing areas of business, cloud computing, undermining the foundation for how data is stored and handled.

Because most businesses worldwide rely on at least some international handling of data, this exposes Business with a capital B, not just the tech or internet-based sectors.

The case involves a judge's demand that Microsoft hand over emails held in Ireland for a New York state case. Microsoft refused. Importantly, though, it has not fought compliance with lawful government requests, but rather how this particular one was made: without going through the existing international agreements by which US authorities would normally request permission from and work with Irish authorities to access the emails.

The US has argued that Microsoft is an American company, giving US courts the right to directly demand the emails, regardless of where they are held. However, this is misleading. First, the US is trying to treat digital data as a different category of evidence. If the desired evidence were concrete (say, paper documents) rather than digital, US authorities would have to use existing international law-enforcement agreements. Digital is, wrongly, a legislative grey area.

Second, as Microsoft president and chief counsel Brad Smith argued in a blog post last week, if the US government has the right to directly seize internationally held data, then other countries will, of course, expect the same right to, in effect, conduct international digital raids for American or other nations' data, in the US or around the world, with near-impunity.

This raises obvious data-protection, data-privacy and surveillance concerns. It also completely undermines the whole concept of cloud computing (the movement and storing of data by organisations across international jurisdictions) and suggests businesses would have to run stand-alone operations and data centres in every geography in which they operate.

Having the supreme court hear this case would be a pointless waste of the court's time. As Smith notes, US legislators already accept that fresh legislation is needed to clarify and better streamline access to digital evidence. In the US, bipartisan efforts have begun in this regard.

A supreme court ruling could curtail or prematurely affect needed legislation. Hence, the DOJ referral request is unneeded and potentially catastrophic.

As for the Google case, the writing was on the wall for this decision for some time, as the company had failed in several attempts to reach a settlement with the EU over those seven years. The decision is likely to have a number of impacts.

First, it signals the EU is willing to make business-affecting decisions, backed with gasp-inducing fines, against multinationals seen to compete unfairly in areas of market dominance. And keep in mind the EU competition commissioner still has two ongoing investigations into other areas of Google's business: its Android mobile operating system and its AdSense online advertising.

Overall (and without knowing yet the details of the judgement), the EU is showing it will closely examine and regulate competition in market verticals. Many other market-dominating companies in such verticals (Amazon, for example, or Apple) must be nervous.

The willingness to impose major fines is a sharp shock, too. For years, EU actions have been seen as minor swats, not big wallops. Big fines will certainly focus corporate minds.

Finally, the EU, interestingly, is moving firmly into an anti-trust watchdog role the US has only dithered in for the two decades since the DOJ went after Microsoft on anti-trust grounds (using Windows to shoehorn its Internet Explorer browser on to desktops). The US abandoned its own anti-trust investigation of Google two years ago.

And that, of course, is an historical connecting thread in the two current cases.

Google says it may appeal the EU decision. Microsoft, when under further anti-trust investigation in the EU in past years, eventually decided the best, business-stabilising approach was to settle with the EU.

But these days Microsoft has led corporate efforts to confront the US government on over-reaching data access.

An interesting turn of affairs, indeed.


Microsoft Signs Cloud Computing Partnership with Box – CIO Today

Microsoft and file-storage startup Box have signed a deal to sell each other’s products, the latest blurring of the lines between friends and rivals in the growing business of cloud-computing.

Box builds web-based file storage and management tools, services that compete head-to-head with Microsoft’s own OneDrive and Sharepoint.

Despite that rivalry, the companies have agreed to jointly sell Box services and elements of Microsoft’s Azure cloud-computing platform, they said on Tuesday.

The companies say their engineering teams are also working on building more links between their products, including adding Azure to the Box Zones program. That effort lets Box customers opt to store their content in specific areas of Azure's massive global network of data centers. (Box Zones already includes Azure rivals Amazon Web Services and IBM.)

Cloud-computing has made some partnerships that would have seemed bizarre in the world of out-of-the-box business software of a generation ago. Microsoft during its dominance of the personal computer heyday developed a reputation for pushing customers to use its range of products at all costs, and shunning those developed by others.

But as the company prioritizes growth in its Azure cloud-computing platform, which enables other companies to build services on Microsoft’s network of data centers and rented software services, the Redmond firm has abandoned some of its scorched earth tactics. The company, analysts say, is betting that customers who plug into the cloud will demand that the products they use work well with those of other technology vendors.

Box, based in Redwood City, Calif., began as a startup founded by a pair of college students in Mercer Island. The company is among a slate of startups born in the cloud era that has thrived by building on-demand, web-based tools that replicate or improve on programs companies used to run from their own servers. Box held an initial public offering in 2015, and had sales of $425 million during the most recent 12-month period.

2017 Seattle Times syndicated under contract with NewsEdge/Acquire Media. All rights reserved.


Quiz time: Test your knowledge of hybrid cloud computing trends – TechTarget




Cloud Computing: Moving Desktops to the Cloud – DABCC.com

Cloud desktops are here already!

After years of promise, last year was a banner year for Desktop-as-a-Service (DaaS).

In January 2016, Citrix led the trend with its Citrix Cloud message at Citrix Summit. Later in 2016, Citrix announced a partnership with Microsoft to replace Azure RemoteApp, along with direct support in XenApp and XenDesktop for both AWS and Azure. Amazon, VMware, and Microsoft all made Desktop-as-a-Service announcements in 2016. And in August, Gartner stated that it expects that by 2019, 50% of new VDI users will be deployed on DaaS platforms.

Despite all the new vendor capabilities, most organizations have decided to put off DaaS, planning one last refresh of their on-premises desktops and VDI infrastructure.

Why are organizations putting off cloud computing (DaaS)?

Read the entire article here, Cloud Computing: Moving Desktops to the Cloud

via the fine folks at Ivanti.


Lady Eli, Cloud Computing Among Workers for Brown – BloodHorse.com (press release) (registration) (blog)

Trainer Chad Brown sent out a number of graded stakes winners to work on Belmont Park’s main track June 25.

Klaravich Stables and William Lawrence’s grade 1 winner Practical Joke breezed four furlongs in :48.09 as he gears up for the $400,000 Dwyer Stakes (G3) July 8.

The Into Mischief colt, who will make his first start in the Dwyer since a fifth-place run in the Kentucky Derby Presented by Yum! Brands (G1), seeks his first win of his 3-year-old campaign after runner-up efforts in the Xpressbet Fountain of Youth Stakes and Toyota Blue Grass Stakes (both G2).

“He breezed and continues to train very well ahead of the Dwyer,” Brown said.

Preakness Stakes (G1) winner Cloud Computing also put in a maintenance work Sunday, breezing four furlongs in :49.49. It was his second work since winning the second leg of the Triple Crown May 20. Also owned by Klaravich and Lawrence, Cloud Computing is training toward the $600,000 Jim Dandy Stakes (G2) July 29 at Saratoga Race Course.

Grade 1 winners Lady Eli and Antonoe breezed in company and covered four furlongs in :49.42. Lady Eli, who most recently won the Gamely Stakes (G1) at Santa Anita Park, is slated to make her next start on Saratoga’s opening weekend in the $500,000 Diana Stakes (G1T).

Brown said Antonoe, fresh off her win in the Longines Just a Game Stakes (G1T) June 10, is also a possibility for the Diana.

“They went together. They’re a good team and we’re happy with both of them,” Brown said.

Don Alberto Stable’s Rubilinda, the first U.S. winner for 10-time group 1 winner Frankel, was scratched from the June 24 Wild Applause Stakes after the race was moved off the turf.

“It puts me in a bad spot. I likely now will have to go on to an allowance race and if she does well, then on to a stakes race,” Brown said. “I’d like to run her (at Belmont) if I could.”

Breeders’ Cup Juvenile Fillies Turf (G1T) winner New Money Honey and grade 3 winner Fifty Five, two of the four expected Brown entrants for the $1 million Belmont Oaks Invitational (G1T) July 8, are expected to breeze on the turf June 26 at Belmont.


Cloud computing key to 4th industrial revolution – News VietNamNet – VietNamNet Bridge

Cloud computing is a crucial technological trend and has become an important technology during the fourth industrial revolution, according to Nguyen Thanh Phuc, Director General of the Authority of Information Technology Application.


Phuc made the remarks at the recent Vietnam Cloud Computing Conference 2017, sponsored by the Vietnam Software Association (VINASA), in coordination with the Lee Kuan Yew School of Public Policy (National University of Singapore).

Of note, Vietnamese ministries and authorities have boosted information technology applications, the building of e-government and improved the investment environment to create clear and favourable conditions for enterprises, he said.

Also at the conference, Associate Professor Dr. Vu Minh Khuong from Lee Kuan Yew School of Public Policy delivered a presentation about a survey on cloud computing at 800 enterprises and organisations in Vietnam.

The survey results indicated that the country had the fastest growth in investment for cloud computing in the 2010-16 period among ASEAN countries, increasing 64.4 percent per year, higher than the average in ASEAN (49.5 percent) and the world (42.5 percent).

However, real spending on cloud computing in Vietnam was still rather low last year: 107 times lower than in Singapore, 6.5 times lower than in Malaysia, 2.4 times lower than in Thailand, and 1.3 times lower than in the Philippines, he added.
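Those annual growth rates compound quickly. As a back-of-the-envelope sketch of my own (assuming the reported rates compound annually over the six years 2010-16), the cumulative multiples work out roughly as follows:

```python
# Back-of-the-envelope: what the reported annual growth rates in cloud
# investment imply as cumulative multiples over the six years 2010-16.

def cumulative_multiple(annual_rate, years):
    """Total growth factor from compounding annual_rate for the given number of years."""
    return (1 + annual_rate) ** years

# Annual growth rates reported in the survey cited above.
rates = {"Vietnam": 0.644, "ASEAN average": 0.495, "World average": 0.425}

for region, rate in rates.items():
    print(f"{region}: ~{cumulative_multiple(rate, 6):.1f}x over 2010-16")

# Vietnam's 64.4% per year compounds to roughly a 20-fold increase in six
# years, versus roughly 11x for the ASEAN average and 8x for the world.
```

This is only illustrative arithmetic on the survey's published rates, but it shows why a modest-sounding difference in annual growth produces such a large gap over the period.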

The above numbers revealed that there are many barriers to promoting cloud computing in Vietnam. The largest are the widespread use of unlicensed software, a lack of knowledge about the benefits of cloud computing, information security concerns, and the quality of cloud services in Vietnam, he said.

According to experts at the conference, in the early stages of digital transformation, priority should be given to developing ICT infrastructure, especially broadband connections and cloud computing applications.

At the same time, there should be priority policies created for cloud computing, in order to trigger digital conversions using big data and Internet of Things applications.

Nguyen Dinh Thang, VINASA Vice Chairman, added that cloud computing offered tremendous benefits, such as product and service standardisation, investment cost reductions, the shortening of the time to develop products and improvements in the quality of services.

Therefore, the agency proposed that the government adopt an orientation policy, while businesses and organisations develop strategies for research, investment, and early cloud adoption to improve production and business efficiency, contributing to the country’s economic development and boosting its progress during the fourth industrial revolution.

VNA

How the cloud has changed education and training – TNW

A few years ago, the cloud was a promise to reduce costs of IT and improve flexibility and scaling by providing on-demand computing, storage and services to every organization.

Today, the cloud is a ubiquity we take for granted. We expect every file, every service and digital asset we have to be available across all our devices everywhere we go, at any time of the day.

The omnipresence of the cloud has streamlined and transformed quite a number of domains, including education. Today, thanks to cloud computing, education and training has become more affordable, flexible and accessible to millions of people and thousands of businesses.

Here’s a look at how cloud-based education has changed things for the better.

One of the problems schools and training departments in organizations have constantly struggled with is keeping up with hardware, software, and IT staff costs and complexities. In contrast, the cloud offers a low-cost, subscription-based model that can support more companies and organizations. The elegance of the cloud is that the user requires little more than a browser and an internet connection. This is a welcome shift from the need to manually install and update applications on every single computer in a department.

In recent years, solutions such as Google’s suite of educational tools have provided schools with free access to general classroom tools such as word processors, spreadsheets, and presentation software. Cloud applications such as Google Docs allow students to collaborate on assignments in an easy-to-use environment.

Microsoft has also made its move to the cloud, providing subscription-based access to the cloud version of its popular Office suite, which it offers for free to students and teachers.

One of the interesting developments in the space has been the advent of virtual classrooms in the cloud. Virtual cloud classrooms provide teachers with a paperless way to set up classes and courses, distribute material and assignments, and track and grade student progress from their desktop browser or smartphone. On-premise virtual classroom software has existed for a while, but its installation and deployment came with heavy technical and financial requirements. In recent years, established companies such as Blackboard have started offering cloud-based services, making it possible for more schools and institutions to enroll.

Bigger tech corporations are also entering the space. Google launched its Classroom app as part of G Suite for Education in 2014 and Microsoft released its own Classroom last year. Both solutions revolve around providing a unified environment to better use office cloud apps in managing classes.

Cloud platforms can be a boon to professional education. For instance, IT training is traditionally associated with large investments in hardware and complex setup costs. However specialized cloud platforms have provided a flexible, cost-effective and easy-to-deploy alternative.

One example is CloudShare, a provider of cloud-based virtual machines, which enables companies to set up virtual training labs for their training sessions. With CloudShare, trainers can create any number of VMs running various operating systems in a virtual class environment, assign them to students, monitor their use, and actively assist students when needed. The use of cloud computing and virtual classes in IT training brings huge benefits by cutting back hardware costs and complexity while providing an interactive experience that is not possible in legacy classroom settings. It also benefits companies that need to train staff and employees across the world by sparing them additional travel and trainer fees.
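The workflow described above (provision VMs, assign them to students, monitor use) can be sketched in a few lines. This is a toy in-memory illustration only; the class and method names are invented for the example and are not CloudShare’s actual API:

```python
# Toy sketch of a virtual training-lab workflow: provision VMs, assign them
# to students, and see who is actively working. All names here are
# hypothetical illustrations, not a real vendor interface.

class TrainingLab:
    def __init__(self, name):
        self.name = name
        self.vms = {}       # vm_id -> {"os": ..., "student": ..., "running": ...}
        self._next_id = 1

    def create_vm(self, os_image):
        """Provision a VM from the given OS image; returns its id."""
        vm_id = self._next_id
        self._next_id += 1
        self.vms[vm_id] = {"os": os_image, "student": None, "running": False}
        return vm_id

    def assign(self, vm_id, student):
        """Hand a provisioned VM to a student and start it."""
        self.vms[vm_id]["student"] = student
        self.vms[vm_id]["running"] = True

    def active_students(self):
        """Students whose assigned VMs are currently running."""
        return [vm["student"] for vm in self.vms.values() if vm["running"]]

lab = TrainingLab("intro-to-linux")
vm = lab.create_vm("ubuntu-22.04")
lab.assign(vm, "alice")
print(lab.active_students())  # ['alice']
```

The point of the sketch is the shape of the workflow, not the implementation: in a real cloud training platform the same three steps run against hosted infrastructure, so the trainer never touches physical hardware.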

By 2025, the global demand for higher education will double to approximately 200 million students per year, mostly from emerging economies. Elsewhere, the disruption of the economy and employment landscape by artificial intelligence is increasing demand for professional training in various fields.

But thanks to cloud-based education, more and more people can now attend academic and professional courses. In recent years, we’ve seen the emergence of massive open online course (MOOC) platforms, which provide easy and affordable (sometimes free) access to knowledge and training.

In 2012, Stanford University professors Andrew Ng and Daphne Koller founded Coursera, a cloud platform that offers online courses, specializations, and degrees in a variety of subjects, including data science, computer science, engineering and medicine. Aside from Stanford, other top universities such as Princeton, University of Michigan and Penn State University are now using the platform to offer their programs to students worldwide.

Applicants can enroll for courses, specialization certificates or full higher education degrees. As of 2017, the platform offers more than 2,000 courses and has more than 24 million registered users worldwide.

edX, a platform similar to Coursera created by Harvard University in collaboration with the Massachusetts Institute of Technology, added high school education to its platform in 2014 to help people across the world get access to secondary education. Tech corporations have launched their own education platforms to give access to knowledge and education in specific fields. One example is IBM’s Big Data University, a free platform that aims to bring more people into data science and machine learning jobs and now has more than 400,000 registered users.

Cloud-based learning platforms offer anyone with an internet connection classrooms, lectures, course material and a seamless environment where they can learn at their own pace and work on assignments and projects on any device and anywhere they go.

With such huge amounts of data being collected and processed in the cloud, the next step of cloud education is the integration of artificial intelligence in the process. AI algorithms can assist both teachers and students in the learning process, finding pain-points in the teaching process and lending a hand where learners are struggling. Most major vendors have either taken their first steps or are now considering integrating AI-powered tools in their training solutions.

We’ve already seen acceleration and enhancements in education and training thanks to the cloud. What comes next could be even more exciting.
