Category Archives: Cloud Computing

Cloud computing will be a $540 billion market in 2022. Here's how ASX cloud stocks stack up – Stockhead

Posted: August 14, 2021 at 1:00 am

Cloud computing is arguably one of the biggest disruptors to have happened over the last 15 years.

The revolution has impacted many facets of our daily lives. Think Netflix, Spotify or Zoom, which enable us to watch movies, listen to music, or have meetings on any device (laptops, tablets, smartphones), anywhere.

At the enterprise level, cloud technology has saved billions of dollars globally by allowing companies to pay as they go, and enabling them to scale their storage and computing requirements up or down on an as-needed basis.

Advisory firm Gartner predicts that worldwide spending on public cloud services will reach US$400 billion ($540 billion) in 2022, up from US$270 billion in 2020.

Broadly speaking, cloud computing refers to the on-demand availability of computer resources, specifically data storage and computing power, without direct active management by the user.

Rather than having to operate their own costly computing infrastructure or data centres, companies can rent access to cloud servers (the public cloud) to store data and run their applications. This removes the need for large capital spending on in-house infrastructure.

As the technology has developed, the cloud industry has branched out into several different services, but the bulk of revenues come from two main sources: infrastructure-as-a-service (IaaS) and software-as-a-service (SaaS).

IaaS, which provides the data storage and computing resources described above, is dominated by the global tech giants.

Amazon pioneered the industry when it introduced Amazon Web Services (AWS) back in 2006, and it still dominates market share today. Then came Google Cloud Platform, Microsoft Azure, IBM SmartCloud, Oracle Cloud, and Alibaba Cloud in the years that followed.

Collectively, these six giants make up 80% of the global IaaS market, according to a Telsyte report.

The SaaS market, meanwhile, provides software solutions or applications that are built and run on top of cloud infrastructure.

The SaaS market is dominated by five giant players globally: Microsoft, Salesforce, Adobe, SAP, and Oracle.

In light of such overwhelming competition, how do Australian cloud computing companies on the ASX stack up?

Generally speaking, domestic players in Australia are not building new cloud servers to compete with the global giants, but are instead providing services that connect customers to the existing infrastructure.

Brisbane-based Megaport (ASX:MP1) is probably the best-known ASX player in the IaaS space.

Megaport does not own data centres per se, but rather partners with them and provides fast access to the major public clouds: AWS, Microsoft Azure, Google Cloud Platform and others.

The company's Software Defined Networking (SDN) technology provides the on-demand point of entry and connects to 740+ enabled servers across the Asia Pacific, North America, Europe, and the Middle East.

Nexion (ASX:NNG) is another IaaS company that provides clients access to the public cloud networks, but it does so with a twist.

The company has partnered with IBM Global Technology Services to provide a hybrid cloud service, which allows customers to access the public cloud while also bringing their own physical resources.

This matters because many companies want to store data in the cloud, but they often have legacy infrastructure running their core delivery systems that is not compatible with a public cloud. The hybrid cloud solution effectively solves this problem.

Sovereign Cloud (ASX:SOV), which operates under the AuCloud brand, is an IaaS provider exclusively focused on the defence and critical industries.

Due to the sensitive nature of its clients, the company does not provide an access point to the public cloud, but rather operates its own data centres in Canberra and Sydney designed to meet ASIO standards.

Sovereign made its ASX debut in December after raising $20m at 75c per share.

Other IaaS companies on the ASX include NextDC (ASX:NXT), DXN (ASX:DXN), and DC Two (ASX:DC2).

For a more global and diversified exposure to the cloud computing sector, investors could also buy into ASX-listed BetaShares Cloud Computing ETF (ASX:CLDD).

CLDD tracks exposure to leading companies in the global cloud computing industry (both IaaS and SaaS), which includes international tech names such as Zoom, Shopify, Salesforce, and Dropbox.

SaaS is the value-added side of the cloud computing sector: a software distribution model that allows data and applications to be accessed from any device with an internet connection and a web browser.

The SaaS model offers lower upfront costs than traditional software download and installation, making it accessible to a wider range of smaller businesses.

Arguably the best-known SaaS company on the ASX is Xero (ASX:XRO).

The Kiwi company provides enterprise accounting software, and uses a single unified ledger that allows users to work in the same set of books regardless of location or operating system. The Xero software can also be accessed through mobile apps.

Wisetech (ASX:WTC) provides cloud-based SaaS solutions to the logistics industry globally.

Its flagship platform, CargoWise, enables the world's supply chains, executing over 60 billion data transactions annually. The company has grown from a $1bn to a $10bn market cap since listing in 2016.

At the smaller end of the market, cloud-based SaaS companies like Whispir (ASX:WSP) provide communications workflow platforms to clients globally. This includes SMS, email, voice messaging, forms, and video.

Technology One (ASX:TNE), meanwhile, is a SaaS ERP (enterprise resource planning) solutions company, allowing clients to integrate with third-party providers such as Salesforce.

As we transition into a world of artificial intelligence, smart IoT devices, and autonomous vehicles, storage and processing of big data will be of paramount importance.

These data will have to be stored and processed in the cloud, and this is where 5G becomes a game-changer for the cloud computing industry.

5G will revolutionise the cloud by providing ultra-fast transmission rates and latency reduced by as much as 100x compared to 4G.

Industries that will benefit massively from 5G include healthcare, automotive, smart cities, as well as end users like us.

In this space, telco carriers like Telstra (ASX:TLS) and 5G Networks (ASX:5GN) will come into play.

5GN provides internet broadband and cloud infrastructure services to mid-market corporate industries.

Other 5G plays on the ASX include Aussie Broadband (ASX:ABB), which is currently laying 150km of underground fibre cable that will connect to eight data centres across the country.

At Stockhead we tell it like it is. While Nexion is a Stockhead advertiser, it did not sponsor this article.

Read more from the original source:

Cloud computing will be a $540 billion market in 2022. Here's how ASX cloud stocks stack up - Stockhead

Study: What are the areas marketing proficiency lag behind in Malaysia? – Marketing Interactive

Posted: at 1:00 am

Malaysians might be relatively more adept at digital skills such as cloud computing and data analysis, but overall, there is still a significant skills gap across data science, business and technology. According to Coursera's latest Global Skills Report which ranked 108 countries, Malaysian learners showcased high skill proficiency for cloud computing and data analysis at 91% and 79% respectively, indicating a growing pocket of highly-skilled technical professionals in the country. When it comes to marketing, however, the report found Malaysians to be competitive (64%) instead of cutting-edge like their Singaporean counterparts (92%).

Malaysia secured mid-rankings globally in each domain - #52 in business, #48 in technology and #51 in data science. With 56% proficiency in technology, Malaysia lags behind neighbours such as Singapore (96%), Vietnam (88%) and Indonesia (78%). Similarly, the data science domain recorded 52% proficiency, placing Malaysia only slightly ahead of Indonesia (46%), Thailand (46%) and the Philippines (44%).

While proficiency in machine learning is at 50%, Malaysia has more learners opting for online learning to arm themselves with the skills of the future. The top trending skills among Malaysian learners included Python programming, machine learning, the Internet of Things, and data management. Meanwhile, key skills for data science including software engineering, mathematics and programming scored less than 50% in the report's benchmark, despite the nationwide push to encourage a new generation of STEM talents, the report found.

Nonetheless, Malaysia is considered competitive with a 57% proficiency overall, ranking 46th globally and fourth in Southeast Asia. While it is ahead of the Philippines (#69) and Thailand (#76), it lags behind Singapore (#10), Vietnam (#20) and Indonesia (#45).

Coursera's MD - India and Asia Pacific, Raghav Gupta, said the pace of skills transformation is slower than the pace of digital transformation in Malaysia, as is the case in several countries worldwide. "Learners must invest in both soft and technical skills to prepare for jobs of the future," he said.

Meanwhile, Gupta also noted that the skills needed for high-demand entry-level roles can be developed "in a matter of months, not years". The report found that recent graduates and mid-career changers can develop entry-level digital job skills in as little as 35 to 70 hours, or one to two months with 10 learning hours per week. On the other hand, an individual without a degree or technological experience can be job-ready in 80 to 240 hours, or two to six months with 10 learning hours per week. The most transferable skills across all future jobs, according to Coursera, are human skills such as problem solving and communication, computer literacy and career management.

There are currently 347k Coursera learners in Malaysia and 44% of them are female. Machine learning was the most popular course among learners in the country, with First Step Korean and The Science of Well-Being the next most popular. Meanwhile, trending skills among Malaysian learners include marketing, digital marketing, C programming, design and product, and data management, among others.

According to Coursera, the report focuses on business, technology and data science as they are the most popular domains on Coursera in terms of enrollments, and they encapsulate the skills most crucial to the future of work. The competencies within each domain capture the broad capabilities required to achieve expertise in these areas, and individual skills capture specific requirements to achieve mastery within each competency.

Functionally, the report's competencies and skills come from Coursera's Skills Graph, which is a set of skills assembled through both open-source taxonomies such as Wikipedia, as well as crowdsourcing from Coursera educators and learners on what they teach and learn on the Coursera platform. The 108 countries within the report are ranked against each other: a country or industry at 100% ranks at the top of the 108 countries and a country at 0% is at the bottom.

For each group's percentile rankings, Coursera also broke them into four categories based on quartiles: cutting edge (76th percentile or above), competitive (51st to 75th percentile), emerging (26th to 50th percentile), and lagging (25th percentile and below).

See the rest here:

Study: What are the areas marketing proficiency lag behind in Malaysia? - Marketing Interactive

Bare Metal Performance in the Cloud – HPCwire

Posted: at 1:00 am

High Performance Computing (HPC) is known as a domain where applications are well-optimized to get the highest performance possible on a platform. Unsurprisingly, a common question when moving a workload to AWS is what performance difference there may be from an existing on-premises bare metal platform. This blog shows that the performance differential between bare metal instances and instances that use the AWS Nitro hypervisor is negligible for the evaluated HPC workloads.

The AWS Nitro System is a combination of purpose-built hardware and software designed to provide performance and security. Recent-generation instances, including instance families popular with HPC workloads such as c5, c5n, m5zn, c6gn, and many others, are based on the Nitro System. As shown in Figure 1, the AWS Nitro System is composed of three main components: Nitro cards, the Nitro security chip, and the Nitro hypervisor. Nitro cards provide controllers for the VPC data plane (network access), Amazon Elastic Block Store (Amazon EBS) access, and instance storage (local NVMe), as well as overall coordination for the host. Offloading these capabilities to the Nitro cards removes the need to use host processor resources to implement these functions, and it also offers security benefits. The Nitro security chip provides a hardware root of trust and secure boot, among other features that help with system security. The Nitro hypervisor is a lightweight hypervisor that manages memory and CPU allocation.

With this design, the host system no longer has direct access to AWS resources. Only the hardened Nitro cards can access other resources, and each of those cards provides software-defined hardware devices that are the only access points from the host device. With I/O handled by the Nitro cards, the last component, the Nitro hypervisor, can be lightweight and have minimal impact on workloads running on the host. The Nitro hypervisor has only the necessary functions, with a design goal of being quiescent, meaning it should never activate unless it is doing work for an instance that requested it. This also means there are no background tasks running and consuming resources when they are not needed.

Figure 1. The AWS Nitro System building blocks.

The Nitro system architecture also allows AWS to offer instances that provide direct access to the bare metal of the host. Since their initial introduction in 2017, many instance families have offered *.metal variants, which provide direct access to the underlying hardware with no hypervisor. As in the case where the Nitro hypervisor is used, the Nitro cards are still the only access points to resources outside of the host. These instances are most commonly used for workloads that cannot run in a virtualized environment due to licensing requirements, or those that need specific hardware features only available through direct access.

With both bare metal instances and Nitro-virtualized instances available, there is a straightforward way to measure the performance differential of HPC applications running on bare metal versus on the AWS Nitro hypervisor.
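
To make that comparison concrete, here is a minimal sketch (not from the article) using the AWS SDK for Python to launch the same instance family in both its Nitro-virtualized and bare-metal forms for a side-by-side benchmark; the AMI ID is a placeholder and the instance type names are illustrative assumptions.

```python
# Minimal sketch: launch a Nitro-virtualized instance and its *.metal sibling
# so an HPC benchmark can run on both. Instance types and AMI are assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def launch(instance_type, ami_id="ami-0123456789abcdef0"):  # placeholder AMI
    """Start a single on-demand instance and return its ID."""
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
    )
    return response["Instances"][0]["InstanceId"]

virtualized_id = launch("c5n.18xlarge")  # runs on the Nitro hypervisor
bare_metal_id = launch("c5n.metal")      # direct access to the host hardware
print(virtualized_id, bare_metal_id)
```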

You can read the full blog to see how different HPC applications perform on Amazon EC2 instances with AWS Nitro hypervisor vs. bare metal instances.

Reminder: You can learn a lot from AWS HPC engineers by subscribing to the HPC Tech Short YouTube channel, and following the AWS HPC Blog channel.

See original here:

Bare Metal Performance in the Cloud - HPCwire

Samsung Has Its Own AI-Designed Chip. Soon, Others Will Too – WIRED

Posted: at 1:00 am

Samsung is using artificial intelligence to automate the insanely complex and subtle process of designing cutting-edge computer chips.

The South Korean giant is one of the first chipmakers to use AI to create its chips. Samsung is using AI features in new software from Synopsys, a leading chip design software firm used by many companies. "What you're seeing here is the first of a real commercial processor design with AI," says Aart de Geus, the chairman and co-CEO of Synopsys.

Others, including Google and Nvidia, have talked about designing chips with AI. But Synopsys' tool, called DSO.ai, may prove the most far-reaching because Synopsys works with dozens of companies. The tool has the potential to accelerate semiconductor development and unlock novel chip designs, according to industry watchers.

Synopsys has another valuable asset for crafting AI-designed chips: years of cutting-edge semiconductor designs that can be used to train an AI algorithm.

A spokesperson for Samsung confirms that the company is using Synopsys' AI software to design its Exynos chips, which are used in smartphones, including its own branded handsets, as well as other gadgets. Samsung unveiled its newest smartphone, a foldable device called the Galaxy Z Fold3, earlier this week. The company did not confirm whether the AI-designed chips have gone into production yet, or what products they may appear in.

Across the industry, AI appears to be changing the way chips are made.

A Google research paper published in June described using AI to arrange the components on the Tensor chips that it uses to train and run AI programs in its data centers. Google's next smartphone, the Pixel 6, will feature a custom chip manufactured by Samsung. A Google spokesperson declined to say whether AI helped design the smartphone chip.

Chipmakers including Nvidia and IBM are also dabbling in AI-driven chip design. Other makers of chip-design software, including Cadence, a competitor to Synopsys, are also developing AI tools to aid with mapping out the blueprints for a new chip.

Mike Demler, a senior analyst at the Linley Group who tracks chip design software, says artificial intelligence is well suited to arranging billions of transistors across a chip. "It lends itself to these problems that have gotten massively complex," he says. "It will just become a standard part of the computational tool kit."

Using AI tends to be expensive, Demler says, because it requires a lot of cloud computing power to train a powerful algorithm. But he expects it to become more accessible as the cost of computing drops and models become more efficient. He adds that many tasks involved in chip design cannot be automated, so expert designers are still needed.

Modern microprocessors are incredibly complex, featuring multiple components that need to be combined effectively. Sketching out a new chip design normally requires weeks of painstaking effort as well as decades of experience. The best chip designers employ an instinctive understanding of how different decisions will affect each step of the design process. That understanding cannot easily be written into computer code, but some of the same skill can be captured using machine learning.

The AI approach used by Synopsys, as well as by Google, Nvidia, and IBM, uses a machine-learning technique called reinforcement learning to work out the design of a chip. Reinforcement learning involves training an algorithm to perform a task through reward or punishment, and it has proven an effective way of capturing subtle and hard-to-codify human judgment.

The method can automatically draw up the basics of a design, including the placement of components and how to wire them together, by trying different designs in simulation and learning which ones produce the best results. This can speed the process of designing a chip and allow an engineer to experiment with novel designs more efficiently. In a June blog post, Synopsys said one North American manufacturer of integrated circuits had improved the performance of a chip by 15 percent using the software.
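
As a loose illustration of the underlying "score candidate designs in simulation and keep the best" idea, here is a deliberately tiny toy sketch; it uses plain random search rather than reinforcement learning, and the netlist, grid size, and cost model are all invented for illustration, not taken from DSO.ai or any real tool.

```python
# Toy placement search: score random component placements by estimated wire
# length and keep the best. Illustrative only; real tools use reinforcement
# learning and far richer cost models.
import random

NETS = [("cpu", "cache"), ("cpu", "ddr"), ("cache", "ddr"), ("cpu", "io"), ("io", "ddr")]
COMPONENTS = sorted({c for net in NETS for c in net})
GRID = 8  # hypothetical 8x8 placement grid

def random_placement():
    """Assign each component a distinct random grid cell."""
    cells = random.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                          len(COMPONENTS))
    return dict(zip(COMPONENTS, cells))

def wirelength(placement):
    """Cost: sum of Manhattan distances between connected components."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

best = min((random_placement() for _ in range(5000)), key=wirelength)
print("best estimated wirelength:", wirelength(best))
```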

Most famously, reinforcement learning was used by DeepMind, a Google subsidiary, in 2016 to develop AlphaGo, a program capable of mastering the board game Go well enough to defeat a world-class Go player.

Continued here:

Samsung Has Its Own AI-Designed Chip. Soon, Others Will Too - WIRED

Cloud Computing: It's Always Sunny in the Cloud – IEEE Spectrum

Posted: July 29, 2021 at 8:59 pm

This is part of IEEE Spectrum's special report: Top 11 Technologies of the Decade

Just 18 years ago the Internet was in its infancy, a mere playground for tech-savvy frontiersmen who knew how to search a directory and FTP a file. Then in 1993 it hit puberty, when the Web's graphical browsers and clickable hyperlinks began to attract a wider audience. Finally, in the 2000s, it came of age, with blogs, tweets, and social networking dizzying billions of ever more naive users with relentless waves of information, entertainment, and gossip.

This, the adulthood of the Internet, has come about for many reasons, all of them supporting a single conceptual advance: We've cut clean through the barrier between hardware and software. And it's deeply personal. Videos of our most embarrassing moments, e-mails detailing our deepest heartaches, and every digit of our bank accounts, social security numbers, and credit cards are splintered into thousands of servers controlled by dozens (hundreds?) of companies.

Welcome to cloud computing. We've been catapulted into this nebulous state by the powerful convergence of widespread broadband access, the profusion of mobile devices enabling near-constant Internet connectivity, and hundreds of innovations that have made data centers much easier to build and run. For most of us, physical storage may well become obsolete in the next few years. We can now run intensive computing tasks on someone else's servers cheaply, or even for free. If this all sounds a lot like time-sharing on a mainframe, you're right. But this time it's accessible to all, and it's more than a little addictive.

The seduction of the business world began first, in 2000, when Salesforce.com started hosting software for interacting with customers that a client could rebrand as its own. Customers' personal details, of course, went straight into Salesforce's databases. Since then, hundreds of companies have turned their old physical products into virtual services or invented new ones by harnessing the potential of cloud computing.

Consumers were tempted four years later, when Google offered them their gateway drug: Gmail, a free online e-mail service with unprecedented amounts of storage space. The bargain had Faustian overtones (store your e-mail with us for free, and in exchange we'll unleash creepy bots to scan your prose), but the illusion of infinite storage proved too thoroughly enthralling. This was Google, after all: big, brawny, able to warp space and time.

Gmail's infinite storage was a start. But the program's developers also made use of a handy new feature. Now they could roll out updates whenever they pleased, guaranteeing that Gmail users were all in sync without having to visit a Web site to download and install an update. The same principle applied to the collaborative editing tools of Google Docs, which moved users' documents into the browser with no need for backups to a hard drive. "Six years ago, before the launch of Docs, office productivity on the Web wasn't even an idea," recalls Rajen Sheth, a product manager at Google.

Docs thus took a first, tentative bite out of such package software products as Microsoft Office. Soon hundreds of companies were nibbling away.

Adding new features and fixing glitches, it turned out, could be a fluid and invisible process. Indeed, sites like the photo storage service Flickr and the blog platform WordPress continually seep out new products, features, and fixes. Scraping software off individual hard drives and running it in anonymous data centers obliterated the old, plodding cycles of product releases and patches.

In 2008, Google took a step back from software and launched App Engine. For next to nothing, Google now lets its users upload Java or Python code that is then modified to run swiftly on any desired number of machines. Anyone with a zany idea for a Web application could test it out on Google's servers with minimal financial risk. Let's say your Web app explodes in popularity: App Engine will sense the spike and swiftly increase your computing ration.
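
For a sense of what that looks like in practice today, here is a minimal sketch of an App Engine Python app with an automatic-scaling configuration; the runtime name and scaling values are illustrative assumptions, and the exact tooling shown postdates the 2008 launch described above.

```python
# main.py: a minimal web app of the kind App Engine runs and scales for you.
# (Sketch only; the runtime and scaling numbers below are assumptions.)
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello from someone else's servers"

# An app.yaml next to this file tells App Engine how far to scale out when a
# traffic spike arrives and back down when it subsides, for example:
#
#   runtime: python39
#   automatic_scaling:
#     min_instances: 0          # scale to zero when idle
#     max_instances: 20         # cap the "computing ration"
#     target_cpu_utilization: 0.65
#
# Deploy with: gcloud app deploy

if __name__ == "__main__":
    app.run(port=8080)  # local test; App Engine serves it via its own frontends
```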

With App Engine, Google began dabbling in a space already dominated by another massive player, Amazon.com. No longer the placid bookstore most customers may have assumed it to be, in 2000 Amazon had begun to use its sales platform to host the Web sites of other companies, such as the budget retailer Target. In 2006 came rentable data storage, followed by a smorgasbord of instances, essentially slices of a server available in dozens of shapes and sizes. (Not satisfied? Fine: The CPU of an instance, which Amazon calls a compute unit, is equivalent to that of a 1.0- to 1.2-gigahertz 2007 Opteron or 2007 Xeon processor.)

To get a flavor of the options, for as little as about US $0.03 an hour, you can bid on unused instances in Amazon's cloud. As long as your bid exceeds a price set by Amazon, that spare capacity is yours. At the higher end, around $2.28 per hour can get you a "quadruple extra large" instance with 68 gigabytes of memory, 1690 GB of storage, and a veritable bounty of 26 compute units.

In a sense, the cloud environment makes it easier to just get things done. The price of running 10 servers for 1000 hours is identical to running 1000 machines for 10 hours, a flexibility that doesn't exist in most corporate server rooms. "These are unglamorous, heavy-lifting tasks that are the price of admission for doing what your customers value," says Adam Selipsky, a vice president at Amazon Web Services.
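
That symmetry is easy to verify with back-of-the-envelope arithmetic; the sketch below uses the illustrative ~$0.03-per-hour spot price quoted above (real prices vary by instance type, region, and demand).

```python
# Quick check: 10 servers for 1000 hours costs the same as 1000 servers for
# 10 hours, using the article's illustrative ~$0.03/hour spot price.
RATE_PER_HOUR = 0.03  # USD, illustrative

few_machines_long = 10 * 1000 * RATE_PER_HOUR    # 10 servers x 1000 hours
many_machines_short = 1000 * 10 * RATE_PER_HOUR  # 1000 servers x 10 hours

print(few_machines_long, many_machines_short)    # 300.0 300.0
assert few_machines_long == many_machines_short  # identical bills, 100x faster
```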

As unglamorous as an electric utility, some might say. Indeed, Amazon's cloud services are as close as we've gotten to the 50-year-old dream of utility computing, in which processing is treated like power. Users pay for what they use and don't install their own generating capacity. The idea of every company running its own generators seems ludicrous, and some would argue that computing should be viewed the same way.

Selling instances, of course, is nothing like selling paperbacks, toasters, or DVDs. Where Google's business model revolves around collecting the world's digital assets, Amazon has more of a split personality, one that has led to some odd relationships. To help sell movies, for example, Amazon now streams video on demand, much like companies such as Netflix. Netflix, however, also uses Amazon's servers to stream its movies. In other words, Amazon's servers are so cheap and useful that even its competitors can't stay away. But to understand what's truly fueling the addiction to the cloud, you'll need to glance a bit farther back in time.

COMPANY TO WATCH: F-Secure, Helsinki, Finland

F-Secure Corp. uses the cloud to protect the cloud. Its global network of servers detects malicious software and distributes protective updates in minutes. To assess a threat, it uses the Internet itself: A widely available application is more likely to be safe than a unique file.

FUN FACT: Transmitting a terabyte of data from Boston to San Francisco can take a week. So the impatient are returning to an old idea, Sneakernet: Put your data on a disc, take it to FedEx, and get it to a data center in a day.

FUN FACT: Dude, where are my bits? In the growing obfuscation of who's responsible for what data, Amazon recently deployed its storefront platform on privacy-challenged Facebook for the first time. The irresistible business case? Selling Pampers diapers.

In the mid-1990s, a handful of computer science graduate students at Stanford University became interested in technologies that IBM had developed in the 1960s and 70s to let multiple users share a single machine. By the 1980s, when cheap servers and desktop computers began to supplant mainframe computers, those virtualization techniques had fallen out of favor.

The students applied some of those dusty old ideas to PCs running Microsoft Windows and Linux. They built what's called a hypervisor, a layer of software that goes between hardware and other higher-level software structures, deciding which of them will get how much access to CPU, storage, and memory. "We called it Disco, another great idea from the '70s ready to make a comeback," recalls Stephen Herrod, who was one of the students.

They realized that virtualization could address many of the problems that had begun to plague the IT industry. For one thing, servers commonly operated at as little as a tenth of their capacity, according to International Data Corp., because key applications each had a dedicated server. It was a way of limiting vulnerabilities because true disaster-proofing was essentially unaffordable.

So the students spawned a start-up, VMware. They started by emulating an Intel x86 microprocessor's behavior in software. But those early attempts didn't always work smoothly. "When you mess up an emulation and then run Windows 95 on top of it, you sometimes get funny results," Herrod, now VMware's chief technology officer, recalls. They'd wait an hour for the operating system to boot up, only to see the Windows graphics rendered upside down or all reds displayed as purple. But slowly they figured out how to emulate first the processor, then the video cards and network cards. Finally they had a software version of a PC: a virtual machine.

Next they set out to load multiple virtual machines on one piece of hardware, allowing them to run several operating systems on a single machine. Armed with these techniques, VMware began helping its customers consolidate their data centers on an almost epic scale, shrinking 500 servers down to 20. "You literally go up to a server, suck the brains out of it, and plop it on a virtual machine, with no disruption to how you run the application or what it looks like," Herrod says.

Also useful was an automated process that could switch out the underlying hardware that supported an up-and-running virtual machine, allowing it to move from, say, a Dell machine to an HP server. This was the essence of load balancing: if one server started failing or got too choked up with virtual machines, they could move off, eliminating a potential bottleneck.

You might think that the virtual machines would run far more slowly than the underlying hardware, but the engineers solved the problem with a trick that separates mundane from privileged computing tasks. When the virtual machines sharing a single server execute routine commands, those computations all run on the bare metal, mixed together with their neighbors' tasks in a computational salad bowl. Only when the virtual machine needs to perform a more confidential task, such as accessing the network, does the processing retreat back into its walled-off software alcove, where the calculating continues, bento-box style.
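
A heavily simplified, purely conceptual sketch of that split might look like the loop below; real hypervisors do this with hardware-assisted traps rather than Python, and every instruction name here is invented for illustration.

```python
# Conceptual sketch: mundane guest instructions run at native speed, while
# privileged ones (network, disk, page tables) trap into the hypervisor.
PRIVILEGED = {"OUT_NETWORK", "READ_DISK", "SET_PAGE_TABLE"}  # illustrative

def execute_natively(instr):
    """Stand-in for running an instruction directly on the bare metal."""
    pass

def emulate_in_hypervisor(instr):
    """Stand-in for the walled-off, hypervisor-mediated code path."""
    print(f"hypervisor handling {instr} on behalf of the guest")

def run_guest(instructions):
    for instr in instructions:
        if instr in PRIVILEGED:
            emulate_in_hypervisor(instr)   # trap: retreat into the alcove
        else:
            execute_natively(instr)        # salad bowl: full-speed execution

run_guest(["ADD", "MUL", "OUT_NETWORK", "ADD"])
```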

Those speedy transitions would not have been possible were it not for another key trend: the consolidation of life into an Intel world. Back in virtualization's early days, a major goal was to implement foreign architectures on whatever hardware was at hand, say, by emulating a Power PC on a Sun Microsystems workstation. Virtualization then had two functions, to silo data and to translate commands for the underlying hardware. With microprocessor architectures standardized around the x86, just about any server is now compatible with every other, eliminating the tedious translation step.

VMware no longer has a monopoly on virtualization (a nice open-source option exists as well), but it can take credit for developing much of the master idea. With computers sliced up into anywhere between 5 and 100 flexible, versatile virtual machines, users can claim exactly the computing capacity they need at any given moment. Adding more units or cutting back is simple and immediate. The now-routine tasks of cloning virtual machines and distributing them through multiple data centers make for easy backups. And at a few cents per CPU-hour, cloud computing can be cheap as dirt.

So will all computing move into the cloud? Well, not every bit. Some will stay down here, on Earth, where every roofing tile and toothbrush seems fated to have a microprocessor of its own.

But for you and me, the days of disconnecting and holing up with one's hard drive are gone. IT managers, too, will surely see their hardware babysitting duties continue to shrink. Cloud providers have argued their case well to small-time operations with unimpressive computing needs and university researchers with massive data sets to crunch through. But those vendors still need to convince Fortune 500 companies that cloud computing isn't just for start-ups and biology professors short on cash. They need a few more examples like Netflix to prove that mucking around in the server room is a choice, not a necessity.

And we may just need more assurances that our data will always be safe. Data could migrate across national borders, becoming susceptible to an unfriendly regime's weak human rights laws. A cloud vendor might go out of business, change its pricing, be acquired by an archrival, or get wiped out by a hurricane. To protect themselves, cloud dwellers will want their data to be able to transfer smoothly from cloud to cloud. Right now, it does not.

The true test of the cloud, then, may emerge in the next generation of court cases, where the murky details of consumer protections and data ownership in a cloud-based world will eventually be hashed out. That's when we'll grasp the repercussions of our new addiction, and when we may finally learn exactly how the dream of the Internet, in which all the world's computers function as one, might also be a nightmare.

For all of IEEE Spectrum's Top 11 Technologies of the Decade, visit the special report.

Read the original:

Cloud Computing: It's Always Sunny in the Cloud - IEEE Spectrum

Cloud Computing Impact On The Gaming Industry | Invision Game Community – Invision Game Community

Posted: at 8:59 pm

Cloud computing is the instant, remote access to computing systems and resources without being actively involved in managing the infrastructure. It's a data center made accessible to many users over the Internet. Anyone with access rights can interact with the cloud and retrieve, manage, and download information from anywhere in the world.

Cloud computing services are provided on a pay-as-you-go basis and come with a number of essential features.

The gaming industry is openly embracing cloud computing technology and is also implementing Gaming as a Service (GaaS). The tremendous processing power of cloud computing enables users to stream video games directly to their devices and run them from remote servers. The cloud handles all the processing requirements for the device, so you don't need next-generation hardware to enjoy the latest games.

You can now stream gaming content from a network server. GaaS capabilities are supplied in different models: local rendering GaaS, remote rendering GaaS, and cognitive resource allocation GaaS.

All you need is low latency and large bandwidth, with minimal response time and high-quality video output. There are many models for the provision of cloud gaming, including a monthly subscription to an entire library of games or paying per game you request.
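
To put rough numbers on "low latency and large bandwidth", the arithmetic below assumes a 60 frames-per-second stream at a 15 Mbps video bitrate with a 30 ms network round trip; all three figures are illustrative assumptions rather than numbers from the article.

```python
# Rough cloud-gaming budget arithmetic (all inputs are illustrative assumptions).
FPS = 60                # target frame rate
BITRATE_MBPS = 15       # assumed compressed video bitrate
NETWORK_RTT_MS = 30     # assumed round trip to the game server

frame_budget_ms = 1000 / FPS                  # ~16.7 ms to render and encode a frame
gb_per_hour = BITRATE_MBPS * 3600 / 8 / 1000  # megabits per second -> GB per hour

print(f"per-frame budget: {frame_budget_ms:.1f} ms, plus {NETWORK_RTT_MS} ms of network RTT")
print(f"data transferred: about {gb_per_hour:.2f} GB per hour of play")
```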

The high cost of gaming equipment usually presents a shortfall in the gaming experience. Now that cloud computing is deeply established, it has become comparatively more expensive to set up shop with physical games.

The number of gamers, plus the total time spent playing and watching video games online, has been rising over the years. An Entertainment Software Association (ESA) report states that around 64% of adults in the U.S. regularly play video games.

The scope of cloud computing in the gaming industry has enormous potential to expand. Today, video gaming actively engages about 2.8 billion people worldwide, a number expected to soar beyond 3 billion by 2023. The entire video game industry is on the verge of reaching $189.3 billion in revenues in 2021, while the global gaming market is estimated to reach a value of $256.97 billion by 2025.

Cloud computing is resolving many of the computing challenges faced by both gamers and gaming companies. Hence, it's not shocking that companies like Google and Microsoft decided to move into cloud gaming services (Google Stadia and Project xCloud).

Some realists look beyond the hype to argue that the Internet presents limitations regarding processing speed, but the coming years hold the possibility of significant changes and solutions to latency and processing problems.

Ongoing developments are driving us closer to faster adoption of cloud gaming services. The complete rollout of 5G technology will speed up the power of cloud computing and drive further adoption.

Microsoft established Project xCloud, which aims to enhance the gaming experience across multiple devices. It launched cloud gaming (beta) for Xbox Game Pass Ultimate members in 2020.

Sony made its cloud gaming debut in 2014 when it launched PlayStation Now, having acquired a leading interactive cloud gaming company back in 2012, and it successfully established its place in the world of cloud-based gaming. Although Sony remained unchallenged for years, there are now more companies expanding their investments in the field.

Google has invested in the development of Stadia, a video game platform designed to provide instant access to video games regardless of screen type, offering the capacity to play 4K games on your TV without a console.

You stream the games through a browser on a laptop or your phone.

EA established Project Atlas in 2018 to leverage cloud computing and artificial intelligence in enabling game developers to access services at optimal capacity, with an easy-to-use experience.

Amazon, the leading cloud service provider, also launched a cloud gaming service, Luna, which harnesses the extensive cloud capacity of AWS. Amazon is also establishing a new gaming channel in collaboration with global video game developer Ubisoft. You can access the massive library of games by subscription.

Nvidia has been actively building cloud gaming solutions for many years. The evidence of its research and development lies in the release of GeForce Now, which became accessible to everyone in February 2020.

Nvidia also collaborated with Tencent to establish PC cloud gaming in China.

See the original post here:

Cloud Computing Impact On The Gaming Industry | Invision Game Community - Invision Game Community

Amazon Web Services is getting ready to retire one of its oldest cloud computing services – ZDNet

Posted: at 8:59 pm

In the coming months, Amazon Web Services (AWS) will shut down one of its oldest cloud computing infrastructure services, EC2-Classic, and is warning remaining users to move off the service to avoid application downtime.

"EC2-Classic has served us well, but we're going to give it a gold watch and a well-deserved sendoff,"writes AWS evangelist Jeff Barr.

EC2-Classic arrived with the original release of Amazon EC2 but was not supported for accounts created after April 2013, at which point AWS required users to launch EC2 instances in a virtual private cloud (VPC), a logically separated section of AWS.

With EC2-Classic, instances run in a single, flat network that is shared with other customers. EC2-Classic instances required public IP addresses, or tunneling, to communicate with AWS resources in a VPC.

There are some deadlines coming up for any business still on EC2-Classic, but Barr says the process will be gradual.

"Rest assured that we are going to make this as smooth and as non-disruptive as possible. We are not planning to disrupt any workloads and we are giving you plenty of lead time so that you can plan, test, and perform your migration," he notes.

Key dates to keep in mind are October 30, 2021, and August 15, 2022.

On October 30, AWS will disable EC2-Classic in Regions for AWS accounts that have no active EC2-Classic resources in the Region. From that date, AWS will also no longer sell 1-year and 3-year Reserved Instances for EC2-Classic.

By August 15, 2022, AWS reckons all migrations will be done and that all EC2-Classic resources will have been extinguished from AWS accounts.

There are several key AWS resources and services that EC2-Classic customers will need to keep an eye on during the migration.

It could be tricky to find all services dependent on EC2-Classic resources, so AWS has released the EC2 Classic Resource Finder script to help locate EC2-Classic resources in an account.
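
As a rough illustration of the kind of inventory such a script automates (this is not the official Resource Finder), instances launched in EC2-Classic appear in the EC2 API without an associated VPC, so a quick check can look like the sketch below.

```python
# Minimal sketch (not the official EC2 Classic Resource Finder): flag instances
# that appear to run in EC2-Classic, i.e. report no VPC ID.
import boto3

ec2 = boto3.client("ec2")
classic_instances = []

for page in ec2.get_paginator("describe_instances").paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            if "VpcId" not in instance:           # VPC instances carry a VpcId
                classic_instances.append(instance["InstanceId"])

print("Possible EC2-Classic instances:", classic_instances or "none found")
```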

It's also offering the AWS Application Migration Service (AWS MGN) to help customers migrate instances and databases from EC2-Classic to a VPC.

EC2-Classic customers should note that disabling it in a region is meant to be a "one-way door", but Barr says users can contact AWS Support if they need to re-enable EC2-Classic for a region.

View post:

Amazon Web Services is getting ready to retire one of its oldest cloud computing services - ZDNet

Students will benefit from new cloud computing pathway – Brunswick News

Posted: at 8:59 pm

Technology has changed many aspects of the way we live. It's changed everything from how we communicate with each other to how we shop for goods and services.

It would make sense that as our technological society continues to move forward, new tech will infiltrate the workplace. That means workers will need to learn new skills to stay ahead of the ever-evolving technological landscape. A recent announcement from the State Board of Education shows how schools in Georgia are working to make sure today's students have access to learn these skills.

The state board recently approved a recommendation from State School Superintendent Richard Woods to add a new career pathway in cloud computing, according to a report from Capitol Beat News Service. Three courses (introduction to software technology, computer science principles, and cloud computing) will be part of the pathway.

A lot of people have probably heard the term cloud computing, but they may not know what it entails. In general, the term refers to delivering services, such as data storage, through the internet. When you back up your photos or data to the cloud, you are using a system built on the skills students will learn in this pathway.

Adding this pathway as an option for high schoolers in the state is a no-brainer. Cloud computing is one of the most in-demand hard skills employers are looking for, according to professional networking and employment website LinkedIn. In fact, Capitol Beat reported that there are currently more than 4,000 cloud computing-related job openings in the state.

The curriculum for the courses is being developed with feedback from some of the biggest technology firms in the world, such as Amazon Web Services, Google and Microsoft. Students will get the chance to learn cloud computing skills from a program designed with input from the firms most responsible for the leaps in technology we use every day.

Students who start down this pathway could one day come up with the next great technological invention. Even if they don't become the next Bill Gates, they will have the skills to find a job in a field that could keep growing as we become even more technologically advanced.

The goal of high school is not only to educate and assist the development of our youth, but also to make sure they have the best chance possible to succeed when they graduate.

This cloud computing pathway is just another tool to help complete the mission.

Excerpt from:

Students will benefit from new cloud computing pathway - Brunswick News

2021 Thematic Research into Cloud Computing in Healthcare – Featuring Amazon, Microsoft and Google Among Others – ResearchAndMarkets.com – Business…

Posted: at 8:59 pm

DUBLIN--(BUSINESS WIRE)--The "Cloud Computing in Healthcare, 2021 Update - Thematic Research" report has been added to ResearchAndMarkets.com's offering.

Healthcare providers are extremely cost-conscious because they are under constant pressure to improve patient care while maintaining profitability. Cloud solutions support this by reducing the costs of in-house IT infrastructure. Cloud computing also greatly reduces the time required to deploy software, which can take months in on-premises deployments. Cloud software deployment and updates can be conducted remotely and typically very quickly, so employees can spend less time waiting and be more productive. Major categories of cloud solutions include infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS).

In the healthcare industry, there is much concern for protecting patients' personally identifiable information (PII) as records include all forms of personal data, including name, patient number, addresses, next of kin, and detailed health information. When data privacy is breached, a healthcare company faces legal liability and penalties for regulatory noncompliance. Healthcare is one of the biggest targets of ransomware attacks, in which hackers infect a computer system and demand payment to restore it. One of the best-known examples is the 2017 WannaCry attack, which affected over 200,000 computers in over 150 countries, costing $126m in the UK alone and up to $7.9bn globally. As cloud adoption grows, developers are increasingly aware of security risks and how to combat them. Security measures for cloud include minimizing attacks, controlling logins, and improving data encryption.
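
To make "improving data encryption" concrete at the application layer, here is a minimal sketch (not drawn from the report) that encrypts a patient record before it leaves for cloud storage, using the symmetric Fernet scheme from the widely used Python cryptography package; key management is deliberately omitted.

```python
# Minimal sketch: encrypt PII client-side before uploading it to cloud storage.
# In production the key would come from a managed key store, never from code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, fetched from a KMS/HSM
cipher = Fernet(key)

record = b"patient_number=12345; name=Jane Doe; next_of_kin=John Doe"
token = cipher.encrypt(record)   # ciphertext is safe to store off-premises

assert cipher.decrypt(token) == record  # only key holders recover the PII
print("encrypted record:", token[:40], b"...")
```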

This report explores the theme of cloud computing in healthcare, through coverage of healthcare challenges, players, market size and more.

Scope

Reasons to Buy

Key Topics Covered:

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/vbq26w

Link:

2021 Thematic Research into Cloud Computing in Healthcare - Featuring Amazon, Microsoft and Google Among Others - ResearchAndMarkets.com - Business...

Now is the time for a federal cloud modernization moonshot – Brookings Institution

Posted: at 8:59 pm

Now is the time to launch a Federal Cloud Modernization moonshot to modernize all practical legacy civilian IT systems within a decade. COVID vividly demonstrated the importance of our IT systems to a resilient and robust economy. Yet from security breaches to delayed tax processing, the weaknesses of government IT systems are well known.

In ITIF's Secrets from Cloud Computing's First Stage report, I show how cloud computing offers a better way to modernize federal IT. This will bring improved citizen services, lower operating costs, and, as repeated security breaches highlight the need for, better cybersecurity. The initiative should be led by the Federal Chief Information Officer (CIO) Council, the White House Office of Management and Budget (OMB), and the Federal Chief Technology Officer (CTO), with deep engagement by agencies and support from Congress. The Technology Modernization Fund (TMF) can serve as a starting point, with IT modernization funding targeting $10 billion a year.

This will involve modernizing thousands of systems. We will need to develop new agile migration methodologies and stand up migration factories. The U.S. Digital Service can play an important role here. Leadership must also rally the federal IT industry partner community to implement system migrations at scale. It's not just funding: the initiative needs a robust program office and careful governance. Ten years is arguably too long, but even so this will be a challenge to achieve; once it starts showing success, lessons learned should be applied to modernize state and local government IT.

Cloud better enables the government missions and programs that the American public depends on. Cloud computing is a powerful platform that provides hundreds of IT services with a common architecture, security model, development tools, and management approach. This now provides a better way to automate and scale modernization in a more repeatable fashion. Cloud computing has 31% lower operational costs than comparable on-premises infrastructure, and even greater savings when people and downtime costs are included. Elsewhere, I show that cloud is a more flexible and automated system that enables rapid changes to new demands. This provides better, more reliable citizen services, whether they be innovative public-facing websites, faster payment processing, or veterans' health care scheduling. Moreover, cloud provides substantially stronger security that is built into the platform by design. While the recent cybersecurity executive order makes important process and policy changes, the systems and code still need to be modernized.

The initiative should be led by the Federal CIO Council, OMB, and the Federal CTO. Agency and department leadership, in addition to CIOs, need to be deeply engaged to support IT modernization. DHS Cybersecurity and Infrastructure Security Agency should be an integral partner. Congress will need to support funding and will expect transparency. The CIO Council should set a baseline of systems to modernize and then set measurable, agency-specific outcome goals such as the number of target applications, servers, and petabytes of data moved, cost savings, and priority programs supported. Federal CIOs will need to prioritize all major systems and provide plans to move them in smaller stages, learning along the way. A cloud modernization moonshot program office is crucial to manage the program and should issue public progress reports at least bi-annually, in addition to managing ongoing performance metrics, timelines, and cost savings.

The U.S. federal government is the largest technology buyer, spending well over $100 billion a year on IT. However, we need to get out of the trap where annual appropriations only pay for ongoing operations, leaving little funds to move to lower-cost, more capable systems. As Congress and the President negotiate an infrastructure modernization package, digital infrastructure and Federal IT should be included. The federal Technology Modernization Fund provides funding and expertise to upgrade and transform IT systems. Now is the time to build on the lessons learned and scale it. Funding should be increased to roughly $10 billion a year, or roughly 10% of the federal IT budget. This is substantial but would place the government at the low end of the target share of IT spending dedicated to modernization. The TMF repayment requirement should be aggressively lowered for moonshot projects, with more funding for the most important systems. The OMB, in consultation with Congress, should develop criteria for funding. Funding can prioritize target public domains including health, education, security, and benefit payments and fraud.

The federal government relies on federal IT-focused companies to provide IT services, and private industry will be integral to moving thousands of systems to the cloud. New migration methodologies will be needed to move workloads at this scale, with attention to related mission work-flows and governance. The U.S. Digital Service has an important role to play here. Migration factories that move IT systems and data in more standardized, repeatable processes will be needed. Moving to the public cloud should be the desired default choice due to its better cost, operational flexibility, and agility. However, private clouds for sensitive data and on-premises modernization remain options where appropriate. They should require specific justification and approval, with criteria developed by OMB and the Federal CIO council.

Earlier Cloud First and Cloud Smart policies helped start the federal move to the cloud. Ten years later, it's time to build on them with additional action. The goal is ambitious. Yet the federal Data Center Optimization Initiative, for example, targeted closing ~10% of federal data center square footage a year, and included goals such as cost savings, server utilization, and energy efficiency. For sure there will be setbacks along the way, but the initiative should learn from these and course-correct. A parallel initiative at the Department of Defense for national security systems could follow. Lessons from the federal level should then be applied to a state and local government modernization initiative. We are moving to a digital economy to generate growth, resiliency, and improved social opportunities. A robust government IT capability is integral to this progress.

Amazon is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and not influenced by any donation.

Follow this link:

Now is the time for a federal cloud modernization moonshot - Brookings Institution
