
Category Archives: Cloud Computing

The Future of Data Management According to the Father of Edge Computing – MarketScale

Posted: April 15, 2022 at 12:47 pm

The ever-growing pool of data that businesses generate is becoming increasingly difficult to sort and analyze. Businesses are inundated with data that compounds daily as IoT and AI technology continue to evolve. And unless a business is properly set up to collect, store, and analyze the data it collects, that data becomes virtually useless.

Edge Computing versus the Cloud

The solution: edge computing. Edge computing is predicted to reach a market size of $40B by 2027, with a compound annual growth rate of over 30%. Today, every major IT and telecommunications company has embraced edge computing. "It's important to understand that cloud computing is also growing, so this is not a zero-sum solution. The growth of edge computing does not come at the cost of cloud computing. Both will grow well into the future," explained Mahadev Satyanarayanan (Satya).

Satya is often referred to as one of the fathers of edge computing. His long list of accolades includes being a Deloitte Cloud Institute Fellow and Carnegie Group University Professor of Computer Science at Carnegie Mellon University. When it comes to the future of data management and evolving technology, Satya is one expert who provides insightful guidance on emerging trends and what they mean to the world at large.

Introducing Edge Native Applications

Applications are being created and designed to work specifically with the properties of edge computing. These applications utilize multiple layers of computing, which is a confusing subject for many people. Satya explained edge computing in a way that reduces confusion about what the term means: "It's useful to think in terms of tiers of computing. Tier one is the cloud. Tier three is mobile and IoT devices. Tier two refers to a small cloud with excellent network proximity to tier three, and we refer to such a small cloud as a cloudlet."

Edge-native applications require both tier two and tier three to operate successfully, especially applications that are compute-intensive, bandwidth-hungry, and latency-sensitive. However, it should be noted that tier one computing is often required for successful operations as well. Satya predicts that "this class of applications [will] grow and enrich the space of edge computing in the future."
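The tier taxonomy above suggests a simple offloading rule an edge-native application might follow: use the largest tier whose round-trip latency fits the task's budget. The sketch below is purely illustrative; the latency figures and function name are assumptions, not measurements or any real framework's API.

```python
# Assumed round-trip times from a tier-3 device, in milliseconds.
# These numbers are illustrative, not measured.
TIER_RTT_MS = {
    1: 80.0,  # tier 1: distant cloud region
    2: 5.0,   # tier 2: nearby cloudlet
    3: 0.0,   # tier 3: compute on the device itself
}

def choose_tier(max_latency_ms: float) -> int:
    """Prefer the cloud's abundant compute, fall back to the cloudlet,
    and process on-device only when nothing off-device fits the budget."""
    for tier in (1, 2):
        if TIER_RTT_MS[tier] <= max_latency_ms:
            return tier
    return 3

# A latency-tolerant analytics job can use the distant cloud...
assert choose_tier(500.0) == 1
# ...an augmented-reality loop with a 10 ms budget needs the cloudlet...
assert choose_tier(10.0) == 2
# ...and a 1 ms control loop must stay on the device.
assert choose_tier(1.0) == 3
```

This mirrors Satya's point that compute-intensive, latency-sensitive workloads live at tiers two and three, while latency-tolerant work can still flow to tier one.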

Platform9 notes that the global coronavirus pandemic has accelerated the need for edge computing due to the massive increase in remote work environments, telemedicine, and distance learning, to name a few. And while most data will stay local, the need for scalable, easily programmable, responsive infrastructure at the edge will increase.

Cloudlets Critical to Successful Data Management

Although data is captured by tier-three computing devices, the ability of these devices to process data in real time is limited at best. In today's hyperconnected world, companies need a way to scale and analyze data faster, cheaper, and better. The only way to do that is to move out of the cloud and onto the edge of the network and internet, where most future data will be generated (Crunchbase).

Satya noted that the ability to offload computation to cloudlets is crucial to success, and the role of the cloud is to provide the virtually unlimited computing resources that can be used for tasks that do not have stringent bandwidth or latency constraints on them.

This is further supported by industry predictions estimating that 75% of all data will be processed at the edge by 2022. Furthermore, it is also estimated that by 2025 nearly one-third of all the workloads in the world will run at the edge (Crunchbase).


Posted in Cloud Computing | Comments Off on The Future of Data Management According to the Father of Edge Computing – MarketScale

Insights on the Software Consulting Global Market to 2027 – Lower Infrastructure and Storage Costs Result in a Higher ROI – GlobeNewswire

Posted: at 12:47 pm

Dublin, April 15, 2022 (GLOBE NEWSWIRE) -- The "Global Software Consulting Market Size, Share & Industry Trends Analysis Report By Enterprise Size, By Vertical, By Application (Enterprise Solutions, Migration & Maintenance Services & Others), By Regional Outlook and Forecast, 2021-2027" report has been added to ResearchAndMarkets.com's offering.

The Global Software Consulting Market size is expected to reach $458.5 billion by 2027, rising at a market growth of 12.1% CAGR during the forecast period.

Software consulting is the practice of advising on the best software solutions for a company's business strategy. Businesses seek out different software solutions in order to align their technology investments with that strategy. In the software consulting process, a software consultant is employed as a contractor for a certain amount of time. Software consulting firms assist businesses in optimizing, designing, processing, architecting, and implementing software. Furthermore, these services assist businesses in making decisions about software technology and their software adoption investment plan. Software consulting also enables businesses to have a clear strategy for technology advancement and to come up with innovative ways to streamline corporate procedures.

Technical skills for improving and maintaining software, onsite management services, support & testing services, and system designing and planning services are all provided by software consulting organizations. Software consulting is the process of finding a software solution to optimize a company's operations while saving time.

Software consultants are familiar with the business domain and model. Based on this knowledge, they advise businesses on software solutions that are vital for their technical progress and can add value to their organization through new solutions, enhanced IT architecture, or better integration between current solutions. For technology updates, businesses are turning to third-party service providers, who examine the technical state of a company's software.

Many businesses rely on software consulting services to stay afloat in today's fast-paced technology environment. The market is likely to be driven by increasing digitization in corporate operations. Offshoring and globalization are also likely to boost consulting services demand. Enterprises are projected to adopt new technologies like cloud computing and mobile computing, resulting in an increase in the need for consulting services. The expansion in the number of small and medium businesses, as well as the need for more advanced software solutions, is likely to open up new market prospects for building successful software consulting business models.

COVID-19 Impact Analysis

The market is rising due to a growing demand for digitization of business processes across sectors and verticals, particularly for the smooth integration of software into an enterprise's IT system. However, as a result of the COVID-19 pandemic and the global economic slump, various sectors are experiencing substantial consequences and problems across crucial operations.

While the industry has grown significantly in recent years as a result of digitization and technological penetration, the pandemic has forced numerous nations to the brink of recession. As a result, a large number of consulting customers are deferring projects, lowering project scope to save money, or cancelling them entirely. Multiple client projects that were cancelled have had a negative impact on vendor revenues and have hampered market growth in the short term.

Market Growth Factors:

Lower infrastructure and storage costs result in a higher return on investment

Businesses are apprehensive about the expenses of data hosting on-premises, both in terms of deployment and maintenance. Furthermore, employee costs and challenges with downtime are two additional worries for businesses. The current competitive environment and global economic conditions have hastened the use of cost-effective business model restructuring strategies. Another reason driving the use of cloud computing services is the rising movement of businesses toward digital transformation and the acceleration of customer experience, both of which are lowering corporate expenses. Furthermore, the cloud provides the pay-as-you-go approach, which allows businesses to pay for cloud services based on how often they use them, resulting in lower prices.
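The pay-as-you-go argument above comes down to simple arithmetic: hourly cloud billing beats a fixed on-premises cost whenever utilization is low enough. The numbers below are entirely illustrative assumptions, not quotes from any provider.

```python
# Assumed monthly cost figures for equivalent capacity; illustrative only.
ON_PREM_MONTHLY = 1200.0    # hardware amortization plus staff, fixed
CLOUD_RATE_PER_HOUR = 2.0   # pay-as-you-go rate

def cheaper_option(hours_used_per_month: float) -> str:
    """Return which deployment is cheaper at a given utilization level."""
    cloud_cost = hours_used_per_month * CLOUD_RATE_PER_HOUR
    return "cloud" if cloud_cost < ON_PREM_MONTHLY else "on-prem"

# A workload that runs four hours a day favors pay-as-you-go billing...
assert cheaper_option(4 * 30) == "cloud"      # 120 h -> $240 vs $1200
# ...while a flat-out 24/7 workload can favor the fixed on-prem cost.
assert cheaper_option(24 * 30) == "on-prem"   # 720 h -> $1440 vs $1200
```

The crossover point depends entirely on the assumed rates, which is exactly why consultants model it per client rather than assuming the cloud is always cheaper.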

While some IT positions were affected during the pandemic, most IT employees have found it relatively simple to obtain new jobs due to the current demand for skills. Because IT leaders are aware of the scarcity of talent, they are deliberating carefully before cutting roles. Their organizations, on the other hand, require the right combination of skills at any given time, which is a moving target.

Hybrid cloud services are becoming more popular

Enterprises with existing infrastructure are migrating toward cloud computing services and are prepared to use a hybrid strategy in order to gain the benefits of both on-premises and cloud services. Furthermore, SMEs are seriously considering cloud computing services, which offer significant benefits such as no upfront infrastructure expenses and compute resources available on demand. These factors are driving the adoption of hybrid cloud services by businesses. The hybrid cloud also provides improved workload management, stronger security and compliance, and seamless collaboration within DevOps teams.

Businesses are turning to the hybrid cloud approach to solve problems that are tough to solve with legacy systems. Hybrid cloud bridges the gap between IT and companies by increasing agility and efficiency while also delivering IT resources quickly and at a reasonable cost. It allows businesses to scale up or down existing applications and infrastructure as needed, while also providing users with high-speed performance and high dependability.
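The scale-up-or-down behavior described above can be sketched as a simple placement rule: keep regulated workloads on premises and burst the rest to the public cloud when local capacity runs out. The policy, capacity figure, and function name below are illustrative assumptions, not any vendor's actual scheduling logic.

```python
# Assumed number of workload slots available on-premises; illustrative.
LOCAL_CAPACITY = 10

def place_workload(is_sensitive: bool, local_in_use: int) -> str:
    """Decide where a new workload runs under a naive hybrid-cloud policy."""
    if is_sensitive:
        return "on-prem"        # compliance keeps it inside the firewall
    if local_in_use < LOCAL_CAPACITY:
        return "on-prem"        # use spare local capacity first
    return "public-cloud"       # burst out when local slots are full

# Sensitive data never leaves the data center, even under load...
assert place_workload(is_sensitive=True, local_in_use=10) == "on-prem"
# ...ordinary workloads fill local capacity first...
assert place_workload(is_sensitive=False, local_in_use=3) == "on-prem"
# ...and overflow to the public cloud when that capacity is exhausted.
assert place_workload(is_sensitive=False, local_in_use=10) == "public-cloud"
```

Real hybrid platforms add cost, latency, and data-gravity considerations to this decision, but the bridging role the paragraph describes is exactly this kind of placement choice.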

Market Restraining Factors

A growing array of multi-sourcing strategies are being used

Rather than taking a one-size-fits-all approach, an increasing number of companies are breaking big consulting contracts into smaller parts and enlisting the help of multiple vendors to complete tasks. Because consulting companies do not always have expertise in every practice area, healthcare organizations are gradually embracing the multi-sourcing approach.

Healthcare providers, payers, and government agencies are all pushing for multiple consulting companies to work together on projects. Multi-sourcing, on the other hand, may bring its own set of challenges and issues, since it necessitates effective and reliable service integration among vendors. This might have a detrimental effect on consulting firms' profitability.

Key Topics Covered:

Chapter 1. Market Scope & Methodology

Chapter 2. Market Overview
2.1 Introduction
2.1.1 Overview
2.1.1.1 Market Composition and Scenario
2.2 Key Factors Impacting the Market
2.2.1 Market Drivers
2.2.2 Market Restraints

Chapter 3. Competition Analysis - Global
3.1 KBV Cardinal Matrix
3.2 Recent Industry Wide Strategic Developments
3.2.1 Partnerships, Collaborations and Agreements
3.2.2 Product Launches and Product Expansions
3.2.3 Acquisition and Mergers
3.3 Top Winning Strategies
3.3.1 Key Leading Strategies: Percentage Distribution (2017-2021)
3.3.2 Key Strategic Move: (Acquisitions and Mergers: 2017, Sep - 2022, Jan) Leading Players

Chapter 4. Global Software Consulting Market by Enterprise Size
4.1 Global Large Enterprises Market by Region
4.2 Global Small & Medium Enterprises (SMEs) Market by Region

Chapter 5. Global Software Consulting Market by Vertical
5.1 Global BFSI Market by Region
5.2 Global IT & Telecom Market by Region
5.3 Global Manufacturing Market by Region
5.4 Global Government Market by Region
5.5 Global Retail Market by Region
5.6 Global Healthcare Market by Region
5.7 Global Automotive Market by Region
5.8 Global Others Market by Region

Chapter 6. Global Software Consulting Market by Application
6.1 Global Enterprise Solutions Market by Region
6.2 Global Migration & Maintenance Services Market by Region
6.3 Global Software Security Services Market by Region
6.4 Global Design Services Market by Region
6.5 Global Application Development Market by Region
6.6 Global Application Testing Services Market by Region
6.7 Global Others Market by Region

Chapter 7. Global Software Consulting Market by Region

Chapter 8. Company Profiles
8.1 Ernst & Young Global Limited
8.1.1 Company Overview
8.2 Accenture PLC
8.2.1 Company Overview
8.2.2 Financial Analysis
8.2.3 Segmental and Regional Analysis
8.2.4 Research & Development Expenses
8.2.5 Recent strategies and developments
8.2.5.1 Partnerships, Collaborations, and Agreements
8.2.5.2 Acquisition and Mergers
8.2.5.3 Product Launches and Product Expansions
8.2.6 SWOT Analysis
8.3 Cognizant Technology Solutions Corporation
8.3.1 Company Overview
8.3.2 Financial Analysis
8.3.3 Segmental and Regional Analysis
8.3.4 Recent strategies and developments
8.3.4.1 Acquisition and Mergers
8.3.5 SWOT Analysis
8.4 Deloitte Touche Tohmatsu Limited
8.4.1 Company Overview
8.4.2 Financial Analysis
8.4.3 Segmental Analysis
8.5 IBM Corporation
8.5.1 Company Overview
8.5.2 Financial Analysis
8.5.3 Regional & Segmental Analysis
8.5.4 Research & Development Expenses
8.5.5 Recent strategies and developments
8.5.6 SWOT Analysis
8.6 Atos Group
8.6.1 Company Overview
8.6.2 Financial Analysis
8.6.3 Segmental and Regional Analysis
8.6.4 Recent strategies and developments
8.6.4.1 Acquisition and Mergers
8.6.4.2 Product Launches and Product Expansions
8.6.5 SWOT Analysis
8.7 Capgemini SE
8.7.1 Company Overview
8.7.2 Financial Analysis
8.7.3 Regional Analysis
8.7.4 Recent strategies and developments
8.8 Oracle Corporation
8.8.1 Company Overview
8.8.2 Financial Analysis
8.8.3 Segmental and Regional Analysis
8.8.4 Research & Development Expense
8.8.5 Recent Strategies and Developments
8.8.5.1 Partnerships, Collaborations, and Agreements
8.8.5.2 Acquisitions and Mergers
8.8.6 SWOT Analysis
8.9 SAP SE
8.9.1 Company Overview
8.9.2 Financial Analysis
8.9.3 Segmental and Regional Analysis
8.9.4 Research & Development Expense
8.9.5 Recent strategies and developments
8.9.5.1 Product Launches and Product Expansions
8.9.6 SWOT Analysis
8.10 CGI, Inc.
8.10.1 Company Overview
8.10.2 Financial Analysis
8.10.3 Regional Analysis
8.10.4 Recent strategies and developments

For more information about this report visit https://www.researchandmarkets.com/r/a7t2dw


Software for Open Networking in the Cloud (SONiC) Moves to the Linux Foundation – PR Newswire

Posted: at 12:47 pm

Leading open source network operating system enabling disaggregation for data centers is now hosted by the Linux Foundation to enable neutral governance in a software ecosystem

SAN FRANCISCO, April 14, 2022 /PRNewswire/ -- Today, the Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced that Software for Open Networking in the Cloud (SONiC), an open source network operating system, is now part of the Linux Foundation. The Linux Foundation provides a venue for continued ecosystem and developer growth and diversity, as well as collaboration across the open source networking stack.

"We are pleased to welcome SONiC to the Linux Foundation family of open networking projects," said Arpit Joshipura, general manager, Networking, Edge, and IoT, the Linux Foundation. "SONiC is a leader in open source data center NOS deployments, and we're looking forward to growing its developer community."

The Linux Foundation will primarily focus on the software component of SONiC, and continue to partner with the Open Compute Project (OCP) on aligning hardware and specifications like SAI.

"Microsoft founded SONiC to bring high reliability and fast innovation to the routers in Azure cloud data centers. We created it as open source so the entire networking ecosystem would grow stronger. SONiC already runs on millions of ports in the networks of cloud scalers, enterprises, and fintechs. The SONiC project is thrilled to be joining the Linux Foundation to take the community to its next jump in scale, participation, and usage," said Dave Maltz, Technical Fellow and Corporate Vice President, Microsoft Azure Networking.

About SONiC

Created by Microsoft for its Azure data centers, SONiC is an open source network operating system (NOS) based on Linux that runs on over 100 different switches from multiple vendors and ASICs. It offers a full suite of network functionality, like BGP and RDMA, that has been production-hardened in the data centers of some of the largest cloud-service providers. It offers teams the flexibility to create the network solutions they need while leveraging the collective strength of a large ecosystem and community.

Existing Ecosystem

SONiC brings a strong existing ecosystem, with premier members including Alibaba, Broadcom, Dell, Google, Intel, Microsoft, NVIDIA and 50+ global partners.

The SONiC community will host its first hackathon later this year. Stay tuned for details and registration information.

More information about SONiC, including how to join, is available at SONiC (azure.github.io).

Support from Key Stakeholders & Customers

Alibaba

"This is a big milestone for the SONiC community. After joining the Linux Foundation, the SONiC community will play a much more important role in the networking ecosystem," said Dennis Cai, Head of Network Infrastructure, Alibaba Cloud. "Congratulations! As one of the pioneering SONiC users and contributors, Alibaba Cloud has widely deployed SONiC- based whitebox switches in our data centers, edge computing cloud, P4- based network gateways, and will extend the deployment to Wide Area Networks. With modern network OS design and operation- friendly features, we already gained tremendous value from the large-scale deployments. Alibaba is committed to the SONiC community, and will continue bringing our large-scale deployment best practices to the community, such as open hardware specs , network in-band telemetry, high performance networking, and network resiliency features, SRv6, etc."

Broadcom

"Large hyperscalers agree that merchant silicon, hardware independence, and open source protocol and management stack are essential for running their data center networks. Broadcom has wholeheartedly supported this vision with leading-edge, predictable silicon execution and contributions to the SONiC project. We are excited to see the SONiC initiative join the Linux Foundation and look forward to working with the streamlined ecosystem to drive the data center and hyperscale needs of the future," said Mohammad Hanif, senior director of engineering, Core Switching Group, Broadcom.

Dell Technologies

"We believe SONiC will continue its accelerated adoption into the modern data center, delivering the scale, flexibility and programmability needed to run enterprise-level networks," said Dave Lincoln, vice president of product management at Dell Technologies. "As a leading SONiC contributor, we see the advantages it brings to the supporting open source community and customers. As we continue the drive to take open-source-based solutions mainstream, we look forward to working with the Linux Foundation and its supporting communities to drive SONIC's development and adoption."

eBay

"eBay operates a large-scale network infrastructure to support its growing global business. eBay cares about the openness and quality of NOS to operate its network infrastructure. eBay is an active participant in the SONiC community and deploys SONiC at scale in its infrastructure. eBay is excited to see this next step of growth of the SONiC community," said Parantap Lahiri, vice president, Network and Datacenter Engineering at eBay.

EPFL

"At EPFL, we have been looking for a vendor neutral and flexible NOS that can provide HaaS capabilities for our Private Cloud Environment. SONiC OS provides us the solution we have been looking for in our Data Centre, allowing us to migrate to a powerful and modern Data Centre network. We are looking forward to this next phase in the SONiC community," said Julien Demierre, Network and System architect at EPFL.

Google

"We believe moving SONiC to the Linux Foundation is very important as it will further enhance collaboration across the open source network, community and ecosystem. Google has more than a decade of experience in SDN; our data centers and WAN are exclusively SDN controlled, and we are excited to have helped bring SDN capabilities to SONiC . We fully support the move to the LF and intend to continue making significant upstream contributions to drive feature velocity and make it easier for operators to realize the benefits of SDN with PINS/SONiC and P4," said Dan Lenoski, vice president, Engineering, Network Infrastructure, Google.

Intel

"Intel has a strong history of working with SONiC and the Linux Foundation to help to propel innovation in an open, cooperative environment where ideas are shared and iterated. We continually promote open collaboration, encompassing open-source technologies such as the Infrastructure Programmer Developer Kit and P4 integrated networking stack (PINS), using Intel Xeon Scalable processors, Infrastructure Processing Units and Tofino Intelligent Fabric Processors as base hardware," said Ed Doe, vice president and general manager, Switch and Fabric Group at Intel. "Joining the Linux Foundation will help SONiC to flourish, and in turn create greater benefit for cloud service providers, network operators and enterprises to create customized network solutions and transform data-intensive workloads from data center to the edge."

NVIDIA

"This is an important milestone for SONiC and the community behind it," said Amit Katz, vice president of Ethernet Switches at NVIDIA. "NVIDIA is committed to supporting the community version of SONiC that is 100 percent open source, enabling data center operators to control the code inside their cloud fabrics, accelerated by state-of-the-art platforms with SONiC support, such as NVIDIA's Spectrum family of switches."

Open Compute Project

"The Open Compute Project Foundation is pleased to continue its collaboration with SONIC as part of the OCP's new hardware software co-design strategy. The open source SONiC Network Operating System is enabling rapid innovation across the network ecosystem, and it began with the definition of the Switch Abstraction Interface (SAI) at OCP. Hardware software co-design focuses on software that requires intimate knowledge of the hardware to drive maximum hardware performance, and speed time-to-market for hardware where system performance and ecological footprint can be highly dependent on software and hardware interactions," said George Tchaparian, CEO Open Compute Project Foundation.

About the Linux Foundation

The Linux Foundation is the organization of choice for the world's top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at http://www.linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact
Jill Lovato
The Linux Foundation
[emailprotected]

SOURCE The Linux Foundation


Google Distributed Cloud: Who Will It Benefit? – ITPro Today

Posted: at 12:47 pm

Wouldn't it be great if you could take a public cloud platform like Google Cloud and deploy its services in your own data center, or even on edge devices?

Well, you can, using Google Distributed Cloud, one of the newest offerings in Google's cloud services portfolio.

Related: Will Hybrid Cloud Save You Money? Predict Your Hybrid Cloud Costs

Here's how Google Distributed Cloud works, which use cases it targets, and why you may (or may not) want to use it as part of an edge or hybrid cloud strategy.

What Is Google Distributed Cloud?

Related: The Pros and Cons of Kubernetes-Based Hybrid Cloud

Google Distributed Cloud is a suite of cloud products from Google designed primarily to support the deployment of public cloud services at the "edge."

Thus, Google Distributed Cloud isn't a specific platform or service as much as it's a set of various tools and services, which you can use in a variety of ways.

Most of the services built into Google Distributed Cloud come from Google's standard public cloud platform (such as Anthos), so there's nothing really brand-new about Google Distributed Cloud from a technical perspective. It's mostly the way that Google is packaging the services to enable edge and hybrid cloud use cases that makes Google Distributed Cloud unique.

Google announced its Distributed Cloud portfolio in October 2021. The offering currently remains in preview mode, and there have been few real-world deployments of the platform to date. A deployment by Bell Canada, announced in February 2022, is one of the first.

Google Distributed Cloud works by allowing users to extend public cloud services that are hosted on Google Cloud Platform to private servers, internet of things (IoT) devices, or other infrastructure. In other words, you can use Distributed Cloud to manage infrastructure that you own, as opposed to infrastructure owned by a cloud provider like Google, using many of the same tools and services that Google makes available to its public cloud customers.

In this way, Distributed Cloud is similar to offerings such as AWS Outposts and Azure Arc, which also extend public cloud functionality into private infrastructure.

Currently, Google Distributed Cloud is designed to run on four types of infrastructure setups:

This means Google Distributed Cloud can operate on basically any infrastructure, including conventional data centers and less orthodox environments like networks of IoT devices.

If Google Distributed Cloud sounds like a hybrid cloud platform, it's because it basically is. Extending public cloud services like those of Google Cloud into private infrastructure would certainly qualify as a hybrid cloud deployment by most definitions.

Notably, however, Google is not calling the platform a hybrid cloud solution. Instead, Google is using language that centers on "edge" and "distributed" infrastructure.

That's probably because Google already markets Anthos (which, again, is integrated into the Distributed Cloud portfolio but which exists as a stand-alone product, too) as its main hybrid cloud solution.

Since Distributed Cloud is also based in part on Anthos, you could argue that the main difference between Distributed Cloud and a hybrid cloud platform is marketing and branding, not technology. And indeed, to a significant extent, Distributed Cloud seems to be a reflection of Google's efforts to position itself as a leader in the edge computing market above all else.

It's understandable why Google would choose to brand Distributed Cloud as something different from a hybrid cloud platform, even if it's technically really not that different. With Distributed Cloud, Google is in a stronger position to cater to use cases like running network functions on telco infrastructure or managing edge IoT devices: deployments that aren't usually the focus of conventional hybrid cloud platforms.

Ultimately, Google Distributed Cloud is likely to become a product that is very important in certain narrow niches, but that most companies won't use.

In verticals such as telco, or for businesses with large IoT infrastructures, Google Distributed Cloud offers an easy way of managing large, distributed networks of devices. It's not the only solution of its kind; you could also manage distributed infrastructures using most Kubernetes distributions, for example, or via proprietary services like Azure IoT Edge. But the fact that Google Distributed Cloud is based on Google Cloud services will give it an advantage among customers who are already invested in the Google Cloud ecosystem.

That said, companies that just want to run a conventional hybrid cloud, meaning one that extends public cloud services to private servers or data centers without edge infrastructure in the mix, aren't likely to benefit from Google Distributed Cloud. They should choose a more traditional hybrid cloud platform, like Anthos or a similar offering from a different public cloud provider.


Unveiling the Potential Relationship between IoT and Cloud Computing – IoT For All

Posted: April 9, 2022 at 4:05 am

Today, if we look around, we find that IoT, the Internet of Things, is disrupting our daily lives, both at home and in the workplace. It has been 20 years since this concept entered the tech world. Since then, it has offered excellent solutions that have made everything seamless and better.

From Fitbits to the Amazon Echo or Google Home, most people today are using connected smart devices and wearables to monitor their health, including heart rate, calories, and daily activities. Some use them to manage their heating, lighting, and home security, and to re-order household staples when supplies run low.

The digital changes occurring everywhere, at home or business, in hospitals, buildings, or the entire town, show that IoT is growing at a breakneck speed.

According to research conducted by Juniper Research, the number of IoT connections is set to increase from 35 billion in 2020 to 83 billion by 2024, a rise of around 130 percent over four years, as enterprise IoT users extend their IoT ecosystems to improve operational efficiency and generate real-time insights.

The pandemic has proved that digitization in every sector has become necessary. Companies have to double down on digital transformation projects, and technologies like IoT are the only way to make that possible. Embracing these technologies is the only way to improve customer service, automate processes and tasks, track assets, detect existing loopholes, and re-invent and renovate existing business models. But the success of IoT is not possible without cloud computing. The cloud offers much more than just the connectivity on which IoT devices depend: IoT devices rely on the cloud to store essential data in one central location where it can be easily accessed, managed, and distributed in real time.

When it comes to the relationship between IoT and cloud computing, here are four significant benefits that can compel organizations to use clouds to unleash the full potential of their IoT devices.

The cloud can assist organizations in overcoming the technical and cost hurdles that come with deploying an IoT solution.

The cloud eliminates the requirement to set up physical servers, deploy databases, configure networks, manage connections, or do other infrastructure tasks. It makes it speedy and easy to spin up virtual servers, launch databases, and generate the needed data pipelines to operate an IoT solution.

On the other hand, on-premises IoT network infrastructure needs a great deal of hardware and time-consuming configuration to make sure things run properly; implementing a cloud-powered IoT system is significantly more streamlined.

For instance, scaling up the number of IoT-enabled devices just requires leasing another virtual server or more cloud space.

In the same way, cloud services can easily streamline remote device lifecycle management, ensuring delivery of a 360-degree view of the device infrastructure and tools that automate the update and setup of software and firmware over the air.

IoT devices are valuable to both consumers and enterprises because of the information they gather, but they become even more helpful when they communicate with each other.

For instance, a connected thermostat can communicate with a smart refrigerator to increase or decrease its temperature, and a connected micro-controller can analyze sensor readings to predict when preventative maintenance is needed, reducing the chance of damage.

The cloud helps in this operation by streamlining and optimizing machine-to-machine communications and facilitating this across interfaces. With the increased interactions between many connected devices and immense volumes of data generated, organizations will have to find a cost-effective way to store, process, and access data from their IoT solutions.

In addition, they also need to be able to scale up to manage peaks in demand, or to extend the infrastructure to handle extra functionality whenever they add more features to their IoT solution.

An IoT solution generates immense amounts of data, so built-in management tools and processing capabilities that support transferring data between devices effectively and efficiently make the whole process easier and more convenient. The cloud also offers a hosting platform for Big Data and data analytics at a significantly lower cost.

Data generated by IoT devices can be stored and processed in a cloud server and easily accessible at any time from any place without any infrastructure or networking issues. In the same way, data can be collected remotely and in real-time from devices located anywhere and in any time zone.
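To make this concrete, here is a minimal sketch of the kind of payload a remote device might ship to a cloud ingestion endpoint. The device ID, field names, and workflow are illustrative assumptions, not any particular provider's API.

```python
import json
import time

def build_telemetry(device_id: str, temperature_c: float) -> bytes:
    """Serialize one sensor reading as JSON, ready to POST to a cloud endpoint."""
    reading = {
        "device_id": device_id,          # hypothetical identifier
        "temperature_c": temperature_c,
        "timestamp": int(time.time()),   # seconds since epoch, for real-time queries
    }
    return json.dumps(reading).encode("utf-8")

payload = build_telemetry("thermostat-01", 21.5)
# An HTTP POST (e.g. via urllib.request) to the provider's ingestion URL would
# carry this payload; the cloud side stores it centrally so it can be queried
# from anywhere, in any time zone.
```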

Sometimes, interoperability hampers the ability of enterprises to link or integrate data generated by IoT devices to other data resources.

Adding the cloud can help link applications and seamlessly integrate all the data sources so they can be analyzed, regardless of origin.

The cloud can also help an organization streamline the integration of its IoT solution with smart products developed by third parties, generating additional value for users.

Security has been a much-discussed concern, as security lapses and failures to update IoT devices have created a gateway for cybercriminals. Cloud platforms can support enterprises in strengthening security in two ways.

Firstly, as we already know, cloud providers can make it simple to undertake regular software and firmware updates, signed with digital certificates that ensure users these updates are safe and authorized.
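As a rough illustration of the verify-before-install idea behind signed updates (real update services use X.509 certificates and asymmetric signatures rather than a shared HMAC key, so treat this standard-library sketch purely as a toy model):

```python
import hashlib
import hmac
import secrets

signing_key = secrets.token_bytes(32)   # held by the update service (assumed shared key)
firmware = b"firmware-image-v2.1"       # hypothetical update image

# The service publishes the image together with an authentication tag.
tag = hmac.new(signing_key, firmware, hashlib.sha256).hexdigest()

def is_authentic(image: bytes, received_tag: str, key: bytes) -> bool:
    """Recompute the tag and compare in constant time before installing."""
    expected = hmac.new(key, image, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

# An untampered image verifies; a modified one does not.
assert is_authentic(firmware, tag, signing_key)
assert not is_authentic(firmware + b"-evil", tag, signing_key)
```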

Secondly, cloud platforms help initiate customized client-side and server-side encryption that protects data flowing through the IoT ecosystem, as well as data at rest in the database. Many cloud service providers also offer 24/7 monitoring to minimize the risk of a security breach.

Many organizations are embracing IoT technologies, and those still reliant on traditional infrastructure will find themselves at a loss.

By adopting the cloud, a powerhouse for IoT device communication and storage, organizations will experience better connectivity rates and improved ROI.

Embracing a hybrid cloud approach lets IT teams establish the right mix of hosting options, allowing them to manage rapid rollout and enablement while getting the most out of IoT devices and securing a better IoT strategy, without investing much time, money, and effort in developing costly infrastructure.

For organizations that have decided to extend their IoT ambitions, the cloud can help them develop IoT products faster, manage and handle all the generated data, secure the IoT ecosystem, and integrate better with existing systems and other IoT devices. In short, cloud computing will be the key to unlocking a faster time to market, greater flexibility, and lasting value from a successful, profitable IoT deployment.


Unveiling the Potential Relationship between IoT and Cloud Computing - IoT For All


Cloud Computing Now IT’s Default Tech Are You in the Game? – Channel Futures

Posted: at 4:05 am

This week's cloud news roundup also features takeaways from Prosimo, Google Cloud and NetApp.

Cloud computing is turning into the most in-demand technology, in all its different facets. Think SaaS, PaaS, IaaS, everything as a service. New research from Foundry (formerly IDG Communications) highlights the momentum as organizations adopt more cloud services and infrastructure. Partners will want to take note of the end-user demand, given the channel's growing involvement in guiding, deploying and managing customers' cloud environments. (We'll be talking about cloud computing opportunities for smaller partners, April 11-14, at the Channel Partners Conference & Expo.)

Along similar lines, Prosimo, a vendor built by Viptela's cofounders, has a new cloud computing platform partners will want to investigate. It's geared toward resellers and managed services providers.

Next up, you probably know that Google Cloud is a founding member of the new Data Cloud Alliance. We covered that a couple of days ago, but here we offer a few more details. Channel partners should benefit, albeit in a trickle-down kind of way.

Finally, NetApp, which bought CloudCheckr, just announced another acquisition. Find out which cloud computing company it's buying to add to its Spot by NetApp portfolio (where CloudCheckr now resides).

It's all in the slideshow above.


Software-as-a-Service Rules the Cloud – DARKReading

Posted: at 4:05 am

The majority of respondents in a new survey on cloud adoption say they use software-as-a-service (SaaS). Eight in 10 (80%) IT professionals at organizations that employ cloud computing have implemented SaaS.

That's according to "State of the Cloud: A Security Perspective," a report from Dark Reading released in March. The research surveyed decision-makers with IT job titles at organizations that use cloud services. Those organizations include companies of all sizes from a variety of industries, including technology, healthcare, financial services, manufacturing, and education.

The overall picture from the survey is one of a robust remote workforce environment. While SaaS was by far the most widely embraced cloud application, almost half of respondents also reported using infrastructure-as-a-service (49%) and platform-as-a-service (47%). Other everyday work solutions were rarer, including desktop-as-a-service (19%) and containers-as-a-service (14%).

One of the obstacles to implementing cloud computing more widely is concerns about security, the main focus of this report. Many respondents say they already use cloud tools to secure their data and networks. One-quarter (25%) of respondents use disaster-recovery-as-a-service to help bounce back from cyberattacks and natural disasters. Almost one in five (18%) implement security-as-a-service. Only 10% of cloud users run secure access service edge (SASE) defense, however, which means that area is still wide open for competition.

For more data and insights, download the full report.



Cloud computing spending is growing again and there’s more to come – ZDNet

Posted: at 4:05 am

Businesses around the world spent $21.1 billion on cloud infrastructure services in the fourth quarter (Q4) of 2021, signaling a rebound in spending on cloud storage and compute power.

Spending on cloud infrastructure rose 13.5% year on year in the quarter, according to tech research firm IDC.

The previous quarter saw spending on cloud infrastructure reach $18.6 billion after a remarkable year-on-year decline of 1.9% in Q2 2021, which was the first time in seven quarters that spending on cloud decreased.

Cloud spending rose as businesses and governments across the world embarked on major digital transformation projects over the last two years. The big winners are the big three cloud players: Amazon Web Services (AWS), Google Cloud, and Microsoft with Azure and its other cloud services, like Office 365.

"This marked the second consecutive quarter of year-over-year growth as supply chain constraints have depleted vendor inventories over the past several quarters. As backlogs continue to grow, pent-up demand bodes well for future growth as long as the economy stays healthy, and supply catches up to demand," IDC said of Q4 2021.

Over the whole of 2021, spending on cloud was 8.8% higher than in 2020, reaching a total of $73.9 billion for the year.

Enterprise spending on traditional IT grew too, but not as fast as cloud spending. Enterprises invested in non-cloud infrastructure to the tune of $17.2 billion, up 1.5% year on year in Q4 2021, according to IDC. It marks the fourth consecutive quarter of growth in traditional IT spending, which rose 4.2% over 2020 to $59.6 billion for the year.

Cloud giants are trying to gain an edge on each other in all sorts of ways. Last week a Google Cloud survey argued that government workers were worried that reliance on Microsoft's products was undermining the government's cybersecurity; Microsoft didn't agree.

IDC forecasts that firms will spend $90.0 billion on cloud infrastructure in 2022, up 21.7% compared to 2021. The biggest loser is traditional IT spending by organizations that maintain their own infrastructure: CIO spending on non-cloud infrastructure is forecast to decline 0.3% to $59.4 billion.

However, IDC reckons that spending on shared cloud infrastructure will grow 25.5% year over year to $64.5 billion for 2022, while spending on dedicated cloud infrastructure is expected to grow 13.1% to $25.4 billion in 2022.
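As a quick sanity check, the figures quoted above hang together (values taken from the article; the small gaps come from rounding):

```python
total_2021 = 73.9      # $B, full-year 2021 cloud infrastructure spending
total_2022 = 90.0      # $B, IDC's 2022 forecast
shared_2022 = 64.5     # $B, shared cloud infrastructure forecast
dedicated_2022 = 25.4  # $B, dedicated cloud infrastructure forecast

growth_pct = (total_2022 - total_2021) / total_2021 * 100
print(f"Implied 2022 growth: {growth_pct:.1f}%")   # ~21.8%, matching IDC's 21.7% after rounding

# Shared plus dedicated roughly reconstructs the headline total.
print(f"Shared + dedicated: {shared_2022 + dedicated_2022:.1f}")  # 89.9, close to the $90.0B total
```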


Importance of cloud computing highlighted at investment summit – Gulf Today

Posted: at 4:05 am

Top dignitaries and participants during the investment summit in Riyadh.

Gulf Today Report

Sir Anthony Ritossa's 18th Global Family Office Investment Summit was held recently in Riyadh in the presence of Prince Abdulaziz Bin Faisal Bin Abdulmajeed Al Saud.

The event hosted more than 400 world leaders who share similar values regarding the importance of tailoring solutions to the world's greatest challenges and making an impact through investment, positive action and personal commitment.

Royal families, global business leaders, leading entrepreneurs, and private investors were guests at the event. The previous summits have raised $2.8 billion in investments for leading global start-ups, entrepreneurs, funds, and philanthropic endeavors.

As a VIP guest at the event, Ethernity CLOUD presented the latest developments in its network for private, confidential data processing.

In light of today's focus on Web 3.0, the applicability of Ethernity CLOUD is limitless, and long-term use cases include healthcare, military, and government applications. The company's technology is decentralised, confidential, anonymous, and highly available, which makes it an ideal solution for the fast-growing confidential computing market, estimated to reach $52 billion in the next three years, according to Everest Group market research.

"Our processes are transparent, and the data is confidential, whether in storage, in transit, or during processing. With data being at the core of the digital world and privacy demand on the rise, Ethernity CLOUD provides a valuable network where data can be processed in a private and confidential manner," said Iosif Peterfi, Founder and CEO of Ethernity CLOUD.

Because Ethernity CLOUD uses blockchain technology, the activity on the network is fully transparent and it takes place in a trustless environment, preventing foul play. On top of everything else, because the network is comprised of independent CPUs (nodes), downtime is virtually reduced to zero. Essentially, Ethernity CLOUD provides a way for business entities to run dApps in a private and confidential manner, while also providing a passive income source for node operators.

Importantly, the company is also environmentally oriented and takes proper measures to offset carbon emissions. Through a partnership with Green Ant, a project which plants trees in Thailand and mints an NFT for each tree that they plant, Ethernity CLOUD is committed to becoming carbon negative.

Iosif Peterfi speaks during the summit.

"As a former security guy, having worked with the US Department of Defense, I recognise that blockchain is the building block that answers a lot of our current issues. At Ethernity CLOUD, data privacy is considered a human right. That is our mission. Blockchain opened the door to make our project feasible; blockchain for us is about decentralisation," said Peterfi.

"Ethernity CLOUD is revolutionising the cloud computing industry with its fair ecosystem where everyone's right to privacy is fully protected, and integrity is ensured by the blockchain itself. Data is protected from abusive cloud providers' activities, ensuring fair, decentralised and truly private operations. We are honoured to have had the team join us in Riyadh," said Sir Anthony Ritossa, Chairman of Ritossa Family Office.


Google Finally Gets The Edge Computing Strategy Right With Distributed Cloud Edge – Forbes

Posted: at 4:05 am

Announced at the Google Cloud Next 21 conference, Google Distributed Cloud (GDC) plays a critical role in the success of Anthos by making it relevant to telecom operators and enterprise customers. Google Distributed Cloud Edge, a part of GDC, aims to make Anthos the foundation for running 5G infrastructure and modern workloads such as AI and analytics.

Recently, Google announced the general availability of GDC Edge by sharing the details of the hardware configuration and the requirements.


In its initial form, GDC Edge comes in two form factors: a rack-based configuration and the GDC Edge appliance. Let's take a closer look at these choices.

This configuration targets telecom operators and communication service providers (CSP) for running 5G core and radio access networks (RAN). The CSPs can expose the same infrastructure to their end customers for running workloads like AI inference that need ultra-low latency.

The location where the rack-based hardware runs is designated as a Distributed Cloud Edge Zone. Each zone runs on dedicated hardware that Google provides, deploys, operates, and maintains. The hardware consists of six servers, and two top-of-rack (ToR) switches connecting the servers to the local network. In terms of storage, each physical server comes with 4TiB disks. The gross weight of a typical rack is 900lbs or 408kg. The Distributed Cloud Edge rack arrives pre-configured with the hardware, network, and Google Cloud settings specified when it was ordered.

Once a DCE zone is fully configured, customers can group one or more servers from the rack to create a NodePool. Each node of the NodePool acts as a Kubernetes worker node connected to the Kubernetes control plane running in the nearest Google Cloud region.

This distributed topology gives Google the flexibility to upgrade, patch, and manage the Kubernetes infrastructure with minimal disruption to customer workloads. It allows DCE to benefit from a secure and highly available control plane without taking up the processing capacity on the nodes.

Google took a unique approach to edge computing by moving the worker nodes to the edge while running the control plane in the cloud. This is very similar to how Google manages GKE, except that the worker nodes are a part of the NodePool deployed at the edge.

The clusters running on DCE may be connected to the Anthos management plane to gain better control over deployments and configuration.

A secure VPN tunnel connects the local Distributed Cloud Edge infrastructure to a virtual private cloud (VPC) configured within Google Cloud. Workloads running at the edge can access Google Compute Engine resources deployed in the same VPC.

The rack-based configuration demands connectivity to the Google Cloud at all times. Since it runs in a controlled environment in a CSP facility, meeting this requirement is not a challenge.

Once the clusters are provisioned on the DCE infrastructure, they can be treated like other Kubernetes clusters. It is also possible to provision and run virtual machines based on KubeVirt within the same environment.
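For illustration, once such a cluster is reachable, a workload could be pinned to the edge worker nodes with an ordinary Kubernetes manifest. The node label and image below are hypothetical placeholders, not Google's actual labels:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-inference
spec:
  replicas: 2
  selector:
    matchLabels:
      app: ai-inference
  template:
    metadata:
      labels:
        app: ai-inference
    spec:
      # Hypothetical label; check the labels actually set on your DCE node pool.
      nodeSelector:
        example.com/tier: edge
      containers:
        - name: inference
          image: registry.example.com/inference:latest
```

Because the control plane lives in the nearest Google Cloud region, applying a manifest like this with kubectl works much as it would against any GKE cluster.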

CSPs from the United States, Canada, France, Germany, Italy, Netherlands, Spain, Finland, and the United Kingdom can order rack-based infrastructure from Google.

The GDC Edge Appliance is a Google Cloud-managed, secure, high-performance appliance for edge locations. It provides local storage, ML inference, data transformation, and export functionality.

According to Google, GDC Edge Appliances are ideal for use cases where bandwidth and latency limitations prevent organizations from processing the data from devices like cameras and sensors back in cloud data centers. These appliances simplify data collection, analytics, and processing at remote locations where copious amounts of data coming from these devices need to be processed quickly and stored securely.

The Edge Appliance targets enterprises from the manufacturing, supply chain, healthcare, and automotive verticals with low-latency and high throughput requirements.

GDC Edge Appliance

Each appliance comes with a 16-core CPU, 64GB of RAM, an NVIDIA T4 GPU, and 3.6TB of usable storage. It has a pair of 10 Gigabit and 1 Gigabit Ethernet ports, and with its 1U rack-mount form factor, it supports both horizontal and vertical orientation.

The Edge Appliance is essentially a storage transfer device that can also run a Kubernetes cluster and AI inference workloads. With ample storage capacity, customers can use it as a cloud storage gateway.

For all practical purposes, the Edge Appliance is a managed device running Anthos clusters on bare metal. Customers follow the same workflow as installing and configuring Anthos in bare metal environments.

Unlike the rack-based configuration, the clusters run both the control plane and the worker nodes locally on the appliance. But, they are registered with the Anthos management plane running in the nearest Google Cloud region. This configuration makes it possible to run the edge appliance in an offline, air-gapped environment with intermittent connectivity to the cloud.

Analysis and Takeaways

With Anthos and GDC, Google defined a comprehensive multicloud, hybrid, and edge computing strategy. GDC Edge targets CSPs and enterprises through purpose-built hardware offerings.

Telecom operators need a reliable and modern platform to run 5G infrastructure. Google is positioning Anthos as the cloud native, reliable platform for running the containerized network functions (CNFs) required for 5G Core and Radio Access Networks (RAN). By delivering a combination of managed hardware (rack-based GDC Edge) and software (Anthos), Google wants to enable CSPs to offer 5G Multi-Access Edge Computing (MEC) to enterprises. It has partnered with AT&T, Reliance Jio, TELUS, Indosat Ooredoo, and more recently with Bell Canada and Verizon to run 5G infrastructure.

Googles approach is different from Amazon and Microsoft for delivering 5G MEC. Both AWS and Azure have 5G-based zones that act as extensions to their data center footprint. AWS Wavelength and Azure Private MEC enable customers to run workloads in the nearest edge location, managed by a CSP. Both Amazon and Microsoft are partnering with telecom providers such as AT&T, Verizon and Vodafone to offer hyperlocal edge zones.

Google is betting big on Anthos as the fabric to run 5G MEC. It's partnering with leading telcos worldwide to help them build 5G infrastructure on its proven cloud native stack based on Anthos. Though Google may have a competing offering for AWS Wavelength and Azure Private MEC in the future, its current strategy is to push GDC Edge as the preferred 5G MEC platform. This approach puts the CSP at the front and center of its edge computing strategy.

Google has finally responded to Azure Stack HCI and AWS Outposts with the GDC Edge Appliance. It's targeting enterprises that need a modern, cloud native platform to run data-driven, compute-intensive workloads at the edge. Unlike the rack-based configuration, the edge appliance may be deployed in remote locations with intermittent connectivity.

With Anthos as the cornerstone, Google's Distributed Cloud strategy looks promising. It is aiming to win the enterprise edge as well as the telco edge with purpose-built hardware offerings. Google finally has a viable competitor for AWS Wavelength, AWS Outposts, Azure Edge Zones, and Azure Stack.

