
Category Archives: Cloud Computing

Year-in-Review: 2023 Was a Turning Point for Microservices – The New Stack

Posted: December 25, 2023 at 6:33 am

Maybe we are doing microservices all wrong?

This was the main thesis of Towards Modern Development of Cloud Applications (PDF), a paper from a bunch of Googlers (led by Google software engineer Michael Whittaker) that was presented in June at HOTOS 23: Proceedings of the 19th Workshop on Hot Topics in Operating Systems.

The problem, as Whittaker et al pointed out, was that microservices largely have not been set up correctly, architecturally speaking. They conflate logical boundaries (how code is written) with physical boundaries (how code is deployed). And this is where the issues start.

Instead, the Google engineers suggested another approach. Build the applications as logical monoliths but hand them off to automated runtimes, which makes decisions on where to run workloads, based on what is needed by the applications and what is available.
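The "logical monolith, automated runtime" idea can be sketched in a few lines. This is an illustrative toy, not the paper's actual framework — the component/runtime names and the placement policy below are invented — but it shows the separation the Googlers argue for: code is written against logical boundaries, and a runtime decides the physical ones.

```python
# Illustrative sketch (invented API): components are plain modules with
# logical interfaces; a runtime decides at deploy time whether a call
# stays in-process or crosses the network.

class Component:
    """A unit of logic with no baked-in assumptions about where it runs."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

class Runtime:
    """Hypothetical runtime: picks a placement per component based on a
    load hint, then routes calls. Here both paths execute locally."""
    def __init__(self):
        self.components = {}
        self.placement = {}

    def register(self, component, expected_qps=0):
        self.components[component.name] = component
        # Toy placement policy: busy components get their own process/node.
        self.placement[component.name] = "remote" if expected_qps > 1000 else "local"

    def call(self, name, *args):
        # In a real system, a "remote" placement would serialize the call
        # over RPC; the application code is identical either way.
        return self.components[name].fn(*args)

rt = Runtime()
rt.register(Component("resize", lambda img: f"resized:{img}"), expected_qps=50)
rt.register(Component("encode", lambda img: f"encoded:{img}"), expected_qps=5000)

print(rt.placement["resize"])   # local
print(rt.placement["encode"])   # remote
print(rt.call("encode", rt.call("resize", "frame1")))
```

The point is that deployment becomes a policy decision the runtime can revisit, not something hard-coded into the application.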

With this approach, they were able to lower latency by up to 15x and cost by up to 9x.

"If people would just start with organized modular code, we can make the deployment architecture an implementation detail," Kelsey Hightower commented on this work in October.

A few months earlier, the engineering team at Amazon Prime Video posted a blog post explaining that, at least in the case of video monitoring, a monolithic architecture had produced superior performance to a microservices- and serverless-led approach.

In fact, Amazon saved 90% in operational costs by moving off a microservices architecture.

For a generation of engineers and architects raised on the superiority of microservices, the assertion is shocking indeed.

"This post is an absolute embarrassment for Amazon as a company. Complete inability to build internal alignment or coordinated communications," wrote analyst Donnie Berkholz, who recently started his own industry-analyst firm, Platify.

"What makes this story unique is that Amazon was the original poster child for service-oriented architectures," weighed in Ruby-on-Rails creator and Basecamp co-founder David Heinemeier Hansson. "Now the real-world results of all this theory are finally in, and it's clear that in practice, microservices pose perhaps the biggest siren song for needlessly complicating your system. And serverless only makes it worse."

The original Amazon video delivery system.

The task of Amazon engineers was to monitor the thousands of video streams that Prime delivered to customers. Originally, this work was done by a set of distributed components orchestrated by AWS Step Functions, a serverless orchestration service, along with AWS Lambda, a serverless compute service.

In theory, the use of serverless would allow the team to scale each service independently. It turned out, however, that at least for how the team implemented the components, they hit a hard scaling limit at only 5% of the expected load. The costs of scaling up to monitor thousands of video streams would also be unduly expensive, due to the need to send data across multiple components.

Initially, the team tried to optimize individual components, but this did not bring about significant improvements. So, the team moved all the components into a single process, hosting them on Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Elastic Container Service (Amazon ECS).
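As a hedged sketch of that shift (the stage names and logic here are invented for illustration, not Amazon's actual components), here are three monitoring stages wired two ways: first as separately invoked steps that hand data off, the way an orchestrated serverless workflow does, then fused into one in-process pipeline.

```python
# Toy monitoring stages; the data handed between them is what gets
# expensive when every hop crosses the network.

def extract_frames(stream):
    return [f"{stream}-frame{i}" for i in range(3)]

def detect_defects(frames):
    return [f for f in frames if f.endswith("frame1")]

def publish_alerts(defects):
    return len(defects)

# Orchestrated version: each step is a separate invocation, and its
# input/output must be serialized and shipped between components.
def orchestrated(stream, invoke):
    frames = invoke(extract_frames, stream)
    defects = invoke(detect_defects, frames)
    return invoke(publish_alerts, defects)

# Monolithic version: same logic, one process, no inter-step transfer.
def monolithic(stream):
    return publish_alerts(detect_defects(extract_frames(stream)))

calls = []
def counting_invoke(fn, arg):
    calls.append(fn.__name__)   # stand-in for one network hop per step
    return fn(arg)

assert orchestrated("s1", counting_invoke) == monolithic("s1") == 1
print(len(calls))  # 3 cross-component hops the monolith avoids
```

The logic is identical either way; only the physical boundaries, and therefore the data-transfer costs, change.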

"Microservices and serverless components are tools that do work at high scale, but whether to use them over monolith has to be made on a case-by-case basis," the Amazon team concluded.

Arguably, the term microservices was coined by Peter Rodgers in 2005, though he called it "micro web services." He gave a name to an idea that many were already thinking about, especially as web services and service-oriented architecture (SOA) gained traction at the time.

"The main driver behind micro web services at the time was to break up single large monolithic designs into multiple independent components/processes, thereby making the codebase more granular and manageable," explained software engineer Amanda Bennett in a blog post.

The concept took hold over the following decades, especially with cloud native computing, and has only recently started receiving criticism in some quarters.

Software engineer Alexander Kainz contributed to TNS a great comparison of monoliths and microservices.

In their paper, the Google engineers list a number of shortcomings of the microservices approach.

When The New Stack first covered the Amazon news, many quickly pointed out to us that the architecture the video folks used was not exactly a monolithic architecture either.

"This definitely isn't a microservices-to-monolith story," remarked Adrian Cockcroft, the former vice president of cloud architecture strategy at AWS, now an advisor for Nubank, in an interview with The New Stack. "It's a Step Functions-to-microservices story. And I think one of the problems is the wrong labeling."

He pointed out that in many applications, especially internal applications, the cost of development exceeds the runtime costs. In these cases, Step Functions make a lot of sense to save dev time, but can become costly for heavy workloads.

"If you know you're going to eventually do it at some scale," said Cockcroft, "you may build it differently in the first place. So the question is, do you know how to do the thing, and do you know the scale you're going to run it at?"

The Google paper tackles this issue by making life easier for developers while letting the runtime infrastructure figure out the most cost-effective way to run these applications.

"By delegating all execution responsibilities to the runtime, our solution is able to provide the same benefits as microservices but with much higher performance and reduced costs," the Google researchers wrote.

This year has seen a lot of basic architectural reconsideration, and microservices are not the only ideal being questioned.

Cloud computing, for instance, has also come under scrutiny.

In June, 37signals, which runs both Basecamp and the Hey email application, procured a fleet of Dell servers and left the cloud, bucking a decade-long tradition of moving operations off-premises for vaguely defined greater efficiencies.

"This is the central deceit of the cloud marketing, that it's all going to be so much easier that you hardly need anyone to operate it," David Heinemeier Hansson explained in a blog post. "I've never seen it. Not at 37signals, not from anyone else running large internet applications. The cloud has some advantages, but it's typically not in a reduced operations headcount."

Of course, DHH is a race car driver, so naturally he wants to dig into the bare metal. But there are others willing to back this bet. Later this year, Oxide Computers launched their new systems hoping to serve others with a similar sentiment: running cloud computing workloads, but more cost-effectively in their own data centers.

And this sentiment seems to be getting at least more consideration now that the cloud bills are coming due. FinOps became a noticeable thing in 2023, as more organizations turned to companies like Kubecost to control their cloud spend. And how many people were taken aback by the news that a Datadog customer received a $65 million bill for cloud monitoring?

Arguably, a $65 million observability bill might be worth it for an outfit that generates billions in revenue. But as chief architects take a harder look at engineering decisions made in the last decade, they may decide to make a few adjustments. And microservices will not be an exception.

TNS cloud native correspondent Scott M. Fulton III contributed to this report.



AI and Cloud: The Proving Ground for Regulatory Resilience in 2024 – Finextra

Posted: at 6:33 am

The Cloud Adoption Imperative

The current macroeconomic landscape is marked by exceptional volatility and uncertainty, posing challenges to traditional models in the financial services sector. This climate demands greater flexibility and responsiveness from established players to meet evolving customer needs, while managing costs and risks effectively.

Cloud infrastructure and artificial intelligence (AI) are key technological drivers enabling financial institutions to adapt to this changing market. Cloud computing, in particular, has emerged as a fundamental component of the financial system. It supports the necessary transformations banks and other financial bodies must undertake.

The adoption of cloud computing aids firms in streamlining processes, minimising risks, and enhancing efficiency. It also improves the capacity to identify new business opportunities and revenue sources. A central aspect of its impact lies in offering more tailored customer propositions, better pricing, and conducting operations that are safer and less risky.

Furthermore, cloud computing is essential for maintaining competitiveness in the new financial landscape. It empowers financial institutions to be more agile, resilient, and efficient, aligning with regulatory standards, especially in terms of outsourcing. By embracing cloud technology, financial institutions can revolutionise their business models, fostering new competencies, increasing efficiency, and delivering greater value to customers.

Download this Finextra impact study, produced in association with Microsoft Azure, to learn more.


Cognata Redefines Sensor Suite Selection Processes Through Digital Twin-based Sensor Simulation and Cloud … – PR Newswire

Posted: at 6:32 am

Cognata has been chosen by Microsoft to drive the ADPH global program as a global leader, for its photorealistic sensor simulation and unique sensor models such as thermal cameras and 4D LiDARs.

Key Highlights

REHOVOT, Israel, Dec. 21, 2023 /PRNewswire/ -- Cognata proudly announces its collaboration with Microsoft to drive the Automated Driving Perception Hub (ADPH) global program, running on Microsoft Azure, and AMD EPYC processors and Radeon GPUs, to allow Automotive customers to virtually and efficiently evaluate ADAS/AV sensors through digital twin-based sensor simulation. Cognata's Automated Driving Perception Hub allows sensors to be evaluated versus a common set of industry-standard scenarios, and their performance is quickly and easily analyzed.

Sensor selection is pivotal in steering the automotive industry toward reliable and safe autonomous vehicles and ADAS systems. Cognata's ADPH platform incorporates highly accurate, manufacturer-approved sensor modeling across a wide spectrum of sensors, such as RGB cameras with varying lens distortions, point-cloud (LiDAR) systems, and thermal (IR) cameras, all integrated with a DNN-based photorealistic layer, ensuring sensor performance precision.

With Microsoft's support, Cognata is driving digital transformation on Azure's global cloud, leveraging its services and computing capabilities to accelerate ADAS/AV development, verification, and validation. Cognata's digital twin-based simulation requires powerful computing and graphics resources to run and scale. These advanced workloads and features are accelerated by AMD high-performance CPU and GPU technologies, enabling streamlined execution.

"We have joined forces to advance the automotive industry by bringing digital twin-based simulation into Microsoft Azure. The ADPH is a milestone achievement in our collaboration, showcasing our commitment to delivering cutting-edge technology," said Danny Atsmon, CEO and founder of Cognata, highlighting the significance of the project. "This platform seamlessly integrates manufacturer-approved sensor models and a sophisticated simulation environment, solving the challenge of optimizing sensor selection in a fully scalable manner."

"AMD is excited to deliver transformative innovation to the autonomous driving market in collaboration with Cognata and Microsoft," said Jeff Connell, corporate vice president and general manager, Strategic Silicon Solutions, AMD. "Our innovative portfolio of highly performant computing solutions, including AMD Radeon PRO V620 GPUs and AMD EPYC processors, provides an ideal combination of capabilities to power the critical workloads that enable autonomous driving technologies."

"We are pleased to collaborate with AMD for their high-performance CPU and GPU IP and Cognata for their photorealistic sensor simulation capabilities to provide a platform that will allow engineers to accurately evaluate real-world sensor performance on the cloud early on in the design process," said Dominik Wee, Corporate Vice President, Manufacturing & Mobility at Microsoft. "By front-loading upstream design and moving sensor evaluation from the physical world to the virtual world, we believe our customers will be able to innovate more rapidly and cost-effectively."

About Cognata

Cognata provides cutting-edge autonomous driving technologies with its end-to-end solutions for autonomous platforms. Beyond its advanced photorealistic simulation engine, Cognata offers know-how on market offerings, product integration, and a comprehensive, end-to-end V&V walkthrough. Working with some of the largest autonomous vehicle makers and Tier 1 suppliers in the world, Cognata accelerates autonomous and ADAS engineering capabilities, bringing the unique power and expertise of artificial intelligence and computer vision to take years off the development process.

AMD, the AMD Arrow logo, EPYC, Radeon and combinations thereof are trademarks of Advanced Micro Devices.

Contact: Shay Rootman [emailprotected] http://www.cognata.com

SOURCE Cognata


Microsoft and Amazon the focus of cloud computing probe – Proactive Investors USA

Posted: October 5, 2023 at 5:21 pm

About Ian Lyall

Ian Lyall, a seasoned journalist and editor, brings over three decades of experience to his role as Managing Editor at Proactive. Overseeing Proactive's editorial and broadcast operations across six offices on three continents, Ian is responsible for quality control, editorial policy, and content production. He directs the creation of 50,000 pieces of real-time news, feature articles, and filmed interviews annually. Prior to Proactive, Ian helped lead the business output at the Daily...

Proactive financial news and online broadcast teams provide fast, accessible, informative and actionable business and finance news content to a global investment audience. All our content is produced independently by our experienced and qualified teams of news journalists.

The Proactive news team spans the world's key finance and investing hubs, with bureaus and studios in London, New York, Toronto, Vancouver, Sydney and Perth.

We are experts in medium and small-cap markets, and we also keep our community up to date with blue-chip companies, commodities and broader investment stories. This is content that excites and engages motivated private investors.

The team delivers news and unique insights across the market including but not confined to: biotech and pharma, mining and natural resources, battery metals, oil and gas, crypto and emerging digital and EV technologies.

Proactive has always been a forward-looking and enthusiastic technology adopter.

Our human content creators are equipped with many decades of valuable expertise and experience. The team also has access to, and uses, technologies to assist and enhance workflows.

Proactive will on occasion use automation and software tools, including generative AI. Nevertheless, all content published by Proactive is edited and authored by humans, in line with best practice in regard to content production and search engine optimisation.


Cloud cover benefits of being on the cloud – The Actuary

Posted: at 5:20 pm

Host with the most: the cloud hosts the IT you want, and it also offers a host of knock-on benefits. Aristides Zenonos describes the key choices and the latest developments.

The concept of digital transformation has gained significant attention recently. It involves integrating digital technologies across an organisation's operations, reshaping how it delivers value to customers, optimises internal processes and maintains competitiveness.

One essential aspect of it is the migration of data from traditional storage systems to a centralised data lake that is hosted on the cloud, while leveraging the powerful processing capabilities offered by cloud platforms. This opens up opportunities for data-driven solutions and the optimisation of business processes, enabling organisations to modernise their systems and adopt new software applications.

In the actuarial sector, motor and health insurance stand out for their handling of substantial data volumes. Today, as insurers seek enhanced precision, flexibility and efficiency from their digital tools and ecosystems, cloud computing is redefining the actuarial profession and enabling a wealth of new possibilities.

What are the definitions?

Cloud environments encompass the flexible provision of computing power, databases, storage, applications and other IT resources through the internet; the delivery of these services is referred to as cloud computing. An on-premises environment involves the deployment of resources within an organisation's internal IT infrastructure.

In simpler terms, on-premises refers to the physical servers and infrastructure on an organisation's literal site, while the cloud refers to remote resources that are managed by a cloud provider somewhere else.

The three major players in cloud computing are Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP), which currently have market shares of around 32%, 23% and 10% respectively.


Control

On-premises environments give organisations complete control, ensuring the infrastructure is self-contained and eliminating the need for third-party involvement. This offers a sense of security and mitigates issues around internet connectivity or remote servers. They can also be advantageous in situations where organisations rely on legacy systems that are particularly difficult to migrate to the cloud.

Cloud environments obviously offer less control, as they rely on third-party providers. One significant advantage, though, is the flexibility. There are three main categories of cloud services: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).

IaaS essentially forms the foundation, providing the building blocks for network access, computing power and data storage. It offers maximum flexibility and control over IT resources, which can be tailored to an organisations needs.

PaaS is the next level up. It removes the burden of overseeing the underlying infrastructure, enabling users to develop and manage applications without worrying about operating-system updates or hardware maintenance. Fully managed services fall under PaaS, such as data warehousing (AWS Redshift, GCP BigQuery) and machine learning platforms (AWS SageMaker, GCP Vertex AI).

At the top is SaaS, which offers complete, ready-to-use applications that eliminate the need for installation, configuration or maintenance on users own systems. With SaaS, users can directly access fully functional cloud-based products, such as Gmail and Google Drive, without concerns about the underlying infrastructure.

By leveraging different cloud service models, particularly SaaS, end users can direct their attention to their primary tasks and actuaries can change the way they deliver value to clients. They can exploit machine learning and artificial intelligence (AI) to achieve higher accuracy in reserving, ratemaking, pricing and capital modelling. By analysing new data points, actuaries could address changing dynamics in life and pensions and understand emerging risks such as ESG (environmental, social and governance) and even AI itself.

Deployment

Rapid deployment is essential for meeting user expectations in today's competitive market, making it critical when comparing environments. A benefit of cloud computing is its speed and agility. With cloud resources instantly available, they can be accessible to practitioners in just minutes.

On-premises environments, on the other hand, must be configured and maintained by the organisation before practitioners can use them. These set-ups often take more time, because the organisation must configure its own systems.

Cloud solutions excel at speed; data-driven solutions can be effortlessly deployed in multiple regions worldwide with just a few clicks, allowing organisations to deliver solutions with low latency and back-up options. However, it's important to note that on-premises environments can offer lower latency and faster response times in some cases, especially when cloud providers do not host resources in locations close to end users.

Cost

Cost is where the two environments show significant differences. Cloud computing offers a pay-as-you-go model, which provides flexibility and cost optimisation. On-premises infrastructure, meanwhile, often involves a significant upfront cost but provides long-term cost predictability. Organisations can plan their expenses more accurately over time without worrying about things like potential price increases.

One of cloud computing's key advantages is its economies of scale. Cloud providers aggregate the usage of hundreds of thousands of customers, enabling them to offer lower prices. On-premises environments typically serve a single organisation, limiting the potential for cost optimisation through shared resources.

By adopting cloud computing, organisations can shift their focus away from managing and maintaining costly data centres and towards core business projects. This allows them to allocate resources to customer-centric initiatives and strategic endeavours, reducing overheads associated with physical infrastructure management.
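A back-of-the-envelope model makes the pay-as-you-go versus upfront trade-off concrete. All the figures below are invented for illustration, not real provider pricing:

```python
# Toy cost model: cumulative spend for pay-as-you-go cloud vs. an
# on-premises deployment with a large upfront purchase.

def cumulative_cloud_cost(monthly_usage_cost, months):
    # Pay-as-you-go: no upfront spend; cost tracks usage each month.
    return monthly_usage_cost * months

def cumulative_onprem_cost(upfront_hardware, monthly_ops, months):
    # On-premises: big upfront purchase plus steady operating cost.
    return upfront_hardware + monthly_ops * months

def breakeven_month(monthly_cloud, upfront, monthly_ops):
    # First month at which on-premises total spend drops below cloud's.
    for m in range(1, 121):
        if cumulative_onprem_cost(upfront, monthly_ops, m) <= cumulative_cloud_cost(monthly_cloud, m):
            return m
    return None

# Example: $10k/month cloud vs. $200k of servers plus $4k/month to run them.
print(breakeven_month(10_000, 200_000, 4_000))  # 34 -- roughly three years
```

Before the break-even point the cloud's lack of upfront cost wins; after it, the on-premises investment pays off, which is the long-term predictability the paragraph above describes.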

Security

Many organisations place their trust in on-premises environments because the data remains on local machines, providing a sense of control and ownership. This is particularly crucial for industries with strict regulatory requirements or sensitive data that cannot be stored or processed in a public cloud environment. Industries such as healthcare and finance often have strict compliance and security requirements that necessitate maintaining data this way.

However, cloud providers' extensive networks of data centres ensure high availability, redundancy and disaster-recovery capabilities, which can be beneficial in ensuring business continuity. In addition, cloud providers nowadays follow strict protocols such as encryption, user and group permissions, and firewall components to ensure the safety of customer data against malicious attacks.

It's true that an on-premises environment without direct internet connectivity provides more data security than the cloud. However, such isolated on-premises systems may lack a cloud-based solution's flexibility, scalability and collaborative features. Using the power of the cloud and tools such as version control and shared platforms promotes collaboration among staff.

In recent years, new approaches such as federated learning have emerged to foster collaboration while addressing data privacy concerns. Federated learning is a distributed method for training machine learning models; it operates in a decentralised manner, eliminating the need to transfer data from client devices to central servers. Cloud solutions also provide easily configurable functionalities and user permissions to meet diverse actuarial workflow needs, ensuring compliance with global insurance industry regulations such as Solvency II and IFRS 17.
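The core of federated learning can be illustrated with a minimal federated-averaging sketch (a toy one-parameter model, not a production algorithm): each client takes a gradient step on its own private data, and only the resulting weights, never the raw data, are combined centrally.

```python
# Toy federated averaging: fit y = w * x where each client's (x, y)
# pairs never leave the client; only model weights are shared.

def local_update(weights, data, lr=0.1):
    # One gradient step on the client's own data (squared-error loss).
    grad = sum(2 * x * (weights * x - y) for x, y in data) / len(data)
    return weights - lr * grad

def federated_round(global_w, clients):
    # Each client trains locally; the server averages the new weights.
    updates = [local_update(global_w, data) for data in clients]
    return sum(updates) / len(updates)

# Two clients whose private data both follow y = 3x.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0), (4.0, 12.0)],
]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(round(w, 2))  # 3.0 -- converges to the shared underlying model
```

This is why the approach suits regulated data: the server learns the model, but each client's records stay on that client.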

Hybrid solutions

All this said, there is another option: the hybrid cloud. This offers businesses the ability to host resources both locally and remotely. Simply put, it combines different computing environments to support application deployment and data management. It allows organisations to leverage the benefits of both on-premises infrastructure and cloud computing.

This approach is widely adopted because it provides flexibility and enables businesses to continue using their existing on-premises services while taking advantage of the possibilities offered by cloud computing. It offers a balance between control, security and scalability, allowing organisations to optimise their own IT infrastructure.

Positioning for success

Cloud computing has emerged as a leading force that is driving innovation. To stay competitive, an increasing number of companies are embracing digital transformation, and actuarial professionals are recognising the role of data readiness in leveraging the potential of data-driven solutions and AI. They understand that access to high-quality and abundant data is essential for optimising business processes.

Deciding between on-premises and cloud environments requires a thorough understanding of your organisation's circumstances. By considering factors such as data sensitivity, budget and resource requirements, businesses can make an informed choice. The goal is to create an infrastructure that supports operations, maximises efficiency, and positions the business for success in the technological era.

Aristides Zenonos is a senior data scientist


AI, Cloud Computing among 36 FREE Online Courses Now … – Philippine Information Agency

Posted: at 5:20 pm


A remarkable collaboration among the Department of Information and Communications Technology (DICT), its ICT Literacy and Competency Development Bureau (DICT-ILCDB), Digitaljobsph, and Coursebank has opened doors to democratize knowledge about information technology.

The result? A treasure trove of 36 FREE online courses accessible to the general public, ushering in an era of seemingly limitless learning opportunities.

A Vision of Upskilling Excellence

Initially conceived as a means to elevate the skills of the Philippine Government's Information Technology (IT) professionals, the program's objectives have grown to encompass a broader horizon. It now aspires to empower IT professionals across government agencies, LGUs (Local Government Units), and SUCs (State Universities and Colleges), among others, while nurturing the next generation of tech-savvy professionals.

The driving force behind this initiative is a shared vision of a digital future where every individual is equipped with the knowledge and expertise to thrive in a rapidly evolving technological landscape.

The courses offered cover a spectrum of cutting-edge technological tools and emerging fields, including AI (Artificial Intelligence), Big Data, IoT (Internet of Things), and Cloud computing.

The Future of Learning: Asynchronous Online Training

One of the hallmarks of this program is its adaptability to the modern learner's needs.

Recognizing the diversity of schedules and learning paces, the training program adopts an asynchronous online training approach. Drawing inspiration from global leaders in massive open online courses, this method ensures that knowledge is accessible to all, irrespective of their daily commitments.

Embark on Your Learning Journey

If the prospect of enhancing your skills, exploring the frontiers of technology, and contributing to a digitalized future excites you, then Coursebank's array of free online courses is the gateway you've been waiting for.

To discover more about these groundbreaking programs brought to you in collaboration with Coursebank, visit dict.coursebank.ph. Your journey toward a digitally empowered future begins with a single click.

Don't miss this unparalleled opportunity to unlock your potential, enrich your knowledge, and become a part of a brighter, digitally adept Philippines. Embrace the future of learning today! (PIA-NCR)


Amazon Web Services isn’t trying to win the A.I. race. It wants to own the road. – Slate

Posted: July 31, 2023 at 8:30 pm

Thisarticleis fromBig Technology, a newsletter by Alex Kantrowitz.

Amazon's absence from this year's generative A.I. bonanza has been a bit puzzling. The company invented Alexa, intuiting people's interest in speaking with computers, yet when OpenAI released ChatGPT it seemed to cede the territory.

But rather than sitting out the game, Amazon is waiting to play on its terms. Instead of building one A.I. product, it wants a piece of all of them. And it's not shy about its ambition.

"I wouldn't be at all surprised if just the A.I. part of our cloud-computing business was larger than the rest of AWS combined in a couple years," Amazon Web Services vice president Matt Wood told me in an interview at the cloud services summit this week.

Rather than releasing just one product or a large language model by itself, Amazon wants to enable companies building with generative A.I. to create any product using any model. Put another way, instead of developing one ChatGPT or GPT-4, Amazon wants to empower every would-be ChatGPT developer to use any GPT-like model and get going. Amazon will supply the model access, customization, and raw computing power to developers, and make its money as they build.

"It is a great business opportunity," said Andrew Lipsman, principal analyst at Insider Intelligence. "It's smart strategically to focus on where the profits are."

At the core of Amazons effort is a new product called Bedrock. Available inside AWS, Bedrock lets developers select from a range of A.I. models, including from Anthropic, AI21 Labs, and Stability AI. Using these models, developers can build their own products, like A.I. chatbots, and then run them on AWS infrastructure.

Bloomberg, for instance, built BloombergGPT, a bot for financial information, on a Bedrock precursor called SageMaker. To do it, the company took four decades of unstructured financial data and analytics, loaded them into AWS, added some other training material, and tuned the model. Bedrock should make such a process faster, with preloaded models in a catalog. Once a product is built, Amazon will continue to support it. When people chat with BloombergGPT, for instance, it uses Amazons storage to work, so Amazon gets a slice every step of the way.

"We get paid by providing compute capacity to actually do the model training, and for providing the access to the large amounts of storage that are needed," said Wood. "You may train a model once a month, once a week, but you're going to be running predictions and inference and chatting with that model hundreds, thousands, tens of thousands of times a day."
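Wood's point about usage patterns is easy to quantify with a rough model; the prices below are invented placeholders, not AWS rates:

```python
# Toy monthly bill: training is infrequent and chunky, inference is
# tiny per call but constant, so inference can quickly dominate.

def monthly_bill(trainings_per_month, cost_per_training,
                 inferences_per_day, cost_per_inference_cents, days=30):
    training = trainings_per_month * cost_per_training
    inference = inferences_per_day * days * cost_per_inference_cents / 100
    return training, inference

# One $5,000 training run a month vs. 20,000 daily chats at 1 cent each.
training, inference = monthly_bill(1, 5_000, 20_000, 1)
print(training, inference)  # 5000 6000.0 -- inference already the larger line item
```

Even at these modest made-up volumes, the steady inference traffic outweighs the occasional training run, which is the revenue stream Wood is describing.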

Competitors including Microsoft and Google offer similar capabilities, but Amazon has a few advantages. Without its own consumer chatbot, or a multibillion-dollar attachment to an A.I. research house, it's pitching itself as a company with more neutrality and pragmatism than its peers. This could be compelling for developers looking for more customizability or assurance that their data stays at home, a pressing issue for many. It's also helpful for Amazon that so many internet companies already have their data on its cloud. "You'd be surprised how many customers have exabytes of data on AWS," said Wood. An exabyte is 1 billion gigabytes.

Amazon does have its own A.I. model, called Titan, that it offers alongside its menu of others. So it's not entirely neutral. The company also develops its own A.I.-specific chips, which underlie some of the computing, but it doesn't sell them like Nvidia does. Both efforts are meant to enhance the core service offering.

Long ago, Amazon learned there's value to being first. It established its cloud-services lead early and still dominates. But on A.I., it's playing catch-up as Microsoft appears to be in the lead, with 11,000 customers using its generative A.I. service via a partnership with OpenAI.

Amazon is, at least, joining right after the starting gun, with a plan that could work no matter what model or product wins. "We are three steps into a marathon race," said Wood. "And I don't think anybody without a smile on their face could call a winner three steps into a marathon, right?"

Read more:

Amazon Web Services isn't trying to win the A.I. race. It wants to own the road. - Slate


The Machines Behind the FinOps Curtain: Operationalizing Your Strategy with AI – ITPro Today

Posted: at 8:30 pm

This article was originally published on Network Computing.

FinOps is gaining steam, but companies shouldn't just forge ahead; they should carefully consider their approach. The technology, which garners most of the headlines, is only one ingredient. The data cloud services generate is so voluminous and complex that people and processes matter just as much.

Related: Why FinOps Is Key to Cloud Cost Optimization

Think about what FinOps has to do:

The human brain can't possibly do all of this. So, today's typical human tools (manual processes and spreadsheets) are not up to the task.

Related: How FinOps Can Help Optimize Cloud Spending

Integration: Cost management requires integration to discover shadow IT and evaluate usage across hundreds of SaaS apps, as well as private and public IaaS. Integration is crucial so you can understand cloud usage in granular detail. In turn, this helps put boundaries around spending and track dynamic pricing. FinOps should work in sync with financial management systems and should be able to produce general ledger files.

Real-time AI, ML, behavioral analytics (BA), and predictive analytics (PA): FinOps platforms must handle all the cloud usage data that comes along for the ride. Multiple times a day, you'll have to crunch expansive data sets, track dynamic pricing (which can change daily), and compare your service configurations to millions of other pricing schemas, all while normalizing cost data across multiple providers. AI, ML, BA, and PA can help with this task.
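The normalization step mentioned above can be sketched minimally as follows. The record shapes are simplified stand-ins: real AWS and Azure cost exports carry far more columns under different names, so every field name here is an illustrative assumption:

```python
# Sketch: flattening provider-specific billing records into one schema.
# Field names are simplified stand-ins, not the actual export schemas.
def normalize(record, provider):
    if provider == "aws":
        return {"service": record["ProductName"],
                "usd": float(record["UnblendedCost"]),
                "day": record["UsageStartDate"][:10]}
    if provider == "azure":
        return {"service": record["meterCategory"],
                "usd": float(record["costInUsd"]),
                "day": record["date"]}
    raise ValueError(f"unknown provider: {provider}")

rows = [
    normalize({"ProductName": "EC2", "UnblendedCost": "12.50",
               "UsageStartDate": "2023-07-01T00:00:00Z"}, "aws"),
    normalize({"meterCategory": "Virtual Machines", "costInUsd": "9.80",
               "date": "2023-07-01"}, "azure"),
]
total = sum(r["usd"] for r in rows)   # normalized spend across providers
```

Once every provider's rows share one schema, the AI/ML layers described above can operate on a single data set instead of per-vendor silos.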

Cloud optimization and CEM working together: FinOps requires cloud service usage optimization and cloud expense management (CEM) to work in unison. Cloud optimization tools alone can only go so far. Yes, you need usage optimization tools that identify waste and unused resources, recommend lower-cost storage tiers, and help with rightsizing. But you also need an expense management platform to handle and automate invoice processing, cost allocations, and forecasting. And you need it all automated with a single-pane view, so you leave the manual headaches in the past.

Automated response (closed-loop automation): Insights are useless if you can't act. FinOps solutions and tech platforms should generate more than recommendations. They should be integrated into IaaS service control panels so you can approve a recommendation and ensure the system makes suitable service modifications automatically, which speeds up the time to savings.
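A closed-loop flow of this kind can be sketched as below. `CloudControlPanel` is a hypothetical stand-in for a real IaaS SDK client, and the recommendation shape is an assumption for illustration:

```python
# Sketch of closed-loop automation: an approved recommendation is applied
# automatically through the provider's control plane. `CloudControlPanel`
# is a hypothetical stand-in for a real IaaS SDK client.
from dataclasses import dataclass

@dataclass
class Recommendation:
    resource_id: str
    action: str                  # e.g. "rightsize"
    target: str                  # e.g. a smaller instance shape
    est_monthly_savings: float

class CloudControlPanel:
    def resize(self, resource_id, target):
        # A real adapter would call the provider API here.
        return f"resized {resource_id} to {target}"

def apply_if_approved(rec, approved, panel):
    if not approved:
        return "pending approval"          # insight without action
    if rec.action == "rightsize":
        return panel.resize(rec.resource_id, rec.target)
    raise NotImplementedError(rec.action)

rec = Recommendation("i-0abc123", "rightsize", "m5.large", 140.0)
result = apply_if_approved(rec, approved=True, panel=CloudControlPanel())
```

The design point is that the recommendation carries enough context (resource, action, target) that approval is the only human step between insight and savings.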

How a tech research firm moved from manually managing the cloud to FinOps and started controlling multi-cloud infrastructure costs.

Talking about FinOps in the abstract is one thing. Seeing it at work is something else.

Senior IT and financial leaders at a technology research firm faced visibility, management, and cost control challenges after migrating much of the company's network infrastructure to the cloud. As their digital transformation matured, they found managing the cloud resource-intensive.

Every month, IT financial analysts manually evaluated millions of dollars in expenses with little control over costs and invoice information. Analysts had to study tens of thousands of rows of billing information to decipher which departments were using each cloud service. Allocating each cost to its associated department was also a manual process. With roughly 500 departments, this administrative work had mushroomed into a full-time job for a larger group of employees. Because this work had become so laborious, finding the time to optimize costs seemed nearly impossible.
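The allocation chore described above is exactly what FinOps platforms automate. A minimal sketch, assuming billing rows carry a department tag (the row shape and tag key are illustrative assumptions):

```python
# Sketch: allocating billing rows to departments by tag instead of by hand.
# The row shape and the "department" tag key are illustrative assumptions.
from collections import defaultdict

billing_rows = [
    {"cost": 120.0, "tags": {"department": "marketing"}},
    {"cost": 300.0, "tags": {"department": "engineering"}},
    {"cost": 45.0,  "tags": {}},   # untagged spend still needs an owner
]

def allocate(rows):
    totals = defaultdict(float)
    for row in rows:
        dept = row["tags"].get("department", "unallocated")
        totals[dept] += row["cost"]
    return dict(totals)

print(allocate(billing_rows))
# {'marketing': 120.0, 'engineering': 300.0, 'unallocated': 45.0}
```

Run over tens of thousands of rows and roughly 500 departments, a loop like this replaces the full-time manual effort the analysts describe, and the "unallocated" bucket surfaces the untagged spend that manual review tends to miss.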

The company had been using a third-party cloud optimization tool for several years. As a point solution, it could highlight ways to optimize costs but couldn't automate IaaS financial management tasks. The company asked Tangoe to solve both problems, replacing the existing tool with one comprehensive FinOps solution for improved productivity and cost savings, and to do it all in 30 days, just before their contract expired.

Using the Tangoe One Cloud for IaaS solution, expense management experts implemented a proof-of-concept that worked across Amazon Web Services, Microsoft Azure, and Google Cloud Platform to deliver what the existing tool could not:

The company recognized productivity gains in both IT and financial departments. What used to take IT financial analysts an estimated 40 hours of manual work every week or month now happens in just minutes.

Today, company leaders have one complete solution giving them the peace of mind that comes from knowing their cloud investments are being utilized responsibly, and their spending is under control across their multi-cloud estate.

Taking a DIY approach is risky unless you can genuinely build all of these tools in-house with homegrown systems.

You need a sophisticated technology platform behind your cost management program and team. Tangoe is one such system. I recently wrote a buyer's guide titled "What To Look For In A Cloud Expense Management Solution." You can access it here.

Zeus Kerravala is the founder and principal analyst with ZK Research.


Strengthening security in a multi-SaaS cloud environment – TechCrunch

Posted: at 8:30 pm

Steven Tamm is a technology adviser to Spin.AI and a former Salesforce CTO with extensive experience in cloud computing, e-commerce, virtualization, developer tools, cybersecurity, compliance and SaaS.

Managing security across multiple SaaS cloud deployments is becoming more challenging as the number of zero-day and ransomware attacks continues to rise. In fact, recent research reveals that a staggering 76% of organizations fell victim to a ransomware attack in the past year.

It's no secret that protecting data is hard, and with the rise of cloud technologies, it's becoming harder. But when it comes to cloud SaaS application risk, what does that look like? And what actionable steps can teams and IT pros take to help mitigate those risks at their organization? In this article, I'm going to explore those questions and provide some insights.

Modern organizations encounter a variety of SaaS challenges, including the absence of configuration standards, multiple APIs, and user interfaces (UIs) with varying access levels and potential data leaks across interconnected systems. Securing structured data in CRM applications, communication data in messaging platforms, and unstructured data from file providers is already difficult.

However, when these systems are sourced from different vendors, it becomes even more challenging to detect and prevent attacks in a timely manner. The interconnected nature of these systems makes tracking data provenance difficult and facilitates broad spread of malware and ransomware.

This challenge is further exacerbated when organizations extend their systems to include external users. With expanding footprints, the inadvertent leakage or destruction of sensitive data becomes a significant concern. Popular platforms like Salesforce Communities, Slack Connect, Microsoft Teams, Microsoft 365, and Google Drive create a complex web of identity, permissions, and integration controls.

Unfortunately, most endpoint management tools on the market were designed for a pre-cloud, pre-bring-your-own-device (BYOD) era, making them inadequate for managing the modern SaaS landscape. So how do you take control?

When managing risk in the cloud, it's crucial to select IT and security solutions that truly address the intricacies of the deployed SaaS applications and were born 100% in the cloud, without any legacy on-premises components. The good news is that vendors are developing innovative solutions to help IT and security teams do this. But it's essential to explore the options and consider the following:

First, do they go beyond basic factors such as OAuth scopes, login IP addresses, and high-level scores, and instead delve deeper into data usage patterns and even examine the code of all integrations?

Second, many major SaaS vendors provide event monitoring, antivirus protection, and basic data leak prevention as check-box features. But these features often fall short when it comes to preventing and remediating data attacks because of miscalibrated thresholds in alert systems and logs that are not tuned for specific organizations. That results in alert overload and fatigue. It's important to understand how a solution improves risk scoring and alert prioritization.
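One way a solution can tune alerts to a specific organization is to derive thresholds from that organization's own recent activity rather than a fixed global cutoff. The sketch below uses a simple mean-plus-k-standard-deviations baseline; the event counts are illustrative:

```python
import statistics

# Sketch: deriving an alert threshold from an organization's own recent
# activity (mean + k standard deviations) instead of a fixed global cutoff.
# The event counts below are illustrative.
def adaptive_threshold(history, k=3.0):
    return statistics.fmean(history) + k * statistics.pstdev(history)

daily_downloads = [110, 95, 120, 105, 98, 115, 102]   # a week of normal use
threshold = adaptive_threshold(daily_downloads)       # ~131.6

alert = 1400 > threshold   # a genuine exfiltration-sized spike fires
noise = 130 > threshold    # routine variation does not
```

A per-organization baseline like this is a crude stand-in for the richer behavioral analytics the article alludes to, but it shows why tuning to the tenant, rather than shipping one global threshold, reduces alert fatigue.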


Oracle Introduces First Cloud Native Secure Cloud Computing … – PR Newswire

Posted: at 8:30 pm

New Oracle Cloud Native SCCA Landing Zone solution reduces barriers to cloud adoption and enables mission owners to rapidly build compliant architectures

AUSTIN, Texas, July 31, 2023 /PRNewswire/ -- Oracle Cloud Infrastructure (OCI) has introduced a new Secure Cloud Computing Architecture (SCCA) for the U.S. Department of Defense (DoD). The solution helps make security compliance and cloud adoption for mission-critical workloads easier, faster, and more cost effective by using a framework of cloud native services.

SCCA is a DoD security framework designed to provide a standard approach for boundary and application-level security for Defense Information Systems Agency (DISA) Impact Level 4 and 5 data hosted in commercial cloud environments. Historically, SCCA compliance has required significant investment from DoD mission owners in the form of independent development efforts and third-party software licensing. The cost and time involved pose a significant challenge during cloud migrations.

Oracle Cloud Native SCCA Landing Zone provides a framework for securely running DoD mission workloads and storing Impact Level 2, 4, and 5 data in OCI government regions. The automation provided by the solution enables DoD mission owners to establish a compliant security architecture in just a few hours or days, instead of months. It uses cloud native infrastructure services, significantly accelerating the time to deployment of mission-critical workloads by reducing architecture time and minimizing decision points.

"Oracle Cloud Native SCCA Landing Zone is a game changer for our customers. What we are doing is fundamentally different," said Rand Waldron, vice president, OCI Global Government Sector. "We will deliver all the capabilities necessary for SCCA completely in native OCI services. Our customers will no longer have to manage multiple licenses, multiple vendor relationships, or multiple kinds of security configurations. Our SCCA solution will provide everything the customer needs to stand up an SCCA-compliant workload in the cloud."

Learn more about Oracle's new Cloud Native SCCA Landing Zone solution in the OCI SCCA Architecture Guide.

Simplifying and accelerating DoD security compliance

The Oracle Cloud Native SCCA Landing Zone includes baseline configurations, rules, and templates that meet DISA Impact Level 2, 4, and 5 accreditation requirements. This is delivered using a standardized Infrastructure-as-Code (IaC) template that meets a set of SCCA controls in a simplified and repeatable way. Based on Terraform, OCI Landing Zones allow OCI customers to perform one-click, best-practice deployments of multiple Oracle services at once. Customers can launch the templates from the Cloud Native SCCA Landing Zone, answer a few simple questions about their configuration, and have an architecture set up the same day.

The solution also addresses the four primary technical components of the SCCA framework: Cloud Access Point (CAP), Virtual Data Center Security Stack (VDSS), Virtual Data Center Management Service (VDMS), and Trusted Cloud Credential Manager (TCCM). Customers who deploy the secure baseline using the Cloud Native SCCA Landing Zone are provided with an architecture guide, implementation guide, requirements checklist, reference architecture, and best practices to accelerate the accreditation of their application on OCI.

Security, compliance, consistent high performance, and simple, predictable pricing

The Oracle Cloud Native SCCA Landing Zone script and associated technical documentation are provided at no separate or additional charge under a customer's contract. Underlying consumable cloud services used to stand up Oracle Cloud Native SCCA in a customer's tenancy may be billable in accordance with the customer's contract. Oracle Cloud for DoD services are priced at the same, consistent global pricing as Oracle's commercial public cloud regions and meet DISA Impact Levels 2, 4, and 5 and FedRAMP+ authorization standards.

Commercial customers can also take advantage of the automated security posture outlined above. All OCI customers can leverage custom security zones with the SCCA and other OCI Landing Zones, which allow organizations to quickly and easily apply security policies and prevent changes that could weaken a customer's security configuration. Learn more about Oracle Landing Zones here.

Oracle Cloud Native SCCA Landing Zone scripts are available within the OCI Console, through GitHub, and from the Hosting and Compute Center (HaCC) website.

Additional Resources

About Oracle

Oracle offers integrated suites of applications plus secure, autonomous infrastructure in the Oracle Cloud. For more information about Oracle (NYSE: ORCL), please visit us at oracle.com.

Trademarks

Oracle, Java, MySQL, and NetSuite are registered trademarks of Oracle Corporation. NetSuite was the first cloud company, ushering in the new era of cloud computing.

SOURCE Oracle

