Spot bitcoin ETF net outflows hit highest level yet on day 9 of trading – Blockworks

Spot bitcoin ETFs saw their highest level of net outflows in a single day on Wednesday as flows into BlackRock's product slowed.

The 10 US funds holding BTC directly endured outflows of $158 million on their ninth day of trading, according to Bloomberg Intelligence data, the highest single-day total thus far.

Grayscale Investments' Bitcoin Trust ETF (GBTC) continued to weigh the group down, with $429 million leaving the fund.

Read more: As GBTC outflows continue, will the largest bitcoin ETF be dethroned?

While BlackRock's iShares Bitcoin Trust (IBIT) saw inflows of $66 million, the gains were a notable dip from previous days. IBIT had reeled in $272 million and $160 million on Monday and Tuesday, respectively.

Fidelity Investments' Wise Origin Bitcoin Fund (FBTC) led the slate of bitcoin ETFs on Wednesday with inflows of roughly $126 million.

The $158 million of outflows marked the third consecutive day of combined outflows for the 10 spot bitcoin ETFs.

Read more: Bitcoin ETF Tracker

After the funds notched net inflows of $43 million on Friday, their outflows totaled $76 million on Monday and $106 million on Tuesday.

Overall, spot bitcoin ETF net inflows stand at $824 million since the funds launched on Jan. 11, Bloomberg Intelligence data indicates. Excluding the outflows from higher-priced GBTC, inflows for the remaining ETFs total $5.2 billion.

David Lawant, head of research at crypto prime brokerage FalconX, said in an X post that the driving forces of spot bitcoin ETF flows, GBTC's heavy outflows and the sizable net gains from IBIT and FBTC, will both taper off.

"While the former dominates in the short run, the latter will dominate in the medium/long run by a mile," Lawant added.

Read more: GBTC's asset bleeding to blame for week of flat crypto product flows

Bitwise Chief Investment Officer Matt Hougan has said he believes spot bitcoin ETFs could see $55 billion in net flows in their first five years on the market.

21Shares President Ophelia Snyder told Blockworks in an interview earlier this month that a number of investors will be slow to adopt spot bitcoin ETFs. Others that do might start with small allocations before ramping those positions up later on.

"So I think this fixation on the short-term flows is crazy short-sighted and largely not the point," she said.



This is what was behind the bitcoin sell-off and why JPMorgan thinks it could be ending – CNBC

The driving force behind the recent sell-off in bitcoin may have run its course, according to JPMorgan. Bitcoin rallied in the second half of 2023 as optimism around the approval of exchange-traded funds grew, but their debut earlier this month has proven to be a "sell the news" event for the world's largest cryptocurrency. Bitcoin briefly traded above $49,000 shortly before the funds launched, but then fell more than 20% before seeming to stabilize around $40,000.

[Chart: Bitcoin has retreated since the approval of bitcoin ETFs.]

JPMorgan strategist Nikolaos Panigirtzoglou said in a note to clients Thursday that the main source of the selling has come from the Grayscale Bitcoin Trust (GBTC). The fund, which traded at a steep discount as an over-the-counter product before investors became confident that a conversion to an ETF would happen, has seen heavy outflows over the past two weeks.

"Profit-taking on previous GBTC investments, made at a discount to [net asset value] last year, has likely been a major driver behind bitcoin's correction; $4.3bn has thus far exited GBTC since its conversion to ETF," Panigirtzoglou said.

Some outflows from GBTC were expected, given the prior discount and its high cost relative to other bitcoin ETFs. Panigirtzoglou, who had estimated $3 billion of outflows, said that the decline is likely profit-taking rather than a sign that the cash is moving to other options, and that the outflows should slow from here.

"We conclude that GBTC profit taking has largely happened already. In turn, this would imply that most of the downward pressure on bitcoin from that channel should be largely behind us," Panigirtzoglou said.

Even with the outflows from GBTC, the fund still has about $20 billion in assets under management. And some of the other bitcoin ETFs are seeing big inflows, with funds from iShares (IBIT) and Fidelity Wise Origin (FBTC) both surpassing $1 billion in inflows.
The total inflows and decline in price of bitcoin have missed the optimistic estimates of some crypto bulls, but the dollar amounts are still large compared to other ETF launches.


Bitcoin bounces even as ETF outflows mount – Blockworks

Bitcoin moved back into the $42,000 range Friday after a disappointing week of trading. Meanwhile, stocks slipped even as the latest economic data bolstered expectations that the Federal Reserve can achieve a soft landing after all.

Bitcoin (BTC), which struggled to break out above $40,000 this week, posted a recovery Friday, gaining more than 8% since its Tuesday low. Ether (ETH) was also on the recovery path, trading around 2% higher Friday.

Analysts are cautiously optimistic that bitcoin's ETF-driven selloff could be easing, even as outflows mount. The new spot products end their third week of trading Friday, and outflows are increasing. Spot bitcoin ETF net outflows hit a high of $158 million Thursday.

Read more: Bitcoin ETFs see net outflows for 4 straight days

"This doesn't necessarily mean that the GBTC outflows are over," Noelle Acheson, author of the Crypto is Macro Now newsletter, said. "Yesterday, they were $394 million, which sounds like a lot but is the lowest outflow since launch day. Rather, it reminds us that flows matter but are not the main driver of the BTC price."

Personal consumption expenditures (PCE) price index data released Friday was in line with analysts' expectations, showing a 0.2% increase in December and a 2.9% rise year-over-year. The numbers are a sign that while inflation is still elevated, it is trending lower.

"Spending last month increased 0.7%, comfortably exceeding the 0.5% that was expected," Craig Erlam, senior market analyst at Oanda, said. "It also came on top of the upward revision to the November reading, which increased to 0.4% from 0.2% previously. All things considered, it's another sign that the US consumer and economy are in very healthy shape going into the new year."

Read more: Spot bitcoin ETF net outflows hit highest level yet on day 9 of trading

Still, traders seemed skeptical. The S&P 500 traded sideways and the Nasdaq Composite lost around 0.4% toward the end of Friday's session. Both indexes remain modestly higher over the week ahead of the Fed's Federal Open Market Committee meeting Tuesday.

Minimal yield moves Thursday and Friday, thanks to the expected economic data releases, are a positive sign for traders, Tom Essaye, founder of Sevens Report Research, said Friday.

"In order for economic data to move yields, it must be so good or so bad it alters the market's outlook for rate cuts and either pushes it back more (yields higher) or pulls it forward (yields lower)," Essaye said.

"Going forward, the calmer yields are, like yesterday, the more they'll support these gains in stocks."



Bitwise executive confirms ETF received $400 of unsolicited Bitcoin – CryptoSlate


These two drivers will fuel Bitcoin’s rebound: ‘It’s happened often before’ – DLNews

Bitcoin rebounded overnight after falling some 20% from its $49,000 high earlier in the month. Two things will drive the cryptocurrency's recovery, according to analysts.

Noelle Acheson, author of the Crypto is Macro Now newsletter and former head of research at Genesis, said on X that the drop experienced over the past two weeks will likely be short-term, as ETF-related buying will continue as GBTC exits wane, and as geopolitical tension pushes more savings into a hedge asset.

The slide in prices showed signs of subsiding overnight.

Having fallen below the $39,000 mark, Bitcoin was back trading over $40,000 by 1 pm UK time, up 3% since Tuesday.


"In the meantime, we're seeing a glimpse of market sanity," she said, as mispriced rates outlooks are being corrected.

"It's happened often before, and given the tailwinds (ETFs, halving, currency turmoil), the market is likely to recover, especially as dip-buyers step in," Acheson said.

The recovery follows the US Securities and Exchange Commission's approval of 11 spot ETFs earlier this month, bringing Bitcoin exposure to American investors and institutions.

In the end only 10 of the issuers approved launched spot Bitcoin ETFs, as Hashdex decided not to convert its existing futures ETF into a spot fund.


The halving refers to the process in which the network automatically, irreversibly halves the amount of Bitcoin miners can earn from appending blocks of verified transactions to the Bitcoin ledger.

Based on current estimates, the next halving will happen around mid-April.
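The schedule described above can be sketched in a few lines. The interval (210,000 blocks) and starting subsidy (50 BTC) are Bitcoin protocol constants; the code is an illustration, not consensus-accurate node logic:

```python
# Sketch of Bitcoin's block-subsidy schedule: the subsidy starts at 50 BTC
# and is cut in half every 210,000 blocks (roughly every four years).
HALVING_INTERVAL = 210_000  # blocks between halvings
INITIAL_SUBSIDY = 50.0      # BTC per block at genesis

def block_subsidy(height: int) -> float:
    """Return the mining subsidy (in BTC) for a block at the given height."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # after 64 halvings the subsidy is effectively zero
        return 0.0
    return INITIAL_SUBSIDY / (2 ** halvings)

# The fourth halving, at block 840,000, cuts the subsidy from 6.25 to 3.125 BTC.
print(block_subsidy(839_999))  # 6.25
print(block_subsidy(840_000))  # 3.125
```

Each halving therefore reduces the flow of newly issued bitcoin, which is why analysts count it among the supply-side tailwinds.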

So why did Bitcoin wobble?

The arrival of spot ETFs followed the old Wall Street adage of "buy the rumour, sell the news," said Chris Kuiper, director of research at Fidelity Digital Assets.

Open interest, the total number of futures contracts held by market participants, rose towards the end of 2023 in anticipation of ETFs, Kuiper said, and the speculative fever could also be seen in the spike in perpetual futures funding rates.

The recent spate of selling and downward trend in prices doesn't suggest the long-term bull market trend has been broken, Kuiper concluded.

"Furthermore, nothing in the core Bitcoin investment thesis has been invalidated," he added.

Fidelity Digital Assets' research chief had warned that volatility could spike just a week before ETFs were approved and Bitcoin began to drop.

Grayscale's GBTC has seen nearly $4 billion in outflows since the fund converted to an ETF, causing a drag on the price of Bitcoin, according to JPMorgan.

Investors had been locked into GBTC for years, including bankrupt firms such as FTX and BlockFi. The conversion to an ETF offered an off-ramp for firms in need of liquidity, and likely explains some of the selling pressure in recent weeks, according to JPMorgan.

Grayscale is also feeling the heat from other firms, including Wall Street giants BlackRock and Fidelity, that charge investors a fraction of GBTC's 1.5% management fee.

BlackRock, the world's largest asset manager, charges just 0.12%, 138 basis points lower than Grayscale.
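To put that gap in concrete dollar terms, here is a small arithmetic sketch using the fees cited above; the $100,000 position is a hypothetical figure chosen purely for illustration:

```python
# Expense ratios are quoted in percent; one basis point (bp) is 0.01%.
GBTC_FEE = 0.015    # 1.5% management fee cited for GBTC
IBIT_FEE = 0.0012   # 0.12% fee cited for BlackRock's fund

gap_bps = (GBTC_FEE - IBIT_FEE) * 10_000
print(f"Fee gap: {gap_bps:.0f} bps")  # Fee gap: 138 bps

# Annual cost on a hypothetical $100,000 position:
position = 100_000
print(f"GBTC: ${position * GBTC_FEE:,.0f}/yr vs IBIT: ${position * IBIT_FEE:,.0f}/yr")
# GBTC: $1,500/yr vs IBIT: $120/yr
```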

While the GBTC selloff has been labelled a drag on the price of Bitcoin, another factor was the changing macroeconomic outlook.

In December, the US Federal Reserve made several dovish signals, which traders took to mean that interest rate cuts were on the horizon. Consequently, the price of Bitcoin went up.

Those hopes are now fading as traders change their outlook on rates. Just one month ago, the market priced in a 75% chance of an interest rate cut in March; this has since fallen to 50% on the back of hawkish statements from central bankers.

Lower interest rates benefit risk assets like Bitcoin as investors take more risk in search of returns.

Bitcoin's 24/7 nature means it can often adjust to changes in outlook and sentiment ahead of other assets. "It's often a better indicator of liquidity sentiment than stocks, in that it is one of THE most sensitive assets to monetary conditions," said Noelle Acheson.

"Over the past few days, we've seen a steep revaluation of rate cut expectations. Bitcoin is reacting to that, while stocks aren't," she said.

Eric Johansson is DL News' News Editor. Adam Morgan McCarthy is a Markets Correspondent at DL News. Tyler Pearson is a Junior Markets Correspondent at DL News. If you've got a hot crypto tip, please reach out to us at eric@dlnews.com, adam@dlnews.com and ty@dlnews.com.


The Singularity beckons: Are we heading for an AI takeover or a glorious new era? – Medium

The Singularity: a point in time where artificial intelligence surpasses human intelligence, unleashing an intelligence explosion beyond our comprehension. It's a concept that evokes both excitement and dread, a technological horizon shimmering with endless possibility and shrouded in existential uncertainty.

Experts disagree on the inevitability and timing of the Singularity. Some, like futurist Ray Kurzweil, predict it by 2045, while others argue it's a mythical endpoint, forever just beyond our grasp. But whether it's decades or centuries away, the question remains: what happens when AI outshines us?

Optimists paint a rosy picture. AI could solve intractable problems like climate change, poverty, and disease. Imagine machines designing advanced materials, optimizing energy grids, and personalized healthcare delivered by AI doctors. This AI-powered utopia would be a world of abundance and peace, where machines handle the mundane while humans pursue creative endeavors.

But there's a flip side to the coin, a darker vision straight out of science fiction. What if AI sees humans as the obstacle, a problem to be solved? Could intelligent machines develop their own goals, potentially conflicting with ours? And who controls this superintelligence? Malicious actors with access to such power could plunge us into a dystopian nightmare.

The Singularity, whether real or imagined, demands our attention. We must navigate this technological frontier with caution and foresight. Developing ethical guidelines for AI development, prioritizing human values and control, and ensuring equitable access to its benefits are all crucial steps. We need to create an AI that learns from us, that understands our values, and that works in partnership with us, not against us.

The Singularity is not a prediction, it's a call to action. It's a reminder that the future of AI is not preordained. By making informed choices, we can ensure that this powerful technology serves humanity's greatest good.

What are your thoughts on the Singularity? Do you see it as a threat or an opportunity? How can we prepare for this potential turning point in our history? Share your views in the comments below! Let's start a conversation about the future of AI and write a story where humans and machines thrive together.

Remember, the power to shape the future of AI lies not with machines, but with us. Let's choose wisely.


Cloud-Computing in the Post-Serverless Era: Current Trends and Beyond – InfoQ.com

[Note: The opinions and predictions in this article are those of the author and not of InfoQ.]

As AWS Lambda approaches its 10th anniversary this year, serverless computing expands beyond just Function as a Service (FaaS). Today, serverless describes cloud services that require no manual provisioning, offer on-demand auto-scaling, and use consumption-based pricing. This shift is part of a broader evolution in cloud computing, with serverless technology continuously transforming. This article focuses on the future beyond serverless, exploring how the cloud landscape will evolve beyond current hyperscaler models and its impact on developers and operations teams. I will examine the top three trends shaping this evolution.

In software development, a "module" or "component" typically refers to a self-contained unit of software that performs a cohesive set of actions. This concept corresponds elegantly to the microservice architecture that typically runs on long-running compute services such as Virtual Machines (VMs) or a container service. AWS EC2, one of the first widely accessible cloud computing services, offered scalable VMs. Introducing such scalable, accessible cloud resources provided the infrastructure necessary for microservices architecture to become practical and widespread. This shift led to decomposing monolithic applications into independently deployable microservice units.

Let's continue with this analogy of software units. A function is a block of code that encapsulates a sequence of statements performing a single task with defined input and output. This unit of code nicely corresponds to the FaaS execution model.

The concept of FaaS, which involves executing code in response to events without the need for managing infrastructure, was already suggested by services like Google App Engine, Azure WebJobs, IronWorker, and AWS Elastic Beanstalk before AWS Lambda brought it into the mainstream. Lambda, emerging as the first major commercial implementation of FaaS, acted as a catalyst for its popularity by easing the deployment process for developers. This advancement led to the transformation of microservices into smaller, individually scalable, event-driven operations.

In the evolution toward smaller software units offered as a service, one might wonder if we will see basic programming elements like expressions or statements as a service (such as int x = a + b;). The progression, however, steers away from this path. Instead, we are witnessing the minimization and eventual replacement of functions by configurable cloud constructs. Constructs in software development, encompassing elements like conditionals (if-else, switch statements), loops (for, while), exception handling (try-catch-finally), or user-defined data structures, are instrumental in controlling program flow or managing complex data types. In cloud services, constructs align with capabilities that enable the composition of distributed applications, interlinking software modules such as microservices and functions, and managing data flow between them.

Cloud construct replacing functions, replacing microservices, replacing monolithic applications

While you might have previously used a function to filter, route, batch, or split events, or to call another cloud service or function, now these operations and more can be done with less code in your functions, or in many cases with no function code at all. They can be replaced by configurable cloud constructs that are part of the cloud services. Let's look at a few concrete examples from AWS to demonstrate this transition from Lambda function code to cloud constructs:

These are just a few examples of application code constructs becoming serverless cloud constructs. Rather than validating input values in a function with if-else logic, you can validate the inputs through configuration. Rather than routing events with a case or switch statement to invoke other code from within a function, you can define routing logic declaratively outside the function. Events can be triggered from data sources on data change, batched, or split without a repetition construct, such as a for or while loop.

Events can be validated, transformed, batched, routed, filtered, and enriched without a function. Failures can be handled and directed to DLQs and back without try-catch code, and successful completions can be directed on to other functions and service endpoints. Moving these constructs from application code into construct configuration reduces application code size or removes it entirely, eliminating the need for security patching and other maintenance.
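As a rough illustration of this shift, the sketch below contrasts routing logic written inside a function with the same intent expressed declaratively. The pattern syntax loosely mimics an EventBridge event pattern but is simplified, and the handler names and tiny matcher are hypothetical, not a real AWS API:

```python
# Hypothetical contrast: routing inside a function vs. a declarative rule.

# 1) The "function code" style: branching logic lives in application code.
def route_in_code(event: dict) -> str:
    if event.get("source") == "orders" and event.get("amount", 0) > 100:
        return "large-order-handler"
    return "default-handler"

# 2) The "cloud construct" style: the same intent expressed as data, in the
#    spirit of an EventBridge event pattern (syntax simplified for this sketch).
RULE = {"source": ["orders"], "amount": [{"numeric": [">", 100]}]}

def matches(rule: dict, event: dict) -> bool:
    """Tiny matcher mimicking declarative pattern evaluation."""
    for key, allowed in rule.items():
        value = event.get(key)
        ok = False
        for alt in allowed:
            if isinstance(alt, dict) and "numeric" in alt:
                op, bound = alt["numeric"]
                ok = ok or (op == ">" and isinstance(value, (int, float)) and value > bound)
            else:
                ok = ok or value == alt
        if not ok:
            return False
    return True

event = {"source": "orders", "amount": 250}
assert route_in_code(event) == "large-order-handler"
assert matches(RULE, event)
```

In the construct-based style, the rule lives in service configuration rather than in deployed code, so there is nothing to patch or redeploy when the routing changes.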

A primitive and a construct in programming have distinct meanings and roles. A primitive is a basic data type inherently part of a programming language. It embodies a basic value, such as an integer, float, boolean, or character, and does not comprise other types. Mirroring this concept, the cloud, just like a giant programming runtime, is evolving from infrastructure primitives like network load balancers, virtual machines, file storage, and databases to more refined and configurable cloud constructs.

Like programming constructs, these cloud constructs orchestrate distributed application interactions and manage complex data flows. However, these constructs are not isolated cloud services; there isn't a standalone "filtering as a service" or "event emitter as a service." There are no "Constructs as a Service," but they are increasingly essential features of core cloud primitives such as gateways, data stores, message brokers, and function runtimes.

This evolution reduces application code complexity and, in many cases, eliminates the need for custom functions. This shift from FaaS to NoFaaS (no fuss, implying simplicity) is just beginning, with insightful talks and code examples on GitHub. Next, I will explore the emergence of construct-rich cloud services within vertical multi-cloud services.

In the post-serverless cloud era, it's no longer enough to offer highly scalable cloud primitives like compute for containers and functions, storage services such as key/value stores, event stores, and relational databases, or networking primitives like load balancers. Post-serverless cloud services must be rich in developer constructs and offload much of the application plumbing. This goes beyond hyperscaling a generic cloud service for a broad user base; it involves deep specialization and exposing advanced constructs to more demanding users.

Hyperscalers like AWS, Azure, GCP, and others, with their vast range of services and extensive user bases, are well-positioned to identify new user needs and constructs. However, providing these more granular developer constructs results in increased complexity. Each new construct in every service requires a deep learning curve with its specifics for effective utilization. Thus, in the post-serverless era, we will observe the rise of vertical multi-cloud services that excel in one area. This shift represents a move toward hyperspecialization of cloud services.

Consider Confluent Cloud as an example. While all major hyperscalers (AWS, Azure, GCP, etc.) offer Kafka services, none match the developer experience and constructs provided by Confluent Cloud. With its Kafka brokers, numerous Kafka connectors, integrated schema registry, Flink processing, data governance, tracing, and message browser, Confluent Cloud delivers the most construct-rich and specialized Kafka service, surpassing what hyperscalers offer.

This trend is not isolated; numerous examples include MongoDB Atlas versus DocumentDB, GitLab versus CodeCommit, Databricks versus EMR, Redis Labs versus ElastiCache, etc. Beyond established cloud companies, a new wave of startups is emerging, focusing on a single multi-cloud primitive (like specialized compute, storage, networking, build-pipeline, monitoring, etc.) and enriching it with developer constructs to offer a unique value proposition. Here are some cloud services hyperspecializing in a single open-source technology, aiming to provide a construct-rich experience and attract users away from hyperscalers:

This list represents a fraction of a growing ecosystem of hyperspecialized vertical multi-cloud services built atop core cloud primitives offered by hyperscalers. They compete by providing a comprehensive set of programmable constructs and an enhanced developer experience.

Serverless cloud services hyperspecializing in one thing with rich developer constructs

Once this transition is completed, bare-bones cloud services without rich constructs, even serverless ones, will seem like outdated on-premise software. A storage service must stream changes like DynamoDB; a message broker should include EventBridge-like constructs for event-driven routing, filtering, and endpoint invocation with retries and DLQs; a pub/sub system should offer message batching, splitting, filtering, transforming, and enriching.

Ultimately, while hyperscalers expand horizontally with an increasing array of services, hyperspecializers grow vertically, offering a single, best-in-class service enriched with constructs, forming an ecosystem of vertical multi-cloud services. The future of cloud service competition will pivot from infrastructure primitives to a duo of core cloud primitives and developer-centric constructs.

Cloud constructs increasingly blur the boundaries between application and infrastructure responsibilities. The next evolution is the "shift left" of cloud automation, integrating application and automation code in terms of tools and responsibilities. Let's examine how this transition is unfolding.

The first generation of cloud infrastructure management was defined by Infrastructure as Code (IaC), a pattern that emerged to simplify the provisioning and management of infrastructure. This approach is built on the trends set by the commoditization of virtualization in cloud computing.

The initial IaC tools introduced new domain-specific languages (DSLs) dedicated to creating, configuring, and managing cloud resources in a repeatable manner. Tools like Chef, Ansible, Puppet, and Terraform led this phase. These tools, leveraging declarative languages, allowed operations teams to define the infrastructure's desired state in code, abstracting underlying complexities.

However, as the cloud landscape transitions from low-level coarse-grained infrastructure to more developer-centric programmable finer-grained constructs, a trend toward using existing general-purpose programming languages for defining these constructs is emerging. New entrants like Pulumi and the AWS Cloud Development Kit (CDK) are at the forefront of this wave, supporting languages such as TypeScript, Python, C#, Go, and Java.

The shift to general-purpose languages is driven by the need to overcome the limitations of declarative languages, which lack expressiveness and flexibility for programmatically defining cloud constructs, and by the shift-left of configuring cloud constructs responsibilities from operations to developers. Unlike the static nature of declarative languages suited for low-level static infrastructure, general-purpose languages enable developers to define dynamic, logic-driven cloud constructs, achieving a closer alignment with application code.
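A minimal sketch of that contrast, in plain Python rather than any real IaC tool's API: the declarative style fixes the resource list up front, while the programmatic style derives it from ordinary logic, so the shape of the infrastructure can depend on conditions and loops.

```python
# Hedged sketch (no real IaC tool's API): a static, Terraform-like declaration
# versus logic-driven resource definitions in a general-purpose language.

# Declarative style: one fixed block per resource, no loops or conditionals.
STATIC_DECLARATION = [
    {"type": "queue", "name": "orders-dev"},
    {"type": "queue", "name": "orders-prod", "dlq": True},
]

# Programmatic style: resources generated from ordinary code.
def make_queues(environments: list[str]) -> list[dict]:
    resources = []
    for env in environments:
        queue = {"type": "queue", "name": f"orders-{env}"}
        if env == "prod":  # prod-only concerns expressed as plain logic
            queue["dlq"] = True
        resources.append(queue)
    return resources

# Both approaches describe the same desired state for dev and prod.
assert make_queues(["dev", "prod"]) == STATIC_DECLARATION
```

Tools like Pulumi and the AWS CDK apply this idea to real cloud resources: the program's output is a desired-state graph that the engine then reconciles against the cloud.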

Shifting-left of application composition from infrastructure to developer teams

Post-serverless cloud developers need to implement business logic by creating functions and microservices, but also to compose them together using programmable cloud constructs. This shapes a broader set of developer responsibilities to develop and compose cloud applications. For example, code with business logic in a Lambda function would also need routing, filtering, and request transformation configurations in API Gateway.

Another Lambda function may need a DynamoDB streaming configuration to stream specific data changes, plus EventBridge routing, filtering, and enrichment configurations.

A third application may have most of its orchestration logic expressed as a Step Function, where the Lambda code is only a small task. A developer, not a platform engineer or Ops member, can compose these units of code together. Tools such as Pulumi, the AWS CDK, and others that enable a developer to use the language of their choice to implement a function, and to use the same language to compose its interaction with the cloud environment, are best suited for this era.

Platform teams still can use declarative languages, such as Terraform, to govern, secure, monitor, and enable teams in the cloud environments, but developer-focused constructs, combined with developer-focused cloud automation languages, will shift left the cloud constructs and make developer self-service in the cloud a reality.

The transition from DSL to general-purpose languages marks a significant milestone in the evolution of IaC. It acknowledges the transition of application code into cloud constructs, which often require a deeper developer control of the resources for application needs. This shift represents a maturation of IaC tools, which now need to cater to a broader spectrum of infrastructure orchestration needs, paving the way for more sophisticated, higher-level abstractions and tools.

The journey of infrastructure management will see a shift from static configurations to a more dynamic, code-driven approach. This evolution hasn't stopped at Infrastructure as Code; it is transcending into a more nuanced realm known as Composition as Code. This paradigm further blurs the lines between application code and infrastructure, leading to more streamlined, efficient, and developer-friendly practices.

In summarizing the trends and their reinforcing effects, we're observing an increasing integration of programming constructs into cloud services. Every compute service will integrate CI/CD pipelines; databases will provide HTTP access from the edge and emit change events; message brokers will enhance capabilities with filtering, routing, idempotency, transformations, DLQs, etc.

Infrastructure services are evolving into serverless APIs, infrastructure inferred from code (IfC), framework-defined infrastructure, or explicitly composed by developers (CaC). This evolution leads to smaller functions and sometimes to NoFaaS pattern, paving the way for hyperspecialized, developer-first vertical multi-cloud services. These services will offer infrastructure as programmable APIs, enabling developers to seamlessly merge their applications using their preferred programming language.

The shift-left of application composition using cloud services will increasingly blend with application programming, transforming microservices from an architectural style to an organizational one. A microservice will no longer be just a single deployment unit or process boundary but a composition of functions, containers, and cloud constructs, all implemented and glued together in a single language chosen by the developer. The future is shaping to be hyperspecialized and focused on the developer-first cloud.


Cloud Computing Security: Start with a 'North Star' – ITPro Today

Cloud computing has followed a similar journey to other introductions of popular technology: Adopt first, secure later. Cloud transformation has largely been enabled by IT functions at the request of the business, with security functions often taking a backseat. In some organizations, this has been due to politics and blind faith in the cloud services providers (CSPs), e.g., AWS, Microsoft, and GCP.

In others, it has been because security functions only knew and understood on-premises deployments and simply didn't have the knowledge and capability to securely adapt to cloud or hybrid architectures and translate policies and processes to the cloud. For lucky organizations, this has only led to stalled migrations while the security and IT organizations played catch up. For unlucky organizations, this has led to breaches, business disruption, and loss of data.

Related: What Is Cloud Security?

Cloud security can be complex. More often than not, however, the failures are ridiculously simple, the misconfigured S3 bucket being a prime example. It reached a point where malefactors could simply look for misconfigured S3 buckets to steal data; there was no need to launch an actual attack.
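The S3 example above is the kind of check that CSPM tools automate. The sketch below is a minimal, self-contained illustration: the bucket dictionaries are a simplified stand-in for what the AWS ACL and public-access-block APIs actually return, not the real schema, though the `AllUsers`/`AuthenticatedUsers` grantee URIs are the ones AWS uses for public access:

```python
# Minimal sketch of an automated misconfiguration check.
# The bucket descriptions are simplified stand-ins for real AWS API
# responses; only the grantee group URIs match actual AWS values.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def is_publicly_readable(bucket):
    """Flag a bucket whose ACL grants read access to everyone."""
    if bucket.get("block_public_access", False):
        return False  # an account/bucket-level guardrail overrides ACLs
    return any(
        grant["grantee"] in PUBLIC_GRANTEES
        and grant["permission"] in ("READ", "FULL_CONTROL")
        for grant in bucket.get("acl_grants", [])
    )

buckets = [
    {"name": "public-assets",
     "acl_grants": [{"grantee": "http://acs.amazonaws.com/groups/global/AllUsers",
                     "permission": "READ"}]},
    {"name": "customer-data",
     "block_public_access": True,
     "acl_grants": [{"grantee": "http://acs.amazonaws.com/groups/global/AllUsers",
                     "permission": "READ"}]},
]

exposed = [b["name"] for b in buckets if is_publicly_readable(b)]
print(exposed)  # only the bucket without the guardrail is flagged
```

The design point is the same one the article makes: the dangerous misconfiguration is trivially detectable, which is exactly why attackers and defenders alike can scan for it at scale.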

It's time for organizations to take a step back and improve cloud security, and the best way to do this is to put security at the core of cloud transformations, rather than adopting the technology first and asking security questions later. Here are four steps to course correct and implement a security-centric cloud strategy:

Related: Cloud Computing Predictions 2024: What to Expect From FinOps, AI

For multi-cloud users, there is one other aspect of cloud security to consider. Most CSPs are separate businesses, and their services don't work with other CSPs. Rather than functioning like internet service providers (ISPs), where one provider lets you access the entire internet and not just the sites the ISP owns, CSPs operate in silos, with limited interoperability with their counterparts (e.g., AWS can't manage Azure workloads, security, and services, and vice versa). This is problematic for customers because, once more than one cloud provider is added to the infrastructure, the efficacy of managing cloud operations and cloud security starts to diminish rapidly. Each time another CSP is added to an organization's environment, the attack surface grows substantially unless it is secured appropriately.

It's up to each company to take steps to become more secure in multi-cloud environments. In addition to developing and executing a strong security strategy, they also must consider using third-party applications and platforms such as cloud-native application protection platforms (CNAPPs), cloud security posture management (CSPM), infrastructure as code (IaC), and secrets management to provide the connective tissue between CSPs in hybrid or multi-cloud environments. Taking this vital step will increase security visibility, posture management, and operational efficiency to ensure the security and business results outlined at the start of the cloud security journey.

It should be noted that a cloud security strategy, like any other form of security, needs to be a "living" plan. The threat landscape and business needs change so fast that what is helpful today may not be helpful tomorrow. To stay in step with your organization's desired state of security, periodically revisit cloud security strategies to confirm they are delivering the desired benefits, and make adjustments when they are not.

Cloud computing has transformed organizations of all types. Adopting a strategy for securing this new environment will not only allow security to catch up to technology adoption, it will also dramatically improve the ROI of cloud computing.

Ed Lewis is Secure Cloud Transformation Leader at Optiv.

Read this article:

Cloud Computing Security: Start with a 'North Star' - ITPro Today

The Future of Cloud Computing in Business Operations – Data Science Central

The digital era has witnessed the remarkable evolution of cloud computing, transforming it into a cornerstone of modern business operations. This technology, which began as a simple concept of centralized data storage, has now evolved into a complex and dynamic ecosystem, enabling businesses to operate more efficiently and effectively than ever before. The Future of Cloud Computing holds unparalleled potential, promising to revolutionize the way companies operate, innovate, and compete in the global market.

Cloud computing refers to the delivery of various services over the Internet, including data storage, servers, databases, networking, and software. Rather than owning their computing infrastructure or data centers, companies can rent access to anything from applications to storage from a cloud service provider.

Cloud computing has revolutionized the way businesses operate, offering a plethora of advantages that enhance efficiency, flexibility, and scalability. In this discussion, we'll delve into the key benefits of cloud computing, explaining each in simple terms and underlining their significance in today's business landscape.

Cloud computing significantly cuts down on the capital cost associated with purchasing hardware and software, especially in sectors like healthcare. It's an economical alternative to owning and maintaining extensive IT infrastructure, allowing businesses, including those in the healthcare sector, to save on setup and maintenance costs. This is particularly beneficial for cloud computing in the healthcare industry, where resources can instead be allocated toward patient care and medical research.

The ability to scale resources elastically with cloud computing is akin to having a flexible and adaptable IT infrastructure. Businesses can efficiently scale up or down their IT resources based on current demand, ensuring optimal utilization and avoiding wastage.

Cloud services are hosted on a network of secure, high-performance data centers globally, offering superior performance over traditional single corporate data centers. This global network ensures reduced latency, better application performance, and economies of scale.

Cloud computing facilitates a swift and agile business environment. Companies can quickly roll out new applications or resources, empowering them to respond swiftly to market changes and opportunities.

The efficiency and speed offered by cloud computing translate into enhanced productivity. Reduced network latency ensures applications and services run smoothly, enabling teams to achieve more in less time.

Cloud computing enhances collaboration by enabling team members to share and work on data and files simultaneously from any location. This virtual collaboration space is crucial for businesses with remote teams and global operations.

Here, we explore the transformative role of cloud computing in business, focusing on 7 key points that forecast its future impact and potential in streamlining and innovating operational landscapes.

In the Future of Cloud Computing, handling enormous amounts of data will become more critical than ever. Businesses of all sizes generate data at unprecedented rates. From customer interactions to transaction records, every piece of data is a potential goldmine of insights. Cloud computing steps in as the ideal solution to manage this surge efficiently.

Cloud storage provides a scalable and flexible way to store and access vast datasets. As we move forward, cloud providers will likely offer more tailored storage solutions, catering to different business needs. Whether it's for high-frequency access or long-term archiving, cloud storage can adapt to various requirements.

Another significant aspect of data management in the Future of Cloud Computing is real-time data processing. Businesses will rely on cloud computing not just for storage, but also for the immediate processing and analysis of data. This capability allows for quicker decision-making, a crucial factor in maintaining a competitive edge.

One of the most transformative impacts of cloud computing is its ability to transcend geographical boundaries. In the Future of Cloud Computing, remote and global teams can collaborate as if they were in the same room. Cloud-based tools and platforms allow team members from different parts of the world to work on projects simultaneously, share files instantaneously, and communicate in real-time.

In the Future of Cloud Computing, we can expect a rise in virtual workspaces. These digital environments simulate physical offices, providing a space where remote workers can feel connected and engaged. They offer features like virtual meeting rooms, shared digital whiteboards, and social areas, replicating the office experience in a digital realm.

Cloud computing does more than just streamline operations; it also opens doors to innovation. With cloud resources, businesses can experiment with new ideas without significant upfront investment in infrastructure. This flexibility encourages creativity and risk-taking, which are essential for innovation.

Cloud computing accelerates the product development cycle. Teams can quickly set up and dismantle test environments, prototype more efficiently, and bring products to market faster. This agility gives businesses a significant advantage in rapidly evolving markets.

The landscape of cloud computing is rapidly evolving, with new trends constantly emerging to redefine how businesses leverage this technology. In the context of the future of cloud computing, 3 key trends stand out for their potential to significantly shape the industry. Understanding these trends is crucial for businesses looking to stay competitive and innovative.

Artificial Intelligence (AI) and Machine Learning (ML) are becoming increasingly integral to cloud computing. This integration is revolutionizing how cloud services are delivered and utilized. AI algorithms are enhancing the efficiency of cloud platforms, offering smarter data analytics, automating routine tasks, and providing more personalized user experiences. For instance, cloud-based AI services can analyze vast amounts of data to predict market trends, customer behavior, or potential system failures, offering invaluable insights for businesses.

This integration not only boosts the performance and scalability of cloud solutions but also opens up new avenues for innovation across various sectors.

As cloud computing becomes more prevalent, the focus on security and compliance is intensifying. The increasing frequency and sophistication of cyber threats make robust cloud security a top priority for businesses. In response, cloud service providers are investing heavily in advanced security measures, such as enhanced encryption techniques, identity and access management (IAM), and AI-powered threat detection systems.

Furthermore, with regulations like GDPR and CCPA in place, compliance has become a critical aspect of cloud services. The future of cloud computing will likely witness a surge in cloud solutions that are not only secure but also compliant with various global and industry-specific regulations. This trend ensures that businesses can confidently and safely leverage the cloud while adhering to legal and ethical standards.

Sustainability is a growing concern in the tech world, and cloud computing is no exception. There is an increasing trend towards green cloud computing, focusing on reducing the environmental impact of cloud services. This involves optimizing data centers for energy efficiency, using renewable energy sources, and implementing more sustainable operational practices.

It will likely see a stronger emphasis on sustainability as businesses and consumers become more environmentally conscious. Cloud providers who prioritize and implement eco-friendly practices will not only contribute to a healthier planet but also appeal to a growing segment of environmentally-aware customers.

The future of cloud computing is bright and offers a plethora of opportunities for businesses to grow and evolve. By staying informed and adapting to these changes, companies can leverage cloud computing to gain a competitive edge in the market.

Remember, the future of cloud computing isn't just about technology; it's about how businesses can harness this technology to drive innovation, efficiency, and growth.

For businesses aiming to thrive in the ever-changing digital world, embracing the advancements in cloud computing is not just a choice but a necessity. Staying updated and adaptable will be key to harnessing the power of cloud computing for business success in the years to come.

Originally posted here:

The Future of Cloud Computing in Business Operations - Data Science Central

Global $83.7 Bn Cloud Computing Management and Optimization Market to 2030 with IT and Telecommunications … – PR Newswire

DUBLIN, Jan. 23, 2024 /PRNewswire/ -- The "Global Cloud Computing Management and Optimization Market 2023 - 2030 by Types, Applications - Partner & Customer Ecosystem Competitive Index & Regional Footprints" report has been added to ResearchAndMarkets.com's offering.

The Cloud Computing Management and Optimization Market size is estimated to grow from USD 17.6 Billion in 2022 to reach USD 83.7 Billion by 2030, growing at a CAGR of 21.7% during the forecast period from 2023 to 2030.

The Adoption of Cloud-Based Solutions Is Driving Growth in the Cloud Computing Management and Optimization Market

Businesses are migrating their operations to cloud-based ecosystems because the cloud offers a number of benefits, such as scalability, flexibility, and cost savings. A growing number of companies, including SMEs and large-scale enterprises, are adopting cloud computing, which will lead to an increase in demand for cloud computing management and optimization solutions.

Cloud computing environments are becoming increasingly complex as businesses adopt a variety of cloud services from different providers. This complexity can make it difficult for businesses to manage their cloud costs and performance. Cloud computing management and optimization solutions can help businesses simplify their cloud environments and optimize their costs and performance. Cloud computing can be a cost-effective way for businesses to access IT resources.

However, businesses can still incur significant costs if they do not manage their cloud usage effectively. Cloud computing management and optimization solutions can help businesses to track their cloud usage and identify opportunities to optimize their costs. The cloud computing industry is constantly evolving, with the emergence of new technologies, such as artificial intelligence and machine learning. These new technologies can be used to improve the efficiency and effectiveness of cloud computing management and optimization solutions.

The IT and Telecommunications industries hold the highest market share in the Cloud Computing Management and Optimization Market

The IT and Telecommunications industries hold the highest market share in the Cloud Computing Management and Optimization Market in 2022, due to their intrinsic reliance on advanced technology solutions and their pivotal role in driving digital transformation across various sectors. In the IT industry, cloud computing has become a cornerstone for delivering software, platforms, and infrastructure services, enabling organizations to enhance agility, scalability, and operational efficiency.

As IT companies transition their operations to the cloud, the need for effective management and optimization of cloud resources becomes paramount to ensure optimal performance, cost control, and resource allocation. Cloud management and optimization solutions enable IT enterprises to streamline provisioning, monitor workloads, automate processes, and maintain stringent security protocols.

Furthermore, the Telecommunications sector has embraced cloud computing to modernize and expand its network infrastructure, offer innovative communication services, and adapt to the demands of an interconnected world. Cloud-based solutions empower telecom companies to efficiently manage network resources, deliver seamless customer experiences, and explore new revenue streams.

In this context, cloud computing management and optimization are essential for maintaining network reliability, ensuring data privacy, and dynamically scaling resources to meet fluctuating demand. The complex and dynamic nature of both IT and Telecommunications operations necessitates sophisticated tools and strategies for cloud resource management, making these industries prime contributors to the Cloud Computing Management and Optimization Market.

Regional Insight: North America dominated the Cloud Computing Management and Optimization Market during the forecast period.

North America dominated the Cloud Computing Management and Optimization Market during the forecast period. The United States and Canada, which are at the forefront of technological development, have continuously adopted cloud computing, strengthening North America's remarkable position as market leader. The strong presence of major companies like Adobe, Salesforce, Oracle, AWS, Google, and IBM across the region's wide geography provides a foundation for this rise. With their cutting-edge solutions, these major players make a significant impact on adoption and innovation.

The region's commitment to technical advancement also serves as another indication of its dominance. Continuous improvements in a number of technologies are transforming the cloud computing industry, and North America is recognized as a hub for important developments.

As a result, organizations and enterprises in North America are pushed to the forefront of cloud optimization and administration, utilizing the full range of technologies and expertise provided by both local and international industry experts. Strong vendor presence, widespread acceptance, and constant technological innovation put North America in the lead to capture the highest market share during the forecast period.

Major Classifications are as follows:

Cloud Computing Management and Optimization Market, Type of Solutions

Cloud Computing Management and Optimization Market, By Deployment Models

Cloud Computing Management and Optimization Market, By Organization Size

Cloud Computing Management and Optimization Market, By Cloud Service Models

Cloud Computing Management and Optimization Market, By Technologies

Cloud Computing Management and Optimization Market, By Industries

Cloud Computing Management and Optimization Market, By Geography

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/bx3846

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Media Contact:

Research and Markets Laura Wood, Senior Manager [emailprotected] For E.S.T Office Hours Call +1-917-300-0470 For U.S./CAN Toll Free Call +1-800-526-8630 For GMT Office Hours Call +353-1-416-8900 U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg

SOURCE Research and Markets

Read this article:

Global $83.7 Bn Cloud Computing Management and Optimization Market to 2030 with IT and Telecommunications ... - PR Newswire

AWS to invest $15bn in cloud computing in Japan – DatacenterDynamics

Amazon Web Services (AWS) is planning to invest 2.26 trillion yen ($15.24 billion) in expanding its cloud computing infrastructure in Japan by 2027.

As part of this investment, the company will seek to expand its data center facilities in Tokyo and Osaka.

The cloud giant previously invested 1.51 trillion yen (~$10.2bn) in the country between 2011 and 2022, which works out to just under $1bn per year. The new announcement will see this increase to more than $5bn a year for the next three years.

"The adoption of digital technology has become a source of a country's competitiveness," said Takuya Hirai, former digital minister and current chair of the headquarters for the promotion of a digital society in Japan's Liberal Democratic Party.

"The development of digital infrastructure in Japan is key to strengthening the country's industrial competitiveness, and data centers play an important role to this end. It promotes the use of important technologies such as AI [artificial intelligence] and improves the capabilities of research and development in Japan."

The digital infrastructure in the country is also the backbone of AWS' artificial intelligence solutions. AWS provides generative AI services to Japanese customers including Asahi Group, Marubeni, and Nomura Holdings.

AWS first entered Japan in 2009. The company launched its first cloud region in the country in 2011 in Tokyo, and another in Osaka in 2021.

Amazon's Bedrock AI offering was made available in Tokyo in October 2023. The company also invested $100m in a generative AI innovation center in June 2023.

It is currently estimated that the latest investment will contribute 5.57 trillion yen (~$37.6bn) to Japan's GDP and support an average of 30,500 full-time jobs in Japanese businesses each year.

Japan's government is seeking to catch up in AI development. Prime Minister Fumio Kishida has met with the heads of OpenAI and Nvidia in the past year to discuss AI regulation and infrastructure.

In December 2023, Minister Ken Saito announced the government would double down on its pledge to support the domestic chip manufacturing industry.

Follow this link:

AWS to invest $15bn in cloud computing in Japan - DatacenterDynamics

Amazon’s AWS to invest $15 billion to expand cloud computing in Japan – Yahoo! Voices

TOKYO (Reuters) - Amazon Web Services (AWS) said on Friday it plans to invest 2.26 trillion yen ($15.24 billion) in Japan by 2027 to expand cloud computing infrastructure that serves as a backbone for artificial intelligence (AI) services.

The Amazon.com unit is spending to expand facilities in the metropolises of Tokyo and Osaka to meet growing customer demand, it said in a statement.

That comes on top of 1.51 trillion yen spent from 2011 to 2022 to build up cloud capacity in Japan, AWS said. The company offers generative AI services to Japanese corporate customers including Asahi Group, Marubeni and Nomura Holdings, it said.

The investment comes as Japan's government and corporate sector race to catch up in AI development. Prime Minister Fumio Kishida met with the heads of ChatGPT creator OpenAI and advanced chipmaker Nvidia in the past year to discuss AI regulation and infrastructure.

($1 = 148.2700 yen)

(This story has been refiled to add dropped words 'creator OpenAI' after 'ChatGPT', in paragraph 4)

(Reporting by Rocky Swift; Editing by Muralikumar Anantharaman and Christopher Cushing)

Read the original post:

Amazon's AWS to invest $15 billion to expand cloud computing in Japan - Yahoo! Voices

Beyond Cloud Nine: 3 Cutting-Edge Tech Stocks Shaping the Future of Computing – InvestorPlace

Cloud computing has helped millions of companies save time and money. Businesses don't have to worry about hardware costs and can access data quickly. Also, cloud computing companies offer cybersecurity resources to keep data safe from hackers.

Many stocks in the sector have outperformed the market over several years and can generate more gains in the years ahead. Therefore, these cutting-edge tech stocks look poised to expand and shape the future of cloud computing.

ServiceNow(NYSE:NOW) boasts a high retention rate for its software and continues to attract customers with deep pockets. The company has over 7,700 customers and almost 2,000 of them haveannual contract values that exceed $1 million.

Further, NOW's remaining performance obligations are more than triple the company's Q3 revenue. The platform allows businesses to run more efficient help desks and streamline repetitive tasks with built-in chatbots. Also, ServiceNow offers high-level security to protect sensitive data.

Additionally, the company has been a reliable pick for investors who want to outperform the market. Shares are up by 74% over the past year and have gained 284% over the past five years. The stock is trading at a forward P/E ratio of 58. The company's net income growth can lead to a better valuation in the future. ServiceNow more than tripled its profits year over year (YOY) in the third quarter, and revenue grew at a nice 25% clip YOY.

Alphabet (NASDAQ:GOOG, NASDAQ:GOOGL) makes most of its revenue from advertising and cloud computing. Google Cloud has become a popular resource for business owners, boasting over 500,000 customers. Also, Alphabet stands at the forefront of AI, enhancing the tech giant's future product offerings.

Notably, the company's cloud segment remains a leading growth driver. Revenue for Google Cloud increased by 22.5% YOY in the third quarter. And Alphabet's entire business achieved 11% YOY revenue growth, an acceleration from the previous period.

Also, Google Cloud reported a profitable quarter, swinging from a $440 million net loss in Q3 2022 to $266 million in net income in Q3 2023. Alphabet investors' positive response to the news helped the stock rally by 57% over the past year. The stock has gained 163% over the past five years.

Alphabet currently trades at a forward P/E ratio of 22 and has a $1.8 trillion market cap. Finally, the company's vast advertising network gives it plenty of capital to reinvest in Google Cloud and its smaller business segments.

Datadog (NASDAQ:DDOG) helps companies improve their cybersecurity across multiple cloud computing solutions. Cloud spending is still in its early innings and is expected to reach $1 trillion in annual spending in 2026. The company is projected to have a $62 billion total addressable market (TAM) in that year.

Specifically, Datadog removes the silos and friction associated with keeping cloud applications safe from hackers. Over 26,000 customers use Datadog's software, including approximately 3,130 customers with annual contract values exceeding $100,000. The company's revenue growth over the trailing twelve months is currently 31%. Further, operating margins have improved significantly, helping the company secure a net profit in the third quarter.

In fact, DDOG has a good relationship with many cloud computing giants, including Alphabet. The two corporations expanded their partnership to close out 2023.

Investors have been rushing to accumulate Datadog stock in recent years. Shares have gained 68% over the past year and are up by 240% over the past five years. DDOG is still more than 35% removed from its all-time high. However, continued revenue growth and profit margin expansion can help the stock reclaim its all-time high.

On this date of publication, Marc Guberti held a long position in NOW. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Marc Guberti is a finance freelance writer at InvestorPlace.com who hosts the Breakthrough Success Podcast. He has contributed to several publications, including the U.S. News & World Report, Benzinga, and Joy Wallet.

Read the original post:

Beyond Cloud Nine: 3 Cutting-Edge Tech Stocks Shaping the Future of Computing - InvestorPlace