IBM Is Said to Consider Sale of Watson Health Amid Cloud Focus – Data Center Knowledge

Nico Grant and Tom Giles (Bloomberg) -- International Business Machines Corp. is considering a sale of its IBM Watson Health business, a person with knowledge of the matter said, a move that would help newly appointed Chief Executive Officer Arvind Krishna focus on faster-growing cloud computing operations.

Deliberations are at a very early stage and the company may opt not to pursue a deal, said the person, who asked not to be identified discussing private talks. IBM is exploring a range of alternatives, from a sale to a private-equity firm to a merger with a blank-check company, according to the Wall Street Journal, which earlier Thursday reported the possibility of a deal.

Related: Why IBM Is Untethering Watson AI Software from Its Cloud

IBM has been trying to boost its share of revenue from hybrid-cloud software and services, which let customers store data in private servers and on multiple public clouds, including those of rivals Amazon.com Inc. and Microsoft Corp. IBM agreed in 2018 to buy Red Hat for $34 billion to boost this effort.

As a testament to the company's bet on the cloud, Krishna said in October that he would spin off IBM's managed infrastructure services unit into a separate publicly traded company. The division, currently part of the Global Technology Services division, handles day-to-day infrastructure service operations, like managing client data centers and traditional information-technology support for installing, repairing and operating equipment. While the unit accounts for about a quarter of IBM's sales and staff, it has seen business shrink as customers embraced the shift to the cloud, and many clients delayed infrastructure upgrades during the pandemic. The spinoff is scheduled to be completed by the end of 2021.

Related: IBM Watson AI to Help CBRE Manage Client Data Centers

Offloading IBM Watson Health, which helps health care providers manage data, would further Krishna's efforts to streamline the company. The unit generates about $1 billion of annual revenue and isn't profitable, according to the Journal, which cited people it didn't identify. An IBM representative declined to comment.

IBM embarked on Watson with lofty goals, such as revolutionizing health care for cancer patients, but many of its ambitions haven't panned out and some customers have complained that the products didn't match the company's hype. IBM has scaled back its Watson ambitions, including through job cuts last year.

IBM Explores Sale of IBM Watson Health – The Wall Street Journal

International Business Machines Corp. is exploring a potential sale of its IBM Watson Health business, according to people familiar with the matter, as the technology giant's new chief executive moves to streamline the company and become more competitive in cloud computing.

IBM is studying alternatives for the unit that could include a sale to a private-equity firm or industry player or a merger with a blank-check company, the people said. The unit, which employs artificial intelligence to help hospitals, insurers and drugmakers manage their data, has roughly $1 billion in annual revenue and isn't currently profitable, the people said.

Its brands include Merge Healthcare, which analyzes mammograms and MRIs; Phytel, which assists with patient communications; and Truven Health Analytics, which analyzes complex healthcare data.

It isn't clear how much the business might fetch in a sale, and there may not be one.

IBM, with a market value of $108 billion, has been left behind as cloud-computing rivals Microsoft Corp. and Amazon.com Inc. soar to valuations more than 10 times greater. The Armonk, N.Y., company has said it's focused on boosting its hybrid-cloud operations while exiting some unrelated businesses.

Colocation on the edge: Why regional data centres and hybrid cloud can maximise performance – Cloud Tech

A year or so ago there were over 500 hyperscale data centres in existence worldwide and a further 170 or more in the pipeline, according to Synergy Research. But the explosion in cloud adoption, public and private, is not only driving demand for hyperscale data centres. Fuelled by the IoT and the arrival of 5G, the ongoing decentralisation of the cloud is also contributing hugely to the growing shift towards distributed edge computing.

Edge cloud environments are now pivotal in extending the cloud down to the local level. They enable much of the data processing, storage, control and management of local applications to take place much closer to users, machines and devices. This significantly improves latency and optimises application responsiveness, maximising enterprise productivity, efficiency, competitive advantage, and user and customer experience. Low latency also underpins the future availability and performance potential of 5G mobile network coverage; super-fast streaming video for content delivery providers; real-time cloud gaming; real-time AI and machine learning/deep learning decision-making in industrial automation and medical environments; pinpoint control of driverless vehicles; and much more.
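
To put the latency claim in rough perspective, propagation delay alone scales with distance. The sketch below uses the common rule of thumb that light in optical fibre covers roughly 200 km per millisecond; the site distances are illustrative assumptions, not measurements, and real-world latency adds routing, queuing and serialisation on top:

```python
# Back-of-envelope round-trip propagation delay over fibre.
# Light travels at roughly 2/3 of c in glass, i.e. ~200 km per millisecond.
C_FIBRE_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation delay only; ignores routing, queuing and serialisation."""
    return 2 * distance_km / C_FIBRE_KM_PER_MS

# Illustrative distances for an edge site, a regional facility and a
# far-off hyperscale data centre.
for label, km in [("edge site, 20 km", 20),
                  ("regional DC, 150 km", 150),
                  ("remote hyperscale DC, 1500 km", 1500)]:
    print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip")
```

Even this simplified model shows why sub-millisecond targets for industrial control or cloud gaming are unreachable from a distant hyperscale facility, regardless of how fast the equipment at either end is.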

For lower latency and greater agility, data centres must be able to rapidly provision and scale compute and storage resources exactly where they are needed, without risking IT security and resilience. At the same time, it is important to bear in mind that edge computing in edge data centres complements rather than competes with public cloud services.

Therefore, CIOs and developers reliant on the lowest latency possible must consider the best place to deploy and support new services as well as rethink the network architecture. In doing so, large enterprises, SMEs, and cloud and telecoms service providers will benefit from their data and applications being much closer to users and customers, with only less time-sensitive, non-mission-critical data being sent to the centralised public cloud for further analysis or archiving.

Apart from improved latency, the cost of backhauling all data to one or two large hyperscale data centres can be significantly reduced by keeping it local. High-volume data transmission costs can be enormous, as in the case of autonomous vehicles.
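
The backhaul-cost argument can be made concrete with a minimal sketch. Every figure here (the daily data volume, the 98 per cent edge reduction, and the egress price per GB) is an invented assumption purely for illustration:

```python
# Illustrative (made-up) numbers: compare backhauling all raw data to a
# central cloud versus filtering at the edge and shipping only summaries.
daily_raw_gb = 4000.0       # e.g. a vehicle fleet generating raw telemetry
edge_reduction = 0.98       # fraction of data kept local after edge processing
egress_cost_per_gb = 0.05   # assumed transit/egress price, USD

central_cost = daily_raw_gb * egress_cost_per_gb
edge_cost = daily_raw_gb * (1 - edge_reduction) * egress_cost_per_gb

print(f"backhaul everything:            ${central_cost:.2f}/day")
print(f"process at edge, ship summary:  ${edge_cost:.2f}/day")
```

Under these assumptions the local-processing path cuts daily transit spend by a factor of fifty; the exact ratio depends entirely on how much of the raw stream genuinely needs to leave the edge.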

In response to new and growing market requirements, a more regionalised edge data centre colocation solution has become necessary. This directly addresses the latency issues and data transit costs that typically occur with centralised cloud business models overly reliant on data centres in far-off locations at the other end of the country or even further afield. Edge colocation facilities are purpose-designed to fill the considerable gaps between modular micro (unmanned) data centres located at the very edge of the network (for example, next to mobile cell towers, on factory floors and in hospital wards) and the centralised hyperscale ones.

However, optimising a best-of-both-worlds approach between public and local edge private clouds requires data centres strategically located close to regional internet exchanges, as well as diverse onsite carrier fibre connectivity. Hybrid architectures combining public, private and perhaps on-premises legacy IT will also be required, which often creates complex engineering challenges.

Application migrations will dictate the hybrid strategy, and one size does not fit all. Building the business case and the preparation work can be challenging when considering which applications will be placed in the edge data centre and which in the hyperscale data centre; how long it will take to migrate all the applications to the new infrastructure; the skills and experience available within the IT department; whether any remaining on-premises legacy IT infrastructure needs to be accommodated; and the software required for managing all environments within a hybrid implementation.

With the above in mind, the level of on-site engineering competence available at regional colocation sites will be very important. Connectivity directly into public cloud provider infrastructure via on-site gateways is another factor, along with the flexibility to carry out pre-production testing in the data centre to ensure everything works prior to launching.

The need for speed in terms of achieving low latency connectivity along with greater bandwidth and the benefits of reduced data transit costs must not be allowed to distract from the fundamentals of colocation: Continuous 24/7 data and storage systems availability through the provision of secure and resilient critical infrastructure.

It is wise to check physical and cyber security, forward power availability, the types of cooling systems used and overall energy efficiency (PUE). Use of 100 per cent renewably sourced power should be a given by now, but also look at how else a potential data centre provider is addressing sustainability. Finally, request proof of uptime service record, proven certifiable security and operational credentials, DR and business continuity contingencies, and the ability to provide end-to-end server migration and installation services.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? The Data Centre Congress, 4th March 2021, is a free virtual event exploring the world of data centres. Learn more here and book your free ticket: https://datacentrecongress.com/

Global Market Analysis Healthcare Cloud Computing Industry by Size, Share, Growth, Trends and Forecast 2021-2026 – Express Keeper

The latest report on the Healthcare Cloud Computing market by In4Research provides a brief overview of the industry along with the product definition and market scope. The sections following the introductory chapter provide an in-depth study of the Healthcare Cloud Computing market based on extensive research analysis. Along with the market dynamics, the report also presents a comprehensive analysis of the market covering the supply and demand forces.

Also, this report offers key drivers that propel the growth in the global Healthcare Cloud Computing market. These insights help market players in devising strategies to gain market presence. The research also outlined restraints of the market. Insights on opportunities are mentioned to assist market players in taking further steps by determining the potential in untapped regions.

Get a Sample Copy of Report @ https://www.in4research.com/sample-request/369

Major Segments of the Healthcare Cloud Computing Market are:

Market Breakdown by Applications:

Market Breakdown by Types:

The report uses market data sourced from 2015 to 2020, while the market analysis forecasts the market up to 2026. Various strategic developments have been studied to present the current Healthcare Cloud Computing market scenario.

The top 10 leading companies in the global Healthcare Cloud Computing Market are analyzed in the report along with their business overview, operations, financial analysis, SWOT profile and Healthcare Cloud Computing Market products and services.

The report also covers the key players operating in the Healthcare Cloud Computing industry.

Speak to Our Expert @ https://www.in4research.com/speak-to-analyst/369

The report scrutinizes different business approaches and frameworks that pave the way for success in businesses. The report used expert techniques for analyzing the Healthcare Cloud Computing Market; it also offers an examination of the regional market. To make the report more potent and easier to understand, it consists of infographics and diagrams. Furthermore, it has different policies and development plans which are presented in summary. It analyzes the technical barriers, other issues, and cost-effectiveness affecting the market.

Additionally, the Healthcare Cloud Computing Market report summarises the market's classifications, definitions, and applications. It contains a comprehensive investigation of aspects such as opportunities, restraints, drivers, challenges and risks, and major micro-markets. Furthermore, the report divides the Healthcare Cloud Computing Market into several segments and sub-segments, along with the past, current, and forecast growth trends for each segment and sub-segment covered in the report.

Key Influence of the Healthcare Cloud Computing Market:

To Buy the Full Report, Connect with us at https://www.in4research.com/buy-now/369

About In4Research

In4Research is a provider of world-class market research reports, customized solutions and consulting services, and high-quality market intelligence that firmly believes in empowering its clients' success in growing or improving their business. We combine a distinctive package of research reports and consulting services, global reach, and in-depth expertise in markets such as Chemicals and Materials, Food and Beverage, and Energy and Power that cannot be matched by our competitors. Our focus is on providing knowledge and solutions throughout the entire value chain of the industries we serve. We believe in providing premium high-quality insights at an affordable cost.

For More Details Contact Us:

Contact Name: Rohan

Email: [emailprotected]

Phone: +1 (407) 768-2028

Cloud Computing in Industrial IoT Market to Witness Robust Expansion Throughout – News.MarketSizeForecasters.com

The latest report on the 'Cloud Computing in Industrial IoT market' added by Market Study Report, LLC, provides a concise analysis of the industry size, revenue forecast and regional spectrum of this business. The report further illustrates the major challenges and the latest growth strategies adopted by key players who are a part of the dynamic competitive spectrum of this industry.

The research report on Cloud Computing in Industrial IoT market contains an in-depth assessment of the growth driving factors, opportunities, and restraints impacting the regional terrain and competitive arena of this business sphere.

Request a sample Report of Cloud Computing in Industrial IoT Market at: https://www.marketstudyreport.com/request-a-sample/3087474?utm_source=marketsizeforecasters.com&utm_medium=SK

As per the report, the market is expected to record a CAGR of XX% and grow substantially over the analysis period of 2020-2025.

Market fluctuations due to lockdowns imposed on account of the COVID-19 pandemic have heightened uncertainty. Beyond near-term revenue concerns, certain industries are likely to face challenges even after the pandemic.

Businesses across several sectors have reworked their budgets to restore profits in the ensuing years. A granular analysis of this business sphere will help organizations manage market uncertainty and make informed decisions by building contingency plans.

The study delivers a detailed assessment of several market segmentations to provide a better understanding of lucrative growth prospects of this market.

Pivotal pointers from the Cloud Computing in Industrial IoT market report:

Cloud Computing in Industrial IoT Market segmentations present in the report:

Regional segmentation: North America, Europe, Asia-Pacific, South America, Middle East and Africa

Product types:

Application spectrum:

Ask for Discount on Cloud Computing in Industrial IoT Market Report at:https://www.marketstudyreport.com/check-for-discount/3087474?utm_source=marketsizeforecasters.com&utm_medium=SK

Competitive outlook:

Cloud Computing in Industrial IoT Report Effectively Addresses the Below Queries.

For More Details On this Report: https://www.marketstudyreport.com/reports/global-cloud-computing-in-industrial-iot-market-2020-by-company-regions-type-and-application-forecast-to-2025

Related Reports:

1. Global Employment Background Check Software Market 2021 by Company, Regions, Type and Application, Forecast to 2026
Read More: https://www.marketstudyreport.com/reports/global-employment-background-check-software-market-2021-by-company-regions-type-and-application-forecast-to-2026

2. Global Employment Background Screening Software Market 2021 by Company, Regions, Type and Application, Forecast to 2026
Read More: https://www.marketstudyreport.com/reports/global-employment-background-screening-software-market-2021-by-company-regions-type-and-application-forecast-to-2026

Related Report : https://www.marketwatch.com/press-release/portable-optical-spectrum-analyzers-market-detailed-analysis-of-current-industry-figures-with-forecasts-growth-by-2025-2021-02-17

Contact Us: Corporate Sales, Market Study Report LLC | Phone: 1-302-273-0910 | Toll Free: 1-866-764-2150 | Email: [emailprotected]

Commencing a digital transformation journey with application modernization – Web Hosting | Cloud Computing | Datacenter | Domain News – Daily Host…

Automated processes help bridge the ever-widening gap between development and operations, which is where professional DevOps consulting services come in. These comprehensive approaches accelerate efficiency while adding value to your company. Let us now look at a hand-crafted list of the benefits offered by DevOps services companies:

Data security: Security needs to be strengthened at every step of application rebuilding, especially when you opt for re-evaluation. Security must be layered into every process to achieve a well-architected solution. Application modernization thus promises to eliminate data security hurdles, and data security is one of the greatest indicators of stability.

Business agility to expand horizons: The new wave of innovation cannot pass through a rigid application architecture, which limits business agility in no small measure. It is therefore important to pursue legacy application modernization, which ensures that the core system can adapt to the long list of modern problems. It is time to open the gates to a flood of innovation and expand horizons.

Building the cornerstone of satisfying customer experiences: The transition from an old monolith to a cloud platform isn't a cakewalk; it demands a great deal of expertise. You can tell that you have tapped the potential benefits if:

Opening taps of revenue streams: Legacy application modernization is all about unlocking revenue streams with utmost flexibility. With an established, updated system, you can add scores of customers to the loop of enhanced services, bringing innovation to the table. Furthermore, the advancement delivers operational excellence and enhanced agility to lower maintenance costs.

Painting a picture of digital transformation: The business landscape needs modernization. In an era pacing towards digital advancement, there is a need for transformation. If you want your business to hold the placard of 'committed to the future', then it is time for you to embrace digital transformation.

In a nutshell, legacy applications need modernization and immense expertise. At iauro, we aim to bring innovation to the table to mark a green flag of advancement. We believe in never resting on our laurels!

With our application modernization, we unleash its powers with brilliant technical heads. Our culture is founded on addressing your complex problems. We perform our operations with dedication and compassion. Reach out to us today; we're merely one tap away to bring you application modernization services.

Is cloud-native computing as influential as it's stacked up to be? [Status Report] – ZDNet

Making an automated mechanism for building and deploying software workloads onto cloud platforms was a superb idea. Since 2015, the task of building a business around it has, at least on paper, been tackled by the open-source community. Today, its champions are the names we recognize: Google, VMware, Microsoft, Amazon (if reluctantly), Red Hat, and by extension, IBM. Containerized workload automation must be a somewhat viable business, otherwise, those companies wouldn't have swallowed it up.

"Cloud-native computing" is a great idea that would have been poison for any single vendor to have claimed and defended just for itself. Revolutions have always been great (at least the successful ones) as long as someone else leads the charge, and you're not the one getting bloodied. The overall goal of cloud-native is to unseat a dynasty: to eject virtualization from its throne at the center of the data center economy, and in its place install workload automation. From a distance, the task resembles replacing a refrigerator with a washing machine.

Yet witness how Google swiftly and adeptly pulled off this masterstroke: At exactly the time that Docker, a workload deployment system, sought to leverage the gains from the movement it sparked, Google started its own movement at a campsite down the road. It took an in-house workload orchestration project (even its original name, Borg, delivers gorgeous irony); offered it up as an open-source initiative; stewarded its evolution into the raison d'être for workload portability; marginalized the Docker whale; infused Red Hat, VMware, to a lesser degree Microsoft, and kicking and screaming, Amazon with the workload portability bug; and cemented Kubernetes' place as the axis around which a software ecosystem revolves -- all without ever "pulling a Microsoft" or being accused of monopolizing a market, and all by leveraging other people's efforts.

Minimum investment, maximum payoff, clean shirts. A clinic on influencing an industry. If only all insurgencies could work like this.

But is the scale of this revolution broad enough to have genuinely changed the way we live and work? That's the type of question we've created Status Report to help us answer. It aims to take the ten most important categories for any technology's or platform's capability to influence its users' businesses and livelihoods, examine each one in isolation, score its positive and negative factors separately, then toss them all onto a graph to see if its capacity for change is as great as our proclivity for emotion. Through this cleansing process, we may be able to extrapolate pertinent, relevant facts about whether technology makes a genuine difference.

Today, when I refer to "cloud-native" technologies, there's a general presumption that I'm referring not just to "containerization" (another terrible word) but specifically to Kubernetes, and the ecosystem of open source components that comprise its "stack." The Linux Foundation maintains a project called the Cloud Native Computing Foundation, whose job is to nurture new technologies in this space, as well as promote and support all of them.

Three years ago, I was told emphatically that presuming containerization meant Docker or Kubernetes, or any one name, would be contrary to the spirit of the open-source movement. Openness, I was sternly corrected, was all about choice, and ensuring that customers are never again locked into one way of doing things. Now, of course, it's an all-Kubernetes stack all the time. The CNCF's accreditation program for IT professionals in the cloud-native space is called Certified Kubernetes Administrator. Its competition has been kicked to the curb, then stomped on and flattened, followed by being declared not only irrelevant but never all that relevant to begin with.

The original, and still true, meaning of containerization is the method of packaging any workload with all the tools and dependencies it needs so that it can run on any cloud platform without modification. Think of a camper who carries everything she needs in her backpack, as opposed to a traveler who carries a light briefcase but expects to stay in a hotel. (Actually, virtualization would be like a traveler who drags the entire hotel behind him on chains.)

The problem with the term "cloud-native" is that it implies a software workload stays in one place throughout its lifecycle. That's not what "cloud" is about. A cloud platform is an abstract plane whereby a workload is addressable through the network at all times, regardless of where it is. This is the actual value of the concept: It ensures isolation (for security), portability, and at the same time, accessibility. This reduces the cost of change for enterprises charged with maintaining it. Historically, IT has kept costs low by maintaining system configurations until their platforms can no longer sustain them, and usually beyond that time. "Cloud-native" flips the whole maintenance cost argument on its head: Ideally, it ends up costing organizations less to continually adapt their own workloads to new situations and requirements at a moderate pace, than to constantly pour more effort and capital into maintaining IT systems' stability.
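
The "maintenance cost flipped on its head" argument can be made concrete with a toy model. Every figure below is invented solely to illustrate the shape of the trade-off (steady adaptation versus cheap upkeep punctuated by expensive forced migrations); none is drawn from a real IT budget:

```python
# Toy model (all figures invented): a "stable then big-bang" strategy keeps a
# low run rate but pays a large step-change migration every few years, while a
# cloud-native strategy pays a higher, continuous adaptation cost.

def stable_then_migrate_cost(years: int, run_rate: float = 1.0,
                             migration: float = 8.0,
                             migrate_every: int = 5) -> float:
    """Cumulative cost: cheap upkeep plus periodic forced migrations."""
    return years * run_rate + migration * (years // migrate_every)

def continuous_adaptation_cost(years: int, run_rate: float = 1.5) -> float:
    """Cumulative cost: a steady, moderate pace of change."""
    return years * run_rate

horizon = 10
print(stable_then_migrate_cost(horizon))    # 10*1.0 + 8.0*2 = 26.0
print(continuous_adaptation_cost(horizon))  # 10*1.5 = 15.0
```

With these made-up parameters the continuous path wins over a ten-year horizon; flip the run rates or stretch the migration interval and the legacy path wins instead, which is exactly the calculation each organization has to make for itself.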

In the end, this is why Kubernetes won. (I expect to be told once again, it's not about winning or losing. The people telling me this will, without exception, be the victors.) Yes, it's the packaging of the workload that enabled its system of orchestration. But it's in the orchestration that customers find value.

The software industry is, first and foremost, about intellectual property. The open-source movement is largely an effort to neutralize the legal, financial, and other natural advantages a competitor may attain by having thought of the idea first. It forces its market to be about something else.

In a market where the incumbent is the leader by way of ingenuity, that something else is usually an effort to drive the value of that advantage down through product homogenization, and substitute service and support as value-adds. This is how Linux first gained a foothold in the enterprise. However, in an infrastructure market, where the incumbents reinforce their value propositions by stifling change instead of driving it, an open-source player has the unusual opportunity to reverse the situation: to encourage the developer community to foster ingenuity among themselves, as a tool for disruption.

As Docker, and immediately afterwards Kubernetes, proved, this plan can work out rather well at the outset. But then suddenly it hits a wall. It takes tremendous human effort to sustain even a moderate pace of change for any technology that has established itself in its market -- an effort that has traditionally been supported through vendors' IP license fees. When IP is at the core of a market, its incumbent has the luxury of slowing down ingenuity to a pace it can comfortably manage (see: Apple, Microsoft, Qualcomm, Amazon). That method doesn't work anymore if the incumbent is a shared entity: It must evolve or die. And since the benefits of that ingenuity are shared by design, it takes even more effort to spark the incentive that encourages ingenuity in the first place, especially when everyone knows ahead of time that such ingenuity must be short-lived (see: "Union, Soviet").

A successful business is all about minimizing risk. Information technology is the most volatile component of any business. The types of organizations that take a risk here are often the newer ones, or perhaps the very old ones for whom casting off their mortal coils has become a mandate -- companies with very little pre-existing baggage that's not worth losing anyway. Early adopters are the groups that have calculated the potential for significant gains, either through disruption or just pure innovation.

But the earliest adopters of Kubernetes and cloud-native technologies include public cloud service providers (of which there are few), communications providers (of which there are few), small businesses, and startups. Although they're arguably reaping benefits, they have yet to lay down the kinds of patterns for success that other industries can follow. In financial services, healthcare, government, and incredibly, logistics, there remain highly placed individuals in IT, including at the CIO level, who have never heard of Kubernetes or "cloud-native" or containerization. Not that they've rejected them, or would want to do so -- they don't even know what these things are. Such individuals tend to be myopically focused on other institutions less risk-averse than their own. Until these other firms take the plunge, they won't know a plunge can be taken.

Now let's score each component of our evaluation individually:

What the cloud-native ecosystem (CNE) intends to do is give vendors, both old and new, something amounting to a freeze-dried infrastructure kit. They can get into the business of producing products and services, so long as they don't make their involvement about intellectual property or exclusivity. [+7] What its benefactors do not do -- perhaps because they cannot -- is provide instruction or guidance about how to provide distinguished services around this business [-4], which is one reason why the earliest open-source trailblazers, such as Red Hat, have come out on top thus far. [Net: +3]

From the outset, Kubernetes' engineers have said they'll know their efforts have pierced the envelope into the realm of permanent success once everyone finds it a dull and boring topic already. Their experience comes from Linux. Linux is dull and boring. Take it from a journalist who has been paid to come up with reasons to make Linux exciting again.

But Linux established itself through homogenization -- by driving down the advantages gained through ingenuity. Kubernetes actually took the reverse route: It disrupted a market that could have comfortably fed off of the first-generation virtualization until the end of time. From the outset, however, no single player has attained a native competitive advantage -- no one has "come out on top." This is a problem for prospective customer organizations that typically wait for someone to emerge from the rubble, before investing in IT. [+2 | -7, net: -5]

This is Kubernetes' present dilemma: For the CNE to thrive, it must continually spawn new entities and innovative projects. Because Kubernetes is the axis of the ecosystem, those projects cannot be another orchestrator, but rather another service mesh, logging analytics tool, or key/value store. Because the axis of the ecosystem is strong, you can build a business around a product or service that orbits it. . . for a while. [+5]

Actual map of the CNCF's Cloud-native Ecosystem. Magnification available from CNCF.io

As the CNCF's own landscape map reveals all too clearly, even without a magnifying glass, this leads to dozens of startups essentially competing for the same market spaces. Venture capitalists will find it difficult to invest in a project that, by design, must compete with dozens of other projects that appear on the surface to be functionally identical. While we may call this a meritocracy, Charles Darwin would have called it something else. [-7, net: 2]

As noted in the Executive Summary, the components of the CNE must evolve or die. That's not necessarily a bad thing; indeed, it leads to consistent improvements. [+8] But it requires a long-term vision -- for 10, 15 years down the road -- that few, if any, have yet articulated. To withstand a threat from a future disabler, Kubernetes and its satellites cannot afford to become, like Linux, dull and void. They must maintain a level of sustained pressure just to remain relevant. They are, in a word, hungry. Over time, this creates a vulnerability, increasing the likelihood of a future competitive challenge. [-6, net: +2]

At a certain point, a product or service becomes so fundamental to a market that its existence as an enabling factor is mandated. The Kubernetes model of application deployment is now a fact of the public cloud. [-7] It has yet, however, to significantly penetrate the small enterprise data center, and not even its integration into the latest VMware vSphere appears to be changing this fact measurably. So if you're in the cloud services business, and your customers include medium to large enterprises, you have to position yourself differently than you would for a small company that's starting fresh with the newest tools and methods. [-5, net +2]

The shift to a cloud-native operating model can, and has, revolutionized corporate IT [+7]... for those organizations capable of understanding it and meeting its requirements. Kubernetes is a hungry beast. There has yet to be a single, successful, persuasive initiative to educate the organization as a whole, including the C-suite, as to the end value of cloud-native, outside of the IT departments and software developers who have already been convinced. [-6, net: +1]

Through having infiltrated the mindsets of the leading data center virtual infrastructure vendors, the CNE is now a measurable contributor to the world's gross domestic product. [+7] While some effort and capital continue to be expended toward buttressing pre-existing systems and methods, most of those expenditures have yet to be directed specifically against the advancement of the CNE. [-3, net: +4]


Perhaps the best measure of whether a concept has thoroughly permeated the fabric of a society is the degree to which its people benefit without having to pay attention to it. The public cloud sustained the tremendous surge in user demand brought on by the onset of the global pandemic. No other industry has fared as well in the face of potential economic peril. [+8] What has yet to materialize, although it still could, is a sustainable system of education and certification around cloud-native IT, beyond just a bunch of YouTube videos and TED talks. [-2, net: +6]

Has this integration helped move people forward, making them more capable? Through the enablement of real-time video communications that were not feasible with first-generation virtualization, undoubtedly yes. [+6] Through the incidental enablement of certain negative aspects of social media that would not have been feasible otherwise, no. [-2, net: +4]


The cloud-native ecosystem is the strongest collective community of software developers and IT engineers ever assembled. [+9] Granted, it did not metastasize under its own power -- it required leadership and, certainly at the outset, a positive vision. It also adopted a mandate for personal empowerment and even betterment that no other information technology initiative has ever achieved.

Yes, there are pockets of community initiatives to reinforce those resources devoted to maintaining old and admittedly dated IT systems. But these pocket communities do not form an ecosystem themselves. What's more, we're seeing evidence that these efforts are at last collapsing. [-3, net: +6]

As technologies fare in this new series, a score of +2 or above is very strong. As a geometry teacher may point out, it's possible for an evaluated technology to post weaker net scores in one or more categories and still yield a stronger final influence score. This is intentional. These categories are not intended to represent cumulative virtues. Should Kubernetes ever find that magic formula that guarantees its vendors revenue for the long term, as it very well may, that final score could be pulled closer to 0, or at least toward the centerline of the X-axis.
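The bracketed marks throughout this piece pair a positive and a negative score per passage. As a minimal sketch, assuming simple summation per passage and averaging per category (the series does not publish its actual weighting, so these function names and the averaging step are illustrative assumptions), the roll-up might look like this:

```python
def passage_net(positive: int, negative: int) -> int:
    """Net score for one scored passage, e.g. +7 and -4 -> +3."""
    return positive + negative

def category_score(nets) -> float:
    """Average the passage nets within one category (assumed, not stated)."""
    return sum(nets) / len(nets)

# The bracketed marks from this section's opening passages:
nets = [passage_net(7, -4),   # [+7 | -4] -> net +3
        passage_net(2, -7),   # [+2 | -7] -> net -5
        passage_net(5, 0)]    # [+5] alone
print(category_score(nets))   # 1.0 -- weak, but above the 0 centerline
```

This illustrates the point above: a category can carry several weak or negative passage nets while its average, and therefore the final influence score, stays above the centerline.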

But if a technology cracks the customer value barrier, so that not only is it necessary but desired, then that final influence score could skyrocket. What company has ever sustained itself in business for an extended period of time, earning value for its shareholders and also benefitting communities and society as a whole, without also generating products and services that have high perceived value? And what industry has ever come to fruition with a plurality of organizations that could successfully accomplish the first two, but not the third?

Here for the first time, we have enough background data to render a comparison between evaluated technologies. Notice in this comparison, we've zoomed in somewhat, narrowing our scale from 10 points in both directions to 3 points. Our benchmark is the World-Wide Web as of September 12, 2001, when that new technology faced the opportunity to bring a society together. It posted a strong positive score that leaned somewhat toward societal benefits, and away from corporate benefactors. 5G edge computing has nearly found that balance between altruistic and self-serving interests and could post a much stronger influence score if it can make a stronger value proposition.

The CNE is highly slanted toward the altruistic, self-sacrificing part of the chart. That's a good thing, but only for a short time -- which is why the metaphor for this series is a pendulum.

More:

Is cloud-native computing as influential as it's stacked up to be? [Status Report] - ZDNet

Impact of Covid-19 on Cloud Computing in Automotive Market Biggest Innovation with Top Competitive Landscape | Amazon Web Services, Microsoft Azure,…

The Cloud Computing in Automotive report sheds considerable light upon the vendor landscape and competitive spectrum, profiling frontline players along with other participants seeking entry amid intense competition. The production profiles and notable companies working in the Cloud Computing in Automotive industry have been assessed on the basis of numerous preset parameters, research projects, technological benchmarks and geographic reach, to characterize the varied market offerings positioned to attract customers and guide healthy expansion over the next several years.

The report on the global Cloud Computing in Automotive market offers concise information on the past and current industry positions of these organizations throughout the analysis period. It further provides data about buyer needs and the financial and political changes in the business ecosystem.

Major Key Manufacturers of Cloud Computing in Automotive Market are:

Amazon Web Services, Microsoft Azure, and Google Cloud Platform

Request Sample Report with Complete TOC and Figures & Graphs @ https://www.adroitmarketresearch.com/contacts/request-sample/981?utm_source=AD

Major nations that contribute a significant industry share in the global Cloud Computing in Automotive market are Chile, Egypt, Switzerland, Mexico, Nigeria, Sweden, Turkey, India, UAE, Taiwan, Indonesia, Thailand, UK, Philippines, Italy, Spain, Saudi Arabia, Brazil, Belgium, Japan, Colombia, Russia, Argentina, Malaysia, Australia, China, Canada, Korea, United States, Germany, Poland, Netherlands, South Africa, France, and the Rest of the World.

The Cloud Computing in Automotive analysis serves as an appraisal guide for evaluating and rating varied activities and investment decisions, including product portfolio expansions and new business ventures, in addition to promotional and advertising activities spread across regions, with a view to growth feasibility and balanced expansion throughout the competitive spectrum.

The literature further contains an examination of the business based on several segments, including applications, competitors, and products of the business space. It assesses the data set based on product designs, pricing structures, and expansion plans, all studied to understand the growth of the global Cloud Computing in Automotive market. An in-depth appraisal of the significant organizations operating in the market space, based on their position in the business and their influence, is offered alongside their product portfolios and other insights.

The report on the global Cloud Computing in Automotive market helps in a thorough understanding of the current and future risks and fundamental dangers related to the market, and proposes business strategies to help organizations build profits in coming years, drawing on previous approaches and new patterns.

You can buy the complete report @ https://www.adroitmarketresearch.com/industry-reports/cloud-computing-in-automotive-market?utm_source=AD

Some Major Table of Contents:

1. Executive Summary
2. Assumptions and Acronyms Used
3. Research Methodology
4. Cloud Computing in Automotive Market Overview
5. Global Cloud Computing in Automotive Market Analysis and Forecast by Type
6. Global Cloud Computing in Automotive Market Analysis and Forecast by Application
7. Global Cloud Computing in Automotive Market Analysis and Forecast by Sales Channel
8. Global Cloud Computing in Automotive Market Analysis and Forecast by Region
9. North America Cloud Computing in Automotive Market Analysis and Forecast
10. Latin America Cloud Computing in Automotive Market Analysis and Forecast
11. Europe Cloud Computing in Automotive Market Analysis and Forecast
12. Asia Pacific Cloud Computing in Automotive Market Analysis and Forecast
13. Asia Pacific Cloud Computing in Automotive Market Size and Volume Forecast by Application
14. Middle East & Africa Cloud Computing in Automotive Market Analysis and Forecast
(Continued.)

Overall goals of the Cloud Computing in Automotive Market study

* The global Cloud Computing in Automotive Market business report mainly consists of facts and figures giving an idea about the industry growth based on basic aspects such as market share, growth rates, profit margins and others.

* The essential purpose of the report is to offer thorough insights into market experience with production and consumption patterns.

* The report contains practical insights and strategies for market advancement, and gives confirmed figures relating to basic industry plans, growth-rate estimates, production plans and other details.

* The new report on the global Cloud Computing in Automotive market consolidates far-reaching details, containing insights concerning the key leading organizations and offering in-depth coverage of the business approaches those businesses utilize.

* The report recommends business frameworks for organizations in the midst of unfortunate events, such as the Covid-19 pandemic, to help secure them solid returns in coming years.

* The report further offers brief data on setbacks in the business space and the ways in which the organizations overcame them.

Get Exclusive Discount on this report @ https://www.adroitmarketresearch.com/contacts/enquiry-before-buying/981?utm_source=AD

About Us:

Contact Us:

Ryan Johnson
Account Manager Global
3131 McKinney Ave Ste 600, Dallas, TX 75204, U.S.A.
Phone No.: USA: +1 972-362-8199 / +91 9665341414

See the article here:

Impact of Covid-19 on Cloud Computing in Automotive Market Biggest Innovation with Top Competitive Landscape | Amazon Web Services, Microsoft Azure,...

Everything you need to know about cloud security – ITWeb

Despite all evidence pointing to the business efficiencies, cost savings, and competitive advantages that cloud computing brings, many companies are still reluctant to move their data and applications to the cloud due to security concerns.

They are worried about unauthorised data access, exposure and leaks, weak access controls, susceptibility to attacks, and possible disruptions.

Where businesses are going wrong when it comes to cloud security is that they are not taking a holistic approach to their security, says Phil Keeling, regional sales director at Zarepath & Cato Networks.

Keeling will be presenting on "Everything you need to know about cloud security" at the ITWeb Cloud, Data Centre & DevOps Summit, to be held on 24 February as a virtual event.

He says by not taking a holistic view, companies end up with incomplete solutions based on several different point solutions which make their environment complex and difficult to manage.

However, he says using a cloud-based security stack takes away the dependence on point products and enables enterprises to deliver security services to their users regardless of where they are and which device they are on, without compromising on performance.

Keeling stresses that customers are still responsible for their data in the cloud, and by using secure access service edge (SASE) they can apply the same security policies across the enterprise. SASE, a term coined by Gartner, refers to simplified wide-area networking and security, achieved by delivering both as a cloud service directly to the source of connection as opposed to the enterprise data centre.

Delegates attending Keeling's talk will learn more about SASE, and why SD-WAN is a must when it comes to making sure that cloud security is delivered with the needed performance and functionality.

Excerpt from:

Everything you need to know about cloud security - ITWeb

Cloud computing Market Key Players And Information Analysis Covering COVID-19 Analysis With Forecast 2020-2030 – The Courier

United Kingdom, 16 Feb 2021: FATPOS Global recently added a new report titled "Cloud computing Market 2020: By Types, Applications, Industry Size, Share, COVID-19 Impact Analysis and Forecast to 2030" to its database. The global Cloud computing Market size and share 2018 report includes an in-depth overview of the research report and industry trends throughout the forecast period.

This report covers an in-depth analysis and forecast for the Cloud computing Market on a global and regional level from 2015 to 2024. The research report covers historical data from 2015 to 2017 with a forecast from 2018 to 2022 based on revenue (USD Billion). The report covers a comprehensive view of the market including market size, share, trends, industry growth, market drivers, restraints, and future opportunities. It also provides the level of impact of drivers and restraints on the Cloud computing Market between 2017 and 2024.

The Cloud computing Market research report includes a detailed competitive scenario and the product portfolios of major market key players. The report evaluates Porter's Five Forces model to analyze the different factors affecting the growth of the Cloud computing Market.

Request Free Sample Report of Cloud computing Market: https://www.fatposglobal.com/sample-request-410

Our Every Free Sample Includes:

[The Final Sample and Full Report Covers COVID-19 Impact Analysis.]

Cloud computing Market: Report Structure & Growth Drivers

Moreover, the research report includes in-depth analysis of price and gross margin, capacity, production, revenue, current Cloud computing Market geographical zones, technology, demand and supply, consumption, imports, exports, and market drivers and opportunities. The report mainly focuses on industry size, share, trends, growth factors, and major company profiles through the forecast period.

In addition, this report discusses the key drivers influencing market growth, opportunities, the challenges, and the risks faced by key manufacturers and the market as a whole. It also analyzes key emerging trends and their impact on present and future development.

Download Free PDF Brochure of this Report: https://www.fatposglobal.com/free-broucher-410

Our Free PDF Brochure Contains:

COVID-19 Pandemic Global Business Impact Analysis

We at Fatpos Global understand how difficult it is for you to plan, strategize, or make business decisions, and as such, we have your back to support you in these uncertain times with our research insights. Following is the list of things that we as a management consulting, advisory, and market research team are doing at Fatpos Global to provide you with the latest information that will aid you in your business decisions.

Request COVID-19 Impact Business Analysis on Cloud computing Market: https://www.fatposglobal.com/sample-request-410

Scope of Cloud computing Market Report

Request for Discount on this Report: https://www.fatposglobal.com/request-discount-410

Cloud computing Market: Report Coverage Overview

The report additionally includes the major market players' business strategies and end-to-end factors such as application, development, innovation, value supply & distribution channels, profit and loss figures, production capability, and others.

Browse Detailed Report: https://www.fatposglobal.com/reports/cloud-computing-market/410

The Study Objectives of this Report are:

About Us:

Fatpos Global stands for Failures Are The Pillar Of Success. We are a rapidly growing global management consulting, advisory and market research services provider that aims to aid businesses with bold decisions that help them embrace change for their sustainable growth. With the help of our experts and industry veterans, and their years of expertise across different industry verticals, we aid businesses with solutions that support efficient decision-making and the development of executable strategies.

Contact Us:

Fatpos Global

275 New North Road,

Islington Suite 1275

London, N1 7AA, UK

+1 (484) 775 0523

info@fatposglobal.com

Read this article:

Cloud computing Market Key Players And Information Analysis Covering COVID-19 Analysis With Forecast 2020-2030 - The Courier

NASA's Mars Helicopter Reports In – Jet Propulsion Laboratory

Ensuring that Ingenuity has plenty of stored energy aboard to maintain heating and other vital functions while also maintaining optimal battery health is essential to the success of the Mars Helicopter. The one-hour power-up will boost the rotorcraft's batteries to about 30% of their total capacity. A few days after that, they'll be charged again to reach 35%, with future charging sessions planned weekly while the helicopter is attached to the rover. The data downlinked during tomorrow's charge sessions will be compared to battery-charging sessions done during the cruise to Mars to help the team plan future charging sessions.

Like much of the 4-pound (2-kilogram) rotorcraft, the six lithium-ion batteries are off-the-shelf. They currently receive recharges from the rover's power supply. Once Ingenuity is deployed to Mars' surface, the helicopter's batteries will be charged solely by its own solar panel.

After Perseverance deploys Ingenuity to the surface, the helicopter will then have a 30-Martian-day (31-Earth-day) experimental flight test window. If Ingenuity survives its first bone-chilling Martian nights, where temperatures dip as low as minus 130 degrees Fahrenheit (minus 90 degrees Celsius), the team will proceed with the first flight of an aircraft on another world.

If Ingenuity succeeds in taking off and hovering during its first flight, over 90% of the project's goals will have been achieved. If the rotorcraft lands successfully and remains operable, up to four more flights could be attempted, each one building on the success of the last.

"We are in uncharted territory, but this team is used to that," said MiMi Aung, project manager for the Ingenuity Mars Helicopter at JPL. "Just about every milestone from here through the end of our flight demonstration program will be a first, and each has to succeed for us to go on to the next. We'll enjoy this good news for the moment, but then we have to get back to work."

Next-generation rotorcraft, the descendants of Ingenuity, could add an aerial dimension to future exploration of the Red Planet. These advanced robotic flying vehicles would offer a unique viewpoint not provided by current orbiters high overhead or by rovers and landers on the ground, providing high-definition images and reconnaissance for robots or humans, and enabling access to terrain that is difficult for rovers to reach.

More About Ingenuity

The Ingenuity Mars Helicopter was built by NASA's Jet Propulsion Laboratory in Southern California, which also manages the technology demonstration for NASA Headquarters in Washington. NASA's Ames and Langley Research Centers provided significant flight performance analysis and technical assistance. AeroVironment Inc., Qualcomm, Snapdragon, and SolAero also provided design assistance and major vehicle components. The Mars Helicopter Delivery System was designed and manufactured by Lockheed Martin Space in Denver.

For more information about Ingenuity:

https://go.nasa.gov/ingenuity-press-kit

https://mars.nasa.gov/technology/helicopter

View original post here:

News . NASA's Mars Helicopter Reports In - Jet Propulsion Laboratory

From Dayton to Mars – University of Dayton – News Home

As the Perseverance rover descended to the surface of Mars just before 4 p.m. Feb. 18, University of Dayton Research Institute scientist Chad Barklay closely watched NASA's live feed of its Jet Propulsion Laboratory control room, listening for touchdown confirmation.

Exactly four years prior, Barklay and UDRI colleague and engineer Allan Tolson were also closely watching as they essentially cooked a generator similar to the one that will power the Mars 2020 rover, heating it up to temperatures never before experienced by the unit or its sister units on Earth and Mars. By assessing the effects of high heat on the prototype unit, the laboratory test was designed to predict whether the Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) attached to Perseverance would continue to perform normally should it encounter unanticipated extreme temperatures during the rover's mission.

The successful test helped NASA prepare for the Mars 2020 mission, which launched in July and successfully delivered Perseverance by sky crane to the red planet surface seven months later.

Since opening UDRI's MMRTG Laboratory in late 2013, Barklay, group leader for advanced high-temperature materials in UDRI's power and energy division, and his team have designed and performed qualification and evaluation tests on generators in support of NASA's Curiosity and Perseverance rovers and future deep-space missions. Their research, sponsored by the Department of Energy, provides critical information on the performance of the power units over time and under the punishing temperatures and other harsh operating conditions of space.

"The MMRTG is essentially the lifeblood of the rover," said Barklay, who helped develop the layout and assembly procedures for the MMRTG that continues to power Curiosity at Mars' Gale Crater. "Heat generated by naturally decaying isotopes within the generator's core keep the rover warm during the extreme cold of Martian nights, and is also converted to electricity to power the rover's mechanical, computer and communication systems. The information we provide on MMRTG performance helps mission planners understand how much power they'll have and how long they'll have it for the science they want to do."

In addition to helping NASA plan for the routine, UDRI researchers also help the agency prepare for the unexpected, Barklay said.

"There are a number of factors that can affect generator performance, including heat. Four years ago, there were still several potential elements of the pending Perseverance mission that each could have caused the MMRTG to run hotter than the its predecessor unit on Curiosity, including landing site, which hadn't been selected yet; the Martian climate at time of landing; the age of the generator at launch; and some minor design differences in the rover that could affect heat transference from its generator," Barklay added. "So we were asked to design and conduct what was basically a worst-case-scenario experiment that would assume the hottest temperatures for each of those factors, and then some."

To prepare for that 2017 test, Barklay and Tolson wrapped a generator in an insulating material, then heated the unit to 428 F, approximately 100 degrees hotter than the maximum temperature Curiosity's generator experiences. They held the unit at that temperature for 24 hours, neither sleeping nor leaving the generator unattended, prepared to quickly shut down the experiment if they observed any behavior that threatened the system.

"The outcome was highly successful; better than we could have hoped," Barklay said.

For several years after opening, the MMRTG lab was equipped with the first two multi-mission generators built by NASA. These earthbound generators, designed for qualification and testing, are identical to their sister units on the Curiosity and Perseverance rovers with one significant exception: they are powered by electricity rather than by the plutonium at the core of the generators attached to the rovers on Mars.

In the last two years, one of the prototype generators was shipped to the Johns Hopkins Applied Physics Lab for research and testing related to the Dragonfly rotorcraft mission to Saturn's moon, Titan, currently scheduled for launch in 2026. The second generator traveled first to the Idaho National Lab and NASA's Jet Propulsion Laboratory before ultimately landing at Kennedy Space Center last year for final testing prior to the launch of Mars 2020.

Barklay anticipates the test generators returning to Dayton at some point in the future. In the meantime, his team built a thermal simulator that mimics the NASA generators in appearance and behavior. "We made some modifications to the thermal simulator that will allow us to adapt to advancing generator technology if needed," Barklay said. "For now, we'll continue to support the missions of Curiosity, Perseverance and future deep space exploration, and we're hoping to do some work in support of Dragonfly."

The simulator was also used for a research study into the possibility of using MMRTGs to power lunar experiments, should the U.S. go back to the moon, Barklay said. He will publish and present a paper on the study at the IEEE Aerospace Conference in early March.

Barklay, who formerly worked at Mound Laboratories in Miamisburg, which he calls the birthplace of radioisotope generator technology, said he will never tire of watching launches and landings, in spite of the number he has already seen.

"Nothing can really prepare you for the emotions you experience when watching," he said. "It's amazing to feel part of something so much bigger than yourself, and it's a profound experience to realize that you and your colleagues contributed to the success of the mission."

For interviews, contact Pamela Gregg, UDRI communication administrator, at 937-269-8963 (cell).

Read the original post:

From Dayton to Mars - University of Dayton - News Home

Oregon City STEM robotics team wins share of national NASA competition – KGW.com

The "Lunar Ladies" won the state's grand prize for the second year in a row and were national co-winners with 3 other teams

OREGON CITY, Ore. When the Perseverance Rover landed on Mars on Thursday, a middle and high school STEM robotics team was watching.

"It was nice to see when the parachute came out and when then the rotors landing system was showing, so it'd figure out where it was going to land. I really liked that," said 10th grader Pahlychai Thao.

Before NASA landed on Mars, the Oregon City STEM team had already landed their own rover weeks earlier on an 8 x 10 mat simulating Mars' surface. They were competing in NASA's ROADS (Rover Observation and Drone Survey) on Mars Challenge.

"ROADS on Mars is a robotics challenge that involves programming robots, Legos, construction and drone flying," said 7th grader Sophia Misley. "It also educates you a lot about lunar landings and things about NASA."

The team calls themselves "The Lunar Ladies". Team members are PJ Misley (9th grade), Sophia Misley (7th Grade), Pahlychai Thao (10th Grade), Lily Kirkpatrick (9th Grade) and Ariana Nackos (9th Grade).

Tom Misley is their coach.

"This was their most complicated challenge," Tom Misley said. "They had to run their rover. It was probably 50 feet back and forth, picking up samples off of a mat and it was not an easy challenge to do at all."

The team was scored on how well they landed a replica of the rover using a drone on an 8 x 10 mat that showed the surface of Mars. They had to drive their Lego-built rover around obstacles and collect samples as well as perform other mini challenges.

"I just think that it's super cool that you can create things and you can use electronics to make them move," said Ariana Nackos about what draws her to the challenge.

Teammate PJ Misley added, "I really enjoy robotics cause it's just a ton of fun and it's really cool to be able to see how just putting blocks of code and putting numbers in can get your robot to do cool things like go around a mat and complete little missions."

The competition started in September of 2019 and was supposed to wrap up by April 2020. The pandemic and lockdown pushed back the final date until October 2020, which gave the team time to perfect some of their final submissions.

The extra time paid off. Not only did The Lunar Ladies win in the state of Oregon for the second year in a row, they also were national co-winners with 3 other teams and the only all-girl team to win.

"It's pretty cool knowing that for our state we won and knowing that in other states people were also good enough to win. It's kind of interesting now to be able to see the other people's videos and see just how well everyone did," PJ Misley said.

As part of their prize, the team will get an all-expenses-paid trip to NASA to watch a launch once restrictions from the pandemic ease up.

For now, the team is focused on their next mission and that means landing on an asteroid.

Read more:

Oregon City STEM robotics team wins share of national NASA competition - KGW.com

Extreme Winter Weather Causes US Blackouts – nasa.gov

A potent arctic weather system chilled much of the United States with frigid weather in mid-February 2021, shattering low-temperature records in the middle of the country. The extreme cold combined with several snow and ice storms to leave millions of people without power.

Texas was hit particularly hard. According to news reports, natural gas shortages were already limiting power generation across Texas prior to the mid-February storm. Demand intensified after the polar air mass moved in on February 13, and controlled outages and downed power lines left parts of the state in the dark.

The Houston Chronicle reported that 4 million customers across the state were without power on February 15, including 1.4 million in the Houston area. Many of those outages continued into the next day and are apparent in the images above depicting nighttime lights. Satellite data for the right image was acquired around 1 a.m. Central Standard Time on February 16; the left image was acquired around the same time on February 7, prior to the severe cold spell. Nighttime lights data have been overlaid on Landsat imagery so that city structure can still be distinguished.

The nighttime lights data were acquired with the Visible Infrared Imaging Radiometer Suite (VIIRS) on the NOAA-NASA Suomi NPP satellite. VIIRS has a low-light sensor, the day/night band, that measures light emissions and reflections. Data are processed by a team of scientists from NASA's Goddard Space Flight Center (GSFC) and Universities Space Research Association (USRA) to account for changes in the landscape (such as snow cover), the atmosphere, and Moon phase.

The team has produced power outage maps for years. But according to Miguel Román, director of the Earth from Space Institute at USRA and a principal investigator of the Black Marble research team, this event is unique: "Large area outages are rare in developed countries, yet this outage spans the entire state of Texas." He noted that Texas is the only state that has isolated its power grid from the rest of the country.

The map above provides a view of the extreme cold associated with the arctic air mass. Data for the map were derived from the Goddard Earth Observing System (GEOS) model; they represent air temperatures at 2 meters (about 6.5 feet) above the ground on February 15, 2021. The darkest blue areas are where the model indicates temperatures reaching as low as -35°C (-31°F). Areas outside of the southern and central U.S. are relatively warmer, but still cold. White colors equate to temperatures around 0°C (32°F).

Note that some areas in Texas are colder than Maine and even Alaska. According to news reports, Dallas reached a low of 4°F (-16°C) on February 15, the coldest temperature the city has seen since 1989. Temperatures near 60°F are more typical this time of year. The temperature at Houston's Intercontinental Airport early that day was 17°F (-8°C), the coldest there in 32 years.
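The paired Fahrenheit/Celsius figures in the story can be checked against the standard conversion formulas. A minimal sketch in Python, using the values quoted above (the whole-degree rounding matches the news reports):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert degrees Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

# Values reported in the story, rounded to whole degrees:
print(round(f_to_c(4)))    # Dallas low, 4°F, in Celsius
print(round(f_to_c(17)))   # Houston low, 17°F, in Celsius
print(round(c_to_f(-35)))  # GEOS model minimum, -35°C, in Fahrenheit
```

Running the conversions reproduces the pairings given in the article: 4°F rounds to -16°C, 17°F to -8°C, and -35°C to -31°F.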

NASA Earth Observatory images by Joshua Stevens, using Black Marble data courtesy of Ranjay Shrestha/NASA Goddard Space Flight Center, Landsat data from the U.S. Geological Survey, and GEOS-5 data from the Global Modeling and Assimilation Office at NASA GSFC. Story by Kathryn Hansen.

How to watch live as NASA lands its Perseverance rover on Mars today – CNET

Editors' note: NASA's Perseverance rover has landed safely on Mars. Read all about it here.

When it comes to space happenings, few are as tense, exciting and high-stakes as landing a vehicle on another planet. Within minutes, NASA's Perseverance rover will endeavor to stick the landing on Mars, kicking off a new era in red planet exploration.

While NASA has a lot of experience with delivering machines to Mars (here's looking at you, Curiosity and InSight), that doesn't make it any easier this time. "Landing on Mars is hard," NASA said. "Only about 40% of the missions ever sent to Mars -- by any space agency -- have been successful."

It's going to be a wild ride. Here's what to expect on Perseverance's landing day.

NASA will provide live coverage of the landing, which you can watch below. The NASA TV broadcast from mission control kicks off on Thursday, Feb. 18 at 11:15 a.m. PT. Touchdown in the Jezero Crater on Mars is scheduled for around 12:30 p.m. PT.

This won't be like a rocket launch where we get to see every detail as it's happening. We will get NASA commentary and updates, views from mission control, and hopefully some images not too long after landing. It will be a must-watch event for space fans.

We've been to Mars before. So why all the hype? The red planet is our solar system neighbor. It's rocky like Earth. It has a long history of water. We can imagine ourselves perhaps living there some day.

"The level of interest that people have in this planet is just extraordinary," Alice Gorman -- space archaeologist and associate professor at Flinders University in Australia -- told CNET. Gorman highlighted humanity's search for life beyond Earth and how Mars is a candidate for having hosted microbial life in its ancient past.

There's also something special about a rover, a wheeled mechanical creature with a "head" and "eyes." "People feel towards the rovers because they're active and they move," said Gorman, likening it to an almost parental sense of attachment. The outpouring of emotion over the demise of NASA's Opportunity rover proves how connected humans can get to a Mars explorer. Perseverance is set to become our new Martian sweetheart.

Mars arrivals are always harrowing. NASA calls the process EDL for "entry, descent and landing."

"During landing, the rover plunges through the thin Martian atmosphere, with the heat shield first, at a speed of over 12,000 mph (about 20,000 kph)," NASA said in a landing explainer. There's a reason NASA describes the landing process as "seven minutes of terror."

This NASA graphic shows the entire entry, descent and landing (EDL) sequence.

Small thrusters will fire to keep the rover on track on the potentially bumpy ride through the atmosphere. The rover's protective heat shield helps to slow it down. At an altitude of around 7 miles (11 kilometers), a supersonic parachute will deploy and Perseverance will soon separate from its heat shield.

NASA gave a briefing on Jan. 27 with a detailed rundown on the entire EDL sequence, including the "sky crane" maneuver, which lowers the rover the final distance to the surface using a set of cables.

If all goes well, Perseverance will end up standing on the surface of Mars. "The really hard part is to soft land and not crash land, and then to deploy the moving parts," said Gorman. Perseverance is not alone on the trip. It also carries a helicopter named Ingenuity in its belly. Ingenuity will be unleashed later in the mission.

The mission is equipped with cameras and microphones designed to capture the EDL process, so we can expect to both see and hear the excitement of the landing at some point. "It will be the raw sounds of the descent and coming onto the surface," Gorman said. "So that's a whole other level of sensory engagement."

It takes time to send data between Mars and Earth. For us back home, we can expect a first photo not too long after landing, but the full visual and audio experience may take a few days for NASA to share with the world.
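The delay described here is just light travel time. The Earth-Mars separation varies from roughly 0.5 to 2.5 AU depending on where the two planets are in their orbits, so the one-way signal delay ranges from a few minutes to over twenty. A rough sketch (the two distances are the approximate extremes of the Earth-Mars range, not the actual separation on landing day):

```python
AU_KM = 149_597_870.7   # kilometres in one astronomical unit
C_KM_S = 299_792.458    # speed of light, km/s

def one_way_delay_minutes(distance_au):
    """One-way light travel time, in minutes, for a given distance in AU."""
    return distance_au * AU_KM / C_KM_S / 60

# Approximate closest and farthest Earth-Mars separations:
print(f"{one_way_delay_minutes(0.5):.1f} min")  # near opposition
print(f"{one_way_delay_minutes(2.5):.1f} min")  # near conjunction
```

One AU corresponds to about 8.3 light-minutes, which is why even the "first photo not too long after landing" arrives well after the events it depicts have already happened.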

The agency released an arrival trailer in December that shows an animated, sped-up version of the process. You'll get the idea of just how wild it is to land a rover on another planet.

Gorman is excited about getting visuals of the rover's landing spot in Jezero Crater. It will be our first close-up look at the landscape in an area that had a history of water. Perseverance hopes to explore that history and look for evidence of life.

While the photos, sounds, helicopter and all-around science will be reasons to celebrate, there's the big lingering question the mission might answer: Was Mars home to microbial life? Said Gorman, "It would just be really great if we've got a bit of a closer handle on whether anything once lived on Mars."

Perseverance is our next great hope in the search for signs of life beyond Earth. It all starts with sticking the landing.

Follow CNET's 2021 Space Calendar to stay up to date with all the latest space news this year. You can even add it to your own Google Calendar.

Crypto market report: Bitcoin, Ethereum, Litecoin & Co.: How crypto prices developed on Saturday – The Times Hub

The Bitcoin price fell on Saturday. At noon it stood at $46,887.78, after trading at $47,586.24 the day before.

The Bitcoin Cash price fell to $558.30 after trading at $579.65 the previous day.

Ethereum is in the red at $1,797.30; the previous evening, the digital currency was still at $1,846.07.

The price of the digital currency Litecoin rose to $198.73 on Saturday, up from $197.45 the day before.

Ripple is worth $0.5817 on Saturday, down from $0.6138 the previous day.

The Cardano price has fallen compared to the previous day: one Cardano is currently worth $0.8810, against $0.9254 yesterday.

The digital currency Monero is quoted today at $214.78, up from $201.19 the previous day.

IOTA is stronger than the day before: one IOTA is currently worth $1.266, after $1.232 yesterday.

The Verge price is moving sideways at $0.0255, in line with the previous day's level.

The Stellar price rose to $0.5423 from $0.5258 the previous day.

NEM trades lower at $0.3870, down from $0.3986 the previous day.

The Dash price rose to $207.98; the day before, the cryptocurrency was worth $168.16.

The NEO price fell to $36.86 today, after trading at $37.99 the previous day.
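The moves reported above are absolute prices. A short sketch computing the implied day-over-day percentage changes for a few of them (the prices are those quoted in the report):

```python
# (name, previous-day price, current price), as quoted in the report:
prices = [
    ("Bitcoin", 47586.24, 46887.78),
    ("Ethereum", 1846.07, 1797.30),
    ("Dash", 168.16, 207.98),
]

for name, prev, curr in prices:
    pct = (curr - prev) / prev * 100  # percentage change vs. previous day
    print(f"{name}: {pct:+.2f}%")
```

Expressed this way, Bitcoin's drop works out to about -1.5%, Ethereum's to about -2.6%, while Dash's jump from $168.16 to $207.98 is a roughly 24% gain, by far the largest move in the report.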

Finanzen.net editorial team

Image sources: Wit Olszewski / Shutterstock.com

The case for, and against, the still-unseen Planet 9 – Astronomy Now Online

A plot showing the relationship between the clustered orbits of several Trans-Neptunian Objects, or TNOs, in the extreme outer solar system as a result of gravitational interactions with an unseen world dubbed Planet 9. Image: Caltech/R. Hurt (IPAC)

For the past several years, astronomers have been searching for an unseen planet beyond the orbit of Pluto, a presumed world with 10 times the mass of Earth that could be responsible for the seemingly clustered orbits of small Trans-Neptunian Objects, or TNOs, in the extreme outer solar system. So far, Planet 9 has eluded detection.

Dealing a possible blow to the theorised planet, a team of researchers led by Kevin Napier of the University of Michigan suggests selection bias may have played a role in the original justification for Planet 9.

TNOs are so distant and dim they can only be detected, if seen at all, when their orbits carry them relatively close to the inner solar system. Napier's team analysed 14 other extreme TNOs discovered in three surveys and concluded their detection depended on where they happened to be at the time and on the ability of the telescopes in question to detect them.

In other words, the clustering seen in the orbits of the original TNOs cited in support of Planet 9 may have been the result of where the bodies happened to be when they were observed. TNOs may well be uniformly distributed across the outer solar system without any need for the gravitational influence of an unseen planet.

"It is important to note that our work does not explicitly rule out Planet X/Planet 9; its dynamical effects are not yet well enough defined to falsify its existence with current data," the researchers write in a paper posted on arXiv. "Instead, we have shown that given the current set of ETNOs (extreme TNOs) from well-characterised surveys, there is no evidence to rule out the null hypothesis."

Mike Brown and Konstantin Batygin at the California Institute of Technology, the original proponents of Planet Nine, beg to differ.

Asked whether their analysis can distinguish between a clustered and a uniform distribution, Batygin said in an update posted by the journal Science that the answer appears to be no.

Brown took to Twitter on 16 February to voice his thoughts, showing diagrams of the TNOs that he says support the original case for Planet 9. And in a blog post, he provided a detailed rebuttal, concluding that "in the end, the previously measured clustering from our 2019 paper is still valid and the conclusions of that paper remain."

"The clustering of distant Kuiper belt objects is highly significant," Brown writes. "It's hard to imagine a process other than Planet Nine that could make these patterns." The search continues.

‘Farfarout’ confirmed to be really, seriously far out – Astronomy Now Online

A graphic representation of the scale of the solar system shows Earth's position, 93 million miles from the Sun, or one astronomical unit, at the extreme left. A body nicknamed Farfarout is at the far right end of the scale, currently 132 times farther from the Sun than Earth. Pluto is just to the left of the 40 AU marker. Image: Roberto Molar Candanosa, Scott S. Sheppard from Carnegie Institution for Science, and Brooks Bays from University of Hawaii.

Extended tracking has allowed astronomers to pin down the orbit of a presumed dwarf planet in the extreme outer solar system that takes a thousand years to complete one trip around the Sun. Nicknamed Farfarout, the frigid body is the most distant solar system object yet detected, eclipsing the previous record holder, Farout.

Listed as 2018 AG37 by the Minor Planet Center, Farfarout is currently 132 times farther from the Sun than Earth (132 astronomical units, or AU) and nearly four times more distant than Pluto. Its highly elongated trajectory carries it inside the orbit of Neptune and as far as 175 AU from the Sun. Analysis indicates the object is about 250 miles across, putting it on the low end of the dwarf planet scale (assuming it is an icy body).

"A single orbit of Farfarout around the Sun takes a millennium," said University of Hawaii researcher David Tholen, a member of the team that discovered the body in 2018. "Because of this long orbital period, it moves very slowly across the sky, requiring several years of observations to precisely determine its trajectory."
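The millennium-long period follows from Kepler's third law: for a body orbiting the Sun, the period in years equals the semi-major axis in AU raised to the 3/2 power. A quick sketch using the 175 AU aphelion quoted above and a perihelion of 27 AU (an assumption: the article says only that the orbit dips "inside the orbit of Neptune", which lies near 30 AU):

```python
# Kepler's third law for heliocentric orbits: P[years] = a[AU] ** 1.5
perihelion_au = 27.0   # assumed; article says only "inside Neptune's orbit"
aphelion_au = 175.0    # from the article

semi_major_axis = (perihelion_au + aphelion_au) / 2  # average of the extremes
period_years = semi_major_axis ** 1.5

print(f"a = {semi_major_axis:.0f} AU, P = {period_years:.0f} years")
```

With those figures the semi-major axis comes out to about 101 AU and the period to roughly a thousand years, consistent with Tholen's "a single orbit takes a millennium."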

Tholen, Scott Sheppard of the Carnegie Institution for Science and Chad Trujillo of Northern Arizona University lead an ongoing survey to map the outer solar system beyond Pluto. They discovered the previous record holder, Farout, which is 120 AU from the Sun.

"The discovery of the even more distant Farfarout demonstrates our increasing ability to map the outer solar system and observe farther and farther towards the fringes of our solar system," said Sheppard. "Only with the advancements in the last few years of large digital cameras on very large telescopes has it been possible to efficiently discover very distant objects like Farfarout."

Farfarout will be given an official name after its orbit is known with greater precision. In the meantime, Sheppard described Farfarout as "just the tip of the iceberg of objects in the very distant solar system."

What the Heavens Declared to a Young Astronomer – ChristianityToday.com

I grew up a Jewish boy in a South African gold-mining town known as Krugersdorp. I remember sitting in shul (synagogue), enthralled as our learned rabbi expounded how God was a personal God: he would speak to Moses, to Abraham, Isaac, and Jacob, and to many others. Growing up, I often pondered how I fit into all this.

By the time I entered the University of Witwatersrand, Johannesburg, I was deeply concerned that I had no assurance that God was indeed a personal God. I was confident that he was a historical God who had delivered our people from the hands of Pharaoh. But he seemed so far removed from the particulars of my life in Krugersdorp. Where was the personality and the vibrancy of a God who truly could speak to me?

As a student, I began working toward a degree in applied mathematics and computer science. Over the course of my studies, I became friendly with Lewis Hurst, then a professor of psychiatry and genetics. He had a great interest in astronomy, and we would discuss the complexities of the cosmos for hours at a time. Whenever we met, I would delight in explaining basic features of astronomy, such as black holes and quasars.

Intellectually, these were greatly satisfying years. Over time, I became fascinated with the elegance of the mathematical formulation of general relativity, and at age 19 I submitted my first research paper on that theme to the Royal Astronomical Society of London. When it was published one year later, I started receiving requests from observatories and universities for reprints or printed copies (on the mistaken belief that I was already a senior academic!).

But spiritually, this period was rather dry. I remember attending a meeting of the Royal Astronomical Society graced ...

BYU Department of Physics and Astronomy hosts Three Minute Thesis competition – The Daily Universe – Universe.byu.edu

The BYU Department of Physics and Astronomy held its Three Minute Thesis competition on Feb. 11, where students presented shortened versions of their master's theses in an effort to move on to future rounds. (Ingrid Sagers)

The BYU Department of Physics and Astronomy held its Three Minute Thesis competition, where graduate students summarized their master's theses in three minutes.

Rochelle Steele and Nick Allen competed to win the department level round of the competition on Feb. 11, with Steele coming in first and Allen second.

The competition is a research presentation and skills development contest for graduate students. Participants have three minutes to present the essential aspects of their thesis.

Steele's thesis was titled "Searching for something in nothing: a study of voids in space." She focused on whether small galaxies can be found in voids in space.

Allen titled his thesis "Carbon Electrodes for Bioimpedance to Measure Blood Glucose with a SmartWatch." He discussed the possibility of measuring glucose levels through smartwatches for people with diabetes.

Students have to present their thesis on a single, static PowerPoint slide by spoken word and cannot use additional props or any electronic media. Those who exceed three minutes are disqualified.

The BYU Graduate Studies webpage says contestants are judged on their explanation of research, how they create interest in their topic and how they break down their message to a non-specialist audience.

The University of Queensland created the competition in 2008 and BYU has since adopted the competition, creating rounds starting within departments, then individual colleges and finally the university level.

The two winning contestants will go on to the College of Physical and Mathematical Sciences level round, competing against other departments' winners on Feb. 19. The winner of that round will go on to the university-wide round on March 11.

Steele won both the department's first-place slot and the people's choice award during the competition on Feb. 11.

Steele's thesis covered voids in space. She said a method called spectroscopy has previously been used to try to determine whether there are extremely small galaxies inside these voids. However, using a new method she developed, she concluded that the energy detected in the voids does not come from galaxies.

"I have learned that none of the galaxies tested are inside the void. As far as I can tell, voids are as empty as they seem. There is something else causing this energy: something that we don't yet understand about the universe, something more to discover," Steele said.

Allen presented on the long-standing finger pricking and patch use testing methods people with diabetes practice. He said the possible use of smart watches to measure glucose levels would provide real time evaluations of blood sugar.

"Smartwatches effortlessly monitor other aspects of human health and would be extremely helpful for diabetes patients," he said. However, he found many challenges in measuring glucose without access to blood.

The finals round of the Three Minute Thesis university competition will be held on March 11 in the Harold B. Lee Library, with presentations streamed over Zoom.
