The Prometheus League
Breaking News and Updates
Monthly Archives: September 2021
Cloud computing is the new frontier for companies looking to get ahead of Google – Digiday
Posted: September 20, 2021 at 8:42 am
Editor's Note: This story is part of a 10-part series that examines life after the third-party cookie. Visit this interactive graphic outlining the full series here.
At the end of August, Erin Yasgar concluded that Google really wanted her to attend its upcoming cloud conference.
Yasgar, a practice lead for marketing and agency strategy at Prohaska Consulting, was being pestered in the way sought-after people will find familiar: multiple emails a week reminding her of the date, just checking in.
This felt unusual. The cloud side of Google's business had previously shown little interest in agencies or marketers in general. "Google is actively courting the marketers [for its cloud business]," Yasgar said.
As the media industry begins to turn itself upside down before Google ends its support of third-party cookies, large cloud tech companies including Google, Amazon and Microsoft are using the upheaval as an opportunity to try and grow their cloud businesses.
In taking third-party cookies out of digital advertising's equation, Google has forced advertisers and publishers alike to focus on their own first-party data to an unprecedented degree. Without cookies to fall back on for targeting or measurement, they will need to bring more data into more clean rooms for matching. As the importance of e-commerce continues to swell, marketers will need to do more to tie their ad data to other kinds of data typically walled off in different corners of their organizations. And as advertisers and publishers alike reshuffle their always-on ad spending strategies, the need to use artificial intelligence to spot emerging patterns will grow too.
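Clean-room matching of first-party data typically joins on a normalized, hashed identifier rather than on raw personal data. The sketch below is a minimal illustration of that idea, not any vendor's actual clean-room API; the sample email addresses and list names are hypothetical.

```python
import hashlib

def match_key(email: str) -> str:
    # Normalize (trim, lowercase) before hashing so trivially different
    # spellings of the same address still match.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical first-party lists held by an advertiser and a publisher.
advertiser_crm = {match_key(e) for e in ["Ann@example.com", "bo@example.com"]}
publisher_subs = {match_key(e) for e in ["ann@example.com ", "cy@example.com"]}

# The overlap is computed on hashes, so neither side exposes raw emails.
matched_audience = advertiser_crm & publisher_subs
```

Here the two spellings of ann@example.com hash to the same key, so the matched audience contains exactly one user; in a real clean room this join runs inside the provider's infrastructure rather than in either party's code.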
Signs of these growing needs are already visible. For example, searches for customer data platform software more than doubled last year, according to data from the business software marketplace G2.
But at the most developed end of the spectrum, most of these needs can be solved by cloud infrastructure and computing services. And the cloud's largest purveyors are ready to pounce.
"They never had to understand unknown, or anonymous, people. They were just buying return on ad spend through an agency."
David Novak, the co-chair of the data and marketing practice at Prophet
"As you get into the billions and above in revenue, it's becoming one of the highest priorities," said David Novak, the co-chair of the data and marketing practice at Prophet. Those largest firms, Novak said, are driving "a very tech-centric view of how to gain customer understanding that they didn't have a few years ago."
"They [marketers] never had to understand unknown, or anonymous, people. They were just buying return on ad spend through an agency."
Today, Novak and Yasgar see large organizations instead focused on trying to drive personalized interactions with their customers across many disparate surfaces; sharing data not just across their own different parts internally but with growing numbers of strategic partners externally.
All of that makes the need for strong cloud infrastructure more important.
"That need for the flexibility [of data], and the tech that CMOs need to deal with the identity graphs: cloud computing supports all of it," Yasgar said. "All the infrastructure that's needed, what both sides of the coin need, is absolutely set to grow."
While the cloud providers are most interested in the largest marketers and media companies at the moment, this shift seems likely to spread to smaller publishers too. Some providers, such as Amazon, are positioning themselves in ways that could hook even mid- and long-tail publishers into their clouds.
QUICK STAT
"For Amazon, Google and Microsoft combined, cloud computing is already a $100 billion business."
Timothy K. Horan, a cloud and communications analyst at Oppenheimer
In the great cloud battle, media would be a minor theater of war. Cloud computing is already a $100 billion business for Amazon, Google and Microsoft combined, and it is growing by more than $30 billion annually, said Timothy K. Horan, a cloud and communications analyst at Oppenheimer.
Securing the largest advertisers and publishers might amount to a single-digit percentage of their cloud revenues. "Maybe it's hundreds of millions of revenue in a couple years," Horan said. "They're always pursuing these kinds of things, but it's a minor product [in that context]."
But even if each individual marketer is small potatoes to the cloud purveyors themselves, committing to one kind of infrastructure or another represents a profound decision for most of their customers.
"We're talking about something that is incredibly complex," Yasgar said. "There's always that decision of, do I go all in on one vendor, or do I go best in class and match in my stack?"
Viewed from one angle, all of that complexity, as well as the significant amount of time and investment needed to commit to a cloud provider and strategy, could create the kind of oligopoly that currently rules the digital advertising market: If marketers have to work more closely with one another, the incentive to use shared cloud infrastructure grows strong enough that it may force the largest marketers and publishers and agencies into the hands of Google, Amazon and Microsoft, freezing out independent services such as Snowflake.
But others say that complexity may have the opposite effect, essentially buying time for competitors, which have made separate inroads with marketers, to develop their cloud offerings to better suit these larger needs.
"It could give them [Adobe and Salesforce] a dark horse position," Novak said. "They are already working with marketers, hand in hand."
For now, smaller marketers and publishers are mostly relying on agencies or less sophisticated software to handle these problems. But if you squint while looking at the moves that cloud providers such as Amazon are making, it's possible to see them as part of the foundation of a much larger cloud-based services business.
The company's publisher services division, APS, already provides a server-side tool that allows publishers to manage ad requests. It is currently building an ad tech services marketplace inside APS as well, which would allow publishers to run many of the revenue-boosting (but often site-slowing) services on Amazon servers, rather than their own.
That basket of services could represent the strong foundation of a product that most publishers would have to at least consider subscribing to, sources say. It could sweeten the deal further by using its dominant market position to secure preferential pricing for APS customers.
Amazon may not wind up building out that infrastructure to target smaller publishers. But the shadow of Amazon's cloud ambitions already colors the conversations it has, say sources at multiple publishers who work with Amazon, who asked not to be identified while discussing a key business partner.
"All they care about is their cloud business," one said.
Cloud computing transformation in banking risk – McKinsey
Posted: at 8:42 am
On May 11, 1997, when IBM's chess-playing supercomputer Deep Blue beat Garry Kasparov, the reigning world champion, it became apparent how computers could make judgments on par with the sharpest human minds. To process the 10^123 possible moves in a game (far more than the 10^82 atoms in the observable universe), Deep Blue had been programmed to proxy human judgment, using 4,000 positions and 700,000 grandmaster games. Nineteen years later, AlphaGo, a computer program developed by DeepMind Technologies, defeated Lee Sedol, one of the world's best players at Go, a game with as many as 10^360 possible moves. AlphaGo used a combination of machine learning and neural network algorithms and was trained using 30 million moves. Today, AlphaZero, a successor, is considered the world's leading player of both Go and chess. It trains itself.
For years, however, this kind of computing power was difficult for most organizations to obtain. That is no longer the case, thanks to an equally dramatic change: the move from owned systems (such as the dedicated hardware of the chess and Go champions) to public cloud-based computing, giving users everywhere instant access to computing power and storage. Many businesses are embracing cloud-based software as a game changer that lets them process vast amounts of data, run new methods of advanced analytics, and benefit from more flexible technology setups. Despite rapid growth in spending (the top three cloud service providers reached $100 billion in combined revenue in 2020), cloud infrastructure still represents a small fraction of the $2.4 trillion global market for enterprise IT services. A recent research effort by McKinsey Digital foresees more than $1 trillion in run-rate EBITDA (earnings before interest, taxes, depreciation, and amortization) across Fortune 500 companies in 2030 from rejuvenating existing operations, innovating in business processes, and pioneering new businesses.
Despite the potential, banking has been slower than other sectors to adopt the cloud. Most banks find it difficult to give up their legacy on-premises applications, with only a few exceptions among early adopters like Capital One, which started a migration to the Amazon Web Services (AWS) cloud in 2012 and closed the last of its eight on-premises data centers in November 2020.
Now attitudes are starting to change. Some in the banking regulatory community are taking a more open stance toward cloud in financial services, considering the enhanced transparency, monitoring tools, and security features of cloud computing. For example, Don Anderson, chief information officer (CIO) at the Federal Reserve Bank of Boston, noted in 2019 that ignoring the cloud may even introduce new security vulnerabilities as on-premises vendors discontinue support for their products. On the other hand, regulators continue to issue guidance that highlights the key risks of cloud computing to individual institutions and to the stability of broader financial systems. In a recent report, the Bank of England noted that since the start of 2020, financial institutions have "accelerated their plans to scale up their reliance on CSPs" (cloud service providers), and that the resulting concentration among a small number of cloud providers "could pose risks to financial stability." Other concerns pointed out by regulators relate to information security and the need to build cloud-appropriate risk management frameworks as an integral part of cloud migrations.
Among banking activities, one of the biggest areas of opportunity for cloud computing is risk management, both for financial risks (such as credit, market, and liquidity) and nonfinancial risks (cybersecurity, fraud, financial crime). At a time when risk management leaders are being asked to process greater amounts of data in shorter amounts of time, often amid budget and staff constraints, cloud computing could unlock considerable benefits. It can help risk teams react rapidly to changes in the external environment and dive deeper into the analytics life cycle (exhibit) to better understand the drivers of risk, all without major capital expenditures.
Exhibit
First movers are already employing cloud-based solutions in both financial and nonfinancial risk use cases. They are, for instance, deploying them to run large and complex daily and intraday liquidity risk calculations, do close monitoring of peer-to-peer payments and mobile banking transactions, improve regulatory compliance, and get smarter about the identification of money-laundering activity. Since the pricing model for many cloud providers is flexible and usage based, cloud computing also provides economic benefits. Chief risk officers (CROs) only pay for what they use and can scale up for the surge-based computing needs of certain risk analytics activities, enabling them to shift their technology cost model from capital expense to operating expense.
By adopting cloud computing, CROs could better address four historically intractable risk management challenges: the need to process much more data, the need for more powerful processing systems, the complexity of analytics required to compete, and the greater challenges these all present to today's systems developers.
To make effective risk decisions, financial institutions have always had to convert information into insights, but today's data requirements are massive. Not only do banks gather data in greater volumes, but it comes from multiple sources and in multiple formats. For example, to assess sales conduct risk, banks are using point-of-sale transactions, sales-force performance data, consumer complaint feedback, and a range of other sources.
Developing insights from so much data is all the more difficult because financial institutions often house their information in disconnected systems and then govern these systems through different processes. This siloed approach makes it hard to integrate internal and external sources of information and develop a complete and unified view of risks. As a result, teams can miss useful insights.
With cloud-based solutions, risk management teams have the potential to easily and quickly integrate many different data sources and systems. Some solutions have standardized, easy-to-use web-based interfaces, which eliminate the need for specialized configurations between a banks systems and those of a third party. American Express, for instance, introduced its cloud-based Cornerstone data ecosystem several years ago to share capabilities and data across functions and geographies.
Processing the large data sets needed for sophisticated advanced analytics and machine-learning models requires heavy loads of computing power, especially when multiple legacy systems are involved. Banking risk management functions seldom have access to such high levels of computing power, and resource barriers prevent teams from simply adding more servers.
Cloud deployments offer a more flexible model, pioneered by AWS elastic-computing capabilities, giving teams access to on-demand increases in computing power and a library of easy-to-deploy tools. Today those capabilities are changing the way banking's risk function operates. One leading bank was able to multiply the processing power dedicated to the Monte Carlo simulations it uses for trading risk projections, running them in a matter of hours as opposed to multiple days, according to executives at Microsoft Azure. At one investment bank, the relaxation of legacy computing capacity constraints enabled by cloud computing resulted in a significant increase in the use of analytics: more experimentation by trading strategy teams and adoption of new types of analyses they could not have tried before (e.g., modeling certain types of interest rate volatility directly rather than assuming it).
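As a rough illustration of the kind of surge workload involved, the sketch below estimates a one-day 99% value at risk for a single position by Monte Carlo simulation. The drift and volatility figures are made-up placeholders, and real trading-risk engines revalue whole portfolios path by path; the point is only that the paths are independent, which is why the job parallelizes so naturally across rented cloud capacity.

```python
import numpy as np

def monte_carlo_var(n_paths: int, mu: float = 0.0002, sigma: float = 0.01,
                    confidence: float = 0.99, seed: int = 0) -> float:
    """One-day VaR for a single position via simulated log-returns.

    mu and sigma are illustrative daily drift/volatility, not market data.
    """
    rng = np.random.default_rng(seed)
    log_returns = rng.normal(mu, sigma, size=n_paths)
    pnl = np.expm1(log_returns)                         # fractional P&L per path
    return -np.percentile(pnl, 100 * (1 - confidence))  # loss at the tail

# Each batch of paths is independent, so they can be sharded across as many
# cloud workers as the risk team cares to rent for a few hours.
var_99 = monte_carlo_var(n_paths=1_000_000)
```

With these placeholder parameters the estimate lands near 2.3% of position value, roughly the 2.33-sigma tail a closed-form normal VaR would give; the Monte Carlo approach earns its compute bill only once payoffs become too path-dependent for closed forms.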
Similarly, a global systemically important bank had been assessing global liquidity by running its model 14 hours a day. After the bank switched to a cloud-based solution, the time dropped to less than three hours, allowing risk teams to do more frequent analyses and iterations, incorporate additional data, and make quicker business and balance sheet strategy decisions, according to executives at Google Cloud.
Migrating to a cloud-enabled platform can also streamline upgrades. Instead of spending substantial time and effort configuring new upgrades and capabilities on disconnected legacy systems, risk teams let their technology partners handle both the software and hardware upgrades. This reduces ongoing technology operating costs and minimizes the risk of obsolescence in an age of rapid evolution.
In recent years, cloud-based providers of risk management solutions have developed a wide variety of innovative, user-friendly, out-of-the-box automation tools, such as data drift analysis, critical misconfiguration alerts, and digital forensics tools. Utilizing such technology frees up risk analysts' time to focus on what they do best. Instead of spending time configuring tools and technology, they can move quickly to develop sophisticated models and alert mechanisms. Barclays freed up time for its risk analysts by working with a cloud-based provider to improve its automation process for granting transaction risk analysis exemptions for merchants.
Executing risk management in the cloud also makes it easier for teams to recalibrate and manage their models and set up new tests. Cloud-based infrastructure can be continuously fed with real-time data, something beyond the capabilities of many legacy systems. This makes models more accurate and precise, and helps analysts quickly make data-based decisions on their effectiveness.
HSBC, for instance, uses cloud computing services to look for money-laundering and other criminal activity in a completely new way. By mapping networks of connections between people and companies, the bank's Global Social Network Analytics platform lets risk management teams find suspicious transactions that previously were identifiable only by humans. Using cloud services, another bank detected a data breach and found the individual responsible within two weeks, while at a competitor, the detection and apprehension of the same breach took over a year, according to executives at Google Cloud.
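Network analysis of this kind starts from a transaction graph. The toy sketch below flags a circular flow of payments, a classic money-laundering "layering" pattern; the accounts and the simple depth-first search are purely illustrative and bear no relation to HSBC's actual platform.

```python
# Hypothetical payment graph: account -> accounts it sent funds to.
payments = {
    "A": ["B"], "B": ["C"], "C": ["A"],   # circular flow A -> B -> C -> A
    "D": ["E"], "E": [],
}

def find_cycle(graph):
    """Return one directed cycle of accounts, or None, via depth-first search."""
    def dfs(node, path, on_path):
        if node in on_path:
            return path[path.index(node):]    # walked back onto our own path
        on_path.add(node)
        path.append(node)
        for neighbor in graph.get(node, []):
            cycle = dfs(neighbor, path, on_path)
            if cycle:
                return cycle
        path.pop()
        on_path.discard(node)
        return None

    for start in graph:
        cycle = dfs(start, [], set())
        if cycle:
            return cycle
    return None

flagged = find_cycle(payments)   # ["A", "B", "C"]: funds loop back to A
```

A production system would score cycles (amounts, timing, account age) rather than flag every loop, and would run the search over billions of edges, which is where elastic cloud compute comes in.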
The flexibility and connectivity of cloud-based environments can have a meaningful impact not only on the productivity of risk analysts but also on the developers who create and maintain the models that identify, measure, and mitigate risks. After moving to the cloud, developers often report significant improvement in key performance metrics, including improvements in release frequency, lead time to deploy, and mean time to recover. Commerzbank, for instance, says its developers use cloud services to follow a continuous integration and delivery (CI/CD) approach, enabling them to do code updates much more seamlessly and easily.
Finally, the impact of cloud-based solutions extends beyond the risk function, since their ease of use makes robust risk identification and assessment tools more accessible to business units, which are the first line of defense. This allows for a better understanding of risks and a sense of ownership for risk decisions. Loan officers, for instance, can stress test loan portfolios or simulate the performance of a loan before approving it, enabling a deeper awareness of risk-return trade-offs.
While the potential benefits of cloud computing are substantial, so are the challenges of migrating risk management systems and activities from on premises to the cloud. CROs must plan for managing complexity, investing the necessary resources, and meeting needs for new capabilities and culture.
For the most part, risk systems are not stand-alone; they thread through the banks core applications and processes. As a result, moving risk applications to the cloud may have implications for other systems and ultimately require the reconfiguration of other applications. Thus, the migration journey for risk applications needs to be designed as part of the broader enterprise migration, which will involve hundreds of applications in total. Some companies have established a private cloud in which computing resources are hosted on a network used by only one organization and located within their own data center. Others have opted for a hybrid between this approach and the public cloud hosted by a major provider.
Migrating to the cloud can have a significant impact on financial statements. Although the legacy technology systems on which banks often operate carry maintenance costs, their depreciation expenses are minimal. While most cloud providers offer incentives for multiyear commitments that can offset near-term migration costs, substantial expenses will still hit the P&L. Therefore, investments needed for cloud migration and the subsequent operating costs must be carefully planned and sequenced over time to manage their financial impact.
The skills required to migrate and operate in the cloud include a much heavier focus on engineering and data science than is needed for on-premises computing. This kind of talent is difficult to recruit and even harder to retain, especially amid currently high attrition rates. In addition, the culture of teams working in the cloud is faster moving, more adaptable, and more focused on rapid delivery. Risk functions at banks will need to adjust their operating model to enable this new culture while keeping the rigor, control, and governance required for risk management activities.
Given these challenges, migrating to the cloud isn't an express trip. Instead, for most risk leaders, it is a multistage journey that will need planning and execution within the broader cloud strategy of the entire organization. As our colleagues have pointed out recently, companies that adopt cloud have to manage their overall strategy and business case, adoption in each business domain, and the construction of foundational capabilities that enable security and scale, all in concert.
CROs and other risk leaders have an important role driving adoption across the risk domain, but also can influence the overall strategy and business case, and need to help scope the foundational capabilities required, in particular when it comes to security and controls. Three actions can help guide the cloud adoption journey for risk management:
The common error of scattering tests and use cases throughout multiple domains will not create the momentum delivered by a deep dive into one or two major domains, whether consumer credit risk, trading risk, or consumer fraud. This is because the migration of data and tech to a cloud provider is often the toughest challenge. Once a single use case is complete for a given domain, it's easier to develop additional use cases in parallel.
A transition to cloud-based risk management offers too many benefits for risk leaders to ignore. For banks, cloud computing is quickly becoming an imperative. Those that do not migrate their systems and capabilities could lose the ability to innovate quickly and respond effectively to the competitive pressures and increasing number of risks facing banks. The many decisions to make along the journey can paralyze firms, but a focus on the key issues and a prudent approach to implementation can help risk managers think several moves ahead on the chessboard.
Cloud Exchange: Leveraging cloud data and migration in the public sector healthcare arena – Federal News Network
Posted: at 8:42 am
Cloud computing is rapidly advancing health care on three important fronts.
First, according to Mathew Soltis, vice president of Cloud Solutions at GDIT, is how it's improving health care research. The cloud enables aggregation and sharing of data in a given research domain, speeding visualization and data analysis, and therefore the development of remedies.
"NIH is a good example of some data sharing here around COVID. They have a commons platform, where researchers can get together and share data in a common standard, a common infrastructure," Soltis said at the Federal News Network Cloud Exchange. Earlier, the exchange of hard drives or snail-like downloads simply held things up, he said. Now cloud deployments are accelerating research outcomes.
The second area of cloud-induced improvement, Soltis said, is how it eases compliance with a myriad of health-domain requirements, such as privacy protection under HIPAA. Basic cybersecurity also improves under a well-crafted cloud implementation, he added.
"What we see now is a lot of the cloud providers and the service providers have raised the standards," Soltis said. "So if you want to use data or access infrastructure in the cloud, it's now HIPAA compliant or can support your HIPAA outcomes. That's advancing the cyber posture of a lot of customers."
Third is how the cloud is enabling new modes of remote health care delivery and thereby improving outcomes. For example, cloud-hosted data related to electronic health records aids EHR interoperability and access from anywhere, Soltis said.
Taking advantage of cloud computing comes with challenges, Soltis pointed out. A central question is how to handle and govern the large amounts of data that characterize health information.
"When we talk to federal customers, over 40% of them are experiencing issues with data migration," Soltis said. Beyond any technical challenges lie the questions of data ownership, who is going to use the data, who in sharing arrangements pays cloud costs, and who's responsible for backups.
Key, Soltis said, is having an official role to own, manage, charter, chaperone and govern the data. Namely, the chief data officer.
But the CDO organization must partner with other stakeholders. "When the CDO sits at the same level as technology and the mission owner and the application owner, that provides the best outcome," Soltis said.
On the technical side lie questions of how to design architecture for data lakes, dealing with mixed structured and unstructured data, and the most economical ways to tier the storage setup.
"If that infrastructure is in place with your architecture, you can then do some higher level activities like machine learning, artificial intelligence and data visualization," Soltis said.
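On the storage-tiering question raised above, the economics are simple to sketch: monthly cost is just volume times the rate of whichever tier the data sits in. The tier sizes and per-terabyte rates below are hypothetical placeholders, not any cloud provider's actual pricing.

```python
# Hypothetical 500 TB health-data lake split across storage tiers.
tiers = {
    "hot":     {"tb": 50,  "usd_per_tb_month": 23.0},  # frequently queried extracts
    "cool":    {"tb": 150, "usd_per_tb_month": 10.0},  # infrequently accessed
    "archive": {"tb": 300, "usd_per_tb_month": 1.0},   # compliance retention
}

tiered_cost = sum(t["tb"] * t["usd_per_tb_month"] for t in tiers.values())
all_hot_cost = sum(t["tb"] for t in tiers.values()) * tiers["hot"]["usd_per_tb_month"]
savings = 1 - tiered_cost / all_hot_cost   # fraction saved vs. keeping all data hot
```

Under these made-up rates, tiering cuts the monthly bill by roughly three quarters relative to keeping everything in hot storage, which is why agencies weigh access patterns before picking an architecture; the trade-off is retrieval latency and per-access fees on the colder tiers.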
Success depends on getting those pieces in place, he said, but additional strategies can provide shortcuts for healthcare agencies.
For instance, look at how practitioners in other domains have done it.
"Sometimes in parallel industries, things like high performance computing, anomaly detection, image recognition, there may be other agencies or organizations doing this," Soltis said. "Some of the cross-government communities have been really successful here, looking at those use cases, where you can apply that technology and that solution to healthcare data."
The specific strategy leading organizations take, he added, takes data as the primary focus, and moves data to the cloud first ahead of applications.
Contractors can help too. Soltis said GDIT's secure, cloud-native data reference architecture applies in many circumstances. He cited the Indian Health Service, which is modernizing its EHR system to incorporate the cloud.
New Study Finds Salesforce Economy Will Create 9.3 Million Jobs and $1.6 Trillion in New Business Revenues by 2026 – PRNewswire
Posted: at 8:42 am
SAN FRANCISCO, Sept. 20, 2021 /PRNewswire/ -- Salesforce (NYSE: CRM), the global leader in CRM, today announced a new study from IDC that finds Salesforce and its ecosystem of partners will create 9.3 million new jobs and $1.6 trillion in new business revenues worldwide by 2026. The study also finds that Salesforce is driving immense growth for its partner ecosystem, which will make $6.19 for every $1 Salesforce makes by 2026.
Building digital HQs helps solve for urgent transformation needs
IDC forecasts that cloud-related technologies will account for 27% of digital transformation IT spending this year, growing to 37% in 2026, as businesses focus on establishing digital HQs to deliver customer and employee success from anywhere. Remote work, contactless customer engagement, and sustainability efforts are becoming more prevalent than ever, and IDC expects this trend will only continue.
As more companies build out digital HQs to support an increasingly remote workforce, Salesforce technologies have helped its customers adjust to uncertainty, enabling remote work and remote contact with customers, and making it possible to develop new products in weeks, not months.
IDC also conducted a survey of 525 enterprises across eight countries on cloud deployment and the benefits and challenges of cloud computing. Of the 74% of survey respondents who say their organizations have a formal digital transformation strategy, 97% rate cloud computing as important to that strategy. The survey also found that Salesforce solutions have enabled:
Salesforce technologies can also help companies plan for a more sustainable future. IDC forecasts that from 2021 to 2024, migration from on-premises software to the cloud could reduce as much as 1 billion metric tons of CO2. Salesforce itself has set a goal of pursuing 100% renewable energy for its global operations by 2022, and currently delivers a carbon-neutral cloud to all its customers. And, according to IDC's customer survey, 39% of Salesforce customers surveyed look to Salesforce as a source of support in reaching their own sustainability objectives.
Salesforce partner ecosystem helps drive worldwide acceleration of growth
IDC predicts that the use of Salesforce and its ecosystem's cloud services will generate $308 billion in the customer base this year and more than double that in 2026, at $724 billion. Today, the ecosystem of Salesforce partners delivering cloud services to customers is five times as big as Salesforce itself, and will be more than six times as big in 2026. The study also found that 2026 ecosystem revenues are forecast to be 3.5 times those in 2020.
"The Salesforce partner ecosystem extends the power of Salesforce to companies of all sizes, across industries and helps make customer success possible," said Tyler Prince, EVP, Alliances & Channels, Salesforce. "As Salesforce grows, so do our partners and we are committed to providing our expanding partner ecosystem with the tools needed to succeed in the jobs of the future."
Salesforce paves pathways to help unlock career opportunities in the Salesforce Economy
23% of new jobs created in the Salesforce customer base this year leverage significant digital skills such as using automation tools, the Internet of Things (IoT), and other complex applications. Trailhead, Salesforce's free online learning platform, and its Trailblazer Community, which accelerates this learning through peer-to-peer knowledge sharing and support, empower anyone to learn digital skills for the growing Salesforce economy.
Salesforce is also helping companies navigate their digital transformations through these platforms; 84% of survey respondents at companies using Trailhead say it's important to their organization's deployment of cloud solutions, and 44% said it's "critically" so.
"For Salesforce, it's not only about creating new technology and career opportunities; we have to pave pathways to these new jobs," said Kris Lande, SVP, Trailblazer Ecosystem, Salesforce. "We've made it our mission to empower people with the tools they need to build dynamic careers, companies, and communities with Salesforce, and thrive in a digital-first world."
How Salesforce is creating jobs to fuel the Salesforce Economy
Salesforce has a number of programs and initiatives to help create the jobs of the future and to fill them with well-equipped candidates:
What is the Salesforce Economy? IDC defines "The Salesforce Economy" as the footprint of Salesforce and its partner ecosystem on the economy at large. This includes the revenues and jobs directly generated in the Salesforce customer base from the use of Salesforce and its partners' cloud services, as well as jobs created indirectly in the economy by local spending by those direct employees and by Salesforce and its partners themselves.
Salesforce's multi-faceted partner ecosystem is a driving force behind the Salesforce Economy's massive growth:
Additional Resources
IDC Methodology
The Salesforce Economic Impact Model is an extension to IDC's IT Economic Impact Model. It estimates Salesforce's current and future share of the benefits to the general economy generated by cloud computing, and it also estimates the size of the ecosystem supporting Salesforce using IDC's market research on the ratio of spending on professional services to cloud subscriptions; the ratio of sales of hardware, software, and networking to spending on public and private cloud computing; and the ratio of spending on application development tools to applications developed.
Note that the ecosystem may include companies that are not formal business partners of Salesforce but that nevertheless sell products or services associated with Salesforce implementations.
IDC White Paper, sponsored by Salesforce, "The Salesforce Economic Impact," doc #US48214821, September 20, 2021
[1] IDC's WW Spending Guide on Digital Transformation, 2021
[2] The Impact of Digital Transformation During Times of Change, July 2020
[3] IDC Press Release, "Cloud Computing Could Eliminate a Billion Metric Tons of CO2 Emission Over the Next Four Years, and Possibly More, According to a New IDC Forecast," March 2021
About Salesforce
Salesforce is the global leader in Customer Relationship Management (CRM), bringing companies closer to their customers in the digital age. Founded in 1999, Salesforce enables companies of every size and industry to take advantage of powerful technologies (cloud, mobile, social, internet of things, artificial intelligence, voice, and blockchain) to create a 360-degree view of their customers. For more information about Salesforce (NYSE: CRM), visit: http://www.salesforce.com.
SOURCE Salesforce
Why we need green hosting and higher density services now more than ever – TechRadar
Posted: at 8:42 am
It would be easy to adopt a fatalistic attitude in the wake of the IPCC's report on climate change: that the changes necessary are too big for any individual or enterprise to tackle. But the lesson to be learned is that, while the crisis is dire, we can and should make every effort to limit our impact on the environment.
This includes hosting. We tend to think of cloud computing as emission-free because everything happens out of sight, but it isn't: cloud computing now accounts for about 2% of global CO2 emissions.
That might seem small, but another way to think about it is that one of every 50 tonnes of CO2 emitted comes from online hosting.
But there are reasons to embrace green hosting that go beyond just saving the planet.
One simple way to reduce emissions, though it's not comprehensive, is to use money as a proxy: if you're spending money on something, it is most likely contributing to emissions.
Reducing spending can be a reasonable target for any business that wants to emit less. One way to approach this is through the use of high-density services.
One common feature of modern life is that we use tools that are designed for a far more powerful job. We commute or take short shopping trips in vehicles designed to travel way over the speed limit, deal with extreme off-road conditions, or move tonnes of stuff. We wear jackets that claim to deal with arctic temperatures when it starts to get a little autumnal, and sports gear for elite athletes when out for a light jog.
Computers are no different, whether it's personal computing - a top-of-the-range gaming PC used to idly browse the web, say - or the equivalent in the enterprise. Computers often have boring lives given how powerful their processors are, the size of their storage, and their high-speed internet connections.
Often, they are given a small task that they only need to do part of the time, or with a fraction of their processing power. This unused computing power of deployed computers could be used more efficiently by increasing service density - simply put, using fewer computers to do the same tasks.
With hosting, service density is key to reducing the carbon footprint of any workload and to the energy efficiency of a data center. Every application has resources allocated to it when it needs them; when those resources sit unused, they can be reallocated to another application or service.
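At its simplest, increasing service density is a packing problem: fit many small workloads onto as few machines as possible. A minimal first-fit sketch (all workload numbers here are illustrative, not from the article):

```python
def consolidate(workloads, server_capacity):
    """First-fit packing: place each workload on the first server with
    spare capacity, opening a new server only when nothing fits."""
    servers = []  # each entry is the load (in capacity units) on one server
    for load in workloads:
        for i, used in enumerate(servers):
            if used + load <= server_capacity:
                servers[i] = used + load
                break
        else:
            servers.append(load)  # no existing server had room
    return servers

# 12 small workloads (% CPU each) that previously ran on 12 dedicated machines:
workloads = [10, 25, 5, 30, 15, 20, 10, 5, 25, 15, 10, 20]
packed = consolidate(workloads, server_capacity=100)
print(f"{len(workloads)} dedicated machines -> {len(packed)} shared servers")
# prints "12 dedicated machines -> 2 shared servers"
```

Real schedulers (and containerised hosting platforms) are far more sophisticated, but the payoff is the same: fewer machines powered on for the same work.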
We've seen this work in action. In one case, a university was able to replace sites hosted on 2,000 devices scattered across its campus with dense, containerised hosting, creating an immediate 30% reduction in hosting costs.
Running fewer computers means both saving money and cutting emissions through reduced energy use - not to mention less e-waste when components need to be replaced.
Businesses are keen to do more to reduce their environmental impact, but green hosting is often far down the list of options being considered. Hosting is very much out of sight, unlike, for example, the waste a business might produce.
Green hosting has the potential to be an easy win, reducing emissions and saving costs at the same time. Many businesses and organisations have built their IT in an ad-hoc and DIY manner, leading to inefficiencies - not just in density but in how IT and development teams work with this infrastructure.
Consolidation and higher density can mean increased efficiencies in these teams, too. There is also the opportunity to consolidate infrastructure into a single bill, and even move to a more efficient data center where possible - moving cloud workloads to the Nordics can mean even greater emissions savings.
When businesses are looking for ways to be more eco-friendly, whether it's buying offsets, considering more hybrid working, or any number of disruptive or expensive options, it's important to consider the quick wins.
Green hosting may be less visible than many other options, but it has the potential to both slash emissions and save money. Every business should consider it.
Leveraging cloud computing capabilities can help organizations reduce their carbon footprint – Express Computer
As per a report by McKinsey & Company, migrating assets to the cloud became one of the key business priorities globally during Covid-19. In 2020, among the many factors that sustained the larger ecosystem, technology, and cloud adoption in particular, played an instrumental role. Since the pandemic, business models have pivoted to cater to new-normal consumer needs like online shopping, increased demand for video streaming, doorstep healthcare facilities, online education, and much more. The need for robust yet efficient cloud computing has thus become relevant and meaningful in the overall consumer experience matrix.
Businesses are increasingly adopting cloud technologies for their functional benefits: pay-as-you-go pricing, flexibility to scale, security, agility, mobility, data as an asset, collaboration, quality control, disaster recovery, loss prevention, automated software updates, competitive advantage, and, last but not least, sustainability. Cloud's popularity grows as it facilitates intelligent technologies and other tech-intensive solutions, in place of on-premise deployments that can be vulnerable to changing environmental and business requirements.
Amidst this, while technology proliferation is positive for growth and modern innovation, there is a need to make its impact less intrusive to the environment. Datacenters are core to our technological needs, but they consume a lot of electricity, not only for computing but also for cooling the heat generated by computing equipment, resulting in CO2 emissions. As responsible corporates and communities, we need to make dedicated attempts to draw electricity from renewable sources such as solar and wind. While it takes effort and investment to go carbon neutral, it does pay off.
According to a forecast from International Data Corp. (IDC) released in March 2021, continued adoption of cloud computing could prevent the emission of more than 1 billion metric tons of carbon dioxide (CO2) from 2021 through 2024. The Asia Pacific region in particular relies on coal for much of the power generation across its datacenters and accounts for significant CO2 emissions.
Cloud computing's aggregated compute resources are a key driver in reducing carbon emissions, as the framework can efficiently utilize power capacity, optimize cooling, leverage the most power-efficient servers, and increase server utilization rates. Alongside the switch to renewable sources of energy, cloud infrastructure is inherently well suited to improving energy efficiency because:
Efficient resource management: the pay-as-you-go model in cloud computing encourages individual users to utilize services judiciously, reducing wastage.
It reduces the carbon emissions of multiple physical servers: virtualization allows cloud solutions to be delivered from a single server that can run multiple operating systems simultaneously.
In an automated environment, users can operate at higher utilization ratios, and consolidation reduces the demand on physical infrastructure.
The cloud is also unaffected by multiple users and organizations accessing its common infrastructure, as automation can balance workloads and minimize the need for additional infrastructure or resources.
Modern and efficient cloud data centers are taking the idea of Green IT forward in a meaningful way, saving not just the environment but also building a more robust ecosystem.
The differentiated ability to shift IT service workloads virtually to any location in the world also creates an opportunity to make greater use of whatever renewable energy sources are available at that location.
Sustainability is frequently viewed from an operational point of view, with environmental goals treated as a cost center, a risk, or a compliance requirement. Green datacenters and sustainable cloud infrastructure go beyond the business; they are incredible opportunities to give back to the communities where we operate.
If datacenters are designed for sustainability, starting with a shift to cleaner, renewable sources of energy like wind and solar power and LED usage across facilities, then carbon emissions can be reduced. An efficient data center diverts energy towards running the IT equipment rather than cooling the environment in which it resides.
Businesses in several countries are taking the lead in shifting their IT systems to cloud centers and are deriving immense value from the exercise. They are not only able to tackle fluctuations in the electricity supply but also add to their overall brand image and reputation among stakeholders as environmentally conscious entities. Businesses may become carbon neutral through carbon offset efforts or by designing data centers with efficiency and environmental protection as guiding principles, helping accelerate sustainability goals.
Authored by AS Rajgopal, MD & CEO, NxtGen Infinite Datacentre
If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]
DGT Releases Result for 1st Batch of Advanced Diploma (Vocational) in IT, Networking, Cloud Computing – News18
The Directorate General of Training (DGT), Ministry of Skill Development and Entrepreneurship, has announced the results of the first batch (2018-20) of the advanced diploma (vocational) in IT, networking, and cloud computing.
The first batch included 19 trainees, of which 14 cleared the exam. In addition, 18 trainees have been offered placements with IBM and its channel partners. Vurukuti Pavan Kumar from NSTI Hyderabad secured the top rank, while Vinod Kumar K V from the Bengaluru branch placed second and Dusa Srilekha third.
The course began in 2018 at the two National Skill Training Institutes (NSTI) in Hyderabad and Bangalore on a pilot basis but was expanded to 16 NSTIs in 2019. The course is approved by the National Council for Vocational Training (NCVT) as a level 6 National Skills Qualification Framework (NSQF) program.
The two-year course includes industry-relevant courses on hardware maintenance, web development, cloud-based development and deployment, analytics, and soft skills training.
In the first year, there are five core modules of 320 hours each, which are credit-based, independent, and focused on employment skills. In the second year, the trainee selects two of three elective modules, each of 320 hours, and completes 800 hours of on-the-job paid training supported by IBM, which also provides a monthly stipend to each trainee for the remaining duration of training (1.5 years) from the third batch onwards.
"With the pandemic forcing companies to rapidly adopt new-age technology solutions to run their businesses, demand for the right skills in artificial intelligence, analytics, cloud computing, cyber security, etc., is on the rise. A 2020 IBV study states that 6 out of 10 companies plan to accelerate their digital transformation efforts, but inadequate skillsets are one of the biggest hurdles to their progress," said Manoj Balachandran, CSR Leader, IBM India/South Asia.
Cloud Security Alliance Releases New Guidance for Healthcare Delivery Organizations That Provides Measurable Approach to Detecting and Defending…
With 560 ransomware attacks on healthcare providers in 2020, HDOs must architect their cloud for failure to better protect patient data
BELLEVUE, Wash., September 16, 2021--(BUSINESS WIRE)--The Cloud Security Alliance (CSA), the world's leading organization dedicated to defining standards, certifications, and best practices to help ensure a secure cloud computing environment, today released Ransomware in the Healthcare Cloud, new guidance from the CSA Health Information Management Working Group. The document explains how cybercriminals use ransomware to attack both the healthcare delivery organization (HDO) and the cloud service provider, and offers security practitioners strategies for detecting ransomware and protecting an HDO's data.
"When one considers that 2020 saw a 715-percent year-over-year increase in ransomware attacks and the devastating effects and cost ransomware leaves in its wake, its no wonder HDOs are under significant strain to prevent these attacks. Ransomware can significantly impact an HDOs operation, patient safety, and reputation and cause a complete shutdown, putting patients at risk. This makes it imperative that they do all they can to secure their data regardless of where its housed," said Dr. Jim Angle, the papers author and co-chair of the Health Information Management Working Group.
Presented in accordance with the National Institute of Standards and Technology (NIST) Cybersecurity Framework's structure of identify, protect, detect, respond, and recover, the guidance takes a structured, measurable approach to defending against ransomware and details the processes HDOs should be taking to lessen the chance of a successful attack. The document, in addition to reviewing the seven stages of a ransomware attack and common social and physical engineering attack vectors, points readers to several control frameworks, including the Cloud Controls Matrix, an industry-recognized cybersecurity control framework for cloud computing, that can be used to support the NIST Cybersecurity Framework.
"Ransomware attacks can be devastating for HDOs. Not only is there the potential loss of valuable and irreplaceable files, but it can take hundreds of hours of manpower to remove the infection and get systems working again. Its critical that HDOs have a clear understanding of their business and technology so they can apply the appropriate security measures and mitigate their risk," said John Yeoh, Global Vice President of Research, Cloud Security Alliance.
As the paper explains, traditional backup methods no longer suffice in the face of time-delayed ransomware attacks. Nor are public clouds impervious: while they do offer greater protection, cloud storage is increasingly being used to back up healthcare data, making it too a popular target for ransomware attacks. To protect patients' data, HDOs must architect their cloud for failure, beginning with identifying the HDO's assets, business environment, governance, risk management, and supply chain. To help users ensure they are following the proper steps, the document also includes a quick-response checklist from the Department of Health and Human Services, Office for Civil Rights.
Download the full Ransomware in the Healthcare Cloud now.
The CSA Health Information Management Working Group aims to provide a direct influence on how health information service providers deliver secure cloud solutions (services, transport, applications, and storage) to their clients, and to foster cloud awareness within all aspects of healthcare and related industries. Individuals interested in becoming involved in Health Information Management future research and initiatives are invited to join the working group.
About Cloud Security Alliance
The Cloud Security Alliance (CSA) is the world's leading organization dedicated to defining and raising awareness of best practices to help ensure a secure cloud computing environment. CSA harnesses the subject matter expertise of industry practitioners, associations, governments, and its corporate and individual members to offer cloud security-specific research, education, training, certification, events, and products. CSA's activities, knowledge, and extensive network benefit the entire community impacted by cloud, from providers and customers to governments, entrepreneurs, and the assurance industry, and provide a forum through which different parties can work together to create and maintain a trusted cloud ecosystem. For further information, visit us at http://www.cloudsecurityalliance.org, and follow us on Twitter @cloudsa.
View source version on businesswire.com: https://www.businesswire.com/news/home/20210916005009/en/
Contacts
Kari Walker for the CSA, kari@zagcommunications.com
Where The Big Freelance Opportunities Are Now: Tech Insights From Coursera – Forbes
Coursera has just published a comprehensive analysis of tech and data science skills on an industry-by-industry basis, and it's very worthwhile reading for any freelancer working in one or more of the industries covered by the report, which I encourage you to read in detail. Here's why:
Here's an industry-by-industry look at specific strengths and gaps in tech and data science skills, and the opportunities thus presented for tech freelancers, based on Coursera's proficiency scoring:
Automotive
Tech:
Data Science:
Talent Opportunity: Auto must keep up with advancing technology. Microsoft estimates 6m new tech FTEs by 2025 in software development, cloud and data roles, data analysis, machine learning and AI.
Consumer goods
Tech:
Data Science:
Talent Opportunity: McKinsey predicts 17-19m new FTEs in data science, software development, AI, and robotics. Microsoft estimates 4m new technology FTEs in the consumer goods industry by 2025.
Energy and utilities
Tech:
Data Science:
Talent Opportunity: The industry is predicted to have 4m new technology FTEs by 2025, with more than 2m in software development; 1m cloud and data FTEs, and 500k FTEs added in data analysis, machine learning, and AI.
Financial services
Tech:
Data Science:
Talent Opportunity: By 2025, the industry may create 14m new digital FTEs, requiring skills in software development, data analysis, machine learning, AI, cloud computing and data, cybersecurity, and privacy.
Healthcare
Tech:
Data Science:
Talent Opportunity: New tech FTEs are forecast to grow by 5m by 2025, with key demand skills in software development, data analysis, machine learning, and AI, as well as cloud and data roles, and cybersecurity.
Insurance
Tech:
Data Science:
Talent Opportunity: The industry expects 2m new FTEs by 2025 in software development, data analysis, machine learning, AI, cloud computing, cybersecurity, and data science.
Manufacturing
Tech:
Data Science:
Talent Opportunity: More than 2 million unfilled roles plague manufacturing firms. The industry may need 20m new technology FTEs by 2025.
Professional services
Tech:
Data Science:
Talent Opportunity: Microsoft estimates 12m new tech FTEs in services by 2025, including software development, machine learning, AI, cloud, and cybersecurity.
Technology
Tech:
Data Science:
Talent Opportunity: IT is projected to grow by 500k FTEs in the US. Global software and IT services may need 45m FTEs by 2025, with 5m estimated in hardware and networking.
Telecommunications
Tech:
Data Science:
Talent opportunity: Microsoft forecasts 3m new tech FTEs by 2025 in software development, cloud, data, and cybersecurity. As 5G expands, engineers with hybrid technical and business skills are essential.
Watching the industry evolve, like Gretzky
The legendary Canadian hockey player Wayne Gretzky was once described by a sportscaster something like this: "He sees the whole game, like he's watching from the stands, and sees where the play and puck will be 30 seconds from now." Coursera's analysis, combined with other helpful surveys by Contra and Payoneer, gives the rest of us a bit of that kind of competitive insight, what we call seeing around corners. Tech freelancers, take note.
Viva la Revolution!
Astronomers Should be Willing to Look Closer at Weird Objects in the Sky – Scientific American
When purchasing a new phone or tablet, it is common practice to select the best technology that fits your needs within the available budget. This is also the strategy adopted by our research team at the Galileo Project, a new initiative to image unidentified aerial phenomena (UAP) like those reported by the Office of the Director of National Intelligence (ODNI) to the U.S. Congress on June 25, 2021.
To my amusement, I recently came across an online retailer that would allow us to add to cart a one-meter telescope for half a million dollars. Fortunately, cheaper telescopes are all that is needed for surveying the sky at the proper resolution to identify UAP.
Under typical weather conditions, Earth's atmosphere is opaque to infrared light beyond a distance of about 10 kilometers or less. Resolving a feature the size of a cell phone on the surface of a UAP at that distance requires a telescope diameter on the order of 10 centimeters. Having a few such telescopes on a given site will allow us to monitor the motion of an object in three dimensions. These telescopes could be supplemented by a radar system that would distinguish a physical object in the sky from a weather pattern or a mirage.
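The aperture figure can be checked against the Rayleigh diffraction limit. A back-of-the-envelope sketch (the wavelength and feature size here are assumptions, since the article gives only orders of magnitude):

```python
wavelength = 550e-9  # metres; visible light (assumed -- infrared gives a larger answer)
distance = 10e3      # metres; the ~10 km atmospheric transparency limit cited above
feature = 0.15       # metres; roughly a cell phone (assumed)

# Rayleigh criterion: smallest resolvable angle ~ 1.22 * wavelength / aperture,
# so resolving an angle of (feature / distance) requires an aperture of:
aperture = 1.22 * wavelength * distance / feature
print(f"required aperture ~ {aperture * 100:.1f} cm")  # prints "required aperture ~ 4.5 cm"
```

A few centimetres at visible wavelengths; repeating the calculation at infrared wavelengths of a few microns gives roughly the 10 cm order the article quotes.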
If UAP are solid objects, they should heat up as they rub against air at high speed. The surfaces of objects that move in air faster than sound, such as supersonic airplanes or space rockets, are heated by hundreds of degrees. I calculated that the infrared glow of fast objects above a meter in size, supplemented by the heat from shockwaves in the air around them or an engine they carry, should be detectable with infrared sensors on telescopes out to the desired distance.
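The hundreds-of-degrees figure is consistent with the standard adiabatic stagnation-temperature relation for air, T0 = T * (1 + (gamma - 1)/2 * M^2). A sketch (the ambient temperature is an assumed sea-level value, not from the article):

```python
gamma = 1.4        # ratio of specific heats for air
T_ambient = 288.0  # K; sea-level standard atmosphere (assumed)

def stagnation_rise(mach, T=T_ambient):
    """Temperature rise (K) above ambient at the stagnation point,
    from T0 = T * (1 + (gamma - 1)/2 * M^2)."""
    return T * ((gamma - 1) / 2) * mach**2

for mach in (1, 2, 3):
    print(f"Mach {mach}: surface heating ~{stagnation_rise(mach):.0f} K above ambient")
```

At Mach 2 the rise is already over 200 K, i.e. "hundreds of degrees" for anything moving well past the speed of sound, in line with the supersonic-aircraft comparison above.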
The data from a system of optical, infrared or radio telescopes will be fed to state-of-the-art video cameras linked to software that will filter out objects of interest for the telescope to track. If a bird flies above a common astronomical observatory, it will be ignored. The Galileo-Scopes will track it. Human-made drones or airplanes might be of great interest to some residents of Washington, D.C., but they are as uninteresting as birds for the Galileo Project.
My student Amir Siraj and I calculated that the number of UAP described in the ODNI report corresponds to about one object per hundred thousand square kilometers per year (with large uncertainties up to a factor of 100). This is well below the rate of unidentified objects from cell phone photographs or civilian eyewitness testimonies, implying that many of these unofficial sightings may have mundane explanations. Millions of cell phones with millimeter-size apertures are inferior to what Galileo proposes: a much smaller number of optimized telescope systems with apertures 100 times larger that are designed to rapidly track UAP.
The iceberg of classified reports, of which only the tip has been exposed publicly, may contain higher-quality images than those released to the public. Galileo's goal is to capture new crisp images with better instruments than have ever been used by civilians. The full data set from the project will be open, whereas much of the data associated with the ODNI report is classified because it was obtained by government-owned sensors. Because the sky is not classified, Galileo-Scopes will operate just like common astronomical telescopes, except that they will focus on nearby objects. We aim to change the intellectual landscape of UAP studies by bringing them into the mainstream of credible scientific inquiry.
In my book Extraterrestrial, published half a year ago, I argued that bringing the search for technological relics into the mainstream of astronomy would attract new funds and young talent to science. In recent weeks, this forecast became a reality. The Galileo Project has attracted millions of dollars from private donors and thousands of commitments from volunteers who offered to contribute their time and resources. Given the low incidence of UAP reported by ODNI, however, the project will need hundreds of telescopes to find UAP over a few years. That represents an order of magnitude more funding than we have collected so far.
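The jump from the reported incidence rate to "hundreds of telescopes" can be reproduced from the numbers already given (the per-site coverage radius is an assumption based on the ~10 km visibility limit discussed earlier):

```python
import math

rate_per_km2_yr = 1e-5       # ~1 UAP per 100,000 km^2 per year (ODNI-based estimate above)
coverage_radius_km = 10      # per-site visibility limit (assumed from the 10 km figure)

site_area = math.pi * coverage_radius_km**2        # ~314 km^2 monitored per site
expected_per_site = rate_per_km2_yr * site_area    # ~0.003 expected objects per site-year
sites_for_one_per_year = 1 / expected_per_site
print(f"~{sites_for_one_per_year:.0f} sites for one expected detection per year")
```

The result is on the order of 300 sites, which is why the project estimates needing hundreds of telescopes to find UAP over a few years, given the large uncertainty in the rate itself.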
With good enough data, extraterrestrial technologies can be distinguished from terrestrial technologies or natural objects. The Galileo Project will attempt to obtain this data from both UAP and unusual interstellar objects like `Oumuamua.
If prehistoric cave dwellers were to discover a cell phone, they would initially assume it to be a shiny rock of a type never seen before. But this might be the beginning of their learning experience. By pressing buttons on this weird rock, these early humans would record voices and images.
Similarly, the strange `Oumuamua has been interpreted as a new type of asteroid, such as a frozen chunk of pure hydrogen or nitrogen. But what if high-resolution images of such a weird object revealed buttons? It could encourage us to learn more by landing on the surface, just as the OSIRIS-REx craft recently landed on the asteroid Bennu. Here's hoping that astronomers will be open-minded enough to check.
This is an opinion and analysis article; the views expressed by theauthor or authorsare not necessarily those of Scientific American.