The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Charden
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Daily Archives: November 28, 2021
Novavax to Participate in Evercore ISI’s 4th Annual HealthCONx Virtual Conference – PRNewswire
Posted: November 28, 2021 at 9:49 pm
GAITHERSBURG, Md., Nov. 23, 2021 /PRNewswire/ -- Novavax, Inc. (Nasdaq: NVAX), a biotechnology company dedicated to developing and commercializing next-generation vaccines for serious infectious diseases, today announced that it will participate in Evercore ISI's 4th Annual HealthCONx Virtual Conference. Novavax' recombinant nanoparticle protein-based COVID-19 vaccine candidate, NVX-CoV2373, will be a topic of discussion.
Conference Details:
Fireside Chat
- Date: Thursday, December 2, 2021
- Time: 9:15 to 9:35 a.m. Eastern Time (ET)
- Moderator: Josh Schimmer
- Novavax participants: Gregory M. Glenn, M.D., President, Research and Development, and John J. Trizzino, Executive Vice President, Chief Commercial Officer and Chief Business Officer
Conference Event: Investor meetings
- Date: Thursday, December 2, 2021
A replay of the recorded fireside session will be available through the events page of the Company's website at ir.novavax.com for 90 days.
About Novavax
Novavax, Inc. (Nasdaq: NVAX) is a biotechnology company that promotes improved health globally through the discovery, development and commercialization of innovative vaccines to prevent serious infectious diseases. The company's proprietary recombinant technology platform harnesses the power and speed of genetic engineering to efficiently produce highly immunogenic nanoparticles designed to address urgent global health needs. NVX-CoV2373, the company's COVID-19 vaccine, received Emergency Use Authorization in Indonesia and the Philippines and has been submitted for regulatory authorization in multiple markets globally. NanoFlu, the company's quadrivalent influenza nanoparticle vaccine, met all primary objectives in its pivotal Phase 3 clinical trial in older adults. Novavax is currently evaluating a COVID-NanoFlu combination vaccine in a Phase 1/2 clinical trial, which combines the company's NVX-CoV2373 and NanoFlu vaccine candidates. These vaccine candidates incorporate Novavax' proprietary saponin-based Matrix-M adjuvant to enhance the immune response and stimulate high levels of neutralizing antibodies.
For more information, visit http://www.novavax.com and connect with us on Twitter and LinkedIn.
Contacts:
Investors: Novavax, Inc. | Erika Schultz | 240-268-2022 | [email protected]
Solebury Trout | Alexandra Roy | 617-221-9197 | [email protected]
Media: Alison Chartan | 240-720-7804 | Laura Keenan Lindsey | 202-709-7521 | [email protected]
SOURCE Novavax, Inc.
Original post:
Novavax to Participate in Evercore ISI's 4th Annual HealthCONx Virtual Conference - PRNewswire
Posted in Genetic Engineering
The plenary session of the Cuban Academy of Sciences today – SmallCapNews.co.uk
Posted: at 9:49 pm
Havana, November 27. The Cuban Academy of Sciences (ACC) will hold its regular plenary session today, both in person and remotely, at the headquarters of the Information Technology and Advanced Remote Services Company (CITMATEL).
The deliberations will take place by video conference in four rooms prepared for academics from the provinces of Havana and Mayabeque, Liliam Alvarez Diaz, Doctor of Physical and Mathematical Sciences and secretary of the Academy, told CNA.
She explained that members belonging to the provincial branches will take part in the online discussions from the delegations of the Ministry of Science, Technology and Environment (CITMA).
According to the programme, one of the issues to be brought before the academics concerns the general programs corresponding to the National Economic and Social Development Plan 2030.
The other will be the accountability report of the ACC, delivered by its president, Luis Velázquez Pérez, MD, a second-degree specialist in physiology.
The Cuban Academy of Sciences expanded its advisory role last May, when 420 scientific figures took part in its most recent internal election.
The election is held every six years, and the academic body currently consists of those elected for the 2018-2024 period: 183 full members, 100 merit members, 44 honorary members and 31 corresponding members, who exercise an advisory role.
CITMATEL is one of the four national entities with High Technology status, characterized by extensive R&D and innovation activity, as well as the production and marketing of high value-added products and services, with an emphasis on exports.
The others are the Center for Genetic Engineering and Biotechnology and the Center of Molecular Immunology (both in Havana province), and the National Biopreparados center (in Mayabeque). (Lino Lupine Perez)
Read more:
The plenary session of the Cuban Academy of Sciences today - SmallCapNews.co.uk
Posted in Genetic Engineering
Technology is killing our shared reality | Information Age | ACS – ACS
Posted: at 9:49 pm
Has social media already done too much damage? Image: Shutterstock
Unchecked technological advancement is destroying our shared reality and sowing severe discord in social democracies, 2021 Nobel Peace Prize winner Maria Ressa has warned.
Speaking at the Australian Strategic Policy Institute's Sydney Dialogue event last week, Ressa, founder of Filipino news site Rappler and the country's first Nobel Laureate, described what she called big tech's "insidious manipulation of human biology".
"There's something fundamentally wrong with our information ecosystem because the platforms that deliver the facts are actually biased against the facts," Ressa said.
"The world's largest delivery platform for news is Facebook and social media in general has become a big behaviour modification system."
Ressa was responding to questions about whether social media companies ought to create different versions of their platforms to protect weaker democracies from the damaging effects of online propaganda campaigns and misinformation.
One of the revelations in the recent Facebook Papers was that the social media company struggled to effectively moderate content to match its growing scale: the more places Facebook reached, it seemed, the less control its Silicon Valley headquarters appeared to have over the way information moved.
"Our biology is very, very vulnerable to this technology," Ressa said.
"The design of this technology and the way it can insidiously manipulate people is powerful in the same way as genetic engineering technology."
She used the example of gene editing technology CRISPR, saying that governments and regulators put guard rails in place very quickly around it during its development.
"This is what we failed to do collectively on information technology," Ressa continued.
"Now it is manipulating our minds insidiously, creating alternate realities, and making it impossible for us to think slow at a time when we need to solve existential problems."
Junk food for the mind
Ressa's fellow panelist in the discussion, Dr Zeynep Tufekci, an Associate Professor at the University of North Carolina and long-time critic of the use of data to manipulate information flows, agreed that the information technology landscape as it stands is toxic to individuals and societies.
Dr Tufekci said the ongoing debate around censorship by tech companies and social media often ignores the more fundamental problem with how these products get designed in the first place.
"It's easier to try and say who should we kick off which platform, and harder to think about how we need to shift the entire information ecology by design," she said.
"It's like food. If you have humans who evolved under conditions of hunger, and then you build a cafeteria the business model of which is to keep you there, that cafeteria is going to serve you chips, ice cream, chips, ice cream, one after the other.
"In that case you have taken a very human vulnerability, hunger, and you've monetised it using an automated cafeteria."
Similarly, Dr Tufekci suggests the human need for information, knowledge, and social connection has been monetised in a way that takes advantage, as Ressa said, of our very biology.
She doesn't blame the engineers working on these technologies, but rather suggests we have done a poor job of incentivising companies to build more careful products.
"It's not because the people working on these technologies are not great or smart or well-meaning," Dr Tufekci said.
"[Fixing this] has to be something that we ask them to do, rather than not telling them what to do, then getting mad at them."
One fix at a time
Twitter's Head of Legal, Policy and Trust, Vijaya Gadde, defended the position of social media companies by saying that solving some of the well-known problems with these platforms isn't always simple, but that it can be done.
"We piloted a bunch of things at Twitter, like what we call nudges," Gadde said.
"These are just quick little pop-ups that appear before you're tweeting information or before you're retweeting an article, which might say 'did you actually read this article?' or a warning to say 'this was considered misleading by certain groups, are you sure you want to retweet this?'
"And we've had remarkable incidents of reducing harm on the platform because of those little speed bumps that we're putting in place.
"So those are the types of things I want to encourage platforms to do and experiment with, but the thing is that if the solutions were easy, we would have found them and implemented them already."
Read this article:
Technology is killing our shared reality | Information Age | ACS - ACS
Posted in Genetic Engineering
How Dr. Fauci and Other Officials Withheld Information on China’s Coronavirus Experiments – Newsweek
Posted: at 9:49 pm
For half a year, Anthony Fauci, the nation's top infectious-disease official, and Kentucky senator and physician Rand Paul have been locked in a battle over whether the National Institutes of Health funded dangerous "gain of function" research at the Wuhan Institute of Virology (WIV) and whether that research could have played a role in the pandemic. Against Senator Paul's aggressive questioning over three separate hearings, Dr. Fauci adamantly denied the charge. "The NIH has not ever and does not now fund gain-of-function research in the Wuhan Institute of Virology," he said in their first fracas on May 11, a position he has steadfastly maintained.
Recently, however, a tranche of documents surfaced that complicate Dr. Fauci's denials. The documents, obtained by Freedom of Information Act requests, show that the NIH was funding research at the Wuhan lab that involved manipulating coronaviruses in ways that could have made them more transmissible and deadly to humans, work that arguably fits the definition of gain-of-function. The documents establish that top NIH officials were concerned that the work may have crossed a line the U.S. government had drawn against funding such risky research. The funding came from the NIH's National Institute of Allergy and Infectious Diseases (NIAID), which Dr. Fauci heads.
The resistance among Dr. Fauci and other NIH officials to be forthcoming with information that could inform the debate over the origins of COVID-19 illustrates the old Watergate-era saw that the coverup is often worse than the crime. There's no evidence that the experiments in question had any direct bearing on the pandemic. In the past, Dr. Fauci has made strong arguments for why this type of research, albeit risky, was necessary to prevent future pandemics, and he could have done so again. But the NIH has dragged its feet over FOIA requests on the matter, handing over documents only after The Intercept took the agency to court.
The apparent eagerness to conceal the documents has only raised suspicions about the controversial research and put the NIH on the defensive. Fauci told ABC, "neither I nor Dr. Francis Collins, the director of the NIH, lied or misled about what we've done." The episode is a self-inflicted wound that has further eroded trust in the nation's public health officials at a time when that trust is most important.
While Dr. Fauci takes the political heat, the revelations center on another figure in this drama: Peter Daszak, president of the private research firm EcoHealth Alliance, which received the $3 million NIH grant for coronavirus research and subcontracted the gain-of-function experiments to the Wuhan lab. The activities of Daszak and EcoHealth before the pandemic and during it show a startling lack of transparency about their work with coronaviruses and raise questions about what more there may be to learn.
From the start, Daszak has worked vigorously to discredit any notion that the pandemic could have been the result of a lab accident. When the media was first grappling with the basics of the situation, Daszak organized a letter in the prestigious medical journal The Lancet from 27 scientists, to "strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin," and got himself appointed to the WHO team investigating COVID origins, where he successfully argued that there was no need to look into the WIV's archives.
What Daszak didn't reveal at the time was that the WIV had been using the NIH grant money to genetically engineer dozens of novel coronaviruses discovered in bat samples, and that he knew it was entirely possible that one of those samples had contained SARS-CoV-2 and had infected a researcher, as he conceded to the journal Science in a November 17 interview: "Of course it's possible; things have happened in the past."
The NIH fought for more than a year to keep details about the EcoHealth grant under wraps. The 528 pages of proposals, conditions, emails, and progress reports revealed that EcoHealth had funded experiments at the WIV that were considerably riskier than the ones previously disclosed.
The trouble began in May 2016, when EcoHealth informed the NIH that it wanted to conduct a series of new experiments during the third year of its five-year grant. One proposed producing "chimeras" made from one SARS-like virus and the spike proteins (which the virus uses to infiltrate animal cells) of others, and testing them in "humanized" mice, which had been genetically engineered to have human-like receptors in their lungs, making them better stand-ins for people. When such novel viruses are created, there is always a risk they will turn out to be dangerous pathogens in their own right.
Another risky experiment involved the MERS virus. Although MERS is lethal (it kills 35 percent of those who catch it), it's not highly transmissible, which is partly why it has claimed fewer than 900 lives so far. EcoHealth wanted to graft the spikes of other related coronaviruses onto MERS to see how that changed its abilities.
Both experiments seemed to cross the gain-of-function line. NIH program officers said as much, sending Daszak a letter asking him to explain why he thought they didn't.
In his reply, Daszak argued that because the new spikes being added to the chimeras were more distantly related to SARS and MERS than their original spikes, he didn't anticipate any enhanced pathogenicity or infectiousness. That was a key distinction that arguably made them exempt from the NIH's prohibition on gain-of-function experiments. But, of course, one never knows; as a precaution, he offered that if any of the chimeric viruses began to grow 10 times better than the natural viruses, which would suggest enhanced fitness, EcoHealth would immediately stop all experiments, inform the NIH program officers, and together they'd figure out what to do next.
The NIH accepted Daszak's terms, inserting his suggestions into the grant conditions. Scientists at WIV conducted the experiments in 2018. To their surprise, the SARS-like chimeras quickly grew 10,000 times better than the natural virus, flourishing in the lab's humanized mice and making them sicker than the original. They had the hallmarks of very dangerous pathogens.
WIV and EcoHealth did not stop the experiment as required. Nor did they let the NIH know what was going on. The results were buried in figure 35 of EcoHealth's year-four progress report, delivered in April 2018.
Did the NIH call Peter Daszak in to explain himself? It did not. There are no signs in the released documents that the NIH even noticed the alarming results. In fact, NIH signaled its enthusiasm for the project by granting EcoHealth a $7.5 million, five-year renewal in 2019. (The Trump administration suspended the grant in 2020, when EcoHealth's relationship with the WIV came under scrutiny.)
In a letter to Congress on October 20, the NIH's Principal Deputy Director, Lawrence Tabak, acknowledged the screwup, but he laid the blame at EcoHealth's door, citing its duty to immediately report the enhanced growth that had occurred: "EcoHealth failed to report this finding right away, as was required by the terms of the grant." In a follow-up interview with the Washington Post, NIH Director Francis Collins was more blunt: "They messed up here. There's going to be some consequences for EcoHealth." So far, the NIH has not elaborated on what those consequences might be.
As damning as the NIH grant documents are, they pale in comparison to another EcoHealth grant proposal leaked to the online investigative group DRASTIC in September. In that 2018 proposal to the Defense Advanced Research Projects Agency, a Pentagon research arm, EcoHealth sketched an elaborate plan to discover what it would take to turn a garden-variety coronavirus into a pandemic pathogen. They proposed widely sampling Chinese bats in search of new SARS-related viruses, grafting the spike proteins from those viruses onto other viruses they had in the lab to create a suite of chimeras, then, through genetic engineering, introducing mutations into those chimeras and testing them in humanized mice.
One piece of the proposal was especially Strangelovian. For years, scientists had known that adding a special type of "cleavage site" to the spike could supercharge a virus's transmissibility. Although many viruses in nature have such sites, neither SARS nor any of its cousins do. EcoHealth proposed incorporating human-optimized cleavage sites into the SARS-like viruses it discovered and testing their infectiousness. Such a cleavage site, of course, is exactly what makes SARS-CoV-2 wildly more infectious than its kin. That detail was the reason some scientists initially suspected SARS-CoV-2 might have been engineered in a lab. And while there's no proof that EcoHealth or the WIV ever actively experimented with cleavage sites (EcoHealth says that "the research was never conducted"), the proposal makes it clear that they were considering taking that step as early as 2018.
DARPA rejected the proposal, listing among its shortcomings the failures to address the risks of gain-of-function research and the lack of discussion of ethical, legal, and social issues. It was a levelheaded assessment. What's remarkable is that much of the same work that crossed a line for the Department of Defense was embraced by the National Institutes of Health.
The NIH and EcoHealth have asserted that none of the engineered viruses created with the NIH grant could have become SARS-CoV-2. On that, everyone agrees: the viruses are too distantly related. But the detailed recipe in the DARPA application is a blueprint for doing just that with a more closely related virus.
In September, scientists from France's Pasteur Institute announced the discovery of just such a virus, SARS-CoV-2's closest known relative, in a bat cave in Laos. Although still too distant from SARS-CoV-2 to have been the direct progenitor, and lacking the all-important cleavage site, it was a kissing cousin.
The discovery was hailed by some scientists as evidence that SARS-CoV-2 must have had a natural origin. But the plot turned in November, when another trove of NIH documents, released in response to a FOIA request by the White Coat Waste Project, brought the evidence trail right to EcoHealth's doorstep.
In 2017, EcoHealth had informed the NIH that it would be shifting its focus to Laos and other countries in Southeast Asia, where the wildlife trade was more active, relying on local partner organizations to do the sample collecting and to send the samples to the WIV for their ongoing work. EcoHealth told Newsweek that it did not directly undertake or fund any of the sampling in Laos. "Any samples or results from Laos are based on WIV's work, funded through other mechanisms," says a company spokesman.
Regardless of who paid for the collecting portion of the project, it's clear that for years, a large number of bat samples from the region that harbors viruses similar to SARS-CoV-2 were sent to the WIV. In other words, EcoHealth's team was in the right place at the right time to have found things very close to SARS-CoV-2 and to have sent them to Wuhan. Because there's a lag of several years between when samples are collected and when experiments involving those viruses are published, the most recent papers from EcoHealth and the WIV date to 2015. The identity of the viruses found between 2016 and 2019 are known only to the two organizations, neither of which has been willing to share that information with the world.
A lack of evidence proves nothing, but neither does it put EcoHealth's or the WIV's actions in the early days of the pandemic in a good light. Why choose not to share valuable information on SARS-like coronaviruses with the world? Why not explain your projects and proposals and give scientists access to the unpublished virus sequences in your databases?
For whatever reason, they chose crisis-management mode instead. The WIV went into lockdown. Databases were taken offline. Daszak launched his preemptive campaign to prevent anyone from looking behind the curtain. And EcoHealth and the NIH tried hard to keep the details of their collaboration private.
Congressional inquiries focusing on Dr. Fauci and the NIH's decisions to fund unnecessarily risky research by a lab in Wuhan are probably forthcoming if, as appears increasingly likely, Republicans take control of Congress after the 2022 midterms. While it's important to understand how the NIH came to use such poor judgment in its dealings with EcoHealth Alliance, that won't tell us much about the WIV's research in the months leading up to the pandemic, especially since China is not likely to open its books. Answers are more likely to lie in the records of EcoHealth Alliance. Republicans and Democrats alike should be eager to find them.
See the article here:
How Dr. Fauci and Other Officials Withheld Information on China's Coronavirus Experiments - Newsweek
Posted in Genetic Engineering
AstraZeneca: Five innovations from Cambridge's new £1bn headquarters – ITV News
Posted: at 9:49 pm
During the pandemic, Cambridge-based AstraZeneca became a household name for its role in creating a Covid-19 vaccine alongside scientists from Oxford University.
But the biopharmaceutical company has also led the way in several other cutting-edge scientific innovations.
The company has more than 76,000 employees worldwide, and its work focuses on developing prescription medication in areas such as oncology, rare diseases and the respiratory system.
Much of that work will now be driven from its new £1bn Cambridge headquarters - so here are five ways that the research centre is leading the way.
1. 'Heart-in-a-jar'
In collaboration with biotech company Novoheart, scientists at AZ are re-creating miniature organs to help them better understand things like the human heart.
A mini beating heart is created using the company's "3D human ventricular cardiac organoid chamber" - better known as the heart-in-a-jar. Scientists hope it will help them understand the characteristics of heart failure better, and therefore get treatments to patients quicker.
2. Functional genomics
Scientists are finding new ways of understanding how human genes work. Through what they call 'functional genomics', AZ is testing the function of a given gene in a relevant disease model. And that, they say, will help them understand the complex relationship between our DNA and disease.
3. Using 'living medicines' to find cancer cells hiding in the body
Scientists are looking at regenerating tissues and organs by extracting a patient's own cells, or by using cells which have been expanded in the lab or enhanced through genetic engineering.
Those cells are then used to produce "living medicines" and are administered to the patient - known as cell therapy. It builds on research that analyses the way serious diseases affect different parts of the body.
The aim is to find ways to target and arm these living medicines to locate and destroy cancer cells that hide in the body, including even the hardest-to-treat solid tumours.
4. Cancer 'warheads'
AZ scientists say they are "re-defining" cancer by replacing chemotherapy with targeted, personalised therapies. While chemo kills cancer cells, it also impacts healthy ones too.
AZ is working on a tailored treatment it calls "the warhead". It is designed to kill cells and, unlike chemotherapy, it can achieve precise cancer cell killing: scientists attach the warhead to an antibody that provides cancer cell selectivity, for example by targeting a protein that is highly expressed in breast cancer.
5. Clinical trials of the future
AstraZeneca is hoping to change the way pharmaceutical companies conduct clinical research, encouraging a more "holistic and human-centred" type of care.
Scientists want to do this by altering the design of clinical trials themselves in a way that gives patients the best experience possible.
Read more about science innovation in the Anglia region here:
See original here:
AstraZeneca: Five innovations from Cambridge's new £1bn headquarters - ITV News
Posted in Genetic Engineering
NIST Cloud Computing Program – NCCP | NIST
Posted: at 9:49 pm
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics (On-demand self-service, Broad network access, Resource pooling, Rapid elasticity, Measured Service); three service models (Cloud Software as a Service (SaaS), Cloud Platform as a Service (PaaS), Cloud Infrastructure as a Service (IaaS)); and, four deployment models (Private cloud, Community cloud, Public cloud, Hybrid cloud). Key enabling technologies include: (1) fast wide-area networks, (2) powerful, inexpensive server computers, and (3) high-performance virtualization for commodity hardware.
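For readers who want that taxonomy at a glance, the sketch below simply restates the 5-3-4 structure of the NIST definition above as a small Python data structure. The variable names and layout are our own illustration, not anything taken from NIST's documents.

```python
# Minimal restatement of the NIST cloud model described above:
# 5 essential characteristics, 3 service models, 4 deployment models.
# The dictionary layout is an illustrative choice, not NIST notation.
NIST_CLOUD_MODEL = {
    "essential_characteristics": [
        "On-demand self-service",
        "Broad network access",
        "Resource pooling",
        "Rapid elasticity",
        "Measured service",
    ],
    "service_models": [
        "Software as a Service (SaaS)",
        "Platform as a Service (PaaS)",
        "Infrastructure as a Service (IaaS)",
    ],
    "deployment_models": [
        "Private cloud",
        "Community cloud",
        "Public cloud",
        "Hybrid cloud",
    ],
}

if __name__ == "__main__":
    for part, items in NIST_CLOUD_MODEL.items():
        print(f"{part} ({len(items)}): {', '.join(items)}")
```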
The Cloud Computing model offers the promise of massive cost savings combined with increased IT agility. It is considered critical that government and industry begin adoption of this technology in response to difficult economic constraints. However, cloud computing technology challenges many traditional approaches to datacenter and enterprise application design and management. Cloud computing is currently being used; however, security, interoperability, and portability are cited as major barriers to broader adoption.
The long term goal is to provide thought leadership and guidance around the cloud computing paradigm to catalyze its use within industry and government. NIST aims to shorten the adoption cycle, which will enable near-term cost savings and increased ability to quickly create and deploy enterprise applications. NIST aims to foster cloud computing systems and practices that support interoperability, portability, and security requirements that are appropriate and achievable for important usage scenarios.
Posted in Cloud Computing
What is Cloud Computing? | Oracle
Posted: at 9:49 pm
There are three types of clouds: public, private, and hybrid. Each type requires a different level of management from the customer and provides a different level of security.
In a public cloud, the entire computing infrastructure is located on the premises of the cloud provider, and the provider delivers services to the customer over the internet. Customers do not have to maintain their own IT and can quickly add more users or computing power as needed. In this model, multiple tenants share the cloud provider's IT infrastructure.
A private cloud is used exclusively by one organization. It could be hosted at the organization's location or at the cloud provider's data center. A private cloud provides the highest level of security and control.
As the name suggests, a hybrid cloud is a combination of both public and private clouds. Generally, hybrid cloud customers host their business-critical applications on their own servers for more security and control, and store their secondary applications at the cloud provider's location.
The main difference between hybrid cloud and multicloud is the use of multiple cloud computing and storage devices in a single architecture.
There are three main types of cloud services: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). There's no one-size-fits-all approach to cloud; it's more about finding the right solution to support your business requirements.
SaaS is a software delivery model in which the cloud provider hosts the customer's applications at the cloud provider's location. The customer accesses those applications over the internet. Rather than paying for and maintaining their own computing infrastructure, SaaS customers subscribe to the service on a pay-as-you-go basis.
Many businesses find SaaS to be the ideal solution because it enables them to get up and running quickly with the most innovative technology available. Automatic updates reduce the burden on in-house resources. Customers can scale services to support fluctuating workloads, adding more services or features as they grow. A modern cloud suite provides complete software for every business need, including customer experience, customer relationship management, customer service, enterprise resource planning, procurement, financial management, human capital management, talent management, payroll, supply chain management, enterprise planning, and more.
PaaS gives customers the advantage of accessing the developer tools they need to build and manage mobile and web applications without investing in, or maintaining, the underlying infrastructure. The provider hosts the infrastructure and middleware components, and the customer accesses those services via a web browser.
To aid productivity, PaaS solutions need to have ready-to-use programming components that allow developers to build new capabilities into their applications, including innovative technologies such as artificial intelligence (AI), chatbots, blockchain, and the Internet of Things (IoT). The right PaaS offering also should include solutions for analysts, end users, and professional IT administrators, including big data analytics, content management, database management, systems management, and security.
IaaS enables customers to access infrastructure services on an on-demand basis via the internet. The key advantage is that the cloud provider hosts the infrastructure components that provide compute, storage, and network capacity so that subscribers can run their workloads in the cloud. The cloud subscriber is usually responsible for installing, configuring, securing, and maintaining any software on the cloud native solutions, such as database, middleware, and application software.
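To make the division of responsibility concrete, the sketch below restates the three service models as a simple responsibility matrix. It is a rough simplification drawn from the descriptions above, not Oracle's documentation; the layer names are our own, and real offerings differ in the details.

```python
# Illustrative responsibility split for SaaS, PaaS and IaaS as described
# above. "provider"/"customer" assignments are a simplification.
LAYERS = [
    "applications",
    "data & middleware",
    "operating system",
    "virtualization",
    "servers, storage, network",
]

MANAGED_BY = {
    "SaaS": {layer: "provider" for layer in LAYERS},
    "PaaS": {layer: ("customer" if layer == "applications" else "provider")
             for layer in LAYERS},
    "IaaS": {layer: ("provider" if layer in ("virtualization",
                                             "servers, storage, network")
                     else "customer")
             for layer in LAYERS},
}

for model, layers in MANAGED_BY.items():
    customer_side = [l for l, who in layers.items() if who == "customer"]
    managed = ", ".join(customer_side) if customer_side else "nothing beyond configuration and use"
    print(f"{model}: customer manages {managed}")
```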
Posted in Cloud Computing
Micronations, A World Tour Of 8 Bizzaro Spots Barely On The Map – Worldcrunch
Posted: at 9:49 pm
Taiwanese businessman James Chang has been mired in a long battle with municipal authorities over what he sees as "excessive" taxes on the hotel he owns on the eastern coast of Australia.
So when all traditional legal and political means have been exhausted, what do you do?
Chang's idea, according to the Brisbane-based Courier Mail daily, is to try to turn his Rockhampton Plaza Hotel into a country of its own.
Welcome to the far-out world of micronations, mini-states that range from a personal hobby to a political stance. This unusual designation has sprung up over the decades as a solution for everything from protesting your government to wanting to issue your own banknotes, or even simply being able to sing your very own self-acclaiming national anthem. Counts vary, but there are currently around 100 active micronations around the world, with more than 400 recorded in recent history.
It's unclear whether Chang will join their ranks, but he's vowed to write for permission to the Queen of England (Australia is part of the British Commonwealth) and/or take his case to international courts. His bid is not so surprising, as estimates suggest that one-third of all micronations are located in Australia, earning the vast country a reputation as micronation central.
But there have been others spread across the world over the years, from the Principality of Sealand, a platform in the North Sea, to the Republic of Rose Island, which was once bombed by the Italian government. Worldcrunch takes you on a tour of some of the world's more peculiar micronations.
The flag of the Republic of Parva Domus
Wikimedia
Petite intro
The self-proclaimed micronation is easy to miss: with a territory of just 0.2 km², it is nestled in the Punta Carretas neighborhood of the Uruguayan capital, Montevideo. We know the history of the Pilgrims who first sailed the Atlantic to escape religious persecution, and of Liberia, created as a place for former African slaves to live freely. The Republic of Parva Domus? Well, 140 years ago, it was established by amateur Uruguayan fishermen who wanted their own handy storage unit for their fishing rods.
Big idea
The Latin inscription on the house's front, Parva Domus Magna Quies, encapsulates the nation's motto: "small house, big rest." Like in any other country in the world, those who enter its grounds have to abide by the nation's Constitution. Among the centuries-old rules, citizens are forbidden to talk about touchy subjects such as politics, religion and, yes, sports. The nation's mission is "pleasure and happiness," as proclaimed in its national anthem.
The male-only democracy also punishes any member who gets angry, putting him in a cage before holding a trial. The defendant is allowed a lawyer, who isn't too hard to find: while women aren't allowed, many of the Parva citizens are lawyers, doctors or politicians.
As of today, 843,297 citizenships have been granted since Parva Domus' foundation in 1878. The "day of independence," as they call it, is celebrated every year with a parade of citizens wearing goofy costumes through the streets of Punta Carretas. At this point, if this sounds more like an odd private club or a fraternity than a micronation to you, you're not the only one.
The long platform was constructed in 1968
Wikimedia
Petite intro
Floating in the Adriatic Sea between Cesenatico and Rimini, Italy, the Republic of Rose Island was a 400-square-meter platform that served mostly as a tourist attraction 11 kilometers off the Rimini coast. Constructed in 1968, the platform built by Bologna-born engineer Giorgio Rosa became a real republic in only 55 days, from getting its own flag and national language, Esperanto, to writing an official constitution.
Big idea
Take a European climate of anti-conformism in the late 1960s, add an innovative Italian engineer, and you end up with the short-lived Respubliko de la Insulo de la Rozoj.
Sadly, it took the Italian government 55 days to decide on and organize the island's destruction. After this episode, the UN changed Italy's maritime delimitation rules from 9.6 km to 17.7 km. The small micronation had been assembled on a platform built with an innovative technique that Rosa later patented.
Paddy Roy Bates claimed the platform in 1967
Flickr
Petite intro
The tiny platform of 4,000 m² is located in the North Sea, approximately 12 kilometers off the coast of Suffolk. Paddy Roy Bates, a British citizen and pirate radio broadcaster, ejected the radio station's crew and claimed the platform as his own from 1967 onwards.
Big idea
This unrecognized man-made structure has hereditary royal rulers, fantasy passports, a flag and a national anthem. It sits on a platform that was constructed during World War II as a defense emplacement and later occupied by a pirate radio station.
In August 1978, German-born Alexander Achenbach, self-proclaimed Prime Minister of Sealand, hired mercenaries to attack the micronation and take Bates's son hostage. Bates was able to retake Sealand, capture Achenbach and charge him with "treason against Sealand"; he would stay imprisoned unless he paid close to $35,000. Germany had to send a diplomat to negotiate Achenbach's release, and after the negotiation the Bates family claimed that the diplomat's visit amounted to de facto recognition of Sealand by Germany.
Sealand sells "fantasy passports" that are not valid for international travel, and in 1997 the Bates family revoked all Sealand passports issued over the previous 22 years. The move was meant to dismantle an international money laundering ring that had been using the sale of fake Sealand passports to finance drug trafficking and money laundering. In 1987, the United Kingdom extended its territorial waters to 12 nautical miles, and Sealand now lies within British territorial waters.
Georgette Bertin-Pourchet, president of the République and daughter of Georges Pourchet
Leo Delafontaine
Petite intro
In the département of Doubs, in eastern France, the Republic of Saugeais extends over 128 km² and 11 municipalities. In 1947, the prefect of the department of Doubs came to Montbenoît and had lunch at the Hôtel de l'Abbaye, owned by Georges Pourchet. Jokingly, Pourchet asked the prefect if he had a permit to enter the Republic of Saugeais. After Pourchet invented details about the fantasy republic, the prefect responded by naming him president of the Free Republic of Saugeais.
Big idea
Georges' widow, president Gabrielle Pourchet, took the micronation to another level by appointing a prime minister, a general secretary, twelve ambassadors and over 300 honorary citizens. Langue Saugette, a Franco-Provençal dialect, is the language of the song adopted as the national anthem, written in 1910. In what could seem like a kind of recognition, the French Postal Service created a postal stamp to commemorate the Republic in 1987.
Petite intro
Atlantium was established in 1981 by three Sydney teenagers, who claimed a 10 m² provisional territory in the suburb of Narwee, Australia, as its first capital. In 1999, the founder's 61-square-meter apartment became the second capital, and Concordia became the third capital of Atlantium in 2008, when the rural 0.76 km² Province of Aurora was created. In 2015, the micronation had almost 3,000 "citizens," most of whom signed up online from more than 100 countries and have never been to Atlantium.
Big idea
Atlantium was described as "a refreshing antidote to the reactionary self-aggrandizement of so many micronations" by The Lonely Planet Guide to Home-Made Nations in 2006. The book highlights the micronation's progressive, liberal policies and describes it as a "secular humanist utopia."
The micronation sold stamps, coins and banknotes online and claimed to use the profits for the Empire's ongoing operations and charitable causes. Atlantium's political stances include support for unrestricted international freedom of movement and for the right to abortion and euthanasia. The microstate insists its citizens should be active in the political process no matter where they live.
A set of stamps from the Gay & Lesbian Kingdom of the Coral Sea Islands
Wikimedia
Petite intro
In 2004, a group of gay rights activists from Australia established the micronation as a symbolic political protest, after the government of Australia's refusal to recognize same-sex marriages. It was located in Australia's external overseas territory of the Coral Sea Islands, uninhabited islets east of the Great Barrier Reef.
Big idea
Matthew Briggs came up with the initiative to establish the "gay kingdom" during Brisbane Gay and Lesbian Pride Festival in 2003. On June 14, 2004, the activists arrived at Cato island on the Gayflower ship, raised the rainbow pride flag and declared the Coral Sea Islands an independent gay and lesbian state. A campsite at Cato named Heaven was claimed to be the capital of the micronation. The Kingdom was dissolved in November 2017 after Australia legalized gay marriage.
Excerpt from:
Micronations, A World Tour Of 8 Bizzaro Spots Barely On The Map - Worldcrunch
Posted in Micronations
Cloud computing has won. But we still don’t know what that means – ZDNet
Posted: at 9:49 pm
There's little doubt that cloud computing is now the absolutely dominant force across enterprise computing. Most companies have switched from buying their own hardware and software to renting both from vendors who host their services in vast anonymous data centers around the globe.
Tech analysts are predicting that the vast majority of new computing workloads will go straight into the cloud, and most companies will switch to a cloud-first policy in the next couple of years: total cloud spending will soon hit $500 billion.
There are plenty of good reasons for this. The cloud companies, whether that's software-as-a-service or infrastructure-as-a-service or any other as-a-service, are experts at what they do, and can harness the economies of scale that come with delivering the same service to a vast number of customers. Most companies don't need to be experts in running email servers or invoicing systems when it doesn't bring them any real competitive advantage, so it makes sense to hand over these commodity technologies to cloud providers.
Still, that doesn't mean every consequence of the move to the cloud is resolved.
Renting is often more expensive than buying, so keeping a lid on cloud costs remains a challenge for many businesses. And hybrid cloud, where enterprises pick and choose the best services for their needs and then try to connect them up, is increasingly in vogue. Few companies want to trust their entire infrastructure to one provider; services do go down and everyone needs a backup option. The risk of vendor lock-in in the cloud is something companies increasingly want to avoid.
The impact of cloud computing on skills is more complex. Certainly the shift has seen some tech jobs disappear as companies no longer need to manage basic services themselves. Tech staff will need to shift from maintaining systems to developing new ones, most likely by tying cloud services together. That's going to be important for companies that want to create new services out of the cloud, but it's a significant skills shift for many staff to go from admin to developer and not everyone will want to.
Also, as those administrator jobs vanish, the career path in IT will shift, too: skills around project management, innovation and teamwork will become more important for tech workers that want to move up.
There's no obvious cloud-computing backlash ahead right now. Even a few major outages have done little to shake confidence in the idea that, for most applications and most organisations, the cloud makes business sense. However, the implications of that decision may take a few years to play out yet.
Continue reading here:
Cloud computing has won. But we still don't know what that means - ZDNet
Posted in Cloud Computing
Why your cloud computing costs are so high – and what you can do about them – SiliconANGLE News
Posted: at 9:49 pm
Small mistakes in the cloud can have big consequences.
John Purcell, chief product officer at custom developer DoiT International Ltd., tells of one customer who made a keystroke error that caused the company to spin up an Amazon Web Services Inc. instance much larger than what was needed. A job that was supposed to finish on Friday was never turned off and ran all weekend, resulting in $300,000 in unnecessary charges. "There is a small single-digit percentage of companies that manage cloud costs well," he said.
Fifteen years after Amazon.com Inc. launched the first modern cloud infrastructure service, customers are still coming to grips with how to plan for and manage in an environment with dozens of variables that don't exist in the data center, including a nearly limitless capacity to waste money.
DoiT's Purcell: "There is a small single-digit percentage of companies that manage cloud costs well." Photo: DoiT
Not that this is slowing cloud adoption. The recently released 2022 State of IT Report from Spiceworks Inc. and Ziff Davis Inc. reported that 50% of business workloads are expected to run in the cloud by 2023, up from 40% in 2021. But information technology executives express frustration at the difficulty of getting the visibility they need to plan accurately for cloud infrastructure costs.
A recent survey of 350 IT and cloud decision-makers by cloud observability platform maker Virtana Inc. found that 82% said they had incurred unnecessary cloud costs, 56% lack tools to manage their spending programmatically and 86% cant easily get a global view of all their costs when they need it. Gartner Inc. predicts that 60% of infrastructure and operations leaders will encounter public cloud cost overruns. And Flexera Software LLCs 2020 State of the Cloud Report estimated that 30% of enterprise spending on cloud infrastructure is wasted.
"Nearly 50% of cloud infrastructure spend is unaccounted for," estimated Asim Razzaq, chief executive of Yotascale Inc., which makes dynamic cost management software targeted at engineers.
The issue leapt into public view earlier this year in a post titled "The Cost of Cloud, a Trillion Dollar Paradox." Martin Casado and Sarah Wang at the venture capital firm Andreessen Horowitz concluded that, for software companies operating at large scale, the cost of cloud could double a firm's infrastructure bill, resulting in a collective loss of $100 billion in market value based on the impact of cloud costs on margins.
Although not everyone agreed with the analysis, it's clear that cloud costs can rise quickly and unexpectedly, and it's something even staunch cloud advocates say needs to be addressed head-on. The topic is likely to be discussed this coming week in the exhibit halls at the AWS re:Invent conference in Las Vegas, since AWS remains far and away the largest cloud services provider.
No one is laying the blame for the situation squarely at the door of infrastructure-as-a-service companies. "Every single cloud service provider wants good revenue," said Eugene Khvostov, vice president of product and engineering at Apptio Inc., a maker of IT cost management products. "They don't want to make money on resources that aren't used."
But the sheer complexity of options users have for deploying workloads, compounded by multiple discount plans, weak governance policies and epic bills, can frustrate anyone trying to get a coordinated picture of how much they're spending. "You want granularity, but the costs are coming in every day, every hour, every second," Khvostov said.
"Cloud bills can be 50 pages long," agreed Randy Randhawa, Virtana's senior vice president of research and development and engineering. "Figuring out where to optimize is difficult."
Much of the reason for cloud cost overruns comes down to organizations failing to understand and accommodate the fundamental differences between the data center capital-expense cost model and the operating-expense nature of the cloud. Simply stated, the costs of running a data center are front-end loaded into equipment procurement, but the marginal cost of operating that equipment once it's up and running is relatively trivial.
Yotascale's Razzaq: "Nearly 50% of cloud infrastructure spend is unaccounted for." Photo: Yotascale
In the cloud, there are no capital expenses. Rather, costs accrue over time based on the size, duration and other characteristics of the workload. That means budgeting for and managing cloud resources is a constant ongoing process that requires unique tools, oversight and governance.
To get a sense of where economies can be achieved, SiliconANGLE contacted numerous experts who specialize in cloud economics. Their approaches to helping clients rein in costs range from automated tools that look for cost-saving opportunities to consulting services centered on budgeting and organizational discipline. They identified roughly three major areas where money is most often wasted (provisioning, storage and foregone discounts), as well as an assortment of opportunities for what Yotascale's Razzaq called "micro-wastage," or small money drips that add up over time.
Provisioning infrastructure in the cloud is pretty much the same as it is in the data center. The application owner or developer specifies what hardware and software resources are needed and a dedicated virtual server or instance is allocated that matches those requirements.
If needs change, though, the time and cost dynamics of the data center and the cloud diverge. Getting access to additional on-premises memory or storage can take hours or days in the case of a virtual machine and weeks if new hardware must be procured.
In contrast, cloud providers allow additional resources and machines to be quickly allocated, either on a temporary or a permanent basis. "There is no longer a need to provision a workload with all the capacity it will need over its lifespan," said Karl Adriaenssens, chief technology officer at GCSIT Inc., an infrastructure engineering firm.
Old habits die hard, though. To accommodate the relative inflexibility of data center infrastructure, developers and application owners tend to overestimate the resources that will be needed for a given workload. "Developers are concerned about making sure their apps perform well and they tend to overprovision to be on the safe side," said Harish Doddala, senior director of product management at cloud cost management firm Harness Inc.
Incentives reward this approach. "You don't get into trouble if you overspend a little but you do get into trouble if the application doesn't perform," said Razzaq.
All cloud platform providers offer autoscaling capabilities that make allocating additional capacity automatic, with the only charges being for the additional capacity used. However, users often don't think to deploy them.
As a result, 40% of cloud-based instances are at least one size too big, estimates DoiT's Purcell. "You'd be surprised how often workloads run at 5% to 10% utilization."
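As a rough illustration of how a team might hunt for the kind of underused instances Purcell describes, here is a minimal Python sketch that uses boto3 and CloudWatch CPU metrics. It assumes AWS credentials and a region are already configured, treats average CPU as a crude proxy for utilization, and the 10% threshold and 14-day window are arbitrary choices of our own, not anything the vendors quoted here prescribe.

```python
# Sketch: flag running EC2 instances whose average CPU over the last two
# weeks falls below a chosen threshold, as candidates for right-sizing.
import datetime
import boto3

CPU_THRESHOLD = 10.0                     # percent; arbitrary "underused" cut-off
LOOKBACK = datetime.timedelta(days=14)   # arbitrary observation window

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

end = datetime.datetime.utcnow()
start = end - LOOKBACK

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=start,
            EndTime=end,
            Period=3600,                 # hourly datapoints
            Statistics=["Average"],
        )
        points = stats["Datapoints"]
        if not points:
            continue
        avg_cpu = sum(p["Average"] for p in points) / len(points)
        if avg_cpu < CPU_THRESHOLD:
            print(f"{instance_id} ({instance['InstanceType']}): "
                  f"avg CPU {avg_cpu:.1f}% -> consider downsizing")
```

A report like this is only a starting point; memory, I/O and application-level metrics would need to be checked before actually resizing anything.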
CloudCheckr's Rehl: "Scaling down an RDS database is painful. It may change your license type." Photo: CloudCheckr
As attractive as autoscaling sounds, experts advise using it with caution. On-demand instances, which are the most commonly used but also the most expensive type of cloud VM, can run up large bills if capacity expands too much. Autoscaling can become addictive, prompting users to create multiple database copies and log files that eat up storage and money.
"Autoscaling will allow you to meet just about any workload requirement, but running almost unlimited scaling with on-demand instances can get out of hand," said Adriaenssens.
The tactic also doesn't always work as easily in reverse. If database sizes grow during a period of high activity, they may exceed a threshold that makes it difficult to scale the instance back down again. "If you scale down [Amazon's] RDS, it's painful. It may change your license type," said Travis Rehl, vice president of product at CloudCheckr Inc., a maker of cloud visibility and management software that's being acquired by NetApp Inc. "It's possible but the effort can be very high."
The second type of overprovisioning occurs when cloud instances are left running after theyre no longer needed. In an on-premises environment, this isnt a big problem, but the clock is always running in the cloud.
Usage policies that give users too much latitude to control their own instances are a common culprit. Someone may spin up an instance for a short-term project and then forget to shut it down. It may be months before anyone notices if its noticed at all.
Companies may have a policy of creating new accounts for new workloads, and after hundreds have been created, it becomes a bear to manage, said Razzaq. IT administrators may fear shutting down instances because they dont know whats running in them and the person who could tell them has left the company, he said.
Wasabis Friend: If you want to leave, it can be massively expensive. Photo: SiliconANGLE
Developers, who are more motivated by creating software than by managing costs, are often culprits, particularly when working on tight deadlines. "Typically, the budget is managed by finance, but the ones who actually cause the overruns are the developers themselves," said Harness' Doddala.
When Cognizant Technology Solutions Corp. was called in to help one financial services customer rein in its costs in the Microsoft Corp. Azure cloud, it found numerous unnecessary copies of databases, some of which exceeded a terabyte in size. Virtual machines were running round-the-clock whether needed or not.
"The company was prioritizing deadlines over efficiency," said Ryan Lockard, Cognizant's global chief technology officer. Cognizant cut the customer's cloud costs in half, mainly by imposing operational discipline.
A wide variety of automated tools from the cloud providers and their marketplace partners can help tame runaway instances, but customers often don't have the time to learn how to use them. Simple tactics can yield big savings, though, such as tagging instances so that administrators can view and manage them as a group. "You can specify policies for apps by tags, and within those policy constructs define what you want to track and what actions to take," said Virtana's Randhawa.
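For example, tagging a batch of instances as one project makes it possible to locate and stop them as a group later. The project name and instance ID below are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Tag an instance as belonging to one short-lived project...
ec2.create_tags(
    Resources=["i-0123456789abcdef0"],                      # hypothetical instance ID
    Tags=[{"Key": "project", "Value": "q4-load-test"},
          {"Key": "expires", "Value": "2021-12-31"}],
)

# ...then stop everything carrying that project tag in one pass.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:project", "Values": ["q4-load-test"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]
ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
if ids:
    ec2.stop_instances(InstanceIds=ids)
```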
All cloud providers offer automated tools to manage instances in bulk. For example, Amazon's Systems Manager Automation can start and shut down instances on a predefined schedule, and the company's CloudWatch observability platform has drawn high praise for its ability to spot and stop overages. Microsoft's Azure Cost Management and Billing does the same on the Azure platform, and Google LLC's Active Assist uses machine learning to automate a wide variety of administrative functions, including sizing instances appropriately and identifying discount opportunities.
Numerous well-funded startups are also active in this market, including NetApp Inc.'s Spot for optimizing the use of spot instances, ParkMyCloud for resource optimization and CloudZero for cost visibility. IBM Corp., VMware Inc., Nutanix Inc. and HashiCorp all have footholds in the market. Zesty Tech Ltd. just this week announced a $35 million Series A funding round for an approach that uses artificial intelligence to automatically adjust the instances that allocate storage.
It's cheap to move data into the cloud but expensive to take it out. That means data volumes and costs tend to grow over time, with charges accruing month by month.
Apptio's Khvostov: "Cloud providers don't want to make money on resources that aren't used." Photo: SiliconANGLE
This so-called data gravity is core to keeping customers in the fold, said Corey Quinn, chief cloud economist at The Duckbill Group. The more data the customer commits to a provider, the more applications tend to follow and the greater the risk of abandoned instances because no one wants to delete data, he said. As a result, cloud providers will continue to grow even without new customers.
The costs are attractive (AWS charges a little over a penny per gigabyte for infrequently accessed data), but that creates a temptation to shortcut discipline.
"Studies show that up to 46% of data is just trash," said Gary Lyng, chief marketing officer at Aparavi Software Corp., a distributed data management software provider. "Get rid of that first before you back it up or move it to the cloud."
Time-based pricing can also be insidious over the long term. The two cents per gigabyte per month that AWS charges for S3 storage adds up to nearly a dollar per gigabyte over a four-year period, making it far more expensive than local disk storage.
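The arithmetic is trivial but easy to overlook because the charge recurs every month, as the short calculation below shows (using the roughly $0.02 per gigabyte-month rate cited above).

```python
# Recurring storage cost at the roughly $0.02/GB-month rate cited above.
rate_per_gb_month = 0.02
months = 4 * 12                                  # four years

per_gb = rate_per_gb_month * months
print(f"1 GB kept for {months} months: ${per_gb:.2f}")                   # $0.96
print(f"100 TB kept for {months} months: ${per_gb * 100 * 1024:,.0f}")   # ~$98,304
```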
And getting it out adds to that cost. A customer that downloads 10 terabytes of data per month can expect to pay about $90 for the privilege. Extracting 150 terabytes costs $7,500. "If you want to leave, it can be massively expensive," said David Friend, CEO of cloud storage service provider Wasabi Technologies Inc.
Cloud infrastructure customers may know how much storage they have but not how often they use it, Friend said. That can lead to overpaying for high-availability access to data that is rarely touched. And the more data they have, the more expensive it is to leave, he said.
Archera's Khanna: "Customers leave money on the table by failing to negotiate." Photo: Archera
Data and compute instances are functionally separate in cloud infrastructure, meaning that shutting down a virtual machine doesn't affect the data. "You pay for everything, whether you use it or not," Randhawa said.
Apptio has found tens of thousands of storage instances in the Azure cloud that are orphaned, "not because operations have bad intentions but because they forget to hit the switch to terminate them or move them to cold storage," Khvostov said.
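The same pattern exists on every platform. On AWS, for instance, a short boto3 query can list block storage volumes that are attached to nothing but still accruing charges; this is a generic sketch, not Apptio's tooling.

```python
import boto3

ec2 = boto3.client("ec2")

# EBS volumes in the "available" state are attached to nothing but still billed.
volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]
for v in volumes:
    print(f"{v['VolumeId']}: {v['Size']} GiB unattached since {v['CreateTime'].date()}")
```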
Cloud providers also tie high-performance package offerings, priced by input/output operations per second, to defined database sizes, meaning that buyers seeking the fastest speeds can inadvertently pay for too much storage. "Overprovisioning can get very expensive on the storage side very fast," said GCSIT's Adriaenssens.
As in the case of compute infrastructure, automated tools can move little-used storage to an archive tier automatically, but customers need to know they exist and take the time to configure them. In the meantime, cloud providers have little incentive to make it easy for customers to take data out, since the friction makes switching to other platforms that much more difficult.
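On AWS, for example, an S3 lifecycle rule does the archiving automatically once someone configures it. The bucket name and the 90-day and seven-year thresholds below are illustrative assumptions.

```python
import boto3

s3 = boto3.client("s3")

# Lifecycle rule: move objects to Glacier after 90 days, expire them after ~7 years.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-log-archive",                # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},        # apply to every object in the bucket
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 2555},
            }
        ]
    },
)
```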
Cloud infrastructure providers can deliver bills at nearly any level of granularity a customer desires, but the tradeoff for that specificity is nearly impenetrable complexity. "Cloud providers make all that data available to you, but you have to be looking for it," said DoiT's Purcell.
Numerous discount plans are available, but it's generally up to the customer to ask for them.
"The vendors are happy to teach people how to use the cloud as opposed to understanding the different modalities of working in the cloud," said Aran Khanna, CEO of Archera.ai Inc., a maker of cloud cost management software. Cloud providers say they're more than happy to help customers look for cost savings, and they provide calculators that weigh various options.
Amazon Spot and Reserved Instances (Microsoft calls them Spot VMs and Reserved VM Instances) offer customers deep discounts: Reserved Instances reward a commitment to use capacity over an extended period, while Spot Instances sell surplus capacity temporarily, as it becomes available. There are also discount plans for customers that are willing to exercise some flexibility in shifting workloads across regions.
Virtana's Randhawa: "You pay for everything, whether you use it or not." Photo: Virtana
However, DoiT's Purcell estimates that fewer than 25% of customers take advantage of cost-savings plans such as reserved instances and spot instances. "It's like going to the grocery store; I have a pocket full of coupons, but I have to make sure they're the right ones," he said.
They also tend to be reluctant to accept terms that limit their flexibility. "Where customers leave money on the table is where they buy the least risky thing and don't negotiate," said Archera's Khanna. "It's easier to buy the most expensive option."
Fear of overcommitting can deter users from seeking the most aggressive long-term discounts, but the savings from three-year reserved instance plans, for example, can more than compensate for underutilization, experts say.
A prepaid three-year reserved instance on AWS carries a discount of more than 60%, while the one-year version saves a little over 40%. A customer that is confident it will need an instance for two years would be better off buying the three-year option and letting one year go unused than opting for the smaller discount. AWS also provides a marketplace for buying and selling unused reserved instances, and Microsoft will buy back unused time up to a limit.
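A back-of-the-envelope comparison illustrates the point. The hourly rate below is an assumed placeholder, and the discounts are set just above the thresholds cited here (62% and 41%); actual pricing varies by instance family, region and payment option.

```python
# Compare two years of on-demand use with two one-year reservations and with a
# prepaid three-year reservation that goes partly unused. All rates are assumed.
on_demand_hourly = 0.10          # assumed $/hour for an illustrative instance
hours_per_year = 8760

two_years_on_demand = on_demand_hourly * hours_per_year * 2
two_one_year_reservations = two_years_on_demand * (1 - 0.41)             # ~41% off, bought twice
three_year_prepaid = on_demand_hourly * hours_per_year * 3 * (1 - 0.62)  # ~62% off, one year idle

print(f"Two years on demand:        ${two_years_on_demand:,.0f}")         # $1,752
print(f"Two one-year reservations:  ${two_one_year_reservations:,.0f}")   # ~$1,034
print(f"One three-year reservation: ${three_year_prepaid:,.0f}")          # ~$999
```

Even with a full year of the three-year term wasted, the larger discount comes out slightly ahead under these assumptions, which matches the experts' advice above.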
"Having a negotiated global rate discount plan yields the first base of a strong discounted pricing portfolio," said Cognizant's Lockard. "Combining that with pay-as-you-go-style Reserved Instances allows for credits to be applied for planned future consumption."
GCSIT's Adriaenssens advises users to budget for a balance of reserved, on-demand and even spot instances so that the most economical options are available for a given workload. He also recommends creating a Cloud Center of Excellence team that's responsible for measuring, planning and tuning deployment parameters so that workloads align with a cloud provider's savings plans.
If you're willing to pay someone else to get your cloud costs in order, there are plenty of businesses ready to take your money. Many say they typically save their customers 30% or more, making their fees easy to justify.
Cognizant's Lockard: "Lifted-and-shifted virtual machines are the most expensive way to operate in the cloud." Photo: Ryan Lockard
However, many of the savings can be achieved by simply applying more organizational discipline to the task. That starts with making informed decisions about which applications to put in the cloud in the first place. "The perception that cloud platforms are cheaper is baloney," said CloudCheckr's Rehl. "Cloud is more expensive, but you are intentionally buying speed."
That means leaving legacy applications where they are is often a better strategy than moving them to the cloud, experts advise. Workloads built for a data center environment, where resources are always available once deployed, waste money by design under an opex spending model.
"Legacy applications running in lifted-and-shifted virtual machines are the most expensive way to operate in the cloud," said Cognizant's Lockard. "You are paying for the services and storage 24 hours a day, seven days a week, whether you use them or not."
Legacy applications can also be opaque, with little documentation and no one around who built them in the first place. As a result, said Rehl, "We have seen customers who lift and shift bring over all sorts of things they don't need." They may import data sets they think are necessary even though the data hasn't been touched in a very long time.
Everyone agrees that the best way to optimize costs is to use applications built for the cloud. These take advantage of dynamic scaling, ephemeral servers, storage tiering and other cloud-native features by design. "Cloud management needs to be automated using the many tools the cloud providers have to offer," said Chetan Mathur, CEO of cloud migration specialist Next Pathway Inc.
FinOps is a relatively new discipline that addresses the reality that much of what finance and procurement once took care of is now the domain of engineers, said Archera's Khanna. FinOps brings engineers and financial professionals together to understand each other better and to set guidelines that enable more cost-efficient decision-making. A recent survey by the FinOps Foundation found that the discipline is becoming mainstream, across large enterprises in particular, and that FinOps team sizes grew 75% in the last 12 months.
Major platform shifts always bring disruption, and the move to the cloud is the biggest one most IT managers will see in their careers. Despite the adjustments they're making to a new operating model, most are willing to trade higher costs for business agility and speed to market, even given what FinOps Foundation Executive Director J.R. Storment recently told ZDNet: "The dirty little secret of cloud spend is that the bill never really goes down."
Original post:
Why your cloud computing costs are so high - and what you can do about them - SiliconANGLE News