How Cloud Computing Is Turning the Tide on Heart Attacks – Fortune

When tech people talk about "the cloud," it often comes across as an abstract computer concept. But a visit to a village in India shows how cloud computing can bring about enormous change in far-flung places, and quite literally save lives.

On Wednesday, at the Fortune Brainstorm Health summit in San Diego, cardiologist Charit Bhograj spoke to a medical counterpart in India who was in the course of treating a rural man with chest pains.

As the doctors explained, until recently it was impossible to offer advanced heart treatment in poor villages: It cost too much to administer an electrocardiogram (EKG) and, even if you could get an EKG, the local physician was not in a position to interpret it.

This situation has changed dramatically, however, with the advent of portable EKG devices, specialized software and cloud computing.

In the course of a 10-minute presentation, the audience watched as the physician in India took an EKG reading from the man with chest pains, and relayed the results to Bhograj in San Diego. Bhograj then assessed the results and typed his advice into a tool called Tricog, which the Indian doctor then downloaded via a smartphone app.

This arrangement, which relied on an EKG device supplied by GE Health, represents a striking advancement in technology. But it also has huge health implications.

"It will change the odds of a heart attack taking your life from 80% to an 80% chance you will survive," said Bhograj, explaining how cloud-based medical services are transforming cardiac health in rural areas.

And according to Vikram Damodaran, the chief product officer of Sustainable Health Solutions at GE Healthcare, the transformation is only beginning. He explained that GE has made investments worth $300 million in the public health system in recent years, and that the sort of services appearing in rural India are also expanding to Southeast Asia and Africa.

All of this confirms an observation this morning by Fortune President Alan Murray that there's an incredible burst of innovation taking place in the health care industry right now.

Link:

How Cloud Computing Is Turning the Tide on Heart Attacks - Fortune

Hospital CIOs see benefits of healthcare cloud computing – TechTarget


May 2017, Vol. 5, No. 3

In healthcare, some illnesses can be cured quickly; some can't. But before prescribing a proper remedy, several factors need to be considered about the patient in question. The same can be said when hospital CIOs and IT pros work to formulate a strategy for moving their computing processes to the cloud, sometimes by choice and sometimes out of necessity. Critical issues need to be weighed, such as the security of patient records, the cost of moving off premises, how much information really needs to be stored in the cloud and the actual savings to hospitals as a result of the move.

Our cover story examines these issues through the eyes of hospital CIOs, who see healthcare cloud computing delivering noticeable improvements in security, patient care and cost savings. They're learning to embrace the benefits of moving in part or whole to the cloud as they choose from among the various private, public and hybrid options.

In another feature, we look at the prevalence of mobile devices throughout the hospital community. They can cause migraines for CIOs and IT departments trying to maintain security with healthcare cloud computing safeguards. That's not to mention the inherent resistance IT departments can encounter from doctors, nurses and other hospital staff who share patient healthcare information over their personal smartphones and tablets.

Also in this issue, we look at some steps hospitals will need to take, including revamping IT teams, to gain full advantage of the cloud's benefits. Sometimes baby steps can go a lot farther than giant steps.

View original post here:

Hospital CIOs see benefits of healthcare cloud computing - TechTarget

Verizon sells cloud services to IBM in ‘unique cooperation between … – Cloud Tech

Verizon has announced it is selling its cloud and managed hosting service to IBM, alongside working with the Armonk giant on a number of strategic initiatives involving networking and cloud services.

"This is a unique cooperation between two tech leaders to support global organisations as they look to fully realise the benefits of their cloud computing investments," said George Fischer, SVP and group president of Verizon Enterprise Solutions (VES), in a statement.

Last February, Verizon told customers in an email that it was shutting down any virtual servers running on Public Cloud or Reserved Performance Cloud Spaces on April 12. The company clarified in a statement to CloudTech that it was discontinuing its cloud service that accepts credit card payments. However, John Dinsdale, a chief analyst at Synergy Research, saw things differently.

"Telcos generally are having to take a back seat on cloud and especially on public cloud services," he told this publication last year. "They do not have the focus and the data centre footprint to compete effectively with the hyperscale cloud providers, so they are tending to drop back into subsidiary roles as partners or on-ramps to the leading cloud companies."

How prescient that statement is now. IBM would certainly be classified as one of the hyperscale operators; alongside Amazon Web Services (AWS), Microsoft and Google, the four leading players continue to grow more quickly than the overall market, according to Synergy's figures.

What's more, various links between the two companies mean this move makes sense. John Considine, general manager at IBM Cloud Infrastructure Services, was previously CTO of Verizon Terremark. The companies have also partnered on various initiatives, including the creation of Verizon's cognitive customer experience platform, built using IBM's cloud and infrastructure-as-a-service offerings.

"Our customers want to improve application performance while streamlining operations and securing information in the cloud," Fischer added. "VES is now well positioned to provide those solutions through intelligent networking, managed IT services and business communications."

Verizon said it was notifying affected customers directly, though adding it did not expect any immediate impact to their services. The transaction is expected to close later this year.

Excerpt from:

Verizon sells cloud services to IBM in 'unique cooperation between ... - Cloud Tech

Red Hat’s New Products Centered Around Cloud Computing, Containers – Virtualization Review

Dan's Take

The company made a barrage of announcements at its recent Summit show.

Red Hat has made a number of announcements at its user group conference, Red Hat Summit. They ranged from OpenShift.io, which facilitates the creation of software-as-a-service applications, to pre-built application runtimes for building OpenShift-based workloads, an index to help enterprises build more reliable container-based computing environments, an update to the Red Hat Gluster storage virtualization platform allowing it to be used in an AWS computing environment and, of course, a Red Hat/Amazon Web Services partnership.


The announcements targeted a number of industry hot buttons, including containers, rapid application development, storage virtualization and cloud computing. As with other announcements in the recent past, the company is integrating multiple open source projects and creating commercial-grade software products designed to provide an easy-to-use, reliable and maintainable enterprise computing environment.

In previous announcements, Red Hat has pointed out that it has certified Red Hat software executing in both Microsoft Hyper-V and Azure cloud computing environments. So, the company can claim to support a broad portfolio of enterprise computing environments.

These announcements will be of the most interest to large enterprises since they are the ones most likely to adopt these products. These tools might be used by independent software vendors (ISVs) to create IT solutions for smaller firms as well, leading to potential impact on some small to medium size business.

About the Author

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. He has been a business unit manager at a hardware company and head of corporate marketing and strategy at a software company.

The rest is here:

Red Hat's New Products Centered Around Cloud Computing, Containers - Virtualization Review

Adobe bets big on cloud computing for marketing, creative professionals – Livemint

Mumbai: Known for its Photoshop and Illustrator software packages used primarily by design professionals, Adobe Systems Inc. is now betting big on providing creative and marketing professionals with solutions that reside in the cloud.

Cloud computing typically allows companies to use software as a service (SaaS), paying as they go rather than paying for it upfront.

Adobe's solutions broadly cover three areas: the Document Cloud (to help create and manage documents), Creative Cloud (for designing purposes) and Experience Cloud (to monitor and analyse customer behaviour).

"We couldn't have been more pleased with what we have done with (our) Creative Cloud," Shantanu Narayen, chairman, president and CEO of Adobe, told a media gathering in Mumbai on Wednesday.

Narayen insisted that there is "a massive tailwind of digital" globally, and consumer expectations have risen dramatically. "The next generation of software will be consumer-in," he said, implying that companies need to sharpen their focus on customer satisfaction in today's digital world.

The company's senior executives are also bullish about Adobe's prospects in India. "In India, we are just starting to ride the (customer) experience wave," said Kulmeet Bawa, Adobe's managing director for South Asia. He added that there is a lot of headroom for growth for Adobe in India; the company employs about 5,200 people in the country, 30% of its global headcount.

In this context, Narayen also underscored Adobe's reliance on partnerships.

Citing the example of the company's long-term partnership with Microsoft Corp., he said, "While we currently have our Experience Cloud running on Microsoft's Azure platform, the vision, going forward, is to have all our clouds on Azure."


Speaking about trends, Narayen pointed out that chief marketing officers (CMOs) and chief digital officers (CDOs) and other C-suite executives are increasingly asking how they can also figure out digital transformation for their organizations.

Analysts concur that as customers become central to how enterprises transform themselves digitally, CMOs and CDOs are having more say in how advertising campaigns are devised and run, and how the tech tools needed to create, run, manage and analyse those campaigns are bought and implemented.

Research firm Gartner Inc. noted in its CMO Spend Survey 2016-17 that CMOs now oversee or heavily influence customer experience, technology spending, and profit and loss performance as means to deliver growth. A report from research firm International Data Corp. (IDC), too, forecasts that spending on marketing technology will increase from $20.2 billion in 2014 to $32.4 billion in 2018.

Gartner uses the term "digital marketing hub," which can be likened to the so-called marketing clouds that consolidate and simplify the use of multiple marketing technology tools.

In its February 2017 report, Magic Quadrant for Digital Marketing Hubs, Gartner lists 22 companies. Adobe, Salesforce.com Inc. and Oracle Corp. dominate this market, according to the report.

There are a few challenges, though, in expanding this market, analysts say. For instance, Sujit Janardanan, vice-president of marketing at Aranca, a global research and advisory firm, believes that many of the tools that are part of the marketing clouds do not work smoothly.

"There are integration and skills-availability issues," he said. What's more, he added, the cloud offerings from large companies such as Adobe and Oracle are "super-expensive," costing many times more than what smaller providers such as HubSpot Inc. would charge.

First Published: Thu, May 04 2017. 02:11 AM IST

Here is the original post:

Adobe bets big on cloud computing for marketing, creative professionals - Livemint

5 Cloud Computing Stocks to Buy – TheStreet.com

President Trump's proposed tax reforms may incentivize U.S. multinational companies to bring cash back to the U.S., potentially setting off a frenzy of mergers and share buybacks. However, it may also increase spending on some of the biggest trends in technology.

Cloud computing is one such area where companies are likely to increase spending over the next several years, as companies look to reduce operating costs and increase flexibility. Research firm IDC recently noted that worldwide spending on the public cloud -- the areas where the largest tech conglomerates mostly reside -- is expected to reach $122.5 billion this year, an increase of nearly 25% over 2016 spending levels.

By 2020, IDC expects that spending to reach $203.4 billion worldwide, indicating there is much more room to run as companies shift their computing habits, leaving opportunities for investors.

"Some of the forecasts we've seen -- for example, Goldman's -- show cloud spending from 2016 to 2020 will quadruple," said Exencial Wealth Advisors senior analyst Rich Erwin, who helps handle $1.6 billion in assets under management. "Last year, overall spending was around $32 billion and maybe $135 billion or so is devoted to the public cloud, which is the real growth vehicle."

That growth is expected to largely be captured by the largest companies, giving an opportunity to investors to concentrate their bets and generate outsized returns if it comes to fruition.

"I've seen numbers that in roughly ten years, Microsoft will have between 25% and 30% of its revenue and operating income from cloud services business," Erwin added. "It's a $3 billion business now, but it has the potential to be really big. It's the biggest trend in technology now and will be for the next decade."

What follows below is a Q&A with Erwin about where investors should look for cloud computing stocks to buy. It has been lightly edited for brevity and clarity.

TheStreet: How much money can we expect to come back from overseas if we get a repatriation holiday?

Erwin: At Exencial, we're expecting about $200 billion to come back in the first year of the holiday. Much of that is in companies like Apple (AAPL), Cisco (CSCO), Alphabet (GOOG, GOOGL) and Microsoft (MSFT).

TheStreet: Where does that money go?

Erwin: The money will likely go to stock buybacks and M&A deals -- we think the majority of that cash will be targeted for those activities.

TheStreet: Then what makes you bullish on some of these companies that are tied to cloud computing?

Erwin: Alphabet, or Google, has around $26 billion in free cash flow and they spend $14 billion in research and development spending, so they're not really dependent upon the money coming back -- they're already highly profitable.

TheStreet: What do you like about each of these companies?

The rest is here:

5 Cloud Computing Stocks to Buy - TheStreet.com

Cloud Computing Continues to Influence HPC – insideHPC

This is the second entry in an insideHPC series that explores the HPC transition to the cloud, and what your business needs to know about this evolution. This series, compiled in a complete Guide, available here, covers cloud computing for HPC, industry examples, IaaS components, OpenStack fundamentals and more.

Cloud technologies are influencing HPC just as they are the rest of enterprise IT. The main drivers of this transformation are the reduction of cost and the increase in accessibility and availability to users within an organization.

Traditionally, HPC applications have been run on special-purpose hardware, managed by staff with specialized skills. Additionally, most HPC software stacks are rigid and distinct from other more widely adopted environments, and require a special skillset from the researchers who want to run the applications, who often need to become programmers themselves. The adoption of cloud technologies increases the productivity of your research organization by making its activities more efficient and portable. Cloud platforms such as OpenStack provide a way to collapse multiple silos into a single private cloud while making those resources more accessible through self-service portals and APIs. Using OpenStack, multiple workloads can be distributed among the resources in a granular fashion that increases overall utilization and reduces cost.


Another benefit of breaking down computation silos is the ability to accommodate multidisciplinary workloads and collaboration. While traditional HPC systems are better for a certain workload, cloud infrastructures can accommodate many. For example, they can be used to teach computation techniques to students as well as provide a resource for researchers to make scientific discoveries. Traditional HPC infrastructures are great at solving a particular problem, but they are not very good at the kind of collaboration that modern research requires. A multidisciplinary cloud can make life-changing discoveries and provide a platform to deliver those discoveries to other researchers, practitioners or even directly to patients on mobile devices.

Definitions of cloud computing vary, but the National Institute of Standards and Technology (NIST) has defined it as having the following characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service.

Applied to HPC workloads, the service and delivery model is generally understood to include the following buckets, either individually or combined (derived from the NIST definition): infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS).

Public clouds will contain sufficient compute servers, storage and networking for many HPC applications.

The various types of infrastructure described here can physically reside or be deployed over the following types of clouds: private, public and hybrid.


Over the next few weeks, this series on the HPC transition to the cloud will cover additional topics, including industry examples, IaaS components and OpenStack fundamentals.

You can also download the complete report, insideHPC Research Report on HPC Moves to the Cloud: What You Need to Know, courtesy of Red Hat.

The rest is here:

Cloud Computing Continues to Influence HPC - insideHPC

RCom arm in tie-up for cloud computing – Moneycontrol.com

Reliance Communications' undersea cable arm, Global Cloud Xchange, has entered into an agreement with two other companies to provide cloud computing services.

Under the agreement, data centre company Aegis Data will host cloud solutions of vScaler within its data centre and GCX will connect customers to access cloud solution through its network.

"As part of this strategic partnership, Aegis will provide vScaler with the necessary power and infrastructure requirements that will allow both organisations to capture the increasing demand for scalable HPC (high power compute)-on-demand services from enterprises in the region," a joint statement from the three firms said.

The partnership supported by Global Cloud Xchange (GCX) will enable direct access to vScaler's Cloud Services platform, it added.

Industry findings have projected that the HPC market is expected to grow up to USD 36.62 billion by 2020, at a compounded annual growth rate (CAGR) of 5.45 percent.
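Figures like these come from the standard compound-growth formula. A minimal sketch of that arithmetic, in which the base-year value is an illustrative assumption rather than a number from the report:

```python
def value_after(base, cagr, years):
    """Project a market size forward at a compounded annual growth rate."""
    return base * (1 + cagr) ** years

# Illustrative only: an assumed base of USD 28 billion, grown for five
# years at the article's 5.45 percent CAGR, lands in the mid-30s.
projected = value_after(28.0, 0.0545, 5)
print(round(projected, 2))
```

Running the projection in reverse (dividing the target figure by the growth factor) recovers the implied base-year market size.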

"This triangulated partnership supports these demands in perfect harmony, meaning that those organisations looking for HPC requirements can have their demands serviced all under one roof," vScaler Chief Technology Officer David Power said.

Read more:

RCom arm in tie-up for cloud computing - Moneycontrol.com

How Do You Define Cloud Computing? – Data Center Knowledge

Steve Lack is Vice President of Cloud Solutions for Astadia.

New technology that experiences high growth rates will inevitably attract hyperbole. Cloud computing is no exception, and almost everyone has his or her own definition of cloud, from "it's on the internet" to a full-blown technical explanation of the myriad compute options available from a given cloud service provider.

Knowing what is and what is not a cloud service can be confusing. Fortunately, the National Institute of Standards and Technology (NIST) has provided us with a cloud computing definition that identifies five essential characteristics.

On-demand self-service. A consumer [of cloud services] can unilaterally provision computing capabilities, such as server time and network storage, as needed, automatically without requiring human interaction with each service provider.

Read: Get what you want, when you want it, with little fuss.

Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops and workstations).

Read: Anyone, anywhere can access anything you build for them.

Resource pooling. The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand.

Read: Economies of scale on galactic proportions.

Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear unlimited and can be appropriated in any quantity at any time.

Read: Get what you want, when you want it then give it back.

Measured service. Cloud systems automatically control and optimize resource usage by providing a metering capability as appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled and reported, providing transparency for both the provider and consumer of the utilized service.

Read: Get what you want, when you want it, then give it back and only pay for what you use.

Each of these five characteristics must be present, or it is just not a cloud service, regardless of what a vendor may claim. Now that public cloud services exist that fully meet this cloud computing definition, you, the consumer of cloud services, can log onto one of the cloud service providers' dashboards and order up X units of compute capacity, Y units of storage capacity and toss in other services and capabilities as needed. Your IT team is not provisioning any of the hardware, building images, etc., and this all happens within minutes vs. the weeks it would normally take in a conventional on-premises scenario.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

Here is the original post:

How Do You Define Cloud Computing? - Data Center Knowledge

quantum computing – WIRED UK


In a world where we are relying increasingly on computing, to share our information and store our most precious data, the idea of living without computers might baffle most people.

But if we continue to follow the trend that has been in place since computers were introduced, by 2040 we will not have the capability to power all of the machines around the globe, according to a recent report by the Semiconductor Industry Association.

To prevent this, the industry is focused on finding ways to make computing more energy efficient, but classical computers are limited by the minimum amount of energy it takes them to perform one operation.

This energy limit is named after IBM Research Lab's Rolf Landauer, who in 1961 found that in any computer, each single bit operation must use an absolute minimum amount of energy. Landauer's formula calculated the lowest limit of energy required for a computer operation, and in March this year researchers demonstrated it could be possible to make a chip that operates with this lowest energy.

It was called a "breakthrough for energy-efficient computing" and could cut the amount of energy used in computers by a factor of one million. However, it will take a long time before we see the technology used in our laptops; and even when it is, the energy will still be above the Landauer limit.
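Landauer's bound works out to kT ln 2 joules per bit operation, where k is the Boltzmann constant and T the temperature. A quick sketch of the arithmetic at an assumed room temperature of 300 K (the temperature choice is an assumption, not a figure from the article):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K, exact under the 2019 SI definition

def landauer_limit(temp_kelvin):
    """Minimum energy in joules to erase one bit at a given temperature."""
    return BOLTZMANN * temp_kelvin * math.log(2)

# At ~300 K the bound is a few zeptojoules per bit.
print(landauer_limit(300.0))  # roughly 2.87e-21 J
```

The bound scales linearly with temperature, which is one reason low-temperature computing regimes are of interest for energy efficiency.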

This is why, in the long term, people are turning to radically different ways of computing, such as quantum computing, to find ways to cut energy use.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and use less energy than classical computers.

In classical computing, a bit is a single piece of information that can exist in two states: 1 or 0. Quantum computing uses quantum bits, or 'qubits', instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

"Traditionally qubits are treated as separated physical objects with two possible distinguishable states, 0 and 1," Alexey Fedorov, physicist at the Moscow Institute of Physics and Technology, told WIRED.

"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'."


A qubit can be thought of like an imaginary sphere. Whereas a classical bit can be in two states - at either of the two poles of the sphere - a qubit can be any point on the sphere. This means a computer using these bits can store a huge amount more information using less energy than a classical computer.
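The superposition idea can be sketched numerically: a qubit's state is a pair of complex amplitudes, and the squared magnitudes of those amplitudes give the odds of measuring 0 or 1. A toy illustration only, not how real quantum hardware is programmed:

```python
import math

def measurement_probs(alpha, beta):
    """Probabilities of reading 0 or 1 from the state alpha|0> + beta|1>."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    norm = p0 + p1  # a valid state satisfies |alpha|^2 + |beta|^2 = 1
    return p0 / norm, p1 / norm

# An equal superposition gives a 50/50 chance of measuring 0 or 1.
amp = 1 / math.sqrt(2)
print(measurement_probs(amp, amp))
```

A classical bit corresponds to the special cases (1, 0) and (0, 1); every other point on the sphere is a superposition with no classical counterpart.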

Last year, a team of Google and Nasa scientists found a D-Wave quantum computer was 100 million times faster than a conventional computer. But moving quantum computing to an industrial scale is difficult.

IBM recently announced its Q division is developing quantum computers that can be sold commercially within the coming years. Commercial quantum computer systems "with ~50 qubits" will be created "in the next few years," IBM claims. Meanwhile, researchers at Google, in a Nature comment piece, say companies could start to make returns on elements of quantum computer technology within the next five years.

Computations occur when qubits interact with each other, so for a computer to function it needs to have many qubits. The main reason why quantum computers are so hard to manufacture is that scientists still have not found a simple way to control complex systems of qubits.

Now, scientists from Moscow Institute of Physics and Technology and Russian Quantum Centre are looking into an alternative way of quantum computing. Not content with single qubits, the researchers decided to tackle the problem of quantum computing another way.

"In our approach, we observed that physical nature allows us to employ quantum objects with several distinguishable states for quantum computation," Fedorov, one of the authors of the study, told WIRED.

The team created quantum systems with several different energy "levels", which they have named qudits. The "d" stands for the number of different energy levels the qudit can take. The term "level" comes from the fact that typically each logic state of a qubit corresponds to the state with a certain value of energy - and these values of possible energies are called levels.

"In some sense, we can say that one qudit, quantum object with d possible states, may consist of several 'virtual' qubits, and operating qudit corresponds to manipulation with the 'virtual' qubits including their interaction," continued Fedorov.

"From the viewpoint of abstract quantum information theory everything remains the same but in concrete physical implementation many-level system represent potentially useful resource."

Quantum computers are already in use, in the sense that logic gates have been made using two qubits, but getting quantum computers to work on an industrial scale is the problem.

"The progress in that field is rather rapid but no one can promise when we come to wide use of quantum computation," Fedorov told WIRED.

Elsewhere, in a step towards quantum computing, researchers have guided electrons through semiconductors using incredibly short pulses of light.

These extremely short, configurable pulses of light could lead to computers that operate 100,000 times faster than they do today. Researchers, including engineers at the University of Michigan, can now control peaks within laser pulses of just a few femtoseconds (one quadrillionth of a second) long. The result is a step towards "lightwave electronics" which could eventually lead to a breakthrough in quantum computing.

A bizarre discovery recently revealed that cold helium atoms in lab conditions on Earth abide by the same law of entropy that governs the behaviour of black holes.

The law, first developed by Professor Stephen Hawking and Jacob Bekenstein in the 1970s, describes how the entropy, or the amount of disorder, increases in a black hole when matter falls into it. It now seems this behaviour appears at both the huge scales of outer space and at the tiny scale of atoms, specifically those that make up superfluid helium.

"It's called an entanglement area law," explained Adrian Del Maestro, physicist at the University of Vermont. "It points to a deeper understanding of reality and could be a significant step toward a long-sought quantum theory of gravity and new advances in quantum computing."

Go here to see the original:

quantum computing - WIRED UK

Chinese scientists build world’s first quantum computing machine – India Today

China has beaten the world at building the first ever quantum computing machine that is 24,000 times faster than its international counterparts.

Making the announcement at a press conference in the Shanghai Institute for Advanced Studies of University of Science and Technology, the scientists said that this quantum computing machine may dwarf the processing power of existing supercomputers.


HOW THE WORLD'S FIRST QUANTUM COMPUTING MACHINE CAME TO BE

The manipulation of multi-particle entanglement is the core of quantum computing technology and has been the focus of international quantum computing research.

Recently, Pan Jianwei of the Chinese Academy of Sciences, Lu Chaoyang and Zhu Xiaobo of the University of Science and Technology of China and Wang Haohua of Zhejiang University set international records in quantum control of the maximal numbers of entangled photonic quantum bits and entangled superconducting quantum bits.

Pan said quantum computers could, in principle, solve certain problems faster than classical computers.

Despite substantial progress in the past two decades, building quantum machines that can actually outperform classical computers in some specific tasks - an important milestone termed "quantum supremacy" - remains challenging.

"In the quest for quantum supremacy, Boson sampling - an intermediate quantum computer model - has received considerable attention, as it requires fewer physical resources than building universal optical quantum computers," Pan was quoted as saying by the state-run Xinhua news agency.

Last year, the researchers had developed the world's best single photon source based on semiconductor quantum dots.

Now, they are using the high-performance single photon source and an electronically programmable photonic circuit to build a multi-photon quantum computing prototype to run the boson sampling task.
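For context on why this task matters: in boson sampling, the probability of each photon output pattern is proportional to the squared modulus of a matrix permanent, and the permanent (unlike the determinant) has no known efficient classical algorithm. A minimal brute-force sketch (the example matrices are illustrative, not from the experiment):

```python
from itertools import permutations

def permanent(m):
    # Like the determinant, but every term enters with a plus sign.
    # Brute force sums n! terms; no polynomial-time algorithm is known,
    # which is the source of boson sampling's classical hardness.
    n = len(m)
    total = 0
    for perm in permutations(range(n)):
        term = 1
        for i, j in enumerate(perm):
            term *= m[i][j]
        total += term
    return total

# The 2x2 case by hand: per(A) = a*d + b*c.
A = [[1, 2],
     [3, 4]]
print(permanent(A))  # 1*4 + 2*3 = 10
```

For an n x n matrix of all ones the permanent is n!, so the cost of this loop itself grows factorially, and clever classical algorithms only improve this to roughly 2^n.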

The test results show the sampling rate of this prototype is at least 24,000 times faster than international counterparts, researchers said.

At the same time, the prototype quantum computing machine is 10 to 100 times faster than the first electronic computer, ENIAC, and the first transistor computer, TRADIC, in running the classical algorithm, Pan said.

It is the first quantum computing machine based on single photons to go beyond the earliest classical computers, and it ultimately paves the way to a quantum computer that can beat classical computers.

Last year, China had successfully launched the world's first quantum satellite that will explore "hack proof" quantum communications by transmitting unhackable keys from space, and provide insight into the strangest phenomenon in quantum physics - quantum entanglement.

The research was published in the journal Nature Photonics.

(With inputs from PTI)

Read more at FYI:

China to have its own Wikipedia soon: How the country is expanding its digital universe

Chinese daily appreciates ISRO but says we lag behind the US and China

Chinese man gets arrested for inviting 200 'paid' guests from his side on his wedding

Watch more:

Read the original post:

Chinese scientists build world's first quantum computing machine - India Today

The Quantum Computer Revolution Is Closer Than You May Think – National Review

Let's make no mistake: the race for a quantum computer is the new arms race.

As Arthur Herman wrote in a recent NRO article, "Quantum Cryptography: A Boon for Security," the competition to create the first quantum computer is heating up. The country that develops one first will have the ability to cripple militaries and topple the global economy. To deter such activity, and to ensure our security, the United States must win this new race to the quantum-computer revolution.

Classical computers operate in bits, with each bit being either a 0 or a 1. Quantum computers, by contrast, operate in quantum bits, or qubits, which can exist in a superposition of 0 and 1 simultaneously. A register of n qubits therefore encodes 2^n amplitudes at once, letting certain algorithms explore many computational paths in parallel rather than sequentially. Because of these properties, a single quantum computer could be the master key to hijack our country.
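The superposition claim can be made concrete with a small numerical sketch: a qubit's amplitudes fix only measurement probabilities, and describing n qubits classically takes 2**n amplitudes. The numbers below are illustrative, not from the article:

```python
import random

# A qubit in equal superposition: amplitudes (1/sqrt(2), 1/sqrt(2)).
# Measurement probabilities are the squared amplitudes.
amp0 = amp1 = 2 ** -0.5
p0, p1 = amp0 ** 2, amp1 ** 2
assert abs((p0 + p1) - 1) < 1e-12  # probabilities always sum to 1

# Describing n qubits classically takes 2**n amplitudes - this
# exponential state space, not literal parallel calculation, is
# the resource quantum algorithms exploit.
n = 50
print(f"{n} qubits -> {2 ** n:,} amplitudes")

# Simulating measurements: each shot collapses to a single 0 or 1,
# so reading out an answer still yields only n classical bits.
shots = 100_000
ones = sum(random.random() < p1 for _ in range(shots))
print(f"fraction measured as 1: {ones / shots:.2f}")
```

The fraction of 1s hovers around 0.50, illustrating that superposition shows up only in the statistics of repeated measurements.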

The danger of a quantum computer is its ability to tear through the encryption protecting most of our online data, which means it could wipe out the global financial system or locate weapons of mass destruction. Quantum computers operate much differently from today's classical computers and could crack encryption in less time than it takes to snap one's fingers.
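The encryption threat stems largely from Shor's algorithm, which factors the large numbers behind RSA-style encryption by finding the period of modular exponentiation, the one step quantum hardware could perform exponentially faster. A classical toy sketch on a tiny number (15 and the base 7 are illustrative; real keys are thousands of bits long):

```python
from math import gcd

def find_period(a, n):
    # Smallest r > 0 with a**r % n == 1 - the step a quantum
    # computer would perform exponentially faster than this loop.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    # Classical skeleton of Shor's algorithm for an odd composite n
    # and a base a coprime to n.
    r = find_period(a, n)
    if r % 2:  # need an even period
        return None
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    return f if 1 < f < n else gcd(y + 1, n)

print(shor_classical(15, 7))  # period of 7 mod 15 is 4 -> factor 3
```

For a 2048-bit RSA modulus this classical loop is hopeless, which is exactly why foreign actors stockpiling encrypted data today are betting on future quantum hardware.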

In 2016, 4.2 billion computerized records in the United States were compromised, a staggering 421 percent increase from the prior year. What's more, foreign countries are stealing encrypted U.S. data and storing it because they know that in roughly a decade, quantum computers will be able to get around the encryption.

Many experts agree that the U.S. still has the advantage in the nascent world of quantum computing, thanks to heavy investment by giants such as Microsoft, Intel, IBM, D-Wave, and Google. Yet with China graduating 4.7 million students per year with STEM degrees while the U.S. graduates a little over half a million, how long can the U.S. maintain its lead?

Maybe not for long. Half of the global landmark scientific achievements of 2014 were led by a European consortium and the other half by China, according to a 2015 MIT study. The European Union has made quantum research a flagship project over the next ten years and is committed to investing nearly $1 billion. While the U.S. government allocates about $200 million per year to quantum research, a recent congressional report noted that inconsistent funding has slowed progress.

According to Dr. Chad Rigetti, a former member of IBM's quantum-computing group and now the CEO of Rigetti Computing, "computing superiority is fundamental to long-term economic superiority, safety, and security." Our strategy, he continues, "has to be viewing quantum computing as a way to regain American superiority in high-performance computing."

Additionally, cyber-policy advisor Tim Polk has stated publicly that our edge in quantum technologies is under siege. In fact, China leads in unhackable quantum-enabled satellites and owns the world's fastest supercomputers.

While quantum computers will lead to astounding breakthroughs in medicine, manufacturing, artificial intelligence, defense, and more, rogue states or actors could use quantum computers for fiercely destructive purposes. Recall the hack of Sony by North Korea, Russian spies hacking Yahoo accounts, and the exposure of 22 million federal Office of Personnel Management records by Chinese hackers.

How can the United States win this race? We must take a multi-pronged approach to guard against the dangers of quantum computers while reaping their benefits. The near-term priority is to implement quantum-cybersecurity solutions, which fully protect against quantum-computer attacks. Solutions can soon be built directly into devices, accessed via the cloud, integrated with online browsers, or implemented alongside existing fiber-optic infrastructure.

Second, the U.S. needs to consider increasing federal research and development and boosting incentives for industry and academia to develop technologies that align private interests with national-security interests, since quantum technology will lead to advances in defense and forge deterrent capabilities.

Third, as private companies advance more quickly than government agencies, Washington should engage regularly with industry. Not only will policies evolve in a timely manner, but government agencies could become valuable early adopters.

Fourth, translating breakthroughs in the lab to commercial development will require training quantum engineers. Dr. Robert Schoelkopf, director of the Yale Quantum Institute, launched Quantum Circuits, Inc., to bridge this gap and to perform the commercial development of a quantum computer.

The United States achieved the unthinkable when it put a man on the Moon. Creating the first quantum computer will be easier, but the consequences if we don't will be far greater.

Idalia Friedson is a research assistant at the Hudson Institute.

Read more:

The Quantum Computer Revolution Is Closer Than You May Think - National Review

Time Crystals Could be the Key to the First Quantum Computer – TrendinTech

It's been proven that time crystals do in fact exist. Two teams of researchers created time crystals just recently, one from the University of Maryland and the other from Harvard University. While the first team used a chain of charged particles called ytterbium ions, the other used a synthetic diamond to create an artificial lattice.

It took a while for the idea of time crystals to stick because they seemed essentially impossible. Unlike conventional crystals, whose lattices simply repeat in space, time crystals also repeat in time, breaking time-translation symmetry. This unique phenomenon is the first demonstration of a non-equilibrium phase of matter.

The Harvard researchers are excited by their discoveries so far and are now hoping to uncover more about these time crystals. Mikhail Lukin and Eugene Demler are both physics professors and joint leaders of the Harvard research group. Lukin said in a recent press release, "There is now broad, ongoing work to understand the physics of non-equilibrium quantum systems." The team is keen to press on with further research, knowing that studying materials such as time crystals will help us better understand both our own world and the quantum world.

Research such as that carried out by the Harvard team will allow others to develop new technologies such as quantum sensors, atomic clocks, and precision measuring tools. In regard to quantum computing, time crystals could be the missing link we're searching for in developing the world's first workable model. "This is an area that is of interest for many quantum technologies," said Lukin, "because a quantum computer is a quantum system that's far away from equilibrium. It's very much at the frontier of research and we are really just scratching the surface." Quantum computers could change the way research is carried out and help solve the most complex of problems. We just need to figure them out first.


See more here:

Time Crystals Could be the Key to the First Quantum Computer - TrendinTech

Quantum Physics: Are Entangled Particles Connected Via An Undetected Dimension? – Forbes


The informed reader will note a stunning parallel with the ultraviolet catastrophe which led to quantum theory. This term, discussed elsewhere, refers to the fact that using Maxwell's equations and classical mechanics, we get spontaneous infinite ...

See the original post here:

Quantum Physics: Are Entangled Particles Connected Via An Undetected Dimension? - Forbes

Scientists ‘BREED’ Schrodinger’s Cat in massive quantum physics breakthrough – Express.co.uk


In Erwin Schrödinger's thought experiment, the hypothetical cat can be both alive and dead at the same time, a quantum phenomenon known as superposition.

Physicists have now found a way to carry out the experiment and reveal the exact point at which objects switch between classical physics and quantum physics, the physics of the subatomic scale.

Team leader Alexander Lvovsky, from the University of Calgary and the Russian Quantum Centre, said: "One of the fundamental questions of physics is the boundary between the quantum and classical worlds.

"Can quantum phenomena, provided ideal conditions, be observed in macroscopic objects?


"Theory gives no answer to this question - maybe there is no such boundary.

"What we need is a tool that will probe it."

In the researchers' experiment, two coherent light waves represented Schrödinger's cat, with the fields of the electromagnetic waves pointing in opposite directions at the same time.


The University of Calgary's Anastasia Pushkina, co-author of the research, said: "In essence, we cause interference of two 'cats' on a beam splitter.

"This leads to an entangled state in the two output channels of that beam splitter.

"In one of these channels, a special detector is placed.

"In the event this detector shows a certain result, a 'cat' is born in the second output whose energy is more than twice that of the initial one."

When the team measured the results, they found they could convert a pair of negative Schrödinger's cats with an amplitude of 1.15 into a single positive cat with an amplitude of 1.85, in steps that could have huge implications for quantum physics.
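Those amplitudes can be checked against the earlier claim that the bred 'cat' carries more than twice the energy of an initial one, assuming (as for light fields) that energy scales as the square of the amplitude:

```python
# Amplitudes reported in the experiment.
input_amp, output_amp = 1.15, 1.85

# For light fields, energy scales as amplitude squared.
e_in, e_out = input_amp ** 2, output_amp ** 2
ratio = e_out / e_in

print(f"energy gain: {ratio:.2f}x")  # about 2.59, i.e. more than double
assert ratio > 2
```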


Demid Sychev, a graduate student at the Russian Quantum Centre, added: "It is important that the procedure can be repeated: new 'cats' can, in turn, be overlapped on a beam splitter, producing one with even higher energy, and so on.

"Thus, it is possible to push the boundaries of the quantum world step by step, and eventually to understand whether it has a limit."

Read the rest here:

Scientists 'BREED' Schrodinger's Cat in massive quantum physics breakthrough - Express.co.uk

The application of three-axis low energy spectroscopy in quantum physics research – Phys.Org

May 1, 2017 ThALES. Credit: R. Cubitt, ILL

In the physics of the past century, understanding the electronic properties of matter and the interactions between the electrons inside it has been a major challenge. Electrons are responsible for the chemical bonds between atoms and for almost all the properties that characterise a piece of matter, such as colour, heat transport, conductivity and magnetism. An elementary property of the electron is its spin, and the combination of electronic spins at the atomic level can induce a magnetic moment on certain of the atoms that constitute the material. These moments can add up to macroscopic magnetic forces.

As magnetism is the footprint of the interactive behaviour of electrons, studying it on the atomic level informs us about the collective electronic behaviour in the atomic environment. This can explain macroscopically observed electronic properties, like the temperature dependence of the conductivity.

On the atomic level, magnetic ions are closely packed and thus mutually influence each other, resulting in the adoption of a common magnetic order to minimise their energy balance. A slight perturbation leads to a spin wave, whereby an oscillation of one magnetic moment around its central axis induces oscillating perturbations with a slight phase shift on the atomic neighbours. Spin waves are routinely observed in ordered magnetic materials by inelastic neutron scattering (INS) on spectrometers at the Institut Laue-Langevin (ILL).

Transitioning from a classical to a quantum magnetic world

The magnetic moment is characterised by its spin number. The larger the spin number, the more appropriate it is to compare the atomic magnetic moment with a classical magnet. Lowering the spin means accentuating its quantum properties; exploring the transition into the quantum world, which is fundamentally different from the daily, macroscopic world, is one of the most exciting challenges in solid state physics.

The most cited example is that of spin-1/2 moments placed at the corners of an equilateral triangle. Due to its quantum nature, each spin can only point up or down with respect to its local axis. An antiferromagnetic exchange between the spin moments forces them to align antiparallel to each other, but on a triangle all three pairs cannot be antiparallel at once. As such a quantum magnet cannot order, rather than adopting one ground state the system has several equally likely states (6 in the case of the triangle), and the spins are in a superposed state pointing in several directions at once.
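The sixfold degeneracy can be verified by brute force using a classical Ising caricature of the triangle (an illustrative simplification of the quantum problem): of the 8 up/down configurations, the 6 with spins not all aligned share the lowest energy, because at least one bond is always frustrated.

```python
from itertools import product

# Antiferromagnetic Ising model on a triangle: E = J * sum of s_i * s_j
# over the three bonds, with J > 0 and spins s = +1 (up) or -1 (down).
J = 1.0
bonds = [(0, 1), (1, 2), (0, 2)]

energies = {}
for spins in product([+1, -1], repeat=3):
    energies[spins] = J * sum(spins[i] * spins[j] for i, j in bonds)

e_min = min(energies.values())
ground_states = [s for s, e in energies.items() if e == e_min]
print(len(ground_states))  # 6 degenerate ground states, as in the text
```

The all-up and all-down states cost +3J, while any two-against-one arrangement costs -J, so no configuration satisfies all three antiferromagnetic bonds at once.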

Combining equilateral triangles leads to a two-dimensional network of spins. Its ground state, i.e. the spin arrangement with the lowest possible energy cost, has challenged theorists for decades. In 1973, Nobel laureate P.W. Anderson proposed a so-called 'quantum spin liquid state,' which is conceptually completely different from ordered magnetic phases. Anderson argued that for a triangular system, it is energetically more favourable for spins to organise into bonds. In these valence bonds, electrons are quantum mechanically 'entangled,' a purely quantum mechanical state. A superposition of a manifold of bond patterns exists in parallel, and the bonds fluctuate due to the quantum mechanical zero-point motion imposed on the particles. This state is called a Resonant Valence Bond (RVB) state.

Neutron scattering provides experimental proof for the RVB state

Here at ILL, two cold three-axis spectrometers, IN14 and IN12, contributed over decades to the discovery and unravelling of magnetic correlations in classical and non-conventional superconductors, multiferroic crystals and a wide range of low-dimensional, frustrated and quantum magnetic systems. As both instruments dated from the 1980s, they were in need of a complete refurbishment to be able to continue contributing to the scientific progress in these fields. The new IN12 spectrometer's relocation and refurbishment was completed in 2012, and by the end of 2014, the IN14 spectrometer was replaced by its successor, ThALES.

ThALES, Three-Axis instrument for Low Energy Spectroscopy, is a next generation cold neutron three-axis spectrometer that builds on the strengths of its predecessor, IN14, but uses state-of-the-art neutron optics. The ThALES project is a collaboration between ILL and Charles University, Prague, and is financed by the Czech Ministry of Science and Education.

After replacing the IN14, ThALES became the new reference for cold single crystal neutron spectroscopy at a steady state neutron source like the ILL reactor. ThALES has been fully optimised to address the physics of highly correlated electron systems and scientific problems in the field of quantum magnetism. Moreover, the flexibility of the spectrometer has been enhanced through the implementation of various optical elements.


ThALES was used to carry out INS measurements in a recent study conducted by a collaboration of scientists, including ILL's Martin Boehm, current co-ordinator of the EU-funded neutron network SINE2020. The study published in Nature, titled 'Evidence for a spinon Fermi surface in a triangular lattice quantum-spin-liquid candidate,' argued that the triangular-lattice antiferromagnet YbMgGaO4 has the long sought quantum spin liquid RVB ground state. This study was the first to use neutron scattering as a means of providing experimental proof for the RVB state.

The experimental effort to discover the RVB ground state has considerably increased since P.W. Anderson suggested that it might explain the phenomenon of superconductivity in a class of materials that show particularly high transition temperatures between a normal conducting and superconducting state. However, providing experimental proof for the existence of the RVB state is very challenging, because while a magnetically ordered system has a clear experimental response, the RVB state is characterised by the absence of a measurable quantity.

Due to the lack of a measurable quantity, the experimental approach of this study, using ThALES, pursued indirect experimental proof by deliberately exciting the ground state with neutrons and measuring the dynamic response. According to theoretical expectations, the excited spin liquid behaves 'exotically,' meaning the excited state is explained by spinons with very unusual properties. Spinons can rearrange the distribution of valence bonds and travel throughout the triangular plane with a minimal amount of energy.

In a scattering process between the neutron and the spin liquid, the law of conservation of total momentum imposes the creation of two spin-1/2 spinons in the liquid. This pair of spinons travels in opposite directions with a total energy equalling the neutron's energy loss in the scattering process. Using the ThALES spectrometer, it is possible to trace the directions and energies of the spinons by measuring the direction and energy of the neutron that created the spinon pair. In this way, the study traced a complete dynamical landscape of the quantum spin liquid in the triangular plane and compared the measurements with theoretical predictions, which gave strong evidence for the existence of the spin liquid phase in YbMgGaO4.
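A consequence of this pair creation is that, at fixed total momentum, the measured energy forms a continuum rather than a sharp mode, which is part of the spin-liquid signature. A toy one-dimensional computation with an assumed dispersion (purely illustrative, not the actual YbMgGaO4 dispersion):

```python
from math import pi, sin

# Toy spinon dispersion (an assumption for illustration): eps(k) = J * |sin k|.
J = 1.0
def eps(k):
    return J * abs(sin(k))

# For total momentum Q, a spinon pair (k, Q - k) conserves momentum;
# its total energy sweeps out a band of values as k varies, so the
# neutron sees a continuum instead of a single sharp energy.
Q = pi / 2
pair_energies = [eps(k) + eps(Q - k)
                 for k in (i * 2 * pi / 1000 for i in range(1000))]
lo, hi = min(pair_energies), max(pair_energies)
print(f"continuum at Q = pi/2 spans [{lo:.2f}, {hi:.2f}] * J")
```

For this toy dispersion the pair energy at Q = pi/2 ranges from J to sqrt(2) * J; an ordered magnet's spin wave would instead give a single energy at each Q.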

This research is important as a quantum spin liquid state of matter is potentially relevant for applications of quantum information. Moreover, experimental identification of a quantum spin liquid state contributes greatly to our understanding of quantum matter.

Explore further: Novel state of matter: Observation of a quantum spin liquid

More information: Yao Shen et al. Evidence for a spinon Fermi surface in a triangular-lattice quantum-spin-liquid candidate, Nature (2016). DOI: 10.1038/nature20614

Journal reference: Nature

Provided by: Institut Laue-Langevin


Read the original post:

The application of three-axis low energy spectroscopy in quantum physics research - Phys.Org

Physicists breed Schrödinger's cats to find boundaries of the | Cosmos – Cosmos

Entangled cats? Stranger things could happen if quantum rules scaled up to the everyday world.


What is the limit to self-contradiction? The question arises in politics and quantum physics alike.

A team of Russian and Canadian physicists has figured out how to push the limits of self-contradicting quantum states by breeding Schrödinger's cats.

Their experiment, which involves sending cat-state photons through a hall of mirrors that multiplies their number, is described in Nature Photonics today.

Using the new method, the authors hope to help answer a fundamental question, namely: at what scale does the absurdity of quantum mechanics end and common-sense reality begin?

In the microscopic world of quantum mechanics, particles can do seemingly impossible things, such as being in two contradictory states at once. For the Austrian physicist Erwin Schrödinger, who helped put quantum mechanics on firm foundations in 1926 with his Nobel-winning equation, this idea was too crazy to be believed.

In 1935, to illustrate how absurd quantum ideas had become, Schrödinger came up with a scenario involving a cat which, according to quantum theory, is both alive and dead at the same time.

The way he did it was to link the fate of a cat to a specific quantum event.

With ingenuity more typical of a Bond villain than a physicist, Schrödinger imagined a cat trapped inside a steel box along with some radioactive material, a Geiger counter, a hammer and a vial of hydrogen cyanide. If one of the radioactive atoms decays (a chance quantum event), it would trigger the hammer to smash the vial of poisonous gas, and farewell Felix.

Before you open the box to check, says quantum theory, the radioactive atom is both decayed and not-decayed. By extension, said Schrödinger, the cat is both alive and dead, the distinction between them blurry and smeared out.

But what seemed impossible to Schrödinger is commonplace for modern-day physicists, who have worked out how to produce various analogues of Schrödinger's cat in real physical systems. They are used in many quantum technologies, including quantum computation, teleportation, and cryptography.

In essence, a particle in a Schrödinger's cat state is one that is holding two contradictory states at once. For example, an electron could be simultaneously spin up and spin down. Or a photon of light could be simultaneously waving in two opposite directions.

Until now, experimenters have only managed to muster small groups of Schrödinger's cat photons with limited energies, but the new work can create any number by breeding them.

The method works by taking two photons, already in cat states, and firing them simultaneously through the same beam-splitter, which gets the two photons entangled. After some more beam-splitting the arrangement spits out more cat states than went in a bit like if Felix hopped through a cat-flap and two cats appeared on the other side.

The snag is, the process only works about one fifth of the time. (The rest of the time, there's no entanglement, and no breeding of cats.)

And running the photons through the ring again would increase the amplitude even further. Using this iterative approach could potentially produce as many quantum cat states as you like.

"Thus, it is possible to push the boundaries of the quantum world step by step, and eventually to understand whether it has a limit," says Demid Sychev of the Russian Quantum Center and the Moscow State Pedagogical University, lead author of the study.

Meanwhile, the debate which originated with Schrödinger, Bohr and Einstein continues today: the question of whether the universe is innately fuzzy or whether it is just the way we see it. As Schrödinger eloquently put it in 1935: "There is a difference between a shaky or out-of-focus photograph and a snapshot of clouds and fog banks."

Producing quantum phenomena with more particles, and in larger scales, might just help us spot the difference between these two pictures, and finally get to grips with reality.

Even if our politicians still struggle with it.

See original here:

Physicists breed Schrödinger's cats to find boundaries of the | Cosmos - Cosmos

Make America Great Again! | Donald J Trump for President

Donald J. Trump For President, Inc. Why Now?

On November 8, 2016, the American People delivered a historic victory and took our country back. This victory was the result of a Movement to put America first, to save the American economy, and to make America once again a shining city on the hill. But our Movement cannot stop now - we still have much work to do.

This is why our Campaign Committee, Donald J. Trump for President, Inc., is still here.

We will provide a beacon for this historic Movement as our lights continue to shine brightly for you - the hardworking patriots who have paid the price for our freedom. While Washington flourished, our American jobs were shipped overseas, our families struggled, and our factories closed - that all ended on January 20, 2017.

This Campaign will be a voice for all Americans, in every city near and far, who support a more prosperous, safe and strong America. That's why our Campaign cannot stop now - our Movement is just getting started.

Together, we will Make America Great Again!

Read more here:

Make America Great Again! | Donald J Trump for President

The Donald Trump Zone of Uncertainty shows up in the health-care debate – Washington Post

During a news conference Wednesday, White House press secretary Sean Spicer was asked how an amendment to the American Health Care Act that could increase premiums for those with preexisting conditions squares with the president's pledge that this wouldn't happen.

His response? Something we could have expected from this administration.

White House press secretary Sean Spicer said it would be "impossible" to calculate the potential cost of insurance plans for people with preexisting conditions who would be forced to buy insurance from state-run high-risk pools under the new GOP health care bill, on May 3 at the White House. (Reuters)

REPORTER: An analysis from AARP showed that the sickest patients will pay nearly $26,000 a year in premiums under the new health-care law and that $8 billion which was included in that amendment this morning is not nearly enough to lower those costs.

So I'm wondering, how does that, which would be a major premium hike on the sickest patients, square with the president's promise to both lower premiums and take care of those with preexisting conditions?

SPICER: So it sounds interesting to me that, without there are so many variables that are unknown, that to make an analysis of that level of precision, it seems almost impossible.

Let me give you an example. So right now preexisting conditions are covered in the bill. They always have been; we've talked about that before. States have a right to receive a waiver. If someone has continuous coverage, that's never going to be an issue - under no circumstance would anyone with continuous coverage ever have a problem with preexisting conditions.

If someone chose not to have coverage for 63 days or more, and they were in a state that opted out, and they had a preexisting condition, and they were put into a high-risk pool - then we've allocated an additional $8 billion over five years to help drive down those costs.

So for someone to know how many people that is, what number of states are going to ask for and receive a waiver is literally impossible at this point. So to do an analysis of any level of factual basis would be literally not a [possibility].

That right there is a natural end point of the Donald Trump phenomenon: A representative of the administration declaring that there is no knowable truth behind the debate over a policy, so the policy might just as well be supported.

It is true that it is literally impossible to know exactly how many people with preexisting conditions will live in states that ask for a waiver on their coverage, and to know how much that will cost. It is similarly impossible to know precisely how many Americans do any number of things. How many Americans like President Trump? How many Americans have jobs? How many Americans are Hispanic? Measuring each of these things involves some imprecision, but that doesn't mean that we can't know generally what those numbers look like.

As explained by the reporter, the estimate about those with preexisting conditions - that is, serious health issues that existed before receiving insurance coverage - comes from AARP. Here's the relevant excerpt from an April 27 article:

States that want to allow insurers to charge more for people with preexisting conditions would have to have a high-risk pool program or a reinsurance program. For consumers who buy coverage in a high-risk pool, AARP's PPI projects that the premiums could reach $25,700 a year in 2019, when this provision would go into effect.

That figure would disproportionately affect those ages 50 to 64, since AARP estimates that 40 percent of Americans in that age bracket have such conditions. What's more, the density of the population with such conditions is higher in Appalachia and the South, areas that are more conservative and therefore more likely to ask for some sort of waiver from the stipulations in place.

As Spicer notes, the $25,700 would be paid only by those who had let their coverage lapse. How many that may be isn't known. But $8 billion spread over five years — $1.6 billion a year — would cover $25,700 in premiums for fewer than 63,000 people a year.

AARP estimates that 24.8 million Americans have preexisting conditions just within that 50-to-64 age range. The Kaiser Family Foundation figures that 52 million in total have such a condition.

So the question is valid: How does that square with the president's promise to both lower premiums and take care of those with preexisting conditions? We don't know a hard number for those who will be affected, no. But we know that some large number is likely to be.

Over the course of the 2016 campaign, Trump used one rhetorical trick repeatedly. Questioned about an issue, he'd gin up some anecdotal example providing an opposing line of thinking and use that to shrug the whole thing off. Trump says his phones were wiretapped at Trump Tower and, look, the New York Times says that someone associated with his campaign was surveilled in some way, so that basically proves the point. Remember when he sat down with Bill O'Reilly and said explicitly to forget all that about not having actual data on voter fraud, pointing instead to a report that had nothing to do with voter fraud?

This is an actual strategy: Cast doubt about the certainty of an issue and use that doubt to press forward as you see fit.

In this case, there's a direct political advantage. When a Congressional Budget Office analysis of the original iteration of the AHCA came out in March showing that 24 million fewer people would be insured in a decade, it spurred a number of Republicans to bail on the legislation. Spicer's "who knows" strategy isn't just meant to rebut reporters; it's meant to keep House Republicans in line until they vote.

Spicer is right that we don't know precisely how many people will be negatively affected by the updated American Health Care Act. In fact, it's probably safer to assume that the uncertainty will work against the administration, given how many people have preexisting conditions. Regardless, the exact number isn't the point. The point is that we know that some will be, and we know that Trump said that wouldn't happen, which is why the question came up.

For that, Spicer had no answer.

Original post:

The Donald Trump Zone of Uncertainty shows up in the health-care debate - Washington Post