Monthly Archives: May 2017

Scientists found a wave of ultra hot gas bigger than the Milky Way – Astronomy Magazine

Posted: May 6, 2017 at 4:08 am

Scientists found a wave of hot gas twice the size of the Milky Way in the Perseus galaxy cluster that they believe is billions of years old.

The study, which is published in the June 2017 issue of Monthly Notices of the Royal Astronomical Society, combined data from NASA's Chandra X-ray Observatory with radio observations and computer simulations.

Perseus, named after its host constellation, is 240 million light-years away and is filled with gas so hot that it glows only in X-rays. While studying the gas, Chandra found many interesting features, but the team focused on an enigmatic concave structure called the "bay."

After combining 10.4 days' worth of high-resolution Chandra data with 5.8 days of wide-field observations, the team created an X-ray image of the gas in Perseus. They then filtered the data to highlight subtler details and compared the enhanced image to computer simulations of merging galaxy clusters.

Go here to read the rest:

Scientists found a wave of ultra hot gas bigger than the Milky Way - Astronomy Magazine

Posted in Astronomy | Comments Off on Scientists found a wave of ultra hot gas bigger than the Milky Way – Astronomy Magazine

Google: No to Price War Over Cloud Computing – Investopedia

Posted: at 4:07 am


Concerns about a price war in the cloud market started last year when Amazon and Google both moved to lower the prices on some of their cloud offerings. Microsoft took a page from both companies in 2017, also reducing the prices on some of its ...
Related coverage: "Google says it doesn't need to get into a cloud price war with Amazon, Microsoft to win" (CNBC); "Google May Need an Acquisition to Catch Amazon, Microsoft, Says Goldman" (Barron's); "Is the Google Cloud Pricing Strategy Really That Different from AWS or Microsoft?" (1redDrop).

See the rest here:

Google: No to Price War Over Cloud Computing - Investopedia

Posted in Cloud Computing | Comments Off on Google: No to Price War Over Cloud Computing – Investopedia

A prepaid wallet that helps start-ups access cloud-computing services – The Hindu

Posted: at 4:07 am


MUMBAI: For start-ups in India and around the world, the emergence of cloud-computing services to build and scale business has been an incredible advantage, allowing entrepreneurs to work immediately on a great idea without worrying about the time it ...

Visit link:

A prepaid wallet that helps start-ups access cloud-computing services - The Hindu

Posted in Cloud Computing | Comments Off on A prepaid wallet that helps start-ups access cloud-computing services – The Hindu

CLOUD COMPUTING Cisco Expands Cloud IoT Services with $610M Viptela Acquisition – CIO Today

Posted: at 4:07 am

By Jef Cozza / CIO Today. Updated May 05, 2017.

This year's Interop ITX event, which will be held at the MGM Grand from May 15-19, will focus on issues such as security, cloud, DevOps, data and analytics, and infrastructure. The conference will include 130 hands-on, panel, and speaker-led sessions.

Expanded Format

Event organizers said the decision to change Interop's format and expand its programming was due to the need to address modern trends such as artificial intelligence and security, expanding beyond its traditional focus on networking and infrastructure technology. The event will include more than two dozen sessions related to cloud technology, with cloud content being offered on each of the five days.

Several of those sessions aim to take a closer look at the role containers play in cloud-delivered services, and how they can be deployed and managed both within the cloud and on-premises. Other sessions will focus on various in-cloud services, such as the need for cloud operations to be unbound from infrastructure and tied to applications.

Security will also be a subject of major attention this year, with events focusing on strategies enterprises can use to block ransomware attacks, and how to respond to attacks once they've taken place.

Other sessions will focus on what companies can do to promote internal security, identify malicious insiders, and mitigate threats coming from within the organization without having to resort to Big Brother tactics.

Skills Shortages and IoT Threats

Skills shortages among IT professionals is another timely topic for Interop this year. A panel discussion titled "Surviving the Security Skills Shortage" will tackle questions such as how organizations can survive with small IT staffs, discover new talent, and retain talented IT security professionals once they're hired.

One of the biggest security issues for enterprises in recent years has been the advent of devices for the Internet of Things. Interop has three events scheduled on the issue. Among the IoT topics to be discussed are ways organizations can prepare to address IoT issues, adjustments they can make to identity management and risk profiles, and how to protect DNS services against security threats such as IoT botnets.

Other sessions will focus on ways enterprises can analyze the mountains of security data they have in order to extract actionable intelligence, how managers can address security issues with developers in order to get them to produce more secure code, and the basics of cyber-insurance policies.

Enterprise I.T. Exhibits

The Interop ITX exhibit hall opens Tuesday evening, May 16, followed by a full day May 17, and half day May 18.

Approximately 100 exhibitors will be on hand, including 18 designated as featured exhibitors: AT&T, IBM, Comcast Business, Kaspersky Lab, VMware, ManageEngine, Cylance, 128 Technology, Veeam, WatchGuard, Viptela, Axis Communications, ExtraHop, Cumulus, Extreme Networks, Capterra, PathSolutions, and Pluribus Networks.

Keynotes and Panels

Keynote addresses will be presented Wednesday and Thursday from 8:30 to 10:00 AM, in the MGM Grand Ballroom.

Wednesday's keynote addresses include Otto Berkes, chief technology officer for CA Technologies, speaking about "Freeing Technology to Drive Creativity."

Cyber security expert and FireEye CEO Kevin Mandia will address "Cyber Security's Grown-Up Phase," providing tangible recommendations for what enterprises can do to survive today's increasingly complex security landscape.

A "Fireside Chat" with VMware CTO Chris Wolf will address business demands faced by VMware customers, including why IT leaders must adapt to a new type of infrastructure, plus an overview of specific technologies to help drive their businesses forward.

Wednesday "Lightning" panel presenters include analyst Sam Charrington who founded CloudPulse Strategies; Josh Bloom, who founded and serves as CTO for Wise.io; and Coco Krumme, who heads the Data Science team at Haven Inc, a technology platform for trade and logistics.

Thursday's keynotes begin with MIT Research Scientist Andrew McAfee's talk on "Harnessing the Digital Revolution." Andrew will discuss what enterprises and technology leaders need to think about with regard to machine learning and other disruptive changes expected over the next 10 years.

Also on Thursday, Susie Wee, who founded Cisco's developer program for infrastructure and application developers, will address innovative solutions using "Modern Apps on a Programmable Infrastructure."

Thursday "Lightning" panel presenters feature Amazon's Senior Manager of Talent Acquisition Ester Frey; Olga Braylovskiy who is VP of the workforce technology at Intuit; Ed McLaughlin, CIO for Mastercard; and Janine Gianfredi, Chief Marketing Officer of the United States Digital Service.

And finally, Best of Interop awards will be presented on Thursday, May 18 at 12:45pm in the Interop ITX Theater.

See the original post here:

CLOUD COMPUTING Cisco Expands Cloud IoT Services with $610M Viptela Acquisition - CIO Today

Posted in Cloud Computing | Comments Off on CLOUD COMPUTING Cisco Expands Cloud IoT Services with $610M Viptela Acquisition – CIO Today

Quantum Computing Market Forecast 2017-2022 | Market …

Posted: at 4:07 am

The quantum computing processor, the physical device enabling the principle of quantum computing, is still more a theoretical concept than a ready-to-implement engineering solution. Yet this notion was broken recently by D-Wave's announcement that it is shipping the first commercially available quantum computer, the D-Wave 2000Q. IBM is also launching a new quantum computing division, IBM Q, a move that might be a turning point in the commercialization of quantum computing technology. IBM has pioneered quantum computing in the cloud, with an API enabling apps mostly for research purposes. We expect vigorous development of the cloud market segment to continue at a double-digit rate.

The quantum computing market is projected to surpass $5 billion by 2020.

Despite technology advances, the quantum computing market is still fledgling. At the same time, this rapidly evolving market is one of the most active R&D fields, attracting substantial government funding that supports research groups at internationally leading academic institutions, national laboratories, and major industrial research centers. Governments are the major driving force behind investments in quantum computing R&D, fiercely competing for what is perceived as the most promising technology of the 21st century. The world's largest government IT/defense contractors follow suit.

So, what is the rationale for the quantum computing market?

a. National Security Considerations:

b. National Economy Considerations:

The report covers the quantum computing R&D, products, technologies and services as well as government, corporate and venture capital investments in quantum computing.

The report provides detailed year-by-year (2017-2022) forecasts for the following quantum computing (QC) market segments:

Quantum Computing Market Forecast 2017-2022, Tabular Analysis, March 2017. Pages: 23, Figures: 13, Tables: 6. Single User Price: $5,950.00. Reports are delivered in PDF format within 24 hours. The analysis provides quantitative market research information in a concise tabular format; the tables and charts present a focused snapshot of market dynamics.

2CheckOut.com Inc. (Ohio, USA) is an authorized retailer for goods and services provided by Market Research Media Ltd.

Quantum Computing Market Forecast 2017-2022, Tabular Analysis, March 2017. Pages: 23, Figures: 13, Tables: 6. Global Site License: $9,950.00. Reports are delivered in PDF format within 24 hours.

Table of Contents

1. Market Report Scope & Methodology
1.1. Scope
1.2. Research Methodology

2. Executive Summary

3. Quantum Computing Market in Figures 2017-2022
3.1. Quantum Computing Market 2017-2022
3.2. Quantum Computing Market 2017-2022 by Technology Segments
3.3. Quantum Computing in the Cloud Market 2017-2022
3.4. Quantum Computing Market 2017-2022 by Country

List of Figures
Fig. 1 - Quantum Computing Market Forecast 2017-2022, $Mln
Fig. 2 - Quantum Computing Market: Growth Rates 2017-2022 by Technology Segments, CAGR %
Fig. 3 - Cumulative Quantum Computing Market 2017-2022, Market Share by Technology Segments, %
Fig. 4 - Quantum Computing Market 2017-2022 by Technology Segments, $Mln
Fig. 5 - Quantum Computing Market Dynamics 2017-2022: Market Share by Technology Segments, %
Fig. 6 - Quantum Computing Market 2017-2022: Quantum Cryptography, $Mln
Fig. 7 - Quantum Computing Market 2017-2022: Physical QC Device, $Mln
Fig. 8 - Quantum Computing Market 2017-2022: QC Simulation, $Mln
Fig. 9 - Quantum Computing Market 2017-2022: QC Programming Infrastructure, $Mln
Fig. 10 - Quantum Computing in the Cloud Market 2017-2022, $Mln
Fig. 11 - Cumulative Quantum Market 2017-2022, Market Share by Country, %
Fig. 12 - Quantum Computing Market 2017-2022 by Country, $Mln
Fig. 13 - Quantum Computing Market Dynamics 2017-2022: Market Share by Country, %

List of Tables
Table 1 - The Rationale for Quantum Computing Market
Table 2 - Quantum Computing Approaches by Physical Principle
Table 3 - Quantum Computing Market Forecast 2017-2022, $Mln
Table 4 - Global Quantum Computing Market 2017-2022 by Technology Segments, $Mln
Table 5 - Quantum Computing in the Cloud Market 2017-2022, $Mln
Table 6 - Quantum Computing Market 2017-2022 by Top 8 Countries, $Mln

Originally posted here:

Quantum Computing Market Forecast 2017-2022 | Market ...

Posted in Quantum Computing | Comments Off on Quantum Computing Market Forecast 2017-2022 | Market …

What is Quantum Computing? Webopedia Definition

Posted: at 4:07 am

First proposed in the 1970s, quantum computing takes advantage of certain quantum-physical properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve as the computer's processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits do not rely on the traditional binary nature of computing. Traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only perform calculations on one set of numbers at once. Quantum computers instead encode information as a series of quantum-mechanical states, such as the spin directions of electrons or the polarization orientations of a photon. Such a state might represent a 1 or a 0, a combination of the two, a value somewhere between 1 and 0, or a superposition of many different values at once.
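The description above can be made concrete with a small classical sketch (plain Python, no quantum library assumed; the `probabilities` helper is purely illustrative): a single qubit is modeled as two complex amplitudes whose squared magnitudes give the chances of measuring 0 or 1.

```python
import math

# A qubit modeled as two complex amplitudes (alpha, beta) for the states
# |0> and |1>, normalized so that |alpha|^2 + |beta|^2 = 1.
def probabilities(alpha, beta):
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1  # chance of measuring 0, chance of measuring 1

# An equal superposition: "somewhere between 1 and 0" until measured.
alpha = beta = 1 / math.sqrt(2)
p0, p1 = probabilities(alpha, beta)
print(p0, p1)  # about 0.5 each: a fifty-fifty chance of reading 0 or 1
```

Note that this only simulates the bookkeeping; a real qubit's power comes from manipulating the amplitudes themselves before measurement.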

A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and also has some ability to produce interference between various different numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.

Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography and modeling and indexing very large databases.

Microsoft: Quantum Computing 101

Follow this link:

What is Quantum Computing? Webopedia Definition

Posted in Quantum Computing | Comments Off on What is Quantum Computing? Webopedia Definition

Quantum computing: A simple introduction – Explain that Stuff

Posted: at 4:07 am

by Chris Woodford. Last updated: February 18, 2017.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics. Photo courtesy of US Department of Energy.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more and much less than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks (storage and processing) are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
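The storage-and-gates story above is easy to demonstrate in code. This sketch (the `half_adder`/`add8` names are just illustrative) prints the binary character codes and then builds multi-bit addition out of the same XOR and AND operations a hardware adder circuit uses:

```python
# The character codes mentioned above: 'A' is 65, 'a' is 97.
print(format(ord('A'), '08b'))  # 01000001
print(format(ord('a'), '08b'))  # 01100001

# A half adder: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    return a ^ b, a & b

# A full adder chains two half adders so a carry can ripple through.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

# Add two 8-bit numbers one bit at a time, least significant bit first:
# the software equivalent of the logic-gate circuit described above.
def add8(x, y):
    result, carry = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add8(65, 32))  # 97: adding 32 to 'A' gives 'a' in ASCII
```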

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!
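Those figures are easy to verify (the caption equates one stored character per byte and one transistor per bit):

```python
megabytes = 512
bytes_stored = megabytes * 1024 * 1024  # one character per byte
bits_stored = bytes_stored * 8          # eight binary digits per character

print(bytes_stored)  # 536870912: the "536,870,912 to be exact"
print(bits_stored)   # 4294967296: the "4,294,967,296 if you're being picky"
```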

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.

As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen." (Six Easy Pieces, p116.)

If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once (a particle and a wave) because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law: keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?
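The "collapse" described above can be illustrated with a hedged classical sketch (the `measure` helper is invented for illustration, not a real quantum API): a two-qubit register is modeled as four amplitudes, and measuring it picks exactly one basis state, with probability given by the squared amplitude.

```python
import random

# A two-qubit register: four amplitudes, one per basis state 00, 01, 10, 11.
# This equal superposition "holds" all four values at the same time.
state = [0.5, 0.5, 0.5, 0.5]

# Measurement collapses the register to a single basis state, chosen with
# probability equal to the squared amplitude.
def measure(amplitudes):
    probs = [abs(a) ** 2 for a in amplitudes]
    r, running = random.random(), 0.0
    for index, p in enumerate(probs):
        running += p
        if r < running:
            return index
    return len(amplitudes) - 1  # guard against floating-point round-off

outcome = measure(state)
print(format(outcome, '02b'))  # one of 00, 01, 10, 11, each with probability 1/4
```

The simulation also shows why quantum algorithms are subtle: a naive measurement throws away all but one of the values held in superposition, so useful algorithms must arrange interference so that the right answer is the likely one.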

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons) or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke.
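Shor's algorithm splits into a quantum part (finding the period r of f(x) = a^x mod N) and a classical part that turns the period into factors. This sketch finds the period by brute force, which is feasible only for tiny numbers; it is the period-finding step that a quantum computer would accelerate. The function names are illustrative, not from any library.

```python
from math import gcd

# Find the period r of f(x) = a^x mod n by brute force. In Shor's algorithm
# this step, the expensive one, is what the quantum computer speeds up.
def find_period(a, n):
    r, value = 1, a % n
    while value != 1:
        r += 1
        value = (value * a) % n
    return r

# Classical post-processing: an even period r with a^(r/2) != n - 1 (mod n)
# yields factors via greatest common divisors.
def shor_classical(n, a):
    r = find_period(a, n)
    if r % 2 != 0:
        return None  # unlucky choice of a; try another
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_classical(15, 7))  # (3, 5), the prime factors of 15
```

Factoring 15 this way mirrors what the small ion-trap machines described below actually demonstrated.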

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, and even absurd.

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There's no doubt that these are hugely important advances. Even so, it's very early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for many years, perhaps even decades.

View original post here:

Quantum computing: A simple introduction - Explain that Stuff

Posted in Quantum Computing | Comments Off on Quantum computing: A simple introduction – Explain that Stuff

China adds a quantum computer to high-performance computing arsenal – PCWorld

Posted: at 4:07 am


China already has the world's fastest supercomputer and has now built a crude quantum computer that could outpace today's PCs and servers.

Quantum computers have already been built by companies like IBM and D-Wave, but Chinese researchers have taken a different approach. They are pursuing quantum computing with multiple photons, which could provide a superior way to calculate compared to today's computers.

The Chinese quantum computing architecture allows for five-photon sampling and entanglement. The researchers claimed it is up to 24,000 times faster than previous experiments involving single-photon sources.

The Chinese researchers have built the components required for boson sampling, which has long been theorized and is considered a comparatively easy route to a quantum computer. The architecture built by the Chinese can accommodate a large number of photons, which increases the speed and scale of computing.
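The reason boson sampling is interesting comes down to matrix permanents: the probability of each photon output pattern is proportional to the squared permanent of a submatrix of the optical circuit's transfer matrix, and the best known classical algorithms for the permanent take exponential time. As a rough illustration (my own sketch, not from the article), here is Ryser's formula, the standard exponential-time method for computing a permanent classically:

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula, O(2^n * n^2).

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * product over rows i of sum_{j in S} a[i][j]
    """
    n = len(a)
    total = 0.0
    for k in range(1, n + 1):
        sign = (-1) ** k
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += sign * prod
    return (-1) ** n * total

# Unlike the determinant, there is no known fast algorithm:
# every known exact method is exponential in n.
print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```

Even at this cost per output probability, a few dozen photons already puts exact classical simulation out of reach, which is why increasing the photon count matters so much in these experiments.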

China is strengthening its technology arsenal in an effort to be self-sufficient. China's homegrown chip powers TaihuLight, the world's fastest computer.

In 2014, China said it would spend US$150 billion on semiconductor development so that PCs and mobile devices would convert to homegrown chips. Afraid that low-cost Chinese chips will flood the market, the U.S. earlier this year accused China of rigging the semiconductor market to its advantage.

It's not clear yet if a quantum computer is on China's national agenda. But China's rapid technological progress worries countries like the U.S. A superfast quantum computer could accelerate the country's progress in areas like weapons development, in which high-performance computers are key.

But there's a long way to go before China builds its first full-fledged quantum computer. The prototype quantum computer is good for specific uses but is not designed to be a universal quantum computer that can run any task.

The research behind quantum computers is gaining steam as PCs and servers reach their limit. It's becoming difficult to shrink chips to smaller geometries, which could upset the cycle of reducing costs of computers while boosting speeds.

If they deliver on their promise, quantum computers will drive computing into the future. They are fundamentally different from computers used today.

Bits on today's computers are stored as ones or zeros, while quantum computers rely on qubits, also called quantum bits. Qubits can occupy various states, including holding a one and a zero simultaneously, and the number of possible states grows as qubits are combined.

The parallelism allows qubits to do more calculations simultaneously. However, qubits are considered fragile and highly unstable, and can easily break down during entanglement, the technical term for when qubits interact. A breakdown could destabilize a computing process.
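A minimal sketch (my own illustration, not from the article) of what "holding a one and zero simultaneously" means: a qubit's state is a pair of amplitudes, and a gate such as the Hadamard spreads amplitude across both basis states, with measurement then picking one outcome at random:

```python
import math

# A qubit is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measurement yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
def hadamard(state):
    """Apply the Hadamard gate, which puts |0> or |1> into an
    equal superposition of both basis states."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)  # the |0> basis state
superposed = hadamard(zero)
probs = tuple(abs(amp) ** 2 for amp in superposed)
print(probs)  # each outcome now has probability ~0.5
```

The fragility the article mentions shows up here too: any unwanted interaction with the environment acts like an accidental measurement, collapsing the amplitudes and destroying the superposition the computation depends on.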

The Chinese quantum computer has a photon device based on quantum dots, demultiplexers, photonic circuits, and detectors.

There are multiple ways to build a quantum computer, including via superconducting qubits, the building block of D-Wave's machines. Like the Chinese system, D-Wave's quantum annealing method is a comparatively easy way to build a quantum computer, but it is not considered ideal for a universal quantum computer.

IBM already has a 5-qubit quantum computer that is available via the cloud. It is now chasing a universal quantum computer using superconducting qubits, but with a different gating model to stabilize its systems. Microsoft is pursuing a quantum computer based on a new topology and a yet-to-be-observed quasiparticle called the non-abelian anyon.

In a bid to build computers of the future, China has also built a neuromorphic chip called Darwin.

Excerpt from:

China adds a quantum computer to high-performance computing arsenal - PCWorld

Posted in Quantum Computing | Comments Off on China adds a quantum computer to high-performance computing arsenal – PCWorld

All Major TV Networks Block Trump’s ‘Fake News’ Ad – Variety

Posted: at 4:06 am


Variety
All Major TV Networks Block Trump's 'Fake News' Ad
Variety
The major television networks have all decided not to run Donald Trump's so-called fake news ad, according to a statement released by his daughter-in-law Lara Trump. Lara, an adviser on Trump's 2020 campaign, called the rejection a chilling ...
Watch The Donald Trump Ad The Mainstream Media Don't Want You To See - The Federalist

all 37 news articles »

Continue reading here:

All Major TV Networks Block Trump's 'Fake News' Ad - Variety

Posted in Donald Trump | Comments Off on All Major TV Networks Block Trump’s ‘Fake News’ Ad – Variety

Donald Trump’s Tweets of the Week: Blasting North Korea, Shading the Democrats, Winning Bigly – Newsweek

Posted: at 4:06 am

It's been another doozy of a week in Washington, D.C. Sometimes, it's almost impossible to keep up with it all, from the Senate Judiciary Committee hearing on Russian interference in the 2016 election, to a slew of executive orders being delivered from the new White House administration, to the new health care bill. Even reporters covering the goings-on in the capital are finding it difficult to follow all of the developments.

And yet, President Donald Trump, the man who has managed to turn the "politics as usual" sentiment upside-down, continues to find time to write his supporters, critics and 28 million Twitter followers a quick guide to his days in the Oval Office and beyond, whether they're asking for it or not.

Related: New site lets you donate to causes Trump hates every time he tweets

Trump's tweets provide more than just a SparkNotes-style breakdown of the week, however: They're the best glimpse the American public has inside the president's headspace, as he maintains his tendency to shoot from the hip after 100 days in office. His Twitter account also serves as a catalog of what the leader of the free world was focused on during global crises and crucial moments in his own presidency.

An anti-Trump demonstrator interacts with Trump supporters in New York City on May 4. Reuters

Let's walk through the president's thoughts and activities this week via his personal Twitter account:

The president spoke of North Korea only once this week on his Twitter account, when he blasted the oppressive regime for its failed weekend missile test launch. Trump also managed to praise Chinese President Xi Jinping within the same 140 characters, saying the nation "disrespected the wishes of China [and] its highly respected president."

Trump (once again) targeted the "mainstream (FAKE) media" for its coverage of his first 100 days in office, claiming most news networks were refusing to acknowledge the bevy of executive orders he had signed in his tenure as president. The president retweeted a hot take from Fox's Tucker Carlson, posted to the Fox Nation Twitter account, claiming the Democrats are using the Russian cyberattacks on the election as a political tool to make the president less popular.

The president wants you to know he beat former Secretary of State Hillary Clinton in the 2016 election. He won. Bigly. Got it?

Trump celebrated a "big win" for the Republican Party after a repeal-and-replace bill to begin the process of overturning former President Barack Obama's landmark legislation, the Affordable Care Act, barely scraped through the House.

The Senate won't vote on the congressional bill; senators plan on making their own version instead. But Trump's Twitter was still ablaze with victory tweets. (Oh yeah, and Obamacare still sucks, according to Trump.)

Trump signed a controversial religious liberty order this week with little support from either party, but a whole lot of love from his base and the man who has been by his side through it all: Vice President Mike Pence. The VP has reportedly been pushing for this order to be signed into law ever since the pair took office. It aims to provide legal protection for religious groups claiming exceptions to Obamacare mandates and undermines enforcement of legislation that prevents nonprofits from explicit political activity.

Notice the #ICYMI hashtag Trump expertly used to remind his followers he's working, or at least signing off on orders, throughout the weekend.

You'd think a president wouldn't have enough time to compose snarky, shady tweets toward his critics in a day filled with controversy, big meetings, executive order signings, photo opportunities and negotiations on crucial legislation. You'd be wrong.

Trump managed to shade the Democrats nearly every single day of the week in some fashion. Trump seems to have no plans to tone down his public persona, for better or worse, with his low approval ratings largely unchanged after his busy week.

(Oh yeah, and Andrew Jackson is his new favorite president, since Trump says Jackson could have stopped the American Civil War. See you next week.)

See more here:

Donald Trump's Tweets of the Week: Blasting North Korea, Shading the Democrats, Winning Bigly - Newsweek

Posted in Donald Trump | Comments Off on Donald Trump’s Tweets of the Week: Blasting North Korea, Shading the Democrats, Winning Bigly – Newsweek