Daily Archives: May 20, 2017

Astronomers create the largest map of the universe | Astronomy.com – Astronomy Magazine

Posted: May 20, 2017 at 7:28 am

Quasars are extremely bright, extremely distant objects. They are powered by disks of material around supermassive black holes, which heat up and glow as material streams inward toward the event horizon. Now, the Sloan Digital Sky Survey (SDSS) has identified more than 147,000 of these objects in the distant universe to create a first-of-its-kind three-dimensional map of the early universe, the largest such map available to date.

The observations were taken over the course of two years as part of the SDSS Extended Baryon Oscillation Spectroscopic Survey, abbreviated eBOSS, which uses the Sloan Foundation 2.5-meter Telescope at Apache Point Observatory. eBOSS aims to more accurately measure the expansion history of the universe. The quasars in the study were spotted shining at a time when the universe was between 3 and 7 billion years old.

This new, more complete map of the universe and its expansion history agrees with the standard model of cosmology that astronomers have developed over the past two decades and still use today, which includes ingredients such as Einstein's general theory of relativity and the existence of dark matter and dark energy.

Studying this time frame is particularly important because it's the epoch leading up to the point when the universe's expansion changed from decelerating to accelerating. That happened when the universe was roughly 7.8 billion years old, or about 6 billion years ago. Today, that acceleration continues: in short, objects farther away from us are receding faster, and they continue to do so because of a cosmological component called dark energy. Characterizing the exact nature of this transition from a decelerating to an accelerating universe can place further constraints on the nature of dark energy, which is the dominant component of our universe at the present time.

In a recent press release, Will Percival, a professor of cosmology at the University of Portsmouth and the eBOSS survey scientist, explains: "Even though we understand how gravity works, we still do not understand everything; there is still the question of what exactly dark energy is. We would like to understand dark energy further."

eBOSS is looking into the nature of dark energy by studying baryon acoustic oscillations, or BAOs. BAOs are the remnant signature of sound waves traveling through the very early universe; when the universe was about 380,000 years old, these sound waves were essentially frozen in place by changing conditions. Since then, their signature has been stretched by the expansion of the universe. But astronomers have a good idea of what the BAOs should have looked like at the time they were frozen, making the changes we see in BAOs over time a clear record of the universe's expansion.

Thus, BAOs can be used as a sort of standard ruler by cosmologists. The size of the oscillations corresponds to the most likely distance between galaxies, including the quasars that reside within them. According to Pauline Zarrouk, a PhD student at the University of Paris-Saclay: "You have metres for small units of length, kilometres or miles for distances between cities, and we have the BAO for distances between galaxies and quasars in cosmology."
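To make the standard-ruler idea concrete, here is a minimal sketch of the underlying trigonometry. The ~150 Mpc sound-horizon value is a standard textbook figure (not taken from the article), and the small-angle shortcut glosses over the full angular-diameter-distance treatment used in real BAO analyses:

```python
import math

# Comoving sound horizon at the drag epoch, roughly 150 Mpc in the
# standard cosmology (an assumed textbook value, not from the article).
BAO_SCALE_MPC = 150.0

def distance_from_bao_mpc(observed_angle_deg):
    """Standard ruler: a feature of known physical size subtending a
    measured angle gives its distance via the small-angle relation
    distance = size / angle."""
    return BAO_SCALE_MPC / math.radians(observed_angle_deg)

# A BAO feature subtending ~2 degrees on the sky lies ~4,300 Mpc away.
print(f"{distance_from_bao_mpc(2.0):,.0f} Mpc")
```

The smaller the angle the BAO scale subtends, the farther away that slice of the map is, which is how the quasar positions translate into an expansion history.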

Today, the distribution of galaxies (and their quasars) is a reflection of the frozen-in BAOs, so the better we can map the universe, the better we can understand how its expansion has changed over time, independent of other ways of measuring that expansion, such as using supernovae or lensed quasars.

Thus far, "Our results are consistent with Einstein's theory of general relativity," says Héctor Gil-Marín of the Laboratoire de Physique Nucléaire et de Hautes Énergies in Paris. Gil-Marín is one of the astronomers who contributed to the analysis of the quasars and the creation of the map. "We now have BAO measurements covering a range of cosmological distances, and they all point to the same thing: the simple model matches the observations very well."

See the article here:

Astronomers create the largest map of the universe | Astronomy.com - Astronomy Magazine

Posted in Astronomy | Comments Off on Astronomers create the largest map of the universe | Astronomy.com – Astronomy Magazine

The weird star that totally isn’t aliens is dimming again | Astronomy … – Astronomy Magazine

Posted: at 7:28 am

There's a star 1,300 light-years away that has exhibited some of the strangest behavior ever seen: something dims up to 20 percent of its light, something far beyond the size of a planet. It's called KIC 8462852, but most people shorthand it Tabby's Star, or Boyajian's Star, for its discoverer, Tabetha Boyajian.

Here's the thing, though: absolutely nobody knows why it's dimming that much. It could be a massive fleet of comets or the debris of a planet. But it's not giving off much infrared excess, the sort of "heat glow" produced when dust absorbs starlight and re-radiates it. And now it seems to be dimming again, either helping or complicating the search for a solution.

Boyajian and co-investigator Jason Wright first put out the alert, hoping to garner observations from telescopes worldwide. They're hoping at least one telescope can grab spectra of the star to see what is causing the dimming.

So far the dimming is at 2 to 3 percent, meaning the transit of something is just starting. Tabby's Star has a dedicated telescope waiting to catch such an event, so the big observation period could yield further clues to what's occurring.

OK, it's time we tell you: some people think it's aliens. The hypothesis, put forth by Wright, states that in the absence of a good explanation, all avenues must be explored, and that includes giant Dyson swarm machines harnessing the power of the star. Gathering the spectra could help rule that out, or bolster the case for that "all other avenues exhausted" scenario.

Here's the thing, too: you can get in on the action. Amateur astronomers can use smaller scopes to track the star, which is bigger and older than the Sun. It's at around 12th magnitude in the direction of Cygnus. So get out there tonight and hunt for some aliens.

Read more:

The weird star that totally isn't aliens is dimming again | Astronomy ... - Astronomy Magazine

Posted in Astronomy | Comments Off on The weird star that totally isn’t aliens is dimming again | Astronomy … – Astronomy Magazine

[ 19 May 2017 ] Icy ring around Fomalhaut observed in new wavelength News – Astronomy Now Online

Posted: at 7:27 am

Composite image of the Fomalhaut star system. The ALMA data, shown in orange, reveal the distant and eccentric debris disk in never-before-seen detail. The central dot is the unresolved emission from the star, which is about twice the mass of our sun. Optical data from the Hubble Space Telescope is in blue; the dark region is a coronagraphic mask, which filtered out the otherwise overwhelming light of the central star. Credit: ALMA (ESO/NAOJ/NRAO), M. MacGregor; NASA/ESA Hubble, P. Kalas; B. Saxton (NRAO/AUI/NSF)

An international team of astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) has made the first complete millimeter-wavelength image of the ring of dusty debris surrounding the young star Fomalhaut. This remarkably well-defined band of rubble and gas is likely the result of exocomets smashing together near the outer edges of a planetary system 25 light-years from Earth.

Earlier ALMA observations of Fomalhaut, taken in 2012 when the telescope was still under construction, revealed only about one half of the debris disk. Though this first image was merely a test of ALMA's initial capabilities, it nonetheless provided tantalizing hints about the nature and possible origin of the disk.

The new ALMA observations offer a stunningly complete view of this glowing band of debris and also suggest that there are chemical similarities between its icy contents and comets in our own solar system.

"ALMA has given us this staggeringly clear image of a fully formed debris disk," said Meredith MacGregor, an astronomer at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., and lead author on one of two papers accepted for publication in the Astrophysical Journal describing these observations. "We can finally see the well-defined shape of the disk, which may tell us a great deal about the underlying planetary system responsible for its highly distinctive appearance."

Fomalhaut is a relatively nearby star system and one of only about 20 in which planets have been imaged directly. The entire system is approximately 440 million years old, or about one-tenth the age of our solar system.

As revealed in the new ALMA image, a brilliant band of icy dust about 2 billion kilometers wide has formed approximately 20 billion kilometers from the star.

Debris disks are common features around young stars and represent a very dynamic and chaotic period in the history of a solar system. Astronomers believe they are formed by the ongoing collisions of comets and other planetesimals in the outer reaches of a recently formed planetary system. The leftover debris from these collisions absorbs light from its central star and reradiates that energy as a faint millimeter-wavelength glow that can be studied with ALMA.

Using the new ALMA data and detailed computer modeling, the researchers were able to calculate the precise location, width, and geometry of the disk. "These parameters confirm that such a narrow ring is likely produced through the gravitational influence of planets in the system," noted MacGregor.

The new ALMA observations are also the first to definitively show "apocenter glow," a phenomenon predicted in a 2016 paper by Margaret Pan, a scientist at the Massachusetts Institute of Technology in Cambridge, who is also a co-author on the new ALMA papers. Like all objects with elongated orbits, the dusty material in the Fomalhaut disk travels more slowly when it is farthest from the star. As the dust slows down, it piles up, forming denser concentrations in the more distant portions of the disk. These dense regions can be seen by ALMA as brighter millimeter-wavelength emission.
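The slowdown at apocentre follows from Kepler's laws and can be sketched with the vis-viva equation. The ring parameters below are illustrative values of roughly the right order for Fomalhaut, not numbers taken from the papers:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def orbital_speed_m_s(r_au, a_au, m_star_kg=2 * M_SUN):
    """Vis-viva: v = sqrt(G*M*(2/r - 1/a)); the image caption puts
    Fomalhaut at about twice the Sun's mass."""
    return math.sqrt(G * m_star_kg * (2 / (r_au * AU) - 1 / (a_au * AU)))

a, e = 136.0, 0.12  # illustrative semi-major axis (AU) and eccentricity
print(f"pericentre: {orbital_speed_m_s(a * (1 - e), a):.0f} m/s")  # ~4,100
print(f"apocentre:  {orbital_speed_m_s(a * (1 + e), a):.0f} m/s")  # ~3,200
# Grains spend more time near the slow apocentre, so dust piles up
# there and the ring glows brighter at millimetre wavelengths.
```

The roughly 25 percent speed difference between the two ends of the orbit is what concentrates dust, and hence brightness, at apocentre.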

Using the same ALMA dataset, but focusing on distinct millimeter-wavelength signals naturally emitted by molecules in space, the researchers also detected vast stores of carbon monoxide gas in precisely the same location as the debris disk.

"These data allowed us to determine that the relative abundance of carbon monoxide plus carbon dioxide around Fomalhaut is about the same as found in comets in our own solar system," said Luca Matrà of the University of Cambridge, UK, and lead author on the team's second paper. "This chemical kinship may indicate a similarity in comet formation conditions between the outer reaches of this planetary system and our own." Matrà and his colleagues believe this gas is either released by continuous comet collisions or is the result of a single, large impact between supercomets hundreds of times more massive than Hale-Bopp.

The presence of this well-defined debris disk around Fomalhaut, along with its curiously familiar chemistry, may indicate that this system is undergoing its own version of the Late Heavy Bombardment, a period approximately 4 billion years ago when the Earth and other planets were routinely struck by swarms of asteroids and comets left over from the formation of our solar system.

"Twenty years ago, the best millimeter-wavelength telescopes gave the first fuzzy maps of sand grains orbiting Fomalhaut. Now with ALMA's full capabilities the entire ring of material has been imaged," concluded Paul Kalas, an astronomer at the University of California at Berkeley and principal investigator on these observations. "One day we hope to detect the planets that influence the orbits of these grains."

The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

Read more:

[ 19 May 2017 ] Icy ring around Fomalhaut observed in new wavelength News - Astronomy Now Online

Posted in Astronomy | Comments Off on [ 19 May 2017 ] Icy ring around Fomalhaut observed in new wavelength News – Astronomy Now Online

Researchers find a tiny moon around a large unnamed dwarf planet – Astronomy Magazine

Posted: at 7:27 am

2007 OR10 is the largest body in the solar system without a common name. And now the no-name dwarf planet, with a diameter between 800 and 950 miles (1,290 to 1,528 km), has been discovered to have a moon.

As the name implies, 2007 OR10 was initially discovered in 2007. It's the third or fourth largest object in the Kuiper Belt, behind Pluto and Eris. There is some debate as to its size: planetary scientist Mike Brown lists it as the fourth largest, just after Makemake, while other estimates place it above Makemake in diameter. Brown's graduate student at the time, Meg Schwamb, discovered the world. It is a bright, red, icy world that swings from 33 to 101 AU in its orbit. (One AU is the distance between the Earth and the Sun; Neptune is at 30 AU.)

OR10 has a very slow rotation rate, which hid the moon in plain sight from Hubble for quite some time. It's a large moon for OR10's size, estimated at 150 to 250 miles (240 to 400 km) in diameter. The higher estimate places the moon at the lower limits of dwarf planet status, had it orbited the Sun on its own; a body is considered a dwarf planet if it can attain a round shape and orbits only the Sun, not another body. The moon is about one-quarter the size of OR10, a ratio similar to that of the Moon and Earth.

So now there's a dwarf planet without a name with a fairly large moon around it. Maybe this boosts the case for giving it a real name.

Excerpt from:

Researchers find a tiny moon around a large unnamed dwarf planet - Astronomy Magazine

Posted in Astronomy | Comments Off on Researchers find a tiny moon around a large unnamed dwarf planet – Astronomy Magazine

Don’t miss Jupiter’s moons and Great Red Spot during May – Astronomy Now Online

Posted: at 7:27 am

This image of Jupiter was taken on 3 April 2017 when the planet was at a distance of 414 million miles (667 million kilometres) from Earth. The NASA/ESA Hubble Space Telescope reveals the intricate, detailed beauty of Jupiter's clouds, arranged into bands at different latitudes. Lighter coloured areas, called zones, are high-pressure regions where the atmosphere rises. Darker low-pressure regions where air falls are called belts. Constantly stormy weather occurs where these opposing east-to-west and west-to-east flows interact. The planet's Great Red Spot (GRS, lower left) is a long-lived storm roughly the diameter of Earth. Oval BA, affectionately referred to as the Little Red Spot (lower right), transits roughly 90 minutes ahead of the GRS. Image credit: NASA, ESA, and A. Simon (GSFC).

Observers in the heart of the British Isles have already entered that time of the year when astronomical twilight lasts all night. But even if the sky never truly gets dark, take solace in the sight of Jupiter, currently highest in the sky to the south around 10pm BST.

The Solar System's largest planet is now seven weeks past opposition, but it still presents a magnitude -2.3 disc with an angular width of 42 arcseconds. This means that a telescope magnification of just 45x is sufficient to enlarge it to the same apparent size as an average full Moon seen with the unaided eye.
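That magnification claim is easy to sanity-check with arithmetic. A minimal sketch, where the Moon's ~31-arcminute average diameter is an assumed textbook value rather than a figure from the article:

```python
jupiter_arcsec = 42.0  # Jupiter's disc width, from the article
moon_arcmin = 31.0     # average full Moon, a typical textbook value

# Magnification needed for Jupiter to match the naked-eye Moon.
magnification = moon_arcmin * 60 / jupiter_arcsec
print(f"required magnification: ~{magnification:.0f}x")  # ~44x
```

The result of about 44x lines up with the article's figure of 45.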

Even the smallest telescope (and powerful binoculars, if suitably steadied) will reveal Jupiter's four large Galilean moons (Io, Europa, Ganymede and Callisto) in their orbital dance around their parent planet.

With a quality telescope of 3-inch (7.6-cm) aperture or greater at magnifications of 100x and more, you can occasionally observe (subject to good seeing conditions) the shadows of these moons slowly drift across the face of Jupiter, like ink-black dots. The timings of the start and end of such events visible from the UK for the remainder of the month are tabulated below, as are the times that the moons themselves are occulted (hidden) by Jupiter, or pass into or emerge from the planet's shadow (eclipsed).

On the UK morning of Sunday 28 May 2017, the shadows of Jupiter's Galilean moons Io and Ganymede may be seen simultaneously on the face of their parent planet from 1:16am to 1:39am BST. This computer simulation depicts the scene at 1:20am BST. Note that this is an erect-image view (north up, east left). Users of Newtonian/Dobsonian telescopes should rotate this image 180 degrees to match the eyepiece view, while owners of refractors, Maksutov- and Schmidt-Cassegrain telescopes with a star diagonal should mirror the image left to right. AN graphic by Ade Ashford/SkySafari.

Great Red Spot and Oval BA (aka Little Red Spot)

Jupiter's Great Red Spot (GRS) also puts on an appearance at times suitable for observing from the UK during the remainder of the month, also shown in the table below. The GRS is said to transit when it lies on an imaginary line joining Jupiter's north and south poles. Owing to the planet's fast rotation (at the latitude of the Great Red Spot it takes little more than 9h 55m to make one revolution), the GRS is well seen for roughly an hour either side of the transit time. The Great Red Spot has an unmistakable brick-red hue at present, making it an easy object to identify in quality telescopes capable of 100x magnification or more when seeing conditions are good.

For observers with 8-inch (20-cm) and larger telescopes, try to see the smaller Oval BA (also shown in the image at the top of the page), popularly known as the Little Red Spot, though do bear in mind that it transits the central meridian of Jupiter around 1½ hours before the times given for the GRS.

Predictions for the start and end times of Galilean shadow transits, plus information on their eclipses and occultations for any given date, may also be obtained in a slightly more user-friendly format through our Almanac. To see the satellite events for any given day, ensure that the "Add phenomena of Jupiter?" checkbox is ticked. Like the Great Red Spot predictions, all Galilean moon phenomena are given in Universal Time (UT). For help using the Almanac, see this article.

View post:

Don't miss Jupiter's moons and Great Red Spot during May - Astronomy Now Online

Posted in Astronomy | Comments Off on Don’t miss Jupiter’s moons and Great Red Spot during May – Astronomy Now Online

Microsoft Extends Cloud-Computing Arms Race to Africa – Wall Street Journal (subscription)

Posted: at 7:27 am


The data centers, which will serve customers of the software giant's Azure cloud-computing business, will be the first of their size built in Africa by one of the three major cloud-infrastructure providers: Microsoft, Amazon.com Inc., and Alphabet Inc ...

View post:

Microsoft Extends Cloud-Computing Arms Race to Africa - Wall Street Journal (subscription)

Posted in Cloud Computing | Comments Off on Microsoft Extends Cloud-Computing Arms Race to Africa – Wall Street Journal (subscription)

Is edge computing set to blow away the cloud? – Cloud Tech

Posted: at 7:27 am

Just about every new piece of technology is considered disruptive to the extent that it is expected to replace older technologies. Sometimes, as with the cloud, old technology is simply re-branded to make it more appealing to customers and thereby to create the illusion of a new market. Let's remember that cloud computing had previously existed in one shape or another: at one stage it was called on-demand computing, and then it became application service provision.

Now there is edge computing, which some people are also calling fog computing, and which some industry commentators feel is going to replace the cloud as an entity. Yet the question has to be: will it really? The same viewpoint was offered when television was invented; its arrival was meant to be the death of radio. Yet people still tune into radio stations in their thousands each and every day of every year.

Of course, there are some technologies that are genuinely disruptive in that they change people's habits and their way of thinking. Once people enjoyed listening to Sony Walkmans, but today most folk listen to their favourite tunes on smartphones, thanks to the iPod and the launch of the first iPhone in 2007, which put the internet in our pockets and more besides.

So why do people think edge computing will blow away the cloud? This claim is made in many online articles. Clint Boulton, for example, writes about it in his Asia Cloud Forum article, "Edge Computing Will Blow Away The Cloud," from March this year. He cites venture capitalist Andrew Levine, a general partner at Andreessen Horowitz, who believes that more computational and data-processing resources will move towards edge devices, such as driverless cars and drones, which make up at least part of the Internet of Things. Levine prophesies that this will mean the end of the cloud, as data processing moves back towards the edge of the network.

In other words, the trend up to now has been to centralise computing within the data centre, while in the past it was often decentralised or localised nearer to the point of use. Levine sees driverless cars as data centres in their own right; they have more than 200 CPUs working to enable them to operate without going off the road and causing an accident. The nature of autonomous vehicles means that their computing capabilities must be self-contained, and to ensure safety they minimise any reliance they might otherwise have on the cloud. Yet they don't dispense with it.

The two approaches may in fact end up complementing each other. Part of the argument for bringing data computation back to the edge comes down to increasing data volumes, which lead to ever more frustratingly slow networks. Latency is the culprit. Data is becoming ever larger, so there is going to be more data per transaction, more video and sensor data. Virtual and augmented reality are going to play an increasing part in its growth too. With this growth, latency will become more challenging than it was previously. Furthermore, while it might make sense to put data close to a device such as an autonomous vehicle to eliminate latency, a remote way of storing data via the cloud remains critical.

The cloud can still be used to deliver certain services too, such as media and entertainment. It can also be used to back up data and to share data emanating from a vehicle for analysis by a number of disparate stakeholders. From a data centre perspective, and moving beyond autonomous vehicles to a general operational business scenario, creating a number of smaller data centres or disaster recovery sites may reduce economies of scale and make operations more inefficient than efficient. Yes, latency might be mitigated, but the data may also be held within the same circles of disruption with disastrous consequences when disaster strikes; so for the sake of business continuity some data may still have to be stored or processed elsewhere, away from the edge of a network. In the case of autonomous vehicles, and because they must operate whether a network connection exists or not, it makes sense for certain types of computation and analysis to be completed by the vehicle itself. However, much of this data is still backed up via a cloud connection whenever it is available. So, edge and cloud computing are likely to follow more of a hybrid approach than a standalone one.

Saju Skaria, senior director at consulting firm TCS, offers several examples of where edge computing could prove advantageous in his LinkedIn Pulse article, "Edge Computing Vs. Cloud Computing: Where Does the Future Lie?" He certainly doesn't think that the cloud is going to blow away.

"Edge computing does not replace cloud computing," Skaria writes. "In reality, an analytical model or rules might be created in a cloud then pushed out to edge devices, and some [of these] are capable of doing analysis." He then goes on to talk about fog computing, which involves data processing from the edge to a cloud, and he suggests that people shouldn't forget data warehousing too, because it is used for the massive storage of data and slow analytical queries.

In spite of this argument, Gartner's Thomas Bittman seems to agree that the "Edge Will Eat the Cloud." "Today, cloud computing is eating enterprise datacentres, as more and more workloads are born in the cloud, and some are transforming and moving to the cloud. But there's another trend that will shift workloads, data, processing and business value significantly away from the cloud. The edge will eat the cloud, and this is perhaps as important as the cloud computing trend ever was."

Later on in his blog, Bittman says: "The agility of cloud computing is great, but it simply isn't enough. Massive centralisation, economies of scale, self-service and full automation get us most of the way there, but it doesn't overcome physics: the weight of data, the speed of light. As people need to interact with their digitally-assisted realities in real time, waiting on a data centre miles (or many miles) away isn't going to work. Latency matters. I'm here right now and I'm gone in seconds. Put up the right advertising before I look away, point out the store that I've been looking for as I drive, let me know that a colleague is heading my way, help my self-driving car to avoid other cars through a busy intersection. And do it now."
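Bittman's speed-of-light point is easy to put numbers on. Here is a minimal sketch of the physics floor on round-trip time over fibre; the 1.5x refractive-index slowdown is a standard assumption, not a figure from his blog:

```python
C_KM_S = 299_792      # speed of light in vacuum, km/s
FIBRE_SLOWDOWN = 1.5  # light travels ~1.5x slower in glass fibre

def min_round_trip_ms(distance_km):
    """Hard physics floor on fibre round-trip time; real networks add
    routing, queueing and protocol overhead on top of this."""
    return 2 * distance_km * FIBRE_SLOWDOWN / C_KM_S * 1000

for km in (10, 500, 5000):
    print(f"{km:>4} km -> at least {min_round_trip_ms(km):.2f} ms")
```

Even in the ideal case, a data centre 5,000 km away cannot answer in under roughly 50 ms, which is the gap edge computing aims to close for real-time interactions.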

He makes some valid points, but he falls into the argument that has often been used about latency and data centres: that they have to be close together. The truth, however, is that wide area networks will always be the foundation stone of both edge and cloud computing. Secondly, Bittman clearly hasn't come across data acceleration tools such as PORTrockIT and WANrockIT. While physics is certainly a limiting and challenging factor that will always be at play in networks of all kinds, including WANs, it is possible today to place your datacentres at a distance from each other without suffering an increase in data and network latency. Latency can be mitigated, and its impact can be significantly reduced no matter where the data processing occurs, and no matter where the data resides.

So let's not see edge computing as a new solution. It is but one solution, and so is the cloud. Together the two technologies can support each other. One commentator, responding to a Quora question about the difference between edge computing and cloud computing, says that edge computing is a method of accelerating and improving the performance of cloud computing for mobile users. So the argument that edge will replace cloud computing is a very foggy one. Cloud computing may at some stage be re-named for marketing reasons, but it's still here to stay.

Read the rest here:

Is edge computing set to blow away the cloud? - Cloud Tech

Posted in Cloud Computing | Comments Off on Is edge computing set to blow away the cloud? – Cloud Tech

Firms Face Decelerating Cloud Spending: Analyst – Investopedia

Posted: at 7:27 am

The cloud computing revolution has been one of the most disruptive catalysts for change in the technology sector over the past couple of years. As enterprises move their data centers off-premises and migrate to hybrid IT structures, companies such as ...

More:

Firms Face Decelerating Cloud Spending: Analyst - Investopedia

Posted in Cloud Computing | Comments Off on Firms Face Decelerating Cloud Spending: Analyst – Investopedia

The route to high-speed quantum computing is paved with error – Ars Technica UK

Posted: at 7:27 am


When it comes to quantum computing, mostly I get excited about experimental results rather than ideas for new hardware. New devices, or new ways to implement old devices, may end up being useful, but we won't know for sure until the results are in. If we are to grade existing ideas by their usefulness, then adiabatic quantum computing has to be right up there, since you can use it to perform some computations now. And at this point, adiabatic quantum computing has the best chance of getting the number of qubits up.

But qubits aren't everything; you also need speed. So how, exactly, do you compare speeds between quantum computers? If you begin looking into this issue, you'll quickly learn it's far more complicated than anyone really wanted it to be. Even when you can compare speeds today, you also want to be able to estimate how much better you could do with an improved version of the same hardware. This, it seems, often proves even more difficult.

Unlike classical computing, speed itself is not so easy to define for a quantum computer. If we just take something like D-Wave's quantum annealer as an example, it has no system clock, and it doesn't use gates that perform specific operations. Instead, the whole computer goes through a continuous evolution from the state in which it was initialized to the state that, hopefully, contains the solution. The time that takes is called the annealing time.

At this point, you can all say, "Chris ur dumb, clearly the time from initialization to solution is what counts." Except I used the word "hopefully" in that sentence above for good reason. No matter how a quantum computer is designed and operated, the readout process involves measuring the states of the qubits. That means there is a non-zero probability of getting the wrong answer.

This does not mean that a quantum computer is useless. First, for some calculations, it is possible to check a solution very efficiently. Finding prime factors is a good example. I simply multiply the factors together; if the answer doesn't come to the number I initialized the computer with, I know it got it wrong. In case of a wrong answer, I simply repeat the computation. When you can't efficiently check the solution, you can rely on statistics: the correct answer is the most probable outcome of any measurement of the final state. I can just run the same computation multiple times and determine the correct answer from the statistical distribution of the results.
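Both strategies amount to classical post-processing loops around the quantum hardware. A minimal sketch, where `sample_factors` and `sample_solution` are hypothetical stand-ins for the noisy quantum black box, not any real API:

```python
import random
from collections import Counter

def run_until_verified(n, sample_factors):
    """Strategy 1: factoring is cheap to check classically, so rerun
    the noisy sampler until the factors multiply back to n."""
    while True:
        p, q = sample_factors(n)  # hypothetical quantum black box
        if p * q == n:
            return p, q

def most_probable_outcome(sample_solution, runs=100):
    """Strategy 2: when checking is hard, repeat the computation and
    take the statistically most frequent readout as the answer."""
    counts = Counter(sample_solution() for _ in range(runs))
    return counts.most_common(1)[0][0]

# Demo: a readout that is right only 40% of the time still yields the
# correct answer (42) once enough repetitions are tallied.
noisy = lambda: 42 if random.random() < 0.4 else random.randint(0, 99)
print(most_probable_outcome(noisy))
```

The demo works because the correct outcome only has to be more frequent than any single wrong outcome, not more frequent than all of them combined.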

So for an adiabatic quantum computer, this means speed is the annealing time multiplied by the number of runs required to determine the most probable outcome. While not the most satisfactory answer, it's still better than nothing.

Unfortunately, these two factors are not independent of each other. During annealing, the computation requires that all the qubits stay in the ground state. However, fast changes are more likely to disturb the qubits out of the ground state, so decreasing the annealing time increases the probability of getting an incorrect result. Do the work faster, and you may need to perform the computation more times to correctly determine the most probable outcome. And as you decrease the annealing time, wrong answers will eventually become so probable that they are indistinguishable from correct answers.
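One way to see the trade-off is a toy time-to-solution model: expected total time is one anneal's duration divided by the per-run success probability. The exponential form below is purely illustrative, not a model of any real machine:

```python
import math

def expected_time_to_solution(t_anneal_us):
    """Toy model: per-run success probability rises with annealing
    time (illustrative exponential form, assumed 20 us time scale);
    expected total time is one run's time divided by that probability."""
    p_success = 1 - math.exp(-t_anneal_us / 20.0)
    return t_anneal_us / p_success

for t in (1, 5, 20, 100, 500):
    print(f"anneal {t:>3} us -> expected total ~{expected_time_to_solution(t):.0f} us")
```

In this toy model the shortest anneals actually minimise the expected total time, echoing the paper's counterintuitive result, though it ignores per-run readout and reset overhead and the regime where wrong answers swamp the statistics entirely.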

So determining the annealing time of an adiabatic quantum computer has something of a trial-and-error approach to it. The underlying logic is that slower is probably better, but we'll go as fast as we dare. A new paper published in Physical Review Letters shows that, actually, under the right conditions, it might be better to throw caution to the wind and speed up even more. However, that speed comes at the cost of high peak power consumption.

To recap, in an adiabatic quantum computer, the qubits are all placed in the ground state of some simple global environment. That environment is then modified such that the ground state is the solution to some problem that you want to solve. Now, provided that the qubits remain in the ground state as you change the environment, you will then obtain the correct solution.

The key lies in how fast you are allowed to modify the environment. If you do it very slowly, someone with a slide rule might beat you to the answer. If you do it very fast, your computation is likely to go wrong because the qubits leave the ground state. Fast modifications also require high peak power, so there is a trade-off between speed, power, and accuracy.

To understand the trade-off, let's use an example. Imagine the equivalent of a quantum ball and spring, otherwise known as the harmonic oscillator. In its lowest energy state, the oscillator is bouncing up and down with some natural frequency, which is given by the stiffness of the spring and the mass of the oscillator. In this case, changing the environment would mean increasing or decreasing the stiffness of the spring. To complete the analogy, the jumps between different quantum states increase and decrease the amplitude of oscillation, but those jumps don't change the frequency.
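In numbers, the ball-and-spring frequency is f = sqrt(k/m)/(2*pi). A minimal sketch with illustrative values (not from the paper):

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """f = sqrt(k / m) / (2*pi): a stiffer spring or lighter mass means
    a higher natural frequency; quantum jumps change the amplitude of
    the motion, not this frequency."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

k, m = 1.0, 0.001                                  # 1 N/m spring, 1 g mass
print(f"{natural_frequency_hz(k, m):.2f} Hz")      # ~5.03 Hz
print(f"{natural_frequency_hz(k / 4, m):.2f} Hz")  # softer spring: ~2.52 Hz
```

Softening the spring by a factor of four halves the frequency, which is the "environment change" the next paragraphs describe.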

Next, imagine that we reduce the stiffness of the spring, making the system a bit floppier. The oscillation frequency slows, and the amplitude should also drop, but it will take a little time. If the pace of reduction is too fast, then the amplitude remains high for a moment, corresponding more closely to an excited state. As a result, the oscillator might leave the ground state.

To avoid this, we have to change the spring stiffness at a rate that is slow enough for the oscillator to bleed off the excess energy. Likewise, if we tighten the spring, the process gives energy to the oscillator. If we give it all that energy in one big lump, then it will be sufficient for the oscillator to jump to the excited state, if only briefly.

You can also think of this in terms of power. Although we might change the stiffness of the spring between two values, and therefore expend some amount of energy, the total power depends on how fast we make that change. A short sharp change requires high power, while a long slow change requires low power. So, you can think of three parameters that should be optimized: the speed of the change, the power consumption to complete the change, and the chance that the change drives the qubit out of the ground state.
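The power point is one line of arithmetic: the same energy delivered over a shorter ramp means proportionally higher power. A hedged sketch with purely illustrative numbers:

```python
delta_e = 1e-21  # assumed energy, in joules, to re-stiffen the spring

for ramp_time in (1e-6, 1e-9):  # a slow ramp vs. one 1000x faster
    # Same energy over a shorter ramp -> proportionally higher power.
    print(f"ramp {ramp_time:.0e} s -> average power {delta_e / ramp_time:.0e} W")
```

Speeding the ramp up a thousandfold raises the power requirement a thousandfold, which is exactly the peak-power cost of fast annealing flagged earlier.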

Listing image by Topical Press Agency/Getty Images

Go here to see the original:

The route to high-speed quantum computing is paved with error - Ars Technica UK

Posted in Quantum Computing | Comments Off on The route to high-speed quantum computing is paved with error – Ars Technica UK

IBM scientists demonstrate ballistic nanowire connections, a potential future key component for quantum computing – Phys.Org

Posted: at 7:27 am

May 19, 2017 by Chris Sciacca

Johannes Gooth is a postdoctoral fellow in the Nanoscale Devices & Materials group of the Science & Technology department at IBM Research Zurich. His research is focused on nanoscale electronics and quantum physics. Credit: IBM Research

IBM scientists have achieved an important milestone toward creating sophisticated quantum devices that could become a key component of quantum computers. As detailed in the peer-reviewed journal Nano Letters, the scientists have shot an electron through a III-V semiconductor nanowire integrated on silicon for the first time.

IBM scientists are driving multiple horizons in quantum computing, from the technology for the next decade based on superconducting qubits, towards novel quantum devices that could push the scaling limit of today's microwave technology down to the nanometer scale and that do not rely on superconducting components, opening a path towards room-temperature operation.

Now, IBM scientists in Zurich have made a crucial fundamental breakthrough, described in their paper "Ballistic one-dimensional InAs nanowire cross-junction interconnects." Using their recently developed Template-Assisted Selective Epitaxy (TASE) technique to build ballistic cross-directional quantum communication links, they pioneered devices that can coherently link multiple functional nanowires for the reliable transfer of quantum information across nanowire networks. The nanowire acts as a perfect guide for the electrons, such that the full quantum information of the electron (energy, momentum, spin) can be transferred without losses.

By solving some major technical hurdles of controlling the size, shape, position and quality of III-V semiconductors integrated on Si, ballistic one-dimensional quantum transport has been demonstrated. While the experiments are still on a very fundamental level, such nanowire devices may pave the way towards fault-tolerant, scalable electronic quantum computing in the future.

The paper's lead author, IBM scientist Dr. Johannes Gooth, noted that the milestone has implications for the development of quantum computing. By enabling fully ballistic connections where particles are in flight at the nanoscale, the quantum system offers exponentially larger computational space.

Earlier this year, IBM launched an industry-first initiative to build commercially available universal quantum computing systems. The planned "IBM Q" quantum systems and services will be delivered via the IBM Cloud platform and will deliver solutions to important problems where patterns cannot be seen by classical computers, because the data doesn't exist and the possibilities that need to be explored to get to the answer are too enormous ever to be processed by classical systems.


More information: Johannes Gooth et al. Ballistic One-Dimensional InAs Nanowire Cross-Junction Interconnects, Nano Letters (2017). DOI: 10.1021/acs.nanolett.7b00400

Journal reference: Nano Letters

Provided by: IBM


Follow this link:

IBM scientists demonstrate ballistic nanowire connections, a potential future key component for quantum computing - Phys.Org

Posted in Quantum Computing | Comments Off on IBM scientists demonstrate ballistic nanowire connections, a potential future key component for quantum computing – Phys.Org