Astronomers confirm nearby star a good model of our early solar system – Phys.Org

May 2, 2017

Artist's illustration of the Epsilon Eridani system showing Epsilon Eridani b, right foreground, a Jupiter-mass planet orbiting its parent star at the outside edge of an asteroid belt. In the background can be seen another narrow asteroid or comet belt plus an outermost belt similar in size to our solar system's Kuiper Belt. The similarity of the structure of the Epsilon Eridani system to our solar system is remarkable, although Epsilon Eridani is much younger than our sun. SOFIA observations confirmed the existence of the asteroid belt adjacent to the orbit of the Jovian planet. Credit: Illustration by NASA/SOFIA/Lynette Cook.

NASA's SOFIA aircraft, a 747 loaded with a 2.5-meter telescope in the back and stripped of most creature comforts in the front, took a big U-turn over the Pacific west of Mexico.

The Stratospheric Observatory for Infrared Astronomy aircraft was just beginning the second half of an overnight mission on Jan. 28, 2015. It turned north for a flight all the way to western Oregon, then back home to NASA's Armstrong Flight Research Center in Palmdale, California. Along the way, pilots steered the plane to aim the telescope at a nearby star.

Iowa State University's Massimo Marengo and other astronomers were on board to observe the mission and collect infrared data about the star.

That star is called epsilon Eridani. It's about 10 light years away from the sun. It's similar to our sun, but one-fifth the age. And astronomers believe it can tell them a lot about the development of our solar system.

Marengo, an Iowa State associate professor of physics and astronomy, and other astronomers have been studying the star and its planetary system since 2004. In a 2009 scientific paper, the astronomers used data from NASA's Spitzer Space Telescope to describe the star's disk of fine dust and debris left over from the formation of planets and the collisions of asteroids and comets. They reported the disk contained separate belts of asteroids, similar to the asteroid and Kuiper belts of our solar system.

Subsequent studies by other astronomers questioned that finding.

A new scientific paper, just published online by The Astronomical Journal, uses SOFIA and Spitzer data to confirm there are separate inner and outer disk structures. The astronomers report further studies will have to determine if the inner disk includes one or two debris belts.

Kate Su, an associate astronomer at the University of Arizona and the university's Steward Observatory, is the paper's lead author. Marengo is one of the paper's nine co-authors.

Marengo said the findings are important because they confirm epsilon Eridani is a good model of the early days of our solar system and can provide hints at how our solar system evolved.

"This star hosts a planetary system currently undergoing the same cataclysmic processes that happened to the solar system in its youth, at the time in which the moon gained most of its craters, Earth acquired the water in its oceans, and the conditions favorable for life on our planet were set," Marengo wrote in a summary of the project.

A major contributor to the new findings was data taken during that January 2015 flight of SOFIA. Marengo joined Su on the cold and noisy flight at 45,000 feet, above nearly all of the atmospheric water vapor that absorbs the infrared light that astronomers need to see planets and planetary debris.

Determining the structure of the disk was a complex effort that took several years and detailed computer modeling. The astronomers had to separate the faint emission of the disk from the much brighter light coming from the star.

"But we can now say with great confidence that there is a separation between the star's inner and outer belts," Marengo said. "There is a gap most likely created by planets. We haven't detected them yet, but I would be surprised if they are not there. Seeing them will require using the next-generation instrumentation, perhaps NASA's 6.5-meter James Webb Space Telescope scheduled for launch in October 2018."

That's a lot of time and attention on one nearby star and its debris disk. But Marengo said it really is taking astronomers back in time.

"The prize at the end of this road is to understand the true structure of epsilon Eridani's out-of-this-world disk, and its interactions with the cohort of planets likely inhabiting its system," Marengo wrote in a newsletter story about the project. "SOFIA, by its unique ability of capturing infrared light in the dry stratospheric sky, is the closest we have to a time machine, revealing a glimpse of Earth's ancient past by observing the present of a nearby young sun."

More information: Kate Y. L. Su et al., "The Inner 25 au Debris Distribution in the ε Eri System," The Astronomical Journal (2017). DOI: 10.3847/1538-3881/aa696b

Local astronomy club offers peek at the heavens – Scranton Times-Tribune

Imy Hernandez, 5, of Throop, looks at Saturn during the Keystone College Thomas G. Cupillari '60 Astronomical Observatory's summer program on Wednesday. The program runs Mondays and Wednesdays through July 31. Jason Farmer / Staff Photographer

The heavenly bodies will be there. Whether you'll be able to see them is another question.

The Lackawanna Astronomical Society will host Astronomy Day at Keystone College's Thomas G. Cupillari Observatory in Benton Twp. on Saturday starting at 7 p.m.

The observatory's telescopes and those of the astronomical society's members will be used to view the moon and its craters, mountains, seas and rills. The moon on Saturday will be in a waxing gibbous phase, about three-quarters full.

Telescopes will also offer views of the solar system's largest planet, Jupiter, and its four largest moons. These are the moons first observed by Galileo, proving that some bodies orbit things other than the Earth.

Beyond the solar system, observations are planned of stars, star clusters, double stars, globular clusters and, possibly, nebulae.

Telescopes will also be set up to safely view the sun before it sets over the western horizon.

The LAS said everyone is welcome, no reservations are required and admission is free.

Much depends on the weather, of course, and the extended forecast doesn't look promising. AccuWeather is calling for considerable clouds, occasional rain and drizzle in the evening, followed by a passing shower late. Saturday's high temperature is expected to be 56 and the overnight low 43.

Even if the weather doesn't cooperate, society members will be available to answer questions about their telescopes and observing the night sky. There will be an illustrated slide program, free sky maps and free refreshments.

The observatory is at Route 107 and Hack Road in Fleetville, about 1 mile west of Interstate 81 Exit 202, and 7 miles from Keystone's campus in La Plume Twp.

Dark matter may be fuzzier than we thought – Astronomy Magazine

Dark matter has a profound effect on our universe, shaping galaxies and even leaving its fingerprints on the energy left over from the Big Bang. Despite its relevance, dark matter is also extremely hard to detect: rather than observe it directly, astronomers instead look for clues based on its gravitational interaction with normal matter (the protons, electrons, and neutrons that make up everything we see and touch). Recent observations made with NASA's Chandra X-ray Observatory have hinted that dark matter may be fuzzier than previously thought.

The study, which was recently accepted for publication in the Monthly Notices of the Royal Astronomical Society, focuses on X-ray observations of 13 galaxy clusters. The authors use observations of the hot gas that permeates galaxy clusters to estimate the amount and distribution of dark matter within the clusters and test its properties against current leading models, looking for the model that best fits the data.

The current standard cosmological model includes cold dark matter as a major component. In this case, "cold" simply means that dark matter travels slowly when compared to the speed of light. However, cold dark matter models indicate that dark matter and normal matter, which is drawn to the dark matter via gravity, should clump together in the centers of galaxies. But no such increase in matter, normal or dark, is seen. Additionally, cold dark matter models predict that the Milky Way should have many more small satellite galaxies than we currently see. Even accounting for the fact that some satellites may be challenging to find, the cold dark matter models still over-predict the number of satellites by a considerable amount.

However, cold dark matter is only one of several dark matter theories. By contrast, fuzzy dark matter is a model in which dark matter has a mass about 10 thousand trillion trillion times smaller than an electron. In quantum mechanics, all particles have both a mass and a corresponding wavelength. Such a tiny mass would actually cause the wavelength of dark matter to stretch 3,000 light-years between peaks. (The longest wavelength of light, which is radio, stretches just a few miles between peaks.)
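
To see where a number like 3,000 light-years comes from, the de Broglie relation λ = h/(mv) can be evaluated directly. The sketch below is a back-of-the-envelope check, not a calculation from the study; the particle velocity (250 km/s, a typical galactic orbital speed) and the exact mass are assumptions.

```python
# Back-of-the-envelope de Broglie wavelength for "fuzzy" dark matter.
# Assumptions (not from the article): the particle moves at a few hundred
# km/s inside a galaxy, and its mass is the electron mass divided by the
# article's factor of 10^28 ("10 thousand trillion trillion").

h_c = 1.2398e-6                         # Planck constant times c, in eV * m
electron_mass_eV = 5.11e5               # electron rest energy, eV
dm_mass_eV = electron_mass_eV / 1e28    # roughly 5e-23 eV
c = 2.998e8                             # speed of light, m/s
v = 2.5e5                               # assumed dark-matter speed, m/s (250 km/s)

# lambda = h / (m v) = (h c) / (m c^2 * v / c)
wavelength_m = h_c / (dm_mass_eV * (v / c))
wavelength_ly = wavelength_m / 9.461e15  # metres per light-year

print(f"de Broglie wavelength ~ {wavelength_ly:,.0f} light-years")
# -> a few thousand light-years, the scale quoted in the article
```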

With a wavelength this long, dark matter would not clump in the centers of galaxies, which could explain why this is not observed. But while simple fuzzy dark matter models fit observations of small galaxies, larger galaxies may require a slightly more complex explanation. And galaxy clusters are larger test beds still, which is why researchers turned Chandra toward several massive galaxy clusters for observations.

The results show that while a simple fuzzy dark matter model still didn't explain the cluster observations well, a more complex and fuzzier model did. In this model, dark matter occupying several quantum states at once (think an atom with many electrons, some of which are at higher energy levels) creates overlapping wavelengths that further spread out the effect, which changes the distribution of dark matter expected throughout the galaxy cluster as a whole.

The predictions from this model match the observations of the 13 galaxy clusters much more closely, indicating that fuzzier dark matter may be the best model to incorporate into our cosmological models. However, further study and more precise measurements are needed to better test this theory and ensure it truly reflects what we see throughout the cosmos.

How a hidden population of pulsars may leave the Milky Way aglow – Astronomy Magazine

Searches for dark matter aren't limited to facilities hundreds of feet underground. In the sky, astronomers continually seek observational evidence of the influence of dark matter on galactic scales. A recent study performed by an international team of astronomers, however, has proposed that the gamma-ray glow coming from the Milky Way's center, previously attributed to dark matter, may not arise from so exotic a source. Instead, the study says, the gamma rays could be produced by pulsars.

The study, which has been submitted to The Astrophysical Journal, says that pulsars, the rapidly spinning cores left behind by massive stars after they die, are responsible for the gamma rays seen in the center of our galaxy. Using data from the Large Area Telescope on NASA's Fermi Gamma-ray Space Telescope, the researchers examined the central portions of the galaxy to determine the origin of the gamma-ray glow that has long been observed there. In a press release, Mattia Di Mauro of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) said, "Our study shows that we don't need dark matter to understand the gamma-ray emissions of our galaxy. Instead, we have identified a population of pulsars in the region around the galactic center, which sheds new light on the formation history of the Milky Way."

Why was this glow previously thought to be a signal of dark matter? Although dark matter doesn't interact with normal matter directly, dark matter particles can decay or annihilate each other. Seth Digel, head of KIPAC's Fermi group, explained: "Widely studied theories predict that these processes would produce gamma rays." Thus, observers have searched for unexplained gamma rays in areas where dark matter is thought to accumulate, such as the centers of galaxies. And, indeed, the Milky Way's center is brighter in gamma-ray light than expected. Thus, one explanation for the excess radiation is reactions powered by dark matter.

But the galactic center is a challenging place to observe. Not only is it shrouded in dust, it's also densely packed with stars and the home of energetic processes that could also explain the gamma-ray excess observed there. A significant portion of the glow is produced when cosmic rays resulting from supernovae hit the molecules in interstellar gas clouds, causing them to give off light. But pulsars can also inject energy into these gas clouds, causing them to glow as well.

And with the addition of this new data, Eric Charles of KIPAC explained, "the gamma-ray excess at the galactic center is speckled, not smooth as we would expect for a dark matter signal." The speckles may be individual sources such as pulsars, which are small and hard to see, especially in such a crowded region in the galactic center. By contrast, a signal from dark matter should be smooth, following the general distribution of dark matter particles expected in the region.

"Approximately 70 percent of the Milky Way's point sources are pulsars," Di Mauro said. And pulsars have very distinct spectra; that is, their emissions vary in a specific way with the energy of the gamma rays they emit. By modeling the gamma-ray glow expected from the specific emissions of pulsars, the group found that their expectations matched the observations, indicating that pulsars, not dark matter, are responsible.

The study is in agreement with some other findings, which show that gamma-ray signals attributable to dark matter in the centers of other galaxies, particularly dwarf galaxies, are not seen. While our neighbor, the Andromeda Galaxy, also shows a gamma-ray excess in its center, the group argues that it might be due to pulsars as well.

But the complexity of the centers of galaxies continues to make pinpointing the exact source of these gamma rays difficult, and the study can't completely rule out the possibility of dark matter as a contributor to the gamma rays observed in the Milky Way's center. More direct evidence will be needed; the team is already planning to observe the area with radio telescopes to identify individual pulsars in an attempt to better characterize the origin of gamma rays in the Milky Way's bulge.

[3 May 2017] NASA probe finds Saturn ring gap emptier than predicted – Astronomy Now Online

This unprocessed image shows features in Saturn's atmosphere from closer than ever before. The view of Saturn's polar vortex was captured by NASA's Cassini spacecraft during its first Grand Finale dive past the planet on April 26, 2017. Credit: NASA/JPL-Caltech/Space Science Institute

NASA's Cassini spacecraft sped through a gap between Saturn and its rings for the second time Tuesday after data from the probe's first perilous passage through the unexplored region last week found it to contain fewer potentially hazardous dust particles than expected.

The finding is one of several results from Cassini's first trip through the ring gap that have puzzled scientists. Engineers in charge of keeping Cassini safe, on the other hand, are pleased that the space between Saturn and its rings harbours fewer dangers.

"The region between the rings and Saturn is 'the big empty,' apparently," said Earl Maize, Cassini project manager at NASA's Jet Propulsion Laboratory in Pasadena, California. "Cassini will stay the course, while the scientists work on the mystery of why the dust level is much lower than expected."

Cassini radioed ground controllers April 27 that it safely made the first-ever flight through the 1,500-mile (2,400-kilometre) ring gap, coming closer to Saturn than any spacecraft in history.

The orbiter used its last flyby of Saturn's largest moon Titan on April 22 to reshape its path around the planet, putting Cassini on an orbit that will take it inside the rings once every week until Sept. 15, when it will dive into the ringed world's hydrogen-helium atmosphere to end the mission.

Cassini made its second journey inside the rings Tuesday, and mission control at JPL received confirmation from the spacecraft around 1530 GMT (11:30 a.m. EDT) that it survived the encounter.

During last week's flyby, Cassini turned to use its 13-foot (4-metre) high-gain dish antenna as a shield to protect the spacecraft's sensitive components, like computers and scientific instruments, from the bombardment of any microscopic dust grains in its path.

Scientists crunching data captured last week said the passage produced far fewer dust impacts than predicted.

Models of the dust environment suggested Cassini would sail through the ring gap unscathed, so officials were not too concerned going into the first flyby. Nevertheless, recordings of the dust strikes were quieter than scientists expected.

The craft's radio and plasma wave science instrument detected hundreds of dust hits per second when Cassini was passing just outside Saturn's rings over the last few months, but only registered a few impacts inside the ring gap.

Scientists converted the raw radio and plasma wave data into an audio format, NASA said, to listen for debris striking Cassini's antenna.

"Dust particles hitting the instrument's antennas sound like pops and cracks, covering up the usual whistles and squeaks of waves in the charged particle environment that the instrument is designed to detect," NASA said in a press release. "The RPWS team expected to hear a lot of pops and cracks on crossing the ring plane inside the gap, but instead, the whistles and squeaks came through surprisingly clearly on April 26."

"It was a bit disorienting; we weren't hearing what we expected to hear," said William Kurth, radio and plasma wave science team lead at the University of Iowa. "I've listened to our data from the first dive several times and I can probably count on my hands the number of dust particle impacts I hear."

Cassini made the trip through the ring gap at a relative velocity of about 77,000 mph (124,000 kilometres per hour), fast enough to travel from New York to Los Angeles in less than two minutes.
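
A quick sanity check of that comparison; the roughly 2,450-mile New York-Los Angeles distance used below is an assumed figure, not one from the article.

```python
# Quick check of the speed comparison above.
speed_mph = 77_000          # Cassini's relative velocity quoted in the article
ny_la_miles = 2_450         # assumed great-circle-ish distance, New York to Los Angeles
minutes = ny_la_miles / speed_mph * 60
print(f"Coast to coast in about {minutes:.1f} minutes")  # ~1.9 minutes
```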

The video posted below includes the audio recording from Cassinis radio and plasma wave science instrument during the April 26 flyby.

The grains that hit Cassini were likely no bigger than a particle of smoke, or about 1 micron in size, according to NASA.

Cassini's swing inside Saturn's rings Tuesday occurred without using the craft's antenna as a shield. Mission managers decided such a precaution was no longer necessary after sampling the dust during the first flyby.

But four of the 20 remaining ring gap passages will place Cassini closer to the inner edge of Saturn's D ring, where scientists expect more dust particles. During those orbits, which begin in late May, the spacecraft will again turn its high-gain antenna into a shield.

Imagery from Cassini's approach to Saturn on April 26 revealed the closest-ever views of the planet's clouds and a bizarre six-sided polar vortex scientists had only studied from afar before.

"These images are shocking," said Kevin Baines, an atmospheric scientist on the Cassini team at JPL. "We didn't expect to get anything nearly as beautiful as these images. All the different structures we see on them are phenomenal. We predicted we'd see fogs and something pretty boring, but we're seeing lots of great features, a lot of activity going on on Saturn."

Baines called the hexagonal storm swirling at Saturn's north pole the planet's "belly button."

"This is a hole in the pole that is very deep, and we can tell that from looking at different colors of light," Baines said Friday in a Facebook Live event, comparing its structure to the behaviour of water in a flushing toilet. "This is about 2,000 kilometres (1,200 miles) across."

Winds whip around the storm at up to 180 mph, or 300 kilometres per hour, Baines said. Like a hurricane on Earth, the wind speeds die down farther from the center of circulation, where individual storm clouds appear to move around Saturn in the planets jet stream.

"Now we see structure," Baines said. "You see the curly cues on here, all sorts of strange features that we're trying to understand ... Now we're seeing little tiny circular clouds that really have their own individual characters."

"They might (have) convective upwelling from below, so we're looking for lightning and other things to see if we can really confirm that," Baines said.

"(For) this first dive, we're focusing on looking at Saturn," said Linda Spilker, Cassini's project scientist at JPL. "We got a series of images from the pole to the equator. We have other data as well, spectra in the infrared, the far-infrared and ultraviolet, that will help us put together the puzzle of what we're seeing."

During the mission's second orbit through the ring gap, Cassini's cameras were programmed to take pictures of Saturn's rings backlit by the sun, a viewing geometry that allows the instruments to see faint ringlets and other fine structures.

Future encounters will focus on studying Saturn's interior and magnetic field and on taking the first measurement of the mass of the planet's rings, which will tell scientists about their age and origin.

The video posted below condenses one hour of observations into an animated movie showing a series of Cassini images taken April 26.

The movie shows Cassini's view of Saturn starting from an altitude of 45,000 miles and descending to just 4,200 miles (72,400 kilometers to 6,700 kilometers) above the planet's cloud tops.

"I was surprised to see so many sharp edges along the hexagon's outer boundary and the eye-wall of the polar vortex," said Kunio Sayanagi, an associate of the Cassini imaging team based at Hampton University in Virginia, who helped produce the new movie. "Something must be keeping different latitudes from mixing to maintain those edges."

"The images from the first pass were great, but we were conservative with the camera settings. We plan to make updates to our observations for a similar opportunity on June 28 that we think will result in even better views," said Andrew Ingersoll, a member of the Cassini imaging team based at Caltech.

Astronomers Find Enormous Wave of Hot Gas Rolling through Nearby Galaxy Cluster – Sci-News.com

A wave spanning 200,000 light-years (about twice the size of our Milky Way Galaxy) is rolling through the Perseus Cluster, according to observations from NASA's Chandra X-ray Observatory coupled with radio observations and computer simulations.

This X-ray image of the hot gas in the Perseus Cluster was made from 16 days of Chandra observations. An oval highlights the location of an enormous wave found to be rolling through the gas. Image credit: NASA's Goddard Space Flight Center / Stephen Walker et al.

Galaxy clusters are the largest structures bound by gravity in the Universe today.

Approximately 11 million light-years across and located 240 million light-years away, the Perseus Cluster (Abell 426) is named for its host constellation.

Like all galaxy clusters, most of its observable matter takes the form of a pervasive gas averaging tens of millions of degrees, so hot it only glows in X-rays.

Observations from NASA's Chandra X-ray Observatory have revealed a variety of structures in this gas, from vast bubbles blown by the supermassive black hole in the cluster's central galaxy, NGC 1275, to an enigmatic concave feature known as the "bay."

The bay's concave shape couldn't have formed through bubbles launched by the black hole.

Radio observations using the Karl G. Jansky Very Large Array show that the bay structure produces no emission, the opposite of what astronomers would expect for features associated with black hole activity.

In addition, standard models of sloshing gas typically produced structures that arc in the wrong direction.

A team of astronomers led by Dr. Stephen Walker of NASA's Goddard Space Flight Center turned to existing Chandra observations of the Perseus Cluster to further investigate the bay.

The scientists combined a total of 10.4 days of high-resolution data with 5.8 days of wide-field observations at energies between 700 and 7,000 electron volts. For comparison, visible light has energies between about two and three electron volts.
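
For readers who want to map those energies onto wavelengths, the conversion E = hc/λ is enough; the short sketch below also checks that gas at roughly 30 million degrees Celsius (the temperature quoted later in this article) emits photons with characteristic energies of a few thousand electron volts, squarely in Chandra's band. The constants are standard values, not numbers taken from the paper.

```python
# Converting the photon energies quoted above into wavelengths, and
# checking that gas at tens of millions of kelvin radiates in this band.

HC_EV_NM = 1239.84          # h * c in eV * nm
K_B_EV_PER_K = 8.617e-5     # Boltzmann constant in eV / K

for energy_ev in (2, 3, 700, 7000):
    print(f"{energy_ev:>5} eV  ->  {HC_EV_NM / energy_ev:8.2f} nm")

# Characteristic thermal photon energy is of order k_B * T.
t_kelvin = 3.0e7            # ~30 million degrees Celsius, as quoted below
print(f"k_B * T at {t_kelvin:.0e} K ~ {K_B_EV_PER_K * t_kelvin:.0f} eV (X-rays)")
```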

The authors then filtered the Chandra data to highlight the edges of structures and reveal subtle details.

Next, they compared the edge-enhanced Perseus image to computer simulations of merging galaxy clusters.

One simulation seemed to explain the formation of the bay.

In it, gas in a large cluster similar to Perseus has settled into two components, a cold central region with temperatures around 54 million degrees Fahrenheit (30 million degrees Celsius) and a surrounding zone where the gas is three times hotter.

Then a small galaxy cluster containing about a thousand times the mass of the Milky Way skirts the larger cluster, missing its center by around 650,000 light-years.

The flyby creates a gravitational disturbance that churns up the gas like cream stirred into coffee, creating an expanding spiral of cold gas.

After about 2.5 billion years, when the gas has risen nearly 500,000 light-years from the center, vast waves form and roll at its periphery for hundreds of millions of years before dissipating.

These waves are giant versions of Kelvin-Helmholtz waves, which show up wherever there's a velocity difference across the interface of two fluids, such as wind blowing over water. They can be found in the ocean, in cloud formations on Earth and other planets, in plasma near Earth, and even on the Sun.

"We think the bay feature we see in Perseus is part of a Kelvin-Helmholtz wave, perhaps the largest one yet identified, that formed in much the same way as the simulation shows," said Dr. Walker, who is the lead author of the paper reporting the results in the Monthly Notices of the Royal Astronomical Society (arXiv.org preprint).

"We have also identified similar features in two other galaxy clusters, Centaurus and Abell 1795."

The team also found that the size of the waves corresponds to the strength of the cluster's magnetic field.

If it's too weak, the waves reach much larger sizes than those observed. If too strong, they don't form at all.

This study allowed astronomers to probe the average magnetic field throughout the entire volume of these clusters, a measurement that is impossible to make by any other means.

_____

S.A. Walker et al. 2017. Is there a giant Kelvin-Helmholtz instability in the sloshing cold front of the Perseus cluster? MNRAS 468 (2): 2506-2516; doi: 10.1093/mnras/stx640

This article is based on text provided by NASA's Goddard Space Flight Center.

Students, teachers craft software to make astronomy accessible to the blind – UChicago News

Today's astronomers don't really look at stars or galaxies so much as images produced from data generated by light. If that same data were used to produce 3-D printouts, tactile displays or sound, would it open the study and pursuit of astronomy to the blind and visually impaired?

That's the kind of question the University of Chicago's Yerkes Observatory and its partners will try to answer with the help of a $2.5 million National Science Foundation grant. Over the next three years, they will develop Afterglow Access, new software that will make astronomy more accessible to the blind and visually impaired.

"Amazing pictures of stars start as numbers on a spreadsheet, and those numbers can be manipulated and presented in myriad ways," said Kate Meredith, director of education outreach at the Yerkes Observatory and the education lead of Innovators Developing Accessible Tools for Astronomy, a new research initiative from the observatory. "We won't consider ourselves successful unless within three years we have developed new computer tools with and for the blind and visually impaired that can be used in real applications, learning situations and scholarly research."

The National Federation of the Blind estimates that more than seven million Americans are visually disabled. Unequal access to quantitative information and the lack of vision-neutral tools present them with barriers to studying and mastering astronomy and other STEM subjects, Meredith said.

To overcome this, the Yerkes research initiative will engage blind and visually impaired students as well as sighted students and their teachers from mainstream and specialized schools for the blind. Twenty teachers and 200 eighth- through 12th-grade students are expected to participate annually. Recruiting teachers and students began this spring. While half of the participating schools will be located in southern Wisconsin and the Chicago area, the remaining schools will be selected from across the United States and its territories.

Students and teachers will participate in user-centered design and universal design processes to develop and test software and learning modules and to improve accessibility aspects of astronomy tools for educational and professional purposes. The project builds upon the success of prior National Science Foundation-supported research projects, including the development of Afterglow; Quorum, an accessible programming language; and the Skynet Junior Scholars, a program that supports collaborative astronomy investigations by young explorers using Skynet's international network of telescopes.

The research will advance knowledge about student learning related to computational thinking, the role of computation in astronomy and software design. In addition, it will help determine how participation influences student attitudes and beliefs about who can engage in computing and STEM subjects.

"Teaming up blind and visually impaired students with sighted students, teachers and professionals in the design and development of astronomy software and instructional modules will create powerful educational experiences, encourage STEM learning, and lower the barrier-to-entry for blind and visually impaired individuals interested in astronomy and related careers," Meredith said.

Investigators in the program include employees at the University of Chicago; Yerkes Observatory; Associated Universities Inc.; the Technical Education Research Center at the University of Nevada, Las Vegas; and Skynet at the University of North Carolina at Chapel Hill.

Hospital CIOs see benefits of healthcare cloud computing – TechTarget

May 2017, Vol. 5, No. 3

In healthcare, some illnesses can be cured quickly; some can't. But before applying a proper antidote, several factors need to be considered about the patient in question. The same can be said when hospital CIOs and IT pros work to formulate a strategy for moving their computing processes to the cloud, sometimes by choice and sometimes out of necessity. Critical issues need to be weighed, such as security of patient records, the cost to vacate the premises, how much information really needs to be stored in the cloud and actual savings to hospitals as a result of the move.

Our cover story examines these issues through the eyes of hospital CIOs, who see healthcare cloud computing delivering noticeable improvements in security, patient care and cost savings. They're learning to embrace the benefits of moving in part or whole to the cloud as they choose from among the various private, public and hybrid options.

In another feature, we look at the prevalence of mobile devices throughout the hospital community. They can cause migraines for CIOs and IT departments trying to maintain security with healthcare cloud computing safeguards. That's not to mention the inherent resistance IT departments can encounter from doctors, nurses and other hospital staff who share patient healthcare information over their personal smartphones and tablets.

Also in this issue, we look at some steps hospitals will need to take, including revamping IT teams, to gain full advantage of the cloud's benefits. Sometimes baby steps can go a lot farther than giant steps.

How Cloud Computing Is Turning the Tide on Heart Attacks – Fortune

When tech people talk about "the cloud," it often comes across as an abstract computer concept. But a visit to a village in India shows how cloud computing can bring about enormous change in far-flung places, and quite literally save lives.

On Wednesday, at the Fortune Brainstorm Health summit in San Diego, cardiologist Charit Bhograj spoke to a medical counterpart in India who was in the course of treating a rural man with chest pains.

As the doctors explained, until recently it was impossible to offer advanced heart treatment in poor villages: It cost too much to administer an electrocardiogram (EKG) and, even if you could get an EKG, the local physician was not in a position to interpret it.

This situation has changed dramatically, however, with the advent of portable EKG devices, specialized software and cloud computing.

In the course of a 10-minute presentation, the audience watched as the physician in India took an EKG reading from the man with chest pains, and relayed the results to Bhograj in San Diego. Bhograj then assessed the results and typed his advice into a tool called Tricog, which the Indian doctor then downloaded via a smartphone app.

This arrangement, which relied on an EKG device supplied by GE Health, represents a striking advancement in technology. But it also has huge health implications.

"It will change the odds of a heart attack taking your life from 80% to an 80% chance you will survive," said Bhograj, explaining how cloud-based medical services are transforming cardiac health in rural areas.

And according to Vikram Damodaran, the chief product officer of Sustainable Health Solutions at GE Healthcare, the transformation is only beginning. He explained that GE has made investments worth $300 million in the public health system in recent years, and that the sort of services appearing in rural India are also expanding to Southeast Asia and Africa.

All of this confirms an observation this morning by Fortune President Alan Murray that there's an incredible burst of innovation taking place in the health care industry right now.

Verizon sells cloud services to IBM in ‘unique cooperation between … – Cloud Tech

Verizon has announced it is selling its cloud and managed hosting service to IBM, alongside working with the Armonk giant on a number of strategic initiatives involving networking and cloud services.

"This is a unique cooperation between two tech leaders to support global organisations as they look to fully realise the benefits of their cloud computing investments," said George Fischer, SVP and group president of Verizon Enterprise Solutions (VES), in a statement.

Last February, Verizon told customers in an email that it was shutting down any virtual servers running on Public Cloud or Reserved Performance Cloud Spaces on April 12. The company clarified in a statement to CloudTech that it was discontinuing its cloud service that accepts credit card payments; however, John Dinsdale, a chief analyst at Synergy Research, saw things differently.

"Telcos generally are having to take a back seat on cloud and especially on public cloud services," he told this publication last year. "They do not have the focus and the data centre footprint to compete effectively with the hyperscale cloud providers, so they are tending to drop back into subsidiary roles as partners or on-ramps to the leading cloud companies."

How prescient that statement is now. IBM would certainly be classified as one of the hyperscale operators; alongside Amazon Web Services (AWS), Microsoft and Google, the four leading players continue to grow more quickly than the overall market, according to Synergy's figures.

What's more, various links between the two companies mean this move makes sense. John Considine, general manager at IBM Cloud Infrastructure Services, was previously CTO of Verizon Terremark. The companies have also partnered on various initiatives, including in the creation of Verizon's cognitive customer experience platform, built using IBM's cloud and infrastructure as a service offerings.

"Our customers want to improve application performance while streamlining operations and securing information in the cloud," Fischer added. "VES is now well positioned to provide those solutions through intelligent networking, managed IT services and business communications."

Verizon said it was notifying affected customers directly, though it added that it did not expect any immediate impact on their services. The transaction is expected to close later this year.

Adobe bets big on cloud computing for marketing, creative professionals – Livemint

Mumbai: Known for its Photoshop and Illustrator software packages used primarily by design professionals, Adobe Systems Inc. is now betting big on providing creative and marketing professionals with solutions that reside in the cloud.

Cloud computing typically allows companies to use software as a service (SaaS) rather than pay for it upfront.

Adobe's solutions broadly cover three areas: the Document Cloud (to help create and manage documents), the Creative Cloud (for designing purposes) and the Experience Cloud (to monitor and analyse customer behaviour).

"We couldn't have been more pleased with what we have done with (our) Creative Cloud," Shantanu Narayen, chairman, president and CEO of Adobe, told a media gathering in Mumbai on Wednesday.

Narayen insisted that there is a massive tailwind of digital globally, and consumer expectations have risen dramatically. "The next generation of software will be consumer-in," he said, implying that companies need to sharpen their focus on customer satisfaction in today's digital world.

The company's senior executives are also bullish about Adobe's prospects in India. "In India, we are just starting to ride the (customer) experience wave," said Kulmeet Bawa, Adobe's managing director for South Asia. He added that there is a lot of headroom for growth for Adobe in India; the company employs about 5,200 people in the country, or 30% of its global headcount.

In this context, Narayen also underscored Adobe's reliance on partnerships.

Citing the example of the company's long-term partnership with Microsoft Corp., he said, "While we currently have our Experience Cloud running on Microsoft's Azure platform, the vision, going forward, is to have all our clouds on Azure."

Speaking about trends, Narayen pointed out that chief marketing officers (CMOs), chief digital officers (CDOs) and other C-suite executives are increasingly asking how they can also figure out digital transformation for their organizations.

Analysts concur that as customers become central to how enterprises transform themselves digitally, CMOs and CDOs are having more say in how advertising campaigns are devised and run, and how the tech tools needed to create, run, manage and analyse those campaigns are bought and implemented.

Research firm Gartner Inc. noted in its CMO Spend Survey 2016-17 that CMOs now oversee or heavily influence customer experience, technology spending, and profit and loss performance as means to deliver growth. A report from research firm International Data Corp. (IDC), too, forecasts that spending on marketing technology will increase from $20.2 billion in 2014 to $32.4 billion in 2018.

Gartner uses the term "digital marketing hub," which can be likened to the so-called marketing clouds that consolidate and simplify the use of multiple marketing technology tools.

In its February 2017 report, "Magic Quadrant for Digital Marketing Hubs," Gartner lists 22 companies. Adobe, Salesforce.com Inc. and Oracle Corp. dominate this market, according to the report.

There are a few challenges, though, in expanding this market, analysts say. For instance, Sujit Janardanan, vice-president of marketing at Aranca, a global research and advisory firm, believes that many of the tools that are part of the marketing clouds do not work smoothly.

"There are integration and skills-availability issues," he said. What's more, he added, the cloud offerings from large companies such as Adobe and Oracle are "super-expensive," costing many times more than what smaller providers such as HubSpot Inc. would charge.

First published: Thursday, May 4, 2017, 02:11 AM IST

Red Hat’s New Products Centered Around Cloud Computing, Containers – Virtualization Review

Dan's Take

The company made a barrage of announcements at its recent Summit show.

Red Hat made a number of announcements at its user group conference, Red Hat Summit. They ranged from OpenShift.io, which facilitates the creation of software-as-a-service applications, to pre-built application runtimes for OpenShift-based workloads, an index to help enterprises build more reliable container-based computing environments, an update to the Red Hat Gluster storage virtualization platform that allows it to be used in an AWS computing environment and, of course, a Red Hat/Amazon Web Services partnership.

Red Hat summarized the announcements as follows:

The announcements targeted a number of industry hot buttons, including containers, rapid application development, storage virtualization and cloud computing. As with other announcements in the recent past, the company is integrating multiple open source projects and creating commercial-grade software products designed to provide an easy-to-use, reliable and maintainable enterprise computing environment.

In previous announcements, Red Hat has pointed out that it has certified Red Hat software executing in both Microsoft Hyper-V and Azure cloud computing environments. So, the company can claim to support a broad portfolio of enterprise computing environments.

These announcements will be of the most interest to large enterprises, since they are the ones most likely to adopt these products. These tools might also be used by independent software vendors (ISVs) to create IT solutions for smaller firms, leading to a potential impact on some small and medium-sized businesses.

About the Author

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. He has been a business unit manager at a hardware company and head of corporate marketing and strategy at a software company.

Cloud Computing Continues to Influence HPC – insideHPC

This is the second entry in an insideHPC series that explores the HPC transition to the cloud, and what your business needs to know about this evolution. This series, compiled in a complete Guide available here, covers cloud computing for HPC, industry examples, IaaS components, OpenStack fundamentals and more.

Cloud technologies are influencing HPC just as they are the rest of enterprise IT. The main drivers of this transformation are the reduction of cost and the increase in accessibility and availability to users within an organization.

Traditionally, HPC applications have been run on special-purpose hardware, managed by staff with specialized skills. Additionally, most HPC software stacks are rigid and distinct from other more widely adopted environments, and require a special skillset from the researchers who want to run the applications, who often need to become programmers themselves. The adoption of cloud technologies increases the productivity of your research organization by making its activities more efficient and portable. Cloud platforms such as OpenStack provide a way to collapse multiple silos into a single private cloud while making those resources more accessible through self-service portals and APIs. Using OpenStack, multiple workloads can be distributed among the resources in a granular fashion that increases overall utilization and reduces cost.
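
As a concrete illustration of that self-service model, the sketch below uses the openstacksdk Python library to request a single compute instance from a private OpenStack cloud. The cloud name, image, flavor and network IDs are hypothetical placeholders, not values from this article; the point is the workflow, in which a researcher asks the API for capacity and gets it in minutes, with no ticket to an HPC operations team.

```python
# Minimal self-service provisioning sketch against a private OpenStack
# cloud using the openstacksdk library. All names and IDs below are
# hypothetical placeholders; "mycloud" must be defined in clouds.yaml.
import openstack

conn = openstack.connect(cloud="mycloud")

server = conn.compute.create_server(
    name="hpc-node-01",
    image_id="IMAGE_UUID",        # e.g. a Linux image prepared for HPC nodes
    flavor_id="FLAVOR_UUID",      # CPU/RAM shape chosen by the researcher
    networks=[{"uuid": "NETWORK_UUID"}],
)

# Block until the instance is ACTIVE, then report its state.
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```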

Another benefit of breaking down computation siloes is the ability to accommodate multidisciplinary workloads and collaboration. While traditional HPC systems are better for a certain workload, cloud infrastructures can accommodate many. For example, they can be used to teach computation techniques to students as well as provide a resource for researchers to make scientific discoveries. Traditional HPC infrastructures are great at solving a particular problem, but they are not very good at the kind of collaboration that modern research requires. A multidisciplinary cloud can make life-changing discoveries and provide a platform to deliver those discoveries to other researchers, practitioners or even directly to patients on mobile devices.

Definitions of cloud computing vary, but the National Institute of Standards and Technology (NIST) has defined it as having five essential characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service.

Applied to HPC workloads, the service and delivery model is generally understood to include the following buckets, either individually or combined (derived from the NIST definition): infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS).

Public clouds will contain sufficient compute servers, storage and networking for many HPC applications.

The various types of infrastructure described here can physically reside on, or be deployed over, three types of clouds: public, private and hybrid.

Over the next few weeks, this series on the HPC transition to the cloud will cover additional topics, including industry examples, IaaS components and OpenStack fundamentals.

You can also download the complete report, "insideHPC Research Report on HPC Moves to the Cloud: What You Need to Know," courtesy of Red Hat.

5 Cloud Computing Stocks to Buy – TheStreet.com

President Trump's proposed tax reforms may incentivize U.S. multinational companies to bring cash back to the U.S., potentially setting off a frenzy of mergers and share buybacks. However, they may also increase spending on some of the biggest trends in technology.

Cloud computing is one such area where companies are likely to increase spending over the next several years as they look to reduce operating costs and increase flexibility. Research firm IDC recently noted that worldwide spending on the public cloud -- the area where the largest tech conglomerates mostly reside -- is expected to reach $122.5 billion this year, an increase of nearly 25% over 2016 spending levels.

By 2020, IDC expects that spending to reach $203.4 billion worldwide, indicating there is much more room to run as companies shift their computing habits, leaving opportunities for investors.
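
Those two IDC figures imply a compound annual growth rate of roughly 18 percent. A quick check (treating 2017 as the base year is an assumption made for this back-of-the-envelope calculation, not a detail from the article):

```python
# Implied growth rate behind the IDC figures quoted above.
spend_2017 = 122.5   # billions of dollars, expected this year
spend_2020 = 203.4   # billions of dollars, expected by 2020
years = 2020 - 2017

cagr = (spend_2020 / spend_2017) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 18-19% per year
```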

"Some offorecasts we've seen -- for example, Goldman -- shows cloud spending from 2016 to 2020 will quadruple," said Exencial Wealth Advisors senior analyst Rich Erwin, who helps handle$1.6 billion in assets under management. "Last year, overall spending was around $32 billion and maybe $135 billion or so is devoted to the public cloud, which is the real growth vehicle."

That growth is expected to largely be captured by the largest companies, giving an opportunity to investors to concentrate their bets and generate outsized returns if it comes to fruition.

"I've seen numbers that in roughly tenyears, Microsoft will have between 25% and 30% of its revenue and operating income from cloud services business," Erwin added. "It's a $3 billion business now, but it has the potential to be really big. It's the biggest trend in technology now and will be for the next decade."

What follows below is a Q&A with Erwin about where investors should look for cloud computing stocks to buy.It has been lightly edited for brevity and clarity.

TheStreet: How much money can we expect to come back from overseas if we get a repatriation holiday?

Erwin: At Exencial, we're expecting about $200 billion to come back in the first year of the holiday. Much of that is in companies like Apple (AAPL) , Cisco (CSCO) , Alphabet (GOOG) (GOOGL) and Microsoft (MSFT) .

TheStreet: Where does that money go?

Erwin: The money will likely go to stock buybacks and M&A deals -- we think the majority of that cash will be targeted for those activities.

TheStreet: Then what makes you bullish on some of these companies that are tied to cloud computing?

Erwin: Alphabet, or Google, has around $26 billion in free cash flow and they spend $14 billion in research and development spending, so they're not really dependent upon the money coming back -- they're already highly profitable.

TheStreet: What do you like about each of these companies?

How Do You Define Cloud Computing? – Data Center Knowledge

Steve Lack is Vice President of Cloud Solutions for Astadia.

New technology that experiences high growth rates will inevitably attract hyperbole. Cloud computing is no exception, and almost everyone has his or her own definition of cloud, from "it's on the internet" to a full-blown technical explanation of the myriad compute options available from a given cloud service provider.

Knowing what is and what is not a cloud service can be confusing. Fortunately, the National Institute of Standards and Technology (NIST) has provided us with a cloud computing definition that identifies five essential characteristics.

On-demand self-service. A consumer [of cloud services] can unilaterally provision computing capabilities, such as server time and network storage, as needed, automatically without requiring human interaction with each service provider.

Read: Get what you want, when you want it, with little fuss.

Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops and workstations).

Read: Anyone, anywhere can access anything you build for them.

Resource pooling. The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand.

Read: Economies of scale on galactic proportions.

Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear unlimited and can be appropriated in any quantity at any time.

Read: Get what you want, when you want it then give it back.

Measured service. Cloud systems automatically control and optimize resource usage by providing a metering capability as appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled and reported, providing transparency for both the provider and consumer of the utilized service.

Read: Get what you want, when you want it, then give it back and only pay for what you use.

Each of these five characteristics must be present, or it is just not a cloud service, regardless of what a vendor may claim. Now that public cloud services exist that fully meet this cloud computing definition, you, the consumer of cloud services, can log onto one of the cloud service providers' dashboards and order up X units of compute capacity, Y units of storage capacity and toss in other services and capabilities as needed. Your IT team is not provisioning any of the hardware, building images, etc., and this all happens within minutes vs. the weeks it would normally take in a conventional on-premises scenario.
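
A minimal sketch of that "order it up, then give it back" workflow, using the AWS boto3 SDK as one example of a public cloud API; the AMI ID and region below are hypothetical placeholders, not details from this article, and a real account would be billed only for the time the instance actually runs.

```python
# Hypothetical sketch of on-demand, elastic, metered compute with boto3:
# provision an instance, use it, then give it back.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # placeholder region

# "Get what you want, when you want it": launch one small instance.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# ... run the workload ...

# "Then give it back and only pay for what you use": terminate it.
ec2.terminate_instances(InstanceIds=[instance_id])
```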

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

RCom arm in tie-up for cloud computing – Moneycontrol.com

Reliance Communications' undersea cable arm Global Cloud Xchange has entered into an agreement with two other companies to provide cloud computing services.

Under the agreement, data centre company Aegis Data will host cloud solutions of vScaler within its data centre, and GCX will connect customers to the cloud solution through its network.

"As part of this strategic partnership, Aegis will provide vScaler with the necessary power and infrastructure requirements that will allow both organisations to capture the increasing demand for scalable HPC (high power compute)-on- demand services from enterprises in the region," a joint statement from the three firms said.

The partnership supported by Global Cloud Xchange (GCX) will enable direct access to vScaler's Cloud Services platform, it added.

Industry forecasts project that the HPC market will grow to USD 36.62 billion by 2020, at a compound annual growth rate (CAGR) of 5.45 percent.
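
For context, a compound annual growth rate relates the start and end of a forecast window by the standard formula (not given in the article):

\[ V_{\text{end}} = V_{\text{start}}(1 + \mathrm{CAGR})^{n}, \qquad \mathrm{CAGR} = \left(\frac{V_{\text{end}}}{V_{\text{start}}}\right)^{1/n} - 1, \]

so a 5.45 percent CAGR sustained over five years multiplies the starting market size by about 1.0545^5 ≈ 1.30.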

"This triangulated partnership supports these demands in perfect harmony, meaning that those organisations looking for HPC requirements can have their demands serviced all under one roof," vScaler Chief Technology Officer David Power said.

Read more:

RCom arm in tie-up for cloud computing - Moneycontrol.com

Chinese scientists build world’s first quantum computing machine – India Today

Chinese scientists say they have beaten the world to building the first quantum computing machine of its kind, one that runs its sampling task 24,000 times faster than its international counterparts.

Making the announcement at a press conference at the Shanghai Institute for Advanced Studies of the University of Science and Technology of China, the researchers said that quantum computing could in some ways dwarf the processing power of today's supercomputers.

HOW THE WORLD'S FIRST QUANTUM COMPUTING MACHINE CAME TO BE

The manipulation of multi-particle entanglement is the core of quantum computing technology and has been the focus of international quantum computing research.

Recently, Pan Jianwei of the Chinese Academy of Sciences, Lu Chaoyang and Zhu Xiaobo of the University of Science and Technology of China and Wang Haohua of Zhejiang University set international records in quantum control of the maximal numbers of entangled photonic quantum bits and entangled superconducting quantum bits.

Pan said quantum computers could, in principle, solve certain problems faster than classical computers.

Despite substantial progress in the past two decades, building quantum machines that can actually outperform classical computers in some specific tasks - an important milestone termed "quantum supremacy" - remains challenging.

In the quest for quantum supremacy, boson sampling - an intermediate model of quantum computing - has received considerable attention, as it requires fewer physical resources than building universal optical quantum computers, Pan was quoted as saying by the state-run Xinhua news agency.
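
Why is boson sampling considered a credible path to quantum supremacy? The output probabilities of such an experiment are governed by permanents of submatrices of the optical network's transfer matrix, and computing a permanent is #P-hard: even Ryser's formula, the best-known general exact method, takes on the order of 2^n steps for n photons. The Python sketch below is an illustration of that classical cost, not code from the experiment.

```python
# Illustrative only: classical evaluation of a matrix permanent, the quantity
# that sets boson sampling probabilities, via Ryser's formula (O(2^n * n^2)).
from itertools import combinations
import numpy as np

def permanent(A):
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    total = 0j
    for r in range(1, n + 1):                        # all non-empty column subsets
        for cols in combinations(range(n), r):
            row_sums = A[:, list(cols)].sum(axis=1)  # per-row sum over chosen columns
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total

# Sanity checks: perm(I_2) = 1, perm([[1,2],[3,4]]) = 1*4 + 2*3 = 10.
print(permanent(np.eye(2)))          # (1+0j)
print(permanent([[1, 2], [3, 4]]))   # (10+0j)
```

Doubling the number of photons roughly squares the classical workload, which is why a photonic device that samples these distributions directly can outpace classical simulation even while remaining far from a universal quantum computer.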

Last year, the researchers developed the world's best single-photon source based on semiconductor quantum dots.

Now, they are using this high-performance single-photon source and an electronically programmable photonic circuit to build a multi-photon quantum computing prototype that runs the boson sampling task.

The test results show the sampling rate of this prototype is at least 24,000 times faster than international counterparts, researchers said.

At the same time, the prototype quantum computing machine is 10 to 100 times faster than the first electronic computer, ENIAC, and the first transistor computer, TRADIC, in running the classical algorithm, Pan said.

It is the first quantum computing machine based on single photons to go beyond the early classical computers, and it ultimately paves the way to a quantum computer that can beat classical ones.

Last year, China successfully launched the world's first quantum satellite, which will explore "hack proof" quantum communications by transmitting unhackable keys from space and provide insight into the strangest phenomenon in quantum physics - quantum entanglement.

The research was published in the journal Nature Photonics.

(With inputs from PTI)


Read the original post:

Chinese scientists build world's first quantum computing machine - India Today

quantum computing – WIRED UK

In a world where we rely increasingly on computing to share our information and store our most precious data, the idea of living without computers might baffle most people.

But if we continue to follow the trend that has been in place since computers were introduced, by 2040 we will not have the capability to power all of the machines around the globe, according to a recent report by the Semiconductor Industry Association.

To prevent this, the industry is focused on finding ways to make computing more energy efficient, but classical computers are limited by the minimum amount of energy it takes them to perform one operation.

This energy limit is named after IBM Research's Rolf Landauer, who in 1961 found that in any computer, each single-bit operation must dissipate an absolute minimum amount of energy. Landauer's formula set the lowest limit of energy required for a computer operation, and in March this year researchers demonstrated it could be possible to make a chip that operates close to this lowest energy.
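
For reference (the article does not spell the formula out), Landauer's limit on the energy dissipated when one bit of information is erased at temperature T is

\[ E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\ \mathrm{J\,K^{-1}})(300\ \mathrm{K})(\ln 2) \approx 2.9\times10^{-21}\ \mathrm{J} \]

at room temperature - several orders of magnitude below the energy today's transistors actually dissipate per switching operation.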

It was called a "breakthrough for energy-efficient computing" and could cut the amount of energy used in computers by a factor of one million. However, it will take a long time before we see the technology used in our laptops; and even when it is, the energy will still be above the Landauer limit.

This is why, in the long term, people are turning to radically different ways of computing, such as quantum computing, to find ways to cut energy use.

Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at a time. Because of the way these tiniest of particles behave, operations can be done much more quickly and with less energy than on classical computers.

In classical computing, a bit is a single piece of information that can exist in one of two states: 1 or 0. Quantum computing instead uses quantum bits, or 'qubits'. These are quantum systems with two states. However, unlike a usual bit, a qubit is not restricted to just 1 or 0: it can exist in any superposition of these values.

"Traditionally qubits are treated as separated physical objects with two possible distinguishable states, 0 and 1," Alexey Fedorov, physicist at the Moscow Institute of Physics and Technology told WIRED.

"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'."

A qubit can be thought of as a point on a sphere. Whereas a classical bit can only sit at one of the sphere's two poles, a qubit can be at any point on its surface. This means a computer using such bits can, in principle, represent a huge amount more information while using less energy than a classical computer.
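
In standard notation (added here for clarity, not from the WIRED piece), a qubit state is a superposition

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \]

and the sphere in the analogy above is the Bloch sphere, on which every such state is a point

\[ |\psi\rangle = \cos\tfrac{\theta}{2}\,|0\rangle + e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle, \]

with the two poles (θ = 0 and θ = π) recovering the classical values 0 and 1. A measurement still returns only 0 or 1, with probabilities |α|² and |β|², which is why the extra information in a superposition has to be exploited through interference before readout.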

Last year, a team of Google and Nasa scientists found that a D-Wave quantum computer was 100 million times faster than a conventional computer on a carefully chosen benchmark task. But moving quantum computing to an industrial scale is difficult.

IBM recently announced that its Q division is developing quantum computers that can be sold commercially within the coming years. Commercial quantum computer systems "with ~50 qubits" will be created "in the next few years," IBM claims. Meanwhile, researchers at Google say, in a Nature comment piece, that companies could start to see returns on elements of quantum computing technology within the next five years.

Computations occur when qubits interact with each other, so a working computer needs many qubits. The main reason quantum computers are so hard to manufacture is that scientists still have not found a simple way to control complex systems of qubits.

Now, scientists from the Moscow Institute of Physics and Technology and the Russian Quantum Centre are looking into an alternative approach. Not content with single qubits, the researchers decided to tackle the problem of quantum computing another way.

"In our approach, we observed that physical nature allows us to employ quantum objects with several distinguishable states for quantum computation," Fedorov, one of the authors of the study, told WIRED.

The team created quantum objects with several different energy "levels", which they have named qudits. The "d" stands for the number of different energy levels the qudit can take. The term "level" comes from the fact that typically each logic state of a qubit corresponds to a state with a certain value of energy - and these possible energy values are called levels.

"In some sense, we can say that one qudit, quantum object with d possible states, may consist of several 'virtual' qubits, and operating qudit corresponds to manipulation with the 'virtual' qubits including their interaction," continued Federov.

"From the viewpoint of abstract quantum information theory everything remains the same but in concrete physical implementation many-level system represent potentially useful resource."

Quantum computers are already in use, in the sense that logic gates have been made using two qubits, but getting quantum computers to work on an industrial scale is the problem.

"The progress in that field is rather rapid but no one can promise when we come to wide use of quantum computation," Fedorov told WIRED.

Elsewhere, in a step towards quantum computing, researchers have guided electrons through semiconductors using incredibly short pulses of light.

These extremely short, configurable pulses of light could lead to computers that operate 100,000 times faster than they do today. Researchers, including engineers at the University of Michigan, can now control peaks within laser pulses just a few femtoseconds (quadrillionths of a second) long. The result is a step towards "lightwave electronics", which could eventually lead to a breakthrough in quantum computing.

A bizarre discovery recently revealed that cold helium atoms in lab conditions on Earth abide by the same law of entropy that governs the behaviour of black holes.

The law, first developed by Professor Stephen Hawking and Jacob Bekenstein in the 1970s, describes how the entropy, or the amount of disorder, increases in a black hole when matter falls into it. It now seems this behaviour appears at both the huge scales of outer space and at the tiny scale of atoms, specifically those that make up superfluid helium.

"It's called an entanglement area law, explained Adrian Del Maestro, physicist at the University of Vermont. "It points to a deeper understanding of reality and could be a significant step toward a long-sought quantum theory of gravity and new advances in quantum computing.

Go here to see the original:

quantum computing - WIRED UK

Time Crystals Could be the Key to the First Quantum Computer – TrendinTech

It's been proven that time crystals do in fact exist. Two different teams of researchers created time crystals just recently: one was from the University of Maryland and the other from Harvard University. While the first team used a chain of charged particles called ytterbium ions, the other used a synthetic diamond to create an artificial lattice.

It took a while for the idea of time crystals to stick because they seemed essentially impossible. Unlike conventional crystals, whose lattices simply repeat themselves in space, time crystals also repeat in time, breaking time-translation symmetry. This unique phenomenon is one of the first demonstrations of non-equilibrium phases of matter.

The Harvard researchers are excited by their discoveries so far and are now hoping to uncover more about these time crystals. Mikhail Lukin and Eugene Demler are both physics professors and joint leaders of the Harvard research group. Lukin said in a recent press release, "There is now broad, ongoing work to understand the physics of non-equilibrium quantum systems." The team is keen to press on with further research, knowing that studying materials such as time crystals will help us better understand the quantum world as well as our own.

Research such as that carried out by the Harvard team will allow others to develop new technologies such as quantum sensors, atomic clocks, or precision measuring tools. When it comes to quantum computing, time crystals could be the missing link we're searching for in developing the world's first workable model. "This is an area that is of interest for many quantum technologies," said Lukin, "because a quantum computer is a quantum system that's far away from equilibrium. It's very much at the frontier of research and we are really just scratching the surface." Quantum computers could change the way research is carried out and help solve the most complex of problems. We just need to figure it out first.

See more here:

Time Crystals Could be the Key to the First Quantum Computer - TrendinTech