Daily Archives: May 11, 2017

You really should know what the Andrew File System is – Network World

Posted: May 11, 2017 at 1:25 pm

By Bob Brown, News Editor, Network World | May 10, 2017 2:20 PM PT

When I saw that the creators of the Andrew File System (AFS) had been named recipients of the $35K ACM Software System Award, I said to myself "That's cool, I remember AFS from the days of companies like Sun Microsystems... just please don't ask me to explain what the heck it is."

Don't ask my colleagues either. A quick walking-around-the-office survey of a half dozen of them turned up mostly blank stares at the mention of the Andrew File System, a technology developed in the early 1980s and named after Andrew Carnegie and Andrew Mellon. But as the Association for Computing Machinery's award would indicate, AFS is indeed worth knowing about as a foundational technology that paved the way for widely used cloud computing techniques and applications.

MORE: Whirlwind tour of tech's major awards, honors and prizes

Mahadev "Satya" Satyanarayanan, a Carnegie Mellon University Computer Science professor who was part of the AFS team, answered a handful of my questions via email about the origins of this scalable and secure distributed file system, the significance of it, and where it stands today. Satyanarayanan was recognized by ACM along with John Howard, Michael Leon Kazar, Robert Nasmyth Sidebotham, David Nichols, Sherri Nichols, Alfred Spectorand Michael West, who worked as a team via the Information Technology Center partnership between Carnegie Mellon and IBM (the latter of which incidentally funded this ACM prize).

Is there any way to quantify how widespread AFS use became and which sorts of organizations used it most? Any sense of how much it continues to be used, and for what?

Over a roughly 25-year timeframe, AFS has been used by many U.S. and non-U.S. universities. Many national labs, supercomputing centers and similar institutions have also used AFS, as have companies in the financial industry (e.g., Goldman Sachs) and other industries. A useful snapshot of AFS deployment was provided by the paper "An Empirical Study of a Wide-Area Distributed File System" that appeared in ACM Transactions on Computer Systems in 1996. That paper states:

"Originally intended as a solution to the computing needs of the Carnegie Mellon University, AFS has expanded to unite about 1000 servers and 20,000 clients in 10 countries. We estimate that more than 100,000 users use this system worldwide. In geographic span as well as in number of users and machines, AFS is the largest distributed file system that has ever been built and put to serious use."

Figure 1 in that paper shows that AFS spanned 59 educational cells, 22 commercial cells, 11 governmental cells, and 39 cells outside the United States at the time of the snapshot. In addition to this large federated multi-organization deployment of AFS, there were many non-federated deployments of AFS within individual organizations.

What has been AFS's biggest impact on today's cloud and enterprise computing environments?

The model of storing data in the cloud and delivering parts of it via on-demand caching at the edge is something everyone takes for granted today. That model was first conceived and demonstrated by AFS, and is perhaps its biggest impact. It simplifies management complexity for operational staff, while preserving performance and scalability for end users. From the viewpoint of end users, the ability to walk up to any machine and use it as your own provides enormous flexibility and convenience. All the data that is specific to a user is delivered on demand over the network. Keeping all the machines that you use in sync becomes trivial. Users at organizations that deployed AFS found this an addictive capability. Indeed, it was this ability that inspired the founders of Dropbox to start their company. They had used AFS at MIT as part of the Athena environment, and wanted to enable at wider scale this effortless ability to keep all of a person's machines in sync. Finally, AFS's architectural principles and implementation techniques have influenced many other systems over the past decades.
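To make the on-demand caching model concrete, here is a minimal, hypothetical Python sketch of AFS-style whole-file caching with server-issued callbacks, i.e. the server promises to notify a client when a cached file changes. All names are illustrative; this is not the AFS or OpenAFS code.

```python
# Hypothetical sketch of AFS-style whole-file caching with callback
# invalidation. Illustrative only; not the AFS/OpenAFS implementation.

class Server:
    def __init__(self):
        self.files = {}       # path -> contents
        self.callbacks = {}   # path -> clients holding a callback promise

    def fetch(self, client, path):
        # The client fetches the whole file and receives a callback
        # promise: the server will notify it if the file later changes.
        self.callbacks.setdefault(path, set()).add(client)
        return self.files.get(path, "")

    def store(self, client, path, data):
        self.files[path] = data
        # Break callbacks: other caching clients learn their copy is stale.
        for other in self.callbacks.get(path, set()) - {client}:
            other.invalidate(path)
        self.callbacks[path] = {client}

class Client:
    def __init__(self, server):
        self.server = server
        self.cache = {}       # path -> locally cached contents

    def open(self, path):
        # A cache hit is valid until the server breaks the callback,
        # so no per-open round trip to the server is needed.
        if path not in self.cache:
            self.cache[path] = self.server.fetch(self, path)
        return self.cache[path]

    def write(self, path, data):
        self.cache[path] = data
        self.server.store(self, path, data)

    def invalidate(self, path):
        self.cache.pop(path, None)

srv = Server()
a, b = Client(srv), Client(srv)
a.write("/afs/home/notes.txt", "v1")
assert b.open("/afs/home/notes.txt") == "v1"   # b caches v1, gets a callback
a.write("/afs/home/notes.txt", "v2")           # server breaks b's callback
assert b.open("/afs/home/notes.txt") == "v2"   # b refetches on next open
```

The design point the sketch illustrates is that validity checking is pushed to the server (callback breaks) rather than polled by clients, which is a large part of why AFS could scale to thousands of clients per server.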

How did AFS come to be created in the first place?

In 1982, CMU and IBM signed a collaborative agreement to create a "distributed personal computing environment" on the CMU campus that could later be commercialized by IBM. The actual collaboration began in January 1983. A good reference for information about these early days is the 1986 CACM paper by [James H.] Morris et al entitled "Andrew: A Distributed Personal Computing Environment". The context of the agreement was as follows. In 1982, IBM had just introduced the IBM PC, which was proving to be very successful. At the same time, IBM was fully aware that enterprise-scale use of personal computing required the technical ability to share information easily, securely, and with appropriate access controls. This was possible in the timesharing systems that were still dominant in the early 1980s. How to achieve this in the dispersed and fragmented world of a PC-based enterprise was not clear in 1982. A big part of the IBM-CMU collaborative agreement was to develop a solution to this problem. More than half of the first year of the Information Technology Center (1983) was spent brainstorming on how best to achieve this goal. Through this brainstorming process, a distributed file system emerged by about August 1983 as the best mechanism for enterprise-scale information sharing. How to implement such a distributed file system then became the focus of our efforts.

What would the AFS creators have done differently in building AFS if they had to do it over again?

I can think of at least two things: one small and one big.

The small thing is that the design and early evolution of AFS happened prior to the emergence of [network address translation (NAT)]-based firewalls in networking. These are in widespread use today in homes, small enterprises, etc. Their presence makes it difficult for a server to initiate contact with a client in order to establish a callback channel. If we had developed AFS after the widespread use of NAT-based firewalls, we would have carefully rethought how best to implement callbacks in the presence of NAT firewalls.
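A common modern workaround, sketched here purely for illustration (it is not how AFS implemented callbacks), is for the client to keep a long-lived outbound connection open through the NAT, so the server can push callback breaks over that connection instead of initiating inbound contact. The wire format below is made up.

```python
# Hypothetical sketch of the "client dials out" NAT workaround: the
# outbound connection creates a NAT mapping, and the server then pushes
# callback breaks back over the same socket. Protocol is illustrative.
import socket

def callback_listener(server_host: str, port: int = 7001):
    sock = socket.create_connection((server_host, port))  # outbound through NAT
    sock.sendall(b"REGISTER\n")        # identify ourselves to the server
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break                      # reconnect/backoff logic omitted
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            if line.startswith(b"BREAK "):
                path = line[len(b"BREAK "):].decode()
                print(f"callback broken for {path}; dropping cached copy")
```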

The bigger thing has to do with the World Wide Web. The Mosaic browser emerged in the early 1990s, and Netscape Navigator a bit later. By then AFS had been in existence for many years, and was in widespread use at many places. Had we realized how valuable the browser would eventually become as a tool, we would have paid much more attention to it. For example, a browser can access AFS by using "file://" rather than "http://" in addresses. All of the powerful caching and consistency-maintenance machinery that is built into AFS would then have been accessible through a user-friendly tool that eventually proved enormously valuable. It is possible that the browser and AFS could have had a much more symbiotic evolution, as HTTP and browsers eventually did.

Looks like maybe there are remnants of AFS alive in the open source world?

Indeed. OpenAFS continues to be an active open source project. Many institutions (including CMU) continue to run AFS in production, and that code is now based on OpenAFS.

Also, my work on the Coda File System forked off from the November 1986 version of AFS. Coda was open-sourced in the mid-1990s. That code base continues to be alive and functional today. Buried in Coda are ideas and actual code from early AFS.

Do any of you have any spectacular plans for what you'll do with the prize money?

Nothing concrete yet. We have discussed possibly donating the funds to a charitable cause.

Posted in Cloud Computing

Enterprise-owned data centres still ‘essential’ despite cloud growth, research notes – Cloud Tech

Posted: at 1:25 pm

Enterprises may be starting to move workloads to the cloud, but enterprise-owned data centres remain the primary compute venue with workloads staying consistent over the past three years, according to new research from the Uptime Institute.

The study, which polled more than 1,000 data centre and IT professionals globally, argues that enterprises continue to see the data centre as essential to their digital-centric strategies with the majority of budgets increasing or staying consistent through 2017.

Respondents reported that nearly two thirds of their IT assets are currently deployed in their own data centres, with 22% deployed with colocation or multi-tenant data centre providers and only 13% deployed in the cloud.

Despite this, more than two thirds (68%) of companies polled say they rely on IT-based resiliency, using live application failover across multiple, geographically distributed data centres in case of an outage. An overwhelming majority (90%) said their company's management was more concerned about outages than at this time last year.

"The survey findings reflect several key trends that are acting together as a powerful catalyst for change within the industry," said Matt Stansberry, senior director of content and publications at Uptime Institute. "Increased performance at the processor level, further expansion of server virtualisation, and the adoption of cloud computing have all created an IT foundation that differs greatly from those seen just five years ago. Through this change, enterprise-owned data centres have remained a central component."

"We urge data centre and IT professionals to focus on the business aspects of running their IT foundation, creating sets of repeatable processes to make it work efficiently and adopting new technologies and solutions when the business demands it," added Stansberry.

Posted in Cloud Computing

IBM touts its cloud platform as quickest for AI with benchmark tests – Cloud Tech

Posted: at 1:25 pm

IBM claims it has the fastest cloud for deep learning and artificial intelligence (AI) after publishing benchmark tests which show NVIDIA Tesla P100 GPU accelerators on the IBM Cloud can provide up to 2.8 times more performance than the previous generation in certain cases.

The results, IBM argues, show how organisations can more quickly create advanced AI applications in the cloud. Deep learning techniques are a key driver behind the increased demand for, and sophistication of, AI applications, the company noted. However, training a deep learning model to do a specific task is a compute-heavy process that can be time- and cost-intensive.

IBM purports to be the first of the large cloud providers to offer NVIDIA Tesla P100 GPUs. Separate tests were carried out, first by IBM engineers and then by cloud simulation platform provider Rescale. For the IBM tests, engineers trained a deep learning model for image classification using two NVIDIA P100 cards on Bluemix bare metal, before comparing the same process to two Tesla K80 GPU cards.

The second performance benchmark, from Rescale, also recorded reduced deep learning training times, based on its ScaleX platform, which features capabilities for deep learning software as a service (SaaS).

"Innovation in AI is happening at a breakneck speed thanks to advances in cloud computing," said John Considine, IBM general manager for cloud infrastructure services, in a statement. "As the first major cloud provider to offer the NVIDIA Tesla P100 GPU, IBM Cloud is providing enterprises with accelerated performance so they can quickly and more cost-effectively create sophisticated AI and cognitive experiences for their end users."

Another cloud vendor utilising NVIDIA's Tesla P100 GPU, although not at the same scale as IBM, is Tencent, which made the announcement back in March. As this publication noted at the time, virtually every major cloud player is an NVIDIA customer of some sort, including Amazon Web Services (AWS), Google, and Microsoft.

Posted in Cloud Computing

Microsoft is on the edge: Windows, Office? Naah. Let’s talk about cloud, AI – The Register

Posted: at 1:25 pm

At the Build 2017 developer conference today, Microsoft CEO Satya Nadella marked a Windows milestone (500 million monthly active users) and then proceeded to say very little about Windows or Office.

Instead he, along with Scott Guthrie, EVP of the Microsoft Cloud and Enterprise Group, and Harry Shum, EVP of Microsoft's Artificial Intelligence and Research group, spent most of their time on stage, in Seattle, talking about Azure cloud services, databases, and cross-platform development tools.

Arriving on stage to give his keynote address, Nadella joked that he thought it would be an awesome idea on such a sunny day "to bring everyone into a dark room to talk about cloud computing."

Office and Windows can wait.

Microsoft watchers may recall that its cloud-oriented businesses have been doing well enough to deserve the spotlight. In conjunction with the company's fiscal second quarter earnings report in January, the Windows and Office empire revealed that Azure revenue grew 93 per cent year-on-year.

During a pre-briefing for the press on Tuesday, Microsoft communications chief Frank Shaw described "a new worldview" for the company framed by the "Intelligent Edge" and the "Intelligent Cloud."

Nadella described this newborn Weltanschauung as "a massive shift that is going to play out in the years to come."

He mused about a software-based personal assistant to illustrate his point. "Your personal digital assistant, by definition, will be available on all your devices," he said, to make the case that the centralized computing model, client and server, has become outmoded. Data and devices are dispersed.

In other words, all the data coming off connected devices requires both local and cloud computing resources. The revolution will not be centralized.

That could easily be taken as reheated Cisco frothing about the explosive growth of the Internet of Things and bringing processing smarts to the edge of the network. But Microsoft actually introduced a new service that fit its avowed vision.

Microsoft's bipolar worldview, the Intelligent Edge and the Intelligent Cloud, manifests itself in a novel "planet scale" database called Azure Cosmos DB. It's a distributed, multi-model database, based on the work of Microsoft researcher Leslie Lamport, that promises to make data available locally, across Microsoft's 34 regions, while also maintaining a specified level of consistency across the various instances of the data.
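To give a sense of what "a specified level of consistency" looks like to a developer, here is a minimal sketch using the azure-cosmos Python SDK. The endpoint, key, and names are placeholders, and the exact parameter names should be checked against the current SDK documentation.

```python
# Minimal sketch: write to a globally distributed Cosmos DB container.
# Endpoint/key are placeholders; consistency is a per-account/per-client
# choice (Strong, Bounded Staleness, Session, Consistent Prefix, Eventual).
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://myaccount.documents.azure.com:443/",  # hypothetical endpoint
    credential="<primary-key>",
    consistency_level="Session",       # relax or strengthen as the app needs
)
db = client.create_database_if_not_exists("telemetry")
container = db.create_container_if_not_exists(
    id="readings",
    partition_key=PartitionKey(path="/deviceId"),
)
# The write lands in the nearest region and replicates to the others
# under the chosen consistency guarantee.
container.upsert_item({"id": "r-001", "deviceId": "d-42", "tempC": 21.5})
```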

An Intelligent Meeting demonstration, featuring Cortana, showed how AI has the potential to exchange and coordinate data across multiple services. But "potential" requires developer work: it will take coding to create the Cortana Skills necessary to connect the dots and manage the sort of cross-application communication that knowledge workers accomplish today through application switching, copying, and pasting.

Conveniently, the Cortana Skills Kit is now in public preview, allowing developers to extend the capabilities of Microsoft's assistant software to devices like Harman Kardon's Invoke speaker.

Beyond code, it will take data associated with people and devices in an organization to make those connections. That's something Microsoft, with its Azure Active Directory, its Graph, and LinkedIn, has in abundance.

A demonstration of real-time image recognition to oversee a construction worksite showed how a capability like image recognition might be useful to corporate customers. Cameras spotted unauthorized people and located requested equipment on-site. It looked like something companies might actually find useful.

Artificial intelligence as a general term sounds like naive science fiction. But as employed by Microsoft, it refers to machine learning frameworks, natural language processing, computer vision, image recognition or the like.

"We believe AI is about amplifying human ingenuity," said Shum.

Microsoft's concern is convincing developers and corporate clients to build and adopt AI-driven applications using Microsoft cloud computing resources, rather than taking their business to AWS or Google Cloud Platform.

One way Microsoft hopes to achieve that is by offering cloud computing outside the cloud, on endpoints like IoT devices. The company previewed a service called Azure IoT Edge to run containerized functions locally. It's a way of reducing latency and increasing responsiveness, which matters for customers like Sandvik.

The Swedish industrial automation biz has been testing Azure IoT Edge to anticipate equipment failure in its workplace machines, in order to shut them down before components break, causing damage and delays.
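A toy sketch of the edge pattern described above: the latency-critical decision (shutting the machine down) runs next to the machine, and only summaries travel to the cloud. The functions read_vibration, shut_down, and send_to_cloud are hypothetical stand-ins for device- and SDK-specific calls, not Azure IoT Edge APIs.

```python
# Illustrative edge loop: act locally on a vibration threshold, forward
# only summaries/alerts upstream. All callables are hypothetical stubs.
import statistics
import time

WINDOW = 50
THRESHOLD_MM_S = 7.1   # e.g. an ISO 10816-style vibration severity limit

def monitor(read_vibration, shut_down, send_to_cloud):
    window = []
    while True:
        window.append(read_vibration())
        window = window[-WINDOW:]
        rms = statistics.fmean(v * v for v in window) ** 0.5
        if rms > THRESHOLD_MM_S:
            shut_down()    # local decision: no round trip to the cloud
            send_to_cloud({"event": "shutdown", "rms": rms})
            return
        send_to_cloud({"event": "heartbeat", "rms": rms})  # non-critical
        time.sleep(1.0)
```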

Posted in Cloud Computing

Oracle launches cloud computing service for India | Business Line – Hindu Business Line

Posted: at 1:25 pm

New Delhi, May 10:

Technology giant Oracle has launched its cloud computing service for India, which aims to support the government's GST rollout in July; the company also plans to open data centres in the country.

Addressing a gathering of 12,000 attendees at Oracle OpenWorld, which included technology partners, analysts and government leaders such as Maharashtra Chief Minister Devendra Fadnavis, CEO Safra Catz said that India is at an "amazing moment" in terms of sociological factors, such as a high concentration of youth, as well as the government's efforts to make greater use of technology.

"My last visit with PM Narendra Modi changed the way I looked at India and my views about the country," she said. Oracle has been present in India for more than two decades, providing database technology that forms the backbone for many commercial transactions.

It was during that visit that Modi urged Catz to do more for Indian citizens, which could unleash the power of people's ideas, Catz said.

In an effort to simplify the tax regime in India and ensure higher compliance with tax laws, Oracle's ERP solution aims to provide support for GST Network integration, statutory reporting and payment processing, among other functions. Under the GST regime, companies will have to upgrade their ERP systems.

'SARAL GST'

Apart from Oracle, Reliance Corporate IT Park, a subsidiary of Reliance Industries Ltd, signed an MoU a week ago with Oracle rival SAP to launch SARAL GST, a solution for taxpayers to become GST compliant and access the government's GST System.

Analysts welcomed this move. "It will open up opportunities for software and hardware but the core theme should be on simplification which would benefit an end user," said Sanchit Vir Gogia, CEO, Greyhound Research.

State governments in India are also adopting Oracle's solutions. The Maharashtra Government has a partnership with Oracle. Additionally, the Jharkhand Government and Oracle have signed an MoU to improve citizen services with an aim to make Jharkhand an attractive destination for start-ups.

The MoU was signed at Oracle OpenWorld and Oracle will offer its support to the state through its portfolio of technology solutions, including Oracle Cloud. These solutions cater to the growing requirements and expectations of citizens, businesses and government departments for smarter, transparent and efficient governance within the state of Jharkhand, company executives said.

Earlier this year, Jharkhand had received investment support from the Union Government for an internal venture capital fund to support start-ups in the state.

Further, as part of the MoU, Oracle and the Government of Jharkhand will collaborate to create proofs of concept and help new start-ups use Oracle Cloud-based platforms to operationalise citizen services and start-up centres.

Adopters of technology among small businesses have some inherent advantages. A November 2016 survey by Kantar IMRB of 504 Indian SMBs found that those that adopt digital technologies grow profits up to two times faster than offline SMBs.

The report found that 51 per cent of digitally enabled SMBs sell beyond city boundaries compared with 29 per cent of offline small businesses.

(This article was published on May 10, 2017)

Posted in Cloud Computing

New Materials Could Make Quantum Computers More Practical – Tom’s Hardware

Posted: at 1:25 pm

A team of researchers from Stanford University has been investigating some new materials that they believe will bring us closer to building practical quantum computers.

One possible way to build quantum computers would be to use lasers to isolate spinning electrons inside a semiconductor material. When the laser hits the electron, the electron reveals how it is spinning by emitting one or more light particles. The spin states can then be used as the most fundamental building blocks of quantum computing, the same way conventional computing uses 1s and 0s.
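A small numpy illustration (ours, not the Stanford team's) of the difference between a bit and a spin qubit: the qubit holds a superposition a|0> + b|1>, and a measurement yields 0 or 1 with probabilities |a|^2 and |b|^2 (the Born rule).

```python
# Simulate measuring a qubit prepared in an equal superposition.
import numpy as np

rng = np.random.default_rng(0)
state = np.array([1, 1]) / np.sqrt(2)   # amplitudes for |0> and |1>
probs = np.abs(state) ** 2              # Born rule: [0.5, 0.5]
samples = rng.choice([0, 1], size=10_000, p=probs)
print(probs, samples.mean())            # the mean approaches 0.5
```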

According to Stanford electrical engineering professor Jelena Vuckovic, who has been investigating these new materials for building quantum computers, quantum computing would be ideal for studying biological systems, doing cryptography, or data mining, as well as for any other complex problem that can't be solved by conventional computers.

"When people talk about finding a needle in a haystack, that's where quantum computing comes in," said Vuckovic.

The challenge in isolating spinning electrons is finding a material that can confine the electrons when the lasers hit them. Vuckovic's team has identified three materials that can potentially do this: quantum dots, diamonds, and silicon carbide.

A quantum dot is a small amount of indium arsenide inside a crystal of gallium arsenide. The atomic properties of the two materials are known to trap spinning electrons.

In a recent paper, Kevin Fischer, a graduate student in the Vuckovic lab, described how the laser-electron processes can be used within a quantum dot system to control the input and output of light. For instance, by applying more power behind the lasers, two photons could be emitted instead of one. This could be used as an alternative to the 1s and 0s of conventional computers.

One issue is that the quantum dot system still requires cryogenic cooling, which doesn't make it a suitable candidate for general-purpose computing.

Vuckovic's team has also been investigating modifying the crystalline lattice of a diamond to trap light in what is known as a color center. The team replaced some of the carbon atoms in the diamond's crystalline lattice with silicon atoms.

Like the quantum dots approach, doing quantum computing within diamond color centers requires cryogenic cooling.

Silicon carbide is a hard and transparent crystal that is used to make clutch plates, brake pads, and bulletproof vests, among other things. Prior research has shown that silicon carbide could be modified to create color centers at room temperature, but not in a way that's efficient enough to create a quantum chip.

Vuckovics team was able to eliminate some of the atoms in the silicon carbide lattice to create much more efficient color centers. The team also fabricated nanowires around the color centers to improve photon extraction.

Trapping electrons at room temperature could be a significant step forward for quantum computers, according to Vuckovic. However, she and her team are not yet sure which method of creating a practical quantum computer will work best in the end.

Some of the biggest technology companies in the world are working on building quantum computers right now, including Google, IBM, and Microsoft. Teams at many universities around the world are also experimenting with different approaches to building quantum computers.

Both Google and IBM believe we'll reach quantum supremacy (the point when quantum computers will be faster than conventional computers at solving certain types of complex problems) when quantum computers have around 50 qubits, up from the fewer than 10 qubits they have now. The two companies expect this goal to be reached in the next few years.
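The back-of-envelope arithmetic behind the roughly 50-qubit figure: simulating n qubits exactly means storing 2^n complex amplitudes, which outgrows classical memory at around n = 50.

```python
# Memory needed to hold a full n-qubit state vector (one complex128,
# i.e. 16 bytes, per amplitude).
for n in (10, 30, 50):
    amplitudes = 2 ** n
    pib = amplitudes * 16 / 2**50       # pebibytes
    print(f"{n} qubits: 2^{n} amplitudes, {pib:.3g} PiB")
```

Fifty qubits already implies roughly 16 PiB of amplitudes, far beyond any single classical machine.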

Posted in Quantum Computing

Molecular magnets closer to application in quantum computing – Next Big Future

Posted: at 1:25 pm

In a Nature Communications publication, scientists from the Institut Laue-Langevin (ILL), the University of Parma, ISIS and the University of Manchester used the (Cr7Ni)2 dimer as a benchmark system to demonstrate the capability of four-dimensional inelastic neutron scattering (4D-INS) to investigate entanglement between molecular qubits. By utilising high-quality single crystals and the full capabilities of the time-of-flight spectrometer IN5, the team was able to demonstrate and quantify the entanglement through the huge amount of data extracted from the 4D phase space (Qx, Qy, Qz, E), where Q is the momentum-transfer vector and E the energy transfer. Indeed, the neutron cross-section directly reflects dynamical correlations between individual atomic spins in the molecule. Hence, the corresponding pattern of maxima and minima in the measured neutron scattering intensity as a function of Q is a sort of portrayal of the entanglement between the molecular qubits. Furthermore, the team has also developed a method to quantify entanglement from INS data.
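For reference, the statement that the neutron cross-section "directly reflects dynamical correlations between individual atomic spins" is the textbook magnetic INS result (in the standard dipole approximation, see e.g. Squires); this is our gloss, not the paper's specific expression:

$$\frac{d^2\sigma}{d\Omega\,dE} \;\propto\; \sum_{\alpha,\beta}\left(\delta_{\alpha\beta}-\hat{Q}_\alpha\hat{Q}_\beta\right)\sum_{i,j} e^{\,i\mathbf{Q}\cdot(\mathbf{r}_i-\mathbf{r}_j)} \int dt\; e^{-i\omega t}\,\left\langle S_i^{\alpha}(0)\,S_j^{\beta}(t)\right\rangle$$

The phase factors over spin pairs (i, j) are what produce the Q-dependent pattern of maxima and minima described above.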

"Such a measurement opens up remarkable perspectives in understanding entanglement in complex spin systems. Research on molecular nanomagnets has been an attractive topic on the IN5 time-of-flight spectrometer for many years. In this recent work, top-class chemistry and theoretical work meet advanced neutron scattering methods to highlight the intricate physics of quantum entanglement, guiding further research towards a better understanding of the practical challenges in quantum information technology," said Dr Hannu Mutka and Dr Jacques Ollivier, ILL scientists.

With this benchmark measurement it looks as though neutrons will continue to be an essential tool in helping molecular nanomagnets realise their potential for quantum technologies of the future.

Nextbigfuture interviewed the researchers.

1. What are the next steps in this research?

By exploiting the (Cr7Ni)2 supramolecular dimer as a benchmark, we have shown that the four-dimensional inelastic neutron scattering technique (4D-INS) enables one to portray and quantify entanglement between weakly coupled molecular nanomagnets, which provide ideal test beds for investigating entanglement in spin systems. The next steps will be the application of 4D-INS to dimers of more complex molecular qubits, like those containing 4f or 5f magnetic ions, or to supramolecular compounds with more than two qubits.

2. Can the timing be seen for possible commercialization?

The use of molecular nanomagnets for quantum information processing (QIP) is a relatively unexplored field. Therefore, as in other approaches to implement qubits, commercialisation is certainly not immediate. However, molecular magnetism constitutes an alternative route to QIP that uses low-cost, yet powerful, chemical methods to fabricate basic components and integrate them in future devices.

3. Is there an effort to enable qubits via this approach?

Neutron scattering is a very powerful technique and enables one to achieve a sound characterisation of both molecular qubits and their supramolecular assemblies. Therefore, we plan to apply it to new interesting systems in the near future. In addition, we believe that our work will stimulate similar studies by other research groups. In this way, promising molecules with improved characteristics for QIP will be identified.

4. How does this work fit into a larger area of research? I.e., broad advances are happening and this is just a part.

This work provides an important tool for molecular qubits, which in turn fit the broad quest for quantum information technologies. The latter constitutes one of the most important current research areas. Indeed, some of the most important private companies and international institutions are investing a huge amount of money in this subject. For instance, the European Commission will launch a €1 billion quantum technologies flagship in 2018.

5. What do the researchers see as highlights for how this work advances the state of the art?

Experimentally measuring entanglement in complex systems is generally very difficult. In this work, we have put forward a method to demonstrate and quantify entanglement between molecular qubits, by measuring the dependence of the neutron cross-section on the three components of the momentum transfer Q. Such measurements are challenging, but we have demonstrated this with the spectrometer IN5 at the Institut Laue-Langevin, indicating that they can now be performed exploiting state-of-the-art neutron spectrometers.

6. Do the researchers have a context or vision they can share?

Quantum computers will be powerful devices able to solve problems that are impossible even on the best traditional computers. Molecular nanomagnets might provide a relatively cheap route to reach this extremely ambitious goal, and 4D-INS can be an important tool in the understanding and engineering of molecules with the right characteristics for efficiently encoding and processing quantum information.

7. Anything else that the researchers think is relevant to understanding this work and its importance?

In our opinion, this work represents a very good example of how the interplay between theory, experiments and chemical synthesis can be very fruitful and can enable us to make a significant step toward an ambitious objective.

Posted in Quantum Computing

China’s New Type of Quantum Computing Device, Built Inside a Diamond – TrendinTech

Posted: at 1:25 pm

A new quantum computing process developed by a team of Chinese researchers may one day help crack the complex digital algorithms used in encryption. It uses a diamond-based device and could be applied to quantum code-breaking in the not-too-distant future.

The new type of quantum device was built inside a diamond and used to break down the number 35 into its factors of five and seven. The team, led by quantum physicist Professor Du Jiangfeng, fired laser and microwave beams at particles inside the diamond's nitrogen-vacancy center; the development of this process may be just what is needed to crack such encryption. Results from the experiment demonstrated that the particles locked inside the diamond were able to yield the solution in just two microseconds.
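For readers wondering what breaking down 35 has to do with encryption: factoring is the hard problem underlying RSA, and quantum factoring schemes typically exploit period (order) finding. The classical illustration below shows that reduction for N = 35; it is only the number theory, not the quantum scheme used in this experiment.

```python
# Classical illustration of factoring via order finding: find the order
# r of a mod N, then gcd(a^(r/2) +/- 1, N) yields nontrivial factors.
from math import gcd

N, a = 35, 2
r = 1
while pow(a, r, N) != 1:    # order of a modulo N (r = 12 here)
    r += 1
x = pow(a, r // 2, N)       # 2^6 mod 35 = 29
print(sorted({gcd(x - 1, N), gcd(x + 1, N)}))   # [5, 7]
```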

Quantum computers are extremely powerful tools that can solve complex equations in fractions of a second, which is why so many groups are trying to be the first to develop a workable model. Not only would the winners achieve one of the greatest feats in history, they'd also make a lot of money from it.

In a similar experiment, researchers from Sandia National Laboratories in New Mexico and Harvard University embedded two silicon atoms in a diamond matrix to demonstrate how to successfully bridge quantum computers on an atomic level. There, the team used an ion beam implanter to swap one of the diamond's carbon atoms for a larger silicon atom. As for the Chinese experiment, it is the first time researchers have factorized a number on a solid-state system, and the approach could even favor certain numbers of six or more digits.

Posted in Quantum Computing

Researchers Invent Nanoscale ‘Refrigerator’ for Quantum … – Sci-News.com

Posted: at 1:25 pm

A team of researchers from the Department of Applied Physics at Aalto University in Finland has invented a quantum-circuit refrigerator, which can reduce errors in quantum computing.

Photo of the centimeter-sized silicon chip, which has two parallel superconducting oscillators and the quantum-circuit refrigerators connected to them. Image credit: Kuan Yen Tan / Aalto University.

"Quantum computers differ from the computers that we use today in that, instead of normal bits, they compute with quantum bits (qubits)," the physicists said.

The bits being crunched in your laptop are either zeros or ones, whereas a qubit can exist simultaneously in both states. This versatility of qubits is needed for complex computing, but it also makes them sensitive to external perturbations.

Just like ordinary processors, quantum computers also need a cooling mechanism.

"In the future, thousands or even millions of logical qubits may be simultaneously used in computation, and in order to obtain the correct result, every qubit has to be reset at the beginning of the computation," they said.

If the qubits are too hot, they cannot be initialized because they are switching between different states too much.

This is the problem to which Aalto University physicists Mikko Möttönen, Kuan Yen Tan and co-authors have developed a solution.

The nanoscale refrigerator invented by the team solves a massive challenge: with its help, most electrical quantum devices can be initialized quickly. The devices thus become more powerful and reliable.

"I have worked on this gadget for five years and it finally works," Tan said.

The team cooled down a qubit-like superconducting resonator utilizing the tunneling of single electrons through a 2-nm-thick insulator.

The authors supplied the electrons, via an external voltage source, with slightly less energy than is needed for direct tunneling.

Therefore, the electron captures the missing energy required for tunneling from the nearby quantum device, and hence the device loses energy and cools down.

The cooling can be switched off by adjusting the external voltage to zero.

Then, even the energy available from the quantum device is not enough to push the electron through the insulator.
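Schematically, in the standard photon-assisted tunneling picture (our gloss of the mechanism, not the paper's full theory), with $E_{\mathrm{th}}$ the junction's tunneling threshold, $V$ the bias voltage, and $\hbar\omega$ the energy quantum stored in the resonator:

$$eV + \hbar\omega \;\ge\; E_{\mathrm{th}} \;\Rightarrow\; \text{electron tunnels, carrying away } \hbar\omega \text{ (cooling on)}; \qquad eV + \hbar\omega \;<\; E_{\mathrm{th}} \;\Rightarrow\; \text{tunneling blocked (cooling off)}.$$

Setting $V = 0$ puts the junction in the second regime, which is the on/off switch described above.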

"Our refrigerator keeps quanta in order," Dr. Möttönen said.

"We now plan to cool actual quantum bits in addition to resonators, and want to lower the minimum temperature achievable with the refrigerator and make its on/off switch super fast."

The research is published in the journal Nature Communications.

_____

Kuan Yen Tan et al. 2017. Quantum-circuit refrigerator. Nature Communications 8, article number: 15189; doi: 10.1038/ncomms15189

This article is based on text provided by Aalto University.

Posted in Quantum Computing

Unbreakable quantum entanglement – Phys.Org

Posted: at 1:24 pm

May 10, 2017. The rotating centrifuge in which the entangled photon source was accelerated to 30 times the Earth's gravitational acceleration. Credit: IQOQI/ÖAW

Einstein's "spooky action at a distance" persists even at high accelerations, researchers of the Austrian Academy of Sciences and the University of Vienna were able to show in a new experiment. A source of entangled photon pairs was exposed to massive stress: The photons' entanglement survived the drop in a fall tower as well as 30 times the Earth's gravitational acceleration in a centrifuge. This was reported in the most recent issue of Nature Communications. The experiment helps deepen our understanding of quantum mechanics and at the same time gives valuable results for quantum experiments in space.

Einstein's theory of relativity and the theory of quantum mechanics are two important pillars of modern physics. On the way to achieving a "Theory of Everything," these two theories have to be unified. This has not yet been achieved, since phenomena of both theories can hardly be observed simultaneously. A typical example of a quantum mechanical phenomenon is entanglement: the measurement of one of a pair of light particles, so-called photons, defines the state of the other particle immediately, regardless of their separation. High accelerations, on the other hand, can best be described by the theory of relativity. Now, for the first time, quantum technologies enable us to observe these phenomena at once: the stability of quantum mechanical entanglement of photon pairs can be tested while the photons undergo relativistically relevant acceleration.
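A small numpy sketch (ours, not the experiment's analysis) of the correlation being described: sampling measurements of the Bell state (|00> + |11>)/sqrt(2) in the same basis always yields matching outcomes for the two photons, regardless of their separation.

```python
# Sample computational-basis measurements of a Bell pair.
import numpy as np

rng = np.random.default_rng(1)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # amplitudes for |00>,|01>,|10>,|11>
probs = np.abs(bell) ** 2                   # [0.5, 0, 0, 0.5]
outcomes = rng.choice(4, size=1000, p=probs)
photon1, photon2 = outcomes // 2, outcomes % 2
print((photon1 == photon2).all())           # True: perfectly correlated
```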

Quantum entanglement proves to be highly robust

Researchers of the Viennese Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences (OeAW) and of the University of Vienna have now investigated this area of research experimentally for the first time. They could show in their experiment that entanglement between photons survives even when the source of entangled photon pairs including the detectors are experiencing free fall or are being accelerated with 30g, that is, 30 times the Earth's acceleration. Doing so, the Viennese researchers have experimentally established an upper bound below which there is no degradation of entanglement quality.

Important for quantum experiments with satellites

"These experiments shall help to unify the theories of quantum mechanics and relativity," says Rupert Ursin, group leader at IQOQI Vienna. The sturdiness of quantum entanglement even for strongly accelerated systems is crucial also for quantum experiments in space. "If entanglement were too fragile, quantum experiments could not be carried out on a satellite or an accelerated spacecraft or only in a very limited range," exemplifies Matthias Fink, first author of the publication.

12 meters falling height and 30g

In order to prove the robustness of quantum entanglement, quantum physicist Matthias Fink and his colleagues mounted a source of polarization-entangled photon pairs in a crate, which was first dropped from a height of 12 meters to achieve zero gravity during the fall. In the second part of the experiment, the crate was fixed to the arm of a centrifuge and then accelerated up to 30g. As a comparison for the reader: a roller coaster ride exerts at most 6g on the passengers.
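Two quick back-of-envelope numbers for these stress tests (the centrifuge arm length here is our assumption, not from the article):

```python
# Free-fall time from a 12 m drop, and the spin rate for a 30 g
# centripetal acceleration at an assumed (hypothetical) 1 m arm radius.
import math

g, h, r = 9.81, 12.0, 1.0
t_fall = math.sqrt(2 * h / g)                      # h = g t^2 / 2
rpm = math.sqrt(30 * g / r) / (2 * math.pi) * 60   # a = omega^2 * r
print(f"{t_fall:.2f} s of zero-g, ~{rpm:.0f} rpm for 30 g")
```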

Detectors mounted on the crate monitored the photons' entanglement during the experiments. Analysing the data, the physicists could calculate an upper bound on the detrimental effects of acceleration on entanglement. The data showed that any degradation in entanglement quality did not significantly exceed the expected contribution of background noise. "Our next challenge will be to stabilize the setup even more in order for it to withstand much higher accelerations. This would enhance the explanatory power of the experiment even further," says Matthias Fink.

More information: Experimental test of photonic entanglement in accelerated reference frames, Nature Communications, 2017. DOI: 10.1038/NCOMMS15304

Reader comments:

It's way past time for sci-writers to stop gushing "spooky action, blah, blah". Everyone not living on Mars is sick of this tired put-down of entanglement, not unlike the despised 'god particle' moniker of the Higgs. AE did not believe in god or spooks, and used those terms colloquially. But now the sci-writers, lacking any creativity for modern descriptions, all parrot AE.

What? Is this supposed to mean the centrifuge produced 30g?

The use of intriguing and interesting language to inspire interest in the subject at hand is a tool used by all journalists. Spooky action at a distance has been referred to as often as it has because it is effective at gaining attention. To lobby for the removal of such an iconic phrase as spooky action at a distance from scientific journals would be counterproductive to the goal of spreading interest in the scientific study of reality and the laws governing it. There is an intrinsic value to use outrageous language to describe outrageous scientific phenomenon. One of Einstein's greatest contributions to science was the interest he created in the subject. Also while his religious beliefs were far from firmly established he often referred to a Force having a role in the guiding of our universe. The article was informative, interesting, and entertaining. And just to make clear what was very clear in the article, yes 30 times Earth's gravity is 30g. It said 30g multiple times

The force does not guide. It survives. Whether anyone learns... we'll see.

Fill a bucket on a rope with water, whirl it around, figure out why the water doesn't spill. Or read this 🙂 https://en.wikipe...ntrifuge

What was discovered was that the apparatus could survive. The limited "relative forces" of these experiments really can't be expected to do much else. Want to determine the "strength" of entanglement? Then send one of the entangled pair through an accelerator.

Hence you have here the peddling of cliches, catch phrases and gross simplifications in an aura of mystery and discovery, while scientific formalism suffers. All in an attempt to draw audiences to science and advertisers.

This particular piece fails to explain what those researchers actually wanted to accomplish, and why they would expect entanglement of photons (a phase entanglement vs. spin (magnetic) entanglement) to be affected by an inertial acceleration of a measly 30g (maybe 1E5g would).

Also, and most importantly, why would they expect to influence an "undetermined" phase state of an entangled pair by mechanical force? I am sure the paper answers those questions quite simply.

An interesting take on addressing the problem of misinterpretation of QM I found here: https://questforn...-quanta/

It's an important point that entanglement still occurs across varying gravity strengths. It's one of those assumptions that must be tested; these are some important negative results.

Posted in Quantum Physics