Euro 2020: England only have 5.2% chance of winning tournament, says supercomputer – GIVEMESPORT

England are being viewed as one of the big favourites to win Euro 2020.

Gareth Southgate's 26-man squad is loaded with talent, particularly in the various offensive departments.

Harry Kane, Mason Mount, Jack Grealish, Jadon Sancho, Phil Foden, Marcus Rashford and Raheem Sterling are some of the finest attack-minded players on the planet and they are all available for selection at Euro 2020.


Stopping the Three Lions at this summer's international tournament will be mighty difficult and it would likely be deemed a serious failure if they didn't at least reach the semi-final stage.

However, England have not been looked at fondly by a supercomputer, which claims that Southgate's side only have a 5.2% chance of winning Euro 2020.

The Analyst's Stats Perform Predictor has ranked all 24 nations heading to the tournament by their probability of lifting the trophy on July 11th.

Here's their brief explanation of how the model actually works:

"Stats Performs Euros Prediction model estimates the probability of each match outcome (win, draw or loss) by using betting market odds and Stats Performs team rankings. The odds and rankings are based on historical and recent team performances.

"The model considers the strength of opponents and the difficulty of the path to the final by using the match outcome probabilities with the composition of the groups and the seedings into the knockout stages.

"The Stats Perform Euros Prediction model simulates the remainder of the tournament 40,000 times. By analysing the outcome of each of these simulations, the model returns the likelihood of progression for each team at each stage of the tournament to create our final predictions."

Got it? Right, let's take a look at each nation's chance of winning Euro 2020 then...

24. North Macedonia - 0.02%

23. Slovakia - 0.04%

22. Scotland - 0.1%

21. Hungary - 0.1%

20. Finland - 0.1%

19. Austria - 0.2%

18. Czech Republic - 0.2%

17. Turkey - 0.4%

16. Wales - 0.6%

15. Ukraine - 0.8%

14. Poland - 0.8%

13. Russia - 1.0%

12. Croatia - 1.0%

11. Sweden - 1.5%

10. Switzerland - 2.3%

9. England - 5.2%

8. Denmark - 5.4%

7. Holland - 5.6%

6. Italy - 7.6%

5. Portugal - 9.6%

4. Germany - 9.8%

3. Spain - 11.3%

2. Belgium - 15.7%

1. France - 20.5%

That's right, England are behind both Denmark and Holland, which seems a tad odd.

France are the overwhelming favourites to win the tournament according to the supercomputer and it's hard to argue with that to be honest.

Les Bleus' squad is even stronger than England's and with Karim Benzema back in the team alongside Kylian Mbappe, Didier Deschamps' side look a formidable prospect on paper.

However, France have been drawn in the dreaded 'Group of Death' alongside Germany and Portugal, so it certainly won't be plain sailing for the 2018 World Cup winners...


Read more here:

Euro 2020: England only have 5.2% chance of winning tournament, says supercomputer - GIVEMESPORT

Hate to break it to you, but football’s not coming home if this AI pundit is to be believed – The Register

The Czech Republic footie team is set to be crowned champions next month, beating fellow underdogs Denmark 3-2 in the Euros in what pundits claim will prove to be a "thrilling final".

At least that's what boffins at sports data provider Sportradar reckon after scraping together results from the last 20 years, feeding them into a "super computer" running a simulated-reality program driven by AI, and then, hey presto, coming up with the finding ahead of this week's start to the month-long footballfest.

According to dressing room insiders, the whole tournament was "played out" using an "innovative simulated reality solution using artificial intelligence algorithms." So it must be true.

Of course, now that we know the result, Reg readers with a hectic social schedule can breathe a sigh of relief and reorganise their diaries to spend their free time doing other things.

Oh, and if anyone asks, England will be knocked out in the semi-finals... like you didn't know already.

Despite the heads up, El Reg's very own knobbly-kneed footy pundits are already limbering up for the tournament. They're primed and ready to work overtime cross-referencing sensitive info to make sure there are no anomalies between the highly accurate simulated reality predictions and the results of the on-field kickabout.

One grey-haired hack with fond memories of applying dubbin to boots and experiencing shooting pains after heading a waterlogged leather ball reminisced: "We used to make similar predictions before a big tournament when I was a lad. We didn't have supercomputers back then; we just had rolled-up bits of paper with numbers written on them, and we pulled them out of our school caps to predict the scores before whittling down to a winner. Happy days."

El Reg misses Paul the Octopus, the so-called animal oracle that predicted games played during the 2010 World Cup. Sadly, he died of natural causes, which is likely one thing he didn't anticipate. Or maybe he did.

Read the original:

Hate to break it to you, but football's not coming home if this AI pundit is to be believed - The Register

Why Is Quantum Computing So Hard to Explain? – Quanta Magazine

Quantum computers, you might have heard, are magical uber-machines that will soon cure cancer and global warming by trying all possible answers in different parallel universes. For 15 years, on my blog and elsewhere, I've railed against this cartoonish vision, trying to explain what I see as the subtler but ironically even more fascinating truth. I approach this as a public service and almost my moral duty as a quantum computing researcher. Alas, the work feels Sisyphean: The cringeworthy hype about quantum computers has only increased over the years, as corporations and governments have invested billions, and as the technology has progressed to programmable 50-qubit devices that (on certain contrived benchmarks) really can give the world's biggest supercomputers a run for their money. And just as in cryptocurrency, machine learning and other trendy fields, with money have come hucksters.

In reflective moments, though, I get it. The reality is that even if you removed all the bad incentives and the greed, quantum computing would still be hard to explain briefly and honestly without math. As the quantum computing pioneer Richard Feynman once said about the quantum electrodynamics work that won him the Nobel Prize, if it were possible to describe it in a few sentences, it wouldn't have been worth a Nobel Prize.

Not that that's stopped people from trying. Ever since Peter Shor discovered in 1994 that a quantum computer could break most of the encryption that protects transactions on the internet, excitement about the technology has been driven by more than just intellectual curiosity. Indeed, developments in the field typically get covered as business or technology stories rather than as science ones.

That would be fine if a business or technology reporter could truthfully tell readers, "Look, there's all this deep quantum stuff under the hood, but all you need to understand is the bottom line: Physicists are on the verge of building faster computers that will revolutionize everything."

The trouble is that quantum computers will not revolutionize everything.

Yes, they might someday solve a few specific problems in minutes that (we think) would take longer than the age of the universe on classical computers. But there are many other important problems for which most experts think quantum computers will help only modestly, if at all. Also, while Google and others recently made credible claims that they had achieved contrived quantum speedups, this was only for specific, esoteric benchmarks (ones that I helped develop). A quantum computer that's big and reliable enough to outperform classical computers at practical applications like breaking cryptographic codes and simulating chemistry is likely still a long way off.

But how could a programmable computer be faster for only some problems? Do we know which ones? And what does a "big and reliable" quantum computer even mean in this context? To answer these questions we have to get into the deep stuff.

Let's start with quantum mechanics. (What could be deeper?) The concept of superposition is infamously hard to render in everyday words. So, not surprisingly, many writers opt for an easy way out: They say that superposition means "both at once," so that a quantum bit, or qubit, is just a bit that can be "both 0 and 1 at the same time," while a classical bit can be only one or the other. They go on to say that a quantum computer would achieve its speed by using qubits to try all possible solutions in superposition, that is, at the same time, or "in parallel."

This is what I've come to think of as the fundamental misstep of quantum computing popularization, the one that leads to all the rest. From here it's just a short hop to quantum computers quickly solving something like the traveling salesperson problem by trying all possible answers at once, something almost all experts believe they won't be able to do.

The thing is, for a computer to be useful, at some point you need to look at it and read an output. But if you look at an equal superposition of all possible answers, the rules of quantum mechanics say you'll just see and read a random answer. And if that's all you wanted, you could've picked one yourself.

What superposition really means is "complex linear combination." Here, we mean "complex" not in the sense of complicated but in the sense of a real plus an imaginary number, while "linear combination" means we add together different multiples of states. So a qubit is a bit that has a complex number called an amplitude attached to the possibility that it's 0, and a different amplitude attached to the possibility that it's 1. These amplitudes are closely related to probabilities, in that the further some outcome's amplitude is from zero, the larger the chance of seeing that outcome; more precisely, the probability equals the distance squared.

But amplitudes are not probabilities. They follow different rules. For example, if some contributions to an amplitude are positive and others are negative, then the contributions can interfere destructively and cancel each other out, so that the amplitude is zero and the corresponding outcome is never observed; likewise, they can interfere constructively and increase the likelihood of a given outcome. The goal in devising an algorithm for a quantum computer is to choreograph a pattern of constructive and destructive interference so that for each wrong answer the contributions to its amplitude cancel each other out, whereas for the right answer the contributions reinforce each other. If, and only if, you can arrange that, you'll see the right answer with a large probability when you look. The tricky part is to do this without knowing the answer in advance, and faster than you could do it with a classical computer.
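
To make the amplitude rules concrete, here is a minimal numerical sketch (a standard textbook illustration, not something from the article) of a single qubit passing twice through a Hadamard gate. It shows both the "probability equals distance squared" rule and destructive interference:

import numpy as np

# A qubit is a pair of complex amplitudes (a0, a1); measurement
# probabilities are the squared distances |a0|^2 and |a1|^2.
state = np.array([1, 0], dtype=complex)  # definitely 0

# The Hadamard gate creates an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ state
print(np.abs(superposed) ** 2)  # [0.5 0.5]: looking now yields a random bit

# Applying H again sends the two paths to "1" through amplitudes +1/2 and
# -1/2, which cancel (destructive interference), while the two paths to
# "0" reinforce each other (constructive interference).
recombined = H @ superposed
print(np.abs(recombined) ** 2)  # [1. 0.]: the qubit is definitely 0 again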

Twenty-seven years ago, Shor showed how to do all this for the problem of factoring integers, which breaks the widely used cryptographic codes underlying much of online commerce. We now know how to do it for some other problems, too, but only by exploiting the special mathematical structures in those problems. It's not just a matter of trying all possible answers at once.

Compounding the difficulty is that, if you want to talk honestly about quantum computing, then you also need the conceptual vocabulary of theoretical computer science. I'm often asked how many times faster a quantum computer will be than today's computers. A million times? A billion?

This question misses the point of quantum computers, which is to achieve better scaling behavior, or running time as a function of n, the number of bits of input data. This could mean taking a problem where the best classical algorithm needs a number of steps that grows exponentially with n, and solving it using a number of steps that grows only as n². In such cases, for small n, solving the problem with a quantum computer will actually be slower and more expensive than solving it classically. It's only as n grows that the quantum speedup first appears and then eventually comes to dominate.
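
As a toy illustration of what better scaling means (my numbers, not the author's): suppose the classical algorithm needs 2^n steps, the quantum one needs n² steps, and, pessimistically, each quantum step costs 1,000 times as much as a classical one.

# Hypothetical step counts: exponential classical vs. quadratic quantum,
# with an assumed 1,000x per-step overhead for the quantum machine.
def classical_steps(n: int) -> int:
    return 2 ** n

def quantum_cost(n: int) -> int:
    return 1_000 * n ** 2

for n in (10, 20, 30, 40, 50):
    print(n, classical_steps(n), quantum_cost(n))
# At n = 10 the quantum machine is slower (100,000 vs. 1,024); the
# crossover comes between n = 10 and n = 20, and by n = 50 the classical
# count (~10^15) dwarfs the quantum cost (~2.5 million).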

But how can we know that there's no classical shortcut, a conventional algorithm that would have similar scaling behavior to the quantum algorithm's? Though typically ignored in popular accounts, this question is central to quantum algorithms research, where often the difficulty is not so much proving that a quantum computer can do something quickly, but convincingly arguing that a classical computer can't. Alas, it turns out to be staggeringly hard to prove that problems are hard, as illustrated by the famous P versus NP problem (which asks, roughly, whether every problem with quickly checkable solutions can also be quickly solved). This is not just an academic issue, a matter of dotting i's: Over the past few decades, conjectured quantum speedups have repeatedly gone away when classical algorithms were found with similar performance.

Note that, after explaining all this, I still haven't said a word about the practical difficulty of building quantum computers. The problem, in a word, is decoherence, which means unwanted interaction between a quantum computer and its environment: nearby electric fields, warm objects, and other things that can record information about the qubits. This can result in premature "measurement" of the qubits, which collapses them down to classical bits that are either definitely 0 or definitely 1. The only known solution to this problem is quantum error correction: a scheme, proposed in the mid-1990s, that cleverly encodes each qubit of the quantum computation into the collective state of dozens or even thousands of physical qubits. But researchers are only now starting to make such error correction work in the real world, and actually putting it to use will take much longer. When you read about the latest experiment with 50 or 60 physical qubits, it's important to understand that the qubits aren't error-corrected. Until they are, we don't expect to be able to scale beyond a few hundred qubits.
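
The idea of protecting one logical bit with many physical ones has a simple classical analogue, the repetition code. The sketch below is an analogy only (real quantum codes must also correct phase errors and cannot copy unknown states), but it shows majority voting suppressing a 5% bit-flip rate to well under 1%:

import random

def encode(bit, copies=3):
    # Store the same logical bit in several physical locations.
    return [bit] * copies

def apply_noise(bits, p, rng):
    # Flip each physical bit independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if fewer than half flipped.
    return int(sum(bits) > len(bits) / 2)

rng = random.Random(42)
p, trials = 0.05, 100_000
raw = sum(rng.random() < p for _ in range(trials)) / trials
coded = sum(decode(apply_noise(encode(0), p, rng)) for _ in range(trials)) / trials
print(raw, coded)  # roughly 0.05 vs. 0.007 (theory: 3p^2(1-p) + p^3)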

Once someone understands these concepts, I'd say they're ready to start reading, or possibly even writing, an article on the latest claimed advance in quantum computing. They'll know which questions to ask in the constant struggle to distinguish reality from hype. Understanding this stuff really is possible; after all, it isn't rocket science, it's just quantum computing!

View original post here:

Why Is Quantum Computing So Hard to Explain? - Quanta Magazine

Honeywell Takes Quantum Leap. The Apple of Quantum Computing Is Here. – Barron’s


Honeywell International and Cambridge Quantum Computing are merging their fledgling quantum-computing businesses into a stand-alone company, signaling that quantum computing is just about ready for prime time.

The deal, essentially, combines Honeywell's (ticker: HON) quantum hardware expertise with privately held Cambridge's software and algorithms. It is as if the two had formed the Apple (AAPL) of the quantum computing world, in that Apple makes hardware, operating systems, and software applications.

"This is an inflection point company that will drive the future of quantum computing," said Tony Uttley, currently the president of Honeywell's quantum business. He will be president of the new company.

Honeywell says quantum computing can be a trillion-dollar-a-year industry some day, just like smartphones, although for now, the smartphone market is some 2,000 times bigger. Moving now, at the point before the gap begins to close, could be a win.

"We are at an [industry] phase where people are looking to hear more about practical quantum use cases and investors want to know if this is investible," said Daniel Newman, founder of Futurum, a research and advisory firm focused on digital innovation and market-disrupting technologies.

This deal will speed the process of investor education. The new business is targeting $1 billion in annual revenue in the next two to four years. "We'd be disappointed if we were only at a billion in a few years," said Ilyas Khan, Cambridge's CEO and founder. He will be CEO of the new company, which he said will decide whether to pursue an initial public offering by the end of the year.

A name for the business has yet to be chosen.

The new company plans to have commercial products as soon as late 2021. The initial offerings will be in web security, with products such as unhackable passwords. Down the road, there are commercial applications in chemicals and drug development.

In terms of sheer brainpower, the new enterprise is impressive. It will have about 350 employees, including 200 scientists, 120 of them with doctorate degrees.

The company will start off with a cash injection of about $300 million from Honeywell. The industrial giant will own about 54% of the new company for contributing its cash and technology.

Honeywell stock isn't reacting to the news. Quantum computing is still too small to move the needle for a $160 billion conglomerate. Shares were down slightly in early Tuesday trading, similar to moves in the S&P 500 and Dow Jones Industrial Average.

Year to date, Honeywell stock has gained 7%.

Write to Al Root at allen.root@dowjones.com

Go here to read the rest:

Honeywell Takes Quantum Leap. The Apple of Quantum Computing Is Here. - Barron's

BBVA and Zapata Computing Release Study Showing the Potential to Speed Up Monte Carlo Calculations for – GlobeNewswire

The research proposes novel circuit designs that significantly reduce the resources needed to gain a quantum advantage in derivative pricing calculations

BOSTON, June 09, 2021 (GLOBE NEWSWIRE) -- Zapata Computing, a leading enterprise software company for quantum-classical applications, today announced the results of a research project conducted with the global bank BBVA. The project's aim was to identify challenges and opportunities for quantum algorithms to speed up Monte Carlo simulations in finance. Monte Carlo simulations are commonly used for credit valuation adjustment (CVA) and derivative pricing. The research proposes novel circuit designs that significantly reduce the resources needed to gain a practical quantum advantage in derivative calculations, taking years off the projected timeline for the day when financial institutions can generate real value from quantum computers.

Fueled by regulatory pressure to minimize systemic financial risk since the global financial crisis of 2008, banks and other financial institutions have been increasingly focused on accounting for credit risk in derivative pricing. In the US, similar regulation exists to stress-test financial scenarios for Comprehensive Capital Analysis and Review (CCAR) and Dodd-Frank compliance. Monte Carlo simulation is the standard approach for this type of risk analysis, but the calculations required, which must account for all possible credit default scenarios, are immensely complex and prohibitively time-consuming for classical computers. Zapata and BBVA's research reveals practical ways for quantum algorithms to speed up the Monte Carlo simulation process.
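
The release doesn't spell out the computation, but the classical baseline being accelerated looks like the standard Monte Carlo estimator sketched below, shown here for a European call option under Black-Scholes dynamics (illustrative parameters of my choosing, not BBVA's models). Its statistical error shrinks as 1/sqrt(N) in the number of simulated paths; quantum amplitude estimation promises roughly 1/N, which is the source of the claimed speedup.

import numpy as np

def mc_european_call(s0, strike, rate, vol, maturity, n_paths=100_000, seed=0):
    # Classical Monte Carlo price of a European call under geometric
    # Brownian motion; the error bar shrinks like 1/sqrt(n_paths).
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    terminal = s0 * np.exp((rate - 0.5 * vol ** 2) * maturity
                           + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(terminal - strike, 0.0)
    discount = np.exp(-rate * maturity)
    price = discount * payoff.mean()
    stderr = discount * payoff.std(ddof=1) / np.sqrt(n_paths)
    return price, stderr

price, err = mc_european_call(s0=100, strike=105, rate=0.01, vol=0.2, maturity=1.0)
print(f"price ~ {price:.3f} +/- {err:.3f}")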

"Our innovative approach to quantum-accelerated Monte Carlo methods uses a novel form of amplitude estimation, combined with additional improvements that make the quantum circuit much shallower, in some cases hundreds of times shallower than the well-known alternatives in the literature," said Yudong Cao, CTO and founder of Zapata Computing. "This approach reduces the time needed for a quantum computer to complete the CVA calculation by orders of magnitude, and also dramatically reduces the number of qubits needed to gain a quantum advantage over classical methods." Zapata highlights that, in its enterprise customer collaborations, it performs in-depth studies of how much quantum computing resource will be required to obtain practical benefit for business operations. This type of in-depth research can directly inform the hardware specifications needed for quantum advantage in specific use cases.

"Improving the performance of these calculations in realistic settings will have a direct impact on the technological resources and costs required for financial risk management," said Andrea Cadarso, BBVA Mexico's Team Lead for Quantitative & Business Solutions. "The implications of this research are not limited to CVA calculations. We intend to extend our approach to other applications in quantitative finance, where Monte Carlo simulations are widely used for everything from policy making and risk assessment to financial product pricing calculations."

The BBVA-Zapata Computing joint publication is the result of one in a series of research initiatives that BBVA Research & Patents launched in 2019. These projects, conducted in partnership with leading institutions and companies including the Spanish National Research Council, Multiverse, Fujitsu and Accenture, explore the potential advantages of applying quantum computing in the financial sector.

Escolástico Sánchez, leader of the Research & Patents discipline at BBVA, emphasized BBVA's intention to continue exploring this cutting-edge technology: "BBVA is fully committed to its work in the quantum area. The bank has assembled a quantum team and is getting professionals from different areas involved in the development of a set of quantum solutions that meet the bank's needs."

About Zapata Computing

Zapata Computing, Inc. builds quantum-ready applications for enterprise deployment using our flagship product Orquestra. Zapata has pioneered a new quantum-classical development and deployment paradigm that focuses on a range of use cases, including ML, optimization and simulation. Orquestra integrates best-in-class quantum and classical technologies including Zapata's leading-edge algorithms, open-source libraries in Python, and more. Zapata partners closely with hardware providers across the quantum ecosystem such as Amazon, Google, Honeywell, IBM, IonQ, Microsoft and Rigetti. Investors in Zapata include Comcast Ventures, BASF Venture Capital, Honeywell Ventures, Itochu Corporation, Merck Global Health and Robert Bosch Venture Capital.

Media Contact:
Anya Nelson
Scratch Marketing + Media for Zapata Computing
anyan@scratchmm.com
617.817.6559

Read the rest here:

BBVA and Zapata Computing Release Study Showing the Potential to Speed Up Monte Carlo Calculations for - GlobeNewswire

IBM partners with U.K. on $300M quantum computing research initiative – VentureBeat


The U.K. government and IBM this week announced a five-year £210 million ($297.5 million) artificial intelligence (AI) and quantum computing collaboration, in the hopes of making new discoveries and developing sustainable technologies in fields ranging from life sciences to manufacturing.

The program will hire 60 scientists, as well as bringing in interns and students, to work under the auspices of IBM Research and the U.K.'s Science and Technology Facilities Council (STFC) at the Hartree Centre in Daresbury, Cheshire. The newly formed Hartree National Centre for Digital Innovation (HNCDI) will apply AI, high performance computing (HPC) and data analytics, quantum computing, and cloud technologies to advance research in areas like materials development and environmental sustainability, IBM said in a statement.

"Artificial intelligence and quantum computing have the potential to revolutionize everything from the way we travel to the way we shop. They are exactly the kind of fields I want the U.K. to be leading in," U.K. Science Minister Amanda Solloway said.

The Hartree Centre was opened in 2012 by UK Research and Innovation's STFC as an HPC, data analytics, and AI research facility. It's housed within Sci-Tech Daresbury's laboratory for research in accelerator science, biomedicine, physics, chemistry, materials, engineering, computational science, and more.

The program is part of IBM's Discovery Accelerator initiative to accelerate discovery and innovation based on a convergence of advanced technologies at research centers like HNCDI, the company said. This will be IBM's first Discovery Accelerator research center in Europe.

As part of the HNCDI program, the STFC Hartree Center is joining over 150 global organizations, ranging from Fortune 500 companies to startups, with an IBM Hybrid Cloud-accessible connection to the IBM Quantum Network. The Quantum Network is the computing giant's assembly of premium quantum computers and development tools. IBM will also provide access to its commercial and experimental AI products and tools for work in areas like material design, scaling and automation, supply chain logistics, and trusted AI applications, the company said.

IBM has been busy inking Discovery Accelerator deals with partners this year. The company last month made a $200 million investment in a 10-year joint project with the Grainger College of Engineering at the University of Illinois Urbana-Champaign (UIUC). As with the HNCDI in the U.K., the planned IBM-Illinois Discovery Accelerator Institute at UIUC will build out new research facilities and hire faculty and technicians.

Earlier this year, IBM announced a 10-year quantum computing collaboration with the Cleveland Clinic to build the computational foundation of the future Cleveland Clinic Global Center for Pathogen Research & Human Health. That project will see the installation of the first U.S.-based on-premises, private sector IBM Quantum System One, the company said. In the coming years, IBM also plans to install one of its first next-generation 1,000+ qubit quantum systems at another Cleveland client site.

The pandemic added urgency to the task of harnessing quantum computing, AI, and other cutting-edge technologies to help solve medicine's most pressing problems, IBM chair and CEO Arvind Krishna said in March at the time of the Cleveland Clinic announcement.

"The COVID-19 pandemic has spawned one of the greatest races in the history of scientific discovery, one that demands unprecedented agility and speed," Krishna said in a statement.

"At the same time, science is experiencing a change of its own with high-performance computing, hybrid cloud, data, AI, and quantum computing being used in new ways to break through long-standing bottlenecks in scientific discovery. Our new collaboration with Cleveland Clinic will combine their world-renowned expertise in health care and life sciences with IBM's next-generation technologies to make scientific discovery faster and the scope of that discovery larger than ever," Krishna said.

Continued here:

IBM partners with U.K. on $300M quantum computing research initiative - VentureBeat

Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Sweden's Chalmers University of Technology has achieved a quantum computing efficiency breakthrough with a novel type of thermometer that is capable of simplifying and rapidly measuring temperatures during quantum calculations.

The discovery adds a more advanced benchmarking tool that will accelerate Chalmers' work in quantum computing development.

The novel thermometer is the latest innovation to emerge from the university's research to develop an advanced quantum computer. The so-called OpenSuperQ project at Chalmers is coordinated with technology research organisation the Wallenberg Centre for Quantum Technology (WACQT), which is the OpenSuperQ project's main technology partner.

WACQT has set the goal of building a quantum computer capable of performing precise calculations by 2030. The technical requirements behind this ambitious target are based on superconducting circuits and developing a quantum computer with at least 100 well-functioning qubits. To realise this ambition, the OpenSuperQ project will require a processor working temperature close to absolute zero, ideally as low as 10 millikelvin (-273.14°C).

Headquartered at Chalmers University's research hub in Gothenburg, the OpenSuperQ project, launched in 2018, is intended to run until 2027. Working alongside the university in Gothenburg, WACQT is also operating support projects being run at the Royal Institute of Technology (Kungliga Tekniska Högskolan) in Stockholm and collaborating universities in Lund, Stockholm, Linköping and Gothenburg.

Pledged capital funding for the WACQT-managed OpenSuperQ project, committed by the Knut and Alice Wallenberg Foundation together with 20 other private corporations in Sweden, currently amounts to SEK1.3bn (€128m). In March, the foundation scaled up its funding commitment to WACQT, doubling its annual budget to SEK80m over the next four years.

The increased funding by the foundation will lead to the expansion of WACQT's QC research team, and the organisation is looking to recruit a further 40 researchers for the OpenSuperQ project in 2021-2022. A new team is to be established to study nanophotonic devices, which can enable the interconnection of several smaller quantum processors into a large quantum computer.

The Wallenberg sphere incorporates 16 public and private foundations operated by various family members. Each year, these foundations allocate about SEK2.5bn to research projects in the fields of technology, natural sciences and medicine in Sweden.

"The OpenSuperQ project aims to take Sweden to the forefront of quantum technologies, including computing, sensing, communications and simulation," said Peter Wallenberg, chairman of the Knut and Alice Wallenberg Foundation.

"Quantum technology has enormous potential, so it is vital that Sweden has the necessary expertise in this area. WACQT has built up a qualified research environment and established collaborations with Swedish industry. It has succeeded in developing qubits with proven problem-solving ability. We can move ahead with great confidence in what WACQT will go on to achieve."

"The novel thermometer breakthrough opens the door to experiments in the dynamic field of quantum thermodynamics," said Simone Gasparinetti, assistant professor at Chalmers' quantum technology laboratory.

"Our thermometer is a superconducting circuit and directly connected to the end of the waveguide being measured," said Gasparinetti. "It is relatively simple and probably the world's fastest and most sensitive thermometer for this particular purpose at the millikelvin scale."

Coaxial cables and waveguides, the structures that guide waveforms and serve as the critical connection to the quantum processor, remain key components in quantum computers. The microwave pulses that travel down the waveguides to the quantum processor are cooled to extremely low temperatures along the way.

For researchers, a fundamental goal is to ensure that these waveguides are not carrying noise, due to the thermal motion of electrons, on top of the pulses that they send. Precise temperature measurement readings of the electromagnetic fields are needed at the cold end of the microwave waveguides, the point where the controlling pulses are delivered to the computer's qubits.

Working at the lowest possible temperature minimises the risk of introducing errors in the qubits. Until now, researchers have only been able to measure this temperature indirectly, and with relatively long delays. Chalmers University's novel thermometer enables very low temperatures to be measured directly at the receiving end of the waveguide with elevated accuracy and with extremely high time resolution.

"The novel thermometer developed at the university provides researchers with a value-added tool to measure the efficiency of systems while identifying possible shortcomings," said Per Delsing, a professor at the department of microtechnology and nanoscience at Chalmers and director of WACQT.

"A certain temperature corresponds to a given number of thermal photons, and that number decreases exponentially with temperature," he said. "If we succeed in lowering the temperature at the end where the waveguide meets the qubit to 10 millikelvin, the risk of errors in our qubits is reduced drastically."
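
Delsing's remark corresponds to the Bose-Einstein occupation of a microwave mode, n = 1/(exp(hf/kT) - 1). A quick sketch (the 5 GHz frequency is an assumed, typical value for superconducting qubit control lines, not a figure from the article) shows how sharply the thermal photon number falls as the waveguide is cooled toward 10 millikelvin:

import math

h = 6.62607015e-34   # Planck constant, J*s
kB = 1.380649e-23    # Boltzmann constant, J/K

def thermal_photons(freq_hz, temp_k):
    # Mean photon number of a thermal mode (Bose-Einstein occupation).
    return 1.0 / math.expm1(h * freq_hz / (kB * temp_k))

for t_mk in (100, 50, 20, 10):
    n = thermal_photons(5e9, t_mk * 1e-3)  # assumed 5 GHz mode
    print(f"{t_mk:>3} mK: {n:.2e} thermal photons")
# Output falls from ~1e-1 photons at 100 mK to ~4e-11 at 10 mK.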

The university's primary role in the OpenSuperQ project is to lead the work on developing the application algorithms that will be executed on the OpenSuperQ quantum computer. It will also support the development of algorithms for quantum chemistry, optimisation and machine learning.

Also, Chalmers will head up efforts to improve quantum coherence in chips with multiple coupled qubits, including device design, process development, fabrication, packaging and testing. It will also conduct research to evaluate the performance of 2-qubit gates and develop advanced qubit control methods to mitigate systematic and incoherent errors to achieve targeted gate fidelities.

Continued here:

Swedish university is behind quantum computing breakthrough - ComputerWeekly.com

Quantum Computing With Holes: A New and Promising Qubit at a Place Where There Is Nothing – SciTechDaily

The two holes are confined to the germanium-rich layer just a few nanometers thick. On top, the electrical gates are formed by individual wires with voltages applied. The positively charged holes feel the push and pull from the wires and can therefore be moved around within their layer. Credit: Daniel Jirovec

Quantum computers, with their promises of creating new materials and solving intractable mathematical problems, are a dream of many physicists. Now, they are slowly approaching viable realizations in many laboratories all over the world. But there are still enormous challenges to master. A central one is the construction of stable quantum bits, the fundamental unit of quantum computation, called qubit for short, that can be networked together.

In a study published in Nature Materials and led by Daniel Jirovec from the Katsaros group at IST Austria in close collaboration with researchers from the L-NESS Inter-university Centre in Como, Italy, scientists now have created a new and promising candidate system for reliable qubits.

The researchers created the qubit using the spin of so-called holes. Each hole is just the absence of an electron in a solid material. Amazingly, a missing negatively charged particle can physically be treated as if it were a positively charged particle. It can even move around in the solid when a neighboring electron fills the hole. Thus, effectively, the hole, described as a positively charged particle, moves forward.


These holes even carry the quantum-mechanical property of spin and can interact if they come close to each other. "Our colleagues at L-NESS layered several different mixtures of silicon and germanium, just a few nanometers thick, on top of each other. That allows us to confine the holes to the germanium-rich layer in the middle," Jirovec explains. "On top, we added tiny electrical wires, so-called gates, to control the movement of holes by applying voltage to them. The electrically positively charged holes react to the voltage and can be extremely precisely moved around within their layer."

Using this nano-scale control, the scientists moved two holes close to each other to create a qubit out of their interacting spins. But to make this work, they needed to apply a magnetic field to the whole setup. Here, their innovative approach comes into play.

In their setup, Jirovec and his colleagues cannot only move holes around but also alter their properties. By engineering different hole properties, they created the qubit out of the two interacting hole spins using less than ten millitesla of magnetic field strength. This is a weak magnetic field compared to other similar qubit setups, which employ at least ten times stronger fields.

But why is that relevant? "By using our layered germanium setup we can reduce the required magnetic field strength and therefore allow the combination of our qubit with superconductors, usually inhibited by strong magnetic fields," Jirovec says. Superconductors, materials without any electrical resistance, support the linking of several qubits due to their quantum-mechanical nature. This could enable scientists to build new kinds of quantum computers combining semiconductors and superconductors.

In addition to the new technical possibilities, these hole spin qubits look promising because of their processing speed. With up to one hundred million operations per second, as well as their long lifetime of up to 150 microseconds, they seem particularly viable for quantum computing. Usually, there is a tradeoff between these properties, but this new design brings both advantages together.

Reference: "A singlet-triplet hole spin qubit in planar Ge" by Daniel Jirovec, Andrea Hofmann, Andrea Ballabio, Philipp M. Mutter, Giulio Tavani, Marc Botifoll, Alessandro Crippa, Josip Kukucka, Oliver Sagi, Frederico Martins, Jaime Saez-Mollejo, Ivan Prieto, Maksim Borovkov, Jordi Arbiol, Daniel Chrastina, Giovanni Isella and Georgios Katsaros, 3 June 2021, Nature Materials. DOI: 10.1038/s41563-021-01022-2

Funding: Scientific Service Units of IST Austria, MIBA Machine Shop and the nanofabrication facility, NOMIS Foundation

Read more:

Quantum Computing With Holes: A New and Promising Qubit at a Place Where There Is Nothing - SciTechDaily

The ‘second quantum revolution’ is almost here. We need to make sure it benefits the many, not the few – The Conversation AU

Over the past six years, quantum science has noticeably shifted, from the domain of physicists concerned with learning about the universe on extremely small scales, to a source of new technologies we all might use for practical purposes. These technologies make use of quantum properties of single atoms or particles of light. They include sensors, communication networks, and computers.

Quantum technologies are expected to impact many aspects of our society, including health care, financial services, defence, weather modelling, and cyber security. Clearly, they promise exciting benefits. Yet the history of technology development shows we cannot simply assume new tools and systems will automatically be in the public interest.

We must look ahead to what a quantum society might entail and how the quantum design choices made today might impact how we live in the near future. The deployment of artificial intelligence and machine learning over the past few years provides a compelling example of why this is necessary.

Let's consider an example. Quantum computers are perhaps the best-known quantum technology, with companies like Google and IBM competing to achieve quantum computation. The advantage of quantum computers lies in their ability to tackle incredibly complex tasks that would take a normal computer millions of years. One such task is simulating molecules' behaviour to improve predictions about the properties of prospective new drugs and accelerate their development.

One conundrum posed by quantum computing is the sheer expense of investing in the physical infrastructure of the technology. This means ownership will likely be concentrated among the wealthiest countries and corporations. In turn, this could worsen uneven power distribution enabled by technology.

Other considerations for this particular type of quantum technology include concerns about reduced online privacy.

How do we stop ourselves blundering into a quantum age without due forethought? How do we tackle the societal problems posed by quantum technologies, while nations and companies race to develop them?

Last year, CSIRO released a roadmap that included a call for quantum stakeholders to explore and address social risks. An example of how we might proceed with this has begun at the World Economic Forum (WEF). The WEF is convening experts from industry, policy-making, and research to promote safe and secure quantum technologies by establishing an agreed set of ethical principles for quantum computing.

Australia should draw on such initiatives to ensure the quantum technologies we develop work for the public good. We need to diversify the people involved in quantum technologies, in terms of the types of expertise employed and the social contexts we work from, so we don't reproduce and amplify existing problems or create new ones.

Read more: Scientists want to build trust in science and technology. The alternative is too risky to contemplate

While we work to shape the impacts of individual quantum technologies, we should also review the language used to describe this 'second quantum revolution'.

The rationale most commonly used to advocate for the field narrowly imagines public benefit of quantum technologies in terms of economic gain and competition between nations and corporations. But framing this as a race to develop quantum technologies means prioritising urgency, commercial interests and national security at the expense of more civic-minded concerns.

It's still early enough to do something about the challenges posed by quantum technologies. It's also not all doom and gloom, with a variety of initiatives and national research and development policies setting out to tackle these problems before they are set in stone.

We need discussions involving a cross-section of society on the potential impacts of quantum technologies on society. This process should clarify societal expectations for the emerging quantum technology sector and inform any national quantum initiative in Australia.

Read more: Why are scientists so excited about a recently claimed quantum computing milestone?

View post:

The 'second quantum revolution' is almost here. We need to make sure it benefits the many, not the few - The Conversation AU

Global Quantum Computing Market to Gain $667.3 Million and Surge at a CAGR of 30.0% from 2020-2027 Timeframe – Exclusive [193 pages] COVID-19 Impact…

The global quantum computing industry is projected to surge from 2020 to 2027 due to the rise in the number of cyber-attacks across the world. Consulting solutions sub-segment is estimated to be the most profitable. The European market is estimated to be the most dominating during the forecasted period.

New York, USA, June 07, 2021 (GLOBE NEWSWIRE) -- According to a recent report by Research Dive, the global quantum computing market is expected to exceed $667.3 million by the end of 2027, rising from a market size of $88.2 million in 2019, at a growth rate of 30.0% over the 2020-2027 forecast period. The report highlights the impact of the coronavirus upheaval on the market, its major drivers and hindrances, and its regional outlook. The research methodology used in the report is a combination of primary and secondary research methods.


Covid-19 Outbreak Impact on the Global Market

The quantum computing market is anticipated to experience a positive impact globally during the coronavirus crisis. The reason for market growth is that quantum technology offers augmented computing performance that can shift the dynamics of quantum chemistry. Further, quantum technology provides exponential speed-ups for optimization and other vital calculations. These facets are predicted to govern market growth during the coronavirus emergency.


Aspects Impacting the Market

The global quantum computing market is projected to witness progressive growth due to the rise in cyber-attack cases. Quantum technology assures security for software systems and applications and protects vital organizational data from attacks such as ransomware, phishing and worms. Furthermore, key companies in the market are planning strategic frameworks that utilize quantum computers for cyber-security. These aspects are anticipated to drive market growth during the forecast timeframe. However, a lack of awareness of quantum technology and a shortage of skilled employees are expected to hinder growth. On the other hand, the ability of quantum technology to help farmers improve the yield and efficiency of crops is projected to create promising opportunities for market growth.



Consulting Solutions Sub-Segment to be the Most Profitable

From the offerings segment, the consulting solutions sub-segment is anticipated to reach new heights during the forecast timeframe. The sub-segment is expected to register revenue of $354.0 million by the end of 2027. This upsurge is due to the usage of quantum computing in applications such as drug discovery, chemical formulation, materials science, and automotive engineering. Apart from this, it is also used in the chemical industry, aerospace & defense, healthcare, and energy & power sectors. These wide-scale applications are expected to bolster the sub-segment's growth during the forecast years.


Machine Learning Sub-Segment to Gain Maximum Revenue

From the application segment, the machine learning sub-segment is projected to achieve maximum revenue during the forecast timeframe. The sub-segment is anticipated to cross $236.9 million by the end of 2027, rising from $29.7 million in 2019. The ability of quantum computing to accelerate machine learning tasks such as optimization, deep learning, kernel evaluation, and linear algebra is expected to propel the sub-segment's growth during the analysed timeframe.

Finance & Banking Sub-Segment to Witness Rapid Growth

From the end-user segment, the finance & banking sub-segment is speculated to grow rapidly and register revenue of $159.2 million by 2027. The sub-segment's growth is due to the usage of quantum technology in banking to support high-frequency trading.

Regional Outlook

The European market held an estimated $28.2 million in 2019 and is speculated to garner revenue of $221.2 million by the end of 2027. The market growth is mainly attributed to the extensive use of quantum computing in fields such as chemicals, healthcare, pharmaceuticals, and utilities. Moreover, its usage in cryptography, novel drugs, defense, and cybersecurity is predicted to drive the global market during the estimated timeframe.

Major Key Players

QC Ware Corp., Cambridge Quantum Computing Limited, D-Wave Systems Inc., International Business Machines Corporation, Rigetti Computing, 1QB Information Technologies, River Lane Research, StationQ (Microsoft), Anyon and Google Inc.

These leading players are pursuing varied strategies such as company acquisitions, product developments, tie-ups & collaborations, research & development, and organizational development to gain an upper edge in the market worldwide. For example, in April 2021, Nvidia, a computer systems design services company, revealed the cuQuantum SDK, a development platform for simulating quantum circuits on GPU-accelerated systems.

The report covers various facets of all the vital players operating in the market, such as financial performance, product portfolio, present strategic moves, major developments and SWOT analysis.


See the rest here:

Global Quantum Computing Market to Gain $667.3 Million and Surge at a CAGR of 30.0% from 2020-2027 Timeframe - Exclusive [193 pages] COVID-19 Impact...

UK govt and IBM together to build £210M AI & quantum computing centre in Daresbury – UKTN (UK Technology News)

Modern-day complex problems require power-packed technological solutions to revamp industrial growth. The UK government is stepping in to help industries gain maximum access to the latest technology and modernise, by establishing an AI and quantum computing centre in Daresbury, Cheshire.

The government will invest £172m over five years through UK Research and Innovation (UKRI), with a further investment of £38m from computing giant IBM. The centre is aimed at developing next-generation computers using AI and quantum computing technologies to help businesses become future-ready.

The Centre will be operated through a collaboration between IBM and the Science and Technology Facilities Council (STFC). The Hartree National Centre for Digital Innovation (HNCDI) programme will create 60 new jobs and exciting opportunities for students to witness complex problem solving through the application of technology.

Further, the centre will support the application of AI and quantum computing to tasks such as optimising complex logistics, power grid distribution, design and manufacturing, traffic management, warehouse management and product innovation.

HNCDI will work with different sectors, including materials, life sciences, environment and manufacturing. It will also engage in collaboration with academic and industrial research communities, startups as well as small and medium-sized enterprises (SMEs).

Ms Solloway, the science minister, said quantum computing and AI were not just far-fetched ideas but real technologies that are already transforming our lives. "Artificial intelligence and quantum computing have the potential to revolutionise everything from the way we travel to the way we shop. The building blocks of everyday products like your laptop or your phone are already products of quantum technology, harnessing the unique ways that light and matter behave at tiny atomic or subatomic levels."

Further, she added: "This fantastic new partnership with IBM will not only help businesses get ready for the future of computing but create 60 jobs in the region, boosting innovation and growing the economy as we build back better from the pandemic."

A spokesman for the Department for Business, Energy and Industrial Strategy said the centre's aim was to make cutting-edge technologies like AI and quantum computing more accessible to businesses and public sector organisations.

"As well as breaking down practical barriers to using new technologies, the team of experts will also provide training and support to make sure the UK is at the forefront of the next generation of computing," he added.

Prof Mark Thomson, STFC's executive chairman, said that by allowing industry to access a ready-made community of digital experts and cutting-edge technology, the centre will provide momentum for new ideas and solutions.

"This programme has the potential to transform the way UK industry engages with AI and digital technologies, to the benefit of not just research communities but all of society."

Mr Dario Gil, senior VP and director of IBM Research, said: "This partnership establishes our first Discovery Accelerator in Europe, driven by our two UK-based IBM Research locations in Hursley and Daresbury, as they contribute to our global mission of building discovery-driven communities around the world."

Read the rest here:

UK govt and IBM together to build £210M AI & quantum computing centre in Daresbury - UKTN (UK Technology News)

IBM and UK Government invest 210m in new AI and computing centre – Built Environment Networking

A new artificial intelligence and quantum computing centre has been launched in North West England, thanks to a £210 million investment from the Government and IBM to help cement the UK's status as a science superpower.

The Hartree National Centre for Digital Innovation (HNCDI), based at the Science and Technology Facilities Council's (STFC) Daresbury Laboratory in the Liverpool City Region, will create vacancies for an additional 60 scientists and opportunities for students to gain invaluable hands-on experience.

The centre, a partnership between STFC and IBM, will bring together world-leading expertise in artificial intelligence (AI) and quantum computing to support the application of the cutting-edge technologies in industry and the public sector.

Possible industry applications of quantum computing include optimising complex logistics such as picking and packing orders in large warehouses for supermarkets; traffic routing; energy distribution; improving design and manufacturing processes across automotive sectors.

The government will invest £172 million over five years through UK Research and Innovation (UKRI), with an additional £38 million being invested by IBM. £28 million of the government's investment will be in the first year.

Science Minister Amanda Solloway:

Artificial intelligence and quantum computing have the potential to revolutionise everything from the way we travel to the way we shop.

This fantastic new partnership with IBM will not only help businesses get ready for the future of computing, but create 60 jobs in the region, boosting innovation and growing the economy as we build back better from the pandemic.

The HNCDI will make cutting-edge technologies like AI and quantum computing more accessible to businesses and public sector organisations.

As well as breaking down practical barriers to using new technologies, for example by providing access to equipment and infrastructure, the team of experts at HNCDI will also provide training and support to make sure the UK is at the forefront of the next generation of computing.

Dario Gil, Senior Vice President and Director, IBM Research:

The world is facing grand challenges which demand a different approach towards science in computing, including AI and quantum computing, to engage a broad community across industry, government, and academia to accelerate discovery in science and business.

This partnership establishes our first Discovery Accelerator in Europe, driven by our two UK-based IBM Research locations in Hursley and Daresbury, as they contribute to our global mission of building discovery-driven communities around the world.

The technologies that have transformed our lives (the building blocks of modern computers, the mobile phone, the laser, the MRI scanner) are all products of quantum science. This involves harnessing the unique ways that light and matter behave at tiny atomic or subatomic levels.

A new generation of quantum technologies exploit breakthroughs in the way that we are able to precisely manipulate and measure these special properties, to engineer quantum devices like sensors and computers with dramatically enhanced functionality and performance.

The centre will work across sectors including materials, life sciences, environment and manufacturing. This will include collaboration with academic and industrial research communities, including start-ups and SMEs, public sector, and government.

Professor Mark Thomson, Executive Chair of STFC:

The HNCDI programme will foster discovery and provide a stimulus for industry innovation in the UK.

By allowing industry to access a ready-made community of digital experts and cutting-edge technology, it will provide momentum for new ideas and solutions.

This programme has the potential to transform the way UK industry engages with AI and digital technologies, to the benefit of not just research communities but all of society.

Read more:

IBM and UK Government invest 210m in new AI and computing centre - Built Environment Networking

Quantum Computing & Technologies Market Share at a CAGR of 32.5%, Trends, Growth, Sales, Demand, Revenue, Size, Forecast and COVID-19 Impacts…

The Quantum Computing & Technologies Market research report is prepared using a robust research methodology, including Porter's Five Forces analysis to map the complex matrix of the market. The report covers comprehensive data on emerging trends, market drivers, growth opportunities, and restraints that can change the market dynamics. It provides an in-depth analysis of the market segments, which include products, applications, and end-user applications. This report also includes a complete analysis of industry players, covering their latest developments, product portfolio, pricing, mergers, acquisitions, and collaborations. Moreover, it outlines the crucial strategies that are helping them to expand their market share.

The Quantum Computing & Technologies Market is expected to grow at a CAGR of 32.5% over the forecast period.
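As a quick sanity check on what a 32.5% CAGR implies, the short Python sketch below compounds a market-size figure forward year by year. The base-year value is purely an assumption for illustration; the report's actual dollar figures are not quoted here.

def project_market_size(base_value: float, cagr: float, years: int) -> float:
    """Compound base_value forward by the given number of years at the CAGR."""
    return base_value * (1.0 + cagr) ** years

base = 500.0  # ASSUMED base-year market size in USD million (illustrative only)
for year in range(6):
    size = project_market_size(base, 0.325, year)
    print(f"Year {year}: {size:,.1f} USD million")

At that rate the market roughly quadruples in five years, which is the practical meaning of the headline growth figure.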

Significant Players of this Global Quantum Computing & Technologies Market

The global Quantum Computing & Technologies market report is segmented on the basis of technology, application, component, end-user industry, and regional & country level, as detailed in the segmentation analysis below.

Segmentation Analysis:

By Type of Technology: Blockchain, Adiabatic, Measurement-Based, Superconducting, Topological

By Applications: Cryptography, IoT/Big data/Artificial intelligence/ML, Teleportation, Simulation & Data Optimization, Others

By Component: Hardware, Software & Systems, Services

By End-User Industry: Aerospace and Defense, Healthcare, Manufacturing, IT & Telecommunications, Energy and Power, Others

Geographically, this report splits the global market into several key regions (North America, Europe, Asia-Pacific, Latin America and Middle East & Africa), reporting revenue (USD million) and focusing on key countries in each region. It also covers market drivers, restraints, opportunities, challenges, and key issues in the global Quantum Computing & Technologies market.

Download Free Exclusive Sample (200 Pages PDF) Report: To Know the Impact of COVID-19 on this Industry @ https://brandessenceresearch.com/requestSample/PostId/941?utm_source=MC&utm_medium=AR

Key Benefits for Quantum Computing & Technologies Market Reports

The analysis provides an exhaustive investigation of the global Quantum Computing & Technologies market together with future projections to assess investment feasibility. Furthermore, the report includes both quantitative and qualitative analyses of the market throughout the forecast period. The report also covers business opportunities and scope for expansion. Besides this, it provides insights into market threats or barriers and the impact of the regulatory framework to give an executive-level blueprint of the Quantum Computing & Technologies market.

Key Features of the Report:

Global Quantum Computing & Technologies Markets: Countries and Regions

North America, Asia-Pacific, UK, Europe, Central & South America, Middle East & Africa

Frequently asked questions (FAQs) about the report

Yes. The Quantum Computing & Technologies market report can be customized according to your needs. For instance, the company you ask for can be profiled, while the analysis can focus on a specific region/country that meets your interests. You can talk to our research analyst about your exact requirements and we will tailor the report accordingly.

Yes, the market report can be further segmented on the basis of data availability and feasibility. We can provide a further breakdown of product types and applications (if applicable) by size, volume, or revenue. The market segmentation section also includes the latest product developments and customer behavior insights to give an in-depth analysis of the market.

Yes. The market research report covers a detailed analysis of the COVID-19 impact on the market. Our research team has been monitoring the market closely while conducting interviews with industry experts to get better insights on the present and future implications of COVID-19 on the market.

The market report provides vital information on the strategies deployed by industry players during the COVID-19 crisis to maintain their position in the market. Along with this, it also shares crucial data on product developments prompted by the pandemic across the globe.

Table of Contents

1.1. Research Process
1.2. Primary Research
1.3. Secondary Research
1.4. Market Size Estimates
1.5. Data Triangulation
1.6. Forecast Model
1.7. USPs of Report
1.8. Report Description

2.1. Market Introduction
2.2. Executive Summary
2.3. Global Quantum Computing & Technologies Market Classification
2.4. Market Drivers
2.5. Market Restraints
2.6. Market Opportunity
2.7. Quantum Computing & Technologies Market: Trends
2.8. Porter's Five Forces Analysis
2.9. Market Attractiveness Analysis

Thanks for reading this article; you can also get individual chapter-wise sections or a region-wise report version, such as North America, Europe or Asia.

About Us: Brandessence Market Research publishes market research reports & business insights produced by highly qualified and experienced industry analysts. Brandessence Market Research reports are a best fit for senior executives, business development managers, marketing managers, consultants, CEOs, CIOs, COOs and directors, as well as governments, agencies, organizations and Ph.D. students. We have a delivery center in Pune, India, and our sales office is in London.

Contact Us:

Alan Ruffalo

Corporate Sales: +44-20380741

Email: sales@brandessenceresearch.com

Web: https://brandessenceresearch.com/

Trending Links:

https://www.prnewswire.com/news-releases/ai-enabled-kitchen-appliances-market-is-exhibited-to-grow-at-25-8-cagr-over-the-forecast-period-2021-2027brandessence-market-research-301246403.html

https://www.prnewswire.com/news-releases/rise-of-contactless-dining-anticipating-food-service-industry-says-brandessence-market-research-301198600.html

https://www.prnewswire.com/news-releases/at-cagr-of-6-71healthcare-consulting-industry-expected-to-cross-20-usd-billion-by-2025-says-brandessence-market-research-301200108.html

https://www.prnewswire.com/news-releases/at-6-15-cagrglobal-construction-glass-market-is-expected-to-reach-usd-66-83-billion-in-2026says-brandessence-market-research-301201873.html

Original post:

Quantum Computing & Technologies Market Share at a CAGR of 32.5 %, Trends, Growth, Sales, Demand, Revenue, Size, Forecast and COVID-19 Impacts...

STFC and IBM sign £210m AI and quantum computing deal – BusinessCloud

The Science and Technology Facilities Council has announced a £210 million deal with IBM to accelerate discovery and innovation with artificial intelligence and quantum computing.

Science Minister Amanda Solloway unveiled the five-year partnership, which will see the launch of the Hartree National Centre for Digital Innovation in the North West to support UK businesses and the public sector.

The aim is to break down practical barriers to innovation, such as access to infrastructure or digital skills gaps within organisations, in sectors such as materials development, life sciences, environmental sustainability and manufacturing.

By advancing the pace at which businesses can take advantage of new digital technologies, the collaboration is expected to enhance productivity, create new skilled jobs and boost regional and national economic growth.

An additional 60 new scientists, interns and students will join IBM Research and the Hartree Centre, based in Daresbury.

The research is part of IBM's global Discovery Accelerator initiative, which seeks to accelerate discovery and innovation based on a convergence of advanced technologies by establishing research centres, fostering and enabling collaborative communities, and advancing skills and economic growth in large-scale programs.

"Artificial intelligence and quantum computing have the potential to revolutionise everything from the way we travel to the way we shop," said Solloway.

"They are exactly the kind of fields I want the UK to be leading in, and this new centre in the North West is a big step towards that."

"Thanks to this fantastic new partnership with IBM, British businesses will have access to the kind of infrastructure and expertise that will help them boost innovation and grow the economy."

The HNCDI programme will support several industry projects to accelerate the adoption of advanced digital technologies with UK companies of various sizes.

"HNCDI will enable the UK to develop the skills, knowledge and technical capability required to adopt emerging digital technologies, seeding the UK with new ideas and innovative solutions," said Professor Mark Thomson, Executive Chair of STFC Hartree Centre.

"The programme has transformative potential to generate long-term GVA for the economy by embedding AI solutions across UK industry."

"We are applying knowledge from the UK's strong fundamental research base to develop tools and techniques that address identified industry and public sector needs, improving economic and societal outcomes."

View post:

STFC and IBM sign £210m AI and quantum computing deal - BusinessCloud

$100 Million to Advance Duke Science and Technology Research – Duke Today

The Duke Endowment of Charlotte, N.C., is supporting Duke University's efforts to expand its faculty in computation, materials science and the resilience of the body and brain by completing the second phase of a $100 million investment.

This is the largest award Duke University has ever received.

Advancing Science and Technology

Better designs to capture the full potential of carbon-neutral energy. Harnessing the brain's resilience to fight Alzheimer's disease. Developing cybersecurity tools to defend us from future threats. Read about these and other investments Duke is making in science and technology research and teaching.

The funds form the base of Duke Science and Technology, a faculty-hiring and fund-raising effort designed to elevate excellence in the sciences at Duke. They will be used to accelerate and expand the recruitment of new faculty in science, medicine, technology, engineering and mathematics. The funds will also expand core research strengths that allow Duke faculty to address difficult global challenges and prepare Duke students to be the leaders of the future.

"This extraordinary gift from The Duke Endowment advances our university's position as a destination for exceptional and visionary faculty in a competitive global market," said Duke President Vincent E. Price. "These scholars will accelerate discovery and collaborative research across our campus and around the world. Duke's next century will be one of unbounded intellectual curiosity in which uniquely talented and creative scientists come together in new ways to ask the most difficult questions and try to tackle the most critical challenges of our day."

The first $50 million of The Duke Endowment's historic commitment to support Duke Science and Technology was announced in 2019.

Minor Shaw, chair of the Endowment's Board of Trustees, said The Duke Endowment's founder, James B. Duke, was a visionary leader in business and philanthropy who seized opportunities to experiment and innovate. "Advancements in science and technology will transform our world," Shaw said. "By investing in the next generation of faculty at Duke, we can achieve a better future for us all."

The funding comes at a time when Duke is placing big bets on emerging technologies like quantum computing and addressing global challenges such as climate change and pandemic disease.

"The faculty we are able to recruit thanks to this investment from The Duke Endowment have enormous quality and potential," said Provost Sally Kornbluth, the university's chief academic officer. "We are confident that their work will result in increased impact, elevate Duke to new levels of scientific discovery and improve health outcomes for the citizens of North Carolina and beyond. We want to continue to build on this success."

In the two years since the university announced the first half of this $100 million award, the Duke Endowment's investment has been used to recruit and retain some of the country's leading scholar-scientists in a range of disciplines.

"At Duke, we are redefining what is possible in preventing and treating a range of health conditions from cancer, brain disorders and infectious diseases to behavioral health issues," said A. Eugene Washington, M.D., chancellor for health affairs and president and chief executive officer of the Duke University Health System. "This generous gift ensures that our exceptional research community will continue to thrive with the very best scientists who value collaboration and interdisciplinarity, and drive bold ideas."

Duke will continue a targeted effort to recruit scientist-scholars at all levels in its strategic areas. The hiring effort is expected to continue over the next few years.

--- --- ---

Based in Charlotte and established in 1924 by industrialist and philanthropist James B. Duke, The Duke Endowment is a private foundation that strengthens communities in North Carolina and South Carolina by nurturing children, promoting health, educating minds and enriching spirits. Since its founding, it has distributed more than $4 billion in grants. The Endowment shares a name with Duke University and Duke Energy, but all are separate organizations.

Read more:

$100 Million to Advance Duke Science and Technology Research - Duke Today

Bristol startup scores £3.1M to control next-gen quantum hacks threatening the future of internet – UKTN (UK Technology News)

The quantum computing industry has witnessed exponential growth in recent years. As a result, we are rapidly approaching the point when quantum computers will be able to crack all the existing encryption that protects our data, making the threat of quantum attacks on our communications increasingly urgent.

Bristol-based KETS Quantum Security is a quantum tech company passionate about solving real-world security issues by leveraging the advantages of quantum technologies.

Aiming to redefine the future of secure communications, the company has just bagged £3.1 million in funding to bring to market hardware that protects data from a new generation of cyberattacks that will use quantum computers. The round was co-led by Quantonation and Speedinvest, with participation from Mustard Seed MAZE.

The investment will be used to accelerate the development, production and delivery of its first products. It will also allow KETS to expand key first trials of the technology in real-world applications and environments that are already in development. To deliver all of this, KETS will continue building a world-leading team passionate about the company's technology and values. Furthermore, KETS will continue to expand into the global marketplace beyond its first international office, following its recent expansion into Paris.

"In today's world, we don't go 30 seconds without touching digital technology of some kind, all of which is networked, none of which is quantum-safe," said Dr Chris Erven, CEO and co-founder of KETS Quantum Security. "At KETS, we've made it our mission to protect the world's most valuable resource, information, from the threat of quantum computing. This investment will allow us to make quantum-safe communications solutions ubiquitous and easily integrated. Ultimately, KETS is building a world in which we can trust our digital connections as much as our personal ones."

Olivier Tonneau from Quantonation said, "KETS is reaching a key point in its story, with products that will now be available to deploy, bringing clients the world's first on-chip, quantum-secured solutions protecting against the future threat of quantum computers."

Rick Hao from Speedinvest said, "KETS is developing technology with a vision to solve some of the global cybersecurity challenges faced by the largest organisations by combining the power of quantum encryption technologies with the scalability and practicality of integrated, chip-based quantum photonics. Bristol is leading the world on building quantum technology hardware, and Speedinvest is excited to be backing great deep tech entrepreneurs here."

Current cybersecurity is threatened by powerful hardware, sophisticated algorithms and the emergence of quantum computing. KETS' on-chip Quantum Key Distribution offers a practical solution by optically distributing secure cryptographic keys. Secret random numbers are at the heart of cryptography; inferior generators can render communications insecure.
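For readers unfamiliar with quantum key distribution, the toy Python sketch below shows the sifting step of BB84, the classic QKD protocol. It is a minimal illustration only, under idealised assumptions (no eavesdropper, noiseless channel), and is not based on KETS' actual chip-based photonic implementation. Note the use of a cryptographically secure random source, echoing the point above about inferior generators.

import secrets

# Toy BB84 key sifting (illustrative only; real QKD runs over an optical
# channel and adds error estimation and privacy amplification).
n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# With no eavesdropper and no noise, Bob recovers Alice's bit whenever their
# measurement bases match; mismatched bases yield a random result.
bob_bits = [bit if ab == bb else secrets.randbelow(2)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: both parties publicly compare bases and keep only matching rounds.
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
key = [a for a, _ in sifted]
assert all(a == b for a, b in sifted)  # holds only in this idealised setting
print(f"Sifted key ({len(key)} bits):", "".join(map(str, key)))

An eavesdropper measuring in the wrong basis would disturb the states and show up as errors when Alice and Bob compare a sample of the sifted key, which is what makes interception detectable in principle.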

Established in 2016 by Chris Erven, Caroline Clark and Jake Kennard, KETS Quantum Security develops unique chip-based solutions that provide ultra-low size, weight and power without compromising performance. It develops protection against quantum security threats, starting with chip-based, quantum-safe encryption development kits.

Go here to read the rest:

Bristol startup scores £3.1M to control next-gen quantum hacks threatening the future of internet - UKTN (UK Technology News)

Stars Made of Antimatter Might Be Lurking in the Universe – Scientific American

Antimatter may seem like the stuff of science fiction, especially because scarcely any of it can be seen in our universe, despite physicists' best theories suggesting antimatter should have arisen in equal proportion to normal matter during the big bang. But researchers do regularly produce particles of antimatter in their experiments, and they have the inklings of an explanation for its cosmic absence: Whenever antimatter and normal matter meet, they mutually annihilate in a burst of energy. The slimmest overabundance of normal matter at the beginning of time would have therefore effectively wiped antimatter off the celestial map, save for its occasional production in cosmic-ray strikes, human-made particle accelerators and perhaps certain theorized interactions between particles of dark matter.

That is why physicists were so greatly puzzled back in 2018, when the head of the Alpha Magnetic Spectrometer (AMS) experiment mounted on the exterior of the International Space Station announced that the instrument might have detected two antihelium nuclei, in addition to six that were possibly detected earlier. Any way you slice it, known natural processes would struggle to produce enough antihelium for any of it to end up in our space-based detectors. But the easiest of all those hard methods would be to cook up the antihelium inside antistars, which, of course, do not seem to exist. Despite the fact that the entirely unexpected AMS results have yet to be confirmed, let alone formally published, scientists have taken them seriously, and some have scrambled to find explanations.

Inspired by the tentative AMS findings, a group of researchers recently published a study calculating the maximum number of antimatter stars that could be lurking in our universe, based on a count of currently unexplained gamma-ray sources found by the Fermi Large Area Telescope (LAT). Simon Dupourqué, the study's lead author and an astrophysics graduate student at the Research Institute in Astrophysics and Planetology at the University of Toulouse III-Paul Sabatier in France and the French National Center for Scientific Research (CNRS), made the estimate after looking for antistar candidates in a decade's worth of the LAT's data.

Antistars would shine much as normal ones do, producing light of the same wavelengths. But they would exist in a matter-dominated universe. As particles and gases made of regular matter fell into such a star's gravitational pull and made contact with its antimatter, the resulting annihilation would produce a flash of high-energy light. We can see this light as a specific color of gamma rays. The team took 10 years of data, which amounted to roughly 6,000 light-emitting objects. They pared the list down to sources that shone with the right gamma frequency and that were not ascribed to previously cataloged astronomical objects. "So this left us with 14 candidates, which, in my opinion and my co-authors' opinion, too, are not antistars," Dupourqué says. If all of those sources were such stars, however, the group estimated that about one antistar would exist for every 400,000 ordinary ones in our stellar neck of the woods.
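The headline ratio can be sketched with back-of-envelope arithmetic, as in the snippet below; the effective number of surveyed stars is an assumption chosen to reproduce the quoted figure, not a parameter taken from the study itself.

# Back-of-envelope sketch of the upper-limit logic (assumed numbers, not the
# paper's actual methodology): if every unexplained gamma-ray source were an
# antistar, the antistar fraction is bounded by candidates / surveyed stars.
candidates = 14           # unexplained Fermi LAT sources from the study
surveyed_stars = 5.6e6    # ASSUMED effective stellar sample, chosen so the
                          # bound matches the quoted "1 in 400,000" figure
upper_limit = candidates / surveyed_stars
print(f"Upper limit: about 1 antistar per {1 / upper_limit:,.0f} ordinary stars")

In other words, the 14 candidates set a ceiling on how common antistars could be; they are not evidence that any exist.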

In place of any putative antistars, Dupourqué says, these gamma flashes could instead be coming from pulsars or the supermassive black holes at the centers of galaxies. Or they might simply be some kind of detector noise. The next step would be to point telescopes at the locations of the 14 candidate sources to find out if they resemble a star or a prosaic gamma-emitting object.

Given some interesting but questionable gamma sources, calculating the conceivable upper limit to the number of antistars is a long shot from actually discovering such astrophysical objects, so most researchers are not leaning toward that conclusion. "According to both theory and observations of extragalactic gamma rays, there should be no antistars in our galaxy.... One would only expect upper limits consistent with zero," says Floyd Stecker, an astrophysicist at NASA's Goddard Space Flight Center, who was not involved in the research. "However, it is always good to have further observational data confirming this."

If scientists, including the authors, are skeptical of antistars' very existence, why are they worth discussing? The mystery lies in those pesky possible detections of antihelium made by the AMS, which remain unexplained. Antiparticles can be created from two known natural sources, cosmic rays and dark matter, but the odds that either of them are responsible appear to be vanishingly slim.

"As we increase the size of an atom, it becomes harder and harder to produce as an antiparticle," says Vivian Poulin, a CNRS cosmologist based in Montpellier, France. "This means that it's rarer and rarer that it occurs, but it's allowed by physics." An antiproton is relatively easy to form, yet anything heavier, such as antideuterium (an antiproton plus an antineutron) or antihelium (two antiprotons plus typically one or two antineutrons), gets progressively harder to make as it gets more massive. In a paper published in 2019, Poulin used the AMS's potential antihelium detections to calculate a rough estimate of the prevalence of antistars, which inspired Dupourqué's new study.

In a process called spallation, high-energy cosmic rays from exploding stars can ram into interstellar gas particles, says Pierre Salati, a particle astrophysicist at the Annecy-le-Vieux Particle Physics Laboratory, who worked on Poulin's 2019 study. The team responsible for the AMS's antiparticle detections claims it may have detected six antihelium-3 nuclei, which would be incredibly rare products of spallation, and two antihelium-4 nuclei, which would be almost statistically impossible to form from cosmic rays, Salati says. (The difference between the two isotopes is the addition of one antineutron.)

As for dark matter, certain models predict that dark matter particles can annihilate one another, a process that could also create antiparticles. But this process still might not be able to make antihelium-4 in high enough quantities for us to have a realistic chance of ever seeing it (if such speculative models reflect reality at all). That is why the antistar hypothesis is still on the table. Verified antihelium detections would be a good indicator for the existence of antistars, but so far the AMS is the lone experiment to offer any such evidence, which has yet to be granted peer-reviewed publication, Salati notes.

"It's a very challenging analysis because, for every one antihelium event, there are 100 million regular helium events," says Ilias Cholis, an astrophysicist at Oakland University, who also worked on Poulin's study. It is possible, he and others say, that the detections turn out to be a fluke of a very complicated analysis.

Samuel Ting, a Nobel laureate physicist at the Massachusetts Institute of Technology, heads the AMS team and first publicly presented the two latest possible antihelium detections, the antihelium-4 candidates, in 2018. "We are not yet ready to publish any heavy antimatter results," he says. "We are collecting more data before any [further] announcement is made."

It is possible that a different experiment may give answers sooner. The General AntiParticle Spectrometer (GAPS) experiment is a balloon-borne detector that will hunt for antiparticles above Antarctica this year. Finding more antiparticles (antideuterons or even antihelium, in particular, according to Cholis) with the GAPS detector would make the AMS results far more convincing.

If antistars were found to be the culprit, that discovery would require a major reenvisioning of the universe's evolution: no longer could we relegate antistars and other hypothetical astrophysical objects composed of antimatter to the fringes of reasonable speculation. Even if they do exist, however, antistars probably are not forming now, Salati says, because their presumptive natal clouds of antihydrogen would face steep odds of avoiding annihilation for the past 13 billion years or so. Thus, any antistars that might be found likely would be exceedingly old remnants of the early universe. If so, one deep mystery would be replaced with another: How, exactly, did such ancient relics manage to survive to today? As is often the case, a new discovery raises far more questions than it answers.

Continue reading here:

Stars Made of Antimatter Might Be Lurking in the Universe - Scientific American

Brown alumni to oversee new NASA robotic missions to Venus – Brown University

PROVIDENCE, R.I. [Brown University] As NASA embarks upon plans to send a pair of new spacecraft to peer into the thick clouds of Venus over the next decade, each of the two Discovery Program missions will be led by a Brown University graduate.

The DAVINCI+ mission will measure the composition of Venus' thick, toxic blanket of an atmosphere to understand how it formed and evolved. James Garvin, a scientist at NASA's Goddard Space Flight Center who earned his Ph.D. in geological sciences from Brown in 1984, will lead the mission. The other mission, VERITAS, will look beneath the clouds to study the planet's geology and composition. That mission will be led by Suzanne Smrekar of NASA's Jet Propulsion Laboratory, who earned a bachelor's degree from Brown in 1984.

Jim Head, a research professor of planetary science at Brown, has worked with both mission leaders and was a member of the VERITAS science team in its earlier stages. He says that Smrekar and Garvin are the right choices to lead these critical missions to Earth's nearest neighbor.

"Both Sue Smrekar and Jim Garvin are passionate, energetic and incredibly creative planetary scientists, as well as steady, thoughtful leaders," Head said. "I've had the pleasure to work extensively with both of them over the years, and I couldn't be more confident that NASA has made the right choices to lead these two missions."

Over the years, Venus has received less attention from NASA than Earth's other next-door neighbor, Mars. But that's not because Venus isn't an interesting place, Head says. Venus is a dead ringer for Earth in many ways. The two are similar in diameter, mass and gravity, and both orbit in the so-called habitable zone around the sun. But at some point, the twin planets diverged onto very different paths. While Earth's climate is temperate and conducive to life, Venus' runaway greenhouse effect turned it into a stifling inferno, with surface temperatures approaching 900 degrees Fahrenheit.

Understanding how and why these twins wound up with such different fates is a critical question in planetary science, Head says.

"Venus is the most Earth-like planet but is so different in so many ways," he said. "If we don't understand Venus, we surely cannot fully understand the missing chapters in Earth's history, and why the atmospheres are so different. Could the hot, inhospitable Venus we see today be where the Earth is heading in the future?"

These two missions should shed light on that question and others. DAVINCI+ will use a descent sphere to dip into the Venusian clouds and measure concentrations of noble gases and other elements in the atmosphere. It will also snap the first high-resolution pictures of Venus' tesserae, odd geological features that suggest the planet may have had something like Earth's plate tectonics. VERITAS, meanwhile, will use radar to map elevation and surface features across much of the planet. That mapping will help to determine if Venus is volcanically active and whether that volcanism is contributing water vapor to the atmosphere. Using an infrared spectrometer, VERITAS will also look at surface rock composition, which remains largely unknown.

Both missions are part of NASA's Discovery Program and were selected through a competitive process. Both are expected to launch in the 2028-2030 timeframe.

Head worked with the VERITAS mission during much of the selection process, but recently stepped aside to make room for younger scientists to join the team, he said. In addition to Smrekar, three members of the current VERITAS team are Brown Ph.D. graduates: Jennifer Whitten, Caleb Fassett and Lauren Jozwiak. Smrekar is a geophysicist who currently serves as deputy principal investigator for NASA's InSight mission to Mars, and was deputy project scientist for the Mars Reconnaissance Orbiter mission. Garvin is a veteran planetary scientist who has worked on numerous NASA missions, and served as the agency's chief scientist. Brown graduates Noam Izenberg and Mike Ravine work with Garvin on DAVINCI+. Martha Gilmore, a 1997 Ph.D. graduate, serves on both mission teams.

Brown has had a strong presence in space exploration over the years, Head says, and the leadership of Brown alumni in these new missions is the latest example.

"Starting with Professor Tim Mutch in the 1960s, Brown's robust planetary geoscience research and teaching program has prepared generations of undergraduates and graduates for leadership and partnership roles in NASA and international exploration missions, including those of the European Space Agency, India, Israel, the Soviet Union and Russia," Head said. "Brown graduates have included two NASA Chief Scientists and two NASA Astronauts, one of whom, Jessica Meir, is a candidate to explore the Moon in the Artemis Program."

It's been 30 years since NASA's last mission to Venus, and Head says he's thrilled that the agency has decided it's time to go back. The data returned by these two missions will shed critical light not only on Venus, but on Earth as well, he says.

"When we explore the solar system, we're doing comparative planetology," Head said. "Everything we learn about the other terrestrial planets helps us to understand our own home, and answer critical questions about how our world came to be and how it will evolve in the future."

See the original post:

Brown alumni to oversee new NASA robotic missions to Venus - Brown University

Op-Ed: Eugenics is making a comeback. Resist, before …

President Trump speaks at a campaign rally Sept. 18 in Bemidji, Minn., where he made remarks espousing eugenics. (Associated Press)

Politicians often flatter their audiences, but at a rally in Bemidji, Minn., last month, President Trump found an unusual thing to praise about the nearly all-white crowd: its genetics. "You have good genes," he insisted. "A lot of it is about the genes, isn't it, don't you believe? The racehorse theory. You have good genes in Minnesota."

In case it was not clear from the sea of white faces that he was making a point about race, Trump later said the quiet part out loud. "Every family in Minnesota needs to know about Sleepy Joe Biden's extreme plan to flood your state with an influx of refugees from Somalia, from other places all over the planet," he declared.

Trump's ugly endorsement of race-based eugenics got national attention, but in a presidency filled with outrages, our focus quickly moved to the next. Besides, this wasn't the first time we'd heard about these views. A "Frontline" documentary reported in 2016 that Trump believed the racehorse theory of human development that he referred to in Minnesota: that superior men and women will have superior children. That same year, the Huffington Post released a video collecting Trump's statements on human genetics, including his declarations that "I'm a gene believer" and "I'm proud to have that German blood."

On eugenics, as in so many areas, the scariest thing about Trump's views is not the fact that he holds them, but that there is no shortage of Americans who share them. The United States has a long, dark history with eugenics. Starting in 1907, a majority of states passed laws authorizing the sterilization of people deemed to have undesirable genes, for reasons as varied as feeblemindedness and alcoholism. The Supreme Court upheld these laws by an 8-1 vote, in the infamous 1927 case Buck vs. Bell, and as many as 70,000 Americans were sterilized for eugenic reasons in the 20th century.

America's passion for eugenics waned after World War II, when Nazism discredited the idea of dividing people based on the quality of their genes. But in recent years, public support for eugenics has made a comeback. Steve King, a Republican congressman from Iowa, tweeted in 2017, "We can't restore our civilization with somebody else's babies." The comment struck many as a claim that American children were genetically superior, though King later insisted he was concerned with the culture, not the blood, of foreign babies.

Eugenics has also had a resurgence in England, where the movement was first launched in the 1880s by Francis Galton, a cousin of Charles Darwin. In February, Andrew Sabisky, an advisor to British Prime Minister Boris Johnson, resigned after it was revealed that he had reportedly written blog posts suggesting that there are genetic differences in intelligence among races, and that compulsory contraception could be used to prevent the rise of a permanent underclass. Richard Dawkins, one of Britain's most prominent scientists, added fuel to the fire by tweeting that although eugenics could be criticized on moral or ideological grounds, of course it would work in practice. "Eugenics works for cows, horses, pigs, dogs & roses," he said. "Why on earth wouldn't it work for humans?"

There is reason to believe the eugenics movement will continue to grow. America's first embrace of it came at a time when immigration levels were high, and it was closely tied to fears that genetically inferior foreigners were hurting the nation's gene pool. Eugenicists persuaded Congress to pass the Immigration Act of 1924, which sharply reduced the number of Italian, Jewish and Asian people allowed in.

Today, the percentage of Americans who were born outside the United States is the highest it has been since 1910, and fear of immigrants is again an animating force in politics. As our nation continues to become more diverse, the sort of xenophobia that fueled Trump's and King's comments is likely to produce more talk of better genes and babies.

It is critically important to push back against these toxic ideas. One way to do this is by ensuring that people who promote eugenics are denounced and kept out of positions of power. It is encouraging that Sabisky was forced out and that King was defeated for reelection in his Republican primary in June. Hopefully, Trump will be the next to go.

Education, including an honest reckoning with our own tragic eugenics history, is another form of resistance. It is starting to happen: Stanford University just announced that it is removing the name of its first president, David Starr Jordan, a leading eugenicist, from campus buildings, and that it will actively work to better explain his legacy. We need more of this kind of self-scrutiny from universities like Harvard, Yale and many others that promoted eugenics and pseudo race science, as well as institutions like the American Museum of Natural History, which in 1921 hosted the Second International Eugenics Congress, at which eugenicists advocated for eliminating the unfit.

Trump's appalling remarks in Minnesota show how serious the situation is now. Seventy-five years after the liberation of the Nazi concentration camps, a United States president not only spoke about "good genes" in racialized terms; he believed that his observations would help him to win in the relatively liberal state of Minnesota. It is crucial that everyone who understands the horrors of eugenics works to defeat these views before they become any more popular.

Adam Cohen, a former member of the New York Times editorial board, is the author of "Imbeciles: The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck" and, this year, "Supreme Inequality: The Supreme Court's Fifty-Year Battle for a More Unjust America."

This story originally appeared in Los Angeles Times.

See the rest here:

Op-Ed: Eugenics is making a comeback. Resist, before ...

The Horrifying American Roots of Nazi Eugenics | History …

Hitler and his henchmen victimized an entire continent and exterminated millions in his quest for a so-called "Master Race."

But the concept of a white, blond-haired, blue-eyed master Nordic race didn't originate with Hitler. The idea was created in the United States, and cultivated in California, decades before Hitler came to power. California eugenicists played an important, although little known, role in the American eugenics movement's campaign for ethnic cleansing.

Eugenics was the racist pseudoscience determined to wipe away all human beings deemed "unfit," preserving only those who conformed to a Nordic stereotype. Elements of the philosophy were enshrined as national policy by forced sterilization and segregation laws, as well as marriage restrictions, enacted in twenty-seven states. In 1909, California became the third state to adopt such laws. Ultimately, eugenics practitioners coercively sterilized some 60,000 Americans, barred the marriage of thousands, forcibly segregated thousands in "colonies," and persecuted untold numbers in ways we are just learning. Before World War II, nearly half of coercive sterilizations were done in California, and even after the war, the state accounted for a third of all such surgeries.

California was considered an epicenter of the American eugenics movement. During the Twentieth Century's first decades, California's eugenicists included potent but little known race scientists, such as Army venereal disease specialist Dr. Paul Popenoe, citrus magnate and Polytechnic benefactor Paul Gosney, Sacramento banker Charles M. Goethe, as well as members of the California State Board of Charities and Corrections and the University of California Board of Regents.

Eugenics would have been so much bizarre parlor talk had it not been for extensive financing by corporate philanthropies, specifically the Carnegie Institution, the Rockefeller Foundation and the Harriman railroad fortune. They were all in league with some of America's most respected scientists hailing from such prestigious universities as Stanford, Yale, Harvard, and Princeton. These academicians espoused race theory and race science, and then faked and twisted data to serve eugenics' racist aims.

Stanford president David Starr Jordan originated the notion of "race and blood" in his 1902 racial epistle "Blood of a Nation," in which the university scholar declared that human qualities and conditions such as talent and poverty were passed through the blood.

In 1904, the Carnegie Institution established a laboratory complex at Cold Spring Harbor on Long Island that stockpiled millions of index cards on ordinary Americans, as researchers carefully plotted the removal of families, bloodlines and whole peoples. From Cold Spring Harbor, eugenics advocates agitated in the legislatures of America, as well as the nation's social service agencies and associations.

The Harriman railroad fortune paid local charities, such as the New York Bureau of Industries and Immigration, to seek out Jewish, Italian and other immigrants in New York and other crowded cities and subject them to deportation, trumped up confinement or forced sterilization.

The Rockefeller Foundation helped found the German eugenics program and even funded the program that Josef Mengele worked in before he went to Auschwitz.

Much of the spiritual guidance and political agitation for the American eugenics movement came from California's quasi-autonomous eugenic societies, such as the Pasadena-based Human Betterment Foundation and the California branch of the American Eugenics Society, which coordinated much of their activity with the Eugenics Research Society in Long Island. These organizations--which functioned as part of a closely-knit network--published racist eugenic newsletters and pseudoscientific journals, such as Eugenical News and Eugenics, and propagandized for the Nazis.

Eugenics was born as a scientific curiosity in the Victorian age. In 1863, Sir Francis Galton, a cousin of Charles Darwin, theorized that if talented people only married other talented people, the result would be measurably better offspring. At the turn of the last century, Galton's ideas were imported into the United States just as Gregor Mendel's principles of heredity were rediscovered. American eugenic advocates believed with religious fervor that the same Mendelian concepts determining the color and size of peas, corn and cattle also governed the social and intellectual character of man.

In an America demographically reeling from immigration upheaval and torn by post-Reconstruction chaos, race conflict was everywhere in the early twentieth century. Elitists, utopians and so-called "progressives" fused their smoldering race fears and class bias with their desire to make a better world. They reinvented Galton's eugenics into a repressive and racist ideology. The intent: populate the earth with vastly more of their own socio-economic and biological kind--and less or none of everyone else.

The superior species the eugenics movement sought was populated not merely by tall, strong, talented people. Eugenicists craved blond, blue-eyed Nordic types. This group alone, they believed, was fit to inherit the earth. In the process, the movement intended to subtract emancipated Negroes, immigrant Asian laborers, Indians, Hispanics, East Europeans, Jews, dark-haired hill folk, poor people, the infirm and really anyone classified outside the gentrified genetic lines drawn up by American raceologists.

How? By identifying so-called "defective" family trees and subjecting them to lifelong segregation and sterilization programs to kill their bloodlines. The grand plan was to literally wipe away the reproductive capability of those deemed weak and inferior--the so-called "unfit." The eugenicists hoped to neutralize the viability of 10 percent of the population at a sweep, until none were left except themselves.

Eighteen solutions were explored in a Carnegie-supported 1911 "Preliminary Report of the Committee of the Eugenic Section of the American Breeder's Association to Study and to Report on the Best Practical Means for Cutting Off the Defective Germ-Plasm in the Human Population." Point eight was euthanasia.

The most commonly suggested method of eugenicide in America was a "lethal chamber" or public locally operated gas chambers. In 1918, Popenoe, the Army venereal disease specialist during World War I, co-wrote the widely used textbook, Applied Eugenics, which argued, "From an historical point of view, the first method which presents itself is execution... Its value in keeping up the standard of the race should not be underestimated." Applied Eugenics also devoted a chapter to "Lethal Selection," which operated "through the destruction of the individual by some adverse feature of the environment, such as excessive cold, or bacteria, or by bodily deficiency."

Eugenic breeders believed American society was not ready to implement an organized lethal solution. But many mental institutions and doctors practiced improvised medical lethality and passive euthanasia on their own. One institution in Lincoln, Illinois fed its incoming patients milk from tubercular cows, believing a eugenically strong individual would be immune. Thirty to forty percent annual death rates resulted at Lincoln. Some doctors practiced passive eugenicide one newborn infant at a time. Other doctors at mental institutions engaged in lethal neglect.

Nonetheless, with eugenicide marginalized, the main solution for eugenicists was the rapid expansion of forced segregation and sterilization, as well as more marriage restrictions. California led the nation, performing nearly all sterilization procedures with little or no due process. In its first twenty-five years of eugenic legislation, California sterilized 9,782 individuals, mostly women. Many were classified as "bad girls," diagnosed as "passionate," "oversexed" or "sexually wayward." At Sonoma, some women were sterilized because of what was deemed an abnormally large clitoris or labia.

In 1933 alone, at least 1,278 coercive sterilizations were performed, 700 of which were on women. The state's two leading sterilization mills in 1933 were Sonoma State Home with 388 operations and Patton State Hospital with 363 operations. Other sterilization centers included Agnews, Mendocino, Napa, Norwalk, Stockton and Pacific Colony state hospitals.

Even the United States Supreme Court endorsed aspects of eugenics. In its infamous 1927 decision, Supreme Court Justice Oliver Wendell Holmes wrote, "It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. Three generations of imbeciles are enough." This decision opened the floodgates for thousands to be coercively sterilized or otherwise persecuted as subhuman. Years later, the Nazis at the Nuremberg trials quoted Holmes's words in their own defense.

Only after eugenics became entrenched in the United States was the campaign transplanted into Germany, in no small measure through the efforts of California eugenicists, who published booklets idealizing sterilization and circulated them to German officials and scientists.

Hitler studied American eugenics laws. He tried to legitimize his anti-Semitism by medicalizing it, and wrapping it in the more palatable pseudoscientific facade of eugenics. Hitler was able to recruit more followers among reasonable Germans by claiming that science was on his side. While Hitler's race hatred sprung from his own mind, the intellectual outlines of the eugenics Hitler adopted in 1924 were made in America.

During the '20s, Carnegie Institution eugenic scientists cultivated deep personal and professional relationships with Germany's fascist eugenicists. In Mein Kampf, published in 1924, Hitler quoted American eugenic ideology and openly displayed a thorough knowledge of American eugenics. "There is today one state," wrote Hitler, "in which at least weak beginnings toward a better conception [of immigration] are noticeable. Of course, it is not our model German Republic, but the United States."

Hitler proudly told his comrades just how closely he followed the progress of the American eugenics movement. "I have studied with great interest," he told a fellow Nazi, "the laws of several American states concerning prevention of reproduction by people whose progeny would, in all probability, be of no value or be injurious to the racial stock."

Hitler even wrote a fan letter to American eugenic leader Madison Grant, calling his race-based eugenics book, The Passing of the Great Race, his "bible."

Hitler's struggle for a superior race would be a mad crusade for a Master Race. Now, the American term "Nordic" was freely exchanged with "Germanic" or "Aryan." Race science, racial purity and racial dominance became the driving force behind Hitler's Nazism. Nazi eugenics would ultimately dictate who would be persecuted in a Reich-dominated Europe, how people would live, and how they would die. Nazi doctors would become the unseen generals in Hitler's war against the Jews and other Europeans deemed inferior. Doctors would create the science, devise the eugenic formulas, and even hand-select the victims for sterilization, euthanasia and mass extermination.

During the Reich's early years, eugenicists across America welcomed Hitler's plans as the logical fulfillment of their own decades of research and effort. California eugenicists republished Nazi propaganda for American consumption. They also arranged for Nazi scientific exhibits, such as an August 1934 display at the L.A. County Museum, for the annual meeting of the American Public Health Association.

In 1934, as Germany's sterilizations were accelerating beyond 5,000 per month, the California eugenics leader C. M. Goethe, upon returning from Germany, ebulliently bragged to a key colleague, "You will be interested to know, that your work has played a powerful part in shaping the opinions of the group of intellectuals who are behind Hitler in this epoch-making program. Everywhere I sensed that their opinions have been tremendously stimulated by American thought... I want you, my dear friend, to carry this thought with you for the rest of your life, that you have really jolted into action a great government of 60 million people."

That same year, ten years after Virginia passed its sterilization act, Joseph DeJarnette, superintendent of Virginia's Western State Hospital, observed in the Richmond Times-Dispatch, "The Germans are beating us at our own game."

More than just providing the scientific roadmap, America funded Germany's eugenic institutions. By 1926, Rockefeller had donated some $410,000 -- almost $4 million in 21st-Century money -- to hundreds of German researchers. In May 1926, Rockefeller awarded $250,000 to the German Psychiatric Institute of the Kaiser Wilhelm Institute, later to become the Kaiser Wilhelm Institute for Psychiatry. Among the leading psychiatrists at the German Psychiatric Institute was Ernst Rüdin, who became director and eventually an architect of Hitler's systematic medical repression.

Another in the Kaiser Wilhelm Institute's eugenic complex of institutions was the Institute for Brain Research. Since 1915, it had operated out of a single room. Everything changed when Rockefeller money arrived in 1929. A grant of $317,000 allowed the Institute to construct a major building and take center stage in German race biology. The Institute received additional grants from the Rockefeller Foundation during the next several years. Leading the Institute, once again, was Hitler's medical henchman Ernst Rüdin. Rüdin's organization became a prime director and recipient of the murderous experimentation and research conducted on Jews, Gypsies and others.

Beginning in 1940, thousands of Germans taken from old age homes, mental institutions and other custodial facilities were systematically gassed. Between 50,000 and 100,000 were eventually killed.

Leon Whitney, executive secretary of the American Eugenics Society, declared of Nazism, "While we were pussy-footing around... the Germans were calling a spade a spade."

A special recipient of Rockefeller funding was the Kaiser Wilhelm Institute for Anthropology, Human Heredity and Eugenics in Berlin. For decades, American eugenicists had craved twins to advance their research into heredity. The Institute was now prepared to undertake such research on an unprecedented level. On May 13, 1932, the Rockefeller Foundation in New York dispatched a radiogram to its Paris office: JUNE MEETING EXECUTIVE COMMITTEE NINE THOUSAND DOLLARS OVER THREE YEAR PERIOD TO KWG INSTITUTE ANTHROPOLOGY FOR RESEARCH ON TWINS AND EFFECTS ON LATER GENERATIONS OF SUBSTANCES TOXIC FOR GERM PLASM.

At the time of Rockefeller's endowment, Otmar Freiherr von Verschuer, a hero in American eugenics circles, functioned as a head of the Institute for Anthropology, Human Heredity and Eugenics. Rockefeller funding of that Institute continued both directly and through other research conduits during Verschuer's early tenure. In 1935, Verschuer left the Institute to form a rival eugenics facility in Frankfurt that was much heralded in the American eugenic press. Research on twins in the Third Reich exploded, backed up by government decrees. Verschuer wrote in Der Erbarzt, a eugenic doctor's journal he edited, that Germany's war would yield a "total solution to the Jewish problem."

Verschuer had a long-time assistant. His name was Josef Mengele. On May 30, 1943, Mengele arrived at Auschwitz. Verschuer notified the German Research Society, "My assistant, Dr. Josef Mengele (M.D., Ph.D.) joined me in this branch of research. He is presently employed as Hauptsturmführer [captain] and camp physician in the Auschwitz concentration camp. Anthropological testing of the most diverse racial groups in this concentration camp is being carried out with permission of the SS Reichsführer [Himmler]."

Mengele began searching the boxcar arrivals for twins. When he found them, he performed beastly experiments, scrupulously wrote up the reports and sent the paperwork back to Verschuer's institute for evaluation. Often, cadavers, eyes and other body parts were also dispatched to Berlin's eugenic institutes.

Rockefeller executives never knew of Mengele. With few exceptions, the foundation had ceased all eugenic studies in Nazi-occupied Europe before the war erupted in 1939. But by that time the die had been cast. The talented men Rockefeller and Carnegie financed, the institutions they helped found, and the science they helped create took on a scientific momentum of their own.

After the war, eugenics was declared a crime against humanity--an act of genocide. Germans were tried and they cited the California statutes in their defense. To no avail. They were found guilty.

However, Mengele's boss Verschuer escaped prosecution. Verschuer re-established his connections with California eugenicists who had gone underground and renamed their crusade "human genetics." Typical was an exchange on July 25, 1946, when Popenoe wrote Verschuer, "It was indeed a pleasure to hear from you again. I have been very anxious about my colleagues in Germany. I suppose sterilization has been discontinued in Germany?" Popenoe offered tidbits about various American eugenic luminaries and then sent various eugenic publications. In a separate package, Popenoe sent some cocoa, coffee and other goodies.

Verschuer wrote back, "Your very friendly letter of 7/25 gave me a great deal of pleasure and you have my heartfelt thanks for it. The letter builds another bridge between your and my scientific work; I hope that this bridge will never again collapse but rather make possible valuable mutual enrichment and stimulation."

Soon, Verschuer once again became a respected scientist in Germany and around the world. In 1949, he became a corresponding member of the newly formed American Society of Human Genetics, organized by American eugenicists and geneticists.

In the fall of 1950, the University of Münster offered Verschuer a position at its new Institute of Human Genetics, where he later became a dean. In the early and mid-1950s, Verschuer became an honorary member of numerous prestigious societies, including the Italian Society of Genetics, the Anthropological Society of Vienna, and the Japanese Society for Human Genetics.

Human genetics' genocidal roots in eugenics were ignored by a victorious generation that refused to link itself to the crimes of Nazism, and by succeeding generations that never knew the truth of the years leading up to war. Now governors of five states, including California, have issued public apologies to their citizens, past and present, for sterilization and other abuses spawned by the eugenics movement.

Human genetics became an enlightened endeavor in the late twentieth century. Hard-working, devoted scientists finally cracked the human code through the Human Genome Project. Now, every individual can be biologically identified and classified by trait and ancestry. Yet even now, some leading voices in the genetic world are calling for a cleansing of the unwanted among us, and even a master human species.

There is understandable wariness about more ordinary forms of abuse, for example, in denying insurance or employment based on genetic tests. On October 14, America's first genetic anti-discrimination legislation passed the Senate by unanimous vote. Yet because genetics research is global, no single nation's law can stop the threats.

This article was first published in the San Francisco Chronicle and is reprinted with permission of the author.

Read the original here:

The Horrifying American Roots of Nazi Eugenics | History ...