
Category Archives: Quantum Physics

Realization of an ideal Weyl semimetal band in a quantum gas with 3D spin-orbit coupling – Science Magazine

Posted: April 17, 2021 at 11:51 am

A minimal Weyl semimetal

Many compounds have now been identified as Weyl semimetals, materials with an unusual electronic band structure characterized by the so-called Weyl points. Weyl points always appear in pairs, but the solid-state materials studied so far have at least four. Wang et al. engineered a Weyl semimetallic state with the minimum number of Weyl points (two) in a gas of ultracold atoms trapped in an optical lattice (see the Perspective by Goldman and Yefsah). To do that, the researchers had to create three-dimensional spin-orbit coupling in this system. The relative simplicity of the resulting band structure will make it easier to observe the unusual effects associated with this state.

Science, this issue p. 271; see also p. 234

Weyl semimetals are three-dimensional (3D) gapless topological phases with Weyl cones in the bulk band. According to lattice theory, Weyl cones must come in pairs, with the minimum number of cones being two. A semimetal with only two Weyl cones is an ideal Weyl semimetal (IWSM). Here we report the experimental realization of an IWSM band by engineering 3D spin-orbit coupling for ultracold atoms. The topological Weyl points are clearly measured via the virtual slicing imaging technique in equilibrium and are further resolved in the quench dynamics. The realization of an IWSM band opens an avenue to investigate various exotic phenomena that are difficult to access in solids.
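For background, the low-energy physics near a single Weyl point is usually written as a two-band Hamiltonian that is linear in the momentum measured from that point. This is a standard textbook form rather than an equation from the paper, with v an effective velocity and χ = ±1 the chirality of the node:

\[
H_{\chi}(\mathbf{k}) = \chi\,\hbar v\,\mathbf{k}\cdot\boldsymbol{\sigma} = \chi\,\hbar v\left(k_x\sigma_x + k_y\sigma_y + k_z\sigma_z\right), \qquad E(\mathbf{k}) = \pm\,\hbar v\,|\mathbf{k}|,
\]

where σ are the Pauli matrices and the two energy branches form the Weyl cone. The Nielsen-Ninomiya theorem is the lattice-theory result referred to above: on a lattice the chiralities must sum to zero, so Weyl points come in pairs, and two is the minimum realized by an ideal Weyl semimetal.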

Original post:

Realization of an ideal Weyl semimetal band in a quantum gas with 3D spin-orbit coupling - Science Magazine

Posted in Quantum Physics | Comments Off on Realization of an ideal Weyl semimetal band in a quantum gas with 3D spin-orbit coupling – Science Magazine

Quantum computers are revealing an unexpected new theory of reality – New Scientist

Posted: April 15, 2021 at 6:36 am

A powerful new idea about how the laws of physics work could bring breakthroughs on everything from quantum gravity to consciousness, says researcher Chiara Marletto

By Chiara Marletto

Manshen Lo

QUANTUM supremacy is a phrase that has been in the news a lot lately. Several labs worldwide have already claimed to have reached this milestone, at which computers exploiting the wondrous features of the quantum world solve a problem faster than a conventional classical computer feasibly could. Although we aren't quite there yet, a general-purpose universal quantum computer seems closer than ever, a revolutionary development for how we communicate and encrypt data, for virtual reality, artificial intelligence and much more.

These prospects excite me as a theoretical physicist too, but my colleagues and I are captivated by an even bigger picture. The quantum theory of computation originated as a way to deepen our understanding of quantum theory, our fundamental theory of physical reality. By applying the principles we have learned more broadly, we think we are beginning to see the outline of a radical new way to construct laws of nature.

It means abandoning the idea of physics as the science of what's actually happening, and embracing it as the science of what might or might not happen. This science of "can and can't" could help us tackle some of the big questions that conventional physics has tried and failed to get to grips with, from delivering an exact, unifying theory of thermodynamics and information to getting round conceptual barriers that stop us merging quantum theory with general relativity, Einstein's theory of gravity. It might go even further and help us to understand how intelligent thought works, and kick-start a technological revolution that would make quantum supremacy look modest by comparison.

Since the dawn of modern physics in

Read the original post:

Quantum computers are revealing an unexpected new theory of reality - New Scientist

Posted in Quantum Physics | Comments Off on Quantum computers are revealing an unexpected new theory of reality – New Scientist

Student’s physics homework picked up by Amazon quantum researchers – News – The University of Sydney

Posted: at 6:36 am

Pablo Bonilla Ataides (left) with co-author Dr Ben Brown from the School of Physics. Photo: Louise Cooper

What started out as a second-year physics project is making its way into Amazon Web Services' (AWS) quantum computing program.

University of Sydney science undergraduate Pablo Bonilla Ataides has tweaked some computing code to effectively double its capacity to correct errors in the quantum machines being designed in the emerging technology sector.

The simple but ingenious change to quantum error correcting code has grabbed the attention of quantum researchers at the AWS Center for Quantum Computing in Pasadena, California, and the quantum technology programs at Yale University and Duke University in the United States.

"Quantum technology is in its infancy, partly because we haven't been able to overcome the inherent instability in the machines that produce so many errors," 21-year-old Mr Bonilla said.

"In second-year physics I was asked to look at some commonly used error correcting code to see if we could improve it. By flipping half of the quantum switches, or qubits, in our design, we found we could effectively double our ability to suppress errors."

The research is published today in Nature Communications.

The results of the study, co-authored by Dr Steve Flammia, who has recently moved from the University of Sydney to AWS's quantum computing effort, are to feature in the tech company's arsenal of error correction techniques as it develops its quantum hardware.

Dr Earl Campbell is a senior quantum research scientist at AWS. He said: "We have considerable work ahead of us as an industry before anyone sees real, practical benefits from quantum computers.

"This research surprised me. I was amazed that such a slight change to a quantum error correction code could lead to such a big impact in predicted performance.

"The AWS Center for Quantum Computing team looks forward to collaborating further as we explore other promising alternatives to bring new, more powerful computing technologies one step closer to reality."

Errors are extremely rare in the digital transistors, or switches, that classical computers use to run our phones, laptops and even the fastest supercomputers.

However, the switches in quantum computers, known as qubits, are particularly sensitive to interference, or noise, from the external environment.

In order to make quantum machines work, scientists need to produce a large number of high-quality qubits. This can be done by improving the machines so they are less noisy, and by using some of the machines' capacity to suppress qubit errors below a certain threshold, which is what makes them useful.

That is where quantum error correction comes in.
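As general background (a rule of thumb often quoted for surface-code-style error correction, not a result from this study), the benefit of being below threshold is usually summarized by a scaling law in which the logical error rate falls off exponentially with the code distance d once the physical error rate p sits below the code's threshold p_th:

\[
p_{\text{logical}} \;\sim\; A\left(\frac{p}{p_{\text{th}}}\right)^{(d+1)/2}.
\]

In this picture, operating at, say, half the threshold means that every increase of the code distance by two multiplies the logical error rate by another factor of one half, which is why pushing physical error rates below threshold and then adding qubits is the route to useful machines.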

Assistant Professor Shruti Puri from the quantum research program at Yale University said her team is interested in using the new code for its work.

"What amazes me about this new code is its sheer elegance. Its remarkable error-correcting properties are coming from a simple modification to a code that has been studied extensively for almost two decades," Assistant Professor Puri said.

"It is extremely relevant for a new generation of quantum technology being developed at Yale and elsewhere. With this new code, I believe, we have considerably shortened the timeline to achieve scalable quantum computation."

Co-author Dr David Tuckett from the School of Physics said: "It's a bit like playing battleships with a quantum opponent. Theoretically, they could place their pieces anywhere on the board. But after playing millions of games, we know that certain moves are more likely."

Read more from the original source:

Student's physics homework picked up by Amazon quantum researchers - News - The University of Sydney

Posted in Quantum Physics | Comments Off on Student’s physics homework picked up by Amazon quantum researchers – News – The University of Sydney

The Big Theoretical Physics Problem At The Center Of The ‘Muon g-2’ Puzzle – Forbes

Posted: at 6:36 am

The Muon g-2 electromagnet at Fermilab, ready to receive a beam of muon particles. This experiment began in 2017 and will take data for a total of 3 years, reducing the uncertainties significantly. While a total of 5-sigma significance may be reached, the theoretical calculations must account for every effect and interaction of matter that's possible in order to ensure we're measuring a robust difference between theory and experiment.

In early April 2021, the experimental physics community announced an enormous victory: they had measured the muon's magnetic moment to unprecedented precision. With the extraordinary precision achieved by the Muon g-2 collaboration, they were able to show that the spin magnetic moment of the muon not only wasn't 2, as originally predicted by Dirac, but was more precisely 2.00116592040. There's an uncertainty of 54 in the final two digits, but not larger. Therefore, if the theoretical prediction differs from this measured value by too much, there must be new physics at play: a tantalizing possibility that has justifiably excited a great many physicists.

The best theoretical prediction that we have, in fact, is more like 2.0011659182, which is significantly below the experimental measurement. Given that the experimental result strongly confirms a much earlier measurement of the same g-2 quantity for the muon by the Brookhaven E821 experiment, there's every reason to believe that the experimental result will hold up with better data and reduced errors. But the theoretical result is very much in doubt, for reasons everyone should appreciate. Let's help everyone, physicists and non-physicists alike, understand why.
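For a rough sense of scale, using only the two numbers quoted above and ignoring the theoretical uncertainty, the gap between measurement and prediction is

\[
2.00116592040 - 2.0011659182 \;\approx\; 2.2\times 10^{-9},
\]

about four times the quoted experimental uncertainty of 5.4 × 10⁻¹⁰. (The collaboration's own analysis, which combines the Fermilab and Brookhaven measurements and folds in the theory error, quoted a significance of about 4.2 sigma.)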

The first Muon g-2 results from Fermilab are consistent with prior experimental results. When combined with the earlier Brookhaven data, they reveal a significantly larger value than the Standard Model predicts. However, although the experimental data is exquisite, this interpretation of the result is not the only viable one.

The Universe, as we know it, is fundamentally quantum in nature. Quantum, as we understand it, means that things can be broken down into fundamental components that obey probabilistic, rather than deterministic, rules. Deterministic is what happens for classical objects: macroscopic particles such as rocks. If you had two closely spaced slits and threw a small rock at them, you could take one of two approaches, both of which would be valid.

But for quantum objects, you can't do either of those. You could only compute a probability distribution for the various outcomes that could have occurred. You can either compute the probabilities of where things would land, or the probability of various trajectories having occurred. Any additional measurement you attempt to make, with the goal of gathering extra information, would alter the outcome of the experiment.

Electrons exhibit wave properties as well as particle properties, and can be used to construct images or probe particle sizes just as well as light can. This compilation shows an electron wave pattern, which cumulatively emerges after many electrons are passed through a double slit.

That's the quantum weirdness we're used to: quantum mechanics. Generalizing the laws of quantum mechanics to obey Einstein's laws of special relativity led to Dirac's original prediction for the muon's spin magnetic moment: that there would be a quantum mechanical multiplicative factor applied to the classical prediction, g, and that g would exactly equal 2. But, as we all now know, g doesn't exactly equal 2; it takes a value slightly higher than 2. In other words, when we measure the physical quantity g-2, we're measuring the cumulative effects of everything that Dirac missed.

So, what did he miss?

He missed the fact that it's not just the individual particles that make up the Universe that are quantum in nature; the fields that permeate the space between those particles must also be quantum. This enormous leap from quantum mechanics to quantum field theory enabled us to calculate deeper truths that aren't illuminated by quantum mechanics at all.

Magnetic field lines, as illustrated by a bar magnet: a magnetic dipole, with a north and south pole bound together. These permanent magnets remain magnetized even after any external magnetic fields are taken away. If you 'snap' a bar magnet in two, it won't create an isolated north and south pole, but rather two new magnets, each with their own north and south poles. Mesons 'snap' in a similar manner.

The idea of quantum field theory is simple. Yes, you still have particles that are charged in some variety:

but they don't just create fields around them based on things like their position and momentum like they did under either Newton's/Einstein's gravity or Maxwell's electromagnetism.

If things like the position and momentum of each particle have an inherent quantum uncertainty associated with them, then what does that mean for the fields associated with them? It means we need a new way to think about fields: a quantum formulation. Although it took decades to get it right, a number of physicists independently figured out a successful method of performing the necessary calculations.

A visualization of QCD illustrates how particle/antiparticle pairs pop out of the quantum vacuum for very small amounts of time as a consequence of Heisenberg uncertainty. If you have a large uncertainty in energy (E), the lifetime (t) of the particle(s) created must be very short.

What many people expected to happen, although it doesn't quite work this way, is that we'd be able simply to fold all the necessary quantum uncertainties into the charged particles that generate these quantum fields, and that would allow us to compute the field behavior. But that misses a crucial contribution: the fact that these quantum fields exist, and in fact permeate all of space, even where there are no charged particles giving rise to the corresponding field.

Electromagnetic fields exist even in the absence of charged particles, for instance. You can imagine waves of all different wavelengths permeating all of space, even when no other particles are present. That's fine from a theoretical perspective, but we'd want experimental proof that this description was correct. We already have it in a couple of forms.

As electromagnetic waves propagate away from a source that's surrounded by a strong magnetic field, the polarization direction will be affected due to the magnetic field's effect on the vacuum of empty space: vacuum birefringence. By measuring the wavelength-dependent effects of polarization around neutron stars with the right properties, we can confirm the predictions of virtual particles in the quantum vacuum.

In fact, the experimental effects of quantum fields have been felt since 1947, when the Lamb-Retherford experiment demonstrated their reality. The debate is no longer over whether:

But what we do have to recognize is that, as in the case with many mathematical equations that we know how to write down, we cannot compute everything with the same straightforward, brute-force approach.

The way we perform these calculations in quantum electrodynamics (QED), for example, is to do what's called a perturbative expansion. We imagine what it would be like for two particles to interact (an electron and an electron, a muon and a photon, a quark and another quark, etc.) and then we imagine every possible quantum field interaction that could happen atop that basic interaction.

Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. The electromagnetic interactions, shown here, are all governed by a single force-carrying particle: the photon.

This is the idea of quantum field theory that's normally encapsulated by its most commonly seen tool for representing the calculational steps that must be taken: Feynman diagrams, as above. In the theory of quantum electrodynamics, where charged particles interact via the exchange of photons, and those photons can then couple to any other charged particles, we perform these calculations by:

Quantum electrodynamics is one of the many field theories we can write down where this approach, as we go to progressively higher loop orders in our calculations, gets more and more accurate the more we calculate. The processes at play in the muon's (or electron's, or tau's) spin magnetic moment have been calculated beyond five-loop order recently, and there's very little uncertainty there.
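To make the loop-order language concrete, the anomalous part of the magnetic moment, a = (g-2)/2, is computed in QED as a series in the fine-structure constant α; this generic textbook form (quoted here as background, not from the article) has Schwinger's one-loop result as its leading term:

\[
a \;=\; \frac{g-2}{2} \;=\; \frac{1}{2}\left(\frac{\alpha}{\pi}\right) + C_2\left(\frac{\alpha}{\pi}\right)^2 + C_3\left(\frac{\alpha}{\pi}\right)^3 + \cdots
\]

The first term alone gives a ≈ α/(2π) ≈ 0.00116, already most of the measured value, and each additional loop order is suppressed by a further factor of roughly α/π ≈ 0.002, which is why the expansion converges so quickly for electromagnetic effects.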

Through a herculean effort on the part of theoretical physicists, the muon magnetic moment has been calculated up to five-loop order. The theoretical uncertainties are now at the level of just one part in two billion. This is a tremendous achievement that can only be made in the context of quantum field theory, and is heavily reliant on the fine structure constant and its applications.

The reason this strategy works so well is that electromagnetism has two important properties.

The combination of these factors guarantees that we can calculate the strength of any electromagnetic interaction between any two particles in the Universe more and more accurately by adding more terms to our quantum field theory calculations: by going to higher and higher loop-orders.

Electromagnetism, of course, isn't the only force that matters when it comes to Standard Model particles. There's also the weak nuclear force, which is mediated by three force-carrying particles: the two W bosons and the Z boson. This is a very short-range force, but fortunately, the strength of the weak coupling is still small and the weak interactions are suppressed by the large masses of the W and Z bosons. Even though it's a little more complicated, the same method of expanding to higher-order loop diagrams works for computing the weak interactions, too. (The Higgs is also similar.)

At high energies (corresponding to small distances), the strong force's interaction strength drops to zero. At large distances, it increases rapidly. This idea is known as 'asymptotic freedom,' which has been experimentally confirmed to great precision.

But the strong nuclear force is different. Unlike all of the other Standard Model interactions, the strong force gets weaker at short distances rather than stronger: it acts like a spring rather than like gravity. We call this property asymptotic freedom: where the attractive or repulsive force between charged particles approaches zero as they approach zero distance from one another. This, coupled with the large coupling strength of the strong interaction, makes this common loop-order method wildly inappropriate for the strong interaction. The more diagrams you calculate, the less accurate you get.
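Asymptotic freedom can be summarized, as standard background rather than anything computed in the article, by the one-loop running of the strong coupling with the momentum scale Q (large Q corresponds to short distances):

\[
\alpha_s(Q^2) \;\approx\; \frac{12\pi}{\left(33 - 2 n_f\right)\,\ln\!\left(Q^2/\Lambda_{\text{QCD}}^2\right)},
\]

where n_f is the number of quark flavors light enough to matter at that scale and Λ_QCD is a few hundred MeV. The coupling shrinks only logarithmically at high Q and blows up as Q approaches Λ_QCD, which is exactly the regime relevant to the muon's magnetic moment, and why the loop-by-loop expansion fails there.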

This doesn't mean we have no recourse at all in making predictions for the strong interactions, but it means we have to take a different approach to our normal one. Either we can try to calculate the contributions of the particles and fields under the strong interaction non-perturbatively, such as via the methods of Lattice QCD (where QCD stands for quantum chromodynamics, the quantum field theory governing the strong force), or we can try to use the results from other experiments to estimate the strength of the strong interactions under a different scenario.

As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed.

If what we were able to measure, from other experiments, was exactly the thing we don't know in the Muon g-2 calculation, there would be no need for theoretical uncertainties; we could just measure the unknown directly. If we didn't know a cross-section, a scattering amplitude, or a particular decay property, those are things that particle physics experiments are exquisite at determining. But for the needed strong force contributions to the spin magnetic moment of the muon, these are properties that are indirectly inferred from our measurements, not directly measured. There's always a big danger that a systematic error is causing the mismatch between theory and observation from our current theoretical methods.

On the other hand, the Lattice QCD method is brilliant: it imagines space as a grid-like lattice in three dimensions. You put the two particles down on your lattice so that they're separated by a certain distance, and then use a set of computational techniques to add up the contribution from all the quantum fields and particles that we have. If we could make the lattice infinitely large, and the spacing between the points on the lattice infinitely small, we'd get the exact answer for the contributions of the strong force. Of course, we only have finite computational power, so the lattice spacing can't go below a certain distance, and the size of the lattice doesn't go beyond a certain range.
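Here is a toy sketch of that continuum-extrapolation logic. Everything in it, including the function name and the fake observable, is invented for illustration and has nothing to do with real lattice QCD software; the point is simply that one computes the same quantity at several finite lattice spacings and extrapolates the results to zero spacing.

```python
# Toy illustration of a continuum extrapolation, NOT real lattice QCD.
# Pretend each "simulation" returns the true value plus a discretization
# error that grows with the square of the lattice spacing a.
import numpy as np

TRUE_VALUE = 1.000  # hypothetical continuum answer for some observable


def fake_lattice_simulation(a):
    """Stand-in for an expensive lattice computation at spacing a."""
    return TRUE_VALUE + 0.35 * a**2  # leading O(a^2) lattice artifact


spacings = np.array([0.12, 0.09, 0.06])  # lattice spacings (in fm, say)
results = np.array([fake_lattice_simulation(a) for a in spacings])

# Fit result = c0 + c1 * a^2 and read off c0 as the continuum-limit value.
c1, c0 = np.polyfit(spacings**2, results, deg=1)
print(f"continuum extrapolation: {c0:.4f}  (true value: {TRUE_VALUE})")
```

In real calculations the extrapolation is done for physical quantities such as the hadronic contribution to the muon's magnetic moment, with far more sophisticated fits and error budgets, but the finite-spacing, finite-volume trade-off is the same.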

There comes a point where our lattice gets large enough and the spacing gets small enough, however, that we'll get the right answer. Certain calculations have already yielded to Lattice QCD that haven't yielded to other methods, such as the calculations of the masses of the light mesons and baryons, including the proton and neutron. After many attempts at predicting what the strong force's contributions to the g-2 measurement of the muon ought to be over the past few years, the uncertainties are finally dropping to become competitive with the experimental ones. If the latest group to perform that calculation has finally gotten it right, there is no longer a tension with the experimental results.

The R-ratio method (red) for calculating the muon's magnetic moment has led many to note the mismatch with experiment (the 'no new physics' range). But recent improvements in Lattice QCD (green points, and particularly the top, solid green point) not only have reduced the uncertainties substantially, but favor an agreement with experiment and a disagreement with the R-ratio method.

Assuming that the experimental results from the Muon g-2 collaboration hold up (and there's every reason to believe they will, including the solid agreement with the earlier Brookhaven results), all eyes will turn towards the theorists. We have two different ways of calculating the expected value of the muon's spin magnetic moment, where one agrees with the experimental values (within the errors) and the other does not.

Will the Lattice QCD groups all converge on the same answer, and demonstrate that not only do they know what they're doing, but that there's no anomaly after all? Or will Lattice QCD methods reveal a disagreement with the experimental values, just as the other theoretical method we have, the one that uses experimental inputs instead of theoretical calculations, presently disagrees with them so significantly?

It's far too early to say, but until we have a resolution to this important theoretical issue, we won't know what it is that's broken: the Standard Model, or the way we're presently calculating the same quantities we're measuring to unparalleled precisions.

Follow this link:

The Big Theoretical Physics Problem At The Center Of The 'Muon g-2' Puzzle - Forbes

Posted in Quantum Physics | Comments Off on The Big Theoretical Physics Problem At The Center Of The ‘Muon g-2’ Puzzle – Forbes

The God Equation Review: One String Theory to Rule Them All – The Wall Street Journal

Posted: at 6:36 am

What's God got to do with it? Given that the majority of physicists are agnostics at best, I have always found it puzzling that my community is so obsessed with God's mind, whether or not God plays dice, the God particle and seeing God, and now with Michio Kaku's The God Equation. Title notwithstanding, this is an excellent book written by a masterful science communicator elaborating on a subject that is his research home turf: superstring theory. The prolific author of multiple popular science books, Mr. Kaku is a futurist, broadcaster and professor of theoretical physics at the City University of New York. He is also the host of the wildly successful and popular weekly radio program Science Fantastic. If there is anyone who can demystify the esoteric mathematics and physics of string theory, it is he. And in this wonderful little book, that is precisely what he does: explain in clear and simple terms the conceptual breakthroughs, the blind alleys and the unanswered questions in the search for a grand unified theory of everything. What I like best is that he remains open to the possibility that there may ultimately not be a single unifying theory after all, encoded into a single tidy equation.

The dream to synthesize all known physical forces has been a longstanding challenge; many physicists, including Einstein, have embarked on the pursuit and failed. The four fundamental forces of nature are gravity, electromagnetism, the weak force responsible for radioactive decay of some nuclei, and the strong force binding the atomic nucleus together.

When Newton discovered the laws of gravity, he accomplished the phenomenal task of connecting the celestial and terrestrial with a universal theory of gravitation that accounted both for a falling apple and the orbit of the Earth around the sun. Subsequently, as physicists uncovered additional fundamental forces in nature (electromagnetism, the weak force and the strong force), they set about combining all of them into ever-grander theories. Mr. Kaku traces each of these pivotal moments of unification, describing the key insights that permitted those breakthroughs and bringing us to the precipice, where we currently stand, stymied. The ultimate challenge, to unify gravity and quantum mechanics, is yet to be accomplished. To highlight how momentous unification would be, Mr. Kaku ends the book with a quote from Stephen Hawking: it would be "the ultimate triumph of human reason, for then we would know the mind of God." Hence, I suppose, the God equation.

Mr. Kaku argues persuasively that every time physicists have decoded one of the four fundamental forces of the universe, it not only revealed the secrets of nature, but radically revolutionized society too. He connects Newton's laws to the invention of the steam engine and the launch of the Industrial Revolution, while Michael Faraday's later discovery of electric and magnetic fields powered the electrical age. Mr. Kaku offers a superb description of how electrical transmission works, connecting the dots from Faraday's equations to Edison's and Tesla's experiments and then to our illuminated, electrified life today. Eventually we come to the revolution of quantum mechanics, the description of matter on the smallest scale, which shook the very core of physics. The subsequent applications that came out of the quantum revolution, the transistor and laser, ushered in a world dependent on electronics.

The God Equation dazzles in its account of the unfinished quest for a grand unified theory. As Mr. Kaku describes, controversies have dogged the unified theory project from the very start. Faraday was the first to propose a unification of gravity and electromagnetism. In 1832 he conducted a set of experiments from London's Waterloo Bridge and dropped magnets, hoping to find some quantifiable effect of gravity. Alas, the experiment failed, though he remained convinced that the effect existed, perhaps at an undetectable level. In 1947, one of the founders of quantum mechanics, Erwin Schrödinger, famously held a press conference to announce victory: he claimed to have a unified field theory. He did not; embarrassingly, his version could not even explain the nature of electrons and the atom. The other illustrious co-founders of quantum mechanics, Werner Heisenberg and Wolfgang Pauli, followed suit and failed as well. The first real major step came with the discovery of quantum electrodynamics (QED), which provided a quantum theory of electrons and light. Then came the connection to the best current description of the strong nuclear force with the development of quantum chromodynamics (QCD). The standard model of particle physics that consolidates the zoo of subatomic particles emerged from these developments, bringing us to a theory of almost everything. The quest to unify all four fundamental forces in the universe has unfortunately stalled here. I write this on the heels of an announcement by Fermi National Accelerator Laboratory of a potential discovery, a likely hint for the existence of a possible additional force of nature, which, if it stands up, reveals the existence of physics beyond the currently accepted standard model.

Read the rest here:

The God Equation Review: One String Theory to Rule Them All - The Wall Street Journal

Posted in Quantum Physics | Comments Off on The God Equation Review: One String Theory to Rule Them All – The Wall Street Journal

615 Million Euros Awarded to Quantum Delta NL for Quantum Research in the Netherlands – HPCwire

Posted: at 6:36 am

April 9, 2021 Quantum Delta NL, a research programme in which Leiden University participates, has been awarded 615 million euros from the National Growth Fund to help develop the Netherlands into a top player in quantum technology. This has been announced at the presentation of the honoured proposals in The Hague.

Quantum Delta NL is a cooperation of companies and research institutes in which the research has been organised in five hubs at the universities of Delft, Leiden, Amsterdam, Twente and Eindhoven.

The research groupApplied Quantum Algorithms (aQa)at the Leiden institutes for physics and computer science develops quantum algorithms for chemical and material science applications, in cooperation with Google, Shell, Volkswagen and Total.

Great enthusiasm

"Research into quantum computing has been going on for twenty years, bringing real-world application ever closer," says Carlo Beenakker, professor in Theoretical Physics and Deputy Chair of Quantum Delta NL. "I see great enthusiasm in my students to apply abstract concepts from quantum physics to the solution of practical problems. This is the revolutionary technology of their generation."

The goal of aQa is to make quantum algorithms practically applicable, pertaining to questions of societal and economic relevance. "We cooperate closely with our industrial partners to render these large investments as useful as possible," says computer science researcher Vedran Dunjko. Recently, he published in the journal Nature about artificial intelligence implemented through quantum computers.

Quantum technology

Quantum Delta NL's ambition is to position the Netherlands as a Silicon Valley for quantum technology in Europe during the coming seven years. The programme provides for the further development of the quantum computer and the quantum internet, which will be open for end users in business and societal sectors, including education.

It aims for a flourishing ecosystem where talent is fostered at all levels, and where cooperation happens across institutional borders to develop a new European high-tech industry.

Source: Leiden University

The rest is here:

615 Million Euros Awarded to Quantum Delta NL for Quantum Research in the Netherlands - HPCwire

Posted in Quantum Physics | Comments Off on 615 Million Euros Awarded to Quantum Delta NL for Quantum Research in the Netherlands – HPCwire

The Disordered Cosmos review: An insider take on physics and injustice – New Scientist News

Posted: at 6:36 am

A bold new book by Chanda Prescod-Weinstein combines her love of physics with a strong analysis of the inequalities rife in science

By Anna Demming

The Disordered Cosmos argues that science needs close scrutiny

SOPA Images/LightRocket via Getty Images

The Disordered Cosmos: A journey into dark matter, spacetime, & dreams deferred

Chanda Prescod-Weinstein


Bold Type Books

THIS isn't just a popular science book. There is plenty of physics in it: from the big bang and relativity to particle physics, it is all there. But attention rapidly shifts to the author's other preoccupation: social injustice, such as inequalities, prejudices and the kind of social grooming and timidity that also hinder us from calling out these vices.

The author of The Disordered Cosmos is Chanda Prescod-Weinstein, assistant professor of physics and astronomy, a core faculty member in women's and gender studies at the University of New Hampshire and a New Scientist columnist. This gives her an excellent position from which she can both engage in rich detail with science's most fascinating theories and grapple with human and inhuman social failings.

She works patiently to disabuse readers of the delusion that their favourite pop-sci ideas, those lofty products of cerebral ingenuity and academic brilliance, are immune from the prejudices pervading society.

Prescod-Weinstein's heritage is a mix of Black American, Black Caribbean, Eastern European Jewish and Jewish American histories. She identifies as agender, and has a history of debilitating health conditions. The inequalities she covers in her book are issues she has dealt with at first hand.

Some readers may question whether, say, there are indeed damaging racist undertones in the term dark matter, or in the way colour analogies are used in quantum chromodynamics, a theory sometimes referred to in textbooks as colored physics. But it is hard to dismiss the broader issues Prescod-Weinstein argues: inequalities around race, gender, class, nationality and disability.

Diversity and inclusivity are today's buzzwords, but she quotes Jin Haritaworn and C. Riley Snorton in their appraisal of trans politics theory, and questions whether it is enough for the scientific establishment to aim to be inclusive if what people are included in retains what she calls "a strong relationship with totalitarian, racialized structures."


Despite the obvious conflict between her love of physics and her outrage at some of the social and personal injustices she sees in institutions propagating physics, the different focuses of the book aren't necessarily competing for airtime. And Prescod-Weinstein often uses physics explanations as a springboard or analogy for the social issues she wants to discuss.

Take the description of non-binary wave-particle duality in the double-slit experiment, which precedes her dissection of attitudes to people identifying as non-binary or otherwise. "It should be obvious that when you refuse to respect someone's pronouns you are making a statement about what's important and what is not," she writes. "To tell students that it is too difficult is an egregious, brazen lie."

Although there are times when discussions of minority politics get quite dense, perhaps more so than the physics, on the whole, the book feels very intimate; I sometimes felt like I was reading her diary. This can be a treat, such as when she is musing over some charming quirk of particle physics: "I tend to think of bosons as pep squad particles: they are happy to share the same quantum energy state ... Fermions? Not so much."

At other times, it gets more uncomfortable, as when she lays bare episodes of anguished introspection, self-doubt and emotional fatigue caused by traumatising experiences. It is all recounted to serve a point, but is incredibly personal and confiding.

So no, her book isn't a typical popular science read and she makes some comments that may prove unpopular. Beyond the already ardently persuaded, it will be interesting to see how much a broader readership may be convinced by the arguments she presents.


Read the original post:

The Disordered Cosmos review: An insider take on physics and injustice - New Scientist News

Posted in Quantum Physics | Comments Off on The Disordered Cosmos review: An insider take on physics and injustice – New Scientist News

Course explores ‘Magic, Witchcraft and Healing’ > News > USC Dornsife – USC Dornsife College of Letters, Arts and Sciences

Posted: at 6:36 am

USC Dornsife's Thomas Ward of anthropology covers Vodou, witchcraft and shamanism, while focusing on white magic used for holistic healing.

Perched on a shelf in Thomas Ward's home office is a set of Vodou dolls.

Curiously, they're not in the shape of human beings but are little round balls topped with conical hats. Filled with dense soil and wrapped tightly with black and red ribbon, they're as heavy as paperweights.

Ward, associate professor (teaching) of anthropology at USC Dornsife College of Letters, Arts and Sciences, shows the Vodou artifacts to students in his spring semester course Magic, Witchcraft and Healing (ANTH 373).

"They're beautiful objects," he says of the Vodou dolls, which he brought back from a trip to Haiti in 1983.

"They can be used for healing and they can also be used for the dark arts."

But Ward is no Severus Snape from the Harry Potter franchise.

"Our class explores the magical components of healing, and while witchcraft can be used for healing or harm, our class focuses only on white magic, rather than black magic, which is believed to cause harm."

That doesn't stop some of his students expressing curiosity about the dark side.

"They ask me, 'Are curses real? Can a witch put a curse on you?'" Ward says. "From the indigenous perspective, absolutely. Most non-Western cultures believe that curses can be used for harm."

But anthropologists and Western scientists would argue that it's the power of belief that causes people who know they've been cursed to have accidents.

"It depends on who you are, where you are, what culture you're in, your own belief system," Ward says.

Magic and healing

Ward's course explores the cross-cultural aspects of healing in non-Western traditions.

In the anthropological, cross-cultural context, the term magic is used for non-Western methods of healing or other ritual practices. Ward defines magic in the context of this course as "unexplained causality."

"Something happens and it causes something else to happen and we see the result, but we don't know exactly how it works," he says, noting that the term is used even in quantum physics to explain causal relationships that we don't completely understand.

I put a spell on you

Haitian Vodou dolls are often used for healing, according to anthropologist Thomas Ward. (Photo: Courtesy of Thomas Ward.)

"What's fascinating about witchcraft," Ward says, "is just how complex and pervasive the idea is, cross-culturally, from Asia to the Americas and Africa. Why have humans for so many thousands of years and from so many different cultures and locations held these beliefs? Is it possible that there's something more to it that we're not seeing?"

The course explores a number of different historical and geographical aspects of witchcraft: the Salem Witch Trials in New England; the Azande, an ethnic group in Southern Sudan; Vodou tradition and practices in Haiti and Brooklyn, New York; curanderismo, traditional healing of the body, mind or soul by shamans or spiritual healers in the Southwest United States; and the genesis of the Wiccan religion in the United Kingdom and its spread to the U.S.

Students read E.E. Evans-Pritchard's ethnography Witchcraft, Oracles and Magic Among the Azande, which explores how witchcraft is used to explain causality. Evans-Pritchard was one of the first anthropologists to point out the logic of witchcraft beliefs and to argue that witchcraft explains unfortunate circumstance in situations that Western science would simply put down to being in the wrong place at the wrong time.

For instance, if you are sitting on a verandah and one of the columns breaks and the roof falls and hits you on the head, Western scientists might tell you that termites were eating the wood, causing the column to fall. The questions that Western science doesn't answer are: why did the roof fall at that particular moment in time, and why me?

"Western science would just chalk it up to unfortunate circumstance," Ward says. With its mystical aspect, witchcraft fills that gap, providing the causal link and explaining why it happened at that particular place and time and to that particular individual.

Ward says one of the most fascinating aspects of non-Western healing practices is their holistic focus. These ideas are starting to gain traction in the U.S. with more emphasis on social relationships, community and spirituality.

In Vodou, for instance, practitioners look not only at physical and mental health, but also spiritual health and a person's relationship to the lwa, or spirits. The role of Vodou practitioners, known as servants of the spirits, is to communicate with the lwa, make them offerings and ask for intercession.

Ward's course also explores the Wiccan religion, founded by Gerald Gardner in the mid-20th century. Gardner did a popular BBC interview that stirred up a great deal of controversy and interest in modern witchcraft.

The Wiccan movement spread from the U.K. to the U.S., where it experienced its greatest growth in the 1960s and 70s, with the civil rights and women's movements and a greater willingness to experiment and explore non-Western or ancient ways of being and healing.

"Wicca's emphasis on nature and the sacred feminine made it, in some ways, the response to, and maybe rebellion against, the patriarchal elements of Christianity," Ward says.

The three big takeaways

The focus of the course, he says, is on expanding our horizons while remaining respectful and humble about other people's traditions. Ward wants students to think about what healing means from a holistic perspective.

"We tend to think of healing in the West as mainly physical, emotional and psychological, whereas in the non-Western context, healing means to restore a balance or a sense of wholeness in an individual and his or her environment, including family, friends, nature, spirits and ancestors. So, healing is much more comprehensive: it restores a person's health, happiness and wholeness."

Ward hopes students will take away from the course a greater sense of open-mindedness and curiosity, but also humility. "Magic is the word that we use to keep us humble, by saying that things are happening that we don't fully understand. We need to respect other people's views of causality and the etiology of illness and broaden our perspective about what is possibly causing various illnesses."

Ward says the course also peels back the layers of contemporary celebrations like Halloween that depict witches in a stereotypical and sometimes humorous way.

"There's a huge, very complicated worldwide history of what witches are and what they do," Ward says. "There should be respect for these traditions, but also understanding."

Read the original:

Course explores 'Magic, Witchcraft and Healing' > News > USC Dornsife - USC Dornsife College of Letters, Arts and Sciences

Posted in Quantum Physics | Comments Off on Course explores ‘Magic, Witchcraft and Healing’ > News > USC Dornsife – USC Dornsife College of Letters, Arts and Sciences

How matter's hidden complexity unleashed the power of nuclear physics – Science News Magazine

Posted: at 6:36 am

Matter is a lush tapestry, woven from a complex assortment of threads. Diverse subatomic particles weave together to fabricate the universe we inhabit. But a century ago, people believed that matter was so simple that it could be constructed with just two types of subatomic fibers: electrons and protons. That vision of matter was a no-nonsense plaid instead of an ornate brocade.

Physicists of the 1920s thought they had a solid grasp on what made up matter. They knew that atoms contained electrons surrounding a positively charged nucleus. And they knew that each nucleus contained a number of protons, positively charged particles identified in 1919. Combinations of those two particles made up all of the matter in the universe, it was thought. That went for everything that ever was or might be, across the vast, unexplored cosmos and at home on Earth.

The scheme was appealingly tidy, but it swept under the rug a variety of hints that all was not well in physics. Two discoveries in one revolutionary year, 1932, forced physicists to peek underneath the carpet. First, the discovery of the neutron unlocked new ways to peer into the hearts of atoms and even split them in two. Then came news of the positron, identical to the electron but with the opposite charge. Its discovery foreshadowed many more surprises to come. Additional particle discoveries ushered in a new framework for the fundamental bits of matter, now known as the standard model.

That annus mirabilis, or miraculous year, also set physicists' sights firmly on the workings of atoms' hearts: how they decay, transform and react. Discoveries there would send scientists careening toward a most devastating technology: nuclear weapons. "The atomic bomb cemented the importance of science and science journalism in the public eye," says nuclear historian Alex Wellerstein of the Stevens Institute of Technology in Hoboken, N.J. "The atomic bomb becomes the ultimate proof that indeed this is world-changing stuff."

Physicists of the 1920s embraced a particular type of conservatism. Embedded deep in their psyches was a reluctance to declare the existence of new particles. Researchers stuck to the status quo of matter composed solely of electrons and protons, an idea dubbed the two-particle paradigm that held until about 1930. In that time period, says historian of science Helge Kragh of the University of Copenhagen, "I'm quite sure that not a single mainstream physicist came up with the idea that there might exist more than two particles." The utter simplicity of two particles explaining everything in nature's bounty was so appealing to physicists' sensibilities that they found the idea difficult to let go of.

The paradigm held back theoretical descriptions of the neutron and the positron. To propose the existence of other particles was widely regarded as reckless and contrary to the spirit of Occam's razor, science biographer Graham Farmelo wrote in Contemporary Physics in 2010.

Still, during the early 20th century, physicists were investigating a few puzzles of matter that would, after some hesitation, inevitably lead to new particles. These included unanswered questions about the identities and origins of energetic particles called cosmic rays, and why chemical elements occur in different varieties called isotopes, which have similar chemical properties but varying masses.


New Zealand-born British physicist Ernest Rutherford stopped just short of positing a fundamentally new particle in 1920. He realized that neutral particles in the nucleus could explain the existence of isotopes. Such particles came to be known as neutrons. But rather than proposing that neutrons were fundamentally new, he thought they were composed of protons combined in close proximity with electrons to make neutral particles. He was correct about the role of the neutron, but wrong about its identity.

Rutherford's idea was convincing, British physicist James Chadwick recounted in a 1969 interview: "The only question was how the devil could one get evidence for it." The neutron's lack of electric charge made it a particularly wily target. In between work on other projects, Chadwick began hunting for the particles at the University of Cambridge's Cavendish Laboratory, then led by Rutherford.

Chadwick found his evidence in 1932. He reported that mysterious radiation emitted when beryllium was bombarded with the nuclei of helium atoms could be explained by a particle with no charge and with a mass similar to the proton's. In other words, a neutron. Chadwick didn't foresee the important role his discovery would play. "I am afraid neutrons will not be of any use to anyone," he told the New York Times shortly after his discovery.

Physicists grappled with the neutron's identity over the following years before accepting it as an entirely new particle, rather than the amalgamation that Rutherford had suggested. For one, a proton-electron mash-up conflicted with the young theory of quantum mechanics, which characterizes physics on small scales. The Heisenberg uncertainty principle, which states that if the location of an object is well known, its momentum cannot be, suggests that an electron confined within a nucleus would have an unreasonably large energy.
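That energy argument can be made quantitative with a one-line estimate, a standard textbook calculation rather than anything from this article: confining an electron to a nucleus-sized region of a few femtometers forces on it a large momentum uncertainty, and hence an energy far greater than anything seen coming out of nuclei.

\[
\Delta p \gtrsim \frac{\hbar}{\Delta x}
\quad\Rightarrow\quad
E \approx \Delta p\, c \sim \frac{\hbar c}{\Delta x} \approx \frac{197\ \text{MeV}\cdot\text{fm}}{5\ \text{fm}} \approx 40\ \text{MeV},
\]

tens of times larger than the roughly 1 MeV electron energies typical of beta decay.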

And certain nuclei's spins, a quantum mechanical measure of angular momentum, likewise suggested that the neutron was a full-fledged particle, as did improved measurements of the particle's mass.

Physicists also resisted the positron, until it became difficult to ignore.

The positron's 1932 detection had been foreshadowed by the work of British theoretical physicist Paul Dirac. But it took some floundering about before physicists realized the meaning of his work. In 1928, Dirac formulated an equation that combined quantum mechanics with Albert Einstein's 1905 special theory of relativity, which describes physics close to the speed of light. Now known simply as the Dirac equation, the expression explained the behavior of electrons in a way that satisfied both theories.

But the equation suggested something odd: the existence of another type of particle, one with the opposite electric charge. At first, Dirac and other physicists clung to the idea that this charged particle might be the proton. But this other particle should have the same mass as the electron, and protons are almost 2,000 times as heavy as electrons. In 1931, Dirac proposed a new particle, with the same mass as the electron but with opposite charge.

Meanwhile, American physicist Carl Anderson of Caltech, independent of Dirac's work, was using a device called a cloud chamber to study cosmic rays, energetic particles originating in space. Cosmic rays, discovered in 1912, fascinated scientists, who didn't fully understand what the particles were or how they were produced.

Within Anderson's chamber, liquid droplets condensed along the paths of energetic charged particles, a result of the particles ionizing gas molecules as they zipped along. In 1932, the experiments revealed positively charged particles with masses equal to an electron's. Soon, the connection to Dirac's theory became clear.

Science News Letter, the predecessor of Science News, had a hand in naming the newfound particle. Editor Watson Davis proposed "positron" in a telegram to Anderson, who had independently considered the moniker, according to a 1933 Science News Letter article (SN: 2/25/33, p. 115). In a 1966 interview, Anderson recounted considering Davis' idea during a game of bridge, and finally going along with it. He later regretted the choice, saying in the interview, "I think that's a very poor name."

The discovery of the positron, the antimatter partner of the electron, marked the advent of antimatter research. Antimatter's existence still seems baffling today. Every object we can see and touch is made of matter, making antimatter seem downright extraneous. Antimatter's lack of relevance to daily life and the term's liberal use in Star Trek mean that many nonscientists still envision it as the stuff of science fiction. But even a banana sitting on a counter emits antimatter, periodically spitting out positrons in radioactive decays of the potassium within.

Physicists would go on to discover many other antiparticles, all of which are identical to their matter partners except for an opposite electric charge, including the antiproton in 1955. The subject still keeps physicists up at night. The Big Bang should have produced equal amounts of matter and antimatter, so researchers today are studying how antimatter became rare.

In the 1930s, antimatter was such a leap that Dirac's hesitation to propose the positron was understandable. Not only would the positron break the two-particle paradigm, but it would also suggest that electrons had mirror images with no apparent role in making up atoms. When asked, decades later, why he had not predicted the positron after he first formulated his equation, Dirac replied, "pure cowardice."

But by the mid-1930s, the two-particle paradigm was out. Physicists' understanding had advanced, and their austere vision of matter had to be jettisoned.

Radioactive decay hints that atoms hold stores of energy locked within, ripe for the taking. Although radioactivity was discovered in 1896, that energy long remained an untapped resource. The neutron's discovery in the 1930s would be key to unlocking that energy, for better and for worse.

The neutron's discovery opened up scientists' understanding of the nucleus, giving them new abilities to split atoms into two or transform them into other elements. Developing that nuclear know-how led to useful technologies, like nuclear power, but also devastating nuclear weapons.

Just a year after the neutron was found, Hungarian-born physicist Leo Szilard envisioned using neutrons to split atoms and create a bomb. "[I]t suddenly occurred to me that if we could find an element which is split by neutrons and which would emit two neutrons when it absorbed one neutron, such an element, if assembled in sufficiently large mass, could sustain a nuclear chain reaction, liberate energy on an industrial scale, and construct atomic bombs," he later recalled. It was a fledgling idea, but prescient.

Because neutrons lack electric charge, they can penetrate atoms' hearts. In 1934, Italian physicist Enrico Fermi and colleagues started bombarding dozens of different elements with neutrons, producing a variety of new, radioactive isotopes. Each isotope of a particular element contains a different number of neutrons in its nucleus, with the result that some isotopes may be radioactive while others are stable. Fermi had been inspired by another striking discovery of the time. In 1934, French chemists Frédéric and Irène Joliot-Curie reported the first artificially created radioactive isotopes, produced by bombarding elements with helium nuclei, called alpha particles. Now, Fermi was doing something similar, but with a more penetrating probe.

There were a few scientific missteps on the way to understanding the results of such experiments. A major goal was to produce brand-new elements, those beyond the last known element on the periodic table at that time: uranium. After blasting uranium with neutrons, Fermi and colleagues reported evidence of success. But that conclusion would turn out to be incorrect.

German chemist Ida Noddack had an inkling that all was not right with Fermi's interpretation. She came close to the correct explanation for his experiments in a 1934 paper, writing: "When heavy nuclei are bombarded by neutrons, it is conceivable that the nucleus breaks up into several large fragments." But Noddack didn't follow up on the idea. "She didn't provide any kind of supporting calculation and nobody took it with much seriousness," says physicist Bruce Cameron Reed of Alma College in Michigan.

In Germany, physicist Lise Meitner and chemist Otto Hahn had also begun bombarding uranium with neutrons. But Meitner, an Austrian of Jewish heritage in increasingly hostile Nazi Germany, was forced to flee in July 1938. She had an hour and a half to pack her suitcases. Hahn and a third member of the team, chemist Fritz Strassmann, continued the work, corresponding from afar with Meitner, who had landed in Sweden. The results of the experiments were puzzling at first, but when Hahn and Strassmann reported to Meitner that barium, a much lighter element than uranium, was a product of the reaction, it became clear what was happening. The nucleus was splitting.

Meitner and her nephew, physicist Otto Frisch, collaborated to explain the phenomenon, a process the pair would call fission. Hahn received the 1944 Nobel Prize in chemistry for the discovery of fission, but Meitner never won a Nobel, in a decision now widely considered unjust. Meitner was nominated for the prize (sometimes in physics, other times in chemistry) a whopping 48 times, most after the discovery of fission.

"Her peers in the physics community recognized that she was part of the discovery," says chemist Ruth Lewin Sime of Sacramento City College in California, who has written extensively about Meitner. "That included just about anyone who was anyone."

Word of the discovery soon spread, and on January 26, 1939, renowned Danish physicist Niels Bohr publicly announced at a scientific meeting that fission had been achieved. The potential implications were immediately apparent: Fission could unleash the energy stored in atomic nuclei, potentially resulting in a bomb. A Science News Letter story describing the announcement attempted to dispel any concerns the discovery might raise. The article, titled "Atomic energy released," reported that scientists "are fearful lest the public become worried about a revolution in civilization as a result of their researches, such as the suggested possibility that the atomic energy may be used as some super-explosive, or as a military weapon" (SN: 2/11/39, p. 86). But downplaying the catastrophic implications didn't prevent them from coming to pass.

The question of whether a bomb could be created rested, once again, on neutrons. For fission to ignite an explosion, it would be necessary to set off a chain reaction. That means each fission would release additional neutrons, which could then go on to induce more fissions, and so on. Experiments quickly revealed that enough neutrons were released to make such a chain reaction feasible.
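
A back-of-the-envelope sketch in Python (not from the article; the multiplication factor of 2, the 80 generations, and the roughly 200 MeV released per fission are illustrative assumptions) shows why such a chain reaction escalates so quickly:

# Illustrative chain-reaction growth; all numbers are assumptions, not measured values.
e_fission_j = 200e6 * 1.602e-19   # ~200 MeV per fission, converted to joules
k = 2                             # assumed neutrons per fission that go on to cause new fissions
generations = 80                  # assumed number of doubling generations

total_fissions = sum(k**g for g in range(generations))
total_energy_j = total_fissions * e_fission_j
print(f"total energy ~ {total_energy_j:.1e} J, "
      f"about {total_energy_j / 4.184e12:.0f} kilotons of TNT equivalent")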

In October 1939, soon after Germany invaded Poland at the start of World War II, an ominous letter from Albert Einstein reached President Franklin Roosevelt. Composed at the urging of Szilard, by then at Columbia University, the letter warned, "it is conceivable that extremely powerful bombs of a new type may thus be constructed." American researchers were not alone in their interest in the topic: German scientists, the letter noted, were also on the case.

Roosevelt responded by setting up a committee to investigate. That step would be the first toward the U.S. effort to build an atomic bomb, the Manhattan Project.

On December 2, 1942, Fermi, who by then had immigrated to the United States, and 48 colleagues achieved the first controlled, self-sustaining nuclear chain reaction in an experiment with a pile of uranium and graphite at the University of Chicago. Science News Letter would later call it an event ranking with man's first prehistoric lighting of a fire. While the physicists celebrated their success, the possibility of an atomic bomb was closer than ever. "I thought this day would go down as a black day in the history of mankind," Szilard recalled telling Fermi.

The experiment was a key step in the Manhattan Project. And on July 16, 1945, at about 5:30 a.m., scientists led by J. Robert Oppenheimer detonated the first atomic bomb in the New Mexico desert: the Trinity test.

It was a striking sight, as physicist Isidor Isaac Rabi recalled in his 1970 book, Science: The Center of Culture. "Suddenly, there was an enormous flash of light, the brightest light I have ever seen or that I think anyone has ever seen. It blasted; it pounced; it bored its way right through you. It was a vision which was seen with more than the eye. It was seen to last forever. You would wish it would stop; although it lasted about two seconds. Finally it was over, diminishing, and we looked toward the place where the bomb had been; there was an enormous ball of fire which grew and grew and it rolled as it grew; it went up into the air, in yellow flashes and into scarlet and green. It looked menacing. It seemed to come toward one. A new thing had just been born; a new control; a new understanding of man, which man had acquired over nature."

Physicist Kenneth Bainbridge put it more succinctly: "Now we are all sons of bitches," he said to Oppenheimer in the moments after the test.

The bomb's construction was motivated by the fear that Germany would obtain it first. But the Germans weren't even close to producing a bomb when they surrendered in May 1945. Instead, the United States' bombs would be used on Japan. On August 6, 1945, the United States dropped an atomic bomb on Hiroshima, followed by another on August 9 on Nagasaki. In response, Japan surrendered. More than 100,000 people, and perhaps as many as 210,000, died as a result of the two attacks.

"I saw a blinding bluish-white flash from the window. I remember having the sensation of floating in the air," survivor Setsuko Thurlow recalled in a speech given upon the awarding of the 2017 Nobel Peace Prize to the International Campaign to Abolish Nuclear Weapons. She was 13 years old when the bomb hit Hiroshima. "Thus, with one bomb my beloved city was obliterated. Most of its residents were civilians who were incinerated, vaporized, carbonized."

Humankind entered a new era, with new dangers to the survival of civilization. "With nuclear physics, you have something that within 10 years goes from being this arcane academic research area to something that bursts on the world stage and completely changes the relationship between science and society," Reed says.

In 1949, the Soviet Union set off its first nuclear weapon, kicking off the decades-long nuclear rivalry with the United States that would define the Cold War. And then came a bigger, more dangerous weapon: the hydrogen bomb. Whereas atomic bombs are based on nuclear fission, H-bombs harness nuclear fusion, the melding of atomic nuclei, in conjunction with fission, resulting in much larger blasts. The first H-bomb, detonated by the United States in 1952, was 1,000 times as powerful as the bomb dropped on Hiroshima. Within less than a year, the Soviet Union also tested an H-bomb. The H-bomb had been called "a weapon of genocide" by scientists serving on an advisory committee for the U.S. Atomic Energy Commission, which had previously recommended against developing the technology.

Fears of the devastation that would result from an all-out nuclear war have fed repeated attempts to rein in nuclear weapons stockpiles and tests. Since the signing of the Comprehensive Nuclear Test Ban Treaty in 1996, the United States, Russia and many other countries have maintained a testing moratorium. However, North Korea tested a nuclear weapon as recently as 2017.

Still, the dangers of nuclear weapons were accompanied by a promising new technology: nuclear power.

In 1948, scientists first demonstrated that a nuclear reactor could harness fission to produce electricity. The X-10 Graphite Reactor at Oak Ridge National Laboratory in Tennessee generated steam that powered an engine that lit up a small Christmas lightbulb. In 1951, Experimental Breeder Reactor-I at Idaho National Laboratory near Idaho Falls produced the first usable amount of electricity from a nuclear reactor. The world's first commercial nuclear power plants began to switch on in the mid- and late 1950s.

But nuclear disasters dampened enthusiasm for the technology, including the 1979 Three Mile Island accident in Pennsylvania and the 1986 Chernobyl disaster in Ukraine, then part of the Soviet Union. In 2011, the disaster at the Fukushima Daiichi power plant in Japan rekindled society's smoldering nuclear anxieties. But today, in an era when the effects of climate change are becoming alarming, nuclear power is appealing because it emits no greenhouse gases directly.

And humankind's mastery over matter is not yet complete. For decades, scientists have been dreaming of another type of nuclear power, based on fusion, the process that powers the sun. Unlike fission, fusion power wouldn't produce long-lived nuclear waste. But progress has been slow. The ITER experiment has been in planning since the 1980s. Once constructed in southern France, ITER aims to, for the first time, produce more energy from fusion than is put in. Whether it is successful may help determine the energy outlook for future centuries.
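
A common way to state the goal of producing more energy from fusion than is put in is the fusion gain factor Q, the ratio of fusion power generated to heating power injected into the plasma. The sketch below uses ITER's widely cited nominal design targets (about 500 megawatts of fusion power from about 50 megawatts of heating) purely as an illustration:

# Fusion gain factor Q = fusion power out / heating power in.
# The figures below are ITER's nominal design targets, used here only for illustration.
p_fusion_mw = 500.0     # design fusion power, megawatts
p_heating_mw = 50.0     # design plasma heating power, megawatts

q = p_fusion_mw / p_heating_mw
print(f"Q = {q:.0f}  (Q > 1 means more fusion energy out than heating energy in)")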

From today's perspective, the breakneck pace of progress in nuclear and particle physics in less than a century can seem unbelievable. The neutron and positron were both found in laboratories that are small in comparison with today's, and each discovery was attributed to a single physicist, relatively soon after the particles had been proposed. Those discoveries kicked off frantic developments that seemed to roll in one after another.

Now, finding a new element, discovering a new elementary particle or creating a new type of nuclear reactor can take decades, international collaborations of thousands of scientists, and huge, costly experiments.

As physicists uncover the tricks to understanding and controlling nature, it seems, the next level of secrets becomes increasingly difficult to expose.

Continued here:

How matter's hidden complexity unleashed the power of nuclear physics - Science News Magazine

Posted in Quantum Physics | Comments Off on How matter's hidden complexity unleashed the power of nuclear physics – Science News Magazine

Scientists Perform First-ever Ultracold Atom Interferometry in Space, Leading to Possible Physics Breakthroughs – Science Times

Posted: at 6:36 am

Atom interferometers exploit the wave character of atoms to make highly accurate measurements. They can be used, for instance, to measure the Earth's gravitational field or to detect gravitational waves.

For the first time, scientists were able to perform atom interferometry onboard a sounding rocket.

"We have established the technological basis for atom interferometry on board of a sounding rocket and demonstrated that such experiments are not only possible on Earth, but also in space," said study author Professor Patrick Windpassinger of the Institute of Physics at Johannes Gutenberg University Mainz (JGU).

Windpassinger leads a group of German researchers in the study, "Ultracold atom interferometry in space," with findings published in Nature Communications.

Leibniz University Hannover collaborated with a group of researchers from different universities and research centers to launch the MAIUS-1 mission in January 2017. This was the first rocket mission in which a Bose-Einstein condensate was generated in space. This state of matter forms when atoms are cooled to temperatures just above absolute zero, around minus 273 degrees Celsius.
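
For a sense of how cold "close to absolute zero" really is, a textbook estimate of the Bose-Einstein condensation temperature of an ideal gas can be computed from the atomic mass and density. The density used below is a typical assumed value for trapped atomic gases, not a figure from the experiment:

# Ideal-gas BEC critical temperature, T_c ~ 3.31 * hbar^2 * n^(2/3) / (m * k_B).
# Illustrative numbers only; the density is an assumed typical value, not from MAIUS-1.
hbar = 1.0546e-34        # reduced Planck constant, J*s
k_b = 1.3807e-23         # Boltzmann constant, J/K
m_rb87 = 1.443e-25       # mass of a rubidium-87 atom, kg
n = 1e20                 # atoms per cubic meter (assumed typical trapped-gas density)

t_c = 3.31 * hbar**2 * n**(2/3) / (m_rb87 * k_b)
print(f"T_c ~ {t_c * 1e9:.0f} nanokelvin")   # a few hundred billionths of a degree above absolute zero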

(Image caption; photo: NIST/Wikimedia Commons) Atoms interfering with themselves. After ultracold atoms are maneuvered into superpositions, each one located in two places simultaneously, they are released to allow interference of each atom's two "selves." They are then illuminated with light, which casts a shadow, revealing a characteristic interference pattern, with red representing higher atom density. The variations in density are caused by the alternating constructive and destructive interference between the two "parts" of each atom, magnified by thousands of atoms acting in unison.

The ultracold ensemble, the researchers said, showed a promising starting point for atom interferometry. Temperature is among the determining factors since measurements can be done more precisely and for longer periods at lower temperatures.

In the experiments, a gas of rubidium atoms was split using pulses of laser light and placed into superposition. Because forces act on the atoms along their different paths, different interference patterns can form, and these can be used to gauge the forces influencing the atoms, such as gravity.
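
As a rough illustration of how an interference pattern encodes a force, the textbook relation for a light-pulse atom interferometer gives a gravitational phase shift of delta_phi = k_eff * g * T^2, where k_eff is the effective wave number of the laser pulses and T is the time between pulses. This formula and the parameter values below are standard assumed numbers for a rubidium interferometer, not values reported in the study:

import math

# Illustrative gravitational phase shift for a light-pulse atom interferometer.
# delta_phi = k_eff * g * T^2; every value here is an assumed, typical number.
wavelength = 780e-9                        # m, rubidium D2 line
k_eff = 2 * (2 * math.pi / wavelength)     # effective wave number for a two-photon transition
g = 9.81                                   # m/s^2, gravitational acceleration at Earth's surface
t_pulse = 0.1                              # s, assumed time between interferometer pulses

delta_phi = k_eff * g * t_pulse**2
print(f"phase shift ~ {delta_phi:.1e} radians")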

The study first demonstrated the coherence, or capacity for interference, of the Bose-Einstein condensate, an essential property of the atomic ensemble. Here, the atoms in the interferometer were only partly superimposed as the light-pulse sequence was varied, generating a spatial intensity modulation.

The researchers thus demonstrated the viability of a concept that could lead to groundbreaking experiments: measuring the Earth's gravitational field, detecting gravitational waves, and testing Einstein's equivalence principle, any of which would be considered a breakthrough in physics.

The team plans to further study the feasibility of high-precision atom interferometry as a test of Einstein's equivalence principle. Two more rocket launches, slated for 2022 and 2023, will use both potassium and rubidium atoms to create interference patterns.

By comparing the free-fall acceleration of the two types of atoms, the equivalence principle can be tested with a precision that has not been achieved before.
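
Such comparisons are commonly summarized by the Eötvös parameter, the normalized difference between the two measured free-fall accelerations, which the equivalence principle predicts to be exactly zero. A minimal sketch, using made-up example values:

# Eotvos parameter for two test species: eta = 2 * (g_a - g_b) / (g_a + g_b).
# The accelerations below are invented example values, not measured data.
g_rubidium = 9.81000000      # m/s^2
g_potassium = 9.81000001     # m/s^2, hypothetical tiny difference for illustration

eta = 2 * (g_potassium - g_rubidium) / (g_potassium + g_rubidium)
print(f"eta = {eta:.1e}")    # the equivalence principle predicts eta = 0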

The experiment is an example of continuing research work on quantum technologies, including developments in quantum communication, quantum sensors, and quantum computing.

Link:

Scientists Perform First-ever Ultracold Atom Interferometry in Space, Leading to Possible Physics Breakthroughs - Science Times

Posted in Quantum Physics | Comments Off on Scientists Perform First-ever Ultracold Atom Interferometry in Space, Leading to Possible Physics Breakthroughs – Science Times
