Quantum mechanics and the return of free will | Tim Andersen – IAI

The common definition of free will runs into problems relating desire to the power to choose. An alternative definition, which ties free will to the possibility of different outcomes for one's life despite one's past, is supported by the probabilistic nature of quantum physics. This definition is compatible with the Many Worlds Interpretation of quantum physics and refutes the conclusion that randomness does not imply free will, writes Tim Andersen.

Free will is one of those things where people tend to be very attached to its being true or false, and yet most people implicitly treat it as true. Consider that we hold people accountable for their actions as if they decided to carry out those actions of their own free will. Likewise, we reward people for their successes and discoveries. If Albert Einstein didn't really make his discoveries but it was, instead, inevitable that his brain would do so, does he really deserve his Nobel Prize?


Some argue that we should accept that free will is a myth and change our society accordingly. Our justice system (especially in the United States) is heavily invested in the free will hypothesis. We punish people for crimes. We do not treat them like broken machines that need to be fixed. Other nations, like Norway, however, take exactly this approach.

Many physicists believe that free will is incompatible with modern physics.

The argument goes like this:

(1) Classical (non-quantum) mechanics is deterministic. Given any initial conditions of a classical system, the entire future and past state of the system can be determined. There is no free will in determinism. (A toy sketch after this argument illustrates the contrast between this premise and the next.)

(2) Quantum mechanics allows for randomness in the outcomes of experiments, but we have no control over those outcomes. There is no free will in randomness.

(3) Human will is a product of the brain, which is a physical object. All physical objects are subject to physics, and the sum total of physics is contained in classical and quantum mechanics (technically, classical mechanics is an approximation of quantum mechanics).

Ergo, humans have no free will. Our brains are simply carrying out a program that, while appearing to be making free choices, is in fact just a very complex algorithm.
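To make premises (1) and (2) concrete, here is a minimal sketch (my illustration, not the author's; the oscillator model and the 50/50 outcome weights are assumptions) contrasting deterministic classical evolution with quantum-style randomness:

```python
import random

def classical_trajectory(x0, v0, dt=0.01, steps=5):
    """Euler integration of a unit-mass harmonic oscillator (force = -x).
    The same initial conditions always yield the same future states."""
    x, v, xs = x0, v0, []
    for _ in range(steps):
        v -= x * dt
        x += v * dt
        xs.append(round(x, 6))
    return xs

# Premise (1): determinism. Re-running the system changes nothing.
assert classical_trajectory(1.0, 0.0) == classical_trajectory(1.0, 0.0)

# Premise (2): quantum randomness. Identically prepared systems give
# different outcomes; here we simply sample toy Born-rule probabilities.
outcomes = [random.choices(["up", "down"], weights=[0.5, 0.5])[0] for _ in range(10)]
print(outcomes)  # differs run to run, and nothing in the state controls it
```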

___

As Schopenhauer said, "Man can do what he wills but cannot will what he wills."

___

The logic seems sound, and in any philosophical discourse we need to look at the logic and decide (whether freely or not). There are quite a few ways to counter this argument. The first is to object to #3. This is the approach many religions take. Human will is not necessarily reducible to physical causation. Therefore, it is beyond physical law. The brain simply interacts with the will and carries out its commands.

Another is to question the reductionist assumption of the conclusion, i.e., that everything is reducible to the properties of its constituent parts, no matter how complex: if the individual parts are deterministic, so must be the whole. Science has not proven that yet. Perhaps if we could model a human brain in a computer in its entirety, we might know better.

Another approach is to question what the scientist means by free will. Most scientists aren't philosophers and don't necessarily define their philosophical terms as clearly as their scientific ones. The common definition of free will is that it is the freedom to choose, the ability to decide to choose or do otherwise, to be the source of one's actions. Philosophers largely tie free will to the concept of moral responsibility. Am I morally responsible for my actions?


To put it precisely: an agent S is morally accountable for performing an action φ =df. S deserves praise if φ goes beyond what can be reasonably expected of S, and S deserves blame if φ is morally wrong. The key, then, is whether an agent has the ability or power to do otherwise.

Now, what does it mean to have the ability to choose or do otherwise? It can't simply mean to have the power, because one must have both the power and the desire. But what if one does not have the power to change what one desires? Then you are stuck with no free will, or only a pseudo-free will in which you can change your actions but not your desires.

As Schopenhauer said, "Man can do what he wills but cannot will what he wills." Consider: if I have a choice to practice my cello or lift weights, I choose to practice my cello. Now, I seem to have had the power to choose to lift weights, but I did not have the desire to do so. Did I have the power to desire differently?

From the argument of physics, the brain's desires are either fixed results of classical laws or random results of quantum effects. A random quantum fluctuation creates a voltage bias in one of my neurons, which cascades to other neurons, and suddenly I want to lift weights. According to the physicist, I did not choose. It just appeared as if I did. And certainly if I had chosen differently I would have done differently, and yet in reality quantum physics chose for me by rolling a cosmic die.

___

You have to see free will as having the power to have different outcomes for your life despite your past.

___

This kind of definition of free will, which is the one most people think of and the one that most scientists seem to assume, has a lot of problems. It's hard to even understand what we really mean by freedom, because it gets all muddled with desire.

Without a good definition, it is impossible to argue whether something exists or not. Another definition of free will avoids this problem and throws a monkey wrench into the scientist-on-the-street's knee-jerk attitude that free will is impossible in a quantum world.

This alternative is called the categorical analysis and is stated as follows: an agent S has the ability to choose or do otherwise than φ at time t if and only if it was possible, holding fixed everything up to t, that S choose or do otherwise than φ at t.

What this means is that we have to take into account the state of the agent up until the time the choice is made and, given that state, ask whether there is a possible world where the agent makes a choice other than the one he or she made. That, then, is what freedom of choice is.


Oxford physicist David Deutsch favours this definition of free will because it is compatible with the Many Worlds Interpretation (MWI) of quantum physics, which he champions. But even if you don't accept MWI, what it says is that there are probable states that share the same past up until a point t, at which a choice is made and a non-deterministic path is followed. It doesn't matter whether those paths are all real worlds, as Deutsch believes. What matters is that they have different futures, and all interpretations of quantum physics, as they stand, support this idea.

If that is true, and this is the most important point, then you can say that freedom of choice exists because the agent made different choices in different probable realities. Thus, the agent had the power to choose and exercised it.

This definition of free will is interesting from a physics perspective because it is not true in a classical, deterministic world, in which all pasts have the same future, but it is true in a quantum world, where all pasts do not share the same future. Thus, it refutes the conclusion from #2 above that randomness does not imply free will. That conclusion only holds if you define free will in the way that people commonly understand it, which is, frankly, not a defensible definition.

Rather, you have to see free will as having the power to have different outcomes for your life despite your past. Whether you can affect those outcomes by changing your actions or desires is a meaningless question.

___

Freedom of choice exists because the agent made different choices in different probable realities.

___

Thus, if I made the choice to practice in 60% of quantum futures and to lift weights in 40%, then that proves I had the power to do otherwise. If I practiced in 100% of futures, then I did not have that power. Whether science can prove this is an open question, but it does not require any modification to quantum theory. Indeed, some modifications attempt to remove this possibility, incorrectly I believe.
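As a toy illustration (mine, not Andersen's; the branch weights are the article's illustrative 60/40 numbers), the categorical analysis can be phrased as a one-line test: holding the past fixed, an agent could have done otherwise exactly when more than one future carries nonzero probability.

```python
import random

def could_have_done_otherwise(branch_weights):
    """Toy categorical analysis: with the past held fixed, the agent had
    the power to do otherwise iff more than one future has nonzero weight."""
    return sum(1 for w in branch_weights.values() if w > 0) > 1

futures = {"practice cello": 0.6, "lift weights": 0.4}
print(could_have_done_otherwise(futures))                  # True
print(could_have_done_otherwise({"practice cello": 1.0}))  # False: only one future

# The single history we actually observe is one sampled branch.
print(random.choices(list(futures), weights=list(futures.values()))[0])
```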

While it may seem that this is sleight of hand in changing definitions, it is in reality making the definition of free will precise by saying that it is exactly the power to do otherwise. This is evidenced by quantum physics: because more than one outcome of a choice can occur from a single state of the universe, an agent does have the power to do otherwise, which is what free will is.


How we could discover quantum gravity without rebuilding space-time – New Scientist


MODERN physics has two stories to tell about our universe. The first says it is fundamentally made of space-time: a continuous, stretchy fabric that has ballooned since the dawn of time. The other says it is fundamentally made of indivisible things that can't decide where they are, or even when.

Both stories are compelling, describing what we observe with incredible accuracy. The big difference, though, is the scale at which they apply. Albert Einstein's theory of general relativity, which describes gravity, space and time, rules over very massive objects and cosmic distances. Quantum physics, meanwhile, governs tiny, sprightly atoms and subatomic particles.

Ultimately, both stories can't be true. Nowhere is this more apparent than at the big bang, where everything in the universe was compacted into an infinitesimally small point. Here, you need a single theory that encompasses gravity and the quantum realm. "Why we're here is the big question," says Toby Wiseman, a theorist at Imperial College London. "It seems that quantum gravity is the only answer."

Alas, it is an answer we are yet to find, despite many decades of searching. Quantum gravity means a reconciliation of the continuous and the indivisible, the predictable and the random. There are many ideas, but none can totally incorporate everything. "We're still no better off at understanding the beginning of space and time," says Wiseman.

Most physicists attempting this begin with quantum physics, the workhorse of which is quantum field theory. This describes three of the four forces of nature (electromagnetism, the strong nuclear force and the weak nuclear force) by quantising them as force-carrying elementary particles.


ETH Zurich Researchers Strengthen Quantum Mechanics with … – HPCwire

May 12, 2023: A group of researchers led by Andreas Wallraff, Professor of Solid State Physics at ETH Zurich, has performed a loophole-free Bell test to disprove the concept of local causality formulated by Albert Einstein in response to quantum mechanics.

By showing that quantum mechanical objects that are far apart can be much more strongly correlated with each other than is possible in conventional systems, the researchers have provided further confirmation for quantum mechanics. What's special about this experiment is that the researchers were able for the first time to perform it using superconducting circuits, which are considered to be promising candidates for building powerful quantum computers.

An Old Dispute

A Bell test is based on an experimental setup that was initially devised as a thought experiment by British physicist John Bell in the 1960s. Bell wanted to settle a question that the greats of physics had already argued about in the 1930s: Are the predictions of quantum mechanics, which run completely counter to everyday intuition, correct, or do the conventional concepts of causality also apply in the atomic microcosm, as Albert Einstein believed?

To answer this question, Bell proposed to perform a random measurement on two entangled particles at the same time and check it against Bell's inequality. If Einstein's concept of local causality is true, these experiments will always satisfy Bell's inequality. By contrast, quantum mechanics predicts that they will violate it.
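For readers who want the numbers behind Bell's inequality, the standard CHSH form can be sketched in a few lines (a generic textbook calculation, not part of the ETH experiment): local causality bounds the CHSH quantity |S| by 2, while quantum mechanics predicts up to 2√2 for a singlet state.

```python
import math

def E(a, b):
    """Quantum correlation for spin measurements at analyzer angles a and b
    on a singlet state: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# Measurement settings (radians) that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2               # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2): above the local-causality bound of 2
```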

The Last Doubts Dispelled

In the early 1970s, John Francis Clauser and Stuart Freedman carried out the first practical Bell test. In their experiments, the two researchers were able to prove that Bell's inequality is indeed violated. But they had to make certain assumptions in their experiments to be able to conduct them in the first place. So, theoretically, it might still have been the case that Einstein was correct to be skeptical of quantum mechanics.

Over time, however, more and more of these loopholes could be closed. Finally in 2015, various groups succeeded in conducting the first truly loophole-free Bell tests.

Promising Applications

Wallraff's group can now confirm these results with a novel experiment. The work by the ETH researchers, published in the scientific journal Nature, demonstrates that research on this topic has not concluded despite the initial confirmation seven years ago.

A number of factors contribute to this outcome. The experiment conducted by the ETH researchers establishes that, despite their larger size compared to microscopic quantum objects, superconducting circuits still abide by the principles of quantum mechanics. These electronic circuits, which are several hundred micrometers in size and made from superconducting materials, function at microwave frequencies and are known as macroscopic quantum objects.

In addition, Bell tests also have a practical significance. "Modified Bell tests can be used in cryptography, for example, to demonstrate that information is actually transmitted in encrypted form," explained Simon Storz, a doctoral student in Wallraff's group. "With our approach, we can prove much more efficiently than is possible in other experimental setups that Bell's inequality is violated. That makes it particularly interesting for practical applications."

The Search for a Compromise

To carry out their research, the team required an advanced testing facility. A critical aspect of a loophole-free Bell test is ensuring that no information exchange occurs between the two entangled circuits before the completion of the quantum measurements. As information can only travel as fast as the speed of light, the measurement process must be faster than the time taken for a light particle to travel from one circuit to another.

When designing the experiment, striking the right balance is crucial. Increasing the distance between the two superconducting circuits allows for more time to conduct the measurement, but it also complicates the experimental setup. This is due to the need for the entire experiment to be carried out in a vacuum near absolute zero.

The ETH researchers determined that the minimum distance needed for a successful loophole-free Bell test is approximately 33 meters. A light particle takes about 110 nanoseconds to travel this distance in a vacuum, which is slightly longer than the time it took for the researchers to complete the experiment.
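The 110-nanosecond figure follows directly from the light travel time t = d/c; a quick check, assuming the quoted 33-meter separation and vacuum light speed:

```python
c = 299_792_458        # speed of light in vacuum, m/s
d = 33                 # separation quoted for the loophole-free test, m
print(f"{d / c * 1e9:.0f} ns")  # ~110 ns: the measurement must finish sooner
```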

Thirty-meter Vacuum

Wallraff's team has built an impressive facility in the underground passageways of the ETH campus. At each of its two ends is a cryostat containing a superconducting circuit. These two cooling apparatuses are connected by a 30-meter-long tube whose interior is cooled to a temperature just above absolute zero (−273.15°C).

Before the start of each measurement, a microwave photon is transmitted from one of the two superconducting circuits to the other so that the two circuits become entangled. Random number generators then decide which measurements are made on the two circuits as part of the Bell test. Next, the measurement results on both sides are compared.

Large-scale Entanglement

After evaluating more than one million measurements, the researchers have shown with very high statistical certainty that Bells inequality is violated in this experimental setup. In other words, they have confirmed that quantum mechanics also allows for non-local correlations in macroscopic electrical circuits and consequently that superconducting circuits can be entangled over a large distance. This opens up interesting possible applications in the field of distributed quantum computing and quantum cryptography.

"Building the facility and carrying out the test was a challenge," Wallraff said. "We were able to finance the project over a period of six years with funding from an ERC Advanced Grant. Just cooling the entire experimental setup to a temperature close to absolute zero takes considerable effort."

He explained, "There are 1.3 tons of copper and 14,000 screws in our machine, as well as a great deal of physics knowledge and engineering know-how." Wallraff further believes that, in principle, it is possible to construct facilities capable of overcoming even greater distances using the same approach. Such technology could potentially be employed to connect superconducting quantum computers across vast distances.

Source: Felix Würsten, ETH Zürich


Time Twisted in Quantum Physics: How the Future Might Influence … – SciTechDaily


The 2022 Nobel Prize in physics highlighted the challenges quantum experiments pose to local realism. However, a growing body of experts proposes retrocausality as a solution, suggesting that present actions can influence past events, thus preserving both locality and realism. This concept offers a novel approach to understanding causation and correlations in quantum mechanics, and despite some critics and confusion with superdeterminism, it is increasingly seen as a viable explanation for recent groundbreaking experiments, potentially safeguarding the core principles of special relativity.

In 2022, the physics Nobel prize was awarded for experimental work showing that the quantum world must break some of our fundamental intuitions about how the universe works.

Many look at those experiments and conclude that they challenge locality: the intuition that distant objects need a physical mediator to interact. And indeed, a mysterious connection between distant particles would be one way to explain these experimental results.

Others instead think the experiments challenge realism: the intuition that there's an objective state of affairs underlying our experience. After all, the experiments are only difficult to explain if our measurements are thought to correspond to something real. Either way, many physicists agree about what's been called the "death by experiment" of local realism.

But what if both of these intuitions can be saved, at the expense of a third? A growing group of experts think that we should abandon instead the assumption that present actions can't affect past events. Called retrocausality, this option claims to rescue both locality and realism.

What is causation anyway? Let's start with the line everyone knows: correlation is not causation. Some correlations are causation, but not all. What's the difference?

Consider two examples. (1) There's a correlation between a barometer needle and the weather; that's why we learn about the weather by looking at the barometer. But no one thinks that the barometer needle is causing the weather. (2) Drinking strong coffee is correlated with a raised heart rate. Here it seems right to say that the first is causing the second.

The difference is that if we wiggle the barometer needle, we won't change the weather. The weather and the barometer needle are both controlled by a third thing, the atmospheric pressure; that's why they are correlated. When we control the needle ourselves, we break the link to the air pressure, and the correlation goes away.

But if we intervene to change someone's coffee consumption, we'll usually change their heart rate, too. Causal correlations are those that still hold when we wiggle one of the variables.

These days, the science of looking for these robust correlations is called causal discovery. It's a big name for a simple idea: finding out what else changes when we wiggle things around us.
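A toy simulation (our own illustration of the authors' barometer example, with made-up Gaussian numbers) shows what wiggling buys you: the observed needle-weather correlation is strong, but intervening on the needle destroys it, revealing that the needle is not a cause.

```python
import random

def sample(intervene_needle=None):
    """One draw from a toy common-cause system: atmospheric pressure
    drives both the barometer needle and the weather."""
    pressure = random.gauss(0, 1)
    needle = pressure if intervene_needle is None else intervene_needle
    weather = pressure + random.gauss(0, 0.1)
    return needle, weather

def correlation(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x, _ in pairs) / n
    vy = sum((y - my) ** 2 for _, y in pairs) / n
    return cov / (vx * vy) ** 0.5

observed = [sample() for _ in range(10_000)]
wiggled = [sample(intervene_needle=random.gauss(0, 1)) for _ in range(10_000)]
print(round(correlation(observed), 2))  # ~0.99: needle predicts weather
print(round(correlation(wiggled), 2))   # ~0.00: setting the needle breaks the link
```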

In ordinary life, we usually take for granted that the effects of a wiggle are going to show up later than the wiggle itself. This is such a natural assumption that we don't notice that we're making it.

But nothing in the scientific method requires this to happen, and it is easily abandoned in fantasy fiction. Similarly, in some religions, we pray that our loved ones are among the survivors of yesterday's shipwreck, say. We're imagining that something we do now can affect something in the past. That's retrocausality.

The quantum threat to locality (that distant objects need a physical mediator to interact) stems from an argument by the Northern Ireland physicist John Bell in the 1960s. Bell considered experiments in which two hypothetical physicists, Alice and Bob, each receive particles from a common source. Each chooses one of several measurement settings, and then records a measurement outcome. Repeated many times, the experiment generates a list of results.

Bell realized that quantum mechanics predicts that there will be strange correlations (now confirmed) in this data. They seemed to imply that Alice's choice of setting has a subtle nonlocal influence on Bob's outcome, and vice versa, even though Alice and Bob might be light years apart. Bell's argument is said to pose a threat to Albert Einstein's theory of special relativity, which is an essential part of modern physics.

But that's because Bell assumed that quantum particles don't know what measurements they are going to encounter in the future. Retrocausal models propose that Alice's and Bob's measurement choices affect the particles back at the source. This can explain the strange correlations, without breaking special relativity.

In recent work, we've proposed a simple mechanism for the strange correlations; it involves a familiar statistical phenomenon called Berkson's bias.

There's now a thriving group of scholars who work on quantum retrocausality. But it's still invisible to some experts in the wider field. It gets confused for a different view called superdeterminism.

Superdeterminism agrees with retrocausality that measurement choices and the underlying properties of the particles are somehow correlated.

But superdeterminism treats it like the correlation between the weather and the barometer needle. It assumes there's some mysterious third thing, a superdeterminer, that controls and correlates both our choices and the particles, the way atmospheric pressure controls both the weather and the barometer.

So superdeterminism denies that measurement choices are things we are free to wiggle at will; they are predetermined. Free wiggles would break the correlation, just as in the barometer case. Critics object that superdeterminism thus undercuts core assumptions necessary to undertake scientific experiments. They also say that it means denying free will, because something is controlling both the measurement choices and particles.

These objections don't apply to retrocausality. Retrocausalists do scientific causal discovery in the usual free, wiggly way. We say it is the folk who dismiss retrocausality who are forgetting the scientific method, if they refuse to follow the evidence where it leads.

What is the evidence for retrocausality? Critics ask for experimental evidence, but that's the easy bit: the relevant experiments just won a Nobel Prize. The tricky part is showing that retrocausality gives the best explanation of these results.

We've mentioned the potential to remove the threat to Einstein's special relativity. That's a pretty big hint, in our view, and it's surprising it has taken so long to explore it. The confusion with superdeterminism seems mainly to blame.

In addition, we and others have argued that retrocausality makes better sense of the fact that the microworld of particles doesn't care about the difference between past and future.

We don't mean that it is all plain sailing. The biggest worry about retrocausation is the possibility of sending signals to the past, opening the door to the paradoxes of time travel. But to make a paradox, the effect in the past has to be measured. If our young grandmother can't read our advice to avoid marrying grandpa, meaning we wouldn't come to exist, there's no paradox. And in the quantum case, it's well known that we can never measure everything at once.

Still, there's work to do in devising concrete retrocausal models that enforce this restriction that you can't measure everything at once. So we'll close with a cautious conclusion. At this stage, it's retrocausality that has the wind in its sails, hull down towards the biggest prize of all: saving locality and realism from death by experiment.


This article was first published in The Conversation.


Theoretical Physicists Discover Why Optical Cavities Slow Down … – SciTechDaily

Resonant vibrational strong coupling can inhibit chemical reactions: strong resonant coupling between cavity and vibrational modes can selectively inhibit a chemical reaction, i.e., prevent the appearance of products that would otherwise form outside the cavity environment. Credit: E. Ronca / C. Schäfer

Scientists have discovered why chemical reactions are slowed down in mirrored cavities, where molecules interact with light. The team used Quantum-Electrodynamical Density-Functional Theory to find that the conditions inside the optical cavity affected the energy that makes atoms vibrate around the molecule's single bonds, which are critical to the reaction.

Chemical processes are all around us. From novel materials to more effective medicines or plastic products, chemical reactions play a key role in the design of the things we use every day. Scientists constantly search for better ways to control these reactions, for example, to develop new materials. Now an international research team led by the MPSD has found an explanation of why chemical reactions are slowed down inside mirrored cavities, where molecules are forced to interact with light. Their work, now published in the journal Nature Communications, is a key step in understanding this experimentally observed process.

Chemical reactions occur on the scale of atomic vibrations, one million times smaller than the thickness of a human hair. These tiny movements are difficult to control. Established methods include the control of temperature or providing surfaces and complexes in solution made from rare materials. They tackle the problem on a larger scale and cannot target specific parts of the molecule. Ideally, researchers would like to provide only a small amount of energy to some atoms at the right time, just like a billiard player wants to nudge just one ball on the table.

In recent years, it became clear that molecules undergo fundamental changes when they are placed in optical cavities with opposing mirrors. Inside those confines, the system is forced to interact with virtual light, or photons. Crucially, this interaction changes the rate of chemical reactions, an effect that was observed in experiments but whose underlying mechanism remained a mystery.

Now a team of theoretical physicists from Germany, Sweden, Italy, and the USA has come up with a possible explanation that qualitatively agrees with the experimental results. The team involved researchers from the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) in Hamburg, Germany, Chalmers University of Technology in Sweden, the Center for Computational Quantum Physics at the Flatiron Institute, Harvard University (both in the U.S.A.), and the Istituto per i Processi Chimico Fisici at the CNR (National Research Council) in Italy.

Using an advanced theoretical method, called Quantum-Electrodynamical Density-Functional Theory (QEDFT), the authors have unveiled the microscopic mechanism which reduces the chemical reaction rate, for the specific case of the deprotection reaction of 1-phenyl-2-trimethylsilylacetylene. Their findings are in agreement with the observations by the group of Thomas Ebbesen in Strasbourg.

The team discovered that the conditions inside the optical cavity affect the energy which makes the atoms vibrate around the molecule's single bonds, which are critical for the chemical reaction. Outside the cavity, that energy is usually deposited in a single bond during the reaction, which can ultimately break the bond, a key step in a chemical reaction. "However, we find that the cavity introduces a new pathway, so that the energy is less likely to be funneled only into a single bond," says lead author Christian Schäfer. "This is the key process which inhibits the chemical reaction, because the probability to break a specific bond is diminished."

Manipulating materials through the use of cavities (the so-called polaritonic chemistry) is a powerful tool with many potential applications, according to the paper's author Enrico Ronca, who works at CNR: "For instance, it was observed that coupling to specific vibrational excitations can inhibit, steer, and even catalyze a chemical process at room temperature. Our theoretical work enhances the understanding of the underlying microscopic mechanisms for the specific case of a reaction inhibited by the field."

While the authors point out that important aspects remain to be understood and further experimental validation is required, they also highlight the special role of this new direction. "This work puts the controversial field of polaritonic chemistry onto a different level," adds Angel Rubio, the Director of the MPSD's Theory Department. "It provides fundamental insights into the microscopic mechanisms that enable the control of chemical reactions. We expect the present findings to be applicable to a larger set of relevant reactions (including click chemical reactions linked to this year's Nobel Prize in chemistry) under strong light-matter coupling conditions."

Reference: "Shining light on the microscopic resonant mechanism responsible for cavity-mediated chemical reactivity" by Christian Schäfer, Johannes Flick, Enrico Ronca, Prineha Narang and Angel Rubio, 19 December 2022, Nature Communications. DOI: 10.1038/s41467-022-35363-6


Stanley Deser, Whose Ideas on Gravity Help Explain the Universe … – The New York Times

Stanley Deser, a theoretical physicist who helped illuminate the details of gravity and how it shapes the space-time fabric of the universe, died on April 21 in Pasadena, Calif. He was 92.

His death, at a hospital, was confirmed by his daughter, Abigail Deser.

Physicists have long dreamed of devising a theory of everything: a set of equations that neatly and completely describe how the universe works. By the middle of the 20th century, they had come up with two theories that serve as the pillars of modern physics: quantum mechanics and general relativity.

Quantum mechanics describes how, in the subatomic realm, everything is broken up in discrete chunks, or quanta, such as the individual particles of light called photons. Albert Einstein's theory of general relativity had elegantly captured how mass and gravity bend the fabric of space-time.

However, these two pillars did not fit together. General relativity does not contain any notion of quanta; a quantum theory of gravity is an ambition that remains unfinished today.

"The problem we face is how to unify these two into a seamless theory of everything," said Michael Duff, an emeritus professor of physics at Imperial College London in England. "Stanley was amongst the first to tackle this problem."

In 1959, Dr. Deser, along with two other physicists, Richard Arnowitt and Charles Misner, published what is now known as the ADM formalism (named after the initials of their surnames), which rejiggered the equations of general relativity in a form that laid a foundation for work toward a quantum theory of gravity.

"It's a bridge toward quantum," said Edward Witten, a physicist at the Institute for Advanced Study in Princeton, N.J. So far, however, no one has been able to take it to the next step and come up with a unified theory that includes quantum gravity.

The ADM formalism offered an additional benefit: It made general relativity equations amenable to computer simulations, enabling scientists to probe phenomena like the space-bending pull of black holes and the universe-shaking explosions when stars collide.

The rejiggered equations split four-dimensional space-time into slices of three-dimensional space, an innovation that allowed computers to handle the complex data and, as Frans Pretorius, a professor of physics at Princeton University, put it, "evolve these slices in time to find the full solution."

Dr. Deser is perhaps best known for his work in the 1970s as one of the pioneers of supergravity, which expanded an idea known as supersymmetry to include gravity.

From quantum mechanics, physicists already knew that fundamental particles fell into one of two groups. Familiar constituents of matter like electrons and quarks fall into the group known as fermions, while those that carry fundamental forces, like photons, the particles of light that convey the force of electromagnetism, are known as bosons.

Supersymmetry hypothesizes an as-yet-undiscovered boson partner for every fermion, and a fermion partner for each boson.

Dr. Deser worked with Bruno Zumino, one of the originators of supersymmetry, to add gravity to the theory, creating the theory of supergravity. Supergravity includes gravitons, the gravitational equivalent of photons, and adds a supersymmetric partner, the gravitino.

Experiments using particle accelerators have yet to turn up evidence of any of these partner particles, but the theories have not been disproved, and because of their mathematical elegance, they remain attractive to physicists.

Supergravity is also a key aspect of superstring theories, which attempt to provide a complete explanation of how the universe works, overcoming shortfalls of quantum gravity theories.

"Stanley was one of the most influential researchers on questions related to gravity over his extremely long and distinguished career," said Dr. Witten, who has been at the forefront of devising superstring theories.

Stanley Deser was born in Rovno, Poland, a city now known as Rivne and part of Ukraine, on March 19, 1931. As Jews, his parents, Norman, a chemist, and Miriam, fled Poland's repressive, antisemitic regime in 1935 for Palestine. But prospects for finding work there were dim, and a few months later they moved to Paris.

In 1940, with World War II engulfing Europe, the family narrowly escaped France after Germany invaded.

"They finally realized the danger and decided to leave everything," Dr. Deser wrote of his parents in his autobiography, "Forks in the Road." "I rushed with my father to empty our safe. That evening, my mother sewed the coins into a belt of towels, a much-practiced maneuver of refugees, while the rest of us packed a few belongings."

The family fled to Portugal and 11 months later obtained visas to emigrate to the United States. They eventually settled in New York City, where Norman and Miriam ran a chemical supplies business.

By age 12 Stanley had been promoted to 10th grade, and he graduated from high school at 14. He earned a bachelor's degree in physics from Brooklyn College in 1949 at 18, then went to Harvard, where he studied under Julian Schwinger, a Nobel Prize laureate. He completed his doctorate in 1953.

After postdoctoral fellowships at the Institute for Advanced Study and the Niels Bohr Institute in Copenhagen, Dr. Deser joined the faculty of Brandeis University in 1958.

The following three years, working on the ADM formalism, provided "the best run of luck that one could possibly hope for," he wrote in his autobiography.

In an interview last year for Caltech's Heritage Project, Dr. Deser recalled that he, Dr. Arnowitt and Dr. Misner completed much of the work during summers in Denmark, in a kindergarten classroom. "The nice thing about this kindergarten, it has blackboards," he said. "Denmark is very good that way."

Since the blackboards were mounted low for children, "we would crawl and write equations," Dr. Deser said. "And the papers just poured out."

Dr. Misner, an emeritus professor of physics at the University of Maryland, said there were parallels between the ADM recasting of general relativity and the quantum field theory of electromagnetism that other physicists were working on, and they were able to apply that experience to general relativity.

The work on supergravity occurred during a stay at the CERN particle laboratory in Geneva, where Dr. Zumino worked. "In a period of just three weeks, to our amazement, we had a consistent theory," Dr. Deser recalled.

He and Dr. Zumino published a paper about supergravity in June 1976. However, another group of physicists (Daniel Freedman, Sergio Ferrara and Peter van Nieuwenhuizen) beat them to the punch, describing supergravity in a paper that had been completed about a month before Dr. Deser and Dr. Zumino submitted theirs.

As a result, Dr. Deser said, sometimes the work that he and Dr. Zumino did was overlooked. In 2019, a Breakthrough Prize in Fundamental Physics, accompanied by $3 million, was awarded to the other team.

"He was understandably upset," Dr. Duff, the British physicist, said. "I think they could have erred on the side of generosity and included Stanley as the fourth recipient." (Dr. Zumino died in 2014.)

Dr. Schwarz and Dr. Witten, who were members of the committee that awarded the prize, declined to discuss the particulars of the decision, but Dr. Schwarz said, "It was a purely scientific decision."

Dr. Deser worked at Brandeis until he retired in 2005. He then moved to Pasadena to be close to his daughter and obtained an unpaid position as a senior research associate at Caltech.

In addition to Abigail, he is survived by two other daughters, Toni Deser and Clara Deser, and four grandchildren.

His wife of 64 years, Elsbeth Deser, died in 2020. A daughter, Eva, died in 1968.

While Dr. Deser was an expert on gravity and general relativity, he was not infallible.

In the Caltech interview, he recalled a paper in which he suggested that gravity could solve some troubling infinities that were showing up in the quantum field theory of electrodynamics.

Other noteworthy physicists had similar thoughts but did not publish them. Dr. Deser did.

"It was garbage," he said. During a talk at a conference, Richard Feynman, the Nobel Prize-winning physicist who devised much of quantum electrodynamics, "without much difficulty shot me to pieces, which I deserved," he said.

He added, "Everybody's entitled to a few strikes."


UMD Quantum Physicist Elected to National Academy of Sciences – Maryland Today

A groundbreaking quantum physics researcher who has long been affiliated with the Joint Quantum Institute at the University of Maryland was elected a member of the National Academy of Sciences last week.

Paul Julienne, an emeritus fellow at JQI and an adjunct professor of physics at UMD, joined 142 other U.S. and international members recognized in 2023 for their exceptional, ongoing achievements in original research. He's one of 22 current UMD faculty members in the National Academy of Sciences, and 67 named to various esteemed honorary academies.

Julienne helped establish the research field of ultracold matter, which investigates atoms and molecules near absolute zero. His theoretical research includes developing models that describe how cold trapped molecules and atoms can be precisely controlled using magnetic fields or lasers. This research topic has revealed details of atomic states and chemical reactions of ultracold molecules.

"I am both gratified and humbled by this honor, which is only possible because of the many excellent colleagues and students with whom I have worked over the years," Julienne said. "I owe them a debt of gratitude, for it is by working together that science advances."

Julienne joined JQI in 2007, soon after its founding as a joint research institute combining the scientific strengths of the University of Maryland with the National Institute of Standards and Technology (NIST). The university and the federal agency have since broadened and deepened their collaboration with other quantum centers and institutes at UMD, helping lay the foundation for the university to become one of the most vibrant loci of quantum research in the world.

UMD's hundreds of researchers, partnerships with government agencies and labs, and collaboration with a wide range of firms in the quantum space inspired university President Darryll J. Pines to refer to the scientific and tech ferment centered at UMD as the "Capital of Quantum."

"Paul Julienne's election to the National Academy of Sciences highlights his remarkable achievements in the field of ultracold matter and underscores the significance of his contributions to science and the quantum revolution," Pines said. "We are honored to have such a distinguished researcher and educator as part of our institution."

Julienne earned a B.S. in chemistry from Wofford College in 1965 and his Ph.D. in chemical physics from the University of North Carolina at Chapel Hill in 1969. He worked as a postdoctoral researcher at the National Bureau of Standards and as a staff researcher at the Naval Research Laboratory before beginning a career of nearly 40 years at NIST, first as a research scientist and then as a NIST fellow, retiring in 2013. Among his other awards and accomplishments, he received the 2015 William F. Meggers Award of the Optical Society of America and the 2004 Davisson-Germer Prize of the American Physical Society, and he is a fellow of the Division of Atomic, Molecular, and Optical Physics of the American Physical Society.

"We are proud to see Dr. Julienne honored with one of the highest professional distinctions accorded to a scientist," said Amitabh Varshney, dean of UMD's College of Computer, Mathematical, and Natural Sciences. "Our college extends its congratulations to him for this well-deserved recognition."


With new experimental method, researchers probe spin structure in 2D materials for first time – Phys.org


For two decades, physicists have tried to directly manipulate the spin of electrons in 2D materials like graphene. Doing so could spark key advances in the burgeoning world of 2D electronics, a field where super-fast, small and flexible electronic devices carry out computations based on quantum mechanics.

Standing in the way is that the typical way in which scientists measure the spin of electrons, an essential behavior that gives everything in the physical universe its structure, usually doesn't work in 2D materials. This makes it incredibly difficult to fully understand the materials and propel forward technological advances based on them. But a team of scientists led by Brown University researchers believe they now have a way around this longstanding challenge. They describe their solution in a new study published in Nature Physics.

In the study, the team, which also includes scientists from the Center for Integrated Nanotechnologies at Sandia National Laboratories and the University of Innsbruck, describes what they believe to be the first measurement showing direct interaction between electrons spinning in a 2D material and photons coming from microwave radiation.

Called a coupling, the absorption of microwave photons by electrons establishes a novel experimental technique for directly studying the properties of how electrons spin in these 2D quantum materials, one that could serve as a foundation for developing computational and communicational technologies based on those materials, according to the researchers.

"Spin structure is the most important part of a quantum phenomenon, but we've never really had a direct probe for it in these 2D materials," said Jia Li, an assistant professor of physics at Brown and senior author of the research. "That challenge has prevented us from theoretically studying spin in these fascinating material for the last two decades. We can now use this method to study a lot of different systems that we could not study before."

The researchers made the measurements on a relatively new 2D material called "magic-angle" twisted bilayer graphene. This graphene-based material is created when two sheets of ultrathin layers of carbon are stacked and twisted to just the right angle, converting the new double-layered structure into a superconductor that allows electricity to flow without resistance or energy waste. Just discovered in 2018, the researchers focused on the material because of the potential and mystery surrounding it.

"A lot of the major questions that were posed in 2018 have still yet to be answered," said Erin Morissette, a graduate student in Li's lab at Brown who led the work.

Physicists usually use nuclear magnetic resonance or NMR to measure the spin of electrons. They do this by exciting the nuclear magnetic properties in a sample material using microwave radiation and then reading the different signatures this radiation causes to measure spin.

The challenge with 2D materials is that the magnetic signature of electrons in response to the microwave excitation is too small to detect. The research team decided to improvise. Instead of directly detecting the magnetization of the electrons, they used a device fabricated at the Institute for Molecular and Nanoscale Innovation at Brown to measure subtle changes in electronic resistance caused by the changes in magnetization from the radiation.

These small variations in the flow of the electronic currents allowed the researchers to use the device to detect that the electrons were absorbing the photons from the microwave radiation.

The researchers were able to observe novel information from the experiments. The team noticed, for instance, that interactions between the photons and electrons made electrons in certain sections of the system behave as they would in an anti-ferromagnetic system, meaning the magnetism of some atoms was canceled out by a set of magnetic atoms that are aligned in a reverse direction.

The new method for studying spin in 2D materials and the current findings won't be applicable to technology today, but the research team sees potential applications the method could lead to in the future. They plan to continue to apply their method to twisted bilayer graphene but also expand it to other 2D materials.

"It's a really diverse toolset that we can use to access an important part of the electronic order in these strongly correlated systems and in general to understand how electrons can behave in 2D materials," Morissette said.

More information: Andrew Mounce, Dirac revivals drive a resonance response in twisted bilayer graphene, Nature Physics (2023). DOI: 10.1038/s41567-023-02060-0. http://www.nature.com/articles/s41567-023-02060-0

Journal information: Nature Physics


Is Quantum Computing a Threat To Current Encryption Methods? – Spiceworks News and Insights

Encryption is the backbone of cybersecurity, keeping data and systems secure. Quantum computing threatens to make today's encryption obsolete. Developing quantum-secure encryption is one of the main challenges facing the cybersecurity sector today, highlights Michael Redding, chief technology officer at Quantropi.

Explaining how quantum computers work is challenging. It involves presenting complicated scientific concepts like superposition, which allows groups of qubits to create multidimensional computational spaces. For those who do not have a background in quantum physics, quantum computing can seem more like science fiction than computer science.

Explaining what quantum computers do, however, is much easier. In essence, they leverage the behavior of subatomic particles to increase computation speed exponentially. When Google announced in October 2019 that it had achieved quantum supremacy, it was celebrating the fact that it had used quantum computing to solve a complex mathematical problem in 3 minutes and 20 seconds. How long would a conventional computer have taken to solve the same problem? According to Google, it would have taken at least 10,000 years.
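Taking both of Google's figures at face value, the implied speedup is easy to compute (a back-of-the-envelope ratio, not a rigorous benchmark):

```python
quantum_seconds = 3 * 60 + 20                  # 3 minutes 20 seconds
classical_seconds = 10_000 * 365.25 * 86_400   # 10,000 years in seconds
print(f"{classical_seconds / quantum_seconds:.1e}x")  # ~1.6e9-fold speedup
```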

How will the world use this mind-blowingly fast processing power? Experts predict it will transform a number of industries, from pharmaceuticals to finance to supply chain management. However, the quantum computing use case that has been making the most headlines in recent months is cybersecurity.

Encryption is the backbone of cybersecurity. It is the tool that keeps critical data under lock and key. Without it, security and privacy would be impossible to achieve.

Hackers have a number of avenues for gaining unauthorized access to encrypted information. One popular method involves social engineering attacks that seek to trick someone into revealing the password that provides access to data. Rather than cracking the code, hackers using social engineering attacks simply steal the key.

Data breaches provide another option for obtaining passwords. Reports of breaches regularly make the news, and each breach has the potential to put passwords into the hands of bad actors seeking to obtain access to encrypted data.

Brute force attacks represent a different approach to cracking encryption. Rather than trying to obtain the password from a user or stolen data, these attacks use computers to cycle through possible passwords until the correct one is found. Essentially, brute force attacks figure out passwords through trial and error, leveraging computers to do the work quickly and systematically.

Current encryption methods are considered effective in thwarting brute force attacks, as the most advanced encryption systems work with passwords or keys that are long and complicated or highly random. With today's computers, deciphering the key through trial and error can take millions of years.

However, quantum computing changes the timeline for cracking todays encryption. By exponentially increasing processing speed, quantum computers could break the most advanced keys commonly used today in minutes.
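The arithmetic behind both claims can be sketched as follows (the guess rate is an assumed, deliberately generous figure; the quadratic quantum speedup shown is Grover's algorithm, which applies to this kind of unstructured key search):

```python
SECONDS_PER_YEAR = 365.25 * 86_400
guesses_per_second = 1e12       # assumption: a very fast classical attacker

# Classical brute force over a full 128-bit keyspace.
print(2 ** 128 / guesses_per_second / SECONDS_PER_YEAR)  # ~1.1e19 years

# Grover's algorithm searches N keys in ~sqrt(N) steps, so a 128-bit
# search costs ~2^64 quantum operations instead of 2^128.
print(2 ** 64 / guesses_per_second / SECONDS_PER_YEAR)   # ~0.6 years at the same rate
```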

When will bad actors have access to a quantum computer capable of threatening today's encryption? Based on Shor's Algorithm, a quantum computer would need millions of qubits, with a quantum circuit depth measured in the billions and essentially perfect calculation fidelity.
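What makes Shor's Algorithm so dangerous is that factoring reduces to order-finding, and only the order-finding step needs a quantum computer. The classical skeleton of that reduction can be shown on a toy number (a standard textbook illustration, not an attack):

```python
from math import gcd

def order(a, n):
    """Smallest r with a**r % n == 1. This is the step Shor's algorithm
    performs exponentially faster on a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """If the order r of a mod n is even and a**(r//2) is not -1 mod n,
    then gcd(a**(r//2) +/- 1, n) yields nontrivial factors of n."""
    r = order(a, n)
    if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
        half = pow(a, r // 2)
        return gcd(half - 1, n), gcd(half + 1, n)
    return None  # unlucky choice of a; pick another

print(shor_reduction(15, 7))  # (3, 5): the textbook factorization of 15
```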

Based on today's quantum computing capability, that would put Y2Q in the 2040s, if ever. However, breakthroughs achieved in 2023 in research out of China and Germany, using a hybrid classical-plus-quantum attack vector with AI and machine learning, have drastically reduced the quantum capabilities required to break asymmetric encryption as compared to Shor's Algorithm.

Combining these new AI and machine learning hybrid attack vectors with the rapid advancement of quantum computing capabilities begins to crystallize a pathway to Y2Q in the next one to five years. It is no longer a question of how to break asymmetric encryption with today's generation of technology; the approach has been published. Now it is only a matter of optimization and continued incremental technology improvements.

To address Y2Q's impact on security, developers are focusing on two main approaches to quantum security: post-quantum cryptography and quantum key distribution. Post-quantum cryptography (PQC) leverages complex mathematical algorithms to provide security that is resistant to quantum attacks, while quantum key distribution involves exploiting the properties of quantum mechanics to bolster security.

PQC provides an efficient means of updating security systems because it is math-based, which allows it to be implemented through computer coding and deployed in end devices with a simple software update. However, PQC's security relies on complex, hard mathematical calculations and, in some cases, large key sizes, both of which come with considerable performance costs.

Organizations that seek to quantum-proof their systems with PQC must be aware that considerable infrastructure updates may be necessary. Because PQC encryption schemes are typically more complex than those currently in use, they require more resources for encrypting and decrypting, including more time, storage space, memory, and network bandwidth.

For the average user relying on PQC for booting machines or encrypting data related to web browsing, the additional processing burden might not be noticeable. However, organizations simultaneously transmitting and receiving thousands or millions of digital transactions per second must consider the impact this will have on their performance. Failure to do so can create dangerous latency in devices that rely on high efficiency, such as the systems that manage computer-aided driving software in autonomous vehicles.

PQC also poses challenges for updating internet of things (IoT) devices to quantum-secure encryption. Smart doorbells and other intelligent appliances will become vulnerable if their encryption systems are not updated, though they typically do not have the processing power to support PQC effectively.

Quantum key distribution (QKD) is another option for quantum-resistant encryption. This approach relies on the laws of quantum physics rather than mathematics to generate and transmit encryption keys between two parties. The natural laws involved in this process also provide warnings to users when QKD transmissions are disturbed or intercepted by bad actors.

Theoretically, QKD provides security that is effective against quantum computing attacks and can withstand attacks for an indefinite amount of time. Practically, however, making it a reality would require overcoming a number of significant technical challenges. QKD uses photon emitters and receivers to create quantum entanglement between two devices. However, the current state of this technology is largely experimental, with few commercial deployments and significant limitations on bandwidth, distance, complexity, and cost that continue to be explored and improved upon.
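The best-known QKD protocol is BB84, and its sifting logic is simple enough to mimic classically (a toy simulation with no eavesdropper and no noise; real QKD requires actual photon hardware):

```python
import random

def bb84_sift(n_rounds=16, seed=1):
    """Toy BB84: Alice encodes random bits in randomly chosen bases ('+' or 'x');
    Bob measures in his own random bases. Rounds where the bases happen to
    match (about half) become shared key bits; the rest are discarded."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_rounds)]
    alice_bases = [rng.choice("+x") for _ in range(n_rounds)]
    bob_bases = [rng.choice("+x") for _ in range(n_rounds)]
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

print(bb84_sift())  # e.g. [0, 1, 1, ...]: a shared secret key, absent an eavesdropper
```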


Developing quantum-secure encryption is just the first step toward preparing for Y2Q. In order to be truly quantum secure, organizations must assess where they are vulnerable, determine how to integrate new security systems in those areas, deploy those systems, and test them. It is a process that could take years, and the clock is ticking.

Those who understand the stakes involved are already taking steps. For example, the US government issued a National Security Memorandum in May 2022 that warns of the significant risks that quantum computing poses to the economic and national security of the United States. The memorandum calls for a timely transition to quantum-resistant cryptography.

Rob Joyce, NSA cybersecurity director and deputy national manager for national security systems, highlighted the need to push forward in achieving quantum-resistant systems in his comments on the memorandum. He stated: "Implementing approved quantum-resistant cryptographic solutions across all of our systems will not happen overnight, but it's critical that we chart a path to get there considering the potential threat of quantum computing."

In the end, public and private organizations need to prepare for Y2Q immediately to protect their data, connected devices, systems, and communications. The time is now.



Go here to see the original:

Is Quantum Computing a Threat To Current Encryption Methods? - Spiceworks News and Insights

Researchers discover superconductive images are actually 3D and disorder-driven fractals – Phys.org


Meeting the world's energy demands is reaching a critical point, and powering the technological age has created problems globally. It is increasingly important to create superconductors that can operate at ambient pressure and temperature, which would go a long way toward solving the energy crisis.

Advancements in superconductivity hinge on advances in quantum materials. When electrons inside quantum materials undergo a phase transition, they can form intricate patterns, such as fractals. A fractal is a never-ending pattern: when you zoom in on a fractal, the image looks the same. Commonly seen fractals include a tree or frost on a windowpane in winter. Fractals can form in two dimensions, like the frost on a window, or in three-dimensional space, like the limbs of a tree.

Dr. Erica Carlson, a 150th Anniversary Professor of Physics and Astronomy at Purdue University, led a team that developed theoretical techniques for characterizing the fractal shapes that these electrons make, in order to uncover the underlying physics driving the patterns.

Carlson, a theoretical physicist, evaluated high-resolution images of the locations of electrons in the superconductor Bi2-xPbzSr2-yLayCuO6+x (BSCO), determined that these images are indeed fractal, and discovered that they extend into the full three-dimensional space occupied by the material, like a tree filling space.

What were once thought of as random dispersions within the fractal images are in fact meaningful and, surprisingly, due not to an underlying quantum phase transition, as expected, but to a disorder-driven phase transition.

Carlson led a collaborative team of researchers across multiple institutions and published their findings, titled "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x," in Nature Communications.

The team includes Purdue scientists and partners at other institutions. From Purdue, the team includes Carlson; Dr. Forrest Simmons, a recent Ph.D. student; and former Ph.D. students Dr. Shuo Liu and Dr. Benjamin Phillabaum. The Purdue team completed their work within the Purdue Quantum Science and Engineering Institute (PQSEI). The team from partner institutions includes Dr. Jennifer Hoffman, Dr. Can-Li Song, and Dr. Elizabeth Main of Harvard University; Dr. Karin Dahmen of the University of Illinois Urbana-Champaign; and Dr. Eric Hudson of Pennsylvania State University.

"The observation of fractal patterns of orientational ('nematic') domainscleverly extracted by Carlson and collaborators from STM images of the surfaces of crystals of a cuprate high temperature superconductoris interesting and aesthetically appealing on its own, but also of considerable fundamental importance in coming to grips with the essential physics of these materials," says Dr. Steven Kivelson, the Prabhu Goel Family Professor at Stanford University and a theoretical physicist specializing in novel electronic states in quantum materials. "Some form of nematic order, typically thought to be an avatar of a more primitive charge-density-wave order, has been conjectured to play an important role in the theory of the cuprates, but the evidence in favor of this proposition has previously been ambiguous at best. Two important inferences follow from Carlson et al.'s analysis: 1) The fact that the nematic domains appear fractal implies that the correlation lengththe distance over which the nematic order maintains coherenceis larger than the field of view of the experiment, which means that it is very large compared to other microscopic scales. 2) The fact that patterns that characterize the order are the same as those obtained from studies of the three dimensional random-field Ising modelone of the paradigrmatic models of classical statistical mechanicssuggests that the extent of the nematic order is determined by extrinsic quantities and that intrinsically (i.e. in the absence of crystalline imperfections) it would exhibit still longer range correlations not just along the surface, but extending deep into the bulk of the crystal."

High-resolution images of these fractals were painstakingly taken in Hoffman's lab at Harvard University and Hudson's lab, now at Penn State, using scanning tunneling microscopes (STM) to measure electrons at the surface of BSCO, a cuprate superconductor. The microscope scans atom by atom across the top surface of the BSCO, and what the teams found were stripe orientations that went in two different directions instead of the same direction. The result is a jagged image that forms interesting patterns of electronic stripe orientations.

"The electronic patterns are complex, with holes inside of holes, and edges that resemble ornate filigree," explains Carlson. "Using techniques from fractal mathematics, we characterize these shapes using fractal numbers. In addition, we use statistics methods from phase transitions to characterize things like how many clusters are of a certain size, and how likely the sites are to be in the same cluster."

Once the Carlson group analyzed these patterns, they found a surprising result: the patterns do not form only on the surface, as flat-layer fractals would; they fill space in three dimensions. Simulations for this discovery were carried out at Purdue University using the supercomputers at the Rosen Center for Advanced Computing. Samples at five different doping levels were measured by Harvard and Penn State, and the result was similar among all five samples.

The unique collaboration between Illinois (Dahmen) and Purdue (Carlson) brought cluster techniques from disordered statistical mechanics into the field of quantum materials like superconductors. Carlson's group adapted the technique to apply to quantum materials, extending the theory of second order phase transitions to electronic fractals in quantum materials.

"This brings us one step closer to understanding how cuprate superconductors work," explains Carlson. "Members of this family of superconductors are currently the highest temperature superconductors that happen at ambient pressure. If we could get superconductors that work at ambient pressure and temperature, we could go a long way toward solving the energy crisis because the wires we currently use to run electronics are metals rather than superconductors. Unlike metals, superconductors carry current perfectly with no loss of energy. On the other hand, all the wires we use in outdoor power lines use metals, which lose energy the whole time they are carrying current. Superconductors are also of interest because they can be used to generate very high magnetic fields, and for magnetic levitation. They are currently used (with massive cooling devices!) in MRIs in hospitals and levitating trains."

Next steps for the Carlson group are to apply the Carlson-Dahmen cluster techniques to other quantum materials.

"Using these cluster techniques, we have also identified electronic fractals in other quantum materials, including vanadium dioxide (VO2) and neodymium nickelates (NdNiO3). We suspect that this behavior might actually be quite ubiquitous in quantum materials," says Carlson.

This type of discovery leads quantum scientists closer to solving the riddles of superconductivity.

"The general field of quantum materials aims to bring to the forefront the quantum properties of materials, to a place where we can control them and use them for technology," Carlson explains. "Each time a new type of quantum material is discovered or created, we gain new capabilities, as dramatic as painters discovering a new color to paint with."

More information: Can-Li Song et al, Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x, Nature Communications (2023). DOI: 10.1038/s41467-023-38249-3

Journal information: Nature Communications

Continue reading here:

Researchers discover superconductive images are actually 3D and disorder-driven fractals - Phys.org

Collaboration builds fantastical stories from nuggets of truth – Symmetry magazine

Science fiction asks the question "What if?" and then attempts to answer that question with a story grounded in science fact. That's why Comma Press, for its latest anthology, paired science fiction writers with CERN scientists to create untrue stories with a bit of truth in them.

Creating the anthology, titled Collision, started with a call to CERN researchers and alumni of the European physics research center, asking them to describe concepts they thought would inspire good creative writing. Next, authors picked the ideas that called to them most. Each author then met with the scientist who proposed their chosen idea and discussed the science in detail. The anthology consists of the resulting stories, each with an afterword by the consulting scientist.

Symmetry interviewed three writer-scientist pairs to learn what it was like to participate in this unusual collaboration: Television writer, producer, and screenwriter Steven Moffat, famous for his work on the TV series Doctor Who and Sherlock, worked with Peter Dong, a physics teacher at the Illinois Math and Science Academy who works with students to analyze data from CERN. Poet, playwright, and essayist lisa luxx worked with physicist Carole Weydert, now leader of a unit of an insurance regulatory authority in Luxembourg. And UK-based Short Story Fellow of the Arts Foundation Adam Marek worked with Andrea Giammanco, who continues to do research at CERN as a physicist with the National Fund for Scientific Research in Belgium.

Although the assignment was the same for every pair, they all approached their stories differently.

When Peter Dong first got the call for inspiring physics concepts, he knew exactly what to submit: a strange but real-life theory proposed by physicists Holger Bech Nielsen and Masao Ninomiya.

In the late 2000s, the theorists posited that the universe might for some reason prefer to keep certain fundamental particles, say, the Higgs boson or particles of dark matter, a mystery. So much so, they wrote, that the universe was actively conspiring against their discovery.

The search for the Higgs certainly wasn't easy. Only weeks after being turned on for the first time, the Large Hadron Collider at CERN experienced a critical failure, immediately stopping the two most advanced experiments aimed at finding the Higgs. Years earlier, and an ocean away, the United States Congress had suddenly scrapped plans to build a similar collider in Texas, despite construction having already begun.

These two events, Nielsen and Ninomiya argued, could indicate that discovering the new particle was so improbable that the universe wouldn't allow it to happen.

Dong first read about the theory as a graduate student at the US Department of Energy's Fermi National Accelerator Laboratory in 2008. "While the probability is absurdly small, and it's a wacky, out-there idea, it's still not impossible," he says. "When we come up with these wacky theories, I feel like more people should know about them. They're just so much fun."

In their paper, Nielsen and Ninomiya proposed an experiment that, while it would not prove their hypothesis, could at least put it to the test: Shuffle a deck of a million cards in which a single card is marked "Shut down the LHC." If that card were randomly pulled from the deck, they would take that as a sign the universe wanted physicists to back off.
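The proposed test is simple enough to restate as a toy program; the card wording below is paraphrased from the description above, and the setup is illustrative rather than anything Nielsen and Ninomiya actually specified in code.

```python
import random

# Toy version of Nielsen and Ninomiya's proposed test: one marked card
# in a million. Ordinary probability says the marked card appears with
# chance 1e-6; drawing it anyway was to be read as a hint of the
# universe's hypothesized interference.
deck = ["Shut down the LHC"] + ["carry on"] * 999_999
print(random.choice(deck))
```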

The card experiment was never run, and in the end, scientists were able to repair and restart the LHC. In 2012, physicists on the CMS and ATLAS experiments announced the discovery of the Higgs.

Dong had previously tried out using Nielsen and Ninomiya's idea as the basis for a story while auditing a creative writing class, so when he got the CERN email, he was ready with a pitch.

Writer Steven Moffat says Dong's idea stood out to him on the list. "I zeroed in on that prompt: one, because I understood it, or at least I thought I understood part of it," he says. "And two, because I could see a story in it. I just love the idea that a bunch of very serious-minded scientists wondered if the universe might be actively trying to stop them."

Moffat and Dong worked together to make sure the science in the story held up. "Steven was taking great pains to ask, 'Does this make sense? Would that be right?'" Dong says.

The result was "a mad, paranoid fantasy," Moffat says. "But it's decorated in the right terminology and derives from something that really happened."

Not all the stories in Collision fall neatly into the sci-fi genre. For her entry, poet lisa luxx explored her lived experiences of adoption and migration through the lens of quantum physics. "At a certain point, I had to disentangle [pun not intended] myself from trying to achieve a particular genre," luxx says.

The writer chose a prompt related to supersymmetry, which posits that every particle has a yet-unobserved partner with some shared properties. Like Moffat, luxx says she was attracted to the idea she chose because it made some amount of sense to her. "The physicist had used quite a poetic quote in her explanation of this theory," luxx says. "That immediately drew me in."

Physicist Carole Weydert, who submitted the idea, may have had an artistic way of explaining supersymmetry because she had previously explored another complex physics theory in her own art. "It is this idea that you could have a kind of symmetry between everything contained in spacetime and spacetime itself," she says.

Weydert says she "[tries] to squeeze in time to paint," sometimes using ripped and cut pages from old textbooks in her work. "I try to express this quest for simplification in theoretical physics, that from one very, very basic theory, the whole complexity of the world emerges."

It was not easy to translate the mathematical and theoretical aspects of supersymmetry into a fictionalized story, luxx says. "The most challenging part was me grasping exactly the nuance of where physicists are at with understanding supersymmetry," she says. "I was learning the theory while writing it."

But the theory eventually clicked into place as a metaphor.

The poet says she wanted to ground the story in a common language to show how entwined physics is with everyday life. "Physics are just parts of us. We can't understand ourselves and can't understand society without at least somewhat understanding these theories."

The final piece strayed from Weydert's initial prompt, but Weydert says she is nevertheless pleased with the outcome. "I think what is important in these stories is not the science, as such, but connecting it to something that conveys an emotion."

For his contribution to the anthology, short story writer Adam Marek moved away from traditional prose. He instead wrote his piece in the form of an interview for the real-life BBC Radio program Desert Island Discs. The 45-minute show tells the story of a famous person's life through music tracks they have chosen.

"I've always loved it as a format for telling the story of someone's life," Marek says. "And I'd thought it could make a terrific framework for writing a short story."

The prompt Marek chose came from physicist Andrea Giammanco. As a postdoc, Giammanco had spent time researching the dark sector, a collection of hypothetical particles that have thus far gone unobserved. "I was in love with the idea of the dark sector popping up in unconventional ways," Giammanco says.

Marek had worked with other scientists on stories for other Comma Press anthologies, but he says this time was unique. "It was different," he says. "And I think that was because of Andrea's particular interests and approach."

Marek and Giammanco spoke many times throughout the project, on video calls and over email, Marek says. It helped that they shared a love of science fiction. "Andrea seemed as fascinated by the writing process as I was with the science. We had lots of questions for each other."

Giammanco says they threw so many ideas at each other that "we could have easily written five completely different stories."

In one of these brainstorming sessions, Giammanco says, he mentioned offhand that the dark sector may be hiding in plain sight, and it may even be revealed in a reanalysis of old data. This piqued Adam's attention, he says, and it ultimately became a key part of the plot.

Throughout the process, Giammanco ensured that everything in the story was at least possible. "Adam wanted the idea to work from the narrative point of view," he says. "But for me, it was paramount to make it work from the scientific point of view."

But staying within the realm of the possible didn't restrict Marek and Giammanco to the realm of the particularly plausible. Marek says that flexibility helped him find a creative way for the story to end.

"At the peak of being very stressed out about the story and not knowing how to finish it," he says, "Andrea just happened to send me an email about ghosts."

Giammanco had sent Marek an article that quoted physicist and pop-culture figure Brian Cox asserting that the LHC had unintentionally disproved the existence of incorporeal beings by failing to detect any evidence of them. But Giammanco had laid out an alternative argument: If ghosts did exist, they would just need to be a part of the dark sector, which the LHC has not been able to reach. "It just fixed it all," Marek says. "It was really, really fortuitous for the story."

Whether or not any of the ideas in the stories collected in Collision turns out to be true, the project highlights what both writers and scientists have in common: the ongoing quest to imagine "What if?"

Originally posted here:

Collaboration builds fantastical stories from nuggets of truth - Symmetry magazine

Physicists tracked electron recollision in real-time – Tech Explorist

The measurement of the fastest dynamical processes in nature typically relies on observing the nonlinear response of a system to precisely timed interactions with external stimuli. This usually requires two (or more) controlled events, with time-resolved information gained by controllably varying the interpulse delays.

Using a breakthrough technique created by MPIK physicists, with the underlying quantum-dynamics theory verified by collaborators at MPI-PKS, the motion of an electron in a strong infrared laser field has been tracked in real time. The experimental method connects the free-electron motion driven by the subsequent near-infrared pulse to the absorption spectrum of the ionizing extreme-ultraviolet pulse.

Although the electron is a quantum object, a classical description of its motion is appropriate for the experimental technique.

Strong-field physics fundamentally depends on high-harmonic generation, which converts optical or near-infrared (NIR) light into the extreme-ultraviolet (XUV) regime. In the well-known three-step concept, the driving light field (1) ionizes the electron by tunnel ionization, (2) accelerates it away from and then back to the ionic core, where the electron (3) recollides and emits XUV light if it recombines.

In this study, the physicists replaced the first step with XUV single-photon ionization, which has a twofold advantage: First, one can choose the ionization time relative to the NIR phase. Second, the NIR laser can be tuned to low intensities where tunnel ionization is practically impossible. This makes it possible to study strong-field-driven electron recollision in a low-intensity limiting case.
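The role of the ionization phase can be illustrated with the classical equations of motion underlying the three-step picture. The sketch below is a generic textbook calculation, not the team's analysis code, and its field parameters are illustrative: it releases an electron at rest at different phases of a cosine field and checks whether the field drives it back to the core.

```python
import numpy as np

# Classical three-step electron motion in atomic units: an electron born
# at rest at x = 0 at time t0 in a linearly polarized field E0*cos(w*t).
# Illustrative parameters (w = 0.057 a.u. is roughly 800-nm NIR light).
E0, w = 0.05, 0.057

def excursion(t0, n_periods=3, steps=20_000):
    """Closed-form solution of x'' = -E0*cos(w*t) with x(t0) = 0, v(t0) = 0."""
    t = np.linspace(t0, t0 + n_periods * 2 * np.pi / w, steps)
    return (E0 / w**2) * (np.cos(w * t) - np.cos(w * t0)) \
        + (E0 / w) * np.sin(w * t0) * (t - t0)

# Electrons released just after a field crest are driven back to the core
# (a recollision); those released just before it drift away for good.
for phase in np.linspace(-1.5, 1.5, 7):  # ionization phase w*t0 in radians
    x = excursion(phase / w)
    gone = np.abs(x) > 1.0               # the electron has clearly left
    recollides = gone.any() and np.abs(x[np.argmax(gone):]).min() < 0.1
    print(f"phase {phase:+.1f} rad: recollision = {recollides}")
```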

The method used here is attosecond transient absorption spectroscopy, previously established for bound electrons by a team led by Christian Ott, combined with a reconstruction of the time-dependent dipole moment. Extending the approach to free electrons links the time-dependent dipole moment to the classical motion (trajectories) of the ionized electrons.

Ph.D. student Tobias Heldt said, "Our new method, applied to helium as a model system, links the absorption spectrum of the ionizing light to the electron trajectories. This allows us to study ultrafast dynamics with a single spectroscopic measurement, without scanning a time delay to compose the dynamics frame by frame."

The results of the measurements indicate that, depending on the experimental settings, circular polarisation of the light wave can increase the likelihood of bringing the electron back to the ion. Despite seeming counterintuitive, theorists had anticipated this result.

This interpretation in terms of recolliding periodic orbits is also supported by classical simulations. Whenever the electron recollides with the helium atom, it produces a characteristic modification and increase of the time-dependent atomic dipole, which an attosecond absorption-spectroscopy experiment can pick up.

Group leader Christian Ott is optimistic about the future potential of this new approach: "In general, our technique allows us to explore laser-driven electron motion in a new lower-intensity regime, and it could further be applied to various systems, e.g., for studying the laser-driven electron dynamics within larger atoms or molecules."


Read the original post:

Physicists tracked electron recollision in real-time - Tech Explorist

Physicists discover 'stacked pancakes of liquid magnetism' – Phys.org


Physicists have discovered "stacked pancakes of liquid magnetism" that may account for the strange electronic behavior of some layered helical magnets.

The materials in the study are magnetic at cold temperatures and become nonmagnetic as they thaw. Experimental physicist Makariy Tanatar of Ames National Laboratory at Iowa State University noticed perplexing electronic behavior in layered helimagnetic crystals and brought the mystery to the attention of Rice theoretical physicist Andriy Nevidomskyy, who worked with Tanatar and former Rice graduate student Matthew Butcher to create a computational model that simulated the quantum states of atoms and electrons in the layered materials.

Magnetic materials undergo a "thawing" transition as they warm up and become nonmagnetic. The researchers ran thousands of Monte Carlo computer simulations of this transition in helimagnets and observed how the magnetic dipoles of atoms inside the material arranged themselves during the thaw. Their results were published in a recent study in Physical Review Letters.
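The paper's model is a layered, frustrated helimagnet, and the article does not include the team's code, but the elementary move in such simulations is the standard Metropolis update. A minimal sketch for a single ferromagnetic 2D Ising layer, shown only to illustrate how magnetic order "melts" as temperature rises:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, beta, J=1.0):
    """One Metropolis sweep of a 2D Ising lattice with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Energy cost of flipping spin (i, j): dE = 2*J*s*(sum of neighbors)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nn
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

# Start fully ordered ("frozen") and heat through the transition
# (beta_c ~ 0.44 for the 2D square lattice): |magnetization| melts away.
spins = np.ones((32, 32), dtype=int)
for beta in [0.6, 0.44, 0.35, 0.2]:
    for _ in range(300):
        metropolis_sweep(spins, beta)
    print(f"beta = {beta:.2f}  |m| = {abs(spins.mean()):.3f}")
```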

At a submicroscopic level, the materials under study are composed of thousands of 2D crystals stacked one atop another like pages in a notebook. In each crystal sheet, atoms are arrayed in lattices, and the physicists modeled quantum interactions both within and between sheets.

"We're used to thinking that if you take a solid, like a block of ice, and you heat it up, eventually it will become a liquid, and at a higher temperature, it will evaporate and become a gas," said Nevidomskyy, an associate professor of physics and astronomy and member of the Rice Quantum Initiative. "A similar analogy can be made with magnetic materials, except that nothing evaporates in a true sense of the word."

"The crystal is still intact," he said. "But if you look at the arrangement of the little magnetic dipoleswhich are like compass needlesthey start out in a correlated arrangement, meaning that if you know which way one of them is pointing, you can determine which way any of them points, regardless how far away it is in the lattice. That is the magnetic statethe solid in our analogy. As you heat up, the dipoles eventually will become completely independent, or random, with respect to one another. That's known as a paramagnet, and it is analogous to a gas."

Nevidomskyy said physicists typically think of materials either having magnetic order or lacking it.

"A better analogy from the classical viewpoint would be a block of dry ice," he said. "It kind of forgets about the liquid phase and goes straight from ice into gas. That's what magnetic transitions are usually like in the textbooks. We are taught that you start with something correlated, let's say a ferromagnet, and at some point the order parameter disappears, and you end up with a paramagnet."

Tanatar, a research scientist at Ames' Superconductivity and Magnetism Low-Temperature Laboratory, had found signs that the transition from magnetic order to disorder in helical magnets was marked by a transitory phase in which electronic properties, like resistance, differed by direction. For instance, they might differ if they were measured horizontally, from side to side, as opposed to vertically from top to bottom. This directional behavior, which physicists call anisotropy, is a hallmark of many quantum materials like high-temperature superconductors.

"These layered materials don't look the same in the vertical and horizontal directions," said Nevidomskyy. "That's the anisotropy. Makariy's intuition was that anisotropy was affecting how magnetism melts in the material, and our modeling demonstrated that to be true and showed why it happens."

The model showed that the material passes through an intermediate phase as it transitions from magnetic order to disorder. In that phase, dipole interactions are much stronger within sheets than between them. Moreover, the correlations between the dipoles resembled those of a liquid, rather than a solid. The result is "flattened puddles of magnetic liquids that are stacked up like pancakes," Nevidomskyy said. In each puddle-like pancake, dipoles point roughly in the same direction, but that sense of direction varies between neighboring pancakes.

"It's a bunch of atoms all with their dipoles pointing in the same direction," Nevidomskyy said. "But then, if you go up one layer, all of them are pointing in a different random direction."

The atomic arrangement in the material "frustrates" the dipoles and keeps them from aligning in a uniform direction throughout the material. Instead, the dipoles in the layers shift, rotating slightly in response to changes in neighboring pancakes.

"Frustrations make it difficult for the arrows, these magnetic dipoles, to decide where they want to point, at one angle or another," Nevidomskyy said. "And to relieve that frustration, they tend to rotate and shift in each layer."

Tanatar said, "The idea is that you have two competing magnetic phases. They are fighting each other, and as a result you have a transition temperature for these phases that is lower than it would be without competition. And in this competition scenario, the phenomena that lead to magnetic order are different from the phenomena when you don't have this competition."

Tanatar and Nevidomskyy said that while there's no immediate application for the discovery, it may nevertheless offer hints about the still-unexplained physics of other anisotropic materials like high-temperature superconductors.

Despite the name, high-temperature superconductivity occurs at very cold temperatures. One theory suggests that materials may become superconductors when they are cooled in the vicinity of a quantum critical point, a temperature sufficient to suppress long-range magnetic order and give rise to effects brought about by strong quantum fluctuations. For example, several magnetic "parent" materials have been shown to harbor superconductivity close to a quantum critical point where magnetism disappears.

"Once you suppress the main effect, the long-range magnetic ordering, you may give way to weaker effects like superconductivity," Tanatar said. "This is one of the leading theories of unconventional superconductivity. In our study, we show that you can do the same thing in a different way, with frustration or competing interactions."

More information: Matthew W. Butcher et al, Anisotropic Melting of Frustrated Ising Antiferromagnets, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.130.166701

Journal information: Physical Review Letters

See the article here:

Physicists discover 'stacked pancakes of liquid magnetism' - Phys.org

Is there any evidence that the "aether" exists? – Big Think

All throughout the Universe, different types of signals propagate. Some of them, like sound waves, require a medium to travel through. Others, like light or gravitational waves, are perfectly content to traverse the vacuum of space, seemingly defying the need for a medium altogether. Irrespective of how they do it, all of these signals can be detected from the effects they have on all the matter and energy that they interact with: both along their journey through space all the way up until their eventual arrival at their final destination.

But is it truly possible for waves to travel through the vacuum of space itself, without any need for a medium to propagate through at all? For some of us, this is a very counterintuitive notion, as the notion of things existing within and moving through some form of empty nothingness just doesn't make any sense. But plenty of things in physics don't make intuitive sense, as it isn't up to humans to tell nature what does and doesn't make sense. Instead, all we can do is ask the Universe questions about itself through experiment, observation, and measurement, and follow nature's answers to the best conclusions we can draw. Although there's no way to disprove the aether's (or anything else that's unobservable) existence, we can certainly look at the evidence and allow it to take us wherever it will.

Whether through a medium, like mechanical waves, or in a vacuum, like electromagnetic and gravitational waves, every ripple that propagates has a propagation speed. In no case is the propagation speed infinite, and in theory, the speed at which gravitational ripples propagate should be the same as the maximum speed in the Universe: the speed of light.

Back in the earliest days of science, before Newton, going back hundreds or even thousands of years, we only had large-scale, macroscopic phenomena to investigate. The waves we observed came in many different varieties.

In the case of all of these waves, matter is involved. That matter provides a medium for these waves to travel through, and as the medium either compresses-and-rarifies in the direction of propagation (a longitudinal wave) or oscillates perpendicular to the direction of propagation (a transverse wave), the signal is transported from one location to another.

This diagram, dating back to Thomas Young's work in the early 1800s, is one of the oldest pictures that demonstrate both constructive and destructive interference as arising from wave sources originating at two points: A and B. This is a physically identical setup to a double slit experiment, even though it applies just as well to water waves propagated through a tank.

As we began to investigate waves more carefully, a third type began to emerge. In addition to longitudinal and transverse waves, a type of wave where each of the particles involved undergoes motion in a circular path, a surface wave, was discovered. The rippling characteristics of water, which were previously thought to be either longitudinal or transverse waves exclusively, were shown to also contain this surface wave component.

All three of these types of waves are examples of mechanical waves, which is where some type of energy is transported from one location to another through a material, matter-based medium. A wave that travels through a spring, a slinky, water, the Earth, a string, or even the air, all require an impetus for creating some initial displacement from equilibrium, and then the wave carries that energy through a medium toward its destination.

A series of particles moving along circular paths can appear to create a macroscopic illusion of waves. Similarly, individual water molecules that move in a particular pattern can produce macroscopic water waves, individual photons make the phenomenon we perceive as light waves, and the gravitational waves we see are likely made out of individual quantum particles that compose them: gravitons.

It makes sense, then, that as we discovered new types of waves, we'd assume they had similar properties to the classes of waves we already knew about. Even before Newton, the aether was the name given to the void of space, where the planets and other celestial objects resided. Tycho Brahe's famous 1588 work, De Mundi Aetherei Recentioribus Phaenomenis, literally translates as On Recent Phenomena in the Aethereal World.

The aether, it was assumed, was the medium inherent to space that all objects, from comets to planets to starlight itself, traveled through. Whether light was a wave or a corpuscle, though, was a point of contention for many centuries. Newton claimed it was a corpuscle, while Christiaan Huygens, his contemporary, claimed it was a wave. The issue wasn't decided until the 19th century, when experiments with light unambiguously revealed its wave-like nature. (With modern quantum physics, we now know it behaves like a particle as well, but its wave-like nature cannot be denied.)

The results of an experiment, showcased using laser light around a spherical object, with the actual optical data. Note the extraordinary validation of the prediction of Fresnel's theory: that a bright central spot would appear in the shadow cast by the sphere, verifying the "absurd" prediction of the wave theory of light. Logic alone would not have gotten us here.

This was further borne out as we began to understand the nature of electricity and magnetism. Experiments that accelerated charged particles not only showed that they were affected by magnetic fields, but also that when you bent a charged particle's path with a magnetic field, it radiated light. Theoretical developments showed that light itself was an electromagnetic wave that propagated at a finite, large, but calculable velocity, today known as c, the speed of light in a vacuum.

If light was an electromagnetic wave, and all waves required a medium to travel through, and all the heavenly bodies traveled through the medium of space, then surely that medium itself, the aether, was the medium that light traveled through. The biggest question remaining, then, was to determine what properties the aether itself possessed.

In Descartes' vision of gravity, there was an aether permeating space, and only the displacement of matter through it could explain gravitation. This, unfortunately, did not lead to an accurate formulation of gravity that matched with observations.

One of the most important points about what the aether couldn't be was figured out by Maxwell himself, who was the first to derive the electromagnetic nature of light waves. In an 1874 letter to Lewis Campbell, he wrote:

It may also be worth knowing that the aether cannot be molecular. If it were, it would be a gas, and a pint of it would have the same properties as regards heat, etc., as a pint of air, except that it would not be so heavy.

In other words, whatever the aether was, or more accurately, whatever it was that electromagnetic waves propagated through, it could not have many of the traditional properties that other, matter-based media possessed. It could not be composed of individual particles. It could not contain heat. It could not be a conduit for the transfer of energy through it. In fact, just about the only thing left that the aether was allowed to do was serve as a background medium for things that were known to travel but didn't otherwise seem to require a medium, like light, to actually travel through.

If you split light into two perpendicular components and bring them back together, they will produce an interference pattern. If there's a medium that light is traveling through, the interference pattern should depend on how your apparatus is oriented relative to that motion.

All of this led to the most important experiment for detecting the aether: the Michelson-Morley experiment. If the aether really were a medium for light to travel through, then the Earth should be passing through the aether as it rotated on its axis and revolved around the Sun. Even though we only revolve at a speed of around 30 km/s, that's a substantial fraction (about 0.01%) of the speed of light.

With a sensitive enough interferometer, if light were a wave traveling through this medium, we should detect a shift in light's interference pattern dependent on the angle the interferometer made with our direction of motion. Michelson alone tried to measure this effect in 1881, but his results were inconclusive. Six years later, with Morley, he reached sensitivities capable of detecting a shift just 1/40th the magnitude of the expected signal. Their experiment, however, yielded a null result; there was no evidence for the aether at all.
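For a sense of scale, the expected signal is easy to reconstruct from the classical aether-wind picture. The numbers below (roughly 11 meters of folded path length, 500-nanometer light) are the commonly quoted parameters of the 1887 apparatus:

```python
# Expected fringe shift in the Michelson-Morley experiment under the
# classical aether-wind picture. Commonly quoted 1887 parameters:
L = 11.0             # effective (folded) arm length in meters
wavelength = 500e-9  # visible light, ~500 nm
v_over_c = 1e-4      # Earth's orbital speed (~30 km/s) / speed of light

# Each arm picks up a path difference of order L*(v/c)^2; rotating the
# apparatus by 90 degrees swaps the arms, doubling the observable shift.
shift = 2 * L / wavelength * v_over_c**2
print(f"expected shift: {shift:.2f} fringes")         # ~0.44 fringes
print(f"detectable shift: {shift / 40:.3f} fringes")  # ~0.01, yet none seen
```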

The Michelson interferometer (top) showed a negligible shift in light patterns (bottom, solid) as compared with what was expected if Galilean relativity were true (bottom, dotted). The speed of light was the same no matter which direction the interferometer was oriented, including with, perpendicular to, or against the Earth's motion through space.

Aether enthusiasts contorted themselves in knots attempting to explain this null result.


The explanations they proposed, despite their arbitrary constants and parameters, were seriously considered right up until Einstein's relativity came along. Once the realization came about that the laws of physics should be, and in fact were, the same for all observers in all frames of reference, the idea of an absolute frame of reference, which the aether absolutely was, was no longer necessary or tenable.

If you allow light to come from outside your environment to inside, you can gain information about the relative velocities and accelerations of the two reference frames. The fact that the laws of physics, the speed of light, and every other observable is independent of your reference frame is strong evidence against the need for an aether.

What all of this means is that the laws of physics don't require the existence of an aether; they work just fine without one. Today, with our modern understanding of not just Special Relativity but also General Relativity, which incorporates gravitation, we recognize that both electromagnetic waves and gravitational waves don't require any sort of medium to travel through at all. The vacuum of space, devoid of any material entity, is enough all on its own.

This doesn't mean, however, that we've disproven the existence of the aether. All we've proven, and indeed all we're capable of proving, is that if there is an aether, it has no properties that are detectable by any experiment we're capable of performing. It doesn't affect the motion of light or gravitational waves through it, not under any physical circumstances, which is equivalent to stating that everything we observe is consistent with its non-existence.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. (Specifically, for the strong interactions.) Even in empty space, this vacuum energy is non-zero, and what appears to be the ground state in one region of curved space will look different from the perspective of an observer where the spatial curvature differs. As long as quantum fields are present, this vacuum energy (or a cosmological constant) must be present, too.

If something has no observable, measurable effects on our Universe in any way, shape, or form, even in principle, we consider that thing to be physically non-existent. But the fact that there's nothing pointing to the existence of the aether doesn't mean we fully understand what empty space, or the quantum vacuum, actually is. In fact, there are a whole slew of unanswered, open questions about exactly that topic plaguing the field today.

Why does empty space still have a non-zero amount of energy (dark energy, or a cosmological constant) intrinsic to it? If space is discrete at some level, does that imply a preferred frame of reference, where that discrete size is maximized under the rules of relativity? Can light or gravitational waves exist without space to travel through, and does that mean there is some type of propagation medium, after all?

As Carl Sagan famously said, "Absence of evidence is not evidence of absence." We have no proof that the aether exists, but we can never prove the negative: that no aether exists. All we can demonstrate, and have demonstrated, is that if the aether exists, it has no properties that affect the matter and radiation that we actually do observe, and so the burden isn't on those looking to disprove its existence: the burden of proof is on those who favor the aether, to provide evidence that it truly is real.

See original here:

Is there any evidence that the "aether" exists? - Big Think

20 Fastest-Growing Cities in the Southeast – Yahoo Finance

In this piece, we will take a look at the 20 fastest-growing cities in the Southeast. For more cities, head on over to 5 Fastest-Growing Cities in the Southeast.

The Southeastern region of the United States, as defined by the Geological Survey, is made up of eleven states in the continental United States, the Virgin Islands, and Puerto Rico. It is one of the oldest regions in America, with the first signs of human presence in the area dating back tens of thousands of years. Additionally, the Southeastern states were a crucial part of the Civil War in the 19th century, with the bulk of them deciding to break away from America.

Since then, the Southeast has changed significantly and is a global leader in industrial production and manufacturing. In fact, one of the most famous facilities in the world is located in the Southeastern state of Florida. Florida is where humans launched rockets to the Moon, the only place in the world with that honor. Courtesy of its proximity to the equator, Florida makes for the perfect site to launch rockets, as they can capitalize on the Earth's rotation on its axis and follow the equator to gain an additional 'oomph' to their speed during launch. The launch site is the National Aeronautics and Space Administration's (NASA) Kennedy Space Center, and it still plays a crucial part in the American space program by being the only launch site in the country that is capable of supporting crewed missions.

However, the space center is not the only crucial NASA facility in the region. Another highly important site in the Southeast is the Stennis Space Center. Stennis is located in Hancock County, Mississippi, and it is responsible for producing some of the most breathtaking visuals in America. This is due to the fact that Stennis is the only site in America that can withstand the unimaginable forces generated by a rocket engine. Like KSC, Stennis is also a historic site, since it was built to test the massive F-1 engines for the Saturn V rocket.


Yet, as if sending people to the Moon and blasting some of the most powerful rocket engines made in human history were not enough, the Southeast is also busy leading America in other high-technology areas. For instance, the second largest research park in the world, the Cummings Research Park, is in the Southeastern state of Alabama. Cummings hosts more than three hundred companies, including high-end space and aerospace companies such as Lockheed Martin, Dynetics, and Teledyne. Additionally, the Southeast is also home to Florida State University. The university has the only high-field magnet laboratory in America, the National High Magnetic Field Laboratory (MagLab). This facility runs several programs, ranging from spectroscopy and electromagnetic resonance to quantum physics research and high-magnetic-field experiments.

Not only is the second largest research park in the world in the Southeast, but the biggest one - namely the Research Triangle Park in North Carolina, which covers seven thousand acres - is also in the region. The RTP, like Cummings, has more than three hundred companies, which include some of the largest and most advanced firms on the planet. This list includes the pharmaceutical giant GSK and the networking equipment manufacturer Cisco. Other firms in the area include agriculture technology companies, medical device firms, and biotechnology companies.

The region is also home to some of the biggest companies in the world. These include The Coca-Cola Company (NYSE:KO), United Parcel Service, Inc. (NYSE:UPS), Walmart Inc. (NYSE:WMT), and Delta Air Lines, Inc. (NYSE:DAL). These are just a handful of firms in the area, and most of these are in Atlanta - one of the Southeast's largest and most prosperous cities. Atlanta is the capital of Georgia and has one of the biggest gross domestic products (GDPs) in the world and in the U.S. There are more than a thousand multinational firms in the area, and the city's economy is dominated by information technology and broadcasting markets. Another prosperous Southeastern city is Miami - the second largest city in Florida in terms of population. Miami has some of the busiest sea and air ports in America, and its location also makes it a hub of the cruise ship industry in the U.S.

With these details in mind, let's take a look at some of the fastest-growing cities in the Southeast.


Our Methodology

To compile our list of the fastest growing cities in the Southeast, we first narrowed down the 30 largest cities in the region in terms of population. Then, their inbound move rates, which are the percentage of inbound movers over the total inbound and outbound movers, were determined. Finally, the corresponding metropolitan areas were ranked accordingly, out of which the top twenty fastest-growing cities in the Southeast were chosen and are listed below.
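For readers who want the arithmetic, the metric reduces to a single ratio; the example numbers below are made up for illustration, not taken from the rankings:

```python
def net_inbound_move_rate(inbound, outbound):
    """Share of all moves that were inbound, as a percentage."""
    return 100.0 * inbound / (inbound + outbound)

# Hypothetical example: 5,430 inbound vs. 4,570 outbound moves -> 54.3%
print(f"{net_inbound_move_rate(5_430, 4_570):.1f}%")
```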

Net Inbound Move Rate: 52.3%

Winston-Salem is a city in North Carolina with a quarter of a million people living in its boundaries. Its metropolitan area has five counties, and the city houses several large food and clothing companies. Additionally, the city has a strong presence in the healthcare segment.

Net Inbound Move Rate: 52.4%

Saint Petersburg is a city in Florida and one of the largest in the state in terms of population. The city also embodies Florida's nickname, the Sunshine State, since it gets sunshine on most days of the year. Saint Petersburg has a diversified economy, with healthcare, insurance, and retail companies being some of the largest employers in the area.

Net Inbound Move Rate: 52.4%

Tampa is another Floridian city. It houses nearly four hundred thousand people and was set up in 1823. Tampa has the biggest port in the state in terms of tonnage handled. It also has a large presence of the U.S. Air Force and headquarters some of the most crucial Pentagon installations.

Net Inbound Move Rate: 52.7%

Augusta is one of the smallest cities on our list since it houses a little over two hundred thousand people. Augusta is in Georgia and is one of the oldest cities on our list since it was set up in the early 19th century. It houses facilities for several important companies such as T-Mobile, Kellogg's, Kimberly Clark, and Nutrien.

Net Inbound Move Rate: 53%

Durham is one of the largest cities in North Carolina. It is known for having one of the best universities in the world, Duke University. Additionally, International Business Machines (IBM) has a large presence in the city, as do large healthcare and investment firms.

Net Inbound Move Rate: 53.4%

Washington D.C. is the capital of the United States. It houses the Capitol, the White House, and the Supreme Court. Additionally, the city is also home to several international financing institutions and has some of the highest costs of living in the U.S.

Net Inbound Move Rate: 53.5%

Jacksonville is the largest city in Florida in terms of population. Almost one million people live within the city's boundaries, and Jacksonville is also part of one of Florida's largest metropolitan areas. It houses several Fortune 500 companies and is a growing financial center.

Net Inbound Move Rate: 53.6%

Orlando houses more than three hundred thousand people and is a tourism hub in Florida. It also has a large technology park and a presence of important aerospace firms.

Net Inbound Move Rate: 53.8%

Baton Rouge is the capital of Louisiana. The city has some of the largest petroleum facilities in the world, as well as a strong presence of the chemical industry.

Net Inbound Move Rate: 53.8%

Atlanta is one of the most prosperous cities in the Southeast. It has some of the largest numbers of Fortune 500 firms in the region.

Net Inbound Move Rate: 54%

Greensboro is a city in North Carolina. It has a concentration of a variety of manufacturing and industrial companies.

Net Inbound Move Rate: 54.2%

Lexington is the second largest city in Kentucky. It houses several important Fortune 500 companies operating in technology and other industries.

Net Inbound Move Rate: 54.3%

Raleigh is the tenth largest city in the Southeast. The education system is one of the city's largest employers.

Net Inbound Move Rate: 54.7%

Chesapeake is the second largest city in Virginia. Most of its residents are employed in the public sector and healthcare.

Net Inbound Move Rate: 54.7%

Norfolk is another Virginian city and one of the smaller ones on our list since it houses less than half a million people.

Click to continue reading and see 5 Fastest-Growing Cities in the Southeast.


Disclosure: None. "20 Fastest-Growing Cities in the Southeast" was originally published on Insider Monkey.

Go here to read the rest:

20 Fastest-Growing Cities in the Southeast - Yahoo Finance

Team creates heaviest Schrödinger's cat yet – Futurity: Research News


Researchers have created the heaviest Schrödinger cat to date by putting a crystal in a superposition of two oscillation states.

Even if you are not a quantum physicist, you will most likely have heard of Schrödinger's famous cat. Erwin Schrödinger came up with the feline that can be alive and dead at the same time in a thought experiment in 1935. The obvious contradiction (after all, in everyday life we only ever see cats that are either alive or dead) has prompted scientists to try to realize analogous situations in the laboratory. So far, they have managed to do so using, for instance, atoms or molecules in quantum mechanical superposition states of being in two places at the same time.

At ETH Zurich, a team of researchers led by Yiwen Chu, professor at the Laboratory for Solid State Physics, report their findings in the journal Science. The work could lead to more robust quantum bits and shed light on the mystery of why quantum superpositions are not observed in the macroscopic world.

In Schrödinger's original thought experiment, a cat is locked up inside a metal box together with a radioactive substance, a Geiger counter, and a flask of poison. In a certain time frame (an hour, say), an atom in the substance may or may not decay through a quantum mechanical process with a certain probability, and the decay products might cause the Geiger counter to go off and trigger a mechanism that smashes the flask containing the poison, which would eventually kill the cat.

Since an outside observer cannot know whether an atom has actually decayed, they also don't know whether the cat is alive or dead; according to quantum mechanics, which governs the decay of the atom, it should be in an alive/dead superposition state.

"Of course, in the lab we can't realize such an experiment with an actual cat weighing several kilograms," says Chu. Instead, she and her coworkers managed to create a so-called cat state using an oscillating crystal, which represents the cat, with a superconducting circuit representing the original atom. That circuit is essentially a quantum bit, or qubit, that can take on the logical states 0 or 1, or a superposition of both states, 0+1.

The link between the qubit and the crystal cat is not a Geiger counter and poison, but rather a layer of piezoelectric material that creates an electric field when the crystal changes shape while oscillating. That electric field can be coupled to the electric field of the qubit, and hence the superposition state of the qubit can be transferred to the crystal.

As a result, the crystal can now oscillate in two directions at the same time (up/down and down/up, for instance). Those two directions represent the "alive" or "dead" states of the cat. "By putting the two oscillation states of the crystal in a superposition, we have effectively created a Schrödinger cat weighing 16 micrograms," explains Chu. That is roughly the mass of a fine grain of sand and nowhere near that of a cat, but still several billion times heavier than an atom or molecule, making it the fattest quantum cat to date.

In order for the oscillation states to be true cat states, it is important that they be macroscopically distinguishable. This means that the separation of the up and down states should be larger than any thermal or quantum fluctuations of the positions of the atoms inside the crystal. Chu and her colleagues checked this by measuring the spatial separation of the two states using the superconducting qubit. Even though the measured separation was only a billionth of a billionth of a meter (smaller than an atom, in fact), it was large enough to clearly distinguish the states.
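That criterion can be made quantitative with a toy model: two Gaussian wavepackets of zero-point width sigma, whose quantum overlap collapses once their separation exceeds a few sigma. The parameters below are illustrative, not the experiment's:

```python
import numpy as np

# Two Gaussian wavepackets of zero-point width sigma, centered d apart.
# Their overlap falls off as exp(-d^2 / 8 sigma^2), so a separation of a
# few sigma already makes the two states effectively distinguishable.
# Illustrative toy model, not the ETH experiment's actual parameters.
sigma = 1.0
x = np.linspace(-40 * sigma, 40 * sigma, 20_001)

def packet(center):
    psi = np.exp(-(x - center) ** 2 / (4 * sigma ** 2))
    return psi / np.sqrt(np.trapz(psi ** 2, x))  # normalize to unit norm

for d in [0.5, 2.0, 8.0]:
    overlap = np.trapz(packet(-d / 2) * packet(d / 2), x)
    print(f"d = {d:>3} sigma: overlap = {overlap:.2e}")
```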

In the future, Chu would like to push the mass limits of her crystal cats even further. "This is interesting because it will allow us to better understand the reason behind the disappearance of quantum effects in the macroscopic world of real cats," she says. Beyond this rather academic interest, there are also potential applications in quantum technologies.

For instance, quantum information stored in qubits could be made more robust by using cat states made up of a huge number of atoms in a crystal rather than relying on single atoms or ions, as is currently done. Also, the extreme sensitivity of massive objects in superposition states to external noise could be exploited for precise measurements of tiny disturbances such as gravitational waves or for detecting dark matter.

Source: ETH Zurich

Go here to read the rest:

Team creates heaviest Schrödinger's cat yet - Futurity: Research News

New Way to Investigate the Expansion of the Early Universe – AZoQuantum

Scientists have found a new generic mechanism for the production of gravitational waves by phenomena called oscillons. Several theories propose that these arise from the fragmentation into solitonic lumps of the inflaton field that drove the rapid expansion of the early Universe. The findings were reported in a new study published in Physical Review Letters.

The results pave the way for exciting new insights into the earliest moments of the Universe.

The inflationary period is the moment immediately following the Big Bang, during which the Universe is believed to have expanded exponentially. In several theories of cosmology, oscillons form right after this rapid expansion.

Oscillons, a kind of localized, non-linear, massive structure, can develop from fields, like the inflaton field, that are oscillating at high frequencies. Such structures can persist for long periods, and, as the scientists found, their ultimate decay can produce a considerable quantity of gravitational waves, which are ripples in space-time.

In their research, Kaloian D. Lozanov, a project researcher at the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU), and Volodymyr Takhistov, a Kavli IPMU visiting associate scientist, senior scientist at the International Center for Quantum-field Measurement Systems for Studies of the Universe and Particles (QUP), and assistant professor at the High Energy Accelerator Research Organization (KEK) Theory Center, simulated the evolution of the inflaton field in the early Universe and found that oscillons formed. They then showed that the decay of those oscillons could produce gravitational waves detectable by forthcoming gravitational-wave observatories.

The results offer a new test of the dynamics of the early Universe, independent of the traditionally studied cosmic microwave background radiation. Detecting these gravitational waves would open a new window onto the earliest moments of the Universe and could help shed light on some of the most pressing open questions in cosmology.

With the continuing development of supercomputing resources and gravitational-wave detectors, even more insight into the early moments of the Universe can be gained in the future. Altogether, the new research demonstrates the power of combining theoretical models with cutting-edge computational techniques and observations to unlock new insights into the evolution of the Universe.

Lozanov, K. D., & Takhistov, V. (2023). Enhanced Gravitational Waves from Inflaton Oscillons. Physical Review Letters, 130, 181002. https://doi.org/10.1103/PhysRevLett.130.181002

Source: https://www.ipmu.jp/en


Leaky-wave metasurfaces: A perfect interface between free-space … – Science Daily

Researchers at Columbia Engineering have developed a new class of integrated photonic devices -- "leaky-wave metasurfaces" -- that can convert light initially confined in an optical waveguide to an arbitrary optical pattern in free space. These devices are the first to demonstrate simultaneous control of all four optical degrees of freedom, namely, amplitude, phase, polarization ellipticity, and polarization orientation -- a world record. Because the devices are so thin, transparent, and compatible with photonic integrated circuits (PICs), they can be used to improve optical displays, LIDAR (Light Detection and Ranging), optical communications, and quantum optics.

"We are excited to find an elegant solution for interfacing free-space optics and integrated photonics -- these two platforms have traditionally been studied by investigators from different subfields of optics and have led to commercial products addressing completely different needs," said Nanfang Yu, associate professor of applied physics and applied mathematics who is a leader in research on nanophotonic devices. "Our work points to new ways to create hybrid systems that utilize the best of both worlds -- free-space optics for shaping the wavefront of light and integrated photonics for optical data processing -- to address many emerging applications such as quantum optics, optogenetics, sensor networks, inter-chip communications, and holographic displays."

Bridging free-space optics and integrated photonics

The key challenge of interfacing PICs and free-space optics is to transform a simple waveguide mode confined within a waveguide -- a thin ridge defined on a chip -- into a broad free-space wave with a complex wavefront, and vice versa. Yu's team tackled this challenge by building on their invention last fall of "nonlocal metasurfaces," extending the devices' functionality from controlling free-space light waves to controlling guided waves.

Specifically, they expanded the input waveguide mode, using a waveguide taper, into a slab waveguide mode -- a sheet of light propagating along the chip. "We realized that the slab waveguide mode can be decomposed into two orthogonal standing waves -- waves reminiscent of those produced by plucking a string," said Heqing Huang, a PhD student in Yu's lab and co-first author of the study, published today in Nature Nanotechnology. "Therefore, we designed a 'leaky-wave metasurface' composed of two sets of rectangular apertures that have a subwavelength offset from each other to independently control these two standing waves. The result is that each standing wave is converted into a surface emission with independent amplitude and polarization; together, the two surface emission components merge into a single free-space wave with completely controllable amplitude, phase, and polarization at each point over its wavefront."
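The arithmetic behind that decomposition is simple enough to sketch. The toy below (illustrative NumPy, not the authors' design code; the weights and polarizations are made-up values) splits a traveling slab mode exp(ikx) into its cos(kx) and sin(kx) standing waves and gives each its own complex weight and emission polarization, yielding a free-space field whose amplitude, phase, and polarization are set independently at every point:

```python
# Sketch of the standing-wave picture behind leaky-wave metasurfaces:
# a slab waveguide mode exp(ikx) splits into two orthogonal standing
# waves, cos(kx) and sin(kx). If each aperture set radiates one
# standing wave into one polarization with its own complex weight,
# the summed emission realizes a Jones vector point by point.
import numpy as np

k = 2*np.pi / 1.55          # propagation constant at 1.55 um (arb. units)
x = np.linspace(0, 10, 1000)

mode = np.exp(1j*k*x)                   # guided traveling wave
standing_a = np.cos(k*x)                # "plucked string" component 1
standing_b = np.sin(k*x)                # "plucked string" component 2
# exp(ikx) = cos(kx) + i sin(kx): the decomposition is exact.
assert np.allclose(mode, standing_a + 1j*standing_b)

# Give each standing wave its own complex weight and polarization
# (values below are arbitrary stand-ins for an aperture design):
w_a, w_b = 0.8*np.exp(1j*0.3), 0.6*np.exp(-1j*1.1)
pol_a = np.array([1, 0])     # x-polarized surface emission
pol_b = np.array([0, 1])     # y-polarized surface emission

# Free-space field: a 2-component Jones vector at every x.
E = (w_a*standing_a)[:, None]*pol_a + (w_b*standing_b)[:, None]*pol_b
print("field shape:", E.shape)   # (1000, 2): amplitude, phase, polarization
```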

From quantum optics to optical communications to holographic 3D displays

Yu's team experimentally demonstrated multiple leaky-wave metasurfaces that can convert a waveguide mode propagating along a waveguide with a cross-section on the order of one wavelength into free-space emission with a designer wavefront over an area about 300 times the wavelength at the telecom wavelength of 1.55 microns. These include:

A leaky-wave metalens that produces a focal spot in free space. Such a device will be ideal for forming a low-loss, high-capacity free-space optical link between PIC chips; it will also be useful for an integrated optogenetic probe that produces focused beams to optically stimulate neurons located far away from the probe. (A sketch of the phase profile such a lens targets follows this list.)

A leaky-wave optical-lattice generator that can produce hundreds of focal spots forming a Kagome lattice pattern in free space. In general, the leaky-wave metasurface can produce complex aperiodic and three-dimensional optical lattices to trap cold atoms and molecules. This capability will enable researchers to study exotic quantum optical phenomena or conduct quantum simulations hitherto not easily attainable with other platforms, and enable them to substantially reduce the complexity, volume, and cost of atomic-array-based quantum devices. For example, the leaky-wave metasurface could be directly integrated into the vacuum chamber to simplify the optical system, making portable quantum optics applications, such as atomic clocks, a possibility.

A leaky-wave vortex-beam generator that produces a beam with a corkscrew-shaped wavefront. This could lead to a free-space optical link between buildings that relies on PICs to process information carried by light, while also using light waves with shaped wavefronts for high-capacity intercommunication.

A leaky-wave hologram that can display four distinct images simultaneously: two at the device plane (at two orthogonal polarization states) and another two at a distance in free space (also at two orthogonal polarization states). This function could be used to make lighter, more comfortable augmented reality goggles and more realistic holographic 3D displays.
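As noted in the metalens item above, producing a focal spot means imprinting a lens-like phase on the emitted wavefront. The paper does not spell out its design formula, so the sketch below uses the standard textbook profile for focusing a plane wave to a spot at distance f, phi(r) = -(2*pi/lambda)(sqrt(r^2 + f^2) - f), with made-up numbers; treat it as an assumption about what such a device targets, not the authors' actual design:

```python
# Textbook phase profile a metalens imposes to focus a plane wave at
# distance f. Whether the leaky-wave metalens uses exactly this
# profile is an assumption; it is the standard target for a focal spot.
import numpy as np

lam = 1.55e-6                   # telecom wavelength, m
f = 1.0e-3                      # illustrative focal length, m
r = np.linspace(0, 250e-6, 6)   # radial positions on the device, m

phase = -(2*np.pi/lam) * (np.sqrt(r**2 + f**2) - f)
for ri, ph in zip(r, np.mod(phase, 2*np.pi)):
    print(f"r = {ri*1e6:6.1f} um -> required phase = {ph:5.2f} rad")
```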

Device fabrication

Device fabrication was carried out at the Columbia Nano Initiative cleanroom, and at the Advanced Science Research Center NanoFabrication Facility at the Graduate Center of the City University of New York.

Next steps

Yu's current demonstration is based on a simple polymer-silicon nitride materials platform at near-infrared wavelengths. His team plans next to demonstrate devices based on the more robust silicon nitride platform, which is compatible with foundry fabrication protocols and tolerant to high optical power operation. They also plan to demonstrate designs for high output efficiency and operation at visible wavelengths, which is more suitable for applications such as quantum optics and holographic displays.

The study was supported by the National Science Foundation (grant no. QII-TAQS-1936359 (H.H., Y.X., and N.Y.) and no. ECCS-2004685 (S.C.M., C.-C.T., and N.Y.)), the Air Force Office of Scientific Research (no. FA9550-16-1-0322 (N.Y.)), and the Simons Foundation (A.C.O. and A.A). S.C.M. acknowledges support from the NSF Graduate Research Fellowship Program (grant no. DGE-1644869).


What I’ve Learned: Senator Mark Kelly on Space Travel, Wife Gabby Giffords, and Gun Reform – Esquire


Mark Kelly, fifty-nine, is the junior U.S. senator from Arizona. Previously he was a NASA astronaut and U.S. Navy pilot. He was inducted into the U.S. Astronaut Hall of Fame on May 6, 2023. He spoke to Esquire from his office in Washington, D.C.

What have I learned from being a twin? Take risks, because you have a ready supply of spare parts.

Twins have weird habits with each other. My brother and I have shaken hands once in our lives. The only time we ever did was when he got back from space and there was a camera there, specifically to capture the moment.

The wildest job I've had is the United States Senate.

Some of my other jobs, whether it was flying airplanes in the Navy or fifteen years at NASA, were about science, data, and facts. On Capitol Hill, those are somewhat fluid concepts. Sometimes you see people just make some shit up. That's surprising. In the United States Senate, people take this job seriously. People from both sides of the aisle try to stick with reality and facts. But occasionally, that gets pushed aside for political purposes.

I grew up Catholic. Had to go to Sunday school, for some reason, on Wednesdays.

My mom was the first woman to be a police officer in West Orange, New Jersey. She worked in a jail before that, as a guard. She was a trailblazer. She taught my brother and me about hard work and doing something that's really outside your comfort zone.

Mark Kelly and his twin brother, Scott, with their mother in the 1960s.

When I was in the ninth grade, I bought myself a stereo system and collected a bunch of albums. I had a lot of Elton John. Yeah, I think I liked "Rocket Man."

I always felt like there was mandatory public service for me. My grandfather served in World War II in the Pacific. My other grandfather served in the Merchant Marine in the Atlantic during World War II. My dad was in the 82nd Airborne. I felt like it was my obligation to serve our country. It worked well with my goals. I wanted to fly off an aircraft carrier, and I also wanted to be an astronaut.

I flew thirty-nine combat missions over Iraq and Kuwait. After the first couple dozen of them, even though you're being shot at, you get used to it. But you're still wringing your hands over landing on the aircraft carrier at 3:00 in the morning. If you bolter (if the hook on your airplane doesn't grab the wire), you've got to go around. Now a lot of people are watching. If you don't get aboard after two or three tries, you've got to go to the tanker to refuel. Now you're really starting to get stressed.

I've got fifty-four days in space. I did make four trips to the space station. I was the last commander of the space shuttle Endeavour. I brought up what's arguably the most valuable payload ever floated in the space shuttle: the Alpha Magnetic Spectrometer, on my last flight.

My third flight, we brought up toilet parts. You don't want the toilet to break in space. Not a good situation.

People don't realize that, physiologically, being in space is hard on you as a person. You lose blood volume. The fluid shifts in your body because of the lack of gravity. You just start to urinate out plasma volume. You come back very dehydrated. It messes up your neuro-vestibular system. Things that are pretty simple on earth become hard. There's also a reasonable chance the whole thing's just going to blow up on you at liftoff.

The most rewarding thing about being in space is that you're doing something that's very technically challenging. A large team of people is trying to work together to accomplish some challenging goals for our country. You've prepared for something for years. When it goes well and you come back safely and you've got everything done, it's very rewarding.

We live on an island in our solar system. We don't have any other place to go. We need to take care of this planet because it's the only one we've got.

Mark Kelly, right, and his brother, Scott, left, in Phoenix in November 2022.

At a very basic level, humans are explorers. We want to see what's over the next hill. We want to see what's across the ocean. We want to see what's on the surface of the moon. When we do that, we learn more about ourselves, but we also wind up developing significant technology that drives our economy for decades. From an economic standpoint, even though it's an up-front investment and these things cost money, NASA is one of the great values for the U.S. taxpayer, because the country gets a very good return on that investment.

The Martian is the best movie about being in space.

It's likely that there's other life out there. Whether it's intelligent life that travels around its own solar system or around the galaxy, at this point there's no evidence of that. Not that I can tell you, anyway.

Mark Kelly aboard Space Shuttle Endeavour prior to a simulated launch countdown in November 2001.

We have gun laws like no other country on the planet. It's crazy that we don't do background checks for all gun sales. That's just something that most Americans think is the right thing to do. We give access to the most powerful weapons (weapons that are designed for the military, designed to kill a lot of people in a short period of time) to just about anybody. That doesn't make a lot of sense to me. We should have higher standards to own these kinds of firearms.

I'm a believer in the Second Amendment. I'm a gun owner myself. But we, as a country, have let this thing get out of control. That's why we have mass shooting after mass shooting.

I wouldn't say I've forgiven the guy who shot my wife Gabby [Giffords] or would ever forgive him for what he did. He murdered six people, including a nine-year-old girl. Shot Gabby in the head. Shot twelve other individuals, some multiple times. Horrific. But unless somebody asks me about it, I don't spend any time thinking about him or thinking about this. I'm a lot like my wife: move ahead, move ahead. That's not something that I spend any time on.

After what happened to Gabby, and that she survived and thought about coming back, I often think there's a greater purpose in the world for certain people. That things don't just always happen by accident.

Gabby Giffords and Mark Kelly on the campaign trail in Arizona in November 2022. Kelly was first elected to the U.S. Senate in 2020 in a special election following the death of Senator John McCain. He was reelected to a full term in 2022.

When we finally send somebody to Mars, there's probably a pretty good chance it'll be somebody with a lot of experience but who's a little bit older. The reason is that the radiation dose is really high. The one way you can mitigate that is to send somebody who's older, because they have less time left to get cancer from the radiation and die from it. They're going to die from something else first. When we send somebody, it's not going to be like Neil Armstrong at thirty-nine years old or Buzz Aldrin at thirty-nine. It'll probably be some guy around my age, sixty.

I could do that mission right now, and I would love to, if it wasn't one-way. I'd like to come back.

What I've learned being on the Aging Committee is that these programs, whether it's Meals on Wheels or Medicare prescription-drug programs, are for people who really need them. The bill that the House passed would take away Meals on Wheels from a million seniors. It's a horrible thing to do to people.

Senator Mark Kelly at the Arizona Science Center in April 2023.

I only think about running for president when somebody like you asks that question; then I'll go on to the next thing I have to do, which is another fifty things today.

More than anything else from being a parent, I learned patience.

Window seat or aisle? I got to see this earth from space for fifty-four days. Aisle seat.


New golden era of space exploration matches upheavals of past – Milwaukee Journal Sentinel

David M. Shribman | Milwaukee Journal Sentinel


VERO BEACH, Fla. -- I saw it lift into the dusky heavens, reaching upward in a stunning ballet of determination and grace, creeping across the sky in an orange streak, stretching toward earth orbit. And somehow the rise of the SpaceX Falcon Heavy rocket defied the notion that there is no revelation in repetition.

Americans have been launching rockets from Florida for 73 years; the first one was on July 24, 1950, a date hardly anyone marks or is even aware of. Since Project Apollo, which catapulted men to the moon, ended more than 50 years ago, followed by the eclipse of the Space Shuttle, which mounted 135 missions, spaceflight has prompted a certain ennui. The days when a black-and-white television atop a tall metal tower was wheeled into classrooms for schoolchildren to witness a Project Mercury launch became a fading memory, like the lyrics of a Shelley Fabares song.

But suddenly it is passé to say that spaceflight is passé.

The past few weeks have proved that. The astronauts who will return to lunar orbit on Artemis II were identified to much fanfare and much public interest. The Jupiter Icy Moons Explorer satellite, known as Juice, took off from the Guiana Space Center in Kourou, French Guiana, on a mission that will take it through 25 flybys of Jupiter's moons Callisto, Europa, and Ganymede: "one of the most exciting missions we have ever flown in the solar system," in the characterization of Josef Aschbacher, the head of the European Space Agency, "and by far the most complex."

And there is more. The Starship rocket lifted off the pad in southern Texas, cleared the launchpad, and flew for four minutes before collapsing into a spectacular fireball, and yet SpaceX declared the mission a great success. Mission personnel from a private Japanese company may have lost contact with the ispace lunar lander, but the Hakuto-R Mission 1 vehicle is presumed to have crashed in the Atlas Crater on the near side of the moon; like the Starship, an achievement amid disappointment.

The $97 million SpaceX rocket that slipped the surly bonds of earth recently was carrying satellites designed to improve internet service in the Americas, Europe, the Middle East, and the Asia Pacific region, with another satellite designed to provide high-speed internet to remote areas in Alaska. Neither their form nor function was remotely conceivable when the Apollo 8 astronauts circled the moon at Christmastime in 1968, sending unforgettable pictures of their home planet and reading from the Book of Genesis.

That was 55 years ago, and yet the parallels between 1968 and 2023 are unmistakable: Social tensions. Cultural upheaval. Political divisions. A sense of despair. And yet when Bill Anders, Jim Lovell and Frank Borman began reading the verses of the Bible, there was a glimmer of hope on the old planet.

"We could have a moment like that a year or so from now, when the Artemis astronauts return to moon orbit," said Jennifer Levasseur, curator at the National Air and Space Museum. "There is something about the state of the world today that seems similar to 1968. It makes me think that this is just the right time for something like this. We are building to a pretty big moment."

Jeremy Hansen, the Canadian astronaut who will travel aboard the first manned Artemis mission, believes so. "How do we actually get eight billion people to row in the same direction and work on [our] problems?" he asked when the Artemis astronauts submitted to a Canadian Press interview. "Because these are global problems. We can do great things together. We can do better as a human race. And here's one small example."

The editorialists at Canada's Globe and Mail newspaper picked up the theme. "Ours is a world and a moment that sorely needs a reason to look up in astonished unison," they wrote. "We don't get many shared experiences any more. Our histories, our entertainment, our windows on the world, even the facts of our basic reality, are fragmented into choose-your-own-adventure shards."

Colonel Hansen is the lineal descendant of Marc Garneau, the first Canadian in space, the way the three American Artemis astronauts are the descendants of Alan Shepard, the first American in space. "The Apollo lunar missions drove innovation in ways never imagined, but they brought us more," Mr. Garneau told me. "They left us proud and even awed by what we accomplished. They gave us confidence. They made us realize we could achieve the extraordinarily difficult. They brought us together and inspired humanity. They moved us forward. We need to build on that."

Everyone who has been into space feels that way. Jay Apt, who flew on four Space Shuttle missions, one as commander, believes space travel is an antidote to earthbound lassitude and public pessimism.

"Optimism is essential to provide the energy people need to do almost anything outside of their daily routine, whether it be founding a small business, discovering the secrets of electricity, or having children," he said. "Exploration in pretty much any era is inherently optimistic and draws the best from optimistic people, which is why I personally get a thrill seeing the images from space telescopes and Earth views from the space station, and can't wait for the photos and videos from the crews that will circle and land on the moon in the decades to come."

He is not alone. Ms. Levasseur, the museum curator, sees a definite change in the way people mingle amid the space capsules on the display floor. "I see a connection that young people have with the space program I haven't seen before," she said. "It's palpable."

Americans haven't always felt that way. Dwight Eisenhower, who was no romantic, was skeptical of mounting a space effort even after the Soviet Union launched Sputnik, the first satellite, in 1957. "I'd like to know what's on the other side of the moon," he said, "but I won't pay to find out this year." When his successor, John F. Kennedy, launched the American effort to reach the moon, Mr. Eisenhower said, "Anybody who would spend $40 billion in a race to the moon for national prestige is nuts."

Nuts we were, and nuts we are.

"We choose to go to the Moon in this decade and do the other things, not because they are easy," Mr. Kennedy said in his challenge to NASA, "but because they are hard." It still is hard, but away we go.

David M. Shribman is the former executive editor of the Pittsburgh Post-Gazette. Email: dshribman@post-gazette.com. Twitter: @ShribmanPG
