A.I. Week: The future is here and it’s cloning your voice – WXYZ 7 Action News Detroit

(WXYZ) The future is here, and it's using your voice. There are already apps where you can listen to artificial voices or even make one of your own.

Now some of these voices are pretty familiar: JFK, Barack Obama, Kevin Hart, the list goes on.

"It's pretty scary," said one Shelby Township resident. Adding that she would "absolutely" believe that it was a real person.

Those voices were made using an app called Play.ht. The app can also create clones of voices. It took a couple of hours, but I created a clone of my own voice.

Artificial intelligence clones Sarah's voice

"Were at a point in time with artificial intelligence where you can recreate somebody's voice after just having 3 seconds of their audio," said Sinead Bovell, Futurist and WAYE founder.

WAYE is a company that educates people about the future of technology. Bovell travels the world to talk about artificial intelligence.

While Bovell is worried about the technology, she said she feels in some ways we have been here before.

"When the radio was invented, it disrupted so many things and there was a lot of legitimate fear," she said.

As with most progress, Bovell says there are pros and cons to artificial intelligence.

Full interview with futurist and founder of WAYE

Pros: it can give people with disabilities back their ability to speak, and it can make many people's jobs easier in careers like podcasting and video creation.

Cons: alarm bells have been ringing over artificial intelligence scams in which bad actors use A.I. voice cloning to impersonate someone else, threatening people over the phone or attempting to gain access to bank accounts.

"As A.I. becomes more advanced and more pervasive that means things like phone calls and different video segments we may see, we have to think critically about how those were created or who is actually behind them," said Bovell.

She says there is artificial intelligence in the works to let you know if a voice is real or fake.

It's a bit of a cat-and-mouse game, but you can call family or friends directly to confirm their identity, come up with a safe word, be careful about which phone numbers you answer, and watch who you share your personal information with.

Similar to when the internet came out, we all need to educate ourselves, and prepare to adjust.

"Kinda scary but at the same time I think its just the advancement of ... the world now," said Imani Jones of Detroit.

Bovell adds, "We really have been here before and we of course will find ways to move through it."

More:

A.I. Week: The future is here and it's cloning your voice - WXYZ 7 Action News Detroit

When my mom died, I wanted to clone her – Insider

The author and his mother. Courtesy of the author

I suffered grief of biblical proportions when my mother died at 90 in 2018. In a haze of sadness, it seemed comforting to imagine cloning her in a fertility clinic lab, where I could raise her as my daughter.

My mother, Naimma, was born in a remote village by the Tigris river in Iraq, where she faced many hardships. She grew up in a small Arabic community that shared the Muslim religion. She was shaped by her parents, whom I never met. She mourned their early deaths, praying for them at the cemetery daily for years. She had an arranged marriage at 13 and welcomed her first child at 14. I was No. 5 of her six kids. She had just come to the US when I was born and had to learn English.

I was 12 when my father died suddenly in 1969 from a heart attack. My mother now had to navigate raising her children on her own. She took the GRE, worked hard at California State University, Los Angeles, and obtained her teaching credentials. She taught for the Los Angeles Unified School District in the inner city until she retired.

She was so proud that I was going to be a doctor and made it possible for me to pay for medical school in Los Angeles. I want to thank her for the sacrifices she made to ensure my life was better. I wished I could make her new life better.

But after re-tethering to reality about cloning my mother, I saw the futility in my fantasy. She would be a project, a designed object with unrealistic expectations. She would grow up loving me as a father. Parenting is a social activity, not purely biological.

My cloned mom would be over 90 years younger than my real mom. Though she might look like her eventually, she would not have the unique experiences that made her the woman I missed so much.

My cloned mom would not meet or marry my father. She would not have her hilarious malapropisms, like "The Star Bangler Spangle" and "You are a rat pack." She would not be guiding me to the hadj in Mecca and showing me the traditions of Ramadan.

Now religion would be her choice. She would not be underemployed because of her early limited access to education. Her eventual medical issues could be mitigated or even prevented, as I'd be aware of what her body did as she aged. I would not have to stand by watching her slowly slide toward dementia.

What relationship would my brothers and sisters have to a reincarnation of our mother? Some have already passed away, so even before my new mom was born, she would have children who died. The rest of them could be jealous that she would love me more than them. They would be confused aunts and uncles instead of sons and daughters.

A clone is not a perfect copy of an individual. If we cloned John F. Kennedy, Princess Diana, or Martin Luther King Jr., these children would be unlikely to meet the expectations to achieve what was accomplished by their genetic predecessors.

No, there would be no solace in attempting to recreate my mother. The new Naimma would be a completely different person, even if, anatomically, she had the same genome. She would not be my mom, whom I miss so much.

Samir Shahin, MD, is a family-practice physician in Los Angeles. He wrote a sci-fi romance novel, "Override," about sending embryos into space with an artificial-intelligence caretaker.

View post:

When my mom died, I wanted to clone her - Insider

Ahsoka’s Baylan Actually Could Be This Huge Star Wars Character … – Screen Rant

The upcoming Ahsoka series will feature Ray Stevenson as Baylan Skoll, and this seemingly new antagonist could actually be an existing Star Wars character. Baylan was first revealed in the Ahsoka teaser trailer at Star Wars Celebration 2023, wielding an orange lightsaber and clashing blades with Ahsoka Tano. Not much is known about Baylan and his apprentice, Shin Hati, other than that they're working with Morgan Elsbeth, meaning they could team up with Grand Admiral Thrawn himself. While this would tie in nicely with Ahsoka's mission, it also hints at Baylan having a secret identity that will change the course of both Ahsoka and the Mandoverse.

Related: Ray Stevenson Gives Insight Into Ahsoka's Orange Lightsaber-Wielding Villains

If Baylan is revealed to be an existing Star Wars character, several story possibilities would open up based on that character's history. This would allow the team behind Ahsoka to tie Baylan's character into Thrawn's plan and create even more connections to the other Star Wars TV shows. The concepts that would become available could lead to a lot of really fun and creative moments in Ahsoka, taking the live-action Star Wars show to the next level. All of these possibilities hinge on Baylan being confirmed as an existing Star Wars character, and there's actually quite a bit of evidence to suggest that it will happen.

There's a good chance that Baylan is actually Joruus C'Baoth from Timothy Zahn's Thrawn trilogy of books. In the Star Wars Legends timeline, Joruus C'Baoth was a mad clone of Jedi Master Jorus C'Baoth, who teamed up with Thrawn in Heir to the Empire as a means of achieving his own goals. He tried converting Luke Skywalker to his side, but Luke saw how he had fallen to the dark side and refused, which led to a climactic duel at the end of the trilogy. Given the Star Wars TV shows' focus on Imperial cloning and C'Baoth's connection to Thrawn, making Baylan his canon equivalent would make sense for the story.

Thrawn first met Jorus C'Baoth in the years before the Clone Wars, where the two came into a conflict that ended with Thrawn destroying C'Baoth and the Outbound Flight project. Perhaps something similar happened in canon, which would explain Thrawn creating C'Baoth's clone, or discovering that Palpatine had created him, like in Legends. Baylan bears a striking resemblance to C'Baoth, albeit with much shorter hair, and being a dark Jedi could explain the meaning behind the orange lightsaber color. Perhaps Baylan will try to recruit Ahsoka as C'Baoth did with Luke in Legends, given that she's technically no longer a Jedi.

Baylan being Joruus C'Baoth would also set up more Jedi clones, just like in the Thrawn trilogy. In The Last Command, C'Baoth created a clone of Luke named Luuke Skywalker, grown from the hand that he lost in The Empire Strikes Back. Incorporating Luke as a villain would be an incredibly bold way to feature him in Ahsoka, and having a slightly altered clone would allow for a new actor, rather than another CGI recreation of Mark Hamill. This would also be a great way to continue Ahsoka's character arc, as an evil Luke would remind her of Anakin Skywalker and dredge up old wounds that she hasn't fully overcome.

Another shocking possibility would be Ezra Bridger, who was last seen disappearing with Thrawn into hyperspace. Since Thrawn knows that Ahsoka is looking for Ezra, perhaps he'll create a clone to fool her and eventually strike when Thrawn sees fit. This would make for an incredible twist if the show waited a few episodes before the reveal, and it would make the real Ezra's return even more satisfying. Star Wars Rebels briefly teased Ezra turning to the dark side in season 3, but this idea was quickly dropped, so having an evil Ezra clone is a great opportunity to build on this concept.

Related: Did New Ahsoka Footage Tease Ezra Bridger's Turn To The Dark Side?

If Ahsoka does bring back Joruus C'Baoth and even clones of existing characters, then Star Wars canon should also bring Mara Jade into the Mandalorian era. She was an integral part of the Thrawn trilogy, the former Emperor's Hand who had failed to kill Luke Skywalker, only to team up with him to stop C'Baoth. She was the one who killed Luke's evil clone and C'Baoth himself, so given how intertwined their stories are in Legends, she deserves to be brought back if he is. Of course, much of Mara's Legends story would no longer work in canon, namely her marriage to Luke, but there are ways to work around this.

If Luke and Mara were to meet in canon, then there could be some slight adjustments to their history. They could simply be friends or allies, or have a brief romance that doesn't lead to marriage, allowing some of their story to be the same without contradicting canon. Mara was eventually killed in Legends, so her death could be shifted to earlier in the timeline, explaining her absence in the sequel trilogy and maybe even allowing her and Luke to have been married after all. However, if Mara doesn't appear in Ahsoka, her role could be filled by a different character, such as a former Inquisitor or even Shin Hati.

Related: Ahsoka Villains Explained: Names, Identity, Weapons & More

Mount Tantiss is one of the most significant elements to be reincorporated into Star Wars canon, and it carries even greater importance if Baylan is in fact Joruus C'Baoth. The Emperor's secret cloning facility first appeared in Heir to the Empire, guarded by Joruus C'Baoth himself, which brought the mad Jedi into conflict with Thrawn before they agreed to work together. Mount Tantiss returned to canon in Star Wars: The Bad Batch, and this would be the perfect setup for Joruus C'Baoth to return in Ahsoka. Thrawn also used Mount Tantiss in Legends to create his own clone army, which ties in nicely with a Jedi clone.

Because Mount Tantiss was used for Palpatine's cloning project, a Jedi clone would set up his return in Star Wars: The Rise of Skywalker. The Star Wars TV shows have hinted that the various cloning projects all lead to Palpatine or Snoke, with Thrawn's "Project Necromancer" being the most recent hint. C'Baoth's appearance would honor the original Thrawn story, build on Star Wars' current projects, and set up the future timeline all at once. The Star Wars franchise has done an excellent job setting up a canon adaptation of the Thrawn trilogy, and having Ahsoka's Baylan be Joruus C'Baoth would serve as the final piece that brings it all together.

Visit link:

Ahsoka's Baylan Actually Could Be This Huge Star Wars Character ... - Screen Rant

Star Wars Is Finally Getting Over Its Palpatine Obsession… But … – Screen Rant

Star Wars is finally moving away from its obsession with Emperor Palpatine, but a certain storyline involving Grand Admiral Thrawn could undo that progress. Palpatine's return in Star Wars: The Rise of Skywalker was highly controversial: not only did it seem convoluted within the context of the Skywalker saga's overarching narrative, but it also came across as an attempt to cash in on nostalgia. Making Rey a Palpatine was unnecessary, but Star Wars has, at the very least, been trying to make more sense of Darth Sidious' return through other media since The Rise of Skywalker's release.

These attempts include comic series like Charles Soule's The Rise of Kylo Ren, which goes some way to explaining how Palpatine manipulated Ben Solo through Snoke. Star Wars: The Bad Batch has also delved into the possible circumstances of Palpatine's return, taking a closer look at the cloning research that occurred after the rise of the Empire. Even so, Star Wars is a franchise with endless possibilities; it can, and should, move past Emperor Palpatine and Darth Vader as its main antagonists. Thankfully, Star Wars has been taking steps in the right direction, with several upcoming projects seemingly neglecting Palpatine entirely. However, the threat of Palpatine could still be looming over a certain corner of the Star Wars universe.

Three recent and upcoming Star Wars projects have avoided Palpatine in their stories. While it seemed as though The Mandalorian would reveal its cloning storyline as being Palpatine's failed attempts at Snoke, the clones turned out to be Moff Gideon's instead; even Grogu's involvement was explained as Gideon's plan to gain Force abilities himself. The Acolyte, an upcoming Star Wars show set during the tail end of the High Republic, cannot include Palpatine, as he wasn't even born yet. Rey's new Jedi Order movie surely won't include Palpatine; resurrecting him once again would be a terrible decision and undermine what the movie is supposed to represent: the start of a new Star Wars era.

These projects are all the better because they exclude Palpatine. The Emperor is an iconic villain, one who has undoubtedly provided incredible storytelling moments and interesting plot twists. But a franchise as expansive as Star Wars can't rely on the legacy of one villain forever, and The Mandalorian, The Acolyte, the High Republic era at large, and Rey's upcoming New Jedi Order movie all prove why it's so important to create new characters or focus on ones that haven't had a chance to truly shine in live-action canon. Enter: Grand Admiral Thrawn.

As Ahsoka will partially function as a Star Wars Rebels sequel, it's no wonder the show is bringing back Grand Admiral Thrawn. Thrawn, both in canon and Legends, has proven to be a formidable enemy and tactical mastermind. He poses a serious threat to the heroes of the Mandoverse and will undoubtedly shake up the status quo. It seems likely that Ahsoka, The Mandalorian season 4, The Mandalorian movie, and perhaps even Star Wars: Skeleton Crew will tell an overarching canon adaptation of the Heir to the Empire storyline. Thrawn, on his own, is an incredibly compelling character, and yet certain hints from The Mandalorian season 3 may link his story to Palpatine's.

Related: Who Is Grand Admiral Thrawn? Star Wars Villain Origin & Future

Should that link exist? Arguably, no, though it's a tough line to walk. While Project Necromancer and Thrawn's hinted cloning efforts could be part of Palpatine's backstory, it seems a shame to have a character as interesting as Thrawn be merely another cog in Palpatine's master plan. While Star Wars' efforts to turn Palpatine's resurrection into something more believable have been mostly successful, there's a risk of using the remaining Mandoverse story merely as a vehicle to make The Rise of Skywalker more meaningful. But doing that also makes sense, given the Mandoverse's placement in the timeline. As long as Star Wars stories are being told between the prequel, original, and sequel trilogies, there's always a risk that Palpatine might overshadow everything.

Read this article:

Star Wars Is Finally Getting Over Its Palpatine Obsession... But ... - Screen Rant

Proof That a Complex Quantum Network Is Truly Quantum – Physics

May 11, 2023 • Physics 16, s71

Researchers prove the fully nonclassical nature of a three-party quantum network, a requirement for developing secure quantum communication technologies.

In 1964, John Stewart Bell predicted that correlations between measurements made by two parties on a pair of entangled particles could confirm the fundamental nonclassical nature of the quantum world. In the past few years, researchers have performed various tests of Bell's predictions that were rigorous enough to rule out classical explanations. Now researchers in China and Spain have done the same for a more complex system: a quantum network in which three parties make measurements on pairs of entangled particles generated by two sources [1]. The researchers say that their stringent confirmation of quantum phenomena is encouraging for the development of future secure quantum communication networks.

To ensure a rigorous test of nonclassicality, and thereby prove that classical assumptions of local realism are invalid, the experiment must be carefully designed. If the parties making the measurements can communicate classically during the experiment, or if the two devices creating the entangled particles can influence one another, seemingly quantum behaviors can have classical explanations. In a quantum communication network these loopholes could allow eavesdroppers to listen in.

In their experiment, the researchers close these loopholes by placing each element of their network about 100 m apart. They also determine the measurement settings of the three parties using different quantum random number generators to make sure that the measurements are truly independent. These precautions allow the researchers to demonstrate that the network satisfies a condition known as full network nonlocality, which certifies that neither of the sources of entangled particles can be described by classical physics.

Marric Stephens

Marric Stephens is a Corresponding Editor for Physics Magazine based in Bristol, UK.

[1] Xue-Mei Gu, Liang Huang, Alejandro Pozas-Kerstjens, Yang-Fan Jiang, Dian Wu, Bing Bai, Qi-Chao Sun, Ming-Cheng Chen, Jun Zhang, Sixia Yu, Qiang Zhang, Chao-Yang Lu, and Jian-Wei Pan, Phys. Rev. Lett. 130, 190201 (2023). Published May 11, 2023.

See original here:

Proof That a Complex Quantum Network Is Truly Quantum - Physics

Experiment contradicts Einstein and reveals spooky quantum action with superconducting qubits 30 meters apart – EL PAÍS USA

Quside/ICFO's ultra-fast and ultra-pure quantum random number generator used in the experiment. Credit: Quside/ICFO

Physicist James Trefil once said that quantum mechanics is a place where the human brain will simply never feel comfortable. This discomfort happens because nature, at a microscopic scale, obeys laws at odds with our perception of macroscopic reality. These laws include superposition (a particle can simultaneously be in different states, like Erwin Schrödinger's live-and-dead cat) and quantum entanglement at a distance. Albert Einstein described the latter as "spooky action at a distance," a principle allowing particles separated by distance to respond instantaneously and behave as a single system. A spectacular experiment that defies the speed of light was recently published in Nature by an international team of scientists led by the Swiss Federal Institute of Technology (ETH) in Zurich, collaborating with Spain's Institute of Photonic Sciences (ICFO) and Quside, a quantum computing company. The study demonstrated for the first time super-quick quantum random number generators that enable spooky action at a distance between superconducting quantum bits.

This experiment's results contradict Einstein, who once considered quantum entanglement impossible. The physicist believed in the principle of locality, which states that an object is influenced directly only by its immediate surroundings. But advances in quantum physics have shown that two entangled particles can share a single unified state, even if they are 30 meters apart, as in the Zurich experiment.

Einstein could not accept that an action in one place could have an instantaneous effect elsewhere. But John Bell proved in 1964 that quantum entanglement exists. Subsequent experiments with this property by John Clauser, Alain Aspect and Anton Zeilinger earned them a Nobel Prize in 2022.

A major achievement of the study published in Nature is that it experimentally demonstrated, in Bell tests performed on pairs of spatially separated, entangled quantum systems, that quantum physics does not follow the principle of local causality, with no so-called loopholes. The absence of loopholes means everything happens exactly as predicted by quantum physics, with no communication between particles.

A similar experiment was conducted a year ago by Spanish physicist Adán Cabello of the University of Seville (Spain) with ytterbium and barium ions (Science Advances). But the Nature study raised the complexity level by using two superconducting qubits entangled at temperatures close to absolute zero (−273.15°C or −459.67°F) and 30 meters apart.

Simultaneous measurements of the two qubits showed synchronized responses consistent with spooky action, or entanglement at a distance. To demonstrate the absence of loopholes (that the coordination of states did not come from signals sent between qubits), the scientists made random 17-nanosecond measurements, which is the time it takes light to travel five meters. A full measurement required another 62 nanoseconds, the time for light to travel 21 meters. Because the systems were 30 meters apart, communication between the two was impossible.
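To make the timing argument concrete, here is a minimal back-of-the-envelope check (mine, not the article's), rounding the speed of light to 3.0e8 m/s and using the separation and timing figures quoted above:

c = 3.0e8                  # speed of light in meters per second (rounded)
separation_m = 30.0        # distance between the two superconducting qubits
light_travel_ns = separation_m / c * 1e9
measurement_ns = 17 + 62   # random settings choice plus full measurement, per the article

print(f"Light needs about {light_travel_ns:.0f} ns to cross {separation_m:.0f} m")
print(f"The measurement is completed in about {measurement_ns} ns")
print("Too fast for any light-speed signal to coordinate the outcomes:",
      measurement_ns < light_travel_ns)

Running this gives roughly 100 ns of light travel time against 79 ns of measurement time, which is the margin the article relies on.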

The new study is significant because it has practical applications beyond the theoretical proof. Morgan W. Mitchell, a professor at the Catalan Institution for Research and Advanced Studies (ICREA) and a co-author of the study, said: "With ordinary computing, your home device communicates constantly through the internet with a server. But to do something equivalent with quantum computers, we need to communicate them somehow, but not using classical bits. We have to use quantum bits, and entanglement is the most efficient way to do this."

Mitchell said: "This study shows that experiments like this can be done with the same superconductors used by Google and IBM. Other experiments used systems with a single pair of particles, but ours created entanglement between many electrons at both sites. And we achieved this for the first time without loopholes."

According to Mitchell, their experiment made progress toward distributed quantum computing with multiple computers at multiple sites. "It's a long-term goal that we're not going to achieve immediately. But this experiment demonstrated its feasibility."

Carlos Abellán, an expert in photonics and Quside's co-founder and CEO, said the experiment created a spectacular and unique technology that synchronized two particles with unprecedented speed. This required generating quantum random numbers and extracting them at extraordinarily fast speeds (17 nanoseconds) to eliminate any possibility of communication between the qubits. "We had to engineer new ways of generating and extracting the random numbers before the information reached the other side. We needed to double the speed of earlier systems," said Abellán. "Instead of using one device for calculations, we connected eight devices in parallel, and then synchronized and combined the signals. This gave us 16 random number generators with double the speed. If we had taken 19 nanoseconds instead of 17, the experiment would have been invalidated."

The experiment proved that quantum information can be transmitted between separate superconducting circuits housed in cryogenic systems. In other words, it works with currently available quantum computing systems. But why two separate systems can behave as one is still unexplained. "It's a question for the philosophers, and a very difficult one at that. You can ask 10 different physicists and you're going to get 10 different answers. It's a mystery for new generations to solve. But these experiments prove that it really exists," said Mitchell.

Read more:

Experiment contradicts Einstein and reveals spooky quantum action with superconducting qubits 30 meters apart - EL PAÍS USA

Leading mathematician wants to solve the riddle of a million quantum particles: "It can only be done with a blackboard … – EurekAlert

Image: Professor Søren Fournais of the University of Copenhagen. Credit: Jim Hyer/University of Copenhagen

"Imagine one of those spectacular opening ceremonies at the Olympics, where a huge crowd suddenly gathers as a unit and synchronizes its movements like a flock of starlings. In a few very special and strange cases, the same occurs in the world of atoms. In them, a million atoms that have all entered the same quantum state behave completely synchronously," explains University of Copenhagen math professor Sren Fournais.

Fournais is referring to the mysterious quantum phenomenon known as Bose-Einstein condensates. These can occur if certain kinds of atoms are successfully cooled to temperatures near absolute zero. Here, anywhere from 100,000 to several million atoms entangle, with all of them transitioning into the same quantum state at the same moment, a point at which the substance is neither solid, liquid, gas nor plasma. It is often described as being in its own fifth state.

While Bose and Einstein predicted the existence of such a phenomenon in the 1920s, it wasn't until 1995 that a Bose-Einstein condensate was produced in the lab, a Nobel Prize-winning achievement. And even though researchers worldwide are busy exploring the new quantum state, much remains a mystery.

"Our general understanding of these extreme physical systems is incomplete. We lack the mathematical tools to analyse and understand condensates and thereby the ability to possibly put them to good use. The mathematical challenge is that you have a tremendous number of particles interacting with each other, and that the correlations between these particles are crucial for their behavior. The equations were written down a long time ago, but one cannot just solve them. In the years ahead, well focus on understanding these solutions," says University of Copenhagen mathematics professor Sren Fournais.

Fournais is one of the world's leading researchers in quantum mechanical equations and has already spent ten years of his career wrestling with the wondrous nature of Bose-Einstein condensates. He is now dedicating the next five years to getting closer to the phenomenon's complex mathematical solutions. To do so, the European Research Council has awarded him DKK 15 million through one of its highly prestigious ERC Advanced Grants.

Quantum mechanics takes place in the micro-world of atoms, within which the quantum effects are so minuscule that we do not experience them in our daily lives. The truly fascinating aspect of a Bose-Einstein condensate is that because it is made up of huge masses of atoms, it is nearly large enough to be observed with the naked eye. This makes it ideal for teaching us about quantum mechanics and for conducting experiments on a scale large enough to actually see them.

Researchers around the world are working to exploit the quantum properties of Bose-Einstein condensates in various ways. In 2022, Dutch scientists built an atomic laser based on a Bose-Einstein condensate. Danish professor Lene Hau of Harvard University has demonstrated that she can stop light using a Bose-Einstein condensate. Work is also underway around the world to base a quantum computer on these icy atoms. However, there are still plenty of bumps in the road, and the condensates are still only used at the basic research level.

According to Søren Fournais, the missing answers, or at least the theoretical ones, are found in the equations:

"The beautiful and fascinating thing about mathematical physics is that we can write down the laws of nature in relatively simple equations on a piece of paper. Those equations contain an incredible amount of information in fact, all the answers. But how can we extract the information that tells us what we want to know about these wild physical systems? It's a huge challenge, but it's possible," says Fournais.

As with the phase transition that occurs when water freezes into ice, Bose-Einstein condensation, in which atoms transition to a quantum state, is a phase transition. It is this physical transformation that Søren Fournais dreams of finding the mathematical solution to.

"My big dream is to mathematically prove the phase transition that the Bose-Einstein condensation is. But demonstrating phase transition is notorious for being extremely difficult, because you go from particles moving randomly about to them sticking. A symmetry is broken. It is very difficult to see why the particles do so and exactly when it will happen. Consequently, up until now, there is only one physical system where we have succeeded in saying something mathematical about this phase transition," says Fournais, who adds:

"I think that its unrealistic to solve this task in five years but hope that the project will get us closer than we are today. And then we have many intermediate objectives that we hope to achieve along the way."

Over the past 5-10 years, Professor Fournais has been behind some of the most significant global breakthroughs in the mathematical understanding of Bose-Einstein condensates. Among other things, he proved the formula for the ground-state energy of a Bose-Einstein condensate, a question that had remained unanswered since about 1960.
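The release does not write the formula out. The result being referred to is commonly identified with the Lee-Huang-Yang expansion for the ground-state energy per particle of a dilute Bose gas, which, in units where $\hbar^2/2m = 1$, with $\rho$ the particle density and $a$ the scattering length, reads approximately

$$\frac{E(N)}{N} \approx 4\pi\rho a\left(1 + \frac{128}{15\sqrt{\pi}}\sqrt{\rho a^{3}}\right),$$

the second term being the correction that had resisted rigorous proof since around 1960.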

So, how does one of Europe's leading mathematicians approach such a task? According to Søren Fournais, it doesn't happen in front of a computer:

"A computer can make a numerical calculation for 10 or 20 particles, but not a million. So, the computer is not a useful tool for us. Instead, it's a matter of a lot of coffee, good ideas and hard work at the blackboard with chalk," says the researcher, who concludes:

"A typical week begins with a few good new ideas from the previous weekend that you are eager to pursue. And then you work Monday, Tuesday and Wednesday until it seems to be progressing. Except that on Thursday, you realize that it isnt. But you get wiser. And then, the next round of ideas for the following week is perhaps a bit more tested. And then at some point, everything falls into place."

Excerpt from:

Leading mathematician wants to solve the riddle of a million quantum particles: "It can only be done with a blackboard ... - EurekAlert

Stephen Hawking and I created his final theory of the cosmos: here's what it reveals about the origins of time and life – Phys.org

The late physicist Stephen Hawking first asked me to work with him to develop "a new quantum theory of the Big Bang" in 1998. What started out as a doctoral project evolved over some 20 years into an intense collaboration that ended only with his passing on March 14, 2018.

The enigma at the center of our research throughout this period was how the Big Bang could have created conditions so perfectly hospitable to life. Our answer is being published in a new book, "On the Origin of Time: Stephen Hawking's Final Theory."

Questions about the ultimate origin of the cosmos, or universe, take physics out of its comfort zone. Yet this was exactly where Hawking liked to venture. The prospect of cracking the riddle of cosmic design drove much of Hawking's research in cosmology. "To boldly go where Star Trek fears to tread" was his motto, and also his screen saver.

Our shared scientific quest meant that we inevitably grew close. Being around him, one could not fail to be influenced by his determination and optimism that we could tackle mystifying questions. He made me feel as if we were writing our own creation story, which, in a sense, we did.

In the old days, it was thought that the apparent design of the cosmos meant there had to be a designer, a God. Today, scientists instead point to the laws of physics. These laws have a number of striking life-engendering properties. Take the amount of matter and energy in the universe, the delicate ratios of the forces, or the number of spatial dimensions.

Physicists have discovered that if you tweak these properties ever so slightly, it renders the universe lifeless. It almost feels as if the universe is a fix, even a big one.

But where do the laws of physics come from? From Albert Einstein to Hawking in his earlier work, most 20th-century physicists regarded the mathematical relationships that underlie the physical laws as eternal truths. In this view, the apparent design of the cosmos is a matter of mathematical necessity. The universe is the way it is because nature had no choice.

Around the turn of the 21st century, a different explanation emerged. Perhaps we live in a multiverse, an enormous space that spawns a patchwork of universes, each with its own kind of Big Bang and physics. It would make sense, statistically, for a few of these universes to be life-friendly.

However, soon such multiverse musings got caught in a spiral of paradoxes and no verifiable predictions.

Can we do better? Yes, Hawking and I found out, but only by relinquishing the idea, inherent in multiverse cosmology, that our physical theories can take a God's-eye view, as if standing outside the entire cosmos.

It is an obvious and seemingly tautological point: cosmological theory must account for the fact that we exist within the universe. "We are not angels who view the universe from the outside," Hawking told me. "Our theories are never decoupled from us."

We set out to rethink cosmology from an observer's perspective. This required adopting the strange rules of quantum mechanics, which governs the microworld of particles and atoms.

According to quantum mechanics, particles can be in several possible locations at the same time, a property called superposition. It is only when a particle is observed that it (randomly) picks a definite position. Quantum mechanics also involves random jumps and fluctuations, such as particles popping out of empty space and disappearing again.

In a quantum universe, therefore, a tangible past and future emerge out of a haze of possibilities by means of a continual process of observing. Such quantum observations don't need to be carried out by humans. The environment or even a single particle can "observe".

Countless such quantum acts of observation constantly transform what might be into what does happen, thereby drawing the universe more firmly into existence. And once something has been observed, all other possibilities become irrelevant.

We discovered that when looking back at the earliest stages of the universe through a quantum lens, there's a deeper level of evolution in which even the laws of physics change and evolve, in sync with the universe that is taking shape. What's more, this meta-evolution has a Darwinian flavor.

Variation enters because random quantum jumps cause frequent excursions from what's most probable. Selection enters because some of these excursions can be amplified and frozen, thanks to quantum observation. The interplay between these two competing forces, variation and selection, in the primeval universe produced a branching tree of physical laws.

The upshot is a profound revision of the fundamentals of cosmology. Cosmologists usually start by assuming laws and initial conditions that existed at the moment of the Big Bang, then consider how today's universe evolved from them. But we suggest that these laws are themselves the result of evolution.

Dimensions, forces, and particle species transmute and diversify in the furnace of the hot Big Bang, somewhat analogous to how biological species emerge billions of years later, and acquire their effective form over time.

Moreover, the randomness involved means that the outcome of this evolution, the specific set of physical laws that makes our universe what it is, can only be understood in retrospect.

In some sense, the early universe was a superposition of an enormous number of possible worlds. But we are looking at the universe today at a time when humans, galaxies and planets exist. That means we see the history that led to our evolution.

We observe parameters with "lucky values". But we are wrong to assume they were somehow designed or always like that.

The crux of our hypothesis is that, reasoning backward in time, evolution towards more simplicity and less structure continues all the way. Ultimately, even time and, with it, the physical laws fade away.

This view is especially borne out of the holographic form of our theory. The "holographic principle" in physics predicts that just as a hologram appears to have three dimensions when it is in fact encoded in only two dimensions, the evolution of the entire universe is similarly encoded on an abstract, timeless surface.

Hawking and I view time and causality as "emergent qualities", having no prior existence but arising from the interactions between countless quantum particles. It's a bit like how temperature emerges from many atoms moving collectively, even though no single atom has temperature.

One ventures back in time by zooming out and taking a fuzzier look at the hologram. Eventually, however, one loses all information encoded in the hologram. This would be the origin of time: the Big Bang.

For almost a century, we have studied the origin of the universe against the stable background of immutable laws of nature. But our theory reads the universe's history from within and as one that includes, in its earliest stages, the genealogy of the physical laws. It isn't the laws as such but their capacity to transmute that has the final word.

Future cosmological observations may find evidence of this. For instance, precision observations of gravitational waves, ripples in the fabric of spacetime, may reveal signatures of some of the early branches of the universe. If spotted, Hawking's cosmological finale may well prove to be his greatest scientific legacy.

See more here:

Stephen Hawking and I created his final theory of the cosmos: here's what it reveals about the origins of time and life - Phys.org

Quantum mechanics and the return of free will | Tim Andersen – IAI

The common definition of free will often has problems when relating to desire and power to choose. An alternative definition that ties free will to different outcomes for life despite one's past is supported by the probabilistic nature of quantum physics. This definition is compatible with the Many Worlds Interpretation of quantum physics, which refutes the conclusion that randomness does not imply free will, writes Tim Andersen.

Free will is one of those things where people tend to be very attached to its being true or false, and yet most people implicitly treat it as true. Consider that we hold people accountable for their actions as if they decided to carry out those actions of their own free will. Likewise, we reward people for their successes and discoveries. If Albert Einstein didn't really make his discoveries but it was, instead, inevitable that his brain would do so, does he really deserve his Nobel Prize?

Suggested reading: Quantum mechanics makes no sense without the mind, by Shan Gao

Some argue that we should accept that free will is a myth and change our society accordingly. Our justice system (especially in the United States) is heavily invested in the free will hypothesis. We punish people for crimes. We do not treat them like broken machines that need to be fixed. Other nations, like Norway, however, take exactly this approach.

Many physicists believe that free will is incompatible with modern physics.

The argument goes like this:

(1) Classical (non-quantum) mechanics is deterministic. Given any initial conditions of a classical system, the entire future and past state of the system can be determined. There is no free will in determinism.

(2) Quantum mechanics allows for randomness in the outcomes of experiments, but we have no control over those outcomes. There is no free will in randomness.

(3) Human will is a product of the brain which is a physical object. All physical objects are subject to physics and the sum total of physics is contained in classical and quantum mechanics (technically, classical is an approximation of quantum).

Ergo, humans have no free will. Our brains are simply carrying out a program that, while appearing to be making free choices, is in fact just a very complex algorithm.

___

As Schopenhauer said, "Man can will what he wants but cannot will what he wills."

___

The logic seems sound, and in any philosophical discourse we need to look at the logic and decide (whether freely or not). There are quite a few ways to counter this argument. The first is to object to #3. This is the approach many religions take: human will is not necessarily reducible to physical causation. Therefore, it is beyond physical law. The brain simply interacts with the will and carries out its commands.

Another is to question the reductionist assumption of the conclusion, i.e., that everything is reducible to the properties of its constituent parts, no matter how complex. If the individual parts are deterministic, so must the whole. Science has not proven that yet. Perhaps if we could model a human brain in a computer in its entirety, we might know better.

Another approach is to question what the scientist means by free will. Most scientists aren't philosophers and don't necessarily define their philosophical terms as clearly as their scientific ones. The common definition of free will is that it is the freedom to choose, the ability to decide to choose or do otherwise, to be the source of one's actions. Philosophers largely tie free will to the concept of moral responsibility: am I morally responsible for my actions?

Suggested viewing: New theories of the universe, with Sabine Hossenfelder, Phillip Ball, Bjørn Ekeberg and Sam Henry

To put it precisely: an agent S is morally accountable for performing an action φ =df. S deserves praise if φ goes beyond what can be reasonably expected of S, and S deserves blame if φ is morally wrong. The key, then, is whether an agent has the ability or power to do otherwise.

Now, what does it mean to have the ability to choose or do otherwise? It can't simply mean having the power, because one must have both the power and the desire. But what if one does not have the power to change what one desires? Then you are stuck with no free will, or only a pseudo-free will in which you can change your actions but not your desires.

As Schopenhauer said, "Man can will what he wants but cannot will what he wills." Consider: if I have a choice to practice my cello or lift weights, I choose to practice my cello. Now, I seem to have had the power to choose to lift weights, but I did not have the desire to do so. Did I have the power to desire differently?

From the argument of physics, the brain's desires are either fixed results of classical laws or random results of quantum effects. A random quantum fluctuation creates a voltage bias in one of my neurons, which cascades to other neurons, and suddenly I want to lift weights. According to the physicist, I did not choose; it just appeared as if I did. And certainly if I had chosen differently I would have done differently, and yet in reality quantum physics chose for me by rolling a cosmic die.

___

You have to see free will as having the power to have different outcomes for your life despite your past.

___

This kind of free will definition, which is the one most people think of and the one that most scientists seem to assume, has a lot of problems. It's hard to even understand what we really mean by freedom because it gets all muddled with desire.

Without a good definition, it is impossible to argue that something exists or not. Another definition of free will avoids this problem and throws a monkey wrench into the scientist-on-the-street's knee-jerk attitude that free will is impossible in a quantum world.

This alternative is called the categorical analysis and is stated as follows: an agent S has the ability to choose or do otherwise than φ at time t if and only if it was possible, holding fixed everything up to t, that S choose or do otherwise than φ at t.
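Written symbolically (the notation here is mine, not the article's), with $W$ the set of physically possible worlds and $w \approx_{<t} w_{\mathrm{actual}}$ meaning that world $w$ agrees with the actual world on everything up to time $t$, the categorical analysis reads

$$\mathrm{CanDoOtherwise}(S, \varphi, t) \iff \exists\, w \in W : \; w \approx_{<t} w_{\mathrm{actual}} \;\wedge\; \neg\,\mathrm{Does}(S, \varphi, t, w).$$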

What this means is that we have to take into account the state of the agent up until the time the choice is made and, given that state, ask if there is a possible world where the agent makes a choice other than the one he or she made. That, then, is what freedom of choice is.

Suggested reading: Consciousness is irrelevant to Quantum Mechanics, by Carlo Rovelli

Oxford physicist David Deutsch favours this definition of free will because it is compatible with his Many Worlds Interpretation (MWI) of quantum physics. But even if you don't accept MWI, what it says is that there are probable states that have the same past up until a point t, and then a choice is made and a non-deterministic path is followed. It doesn't matter if those paths are all real worlds, as Deutsch believes. What matters is that they have different futures, and all interpretations of quantum physics as-is support this idea.

If that is true, and this is the most important point, then you can say that freedom of choice exists because the agent made different choices in different probable realities. Thus, the agent had the power to choose and exercised it.

This definition of free will is interesting from a physics perspective because it is not true in a classical, deterministic world in which all pasts have the same future, but it is true in a quantum world where all pasts do not share the same future. Thus, it refutes the conclusion from #2 above that randomness does not imply free will. It only does so if you define free will in the way that people commonly understand it, which is, frankly, not a defensible definition.

Rather, you have to see free will as having the power to have different outcomes for your life despite your past. Whether you can affect those outcomes by changing your actions or desires is a meaningless statement.

___

Freedom of choice exists because the agent made different choices in different probable realities.

___

Thus, if I made the choice to practice in 60% of quantum futures and lift weights in 40%, then that proves I had the power to do otherwise. If I practiced in 100% of futures, then I did not have that power. Whether science can prove this is an open question, but it does not require any modification to quantum theory. Indeed, some modifications attempt to remove this possibility, incorrectly I believe.
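As an illustrative toy (mine, not the author's), the criterion in this paragraph amounts to a one-line check over hypothetical branch weights:

def has_power_to_do_otherwise(branch_weights):
    # On the categorical reading above, the agent "could have done otherwise"
    # iff more than one action gets nonzero weight across the quantum futures
    # that share the agent's past. branch_weights maps action -> probability.
    return sum(1 for w in branch_weights.values() if w > 0) > 1

print(has_power_to_do_otherwise({"practice cello": 0.6, "lift weights": 0.4}))  # True
print(has_power_to_do_otherwise({"practice cello": 1.0, "lift weights": 0.0}))  # False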

While it may seem that this is sleight of hand in changing definitions, it is in reality making the definition of free will precise by saying that it is exactly the power to do otherwise. This is evidenced by quantum physics, i.e., because more than one outcome of a choice can occur from a single state of the universe, an agent does have the power to do otherwise, which is what free will is.

See the rest here:

Quantum mechanics and the return of free will | Tim Andersen - IAI

ETH Zurich Researchers Strengthen Quantum Mechanics with … – HPCwire

May 12, 2023: A group of researchers led by Andreas Wallraff, Professor of Solid State Physics at ETH Zurich, has performed a loophole-free Bell test to disprove the concept of local causality formulated by Albert Einstein in response to quantum mechanics.

By showing that quantum mechanical objects that are far apart can be much more strongly correlated with each other than is possible in conventional systems, the researchers have provided further confirmation for quantum mechanics. What's special about this experiment is that the researchers were able for the first time to perform it using superconducting circuits, which are considered to be promising candidates for building powerful quantum computers.

An Old Dispute

A Bell test is based on an experimental setup that was initially devised as a thought experiment by British physicist John Bell in the 1960s. Bell wanted to settle a question that the greats of physics had already argued about in the 1930s: Are the predictions of quantum mechanics, which run completely counter to everyday intuition, correct, or do the conventional concepts of causality also apply in the atomic microcosm, as Albert Einstein believed?

To answer this question, Bell proposed to perform a random measurement on two entangled particles at the same time and check it against Bell's inequality. If Einstein's concept of local causality is true, these experiments will always satisfy Bell's inequality. By contrast, quantum mechanics predicts that they will violate it.
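The article does not state the inequality itself. One common form is the CHSH version, in which a local-realist combination of correlations, S, can never exceed 2, while quantum mechanics allows values up to 2√2. A minimal sketch of the quantum prediction, using the textbook singlet-state correlation E(a, b) = -cos(a - b), is:

import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-singlet pair measured
    # along directions a and b (angles in radians).
    return -math.cos(a - b)

# Standard CHSH measurement settings for Alice (a1, a2) and Bob (b1, b2).
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(f"|S| = {abs(S):.3f}")  # about 2.828, i.e. 2 * sqrt(2)
print("Any local-realist model requires |S| <= 2, so this prediction violates the bound.")

This is the kind of violation the experiments described below set out to measure.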

The Last Doubts Dispelled

In the early 1970s, John Francis Clauser and Stuart Freedman carried out the first practical Bell test. In their experiments, the two researchers were able to prove that Bell's inequality is indeed violated. But they had to make certain assumptions in their experiments to be able to conduct them in the first place. So, theoretically, it might still have been the case that Einstein was correct to be skeptical of quantum mechanics.

Over time, however, more and more of these loopholes could be closed. Finally in 2015, various groups succeeded in conducting the first truly loophole-free Bell tests.

Promising Applications

Wallraff's group can now confirm these results with a novel experiment. The work by the ETH researchers, published in the scientific journal Nature, demonstrates that research on this topic has not concluded despite the initial confirmation seven years ago.

A number of factors contribute to this outcome. The experiment conducted by the ETH researchers establishes that, despite their larger size compared to microscopic quantum objects, superconducting circuits still abide by the principles of quantum mechanics. These electronic circuits, which are several hundred micrometers in size and made from superconducting materials, function at microwave frequencies and are known as macroscopic quantum objects.

In addition, Bell tests also have a practical significance. "Modified Bell tests can be used in cryptography, for example, to demonstrate that information is actually transmitted in encrypted form," explained Simon Storz, a doctoral student in Wallraff's group. "With our approach, we can prove much more efficiently than is possible in other experimental setups that Bell's inequality is violated. That makes it particularly interesting for practical applications."

The Search for a Compromise

To carry out their research, the team required an advanced testing facility. A critical aspect of a loophole-free Bell test is ensuring that no information exchange occurs between the two entangled circuits before the completion of the quantum measurements. As information can only travel as fast as the speed of light, the measurement process must be faster than the time taken for a light particle to travel from one circuit to another.

When designing the experiment, striking the right balance is crucial. Increasing the distance between the two superconducting circuits allows for more time to conduct the measurement, but it also complicates the experimental setup. This is due to the need for the entire experiment to be carried out in a vacuum near absolute zero.

The ETH researchers determined that the minimum distance needed for a successful loophole-free Bell test is approximately 33 meters. A light particle takes about 110 nanoseconds to travel this distance in a vacuum, which is slightly longer than the time it took for the researchers to complete the experiment.

Thirty-meter Vacuum

Wallraff's team has built an impressive facility in the underground passageways of the ETH campus. At each of its two ends is a cryostat containing a superconducting circuit. These two cooling apparatuses are connected by a 30-meter-long tube whose interior is cooled to a temperature just above absolute zero (−273.15°C).

Before the start of each measurement, a microwave photon is transmitted from one of the two superconducting circuits to the other so that the two circuits become entangled. Random number generators then decide which measurements are made on the two circuits as part of the Bell test. Next, the measurement results on both sides are compared.

Large-scale Entanglement

After evaluating more than one million measurements, the researchers have shown with very high statistical certainty that Bell's inequality is violated in this experimental setup. In other words, they have confirmed that quantum mechanics also allows for non-local correlations in macroscopic electrical circuits and, consequently, that superconducting circuits can be entangled over a large distance. This opens up interesting possible applications in the field of distributed quantum computing and quantum cryptography.

"Building the facility and carrying out the test was a challenge," Wallraff said. "We were able to finance the project over a period of six years with funding from an ERC Advanced Grant. Just cooling the entire experimental setup to a temperature close to absolute zero takes considerable effort."

He explained, "There are 1.3 tons of copper and 14,000 screws in our machine, as well as a great deal of physics knowledge and engineering know-how." Wallraff further believes that, in principle, it is possible to construct facilities capable of overcoming even greater distances using the same approach. Such technology could potentially be employed to connect superconducting quantum computers across vast distances.

Source: Felix Würsten, ETH Zürich

Read the rest here:

ETH Zurich Researchers Strengthen Quantum Mechanics with ... - HPCwire

Time Twisted in Quantum Physics: How the Future Might Influence … – SciTechDaily

The 2022 physics Nobel prize was awarded for experimental work demonstrating fundamental breaks in our understanding of the quantum world, leading to discussions around local realism and how it could be refuted. Many theorists believe these experiments challenge either locality (the notion that distant objects require a physical mediator to interact) or realism (the idea that there's an objective state of reality). However, a growing number of experts suggest an alternative approach, retrocausality, which posits that present actions can affect past events, thus preserving both locality and realism.

The 2022 Nobel Prize in physics highlighted the challenges quantum experiments pose to local realism. However, a growing body of experts propose retrocausality as a solution, suggesting that present actions can influence past events, thus preserving both locality and realism. This concept offers a novel approach to understanding causation and correlations in quantum mechanics, and despite some critics and confusion with superdeterminism, it is increasingly seen as a viable explanation for recent groundbreaking experiments, potentially safeguarding the core principles of special relativity.

In 2022, the physics Nobel prize was awarded for experimental work showing that the quantum world must break some of our fundamental intuitions about how the universe works.

Many look at those experiments and conclude that they challenge locality, the intuition that distant objects need a physical mediator to interact. And indeed, a mysterious connection between distant particles would be one way to explain these experimental results.

Others instead think the experiments challenge realism, the intuition that there's an objective state of affairs underlying our experience. After all, the experiments are only difficult to explain if our measurements are thought to correspond to something real. Either way, many physicists agree about what's been called the "death by experiment" of local realism.

But what if both of these intuitions can be saved, at the expense of a third? A growing group of experts think that we should instead abandon the assumption that present actions can't affect past events. Called retrocausality, this option claims to rescue both locality and realism.

What is causation anyway? Let's start with the line everyone knows: correlation is not causation. Some correlations are causation, but not all. What's the difference?

Consider two examples. (1) There's a correlation between a barometer needle and the weather, which is why we learn about the weather by looking at the barometer. But no one thinks that the barometer needle is causing the weather. (2) Drinking strong coffee is correlated with a raised heart rate. Here it seems right to say that the first is causing the second.

The difference is that if we wiggle the barometer needle, we won't change the weather. The weather and the barometer needle are both controlled by a third thing, the atmospheric pressure, which is why they are correlated. When we control the needle ourselves, we break the link to the air pressure, and the correlation goes away.

But if we intervene to change someone's coffee consumption, we'll usually change their heart rate, too. Causal correlations are those that still hold when we wiggle one of the variables.

These days, the science of looking for these robust correlations is called causal discovery. It's a big name for a simple idea: finding out what else changes when we wiggle things around us.
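
A toy simulation makes the wiggle test concrete. The variables and noise levels below are invented purely for illustration: intervening on a variable that merely shares a common cause (the barometer) destroys its correlation with the weather, while intervening on a genuine cause (coffee) leaves the correlation with heart rate intact.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Common-cause case: atmospheric pressure drives both barometer and weather.
pressure = rng.normal(size=n)
barometer = pressure + 0.1 * rng.normal(size=n)
weather   = pressure + 0.1 * rng.normal(size=n)
print(np.corrcoef(barometer, weather)[0, 1])     # strong correlation when we just observe

# Intervention: we set ("wiggle") the barometer ourselves, cutting its link to pressure.
barometer_do = rng.normal(size=n)
print(np.corrcoef(barometer_do, weather)[0, 1])  # ~0: the correlation vanishes

# Direct-cause case: coffee consumption raises heart rate.
coffee_do = rng.normal(size=n)                   # we choose the coffee dose at will
heart_do  = 0.8 * coffee_do + 0.5 * rng.normal(size=n)
print(np.corrcoef(coffee_do, heart_do)[0, 1])    # the correlation survives the wiggle
```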

In ordinary life, we usually take for granted that the effects of a wiggle are going to show up later than the wiggle itself. This is such a natural assumption that we don't notice that we're making it.

But nothing in the scientific method requires this to happen, and it is easily abandoned in fantasy fiction. Similarly, in some religions, we pray that our loved ones are among the survivors of yesterday's shipwreck, say. We're imagining that something we do now can affect something in the past. That's retrocausality.

The quantum threat to locality (that distant objects need a physical mediator to interact) stems from an argument by the Northern Ireland physicist John Bell in the 1960s. Bell considered experiments in which two hypothetical physicists, Alice and Bob, each receive particles from a common source. Each chooses one of several measurement settings, and then records a measurement outcome. Repeated many times, the experiment generates a list of results.

Bell realized that quantum mechanics predicts that there will be strange correlations (now confirmed) in this data. They seemed to imply that Alice's choice of setting has a subtle nonlocal influence on Bob's outcome, and vice versa, even though Alice and Bob might be light years apart. Bell's argument is said to pose a threat to Albert Einstein's theory of special relativity, which is an essential part of modern physics.

But that's because Bell assumed that quantum particles don't know what measurements they are going to encounter in the future. Retrocausal models propose that Alice's and Bob's measurement choices affect the particles back at the source. This can explain the strange correlations without breaking special relativity.

In recent work, we've proposed a simple mechanism for the strange correlations: it involves a familiar statistical phenomenon called Berkson's bias (see our popular summary here).

There's now a thriving group of scholars who work on quantum retrocausality. But it's still invisible to some experts in the wider field. It gets confused with a different view called superdeterminism.

Superdeterminism agrees with retrocausality that measurement choices and the underlying properties of the particles are somehow correlated.

But superdeterminism treats it like the correlation between the weather and the barometer needle. It assumes there's some mysterious third thing, a superdeterminer, that controls and correlates both our choices and the particles, the way atmospheric pressure controls both the weather and the barometer.

So superdeterminism denies that measurement choices are things we are free to wiggle at will; they are predetermined. Free wiggles would break the correlation, just as in the barometer case. Critics object that superdeterminism thus undercuts core assumptions necessary to undertake scientific experiments. They also say that it means denying free will, because something is controlling both the measurement choices and the particles.

These objections don't apply to retrocausality. Retrocausalists do scientific causal discovery in the usual free, wiggly way. We say it is those who dismiss retrocausality who are forgetting the scientific method, if they refuse to follow the evidence where it leads.

What is the evidence for retrocausality? Critics ask for experimental evidence, but that's the easy bit: the relevant experiments just won a Nobel Prize. The tricky part is showing that retrocausality gives the best explanation of these results.

We've mentioned the potential to remove the threat to Einstein's special relativity. That's a pretty big hint, in our view, and it's surprising it has taken so long to explore it. The confusion with superdeterminism seems mainly to blame.

In addition, we and others have argued that retrocausality makes better sense of the fact that the microworld of particles doesn't care about the difference between past and future.

We don't mean that it is all plain sailing. The biggest worry about retrocausation is the possibility of sending signals to the past, opening the door to the paradoxes of time travel. But to make a paradox, the effect in the past has to be measured. If our young grandmother can't read our advice to avoid marrying grandpa, meaning we wouldn't come to exist, there's no paradox. And in the quantum case, it's well known that we can never measure everything at once.

Still, there's work to do in devising concrete retrocausal models that enforce this restriction that you can't measure everything at once. So we'll close with a cautious conclusion. At this stage, it's retrocausality that has the wind in its sails, so hull down towards the biggest prize of all: saving locality and realism from "death by experiment."

This article was first published in The Conversation.

Read this article:

Time Twisted in Quantum Physics: How the Future Might Influence ... - SciTechDaily

How we could discover quantum gravity without rebuilding space-time – New Scientist

MODERN physics has two stories to tell about our universe. The first says it is fundamentally made of space-time: a continuous, stretchy fabric that has ballooned since the dawn of time. The other says it is fundamentally made of indivisible things that can't decide where they are, or even when.

Both stories are compelling, describing what we observe with incredible accuracy. The big difference, though, is the scale at which they apply. Albert Einstein's theory of general relativity, which describes gravity, space and time, rules over very massive objects and cosmic distances. Quantum physics, meanwhile, governs tiny, sprightly atoms and subatomic particles.

Ultimately, both stories can't be true. Nowhere is this more apparent than at the big bang, where everything in the universe was compacted into an infinitesimally small point. Here, you need a single theory that encompasses gravity and the quantum realm. "Why we're here is the big question," says Toby Wiseman, a theorist at Imperial College London. "It seems that quantum gravity is the only answer."

Alas, it is an answer we are yet to find, despite many decades of searching. Quantum gravity means a reconciliation of the continuous and the indivisible, the predictable and the random. There are many ideas, but none can totally incorporate everything. "We're still no better off at understanding the beginning of space and time," says Wiseman.

Most physicists attempting this begin with quantum physics, the workhorse of which is quantum field theory. This describes three of the four forces of nature (electromagnetism, the strong nuclear force and the weak nuclear force) by quantising them as force-carrying elementary particles. It…

See more here:

How we could discover quantum gravity without rebuilding space-time - New Scientist

Theoretical Physicists Discover Why Optical Cavities Slow Down … – SciTechDaily

Resonant vibrational strong coupling can inhibit chemical reactions. Strong resonant coupling between cavity and vibrational modes can selectively inhibit a chemical reaction that occurs outside the cavity environment, i.e., prevent the appearance of its products. Credit: E. Ronca / C. Schäfer

Scientists have discovered why chemical reactions are slowed down in mirrored cavities, where molecules interact with light. The team used Quantum-Electrodynamical Density-Functional Theory to find that the conditions inside the optical cavity affected the energy that makes atoms vibrate around the molecule's single bonds, which are critical to the reaction.

Chemical processes are all around us. From novel materials to more effective medicines or plastic products, chemical reactions play a key role in the design of the things we use every day. Scientists constantly search for better ways to control these reactions, for example, to develop new materials. Now an international research team led by the MPSD has found an explanation for why chemical reactions are slowed down inside mirrored cavities, where molecules are forced to interact with light. Their work, now published in the journal Nature Communications, is a key step in understanding this experimentally observed process.

Chemical reactions occur on the scale of atomic vibrations, one million times smaller than the thickness of a human hair. These tiny movements are difficult to control. Established methods include controlling the temperature or providing surfaces and complexes in solution made from rare materials. They tackle the problem on a larger scale and cannot target specific parts of the molecule. Ideally, researchers would like to provide only a small amount of energy to some atoms at the right time, just like a billiard player wants to nudge just one ball on the table.

In recent years, it became clear that molecules undergo fundamental changes when they are placed in optical cavities with opposing mirrors. Inside those confines, the system is forced to interact with virtual light, or photons. Crucially, this interaction changes the rate of chemical reactions, an effect that was observed in experiments but whose underlying mechanism remained a mystery.

Now a team of theoretical physicists from Germany, Sweden, Italy, and the USA has come up with a possible explanation that qualitatively agrees with the experimental results. The team involved researchers from the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) in Hamburg, Germany, Chalmers University of Technology in Sweden, the Center for Computational Quantum Physics at the Flatiron Institute, Harvard University (both in the U.S.A.), and the Istituto per i Processi Chimico Fisici at the CNR (National Research Council) in Italy.

Using an advanced theoretical method, called Quantum-Electrodynamical Density-Functional Theory (QEDFT), the authors have unveiled the microscopic mechanism which reduces the chemical reaction rate, for the specific case of the deprotection reaction of 1-phenyl-2-trimethylsilylacetylene. Their findings are in agreement with the observations by the group of Thomas Ebbesen in Strasbourg.

The team discovered that the conditions inside the optical cavity affect the energy which makes the atoms vibrate around the molecule's single bonds, which are critical for the chemical reaction. Outside the cavity, that energy is usually deposited in a single bond during the reaction, which can ultimately break the bond, a key step in a chemical reaction. "However, we find that the cavity introduces a new pathway, so that the energy is less likely to be funneled only into a single bond," says lead author Christian Schäfer. "This is the key process which inhibits the chemical reaction, because the probability to break a specific bond is diminished."

Manipulating materials through the use of cavities (so-called polaritonic chemistry) is a powerful tool with many potential applications, according to the paper's author Enrico Ronca, who works at CNR: "For instance, it was observed that coupling to specific vibrational excitations can inhibit, steer, and even catalyze a chemical process at room temperature. Our theoretical work enhances the understanding of the underlying microscopic mechanisms for the specific case of a reaction inhibited by the field."

While the authors point out that important aspects remain to be understood and further experimental validation is required, they also highlight the special role of this new direction. "This work puts the controversial field of polaritonic chemistry onto a different level," adds Angel Rubio, the Director of the MPSD's Theory Department. "It provides fundamental insights into the microscopic mechanisms that enable the control of chemical reactions. We expect the present findings to be applicable to a larger set of relevant reactions (including click chemical reactions linked to this year's Nobel Prize in chemistry) under strong light-matter coupling conditions."

Reference: "Shining light on the microscopic resonant mechanism responsible for cavity-mediated chemical reactivity" by Christian Schäfer, Johannes Flick, Enrico Ronca, Prineha Narang and Angel Rubio, 19 December 2022, Nature Communications. DOI: 10.1038/s41467-022-35363-6

See more here:

Theoretical Physicists Discover Why Optical Cavities Slow Down ... - SciTechDaily

UMD Quantum Physicist Elected to National Academy of Sciences – Maryland Today

A groundbreaking quantum physics researcher who has long been affiliated with the Joint Quantum Institute at the University of Maryland was elected a member of the National Academy of Sciences last week.

Paul Julienne, an emeritus fellow at JQI and an adjunct professor of physics at UMD, joined 142 other U.S. and international members recognized in 2023 for their exceptional, ongoing achievements in original research. He's one of 22 current UMD faculty members in the National Academy of Sciences, and one of 67 named to various esteemed honorary academies.

Julienne helped establish the research field of ultracold matter, which investigates atoms and molecules near absolute zero. His theoretical research includes developing models that describe how cold trapped molecules and atoms can be precisely controlled using magnetic fields or lasers. This research topic has revealed details of atomic states and chemical reactions of ultracold molecules.

"I am both gratified and humbled by this honor, which is only possible because of the many excellent colleagues and students with whom I have worked over the years," Julienne said. "I owe them a debt of gratitude, for it is by working together that science advances."

Julienne joined JQI in 2007, soon after its founding as a joint research institute combining the scientific strengths of the University of Maryland with the National Institute of Standards and Technology (NIST). The university and the federal agency have since broadened and deepened their collaboration with other quantum centers and institutes at UMD, helping lay the foundation for the university to become one of the most vibrant loci of quantum research in the world.

UMD's hundreds of researchers, partnerships with government agencies and labs, and collaboration with a wide range of firms in the quantum space inspired university President Darryll J. Pines to refer to the scientific and tech ferment centered at UMD as the "Capital of Quantum."

"Paul Julienne's election to the National Academy of Sciences highlights his remarkable achievements in the field of ultracold matter and underscores the significance of his contributions to science and the quantum revolution," Pines said. "We are honored to have such a distinguished researcher and educator as part of our institution."

Julienne earned a B.S. in chemistry from Wofford College in 1965 and his Ph.D. in chemical physics from the University of North Carolina at Chapel Hill in 1969. He worked as a postdoctoral researcher at the National Bureau of Standards and as a staff researcher at the Naval Research Laboratory before beginning a career of nearly 40 years at NIST, first as a research scientist and then as a NIST fellow, retiring in 2013. Among his other awards and accomplishments, he received the 2015 William F. Meggers Award of the Optical Society of America and the 2004 Davisson-Germer Prize of the American Physical Society and is a fellow of the division of Atomic, Molecular, and Optical Physics of the American Physical Society.

"We are proud to see Dr. Julienne honored with one of the highest professional distinctions accorded to a scientist," said Amitabh Varshney, dean of UMD's College of Computer, Mathematical, and Natural Sciences. "Our college extends its congratulations to him for this well-deserved recognition."

See the article here:

UMD Quantum Physicist Elected to National Academy of Sciences - Maryland Today

Stanley Deser, Whose Ideas on Gravity Help Explain the Universe … – The New York Times

Stanley Deser, a theoretical physicist who helped illuminate the details of gravity and how it shapes the space-time fabric of the universe, died on April 21 in Pasadena, Calif. He was 92.

His death, at a hospital, was confirmed by his daughter, Abigail Deser.

Physicists have long dreamed of devising a theory of everything, a set of equations that neatly and completely describe how the universe works. By the middle of the 20th century, they had come up with two theories that serve as the pillars of modern physics: quantum mechanics and general relativity.

Quantum mechanics describes how, in the subatomic realm, everything is broken up into discrete chunks, or quanta, such as the individual particles of light called photons. Albert Einstein's theory of general relativity had elegantly captured how mass and gravity bend the fabric of space-time.

However, these two pillars did not fit together. General relativity does not contain any notion of quanta; a quantum theory of gravity is an ambition that remains unfinished today.

"The problem we face is how to unify these two into a seamless theory of everything," said Michael Duff, an emeritus professor of physics at Imperial College London in England. "Stanley was amongst the first to tackle this problem."

In 1959, Dr. Deser, along with two other physicists, Richard Arnowitt and Charles Misner, published what is now known as the ADM formalism (named after the initials of their surnames), which rejiggered the equations of general relativity in a form that laid a foundation for work toward a quantum theory of gravity.

"It's a bridge toward quantum," said Edward Witten, a physicist at the Institute for Advanced Study in Princeton, N.J. So far, however, no one has been able to take it to the next step and come up with a unified theory that includes quantum gravity.

The ADM formalism offered an additional benefit: It made general relativity equations amenable to computer simulations, enabling scientists to probe phenomena like the space-bending pull of black holes and the universe-shaking explosions when stars collide.

The rejiggered equations split four-dimensional space-time into slices of three-dimensional space, an innovation that allowed computers to handle the complex data and, as Frans Pretorius, a professor of physics at Princeton University, put it, "evolve these slices in time to find the full solution."

Dr. Deser is perhaps best known for his work in the 1970s as one of the pioneers of supergravity, which expanded an idea known as supersymmetry to include gravity.

From quantum mechanics, physicists already knew that fundamental particles fell into one of two groups. Familiar constituents of matter like electrons and quarks fall into the group known as fermions, while those that carry fundamental forces, like photons, the particles of light that convey the force of electromagnetism, are known as bosons.

Supersymmetry hypothesizes an as-yet-undiscovered boson partner for every fermion, and a fermion partner for each boson.

Dr. Deser worked with Bruno Zumino, one of the originators of supersymmetry, to add gravity to the theory, creating the theory of supergravity. Supergravity includes gravitons, the gravitational equivalent of photons, and adds a supersymmetric partner, the gravitino.

Experiments using particle accelerators have yet to turn up evidence of any of these partner particles, but the theories have not been disproved, and because of their mathematical elegance, they remain attractive to physicists.

Supergravity is also a key aspect of superstring theories, which attempt to provide a complete explanation of how the universe works, overcoming shortfalls of quantum gravity theories.

"Stanley was one of the most influential researchers on questions related to gravity over his extremely long and distinguished career," said Dr. Witten, who has been at the forefront of devising superstring theories.

Stanley Deser was born in Rovno, Poland, a city now known as Rivne and part of Ukraine, on March 19, 1931. As Jews, his parents, Norman, a chemist, and Miriam, fled Poland's repressive, antisemitic regime in 1935 for Palestine. But prospects for finding work there were dim, and a few months later they moved to Paris.

In 1940, with World War II engulfing Europe, the family narrowly escaped France after Germany invaded.

"They finally realized the danger and decided to leave everything," Dr. Deser wrote of his parents in his autobiography, "Forks in the Road." "I rushed with my father to empty our safe. That evening, my mother sewed the coins into a belt of towels, a much-practiced maneuver of refugees, while the rest of us packed a few belongings."

The family fled to Portugal and 11 months later obtained visas to emigrate to the United States. They eventually settled in New York City, where Norman and Miriam ran a chemical supplies business.

By age 12 Stanley had been promoted to 10th grade, and he graduated from high school at 14. He earned a bachelor's degree in physics from Brooklyn College in 1949 at 18, then went to Harvard, where he studied under Julian Schwinger, a Nobel Prize laureate. He completed his doctorate in 1953.

After postdoctoral fellowships at the Institute for Advanced Study and the Niels Bohr Institute in Copenhagen, Dr. Deser joined the faculty of Brandeis University in 1958.

The following three years, working on the ADM formalism, provided "the best run of luck that one could possibly hope for," he wrote in his autobiography.

In an interview last year for Caltech's Heritage Project, Dr. Deser recalled that he, Dr. Arnowitt and Dr. Misner completed much of the work during summers in Denmark, in a kindergarten classroom. "The nice thing about this kindergarten, it has blackboards," he said. "Denmark is very good that way."

"Since the blackboards were mounted low for children, we would crawl and write equations," Dr. Deser said. "And the papers just poured out."

Dr. Misner, an emeritus professor of physics at the University of Maryland, said there were parallels between the ADM recasting of general relativity and the quantum field theory of electromagnetism that other physicists were working on, and they were able to apply that experience to general relativity.

The work on supergravity occurred during a stay at the CERN particle laboratory in Geneva, where Dr. Zumino worked. "In a period of just three weeks, to our amazement, we had a consistent theory," Dr. Deser recalled.

He and Dr. Zumino published a paper about supergravity in June 1976. However, another group of physicists, Daniel Freedman, Sergio Ferrara and Peter van Nieuwenhuizen, beat them to the punch, describing supergravity in a paper that had been completed about a month before Dr. Deser and Dr. Zumino submitted theirs.

As a result, Dr. Deser said, sometimes the work that he and Dr. Zumino did was overlooked. In 2019, a Breakthrough Prize in Fundamental Physics accompanied by $3 million was awarded to the other team.

"He was understandably upset," Dr. Duff, the British physicist, said. "I think they could have erred on the side of generosity and included Stanley as the fourth recipient." (Dr. Zumino died in 2014.)

Dr. Schwarz and Dr. Witten, who were members of the committee that awarded the prize, declined to discuss the particulars of the decision, but Dr. Schwarz said, "It was a purely scientific decision."

Dr. Deser worked at Brandeis until he retired in 2005. He then moved to Pasadena to be close to his daughter and obtained an unpaid position as a senior research associate at Caltech.

In addition to Abigail, he is survived by two other daughters, Toni Deser and Clara Deser, and four grandchildren.

His wife of 64 years, Elsbeth Deser, died in 2020. A daughter, Eva, died in 1968.

While Dr. Deser was an expert on gravity and general relativity, he was not infallible.

In the Caltech interview, he recalled a paper in which he suggested that gravity could solve some troubling infinities that were showing up in the quantum field theory of electrodynamics.

Other noteworthy physicists had similar thoughts but did not publish them. Dr. Deser did.

"It was garbage," he said. During a talk at a conference, Richard Feynman, the Nobel Prize-winning physicist who devised much of quantum electrodynamics, "without much difficulty shot me to pieces, which I deserved," he said.

He added, "Everybody's entitled to a few strikes."

Read more here:

Stanley Deser, Whose Ideas on Gravity Help Explain the Universe ... - The New York Times

With new experimental method, researchers probe spin structure in 2D materials for first time – Phys.org

For two decades, physicists have tried to directly manipulate the spin of electrons in 2D materials like graphene. Doing so could spark key advances in the burgeoning world of 2D electronics, a field where super-fast, small and flexible electronic devices carry out computations based on quantum mechanics.

Standing in the way is the fact that the typical way scientists measure the spin of electrons, an essential behavior that gives everything in the physical universe its structure, usually doesn't work in 2D materials. This makes it incredibly difficult to fully understand the materials and propel forward technological advances based on them. But a team of scientists led by Brown University researchers believe they now have a way around this longstanding challenge. They describe their solution in a new study published in Nature Physics.

In the study, the team, which also includes scientists from the Center for Integrated Nanotechnologies at Sandia National Laboratories and the University of Innsbruck, describes what they believe to be the first measurement showing direct interaction between electrons spinning in a 2D material and photons coming from microwave radiation.

Called a coupling, the absorption of microwave photons by electrons establishes a novel experimental technique for directly studying the properties of how electrons spin in these 2D quantum materials, one that could serve as a foundation for developing computational and communicational technologies based on those materials, according to the researchers.

"Spin structure is the most important part of a quantum phenomenon, but we've never really had a direct probe for it in these 2D materials," said Jia Li, an assistant professor of physics at Brown and senior author of the research. "That challenge has prevented us from theoretically studying spin in these fascinating material for the last two decades. We can now use this method to study a lot of different systems that we could not study before."

The researchers made the measurements on a relatively new 2D material called "magic-angle" twisted bilayer graphene. This graphene-based material is created when two sheets of ultrathin layers of carbon are stacked and twisted to just the right angle, converting the new double-layered structure into a superconductor that allows electricity to flow without resistance or energy waste. The material was only discovered in 2018, and the researchers focused on it because of the potential and mystery surrounding it.

"A lot of the major questions that were posed in 2018 have still yet to be answered," said Erin Morissette, a graduate student in Li's lab at Brown who led the work.

Physicists usually use nuclear magnetic resonance or NMR to measure the spin of electrons. They do this by exciting the nuclear magnetic properties in a sample material using microwave radiation and then reading the different signatures this radiation causes to measure spin.

The challenge with 2D materials is that the magnetic signature of electrons in response to the microwave excitation is too small to detect. The research team decided to improvise. Instead of directly detecting the magnetization of the electrons, they measured subtle changes in electronic resistance caused by the changes in magnetization from the radiation, using a device fabricated at the Institute for Molecular and Nanoscale Innovation at Brown.

These small variations in the flow of the electronic currents allowed the researchers to use the device to detect that the electrons were absorbing the photons from the microwave radiation.

The researchers were able to observe novel information from the experiments. The team noticed, for instance, that interactions between the photons and electrons made electrons in certain sections of the system behave as they would in an anti-ferromagnetic system, meaning the magnetism of some atoms was canceled out by a set of magnetic atoms aligned in the opposite direction.

The new method for studying spin in 2D materials and the current findings won't be applicable to technology today, but the research team sees potential applications the method could lead to in the future. They plan to continue to apply their method to twisted bilayer graphene but also expand it to other 2D materials.

"It's a really diverse toolset that we can use to access an important part of the electronic order in these strongly correlated systems and in general to understand how electrons can behave in 2D materials," Morissette said.

More information: Andrew Mounce, Dirac revivals drive a resonance response in twisted bilayer graphene, Nature Physics (2023). DOI: 10.1038/s41567-023-02060-0. http://www.nature.com/articles/s41567-023-02060-0

Journal information: Nature Physics

Originally posted here:

With new experimental method, researchers probe spin structure in 2D materials for first time - Phys.org

Is Quantum Computing a Threat To Current Encryption Methods? – Spiceworks News and Insights

Encryption is the backbone of cybersecurity, keeping data and systems secure. Quantum computing threatens to make today's encryption obsolete. Developing quantum-secure encryption is one of the main challenges facing the cybersecurity sector today, highlights Michael Redding, chief technology officer at Quantropi.

Explaining how quantum computers work is challenging. It involves presenting complicated scientific concepts like superposition, which allows groups of qubits to create multidimensional computational spaces. For those who do not have a background in quantum physics, quantum computing can seem more like science fiction than computer science.

Explaining what quantum computers do, however, is much easier. In essence, they leverage the behavior of subatomic particles to increase computation speed exponentially. When Google announced in October 2019 that it had achieved quantum supremacy, it was celebrating the fact that it had used quantum computing to solve a complex mathematical problem in 3 minutes and 20 seconds. How long would a conventional computer have taken to solve the same problem? According to Google, it would have taken at least 10,000 years.

How will the world use this mind-blowingly fast processing power? Experts predict it will transform a number of industries, from pharmaceuticals to finance to supply chain management. However, the quantum computing use case that has been making the most headlines in recent months is cybersecurity.

Encryption is the backbone of cybersecurity. It is the tool that keeps critical data under lock and key. Without it, security and privacy would be impossible to achieve.

Hackers have a number of avenues for gaining unauthorized access to encrypted information. One popular method involves social engineering attacks that seek to trick someone into revealing the password that provides access to data. Rather than cracking the code, hackers using social engineering attacks simply steal the key.

Data breaches provide another option for obtaining passwords. Reports of breaches regularly make the news, and each breach has the potential to put passwords into the hands of bad actors seeking to obtain access to encrypted data.

Brute force attacks represent a different approach to cracking encryption. Rather than trying to obtain the password from a user or stolen data, these attacks use computers to cycle through possible passwords until the correct one is found. Essentially, brute force attacks figure out passwords through trial and error, leveraging computers to do the work quickly and systematically.

Current encryption methods are considered effective in thwarting brute force attacks, as the most advanced encryption systems work with passwords or keys that are long and complicated or highly random. With today's computers, deciphering the key through trial and error can take millions of years.
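
A rough worked example shows why (the guess rate here is an assumption picked for round numbers, not a figure from the article): exhaustively searching a 128-bit key space at a trillion guesses per second would take

```latex
\frac{2^{128}\ \text{keys}}{10^{12}\ \text{keys/s}} \approx 3.4 \times 10^{26}\ \text{s} \approx 10^{19}\ \text{years}.
```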

However, quantum computing changes the timeline for cracking today's encryption. By exponentially increasing processing speed, quantum computers could break the most advanced keys commonly used today in minutes.

When will bad actors have access to a quantum computer capable of threatening today's encryption? Based on Shor's Algorithm, a quantum computer would need millions of qubits with a quantum circuit depth measured in the billions and essentially perfect calculation fidelity.

Based on today's quantum computing capability, that would put Y2Q (the point at which quantum computers can break today's encryption) into the 2040s, if ever. However, breakthroughs achieved in 2023 by researchers in China and Germany, which combine a hybrid classical-plus-quantum attack vector with AI and machine learning, have drastically reduced the quantum capabilities required to break asymmetric encryption compared with Shor's Algorithm.

Combining these new AI and machine learning hybrid attack vectors with the rapid advancement of quantum computing capabilities begins to crystallize a pathway to Y2Q in the next one to five years. It is no longer a question of how to break asymmetric encryption with today's generation of technology; the approach has been published. Now it is only a matter of optimization and continued incremental technology improvements.

To address Y2Q's impact on security, developers are focusing on two main approaches to quantum security: post-quantum cryptography and quantum key distribution. Post-quantum cryptography (PQC) leverages complex mathematical algorithms to provide security that is resistant to quantum attacks, while quantum key distribution involves exploiting the properties of quantum mechanics to bolster security.

PQC provides an efficient means of updating security systems because it is math-based, which allows it to be implemented through computer code and deployed in end devices with a simple software update. However, PQC's security relies on complex, computationally hard mathematical calculations and, in some cases, large key sizes, both of which come with considerable performance costs.

Organizations that seek to quantum-proof their systems with PQC must be aware that considerable infrastructure updates may be necessary. Because PQC encryption schemes are typically more complex than those currently in use, they require more resources for encrypting and decrypting, including more time, storage space, memory, and network bandwidth.
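
To give a feel for that overhead, the snippet below compares approximate object sizes for one widely deployed pre-quantum scheme and one NIST-selected post-quantum scheme per task. The byte counts are taken from the schemes' published parameter sets (X25519, Ed25519, ML-KEM-768/Kyber, ML-DSA-44/Dilithium2) and are illustrative rather than drawn from the article:

```python
# Approximate sizes in bytes, from the schemes' published parameter sets.
# Illustrative only: real deployments vary with parameter choices and protocol framing.
sizes_bytes = {
    "key exchange": {
        "X25519 public key (pre-quantum)": 32,
        "ML-KEM-768 / Kyber-768 public key": 1184,
        "ML-KEM-768 / Kyber-768 ciphertext": 1088,
    },
    "signature": {
        "Ed25519 signature (pre-quantum)": 64,
        "ML-DSA-44 / Dilithium2 signature": 2420,
    },
}

for task, entries in sizes_bytes.items():
    for name, n in entries.items():
        print(f"{task:12s} {name:36s} {n:5d} B")
```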

For the average user relying on PQC for booting machines or encrypting data related to web browsing, the additional processing burden might not be noticeable. However, organizations simultaneously transmitting and receiving thousands or millions of digital transactions per second must consider the impact this will have on their performance. Failure to do so can create dangerous latency in devices that rely on high efficiency, such as the systems that manage computer-aided driving software in autonomous vehicles.

PQC also poses challenges for updating internet of things (IoT) devices to quantum-secure encryption. Smart doorbells and other intelligent appliances will become vulnerable if their encryption systems are not updated, though they typically do not have the processing power to support PQC effectively.

Quantum key distribution (QKD) is another option for quantum-resistant encryption. This approach relies on the laws of quantum physics rather than mathematics to generate and transmit encryption keys between two parties. The natural laws involved in this process also provide warnings to users when QKD transmissions are disturbed or intercepted by bad actors.

Theoretically, QKD provides security that is effective against quantum computing attacks and can withstand attacks for an indefinite amount of time. Practically, however, making it a reality would require overcoming a number of significant technical challenges. QKD uses photon emitters and receivers to create quantum entanglement between two devices. However, the current state of this technology is largely experimental, with few commercial deployments and significant limitations on bandwidth, distance, complexity, and cost that continue to be explored and improved upon.

See More: Why and Where the PQC Market is Gaining Traction

Developing quantum-secure encryption is just the first step toward preparing for Y2Q. In order to be truly quantum secure, organizations must assess where they are vulnerable, determine how to integrate new security systems in those areas, deploy those systems, and test them. It is a process that could take years, and the clock is ticking.

Those who understand the stakes involved are already taking steps. For example, the US government issued a National Security Memorandum in May 2022 that warns of the significant risks that quantum computing poses to the economic and national security of the United States. The memorandum calls for a timely transition to quantum-resistant cryptography.

Rob Joyce, NSA cybersecurity director and deputy national manager for national security systems, highlighted the need to push forward in achieving quantum-resistant systems in his comments on the memorandum. He stated: "Implementing approved quantum-resistant cryptographic solutions across all of our systems will not happen overnight, but it's critical that we chart a path to get there considering the potential threat of quantum computing."

In the end, public and private organizations need to prepare for Y2Q immediately to protect their data, connected devices, systems, and communications. The time is now.

Go here to see the original:

Is Quantum Computing a Threat To Current Encryption Methods? - Spiceworks News and Insights

Researchers discover superconductive images are actually 3D and disorder-driven fractals – Phys.org

Meeting the world's energy demands is reaching a critical point. Powering the technological age has caused issues globally. It is increasingly important to create superconductors that can operate at ambient pressure and temperature. This would go a long way toward solving the energy crisis.

Advancements in superconductivity hinge on advances in quantum materials. When electrons inside quantum materials undergo a phase transition, they can form intricate patterns, such as fractals. A fractal is a never-ending pattern: when you zoom in on a fractal, the image looks the same. Commonly seen fractals include the branches of a tree or frost on a windowpane in winter. Fractals can form in two dimensions, like the frost on a window, or in three-dimensional space, like the limbs of a tree.

Dr. Erica Carlson, a 150th Anniversary Professor of Physics and Astronomy at Purdue University, led a team that developed theoretical techniques for characterizing the fractal shapes that these electrons make, in order to uncover the underlying physics driving the patterns.

Carlson, a theoretical physicist, has evaluated high-resolution images of the locations of electrons in the superconductor Bi2-xPbzSr2-yLayCuO6+x (BSCO), determined that these images are indeed fractal, and discovered that they extend into the full three-dimensional space occupied by the material, like a tree filling space.

What were once thought to be random dispersions within the fractal images are purposeful and, shockingly, due not to an underlying quantum phase transition as expected, but to a disorder-driven phase transition.

Carlson led a collaborative team of researchers across multiple institutions and published their findings, titled "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x," in Nature Communications.

The team includes Purdue scientists and partner institutions. From Purdue, the team includes Carlson, Dr. Forrest Simmons, a recent Ph.D. student, and former Ph.D. students Dr. Shuo Liu and Dr. Benjamin Phillabaum. The Purdue team completed their work within the Purdue Quantum Science and Engineering Institute (PQSEI). The team from partner institutions includes Dr. Jennifer Hoffman, Dr. Can-Li Song, Dr. Elizabeth Main of Harvard University, Dr. Karin Dahmen of the University of Illinois Urbana-Champaign, and Dr. Eric Hudson of Pennsylvania State University.

"The observation of fractal patterns of orientational ('nematic') domainscleverly extracted by Carlson and collaborators from STM images of the surfaces of crystals of a cuprate high temperature superconductoris interesting and aesthetically appealing on its own, but also of considerable fundamental importance in coming to grips with the essential physics of these materials," says Dr. Steven Kivelson, the Prabhu Goel Family Professor at Stanford University and a theoretical physicist specializing in novel electronic states in quantum materials. "Some form of nematic order, typically thought to be an avatar of a more primitive charge-density-wave order, has been conjectured to play an important role in the theory of the cuprates, but the evidence in favor of this proposition has previously been ambiguous at best. Two important inferences follow from Carlson et al.'s analysis: 1) The fact that the nematic domains appear fractal implies that the correlation lengththe distance over which the nematic order maintains coherenceis larger than the field of view of the experiment, which means that it is very large compared to other microscopic scales. 2) The fact that patterns that characterize the order are the same as those obtained from studies of the three dimensional random-field Ising modelone of the paradigrmatic models of classical statistical mechanicssuggests that the extent of the nematic order is determined by extrinsic quantities and that intrinsically (i.e. in the absence of crystalline imperfections) it would exhibit still longer range correlations not just along the surface, but extending deep into the bulk of the crystal."

High-resolution images of these fractals are painstakingly taken in Hoffman's lab at Harvard University and Hudson's lab, now at Penn State, using scanning tunneling microscopes (STM) to measure electrons at the surface of the BSCO, a cuprate superconductor. The microscope scans atom by atom across the top surface of the BSCO, and what the researchers found were stripe orientations that went in two different directions instead of the same direction. The result, seen above in red and blue, is a jagged image that forms interesting patterns of electronic stripe orientations.

"The electronic patterns are complex, with holes inside of holes, and edges that resemble ornate filigree," explains Carlson. "Using techniques from fractal mathematics, we characterize these shapes using fractal numbers. In addition, we use statistics methods from phase transitions to characterize things like how many clusters are of a certain size, and how likely the sites are to be in the same cluster."

Once the Carlson group analyzed these patterns, they found a surprising result. These patterns do not form only on the surface as flat, layered fractals; they fill space in three dimensions. Simulations for this discovery were carried out at Purdue University using Purdue's supercomputers at the Rosen Center for Advanced Computing. Samples at five different doping levels were measured by Harvard and Penn State, and the result was similar among all five samples.

The unique collaboration between Illinois (Dahmen) and Purdue (Carlson) brought cluster techniques from disordered statistical mechanics into the field of quantum materials like superconductors. Carlson's group adapted the technique to apply to quantum materials, extending the theory of second order phase transitions to electronic fractals in quantum materials.

"This brings us one step closer to understanding how cuprate superconductors work," explains Carlson. "Members of this family of superconductors are currently the highest temperature superconductors that happen at ambient pressure. If we could get superconductors that work at ambient pressure and temperature, we could go a long way toward solving the energy crisis because the wires we currently use to run electronics are metals rather than superconductors. Unlike metals, superconductors carry current perfectly with no loss of energy. On the other hand, all the wires we use in outdoor power lines use metals, which lose energy the whole time they are carrying current. Superconductors are also of interest because they can be used to generate very high magnetic fields, and for magnetic levitation. They are currently used (with massive cooling devices!) in MRIs in hospitals and levitating trains."

Next steps for the Carlson group are to apply the Carlson-Dahmen cluster techniques to other quantum materials.

"Using these cluster techniques, we have also identified electronic fractals in other quantum materials, including vanadium dioxide (VO2) and neodymium nickelates (NdNiO3). We suspect that this behavior might actually be quite ubiquitous in quantum materials," says Carlson.

This type of discovery leads quantum scientists closer to solving the riddles of superconductivity.

"The general field of quantum materials aims to bring to the forefront the quantum properties of materials, to a place where we can control them and use them for technology," Carlson explains. "Each time a new type of quantum material is discovered or created, we gain new capabilities, as dramatic as painters discovering a new color to paint with."

More information: Can-Li Song et al, Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x, Nature Communications (2023). DOI: 10.1038/s41467-023-38249-3

Journal information: Nature Communications

Continue reading here:

Researchers discover superconductive images are actually 3D and disorder-driven fractals - Phys.org

Collaboration builds fantastical stories from nuggets of truth – Symmetry magazine

Science fiction asks the question "What if?" and then attempts to answer that question with a story grounded in science fact. That's why Comma Press, for its latest anthology, paired science fiction writers with CERN scientists to create untrue stories with a bit of truth in them.

Creating the anthology, titled Collision, started with a call to CERN researchers and alumni of the European physics research center, asking them to describe concepts they thought would inspire good creative writing. Next, authors picked the ideas that called to them most. Each author then met with the scientist who proposed their chosen idea and discussed the science in detail. The anthology consists of a collection of the resulting stories, each with an afterword by the consulting scientist.

Symmetry interviewed three writer-scientist pairs to learn what it was like to participate in this unusual collaboration: Television writer, producer and screenwriter Steven Moffat, famous for his work on the TV series Doctor Who and Sherlock, worked with Peter Dong, a physics teacher at the Illinois Math and Science Academy who works with students to analyze data from CERN. Poet, playwright and essayist lisa luxx worked with physicist Carole Weydert, now leader of a unit of an insurance regulatory authority in Luxembourg. And UK-based Short Story Fellow of the Arts Foundation Adam Marek worked with Andrea Giammanco, who continues to do research at CERN as a physicist with the National Fund for Scientific Research in Belgium.

Although the assignment was the same for every pair, they all approached their stories differently.

When Peter Dong first got the call for inspiring physics concepts, he knew exactly what to submit: a strange but real-life theory proposed by physicists Holger Bech Nielsen and Masao Ninomiya.

In the late 2000s, the theorists posited that the universe might for some reason prefer to keep certain fundamental particles (say, the Higgs boson or particles of dark matter) a mystery. So much so, they wrote, that the universe was actively conspiring against their discovery.

The search for the Higgs certainly wasn't easy. Only weeks after being turned on for the first time, the Large Hadron Collider at CERN experienced a critical failure, immediately stopping the two most advanced experiments aimed at finding the Higgs. Years earlier and an ocean away, the United States Congress suddenly scrapped plans to build a similar collider in Texas, despite construction having already begun.

These two events, Nielsen and Ninomiya argued, could indicate that discovering the new particle was so improbable that the universe wouldnt allow it to happen.

Dong first read about the theory as a graduate student at the US Department of Energy's Fermi National Accelerator Laboratory in 2008. "While the probability is absurdly small, and it's a wacky, out-there idea, it's still not impossible," he says. "When we come up with these wacky theories, I feel like more people should know about them. They're just so much fun."

In their paper, Nielsen and Ninomiya proposed an experiment that, while it would not prove their hypothesis, could at least put it to the test: Shuffle a deck of a million cards in which a single card is marked "Shut down the LHC." If that card were randomly pulled from the deck, they would take that as a sign the universe wanted physicists to back off.

The card experiment was never run, and in the end, scientists were able to repair and restart the LHC. In 2012, physicists on the CMS and ATLAS experiments announced the discovery of the Higgs.

Dong had previously tried out using Nielsen and Ninomiya's idea as the basis for a story while auditing a creative writing class, so when he got the CERN email, he was ready with a pitch.

Writer Steven Moffat says Dong's idea stood out to him on the list. "I zeroed in on that prompt: one, because I understood it, or at least I thought I understood part of it," he says. "And two, because I could see a story in it. I just love the idea that a bunch of very serious-minded scientists wondered if the universe might be actively trying to stop them."

Moffat and Dong worked together to make sure the science in the story held up. "Steven was taking great pains to ask, 'Does this make sense? Would that be right?'" Dong says.

The result was "a mad, paranoid fantasy," Moffat says. "But it's decorated in the right terminology and derives from something that really happened."

Not all the stories in Collision fall neatly into the sci-fi genre. For her entry, poet lisa luxx explored her lived experiences of adoption and migration through the lens of quantum physics. "At a certain point, I had to disentangle [pun not intended] myself from trying to achieve a particular genre," luxx says.

The writer chose a prompt related to supersymmetry, which posits that every particle has a yet unobserved partner with some shared properties. Like Moffat, luxx says she was attracted to the idea she chose because it made some amount of sense to her. "The physicist had used quite a poetic quote in her explanation of this theory," luxx says. "That immediately drew me in."

Physicist Carole Weydert, who submitted the idea, may have had an artistic way of explaining supersymmetry because she had previously explored another complex physics theory in her own art. "It is this idea that you could have a kind of symmetry between everything contained in spacetime and spacetime itself," she says.

Weydert says she "[tries] to squeeze in time to paint," sometimes using ripped and cut pages from old textbooks in her work. "I try to express this quest for simplification in theoretical physics, that from one very, very basic theory, the whole complexity of the world emerges."

It was not easy to translate the mathematical and theoretical aspects of supersymmetry into a fictionalized story, luxx says. "The most challenging part was me grasping exactly the nuance of where physicists are at with understanding supersymmetry," she says. "I was learning the theory while writing it."

But the theory eventually clicked into place as a metaphor.

The poet says she wanted to ground the story in a common language to show how entwined physics is with everyday life. "Physics are just parts of us. We can't understand ourselves and can't understand society without at least somewhat understanding these theories."

The final piece strayed from Weydert's initial prompt, but Weydert says she is nevertheless pleased with the outcome. "I think what is important in these stories is not the science, as such, but connecting it to something that conveys an emotion."

For his contribution to the anthology, short story writer Adam Marek moved away from traditional prose. He instead wrote his piece in the form of an interview for the real-life BBC Radio program Desert Island Discs. The 45-minute show tells the story of a famous person's life through music tracks they have chosen.

"I've always loved it as a format for telling the story of someone's life," Marek says. "And I'd thought it could make a terrific framework for writing a short story."

The prompt Marek chose came from physicist Andrea Giammanco. As a postdoc, Giammanco had spent time researching the dark sector, a collection of hypothetical particles that have thus far gone unobserved. "I was in love with the idea of the dark sector popping up in unconventional ways," Giammanco says.

Marek had worked with other scientists on stories for other Comma Press anthologies, but he says this time was unique. It was different, he says. And I think that was because of Andreas particular interests and approach.

Marek and Giammanco spoke many times throughout the project, over video calls and email, Marek says. It helped that they shared a love of science fiction. "Andrea seemed as fascinated by the writing process as I was with the science. We had lots of questions for each other."

Giammanco says they threw so many ideas at each other that "we could have easily written five completely different stories."

In one of these brainstorming sessions, Giammanco says, he mentioned offhand that the dark sector may be hiding in plain sight, and it may even be revealed in a reanalysis of old data. "This piqued Adam's attention," he says, and it ultimately became a key part of the plot.

Throughout the process, Giammanco ensured that everything in the story was at least possible. "Adam wanted the idea to work from the narrative point of view," he says. "But for me, it was paramount to make it work from the scientific point of view."

But staying within the realm of the possible didn't restrict Marek and Giammanco to the realm of the particularly plausible. Marek says that flexibility helped him find a creative way for the story to end.

"At the peak of being very stressed out about the story and not knowing how to finish it," he says, "Andrea just happened to send me an email about ghosts."

Giammanco had sent Marek an article that quoted physicist and pop culture figure Brian Cox asserting that the LHC had unintentionally disproved the existence of incorporeal beings by failing to detect any evidence of them. But Giammanco had laid out an alternative argument: If ghosts did exist, they would just need to be a part of the dark sector, which the LHC has not been able to reach. "It just fixed it all," Marek says. "It was really, really fortuitous for the story."

Whether or not any of the ideas in the stories collected in Collision turns out to be true, the project highlights what both writers and scientists have in common: the ongoing quest to imagine "What if?"

Originally posted here:

Collaboration builds fantastical stories from nuggets of truth - Symmetry magazine

Physicists tracked electron recollision in real-time – Tech Explorist

The measurement of the fastest dynamical processes in nature typically relies on observing the nonlinear response of a system to precisely timed interactions with external stimuli. This usually requires two (or more) controlled events, with time-resolved information gained by controllably varying the interpulse delays.

Using a technique created by MPIK physicists and verified against quantum-dynamics theory by collaborators at MPI-PKS, the motion of an electron in a strong infrared laser field has been tracked in real time. The experimental method connects the absorption spectrum of the ionizing extreme-ultraviolet pulse to the free-electron motion driven by the subsequent near-infrared pulse.

Although the electron is a quantum object, a classical description of its motion turns out to be appropriate for this experimental technique.

High-harmonic generation, which converts optical or near-infrared (NIR) light into the extreme-ultraviolet (XUV) regime, is a cornerstone of strong-field physics. In the well-known three-step model, the driving light field (1) ionizes the electron by tunnel ionization, (2) accelerates it away from and then back toward the ionic core, where (3) the electron recollides and emits XUV light if it recombines.

In this study, the physicists replaced the first step with XUV single-photon ionization, which has a twofold advantage: First, the ionization time can be chosen relative to the NIR phase. Second, the NIR laser can be tuned to intensities so low that tunnel ionization is practically impossible. This makes it possible to study strong-field-driven electron recollision in a low-intensity limiting case.
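To make the three-step picture and the role of the XUV-chosen birth time concrete, here is a minimal classical "simple-man" sketch in Python. It is not the researchers' analysis code: the field strength, the wavelength, and the choice to launch the electron at rest (neglecting the momentum imparted by the XUV photon) are illustrative assumptions.

# Minimal classical "simple-man" sketch (not the MPIK analysis code): an
# electron born at the ion at a birth time set by the XUV pulse and then
# driven by a weak NIR field. Atomic units; all parameters are illustrative.
import numpy as np

E0 = 0.02                      # NIR peak field (a.u.), deliberately weak
omega = 0.057                  # NIR angular frequency (a.u.), roughly an 800 nm field
period = 2 * np.pi / omega

def trajectory(birth_phase, n_cycles=3.0, n_steps=6000):
    """Integrate x''(t) = -E(t) for an electron born at rest at x = 0."""
    t = np.linspace(birth_phase * period, (birth_phase + n_cycles) * period, n_steps)
    dt = t[1] - t[0]
    x, v = 0.0, 0.0
    xs = np.empty(n_steps)
    for i, ti in enumerate(t):
        v += -E0 * np.cos(omega * ti) * dt   # force on the electron from the NIR field
        x += v * dt
        xs[i] = x
    return xs

# Scan the birth time (as a fraction of the NIR period): only some phases
# steer the electron back through x = 0, i.e. to a recollision with the ion.
for phase in np.arange(0.0, 0.5, 0.05):
    xs = trajectory(phase)
    launched = np.argmax(np.abs(xs) > 1e-6)             # first step where it has left
    recollides = np.any(xs[launched + 1:] * xs[launched] < 0)
    print(f"birth phase {phase:4.2f} T_NIR -> recollision: {recollides}")

In this toy picture, choosing the birth phase plays the role of the XUV timing described above, while the small field amplitude stands in for the low-intensity regime in which tunnel ionization would not occur.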

The method used here is attosecond transient absorption spectroscopy, previously established for bound electrons by a team led by Christian Ott, combined with a reconstruction of the time-dependent dipole moment. By extending the approach to free electrons, it links the time-dependent dipole moment to the classical motion (trajectories) of the ionized electrons.

Ph.D. student Tobias Heldt said, "Our new method, applied to helium as a model system, links the absorption spectrum of the ionizing light to the electron trajectories. This allows us to study ultrafast dynamics with a single spectroscopic measurement, without scanning a time delay to compose the dynamics frame by frame."
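As a rough illustration of how a single absorption spectrum can encode the electron's motion, the sketch below builds a toy time-dependent dipole whose oscillation changes at an assumed "recollision time" and evaluates the standard transient-absorption relation, in which the absorbed spectrum follows from the Fourier transforms of the XUV field and the dipole it induces. All waveforms and parameters are synthetic stand-ins, not measured data or the authors' reconstruction procedure.

# Toy illustration (synthetic waveforms, not measured data): the absorption
# spectrum S(w) ~ w * Im[ d(w) * conj(E(w)) ], computed from an XUV field E(t)
# and a model time-dependent dipole d(t) that is perturbed at a "recollision".
import numpy as np

dt = 0.05                                    # time step (a.u.)
t = np.arange(0.0, 400.0, dt)

xuv = np.exp(-((t - 50.0) / 5.0) ** 2) * np.cos(2.0 * t)   # short XUV pulse, carrier ~2 a.u.
t_rec = 180.0                                              # assumed recollision time
amp = 1.0 + 0.3 * (t > t_rec)                              # dipole enhanced after recollision
dipole = amp * np.exp(-(t - 50.0) / 500.0) * np.sin(2.0 * t) * (t > 50.0)

w = 2.0 * np.pi * np.fft.rfftfreq(t.size, d=dt)            # angular frequencies (a.u.)
S = w * np.imag(np.fft.rfft(dipole) * np.conj(np.fft.rfft(xuv)))

# The recollision-induced change in d(t) shows up as structure in S(w),
# which is the quantity an attosecond absorption measurement records.
print(f"strongest spectral feature near w = {w[np.argmax(np.abs(S))]:.2f} a.u.")

The point of the sketch is only the chain of reasoning: anything that modifies the dipole in time, such as a returning electron, leaves a fingerprint in the one spectrum that is recorded, so no delay scan is needed.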

The results of the measurements indicate that, depending on the experimental settings, circular polarization of the light wave can increase the likelihood of bringing the electron back to the ion. Although this seems counterintuitive, theorists had anticipated the result.

This interpretation in terms of recolliding periodic orbits is also supported by classical simulations: whenever the electron (re)collides with the helium ion, it produces a characteristic modification and enhancement of the time-dependent atomic dipole, which an attosecond absorption-spectroscopy experiment can pick up.
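Below is a sketch of the kind of classical simulation referred to above, extended to two dimensions so that linearly and circularly polarized driving fields can be compared. The initial speed (standing in for the momentum the XUV photon provides), the field parameters, and the scan over launch directions are illustrative assumptions; the aim is only to show how one tests whether trajectories revisit the ion.

# 2D classical sketch (illustrative parameters, not the paper's simulation):
# an electron released with a small initial velocity, as XUV photoionization
# provides, driven by a linearly or circularly polarized NIR field. We record
# the closest return to the ion; scanning launch directions and phases is how
# classical simulations test the recolliding-orbit picture.
import numpy as np

E0, omega = 0.02, 0.057          # weak NIR field and frequency (atomic units)
v0 = 0.3                         # initial speed from XUV ionization (assumption)
dt, steps = 0.2, 6000

def closest_return(circular, launch_angle):
    r = np.zeros(2)
    v = v0 * np.array([np.cos(launch_angle), np.sin(launch_angle)])
    closest = np.inf
    for i in range(steps):
        ti = i * dt
        if circular:
            field = (E0 / np.sqrt(2)) * np.array([np.cos(omega * ti), np.sin(omega * ti)])
        else:
            field = np.array([E0 * np.cos(omega * ti), 0.0])
        v = v - field * dt       # a = -E(t) for the electron (atomic units)
        r = r + v * dt
        if i > 50:               # ignore the immediate departure from the ion
            closest = min(closest, float(np.linalg.norm(r)))
    return closest

for circ in (False, True):
    best = min(closest_return(circ, a) for a in np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False))
    print(("circular" if circ else "linear  "), f"closest approach ~ {best:.2f} a.u.")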

Group leader Christian Ott is optimistic about the future potential of this new approach: "In general, our technique allows us to explore laser-driven electron motion in a new, lower-intensity regime, and it could further be applied to various systems, e.g., for studying the laser-driven electron dynamics within larger atoms or molecules."


Read the original post:

Physicists tracked electron recollision in real-time - Tech Explorist